Linked by Thom Holwerda on Wed 22nd Apr 2009 17:14 UTC, submitted by orfanum
A detailed simulation of a small region of a brain built molecule by molecule has been constructed and has recreated experimental results from real brains. The "Blue Brain" has been put in a virtual body, and observing it gives the first indications of the molecular and neural basis of thought and memory. Scaling the simulation to the human brain is only a matter of money, says the project's head. The work was presented at the European Future Technologies meeting in Prague.
Thread beginning with comment 359924
Ethics
by WereCatf on Wed 22nd Apr 2009 18:50 UTC

AI development is an interesting topic, and a real tough nut to crack, which makes simulated brains interesting as well. However, when/if scientists eventually develop virtual brains complex enough to allow independent, unaided thought processes and emotions, they will have created actual human-made virtual life.

The problem is: are you killing someone if you turn off such a virtual personality? Where do you draw the line between life and non-life? If you have no biological body but can feel emotions, think for yourself, and even converse with others, should others still be allowed to decide whether you live or die, on a whim, without consequences?

I personally lean towards the idea that once a virtual personality has reached a certain point, it should gain some rights and should not be possible to dismiss without consequences.

Reply Score: 3

RE: Ethics
by Buck on Wed 22nd Apr 2009 18:53 in reply to "Ethics"

Oh people and their questionable ethics... First they create something and then start to question whether it's okay to kill it. Well, if you have a problem with that, don't create it in the first place, all right?

Reply Parent Score: 3

RE[2]: Ethics
by FealDorf on Wed 22nd Apr 2009 19:07 in reply to "RE: Ethics"

All of that is exactly why I'm against AI. There are creepier prospects than them coming to attack us...

Reply Parent Score: 1

RE[2]: Ethics
by darknexus on Thu 23rd Apr 2009 04:44 in reply to "RE: Ethics"

In other words, your logic is:
"I've created it, so I can kill it"?
Stop for a second and think about where that line of logic might lead, or how it could be misused... No, we don't want to go there.

Reply Parent Score: 2

RE: Ethics
by Ventajou on Wed 22nd Apr 2009 19:38 in reply to "Ethics"

Easy. Create a virtual, really painful disease and the thing will beg you to unplug it.

Reply Parent Score: 2

RE: Ethics
by CapEnt on Wed 22nd Apr 2009 20:39 in reply to "Ethics"

Just suspend the software until the technology matures enough to create a body for it. Technically, the "being" will remain alive.

We can stop and resume software executing inside a conventional computing environment, something we cannot (yet) do with a human brain or with biological functions in general.

But shutting down a self-aware virtual personality, with the loss of all of its information, should be a crime.
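The suspend-versus-shutdown distinction maps directly onto how operating systems treat processes. A minimal POSIX-only sketch in Python, where the looping child program is just a hypothetical stand-in for a long-running simulation:

```python
import os
import signal
import subprocess

# Hypothetical stand-in for a running simulation: a child process
# that loops forever.
proc = subprocess.Popen(
    ["python3", "-c", "import time\nwhile True: time.sleep(0.1)"]
)

os.kill(proc.pid, signal.SIGSTOP)   # freeze: no CPU used, in-memory state intact
assert proc.poll() is None          # the process still exists while stopped
os.kill(proc.pid, signal.SIGCONT)   # thaw: execution resumes where it left off

proc.terminate()                    # this, by contrast, discards all state
proc.wait()
```

Suspension preserves everything needed to continue later; termination does not, which is the moral line the comment is drawing.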

Reply Parent Score: 3