Linked by Eugenia Loli on Mon 10th Oct 2011 19:55 UTC
Within the last few days we read the news about Apple's Siri AI personal assistant, and about a brain implant that lets monkeys control virtual limbs and feel virtual objects. I believe that if someone is to also combine a few more technologies (e.g. high-res eyewear, appropriate operating system changes), we will be looking at the next user interface revolution, after the inventions of the computer mouse and touch interfaces.
Brainwaves are not standard
by Laurence on Mon 10th Oct 2011 23:07 UTC

This is a lengthy post, but it's a subject that has interested me for a while, and I've read a few papers on it over the years.

There's a fundamental flaw with people's argument for using brainwaves to control hardware, and that's the fact that no two people's brainwaves are the same. We're all wired differently, which means different parts of our brain "light up" when different people think the same thought. This means that not only does the user have to teach themselves how to use these new computers, but the computer also has to teach itself to read the user.
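To make that calibration problem concrete, here is a toy sketch (not from the comment, and nothing like a real EEG pipeline): it assumes each user's "left"/"right" thoughts show up as synthetic feature vectors in completely different regions of feature space, so a trivial nearest-centroid decoder trained on one user fails on another. All names and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def calibrate(samples_by_command):
    """Per-user calibration: average each command's feature vectors
    into a centroid. A brand-new user starts with no model at all."""
    return {cmd: np.mean(vecs, axis=0) for cmd, vecs in samples_by_command.items()}

def classify(centroids, features):
    """Pick the command whose centroid is nearest to the new reading."""
    return min(centroids, key=lambda cmd: np.linalg.norm(features - centroids[cmd]))

# Two hypothetical users whose "left"/"right" signals occupy entirely
# different regions of feature space -- the point the post is making.
user_a = {"left": rng.normal(0.0, 0.1, (20, 4)), "right": rng.normal(1.0, 0.1, (20, 4))}
user_b = {"left": rng.normal(5.0, 0.1, (20, 4)), "right": rng.normal(-3.0, 0.1, (20, 4))}

model_a = calibrate(user_a)
model_b = calibrate(user_b)

# User B's model decodes user B's reading; user A's model gets it wrong.
reading_b_left = rng.normal(5.0, 0.1, 4)
print(classify(model_b, reading_b_left))  # "left"
print(classify(model_a, reading_b_left))  # "right" -- user A's model misreads user B
```

The sketch is deliberately simplistic, but it captures why the calibration step can't be shipped in the box: the model is per-user data, not per-device firmware.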

Things are then further complicated when you factor in how difficult it is for humans to focus on one thought at a time. Humans (like most animals) have scattered thoughts - it's a survival mechanism. So, for example, playing a racing game by focusing on just thinking "left" or "right" to steer the car is incredibly hard, particularly as most games (even basic driving simulators) have more to concentrate on than simply left or right turns. Thus training your brain to isolate those thoughts is a /huge/ learning curve which would otherwise have been unnecessary with traditional human interfaces (and all of that is after you've already spent half your day just training the computer to recognise your left thoughts from your right thoughts).

Furthermore, the reason tactile and verbal interfaces work so well is that they're natural to the way we interact with the world. We don't "will" an object to move; we touch it and it moves. Or we shout at our family to move it for us. Either way, it's a physical or verbal process. And while the process is being performed, more often than not we'd be thinking about something else (e.g. I'd move a vase, but that process would have me thinking about what to buy my partner for Christmas). So how do you get the computer to separate actual instructions from natural tangents? (Again, going back to the driving sim: how would the computer know the difference between you thinking "brake" and thinking "I need to brake at the next turn"?)
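One common mitigation for exactly this ambiguity (my addition, not something the comment proposes) is confidence gating: the decoder only fires a command when one candidate clearly dominates, and treats anything ambiguous as a stray thought. A minimal sketch, reusing the invented nearest-centroid setup and made-up thresholds:

```python
import numpy as np

def decode(centroids, features, margin=1.0):
    """Return a command only when the nearest centroid beats the
    runner-up by a clear margin; otherwise treat the reading as a
    stray thought and do nothing."""
    dists = sorted((np.linalg.norm(features - c), cmd) for cmd, c in centroids.items())
    best, runner_up = dists[0], dists[1]
    if runner_up[0] - best[0] >= margin:
        return best[1]
    return None  # ambiguous: "I need to brake at the next turn", not "brake now"

# Hypothetical per-user centroids for two driving commands.
centroids = {"brake": np.zeros(4), "accelerate": np.full(4, 2.0)}

print(decode(centroids, np.array([0.1, 0.0, 0.1, 0.0])))  # "brake"
print(decode(centroids, np.full(4, 1.0)))                 # None (equidistant, ignored)
```

The price of the gate is the post's point in miniature: the stricter the margin, the more genuine commands get dropped, so the ambiguity never really goes away - it just gets traded for latency and missed inputs.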

The only research in this field that I've thought was even close to being practical was the research with amputees. However, the only reason that was successful was that those subjects already had pre-existing pathways which they'd practised using for 20+ years of their lives and which, sadly, are now unused. So they could send the message to move their left finger, and that would trigger whatever device was configured to react to that pathway. However, even that research is impractical for those of us lucky enough to have a full complement of limbs, because if I send a message to move my left finger, my left finger moves. So if I'm moving my limbs anyway, I might as well just use a tactile interface.

We also have the problem that current technology simply cannot resolve our firing neurons in fine enough detail, which makes reading our thought patterns infinitely more difficult. However, I will concede that this point should, and likely will, disappear as technology advances and we find more accurate ways to monitor the electrical activity in our brains.

So don't get me wrong, I'm not saying that we'll never have thought-controlled hardware. I'm sure that with time we'll gradually be exposed to more and more novelty toys based around this tech, and as future generations grow up surrounded by these things, humans will eventually have mastered a technique of "thinking" at these devices, and the technology will be in a better position to accurately read our thoughts. By that time, we'll also have learned from failed experiments and developed ways for computers to learn our individual brain patterns far faster than the lengthy process it is at the moment. When all that falls into place, then this technology may well be viable for practical applications. But I very much doubt it will happen in our lifetime.

[edit]

Wow, my post is almost as long as Eugenia's article hehe

Edited 2011-10-10 23:17 UTC
