Linked by Eugenia Loli on Mon 10th Oct 2011 19:55 UTC
Graphics, User Interfaces
Within the last few days we read the news about Apple's Siri AI personal assistant, and about a brain implant that lets monkeys control virtual limbs & feel virtual objects. I believe that if someone were also to combine a few more technologies (e.g. high-res eyewear, appropriate operating system changes), we would be looking at the next user interface revolution, after the inventions of the computer mouse and touch interfaces.
Sorry to disappoint
by twitterfire on Mon 10th Oct 2011 20:35 UTC
twitterfire
Member since:
2008-09-11

Things aren't going exactly how you try to picture them. Nobody will let his brain interface with suspect hardware, and tablets, desktops and laptops will be with us for at least a century or so.

Maybe at some point in the future humans will interact with hardware in some very creative ways, but not in the foreseeable future.

Reply Score: 4

RE: Sorry to disappoint
by Eugenia on Mon 10th Oct 2011 20:44 UTC in reply to "Sorry to disappoint"
Eugenia Member since:
2005-06-28

Trust is a marketing point, not necessarily a technical one (as Apple has showcased many times). People in the past wouldn't entrust their money to banks, for example, and while there are good reasons not to trust them, most people keep their money in banks anyway.

When personal computers started becoming popular, people were afraid to trust them too. Your post sounded like the Orthodox Church and its fear of computers in the '80s - how we would all lose our individuality and have a chip in our foreheads as a personal ID that controls us (I watched various such "documentaries" on Greek TV as a teen). It's all FUD. Eventually things like that get ironed out; security matures too, not just software features.

Remember the iPhone and how everyone "hated it" in the beginning (before it even got released months later) because it had no stylus or hardware buttons. There was a lot of whining about that back in early 2007. But since its touch UI made sense, people followed it anyway when the device actually came out. And they loved it.

The UI I suggest is even more intuitive and natural. There's nothing stopping progress. If something is useful, it gets adopted, despite the risks.

Edited 2011-10-10 21:07 UTC

Reply Score: 2

RE[2]: Sorry to disappoint
by Bill Shooter of Bul on Mon 10th Oct 2011 21:09 UTC in reply to "RE: Sorry to disappoint"
Bill Shooter of Bul Member since:
2006-07-14

I hadn't heard that bit about the Orthodox church, but I'd rather there be a well-defined interface based on open standards between the brain and the computer, to ensure that it is only able to glean so much information.

Typically I use my brain as a buffer for things I might say/do/type before I actually do one of those things; they are then run through a couple of situation-dependent filters before possibly being acted upon. I would like to retain that ability, even with brain interfaces.

Reply Score: 7

RE[2]: Sorry to disappoint
by Yoko_T on Tue 11th Oct 2011 23:19 UTC in reply to "RE: Sorry to disappoint"
Yoko_T Member since:
2011-08-18

Trust is a marketing point, not necessarily a technical one (as Apple has showcased many times). People in the past wouldn't entrust their money to banks, for example, and while there are good reasons not to trust them, most people keep their money in banks anyway.

When personal computers started becoming popular, people were afraid to trust them too. Your post sounded like the Orthodox Church and its fear of computers in the '80s - how we would all lose our individuality and have a chip in our foreheads as a personal ID that controls us (I watched various such "documentaries" on Greek TV as a teen). It's all FUD. Eventually things like that get ironed out; security matures too, not just software features.

Remember the iPhone and how everyone "hated it" in the beginning (before it even got released months later) because it had no stylus or hardware buttons. There was a lot of whining about that back in early 2007. But since its touch UI made sense, people followed it anyway when the device actually came out. And they loved it.

The UI I suggest is even more intuitive and natural. There's nothing stopping progress. If something is useful, it gets adopted, despite the risks.


Do yourself a favor and watch an anime OVA/movie called Macross Plus and you'll realize just how utterly stupid what you are suggesting really is.

To make things simple: a test pilot hooked to an interface similar to what you are suggesting started daydreaming about causing his rival test pilot's aircraft to crash and burn. Guess what happened? Yep, the interface took that idle thought and acted upon it, and there was nothing the pilot who thought the command could do to stop it.

Some future to look forward to, isn't it?

Reply Score: 1

RE: Sorry to disappoint
by tupp on Tue 11th Oct 2011 04:27 UTC in reply to "Sorry to disappoint"
tupp Member since:
2006-11-12

Nobody will let his brain interface with suspect hardware, and tablets, desktops and laptops will be with us for at least a century or so.

Maybe at some point in the future humans will interact with hardware in some very creative ways, but not in the foreseeable future.

Apparently, the future is now: http://www.post-gazette.com/pg/11283/1181062-53.stm

Reply Score: 2

Already there
by No it isnt on Mon 10th Oct 2011 21:20 UTC
No it isnt
Member since:
2005-11-14

I've been using my brain to control computers for years. The most practical brain-computer interface I've tried so far is digital, i.e. using my brain to control the digits on my hands to strike keys on a keyboard. I especially like the feedback mechanisms, which are both visual and tactile, so that I can read the output immediately for easy calibration of the input mechanism. Often, I will intuitively 'feel' when the input is wrong, even without watching the output.

Yeah, it's amazing.

Reply Score: 17

RE: Already there
by tomcat on Mon 10th Oct 2011 21:59 UTC in reply to "Already there"
tomcat Member since:
2006-01-06

+1. Some people have zero sense of humor.

Reply Score: 2

RE: Already there
by viton on Tue 11th Oct 2011 00:24 UTC in reply to "Already there"
viton Member since:
2005-08-09

using my brain to control the digits on my hands to strike keys on a keyboard.

Well, it is nice if you have 2 hands. But some people are not that lucky.

Reply Score: 2

RE[2]: Already there
by No it isnt on Tue 11th Oct 2011 08:38 UTC in reply to "RE: Already there"
No it isnt Member since:
2005-11-14

An excellent point. Brainwave control has its place, but it's a prosthesis. Physically healthy humans are just too well adapted to direct interaction with a spatial world for it to make sense, IMO.

Reply Score: 3

RE[2]: Already there
by Doc Pain on Tue 11th Oct 2011 17:37 UTC in reply to "RE: Already there"
Doc Pain Member since:
2006-10-08

Well, it is nice if you have 2 hands. But some people are not that lucky.


Fully agree. Something similar can be said about people with non-neurotypical brainwaves - or without the required set of two 100% working eyes to participate in the 3D hype. :-)

Reply Score: 2

RE[3]: Already there
by Neolander on Tue 11th Oct 2011 18:55 UTC in reply to "RE[2]: Already there"
Neolander Member since:
2010-03-08

Fully agree. Something similar can be said about people with non-neurotypical brainwaves - or without the required set of two 100% working eyes to participate in the 3D hype. :-)

And even when you have the eyes... I have two 100% working eyes, but that was not the case when I was younger, so my brain worked things out in an unusual fashion, and my perception of depth is not based on binocular vision, but rather on other cues like shadows, perspective and parallax effects.

For me, the main difference between 3D cinema and 2D cinema is that the former is more expensive and requires you to wear stupid glasses. So in my perception of things, this 3D hype is truly based on thin air ;)

Edited 2011-10-11 18:57 UTC

Reply Score: 1

Not holography
by zima on Mon 10th Oct 2011 22:00 UTC
zima
Member since:
2005-07-06

There are two ways to deal with the problem. One is holographically projected displays. Unfortunately, we're technologically far away from something like that, and no one really wants to see your huge display in the middle of the street while you're trying to get directions.

Cheap scifi film effects have unfortunately hijacked the meaning of the word "holographic" over the decades, so the public imagines roughly this "huge display in the middle of the street" - or, generally, what is really more like a volumetric display (usually projecting somebody in "scifi videoconferencing").

But that is NOT holography; the word has a very specific meaning. And, sadly, people don't even realize how impressive real, already-existing holograms feel (one quickly-found example: http://www.youtube.com/watch?v=Xp7BP00LuA4 - another nice one: http://www.youtube.com/watch?v=6AVAzGQMxEg ); they have existed for a few decades already. People are unaware of them to the point of doubting what they see in such videos, thinking it's just a trick (no, it's not... they feel awesome when held, the light really behaving as if the plate has some "inside").

Yes, static so far, and with a poor colour gamut and such. A good holographic video display will require an effective pixel size comparable to the wavelength of light, plus processing and memory we're nowhere near yet.
But once we're there... a display will feel rather like a window or a mirror (bonus: at that point we probably won't have to endure any more of the efforts - every two decades or so, despite numerous failures - to push on us the cheap and inherently flawed trick of stereography).
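To get a feel for the scale involved (a quick back-of-the-envelope sketch in Python; the 10 cm plate, one byte per pixel and 60 fps are purely illustrative assumptions of mine):

# Rough scale of a true holographic video display (illustrative numbers only).
wavelength = 0.5e-6    # m; green light - pixels must be about this size
side = 0.1             # m; a modest 10 cm x 10 cm plate
fps = 60               # video rate
bytes_per_pixel = 1    # assume just 8 bits of phase/amplitude per pixel

pixels_per_side = side / wavelength             # 200,000
total_pixels = pixels_per_side ** 2             # 4e10 - forty billion pixels
frame_bytes = total_pixels * bytes_per_pixel    # ~40 GB for a single frame
throughput = frame_bytes * fps                  # ~2.4 TB/s, sustained

print("%.1e pixels, %.0f GB/frame, %.1f TB/s" %
      (total_pixels, frame_bytes / 1e9, throughput / 1e12))

Forty billion pixels per frame, refreshed 60 times a second - hence "nowhere near yet".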

All this is essentially an example of the unfortunate effects of the "scifi cargo cultism" I sometimes point out - fluffy fantasies displacing and masking things, causing people to miss the wild possibilities of the actually existing universe.

Edited 2011-10-10 22:07 UTC

Reply Score: 4

RE: Not holography
by Thom_Holwerda on Mon 10th Oct 2011 22:12 UTC in reply to "Not holography"
Thom_Holwerda Member since:
2005-06-29

I want one of those. Holy crap that was amazing.

Reply Score: 1

RE[2]: Not holography
by zima on Mon 10th Oct 2011 22:24 UTC in reply to "RE: Not holography"
zima Member since:
2005-07-06

See, that's what I was saying ;) - even technology enthusiasts don't know about this fairly old[1] stuff (easily older than you, Thom) ...so what hope is there for the general public, whose "imagination" is shaped by cheap popular scifi productions?

1. Old in terms of recent technology progress; the implementation I randomly found on YT isn't really as ground-breaking as they claim - it mostly "just" revolves around being a decently marketable product. Any self-respecting physics department probably has some holograms, and there are a few ~commercial labs around the world that make them (contact one, Thom? Maybe they'll even have a defectively developed one - which might just look weird in part of the "image" or under certain viewing angles - but probably still impressive, clearly with depth[2])

2. Yes, in holograms that is essentially real depth; they are about reproducing how the light would really behave if the plate had some insides.

Edited 2011-10-10 22:26 UTC

Reply Score: 2

RE[2]: Not holography
by zima on Tue 11th Oct 2011 10:16 UTC in reply to "RE: Not holography"
zima Member since:
2005-07-06

PS. BTW, you can easily make a basic but still fun imitation of sorts (one which relies more on psycho-visual effects - how our brains are wired to see faces - than on actual scattering of light in a "proper" way), if you have a printer! ;)

Hollow-face illusion http://en.wikipedia.org/wiki/Hollow-Face_illusion & three dragons to print out http://www.grand-illusions.com/opticalillusions/three_dragons/

Actually, inspired by how one preschool-theatre costume (of a... cat, with proper ears) supposedly induced a panic attack in my buddy's kitten, I once reworked the dragon to be more "danger! Possible unknown big cat!"-like. Yup, my cat became... nervous.

(a quick Google search for the above Wiki page even revealed one with a cat design: http://allgraphical.blogspot.com/2010/10/paper-craft-illusion.html
...I can't vouch for how convincing it is, though)


Bonus points: place a large version in a street-facing window ;)


Then there are also some software toys like this DSi game http://www.youtube.com/watch?v=h5QSclrIdlE which, essentially, partly simulates - for one viewer at a time - roughly how a hologram would feel.

Reply Score: 2

RE: Not holography
by Eugenia on Mon 10th Oct 2011 22:25 UTC in reply to "Not holography"
Eugenia Member since:
2005-06-28

A display like this doesn't change much in the grand scheme of things. It's still a display that happens to display in a 3D manner. It's an evolution of current 2D UIs, but not a revolution (especially if someone has to carry lights). Such displays still have rather large physical dimensions. They're good for displaying stuff to a large crowd, but not for personal computing. The point is to eliminate the need for big displays. Most of us already wear eyeglasses. Not using them for anything else is just a waste of resources and a lack of ingenuity.

Edited 2011-10-10 22:44 UTC

Reply Score: 1

RE[2]: Not holography
by zima on Tue 11th Oct 2011 09:50 UTC in reply to "RE: Not holography"
zima Member since:
2005-07-06

My main point was - don't call it holographic, don't reinforce such usage, don't waste the term "holography" on pop-cultural contraptions of probably ever-dubious practicality. The term is much more specific, and at the same time it might very well give us something (figuratively) much bigger, nicer and broader than the limited visions of directors or visual-effects artists would suggest.


But also don't dismiss it so readily. Few points:

Remember, a good, proper holographic display, if it also tracks viewers (comparatively trivial relative to its other advances), could easily display a completely different thing to each and every pair of eyes looking at it - that is simply an inherent property of it.
(this is the contrast I mentioned between "imposed imagination", which is ultimately limited, and imagination actually "applied" on the basis of how the world, science, etc. really are)

If some promising paths prove fruitful (possibly, say, applications of graphene and their implications), decent holographic displays could well end up covering almost everything, at least in places with any notable ("valuable"? ;/ ) concentration of humans, and where they would likely look. And/or they could take the physical form of, essentially, wallpaper.*
Yes, to you (or to me, for that matter) that would seem "crazy" and insanely wasteful - but consider how people living just a few short centuries ago would have thought the very same thing about covering whole facades with something as valuable as glass (especially glass so incredibly translucent and smooth!), or aluminium and such.
Heck, "glass houses" were used less than a century ago in one locally notable novel ( http://en.wikipedia.org/wiki/The_Spring_to_Come )
as an idea, a symbol of unrealistic dreams of a perfect place. Now look around you ...no perfect place in sight, but glass houses everywhere. ;)



And that's only when touching on physical in-setting screens.

Because, see, what you don't realize is that there would most likely be a major technological overlap between such good holographic displays and ...good eye-displays. They are not as separate as they appear, and would use fundamentally similar technology in their quest to be any good.

What present eye-screens seem mostly "good" for - if they don't have optical systems that make their size, weight & price nontrivial - is giving people headaches.

However: the "substrate" required for good holographic screens basically would be also a perfect optical system (for the wavelengths larger than its "pixels", at least). You could possibly, perhaps, even hack a "wallpaper display" and reconfigure it (via ~firmware!) into something acting as & easily rivalling (being perfect optically) some tediously laboured lenses or mirrors in best telescopes.
That's also something which would be very helpful in good eye-screen (the other approach which seems promising is a direct laser projection on the retina, though possibly with more "schematic" graphics - but that's OK, since it will most likely be practical much sooner)


Generally, unspecific fantasies (how would it work, roughly, while doing it well?) are not ingenuity. Too often they are closer to wishy-washy visions which miss both the issues of what they propose and many upcoming opportunities in somewhat different approaches.
We don't "waste" glasses by not using them for anything else (good) - not at this point.
(but BTW, I do have a related small & silly pet project on hold, waiting for some basically already-"here" technology... but that's all I'm saying for now! ;> )


*Best of all, if covering the inside walls, imagine: every room could easily seem to have a great view ;) ;) Even when it doesn't actually have any windows! ;) (say, in some monolithic mega-unit built to house a massive overpopulation ;) ). More, each of the occupants could have the view they prefer (as long as all the scenes have comparable lighting, I imagine; otherwise it would probably lead to a weird mood mix in the room), even "back to nature" - for example, looking like an open tent inside a forest ;) (yeah, probably more depressing if anything; but perhaps it points to the implications of another hypothetical major advancement much further down the line - if we "augment" our bodies with technology that lets us not care about cold, rain, and the elements overall, what difference do walls make?)


"especially if someone has to carry lights"?

Reply Score: 2

a load of garbage
by unclefester on Mon 10th Oct 2011 23:04 UTC
unclefester
Member since:
2007-01-13

Humans already have a spectacularly efficient brain/machine interface. It is called the hand. The hand provides spatial (proprioception) and tactile feedback, and it is semi-autonomous with training.

Aviation has avoided using voice control because it is extremely inefficient, imprecise and slow compared with using the hand.

Music players, phones and tablets are already causing significantly higher rates of vehicular accidents and pedestrian deaths. This is because the human brain has virtually no ability to multi-task.

Reply Score: 2

RE: a load of garbage
by Eugenia on Mon 10th Oct 2011 23:10 UTC in reply to "a load of garbage"
Eugenia Member since:
2005-06-28

First of all, voice was only mentioned as an alternative to brainwaves, so I don't see why you are singling it out. Secondly, many things that require lots of clicks with a hand can be done instantly with a thought, because the thought doesn't ask the computer to "click an icon", as the finger would; it actually carries out full actions. So if I want to incorporate a bicycle into my CGI scene or 2D pic, I simply think it. I transmit the mental picture to the app, and the app figures out how to create and display it. Actually designing such a thing in 3D by hand can take about 3 days if it's to be done properly.

Edited 2011-10-10 23:13 UTC

Reply Score: 1

RE[2]: a load of garbage
by Laurence on Tue 11th Oct 2011 00:10 UTC in reply to "RE: a load of garbage"
Laurence Member since:
2007-03-26

First of all, voice was only mentioned as an alternative to brainwaves, so I don't see why you are singling it out. Secondly, many things that require lots of clicks with a hand can be done instantly with a thought, because the thought doesn't ask the computer to "click an icon", as the finger would; it actually carries out full actions. So if I want to incorporate a bicycle into my CGI scene or 2D pic, I simply think it. I transmit the mental picture to the app, and the app figures out how to create and display it. Actually designing such a thing in 3D by hand can take about 3 days if it's to be done properly.


But how are you going to program the computer to recognise those brainwaves?

It takes humans months of crawling before we learn how to walk. That's months of learning the size and shape of our limbs and how to move them. But that's not all we learn. Everything is "recorded": our thoughts and emotions during that process - smells, sights, sounds and even tastes. All of that on top of the thought processes needed just to move our limbs the correct amount and in the correct order. All of it wired and rewired on the fly, in a pattern totally unique compared to our brothers/sisters, our cousins, our parents and our friends. All of us are wired differently.

So how on Earth will you program a computer to understand which firing neurons relate to "paste bicycle image over CGI scene" and which relate to "oh, I like that song playing on the radio, it reminds me of yoghurt"?

Rudimentary stuff like copy/paste from the clipboard can be programmed in. But then you end up in a situation where you have to chain these rudimentary thoughts in sequence, like clicks of a mouse. So you're no better off. In fact you're worse off, as you now have to program your thoughts into a computer before using it (currently a very lengthy process), and you have to learn to control your own thoughts so you don't have the mental equivalent of a muscle spasm every time a tampax advert comes on the TV, resulting in your computer shutting down and you losing your work. And all you gain from this is the reaction time that would have been spent between thinking about moving your left finger and your left finger clicking the left mouse button.
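To put that in concrete terms, here is a deliberately crude Python sketch (everything in it - the event names, the very idea of a clean "thought event" stream - is hypothetical, since no such API exists):

# Even a working thought-reader only yields discrete, pre-trained "events",
# which you then have to chain in sequence - exactly like mouse clicks.
actions = {
    "select": lambda: print("object selected"),
    "copy":   lambda: print("copied to clipboard"),
    "paste":  lambda: print("pasted"),
}

def on_thought_event(event):
    # Anything the classifier can't match (stray thoughts, that advert on
    # TV...) must be dropped - or worse, it misfires as a real command.
    action = actions.get(event)
    if action:
        action()

# "Put a bicycle in the scene" still decomposes into rudimentary steps:
for event in ["select", "copy", "daydream-about-a-song", "paste"]:
    on_thought_event(event)

Note how the daydream event is simply discarded only if you're lucky - the whole interface is exactly as good as that filtering.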

As I said in my other epic post, the technology isn't the only hurdle we face with this ideology of yours - it's human development.

Edited 2011-10-11 00:16 UTC

Reply Score: 3

RE[3]: a load of garbage
by Eugenia on Tue 11th Oct 2011 00:25 UTC in reply to "RE[2]: a load of garbage"
Eugenia Member since:
2005-06-28

The same way the technology already exists for people to control a UI using such an interface. Follow the links, watch the video.

Reply Score: 1

RE[4]: a load of garbage
by Laurence on Tue 11th Oct 2011 00:39 UTC in reply to "RE[3]: a load of garbage"
Laurence Member since:
2007-03-26

The same way the technology currently exists where people can control UI using such an interface. Follow the links, watch the video.

I've seen those videos and countless more like them. They just highlight my point: these kits need to be pre-populated with a map of the user's pathways, which can only be built command by command. This is very time-consuming and reduces the power of these devices to far below your expectations.

So your reply doesn't address my criticisms in the slightest.

Reply Score: 4

RE[3]: a load of garbage
by Soulbender on Tue 11th Oct 2011 14:53 UTC in reply to "RE[2]: a load of garbage"
Soulbender Member since:
2005-08-18

Maybe it's like flying cars. It sounds awesome at first, but then you realize something: do you really want those schmucks who drive recklessly in two dimensions to have a third dimension to worry about and fsck up? Never mind that at a couple of hundred feet even minor accidents become catastrophes.

Reply Score: 2

RE[2]: a load of garbage
by unclefester on Tue 11th Oct 2011 04:08 UTC in reply to "RE: a load of garbage"
unclefester Member since:
2007-01-13

This is a common and completely incorrect assumption about how the human motor control (movement) system functions. The body's motions are controlled by continuous feedback loops provided by muscles, nerves and the motor cortex, not by conscious thought. Movement is only consciously controlled when we initially learn a new task. Conscious thought is only used to initiate a movement, e.g. the desire to go and get a cup of coffee. The actual movements are essentially an automated process.


In fact mind control is extremely tiring because there are no real feedback loops. It is equivalent to being perpetually stuck at the ability level of your first driving lesson.
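A toy simulation makes the difference obvious (this is just arithmetic, not physiology - the 20% motor-noise figure is invented for illustration):

import random

TARGET = 100.0  # desired position, arbitrary units

def open_loop():
    # One ballistic "command" with no feedback: you land wherever the
    # noise puts you, and you only find out afterwards.
    return TARGET * random.gauss(1.0, 0.2)

def closed_loop(steps=20):
    # Observe the remaining error and correct repeatedly - the job that
    # eyes, stretch receptors and fingertip pressure sensors do for us.
    pos = 0.0
    for _ in range(steps):
        error = TARGET - pos
        pos += 0.5 * error * random.gauss(1.0, 0.2)
    return pos

random.seed(1)
print("open loop:   %.1f" % open_loop())    # anywhere within ~20% of target
print("closed loop: %.1f" % closed_loop())  # converges right onto the target

Without the feedback loop, every action stays as sloppy as the first attempt - the perpetual-first-driving-lesson feeling described above.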

The only realistic use for mind control or voice control is to allow disabled people to perform simple tasks.

Edited 2011-10-11 04:09 UTC

Reply Score: 3

RE[3]: a load of garbage
by Laurence on Tue 11th Oct 2011 06:18 UTC in reply to "RE[2]: a load of garbage"
Laurence Member since:
2007-03-26

This is a common and completely incorrect assumption about how the human motor control (movement) system functions. The body's motions are controlled by continuous feedback loops provided by muscles, nerves and the motor cortex, not by conscious thought. Movement is only consciously controlled when we initially learn a new task. Conscious thought is only used to initiate a movement, e.g. the desire to go and get a cup of coffee. The actual movements are essentially an automated process.


In fact mind control is extremely tiring because there are no real feedback loops. It is equivalent to being perpetually stuck at the ability level of your first driving lesson.

The only realistic use for mind control or voice control is to allow disabled people to perform simple tasks.


I wasn't aware of much of that either. Thank you.

One thing I will add: even in the case of disabled subjects who are amputees, it's more likely that any 'thought control' would be driven via the nervous system, using the impulses for the limbs they no longer have.

Reply Score: 2

RE[4]: a load of garbage
by unclefester on Tue 11th Oct 2011 07:17 UTC in reply to "RE[3]: a load of garbage"
unclefester Member since:
2007-01-13

Try to move your hand exactly 1mm by conscious thought - it is virtually impossible. However, you can effortlessly perform far more precise movements, such as placing a mouse pointer on a specific screen pixel, when you have continuous positive and negative feedback from stretch receptors in muscles, pressure sensors in your fingertips and visual input.

Reply Score: 2

RE[2]: a load of garbage
by Neolander on Tue 11th Oct 2011 10:43 UTC in reply to "RE: a load of garbage"
Neolander Member since:
2010-03-08

I think this bike example wouldn't work, because thoughts are not detailed enough for the computer to know exactly what you want if you think "I need a bike there".

Have you ever faced a situation where you have something in mind, you think you know exactly what it is, but as soon as you try to explain it or create it (if it is a physical object), you hesitate and must think some more? I believe this reflects the way our mind works. We have a blurry image, and we work out details as needed. Like with vision: we only see a huge load of blur, with a tiny sharp region in the middle, but our brain and eyes silently fetch and parse details on demand, so fast we don't notice.

Again, maybe someone who knows more about the subject than me can confirm. But if it's true, adding a bike in a CGI scene would need as much attention to details with a mind control interface as with a mouse. It would remain very lengthy, because although the brain-computer link could be made a bit faster, brain speed wou

Edited 2011-10-11 10:51 UTC

Reply Score: 1

RE[3]: a load of garbage
by Neolander on Tue 11th Oct 2011 15:57 UTC in reply to "RE[2]: a load of garbage"
Neolander Member since:
2010-03-08

brain speed would be the limiting factor in the end*

(stupid phone browser character limit)

Reply Score: 1

Brainwaves are not standard
by Laurence on Mon 10th Oct 2011 23:07 UTC
Laurence
Member since:
2007-03-26

This is a lengthy post, but it's a subject that has interested me for a while and so I've read a few papers on this over the years.

There's a fundamental flaw with people's argument for using brainwaves to control hardware: no two people's brainwaves are the same. We're all wired differently - which means different parts of our brains "light up" when different people think the same thought. This means that not only does the user have to teach themselves how to use these new computers, but the computer also has to teach itself to read the user.

Things are then further complicated when you factor in how difficult it is for humans to focus on one thought at a time. Humans (like most animals) have scattered thoughts - it's a survival mechanism. So, for example, to play a racing game and focus on just thinking "left" or "right" to steer the car is incredibly hard, particularly as most games (even basic driving simulators) have more to concentrate on than simply left or right turns. Training your brain to isolate those thoughts is thus a /huge/ learning curve which would otherwise be unnecessary with traditional human interfaces (and all of that comes after you've already spent half your day just training the computer to recognise your left thoughts from your right thoughts).
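To make that "half your day training" point concrete, here's a deliberately naive sketch of such a calibration session (all names, numbers and the simulated "epochs" are invented; real systems extract EEG band-power features and need far more trials):

import random

def record_epoch(label):
    # Stand-in for recording a short EEG window while the user holds one
    # thought ("left" or "right"). Everyone is wired differently, so these
    # per-user feature vectors are just simulated here.
    base = {"left": [1.0, 0.2], "right": [0.3, 1.1]}[label]
    return [x + random.gauss(0, 0.3) for x in base]

def calibrate(commands, trials_per_command=40):
    # The tedious part: many labelled repetitions of EVERY single command.
    centroids = {}
    for cmd in commands:
        epochs = [record_epoch(cmd) for _ in range(trials_per_command)]
        centroids[cmd] = [sum(col) / len(col) for col in zip(*epochs)]
    return centroids

def classify(epoch, centroids):
    # Nearest centroid: pick the trained command whose average pattern
    # is closest to what was just measured.
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda cmd: dist(epoch, centroids[cmd]))

random.seed(0)
model = calibrate(["left", "right"])
print(classify(record_epoch("left"), model))  # hopefully "left"

Eighty labelled trials buys you a shaky two-command vocabulary; every extra command means another round of the same drill, per user.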

Furthermore, the reason tactile and verbal interfaces work so well is that they're natural to the way we interact with the world. We don't "will" an object to move; we touch it and it moves. Or we shout at our family to move it for us. Either way, it's a physical or verbal process. And while the process is being performed, more often than not we're thinking about something else (e.g. I'd move a vase while thinking about what to buy my partner for Christmas). So how do you get the computer to separate actual instructions from natural tangents? (Again, going back to the driving sim: how would the computer know the difference between you thinking "brake" and thinking "I need to brake at the next turn"?)

The only research in this field that I thought was even close to being practical was the research with amputees. However, the only reason it was successful was that those subjects already had pre-existing pathways they'd practised using for 20+ years of their lives and which, sadly, are now unused. So they could send the message to move their left finger, and that would trigger whatever device was configured to react to that pathway. But even that research is impractical for those of us lucky enough to have a full complement of limbs: if I send a message to move my left finger, my left finger moves. So if I'm moving my limbs anyway, I might as well just use a tactile interface.

We also have the problem that current technology simply cannot give us fine enough detail on our firing neurons, which makes reading our thought patterns infinitely more difficult. However, I will concede that this point should/will disappear as technology advances and we find more accurate ways to monitor the electrical activity in our brains.

So don't get me wrong, I'm not saying that we'll never have thought-controlled hardware. I'm sure with time we'll gradually be exposed to more and more novelty toys based around this tech, and as future generations grow up they'll be so exposed to these things that eventually humans will have mastered a technique of "thinking" at these devices, and the technology will also be in a better position to accurately read our thoughts. By that time, we'll also have learned from failed experiments and developed ways for computers to learn our individual brain patterns far faster than the lengthy process it is at the moment. When all that falls into place, this technology may well be viable for practical applications. But I very much doubt it will happen in our lifetime.

[edit]

Wow, my post is almost as long as Eugenia's article hehe

Edited 2011-10-10 23:17 UTC

Reply Score: 4

RE: Brainwaves are not standard
by viton on Mon 10th Oct 2011 23:59 UTC in reply to "Brainwaves are not standard"
viton Member since:
2005-08-09

We're all wired differently
So we just need to produce an uber-human and clone it.
Then fill the clones' brains with the same data.

Problem solved

Edited 2011-10-11 00:00 UTC

Reply Score: 2

RE[2]: Brainwaves are not standard
by Laurence on Tue 11th Oct 2011 00:12 UTC in reply to "RE: Brainwaves are not standard"
Laurence Member since:
2007-03-26

We're all wired differently
So we just need to produce an uber-human and clone it.
Then fill the clones' brains with the same data.

Problem solved

Maybe we can then make them our slaves. ;)

Ahh hang on, that would make them voice activated and thus we wouldn't be using thought control. Damnit.

Reply Score: 2

RE[3]: Brainwaves are not standard
by viton on Tue 11th Oct 2011 00:19 UTC in reply to "RE[2]: Brainwaves are not standard"
viton Member since:
2005-08-09

The solution is not ideal, but they could use mind control to do things for us :-)

Reply Score: 2

RE: Brainwaves are not standard
by unclefester on Tue 11th Oct 2011 07:42 UTC in reply to "Brainwaves are not standard"
unclefester Member since:
2007-01-13

We learn to control movement almost entirely by trial and error. This creates a hard-wired algorithm in the neuromuscular system based on continuous feedback. The more we practice, the stronger and more refined the algorithm becomes.

It is extremely unlikely that anyone could ever perform a highly complex task by mind control alone. Because there is virtually no feedback, the brain finds it extremely difficult to learn. The neuromuscular system takes only seconds to master a simple task, such as flicking a light switch. The same task takes hours to learn to control with the mind.

Reply Score: 2

Comment by behemot
by behemot on Mon 10th Oct 2011 23:23 UTC
behemot
Member since:
2005-11-14

I think people who just love spending their day typing in a chair have a real problem with imagination - maybe they love their Alt+Tabbing between multiple windows/applications.

The thing is, I believe that brainwaves as a user interface will be the future, but not with implants and not with complex AIs messing around. It would be better if we could use a display in glasses and/or contact lenses, and if we discretized the elements of the user interface in such a way that we would not need any complex AI to be programmed and trained just to turn the device into something useful.

I think this would be a better solution than messing with implants, which would need a whole new class of regulations (probably international regulations).

Every night, when I spend some time browsing the internet on my iPod before I sleep, I think it would be really nice to just read, zoom and "type" by simply thinking "press 'A', then 'R', then 'S'" instead of typing ARS or pinching the screen. It would be marvelous as well if I could read without holding the device in front of me.

I believe this will be the future: it's better and more accessible - even to someone who cannot control their hands, or has no hands; such people exist - than what we have today. I think it will be the new touch when it finally arrives; I just hope to be alive when it happens.

Reply Score: 1

RE: Comment by behemot
by Eugenia on Mon 10th Oct 2011 23:52 UTC in reply to "Comment by behemot"
Eugenia Member since:
2005-06-28

That's what the article says too. No implants, just eyewear displays in the beginning, and small brainwave attachments in the far future.

Reply Score: 1

RE[2]: Comment by behemot
by viton on Tue 11th Oct 2011 06:41 UTC in reply to "RE: Comment by behemot"
viton Member since:
2005-08-09

http://innovega-inc.com
microdisplay+lenses

You can see virtual images that fill your entire field of view.

Edited 2011-10-11 06:42 UTC

Reply Score: 2

What about Kinect-style cam?
by kloty on Tue 11th Oct 2011 07:17 UTC
kloty
Member since:
2005-07-07

I had some similar thoughts about what the next breakthrough in computer technologies could be: http://technokloty.blogspot.com/2011/08/what-are-next-steps-for-aug...

What do you think about having 2 cams integrated into glasses, so you can do gestures to control your "computer"? Of course it looks silly when people start pointing at nowhere, but speaking into nowhere with a bluetooth headset was funny as well, and people got used to it. I agree that voice recognition is very slow and imprecise.

Reply Score: 1

RE: What about Kinect-style cam?
by zima on Tue 11th Oct 2011 10:47 UTC in reply to "What about Kinect-style cam?"
zima Member since:
2005-07-06

As a main, prolonged means of control, this can probably work well only in films. Or on the ISS, perhaps.

Some starting points...
http://en.wikipedia.org/wiki/Touchscreen#Gorilla_arm
http://en.wikipedia.org/wiki/Gesture_recognition#.22Gorilla_arm.22

BTW, I don't think we really got used to people who loudly "talk to themselves" - we might accept it more readily, but it's still uncanny. It probably (still) triggers, and will continue to trigger, strong pathways in our brains which cause us to (unnecessarily) divert our attention to the speaker (to whom nobody replies - hence "wait, does that ape want something from me?!").

Reply Score: 2

Hacking
by jal_ on Tue 11th Oct 2011 08:26 UTC
jal_
Member since:
2006-11-02

What no one has mentioned, as far as I could see skimming the comments, is the great potential for hacking, and the immense dangers that would ensue. With glasses for augmented reality (blinding someone, for example), and even more with brain implants (the article already mentioned hallucinations - what if an evil hacker sends you those "hallucinations"?). And, to take it into the realm of "trust no one": what if the government abused your implants? I shudder to think of the possibilities less benign governments would have (did I say less benign? German government spyware, anyone?).

Reply Score: 3

RE: Hacking
by righard on Tue 11th Oct 2011 13:34 UTC in reply to "Hacking"
righard Member since:
2007-12-26

The thought of a hallucinated version of "Congratulations, you are the 1.000.000th visitor." makes me shudder.

"...Think of a pink elephant to allow downloading your personal details...[1]"

Reply Score: 3

Excellent Article.
by ExodusMachine on Tue 11th Oct 2011 09:00 UTC
ExodusMachine
Member since:
2005-07-06

@Eugenia, if you want to research a little more deeply into this subject (specifically human-machine integration), look into trans-humanism. It's really a fascinating topic.

One of my favorite online magazines about the h+ community and the future of humanity is hplusmagazine.com. I don't agree with everything there, but it's very thought provoking.

It's also good to see you on the front page again.

Reply Score: 1

Some questions
by Neolander on Tue 11th Oct 2011 10:29 UTC
Neolander
Member since:
2010-03-08

A few questions for people who know more about the field than me.

1/ Would it be possible to efficiently use brainwaves not as a full mind-control interface, but as a keyboard + mouse replacement? Like, to move an on-screen pointer, type words letter by letter, or execute keyboard shortcuts using thoughts. The number of signals the computer must recognize would become much smaller and well-defined, so it seems a simpler technical problem.
2/ If people were trained to use such a system from a young age, would they become as good with it as they are with physical pointing and typing devices, treating the computer as an extra limb, so to speak?

Reply Score: 1

RE: Some questions
by righard on Tue 11th Oct 2011 13:40 UTC in reply to "Some questions"
righard Member since:
2007-12-26

At my local hospital there is a "game" where one has to think a rocket up and down. I got to play with it, and after a few minutes it is very easy.
This was just a y-axis, but I guess it is no problem to extend it with an x-axis.

Also note that this was about 10 years ago.

Reply Score: 2

RE[2]: Some questions
by Neolander on Tue 11th Oct 2011 16:09 UTC in reply to "RE: Some questions"
Neolander Member since:
2010-03-08

I'd be interested in some more details about this, if you still remember it ;) Could you modulate the speed at which the rocket moved? Did you need to spend some time setting up the computer so that it recognized your brain waves? What kind of stuff did you have to think about in order to make the rocket move?

Reply Score: 1

RE[3]: Some questions
by righard on Tue 11th Oct 2011 17:30 UTC in reply to "RE[2]: Some questions"
righard Member since:
2007-12-26

I'll try to describe it as best I can. It is difficult to explain because it is very subjective, I think.

You know how, when you want to hear something better, you 'shift your focus' to your ears, and how you can do this with other parts of your body? (There's a well-known meditation technique where you do this with your whole body, starting from your feet.)
I focused on a part of my brain like this; at first the rocket did nothing, but when I repeated this the rocket started to react. It did so very quickly, well under a minute. It did not react smoothly at first, but after a while (maybe a minute) it reacted very well.
I remember testing whether focusing on other parts of my brain had any effect, which it did not.

EDIT: By thinking/focussing 'louder' I could make the rocket increase its speed.

The fact that I did this by focusing on a part of the brain was just my way of doing things. I just thought that I needed to create some electrical charge at a place in my brain where there were many electrodes. The operator told me to 'just move it up'.

My interpretation is that it doesn't move because you are thinking "please move up", but that you are just pressing buttons with your head. The computer is just looking for peaks in the brainwave signal; maybe it can read frustration or accomplishment, so it can know whether it did the right thing.
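If I had to guess at the mechanics, it could be as simple as this (a speculative toy version in Python - I obviously don't know what the hospital's rig actually did):

# Speculative reconstruction: rocket speed driven by how far the measured
# brainwave amplitude rises above a resting baseline ("pressing a button
# with your head"; harder focus = bigger peak = faster rocket).
BASELINE = 1.0   # resting signal level, arbitrary units
GAIN = 5.0       # speed per unit of amplitude above baseline

def rocket_velocity(amplitude):
    peak = amplitude - BASELINE
    if peak <= 0:
        return 0.0        # no peak detected: the rocket holds still
    return GAIN * peak    # thinking "louder" makes it climb faster

rocket_y = 0.0
for sample in [1.0, 1.2, 1.8, 2.5, 1.1]:   # made-up amplitude readings
    rocket_y += rocket_velocity(sample)
    print("amplitude %.1f -> y = %.1f" % (sample, rocket_y))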

Sorry, this is all I can remember though... Recently they were testing me for a rare form of epilepsy. I got many electrodes glued to my head, going to an amplifier and then via an LPT connector to a recording box.
Next time I have to wear this at home, and I will hook myself up to my computer. I'll do some experiments and post the results.

Edited 2011-10-11 17:31 UTC

Reply Score: 2

RE[4]: Some questions
by Neolander on Tue 11th Oct 2011 18:58 UTC in reply to "RE[3]: Some questions"
Neolander Member since:
2010-03-08

Perhaps this game was based on the detection of meditative brain states through beta waves, like this toy? http://www.youtube.com/watch?v=IHA4j66MCa0

Edited 2011-10-11 19:03 UTC

Reply Score: 1

Not holding my breath...
by steve_s on Tue 11th Oct 2011 17:14 UTC
steve_s
Member since:
2006-01-16

Doug Engelbart gave his famous "mother of all demos" back in 1968. It was not until Apple launched the Macintosh in 1984 that many of the concepts he demonstrated became mainstream. Some of the things he demonstrated have only recently become an everyday reality.

Speech recognition, and speech-driven user interfaces, have been touted since the early 80s. We've had to wait until 2011 to get Siri, and whilst Siri does appear to be fairly smart, I strongly suspect it may take another decade for it to truly mature.

Back in the 80s we all believed that we'd be using speech as a primary interaction method by the mid-90s. We saw plenty of speech recognition demos back then, and it looked like the big problems had been solved. We vastly underestimated the complexity of the task.

There is no doubt that progress has been made in "brainwave interfaces", with fairly impressive demonstrations of people moving mouse pointers with their minds, and monkeys moving robot arms. Direct or indirect neural interfacing is, however, several orders of magnitude more complex than speech interaction. I have no doubt that it's a problem that will eventually get solved, but I'm in my late 30s now and, given the historical rate of progress we've seen in other forms of human-computer interaction, I very much doubt it is something that will get solved within my lifetime.

Reply Score: 2

Amazing
by Moochman on Thu 13th Oct 2011 01:01 UTC
Moochman
Member since:
2005-07-06

This pretty much matches up exactly with my vision of the future of HCI. I didn't really expect the technology to develop this fast, though...

Reply Score: 2