“Operating systems need to evolve from performing traditional file management and I/O tasks to learning and accommodating user behaviors, a Microsoft executive said Tuesday during a keynote presentation at the O’Reilly Emerging Technology Conference here.” Read the article at InfoWorld.
People generally shun attempts to make computers more intelligent. Proof of this abounds: MS Bob went the way of the dodo; the paper-clip helper that pops up when you first start Word is one of the first features most people disable. How many Win2K users actually enjoy those “personalized” menus or find them remotely helpful?
I don’t want to be friends with my computer, nor do I want to have to communicate with it using anything resembling grammatically correct English. If I were forced to, I’d probably say, “Computer, be stupid and really fast. While you’re at it, I’ll pick my own shortcuts, thank you very much.”
“Operating systems need to evolve from performing traditional file management and I/O tasks to learning and accommodating user behaviors…”
Is Microsoft moving out of the embedded, realtime and server markets then?
While I do agree with the notion of computers becoming more tuned in to our needs, I really wish they hadn’t used Microsoft Word as an example of things to come.
“The concept of intelligent software already is taking hold in Microsoft Word”
*Shudder*
And then in five years, when they’ve stitched together this lousy solution, they’ll figure out it won’t work for the general public, and they’ll still look pathetic to me… why couldn’t they have chosen another line of business to torpedo? *sigh*
Looks like another one to go onto the pile of things that are taking away people’s privacy…
>”In this world, centralized security models disappear and you’re going
>to be thinking about digital rights management
>as a security mechanism for exchanging information,”
Clever,
IIRC Microsoft made some arguments that DRM needs to be secure and thus they should never be forced to open that code.
So if Microsoft puts DRM into, say, the .doc format, it will be illegal to reverse engineer it, because that would be a breach of the DMCA.
Where would that leave StarOffice, etc.?
They get found guilty and still they never stop.
It keeps wrongly guessing what my goals are. This is the same reason I hate government. Grrrrrr
All this Linux vs. MS whining some of you are doing is, as usual, off base and tired. This has largely nothing to do with that. Why don’t you check out MSR and see what it is they do? And here’s a tip: a lot of what they do is not tied to any existing MS OS at all. Most of the time, you can’t very well do R&D for future technology if you’re limited by the functions and properties of current technology.
BTW, looking at MS Bob as a prime example of social/user computing is ignorant, to say the least.
I still would rather see them junk it all and start over from scratch. And no, I don’t mean just the OS. ALL OF IT.
Microsoft is proof that throwing money at problems does not solve them. They put so much money and resources into collecting user-ideas and researching user-interface gizmos and behaviors that they don’t really see anything any more. They are too close to their own product, their own goals and their own way of thinking to see the reality.
Personalized menus are an example of Microsoft’s crimes against user interface design. To go further with this concept (which they will do) is just to make it worse.
Current operating systems, Rashid said, are like the “worst nightmare” of an administrator or a secretary. “It starts on the first day and learns nothing,” he said.
Yeah, and filing cabinets and closets don’t learn either.
Microsoft Research still hasn’t learned that the best organizational structure or system is 100% useless if the user of that system is a disorganized person unwilling to take charge.
Looking at this in a pragmatic way, adding human-language interfacing is just another way to justify faster processors and greater memory requirements.
Human languages are vastly inconsistent. People speaking the same language to each other have to work at their conversations to transfer worthwhile information. It’s natural to us and we use it intuitively. Being forced to learn how to change our language behaviors to match the disability of computers that are supposed to understand us will further widen the gap between normal people and computers.
“Currently, our systems aren’t paying attention [to behaviors],” Rashid said. “We have to make our systems understand about probabilistic behavior.” Computers, for example, should remember behaviors and use those as contexts for searching data, Rashid said.
Remembering behaviors doesn’t help when the user does something inconsistent (and they will). Then the expectation is wrong and the action is wrong and time is wasted and frustration created. Look at MS Word. Every time I try to type some simple documents here in the academic computing lab, I have to FIGHT against all the automated crap that it throws into my document. I have to hunt all over the damned software to turn it all off and not all of it can be turned off. MS Word is the best example of why Microsoft should STOP NOW and go no further on this path.
“In two to three years, a terabyte of storage will be pretty common,” Rashid said. “In a terabyte of [data] you could literally remember every conversation you’ve ever had, from the time you’re born to the time you die.”
So what?? Are you admitting that information storage is getting out of hand or just being clever?
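For what it’s worth, Rashid’s terabyte claim is easy to sanity-check for text transcripts (not audio). A back-of-envelope sketch, where every rate below is an assumption for illustration, not a figure from the article:

```python
# Rough check of the claim that a terabyte could hold every
# conversation in a lifetime, stored as plain-text transcripts.
# All constants are assumed estimates.

WORDS_PER_MINUTE = 150        # typical conversational speech rate
BYTES_PER_WORD = 6            # ~5 characters plus a space, ASCII text
HOURS_TALKING_PER_DAY = 4     # generous estimate of daily conversation
YEARS = 80                    # lifetime

bytes_per_day = WORDS_PER_MINUTE * 60 * HOURS_TALKING_PER_DAY * BYTES_PER_WORD
lifetime_bytes = bytes_per_day * 365 * YEARS

print(f"{lifetime_bytes / 1e9:.0f} GB")  # prints "6 GB"
```

Under these assumptions, a lifetime of transcripts is only a few gigabytes; even low-bitrate compressed audio (roughly 1 kB/s) would come to a few hundred gigabytes over the same span, still under a terabyte.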
“Today’s OSes don’t actively learn,” said John Barnhart, director of engineering at Walt Disney Internet Group, in Seattle. The concept of intelligent software already is taking hold in Microsoft Word, which detects errors, Barnhart said.
No, what MS Word detects is what it INTERPRETS as errors. It uncorrects proper grammar into improper grammar, uncorrects names into incorrect words, and still cannot reliably protect the user from using “to,” “too” or “two” in the wrong places. (Look at how many simple grammatical errors like that you find in so-called professional publications; it demonstrates that users themselves are the problem, because they are lazy and think that running a spell-check once is all the proofreading that need be done.)
In other words… I think this guy is full of crap. Not that my opinion (or that of four thousand others) will change anything. Rashid is the senior vice president of Microsoft Research and we’re just users. Oh wait, we’re more than that… we’re their users. Their customers. Their support structure. In fact, we should be listened to.
HAH.
Senior vice president of Microsoft Research. That title doesn’t mean he does any research or knows anything about human-computer interfacing. It’s not necessary for him to know anything other than PR and business. He need not be objective. He just needs to find ways to make it look like he is invaluable to Microsoft; spewing whatever “revolutionary” or “evolutionary” tripe sounds neat and forward-thinking, and gets positive “good for the consumers and good for the economy” kinds of PR, is all he really needs to do.
I am really getting tired of the OS and applications becoming more and more complex, with exponential growth in bugs.
This IMHO applies to everything: Linux, MS, you name it.
I wish for something like BeOS, only simpler.
The whole scam is to drive hardware sales. Think how many cycles are wasted by all the useless eye candy.
OSX is sloooooooooooooooooooow for no good reason.
Apple had a real chance to break the mold and put a very simple interface on OSX, one that would have run really fast on G3s… oops, can’t do that… must drive hardware sales.
:/
Oh well guess I am starting to become like all those old Amigans.
>OSX is sloooooooooooooooooooow for no good reason.
Why is OSX so slow? Why can’t I run OSX on my 604 machine?
Anyone here know why?
>anyone here know why?
Yes. But if you already do not know why, you probably won’t understand the technical details either.
Let’s just say that it could be made faster and more responsive if certain features in the kernel were not so Unix-like. Let’s hope that Apple adopts the FreeBSD 5.x codebase sooner rather than later.
It’s funny, laugh…
http://www.algonet.se/~afb/mac/osxemulator/
How many horrible features are inextricable from Windows, as they claim Internet Explorer is? If MS embeds speech recognition into their product, but later finds out that it doesn’t work and/or people hate it, will it be stuck in there forever?
Remember, MS Windows is not modular. Bugs go in, but they don’t come out.
You were the first person to mention Linux. Everyone else is just slagging off the interfaces that MS has planned for us, and no other OS was mentioned, whether Linux, OS/2, BeOS, MacOS, BSD, Unix, AtheOS, Oberon, V2, etc. In fact, I think most of the correspondents before you actually use Windows, judging by their familiarity with Word.
Have some salt to go with that chip on your shoulder, or we’ll start calling you a “zealot”.
even funnier for a laugh…
http://www.deanliou.com/WinRG/
Every time I have a major assignment due for uni (yeah, some day I will learn not to) I use Word, and have many battles with Word over what I want my document to look like vs. what it wants my document to look like. While this is most certainly a great stride forward in Artificial Intelligence, it is not so in HCI.
In a course on Ethics, we studied an idea put forward that it is unethical to place human-like qualities into software, as this creates false expectations in the user that the computer is a rationally thinking moral agent.
You know what, it’s people like you who hold the industry back.
Natural Language interfacing is a GREAT thing. Ever seen Star Trek? That is a Natural Language Interface. That is what is being strived for here.
Smart Tags in OfficeXP attempt to fix some of the issues with grammar correction, and it works damn well.
Basically, you need to pull your head out of your rear. Just because you want to hold everyone back, and just because YOU think these things, does not mean that everyone else thinks the same.
Don’t call foul on MS just because you are ignorant.
LOL LOL LOL
Now that’s funny.
Not only funny but sooooooo true.
CPUGuy,
You have to be kidding. What advantage does a natural language interface really have? Pretty much none.
You know, most futuristic technologies that they put in movies are just about looking cool. I doubt that the computers in Star Trek have natural language interfaces because it’s efficient; my guess would be that it’s to show that computers have advanced to something “intelligent.” A good way to show that in a movie is to make the computer act like a human, but that doesn’t mean it would be the best way to make “intelligent” computers in real life.
What is better: writing/saying a few words, or just pressing a button?
And if we use the spoken word as an input, how would that look (or sound) in offices? You would have to make the cubicles completely isolated or people would go nuts.
Nah, human language interfaces will never be more than a “cool thing,” but I’m not saying that we should stop developing input devices; I think there’s a lot that can be done there.
The obvious step would be mind-controlled computers, and a lot of progress is already being made in that area, but it will take a long time before they are ready for the consumer market. Also, software would need to be written for them, because using them with current UIs would be very inefficient.
However, there are a lot more concepts to try out before we get mind-controlled computers. I’d like something to replace both my keyboard and my trackball with something that is more efficient, not just easier to use the first time.
I’ve been reading a lot of articles on MSR and I have to say that I think they are nuts; most of their ideas about the future of computing are just about making cool things. It seems like they are looking at sci-fi movies for hints about what consumers will want in the future.
The general idea seems to be that you shouldn’t need to learn anything to be able to use a computer. Why? Is it that much to ask of users to spend a few weeks learning how to control the computer in a more efficient way, when it will save them months later on?
I don’t think so.
Hey come on, give the newbies a chance……
>Let’s just say that it could be done faster and more responsive if certain features were not so unix-alike in the kernel. Let’s hope that Apple will adopt the FreeBSD 5.x codebase sooner.
???? I always thought that the slowness of Mac OS X was mostly caused by Apple not having any hardware acceleration for the video…
What would be so slow about FreeBSD? And how is it relevant, since Darwin uses the Mach microkernel, which, to the best of my knowledge, FreeBSD doesn’t?
The interface is not about efficiency or being cool, but convenience. Going back to Star Trek, you’ll notice that they still use “keyboard” interfaces. They use natural language interfaces when they are walking around, away from a keyed interface. Without that, the users would have to find a console to interact with the computer. There is also security involved, with voice-print IDs. I agree that in an office it would not work; just as in engineering on Star Trek, the engineers are all in front of consoles. I think saying that natural language interfaces are just bad and have no place on computers is wrong. Didn’t people use to argue that mice were bad because they took your hand off the keyboard and would therefore be inefficient?
Each tool has a purpose.
Again, that’s what science does (yes, making software/hardware is a science).
They take ideas from sci-fi movies and bring them to reality. Sci-fi movies are what give inventors and innovators their ideas.
For example, flip-out phones: where do you think that came from? The original Star Trek series, with their communicators. Teleportation: yes, there is work being done on teleportation; last I saw, a couple of years ago, they were able to teleport a photon three feet across a desk… again, Star Trek. Hell, even the gel packs in Star Trek: Voyager are being researched TODAY.
Sci-fi movies and science feed off each other, and that’s a GOOD thing.