Linked by Neolander on Sun 2nd May 2010 12:23 UTC
Talk, Rumors, X Versus Y When computers, machines that can be reprogrammed at will, became widely available, they generated a well-deserved buzz. Some (scientists, banks, insurance companies) felt like they were dreaming; others (like some SF writers) saw them rather as a nightmare. Today, like it or not, they're everywhere. However, part of the consumer computing market is drifting away from the original programmable-machine model. These devices look more like ordinary tools with a fixed purpose, their customizable internals accessible only to the people who engineered them. Is there no longer a market for general-purpose personal programmable machines, able to do just about anything? I'll try to answer this question, taking into account two major trends in the personal computing market: touchscreen-powered devices and cloud computing.
There's room for everything.
by drstorm on Sun 2nd May 2010 13:08 UTC
drstorm
Member since:
2009-04-24

I believe that all these devices and concepts have their uses. The only thing that differs is the number of users, but even the dumbest gadgets are used by someone. That's why general-purpose computing (GPC) will probably never completely die out.

Now, GPC has proven its worth over a long period of time. That, and the very limitations of the specialized devices presented in the article, in my opinion, ensure that GPC is here to stay for the foreseeable future.

New ideas emerge (or old ones get reevaluated) and naturally, to make room for them, the current ones have to suffer somewhat. However, I don't think GPC is in any real danger, as the new technologies are not valid replacements (yet).

The best insurance of a technology's survival is demand, and I think a lot of people will demand GPC for a long time to come. Me included.

Reply Score: 10

General Purpose Computing won't die
by jokkel on Sun 2nd May 2010 13:21 UTC
jokkel
Member since:
2008-07-07

It's true that more and more specialized computing devices have entered and are entering our lives. TV-recorders, game consoles, phones, navigation systems, mobile media players, restaurant order PDAs, cash registers, ATMs and countless other categories.

But on the other hand, general-purpose computers are still used, and in great numbers. Their main strength remains being able to do everything in one device (games, media playback, calculation). It's the only machine that can gain additional functionality at any time. The downside, of course, is reduced usability and reliability.

Specialized devices are exactly what the name implies: specialized. They allow more people access to technology to improve their lives. E.g. the iPad makes the Web accessible to people who never know which mouse button to click or what a file is supposed to be. This is a good thing.

Not everybody needs a general-purpose computer of their own. But lots of people do and always will.
All this complaining à la Cory Doctorow about the end of computing as we know it reminds me of another time: the time we finally got easy-to-use GUIs for our PCs. The old guard complained about how this would dumb everything down too much, and how soon none of these WIMP users would know how to properly use a computer anymore. They were wrong then and are wrong now.

Every one of us will use a lot more computers in the future than we do today. More people who don't use one today will use one in the future, too. And of those, only a fraction will be PCs as we know them today. But they won't cease to exist.

Reply Score: 8

It depends
by jiraiya on Sun 2nd May 2010 13:31 UTC
jiraiya
Member since:
2010-05-02

We must take into consideration that most (I said MOST) of today's average computer users don't care about that kind of thing. They just want to turn on their machines and look at some social networking website. If it's "cool", if "everybody is using it", why would they care that all their personal data is stored remotely? So what if the page has more ads than content?

Sometimes I feel that people like us will become some sort of "resistance" against some giant company that will rule the world. Sounds stupid now, but maybe in a few years?

Suddenly I remembered the movie WALL-E. All humans had a screen right in their faces. They didn't even need to touch it. And they liked it...

Reply Score: 6

Comment by massysett
by massysett on Sun 2nd May 2010 13:49 UTC
massysett
Member since:
2007-12-04

Yep, general purpose computing will die. Enterprises will replace thousands of PCs with iPads. Workers will type emails on fiddly touch screens. Sensitive proprietary data will be stored with Google. CAD drawings will be produced on iPhones.

Completely ridiculous.

Reply Score: 15

Get off my lawn argument.
by theTSF on Sun 2nd May 2010 15:06 UTC
theTSF
Member since:
2005-09-27

This seems like a general get-off-my-lawn argument. You are comparing older applications to a newer interface. Yes, the current versions of Photoshop won't work well with a touch screen... That doesn't mean it is the end; Photoshop will need to be redesigned for the touch screen. There are a lot of things you can do with a touch screen that you cannot do with a mouse. Pinch zoom is one of them. Sure, you can zoom in with a mouse gesture, but a pinch zoom is often more useful.

Secondly, the argument is really based on pixel-based design, which is slowly going away as the pixel isn't as accurate as it used to be. With anti-aliasing and image compression, if you are off by a few pixels, so what?

The argument smells a lot like when we moved off the command line to point-and-click WIMP interfaces: how the mouse just isn't as accurate or as functional as the keyboard, and it really makes things that much worse.

Reply Score: 1

RE: Get off my lawn argument.
by Laurence on Sun 2nd May 2010 16:28 UTC in reply to "Get off my lawn argument."
Laurence Member since:
2007-03-26

"This seems like a general get-off-my-lawn argument. You are comparing older applications to a newer interface. Yes, the current versions of Photoshop won't work well with a touch screen... That doesn't mean it is the end; Photoshop will need to be redesigned for the touch screen. There are a lot of things you can do with a touch screen that you cannot do with a mouse. Pinch zoom is one of them. Sure, you can zoom in with a mouse gesture, but a pinch zoom is often more useful.

Secondly, the argument is really based on pixel-based design, which is slowly going away as the pixel isn't as accurate as it used to be. With anti-aliasing and image compression, if you are off by a few pixels, so what?

The argument smells a lot like when we moved off the command line to point-and-click WIMP interfaces: how the mouse just isn't as accurate or as functional as the keyboard, and it really makes things that much worse."


However, just like CLI, the mouse will still continue to be available to use for those who need or prefer it.

People keep talking about x killing y but the reality is so long as people are varied and given freedom of choice, then technology will remain equally varied.

So it's not an either/or situation. It's just a new method of input available for those who want or need it.

Reply Score: 5

RE: Get off my lawn argument.
by marcp on Sun 2nd May 2010 16:54 UTC in reply to "Get off my lawn argument."
marcp Member since:
2007-11-23

"There are a lot of things you can do with a touch screen that you cannot do with a mouse. Pinch zoom is one of them. Sure, you can zoom in with a mouse gesture, but a pinch zoom is often more useful."

You must be kidding us. It is obvious [to some] that pinch zooming is there because those devices are crippled down and fcuked with tiny little screens, so the content doesn't fit right on them. Bam! Here comes zoom as a "semi-resolution". I don't know how this can be practical, sorry.

The command line is no argument. Why? Because it's still here: the 21st century, and we still use it. You wanna add some trick to your Win box? You gotta use the command line or the registry, which is all about typing commands, not clicking the crap out of your box.
Servers are nothing without the CLI, my friend. Ask HP about their HP-UX, Sun about Solaris, and the other giants that produce SERVER operating systems, not so-GUIsh stuff that hides everything from you and uses your OS resources just to draw some incredibly huge button right in front of your eyes.
Some of us prefer a good log message to a silly dialog box with a "you are fcuked: error 0000000x32" message.

Reply Score: 6

RE[2]: Get off my lawn argument.
by darknexus on Sun 2nd May 2010 17:46 UTC in reply to "RE: Get off my lawn argument."
darknexus Member since:
2008-07-15

Well, not all interfaces are served better without touch screens. Two instances, imho:
1. Small mobile devices like smartphones. These aren't full computers with lots of keyboard shortcuts and the like, and having to arrow through menu after menu (Windows Mobile, I'm looking at you) just to find the item you want is more than bothersome. It's much easier on such devices to scroll and touch. Not to mention web browsing of course. It's much easier to touch a link or button than it is to arrow around to it, and it's not as though you'd want a mouse on these devices when the mouse would be as big as the phone to begin with.
2. Public kiosk systems. Love 'em or hate 'em, touch screens are much easier to keep clean than a keypad and are less prone to failure when roughly handled.
However, most other interfaces don't really benefit much in the end from a touch screen imho. On larger screens, the gestures become too exaggerated and that's just on flat devices. I wouldn't want a touch screen desktop or laptop, my arms would get tired in minutes if touch was our only method of input on them.

Reply Score: 3

RE[3]: Get off my lawn argument.
by marcp on Sun 2nd May 2010 18:52 UTC in reply to "RE[2]: Get off my lawn argument."
marcp Member since:
2007-11-23

Implementing touch screens into the mobile devices? Yes.
Implementing them into the kiosk systems? surely.
Implementing the full browsing experience on mobile devices? No.

That is just not possible at the size of the typical smartphone screen.
You would either cripple the device down or just not provide such "innovations". IMHO it's better to buy something bigger just for web browsing instead of a smartphone.

Reply Score: 2

nt_jerkface Member since:
2009-08-26

"Ask HP about their HP-UX, Sun about Solaris, and the other giants that produce SERVER operating systems, not so-GUIsh stuff that hides everything from you and uses your OS resources just to draw some incredibly huge button right in front of your eyes."


That's BS. If anything, the CLI hides stuff by not making your options apparent. For command-line arguments you have to put in a bunch of ugly flags, and a typo can screw up the whole thing. The UNIX-Haters Handbook has some funny examples of this.

The GUI resources argument is also BS. It isn't 2000 anymore; the GUI overhead in Windows Server is negligible unless you are running on an old machine. Take a look at the CPU idle sometime: it doesn't take much work for Windows to update a 2D frame that is drawn by the video card.

The CLI can of course be very useful and I use it all the time but command line elitism is really silly. GUIs can be very useful for servers, especially for virtualizing and data monitoring.

Reply Score: 2

RE[3]: Get off my lawn argument.
by marcp on Mon 3rd May 2010 09:53 UTC in reply to "RE[2]: Get off my lawn argument."
marcp Member since:
2007-11-23

"The GUI resources argument is also BS. It isn't 2000 anymore; the GUI overhead in Windows Server is negligible unless you are running on an old machine. Take a look at the CPU idle sometime: it doesn't take much work for Windows to update a 2D frame that is drawn by the video card."

Might I remind you that we are talking about SERVERS here. You don't even need a GUI on a server, and that's a fact. It can be a browser interface with minimal overhead, but a GUI? It's no argument to say "look, we have such powerful machines nowadays". Of course we do, but it doesn't mean that the programmers of GUI OSs should bloat their operating system to an incredible extent just because the memory or CPU power is there.
P.S. Servers don't usually need 3D *unless* you are using them to virtualize Windows instances...

"That's BS. If anything, the CLI hides stuff by not making your options apparent. For command-line arguments you have to put in a bunch of ugly flags, and a typo can screw up the whole thing. The UNIX-Haters Handbook has some funny examples of this."

I'd rather say the UNIX-Haters Handbook is no argument: a few frustrated people who are just bored with the CLI [although they know it PERFECTLY from an early stage of its development]. It's all about productivity and simplicity, not some "silly elitism", so we agree on that part.

Reply Score: 2

nt_jerkface Member since:
2009-08-26

"You don't even need a GUI on a server, and that's a fact. It can be a browser interface with minimal overhead, but a GUI?"


So what are you saying here? You're against GUI interfaces for servers but don't care if there is a browser open? That's a minimalist GUI and having an interface for something like Apache open as well will hardly make a difference.


"It's no argument to say 'look, we have such powerful machines nowadays'. Of course we do, but it doesn't mean that the programmers of GUI OSs should bloat their operating system to an incredible extent just because the memory or CPU power is there."

Ever noticed how much CPU Windows uses at idle? Using a couple of percentage points of CPU share is not bloat. As I said, this was more of an issue around 2000, when servers had 1 GHz P3s and 256 MB of RAM. With a quad-core server with 8 GB of RAM, these concerns are no longer valid.

There is definitely a legitimate concern for stability when it comes to running X which is why I would only run FreeBSD from the command line. The Windows GUI however is something you can ignore.

Reply Score: 2

RE: Get off my lawn argument.
by vivainio on Sun 2nd May 2010 18:25 UTC in reply to "Get off my lawn argument."
vivainio Member since:
2008-12-26

"There are a lot of things you can do with a touch screen that you cannot do with a mouse. Pinch zoom is one of them. Sure, you can zoom in with a mouse gesture, but a pinch zoom is often more useful."

More useful than zooming with mouse wheel?

Anyway, the mouse is a somewhat unexciting device. The keyboard, OTOH... people who do useful stuff will always need one.

Edited 2010-05-02 18:25 UTC

Reply Score: 4

RE: Get off my lawn argument.
by nt_jerkface on Mon 3rd May 2010 06:33 UTC in reply to "Get off my lawn argument."
nt_jerkface Member since:
2009-08-26

"Sure, you can zoom in with a mouse gesture, but a pinch zoom is often more useful."

You can also use the scroll wheel to zoom. Yes, I would rather do that than pinch at the screen. With the mouse I can also pinpoint the exact location where I want to zoom.


"Secondly, the argument is really based on pixel-based design, which is slowly going away as the pixel isn't as accurate as it used to be. With anti-aliasing and image compression, if you are off by a few pixels, so what?"

This is really funny.

So we are going to move to a sketch based web? Where people fingerpaint images and call it good enough? Sorry boss but I can't add any more detail with my finger. I can smudge an area of about 5 pixels but that is it. Do you want me to blur the image for you?

Photoshop touch-screen edition. Yea keep waiting on that one.

Reply Score: 2

RE: Get off my lawn argument.
by cerbie on Tue 4th May 2010 13:09 UTC in reply to "Get off my lawn argument."
cerbie Member since:
2006-01-02

"Sure, you can zoom in with a mouse gesture, but a pinch zoom is often more useful."
Every photo editor I have ever used since I moved from GeoWorks to Windows as a primary GUI has, by default, used mouse wheel up to zoom in at the cursor's position and mouse wheel down to zoom out from that position. Pinch zoom and mouse gestures? Put down the drugs, man.

"Secondly, the argument is really based on pixel-based design, which is slowly going away as the pixel isn't as accurate as it used to be. With anti-aliasing and image compression, if you are off by a few pixels, so what?"
No. We've had good working GUI stacks, like X+Qt, that are very capable of changing DPI (I typically override mine to be 20-30% higher) and having the whole interface follow that change. This has been one reason I preferred X before Win7 (support is still limited, but the defaults are good enough in 7, while still a bit awkward in Vista), and Apple has basically added the feature to OS X, too.
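For what it's worth, a minimal sketch of the kind of DPI override cerbie describes under X; the exact value and file location are my assumptions, not something specified in the comment:

```
! ~/.Xresources -- force the font DPI about 25% above the common 96
Xft.dpi: 120

! Load it into the running X server with:
!   xrdb -merge ~/.Xresources
```

Toolkits that honor the Xft settings then scale fonts (and, when well behaved, widget metrics) against this value instead of assuming a fixed pixel size.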

"The argument smells a lot like when we moved off the command line to point-and-click WIMP interfaces: how the mouse just isn't as accurate or as functional as the keyboard, and it really makes things that much worse."
Which it does. Until you get used to having several terminals open at the same time, without any of them having to be smaller than 80 columns, and bask in the best of both worlds: GUIs where they make sense to use, CLI where you know what you want to do and can just get it done. Then you get an interpolating 800+ DPI mouse and do a little Shin-chan-esque happy dance.

Reply Score: 2

Horses for courses
by mrhasbean on Sun 2nd May 2010 15:16 UTC
mrhasbean
Member since:
2006-04-03

Just as with cars and many other modern devices, we've moved to a society where computers are no longer toys of the rich or the geeky, but consumer devices used by everyone. So they are increasingly designed to be operable by the lowest common denominator, serviced by specialists, and disposable.

However, there are always enthusiasts who just want to play with their toys, so I don't believe the market will die completely.

Reply Score: 4

I'm not sure...
by reduz on Sun 2nd May 2010 15:29 UTC
reduz
Member since:
2006-02-25

The idea that everything will move to the cloud is a little extreme. Native code is still needed for most popular PC applications (video players, image editors, video editors, sound editors, music editors, 3D modelling editors, CAD, web browsers, video games and many other uses). Only the "low end" of applications, the kind that has been around the longest and is not computationally intensive, is moving to the web: mail apps, word processors, etc.

I'm pretty sure that, to solve this, the cloud will eventually be able to deliver native or almost-native code using approaches such as Google's NaCl and PNaCl.

Reply Score: 4

RE: I'm not sure...
by righard on Sun 2nd May 2010 15:39 UTC in reply to "I'm not sure..."
righard Member since:
2007-12-26

I know a very popular site which houses an online video player, if only I could remember its name.

Reply Score: 2

RE[2]: I'm not sure...
by reduz on Sun 2nd May 2010 15:51 UTC in reply to "RE: I'm not sure..."
reduz Member since:
2006-02-25

Oh yeah, I know a very popular site where you can download plenty of movies and episodes too, that you can't watch in that other popular site..

Reply Score: 2

RE[2]: I'm not sure...
by darknexus on Sun 2nd May 2010 15:54 UTC in reply to "RE: I'm not sure..."
darknexus Member since:
2008-07-15

I know a very popular site which houses an online video player, if only I could remember its name.

Ah, but the video is still being decoded locally, even though it comes from the cloud. This is something a "dumb terminal" could not do if it were really nothing more than an interface to the cloud. In that scenario, you would be fed the decoded video stream and your cloud client would just output it. I wouldn't call online video players cloud-based: even though the video is streamed over the web, it is still decoded natively by your machine, so native code is involved in most of the process.

Reply Score: 6

RE[2]: I'm not sure...
by Laurence on Sun 2nd May 2010 16:43 UTC in reply to "RE: I'm not sure..."
Laurence Member since:
2007-03-26

I know a very popular site which houses an online video player, if only I could remember its name.


The video content is online but the player itself is installed locally.

Most websites like YouTube need Flash, Silverlight, QuickTime or any number of other plug-ins installed, and even the HTML5 sites with <video> tags need to run in a web browser that is installed locally (and let's not forget that many of these browsers also depend upon locally installed codecs).

Reply Score: 4

RE[3]: I'm not sure...
by routitz on Sun 2nd May 2010 17:05 UTC in reply to "RE[2]: I'm not sure..."
routitz Member since:
2010-04-23

"I know a very popular site which houses an online video player, if only I could remember its name.


The video content is online but the player itself is installed locally.

Most websites like YouTube need Flash, Silverlight, QuickTime or any number of other plug-ins installed, and even the HTML5 sites with <video> tags need to run in a web browser that is installed locally (and let's not forget that many of these browsers also depend upon locally installed codecs).
"

First, sorry for my poor English ;-)

I don't think the facts you pointed out really matter, as long as portable machines can also run the players smoothly. Portability is a nice advantage, and desktops should be able to compensate for our giving up portability in order to stay competitive. Do they? I'm not so sure.

I believe the desktop's share will shrink as technology advances and handheld machines become powerful enough. But I can still hardly believe this means the death of desktop computing. There are always areas in which performance is much more important than portability, so the performance of handheld machines can never be "enough".

Edited 2010-05-02 17:12 UTC

Reply Score: 1

RE[4]: I'm not sure...
by Laurence on Mon 3rd May 2010 10:04 UTC in reply to "RE[3]: I'm not sure..."
Laurence Member since:
2007-03-26

Some people don't need portability though so they're not giving anything up.

However, if power and portability is really an issue then there is this crazy new invention that I predict will become popular one day: it's called the "laptop".

Hell, even some netbooks are as powerful as four-year-old laptops these days - and let's not forget UMPCs either.

Reply Score: 2

RE[2]: I'm not sure...
by cerbie on Tue 4th May 2010 13:10 UTC in reply to "RE: I'm not sure..."
cerbie Member since:
2006-01-02

That's just cloud storage, not cloud computing.

Reply Score: 2

Naah
by marcp on Sun 2nd May 2010 16:40 UTC
marcp
Member since:
2007-11-23

I won't be adding anything to the touchscreen trend, 'cause that would be a complete waste of time. We all know it's an impractical gadget.

The whole "devices aimed at a particular purpose" thing is just MARKETING. Some ppl want to sell something to the world, so they produce silly new gadgets for dummies.
They base it on the ridiculous - but quite true for most ppl - assumption that most users are conformists, and so will likely take the bait and use their device "just because it's a NEW THING".
That said, reason [consciousness] is the worst enemy of the marketers. Independent thinking doesn't leave any place for their filthy games.
So I wouldn't demonize the whole thing. It's about purpose, conscious choices and our own will.

Reply Score: 2

What about speech input?
by mtlmtlmtl on Sun 2nd May 2010 17:41 UTC
mtlmtlmtl
Member since:
2010-05-02

What if speech recognition were added for text input and commands alongside a touchscreen? Why are we limiting our analysis to touch input devices vs desktop computers? The question should be: "In 2010, given the devices on the market right now, would you use a specialized device with limited input for general computing?" Of course not. But if you had a device with a large, lightweight touchscreen, speech input for text and commands, a camera for gesture recognition (which would solve part of the hovering problem), and a stylus (yes, I used the s-word) for precise interaction, then I don't think most people would still use desktop computers as we know them now. Your article does illustrate well the limitations of current touchscreen devices, though. As for cloud computing, that's something else.

Reply Score: 4

RE: What about speech input?
by darknexus on Sun 2nd May 2010 17:51 UTC in reply to "What about speech input?"
darknexus Member since:
2008-07-15

Nice idea, but speech recognition is still just a bit too iffy. And what about noisy environments?

Reply Score: 3

RE: What about speech input?
by WorknMan on Sun 2nd May 2010 20:29 UTC in reply to "What about speech input?"
WorknMan Member since:
2005-11-13

"What if speech recognition were added for text input and commands alongside a touchscreen?"


Speech input is already here, and at least on Android it is lovely. That being said, I don't think I'd want to use it in an office of 200 people separated by cubicles.

"But if you had a device with a large, lightweight touchscreen, speech input for text and commands, a camera for gesture recognition (which would solve part of the hovering problem), and a stylus (yes, I used the s-word) for precise interaction, then I don't think most people would still use desktop computers as we know them now."


That would sort of defeat the purpose of a portable machine, wouldn't it? If the screen is too large to practically carry around (and thus stationary), it's still technically a desktop. Plus, whether I would use one or not depends on the ergonomics of the thing.

Reply Score: 2

Doomed? No.
by bryanv on Sun 2nd May 2010 17:45 UTC
bryanv
Member since:
2005-08-26

Passe? Absolutely.

Bring on the appliances!

Reply Score: 3

RE: Doomed? No.
by koki on Mon 3rd May 2010 04:43 UTC in reply to "Doomed? No."
koki Member since:
2005-10-17

You have very concisely summarized my own thoughts.

Reply Score: 1

I do not see...
by drcoldfoot on Sun 2nd May 2010 18:20 UTC
drcoldfoot
Member since:
2006-08-25

...cloud computing taking over personal local computing in the near future. There are issues to be raised: privacy, location. I think of security when it comes to my own "Intellectual Property" (yes, I can use that term, as can everybody on the planet who produces a creative work, whether it's a letter, a script, etc.). What's to prevent the government, instead of raiding your house, from just typing an email to Google or Apple and basically downloading your entire life from the cloud? Or, instead of the government, what about a hacking group performing corporate espionage that, rather than attacking your company directly, exploits Google or Apple and compromises your corporation's IP or sensitive data?

The benefits include centralized computing power and stateless data.

But do the benefits outweigh the drawbacks? I personally don't think so.

Reply Score: 2

Thanks!
by Neolander on Sun 2nd May 2010 18:25 UTC
Neolander
Member since:
2010-03-08

First, I'd like to thank everyone for participating with such good comments. Maybe it's because I'm getting a bit tired of the H.264-versus-the-rest-of-the-world war, but even though OSnews generally has high-quality comments, a lot of these are of even higher quality.

"This seems like a general get-off-my-lawn argument. You are comparing older applications to a newer interface. Yes, the current versions of Photoshop won't work well with a touch screen... That doesn't mean it is the end; Photoshop will need to be redesigned for the touch screen."

Question is: is it possible? I've tried the GIMP on an EeePC 701 (yay, the first one), and while I could resize the various toolboxes, they still hid most of the screen estate. Photoshop is a bit better as far as screen estate is concerned, but that's because it makes heavy use of drop-down menus, which are unsuitable on a touchscreen.

And if you always have to switch back and forth between a "tool" tab and an "image" tab, it's rather cumbersome. Then there's the precision issue: unless touchscreen devices are equipped with styluses that have Wacom-like precision, it's hard to do serious work at the pixel scale on such a device. The stylus effectively solves several touch problems; that's one of the ways I think touchscreens could win. Stylus-oriented applications are hard to make, though: anyone remember Windows Mobile 6, or tried to use a drawing tablet for everyday web browsing?

"There are a lot of things you can do with a touch screen that you cannot do with a mouse. Pinch zoom is one of them. Sure, you can zoom in with a mouse gesture, but a pinch zoom is often more useful."

You're right. There's pinch, and rotate; it's written in my article. But then what? My point was that touch offers far fewer general-purpose actions.

"Secondly, the argument is really based on pixel-based design, which is slowly going away as the pixel isn't as accurate as it used to be."

You've horribly misunderstood what I wrote if you think that. I talk about screen estate in terms of inches and centimeters, not in terms of pixels, the way non-OS software should have been talking about it for ages. The fact that buttons get smaller as screen resolution gets higher is actually a bug that should never have appeared in OSs and should have been fixed by now. Fitts's law is not about pixels.
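For reference, Fitts's law does indeed speak in physical distances, not pixels. In its common Shannon formulation:

```latex
T = a + b \log_2\!\left(1 + \frac{D}{W}\right)
```

where T is the time to acquire the target, D the distance to it, W its width along the axis of motion (both in real-world units), and a, b empirically fitted device constants. Only the ratio D/W matters, so a button that shrinks in centimeters as resolution grows genuinely becomes harder to hit, whatever its pixel dimensions.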

"With anti-aliasing and image compression, if you are off by a few pixels, so what?"

I just didn't understand that.

"The argument smells a lot like when we moved off the command line to point-and-click WIMP interfaces: how the mouse just isn't as accurate or as functional as the keyboard, and it really makes things that much worse."

You're perfectly right, it's a bit of a nostalgic argument. Except for one difference: those who prefer the CLI over a GUI for a certain task, and a GUI over the CLI for another, can use both if they like. Lately, however, it sounds like touchscreens and mice are mutually exclusive, except for some unsuccessful products that try to combine both without providing serious touchscreen support (from HP, Archos...).

"People keep talking about x killing y, but the reality is that so long as people are varied and given freedom of choice, technology will remain equally varied."

See above. I think it's not like GUI vs CLI, because a single machine can do both GUI and CLI, whereas the touchscreen market emerging nowadays is incompatible with the old behavior. And I don't see many people buying two computers in order to get both.

"What if speech recognition were added for text input and commands alongside a touchscreen?"

It may solve some issues, but you won't be typing trade secrets or written porn on your computer anymore ;) Speech recognition is nice, but the "anyone can hear you" argument is IMO the #1 reason why it never truly caught on (#2 being the poor quality of said recognition, especially in noisy environments).

"Why are we limiting our analysis to touch input devices vs desktop computers? The question should be: 'In 2010, given the devices on the market right now, would you use a specialized device with limited input for general computing?' Of course not. But if you had a device with a large, lightweight touchscreen, speech input for text and commands, a camera for gesture recognition (which would solve part of the hovering problem), and a stylus (yes, I used the s-word) for precise interaction, then I don't think most people would still use desktop computers as we know them now. Your article does illustrate well the limitations of current touchscreen devices, though. As for cloud computing, that's something else."

Your point is perfectly valid, and shame on me for not describing this with more attention in my article. What I'm talking about is today's "pure touchscreen" model, the one used in iPhones, Androphones, tablets, iPads... Tomorrow's technology may totally change the situation. However, we might then be drifting further and further from some of the core ideas of the current touchscreen thing: you interact with it directly with your hands so it's simpler, and it's only a screen so you can easily carry it around.

Well, not all devices would be served better without touch screens. Two instances imho:
1. Small mobile devices like smartphones. These aren't full computers with lots of keyboard shortcuts and the like, and having to arrow through menu after menu (Windows Mobile, I'm looking at you) just to find the item you want is more than bothersome. It's much easier on such devices to scroll and touch. Not to mention web browsing of course. It's much easier to touch a link or button than it is to arrow around to it, and it's not as though you'd want a mouse on these devices when the mouse would be as big as the phone to begin with.

Maybe. I prefer to have some buttons, myself, at least for typing and common features like task-switching, but I guess it's the same as those people who prefer CLI over GUI for some purposes ;)

Edited 2010-05-02 18:30 UTC

Reply Score: 3

Right point, wrong arguments.
by Nicholas Blachford on Sun 2nd May 2010 18:32 UTC
Nicholas Blachford
Member since:
2005-07-06

I think the article makes a good point but it's using the wrong arguments.

General purpose means you can do whatever you want; the whole resolution argument has nothing to do with this. That just means a different interface: it doesn't limit what you can do with a device, just how you control it.

The cloud computing thing is a better argument: by putting your apps online you are limiting the apps you can run to those supplied by the vendor, but that's only if you are limited to one vendor, which is not likely to be the case. In any case, web apps can be general purpose; the difference is in where you run them, not what they can or cannot do.

However, there is a good point to be made about the end of general purpose computing. I don't think it'll be the end but I do think general purpose machines could become a niche.

I think the iPad shows what will happen: it'll replace PCs for many people, and it'll still be general purpose, but not in quite the same way.

I think PCs will become like SLR cameras are today: highly flexible devices that sell in reasonably high numbers. Point-and-shoot cameras are much less flexible and don't give the same results, but sell in vastly higher numbers.

The level of flexibility we want will still be available but you'll probably pay rather more for it.

Reply Score: 2

RE: Right point, wrong arguments.
by Neolander on Sun 2nd May 2010 18:42 UTC in reply to "Right point, wrong arguments."
Neolander Member since:
2010-03-08

General purpose means you can do whatever you want; the whole resolution argument has nothing to do with this. That just means a different interface: it doesn't limit what you can do with a device, just how you control it.

Wrong. I've used a 15-inch screen and then a 17-inch one. For coding purposes, the second is *much* better. And coding is not about pixel resolution, since text display is resolution-independent. When you have more space, you can do more things with it. With touchscreen-sized buttons, we'll effectively have less space.

Sure, it's still possible to do the same thing using scrolling, tabs, and other tricks. But at some point it's not practically usable anymore. You miss the "I see everything at once" thing, which is really important.

In any case, Web apps can be general purpose, the difference is in where you run them, not what they can or cannot do.

You're right, it's the "personal" in personal computing that's dying here. You don't own your computer anymore; you're a minion of the New IBM.

However, there is a good point to be made about the end of general purpose computing. I don't think it'll be the end but I do think general purpose machines could become a niche.

I think the iPad shows what will happen: it'll replace PCs for many people, and it'll still be general purpose, but not in quite the same way.

I think PCs will become like SLR cameras are today: highly flexible devices that sell in reasonably high numbers. Point-and-shoot cameras are much less flexible and don't give the same results, but sell in vastly higher numbers.

The level of flexibility we want will still be available but you'll probably pay rather more for it.

That's what I see happening too, if the touchscreen thing catches on. Hence this article ;) But you put the point much better.

Reply Score: 1

kragil
Member since:
2006-01-04

And I call BS.
The calculation is really that simple: I don't see why I couldn't do anything on an upcoming MeeGo smartphone with HDMI out and an LCD TV.

Soon phones will have 1.5 GHz CPUs and 1 GB of RAM... those things can do anything (as long as they run Linux :OP)

Reply Score: 2

Neolander Member since:
2010-03-08

And I call BS.
The calculation is really that simple: I don't see why I couldn't do anything on an upcoming MeeGo smartphone with HDMI out and an LCD TV.

Soon phones will have 1.5 GHz CPUs and 1 GB of RAM... those things can do anything (as long as they run Linux :OP)

What's a WiDi? Some kind of pointing device that overcomes touchscreen limitations? If so, why not... If not, cf. the article and comments. It's not only about computing power and the number of pixels on the screen. If your pointing device (the smartphone) is still small, the precision-related issues remain...

Reply Score: 1

kragil Member since:
2006-01-04

http://www.engadget.com/2010/01/07/intel-announces-widi-hd-wireless...

For pointing there could be a touchpad on the keyboard. Or you could use the phone's touchscreen as a touchpad (if you are cheap and the software is flexible).

Reply Score: 2

Speech
by Bringbackanonposting on Sun 2nd May 2010 23:48 UTC
Bringbackanonposting
Member since:
2005-11-16

Re speech recognition:
Personally, I'm not a fan; I have used it in the past.
Advances in technology could make it more of an option, though. How about voice recognition software that can pick out your quiet speaking in a room full of people talking? What about software recognising your voice spoken just above a whisper? The person in the next cubicle won't understand you, but the software will. Better software/hardware and algorithms might be able to do this; you just never know. If there's a will....

Reply Score: 1

RE: Speech
by Neolander on Mon 3rd May 2010 04:40 UTC in reply to "Speech"
Neolander Member since:
2010-03-08

Advances in technology can make it more of an option.

You're right; I'm talking about today's speech recognition, plus perfect detection of the spoken words.

How about voice recognition software that can detect your quiet speaking amongst a room full of people talking?

With some tiny directional piezo microphones close to your mouth and noise removal, I think this is already doable in hardware and software today ;) Prices just have to drop, which will happen someday.

What about the software recognising your voice spoken just above a whisper?

That's possible on a longer time scale. Microphone and amplifier technology just has to get a lot more sensitive. But if your ear can do it, a good mic can... Sound quality will be awful, but with speech recognition algorithms as good as those of the human species, who knows?

The person in the next cubicle won't understand you but the software will.

That's where I don't follow you. I think there's a psychological issue with talking about private matters in a public place where anyone can walk by. But maybe I'm totally wrong...

Edited 2010-05-03 04:43 UTC

Reply Score: 1

Envying1
Member since:
2008-04-22

All that touchable stuff will just evolve into something like a TV remote for general consumers...

Reply Score: 1

Comment by ZacharyM
by ZacharyM on Mon 3rd May 2010 04:19 UTC
ZacharyM
Member since:
2007-05-28

While I agree that many of these devices have created greater convenience, I do not believe that we should be splitting everything into single-purpose devices; over-specialization can be a negative aspect in many areas. One potential downfall I can see is increasing costs, from having to purchase a separate device for each task you would like to complete. It could be beneficial for large companies milking the consumer for every penny they can get.

Reply Score: 1

RE: Comment by ZacharyM
by Neolander on Mon 3rd May 2010 04:31 UTC in reply to "Comment by ZacharyM"
Neolander Member since:
2010-03-08

While I agree that many of these devices have created greater convenience, I do not believe that we should be splitting everything into single-purpose devices; over-specialization can be a negative aspect in many areas. One potential downfall I can see is increasing costs, from having to purchase a separate device for each task you would like to complete. It could be beneficial for large companies milking the consumer for every penny they can get.

Strange, what you're describing reminds me of a company who produces portable media players... ;)

Reply Score: 1

Oh no.
by strcpy on Mon 3rd May 2010 05:55 UTC
strcpy
Member since:
2009-05-20

OSNews is at it again.

Unrealistic nonsense.

Reply Score: 0

RE: Oh no.
by Neolander on Mon 3rd May 2010 06:04 UTC in reply to "Oh no."
Neolander Member since:
2010-03-08

OSNews is at it again.

Unrealistic nonsense.

1/ This was not written by the OSNews editorial team, even though their help was invaluable ;)
2/ To what extent is this nonsense? Can you please go into more detail?

(The author ^^)

Edited 2010-05-03 06:05 UTC

Reply Score: 1

No
by nt_jerkface on Mon 3rd May 2010 06:17 UTC
nt_jerkface
Member since:
2009-08-26

I saw some guy using an iPad at harbucks the other day.

He was clearly using way more effort than me to surf the web. It was actually quite funny. I was using a wireless laser mouse and making very small movements while he looked like he was frantically fingerpainting. He also had to lay the ipad flat to type on it.

Touchscreens are fine for phones but general computing? Oh God no. We're just in the middle of Apple hype when it comes to tablets. Most people would rather have a netbook than a tablet.

Reply Score: 2

RE: No
by mkone on Tue 4th May 2010 20:58 UTC in reply to "No"
mkone Member since:
2006-03-14

I saw some guy using an iPad at harbucks the other day.

He was clearly using way more effort than me to surf the web. It was actually quite funny. I was using a wireless laser mouse and making very small movements while he looked like he was frantically fingerpainting. He also had to lay the ipad flat to type on it.

Touchscreens are fine for phones but general computing? Oh God no. We're just in the middle of Apple hype when it comes to tablets. Most people would rather have a netbook than a tablet.


So, you figured he was struggling because he was making larger movements than you were? I can't think of a more ridiculous argument.

It's like saying regular drivers struggle because their steering wheels take 2.5 turns lock-to-lock, compared to 0.9 turns in a Formula One car.

These are different computing paradigms. In fact, I would say that the person on the iPad struggles less because he has multitouch, so he can interact with more than one screen element at a time, unlike a keyboard-and-mouse user. And needing to be less precise is actually a _good_ thing. One thing I hate about mice is that they force me to be more precise than I need to be. And they require you to learn an unnatural hand/eye coordination to be able to use them.

Reply Score: 1

same old same old
by l3v1 on Mon 3rd May 2010 11:54 UTC
l3v1
Member since:
2005-07-06

OK, so here we go again with bedtime stories about general-purpose PCs flying out the window. And again, I'll come up with one (of many) arguments against that: transitioning completely to portable dumb terminals (phones, pads, whatever) and cloud-based apps presumes a constant high-bandwidth connection, and enough available computing power in the cloud to host all the algorithms and workloads the user can come up with. Which is insane! Half the algorithms I'm dealing with require large amounts of data with realtime availability. Another one: what provider would be happy to see me stream multiple live high-resolution camera feeds into the "cloud" just to run my algorithms on them?
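To put rough numbers on that camera-feed point (my own back-of-the-envelope figures and assumptions, not anything from the comment above): even a handful of uncompressed 1080p streams needs an uplink far beyond any 2010 consumer connection.

```python
# Hypothetical scenario: four uncompressed 1080p RGB camera feeds
# pushed to a remote "cloud" server in real time.
width, height = 1920, 1080      # pixels per frame
bytes_per_pixel = 3             # 24-bit RGB, no compression
fps = 30                        # frames per second
cameras = 4                     # "multiple live feeds"

bits_per_second = width * height * bytes_per_pixel * 8 * fps * cameras
print(f"{bits_per_second / 1e9:.1f} Gbit/s uplink")  # ~6.0 Gbit/s
```

Compression changes the constants, of course, but the point stands: the bandwidth assumption behind "everything in the cloud" is doing a lot of work.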

Again, there are so many arguments against the whole subject it's not even funny.

Reply Score: 3

The right tool for the job
by sydbarrett74 on Mon 3rd May 2010 17:55 UTC
sydbarrett74
Member since:
2007-07-24

Touch-screens are very appropriate for small mobile devices (smaller than a netbook). Though not as accurate as other pointing technologies, they work well with devices that get dropped, kicked, and generally beaten up during their lives.

As for virtual keyboards, they are very handy for texting or inputting URL's. Again, with a mobile form-factor, real keyboards stop working, keys fall off, gunk gets stuck between keys, etc. An on-screen keyboard solves these problems. Remember: people aren't going to be typing the Great American Novel(tm) on an iPhone. They're only keying in brief messages.

To address your comment about general-purpose devices: GP computing will never go away. There will always be people who want an all-in-one device. People said mainframes would die off, yet while mainframes comprise a shrinking percentage of all computing devices out there, the installed base of mainframes is still increasing in absolute terms. Why? Because they're very good at what they do.

Reply Score: 1