“I was prepared to write that the Windows 8 interface was forcing unnecessary touchscreen controls on people who wouldn’t appreciate them, particularly if they were simply grafted onto a traditional laptop. But the more I’ve used Windows 8, despite its faults, the more I’ve become convinced that touchscreens are the future – even vertical ones.” I can see his point. I, too, have often felt the urge to touch regular desktop and laptop displays, especially when doing things like photo and video editing.
I don’t believe anything I read on The Verge these days. In fact, if they state something I automatically assume the reverse is true.
… which would mean that touchscreen laptops suck. I will assume that until Anand does a proper review …
“If you want to launch a program on your desktop, which makes more sense? Reach down to a special glass surface and drag a finger across it just long enough to land a floating pointer arrow on top of the icon, and then tap? Or simply reach up to a visible icon and tap it? Why try to aim that pointer at a little X icon, or remember keyboard shortcuts like Alt-F4, when I can just swipe down from the top of the screen to close a Windows 8 program? Why painstakingly zoom a web browser in 10 percent increments using a disembodied keyboard or trackpad when you can smoothly manipulate it between your fingers with pinch-to-zoom?”
Which makes more sense to you, your highness?
Ohh, so that’s how you do it! I was wondering what the “proper” way of doing that was. It’s totally undiscoverable, though, which quite clearly says something about the interface.
I never close applications at all in Windows 8. I just go back to Metro.
That’s what astounded me in using my iPad at work. It was literally years before someone mentioned that if I drag 4 fingers (not 3, not 5) up from the bottom of the screen, I’d find a row of recently launched icons.
I was supposed to discover that… how?
We’ve gone from command lines that required a manual, to GUIs that were entirely discoverable (even the keyboard shortcuts were clearly listed in the menus), back to magic gestures that require a manual. The Dummies book publishers must be ecstatic!
You can also right-click it in the fly-out app switcher at the edge of the screen and click Close.
A friend just bought a Windows 8 laptop with a touchscreen.
He likes it too.
(And he’s a nerd.)
I second that – the only thing probably worse than the articles is the forums themselves. The reality is that touchscreens make sense for devices like the ThinkPad Yoga, but beyond a few niches the attraction is pretty much a novelty that is later ignored – just as Thunderbolt on my old iMac stood idle while I felt high and mighty believing I had something none of the PC users had. When the rubber hits the road, you’ll see those touch desktop screens in use for a while, and then the end user will go back to using a mouse and keyboard, just the same as with traditional laptops.
If the aim of Metro was to spur the development of hybrid devices then hey, that is great, cool bananas. But if they’re trying to get touchscreens into all devices, no matter how stupid and impractical, then for many users the feature will become as useless as the Thunderbolt port on my iMac. If Microsoft really wants to do something that benefits end users, it should expand WinRT so that not only Metro applications but also desktop ones are written against it – with context-sensitive XAML interfaces, where the application launches in either tablet or desktop mode depending on what mode it is being run in at the time (a minimal sketch of the idea follows). An improved, evolved API would also help developers avoid the esoteric issues of Win32 and move to a cleaner set of APIs, so more focus can be paid to the applications themselves rather than to working around issues, and Microsoft wouldn’t have to keep balancing moving forward against maintaining backwards compatibility. IMHO the touchscreen is to Microsoft what Retina is to Apple – a gimmick to bring in the punters, but closer inspection shows very little benefit to end users in day-to-day real-world usage.
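To make that context-sensitive-interface idea concrete, here is a minimal sketch of the pattern being proposed: one codebase that picks a touch layout or a desktop layout depending on the mode the device is in. It is written in Python purely for illustration – every name in it is hypothetical, not an actual WinRT or XAML API.

# Minimal sketch of a context-sensitive UI, as proposed above.
# All names are hypothetical illustrations, not real WinRT/XAML APIs.

from dataclasses import dataclass

@dataclass(frozen=True)
class Layout:
    name: str
    hit_target_px: int   # touch needs far larger targets than a mouse pointer
    dense_chrome: bool   # ribbons/menus for desktop, sparse chrome for touch

DESKTOP = Layout("desktop", hit_target_px=16, dense_chrome=True)
TABLET = Layout("tablet", hit_target_px=40, dense_chrome=False)

def pick_layout(has_touch: bool, keyboard_docked: bool) -> Layout:
    """A docked keyboard (or no touch digitizer) implies desktop mode;
    an undocked convertible with touch implies tablet mode."""
    return DESKTOP if keyboard_docked or not has_touch else TABLET

class App:
    def __init__(self, has_touch: bool, keyboard_docked: bool):
        self.has_touch = has_touch
        self.layout = pick_layout(has_touch, keyboard_docked)

    def on_mode_changed(self, keyboard_docked: bool) -> None:
        # A convertible flipping between laptop and tablet posture would
        # raise an event like this; the same app re-skins itself live.
        self.layout = pick_layout(self.has_touch, keyboard_docked)

app = App(has_touch=True, keyboard_docked=True)
print(app.layout.name)   # -> desktop
app.on_mode_changed(keyboard_docked=False)
print(app.layout.name)   # -> tablet

The point of the sketch is that the application logic stays identical; only the declared layout (in XAML, say) swaps with the mode.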
Am I alone in the world? I don’t want dirty fingerprints on my screen.
Lift your keyboard, flip it 180°, tap the back – surprise!
Kochise
That’s no surprise. I think the point is that I don’t need to read the screen through a dirty keyboard; finger marks all over the screen make it more difficult to read.
Pointing and sliding your finger means your hand hides the screen. To that extent, the Android keyboard is one of the worst pieces of software ever, even in its 4.2.1 Jelly Bean incarnation.
Correctly selecting a portion of text, especially at the edge of the screen, is getting ridiculously hard.
Placing the cursor with no arrow keys on a Retina-like display (Nexus 7) is a matter of a quarter of a millimetre. Just your breathing or even your heartbeat can throw the whole process off; you need a sniper’s steadiness to lift your finger at the exact moment the cursor, hidden beneath your fingertip, lands on the very spot you were trying to reach.
Because, like I said, with no arrow keys you have to repeat the whole process just to move the cursor one character back or forth. That such an incredibly anti-ergonomic keyboard ships in a mass-market product is beyond despair.
Not to mention the auto-correction “features” that don’t select the word you are currently typing but almost always pick a wrong alternative, with characters from the opposite side of the keyboard from the ones you actually typed – especially in landscape mode. Unbelievable.
Kochise
Kochise,
“Correctly selecting a portion of text, especially at the edge of the screen, is getting ridiculously hard.”
I find that to be the case too! URLs can be all but impossible to type, and autocorrect doesn’t help. Since virtual keyboards can’t be touch-typed, one’s eyes have to leave the text field to look at the keys; but if we only look at the keys and assume the entered text is correct, it becomes a futile task to correct a typo without deleting all the characters after it. The touchscreen is much too blunt an input device for selecting characters. Sometimes I even think arrow keys on the virtual keyboard would help text editing.
I fully welcome the reintroduction of traditional tablet form factors with real keyboards. They were always too expensive, but now, with more competitive products, we might finally see a real emergence of good hybrid tablets.
I’ve also bought a little physical Bluetooth keyboard that features those arrow keys.
Kochise
When you type or use the mouse your arms should be _resting_ on the desk.
Otherwise you’ll have problems when you use it for longer than a couple of minutes.
This seems obvious to me, how about you?
And calling it an extra input device? I’m not sure how useful that is, as having to switch between mouse and keyboard is already something people try to avoid.
But with laptops (and this is about laptops), the elbow can rest on the desk while touching the screen.
This seems obvious to me, how about you?
I usually consider that pretty rude. I would expect the number of people beaten to a pulp for touching others’ screens to increase. Not only are you touching the screen and blocking their view, you are messing with their user interface. Not to mention the fingerprint haters out there.
Ever since my days using the Psion 5, I’ve known that touching screens is useful and works for me.
I think the “gorilla arm” fairy tale is just an excuse not to release a cross between a MacBook and an iPad, because it would screw with the product portfolio.
But maybe Apple will do just that – it wouldn’t be the first time.
I liked the touch screen on my Psion 5 too. But that’s because it was a handheld device with a 5.6″ diagonal screen. A screen that lay back when opened, rather than being vertical like a typical laptop/desktop.
The Psion also had a convenient stylus that popped out the side, avoiding fingerprints on the screen. Its EPOC interface was designed for stylus use, and unlike Metro had a finger-unfriendly information density.
“Gorilla arm” comes from reaching out and holding your arm elevated to use a large vertical screen. It’s specifically an issue with desktop screens, and to a lesser extent laptops. Nobody is claiming that touch screens are unsuitable for tablets and palmtops.
But that’s not what you’d usually do with a laptop – your elbow, placed right in front of the “keyboard half”, can offer support (try it, even if none of your laptops has a touchscreen). And this news is about laptops, not desktop screens.
On a very small laptop, i.e. under 15″, a touchscreen might still make sense, but on anything larger you will be stretching your arms for the screen, and they will tire. A touchscreen might be useful for certain kinds of applications, but I sure as hell hope people won’t try to push it as a general replacement for mice and keyboards across the board – if I had to poke at my 24″ display across my table, reaching outwards every single time and bending my wrist 90 degrees upwards (because poking a touchscreen with long fingernails just doesn’t work otherwise), I’d have extremely sore and tired arms and wrists very, very quickly.
Even at 15″, touch is ridiculous.
There are some applications where multi-modal input is useful and better than mouse or keyboard alone on a desktop. But it would definitely cause muscle fatigue for anything longer than short sessions. The obvious solution is to place a separate touchscreen flat on the desktop.
The touchscreen would be most useful for coarse adjustments: scaling and rotating pictures on screen to make a collage, scrolling and zooming documents, quickly flinging app windows to different monitors; apps could display context-sensitive toolbar palettes in games, Photoshop, etc. It could also emulate a touchpad if desired (see the sketch below), and be used with or without a keyboard. All the while, the primary monitor(s) never get smudged by fingerprints. And the desktop interface doesn’t need to be dumbed down for touch, since only the touchscreen has to show the touch controls. The touchscreen could highlight the touch hotspots without uglifying the main desktop screen, which solves a major problem with touch today: the lack of touch discoverability.
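As a rough sketch of the touchpad-emulation point: the same flat surface can drive the pointer with absolute coordinates (touchscreen behaviour) or with relative deltas (touchpad behaviour). The Python below is illustration only – the event tuples and dimensions are invented, not any real driver API.

# One flat touch surface, two pointer mappings (invented event format).

def map_touch(mode, x, y, last, surface_wh, screen_wh):
    """Translate a raw (x, y) contact on the desk surface into pointer input."""
    if mode == "absolute":
        # Touchscreen behaviour: scale surface coordinates onto the main screen.
        sx = x * screen_wh[0] // surface_wh[0]
        sy = y * screen_wh[1] // surface_wh[1]
        return ("move_to", sx, sy)
    # Touchpad emulation: report deltas from the previous contact point.
    if last is None:
        return ("no_move", 0, 0)   # first contact: nothing to be relative to
    return ("move_by", x - last[0], y - last[1])

# Example: a 600x400 surface driving a 2560x1600 screen.
print(map_touch("absolute", 300, 200, None, (600, 400), (2560, 1600)))
# -> ('move_to', 1280, 800)
print(map_touch("relative", 310, 205, (300, 200), (600, 400), (2560, 1600)))
# -> ('move_by', 10, 5)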
Of course, when such systems come to market, I’m sure there will be plenty of fanboys insisting the ideas must have been copied and couldn’t be inspired through device evolution.
You just don’t watch enough movies. Everyone uses touchscreens; it’s totally awesome.
Movie set designers are the epitome of user interface designers and we can learn a lot from movies about how to create interfaces.
.
.
.
.
Like, do the exact opposite. Always.
It works fine on a 17″ desktop replacement, with the elbow resting on the table for support, just in front of the laptop. And I don’t have any unusually long arms. Really, try it.
That would cover the vast majority of laptops that people buy – and laptops are also the (still rising) majority of PCs sold. And the news is about laptops.
I automatically used to be on the side of those who went “eww” whenever anyone spoke about touching screens. But the other day I just found myself doing it on a 24″ screen. I don’t even have a tablet (although I do have a Galaxy S2). I am not saying I would want device interaction only via screen gestures, but at that moment, when my monitor did nothing in response, I actually found myself verbalizing the word “dumb”. Yes, it surprised me, but perhaps more and more people are making this observation?
I wish I had read (and upvoted as insightful) your comment before I posted.
While I didn’t verbalise, I had a mental ‘wtf?’ moment when my non-touch display didn’t react. I think I actually tapped the screen a second time before I realised what I was doing and felt the shame wash over me.
Thanks for your kind words. And by the way, there is no shame.
… there should be a *bit* of shame. Just a bit. Still, nice of him to share that experience.
The only way I see touch coming to the desktop is if the monitor becomes the surface of your desk, tilting up like a drafting table or a photo editor’s table. If the monitor is vertical like they are now, it will fail. And you will still need a slide-out level surface for the keyboard and mouse. The question is, who can afford a 30-40 inch touchscreen to go with their desktop? Of course, you could also do a row of flat panels as multi-monitors.
Did anyone else know that Microsoft makes a table called the Surface?
http://www.samsung.com/us/business/commercial-display-solutions/LH4…
Maybe Surface is the name of the interface?
Good points, but you have missed one essential factor.
In a drawing office you are essentially dealing with MATTE media. Thus reflections from the overhead lighting don’t come into play.
As soon as you have a large, flat, tilted touch surface then, AFAIK (and probably due to current technology), you will suffer a myriad of reflections that will effectively stop you from doing much productive work.
Once the boffins come up with a resilient non-reflective coating for BOTH surfaces of the glass, then we’ll be getting somewhere. Why both surfaces?
Optics 101: as light enters glass, some of it is reflected and some acts as if the glass were a prism (think of the front cover of Dark Side of the Moon, if you are old enough). This depends on the angles at which the wavelengths of light strike the surface. And glass is a two-way medium, so light bounces at both the front and back faces.
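For the curious, the standard number behind this – assuming ordinary glass (n ≈ 1.5) and light hitting it head-on – is the Fresnel reflectance at normal incidence:

R = \left(\frac{n_1 - n_2}{n_1 + n_2}\right)^2 = \left(\frac{1 - 1.5}{1 + 1.5}\right)^2 = 0.04

So roughly 4% of the light bounces at each air–glass boundary, call it 8% for an uncoated pane, before you even factor in the tilted-desk geometry – which is why any serious anti-reflective treatment has to deal with both faces.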
The screen makers can learn a thing or two from the camera lens designers, who have been battling this subject for decades.
We aren’t there yet but I have no doubt that in time it will become an affordable reality.
Will I take it up? I doubt it as I’m less than 5 years away from retirement.
It was called the Surface. For a while there was a bit of text on microsoft.com/surface indicating that the product you linked to had been renamed to “SUR40 with PixelSense” so that they could repurpose the Surface name. Having played with one at a local Microsoft building… it has a unique interface (I think this is the part they now refer to as PixelSense) that seemed incredibly limited. It was little more than a kiosk, and was pretty horrible to use.
I love the idea of a large low-angle touchscreen as a lightbox or architect’s desk style thing. Now I want to know where to get a 30″ touchscreen and a strong luxo-lamp style mount for it to move between desk and display modes.
We actually have 30″ touchscreens at work at 2560×1600 pixels. The touch interface is a separate product, though – last I checked, you couldn’t get an integrated touchscreen in that size and resolution.
Yet.
Touchscreens have no purpose after a certain size. For a phone or tablet, sure, they’re supposed to be small, slim and lightweight.
But beyond, say, 10-inch displays, any movement is very difficult, time-consuming and tiring.
The mouse evolved over the years into a shape that fits your palm, and generally you don’t need to move it more than a few centimetres to reach anywhere on the screen.
The keyboard has the letters specially arranged so people don’t suffer from RSI, and when they know how to use it, they generally have to move their fingers very little.
Buttons will always be here. Look at Sony: they had this dilemma with the PS Vita, and now it has a “touchscreen” which is actually a well-placed touchpad.
I thought the keyboard has its letters arranged to slow down typing on old mechanical typewriters, to stop proficient typists jamming the things when the arms collided if you typed too fast. Nothing to do with RSI (which hadn’t been invented in those days).
That’s largely a myth (by the time QWERTY took its ~final form and started getting popular, there were no technical reasons left: http://en.wikipedia.org/wiki/QWERTY#Contemporary_alternatives ), promulgated by the proponents of some alternatives ( http://en.wikipedia.org/wiki/Dvorak_Simplified_Keyboard#Controversy ).
Plus, QWERTY/QWERTZ was used internationally in old mechanical typewriters, in many different languages – you’d think that would make a difference, that it would jam easily in some languages; but it didn’t.
That’s like saying big drawings are awkward… drafting tables are often quite large. A big touchscreen could be awesome for manipulating media, or as a proper evolution of the drafting table.
The article is rather biased. When did a trackpad become a “special glass surface”, and how did keyboard shortcuts become hard to remember?
I use a computer every day, for most of the time that I am awake, and comfort and efficiency are obviously among my top priorities. After reading that article you would think I would have long ago got myself a touch monitor and that I would be waving my hand at it all day long, because that’s natural, wouldn’t I? Well, I did the unnatural thing and bought a 4000 DPI high-precision gaming mouse, so I can put the pointer anywhere on the two 1080p screens I use without lifting my palm from the desk. Another unnatural thing I constantly do is learn keyboard shortcuts, so even with a very efficient mouse I only have to move my hand away from the keyboard when it’s really necessary.
Lifting the hand from the keyboard and swiping down on the screen takes about a second; pressing Alt+F4 or Cmd+Q/W takes milliseconds.
I never tried to touch the screen of a computer in order to interact with it, although I did try to do it on dumb phones when I had one in my hand.
Yes, my comment is also biased, and touchscreens might take off in laptops someday, but you can’t help wondering how Microsoft spent that entire $1.5bn W8 launch budget.
Geeks are atypical. Generally, people are very bad at gauging such stuff by themselves (for example http://plan9.bell-labs.com/wiki/plan9/Mouse_vs._keyboard/index.html or how some people swore by trackpoints – but actual research shows them to be inferior to touchpads: http://en.wikipedia.org/wiki/Pointing_stick#Comparison_with_touchpa… & http://cat.inist.fr/?aModele=afficheN&cpsidt=18522893 ).
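For reference, the model underlying most of that pointing-device research is Fitts’ law, which predicts the time to acquire a target from its distance and size (Shannon formulation):

T = a + b \log_2\!\left(1 + \frac{D}{W}\right)

where D is the distance to the target, W is its width along the axis of motion, and a and b are constants fitted empirically per device. Devices are then compared by measured throughput – which is exactly why “it feels faster” and “it is faster” so often disagree.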
And the news is not talking about large screens far away, but specifically laptops.
I had my Surface RT for only 3 days before I caught myself prodding the screen of my (non-touchscreen) laptop.
There’s some weird thing in my mind where if I see the new flat/coloured buttons I instinctively want to tap them with a finger.
I also bought a Logitech T650 touchpad to use with my home PC, but for some reason, while I handle a mouse or direct touch automatically, indirect touch for Windows gestures just doesn’t come naturally at all.
As someone who uses VST (virtual instrument) apps with lots of knobs and sliders that are a ROYAL pain in the ass to use with a mouse, I welcome touchscreens on the desktop. Sure, I realize it’s a niche use, but if Windows 8 helps touchscreens rise in popularity, I’m not going to complain.
Maybe it’s partly a niche case because it’s so awkward – touchscreens might help its popularity (the same goes for photo or video editing, which would similarly become much more comfortable on largish touchscreens).
Touchscreens make sense in some cases and not in others. Keyboard/mouse makes sense in some cases and not in others. There’s certainly room for both but if the idea is for touchscreens to replace keyboard/mouse, that would be a mistake of epic proportions.