Linked by Thom Holwerda on Wed 11th Nov 2015 13:53 UTC

The reviews for the Apple Surface are coming in. There are two reviews at The Verge, one at the Wall Street Journal, and John Gruber got early access from Apple as well.

The general gist? If you've ever read a Surface Pro review, you've read all the iPad Pro reviews. Well, mostly - the complaints leveled at the Surface Pro are being tiptoed around a bit now that they apply to an Apple product, of course, and suddenly the magic argument "but it will get better in the future" is completely valid, while the same argument was never considered valid for the Surface Pro (or for something like the Priv and its early bugs).

That being said, all reviews dive into just how uncomfortable the iPad Pro is to use as a laptop - and the problem, of course, is iOS itself. iOS is a mobile, touch-first operating system that Apple is now trying to shoehorn into a laptop role. iOS provides no support for mice or trackpads, and the keyboard and iOS lack most basic keyboard shortcuts, so in order to do anything other than typing, you'll need to lift your arm and reach for the screen to use touch. This is something Apple has mocked for years as the reason not to include touch on laptops, and now they've released a device that requires it 100%.

This is what happens when you run out of ideas and try to shoehorn your cash cow - iOS - into a role it was never intended to fulfill, without being gutsy enough to make the changes it requires. The iPad Pro is clearly screaming for a touchpad (and proper keyboard shortcuts), but it has neither, and according to John Gruber, it never will (a comment I filed away for later, for when Apple inevitably adds mouse support to iOS).

Microsoft's Surface may not be perfect, but its problems stem almost exclusively not from a lack of hardware capability or a faulty concept, but from Microsoft's Metro environment being utterly shit. The concept of having a tablet and a laptop in the same device, seamlessly switching between a tablet UI and a desktop UI, is sound - the only problem is that Microsoft doesn't have a working tablet UI or working tablet applications. Meanwhile, trying to shoehorn a mobile, touch-first UI into a laptop form factor is just as silly and idiotic as trying to shoehorn a desktop UI into a mobile, touch-first form factor - and Apple should know better.

Or should they? Paul Thurrott, earlier this week:

While the iPad Pro was in many ways inevitable, it also points to a crisis of original thought at Apple, which has been coasting on the iPhone’s coattails for perhaps too long. At Apple, the solution to every problem is another iPhone. And the iPad Pro, like the new Apple TV and the Apple Watch, is really just another attempt to duplicate that singular success in other markets.

Thurrott really hits the nail on the head. The iPhone became a success because Apple set out to design - and succeeded in designing - an interface and interaction model built specifically for the iPhone's input methods: the multitouch display and the home button. Ever since that big hit, they've been trying to shoehorn that exact same interface and interaction model into every major new product - the Apple Watch, the new Apple TV, and now the iPad Pro. However, if there's one thing we've learned from Palm OS (pen-first, mobile-first) and iOS (multitouch-first, mobile-first), it's that every form factor needs a tailored interaction model - not a shoehorned one.

When you're a hammer, every problem looks like a nail - which sums up Apple's major new product lines ever since the release of the iPhone, and the iPad Pro seems no different. It will do great as an iPad+, but beyond that? It's not going to make a single meaningful dent without considerable restructuring of iOS' UI and interaction models - and lots and lots of crow.

RE[2]: Some alternative views
by galvanash on Wed 11th Nov 2015 20:41 UTC in reply to "RE: Some alternative views"

1. "it enables far more people to use these things."

BUT it does discourage people from understanding how things work, and it increasingly promotes the idea that a computing device is a magic box which converts input into output in some mysterious way that no one but its creators should be concerned with. I think that is ultimately an impoverishment for our minds.


I share the same feelings to a degree. As a developer who grew up during the 80s, it is hard not to. That said, though, I believe that this viewpoint is just plain wrong and should be avoided.

Why? Because it isn't realistic.

Yes, mobile devices (particularly iOS) are simply not like the PCs we grew up with. They offer extremely regimented environments in which to run applications, and tinkering with them in an effort to "discover" how it all works is pretty much futile. As a conduit to further one's knowledge of how computers work, they are useless and a step backward - no argument. They are, as you said, magic boxes to most users.

But... There is a place in the world for magic boxes. Most of us have a TV. Few of us know how to actually build one, or much care how they work. We don't need to in order to use one. If you really, really want to learn how to build one, it is possible to do so - but unless you plan on designing TVs for a living, there is little practical reason to. TVs have become such a transparent tool that you can actually make TV shows without needing to understand how they work... This exact same logic applies to lots of things in our everyday lives - cars, microwaves, washing machines, etc., etc.

Why should computers be different? I get the sense of loss, really I do, but our fond memories of the good ole days are really immaterial to the vast majority of people. What we see as backwards and limiting they see as a tool they can finally use without needing to understand how it works.

That is what I think most people miss in the equation. Conventional PCs cannot be used without a fair bit of knowledge of how they work. Even if you do understand them, they tend to break horribly - things go wrong all the time (bad drivers, botched software updates, unexpected incompatibilities, viruses, malware, etc.).

To use one, you will need to deal with these things at least every once in a while. Sure, it has gotten a lot better over the years, but even now, if you don't know how to fix them and don't have access to someone who does, getting through even a single year without any loss of productivity is nearly impossible (and the less you know about them, the more likely things are to go wrong). They are just too brittle, and it is too easy to shoot yourself in the foot.

All the things that make iOS and Android "bad" to us are the very things that "fix" this. Severely restricted software distribution, complex security, dumbed-down interaction models, a heavy reliance on cloud computing resources, limited hardware resources, application signing requirements, and so on - these are exactly what make it possible for a complete novice to buy and use one of these things for years without any help from anyone. They just work. Like TVs.

That isn't impoverishing people's minds; it is giving them access to something genuinely useful that they simply did not have before - at least not without a whole lot of mental investment. Independence like this is very powerful, and it is not something we (as developers) should be flippant about. The price for that is, unfortunately, turning computers into magic boxes, because there really isn't any other sane way to do it.

I certainly don't think that PCs are going to go away any time soon, and lots of people who start with tablets will hit the capabilities wall and graduate to real computers. I think there is a place for both kinds of devices. I just don't think it is fair to fixate on what was lost without considering what was gained. There are two sides to the story.
