Within reason, everything can be a computer.
All it takes is for SoCs or processors to become cheap enough and low-powered enough to move into that space.
We started off with computers the size of buildings that only a government at war could afford.
However, we still have supercomputers on that scale today.
Then we moved down to servers the size of rooms, which only a large company could afford. Again, this is still the case today.
Then we moved to individual desktop computers, but only businesses could afford them. Today we call them workstations.
We then got our day-to-day personal computers. They shrank in size and increased in availability. Again, they are still prevalent today.
Then we had laptops, which took the power of a PC and tried to make it portable. Now today's ultra-light i5 laptop can have more CPU power than a Core 2 Quad from only a few years ago.
We had PDAs that merged with feature phones to become Smart Phones.
We have tablets that fill the gap between phone and computer.
We have small hobbyist machines, but also tiny little embedded machines that run Windows CE in the background without people noticing. I am 90% sure that the coffee maker in the company kitchen is a Win CE device, judging by its alert noise.
We have Contiki running on 8-bit SoCs in traffic lights and other mundane systems.
There is a stream of computing power that flows further and further down.
When everything is a computer, all we have are form factors and 'computing' building blocks. Some companies make building blocks well for one use, and others better for a different use. We don't criticise mining companies for not selling wood. I am sure they would like the business, but then who wouldn't want more business?
The article's emphasis was on 'personal computing' rather than every possible application of compact/low-power/special-purpose computing, so I can see why the author didn't get into embedded applications.
That said, you're very nearly onto an extremely significant point: the trend is definitely away from owning one big general-purpose box to owning a heterogeneous collection of small, specialised devices. Which starts to look more like the traditional embedded ecosystem in its general philosophy, only with one huge exception: ubiquitous integration. The real trick is going to be in getting all these user-oriented devices to integrate seamlessly and securely, so users can mix-n-match services and access their data from multiple devices at any time.
On a purely hardware level I can imagine, say, Apple producing a smart TV that can also do casual gaming a-la iPhone, waiting a couple years till that's well established and then releasing a snap-on box that boosts it up to full-blown console level, making it both a desirable 3D gaming platform just as the traditional console makers are falling asleep at the wheel again, and providing enough functionality to do general computing (e.g. run a copy of Word) as well, allowing it to do triple duty as an iMac-like PC as well. And neither device would require much internal storage, because users can either keep all their data in iCloud and/or on a local turn-key Apple 'iHub' NAS (a much more flexible successor to their rather old-fashioned Time Capsule back-up system). And then all this stuff is going to happily chat with your iPhones and iPads, and even MacBooks if you still bother to own those. And even that only scratches the surface of what might appear in future.
Essentially, personal computing is now entering a post-scarcity age... at least where hardware is concerned. Consumers can now afford to buy a specialised device for each class of tasks they regularly perform. Each device will still retain some general-purpose capability (e.g. you can type a letter on an iPad or smart TV; mostly it'll just be slower if you don't purchase a keyboard as well), but for its optimised purpose each one will really shine - certainly much brighter than the traditional general-purpose PC which does a bit of everything reasonably but nothing brilliantly.
The real challenge will be on the software side - getting every device talking to every other device with zero hassle and zero configuration/management costs for the user will be no small practical feat. The basic concepts needed already exist in isolation, but fusing them into a completely successful mass-market solution will be a non-trivial task.
Hopefully this is something the author will explore in future articles; looking forward to them already.
(Full disclosure: While not really a true nerd/geek, I do still keep my very first ZX81 up on my cupboard shelf. ;)
Seems like that "ubiquitous integration [...] mix-n-match services" doesn't really work out in your envisioned future scenario, limited only to buying into the Apple ecosystem ;p
(and generally, it might be a rather western perspective of things to come - so not really the most common one)
BTW, in a few short years we should see the next generation of consoles... and actually, one present console maker seems to be much closer to that "ubiquitous integration" vision than anybody - the Xbox 360 works with any TV (not only Apple snap-on box + Apple TV), can stream media from a PC on the home network (or even, IIRC, play the contents of a plugged-in iPod?), access many 3rd-party services / streaming TV, and use various touchscreen devices (NOT limited to those with an MS OS: http://en.wikipedia.org/wiki/Xbox_SmartGlass ...working with any iOS or Android device one might already have) as a sort of remote, game/app controller, or 2nd screen showing stuff related to a TV show.
PS. You might keep that ZX81 around, but when was the last time you switched it on? ;p
I said I can see Apple doing it; reinventing entire markets to suit itself is something they've gotten rather good at. I didn't say it was desirable for the rest of the industry to sit on their behinds while Apple eats all their lunches.
Of course I'd like to see open protocols for everything, so users can choose the devices from the vendors they want. The internet itself is the greatest example of open interop, and it didn't get that way by vendors playing silly buggers over closed standards. A rising tide lifts all boats. OTOH, individual vendors like Apple and Google will no doubt be looking to tilt the field in their own favour if they can, because that's just business.
Alas, my cheap-ass crystal ball only shows where personal computing will eventually be at, not the exact route it'll take to get there or the precise form it'll take when it does. I suspect much will ride on other vendors not just waiting for Apple or Google to tell them what to copy...
Well, and I just said I see your chosen example as not really fulfilling the originally presented premise.
(but, generally, Apple is probably both unable and unwilling to "eat all their lunches" - big A rides itself largely on the tech advancements made by the industry at large, and openly wishes to target only the few most "profitable" % of human population)
Actually, what I'm seeing is everyone and their dog and their dog's squeaky toy have one or more x86 systems now, and the things last so long there is just no point in buying another one before they break.
I mean, what is the average user doing that won't work just fine on that Phenom X4 desktop or Core Duo laptop? Nothing, not a thing, and those are 6-year-old chips.
So what I'm seeing is people buying these other machines to go WITH, not replace, the machines they already have. A perfect example of the "average user" is someone like my dad: 2 desktops (one at work, one at home), plus a smartphone and, until he ran over the dang cord and cooked it, a laptop. He is planning to get a tablet and use that instead of the smartphone for the web because "the screen is too dang tiny". His GF has 2 desktops he had me build her (one in the living room for her, one in the den for guests and grandkids) and a netbook that she prefers over her smartphone - again, screen size.
As you can see: computers? Tons of them, with more cycles than they know what to do with, but things like tablets fit different niches, so those like dad will get one for sitting on the couch and checking email while the commercial is on. Heck, this is why I always keep a couple of late-model P4s at the shop; that way even the poorest person can easily have a PC if they want one. Computers are everywhere, and all these new forms are just filling niches that x86 didn't fit into well, that's all.
Oh, and I agree with the author: netbooks aren't going anywhere, as customers love the size and ease of carrying. The 10-inchers might go, though, as I see more and more heading for the 12-inch, which seems to be the sweet spot for ultra-portable netbooks.
Last I checked, there are ~1.3 billion PCs for ~2 billion PC users - a little less than "everyone and their dog and their dog's squeaky toy" or "Computers are everywhere" (and people with 2-3 PCs of their own are nowhere near average ...but I suppose such whims are what brings this http://en.wikipedia.org/wiki/File:Human_welfare_and_ecological_foot... insanity, and generally resulting in http://en.wikipedia.org/wiki/Planetary_boundaries )
Meanwhile, there are more than 5 billion mobile subscribers...
Whatever the reasons for those differences (cost being of course a large one, but in a broad sense: "even the poorest person" absolutely can't easily have a P4 PC, if only because of the cost of electricity - not in a 'how much will a kWh cost me?' way, more like 'what kind of fortune to bring semi-reliable mains electricity to my home?'), a large part of humanity is clearly more receptive to smallish, relatively inexpensive, mobile, battery-powered devices. And I suppose that large Android phones (but without the silly price premiums such models command now in the ~west) might become a dominating form of ~tablets of sorts - or "personal computers" (hey, we've reinvented what that means a few times already, really) - in the next decade or so; hardly a niche.
(also http://www.opera.com/smw/2012/03/ "Connecting the unconnected" section)
I doubt that this is true (about smartphones): "Today they're ubiquitous. Many who carry them would never touch 'a computer.'"
There simply isn't a big generation left that is so old they don't want to use computers. Not in the West. If you're seventy years old now, you were forty when computers came in. Like my dad: plenty young enough. My dad-in-law was ten years older, and he skipped computers. But then, he skipped mobile phones too. Over 90% of Dutch people use computers every day. The remaining 10% is definitely not the demographic that uses a smartphone instead, like my dad-in-law. Just like these days the kids really don't know more about the internet than their parents: their parents started using the internet when they were in their twenties, and their kids know less.
And in the third world, the people who use mobile phones instead of computers aren't using smart phones. They cannot afford those, so they mostly use feature phones. Or, since very recently, those Asha wanna-be-a-smartphone feature phones that are such a huge success in India. And they still want to use real computers instead.
I agree. My in-laws are in their 70s and were given a computer to use (not too old, still ok) and multiple people sat down with them (on different weekends) trying to find ways to explain things so that they would be easier.
They did figure out how to use it and were just bored with it. Then I took my iPad to their house and they thought, at first, that it was just a photo viewer, but then they saw all the other things I could do and were very impressed, and took to it like "fish in a pond".
Funny, I'm just the opposite. I'm 60 years old and love computers, and can justly claim to be a Linux geek through-and-through.
But smart phones - no thanks. Yes, they're cute, but at least double the size of my "stupid" cell phone. I know lots of people who own smart phones and leave them home because they don't want to lug something that's too big to fit in their pocket or on a belt clip. But what is the use of a "mobile" phone if you leave it at home? I use my phone for talking, and very occasionally use the calculator, alarm clock and reminder (calendar) applications. I don't want to surf the Net from my phone or do email - that's what my computer is for.
Yes, you want a keyboard for your office computer. But let me guess who owns that: the company you work for? Unless, of course, you work for yourself.
Your article is talking about personal computers, not work computers. "Most" people like the iPad, as you mentioned, since it doesn't have a keyboard or mouse or stylus. iPad "type" devices are exactly what people have been waiting for for a long time. You use your finger just like you would for water coloring or touching paper or ... whatever you typically do with everything that doesn't involve a computer.
Cheaper usually means fewer options. It's hard to build an "everything" computer that geeks will love for low dollars. Even the Raspberry Pi has very limited ports on it. The more ports and capabilities they add, the higher the price would be.
We'd all love to get free computers. But we would be tied to what the giver (someone has to "give" you that free computer) wants you to be able to do.
It is better that we pay for computers. The question is, how little or how much do you want? The more you want, the more it costs. The more the OS does things for you, the more the OS is going to cost. Sure, cost for the company can be lowered if they sell a lot of their product. If they didn't, there is no way that Apple could sell their most recent OS (Mountain Lion) for $19.99. It would cost over $500 if they only sold tens of thousands, or maybe hundreds of thousands, of copies.
Sure, Linux is "free". But you've got a lot of people who work at other jobs (mostly) and are not paid for what they do for Linux. They do it for love, not money.
For those who do it for money, they get paid somehow, and that means someone is paying for services and that money is paying for them. Basically, business Linux is subsidizing Linux for home use. I guess that works, except that only a few people use Linux (percentage-wise it is still less than 2% of desktops).
In shorter timeframes you might have a point, but as integration increases that's not really how tech evolves - it does tend to give much more for much less, over generations.
Electronic calculators or even pendrives were a big deal not long ago - and they can quite easily be found as freebies now, and certainly for the price of a lunch. DAPs similarly (quite expensive, unwieldy and limited a decade ago - but now a few lunches buy a very small and capable unit). Present sub-$100 no-contract smartphones are already much nicer than the order-of-magnitude-more-expensive decade-old ones - and they're still only revving up their economies of scale (some of their characteristics should ease further lowering of prices - for one, they're much simpler mechanically than "classic" mobile phones, basically just a screen).
BTW, $20 is not the true price of OS X - it's basically subsidised by hardware sales. Also, Linux is used by most smartphones... (and before long, most likely the same will be true for tablets)
"Back in the 1990s, who would have thought that smartphones would popularize computing?"
Smartphones' real ancestors are certainly the programmable pocket calculators (HP, TI, Casio, Sharp...) which ran all sorts of applications and were quite popular in the '70s and the '80s.
"Back in the 1990s, who would have thought that smartphones would popularize computing?"
I would also add that the '90s were not so long ago. I started university studies in the '90s, with mobile phones and laptops around, and having used computers for many years by then I actually remember talks with friends about smaller portable computers - how and when they'd come and what they'd look like.
I know 1.5 decades can sometimes seem like a very long time, but it really isn't.
I know it's the trend to talk about "smartphones" as if they were a revolution, but they're not. Actually, the concept of the smartphone is a marketing term used by some manufacturers to advertise and differentiate their products, but in reality it's just the high-end class of phones. This class has existed for more than a decade. In 2000 we were surfing the web in text mode over WAP, i-mode, etc., and we were using J2ME applications for everything. This class of phones has been sold by the billions for a long time. There were something like 100 phones for every 1 PC in 2000. I'd even say there are fewer phones sold these days than there were in 2005.
So modern "smartphones" are far more capable than the previous ones, that is true - but then either the previous ones still qualify as computers, or the Apple II does not.
As an ex-user of Psion PDAs and Nokia 9000-series Communicators I'd have to agree. There is simply no decisive changeover point between a "featurephone" and a "smartphone".
For many years now even basic phones have played media files, browsed the internet, taken pictures, and run "apps". The move from dedicated PDAs and phones towards single units was not "revolutionary" but rather a gradual process that began in the early 1990s, many years before either Apple or Samsung were making smartphones.
My point: the line between PDAs, basic phones, smartphones, and other devices has always been thin and gray.
When we are talking "everyone" I always think about those who may find smaller devices less comfortable or even impossible to use.
You do not have to be impaired in any way to see that even the best laptop computer is in fact not something a physician would recommend for extended use... let alone smaller devices.
Actually, I stumbled once on some research seriously exploring, from a medical standpoint, the ergonomics of computer usage - and the laptop turned out to be fabulous, enabling a position with very little strain in extended usage...
...which was nothing like how people often use computers, on a ~desk - basically, it was a half-lying position with gently bent knees, all supported by pillows or some such.
Huh, interesting... could be! Laptops are indeed very flexible as far as posture goes.
But personally I find laptop displays just too small and positioned too low (too close to the keyboard). It is a strain. And a laptop with an extremely big display or an external monitor is not really a laptop anymore.
Though the screen size is relative, it being quite close when using a laptop ...plus, in the position I mentioned, it ends up being sort of closer to the height of eye level. :p
I actually have a properly set-up chair and desk and never have problems.
Any time I even go near a laptop for longer than a quick 10 minutes I already experience discomfort.
So I have some doubts about that, but I do wonder if there is some link.
My guess is, it takes away the biggest advantage that a laptop or tablet or even phone has over a desktop computer, which is mobility.
Well, research generally tends to trump personal anecdotes and feelings...
(how much of a ~placebo-like effect is there with those, here, regarding ~workplace layouts? Even more so if somebody has doubts when confronted with info contrary to long-held beliefs - quite a few cognitive biases manifest themselves in such a scenario. And I could quickly dig up some loosely related examples: http://plan9.bell-labs.com/wiki/plan9/Mouse_vs._keyboard/index.html or how, contrary to some people praising trackpoints, actual research suggests that touchpads are superior... http://cat.inist.fr/?aModele=afficheN&cpsidt=18522893 & http://en.wikipedia.org/wiki/Pointing_stick#Comparison_with_touchpa... - and note that the voices supportive of the trackpoint are "subjective opinion" in character; and personally I do like trackpoints, I'm used to the concept, but...)
Anyway, most human dwellings have a ~bed of some kind, so adopting the general position I described, while being portable, shouldn't be much of a problem ;p (OTOH, yeah, overall mobility probably greatly helps by itself, allowing for quite great variability in body positions)
All this has been obvious for years... to anybody who's been watching the evolution of home computers since the early '80s.
Anyway, good article.
It's just a shame home computers are not really mentioned, apart from the Apple II and TRS-80.
The article says the IBM PC arrived and dominated. It did so only after a while, starting at work before invading homes. The first PCs cost a fortune.
No mention of the ZX81 which was pretty affordable by the masses.
Even worse transgression: no mention of the Commodore 64, the best-selling single computer model of all time, and which occupies a very large part in some of the linked diagrams http://jeremyreimer.com/postman/node/329 - it was basically the only thing really ever competing numerically with the IBM PC.
Still, I wonder how much the numbers used for those graphs are skewed for North American market - apparently ( http://en.wikipedia.org/wiki/ZX_Spectrum ), the Spectrum family sold 5+ million units not counting clones; considering that C=64 sold ~15 million, Speccy should be easily visible on the graph & much more than "Other".
Oh, and no mention of the Amiga, the sign of things to come WRT multimedia for the masses.
But what really surprises me is that you, MOS6510, didn't grumble about those two omissions.
I didn't want to come across as a fanboy, having already mentioned the C64 a few times in comments under other topics. The C64 does make a small appearance in the graph.
But if the topic is computers for the masses, the C64 certainly should be mentioned, as Jack Tramiel wanted to keep it relatively low-cost to allow more people to buy it. The ZX81 was affordable for the masses; the ZX Spectrum/C64/Amiga/Atari computers were bought by the masses.
Maybe Linux should get a mention too, an operating system so cheap anyone can afford it and it runs on all kinds of hardware.
It kind of seems the writer just wanted to jump to the points he wanted to make about the more recent stuff.
But to be fair, it would make a very long article if he gave everything its credit.
Ah, who cares about info or insight, I just enjoy having old computers being mentioned.
These days companies like Dell, HP, Acer, etc. have 25,000 different PC/laptop models EACH. It's hard to get nostalgic about any of them years from now.
When you mention a C64 or ZX Spectrum, people's eyes light up. I can't imagine the same effect if, in 10 years, I mention a Dell Optiplex 755.
But I guess these days it's the operating system that creates the memories and experiences, not the computer itself.
"The first big step toward computing for everyone occurred in the late 1970s with the wide popularity of the Apple II and TRS-80."
That is a rather US-centric POV. Elsewhere, the Sinclair ZX-series and the BBC Micro ruled the roost. And let's not forget the Atari/Amiga wars slightly later.
Computing technology reaches its apogee when it disappears from view, when the user of the technology is not aware of the technology being used and is only aware of the task being undertaken. All good technology should vanish from sight: the more in sight it is, the more the user has to think about the tool and not its function, and the worse the tool is.
Smartphone technology is a big step forward. You tap an icon and speak to someone, or dictate some text which someone else can read, anywhere in the world. You approach a location and the device gently reminds you to do something; you tap an image and a flood of information is available; you see something and instantly capture an image of it.
As fantastic as smart phones are they still require too much thought to use, they still get between you and the activity or function. There is still a way to go yet.
Disappearing from view is a good thing for the masses who just want things to work and don't want to bother learning how to operate something, let alone read a manual.
BUT! I think there should always be the option to tinker around. I don't mind an iPad being a closed system, but it would be a sad day if you were no longer able to build your own computer and install an alternative OS on it.
When things start to become simple and people expect things to "just work", it's not such a small jump to a situation where a government forbids any computer devices that users can "tamper" with, for they may interrupt the services of the it-just-works machines/systems/services.
The movie and music industry wouldn't mind that happening for it makes pirating less easy, nor would the government mind in their fight against terrorists, hackers and tax evaders.
The world is becoming more and more dependent on computers and the Internet. When something doesn't work, like a government website, Amazon or 4G, more and more people will get annoyed.
The more people that are annoyed the easier it will become for governments to "protect" these people from annoyance. They will attack the causes of system interruption. Hackers, unpatched PCs that are part of a bot net, overly creative IT students, computer users that pressed the wrong button.
What if they forbid the use of computers that can be "tampered" with?
This may sound a bit Big Brother and sci-fi, but I don't think it's that far fetched considering the movement towards IT dependence. Any government would love to be able to control the entire network, from the servers to the computers at home.
People kill millions of sharks each year, a shark grabs a surfer and there is talk of killing them all off so people can safely surf. It doesn't take much for a crowd to support extreme ideas.
99% of the people would be fine with it, but the 1% still accounts for the entire Linux community, and they have over 5,000 Linux distributions. So 1% is still a market.
Perhaps a little more "modern" then the examples from your link. :-p
But more the general notion of having the freedom to do what you want with technology. Like building a PC, adding an expansion card and hooking it up to your digital TV connection, or installing Linux.
More and more stuff will become automated/connected, so companies and governments want people to be less and less able to even have the possibility of messing with it, or want to know exactly what you do.
Buying computer parts might be looked upon the same way as buying stuff to build a bomb in the future.
Absolutely, very little difference indeed (after all, what would be the point in having it be substantially different?). Other than the case, the educational release of the model A will also come with educational materials for use in the classroom.
They very much did originally intend it to be used for educational/development purposes, which was their reasoning for not including the MPEG-2 licenses out of the box. While they are providing this as an add-on, I don't think they will be going all-out consumer-focused any time soon.
So I wonder how fast those "codecs" of sort will find their way on tpb...
(nah, not really wonder)
"Back in the 1990s, who would have thought that smartphones would popularize computing?"
Where do people get this stuff? By the end of the '90s nearly a billion computers had been sold. It's safe to say computing was popular long before the introduction of the "smartphone".
Just look at the specs of the just-released (or soon to be released) DUE (though the UNO and LEONARDO are interesting as well).