The dream of inexpensive computing for everyone has been with us since
the first computers. Along the way it has taken some unexpected turns.
This article summarizes key trends and a few of the surprises.
This article is about consumer computers. If a computer is a “general purpose processing device that
runs different programs to perform varied functions,”
smartphones, with their thousands of apps,
certainly qualify. Single-purpose devices do not. These include music
players (like iPods) and e-readers (like Kindles), as well as embedded
devices.
Yet the line isn’t always clear — witness how the Kindle Fire crosses
over from mere e-reader to a general-purpose tablet.
The Personal Computer
The
first big step toward computing for everyone occurred in the
late 1970s with the wide popularity of the Apple II and TRS-80. Then
the IBM PC and clones came out in the
early 1980s and dominated.
Computers became accessible to millions for the first time. Yet the
millions had to be well-heeled. In 1977 an Apple II cost
$1,298 US. In 1997 I bought a Gateway Pentium II with monitor for
$3,228 US. Both prices are equivalent to over $5,000 in today’s
dollars. Personal computers? Yes. But only for those willing and able
to buy an expensive tool.
Computers for the Well-heeled Masses (Apple II and TRS-80)
PCs have become much more affordable over the years, so units shipped (in thousands)
have grown nearly every year since the 1970s:
Source: Reimer’s Blog
Yet some other device is making computers more popular with the general
public than the venerable PC. Let’s talk about…
Smartphones
Back in the 1990s, who would have thought that smartphones would
popularize computing?
Today they’re ubiquitous. Many who carry them would never
touch “a computer.” Two-thirds
of all phones sold today in the U.S. are smartphones, and over 450
million devices shipped in 2011. Smartphone sales surpassed
PC sales in 2010.
Computerizing your phone adds a photo camera, video camera, web
browser, stereo,
watch, locational and positional sensing, voice recorder, texting,
email — plus several hundred thousand downloadable apps (programs).
You can scan QR codes, and soon you’ll swipe your phone instead of
your credit card. You carry it in your pocket or purse.
Smartphones have their limits. Small size means no power touch-typing.
The screen is small too, even if
you can turn it sideways. This
article argues that phones make a poor substitute for laptops and
desktops as your sole point of Internet access.
And how about privacy? You would think that
“Few people would willingly carry
around a device that tracks their movements, records their
conversations, and keeps tabs on all the people they talk to.”
But they do in the U.S. This scandal
has not caused even a hiccup in the public’s adoption of these little
pocket spies.
We are in the midst of a shift to the mobile internet. Smartphones
lead the way.
Computers for Everyone
Television
If you asked the experts twenty years ago what would become the
ubiquitous computer of 2012, many would have answered: the TV.
Everybody’s got one, and we watch them five
hours a day. Plus, in the ’80s and ’90s we connected up all our TVs by
cable or satellite. So why didn’t the TV become the computer for the
masses?
It all comes down to competing technologies. No single one dominates.
The options are:
- Intelligent set-top boxes from the cable or satellite company
- Specialized media provision and control computers like TiVo or Roku
- Add-ons from computer vendors like MSN TV or Apple TV
- TV-connected game boxes with general-purpose capabilities (eg, the Xbox recently added Internet Explorer and SmartGlass)
- Hooking up the TV to your personal computer (using the HDMI ports and Windows Media Center or MythTV)
- Replacing the TV altogether by watching shows on your PC
- Intelligence built into the TV set itself
TV manufacturers are evolving the set you buy into a full-fledged
TV/computer: the smart
TV. Smart TVs include a built-in HD camera, microphones, facial
tracking, speech recognition, and the requisite processors. Cost is high
but dropping fast. Given that smart TVs could monitor their watchers like Orwell’s telescreens, you’d
think privacy would be a concern. I believe the public’s blasé reaction to the smartphone privacy scandals
proves they will gladly submit to whatever spying they must to get their
enhanced TV. Just ensure the exposés hit after smart TVs have won a naive
public’s acceptance.
With no one technology yet dominating, expect continued
sorting as the winner(s) emerge. I’m betting on smart TVs as prices
drop.
Netbooks
Remember netbooks? When small laptops were rechristened “netbooks” in
2007, the hype exploded. Then the iPad killed it all off. Articles
today have titles like “Are
Netbooks Dead? The Prognosis is Grim” and “Nothing Can Revive Netbook Sales.”
I think the issue here is one of terminology. I don’t know about
“netbooks” or “notebooks” or “ultras” or “Airs” per se, but I believe
it’s a safe
bet that people will continue to buy small laptops. Small laptops,
full-sized laptops, desktops, smartphones, tablets, and more will all
co-exist in
the marketplace for years to come.
These sales figures show that the current excitement over smartphones
and tablets is very well justified. And that millions continue to buy
laptops, desktops, and, yes, even small laptops:
Source: PC
World and Canalys 2012
Tablets
Finally! An always-on device with PC capabilities, longer battery life,
better
portability, and a touch interface. Several vendors
introduced tablets over the years — notably Microsoft with their Tablet PC
a decade ago — but it took Apple’s iPad for mass acceptance.
The big breakthrough is the user interface. No keyboard or keypad,
mouse or stylus; just touch the
screen. Sound is key too, with built-in speakers and microphones.
TechCrunch says
iPad sales will reach 66 million this year. Forrester Research predicts
that tablet sales will balloon to 375 million in 2016. They believe
tablets will become the “preferred, primary device for millions of people
around the world.”
Could be. I love my tablet! But different devices provide different
benefits. For office work I’d rather have a full-sized traditional
keyboard and screen. I certainly don’t want to do remote IT support on
a tablet, like this
poor guy! For watching TV and movies at home I prefer my big screen TV.
And I can’t see people replacing their handy little pocket smartphones
with tablets. There is room for many different devices in our lives.
The Primary Computer of the Future?
Small Super-cheap Computers
Let’s wrap up by talking about small computers with traditional
interfaces.
The non-profit One Laptop Per Child (OLPC)
Foundation started a project in 2006 to develop a small,
consumer-friendly laptop costing less
than $100 US. This would be mass-produced and purchased by governments
to spread educational computing around the world. Netbooks and tablets
ultimately doomed
OLPC.
The Raspberry Pi
computer is OLPC’s spiritual heir. It, too, is underwritten by a
charitable foundation with educational goals.
The Pi is the size of a deck of playing cards. The
Model B is built around the Broadcom BCM2835 system-on-a-chip (a 700 MHz ARM
processor plus a VideoCore IV GPU), with 256 MB of memory,
HDMI video/audio output, two USB ports, and an RJ45 Ethernet port. It
uses an SD card for permanent storage and runs Linux from the card. It
lists for $35 US.
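To give a flavor of the sort of classroom exercise the foundation has in mind, here is a minimal sketch that blinks an LED from Python on the Pi. It assumes the commonly used RPi.GPIO library and an LED (with resistor) wired to GPIO pin 17; the wiring and pin choice are hypothetical examples on my part, not anything the foundation specifies.

    import time
    import RPi.GPIO as GPIO

    LED_PIN = 17                      # hypothetical wiring: LED + resistor on GPIO 17

    GPIO.setmode(GPIO.BCM)            # use Broadcom pin numbering
    GPIO.setup(LED_PIN, GPIO.OUT)

    try:
        for _ in range(10):           # blink ten times
            GPIO.output(LED_PIN, GPIO.HIGH)
            time.sleep(0.5)
            GPIO.output(LED_PIN, GPIO.LOW)
            time.sleep(0.5)
    finally:
        GPIO.cleanup()                # release the pins when done

A dozen lines and a child has made hardware do something visible, which is exactly the hook the educational mission counts on.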
Given that the Raspberry Pi targets consumers, I’d recommend consumer
packaging. Add a case. Offer a bundle that includes the
required cables, charger, mouse, keyboard, etc. Consumers want plug and
go, not a naked circuit board.
Source: BBC News
Expect more super-cheap PCs soon. I wonder if embedding the PC into the
monitor will become more popular as footprints shrink? But then you
lose the benefits of componentization.
Perhaps
we could standardize PC enclosures and put a snap-on mounting
bracket on the back of all displays. Just pop the PC on or off of the
rear of the display. This retains the benefits of individual components
while removing the PC from the desktop. Anything that’s not wireless
plugs in behind the display.
Simple. (Just locate the
snap-on PC away from the monitor’s vents and hot spots!)
Big Changes
Popular computing has taken a turn few predicted. Smartphones — and
perhaps tablets — are becoming everyman’s computer. The
implications are huge:
- OS market shares for consumer computers are changing fast.
Windows lost out on smartphones
and tablets to Android and iOS (so far). My recent OS News article Smartphones
Reignite the OS Wars analyzes
the impact.
- Windows also faces market share
challenges in smart TVs and super-cheap computers like the Pi.
- Intel’s dominance is threatened
as the ARM chips in handhelds start to drive the processor chip market.
AMD is floundering.
- Computer makers that haven’t transitioned to handhelds — like
Dell — are threatened too.
- The user interface of smartphones and tablets challenges that of
laptops and desktops. Touch and sound replace
keyboards and mice.
- In response, OS vendors have changed their operating systems to
encompass handhelds (eg: Windows 8 UI, Ubuntu Unity, GNOME 3).
- The mobile internet
is a new paradigm of personal computing. It also raises malware and privacy issues that are today
largely unaddressed.
- Social networking is integral to the emerging mobile internet.
I’ll explore some of these trends in future articles.
– – – – – – – – – – – – – – – – – – – – – –
Howard Fosdick (President, FCI) supports databases and operating
systems. He also consults for vendors as an industry analyst. Read his
other articles here.
Photos were retrieved from Wikipedia
(except for the Raspberry Pi).
Within reason everything can be a computer.
All that it takes is for SOCs or processors to become cheap enough and low powered enough to move into that space.
We started off with computers that were the size of buildings, which only a government at war could acquire.
However, we still have supercomputers on that scale today.
Then we moved down to servers the size of rooms, which only a large company could afford. Again, this is still the case today.
Then we moved to individual desktop computers, but only businesses could afford them. Today we call them workstations.
We then got our day-to-day personal computers. They reduced in size and increased in availability. Again, they are still prevalent today.
Then we had laptops, which took the power of a PC and tried to make it portable. Now today’s ultra-light i5 laptop can have more CPU power than a Core 2 Quad from only a few years ago.
We had PDAs that merged with feature phones to become Smart Phones.
We have tablets that fill the gap between phone and computer.
We have small hobbyist machines, but also tiny little embedded machines that run Windows CE in the background without people noticing. I am 90% sure that the coffee maker in the company kitchen is a Win CE device, judging by its alert noise.
We have Contiki running on 8-bit SOCs that are in traffic lights and other mundane systems.
There is a stream of computing power that flows further and further down.
When everything is a computer all we have are form factors and ‘computing’ building blocks. Some companies make building blocks well for one use, and another makes them better for a different use. We don’t criticise mining companies for not selling wood. I am sure they would like the business, but who wouldn’t want more business?
The article’s emphasis was on ‘personal computing’ rather than every possible application of compact/low-power/special-purpose computing, so I can see why the author didn’t get into embedded applications.
That said, you’re very nearly onto an extremely significant point: the trend is definitely away from owning one big general-purpose box to owning a heterogeneous collection of small, specialised devices. Which starts to look more like the traditional embedded ecosystem in its general philosophy, only with one huge exception: ubiquitous integration. The real trick is going to be in getting all these user-oriented devices to integrate seamlessly and securely, so users can mix-n-match services and access their data from multiple devices at any time.
On a purely hardware level I can imagine, say, Apple producing a smart TV that can also do casual gaming a-la iPhone, waiting a couple years till that’s well established and then releasing a snap-on box that boosts it up to full-blown console level, making it both a desirable 3D gaming platform just as the traditional console makers are falling asleep at the wheel again, and providing enough functionality to do general computing (e.g. run a copy of Word) as well, allowing it to do triple duty as an iMac-like PC as well. And neither device would require much internal storage, because users can either keep all their data in iCloud and/or on a local turn-key Apple ‘iHub’ NAS (a much more flexible successor to their rather old-fashioned Time Capsule back-up system). And then all this stuff is going to happily chat with your iPhones and iPads, and even MacBooks if you still bother to own those. And even that only scratches the surface of what might appear in future.
Essentially, personal computing is now entering a post-scarcity age… at least where hardware is concerned. Consumers can now afford to buy a specialised device for each class of tasks they regularly perform. Each device will still retain some general-purpose capability (e.g. you can type a letter on an iPad or smart TV; mostly it’ll just be slower if you don’t purchase a keyboard as well), but for its optimised purpose each one will really shine – certainly much brighter than the traditional general-purpose PC which does a bit of everything reasonably but nothing brilliantly.
The real challenge will be on the software side – getting every device talking to every other device with zero hassle and zero configuration/management costs for the user will be no small practical feat. The basic concepts needed already exist in isolation, but fusing them into a completely successful mass-market solution will be a non-trivial task.
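To illustrate the kind of building block I mean, here’s a toy sketch of zero-configuration discovery: devices announce themselves over UDP broadcast and a controller collects the replies. The service name and port are made up for the example; real systems would use something like mDNS/DNS-SD, but the point is only that the primitives already exist and the hard part is the glue.

    import json
    import socket

    DISCOVERY_PORT = 50000   # arbitrary, unused UDP port chosen for this sketch
    BEACON = json.dumps({"service": "media-hub", "version": 1}).encode()

    def announce():
        """Device side: shout a tiny 'here I am' beacon onto the local network."""
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(BEACON, ("255.255.255.255", DISCOVERY_PORT))
        s.close()

    def discover(timeout=3.0):
        """Controller side: listen for beacons and return who answered."""
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("", DISCOVERY_PORT))
        s.settimeout(timeout)
        found = {}
        try:
            while True:
                data, (addr, _port) = s.recvfrom(1024)
                found[addr] = json.loads(data.decode())
        except socket.timeout:
            pass
        finally:
            s.close()
        return found

    if __name__ == "__main__":
        print(discover())   # run announce() on another box on the same LAN

Getting from that kind of toy to something secure, vendor-neutral, and invisible to the user is the non-trivial bit.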
Hopefully this is something the author will explore in future articles; looking forward to them already.
(Full disclosure: While not really a true nerd/geek, I do still keep my very first ZX81 up on my cupboard shelf.;)
Seems like that “ubiquitous integration […] mix-n-match services” doesn’t really work out in your envisioned future scenario, limited only to buying into Apple ecosystem ;p
(and generally, it might be a rather western perspective of things to come – so not really the most common one)
BTW, in a few short years we should see the next generation of consoles …and actually, one present console maker seems to be much closer to that “ubiquitous integration” vision than anybody – Xbox360 works with any TV (not only Apple snap-on box + Apple TV), can stream media from a PC on home network (or even, IIRC, play contents of plugged-in iPod?), access many 3rd party services / streaming TV, and use various touchscreen devices (NOT limited to those with an MS OS: http://en.wikipedia.org/wiki/Xbox_SmartGlass …working with any iOS or Android device one might already have) as a sort of remote, game/app controller, or 2nd screen showing stuff related to a TV show.
PS. You might keep that ZX81 around, but when was the last time you switched it on? ;p
I said I can see Apple doing it; reinventing entire markets to suit itself is something they’ve gotten rather good at. I didn’t say it was desirable for the rest of the industry to sit on their behinds while Apple eats all their lunches.
Of course I’d like to see open protocols for everything, so users can choose the devices from the vendors they want. The internet itself is the greatest example of open interop, and it didn’t get that way by vendors playing silly buggers over closed standards. A rising tide lifts all boats. OTOH, individual vendors like Apple and Google will no doubt be looking to tilt the field in their own favour if they can, because that’s just business.
Alas, my cheap-ass crystal ball only shows where personal computing will eventually be at, not the exact route it’ll take to get there or the precise form it’ll take when it does. I suspect much will ride on other vendors not just waiting for Apple or Google to tell them what to copy…
Well, and I just said I see your chosen example as not really fulfilling the original presented premise
(but, generally, Apple is probably both unable and unwilling to “eat all their lunches” – big A rides largely on the tech advancements made by the industry at large, and openly wishes to target only the few most “profitable” % of the human population)
Actually what I’m seeing is everyone and their dog and their dog’s squeaky toy have one or more x86 systems now and the things last so long there is just no point in buying another one before they break.
I mean what is the average user doing that won’t work just fine on that Phenom I X4 desktop or core duo laptop? Nothing, not a thing and those are 6 year old chips.
So what I’m seeing is people buying these other machines to go WITH, not replace, the machines they already have. A perfect example of the “average user” is someone like my dad: 2 desktops (one at work, one at home), plus a smartphone, and, until he ran over the dang cord and cooked it, a laptop. He is planning to get a tablet and use that instead of the smartphone for the web because “the screen is too dang tiny”. His GF has 2 desktops he had me build her (one in the living room for her, one in the den for guests and grandkids) and a netbook that she prefers over her smartphone, again because of screen size.
As you can see: computers? Tons of them, more cycles than they know what to do with. But things like tablets fit different niches, so those like dad will get one for sitting on the couch and checking email while the commercial is on. Heck, this is why I always keep a couple of late model P4s at the shop; that way even the poorest person can easily have a PC if they want one. Computers are everywhere and all these new forms are just filling niches that x86 didn’t fit into well, that’s all.
Oh, and I agree with the author, netbooks aren’t going anywhere as customers love the size and ease of carry. The 10-inchers might go though, as I see more and more heading for the 12 inch, which seems to be the sweet spot for ultra-portable netbooks.
Last I checked, there are ~1.3 billion PCs for ~2 billion PC users – a little less than “everyone and their dog and their dog’s squeaky toy” or “Computers are everywhere” (and people with 2-3 PCs of their own are nowhere near average …but I suppose such whims are what brings this http://en.wikipedia.org/wiki/File:Human_welfare_and_ecological_foot… insanity, and generally resulting in http://en.wikipedia.org/wiki/Planetary_boundaries )
Meanwhile, there are more than 5 billion mobile subscribers…
Whatever the reasons for those differences (costs being of course a large one, but in a broad sense: “even the poorest person” absolutely can’t easily have a P4 PC, if only because of the cost of electricity – not in a ‘how much will a kWh cost me?’ way, more like ‘what kind of fortune would it take to bring semi-reliable mains electricity to my home?’), a large part of humanity is clearly more receptive to smallish, relatively inexpensive, mobile, battery-powered devices. And I suppose that large Android phones (but without the silly price premiums such models command now in the ~west) might become a dominating form of ~tablets of sorts – or “personal computer” (hey, we reinvented what that means a few times already, really) – in the next decade or so; hardly a niche.
(also http://www.opera.com/smw/2012/03/ “Connecting the unconnected” section)
Not really exclusively such. Building-scale computers were very limited in spread, and showed up a bit later than the very beginnings: Z3 or Colossus were between the size of a wall unit (or a few wardrobes) and a small room (plus at least Z3 looks like it could be made by a dedicated individual backed by some patron – and indeed, that’s basically how Z2 was made).
Many other machines from that time were similar, the wardrobes-room size seems like it was more typical generally, also in the ~50s. I suppose the most-publicised ENIAC really fitted the “building” (and cost) perception, hence established it in public imagination over the decades…
Overall, history wasn’t so “linear” as you painted it in the march towards “embedded” – don’t forget that the first x86 CPUs (or generally the first microprocessors) were meant precisely for such scenarios (and, ironically, the first ARMs were for desktop machines).
BTW, this reminded me of one Wiki article: http://en.wikipedia.org/wiki/Microcomputer_revolution#The_Home_Comp… – how we apparently predicted that a central computer would control the home and its appliances …only, we failed to anticipate that connections and software would be the really expensive and/or hard part – meanwhile, the computer quickly became so inexpensive that each appliance can have one, with a bonus of keeping the software single-purpose & easy (and we still didn’t really manage to tackle the issue of interoperation).
By my rough estimate, there are around 20 processors in the room I’m sitting in …and only one PC.
PS. Recent examples:
http://www.nablaman.com/relay/ …Zusie, after Konrad Zuse (plus a few other relay efforts in Links)
http://web.cecs.pdx.edu/~harry/Relay/index.html
I doubt that this is true (about smartphones). “Today they’re ubiquitous. Many who carry them would never touch “a computer.”
There simply isn’t a big generation left that is so old they don’t want to use computers. Not in the west. If you’re seventy years old now, you were forty when computers came in. Like my dad, plenty young enough. My dad-in-law was ten years older, and he skipped computers. But then, he skipped mobile phones too. Over 90% of Dutch people use computers every day. The remaining 10% is definitely not the demographic that uses a smartphone instead, like my dad-in-law. Just like these days the kids really don’t know more about the internet than their parents: their parents started using the internet when they were in their twenties, and their kids know less.
And in the third world, the people who use mobile phones instead of computers aren’t using smart phones. They cannot afford those, so they mostly use feature phones. Or, since very recently, those Asha wanna-be-a-smartphone feature phones that are such a huge success in India. And they still want to use real computers instead.
No, but there are a great many people from certain generations that won’t touch them. Case in point, most of my relatives (parents, aunts, uncles). My dad, for example, will use a Windows PC when forced to but absolutely hates the things. In contrast, he loves his Android smartphone. It’s hard to believe this trend until you’ve actually seen it, but there it is, and it has nothing at all to do with a generation gap.
I agree. My in-laws are in their 70s and were given a computer to use (not too old, still ok) and multiple people sat down with them (on different weekends) trying to find ways to explain things so that they would be easier.
They did figure out how to use it and were just bored with it. Then I took my iPad to their house and they thought, at first, that it was just a photo viewer, but then they saw all the other things I could do with it and were very impressed and took to it like “fish in a pond”.
Funny, I’m just the opposite. I’m 60 years old and love computers, and can justly claim to be a Linux geek through-and-through.
But smart phones – no thanks. Yes, they’re cute, but at least double the size of my “stupid” cell phone. I know lots of people who own smart phones and leave them home because they don’t want to lug something that’s too big to fit in their pocket or on a belt clip. But what is the use of a “mobile” phone if you leave it at home? I use my phone for talking, and very occasionally use the calculator, alarm clock and reminder (calendar) applications. I don’t want to surf the Net from my phone or do email – that’s what my computer is for.
Shoulder bag works too… OTOH belt clips are sooo 90s.
(but there are many smartphones really no larger than “feature phones” – some SE Mini series for example, or LG Optimus One)
Not that I don’t agree, but please note that the Raspberry Pi is selling like mad. This proves that, right now, the target consumers want what is being offered, no matter if it has downsides.
This has been standardized (at least de facto) for a few years now. I’ve used it and I’ve seen clients using it. Not all displays have it, of course, but the ones that do usually are compatible between them. It’s the same mounting size as for a wall mount.
Componentization is a good thing if you need it. For many use cases it’s not needed. Example: dumb terminals.
That’s not the best choice of words… now, definite numbers seem hard to stumble on, but from the data and sources listed in http://en.wikipedia.org/wiki/Raspberry_Pi#Launch we can probably extrapolate 200k units as an absolute upper limit of RPi units sold so far, and likely much less than that.
That’s still a great number for sure – but can’t be really described “like mad” when, IIRC, more Android devices are activated daily.
There was a Zerg rush & a flurry of interest about the RPi throughout the ~western web, understandable considering that mostly wasting-life-on-web geeks would marvel (for a time) at the device, but that doesn’t translate to “like mad” sales…
Yes you want a keyboard for your office computer. But let me guess as to who owns that. The company you work for? Unless of course you work for yourself.
Your article is talking about personal computers, not work computers. “Most” people like the iPad, as you mentioned, since it doesn’t have a keyboard or mouse or stylus. iPad “type” devices are exactly what people have been waiting for for a long time. You use your finger just like you would for water coloring or touching paper or … whatever you typically do with everything that doesn’t involve a computer.
Cheaper usually means fewer options. It’s hard to build an “everything” computer that geeks will love for low dollars. Even the Pi (spelling?) computer has very limited ports on it. The more ports and capabilities they add, the higher the price would be.
We’d all love to get free computers. But we would be tied to what the giver (someone has to “give” you that free computer) wants you to be able to do.
It is better that we pay for computers. The question is, how little or how much do you want? The more you want, the more it costs. The more the OS does things for you, the more the OS is going to cost. Sure, cost for the company can be lowered if they sell a lot of their product. If they didn’t, there is no way that Apple could sell their most recent OS (Mountain Lion) for $19.99. It would cost over $500 a copy if they only sold it to tens of thousands or maybe hundreds of thousands of people.
Sure, Linux is “free”. But you’ve got a lot of people that work at other jobs (mostly) that are not paid for what they do for Linux. They do it for love, not money.
For those that do it for money, they get paid somehow, and that means someone is paying for services and that money is paying for them. Basically, business Linux is subsidizing Linux for home use. I guess that works, except that only a few people use Linux (percentage-wise it is still less than 2% of desktops).
In shorter timeframes you might have a point, but as integration increases that’s not really how tech evolves – it does tend to give much more for much less, over generations.
Electronic calculators or even pendrives were a big deal not long ago – and they can be quite easily found as freebies now, and certainly for the price of a lunch. DAPs similarly (quite expensive, unwieldy and limited a decade ago – but now a few lunches buy a very small and capable unit). Present sub-$100 no-contract smartphones are already much nicer than decade-old ones that cost an order of magnitude more (then) – and they are still only revving up their economies of scale (some of their characteristics should ease further lowering of prices – for one, they are much simpler mechanically than “classic” mobile phones, basically just a screen)
BTW, $20 is not the true price of OSX – it’s basically subsidised by hardware sales. Also, Linux is used by most smartphones… (and not a long time from now, most likely the same for tablets)
“Back in the 1990s, who would have thought that smartphones would popularize computing?”
Smartphones’ real ancestors are certainly the programmable pocket calculators (HP, TI, Casio, Sharp…), which ran all sorts of applications and were quite popular in the ’70s and the ’80s.
“Back in the 1990s, who would have thought that smartphones would popularize computing?”
I would also add that the ’90s were not so long ago. I started university studies in the ’90s, with mobile phones and laptops around and having used computers for many years by then, and I actually remember talks with friends about smaller portable computers – how and when they’d come and what they would look like.
I know 1.5 decades can seem a very long time sometimes, but it isn’t really.
I’d disagree with that – programmability of those is fairly limited (especially ’70s models…), and ultimately they were not that different from their simpler siblings, very “focused” on the same kind of usage scenarios.
Meanwhile, the essence of a smartphone seems to be a convergence kind of device, and keeping track of more “soft” and “human” matters – therefore, I believe that more deserving of “smartphones real ancestors” are such smartwatches from the ’80s: http://pocketcalculatorshow.com/nerdwatch/fun2.html (tracing their ancestry to calculator watches hence also electronic calculators – still, a different lineage from scientific ones)
Their main problem seemed to be that a watch is simply the wrong form factor for such functionality – which was later solved by the synthesis of “mini tablet” PDA with mobile phone.
Well, not much of a bet, since even today you can buy some really good looking all-in-one PCs where everything is built inside the screen, with decent cpu, {s/h}dd and ram. The only thing missing is double-triple screen size with – and this is important – double-triple resolution and high dpi. And for the love of god, not some Apple-ish “retina” crap resolution, but real proper high resolution displays.
Memory is a b*tch. Netbooks fall into a dim territory: small screen, weak cpu, low memory, low storage – more the specs of a phone than a portable computer. I’d say the whole netbook thing was a bad thing and everyone should just forget about it as quickly as possible and move along. Also, mixing ultrabooks and netbooks is a bad idea.
Oh come on. Tablets might just turn out to be one of the primary computer interfaces of the future, but not primary computers, for sure. Unless you count a 2m wide screen with 24 core cpu and 48 gigs of ram and 10TB of storage space with touch capabilities in your living room a “tablet”. In that case, it just might be the primary computer of the future
I think you’re a bit mistaken regarding the target demographic of the R.Pi.
Another person who always forgets about who creates the content for you people: developers. Right. Sound and touch my a**.
So? We might just largely return to the general model of workstations, for dev usage.
Unless you also think that, to happily live in a house, it is essential to always have at hand a drafting table and a supply of raw building materials. Or that a good car requires an onboard (miniature, obviously) design studio and assembly line…
Present netbooks are as powerful as laptops from a few short years back BTW – quite decent, considering that we have generally had more than enough processing power for longer than that.
I know it’s the trend to talk about “smartphones” as if they were a revolution, but they’re not. Actually, the concept of the smartphone is a marketing term used by some manufacturers to advertise and differentiate their products, but in reality it’s just the high-end class of phones. This class has existed for more than a decade. In 2000 we were surfing the web in text mode over WAP, i-mode, etc… and we were using J2ME applications for everything. This class of phones has been sold by the billions for a long time. There were like 100 phones for 1 PC in 2000. I’d even say there are LESS phones sold these days than there were in 2005.
So modern “smartphones” are far more capable than the previous ones, that is true, but the previous ones still qualify as computers – or else the Apple II does not.
As an ex-user of Psion PDAs and Nokia 9000-series Communicators I’d have to agree. There is simply no decisive changeover point between a “featurephone” and a “smartphone”.
For many years now even basic phones have played media files, browsed the internet, taken pictures, and run “apps”. The move from dedicated PDAs and phones towards single units was not “revolutionary” but rather a gradual process that began in the early 1990s, many years before ~either~ Apple or Samsung were making smartphones.
My point: the line between PDAs, basic phones, smartphones, and other devices has always been thin and gray.
When we are talking “everyone” I always think about those who may find smaller devices less comfortable or even impossible to use.
You do not have to be impaired in any way to see that even the best laptop computer is in fact not something a physician would recommend for extended use … let alone smaller devices.
Actually, I stumbled once on some research seriously exploring, from a medical standpoint, the ergonomics of computer usage – and the laptop turned out to be fabulous, enabling a position with very little strain in extended usage…
…which was nothing like how people usually use computers, at a ~desk – basically, it was a half-lying position with gently bent knees, all supported by pillows or some such.
Hh, interesting… could be! Laptops are indeed very flexible as far as the posture goes.
But personally I find laptop displays just too small and positioned too low (too close to the keyboard). It is a strain. And a laptop with an extremely big display or external monitor is not really a laptop anymore.
Though the screen size is relative, it being quite close when using a laptop …plus, in the position I mentioned, it ends up being sort of closer to the height of eye level. :p
I actually have a properly set up chair and desk and never have problems.
Any time I even go near a laptop for a while longer than a quick 10 minutes I already experience discomfort.
So I have some doubts about that, but I do wonder if there is some link.
My guess is, it takes away the biggest advantage that a laptop or tablet or even phone has over a desktop computer, which is: mobility.
Well, research generally tends to trump personal anecdotes and feelings…
(how much of a ~placebo-like effects with those, here, regarding ~workplace layouts? Even more so if somebody has doubts when confronted with info contrary to long-held beliefs – quite a few of cognitive biases manifest themselves in such scenario. And I could quickly dig up some loosely related examples: http://plan9.bell-labs.com/wiki/plan9/Mouse_vs._keyboard/index.html or how, contrary to some people praising trackpoints, actual research suggests that touchpads are superior… http://cat.inist.fr/?aModele=afficheN&cpsidt=18522893 & http://en.wikipedia.org/wiki/Pointing_stick#Comparison_with_touchpa… – and note that voices supportive of clit are of “subjective opinion” in character; and personally I do like trackpoints, I’m used to the concept, but…)
Anyway, most human dwellings have a ~bed of some kind, so adopting the general position I described, while being portable, shouldn’t be much of a problem ;p (OTOH, yeah, overall mobility probably greatly helps by itself, allowing for quite great variability in body positions)
all this was obvious for years… for anybody who has been watching the evolution of home computers starting with the early ’80s
anyway, good article
It’s just a shame home computers are not really mentioned, apart from the Apple II and TRS-80.
The article says the IBM PC arrived and dominated. It did so only after a while, starting at work before invading homes. The first PCs cost a fortune.
No mention of the ZX81, which was pretty affordable for the masses.
An even worse transgression: no mention of the Commodore 64, the best-selling single computer model of all time, which occupies a very large part in some of the linked diagrams http://jeremyreimer.com/postman/node/329 – it was basically the only thing ever really competing numerically with the IBM PC.
Still, I wonder how much the numbers used for those graphs are skewed toward the North American market – apparently ( http://en.wikipedia.org/wiki/ZX_Spectrum ), the Spectrum family sold 5+ million units not counting clones; considering that the C=64 sold ~15 million, the Speccy should be easily visible on the graph & much larger than “Other”.
Oh, and no mention of the Amiga, the sign of things to come WRT multimedia for the masses.
But what really surprises me is that you, MOS6510, didn’t grumble about those two omissions
I didn’t want to come across as a fanboy, having already mentioned the C64 a few times in comments under other topics. The C64 does make a small appearance in the graph.
But if the topic is computers for the masses, the C64 certainly should be mentioned, as Jack Tramiel wanted to keep it relatively low cost to allow more people to buy it. The ZX81 was affordable for the masses; the ZX Spectrum/C64/Amiga/Atari computers were bought by the masses.
Maybe Linux should get a mention too, an operating system so cheap anyone can afford it and it runs on all kinds of hardware.
It kind of seems the writer just wanted to jump to the points he wanted to make about the more recent stuff.
But to be fair, it would make a very long article if he gave everything its credit.
When did that ever stop you before? ;P
TBH I find this particular series of articles somewhat devoid of much real content/insight… like the writer just wanted to publish something (but hey, gives a decent excuse to waste time in the comments)
Ah, who cares about info or insight, I just enjoy having old computers being mentioned.
These days companies like Dell, HP, Acer, etc… have 25,000 different PC/laptop models EACH. It’s hard to get nostalgic about any of them in a number of years.
When you mention a C64 or ZX Spectrum people’s eyes light up. I can’t imagine the same effect if, in 10 years, I mention a Dell Optiplex 755.
But I guess these days it’s the operating system that creates the memories and experiences, not the computer itself.
You got that right!
Thing is, most home computers almost certainly don’t elicit that response, also forgotten… (like most from http://en.wikipedia.org/wiki/List_of_home_computers – and I bet that list is still far from exhaustive)
In 2-3 decades (C64 or Spectrum are not about “10 years” timescales) I guess the consoles of today will cause that light in the eyes. Yeah, supposedly a bit different category – but let’s be honest, home computers were almost exclusively about games.
PS. What’s with the new weird avatar… (from some ~RPG game, I imagine). And, most importantly, why – while pixelated – is it enlarged in a way which “blurs” the pixels…
that was my first thought also but I did not want to spoil the article.
yes, the author should definitely read: “THE HOME COMPUTER WARS” by Michael S. Tomczyk
or, at least, could watch: http://www.youtube.com/watch?v=sIcAyFVK0gE
“The first big step toward computing for everyone occurred in the late 1970s with the wide popularity of the Apple II and TRS-80.”
That is a rather US-centric POV. Elsewhere, the Sinclair ZX-series and the BBC Micro ruled the roost. And let’s not forget the Atari/Amiga wars slightly later.
Computing technology reaches its apogee when it disappears from view, when the user of the technology is not aware of the technology being used and is only aware of the task being undertaken. All good technology should vanish from sight; the more in sight it is, the more the user has to think about the tool and not its function, and the worse the tool is.
Smart phone technology is a big step forward. You tap an icon and speak to someone, or dictate some text which someone else can read, anywhere in the world. You approach a location and the device gently reminds you to do something, you tap an image and a flood of information is available, you see something and instantly capture an image of it.
As fantastic as smart phones are they still require too much thought to use, they still get between you and the activity or function. There is still a way to go yet.
Disappearing from view is a good thing for the masses who just want things to work and don’t want to bother learning how to operate something, let alone read a manual.
BUT! I think there should always be the option to tinker around. I don’t mind an iPad being a closed system, but it would be a sad day if you were no longer able to build your own computer and install an alternative OS on it.
When things start to become simple and people expect things to “just work”, it’s not such a small jump to a situation where a government forbids any computer devices that users can “tamper” with, for they may interrupt services of the it-just-works machines/systems/services.
The movie and music industry wouldn’t mind that happening for it makes pirating less easy, nor would the government mind in their fight against terrorists, hackers and tax evaders.
I would pose it this way: if you are designing a tool should you design it for the 1% who are interested in the tools themselves or for the 99% who just want to use a tool to do something else and not because they are interested in the tools themselves?
Neither group – those interested in the tools themselves and those only interested in the thing the tool allows one to do – is right or wrong.
The tool lovers will always be a small minority.
Any company making tools will probably focus on the much larger group, those who only want tools for doing something else rather than for tinkering with the tools themselves. A company who makes tools so well designed and easy to use that they become invisible will probably be very successful but the design approach that created the invisibility of the tool will probably be based upon a design approach that makes tinkering with the tool harder and make the tool less satisfying for the small number of people who are mostly interested in the tool itself.
There are no rights and wrongs in tool design. Just trade offs.
The world is becoming more and more dependent on computers and Internet. When something doesn’t work, like a government website, Amazon or 4G, more and more people will get annoyed.
The more people that are annoyed the easier it will become for governments to “protect” these people from annoyance. They will attack the causes of system interruption. Hackers, unpatched PCs that are part of a bot net, overly creative IT students, computer users that pressed the wrong button.
What if they forbid the use of computers that can be “tampered” with?
This may sound a bit Big Brother and sci-fi, but I don’t think it’s that far fetched considering the movement towards IT dependence. Any government would love to be able to control the entire network, from the servers to the computers at home.
People kill millions of sharks each year, a shark grabs a surfer and there is talk of killing them all off so people can safely surf. It doesn’t take much for a crowd to support extreme ideas.
99% of the people would be fine with it, but the 1% still accounts for the entire Linux community, and they have over 5,000 Linux distributions. So 1% is still a market.
Albeit a very small one. And being very small means less clout in terms of setting design agendas for OEMs seeking to maximise the scale and profits of their business.
I cannot see how the trend to the sealed box type of computing, both physically sealed and sealed in the sense of hiding most of the file system and OS from view or user manipulation, can be reversed. If anything that trend is likely to accelerate, and it will do so because for 99% or so of users it will improve device functionality and the quality of their user experience. This may be a negative trend for the small number who like to tinker but it will be a positive thing for the majority who definitely do not want to tinker.
I agree with this but at the same time there’s no reason to worry that pc components are going to disappear from stores any time soon. Every big vendor & developer I’ve talked to about it has basically laughed it off saying the theory that they want to shoehorn everyone into these completely closed & pre-made systems is nothing but paranoia. Hardware makers aren’t exactly jumping at the chance to eliminate their own revenue streams.
When it comes to computers, the sky is _always_ falling. We are always on the brink of Armageddon. If there’s 99% of anything, it’s FUD with 1% being real & credible things to be concerned about.
Is that what you’re telling yourself when some app gets booted from the appstore because it might possibly compete with an (even future) function of the OS & appstore owner, or when an effort is exerted to lock 3rd party headphones out of a DAP line, or when very subpar processing in a consumer NLE essentially destroys the footage? ( http://eugenia.queru.com/2009/04/11/stay-the-fuck-away-from-imovief… )
What do you call “build your own computer”?
http://members.iinet.net.au/~daveb/simplex/ringhome.html
Perhaps a little more “modern” than the examples from your link. :-p
But more the general notion of having the freedom to do with technology what you want. Like building a PC, adding some expansion card and hooking it up to your digital TV connection, or installing Linux.
More and more stuff will become automated/connected so companies and governments want the people to be less and less able to even have the possibility of messing with it or want to know exactly what you do.
Buying computer parts might be looked upon the same way as buying stuff to build a bomb in the future.
At least one of those machines has a Contiki port ( http://mycpu.thtec.org/www-mycpu-eu/contiki.htm ), a modern OS – your argument is invalid ;p
(but, seriously, the FPGA designs can be quite capable…)
And we mostly want it all integrated… (laptops are the majority of PCs sold; and when was the last time you wanted to change just the HDD bus controller, or an FPU?)
Well… http://xkcd.com/651/
Actually the foundation doesn’t target consumers; the initial version that is on sale is a developer board – they wanted to release it so that people could write software for it. The fact that it is so hugely successful amongst non-developers was pretty surprising to them.
The actual target market is schools, and the release to them will include a bare-bones case.
There’s very little difference between what you call the developer board (which is really called the model B board), and the coming “educational” model A board. What you get today is nearly the same as what you will get in the coming months with the official release — assuming of course it comes out any time soon.
What I find funny is how much they’ve gone out of their way to demonstrate the Raspberry Pi’s ability to perform as a media playback device, and play 3D games.
Btw, they’re now offering the hardware mpeg2 codec for purchase, priced at just a few bucks. Great news for people wanting to play back those AVIs they downloaded.
Absolutely, very little difference indeed (after all, what would be the point in having it be substantially different). Other than the case, the educational release of the model A will also come with educational materials for use in the classroom.
They very much did originally intend it to be used for educational/development purposes, which was their reasoning for not including the MPEG-2 licenses out of the box. While they are providing this as an add-on, I don’t think they will be going all-out consumer-focused any time soon.
Not sure where you heard that but it’s not true. According to the foundation themselves, the only reason the mpeg2 license wasn’t included was because they couldn’t do so and maintain the $35 price.
They’ve been clear that providing people with a cheap & usable computer is one of their primary goals. So is staying in business (yes, even non-profits are businesses) and for that reason alone they understand the importance of accommodating common use needs. This is exactly why the mpeg2 and vc1 licenses are now available for purchase.
So I wonder how fast those “codecs” of sorts will find their way onto tpb…
(nah, not really wonder)
“Back in the 1990s, who would have thought that smartphones would popularize computing?”
Where do people get this stuff? By the end of the 90’s nearly a trillion computers had been sold. It’s safe to say computing was popular long before the introduction of the “smartphone”.
The interest in the Raspberry Pi was far greater than the foundation anticipated, but that has not translated into massive sales. The majority of people who showed initial interest didn’t actually buy one. More potential customers left due to the unavailability and ridiculous lead times on orders. The Raspberry Pi is doing better than expected, no question. But to say it’s selling like mad is misleading.
And yes, there are certainly some downsides to the device. A lot of people thought they were buying one thing and found out it’s not what they expected. The Raspberry Pi is absolutely no replacement for a desktop, laptop, tablet, or otherwise. The thing has little computing power and is slow (which is why everyone cross-compiles for it). There’s no gui/desktop acceleration (yet) either. The Raspberry Pi reminds me of a mid range cell phone without cellular capability or a case.
10^12 computers? (and that’s using the short scale – even more curious with long, 10^18)
I kinda doubt there was, by the end of the 90s, on the order of 100 computers per human …where do you get this stuff?
Just look at the specs of the just-released (or soon to be released) DUO (though the UNO and LEONARDO are interesting as well).