But as the blog post points out, even though the Osborne company didn't succeed, it proved that there was a market for portable computers, and the next couple of generations of portables stuck to the suitcase form factor that Osborne pioneered.
These days, the Osborne name is mostly known in marketing circles for the "Osborne Effect," wherein anticipation for an upgraded product kills demand for the existing product. There's no better modern student of this phenomenon than Steve Jobs himself, who has proven to be absolutely fanatical about keeping a lid on Apple product updates in order to avoid undermining the market for existing Apple products.
What this comparison got me thinking about, however, was whether the hardware or the software has been the more important driver of the computing revolution of the past 30 years. Obviously, it was impossible back then for a handheld device to be powerful enough to do more than simple arithmetic, and that put a severe limitation on what people could do with computers, especially portable ones. I don't think anyone would disagree that big high-resolution color screens, GUIs, 3D shading, and LiPo batteries represent a huge leap forward for technology.
But as I was contemplating the Osborne's software lineup (CP/M, WordStar, SuperCalc, dBASE II) and thinking about what could be done with such limited hardware, I started to think about the radical software advances of the past decades: TCP/IP and the popularization of the internet, email, messaging, the web, search, social networking. You could actually take a 1980s-level hardware infrastructure, layer software with modern capabilities on top of it, and get something really quite amazing. You could search for information with Google, keep up with your friends on Facebook, share your ideas on Twitter, collaborate with your colleagues over email and messaging, play an Angry Birds-like game, build a complex spreadsheet, or prepare a presentation for a speech. You'd have to give up VoIP, digital photography, 3D gaming, and all but the most rudimentary GUI, but you'd still have a decent computing experience.
Now let's consider the other scenario: modern multi-gigahertz, multiprocessor hardware, but 1980s-level software. It's too horrible to even contemplate.
Obviously, neither of these scenarios is plausible, because advances in software and hardware have gone hand in hand, but I'd venture to say that, especially when you consider how pervasive the internet has become since 1981, and the explosion of innovation that it spawned, the bigger revolution has been on the software side.