Linked by alcibiades on Sun 28th May 2006 15:10 UTC
Hardware, Embedded Systems

In this, the 60th anniversary year of the computer, it may be interesting to look back at a couple of key events in the evolution of this very important market. This is a market now amounting to an extraordinary number of machines. In 2010, the last year for which we have numbers, no fewer than 10 million machines shipped! This growth and penetration is unparalleled in the history of industrial products over the last 100 years, and is an amazing success. However, to get to this stage, the industry had to make its way through some issues and decision points. There are generally agreed to have been a few key turning points. What would have happened if they had gone differently?
by Ronald Vos on Sun 28th May 2006 16:02 UTC
Ronald Vos
Member since:


Reply Score: 5

RE: Eh?
by on Sun 28th May 2006 16:08 UTC in reply to "Eh?"
Member since:

I must have gotten even more drunk than I thought, on what I thought was last night.

Reply Score: 1

RE: Eh?
by dylansmrjones on Sun 28th May 2006 16:28 UTC in reply to "Eh?"
dylansmrjones Member since:


Reply Score: 1

RE: Eh?
by Hugo on Mon 29th May 2006 14:27 UTC in reply to "Eh?"
Hugo Member since:
by maxx_730 on Sun 28th May 2006 16:14 UTC
Member since:

I didn't even drink last night! I played Risk all evening while drinking orange juice! Really, I did!

OK, seriously, does someone know how we should interpret this? Is this some kind of future scenario from someone who can't compute? (2010-1980 != 60). Or is Thom on acid again?

Edited 2006-05-28 16:26

Reply Score: 1

by Thom_Holwerda on Mon 29th May 2006 00:26 UTC in reply to "WTF?"
Thom_Holwerda Member since:

Or is Thom on acid again?

Drugs are for kids. Other than that, read the article again if you want to understand it. It's not that hard to see what this guest column is trying to prove.

Reply Score: 1

by Omega Penguin on Sun 28th May 2006 16:25 UTC
Omega Penguin
Member since:

Right idea, wrong decade.

Reply Score: 1

by atsureki on Sun 28th May 2006 16:36 UTC
Member since:

It's just saying that without directly competing hardware vendors and open standards, technology wouldn't move forward on any front. A very confusing way to present that information, though.

OK, Mr. Wizard, now that we have the hardware, why is most of the world stuck with a single software vendor? Where's the competition to push that technology ahead?

Reply Score: 2

RE: Weird
by hobgoblin on Sun 28th May 2006 23:53 UTC in reply to "Weird"
hobgoblin Member since:

Well, there is the penguin, the devil, and some others...

Reply Score: 1

by czubin on Sun 28th May 2006 16:39 UTC
Member since:

Can someone please explain the article? ;)

Reply Score: 4

RE: heh
by netpython on Sun 28th May 2006 16:57 UTC in reply to "heh"
netpython Member since:

It's an Apple commercial :-)

It is, we now see, only in a world in which hardware and software are developed, manufactured, marketed and supported by the same organisation, that customers can really get the quality and stability they most deeply desire and need.

Reply Score: 2

RE: heh
by alcibiades on Sun 28th May 2006 19:43 UTC in reply to "heh"
alcibiades Member since:

Can someone please explain the article?

Alcibiades is unwell and cannot respond personally to questions. He is heavily sedated, following a nasty attack of spatio-temporal displacement syndrome. It appears to have been triggered by reading the following article.

Anyone susceptible to this syndrome should exercise caution while reading it. You wouldn't want to end up like Al...

Reply Score: 2

RE[2]: heh
by PunchCardGuy on Mon 29th May 2006 10:47 UTC in reply to "RE: heh"
PunchCardGuy Member since:

Huh??? The linked article compares the product development models of commodity, dedicated-function consumer devices like the iPod to general-purpose computer software (and the hardware that supports it), like MS Office and the like. Definitely not an apples-to-apples (pun intended?) comparison, and therefore, in my view, barking up the wrong tree. And as pointed out, MS is using a similar development methodology for similar products like the Xbox, etc. However, I suppose that if one is a regularly contributing columnist obligated to write articles on a regular basis, it may be difficult at times to come up with meaningful topics.

Reply Score: 1

by Dave_K on Sun 28th May 2006 17:09 UTC
Member since:

These kinds of "what if?" scenarios work better when they have some degree of plausibility. The idea that the non-existence of PC clones would mean that the world wide web never existed, or that offices would all be using terminals connected to IBM mainframes, seems very, very silly to me.

Personally I find it very unlikely that IBM would have had so much success in that situation. I think that a lot of the success of the IBM PC platform was due to the cheap clones, rather than the IBM brand itself. After all, the reputation and image of IBM didn't stop companies buying those cheaper clones rather than "the real thing". If IBM only offered crippled terminals then surely the many alternatives would have been more attractive to businesses, and IBM would never have come to dominate the market?

The author drastically overstates the problems caused by the lack of a standard platform. After all, Mac and Linux users don't have much trouble existing in a Windows dominated world. Even in the days of 8-bit computers there were cross platform applications and file converters. If there were more competing platforms then I think it would encourage companies to create more cross platform software and standard file formats.

I don't really understand the author's reasoning regarding the lack of a standard platform killing the Internet. When did the Internet ever require a standardised platform? After all, it existed before IBM started work on the PC, and today servers use a variety of hardware and operating systems. Why would things like email not exist in that world when they predate the PC?

Then there's the world wide web; I don't see how changes to the PC would have had any effect on its development by Tim Berners-Lee. After all, he developed it on NeXTSTEP running on 68K hardware, and IIRC web browsers appeared for the X Window System and Mac OS before DOS/Windows. Hardly an example of something that would have been killed by the lack of standardised hardware and software...

Particularly silly is the comment about digitised books not existing in that universe due to the lack of an open format. What about plain ASCII text? Most of the documents on Project Gutenberg are available in that format and it dates back to the 1960s.

Another thing I don't find plausible is the idea that hardware would be far more primitive than it is now, with 100Mb hard drives, limited RAM and basic graphics being the norm in 2010. Even if you take the IBM PC out of the picture, what about competition between other home computer companies like Apple and Commodore?

Look at all the progress that was made in a few short years in the early days of the home computer. For example, compare 8-bit computers from the early 80s, like the ZX Spectrum and C64, with 16 bit computers available by the mid 80s, such as the Amiga and Atari ST. Why would that level of progress have stopped because the IBM PC went in a different direction?

Failing that, what about the games market? I think people underestimate how many advances in graphics, sound and storage were pushed by the demand for more impressive games. This would all have driven the creation of more advanced hardware, and IBM would have had to adopt it or fall behind.

Of course it's impossible to know for sure what the computer industry would be like today if the history of the PC had been changed, but I can't imagine it being anything like this world the author has dreamed up.

Reply Score: 5

by netpython on Sun 28th May 2006 17:26 UTC
Member since:

What about plain ASCII text?

What about PDF?

Reply Score: 1

What joke
by Earl Colby pottinger on Sun 28th May 2006 17:54 UTC
Earl Colby pottinger
Member since:

The basis of this projection is wrong in that it takes no note of the number of other manufacturers of microcomputers before and after the IBM-PC was delivered.

If IBM had tried to force buyers into the software/hardware model shown in this article, the S-100-bus machines might have taken over.

While not completely standardized, you could buy hardware from a number of different suppliers that would work in your machine - if you could get the drivers working.

The IBM-PC basically won because:

1) It had the IBM name behind it.

2) You only needed one driver model for all early IBM PCs.

3) The full hardware write-ups in Popular Electronics and Byte did not hurt it a bit. I don't remember ever seeing such a complete write-up on the S-100 designs before that, probably because all the different S-100 machines had different, if interchangeable, hardware. The IBM-PC had only one model for the first year or so.

And maybe most importantly.

4) The major non S-100-bus microcomputer players dropped the ball:

Apple concentrated on education at the time; they did not add the features to the pre-Macs that businesses wanted. And the Macs were too expensive for low-end business work.

Commodore's management was greedy, paying themselves instead of funding development. The C64/128 models were never easily expandable enough, and the full range of the Amiga line was never reached in the proper time frames (the A2000 still shipped with a 68000 instead of a 68020 on the motherboard); it was always behind the CPU state of the art.

Atari had a gaming image that would have been hard to break out of, but they did not help matters by crippling the Atari-ST line with a limited data and address bus. If any early machine needed a 32-bit data bus and a 24-bit address bus from day one to get maximum performance out of it, the Atari-ST was it.


Correct me if I am wrong, but most of the other remaining manufacturers at the time would have had a problem maintaining the advertising blitz that we saw from the above companies. But if they had gotten together to agree on driver design and shared advertising, they could have swamped a rigid IBM sales model.

On the other hand, after thinking about the egos and very dumb mistakes of the S-100 crowd, I have changed my mind - they were doomed because they could never agree on a standard design that all basic business software could run on without modification.

Reply Score: 4

by JMcCarthy on Sun 28th May 2006 18:27 UTC
Member since:

Can I write a column if I snort a line too?

Reply Score: 5

RE: Awesome.
by Alex Forster on Mon 29th May 2006 19:04 UTC in reply to "Awesome."
Alex Forster Member since:

A +1 doesn't express how hard I laughed...

It's funny cause it's true!

Reply Score: 1

Hear me
by pompous stranger on Sun 28th May 2006 19:00 UTC
pompous stranger
Member since:

People of earth! I am from the future!

Reply Score: 5

Member since:

Right off, I think the article is a bit extreme: it takes a simplistic view of how standards evolve and ignores the fact that not all standards are created equal.

Take the example of the internet. The "internet" as we know it represents a collection of standards that are oblivious to the hardware platform or software application that implements them and, in a manner of speaking, devalues the underlying infrastructure by providing universal communication. That is why organizations like Microsoft and AOL fought so hard against internet adoption, lost, and were forced to adapt.

Hardware standards aren't altruistic; they're driven by business requirements. For manufacturers there are significant gains in economies of scale that can reduce costs and broaden their potential market, with those gains often outweighing the expense of maintaining proprietary technology or the risk of losing a controlled market. It's ultimately about profit. Apple's move to Intel is an ideal example of that. The more disparate a group of existing technologies, the more likely a standard will evolve or become established.

Sometimes the standard is mutually agreed upon, often through standards bodies. Sometimes the standard is determined by the market in a case of survival of the fittest (not necessarily survival of the best). Before hardware or software needed to be deemed "PC compatible" to find a successful market, it needed to be "Apple compatible" at a time when the Apple ][ was the leading computer platform. It's also interesting to note when looking at IBM and Apple that both platforms soared in popularity and acceptance due to the availability of clones or third-party hardware, and both manufacturers failed miserably when they attempted to regain control of their market by implementing proprietary technologies, which the market rejected. Unofficial standards are born.

There was also a time when printers needed to be "Epson-compatible" if they were to be taken seriously. Why? Epson established itself early as a market leader, and software was written specifically for its printers. It was much easier for printer manufacturers to gain acceptance by adopting the Epson control set than by trying to convince software vendors to keep revising their software for different printers. Of course, Windows changed all that by unifying the printer interface and giving printer manufacturers the freedom to provide their own drivers - a standard of a different kind.

Software is a different kettle of fish. What defines a software standard? The platform it runs on? The document formats it supports? Microsoft Windows and Office are two examples of products that are proprietary and lock customers in to Microsoft's platform, but at the same time they also give users access to the widest range of hardware and software options that can scale from home users to global conglomerate requirements. So Win/Office is proprietary and anathema to standardization at the same time that it likely encourages standards adoption.

I think the whole point is that the market and the industry would eventually have converged on standards-based computing. It was inevitable. Sure, certain events were key and influenced it, but in the absence of those events others would have stepped in.

And even aside from that, concepts like the internet or open-OSes like linux or BSD were equally inevitable, if not more so, since they reflect the ultimate example of users adopting and embracing standardization even in the face of established, big-business corporate-lockin opposition.

Just my 2c.

Reply Score: 3

by Kroc on Sun 28th May 2006 19:10 UTC
Member since:

How about we just focus on the present, k?

Reply Score: 1

Member since:

1. Learn math

2. Learn history

3. Use common sense

4. Write about what you know, and don't try to state your ideology in a "What if?" scenario that doesn't use the first 3 suggestions/rules

5. Don't use drugs!

6. Don't quit your day job to become a fiction writer ;)

Reply Score: 4

by sobkas on Sun 28th May 2006 19:56 UTC
Member since:

I don't know, but from Wikipedia:
"It was unveiled on February 14, 1946 at the University of Pennsylvania, having cost almost $500,000"
"the decimal-based American ENIAC (1946) which was the first general purpose electronic computer"

So I believe 2006-1946 == 60. Not every computer is a PC.
P.S. This article is fictional; I don't see any facts that prove the author's claims.

Edited 2006-05-28 19:57

Reply Score: 2

by JonathanBThompson on Sun 28th May 2006 20:19 UTC in reply to "ENIAC"
JonathanBThompson Member since:

Wikipedia isn't 100% correct on things; that's why it can be edited by many people, which leads to... Wikipedia isn't 100% correct on things!

There were definitely computers before ENIAC, though I don't care to check all the gory details, since I'm not writing an article based on that detail. But yes, if there are going to be time references in an article that's supposedly set in the future, it's good to get the timeline written down and lock in which year the article takes place ;)

sobkas, the author is projecting his wishes for what his ideology says would have happened, without taking off his blinders to how things work in business.

Reply Score: 2

Single vs commodity
by Hae-Yu on Sun 28th May 2006 22:05 UTC
Member since:

It's a facetious article demonstrating what might have happened if end-to-end vendors like IBM, SGI, Amiga, Sun, Apple, and the other industry dinosaurs had managed to lock out commodity vendors. If Wintel hadn't broken their backs, a similar scenario is what we'd have to deal with today.

I think the internet and email scenarios may or may not have worked out, but hardware would definitely be behind where it is now. Look at Apple and graphics cards: even now, they scrape the bottom of the bucket with video cards. Apple doesn't push standards - their most innovative hardware accomplishments in the last 10 years were dropping the floppy and antiquated connections. They push cosmetics. They are just typical of the closed vendors.

Look at HDDs and the falling cost of storage and RAM upgrades over the years. Seagate vs WD vs Maxtor (oops, Seagate now) vs Hitachi (IBM couldn't keep up) vs Samsung creates forward progress. NVidia vs 3Dfx vs Matrox, and now nVidia vs ATi, provides progress in graphics. Now that AMD has given Intel a serious run for the last 2 years, one more block has fallen and we have another open area.

Reply Score: 2

RE: Single vs commodity
by Dave_K on Sun 28th May 2006 22:55 UTC in reply to "Single vs commodity"
Dave_K Member since:

"I think the internet and email scenarios may or may not have worked out, but hardware would definitely be behind where it is now."

I don't really see the evidence for that.

Look at the massive progress that was made in the early days of the home computer, when Commodore, Apple, Atari, etc. were competing with each other. Would you really say that there was less progress in hardware between 1980 and 1985, than there was between 2000 and 2005?

Personally I think the jump from 8bit computers like the C64 and Apple II, to computers like the Amiga and Mac, is more impressive than the multicore CPUs, faster graphics cards and other refinements we've seen over the last few years.

Apple may not have produced many hardware innovations recently, but I don't think that was always the case. Look at the innovations by other "end-to-end vendors", for example Commodore who pioneered multimedia, or Acorn who created the first RISC CPU home computer. At the time the PC couldn't compete when it came to graphics and sound capabilities, and even in the 90s the Wintel world was still copying their innovations.

Or look at games consoles and arcade hardware, they are produced by closed vendors, yet they have generally been close to the cutting edge. The competition between companies like Sega and Nintendo was enough to bring down prices and push the development of new technology.

As long as there's competition I think progress of computer hardware is assured, regardless of whether it's created by closed or commodity vendors.

Reply Score: 2

RE[2]: Single vs commodity
by hobgoblin on Mon 29th May 2006 00:19 UTC in reply to "RE: Single vs commodity"
hobgoblin Member since:

I think the article talks about a two-punch combo, where the software lock-in we see these days with MS Office helps companies build so-called silos around their products.

Therefore the only competition will be about having new customers select your silo over that of a competitor.

OK, there could be a company like Microsoft that produces an office suite or similar that uses the same file format across multiple platforms.

But stuff like that can be controlled by proprietary OS libs that need a kind of "one platform only" contract before you get the docs.

There are all kinds of dirty tricks one can pull. I keep referring back to an example from the industrial revolution, where different screw producers used different spacings and so on for their products so as to lock the customer to them.

Still, I think the "article" is a worst case scenario. And I believe I read a similar one over on AnandTech, or maybe it was ExtremeTech, some time ago...

Edited 2006-05-29 00:21

Reply Score: 1

RE[3]: Single vs commodity
by Dave_K on Mon 29th May 2006 01:07 UTC in reply to "RE[2]: Single vs commodity"
Dave_K Member since:

"Therefore the only competition will be about having new customers select your silo over that of a competitor."

Maybe I'm being dense, but I still don't see how that would stop the progress of technology. If you bought a C64 rather than a ZX Spectrum, an Amiga rather than a Mac, or a Sega rather than a Nintendo, you were locked in to a certain extent. If you decided to switch to another platform you had to purchase all new software, deal with file format issues, and probably replace most of your peripheral hardware too.

Yet there was a huge amount of innovation and progress due to the competition between those platforms. Even in the author's worst case scenario, I don't see why that would change.

"There are all kinds of dirty tricks one can pull. I keep referring back to an example from the industrial revolution, where different screw producers used different spacings and so on for their products so as to lock the customer to them."

And despite that, the technological progress during that time was so great that today it's considered revolutionary.

I'm not arguing that "lock in" like that is a good thing, just that it wouldn't create the kind of technological stagnation that the author of the article describes. The only thing I can see doing that is a lack of competition caused by the industry being dominated by one company.

Reply Score: 1

RE[4]: Single vs commodity
by hobgoblin on Mon 29th May 2006 12:39 UTC in reply to "RE[3]: Single vs commodity"
hobgoblin Member since:

It wouldn't be a dead stop, but it would slow down to a crawl compared to how it is today.

The reason I say this is that I live in Norway, and here I see two mobile phone providers that have more or less split the market between them. OK, so there are a lot of virtual providers, but those rent capacity from the two big ones.

Basically, the introduction of new tech is more or less a crawl. They have barely gotten UMTS working in the cities, and we now hear talk about HSDPA (or whatever the name is).

"And despite that, the technological progress during that time was so great that today it's considered revolutionary."

But if you look into it, the development happened on a case-by-case basis. It was only when Ford was able to create a factory that ate coal and iron ore at one end and spat out finished cars at the other that things got really interesting.

Before that it was one-shot projects that required a strong personality at the helm who could stand up to the company representatives and demand things.

Without that you were at their mercy.

Sure, it was revolutionary compared to what came before, but compared to today it was a crawl...

Reply Score: 1

faster pc for less
by happycamper on Sun 28th May 2006 22:59 UTC
Member since:

The Evolution of Business Models in the PC Market

We are now able to buy and build super-fast systems for a fraction of the cost of the systems of yesteryear.

Reply Score: 1

by happycamper on Sun 28th May 2006 23:09 UTC
Member since:

This growth and penetration is unparalleled in the history of industrial products over the last 100 years, and is an amazing success. However, to get to this stage, the industry had to make its way through some issues and decision points. There are generally agreed to have been a few key turning points. What would have happened if they had gone differently?

A Mac on every desktop.

Reply Score: 1

by helf on Mon 29th May 2006 03:13 UTC
Member since:

Wha? That was the most pointless thing I have ever read.

Reply Score: 1

You win some, you lose some...
by alcibiades on Mon 29th May 2006 06:08 UTC
Member since:

I wrote the piece to entertain, amuse and provoke, but also with a serious thought in mind. I had written another, properly analytical piece about the same subject, which Thom may publish at some point if he feels it's appropriate, but an impish sense of humour got the better of me, so off this one went.

The serious thought is this. In several quarters, the argument is made that Dell's recent problems, and the consolidation represented by HP-Compaq, show that the current industry model is failing. People argue in particular that Apple and Sun succeed in being more consistently profitable, avoid competing purely on price, and deliver better quality products because they control both hardware and software, delivering a complete solution, and (not really true of Solaris, of course) only letting their OS run on the hardware of their choice, not yours.

Now it is obvious that in computing, at least in PCs, the so-called 'end-to-end model' has become trivially unimportant in terms of market share. What I tried to ask, and provoke thought on, was: what would have had to happen for the reverse situation to hold? How could the 'end-to-end model' have gained and kept a 97% market share? And what would a world in which that had occurred look like?

And would we like it better?

That's the real question. It is to some extent a live question too, because there are voices - and not, I would say, voices on the side of intellectual freedom - who genuinely would like to see a so-called 'end-to-end model' enforced by varieties of DRM and locking.

Anyway, that was the genesis, thank you all for reading and reacting, and sorry you didn't like it.

Reply Score: 4

happycamper Member since:

Why didn't you just write it in plain English, like you just did? You know, say what you mean.

Reply Score: 1

PunchCardGuy Member since:

I understand why this item was written. I have myself been a big fan of science fiction for many years, with a particular fascination for alternate universes. But if you are going to write articles like this, they should be more plausible.

Your fictional premise that IBM is able to protect its PC architecture and prevent clone manufacturers from becoming competitors is interesting. But I respond by saying that if that had been the case, then companies like Atari and Commodore would probably still be around today, and Apple would have a larger market share. So it wouldn't be an all-IBM (or almost all-IBM) world by any means, as posited. There might have been some of the issues regarding software compatibility across disparate platforms that you suggest, but there would still have been a lot of advanced software developed for the desktop systems that would be out there. And you also forgot about Bill Gates. The first software he developed and sold was a BASIC interpreter that ran on S-100 bus machines. He would certainly have developed and sold software for any dominant system architecture.

As for the Internet, the birth and development of the technology behind this had nothing to do with the PC. The Internet actually descends from a DARPA project that companies like Honeywell and BBN worked on to develop a world-wide computer network for military command and control. This network originally ran only on mainframes and dedicated communications front-end processors. It was only later with the proliferation of PCs that effort was undertaken to develop a means to interconnect these devices in a similar network, which is really the starting point of the Internet as we know it today. This could have been done with Apple, Atari or Commodore systems just as easily as with IBM PCs or clones thereof. There are no aspects of the protocols used on the Internet today that can be restricted by commercial interests, since they are all based on public RFCs.

Finally, regarding the comment about Dell and "the business model failing": wrong, big time. Dell's problems (if you can even call them that, since they are still an incredibly profitable company) are based on arrogance, greed and complacency. They have even stated themselves that they let their price points get too high, thus opening the door for other manufacturers in the desktop and commodity server space to gain market share with lower prices.

And where is Dell hurting the most? In the server space. Dell has focused primarily on smaller, general purpose servers that are used mainly for supporting MS Exchange, web servers and the like. I have many customers that like to rack and stack Dell PE2850s for just this purpose. But such servers are easy to manufacture, as they use commodity parts, so a lot of other companies are getting into this business, with lower prices than Dell in some cases. Where Dell really falls down is in the large enterprise server space that supports large-volume transaction processing and large databases. This is where Sun, IBM and HP do well, and Dell has a long way to go before they can be a player here. But they are still king of the desktop, and I see no indications of this changing anytime soon.

As for the mention of Compaq: they, too, got too complacent and let their cost structure grow to unmanageable levels. They also thought they could move in a big way into services and solutions selling, which did not at all fit their business model. And, of course, their emphasis on using the channel to sell commodity products, instead of selling direct like Dell, did not help.

Reply Score: 1

alcibiades Member since:

In the paragraph starting 'As for the Internet...' you are largely agreeing with me.

Here is how SiloWorld differs from today:

1) You can only buy a computer to run an OS from the OS developer.
2) The market is therefore fragmented and there is less hardware competition, which leads to higher prices, lower production runs, higher costs, and slower functionality increases.
3) The prevalence of incompatible islands and higher prices lowers consumer demand. The Internet exists as a set of protocols, but the apps hosted on it - shopping, searching and so on - are far fewer and less profitable, if they exist at all, because volumes are lower. Amazon may not exist. The dot-com bubble?
4) Fewer PCs installed means smaller network markets, and much scarcer and more expensive broadband, if indeed broadband in today's form exists.
5) The network islands that existed before continue; the Internet is not today's universal, application-rich and content-filled space. You still have CompuServe, the old AOL, eWorld, Prodigy. Because there are five or six incompatible OSes you don't have total freedom to choose; not all OSes are compatible with all BBSs. OnTyme probably still exists. Sending email from one BBS to another is iffy.
6) The application software space is fragmented by the incompatible OSes and the smaller PC market, so prices are higher and you cannot always get the combinations of software you want, which in turn diminishes the attractiveness of the PC as a consumer purchase.

And so you look down the shopping mall and you don't find stores selling software, or the local PC shop. You find the Apple dealer, the Commodore dealer, and so on.

There is no reason why Project Gutenberg could not exist. The problem is that too few people have PCs, scanners cost too much, and it's just not plausible as a method of distributing books. Do digital cameras exist? Yes, as professional tools, probably tied to the OS-plus-hardware vendor you picked. Digital music? Five different clients for the iTunes store? Five different OSes for the player to interface with? Where do the microprocessors get the volume production runs needed to bring prices down? Think how long it took to bring CDs to market.

I do understand and accept that people found the tone of the article irritating. But on the substantive point, I think SiloWorld would be very different indeed from today in most respects, and a lot less fun. And a lot less free. And most people are not thinking specifically enough to see that.

Reply Score: 3

Dave_K Member since:

"1) You can only buy a computer to run an OS from the OS developer.
2) The market is therefore fragmented, there is less hardware competition, which leads to higher prices, lower production runs, higher costs, slower functionality increases."

What I don't understand at all is why that fragmentation would mean less hardware/software competition and less progress.

The early home computer market was heavily fragmented, with dozens of companies making incompatible computers, each with their own OS, peripheral connections, and unique software. Would you really say that there was less progress then than there is now?

To me the switch from 8bit computers to 16bit systems running multitasking GUIs was a massive leap forward, and it happened well before IBM PC clones came to dominate the industry.

You make a point about the lack of volume production causing higher prices, but that assumes that every company would make their own unique hardware. That certainly wasn't the case in the 80s, look at how many 8bit computers from different companies used either the Zilog Z80 or MOS 6502, or how many 16/32bit computers used a Motorola 68k. Companies like Acorn who designed their own CPUs were in a small minority, and generally peripherals like 5.25"/3.5" floppy drives and printers were manufactured by 3rd parties and had a degree of standardisation. Why would that have changed due to the lack of PC clones?

While the fragmentation and incompatibilities caused problems it certainly didn't kill demand. The C64 is still the biggest selling computer of all time, even though you had to replace all your software and many of your peripherals if you switched to another computer.

Of course, even back then there were cross-platform standards, and adapters that allowed you to use hardware designed for different systems. I had no trouble using a minority platform (RISC OS) in the early 90s, despite most of the people I worked with using Mac, Amiga, or Windows systems. There were utilities to read differently formatted disks and converters for popular file formats. I see no reason why the development of that kind of product would have stopped in the future, so I find your "SiloWorld" a rather unrealistic worst case scenario.

When it comes to the Internet/web, I don't see why having a variety of incompatible platforms would change things significantly. In my opinion it was the popularity of the web, not the standardisation on one platform, that spelled the end for those "network islands".

Consider the fact that the first Web browser was created for NeXT hardware/software, while NCSA Mosaic was written for the X Window System and then quickly ported to Mac OS, Amiga OS, and Windows; web browsers for RISC OS, OS/2, the Atari ST, and various other platforms followed quickly. At the time the market was quite fragmented in the real world, yet the web quickly spread to all those incompatible hardware/software platforms. I just don't see why it would be any different in your alt-universe.

I know I'm repeating myself here, but I don't think anyone has really managed to address these points and I'd like to understand your reasoning.

Reply Score: 1

by Beresford on Mon 29th May 2006 11:17 UTC
Member since:

Anyone remember the TV program Sliders, maybe an episode from them ;)

Reply Score: 1

by Punktyras on Mon 29th May 2006 13:25 UTC
Member since:

<...>lipped about how exactly this is to be accomplished, do hint...

Reply Score: 1

Bad Drugs!
by xeniast on Mon 29th May 2006 16:30 UTC
Member since:

This is the result of Bad Drugs.

Reply Score: 1

Article is rubbish
by JaredWhite on Mon 29th May 2006 17:25 UTC
Member since:

I'm sorry to say this, but this article has more holes than Swiss cheese. The idea that a monopoly OS running on a sub-standard commodity platform created all the innovation of the last ~15 years, including the huge adoption of the Internet, is flat-out absurd.

Computers from Apple, Commodore, and Amiga were all ahead of the IBM PC in sophistication. Why? Because they were competing with each other! The competition had nothing to do with clone makers, price wars, and semi-standardized platforms, and everything to do with the fact that if someone compared Company A's platform with Company B's and Company B's was more powerful or easier to use, Company B won the sale.

The Web was invented on a NeXT machine. It and the other Internet services of the early-to-mid 90's were largely powered by Unix software running on a variety of hardware systems, and the clients were a wide range of systems including Macs. However, most consumer OSes had to have Internet access software added on by third parties. Windows, for instance, DIDN'T EVEN HAVE a TCP/IP stack until Windows 95! (I'm talking consumer-land here, not NT.)

All the major Internet protocols were invented from a Unix standpoint, and Unix has always been at home on a variety of hardware platforms. There's little reason to believe that only by having consumer PCs based on a single semi-standard could the Internet flourish. The first major Web browser, Mosaic, was quite Mac-oriented, in fact. So was Netscape. Windows for some time was considered rather a joke in terms of Internet support, and until IE 3, any Microsoft-specific Internet software was highly dubious in its impact.
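The point about protocols being portable is worth making concrete: Internet protocols like TCP are defined at the wire level, so the same byte exchange works on any OS that ships a TCP/IP stack, regardless of how incompatible the platforms are otherwise. This toy loopback echo exchange (a sketch I'm adding for illustration, not from the thread; the `echo_once` helper is a made-up name) shows the idea using nothing but standard Berkeley-style sockets:

```python
# Sketch: a protocol exchange is just bytes over TCP, identical on any OS.
# A server thread echoes back whatever a client sends over loopback.
import socket
import threading

def run_echo_server(server_sock):
    """Accept one connection and echo its bytes back."""
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

def echo_once(message: bytes) -> bytes:
    """Send `message` to a throwaway local echo server; return the reply."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    t = threading.Thread(target=run_echo_server, args=(server,))
    t.start()

    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(message)
        reply = client.recv(1024)

    t.join()
    server.close()
    return reply

if __name__ == "__main__":
    print(echo_once(b"GET / HTTP/1.0").decode())
```

The same source runs unmodified on Linux, macOS, or Windows, which is exactly why, once a platform grew a TCP/IP stack, it could join the same Internet as every other platform.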

I believe the late 90's dominance of IBM clone PCs, Windows, and the fall of alternatives actually stifled innovation. Only in the 2000's with Linux, Mac OS X, open source Web browsers, and open source server technologies like LAMP is the Internet really on an innovation track again.

Reply Score: 2