Gartner analysts took out their crystal balls and came up with a list of ten predictions that will impact enterprise businesses. The predictions cross technology, economic, and social boundaries and will morph over the next eight years. Read the article at ZDNet.
If this does happen, I hope someone is working on something that makes an 8-way 150 GHz computer worthwhile. I can't fathom what one would do with that. Does anyone have any ideas for programs out there for home/semi-pro use that would need supercomputer power? Are we going to have our own personalized weather models? If we are still just using them for email and Minesweeper, it will show we have lost our creativity.
> 6. Moore's Law continues to hold true through this decade
>
> Gartner gives Moore's Law, which posits that processor power doubles every 18 months, a 70 percent chance to continue unabated through 2011. Gartner projects that by 2008 the typical desktop computer will have 4 to 8 CPUs running at 40 GHz, 4 to 12 gigabytes of RAM, 1.5 terabytes of storage, and 100 Gbit LAN technology. By 2011, processors will clock at 150 GHz and 6 terabytes of storage will be common. And there are numerous technologies, such as nanotube transistors and spintronics, that could jump the next hurdle when CMOS reaches the end of its run, Claunch said.
I don't think that will be true, especially for the storage. Come on, I don't think they realize how BIG 6 terabytes is. What would you do with all that? Download movies? Not with an MS system you won't. But I do think CMOS and the x86 platform should be retired, because they are just too backwards compatible.
Loving Jesus,
Jon
> I don't think that will be true, especially for the storage. Come on, I don't think they realize how BIG 6 terabytes is. What would you do with all that? Download movies? Not with an MS system you won't.
If I could store all of my CDs uncompressed, I'd be looking at about 300 GB, give or take a few GB (not to mention adding more CDs before I have enough hard drive space to do so). Then maybe you get into DVD rips and TV shows recorded on your TiVo, which currently can take 300 MB per episode compressed.
It’s not that 6TB is big, it’s simply that until we have the space readily available, we won’t put it to use.
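For what it's worth, a quick Python back-of-the-envelope check bears out that 300 GB figure (the collection size here is a made-up assumption, just for illustration):

```python
# Uncompressed CD audio: 44,100 samples/s, 16 bits (2 bytes) per sample, 2 channels.
BYTES_PER_SECOND = 44_100 * 2 * 2   # ~176 KB of audio per second
MINUTES_PER_CD = 60                 # assume a typical ~60-minute album
NUM_CDS = 475                       # hypothetical collection size

bytes_per_cd = BYTES_PER_SECOND * MINUTES_PER_CD * 60
total_gb = NUM_CDS * bytes_per_cd / 1e9
print(f"{bytes_per_cd / 1e6:.0f} MB per CD, {total_gb:.0f} GB total")
# -> roughly 635 MB per CD, about 300 GB for the whole collection
```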
Another law to remember is that for every new technology that seems impossible to fill up/maximize/overrun/etc., a new thing comes along to waste the resources.
Just think how much space people would have today on their hard drives if it were not for MP3s and movies. Ten years ago nobody would have thought that in the future people's 60 GB hard drives would be full of "MP3s" "downloaded" "off of the Internet."
Or imagine you had Windows 3.1 or System 7 on a 1 GHz machine.
Just wait, something will come along to waste those resources. It always does.
Remember, the Soviets in 1952 were not designing a device to broadcast sports games to consumers' homes when they launched Sputnik. They just wanted to put a ball into space, and eventually a use for it was found.
“640K ought to be enough for anybody.”
— Bill Gates, in 1981
“Computers in the future may weigh no more than 1.5 tons.”
— Popular Mechanics, 1949
“I think there is a world market for maybe five computers.”
— Thomas Watson, Chairman of IBM, 1943
While I like his outlook regarding Moore's Law and the faster, higher-capacity desktop machines, I strongly disagree with part of his #1 point regarding system resources moving to more remote services and so on (sounds like a Microsoft .NET commercial, to be honest).
If today's trends of high-tech crime, terrorist threats, and Big Brother (oops! I mean Ashcroft) looking over everyone's shoulders continue, I can't see this happening.
Especially in light of the Moore's Law predictions. Let's face it: if I have 1.5 terabytes of storage and a 40 GHz PC, why in hell would I let some potentially untrustworthy, faceless corporation or government mind my confidential data for me?
Now if bandwidth picks up the way they're predicting, I can see stuff like interactive TV (or TV-like content) over the Net becoming more mainstream, but I can't see how greater capacity at home would ever translate into people trusting others with their sensitive data.
Perhaps 10 years ago this would have been a potential prediction, but if I have 1.5 terabytes at home (heheh… I kinda like the sound of that!), I can't imagine what I'd have that I wouldn't just store locally.
Now I do agree that (unfortunately) app configuration and such may move more towards the remote web services route, but even this is kind of "iffy" when you consider that if we'd been there 5 years ago, many people would now be without software (or perhaps just certain features of their purchased software) due to the dot-com bust.
Personally, I think that period opened people's eyes a little to the fact that there are no guarantees these companies will be here in a few years, let alone a few months.
(Enron, anyone?)
Suffice it to say that I feel more storage means more opportunities to keep your personal information local and potentially safer from criminals, governments, and corporations who are only thinking of themselves.
Gartner wins no prizes for vision here. Let me make one prediction that actually goes out on a limb: UNIX will kick Microsoft’s butt. In ten years, it will be easier to buy a TRS-80 than a Windows box.
> Remember, the Soviets in 1952 were not designing a device to broadcast sports games to consumers' homes when they launched Sputnik. They just wanted to put a ball into space, and eventually a use for it was found.
It was actually 1957; Sputnik launched on October 4 of that year. April 12, the Day of Cosmonautics marking Gagarin's 1961 flight, is still widely celebrated in the ex-USSR, and I find it a pretty cool holiday.
Seriously, I envy Mr. Moore: it is so cool to discover a Law that has no scientific basis at all, yet is still so precise after so many years!
It'll be great if network bandwidth, chip densities, and hard disk capacities continue to increase at a geometric rate, but there are some problems to overcome before the Ellison-Gartner vision of "network computing" is reached:
– the last mile. The telcos and their equipment vendors are in survival mode now, and aren’t likely to touch this for years.
- local system bottlenecks. To my knowledge, RAM and hard disk access speeds haven't improved according to Moore's Law, although this isn't an insurmountable problem.
– speed of light. You’re always going to have latencies moving bits across a country or metropolitan area, compared with accessing from a local hard disk. Each intermediate piece of equipment will also add latency.
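A quick Python sanity check on that last point (the distance and the comparison figures are illustrative assumptions, not measurements):

```python
# One-way propagation delay across ~4,000 km of fiber,
# where signals travel at roughly 2/3 the speed of light.
C = 3e8                    # speed of light in vacuum, m/s
distance_m = 4_000_000     # assumed cross-country path length
fiber_delay_ms = distance_m / (C * 2 / 3) * 1e3
print(f"fiber alone, one way: {fiber_delay_ms:.0f} ms")  # ~20 ms

# Order-of-magnitude local access times for comparison:
# hard disk seek ~10 ms, RAM access ~0.0001 ms.
# Every router or switch along the path only adds to the 20 ms.
```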
I will only address No. 6, Moore's Law and 150 GHz CPUs.
Clearly this study reflects a poor education in mathematics and physics. These guys must have gone to school with too many computers and no math/physics teachers. As a CPU and VLSI designer, I know very well what the speed of light is and how far a signal can propagate in 1 ns: not very far. At 150 GHz, single-CPU design would be extremely challenging, to say the least, since many clock cycles would be needed to propagate a signal across the width of a chip.
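A minimal sketch of that argument in Python (the assumed on-chip signal velocity of half the speed of light is generous; real RC-limited wires are slower still):

```python
C = 3e8           # speed of light in vacuum, m/s
f = 150e9         # the predicted 150 GHz clock
period_s = 1 / f  # one clock cycle, ~6.7 ps

# Assume signals travel at ~0.5 c on-chip (an optimistic figure).
distance_per_cycle_mm = 0.5 * C * period_s * 1e3
print(f"clock period: {period_s * 1e12:.1f} ps")
print(f"distance per cycle: {distance_per_cycle_mm:.1f} mm")
# -> ~6.7 ps and ~1 mm per cycle, versus a die 10-20 mm across,
#    so a signal needs many cycles just to cross the chip.
```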
This type of ever-faster single CPU smacks of the Soviet Union's way of organising the economy, when everybody knows that billions of free-thinking individuals can do it better.
Even Intel has admitted that staying on the same course will lead to intractable heat removal problems where the die is hotter than a nuclear reactor core.
So far the industry has been riding the easy part of the curve. What most do not see is that it is an S-curve, not a straight line to infinity. The curve will flatten out sooner or later, for many reasons I won't go into here.
The only real future for faster CPUs is significant use of multiple distributed lower-power CPUs that can spread out the heat. Once we all use multiple CPUs at whatever frequency, power should just be a matter of how many you want to get the job done and how to get rid of the heat. It all comes down to the software industry getting a handle on parallel programming for real, instead of dicking around with minor use of a few threads here and there when the APIs force it on unwitting developers.
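For illustration, here is a minimal Python sketch of spreading an embarrassingly parallel job across multiple CPUs; the workload function is a made-up stand-in, not anyone's real app:

```python
from multiprocessing import Pool, cpu_count

def simulate_cell(seed: int) -> float:
    """Hypothetical CPU-bound unit of work (say, one cell of a weather model)."""
    x = float(seed)
    for _ in range(100_000):
        x = (x * x + 1.0) % 1_000_003
    return x

if __name__ == "__main__":
    # One worker per CPU: the work, and the heat, spread across the cores.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(simulate_cell, range(64))
    print(f"{len(results)} cells simulated on {cpu_count()} CPUs")
```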
An immediate real boost to performance could come from directly compiling "some" software code into hardware using FPGAs, where speed-ups of 1x to 100,000x are possible right now, depending on the type of app. If the app looks highly repetitive and structured, such that a chip could be designed to speed it up (think DSP, etc.), then it can indeed be turned into a soft chip swapped into an onboard FPGA when needed. This type of software-to-hardware compilation can be done on the desktop with tools that can be bought now (Celoxica's Handel-C, etc.).

The exact types of apps that we already know could benefit from such single 150 GHz CPUs can already be built with FPGA accelerators, using the most natural forms of hardware parallelism there are: thousands of wires, pipelines, and good old combinatorial logic. And the growth in this SW-to-HW technology is currently much faster than a measly doubling every 18 months. FPGAs are catching up with CPU speed (though always a factor of 5 or so behind), and they benefit directly from the bigger dies made possible by shrinks, allowing more logic cells on board. Right now the state of the art (compare to a 3 GHz P4 for wow factor) is the Virtex 10000 series, with the equivalent of several million gates.
Would Gartner know this? Of course not; they are a bunch of know-nothings. Well, nanotubes and spintronics do look interesting, but by 2011 they might just produce something commercial to look at. Anybody can read EET and make up this nonsense.
New technologies often take far, far longer to turn into practical low-cost devices than pundits predict. What the industry is extremely good at is continuous refinement, until eventually crap technology turns into something amazing.
HDs, LCDs, CMOS, and microprocessors all took well over 30 years to become low-cost, high-performance, and pervasive!! All of these started in the '60s, when these pundits were in diapers. Nanotubes, quantum dots, and other exotics will likely take much longer to mature.
Let’s chop up their garbage predictions one by one.
1. Bandwidth becomes more cost effective than computing.
“cheap and plentiful bandwidth will catalyze a move towards more centralized network services, using grid computing models and thin clients.” What? I don’t think so.
Centralized grid computing makes sense especially when you don't have high-speed communication connections between geographically separated nodes, as is the case now. When you have cheap, very powerful computing resources that you can centralize into clusters with super-high-speed interconnects (Gigabit Ethernet and greater), you can spread the workload across many computers with minimal communication costs. When high-speed links become common between geographically separated computers and bandwidth prices drop to reasonable levels, there is less need for "centralized grid computers." High-speed communications will primarily encourage "distributed grid systems," not centralized ones; today's low-speed communications encourage centralized grid clusters.
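A toy cost model in Python makes that trade-off concrete (every parameter below is invented purely for illustration):

```python
def job_time_s(work_s, nodes, data_gb, link_gbps, latency_s, rounds):
    """Total time = parallel compute + data transfer + synchronization latency."""
    transfer_s = data_gb * 8 / link_gbps  # GB -> Gbit, divided by link speed
    return work_s / nodes + transfer_s + latency_s * rounds

# Hypothetical job: 1,000 CPU-seconds of work, 10 GB of data, 20 sync rounds.
central = job_time_s(1000, 32, 10, 10.0, 0.0001, 20)  # one LAN cluster
spread = job_time_s(1000, 32, 10, 0.01, 0.05, 20)     # nodes scattered over a WAN
print(f"centralized cluster: {central:.0f} s")  # ~39 s
print(f"distributed grid:    {spread:.0f} s")   # ~8,000 s
# The WAN's thin, high-latency links dominate until bandwidth gets cheap.
```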
One of the main challenges to the widespread use of someone else's "grid computer" is trusting that your data won't be compromised when it's on their equipment. Are you ready to trust an outside ".com" application service provider with your company's survival? Each business must carefully consider any dependencies that it places upon outside resources.
Another challenge is the creation of software systems that can effectively make use of grid systems, whether centralized clusters or distributed grids.
The real reason that "grid clusters" will continue to exist far into the future is to ensure information security and physical equipment security. Redundant, and potentially highly distributed, cluster sites will be essential for surviving large-scale disasters. Last year's events, their impact on the financial industry, and some companies' ability to recover using redundant backup sites were proof of this.
2. Most major applications will be inter-enterprise.
The evolution of applications had better head towards adaptive software architectures that can be reconfigured on the fly; otherwise we'll be running the same old applications on 10 to 40 GHz CPUs, and what a waste that will be. Unfortunately, with the continued and foreseeable proliferation of low-level programming languages like C, C++, and Java, we are not likely to make much progress.
The future is in higher-level systems that a wider audience can manipulate. As Alan Kay said, "the best way to predict the future is to invent it." Let's invent high-level systems, not languages, that the average high-school-educated person can "manipulate" and "program": systems based upon declarative ideas, as the sketch below tries to show. Let's take the "programming" out of "programming"!
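As a toy illustration of that declarative idea, here is a Python sketch in which the user states *what* they want and a generic engine decides *how* (all the names and the spec format are invented):

```python
people = [
    {"name": "Ana", "age": 34},
    {"name": "Bo", "age": 19},
    {"name": "Cy", "age": 27},
]

# The "program" is just a description of the desired result.
spec = {
    "where": lambda p: p["age"] >= 21,
    "order_by": "age",
    "select": "name",
}

def run(spec, rows):
    """A minimal engine that executes a declarative query spec."""
    rows = filter(spec["where"], rows)
    rows = sorted(rows, key=lambda r: r[spec["order_by"]])
    return [r[spec["select"]] for r in rows]

print(run(spec, people))  # -> ['Cy', 'Ana']
```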
The widest area of growth for applications is for individuals who must work with others to get their jobs done. Collaborative software systems support teams of people, but they must also support the individual person on the team. People who work with others are the real power users, since they are the ones who often (but not always) have the best ideas on how to accomplish or improve their work and communications. This is especially true in highly educated workforces.
The challenge between enterprises for collaborative systems is determining and maintaining what information will be shared with outside companies. It will be important to track shared information: who accessed what, and when they accessed it. Privacy issues are at stake. Enabling and disabling such information sharing between corporate entities will be critical. This needs to be dynamic, yet controlled and secure.
3. Macroeconomic boost from inter-enterprise systems.
The real boost to the macro economy will come from the boost in performance in teams of people that use highly effective collaborative software and communication systems. Once these teams of people are empowered within and across corporate boundaries massive improvements can occur. However, a larger more powerful economic force exists in the massive numbers of small businesses that will benefit even more since even tiny improvements in performance can mean the difference between staying in business and making a profit. The benefits of collaborative systems will enable “collaborative” teams or “virtual businesses” to form for specific projects. The ones that succeed may continue to form longer-term business relationships with their highly integrated and customized systems. Some of these will aggregate into larger more permanent businesses when it’s economically advantageous to do so. Look out big boys.
4. Successful firms in strong economy lay off millions.
Using collaborative systems, the former employees of "successful firms" will be able to form their own "dynamic collaborative" teams or small companies and compete with or supplement their former employers. This is a trend that happens now. It's common for employees to have better ways of running a company than the current management, and it's common for management to ignore these better ways. This negative management psychology leads to opportunities for current and former employees. Thus is born innovation, and all too often independence in a new company. Oh, and any company that remains "a firm" won't survive for long. All companies must adapt.
5. Continued consolidation of vendors in many segments.
Life is change. While vendors will consolidate, many new companies will come into being as “former employees” and entrepreneurs take their initiative, hopes, dreams and futures into their own hands.
6. Moore’s Law continues to hold true through this decade.
Yes, computers and communications will get much faster before leveling off. However, Moore's Law also brings with it a true challenge. What lags behind is not the hardware technology of communications, storage, or computation; what lags behind is software technology, which is still in the Stone Age. As hardware technologies further outpace the invention of innovative software systems, we are headed for real trouble. The key issue facing software people is the massive increase in complexity needed to build current systems. Larger human teams simply won't do, since software development teams do not scale very well. Look at the level of software complexity and errors in software from Microsoft, for example. Even NASA, which takes pride in its low level of bugs per million lines of code, has serious challenges creating and maintaining its software systems. More eyes won't do. We need to invent systems that assist us in programming and remove the "rote" programming from programming. We need systems that take our specifications and write our software, with our guidance, for us. Fortunately, such systems are under development.
7. Banks become primary provider of presence services by 2007.
Not everyone trusts banks. What more needs to be said on this one, except that trust is a key issue in our society and has been for many thousands of years? Trust, especially between parties in business transactions, forms the basis of modern human civilization. Digital trust issues will continue to be a significant area of development for the foreseeable future. Many communities will form "trust collectives," and these will grow into "trust networks." Some of these communities will be banks. The opportunity is wide open. Lawyers will also have a large role to play, since they currently fulfill many trust roles in society. The key will remain choice, not just for trust but also for the ultra-important issue of privacy.
8. Business activity monitoring is mainstream by 2007.
Any business that does not monitor its activity in real time is playing with dice and taking too many chances. Collaborative systems enable much of business activity to be "monitored." However, monitoring is not enough. Data is useless unless it's processed and analyzed somehow. Information is power, as the saying goes, and getting the right information to the right people as fast as possible, or at the right time, is critical.
The events leading up to the horror of last year clearly demonstrate how failing to get information to the right people at the right time gets in the way of responding to situations in the real world. However, even with the best "info flow" technology in place, politics in an organization will still present the most massive barrier to its effective responsiveness. The tragic story of John O'Neill (www.pbs.org/wgbh/pages/frontline/shows/knew/john/) will become a textbook case study in the failure of organizations to effectively route information to key individuals, and in how politics, internal and otherwise, is a truly massive barrier to humans working together to solve their common challenges in the world.
9. Business units, not IT, will make most application decisions.
Individuals and the teams that they collaborate within will be making the application decisions since they are the ones who’ll be using them. If the advanced collaborative “systems” that have been hinted at in this article come to exist then I’d have to agree with this prediction.
10. Pendulum swings back from centralized to decentralized.
Decentralized intra-business collaborative units will dominate the most enlightened and dynamically responsive companies, as long as the advanced collaborative software technologies exist to support them. However, human nature leads certain individuals to desire power and control over other humans, especially those working for them in corporate power pyramids. This tendency will exert enormous force over future business relationships. Those forming collective teams, business units, or small enterprises must be vigilant to ensure that they have rewarding business terms. An independent business unit or small company isn't really free or separate when it's locked in by a rapacious contract from a larger company or supplier. The real indicator of the growth of the economy will be the ability of individuals, and those they work with or for, to negotiate and execute mutually beneficial business deals.
Man, I was overjoyed when I got a 1.1 GHz Pentium III laptop last March…
Moore's Law? That's just fine.
But I was more interested (just like that person above, stahbird) in getting to know about chipsets or, even better, *new* computer *platforms*, not i386 with 8 CPUs at 40 GHz 🙂
Also, network bandwidth at a price affordable to the masses seems to be impossible. Where is the business model that will turn this into reality?
Today's high prices for ADSL and cable mean a small user base (today, in the USA, 90% of Internet users still use 56 kbps access, last time I checked; 10% use ADSL/cable).
ADSL uses plain phone lines, doesn't it?
So why is it so expensive?
Pure capitalism?
On this particular point (network bandwidth), I expected to hear about advances in Internet access over the FREE plain electrical copper wires that can be found in any home, at gigabit speeds, which are already "implemented" (unlike fiber optics).
Gartner analysts (?) are here only speculating about possible potential businesses in the near future, not about possible computer science breakthroughs.
But that’s their role anyway.
Every computer-literate household will have a www/email/im/presence/media/application server and a rock solid high speed Internet connection by 2010. Thin clients of all shapes/sizes/functions will exist throughout the household.
Where will they go? Farm workers went into factories; they were used to the labor-intensive work and the long hours, so it fit right, and there was a need at the time…
What industry is up and coming for IT/tech workers?
Tech is supposed to be the end-all, be-all for every other industry. If the need for tech workers goes down, I would also expect the need for most workers who are end users of that tech to go down.
Where the hell are we going to go? Are we going to have a second Great Depression due to over-efficiency? Will we then have to increase corporate taxes to pay for all the expanded welfare benefits and volume due to the 40% work outage? (Sure, they say 10%, but please, let's be realistic: 10% greater efficiency means 40% out of work in corporate terms.)
I am glad I will be teaching computer science rather than working in the tech industry.
chicobaud: ADSL uses plain phone lines, doesn't it?
So why is it so expensive?
Pure capitalism?
No, actually there are three types of ADSL: Type A, B, and C. Depending on the type, you must be within 3-5 miles of the exchange. The exchange itself is expensive, very expensive.
Besides, it is only the US and the EU that have expensive ADSL, and this can be attributed to a lack of pure capitalism, if anything. Take Japan, for example: the cheapest ADSL in the world, US$22 a month (cheaper than regular dial-up in the US).
Besides, there is a limit on the speed of copper wires; IIRC it is 1 Mbps or 100 Mbps…
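The hard limit comes from Shannon's capacity theorem, C = B log2(1 + SNR). A quick Python sketch with illustrative ADSL-ish numbers (the bandwidth and SNR values are assumptions, not measurements):

```python
import math

def shannon_capacity_mbps(bandwidth_hz: float, snr_db: float) -> float:
    """Channel capacity C = B * log2(1 + SNR), returned in Mbps."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

# Assume ADSL downstream gets roughly 1 MHz of spectrum on the copper pair.
for snr_db in (20, 40, 60):
    print(f"SNR {snr_db} dB -> ~{shannon_capacity_mbps(1e6, snr_db):.0f} Mbps")
# Longer lines mean worse SNR, which is why distance to the exchange matters.
```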
Jeremy, fewer jobs for techies means a smaller market of techies wanting to learn computers, doesn't it? That means your job is as threatened as those of people working in the field today.
Thank God I'm going into the arts…
Ha, eight CPUs in one box in every house? Wow, everyone must be rich in the future. That's the problem with these guys: they don't think about poor people and poor countries. Heck, everyone on these types of sites represents the small portion of the computer population with up-to-date PCs, e.g., over a 1 GHz processor. Bla, you get the point. And excuse anything I have spelled wrong at 3:13 a.m.
Take out your price list. Compare the prices of equivalents in 1996. Tell me the difference; I'd really like to know. Then subtract that difference from current prices. Tell me what you get.
I hate postulative articles like this. If we are to believe stuff like this, then we should be living in cities on the Moon and Mars by now and flying around in hydrogen-powered hover cars. Why not just report on what is *really* happening in the industry at present?
People write articles like this when there is really no news. It's like sci-fi during the Great Depression: just something to take everyone's mind off a continued downward spiral.
I say, great! Dan Farber writes science fiction. Now back to the real world.
> Where will they go?
You know, lawyers, doctors, politicians, psychics.
“Leave it to the [market research firm] Gartner to come up with a laundry list of vague and contradictory predictions for the years ahead.”
http://www.pcmag.com/print_article/0,3048,a=32288,00.asp
The best quote from the article (regarding layoffs):
“I don’t know if anyone has noticed, but technology companies can barely survive with some of the boneheads they have working, now.”