Geek ambivalence about cloud computing is interesting, because it's not like it's a new phenomenon. In tech years, the idea has been around for ages. But part of the problem is that the actual definition of cloud computing isn't all that easy to pin down. And marketers have been fond of labeling things "cloud computing" when they're really only peripherally related, in a shameless ploy to capitalize on a hot trend. But in a nutshell, here's what I think is the essence of the cloud computing concept.
There are two technological innovations that, available together, make cloud computing possible: ubiquitous internet access and advanced virtualization technology. With virtualization, a "server" doesn't have to be a physical machine. In the olden days, if you wanted a server, you had to procure a physical machine, or access to one. If you thought your needs would scale up, you would get a more powerful machine than you currently needed, just in case. And once you came close to outgrowing that machine, you would need to either get a new machine and migrate your system over, or scale out to more machines, by spinning components, such as the database, off onto their own servers, or load balancing between two. System administrators were constantly trying to strike a balance between paying for capacity that would never be used and dealing with problems or outages caused by usage spikes when they didn't scale out quickly enough. And scaling out was sometimes very hard. Moving a running, mission-critical system from an old server to a new, faster one was no picnic.
Virtualization made it possible to decouple the "server" from the server hardware. So if you needed more capacity (processor cycles, memory, or storage) you could move your server to new hardware, even if that meant moving to a new data center in a different part of the world, without all the fuss. And the ubiquitous network made it easier for the people who used these services to access them even as IT managers started to aggregate them into centralized "cloud" data centers. So this meant that a small startup could order a server from Amazon and never have to worry that they hadn't ordered one powerful enough if they hit the big time. A researcher could build a system to crunch some numbers for three weeks, then just delete it when the calculation was done. And a large company's IT managers could start to decommission the various server boxes that were spread out in closets in offices around the country, and instead provision instances from their centrally-managed "private cloud".
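That elasticity is easy to sketch in code. The snippet below is a toy model, not any real provider's API, and the per-instance capacity figure is an assumption: the point is simply that a virtualized fleet can be sized to the load you actually have, hour by hour, instead of to the peak you once guessed at.

```python
# Illustrative sketch of elastic provisioning -- not a real cloud API.
# With physical hardware you buy for peak load up front; with
# virtualization you size the fleet to the load you actually have.

INSTANCE_CAPACITY = 100  # requests/sec one virtual server handles (assumed)

def instances_needed(load: int) -> int:
    """Return the number of instances to run for the current load."""
    # Ceiling division; always keep at least one instance running.
    return max(1, -(-load // INSTANCE_CAPACITY))

# Traffic over part of a day: quiet overnight, a big spike at noon.
hourly_load = [40, 30, 250, 900, 120]
fleet = [instances_needed(load) for load in hourly_load]
print(fleet)  # → [1, 1, 3, 9, 2]: scales out for the spike, back in after
```

The same calculation done with physical boxes would have forced you to own nine servers around the clock just to survive the noon spike.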
I think the reason that so many geeks don't really understand what the big deal is over cloud computing is that unless you're running a big data center, you're not really the one who's reaping the direct benefit of cloud computing. I blame the marketing, to some extent. We hear about various cool web services, like Evernote or Dropbox, or even "operating systems" that depend on "the cloud", such as ChromeOS or eyeOS. (By the way, I use Evernote and Dropbox.) But from the point of view of the end user, cloud computing is just a fancy term for web hosting. If I'm using Dropbox, I don't really care if the storage is on the cloud or on a big old-fashioned storage server somewhere. As long as the service is reliable, it doesn't matter to me. But it sure matters to the poor sap who has to maintain the Dropbox servers. Cloud computing makes it much easier for the company to grow as it needs to, even change hosting providers if necessary, without disrupting my service and without wasting money on unused capacity "just in case."
I guess the other big recipient of the value of cloud computing is the accountant (which would be another reason why the geeks wouldn't really get it, unless you're an accounting geek). Another buzzword that's commonly associated with cloud computing is "utility computing," which basically means that you pay for computing resources as a metered service, just like you would electricity. For the CFOs of the world, it means that you don't spend a bunch of money on hardware that you may or may not be extracting full value from. I think it's safe to say that most large companies only end up using a small percentage of the computing resources that they have sitting in the racks and on the desks in their buildings, so from an efficiency standpoint, it's better to pay for what you use, even if, in theory, you're paying a higher rate per unit of processing. The old way wastes time, money, and electricity.
So this is OSNews, and we primarily concern ourselves with operating systems here. Where do OSes fit into this new world? Well, virtual servers are still servers, and each and every one still needs an OS. What we've done is insert a "sub OS" called a (Type 1) hypervisor under the regular OS, and that hypervisor allows one or more servers to run one or more OSes or OS instances. You could have one OS instance spread across multiple physical machines, or hundreds of OS instances on one machine. A Type 2 hypervisor allows a guest OS to run inside a host OS, which is also useful, but is used for a very different purpose. Depending on the platform, a VM can be moved from one type of hypervisor to another, so you might configure a new server in a VM running as a guest on your laptop, then transfer it to run on a "bare metal" hypervisor in a cloud hosting environment when you launch to production.
One aspect of the OS world that's made more complicated by cloud computing is licensing, and Microsoft in particular is in a kind of difficult position. One of the advantages of cloud computing is that you can turn server instances on and off willy nilly, as you need them. You can copy an entire instance from one datacenter to another, or clone one and make a derivative. But when you have to deal with whether the OS and software on that VM you're moving around is properly licensed, it adds a whole layer of unwelcome complexity to the mix. That's one reason why Linux, MySQL, and other open source software have been very popular in cloud environments.
If you buy cloud hosting from Amazon, they just build the Windows license into the fee if you order a Windows instance. But if you're using a lot of capacity at Amazon, you end up getting kind of a bad deal, and you'd be better off buying your Windows server licenses at retail and installing them yourself on Amazon's VMs.
And virtualization technology is getting bundled with operating systems more and more. Microsoft has its own hypervisor, which is included with Windows Server 2008. It's just one of the commercial virtualization platforms that are available today.
Another reason why cloud computing is an invisible revolution is that a lot of what's happening lately is in the arena of the "private cloud". OSNews' trip to VMworld was sponsored by HP, which is putting a huge amount of effort into helping its enterprise customers replace their current infrastructure, which could easily be described as "servers here, servers there, servers we don't know about, we can't keep track of them all, and we certainly can't properly maintain them all". And one of the reasons why the server situation is so chaotic at big companies is that when someone needs a new server for something and contacts IT, they get the runaround, or they're told they'll have to wait six months. So a lot of the innovation recently is around helping big companies set up a centralized data center where the whole thing is a private cloud, and when someone in some branch office needs a new server, one can be provisioned with a few keystrokes.
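The consolidation half of that story is just a packing problem. Here's a deliberately simplified first-fit sketch (the host capacity and the request sizes are invented for illustration) showing how a pile of branch-office server requests, each of which once meant its own box in a closet, collapses onto a couple of well-utilized hosts in a central data center.

```python
# Illustrative first-fit packing -- a toy model of why a private cloud
# consolidates scattered closet servers onto a few big hosts.

HOST_CAPACITY = 64  # GB of RAM per data-center host (assumed)

def place_vms(requests: list[int]) -> list[list[int]]:
    """Place each VM request (GB of RAM) on the first host with room."""
    hosts: list[list[int]] = []
    for gb in requests:
        for host in hosts:
            if sum(host) + gb <= HOST_CAPACITY:
                host.append(gb)
                break
        else:
            hosts.append([gb])  # provision a new host only when needed
    return hosts

# Twelve branch-office requests that once meant twelve boxes in closets:
requests = [8, 4, 16, 8, 2, 32, 8, 4, 16, 8, 2, 4]
hosts = place_vms(requests)
print(len(hosts))  # → 2: two well-utilized hosts instead of twelve boxes
```

Real placement engines weigh CPU, storage, and failure domains too, but the payoff is the same: fewer machines, centrally maintained, provisioned on demand.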
The people ordering the new servers don't even need to know it's a cloud. They don't care. All they know is that suddenly their IT people are getting things done a lot quicker. So again, to the outsider, it just looks like regular old hosting, or regular old IT provisioning.
So what about the so-called cloud OS? Where does that fit in? I'm afraid a lot of that is marketing hype, because for the user of a cloud OS, it doesn't really matter whether the apps and storage they're accessing over the network are stored in a cloud or on a regular old server. But the reason that it's meaningful is that it would be impractical for any company to offer a server-based desktop user experience without having cloud computing backing them up on the server side. It would just be too difficult to deal with the elasticity of demand from the users without the flexibility that comes from virtualization.
I think the reason for the marketing hype is that people are inherently wary about their "computer" not really existing inside the actual case that's on their lap or under their desk. Both novice and advanced computer users are nervous, though for different reasons. For some reason, the idea that their computer exists "in the cloud" is just inherently less scary than "it's on our server, in a rack, in our datacenter, in California". Though in reality there's barely any distinction. And until "the cloud" becomes the only way anyone hosts anything, the way movie studios eventually stopped advertising that their films were "in color!", I think marketers will keep making the distinction.
But don't let the hype and misdirection distract you from the real issue. We're in the midst of a huge revolution. And one of the reasons that a lot of people fail to appreciate the big deal is precisely why it's a big deal: for the end user, the move to cloud computing is supposed to be transparent and painless. Even the programmers and power users and other geeks using these systems are just working the way they always have, and that's the whole point.