
The Cloud, Like Fog, Encircles Me?

If you strip away the market hype around cloud computing (difficult, to be sure), you're left with the basic presumption that a single distributed computing repository hosted online will replace traditional computing. Does that mean that every single computer ends up "in the cloud"? How far would cloud computing have to penetrate to validate that presumption? Perhaps most significantly, what are the techno-economic factors that will drive, and eventually limit, the growth of the cloud? Those are questions enterprises need answered, and the answers are few and far between.

It's best to start with the basic value proposition. Most people realize that cloud computing rests on the presumption of a significant economy of scale in computing supply. The theory is that there's a lot of wasted capacity on the servers scattered around data centers, department computing rooms, and even desktops. A single virtual server resource in the cloud could, in theory, run all those applications at a radical increase in efficiency. If true, that would justify moving every application to the cloud. In some cases, though, it's not likely to be true.

First, and most obviously, humans don't come equipped with Internet interfaces in their belly buttons, so some kind of appliance will have to be provided for access. That appliance can be cheaper than a real PC, but it's not clear just how much cheaper. An iPad, for example, makes a nice thin client, but it costs more than many laptops. That limits how much of the desktop and laptop population the cloud can really displace.

Second, there's the question of just what "costs" are being economized by the scale of the cloud. If the only cost is the server hardware, the technology improvements we see yearly could erode the benefit case considerably. If we assume that cloud computing doubles the resource efficiency of servers, the savings will depend on how much servers cost. Where they're tens of thousands of bucks, the savings are significant; where they're hundreds of dollars, the savings for a given company might not offset cloud costs and risk.
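To make that concrete, here's a rough back-of-the-envelope sketch of the consolidation math. The function name, the doubled-efficiency figure, and every dollar amount are illustrative assumptions, not sourced numbers:

```python
# Back-of-the-envelope sketch of the server-consolidation argument.
# All figures below are illustrative assumptions, not industry data.

def annual_hardware_savings(server_cost, servers_replaced,
                            efficiency_gain=2.0, amortization_years=3):
    """Yearly hardware savings if the cloud consolidates workloads.

    efficiency_gain=2.0 encodes the assumption above: cloud hosting
    doubles server utilization, so half the hardware is needed.
    """
    servers_needed = servers_replaced / efficiency_gain
    capex_saved = (servers_replaced - servers_needed) * server_cost
    return capex_saved / amortization_years

# Big iron: consolidating ten $40,000 servers saves real money each year.
print(annual_hardware_savings(server_cost=40_000, servers_replaced=10))
# approx. 66,667 per year

# Commodity boxes: the same consolidation of $800 servers saves so little
# it may not cover migration cost and risk.
print(annual_hardware_savings(server_cost=800, servers_replaced=10))
# approx. 1,333 per year
```

Same efficiency gain, two orders of magnitude difference in payoff; the benefit case hinges entirely on what the displaced hardware actually costs.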

As it happens, most of the savings being projected for the cloud don't come from servers or other hardware anyway; they come from support. People will tell you that half of the TCO for IT is the support of applications and users, and it stands to reason that support of cloud-hosted resources could be cheaper. Scattering support resources around today's distributed population of servers has to be much more costly than supporting a few enormous server farms in the cloud. But is it as reasonable as it sounds? Maybe not.

The majority of support costs are incurred supporting workers on applications and supporting their specific appliances (laptops and desktops today). Is it cheaper to support a thin client? Truth be told, the application support costs probably don't change, because the worker is running the same application either way. I just watched two IT guys trying to figure out how to get an iPad onto a secure WiFi network, and that wasn't pretty, cheap, or ultimately successful, so we can't presume that supporting thin clients as workers' appliances would save a bundle either. And the cloud can't absorb the application side of support at all: a provider's central resources can't support your own company's use of ERP without creating an enormous security risk.
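The arithmetic below is a toy model of that split. The 60/40 division of per-user support cost and the 30% thin-client trim are assumptions chosen purely to illustrate the shape of the argument:

```python
# Toy decomposition of per-user support cost, showing why cloud hosting
# may not move the needle. The split percentages are assumptions.

def support_savings(total_per_user, app_share=0.6, device_share=0.4,
                    device_reduction=0.3):
    """Cloud is assumed to leave application support untouched and to
    trim only device (thin-client) support by `device_reduction`."""
    app_cost = total_per_user * app_share            # same apps, same cost
    device_cost = total_per_user * device_share
    new_device_cost = device_cost * (1 - device_reduction)
    return total_per_user - (app_cost + new_device_cost)

# If support runs $1,000 per user per year, the modeled saving is $120.
print(support_savings(1_000))   # -> 120.0
```

Under those assumptions, even a sizable cut in device support barely dents total support cost, because the application side dominates and doesn't move.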

Then there's the network. Our infinite pool of computing resources in the cloud needs to be connected to users everywhere with very low latency and loss, or the resource pool won't be universally available and thus won't be optimally efficient. If network delay or loss mounts with distance, only nearby resources can be allocated, and the cloud essentially breaks up into little enclaves that start looking a lot like private data centers. Bye-bye benefits.
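Here's a minimal sketch of that distance effect, assuming a 20 ms latency budget and the rough 5 microseconds/km figure for light in fiber; the data-center list is hypothetical, and real round trips add routing, queuing, and serialization delay on top:

```python
# How a latency budget shrinks the usable resource pool (illustrative).

LATENCY_BUDGET_MS = 20  # assumed ceiling for an interactive application

# Hypothetical data centers: (name, one-way distance from the user in km)
data_centers = [("local", 50), ("regional", 500),
                ("national", 2500), ("overseas", 9000)]

def round_trip_ms(distance_km, us_per_km=5.0):
    """Propagation-only round trip, at ~5 microseconds/km in fiber."""
    return 2 * distance_km * us_per_km / 1000.0

usable = [name for name, km in data_centers
          if round_trip_ms(km) <= LATENCY_BUDGET_MS]
print(usable)  # -> ['local', 'regional']
```

Even on propagation delay alone, the "global" pool collapses to the nearby sites, which is exactly the enclave effect described above.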

The fundamental cloud value proposition is totally hostage to network performance. We wouldn't be talking about cloud computing at all today if broadband access cost what it did in the 1980s. The problem is that if you wanted truly fast, high-QoS bandwidth today, you wouldn't pay much less than you did then, because you'd still be buying a dedicated private service. Public-Internet bandwidth is cheaper, but can it provide the QoS enterprises need for key applications?

The point here is that we're getting aboard the cloud hype express way too fast. Enterprises are already suspicious of the vision of the cloud eating all of private IT, and an objective analysis of cost versus benefit suggests they're right. But we don't know yet whether some or all of the limits on cloud benefits can be addressed by changes in software design, network economics, or support procedures. The trick will be getting good data on how cloud benefits develop in a market that thinks the cloud can do no wrong and thus doesn't need to be made right.

The cloud fog may still have time to dissipate and leave some massive changes behind. We'll have to watch each critical area to see how it develops.