The cloud will cost you, but you’ll be happy to pay

What can coal teach us about the cloud?

Today, conventional wisdom suggests that cloud computing will bring increased efficiency to computing markets, which will then decrease costs. Cloud computing will allow organizations to cut IT spending and help relieve pressure on IT budgets. It’s a nice vision, but that’s not going to happen.

The good news is that you’re not going to mind that your cloud computing budget will be higher than what you’re paying now for IT, because you’ll be able to do more. And here’s why.

Coal and clouds, an unlikely pair.

The secret to this counterintuitive state of affairs was explained back in 1865 by a British economist named William Stanley Jevons, who was investigating the use of coal within the British Empire. Jevons wanted to understand the forces shaping the demand for coal and to estimate how long Britain's coal mines would last.


During his investigation, Jevons noticed something counterintuitive: increasing the efficiency of coal-burning machines did not slow the overall consumption of coal. On the contrary, the more efficient the machines got, the more coal was consumed, and at an ever-increasing rate. Efficiency, it seemed, made coal consumption increase, not decrease. This is now known as the Jevons Paradox.

The paradox is explained by noting that multiple forces are at work. While increasing efficiency reduces the fuel consumption of a single machine, it also reduces the cost to run that machine. The reduced cost, in turn, makes the machine economically viable in a number of new applications. So even though each machine is more efficient, overall demand for the machines themselves increases, driving up total fuel consumption (coal, in the case of Jevons' original investigation).
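
A toy model makes the mechanism concrete. Everything below is my own illustrative sketch, not Jevons' data: it simply assumes that demand for a machine's output is elastic enough (an elasticity above 1) that each efficiency gain lowers the cost per unit of output by more than it saves in fuel per unit.

    # Toy model of the Jevons Paradox. All parameters are illustrative
    # assumptions, not historical data.

    def total_fuel_use(efficiency, elasticity=1.5, base_output=100.0):
        """Economy-wide fuel consumption at a given machine efficiency.

        efficiency  -- useful output per unit of fuel (work per ton of coal)
        elasticity  -- price elasticity of demand for that output; values
                       above 1 mean demand grows faster than unit cost falls
        base_output -- output demanded when efficiency == 1
        """
        cost_per_unit = 1.0 / efficiency                   # output gets cheaper
        output_demanded = base_output * cost_per_unit ** -elasticity
        return output_demanded / efficiency                # total fuel burned

    for eff in (1.0, 2.0, 4.0):
        print(f"{eff:.0f}x efficiency -> fuel consumed: {total_fuel_use(eff):.1f}")
    # 1x efficiency -> fuel consumed: 100.0
    # 2x efficiency -> fuel consumed: 141.4
    # 4x efficiency -> fuel consumed: 200.0

Under that assumed elasticity of 1.5, doubling efficiency raises total fuel consumption by roughly 41 percent: each machine burns less, but far more machines get put to work.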

So what does that mean for cloud computing?

Jevons' Paradox has influenced IT for decades. For instance, in 1943, Thomas J. Watson, the Chairman and CEO of IBM, was rumored to have said, “I think there is a world market for maybe five computers.” While this sounds silly on its face to modern ears, the irony is that Watson would have been right, if not for Moore's Law and Jevons' Paradox: if the world had been static and computers were still as inefficient and costly (in constant dollars) as they were in 1943, we might still be looking at a world market of five units.

But over the decades, the invention of the transistor and other technologies helped fuel a relentless increase in computer efficiency. As computers have become more efficient and dropped in price, we have found new applications for them and ended up spending more and more on them, in accordance with Jevons’ Paradox.

So when it comes to the cloud, we will likely continue to spend more each year on IT. However, with spending shifting into cloud computing as opposed to traditional forms of IT, this rise will be accompanied by a correspondingly greater increase in economic benefit. This is an important point: Jevons’ Paradox has two forces at work. The increase in consumption results from increased application of the technology to new problems, and this only happens if those new applications are economically viable. And “economically viable” is code for “makes you money.”

Jevons’ Paradox and the cloud meet in the real world.

[Image: Intel's transistors at 32 nanometers. More transistors helped pave the way for cheaper computing.]

I was recently speaking with a large bank that told me cloud computing is allowing it to pursue projects that were previously unviable. For instance, if a new application could deliver $500,000 in revenue but had IT expenditures of $1 million, it simply couldn't be pursued. With cloud computing, IT costs have dropped, allowing the bank to pursue smaller projects that would not otherwise have been viable.

So the number of projects is increasing, which leads to increased aggregate spending, but with correspondingly greater revenue. Further, the savings on projects that would have been feasible even before costs dropped can be plowed back into application development rather than sunk into infrastructure.
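
Here's a sketch of that logic against a hypothetical project portfolio. The project names, figures, and the assumed 55 percent cloud discount are all invented for illustration; only the $500,000-revenue, $1 million-cost example comes from the bank.

    # Hypothetical portfolio illustrating the bank's viability logic.
    # Every number here is invented except the first project, which mirrors
    # the $500,000-revenue / $1M-cost example above.

    projects = [  # (name, expected_revenue, traditional_it_cost)
        ("fraud-alerts",   500_000, 1_000_000),
        ("mobile-deposit", 900_000,   700_000),
        ("branch-kiosk",   300_000,   450_000),
    ]

    def funded(cost_factor):
        """Projects whose revenue exceeds their (discounted) IT cost."""
        return [(name, rev, cost * cost_factor)
                for name, rev, cost in projects
                if rev > cost * cost_factor]

    for label, factor in (("traditional IT", 1.00),
                          ("cloud, assumed 55% cheaper", 0.45)):
        picks = funded(factor)
        spend = sum(cost for _, _, cost in picks)
        revenue = sum(rev for _, rev, _ in picks)
        print(f"{label}: {len(picks)} project(s), "
              f"IT spend ${spend:,.0f}, revenue ${revenue:,.0f}")
    # traditional IT: 1 project(s), IT spend $700,000, revenue $900,000
    # cloud, assumed 55% cheaper: 3 project(s), IT spend $967,500, revenue $1,700,000

Aggregate IT spend goes up, exactly as Jevons would predict, but so does the revenue it unlocks.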

It's important to note that any individual firm may or may not see the effects of Jevons' Paradox. In other words, the effects will be distributed across the economy in a lumpy fashion. Some enterprises are “IT-intensive,” using IT as a primary input into the business process (think banking and financial services). Those firms will see the greatest effect from Jevons' Paradox as cloud computing takes hold.

But other enterprises are IT-insensitive (think logging and mining, for instance). These firms need some information technology capabilities, but IT is not a primary driver of project-level business cases, so increasing IT efficiency is unlikely to let them pursue additional projects. Any IT cost savings will instead drop more directly to the bottom line, and IT spending in these firms will go down.

So, Jevons' Paradox predicts that with increased IT efficiency in the form of cloud computing, actual spending will continue to increase. But with that increased spending will come greater economic return for everybody, as new projects that were previously unviable can now be pursued. In the same way that Jevons saw coal use increasing and fueling the industrial revolution, cloud computing is poised to fuel a 21st-century computing revolution.

Dave Roberts is SVP of Strategy and Evangelism at ServiceMesh. He blogs here, and tweets as @sandhillstrat.
