Cloud Computing and the 10X Effect

In the IT industry, technology and its usage evolve faster than in perhaps any other industry. As a rule of thumb, a system can grow about 10 times under its current architecture or paradigm; then it must be re-architected. This 10X effect causes old technologies to become obsolete and new ones to emerge. It also underlies the massive shift to cloud computing.

The last major computing infrastructure paradigm shift happened in the ‘80s when “client/server” was introduced as the new way to design business applications. Those applications typically ran on x86 computers – aka PCs.

Then, in the ‘90s, the “client” part of this design was disrupted by the advent of the Internet. Instead of running applications on a desktop PC that accessed application servers on other PCs, we began accessing applications through web browsers that did little more than render pages on the desktop or laptop in front of us. Now, 10-15 years later, we’re seeing the “server” side of client/server disrupted and replaced by cloud computing.

A New Architecture for More Devices

The underlying driver of these changes is the 10X effect, writ large. The early Internet had around 10 million users. Today we have on the order of one billion users on the Internet (100X), and up to three billion if you count Internet-enabled mobile phones. Whereas the early Internet had perhaps one million websites, today we have about 100 million active websites (100X). The total number of Internet-connected devices today is around five billion (100X growth from about a decade ago), and the boldest predictions say that in the next few years this number will grow to a trillion! That’s 200X the current number.

These massive growth numbers, stepping up two or three orders of magnitude in about a decade, are forcing us to look for a new way to design our IT systems. The old architecture is completely unable to handle the new compute load, so we must re-architect the systems on all levels. Cloud computing is the new architecture.

Cloud computing is a term that reaches from the lowest piece of computer hardware all the way up to the highest level of web and mobile services. Google’s search is a cloud service. Salesforce.com is a cloud company. Apple’s iTunes is music and entertainment in the cloud. Amazon Web Services is cloud computing, as is Microsoft Azure. Cloudera, RightScale and Eucalyptus are innovators of infrastructure software for the cloud. The modern servers produced by Dell and HP are made for cloud computing, as are the new storage solutions from giants like EMC and NetApp and from newer players such as Fusion-io.

Computers and the Three Musketeers

Whereas in current IT systems computers and other resources are hardwired to serve one specific set of users or applications, cloud computing lets any computer serve any need of any user. Computers are finally learning from the Three Musketeers: one for all and all for one.

Interestingly, the converse can be true in the cloud as well. When usage shoots through the roof (think of a mobile game that suddenly becomes popular), all of the cloud’s computers can be put to the use of one single application. Compute resources such as servers, storage devices and network equipment can be called to duty and sent back on leave in an instant. This is called “elasticity.”
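
To make elasticity concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the Pool class, the per-server capacity and the demand figures are invented for illustration), but it captures the core move: fungible servers are added when load rises and released when it falls.

    # Purely illustrative: a pool of fungible servers that grows and
    # shrinks with demand. Capacity and demand figures are made up.

    class Pool:
        def __init__(self, capacity_per_server=100):
            self.capacity_per_server = capacity_per_server
            self.servers = 0

        def scale_to(self, demand):
            # Servers are "called to duty" or "sent back on leave" so
            # that capacity just covers the current demand.
            self.servers = -(-demand // self.capacity_per_server)  # ceiling division
            print(f"demand={demand:>6}  servers={self.servers}")

    pool = Pool()
    for demand in [200, 5_000, 90_000, 1_200, 300]:  # a game going viral, then cooling off
        pool.scale_to(demand)

In a real cloud, the scale_to step would be a call to a provisioning API rather than a print statement, but the elasticity logic is the same.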

The “one for all and all for one” principle is possible because computer and software engineers have made compute resources fungible, i.e., mutually replaceable. You’d think engineers would have managed this decades ago, but it has been a very hard nut to crack. It requires new thinking, so that new products (both software and hardware) are designed from the start to operate across multiple computers. This may sound natural, but for most of the history of computing, software and hardware products were designed to function on their own, with little interaction with other similar products.

Fungible compute resources that produce enormous elasticity are the only way to serve the growing Internet. When we reach one trillion connected devices, they will all need service from the underlying computer network, but at unpredictable times and with unpredictable workloads. It’s impossible to have servers devoted to specific uses, sitting idle while they wait for a user or application to need them; such a model would require close to a trillion servers. But with cloud computing, resources can be shared across the cloud, and a much smaller number of servers can serve the fluctuating needs of users and connected devices. With fewer servers, we save time, money and energy.
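
A rough back-of-the-envelope calculation shows the size of the win. Every figure below is an assumption invented for illustration (how often a device is active, how many active devices one server can handle, how much headroom to keep), but the conclusion holds across a wide range of values: the shared pool is orders of magnitude smaller than dedicated capacity.

    # Back-of-the-envelope: dedicated capacity vs. a shared elastic pool.
    # Every figure here is an assumption for illustration only.

    devices = 1_000_000_000_000   # the trillion-device prediction
    active_fraction = 0.0001      # assume a device needs compute 0.01% of the time
    devices_per_server = 1_000    # assume one server handles 1,000 active devices
    headroom = 2.0                # assume 2x headroom for demand spikes

    # Dedicated model: capacity reserved for every device, active or not.
    dedicated = devices // devices_per_server

    # Shared model: size the pool for expected concurrent load plus headroom.
    shared = round(devices * active_fraction * headroom / devices_per_server)

    print(f"dedicated provisioning: {dedicated:,} servers")  # 1,000,000,000
    print(f"shared elastic pool:    {shared:,} servers")     # 200,000 (5,000x fewer)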

As compute loads grow 10 times, 100 times and even 1,000 times, we need new architectures for our IT systems. In cloud computing, it is all about creating elasticity by making the compute resources fungible. When any computer can step in at any time to do any computation needed, scaling will no longer be an issue.

Marten will share his thoughts on “The Future of the Cloud” at our Structure 2011 event in June.

Marten Mickos is CEO of Eucalyptus Systems.
