With data centers, web giants have great eco-responsibility

It appears that not everyone whose job involves running a data center is ready to join the angry mob trying to discredit New York Times reporter James Glanz. Dean Nelson, vice president of global foundation services at eBay, told me during a recent phone interview that although Glanz’s exposé on data center energy waste certainly didn’t present a complete picture about data center efficiency, it did spark a very necessary debate about how to responsibly scale in an age of always-increasing demand for computing resources.

When he read the first of Glanz’s two articles, Nelson said, “Literally, a movie quote came to mind … ‘With great power comes great responsibility.’” Internet companies need to use their immense technological prowess and bargaining power to make their chosen — and very lucrative — business as environmentally friendly as possible. Companies spend tens of millions of dollars, if not billions, trying to achieve maximum efficiency, he said, and “we do have a lot of leverage, we can influence a lot of things.”

Efficiency is inherently limited …

Dean Nelson (left) with Utah legislator Mark Madsen

Nelson is by no means a data center apologist, though, and he’s fine with the fact that some things will never change. For example, as Glanz correctly points out in his article from Sunday, it’s demand (from either consumers or CEOs) for always-on applications and boundless, immortal data storage that drives much of the wasted energy within many data centers. Fearful of dropping offline and disappointing users or, perhaps worse, losing valuable transactions, companies provision far more servers than they actually need just to ensure their applications keep running if something fails.

That means at the very best, a company can run its data center at 50 percent capacity — well above the 6 percent to 12 percent Glanz cites as average — and still have enough resources left over to ensure a site or application keeps running should the primary infrastructure fail. eBay pays close attention to its utilization rates, Nelson said, but it will never be anywhere near 80 or 90 percent. Unless companies and their users suddenly become willing to accept less-reliable applications, running that efficiently would actually be irresponsible.

“There’s absolute reality in the need to have disaster recovery,” Nelson said. “We are no exception to that rule. We build our infrastructure so our 115 million active users are able to search, find and buy what they want.”

… but innovation mitigates the waste

In that sense, Glanz has done something of a public service in highlighting the potential for wasteful practices when energy efficiency butts up against concerns such as profit margins and availability. In fact, one data center executive, speaking on the condition of anonymity, told me he thinks part of the reason for the outcry against Glanz’s articles is that they’ve shed light on some dirty secrets that could lead to increased scrutiny from lawmakers. Anyone who hasn’t been building data centers with a focus on efficiency could end up on the hook for expensive retrofits, and that’s not a tempting proposition.

But it’s a big mistake to conflate the practices (and aged ones, at that) in corporate data centers with those of the companies powering the internet revolution. That’s like someone watching me run the 100-meter dash on a public street while Usain Bolt runs on a nearby track, and concluding that all human beings are slow.

Or, as Nelson put it, “There’s leading edge, and that’s really the web players … and then of course you have the lagging [edge], which is all the corporate infrastructure across the globe.”

These eBay containers are air-cooled in downtown Phoenix.

The difference between the two types of data center operators isn’t necessarily their requirements, but how they choose to address them. For years, we’ve covered the extreme measures web companies employ to make their data centers as efficient as possible, and GigaOM’s Katie Fehrenbacher highlighted a number of them in a Monday response to the first piece in Glanz’s series. In April, I detailed the methods eBay used to achieve super efficiency at its Project Mercury data center in Phoenix.

Since then, Nelson said, eBay has gone even further. In order to ensure adequate availability while running as few servers as possible at any given location, the company distributes failover capacity across multiple geographic locations. For its Project Quicksilver data center outside Salt Lake City, Utah, eBay is running entirely on renewable energy thanks to a 6-megawatt deployment of Bloom fuel cells, and Nelson said eBay had to lobby for legislative changes in order to make that happen.

“We’re putting our money where our mouth is,” he said. “We’re going to make sure our engine runs as clean and efficient as possible.”

Pointing the ship in the right direction

When it comes to wasted energy, many of the most egregious practices come courtesy of more-mainstream companies that don’t operate data centers anywhere near the scale of eBay, Facebook or Google. They’re just as risk-averse as the web giants, if not more so, but smaller scale often means it’s more palatable to eat higher energy bills than to risk an application going down because there isn’t enough capacity or because a server overheats.

Many corporations’ data centers are really just a few racks of gear within commercially available data center facilities, or a converted closet somewhere in an office. As the aforementioned executive told me, his company builds its data center facilities as efficiently as it can, but it can’t dictate what gear customers put inside or how they choose to architect for high availability. If its customers want to install power-hungry servers running at 8 percent capacity, that’s their prerogative.

But the reality is that more and more of those customers are actually embracing practices that save energy while also assuring acceptable levels of availability. Server virtualization is beyond common at this point, with many large companies slashing their server footprints as they consolidate multiple applications on a single server. Even software for building private computing clouds has intelligent resource management at its core.

Efficiency is even better for applications running on public clouds such as those provided by Amazon Web Services. Many large cloud providers (including Amazon and Google) take efficient data center design very seriously, and cloud services such as virtual meetings, or even web search, can actually reduce emissions by eliminating physical travel.

Inside an IO.Anywhere module

In their own way, mainstream enterprises are catching on to new methods of data center construction, too. Goldman Sachs, for example, isn’t about to outsource its data center operations to Amazon, but it has committed to modular gear from a company called IO Data Centers for future expansions of its computing capacity. IO sells 500-square-foot units that are essentially fully contained and highly efficient data centers (they even include their own cooling systems), which means customers such as Goldman can buy capacity in smaller volumes and don’t have to expand their facilities or upsize their air conditioners.

If significant and industry-wide change is going to come, though, eBay’s Nelson thinks the web companies that really do put a high priority on energy efficiency will have a big role to play. They’ll help inspire new techniques and perhaps, like eBay did in Utah, effect legal changes that would make it easier for all data centers to implement energy-efficient designs. “We have a lot of influence,” he said, “that we should be able to utilize to go back and point the ships in the right direction.”

Feature image is of Apple’s solar farm in Maiden, N.C.

