Making the web more efficient, a thousand servers at a time

Peak efficiency at a webscale data center feels like a blast furnace. I experienced it firsthand on the rooftop of eBay’s new Project Mercury data center in downtown Phoenix. It was hot enough standing on a grated-steel roof with the sun beating down on an 86-degree day. Then I stepped into the hot aisle of a Dell Modular Data Center, and 1,920 servers blasted 115-degree air right in my face.

If eBay’s Dean Nelson has his way, that was just the beginning. His future is one of ever-greater density in data centers driving ever-greater efficiency, and he’s relying on modular data centers like the ones Dell has provided to get him there.

Sometimes the modular data centers are standard 8-foot by 15-foot shipping containers, and sometimes they’re custom, one-off designs. But they’re always packed to the gills with gear. A single unit can weigh 100,000 pounds, and if you drop one in, you instantly have a whole lotta computing power in a relatively small, wholly weather-proof package.

Add a thousand servers, lower the PUE

Nelson, who serves as eBay’s senior director of global foundation services (read: he runs its data center operations), is so excited about what his team has accomplished with the new data center that he couldn’t help but take to the whiteboard in the conference room and break into a mini-lecture on how the project came to be. Many of the nitty-gritty details on how Project Mercury shaped up, including specifications on how it’s cooled and powered, are available in a whitepaper from the Green Grid organization. But here’s the takeaway: total cost of ownership (TCO) drives everything.

If Nelson is going to buy it, it’s going to be flexible enough to change with future generations of gear and it’s going to hit the perfect blend of density, efficiency and performance. Done right, containers and modular data centers fit the bill perfectly, and Nelson will buy them from whichever vendor is best able to meet his requirements when it’s time to load up on more capacity (and when he needs more than 1,000 servers at a time). Already, he told me, “We’ve doubled our capacity and my [operational] budget has stayed flat.”

[Photo: Two HP containers, and the two Dell units in one enclosure.]

Among that added capacity was 50 petabytes’ worth of Hadoop storage split between two 1,008-node clusters. One cluster resides at the SuperNAP data center in Las Vegas; the other sits in an HP container atop eBay’s new data center.

Nelson has been able to hold his operational budget steady because it’s difficult to beat what his containers are doing in terms of efficiency. And you can’t talk about efficiency without talking about power usage effectiveness, or PUE, an industry-standard measure of how much energy a data center uses on tasks other than computing (cooling and lighting, for example). The ratio is simple enough: divide the facility’s total power draw by the power used for computing, with 1.0 being perfect. The world’s most efficient data centers, from Google, Facebook and Yahoo, come in at about 1.08.
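
As a rough illustration of that math (a minimal sketch; the kilowatt figures below are hypothetical round numbers, not eBay’s actual meter readings), the whole calculation fits in a few lines of Python:

```python
# Power usage effectiveness: total facility power divided by IT power.
# The 1,080/1,000 kW figures are made up purely for illustration.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

# A facility burning 1,080 kW overall to feed a 1,000 kW IT load:
print(pue(1080, 1000))  # 1.08 -- roughly the best-in-class figure cited above
```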

Project Mercury gets free cooling year round, even in the heat of summer. On Aug. 23, 2011, a 119-degree day, one of eBay’s Dell units posted a partial PUE of 1.044 while drawing 520 kilowatts of power. On Jan. 17, 2012, while drawing 1 megawatt, the same unit held a consistent partial PUE of 1.018 while the rest of the data center ranged between 1.26 and 1.35. Project Mercury has room for 11 modular data center units on its roof, and every one drives down the PUE of the entire facility. Nelson realistically expects a facility-wide PUE of less than 1.2.
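
To see why each efficient rooftop unit pulls the building-wide number down, note that the blended PUE is roughly a power-weighted average of its parts. Here’s a simplified sketch: the load split is assumed for illustration, only the PUE values echo the article, and real facility-level accounting includes shared overhead this ignores.

```python
# Blended PUE as a power-weighted average of per-section figures.
# IT loads (kW) are hypothetical; the PUE values come from the article.
sections = [
    (1000, 1.018),  # one rooftop Dell unit at its January partial PUE
    (2000, 1.30),   # rest of the facility, mid-range of the 1.26-1.35 span
]

total_power = sum(it_kw * pue for it_kw, pue in sections)
total_it = sum(it_kw for it_kw, _ in sections)
print(round(total_power / total_it, 3))  # ~1.206 -- more rooftop units push it lower
```

Shift more of the load into sub-1.05 containers and the blended figure sinks toward Nelson’s sub-1.2 target.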

Big thing, small package

Given all the power it’s capable of consuming, though, one of the most striking things about eBay’s Project Mercury data center is its size. eBay has an existing data center next door that’s 43,000 square feet and can handle 6 megawatts of power. Project Mercury has about 14,000 square feet of computing space (including the roof), but is designed to handle 12 megawatts of power (it’s currently drawing 4 megawatts). It also costs half as much as the larger space to fill with gear and to operate.
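
Run the back-of-the-envelope numbers on design capacity and the density gap is stark (an illustrative calculation from the figures above, nothing more):

```python
# Watts of design capacity per square foot of computing space.
legacy = 6_000_000 / 43_000     # ~140 W/sq ft for the older facility
mercury = 12_000_000 / 14_000   # ~857 W/sq ft for Project Mercury at full build-out
print(round(mercury / legacy, 1))  # ~6.1x the power density
```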

Project Mercury has a maximum power capacity of three times its current draw for two major reasons: (1) it’s not yet full (there’s still space for seven more units on the roof alone), and (2) Nelson demands higher performance with each new generation of gear he buys. He’s even pushing modularity on the first floor, which looks more like a traditional data center; in practice, that means standard-size racks with ever more power crammed into the exact same amount of space.

It’s a strategy Nelson calls “rack and roll.” Vendors with the winning bids deliver 48U racks chock-full of gear that eBay only has to plug in. When it’s time to replace racks or add new ones, his team just rolls them in and out as needed.

Right now, the HPC racks that power eBay’s search engine contain 96 servers, pull 28.8 kilowatts of power and weigh in at 3,500 pounds apiece. eBay’s Hadoop racks contain 48 servers, each stuffed with a dozen 2TB drives. The first-floor space can fit 220 of these power-packed racks, while each rooftop container can fit 20.
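
Those figures hang together arithmetically, too. Here’s a quick sanity check of the per-server power and the raw Hadoop capacity implied above (raw drive capacity only, ignoring replication and formatting overhead):

```python
# Per-server power in the search racks:
print(28.8 * 1000 / 96)  # 300 W per server

# Raw capacity of one 1,008-node Hadoop cluster (12 x 2TB drives per node):
cluster_tb = 1008 * 12 * 2
print(cluster_tb)             # 24,192 TB, roughly 24 PB per cluster
print(2 * cluster_tb / 1000)  # ~48 PB across both clusters -- the "50 PB" cited earlier
```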

But they keep getting more powerful. The 28.8-kilowatt racks seem downright Herculean next to their legacy 8-kilowatt neighbors on the raised floor. For when future generations surpass 40 kilowatts per rack, Nelson has equipped the facility for liquid cooling delivered directly to the chips. On the roof, Dell boosted maximum capacity to 800 kilowatts on its second-generation modular data center, up from 550 kilowatts on the first-generation model.

At webscale, (almost) everyone’s doing it

To hear Dell’s Data Center Solutions division tell it, modular data centers aren’t exclusive to eBay. Steve Cumings, the division’s executive director, told me Dell has a good business selling modular data centers to webscale (and other hyperscale) customers, and they like them for the same reasons eBay does: they’re super-efficient and super-dense. Given the right cooling setup, the units can sit wherever they have access to power and act like their own little data centers.

Although Cumings isn’t allowed to give the names of most DCS customers, or even the size of their deployments, he did note that Microsoft uses Dell Modular Data Centers to power Bing Maps. He also said deployments range from eBay’s scale to “many, many, many modules.” Elsewhere, everyone from the U.S. Army to Amazon is going modular.

Better, perhaps, but will it be Tron-themed?

But eBay’s Nelson isn’t about to let anyone steal his thunder, at least when it comes to getting the most bang for his buck out of modular data centers. eBay is about to break ground on a new facility called “Project Quicksilver” in Salt Lake City, Utah, that he says is even bigger and better than Project Mercury. eBay hasn’t yet released details on what will go inside, but the RFP calls for per-rack performance of up to 40 kilowatts and overall facility scalability from 4 megawatts to 30 megawatts.

And Nelson, always hungry for more performance per watt, has a two-word message for server vendors that want to win eBay’s business on future buildouts: liquid cooling. For servers, it’s like the difference between fanning yourself and jumping in a pool, and he says it will be necessary as his racks grow ever denser.

“That will be the competitive advantage for a vendor,” he said, “because we will buy it.”

All images courtesy of eBay.
