Q: What do you get when you mix Facebook’s extensive memcached usage with its strategy of “cold storage” for infrequently accessed data?
A: McDipper, a Facebook-built implementation of the popular memcached key-value store designed to run on flash memory rather than pricier DRAM.
Memcached, for the unfamiliar, is an open-source key-value store that caches frequently accessed data in memory so applications can serve it faster than if it lived on hard disks. It’s a very popular component of many web application stacks, including at Facebook, where the company runs thousands of memcached servers to power its various applications.
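For the curious, here’s roughly what that looks like in practice: a minimal cache-aside sketch in Python, assuming a memcached server on the default local port and the open-source pymemcache client. The `load_user_from_db` function is a hypothetical stand-in for the slow disk-backed lookup that memcached is meant to short-circuit.

```python
import json
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))  # memcached's default port

def load_user_from_db(user_id):
    # Hypothetical stand-in for a slow database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id):
    key = f"user:{user_id}"
    cached = cache.get(key)            # fast path: served from RAM
    if cached is not None:
        return json.loads(cached)
    user = load_user_from_db(user_id)  # slow path: hit the database
    cache.set(key, json.dumps(user), expire=300)  # keep it hot for 5 minutes
    return user
```

Because memcached stores plain bytes, the sketch serializes values with JSON; the pattern is the same whether the backing store is DRAM or, as with McDipper, flash.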
But DRAM is expensive, especially when you get to Facebook’s scale, and not all applications deserve that kind of performance. So, according to a Facebook Engineering post on Tuesday, the company designed McDipper to handle “working sets that had very large footprints but moderate to low request rates. … Compared with memory, flash provides up to 20 times the capacity per server and still supports tens of thousands of operations per second.”
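To see why that tradeoff works for capacity-bound workloads, here’s a back-of-envelope sizing sketch. The 20x capacity ratio and tens-of-thousands-of-operations figure come from Facebook’s post; the DRAM server size, DRAM throughput, and the workload itself are illustrative assumptions, not Facebook’s numbers.

```python
# A cache pool needs enough servers to satisfy whichever constraint binds:
# total capacity (the working-set footprint) or total throughput (ops/sec).
import math

def servers_needed(footprint_gb, request_ops, gb_per_server, ops_per_server):
    by_capacity = math.ceil(footprint_gb / gb_per_server)
    by_throughput = math.ceil(request_ops / ops_per_server)
    return max(by_capacity, by_throughput)

# Hypothetical workload: 100 TB working set, 500,000 requests/second.
footprint, rate = 100_000, 500_000

# Assumed DRAM box: 144 GB, ~1M ops/sec.
dram = servers_needed(footprint, rate, gb_per_server=144, ops_per_server=1_000_000)
# Flash box per the post's ratios: ~20x the capacity, tens of thousands of ops/sec.
flash = servers_needed(footprint, rate, gb_per_server=144 * 20, ops_per_server=50_000)

print(dram, flash)  # 695 DRAM servers vs. 35 flash servers: ~95% fewer
```

Once the working set, rather than the request rate, is the binding constraint, the server count scales with capacity per box, which is exactly where flash’s 20x advantage pays off.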
Facebook has deployed McDipper for a handful of these workloads, the blog states, and has “reduced the total number of deployed servers in some pools by as much as 90% while still delivering more than 90% of get responses with sub-millisecond latencies.” It has been part of Facebook’s photo infrastructure for about a year and serves 150 gigabits of data per second, or “about one Library of Congress (10 TB) every 10 minutes,” over Facebook’s content-delivery network.
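That comparison survives a quick unit check, taking “one Library of Congress” as the 10 TB figure the post itself uses:

```python
# Sanity-check the quoted rate: 150 gigabits per second, over 10 minutes.
bits_per_second = 150e9
terabytes_per_10_min = bits_per_second / 8 * 600 / 1e12  # 8 bits/byte, 600 s
print(terabytes_per_10_min)  # ~11.25 TB, roughly one 10 TB "Library of Congress"
```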
This is the same logic that drove Facebook to undertake its cold storage engineering effort for even more infrequently accessed data, which aims to find a middle ground between the inefficiency and latency of hard disks and the high cost of flash storage. To meet that goal, the company is getting creative by considering everything from lower-performance flash to Blu-ray — pretty much anything but tape — VP of Engineering Jay Parikh told me in January.
Building a tool like McDipper is just the tip of the iceberg, though, when it comes to managing the cost and efficiency of infrastructure at large web companies such as Facebook. On Tuesday, eBay released its Digital Service Efficiency report, which lays out a methodology for assessing the effect that infrastructure (more than 52,000 servers in eBay’s case; Facebook has even more) has on larger corporate goals such as clean energy and the bottom line.
And later this month at our Structure: Data conference, data center executives from Facebook, Microsoft and Goldman Sachs will take the stage to discuss how smart analytics help them plan to meet capacity needs while keeping costs in check.
The feature image shows Facebook’s new all-flash Dragonstone server design.