BofA Tech Guru preaches 6 cloud truths

Private cloud computing is big among security-obsessed financial services companies and Bank of America Merrill Lynch is no exception. In a note sent out by the company this week, Brad Spiers, the senior vice president who heads up Compute Innovation for the giant financial services firm, had some insightful points to make about BofA’s use of cloud technology and cloud computing in general.

Here are some takeaways:

  1. Private cloud mandate remains, but changes: For most financial services firms, secure private clouds (i.e., clouds they build and manage in-house) remain the way to go. But Spiers foresees possible change. Over the past six months, he’s seen evidence that vendors are starting to “get” data security requirements, and he is more confident that the bank will build a secure private cloud with vendor partners. Over time, that is.
  2. GPUs are big in banking: BofA uses GPU (graphics processing unit) computing techniques, often seen in online gaming and in high-performance scientific computing, to run simulations in its derivatives business.
  3. Applications have to catch up with cool technologies: Solid-state storage and in-memory databases are great if you have software that can use them. But … “I want software that I can buy that takes advantage of storage hierarchy changes. SSDs are a great first step in boosting performance, but applications are not yet written to take advantage of this technology,” Spiers wrote. Similarly, despite talk about “benefits of in-memory database as the network hierarchy expands with large, fast options, the software is not optimized at the drive and OS level,” he said.
  4. Other hot technologies worth watching: Non-volatile memory, software-defined networking and more parallel compute options have all caught his eye.
  5. What BofA runs now: As of the end of last year, BofA ran just south of 90,000 physical servers, 40 percent of which were running a hypervisor. Most of these multi-core, midrange machines have 16 to 96 GB of memory and run Linux or Windows. Many of the bank’s businesses were running in IT “silos.”
  6. What’s coming: A standards-based compute stack built on x86 servers; 10GbE networking; 16 Gb Fibre Channel and NAS storage; Windows and Linux OSes; and automation tools. The goal is to consolidate many of the workloads onto the same infrastructure, boost speed to market, allow elastic AWS-style scaling, and improve resilience and availability.
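The derivatives simulations Spiers mentions are typically Monte Carlo workloads: millions of independent random price paths, which is exactly the kind of embarrassingly parallel math that maps well onto GPU cores. As a rough illustration (not BofA’s actual code), here is a minimal CPU/NumPy sketch of pricing a European call option by simulation; the same loop-free, array-wide structure is what GPU libraries parallelize:

```python
import numpy as np

def mc_call_price(s0, k, r, sigma, t, n_paths, seed=0):
    """Monte Carlo price of a European call under Black-Scholes dynamics.

    s0: spot price, k: strike, r: risk-free rate,
    sigma: volatility, t: time to expiry in years.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # Terminal prices under geometric Brownian motion
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(st - k, 0.0)
    # Discount the average payoff back to today
    return np.exp(-r * t) * payoff.mean()

price = mc_call_price(s0=100, k=100, r=0.05, sigma=0.2, t=1.0,
                      n_paths=1_000_000)
# With these parameters the closed-form Black-Scholes value is about 10.45,
# so the simulation should land close to that.
```

Each path is independent, so throwing thousands of GPU threads at the problem scales almost linearly, which is why banks adopted the hardware from the gaming and HPC worlds.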

Photo courtesy of Flickr user Alex E. Proimos.




GigaOM