Holy cow, the Formula 1 races have a ton of tech inside

I am not a sports enthusiast. I am not a car racing enthusiast. Until a few days ago, I had no more than a vague recognition of Formula 1 racing. But the F1 race that happened in Austin over the weekend at the Circuit of the Americas track gave me ample opportunity to learn about Formula 1 and the crazy tech that makes it all possible, from supercomputers to real-time data analysis.

There is a ton of computing involved in this sport, and it serves different purposes. On the one hand, there's the raw horsepower required to model airflow over the car's form; on the other, there's the massively parallel computing required to analyze the streams of data thrown off by the cars in real time. And there's a lot of gear that gets packed up and travels from race site to race site.

The great F1 traveling IT show

Ahead of the race I talked to Mehul Kapadia, VP of strategic alliances and sponsorships at Tata Communications, about how some of the Formula 1 computing happens in the field. And one of the coolest things is just how much happens in the field: from broadcasting the race to TV and apps to creating the DVD highlights video that covers the whole season, most of F1's video and web output is produced on-site at the race.

The place it occurs is a broadcast center that contains 160 tons of gear, from servers to F1-owned video-editing modules. That gear is flown to each race on a jumbo jet and is designed to be set up in 24 hours and dismantled in 12. The video feeds from the track to the rest of the viewing world are handled by satellite, but Kapadia hopes that will soon change: Tata runs one of the largest IP backbones in the world, and after this year it wants to help the F1 world transition from satellite to fiber.

The trend of sending HD-quality sports content over fiber networks instead of via satellite is one I've been hearing about for several years. Yet the transition is slow, in part because broadcasters and the organizations that manage the sports are hesitant to mess with a formula that works. For each F1 race, Tata provides a gigabit of connectivity to the track itself, more than 10 times the capacity F1 had before the Tata sponsorship.

“In the middle of Austin this isn’t a big deal, but some of these sites are fairly remote,” said Kapadia.

And because the race teams use that on-site connectivity to stream data from hundreds of sensors inside each car to engineers who perform real-time calculations at the track, the extra capacity will get used. Those calculations help the teams tell their drivers when to come in for a pit stop and how to handle the course or other drivers. Somewhere in all that information, there's almost certainly a great big-data case study.
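F1 teams don't publish their telemetry pipelines, so what follows is only a minimal sketch of the kind of streaming check involved. The sensor channels, thresholds, and the `telemetry_stream` feed are all hypothetical stand-ins, not any team's actual system.

```python
import random
from dataclasses import dataclass

# Hypothetical telemetry reading; real F1 cars carry hundreds of sensors,
# but the channel names and thresholds here are illustrative only.
@dataclass
class Reading:
    lap: int
    tire_temp_c: float   # rear tire surface temperature
    fuel_kg: float       # fuel remaining
    lap_time_s: float

TIRE_TEMP_LIMIT_C = 110.0  # assumed degradation threshold
FUEL_PER_LAP_KG = 1.8      # assumed average burn rate

def should_pit(r: Reading) -> bool:
    """Flag a pit stop when tires run too hot or fuel won't last two laps."""
    return r.tire_temp_c > TIRE_TEMP_LIMIT_C or r.fuel_kg < 2 * FUEL_PER_LAP_KG

def telemetry_stream(laps: int):
    """Stand-in for the live feed coming off the car over the track network."""
    fuel = 100.0
    for lap in range(1, laps + 1):
        fuel -= FUEL_PER_LAP_KG
        yield Reading(
            lap=lap,
            tire_temp_c=90 + lap * 0.6 + random.uniform(-2, 2),
            fuel_kg=fuel,
            lap_time_s=95 + random.uniform(-1, 1),
        )

for reading in telemetry_stream(laps=55):
    if should_pit(reading):
        print(f"Lap {reading.lap}: call the driver in "
              f"(tires {reading.tire_temp_c:.1f} C, fuel {reading.fuel_kg:.1f} kg)")
        break
```

In a real pipeline the check would run continuously against live data rather than a simulated generator, but the shape is the same: a stream of readings, a set of models or thresholds, and a decision relayed to the pit wall.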

How much will F1 put in the cloud?

Tata is also hosting the Formula 1 web sites and apps in its data centers. And like most sports properties, the site sees demand fluctuate wildly, peaking right before and during the race itself. It also draws interest from all over the world, since F1 is a truly international sport, which poses geographical challenges and means there is no single prime time tied to one time zone. Tata hosts some of the site's content on its on-demand InstaCompute cloud, and it has also built a content delivery network using technology it acquired through the purchase of BitGravity in 2011.
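Tata hasn't detailed how the F1 site absorbs those spikes, but the usual pattern for bursty, worldwide traffic is to let CDN edge caches take the load off the origin servers. Here's a minimal sketch of that caching pattern as a generic Python WSGI app; the paths and cache lifetimes are illustrative assumptions, not F1's actual configuration.

```python
from wsgiref.simple_server import make_server

# Minimal sketch of the CDN-friendly pattern: mark responses cacheable so
# edge nodes (a BitGravity-style CDN, say) serve race-day peaks instead of
# the origin. Paths and TTLs here are illustrative, not F1's actual setup.
def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path.startswith("/live"):
        # Live timing changes constantly; cache it only briefly at the edge.
        headers = [("Content-Type", "text/plain"),
                   ("Cache-Control", "public, max-age=2")]
        body = b"live timing snapshot\n"
    else:
        # Static pages can sit at the edge for much longer.
        headers = [("Content-Type", "text/plain"),
                   ("Cache-Control", "public, max-age=3600")]
        body = b"season standings page\n"
    start_response("200 OK", headers)
    return [body]

if __name__ == "__main__":
    with make_server("", 8000, app) as server:
        server.serve_forever()
```

The design choice is simply to tune how long each kind of content may live at the edge: long lifetimes for pages that rarely change, very short ones for live data, so the origin only sees a trickle of requests even at race time.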

But much of the F1 IT work is still handled on-site at the broadcast center, which means F1 IT staff are on the road a lot, even if it's to exotic locales. By putting more of its operations in the cloud and by having fat pipes connecting the racetracks, Kapadia hopes to take more of F1's work online and let it happen from remote locations. Because as cool as a traveling IT infrastructure and support team is, it's also a logistical challenge, and one that better connectivity and the cloud could help solve.
