Why the snap of a photo changed my mind about Google Glass

As a serious smartphone addict who jumps at the sound of an iPhone buzz, I know that I really don’t need more internet in my life. So I haven’t had much interest in Google Glass so far, assuming it would just serve to put more apps in front of my face that my current attention span doesn’t need, and that my iPhone could easily handle.

But this morning when I was walking through downtown San Francisco before Google’s I/O conference, I was crossing a street when I saw a particularly pretty scene of the sun rising between two buildings. Hoping I’d have enough time before the walk signal ended, I dug my iPhone out of my bag, swiped to open the camera, snapped a photo, and then jogged to the curb to avoid getting hit by cars. (Mom, I hope you’re not reading this.)

So a few hours later, when I tried on Google Glass for the first time and said the command “take photo,” instantly capturing a photo of my colleague Kevin Tofel standing in front of me without moving either my head or my hands, I started to see the appeal of Glass.

I’d read a decent amount about the technology since Sergey Brin dropped from a helicopter at last year’s Google I/O, and not only was I sort of confused by the specifics of how Glass works (A camera on your face? Facebook on top of everyday life? How do people see when they’re wearing them?), but I was also turned off by the severely dorky appearance and the idea of constantly monitoring the things around you. They seemed vaguely creepy and intrusive. I was not attracted to the idea of wearing them as a normal person walking around town.

But even though I got only a short spin with the technology on Wednesday, it took just a few seconds for me to understand why people are so jazzed about Glass.

I put them on my face and was immediately impressed with how lightweight they felt. Despite their futuristic, clunky-on-one-side appearance, they didn’t feel very bulky or heavy, and it was easy to see the room around me while wearing them (even though they weren’t fitted specifically for my face the way they would be if I purchased them). The screen felt much smaller and less obtrusive than I’d imagined, and it wasn’t hard to swipe the side of the glasses to navigate the screen. But it was the voice commands, and the “take photo” command in particular, that changed my perspective on the technology.

Would I spend $1,500 on them right now? Definitely not. If you need prescription glasses of any kind, it would be hard to combine those with Glass. While Google has launched them in some jazzy new colors, you still look absurd wearing them (whether you’re in the shower or not). This probably makes me somewhat vain, but I’d want them to look cooler and less futuristic before I wore them in everyday life (seriously, embed them in some Warby Parker frames, and I’d be way more down with the idea).

And once apps start streaming into the glasses, I can’t imagine how seeing New York Times headlines and tweets wouldn’t be distracting while you’re doing things like walking or driving. Of course, none of this even gets into the new etiquette that would have to arise from the spread of Glass.

But despite all the drawbacks, speaking the words for the “take photo” command made me realize that even if wearable computing has a pretty dorky image right now, the potential practical applications for real-life people who don’t consider themselves nerds are endless — once the technology gets a little more refined, and we figure out how to use them in public.

I talked to one Google employee who said she sat at her sister’s graduation and streamed video through Glass to family members watching from afar, and another who said she uses it to take photos of her little kids when her hands are full. I would imagine it could be huge for people with disabilities, or for people doing outdoor sports (Kevin mentioned you could take photos of mile markers while running a marathon).

“Every time we’ve tried to do something crazy we’ve made progress,” Larry Page said on stage today. So does Google Glass seem a little nuts right now? Sure. But if a few years from now I can snap a photo of a sunrise without having a near-miss with traffic, I’m open to the possibilities.
