Anil Sabharwal, vice president of Photos at Google.
Stephen Lam for BuzzFeed News
Almost one year ago, Google launched a new photo management app – Google Photos – for both iOS and Android. It’s been a hit. Last week, at its Google I/O developer conference, the company said the app has already amassed 200 million users. “I think it makes us one of the fastest growing consumer products in history,” said Anil Sabharwal, who runs Google Photos. In an interview with BuzzFeed News looking back on the past year, Sabharwal suggested ways Google Photos might continue to change and evolve.
One of the app’s main purposes is to help people back up all of their photos. Whether you’re on iOS or Android, every picture you take on your phone is automatically backed up to Google’s servers. And Google Photos does this with an eye toward eliminating duplicates: once a photo is safely backed up, the app can delete the local copy to free up space on your device. And it has deleted a lot of photos.
“In the year since we’ve launched, we’ve freed up 13.7 petabytes of storage on people’s phones,” Sabharwal told BuzzFeed News. “For a lot of our users this was incredibly valuable because they were running out of space on their devices.”
Sabharwal said many of the people using this Google Photos feature live in developing markets, and own phones that don’t have a lot of storage space to begin with. And so he pointed to a future in which Google might use artificial intelligence to determine whether a photo needs to be backed up at all before it’s deleted. “How can [people] free up space even when they have not backed up?” Sabharwal said. “You can imagine us doing things like deleting blurry photos or deleting duplicates.”
Another core feature of Google Photos is Google Assistant (which is now making its way throughout Google). Assistant will do things like automatically categorize photos into themed groupings – it will automatically find and group all your photos of beaches, for example. It will also group people together using face matching, and, because you can tag those people by name, it then lets you do things like search “John at the beach” and find all your photos of, well, John at the beach. Google Photos can even use those groupings – say you take a bunch of photos at a specific beach on a specific weekend – to automatically generate albums, movies, collages, and GIFs.
Google refers to these auto-generated moments as “creations.” According to Sabharwal, the company has made 1.6 billion of them in the past year and has big plans to do more. “I think there’s a really great opportunity to mix the machine learning and creations together,” he said. “One [creation] we love is the concept of ‘rediscover this day’ – where we present to our users meaningful moments on a particular date in previous years. Rather than ‘here’s what happened a year ago,’ it’s here’s a set of photos from the last time you were with these people, or the last time you were at this restaurant.”
Sabharwal also said Photos might become smarter about the albums and movies it creates by giving them a stronger perspective and point of view. It might, for example, automatically select a wedding shot in which you and your partner are looking at each other for the hero shot in an anniversary album.
Another of the three key pillars of Google Photos – along with storage and organization – is sharing. The feature is designed so that anyone you share photos with can, in a single click, add those photos to their own libraries. People can form shared albums with multiple contributors. Photos can easily be exported to other apps and services, like Facebook or Gmail. But Sabharwal told BuzzFeed News there are still improvements to be made here as well, specifically around person-to-person proximity sharing.
“Twenty-five million photos a week are shared by Bluetooth,” Sabharwal said. “There are a lot of bandwidth-sensitive markets. If you and I are standing next to each other and I’ve got a great photo and you want that photo, why would I spend data – which is a significant fraction of my disposable income – to send it?”
“If you think of that as a glimpse into where we’re going, you can see us investing in that experience to make it easier and better,” he continued. “How do we make proximity sharing easier? How do we help you to remember to share? How do we make it so every time I take a photo of my daughter, it’s shared with my wife?”
Sabharwal pointed to Nearby – a new project, also announced at Google I/O, designed to help people share and communicate when in close proximity – as one possible solution.
“Rather than sharing to an app or a destination,” Sabharwal said, “we’re thinking about how we’re sharing with people. That’s the idea we’re building on.”