At Facebook Camera, Every Day They Think About Delivering Value to AR Users

Matt Uyttendaele, LDV Vision Summit 2018 ©Robert Wright

Early Bird tickets now available for our LDV Vision Summit 2019 - May 22 & 23 in NYC at the SVA Theatre. 80 speakers in 40 sessions discuss the cutting edge in visual tech. Register now!

Matt Uyttendaele is the Director of Core AI at Facebook Camera. At our 2018 LDV Vision Summit, Matt spoke about enabling persistent Augmented Reality experiences across the spectrum of mobile devices. He shared how, at Facebook Camera, they are solving this and giving creators the ability to author these experiences on their platform. He showcased specific examples and highlighted future challenges and opportunities for mobile augmented reality to succeed.

Good morning LDV. I am Matt Uyttendaele. I work on the Facebook camera and today I'm going to talk about our AR efforts on smartphones.

We at Facebook and Oculus believe that AR wearables are going to happen someday, but we're not waiting for that. We want to build AR experiences into the apps that our community uses every day: Messenger, Facebook and Instagram. And I'm going to do a deep dive into some of those efforts.

How do we bring AR to mobile? There are three major investments that we're making at Facebook. First is just getting computer vision to run on the smartphone: we take the latest state-of-the-art computer vision technology and get it to run at scale on smartphones.

Second, we're building a creator platform. That means we want to democratize the creation of AR in our apps. We want to make it super simple for designers to create AR experiences on our platform.

And then we're constantly adding new capabilities. The Facebook app updates every two weeks. And in those cycles, we're adding new capabilities and I'll dive into some of those in the talk.

One of our challenges in bringing AR to mobile devices at this scale is that there's a huge variety of hardware out there, right? Some of these are obvious, like camera hardware. We need to get computer vision to run on a huge variety of phones, so that means we have to characterize exactly the cameras and lenses on all these phones. Inertial sensors are super important for determining how the phone moves. That works pretty well on the iPhone, not so much on Android. It was telling that on the Pixel 2, one of the top marketing bullet points was an IMU synchronized with the camera, because that enables AR. But that's a challenge that we face in bringing these experiences at scale. All told, we support 10,000 different SKUs of cameras with our apps.

So let's dive a little bit into some of the computer vision that's running in our AR platform. On the left, we take in a video frame and IMU data, and a user may select a point to track within that video. First we have a tracker selector that analyzes the incoming frame, and it is also aware of the capabilities of the device we're operating on.

©Robert Wright/LDV Vision Summit 2018

Then we've built several state-of-the-art computer vision algorithms. I think our face tracker is probably one of the best, or maybe the best, monocular face trackers out there running on a mobile phone. But we also have a simple inertial tracker that just uses the IMU. And we've implemented a really strong simultaneous localization and mapping algorithm, also known as SLAM. At any given time, one of these algorithms is running while we're doing an AR experience. And we can seamlessly transition between these algorithms.

For example, if we're using SLAM and we turn the phone toward a white wall where there are no visual features to track, we can seamlessly transition back to the IMU tracker. That lets us deliver a consistent camera pose across the AR experience, so your AR object doesn't move within the frame. Okay, so that's the dive into our computer vision.
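The per-frame selection and fallback logic Matt describes can be sketched roughly as follows. This is a minimal illustration, not Facebook's implementation: the tracker names mirror the talk, but the feature-count threshold and function signature are my assumptions.

```python
from enum import Enum, auto

class Tracker(Enum):
    FACE = auto()  # monocular face tracker
    SLAM = auto()  # simultaneous localization and mapping
    IMU = auto()   # inertial-only tracker (fallback)

def select_tracker(face_detected: bool, feature_count: int,
                   min_features: int = 20) -> Tracker:
    """Pick a tracker for the current frame.

    Illustrative policy: prefer the face tracker when a face is
    present, use SLAM when enough visual features are trackable,
    and fall back to the IMU-only tracker otherwise (e.g. when
    the camera points at a featureless white wall).
    """
    if face_detected:
        return Tracker.FACE
    if feature_count >= min_features:
        return Tracker.SLAM
    return Tracker.IMU
```

Because the selector runs every frame, the system can hand off between trackers mid-experience, which is what keeps the camera pose, and therefore the AR object, stable on screen.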

Here's a look at our creator platform. Here's somebody wiring up our face tracker to an experience that he has designed; these arrows were designed by this creator. And similarly, here's somebody else taking our face tracker and wiring it up to a custom mask that they have developed. So this is our designer experience, in something we deliver called AR Studio.

AR Studio is cross-platform, obviously, because our apps run cross-platform, so you can build an AR experience and deliver it to both iOS and Android. It's delivered through our Facebook camera stack, which means it runs across our family of apps: Messenger, Facebook and Instagram. And we've enabled these AR experiences to be delivered to the 1.5 billion people that run our apps. So if you build an experience in AR Studio, you can have a reach of 1.5 billion people.

“We've enabled these AR experiences to be delivered to 1.5 billion people that run our apps.”

Okay, let me now look at a new capability that we recently delivered. This is called Target AR. Here this user is taking out his phone and pointing it at a target that's been registered in our AR Studio. So this is a custom target, and they've built a custom experience that overlays on that target. When we recognize that target, their experience pops up and is displayed there.

And we didn't build what's shown here as a one-off experience. We built it as a core capability of the platform. So here, our partners at Warner Brothers, at South by Southwest, deployed these posters across Austin around the time of the Ready Player One launch, and they used AR Studio to build a custom experience where, when we recognize that poster, their effect pops up in the app. And here's one of my partners on the camera team doing a demo, and that Warner Brothers experience popped up as it recognized the poster.
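The Target AR flow above boils down to: register a target, then at runtime match the camera frame against the registry and fire the associated effect. Here's a toy matcher to make that concrete. Everything here is illustrative: a real system matches local image descriptors with approximate nearest-neighbor search, not plain feature-ID sets, and the overlap threshold is my assumption.

```python
def recognize_target(frame_features: set, registry: dict,
                     min_overlap: float = 0.6):
    """Return the name of the best-matching registered target, or None.

    Toy illustration: each registered target is a set of feature IDs,
    and a match is scored by the fraction of the target's features
    that were found in the current frame.
    """
    best_name, best_score = None, 0.0
    for name, target_features in registry.items():
        if not target_features:
            continue
        overlap = len(frame_features & target_features) / len(target_features)
        if overlap > best_score:
            best_name, best_score = name, overlap
    # Only trigger the effect when the match is confident enough.
    return best_name if best_score >= min_overlap else None
```

For example, a frame sharing four of a poster's five registered features would trigger that poster's effect, while a frame with no overlapping features would trigger nothing.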

What I want to leave you with is that we at Facebook deliver value to users in AR, and that's something we think about every day on the Facebook camera team. I've shown you some novel experiences, but what we really strive to do is deliver real user value through them. So please look at what we're doing over the next year in our Facebook camera apps across Messenger, Facebook and Instagram, because that's what we hope to achieve.

Thank you.

Watch Matt’s keynote at our LDV Vision Summit 2018 below and check out other keynotes on our videos page.

Early Bird tickets are now available for our LDV Vision Summit May 22 & 23, 2019 in NYC to hear from other amazing visual tech researchers, entrepreneurs and investors.