One Lens in particular perfectly captured the feeling of the season. If you were hoping for a white Christmas this year — or just wanted to make a snow angel this winter — then developers Patrick Fox-Roberts, Isac Müller Sandvik, and Benjamin Barral came up with the perfect powdery solution for Snapchatters.

Their World Transformation Lens gave our community the chance to experience a winter wonderland, coating their AR-enhanced environment in a fresh blanket of snow — no jacket or scarf necessary. We caught up with the team of London-based engineers to ask how they made this Lens. And, like a true team, they combined their answers for us.

Here are Patrick, Isac, and Benjamin.


Snap AR: Can you tell us about your role at Snap?

We work in computer-vision engineering and deep learning. In other words, we spend all day reading papers, implementing new ML (machine learning) techniques, and using them to make new, best-in-class world AR effects.

Snap AR: What inspired these December World Transformation Lenses?

Changing the season in an image has been a dream proof of concept for many deep-learning professionals. We wanted to build a system that trained quickly and efficiently, ran in real time so users could actually hold it in their hands, and was robust enough to experience anywhere. In this particular case, it was simply fortuitous timing, because our work was perfectly suited to the end-of-year holidays!

Snap AR: Why use ML/World Transformation Technology for it?

ML-based transformation filters are a very exciting technology. You show a network enough examples of the subject matter you want to transform into, train it, and add the secret sauce that makes it work well. Out the other end comes a transformation network that has learned rules like “lights are on at night” in our Day to Night effect, or “trees have snow on them in winter” in our Winterification effect. It goes way beyond what is possible with traditional effects.
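
To make that idea concrete, here is a minimal sketch of the kind of image-to-image translation setup such effects build on, written in PyTorch. Everything in it (the tiny encoder-decoder architecture, the L1 loss, the 256×256 shapes) is an illustrative assumption, not the team's actual network, which would be far more sophisticated and typically trained with an adversarial objective.

```python
# Illustrative sketch only: a tiny pix2pix-style image-to-image generator
# and one supervised training step. The architecture, loss, and shapes are
# assumptions for illustration, not the team's actual model.
import torch
import torch.nn as nn

class TinyTranslator(nn.Module):
    """Maps an input photo to a transformed photo (e.g. summer -> winter)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=4, stride=2, padding=1),   # 256 -> 128
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),  # 128 -> 64
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),  # 64 -> 128
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),   # 128 -> 256
            nn.Tanh(),  # outputs in [-1, 1], matching the normalized input
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinyTranslator()
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)

# One training step on a (summer, winter) image pair. Real training would
# use a large dataset and typically an adversarial loss as well.
summer = torch.rand(1, 3, 256, 256) * 2 - 1   # placeholder input batch
winter = torch.rand(1, 3, 256, 256) * 2 - 1   # placeholder target batch
loss = nn.functional.l1_loss(model(summer), winter)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Once trained, only the generator ships: a single forward pass per camera frame is what has to fit inside a phone's real-time budget.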

Snap AR: How did you integrate ML/World Transformation into the Lenses?

At the most basic level, World Transformation effects work like full-screen effects. The system feeds the camera image into the ML network, and a transformed version comes out that we can draw on the screen. For most Lenses, this is not the whole story, as the effects themselves are part of a larger whole that makes up a good Lens. Animation, other filters, 3D assets, and interaction all combine to make a fun Lens experience.
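
Here is a rough sketch of that per-frame loop: camera image in, transformed image out, drawn to screen. The helpers capture_frame and draw_to_screen are hypothetical stand-ins for the platform's camera and rendering hooks, not real Lens Studio APIs, and the Identity model in the demo stands in for a trained network.

```python
# Hedged sketch of the full-screen pipeline described above. Not Lens Studio
# code: capture_frame and draw_to_screen are hypothetical platform hooks.
import numpy as np
import torch

def run_frame(model, capture_frame, draw_to_screen):
    frame = capture_frame()                       # HxWx3 uint8 camera image
    x = torch.from_numpy(frame).float()           # to a float tensor
    x = x.permute(2, 0, 1).unsqueeze(0) / 127.5 - 1.0  # NCHW, scale to [-1, 1]
    with torch.no_grad():                         # inference only, no gradients
        y = model(x)
    out = ((y[0].permute(1, 2, 0) + 1.0) * 127.5).clamp(0, 255)
    draw_to_screen(out.byte().numpy())            # composite with the rest of the Lens

# Demo with placeholder camera and display so the sketch runs end to end.
if __name__ == "__main__":
    stand_in_network = torch.nn.Identity()        # stand-in for a trained network
    run_frame(stand_in_network,
              capture_frame=lambda: np.zeros((256, 256, 3), dtype=np.uint8),
              draw_to_screen=lambda img: print("frame out:", img.shape))
```

In a real Lens, that transformed frame is just one layer: the animation, filters, 3D assets, and interaction the team mentions are rendered on top of it.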

Snap AR: What sort of new possibilities does this technology present to the Snap AR community?

Probably the most exciting part of this is that the ML deployment is done using the standard ML Components tools in Lens Studio! Any of our users out there can do this, and we’re simply trying to lead the way.

Snap AR: What exciting new project are you working on next?

Sorry, we really can’t say. You'll know it when you see it, though!

What a perfect way to end a year of incredible Lens Drops. As we look toward 2022, we know that, just like Patrick, Isac, and Benjamin, you have so much potential on the horizon for your creations. With Snap AR’s technology and your talent, you have everything you need to make something truly innovative. It’s only a matter of time. See you next year!