Every Lens Studio update helps us further our mission to provide you with the best AR creation experience possible. This latest update is rolling out everything from code editing extensions to more photorealistic renderings to new landmarker templates.
The goal is to provide a seamless user experience that puts your creative vision first. Whether it’s streamlining features, expanding our offerings, or tweaking what’s already working to make it even better, each update is made with you and your creative process in mind.

Lens Cloud

Lens Cloud is a collection of backend services, built on the same infrastructure that powers Snapchat. Lens Cloud vastly expands what developers can build in augmented reality by providing Multi-User Services, Location Based Services, and Storage Services.

City Landmarker

Build custom landmarks for your city! This new template lets AR developers launch location-based AR anywhere in a city or micro-neighborhood through Lens Studio. The rollout starts with central London, with more cities to come later this year and in early 2023.

VSCode Extension

Code editing has never been easier! Visual Studio Code is now offered as an IDE for your Lens Studio projects. The extension enables code editing, smart code completion, JavaScript debugging, and JS code snippets for building out your Lenses. Lens Studio’s built-in script editor remains a great fit for newcomers, or for pros knocking out a first version of a script.
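To give a sense of what you’d be editing, here’s a minimal sketch of a Lens Studio script; the `target` input name is just an example, but the `UpdateEvent` pattern is the standard way to run logic every frame:

```js
// Spin a scene object a little every frame.
//@input SceneObject target

var angle = 0;

var updateEvent = script.createEvent("UpdateEvent");
updateEvent.bind(function (eventData) {
    // Advance the rotation at one radian per second, scaled by frame time.
    angle += eventData.getDeltaTime();
    script.target.getTransform().setLocalRotation(quat.angleAxis(angle, vec3.up()));
});
```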

Body Depth & Normal Textures

We want to make sure your Lens comes out looking as clean and polished as possible. Our Body Depth and Normal Textures feature gives a detailed estimate of the depth and surface normal direction for every pixel that makes up a person, including their body, head, hair, and clothes. The result is lighting effects and interactions with AR objects that are far more sophisticated and realistic.
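As a rough sketch of how these textures could be wired up from script: the input names and the `depthTex`/`normalTex` parameter names below are illustrative assumptions, and in practice you’d match the parameters your material actually defines.

```js
// Hypothetical wiring: pass Body Depth / Normal textures into a material.
//@input Asset.Texture bodyDepthTexture
//@input Asset.Texture bodyNormalTexture
//@input Asset.Material targetMaterial

// Material pass parameters are addressed by name; "depthTex" and
// "normalTex" are assumed names for this example.
script.targetMaterial.mainPass.depthTex = script.bodyDepthTexture;
script.targetMaterial.mainPass.normalTex = script.bodyNormalTexture;
```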

ML Environment Matching

We’ve just upped our game with SnapML’s environment matching functions. With Light Estimation, Lens Creators can craft a more photorealistic rendering by matching environmental lighting on their object renderings. Now, AR items placed near or on the face (think sunglasses, hats, jackets, scarves) can better reflect real-world lighting. With Noise/Blur, Lens Creators can match their AR content to the noise and blur levels of the camera feed. As with Light Estimation, this is best demonstrated with AR objects near or on the face.

API Library Expansion

We’re excited to add even more partners to Lens Studio’s API Library, including Sportradar, Plume Labs, and Astrology API. With Sportradar, access play-by-play updates for sports including basketball, tennis, soccer, and cricket to inspire your next Lens. Plume Labs allows you to build Lenses with the exact air quality of a specific location, furthering your ability to create a more realistic rendering. Lastly, Astrology API taps into AR experiences surrounding daily horoscopes and astrology readings. Mercury was not in retrograde when updating this feature.
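API Library integrations are typically called from script through a Remote Service Module. The sketch below assumes that pattern; the `"daily_horoscope"` endpoint name and its parameter are made up for illustration, and each partner API defines its own real endpoints:

```js
// Hypothetical call to an API Library integration via the Remote Service Module.
//@input Asset.RemoteServiceModule remoteServiceModule

var req = RemoteApiRequest.create();
req.endpoint = "daily_horoscope";          // assumed endpoint name
req.parameters = { "sign": "aquarius" };   // assumed parameter

script.remoteServiceModule.performApiRequest(req, function (response) {
    if (response.statusCode === 1) {       // 1 indicates success
        print("API response: " + response.body);
    } else {
        print("API call failed with status " + response.statusCode);
    }
});
```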

Accurate to Size Template

Shop and design with more accurate AR technology. This new template uses the best tracking solution available on each device to provide accurate scale when Snapchatters place objects in their physical space. On LiDAR devices, world mesh capabilities enable real-time occlusion and improved accuracy. Non-LiDAR devices rely on multi-surface tracking to improve sizing accuracy.

Lower Garment Segmentation

Get excited: more garment segmentation options are here. Now that lower garment segmentation has joined the Studio roster, creators have three segmentation options to choose from: upper, lower, and full garment. Use any of them, or combine upper and lower, with little impact on performance.
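As one illustrative use, a segmentation texture can drive an opacity mask from script. In this sketch the input names are examples, and `opacityTex` is assumed to be the opacity parameter exposed by the image’s material:

```js
// Hypothetical sketch: use a lower-garment segmentation texture
// as the opacity mask for an overlay image.
//@input Asset.Texture lowerGarmentSegmentation
//@input Component.Image overlayImage

// "opacityTex" is an assumed material parameter name.
script.overlayImage.mainMaterial.mainPass.opacityTex = script.lowerGarmentSegmentation;
```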

VoiceML Features

New VoiceML functions just arrived. Our 2D Animated Text to Speech Template enhances 2D animated Lens experiences. With this feature, creators can convert text strings into speech, then lip-sync the audio over an animation with a moving mouth. Sentiment Analyzer explores five universal emotions: creators and Snapchatters speak or insert text, and the best-matching sentiment (happiness, sadness, etc.) is returned. As a bonus, the feature can also determine Yes/No intent per phrase.
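Here’s a minimal sketch of text to speech from script, assuming the `TextToSpeechModule` pattern; the exact handler signatures may differ from this example:

```js
// Hypothetical sketch: synthesize speech and play it back.
//@input Asset.TextToSpeechModule tts
//@input Component.AudioComponent audio

var onComplete = function (audioTrackAsset) {
    // Assign the generated audio to the component and play it once.
    script.audio.audioTrack = audioTrackAsset;
    script.audio.play(1);
};

var onError = function (error, description) {
    print("Text to speech failed: " + error + " - " + description);
};

var options = TextToSpeech.Options.create();
script.tts.synthesize("Hello from Lens Studio!", options, onComplete, onError);
```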