- Lens Cloud is a collection of backend services built on the same infrastructure that powers Snapchat. Lens Cloud vastly expands what developers can build in augmented reality by providing Multi-User Services, Location Services, Storage Services, and Scanning Services.
- With the release of Lens Cloud, a City Scale Template is now available to help you get started building city-wide content for London, UK.
- Body Depth & Normal Texture Guide and Template provides developers with a detailed estimate of the depth and normal direction for every pixel of a person, including their body, head, hair, and clothes. This makes it easy to create sophisticated lighting effects and to calculate realistic occlusion and collisions with AR objects.
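As a generic illustration of the occlusion idea (plain NumPy, not the Lens Studio API), per-pixel occlusion from a depth estimate reduces to a depth comparison: a virtual object's pixel is hidden wherever the person is closer to the camera.

```python
import numpy as np

def occlusion_mask(person_depth, object_depth):
    """Return True where the AR object should be drawn.

    The object is visible only where it sits closer to the
    camera than the estimated per-pixel depth of the person.
    """
    return object_depth < person_depth

# Toy 2x2 example: the person is 1.0 m away on the left, 3.0 m on the right.
person = np.array([[1.0, 3.0],
                   [1.0, 3.0]])
obj = np.full((2, 2), 2.0)   # virtual object placed 2.0 m from the camera
visible = occlusion_mask(person, obj)  # left column occluded, right column visible
```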
- Depth Render Target gives developers the ability to capture a Camera's depth stencil texture, enabling custom effects such as screen-space normals, custom occlusion, fog, and depth-based particle collisions.
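One of the effects named above, screen-space normals, can be derived from a depth texture alone. A minimal sketch of the technique (a generic NumPy illustration, not the Lens Studio material system): approximate the depth gradients with finite differences and normalize the resulting per-pixel vector.

```python
import numpy as np

def normals_from_depth(depth):
    """Estimate a unit surface normal for every pixel of a depth map."""
    # Finite-difference approximation of the depth gradients.
    dzdx = np.gradient(depth, axis=1)
    dzdy = np.gradient(depth, axis=0)
    # Per-pixel normal: (-dz/dx, -dz/dy, 1), normalized to unit length.
    n = np.dstack((-dzdx, -dzdy, np.ones_like(depth)))
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    return n

# A tilted plane: depth increases from left to right, so the
# normals should lean away from +x while still facing the camera (+z).
depth = np.tile(np.linspace(1.0, 2.0, 8), (8, 1))
normals = normals_from_depth(depth)
```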
- SnapML has also received updated features that enhance the user's experience and help Lenses understand the user's environment.
- Light Estimation allows developers to render more realistically by matching real-world lighting on objects rendered by the Lens.
- Noise/Blur Estimation gives developers the ability to match AR content to the noise and blur levels of the device's camera.
- SnapML now supports quantized models, an ML model format that reduces model size by 50%, speeds up inference, and improves power efficiency.
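For intuition, here is a generic sketch of how quantization trades precision for size: float32 weights are mapped onto 8-bit integers plus a scale and zero point. This illustrates the general technique, not SnapML's internal format.

```python
import numpy as np

def quantize(weights, num_bits=8):
    """Affine (asymmetric) quantization of a float array to unsigned ints."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = float(weights.min()), float(weights.max())
    scale = (hi - lo) / (qmax - qmin)          # float value per integer step
    zero_point = round(-lo / scale)            # integer that represents 0.0
    q = np.clip(np.round(weights / scale) + zero_point, qmin, qmax)
    return q.astype(np.uint8), scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights from the quantized form."""
    return (q.astype(np.float32) - zero_point) * scale

# 1000 float32 weights become 1000 uint8 values plus two constants.
w = np.random.default_rng(0).normal(size=1000).astype(np.float32)
q, s, z = quantize(w)
w_hat = dequantize(q, s, z)  # reconstruction error stays within one step
```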
- A new Multi Class Classification Template has been added to showcase quantized models.
- Lower Garment Segmentation has been added to the Segmentation System, allowing developers to get upper-garment and lower-garment segmentation masks simultaneously with low overhead. A Segmentation Trails Template has been added to showcase this functionality.
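Conceptually, having both garment masks at once means an effect can composite them with a single logical operation per pixel. A toy sketch with NumPy arrays standing in for the segmentation textures (illustrative only, not the Lens Studio segmentation API):

```python
import numpy as np

# 0/1 masks standing in for upper- and lower-garment segmentation textures.
upper = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0]])
lower = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1]])

# Full-outfit mask: a pixel belongs to the outfit if either mask covers it.
outfit = np.logical_or(upper, lower)

# Lower-garment-only effect: lower pixels not already covered by the upper mask.
lower_only = np.logical_and(lower, np.logical_not(upper))
```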
- True Size Object Template is a new template that uses the best tracking solution available on your device to place objects in physical space at accurate scale. This lets users test how well physical objects would fit into their environment.
- Voice ML has received two new templates that show developers how to add an extra layer of personalization to their content.
- The API Library has expanded: you can now build Lenses with third-party APIs that provide play-by-play data from five different sports leagues, daily zodiac astrological predictions, and location-specific air quality data.
- Additionally, we made it easier to build Lenses with third-party APIs: the block of code used to connect to an API is now generated automatically when you import the API asset into your project.
- Fixed an issue where duplicate custom components could not be created from a script
- Fixed an issue where base color textures were tinted blue when importing certain glTF files
- Fixed a crash in Lens Studio with certain glTF models
- Fixed a crash in Lens Studio when a specific script has errors
- Fixed an issue when moving 'Device Camera Texture' from a folder into the root of the Resources Panel
Windows 10 (64-bit); macOS 10.13+
Minimum of Intel Core i3 2.5 GHz, AMD FX 4300 2.6 GHz, or Apple M1, with 4 GB RAM; Intel HD Graphics 4000 / NVIDIA GeForce 710 / AMD Radeon HD 6450 or better; screen resolution of 1280x768 or higher
Please make sure that you have the latest driver for your specific graphics card installed
Join The Community
Get updates from Snap AR including creator tools, tutorials, meetups, events, and more — all for free!