Snap AR is our developer platform, and together with our community, we’re expanding what’s possible through our Camera.
There are more than 250,000 creators, developers, and partners from nearly every country in the world building on Snap AR. The community has tapped into its creativity and technical skill to build over 2.5 million Snapchat Lenses that have been viewed over 5 trillion times. Creators are building careers and fully fledged businesses with our technology, and we’re continually offering new ways to support their endeavors. From the Snap Lens Network to GHOST, our programs provide funding, early access to new tools, and greater reach to Snapchatters globally.
They’re also building a future beyond mobile with our next-generation Spectacles. Hundreds of developers are using new capabilities like hand tracking, VoiceML, and Connected Lens technology to build new ways of interacting with AR.

New AR Tools

All of their creation and exploration begins with Lens Studio. Today, we’re releasing a new version so developers can build Lenses that are more dynamic and look true to life. 
With our API Library, Lenses can surface real-time financial, translation, and weather data from partners, making for richer, more useful AR experiences. We’re expanding the API Library with new partners AstrologyAPI and Sportradar, so Lenses can tap into frequently updating information like daily horoscopes and live sports scores.
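To give a sense of what this looks like in practice, here is a rough sketch of how a Lens script might consume one of these integrations through Lens Studio’s Remote Service Module. The endpoint name, parameters, and response fields below are hypothetical stand-ins for whatever a partner API such as Sportradar actually defines in its spec, and the exact request-building calls may vary by Lens Studio version.

    // Lens Studio script (sketch): fetch live data through an API Library integration.
    // Bind a Remote Service Module asset configured with the partner API:
    //@input Asset.RemoteServiceModule remoteServiceModule

    var request = global.RemoteApiRequest.create();
    request.endpoint = "get_live_scores";        // hypothetical endpoint from the API spec
    request.parameters = { "league": "nba" };    // hypothetical parameter

    script.remoteServiceModule.performApiRequest(request, function (response) {
        // statusCode 1 indicates a successful remote API response
        if (response.statusCode === 1) {
            var data = JSON.parse(response.body);
            print("Live score: " + data.homeScore + " - " + data.awayScore);
        } else {
            print("Remote API call failed with status " + response.statusCode);
        }
    });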
We’re also improving our Lens Analytics feature with Event Insights, offering creators a look at what happens while their Lenses are in use. We hope this makes it easier for creators to debug issues and build experiences that bring people back again and again. 
And coming soon, Ray Tracing will let reflections shine from AR objects in a lifelike way. It’s the first time this feature will scale across a wide range of mobile devices, enabling console-quality realism across platforms. Partners like Tiffany & Co. will bring a signature piece into AR that sparkles with precision, while Disney and Pixar will take Buzz Lightyear's iconic space suit to infinity and beyond in an AR try-on Lens.
As developers push the limits of AR creation, we want to empower them with an advanced suite of services that vastly expands what a Lens can do. So, we’re introducing Lens Cloud, a free collection of backend services that enables Lenses to be more useful, dynamic, and connected to the world than ever before. There are three main services: 
  • Multi-User Services lets groups of friends interact in the same Lens at the same time.
  • Location-Based Services anchors Lenses to places using our city templates, or to any custom location around the world. Central London is the first City Landmarker available now, with more launching over the next year. 
  • Storage Services makes it possible to build complex, interactive Lenses by storing assets in our cloud and calling on them on demand. Snapchatters will also be able to pick up where they last left off through persistent data support (a small sketch of this pattern follows the list). Storage Services will launch in the coming months. 
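Cloud-backed Storage Services haven’t shipped yet, but the pick-up-where-you-left-off pattern is already familiar from Lens Studio’s on-device persistent storage. A minimal sketch, assuming a simple game Lens that remembers a best score between sessions (a cloud-backed version would presumably look similar, with the state held server-side instead of on the device):

    // Lens Studio script (sketch): persist state so a Snapchatter resumes where they left off.
    // Uses the existing on-device persistentStorageSystem as a stand-in for cloud storage.
    var store = global.persistentStorageSystem.store;

    var bestScore = store.getInt("bestScore");   // returns 0 if nothing has been saved yet
    print("Welcome back! Best score so far: " + bestScore);

    // Hypothetical helper called by the Lens when a round ends.
    function onGameOver(finalScore) {
        if (finalScore > bestScore) {
            bestScore = finalScore;
            store.putInt("bestScore", bestScore);
        }
    }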
We’re excited to see how developers leverage Lens Cloud and Lens Studio together to build a new generation of AR experiences that enhance the way we play, learn, explore, and shop.

Camera Kit

The Snap AR platform powers Lens experiences not only on Snapchat and Spectacles, but also in other mobile applications through Camera Kit. With our AR SDK, any developer or partner can bring Snap’s Camera technology into their own mobile app, adding a new dimension to their customer experience.
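As a rough illustration of what an integration involves, here is a minimal sketch based on the shape of the Camera Kit Web SDK (@snap/camera-kit); the native iOS and Android SDKs differ in detail, and the API token, canvas id, and Lens group ID below are placeholders:

    import { bootstrapCameraKit, createMediaStreamSource } from '@snap/camera-kit';

    // Placeholders: supply your own Camera Kit API token and Lens group ID.
    const cameraKit = await bootstrapCameraKit({ apiToken: 'YOUR_API_TOKEN' });

    // Render Snap's camera pipeline into a canvas element in the host app.
    const session = await cameraKit.createSession({
      liveRenderTarget: document.getElementById('ar-canvas') as HTMLCanvasElement,
    });

    // Feed the device camera into the session and start rendering.
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    await session.setSource(createMediaStreamSource(stream));
    await session.play();

    // Load a Lens group built in Lens Studio and apply its first Lens to the live feed.
    const { lenses } = await cameraKit.lensRepository.loadLensGroups(['YOUR_LENS_GROUP_ID']);
    await session.applyLens(lenses[0]);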
Camera Kit has been adopted by a wide range of partners – from global brands like Samsung, Disney, and Microsoft’s Flipgrid, to developers building an app for the first time. This year, we’ll continue to work with new partners to integrate Camera Kit, develop new use cases, and push the boundaries of AR. 
With new Lens capabilities, tools, and infrastructure, we can’t wait to see how the Snap AR community continues to prove the value and impact that AR can have on the world.