
Module 1

THE BASICS OF LOCATION-BASED AR LENSES
Any location-based Lens requires a thorough understanding of users’ surroundings through 3D reconstruction, semantic understanding, and location data. In contrast, non-location-based AR overlays objects onto the world contextually or procedurally: it does not rely on a specific place, so the Lens loads without the Snapchatter having to scan the location it was built for.
Building location-based AR content presents some unique considerations. For this course, we will focus on three: blending AR content with the physical world, guiding your audience through the Lens experience, and helping your audience discover and engage with the Lens.

Blending AR Content With the Physical World

The experience of using location-based AR is most effective when the digital content interacts with the geometry of the physical world. For example, raindrops are more believable when they collide with surfaces rather than fall straight down. To create a successful AR experience, consider using a mix of static and dynamic content.

Static Content

Static content is placed directly into the Lens Studio scene view and appears the same every time.

Dynamic Content

Dynamic content is procedurally placed by code and can be adjusted based on various details of the scene like a Snapchatter’s camera angle, the distance to a specific object, or something as novel as the local weather.
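
For example, a short script in Lens Studio can hide or reveal dynamic content based on how far the Snapchatter’s camera is from it. The sketch below is only an illustration, not part of any template; the input names and the distance threshold are placeholders you would tune for your own scene.

    // Dynamic content sketch: show a scene object only when the camera is
    // within a chosen distance of it. Input names and the threshold are
    // illustrative placeholders. World units in Lens Studio are centimeters.
    // @input Component.Camera camera
    // @input SceneObject content
    // @input float showWithinCm = 500.0

    script.createEvent("UpdateEvent").bind(function () {
        var cameraPos = script.camera.getSceneObject().getTransform().getWorldPosition();
        var contentPos = script.content.getTransform().getWorldPosition();

        // Only render the content when the camera is close enough to it.
        script.content.enabled = cameraPos.distance(contentPos) < script.showWithinCm;
    });

The same pattern extends to other inputs: swap the distance check for the camera’s angle, or for data such as the local weather, to drive what gets placed and when.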

The illusion of location-based AR can easily be shattered by technological shortcomings like occlusion errors and drift. Let’s review these:
Occlusion is the effect of one object blocking the view of another. Occlusion issues occur when people or objects enter a Snapchatter’s camera view and the digital content is not rendered behind them as expected. When deciding where to place AR content, try to avoid areas where too many people or objects would be likely to interfere with the camera view.
Drift occurs when a Snapchatter’s camera tracking is unstable, causing the AR content to appear to slide or jump erratically within the scene. To help account for drift when designing a Lens, consider using:

Recognizable Placement/Anchor Points

Place content on parts of the location that are highly recognizable to the tracking system; consider these the “anchor points” of the location. Using anchor points helps reduce drift. Module 2 will explain how to identify the most recognizable parts of a location and select possible anchor points.

Floating Assets

To mitigate the challenge of sliding, try designing assets that are flexible in where they can be placed within the scene. This is often a relatively easy and flexible way to enhance a location-based Lens experience.
When designing your Lens concept, be sure to weigh the pros and cons of location-based AR and its unique considerations against your Lens goals.

Guiding Your Audience Through the Lens Experience

Location-based AR offers an incredible opportunity to find creative ways to move contextual information into a scene. Since a Snapchatter can open a Custom Location Lens from many angles, the Lens should be designed to account for this. Successful design ensures that, no matter which perspective a Snapchatter starts tracking from, they won’t miss any content.
If on-screen UI is necessary, try to keep the main viewport reserved for the real world as much as possible. Successful Lens design approaches include:
  • Placing AR content around a location to naturally lead toward the intended focal point 
  • Triggering any one-time animations based on a Snapchatter’s position or camera view (see the sketch after this list)
  • Using UI markers to indicate elements that appear outside of the field of view
    • Place UI at the bottom third of the screen or on the sides 
    • Ensure UI is dismissible or automatically disappears when no longer relevant 
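
To make the one-time animation idea concrete, here is a minimal sketch: the first time the Snapchatter’s camera comes within a chosen radius of a target object, the script reveals the animated content and stops checking. The input names, the radius, and the choice to simply enable a scene object (rather than start an animation or Behavior directly) are assumptions for illustration.

    // One-time trigger sketch: the first time the camera comes within
    // triggerRadiusCm of the target, reveal the animated content and stop
    // polling. Input names and the radius are illustrative placeholders.
    // @input Component.Camera camera
    // @input SceneObject target
    // @input SceneObject animatedContent
    // @input float triggerRadiusCm = 300.0

    var hasTriggered = false;
    var updateEvent = script.createEvent("UpdateEvent");

    updateEvent.bind(function () {
        if (hasTriggered) {
            return;
        }
        var cameraPos = script.camera.getSceneObject().getTransform().getWorldPosition();
        var targetPos = script.target.getTransform().getWorldPosition();

        if (cameraPos.distance(targetPos) < script.triggerRadiusCm) {
            hasTriggered = true;
            // Enabling the object here could just as easily start an
            // animation or trigger a Behavior response instead.
            script.animatedContent.enabled = true;
            updateEvent.enabled = false; // one-time: stop checking after the trigger
        }
    });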

Discovering the Lens — Get Your Audience Engaged

Since Snapchatters have to be at the exact location to experience your Custom Location Lens, you’ll need to address discoverability and user engagement as you develop it. The last thing you want is to build a Lens for a location, only for Snapchatters visiting that location to never realize they could be engaging with it!

Lens Discoverability

Once your Lens has been created and submitted to Snapchat — we’ll cover this in Module 5 — it’s time to promote it. If your Lens is developed for a business location, consider creating signage and promotional posts through your business’s social media channels to generate awareness. If you made a Lens for a local landmark or your school or university campus, consider working with local organizations to put Snapcode signs on location. Additionally, you can set up a Geofilter, which will cause the Lens to appear in the carousel when Snapchatters are nearby.
Big Ben Lens by Arthur Bouffard

Snapchatter Engagement

Custom Location Lenses do not open with the location already tracked, so if a Snapchatter happens to discover your Custom Location Lens, they will have to load it first. When creating your Lens, consider the first thing Snapchatters will see and make it as engaging as possible. An attention-grabbing title, screen imagery, or non-location-based AR content will keep a Snapchatter engaged while the Lens loads. Once it has loaded, encourage them to set up tracking by aiming their phone at the location.
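
One way to structure this handoff, sketched below, is to show the engaging non-location-based content immediately and swap to the location-anchored content once tracking is established. This assumes the Custom Location is driven by a LocatedAtComponent; the onFound event and the input names are assumptions to verify against the current Lens Studio API.

    // Loading-to-tracked handoff sketch. Assumes the Custom Location is driven
    // by a LocatedAtComponent whose onFound event fires once tracking is
    // established; verify the event name against the Lens Studio API.
    // Input names are illustrative placeholders.
    // @input Component.LocatedAtComponent locatedAt
    // @input SceneObject loadingContent   // title card / non-location-based AR
    // @input SceneObject locationContent  // content anchored to the location

    // Show the engaging loading content right away; hide the located content.
    script.loadingContent.enabled = true;
    script.locationContent.enabled = false;

    // Once the location is found and tracking begins, swap them.
    script.locatedAt.onFound.add(function () {
        script.loadingContent.enabled = false;
        script.locationContent.enabled = true;
    });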
A note for Spectacles: If Snapchatters are using Spectacles to experience Custom Location AR, they will have to get closer to the location to set up tracking, as Spectacles offer a larger field of view. If you deploy your Lens to Snapchat and want to make it available on Spectacles, consider creating a dedicated pop-up to guide the tracking setup process.

Snapcode

Scan the Snapcode to see the interactive loading screen created to direct Snapchatters to the Yu and Me Books storefront and guide users through the Custom Location AR tracking setup.

What's Next?