Here are Snapchat developers on ASL, accessibility, and the power of language.

Snap AR: Can you tell us about your role in creating these Lenses?


Jennica Pounds: The ASL-inspired Lenses were an amazing effort that unified much of Snap, and honestly, I’m not sure what made it all come together. I’m Deaf and nonverbal, and it’s always been frustrating that I can’t verbally communicate with my coworkers in my own language. In 2020, I built a recognizer that translated fingerspelling to speech. That prototype was good enough to get a small team funded in 2021.


We tried to productionize fingerspelling in Lens Studio, but we weren’t able to overcome certain corner cases. A good friend of mine found a company named SignAll; we signed a contract with their CEO, Zsolt Robotka, and their team built a well-functioning model for Lens Studio. We set a launch deadline for the International Week of the Deaf, then reached out to teams across Snap and ended up with an amazing international effort involving more than ten teams creating Lenses, Stickers, Bitmoji, and Cameos.


Throughout this, the “Deafengers” (Snap’s Deaf and Hard of Hearing employees) gave feedback at every step of the creative process. Now Lens Studio offers ASL fingerspelling as a template for any Creator to leverage! It’s a dream that started with one prototype but has reached all of Snap.


Ashley Parsons: My role was to manage the Fingerspelling Lens project timeline and to close communication gaps between the SignAll, Cam-Plat Machine Learning, and Cam-Plat AR Engineering teams. I also helped bring key stakeholders together and designate ownership across the cross-functional teams where it was not previously established.


Once the Lenses were in development, I worked with Jennica to bring teams across Snap into the mix by getting ASL designs into their respective features. For example, Cameos implemented animations with short ASL phrases, Bitmoji added designs, and the Organic Lens Team contributed as well. All of this came together for the launch of the Fingerspelling Lenses last September, during International Deaf Awareness Week.


Austin Vaday: I was one of the main feedback providers during the development of the Lens, alongside Jennica Pounds and the other Deafengers. We were the “Deaf eyes,” making sure that all ASL-related assets (including Lenses, Stickers, Cameos, and messaging to our users) were as accurate as possible. This required early and frequent collaboration with our peers at Snap to ensure we created media that our Deaf, Hard of Hearing, and Hearing friends outside of Snap could enjoy and appreciate.


Snap AR: What tools and techniques did you use to create this Lens?


AP: SignAll integrated components of their technology into Snap’s Lens Studio, working with the AR Engineering team, to enable the creation of the ASL fingerspelling Lenses for Deaf Awareness Week and to showcase SignAll’s ASL hand-tracking capabilities. SignAll used Snap’s hand skeleton models, retrained the ML team’s classifiers, and imported their preprocessing and post-processing JavaScript into Lens Studio via a SnapML component.
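
For Creators curious what the post-processing half of such a pipeline can look like, here’s a minimal sketch of a Lens Studio script that turns per-frame classifier output into fingerspelled text. It’s illustrative only, not SignAll’s implementation: the output name "probs", the 26-letter label set, and the thresholds are all assumptions.

```javascript
// Minimal post-processing sketch for a SnapML fingerspelling classifier in Lens Studio.
// Hypothetical setup: an MLComponent (already configured to run every frame) exposes an
// output named "probs" with one probability per letter, A-Z. The names and thresholds
// here are illustrative assumptions, not SignAll's shipped pipeline.
//@input Component.MLComponent mlComponent
//@input Component.Text outputText

var LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
var STABLE_FRAMES = 8;    // frames a letter must stay on top before it is accepted
var MIN_CONFIDENCE = 0.7; // below this, treat the frame as "no letter"

var lastBest = -1;
var stableCount = 0;
var spelled = "";

script.createEvent("UpdateEvent").bind(function () {
    var probs = script.mlComponent.getOutput("probs").data; // Float32Array of scores

    // Argmax over the per-letter probabilities for this frame.
    var best = 0;
    for (var i = 1; i < probs.length; i++) {
        if (probs[i] > probs[best]) { best = i; }
    }

    if (probs[best] < MIN_CONFIDENCE) {
        lastBest = -1;
        stableCount = 0;
        return;
    }

    // Debounce: only append a letter once it has been the top prediction for
    // several consecutive frames, filtering out transitional hand shapes.
    if (best === lastBest) {
        stableCount++;
        if (stableCount === STABLE_FRAMES) {
            spelled += LETTERS[best];
            script.outputText.text = spelled;
        }
    } else {
        lastBest = best;
        stableCount = 1;
    }
});
```

The debounce is the key idea: transitional hand shapes between letters never stay on top long enough to be appended.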


Snap AR: How is the Deaf Community currently represented in AR?


JP: In short: the Deaf community is not represented enough in AR! But it is my belief that the fingerspelling model is the first step not just in the evolution of sign language, but also in a Deaf-led advancement of AR toward linguistic equity.


AV: In the AR community today, Deaf and Hard of Hearing folks are often treated as second-class citizens. The technology is built with the Hearing audience in mind and most often relies on hearing. However, we’re in a very lucky position where we have Deaf leadership here at Snap that can ensure members of our community are treated as first-class citizens, with the same access to such technology as a Hearing person. This takes time to build, of course, but we have a goal of fostering linguistic equity because we want the technology we build at Snap to be accessible for members of our community.


Snap AR: Why is including ASL Lenses important for a platform like Snapchat?


AP: Snapchat is a free app, so when we offer utilities like ASL training, we help our users learn a new language at no cost! We also need to demonstrate in a real way that Snap intends to be truly inclusive, and that means more than broadening our technology to appeal to people of different skin tones, gender identities, or cultural backgrounds; it also means reaching people who don’t share the same physical abilities.


Snap, like a lot of app-based companies, has been quietly dismissing an entire group of people by not having the right tech in place for them to use the app in a way that is tailored to their abilities. ASL training brings the Deaf community into the Snap family by teaching the people around them to use their language, and it even helps Deaf people who have not had formal ASL training themselves.


JP: I love Snapchat as a social media platform for its unique ability to bring people together through the camera. Having accessible, ASL-inspired creative media and Lenses allows the Deaf and Hearing alike to engage and connect with each other. Also, the games introduced by the Lenses are a fun way to learn and practice fingerspelling, which is one of the building blocks of ASL!  


AV: The more ASL tools available, the more we can encourage and help members of the Hearing community learn our language and interact with us! People interested in learning ASL through digital means typically do so by using an app or watching YouTube videos. Now, alongside these existing tools, we offer the ability to reinforce what you’ve learned in AR using feedback and hand-tracking technologies. That can be quite useful for people who want to learn ASL but may not have anyone to practice with, and it opens a new world of possibilities!
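
To make that feedback idea concrete, here’s a tiny sketch of the kind of practice loop a fingerspelling game might run. Note that makePracticeSession is an invented helper for illustration, not a Snap API; its onLetter() callback would be fed by the recognizer’s debounced letter output.

```javascript
// A tiny, hypothetical practice-game loop: check each recognized letter against
// the next expected letter of a target word and report progress.
function makePracticeSession(word) {
    var target = word.toUpperCase();
    var index = 0;
    return {
        onLetter: function (letter) {
            if (index < target.length && letter.toUpperCase() === target[index]) {
                index++; // correct sign: advance to the next letter
            }
            return {
                done: index >= target.length,
                progress: target.slice(0, index), // letters signed so far
                expected: index < target.length ? target[index] : null
            };
        }
    };
}

// Usage: practicing the word "SNAP".
var session = makePracticeSession("snap");
console.log(session.onLetter("S")); // { done: false, progress: "S", expected: "N" }
console.log(session.onLetter("A")); // wrong letter: progress stays at "S"
```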


Snap AR: What were some challenges in creating these Lenses?


AP: Identifying the tones, accents, and styles with which different people gesture with their hands and body. The most difficult part was recognizing that three different people had signed the same phrase, even though each of them signed it a little differently.
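
One standard way to tame that kind of signer-to-signer variation is to normalize the hand-skeleton keypoints before classification. The sketch below is a generic technique, not necessarily what SignAll’s model does, and the keypoint layout (wrist at index 0) is an assumption.

```javascript
// A generic normalization sketch: re-express hand-skeleton keypoints relative
// to the wrist and scale them by hand size, so that where the hand is and how
// big it is stop mattering, leaving only the hand shape for the classifier.
function normalizeHand(keypoints) {
    // keypoints: array of {x, y} joints; keypoints[0] is assumed to be the wrist.
    var wrist = keypoints[0];
    var centered = keypoints.map(function (p) {
        return { x: p.x - wrist.x, y: p.y - wrist.y }; // translate wrist to origin
    });
    // Scale by the farthest joint from the wrist, so large and small hands align.
    var scale = 0;
    centered.forEach(function (p) {
        scale = Math.max(scale, Math.sqrt(p.x * p.x + p.y * p.y));
    });
    return centered.map(function (p) {
        return { x: p.x / (scale || 1), y: p.y / (scale || 1) };
    });
}
```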


Snap AR: What do you hope for in the future of integrating AR technology and the Deaf Community?


AV: The future is bright! There’s tremendous potential for building AR tech for both the Deaf and Hearing communities. First off, AR should be built in an accessible way so that members of the Deaf community can use and enjoy the technology as well. This requires feedback from our internal and external Deaf and Hard of Hearing friends, so we can ensure that the technology is built for us, by us. There are opportunities to build AR tech that can be beneficial for both communities, and that requires us to maintain an open feedback loop with our users.


JP: I think we are only starting to realize the impact the Deaf community will have on AR. Hands are emerging as an important component of AR, and no one understands hands better than the Deaf community.


Snap AR: What drives you to continue creating Lenses and new features?


JP: When texting became popular, language changed to adopt acronyms such as LOL or BBL. I strongly believe that, as we start interacting with AR using our hands, we will start seeing people use their hands more to communicate with each other. That represents an incredible opportunity to achieve linguistic equity between the Deaf and the Hearing. What drives me is a vision of a world where Snap users feel like the products are made for them, including native signers. 


AV: As a Deaf software engineer at Snap, I’m passionate about creating software in general. The AR space is so new, and we’re in an extremely fortunate position where our technology is being built with Deaf eyes and leadership here at Snap. We’re trying our best to create fun experiences and tools for all of our users, and it’s so exciting that there’s so much effort to ensure my community, the Deaf community, is seen. 



As you learned in this discussion, we’re just now beginning to see Snapchat’s potential for inclusion and accessibility when it comes to language. Augmented reality can play a huge role in how we communicate, and these ASL-focused Lenses are proof. Try them out for yourself and let us know what you think. If you're feeling inspired, check out Lens Studio and start exploring entirely new ways to communicate.