For 11 million Deaf and hard-of-hearing individuals in the U.S., effective communication remains a daily struggle. The issue is especially critical for Deaf children—90% of whom are born to hearing parents with little to no experience with sign language. Without early exposure to American Sign Language (ASL), these children risk language deprivation, leading to delayed cognitive development, educational disadvantages, and social isolation.
Despite the availability of ASL programs and online resources, learning sign language remains a challenge. Traditional classes can be expensive and geographically restrictive, while video tutorials lack real-time feedback—leaving learners without the necessary guidance to improve accuracy and fluency. The result is a significant gap in ASL accessibility, preventing families, educators, and allies from developing meaningful communication skills.
Our strategy aimed to bridge this gap by leveraging AI to create an intuitive, interactive, and widely accessible ASL learning experience. This required:
1. Personalized, AI-driven feedback – Giving learners instant corrections to enhance accuracy and build confidence.
2. An open-source ASL dataset – Encouraging community contributions to fuel AI advancements and future accessibility tools.
3. A free, web-based platform – Eliminating cost and geographic barriers to ASL education.
By combining cutting-edge AI with human connection, we sought not just to teach ASL but to drive a systemic shift in how technology can support accessibility. More than a learning tool, Signs represents a movement toward inclusive communication—empowering individuals and shaping the future of AI-driven accessibility solutions.
In collaboration with NVIDIA and the American Society for Deaf Children, we created Signs—a groundbreaking AI-powered platform that transforms ASL learning through interactive, real-time feedback.
How it Works:
1. Step-by-step guidance: A 3D avatar demonstrates each sign from multiple angles, helping learners understand the nuances of ASL.
2. Real-time AI feedback: The platform uses computer vision and machine learning to track users’ hand movements, instantly correcting errors to improve fluency.
3. A growing ASL dataset: Users contribute videos of their signs, expanding an open-source dataset that continuously enhances AI accuracy.
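To make the real-time feedback step above concrete, here is a minimal, illustrative sketch (not the actual Signs implementation) of how corrective feedback can be derived once a hand-tracking model has produced 2D landmark coordinates for the learner's hand. The landmark format, tolerance value, and function names are assumptions for illustration only:

```python
# Illustrative sketch: score a learner's hand pose against a reference sign
# by comparing normalized landmark coordinates. A real system would obtain
# the landmarks from a computer-vision hand-tracking model on each frame.

def normalize(landmarks):
    """Translate so the first landmark (e.g. the wrist) is the origin and
    scale to unit size, so comparison ignores hand position and distance
    from the camera."""
    x0, y0 = landmarks[0]
    shifted = [(x - x0, y - y0) for x, y in landmarks]
    scale = max(max(abs(x), abs(y)) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def pose_error(user, reference):
    """Mean Euclidean distance between corresponding normalized landmarks."""
    u, r = normalize(user), normalize(reference)
    return sum(((ux - rx) ** 2 + (uy - ry) ** 2) ** 0.5
               for (ux, uy), (rx, ry) in zip(u, r)) / len(u)

def feedback(user, reference, tolerance=0.15):
    """Turn the error score into a simple correct / adjust message."""
    err = pose_error(user, reference)
    if err <= tolerance:
        return "Looks right!"
    return f"Adjust your hand (error {err:.2f})"

# Example with tiny synthetic landmark sets (wrist plus two fingertips):
reference_sign = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(feedback(reference_sign, reference_sign))  # matching pose
print(feedback([(0.0, 0.0), (0.0, 1.0), (1.0, 0.0)], reference_sign))
```

Running this per video frame and averaging the error over a short window is one simple way such a system could decide when a sign is performed correctly; production systems typically use learned classifiers rather than a fixed distance threshold.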
By making Signs a free, web-based platform, we removed traditional barriers to ASL education. Anyone with a device and a camera could start learning instantly, making ASL more accessible than ever. Unlike passive learning methods, Signs fosters active engagement, allowing learners to build confidence through direct interaction with AI-powered feedback.
Beyond individual learning, Signs is designed to shape the future of AI-driven accessibility. By crowdsourcing signing videos, the platform continuously refines its recognition capabilities, laying the groundwork for real-time ASL translation in video calls, AI assistants that understand sign language, and other accessibility innovations.
Without relying on paid media, Signs gained rapid traction through organic press coverage, social media buzz, and engagement from the Deaf community. The platform’s widespread adoption and open-source contributions are not only helping individuals learn ASL today but are also paving the way for the next generation of AI-powered communication tools.
Despite launching with zero paid media, Signs quickly gained global attention, proving the urgent demand for accessible ASL education. The platform’s impact was immediate and far-reaching:
* 1 billion+ earned impressions – Signifying global awareness and engagement.
* 20 million+ people reached in the first week – Demonstrating strong organic adoption.
* 25,000+ signs learned in just 10 days – Each representing a step toward breaking communication barriers.
* Featured on CNN, Axios, VentureBeat, and major global media outlets – Driving mainstream recognition of ASL accessibility challenges.
More than just numbers, the true impact of Signs lies in its ability to foster human connections. Each sign learned represents a new conversation, a strengthened relationship, and an empowered Deaf or hearing individual gaining the confidence to communicate in ASL.
Beyond personal learning, Signs is fueling long-term innovation in AI-driven accessibility. With 400,000+ video clips and 1,000+ signs in development, the platform is continuously training AI models to recognize and interpret sign language more accurately. NVIDIA hosts this growing dataset publicly, enabling researchers and developers to advance AI-powered solutions for ASL translation, digital assistants, and other assistive technologies.
By breaking down barriers to ASL education, Signs is actively bridging the communication gap—helping create a world where technology doesn’t just recognize sign language but truly understands it.