Sign language avatars may seem like a great idea, but as with sign language gloves, the devil is in the implementation details. Until now, most have been overly simplified and have not involved people with disabilities in their design or development.

NVIDIA's new SIGNS platform (https://signs-ai.com/) takes a different approach. NVIDIA used artificial intelligence to create more naturalistic American Sign Language (ASL) avatars. Unlike previous attempts, this model was trained on real Deaf signers to improve accuracy. It targets the full range of ASL expression, including nuanced facial movements, precise hand placement, and the flow of movement between signs. This represents real progress in digital inclusion. Facial expressions are critical in ASL, and past avatars failed to capture them effectively. SIGNS recognizes this, along with the importance of correctly positioning the hands in three-dimensional space.

There is still work to do. What happens if someone has Bell's palsy and cannot move one side of their face? What if someone has ulnar neuropathy and cannot keep their pinky in line with the rest of their fingers? What about other forms of signed communication used in the US, like SEE, LSM, CSL, cued speech, and Black ASL, just to name a few?

d/Deaf people are diverse, and tools that promote inclusion must reflect that. More research, testing, and collaboration with the Deaf and disabled communities are needed to ensure AI-driven signing works for everyone. Technology alone doesn't make something accessible. People do. NVIDIA's crowdsourcing of signs for training should make this a more robust system. NVIDIA's SIGNS platform is definitely a step forward in applying AI to sign language.

#AccessibilityTriumphThursday #Accessibility #Inclusion #Disability #AI4Accessibility #SignLanguageTech #DigitalInclusion #ASLInnovation #AccessibleAI #ASL #SignLanguage https://lnkd.in/gcwjS5ab
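The post stresses three requirements for a convincing signing avatar: hand position in 3D space, facial expression, and smooth transitions between signs. As a hedged illustration only (this is not NVIDIA's data format or code, and the field names are invented for the example), animation data for such an avatar might pair 3D hand positions with facial blendshape weights and interpolate between keyframes to produce flowing movement:

```python
from dataclasses import dataclass

@dataclass
class SignKeyframe:
    """One pose in a signing animation (illustrative only)."""
    t: float            # time in seconds
    right_hand: tuple   # (x, y, z) wrist position in metres
    brow_raise: float   # facial blendshape weight, 0.0 to 1.0

def lerp(a: float, b: float, w: float) -> float:
    """Linear interpolation between two scalar values."""
    return a + (b - a) * w

def interpolate(k0: SignKeyframe, k1: SignKeyframe, t: float) -> SignKeyframe:
    """Blend two keyframes so the avatar flows smoothly between signs."""
    w = (t - k0.t) / (k1.t - k0.t)
    hand = tuple(lerp(a, b, w) for a, b in zip(k0.right_hand, k1.right_hand))
    return SignKeyframe(t, hand, lerp(k0.brow_raise, k1.brow_raise, w))
```

For example, sampling halfway between a neutral pose and a raised-brow pose yields a hand midway along its path with the brow half-raised, which is the kind of in-between frame that makes transitions between signs look natural rather than robotic.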
AI Applications That Enhance Digital Accessibility
Explore top LinkedIn content from expert professionals.
Summary
AI applications aimed at enhancing digital accessibility are transforming how people with disabilities interact with technology and the world around them. These innovations prioritize inclusivity by addressing unique needs, such as creating sign language avatars or offering real-time descriptions for the blind.
- Collaborate with communities: Involve people with disabilities during the design and testing phases to create tools that address their diverse needs and lived experiences.
- Leverage real-time tools: Integrate AI-driven solutions, like Seeing AI and Be My Eyes, to provide instant support for navigation, communication, and accessing visual information.
- Expand accessibility efforts: Ensure developments in AI include a wide range of disabilities to foster equity, such as supporting multiple sign languages or offering multimodal feedback for the deafblind.
While much of our focus around GenAI has been on productivity and efficiency gains, the findings from our latest study with EY highlight something even more profound: how AI can transform workplace experiences for employees with disabilities and neurodivergence.

Seeing technology unlock experiences that were previously inaccessible to people with disabilities reinforces my belief in the power of innovation. Microsoft Copilot's ability to foster a fulfilling and inclusive work environment comes through clearly in the personal stories of these employees. Whether someone struggles in virtual meetings or with writing and sending emails, Copilot can make a meaningful difference in their day-to-day work.

It's not just about getting the job done. It's about empowering individuals to participate fully and confidently in their professional lives.

Learn more about how Copilot is enhancing accessibility and forging inclusive ways of working in the study here: https://lnkd.in/gyi8XYcV

#Inclusion #Accessibility
-
Trevor Noah dives into a fascinating topic: "How can AI empower people with disabilities?" Saqib Shaikh, creator of Seeing AI, joins Trevor to show how AI is revolutionizing life for people who are blind or have low vision. He demonstrates how Seeing AI provides detailed descriptions of surroundings, from buildings and film crews to even birdhouses!

Why is this important?
● AI is not just tech; it's a tool for independence.
● It transforms how people with disabilities interact with the world.
● Seeing AI offers real-time information, enhancing daily experiences.

Key takeaways from the video:
1.) Empowerment through technology: AI like Seeing AI opens new possibilities.
2.) Creating independence: no more waiting for assistance; access information instantly.
3.) Breaking barriers: AI helps people navigate spaces, understand environments, and make decisions.

Kudos to Saqib Shaikh and his team! They are leading the way in making AI accessible, practical, and genuinely transformative.

What's next? Imagine a world where every tech innovation includes accessibility from the start. AI is a powerful step forward, but there's so much more to explore.

P.S. Check out the video. It's an eye-opener!
-
Exciting AI + accessibility news for the blind community! Be My Eyes has partnered with OpenAI/ChatGPT to create a groundbreaking accessibility tool that uses AI. Users can point their phone at the scene in front of them, and the phone will provide a visual description and speak back to them in real time for tasks such as hailing a taxi, reading a menu, or describing a monument. This could be a gamechanger for many blind people, enhancing independence and making the world more accessible for them.

As a deafblind woman, it excites me to see a new accessibility tool emerging. This innovation holds great promise, and I'm eager to witness how it empowers the blind community by offering real-time descriptions of their surroundings. Imagine the freedom and confidence this could instill in daily life for blind people, from navigating new places to simply enjoying the beauty of nature.

However, blindness varies widely, so this tool might be more suitable for some people than for others. For example, there are still limitations for the deafblind community. Because blindness is a spectrum, many blind people still have some remaining vision. If they're deafblind like me, they need captions to have full access when receiving auditory information.

I'm curious what blind users will think of the tool once they start to adopt it. While this is a fantastic advancement, there's always a need for continued improvement and iteration. I also care deeply about preventing the harmful impacts of AI, so I hope that is being considered as well.

Accessibility technology is crucial for the disability community. It not only enhances our ability to engage with the world but also promotes independence and equity. What are your thoughts on this new development?

P.S. Here's a cool video on it: https://lnkd.in/etfHehCh

#Accessibility #AI #DisabilityInclusion
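The author's point about deafblind access suggests a concrete design pattern: every AI-generated scene description should be delivered through more than one channel. As a minimal, hypothetical sketch (not Be My Eyes' actual implementation; the description string stands in for real model output), one description could feed both text-to-speech and short caption lines suited to braille displays:

```python
def to_caption_lines(description: str, max_chars: int = 32) -> list[str]:
    """Split a scene description into short caption lines.

    Short lines suit refreshable braille displays and on-screen
    captions better than one long spoken-style sentence.
    """
    lines, current = [], ""
    for word in description.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= max_chars:
            current = candidate
        else:
            lines.append(current)
            current = word
    if current:
        lines.append(current)
    return lines

def deliver(description: str) -> dict:
    """Package one AI description for both audio and caption channels."""
    return {
        "speech": description,                    # sent to text-to-speech as-is
        "captions": to_caption_lines(description) # shown/brailled line by line
    }
```

The design choice here is that accessibility output is multimodal by default: the same information reaches the user whether they rely on hearing, sight, or touch, which is exactly the gap the post identifies for deafblind users.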
Be My Eyes Accessibility with GPT-4o