Real-Time Sign Language Interpretation Using Augmented Reality


Because people with hearing disabilities in the Arabic-speaking world rely heavily on sign language to communicate, there is a critical need for technology that provides sign language interpretation where human interpreters cannot always be present.

As such, a solution is required to provide people with hearing disabilities access to static and rich multimedia content (e.g. video, audio announcements, text, graphics, signage) through Augmented Reality.

The solution will enable people with hearing disabilities to access key information and partake in different activities in a more independent and equitable fashion.

Status: Not Initiated

Target Users

Persons with Hearing Disabilities
Sign Language Users

User Journey

Maryam is 24 years old. She is deaf and uses sign language to communicate. Maryam wants to use her smartphone to access information on static and digital signage in her everyday life.


Maryam is looking at a digital sign, with information written in Arabic. She cannot read the text.


Maryam opens an Augmented Reality sign language interpretation app on her smartphone and points the camera at the sign to access its information.


Once open, the app will identify and interpret the content of the signage and convert it to sign language.


The interpreted content will be presented to Maryam through a sign language avatar displayed on her smartphone or wearable device.
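The journey above implies a three-stage pipeline: recognize the text on the sign, translate it into a sign sequence, and drive the avatar. A minimal sketch of that flow is shown below; every class and function name here is a hypothetical placeholder, since a real implementation would plug in an OCR engine, an Arabic sign language translation model, and a 3D avatar renderer.

```python
from dataclasses import dataclass

@dataclass
class SignSequence:
    """An ordered list of sign glosses for the avatar to animate."""
    glosses: list

def recognize_text(camera_frame: bytes) -> str:
    """Placeholder: extract text from a camera frame (e.g. via OCR)."""
    return "platform 2 closed"  # stub result for illustration only

def translate_to_signs(text: str) -> SignSequence:
    """Placeholder: map recognized text to a sequence of sign glosses."""
    return SignSequence(glosses=text.split())

def render_avatar(signs: SignSequence) -> None:
    """Placeholder: drive the AR avatar with the sign sequence."""
    for gloss in signs.glosses:
        print(f"avatar signs: {gloss}")

def interpret_frame(frame: bytes) -> None:
    """Run the full capture -> translate -> render pipeline once."""
    text = recognize_text(frame)
    signs = translate_to_signs(text)
    render_avatar(signs)

interpret_frame(b"\x00")  # a real captured camera frame would be passed here
```

In practice each stage would run continuously on the live camera feed, with the avatar overlaid on the scene through the device's AR framework.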

Potential Service Features

  • NFC Enabled Identification
  • QR Code Scanner
  • Biometric Identification
  • Saved Favorite Points Of Interest
  • Automated Day Planner
  • Personalized Notifications
  • Predictive Insights
  • Issue Resolution


Issue Statement

As a sign language user, Maryam cannot access information delivered in static text or rich multimedia formats (e.g. audio, video, graphics) in her everyday life.

The inability to access crucial city information, including public transportation schedules, navigation, events, points of interest, and emergency notifications, can limit her ability to live independently and can at times pose a safety hazard.

Expected Key Benefits

Text / Graphics to Sign Language

Translate key information that is being delivered to mass audiences through text and video into sign language (both Arabic and English).

Contextual Information Delivery

Based on the user’s live location, relevant contextual information (e.g. emergency notifications, alternative routes, etc.) can be pushed to the user in sign language.
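Contextual delivery of this kind typically reduces to a geofencing check: a notification is surfaced only when the user's live position falls within its attached radius. The sketch below illustrates that check; the function names, data layout, and 0.5 km threshold are illustrative assumptions, not a specification.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def relevant_notifications(user_lat, user_lon, notifications, radius_km=0.5):
    """Return notifications whose geofence contains the user's position."""
    return [
        n for n in notifications
        if haversine_km(user_lat, user_lon, n["lat"], n["lon"]) <= radius_km
    ]

alerts = [
    {"msg": "Metro station closed, use exit B", "lat": 25.2048, "lon": 55.2708},
    {"msg": "Road works on Main St", "lat": 25.3000, "lon": 55.4000},
]
nearby = relevant_notifications(25.2050, 55.2710, alerts)
# `nearby` holds only the nearby alert; each matching message would then
# be passed to the sign language translation and avatar rendering step
```

The same filter could run server-side to decide which push notifications to send, keeping location processing off the device.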

Audience Expansion

Businesses and public sector organizations can use the solution to broadcast contextualized information in sign language, thereby reaching a customer base that is not traditionally served by conventional advertising channels.

Implementation Analysis

Implementation Timeline: Medium

Technology Commercial Viability: Viable in Short Term
(rating scale: Available Now / Viable in Short Term / Viable in Long Term)

Investment Requirements: High
Key Implementation Considerations


  • User identification and sign-on
  • Collection and synchronization of real-time data from multiple sources (e.g. transportation schedules, emergency response routes, etc.)
  • Identification and instant / real-time sign language translation of relevant data in alternative formats (e.g. text, audio, etc.)
  • Data collection policies
  • Intuitive graphical user interface