Sign Language Interface
Active, founded 5 years ago

The Sign Language Interface project aims to improve the accessibility of technology for people who use sign languages by incorporating sign language recognition, generation, translation, and tutoring into UI/UX design and AI systems.

Tags: sign language recognition, gesture interface, sign language processing, UX, UI

Main goals:

  • To create a sign language search engine that enables users to easily find sign language content (MVP, Minimal Viable Product); a minimal search sketch follows this list.
  • To create an interactive sign language tutoring system that effectively teaches users sign language (Medium Viable Product).
  • To create a machine translation system that accurately and efficiently translates between sign languages and spoken or written language, making communication more accessible for users (Maximal Viable Product).
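
A minimal sketch of what the MVP search could look like, assuming a plain keyword lookup over a sign dictionary; SignEntry, search_signs, and the URLs are illustrative names, not the project's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class SignEntry:
    """One dictionary entry: a written gloss and a link to its sign video."""
    gloss: str
    video_url: str


def search_signs(query: str, dictionary: list[SignEntry]) -> list[SignEntry]:
    """Return entries whose gloss contains the query (case-insensitive substring match)."""
    q = query.strip().lower()
    return [entry for entry in dictionary if q in entry.gloss.lower()]


# Example: narrowing a tiny in-memory dictionary with the prefix "hel".
demo = [SignEntry("hello", "https://example.com/hello.mp4"),
        SignEntry("help", "https://example.com/help.mp4")]
print([e.gloss for e in search_signs("hel", demo)])  # ['hello', 'help']
```

A real search engine would index by gloss and by sign parameters (and possibly by video similarity), but the interface stays the same: a query in, a list of sign entries out.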

Terms & Definitions

  • Minimal Viable Product - The minimum set of features necessary to test the basic functionality of the sign language interface.
  • Medium Viable Product - An expanded version of the MVP with additional features to improve usability and functionality.
  • Maximal Viable Product - The final version of the product, with all the features and capabilities required for full functionality.

The tasks can include:

  • Develop a sign language generation and recognition system

  • Develop a tool for collecting and labeling sign language data, and create a dataset using this tool

  • Use the tool to mark up sign language data, such as videos or images, with linguistic and grammatical information: signs, movements, handshape, hand orientation, location, and facial expression (a possible record layout is sketched after this list)

  • Develop a sign language search engine that allows users to search for sign language content.

  • Develop an interactive sign language tutoring system

  • Continuously monitor and analyse the performance of the system to identify areas for improvement.

  • Design a UI/UX interface that is easy to navigate and understand for sign language users.
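
To make the labeling task concrete, here is a hypothetical record layout for one annotated sign; the class and field names (SignAnnotation, gloss, non_manual, and the example values) are illustrative assumptions, not the project's actual schema:

```python
from dataclasses import dataclass, field


@dataclass
class SignAnnotation:
    """Hypothetical label for one sign token in a video clip."""
    video_id: str
    start_frame: int
    end_frame: int
    gloss: str                    # written-language gloss of the sign
    handshape: str                # handshape code from the labeling guidelines
    orientation: str              # palm and finger orientation
    location: str                 # place of articulation, e.g. "chin" or "neutral space"
    movement: str                 # movement type, e.g. "arc" or "straight"
    non_manual: list[str] = field(default_factory=list)  # facial expression, mouthing, etc.


# Example record the labeling tool might emit for one clip.
example = SignAnnotation(
    video_id="clip_0001", start_frame=12, end_frame=47, gloss="hello",
    handshape="B", orientation="palm-out", location="forehead",
    movement="arc", non_manual=["smile"],
)
```

Storing one such record per sign token would keep the dataset usable both for training recognition models and for parameter-based dictionary search.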

App task examples:

Data labeling

  • A sign language data labeler uses the tool (YouTube)
  • Collection of sign language sentences at NGTU NETI (YouTube)

Sign language recognition

  • Recognition of Russian fingerspelling (YouTube)
  • Recognition of hand configurations and orientations to present examples of words (YouTube)
  • Static one-handed gestures to present examples of words (YouTube)
  • As part of the course 'Designing a Machine Learning System', a Telegram bot is demonstrated that recognizes the shape of the hand and matches it to the corresponding word in Russian Sign Language (YouTube, see 4:42:37-5:03:16); a possible recognition pipeline is sketched after this list
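
The Telegram-bot demo above maps a hand shape to a Russian Sign Language word. The sketch below shows one common way to build such a step, assuming MediaPipe Hands for landmark extraction and a nearest-neighbour match against stored templates for classification; the function names and the template approach are assumptions, not the project's actual model:

```python
import cv2
import mediapipe as mp
import numpy as np


def hand_landmarks(image_bgr: np.ndarray) -> np.ndarray | None:
    """Extract 21 (x, y) hand landmarks from an image, or None if no hand is detected."""
    with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    pts = np.array([[p.x, p.y] for p in result.multi_hand_landmarks[0].landmark])
    return pts - pts[0]  # translate so the wrist landmark sits at the origin


def classify_handshape(pts: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the word whose stored reference landmarks are closest to the observed ones."""
    return min(templates, key=lambda word: np.linalg.norm(pts - templates[word]))
```

A bot would call these two functions on each photo it receives and reply with the matched word; in practice the template matcher would likely be replaced by a trained classifier.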

UI/UX layout

  • Random gesture selection (YouTube)
  • Search for a gesture by word (YouTube)
  • Sequential display of selected gesture parameters for gesture search (YouTube); a sketch of this step-by-step filtering follows this list
  • Search for a gesture by a photo of its performance (YouTube)
  • AdaptisCon#2 in Novosibirsk - presentation of the Russian Sign Language dictionary (YouTube, see 1:48:10-2:22:14 for our speaker)
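
The "sequential display of selected gesture parameters" layout amounts to narrowing the dictionary one parameter at a time. A minimal sketch, assuming each dictionary entry carries its articulation parameters; the Gesture class and the parameter values are illustrative:

```python
from dataclasses import dataclass


@dataclass
class Gesture:
    """Dictionary entry described by its articulation parameters."""
    word: str
    handshape: str
    location: str
    movement: str


def filter_step(candidates: list[Gesture], parameter: str, value: str) -> list[Gesture]:
    """One search step: keep only gestures whose given parameter matches the chosen value."""
    return [g for g in candidates if getattr(g, parameter) == value]


# Example: the user narrows the dictionary parameter by parameter.
dictionary = [
    Gesture("hello", handshape="B", location="forehead", movement="arc"),
    Gesture("thanks", handshape="B", location="chin", movement="straight"),
]
step1 = filter_step(dictionary, "handshape", "B")  # both entries remain
step2 = filter_step(step1, "location", "chin")     # only "thanks" remains
print([g.word for g in step2])
```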

Site, bot & mobile apps

Social media:

About

Email: alexeyayaya@gmail.com - Alexey Prikhodko

  • Alexey Prikhodko - Domain expert, ML developer
  • Julia Prikhodko - Sign language data labeler and informer
  • Mark Belousov - Frontend developer, system analyst
