An app that translates sign language into spoken English using Brooklyn Research Cluster Platform

A team of students from the NYU Tandon School of Engineering built a prototype mobile app that enables hearing people to understand sign language and helps deaf users by translating spoken words into sign language. Their project is part of Verizon's Connected Futures challenge, which, in partnership with NYC Media Lab, supports new media and technology projects from universities across New York City.

The team was led by Zhongheng Li, who was inspired by his friend Fanny, whose parents are deaf. Since there is no universal sign language, Fanny's family had a tough time when they moved to the US. This motivated the students to create an app that could empower millions of deaf people across the globe. The team built the app using machine learning, augmented reality, computer vision, and cloud computing.


Zhongheng Li and his team used the Brooklyn Research Cluster as their high-performance cloud computing platform, hosting a deep learning API that combines OpenPose with a TensorFlow-trained image classification model. Initially, they wanted to use a phone's depth camera for better recognition, but they soon realized that not everyone can afford a high-end smartphone with depth-camera features. Instead, they convert RGB images into skeleton images with the OpenPose library, which improves accuracy and eliminates the need for a depth camera. They also leveraged the power and flexibility of cloud computing to enhance their recognition model. Because the app only needs an RGB camera and all processing happens in the cloud, it is not tied to a particular device or platform; the same framework could be implemented on other technologies such as HoloLens.
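The server-side flow described above can be sketched roughly as follows. This is a minimal illustration, not the team's actual code: `estimate_keypoints` is a hypothetical stand-in for OpenPose, and the joint names, image sizes, and fixed keypoint positions are assumptions made for the example.

```python
# Sketch of the RGB -> skeleton -> classifier pipeline (assumed names).
import numpy as np

def estimate_keypoints(rgb_frame):
    """Hypothetical stand-in for OpenPose pose estimation.

    Returns (x, y) pixel coordinates per joint; here they are fixed
    placeholder values for illustration only.
    """
    h, w, _ = rgb_frame.shape
    return {"shoulder": (w // 2, h // 4),
            "elbow": (w // 2, h // 2),
            "wrist": (w // 2, 3 * h // 4)}

def draw_skeleton(keypoints, size):
    """Rasterize keypoints into a binary skeleton image by connecting
    consecutive joints with straight line segments."""
    img = np.zeros(size, dtype=np.uint8)
    pts = list(keypoints.values())
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        n = max(abs(x1 - x0), abs(y1 - y0)) + 1
        for t in np.linspace(0.0, 1.0, n):
            x = int(round(x0 + t * (x1 - x0)))
            y = int(round(y0 + t * (y1 - y0)))
            img[y, x] = 255
    return img

frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in RGB camera frame
skeleton = draw_skeleton(estimate_keypoints(frame), (480, 640))
# `skeleton` (not the raw frame) is what the TensorFlow classifier would see,
# which is how the team avoids needing a depth camera.
```

The key design point is that the classifier is trained on skeleton images rather than raw RGB pixels, so any phone camera produces input the model can use.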


They chose the Brooklyn Research Cluster for its availability, cost, support, and the resources required to run compute-intensive workloads. Since cost was an important constraint, they picked the Brooklyn Research Cluster over Amazon AWS EC2 instances, which would have charged them $1.14 per hour. That was not a feasible option because they needed the servers for long durations and the cost added up. As full-time NYU students, they get all HPC resources, including compute power and storage, for free.
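To see how quickly that hourly rate adds up, here is a back-of-the-envelope estimate using the quoted $1.14/hour figure. The round-the-clock usage pattern is an assumption; the team's actual utilization is not stated.

```python
# Rough monthly cost at the quoted EC2 GPU rate, assuming 24/7 usage
# over a 30-day month (usage pattern is an assumption for illustration).
rate_per_hour = 1.14
hours_per_month = 24 * 30
monthly_cost = rate_per_hour * hours_per_month
print(f"${monthly_cost:.2f} per month")  # $820.80 per month
```

At roughly $800+ per month per instance, the free access to cluster GPUs was the deciding factor.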

In terms of performance, the Brooklyn Research Cluster was also a better fit than an Amazon EC2 instance: the AWS instance offered an NVIDIA Tesla M60, whereas the BRC provided three NVIDIA P100 GPUs, which gave them better performance. They are thankful to the HPC team for their guidance and support in using these services effectively and for providing detailed instructions on specific topics.


As of now, the app is still in its pilot phase and can detect and translate a few phrases; for example, a user can book an appointment with a medical clinic using sign interpretations. You can watch the demo video to see how the app works.