At times, it must feel like being a stranger in your own home.
The 200,000 deaf and hard of hearing people living in New York and relying on sign language as their chief form of communication must often feel like their neighbors just don’t understand them. After all, sign language is a foreign language to most New Yorkers. And that extends to the officers of the New York Police Department.
So, imagine trying to explain an emergency situation, while under great stress, to a police officer who can’t clearly understand you. Your words are reduced to primitive gestures, and the only way to convey any detail is to physically take the officer to the scene.
Now, however, the NYPD is attempting to solve that problem by embracing video conferencing for the deaf via on-demand, on-the-spot sign language translation services over the internet. It’s a first step toward making New York more accessible, inclusive, and deaf-friendly.
NYPD Video Sign Language Service
The NYPD began its video translation trial back in April by deploying dedicated tablet devices in three precincts. Overseen by the advocacy group Deaf Justice Coalition, the tablets were made available in stations covering Manhattan, Queens, and Staten Island. The initial trial has since proved successful enough to be expanded to four more precincts.
The technology operates the same way as a standard video calling app you might use on your smartphone, like Skype or FaceTime, and the tablets are small enough to be carried around by patrolling police officers.
So, when approached on the street by a person who uses sign language, an officer can instantly place a video call to a waiting remote interpreter and carry out a three-way conversation, face-to-face (to-face). That mobility is what makes the program a potential life-changer for the deaf and hard of hearing. It lets them instantly communicate with police in an emergency, or just for casual assistance, without having to make a trip all the way to the station. It makes for more natural communication, which, in turn, helps establish stronger connections between police and their communities.
Video Translation for Anti-Terrorism
London police recently introduced similar technology to link non-English speakers with video conferencing translators in criminal proceedings. Their devices, however, are restricted to the police station and are intended more as interrogation and investigative tools to aid in anti-terrorism cases. Again, the technology revolves around simple Skype-style video conferencing between detectives, suspects/witnesses, and the remote language translator. Here, though, it’s applied to cut down on delays associated with getting translators to attend interviews in person.
The NYPD model seems more intent on breaking down barriers between police and sign language users, and protecting the latter’s rights when they are in the position of being a “person of interest.” In short, it makes the police force more accessible for the deaf and hard of hearing. Along those lines, it could be expanded to other emergency services, especially ambulance officers, and to the broader social sector, such as government departments.
Going further, on-demand video translation could potentially be used city-wide to radically open up sign language use, and the first steps in this direction are already being taken on the other side of the country.
Video Conferencing for Deaf-Friendly Cities
Newport Beach, California, was earlier this year declared a Deaf-Friendly City for its use of the same video conferencing tablet technology the NYPD is currently testing. The city has placed the tablets at City Hall, libraries, and other public facilities where they are free to use. The official rollout will link with the professional interpretation service Language People Inc., which has previously installed similar devices in schools, hospitals, and even a Newport Beach pizza restaurant.
That Newport program is specifically targeted at public institutions, but the devices could be placed in malls, public transport hubs, and large corporate buildings to create city-wide hotspots where sign language users could be better understood. Language People also provides on-demand foreign language translation, so the service could be expanded to aid non-English speakers as well.
In the near future, the technology could be embedded in a city’s infrastructure using smart objects and the Internet of Things. ATMs could house small video calling screens, as could next-generation parking meters, or the self-service ticketing machines at cinemas and fast food outlets. Any machine that can send a digital signal is a potential link to the internet, and anything that can accommodate a screen could become a link to a video conferencing translator. As our cities become smarter, they’ll also become communications hubs where everyone can be understood.