Sometimes concentrating on what a piece of technology is designed to do blinds us to what it is actually capable of doing.
It doesn’t have to be as dramatic a pivot as Play-Doh’s switch from wallpaper cleaner to child’s toy, or shock-absorbing springs being reimagined as Slinkys.
Just a subtle tweak in intention can put a technology to a whole new use. Take Skype’s translation service, for instance.
What is obviously intended as a way for users divided by language to better understand each other can be used instead as a way of subtitling a video call between speakers of the same language.
What use could this be put toward?
Well, it could become a valuable asset within the deaf community, and aid young sign language users as they learn the subtle gestures and body language cues that make up an adult vocabulary.
Learning Sign Language Online
Video conferencing in general already delivers some major advantages to sign
language users. It’s the only form of long-distance communication that offers real-time, face-to-face conversation. With a little planning and a good webcam, it can convey not only a video caller’s hands and face but their entire body as well–and body language and facial expressions are integral to signing.
That opens up a whole new frontier of communication that recreates the in-person use of sign language and makes it easier to establish long-distance private and professional relationships.
The Federal Communications Commission (FCC) has even moved to create a suite of apps that can make existing video calling platforms and social media more accessible to the deaf and hard of hearing. The Accessible Communications for Everyone program includes a promise to make sign language translators available as third parties to a video call, acting as a go-between for all sides of a conversation.
That initiative obviously holds great potential, but in the meantime Skype offers a simpler way to accomplish at least part of that process for free–even if the technology needs a little improvement.
How Skype Translator Works
As we’ve previously discussed at VC Daily, Skype’s Translator function still needs some fine-tuning. Its shortcomings are easy to spot if you tweak your video calling settings to translate English into English. Once you start chatting you’ll find the real-time subtitles that pop up at the base of the chat window are often wrong. Sometimes they’re not even close.
However, it’s a free service and the technology behind it is relatively new–the limited first phase was formally introduced in 2014–so there’s hope it will improve in the near future.
The technology is more advanced than simple word recognition. It’s a kind of machine learning that adapts with exposure to human conversation, growing more accurate with open use in the real world. It can learn sentence structure, common words and phrases grouped around specific topics, and how to ignore the umms and ahhs that litter our speech. It also has to deal with a range of accents–a conversation between a Texan and a New Yorker can be as challenging as one that includes two different languages–and with the nuances of each language, so there’s good reason it’s lagging a little at the moment.
As it improves–and Skype just added a 10th language to its repertoire, so it’s in this for the long haul–the service could become a crucial asset in teaching sign language online.
Sign Language Courses by Video
There are a number of online sign language courses currently available on the web, and a handful of smartphone apps. Some are free, but most follow the typical distance-education format of pre-recorded video lessons, DIY modules, and emailed correspondence.
More interesting is the rise of online sign language tutors and teachers who work via live conversation. This is the group that could make the best use of subtitled conversation through Skype Translator.
Within the framework of a regular Skype call, teacher and student could go about their lesson just as they would in person. There’s enough room to include all the body language and facial cues, but now the instructor can incorporate verbal insights transformed into written language in real time. As long as the student can read reasonably well, this could smooth the explanation of complex communication concepts. It gives the teacher a fallback to a common, known language when gaps remain in a student’s signing.
It could also promote the use of lip reading, pairing as it does a facial gesture with an instant written equivalent. Skype also offers instant messaging as a running commentary, but the advantage of live translation is that both parties can maintain eye contact at all times and keep their hands actively conversing rather than hovering over a keyboard. A simple tweak of Skype’s intentions thus creates a whole new range of uses–ones even Skype Translator’s creators might not have imagined when they built the service.