William Mager: Could Google Glass be the Deaf community’s ‘disruptive innovation’?

Posted on May 7, 2013 by William Mager



The video below is an interesting time capsule. It’s Eric Sykes, talking to Jack Ashley on See Hear about his special glasses, which are in fact hearing aids. They use bone conduction technology to transmit sound into his inner ear.

A year from now, Google will be selling a pair of glasses that transmit sound to the wearer through bone conduction. However, unlike the specs that Eric Sykes wore, Google Glass has the potential to change deaf people’s lives forever.

Google Glass

Google Glass is essentially a wearable computer that sits on the user’s face. A single glass lens over one eye displays colour images, in a similar fashion to the head-up display you see in fighter jets or sports cars. The resolution has been described as equivalent to a 21-inch television viewed from a distance.

Google Maps with Glass

You can adjust the headset so that the images appear directly in your line of vision, or just above or to the side. When the text and images are in your line of vision, they sit at a middle focal distance, similar to watching a 3D film through glasses and seeing subtitles appear on their own distinct focal plane.

Google Glass will use a combination of touch and voice commands, and function in a broadly similar way to an Android mobile phone. The video below gives a good idea of what you can see and do with the headset.

The publicity around Google Glass so far has been a mix of the negative and the sceptical. Lots of people in the United States are walking around wearing developer prototypes, earning the nickname ‘Glassholes’. There are also privacy concerns – photographs and 720p video can be recorded with a single blink.

One Seattle bar has already banned Glass wearers from the premises.

There are a few other niggling issues with the headset. It doesn’t fold like a pair of glasses, making storage difficult. Battery life is poor. People have reported that the display doesn’t look great.

Despite all that, despite the reliance on voice commands… I’m excited. From what I’ve seen and read so far, Google Glass could be a truly disruptive innovation that smashes down access barriers for deaf people in a way that’s never been done before.

The term “disruptive innovation” originates from a book by Harvard professor Clayton Christensen, The Innovator’s Dilemma. In simple terms, it’s a technology that does not just alter the market, but creates an entirely new one.

Gillette adding extra blades to its razors is an incremental innovation. Apple making the iPhone screen a centimetre taller is incremental.

Past disruptive innovations include the leap from physical music formats to MP3s, and the transition from the Encyclopaedia Britannica to Wikipedia.

Google Glass has the potential to be a bigger innovation for deaf people than the textphone, teletext subtitling, mobile phone texting, cochlear implants, or relay phone services. I’ll give some examples of how Google Glass might be used by deaf wearers.

CINEMA SUBTITLES ANY TIME, ANYWHERE

If you want to watch a subtitled film at a cinema in the UK, you have to keep an eye out for specially scheduled subtitled screenings, which aren’t always at a convenient time, and which sometimes go ahead without subtitles because of technical faults or oversights on the part of the cinemas themselves. Sony has cinema glasses in development, but these aren’t available to the public.

There are already lots of iOS and Android apps that provide real-time subtitles for films – but watching a film with subtitles on a second screen is difficult, as you have to keep changing focus from near to far.

If similar apps were made available for Google Glass, deaf people would be freed from the tyranny of cinema scheduling and subtitle availability. We could go to see any film we wanted, wherever and whenever we wanted – pop the Google Glass headset on, use the built-in camera to recognise the opening logos of the film and sync the subtitles to the film as it plays out on the big screen.
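To make the syncing idea concrete, here’s a minimal sketch in Python of how a hypothetical Glass subtitle app might work: detect a known reference point (the moment the opening logo appears), then show each subtitle cue at its offset from that moment. The cue data and the logo-detection step are placeholders – real logo recognition would need computer vision on the camera feed.

```python
import time

# Placeholder subtitle cues: (start seconds, end seconds, text).
# In a real app these would come from a downloaded subtitle file.
CUES = [
    (1.0, 4.0, "A long time ago, in a galaxy far, far away..."),
    (6.0, 8.0, "[dramatic fanfare]"),
]

def detect_film_start():
    """Stub for the camera step: return the wall-clock time at which
    the opening logo was recognised. Here we just assume it's now."""
    return time.time()

def play_subtitles(cues, film_start):
    """Show each cue at its offset from the detected film start."""
    for start, end, text in cues:
        delay = film_start + start - time.time()
        if delay > 0:
            time.sleep(delay)      # wait until this cue is due
        print(text)                # on Glass: draw on the display
        time.sleep(max(0.0, end - start))
        print()                    # clear between cues

play_subtitles(CUES, detect_film_start())
```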

LIVE SUBTITLING WHEREVER YOU GO

Google already has real-time speech recognition software running on YouTube and on its Android platform. It’s not perfect… but it’s getting better. It also seems to work better with American accents than British ones – I recently watched Patton Oswalt’s legendary Star Wars filibuster on Parks and Rec, and found the automatic captions surprisingly good.

Google is working to improve the accuracy of its automatic voice recognition all the time, most recently through the acquisition of a neural network startup.
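To give a flavour of how little glue code the basic idea needs, here’s a short, purely illustrative Python sketch using the third-party speech_recognition library, which can send microphone audio to Google’s web recogniser. This isn’t anything Google ships for Glass – just a sketch of a live-captioning loop.

```python
import speech_recognition as sr  # third-party; microphone input also needs PyAudio

recognizer = sr.Recognizer()

def live_captions():
    """Listen in short bursts and print each recognised phrase."""
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # one-off calibration
        print("Listening... (Ctrl+C to stop)")
        while True:
            audio = recognizer.listen(source, phrase_time_limit=5)
            try:
                # Send the audio to Google's recogniser and print the text.
                print(recognizer.recognize_google(audio, language="en-GB"))
            except sr.UnknownValueError:
                pass  # nothing intelligible in this burst; keep listening
            except sr.RequestError as err:
                print("Recognition service unavailable: %s" % err)
                break

live_captions()
```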

If Google’s automatic offering isn’t good enough, you could always book remote captioning for your Google Glass headset via companies like 121 Captions. Their service is pretty good for meetings, conferences and more – and reasonably priced too. The only issue could be ensuring the audio quality is good enough for the remote captioner to transcribe accurately.

DISCREET IN-VISION INTERPRETING

Veteran interpreter Roger Beeson recently wrote an excellent article about his experiences, in which he envisioned a future where most sign language interpreting is done remotely.

That’s perfectly feasible, but one issue for any deaf person talking to a hearing person via a sign language interpreter is eye contact. That eye contact is broken whenever the deaf person looks away from the hearing person who is speaking, either to an interpreter in the room or to a video monitor.

With Google Glass, having a sign language interpreter on screen means that you can look directly at the person you’re speaking to – AND see the sign language interpreter clearly. This is a first, and could make a subtle yet key difference to how deaf people interact with hearing people socially and professionally. The only bit I’m not sure about is how it would work if the deaf person wanted their own signing interpreted!

NIGHT VISION AND RADAR FOR THE DEAFBLIND

Some deaf users may also have visual impairments such as poor night vision, tunnel vision and more. Google Glass could be a useful tool here, using the built-in camera to project an enhanced, zoomed-in ‘night vision’ mode that helps people navigate safely in the dark.

Or perhaps Google Glass could work with Google Maps and GPS to create a live, real-time ‘radar’ showing where the wearer is at all times and how close they are to roads and other hazards, similar in function to the Soliton Radar of the Metal Gear Solid games.
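The maths behind a proximity alert like that is simple enough to sketch. The Python snippet below uses the standard haversine formula to measure the distance from the wearer’s GPS position to a list of nearby hazards; the hazard names and coordinates are invented for illustration.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Invented hazards near the wearer: (name, latitude, longitude).
HAZARDS = [
    ("busy road crossing", 51.5033, -0.1196),
    ("canal edge", 51.5050, -0.1150),
]

def radar(lat, lon, warn_within_m=50):
    """Return the hazards close enough to warrant an on-screen alert."""
    alerts = []
    for name, hlat, hlon in HAZARDS:
        distance = haversine(lat, lon, hlat, hlon)
        if distance <= warn_within_m:
            alerts.append((name, round(distance)))
    return alerts

print(radar(51.5032, -0.1195))  # -> [('busy road crossing', 13)]
```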

Soliton Radar as seen in Metal Gear Solid

THE CONNECTED HOME

Many deaf people’s homes have a Mountcastle silent doorbell, a Bellman fire alarm, a Nightingale baby alerter, and more besides.

A recent Glass patent shows that Google is looking into various connectivity options – meaning that everything that happens in your home could be transmitted to, and controlled by, your headset.

You’d receive visual or vibrating alerts for anything from an oven pinging to a doorbell ringing. Not only would that be cool, it would also cut down on the number of different gadgets cluttering up the home.
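As a rough illustration of the plumbing involved: the devices could all publish their events to one shared hub, with the headset subscribed as a listener that turns each event into a visual alert. The sketch below is pure Python and entirely hypothetical – a real setup would use a wireless protocol such as Bluetooth or a home messaging standard – but the publish/subscribe shape would be much the same.

```python
# A minimal publish/subscribe hub: home devices publish events,
# and the headset subscribes and turns them into alerts.
# All the device and event names here are made up.

class AlertHub:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, source, event):
        for callback in self.subscribers:
            callback(source, event)

def headset_alert(source, event):
    # On Glass this might draw a card or trigger a vibration;
    # here we just print the visual alert.
    print("[%s] %s" % (source.upper(), event))

hub = AlertHub()
hub.subscribe(headset_alert)

hub.publish("doorbell", "Someone is at the front door")
hub.publish("oven", "Timer finished")
hub.publish("baby monitor", "Sound detected in the nursery")
```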

Those are just a few possibilities. I’m sure there are people out there thinking up new ways to use Google Glass that are beyond our current scope of imagination.

It isn’t perfect for deaf people – not everyone will find that the bone conduction works for them, but that could be solved by a Bluetooth hook-up to hearing aids or cochlear implants. Deaf people still have an uneasy relationship with voice commands, though.

A recurring complaint about Google Glass is that it just doesn’t look that cool. People are starting to appear out and about in public wearing them, and they look like, well, people with some sort of visual impairment. Again, I don’t think that’ll be a problem for deaf people used to body-worn technology.

My gut feeling is that Google is a pretty deaf-friendly company. As well as honing its real-time captioning, it has done little things here and there like enabling sign language interpreting in Google Hangouts.

There’s also another Google Glass patent that I discovered in the course of writing this. As we know, Glass can be controlled by voice and touch… but also by gesture recognition.

Google Glass Gestures

By waving your hands in front of you, you can access different apps and commands.

A brave new world where everyone’s walking around, signing? Now that would be fun to see…

Further Reading:
A detailed infographic on how Google Glass works
Thoughts after A Week with Google Glass
BBC News: Google Glass – Will we Love it or Hate it?
Guardian: Are you a Google Glass Half Full or Half Empty kind of person?
Wired: The Inherent Dorkiness of Google Glass
Techcrunch: Using Google Glass is Weird
Eric Schmidt: Google Glass Critics Afraid of Change

William Mager is a Contributing Editor for Limping Chicken. He is also an award-winning director for film and TV, who made his first film aged 14 when he “set fire to a model Audi Quattro and was subsequently banned from the school film club for excessive pyromania.” He’s made short films, dramas and mini-series, and works for the BBC. Find out all about his work at his personal website, read his blog, and if you’re on Twitter, follow him here.
