Friday, 21 August 2009

Making TV widgets talk

The Royal National Institute of Blind People (RNIB) is working with industry to develop a talking TV UI. A little more detail is available on the Ocean Blue Software site, but I'm looking forward to seeing (or rather, hearing) it at IBC to find out how the EPG part of it works.

I imagine that it's not just a case of speaking out the contents of a whole screenful of guide information. What the blind or partially-sighted user needs is to be able to navigate around a screenful of information to control the order in which things are spoken, much as a sighted person would look around by moving their eyes.

For this to work effectively, the navigation paths would need to be logical and structured so that minimal navigation is needed to reach the information you want. I doubt you'd get that simply by mapping left/right/up/down positions on the screen display to left/right/up/down navigation, so each "talking" application must be structured with spoken interaction in mind.
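To make the idea concrete, here is a minimal sketch (with invented guide data and class names - nothing here comes from Ocean Blue's actual design) of tree-structured navigation for a talking EPG: up/down move between siblings at the current level, right descends into the focused item, and left backs out a level, so the user never has to deal with raw screen coordinates. Speech output is just a returned string; a real system would feed it to a TTS engine.

```python
class Node:
    """One navigable item: a spoken label plus optional children."""
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

# Hypothetical guide fragment: channels -> programmes -> details.
guide = Node("Guide", [
    Node("BBC One", [
        Node("Six O'Clock News, 6pm", [Node("National and international news.")]),
        Node("Holby City, 8pm", [Node("Drama set in a busy hospital.")]),
    ]),
    Node("Channel 4", [
        Node("News, 7pm", [Node("In-depth news analysis.")]),
    ]),
])

class TalkingNavigator:
    """Maps the four arrow keys onto a tree instead of screen positions."""
    def __init__(self, root):
        self.path = [(root, 0)]  # stack of (node, focused child index)

    def _current(self):
        node, i = self.path[-1]
        return node.children[i]

    def speak(self):
        return self._current().label

    def down(self):  # next sibling
        node, i = self.path[-1]
        if i + 1 < len(node.children):
            self.path[-1] = (node, i + 1)
        return self.speak()

    def up(self):  # previous sibling
        node, i = self.path[-1]
        if i > 0:
            self.path[-1] = (node, i - 1)
        return self.speak()

    def right(self):  # descend into the focused item
        child = self._current()
        if child.children:
            self.path.append((child, 0))
        return self.speak()

    def left(self):  # back out one level
        if len(self.path) > 1:
            self.path.pop()
        return self.speak()
```

The point of the tree is that the application author decides what the siblings at each level are - channels, then programmes, then detail fields - so the spoken ordering is a design decision rather than an accident of screen layout.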

Which raises an interesting question - with the trend towards 3rd party extension applications and widgets for interactive environments, would every application author have to design their app with a "talking" mode of operation (including navigation paths optimised for a talking UI), or is it possible to design the UI building-blocks such that a useful talking mode is achieved without the app-author considering it?

A widgets environment already forces some navigation structure onto applications (so that the user can navigate between widgets without all widget authors cooperating with one another) - could this be extended to make widgets talk more easily?
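One way the container could do this - again purely a sketch, with a made-up contract and invented widget data - is to require every widget to expose a spoken one-line summary and an ordered list of focusable items. The container then owns all of the talking navigation (next widget, next item) and no individual widget author has to design a talking mode:

```python
from dataclasses import dataclass, field

@dataclass
class Widget:
    """Hypothetical contract: a name plus items, spoken in list order."""
    name: str
    items: list = field(default_factory=list)

    def summary(self):
        return f"{self.name}, {len(self.items)} items"

class Container:
    """The widgets environment provides talking navigation for free."""
    def __init__(self, widgets):
        self.widgets = widgets
        self.w = 0   # index of focused widget
        self.i = -1  # -1 means widget-level focus (speak the summary)

    def speak(self):
        widget = self.widgets[self.w]
        return widget.summary() if self.i < 0 else widget.items[self.i]

    def next_widget(self):
        self.w = (self.w + 1) % len(self.widgets)
        self.i = -1  # reset to summary when changing widget
        return self.speak()

    def next_item(self):
        widget = self.widgets[self.w]
        if widget.items:
            self.i = (self.i + 1) % len(widget.items)
        return self.speak()
```

The trade-off is the one the container already makes for visual navigation: widgets give up some freedom over interaction in exchange for consistent behaviour that every user (and every screen reader) can rely on.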

Sighted users can also benefit from this area of work. For example, imagine if you could phone your STB from the office and interact with the talking interface to set a recording. Or if the TV remote control had a phone-like handset built into the back, you could check your stocks-and-shares widget without disturbing the video that others are watching.


matttvgenius said...

Hi Richard

Matt from TV Genius here - we've been doing some work with the RNIB on talking text reminders and a TV guide that uses IVR technology; we've also done some remote-record projects (albeit not voice-activated) on web and mobile. We've also put a "web style" TV guide on a remote control that has a decent screen and wireless capabilities - so I can look up the EPG without disturbing my wife's viewing of Holby City! In reality it would be hard to have a talking EPG for remote record due to the high number of TV shows available - currently 50,000 shows plus 90,000 catch-up shows at any given time. It's quite easy for sighted people to scan an EPG grid and take in lots of information quickly. One way to get round this for visually impaired people is to build wish lists of favourite stuff and pull in recommendations and picks of the day to present a personal TV guide.

Dr Rob said...

Hi Matt,

It's Rob actually, not Richard this time!

It sounds like you have your fingers in a lot of interesting pies there!

Narrowing down the navigation possibilities by applying preferences and recommendations is good - a conversation is easier if the other party knows something about you.

But I think that users build a "map" in their mind of "places" they can get to in a UI, and dynamically generating the UI pathways conflicts with that. The impact is greater when you can't see the UI. How can we improve this?