Friday, 28 December 2007

Talking iPhones - how Apple will integrate speech synthesis

There are two clues that point to Apple introducing a revolutionary voice interface on iPhone:
  1. Fantastic new Text to Speech functionality in Leopard
  2. Apple's patent activity in the area of speech recognition
Before getting an iPhone, I used a Nokia N61i. That phone made fantastic use of speech. For incoming calls, it would speak the caller's name - no need to record a voice tag first, because it used text to speech. Similarly, voice dialling worked without first recording voice tags. That opens up your entire contacts database to a voice interface - very useful, and it's hard to imagine that Apple won't add something similar to iPhone, especially when you consider that Apple has considerable IP in this area, perhaps even more than Nokia.
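Just to make the idea concrete, here's a minimal sketch of spoken caller ID using AVSpeechSynthesizer from AVFoundation - a much later API (iOS 7 onwards) that obviously wasn't available on the 2007 iPhone. The announceCaller function and the hard-coded name are my own illustrative assumptions, not anything Apple has shipped.

    import AVFoundation

    // Keep a reference to the synthesizer so it isn't deallocated mid-utterance.
    let synthesizer = AVSpeechSynthesizer()

    // Hypothetical helper: speak the caller's name for an incoming call.
    func announceCaller(_ name: String) {
        let utterance = AVSpeechUtterance(string: "Call from \(name)")
        utterance.voice = AVSpeechSynthesisVoice(language: "en-GB") // any installed voice
        synthesizer.speak(utterance)
    }

    // In a real phone app the name would come from the contacts database.
    announceCaller("Alice Example")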

Apple has clearly invested hugely in the area of Speech for Mac OS X 10.5 Leopard. The new System Voice, "Alex", is amazingly good quality and, to be honest, a bit wasted on the Mac, where the applications of speech are somewhat limited (except for blind or partially sighted users). Apple surely has some other application in mind. A Voice Interface (the combination of Speech Recognition and Text to Speech) does, however, have obvious applications on a portable communications device with a headphone or headset - i.e. iPhone.
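As a rough illustration of how a developer can get at these voices today, the snippet below lists the installed system voices and picks out Alex by the identifier Apple publishes in AVFoundation (AVSpeechSynthesisVoiceIdentifierAlex). This is a sketch against the modern API, and whether Alex is actually installed on a given device is an assumption.

    import AVFoundation

    let alexDemoSynthesizer = AVSpeechSynthesizer()

    // List every installed voice so we can see what the system offers.
    for voice in AVSpeechSynthesisVoice.speechVoices() {
        print("\(voice.name) [\(voice.language)] \(voice.identifier)")
    }

    // Use Alex if it is installed on this device.
    if let alex = AVSpeechSynthesisVoice(identifier: AVSpeechSynthesisVoiceIdentifierAlex) {
        let utterance = AVSpeechUtterance(string: "Hello, I'm Alex.")
        utterance.voice = alex
        alexDemoSynthesizer.speak(utterance)
    }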

Here's my guess at how Apple may integrate Speech into iPhone OS X v2:
  • Speak caller ID for incoming calls
  • Voice dialling for outgoing calls
  • Text to Speech support in all Applications - maybe triggered by a voice command, such as "Speak Screen"
  • Voice Interface for Text Messaging, so that you can conduct text exchanges without even looking at the screen
  • Speak Artist & Title info before or after audio track (optional setting, obviously!)
  • Speech recognition to browse the music library and select the next track (see the sketch after this list)
  • Spoken Calendar alerts
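For the speech recognition side, here's the kind of thing I have in mind, sketched with Apple's modern Speech framework (SFSpeechRecognizer, iOS 10 onwards - nothing like this was public in 2007). The audio file path, the "next track" command and the playback action are all made-up placeholders.

    import Speech

    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized else { return }

        // Recognise a short spoken command from a recorded audio file.
        let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
        let request = SFSpeechURLRecognitionRequest(url: URL(fileURLWithPath: "/tmp/command.m4a"))

        // Keep the returned task if you need to cancel it; here we just let it run.
        _ = recognizer?.recognitionTask(with: request) { result, error in
            guard let result = result, result.isFinal else { return }
            let command = result.bestTranscription.formattedString.lowercased()

            // Act on the recognised command; a real player would call its own skip API here.
            if command.contains("next track") {
                print("Skipping to the next track")
            }
        }
    }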
