Soon, voice assistants like Siri and Google Now will learn the principles of conversation: You won't have to restart a conversation if the voice assistant doesn't understand your request. It would just need to identify what it doesn't understand, and ask for clarification. (Of course, it's difficult to teach a computer to know when it doesn't know something!)

"You can go down these blind alleys, where it misunderstands you and it's about to do something that's not as you intended, and there's no easy way out of that other than restarting," Kaplan said. "That's very different than what a human conversation would be."

"It's never going to be perfect," Kaplan said. "They don't have to get it right all the time as long as it's easy to repair the misunderstandings. I understand 80% of what my wife says to me, but if I get something wrong - I don't take out the garbage, I empty the dishwasher instead - it gets corrected, quickly. This notion of repair, recognizing when things go off track on either side - crossing that boundary is going to be important."

Of course, having a human-like conversation with our devices is only part of the recipe for a hands-free future. Our cars, our homes, and our health: These things can be controlled and monitored by our iPhones - and Siri, by extension.

Mike Thompson, executive VP and general manager of Nuance's mobile division, has witnessed the evolution of his company's work on personal assistants: first on feature phones, and eventually on mobile devices and the mobile web. Thompson argues that personal assistants need to be specialized: the experience on a smartphone is different from that on a wearable device, which is different from a virtual reality device, which is different from a car. We want our personal assistants to do things relevant to those devices, designed around their capabilities. "It's such a complex world out there, we're working with hundreds of companies to build personal assistants for individual use cases," Thompson tells us.