We’re addicted to connectivity, even when we’re behind the wheel of our car, but in-car personal assistants and voice recognition could make the obsession safer and more intuitive.
Transport is changing. Stepping into a metal box, turning on the engine and driving to our destination completely disconnected from the world around us is now seen as archaic. We live in a world of constant information streams, and need to stay connected even when we are in our cars.
But staying connected behind the wheel isn’t easy. Staring at and tapping a touchscreen while driving isn’t safe, so new interaction methods need to be introduced, and automotive firms are taking technologies found in consumer devices and integrating them into the vehicle.
How many times have you asked Siri a question, had Alexa remind you to do something, or used the Google Assistant to order food? Voice is the control method that can bring connectivity into the vehicle cabin without distracting us from the road.
Personal assistants are moving into the car, and the functionality and benefits they offer will become even more apparent as cars continue their journey towards becoming fully autonomous.
Now, companies including Harman and Nuance are developing systems that either allow you to take Siri, Alexa, and Google Assistant into your car, or are developing new, automotive-specific technologies that serve the same purpose, but with even greater functionality.
“With the rise in use of voice-enabled devices coupled with consumers’ always-on connected lifestyle, communicating clearly on the go, particularly in your car, is more important than ever before. As a result, in-vehicle communication is evolving to be a critical component of the overall passenger experience,” said Michael Mauser, president of lifestyle audio at Harman.
Harman’s approach to the new world of in-car personal assistants is to use its AudioworX proprietary, open-framework development platform. It allows the company to offer modular, scalable voice systems across vehicle segments, compatible with the full range of personal assistants from Apple, Amazon, Google and others. These all bring into the cabin the ability to use voice to control your phone contacts, play music and much more, but that’s just the beginning.
Nuance, one of Harman’s competitors, is looking further forward, at how voice interaction and the integration of personal assistants can be used as vehicles take greater control of the act of driving, and cabin occupants have more time to conduct other tasks.
Nuance’s Dragon Drive demonstrates the firm’s next step in its attempt to create a more human-like, conversational experience for drivers and passengers – an experience that is reliable, accurate, safe and intuitive.
The system uses a combination of eye tracking and natural language understanding, allowing users to interact with points of interest outside the car, to get general information, opening hours, ratings and more, simply by asking the personal assistant about places they see as they travel.
“Enhanced context capabilities and the ability to have a collaborative dialogue ensures users can further specify their request or ask additional questions about the point of interest,” said Lior Ben-Gigi, director of product management at Nuance. “This combination ensures a very intuitive interaction with the outside world, that is becoming more like interacting with a human.”
Imagine travelling through a town you’d never been to before, seeing something of interest and wanting to know more about it. Nuance’s eye tracking system would be able to detect where you were looking, and with a simple command, such as, “What’s that place?”, the personal assistant would be able to give you any details you needed.
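The article doesn’t describe how such a query is resolved internally, but the idea can be sketched: combine the gaze direction from eye tracking with the bearings of nearby points of interest, and answer from whichever one the driver is most plausibly looking at. The following Python sketch is purely illustrative; the data, function names and the 15-degree tolerance are assumptions, not Nuance’s actual implementation.

```python
# Hypothetical points of interest near the car, each with a bearing
# (degrees clockwise from the vehicle's heading) and some metadata.
POINTS_OF_INTEREST = [
    {"name": "Old Mill Cafe", "bearing": 42.0, "hours": "08:00-18:00", "rating": 4.5},
    {"name": "Town Museum", "bearing": 95.0, "hours": "10:00-17:00", "rating": 4.2},
    {"name": "River Park", "bearing": 310.0, "hours": "dawn-dusk", "rating": 4.8},
]

def angular_difference(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    return abs((a - b + 180) % 360 - 180)

def resolve_gaze_query(gaze_bearing, tolerance=15.0):
    """Return the POI closest to where the user is looking, if one lies
    within the tolerance cone around the gaze direction; else None."""
    candidates = [
        (angular_difference(gaze_bearing, poi["bearing"]), poi)
        for poi in POINTS_OF_INTEREST
    ]
    diff, poi = min(candidates, key=lambda c: c[0])
    return poi if diff <= tolerance else None

# "What's that place?" while looking roughly 40 degrees to the right:
poi = resolve_gaze_query(40.0)
if poi:
    print(f"That's {poi['name']}, rated {poi['rating']}, open {poi['hours']}.")
```

A production system would of course fuse head pose, GPS position and live map data rather than a fixed list, but the core step, intersecting a gaze ray with geotagged places, is the same.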
Nuance would also like to take the system a step further, improving vehicle safety.
In addition to understanding what users want to do, the company’s system can also sense how they feel. Using “emotion AI” technology supplied by its partner Affectiva, combining in-cabin cameras that analyse facial expressions with analysis of tone of voice, the assistant understands drivers’ and passengers’ cognitive and emotional states. From this information the assistant can adapt its behaviour, changing both its response style and tone of voice to match the situation. This technology could enhance safety on the road by preventing distracted, drowsy and impaired driving.
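In outline, that adaptation is a mapping from a detected driver state to a response style. The sketch below is a deliberately crude illustration of the idea only; the state labels, thresholds and input scores are invented for this example and do not reflect Affectiva’s or Nuance’s actual models.

```python
# Hypothetical mapping from a detected driver state to how the
# assistant should respond; labels and actions are illustrative only.
RESPONSE_STYLES = {
    "drowsy":   {"tone": "alert",   "action": "suggest a rest stop"},
    "stressed": {"tone": "calm",    "action": "shorten responses, hold non-urgent prompts"},
    "neutral":  {"tone": "neutral", "action": "respond normally"},
}

def classify_driver_state(eye_closure, voice_stress):
    """Toy classifier: eye_closure and voice_stress are scores in [0, 1]
    that a real system would derive from camera and microphone signals."""
    if eye_closure > 0.6:
        return "drowsy"
    if voice_stress > 0.7:
        return "stressed"
    return "neutral"

def adapt_response(eye_closure, voice_stress):
    """Pick a response style for the current driver state."""
    style = RESPONSE_STYLES[classify_driver_state(eye_closure, voice_stress)]
    return f"[{style['tone']} tone] {style['action']}"

print(adapt_response(0.8, 0.2))  # a drowsy driver triggers the alert style
```

Real emotion-AI pipelines are learned classifiers over many facial and vocal features rather than two thresholds, but the downstream step, conditioning the assistant’s tone and behaviour on the inferred state, is what the paragraph describes.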
Like Nuance, Harman is likely to continue developing its personal assistant technology to offer a variety of functions, making its systems ever more user-friendly, adaptable and intelligent, something that could become even more important as vehicles begin to drive themselves.
How long we will have to wait until we can speak to our cars and have them understand and respond in a natural, unobtrusive manner remains to be seen. Current voice systems, which admittedly are only demonstrators, remain flaky and have been known to freeze when servers go down, something that wouldn’t be accepted in a mass-market technology.
But the future is definitely voice. Rightly or wrongly we crave connectivity even when we are behind the wheel of our car, at which point touchscreens become a dangerous liability, and personal assistants offer a means to remain in contact with the outside world.