Having A Bad Day? Your Car Will Know. | MHH International
It seems that the march of high-tech gadgetry continues unabated. By 2022 it is expected that virtually all new prestige cars will offer voice recognition. Building on that, the next step for the vehicles of tomorrow could be to pick up on tiny changes in our facial expressions, as well as modulations and inflections in our speaking voice.
Advanced systems – utilising microphones and in-car cameras – could learn which songs we like to hear when we are stressed and on which occasions we prefer silence. Interior lighting could also complement our mood, although how it will light a black and thunderous state of mind is a matter for wonder.
It seems manufacturers are well on the road to developing the empathetic car, which might tell you a joke to cheer you up, offer advice when you need it, remind you of birthdays and keep you alert on a long drive.
Cloud-based voice control is anticipated to be available on 75% of new high-end cars by 2022, and it is predicted that future systems will evolve into personal assistants that shuffle appointments and order takeaways when drivers are held up in traffic jams. Imagine that: roadside pizza deliveries!
We Speak Your Language, Human
This very summer, Ford’s in-car connectivity system, SYNC 3, will enable some drivers to connect to Amazon’s virtual assistant Alexa, and it offers 23 different languages and many local accents. By accessing cloud-based resources, cars of the future could enable even more drivers to speak the local language wherever they are, which has to be better than speaking your own tongue, only louder.
Voice commands like ‘I am hungry’ (to find a restaurant) and ‘I need coffee’ have already brought SYNC 3 into personal assistant territory. As the next step, drivers will be able to use not only their native tongue, spoken in their own accent, but also their own wording, for more natural speech. No doubt most other car makers have similar technology in the pipeline. Isn't science wonderful?
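For the curious, the idea behind commands like ‘I am hungry’ is that the system maps a spoken phrase to an intent, which then triggers an action such as a restaurant search. The sketch below is a deliberately simplified, hypothetical illustration of keyword-based intent matching; the intent names and keyword lists are invented here and are not Ford’s actual SYNC 3 interface.

```python
# Toy illustration of how a spoken phrase might be mapped to an intent.
# Intent names and keywords are hypothetical examples, not a real car maker's API.

INTENTS = {
    "find_restaurant": ["hungry", "restaurant", "food"],
    "find_coffee": ["coffee", "caffeine"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose keywords appear in the utterance."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if any(keyword in words for keyword in keywords):
            return intent
    return "unknown"

print(match_intent("I am hungry"))    # find_restaurant
print(match_intent("I need coffee"))  # find_coffee
```

Real systems use cloud-based natural language understanding rather than fixed keyword lists, which is what lets drivers phrase requests in their own words.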
Apple CarPlay, for example, provides a simplified way to use the iPhone interface on a car’s touch screen, giving users access to Siri Eyes-Free voice controls, as well as Apple Maps, Apple Music, Phone, Messages, and a variety of third-party apps. Android Auto™ delivers Google Maps and music to a car’s screen while enabling voice controls for phone calls and messaging.
And There’s More
Gesture and eye control, already beginning to appear, will ultimately enable drivers to answer calls by nodding their head, adjust the volume with short twisting motions, and set the navigation with a quick glance at their destination on a map.
Where will it end? How much control do you want your car to have over your life? Rather than simply picking up a possible future life partner for a first date, what happens if the technology actually selects that partner for you? Where’s the ‘off’ button?