Learning to trust and communicate with autonomous cars is simpler than it may seem, and it might even include a trip to your favorite restaurant.
Increased use of self-driving cars might mean fewer accidents and smoother traffic flow; it could also change the way you relate to machines. With trials beginning as early as 2021 in the UK, developers and city officials seem ready to embrace autonomous driving, but are you? What mental barriers will you have to overcome to accept self-driving cars as a part of your life? Let's take a look at how learning to trust machines can transform the autonomous car into our driving companion.
If only my car had a brain…
Accepting self-driving cars boils down to how you deal with risk and develop trust. Think of it this way: you probably have zero tolerance for fatalities caused by airplane crashes. Yet every year, approximately 1.25 million people die in car accidents. While you certainly don't condone fatalities involving cars, you are more likely to tolerate them.
Dr. David Freeman, Director at Consumers Union, a non-profit dedicated to unbiased product testing, says this is because you can see yourself in the drivers who weren't wearing seatbelts or missed the red light; drivers who made critical yet human errors. An airline company is not human, so you're much less likely to empathize with it.
This concept is related to 'theory of mind': the idea that we recognize that other people (not machines or corporations) have minds with complex mental states and emotions just like our own. Theory of mind underpins emotions like empathy and compassion, which allow us to connect with others. It's also what compels you to lock eyes with the driver behind the wheel of a car as you cross an intersection: “If they see me, I'm safe.” But how do you make eye contact with an algorithm? This is one of the critical issues developers of Advanced Driver Assistance Systems (ADAS) need to understand in order to create more human-centred automation. Manufacturers need to design ways for you to better relate to a machine that doesn't have a brain.
Engines have emotions too
In 2017, Honda unveiled its New Electric Urban Vehicle (NeuV), a self-driving concept car that uses an automated network assistant called “HANA” to analyze and respond to data about driver preferences and behaviour. “HANA” also has an 'emotion engine', something like a machine brain. Building an emotion engine into autonomous cars is one way ADAS designers are changing the way we can relate to machines.
Using cameras and sensors, “HANA”'s emotion engine can analyze your actions, facial expressions, voice and heart rate. After cross-referencing this data against your calendar, it can determine whether you're relaxed or stressed. If “HANA” concludes that you're anxious, it might use a POI database like HERE's Places to recommend a route that passes by your favorite restaurant. If the system includes HERE's advanced mapping services, “HANA” could also avoid construction sites and other blockages that might further raise your stress levels. (Remember “KITT”, the talking car from the 1982 hit TV series Knight Rider?)
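Honda hasn't published how HANA's emotion engine works, but the flow described above, read some signals, classify the driver's mood, then pick a route accordingly, can be sketched in a few lines. Everything here (function names, signals, thresholds, route fields) is invented for illustration:

```python
# Toy sketch of the "emotion engine" decision flow. HANA's internals are not
# public, so every name, signal, and threshold below is invented.

def estimate_stress(heart_rate_bpm: int, back_to_back_meetings: int) -> str:
    """Classify the driver as 'relaxed' or 'stressed' from two example signals."""
    # A real system would fuse camera, voice, and biometric data; this
    # stand-in just checks a heart-rate threshold and calendar load.
    if heart_rate_bpm > 90 or back_to_back_meetings >= 3:
        return "stressed"
    return "relaxed"

def pick_route(stress: str, routes: list[dict]) -> dict:
    """Prefer a calmer route (e.g. past a favorite spot) when stressed."""
    if stress == "stressed":
        calm = [r for r in routes
                if r["passes_favorite_spot"] and not r["has_roadworks"]]
        if calm:
            return min(calm, key=lambda r: r["minutes"])
    # Otherwise just take the fastest route.
    return min(routes, key=lambda r: r["minutes"])

routes = [
    {"name": "highway", "minutes": 18,
     "passes_favorite_spot": False, "has_roadworks": True},
    {"name": "riverside", "minutes": 24,
     "passes_favorite_spot": True, "has_roadworks": False},
]
mood = estimate_stress(heart_rate_bpm=102, back_to_back_meetings=4)
print(pick_route(mood, routes)["name"])  # -> riverside
```

The point isn't the thresholds, it's the shape: a pipeline from raw signals to a mood estimate to a routing preference, the same structure the article attributes to HANA plus a POI and map database.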
Machines can learn and so can we
For a machine to relate to you, to “make eye contact”, developers will need to perfect machine learning: a branch of artificial intelligence (AI) focused on building systems that learn and improve from data without being explicitly programmed. When machines learn and respond to your behaviour in predictable, regulated ways (as “HANA” does), you'll be more likely to trust them. As with people, you'll trust autonomous cars when you feel you can predict their actions.
One solution comes from Drive.ai, a company that has trialled self-driving vans in Texas. Its vehicles carry LED signs on all sides, among the vans' most prominent features, that respond to the car's environment with messages aimed at pedestrians and other road users: they can tell a pedestrian “Waiting for You”, or “Going Now, Please Wait”. This kind of technology might seem advanced, or perhaps a little strange, but remember that cranking a lever, pressing a button and touching a screen are also ways you learned to interact and communicate with machines. (And you're pretty used to doing that, aren't you?)
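At its core, an exterior sign like this is just a mapping from what the vehicle is doing to a human-readable message. Drive.ai hasn't published its sign logic, so the states and fallback below are invented; only the two on-screen messages come from the article:

```python
# Illustrative only: a minimal vehicle-state -> LED-message mapping.
# The state names and fallback text are invented; the two messages
# "Waiting for You" and "Going Now, Please Wait" are from Drive.ai's vans.

LED_MESSAGES = {
    "yielding_to_pedestrian": "Waiting for You",
    "about_to_move": "Going Now, Please Wait",
}

def sign_text(state: str) -> str:
    """Return the message the exterior LED panels display for a state."""
    # Fall back to a generic notice for states the mapping doesn't cover.
    return LED_MESSAGES.get(state, "Self-Driving Vehicle")

print(sign_text("yielding_to_pedestrian"))  # -> Waiting for You
```

The design choice worth noting is the fallback: a sign that can go blank or show nothing for an unmapped state would undermine exactly the predictability that builds trust.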
Are we there yet?
A new study from the Capgemini Research Institute suggests that your trust in self-driving cars will continue to grow over time; in fact, it predicts that the number of people willing to accept autonomous vehicles will double within five years. The study also revealed key measures that could ease the transition, including avoiding overstatement of the car's capabilities (autonomous cars are not personal assistants, yet) and prioritizing and clearly communicating safety features.
When a car can respond to you as an individual rather than a compilation of data, then you can consider it a little more human. Some ADAS analysts say this is decades away. In the meantime, you can look forward to learning how to relate to a car-brain and to road trips with a companion that wants to go to the same restaurant as you.