
How autonomous cars ‘talk’ to their passengers across three senses


HERE built a prototyping platform to demonstrate a way to increase trust in autonomous cars, and we’ve been talking to Alex Mangan from the automotive products division at HERE to find out more.

In a world in which drivers can drive manually, hand control over to their cars, and then take control back later in the journey, a number of different situations can arise. Alex explains: “We wanted to use that journey and our prototyping platform to tell a story about how the combination of navigation with a very well done location-centric HMI (Human Machine Interface) can be merged with highly accurate, highly precise real-time content to make a better consumer experience.”

Switching control back and forth between the car and the driver (known as Level 3 automation) is particularly complicated, and Alex says it’s a level of complexity that Google has chosen not to deal with. “We think we’re uniquely positioned in this space to have an insight into how these experiences can be brought to life,” says Alex, “and we can make this transition phase more acceptable for drivers and make them trust their cars, which in turn will help to drive adoption of these kinds of technologies.”

HERE believes that getting people comfortable enough to fully trust taking their hands off the wheel requires a stepped approach. Those initial steps will be brought to life through connected ADAS functions: the car will automatically adjust its speed, for example, based not only on the car in front or the speed limit, but also on variable message signs along the road that change by the minute. Or the car may refuse to change lanes because there’s a car in the driver’s blind spot.
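To make that concrete, here is a minimal sketch of how such inputs might be arbitrated. All names, data shapes and thresholds are illustrative assumptions, not HERE’s actual software: the car simply adopts the most restrictive of the available speed sources and vetoes a lane change while the blind spot is occupied.

```python
# A minimal sketch of rule-based arbitration between speed sources.
# All names, data shapes and thresholds here are illustrative assumptions,
# not HERE's actual software.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RoadContext:
    posted_limit_kmh: float                    # static speed limit from the map
    variable_sign_limit_kmh: Optional[float]   # live limit from a variable message sign, if any
    lead_vehicle_speed_kmh: Optional[float]    # speed of the car in front, if detected
    blind_spot_occupied: bool                  # sensor flag for the adjacent lane

def target_speed_kmh(ctx: RoadContext) -> float:
    """Adopt the most restrictive of the available speed sources."""
    candidates = [ctx.posted_limit_kmh]
    if ctx.variable_sign_limit_kmh is not None:
        candidates.append(ctx.variable_sign_limit_kmh)
    if ctx.lead_vehicle_speed_kmh is not None:
        candidates.append(ctx.lead_vehicle_speed_kmh)
    return min(candidates)

def lane_change_permitted(ctx: RoadContext) -> bool:
    """Refuse the manoeuvre while the blind spot is occupied."""
    return not ctx.blind_spot_occupied

ctx = RoadContext(posted_limit_kmh=100, variable_sign_limit_kmh=80,
                  lead_vehicle_speed_kmh=95, blind_spot_occupied=True)
print(target_speed_kmh(ctx))       # 80 -> the live sign overrides the static limit
print(lane_change_permitted(ctx))  # False -> car in the blind spot
```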


Fully immersive experience


Alex explains that HERE’s prototyping platform aims to create a fully immersive experience, starting with multi-channel feedback that takes into account not just the visuals shown to the driver, but also vibrations and sound. “HMI isn’t just what’s on the screen, but also the vibrations felt through the wheel or seats, or the sounds if something unexpected is happening.”


“If an ambulance is coming up from the right, how can we move the sound up through the car? We could start at the rear right and move it up to the front right of the car to give spatial context to the driver, perhaps. The ambulance use case is something we used in the demonstration.”
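As a rough illustration of that moving-sound idea, the sketch below cross-fades an alert between the right-side speakers as the ambulance passes. The speaker layout and the linear gain law are assumptions for illustration.

```python
# A toy sketch of moving an alert sound through the cabin: cross-fade the
# gain between the right-side speakers as the ambulance passes. The speaker
# layout and the linear gain law are assumptions for illustration.
def speaker_gains(progress: float) -> dict:
    """progress: 0.0 = ambulance at the rear right, 1.0 = alongside the front right."""
    p = max(0.0, min(1.0, progress))  # clamp to [0, 1]
    return {
        "rear_right": 1.0 - p,   # fades out as the ambulance draws level
        "front_right": p,        # fades in to pull the driver's attention forward
        "rear_left": 0.0,
        "front_left": 0.0,
    }

for step in (0.0, 0.5, 1.0):
    print(step, speaker_gains(step))
```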

Route abstraction


Then there’s route abstraction: stripping non-essential information out of the experience and giving the driver only the critical details. HERE created a visualisation of a circular map with the driver’s current position at the centre. Traffic details run around the outside, and the driver can scroll a finger along the route to preview what the traffic looks like ahead.


“The whole point is we don’t need a map that’s fully 3D and overwhelming in detail,” says Alex. “It’s just the critical information we need to be aware of when the car is driving itself.”
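One way to picture the underlying transformation: map each route segment’s share of the remaining distance to an arc on the ring, so traffic conditions wrap around the driver’s position at the centre. The data shapes below are assumptions for illustration, not HERE’s implementation.

```python
# A sketch of the circular route abstraction: the car sits at the centre and
# the remaining route wraps around it as arcs coloured by traffic condition.
# The data shapes are assumptions for illustration, not HERE's implementation.
import math

def route_to_arcs(segments):
    """segments: list of (length_km, condition) tuples along the route.
    Returns (start_angle, end_angle, condition) arcs covering the full ring,
    so a finger scrubbed around the circle previews the traffic ahead."""
    total = sum(length for length, _ in segments)
    arcs, angle = [], 0.0
    for length, condition in segments:
        sweep = 2 * math.pi * (length / total)  # arc size proportional to distance
        arcs.append((angle, angle + sweep, condition))
        angle += sweep
    return arcs

route = [(5, "free"), (2, "heavy"), (8, "free"), (1, "closed lane")]
for start, end, condition in route_to_arcs(route):
    print(f"{math.degrees(start):6.1f} - {math.degrees(end):6.1f} degrees: {condition}")
```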


Map design


Even though a map shouldn’t be overwhelming in detail, its design still matters. HERE can provide more compelling visuals for the map, Alex says, rendered in a way that improves on what drivers have today:


“If you’re going to be parking near your destination, we can render the map in a way that shows the end location with every other building suppressed, and only two nearby parking lots shown in 3D and highlighted in blue. This kind of map rendering is distilling the details and showing what I really need.”
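A toy sketch of that distilled rendering logic might look like the following; the style schema is invented for illustration and is not HERE’s rendering API.

```python
# A sketch of "distilled" map styling: suppress every building except the
# destination and the recommended parking lots, which get 3D extrusion and a
# highlight colour. The style schema is invented for illustration and is not
# HERE's rendering API.
def style_for_building(building_id, destination_id, parking_ids):
    if building_id == destination_id:
        return {"visible": True, "extrude_3d": True, "colour": "red"}
    if building_id in parking_ids:
        return {"visible": True, "extrude_3d": True, "colour": "blue"}  # highlighted parking
    return {"visible": False}  # every other building is suppressed

for b in ["office_tower", "destination_hq", "lot_a", "lot_b", "museum"]:
    print(b, style_for_building(b, "destination_hq", {"lot_a", "lot_b"}))
```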

Augmented reality


Augmented reality also plays a part in the future of connected and autonomous cars: HERE overlays content onto the screen for simple things like the next manoeuvre.


“We’ll have an arrow that comes up to say you’ll need to turn right on the next street, and unlike current navigation, we’ll actually have the arrow highlighting the exact street you need to turn down, sitting on that corner in your windscreen and removing any confusion.”
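Pinning an arrow to a real-world corner boils down to projecting a 3D position in the car’s frame onto the display. Here is a simple pinhole-camera sketch of that projection; the focal length and display resolution are illustrative assumptions.

```python
# A pinhole-camera sketch of anchoring the AR arrow: project the corner's
# position in the car's coordinate frame (metres) onto display pixels.
# The focal length and 1280x720 display resolution are illustrative assumptions.
def project_to_screen(x, y, z, focal_px=800, cx=640, cy=360):
    """x: metres right of the car, y: metres up, z: metres ahead.
    Returns pixel coordinates, or None if the point is behind the car."""
    if z <= 0:
        return None
    u = cx + focal_px * (x / z)   # horizontal pixel
    v = cy - focal_px * (y / z)   # vertical pixel (screen y grows downward)
    return (round(u), round(v))

# The corner of the next right turn: 40 m ahead, 6 m to the right,
# 1.2 m below eye level -- this is where the arrow's base gets drawn.
print(project_to_screen(6.0, -1.2, 40.0))  # (760, 384)
```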


Environmental awareness


Finally, there’s environmental awareness. “We’re taking real-time content from the cloud,” says Alex, “and we can tell the car things that are happening based on what we know, but it also works the other way around.” So when something unexpected happens, the car can place an incident on the map, telling the vehicles around it that something just happened and to be careful. It’s this insider knowledge that could make your connected car a lot smarter than the average driver.
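A rough sketch of that two-way flow: the car packages what it has observed as an incident report for the cloud, and other vehicles are warned when they come within range. The message shape and the proximity check are assumptions for illustration.

```python
# A sketch of the two-way flow: the car packages an unexpected event as an
# incident report for the cloud, and nearby vehicles are warned. The message
# shape and the flat-earth proximity check are assumptions for illustration.
import json
import time

def build_incident_report(lat, lon, kind, source_vehicle):
    return {
        "type": kind,                          # e.g. "sudden_braking", "debris"
        "position": {"lat": lat, "lon": lon},
        "reported_by": source_vehicle,
        "timestamp": time.time(),              # lets stale reports expire
    }

def should_warn(vehicle_pos, incident, radius_km=2.0):
    """Crude proximity check, fine at city scale near this latitude."""
    dlat = (vehicle_pos["lat"] - incident["position"]["lat"]) * 111.0  # km per degree latitude
    dlon = (vehicle_pos["lon"] - incident["position"]["lon"]) * 67.6   # ~cos(52.5 deg) * 111
    return (dlat ** 2 + dlon ** 2) ** 0.5 <= radius_km

report = build_incident_report(52.5309, 13.3847, "sudden_braking", "vehicle_42")
print(json.dumps(report, indent=2))
print(should_warn({"lat": 52.5400, "lon": 13.3900}, report))  # True -> within 2 km
```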

Philip Barker

