Automated Driving

How much can we actually control autonomous cars?
If you’re riding in an autonomous car, and you want to go faster, what do you do? When asked this question, people often change their answer on a single condition: Does the car you’re riding in belong to you, or is it a car for hire?

Auto manufacturers are aiming to provide cars with Level 4 autonomy on the road by 2021. As we rapidly draw closer to having access to AI-driven vehicles, some tough questions will rise to the surface.

One question that commonly comes up is, “What happens if I want the AI to drive differently?” It’s a tough scenario, and many people in the industry are actively working toward an answer.

To navigate toward what that answer might be, we can consider how we exert control over the means of transportation that are in wide use today. To begin, let’s look at public transportation.

We have expectations from public transport

When you board a train, a bus, or a subway car, your starting expectation is that it’s safe to be on that vehicle. This is probably so taken for granted that you’ve never given it a thought, but remember that multiple entities have thoroughly tested and approved the vehicle’s features and safety standards. Its operators have been trained, and they will (with very few exceptions) stick to a defined path.

Ideally, this leaves you with a comfortable ride that you don’t have to think about. You can then get back to reading, or thinking, or having a conversation with your travelling companion. (Hopefully, you’ve never tried to convince a bus driver to take a different route.)

We should carry a similar expectation of safety when we get into a Lyft, an Uber, or a taxi. The vehicle is maintained, the system has oversight, the driver is qualified, and you likely don’t need to worry about the road. You’re free to do other things while you’re in transit from one place to another. But there is a difference in this example.

Unlike public transport, in a car-for-hire you expect to exert some control over the car. You can tell the driver to drive differently, or to take a different route. You can ask them to stop or to pull over. The driver will then, hopefully, take your direction and alter course accordingly.

Is it unreasonable to expect the same from an autonomous car-for-hire?

If you were to get into an AI-driven Lyft, Uber, or taxi, it’s highly likely that you’d be able to alter or control the route via your mobile app, or via an interface built into the car. Just like a human driver, the AI navigating the car would take your input into account, evaluate what it can follow, and ask for clarification on what it can’t.

And here we arrive at an extremely important point. In the case of a car-for-hire, a human driver and an AI driver are alike in that both will take your suggestions, but will not necessarily follow them. This is critical when considering a scenario where you might, intentionally or unintentionally, instruct the driver to break the law.

If you tell an AI to go the wrong way down a street, we don’t imagine that it should follow that command. But we hope that the user experience designers at work on these systems are building in feedback, so that, again like a human driver, the system informs you that it can’t take a left because it’s a one-way street.
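To make the idea concrete, the behavior described above, accept what’s legal, explain what isn’t, could be sketched roughly like this. This is a toy illustration: the function name, the request strings, and the rules are invented for this post, not drawn from any real autonomous-driving system.

```python
# Hypothetical sketch of a request-evaluation step for a passenger's
# command to an AI driver. All rules here are illustrative only.

def evaluate_request(request, road_info):
    """Accept a passenger request only if it is legal and safe;
    otherwise return an explanation, like a human driver would."""
    if request == "turn_left" and road_info.get("left_is_one_way"):
        return (False, "Can't turn left: that's a one-way street.")
    if request == "increase_speed" and road_info.get("at_speed_limit"):
        return (False, "Already at the posted speed limit.")
    # Nothing rules the request out, so the AI can act on it.
    return (True, "Okay, adjusting the drive.")

# A passenger asks for a left turn onto a one-way street:
ok, feedback = evaluate_request("turn_left", {"left_is_one_way": True})
# ok is False, and feedback explains why, instead of silently refusing.
```

The important design point is the second element of the return value: the system never just ignores you, it tells you why it won’t comply, mirroring what a human driver would say.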

So the level of control we exert over a Car as a Service is going to have its limits. And, as with public transportation and taxis, we would be smart to expect those limits.

What happens when this takes place in a car that we own?

By nature, we control our cars. After all, we are the drivers. We pay for the car, the registration, the insurance, the gas – it’s our property. When we purchase a car equipped with highly assisted driving or autonomous driving systems, should we expect that those systems will do exactly what we say regardless of safety or legality?

The answer for now is: no.

In the case of owning an autonomous car, it might be better to think of the AI system as your personal chauffeur. As in the car-for-hire example above, the self-driving systems in your personal vehicle will have to evaluate your requests, then act on them in a manner that meets safety and legal requirements.

In terms of levels of automation, only Level 5 cars can be built without a steering wheel or pedals. That design represents a point where the vehicle owner has agreed, in fact trusted, that the car will make all critical decisions on safety. Perhaps they’ll even give the car a name like Alfred, or Jeeves.

But in the near term, we can expect we’ll continue to have steering wheels and pedals in all our cars. For any car below Level 5, that’s a critical matter for safety. But, for many drivers, it will remain in place for those times when you simply want to take the wheel and drive.

Bradley Walker
