In the previous part of this series, we closed by asking: what happens when an autonomous car detects an unmapped object in the road?
Answer: the car stops.
The car’s object detection sensors always have priority. Common sense, right? However, this scenario raises a good question. The world, and its roads, are constantly changing. How does a live map stay perpetually current? Answer: intelligent aggregation.
As we’ve discussed before, an autonomous car drives down the road constantly detecting objects like poles, signs, and lane markers. The car compares those objects with what the HD Live Map says and orients itself accordingly. Along the way, there are bound to be discrepancies. The system must understand those discrepancies and respond appropriately.
One type of discrepancy stems from the car itself. Not every car has its sensors mounted in the same place. A stop sign that one car detects as five feet away might be detected as eight feet away by another. The algorithms that position the car relative to its sensors and nearby objects must account for those differences. But that’s a relatively easy thing to fix compared to the next example.
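The idea of accounting for sensor placement can be sketched in a few lines. Everything below is illustrative, not HERE’s actual pipeline: the class names, offsets, and coordinates are assumptions, chosen only to show how detections from differently equipped cars become comparable once each reading is translated into a common vehicle frame.

```python
# Sketch: normalizing detections across cars with different sensor placements.
# All names and values here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SensorMount:
    # Sensor position relative to the vehicle's reference point, in meters.
    forward_offset: float
    lateral_offset: float

def to_vehicle_frame(detection_xy, mount):
    """Translate a detection from the sensor's frame into the vehicle's
    frame, so reports from differently equipped cars line up."""
    x, y = detection_xy
    return (x + mount.forward_offset, y + mount.lateral_offset)

# Two cars see the same stop sign, but their sensors sit in different spots.
car_a = SensorMount(forward_offset=1.5, lateral_offset=0.0)
car_b = SensorMount(forward_offset=3.0, lateral_offset=0.5)

# Raw readings differ, yet after normalization both agree on the sign's position.
print(to_vehicle_frame((8.5, 2.0), car_a))  # (10.0, 2.0)
print(to_vehicle_frame((7.0, 1.5), car_b))  # (10.0, 2.0)
```

In a real system this would be a full 3D transform with sensor orientation, but the principle is the same: correct for mounting differences before comparing anything against the map.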
If the object isn’t obstructing the vehicle path, the car may take no action that the driver would notice. However, the car will try to identify the object. Is it a sign? Is it a tree or a pole? Is it a photographer with a tripod? Whatever it discovers is reported to the cloud, and the process of aggregation begins.
This is where the intelligence of the AI in the cloud comes into play. If a single car detects a discrepancy, and no other cars passing through the area detect any issue, the system can likely conclude that it was not a permanent object, or at least not an object that needs to be added to anyone else’s map.
On the other hand, if multiple cars that pass through detect the same discrepancy in the same place, the system can flag that the issue needs further investigation. The more cars with more sensors that report an issue in an area, the higher the level of confidence the AI has that something needs to be changed.
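The aggregation logic described above can be sketched as a simple counter keyed by location: the cloud records which distinct vehicles reported a discrepancy, and only flags a map change once enough of them agree. This is a minimal sketch; the threshold, the location keys, and the class itself are assumptions, not HERE’s real service.

```python
# Sketch of cloud-side aggregation: flag a map change only when several
# independent vehicles report the same discrepancy. Values are illustrative.

from collections import defaultdict

CONFIDENCE_THRESHOLD = 3  # assumed: distinct vehicles needed before flagging

class DiscrepancyAggregator:
    def __init__(self):
        # location key -> set of vehicle ids that reported it
        self.reports = defaultdict(set)

    def report(self, location, vehicle_id):
        """Record one vehicle's discrepancy report for a map location."""
        self.reports[location].add(vehicle_id)

    def confidence(self, location):
        """Confidence grows with the number of distinct reporting vehicles."""
        return len(self.reports[location])

    def needs_update(self, location):
        return self.confidence(location) >= CONFIDENCE_THRESHOLD

agg = DiscrepancyAggregator()
agg.report("tile_42:sign_17", "car_a")      # one car alone: likely transient
print(agg.needs_update("tile_42:sign_17"))  # False
agg.report("tile_42:sign_17", "car_b")
agg.report("tile_42:sign_17", "car_c")
print(agg.needs_update("tile_42:sign_17"))  # True: multiple cars agree
```

Using a set of vehicle ids (rather than a raw count) means one car driving past the same sign repeatedly can’t inflate confidence on its own, which matches the point that it is *independent* observations that matter.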
As a real-world example, multiple cars in an area recently detected a discrepancy in a road sign. It turned out that a local construction crew had added a second sign below an existing one to divert truck traffic. This change was detected, updated, and put into distribution in under 24 hours.
This is why we refer to HERE HD Live Map as self-healing: the majority of the work and updates can be done automatically, without human intervention.
The HD Live Map is designed to handle all of these examples, and many more we haven’t touched on. It manages multiple map layers, collects an immense amount of data, and has the intelligence to use that data efficiently to keep itself persistently up to date. It’s already empowering better driving, and will continue to pave the way toward reliable autonomous driving.