Self-Driving Cars and Street Signs

19th Jan 2018

Are self-driving cars capable of recognizing street signs? Human drivers easily recognize most signs on the roadway, but can cars learn to do the same? A self-driving car is essentially a robot-vehicle hybrid: machines can learn, and they are good at automating tasks and offering assistance, but how well they can understand the full variety of signs found on the road is still up for debate.

Autonomous vehicle (AV) providers like Google and Uber are trying to answer these questions. The biggest issue they have encountered is that traffic signage is not standardized, either within a single nation or globally, so there is no single dataset for machines to learn. One day, signs may be able to communicate with AVs directly, but for now, if AVs are going to be on the roads, they must learn to read signs.

The Current Challenges of Autonomous Vehicles and Signs

AVs need to be sign-literate because they still share the road with humans. They also have to understand variations in signage: combinations of colors and shapes convey specific information. The basics of stop, yield, and speed limit signs aren't the issue. The harder cases are regional signs, such as crossings for different types of animals or roundabout guidance, since these traffic solutions aren't used uniformly across the US. AVs need to be able to handle that variance.

One strategy in use today is for AVs to learn signage from photographs of actual signs. A related approach trains on street-level imagery, which is available in abundance from sources like Google Maps. The software used in AVs has shown the ability to learn thousands of signs from multiple countries. Researchers have used neural networks: computer systems that study many examples of the same object and learn to distinguish between subtle differences.
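
To make the neural-network idea concrete, here is a minimal sketch in PyTorch of the kind of image classifier this work relies on. The architecture, the 32x32 input size, and the 43-class count (borrowed from the public GTSRB traffic sign benchmark) are illustrative assumptions, not any vendor's production system.

```python
# Minimal sketch of a convolutional network for traffic sign
# classification. Input size (32x32 RGB) and class count (43, the
# size of the public GTSRB benchmark) are assumptions for illustration.
import torch
import torch.nn as nn

class SignClassifier(nn.Module):
    def __init__(self, num_classes: int = 43):
        super().__init__()
        # Two conv blocks learn color/shape features, e.g. the red
        # octagon of a stop sign vs. a triangular yield sign.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = SignClassifier()
batch = torch.randn(4, 3, 32, 32)  # four placeholder sign crops
logits = model(batch)              # one score per sign class
print(logits.shape)                # torch.Size([4, 43])
```

Trained on enough labeled crops, a network like this maps a photograph of a sign to its most likely class; real systems differ mainly in scale and in the breadth of their training imagery.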

As with any type of learning, recognition improves with experience: the more imagery an AV is exposed to, the better it gets at interpreting road signs. Engineers and experts are also working to find the best cameras to accompany LIDAR, radar, and other sensor technologies. These image sources not only offer different versions of a sign, they also capture signs in different weather and environments. In fog or rain, signage looks different and is harder to decipher, so AVs have to learn to read signs in these conditions, too.
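
Weather-degraded views can also be simulated in software so a classifier sees them during training. The sketch below is an assumption for illustration, not any company's pipeline: it crudely mimics fog by blending an image toward white, and rain by adding pixel noise.

```python
# Minimal sketch of weather-style data augmentation, assuming sign
# images are float tensors in [0, 1]. Parameters are illustrative,
# not calibrated to real camera optics.
import torch

def add_fog(img: torch.Tensor, density: float = 0.4) -> torch.Tensor:
    # Linear blend with a white haze layer: higher density, thicker fog.
    return (1.0 - density) * img + density * torch.ones_like(img)

def add_rain_noise(img: torch.Tensor, strength: float = 0.05) -> torch.Tensor:
    # Gaussian pixel noise stands in for rain streaks and droplets.
    return (img + strength * torch.randn_like(img)).clamp(0.0, 1.0)

clean = torch.rand(3, 32, 32)  # one placeholder sign crop
foggy = add_fog(clean)
rainy = add_rain_noise(clean)
# Training on the clean, foggy, and rainy variants exposes the
# classifier to the degraded views it will meet on the road.
```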

As AVs become more advanced and more prevalent, signage will certainly change with new technology and possibly new regulations, any of which will have to be government-sanctioned. Expect plenty of movement in how roadways adapt to driverless cars.