
The Limits of Autonomous Driving

by Andrés Kaminker

As children, we pictured futuristic cities filled with driverless, hovering vehicles. The start of 2021 makes us wonder how close we actually are to that vision.

Several companies, from traditional automakers such as General Motors (which has partnered with Microsoft*) to newer entrants like Tesla, are currently working to build autonomous vehicles. Because autonomous driving is such a recent feature, the field’s progress is difficult to gauge. However, using the SAE (Society of Automotive Engineers) standards, we can better understand our progress and limitations. Currently, most commercially available vehicles with self-driving capabilities (Tesla’s Autopilot, Volvo Pilot Assist, etc.) stand at Level 2, meaning some tasks operate autonomously under certain conditions, but human intervention is still required. Other cars, like the Audi A8,** are considered Level 3: these vehicles can complete a route under favorable conditions while keeping an attentive driver ready at all times in case of emergencies. Other companies, like Honda, expect to release Level 3 vehicles to the consumer market during 2021. While progress is fast, several technical and regulatory hurdles must still be overcome to achieve these ideal vehicles.
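To make the taxonomy above easier to scan, here is a rough summary of the SAE levels as a simple lookup table. The wording is my own paraphrase, not SAE’s official definitions:

```python
# Paraphrased summary of the SAE J3016 driving-automation levels
# (descriptions are informal, not official SAE wording).
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: steering OR speed assisted; the human drives",
    2: "Partial automation: steering AND speed assisted; the human supervises",
    3: "Conditional automation: the car drives in limited conditions; "
       "the human must take over on request",
    4: "High automation: the car drives itself within a defined domain",
    5: "Full automation: the car drives itself everywhere",
}

print(SAE_LEVELS[2])  # where Autopilot and Pilot Assist stand today
print(SAE_LEVELS[3])  # the level Honda expects to reach in 2021
```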

The technology behind self-driving cars consists mostly of machine learning models fed with tremendous amounts of data, enabling them to draw conclusions and make predictions from similar inputs, aided by sensors such as LIDAR. One of the main problems is that the amount of data available for collection is finite, while the possible events a driver can face on the road are endless. Human beings are able to extrapolate, that is, to take a set of rules and apply them to situations beyond the original problem’s scope.*** This is a hard task in itself, but it is even harder for computers. It poses a problem for these future vehicles: even if they function correctly under normal circumstances, they may encounter situations for which they have not been sufficiently trained. This dilemma is one of the biggest obstacles to Level 3 automation, and it has made some automakers reluctant to build systems that operate in this fashion, as opposed to lower-complexity ones such as Autopilot, which require constant human attention and intervention.
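The interpolation-versus-extrapolation gap described above can be sketched with a deliberately toy example (hypothetical data and thresholds, not any real driving system): a 1-nearest-neighbour “model” answers sensibly for inputs near its training data, but for a novel input it simply snaps to the closest thing it has seen, with no signal that it is guessing.

```python
# Toy sketch of the extrapolation problem: a 1-nearest-neighbour model
# trained on speeds 0..100 km/h (hypothetical labels).
training_data = [(speed, "brake" if speed > 60 else "coast")
                 for speed in range(0, 101, 10)]

def predict(speed):
    # Return the label of the closest training example.
    nearest = min(training_data, key=lambda pair: abs(pair[0] - speed))
    return nearest[1]

print(predict(45))   # in-range input: prints "coast"
print(predict(75))   # in-range input: prints "brake"
# A regime never seen in training (300 km/h) is silently mapped to the
# nearest training point -- the model cannot flag its own ignorance.
print(predict(300))  # prints "brake", with no indication of novelty
```

The same failure mode, scaled up to millions of parameters, is why a vehicle that performs flawlessly on familiar roads can still mishandle a scenario absent from its training data.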

The question posed by self-driving cars is not only “when and how” but also “what then?” Even if a fully autonomous vehicle becomes possible and available to the consumer market (something we are in fact a long way from), several obstacles stand between these machines and wide adoption. From the safety and legal concerns surrounding responsibility to the dynamics introduced by autonomous vehicles sharing the road with human drivers, we still have a long way to go.




Photo from European GSA

