Are Self-Driving Cars Safe?
A modern car contains more than 100 million lines of code. Software enables many different features — cruise control, speed assistance, and parking cameras. And the code within these embedded systems only gets more complex.
This trend will continue as cars of the future become more connected. Cars are increasingly dependent on technology. And they will progressively become more autonomous — and ultimately self-driving.
But Are Self-Driving Cars Safe?
Self-driving cars have the potential to be safer than human drivers, according to experts.
After all, human drivers get into millions of accidents each year. In 2016, there were over seven million accidents involving vehicles with human drivers, and fewer than one percent of those accidents were fatal.
Self-driving cars, on the other hand, are still being tested. They have a long way to go before they become the norm.
In March 2018, an Uber self-driving test vehicle was involved in a fatal collision with a pedestrian walking a bicycle in Arizona. The self-driving car failed to identify her as a pedestrian until it was too late to stop. And the human safety driver in the vehicle also failed to react in time.
It's likely to take some time before we see widespread use of self-driving cars.
But vehicles driven by humans are using more and more technology in Advanced Driver Assistance Systems (ADAS).
Today’s Advanced Driver Assistance Systems (ADAS)
Today, most new cars are equipped with ADAS features such as:
- Lane tracking
- Autonomous emergency braking
- Enhanced vision systems
The systems delivering these functions rely on sensors and actuators that communicate over local networks. These are controlled by microcontrollers.
Cars also communicate with each other. This is known as Vehicle-to-Vehicle (V2V) communication. They also communicate with infrastructure, such as traffic lights, road signs, or satellites. This is known as Vehicle-to-Infrastructure (V2I) communication; the umbrella term covering both is V2X.
Enabling all of this is, of course, software. In addition to the application code there are operating systems and middleware — such as network communication stacks — as well as sensor, actuator, and display interfaces.
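To make the sensor-and-network layer concrete, here is a minimal sketch of packing a sensor reading into a classic CAN-style frame, as one of those local networks might carry it. The frame layout mirrors classic CAN (11-bit identifier, up to 8 data bytes), but the message ID, signal scaling, and byte order are hypothetical illustrations; real signal definitions vary by manufacturer.

```c
#include <stdint.h>
#include <string.h>

/* Illustrative classic CAN frame: 11-bit ID, up to 8 data bytes. */
typedef struct {
    uint16_t id;       /* 11-bit message identifier */
    uint8_t  dlc;      /* data length code (0-8) */
    uint8_t  data[8];  /* payload */
} can_frame_t;

/* Pack a wheel-speed reading (km/h * 100, big-endian) into a frame.
 * The ID and scaling here are made up for illustration. */
can_frame_t pack_wheel_speed(double speed_kmh)
{
    can_frame_t f;
    memset(&f, 0, sizeof f);
    f.id  = 0x1A0;                       /* hypothetical wheel-speed ID */
    f.dlc = 2;
    uint16_t raw = (uint16_t)(speed_kmh * 100.0);
    f.data[0] = (uint8_t)(raw >> 8);     /* big-endian high byte */
    f.data[1] = (uint8_t)(raw & 0xFF);   /* low byte */
    return f;
}

/* Unpack the same signal on a receiving node. */
double unpack_wheel_speed(const can_frame_t *f)
{
    uint16_t raw = ((uint16_t)f->data[0] << 8) | f->data[1];
    return raw / 100.0;
}
```

A microcontroller on the same bus would decode the frame with the matching unpack routine, which is why both ends must share the exact signal definition.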
Security Concerns in ADAS
Vehicles with ADAS face more security concerns than ever. With the growth of V2X communication, cars are increasingly exposed to malicious attacks, and there have already been reports of hackers taking control of cars and overriding the driver.
Most car manufacturers use On-Board Diagnostics (OBD), which provides access to various engine parameters for fault-finding and diagnostics during servicing.
Technical details of the connector interface, OBD-II, are publicly available. A number of Bluetooth OBD connectors enable anyone to access engine parameters using just a cell phone.
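As an example of how simple this access is, here is a sketch of decoding two well-known OBD-II Mode 01 parameters from their response bytes. The formulas (engine RPM encoded as ((A * 256) + B) / 4, vehicle speed as a single byte in km/h) are the standard public PID definitions; the function names are our own.

```c
#include <stdint.h>

/* Decode engine RPM from an OBD-II Mode 01, PID 0x0C response.
 * The two data bytes A and B encode RPM as ((A * 256) + B) / 4. */
double obd_decode_rpm(uint8_t a, uint8_t b)
{
    return ((a * 256.0) + b) / 4.0;
}

/* Decode vehicle speed from Mode 01, PID 0x0D: one byte, km/h. */
uint8_t obd_decode_speed_kmh(uint8_t a)
{
    return a;
}
```

Anyone with a cheap dongle and these public formulas can read live engine data, which is exactly why the exposed interface is a security concern.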
Clearly, this could expose the engine control system to anyone with good or bad intentions.
The University of Michigan recently put this to the test. They found that a direct laptop connection to the OBD interface could be used to override driver instructions.
Automotive Compliance Standards Become Even More Important
So, as ADAS grows and self-driving cars become the norm, automotive compliance will only become more important. And automotive standards will need to be followed.
ISO 26262 is perhaps the most important safety standard for vehicles. It focuses on the functional safety of electrical and electronic systems. And it applies to all activities within the lifecycle of safety-related systems. This includes requirements applicable to the quality of software.
The standard uses Automotive Safety Integrity Levels (ASILs) to provide a measure of the risk. These range from ASIL A, the lowest safety integrity level, to ASIL D, the highest, which carries the most stringent requirements.
The risk parameters include severity of risk, probability of exposure, and controllability.
Controllability assumes that the driver:
- Is in an appropriate condition to drive.
- Has the appropriate driver training (a driver’s license).
- Complies with all applicable legal regulations.
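The ASIL determination from those three risk parameters can be sketched in code. This uses a widely cited shorthand: the classification table in ISO 26262-3 is equivalent to summing the numeric severity (S1-S3), exposure (E1-E4), and controllability (C1-C3) classes, with sums of 7 through 10 mapping to ASIL A through D and anything lower to QM (quality management, no ASIL). This is a sketch for illustration, not a substitute for the standard's own table.

```c
/* ASIL determination sketch based on the risk graph in ISO 26262-3.
 * Inputs use the standard's classes as plain integers:
 *   severity        S1-S3  -> 1-3
 *   exposure        E1-E4  -> 1-4
 *   controllability C1-C3  -> 1-3
 * Shorthand: sum the class numbers; 7 -> ASIL A, 8 -> B, 9 -> C,
 * 10 -> D, anything lower -> QM. Verify against the standard's
 * published table before relying on this. */
typedef enum { ASIL_QM = 0, ASIL_A, ASIL_B, ASIL_C, ASIL_D } asil_t;

asil_t determine_asil(int severity, int exposure, int controllability)
{
    int sum = severity + exposure + controllability;
    if (sum < 7) {
        return ASIL_QM;
    }
    return (asil_t)(sum - 6);   /* 7..10 -> A..D */
}
```

For example, a highly severe, frequent, and hard-to-control hazard (S3, E4, C3) sums to 10 and lands at ASIL D, the most demanding level.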
Are Self-Driving Cars Legal?
Self-driving cars can be legally tested in some countries. But there are still legal barriers that self-driving cars will need to overcome.
Laws will need to adapt to accommodate ADAS and self-driving cars. It’s critical that legislation accounts for what should happen in the event of an ADAS failure. ADAS will need to notify drivers and fall back to human control in these cases.
If the notification fails, the human driver may not be paying attention and won't be able to avoid harm, as in the case of the Uber self-driving car. If the fallback fails, the system may stay in control instead of allowing the driver to intervene and avoid harm.
Software Design Standards
The Society of Automotive Engineers (SAE) standard J3016 breaks driving automation into six levels, from level 0 (no automation) to level 5 (full automation).
ADAS at SAE level three or higher rely on software to:
- Gather data from sensors.
- Create a model of the environment.
- Decide how to assist the driver or control the vehicle.
ADAS at these levels also determine whether sensors are functioning correctly, when to alert the driver, and when to trigger a fallback to human control.
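The steps above (gather sensor data, model the environment, decide how to assist or fall back) can be sketched as a simplified processing cycle. Every type, threshold, and function name here is a hypothetical illustration, not any production ADAS architecture.

```c
#include <stdbool.h>

/* Simplified ADAS cycle sketch: sense -> model -> decide. */
typedef struct {
    double range_m;       /* distance to nearest forward object */
    bool   sensor_ok;     /* self-test result for the sensor */
} sensor_data_t;

typedef struct {
    double time_to_collision_s;
    bool   valid;
} world_model_t;

typedef enum { ACT_NONE, ACT_WARN_DRIVER, ACT_BRAKE, ACT_FALLBACK } action_t;

/* Build a minimal world model from one range sample at a given speed. */
world_model_t model_environment(sensor_data_t s, double speed_mps)
{
    world_model_t m = { 0.0, false };
    if (s.sensor_ok && speed_mps > 0.0) {
        m.time_to_collision_s = s.range_m / speed_mps;
        m.valid = true;
    }
    return m;
}

/* Decide how to assist: warn early, brake late, fall back on faults.
 * The 1.0 s and 2.5 s thresholds are invented for illustration. */
action_t decide(world_model_t m)
{
    if (!m.valid)                    return ACT_FALLBACK;  /* sensor fault */
    if (m.time_to_collision_s < 1.0) return ACT_BRAKE;
    if (m.time_to_collision_s < 2.5) return ACT_WARN_DRIVER;
    return ACT_NONE;
}
```

Note that the fallback path is triggered by the sensor self-test, reflecting the requirement that the system detect failing sensors and hand control back to the driver.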
Traffic laws will need to change to accommodate ADAS, particularly in the areas of liability and privacy. And every country has its own traffic laws.
In the U.S., the National Highway Traffic Safety Administration has proposed a formal classification system that defines five levels of automation. At the lowest level, the driver must be in complete control of the vehicle at all times. At the highest level, the vehicle performs all safety-critical functions for the entire trip — and the driver isn’t expected to control the vehicle at any time.
At a state level, it varies. In 2011, Nevada was the first state to authorize self-driving car tests on public roads. Today, 29 states allow self-driving cars to be tested.
A European research project, Automated Driving Applications & Technologies for Intelligent Vehicles (AdaptIVe), began in January 2014. It develops various automated driving functions for daily traffic by dynamically adapting the level of automation to the situation and driver status.
The project also addresses legal issues that might impact successful market introduction.
Vehicle & Road Automation (VRA) is a support action funded by the European Union. It aims to create a collaboration network of experts and stakeholders working on deployment of automated vehicles and related infrastructure.
The Japanese government is perhaps the closest to a self-driving car reality. They plan to have legislation passed and self-driving cars operating by the 2020 Olympics in Tokyo.
In China, autonomous driving legislation is making its way through major cities, including Shanghai and Beijing. China's legislative system is quite flexible, so the government has more power to put the required changes in place. However, it will still need to deal with the same complex issues as other countries.
India is also considering autonomous driving but faces major challenges. Among them are slow-moving legislation and the difficulty of enforcing the expected rules given the state of its infrastructure.
Automotive Developers Will Always Need Compliance Tools
It is possible to develop safe and secure systems for vehicles. But to comply with legislation and industry standards, automotive developers will always need smart tools.
System security starts at the design stage.
This may include using firewalls to maintain separation between safety-critical applications (such as steering and brakes) and less critical applications. This is especially important for those that communicate with the outside world (such as infotainment).
It also includes reducing or limiting communication, as well as checking and validating any data that is communicated.
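Checking and validating communicated data can be sketched as a guard function that runs before a received message reaches a safety-critical consumer. The message layout, sender ID, and plausibility limit below are hypothetical; the checks themselves (expected source, rolling counter, range plausibility) are common defensive patterns.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical speed message received over a vehicle network. */
typedef struct {
    uint8_t  source_id;    /* expected sender node */
    uint8_t  counter;      /* rolling message counter */
    uint16_t speed_raw;    /* km/h * 100 */
} speed_msg_t;

#define EXPECTED_SOURCE  0x42U
#define SPEED_RAW_MAX    30000U   /* 300.00 km/h plausibility limit */

/* Accept a message only if it comes from the expected node, advances
 * the rolling counter by exactly one (detects replays and drops),
 * and carries a physically plausible value. */
bool validate_speed_msg(const speed_msg_t *msg, uint8_t last_counter)
{
    if (msg == NULL)                                  return false;
    if (msg->source_id != EXPECTED_SOURCE)            return false;
    if ((uint8_t)(msg->counter - last_counter) != 1U) return false;
    if (msg->speed_raw > SPEED_RAW_MAX)               return false;
    return true;
}
```

Rejected messages would typically be counted and, past a threshold, escalated to a fault reaction, so that a spoofed or corrupted bus cannot silently feed the braking or steering logic.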
Safe Embedded Code
Most automotive software is written in C or C++.
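C gives developers low-level control but also plenty of undefined behavior, which is why automotive coding standards such as MISRA C mandate defensive patterns. Here is a small sketch of one such pattern: a division that checks its preconditions and reports success explicitly instead of trusting its inputs. It is an illustration of the style, not a compliance claim.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Defensive-style division: guard against division by zero and the
 * INT32_MIN / -1 overflow case, both undefined behavior in C, and
 * return success explicitly so the caller must handle failure. */
bool safe_divide(int32_t numerator, int32_t denominator, int32_t *result)
{
    if (result == NULL)                              return false;
    if (denominator == 0)                            return false;
    if ((numerator == INT32_MIN) && (denominator == -1)) return false;
    *result = numerator / denominator;
    return true;
}
```

A caller is then forced to branch on the return value rather than silently consuming an invalid result, which is the behavior static analysis and compliance tools look for.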