The pace of technological change is advancing faster than ever before. We now live in a world where it has become normal for cars to operate themselves, a technology that leaves many people uneasy. Humans' constant need for control is clashing with cars' ability to take over the wheel. As fascinating as these cars seem at first glance, the dangers they bring to the road outweigh their benefits.
By definition, a self-driving vehicle is one that uses a combination of sensors, cameras, radar and artificial intelligence (AI) to travel between destinations without a human operator. The cars rely on radar, cameras and light detection and ranging (lidar) to assess road features and potential obstacles. While self-driving cars are meant to improve safety, their sensors can be ineffective in snow or other heavy precipitation, a factor that should be considered before blindly trusting the technology.
The Society of Automotive Engineers has defined six levels of driving automation, ranging from the driver being in full control to a vehicle that can operate without a human.
Levels 0-2 are vehicles operated by humans, though they may include safety features such as blind spot warning and automatic emergency braking. Many cars that are not typically considered self-driving still have these features in place. Level 3 and 4 vehicles can operate themselves under certain conditions while the driver takes their attention off the road; however, a human operator can take over whenever they feel it is necessary. Level 5 vehicles are the only ones considered fully autonomous, as they do not require a human driver behind the wheel.
A common concern regarding autonomous vehicles is their lack of human reactions. The technology cannot emulate human logic or instinct, especially our ability to make split-second decisions. When an autonomous vehicle faces a situation it cannot resolve, such as whether to swerve or brake, it enters a handover period. A handover gives the human driver roughly three to 10 seconds to take back the wheel, a sudden and concerning adjustment that can put others in danger.
Autonomous vehicles seem helpful for taxi and rideshare purposes, yet end up being a liability. Uber shut down its self-driving test program after one of its vehicles struck and killed a pedestrian in Tempe, Arizona, back in 2018, and the self-driving car company Cruise faced backlash and was forced to suspend its robotaxi service in 2023 after one of its vehicles dragged a pedestrian in San Francisco.
Even if you consent to the risks of riding in a self-driving car, others on the road are not necessarily comfortable with it. Everyone should have the right to make their own decisions; however, when that choice can put others in danger, the situation becomes more complicated.
Autonomous vehicles could be safer if every car on the road were self-driving, since each vehicle would be reacting to others running the same technology. A mix of self-driving and conventional cars is problematic, as the two struggle to coexist.
There is a great deal to consider before becoming entirely reliant on autonomous vehicles. Until these cars can react as quickly and logically as humans, self-driving vehicles will remain a danger to human safety.