Self-Driving Vehicles: When a Dream Becomes a Nightmare

Companies such as Uber and Waymo are actively testing their autonomous vehicles, but not everyone is excited about this technology. Regardless of how ingenious they are, self-driving vehicles are still plagued by safety issues.

Guest Writer

Rapid advances in self-driving technology have brought completely autonomous vehicles within reach. Forbes predicts that autonomous cars are just around the corner, with 10 million autonomous vehicles expected to be cruising roads worldwide by 2020. Moreover, within the next decade, one in four cars on our roads is expected to be fully autonomous.

While car manufacturers claim autonomous driving will make our road trips more convenient and comfortable, many people remain skeptical of self-driving tech. Their biggest concern is safety. For instance, a study conducted by the research firm J.D. Power and the law firm Miller Canfield found that 52 percent of respondents wouldn’t ride in an autonomous vehicle, even if the car met all the necessary safety standards. The same survey shows that only two percent of respondents would definitely take a ride in such a vehicle.

Waymo’s Autonomous Cars Have Had Multiple Incidents So Far

Recent incidents involving self-driving vehicles show these concerns shouldn’t be taken lightly. In June, an autonomous minivan developed by the self-driving tech company Waymo crashed on a highway in California. The vehicle was being operated by the autonomous software, but the safety driver, who was supposed to remain alert throughout the drive, fell asleep and accidentally pressed the gas pedal, which turned off the autonomous mode. Because the driver didn’t take control of the steering wheel, it didn’t take long before Waymo’s car crashed into the highway median. Luckily, no one was hurt, and no other cars were involved in the accident. Quartz reports that Waymo’s vehicle suffered damage to its bumper and tire.


Although Waymo blames the driver for the accident, the company still has a long way to go to make its cars completely safe. More specifically, it needs to find a way to address and resolve customers’ anxieties about the technology. The company previously relied on two human drivers during test drives but decided to reduce the number to one. That decision turned out not to be a smart move.

The incident in California wasn’t the first accident Waymo has experienced. In fact, Waymo may well hold the record for the number of accidents involving its vehicles: in 2018 alone, the company’s vehicles were involved in 12 crashes. Although the police didn’t blame Waymo’s autonomous systems in any of these cases, the incidents show that autonomous cars can get into accidents just as easily as traditional vehicles. So it’s no wonder people aren’t very enthusiastic about this technology.

How Do We Teach Self-Driving Vehicles to Make Life-or-Death Decisions?

Future developments in the field of self-driving tech could make consumers even more concerned. We all want to have vehicles that will protect us, but what if our safety is prioritized over everything else?

Imagine a car that will do anything necessary to protect its passengers, even if that means hitting a pedestrian. This is what Mercedes-Benz plans to develop. Its self-driving vehicle would be programmed to put the driver’s safety first in any potentially dangerous situation. For instance, if a pedestrian suddenly appears on the road, the car would prioritize the passengers; instead of swerving and possibly hitting an obstacle, it would hit the pedestrian. “If you know you can save at least one person, at least save that one. Save the one in the car,” says Christoph von Hugo, the manager of driver assistance systems at Mercedes-Benz. This raises difficult ethical questions. What if the driver would rather swerve and avoid the pedestrian? As Mercedes continues to develop this technology, it will certainly create more controversy.

Uber’s Autonomous Driving Technology Caused a Fatal Accident

A lot of questions were raised after Uber’s self-driving vehicle killed a pedestrian in Arizona. The accident happened earlier this year when Uber’s vehicle, operating in autonomous mode, hit 49-year-old Elaine Herzberg. That night, Herzberg was walking her bicycle across a dangerous stretch of road with no pedestrian crossing nearby. Despite being equipped with sensor technology designed to detect obstacles, even at night, the vehicle failed to detect the pedestrian and struck her at around 10 p.m. The vehicle had a safety driver, but she wasn’t paying attention to the road and didn’t have her hands on the steering wheel. The incident forced Uber to suspend its testing programs in Arizona, San Francisco, and Toronto.

Vehicles without a human behind the wheel seemed unrealistic a few years ago, but things have changed. Self-driving tech developers are already testing their latest innovations on our roads. However, before we start looking forward to a driverless future, serious safety concerns need to be addressed. Until that happens, predictions claiming autonomous cars will soon be as common as traditional vehicles on our roads seem hard to believe.

Written by Richard van Hooijdonk, international keynote speaker, trend watcher, and futurist.
