
Autonomous Vehicles are Not for Safety


Even if you’re not a real automotive enthusiast, you’ve probably heard about this week’s incident in Tempe, Arizona, where a pedestrian was killed while crossing the road at night. That in itself is not rare. Pedestrians die every day, and Arizona has the fourth-highest rate of pedestrian deaths of any state, so it’s unfortunately especially common there. The difference this time is that it was a Volvo XC90, driven by Uber’s autonomous technology, that struck and killed the woman, and the crash has understandably raised new questions about how safe autonomous vehicles actually are. The reality, though, is that safety is only an occasional byproduct of autonomous technology.

Photo by ABC-15, via Associated Press

But safety is absolutely paramount when testing unproven technologies, and it’s clear that Uber was not doing its due diligence in this regard. At the time of the accident, the Volvo was being chaperoned by a single Uber employee who, according to video of the incident, spent much of the time looking down, either at a phone or at a monitor, rather than watching the road ahead. Jalopnik called around, and pretty much every automaker testing autonomous vehicles uses two in-car minders: one to watch the road and correct any issues with immediate human input, and another to monitor the technology and keep logs of the car’s activity. Uber uses only one; so does Waymo. And we have to remember that Uber isn’t a car company. It’s an app company bleeding funds and racing to develop a technology that lets it provide taxi service without having to pay human drivers. It’s not in Uber’s interest to pay two people to sit in a car, even if it makes the drive safer.

A Velodyne HDL-64 LiDAR unit

And there are some extenuating factors in this case. It was very dark, and the car’s forward-facing camera shows that, until about two seconds before impact, the woman was very difficult to see. But those cameras don’t capture the full range of what the human eye can see, and it stands to reason that an alert driver might have spotted her and reacted in time to at least avoid killing her. What almost certainly did see the woman were the Velodyne LiDAR arrays on top of the Volvo, for which darkness is immaterial. Velodyne says the problem probably wasn’t its system detecting the woman, but rather Uber’s software failing to interpret the shape the LiDAR saw as a person and to react accordingly. Instead, the car didn’t slow down at all and hit her at 40 miles per hour as she walked her bike across the street.
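To make that distinction concrete, here’s a purely illustrative sketch of the gap between detecting an object and deciding to brake for it. Nothing here reflects Uber’s actual software; the class, labels, and confidence threshold are all invented for illustration.

```python
# Illustrative sketch only: the gap between "the sensor saw something"
# and "the software decided to brake". All names and thresholds are
# hypothetical, not Uber's or Velodyne's actual code.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # what the classifier thinks it is seeing
    confidence: float   # classifier confidence, 0.0 to 1.0
    distance_m: float   # distance from the vehicle in meters

BRAKE_FOR = {"pedestrian", "cyclist", "vehicle"}

def should_brake(obj: DetectedObject, threshold: float = 0.8) -> bool:
    # The LiDAR can return a perfectly good point cloud (the "seeing"
    # part), but if the classifier mislabels the shape or scores below
    # the action threshold, the planner never commands the brakes.
    return obj.label in BRAKE_FOR and obj.confidence >= threshold

# A woman walking a bike straddles categories: part pedestrian, part
# vehicle-shaped. If the label flickers or the confidence stays low,
# the car never slows down.
ambiguous = DetectedObject(label="unknown", confidence=0.45, distance_m=25.0)
print(should_brake(ambiguous))  # False: detected, but no braking decision
```

The point of the sketch is that detection and decision are separate stages, and a failure in the second looks, from the street, exactly like a failure in the first.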


And the hardware talking to the software is just one of many ways autonomous vehicles can go wrong. Just as my computer gives me the spinning wheel of death when I try to do too many tasks at once, machines sometimes encounter problems that render them unusable, unstable, or unresponsive, which becomes a problem when the machine in question is a 4,000-pound death machine barreling down a motorway. And those are just the factors inside the vehicle. Just this week, a representative from the National Center for Atmospheric Research voiced his concern over autonomous vehicles’ overreliance on GPS, because the technology is so vulnerable to interruption from solar flares, which could leave vehicles with no idea how to get where they’re going.
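For a picture of how a safety-critical system even notices that its own software has hung, here’s a minimal heartbeat-watchdog sketch. This is a generic pattern, not any automaker’s actual implementation; the names and the 200-millisecond timeout are made up.

```python
# Generic heartbeat-watchdog sketch: a way for a safety-critical system
# to detect that its control software has stopped responding.
# Hypothetical names and timing, for illustration only.

import time

HEARTBEAT_TIMEOUT_S = 0.2  # invented value: software must check in every 200 ms

class Watchdog:
    def __init__(self, timeout_s: float):
        self.timeout_s = timeout_s
        self.last_beat = time.monotonic()

    def beat(self) -> None:
        # Called by the driving software each time it completes a control cycle.
        self.last_beat = time.monotonic()

    def is_stalled(self) -> bool:
        # If the software stops checking in (the automotive equivalent of
        # the spinning wheel of death), the watchdog notices.
        return time.monotonic() - self.last_beat > self.timeout_s

watchdog = Watchdog(HEARTBEAT_TIMEOUT_S)
# ... the control loop would call watchdog.beat() every cycle ...
if watchdog.is_stalled():
    print("Control software unresponsive: trigger a fail-safe stop")
```

Detecting the hang is the easy half; the hard half is what a two-ton vehicle should safely do in the half-second after the watchdog fires.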


As for the other side of the equation, the truth is we’ll never achieve 100% safety on the roads, because humans are both stupid and unpredictable. We don’t use crosswalks, we pop out from behind things, and we generally do our best to confuse and bewilder technology, whether by wearing billowy clothing that doesn’t read as human or by carrying bikes that make us look like vehicles. Advances in artificial intelligence have computers beating humans at games like chess, Go, and Jeopardy, but there’s a long way to go before AI can adequately anticipate what we crazy humans will do.

Aerial view of the Mcity test facility

So that brings us back to this week, when a woman crossed the road at night, in dark clothing and outside a crosswalk, which should not have been a death sentence. And it might not have been one if Arizona hadn’t made itself the absolute Wild West of motor vehicle testing, or if Uber had bothered to put a second person in its cars like most other companies do. There’s a reason most automakers test on their own or other closed proving grounds and race tracks, across a variety of situations. Nothing can fully replace real-life testing in scenarios that are difficult to replicate in a closed environment, but I find it hard to believe that Uber couldn’t have tested a woman walking a bike across the road.


Since most automakers are testing on public roads, with other drivers, pedestrians, and no shortage of obstacles, safety clearly isn’t their primary concern in the rush to get autonomous technology into their vehicles. And if you’ve listened to my podcast, you probably know what I’m about to say. Autonomous cars are not about enhancing safety and reducing pedestrian or driver deaths. They’re about enhancing convenience and making money, for both automakers and taxi apps like Uber and Lyft. Autonomous systems are yet another optional add-on for which Tesla, Cadillac, Chevrolet, Nissan, and others can charge us thousands of dollars, and we’re happy to pay because rush-hour driving is brutal. The incentive for safety comes not from a sense of duty to improve society, but from a fear of liability if and when something goes wrong. Now that something has, and the daughter of the woman killed has lawyered up, we’re going to see just how accountable these companies will be held when safety is not their first priority.


Authored by
Devlin Riggs