Life, death, and autonomous vehicles

Self-driving cars will make decisions — and act — faster than humans facing the same dangerous situations.


Plymouth Fury III. Photo by Infrogmation, on Wikimedia Commons.

There’s a steadily increasing drumbeat of articles and tweets about the ethics of autonomous vehicles: if an autonomous vehicle is going to crash, should it kill the passenger in the left seat or the right seat? (I won’t say “driver’s seat,” though these sorts of articles usually do; there isn’t a driver.) Should the car crash into a school bus, or run over an old lady on the side of the road?

Frankly, I’m already tired of the discussion. It’s not as if humans don’t already get into situations like this, and make (or fail to make) decisions. At least, I have.

When I was a grad student, I bought my first (very used) car: a monstrous 1966 Plymouth Fury III. I was working a part-time job about five miles from where I lived. I normally commuted by bike, but the day after I bought the car, before I had registered or insured it, it rained. So I decided: “What could go wrong? I’ll drive.”

Of course, something went wrong. While I was driving home, the brakes were soft. As I was driving toward Stanford, approaching the underpass on Embarcadero, going downhill, they failed completely, while I was following a Mercedes in heavy traffic. And I thought, quite literally: should I plow into the Merc, or drive into a bridge abutment? So debates about whether the car should take out the school children or the elderly lady aren’t entirely academic to me.

Morally, I just froze. And whatever I might have thought that I should do, it was clear that the moral logic was totally irrelevant to what I was actually going to do, which was nothing at all. It certainly didn’t occur to me to think through all the ramifications of my situation: if I drove into the bridge abutment, sparing the Mercedes and its driver, would I have created a greater danger for the cars in back of me? If I took out the Mercedes and its driver, would I be killing the venture capitalist who was going to fund Google in a few years? Was the Mercedes possibly even more massive and collision-resistant than my monstrous Plymouth (man, that was a big car)? You can spin this all sorts of ways, and most of them are pretty silly.

At the last possible instant, I realized why emergency brakes were invented. I managed to get the car under control, drive through the underpass, and park it in a lot. I walked home and retrieved the car sometime after midnight, when there wasn’t anyone on the road.

But here’s the real moral of the story: a self-driving car wouldn’t have had to figure out why the emergency brakes were there.
