Self-Driving Tesla Runs Over Kid – Keeps Driving

The headline isn’t entirely accurate – but the point is it could have been.

The owner of a self-driving Tesla posted a video on the speech-suppression social media site now called X showing his self-driving Tesla driving into a deer that was standing in the road. The Tesla not only hit the deer – it kept right on driving.

It could have been a kid standing in the road. And that’s the point. One missed by the owner of the self-driving car, who describes the deer strike as an “edge case.” Not for the deer, of course. And – potentially – not for a kid (or adult, for that matter) who gets in the way of a self-driving car. Which is a car driven by its programming and machine intelligence – which has no programming for empathy or remorse.

Just keep on driving.

Of course, it was just a deer.

But imagine that it had been a kid. If you hit a kid, you’d stop the car and try to help the kid, call 911 and so on. Assuming you’re not a sociopath. But that is exactly the point. Machines – and machine intelligence – are sociopathic in that they (per Kyle Reese’s speech in The Terminator) do not feel pity or remorse or, for that matter, anything at all. Morality and ethics are human constructs based on human feelings – shame and guilt, for instance. An awareness that we did something wrong – even if we didn’t mean to.

We have a sense of obligation not to do it, too.

Self-driving cars lack this agency. They are not evil, in the human sense of the term – the sense of consciously and callously deciding to deliberately cause harm, or to do something irrespective of the high risk to others. They just sometimes cause harm, unaware and indifferent. Does a toaster care whether it burns your toast?

Self-driving cars are limited by their programming. No matter how many possible variables and scenarios are programmed for, there will be circumstances that fall outside that programming. At which point, the machine errs.

Humans do too, of course. And this fact is often brought up by defenders of giving ourselves over to programmed machines. They say the machines err less than we humans do and so – on balance – it is better (in the aggregate) to let the machines take care of us.

This is statistics. And that is inhuman.

To understand why, recall the saying attributed to one of the most inhuman human beings who ever lived, Joseph Stalin. He is said to have described the death of a single human being as a tragedy – but the deaths of millions a statistic.

Exactly the point.

When you hit a deer – or a kid – it is you who hit the deer (or the kid). It is you who may have had the opportunity to not hit the deer – or the kid. If you did hit the deer – or the kid – because you were not paying attention, or on account of driving too fast, or not being in full control of the vehicle, then it was your fault. There’s a healthy moral reckoning there. A proper element of remorse and accountability.

And agency. When you are in control you can avoid hitting the kid. Whew! That was close! And you feel good, having not hit the kid. When you are asleep at the wheel of a self-driving car and it hits a kid – Bump! What was that? – how will you feel about it?

When a machine is responsible for running over a deer or a kid – no one is responsible. It is just a statistic. If that makes you – the person who owns the car that ran over the kid – feel any better about it.

We become automata – just like the machines. But worse, because we’re supposed to be human.

Even the predictably Leftist site that identifies as a “car” site – Jalopnik – says “one might argue that edge cases are actually very important parts of any claimed autonomy suite given how drivers check out when they feel the car is doing the work (of driving).”

The writer (predictably) does not explain – so I will. 

Tesla – and all the other purveyors of self-driving systems – officially say that self-driving isn’t really meant to be that. Because the person behind the wheel is expected to be ready to drive all the time. More precisely, to intervene when necessary – as when the self-driving system fails to see the deer (or the kid) standing in the road. So it’s not really self-driving. It is more like assisted driving. Of course – wink, wink – everyone understands what’s being sold. Just as – back in the day – everyone understood what a catalytic converter test pipe was sold for.

There is no point to “self-driving” cars if the person behind the wheel is still considered to be the driver. Hell, why is there even a steering wheel? Does its presence not imply that someone – rather than something – is responsible for steering the vehicle? How about the accelerator and brake pedals?

Why are they there – if the car is “self-driving”?

Part of the answer to that, of course, is that the “self-driving” car does require a driver. A driver who can be held responsible when the car drives into or over something or someone, because the situation fell outside the parameters of its programming. Or because one of the cameras didn’t see it – or the radar/LIDAR glitched.

But it’s a fine way to avoid anyone having to feel responsible (let alone actually be held responsible) when it does.

. . .


1 COMMENT

  1. Having flown a plane equipped with an autopilot (AP) for over 20 years, I’ve always known that as PIC (pilot in command) I am responsible for every aspect of safe flight – e.g., taxi, takeoff, following airspace rules, landing, etc. – whether I choose to engage the AP or not. You see what is happening here, Eric. The same thing has occurred in just about all facets of life. Attorneys got paid to sprinkle holy water on the decision by an organization like Tesla to advertise the new systems as self-driving/AP. Then, when a kid gets hit by a car operating on AP, they get paid more to defend that ill-advised choice. We could talk for days about this degradation of culture. Hell, when I was a kid, we were taught to ‘stay out of the street’.
