Self-Driving Tesla Runs Over Kid – Keeps Driving

The headline isn’t entirely accurate – but the point is it could have been.

The owner of a self-driving Tesla posted a video on the speech suppression social media site now called X that shows the car driving into a deer standing in the road. The self-driving Tesla not only hit the deer – it didn’t stop driving.

It could have been a kid standing in the road. And that’s the point. One missed by the owner of the self-driving car, who describes the deer strike as an “edge case.” Not for the deer, of course. And – potentially – not for a kid (or adult, for that matter) who gets in the way of a self-driving car. Which is a car driven by its programming and machine intelligence – which has no programming for empathy or remorse.

Just keep on driving.

Of course, it was just a deer.

But imagine that it had been a kid. If you hit a kid, you’d stop the car and try to help the kid, call 911 and so on. Assuming you’re not a sociopath. But that is exactly the point. Machines – and machine intelligence – are sociopathic in that they (per Reese’s speech in Terminator) do not feel pity or remorse or, for that matter, anything at all. Morality and ethics are human constructs based on human feelings – shame and guilt, for instance. An awareness that we did something wrong – even if we didn’t mean to.

We have a sense of obligation not to do it, too.

Self-driving cars lack this agency. They are not evil, in the human sense of the term. In the sense of consciously and callously deciding to deliberately cause harm or do something irrespective of the high risk to others. They just sometimes cause harm, unaware and indifferent. Does a toaster care whether it burns your toast?

Self-driving cars are limited by their programming. No matter how many possible variables and scenarios are programmed for, there are circumstances that will fall outside their programming. At which point, the machine errs.
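To make that concrete, here is a minimal, purely hypothetical sketch – in Python, for illustration; it is not Tesla’s actual code, which is proprietary – of what “falling outside the programming” looks like. A finite table of recognized obstacles is mapped to programmed responses; anything unrecognized falls through to a default. The object names and responses are invented for the example.

```python
# Hypothetical illustration only -- not any manufacturer's actual code.
# A perception system ultimately maps what it detects to a programmed response.
# Anything it wasn't programmed (or trained) to recognize falls through to a default.

PROGRAMMED_RESPONSES = {
    "pedestrian": "emergency_brake",
    "vehicle": "slow_and_follow",
    "traffic_cone": "change_lane",
}

def respond_to(detected_object: str) -> str:
    # The "edge case": a deer -- or anything else not in the table --
    # gets the default response, which is to just keep driving.
    return PROGRAMMED_RESPONSES.get(detected_object, "continue_driving")

print(respond_to("pedestrian"))  # emergency_brake
print(respond_to("deer"))        # continue_driving -- the machine errs, indifferent
```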

Humans do too, of course. And this fact is often brought up by defenders of giving ourselves over to programmed machines. They say the machines err less than we humans do and so – on balance – it is better (in the aggregate) to let the machines take care of us.

This is statistics. And that is inhuman.

To understand why, recall the saying attributed to one of the most inhuman human beings who ever lived, Joseph Stalin. He is said to have described the death of a single human being as a tragedy – but the deaths of millions a statistic.

Exactly the point.

When you hit a deer – or a kid – it is you who hit the deer (or the kid). It is you who may have had the opportunity to not hit the deer – or the kid. If you did hit the deer – or the kid – because you were not paying attention or on account of driving too fast or not being in full control of the vehicle, then it was your fault. There’s a healthy moral reckoning there. A proper element of remorse and accountability.

And agency. When you are in control you can avoid hitting the kid. Whew! That was close! And you feel good, having not hit the kid. When you are asleep at the wheel of a self-driving car and it hits a kid – Bump! What was that? – how will you feel about it?

When a machine is responsible for running over a deer or a kid – no one is responsible. It is just a statistic. If that makes you – the person who owns the car that ran over the kid – feel any better about it.

We become automata – just like the machines. But worse, because we’re supposed to be human.

Even the predictably Leftist site that identifies as a “car” site – Jalopnik – says “one might argue that edge cases are actually very important parts of any claimed autonomy suite given how drivers check out when they feel the car is doing the work (of driving).”

The writer (predictably) does not explain – so I will. 

Tesla – and all the other purveyors of self-driving systems – officially say that self-driving isn’t really meant to be that. Because the person behind the wheel is expected to be ready to drive all the time. More precisely, to intervene when necessary – as when the self-driving system fails to see the deer (or the kid) standing in the road. So it’s not really self-driving. It is more like assisted driving. Of course – wink, wink – everyone understands what’s being sold. Just as – back in the day – everyone understood what a catalytic converter test pipe was sold for.

There is no point to “self-driving” cars if the person behind the wheel is still considered to be the driver. Hell, why is there even a steering wheel? Does its presence not imply that someone – rather than something – is responsible for steering the vehicle? How about the accelerator and brake pedals?

Why are they there – if the car is “self-driving”?

Part of the answer to that, of course, is that the “self-driving” car does require a driver. A driver who can be held responsible when the car drives into or over something or someone, because the situation fell outside the parameters of its programming. Or because one of the cameras didn’t see it, or the radar/LIDAR glitched.

But it’s a fine way to avoid anyone having to feel responsible (let alone actually be held responsible) when it does.

. . .

If you like what you’ve found here please consider supporting EPautos. 

We depend on you to keep the wheels turning! 

Our donate button is here.

 If you prefer not to use PayPal, our mailing address is:

EPautos
721 Hummingbird Lane SE
Copper Hill, VA 24079

PS: Get an EPautos magnet or sticker or coaster in return for a $20 or more one-time donation or a $10 or more monthly recurring donation. (Please be sure to tell us you want a magnet or sticker or coaster – and also, provide an address, so we know where to mail the thing!)

If you like items like the Baaaaaa! baseball cap pictured below, you can find that and more at the EPautos store!

13 COMMENTS

  1. One of many problems with the concept of self-driving cars is the expectation that you will be ready to take over at less than a moment’s notice. That might be true at first, but after a while you’ll not pay attention at all because nothing ever happens. I guarantee you most people would soon be playing on their phones, doing crossword puzzles, then WHAMMO!!!! WTF was that?

    Paul S., the driver who hit the deer, says he got a dozen false stops a day; if that doesn’t tell you that the technology is not fit for use, what will? It’s shocking, but it may well be true that Clovers are safer drivers than self-driving cars.

  2. I like how the guy bitches that he can’t get Tesla “service” until January; a normal person would decide to ditch Tesla completely and get a real car.

  3. The existing infrastructure was built for active driving, not a tacked-on autonomy package. It will cost billions to make the roads compatible with autonomous vehicles.

    Here in Colorado, the highways have high fences at the edge of the right of way. There are “one-way” escape mounds that animals can use to get out of the highway space. Many of the ramp crossings have cattle guards to prevent animals from entering the highway space using frontage roads or entrance ramps. Yet they still occasionally find their way onto the highway, and still sometimes meet their demise on the grille of a Kenworth tractor. And with winter on the horizon I’m sure there will be breaches of the fences as Texans and “new arrivals” fail to navigate turns and send their SUVs into the river.

    Eventually people will figure out that autonomous vehicles will need their own isolated infrastructure. It will be walled off, much like elevated trains or subways. While your vehicle is in self-driving mode you will not be able to override anything except through high-level commands. When in a mixed environment the driver will be controlling it, probably with assistance from the sensor array.

  4. “one might argue that edge cases are actually very important parts of any claimed autonomy suite given how drivers check out when they feel the car is doing the work (of driving).”

    What kind of gibberish is THAT?

    “edge case”? “autonomy suite”?

    This is supposed to be an automotive journal? It’s a bunch of geeks trying to sound cool because they use terms few others do. Using obscure jargon does not make you an intellectual. It makes you a jerk.

    Sod off, Jalopnik

  5. Edge case? Bullshit. It’s one of an infinite range of things that can happen while driving. That’s why there needs to be a human in control. Period.

  6. Forcing slaves into very dangerous, defective FSD EV’s…

    Commiela’s hidden agenda if/when installed….the joy (tax) they talk about…get the stick out…

    Like the marxist just installed/elected in the UK….same agenda…..there is a big stick coming for slaves who refuse to buy EV’s…..

    UK to impose BRUTAL taxes on PETROL and DIESEL cars…another one of the many different ways they can get rid of….ban….. ice vehicles….

    https://www.youtube.com/watch?v=rB19hV8WiYg

  7. ‘If you hit a kid, you’d stop the car and try to help the kid, call 911 and so on. Assuming you’re not a sociopath.’ — eric

    Or hopped up on loco weed. In the 1936 film Reefer Madness, a THC-crazed fiend runs over a kid (which is obviously a cardboard cutout) then laughs his ass off.

    Can’t be arsed to find the scene, but it happens after this drug-den encounter in which a seductive vamp puts a wholesome young man on the path to perdition by challenging his manhood (‘of course if you’re afraid …’). Her confederate laughs maniacally after the victim gives in and takes a fatal puff.

    https://www.youtube.com/watch?v=q8oEOQIh1L8

    It could happen to you.

  8. Having flown a plane equipped with an autopilot (AP) for over 20 years, I’ve always known that as PIC (pilot in command) I am responsible for every aspect of safe flight – e.g., taxi, takeoff, following airspace rules, landing, etc. – whether I choose to engage the AP or not. You see what is happening here, Eric. The same thing that has occurred in just about all facets of life. Attorneys got paid to sprinkle holy water on the decision by an organization like Tesla to advertise the new systems as self-driving/AP. Then when a kid gets hit by a car operating on AP, they get paid more to defend that ill-advised choice. We could talk for days about this degradation of culture. Hell, when I was a kid, we were taught to ‘stay out of the street’.

    • Tell me, does your autopilot work on taxiways? Why do you suppose that is? Even with the excellent maps and information available about airports, ground operations are completely manual.

      While I only fly drones and simulators, I know that autopilot systems aren’t designed for obstacles. My Skydio drone is a unique outlier, but it can still crash if the object in the way is too small to be detected, and it doesn’t see moving objects (like birds) at all.

      Autopilots are designed to operate in open sky, with plenty of separation between aircraft. And in an environment where the entire trip is highly planned and coordinated. There’s not much impulse flying going on. And certainly no checking email, dialing up podcasts or yakking with your wife/girlfriend on the phone.

      • Yesterday a buddy who is still only moderately experienced with drones almost crashed his DJI Mini 3 while reversing, with no view to the rear.

        This was the ‘oh, sh*t’ moment when a pinon pine (at lower right) suddenly popped into the forward view screen, after the drone cleared it by maybe 6 inches. :-0

        https://ibb.co/HK1M0hx
