The headline isn’t entirely accurate – but the point is it could have been.
The owner of a self-driving Tesla posted a video on the speech-suppression social media site now called X showing his self-driving Tesla driving into a deer that was standing in the road. The self-driving Tesla not only hit the deer – it didn't stop driving.
It could have been a kid standing in the road. And that’s the point. One missed by the owner of the self-driving car, who describes the deer strike as an “edge case.” Not for the deer, of course. And – potentially – not for a kid (or adult, for that matter) who gets in the way of a self-driving car. Which is a car driven by its programming and machine intelligence. Which lacks empathy or remorse programming.
Just keep on driving.
Of course, it was just a deer.
But imagine that it had been a kid. If you hit a kid, you'd stop the car and try to help the kid, call 911 and so on. Assuming you're not a sociopath. But that is exactly the point. Machines – and machine intelligence – are sociopathic in that they (per Reese's speech in Terminator) do not feel pity or remorse or, for that matter, anything at all. Morality and ethics are human constructs based on human feelings: shame and guilt, for instance. An awareness that we did something wrong – even if we didn't mean to.
We have a sense of obligation not to do it, too.
Self-driving cars lack this agency. They are not evil, in the human sense of the term. In the sense of consciously and callously deciding to deliberately cause harm or do something irrespective of the high risk to others. They just sometimes cause harm, unaware and indifferent. Does a toaster care whether it burns your toast?
Self-driving cars are limited by their programming. No matter how many possible variables and scenarios are programmed for, there are circumstances that will fall outside their programming. At which point, the machine errs.
Humans do too, of course. And this fact is often brought up by defenders of giving ourselves over to programmed machines. They say the machines err less than we humans do and so – on balance – it is better (in the aggregate) to let the machines take care of us.
This is statistics. And that is inhuman.
To understand why, recall the saying attributed to one of the most inhuman human beings who ever lived, Joseph Stalin. He is said to have described the death of a single human being as a tragedy – but the deaths of millions a statistic.
Exactly the point.
When you hit a deer – or a kid – it is you who hit the deer (or the kid). It is you who may have had the opportunity to not hit the deer – or the kid. If you did hit the deer – or the kid – because you were not paying attention or on account of driving too fast or not being in full control of the vehicle, then it was your fault. There’s a healthy moral reckoning there. A proper element of remorse and accountability.
And agency. When you are in control you can avoid hitting the kid. Whew! That was close! And you feel good, having not hit the kid. When you are asleep at the wheel of a self-driving car and it hits a kid – Bump! What was that? – how will you feel about it?
When a machine is responsible for running over a deer or a kid – no one is responsible. It is just a statistic. If that makes you – the person who owns the car that ran over the kid – feel any better about it.
We become automata – just like the machines. But worse, because we’re supposed to be human.
Even the predictably Leftist site that identifies as a “car” site – Jalopnik – says “one might argue that edge cases are actually very important parts of any claimed autonomy suite given how drivers check out when they feel the car is doing the work (of driving).”
The writer (predictably) does not explain – so I will.
Tesla – and all the other purveyors of self-driving systems – officially say that self-driving isn't really meant to be that. Because the person behind the wheel is expected to be ready to drive all the time. More precisely, to intervene when necessary – as when the self-driving systems fail to see the deer (or the kid) standing in the road. So it's not really self-driving. It is more like assisted driving. Of course – wink, wink – everyone understands what's being sold. Just as – back in the day – everyone understood what a catalytic converter test pipe was sold for.
There is no point to “self-driving” cars if the person behind the wheel is still considered to be the driver. Hell, why is there even a steering wheel? Does its presence not imply that someone – rather than something – is responsible for steering the vehicle? How about the accelerator and brake pedals?
Why are they there – if the car is “self-driving”?
Part of the answer to that, of course, is that the “self-driving” car does require a driver. A driver who can be held responsible when the car drives into or over something or someone, because the situation fell outside the parameters of its programming. Or because one of the cameras didn’t see or the radar/LIDAR glitched.
But it's a fine way to avoid anyone having to feel responsible (let alone actually be held responsible) when it does.
. . .
If you like what you’ve found here please consider supporting EPautos.
We depend on you to keep the wheels turning!
Our donate button is here.
If you prefer not to use PayPal, our mailing address is:
EPautos
721 Hummingbird Lane SE
Copper Hill, VA 24079
PS: Get an EPautos magnet or sticker or coaster in return for a $20 or more one-time donation or a $10 or more monthly recurring donation. (Please be sure to tell us you want a magnet or sticker or coaster – and also, provide an address, so we know where to mail the thing!)
If you like items like the Baaaaaa! baseball cap pictured below, you can find that and more at the EPautos store!
Huge ships…..
They use auto pilot when at sea…but someone has to be on watch 24/7 to have human eyes watching…ready to take control….
Getting close to shore…entering a harbor….it is the exact opposite of self driving….
A pilot takes control ….someone who has probably 25 years experience in navigating that harbor…….complete manual control of the ship….no computers in control….just a highly skilled professional…..human control is far superior to any computer/AI control….
Bringing a huge ship into a harbor and docking it is a very dangerous activity……Ship’s pilots are very well paid….
A deer will pop out of the ditch and be right there before you can react to prevent a collision. It happens and has happened to me.
A child will do the same, dart out into traffic from between two parked cars. You stop and give the eight-year-old clear instructions to look both ways so they won't get run over.
It’ll be a bad day for a lot of people if you run over someone and kill them.
If you are Benjamin Netanyahu, you can use bombs and jets, you then can kill 20,000 Palestinian children. Bibi could be in a Tesla, run over a child, then leave the scene. Wouldn’t bother his conscience, he has none. The kid is to blame, not Bibi.
One death is a tragedy, 300,000 dead Palestinians, a statistic.
Bibi is such a woman.
Stand and Deliver
Said Bibi to the clueless slaves in the US Congress.
EV’s are very expensive to work on and have a huge number of problems, are very unreliable…….can’t get parts…….a ticket to bankruptcy….
they work…at stopping slave mobility….
To replace a $50 part….you have to drop the whole 1000 lb battery pack, disconnect all its coolant lines…30 hours? later at….. $140 to $190 per hour….call it roughly $4,200 to $5,700 in labor alone….
then no parts available….why would you buy something that breaks down all the time…and there are no parts available to fix it?…..it is very expensive to buy….and depreciates to zero in no time….
Scotty on EV’s….
https://www.youtube.com/watch?v=lphSEqZEX2E
Eric, be honest: the problem is that the public is stupid. You sound like the butt of your Nader rants here.
As a libertarian, are you seriously suggesting banning things from people who have every right to want this tech if it's their thing? Or was this just a nudge to the industry to make the tech better?
I mean we don’t like this tech obviously, but it’s your right to have it and your responsibility to not kill kids with it.
You can ride a bicycle with no handlebars for fun in a free society.
And I would hope I'd be free to brake suddenly while next to a Tesla to trick it out of auto-mode for a laugh, without being charged with manslaughter if something happens, because at the end of the day it's their responsibility.
People need to get healthy and get their common sense back and this wouldn’t be an issue.
And if I had a lot of money to spend on a car where I could ride around and PRETEND I was in the Jetsons, that's our right too! As long as we don't hurt someone else.
The gun isn't the murderer; the criminally negligent one is.
America needs to get healthy and smart again, and respect the right to travel freely however I damn well want. The way it works is the driver is responsible to drive around other people if they expect to get anywhere.
“There is no point to “self-driving” cars if the person behind the wheel is still considered to be the driver.”
I agree we should stop calling it “self driving” even with air quotes. It’s not, it’s a gimmick. And what’s wrong with a gimmick?
We have gotten away from honest language and created this problem. I blame the media and industry for leading us to call it anything but a gimmick.
We like gimmicks, and people smart enough to know the difference should have no problem in this department.
Hi Steve,
I do not support banning the technology. I ought to have been clearer. I support accountability. If a car equipped with this tech runs into something (or someone) then who is responsible? Morally – as well as legally? That’s the issue, I think.
As long as kids have the luck of THIS one, in a thankfully FICTIONAL dystopian Aussie future (from 1979), starting at 5:40:
https://www.youtube.com/watch?v=TMYLjlpP0NY
All because some Aussie single mother gets caught up in an argument with, presumably, the toddler boy’s father, and forgets to mind him, long enough for him to waddle onto the highway. Hilarity, at least for the time being, ensues, as no one gets killed, but it’s coming in spectacular fashion for the “Night Rider” and his unnamed “floozy”.
Maybe Dan Quayle, generally regarded as not the sharpest tool in the shed of politics, was right after all.
I figured you’d say as much. I thought that was worth clarifying. I’m in the absolutist camp when it comes to right to travel unimpeded.
The self-driving cars are the Waymos. (See CONEiggsegg).
The language is important in this regard, and it'd be interesting to see a short history of how this "self driving" moniker has been applied, and by whom.
Tesla is halfway between lane-keep assist and the full Waymo in this respect.
As far as the suppression of speech goes, I bet Jalopnik or someone is mass-flagging your posts, from what you describe. I'm definitely in the "Musk is self-interested" camp, but X has been uber-consequential this election cycle, toward the positive, as of late.
You are correct, Eric: One day it WILL be a kid. And then the fun starts, with the finger pointing and the accusations of who is ultimately responsible: The driver, or the self-driving program? In a real twist of irony, even if the driver were charged, at least in my neck of the woods, said driver would merely be charged with manslaughter, at best, and given 3 years. That happened when one lady (drugged up w/prescription meds) plowed through a school cross walk and killed a young boy.
The “woke” EV apologists will just say the kid was some MAGA spawn and got what (s)he “deserved”.
AGW comin’ thru to issue a speeding ticket!
(Replying to Douglas above)
He is right though. As Clarkson once said, one day someone will repair one of these themselves and this will happen, even if the programming isn't to blame, which is also possible. It's gonna happen.
I’m hoping for the day when Tesla-Skynet becomes self-aware and starts liquidating the morons who buy these stupid devices so they can sleep while the ‘autopilot’ takes them home…
Ha!
My 2024 SUV, which was not inexpensive, has essentially the same thing as what the Tesla people call "Self Driving". It's adaptive cruise control with "active lane keep assist" or whatever the buzzwords are. Well, that along with the "traffic jam assist", which means it will handle full stops and stop-and-go as well, above and beyond just adaptive cruise control.
I know that a lot of cars can do that. I had a rental car, a Nissan something or other — complete garbage car, but it had an engine at least — and it had the same shit, just a bit less refined.
But in either case, you can’t trust that shit to not do something majorly stupid. My car would have smashed into a couple fuckers like it was nothing, if I didn’t do something about it. It is absolutely the worst coming from *any* speed to full stop. I rarely allow it to do it.
And these things have a “keep your hands on the wheel” annoyance “feature”, where the self-driving will stop self-driving after so long of you not having your hands on the wheel. On my car, you have to at least wiggle the steering wheel or it will stop.
But they do dumb shit anyway. Especially when there is a turnout to the left or right. Inevitably they want to go into the turnout lane and then fight you when you correct it, as you're supposed to do! Fucken bullshit.
One of many problems with the concept of self-driving cars is the expectation that you will be ready to take over at less than a moment's notice. That might be true at first, but after a while you'll not pay attention at all because nothing ever happens. I guarantee you most people would soon be playing on their phones or doing crossword puzzles, then WHAMMO!!!! WTF was that?
Paul S., the driver who hit the deer says he got a dozen false stops a day; if that doesn't tell you that the technology is not fit for use, what will? It's shocking, but it may well be true that Clovers are safer drivers than self-driving.
I like how the guy bitches he can’t get Tesla “service” until January, a normal person would decide to ditch Tesla completely and get a real car.
Would a ‘normal person’ pay a significant premium for something that:
– Has half the range of an actual car
– Costs twice as much
– Depreciates like crazy
– Takes five to ten times longer to refuel, depending on whether you’re lucky enough to find a ‘fast’ charger
– Can’t be driven when it’s cold
Says it all, don’t you think?
Mind you, it's true that Elon is a genius. Just not when it comes to actual technology.
Years and years ago, when I briefly worked in retail, I had a boss who told me that one can sell even a polished turd, provided it is marketed properly.
Now the longer I’ve been around, the more I realize just how true he was…
Amen, John –
I do not trust Musk because I know he’s a grifter and a liar. Tesla is a grift. Fact. And X is not a “free speech” venue. Also a fact. Musk bent knee openly to the thugs in Brazil – for the sake of money. He has no principles – except as regards making money.
Eric, why are you “dissing” an AFRICAN-American? Yes, Elon Musk came to this country from the Republic of South Africa. So did the still-fetching Charlize Theron, another AFRICAN-American.
The existing infrastructure was built for active driving, not a tacked-on autonomy package. It will cost billions to make the roads compatible with autonomous vehicles.
Here in Colorado, the highways have high fences at the edge of the right of way. There are “one-way” escape mounds that animals can use to get out of the highway space. Many of the ramp crossings have cattle guards to prevent animals from entering the highway space using frontage roads or entrance ramps. Yet they still occasionally find their way onto the highway, and still sometimes meet their demise on the grill of a Kenworth tractor. And with winter on the horizon I’m sure there will be breaches of the fences as Texans and “new arrivals” fail to navigate turns and send their SUVs into the river.
Eventually people will figure out that autonomous vehicles will need to have their own isolated infrastructure. It will be walled off, much like elevated trains or subways. While your vehicle is in self-driving mode you will not be able to override anything except in a high level command way. When in a mixed environment the driver will be controlling it, probably with assistance from the sensor array.
“one might argue that edge cases are actually very important parts of any claimed autonomy suite given how drivers check out when they feel the car is doing the work (of driving).”
What kind of gibberish is THAT?
“edge case”? “autonomy suite”?
This is supposed to be an automotive journal? It's a bunch of geeks trying to sound cool because they use terms few do. Using obscure jargon does not make you an intellectual. It makes you a jerk.
Sod off, Jalopnik
Edge case: Only a few people died of “suddenly” after taking the vaccine. Otherwise they’re completely safe and effective!
So true. Especially combining use of obscure jargon with the phrase “one might argue”. Ugh!
Surprised that the supercomputer on wheels didn't even register that it had an impact.
Guess they didn’t code for that.
Edge case? Bullshit. It’s one of an infinite range of things that can happen while driving. That’s why there needs to be a human in control. Period.
Forcing slaves into very dangerous, defective FSD EV’s…
Commiela’s hidden agenda if/when installed….the joy (tax) they talk about…get the stick out…
Like the marxist just installed/elected in the UK….same agenda…..there is a big stick coming for slaves who refuse to buy EV’s…..
UK to impose BRUTAL taxes on PETROL and DIESEL cars…another one of the many different ways they can get rid of….ban….. ICE vehicles….
https://www.youtube.com/watch?v=rB19hV8WiYg
‘If you hit a kid, you’d stop the car and try to help the kid, call 911 and so on. Assuming you’re not a sociopath.’ — eric
Or hopped up on loco weed. In the 1936 film Reefer Madness, a THC-crazed fiend runs over a kid (which is obviously a cardboard cutout) then laughs his ass off.
Can’t be arsed to find the scene, but it happens after this drug-den encounter in which a seductive vamp puts a wholesome young man on the path to perdition by challenging his manhood (‘of course if you’re afraid …’). Her confederate laughs maniacally after the victim gives in and takes a fatal puff.
https://www.youtube.com/watch?v=q8oEOQIh1L8
It could happen to you.
Of course, the wacky weed then was blamed on “Meskins” (Mexicans), “Chinks” (Chinese), and “Knee-Grows” (Blacks) corrupting WHITE “yutes” (YOUTHS).
Having flown a plane equipped with an auto-pilot (AP) for over 20 years, I've always known that as PIC (pilot in command) I am responsible for every aspect of safe flight, e.g., taxi, takeoff, following airspace rules, landing, etc., whether I chose to engage the AP or not. You see what is happening here, Eric. The same thing that is/has occurred in just about all facets of life. Attorneys got paid to sprinkle holy water on the decision by an organization like Tesla to advertise the new systems as self-driving/AP. Then when a kid gets hit by a car operating on AP, they get paid more to defend that ill-advised choice. We could talk for days about this degradation of culture. Hell, when I was a kid, we were taught to 'stay out of the street'.
Tell me, does your autopilot work on taxiways? Why do you suppose that is? Even with the excellent maps and information available about airports, ground operations are completely manual.
While I only fly drones and simulators, I know that autopilot systems aren't designed for obstacles. My Skydio drone is a unique outlier, but it can still crash if the object in the way is too small to be detected, and it doesn't see moving objects (like birds) at all.
Autopilots are designed to operate in open sky, with plenty of separation between aircraft. And in an environment where the entire trip is highly planned and coordinated. There’s not much impulse flying going on. And certainly no checking email, dialing up podcasts or yakking with your wife/girlfriend on the phone.
Yesterday a buddy who is still only moderately experienced with drones almost crashed his DJI Mini 3 while reversing, with no view to the rear.
This was the ‘oh, sh*t’ moment when a pinon pine (at lower right) suddenly popped into the forward view screen, after the drone cleared it by maybe 6 inches. :-0
https://ibb.co/HK1M0hx
It will work on taxiways, except that one must adjust the heading bug… which was the point of that remark: that one cannot safely rely on systems designed as an aid to 'control' the trip to the destination. The gist of the post was that the attorneys have allowed the 'autopilot' idea to go 'mainstream'… as if it is the do-all, end-all solution to championing a mode of travel sans responsibility, and they make a hefty pile of fiat doing it.