Now we know who’ll be held criminally accountable when a “self driving” car kills someone.
It will be the person who wasn’t driving it.
Rafaela Vasquez wasn’t driving the self-driving Uber Volvo that struck Elaine Herzberg of Tempe, AZ.
Herzberg was walking her bicycle across the road when the driverless car ran her over, killing her.
Vasquez was reportedly watching a reality TeeVee show when the car struck Herzberg.
Neither Uber nor Volvo has been charged – but Vasquez has just been indicted on a charge of negligent homicide, a serious felony that carries the possibility of prison time upon conviction as well as the likelihood of financial ruin from the civil suit that will follow – and that will be predicated upon a successful conviction.
The prosecutor in Maricopa County says “Distracted driving is an issue of great importance in our community” and that “When a driver gets behind the wheel of a car, they have a responsibility to control and operate that vehicle safely and in a law-abiding manner.”
Which raises the vital question: Vasquez wasn’t driving the car. The car was driving itself, as it was designed to do by Volvo and meant to do by Uber. What is the point of a self-driving car if it requires a driver?
Isn’t it all about relieving the driver of the “responsibility to control and operate the vehicle”?
To – ultimately – eliminate the driver altogether?
That is what Uber (among others) wants, at any rate – drivers cost money and aren’t always reliable. Neither, as it turns out, are self-driving cars. Not just Uber’s cars and not just Volvos, either.
But the difference is that self-driving cars aren’t held responsible for recklessly driving themselves – nor are the people who programmed and pushed them, who put people like poor Vasquez in the bizarre position of being held responsible for what the car does. For what it was designed to do.
Or doesn’t – such as notice a person walking across the road and apply the brakes instead of the gas.
If the driver is expected – is legally obliged – to drive the driverless car, then the whole thing is bizarre, Kafkaesque. It is like No meaning Yes – except it really means No, even when she says it means Yes.
And then charging the poor guy with rape.
Which is essentially what has happened to Vasquez, who happens to be female – but the point stands. She was given a car that drives itself, then held responsible for letting it.
This isn’t an accident.
NHTSA – the federal “safety” apparat – is well-aware of the risk posed by self-driving cars because it is well aware of the many accidents and several deaths caused by them. Proportionately, self-driving cars are probably just as much of a threat as the WuFlu.
Yet NHTSA has not decreed that self-driving cars must be locked down – taken off the road in the interests of public safety. Just as it has not decreed that new cars may not be built with LCD touchscreens, which encourage people to take their eyes off the road to use them while driving.
Just as it has forbidden people with known-to-be-defective (and very lethal) air bags from even temporarily disabling them.
The point being that NHTSA isn’t a “safety” apparat. It is an apparat of the burgeoning technocracy, which intends to control literally every aspect of our lives, including our mobility.
Thus, certain technologies are given a hall pass.
Those technologies which further the aims of the technocrats.
So-called “self-driving” technology being the main one as regards mobility because it holds the promise of total control of mobility. The end goal is indeed a car without a driver. A car that drives where (and how) it is programmed to drive – and not programmed by you. A car that won’t drive at all if the technocrats so decide – based on your not being a Good Citizen, as defined by them. One can envision such a car equipped with Advanced Face Diaper Assistance technology, for instance. If you have not Diapered your Face (or had your shots) the car will not move. It may not even open the door.
Or it may lock the doors – with you inside. To await the Diaper (or shot) police.
In the meanwhile, Vasquez faces as much as six years in prison for what amounts to turning the radio on. If the Volvo that killed Herzberg hadn’t had self-driving technology, Vasquez would have had to drive it – and in that case, would have been responsible for driving it.
But to hold her responsible for using the car’s technology – the thing it was designed to do – is of a piece with charging a man with rape for having sex with a woman who invited him into her bedroom, took off her clothes, said come hither . . . and then screamed stop! in the middle of the act.
. . .
The “driver” of the driverless car is actually the programmer and/or the vehicle/software developer, neither of whom is identifiable by name. Therefore blame will be thrown upon whomever is most convenient, providing simple answers to complex questions. Bottom line for consumers should be . . . don’t buy or use that useless lethal shit, or you may go to prison for it – if it doesn’t kill you first, that is.
Getting to the “no driver controls” stage will probably happen in the air before it happens on the ground. But a lot of birds will be shredded in the carbon fiber prop blades of the air taxis along the way. There’s just less stuff to hit in the air than there is on the ground, and aircraft are much more purpose built than an automobile. Not to mention we have decades of history with automation in aircraft and there’s a built-in incentive to make sure it’s 99.9999% reliable. Oh, and it’s pretty easy to justify canceling a flight due to weather or other issues, again because the consequences of getting into a mess are almost always fatal, not just “might” be fatal.
That’s not to say the skies will be black with passenger drones though. They’ll still be playthings of the rich, just not the $1200/hour Gulfstreams we see today. More like the $300/hr Piper Cub but with more room and door-to-door service.
I remember this case when it happened. The problem was the bicyclist came out of nowhere. Most active drivers wouldn’t have been able to stop in time. The charge in this case is pretty bogus. I wouldn’t disagree with the driver being held responsible, because the vehicle was a test mule and she was there to watch over the automation. It would be no different than a driver’s training instructor being held accountable for allowing a student to hit something without intervening.
Hi Mattacks,
The thing is, we’ll never know. Which is precisely why it’s the car that’s the problem. It is clearcut when a driver’s inattention results in loss of control/damage. It is unclear when it is the car that is driving. A driver either is – or is not. A driver cannot pay attention while not paying attention and if he is paying attention – and “ready to intervene,” as by having his hands on the wheel – then the car isn’t driving.
I work in this industry, building technology for self-driving cars that don’t kill people. You don’t hear about us because our software works, so there’s no drama – but we also don’t make false claims about our capabilities.
There’s an industry shift going on right now which worries me, but first some classification information. Cars like the Uber above are classified as “Level 2” self-driving, L2 for short. L2 means driver assistance, not self-driving, and cars classified this way always caution drivers to pay attention. It’s an impossible task for a person to stay alert while not doing anything. L3 means the car can drive by itself in some circumstances (like a clear, sunny day) and will refuse to drive if it can’t drive safely without human intervention, but the human is still expected to intervene in emergencies. L4 and L5 cars do not require human intervention at all. Until you can guarantee this safely, you don’t call your car L4.
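To make that split concrete, here’s a minimal sketch – purely illustrative, not code from any real system; the level names track the descriptions above (and the SAE J3016 taxonomy), but the table and helper function are invented:

```python
# Illustrative only: who the fallback is at each SAE-style automation level.
# Descriptions paraphrase the comment above; nothing here is from a real codebase.
SAE_LEVELS = {
    2: ("Partial automation (L2)",
        "car assists with steering/braking; human must watch constantly"),
    3: ("Conditional automation (L3)",
        "car drives in limited conditions; human must take over on request"),
    4: ("High automation (L4)",
        "car drives itself within its design domain; no human fallback"),
    5: ("Full automation (L5)",
        "car drives itself everywhere; no human fallback"),
}

def human_is_fallback(level: int) -> bool:
    """Per the comment: through L3 the human is still the practical
    (and legal) fallback; at L4/L5 the system must own its failures."""
    return level <= 3

for lvl, (name, duty) in SAE_LEVELS.items():
    tag = "human liable" if human_is_fallback(lvl) else "system owns it"
    print(f"{name}: {duty} [{tag}]")
```

The “L2+” marketing trick described below amounts to shipping a car whose behavior sits in the L3/L4 rows while its liability sits in the L2 row.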
So what Saint Elon has done is convince companies that it’s OK to ship a car that is sometimes L4 or L5, but call it L2 for liability reasons. See, the car drives itself most of the time with no attention required from you, the driver – except when it doesn’t, and then whatever happens is your fault. So you will see cars advertised as “L2+” before long, which basically means L3/L4 capability but with all the responsibility on you.
It frustrates me to no end that this is happening. We make the software that lets the car identify everything in its surroundings. Cars which can do this safely use every sensor that’s practical: cameras, lidar, radar, sonar. These are built in layers of failsafes, so if anything is amiss, the car will pull over (or even stop) and say it can’t drive safely anymore. When this tech is integrated into a car, you get a car which is safe but which sometimes refuses to drive, or pulls over, because it’s not sure it can drive safely. This is not something car companies want to market – the truth of what we can do today isn’t as cool as a hypothetical future tech – so they lie their asses off, make promises about full self-driving, but still classify the car as L2, making the driver liable.
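The layered-failsafe behavior can be sketched the same way. A minimal, hypothetical example – every sensor name, threshold and action label here is invented for illustration, assuming a simple per-channel confidence check:

```python
# Hypothetical sketch of layered sensor failsafes: if any channel degrades,
# the car stops trying to drive and executes a minimal-risk fallback.
from dataclasses import dataclass

@dataclass
class SensorReport:
    name: str          # e.g. "camera", "lidar", "radar", "sonar"
    healthy: bool      # hardware self-test passed
    confidence: float  # 0.0-1.0 confidence in current conditions

MIN_CONFIDENCE = 0.9   # assumed threshold; real systems tune this per sensor

def plan_action(reports: list[SensorReport]) -> str:
    degraded = [r for r in reports
                if not r.healthy or r.confidence < MIN_CONFIDENCE]
    if not degraded:
        return "CONTINUE"        # all layers agree: keep driving
    if len(degraded) < len(reports):
        return "PULL_OVER"       # partial degradation: minimal-risk maneuver
    return "EMERGENCY_STOP"      # nothing trustworthy left: stop now

# Fog scenario: cameras lose confidence while lidar/radar stay healthy.
print(plan_action([
    SensorReport("camera", True, 0.40),
    SensorReport("lidar",  True, 0.95),
    SensorReport("radar",  True, 0.97),
]))  # -> PULL_OVER
```

The design point is that the safe answer to degraded input is to stop driving – which is exactly the behavior that doesn’t market well.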
Hi Opposite Lock!
This – “It’s an impossible task for a person to stay alert while not doing anything” – is the key to the whole mess. Either a car can self-drive or the driver has to drive. It can’t be in-between. The problem, as you’ve explained, is that a safe self-driving car has to be programmed to pull over when it is “impaired” – and that won’t fly. So we play-pretend, as per usual.
Thank you for the info. Since we get nothing from anyone else lol.
The fat lady hasn’t sung yet on this one.
But, assuming it goes the way it looks like it’s going to . . . who in their right mind would take on that kind of responsibility for some programmer’s potential mistake? Particularly when it comes at an extremely high price tag? I’d rather drive myself, or take the bus. If I’m at the wheel, I know when I screwed up. If I’m on the bus, I know it can’t be my fault. I also know things on the road can happen very suddenly – quite possibly faster than I could force a manual override, even assuming an automated system actually allows for that, and assuming I’m actually paying attention (why would I, when the machine is supposed to be doing the work?).
Cruise control is great, & I use it when conditions allow (highway, light traffic)…but it still takes a split second to get my feet back onto the pedals when they are needed (construction zone, slowing traffic, certain merge points, etc.).
Hi Publius,
Yup. The Catch-22 with “self driving” cars is that the person behind the wheel is still expected to supervise the drive – yet encouraged not to drive. It’s like being encouraged to eat – but not to swallow.
And it was today I saw a quote from Eloi himself that said “I fear AI will bring the demon”. He got one thing right.
Ban people from being passengers in self-driving vehicles. All self-driving vehicles should be driverless and passengerless. All you have to do is have the self-driving vehicles travel in a circle all of the time.
Problem solved.
I would never feel safe in a self-driving car. I wouldn’t do it.
Self-driving tractors are okay – the concept self-driving tractor doesn’t even have a cab.
Self-driving tractors have been around for about twenty years and were in use up in Saskatchewan when I first saw video on tv of a self-driving tractor. Not much traffic out in the middle of a wheat field, so it is rather safe to do.
Hi Drump,
Yup. And self-driving tractors are restricted to fields and private property. No issues there – for those reasons.
But the person killed wasn’t in the vehicle at the time. The solution will probably be to ban things other than self-driving vehicles from the roads, or construction of yet another transportation network that has no level crossings with the existing infrastructure.
More fences, more signs and less freedom of movement.
Thanks for the update . . . Based on current information, here’s how I see liability with computer-driven vehicles. When that Class 8 self-driving tractor trailer plows up the ass of a multitude of cars at a stop light or off ramp somewhere, it will be the fault of those drivers in front of the big rig, because – according to law enforcement and big tech – they “should have seen the 18-wheeler coming and avoided the collision by moving out of the way.”
They’re ALL responsible. The car companies, the self-driving tech companies, the driver, Uber – all of them should be in jail. But I don’t have a say in anything because I’m just a serf in a dictatorship/“representation” society, and our “representatives” are all traitors. This is just one of the many toys the traitors use to torture & murder the serfs, and also to destroy the entire car industry (IOW the entire transportation infrastructure).
“But to hold her responsible for using the car’s technology – the thing it was designed to do – is of a piece with charging a man with rape for having sex with a woman who invited him into her bedroom, took off her clothes, said come hither . . . and then screamed stop! in the middle of the act.”
It’s worse than that. It’s the old story about the woman who didn’t know she’d been raped until the check bounced.
Being liable for the self-driving car’s actions is quite like being a man and dating. You have no rights, the entire system is set up so you can’t win and they can’t lose, and you are at the mercy of circumstance and the moral choices of others.
Your only option is to not be in the “driver’s seat” of a self-driving car. That’s easy now, while we still have the choice not to be in one; but when there is no choice, we’ll have to either not drive or accept that anything that happens will be on us. If you want to go places, you’ll just have to accept the risk that whatever the car does or doesn’t do, whatever happens, you’ll be liable – because that benefits the system.
Just as today a man who chooses to date instead of abstain, accepts the risk that it will all be on him, if things don’t work out.
‘NHTSA isn’t a “safety” apparat. It is an apparat of the burgeoning technocracy apparat, which intends to control literally every aspect of our lives, including our mobility. Thus, certain technologies are given a hall pass …’
… such as EVs. General Motors, politically a ward of the state ever since Uncle bailed it out in 2009, knows where its bread is buttered:
(Reuters) – General Motors Co GM.N is set to announce plans on Wednesday to put into production an interchangeable “family” of electric vehicle (EV) drive systems and motors, boosting manufacturing efficiencies as it transitions to a fully electric lineup.
The move, which follows earlier GM initiatives on next-generation batteries, comes as the Detroit automaker looks to build a vertically integrated electric car business, comparable to Tesla TSLA.O, inside its ongoing operations.
GM said its new electric drive systems, sometimes referred to as e-axles in the industry, will have a versatile enough power output to allow them to be used with vehicles ranging from beefy pickup trucks to performance vehicles.
The new technology highlights GM’s effort to transform itself and catch up with Tesla, whose share price has jumped over 400% this year as it has reported improved profitability.
https://www.reuters.com/article/us-autos-gm-ev-exclusive/exclusive-gm-to-manufacture-own-family-of-ev-drive-systems-motors-idUSKBN2671RW
————
Ain’t that pathetic? GM is so bereft of ideas and technology that Mary Barra thinks it has to “catch up” with Tesla.
GM’s executive suite is driven by options envy. Unlike Elon with his abusive (to shareholders) compensation plan, GM’s top brass aren’t getting rich from a bubbling share price.
With nearly ten times GM’s market capitalization, Tesla could actually buy out GM and send Mary packing. But owning all those smokestack factory assets that rust in the rain would risk popping Tesla’s bubble, which is built on hype more than tangible assets.
From what I understand, Vasquez was paid by Uber to act as a “Human Safety Monitor”, a paid backup driver ready to take over for the autonomous systems if the situation warranted intervention.
I understand some deep pockets, including the telecom and phone manufacturers who enabled the distraction, are throwing a convenient target into the maws of the legal system and running for cover, but the decision to stream reality TV was still as much the driver’s as if she got behind the wheel drunk. Vasquez was “on the clock” — even if it was a “gig” job, I imagine the hourly rate reflected the level of responsibility.
If her job was to monitor the car, then of course she’s responsible to some extent – and that clouds the fundamental driver/driverless question. This particular scapegoat case would drill down into the details of the responsibility and authority of those involved. Say a bus or aircraft crashes: the driver or pilot, the mechanics and the engineers along the way may have varying degrees of liability for whatever issue arose. It’s not always as clear-cut as the person in the left chair (so to speak) being 100% responsible. There are often design flaws or extenuating circumstances, such as incorrect maintenance. I don’t know how often such things result in an individual being blamed, but they can exonerate the driver or pilot from primary fault. Other than nine-eleven, it’s almost never the fault of a passenger – which is what I’d think you have to consider the “driver” in a driverless vehicle.
Hi Anon,
This is the nut of the problem. Absent a mechanical failure or act of god, when I drive my ’02 pick-up, I am the one responsible for controlling it. Period. No gray area. With “self driving” cars, the driver is encouraged to not be responsible. This idea that Vasquez (or anyone else) should have been “monitoring” the car is absurd. Either you’re driving – or you’re not.
I think with a good attorney, she could have a chance to fight this.
Hope she has a lot of money for that good attorney. I also bet that she will not be allowed to set up a crowdfunding account, as the PTB want her to be the scapegoat.
I am a lawyer.
Hi Ugg,
A lawyer question for you: I see this as the equivalent of handing a child a loaded gun and then advising them to “be careful.” In that case, the person who handed the kid the gun would be liable, criminally and civilly, yes? Granted, the Uber driver is an adult – but the underlying principle strikes me as essentially the same; i.e., the person was encouraged to use something inherently dangerous and ended up causing harm thereby.
Something that by its very nature encourages the user not to pay attention, yet requires the user to pay attention – with death as a consequence – would never be allowed on the market by any properly functioning product-development engineering system.
This is why the Big Three are “behind” on these systems. Their internal processes were set up decades ago and are at this point practically etched in stone, and there’s no way these robot-car systems can make it through without executive dispensation. And good luck getting an executive to create a paper trail like that for one of these robot-car systems.
The fact they are on the market and the government allows it shows that the ruling class wants these things.
Hi Brent,
Yup. Opposite Lock made the point about being alert while not paying attention. It’s not possible. In order for a driver to be truly in control of the car, the driver must be in control of the car. Not “ready to intervene.” There is always going to be a delay when your hands aren’t on the wheel.
I can relate my personal experiences with elements of self-driving/partial self-driving tech. Fog, ice, shade/road berms – things of that nature – confuse the cameras/radar, etc. that provide the sensory input for these systems. They sometimes over-react – and sometimes don’t react at all.
A “safe” self-driving car would automatically stop driving – halt/pull over – when its systems were unable to deal with external conditions, such as fog/ice/snow, etc. But that wouldn’t market well. So people are lied to about what the tech is capable of – and then held responsible for what it isn’t capable of.
I think so, too. There HAS to be a personal injury or criminal defense attorney out there just itching to take on a case like this to set a precedent.