Tesla “AutoCrash” System Probed by Feds

A second federal safety agency has opened a probe of an incident in which a Tesla electric car on Autopilot rammed into a parked firetruck, a sign of the scrutiny the emerging automated and drive-assist technology is receiving.

The National Highway Traffic Safety Administration is sending investigators to California to evaluate the crash, according to a person familiar with the agency’s plans; Reuters also reported the development. The National Transportation Safety Board announced Tuesday that it, too, was sending investigators to examine the Monday morning incident, in which a Model S hit the truck while it was assisting at a separate accident on the side of a freeway near Los Angeles.

The probes highlight the growing concerns about cars that are increasingly capable of automated operations, which have been permitted with limited government oversight. The NTSB, for example, rarely conducts investigations of highway accidents, generally opening probes of cases that have multiple fatalities or involve broader safety concerns.

NHTSA’s Special Crash Investigations unit typically probes around 100 crashes per year, often non-fatal ones.

The Tesla’s driver said he had the vehicle’s Autopilot driver-assist system engaged when it struck the firetruck, according to a tweet on Monday from a union for the Culver City, Calif., firefighters. “Amazingly there were no injuries! Please stay alert while driving!” the union said.

Tesla said in a statement Monday that Autopilot is “intended for use only with a fully attentive driver.” The company said it has taken steps to educate drivers about the need to keep their hands on the steering wheel and be prepared to take over from Autopilot, which it calls an “advanced driver assistance system” that is not intended to turn the vehicle into an autonomous car.

A Tesla spokeswoman declined to comment on the investigations.

7 COMMENTS

  1. I now understand why GovCo is pushing these automated vehicles. Just imagine how many more people are going to be needed at the NTSB if every time one of these things has a fender-bender they have to send TWO teams to investigate…on our dime of course.

  2. The fatal flaw in the expectation of “an alert driver taking over control” is that you would actually have to be Hyper-Alert and Hyper-Aware of every little nuance of the Auto-Drive operation. And really, how do you know when the computer does not recognize a “clear and obvious” danger, one you have already spotted?
    I can see all of us just sitting there, hands above the wheel, sweating bullets, and saying “are you gonna change lanes yet, do you want me to take over, we’re getting really close now.” The problem is that the “car” does not provide adequate feedback to the driver: you have absolutely no clue what it is, or is not, aware of in the driving environment, and it gives you no indication until you hit another vehicle or a stationary object.
    Tesla appears to be very adept at using hindsight to blame its customers, but offers no insight on how to predict a computer program error in flight. This is tantamount to putting the pilot of a Boeing 707 in the cockpit of a new Airbus with fly-by-wire computer controls, and telling him to just fly it like a 707. NOT.
    It just amazes me how the ground-bound general populace is so completely ignorant and naive about their own transportation devices, let alone ones run by computer. You would think that society is still on the mental level of cavemen, only armed with nuclear missiles. Look at any video of how the Russians drive; that will give you some idea of what I am talking about. Most of the public has difficulty just following simple traffic signals and basic driving instructions. Putting people in a complex automated machine and telling them to closely monitor it at all times for any possible path deviation just does not work. I agree that what we are seeing time and time again with Tesla is an attitude of “customer = lab rat” for its own financial gain.

  3. Most of the autonomous systems use an advanced (expensive) LIDAR system to find their way through the world. Teslas have visible-light cameras, which are far less expensive and have no moving parts, and use “machine learning” systems from Nvidia (best known for its high-end PC gaming graphics cards) to interpret the images.
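
    For anyone curious, here is roughly what a camera-only pipeline looks like when wired up from off-the-shelf parts. To be clear, this is a toy sketch using a stock pretrained detector, not Tesla’s actual software, and the file name is just a placeholder:

        # Camera-only perception sketch -- illustrative, NOT Tesla's stack.
        import torch
        import torchvision
        from torchvision.transforms.functional import to_tensor
        from PIL import Image

        # A stock pretrained detector stands in for the real perception network.
        model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
        model.eval()

        # "dashcam_frame.jpg" is a placeholder for a single camera frame.
        frame = to_tensor(Image.open("dashcam_frame.jpg").convert("RGB"))
        with torch.no_grad():
            detections = model([frame])[0]

        # A 2-D camera yields boxes and confidence scores but no direct range
        # measurement; LIDAR hands you the distance to the obstacle for free.
        for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
            if score > 0.5:
                print(int(label), float(score), [round(v, 1) for v in box.tolist()])

    Everything downstream hangs on how good that confidence score is for a stationary object at highway speed.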

    Elon wants to have it both ways: a system that is marketed as a self-driving car but is cheap and doesn’t detract from the “sexy” design (LIDAR systems look like the old roof lights of 1970s-era police cars). But they are depending on their less-than-tech-savvy owners to know that these systems aren’t ready for prime time.

    Unfortunately these days everyone thinks they’re a techno-expert because they successfully rebooted their phone when it acted up, so they just assume everything is as reliable as their iPhone (which, given the complexity of today’s software and all the data going back and forth, is actually pretty amazing). But when your phone crashes, it sucks. When your driver-assist system crashes, it kills. Because Elon doesn’t really care about you, only about advancing technology and being first to market (fail early and often is the mantra of Silicon Valley; in other words, throw anything against the wall and see what sticks), a few thousand dollars of damage or a couple of deaths are acceptable if it means he can gain an early foothold in the sector. It’s not him that’s being killed, only his customers.

    The larger problem is that because this system exists in the marketplace, many people are probably under the impression that Uncle’s minions have approved of it. After all, the saintly federal regulators would certainly make sure there’s nothing improper going on, right? This isn’t 1965, Elon isn’t an evil GM executive (we love him, right?), and the government isn’t beholden to big business (except the evil Trump character). So it is implied that the Tesla Autopilot is somehow safe. Just like raw milk should be outlawed because Uncle says it’s impossible for a dairy to handle it safely, and bump stocks are only useful for evading the full-automatic restrictions (which maybe they are, but they’re also a good demonstration of human ingenuity when presented with an obstacle).

  4. I think we are much further from computer-driven vehicles than we have been led to believe. If it can’t see a big red firetruck, or the side of a tractor-trailer, it’s not ready for prime time. Not even close. They will ram this stuff down our throats along with electric cars. They want the control that bad.

    If it wasn’t “tech” from our favorite crook, the government would make them keep this crap on the test track.

    I really don’t think it would ever bring down the accident rate either, even if you got rid of human drivers. Computers are very glitchy and delicate. They break down, and software still “crashes” too often.

    • See my post above. The problem is one of sensors, and of volume. One reason an aircraft’s autopilot works so well is that aircraft are pretty scarce and can therefore be spread out over great distances: minimum separation is 1,000 feet vertically, and typically 10 nautical miles for visual flight or 5 NM under radar. If your vehicle couldn’t get closer than 5 NM to another, we would have had autonomous automobiles years ago. But then a driver’s license would be as rare as a pilot’s license.
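
      To put rough numbers on that (my own back-of-the-envelope assumptions: a stopped truck, a 65 mph closing speed, and a guess of 150 meters of useful camera detection range, which is not any published spec):

          # Comparing the aviation buffer to a highway sensing budget (rough sketch).
          NM_TO_M = 1852.0      # nautical miles to meters
          MPH_TO_MPS = 0.44704  # miles per hour to meters per second

          aircraft_buffer_m = 5 * NM_TO_M   # 5 NM radar separation
          sensor_range_m = 150.0            # assumed useful camera range
          closing_mps = 65 * MPH_TO_MPS     # closing on a stopped truck

          print(f"aircraft buffer:  {aircraft_buffer_m:.0f} m")                  # ~9260 m
          print(f"seconds to react: {sensor_range_m / closing_mps:.1f} s")       # ~5.2 s
          print(f"buffer ratio:     {aircraft_buffer_m / sensor_range_m:.0f}x")  # ~62x

      Five seconds to detect, classify, decide, and brake, versus miles of margin in the air. That is the volume problem in one number.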

      Since autonomous systems have to fit into the existing environment, with lots of vehicles and very close spacing, the programmers and designers are looking for any shortcut they can get, and that will include vehicle-to-vehicle communication (a shortcut that lets all vehicles be aware of each other), dedicated roads, and just massive amounts of data collection. These systems are all very invasive. It would be far better for the consumer if the autonomous system had all the knowledge needed to operate on board, much like our brains. But that’s still a few generations out; it’s very hard, and the marketing hype machines want autonomous cars NOW. So we’ll end up with shortcuts.
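
      To make the invasiveness concrete, here is a mock-up of the kind of “here I am” broadcast people have in mind, loosely in the spirit of the proposed V2V basic safety messages; the field names and values are entirely my invention:

          # Hypothetical V2V broadcast -- illustrative only, not any real standard's format.
          from dataclasses import dataclass, asdict
          import json
          import time

          @dataclass
          class SafetyBroadcast:
              vehicle_id: str   # the invasive part: every car announces its
              lat: float        # identity and position to everything in radio
              lon: float        # range, many times per second
              speed_mps: float
              heading_deg: float
              timestamp: float

          msg = SafetyBroadcast("veh-1234", 34.0211, -118.3965, 29.1, 270.0, time.time())
          print(json.dumps(asdict(msg)))  # what gets blasted out over the air

      Multiply that by every vehicle on the road, several times a second, and you can see why the data-collection crowd loves the idea.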

      This will end badly.
