Tesla engineer testified that promotional self-driving video was staged


A 2016 video that Tesla used to promote its self-driving technology was staged to show capabilities like stopping at a red light and accelerating at a green light that the system lacked, according to testimony from a senior engineer.

The video, which remains archived on Tesla’s website, was released in October 2016 and promoted on Twitter by chief executive Elon Musk as evidence that Tesla’s vehicles could drive themselves.

But the Model X in the video was not driving itself using technology Tesla had deployed, Ashok Elluswamy, director of Autopilot software at Tesla, said in a transcript of a July deposition taken as evidence in a lawsuit against Tesla over a fatal 2018 crash involving a former Apple engineer.

Elluswamy’s previously unreported testimony represents the first time a Tesla employee has confirmed and detailed how the video was produced.

The video carries a tagline that says: “The person in the driver’s seat is only there for legal reasons. They are not doing anything. The car is driving itself.”

Elluswamy said Tesla’s autopilot team planned and recorded a “demonstration of the system’s capabilities” at Musk’s request.

Elluswamy, Musk and Tesla did not respond to requests for comment. However, the company has warned that drivers should keep their hands on the wheel and maintain control of the vehicle when using autopilot.

Tesla’s technology is designed to assist with steering, braking, speed and lane changes, but those features “do not make the vehicle autonomous,” the company says on its website.

To create the video, Tesla used 3D mapping on a predetermined route from a house in Menlo Park, California, to Tesla’s then-headquarters in Palo Alto, he said.

Drivers intervened to take control during test runs, he said. While trying to show that the Model X could park itself without a driver, a test car crashed into a fence in Tesla’s parking lot, he said.

“The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system,” Elluswamy said, according to a transcript of his testimony seen by Reuters.

The Justice Department is investigating after a series of accidents

When Tesla released the video, Musk tweeted: “Tesla drives itself (no human input at all) thru urban streets to highway, then finds a parking spot.”

Tesla is facing lawsuits and regulatory scrutiny over its driver assistance systems.

The US Department of Justice launched a criminal investigation in 2021 into Tesla’s claims that its electric vehicles can drive themselves, following a series of accidents, some fatal, involving Autopilot, Reuters reported.

WATCH | More questions for Tesla after 2 deaths in Texas crash:

Authorities are investigating a fatal Tesla crash with no one in the driver’s seat

Two people died after their Tesla Model S crashed into a tree and burst into flames in The Woodlands, Texas, on April 17. One person was found in the front passenger seat and another in the back, leading authorities to investigate whether the car was operating in the full self-driving mode that Tesla was promoting ahead of a more extensive upgrade to its semi-automated driving system.

The New York Times reported in 2021, citing anonymous sources, that Tesla engineers had created the 2016 video to promote Autopilot without disclosing that the route had been mapped in advance or that a car had crashed while trying to complete the shoot.

When asked whether the 2016 video showed the performance of the Autopilot system available in production cars at the time, Elluswamy said, “It does not.”

Elluswamy was named in a lawsuit against Tesla over the 2018 crash in Mountain View, California, that killed Apple engineer Walter Huang, 38.

Andrew McDevitt, a lawyer representing Huang’s wife and who questioned Elluswamy in July, told Reuters it was “obviously misleading to feature a video without a disclaimer or an asterisk.”

The National Transportation Safety Board concluded in 2020 that Huang’s fatal crash was likely caused by his distraction and the limitations of Autopilot. It said Tesla’s “ineffective monitoring of driver engagement” contributed to the crash.

Elluswamy said drivers could “fool the system,” making the Tesla system believe they were paying attention based on feedback from the steering wheel when they were not. But he said he saw no safety issue with Autopilot if drivers paid attention.
