Deepfake Defense? Court Orders Elon Musk To Answer Under Oath Autopilot Questions In Tesla Fatal Car Crash

Elon Musk has been compelled to give a deposition in a lawsuit accusing Tesla’s Autopilot technology of causing a fatal collision, after the carmaker argued that the chief executive’s public claims about the system could have been deepfaked.

Judge Evette Pennypacker of Santa Clara County Superior Court stated Tesla’s case for why its wealthy CEO should not testify was “deeply troubling to the court.”

The company had maintained that it could not guarantee the authenticity of videotaped interviews in which Musk promotes its driver-assistance technology, claiming that some of them may have been digitally manipulated.

“Their position is that because Mr Musk is famous and might be more of a target for deep fakes, his public statements are immune,” Judge Pennypacker wrote. “In other words, Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do.”

The judge added that Musk can be questioned for up to three hours about certain statements he made about Tesla’s assisted driving features.

His testimony would be in addition to the hundreds of hours of depositions already provided by other witnesses in the case, which is set to go to trial this year.

The case was filed by the family of Walter Huang, an Apple employee who died during his morning commute when his 2017 Model X crashed into a concrete highway barrier approximately 45 minutes south of San Francisco. According to the family, the Autopilot system failed and steered the car into the barrier.

Huang’s hands were not detected on the steering wheel three times during the 19 minutes preceding the collision, during which Autopilot issued two visual alerts and one aural alert for hands-off driving.

According to the National Transportation Safety Board, Huang was playing the video game Three Kingdoms on his phone at the time of the crash.

Lawyers for the Huang family have requested documentation from Tesla to back up Musk’s repeated public assertions about the company’s progress in developing self-driving technology between 2014 and 2017.

Some of those statements have been highlighted in recent customer lawsuits accusing Tesla of failing to deliver on Musk’s long-standing promise to build a self-driving car.

Musk will likely be questioned about a 2016 statement highlighted by the plaintiffs in which he allegedly stated: “A Model S and Model X, at this point, can drive autonomously with greater safety than a person. Right now.”

The plaintiffs also say that Musk was directly involved in producing a 2016 promotional film that stated, “The car is driving itself.” The video showcased some technology that did not exist at the time, according to the plaintiffs, who cited multiple Tesla engineers.

The family’s lawyers have asked the judge to sanction Tesla for failing to appropriately respond to their requests for information during the pretrial discovery process.

However, in her decision on Wednesday, Judge Pennypacker refused that request.

The lawsuit is set to go to trial on July 31, adding to the escalating legal and regulatory scrutiny of Tesla’s Autopilot system.

Beyond the lawsuit, Tesla’s Full Self-Driving (FSD) feature is facing mounting trouble on the legal, regulatory, and commercial fronts. In February of this year, around 363,000 Tesla vehicles equipped with the feature were recalled. Affected models include the 2016–2023 Model S and X, the 2017–2023 Model 3, and the 2020–2023 Model Y that are equipped with, or are pending the installation of, FSD Beta.

However, it was later revealed that the electric vehicle giant did not disclose the National Highway Traffic Safety Administration’s (NHTSA) recall request in its most recent earnings report. Instead, the firm reported more than $1 billion in deferred revenue from FSD.

Although classified as a “recall,” the fix will reportedly be delivered via an over-the-air, or OTA, software update. Certain supporters of the brand have taken issue with the term being applied here, given that the automaker can address the issue remotely without any vehicle being physically returned.

On top of the nearly 363,000 vehicles being recalled in the United States, Tesla is also recalling FSD-equipped models in Canada. According to Transport Canada, the American carmaker has filed a notice of defect affecting 20,667 vehicles in Canada, whose owners will be notified by email and supplied “an over-the-air firmware update.”

Moreover, a research paper by researchers at Delft University of Technology highlighted the growing complacency and risky behavior encouraged by Tesla’s standard Autopilot and FSD Beta program.

The study draws on 103 in-depth interviews with drivers whose vehicles were outfitted with Tesla’s FSD Beta and standard Autopilot. It examines drivers’ direct short-term and indirect long-term behavioral adaptations to using the systems, as well as changes in their travel behavior.

Amid the controversy over the safety and alleged misrepresentation of the self-driving feature, documents obtained under a Freedom of Information request, disclosing email exchanges between Tesla and the California Department of Motor Vehicles, show that the automaker said it expects FSD functionality to “remain largely unchanged in a future, full release to the customer fleet.”

“We are analyzing the data obtained in the pilot and using it to refine the feature’s operation and customer experience,” said Tesla in a December 2020 letter signed by Associate General Counsel Eric Williams and addressed to California DMV’s Chief of Autonomous Vehicles Branch Miguel Acosta. “That said, we do not expect significant enhancements in OEDR or other changes to the feature that would shift the responsibility for the entire DDT to the system. As such, a final release of City Streets will continue to be an SAE Level 2, advanced driver-assistance feature.” In SAE terminology, OEDR refers to object and event detection and response, and the DDT is the dynamic driving task; at Level 2, the human driver remains responsible for supervising the vehicle at all times.

Last March, Transportation Secretary Pete Buttigieg criticized Tesla over its Autopilot feature, saying, “I wouldn’t call something ‘Autopilot’ if the manual explicitly says that you have to have your hands on the wheel and the eyes on the road all the time.”

A California bill filed back in September 2022 also targeted Tesla’s FSD program, taking aim at the electric vehicle maker’s marketing of the feature as “full self-driving,” which implies in plain English that the car can be completely autonomous when it cannot.

However, in a recent tweet, Musk said that following the latest FSD update, Tesla vehicles equipped with the program are now actively driving approximately 1 million miles per day across the fleet.


Information for this briefing was found via Yahoo Finance, Reuters, and the sources mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.

