Can a celebrity claim “Memelord immunity” over videoed statements due to deep fakes?
29 May 2023
In a wrongful death lawsuit stemming from a 2018 Tesla Model X crash that killed Apple engineer Walter Huang, Tesla attempted an unusual legal defence during discovery. The company argued that videos featuring Elon Musk discussing Autopilot reliability should be inadmissible because Musk, as a public figure, is frequently targeted by deepfakes.
Tesla’s Position
The company’s filing stated that “he, like many public figures, is the subject of numerous deep fake videos and audio recordings that purport to show him saying and doing things he never actually said or did.”
Court Ruling
Judge Evette Pennypacker rejected this blanket immunity approach, warning that under Tesla’s reasoning, “Mr. Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do.”
Instead, the judge ordered Musk to confirm or deny the videos’ authenticity under oath.
Disputed Evidence
The disputed videos included statements from 2014 shareholder meetings and Bloomberg appearances in which Musk claimed Autopilot could operate autonomously with minimal driver intervention.
Forensic Authentication
Deepfakes, meaning AI-generated or AI-manipulated media, pose real authentication challenges. However, established forensic techniques exist for examining disputed recordings, including metadata analysis, compression and error-level analysis, and comparison against independently archived copies, though the results are not always conclusive.
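As an illustration of the comparison-against-archive approach, the sketch below computes perceptual hashes of sampled frames from a disputed clip and an independently archived original, and flags frames that diverge. This is a minimal sketch rather than a forensic tool: the file paths are hypothetical, and it assumes both files contain the same footage at the same frame rate.

```python
# Minimal sketch: flag frames in a disputed clip whose perceptual hash
# diverges from an independently archived original. File paths are
# hypothetical; assumes both clips share footage and frame rate.
import cv2
import imagehash
from PIL import Image

def frame_hashes(path, step=30):
    """Yield (frame_index, perceptual_hash) for every `step`-th frame."""
    cap = cv2.VideoCapture(path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            yield idx, imagehash.phash(Image.fromarray(rgb))
        idx += 1
    cap.release()

archived = dict(frame_hashes("archived_original.mp4"))
for idx, h in frame_hashes("disputed_copy.mp4"):
    if idx in archived and (h - archived[idx]) > 10:  # Hamming distance threshold
        print(f"frame {idx}: differs from archive (distance {h - archived[idx]})")
```

Perceptual hashes tolerate the re-encoding artefacts that defeat byte-level comparison, which is why frames are compared by Hamming distance against a threshold rather than for exact equality.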
Recommendations
Organisations should consider cryptographically hashing official video statements, obtaining third-party date-stamps for those hashes, and archiving copies with an independent custodian, so that a recording’s integrity can be demonstrated if its authenticity is later disputed.
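A minimal sketch of the hashing half of that recommendation, using only the Python standard library, appears below: it records a SHA-256 digest of each video alongside a capture timestamp in a local log. The file names are hypothetical, and in practice the digest would also be countersigned by an independent third party, for example an RFC 3161 timestamping authority, so the organisation is not vouching for its own records.

```python
# Minimal sketch: record a SHA-256 digest of a video with a UTC timestamp.
# In practice the digest would be countersigned by an independent third
# party (e.g. an RFC 3161 timestamping authority); file names hypothetical.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 so large videos fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def record_statement(video: Path, log: Path = Path("statement_log.jsonl")) -> dict:
    """Append an integrity record for `video` to a local append-only log."""
    entry = {
        "file": video.name,
        "sha256": sha256_of(video),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    with log.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

print(record_statement(Path("official_statement.mp4")))
```

Recomputing the digest of a circulating copy and comparing it to the logged value shows whether the file has changed since capture; the independent countersignature is what stops the log itself from being backdated.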