If TikTok Is Liable for ‘Blackout Challenge’ Video — Is Facebook Liable for Sharing CDC Press Releases Advising COVID Patients Not to Take Ivermectin?
A federal appeals court in Philadelphia last week handed down a decision against TikTok that could have outsized effects on internet censorship.
The U.S. Court of Appeals for the 3rd Circuit ruled that a lawsuit filed against TikTok by the mother of a 10-year-old girl seeking to hold the social media platform liable for her daughter’s death can proceed.
The 3rd Circuit is the same court that in July threw out the “Protocol 7” fraud case against Merck.
The lawsuit against TikTok centered on a video posted on the platform of a person asphyxiating herself until she passed out and challenging viewers to upload videos of themselves doing the same.
The plaintiff, Tawainna Anderson, was the mother of the girl who died attempting the challenge.
Clearly, the people who offer such “challenges” online are guilty of a provocation that predictably will result in some people dying.
However, the legal question is whether the host site, TikTok in this case, can be held liable for the algorithm that offered this video in someone’s personal feed.
The Communications Decency Act of 1996 includes Section 230, which explicitly immunizes social media websites from liability for content they did not create but that was uploaded by others.
In Anderson v. TikTok, the court recognized this provision but held that TikTok's algorithm "amalgamat[ed]" third-party videos that it predicted would interest particular users.
It is that amalgamation, automatically prepared for the plaintiff's daughter, that the court said creates liability for TikTok.
An appeals court decision is binding precedent for federal courts in the states of the 3rd Circuit and persuasive authority for federal courts elsewhere. This one raises a number of questions and possibilities.
Legal decisions often have unintended consequences
TikTok, known as a free-speech platform, has been targeted by the U.S. Congress. In April, President Joe Biden signed into law a bill that would ban TikTok from app stores in the U.S. unless its Chinese parent company, ByteDance, sold TikTok to a more friendly parent company by the end of 2024.
Telegram is another free-speech platform under attack. Telegram’s founding CEO, Pavel Durov, was arrested last week in Paris, reportedly for refusing to comply with French government requests for users’ confidential information.
Durov is Russian, but he grew up in Italy and now lives in Dubai.
Facebook, X (formerly Twitter) and YouTube also "amalgamate" content and recommend videos to users. But it is difficult to imagine a U.S. court treating Facebook or YouTube the way TikTok and Telegram have been treated.
During the pandemic, Google, Twitter and Facebook actively directed subscribers to videos that discouraged the use of ivermectin and hydroxychloroquine as early treatments for COVID-19.
Heirs of a person who died of COVID-19 might argue in a future lawsuit that he would not have died if the messages in his feed had not steered him away from life-saving early treatments.
It is a safe bet that the 3rd Circuit never intended to create an obligation for Facebook to fact-check press releases from the Centers for Disease Control and Prevention — but this could be the consequence if the 3rd Circuit’s decision in Anderson v. TikTok is cited as precedent in future lawsuits against social media.
Aside from government sources, there is a great deal of medical information online. Dr. Joseph Mercola recommends that you irrigate the nose with peroxide. Dr. Aseem Malhotra recommends you stop taking statins. Deepak Chopra has advice for avoiding suicidal depression.
Inevitably, this advice will be helpful to some and harmful to others. Can YouTube be sued for directing me to a Chopra video if the advice turns out to be wrong for me?
The U.S. Army makes recruitment videos, and Facebook sends them to unemployed teenagers. Predictably, some of them enlist in the Army and are deployed to places where they die. Can Facebook be sued for these deaths?
Algorithms that drive cars have been field-tested for several years now and have caused the deaths of pedestrians and motorists. Can the programmers be held liable for what their algorithms do?
Legal decisions often have unintended consequences. We will be interested to see where this one goes.