The spread of disinformation is one of the most pressing problems facing society today. Lawmakers, policymakers, and researchers have focused on how disinformation disrupts political discourse and undermines democratic processes, threatens global security by contributing to the spread of extremist ideologies, furthers social and political polarization, and more. But as Leah Fowler, Max Helveston, and Zoë Robinson describe in their forthcoming article, Influencer Speech-Torts, it can also harm people’s health. (See also the Surgeon General’s Report, Confronting Health Misinformation.) They take on the specific problem of social media influencers providing health advice that causes their followers injury or even death.
Many of us might be thinking of COVID-19 examples, like how people may have ingested bleach and other disinfectants recommended by influencers and faced serious health consequences. But as Fowler, Helveston, and Robinson point out, the problem is pervasive, even outside the pandemic context. Consider, for instance, people with cancer who avoid evidence-based treatments because they become convinced that a particular diet or detox will cure their cancer, or influencers who promote extreme weight loss methods that, when followed, cause serious health harms.
In a world where content moderation, consumer protection laws, and other solutions have not solved the problem, the authors turn to private law. They argue that influencers should be held liable in negligence when they spread health misinformation to their followers that could foreseeably result in injury.
The article is well-written and convincing. Part I describes the influencer economy and how influencers make money from disseminating medical misinformation. The authors define influencers as “content creators who share information for commercial gain.” (P. 37.) They explain why influencing is different from advertising and describe the intense “parasocial relationship” that some individuals form with influencers, making them particularly susceptible to influencers’ advice. Indeed, the authors make the important point that influencer health misinformation exacerbates health inequity, as it is more likely to negatively impact those with limited access to healthcare professionals who could combat misinformation. Part I also explores why existing mechanisms are insufficient to compensate for influencer-driven physical harms.
Part II lays out the authors’ proposed solution. It argues that influencers should owe a duty of care to their followers and that public policy supports the imposition of such a duty. Fowler, Helveston, and Robinson wrestle with an issue of first impression, as courts have not yet imposed a duty on influencers to avoid sharing health misinformation. But they argue that under traditional doctrinal approaches, an influencer should owe a duty to their followers. The “fiduciary-like” nature of the influencer-follower relationship further strengthens the claim: followers rely on influencers’ advice and assume they have followers’ best interests at heart, while the influencer maximizes commercial gain. The authors also offer three convincing policy rationales to support the imposition of a duty: “the social inutility of health misinformation, the potential for influencer duties to prevent disproportionate harms against marginalized populations, and the reduction in negative externalities that result from influencer duties.” (P. 36.)
The final part takes on the First Amendment and argues that free speech rights should not bar tort recovery for physical harm. The authors argue that influencer health misinformation should not be considered “covered speech” under the First Amendment because it is misleading commercial speech or because it should be deemed a new category of uncovered speech. In the alternative, regardless of how it is categorized, they argue that where the speech causes physical harm, the interest in remedying that harm should always outweigh speech rights for constitutional purposes. (Notably, the authors cabin their arguments to physical harm, and do not include economic and reputational harms.)
In sum, followers who suffer physical harm because they followed harmful health advice from an influencer should be able to sue that influencer under the law of negligence and recover for their injuries.
Followers, of course, face additional hurdles in bringing a successful claim beyond those that are the primary focus of the paper, including difficulty accessing the court system at all, proving that advice is indeed “misinformation,” establishing causation, and proving damages. But a flood of successful claims may not be necessary for these negligence actions to have meaningful deterrent value, which is part of the appeal of this solution.
Further, I see Influencer Speech-Torts as an important contribution to a larger problem in our field—namely, how to address areas where the market does not provide patients and consumers reliable signals of safety and efficacy, and where regulatory oversight is non-existent or insufficient. This category arguably includes surgical techniques and medical procedures, dietary supplements, off-label use of drugs, complementary and alternative medicine, and more. Getting the balance right between respecting patient autonomy, fostering innovation, and protecting people from harm is a perpetual challenge, making the approach in this paper all the more provocative.