Democrats Want To Hold Social Media Companies Responsible For Health Misinformation
Democratic senators introduced a bill on Thursday that would hold Facebook, YouTube and other social media companies responsible for the proliferation of falsehoods about vaccines, fake cures and other harmful health-related claims on their sites.
Co-sponsored by Democratic Senators Amy Klobuchar of Minnesota and Ben Ray Luján of New Mexico, the Health Misinformation Act targets a provision in Section 230 of the Communications Decency Act, which protects platforms from being held liable for what their users post in most cases.
The bill would strip the companies of that legal shield if their algorithms promote health misinformation during a public health crisis. It would not apply when such misinformation appears in a chronological feed. (Most social platforms use algorithms to rank posts based on what they think users will be interested in.)
The legislation leaves it up to the U.S. Department of Health and Human Services, which is responsible for declaring public health emergencies, to define what constitutes health misinformation.
"These are some of the biggest, richest companies in the world and they must do more to prevent the spread of deadly vaccine misinformation," Klobuchar said in a statement. "The coronavirus pandemic has shown us how lethal misinformation can be and it is our responsibility to take action."
She cited a recent poll from the nonprofit Kaiser Family Foundation that found two-thirds of unvaccinated people believe myths about COVID-19 vaccines, such as the baseless notion that vaccines cause the disease.
Tensions over social media's role in the spread of fraudulent claims about COVID-19 vaccines have come to a head as stalling vaccination rates and the rise of the Delta variant threaten to prolong the pandemic.
U.S. Surgeon General Vivek Murthy warned last week that COVID-19 misinformation is an "urgent threat." The White House has called out Facebook in particular, saying it needs to do more to curb false anti-vaccination posts.
On Friday, President Biden said the social platforms are "killing people," although he later walked back that comment and said he meant that people who spread misinformation about vaccines online are irresponsible.
Facebook fired back, accusing the administration of "finger-pointing." The company said it has taken down more than 18 million pieces of COVID-19 misinformation, shown authoritative information about COVID-19 and vaccines to more than 2 billion people, and found in its own surveys that "vaccine acceptance among Facebook users in the U.S. has increased."
CEO Mark Zuckerberg told the website The Verge on Thursday that he is "quite confident" that the company has been "a positive force here."
This week, YouTube said it would start putting notices on some videos about health with links to "authoritative" sources, and highlight videos from those sources in search results on certain health topics.
But critics say social media companies need to go further. The Center for Countering Digital Hate, a nonprofit that combats disinformation, says just 12 people are responsible for 65% of anti-vaccine posts shared on social media, and it has criticized Facebook, YouTube and Twitter for not booting them off their platforms entirely. (The platforms have removed some accounts belonging to a few of the dozen, but none has banned all of them.)
Klobuchar said the 25-year-old Section 230 is allowing misinformation to thrive online.
"The law — which was intended to promote online speech and allow online services to grow — now distorts legal incentives for platforms to respond to digital misinformation on critical health issues, like COVID-19, and leaves people who suffer harm with little to no recourse," Klobuchar's office said in a press release about her proposed legislation.
The White House has also said it's "reviewing" whether to seek changes to Section 230 to tackle COVID misinformation.
The legal shield has come under fire from lawmakers on both sides of the aisle who say it's become outdated now that tech platforms play such a dominant role in society.
Democrats say Section 230 allows social media companies to duck responsibility for harmful content like misinformation and hate speech, while Republicans claim it's given platforms cover to censor conservatives. (There is little public evidence showing platforms treat conservatives more harshly than others.)
However, trying to hold the platforms responsible for health misinformation may come up against challenges on First Amendment grounds, because such content likely falls into the category of "lawful but awful" speech, said Eric Goldman, a law professor at Santa Clara University.
"If health misinformation is constitutionally protected, then there's really not a lot Congress can do about that," he said. "Removing Section 230, which is a liability shield, doesn't expose a [social media] service to any new liability, because the Constitution will fill in the protection."
Defining which health claims are legitimate and which aren't also raises thorny issues, said Renée DiResta, who studies misinformation at the Stanford Internet Observatory.
"There are times when the consensus just isn't fully formed yet," she said, pointing to the early days of the pandemic when there were debates about whether the virus was airborne.
"Asking or expecting the platforms to take action on certain types of health misinformation may be reasonable, but this sort of dynamic that we've all watched unfold over the last year and a half makes clear how this approach has some potentially problematic pitfalls," she said.
DiResta also warned that focusing narrowly on how platforms handle specific types of content, whether it's lies about vaccines or baseless claims of election fraud, risks ignoring the bigger picture of how information is created and spread — both on social media and in other channels.
"There's hope that we can cure all of the problems of the world by amending [Section 230]," she said. "It's not so simple as we're going to regulate social media platforms and this is all going to go away."
Editor's note: Facebook and Google, which owns YouTube, are among NPR's financial supporters.