Critics of Facebook and Twitter — and even some people inside the companies — say dramatic action is needed to counter the way the platforms supercharge false, and sometimes dangerous, claims.

On social media, it is easy for rumors to go viral, while efforts to fact-check or correct those rumors often lag behind. That dynamic has been playing out in Oregon. This month, despite the companies' efforts to quell unsupported viral rumors about massive wildfires, hoaxes spread quickly, diverting law enforcement resources at a critical time.

A case study in Oregon

In early September, Cameron Hill and his family fled their home as wildfires raged across Clackamas County, south of Portland. But as they were trying to figure out what was happening and whether their property was damaged, they encountered disturbing rumors.

"I heard that police had several activists in custody down in Eugene actually seen starting fires," Hill told Oregon Public Broadcasting on Sept. 9. "And I heard ... someone actually engaged them on their farm with Molotovs, catching their property on fire."

The unfounded rumors Hill heard were lighting up social media: They said the fires had been intentionally set by left-wing arsonists — specifically, Black Lives Matter or antifa, a loosely defined antifascist movement.

None of those claims were true. Law enforcement in Oregon and even the FBI said there was no evidence that any activist or political groups were involved in the fires.

And yet, local 911 dispatchers were soon overrun with calls. The rumors caused so much disruption that police departments took to Facebook and Twitter to beg people to stop spreading them.

Tim Fox, a captain with the Oregon State Police, said he received a hundred emails about the false claims, creating more work and diverting resources from actual emergencies.

"The fires, the evacuations, the security of the places, you know, we still have our regular calls for service. Life hasn't stopped," he said. "All these rumors and things that are going around are tough because we have to find people to respond to them, to investigate them, to check them out."

Social media's design helps hoaxes go viral

Part of the reason these claims spread so widely on Facebook, in particular, is that the world's biggest social network rewards engagement. Posts that get lots of shares, comments and likes get shown to more people, quickly amplifying their reach.
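Facebook does not publish its ranking formula, but the amplification loop described above can be illustrated with a deliberately simplified, hypothetical model. The weights and the `Post` structure here are illustrative assumptions, not the platform's actual system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    shares: int
    comments: int
    likes: int

def engagement_score(post: Post) -> int:
    # Hypothetical weighting: shares and comments count more than likes,
    # since they push a post into more feeds.
    return 3 * post.shares + 2 * post.comments + post.likes

def rank_feed(posts: list[Post]) -> list[Post]:
    # Higher-engagement posts are shown first, and to more people --
    # which is how a sensational false claim can compound its own reach.
    return sorted(posts, key=engagement_score, reverse=True)
```

In a model like this, a hoax that provokes shares and angry comments outranks a sober correction that merely gets likes, capturing the feedback loop the critics describe.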

Facebook is well aware of its power to make stories go viral. As the fire rumors proliferated, the company put warnings on some posts its fact checkers had found false and reduced their distribution.

But that wasn't enough to quell the rumors. On Sept. 12, a day after the FBI put out its statement declaring the reports untrue and at least three days after the claims began spreading on social media, Facebook began removing posts connecting left-wing activists to the fires.

By then, however, hundreds of thousands of people had seen the false posts.

The claims were being amplified by social media accounts known to spread false information, including followers of QAnon, a baseless conspiracy theory, as well as by Russian state news outlet RT.

"What that said to us is that they're waiting too long," said Karen Kornbluh, director of the Digital Innovation and Democracy Initiative at the German Marshall Fund of the United States, a think tank.

Kornbluh's research team found the debunked claims still spreading in large private Facebook groups, some with hundreds of thousands of members, several days after Facebook said it was removing such posts.

Facebook said it was using artificial intelligence and human review to find and take down the hoax everywhere it appeared, including in private groups. The company said some of the posts in the private groups the German Marshall Fund identified had already been deleted by users, and the company removed the rest.

"We share the goal of limiting misinformation which is why we have taken aggressive steps to combat it — from creating a global network of over 70 fact-checking partners and promoting accurate information to removing content when it breaks our rules," said Andy Stone, a Facebook spokesman. "There's no silver bullet to addressing this challenge, which is why we continue to consult with outside experts, grow our fact-checking program, and improve our internal technical capabilities."

Still, what happened in Oregon shows that once this kind of hoax starts spreading, it is extremely difficult to stamp out.

"When you think of the psychology of misinformation, you can think of something like molding clay," said Dolores Albarracín, a psychology professor at the University of Illinois. "When you have soft clay, you can print anything you want on to it ... Once it dries out, though, then that's it. Your print or shape is set."

So while fact checks and removing posts can help, the real challenge is stopping harmful hoaxes from going viral in the first place.

Stock markets have "circuit breakers." Should social media?

Now, some experts are promoting a new way for social media platforms to hit pause on their powerful amplification engines.

It is modeled on the stock market's method of halting trading when stocks are too volatile.

"If the S&P drops really suddenly, we've had these thresholds in place for a lot of years now. The market will stop and that will automatically trigger review," said Erin Simpson, associate director of technology policy at the left-leaning Center for American Progress.

Those automatic triggers are called circuit breakers. Simpson says social media also needs circuit breakers to stop the viral spread that platforms are designed to encourage.

When a controversial topic is gaining steam, Facebook or Twitter could temporarily limit its reach while the disputed information is reviewed.
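In the spirit of the stock-market analogy, the mechanism might be sketched like this. The threshold value and state names are purely hypothetical, chosen for illustration; no platform has published such parameters:

```python
# Hypothetical threshold: shares per hour that trip the breaker.
CIRCUIT_THRESHOLD = 1_000

def check_breaker(shares_last_hour: int,
                  threshold: int = CIRCUIT_THRESHOLD) -> str:
    """Return a post's distribution state, stock-market style:
    normal amplification below the threshold, a pause above it."""
    if shares_last_hour >= threshold:
        # Breaker trips: halt algorithmic amplification and queue the
        # post for fact-checker review -- without deleting it.
        return "paused_for_review"
    return "amplified"
```

As with trading halts, the point is not to rule on the content itself but to buy reviewers time before the amplification machinery does its work.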

"A system like this could maybe make it harder for stuff to go viral, instead of the status quo, which is a set of Facebook business products [that] make it easier to go viral," Simpson said.

Simpson and a colleague at CAP described such a viral circuit breaker in a paper last month, building off an idea first proposed by Ellen Goodman, a law professor at Rutgers University.

"Pausing waves of virality could stem disinformation, deepfakes, bot-generated speech, and other categories of information especially likely to manipulate listeners," Goodman wrote.

Now, the idea is even gaining traction inside Facebook. The company told The Verge it is testing this kind of speed bump for viral posts.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2020 NPR. To see more, visit https://www.npr.org.
