Disinformation Researchers Fret About Fallout From Judge’s Order

A federal judge’s decision this week to limit government communication with social media platforms could have far-reaching side effects, according to researchers and groups fighting hate speech, online abuse and disinformation, and could further hinder efforts to curb harmful content.

Alice E. Marwick, a researcher at the University of North Carolina at Chapel Hill and one of many disinformation experts raising concerns, said Wednesday that the ruling could hamper efforts to stop false claims about vaccines and the spread of voter-fraud narratives.

She said the order tracked other Republican-led efforts and was “part of a coordinated campaign to push back the whole idea of disinformation.”

Judge Terry A. Doughty granted a preliminary injunction on Tuesday barring the Department of Health and Human Services and the Federal Bureau of Investigation, along with other parts of the government, from contacting social media companies for the purpose of “encouraging, pressuring, or inducing in any manner the removal, deletion, suppression, or reduction of content containing protected free speech.”

The ruling stems from a lawsuit by the attorneys general of Louisiana and Missouri, who accused Facebook, Twitter and other social media sites of censoring right-wing content, sometimes in concert with the government. They and other Republicans cheered the decision, issued in the United States District Court for the Western District of Louisiana, as a victory for the First Amendment.

But several researchers said the government’s cooperation with social media companies was not problematic as long as it did not coerce them into removing content. Instead, they said, the government has so far simply notified companies of potentially dangerous messages, such as lies about election fraud and misleading claims about the coronavirus pandemic. Most misinformation or disinformation that violates a social platform’s policies is reported by researchers, nonprofit organizations, or the platform’s own personnel and software, said a professor of communications at the University of California, Santa Barbara, who is affiliated with its Center for Information Technology and Society.

An even bigger concern, the researchers said, is a potential chilling effect. The judge’s decision barred certain government agencies from communicating with some research organizations, such as the Stanford Internet Observatory and the Election Integrity Partnership, about the removal of social media content. Some of those groups have already been targeted in Republican-led legal campaigns against universities and think tanks.

Researchers said such provisions could discourage young academics from pursuing disinformation research and scare off the donors who fund crucial grants.

Bond Benton, an associate professor of communications at Montclair State University who studies disinformation, said the ruling “could be a bit of a Trojan horse.” Although it is limited to the government’s relationship with social media platforms, he said, it carries the message that misinformation qualifies as speech and that its removal amounts to the suppression of speech.

“Previously, platforms could just say, ‘No shirt, no shoes, no service; we don’t want to host that,’” Dr. Benton said. “This ruling will probably make platforms a little more cautious about that.”

In recent years, platforms have come to rely heavily on automated tools and algorithms to detect harmful content, limiting the influence of complaints from people outside the companies. Academics and anti-disinformation organizations often complained that platforms were unresponsive to their concerns, said Viktorya Vilk, the director of digital safety and free expression at PEN America, a nonprofit that supports free expression.

“Platforms are very good at ignoring civil society organizations and our requests for help, requests for information, or escalation of individual cases,” she said. “They are less comfortable ignoring the government.”

Some disinformation researchers worry that the ruling could give social media platforms cover to relax their defenses: several platforms have already scaled back their efforts to curb misinformation and may lower their vigilance even further ahead of the 2024 election. It is unclear, they said, how the ruling will affect relatively new government initiatives that respond to researchers’ concerns and suggestions, such as the White House Task Force to Address Online Harassment and Abuse.

For Imran Ahmed, the chief executive of the Center for Countering Digital Hate, Tuesday’s decision underscored other problems: the United States’ “particularly toothless” approach to dangerous content compared with places like Australia and the European Union, and the need to update the rules governing the responsibilities of social media platforms. The ruling noted that the center had given a presentation to the surgeon general’s office about its 2021 report on online anti-vaccine activists, “The Disinformation Dozen.”

“It’s bananas that you can’t show a nipple at the Super Bowl, but Facebook can still broadcast Nazi propaganda, empower stalkers and harassers, undermine public health and facilitate extremism in the United States,” Mr. Ahmed said. “This court’s ruling exacerbates the sense of impunity with which social media companies operate, even though they are a primary vector for hate and disinformation in society.”
