
Meta Cuts Election Disinformation Efforts Ahead of Midterm Elections

WASHINGTON (AP) — Facebook owner Meta is quietly scaling back some of the safeguards designed to thwart election misinformation or foreign interference in U.S. elections ahead of November’s midterm vote.

It’s a stark departure from the social media giant’s multibillion-dollar efforts to improve the accuracy of U.S. election postings and regain the trust of lawmakers and the public after their outrage at learning that the company had exploited people’s data and allowed lies to invade its site during the 2016 campaign.

The retreat is raising alarms about Meta’s priorities and about how some may exploit the world’s most popular social media platforms to spread misleading claims, launch fake accounts and incite partisan extremists.

“They don’t talk about it,” said former Facebook policy director Katie Harbath, now CEO of tech and policy firm Anchor Change. “Best-case scenario: They’re still doing a lot behind the scenes. Worst-case scenario: They’re backing off, and we don’t know how that’s going to play out for the midterms on the platforms.”

Since last year, Meta has shut down an examination into how falsehoods are amplified in political ads on Facebook by indefinitely banning the researchers responsible for it.

CrowdTangle, the online tool the company offered to hundreds of newsrooms and researchers so they could identify trending posts and misinformation on Facebook or Instagram, is now unusable on some days.

Public communication about the company’s response to election disinformation has gone decidedly silent. Between 2018 and 2020, the company released more than 30 statements detailing how it would stifle U.S. election disinformation, prevent foreign adversaries from running ads or posts around the vote, and crack down on divisive hate speech.

Senior managers held question-and-answer sessions with reporters about policy news. CEO Mark Zuckerberg wrote Facebook posts promising to remove false voting information and penned opinion pieces calling for more regulation to tackle foreign interference in U.S. elections via social media.

But this year, Meta has released only a one-page document outlining plans for the fall elections, though potential threats to the vote remain clear. Several Republican candidates are pushing false claims about U.S. elections on social media. In addition, Russia and China continue to wage aggressive propaganda campaigns on social media aimed at creating new political divisions among the American public.

Meta says elections remain a priority, and policies developed in recent years around election misinformation or foreign interference are now being integrated into the company’s operations.

“With each election, we incorporate what we learn into new processes and establish channels to share information with government and our industry partners,” Meta spokesman Tom Reynolds said.

He declined to say how many employees would work full time on protecting this year’s U.S. elections.

During the 2018 election cycle, the company offered tours and photos of its election “war room” and publicized head counts of the employees staffing it. But The New York Times reported that the number of Meta employees working on this year’s election has been cut from 300 to 60, a figure Meta disputes.

Reynolds said Meta will bring in hundreds of employees from across 40 of the company’s other teams to monitor the upcoming vote alongside the election team, whose size the company would not disclose.

The company is pursuing many of the initiatives it developed to limit election misinformation, such as a fact-checking program launched in 2016 that enlists the help of media outlets to investigate the veracity of popular lies spread on Facebook or Instagram. The Associated Press is part of Meta’s fact-checking program.

This month, Meta also rolled out a new feature for political ads that lets audiences search for details about how advertisers target people based on their interests on Facebook and Instagram.

Yet Meta has stifled other efforts to identify election misinformation on its sites.

It has stopped making improvements to CrowdTangle, a website offered to newsrooms around the world that provides insight into trends in social media posts. Journalists, fact checkers and researchers have used the website to analyze Facebook content, including tracing popular misinformation and who is responsible for it.

This tool is now “dying,” former CrowdTangle CEO Brandon Silverman, who left Meta last year, told the Senate Judiciary Committee this spring.

Silverman told the AP that CrowdTangle had been working on upgrades that would, for example, make it easier to track internet memes, which often spread half-truths while escaping the scrutiny of fact-checkers.

“There really isn’t a shortage of ways to organize this data to make it useful to many different parts of the fact-checking community, newsrooms, and wider civil society,” said Silverman.

Not everyone at Meta agreed with this transparent approach, Silverman said. The company hasn’t rolled out any new CrowdTangle updates or features in over a year, and it’s been experiencing multi-hour outages for the past few months.

Meta also ended efforts to investigate how misinformation spreads through political ads.

The company has indefinitely revoked the Facebook access of two New York University researchers it accused of collecting unauthorized data from the platform. The move came hours after NYU professor Laura Edelson said she had shared plans with the company to investigate the spread of misinformation on the platform around the Jan. 6, 2021, attack on the U.S. Capitol, which is now the subject of a House investigation.

“What we found on closer inspection was that their systems were probably dangerous to a lot of their users,” Edelson said.

Privately, former and current Meta employees say that exposing these dangers around U.S. elections has created public and political backlash for the company.

Republicans regularly accuse Facebook of unfairly censoring conservatives, some of whom have been kicked out for breaking company rules. Democrats, meanwhile, regularly complain that the tech company hasn’t gone far enough to tackle misinformation.

“It’s something so politically charged that they’re trying to get away from it more than jumping in head first,” said Harbath, the former Facebook policy director. “They just see it as a big old pile of headaches.”

Meanwhile, the possibility of U.S. regulation no longer hangs over the company, with lawmakers failing to reach consensus on how much oversight the multi-billion dollar firm should be subject to.

Freed from this threat, Meta executives have poured the company’s time, money and resources into a new project over the past few months.

Zuckerberg immersed himself in this massive rebranding and reorganization of Facebook last October, when he changed the company’s name to Meta Platforms Inc. He plans to spend years and billions of dollars evolving its social media platforms into a nascent virtual reality construct called the “metaverse,” much like the internet come to life, rendered in 3D.

His posts on his public Facebook page now focus on product announcements, praise for artificial intelligence and photos of him enjoying life. Election-readiness news is relegated to company blog posts not written by him.

In one of Zuckerberg’s posts last October, after a former Facebook employee leaked internal documents showing how the platform amplifies hate and misinformation, he defended the company. He also reminded his supporters that he pushed Congress to modernize election regulations for the digital age.

“I know it’s frustrating to see the good work we do being misrepresented, especially for those of you who make important security, integrity, research and product contributions,” he wrote on October 5. “But I believe that in the long run, if we continue to try to do the right thing and provide experiences that improve people’s lives, it will be better for our community and our business.”

That was the last time he discussed the Menlo Park, Calif.-based company’s campaign work in a public Facebook post.


Barbara Ortutay, technology writer at The Associated Press, contributed to this report.


Follow AP’s misinformation coverage at https://apnews.com/hub/misinformation.
