TECHNOLOGY

Facebook’s election ‘war room’ takes aim at fake information

By Associated Press

MENLO PARK, Oct 19: In an otherwise innocuous part of Facebook’s expansive Silicon Valley campus, a locked door bears a taped-on sign that reads “War Room.” Behind the door lies a nerve center the social network has set up to combat fake accounts and bogus news stories ahead of upcoming elections.


Inside the room are dozens of employees staring intently at their monitors while data streams across giant dashboards. On the walls are posters of the sort Facebook frequently uses to caution or exhort its employees. One reads, “Nothing at Facebook is somebody else’s problem.”


That motto might strike some as ironic, given that the war room was created to counter threats that almost no one at the company, least of all CEO Mark Zuckerberg, took seriously just two years ago — and which the company’s critics now believe pose a threat to democracy.


Days after President Donald Trump’s surprise victory, Zuckerberg brushed off assertions that the outcome had been influenced by fictional news stories on Facebook, calling the idea “pretty crazy.”


But Facebook’s blasé attitude shifted as criticism of the company mounted in Congress and elsewhere. Later that year, it acknowledged having run thousands of ads promoting false information placed by Russian agents. Zuckerberg eventually made fixing Facebook his personal challenge for 2018.


The war room is a major part of Facebook’s ongoing repairs. Its technology draws upon the artificial intelligence system Facebook has been using to help identify “inauthentic” posts and user behavior. Facebook provided a tightly controlled glimpse at its war room to The Associated Press and other media ahead of the second round of presidential elections in Brazil on Oct. 28 and the U.S. midterm elections on Nov. 6.


“There is no substitute for physical, real-world interaction,” said Samidh Chakrabarti, Facebook’s director of elections and civic engagement. “The primary thing we have learned is just how effective it is to have people in the same room all together.”


More than 20 different teams now coordinate the efforts of more than 20,000 people — mostly contractors — devoted to blocking fake accounts and fictional news and stopping other abuses on Facebook and its other services. As part of the crackdown, Facebook also has hired fact checkers, including The Associated Press, to vet news stories posted on its social network.


Facebook credits its war room and other stepped-up patrolling efforts for booting 1.3 billion fake accounts over the past year and jettisoning hundreds of pages set up by foreign governments and other agents looking to create mischief.


But it remains unclear whether Facebook is doing enough, said Angelo Carusone, president of Media Matters For America, a liberal group that monitors misinformation. He noted that the sensational themes distributed in fictional news stories can be highly effective at keeping people “engaged” on Facebook — which in turn makes it possible to sell more of the ads that generate most of Facebook’s revenue.


“What they are doing so far seems to be more about trying to prevent another public relations disaster and less so about putting in meaningful solutions to the problem,” Carusone said. “On balance, I would say that they are still way off.”


The election war room and its inner workings remain too opaque to determine whether it’s helping Facebook do a better job of keeping garbage off its service or if it’s just a “temporary conference room with a bunch of computer monitors in it,” said Molly McKew, a self-described “information warfare” researcher for New Media Frontier, which studies the flow of content on social media.


McKew believes Facebook is conflicted about blocking some content it already knows is suspect “because they keep people on their platform by sparking an emotional response, so they like the controversial stuff. There will always be this toeing of the line about pulling down radical, crazy content because that’s what people engage on, and that’s what they want.”


Facebook defends its war room as an effective weapon against misinformation, although its efforts are still a work in progress. Chakrabarti, for instance, acknowledged that some “bugs” prevented Facebook from taking some unspecified actions to prevent manipulation efforts in the first round of Brazil’s presidential election earlier this month. He declined to elaborate.


The war room is currently focused on Brazil’s next round of elections and upcoming U.S. midterms. Large U.S. and Brazilian flags hang on opposing walls and clocks show the time in both countries.


Facebook declined to let the media scrutinize the computer screens in front of the employees, and required reporters to refrain from mentioning some of the equipment inside the war room, calling it “proprietary information.” While on duty, war-room workers are only allowed to leave the room for short bathroom breaks or to grab food to eat at their desks.


Although no final decisions have been made, the war room is likely to become a permanent fixture at Facebook, said Katie Harbath, Facebook’s director of global politics and government outreach.


“It is a constant arms race,” she said. “This is our new normal.”


 
