Facebook’s controversial algorithms protect its users from exposure to extreme content, hate speech and misinformation, the beleaguered company’s vice president of global affairs claimed in interviews Sunday.
Nick Clegg defended Facebook against accusations from whistleblower Frances Haugen that its algorithms push clickbait and extreme content – but insisted that the company would never be able to completely remove misinformation and hate speech from its platforms.
“If you remove the algorithms … the first thing that would happen is that people would see more, not less, hate speech – more, not less, misinformation,” Clegg told Dana Bash on CNN’s “State of the Union.” “These algorithms are designed precisely to work almost like giant spam filters to identify and weed out bad content.”
“For every 10,000 bits of content, you would only see five pieces of hate speech,” he said. “I wish we could get it down to zero. We have a third of the world’s population on our platforms. Of course we see the good, the bad and the ugly of human nature on our platforms.”
Clegg insisted to Bash that Facebook’s algorithms played no meaningful role in the run-up to the January 6 Capitol riot. On NBC’s “Meet the Press,” he told Chuck Todd that Haugen’s claim that Facebook rolled back measures aimed at curbing misinformation in users’ feeds after the 2020 presidential election was “simply not true.”
“We actually kept the vast majority of them in place right up until the inauguration, and we kept some in place permanently,” Clegg told Todd, adding that some of the measures were always intended to be temporary.
He said the company rolled back “blunt tools” – such as reducing the circulation of videos, civic content and political ads – that had inadvertently swept up “a lot of completely innocent, legitimate, legal, playful, fun content.”
“We did it only very exceptionally,” Clegg said. “We deliberately made completely normal content circulate less on our platform. It’s something we did because of the extraordinary circumstances.”
Clegg told Todd that it is the responsibility of Congress to “create a digital regulator” and lay down rules for data protection and content moderation.
“I do not think anyone wants a private company to adjudicate these really difficult trade-offs between, you know, free speech on the one hand and moderation or removal of content on the other,” he said. “There is fundamental political disagreement. The right thinks we are censoring too much content; the left thinks we are not taking down enough.”
Clegg told ABC’s George Stephanopoulos that it was “extremely misleading” to liken Facebook’s reported knowledge of the harm its products cause to children and society to tobacco companies’ awareness of the dangers of cigarettes.
“In the ’80s and ’90s, there were analogies that watching too much television was like alcoholism, or that arcade games like Pac-Man were, you know, akin to drug abuse,” he said. “We cannot change human nature. You will always see bad things online.”