As a result of Haugen’s testimony, Facebook has come under bipartisan pressure from Democrats and Republicans and has been forced to shelve plans for an Instagram Kids app for the preteen market – at least for now.
Even Facebook itself no longer seems to stand in the way of regulation.
“It’s been 25 years since the rules of the Internet were updated, and instead of expecting the industry to make societal decisions that belong to lawmakers, it’s time for Congress to act,” a Facebook spokesman said.
In Australia, the story has revived debate over how to regulate content on the Internet, and not just in terms of online safety. In an extraordinary intervention, both Prime Minister Scott Morrison and Deputy Prime Minister Barnaby Joyce this week raised the possibility of making the platforms legally responsible for defamatory comments.
Morrison accused the technology giants of allowing their platforms to become a “coward’s palace” for anonymous trolls who “ruin people’s lives and say the ugliest and most offensive things to people, and do so with impunity.”
Haugen’s Facebook dossier is far-reaching. It reveals details of a program called “cross-checking” or “XCheck” that whitelists high-profile accounts so that the company’s normal enforcement measures against harassment and incitement to violence do not apply. It suggests that the company was consistently willing to accept 10-20 percent more misinformation if it meant 1 percent more engagement. It describes how, in December, Facebook prematurely removed safeguards that had been introduced ahead of the November 2020 U.S. presidential election, re-prioritizing engagement just weeks before the U.S. Capitol riots. It reveals the weakness of the company’s response to criminal activity in developing countries, from drug cartels in Mexico to human traffickers in the Middle East.
One of the most striking revelations was what Facebook knew about the damaging effects of its photo-sharing app Instagram on many users, especially teenage girls, who account for a large portion of its audience.
Australia’s Assistant Minister for Mental Health David Coleman, a former chairman of NineMSN, says the whistleblower files show that the social media giants “cannot be trusted to act in the best interests of children”. He is appalled by Facebook and Instagram’s “abysmal” efforts to enforce their own age limits.
“There are undoubtedly millions of children who are on social media platforms at an age where it is unsafe for them to be there,” he says.
“What is the role of society and government, if not to protect children? And we know we cannot rely on social media platforms to do so.”
Australia has taken the lead in online safety, establishing the world’s first eSafety Commissioner in 2015 and adopting the Online Safety Act 2021, which technology companies must comply with by mid-2022. Australia was also among only a handful of countries to force technology platforms to pay news publishers for content, and the debate is now turning to defamation.
There are clear signs of a growing appetite within some sections of the Australian government to crack down further on the social media giants. But with only four already packed parliamentary sitting weeks left this year, the momentum for reform could be dampened by the headwinds of the looming federal election and campaign season.
Facebook says the company has removed more than 600,000 underage accounts on Instagram over the past three months and has thousands of employees as well as AI technology dedicated to removing accounts belonging to underage users.
Many people can use Instagram without being harmed, or the problems can fade over time, as in Tilda’s case. For others, the app’s relentless focus on social comparison, and algorithms that can lead users from healthy recipes to pro-anorexia content in rapid succession, can contribute to the development of eating disorders or self-harm.
Haugen’s documents show that internal Facebook research found that more than 40 percent of teenage Instagram users who reported feeling “unattractive” said the feeling began on the app, that one in five teens said Instagram made them feel worse about themselves, and that many teens reported the app undermined their confidence in their friendships. Teenagers regularly said they wanted to spend less time on Instagram, but lacked the self-control to do so.
Facebook researchers concluded that some issues around social comparison were specific to Instagram, not to social media more generally. Some Facebook executives resisted an internal push for change, saying social comparison was the “fun part” of Instagram for users, and publicly cited external research that downplayed the link between social media use and psychological harm.
In a public post, Facebook founder Mark Zuckerberg said it was not true that Facebook prioritizes profits over safety. He said the Instagram research had been mischaracterized, because it also showed that many teenage girls struggling with loneliness, anxiety, sadness and eating disorders said Instagram made these problems better, not worse.
This week, Morrison and Joyce seized on the debate about online abuse spreading on social media as a stick to threaten a further crackdown through defamation law reform.
In a deliberate choice of words, Morrison said that platforms that refused to unmask trolls were “no longer a platform, they are a publisher”. Joyce, whose daughter has been the subject of cruel gossip from anonymous commentators, stated that platforms “should be held accountable” and that those who enable anonymous abuse should pay the price.
Their comments follow a High Court ruling last month that found media outlets legally responsible as “publishers” for third-party comments on their Facebook pages, even when they were unaware of the comments. The bombshell ruling also has consequences for other administrators of Facebook pages, including MPs and ordinary citizens.
Associate Professor Jason Bosland, Director of the Centre for Media and Communications Law at Melbourne Law School, says it would be an “extreme” result to hold social media giants responsible for defamatory remarks from the moment they are posted, leaving Facebook, Twitter and other companies unable to operate because of the legal risk.
“You would find very few experts who would suggest that Facebook should be responsible for absolutely everything that is posted on its platform without notice,” Bosland says.
The nation’s Attorneys-General, led by Mark Speakman of NSW, are considering reform of the defamation legislation.
Australia’s eSafety Commissioner Julie Inman Grant says the whistleblower revelations, while not surprising, could galvanize action in the US and will in turn strengthen Australia’s efforts. She likens the moment to the push to regulate car safety and mandate seat belts from the 1960s to the 1980s.
“This is the tech industry’s seat belt moment,” she says. “For too long, there have been no brakes on them at all, and the primary reason is that they were seen as a driving force for innovation and inspiration and growth and development, and no government wants to put the brakes on that.”
Australia’s efforts include giving the eSafety Commissioner statutory powers to order the removal of content, working with industry to promote the concept of “safety by design”, and the Online Safety Act 2021. The law takes a co-regulatory approach: eSafety has produced a position paper outlining the expected outcomes, and technology companies or industry organizations (for sectors such as social media platforms, the Internet of Things, or gaming providers) have until June 2022 to register codes showing how they plan to comply. These codes must be approved by eSafety and will be registered under the law, giving them regulatory force.
Inman Grant says Australia has in the past regulated a specific set of harms, such as cyberbullying, image-based abuse and illegal online content such as child sexual abuse or terrorist content. The new online safety law is about basic online safety expectations – a social licence to operate.
However, the United States is a market 12 times the size of Australia’s and is the home jurisdiction for Facebook and most other technology platforms, so regulation in the United States would be far more consequential.
Communications Minister Paul Fletcher and Inman Grant this week wrote jointly to the U.S. Senate Commerce Committee, sharing details of Australia’s legislative approach and offering for Inman Grant to appear at hearings.
Inman Grant believes that international standards are inevitable for technology, just as they are now embraced by the automotive industry.
Conflicts between profits and safety
During her two years at Facebook, Frances Haugen says she saw the company “repeatedly encounter conflicts between its own profits and our safety [and] consistently resolve these conflicts in favor of its own profits”.
Haugen, who had previously worked at Google, Pinterest and Yelp, became so concerned about what she saw on Facebook that she resigned and decided to gather evidence before leaving.
She says the solution lies not only in regulation, but in a demand for full transparency about Facebook’s data and algorithms. She says other big tech companies like Google let independent researchers download and analyze the company’s search results from the Internet, but Facebook “hides behind walls that prevent researchers and regulators from understanding the true dynamics of their system.”
However, Inman Grant says it would be very difficult to regulate algorithms because they are not static – you would also need information on how the algorithms adapt and change through machine learning, and regulators like eSafety would need a team of data scientists and computer engineers.
There is also a question of whether private companies should be forced to share proprietary information – algorithms are the “secret sauce” that helps their products compete in the market.
In response to Haugen’s testimony, a Facebook spokesman said: “A Senate Commerce subcommittee held a hearing with a former Facebook product manager who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives and testified more than six times that she had not worked on the subject matter in question. We do not agree with her characterization of the many issues she testified about.”
Inman Grant, who has worked for Microsoft, Twitter and as a lobbyist for the US Congress, describes this response as “a classic obfuscation technique”.
“I have seen the talking points before. I do not think they hold much water.”