
Now, hours of testimony and thousands of pages of documents from Facebook whistleblower Frances Haugen have renewed scrutiny of the impact Facebook and its algorithms have on teens, democracy and society as a whole. The fallout has raised the question of how much Facebook, and perhaps platforms like it, can or should reconsider using a host of algorithms to determine what images, videos and news users see.

Haugen, a former Facebook product manager with a background in “algorithmic product management”, has in her critiques focused mainly on the company’s algorithm, which is designed to show users content they are likely to interact with. She has said it is responsible for many of Facebook’s problems, including fueling polarization, misinformation and other toxic content. Facebook, she said during a “60 Minutes” interview, understands that if it makes the algorithm safer, “people will spend less time on the site, they’ll click on fewer ads, they’ll make less money.” (Facebook CEO Mark Zuckerberg has pushed back on the idea that the company prioritizes profits over users’ safety and well-being.)
Facebook’s head of global policy management, Monika Bickert, said in an interview with CNN after Haugen’s Senate hearing on Tuesday that it is “not true” that the company’s algorithms are designed to promote inflammatory content, and that Facebook actually does the “opposite”, demoting so-called clickbait.
At times in her testimony, Haugen seemed to suggest a radical rethinking of how the news feed should work in order to address the problems she documented with extensive internal files. “I am a strong advocate of chronological ranking, sorting by time,” she said in her testimony to a Senate subcommittee last week. “Because I think we do not want computers to determine what we focus on.”

But algorithms that pick and choose what we see are central not only to Facebook but to many of the social media platforms that followed in Facebook’s footsteps. TikTok, for example, would be unrecognizable without content recommendation algorithms running the show. And the larger the platform, the greater the need for algorithms to sift and sort content.

Algorithms are not going away. But there are ways Facebook can improve them, experts in algorithms and artificial intelligence told CNN Business. Doing so, however, will require something Facebook has so far seemed reluctant to offer (despite its leaders’ talking points): more transparency and control for users.


What’s in an algorithm?

The Facebook you experience today, with a constant stream of algorithmically selected information and ads, is a very different social network than it was in its early days. In 2004, when Facebook first launched as a site for college students, navigating it was both simpler and more tedious: if you wanted to see what friends were posting, you had to visit their profiles one at a time.
This began to change in a big way in 2006, when Facebook introduced the News Feed, giving users a fire hose of updates from family, friends and the guy they went on a few bad dates with. From the beginning, Facebook reportedly used algorithms to filter the content users saw in the news feed. In a 2015 Time Magazine story, the company’s chief product officer, Chris Cox, said curation was necessary even then because there was too much information to show all of it to every user. Over time, Facebook’s algorithms evolved, and users became accustomed to algorithms determining how Facebook content would be presented.

An algorithm is a set of mathematical steps or instructions, particularly for a computer, that tells it what to do with certain inputs to produce certain outputs. You can think of it as roughly akin to a recipe, where the ingredients are the inputs and the finished dish is the output. On Facebook and other social media sites, you and your actions – what you write, the photos you upload – are the inputs. What the social network shows you – whether it’s a post from your best friend or an ad for camping gear – is the output.
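To make the recipe analogy concrete, here is a minimal, purely illustrative Python sketch (the Post class and function names are hypothetical, not Facebook’s actual code). It implements the simplest feed “recipe” of all – the chronological ranking Haugen advocates – where the input is a batch of posts and the output is the same posts sorted newest-first:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        text: str
        created_at: datetime

    def chronological_feed(posts: list[Post]) -> list[Post]:
        # Input: candidate posts for one user. Output: the same posts,
        # simply ordered by time, newest first -- no prediction involved.
        return sorted(posts, key=lambda p: p.created_at, reverse=True)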

At their best, these algorithms can help personalize feeds so users discover new people and content that matches their interests based on past activity. At their worst, as Haugen and others have pointed out, they risk leading people down disturbing rabbit holes that can expose them to toxic content and misinformation. In either case, they keep people scrolling longer, potentially helping Facebook make more money by showing users more ads.
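By contrast with the chronological recipe, an engagement-ranked feed scores each post by how likely the user is to interact with it and sorts by that score. The sketch below, reusing the hypothetical Post class above, stands in a single toy affinity signal for what are, in real systems, machine-learned predictions over thousands of signals:

    def engagement_score(post: Post, affinity: dict[str, float]) -> float:
        # Toy stand-in for a learned prediction: how often has this user
        # interacted with this author in the past?
        return affinity.get(post.author, 0.0)

    def ranked_feed(posts: list[Post], affinity: dict[str, float]) -> list[Post]:
        # Highest predicted engagement first. Whatever draws interaction
        # rises to the top -- the dynamic critics say can favor
        # provocative or misleading content.
        return sorted(posts, key=lambda p: engagement_score(p, affinity), reverse=True)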

Many algorithms work together to create the experience you see on Facebook, Instagram and elsewhere online. This can make it even more complicated to tease out what is going on inside such systems, especially at a large company like Facebook, where multiple teams build different algorithms, said Hilary Ross, a senior program manager at Harvard University’s Berkman Klein Center for Internet & Society and head of its Institute for Rebooting Social Media.

More transparency

There are ways to make these processes clearer and give users more of a sense of how they work. Margaret Mitchell, who leads artificial intelligence ethics for AI model builder Hugging Face and formerly co-led Google’s ethical AI team, believes this can be done by allowing you to see details of why you see what you see on a social network, such as in response to the posts, ads, and other things you view and interact with.

“You can even imagine having some say in it. You may be able to choose preferences for the kinds of things you want optimized for you,” she said, such as how often you want to see content from your immediate family, high school classmates or baby pictures. All of those things can change over time. Why not let users control them?
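Very roughly, Mitchell’s idea amounts to exposing some of the ranking’s weights as user-editable settings. A hypothetical sketch of what that might look like – none of this reflects any real Facebook control:

    # Hypothetical user-editable preferences: content category -> weight.
    my_preferences = {
        "immediate_family": 2.0,        # boost: show much more of this
        "high_school_classmates": 0.5,  # dampen
        "baby_pictures": 0.0,           # hide entirely
    }

    def apply_preferences(score: float, category: str, prefs: dict[str, float]) -> float:
        # Scale the platform's score by the user's own weight for this
        # category; unlisted categories pass through unchanged.
        return score * prefs.get(category, 1.0)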

Transparency is key, she said, because it encourages good behavior from social networks.

Another way social networks could be pushed toward increased transparency is through independent audits of their algorithmic practices, according to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League. They envision this involving fully independent researchers, investigative journalists or people inside regulatory bodies – not the social media companies themselves or firms they hire – who have the knowledge, skills and legal authority to demand access to algorithmic systems in order to ensure laws aren’t violated and best practices are followed.

James Mickens, a computer science professor at Harvard and co-director of the Berkman Klein Center’s Institute for Rebooting Social Media, suggests looking at how elections can be audited without revealing private information about voters (such as whom each person voted for) for insights into how algorithms could be audited and reformed. He believes that could offer some insight into building an auditing system that lets people outside of Facebook inspect its algorithms while sensitive data stays protected.

Other measures of success

A major obstacle to making meaningful improvements, experts say, is social networks’ current focus on engagement – the time users spend scrolling, clicking and otherwise interacting with social media posts and ads.
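In concrete terms, “optimizing for engagement” means the number the system is tuned to maximize counts time and interactions and little else. A toy illustration of what such a metric measures – and what it leaves out (all names are hypothetical):

    from dataclasses import dataclass

    @dataclass
    class Session:
        seconds_on_site: float
        clicks: int
        comments: int
        reported_satisfaction: float  # hypothetical survey signal, 0-10

    def engagement_metric(s: Session) -> float:
        # The kind of objective ranked feeds are typically tuned toward:
        # more time and more interactions score higher, regardless of
        # how the session made the user feel.
        return s.seconds_on_site + 10 * s.clicks + 20 * s.comments

    def wellbeing_metric(s: Session) -> float:
        # An alternative objective would need signals the engagement
        # metric ignores entirely, e.g. users' self-reported experience.
        return s.reported_satisfaction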

Haugen revealed internal documents from Facebook showing that the social network is aware that its “core product mechanics, such as virality, recommendations and optimization for engagement, are an essential part” of why hate speech and misinformation “flourish” on its platform.

Changing this is difficult, experts said, though several agreed it may mean considering how users feel when using social media, and not just how much time they spend using it.

“Engagement is not a synonym for good mental health,” Mickens said.

Can algorithms really help solve Facebook’s problems? Mickens, at least, is hopeful the answer is yes. He believes they can be optimized more toward the public interest. “The question is: What will convince these companies to start thinking this way?” he said.

In the past, some might have said it would take pressure from the advertisers whose dollars support these platforms. But in her testimony, Haugen seemed to be betting on a different answer: pressure from Congress.
