The Facebook Algorithm Is Watching You

Here’s one way to confuse it.

Adrienne LaFrance

You can tell a lot about a person from how they react to something.

That’s why Facebook’s various “Like” buttons are so powerful. Clicking a reaction icon isn’t just a way to register an emotional response; it’s also a way for Facebook to refine its sense of who you are. So when you “Love” a photo of a friend’s baby and click “Angry” on an article about the New England Patriots winning the Super Bowl, you’re training Facebook to see you a certain way: You are a person who seems to love babies and hate Tom Brady.

The more you click, the more sophisticated Facebook’s idea of who you are becomes. (Remember: Although the reaction choices seem limited now—Like, Love, Haha, Wow, Sad, or Angry—up until around this time last year, there was only a “Like” button.)

This matters because of what Facebook might then do with its sense of your baby-loving, Tom-Brady-hating self. It might mean that Facebook will show you more photos of babies and fewer articles about football, which in turn might affect which friends appear more frequently and prominently in your News Feed. And that might affect your perception of the world.

It might mean you see sponsored posts aimed at parents of young kids. Or it might mean that Facebook shows you an outsized number of Tom Brady posts one week as a way to provoke you—Facebook does have a history of experimenting on its users, after all.

“Facebook has conducted covert experiments on its users to evaluate how Facebook can emotionally influence people,” says Ben Grosser, an artist and a professor at the University of Illinois at Urbana-Champaign. “They already have so much power. To give an algorithm and a corporation access to which of the things on your feed you are most reactive to—it’s really useful information that tells them to not just tailor content to what they think you like, but they can push you.”

Grosser’s latest project is an attempt to push back. He made a browser extension he’s calling Go Rando, which intercepts each click of a reaction button on Facebook and uses a random-number generator to select a reaction for you. “If you click ‘Like,’ you might get ‘Angry,’ or you might get ‘Haha,’ or you might get ‘Sad,’” Grosser told me. “Users can still hover and select a specific reaction if they want to—but it will randomize their reactions for them.”
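For readers curious about the mechanics, the idea is simple enough to sketch in a few lines of content-script code. What follows is not Go Rando’s actual source; the `[aria-label='Like']` selector and the `sendReaction` helper are placeholders I’ve assumed for illustration, since the real extension has to locate and drive Facebook’s own (frequently changing) markup.

```typescript
// Hypothetical sketch of a reaction-randomizing content script.
// Selectors and the sendReaction helper are illustrative assumptions,
// not Go Rando's real implementation.

const REACTIONS = ["Like", "Love", "Haha", "Wow", "Sad", "Angry"] as const;
type Reaction = (typeof REACTIONS)[number];

// Pick one of the six reactions uniformly at random.
function randomReaction(): Reaction {
  return REACTIONS[Math.floor(Math.random() * REACTIONS.length)];
}

// Stand-in for whatever DOM manipulation would actually trigger the reaction;
// here it only logs what it would do.
function sendReaction(button: Element, reaction: Reaction): void {
  console.log(`Would react with "${reaction}"`, button);
}

// Intercept clicks on anything that looks like a Like button.
document.addEventListener(
  "click",
  (event) => {
    const target = event.target as Element | null;
    const likeButton = target?.closest("[aria-label='Like']");
    if (!likeButton) return;

    // Swallow the user's click and substitute a randomized reaction.
    event.preventDefault();
    event.stopPropagation();
    sendReaction(likeButton, randomReaction());
  },
  true // capture phase, so this handler runs before the page's own
);
```

Registering the listener in the capture phase is what lets an extension like this get to the click before the page does, which is the whole trick: the user’s intent goes in, noise comes out.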

The project is meant to encourage people to question what they’re doing when they click a Facebook reaction button. He wants people to ask: “Where does this data go? Who benefits from it? And who is made most vulnerable by it?”

“I want people to think about who is reading this data,” Grosser told me. “We think of [clicking reaction buttons for the benefit of] our friends, but the primary consumers of this data are not our friends. It’s for the news feed algorithm, advertising message profiling, predictive analytics. All these different systems that are looking to mine this data, hoping to understand our hopes or fears as a way of deciding how to sell us something, as a way of deciding whether we’re dangerous, as a way of deciding whether we’re worthy of getting a loan.”

Grosser concedes that actually using the browser extension is, at times, awkward. Like when a friend of his shared good news—excitement about the opening of a new art exhibit. Grosser clicked the “Like” button and Go Rando selected a sad-face reaction for him. “A lot of my friends have seen me post about my project, but they still are taken aback,” he says. “It’s like, ‘What’s going on? What are you sad about?’ It forces people to go into this conversation about what reactions are and how they might mean something or not mean something, or how they can be interpreted.”

Scrolling through a News Feed and clicking reaction buttons may feel as ethereal as waving at a friend from across a crowded room. It’s not.

“It’s almost a compulsive, involuntary behavior at this point,” Grosser said. “I think a lot of people can identify with the feeling of ‘liking’ something even if they didn’t really like it, because it’s important to indicate presence or having seen the item.”

The bottom line is this: Every time you click a button on Facebook, every time you indicate to a friend you’ve seen whatever it is they’ve posted, Facebook sees you back.

Adrienne LaFrance is the executive editor of The Atlantic. She was previously a senior editor and staff writer at The Atlantic, and the editor of TheAtlantic.com.