The Social Media Algorithm: Addiction by Design

Why the power of social media algorithms should be kept in check

Sarah Patel, Editor

On Tuesday, April 27, the Senate Judiciary Committee questioned policy leads from Facebook, YouTube, and Twitter about the role of algorithms on their platforms and how they affect media consumers.

Senator Ben Sasse (R-NE) acknowledged that, while algorithms have great potential for good, they can also be misused. “We, the American people, need to be reflective and thoughtful about that,” he said.

Before the shift to algorithms, social media platforms displayed content chronologically, with the most recent posts at the top of one’s feed. Today, however, the placement of content on our screens is determined by a much more complex feature: the algorithm. 

Social media algorithms track what content you engage with and how you engage with it: which photos you like and comment on, which accounts you follow, which websites you visit, and even how long you linger on certain videos or posts.

Algorithms gather this information, rank which content is most relevant to each individual, and ultimately deliver that tailored content.
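
To make the mechanics concrete, here is a minimal Python sketch of engagement-weighted ranking. The signal names, weights, and topic labels are illustrative assumptions, not any platform’s actual formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

# Hypothetical engagement signals and weights; real platforms track far
# more signals, but the principle is the same: weight each interaction
# and sum the results into a relevance score.
SIGNAL_WEIGHTS = {
    "likes": 1.0,          # taps on the heart
    "comments": 3.0,       # commenting usually signals stronger interest
    "follows": 5.0,        # following an account is a strong signal
    "watch_seconds": 0.1,  # time spent looking at a post or video
}

def relevance(user_signals: dict, post: Post) -> float:
    """Score a post by the user's weighted engagement with its topic."""
    topic_signals = user_signals.get(post.topic, {})
    return sum(weight * topic_signals.get(name, 0)
               for name, weight in SIGNAL_WEIGHTS.items())

def build_feed(user_signals: dict, candidates: list[Post]) -> list[Post]:
    """Order candidate posts so the most 'relevant' appear first."""
    return sorted(candidates, key=lambda p: relevance(user_signals, p),
                  reverse=True)

# Example: heavy engagement with one topic pushes it to the top.
signals = {"sebastian_stan": {"likes": 40, "watch_seconds": 900}}
posts = [Post("p1", "cooking"), Post("p2", "sebastian_stan")]
print([p.post_id for p in build_feed(signals, posts)])  # ['p2', 'p1']
```

Because every interaction only raises a topic’s score in a model like this, heavy engagement with one subject quickly crowds out everything else.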

“I once looked up Sebastian Stan,” said junior Anna Pierson. “Pinterest took this and ran with it; the next day my whole feed was never-ending pictures of him.” 

Social media companies use algorithms to maximize the amount of time we spend on their platforms. Higher engagement means more advertising impressions, so by keeping us addicted, social media platforms make more money.

“The business model is addiction,” said Senator Sasse. 

Beyond keeping us glued to our screens, algorithms harm us in subtler ways. By feeding users only content they are interested in or already agree with, algorithms contribute to a phenomenon that social psychologists call the false consensus effect.

“The false consensus effect explains that people tend to overestimate how much other people agree with them on matters of opinion, attitudes, and habits,” said AP Psychology teacher Julie Harris. “People tend to view alternatives as rare, and even deviant.”   

When certain viewpoints seem to differ from the perceived majority, we may dismiss them as illogical or senseless.

“I personally choose to not post many controversial things on social media, just because I know it will lead to negativity and judgement,” said junior Catherine Morris. 

Another concern with the rise of social media algorithms is the rapid spread and amplification of misinformation. Unfortunately, the customization of posts to your interests doesn’t stop with your favorite foods or TV shows: it extends to news.

“People are exposed to what they already believe,” said Harris. 

And just because a story appeals to you, or provokes you to like or share it, doesn’t necessarily mean it’s true.

“It seems like you never know whether you’re being shown a satire story or a real, evidence-based one,” said junior Rose Nugent. “I think it’s pretty difficult to discern what’s real and what’s fake, but I know that if the post supports an opinion I already believe, I’m more likely to view it as true.”  

In an age where artificial intelligence accurately predicts our behaviors online, it’s easy to feel defenseless. It’s as if we’re fighting a losing battle against a faceless enemy. 

As media consumers, we have only limited power to shield ourselves from these algorithms, but we are not helpless.

Switch to a chronological timeline, if available. Twitter offers one: tap the “sparkle” icon in the upper right-hand corner of the app and choose “Switch to latest Tweets.”

Limit your time on social media. Instagram has a feature that will remind you to “pause” once you’ve reached your chosen time limit. You can also customize restrictions for certain apps in the Screen Time section of the Settings app.

Be mindful of what others are sharing, and think critically about what is being shown to you before you double-tap. 

Although these suggestions help, it’s clear that we must also regulate the algorithms themselves. Perhaps algorithms could be designed to display a hybrid feed: some content customized to the user, and some generic content chosen without regard to the user’s past behavior. This way, users would be exposed to a more diverse range of information.
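
As a rough sketch of what such a hybrid might look like, the Python snippet below blends a personalized ranking with a non-personalized pool at a fixed ratio. The 70/30 split and both content sources are assumptions for illustration, not a design any platform has adopted.

```python
import random

def hybrid_feed(personalized: list, generic: list,
                personalized_share: float = 0.7, size: int = 20) -> list:
    """Blend personalized picks with content untouched by the user's history.

    `personalized` is assumed to be sorted best-first (for example, by an
    engagement score like the one sketched earlier); `generic` is a pool
    chosen without reference to the user, such as editors' picks.
    """
    n_personal = int(size * personalized_share)
    n_generic = size - n_personal
    feed = (personalized[:n_personal]
            + random.sample(generic, min(n_generic, len(generic))))
    random.shuffle(feed)  # interleave so generic items aren't buried at the end
    return feed
```

Even a modest generic share would guarantee that part of every feed is untouched by a user’s history, pushing back against the false consensus effect described above.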

There must be a way for us to enjoy social media while maintaining an accurate perception of who agrees with us, an ability to identify misinformation, and the freedom to step away from the screen whenever we please.