
Get the Facts: Understanding social media algorithms


If you're seeing the same types of stories repeatedly on your social media feeds, you're not alone, and it's due to something called an "algorithm."

"It's a series of steps that a computer follows in order to do what it's programmed to do," explains David Broniatowski from George Washington University.


Social media platforms like TikTok and Facebook use these algorithms to decide what content to show you. This decision is based on your engagement.

"The bottom line in terms of how these systems work is the main reason that you're seeing anything in social media is because someone in your network is engaging with it," says Sol Messing from the New York University Center for Social Media & Politics.

Engagement can mean a number of things: what you view, like, and comment on, who your friends are, and how you share content back and forth.

"When somebody views content, that factors in. When somebody engages with content, that probably factors in more heavily," Broniatowski said. In addition, he said your feeds could be impacted "when somebody adds a new friend or removes a friend, or their friends start doing things at all."

Experts say an algorithm on any platform can promote what's called confirmation bias. This means you're more likely to view or interact with content that reinforces your existing beliefs.

"People tend to connect to people to other people who are similar. So we end up consuming a lot of content from sources whose perspectives matches our own," Messing explains.

This can create a kind of echo chamber.

"There's really no way for any algorithm, no matter how advanced it is, to really get the full context of something that's being shared," says Broniatowski.

However, you can train your algorithm to give you a more robust experience by changing what you engage with.

"The more you try to look for different things, the more the algorithm will learn you like those different things," Broniatowski advises.

This means you'd see a wider variety of content suggested for you, making you less susceptible to confirmation bias and more likely to encounter multiple sides of a story.
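To picture how this "training" might work under the hood, here is a toy sketch of an interest profile that shifts as you engage with new topics. The update rule, learning rate, and topic labels are all assumptions made for illustration, not any platform's real model:

```python
# Toy interest profile that "learns" from engagement. The learning
# rate and topics are hypothetical, chosen purely for illustration.

def update_interests(interests: dict[str, float], topic: str,
                     learning_rate: float = 0.2) -> dict[str, float]:
    """Nudge the profile toward a topic the user just engaged with."""
    updated = {t: w * (1 - learning_rate) for t, w in interests.items()}
    updated[topic] = updated.get(topic, 0.0) + learning_rate
    return updated

# Start with a profile dominated by a single topic.
profile = {"politics": 0.9, "cooking": 0.1}

# Deliberately engage with different content, as the experts suggest.
for clicked in ["science", "cooking", "science", "travel"]:
    profile = update_interests(profile, clicked)

print({topic: round(w, 2) for topic, w in sorted(profile.items())})
# The "politics" weight shrinks while the newly explored topics grow,
# so the feed's suggestions would broaden accordingly.
```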

"The answer is follow people from the other side, follow people whose views you might not agree with, and read what they post every once in a while," suggests Messing.

While these types of mitigation techniques are helpful, more needs to be done.

"I think platforms need to keep on fielding new updates, new advances, with the full recognition that they're not going to be perfect, they're still better than nothing," says Broniatowski.

Social media platforms all have tools for you to regain control of what you see. Most of these can be found under the privacy settings or by toggling between different feeds.