Extra Post: Did Meta Try to Encourage Political Polarization?

Throughout the last two presidential elections, there was an immense amount of talk surrounding political polarization on social media. Two platforms in particular received backlash over alleged attempts at spreading political misinformation, especially during elections: Instagram and Facebook, both owned by Meta. Every user gets an algorithmically curated feed based on who they follow, posts they have liked or commented on, headlines they have clicked, and so on. Accusations were made against these platforms claiming that, through the algorithm, they purposefully spread misinformation and opinionated content instead of factual reporting. Because people were seeing this content on their own feeds or on other people's, the accusations grew into a broader claim that Meta was single-handedly increasing the country's political division.



Because the algorithm is designed around each individual user, it holds quite a bit of power over how influential a recommended post or advertisement will be for that user. People argued that some users might believe everything a recommended post says precisely because they feel their algorithm was made just for them. What some people might not realize is that the algorithm pushes onto their feed more of whatever they have been viewing, searching for, and liking, because frequent interaction is what it treats as a signal of what the user wants to see. To test the accusations, a group of researchers gained unusual access to Facebook and Instagram data from the 2020 presidential election timeframe. According to the researchers, Meta did not try to control their findings. They ran experiments that replaced the ranking algorithm with a simple feed of posts from friends and also turned off the reshare feature. Users in these experiments encountered far fewer untrustworthy news sources and unsolicited political posts, but there was no measurable change in their political viewpoints. Altogether, these findings suggest that Meta users actively seek out content online that matches their political beliefs.
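The difference between the two feeds described above can be sketched as a toy ranking function. To be clear, this is an illustrative assumption, not Meta's actual system: the post fields, the engagement scores, and the ordering rules are all made up to show the contrast between an engagement-ranked feed and the researchers' chronological, friends-only, no-reshare variant.

```python
# Toy post records; "engagement" is a made-up score standing in for how
# strongly the algorithm predicts the user will interact with the post.
posts = [
    {"author": "news_page", "is_friend": False, "is_reshare": True,  "engagement": 0.9, "ts": 3},
    {"author": "alice",     "is_friend": True,  "is_reshare": False, "engagement": 0.2, "ts": 5},
    {"author": "pundit",    "is_friend": False, "is_reshare": False, "engagement": 0.8, "ts": 1},
    {"author": "bob",       "is_friend": True,  "is_reshare": True,  "engagement": 0.6, "ts": 4},
    {"author": "carol",     "is_friend": True,  "is_reshare": False, "engagement": 0.1, "ts": 2},
]

def ranked_feed(posts):
    """Engagement-style ranking: whatever the user interacts with most floats to the top."""
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

def experimental_feed(posts):
    """The researchers' variant: friends only, reshares removed, newest first."""
    friends_only = [p for p in posts if p["is_friend"] and not p["is_reshare"]]
    return sorted(friends_only, key=lambda p: p["ts"], reverse=True)

print([p["author"] for p in ranked_feed(posts)])
# → ['news_page', 'pundit', 'bob', 'alice', 'carol']
print([p["author"] for p in experimental_feed(posts)])
# → ['alice', 'carol']
```

Notice that the high-engagement outside pages and reshares dominate the first feed but vanish entirely from the second, which mirrors why the experiment's users saw fewer untrustworthy sources.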


It is much more common for Meta users who are conservative to believe content that is labeled as "misinformation" compared to liberals. According to fact-checkers, 97% of the political news sources on Facebook that were flagged for spreading misinformation were more popular among conservatives than liberals. Overall, I find these results believable. Algorithms are designed to deliver the content a user seeks out, so if a conservative is actively seeking out misinformation, the algorithm will put it on their feed. I think being shown misinformation is largely a user issue: Meta's main goal is to show each user content they will enjoy so that they keep returning to the app. So if you think you are being politically misinformed on social media, check your own history first, because you might have been repeatedly searching for the very thing you call misinformation.




News source: https://www.usnews.com/news/business/articles/2023-07-27/deep-dive-into-metas-algorithms-shows-that-americas-political-polarization-has-no-easy-fix (Created on July 27, 2023)
