In a recent study, Facebook claimed that users themselves are responsible for the filter bubble, the effect by which posts a user disagrees with are hidden from their News Feed. The study set out to find why users see posts that mirror their own beliefs.
Facebook studied 10.1 million users who had listed their political affiliation as either liberal or conservative, from July 2014 to January 2015. The study examined how often users' News Feeds contained articles with contrary opinions, which the researchers call "cross-cutting" content. It found that Facebook hides 1 in 13 "cross-cutting" links if you identify as liberal, and 1 in 10 if you identify as conservative.
"The research was conducted on a small, skewed subset of Facebook users who chose to self-identify their political affiliation," said Prof. Zeynep Tufekci of the University of North Carolina at Chapel Hill.
Tufekci argues that the study does not represent Facebook as a whole, since only about 4 percent of the user population was sampled. The study concludes that users tend to create their own filter bubbles by clicking on stories that support their existing beliefs; Tufekci counters that the News Feed algorithm itself filters out diverse opinions.
Social scientist Christian Sandvig, in a blog post, pointed out problems both in the study conducted by Facebook's team and in the framing of its results. Facebook's data scientists reported that the company's algorithm decides which articles appear in the News Feed, yet declined to take a stance on whether that is a good or bad thing, a position Sandvig called ridiculous. In 2012, Facebook drew backlash from the academic community when it experimented on users by showing them more positive or more negative content, then monitored whether those users went on to post positive or negative statuses.
The site's algorithm could play a major role in the U.S. during election season, since it decides which news users will see. In conclusion, Facebook's News Feed is not simply a reflection of users' interests, though the social-networking site claims otherwise.
[ Via ]