Faculty

Commerce Professors Study How Facebook Pushes Users, Especially Conservative Users, Into Echo Chambers

Studying data from more than 200,000 users, Brent Kitchens, Steven L. Johnson, and Peter Gray found that Facebook pushes users, particularly conservative users, to more polarizing news sites, but Reddit and Twitter do not have the same effect.

This week, Facebook CEO Mark Zuckerberg once again appeared before Congress, facing questions about how the massive social media platform that he created affects America’s political culture, and vice versa.

Thanks to an extensive study that includes four years of browsing data from 200,000 users on Facebook, Reddit, Twitter, and online news sites, three University of Virginia professors have some answers.

Analyzing opt-in user data from 2012 to 2016, Brent Kitchens, Steven L. Johnson, and Peter Gray, all faculty members in UVA’s McIntire School of Commerce, found that all three social media sites connect their users to a more diverse range of news sources than they would otherwise visit. However, Facebook tends to polarize users, particularly conservative users, more than other social media platforms.

In fact, the researchers found that typical conservative users, in months when they visited Facebook more than usual, read news that was about 30% more conservative than the online news they would typically read. Users who visited Reddit more than usual, on the other hand, read news that was about 50% more moderate than what they would typically read.

“We found that Facebook had a polarizing effect, Reddit had a moderating effect, and Twitter did not have a significant effect” on the types of news people consume, Kitchens said, noting that the study came before Twitter changed the algorithm it uses to build its feed in 2016.

Peter Gray, Steven Johnson, and Brent Kitchens

Their findings, which will be published in a forthcoming peer-reviewed article in MIS Quarterly, have already drawn attention in The Washington Post, where the professors contributed an op-ed, and on other news sites.

The key difference, they say, comes down to how each site structures the algorithm that determines what news users see when they log on. Though Facebook does not publicly disclose how its algorithm works, it ostensibly ranks posts by “engagement,” or the number of likes, comments, and shares a post garners. Reddit is more topic-based: users opt in or out of discussion boards devoted to the topics that interest them. Twitter, until 2016, was purely chronological, showing news as it became available.
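For readers who want to see the contrast concretely, the sketch below is a minimal, purely illustrative rendering in Python of the three approaches described above. The data fields and the simple sum of likes, comments, and shares are assumptions for illustration only, not the platforms' actual code.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Set

    @dataclass
    class Post:
        topic: str
        created_at: datetime
        likes: int = 0
        comments: int = 0
        shares: int = 0

    def engagement_ranked(posts: List[Post]) -> List[Post]:
        # Facebook-style, as described above: posts with the most likes,
        # comments, and shares rise to the top of the feed.
        return sorted(posts, key=lambda p: p.likes + p.comments + p.shares, reverse=True)

    def topic_filtered(posts: List[Post], subscriptions: Set[str]) -> List[Post]:
        # Reddit-style: only posts from boards the user has opted into appear.
        return [p for p in posts if p.topic in subscriptions]

    def chronological(posts: List[Post]) -> List[Post]:
        # Pre-2016 Twitter-style: newest first, with no engagement weighting.
        return sorted(posts, key=lambda p: p.created_at, reverse=True)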

“On Facebook, inflammatory content tends to drive the most engagement, because it elicits emotion,” Kitchens said. The algorithm rewards that emotion, which can drown out more moderate posts and amplify more extreme ones.

That phenomenon holds true for both conservative and liberal users, but Kitchens, Johnson, and Gray theorize that the effect is stronger for conservative users because they tend to turn to a large network of right-leaning sites while eschewing more moderate “mainstream media.”

“The echo chamber effect was definitely present for both conservatives and liberals, but more pronounced for conservatives,” Kitchens said. “This could be because of the landscape of news outlets at that time, with more liberal-leaning sites clustered at the center or on the moderate left, and more conservative sites farther to the right. The distribution is a bit skewed.”

Whom users choose to follow and how they follow them also have an effect. On Facebook, many of the posts in users’ feeds come from friends, who must accept a friend request before a user can see their posts. Reddit and Twitter users, on the other hand, can follow any topic or person without any sort of mutual relationship.

“On Facebook, you have to have a reciprocating relationship,” Johnson said. “It is a bit different for Facebook groups or business pages, but most of the content someone sees on Facebook comes from their friends, and they are likely to have friends with similar views to their own.”

The effects of political polarization on American culture are obvious, from the vitriol around the recent presidential election to politicized debates about mask wearing or other public health measures. The solution, however, is less apparent, especially when it comes to regulating social media platforms.

“We have a long history of organizations, companies, and for-profit businesses acting in ways that were not good for all stakeholders involved, and ultimately needing legislation to establish basic things like employee and environmental protections,” Gray said. “Now, I think the big open question is if there is enough willpower for these companies to change on their own, or if it will take the government establishing baseline regulations.”

Competition, he said, could also play a role.

“The more committed a social media company is to an algorithm that is not necessarily beneficial to users, the more it opens the door for another company to come along and do a better job creating value for users,” he said. “That could be really interesting. We have seen a lot of attempts to start new social media companies, but none have really taken off yet.”

Kitchens noted that there are many benefits to Facebook and other social media platforms, including the overall diversifying effect they have had on users’ news consumption.

“These platforms are not all bad; they have the potential to be beneficial, as we saw in our study. These platforms can shift people’s point of view, and widen it,” he said. “Their whole purpose is to show you something that you would not have seen if you just navigated directly to your favorite news site. So, the beneficial aspects are there, but we need to work out the kinks.”

All three professors were enthusiastic about an idea that Twitter CEO Jack Dorsey expressed in another recent congressional hearing: allowing users to choose how the algorithm that builds their news feed functions. They might, for example, choose an algorithm that curates the most trustworthy news from fact-checked sources, or opt to be shown a random selection of viewpoints different from their own.
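As a loose illustration of what user-selectable feeds might look like under the hood, the toy Python sketch below registers a few interchangeable ranking strategies and lets the caller pick one. The strategy names and the "trust_score" field are hypothetical assumptions, not drawn from any platform's actual design.

    import random
    from typing import Callable, Dict, List

    # Hypothetical, user-selectable feed builders. Field names such as
    # "trust_score" are illustrative assumptions, not real platform data.
    FeedBuilder = Callable[[List[dict]], List[dict]]

    FEED_BUILDERS: Dict[str, FeedBuilder] = {
        # Rank posts by an assumed fact-check/trust score, highest first.
        "trustworthy": lambda posts: sorted(
            posts, key=lambda p: p.get("trust_score", 0), reverse=True
        ),
        # Shuffle posts randomly, ignoring engagement signals entirely.
        "random_mix": lambda posts: random.sample(posts, k=len(posts)),
    }

    def build_feed(posts: List[dict], choice: str) -> List[dict]:
        # Return the feed produced by whichever strategy the user selected.
        return FEED_BUILDERS[choice](posts)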

“I think providing users with more transparency about why they are seeing what they are seeing, and giving them more options, is a really promising idea,” Johnson said.

The time to act, Kitchens said, is approaching.

“These are growing pains, as innovations in technology lead to new problems that we did not foresee and now need to figure out how to fix,” he said. “I absolutely think we can, and we need to act while founders like Mark Zuckerberg and Jack Dorsey are still involved in the day-to-day running of their business. The profit motivation of these companies will only grow, so we need to take advantage of this time with founders who, hopefully, are investing in creating social good.”

By Caroline Newman, Associate Editor, Office of University Communications, cfn8m@virginia.edu, 434-924-6856. This story was first published in UVA Today Nov. 18, 2020.
