Social media is reshaping the way Americans consume news and engage with current events. With the rise of smartphones and near-constant access to the Internet, social media users can instantly share news stories, images, and videos and participate in political discussions on their own pages. According to Pew Research, 64% of U.S. adults have a Facebook account, and more than 40% of all Americans check Facebook every day. Nearly half of social media users have read and shared news stories on Facebook or Twitter. People increasingly rely on Facebook to take part in the political conversation, and Facebook is now the primary source of news for millennials. Meanwhile, print newspapers are struggling to survive and have become increasingly obsolete.
The evolution of political engagement on social media has led to the rise of hyperpartisan Facebook pages that play on readers’ biases and publish stories containing false information. These pages, which exist on both sides of the political spectrum, often use misleading headlines and unverified claims to attract clicks, likes, and shares. In an analysis of more than 1,000 media posts from six hyperpartisan Facebook pages, BuzzFeed found that right-wing pages published false information 38% of the time, while left-wing pages published misleading information 20% of the time. A further 19% of all posts reviewed fell into the “no factual content” category of jokes and memes. The articles containing the most inaccurate information attracted the highest engagement, drawing the most shares, comments, and followers.
This phenomenon manifested itself during the recent presidential election, when partisan Facebook pages widely circulated false information about both sides of the aisle. For example, The Political Insider published an article titled, “WikiLeaks CONFIRMs Hillary Sold Weapons to ISIS…Then Drops Another BOMBSHELL! Breaking news.” Ending the Fed falsely claimed that Pope Francis endorsed Donald Trump for president, while a fake news source called the Denver Guardian reported that an FBI agent involved in the Hillary Clinton email leaks was found dead in an apparent murder-suicide. On the liberal side of the spectrum, Occupy Democrats falsely stated that Trump wants to expel all Muslims, including citizens, from the United States. Occupy Democrats also reported that Putin was responsible for rigging online polls showing that Trump won the first presidential debate.
Given the popularity of these pages and articles, a disconcerting conclusion follows: the most effective way to increase engagement and gain followers for political news content on Facebook is to publish inaccurate information that plays into biases and stereotypes. Through personalized news feeds, social media users can follow pages and articles that reinforce what they want to hear, without encountering other perspectives. This creates digital bubbles and echo chambers in which people engage only with users who share the same liberal or conservative viewpoints, driving further division and polarization. Nieman Lab notes, “American political discourse in 2016 seemed to be running on two self-contained, never-overlapping sets of information. It took the Venn diagram finally meeting at the ballot box to make it clear how separate the two solitudes really are.”
On November 18, 2016, Mark Zuckerberg announced Facebook’s plan for combating misinformation on its site. He explained that addressing the problem is difficult, as Facebook does not want to discourage users from sharing their opinions or mistakenly remove articles that contain accurate information. Facebook’s strategy includes projects aimed at stronger detection of misinformation, easier reporting of fake stories, partnerships with third-party fact-checking organizations and journalists to draw on their input and expertise, and warnings shown when users read or share articles that members of the community have flagged as false.
Going forward, what does the rising popularity of fake news mean for the future of reliable journalism? How can we slow the spread of fake news and return to reporting grounded in fact, even if factual articles lack the same clickbait appeal? Facebook could continue to adjust its algorithms to curb the spread of misinformation and the political polarization of the digital world, and Google is likewise revising its policies on page and website bans. On a more individual level, we can each be mindful of the prevalence of misinformation in our social media feeds, and work to open up our social media universes to other viewpoints and perspectives. Doing so will build better online experiences and, with luck, lead to more civil, informed political discussions in our society.