The social network announced Wednesday that it had started changing its algorithm to reduce the political content in users’ news feeds. The less political feed will be tested on a fraction of Facebook’s users in Canada, Brazil and Indonesia beginning this week and will be expanded to the United States in the coming weeks, the company said.
“During these initial tests we’ll explore a variety of ways to rank political content in people’s feeds using different signals, and then decide on the approaches we’ll use going forward,” Aastha Gupta, a Facebook product management director, wrote in a blog post announcing the test.
Facebook previewed the change last month when Mark Zuckerberg, the chief executive, said the company was experimenting with ways to tamp down divisive political debates among users.
“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” he said.
Political stories won’t disappear from users’ feeds altogether. Content from official government agencies and services will be exempt from the algorithm change, Facebook said, as will information about COVID-19 from organizations like the Centers for Disease Control and Prevention and the World Health Organization. Last month, Zuckerberg said users would also still be able to discuss politics inside private groups.
“They can be ways that people organize grassroots movements, speak out against injustice or learn from people with different perspectives, so we want these discussions to be able to keep happening,” Zuckerberg said.
Facebook has been under fire from lawmakers from both parties. Liberals have blamed the company for allowing hate speech and misinformation to spread, while conservatives have claimed that they were censored.
Making Facebook less political could satisfy critics who blame it for increasing partisan polarization. But the move could also cut into the time that users spend on the app. Many of the most-engaged news stories on Facebook are political, and charged political debates often generate the heavy use and repeat visits that are good for the bottom line.
Data released by Facebook last fall showed that during one week in October, seven of the 10 most-engaged pages were primarily political, including those of President Donald Trump, Fox News, Breitbart and Occupy Democrats.
Three years ago, Facebook said it would pull back on the amount of content posted to the site by news publishers and brands, an overhaul that it said put more focus on interaction among friends and family. At the time, Zuckerberg said he wanted to make sure that Facebook’s products were “not just fun but good for people.” He also said the company would take those actions even if it meant hurting the bottom line.
Still, Facebook users have had no problem finding political content. Nongovernmental organizations and political action committees paid to show millions of Americans highly targeted political advertising in the months before November’s presidential election. Users created vast numbers of private groups to discuss campaign issues, organize protests and support candidates. Until recently, Facebook’s own systems frequently suggested new, different political groups that users could join.
Facebook has backtracked on some of this in recent months. After the polls closed on Election Day, the company shut down the ability to buy new political advertising. And after the deadly Capitol riot Jan. 6, Zuckerberg said the company would turn off the ability to recommend political groups to “turn down the temperature” on global conversations.
Under the new test, a machine-learning model will predict the likelihood that a post — whether it is posted by a major news organization, a political pundit, or your friend or relative — is political. Posts deemed political will appear less often in users’ feeds.
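The mechanics described here — a model scores each post's probability of being political, and high-scoring posts are demoted in the feed ranking — can be illustrated with a minimal sketch. The function names, threshold, and demotion factor below are assumptions for illustration only, not Facebook's actual system.

```python
# Hypothetical sketch of classifier-based feed demotion.
# `rerank_feed`, the 0.8 threshold, and the 0.5 demotion factor
# are illustrative assumptions, not a real Facebook API.

def rerank_feed(posts, political_prob, threshold=0.8, demotion=0.5):
    """Demote posts that a classifier judges likely to be political.

    posts: list of (post_id, base_score) tuples from the normal ranker
    political_prob: function mapping post_id -> probability the post
        is political, as predicted by the machine-learning model
    """
    reranked = []
    for post_id, score in posts:
        if political_prob(post_id) >= threshold:
            # Political posts get a reduced score, so they
            # "appear less often in users' feeds"
            score *= demotion
        reranked.append((post_id, score))
    # Higher-scoring posts surface first in the feed
    return sorted(reranked, key=lambda p: p[1], reverse=True)
```

In this sketch, a post scored at 0.95 probability of being political would have its ranking score halved and could fall below non-political posts it previously outranked.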
It’s unclear how Facebook’s algorithm will define political content, or how significantly the changes will affect people’s feeds. Lauren Svensson, a Facebook spokeswoman, said the company would keep “refining this model during the test period to better identify political content, and we may or may not end up using this method longer term.”
It is also unclear what will happen if Facebook’s tests determine that reducing the political content also reduces people’s use of the site. In the past, the company has shelved or modified algorithm changes that aimed to lower the amount of misleading and divisive content people saw, after determining that the changes caused them to open Facebook less frequently.
Political posts make up only about 6% of what U.S. users see on their feeds, Facebook said. But given the headaches that these posts have caused for the company, it’s no mystery why it wants to shrink that number.
“Even a small percentage of political content can impact someone’s overall experience,” Gupta wrote.