
Years after opening a Pandora’s box of bad behavior, social networking companies are trying to find subtle ways to reshape the way people use their platforms.

Following Twitter’s lead, Facebook is testing a new feature designed to encourage users to read a link before sharing it. The test will reach 6% of Facebook’s Android users globally in a gradual rollout that aims to encourage “informed sharing” of news on the platform.

Users can still easily click to share a given story, but the idea is that by adding friction to the experience, people might rethink their original impulses to share the kind of content that currently dominates the platform.

Twitter introduced prompts last year that urged users to read a link before retweeting it. The company quickly found the feature successful and expanded the test to more users.

Facebook began testing messages like this last year, launching pop-ups that warned users before they shared content more than 90 days old in an effort to reduce misleading stories taken out of their original context.

At the time, Facebook said it was looking at other pop-up messages to reduce some types of misinformation. A few months later, the company launched similar pop-ups that showed users the date and source of COVID-19-related links before they shared them.

The strategy reflects Facebook's preference for passively nudging people away from misinformation and toward its own verified resources on hot-button issues like COVID-19 and the 2020 U.S. election.

While the jury is still out on how much this kind of soft behavioral moderation can do against the misinformation epidemic, both Twitter and Facebook have also explored prompts that discourage users from posting abusive comments.

Pop-up messages that give users the sense that their misbehavior is being observed could be where more automated moderation on social platforms is headed. Users would likely be much better served by companies that remove misinformation outright and rebuild abuse-plagued platforms more carefully from the ground up, but for now, small behavioral nudges will have to suffice.
