‘Rumour Cascades’, a joint study from Stanford University and Facebook, provides some fascinating insights that can serve as advice for public bodies on halting the spread of misinformation.
This was first published on the Guardian website in mid-May.
A new study, jointly conducted by Facebook and Stanford University, has shed light on how rumours spread on the social network – research that offers a number of useful insights for councils and other organisations concerned with public safety. Social media is often a source of a great deal of misinformation, and misinformation, especially in times of emergency, can be dangerous.
The study tracked thousands of rumours documented on a website that catalogues urban legends and examined how they spread on Facebook. These spurious stories ranged from political claims to faked images.
Here’s what they found:
• True rumours spread further
People on Facebook seem to have some ability to spot the truth. Rumours that are true are shared by more people (an average of 163 shares per post), whereas false rumours are less likely to be reposted to friends (an average of 108 shares per post).
• Facebook posts that are corrected in the comment sections are often deleted
When a post received a comment with a link debunking a false rumour, that comment increased the likelihood that the reshare would be deleted. However, even when people discovered that a rumour was untrue and deleted their reshares, the rumour could continue to spread.
• Rumours have a habit of resurfacing
Rumours – even ones that have been circulating for years – can burst to life again. A rumour might lie dormant for weeks or months and then, either spontaneously or through an external jolt, become popular again. The study also showed that rumours change over time and that different variants of a rumour tend to dominate during different bursts in popularity.
What should public bodies do when confronted with a Facebook rumour?
When an organisation doesn’t provide much information, the public can be left feeling that it has something to hide. A steady stream of information can increase trust and curb rumours.
Last December, a town in north Queensland was the subject of a Facebook hoax accusing its council of banning Christmas lights. The council used its Facebook page to correct the rumour, posting that the story was unfounded.
In November, an untrue rumour about a coffee shop in Tunbridge Wells sparked racial tensions. Facebook posts accused the franchise’s Muslim owner of objecting to closing for the Remembrance Parade.
According to a local newspaper, one Facebook user posted to complain that the refusal showed a lack of respect, while another immediately linked it with poppy-burning images in the media. When the Kent and Sussex Courier spoke to the latter user, she said: “That’s what people were saying and I assumed it was the truth. If it’s not, I take it back.” Ten minutes later, the Facebook comments were removed.
In these kinds of situations, the research suggests that councils should make an extra effort to find some of the most-shared posts promoting the untrue rumour and comment there too. As the story spreads, the comment correcting any untruths will travel with it. The research suggests that seeing a credible source debunk a rumour slows down how much it is shared. Public bodies will increasingly need to engage elsewhere on Facebook, rather than just posting on the official pages they manage themselves.
A study in the Australian Journal of Emergency Management quotes the US Director of Social Strategies for the Red Cross: “During the record-breaking 2011 spring storm season, people across America alerted the Red Cross to their needs via Facebook. We also used Twitter to connect to thousands of people seeking comfort and safety information to help get them through the darkest hours of storms.”
Commenting with facts on Facebook posts could be another way to keep people safe in cases of emergency. The more people, for instance, who read a legitimate government source comment – perhaps correcting false information about riots in their city, or casualties in a storm – the more likely they are to react in a sensible way.
‘Rumour Cascades’ was written by Adrien Friggeri, Lada A. Adamic, Dean Eckles, and Justin Cheng.