‘Fake news’ flags won’t get to the truth
Let's trust people to be critical news consumers.
Ah, Facebook – the 21st-century platform that so perfectly embodies all of our human failings: boastfulness, narcissism, the stalking of former lovers. It can bring joy (the validation of receiving more than 50 likes for a post); it can bring guilt trips (‘If you care about this disease, you will post this for 24 hours. I know my friends will…’); it can bring envy (holiday snaps). And Facebook is constantly evolving, finding new ways to add to the myriad ways in which it already annoys its users. The latest was announced last week.
A prompt titled ‘How to spot fake news’ is set to appear at the top of users’ newsfeeds. The prompt offers 10 tips for spotting fake news, which include looking out for spelling errors (that’s the Grauniad finished then), being sceptical of headlines, and asking yourself the question: ‘Is the story a joke?’ Yes, Facebook users can now look forward to being patronised by the social network that brought them the online ‘poke’.
Adam Mosseri, head of Facebook’s newsfeed, says: ‘False news is harmful to our community, it makes the world less informed, and it erodes trust.’ Mosseri has described the fake-news measure as ‘an educational tool to help people spot false news’, suggesting Facebook believes it should play a role in educating its users. Indeed, the very phrase ‘educational tool’ implies Facebook doesn’t trust people to identify for themselves what is fake and what is real.
Facebook’s move was inevitable given the hysteria about fake news from political and media elites. After Donald Trump won the US election last year, his adversaries were quick to point the finger at social media and online search engines for disseminating false stories – like the now infamous ‘Pizzagate’ story that falsely accused Hillary Clinton of being part of a paedophile ring operating out of a pizza restaurant.
It came as no surprise that, on Friday, Google also made an announcement about fake news. It is introducing a new product, Fact Check, which will be available under ‘Search’ and ‘News’. It will identify which stories have been verified by fact-checking organisations or by reputable news publishers. Google won’t be doing the fact-checking itself – instead results will show stories that have been verified by independent fact-checkers such as PolitiFact and Snopes. The tool will also allow established news publishers, including the Washington Post and the New York Times, to fact-check each other’s stories. These fact-checked labels won’t, however, affect the ranking of a story in Google results.
A Google blog post by Justin Kosslyn, product manager of Google subsidiary Jigsaw, and research scientist Cong Yu, says:
‘There may be search-result pages where different publishers checked the same claim and reached different conclusions. These fact checks are not Google’s and are presented so people can make more informed judgements. Even though differing conclusions may be presented, we think it’s still helpful for people to understand the degree of consensus around a particular claim and have clear information on which sources agree. As we make fact checks more visible in search results, we believe people will have an easier time reviewing and assessing these fact checks, and making their own informed opinions.’
This all sounds rather confusing. If the aim is to flag up false stories to enable readers to see the ‘truth’, then how will conflicting conclusions be of any help?
Herein lies the problem with affording authority to fact-checkers and certain publishers, and with making social networks the arbiters of truth. The truth, especially in politics, is not always clear-cut. As with most things, there are shades of grey. Time magazine is a well-respected publication and yet one of its top reporters falsely claimed that the bust of Martin Luther King had been removed from the Oval Office in the White House once Trump became president (the claim has since been retracted).
Adding arbitrary and even conflicting labels to news stories seems unlikely to be of any help to readers, yet online content platforms are in a rush to implement them. This is understandable given the government’s current obsession with how these kinds of companies present information. Just last month, Google, Facebook and Twitter were summoned before UK Labour MP Yvette Cooper and the Home Affairs Select Committee and interrogated as part of an inquiry into extremist and false content hosted on their sites. Under so much pressure from the powers-that-be, is it any wonder companies like Google have jumped to introduce superficial ‘fake news’ fighting tools?
It often seems that those who are most concerned about fake news are either people in power looking to assert more control over online content, or observers who picked the losing side in recent elections or referendums and who are now looking for someone to blame. Many of those still fighting to prevent Brexit from happening claim the Leave vote was a product of Vote Leave’s lies and fake claims. There is a strongly political element to the fake-news panic and the proposed solutions. This is not about truth and neutrality.
Undoubtedly the internet is an easy platform on which to spread false stories, and that is a concern. It is important in free, democratic societies that people have access to news that is as objective as possible. But censorious pressure from governments and patronising ‘educational tools’ are not the answer. Such measures undermine the capacity for critical thinking that is innate in all of us. In a world where an abundance of information is available, it is the responsibility of all free-thinking individuals to read, research, and decide for themselves what is the truth.
Naomi Firsht is staff writer at spiked and co-author of The Parisians’ Guide to Cafés, Bars and Restaurants. Follow her on Twitter: @Naomi_theFirsht