Few people do science journalism right, and fewer still can tell good science journalism from bad.
I came across this recently. I clicked the link, which took me to an article describing a paper, which I then downloaded and read. The paper itself was OK as far as social science papers go, but, as usual, parts of the media that don't know any better jumped on it.
The paper concludes "that individuals with an East German family background cheat significantly more on an abstract task than those with a West German family background." It also concludes that "The longer individuals were exposed to socialism, the more likely they were to cheat on our task."
The points I would like to raise are below.
You might download a film illegally, but this does not make you a thief in general. Your choice to download a film at that point in time is a function of the cost-benefit equation as you see it and the social context of your choice: how many other people are doing the same, your perception of the 'rightness' or 'wrongness' of the act, and so on. It does not necessarily reflect your attitudes or preferences in other contexts.
Even if you could extrapolate the effect observed in the game to life at large, you must remember that correlation is not causation. The fact that those with an East German background cheated more does not mean that their background caused them to cheat. No one denies that economic systems can change people's behaviours and attitudes, and this is worth studying, but you cannot jump to conclusions. You need to rule out confounding variables, false positives and other possible causes, and you do that by controlling for and comparing against as many alternative explanations as possible. Did the researchers do this? Not completely.
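To make the confounding point concrete, here is a minimal sketch in Python. It has nothing to do with the actual study; the numbers are made up. A hidden variable Z drives both X and Y, so X and Y end up correlated even though neither causes the other:

```python
# Toy simulation: a hidden confounder Z produces a correlation
# between X and Y even though there is no causal link between them.
import random

random.seed(0)
n = 10_000
xs, ys = [], []
for _ in range(n):
    z = random.gauss(0, 1)          # hidden confounder
    x = z + random.gauss(0, 1)      # X depends on Z, not on Y
    y = z + random.gauss(0, 1)      # Y depends on Z, not on X
    xs.append(x)
    ys.append(y)

mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
var_x = sum((x - mean_x) ** 2 for x in xs) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n
r = cov / (var_x ** 0.5 * var_y ** 0.5)
print(f"correlation(X, Y) = {r:.2f}")  # ~0.5, despite no causal link
```

A naive reading of this data would conclude that X causes Y (or vice versa); only by measuring and controlling for Z can you see that the correlation is spurious. In the paper's case, culture-of-repression factors could play the role of Z.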
East Germany was not merely socialist; it also thrived on a culture of fear and repression, with secret police spying on citizens. Social and cultural factors could have led people to develop a habit of cheating, and these might have had nothing to do with the economic system. The researchers identify two such factors, economic scarcity and social comparison, but were not able to verify them using their methods.
--------------------------------------------------------------------------------------
All in all, this is not a bad paper compared to others I have read. The researchers are quite honest about most of their limitations. However, no one else seems to care. The original article that linked to this paper merely reiterated the findings as if they were settled, without taking into account the researchers' alternative explanations. That is bad journalism.
So many papers are published every month in various journals. Some of the journals are themselves shady and will publish poor research for a fee. Researchers are under pressure to publish, because publication records determine their reputation and pay in academia, so some fudge or manipulate data in dishonest ways to get positive results. Journals, in turn, have a publication bias towards positive results. And the journalists who write about these papers are usually under tight deadlines too. They cut corners. They trivialise, generalise and oversimplify. Many have a poor understanding of scientific domains, empiricism and critical reasoning, and most don't bother critiquing the papers they report on.
I have read so many bad science articles in the past three years that I have had to whittle my RSS feeds down to a handful of news feeds, scientists and professional science writers. On Twitter I am even stricter: I follow no pop science accounts, only professional researchers and people who will either share original research or go the extra mile and critique it, rather than blindly passing along links they come across. The best science communicators barely bother writing about the latest developments in psychology, given the faults in the field and the shakiness of its results: p-value hacking, selective sampling, failures to replicate, false assumptions, and so on.
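For readers unfamiliar with p-value hacking, here is a toy illustration, again entirely hypothetical and not drawn from any real study. If you run enough comparisons on pure noise and report only the "significant" ones, you will reliably find an "effect":

```python
# Toy p-hacking demo: run 20 comparisons where the null hypothesis
# is true by construction, and see how many look "significant".
import math
import random

random.seed(1)

def p_value(n=30):
    """Two groups drawn from the SAME distribution, so any
    'significant' difference is a false positive."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = sum(a) / n - sum(b) / n
    se = math.sqrt(2 / n)  # known variance of 1 in both groups
    z = abs(diff) / se
    # two-sided p-value from the normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

results = [p_value() for _ in range(20)]
hits = sum(1 for p in results if p < 0.05)
print(f"{hits} of 20 null comparisons came out 'significant'")
```

At a 0.05 threshold, roughly one in twenty null comparisons will cross the line by chance alone. Report only that one, and you have a publishable "discovery" built on nothing.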
So it is always sad when someone with a lot of followers shares a bad article. I am not writing this to be mean, to hurt anyone's feelings or to discourage anyone's work. I am just saying that there is a clear line between good science writing and poor science writing. I do not expect every journalist out there to be able to critique a paper (though it would help), but I do expect even a beginner to know the difference between a balanced article and a biased one.
This is not an isolated case. Other people on Twitter with massive reach also tend to share terrible links. I am sure they are lovely human beings who want the best for humanity, and they are smarter and more accomplished than me in many ways, but they still share badly written pop science articles that distort a subject, just because those articles carry a catchy headline.
So here is some advice for when you come across a piece of science journalism:
- If something sounds too good to be true, it usually is not. In the social sciences, genuine discoveries are few and far between, so an article claiming a major new effect should be met with scepticism.
- If an article claims something that you're sceptical about, read the paper and double-check whether those claims hold up.
- If you cannot critique the paper or do not have the time, do not share the article. Wait for someone else to critique it.
- Follow professional science communicators who know how to critique research, not mass-produced pop science outlets. The pros know how to write a balanced piece; pop news channels just want to grab eyeballs and do not care about accuracy.