Monday 4 August 2014

On Science Journalism


Few people do science journalism right, and fewer still can tell the difference between good and bad science writing. 




I came across this recently. I clicked the link, which took me to an article describing a paper that I downloaded and read. The paper itself was OK as far as social science papers go, but as usual, elements within the media that don't know any better jumped on it. 

The paper concludes "that individuals with an East German family background cheat significantly more on an abstract task than those with a West German family background." It also concludes that "The longer individuals were exposed to socialism, the more likely they were to cheat on our task." 

The points I would like to raise are below.

The first point concerns the statistical inference used in the study. The researchers note that both groups cheated, but that those with East German backgrounds cheated more than those with West German ones, to a degree that was statistically significant. I won't go into detail about p-value hacking, confidence intervals, effect sizes and power here, but suffice it to say that a statistically significant result does not necessarily reflect a real-life effect. This is just a function of probability. It is possible that neither group of people actually cheated, and that the statistical techniques simply picked up on variation that the researchers deemed significant. We do not know whether this significant difference represents cheating in real life, or whether it would hold if the study were repeated.
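To make that concrete, here is a minimal simulation of my own - not the authors' analysis - in which two groups of entirely honest participants report fair die rolls. The group size, the test and the 5% threshold are assumptions I have made purely for illustration; the point is that a "significant" difference between the groups still turns up about 5% of the time even though nobody cheats.

```python
# A minimal sketch (my own, not the authors' analysis): two groups of
# honest participants report fair six-sided die rolls, so by construction
# nobody cheats. Group size, test and threshold are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 10_000   # number of simulated studies
n_per_group = 100        # hypothetical participants per group
false_positives = 0

for _ in range(n_experiments):
    east = rng.integers(1, 7, size=n_per_group)   # fair die rolls, 1-6
    west = rng.integers(1, 7, size=n_per_group)
    _, p = stats.ttest_ind(east, west)            # compare mean reported roll
    if p < 0.05:
        false_positives += 1

print(f"'Significant' group differences with zero cheating: "
      f"{false_positives / n_experiments:.1%}")   # comes out around 5%
```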

Even if the effect (cheating) were present, there is no way you can automatically extrapolate the results of a game to a judgement of people's moral attitudes in general, because morality is complex. The fact that some people might cheat when given the opportunity to do so in a game of dice does not necessarily reflect their attitudes in general, or their choices in other situations. The researchers use terms like 'value system', but do not define what this encompasses. What constructs and concepts make up a value system? Is it objective or context dependent?

You might download a film illegally, but this does not make you a thief in general. Your choice to download a film at that point in time is a function of the cost-benefit equation as you see it, the social context of your choice, how many other people are doing the same, your perception of the 'rightness' or 'wrongness' of the act, and so on. It does not necessarily reflect your attitudes or preferences in other contexts.

Even if you could extrapolate the effect observed in the game to life at large, you must remember that correlation is not causation. The fact that those with an East German background cheated more does not indicate that their background is what caused them to cheat. No one is denying that economic systems can change people's behaviours and attitudes, and this is worth studying, but you cannot jump to conclusions. You need to rule out confounding variables, false positives and other possible causes, and you do this by making as many comparisons as possible. Did the researchers do this? Not completely.
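As a toy illustration of that point (again my own sketch, nothing from the paper), suppose a hypothetical background factor such as economic scarcity both makes someone more likely to carry the "East German" label and independently raises their propensity to cheat. A naive comparison of the two groups then "finds" that the East group cheats more, even though the label itself does nothing; adjusting for the confounder makes the apparent effect shrink towards zero.

```python
# A toy confounding model (my own sketch, nothing from the paper): a
# hypothetical factor - scarcity experienced while growing up - drives both
# the group label and the cheating rate. The label itself has no effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 1_000

scarcity = rng.uniform(0, 1, size=n)          # hypothetical confounder
east_background = rng.random(n) < scarcity    # label merely correlates with it
cheating_rate = 0.1 + 0.4 * scarcity + rng.normal(0, 0.05, size=n)

# A naive group comparison "finds" that the East group cheats more.
t, p = stats.ttest_ind(cheating_rate[east_background],
                       cheating_rate[~east_background])
print(f"naive comparison:      t = {t:5.2f}, p = {p:.3g}")

# Regress out the confounder and compare the residuals instead: the
# apparent group effect shrinks towards zero.
slope, intercept, *_ = stats.linregress(scarcity, cheating_rate)
residuals = cheating_rate - (intercept + slope * scarcity)
t2, p2 = stats.ttest_ind(residuals[east_background],
                         residuals[~east_background])
print(f"adjusted for scarcity: t = {t2:5.2f}, p = {p2:.3g}")
```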

East Germany was not merely socialist; it thrived on a culture of fear and repression, with secret police spying on citizens. There are social and cultural factors that could have led people to develop a habit of cheating and that might have had nothing to do with the economic system. The researchers identify two of these - economic scarcity and social comparison - but were not able to verify them using their methods.

The paper also fails to mention whether the researchers took into account the fact that former East Germans had been living in a new economic system for around 23 years (1990-2013) by the time of the study, and how this might have changed their preferences, choices and attitudes, perhaps to the extent that the effect of their background becomes meaningless to the study.

Even if all the points above are wrong and the researchers' assumptions thus far are correct, the inference that people exposed to socialism cheat more would still be incorrect, because socialist economic systems differ greatly from country to country. East Germany was a comparatively impoverished socialist country; the Scandinavian countries, for example, also had elements of socialist governance but were far wealthier. One could argue that the Scandinavian countries were never truly socialist, but that is missing the point. If the authors are talking about one specific kind of socialism, they need to be clear about this. If they are referring to socialism in general, then they need to test population samples in other formerly and presently socialist countries, and control for cultural and other differences, before they can make such an inference. They have not done this either.

--------------------------------------------------------------------------------------

All in all, this is not a bad paper compared to others I have read. The researchers are quite honest about most of their limitations. However, no one else seems to care. The original article that linked to this paper merely reiterated the findings as if they were established fact, without taking into account the researchers' alternative explanations. This is bad journalism. 

There are so many papers being published every month in various journals. Sometimes the journals themselves are shady and publish poor research for a fee. Researchers are under pressure to publish, as this is what determines their reputation and pay in academia, so some fudge or manipulate data in dishonest ways to get positive results. Journals have a publishing bias towards positive results. And the journalists who write about the papers published in journals are usually under tight deadlines too. They cut corners. They trivialise, generalise and oversimplify. They often have a poor understanding of scientific domains, empiricism and critical reasoning. Most don't bother critiquing the papers they report on.

I have read so many bad science articles in the past three years that I have had to whittle down my RSS feeds to the point where I only follow a few news feeds, scientists and professional science writers. On Twitter I am even stricter. I do not follow any pop science accounts, only professional researchers - people who will either share original research, or go the extra mile and critique it rather than blindly sharing links they come across. The best science communicators out there do not even bother writing about the latest developments in psychology, given the faults in the field and the shakiness of its results - the p-value hacking, selective sampling, failures to replicate, false assumptions, and so on. 

So it's always sad when an individual with a lot of followers shares a bad article. I am not writing this to be mean or to hurt anyone's feelings or discourage anyone's work. I'm just saying that there is a clear demarcation between good science writing and poor science writing. I do not expect every journalist out there to be able to critique a paper (though it would help) but I do expect even a beginner to know the difference between a balanced and a biased article.

When a journalist with impressive credentials chooses to share a link to a clearly biased article that, in order to push an agenda, deliberately ignores limitations in a paper that any second-year undergraduate at a middle-ranked Psychology department in the UK would notice, you have to question that person's credentials.

This is not an isolated case. There are other people on Twitter with a massive reach who also tend to share terrible links. I am sure they are lovely human beings who want the best for humanity and are smarter and more accomplished than me in many ways, but they still share terribly written pop science articles that distort a subject just because they have a catchy title or byline.

So here is some advice for when you come across a piece of science journalism - 


  • When something sounds too good to be true, it usually is not true. In the social sciences, discoveries are few and far between, so an article that claims to have discovered a major effect must be met with scepticism.
  • If an article claims something that you're sceptical about, read the paper and double-check whether those claims are true.
  • If you cannot critique the paper or do not have the time, do not share the article. Wait for someone else to critique it.
  • Follow professional science communicators who know how to critique scientific research, not mass-produced pop science junk. The pros know how to write a balanced piece. Pop news channels just want to grab eyeballs and don't care about accuracy.



