Monday 21 December 2015

The Science of Everyday Thinking on EdX


I had the pleasure of completing 'The Science of Everyday Thinking' on EdX recently. The course deals with a lot of the things I've been thinking about for the past few years, so I noted down a lot of my thoughts.

Illusions

The course begins by stressing that it is really difficult to put yourself in the shoes of others. We overestimate how well others can know what we know. An example of this is tapping out a song on a table: we expect 25% of listeners to guess the song correctly, but in reality only 2.5% do.

We're great at pattern recognition, maybe even too good at it. Things that match our expectations float to the top of our minds, so we see real effects in noisy data - a face on toast, for example. We sharpen things to fit what we expect to see - the 'expectancy effect' - and level the things that don't fit.

The course also stresses how faulty memory can be. Memory is not like a video camera; every time we remember something, we reconstruct past events in our mind. I had personal experience with this when helping one of my classmates at uni with false memory experiments. It was interesting to see how people really believed they had seen something when they hadn't. I do this too, which is why I now write down certain events immediately after they happen so I don't get sequences of events mixed up.

We exhibit Naive Realism - we think the world is as we perceive it to be. This is wrong.

We exhibit the fundamental cognitive error - we tend to underestimate the contribution of our beliefs and theories to observation and judgement, and fail to realise how many other ways our observations could have been interpreted.

Know Yourself

Planning fallacy - we are terrible at planning, judgement-making and self-assessment. Examples are driving ability, attractiveness and morals. Even though we fall on a bell curve for these traits, and half of the population by definition falls below the median, we are incapable of accepting that we could be in the bottom half. Statistically speaking, each of us has to be in the bottom half at something, but we will never admit it.

I've seen this first hand when planning my own goals. Many a time I've planned a journey assuming I'd be ready by a certain time, only to find I've taken longer to get ready. I overestimate my own ability to be ready in time. It's the same with my learning goals. I keep subscribing to the belief that I am a super-fast learner who can do multiple courses at once, and I always end up struggling with too many things on my plate. I've learned to cut back and take things slower. No one can be great at everything. I've also seen this when proof-reading for foreign students at university. Students would be incredulous at the number of mistakes I found in their writing and the amount of re-writing that was required. They thought their grammar was decent when it wasn't. Their unrealistic expectations were tied to incorrect evaluations of their own abilities.

The false-consensus effect - we overestimate the extent to which our beliefs are typical of those of others. We believe that other people generally think like us. It's important to be reminded that this is not the case.

People don't even know what makes them happy. The true reasons people are happy are usually different from the reasons they provide. I need to do a separate post on happiness, as I'm currently researching this.

Job interviews are usually bad because of confirmation bias - interviewers see what they expect to see. They make up their minds about a candidate soon after meeting them and then only ask questions that confirm their beliefs. Structured interviews, where every candidate is asked the same questions, are better.

People tend to exaggerate the long-term emotional effects that events have on us. Emotional trauma can have bad effects, but for the most part we over-emphasise them.

People show a strong 'order effect' when selecting from an identical pool - they mostly pick what's on the right. And when the real reason for their choice is suggested, they don't believe it - which shows that we don't know ourselves well. We don't even know why we make certain choices.

Intuition and Rationality

Kahneman differentiates between System 1 and System 2 thinking, i.e. intuition and rational thought.

The Anchoring Effect is powerful - but be careful of noise in the data.

The Representativeness Heuristic - we judge the frequency or likelihood of an event by the extent to which it resembles the typical case.

But from a practical point of view, be careful of thinking too statistically - in the Rudy the farmer example, where there are far more farmers than lawyers, it would make statistical sense to pick farmer, but a bit more context would probably point towards one of the other options, like lawyer.
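
To make the base-rate arithmetic concrete, here's a quick sketch with invented numbers - suppose 95% of the pool are farmers, and the personality description is three times as likely to fit a lawyer as a farmer:

```python
# Invented numbers, purely to illustrate Bayes' rule with base rates.
p_farmer, p_lawyer = 0.95, 0.05          # base rates: far more farmers
p_desc_given_farmer = 0.10               # description fits few farmers...
p_desc_given_lawyer = 0.30               # ...but fits lawyers rather well

p_desc = p_farmer * p_desc_given_farmer + p_lawyer * p_desc_given_lawyer
p_farmer_given_desc = p_farmer * p_desc_given_farmer / p_desc
print(round(p_farmer_given_desc, 2))     # 0.86 - still probably a farmer
```

Even with a description that favours 'lawyer' three to one, the base rate keeps 'farmer' the better bet - the extra context has to be very diagnostic before it overturns the statistics.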

Learning

I really enjoyed this part of the course, as I could take away more from it than from any other. Keys to learning better are to -

Distribute practice over time - spacing helps (see the sketch after this list).
Set calendar reminders.
Use Retrieval practice - instead of merely re-reading material, cover and try to recall it.
Learn by doing - practice and discuss the content.
Vary the settings in which learning takes place.
Relate learning to your everyday experiences.
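
As a toy illustration of the spacing idea (my own sketch, not something from the course), here's a minimal Leitner-style scheduler where each successful recall pushes the next review further out:

```python
from datetime import date, timedelta

INTERVALS = [1, 2, 4, 8, 16, 32]  # days until the next review, per box

def review(box, recalled, today=None):
    """Move a card up a box on successful recall, back to box 0 on a
    miss, and return (new_box, date of next review)."""
    today = today or date.today()
    new_box = min(box + 1, len(INTERVALS) - 1) if recalled else 0
    return new_box, today + timedelta(days=INTERVALS[new_box])

box = 0
for _ in range(4):                 # four successful recalls in a row
    box, due = review(box, recalled=True)
    print(box, due)                # the gaps stretch: 2, 4, 8, then 16 days
```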

An important thing to remember is not to mistake fluency for learning. If you're finding a new topic too easy, you're probably not learning it well enough - you only think you understand it.

Experiments

Beware the Gambler's Fallacy.

Apple's shuffle feature - people don't understand how randomisation works, so Apple had to make its shuffle less random in order for people to perceive it as more random.
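
A toy sketch of the idea (not Apple's actual algorithm): reject shuffles that put the same artist back to back, which makes the result less random but more random-feeling:

```python
import random

def feels_random_shuffle(tracks, max_tries=1000):
    """Shuffle (artist, song) pairs, retrying until no two adjacent
    tracks share an artist. A truly uniform shuffle often produces
    same-artist runs, which listeners read as 'not random'."""
    order = tracks[:]
    for _ in range(max_tries):
        random.shuffle(order)
        if all(a[0] != b[0] for a, b in zip(order, order[1:])):
            return order
    return order  # give up if no repeat-free ordering was found

playlist = [("A", "song 1"), ("A", "song 2"), ("B", "song 3"), ("C", "song 4")]
print(feels_random_shuffle(playlist))
```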

Finding Things Out

Many phenomena are simply examples of Regression towards the Mean - extreme measurements tend to be followed by ones closer to the average. This is more apparent when there is more noise in the measurement.
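
A small simulation of the effect (numbers invented): each person has a fixed true ability, and every test adds noise. The top scorers on the first test land much closer to average on the second, even though nothing about them changed:

```python
import random
from statistics import mean

random.seed(0)
ability = [random.gauss(100, 10) for _ in range(10_000)]  # fixed true skill
test1 = [a + random.gauss(0, 15) for a in ability]        # noisy measurement
test2 = [a + random.gauss(0, 15) for a in ability]        # second measurement

top = sorted(range(10_000), key=test1.__getitem__)[-1000:]  # top 10% on test 1
print(round(mean(test1[i] for i in top)))  # an extreme average, roughly 130
print(round(mean(test2[i] for i in top)))  # much closer to 100 - pure noise washing out
```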

Also, Post hoc ergo propter hoc - we assume A causes B because B followed A. It sits alongside the other common reasoning errors that feed superstition - mistaking correlation for causation, reasoning from false premises, and circular reasoning.

Experiments show that, for most competencies, there is no difference between large and small class sizes.

Six questions that lead to opinion change -

What do you really believe anyway?
How well based is your belief?
How good is the evidence?
Does the evidence really contradict what you believe?
What would be enough to change your mind?
Is it worth finding out about?

Extraordinary Claims

There are multiple ways you can interpret things.

Question your intuitions and be willing to give them up.

People tend to accept information that is consistent with their pre-existing beliefs at face value, but critically scrutinise information that contradicts their beliefs.

Health Claims

Pseudo-scientists tend to make ambiguous statements that you can contort to fit your expectations.

The Placebo Effect can be a false positive response, but most of what looks like a placebo response is Regression to the Mean - people seek help when they are sickest, so they tend to improve afterwards regardless.

The Availability Heuristic - if a treatment turns out negative, you rarely hear about it, so the successes dominate what comes to mind.

'Like cures like' - the idea that a diluted part of what causes the disease can cure the disease - is a common false belief.

Natural is not necessarily better - arsenic is not good for you, indoor plumbing is.

Clustered disease is possibly the availability heuristic at work - you're confusing normal randomness and noise for an actual effect. You need to form and test a hypothesis to determine whether a true effect, like a cancer cluster, really exists in a population.
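
Here's a quick sketch of how pure randomness produces apparent clusters - 200 cases scattered uniformly over a 10x10 map average two per cell, yet some cell almost always looks alarmingly dense:

```python
import random

random.seed(1)
grid = [[0] * 10 for _ in range(10)]
for _ in range(200):                  # 200 cases, placed uniformly at random
    grid[random.randrange(10)][random.randrange(10)] += 1

print(max(max(row) for row in grid))  # typically 6-8, against an average of 2
```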

Always ask - what about the other three cells? Given that you can have true positives, true negatives, false positives and false negatives, always look at the costs and benefits of the two ways you can be wrong.
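
As a worked illustration (invented numbers): screen 10,000 people for a condition with a 1% base rate, using a test with 90% sensitivity and 95% specificity, and most positives are still false positives:

```python
population, base_rate = 10_000, 0.01
sick = int(population * base_rate)   # 100 actually have the condition
healthy = population - sick          # 9,900 do not

tp = int(sick * 0.90)                # 90 true positives
fn = sick - tp                       # 10 false negatives (missed cases)
fp = int(healthy * 0.05)             # 495 false positives
tn = healthy - fp                    # 9,405 true negatives

print(round(tp / (tp + fp), 2))      # 0.15 - most positives are false alarms
```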

Applied Claims

Examples include facilitated communication, forensic science, conspiracy theories, gun laws, gay marriage and asylum seekers.

The Expectancy Effect affects the interpretation of forensic evidence like DNA. Experts who expect or desire a certain outcome see the evidence in ways consistent with what they want to see - this is partly helpful, but can be disastrous.

People tend to focus exclusively on what they consider to be the evidence.

Belief in conspiracy theories is mostly cherry-picking information.

False consensus effect - everyone thinks that everyone agrees with them.

Exploiting the Situation

There is not much correlation between personality and cheating; it is more about the situation. Certain situations can encourage honesty.

Other situational forces include social conformity, the bystander effect and attribution error.

We assume that the way we see the world is the only way to see it, and that anyone who sees it differently is wrong - we attribute the difference to their education, personal biases, propaganda or lower intelligence.

Milgram experiment - the authority factor, the diffusion-of-responsibility factor, the channel factor (shocks increased in small incremental steps) and the lack of a clear exit.

Nudging changes the channel factors to induce behavioural change.

Putting it all together

Be aware of your intuitions.
Have a healthy skepticism.
Simulate your future desirable performance in the present.
Test hypotheses.
Pick a few areas where you want to change what you're doing with respect to thinking and personal biases, and focus on those.
Just because something is portrayed confidently doesn't mean it's true.
Read.

----------------------------------------------------------------------------------------------

I really enjoyed the course. I initially felt that the instructors spent way too much time discussing personal biases and our inability to be objective and accurate in our perceptions and beliefs, and that they kept repeating these points through the first half of the course, but I see now how useful and essential this was. Indeed, only good can come from these constant reminders.

Throughout the course, I was reminded of the biases people use to justify their superstitions and irrational beliefs, and why they won't change their minds even after being presented with evidence. For one reason or another, people will believe what they want to believe, and then pick and choose evidence to confirm that belief. They will see patterns where there are none, because that is what the belief leads them to expect. It helps if the belief is vague to begin with - that makes it easier to mistake noise for a true effect. They will assume that everyone should think the same way, and will not see that what they interpret one way can be interpreted in many different ways by other people. They will not accept that their beliefs are the result of reasoning flaws or cognitive biases, nor be willing to test and verify those beliefs experimentally.


