Sunday, 25 January 2015

Movie reviews - Coraline, ParaNorman & The Boxtrolls

I have been catching up on the films from Laika Studios recently. Even setting aside every other measure by which we judge a film, the body of work they have produced in terms of stop-motion animation quality alone is astounding.

Coraline (2009) is their best work yet. A visual masterpiece. And incredibly creepy. Worth spending money on for the visuals alone. The story was above par too. Most films have a typical plot line where a character is shown to desire something, then has obstacles put in the way of that desire, and spends the film overcoming these obstacles to achieve a resolution. This film takes a slightly different approach, where, like 'Alice in Wonderland', we see the main character change goals midway through the film, which is when the real source of conflict is revealed. Of course, the film makers did have good source material to work from (a Neil Gaiman story). This might also explain why the film was one of the creepiest I've ever seen. I was also surprised by how well crafted the film was in terms of pacing. The film wasn't long, and yet it feels evenly paced throughout.

ParaNorman (2012), a dramedy, was a whole lot of fun to watch. The story of acceptance is a little predictable (but still reasonably engrossing), and the animation wasn't as great as 'Coraline' (maybe that comes from placing the film in a typical suburban setting), but it was still good. The supporting characters are colourful without being annoying, and they get the comedy right. The lead character's personality and development are both above par. For that matter, so is the script. The lead character's thoughts and conversations with others are perhaps the best thing about this film. The best part of the movie is of course the final confrontation with the little girl. A work of art on all levels, it's worth watching the entire film for that scene alone.

The Boxtrolls (2014) is a visual masterpiece. The set design itself is better than 'Coraline'. Again, watch it for the visuals alone. I liked the story, but felt it was aimed at younger audiences. The story felt old, like it had been done before, so it came across as a little predictable and low key for me. Nothing about the plot really stood out. The Boxtrolls and ParaNorman are almost like inverted versions of each other, with The Boxtrolls being the better visual experience, and ParaNorman having a fuller, more emotional story to tell, with characters you really relate to. But this shouldn't take away from how awesome the film is. Each individual frame is so well crafted that it's tough not to admire the film.


Wednesday, 17 December 2014

On Beliefs, Assumptions and my World View

Beliefs are not equal.

I had a discussion with a gentleman recently who refused to accept that someone could have no beliefs. This was because I told him I had no beliefs. I personally don't like to use the word belief. I would rather refer to any position I hold as a model or approximation of the truth. These models are in turn built on assumptions about properties of this world that could change. It is easy to dismiss the difference between approximations and beliefs as mere semantics, but do keep in mind that semantics is the first thing you learn in a Philosophy 101 course, and it ensures that everyone begins a discussion on the same page, instead of ending up talking about different things while referring to the same term.

Anyway, I see all views, opinions and theories as merely models that are built on assumptions. Nothing is based completely on evidence of course, as even the most basic evidence requires assumptions about the properties it rests on. For example, the colour red is not really seen by everyone in exactly the same way. Our vision ensures that we all see the colour slightly differently, even if this difference is practically negligible. We still call it red though. This is an approximation. A generalisation. But there's more. We assume that the colour red, like other colours, exists as electromagnetic waves, or as photons. We don't know much about how light exists as energy, but we have created models that explain and exploit its properties to a degree that is useful to us.

Of course, none of this may be real. We may all be plugged into the Matrix. This could all be a dream. The colours may not exist. This universe may not exist. The properties of this physical world that we think we know about may only be a function of a dream world we inhabit and not part of whatever is really out there. But we don't know for sure if this world is fake, and so we act under the assumption that this universe and all the properties in it are real. Because this is the only practical way to live if our goal is comfort and happiness. We don't know if we exist in the way we experience. But it is best to assume that we do.

So in this sense everything is an assumption. But that doesn't mean that all assumptions are equal. There is a hierarchy. If there weren't then any view or model we created, no matter how crazy, could all be equally plausible. So what we do to maintain order in our world is assume that certain things are probably real like our universe and our existence. We then build the rest of our models on top of these basic assumptions.

Now we need to be really careful about how we construct these models, because a lot of them are based on questions that involve incomplete definitions and subjectivity. For example, do we have free will? Luckily, a lot of our models are objective, and built on physical laws whose properties we can approximate quite well. We use mathematical operations to build bridges. Mathematical identities themselves, like Pythagoras' theorem, are perfect and exist for themselves with no exception, at least under the assumptions of the mathematical laws of this universe. We don't know why these identities exist, but we know that they do and how to exploit them. This is not an excuse for a belief in the supernatural. That is simply uncalled for given the evidence. We simply do not know why identities exist. That is all. Any models explaining why will need additional evidence.

You could of course make up your own inductive proof for a supernatural entity that exists outside the laws of this universe and space and time and matter, but you would eventually have to face the fact that the properties of this proof are made up by you. I.e. the proof works by induction, just like mathematical proofs do, because you assume all the properties needed for it to work, like we do in maths. You don't know if these properties are real, you just assume that they are. A logically valid argument will still lead to a false conclusion if its premise is false. If your assumptions are unfounded, then no matter how good your argument, the conclusion you reach will still be only as good as your assumptions. This is why models for God's existence are both perfect and probably wrong.

Moving on, when you build a model, you identify a pattern and make predictions based on evidence. Sometimes, you use other people's models. You act on expectations derived from another model that you know very little about. For example, when you get sick and pop a pill. You don't know anything about what you're consuming. But you take it anyway expecting to get better. Is this a belief?

You could call it a belief, yes. Like the belief that the sun will rise tomorrow morning, given a normal solar system, or the belief that you will be able to walk or talk tomorrow, given no major changes to your body. You could call these beliefs, and they are all based on assumptions. But are they the same as religious beliefs? No, of course not. Because unlike religious beliefs, all these beliefs are verifiable. You cannot know for sure if a pill will cure you, but you know that you can look up the details of the pill if you wanted to. You can examine the skies or your body for patterns if you want to confirm the expectations you have for your model. In other words, these models are verifiable. Not 100% verifiable of course. Pills do not always work. Solar systems and human bodies do not always work the way we expect them to. Errors abound. Things go unaccounted for. The model is updated with new data. This is how critical reasoning works. Religious models are different. They rely, as I have said, on assumptions that are unverifiable. They might lead to useful but false conclusions. Religious belief may be useful, but it is also unverifiable.

So now we have not just beliefs, but levels of belief. There are verifiable and unverifiable beliefs. This allows for some degree of subjectivity, as what is verifiable depends on how good the evidence is, and all evidence comes down to further assumptions, which in turn come down to our assumptions about this universe and our existence. But we can say for sure that some beliefs are more verifiable than others, because some evidence is better than others, assuming the basic laws governing this universe. Evidence that holds up to falsifiability and has predictive value will always be better than anecdotal evidence that relies on false-premise reasoning and confirmation bias. This is not to say that unverifiable assumptions are wrong. That is impossible to tell, but that's exactly the problem.

We are mostly concerned with the truth or falsity of assumptions based on the evidence we have. But since we can only examine the evidence in light of what we know about the universe, and since this is itself a series of assumptions that do not take into account what we don't know, then of course anything we postulate about God or a supernatural being could be true. Not probable, but possible. I wouldn't say that assumptions based on rules outside this universe are something we shouldn't bother to think about. But we are definitely limited in the ways we can verify them, given that all our means of verification exist only in this universe.

So yes, perhaps I do have beliefs. I suppose I do live my life along expectations of how the world should work even though I don't always understand why it works this way. These could be called beliefs. And they are certainly different from supernatural beliefs. My beliefs are based on assumptions that are verifiable, at least to a certain extent. I think this is a more practical way to live for the moment, compared to holding beliefs that are unverifiable, because at least I can explain why I hold a belief. I can justify my beliefs with evidence. What about you?


Monday, 15 December 2014

Starting a Wildlife NGO

Here are some ideas for starting a Wildlife NGO. These are activities that your NGO could take up.

  1. Wildlife rescue –
    1. Rescue animals forced into entertainment
    2. Maintain a shelter to keep them.
    3. Maintain a network of professional animal caretakers & vets to help.
  2. Wildlife Research –
    1. Research aimed at conservation - ecology, distribution, predator-prey relationships, etc.
    2. Hire scientists to do research.
    3. Hire project managers to oversee researchers and develop conservation plans.
  3. Networking, communications & fundraising - Get a marketing team in place to do what they do best.
  4. Fight cases in court – Hire environment lawyers.
  5. Education/training – Hire education consultants to design outreach programs, and scientists and volunteers to run education campaigns in schools.
  6. Excursions - Get your coordinators to organise and manage regular hikes or trips to wildlife sanctuaries.


Sunday, 14 December 2014

Stats Blogs I Follow

These are the stats blogs that make me better at what I do.

For stats literacy -

This is the main one for serious statisticians. Andrew Gelman, a statistics professor, Bayesian statistician and programmer, critiques poor statistical practices. Very informative - 

For some advanced talk and a lot of useful links -

Deborah Mayo writes about philosophy of statistics -

Great learning resource for advanced statistical concepts -

You might learn a few things from Daniel Lakens' blog -

A nice revision of important concepts with comics -

More on probability theory -


Saturday, 13 December 2014

Education without Innovation

Most complaints about education systems revolve around them being mostly theory without any practical application. This is a problem because practical components like research methodology and computer lessons are a large part of what you need to go from being a theorist to a practitioner. It is no use studying concepts if you can't use them. But there is another problem I have with the system, and that is the lack of innovation.

Students are great at learning theoretical concepts. They are great at regurgitating what they are taught in the form of an essay. Sure, this is a form of learning. But it is not innovative. When you gain knowledge by being taught, you grow to the level of the person teaching you, but you don't necessarily exceed this level. This is why learning by itself is useless for humanity without innovation. To truly make a change you need to go beyond what you are taught. You cannot simply learn concepts in a vacuum. You have to combine them to come up with new concepts. This is how new things are created.

It is the same for practical lessons, which can suffer from the same problem. I can put students through computer classes, but it will not mean much if they just recreate what I can do, unless you want no new development. The best way forward is to teach your students the basic concepts with practical application, but to connect these lessons with existing questions, theories or ideas that they already have. This makes their learning context dependent, and motivates them to go beyond their lessons, to use what they have learned to create something new.


Friday, 14 November 2014

On Anecdotal Evidence

Too many people rely on anecdotal evidence (personal experience or cherry picked examples) to assess if something is true, and I don't like it. 

To me, everything is a model. All the ways in which we view the world, or our explanations for various phenomena like behaviour, are merely models. The techniques we use to estimate weather patterns are models. The techniques we use to estimate group dynamics are models. All estimates are models. There are a number of ways to consciously build models. You could use anecdotal evidence. You could also use critical reasoning. 

There's a famous phrase, attributed to the statistician George Box, that goes, "all models are wrong, but some are useful". I like this because it feeds into what scientists do. Science is not about finding the 'truth'. It can be about the pursuit of the truth, but the truth might never be known. Therefore, all you do is continue to build better approximations of the truth, or better models to explain and predict phenomena, for both academic and practical purposes. This is what science does. Science is essentially a mix of critical reasoning and research techniques combined with domain knowledge. The sciences - Biology, Psychology, Chemistry, Physics - are merely fields of knowledge, domains that revolve around certain interest areas. Of course there is overlap. But these are not sciences merely because they encompass domains of study. That's half of it. They are sciences because they use critical reasoning techniques to investigate and build models that approximate the truth.

Where does anecdotal evidence come in? Anecdotal evidence is a first step towards building a model, but not evidence for the model. Anecdotal evidence is the presence of something interesting that requires further study. You see a ghostly white figure at night. You have no idea what you are looking at. You investigate, you make a hypothesis and attempt to verify it. Things can get a bit shaky if you skip the investigation and rush to make a claim, because anecdotal evidence could be due to a number of causes, not just the one you have in mind. False positives abound. This is why it is important to treat anecdotal evidence as a first step only. It would be disastrous to claim something as fact based on personal observation, and then find out that your claim is wrong because you didn't properly investigate the matter.

Let's take some examples. The claim that God is real. There are various types of evidence for this claim. One is prayer, a type of anecdotal evidence. I pray for something, something happens, therefore God is real. Anecdotal evidence like prayer cannot be evidence for the existence of God till it is verified. For every anecdotal claim of prayer working, there could be another for it being useless. To verify if prayer works, you would have to experimentally demonstrate its effectiveness. This is called falsifiability. Note that this is neither proving nor disproving the existence of God. This is not the question at hand for the scientist. It might be the question at hand for the person claiming God's existence and using prayer as an example, but for the scientist the investigation only concerns the effectiveness of prayer. A scientist who demonstrates that prayer is useless is not proving or disproving the existence of God. He or she is merely verifying a specific claim. This is important to remember. Science is not always concerned with the big questions. It is merely a tool to verify claims or existing models. After all, prayer is a model of how the world works. A scientist can spend his or her entire life falsifying such claims. This would get us nowhere if the claims were spurious to begin with. This is why anecdotal evidence should not be used to claim something. Because there are more reliable ways to build models.

[This is why proving or disproving the existence of God is a futile activity. No one knows exactly why the concept of God came about. We have theories. But nothing that seems to be founded in verifiable evidence. There is a lot of anecdotal evidence, but upon verification, a lot of it does not hold up to scrutiny. This is not to say that any of the thousands of Gods do not exist, or that people are wrong in believing in them. Science cannot falsify something that was made up to begin with, or is currently too difficult to verify. It can only analyse the evidence and show over time how improbable something is, using existing methods. True falsifiability is impossible. Which is why we will never be able to disprove the existence of the Loch Ness monster either.]

Here's another example. Psychometric tests like the MBTI. HR professionals love them. But data from meta-analyses pick holes in the test's reliability and validity. Yet HR professionals who have used these tests swear by them. One person I spoke to even compared it to the accuracy of a horoscope while praising it (I doubt he was trying to be ironic). This kind of reliance on anecdotal evidence to back something is used as a model by a lot of people, just like people use prayer as a model. Why do they use it when there are scientific techniques that discredit these models? I have no idea. Maybe people are ignorant. Maybe they find it easier to act on someone else's recommendation or 'try it yourself first' advice rather than doing personal research. Maybe they think that discrediting one model will mean discrediting a larger model that they have more of an emotional investment in. Maybe they already choose to believe in something to make themselves feel better. Maybe creating a faulty but useful model works for them. Maybe the model's degree of usefulness wins over the fact that it is wrong.

Which is interesting because of what I said earlier - all models are wrong, some are useful. Let's say human sacrifice to appease the weather Gods is supported by anecdotal evidence i.e. a group of people practice human sacrifice and choose to notice only when the weather changes for the better, convincing themselves of a correlation between the two. They of course ignore instances when sacrifice does not affect the weather, attributing it to human fault or God being angry with them, or it all being a part of God's larger plan. Now let's say hypothetically that this model/belief is the only thing keeping this society stable.

Note that science isn't always concerned with whether the effect is real or not, or whether belief in it should continue. Yes, the assumptions are faulty. Correlations abound in large amounts of data. They're often just statistical noise. Experimentation should verify the probability of the correlation. But even if it finds that the correlation/belief/model of human sacrifice for better weather is wrong, it doesn't erase the fact that it is useful. Now replace human sacrifice with belief in God, or MBTI. These models might work in certain contexts. Belief in God helps people in certain contexts. Belief in aliens might just help society. I have no idea. MBTI might be useful in certain contexts. None of these models might be correct, but they can be useful. If MBTI works for you, then great, use it. But that doesn't mean it does what it claims to do, which is why you wouldn't be right in recommending it to me. Which is why people need to look at the evidence to verify if a model is good for them, and not rely solely on anecdotal evidence, or else risk disappointment.
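The point about correlations emerging from statistical noise is easy to demonstrate with a short simulation. The sketch below is purely illustrative (standard library only; the series lengths and the 0.6 cutoff are arbitrary choices of mine): it generates a few hundred random, completely unrelated series and counts how many pairs nevertheless look strongly correlated by chance alone.

```python
import random

random.seed(42)

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# 200 series of pure noise, 20 points each. None of them are related.
series = [[random.gauss(0, 1) for _ in range(20)] for _ in range(200)]

# Count pairs that look "strongly correlated" (|r| > 0.6) despite being noise.
spurious = sum(
    1
    for i in range(len(series))
    for j in range(i + 1, len(series))
    if abs(correlation(series[i], series[j])) > 0.6
)
print(f"Spurious strong correlations found: {spurious}")
```

With nearly 20,000 pairs to compare, dozens of "impressive" correlations turn up even though every series is noise. This is exactly why a single observed correlation, like sacrifice followed by good weather, proves nothing on its own.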

In summary,

1. Anecdotal evidence can be a good first step to further research.
2. If you notice something interesting, collect data, find patterns, make a hypothesis and verify it. Then make a claim.
3. Your claim is your model. It can only be built on the elements in point 2. 
4. Anecdotal evidence by itself cannot be used to build models. 
5. If someone builds a model that approximates what they think is the truth, question their assumptions and verify the evidence.
6. If their model is built on anecdotal evidence (personal or cherry picked examples), reject the model for being incomplete.
7. Their model is not necessarily completely wrong but it is pointless to consider something correct if it hasn't been verified, even if it is useful.
8. A model's usefulness does not necessarily reflect its correctness.


Monday, 4 August 2014

On Science Journalism

Few people do science journalism right, and fewer still comprehend the difference between good and bad science writing.

I came across this recently. I clicked the link, which took me to an article describing a paper that I downloaded and read. The paper itself was OK as far as social science papers go, but as usual, elements within the media that don't know any better jumped on it. 

The paper concludes "that individuals with an East German family background cheat significantly more on an abstract task than those with a West German family background." It also concludes that "The longer individuals were exposed to socialism, the more likely they were to cheat on our task." 

The points I would like to raise are below.

The first point concerns the statistical inference used in the study. The researchers note that both groups cheated, but those with East German backgrounds cheated more than those with West German ones to a degree that was statistically significant. I won't go into detail about p-value hacking, confidence intervals, effect sizes and power here, but suffice it to say that a statistically significant result does not necessarily reflect an actual real-life effect. This is just a function of probability. Neither group of people may have cheated. But the statistical techniques picked up on variation that the researchers deemed significant. We do not know if this significant difference represents cheating in real life, or if it would hold if the study were to be repeated.
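A quick way to see why significance alone is not enough is to simulate it. The sketch below is a hypothetical illustration, not a reconstruction of the study's analysis (standard library only; the group sizes, the Welch t statistic, and the ~5% two-sided cutoff of 1.98 are my own choices): it repeatedly compares two groups drawn from the exact same distribution, so by construction there is no real difference between them, and yet roughly one experiment in twenty comes out "statistically significant".

```python
import random

random.seed(0)

def mean(xs):
    return sum(xs) / len(xs)

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = mean(a), mean(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

n_experiments, n_per_group = 2000, 50
false_positives = 0
for _ in range(n_experiments):
    # Both "groups" are drawn from the SAME distribution: no real effect exists.
    group_a = [random.gauss(0, 1) for _ in range(n_per_group)]
    group_b = [random.gauss(0, 1) for _ in range(n_per_group)]
    if abs(welch_t(group_a, group_b)) > 1.98:  # ~5% two-sided cutoff, df ~ 98
        false_positives += 1

rate = false_positives / n_experiments
print(f"'Significant' differences with no real effect: {rate:.1%}")
```

Any single one of those flagged experiments, reported in isolation, would look like a genuine group difference. This is why replication and effect sizes matter more than a lone significant p-value.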

Even if the effect (cheating) were present, there is no way you can automatically extrapolate the results of a game to a judgement of people's moral attitudes in general. This is because morality is complex. The fact that some people might cheat when given the opportunity to do so in a game of dice does not necessarily reflect their attitudes in general, or their choices in other situations. The researchers use terms like 'value system', but do not define what this encompasses. What constructs and concepts make up a value system? Is it objective or context dependent?

You might download a film illegally, but this does not make you a thief in general. Your choice to download a film at that point in time is a function of the cost-benefit equation to you and the social context of your choice: how many other people are doing so, your perception of the 'rightness' or 'wrongness' of your act, etc. It does not necessarily reflect your attitudes or preferences in other contexts.

Even if you could extrapolate the effect observed in the game to life at large, you must remember that Correlation is not Causation. The fact that those with an East German background cheated more does not indicate that their background is what caused them to cheat. No one is denying that economic systems can change behaviours and attitudes of people, and it is worth studying, but you cannot jump to conclusions. You need to remove all the confounding variables, false positives and other possible causes. You do this by making as many comparisons as possible. Did the researchers do this? Not completely.

East Germany was not merely socialist but thrived on a culture of fear and repression, with secret police spying on citizens. There are social and cultural factors that could have led to people developing a habit of cheating and might have had nothing to do with the economic system. The researchers have identified two of these - economic scarcity and social comparison - but were not able to verify them using their methods.

Also, the paper fails to mention if the researchers took into account the fact that former East Germans have been living in a new economic system for 23 years (1990-2013) and how this might have changed their preferences/choices/attitudes to the extent that it makes the effect of their background meaningless to the study.

Even if all the points above are wrong and the researchers' assumptions thus far are correct, the inference that people exposed to socialism cheat more would still be incorrect, as socialist economic systems themselves are very different from country to country. East Germany was a comparatively impoverished socialist country compared to the Scandinavian countries for example, which also had elements of socialist governance. One could argue that the Scandinavian countries were never truly socialist, but that's missing the point. If the authors are talking about one specific kind of socialism they need to be clear about this. If they are referring to socialism in general, then they need to test population samples in other formerly socialist and presently socialist countries, and control for cultural and other differences, before they can make such an inference. They have not done this either.


All in all, this is not a bad paper, compared to others I have read. The researchers are quite honest about most of their limitations. However, no one else seems to care. The original article that linked to this paper merely reiterated the findings as if they were correct, without taking into account the researcher's alternative explanations. This is bad journalism. 

There are so many papers being published every month in various journals. Sometimes, the journals themselves are shady, and publish poor research for a fee. Researchers are under pressure to publish as this is what determines their reputation and pay in academia. So they tend to fudge data or manipulate it in dishonest ways to get positive results. Journals have a publishing bias towards positive results. And the journalists who write about the papers that are published in journals are usually under tight deadlines too. They cut corners. They trivialise, generalise and indulge in simplification. They have a poor understanding of scientific domains, empiricism, and critical reasoning. Most don't bother critiquing the papers they report on.

I have read so many bad science articles in the past 3 years that I have had to whittle down my RSS feeds to the extent that I only follow a few news feeds, scientists and professional science writers. On Twitter I am even stricter. I do not follow any pop science accounts, only professional researchers, people who will either share original research, or go the extra mile and critique people's research rather than blindly sharing links they come across. The best science communicators out there do not even bother writing about the latest developments in Psychology, given the faults in the field, the shakiness of results - the p value hacking, selective sampling, failure to replicate, false assumptions, etc. 

So it's always sad when an individual with a lot of followers shares a bad article. I am not writing this to be mean or to hurt anyone's feelings or discourage anyone's work. I'm just saying that there is a clear demarcation between good science writing and poor science writing. I do not expect every journalist out there to be able to critique a paper (though it would help) but I do expect even a beginner to know the difference between a balanced and a biased article.

When a journalist with impressive credentials chooses to share a link to a clearly biased article that, to push an agenda, deliberately ignores the limitations in a paper that any 2nd year undergraduate student at a middle ranked Psychology department in the UK would notice, you question that person's credentials.

This is not an isolated case. There are other people on Twitter with a massive reach who also tend to share terrible links. I am sure they are lovely human beings who want the best for humanity and are smarter and more accomplished than me in many ways, but they still share terribly written pop science articles that distort a subject just because they have a catchy title or byline.

So here is some advice when you come across a piece of science journalism - 

  • When something sounds too good to be true, it usually is not true. In the social sciences, discoveries are few and far between, so an article that claims to have discovered a major effect must be met with skepticism.
  • If an article claims something that you're sceptical about, read the paper and double check if those claims are true.
  • If you cannot critique the paper or do not have the time, do not share the article. Wait for someone else to critique it.
  • Follow professional science communicators who know how to critique scientific discoveries, and not mass produced pop science junk. The pros know how to write a balanced piece. Pop news channels just want to grab eyeballs and don't care about accuracy.


Friday, 13 June 2014

Proximate and Ultimate Explanations

Proximate and ultimate explanations are among the first terms that you learn about when studying ethology. These terms are used in other contexts, in addition to the study of behaviour, where they mean slightly different things, so it's important to understand them and not get them confused.

Why does an animal behave in a certain way? An animal's behaviour can be explained in proximate and ultimate terms. Proximate explanations deal with the 'how' of a behaviour i.e. the underlying or mechanistic reasons behind a behaviour. Ultimate explanations deal with the 'why' of behaviour i.e. the usefulness of this behaviour to the creature and how it came to acquire it.

Here's an example - birds singing in spring.

Proximate questions - How do birds manage to sing in spring?
Proximate answers - Daylight induces changes in hormones which make them sing. They learned to sing when young.

Ultimate questions - Why do birds sing in spring?
Ultimate reasons - For mating/reproductive value. The vocal cords of distant relatives and extinct birds indicate that this trait evolved concurrently with overall fitness.

It is important to realise that proximate and ultimate reasons are both explanations for the same behaviour/phenomenon but from different perspectives. You could say that proximate explanations provide the reasons underlying behaviour (what is it due to?) while ultimate explanations look at the bigger picture (what is it for?)

Proximate explanations usually provide mechanistic reasons for behaviour or describe the 'triggers' behind behaviour. You can think of them as being the result of things occurring in the animal's body (e.g. hormones, nervous system, genes, age) or immediate environment. Proximate explanations can be further divided into mechanistic (causation) and ontogenetic (developmental).

  • Mechanistic explanations (how does it work? how was it caused? what caused it?) usually deal with processes within the body that follow simple rules, like neurons and the nervous system, hormones, pheromones and other bio-chemical processes.
  • Ontogenetic explanations (how did it develop?) cover behaviour from the nature-nurture or gene-environment angle. These explanations build on mechanistic processes, and relate them to what's going on in the individual's environment, like learning or other aspects leading to behavioural development.

Ultimate explanations describe the function of a behaviour in terms of its evolutionary history and the benefits it confers. These can be further divided into phylogenetic (evolutionary history) and adaptive (functional) explanations.

  • Phylogenetic explanations (how did it evolve?) deal with why this behaviour might have evolved over successive generations instead of being lost. We look at the evolutionary history of the creature to see how natural selection worked on this trait. 
  • Functional explanations (what is it for? what purpose does it serve?) deal with the benefit the behaviour confers to the individual in terms of its current environment. It is important to remember that an individual can have a current trait that is adaptive without it being an adaptation. 

Here's another example - Honeybees swarming (splitting up and building new colonies elsewhere).

Proximate questions - How do honeybees manage to swarm? What factors lead to swarming?
Proximate answers - Because of the way their central nervous system responds to other bees doing the waggle dance. Or because this behaviour is triggered by colony size, brood comb congestion, worker age, or the queen having reached her maximum egg laying rate.

Ultimate questions - Why do honeybees swarm?
Ultimate reasons - For reproduction, survival, more food resources.

And one more - birds building nests.

Proximate questions - How does a bird know how to build a nest?
Proximate answers - It could be a genetically programmed or learned behaviour.

Ultimate questions - Why does a bird build a nest?
Ultimate answers - Because a nest assists in mating and so improves reproductive success, which means genes are passed on to the next generation.


Sunday, 27 April 2014

On MOOCs and Online learning

I think MOOCs are important and useful. I just think that a lot of them aren't following instructional design principles and enabling learners in the way that they should be. It's a great medium to change the world, but it's being run by computer scientists and businessmen with minimal input from learning designers, and this needs to change.

A word of advice: don't take more than one MOOC at a time. I first discovered and registered for MOOCs in 2012. I registered for four, but then realised I couldn't follow all the courses. Even after reducing the number to one, I couldn't cope with both the MOOC and my studies. I tried taking more MOOCs when my schedule cleared up in 2013, but again I registered for too many. I finally completed two courses simultaneously from March to May, but the workload was so high that I decided to stick to one course at a time in future. And the only reason I was able to manage two was because one was really easy.

I learned two lessons here. One, a MOOC is a full-time course of study. It's equivalent to one college-level module, and a heavy one at that. It requires daily participation on your part, and is certainly not a 'one day a week' thing. You don't just watch a few videos and take a quiz; you need to do a lot more for the course to be effective. There's a lot of reading to do if it's a knowledge-based course, and a lot of practice if it's a skill-based course.

There's also a lot of knowledge sharing in online groups. And you need to budget your time accordingly. Granted, you don't always know exactly how much time you'll need at the start. Which is why it's a good idea to audit courses when you're not sure. And just drop out if it's too much for you. I tend to do this a lot, especially when the subject area is completely new to me. I've dropped out of around 6 courses for every one I've completed.

The second lesson is about the design aspect of MOOCs. Understanding a concept takes time. Learning comes from reflection, practice, application and knowledge exchange. You do not learn something by watching four 15-minute videos on the topic each week and then taking a quiz about it. The true test of learning comes not from summarising what you've learnt but from applying it to a new context. That's the real challenge. Do MOOCs meet this?

I say no. Most MOOCs consist of mainly videos and reading materials. Videos are at best an overview, an introduction. They cannot be the entirety of the course material. You should ideally watch a video, and then do a lot of follow up reading (the best MOOCs have their own textbooks), note taking, introspection, sharing ideas with others, summarising your conclusions in the form of essays, and a lot of follow up exercises involving applying your ideas to novel situations. This is how learning takes place.

And this is the problem with MOOCs. They're mostly just videos and quizzes, and they should be more. A course with just videos and quizzes and maybe a few assignments can never completely teach a complex subject to the extent that you begin using its ideas as a practitioner. Secondly, this type of course encourages sole study without group interaction, which is not preferable. Third, it fools you into thinking you're now an expert on a subject because you got a good score on a multiple choice quiz on the subject every week for eight weeks.

Multiple choice quizzes are generally not the best learning facilitation tool, given the amount of guesswork taking place. I understand that you can't have teacher graded essays or exercises in a class of 15,000 students, but peer review should definitely be an option. 

Real learning takes place through reflection and practice, which requires time. One of the better courses I've taken was on Psychology and had its own free online textbook that was required reading for the course, and was comprehensive in the materials it covered. However, I would have liked more essays and exercises to cover the practical aspect.

Another good one (on mathematics) had loads of exercises that needed to be discussed in the group forums. Group learning is a good thing, and one of the main advantages of online learning. You have so many more classmates to share ideas with and learn from, and you do this on your own time. When a course only revolves around material presented through videos, without any other reference material or exercises, the course forums turn into a wasted opportunity, as you're only discussing topics covered in the videos, which are not extensive to begin with.

MOOCs will never replace college education, or be taken seriously as a means of education, unless they have their students work through more reference material and application-based exercises (which encourage reflection and knowledge sharing) and adopt better forms of evaluation.


Wednesday, 23 April 2014

7 ways to study in the UK for cheap (what you won’t read elsewhere)

So you’ve been admitted into a university in the UK. Your fees are probably 3 times higher than those of all your local classmates. Let’s crunch some numbers. The average Masters programme in the UK costs at least 12,000 GBP for international students, and fees rise every year. Living expenses are approximately 6,000 GBP on average. So how do you ensure you spend a year in the UK without burning an 18,000-pound hole in your (or your parents’) pockets? Here are some useful tips.

1.    Find a part time job

If you want to make enough money to offset as much of your living expenses as is humanly possible, you need to find a part time job ASAP (forget about your fees; no job will ever pay you enough to cover those, apart from one where you sell weed). A part time job ensures that you earn a steady income on a weekly basis to cover your rent, food, travel and other expenses.

As a foreign student, you will be allowed to work 20 hours a week. If you work at this maximum capacity from day one, at a minimum wage of around 6 pounds per hour, you would gross about 120 pounds a week, or roughly 6,200 pounds over twelve months, enough to cover your living expenses. In reality, it might take you a couple of weeks to find a part time job, you might not get enough hours to fill your 20-hour capacity when you do find one, and you will be taking breaks from work during exam season or when you are busy with other aspects of your course, so you will probably make less than that.
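To make the arithmetic above concrete, here is a small Python sketch of the earnings estimate. The wage, hours and weeks are illustrative assumptions (the "realistic" scenario in particular is my own guess), not official figures:

```python
# Rough earnings estimate for a part-time student job.
# All figures are assumptions for illustration, not official rates.

def yearly_earnings(hours_per_week, hourly_wage, weeks_worked):
    """Gross earnings over the period, before any tax or deductions."""
    return hours_per_week * hourly_wage * weeks_worked

# Best case: 20 hours/week at 6 GBP/hour for all 52 weeks.
full_year = yearly_earnings(20, 6.0, 52)

# A more realistic schedule: ~15 hours/week for ~40 weeks,
# allowing for job hunting, exams and coursework.
realistic = yearly_earnings(15, 6.0, 40)

print(full_year)   # 6240.0
print(realistic)   # 3600.0
```

Even the best-case figure lands near, not above, the 6,000-pound living-expense estimate, which is why the job covers living costs at most, never fees.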

Where do you find a part time job? Look at your university website for vacancies. Do they have a student union? Or a career centre? Contact these groups to see if they know of any vacancies. Follow them on Facebook and Twitter to keep up to date on vacancies. Your university will probably have shops, restaurants & cafes on campus. Contact them to see if they need any staff. Do this weeks before you arrive, or there might not be any vacancies left by then. If you do see a job vacancy online, apply immediately. There might be hundreds of applicants, and vacancies are filled on a first come, first served basis.

If you don't find anything, look around for jobs as soon as you arrive. Take a walk around the town or city you are staying in during your first week. Drop your CV in at all the coffee shops and fast food joints so they know you’re looking for a job. A good thing about the UK is that there are loads of Indian restaurants everywhere. And Indian restaurants in the UK tend to hire Indian students. Make a round of all such restaurants in your area to see if they need any help. They're always on the lookout for waiters and waitresses but don't bother advertising and usually recruit through word of mouth.

Extra work is usually available during the Christmas and Easter breaks. These vacancies are usually temporary in nature, lasting for 2-4 weeks. Additionally, your university itself should have internal vacancies that open up during the course of the academic year. If you're good at something technical, look for part time teaching jobs where you can teach undergrads for a semester. The pay is really good.

Please do note that finding work during your course should not take precedence over your academics. You have spent a lot of money to come to a foreign country to study, and you shouldn’t risk sacrificing this for immediate economic gain, even if this is what your employer wants.

2.    Don’t stay on campus. Find private accommodation.

Campus accommodation in the UK is comparatively expensive, and can increase your rent by 30%. Private accommodation, by contrast, is usually around 800-1,000 pounds cheaper.

Also, staying on campus means you will probably be required to commute to your town or city centre to stock up on groceries every week. This is inconvenient for two reasons. One, you might not always have room in your fridge or freezer for a week’s worth of food, so you might have to make more than one trip. Two, the money spent on the commute is going to add up. Think two pounds every week for a return bus ticket, for the minimum 10 month (45 week) duration of your course. That's 90 pounds just for the shopping commute. With private accommodation you could try to get a place closer to a supermarket, and walk instead. You'd save 90 pounds. And you wouldn't have to worry about making multiple trips or kitchen space.

Find a cheap place to stay, preferably before you get to the UK. Post queries on your university's Facebook pages and other online forums that students frequent, asking if anyone needs a roommate. Contact former students, particularly Indian ones, to ask if they know of a cheap place to stay, or can recommend a good landlord.

3.    Shop smart

Shops on campus can be expensive. Do your shopping at one of the larger supermarkets, like Co-op, Tesco, Aldi or Lidl. Also, constantly be on the lookout for good deals. Larger supermarket chains tend to mark items down by 25% a day before they expire. Avoid tiny neighbourhood convenience grocery stores. They usually mark items up by 10%.

4.    Track your expenses

Set a weekly spending limit and don't cross this figure, no matter what. If you do, make up the difference by spending less the following week. Make a note of your expenditure so you know if you're nearing the limit. Record what you spend on most and try to reduce this.
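As a sketch of what tip 4 could look like in practice, here is a minimal Python expense tracker. The weekly limit, categories and amounts are all invented for illustration:

```python
# Minimal weekly expense tracker sketch.
# The limit and all entries below are made-up example figures.
from collections import defaultdict

WEEKLY_LIMIT = 120.0  # GBP; set your own limit


def summarise(expenses):
    """expenses: list of (category, amount) pairs for one week.

    Returns total spent, the overshoot past the weekly limit
    (negative means under budget), and the biggest category,
    i.e. the one to try to cut back on next week.
    """
    totals = defaultdict(float)
    for category, amount in expenses:
        totals[category] += amount
    spent = sum(totals.values())
    top = max(totals, key=totals.get)
    return spent, spent - WEEKLY_LIMIT, top


week = [("rent", 70.0), ("food", 35.0), ("travel", 8.0), ("misc", 12.0)]
spent, over, top = summarise(week)
print(spent)  # 125.0
print(over)   # 5.0 -> make this up by spending less next week
print(top)    # rent
```

A notebook or a spreadsheet does the same job; the point is simply to record every expense and check the running total against the limit each week.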

5.    Take part in experiments

Universities in the UK have Health and Psychology departments whose students conduct experiments for which they require human volunteers. These experiments can last from 15 minutes to weeks, and usually pay around 5 pounds an hour. Drop by the offices of these departments around dissertation time, or keep an eye out for notices requesting volunteers. Some of the experiments can be fun, and you usually get to know a little bit more about yourself.

6.    Proofread

A lot of students on campus come from countries where English is not a first language, and aren't very comfortable writing long essays in English. If your own English writing skills are good, you can offer your services as a proofreader. Put up notices around campus advertising your services as a proofreader, or get the word around through your friends. Professional proofreading services charge hundreds of pounds to proofread essays, so you should be able to get work by charging less. Even a fee of 50 pounds would be a bargain for students looking to improve their dissertations.

7.    Don’t smoke

Cigarettes are expensive in the UK. A pack can cost around 7 pounds. That’s enough for a meal at a restaurant. Do yourself a favour and try to kick your smoking habit before travelling abroad. Or fill your suitcase with about 200 packs of ‘Goldphlake’. How you’d get that through customs is another problem, though.
