Wednesday, 4 February 2015

On International Politics and World Peace


Back in the '90s, when I was in school and the US imposed sanctions on India for its nuclear tests, we cried hypocrisy. How could the US punish India for building nuclear weapons when the US itself stockpiles them? I've come a long way since then, but only after a lot of self-education, education which I unfortunately didn't receive in school. 

Self interest and game theory

The thing about international politics and diplomacy is that no political action is the result of principles, other than those associated with self-interest. This is a historical fact. Countries do not build alliances or rivalries based on principles; they do so based on what maximises their own self-interest. 

This can be modelled using game theory. Draw a grid, like a checkerboard. Put one player on the x axis and the other on the y axis. On each player's axis, list the possible interactions with the other player: cooperating, trading, going to war, etc. Now in each box of the grid, enter the payoff of one player's action given the other player's action. These payoffs could be positive, negative or zero. For example, player 1 going to war while player 2 is at peace might give player 1 a high payoff and player 2 a negative one, whereas both players deciding to trade with each other could give each of them an even higher payoff than either got by attacking the other. Each country then picks the box with the highest value for itself. The cost-benefit equation is of course more complicated than this, as any political scientist or economist will tell you. Like a giant live chess board, each country has to look at the best way to maximise its own interests in real time. The main difference is that chess is a zero-sum game, whereas in politics more than one country can win.
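
Here's a rough sketch of that grid in Python. The actions and payoff numbers are entirely made up, just to make the idea concrete:

```python
ACTIONS = ["trade", "war"]

# payoffs[a_action][b_action] = (payoff to A, payoff to B); numbers invented
payoffs = {
    "trade": {"trade": (5, 5), "war": (-3, 2)},
    "war":   {"trade": (2, -3), "war": (-2, -2)},
}

def best_response_A(b_action):
    """A's best action, assuming B plays b_action."""
    return max(ACTIONS, key=lambda a: payoffs[a][b_action][0])

def best_response_B(a_action):
    """B's best action, assuming A plays a_action."""
    return max(ACTIONS, key=lambda b: payoffs[a_action][b][1])

# With these numbers, mutual trade is self-reinforcing: if the other
# side trades, your best response is also to trade (5 beats 2).
print(best_response_A("trade"), best_response_B("trade"))  # trade trade
```

With these particular numbers, mutual trade is the stable box: if the other side trades, your best response is to trade too, so neither side gains by switching to war.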

This is the real driver behind political policy and action. When you look at history afresh after having learned this, you find no reason to use infantile terms and phrases like "we're friends with this country", "these countries have always been friendly", "these countries are enemies", etc. Another thing you feel no need to do is to cry hypocrisy at the actions of other countries - "this country is being hypocritical". 

When the US imposes sanctions on other countries for conducting nuclear tests while it owns an arsenal of nuclear weapons itself, this might be technical hypocrisy, but calling it that misses the point. The US does it because it's in its interest to do so. When you have nuclear weapons, it makes sense to prevent countries that are not aligned with you from obtaining the same weapons. It's just good strategy. Countries are only allies because it's in their mutual benefit to be allies, because it pays to be allies more than it does to be anything else. Here's another example: the US commits to religious tolerance but backs Pakistan and Saudi Arabia with military aid. This isn't hypocrisy in a political sense. You say what you need to for votes and to define your value system (all countries run on value systems), but your actions have to be consistent with good strategy, with game theory. In the end, self-interest wins out. 

How countries evolved

If every country followed principle-based politics, the world would be a better place, as long as every country agreed to follow the same principles. But they don't. The countries that exist today didn't exist 10,000 years ago. We started off as nomads and hunter-gatherers. Over time, different groups of people came together because division of labour made sense. These different groups had different principles, but they all focused on maximising their payoffs, whatever those were. Groups with different resources decided to trade with each other because each needed resources the other had. Some developed a trusting relationship based on reciprocal cooperation; they didn't need to safeguard themselves against each other militarily. 

But what if there's a change of leadership and policy? Now one civilisation grows stronger and begins to conquer more land and people. The other civilisation has a choice: does it ignore the first one, join forces with it, let itself be conquered and assimilated, or turn conqueror itself to avoid being taken over? Military might isn't the only tool of defence, either. There's religion, which makes cultural assimilation easier, and economic conquest, where countries subjugate each other economically. Trusting your neighbour means giving them a chance to exploit you, and you won't do that if you think there's a chance they will. Replay this scene for different groups across thousands of years, through endless conflict and death, and we have the world today: fragmented groups with different value systems and ideals, terrified of being exploited or losing out. Hence, there's not much trust.

Self interest and trust

Which is not to say that trust doesn't exist. It does, but it comes about when it's in the players' self-interest. The US and Canada can have a porous border because there's very little risk of war breaking out between them: they have been at peace for so long and have reached numerous lucrative trade agreements in the process. If this peace were broken by, say, America invading Canada and taking even some of its land, it would hurt trade, and Canada might align itself with an enemy of the US. It pays to keep your neighbours as allies, to act as buffers against your other enemies. What we call international trust is ultimately about money and security. Self-interest wins.

Here's another example: the Nordic countries being at peace. Denmark, Finland, Sweden and Norway all have porous borders, as do other European countries. The likelihood of a country misusing this trust and upsetting the status quo is low because the consequences would be dire. Border security would be strengthened, relations would sour, money would be lost, everyone would suffer. The payoffs from such an action wouldn't justify the costs. This is why the Nordic countries are at peace. They weren't always: a long time ago they were at each other's throats. But back then, it paid to conquer and kill each other more than it did to trade and cooperate freely. This also explains why Russia recently annexed part of Ukraine. The payoff (in terms of access to resources and trade routes) exceeded the cost (meagre threats from NATO?). Self-interest wins.

The situation today

Look at the two major power blocs today - the US and China. It used to be the US and the USSR. Before World War 2, the US was just another country gaining affluence through trade and innovation. After World War 2, it emerged as a dominant superpower, and its alliances with a number of European partners were sealed. But competition emerged with the USSR, whose economic and social policies rivalled those of the US. Here we have a case where countries had internal value systems linked to their economic systems, so they became economic competitors in order to protect their social systems. The US was terrified of communism, and the USSR was intent on spreading it. So they both embarked on policies of expansion. The USSR annexed and funded countries that embraced communism, crushing any opposition. The US did the same, backing fascist, murderous dictators worldwide as long as they rejected communism. The devastation this wrought in terms of human life was immense. But it was in both countries' interests not to stop, because then the enemy would have the upper hand. 

You see this mirrored today with the US and China, with China funding infrastructure projects in a number of Asian and African countries in exchange for political support, while the US can only count on allies like South Korea and Japan for leverage, in addition to NATO. China knows North Korea is a powder keg but continues to maintain friendly relations with it, because North Korea is leverage against the US should China ever need it. It pays to keep them close as an ally. The US, in turn, maintains a military presence around Taiwan and across the Pacific. Self-interest always wins. 

You also see this mirrored today with nuclear weapons. No world leader truly believes that these weapons are good, but they can't help keeping an arsenal as long as their enemies have one too. It's only the smaller countries with no ambitions of power that don't need nuclear weapons, and they are either aligned to a power bloc (like Bhutan) or not threatened by one (like Oman).

Books like Isaac Asimov's Foundation series really open your eyes to these sorts of situations and decisions. You begin to see beyond the values you were raised on, the values countries should be run on, and see the world for what it really is: a blank slate, ready for exploitation by the power-hungry.

Which is not to say that values don't matter. Of course they do. But they keep changing, and we need to be mindful of this and ensure they change in a way that's best for all of humanity (if we are to take a humanistic approach towards existence). The Mesopotamian civilisation used slavery because it made sense to do so, values be damned. That doesn't make it right, but it made economic sense at the time, and kept making sense for thousands of years, until we decided that slavery was wrong. This didn't happen overnight. It took time. It's the same for universal suffrage, or homosexuality. Values change. But change takes time.

Attaining world peace

So how do you reduce international conflict and attain world peace? Again, you can use game theory to figure this out. Prevent countries from warring with each other by making it too costly (relatively speaking) to do so. People will always strive for power and self-interest. You can't take this urge away. You can only develop a political ecosystem in which acting on the urge is too costly given more attractive alternatives - an ecosystem in which countries are incentivised to trade and cooperate peacefully with each other.
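
Here's a toy version of that idea in Python, again with invented numbers: a country weighs a one-off payoff from attacking against the discounted stream of payoffs from continued trade, and peace holds when the stream is worth more.

```python
def value_of_peace(trade_payoff, discount, years):
    """Total discounted value of trading peacefully year after year."""
    return sum(trade_payoff * discount**t for t in range(years))

def value_of_war(one_off_gain):
    """One-off gain from attacking, after which trade stops."""
    return one_off_gain

peace = value_of_peace(trade_payoff=5, discount=0.9, years=30)
war = value_of_war(one_off_gain=40)

# The more valuable and durable trade is, the harder it is for a
# one-off conquest payoff to compete with it.
print(f"peace: {peace:.1f}, war: {war}")  # peace ~47.9 beats war = 40
```

Raise the value of trade, or lower the one-off gains from conquest, and peace becomes the self-interested choice.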

One way to do this is to build trust between all the different countries that currently exist, by removing any perceived threat between countries and increasing trade opportunities. And you do this by economically developing every country equally. Remove economic gaps, invest in education and healthcare, make all countries economically powerful so they can serve as trading partners for each other. This creates a status quo that deters attack. Over time, this becomes trust. 

Of course, this should work better if the countries have similar value systems, as differing value systems pose a threat. For example, one of the world's two largest economies - China - is communist, while the US-NATO power bloc isn't. Both blocs trade profitably with each other, but mistrust exists. A common value system would probably remove this. Remember that power blocs are only formed as reactions against perceived threats from other blocs. The other way to attain world peace is of course to ensure that there is only one power bloc in existence - yours.

p.s.

This Crash Course World History series is extremely informative for observing patterns in group behaviour across human history. By watching a concise approximation of human cooperation and conflict across time, you begin to see those patterns for yourself. Watch it if you have 20 hours to spare. 

World history Part 1 - http://www.youtube.com/playlist?list=PLBDA2E52FB1EF80C9

World history Part 2 - http://www.youtube.com/playlist?list=PL8dPuuaLjXtNjasccl-WajpONGX3zoY4M



Sunday, 25 January 2015

Movie reviews - Coraline, ParaNorman & The Boxtrolls


I have been catching up on the films from the animation studio Laika recently. Even setting aside all the other factors by which we judge a film, the body of work they have produced, in terms of stop-motion animation quality alone, is astounding. 

Coraline (2009) is their best work yet. A visual masterpiece, and incredibly creepy. Worth spending money on for the visuals alone. The story was above par too. Most films have a typical plot line where a character is shown to desire something, has obstacles put in the way of that desire, and spends the film overcoming these obstacles to reach a resolution. This film takes a slightly different route: like 'Alice in Wonderland', the main character changes goals midway through the film, which is when the real source of conflict is revealed. Of course, the filmmakers did have good source material to work from (a Neil Gaiman story), which might also explain why the film was one of the creepiest I've ever seen. I was also surprised by how well crafted the film was in terms of pacing. The film wasn't long, and yet it felt evenly paced throughout.

ParaNorman (2012), a dramedy, was a whole lot of fun to watch. The story of acceptance is a little predictable (but still reasonably engrossing), and the animation wasn't as great as 'Coraline' (maybe that comes from placing your film in a typical suburban setting), but it was still good. The supporting characters are colourful without being annoying, and they get the comedy right. The lead character's personality and development are both above par. For that matter, so is the script. The lead character's thoughts and conversations with others are perhaps the best thing about this film. The high point of the movie is of course the final confrontation with the little girl. A work of art on all levels, it's worth watching the entire film for that scene alone.

The Boxtrolls (2014) is a visual masterpiece. The set design alone is better than 'Coraline'. Again, watch it for the visuals. I liked the story, but felt it was aimed at younger audiences. It felt old, like it had been done before, so it came across as a little predictable and low-key to me. Nothing about the plot really stood out. The Boxtrolls and ParaNorman are almost inverted versions of each other, with The Boxtrolls being the better visual experience and ParaNorman having a fuller, more emotional story to tell, with characters you really relate to. But this shouldn't take away from how good the film is. Each individual frame is so well crafted that it's tough not to admire it.



Wednesday, 17 December 2014

On Beliefs, Assumptions and my World View


Beliefs are not equal.

I had a discussion with a gentleman recently who refused to accept that someone could have no beliefs. This was because I told him I had no beliefs. I personally don't like to use the word 'belief'. I would rather refer to any position I hold as a model or approximation of the truth. These models are in turn built on assumptions about properties of this world that could change. It is easy to dismiss the difference between approximations and beliefs as mere semantics, but keep in mind that semantics is among the first things you learn in a Philosophy 101 course; it ensures that everyone begins a discussion on the same page, instead of ending up talking about different things while using the same term. 

Anyway, I see all views, opinions and theories as merely models built on assumptions. Nothing is based completely on evidence, of course, as even the most basic evidence requires assumptions about the properties it rests on. For example, the colour red is not seen by everyone in exactly the same way. Our vision ensures that we all see the colour slightly differently, even if the difference is practically negligible. We still call it red, though. This is an approximation. A generalisation. But there's more. We assume that the colour red, like other colours, exists as electromagnetic waves, or as photons. We don't know much about how light exists as energy, but we have created models that explain and exploit its properties to a degree that is useful to us. 

Of course, none of this may be real. We may all be plugged into the Matrix. This could all be a dream. The colours may not exist. This universe may not exist. The properties of this physical world that we think we know about may only be a function of a dream world we inhabit and not part of whatever is really out there. But we don't know for sure if this world is fake, and so we act under the assumption that this universe and all the properties in it are real. Because this is the only practical way to live if our goal is comfort and happiness. We don't know if we exist in the way we experience. But it is best to assume that we do.

So in this sense everything is an assumption. But that doesn't mean all assumptions are equal. There is a hierarchy. If there weren't, then any view or model we created, no matter how crazy, would be equally plausible. So what we do to maintain order in our world is assume that certain things are probably real, like our universe and our existence. We then build the rest of our models on top of these basic assumptions.

Now we need to be really careful about how we construct these models, because a lot of them are based on questions that involve incomplete definitions and subjectivity. For example, do we have free will? Luckily, a lot of our models are objective and built on physical laws whose properties we can approximate quite well. We use mathematical operations to build bridges. Mathematical identities like Pythagoras' theorem are exact and hold without exception, at least under the assumptions of the mathematical laws of this universe. We don't know why these identities exist, but we know that they do and how to exploit them. This is not an excuse for belief in the supernatural. That is simply uncalled for given the evidence. We simply do not know why the identities exist. That is all. Any model explaining why will need additional evidence.

You could of course make up your own proof for a supernatural entity that exists outside the laws of this universe, outside space and time and matter, but you would eventually have to face the fact that the properties of this proof are made up by you. The proof works the way mathematical proofs work: you assume all the premises needed for it to go through, just as we assume axioms in maths. You don't know if these properties are real; you just assume that they are. A logically valid argument will still lead to a false conclusion if its premises are false. If your assumptions are unfounded, then no matter how good your argument, the conclusion you reach will only be as good as your assumptions. This is why models of God's existence can be internally perfect and still probably wrong.

Moving on: when you build a model, you identify a pattern and make predictions based on evidence. Sometimes you use other people's models. You act on expectations derived from a model you know very little about. For example, when you get sick and pop a pill, you don't know anything about what you're consuming, but you take it anyway, expecting to get better. Is this a belief?

You could call it a belief, yes. Like the belief that the sun will rise tomorrow morning, given a normal solar system, or the belief that you will be able to walk and talk tomorrow, given no major changes to your body. You could call these beliefs, and they are all based on assumptions. But are they the same as religious beliefs? No, of course not. Because unlike religious beliefs, these beliefs are verifiable. You cannot know for sure that a pill will cure you, but you know that you can look up the details of the pill if you want to. You can examine the skies, or your body, for patterns if you want to confirm the expectations your model gives you. In other words, these models are verifiable. Not 100% verifiable, of course. Pills do not always work. Solar systems and human bodies do not always behave the way we expect them to. Errors abound. Things go unaccounted for. The model is updated with new data. This is how critical reasoning works. Religious models are different. They rely, as I have said, on assumptions that are unverifiable. They might lead to useful but false conclusions. Religious belief may be useful, but it is also unverifiable. 

So now we have not just beliefs, but levels of belief. There are verifiable and unverifiable beliefs. This allows for some degree of subjectivity, as what counts as verifiable depends on how good the evidence is, and all evidence comes down to further assumptions, which in turn come down to our assumptions about this universe and our existence. But we can say for sure that some beliefs are more verifiable than others, because some evidence is better than other evidence, assuming the basic laws governing this universe. Evidence that holds up to falsification and has predictive value will always be better than anecdotal evidence that relies on false-premise reasoning and confirmation bias. This is not to say that unverifiable assumptions are wrong. That is impossible to tell, and that's exactly the problem. 

We are mostly concerned with the truth or falsity of assumptions based on the evidence we have. But since we can only examine the evidence in light of what we know about the universe, and since this is itself a series of assumptions that do not take into account what we don't know, anything we postulate about God or a supernatural being could of course be true. Not probable, but possible. I wouldn't say that assumptions based on rules outside this universe are something we shouldn't bother to think about. But we are definitely limited in the ways we can verify them, given that all our means of verification exist only within this universe.

So yes, perhaps I do have beliefs. I suppose I do live my life according to expectations of how the world should work, even though I don't always understand why it works this way. These could be called beliefs. And they are certainly different from supernatural beliefs. My beliefs are based on assumptions that are verifiable, at least to a certain extent. I think this is a more practical way to live for the moment, compared to holding beliefs that are unverifiable, because at least I can explain why I hold a belief. I can justify my beliefs with evidence. What about you?



Monday, 15 December 2014

Starting a Wildlife NGO


Here are some ideas for starting a Wildlife NGO. These are activities that your NGO could take up.

  1. Wildlife rescue –
    1. Rescue animals forced into entertainment
    2. Maintain a shelter to keep them.
    3. Maintain a network of professional animal caretakers & vets to help.
  2. Wildlife Research –
    1. Research aimed at conservation - ecology, distribution, predator-prey relationships, etc.
    2. Hire scientists to do research.
    3. Hire project managers to oversee researchers and develop conservation plans.
  3. Networking, communications & fundraising - Get a marketing team in place to do what they do best.
  4. Fight cases in court – Hire environment lawyers.
  5. Education/training – Hire education consultants to design outreach programs, and scientists and volunteers to run education campaigns in schools.
  6. Excursions - Get your coordinators to organise and manage regular hikes or trips to wildlife sanctuaries.


Sunday, 14 December 2014

Stats Blogs I Follow


These are the stats blogs that make me better at what I do.

For stats literacy - http://www.statschat.org.nz

This is the main one for serious statisticians. Andrew Gelman, a statistics professor, Bayesian statistician and programmer, critiques poor statistical practices. Very informative - http://andrewgelman.com 

For some advanced talk and a lot of useful links - http://simplystatistics.org

Deborah Mayo writes about philosophy of statistics - http://errorstatistics.com

Great learning resource for advanced statistical concepts - http://www.mii.ucla.edu/causality/

You might learn a few things from Daniel Lakens' blog - http://daniellakens.blogspot.in

A nice revision of important concepts with comics - http://statistically-funny.blogspot.in

More on probability theory - http://underpoint05.wordpress.com



Saturday, 13 December 2014

Education without Innovation


Most complaints about education systems revolve around their being mostly theory without any practical application. This is a problem, because practical components like research methodology and computer lessons are a large part of what you need to go from being a theorist to a practitioner. It is no use studying concepts if you can't use them. But there is another problem I have with the system, and that is the lack of innovation. 

Students are great at learning theoretical concepts. They are great at regurgitating what they are taught in the form of an essay. Sure, this is a form of learning. But it is not innovative. When you gain knowledge by being taught, you grow to the level of the person teaching you, but you don't necessarily exceed that level. This is why learning by itself is useless for humanity without innovation. To truly make a change you need to go beyond what you are taught. You cannot simply learn concepts in a vacuum. You have to combine them to come up with new concepts. This is how new things are created. 

Practical lessons can suffer from the same problem. I can put students through computer classes, but it will not mean much if they can only recreate what I can, unless you want no new development. The best way forward is to teach your students the basic concepts with practical application, but to connect these lessons with questions, theories or ideas that they already have. This makes their learning context-dependent, and motivates them to go beyond their lessons, to use what they have learned to create something new. 

---------------------------------------------------------------------------------------------------------------

To see this in a broader historical context, countries that invested heavily in scientific innovation have always been quick to reduce poverty and grow economically as a result. Innovation makes you rich.

For example, Britain once had a lot of poverty. They were able to grow as a nation and a coloniser, and to reduce their poverty, because they innovated. This does go hand in hand with how much labour you have, of course. Britain had some labour, but not a lot - labour was expensive because workers were few - so they were forced to innovate, to find ways to mechanise processes so that they required less labour. They invented the steam engine, among other things, which meant that more resources could be processed, and that the means of processing them became even cheaper. It also meant that they could now do things faster and more cheaply than other, more labour-intensive countries could. 

Compare this with India, where everything was done by hand. It still is in many villages, because it is still economical to do so. The low cost of living and the availability of cheap labour act as a deterrent to investing in technology. This is fine if you aren't competing with other countries, and if those other countries are peaceful. But this wasn't the case. Exploiters gotta exploit. Britain had superior technology because it innovated, and it innovated because it was under pressure to do so. India had cheap labour, so there was no pressure to innovate, and so it didn't have superior technology. It was the same with a lot of African and Asian countries where labour was cheap. No investment in technology. No incentive to innovate. And of course the countries with superior technology ended up colonising the countries without it. 

Bouncing back

You also see this with countries like Japan and Germany. Germany, of course, had a history of scientific development; Japan didn't. It is interesting to see how these two countries managed to become economic powerhouses and developed countries despite losing world wars. Germany invested heavily in industry prior to both World War 1 and World War 2. Even though they lost the wars, they still had the brains, the skilled technicians, to rebuild their economy, to continue creating, processing and selling products and services that other countries needed. That kept the money coming in, which meant they could continue to invest internally - in infrastructure, healthcare, education, and yes, in science and technology - to keep that loop going. 

It was the same with Japan. A country with immense poverty before World War 2, they invested heavily in technology and innovation. They knew they were decades behind other countries in scientific development because of their isolationist policy. Political ambition and conquest drove their industrialists and businessmen to invest in technology, to send their best people abroad for training, to bring back, adopt, copy or recreate whatever they could, to bridge the gap between themselves and the West. Which they finally did, and in a very short time frame to boot. Sure, this was partly driven by war, but following their loss, which included recovering from two atomic bombs, they still had the scientific know-how to become the number one economy in Asia. Because they had invested in technology like no other country had. So even though they lost, they were still number one in Asia in science and technology. 

History shows us that winning or losing wars doesn't matter as long as you own superior technology and a workforce that knows how to use it. You might occasionally grow overambitious, make dumb decisions like invading another country, and get your ass kicked and your pride hurt, but as long as you still own superior technology, you will always bounce back quickly.

Owning the future 

China and India were happy being agrarian societies, while Japan correctly ascertained that if you wanted to be a world leader, you had to own the technologies that no one else had, because this gave you an advantage. You had to have products and services that made you more powerful, because you were able to do things better than any other country (like build better factories that built better cars, faster planes, etc.). This not only gives you a military advantage, but also something to sell to other countries at a very high value. 

Having better weapons not only gives you a military advantage, it also creates a new market for exports. Having more money go into medical research means a better healthcare industry, which means better-trained doctors and hospitals with more advanced tools and techniques, which can be exported. It also means better pharmaceuticals, which can be licensed or manufactured abroad. Again, the foreign countries that lack innovation only get to do outsourced blue-collar work, not highly paid work. R&D stays at home. No country that owns technology is going to sell it. This has changed to some degree in recent years, with companies becoming more global and R&D happening worldwide, which is an interesting change. It flattens the playing field somewhat.

But it's still shocking that people ruling countries today act as if they don't understand that for innovation to truly benefit you, you have to partake in it, so that you end up owning the technology that results from it. When you look back at the recent history of India, it is shocking that there have been no serious efforts at home-grown anything. If all you do is import foreign technology, you aren't owning the technology; you're simply renting it, or buying an end product of that technology, which is easily outdated. When India buys weaponry from Russia, Israel or France, it's buying old technology, perhaps even second-hand products. Even if it is better than what its competitors have, it is still no comparison to having your own state-of-the-art military-industrial complex, like the US, Russia or France have. 

This doesn't only go for weaponry, but also for public infrastructure like trains. Why does India have to go to France, China or Japan to build a metro or monorail? Because it doesn't have the technology to do it internally. It has to contract the design work out to foreign firms and then use local labour to build them. This despite the fact that monorail technology is over a hundred years old. It shows you how far behind India is, how lazy it has been at innovating. It isn't as if the incentives weren't there. They were, just as they were for Japan. I don't mean war, but the incentive of not being left behind, of wanting the best for your people. 

Indian leaders simply do not have this vision. If they did, they would invest more in education and research. Without these, you're always going to be second best. You're always going to be left behind. And your country might always be exploited, particularly in terms of trade. Crops don't fetch the prices that advanced technology does. A lack of innovation means that you're constantly dependent on other countries, and on their aggressive policies, for products and services. To be the best you can't keep chasing the best; you have to outrun them. To chase is to lose. If in ten years you aim to be where the US is today, say n years ahead, then you're still going to be n years behind the US ten years from now. Your goal should be to grow at a faster rate than your competitors if you want to catch up with them. 
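
Here's that arithmetic as a quick Python sketch, with made-up growth rates:

```python
def years_to_catch_up(follower, leader, follower_growth, leader_growth):
    """Years until the follower's output first matches the leader's."""
    years = 0
    while follower < leader:
        follower *= 1 + follower_growth
        leader *= 1 + leader_growth
        years += 1
        if years > 500:
            return None  # never catches up at these rates
    return years

# Same growth rate as the leader: the gap never closes.
print(years_to_catch_up(50, 100, 0.03, 0.03))  # None
# Grow faster than the leader and the gap closes, here in about 25 years.
print(years_to_catch_up(50, 100, 0.06, 0.03))  # 25
```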

Reducing poverty

This is also how you get rid of poverty. Yes, low-cost, labour-intensive production provides jobs for everyone, but it also sustains poverty, because it doesn't really help the economy. In 50 years, when other countries have moved on to new technologies and you're still using a low-cost, labour-intensive system to make things by hand, your economy will be in bad shape. Your workers might have jobs, but their pay will be low, because their work is simple and there are many of them. They might have just enough to cover food and basic living expenses, but no money to spend on more expensive goods and services, which means low purchasing power and a smaller market for expensive goods, which hurts the economy. By contrast, countries that abandoned labour-intensive production ended up with highly skilled workforces that are highly paid, because labour became expensive and skills became valued; those workers can now buy expensive things, which creates a market for more expensive items, which in turn drives the economy. This is pretty much a comparison of socialist India and capitalist USA in the 80s. 

This isn't a bad thing if Indians don't care about foreign products. But they do. They care about a better quality of life. However, because their economy is in bad shape, there's no money for the government to invest in infrastructure. This is partly due to subsidies, but those subsidies wouldn't exist or matter if people were richer, which they would be if they had higher-order skills that they could sell for more money. If you invest in innovation, say in factory production, there might be some job loss, but in the long run you will need a highly skilled workforce to manage the new processes. You could have a million people harvest cotton by hand, or you could have machines do it and have those million people do more specialised, better-paid work: overseeing the machines, maintaining them, working on business strategy, logistics, HR, marketing, PR, sales, client relationship management, IT. 

Through innovation, investing in scientific development and developing new and better ways to do things, you make life better for your workforce. Instead of earning a pittance doing low value work, they're now earning a lot doing high value work. This is how countries and economies grow. Innovation makes you rich. It's a costly investment, but the returns justify the costs. You see this as a historical pattern when you look at present day industrial nations that used to be agrarian - China, USA, Japan. All you need are leaders who can see and learn from history.



Friday, 14 November 2014

On Anecdotal Evidence


Too many people rely on anecdotal evidence (personal experience or cherry picked examples) to assess if something is true, and I don't like it. 

To me, everything is a model. All the ways in which we view the world, or our explanations for various phenomena like behaviour, are merely models. The techniques we use to estimate weather patterns are models. The techniques we use to estimate group dynamics are models. All estimates are models. There are a number of ways to consciously build models. You could use anecdotal evidence. You could also use critical reasoning. 

There's a famous phrase, usually attributed to the statistician George Box, that goes, "all models are wrong, some are useful". I like it because it captures what scientists do. Science is not about finding the 'truth'. It can be about the pursuit of truth, but the truth might never be known. So all you do is continue to build better approximations of the truth - better models to explain and predict phenomena, for both academic and practical purposes. Science is essentially a mix of critical reasoning and research techniques combined with domain knowledge. The sciences - Biology, Psychology, Chemistry, Physics - are merely fields of knowledge, domains that revolve around certain interest areas. Of course there is overlap. But these are not sciences merely because they encompass domains of study. That's only half of it. They are sciences because they use critical reasoning techniques to investigate and build models that approximate the truth. 

Where does anecdotal evidence come in? Anecdotal evidence is a first step towards building a model, but not evidence for the model. Anecdotal evidence is the presence of something interesting that requires further study. You see a ghostly white figure at night. You have no idea what you are looking at. You investigate, you make a hypothesis and attempt to verify it. Things can get a bit shaky if you skip the investigation and rush to make a claim, because anecdotal evidence could be due to a number of causes, not just the one you have in mind. False positives abound. This is why it is important to treat anecdotal evidence as a first step only. It would be disastrous to claim something as fact based on personal observation, and then find out that your claim is wrong because you didn't properly investigate the matter.

Let's take some examples. First, the claim that God is real. There are various types of evidence offered for this claim. One is prayer, a type of anecdotal evidence. I pray for something, something happens, therefore God is real. Anecdotal evidence like prayer cannot be evidence for the existence of God until it is verified. For every anecdotal claim of prayer working, there could be another of it being useless. To verify whether prayer works, you would have to demonstrate its effectiveness experimentally, in a way that could fail - this is where falsifiability comes in. Note that this is neither proving nor disproving the existence of God. That is not the question at hand for the scientist. It might be the question at hand for the person claiming God's existence and using prayer as an example, but for the scientist the investigation only concerns the effectiveness of prayer. A scientist who demonstrates that prayer is useless is not proving or disproving the existence of God. He or she is merely verifying a specific claim. This is important to remember. Science is not always concerned with the big questions. It is merely a tool to verify claims, or existing models. After all, prayer is a model of how the world works. A scientist could spend his or her entire life falsifying such claims, and this would get us nowhere if the claims were spurious to begin with. This is why anecdotal evidence should not be used to claim something: there are more reliable ways to build models. 

[This is why proving or disproving the existence of God is a futile activity. No one knows exactly why the concept of God came about. We have theories. But nothing that seems to be founded in verifiable evidence. There is a lot of anecdotal evidence, but upon verification, a lot of it does not hold up to scrutiny. This is not to say that any of the thousands of Gods do not exist, or that people are wrong in believing in them. Science cannot falsify something that was made up to begin with, or is currently too difficult to verify. It can only analyse the evidence and show over time how improbable something is, using existing methods. True falsifiability is impossible. Which is why we will never be able to disprove the existence of the Loch Ness monster either.]

Here's another example: psychometric tests like the MBTI. HR professionals love them, but meta-analyses pick holes in the test's reliability and validity. Still, HR professionals who have used these tests swear by them. One person I spoke to even compared it to the accuracy of a horoscope while praising it (I doubt he was trying to be ironic). This kind of reliance on anecdotal evidence is used as a model by a lot of people, just as people use prayer as a model. Why do they use it when there are scientific techniques that discredit these models? I have no idea. Maybe people are ignorant. Maybe they find it easier to act on someone else's recommendation, or on 'try it yourself first' advice, rather than doing their own research. Maybe they think that discrediting one model would mean discrediting a larger model in which they have more of an emotional investment. Maybe they already choose to believe in something to make themselves feel better. Maybe creating a faulty but useful model works for them. Maybe the model's degree of usefulness wins out over the fact that it is wrong.

Which is interesting, because of what I said earlier: all models are wrong, some are useful. Let's say human sacrifice to appease the weather gods is supported by anecdotal evidence, i.e. a group of people practice human sacrifice and choose to notice only the occasions when the weather changes for the better, convincing themselves of a correlation between the two. They ignore the instances when sacrifice does not affect the weather, attributing them to human fault, or to God being angry with them, or to it all being part of God's larger plan. Now let's say, hypothetically, that this model/belief is the only thing keeping this society stable.

Note that science isn't always concerned with whether the effect is real or not, or whether belief in it should continue. Yes, the assumptions are faulty. Correlations abound in large amounts of data; they're a function of statistical noise. Experimentation should verify how probable the correlation is. But even if it finds that the correlation/belief/model of human sacrifice for better weather is wrong, that doesn't erase the fact that it is useful. Now replace human sacrifice with belief in God, or the MBTI. These models might work in certain contexts. Belief in God helps people in certain contexts. Belief in aliens might just help society; I have no idea. The MBTI might be useful in certain contexts. None of these models might be correct, but they can be useful. If the MBTI works for you, then great, use it. But that doesn't mean it does what it claims to do, which is why you wouldn't be right to recommend it to me. Which is why people need to look at the evidence to verify whether a model is good for them, and not rely solely on anecdotal evidence, or else risk disappointment.
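
Here's a rough simulation in Python of what I mean by correlations being a function of statistical noise. Everything below is invented data (and statistics.correlation needs Python 3.10 or later):

```python
import random
import statistics
from itertools import combinations

random.seed(42)
n_vars, n_obs = 40, 20
# 40 variables of pure noise, 20 observations each.
data = [[random.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]

# Scan all 780 pairs of variables for the strongest correlation.
best = max(combinations(range(n_vars), 2),
           key=lambda ij: abs(statistics.correlation(data[ij[0]], data[ij[1]])))
r = statistics.correlation(data[best[0]], data[best[1]])

# The winning pair usually looks "impressively" correlated by chance alone.
print(f"variables {best} correlate at r = {r:.2f}")
```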

In summary,

1. Anecdotal evidence can be a good first step to further research.
2. If you notice something interesting, collect data, find patterns, make a hypothesis and verify it. Then make a claim.
3. Your claim is your model. It can only be built on the elements in point 2. 
4. Anecdotal evidence by itself cannot be used to build models. 
5. If someone builds a model that approximates what they think is the truth, question their assumptions and verify the evidence.
6. If their model is built on anecdotal evidence (personal or cherry-picked examples), reject the model for being incomplete.
7. Their model is not necessarily completely wrong but it is pointless to consider something correct if it hasn't been verified, even if it is useful.
8. A model's usefulness does not necessarily reflect its correctness.



Monday, 4 August 2014

On Science Journalism


Few people do science journalism right, and fewer still can tell the difference. 

I came across a link recently, which took me to an article describing a paper that I downloaded and read. The paper itself was OK as far as social science papers go, but, as usual, parts of the media that don't know any better jumped on it. 

The paper concludes "that individuals with an East German family background cheat significantly more on an abstract task than those with a West German family background." It also concludes that "The longer individuals were exposed to socialism, the more likely they were to cheat on our task." 

The points I would like to raise are below.

The first point concerns the statistical inference used in the study. The researchers note that both groups cheated, but that those with East German backgrounds cheated more than those with West German ones, to a degree that was statistically significant. I won't go into detail about p-value hacking, confidence intervals, effect sizes and power here, but suffice it to say that a statistically significant result does not necessarily reflect a real-life effect. This is just a function of probability. It is possible that neither group really cheats more; the statistical techniques simply picked up on variation that the researchers deemed significant. We do not know whether this significant difference represents cheating in real life, or whether it would hold if the study were repeated.
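
Here's a rough simulation in Python of what 'a function of probability' means. The group sizes and threshold are invented stand-ins, not the study's actual setup: two groups drawn from the same distribution, i.e. with no real difference between them, still produce a 'significant' result about one time in twenty.

```python
import random
from statistics import mean, stdev

def t_statistic(a, b):
    """Two-sample t statistic for equal-sized groups (pooled variance)."""
    n = len(a)
    pooled_sd = ((stdev(a) ** 2 + stdev(b) ** 2) / 2) ** 0.5
    return (mean(a) - mean(b)) / (pooled_sd * (2 / n) ** 0.5)

random.seed(0)
experiments, false_positives = 2000, 0
for _ in range(experiments):
    group1 = [random.gauss(0, 1) for _ in range(50)]  # same distribution
    group2 = [random.gauss(0, 1) for _ in range(50)]  # as group1
    # |t| > ~1.98 corresponds to p < 0.05 at ~98 degrees of freedom.
    if abs(t_statistic(group1, group2)) > 1.98:
        false_positives += 1

print(f"false positive rate: {false_positives / experiments:.3f}")  # ~0.05
```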

Even if the effect (cheating) were present, you cannot automatically extrapolate the results of a game to a judgement of people's moral attitudes in general. This is because morality is complex. The fact that some people might cheat if given the opportunity in a dice game does not necessarily reflect their attitudes in general, or their choices in other situations. The researchers use terms like 'value system' but do not define what this encompasses. What constructs and concepts make up a value system? Is it objective or context-dependent?

You might download a film illegally, but this does not make you a thief in general. Your choice to download a film at that point in time is a function of the cost-benefit equation for you, the social context of your choice, how many other people are doing the same, your perception of the 'rightness' or 'wrongness' of your act, etc. It does not necessarily reflect your attitudes or preferences in other contexts.

Even if you could extrapolate the effect observed in the game to life at large, you must remember that correlation is not causation. The fact that those with an East German background cheated more does not indicate that their background is what caused them to cheat. No one is denying that economic systems can change people's behaviours and attitudes, and it is worth studying, but you cannot jump to conclusions. You need to rule out confounding variables, false positives and other possible causes. You do this by making as many comparisons as possible. Did the researchers do this? Not completely.

East Germany was not merely socialist; it thrived on a culture of fear and repression, with secret police spying on citizens. There are social and cultural factors that could have led to people developing a habit of cheating and that might have had nothing to do with the economic system. The researchers identify two of these - economic scarcity and social comparison - but were not able to verify them using their methods.
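
Here's a toy simulation in Python of how such a confounder could manufacture the entire correlation. The variables and numbers are invented, not taken from the paper:

```python
import random

random.seed(7)
rows = []
for _ in range(10_000):
    repression = random.random()                        # hidden confounder
    east = random.random() < repression                 # drives "background"
    cheats = random.random() < 0.2 + 0.5 * repression   # drives cheating
    rows.append((east, cheats))

def cheat_rate(background):
    group = [cheats for east, cheats in rows if east == background]
    return sum(group) / len(group)

# "East background" subjects cheat more, yet background itself has no
# causal effect here; the hidden factor produces the whole correlation.
print(f"east: {cheat_rate(True):.2f}, west: {cheat_rate(False):.2f}")
```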

Also, the paper does not mention whether the researchers took into account the fact that former East Germans had been living in a new economic system for 23 years (1990-2013), and how this might have changed their preferences/choices/attitudes, possibly to the extent of making the effect of their background meaningless to the study.

Even if all the points above are wrong and the researchers' assumptions thus far are correct, the inference that people exposed to socialism cheat more would still be incorrect, as socialist economic systems differ greatly from country to country. East Germany was an impoverished socialist country compared to, say, the Scandinavian countries, which also had elements of socialist governance. One could argue that the Scandinavian countries were never truly socialist, but that's missing the point. If the authors are talking about one specific kind of socialism, they need to be clear about this. If they are referring to socialism in general, then they need to test population samples in other formerly and presently socialist countries, and control for cultural and other differences, before they can make such an inference. They have not done this either.

--------------------------------------------------------------------------------------

All in all, this is not a bad paper compared to others I have read. The researchers are quite honest about most of their limitations. However, no one else seems to care. The original article that linked to this paper merely reiterated the findings as if they were established, without taking into account the researchers' alternative explanations. This is bad journalism. 

There are so many papers being published every month in various journals. Sometimes the journals themselves are shady and publish poor research for a fee. Researchers are under pressure to publish, as this is what determines their reputation and pay in academia, so some fudge data or manipulate it in dishonest ways to get positive results. Journals have a publishing bias towards positive results. And the journalists who write about the papers are usually under tight deadlines too. They cut corners. They trivialise, generalise and oversimplify. Many have a poor understanding of scientific domains, empiricism and critical reasoning. Most don't bother critiquing the papers they report on.

I have read so many bad science articles in the past three years that I have had to whittle down my RSS feeds to the point that I only follow a few news feeds, scientists and professional science writers. On Twitter I am even stricter. I do not follow any pop-science accounts, only professional researchers - people who will either share original research or go the extra mile and critique people's research, rather than blindly sharing links they come across. The best science communicators out there do not even bother writing about the latest developments in Psychology, given the faults in the field and the shakiness of its results - the p-value hacking, selective sampling, failures to replicate, false assumptions, etc. 

So it's always sad when an individual with a lot of followers shares a bad article. I am not writing this to be mean or to hurt anyone's feelings or discourage anyone's work. I'm just saying that there is a clear demarcation between good science writing and poor science writing. I do not expect every journalist out there to be able to critique a paper (though it would help) but I do expect even a beginner to know the difference between a balanced and a biased article.

When a journalist with impressive credentials chooses to share a link to a clearly biased article that, to push an agenda, deliberately ignores the limitations in a paper that any 2nd year undergraduate student at a middle ranked Psychology department in the UK would notice, you question that person's credentials.

This is not an isolated case. There are other people on Twitter with a massive reach who also tend to share terrible links. I am sure they are lovely human beings who want the best for humanity and are smarter and more accomplished than me in many ways, but they still share terribly written pop science articles that distort a subject just because they have a catchy title or byline.

So here is some advice when you come across a piece of science journalism - 


  • When something sounds too good to be true, it usually is not true. In the social sciences, discoveries are few and far between, so an article that claims a major new effect should be met with scepticism.
  • If an article claims something that you're sceptical about, read the paper and double check if those claims are true.
  • If you cannot critique the paper or do not have the time, do not share the article. Wait for someone else to critique it.
  • Follow professional science communicators who know how to critique scientific claims, not mass-produced pop-science accounts. The pros know how to write a balanced piece. Pop news channels just want to grab eyeballs and don't care about accuracy.





Friday, 13 June 2014

Proximate and Ultimate Explanations


Proximate and ultimate explanations are among the first terms you learn about when studying ethology. These terms are also used in other contexts beyond the study of behaviour, where they mean slightly different things, so it's important to understand them and not get them confused.

Why does an animal behave in a certain way? An animal's behaviour can be explained in proximate and ultimate terms. Proximate explanations deal with the 'how' of a behaviour, i.e. the underlying or mechanistic reasons behind it. Ultimate explanations deal with the 'why' of a behaviour, i.e. the usefulness of the behaviour to the creature and how it came to acquire it.

--------------------------------------------------------------------------------------------
Here's an example - birds singing in spring.

Proximate questions - How do birds manage to sing in spring?
Proximate answers - Daylight induces changes in hormones which make them sing. They learned to sing when young.

Ultimate questions - Why do birds sing in spring?
Ultimate answers - For mating/reproductive value. The vocal cords of distant relatives and extinct birds indicate how this trait evolved and contributed to fitness.
---------------------------------------------------------------------------------------------

It is important to realise that proximate and ultimate reasons are both explanations for the same behaviour/phenomenon but from different perspectives. You could say that proximate explanations provide the reasons underlying behaviour (what is it due to?) while ultimate explanations look at the bigger picture (what is it for?)

Proximate explanations usually provide mechanistic reasons for behaviour, or describe the 'triggers' behind it. You can think of them as the result of things occurring in the animal's body (e.g. hormones, nervous system, genes, age) or immediate environment. Proximate explanations can be further divided into mechanistic (causation) and ontogenetic (developmental) explanations. 


  • Mechanistic explanations (how does it work? how was it caused? what caused it?) usually deal with processes within the body that follow simple rules, like neurons and the nervous system, hormones, pheromones and other bio-chemical processes.
  • Ontogenetic explanations (how did it develop?) cover behaviour from the nature-nurture or gene-environment angle. These explanations build on mechanistic processes and relate them to what's going on in the individual's environment, like learning or other aspects of behavioural development.


Ultimate explanations usually describe a behaviour in terms of its evolutionary history and function, i.e. the evolutionary benefits the behaviour confers. These can be further divided into phylogenetic (evolutionary history) and adaptive (functional) explanations. 


  • Phylogenetic explanations (how did it evolve?) deal with why this behaviour might have evolved over successive generations instead of being lost. We look at the evolutionary history of the creature to see how natural selection worked on this trait. 
  • Functional explanations (what is it for? what purpose does it serve?) deal with the benefit the behaviour confers to the individual in terms of its current environment. It is important to remember that an individual can have a current trait that is adaptive without it being an adaptation. 


---------------------------------------------------------------------------------------------
Here's another example - Honeybees swarming (splitting up and building new colonies elsewhere).

Proximate questions - How do honeybees manage to swarm? What factors lead to swarming?
Proximate answers - Because of the way their central nervous system responds to other bees doing the waggle dance. Or because this behaviour is triggered by colony size, brood comb congestion, worker age, or the queen having reached her maximum egg laying rate.

Ultimate questions - Why do honeybees swarm?
Ultimate answers - For reproduction, survival, more food resources.
-----------------------------------------------------------------------------------------------
And one more - birds building nests.

Proximate questions - How does a bird know how to build a nest?
Proximate answers - It could be a genetically programmed or learned behaviour.

Ultimate questions - Why does a bird build a nest?
Ultimate answers - Because a nest assists in mating and so improves reproductive success, which means genes are passed on to the next generation.
-------------------------------------------------------------------------------------------------


Sunday, 27 April 2014

On MOOCs and Online learning


I think MOOCs are important and useful. I just think that a lot of them aren't following instructional design principles and enabling learners in the way that they should be. It's a great medium to change the world, but it's being run by computer scientists and businessmen with minimal input from learning designers, and this needs to change.

A note of advice: don't take more than one MOOC at a time. I first discovered and registered for MOOCs in 2012. I registered for four, but then realised I couldn't follow all the courses. Even after reducing the number to one, I couldn't cope with both the MOOC and my studies. I tried taking more MOOCs when my schedule cleared up in 2013, but again I registered for too many. I finally completed two courses simultaneously from March to May, but the workload was so high that I decided to stick to one course at a time in future. And the only reason I was able to manage two was that one was really easy.

I learned two lessons here. One: a MOOC is a full-time course of study. It's equivalent to one college-level module, and a heavy one at that. It requires daily participation on your part, and is certainly not a 'one day a week' thing. You don't just watch a few videos and take a quiz; you need to do a lot more for the course to be effective. There's a lot of reading to do if it's a knowledge-based course, and a lot of practice if it's a skill-based course.

There's also a lot of knowledge sharing in online groups. And you need to budget your time accordingly. Granted, you don't always know exactly how much time you'll need at the start. Which is why it's a good idea to audit courses when you're not sure. And just drop out if it's too much for you. I tend to do this a lot, especially when the subject area is completely new to me. I've dropped out of around 6 courses for every one I've completed.

The second lesson is about the design of MOOCs. Understanding a concept takes time. Learning comes from reflection, practice, application and knowledge exchange. You do not learn something by watching four 15-minute videos on the topic each week and then taking a quiz about it. The true test of learning comes not from summarising what you've learnt but from applying it to a new context. That's the real challenge. Do MOOCs meet it?

I say no. Most MOOCs consist of mainly videos and reading materials. Videos are at best an overview, an introduction. They cannot be the entirety of the course material. You should ideally watch a video, and then do a lot of follow up reading (the best MOOCs have their own textbooks), note taking, introspection, sharing ideas with others, summarising your conclusions in the form of essays, and a lot of follow up exercises involving applying your ideas to novel situations. This is how learning takes place.

And this is the problem with MOOCs. They're mostly just videos and quizzes, and they should be more. A course with just videos, quizzes and maybe a few assignments can never completely teach a complex subject to the extent that you begin using its ideas as a practitioner. Second, this type of course encourages solitary study without group interaction, which is not preferable. Third, it fools you into thinking you're now an expert on a subject because you got a good score on a multiple-choice quiz on it every week for eight weeks.

Multiple-choice quizzes are generally not the best learning facilitation tool, given the amount of guesswork taking place. I understand that you can't have teacher-graded essays or exercises in a class of 15,000 students, but peer review should definitely be an option. 
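
To put a rough number on that guesswork, here's a quick calculation in Python with invented quiz parameters (ten questions, four options each, a 50% pass mark):

```python
from math import comb

def p_at_least(k, n=10, options=4):
    """P(at least k of n multiple-choice questions right by pure guessing)."""
    p = 1 / options
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"P(5+/10 by guessing) = {p_at_least(5):.3f}")                      # ~0.078
print(f"expected lucky passers in 15,000: {15000 * p_at_least(5):.0f}")   # ~1172
```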

Real learning takes place through reflection and practice, which take time. One of the better courses I've taken was on Psychology and had its own free online textbook that was required reading for the course and comprehensive in the material it covered. However, I would have liked more essays and exercises to cover the practical aspect. 

Another good one (on mathematics) had loads of exercises that needed to be discussed in the group forums. Group learning is a good thing, and one of the main advantages of online learning: you have so many more classmates to share ideas with and learn from, and you do this on your own time. When a course only revolves around material presented through videos, without any other reference material or exercises, the course forums become a wasted opportunity, as you're only discussing topics covered in the videos, which aren't extensive anyway.

MOOCs will never replace college education, or be taken seriously as a means of education, until they have their students use more reference material and application-based exercises (which encourage reflection and knowledge sharing), and until they adopt better forms of evaluation.

