For once, let's think about something other than programming. Or will we ;-)?
The bookshop
Two weeks ago, coming home from some consulting, I arrived early at the airport with a few dollars in my pocket, and I decided to stop by the bookshop for some refreshing mental food.
Haha, I wish it had been that easy! I did a first pass, looking at almost all the books, then I narrowed my choice down to 2 or 3 and started thinking really hard.
"Do I prefer a short and small book?"
"Should I take the most interesting?"
"Should it rather be entertaining? I should relax."
"Do I have enough cash? If not, is that worth withdrawing more money?"
"Would I rather read a more political book?"
"Do I need a book at all?"
All this thinking (I'm not kidding, I spent at least 20 minutes) convinced me that this was the book I should buy: How We Decide.
Feelings can be a good decision-maker
This book was, for me, a very good follow-up to Damasio's book Descartes' Error, where we learn, through different stories, the role of emotions in the decision-making process. Like the story of the man who, after a brain injury, no longer felt any emotions.
One morning, arriving at Damasio's lab, he explained to everyone how he had managed to stay very calm in his car that morning, avoiding a deadly accident on the icy road. After the session, as he was about to leave, the doctor asked him to select the best date for the next appointment. That emotionless man stood there for 30 minutes, weighing all the possible options open on his schedule. The team just wanted to strangle him for not being able to make up his mind!
This kind of story is one of many showing the power of emotions or, you could say, the power of our unconscious decision circuits. Most of Jonah Lehrer's other examples are taken from games, either where decisions must be made fast, like baseball or American football, or where decisions are very complex, like chess or backgammon.
In all those cases, Jonah Lehrer tells us that we have "dopamine learning circuits" directly connected to the reward system. Some neurons register when "it's good", when "it works", while others register the surrounding context and the consequences of our actions. The net result is that we progressively develop a "feeling", an "intuition", for what works and what doesn't.
For example, Garry Kasparov has a global feeling for what makes a good strategic position on the chessboard. Joe Montana had an instinctive feeling for the best moment to throw a pass on the field. We should also note that both of them spent a lot of time thinking hard about how they failed, how they made mistakes.
Is there anything we can use out of this for programming?
We should fail early, fail often and try to learn as much as possible from our failures:
- by reducing the develop-test cycle so that we get very frequent feedback when changes break things
- by using retrospectives/post-mortems to analyse what works when a group of people collaborates on a project
- by scrutinizing the code when it hurts. We have to change 100 lines of code all over the place for a seemingly innocent evolution. That hurts, doesn't it? Let's use this as reinforcement for extreme refactoring. Next time we have to decide whether to refactor, the decision should feel obvious
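To make the first point concrete, here is a minimal sketch of a fast feedback loop: small, cheap tests run on every save. The function `normalize_price` and its behavior are invented for illustration.

```python
# A hypothetical function under test -- name and behavior are
# invented for this example.
def normalize_price(cents):
    """Convert an integer amount of cents to a formatted dollar string."""
    return f"${cents / 100:.2f}"

# Tiny, fast checks like these can run on every save, so a breaking
# change is caught within seconds instead of days later.
def test_normalize_price():
    assert normalize_price(1000) == "$10.00"
    assert normalize_price(199) == "$1.99"

test_normalize_price()  # with a runner like pytest, this would be collected automatically
```

The shorter that edit-run loop is, the more often our "dopamine circuits" get trained on the real consequences of a change.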
This is also a good justification for code elegance. Code is elegant when it's easy to read, modify, and reuse, and developing that feeling helps with all the micro-decisions we make when writing code:
- naming
- formatting
- refactoring
- abstracting
- checking, validating
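As a contrived before/after sketch of those micro-decisions (the function names are invented), the same behavior can feel very different to the next reader:

```python
# Before: opaque names and manual bookkeeping. Every reader has to
# reverse-engineer the intent.
def f(l):
    r = []
    for x in l:
        if x % 2 == 0:
            r.append(x * x)
    return r

# After: the name states the intent, and the comprehension removes
# the bookkeeping. Same behavior, easier micro-decisions next time.
def squares_of_evens(numbers):
    return [n * n for n in numbers if n % 2 == 0]
```

Nothing changed in what the code computes; only the reading effort did.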
Don't always trust your feelings, Luke
Our "dopamine circuits" are trained to recognize success and failure patterns and give us a good feeling for what the next decision should be. Unless we totally fool ourselves.
For example, we're easily fooled by randomness, thinking we understand how probabilities work. In 1913, at a casino roulette wheel, black came up 26 times in a row! Can you imagine the number of people who put all their money on red after the tenth or fifteenth time? Even reminded that each spin is independent of the previous one, wouldn't you bet all your money too?
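A quick simulation makes the independence point visible. This is a simplified red/black wheel (ignoring the zero pocket), checking the outcome right after every run of 10 blacks:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Simulate a simplified red/black wheel (no zero pocket) and look at
# what happens on the spin right after a run of 10 blacks.
spins = [random.choice(["red", "black"]) for _ in range(200_000)]

streaks = 0      # how many times we saw 10 blacks in a row
red_after = 0    # how often the *next* spin was red
for i in range(10, len(spins)):
    if all(s == "black" for s in spins[i - 10:i]):
        streaks += 1
        if spins[i] == "red":
            red_after += 1

# The conditional frequency hovers around 0.5: the wheel has no
# memory, so a streak tells us nothing about the next spin.
print(f"P(red | 10 blacks) ~ {red_after / streaks:.2f} over {streaks} streaks")
```

However long the streak, the conditional frequency stays near one half: the "due" outcome is an illusion.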
That's not the only situation where we should make better use of our reason. Behavioral science has now plenty of evidence showing how limited our rationality is:
- we sometimes just don't learn from failing patterns. In some crisis situations, when emotions run high, we keep doing the wrong thing despite all the adverse evidence. It's just impossible to think outside the box.
- we're generally very bad at understanding probabilities. For example, if there are 10 lottery tickets at $10 each and the prize is $200, there's no reason not to play. But if we learn that we buy one ticket while another person buys the remaining 9, we usually find the situation unfair and don't play!
- we're subject to "framing issues" and "loss aversion". Would you prefer your doctor to say you have a 20% chance of dying or an 80% chance of surviving?
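The lottery arithmetic above is worth spelling out, because the expected value doesn't change one bit when the other person buys 9 tickets:

```python
# The lottery from the example above: 10 tickets at $10 each,
# a single $200 prize.
ticket_price = 10
num_tickets = 10
prize = 200

win_probability = 1 / num_tickets                 # 0.1 per ticket
expected_value = win_probability * prize          # $20 per ticket
expected_profit = expected_value - ticket_price   # $10 per ticket

# Who owns the other 9 tickets has no effect on these numbers.
print(expected_value, expected_profit)  # 20.0 10.0
```

Each $10 ticket is worth $20 on average; refusing to play because the other buyer "will probably win" is the fairness instinct overriding the arithmetic.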
The solution here is to use our prefrontal cortex and really try to make rational decisions, knowing our natural biases and weaknesses.
Is there anything we can use out of this for programming?
Programming is a highly rational activity, but not always. Let's not forget to use our prefrontal cortex when:
- we have "impossible" bugs. Especially under pressure, we can sometimes be completely stuck on a bug, just thinking: "this is impossible!!! I'm doing A, I should get B!!!". In this situation, the best thing is to stop and brainstorm. Try to come up with as many ideas as possible. Then carefully select and test each of them. There's no magic in programming; things happen for a reason, so let's find it fast!
- we're tuning for performance. Performance tuning should be as scientific as possible. If I have a feeling that X could be slowing down my system, I shouldn't tweak anything until I have real evidence that X is the culprit
- we're interacting with others and our ego gets in the way
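For the performance point, here is a hedged sketch of "measure before you tweak", using Python's timeit on an invented suspect (string concatenation in a loop) and its idiomatic alternative:

```python
import timeit

# Two invented candidates: the "suspect" (string concatenation in a
# loop) versus the idiomatic join. Measure instead of guessing.
def concat_with_plus(n):
    s = ""
    for i in range(n):
        s += str(i)
    return s

def concat_with_join(n):
    return "".join(str(i) for i in range(n))

# Both must produce the same result before their speed is compared.
assert concat_with_plus(1000) == concat_with_join(1000)

for fn in (concat_with_plus, concat_with_join):
    elapsed = timeit.timeit(lambda: fn(10_000), number=50)
    print(f"{fn.__name__}: {elapsed:.3f}s")
```

Only once the numbers confirm which version is actually slower, on realistic input sizes, is it worth changing anything.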
Too much information
Rationality = good. Too much rationality = bad.
Yes, this is not obvious, but too much information, for example, is not a necessary ingredient for a good decision:
- customers taste and compare strawberry jams and produce a first ranking of their preferences. Then, asked to think about why they like each one, they change their rankings completely! I have the feeling this is related to Condorcet's paradox.
- numerous studies suggest that "experts" may not be good predictors of what will really happen in their field (partly, of course, because they can be biased by their own opinions)
- before MRI, doctors sent patients with back pain to rest for a few days. Now that they can see inside the body, they suggest surgery and medication. Even to perfectly healthy patients!
Is there anything we can use out of this for programming?
Focus!
- when tuning performance, it is very easy to be completely overwhelmed by data. I would suggest taking a single "red flag" (like a display issue) and making a thorough analysis of that one issue until it is completely explained, ignoring what's not relevant
- lean programming suggests that accumulating huge backlogs of features and issues may not be the best way to make decisions. One reason is that we're very sensitive to irrelevant alternatives.
Conclusion: Emotions vs Reason?
I really learned something from that book. When too many rational criteria come to mind about which book I should buy, I'm just wasting my time and most probably heading for a bad choice. I'd rather let my emotions drive me and learn instinctively what a good decision is for me (talk about an obsessive guy...).
In conclusion, the debate is not so much "emotions or reason" but rather why and how we use both of them to make our decisions.