RubyConf 2015 – The Not So Rational Programmer by Laura Eck

(upbeat, electronic country music) – Alright, hi, everyone. I'm Laura and I'm a developer. I work for testCloud in Berlin, I live in Tokyo, and I work remotely from there. This is me, just in case you wanted to know and see a second image of myself. (chuckles)

Alright. To start out with, I have a question for you. Or, actually, I have two questions, but I'll start with question number one: who in here has ever had to work with a really weird, old legacy system? (audience chuckles) Like, not even rebuild it, just had to deal with it somehow. Please raise your hand if you have… Alright. Second question: who here in this room has a brain? Please raise your hand if you do. Not everyone has a brain? That's kind of surprising, but, eh… I assume you all have one.

Alright, so, everyone who raised their hand for question number two should in fact also have raised their hand for question number one. Because what our brain really is, is a big, fat, old legacy system. And it's been a while since the last hardware update. The good news? Our brain is extremely powerful, and it can do a lot of amazing things. The bad news? The documentation is pretty crappy. The error handling is not that great either. And we can't even debug it, because we don't have access to the code base at all.

So, that sounds like every programmer's nightmare, doesn't it? The problem is, we can't just walk up to a project manager and quit. We're kind of stuck with this obnoxious and brilliant heap of jelly in our skulls that helps us process and react to our environment, that helps us reason about abstract problems, and that actually lets us create, communicate, even program. And on the other hand, it constantly keeps forgetting people's names, it reminds us of awkward situations from years ago in a very random fashion, and it constantly makes decisions for us without even asking.
So today, I would like to talk about how we can understand our brains better: the way they work, these weird little flaws that are called cognitive biases, and what to do about them.

You see, we as programmers really like to look at ourselves as a group of people that is somehow more rational than others. After all, we earn our living by talking to machines every day, and machines aren't exactly known for being super emotional. So if we make technical decisions, or if we plan projects, or if we assess capabilities or competencies, it's only fair to assume that we adhere to rational standards and that that's what we base our decisions on, right? Well, I have a surprise for you. Programmers are human. And they have human brains.

To be fair, most of the time our brains do an amazing job. They have to process vast amounts of information, and they somehow have to come back to us with appropriate reactions to all this stuff that comes at us all the time. You see, the human brain is really old. Many parts of it developed when social coherence of a group was really important for survival, and so were things like accurate and quick assessment of threats. When it really mattered what your peers thought of you, because being ostracized might well mean that you were going to die. And race conditions were a completely different problem than what they are to us nowadays, most of the time.

So, what is cognitive bias? Cognitive biases are heuristics that the brain uses to process information very quickly and come up with appropriate reactions. They're pretty useful, most of the time. But they are not perfect processes, so they can actually lead to some pretty sub-optimal behavior and decisions. The thing is this: we are all biased. That's something that's very important to understand. It's natural, it's how our brains work, and it's not necessarily a bad thing. Our brain uses all these shortcuts so it can actually deal with all the information that's coming at us and give us back something reasonable, so we don't just get an information overflow because the brain gets stuck on all the details.

In fact, our brains have different modes of operation, and I'm going to show you two really simple examples that will illustrate this.
So, when you look at the next slide, there are a couple of things that happen without you even noticing. You recognize right away that this is a person, a child in fact, and also probably someone you haven't seen before, or maybe you've seen them but you don't know them personally. You can also tell right away that she's currently not happy at all, and that if she were standing right in front of you at this moment, she might be very close to starting to cry, or to actually shouting at you. This process of recognition and perception is something your brain does really quickly and effortlessly, for evolutionary reasons. It's important to understand, "Oh, this is another person, they're like me," and to understand what they feel like. In this mode, the brain also does a lot of very quick and automated decision-making, where we often don't even realize it's happening, just because it's happening so fast. And oftentimes it sacrifices accuracy and correctness for speed and for approximated results that are okay most of the time. And that is a sufficiently large number of times.

When you look at this, on the other hand, unless you're really good at mental arithmetic or you still remember your multiplication tables from elementary school, which I don't, to be fair, your brain probably drew a blank. There's no evolutionary reason why our brain should be able to automatically process semi-difficult multiplication problems. So it can't, and there's no real way for it to spontaneously come up with the results unless you've actually memorized them. You can probably tell that this is a multiplication problem by looking at it, because that's something you've learned to recognize, and you can also tell that five, or 5 million, is probably not a very reasonable estimate for the result. But if you really want the result, you have to actively start thinking about it. You have to start calculating. And that's a lot slower, a lot more difficult, and a lot more demanding than the fast-thinking mode.

Put simply, when our brain is in fast-thinking mode, it uses cognitive biases to approximate solutions. And like all approximation approaches, this does not prioritize optimal solutions; it prioritizes coming up with a feasible solution in a reasonable time. You can't really turn this off. It's hard-wired into our brains. But in some situations, there are ways you can work around it.
That's not always possible, simply because, firstly, you might not even recognize it's happening, because it's happening all the time. And secondly, it might not be viable to (coughs), excuse me, to do something about it all the time. If you were to question every single move your brain makes, it would slow you down so much that it would be difficult to even function as a human.

There are situations, though, where it is definitely viable, and also very good, to try to work around these biases. Actively making decisions is one of them. We make decisions all the time, from really small ones, like what to have for lunch, to major ones, like how to make our lives meaningful. Most of the decisions we make at work fall somewhere in between: how to implement this new feature, what kind of tools or frameworks to use for this new project, which applicant to hire for the job we're offering. These are important decisions, and our brains' biases affect each and every single one of them. So what can we do to make our decisions as good as possible?

I will start by looking at cognitive biases that affect our personal decision-making, when we make decisions on our own. And since we as programmers usually don't work alone all the time, but in teams, I will also look at some cognitive biases that come into play when you're making decisions in a team.
Confirmation bias is one of the first things we have to look at when we're trying to trick our brains into making better decisions. Confirmation bias means that when we search for information, or when we interpret information, we tend to do it in a way that confirms the opinions we already hold. As I said, this affects both searching for information and interpreting it. We take what we already think is right, or what seems to make sense, or what seems obvious to us, and then we try to confirm this idea, while at the same time ignoring possible alternatives. If you have a strong opinion on something, or an opinion you're very emotionally attached to, you might even get angry if someone challenges it.

For example, many people have rather strong or emotional opinions about topics such as abortion, or gun control, or gun ownership, something like that. No matter whether it's wrong or right, that doesn't matter at all, but if you read or hear something that challenges this opinion, you're very prone to waving it off as nonsense, or even getting upset that somebody challenged it. While at the same time, you will happily take in every piece of information that seems to confirm what you already believe. Because it's obviously right, right?

Or, something more related to actual technical decision-making. And I would like to say here that this is a less emotionally charged topic, but I don't really wanna lie. So. For example, if you're already convinced that Rails is the best this world has to offer for the new project you're starting, you're very prone to not listening to people who come to you and tell you that Rails is a shitty framework for reason X, Y, or Z. You might listen to what they have to say, but you'll probably discard it pretty quickly in favor of the opinion you already have, and you won't really dwell on it for very long. You're also much more prone to look for information that tells you why Rails would be a good choice for this project, and not why it would be a bad choice.

The confirmation bias is a good example of the brain relying on its shortcuts and sacrificing accuracy and correctness for speed and less effort. In fact, we have many preconceptions about the world that are actually true, or close enough to true in most cases. So sacrificing accuracy and (audio cuts out) enables us to act and think much faster, and therefore our brain doesn't constantly check whether our opinions are true or false; it just assumes they are the basis on which we act. So this is not necessarily a bad thing. The problem comes up when the opinion we're holding, and that we're trying to confirm in this way, is actually not a very good solution for the problem we're trying to solve.
So, what can we do about the confirmation bias? A good approach to countering it is to challenge your own opinion. Try to prove yourself wrong. When you're making an important decision, try to put yourself in the shoes of someone whose job it is to show that your approach is not correct. This is not easy at all, but it will certainly help you take on a different viewpoint, and it may uncover some things you hadn't thought about before. And if you're not sure you can do this honestly enough, ask someone, a coworker or just someone you trust, to play devil's advocate for you and actually challenge the opinion. Don't be defensive about it; really take into consideration what this person is saying. And change your opinion if required. I know this is not easy at all, but if we're not ready to change our opinions, we don't even need to start with all this working around cognitive biases, because there's no point to it. And if, in the end, the original thought or idea you had still looks like the best one, then it's probably not such a bad choice.
Alright. Another cognitive bias that strongly influences our decision-making is the mere exposure effect. The mere exposure effect means that we tend to like things more if we are familiar with them. We have a preference for something just because we know it. This is a bias that's also strongly rooted in survival. Things we know, things we understand, things we are familiar with create something in us that is called cognitive ease. Cognitive ease makes us feel good; it makes us feel safe in a given situation. And our brain uses this effect as a kind of dial for constant situation assessment, to make sure we are safe. If there's nothing that challenges us, nothing that looks like a potential threat or that we have to direct a lot of attention to, it will assume that the situation is okay. Cognitive ease feels good, it feels comfortable. But it also makes us think in a much more casual and superficial way.

Cognitive strain, on the other hand, happens when we encounter something we don't have any experience with. Something we don't know, something we actively have to wrap our heads around. Our brain takes this as a clue that there might be a potential threat, a problem we have to solve, and therefore it gives us a little heads-up that says, "Attention, attention. There's something you should think about." So cognitive strain makes us put a lot more effort into our thinking. We make fewer errors in this state of thinking, but it also makes us a lot slower, less creative, and less intuitive.

So. Again, preferring things we know is natural; it just happens that way in our brains, and we can't turn it off. Our brain wants us to be safe, so it seeks out situations that make it feel safe. But as with the confirmation bias, we can work around it by asking ourselves, "Is it about (audio cuts out) that we like or don't like something or someone?" A certain language, a certain framework, a certain applicant? For example, if we're thinking about the framework we're going to use, do we like it just because we're familiar with it, or is it actually the best tool for the job? Do we doubt a certain applicant for actual reasons, or just because they're not the kind of person we're used to interacting with? At the same time, we shouldn't just toss familiarity out the window. There's a point to this whole thing, because being familiar with something enables us to hit the ground running and get started on something really quickly, because we understand it.
So, the thing is, we should figure out beforehand whether this is actually the direction we want to run in, and not do it automatically just because our brain throws it at us and says, "Take this, this is the easiest way."

So what do we do about the mere exposure effect? A good way of dealing with it is, when you have a major decision to make, to set up a list of objective criteria, or at least as objective as possible, and use these to evaluate your options. Do this before you actually start looking at options, and when you're evaluating, stick to these criteria. That should help you, at least to some degree, keep your personal impressions out of the game. But once you're done with the criteria, also write down a short note about your personal impression: what do you feel about this option? That way you can combine the objective evaluation with your personal impression, and there have been studies showing that this is a very good way to make decisions that brings good results. When you decide, stick mostly to the objective criteria, and use the personal impression report as a support for the decision. If you have enough people, you can also separate the person who does the evaluation from the person who makes the final decision, because that's a way to keep the personal preferences of any one person out of the game. And if you're not comfortable with making that hard a separation, you can also just take your evaluation results, give them to a person who was not involved in the evaluation process, and get their independent opinion. Yep, that was that.
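To make the criteria idea concrete, here's a minimal Ruby sketch of what such a scorecard might look like. This isn't from the talk's slides; the criteria, weights, options, and scores are all made up for illustration. The point is only that the criteria and weights get fixed before any option is looked at:

```ruby
# Made-up scorecard: define criteria and weights *before* looking at options.
CRITERIA = {
  fits_requirements: 5, # weight: how much this criterion matters
  community_support: 3,
  team_experience:   2,
}.freeze

# scores: criterion => 0..10, filled in during evaluation
def weighted_score(scores)
  CRITERIA.sum { |criterion, weight| weight * scores.fetch(criterion, 0) }
end

# Hypothetical evaluations of two framework options.
rails  = { fits_requirements: 5, community_support: 9, team_experience: 9 }
hanami = { fits_requirements: 9, community_support: 6, team_experience: 4 }

puts weighted_score(rails)  #=> 70
puts weighted_score(hanami) #=> 71 -- the familiar option doesn't win by default
```

Keeping the personal-impression note as a separate field, rather than folding it into the score, matches the talk's advice: it can support the decision without silently dominating it.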
The confirmation bias and the mere exposure effect have strong influences on the way we personally make decisions. But a lot of the time, we also make decisions in teams. Why do we make decisions in teams or groups? There are different benefits we can get from that. One of them, for example, is that we come in with a variety of perspectives and look at the problem with a lot more information about possible alternatives and possible solutions. It also makes for better decision reliability, which means that through being in a group, you can even out and dampen the personal biases of individual people.

So let's look at a couple of cognitive biases that come into play when we're making decisions in a group, and that stop us from getting these benefits. When discussing something in a team, it's really important that we're all on the same page about the topic we're talking about. I mean, that sounds like a no-brainer, really. But I personally have been in a lot of project meetings where we all thought we had an agreement on something, and then somebody asks a question, and suddenly the whole thing dissolves into confusion, because three people find out that they were actually thinking about something completely different, while believing it was the topic everyone was discussing.
So, that happens a lot, and it's actually an example of the false consensus effect. The false consensus effect means that people tend to assume that what they think is normal, and so they overestimate how much other people think the same way they do, or how much other people agree with them.

This is due to a couple of different factors. One of them is that even though our brains are actually pretty good at people things, we often have surprisingly poor social judgment. We just get people wrong a lot. Secondly, we tend to project our own assumptions, attitudes, and opinions onto other people. This might be due in part to wishful thinking, but one of the reasons is also that our brain doesn't have any good way of looking into the heads of other people and actually knowing what they think. So our brain sneakily and silently replaces the question we're asking, "What does this person think?", with another question that's easier to answer: "What would I think if I were this person?" And then it returns the answer to the second question, but we never notice that this was not the answer to the question we asked. We tend to do this a lot more with people we perceive as similar to us, or who are members of the same group. So, mostly, our coworkers fall into that category. That means that if we don't clearly communicate our thoughts and opinions on something, everybody will think that everybody else thinks the same way they do. And that's a very clear recipe for disaster and chaos.

Something else that can happen with the false consensus effect is when you have a fairly dominant opinionator in your group: for example, someone who has a fairly senior role, or someone who is just very good at verbally leading discussions or, yeah, conversations. When they state their opinion, they'll usually do it in a very convincing way, which is, per se, obviously not a bad thing. But it might lead to a certain dynamic where other people, if they feel like they can't really go up against this opinion, or they're not qualified enough to do so, will just shut up and not say what they think. And then this dominant opinionator, and I'm not saying this because it's something bad, it's just a phrase to describe a certain communication style, will assume that everybody else thinks the way they do. And that way, because nobody's speaking up, decisions get made that do not actually reflect the opinion of the team.
So what can we do about the mere… excuse me, about the false consensus effect? Be explicit, obviously. When we call a meeting for a discussion or a decision, be absolutely explicit about what it is about: (audio cuts out) what is the topic, what is the goal, what are we going to talk about, and what is the decision we're going to make? That way everyone is on the same page and knows what we're talking about, so we can go in the right direction right away. Second, encourage questions about the topic before you start discussing; that goes hand in hand with what I already said. And third, collect opinions first. Before you start a discussion, before you get into this dynamic where one person's opinion drowns out everyone else's, let everyone write down their opinions on the topic at hand, on little pieces of paper or sticky notes or something. That way, you can collect everybody's ideas and opinions without having them run through the group consensus first, and then you can take each one of the points that came up, discuss them, and make sure that everybody's opinion gets heard.
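Here's a minimal Ruby sketch of that sticky-note technique, again not from the talk: the Meeting class and the opinions are hypothetical. Everyone submits before anyone speaks, and the reveal is shuffled and anonymous so no single voice anchors the group:

```ruby
# Sketch of "collect opinions first": submissions happen privately,
# before any discussion, and are revealed all at once.
class Meeting
  def initialize(topic)
    @topic = topic
    @opinions = []
  end

  # Each member submits privately (sticky-note style).
  def submit(member, opinion)
    @opinions << { member: member, opinion: opinion }
  end

  # Reveal all opinions at once, shuffled and without names,
  # so the group discusses ideas rather than their authors.
  def reveal
    @opinions.map { |o| o[:opinion] }.shuffle
  end
end

meeting = Meeting.new("Which background-job library should we adopt?")
meeting.submit("alice", "Sidekiq: we already run Redis")
meeting.submit("bob",   "GoodJob: one less moving part, uses Postgres")
meeting.submit("carol", "Keep cron + rake tasks for now")
meeting.reveal.each { |opinion| puts "- #{opinion}" }
```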
There's something else that can happen in groups that can severely undermine decision-making, and it's called groupthink. Groupthink means that to preserve the harmony of the group, or the conformity in the group, members of this group or team will try to minimize conflict and reach a consensus without critical evaluation of alternative viewpoints, or even by suppressing differing opinions from inside or outside the group.

Inside the group, this can lead to things like people very quickly adapting their own opinions to what they perceive as the majority opinion of the group, or the opinion of a leading member of the group, a senior developer, for example. It can also take the shape of people actively or unconsciously suppressing differing opinions and discarding them really quickly if somebody brings them up. So what is perceived as loyalty to the group actually makes people not bring up controversial issues or challenge opinions that have already been established. This generally leads to a very sharp decrease in individual creativity and in critical, independent thinking, and it has a very negative effect on the decision-making of the group.

When it comes to the opinions of non-group members, groupthink can start out causing things like simply not getting any input or feedback from people outside the group, consciously or unconsciously, all the way to actively or semi-actively trying to bar outside influences on the group. An example of the semi-active approach is something you might have heard before: when people say things like, "Well, they don't really understand how we do things," or, "They don't know as much about this as we do," or any variation of that.
Again, this has negative effects on the group's decision-making, because it creates an echo chamber where only the group's own consensus, or artificial consensus, is reflected back at them, and nothing else ever gets in. Ironically, even though it provably worsens the decision-making of a group, groupthink actually makes the group's members feel a lot more confident that their decisions are right, and that the quality of their decisions is very good, because it creates a feeling of belonging together, of group cohesion and invincibility within the group. Groupthink is a dynamic with an evolutionary background as well, and its purpose is, in fact, to create cohesion within a social group and to avoid infighting, which is important for survival.

There are three factors that play together to produce groupthink. The first is high group cohesiveness. If your group doesn't feel like it belongs together, you won't have an issue with groupthink at all, because you don't have a cohesive group. The thing is, high cohesiveness alone does not necessarily lead to groupthink; at least one of two other factors needs to be present to create this dynamic. One of them is structural faults: for example, insulation of the group, where it's really isolated and doesn't communicate with a lot of outside people, or a very homogeneous group setup. If you have a group where every single member is very similar in their background, in where they're from, what they're like, their opinions, that very strongly encourages groupthink. The third factor that can have some influence is situational context. Things like perceived outside threats that feel highly stressful to the group, or recent failures of the group, tend to encourage groupthink. So that's actually good news for us, because we really do want cohesive groups; we just want them without the groupthink effect.
So, what can we do to counter groupthink, to make our groups work together in a way that's cohesive, but not like sheep? First of all, a cohesive but diverse group starts you out with a really good range of different viewpoints and opinions. So try to form teams that are diverse and not homogeneous, where people come from different backgrounds, from different demographics, with different viewpoints and different experiences.

Also, oh yeah, I have a slide for that. Also: encourage critical evaluation. You have to try to build an atmosphere that actually encourages people to voice their opinions and to evaluate ideas that come up in a critical way. Because, you see, if you have an environment where people feel that saying something that goes against the mainstream of the group will be frowned upon, or punished in some way or other, they won't do it. Why would they? It would only have negative consequences for them. So try to build an environment that encourages critical thinking and the expression of personal opinions in a constructive way.
If you're a leading or fairly senior member of the group, you might want to think about not opening a decision-making discussion by stating your own opinion on the matter. That way, you're very prone to priming your team to stay at least in the general area of your opinion, instead of asking themselves, "What do I actually think about this topic?" So they might just adapt to what you think and not bring in their own ideas. The sticky-note opinion collection technique I've already described is very helpful for this as well. And this doesn't mean that you can't state your opinion if you're a group leader or senior member, or that you shouldn't take part in the decision-making. It just means: don't state your opinion first. Let the other people talk first, let them bring in their opinions, and after that, bring in yours.
To avoid creating an echo chamber, actively invite outside experts or outside people into the group. Let them state their view on things, and then have your group members actively discuss the topic at hand with these people. In general, encourage your team members to discuss the group's ideas with trusted people outside the group, to get feedback that is exactly not from within the in-group's echo chamber.

And last but not least, think about appointing a devil's advocate when you're making important decisions. Make this an actual role in the team: one person on the team is responsible for taking a critical stance against every idea that comes up and questioning it, in a constructive way, obviously. The point of a devil's advocate is not to shoot down everything other people say; it's a way to institutionalize critical thinking in your group. Just make sure that the devil's advocate is a different person every time you have a discussion, because otherwise you end up with one member of the group whom the others really resent over time, because they constantly keep shooting down everyone's ideas. (audience chuckles)
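A trivial Ruby sketch of that rotation, with a made-up team; any scheme works, as long as the role actually moves every meeting:

```ruby
# Hypothetical team; rotate the devil's advocate role by meeting number
# so resentment never piles up on one person.
TEAM = %w[alice bob carol dan].freeze

def devils_advocate(meeting_number)
  TEAM[meeting_number % TEAM.size]
end

puts devils_advocate(0) #=> alice
puts devils_advocate(1) #=> bob
puts devils_advocate(4) #=> alice again, one full cycle later
```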
And remember this: when we all think alike, then no one is thinking. If somebody says something in a decision-making process and everybody just agrees without any further discussion, you should immediately become suspicious and think, "Why is this happening? Could this be a case of groupthink?"

Alright. That's been a lot of information so far, so let's briefly recap the cognitive biases we've looked at. We've looked at the confirmation bias. This is a bias that influences our personal decision-making, and it causes us to search for or interpret information in a way that confirms the opinions we already hold. The mere exposure effect also influences our personal decision-making processes; it means that we tend to like things more just because we're familiar with them. Next is the false consensus effect. This works in team decision-making, and it says that we tend to overestimate the degree to which other people agree with us or think like us. And then groupthink, where people, in an attempt to preserve group harmony, try to minimize conflict and find a consensus without actively evaluating alternative viewpoints, or even by suppressing differing opinions.
I have one more thing that I would like to mention, and I think it's a very important one, because if we don't understand this, there's no point in starting to think about working around cognitive biases at all. And it's this: it's okay to change your opinion based on new or updated information. (audience claps) (laughs) (audience keeps applauding) I mean, it really sounds very obvious, but if we're honest with ourselves, a lot of the time we hold onto opinions we've already formed out of pride or ego, because we really, really don't like admitting that we might have been wrong about something. In fact, re-evaluating and updating opinions based on new information and new facts is not shameful at all, and it's not a sign that you can't make up your mind. It's, rather, one of the only ways we can get closer to thinking and acting rationally.

Daniel Kahneman is a psychologist who has done a lot of great work in the field of cognitive bias. He wrote an awesome book, and I definitely recommend you read it. You can ask me for the title after the talk if you want to. (chuckles softly)
And Daniel Kahneman says, "Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance." You see, we can't really stop cognitive biases. They are hard-wired into our brains; they're just there. And it would probably not even be a good idea to try to stop them, because that would interfere quite a bit with the way our brain works. So there's no way for us to become completely rational creatures. But what we can do is try to chip away at ignoring our ignorance. We can start to learn about cognitive biases and about the situations in which they happen. And then we can recognize them, and use techniques to counter or circumvent them when appropriate. That way, we become better at understanding ourselves, and thereby also at understanding others. We become better team workers. We become better decision-makers. And we generally become better, and more successful, at what we do.

So in that spirit, let's all start working on being less ignorant about ourselves. I hope my talk has given you a good start for that. Thank you for listening. (audience applauds loudly) (upbeat, electronic country music)
