Philippe Delquié is Associate Professor at George Washington University School of Business. He’s also a regular speaker at the Berlin School, taking our EMBA participants on a journey through the complex and often uncertain world of Decision Making. We spoke to him about the importance of viewing ambiguity with an open mind, the creative capacity for leaders to use data to feed new solutions and ideas, and the responsibilities this ‘big data’ access brings to consumers and big businesses.
Can you give us some classroom insight into the world of ‘Decision Making’?
It’s about individual decision making, how to handle the complexity of business decisions, and how to handle uncertainty and ambiguity. What makes decisions very difficult and potentially paralyzing for people is the presence of uncertainty in the outcomes of their decisions. This is particularly true in the business world, where most investments are very uncertain. Essentially, we look at decision making not just from a numbers point of view, but from a cognitive and psychological point of view. We look at the biases that we bring to decision making, the evaluations we make, and the way we tend to adopt a very narrow view when approaching decisions. We investigate how we can increase and expand the framework in which we view our options to make sure we’re solving the correct problem, and we examine how good we really are at appraising uncertainty – we all have cognitive limitations that we can overcome with proper awareness and training.
Over the last decades, factors such as digitization and globalization have changed the way business leaders and entrepreneurs look at their futures. What disruptions and trends have impacted the way that leaders manage risk now versus ten years ago?
One of the big things that’s happening (and that’s not going to stop) is what we call ‘big data’. We see that it’s not only disrupting things but also enabling business opportunities that were not possible before. Artificial intelligence and big data are poised to affect our ability to make timely decisions and to put other decisions on auto-pilot, which frees up the human brain for other exercises and interpretation. In the future, there’s only going to be more data and processing capability. A.I. can do a lot of things for us, but the key thing to remember is that human judgement will always be needed and won’t be eradicated. The boundaries will be pushed, but human nature will always be part of things. And that’s why we need to be ever more well-trained and vigilant about how we use our limited attention and our judgement skills. Our critical evaluation will become even more important. Another thing we hear a lot about is the advent of Blockchain technology, which will essentially change the way that banking and a lot of administration are done. With computational and distributed power, there’ll be less need to trust a single organization like a government or a bank to do things. Systems will become more secure because they don’t rely on the simple idea of a vault that people might try to break into. Hacking a Blockchain system will become so much more costly than the benefits you can get from it that it will be inherently more stable and sustainable. But we are still just at the beginning of that, and people are still arguing whether the current technology, which is very costly in terms of data and computational resources, will be ‘the future’. But the concepts clearly seem to be gaining traction.
For businesses around the world, the upsides of big data are pretty clear. But from a consumer perspective, there’s also a lot of mistrust. Can you elaborate on any of the downsides and the responsibilities that come with big data capabilities?
I’m not an expert on the technology of big data, but the wider issue of responsibility is hugely important. Users will have to be more sophisticated, more demanding and more discerning about the way their data is used. People tend to rely on default options – this is part of the narrow framing we explore in decision making. Whether the default is ‘check the box if you want to be in our database’ or ‘check the box if you don’t want to be in our database’ makes a difference. A great example is Facebook and the recent stories about Cambridge Analytica, where a lot of data was used unknown to the users, and also, to some extent, unknown to Facebook itself. Essentially, by default, people’s data and anonymity were not protected. And this kind of framework will have to change. Companies will need your consent to use your data, as opposed to the other way around. It will create challenges but, again, there are benefits. This is just a hunch, but I feel that we can trust that people – users – will become more sophisticated, and that they will know where to pull the brakes. At some point there will be a limit, not just for privacy reasons but also because of individualization. Users will rebel and say, ‘I don’t want to be profiled this way.’ People inherently want to be more than just a profile in a computer. They want to be seen as human and complex.
In teaching uncertainty, what advice do you give to left-brain or right-brain leaders who might be intimidated by either its mathematical or emotional aspects?
This is actually an interesting challenge. There are three places where decisions are made – the head, heart, and gut. It’s a question of balance. You should not expect all the answers to come from calculations and mathematics. If you do that, you create resistance. People will not trust you, or they’ll be turned off by it because it’s not their cup of tea. As a leader, I think what’s important to convey is that using analytics is not contradictory to being creative. I want to open up people’s minds and ideas to the fact that sometimes they’ll have to trust their gut and use their hearts to drive their organizations. But if they can better use their brains – the more computational and mathematics-based extension of the brain – it cannot be harmful. The idea is not to make us robots. As Einstein said, ‘Not everything that counts can be counted, and not everything that can be counted counts.’ We need to appreciate that we can get deeper insights and accept that what we know intuitively may or may not be true. Challenging our intuitions with numbers and hard facts can help us to generate insights that we wouldn’t have had otherwise.
Are there any tools or authors that you’d recommend to a creative-minded business leader looking to expand their view on decision-making?
When it comes to visual tools (and this is very relevant to big data), there’s lots of work being done at the moment on visualization, asking what the best way is to convey vast quantities of information. We’re all familiar with bar charts, graphs, pie charts and so forth. These are simple ways of displaying data that we see in the newspaper all the time. But there are ever more clever ways to do things when data becomes complex. If you’re interested in digging into analytics, there are basic textbooks that teach you how to use Excel to do this and that, and it’s more than just ‘Excel for Dummies’. You can surprise yourself with the kind of behind-the-spreadsheet analysis you can do with Excel. ‘Management Science: The Art of Modeling with Spreadsheets’ by Stephen G. Powell is a book I often recommend to my MBA students. Then, of course, there is all the literature on Behavioral Economics and Behavioral Science as they apply to business, finance, and economics. For example, there’s Daniel Kahneman’s ‘Thinking, Fast and Slow’, and there’s the work of Richard Thaler, who was awarded the Nobel Prize in Economics in 2017 for his work on Behavioral Economics. He has a famous best-seller called ‘Nudge’: how do you get people to do things that are in their best interest without patronizing them or being heavy-handed? It uses the ideas of Behavioral Economics and psychology and asks how we can design the choice architecture so that more favorable outcomes are obtained. We understand that, because of human nature, people will not necessarily do things simply because they are told to. Take safety, for example. We are all familiar with taking shortcuts even at our own risk. Can we design the choice architecture so that people do not hurt themselves, and design devices to be robust to people’s shortcomings?
The Berlin School classroom is unique for its diverse talent and backgrounds. As a regular faculty member, can you tell us what it’s like to stand in front of the EMBA class?
The EMBA program is unlike any other program I’ve been affiliated with. It’s no exaggeration to say that each class at the Berlin School is totally different from the typical class you might meet in Executive Education. It’s such a refresher for me and I really enjoy engaging with them. Even just the way they dress and look! They are not your typical business types. Their attitudes are fantastic. There’s a special dynamic. I’ve been coming back to the Berlin School for over four years, and my course always takes place at the beginning of the module. Participants have just arrived from all over the world. Some of them are reconnecting from previous modules and some are meeting for the first time. It’s amazing to see how eager they are to share. Teaching this class forces me to revisit my approach. I can’t come in heavy-handed and dive straight into analytics, nor should I. It makes me explore new ways of conveying information and really think about the scope of Decision Making and its application, and the value of what I have to teach.