Unlearn management wisdoms

It’s time to unlearn zombie management wisdoms.

The replication crisis in social psychology has highlighted how many of the established theories we rely on in curricula and organisations have been misappropriated, discredited, or debunked, or are simply no longer appropriate to their context as the world moves on.

This project was supported by a CEMS Innovation Grant.


Unlearn digital natives

Introduction

For many people, using computers and related technologies feels like learning another language. In fact, computer code itself is written in many languages! And as with languages, there are people with native proficiency and people who have learned to use technology later in life. In 2001, Marc Prensky proposed that ‘digital natives’ who grew up with digital computing technologies have a different brain structure to ‘digital immigrants’, who learned to use digital technology later in life. Consider how intuitive it is for a 3-year-old to pick up an iPad and navigate: could your grandmother do the same? Perhaps not with the same comfort: Prensky would say she’s a digital immigrant. This has profound implications for the way we work and deliver training. Prensky argued that digital immigrants learn in a completely different way – analogue, slow, focused – whereas digital natives are able to multitask and split their focus across several screens, tabs, and applications. Accordingly, it’s important for us to design work (and training) around these different capabilities…or is it?

What’s the evidence?

Much of Prensky’s original work in support of the digital natives hypothesis is based on assumption and anecdote, with some reference to neuroplasticity research. However, by 2011 he had started to walk back the suggestion, made between 2001 and 2005, that digital natives’ brains had physically changed. Instead, Prensky has argued that the ‘digital natives/immigrants’ metaphor is simply a useful tool for thinking about how people are socialised into online cultures. More recently, as digital technology has become ubiquitous and a routine part of daily life, the idea of the digital native has become redundant: most people in the workforce today are very comfortable with digital technologies. Recent research has challenged the idea that digital natives exist at all, and suggests that designing training around assumed multitasking abilities can actually hinder learning.

Why unlearn digital natives?

If we could stick to the idea that ‘digital natives’ is just a metaphor, it might be a harmless management wisdom. After all, acknowledging that some people aren’t as comfortable with technology might spark conversations about how to build those skills. But that shouldn’t extend to how we develop training or assign work. Human brains look the same now as they have done for about 10,000 years – what has changed are our cultural practices. Work cultures that embrace multitasking and distraction because ‘digital natives’ can handle it are likely to prevent deep work and focus. There is also a risk that hiring practices become biased and discriminatory if managers hire on age as a proxy for experience with digital technologies. The idea of ‘digital natives’ implies a generational divide in the use of these technologies; however, the ability to use digital tech is something anybody can learn, and it cuts across all age groups.

Unlearn power poses

Introduction

Sit up straight, stand tall, and show what it is to radiate confidence and power. Good posture and a few hand gestures can convey the charismatic qualities we associate with a strong leader. Not only that, power poses can also reduce stress levels and build a degree of self-confidence in the power pose user, which can be perceived by their audience. What’s more, these small non-verbal gestures can generate increased social capital over time and improve performance in job interviews. Amy Cuddy, the researcher who popularised power posing, highlighted how something as simple as putting your hands together to form a steeple, plus a few small adjustments to body language, can signal to your colleagues that you are in control of the situation… But how true is any of this?

What’s the evidence?

Some studies have noted that power poses reduce cortisol and increase testosterone, hormones associated with feelings of stress and confidence respectively. Other studies have suggested that power poses improve mood and induce an increased feeling of self-efficacy. However, this apparent evidence may be unreliable, with the most promising findings failing to replicate; larger studies reject entirely the premise that poses affect stress levels, behaviour, and feelings of confidence. Dana Carney, the lead author of the original power pose study and now at Berkeley, has since disavowed the effect: in a public statement she set out the shortcomings of the supporting research, said she no longer believes power poses work, and encouraged others to abandon the idea as well. So, why do some leaders hold onto a belief in power poses when one of the researchers behind the theory has unlearned the practice herself?

Why unlearn power poses?

Is there any harm in power poses even if they don’t work? What are the risks of reinforcing an unsubstantiated practice? Power poses promote “fake it ‘til you make it” behaviours, which risk increasing the emotional labour of work and reducing the sense of authenticity. At the same time, training and development programs are expensive, and paying for one that teaches unsubstantiated practices has little value but significant opportunity cost. Instead of focusing on what confidence looks like, management may want to put more effort into cultivating what confidence is. Shifting the focus to alternative self-efficacy and confidence-building practices is likely to be a more fruitful way to improve individual performance, while at the same time unlearning power poses.

Unlearn the 80-20 rule

Introduction

It is the little things we do that truly matter. The 80-20 rule, or “the Pareto principle”, named after the Italian economist Vilfredo Pareto, has been a staple of contemporary management practice – the idea being that 20% of the work drives 80% of the outcome. The rule is frequently employed by management to allocate time effectively, maximise efficiency, and encourage strategic thinking, emphasising how small, focused actions can generate disproportionate outcomes. Having an 80-20 mindset encourages managers to focus on what is important. After all, we only have so many minutes in a day, so identifying which actions have the biggest impact keeps us at our most productive. But how true is this rule in practice?

What’s the evidence?

Originally, the 80-20 rule stems from an observation Pareto made in the late nineteenth century: that roughly 80% of the land in Italy was owned by 20% of the people. This conception of the rule has since drifted in meaning, with the statistical ‘rule’ being spotted across a range of domains. A classic example is the adage that “80% of profit comes from 20% of customers”. In fact, an 80-20 pattern can be teased out of nearly everything, and some studies have suggested that data can almost always be tweaked to fit it. The ratio itself is flexible too: there are also 90-10 rules, 70-30 rules, and 60-40 rules, reflecting different patterns and situations. So while there are frequently statistical correlations consistent with the Pareto principle, those correlations don’t necessarily recommend a particular course of action. In addition, research indicates that some domains where the 80-20 pattern was once most visible – such as retail sales, where niche products now account for a growing share – increasingly show ‘long tails’ in their statistical distributions.
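To make the statistical point concrete, here is a minimal sketch in Python (the lognormal ‘customer revenue’ data, the random seed, and the skew parameter are illustrative assumptions, not figures from any study). It generates a skewed distribution and then reads off whatever ‘vital few’ split you care to look for:

```python
import numpy as np

# Illustrative only: simulate a skewed "revenue per customer" distribution.
# A lognormal is a common stand-in for this kind of heavy-tailed data.
rng = np.random.default_rng(seed=42)
revenue = rng.lognormal(mean=0.0, sigma=1.5, size=10_000)

# Sort customers from largest to smallest contribution.
revenue_sorted = np.sort(revenue)[::-1]
cumulative_share = np.cumsum(revenue_sorted) / revenue_sorted.sum()

# What share of total revenue comes from the top 20% of customers?
top_20 = int(0.2 * len(revenue_sorted))
print(f"Top 20% of customers account for {cumulative_share[top_20 - 1]:.0%} of revenue")

# The same data can be read as a 90-10, 80-20, or 70-30 'rule' just as easily.
for top_share in (0.1, 0.2, 0.3):
    k = int(top_share * len(revenue_sorted))
    print(f"Top {top_share:.0%} -> {cumulative_share[k - 1]:.0%} of total")
```

The exact percentages depend entirely on how skewed the simulated data happens to be – which is the point: the split describes a distribution after the fact; it is not a recipe for what to prioritise.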

Why unlearn the 80-20 rule?

The 80-20 rule appeals because it gives managers permission to focus, and that instinct is not in itself harmful. The problem is treating a descriptive, after-the-fact pattern as a prescription. Because some version of the split can be teased out of almost any skewed data, the ‘rule’ can be used to justify nearly any prioritisation decision in hindsight, while telling us little about which 20% actually matters or what to do with the remaining 80%. And in domains where long tails are growing, writing off the ‘trivial many’ – the smaller customers, products, and tasks outside the vital few – means writing off where much of the value increasingly sits.

Unlearn growth mindset

Introduction

When something goes wrong, do you see it as a fault, or a learning opportunity? For many people, their approach to this question can define the opportunities that are available to them in life. Carol Dweck’s best-seller, Mindset, argues that there are two fundamental mindsets people bring to learning: fixed mindsets and growth mindsets. People with a fixed mindset view talent as a fixed, innate gift that generally cannot be changed. People with a growth mindset believe that talent can be developed in everyone and see failure as a learning opportunity. Dweck has argued – with support from Microsoft’s Satya Nadella – that organisations which embrace growth mindsets are more successful, forward-thinking, and open to possibilities. Perhaps most significantly, Dweck links the idea of a growth mindset to productive outcomes: it is not enough to believe you can achieve anything, you have to actually follow through!

What’s the evidence?

While there is some evidence of links between mindset and achievement, many scholars now argue that the claims made by the theory are far stronger than the research base can support. Perhaps more significantly, real-world studies have not produced strong evidence that a growth mindset actually leads to improved outcomes. The few studies that have been conducted indicate a weak or negligible impact – far less than the impact of other major traits such as intelligence and personality, or of related educational concepts like self-efficacy (a person’s belief in their ability to complete a task). Self-efficacy, the focus of significant early research in educational psychology, has a much longer and larger evidence base than growth mindset. A number of meta-analyses show that educational interventions focused on mindsets have such a small effect size compared to other interventions that they are barely worth using. While they may be cheap and easy to implement, and can have an impact for specific groups such as students from low socio-economic backgrounds, alternative interventions can have a far greater impact.

Why unlearn growth mindset?

Is the ‘growth mindset’ really that bad? That depends on how it is deployed within an organisation. For some companies, buying consultancy services such as intervention programs, or investing in growth mindset research, can involve significant capital that might be better directed to higher-quality programs. If there is a widespread cultural push for having a ‘growth mindset’, colleagues who feel they have a fixed mindset – and fail to recognise that people can shift between the two mindsets in different situations – may feel they will never succeed, running counter to what self-efficacy research would recommend. Similarly, colleagues facing specific difficulties, such as disability or environmental factors outside their control, may feel they are personally responsible for those circumstances. The opposite may also be true: people who believe they have a growth mindset may fail to recognise the real limits that will prevent them from achieving outcomes. Organisations which subscribe to mindset theory may find leaders attributing success or failure to mindset when other, more complex factors are at play.

Unlearn disruptive innovation

Introduction

Everyone likes a good underdog story, and in the world of business there are plenty: Netflix usurping Blockbuster, or Uber shaking up the taxi industry. These are disruptive innovations: an underdog disturbing existing markets after entering from below. Sometimes these innovations are dismissed as a novelty, only to transform the status quo down the line. Described by Clayton Christensen in 1995, disruptive innovation refers to an innovation that disrupts incumbent players in an existing market by introducing a transformative product or approach that is initially cheaper and of lower quality but eventually catches up to the incumbent – who, meanwhile, makes the mistake of overshooting average customer needs by ‘gold-plating’ its product (the innovator’s dilemma). A new technology, say digital animation, allows a product to be made by different, cheaper means (cheaper than traditional hand-drawn animation). Initially the result is not quite good enough (early digital animation lacked warmth and character), but over time it improves and disrupts the incumbent player. Disruptive innovation has underpinned the rise of the Silicon Valley startup and has enjoyed buzzword status for years. But why is the concept so attractive, why does it carry so much weight, and – given that it feels intuitively true – what is the evidence to support it?

What’s the evidence?

Christensen uses many examples to illustrate his theory, and it makes intuitive sense, especially to the many start-ups who want to be the next disruptor, or to the many incumbents who fear such disruption. The theory is attractive, but it has at least three main issues. First, it lacks a good empirical basis. Second, it is too narrow in its formulation. And third, it misses what is most disruptive about innovations. King and Baatartogtokh (2015) traced 77 of Christensen’s examples and found that only 9% of cases showed all of the features the theory requires for a case to count as ‘disruptive’. In fact, many incumbents did not fail (like Disney, which simply purchased the challenger Pixar). Additionally, many disruptions do not come from below, such as the iPhone – which Christensen therefore argued was not a disruptive innovation. And finally, Christensen misses that truly disruptive innovations are different in kind and transform what counts as quality in a market. For example, while digital music has lower sound quality than the CD, what matters today is accessibility and portability, not the high-fidelity ideals of the 1990s. This is why many incumbents are unable to react to emerging disruptions – another characteristic the theory requires.

Why unlearn disruptive innovation?

‘Disruptive’ and ‘innovation’ are words we throw around without too much thought. But the story of disruption is too simplistic. No single company can disrupt a market on its own. True disruptions are often the result of many factors and developments accumulating over long periods, such as the internet or computing technology. And the biggest disruptions often come from large corporations (Apple was behind two major disruptions, in the music and phone markets). The concept also puts undue blame on incumbent managers. For example, neither Kodak nor Fuji could have seen the slow move to digital photography coming: the shift is obvious only in hindsight. Businesses need to be mindful of the future, but ultimately nobody can predict where the next disruption will come from. The better strategy for incumbents is to focus on innovation and organisational responsiveness, so that they can react to change when it occurs.