The Myths of Disruption and Change


…Or why in business we need a more sophisticated approach to appreciating the impact of digital technologies

When we think about the impact of digital technologies on the modern business environment there’s a prevailing narrative around disruptive new market entrants and accelerating, exponential change. This is often expressed in talk of the ‘Uberisation’ of entire industries, or in catchy soundbites like: ‘the pace of change has never been this fast, yet it will never be this slow again’. Or demonstrated with graphs like Nicholas Felton’s for the New York Times, which shows how technology consumption is spreading faster than ever (the telephone took decades to reach a penetration of 50% of U.S. households, yet the mobile phone took less than five years). Or supported by studies such as that conducted by Professor Richard Foster at Yale University, which showed that the average lifespan of a company in the S&P 500 index has decreased from 61 years in 1958 to just 18 years now (although the detail is arguably more nuanced). Yet the reality is perhaps not as clear cut as this picture suggests.

The Myths Around Disruption

Given all the column inches and social media mentions dedicated to disruption you’d be forgiven for thinking that it was all around us. Yet the latest IBM Global C-suite Study (based on research with more than 12,500 CxOs worldwide, including 2,000+ CEOs) paints a more refined picture of the reality of disruption, and more specifically of where the C-suite now sees its source.

To begin with, disruption is far from ubiquitous. Over a third of executives (36%) reported minimal or no disruption in their sector, and only just over a quarter (27%) reported that they were experiencing significant disruption. In a previous iteration of the survey in 2015, a majority of executives reported that they were expecting significant disruption from new market entrants outside their sector, but in the latest survey only 23 percent of respondents reported that this was the case, and 72 percent said that it was industry incumbents that were leading the disruption. Whilst the impact of new market entrants should never be underestimated (I certainly wouldn’t want to be a CEO when Amazon entered my market), it’s unhelpful to assume either that disruption is everywhere or that it is solely the domain of innovative startups.

The Nuance of Change

It’s true, I think, that many things around us are changing quite profoundly, and faster than ever before. But not everything changes. Narratives about accelerating change are far from new, but it’s lazy thinking to assume that acceleration applies in all contexts. More than this, it can be dangerous if it causes us to be distracted or overly tactical, or to pursue shiny technologies of questionable value. As Jeff Bezos once said:

“I very frequently get the question: ‘What’s going to change in the next 10 years?’ And that is a very interesting question; it’s a very common one. I almost never get the question: ‘What’s not going to change in the next 10 years?’ And I submit to you that that second question is actually the more important of the two — because you can build a business strategy around the things that are stable in time…”

Bezos (of course) talks about how fundamental customer needs like access to a great product range, low prices, and fast delivery remain constant. Fundamental needs change slowly, if at all. That affords the opportunity to invest energy and focus in areas that you know will pay dividends over the long term. Essential elements like these should be the guiding North Star for any business. Yet the way in which we deliver against these fundamental needs may indeed be susceptible to swift change.

So in order to make smarter decisions in business about how we respond to this environment we need a more nuanced understanding. Here are a few thought starters from me on the specific dynamics that actually have changed, and are changing quickly and profoundly:

Capability Access:- the democratisation and widespread availability of new and potentially transformational enterprise services means that even the smallest startup now has access to scaled capability that was once only the domain of large, well-funded businesses. Barriers to entry have been vastly reduced. For example, with the right approach any business, regardless of scale, can access some of the best digital infrastructure technology, analytics tools and open-source machine learning capability currently available.

Access to knowledge and expertise:- the democratisation of information has fundamentally changed the value dynamic. The value of specialist expertise has not diminished, but when we can find the answer to pretty much any question on Google, and harness internal and external expertise more easily than ever (and even access some of the best teaching online for free), advantage shifts from being focused on the stocks of knowledge that we have built up in the business over time to being more about the flow of ideas and knowledge, how we apply it, and what we choose to do with it.

Data:- an elemental part of this shift in the information dynamic is of course the ability to access, filter and utilise the wealth of data that is now being continually generated. This is clearly not news, but businesses need to get better at extracting value from data: structuring good-quality data, interpreting patterns and meaning, and originating processes that can execute against actionable insights quickly. We’ve yet to scratch the surface of how machine learning will take this to a completely different level.

Networked value and connectivity:- with the explosion in the connection of things and people, network dynamics have changed some fundamentals of how we should think about value creation. This is writ large in a diverse set of impacts: the development of platform business models and ecosystems, which have changed competitive dynamics considerably in some sectors; a radical shift in the flow of data through APIs; the operational efficiency gains that can be derived from connected machines; the rapidity with which ideas and content can spread through networks; the eroding of traditional barriers like geographical borders and market boundaries; and the ease with which collaboration can happen.

Lowering transactional costs:- digital has completely changed the cost dynamic in many areas of value chains, reducing key elements of some chains to zero marginal cost and enabling dramatic changes in efficiency and new entrants to compete at relative scale from a small base. Growing automation will continue to generate opportunity and impact here.

Scaling dynamics:- digital networks have brought with them a dramatic shift in scalability that gives individual people access to a global market, small teams the ability to originate and scale transformational ideas, and businesses with finite resources the means to have disproportionate impact. Pre-digital, it would potentially take decades for a business to expand to a global scale, yet in a little over six years Netflix was able to complete an international expansion that took it into no fewer than 190 countries worldwide.

Customer expectation:- whilst some areas of consumer behaviour are changing rapidly, more fundamental needs are arguably not. But customer expectation is changing fast, and changing all the time. Services like Lemonade, Monzo, Revolut, Netflix, Uber and Amazon set a new bar for customer experience that raises expectations (not only in category but more broadly across sectors) of how easy to use and convenient services should really be. This is a significant challenge for businesses of all types, but also an opportunity. As Jeff Bezos said in his 2018 annual letter to shareholders:

“One thing I love about customers is that they are divinely discontent. Their expectations are never static — they go up. It’s human nature.”

Bezos describes how the cycle of improvement required to serve customers’ appetite for better solutions is happening faster than ever, but the phrase ‘divinely discontent’ shows that the real opportunity is to use continually rising customer expectation to challenge your teams to do it better or do it differently.

Accelerating complexity:- in spite of the promise of technology to simplify, the reality is that it also creates growing inter-dependencies and problems with competing ecosystems that result in poor inter-operability and unnecessary friction. Whilst I’d love to believe that this is a temporary situation, my feeling is that this increasing complexity will remain a reality for years to come.

So if that’s what is changing rapidly, how should businesses respond?

Responding to this More Nuanced Picture of Change

It’s clear that organisations need to think smarter about potential sources of disruption, but also about how they can shape themselves to be more adept at responding to it when it happens, or even before it happens. Increasingly, I find it useful to think about the impact of change in terms of heightened unpredictability, and in terms of the ability of businesses to re-orient themselves towards becoming far better at rapid adaptation.

If you can’t easily predict how market, competitive and consumer dynamics will change in a world where some of them may change quickly and at scale, then the organisation needs to always be exploring, learning and inventing. This means evolving structures, strategies, processes and culture towards enabling continual reinvention of value. The reality however, is that most businesses are still structured and organised for a very different world.

It was notable that the IBM C-suite study that I mentioned earlier found that the most successful businesses in the study were those that weren’t waiting for disruption to hit before making change happen:

‘The organizations that are prospering aren’t lying in wait to time the next inflection point — the moment when a new technology, business model or means of production really takes off. Remaking the enterprise, they recognize, isn’t a matter of timing but of continuity. What’s required, now more than ever, is the fortitude for perpetual reinvention. It’s a matter of seeking and championing change even when the status quo happens to be working quite well.’

Cluster analysis of the research outputs led IBM to categorise three main types of organisation according to where they were on their journey towards reinvention:

The Reinventors: 27% of the total, these are the standout businesses that are successfully re-engineering their businesses to lead the way in innovation and disruption and outperforming their peers in revenue growth and profitability

The Practitioners: 37% of the total, this represented those businesses with big ambitions (notably to take on more risk or to launch new business models) but yet to develop the real capability to bring those ambitions to life

The Aspirationals: comprising 36% of the total, these organisations still have some way to go in their digital journey and in changing their companies to be able to move rapidly to adapt or capitalise on new opportunities

Looking at the organisations that are more advanced on the journey to reinvention, and at the incumbents that are successfully leading disruption, some consistent attributes emerge. IBM classify a range of factors that separate the ‘reinventors’ from the ‘practitioners’ and the ‘aspirationals’, most notably:

  • Continuous adaptation and the ability to evolve rapidly alongside a well-defined strategy to manage disruption
  • Strong alignment between IT and business strategy in order to deliver the technology infrastructure and foundation to optimise business processes and support new strategies
  • Redirection of resources towards deriving new scaled value from ecosystems and networks of partners, a willingness to explore opportunities for co-creation with partners and customers
  • An ability to derive exceptional value from data and analytics to inform business strategy, to support prototyping and fast feedback loops, to successfully iterate innovative products and services, and to build compelling customer experiences
  • Investment in and attention to developing people and leadership skills, structures and culture to support and empower greater experimentation and adaptiveness

These ‘reinventors’ are demonstrating the way in which incumbent organisations can not only learn but apply that learning to adapt capabilities, structures and ways of working to the new environment.

The really big change here is what John Hagel and John Seely Brown at Deloitte describe as the shift from ‘scalable efficiency’ to ‘scalable learning’. As institutions have become more adept at leveraging the benefits of scale they have structured around consistency, stability and predictability which has forced a trade-off between efficiency and the organisation’s ability to learn. When key dynamics such as those listed above change rapidly, the business struggles to adapt.

I’m curious to know what I’ve missed in my list so feedback and contributions are more than welcome.

For more like this, order your copy of Building the Agile Business Through Digital Transformation, or you can join our community to access exclusive content related to the book.

Moving Away From Strict Hierarchies


There’s been a reasonable amount of attention over the last few years devoted to questioning the value of strict hierarchical structures in business, and to asking whether, particularly in the context of today’s digitally empowered and networked operating models, this traditional approach to organisation needs to change.

It’s a good question to ask. Despite the fact that the context in which business operates has changed substantially due to the impact of digital technology, most businesses are still structured in ways that made more sense in an industrial age where control, efficiency, scale and minimisation of deviance were all important. But does that really make sense in a world that is increasingly characterised by horizontality, networks, data and value flows, systems thinking and platform business models?

Hierarchy as an organising principle has already been challenged by some. A broad meta-study of 54 prior studies (covering analysis of over 13,000 teams) conducted by Lindred Greer, Bart de Jong, Maartje Schouten, and Jennifer Dannals at Stanford Graduate School of Business (‘Why and When Hierarchy Impacts Team Effectiveness: A Meta-Analytic Integration’) found that the net effect of hierarchy on performance was broadly negative. Whilst some expert-based hierarchies helped improve team performance, many others were dysfunctional.

As I wrote in my book, the digital era has brought with it a need for a level of continuous innovation, team and individual autonomy, customer-centric approaches and strategic and tactical adaptability and responsiveness that is simply not served well enough by strictly hierarchical organisation. Functional silos act against the ability to create joined-up, exceptional customer experiences. They hamper the ability to collaborate quickly. They get in the way of cross-functional design and innovation. They limit flexibility of job activity. It’s no accident that (often self-organising) small multi-disciplinary teams have become the engine of change in many forward-thinking companies. With near universal access to information, the old ideas of leadership characterised by the perception that all the answers and solutions exist at the top of the organisation and flow down, have become grossly outdated. 

Yet there are challenges at the other end of the organisation design spectrum too. Some brave businesses have taken a radical approach to the adoption of flatter structures using methodologies like Holacracy right across the company. Yet in spite of Holacracy being around for over a decade there are still very few examples of it being applied successfully at scale.

But what if there was a way to successfully balance the benefits that can accrue from hierarchy (efficiency, clarity of authority, concentration of specialist expertise, clear lines of communication, simplified career path) with those of a more fluid, evolving structure (agility, adaptiveness, cross-functional collaboration and innovation, speed of delivery)? What if we challenged these long-standing orthodoxies about the optimal way to organise our teams and reinvented organisation design around a more nuanced understanding of where hierarchy is beneficial, and where it is not? What if we could design an organisation that could successfully balance these extremes, adeptly manage the interplay between them, and bring a new level of fluidity to structures that enabled far greater organisational agility?

I believe that this is very possible. Yet my contention is that few businesses seem to be thinking big enough about how we rewire structures for the world in which we now find ourselves. Traditional orthodoxies need to be challenged, but what does the new normal look like?

That’s what I’m interested in exploring over the next few months both here and over on my personal blog. As always, feedback, thoughts and ideas will be welcomed.


Why Corporate Innovation is so Hard


I loved this piece by Tim Harford in the FT about why corporate innovation fails, not least because it emphasises one of the more under-acknowledged challenges of generating disruptive innovations in large organisations.

Harford references J F C Fuller’s ‘Plan 1919’ from the First World War, a pioneering and ambitious strategy to use new British tanks to roll over the German trenches and strike a decisive sledgehammer blow to the German army that would end the war. Fuller, Chief Staff Officer of the nascent Tank Corps, originated a plan to amass 5,000 heavy and medium British tanks, 3,000 of which would be used to penetrate German defences along a 90-mile front supported from the air. 800 faster-moving medium tanks would then proceed to attack the German Army’s string of headquarters miles behind the trenches to disrupt the command structures. A further 1,200 medium tanks, supported by artillery, airpower, cavalry and truck-mounted infantry would then move rapidly to penetrate far behind enemy lines.

Fuller’s lightning thrust plan was revolutionary. Tanks had until then only been used to open up gaps in the enemy trenches through which foot infantry could advance a few miles. But Fuller was proposing a new form of mechanised warfare that could end the attritional stalemate of trench warfare and focus on disorganising the enemy. He wrote: “Tactical success in war is generally gained by pitting an organized force against a disorganized one.”

Fuller’s biographer Brian Holden Reid called Plan 1919 “the most famous unused plan in military history”, and yet as Harford notes, Fuller had actually created an entirely new military strategy that would be studied by the Germans and implemented to devastating effect in 1940. Fuller had in fact invented Blitzkrieg.

The great irony was that after Fuller’s plan failed to see the light of day in the First World War, many nations still neglected to see the value in his strategy, believing that tanks should be used in small pockets to support infantry. The Army even went so far as to stop the publication of Fuller’s book for several years, yet Heinz Guderian, the mastermind behind Hitler’s blitzkrieg, still managed to read Fuller’s work after the war and used it to great effect.

The story is an excellent analogy for why organisations so often look at the new through the lens of the old and ignore the kind of ideas and concepts that can be truly transformational, even if those concepts originated inside their own organisations (Xerox PARC’s personal computer with mouse and graphical user interface, Steven Sasson’s first digital camera for Kodak, Sony’s Memory Stick Walkman, the IBM Simon, the first touchscreen phone). In his piece about why companies squander great ideas, Harford quotes Joshua Gans, author of The Disruption Dilemma and economist at the Rotman School of Management:

“Disruption describes what happens when firms fail because they keep making the kinds of choices that made them successful.”

In other words, companies get stuck in the strategies and thinking that have made them successful before.

And perhaps there is an under-acknowledged reason why this happens. Harford’s article references a 1990 paper by Rebecca Henderson and Kim Clark (Architectural Innovation: The Reconfiguration of Existing Product Technologies and The Failure of Established Firms) in which the authors distinguish between the components of a product and the way that they are integrated into the system, setting out four types of product innovation:

  • Incremental innovation: this may strengthen the core components of the product but it also maintains the existing linkages between them (an example would be improving the performance of a car component like a driveshaft without impacting the way in which the car is put together)
  • Modular innovation: which may change the fundamental technology of the component but still doesn’t change the way in which the system links together (like an automatic transmission)
  • Architectural innovation: this may change the design but whilst the components may not change significantly the way in which they link together does (like front-wheel drive transmissions)
  • Radical innovation: which is the most extreme, and involves changing both the technology of the components and also the way in which they link together (electric vehicles for example)
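Henderson and Clark’s taxonomy is really a two-by-two: does the innovation change the core components, and does it change the linkages between them? As a purely illustrative sketch (my own framing, not code from the paper), it reduces to a tiny function:

```python
# A minimal sketch of Henderson & Clark's 2x2 (my own framing, not from
# the paper): classify an innovation by whether it changes the core
# components and whether it changes the linkages between them.

def innovation_type(changes_components: bool, changes_linkages: bool) -> str:
    """Map the two dimensions onto the four categories above."""
    if changes_components and changes_linkages:
        return "radical"        # e.g. electric vehicles
    if changes_linkages:
        return "architectural"  # e.g. front-wheel drive
    if changes_components:
        return "modular"        # e.g. automatic transmission
    return "incremental"        # e.g. a better driveshaft

# Architectural innovation: familiar components, new linkages
print(innovation_type(False, True))  # architectural
```

The point of laying it out this way is that the “hard” quadrants for incumbents are precisely the ones where the linkages change, because that is where embedded organisational knowledge stops applying.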

Henderson and Clark make the case that ‘architectural’ and ‘radical’ innovation can more fundamentally challenge the existing organisational structure and processes which makes it more difficult for incumbents to respond:

‘…architectural innovations destroy the usefulness of the architectural knowledge of established firms, and…since architectural knowledge tends to become embedded in the structure and information-processing procedures of established organisations, this destruction is difficult for firms to recognise and hard to correct.’

Incremental and modular innovation are less challenging to established structures since the system doesn’t fundamentally change. Radical innovation establishes a new dominant design: a new set of design concepts embodied in components that are linked together in a new architecture. Architectural innovation involves the reconfiguration of an established system, linking together existing components in new ways. It can therefore be harder to perceive and make sense of, since it involves familiar component parts put together in new ways, with very different relationships between them. When an organisational structure and information flow has grown up around the old system, it becomes very difficult for the company to respond in suitable ways. The structure gets in the way.

Henderson uses the example of IBM to demonstrate how a company can respond well even to radical innovation if it fits the structure that already exists. IBM successfully dealt with significant developments such as the semiconductor, the integrated circuit, the hard drive, and the shift to mainframe computing, since producing mainframes was not dissimilar in structure to producing mechanical tabulating machines. Yet when it came to the PC revolution, IBM’s initial success came only from working against its existing strengths and the advantages of its extant structure; eventually internal politics reasserted themselves, and the PC division struggled to cope and was sold off. In the First World War, says Harford, the invention of the tank did not fit existing systems and structures for fighting the war, and so the real potential to use it in a decisive way was missed.

Reading Harford’s piece reminded me of Noah Brier’s write up of Conway’s Law. In 1968 programmer Melvin Conway wrote that:

‘organizations which design systems … are constrained to produce designs which are copies of the communication structures of these organizations.’

In the context of software, for example, the designers of two different component modules need to communicate with each other in order to ensure that the modules work effectively together. So the interface structure of software naturally needs to show congruence with the social structure of the organisation that created it. This similarity between organisations and designs he called homomorphism, noting that ‘the very act of organizing a design team means that certain design decisions have already been made, explicitly or otherwise’.
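Conway’s observation can be made concrete with a toy check: does every technical dependency that crosses a team boundary have a matching communication channel between those teams? The module, team and ownership names below are entirely hypothetical, chosen only to illustrate the idea:

```python
# Toy illustration of Conway's homomorphism: compare a module dependency
# graph against the communication links between the owning teams.
# All names here are hypothetical.

module_deps = {("checkout", "payments"), ("checkout", "catalogue")}
owner = {"checkout": "team_a", "payments": "team_b", "catalogue": "team_a"}
team_links = {frozenset({"team_a", "team_b"})}

def unmirrored(deps, owner, links):
    """Cross-team dependencies with no matching communication channel."""
    return {
        (a, b) for a, b in deps
        if owner[a] != owner[b]
        and frozenset({owner[a], owner[b]}) not in links
    }

# Here every cross-team dependency is mirrored by a team link,
# so the design and the org chart are congruent.
print(unmirrored(module_deps, owner, team_links))  # set()
```

Remove the team_a–team_b link and the checkout–payments dependency shows up as unmirrored: a design decision the org chart has, in effect, already made.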

American computer scientist Fred Brooks noted the wider application of this concept in management theory, observing in his seminal book on software engineering, The Mythical Man-Month:

‘Because the design that occurs first is almost never the best possible, the prevailing system concept may need to change. Therefore, flexibility of organization is important to effective design.’

The key points are that when we’ve started to design any system the choices that we’ve already made can fundamentally affect the final output, that organisation structure and design/architecture are intrinsically linked with the former impacting and constraining the latter (rather than the other way around), and that organisational flexibility is required to organise quickly to fix mistakes and adapt.

Noah Brier builds on this idea, referencing the work of Harvard Business School professor Carliss Baldwin, whose work on the mirroring hypothesis shows that mirroring between technical dependencies and organisational ties emerges as a way of conserving scarce cognitive resources when solving complex problems:

‘People charged with implementing complex projects or processes are inevitably faced with interdependencies that create technical problems and conflicts in real time. They must arrive at solutions that take account of the technical constraints; hence, they must communicate with one another and cooperate to solve their problems. Communication channels, collocation, and employment relations are organizational ties that support communication and cooperation between individuals, and thus, we should expect to see a very close relationship—technically a homomorphism—between a network graph of technical dependencies within a complex system and network graphs of organizational ties showing communication channels, collocation, and employment relations.’

In other words mirroring makes outputs and new products easier to understand because it aligns nicely to current ways of organising.

It may be that in some instances (particularly in the context of incremental or modular innovation) mirroring will result in effective innovations and outputs. But with more radical, architectural innovation that challenges the architectural knowledge, information flows and structure of an organisation, mirroring can be counter-productive.

Rapidly changing contexts or emergent capabilities characterised more by unknown unknowns do not suit rigid systems and ways of working that mirror existing ways of doing things. Instead we need to work back from the system design we need into a structure that can reflect it. Rigid and deeply hierarchical structures may be good at delivering incremental and modular innovation and supporting optimisation and efficiency, but do not lend themselves to the flexibility required for adaptive, emergent problem-solving. These types of challenges require small, cross-functional teams that can move quickly, unconstrained by ingrained systems, architecture and thinking.

We need to fundamentally redesign our organisations to reflect both of these needs.


Why We Reject New Technology


There’s a great account in Scientific American of why new technologies that can make our jobs easier are so often rejected, using the adoption of the thermometer as an exemplar.

At the end of the sixteenth century Galileo Galilei invented the first device that could measure temperature variations – a rudimentary water thermometer. Around 120 years later Gabriel Fahrenheit came up with the first modern mercury thermometer. The Dutch physician Herman Boerhaave thought that the device had great potential and proposed that measurements using a thermometer could be used for diagnosis and to improve treatment.

Yet despite its evident utility it took over a hundred years for use of the thermometer, and the discipline of thermometry, to become widespread. Prior to the mercury thermometer, doctors would largely use touch to determine whether a patient had a high temperature or was suffering from a fever. This qualitative approach was regarded as capturing richer, more in-depth information than any tool could generate, and for many years was seen as superior to thermometry.

In spite of the prevailing inertia against adopting this new technology, a group of researchers persisted in attempting to turn the relatively idiosyncratic opinions and descriptions of doctors into reproducible laws, but it was not until 1851 that a breakthrough happened. In a transformative piece of work (published as “On the Temperature in Diseases: a manual of medical thermometry”), Carl Reinhold Wunderlich recorded temperatures in 100,000 patient cases, and successfully established not only that the average human body temperature was 37 degrees Celsius, but also that a variation of one degree above this constituted a fever, which meant that the course of an illness could be predicted better than by touch alone.

Thermometry represented a giant leap towards modern medical practice. Patient expectation changed and by the 1880s it was considered medical incompetence not to use a thermometer. But why did it take so long to become widely adopted practice? The original thermometers were large, cumbersome devices and the tool developed over many iterations but this still doesn’t explain its slow advance.

The Scientific American article notes how easy it is to reject technology that we don’t understand, or technology whose successes we’ve had nothing to do with. Perhaps our fear is that in its success, new technology will detract from our own utility. More likely, slow adoption comes down to what Andy Grove of Intel used to call the ‘10x’ rule: the idea that a product must be at least ten times better in order to overcome barriers to adoption and switching costs, because people tend to underestimate the advantages of a new technology by a factor of three while simultaneously overestimating, also by a factor of three, the disadvantages of giving up the old one.
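The arithmetic behind that rule of thumb is worth making explicit: a threefold discount on the new multiplied by a threefold premium on the old produces roughly a ninefold perception gap, which is where the ‘ten times better’ threshold comes from. As back-of-envelope arithmetic:

```python
# Back-of-envelope arithmetic behind the '10x' rule as described above.

underestimate_new = 3  # buyers perceive only ~1/3 of the real improvement
overestimate_old = 3   # buyers weight what they'd give up ~3x too heavily
perceived_gap = underestimate_new * overestimate_old

print(perceived_gap)  # 9 -> a new product needs roughly a 10x real advantage
```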

But as the piece also goes on to point out, the subtlety is actually in how we combine the best of the old with the best of the new. It describes how a children’s hospital in Philadelphia used quantitative algorithms to identify particularly dangerous fevers. The algorithms proved better at picking out serious infections than the judgement of an experienced doctor, but the combination of the two outperformed either in isolation:

‘It’s true that a doctor’s eyes and hands are slower, less precise, and more biased than modern machines and algorithms. But these technologies can count only what they have been programmed to count: human perception is not so constrained.’

Similarly, at the 2016 International Symposium of Biomedical Imaging in Prague, a Harvard team developed an AI that could detect cancer cells amongst breast tissue cells with 92 percent accuracy, almost as good as the trained pathologists who could pick out 96 percent of the biopsy samples with cancer cells. Yet when artificial and human intelligence were combined 99.5 percent of cancerous biopsies were identified.

Technological change rarely means forgetting all that we know. More often it is helpful to frame it as combining the best of the old with the best of the new. Perhaps the key lesson here is that a fixed mindset (one where you believe that success happens at the expense of someone or something else) does not help the adoption of new technologies. When we can see the bigger picture, and adapt existing knowledge and skills to combine the best of the old with the best of the new, we progress more. Growth mindsets win.

