There’s a great account in Scientific American of why new technologies that could make our jobs easier are so often rejected, using the adoption of the thermometer as an exemplar.
At the end of the sixteenth century Galileo Galilei invented the first device that could measure temperature variations – a rudimentary water thermometer. Around 120 years later Gabriel Fahrenheit came up with the first modern mercury thermometer. The Dutch physician Herman Boerhaave thought that the device had great potential and proposed that measurements using a thermometer could be used for diagnosis and to improve treatment.
Yet despite its evident utility, it took over a hundred years for use of the thermometer, and the discipline of thermometry, to become widespread. Before the mercury thermometer, doctors largely used touch to determine whether a patient had a high temperature or was suffering from a fever. This qualitative approach was regarded as capturing richer information than any tool could generate, and for many years it was seen as superior to thermometry.
Despite the prevailing inertia against adopting this new technology, a group of researchers persisted in attempting to turn the relatively idiosyncratic opinions and descriptions of doctors into reproducible laws, but it was not until 1851 that a breakthrough came. In a transformative piece of work (published as “On the Temperature in Diseases: a manual of medical thermometry”), Carl Reinhold Wunderlich recorded temperatures across 100,000 patient cases and established not only that the average human body temperature was 37 degrees Celsius, but also that a variation of one degree above this constituted a fever, which meant the course of an illness could be predicted better than by touch alone.
Thermometry represented a giant leap towards modern medical practice. Patient expectations changed, and by the 1880s it was considered medical incompetence not to use a thermometer. So why did it take so long to become widely adopted? The original thermometers were large, cumbersome devices, and the tool improved over many iterations, but this alone doesn’t explain its slow advance.
The Scientific American article notes how easy it is to reject technology that we don’t understand, or technology whose successes we’ve had nothing to do with. Perhaps our fear is that, in its success, new technology will detract from our own utility. More likely, slow adoption comes down to what Andy Grove (of Intel) used to call the ’10x’ rule: a product must be at least ten times better to overcome barriers to adoption and switching costs, because people tend to underestimate the advantages of a new technology by a factor of three while simultaneously overestimating the disadvantages of giving up the old technology by a factor of three (and 3 × 3 ≈ 10).
But as the piece goes on to point out, the subtlety lies in how we combine the best of the old with the best of the new. It describes how a children’s hospital in Philadelphia used quantitative algorithms to identify particularly dangerous fevers. The algorithms proved better at picking out serious infections than the judgement of an experienced doctor, but the combination of the two outperformed either in isolation:
‘It’s true that a doctor’s eyes and hands are slower, less precise, and more biased than modern machines and algorithms. But these technologies can count only what they have been programmed to count: human perception is not so constrained.’
Similarly, at the 2016 International Symposium on Biomedical Imaging in Prague, a Harvard team presented an AI that could detect cancer cells amongst breast tissue cells with 92 percent accuracy, almost as good as the trained pathologists, who could pick out 96 percent of the biopsy samples with cancer cells. Yet when artificial and human intelligence were combined, 99.5 percent of cancerous biopsies were identified.
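A quick back-of-the-envelope calculation suggests why the combined figure lands where it does. If we assume (simplistically, and purely for illustration) that the AI’s misses and the pathologists’ misses are independent, a cancerous biopsy slips through only when both miss it:

```python
# Illustrative sketch only: the 92% and 96% sensitivities come from the study
# described above, but the independence assumption is ours, not the study's.

ai_sensitivity = 0.92       # share of cancerous biopsies the AI catches alone
human_sensitivity = 0.96    # share the trained pathologists catch alone

# If either detector flags a sample it counts as caught, so a cancerous
# biopsy is missed only when BOTH detectors miss it.
combined_sensitivity = 1 - (1 - ai_sensitivity) * (1 - human_sensitivity)

print(f"{combined_sensitivity:.2%}")
```

Under that independence assumption the combination catches about 99.7 percent of cancerous biopsies, strikingly close to the 99.5 percent actually reported.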
Technological change rarely means forgetting all that we know. More often, it is helpful to frame it as combining the best of the old with the best of the new. Perhaps the key lesson here is that a fixed mindset (one in which you believe that success happens at the expense of someone or something else) does not help the adoption of new technologies. When we can see the bigger picture, and adapt existing knowledge and skills to combine the best of the old with the best of the new, we progress further. Growth mindsets win.
For more like this, order your copy of Building the Agile Business Through Digital Transformation, or you can join our community to access exclusive content related to the book.