In the well-known book Good to Great, Jim Collins notes that technological change has always taken place; it is nothing new. The pace of change is now much faster, but the concept of technological disruption is not.
It looks to me like the early predictions about the impact of technology on business are not necessarily coming to fruition, and some very surprising realities are becoming apparent. Who would have suggested, just a few years ago, that it would be radiologists, rather than wait staff, at risk of losing their jobs to a tech-driven overhaul of their roles?
One of the world’s leading academics in this space, Associate Professor Michael Osborne of Oxford University, has said the trends indicate we’re seeing more jobs created than lost.
I see that automation can be about creating new ways to work, rather than simply replacing work. In legal tech, we’re seeing this become apparent with the implementation of augmented – versus artificial – intelligence. I think this will apply to most professions – that technology will augment and support, but not necessarily replace. To be sure, some roles will disappear, but new ones will be created.
Rather than simply replacing a person with an algorithm, augmented intelligence delivers the capability to amplify the way people do their jobs. Data is terrific, but at the end of the day a person still needs to make decisions about how to apply it.
And, however smart and logical algorithms might be, we also need to be mindful of their faults, including the inherent bias that can exist in how the algorithm is programmed, and the data it uses to learn.
Let’s go back to wait staff. This was a role that many had predicted would soon be extinct; however, even Amazon Go’s checkout-free stores employ “associates” to assist customers. Amazon recognises that customer data is great for understanding shoppers’ preferences; however, it takes a person to translate that insight into a personalised shopping experience.
There’s a host of cognitive skills that can’t be replicated by machines or algorithms, and Michael Osborne’s data indicates demand is more likely for jobs that are linked to advanced cognitive skills. Some of these include:
- Judgement and decision-making
- Fluency of ideas
- Active learning
- Systems evaluation
- Learning strategies
- Deductive reasoning
Ref: Associate Professor Michael Osborne, Dyson Associate Professor in Machine Learning and Co-Director of the Oxford Martin Programme on Technology and Employment at the University of Oxford, “The Future of Employment”.
What all of this tells me is that nobody – not even the experts – can tell with any real certainty what the future of work looks like. Despite all the hype around artificial intelligence, blockchain and smart contracts, I think most of the software is not yet ready for prime time and will take a little while before it enters the mainstream in a significant way.
In saying that, I do remain mindful of the observation by Roy Amara of the Stanford Research Institute: “We tend to overestimate the effect of technology in the short run and underestimate the effect in the long run”.
I think this means that we should avoid getting caught up in the hype of technological change, and instead recognise the profound impact these technologies will have on business and society when they become mainstream.
Our challenge is to find the highest and best use for our high-level cognitive abilities, and to identify which parts of our work can be fully automated and which parts will be so hard to automate that a human/machine partnership is the answer.
For the moment, I’m suggesting to my clients that we ignore the doom and gloom, and instead look at how we can augment the way we work to create more interesting, enlivening jobs.