The Singularity is the hypothesis that, if the median of predictions is correct, Artificial Intelligence will surpass human intelligence by the year 2040.1
It is a fascinating theory: at the point of singularity, human existence as we know it could not continue, and a step-change in our existence would occur, an evolutionary milestone for the human race.
It is clear that the pace of technological change is faster than ever and that governments, businesses and societies are struggling to keep up. What is technologically possible is not always easily adopted into the norms of a society.
A great example is the autonomous car, which could soon be capable of driving itself with little to no intervention by a human being. Developing suitable and effective technology for an autonomous car is a complicated problem, but one which can be solved with sufficient research and development. However, there are complex problems 2 to solve that are much more challenging.
It seems simple: while the car is in autonomous mode, the manufacturer is legally responsible for the car’s performance and, while the person is driving, they are legally responsible. However, who would be legally responsible for the car’s performance during the switchover from the person to the car, or vice versa?
Even more complex is the moral question of how the autonomous car should react to various emergency / non-standard situations. How should the car behave in a situation where two pedestrians walk in front of the car? Should it focus on the safety of the passengers or the pedestrians? What if the pedestrians are two young children? What if they are your children?
This kind of moral dilemma needs to be solved in ways that are non-binary and cannot simply be ‘engineered’. As a result, many organizations are working on these types of moral problem, such as MIT with its ‘Moral Machine’.3
As we move along the path towards the singularity, news reports suggest that many of the jobs currently undertaken by humans will be automated within the next 20 years. On the BBC website 4, the figure given was as high as 35%, and the site even lets readers check the likelihood of their own job becoming automated.
Notable figures, including Stephen Hawking and Elon Musk, consider the uncontrolled rise of artificial intelligence a cause for alarm for humanity’s future. The consequences of the singularity, and its potential benefit or harm to the human race, have been hotly debated in various intellectual circles.5
Regardless of how far, and how quickly, we move toward the singularity (and whether we will ever reach that point), we must ensure that the humanity of this progress is maintained and that decisions focus primarily on the benefit to society.
What I have coined ‘The Lean Singularity’ combines the vast opportunities of automation and artificial intelligence with the humanity of Lean thinking. It requires Lean leadership 6 to truly achieve the benefits that we need. Organizations that operate in this way have a culture and a way of working that focuses on:
- Understanding customer value
- Designing processes that maximise customer value
- Developing and engaging their people
- Kaizen as an everyday part of their work
- Utilising ‘tools’ (Automation, Robotics, Machinery, Software, etc.) that improve the quality and effectiveness of the process outcomes
These ‘learning organizations’ understand that it is the combination of competent and engaged people, well-designed processes and the right tools that delivers maximum value for all.
There are many examples of this type of approach. Erik Brynjolfsson, in his interview with the Harvard Business Review 7, explained that machines often make different mistakes from humans and that, therefore, combining human and artificial intelligence creates a synergy of outcomes.
The example he gave was cancer detection, whereby a machine can very quickly screen multiple scans to flag those that appear problematic. However, the machine is more likely to make a ‘false positive’ assessment, so human experts then screen the suspect scans to determine which of them are truly a cause for concern.
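This human-in-the-loop pattern can be sketched as a two-stage triage pipeline: a machine stage tuned for high recall (accepting many false positives), followed by a human review of only the flagged cases. The sketch below is illustrative only; the `Scan` class, scores and threshold are hypothetical, not taken from Brynjolfsson’s example.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Scan:
    patient_id: str
    machine_score: float  # hypothetical model confidence that the scan is problematic

def triage(scans: List[Scan], threshold: float,
           human_review: Callable[[Scan], bool]) -> List[str]:
    """Two-stage screening: the machine flags anything above a low
    threshold (high recall, many false positives); a human expert
    then reviews only the flagged scans and confirms real concerns."""
    flagged = [s for s in scans if s.machine_score >= threshold]
    return [s.patient_id for s in flagged if human_review(s)]

# Hypothetical data: the machine over-flags, the human filters.
scans = [Scan("A", 0.9), Scan("B", 0.4), Scan("C", 0.05)]
confirmed = triage(scans, threshold=0.3,
                   human_review=lambda s: s.machine_score > 0.8)
print(confirmed)  # ['A']
```

The design point is that the machine stage is deliberately over-sensitive, so that nothing serious slips through, while human judgement handles the harder, ambiguous cases.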
A manufacturing example is the Mercedes assembly line for the new ‘S-Class’ 8, where the company opted to use smaller, more flexible robots working side by side with its team members, with much more focus on the flexibility that people bring in delivering customer value. The S-Class is one of Mercedes’ premium offerings and, as such, customers must be offered a plethora of options, which means that nearly every car going down the line differs in some aspect. This is where human intelligence and flexibility, combined with the consistency and labour-saving attributes of robots and designed into a capable process, provide the capability required to deliver customer value most effectively.
Ultimately, everything we do, as a government, NGO or business, ought to be in the best interests of society. It is therefore incumbent on leaders across all organizations to consider, with each decision, how it maintains that humanity.
This will require us, as a society, to train our people for the realities of our journey toward the singularity; it requires individuals to take responsibility for continuously learning and adapting to technological advancement; and it requires us to develop Lean leaders in our organizations who understand the difference between technological and human advancement.
Whereas the Technological Singularity is the point at which Artificial Intelligence becomes superior to human intelligence, the Lean Singularity is the point at which the critical mass of Lean leadership and Lean thinking exceeds that of the traditional leadership under which too many decisions are currently made.
About the author:
Philip Holt is an experienced Senior Operations and Business Transformation Leader with more than 20 years’ experience delivering value and improvements globally, and a track record of Lean Transformation within a Global Blue Chip Organisation. He is passionate about delivering Operational Excellence through Lean Leadership and enjoys sharing and discussing his experiences with others.
Follow Philip on Twitter: @LeanMaster1 to find out more.
- Source: Wikipedia <https://en.wikipedia.org/wiki/Technological_singularity#cite_note-Singularity_hypotheses-2>
- In systems theory, a system can be very complicated but not complex at all: a system is complex when it has emergent behaviour. Complicated systems can be solved with enough computing power; complex systems cannot. ~ This definition was shared on english.stackexchange.com by Patrick Savalle
- MIT’s Moral Machine is a website with a quiz that allows the user to make a number of ‘moral choices’ for a car in emergency situations. After completion, the user may compare their choices with those of other users: http://moralmachine.mit.edu/
- BBC Article, ‘Will a Robot take your Job?’: http://www.bbc.co.uk/news/technology-34066941
- Source: Wikipedia: https://en.wikipedia.org/wiki/Technological_singularity#cite_note-Singularity_hypotheses-2
- Leading with Lean Article: https://www.linkedin.com/pulse/leading-lean-philip-holt/
- How AI is already changing Business, HBR.org: https://hbr.org/ideacast/2017/07/how-ai-is-already-changing-business.html
- Guardian Article ‘Mercedes swaps Robots for People’: https://www.theguardian.com/technology/2016/feb/26/mercedes-benz-robots-people-assembly-lines
Philip’s book: “Leading with Lean” is available to purchase globally.