Report from the McKinsey Global Institute

It is easy to become blasé about technological progress in this non-stop, 24/7, digital-everything-always-and-everywhere era. We take technological advances almost for granted and are frustrated when an app that streams the latest Hollywood movies crashes, or a smartphone that has many times the processing power of a 1980s Cray-2 supercomputer does not fire up the moment we press the “on” button.

We forget that it was not always this easy. Not so long ago, we had to go to libraries to look up quotations and insert compact discs into an audio system to play music. Transmitting even tiny amounts of data was complicated by today’s standards; sometimes we had to strap bulky acoustic couplers onto a fixed telephone and wait for the modem to screech. As the German sociologist Hartmut Rosa has pointed out, we live in an age of acceleration in which the art of saving time has reached unprecedented heights thanks to technology, but we nonetheless feel that we must run faster just to stay put. 28

Even by these standards, however, some of the most recent developments in robotics, artificial intelligence, and machine learning are noteworthy for the advances they represent. We are on the cusp of a new automation age in which technologies not only do things that we thought only humans could do, but also can increasingly do them at a superhuman level of performance. In this report, we focus on the adoption and implications of automation technologies rather than on the technologies themselves. However, by way of introduction, this chapter lays out some key areas of recent technical advances—and where remaining technical obstacles must still be overcome to achieve the full promise of workplace automation. 29


Physical robots have been around for a long time in manufacturing, but now we are seeing much more flexible, safer, and less expensive robots engaging in service activities—and improving over time as they are trained by their human coworkers on the shop floor. 30 For example, some hospitals now regularly use automated systems for storing and dispensing medication in their pharmacies, eliminating human picking errors, and also have automated haulage and transport for their clinical supplies. 31 The advances in cognitive tasks are no less striking. Software has long been able to outperform humans in some areas, such as financial-service transactions or route optimization for companies such as UPS. 32 Now, artificial intelligence is starting to encroach on activities that were previously assumed to require human judgment and experience. Exhibit 1 is a non-exhaustive list of some of the technologies and techniques that are being developed to enable automation of different work activities.

28. Hartmut Rosa, Social acceleration: A new theory of modernity, Columbia University Press, 2013.
29. For a more detailed discussion of machine learning and deep learning technologies see the corresponding chapter in The age of analytics: Competing in a data-driven world, McKinsey Global Institute, December 2016.
30. Baxter robots by Rethink Robotics can now pick up items that are not precisely aligned, and then reorient and place them correctly.
31. Swisslog offers a medication management system; it also offers automated transport and logistics systems for hospitals.
32. For example, insurance companies use novel pattern recognition to detect fraudulent claims, saving companies including GE millions of dollars annually. A 2014 study used US Securities and Exchange Commission filing data as well as social network analysis to determine clusters of insiders and correlated their trading patterns. See Acar Tamersoy et al., Large-scale insider-trading analysis: Patterns and discoveries, Georgia Institute of Technology, August 2014.

Exhibit 1. Technologies and techniques being developed to enable automation of work activities

Robotics
- Soft robotics: Non-rigid robots constructed with soft, deformable materials that can manipulate items of varying size, shape, and weight with a single device. Soft Robotics Inc. grippers can adaptively pick up soft foods (e.g., baked goods, tomatoes) without damaging them.
- Swarm robotics: Coordinated multi-robot systems, often involving large numbers of mostly physical robots.
- Tactile/touch robotics: Robotic body parts (often biologically inspired hands) with the capability to sense, touch, exhibit dexterity, and perform a variety of tasks.
- Serpentine robots: Snake-like robots with many internal degrees of freedom that can thread through tightly packed spaces.
- Humanoid robots: Robots physically similar to human beings (often bipedal) that integrate a variety of AI and robotics technologies and can perform a variety of human tasks, including movement across terrain, object recognition, speech, and emotion sensing. Aldebaran Robotics and SoftBank’s humanoid Pepper robot is being used to provide customer service in more than 140 SoftBank Mobile stores in Japan.

Automation product categories
- Autonomous cars and trucks: Wheeled vehicles capable of operating without a human driver. In July 2016, Tesla reported that its cars had driven more than 130 million miles while on “Autopilot.” In December 2016, Rio Tinto had a fleet of 73 driverless trucks hauling iron ore 24 hours a day in mines in Western Australia.
- Unmanned aerial vehicles: Flying vehicles capable of operating without a human pilot. The unarmed General Atomics Predator XP UAV, with roughly half the wingspan of a Boeing 737, can fly autonomously for up to 35 hours from takeoff to landing.
- Chatbots: AI systems designed to simulate conversation with human users, particularly those integrated into messaging apps. In December 2015, the General Services Administration of the US government described how it uses a chatbot named Mrs. Landingham (after a character from the television show The West Wing) to help onboard new employees.
- Robotic process automation: A class of software “robots” that replicate the actions of a human being interacting with the user interfaces of other software systems. This enables the automation of many back-office (e.g., finance, human resources) workflows without requiring expensive IT integration; many such workflows simply require data to be transferred from one system to another.


SOURCE: A tragic loss, Tesla blog, June 30, 2016; Resource revolution: Transformations beyond the supercycle, McKinsey Global Institute, forthcoming in 2017; Jessie Young, How a bot named Dolores Landingham transformed 18F’s onboarding, December 15, 2015; McKinsey Global Institute analysis
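To make the robotic process automation entry in the exhibit above concrete, here is a minimal, hypothetical sketch of the kind of back-office workflow it describes: lifting records out of one system’s export and reshaping them into the format another system expects. The system names, field names, and record format are invented for illustration, not taken from any actual RPA product.

```python
import csv
import io
import json

# Simulated CSV export from a "finance" system (format invented for illustration).
finance_export = """employee_id,name,monthly_salary
101,Ada,5200
102,Grace,6100
"""

def transfer(csv_text: str) -> list:
    """Reshape each CSV row into the JSON-style payload a second system expects."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        {
            "id": int(row["employee_id"]),
            "payload": {"name": row["name"], "salary": int(row["monthly_salary"])},
        }
        for row in rows
    ]

for record in transfer(finance_export):
    print(json.dumps(record))
```

In practice, RPA tools drive the user interfaces of existing applications rather than scripts against clean exports, but the underlying pattern is the same: read from one system, transform, and write into another without bespoke IT integration.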

Some of AI’s exploits are less heralded than its victory over a human champion of the complex board game Go in March 2016.33 For example, a project by Google’s DeepMind and the University of Oxford has applied deep learning to a huge data set of BBC programs to create a lip-reading system. Trained on more than 5,000 hours of BBC TV programs containing more than 100,000 sentences, it easily outperformed a professional human lip-reader. In tests on 200 randomly selected clips, the professional annotated just 12.4 percent of words without error, while the computer annotated 46.8 percent error-free, and many of its mistakes were small ones, such as leaving a plural “s” off the end of a word. 34

Many other surprising technologies are making advances. Robot “skin” made of a piezoelectronic transistor mesh developed by Georgia Tech and covered in thousands of mechanical hairs is as sensitive as human skin and able to “feel” textures and find objects by touch. 35

In the social and emotional realm, Affectiva, a Boston-based company, uses advanced facial analysis to monitor emotional responses to advertisements and other digital media content, via a webcam.36 In the United Kingdom, the University of Hertfordshire has developed a minimally expressive humanoid robot called KASPAR that operates as a therapeutic toy for children with autism. Having physical, human-like properties, yet being non-human, allows the children to investigate the human-looking features—for example, squeezing KASPAR’s nose or tickling its toes—safely and in a way that would not be possible or appropriate with a real person. 37

Such advances suggest that an idea toyed with by science fiction writers for at least a century—that of robots and other machines replacing men and women in the workplace on a large scale—could soon become a reality. We seem to be approaching a new frontier, but we have not arrived there quite yet.


The Czech writer Karel Capek first used the word “robot” in 1920, in a play about a factory in which androids created partly through a chemical process each do the work of two-and-a-half humans at a fraction of the cost. One of his characters explains: “Robots are not people. Mechanically they are more perfect than we are, they have an enormously developed intelligence, but they have no soul.” 38

Today, mechanical perfection seems achievable, as robots become ever more adept at physical tasks, even if they are still wobbly on uneven terrain and consume a lot of energy. Through deep reinforcement learning, they can also untie shoelaces, unscrew bottle caps, and use the claw of a hammer to remove a nail. 39

Their “intelligence,” too, has progressed—but this is where the most formidable technical challenges still lie ahead. While machines can be trained to perform a range of cognitive tasks, they remain limited. They are not yet good at putting knowledge into context, let alone improvising. They have little of the common sense that is the essence of human experience and emotion. They struggle to operate without a pre-defined methodology. They are far more literal than people, and poor at picking up social or emotional cues. Sarcasm and irony pass them by. They generally cannot detect whether a customer is upset at a hospital bill or a death in the family, and for now, they cannot answer “What do you think about the people in this photograph?” or other open-ended questions. They can tell jokes without really understanding them. They don’t yet feel humiliation, fear, pride, anger, or happiness. They also struggle with disambiguation, unsure whether a mention of the word “mercury” refers to a planet, a metal, or the winged god of Roman mythology.
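The “mercury” example can be made concrete with a deliberately naive sketch, loosely in the spirit of Lesk-style word-sense disambiguation: describe each sense with a bag of context words and pick the sense whose description overlaps most with the sentence. The sense inventory and keyword sets below are invented for illustration, and the approach’s shallowness is the point; keyword overlap is a long way from the contextual judgment humans bring to the same task.

```python
# Each candidate sense of "mercury" is described by a small, hand-picked
# bag of context words (invented for illustration).
SENSES = {
    "planet": {"orbit", "sun", "solar", "surface", "planet", "closest"},
    "metal": {"thermometer", "liquid", "toxic", "element", "temperature"},
    "roman god": {"myth", "winged", "messenger", "god", "rome"},
}

def disambiguate(sentence: str) -> str:
    """Return the sense whose keyword set overlaps most with the sentence."""
    words = set(sentence.lower().split())
    # Ties go to the first sense in the dictionary; real systems need far
    # richer context models than this.
    return max(SENSES, key=lambda sense: len(SENSES[sense] & words))

print(disambiguate("mercury is the planet closest to the sun"))   # planet
print(disambiguate("the thermometer contained liquid mercury"))   # metal
```

A sentence that mentions none of the keywords defeats this scheme entirely, which is one small illustration of why disambiguation remains hard for machines.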

Moreover, while machines can replicate individual performance capabilities such as fine motor skills or navigation, much work remains to be done to integrate these different capabilities into holistic solutions in which everything works together seamlessly. Combining a range of technologies will be essential for workplace automation, but engineering such solutions, whether for hardware or software, is a difficult process. Solutions that solve specific workplace problems will still have to be created even as individual technical challenges are overcome in the lab. And even once technical feasibility issues have been resolved and the technologies become commercially available, adoption can take years.

Yet, given the speed with which technological advances are happening, reaching and crossing the next frontier may just be a question of time. Moore’s law—that the number of transistors in a dense integrated circuit doubles approximately every two years—may be slowing, but we are still seeing massive increases in computing power. Machine learning and its subset deep learning continue to advance rapidly, while traditional AI algorithms become more versatile and powerful. Cloud computing and other technologies are opening new possibilities for more people to become involved in innovation. Academic research in these areas, especially in artificial intelligence, has increased significantly, and global markets are taking notice, with growing corporate investment in research and development.
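As a back-of-the-envelope illustration of the doubling rule just cited, not a claim from the report itself, transistor counts doubling roughly every two years imply a growth factor of 2 raised to (years elapsed divided by two):

```python
# Moore's law as a simple exponential: doubling every `doubling_period` years.
def moore_factor(years: float, doubling_period: float = 2.0) -> float:
    """Implied growth multiple over a span of `years`."""
    return 2 ** (years / doubling_period)

print(moore_factor(10))  # 32.0  -> roughly a 32x increase per decade
print(moore_factor(20))  # 1024.0 -> roughly a 1,000x increase over two decades
```

Even if the doubling period stretches as the law slows, the compounding arithmetic explains why computing power keeps rising so steeply.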

When large-scale automation does come to the workplace, what will that mean for the economy, for jobs, and for the future of work itself? And how fast could it happen? Such existential questions are easier to answer through fiction. Capek’s 1920 play about robots ends with the destruction of mankind and robots discovering the meaning of love.

This report, by contrast, seeks to establish a fact base with which to address these issues and a foundation for a more informed dialogue. Robots may not have a soul, but their potential impact on the global economy can be calculated.

33. Choe Sang-Hun, “Google’s computer program beats Lee Se-dol in Go tournament,” New York Times, March 15, 2016.
34. Hal Hodson, “Google’s DeepMind AI can lip-read TV shows better than a pro,” New Scientist, November 21, 2016.
35. Klint Finley, “Syntouch is giving robots the ability to feel textures like humans do,” Wired, December 17, 2015.
37. Ricky Boleto, “Could robots help children with autism?” BBC News, March 10, 2014. See also the University of Hertfordshire web page for KASPAR.
38. The word “robot” comes from “robota,” the Slavic word for work. Karel Capek, R.U.R. (Rossum’s Universal Robots), 1920. Capek initially called the creatures “labori” but was persuaded by his brother to change the name. Science diction: The origin of the word “robot,” NPR Science Friday, April 22, 2011.
39. Signe Brewster, “A strong robot hand with a softer side,” MIT Technology Review, February 9, 2016; Robots master skills with “deep learning” technique, Kurzweil, May 22, 2015.