By Mark Vickers
My firm has explored the possibility of a third industrial revolution brought about by offshoring, but a related trend deserves equal attention because it’s likely to have a similar or even larger impact on workforce skill demands: the computerization of work.
This isn’t a new trend, of course, but it’s moving faster than many people realize. Workers’ fears of being replaced by a computer go back a long way, probably to the first truly general-purpose computing machines of the 1940s. By the late ’50s, the notion had entered the mainstream to such a degree that Hollywood was making screwball comedies about it, such as “Desk Set.”
As it turns out, such fears were neither unfounded nor, for the newly unemployed, much of a laughing matter. Computers, in combination with other machines, have, at least in part, changed the occupational distribution of the U.S. workforce in major ways, suggests an analysis by Prof. Frank Levy of MIT and Prof. Richard J. Murnane of Harvard University (2005). They point to a historical “hollowing out” of the U.S. occupational distribution between 1969 and 1999, a period in which many assembly-line jobs as well as administrative support jobs disappeared.
Of course, other new jobs were created during the same time period, demonstrating that computerization has been as much a creative as a destructive force in the economy. This will probably be true for the future as well, but we shouldn’t be sanguine about it. After all, the fast-rising power of computers is bound to bring about some fascinating but painful changes to the jobs market.
In work they’ve done with Prof. David Autor of MIT, Levy and Murnane place human skill sets into five broad categories: expert thinking, complex communication, routine cognitive tasks, routine manual tasks, and nonroutine manual tasks. Since computers are excellent at handling many kinds of routine tasks, even formerly well-paid cognitive work such as maintaining expense reports or evaluating mortgage applications is increasingly subject to automation. On the other hand, some relatively low-skilled manual jobs that require nonroutine tasks have so far proven hard to computerize.
But it’s really jobs that require complex communication (e.g., jobs that entail interaction skills such as negotiation, explanation or motivation) or expert thinking (e.g., jobs that entail diagnosis) that are least subject to automation, according to Autor, Levy and Murnane. Expert thinking occupations tend to rely on “pure pattern recognition,” which means “information processing tasks that—at least for today—cannot be articulated in inductive or deductive rules” (p. 7).
Of course, that’s not to say expert-thinking and complex-communication jobs don’t rely on computers. They increasingly do, but in these jobs computers aren’t substituted for human skills; rather, they complement them.
Although this trend will continue, the dynamics will change. Computers are showing signs of being capable of more than just aiding expert thinkers. In the near future, they’re likely to do more of the “thinking” themselves, even in scientific fields often associated with the purest pattern recognition, suggests a new report called Towards 2020 Science. Based on the findings of a team of 34 leading scientists, the report states that computer science is moving beyond merely absorbing and ordering scientific data and into the analysis and interpretation of information (“Computing,” 2006).
Stephen Muggleton (2006) of the Centre for Integrative Systems Biology at Imperial College in London writes, “It is clear that computers will continue to play an increasingly central role in supporting the testing, and even formulation, of scientific hypotheses” and that “this traditionally human activity has already become unsustainable in many sciences without the aid of computers.”
And even in that most human area of thought—intuition—computers are beginning to play a role. Eric Bonabeau has created something he calls a “hunch engine” that is basically a program that “mutates” anything from photographic images to representations of molecules in order to generate new possibilities that appeal to users’ intuition (Norton, 2006).
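The details of Bonabeau’s engine aren’t spelled out in the Norton piece, but the general mutate-and-choose loop behind such interactive evolutionary tools can be sketched in a few lines. In this illustrative Python sketch, the candidate representation, the mutation rule, and the parameter values are all assumptions for demonstration, not a description of Bonabeau’s actual software:

```python
import random

def mutate(candidate, rate=0.2):
    """Randomly perturb a candidate representation (here, a list of
    numbers). In a real tool it might encode an image or a molecule."""
    return [gene + random.gauss(0, 1) if random.random() < rate else gene
            for gene in candidate]

def hunch_engine(seed, variants=8, rounds=5, pick=None):
    """Interactive evolution: show several mutated variants, let a
    human's intuition pick the most promising one, and repeat."""
    current = seed
    for _ in range(rounds):
        pool = [mutate(current) for _ in range(variants)]
        # 'pick' stands in for the human's choice; for this demo we
        # fall back to a random selection when no human is in the loop.
        current = pick(pool) if pick else random.choice(pool)
    return current

if __name__ == "__main__":
    print(hunch_engine([0.0] * 5))
```

The point of the design is the division of labor: the machine generates variations by rule, while the human supplies the judgment that no rule yet captures.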
So, what are the implications of these trends? Levy and Murnane argue that today’s education and training should emphasize pattern recognition and complex communication skills at least as much as they do the teaching of the basic rules of a given subject. Of course, students need to have a grasp of “the basics,” but they also need to be able to think well beyond routine cognitive tasks. Workers will increasingly need a deep understanding of the context of problems and how to see patterns among them. They must recognize when and how to apply the power of the computer. “For example,” write Levy and Murnane, “a mechanical engineer is valued for her ability to formulate a problem as a particular mathematical model. Once the model is formulated, a computer—not an engineer—will apply rules to calculate the actual solution.”
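Levy and Murnane’s engineer example can be made concrete. The human contribution is deciding that a problem is well modeled by a particular equation; once the model is written down, computing the answer is rule-following that the machine does better. In this small sketch, the model (Newton’s law of cooling) and all the constants are invented purely for illustration:

```python
import math

# The human formulation step: decide the problem is well modeled by
# Newton's law of cooling, T(t) = T_env + (T0 - T_env) * exp(-k * t).
# The constants below are invented for illustration.
T_ENV, T0, K = 20.0, 90.0, 0.07  # ambient temp, start temp, cooling rate

def temperature(t):
    return T_ENV + (T0 - T_ENV) * math.exp(-K * t)

def solve(target, lo=0.0, hi=200.0, tol=1e-6):
    """The rule-based step the computer handles: bisection search for
    the time at which the modeled temperature reaches the target."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # temperature() decreases over time, so if we're still too hot
        # the answer lies later; otherwise it lies earlier.
        if temperature(mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(f"Reaches 40 degrees after ~{solve(40.0):.1f} minutes")
```

Choosing the model and judging whether its answer makes sense remain the engineer’s job; grinding through the arithmetic is the computer’s.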
In essence, human beings will increasingly have to learn which cognitive tasks they do better than computers and how they can use computers to complement those tasks. The trick will be in staying ahead of the curve as the power of both computer hardware and software continues to grow.
For much more information visit www.hrinstitute.info
Documents used in the preparation of this article include:
“Computing the Future.” The Economist, May 23, 2006.
Levy, Frank and Richard J. Murnane. “How Computerized Work and Globalization Shape Human Skill Demands.” Paper prepared for the First International Conference on Globalization and Learning, Stockholm, March 17–18, 2005.
Muggleton, Stephen H. “2020 Computing: Exceeding Human Limits.” Nature, March 23, 2006.
Norton, Quinn. “Software Helps Develop Hunches.” Wired News, March 13, 2006.
About the Author
Mark Vickers is an associate with the Institute for Corporate Productivity.