
Technology Backlash: Will This Time Be Different?

Posted July 1, 2015

Cutter IT Journal Vol. 28 No. 7

A FEW FACTS ABOUT TECHNOLOGY

Technology and its advancement distinguish Homo sapiens from the rest of the world's menagerie. Who would disagree that technologies, defining the term broadly, have wrought wondrous improvements to virtually every aspect of our lives -- work, play, mobility, health, safety, social connectedness, and so on? Without technology in the broadest sense, our species would still be hunting and gathering at the mercy of microbes and weather.

Any discussion of backlash against technology should start by acknowledging some basic facts:

  • While many technologies enable us to do something we otherwise could not do, like air travel, others take over tasks once done by humans -- a pattern that goes right back to the first time a beast of burden was pressed into carrying a load that might otherwise have required several people.

  • Technologies never have job creation per se as a primary goal, but it is almost always a by-product.

  • Technologies create winners and losers in the labor market. Winners are those with the inherent ability to work at a higher level or the adaptability to do something different. Losers are those who can't or won't adapt, or who lack the opportunity to do so.

  • Historically, the jobs technologies destroy have typically been arduous, boring, unsafe, and generally lousy, requiring minimal skill. (In IT's case, though, that is changing. We'll come back to this.)

  • While some of the new jobs technology created have still been boring and sometimes unhealthy (e.g., assembly-line production), many require more -- often much more -- skill, from the first horse and camel drivers right through to engineers, managers, and computer scientists.

The massive unemployment each wave of laborsaving technology might have engendered has not materialized. In addition to the replacement jobs, compulsory education, prohibition of child labor, regulation of workweeks, and the concept of retirement have reduced the size of the workforce in beneficial ways. Unemployment among working-age people fluctuates but has not gone out of control since the Great Depression, and in that case, the widespread unemployment was not blamed on technology.

But is what's past always prologue? In this article, I suggest that our mostly positive history of adjusting to technological innovations may not be a trustworthy guide to the future when we consider the extraordinary scope and pace of IT innovation. To assume that any backlash against IT is no more than the latest futile manifestation of Luddism may be too optimistic.

HISTORICAL BACKLASH

A corollary of the fact that technologies create winners and losers in labor markets is that they decrease the bargaining power of the less skilled, at least when there's no shortage of available labor, and that condition has prevailed through most of history. The pace of technological development was steady but slow until about the middle of the 18th century, when knowledge generated by early scientists began to be applied to mundane activities like weaving, knitting, and dyeing. Factories appeared where there was falling water to be harnessed for power. (See the sidebar "Father of the Industrial Revolution.")

FATHER OF THE INDUSTRIAL REVOLUTION

Who is the father of the Industrial Revolution? Sir Richard Arkwright (1732-92) is often given that honor. In the 1770s, this humbly born and self-educated inventor and entrepreneur created ways to mass-produce (by the standards of the time) dyed cotton cloth in a huge (again for the time) factory in Cromford, England, one of the first of its kind. He became a very wealthy man, a model of merit-driven upward mobility later emulated by Thomas Edison, Henry Ford, and Steve Jobs.

Most factory jobs were mind-numbingly repetitive, but plenty of new jobs were created to design, build, and maintain the machinery; to supervise the workers; and to operate what we'd now call a business infrastructure, including accounting, production, inventory control, and sales.

Working conditions in these "dark Satanic mills," as the poet William Blake described them in 1808, were awful. Semiskilled workers toiled away for 12 hours a day, six days a week, with only two holidays a year. (That's 3,732 annual work-hours, about twice the current average.) This provoked one of the first notable backlashes against technology, as Luddites translated words into action by smashing textile machinery. Though the Luddites are remembered for their sabotage, their anger was directed less at machinery as such than at poor pay and working conditions in mechanized factories.1 Author Richard Conniff writes:

[P]eople of the time recognized all the astonishing new benefits the Industrial Revolution conferred, but they also worried, as [the Scottish essayist Thomas] Carlyle put it in 1829, that technology was causing a "mighty change" in their "modes of thought and feeling. Men are grown mechanical in head and in heart, as well as in hand." Over time, worry about that kind of change led people to transform the original Luddites into the heroic defenders of a pretechnological way of life.2

This form of nostalgia is still alive today, as people who can afford it willingly pay a premium for artisanal products.

The bargaining power of factory labor remained low throughout the 19th century as the supply of workers expanded due to better public health and immigration from poor countries, so pay and working conditions did not improve until the union movement and social activists shifted the balance of power in the early 20th century. Rapid expansion of new industries like auto manufacturing, plus workweeks that shrank by nearly half, helped factories absorb the continuing influx of workers from immigration and from no-longer-needed farm labor.

COMPUTERS ARRIVE

Computers emerged from scientists' laboratories in the early 1950s to be applied to mundane business operations. John Diebold (1926-2005), the first real guru in that field, essentially coined the term "automation"3 in his book of that title.4 The prospect of displacing armies of clerks and factory hands with automation led to some serious concerns about massive unemployment, but no popular backlash emerged; the 1950s were a time of unquestioning belief in "progress," and the baby boom had not yet hit the labor market. (Plus, there was the Cold War to worry about.)

The 1960s saw some populist concerns about automation,5 but the space race had made science and technology fashionable. Furthermore, IT was still limited in its application by the high cost of raw computing power and data storage. Hordes of new jobs were created, both to program and tend the computers and, given all the data now retrievable, to extract knowledge and insight from it. Nor should we forget Parkinson's Law:6 work expands so as to fill the time available for its completion.

Also offsetting the potential net job loss was the fact that a great many early computer systems were not well conceived or designed, so their laborsaving benefits fell far short of expectations. This outcome inspired MIT economist (and Nobel laureate) Robert Solow to quip in 1987: "You can see the computer age everywhere but in the productivity statistics."

IT FLEXES ITS MUSCLES

The "productivity paradox" bemoaned by Solow and others did not last long. Business and IT managers got smarter with techniques like reengineering, software vendors improved their products, the Internet happened, and, more than anything else, dramatic declines in the cost of hardware and telecommunications opened up endless opportunities for digitizing just about anything. No longer was IT only about numbers and their manipulation:

  • Word processing and optical character recognition (OCR) added text to the mix.

  • Computer-aided design (CAD) added drawings that could be manipulated, eliminating the need for draftsmen.

  • Physical processes in three-dimensional space were encoded to let robots perform assembly-line tasks. While robots for manufacturing jobs raised concern in the 1950s, they did not really come into play until the 1980s. Early computer gurus did not appreciate that it was harder to teach a computer to drive a robot across a cluttered room without bumping into things than to teach it to play grandmaster-level chess. (See the sidebar "A Tale of Two Assembly Lines.")

  • Global debit and credit cards, self-service card readers, and barcodes have eliminated vast numbers of bank tellers, inventory and checkout clerks, and gas station attendants.

  • Office automation has made the traditional secretary a perk reserved for the highest-level executives.

A TALE OF TWO ASSEMBLY LINES

In 1987, I walked down a corridor separating two assembly lines in the Ford plant in Cologne, Germany. One side made the Fiesta, an economy model in production for a number of years. Workers wielding heavy shoulder-mounted guns did the body welds. The other side made the Scorpio, a recently introduced upscale model. Robots in a mesmerizing ballet did the welds while a sole employee kept an eye on the process. The jobs of the Fiesta workers (mostly immigrants) were not "good," but they were probably better than anything on offer back in Turkey.

Two additional capabilities extended the impact of IT well beyond assembly-line workers, clerks, and secretaries:

  • Standardized digitization of previously incompatible and fixed forms of information -- books, documents, drawings, pictures, photos, videos, recorded sound -- allows manipulation, comparison, and analysis.

  • The Internet lets digital data traverse the world in bulk at the speed of light through fiberoptic cables.

This global availability of the full range of digital data has enabled a world market in higher-level professional jobs that once could only have been done locally. Even time-critical tasks like X-ray reading and fine-tooth-comb review of legal documents have come into play. When well-trained, English-literate folks can get the source documents in India as fast as someone just down the corridor and do the job as well for a fraction of the cost, well ...

The outsourcing of computer work to low-wage countries is old news, but it recently became a headline issue when Disney and Southern California Edison each discharged several hundred IT workers who, before departing, had to train their Indian replacements brought in on H-1B work visas. Since those visas were not designed to allow employers to replace Americans in existing jobs,7 a bipartisan political issue has emerged.

Taken to its logical conclusion, a borderless job market means that unless a rich country's workers are demonstrably more skilled and productive than those in low-wage countries, they will have to accept poor-country wages to keep their jobs.

THIS TIME, IT'S NOT JUST JOBS

The extensive list of what digitization is currently doing raises the possibility that this IT onslaught really is different from any past technological wave. Maybe it means more than just destroying some jobs while creating other, better ones. What is fundamentally different?

IT's ever-growing capabilities and ever-decreasing costs have made it truly ubiquitous; for example:

  • Cars, appliances, and even whole houses use special-purpose computers for control and communicate via the Internet of Things.

  • Driverless cars are rapidly moving out of the pages of science fiction magazines.8

  • Handheld devices using satellites or the cellular network can pinpoint where we are.

  • Miniaturization has allowed our interface to just about anything digital to fit in a shirt pocket. Smartphone apps can control our thermostats and digital video recorders from the other side of the world, not to mention taking photos and videos that can be transmitted immediately.

Furthermore, existing technologies just keep getting faster, more capable, and cheaper:

  • Recent revelations have shown the extent to which governments have used massive data storage capabilities to retain information about ordinary people, justified by the need to root out potential terrorists.

  • And it's not only governments. People just going about their IT-enabled lives, such as shopping or planning travel on the Internet, have created vast troves of information about themselves that companies like Google and Facebook put to commercial use, yielding enormous revenues.

  • Dramatic increases in processing power have enabled artificial intelligence to make commercially viable inroads into tasks like plagiarism detection, face recognition, language translation, speech recognition, and finding spoken words "of interest" in recorded conversations.9

It is easy to describe the advantages such capabilities provide, but it's sobering to think about their potential downsides. For example:

  • Cheap digital storage means that information can remain accessible forever. While that can be useful for historians and biographers, it also means we can be haunted for life by an embarrassing picture, a bit of writing, or even something we said that got recorded.

  • Sophisticated search algorithms are also invaluable for historians and biographers, but they can enable people in authority to ferret out and destroy information, erasing inconvenient history.10 Webpages disappear without a trace unless someone makes a screen shot of them.

  • GPS technology and smartphones enable people to be tracked. This is helpful for emergency responders, but it could enable highway patrols and car insurers to levy fines and rate increases every single time we exceed a speed limit.

  • A highly integrated intelligent network can make many things more efficient (e.g., toll roads and power grids), but near-total dependence on a digital infrastructure (banks, etc.) helps criminals profit and cyber terrorists unleash havoc.

  • A car computer connected to the Internet can speed up the response to a breakdown and maybe provide a diagnosis, but a malicious hacker could make the car stop or misbehave.11

  • The ability to take quick photos and videos anywhere and anytime was instrumental in tracking down the Boston Marathon bombers, but it also enables gross invasions of privacy to live forever on the Internet.

  • The smartphone/tablet/laptop allows us to carry out much of our work at home or in a coffee shop, far from the annoying distractions of the office, but it also means we can be reached anywhere, anytime, all too easily. The same technology exposes employers to data theft and sabotage, while Internet connections facilitate goofing off during normal working hours.

  • Real-time staffing optimization threatens employees with a life of unpredictable on-call work for which they must remain available, reflecting the current imbalance of power in lower levels of the job market.

A REAL BACKLASH THIS TIME?

This round of concern about possible downsides of new technologies may be more than just the latest example of misguided Luddism. Some backlash is not just potential but likely, and it behooves the IT community to at least recognize the possibility and take actions that would mitigate it. Consider the following:

  • The new and better jobs may not appear as they always have, or they'll appear in a different country. We should not assume that global corporations will care about this without some financial incentives to do so. The hollowing out of the middle class has been going on for at least three decades and has finally become a political issue in the US, where it is most pronounced.

  • Privacy has in effect disappeared. We cannot assume that self-policing will be effective in preventing businesses from misusing or carelessly handling data on employees and customers.

  • Governments should not be considered immune to the temptation to misuse data on their citizens, once collected, whether through overzealous "zero tolerance" law enforcement or the kind of police state a great many European adults do not remember fondly. (See the sidebar "East Germany: What If?")

  • To argue that privacy won't be an issue because so many millions have willingly shared huge amounts of very personal information is to bury one's head in the sand. Not everyone shares profusely, and there's often no real choice, as it becomes harder and harder to participate in today's economy without divulging more than we might like. Ugly incidents could sour people on such openness, only for them to discover how difficult it is to become "forgotten." "Revenge porn" and other kinds of personal threats over the Internet are becoming an issue with high potential to turn people against technology.

  • To suggest that people with nothing to hide should not worry is specious. After all, what's worth hiding can change. A German in the 1920s may not have thought his Jewish ancestry was something to keep hidden even from friends, since officially sanctioned anti-Semitism had been abolished there for decades.

EAST GERMANY: WHAT IF?

The East German Stasi may have been the most developed police-state enforcement organization ever, with its armies of spies and informers. Of course, the apparatus had to be highly labor-intensive, given the technology of the day. The excellent German film The Lives of Others (2006) portrays this in fictional form, in which the human decency of one professional snooper exposes the ultimate weakness of the system. But if the Stasi had had today's technology ...

MINIMIZING THE BACKLASH

The IT community has a huge stake in minimizing the probability and severity of any backlash. One major asset is that its executives are, on the whole, more publicly respected than their counterparts in most other industries. They have bully pulpits that they can and should use to get in front of both technical and broader sociopolitical issues likely to bring on or intensify backlash. The most critical technical issues are security and privacy. The most critical sociopolitical issues are education and income inequality. There are also two philosophical issues -- 100% consistency and 100% optimization -- explained below.

Technical Issues

It has become difficult to get through more than a few days without news of yet another security breach. Cyber crime, cyber mischief, and cyber terrorism are no more 100% preventable than more traditional misbehavior, but that is no excuse for anything less than giving top priority to the security of both data and critical infrastructure. Were some nightmare scenario to materialize, it would be a disaster for the industry if people lost confidence in the Internet, no matter how reasonable the excuse. Software and hardware designers need to become obsessed with security.

As the people who can best comprehend the possibilities for their handiwork, members of the IT community need to address the privacy issue proactively, pressing governments not to "get out of their way" but rather to work with them to establish practical policies and laws that ensure that the spirit of legal protections for ordinary people (such as in the US's Bill of Rights) pervades a society surrounded by 21st-century technology. Google's early motto of "Don't Be Evil" needs to be enhanced with "Don't Do Evil" and become the industry's motto. Europeans have given this far more thought: their thinking on data protection goes back to 1980, their Data Protection Directive12 dates from 1995, and a new pan-European framework has just been agreed.13

Sociopolitical Issues

As already mentioned, new technologies are not kind to people who lack the knowledge and skill needed for more mentally demanding jobs. When large numbers of less demanding jobs vanish, as they have throughout the rich world, while primary and secondary schools continue to turn out too many barely literate or numerate graduates, the state of education (certainly in the US) becomes an issue of national self-preservation. Leaders in the IT industry need to become very publicly vocal about this and should devote a portion of their often fabulous new wealth to philanthropies that fund innovative experiments and pilot programs challenging tradition and conventional wisdom.

Gross inequality of income and wealth has a destabilizing effect on societies, as more and more people feel they're in a rigged game. The US was more successful than Europe in resisting the appeal of various early 20th-century "-isms" because so many Americans were only a generation or less removed from deeply stratified Europe, and the American dream was credible and real. After World War II, both Europe and the US became much more economically egalitarian, as huge middle classes with rapidly improving living standards developed, offering a real chance for people born in the lowest income quintile to move to the highest. While technology and foreign competition have reversed this egalitarian tendency all over the rich world, the reversal has been most pronounced in the US. Minimum wages need to increase so that a person working full time will no longer be poor, and there need to be strong disincentives to exporting jobs. When jobs do disappear, as some must, an honest effort is needed to help the people affected adjust and adapt. IT firms can lead by example, paying even their humblest employees a living wage and insisting that their contractors do the same, as Facebook has just done.14

IT industry leaders need to be publicly vocal on this issue as well, in large part to offset the stereotype of tech executives living in bubbles, oblivious to the plight of those who can't play in their high-flying league.

Philosophical Issues

Computers are great at enforcing the rules they're given with 100% consistency, leaving no room for judgment or common sense. But without that slack, life becomes oppressive. As all drivers know, not every infraction or deviation matters.

Computers are good at achieving 100% optimization, squeezing out every last redundancy. But life without redundancy is too perilous. Tightly coupled systems are maximally efficient when everything works right, but expecting that disrespects Mr. Murphy.

The role of IT leaders here is to warn customers away from overcomputerization and to promulgate design guidelines that reflect this more humble philosophy.

SOFTENING TECHNOLOGY'S ROUGH EDGES

Psychologists and sociologists have expressed concerns about the effects all these wondrous new technologies will have on people -- our attention spans, our ability to make truly meaningful connections with others, and our capacity to function as a society. (Try Googling "psychological effects of technology" to see what I mean.) It would be foolhardy to suggest that we dismiss their concerns, but the Carlyle quote cited earlier may put things in perspective. While his concern that humankind would become "mechanical in head and in heart" seemed legitimate and plausible in 1829, the feared change did not happen. Perhaps human nature is too hardwired for any software to change it, at least for the worse. We can hope.

However, that glimmer of optimism must not deter us from recognizing that unless changes are made to soften the rough edges of IT-driven disruption, some backlash is not just inevitable but justified. We in the IT community can choose to lead these changes calmly and rationally, or we can take our chances with unpredictable and probably irrational public anger -- to which politicians and courts will respond, not necessarily with calm rationality.

ENDNOTES

1 Conniff, Richard. "What the Luddites Really Fought Against." Smithsonian, March 2011.

2 Conniff (see 1).

3 A Ford executive first used the term but applied it only to a narrow range of factory tasks.

4 Originally published in 1952, Automation: The Advent of the Automatic Factory was reissued by American Management Association in 1983.

5 The folksinger Phil Ochs released "Automation" in 1964 as part of his All the News That's Fit to Sing album. Lyrics can be found at www.metrolyrics.com/automation-song-lyrics-phil-ochs.html.

6 Wikipedia tells us this law was "articulated by Cyril Northcote Parkinson as part of the first sentence of a humorous essay published in The Economist in 1955; it was reprinted with other essays in the book Parkinson's Law: The Pursuit of Progress (London: John Murray, 1958). He derived the dictum from his extensive experience in the British Civil Service."

7 Preston, Julia. "Pink Slips at Disney. But First, Training Foreign Replacements." The New York Times, 3 June 2015.

8 Durbin, Dee-Ann. "Mercedes vs. Google: How Two Competing Visions Are Racing to Bring Driverless Cars to Market." The Associated Press, 10 July 2015.

9 One wonders, have Egyptologists had to find a substitute name for the goddess Isis?

10 Using paper, scissors, and a "memory hole," making history disappear was the job of Winston Smith, protagonist of George Orwell's novel 1984, written in 1948. How's that for 67 years of progress?

11 Greenberg, Andy. "Hackers Remotely Kill a Jeep on the Highway -- With Me In It." Wired, 21 July 2015.

12 "Data Protection Directive" (Wikipedia).

13 Gibbs, Samuel. "EU States Agree Framework for Pan-European Data Privacy Rules." The Guardian, 15 June 2015.

14 Zillman, Claire. "Facebook Is Wading Into the Minimum Wage Debate in a Huge Way." Time, 14 May 2015.

About The Author
Paul Clermont
Paul Clermont is a Cutter Expert. He has been a consultant in IT strategy, governance, and management for 40 years and is a founding member of Prometheus Endeavor, an informal group of veteran consultants in that field. His clients have been primarily in the financial and manufacturing industries, as well as the US government. Mr. Clermont takes a clear, practical view of how information technology can transform organizations and what it takes…