If you’re relaxing on the beach – and we hope you are – work may be the last thing on your mind. But once you shake the sand off your feet and head back to the “real” world, you may be struck by the flurry of depressing stories about jobs.
Take, for instance, “zero-hours” contracts, which are causing controversy in the United Kingdom. Workers on these contracts receive no guarantee of regular work or pay; instead, says the BBC, they “only work as and when they are needed by employers, often at short notice”. According to some estimates, as many as a million workers may be on zero-hours contracts in the UK, employed by everyone from Buckingham Palace to McDonald’s.
Some workers undoubtedly like the flexibility of these contracts; others feel exploited, especially when their contract bars them from working for anyone else, a point acknowledged by British business minister Vince Cable: “Where it is a problem is … where there is an exclusive relationship with a particular employer who actually cannot provide stable employment, or indeed any employment, that stops the worker going to another company.”
Concerns about job quality aren’t restricted to the UK. Late last month, thousands of U.S. fast-food workers went on strike to protest against low wages, and there have been similar actions in New Zealand. Nobody ever flipped burgers to become a millionaire, but in the past that didn’t matter so much: Fast-food jobs were mostly for teenagers or part-timers, most of whom didn’t hang around for long. That’s no longer the case, as James Surowiecki recently pointed out: These days, “more [low-wage workers] are relying on their paychecks not for pin money or to pay for Friday-night dates but, rather, to support families.”
These stories can be seen as symptomatic of what some argue is a hollowing-out of the workforce between high- and low-skilled jobs. Many of the secure and reasonably paid jobs that used to sit in the middle are vanishing, victims, in part, of technological change. According to consultants McKinsey, about half of the face-to-face service jobs that people once took for granted – bank tellers, travel agents, typists and so on – no longer exist. That, in turn, is helping to fuel the income gap in our societies (although it should be noted that other factors – such as the decline of unions and changes in taxation – also play a role).
Of course, this isn’t the first time that technology has destroyed jobs – remember the Luddites. But, historically, each wave of technology also created new types of jobs – think of app developers, social media strategists and (if you can bear the thought) “chief listening officers”. On balance, the gains have outweighed the losses, which is part of the reason why our societies have tended to become wealthier over time.
But will this pattern continue? Not everyone is convinced. At the Massachusetts Institute of Technology, Erik Brynjolfsson and Andrew McAfee argue that the balance has begun to tilt: “They believe that rapid technological change has been destroying jobs faster than it is creating them,” reports David Rotman. They also believe that the rewards from work are increasingly divided between winners and – for want of a better word – losers: “Someone who creates a computer program to automate tax preparation might earn millions or billions of dollars while eliminating the need for countless accountants,” writes Rotman.
To be sure, the link between technology and widening income inequality is not a new idea, and it features strongly in research by the OECD, the IMF (pdf) and many others. Indeed, it was identified as far back as the 1970s by the economist Jan Tinbergen, who argued that the way incomes were distributed across society reflected in part a “race between technology and education”. When technology was winning, more rewards went to fewer people; when education was winning, more people were able to make use of technology in their work, and the rewards were spread much further.
At the moment, technology seems to be winning, which is one reason why you hear so many calls for more investment in education. But the MIT researchers seem to go further, arguing that today’s technologies and “digital versions of human intelligence” may prove to be permanent game changers: “I would like to be wrong,” Andrew McAfee says, “but when all these science-fiction technologies are deployed, what will we need all the people for?”