
Technology builds more jobs than it destroys (but you may need to retrain to keep up)

We are often encouraged to fear the march of technological progress by headline stats like “47% of American jobs will be automated by 2033!” Set against 2022 estimates of the US workforce, that is nearly 75 million people out of work. But will it really play out that way?

In the 1990s, banks widely rolled out automatic teller machines, and countless commentators gloomily predicted the demise of the teller’s job. Why would anyone hire human bank tellers once people could get their basic banking needs met by a faster, more accurate machine? The story turned out to be more complex than that. At the end of the 1980s and the beginning of the 1990s, the average bank branch needed 21 tellers to operate. Once the machines arrived, each branch needed only 13. Suddenly it became cheaper to run a bank, and as costs fell, the number of new branches being opened rose, requiring even more bank tellers. Today, even with online banking, mobile apps and our trusty old ATMs, there are more bank tellers than ever. In fact, that specific job is growing slightly faster than the job market as a whole.

It is a classic example of how new technology, properly embraced, provides more opportunities for the humans it was ostensibly meant to “replace”. Tellers’ roles have changed: very few still stand behind a counter cashing cheques. Instead, they sell more complex financial products and help customers sort out technical issues.

They have been upskilled, but they haven’t been replaced.

Something similar is happening in the world of medical care. Former UK health minister Professor the Lord Ara Darzi has examined the impact of automation on the National Health Service (NHS) and social care. His report found that up to 30% of the work currently undertaken by humans could be automated, saving almost 10% of the service’s annual running costs, some £9.7 billion. At present, health professionals spend around 70% of their time on administrative tasks, like transcribing notes and ordering tests. These are undoubtedly crucial things to do, but they take time away from diagnosing and treating patients. Considering the NHS is facing a shortfall of 118,000 health staff by 2027, giving the doctors and nurses already on the payroll more time with patients is critical.

In this case, technology is stepping in to cover the staffing gaps that the NHS cannot fill with people.

This is likely to become the pattern in more and more industries. Gartner has even estimated that while AI will replace 1.8 million jobs in the US, it will create 2.3 million, a net gain of 500,000 jobs from the widespread adoption of AI in the workplace. And the fact is, automation and similar technologies are simply too profitable for businesses to ignore. The same Gartner report predicted that automation would create “$2.9 trillion in business value and recover 6.2 billion hours of worker productivity.”

As Robert Atkinson of the Information Technology and Innovation Foundation recently said, “The truth is these technologies will provide a desperately needed boost to productivity and wages.”

Why is this boost so desperately needed? Because while half of all CEOs expect digital technologies to transform their industries, according to one survey, only 16% feel they have the right staff to meet those future demands. That leaves a gaping hole between companies’ future needs and their employees’ current skill sets.

For employers who are really looking at the bottom line, the most profitable answer isn’t to fill their offices with a bunch of new tech experts and shiny robots. The most sensible thing to do is to get the existing workforce – and especially the employees whose jobs cannot be fully automated – retrained so they have the skills to embrace this new normal.

Scott Smith, AT&T’s vice president of human resources, has overseen a massive training program for AT&T’s employees. As he put it: “You can go out to the street and hire for the skills, but we all know that the supply of technical talent is limited, and everybody is going after it. Or you can do your best to step up and reskill your existing workforce to fill the gap.”

It’s not as simple as putting staff on a single course, however. Some industry research indicates that digital skills have an average lifespan of just three years, so training completed today will soon be out of date. Ongoing learning for people in employment may well become standard.

Ongoing education has its own challenges, of course. Companies cannot let employees take years off to earn four-year degrees, for example, so education that fits around work schedules, like online courses, will have to be incorporated into workers’ lives.

Unfortunately, businesses have been scaling back their investment in training. A 2015 study by the Council of Economic Advisers found that the share of workers receiving employer-paid training fell from 19.4% in 1996 to 11.2% in 2008. For context: in 2008, while Apple was launching the App Store, businesses around the US were slashing their in-house training offerings. Some, like Walmart with its much-cited Walmart Academy, have since learned from that mistake, but many, many others remain slow to retrain their staff.

Ultimately, time marches on and progress cannot be stopped. It is natural for rapid changes in the way we live to trigger powerful emotions like uncertainty and fear. But history shows that technology doesn’t just replace human work; it gives us space to grow, to work more efficiently and to create new opportunities for each other. The people who embrace the change and retrain to keep pace with it are the ones who will thrive. In other words, “the best response to the threat of technological unemployment is to prevent it.”
