All is not well in the global talent arena. The digital “skills gap” that emerged last decade is widening into a chasm. According to International Data Corp’s FutureScape 2019 report, two million jobs in artificial intelligence (AI), the Internet of Things, cybersecurity and blockchain will remain unfilled by 2023 due to a lack of human talent. Some experts claim the only solution is a structural reset focused on how individuals learn. Most agree that the transition won’t be easy.
That’s because the skills gap has been deepening for decades. It became measurable in 1964, when the International Association for the Evaluation of Educational Achievement fielded the First International Mathematics Study (FIMS), which ranked the math proficiency of students in 13 developed countries. The U.S., which finished last, was already experiencing a skills imbalance. The first signs had emerged in 1942, when the U.S. War Department’s Army General Classification Test indicated that 40% of Americans aged 17 to 24 had the cognitive ability of an eight-year-old.
By 1983, officials in the Reagan Administration were so concerned that they commissioned a report entitled A Nation at Risk, whose ominous conclusion warned: “Our once unchallenged preeminence in commerce, industry, science and technological innovation is being overtaken by competitors throughout the world...if an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war.”
Alarmed by this finding, education administrators and politicians used the report to usher in the era of standardized testing, seeking accountability for the nation’s investment in public education.
But times have changed. Rick Miller, president of Olin College of Engineering, proclaimed back in November 2014 that “We live in an age of just-in-time learning facilitated by powerful online search engines, and in the workplace of the future, what one knows will be less important than what one can do with what one knows.”
Who will fill the gap?
And yet, in a recent study by Cognizant’s Center for the Future of Work (CFoW), only 27% of business executives claim their employees have the skills to work or interact with top emerging technologies, such as AI, big data/analytics, IoT, mobile technology, open APIs and cybersecurity.
That is a huge skills gap. And corporations aren’t necessarily looking to higher education institutions for help. In the CFoW study, 67% of business leaders said they’re concerned about the effectiveness of higher education institutions in preparing the workforce of the future.
AI skills are needed in the workplace right now, but it could be 2025 before many college students find an AI course in their school’s syllabus. Higher education institutions, after all, refresh their curricula only every two to six years, according to the CFoW study.
No wonder, then, that according to the study, roughly 6 in 10 companies are beginning to bear the burden of learning for their employees, whether by overhauling their corporate learning and training development programs (65%), increasing their investment in reskilling (62%) or offering specialized training on emerging technology (60%).
That’s encouraging. But many chief information officers (CIOs) with whom I speak are reluctant to fully embrace these kinds of programs.
They remain convinced that once they reskill employees in an emerging technology area, those employees will add the new skill to their resumes and head off to a different employer.
I disagree with this contention. Thankfully, so do many forward-looking business executives. According to the CFoW study, these executives are prioritizing skill enhancement programs for workers in robotics/AI (82%), human-centric skills like communication, collaboration and problem-solving (80%), tech skills/web design/UI design (73%), project planning (67%) and discrete tech skills in STEM disciplines (63%).
Overcoming a last-century reskilling mindset
But here’s the rub. These approaches to upskilling are often grounded in 20th-century learning methods such as instructor-led classes rather than on-the-job training, e-learning and video learning. If there’s a looming shortage of two million workers for jobs in emerging technology areas, where are companies going to find the competent instructors to teach them? I was also surprised to see learning approaches based on AI (28%) and augmented reality (19%) far down the list. That’s another skills gap to reckon with.
A study by global recruitment firm Harvey Nash offers further insight into how CIOs are strategically dealing with the skills gap. Respondents were first asked about which tech areas are most impacted by a skills gap. Responses included big data (46%), enterprise architecture (36%) and security (35%).
For me, the key question in the study is this: “Which method do you use to find the right skills?” Rather than innovative reskilling, responses included “using contractors/consultants to fill the gap” (85%), “using outsourcing/offshoring to supplement internal teams” (71%) and “using automation to remove the need for headcount” (67%).
If corporations aren’t interested in reskilling workers for emerging technologies, and higher education institutions are reluctant to change their insular business models, what options remain for workers looking to learn emerging technology skills?
Look in the mirror.
Kathleen deLaski, founder of Education Design Lab, says digital badges have gained a lot of traction quickly, but “we need corporate hiring managers to give clearer signals to validate these as credentials.”
A recent study from iCIMS of a thousand technology hiring executives offers three findings that suggest these “clearer signals” are emerging:
- 80% of respondents said they would offer tech job candidates the same salary regardless of whether they had a relevant tech degree
- 61% said a four-year college degree alone does not prepare job seekers to be successful in today’s workforce
- 45% said they believe that in the next two years a coding bootcamp certificate will be as meaningful a qualification for a skilled technology position as a college degree
Digital badges have shortcomings. Most notable is the lack of industry standards for course quality or the amount of personal commitment required to earn one. But from what Roger Schank, founder of Experiential Teaching Online and former chief education officer at Carnegie Mellon University, tells me, “In the end, credentials mean what we think they mean. ‘I’m a high school graduate’ used to mean something; now if you bragged about that, you would be laughed at. The real issue is what one has actually done and being supported by any credential that means something to corporations. The future belongs to digital credentials.”
A different kind of bridge to the future
What’s clear is that companies and higher education institutions are not doing enough to bridge the widening technology talent gap. Frank Gens, Senior Vice President and Chief Analyst at IDC, offers this ominous prediction: By 2023, the global economy will create 500 million new native applications—the same number created in the past 40 years. To compete in that environment, Gens says, C-suite executives “must consider everyone a developer.”
Voltaire, the 18th-century French philosopher, wrote, “One day everything will be well, that is our hope. Everything is fine today, that is our illusion.” With a shortage of 900,000 emerging technology workers looming in a global digital economy seeking to roll out 500 million native apps, all is not well. This is especially so in an ecosystem in which “talentism is the new capitalism,” as Klaus Schwab, co-founder of the World Economic Forum, puts it. Business and technology leaders know this firsthand.
It’s foolish to continue believing that higher education institutions and corporate training programs grounded in traditional 20th-century approaches will offer meaningful solutions to this talent gap.
In the fourth industrial revolution, individuals can no longer primarily rely on higher education institutions or corporations to learn new skills. While it’s incumbent on these entities to restructure how they train and teach, that isn’t happening quickly enough. In the interim, it will be up to workers themselves to relearn how to learn, or rely on organizations that offer more agile and less costly approaches to upskilling.
As the landscape of work continues to shift in the digital era, all participants—higher-education institutions, corporations and workers themselves—have a role to play in making the future of work and the future of learning a reality.
This article first appeared in The Cognizanti Journal
*IDC FutureScape: Worldwide IT Industry 2020 Predictions, Doc # US45599219, October 2019