Concerns about technology and automation stealing jobs have been playing out for centuries. But the ways in which technological developments and especially artificial intelligence (AI) actually change jobs are not always obvious or entirely negative. Data on skills can already show us four specific ways the new generative AI models are likely to change jobs and what this means for employers, training providers, workers, and learners.
Our data shows exponential growth in demand for generative AI tools.
Digging into the data, we can see generative AI shift from an idea still in testing to a tool that non-technical workers commonly use. Some of the top titles for jobs calling for generative AI skills are what you would expect: data scientists and machine learning engineers. But the list also includes curriculum writers and program and product managers: people who are only going to use a tool once it’s practical.
Given this exponential growth, here are four key takeaways for what this might mean for the future labor market:
Across all roles, there will likely be a stronger focus on human or baseline skills (e.g., collaboration, communication, and critical thinking). Previous research by Lightcast and BCG has shown that across digital jobs, skills like these have become even more commonplace. As AI takes on more rote tasks like coding and programming, as well as even basic writing tasks, workers will need to take on more stakeholder management and idea generation. Analysis by Lightcast and the OECD released this month shows that while all AI employers have consistent demand for these human or baseline skills, leading firms (those who posted the most AI jobs) show a higher demand for workers combining their technical skills with leadership, innovation, and problem-solving.
Reskilling is even more important than before, and employers should rethink both the skills they train on and how they train. Assessing skills gaps or training needs can no longer be a one-and-done exercise for employers and training providers. Skill needs are changing too quickly: we found that, on average, 37% of the top 20 skills in a job changed in just five years. Generative AI will only accelerate this turnover.
A recent Science article using Lightcast data found that 10% of posted jobs are likely to be affected by automated writing tools. But even if large language models (the models that power technologies like ChatGPT) can take on writing tasks, there still needs to be a person prompting the tool and identifying the topics to write about. Marketing specialists, for example, may then need more training in picking interesting topics and engineering prompts that produce good first drafts than in writing exciting copy. A 2021 BCG survey showed that 65% of employees prefer to learn on the job. Instead of waiting for content to be developed to show how your employees can best use AI, employers should think about allocating internal resources for more mentoring, shadowing, and apprenticeships. This can help employers fill their skills gaps without sinking money into L&D resources that aren't agile enough to change with them.
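The 37% turnover figure above comes from comparing a job's top skills across two points in time. As a minimal sketch of how such a metric could be computed, the snippet below measures what share of an earlier top-skill list no longer appears in a later one; the skill names and years are hypothetical illustrations, not Lightcast data, and the real methodology may differ.

```python
# Sketch: estimating skill turnover between two snapshots of a job's
# top-skill list. Skill names below are hypothetical examples.

def skill_turnover(old_top, new_top):
    """Share of the earlier top skills no longer present in the later list."""
    old, new = set(old_top), set(new_top)
    return len(old - new) / len(old)

top_2018 = ["SQL", "Excel", "Communication", "Data Analysis", "Reporting"]
top_2023 = ["SQL", "Python", "Communication", "Machine Learning", "Reporting"]

turnover = skill_turnover(top_2018, top_2023)
print(f"{turnover:.0%} of the top skills changed")  # 40% in this toy example
```

In practice the lists would be the top 20 skills ranked by posting frequency, so the denominator would be 20 rather than 5, but the set-difference logic is the same.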
Skills-based hiring (and thoughtful implementation) will become even more critical. Traditional ways of initially screening applicants, such as cover letters, are going to become less useful. Work sample tests, short job-related case studies, and other project-based hiring tools are likely to become more popular (and also tend to be less biased). This will encourage skills-based hiring, as employers will need to more thoughtfully consider the skills and competencies they require from a candidate, and how to assess those in an application process, rather than relying on the signaling of a degree. With this greater focus on skills-based hiring, employers will naturally start collecting more data on skill needs and building a taxonomy of skills - something Lightcast can help with.
The ethics of using AI tools are not getting enough attention. Within AI roles, so far very few employers have been calling for ethical AI skills: less than half a percent of all AI jobs. That's likely to rise as we see how new generative AI tools are rolled out and what potential bias and backlash emerge. But we shouldn’t have to wait for that: employers using AI tools should follow guides for using AI in HR and elsewhere, and employers developing AI should increase their focus on ethical considerations. This isn’t a problem unique to generative AI, and all companies using AI models should have dedicated resources considering the ethical and moral implications.
While AI may be changing the way we do work, robots aren’t stealing all our jobs (yet). Nor are we all working beside robots (yet). Those changes will come piece by piece. A focus on how skills are changing to cope with AI may be our most revealing window into how artificial intelligence is changing our jobs, and our lives.