Generative AI toolkit: How to prep your org for AI adoption
Before adopting AI tools, you need to bring your software developers on board. Use our toolkit to measure your org’s AI adoption readiness and AI skill threat.
Nov 06, 2023 • 5 Minute Read
- Software Development
- Engineering Leadership
- Software Delivery Process
- AI & Machine Learning
- Learning & Development
As generative AI adoption increases, so do your software engineers’ job security concerns. How can your organization adopt the latest and greatest tech while bringing your teams on board?
Our Developer Success Lab conducted original research to put together the Generative AI Adoption Toolkit. This toolkit gives you a way to benchmark your org’s AI adoption readiness with a human-centered mindset—plus the tools and strategies to assist you in your journey.
Keep reading to learn more about the toolkit and take the AI adoption assessment.
Table of contents
- The AI skill threat: Will AI replace programmers?
- What organizations need to confront the skill gap risks of artificial intelligence
- Generative AI toolkit: Preparing for organizational AI adoption
- Perform a post-assessment for AI adoption
- Upskilling is key to using AI coding tools
The AI skill threat: Will AI replace programmers?
The AI skill threat refers to the fear that developers’ skills will become obsolete with the rise of generative AI. In short, AI threatens their identities as software developers and code writers.
Nearly half (45%) of software developers report worry, anxiety, and fear about whether they can succeed in this era with their current technical skill set. Knowing this, organizations need to support developers when adopting AI-assisted software development practices. This includes upskilling to close the AI skills gap.
Hear from Pluralsight CEO Aaron Skonnard about the importance of human intelligence in the age of artificial intelligence.
What organizations need to confront the skill gap risks of artificial intelligence
While the AI skill threat can impact any engineer, it’s more likely to occur in organizations and software teams with high contest cultures and imposter syndrome. In other words, developers feel more anxious and worried about AI when they doubt their skill, intellect, and accomplishments as developers.
But the opposite is also true. Organizations with strong learning cultures and a sense of belonging have a lower likelihood of experiencing AI skill threat. In fact, we found that learning culture and belonging predicted a decrease in AI skill threat and an increase in individual developer productivity and overall team effectiveness.
Generative AI toolkit: Preparing for organizational AI adoption
Before adopting AI technology in the development process, take stock of your current situation. This gives you a baseline to evaluate the success of any AI services you adopt.
You can do this by taking our AI assessment. This assessment uses empirically validated survey items that have been tested on a large number of people and are designed to be stable and reliable. This means you can use it to measure the impact of things like learn-a-thons and pre-mortems on your organization’s AI skill threat.
Once you complete the assessment, you’ll receive a score across five key dimensions. We’ll take a look at each one in more detail.
1. AI tool use: How do teams use AI technology?
AI tool use refers to how and when you and your teams currently use AI technology in development work. Once you know where you use AI tools, you can determine the policies and practices to use them efficiently and securely. AI tool use also helps you identify new opportunities to use AI and cut down on shadow IT to improve security.
Use these questions to assess AI tool use:
- Do your teams currently use AI in the development process?
- If your teams use AI for development work, where do you use it? For example:
  - Searching through code
  - Conducting code reviews
  - Completing task management duties
2. AI quality rating: How trustworthy is artificial intelligence?
AI quality rating indicates how much you and your teams trust AI to deliver quality output. Knowing how much people trust AI can help you address concerns and boost AI adoption.
For example, some employees may not trust AI because it has returned inaccurate or incomplete information. In many cases, they simply need to craft prompts that elicit more relevant responses. With prompt engineering training, they can write better prompts, get better outputs, and use AI more successfully.
On the other hand, some employees may trust AI tools too much. That overconfidence can lead to security risks, ethical concerns, and avoidable mistakes. These employees may need a brushup on the ethical considerations of AI, problem-solving, critical thinking, and other soft skills.
Consider the following to understand AI quality rating:
- Do you frequently encounter inaccuracies or hallucinations when using generative AI?
- How much do you trust the output of the AI tools you use?
3. AI skill threat: Will AI replace jobs?
As we noted above, AI skill threat refers to developers’ fear that AI will make their skills obsolete. This fear lowers morale and impedes effective AI use in your organization. To assess AI skill threat in your organization, consider:
- Are your teams concerned about how AI will change software development?
- Are teams worried their skills will become obsolete due to generative AI?
- Are teams worried about job security with the rise of AI?
4. Learning culture: Are employees encouraged to learn tech skills?
Organizations with a strong learning culture encourage and incentivize employees to learn. They provide time to learn, relevant upskilling opportunities, and psychological safety. With AI roles in high demand, orgs with continuous learning cultures can build those skills in existing talent.
To measure your organization’s learning culture, ask:
- Do you think your team members are learning and growing as developers?
- Are team members encouraged to share what they learn?
- Can team members make mistakes without fear of blame?
- Does your organization provide learning opportunities?
5. Belonging: Do developers feel a sense of belonging and inclusion?
Developers with a strong sense of belonging are less likely to feel threatened by generative AI and other emerging technologies. This makes belonging one of the most important indicators of AI tool success—and one of the trickiest to measure. To get more insight into software developers’ sense of belonging, consider:
- Are team members accepted for who they are?
- Do team members support each other?
Perform a post-assessment for AI adoption
After you implement AI tools, retake the AI assessment using the same questions and criteria from your pre-assessment. This allows you to measure the success of any tools you adopt or upskilling programs you implement.
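If you collect survey responses digitally, the pre/post comparison can be automated. The sketch below is purely illustrative: the five dimension names mirror the sections above, but the 1-5 rating scale, response format, and scoring method are assumptions, not the actual Pluralsight assessment design.

```python
# Hypothetical sketch: average each dimension's 1-5 ratings across
# respondents, then compute the change between pre- and post-assessments.
# Dimension names and the Likert scale are illustrative assumptions.

DIMENSIONS = [
    "ai_tool_use",
    "ai_quality_rating",
    "ai_skill_threat",
    "learning_culture",
    "belonging",
]

def dimension_scores(responses):
    """Average the respondents' ratings for each dimension."""
    return {
        dim: round(sum(r[dim] for r in responses) / len(responses), 2)
        for dim in DIMENSIONS
    }

def pre_post_delta(pre, post):
    """Change in each dimension from pre- to post-assessment."""
    pre_s, post_s = dimension_scores(pre), dimension_scores(post)
    return {dim: round(post_s[dim] - pre_s[dim], 2) for dim in DIMENSIONS}

# Example with made-up survey data (two respondents per round):
pre = [
    {"ai_tool_use": 2, "ai_quality_rating": 3, "ai_skill_threat": 4,
     "learning_culture": 3, "belonging": 4},
    {"ai_tool_use": 3, "ai_quality_rating": 2, "ai_skill_threat": 5,
     "learning_culture": 3, "belonging": 3},
]
post = [
    {"ai_tool_use": 4, "ai_quality_rating": 4, "ai_skill_threat": 3,
     "learning_culture": 4, "belonging": 4},
    {"ai_tool_use": 4, "ai_quality_rating": 3, "ai_skill_threat": 3,
     "learning_culture": 4, "belonging": 4},
]
print(pre_post_delta(pre, post))
```

In this made-up data, a positive delta on learning culture or belonging and a negative delta on AI skill threat would suggest the adoption program is moving in the right direction.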
Upskilling is key to using AI coding tools
Despite the AI skill threat, 74% of developers plan to actively upskill in AI-assisted coding tools and technologies. The key to navigating the AI skill threat? Adopt AI with a human-first approach.
That’s where our AI toolkit comes in. It includes an AI skill threat assessment, a pre-mortem template for adopting AI-assisted coding tools, and a guide for implementing a generative AI learn-a-thon for engineers.