Don’t ask, test: Measuring real AI skills without self-reporting
More often than not, your team’s confidence in their AI skills is misplaced. Rather than rely on self-reporting, here's how to independently verify their skills.
Jul 23, 2025 • 4 Minute Read

The AI revolution is in full swing, and many organizations are eager to ride the wave. But here's the catch: according to Pluralsight's 2025 AI Skills Report, a whopping 79% of us are guilty of overstating our AI expertise. And the higher up the corporate ladder you go, the worse it gets: among C-suite execs, that number climbs to 91%. This is known as AI Skill Exaggeration, and it can have a significant impact on your AI adoption initiatives.
So, how do we, as leaders at the forefront of AI adoption, ensure staff have the AI skills they need, and make sure our perception of our own AI knowledge isn't off either? Let's dive in and explore some practical strategies for properly evaluating AI skills.
The Dunning-Kruger effect in AI
When people overstate their AI knowledge, it’s not always intentional. It can be a consequence of the Dunning-Kruger effect, a well-researched phenomenon where people with limited knowledge in a particular area greatly overestimate their own expertise. In the AI world, this can lead to:
Overconfident decision-making in AI implementation
Underestimating the complexity of AI projects
Misallocating resources due to misunderstood capabilities
Potential security risks from improper AI usage
Pluralsight’s 2025 AI Skills Report reveals a fascinating paradox: 92% of professionals feel confident in their AI skills, yet 88% believe their colleagues' lack of skills is holding back AI projects. Meanwhile, 65% of organizations have had to pull the plug on AI projects due to a lack of skilled staff. Clearly, there's a disconnect between perceived and actual AI competence.
The fear factor of selling your AI skills
The other likely cause of this exaggeration is a need to maintain job security. A staggering 95% of organizations now consider AI skills a key hiring factor, and 84% are likely to replace or outsource jobs if staff don't have the right AI chops. In a climate where having AI skills and embracing this technology can be career-critical, many may feel the need to fake expertise.
How to properly assess your staff’s AI skills
1. Embrace objective skill assessments
As the report highlights, self-reporting of AI skills is often inaccurate. The solution? Seek out expert-designed skill assessments that provide an objective ranking against industry peers. Pluralsight offers AI-specific Skill IQ tests to help you gauge your true proficiency and that of your staff. Make it a habit to reassess regularly, tracking progress and identifying areas for improvement.
2. Create a safe culture of continuous learning
Stop people from feeling the need to fake their skills by making it okay to not know. To create this culture:
Make learning part of daily work
Empower self-directed learning
Recognize and reward learning
Encourage peer knowledge sharing
Build psychological safety
The last point is the most important: leaders should openly acknowledge their own shortcomings, admit mistakes, try new things, and ask questions. Lead by example, then make sure it's okay for staff to do the same.
3. Seek diverse and constructive feedback
Don't fall into the trap of relying solely on your own judgment. Implement 360-degree feedback mechanisms within your team or organization, encouraging open and honest evaluations. Engage in AI peer reviews to gain valuable insights from others. For tech professionals, participating in AI hackathons and competitions is another great way to benchmark your skills against a diverse pool of professionals.
4. Implement hands-on learning in safe environments
Theoretical knowledge alone is insufficient in the realm of AI. Provide your team with access to approved AI tools, sandbox environments, and hands-on labs, allowing for risk-free experimentation and practical application of concepts. Start with smaller, manageable AI projects and gradually scale up as skills improve. Encourage team members to document and share their learning experiences, creating a knowledge base for others to learn from.
5. Develop metacognitive skills
Metacognition—the ability to think critically about one's own thought processes—is crucial in overcoming the Dunning-Kruger effect. Encourage regular self-reflection after completing AI tasks or projects, prompting team members to analyze their successes, challenges, and areas for improvement. Have them maintain learning journals to track progress and insights over time, and engage the team in discussions about AI concepts, openly exploring gaps in understanding and seeking clarification when needed.
6. Highlight the complexity and rapid evolution of AI
Create awareness about the vast scope and constant evolution of the AI landscape. Organize regular knowledge-sharing sessions where team members can present on emerging technologies, industry trends, and real-world applications. Invite external experts to provide fresh perspectives and insights into the challenges and opportunities within the field. By emphasizing the complexity and dynamism of AI, you can help combat the illusion of expertise and encourage a humble, learning-oriented mindset.
Conclusion: Measuring and upskilling your staff in AI is an ongoing process
AI is moving fast, so the knowledge and skills your staff have today may no longer be relevant tomorrow. That means any point-in-time testing won't necessarily represent your ongoing reality. AI upskilling and testing need to be a fear-free, ongoing part of your culture.
By embracing the methods listed above, you and your team can build a more accurate understanding of your AI capabilities and chart a path towards true expertise.
Want to find out more about this trend and others? Download Pluralsight’s full 2025 AI Skills Report to learn more about the AI landscape and how to build real AI skills that empower AI adoption and long-term success.