The AI Generations Confidence Problem
- Rebecca Chandler
- Mar 30
- 2 min read

I’m not so much concerned that a 26-year-old friend of the family doesn’t use AI – that can be solved.
I’m much more concerned that a member of Gen Z/Alpha can comfortably say, “Oh yeah, I use ChatGPT all the time” but can’t assess whether the information was any good or is even complete.
Gen Z rates itself as “very knowledgeable” about AI but scores poorly when asked to demonstrate mastery of the tools: write a real prompt, evaluate what came back, spot where the AI got it wrong or left something out.
The confidence is there, but the skill required to succeed isn’t.
That’s a different problem from not using AI at all because someone who doesn’t know how to use a tool can be trained.
But how do you train someone who not only believes they already understand a tool but doesn’t know they don’t know how to ask questions and investigate information before accepting a result?
Social media doesn’t build the digital literacy required today and yet it’s their primary “tool”. Creating content for likes or writing clever comments doesn’t produce a healthy workforce. An algorithm may make them feel good (or bad) but it’s deciding what they see and what they don’t.
There is very little investigation in a tool built for entertainment.
Think about what’s required to use AI well.
You have to know what you’re asking and read the output critically. If you want to be truly successful, you can’t just skim it to see if it looks right. You must ask objectively: is this good enough, or do I need to push further? And if I need to push further, how do I do that? By getting it wrong and having to figure out why.
It’s the same skill that makes someone good at research, analysis, and solving problems that don’t come with instructions.
That’s the 5% I wrote about last year—the human part AI can’t replace, provided that 5% is actually taught.
Gallup found that only 9% of workers feel extremely prepared to use AI in their jobs. So most people know they’re not ready. But the ones who think they are pose a specific kind of risk.
Trust without literacy isn’t a skill. The next generation of workers is walking into offices unaware that they’re unprepared, and that makes them a liability. Which circles back to the systems that were supposed to prepare them.
School curricula aren’t keeping up with the expectations of hiring managers who now list AI literacy on job postings. Employers are no longer asking, “Can this person use AI?” They’re asking, “Can this person tell me when AI is wrong?”
Older generations act surprised when workers can’t operate the tools the job market demands. They forget that Gen Z and Alpha have never known a world without instant answers, even as learning how to investigate disappears from the syllabus.
Next: What does this mean for creating a workforce pipeline that’s supposed to drive success?
