The Future Workforce: Overcoming the Real Barriers to AI Adoption – Skills, Trust and Clarity

Karin Verspoor (RMIT University) explains why practical AI literacy and critical evaluation help public servants choose the right tools beyond generative AI, set guardrails for bias and hallucinations, and adopt AI confidently while keeping human judgement at the centre.

Benji Crooks, Marketing Director at Public Sector Network, sits down with Karin Verspoor (Dean, School of Computing Technologies, RMIT University) to discuss the most urgent AI skill gaps for public servants, why leaders need a practical framework for choosing between generative AI and more targeted analytics tools, and how to set the right guardrails to manage risks like hallucination and bias, ahead of her AI-focused panel session at Digital Leadership Day Victoria 2026.


Benji Crooks: Hello, I’m Benji Crooks. I’m the Marketing Director at Public Sector Network. I’m with Karin Verspoor, who will be speaking at Digital Leadership Day Victoria 2026. So to start off with, if you could just introduce yourself, your role, the company you work for.

Karin Verspoor: Yeah sure. So I’m Karin Verspoor, and I’m the Dean of the School of Computing Technologies at RMIT University.

Benji Crooks: Right. So I understand the panel that you’ll be on will be very AI-centric and focused. So asking you a question to start off with there, from your perspective, what are the two or three AI skill gaps that matter the most to public servants right now?

Karin Verspoor: I think understanding a little bit about how AI technology works, and in fact the breadth of the technology, is really critical because that gives you a framework for deciding when it’s appropriate to use an AI tool and when it might be better to just rely on your brain.

Benji Crooks: Perfect. And I guess one of the big things that people worry about with AI is having the right guardrails in place. Is this something you’re thinking about within your role as well?

Karin Verspoor: Yes absolutely. So my background is actually in natural language processing, and I’ve been working on trying to get machines to make sense of language for over 20 years. And I always approach the technology with a critical evaluation lens, right. I always think about like what works, what doesn’t work, how we can make it better, and when it’s appropriate to use the technology and when we might want to think about another way to solve the problem.

And so yeah it’s really just core to doing research in this area, is thinking about what works, what doesn’t work, why it doesn’t work and how we can improve it. And we always want to be approaching technology with that kind of framework.

Benji Crooks: And to your point, I guess AI is only part of that story, and there are probably other tools that could be used within that framework you speak about.

Karin Verspoor: Yeah, so it’s important to remember that the kind of AI that we’re being exposed to right now, which is large language models and generative AI, is only one part of the landscape. We have tools that are more based around predictive analytics or trying to do what we call supervised learning, so basically to solve a much more specific and targeted problem.

And often those tools are a better choice for helping us than a generative tool, which is a little bit more unconstrained and comes with this risk of hallucination and bias and all the other problems that we’re aware of.

Benji Crooks: And I guess with all the tools that you are using, how do you identify they are the right AI tools to be using?

Karin Verspoor: Yeah so partly that’s about experience, right. And understanding what that landscape looks like. And then thinking about okay what are we actually trying to do, right. What’s the problem we’re trying to solve? What’s the human task we’re trying to support? And think about what the match between the technical solution and the problem is.

And so sometimes, often if we’re talking particularly about tasks that we repeat over and over and over again, or decisions that we’re doing on a regular basis, it is better to build a tool that is really tailored to solving that particular problem.

But if you have a context which is much more open-ended and you know you want to be able to ask any question, then a tool like an LLM is more appropriate. And so it’s really just thinking through, you know, what is the context that we want to use this in, what’s the problem we’re trying to solve, how can we best help a human, and then picking the tool, or the collection of tools, that helps solve that problem best.

Benji Crooks: And from where you sit, working within universities and with government, where should the university be more cautious when introducing new tools, and where can it move faster?

Karin Verspoor: Yeah, so I mean one thing in a university setting that we’re really worried about is of course how we use AI appropriately in our teaching. And there are some real opportunities there. So for example, we can potentially use AI to give more detailed and more tailored individual feedback to students.

But we’re also trying to think about what the impact of introducing AI into the educational setting is on learning itself. We don’t want students to lose the opportunity to develop their own conceptual framework in their brain. We need to make sure that the AI and the use of AI in the educational setting is helping and not hindering students to progress their own understanding.

Benji Crooks: Yeah, and I guess, just from my own thoughts, you’re at this crossroads where students will start looking at their own resources and using AI if the university isn’t quick enough to provide guidance for them.

Karin Verspoor: Yeah, and look, I mean this is the real world. Like people are gonna use the tools that are available to them. And so, you know, there’s no particular problem with that. We just need to be aware that that’s happening. And, you know, also give students the framework for thinking about what the impact of using those tools is, and how to use them effectively to support their learning.

For example, one of my own children who is a university-level student is using NotebookLM to help them digest their course notes and also to just give them, like, quizzes and things to test their own understanding of the material. And that’s a really useful way to help students, particularly those who, you know, might be neurodiverse in some way, or who have learning or reading difficulties. You know, it gives them a different way of interacting with the material and something that they can interrogate as many times as they want.

And that’s a really useful way of using this technology, something we didn’t have when I was a university student, right. So it’s a benefit, but we have to make sure that it is supporting their own understanding, yeah, rather than doing the work for them.

Benji Crooks: Absolutely. So as we said, you’ll be at Digital Leadership Day Victoria 2026. If we look at the day, what would you hope public sector leaders would take away from your panel discussion?

Karin Verspoor: I hope that they take away something more than just the hype. We hear a lot of, you know, hype about what AI can do, and I think giving people a framework to think through the use of the technology, one that goes a little deeper under the surface, is really important.

But also not being afraid of it. We need to find this balance. You know, we don’t want to throw the baby out with the bathwater. We wanna embrace the technology and use it where it’s valuable. And so I hope that people will have a little bit more confidence in adopting the tools in their own work environments.

Benji Crooks: Excellent. Okay, well that’s all my questions, so it’s been lovely speaking to you.


Hear Karin Verspoor live at Digital Leadership Day Victoria 2026. Join her AI-focused panel to move beyond the hype and get a practical framework for using AI safely and effectively, including how to choose the right tools for the right tasks, manage risks like hallucination and bias, and build confidence in adoption across public sector teams. View the agenda here.

Published by

Benji Crooks, Marketing Director