Why the week after matters more than the session itself
The research on learning and behaviour change is consistent on one point: what happens in the days immediately after a training session determines whether anything sticks. The session creates a window of motivation and fresh knowledge. That window closes quickly. Within two weeks, most people have reverted to their existing habits unless something has actively reinforced the new ones.
This isn't a criticism of training — it's how learning works. The session is necessary but not sufficient. What you do in the week after is what determines whether the investment pays off.
For individuals: the first three things to do
Pick one task and do it with AI today. Not tomorrow. Today. The longer the gap between learning and application, the less likely application becomes. Identify one thing on your to-do list — a document to draft, an email to write, a meeting to summarise — and use the tool for it before the day is out. It doesn't need to be perfect. It just needs to happen.
Write down three use cases that are relevant to your role. Not use cases from the training. Your use cases — the specific, recurring tasks in your work where AI could save you time or improve the output. Having these written down means you have a personal reference point rather than relying on memory of the session.
Tell someone what you're trying. This sounds simple, but it matters. Sharing an intention — even with one colleague — creates a form of accountability. It also starts the peer learning loop that tends to accelerate adoption across teams far more effectively than top-down instruction.
For managers: how to make it stick for your team
Reference it in your next team meeting. Ask people what they've tried since the session. Share one thing you've tried yourself. The signal from a manager that this is expected and valued is the single most effective driver of team-level adoption. Without it, the training becomes optional in practice, even if it was never meant to be.
Create a low-stakes space for sharing. A Teams channel, a five-minute slot in a weekly meeting, a shared document — somewhere people can share what's working, what isn't, and what they've found useful. Early adopters within the team become the most credible advocates for everyone else.
Remove the friction points you can control. If people need access to a tool they don't have, get it sorted. If there's uncertainty about what's appropriate to use AI for, clarify it. If the biggest barrier is time, acknowledge that the first few uses will be slower than the established way of doing things — and that this is normal and temporary.
For the organisation: the 30-day checkpoint
Thirty days after a training session is the right point to take stock. By then, anyone who was going to start using the tools has started. Anyone who hasn't is unlikely to without further intervention.
A brief check-in — even a short survey or a ten-minute conversation with team leads — surfaces where adoption has happened, where it hasn't, and what the barriers are. This information is far more valuable than post-session satisfaction scores, which measure how people felt about the training rather than whether it changed anything.
Common findings at the 30-day mark: adoption is strong in some teams and absent in others, usually closely tracking how actively the relevant manager engaged with the material. Use cases that were demonstrated in the session have been tried; use cases that weren't have largely been ignored. People who had a specific task in mind during the session have adopted the tool; people who attended without a clear application haven't.
Each of these findings points to a specific intervention — whether that's a follow-up session, targeted support for a particular team, or additional use case development for roles that weren't well served by the initial training.
The compounding effect
Organisations that treat AI training as an ongoing process rather than a one-time event see compounding returns. Each session builds on the last. Use cases that were theoretical become practical as people develop confidence. Teams that were sceptical become advocates when they see results from colleagues. The tools improve, and so does people's ability to use them effectively.
The week after a session is the beginning of that process, not the end of it.
The session plants the seed. What you do in the week after determines whether it grows. The actions are small — one task, three use cases, one conversation — but they're the difference between training that changes how people work and training that becomes a memory.