Learning in the Flow of Work is Still Largely Invisible
By Paul Wahltuch
Organisations say they value learning in the flow of work, yet they still judge development mainly through courses, completions, attendance and credentials. That is the central contradiction in workforce development today: capability often grows in the work itself, but formal learning signals still carry most of the evidentiary weight.
The problem is not that people are not learning. It is that organisations struggle to see, compare and use much of the learning that already happens through work.
People develop by handling tougher stakeholders, solving unfamiliar problems, making decisions with less supervision, recovering from mistakes and taking on stretch work that sharpens judgment. Most HR and L&D leaders know this. Yet many organisations can report learning completions far more easily than they can explain how someone has become more capable through the work they have done.
When that growth stays invisible, decisions default to easy proxies. Promotion and readiness judgments lean too heavily on credentials, course history, or polished self-description. Workforce planning reflects what has been recorded, not necessarily what has changed.
The result is distorted talent decisions. People who have quietly become more capable are overlooked for stretch work, internal mobility or succession conversations because their growth has never been translated into a usable form. Meanwhile, people with the strongest formal signals can appear more ready than they are. Visibility is not just an L&D issue; it shapes how organisations recognise potential, allocate opportunity and assess risk.
Making Workplace Growth Visible
That is why learning in the flow of work has to mean more than support delivered at the point of need. Just-in-time content, prompts and performance support matter. But they address only half the challenge. The other half is making workplace growth visible enough to guide better conversations and better decisions.
That is difficult because informal learning does not arrive in neat packets. It is spread across projects, handovers, customer interactions, problem-solving, rework and gradual increases in independence. Managers often notice the shift: someone now handles complexity more calmly, exercises better judgment, or solves problems that once needed escalation. But that understanding often remains local. It is seen, then lost.
Once we realise that the problem is visibility, capability frameworks become a practical operating tool. They provide a shared language for recognising stronger performance and discussing growth in a way that can travel beyond individual intuition.
Instead of relying on intuition alone, a framework helps managers and employees ask sharper questions: What kind of work can this person now handle? What decisions can they now make with less support? What evidence from recent work shows that change?
This is not a new idea. Capability frameworks have long been one of the stronger disciplines available to organisations that want a more serious view of workforce capability. The reason they have not spread more widely is that they have traditionally been expensive to build, maintain and operationalise, and hard to apply consistently across managers and teams. They have remained ‘best practice’, but usable only by the largest organisations with the scale to absorb the associated costs.
The Role of AI
AI changes that practicality equation. It can help translate role material into framework language, support the maintenance of capability libraries, and map those capabilities to individuals’ roles.
These role-tailored capabilities can then be used in meetings between team leaders and team members to scaffold conversations, with objective guidance on the outcomes, responsibilities and accountabilities a team member is expected to demonstrate.
Used well, AI does not replace judgment. It supports judgment by making a more disciplined way of recognising growth easier to sustain across a larger workforce.
A simple example of a manager and employee makes the shift clearer. Kirra is a database administrator, and Lisa is her team leader. By defining the capability level Kirra’s role requires and benchmarking her against that expectation, they have identified a need to elevate her database administration abilities.
Figure 1: Greenbeam capability analysis and action plan
Figure 1 shows how a capability framework clarifies both the target and the current gap. Kirra and Lisa can see the outcomes she needs to demonstrate and look for upcoming work that gives her the chance to demonstrate them.
Figure 2: Skill journals can define action plans to close specific capability gaps
Figure 2 illustrates the capture of competence as it happens. Competence can be understood as evidenced capability: here Kirra demonstrates the outcome ‘Provides technical leadership to optimise the performance of databases’ and evidences it by communicating it to her team leader. The framework has scaffolded the communication and thinking that make learning in the flow of work visible.
A learning system might still show a course or two completed during that period. But the real development is in the work. A capability-based conversation makes that growth usable: it names what has improved, links it to recent evidence, and informs the next stretch assignment, support decision, or readiness judgment.
This also improves the quality of the development experience. Recognition feels more credible when it is tied to what someone can now do. Manager conversations become more specific. HR and L&D get a clearer read on where capability is genuinely strengthening, where it is stalling, and where targeted support is worth the investment.
For HR and L&D leaders, the practical move is to make a small number of important capability areas easier to discuss inside work. Start with priority roles or capability domains. Define them in capability framework terms. Encourage capability reflection in conversations that already exist: one-to-ones, project reviews, check-ins, and development discussions.
The real test of learning in the flow of work is not what support is delivered, but what growth becomes visible, and therefore usable.
Interested in AI and eLearning?
AITD offers several courses on both AI and eLearning:
AI Essentials for L&D Professionals: Do you want to transform your L&D workflow and enhance learner experiences using Generative AI? In this blended learning course, you’ll learn how to use Generative (Gen) AI for a range of key L&D functions, including skills gap analysis, content creation, personalised learning and feedback. Register now.
eLearning: Foundations: This is Part I of an engaging, social learning suite of courses that provides you with access to learning experiences, activities and a comprehensive knowledge base. Register now. You may also be interested in eLearning: Planning and Design; and eLearning: Production and Delivery.
About the Author: Paul Wahltuch
Paul is a former infantryman who has spent more than 25 years in product creation across several domains and regions. He is currently the Chief Product Officer at Greenbeam. Highly skilled at optimising product and engineering culture, Paul’s key focus is on enabling Greenbeam’s product teams to shine.