Navigating Emerging Technologies
By Alec Gardner
AI, Automation and Digital Tools are Enhancing Learning while Keeping it Human-Centred and Values-Driven
Emerging technologies are reshaping learning and development (L&D) at a pace that outstrips many organisations’ ability to update capability frameworks.
Artificial intelligence (AI), automation and digital learning tools are improving personalisation, increasing access, and reducing administrative load. At the same time, these technologies raise new risks: inequity, surveillance, de-skilling and a drift from learning as a human practice toward learning as a technical optimisation problem.
For L&D leaders, the challenge is no longer simply adopting new tools. It is ensuring that technology strengthens learning quality while protecting learner agency, trust and values.
A Global Shift Toward AI-Enabled Learning
The last decade normalised digital learning as essential infrastructure. AI is now accelerating another shift: from traditional learning management systems (LMS) and eLearning modules toward learning ecosystems where content, coaching, feedback and analytics can be dynamically generated or supported.
Three global forces sit behind this transition.
First, skills disruption is intensifying. The World Economic Forum’s Future of Jobs Report 2023 notes that automation and AI continue to reshape job roles and capability needs, increasing the demand for reskilling and continuous learning.
Second, generative AI has become mainstream. Tools such as ChatGPT, Copilot and Gemini have normalised AI-enabled support for drafting, ideation and problem solving. Learners increasingly expect similar support for learning: immediate feedback, personalised practice and guidance in plain language.
Third, global institutions are emphasising learner rights and equity. UNESCO’s 2023 guidance on generative AI in education highlights both the potential to improve access and the risks of bias, misinformation and misuse of data.
The direction is clear: learning is becoming more automated, more personalised and more data-informed. Whether this strengthens learning or weakens it depends on design choices.
What Emerging Technologies Can Enhance (When Used Well)
The strongest applications of AI and automation typically share a common feature: they reduce friction and expand learning opportunities that are otherwise scarce.
Personalised Learning Pathways
AI-enabled platforms can tailor learning recommendations to role, proficiency and learner needs. This can reduce wasted effort and increase relevance. However, personalisation should not become a “black box”. Learners need to understand why content is recommended and how it aligns with their goals and performance expectations.
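One way to avoid the "black box" problem is to make the recommender return its reason alongside each suggestion. The sketch below is purely illustrative, assuming a simple skill-gap model; the course names, data fields and scoring rule are all hypothetical, not drawn from any real platform.

```python
# Illustrative sketch: a transparent course recommender that explains
# why each item is suggested. All data and the scoring rule are
# hypothetical, not taken from any real platform.

def recommend(learner_skills, target_skills, courses):
    """Rank courses by how many of the learner's skill gaps they cover,
    and attach a plain-language reason to each recommendation."""
    gaps = set(target_skills) - set(learner_skills)
    results = []
    for course in courses:
        covered = gaps & set(course["skills"])
        if covered:
            results.append({
                "course": course["name"],
                "score": len(covered),
                "reason": f"Covers your gap(s): {', '.join(sorted(covered))}",
            })
    return sorted(results, key=lambda r: r["score"], reverse=True)

courses = [
    {"name": "Facilitation Basics", "skills": {"facilitation"}},
    {"name": "Data Storytelling", "skills": {"analytics", "communication"}},
]
picks = recommend(
    learner_skills={"communication"},
    target_skills={"facilitation", "analytics", "communication"},
    courses=courses,
)
```

The design point is the `reason` field: surfacing why something was recommended, in plain language, is what keeps personalisation accountable to the learner.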
A practical example comes from consumer learning. Duolingo Max uses GPT-4 to provide roleplay conversation practice and contextual feedback through features such as "Explain My Answer". This is designed to address a persistent challenge in language learning: access to the kind of practice and feedback that usually requires a tutor.
The implication for workplace learning is important. AI adds most value when it expands access to high-quality practice and coaching, rather than simply producing more content.
AI as a Learning Coach
Generative AI can support learners through scenario prompts, reflective questions, drafting support, summarisation and plain-language explanations. Used well, this can strengthen self-directed learning by providing scaffolding at the moment of need.
The learning design risk is that AI becomes a shortcut. AI can generate answers, but it does not guarantee understanding. Human-centred design treats AI as a coach that supports reflection and judgement, not a replacement for thinking.
Automation that Creates Time for Learning
Automation can reduce workload across L&D functions, including enrolment workflows, compliance reminders, assessment scheduling and reporting. The benefits can be significant, particularly for small teams managing large programs.
The key question is how organisations reinvest time saved. When the automation dividend is directed into coaching, facilitation, evaluation and inclusion practices, learning quality improves. When it is used only to increase throughput, learning becomes transactional.
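To illustrate the kind of automation involved, the sketch below generates overdue compliance-training reminders from enrolment records. The field names and the 14-day grace period are hypothetical; the point is that this routine chasing can run unattended, freeing the team for coaching and facilitation.

```python
# Illustrative sketch: automating compliance-training reminders.
# Field names and the 14-day grace period are hypothetical.

from datetime import date, timedelta

def overdue_reminders(enrolments, today, grace_days=14):
    """Return a reminder message for each incomplete enrolment that is
    past its due date by more than the grace period."""
    reminders = []
    for e in enrolments:
        if not e["completed"] and today - e["due"] > timedelta(days=grace_days):
            reminders.append(
                f"Reminder for {e['learner']}: '{e['course']}' was due {e['due']}."
            )
    return reminders

enrolments = [
    {"learner": "Priya", "course": "Privacy Essentials",
     "due": date(2025, 1, 10), "completed": False},
    {"learner": "Tom", "course": "Privacy Essentials",
     "due": date(2025, 1, 10), "completed": True},
]
messages = overdue_reminders(enrolments, today=date(2025, 2, 1))
```

In practice a script like this would read from the LMS and send messages through email or chat; the logic, not the plumbing, is the point here.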
Scaling Content and Capability
AI can accelerate the translation and development of learning resources. Duolingo announced 148 new language courses in April 2025, describing how generative AI assisted rapid course development through a shared content approach. This reflects a wider global trend: AI is speeding up content production and localisation.
However, scale and quality are not the same thing. Public concerns about perceived quality issues and inaccurate lessons in 2025 show that rapid expansion can weaken learner trust if quality assurance and cultural relevance are not protected. For L&D leaders, the lesson is that governance is essential when scale increases.
Why Human-Centred and Values-Driven Matters
Technology is not neutral. It reflects and amplifies the values embedded in how it is designed and deployed. Three risks are particularly relevant to learning.
Bias and Inequity
AI systems learn from historical patterns, and those patterns may include bias. If AI influences assessment, progression tracking or selection for development opportunities, there is a risk of reinforcing inequities. UNESCO stresses the importance of transparency and safeguards where AI influences educational outcomes.
Privacy and Surveillance
Learning analytics can drift into monitoring. When learning becomes a source of behavioural tracking, psychological safety and engagement decline. Learners should know what data is collected, why it is collected, and how it is protected.
De-skilling and Over-Reliance
If learners outsource thinking to AI, they may develop surface competence without deeper capability. The skills that remain most valuable in an AI-enabled future are those AI cannot replace: ethical judgement, empathy, creativity, collaboration and context-sensitive decision making.
These risks do not mean AI should be avoided. They mean AI should be adopted with governance, transparency and strong learning design.
A Practical Framework for Responsible Adoption
A simple way for learning leaders to evaluate emerging technologies is to focus on three principles: agency, trust and impact.
Protect Learner Agency
Learners should know when AI is used for recommendations, feedback or assessment. Where possible, provide opt-out mechanisms, and build AI literacy so learners understand both the capabilities and the limitations of these tools.
IBM’s approach offers a useful example. The European Commission’s repository of AI literacy practices describes how IBM uses its AI-powered "Your Learning" platform and encourages structured learning and applied challenges, including the watsonx challenge. The focus is on capability as empowerment: AI literacy framed as workforce agency rather than a compliance exercise.
Build Trust through Governance
Governance should not be framed as a barrier to innovation. It is a necessary condition for sustainable adoption. Clear accountability for AI outputs, equity checks, privacy impact assessments, and procurement scrutiny should become standard L&D practice.
The pace of industry investment reinforces why this matters. In December 2025, IBM and Pearson announced a partnership to build AI-powered personalised learning tools for organisations, signalling that AI-enabled learning products will become mainstream. As this ecosystem grows, governance becomes essential to maintain quality, fairness and learner trust.
Measure What Matters
AI generates more learning data than ever before, but the goal is not measurement for its own sake. Beyond completion rates, organisations should evaluate transfer into work, capability confidence, inclusion outcomes and unintended consequences such as stress, inequity or disengagement.
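The shift in emphasis can be made concrete. The sketch below summarises a program using transfer into work and a completion gap between learner groups alongside the usual completion rate. The metrics and the equity check are hypothetical examples of "measuring what matters", not a standard evaluation model.

```python
# Illustrative sketch: summarising a program beyond completion rates.
# The metrics and the inclusion-gap check are hypothetical examples,
# not a standard evaluation model.

def evaluate_program(records):
    """Summarise completion, self-reported transfer into work, and the
    completion gap between learner groups (a simple equity signal)."""
    n = len(records)
    completion = sum(r["completed"] for r in records) / n
    transfer = sum(r["applied_at_work"] for r in records) / n
    by_group = {}
    for r in records:
        by_group.setdefault(r["group"], []).append(r["completed"])
    group_rates = {g: sum(v) / len(v) for g, v in by_group.items()}
    return {
        "completion_rate": completion,
        "transfer_rate": transfer,
        "completion_gap": max(group_rates.values()) - min(group_rates.values()),
    }

records = [
    {"completed": True,  "applied_at_work": True,  "group": "office"},
    {"completed": True,  "applied_at_work": False, "group": "office"},
    {"completed": False, "applied_at_work": False, "group": "frontline"},
    {"completed": True,  "applied_at_work": True,  "group": "frontline"},
]
summary = evaluate_program(records)
```

A large completion gap between groups is exactly the kind of unintended consequence that completion rates alone would hide.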
What to Watch in 2026 and Beyond
Several trends are likely to accelerate.
AI literacy will become a baseline workforce capability, embedded into professional standards and capability frameworks.
Human skills will become strategic differentiators as AI handles routine tasks. L&D will increasingly focus on leadership, ethical judgement, collaboration and empathy.
AI-supported assessment and credentialing will grow, raising the stakes for fairness, validity and transparency.
Regulation and standards will mature quickly, rewarding organisations that build governance early rather than retrofitting it after harm occurs.
A Grounded Starting Point for Learning Teams
A strong starting point is not adopting a tool. It is developing a values-based learning technology strategy.
Define the learning outcomes sought, such as capability uplift, inclusion, performance and wellbeing. Identify friction points, such as administrative overload, slow onboarding or limited access to practice. Deploy technology only where it improves outcomes. Test and evaluate with learners, and refine based on evidence. Build governance and AI literacy alongside implementation.
Emerging technologies can create the conditions for more human learning, not less. When automation reduces administration and AI expands access to practice and feedback, learning professionals gain more time to focus on what matters most: facilitation, judgement, culture and meaning.
The future of learning is not a race to automate. It is a decision about what should be protected and amplified.
Further Reading and Resources
UNESCO (2023). Guidance for Generative AI in Education and Research.
World Economic Forum (2023). The Future of Jobs Report 2023.
Stanford Human-Centered Artificial Intelligence (HAI) (2024). AI Index Report 2024.
Duolingo (2023). Introducing Duolingo Max (GPT-4 powered features).
The Verge (2025). “Duolingo said it doubled its language courses thanks to AI.”
Polygon (2025). “Duolingo users are in turmoil over the app’s AI lessons.”
European Commission. Repository of AI literacy practices: IBM.
IBM Newsroom (2025). “IBM and Pearson collaborate to build AI-powered learning tools.”
Interested in AI and eLearning?
AITD offers several courses on both AI and eLearning:
AI Essentials for L&D Professionals: Do you want to transform your L&D workflow and enhance learner experiences using Generative AI? In this blended learning course, you’ll learn how to use Generative (Gen) AI for a range of key L&D functions, including skills gap analysis, content creation, personalised learning and feedback. Register now.
eLearning: Foundations: This is Part I of an engaging, social learning suite of courses that provides you with access to learning experiences, activities and a comprehensive knowledge base. Register now. You may also be interested in eLearning: Planning and Design; and eLearning: Production and Delivery.
About the Author: Alec Gardner
Alec Gardner is a learning and development specialist with experience designing and improving capability systems across corporate and vocational contexts. His work focuses on practical, responsible adoption of emerging technologies, including AI, automation, and digital learning tools, to improve learning quality and performance outcomes. Alec supports L&D leaders to align learning design with workforce needs, establish fit-for-purpose governance, and measure impact beyond participation metrics. He is particularly interested in how technology can reduce admin load and enable more coaching, facilitation, and applied learning in the flow of work.