AI in Learning: The Privacy Dilemma
April 9, 2025 at 2:00 PM
by Craig Simon

The Real Question in AI-Driven Learning Isn’t What It Can Do… It’s What It Needs to Know

AI is reshaping learning and development in powerful ways. The tools now at our fingertips can personalise learning journeys, deliver real-time coaching, and surface insights that were once out of reach. But with all the excitement about what AI can do, we often overlook a more important, and more human, question:

What does AI need to know about an employee in order to truly help them learn, grow, and thrive?

If we’re going to use AI to support people at work, then we need to start with people, not just technology.

Let’s break down three key questions that every organisation must ask as they bring AI into the learning space.

1. How Much Should AI Know About an Employee?

This is the foundational question, and possibly the most complex.

To deliver personalised, timely support, AI systems need to understand the employee they’re helping. That might include data on:

  • Job role and current responsibilities
  • Learning history and skill gaps
  • Preferred learning styles
  • Feedback from past training
  • Even emotional states like stress or confidence levels

The more an AI assistant knows, the more helpful it can be. Imagine an AI tool that recognises when someone is under pressure before a presentation and offers them a quick confidence-boosting checklist. That’s powerful. But it’s also personal.

We must therefore also ask:

  • Where’s the line between helpful and intrusive?
  • How much insight is too much?
  • And who decides?

Well-designed AI solutions should be built not just to perform, but to respect. We should work on building assistants that ask just enough to be helpful, and no more. That means being intentional about data use and designing solutions that are privacy-first by default.

2. Can We Personalise Learning Without Compromising Privacy?

Too often, companies are told they must choose between rich personalisation and strong privacy protections. At RockMouse, we believe it’s possible to do both if you start with the right principles:

  • Transparency: Employees need to know what the AI is learning about them and why.
  • Consent: People should opt in to deeper personalisation, not be forced into it.
  • Boundaries: There must be clear limits to what AI tools can access or infer.

This is what we mean when we say our approach is “humans before tech.” AI is a tool to empower, not surveil. A well-designed system doesn’t just adapt to a person’s needs; it respects their context, their agency, and their boundaries.
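As a loose illustration (not a description of any actual RockMouse product, and all class and field names here are hypothetical), the three principles above could be enforced in code roughly like this: consent is opt-in, access is denied by default, and the learner can see exactly what is disclosed.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a privacy-first learner profile:
# deeper personalisation is opt-in, and sensitive fields
# stay hidden unless the learner has explicitly consented.

ALWAYS_ALLOWED = {"job_role"}                          # minimal data, on by default
OPT_IN_ONLY = {"learning_history", "stress_signals"}   # require explicit consent

@dataclass
class LearnerProfile:
    data: dict
    consents: set = field(default_factory=set)  # fields the learner opted into

    def get(self, field_name):
        """Boundaries: only return data the learner has agreed to share."""
        if field_name in ALWAYS_ALLOWED or field_name in self.consents:
            return self.data.get(field_name)
        return None  # deny by default

    def disclosure_report(self):
        """Transparency: list exactly what the assistant can currently see."""
        return sorted(ALWAYS_ALLOWED | (self.consents & OPT_IN_ONLY))

profile = LearnerProfile(
    data={"job_role": "analyst", "stress_signals": "high"},
)
assert profile.get("stress_signals") is None  # not consented: blocked
profile.consents.add("stress_signals")        # learner opts in
assert profile.get("stress_signals") == "high"
```

The design choice worth noting is the direction of the default: the assistant must earn access through consent, rather than the learner having to claw data back.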

And when employees feel safe, respected, and understood, they engage more deeply. They take ownership of their learning. And they grow faster, because the system is built around them.

Designing for Trust, Not Just Performance

Trust is the foundation of any effective learning ecosystem. Our solutions are designed to act more like empathetic colleagues than automated content pushers. A well-designed assistant doesn’t just serve up content; it tunes into context:

  • Are you under time pressure? It adjusts.
  • Feeling overwhelmed? It offers calm, structured help.
  • Want to keep a conversation private? That’s respected.
  • Prefer voice guidance over text? No problem.

The more AI works with you, the more helpful and human it becomes. And that’s the future we’re building: real-time, real-world support that adapts to each employee as they are, not just as data points in a dashboard.

3. Who Owns the Learning Data – The Company or the Employee?

This is a question that could well define the next decade of workplace learning.

In traditional systems, data flows up: into the LMS, the HRIS platform, the learning dashboards. But in an age of hyper-personalisation, where AI tools are learning from every interaction, we need a different approach.

We should be asking:

Should employees have more control over the data that reflects how they learn, perform, and grow?

We believe they should. In fact, we think of learning data as a kind of digital fingerprint: something that belongs to the learner, not the system.

We’re working towards a future where employees can carry their learning profiles with them from job to job. Imagine a self-owned, AI-enhanced learning companion that travels with you, knowing what you’ve mastered, how you like to learn, and where you want to grow. That kind of continuity empowers individuals and builds real trust in the system.
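A portable, self-owned profile like this could be sketched as a simple export/import round trip. This is purely illustrative; the format, field names, and version scheme are all assumptions, not an existing standard.

```python
import json

# Hypothetical portable learning profile: a learner-owned record
# that can be exported from one employer's system and imported
# into the next, so continuity follows the person, not the platform.

def export_profile(skills, preferences):
    """Serialise the learner-owned profile to a portable JSON document."""
    return json.dumps(
        {"version": 1, "skills": sorted(skills), "preferences": preferences},
        indent=2,
    )

def import_profile(document):
    """Rehydrate a previously exported profile at a new employer."""
    profile = json.loads(document)
    if profile.get("version") != 1:
        raise ValueError("unsupported profile version")
    return profile

doc = export_profile(
    skills={"data analysis", "presenting"},
    preferences={"modality": "voice", "pace": "short sessions"},
)
restored = import_profile(doc)
assert restored["skills"] == ["data analysis", "presenting"]
```

Because the document is plain JSON, the learner can store it, inspect it, or refuse to share parts of it, which is the point of ownership.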

Final Thought: If You’re Thinking About AI in Learning, Start with the Right Questions

The promise of AI in learning isn’t about automating the old ways; it’s about reimagining what’s possible when we centre the experience around real people.

So before asking what your next AI tool can do, ask:

  • What does it need to know to be truly helpful?
  • How do we keep the human at the centre of the design?
  • And are we building a culture of empowerment, or control?

At RockMouse, these are the questions that guide our work every day. If you’re ready to explore AI-driven learning but want to keep ethics, trust, and personalisation in balance, let’s talk.

We’re here to make sure AI works for your people, not on them.