AI in Education Is Here to Stay. Let's Get It Right.

Neither the educators nor the students were ready. But AI is in education, and it's here to stay. There is plenty of chaos at the moment, but long term, integrating AI into learning and education workflows is a good thing. It is also true that we need to be thoughtful about how and where we use it.

Some of the best learning techniques, like teaching a concept back to someone (the Feynman technique), mixing up topics instead of drilling one thing (interleaving), or combining words with pictures (dual coding), used to require great teachers and lots of time. Now AI can build these approaches right into learning tools. It's something we could never do at scale before.

I'm not here to argue against AI in education. I build these systems. I've seen what happens when they're done well: they help people learn at their own pace and give them confidence to keep going.

But we also have a technology problem we have seen before: a Jurassic Park problem.

As Dr. Ian Malcolm put it in the 1993 movie Jurassic Park, we tend to build things because we can, but we don't always stop to ask whether we should.

When a system can shape how someone learns or which doors quietly close on them, the stakes get serious. It isn't like the personalization we're used to in everyday life.

Personalization in learning is different

Personalization is not new; we're used to it in social media, in the apps we use, and in everyday life. Amazon has been recommending books for years. Spotify figures out your music taste with surprising accuracy.

When those systems get it wrong, it's not a big deal. You skip a song or you ignore a suggestion.

But learning is different.

A wrong call in a learning system doesn't just waste your time. It can shake your confidence or push you away from a subject you might have loved, without you ever knowing why.

That's where the Jurassic Park analogy matters.

In shopping and entertainment, getting it wrong is cheap. In education, getting it wrong shapes people. Once a system starts guiding someone's learning path, the question isn't whether it's impressive. It's whether it's responsible and can be held accountable.

That's not a reason to avoid personalized learning. But it is a reason to hold it to a higher standard.

Lack of transparency is the risk

The biggest problem with many AI learning systems isn't the personalization itself. It's what happens behind the scenes.

These systems often work quietly in the background, watching how users interact with the platform and changing their learning paths without asking for consent, switching strategies without disclosing what they are doing or why.

From the learner's perspective, things just shift without explanation: the material gets easier or harder, the pace changes, and options appear or disappear.

When decisions about your learning happen out of sight, you slowly lose control. And once control is gone, trust in the system goes with it.

Personalized learning shouldn't feel like something happening to you.

Behavioral analytics is an incomplete story

AI works well in education because it watches what learners actually do. What students say and what they do are often very different. Someone might believe they are a "visual learner," but their behavior tells a messier story about how they really learn.

This information is incredibly useful, but it's incomplete.

Behavior shows what is happening, but not always why. When systems treat behavior as the whole truth, they can mistake hesitation for inability and curiosity for confusion.

That's where the good intentions of people building AI education systems start to backfire.

Questions builders should ask

If you build or manage AI education systems, ask these questions to check the fairness of the tool: Does the learner know the system is adapting to them? Can they see what changed and why? Can they opt out or override a decision?

If the answer is no, personalization becomes something done to learners, not for them.

When the answer is yes, AI stops being a hidden authority and starts being a visible helper.

Watch the system, not just the student

We've gotten very good at measuring learners: How much do they engage? What do they finish? How often do they get things right? How long do they spend on each task?

This is very important and helpful for learners. But we rarely ask the same questions about the system shaping those results.

If a learning system shapes outcomes, it deserves the same hard look as the people using it.

Making learning more human

The point of personalized learning is more than just speed. It's understanding.

When AI systems are genuinely transparent, open to question, and carefully monitored, they don't replace teachers or take control away from learners. They support both.

They give teachers earlier warning signs. They give learners clearer direction. And they create room to fix things before frustration takes over.

Personalized AI learning has enormous potential. And because it does, we owe learners more than quiet fine-tuning in the background.

If we get this right, data doesn't just make learning smarter.

It makes it more human.

About Vin Mitty

Vin Mitty, also known as Vinayak Mitty, is the director of data science and AI at LegalShield, an American legal services provider. Passionate about technology adoption, he has previously been featured in USA Today, TechTimes and TheAdaNews.