
Three things I see going wrong with AI implementations in industry

AI projects rarely fail because of the technology. They fail in the process analysis, the adoption on the shop floor, or the knowledge transfer.


I believe in AI. I think it can make a serious difference for companies in the manufacturing industry. But I also regularly see what goes wrong — and precisely because I'm enthusiastic about it, I think it's important to be honest about that.

Because when an implementation fails, it has consequences. Not just financial ones. It also creates skepticism in the organization — and that is much harder to overcome afterwards than the technical challenge ever was.

These are the three pitfalls I encounter most often.

1. Starting too quickly without process analysis

I understand the urge. AI promises a lot. Suppliers promise even more. And there's a sense of urgency — the idea that you'll fall behind if you don't start quickly.

But AI is not a miracle cure that suddenly makes a chaotic process orderly. If the information flows in your company are not in order, if data is scattered across dozens of Excel files and emails, if no one knows exactly how a process runs from A to Z — then AI makes that more chaotic, not more orderly.

What I see: a tool is purchased, people experiment enthusiastically with it, and after three months no one notices a difference anymore because the tool doesn't align with how the work actually flows.

The rule I always apply: if you can't explain the process without exceptions and workarounds, it's not ready for AI yet.

The solution is not complicated: start with an honest analysis of how the process works now. Not how it should work on paper — how it actually works. Only then look at where AI adds something.

2. Choosing tools that the shop floor doesn't use

This is perhaps the most common pitfall. A system is implemented that works fine technically. The dashboard looks good. The demo was impressive. But six months later, half the employees don't use it — and the other half uses it alongside their old way of working.

Why? Because the people who have to use it weren't involved in the choice. Because the interface doesn't match how they work. Because it takes three extra steps compared to what they were already doing. Or simply because no one properly explained what it actually does for them.

I've experienced this up close. As a welder, as a work planner, as an operator — I know how it feels when a system is imposed that makes work harder instead of easier. And I know how quickly people then come up with their own workarounds.

The solution: involve the people who will work with it early in the process. Not to convince them — but to understand what they need. A solution they helped build is one they'll actually use.

3. Knowledge that stays with the consultant

This is one that personally bothers me — also in my own field. A consultant builds a nice solution, implements it neatly, and leaves. Three months later, no one knows exactly how the system works anymore. When something goes wrong, you call the consultant again. And you stay dependent indefinitely.

That's not a good situation. Not for you, and actually not for a consultant who seriously believes in their work either.

Real implementation means transfer. It means your team understands what has been built, how it works, and how they can adapt or extend it when the situation changes. It means accurate documentation. It means the knowledge sits in your organization — not with me.

I always ask myself the question: if I'm no longer available tomorrow, can this company continue independently? If the answer is no, the work isn't finished yet.

What this means in practice

If you want to start with AI — or are already working on it and notice it's not going as hoped — I'd like to leave you with three questions:

  • Did we understand the process well enough before we started building?
  • Were the people who have to work with it really involved — or were they only informed?
  • Can we as an organization manage and further develop this system ourselves when the consultant is gone?

If you answer 'no' to any of these questions, that's no cause for alarm. It's a signal to take a step back and strengthen the foundation. That's always better than building on a shaky basis.

Do you recognize any of these pitfalls in your own organization, or do you want to prevent running into them? Schedule a no-obligation introduction meeting.