The "Done" Delusion: Why Data Modeling Should Be a Program, Not a Project

Practical Data Modeling Oct 21, 2025

Ellie makes data modeling as easy as sketching on a whiteboard—so even business stakeholders can contribute effortlessly. By skipping redraws, rework, and forgotten context, and by keeping all dependencies in sync, teams report saving up to 78% of modeling time.

This article is brought to you by Ellie.ai


I was recently re-listening to a podcast I recorded with my friends Remco Broekmans and Marco Wobben, and Marco brought up a point that completely stopped me in my tracks. It’s one of those ideas that, once you hear it, seems obvious, yet it challenges a core assumption I (and I think many of us) hold about our work.

In most of my experience, data, ML, and AI initiatives are treated as finite projects to be “done”. Of course, we all know that many apps and models aren’t one-off things; they evolve. But even then, we tend to handle that evolution in sprints, with a neat little notion of “done” at the end of each cycle. We build the thing, we ship it, we iterate, but the initiative itself has a beginning and an end.

Marco’s world is different. Where he works in the Netherlands, data and information modeling is treated as continuous work in progress, with no specific notion of “done” or ending. He framed it perfectly: they don’t treat modeling as an initiative or a project. They treat it as a program.

This isn’t just a semantic argument. In his case, Marco’s modeling activities have been going on for almost two decades and will likely continue indefinitely. That’s very different from how many practitioners approach data modeling. But take a step back, and data modeling really is more akin to a program. It isn’t a task you complete. It’s a core, perpetual business function, like finance or operations: not one-and-done, but a living, breathing activity. This simple reframing helped me see data modeling not as a one-time setup, but as a continual activity of sense-making. We all intuitively know this, but it’s easy to forget in the day-to-day grind. The data model is a living reflection of the business, and as the business evolves, the model must evolve with it, not in fits and starts, but fluidly and continuously.

This idea of “program, not project” flies directly in the face of how I often hear data modeling discussed. I, too, am guilty of treating it as a one-time activity. Despite the best intentions, it’s usually seen as something you do at the start of a new project, or something that gets revisited on an annual basis. Or, if we’re being painfully honest, something we only fix when it breaks. And especially in the US, data modeling is often seen as a nice-to-have, if it’s done intentionally at all. Often, it takes the form of throwing data into one big table and calling it a day.

I’m generalizing a bit, but the “project” mindset is inherently reactive, with only a dash of proactivity. The “program” mindset is the opposite: proactive more than reactive, and evolutionary. It accepts that the map will never be the territory, and that our job is to keep the map as useful and accurate as possible, forever.

Of course, it’s not as simple as just deciding to change your perspective. Marco and I quickly realized there are profound geographical and cultural differences at play. He lives in the Netherlands, where tighter labor laws and social structures often foster a more long-term, stable perspective on business functions. It’s much harder to fire people, which may naturally lead to building multi-decade programs around your talent.

In the US, by comparison, the calculus is different. I don’t know many companies that would want to treat data modeling as a continuous, multi-decade program. The perspective is dominated by the immediate. What’s the issue for the quarter? What’s the deliverable for the year? What’s the budget, and, most importantly, what is the ROI?

This conversation left me wondering if our short-term, project-obsessed, ROI-driven culture is fundamentally holding us back. That culture works in some ways, but data is a thinking person’s sport. Are we building brittle systems because we treat data modeling as a task to complete rather than a living program to nurture?

What’s the hidden cost of not treating data as a continuous program? And in an environment that demands immediate results, how can we even begin to build this long-term thinking into our data culture? Especially nowadays, when AI is driving iterations at warp speed, does AI also free up cycles for thinking? I hope so.

I’m still mulling it over, but it feels like a conversation worth having.
