TL;DR "It's the domain modelling, stupid!" - Bill Clinton could have said if he were into software engineering. We tend to focus on technical design excellence (patterns, inversion of control, reactivity) without giving much thought to proper, maintainable domain model behind. Consequences are barely visible initially, but their impact grows exponentially when application scales up (not in terms of traffic but accumulated functionality) - this is what impedes industry behemoths' cycle time far more severely than dated technology or waterfallish processes.

I think I've seen some crap in my life. Hardcore legacy. Sheer bad engineering. Technical debt that got out of control. ERD-driven madness. Multi-layered CRUD super-behemoths with cascades of modals. Super-exotic solutions that became technically obsolete but refused to die. Magnet-meta-applications that have aggregated all kinds of functionality within the enterprise.

All that sort of thing.

Applications that did not perform fast enough.
Applications that did not scale beyond a few devs.
Applications that failed to absorb non-trivial business logic.
Applications that suffered from excessive boilerplate code.

I've seen ...

  • ... super-crude applications built with hammers only (the simplest means possible, not taking advantage of tech capabilities)
  • ... business logic implemented in data structures, smeared across user interfaces and buried deep beneath several mysterious abstraction layers
  • ... systems whose development was almost incapacitated by Conway's Law
  • ... systems coupled beyond recognition, w/o the slightest trace of a single automated test
  • ... applications built with no conventions / ever-changing conventions / too many conventions & conventions straight out of hell

And frankly, all these technical issues (as you've most likely noticed, I've completely skipped "soft" issues like requirement misunderstanding, not adding value, etc.) have severe & grim consequences, especially when they overlap. But there's one "special" kind of sin that IMHO has caused the most havoc over time - and, what's more, in a not-so-obvious way:

Shitty domain modelling

Naive. Shallow. Unstructured. Neglected. Omitted.

Bad design has terrible consequences, but many seem not to notice. Inconsiderate "user story"-driven development (without a deeper vision & a true understanding of the product) adds tiny "fibers" directly on top of the existing code, without any attempt to incorporate the new demand cohesively into the current functional model. In many cases the model doesn't even exist (a database is not a model!). Even if it initially did, it degenerates quickly (if not taken care of properly).
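
To make the "database is not a model" point tangible, here's a minimal Kotlin sketch (a made-up ordering example - all names are illustrative, nothing from a real codebase). The first shape merely mirrors a table & can be bent into any inconsistent state; the second one carries its business rules with it:

```kotlin
// Anemic "model": a bag of mutable columns mirrored from a database table.
// Every business rule has to live somewhere else (UI, services, SQL...).
data class OrderRow(
    var status: String,
    var totalCents: Long,
    var discountCents: Long
)

// A (still tiny) domain model: the invariants travel with the data.
class Order private constructor(
    private var status: Status,
    private var totalCents: Long
) {
    enum class Status { OPEN, CONFIRMED, CANCELLED }

    companion object {
        fun open(totalCents: Long): Order {
            require(totalCents > 0) { "An order must have a positive total" }
            return Order(Status.OPEN, totalCents)
        }
    }

    fun applyDiscount(discountCents: Long) {
        check(status == Status.OPEN) { "Only open orders can be discounted" }
        require(discountCents in 0..totalCents) { "Discount can't exceed the total" }
        totalCents -= discountCents
    }

    fun confirm() {
        check(status == Status.OPEN) { "Only open orders can be confirmed" }
        status = Status.CONFIRMED
    }
}
```

With the second shape, an invalid transition fails loudly at the boundary instead of silently corrupting a row.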

It's really funny if you think for a moment about how much time devs spend discussing technical details like adapters, containers, engines, facades ... But even the most polished architectural vision cannot compensate for insufficient / poor model design. Without proper modelling, there are no boundaries. Without boundaries, there can be no (domain) scaling. No testability. No state mutation control. No realistic regression foresight.
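
To ground the "boundaries" claim, here's a hypothetical Kotlin sketch (names are mine, not from any real system) of two contexts that each keep their own model and only meet at one explicit translation point:

```kotlin
// Sales context: its own notion of a customer, free to evolve.
data class SalesCustomer(val id: String, val name: String, val segment: String)

// Billing context: a smaller model, built for a different purpose.
data class BillingAccount(val accountId: String, val legalName: String)

// The boundary: one explicit, testable translation instead of shared
// mutable entities leaking across the whole codebase.
fun toBillingAccount(customer: SalesCustomer): BillingAccount =
    BillingAccount(accountId = customer.id, legalName = customer.name)
```

As long as the translation happens only here, a change to SalesCustomer has a single, foreseeable blast radius - that's the regression foresight the paragraph above is about.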

In simple words: things will stop flying. Not immediately, but eventually they will.

Slice'n'dice?

The usual answer (once the problem's consequences become visible, perceivable and correctly attributed) is rather naive - approach the problem from the code's perspective by "applying proper development practices" (SRP, function/class size limits, etc.) retroactively. But if you go for something like that, try to imagine the end result: what is the practical difference between a homogeneous, tangled mess & an equally (or even more) tangled mess of micro-debris with unclear purpose, mixed responsibilities & tons of micro-dependencies across artificial boundaries? It's the faulty model behind the code that acts like an anchor! Whatever you want to do (add/fix/improve) will require constant mapping between the desired & the existing model.
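
Here's a hypothetical illustration of that "micro-debris" (made-up names, sketched in Kotlin): the god method got dutifully split into small, SRP-compliant pieces, yet every piece still reads & writes the same shared state, so nothing actually got decoupled:

```kotlin
// The shared mutable state survives the "refactoring" untouched.
object CheckoutState {
    var items: MutableList<String> = mutableListOf()
    var totalCents: Long = 0
    var approved: Boolean = false
}

// Three "single-responsibility" fragments... all silently coupled
// through CheckoutState. SRP on paper, the very same tangle in practice.
class TotalCalculator {
    fun run() { CheckoutState.totalCents = CheckoutState.items.size * 100L }
}
class ApprovalChecker {
    fun run() { CheckoutState.approved = CheckoutState.totalCents < 10_000 }
}
class CheckoutFinalizer {
    fun run() { if (CheckoutState.approved) CheckoutState.items.clear() }
}
```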

Unfortunately, the simplest means won't work beyond some level of complexity. I'm not even speaking about daring a holistic model decomposition (into modules / areas / contexts, etc.), but about simple "peeling the onion" (extracting & re-modelling functional slices bit by bit) - it's just not feasible if everything depends on everything w/o any control.
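
For the record, "peeling the onion" can look roughly like the sketch below (hypothetical & illustrative - names and numbers are mine): carve one slice out behind an intentionally modelled interface, let a thin adapter delegate to the legacy tangle, then swap the internals once all callers have migrated. The precondition is exactly the one above - the slice's dependencies must be enumerable & cuttable:

```kotlin
// Stand-in for the existing tangle (assumed to already exist somewhere).
class LegacyGodObject {
    fun calc(productId: String, quantity: Int): Long = 100L * quantity // placeholder
}

// Step 1: an intentional interface for the slice being peeled off.
interface PricingPolicy {
    fun priceFor(productId: String, quantity: Int): Long // price in cents
}

// Step 2: a thin adapter delegating to the legacy tangle for now,
// so callers can migrate to the interface one by one.
class LegacyPricingAdapter(private val legacy: LegacyGodObject) : PricingPolicy {
    override fun priceFor(productId: String, quantity: Int): Long =
        legacy.calc(productId, quantity)
}

// Step 3 (later): a properly modelled implementation replaces the adapter
// without touching any of the already-migrated callers.
class TieredPricing(private val unitPriceByMinQty: Map<Int, Long>) : PricingPolicy {
    override fun priceFor(productId: String, quantity: Int): Long {
        val unitPrice = unitPriceByMinQty.entries
            .filter { quantity >= it.key }
            .maxByOrNull { it.key }?.value
            ?: error("No pricing tier covers quantity $quantity")
        return unitPrice * quantity
    }
}
```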

So how is one supposed to fix a large, complex system that was built without a trace of mindful modelling? There are a few options that come to mind, but that looks like a topic for a totally separate blog post (or even a series of them) ...

P.S. Yes, I occasionally see some good code engineering as well ;D

Pic: © attitude1 - Fotolia.com
