I tend to "spend" a lot of keystrokes on various aspects of technical debt and/or poor engineering practices. The list seems endless: component coupling, shallow modeling, leaky boundaries, inconsistent conventions, insufficiently expressive design, crippled development agility, unnecessary layers of indirection, overzealous pattern usage, the lava flow effect - to name just a few.

These issues are all serious stuff, and (pretty much) everyone has encountered them to some degree. It's also worth noting that they have quite a few things in common:

  • each of them has known remedies
  • none of them is hopeless - there's always something you can do, even if it may take years to get rid of the issue completely
  • none of them stops (/prevents) development - you can keep building/extending features while fixing stuff (even if it's less efficient)
  • in terms of Cynefin classification (https://www.infoq.com/articles/cynefin-introduction/), these are (at worst) complex problems - ones that can't be decomposed into smaller ones, but that you can probe and assess the outcomes of

Wait, it can be worse

This raises a question - can't things go MORE wrong? Be more f*cked up? Are there ways to screw up systems/applications/platforms in an unrecoverable way? Ones where architects can't get any CLARITY (my new favorite word these days) on how to solve them?

And if so - why? What's the criterion that determines whether there's hope or not? Maybe such a criterion doesn't exist - it's software, after all: what could be more malleable?


OK, here's what I think:

The "recoverable" cases have one common quality - there's always SOMETHING (some foundation) one can rely on. Something stable, reasonable & consistent enough that you can build your improved (/fixed) solution upon.

But that's NOT always the case! I've seen systems tainted in a way that pretty much rules out any reasonable way out. What kind of flaws could have such severe consequences? OK, here we go with some examples:

  1. the system consists of parts (apps, components) that either have no clear responsibilities (even roughly) or whose responsibilities overlap without any pattern or clear logic
  2. nothing owns its internal state - depending on the context, conceptually the same information is kept in different parts of the system (you never know where the "golden copy" is, because it drifts - see the sketch after this list)
  3. vital parts of behavior (incl. domain invariants) are performed outside of the system - by manual intervention, scripting or the "hand of god" (you have no clue how) - the logic is vague & impossible to validate
  4. functionalities are routinely used inconsistently with their design and intended purpose - the system acts as a sort of sophisticated spreadsheet where business logic is notoriously worked around
  5. in the fundamental core of the system, flexible parts (e.g. configurable data whose hierarchy and structure can be adjusted by business users) are tightly coupled with rigid ones (e.g. business logic stiffly dependent on manual setup)
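
To make flaw #2 concrete, here's a minimal Python sketch (module and field names are hypothetical, not taken from any real system) of what "nothing owns its internal state" looks like in code:

```python
# Two modules each keep a private copy of the same conceptual state
# (a customer's address); neither one is the authoritative owner.

class BillingModule:
    def __init__(self) -> None:
        self.addresses: dict[str, str] = {}  # customer_id -> billing's copy

    def update_address(self, customer_id: str, address: str) -> None:
        self.addresses[customer_id] = address


class ShippingModule:
    def __init__(self) -> None:
        self.addresses: dict[str, str] = {}  # a second, independent copy

    def update_address(self, customer_id: str, address: str) -> None:
        self.addresses[customer_id] = address


billing, shipping = BillingModule(), ShippingModule()
billing.update_address("c42", "12 Old Street")   # one flow updates billing only
shipping.update_address("c42", "7 New Avenue")   # another flow updates shipping only

# Which address is the "golden copy"? Neither - the answer depends on
# which code path happened to run last, i.e. the truth drifts.
print(billing.addresses["c42"], "!=", shipping.addresses["c42"])
```

Any attempt to fix this has to declare one copy authoritative - and whichever one you pick, you break every code path that trusted the other. That's exactly the "no foundation to build on" problem.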

These are just a few examples - fortunately, such serious issues don't happen too often - but they should give you the idea: the flaws mentioned above are so deep and fundamental to the system's design that when (as an architect) you try to make a decision that makes the system BETTER, you actually make it WORSE.

Broken paradigm

Why? Because the paradigm - the system of values (the most fundamental "laws of physics") of such an utterly broken system - is very different from the standards & patterns all professionals learn when entering the craft (aka software development's equivalent of the Ten Commandments). Quoting Cynefin (again), this is a chaotic system - cause and effect (even when it comes to design decisions!) can hardly be related, even in hindsight (!).

Got it? It's so f*cked up that your subsequent design decisions should also be f*cked up! But the direction of the f*ck-upness should be consistent with the previous f*ck-upness orientation (things should be broken in a ... congruent way - however crazy that sounds).

Needless to say, this is NOT sustainable mid- and long-term. Unless the creators of the aforementioned system have just invented a new, revolutionary school of design (/set of architectural patterns), you've hit a dead end. Feel free to evaluate the chances of that yourself.


When and where do such systems "happen"? IMHO it's a matter of lacking PROPER engineering foundations - negligence (intentional or due to ignorance) of the collective sum of software craftsmanship experience accumulated & polished by former generations of developers. Programming languages are extremely flexible & even more powerful - their limitations seem barely real: but that doesn't mean anyone can jump straight in & craft quality stuff.

Following this thought even further - call me an elitist jerk, but I've got the impression that the continuously increasing number of neophytes (people without an academic background in any discipline related to software engineering) in development roles will strongly (and negatively) impact the average quality of software produced in the forthcoming years. Up to the point where a lot of poor-quality software will have to be written off (as a sunk cost) not long after it was created.


An open question, then - have you ever experienced a situation where a piece of software you worked on was broken beyond all hope?
