Part I can be found here.
TL;DR - maybe thinking in "processes" is a relic of the traditional, policy-driven, hierarchical enterprise? A decentralised model, based on business events & reactions, tends to have some advantages & seems to match modern architecture principles better. WHAT IF we ditched BPM completely?
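To make the contrast concrete, here's a minimal sketch (in-memory pub/sub, all names are mine, not from any real framework): instead of a central process engine driving each step, every component simply reacts to the business events it subscribed to.

```python
from collections import defaultdict

# Minimal in-memory event bus - a stand-in for Kafka, RabbitMQ, etc.
subscribers = defaultdict(list)

def subscribe(event_type, handler):
    subscribers[event_type].append(handler)

def publish(event_type, payload):
    # No central orchestrator: whoever subscribed, reacts.
    for handler in subscribers[event_type]:
        handler(payload)

# Hypothetical order flow: each reaction is local, no process definition.
shipped = []
subscribe("OrderPlaced", lambda order: publish("PaymentRequested", order))
subscribe("PaymentRequested", lambda order: shipped.append(order["id"]))

publish("OrderPlaced", {"id": 42})
```

The "process" still happens, but it's emergent - encoded in each component's local reactions rather than in one central BPM diagram.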
Is there a better way?
There are at least a few scenarios that seem to fit modern
Disclaimer: the idea of immutable code ain't mine - I've read/heard about it somewhere (can't recall precisely ;/) some time ago & it stuck with me.
Code maintenance is already a huge problem & it won't get any better by itself. Software is everywhere - even mundane, basic everyday tools get "digital" & ... flawed. Two or three years ago I was encountering bugs (in software
I strongly believe in emerging design & evolutionary architecture. I've seen too much in my life ;> to believe in up-front design & beautiful pictures turning miraculously into flawless, working systems. I'll spare you the detailed reasoning, as I've written about that on several occasions already. Let's consider the emerging design instead, because I've seen many people struggling with the idea, which seems
There's nothing permanent in software engineering. Consider building services for Internet applications - for the last few years, service design principles have been quite straightforward:
- stateless >>> stateful
- sticky sessions (& session affinity) are made of suck
- if you really, really need a session, at least make it immutable
- build a dedicated service layer, but make it as lightweight as possible & keep the state in separate
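The principles above can be sketched in a few lines (hypothetical names; in production the store would be Redis, DynamoDB, etc. - here a plain dict stands in for it). The handler keeps no state between calls, and the session is treated as immutable: a new snapshot replaces the old one instead of being patched in place.

```python
import copy

# Hypothetical external state store - stand-in for Redis, DynamoDB, etc.
STATE_STORE = {}

def load_session(session_id):
    # Return a copy so the handler can't mutate the stored version.
    return copy.deepcopy(STATE_STORE.get(session_id, {}))

def save_session(session_id, session):
    # "Immutable" session: store a fresh snapshot, never patch in place.
    STATE_STORE[session_id] = copy.deepcopy(session)

def handle_request(session_id, item):
    """Stateless handler: all state flows in and out explicitly."""
    session = load_session(session_id)
    cart = session.get("cart", []) + [item]   # build a new list
    save_session(session_id, {**session, "cart": cart})
    return {"cart_size": len(cart)}
```

Because nothing lives in process memory between calls, any instance behind the load balancer can serve the next request - no sticky sessions needed.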
A few weeks ago I noted down an interesting notion that someone mentioned during one of the discussions I participated in:
The overall idea is simple; you may recognize it from hedge funds or hedging bets. Hedging a technology means expanding your area of expertise in a particular field by learning another technology that basically applies to (/enables) the same scenarios as your
IMHO tech skills are 10% talent + 20% engineering common sense + 20% theoretical knowledge + 50% practical experience. If I've underestimated any of these, it's most likely (still) the practical experience.
Needless to say - it's not ANY practical experience. Going through tutorials on every possible topic is still something, but to actually learn & improve, you have to challenge yourself:
I've encountered a very similar situation in so many companies: their IT landscape kept growing (while nothing was ever decommissioned), the layers they initially had became more complicated or even split into sub-layers: sometimes parallel, sometimes stacked. Obviously, different components frequently utilize different technologies, so the skill-set required to develop & maintain all that has grown significantly broader.
Consequences are numerous:
- architecture is complex &
A few days ago I ran into this article on InfoQ:
I found it amusing enough to dive deeper here (link taken from the article):
I won't keep you guessing any longer - my amusement isn't about the overall idea of IODA or this horizontal to vertical
The concept of Microservices, even if a bit ephemeral, gets a lot of love nowadays. The reasons are quite straightforward, so I'm not going through that once again - what bothers me is some kind of misconception / overinterpretation I've already encountered a few times.
- YES, encapsulate service business logic for a coherent & compact slice of a (sub-)domain
- YES, encapsulate the persistent data that business logic
The struggle against coupling doesn't just seem eternal. It IS eternal. The multi-dimensionality (compile-time, deploy-time, run-time, ...) of the problem, the ease of breaking a state that was achieved with great effort, the complexity of validation / monitoring (try to quantify coupling in a way that makes sense ;>) - all of these don't make things easier. Especially when you realize that sometimes you've got snakes in places you've
Have you heard about Lambda Architecture (LA)? It's an interesting concept introduced by Nathan Marz ("the father" of Apache Storm) and it's basically about processing massive quantities of data using two parallel streams:
- batch stream - this one is able to do pretty much anything with an unlimited quantity of data, but it has its latency; stored data has its inertia & crunching (even distributed)
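The excerpt cuts off before the second stream, but in Lambda Architecture that is the speed (real-time) layer, and queries merge the views produced by both. A toy sketch of that merge (a hypothetical event-count example; the names and the word-count domain are mine, not Marz's):

```python
# Toy Lambda Architecture: the batch layer recomputes its view from the
# full (immutable) master dataset; the speed layer covers only the recent
# events the last batch run hasn't absorbed yet; queries merge both views.
from collections import Counter

master_dataset = ["click", "view", "click"]   # already covered by batch
recent_events  = ["view", "click"]            # not yet in the batch view

def batch_view(events):
    # High latency: full recomputation over all historical data.
    return Counter(events)

def realtime_view(events):
    # Low latency: incremental, covers only what batch hasn't seen yet.
    return Counter(events)

def query(key):
    # Serving layer: batch result + speed-layer delta.
    return batch_view(master_dataset)[key] + realtime_view(recent_events)[key]
```

The trade-off is visible even in the toy: the batch view is simple and authoritative but stale, while the speed view is cheap to update but only ever a small delta on top of it.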
I hate starting a blog post without a clear statement or a question to be answered. But sometimes it's just the best way - hopefully things will clear up for you as you read.
No more "free lunch"
Technology has been advancing like crazy for as long as I can remember. Until quite recently, I was buying a new computer every 2-2.5 years. CPU frequencies, RAM, disk