A few days ago I had a short, informal meeting with a few peers - among other things, they asked about the interesting stuff I've been doing recently, including data crunching powered by Apache Hadoop and complex event processing with Apache Storm. I went through it quite quickly, just to give them a glimpse of the possibilities of real-time computing & big data analysis. Storm caught their attention immediately, but they (well, some of them) shocked me with the following opinion on Hadoop:
The Hadoop stuff - I don't see much value in that, as it analyses the past, so the info you get is related to something that has already happened. What it may cause is just finger-pointing & a blame contest: people won't like that here.
What's wrong with analysing the past? Isn't that exactly how you get feedback on your actions? So shouldn't you focus on:
- shortening the feedback period?
- making sure the feedback is measurable & quantifiable?
For a business owner, a short feedback loop is a pure win - instead of developing a full-blown, pricey, risky (even if visionary) solution that would take several months, you go for an MVP (minimum viable product) & check how it's adopted within a few weeks at most:
- do people use it?
- do people give it an actual try or just read up about it?
- if they use it, what are the usage patterns?
- if they use it, where do they come from (what makes them use it)?
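Those adoption questions are straightforward to answer once you log usage events. A minimal sketch - the event records, action names and referrer values below are made up for illustration, not taken from any real system:

```python
from collections import Counter

# Hypothetical MVP usage events: (user_id, action, referrer)
events = [
    ("u1", "read_docs", "newsletter"),
    ("u1", "try_feature", "newsletter"),
    ("u2", "read_docs", "search"),
    ("u3", "try_feature", "blog"),
    ("u3", "try_feature", "blog"),
]

# Do people actually use it, or just read up about it?
users_trying = {u for u, action, _ in events if action == "try_feature"}
users_reading_only = {u for u, _, _ in events} - users_trying

# Where do the active users come from (what makes them use it)?
referrers = Counter(ref for _, action, ref in events if action == "try_feature")

print(sorted(users_trying))        # users who tried the feature
print(sorted(users_reading_only))  # users who only read about it
print(referrers.most_common())     # referrers driving adoption
```

The point is not the code itself but that each question above maps to a few lines of counting over data you already have - provided you bothered to collect it.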
Being afraid of feedback
Some people may be afraid of such feedback - it may reveal that their idea has failed somehow: the assumed targets have not been met, or have been met only partially. And that could escalate into an internal rat race. Isn't it better to keep things obfuscated - people just do as well as they can & things roll on?
Seriously, I didn't make this up - this is exactly what I've heard. And yes, I know it sounds ridiculous, so let's smash it to get rid of the sour sensation of ineptness :)
Let's assume this particular person gets what (s)he wants:
- the actions are not accountable
- there's no way to find out what the actual result was -> impact on KPIs, etc.
- the true effect of the change is unknown
An organisation that allows something like that:
- doesn't understand (or rejects) the idea of continuous improvement
- doesn't promote the proper mindset (effective vs merely efficient) & doesn't motivate people to do good & valuable work - it just motivates them to run around, put up a lot of window dressing & appear busy
- fails to set sensible priorities / goals for its people - and if these are not clear & coherent, the organisation may advance only by accident, or because the market forces it to
- is not internally transparent
- doesn't realize that it's fine to make mistakes, as long as you realize them quickly and introduce the corrective actions that fit what has happened
Back to PRE-
Let's get back to the analysis that aids the business decisions being made right now - or even provides the input that triggers a business decision which wouldn't happen otherwise - the idea that was so much more alluring to my interlocutors ...
Seriously, how often do you make a decision based ONLY on present data (no long-running context, state or history at all)? Yes, it may happen, but these are the exceptions that prove the rule. What you usually do is infer from past behaviour patterns - this is what the whole of risk analysis is based upon (scoring models, machine learning, etc.).
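"Inferring from past behaviour patterns" can be as simple as a frequency-based score. A toy sketch - the segments and outcomes are invented, and this is a stand-in for a real scoring model, not one:

```python
# Toy credit-style scoring: estimate risk for a new case from
# past outcomes of similar cases (made-up data for illustration).
history = [
    # (segment, defaulted?)
    ("young_no_history", True),
    ("young_no_history", False),
    ("young_no_history", True),
    ("long_good_history", False),
    ("long_good_history", False),
]

def default_rate(segment):
    """Fraction of past cases in this segment that went bad."""
    outcomes = [bad for seg, bad in history if seg == segment]
    return sum(outcomes) / len(outcomes) if outcomes else None

# A decision "based only on present data" would throw all of this away;
# in practice, the past is exactly what the score is built from.
print(default_rate("young_no_history"))   # 2 of 3 defaulted
print(default_rate("long_good_history"))  # 0 of 2 defaulted
```

Real scoring models are obviously far more sophisticated, but the dependency is the same: no history, no score.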
And even if by some chance you're fine with using just the present data with no context (by introducing some kind of algorithmic model), do you really want to assume that your work was perfect and:
- there's no need to measure it to find out how it actually went?
- there's no need to find out, whether it requires adjustments in particular areas?
Measure, measure, measure, ...
It's the tangible stuff that matters, regardless of how hard the measuring is (check this book). When I hear about:
- "cleaner architecture"
- "simplier solution"
- "more modern product"
... I'm getting sick. These words mean freakin' nothing! What you should do (after the change) is:
- prove lower maintenance cost
- make sure there are fewer artifacts to be released (& maintained, & re-tested, ...) - so the risk of introducing a bug is minimised
- show that there are fewer defects & problems (so the quality has improved)
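All three claims above are measurable. A minimal before/after sketch - the metric names and numbers are invented placeholders, the point being that "cleaner" or "simpler" should translate into something like this:

```python
# Invented before/after numbers - each vague claim ("cleaner",
# "simpler") should reduce to a concrete metric like these.
before = {"maintenance_hours_per_month": 120, "released_artifacts": 14, "open_defects": 37}
after  = {"maintenance_hours_per_month": 80,  "released_artifacts": 9,  "open_defects": 21}

def improvement(metric):
    """Relative reduction: positive means the change actually helped."""
    return (before[metric] - after[metric]) / before[metric]

for metric in before:
    print(f"{metric}: {improvement(metric):.0%} reduction")
```

If you can't produce numbers of this sort after the change, you haven't proven anything - you've only asserted it.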
If someone avoids being transparent, there are three options:
- ... either (s)he's trying to weasel out of it -> which is plain mischief
- ... or (s)he doesn't know how to measure their stuff -> which means (s)he's not credible enough
- ... or (s)he's just too lazy to do that -> and that means (s)he's redundant