Why best practices in IT are not necessarily what we think they are

May 18, 2016

Lately, I have been watching a commonly accepted approach to business analysis collide with a somewhat unorthodox business analysis methodology, and that got me thinking about what really makes a best practice.

Many of us start in the information technology industry through some sort of “science” education. For me, it was applied mathematics. For many, it was computer science. For some, it might be engineering or another technical education. Either way, it seems that, in order to work in IT, one has to possess some of the personal traits usually associated with logical reasoning.

However, a lot of IT is actually illogical. It is illogical in the sense that, quite often, it is impossible to trace the whole sequence of decisions that leads to a particular outcome.

It is easy in mathematics, for example. You start with something established, you transform that knowledge using well-known logical rules, and you get a result that anyone else can validate just by applying the same rules to the original statement.

Computer science follows a similar approach. For example, you can take two algorithms, calculate their complexity, and say upfront which one will work better in a particular situation.
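As a toy illustration of that kind of upfront reasoning (my own example, not part of the original argument): complexity analysis tells us, before running anything, that binary search on sorted data beats linear search, because O(log n) grows far more slowly than O(n).

```python
from bisect import bisect_left

def linear_search(items, target):
    # O(n): in the worst case, every element is inspected.
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(items, target):
    # O(log n): repeatedly halves a *sorted* list via bisect.
    i = bisect_left(items, target)
    if i < len(items) and items[i] == target:
        return i
    return -1

data = list(range(1_000_000))  # sorted input
# Both find the same index, but the analysis predicts, without any
# benchmarking, that binary search needs about 20 comparisons here
# while linear search may need up to a million.
assert linear_search(data, 765_432) == binary_search(data, 765_432) == 765_432
```

That kind of a priori, provable comparison is exactly what the rest of this post argues “best practices” lack.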

Information technology, on the other hand, is not just about science. Come to think of it, how often do I really have to recall the idea of algorithm complexity in my daily consulting/development work? If it happens once a year, that’s already good going. Instead, a lot of what we information technology people do is based on so-called best practices. But best practices, by definition, are based on something that cannot be easily validated. What is a best practice? It is, basically, an agreed-upon description of the best approach to implementing a process.

Do best practices follow the same logical reasoning patterns as science, though? The answer is yes and no. Once you have established that certain specific best practices exist, you can mix and match them to create other practices, and that process would likely follow common logical rules. However, you have to establish those basic best practices first, and that is exactly where information technology has become somewhat sloppy, having assumed that best practices are universal.

As a result, we have started accumulating an immense amount of knowledge in various areas, all combined under the umbrella of best practices: we have the Business Analysis Body of Knowledge (BABOK), we have the Project Management Body of Knowledge (PMBOK), we have various methodologies such as TOGAF, PRINCE2, Six Sigma, Sure Step… and the list can go on for quite some time. Yet it is almost impossible to question any of that knowledge because, essentially, there is nothing specific to question. Or, more exactly, we can question anything there, but how do we validate the answers?

The problem is, to say that something is a “best practice”, we need to be able to compare that practice with other practices, yet we can only judge a practice by the outcomes of implementing it. Let’s compare this information technology scenario to Formula 1 racing. If Michael Schumacher were able to tell you, in great detail, what he used to do to win those races, would it turn you into an F1 champion? The answer is obvious: you might not even be able to finish one lap. But this analogy neatly demonstrates a few points:

– A lot depends on the person implementing the practice

– What works for one person might not work for another

– When describing best practices, it is easy to miss some of the critical details

In other words, as much as best practices are meant to provide guidelines, they have to be treated with some level of scepticism. That internal contradiction is likely the reason why we see some adoption of best practices, but their use has never become mandatory. Any “best practice” should always come with a disclaimer: try it first to see whether it fits you, and then use it at your own risk. That might put everything in the correct perspective, where we treat best practices as tools we can use, while it remains clearly up to us to choose the tools that are fit for purpose.
