Looking back at my work over the last few years I realised that a key part of my job has been to make sound decisions in environments where there is no data, and I have developed some skills and techniques in doing that. The primary skill is estimation. The more I looked at my recent past the more I realised that estimating underpins quite a lot of what I do, and it is quite uncommon to find people who are good at it. In fact my working relationships fall quite neatly along the boundary of people who are good at, and comfortable with, estimating, and those who are not.
I also realised that I have a few rules of thumb that I use when I am estimating but I have never written them down, and that’s what a blog is for. I decided to do this now because I was asked recently to justify a piece of analysis I had done for a business with whom I had been working for some while.
The request was perfectly reasonable but it was accompanied by some disbelief, which I found a little disconcerting, especially since I thought my analysis was thorough and robust. It turned out that underneath the request was a presumption that I had access to some magic source of data about their business which was invisible to them: they had tried a similar analysis, failed, and had fallen back on a presumption instead. As is very often the case, that presumption migrated unchallenged from hypothesis to fact to orthodoxy, and nobody could justify it – they just knew I was wrong. I realise now that my analysis roundly challenged the presumption, and with it a lot of decision making. No wonder I got such a cool response.
The truth was that there was no measured data to substantiate my analysis – or theirs – because they hadn’t measured any. Worse still, instead of accepting that their analysis was incomplete, stepping back and measuring something, they had simply relied on a number that someone else had given them. Following our meeting I dug up that number, and it turned out to be a very good estimate communicated in an email some time earlier. The trouble was that it was an estimate of a completely different quantity.
In justifying my analysis I also had to justify my methods and that led me to thinking carefully about these rules of thumb. They apply mostly to strategic thinking, although in reality they are useful for just about any day to day decision-making that has any quantitative component. Here they are for posterity…
You have to estimate. It is almost always the case that your data are incomplete, or getting to a value for some quantity is very difficult or impossible. No amount of stats will help you if you don’t have the data.
You should estimate. Even if you don’t have to, you should still estimate. The value of a good estimate is that it gives you an intuitive framework in which to do any more rigorous analysis and a directionally correct number as a validation of your results. It also sharpens your skills of estimation, which will mean you can do it more quickly.
Get comfortable with inaccuracy. You are estimating, so your answers will only have a certain degree of accuracy, and you may not really know how accurate they are. The important thing is not to obsess about whether you are accurate, but obsess about whether the reasoning by which you came to your estimates is logical and coherent.
Don’t over-state your precision. A common mistake is to quote numbers to an unjustifiable degree of precision. If your inputs are estimates accurate to the nearest thousand, then quoting an answer to two decimal places is misleadingly precise. If you are dividing one estimate by another, round the result and be prepared to use a range. Approximately 15.2M divided by approximately 65k is not 233.85; it is about 235, or between 200 and 250.
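One mechanical way to avoid quoting false precision is to round any derived figure to a couple of significant figures, and to attach a range. A minimal sketch – the `round_to_sig` helper and the ±15% band (borrowed from the rule below about how wrong estimates tend to be) are my own illustrative choices, not anything from the original analysis:

```python
from math import floor, log10

def round_to_sig(x, sig=2):
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0
    return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

users = 15_200_000   # "approximately 15.2M" – an estimate, not a measurement
stores = 65_000      # "approximately 65k" – also an estimate

raw = users / stores          # 233.846... – precision the inputs can't justify
estimate = round_to_sig(raw)  # rounded to 2 significant figures

# Better still, report a band rather than a point value
# (±15% is an assumed, illustrative margin):
low, high = round_to_sig(raw * 0.85), round_to_sig(raw * 1.15)
print(estimate, (low, high))
```

The point is not the particular rounding rule, but that the reported number should carry no more digits than the weakest input deserves.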
A guess is better than zero. For your broader analysis, any value is better than no value, even if you pluck it from the air. The most common mistake people make when doing loose analysis is to leave out something important because they don’t have an accurate number for it, rather than include an arbitrary estimate. I find that most Finance people fall into this category; it is what they are taught to do. Zero is almost never a valid estimate.
An informed guess is better than a wild guess. When you are guessing, try to base your guess on one or more things that you can measure reasonably accurately. Often this is a good way to get at estimates of un-captured data, and reasoning like “we normally have 2 of those for every 5 of these other things” will generally give you a very good estimate. Look for proxies for your quantities that you either have measurements for, or other solid estimates of.
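The “2 for every 5” style of reasoning is just a ratio applied to a proxy you can measure. A tiny sketch, with entirely invented figures:

```python
# Proxy-based estimation: we can't count the quantity we want directly,
# but we CAN count something related, and we know a rough ratio.

measured_proxy = 12_400   # e.g. a count we do capture (invented figure)
ratio = 2 / 5             # "we normally have 2 of those for every 5 of these"

# The un-captured quantity, estimated via the proxy and the ratio.
estimated_quantity = measured_proxy * ratio
print(round(estimated_quantity))
```

The estimate is only as good as the ratio, so the ratio is the thing worth sanity-checking with anyone who knows the business.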
Estimates can be surprisingly accurate. If you do follow up your estimation with rigorous bottom-up analysis – presuming that is even possible – the chance that your estimate was more than 15% out is slim. The cost of the additional analysis is rarely worth it, especially for strategic analysis, which needs to be directionally correct. Don’t shy away from the rigorous analysis, but…
Don’t be defensive. If challenged, remember that your audience probably doesn’t have a better grasp of the data than you, and if they do, then be prepared to include their data, it can only make your estimate better. The same is true for a methodology. If your working assumptions are challenged and are wrong, change them. The flip-side is that if your challenger just doesn’t like your numbers and method, then you are at liberty to push back and ask them to provide something better. It’s much easier to knock a straw-man than build one, and it is rare that a better estimate comes back. If it does, then use it.
Don’t get wedded to an answer. Be honest with your analysis and don’t work your estimates to reach a particular number. The flip side is that if you know the answer in advance – because of some magic – and yet nobody can tell you why the answer is what it is, then getting close to it with your estimate provides a good reasoning for it, and that has immense business value, because a model of a number is much more useful than a raw number. If you can’t get close, then you should question the magic answer, because your well-reasoned estimate doesn’t support it. It is surprising how often the accepted truth is wrong.
Provide a range of answers. If you are really unsure about a number and it falls into the wild-guess category, then a reasonable way of dealing with it is to put an upper and lower bound on it. This is usually much easier than thinking of a single value, and it will give you an upper and lower bound on your subsequent calculations – a very digestible form for your answers when presenting them to strategic decision makers.
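Carrying the bounds through a calculation is mechanical: for a product of positive quantities, the result’s bounds come from the inputs’ bounds. A minimal sketch with invented figures:

```python
# Propagating an upper and lower bound through a simple estimate.
# All figures below are invented for illustration.

# Wild guess, expressed as a range: conversion rate between 1% and 3%.
conv_low, conv_high = 0.01, 0.03

visitors = 200_000   # measured reasonably well
avg_order = 40.0     # a solid estimate

# For positive inputs, multiplying the low bounds gives the low result
# and the high bounds give the high result.
revenue_low = visitors * conv_low * avg_order
revenue_high = visitors * conv_high * avg_order

print(f"revenue between {revenue_low:,.0f} and {revenue_high:,.0f}")
```

A range like “between 80k and 240k” is honest about the wild guess inside it, and is usually all a strategic decision needs.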
These have served me well and I hope they are useful for you too.