Are project forecasters “fools or liars”?

From The Money Files Blog
A blog that looks at all aspects of project and program finances from budgets and accounting to getting a pay rise and managing contracts.


Categories: forecasting, research

“The majority of forecasters are fools or liars,” says Professor Bent Flyvbjerg from the BT Centre for Major Programme Management at the Saïd Business School, University of Oxford, in a new paper on inaccurate estimates for major projects.

The paper, published in the International Journal of Project Management, sees Professor Flyvbjerg criticising the way that forecasts for projects are put together. He says they are inaccurate and provide poor material from which to make decisions about cost and benefits.

“Estimates are commonly poor predictors of the actual value and viability of projects, and cannot be trusted as the basis for informed decision-making,” he says. “These forecasts frequently misinform decision makers on projects instead of informing them. Some of the inaccuracy comes from genuine forecasting mistakes arising from over-optimism, but some estimates are deliberately misleading, designed to secure financial or political support for a project.”

You probably know of examples where a project manager has padded estimates for one reason or another, but Prof. Flyvbjerg is pretty scathing about forecasting methods and the people who use them.

“Many forecasts are garbage and can be shown to be worse than garbage,” he is quoted as saying in a press release from the university. “These reports give the client, investors and others the impression that they are being informed about future demand, or the costs involved in a major project, when they are being misinformed. Instead of reducing risk, reports like this increase risk by systematically misleading decision-makers and investors about the real risks involved.”

What’s the answer?

Prof. Flyvbjerg says that the answer is for everyone to be a bit better at not putting up with this (I paraphrase). For example, he recommends that clients should ask for their money back when they receive reports which later prove to be significantly inaccurate and misleading. He even goes as far as saying that they could demand compensation (some contracts must have a clause for this anyway). His most radical idea is that in some cases criminal action would be justified. “Merely firing the forecaster may be letting them off too easily,” he says. “Some forecasts are so grossly misrepresented and have such dire consequences that we need to consider suing them for the losses incurred as a result. In a few cases where forecasters foreseeably produce deceptive forecasts, criminal penalties may be warranted.”

Personally, I can’t see many project managers ending up in court because of poor scheduling, but as this has come from the Centre for Major Programme Management, Prof. Flyvbjerg is really talking about complex, mega projects.

When we say ‘everyone’, Prof. Flyvbjerg includes the professional bodies in that too. He calls on them to use their codes of ethics to penalise and possibly exclude members who produce unethical forecasts. “This needs to be debated openly within the relevant professional organisations,” he says. “Malpractice in project management should be taken as seriously as malpractice in other professions like medicine and law.” How many project managers genuinely produce unethical forecasts, and how many are just incompetent? I think it would be hard to decide whether someone was acting in good faith and to the best of their abilities or deliberately altering estimates for political gain.

A better way of forecasting

As you would hope from someone who is so outspoken about this, Prof. Flyvbjerg has an answer: his own work. In this IJPM paper he sets out the case for quality control and due diligence to be applied to the evaluation of front-end forecasts. Unfortunately, I think his answer only works for massive projects and not for the type of forecasting and estimating most project managers do on their projects.

“Recent research has developed the concepts, tools and guidance on incentives that could help curb both delusional and deceptive forecasts,” he says. “Whether forecasters are unwittingly or deliberately under-estimating the costs, completion times, and risks of projects, and over-estimating their benefits, we need to have a systematic basis for evaluating their findings in order to make informed investment decisions. Given the high cost of major infrastructure projects, the irreversibility of decisions, and the limited availability of resources, this is clearly critical for both public sector and private sector projects. Significantly more accurate forecasts can be produced by looking at the evidence available from previous similar projects which have been already completed – what I call, taking an ‘outside view’. This seems so simple, but in practice it is transformative and leads to much more accurate forecasting.”

In other words, take large data sets or statistically relevant data for projects in your sector, apply due diligence, estimate on the basis of past experience and critically evaluate the forecasts. You've spotted it: the big downside to this estimating approach is that you need large, validated data sets to draw benchmark data from previous, relevant projects. If your PMO has been up and running for years and has gathered all this, and you never do any projects which innovate or deliver something new in a way you haven't done before, then you could make use of this technique.
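To make the ‘outside view’ concrete, here is a minimal sketch of the idea in Python. The function name and every figure are invented for illustration; they are not from Prof. Flyvbjerg’s paper. The approach is simply: collect the actual-versus-estimated cost ratios from completed, comparable projects, then uplift your single-point estimate so it covers a chosen share of those historical outcomes.

```python
def outside_view_estimate(inside_estimate, past_overrun_ratios, percentile=0.8):
    """Uplift a single-point estimate so it covers `percentile` of
    historical outcomes from similar completed projects.

    past_overrun_ratios: actual cost / estimated cost for each past project.
    """
    ratios = sorted(past_overrun_ratios)
    # pick the uplift factor at the chosen percentile of the sorted history
    idx = min(len(ratios) - 1, int(percentile * len(ratios)))
    return round(inside_estimate * ratios[idx], 2)

# Invented benchmark data: actual/estimated cost for ten similar projects
history = [0.95, 1.0, 1.05, 1.1, 1.15, 1.2, 1.3, 1.4, 1.6, 2.1]
print(outside_view_estimate(100_000, history))  # -> 160000.0 (1.6x uplift)
```

The arithmetic is trivial; the hard part, as noted above, is assembling a validated history of comparable projects to feed it.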

If you don’t have all that data to hand, then this method of forecasting will not scale from mega projects and programmes to the humble projects that you and I work on. While using historical data is great and we should all look to the past to better predict the future, we would be wrong to expect this model to work for all projects.

Posted on: December 19, 2012 04:45 PM | Permalink



Actually, I think most project managers are not involved with forecasting, and if they are, their budget forecasts are usually discredited by an executive sponsor or other major stakeholders, who either whittle them down to unrealistic levels or pad them to fund something on the side or, worse, to line their own pockets. This also holds for consultants or consulting companies who have similar ulterior motives. I think if the project manager and team involved with these projects were allowed the time, resources and credibility to provide as accurate a budget and schedule forecast as necessary, you'd see a lot less of the kinds of issues Prof. Flyvbjerg highlights (or rants about, depending on your perspective).

I'm also suspicious of his advocacy of sophisticated statistical models and data mining methods favored by academics as a solution to such problems. If we use the Finance industry as an example, they employed highly intelligent "quants" (in many instances actual proverbial "rocket scientists") to build very sophisticated mathematical models on huge archives of stock data to forecast whether mortgage-backed securities would be profitable in the mid-2000s, and we all know how that turned out!

It's both the quantitative and the qualitative that have to be factored in, as well as a control and check on the human and environmental elements. My 2 cents.


I'm also sceptical of this approach of using data modelling as the answer to better forecasting on projects. I don't think most industries have this data, and while it may exist for large construction projects, the type of projects we are doing is changing. The way project work is evolving means that many more projects are 'one-offs', which means that any data gathered is not very relevant to other projects, nor would data gathered in the past be of much use for estimating. This is the knowledge economy, where businesses are required to innovate. You don't innovate by doing what you have done lots of times before!

This is not something that can be attributed to the PM profession alone. All forecasts carry the risks mentioned in the paper.

Statistical analysis along with validation by the stakeholders may minimize the forecasts going awry but does not eliminate the problem. Real Mega projects can validate through simulation too before accepting the forecasted data. Does anyone have experience doing this? It would be interesting to see how that worked out.
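The simulation the comment mentions is often done as a Monte Carlo run over task-level estimates. As a minimal sketch (every work package, figure and variable name below is invented for illustration, not taken from any real project), you can sample each work package's cost from a three-point distribution and see where the single-point forecast falls among the simulated outcomes:

```python
import random

random.seed(42)  # fixed seed so runs are reproducible

# (optimistic, most likely, pessimistic) cost per work package, in £k -- invented
work_packages = [(40, 50, 80), (20, 25, 45), (60, 70, 120)]
point_forecast = 145  # the single-point estimate being validated (sum of most-likelys)

runs = 10_000
totals = sorted(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in work_packages)
    for _ in range(runs)
)

# share of simulated outcomes at or below the point forecast
confidence = sum(t <= point_forecast for t in totals) / runs
p80 = totals[int(0.8 * runs)]  # budget needed for roughly 80% confidence
print(f"forecast covers {confidence:.0%} of outcomes; P80 budget = {p80:.0f}")
```

Because the pessimistic tails are long, the sum of most-likely values typically covers only a small share of simulated outcomes, which is exactly the kind of check that could catch an over-optimistic forecast before it is accepted.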


I agree that it would be interesting to see examples of what has happened on real projects, or of using simulation, so if anyone has stories to share, please do!

