I recently came across an attention-grabbing paper titled “Dance hit song prediction” (Journal of New Music Research, Vol. 43, 2014). In this research, the authors developed a predictive model to estimate the likelihood that a new tune would make it into the charts’ Top 10. For that purpose, they built a vast database of dance hit songs from 1985 to 2013; characteristics of each tune, such as tempo, duration, loudness, energy and danceability, were measured and modeled. Amazingly enough, the prediction accuracy was found to be at least 70%!
Extrapolating this concept to project management, we have all heard – or worse, used – expressions such as “this project was doomed to failure”.
Thus, one wonders whether a robust model could be developed to predict the odds of project success, which would ultimately save the organization human and monetary resources. This is not a trivial task; before diving into complex formulas or models – which would require a comprehensive and thorough analysis, as can be seen from the cited paper – it is indispensable to identify the key success drivers:
Familiarity of performing organization with similar projects in the past
Strength of business case
Alignment of project within the performing organization strategy
Commitment from PM and project team members
Support from management (e.g., steering team, project sponsor)
Clarity in project deliverables
Understanding of the project scope by the customer
Truthfulness and accuracy of project schedule and budget
Availability of resources for the length of project life
Availability of project management plan and/or its subsidiaries (communication, risks, scope, etc.)
Are they all equally significant? If not, what are their relative weights? Could some of the items be discarded under the Pareto 80/20 principle? Are other important variables missing? Massive data mining of past projects – similar to what was done in the hit-song study – is reckoned necessary in order to develop a realistic and accurate predictive model. Developing such a model is beyond the intention of this short blog post, though that would indeed be its ultimate goal.
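To make the idea of weighted success drivers concrete, here is a minimal sketch of how a fitted model might score a project. The driver names, weights and bias below are purely illustrative assumptions; in practice they would have to be estimated from a historical project database, not chosen by hand.

```python
import math

# Hypothetical weights for a few of the success drivers listed above.
# These values are illustrative only; real weights would be fitted
# from data mined out of past projects.
WEIGHTS = {
    "prior_familiarity": 0.9,
    "business_case_strength": 1.2,
    "management_support": 1.1,
    "resource_availability": 0.8,
}
BIAS = -2.0  # baseline log-odds for a project scoring zero on every driver

def success_probability(driver_scores):
    """Map driver scores (each in 0.0-1.0) to a success probability
    through a simple logistic model."""
    z = BIAS + sum(WEIGHTS[k] * driver_scores.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Example: a project strong on every driver vs. a weak one.
strong = {k: 0.9 for k in WEIGHTS}
weak = {k: 0.2 for k in WEIGHTS}
print(success_probability(strong))  # high probability
print(success_probability(weak))   # low probability
```

The logistic form is just one reasonable choice; the point is that once consistent driver scores exist for past projects, fitting and applying such a model is straightforward.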
Eduard, in theory, if an organization established a consistent method for data collection around all projects, an algorithm could run prediction models based on the past. It certainly could be done.... the hardest piece is getting the data – consistently.
Is this something you envision happening once, when the decision is made regarding whether or not to pursue a project, or at planned intervals throughout a project? I'm thinking of rolling wave planning/progressive elaboration/agile, and how predictive analysis might be better for assessing how close a project is to being ready to start, if the analysis is only done once, before the project starts.
Don't get me wrong, I think you've made a good list of items that every project manager should ask at the beginning of, and throughout, a project. I would put this list in my list of questions to ask during the initial risk assessment.
I like the idea of a model for predicting project success, but I think more than one model might be needed. Can you apply the Dance Hit model to another type of music to determine its success? Maybe, but probably not to every type of music, and some would define success differently.
I think it could be worthwhile to start a discussion, in Project Management Central or elsewhere, to identify different types of projects, how success is defined for each, identify the variables that contribute to success of each type of project, and then figure out how to determine a project's type. Who knows, it could be a really small list of project types, but I'm sure it would be a great learning opportunity revealing factors I have not considered.
I'm starting to get excited, thinking about how and where to try and put this together. Anybody else interested?
In order to predict, we need a way to categorize projects by type, with potentially further qualifiers to better refine the results. The categories and qualifiers would be pre-determined applicable values. Continuing with consistent data points, we can extract similar initiatives and run comparisons.
Also, to note the old adage: garbage in, garbage out. There is creating the model, and there is following the model. So, in keeping with this month's theme, the simpler the better. Provide as clear a pathway as possible to elicit the data required to formalize results.
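The categorize-then-compare approach described above can be sketched in a few lines. The category and qualifier vocabularies here are illustrative assumptions; the real pre-determined values would come from the organization's own taxonomy.

```python
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    """One past initiative, tagged with a pre-determined category
    and qualifier, plus its observed outcome."""
    name: str
    category: str   # e.g. "strategic", "compliance", "maintenance"
    qualifier: str  # e.g. "predictive", "adaptive"
    succeeded: bool

def comparable_success_rate(history, category, qualifier):
    """Extract similar past initiatives (same category/qualifier bucket)
    and return their observed success rate, or None if no matches."""
    matches = [p for p in history
               if p.category == category and p.qualifier == qualifier]
    if not matches:
        return None
    return sum(p.succeeded for p in matches) / len(matches)

# Illustrative history; names and outcomes are made up.
history = [
    ProjectRecord("ERP rollout", "strategic", "predictive", False),
    ProjectRecord("CRM upgrade", "strategic", "predictive", True),
    ProjectRecord("GDPR audit", "compliance", "predictive", True),
]
print(comparable_success_rate(history, "strategic", "predictive"))  # 0.5
```

Further qualifiers would simply narrow the bucket; the trade-off is that finer buckets need more historical data to yield meaningful comparisons.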
Thank you for taking the time to read the blog and your valuable contributions.
Indeed, it might happen that different types of projects need customized, individual predictive models, much as might occur with, e.g., pop vs. classical music; some variables would surely be shared, while others would be specific to each music type.
Someone recommended the following book (https://history.nasa.gov/SP-4111.pdf). I have not read it yet, but it looks like a promising read.
Eduard, Andrew, and anybody else interested, what would the next steps be to pursuing this?
- Identify how to categorize projects? It's probably not as simple as Strategic, Compliance, and Maintenance, or Predictive versus Adaptive.
- Identify categories?
- Identify success criteria for projects in each category?
- Identify success drivers?