By Lynda Bourne
Any output from a planning process is an embodiment of the planners’ fundamental principles and philosophies. They apply these principles, or approaches, to develop their plans. And different people will develop different plans to achieve the same objectives.
As early as the 1950s, James Kelley, one of the developers of the critical path method (CPM), reflected on this theme. He noted that when a class of 20-plus people learning the new CPM approach to scheduling worked on a set 16-activity class exercise, the result would be nine or 10 different schedules. Clearly, different people use different approaches and assumptions.
What Shapes Approaches
The planner’s approaches may be explicitly stated, or they may be implicit and affected by:
The conundrum facing organizations is deciding the best approach to develop a plan—one that’s accomplished in the most efficient way within a given set of circumstances, in a given cultural environment, that results in the best outcomes. There is no one right answer to this question, or one way of knowing if the chosen options have delivered the desired result. Each project is unique, making tests and comparisons impossible.
Some of the approaches that can be used in combination, or isolation, include:
This diagram pairs opposite approaches; it’s up to you to determine where on each continuum is best for you in the current situation.
Applying the Approaches
The challenge is understanding the choices open to you and then making informed decisions about where on each of the dimensions is best for you in the current circumstances. Making overt choices rather than just doing the normal thing will generally lead to better planning outcomes.
For example, an agile project will require a planning approach that leans toward the non-rational, incremental, contingent, emergent, improvisational, utopian, pluralistic, democratic and continuous ends of these continuums. A traditional “hard dollar” engineering contract, on the other hand, tends to require the opposite.
My recommendation is that you think through these options. Doing so offers an opportunity to improve your planning practice: one approach will not suit every project, and simply doing the same as last time will inevitably lead to a suboptimal outcome.
How do you think about your approach to planning?
By Lynda Bourne
As you may know, any monitoring and control process has three components. The first is establishing a baseline that you plan to achieve, the second is comparing actual progress to the plan to see if there are any differences, and the third is taking corrective or preventative action. Corrective actions fix existing problems, while preventative actions stop problems from occurring in the future.
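The three components can be sketched as a tiny control loop. This is a conceptual illustration only; the function name, its parameters and the tolerance value are invented for the example, not part of any standard control system.

```python
def control_cycle(planned, actual, tolerance):
    """One pass of the monitor-and-control loop.

    planned   -- the baseline value (component 1)
    actual    -- the measured progress (component 2 compares the two)
    tolerance -- the allowed variance before action is considered
    """
    variance = actual - planned

    # Component 3: act only when the variance is outside tolerance.
    # Whether the action is corrective (fixing an existing problem) or
    # preventative (stopping a future one) is a management judgment
    # that the numbers alone cannot make.
    needs_action = abs(variance) > tolerance
    return variance, needs_action
```

For instance, a task planned at 100 units of work with only 90 complete and a tolerance of 5 would return a variance of -10 and flag the need for action.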
This post looks at the middle phase. Before taking action to bring performance into alignment with the plan, make sure the variance you are seeing in the control systems is real. Corrective and preventative actions take time and usually involve costs, and there is no point in expending effort where it is not needed.
The variance is the difference between two imprecise elements: the planned state and the actual situation. The plan is based on estimates and assumptions made some time ago about what may occur in the future. All plans and estimates have a degree of error built in; it is impossible to precisely predict the future of a complex system such as a project. Similarly, the measurement of the actual situation is prone to observational errors; key data may be missing or the situation misinterpreted.
So how do you decide if the measured variance is real and significant enough to warrant corrective action? I suggest considering the following:
1. Does the reported variance line up with your expectations?
2. Is the variance significant?
3. Is a solution viable?
Let’s explore these in depth.
Does the reported variance line up with your expectations?
Try looking at a couple of different monitoring systems, such as cost and time. Do the two systems correlate, or are they giving you very different information on the same group of activities? If they correlate, perhaps your expectations are misplaced. If they are giving you different information, there may be data errors.
Is the variance significant?
If the predicted slippage on a key milestone’s completion date bounces around over a series of reports, any single measurement within that noise band is likely to be insignificant.
Trends, on the other hand, highlight issues. Sensible control systems have range statements that indicate the variance is too small to worry about if it is inside the allowed range. This general rule is modified to take trends seriously and to require action to correct negative variances close to a milestone or completion.
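The range-plus-trend rule above can be sketched in code. This is a minimal illustration under assumed values, not a real control system; the tolerance band and the number of reports examined are placeholders you would replace with your own range statements.

```python
def variance_warrants_action(variances, tolerance=5.0, trend_window=3):
    """Decide whether a series of reported variances (e.g., days of
    slippage on a milestone, negative = late) warrants action.

    A single reading inside the tolerance band is treated as noise;
    a run of steadily worsening readings is treated as a trend and
    taken seriously even if the latest value is still inside the band.
    """
    latest = variances[-1]

    # Outside the allowed range: significant on its own.
    if abs(latest) > tolerance:
        return True

    # Inside the band, but check the most recent reports for a
    # consistent slide toward a negative variance.
    recent = variances[-trend_window:]
    if len(recent) == trend_window:
        worsening = all(b < a for a, b in zip(recent, recent[1:]))
        if worsening and recent[-1] < 0:
            return True

    return False
```

A noisy but stable series such as `[-2, 1, -3, 2]` stays below the action threshold, while a steady slide such as `[-1, -2, -4]` triggers action even though every reading is still inside the band.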
Is a solution viable?
Some variances are simply not worth the cost of fixing. There is no point in spending US$10,000 to correct a -US$5,000 variance. However, this decision has to take into account any effect on the client and your organization’s reputation. Cost overruns are generally internal, whereas late delivery and quality issues may have a significant reputational cost, affecting stakeholder perceptions.
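The viability test reduces to simple arithmetic. In this hedged sketch, the reputational multiplier is an invented way of weighting externally visible variances more heavily than internal ones; it is not a standard formula.

```python
def correction_is_viable(cost_of_fix, variance, reputational_multiplier=1.0):
    """Return True if fixing the variance costs less than living with it.

    cost_of_fix             -- dollars needed to correct the variance
    variance                -- the negative variance in dollars (e.g., -5000)
    reputational_multiplier -- scales the exposure upward when the variance
                               is visible to clients (late delivery, quality
                               issues) rather than purely internal (a cost
                               overrun); 1.0 means no external impact
    """
    exposure = abs(variance) * reputational_multiplier
    return cost_of_fix < exposure
```

With the numbers above, spending US$10,000 on a purely internal -US$5,000 variance is not viable, but if the same variance carries, say, triple its face value in reputational exposure, the correction becomes worthwhile.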
Where a viable option exists to correct negative variances, corrective and preventative actions need to be planned, prioritized and implemented. First, there is no point wasting time on a control system that does not generate effective controlling actions.
Second, implementing corrective and preventative actions requires the resources working on the project to do something different. Variances don’t correct themselves, and simply telling someone to catch up is unlikely to have any effect. Sensible management action, decisions and leadership are needed to physically change the situation so there is a correction in the way work is performed. This is a core skill of every effective manager.
I’d love to know: How do you deal with variances in your projects? Please share below.
By Lynda Bourne
Have you ever experienced technical debt on a project? As the debt builds up, everything looks good from the outside. However, when the crunch comes and that debt has to be repaid, a major reversal in fortune can occur.
Technical debt refers to the costs of having to go back and resolve problems that arise because an earlier decision was made to take the easy route, instead of the best one. By taking the easy option, the team incurs a debt to the project that will have to be repaid later. While the concept comes from software development, this insidious effect can be seen across industries.
The Crossrail project in London offers a current, extreme example. In July 2018, it was reported that on-time and on-budget completion of the £14.8 billion rail project would occur in December 2018. By August 2018, completion had slipped by a year. Currently, the delay extends to the end of 2020, with a cost overrun of 20 percent.
What’s the main driver of this delay and associated costs?
It appears to be decisions made to ignore problems in the signaling system development. According to Construction Manager magazine, while giving evidence to a government inquiry, Crossrail’s new chief executive Simon Wright said, “We were testing on incomplete systems. Productivity was under stress, but we fought hard to maintain the schedule and thought all along that we could find a solution to bring it back, just like we have done on countless other problems that occurred during the construction program.”
This is a classic example of management decisions building up a technical debt.
In 2015, The Independent newspaper reported that rail experts and engineers were having difficulty creating interfaces for the signaling systems. At the same inquiry, Crossrail’s new chairman Terry Morgan said “problems that emerged were mostly due to difficulties with developing software to allow Crossrail trains to travel safely at speed through three separate signalling systems,” according to Construction Manager magazine. The problem was identified in 2015 and hadn’t been resolved by 2019, despite time and money wasted testing incomplete systems. In fact, the irrelevant testing probably added to the delay and costs by distracting people from the real challenge.
Fixing the problem properly the first time would surely have caused a delay and cost blowout between 2016 and 2018. But in all likelihood, the costs would have been lower, the delay would have been shorter, and the current furor surrounding the project would have been minimized.
The problem with technical debt is that often, the people who need to know about a problem aren’t informed. We will never know what the chair and CEO of Crossrail (both sacked) really knew in the 2016 to 2018 period, or what their senior managers knew about the build-up of the technical debt in the Crossrail signalling systems. But the problem could have been avoided, or at least minimized, if the technical debt had been acknowledged. People who are unaware of the concept of technical debt are more likely to choose paths that create it.
To avoid this lack of insight, everyone in the project group, especially team members, must be in a position to offer insight into technical debt.
The project manager can then choose to act, or not. Aware teams bring up the subject of technical debt in planning meetings, and they keep focused on it. Aware managers pose questions such as, “If this proposed shortcut is the right choice, what is there to gain, and what are the challenges and future implications?”
As with financial debt, there are times when going into debt can be beneficial, but only if you can pay back the accrued debt and interest at the right time.
How much technical debt is your project running? Please share your experiences below.
By Lynda Bourne
After more than 40 years in project management, project controls and project governance, I’ve learned that every successful organization has its own unique culture and structure. Nothing works “out of the box.”
Each organization needs to identify the aspects of its existing culture and the parts of its management systems that offer the best opportunities for improvement, define options that may work (there are no guarantees), and decide on the steps needed to deliver the desired improvements.
This process is a journey, and the measure of success is achieving the level of maturity where continuous improvement is organic and internal.
Here are my tips for getting there:
Rely on These Resources for Help Along the Way
PMI has a range of resources to assist you on this journey. The newly created Standard for Organizational Project Management (OPM) provides a framework to align project, program and portfolio management practices with organizational strategy and objectives. This standard is supported by the Organizational Project Management Maturity Model (OPM3®), which defines a framework to measure progress toward maturity. Both are complemented by Implementing Organizational Project Management: A Practice Guide. Finally, the Governance of Portfolios, Programs and Projects: A Practice Guide takes a closer look at the different types of governance and how you can implement or enhance governance on your portfolios, programs and projects. All of these standards are free downloads for PMI members.
This may not be the area many project managers focus on, but maybe it’s time for a change. After all, we cannot deliver successful projects when the project is set up to fail. Influencing senior management to focus on improving organizational maturity so that most projects have a fighting chance of being successful is good for everyone.
Have you created a culture of continuous improvement at your organization? I’d love to hear from you—please share below.
By Lynda Bourne
In my last post—It’s Time for a Long, Hard Look at Processes—I questioned if A Guide to the Project Management Body of Knowledge (PMBOK® Guide) should be updated every four years, or if it should become a dynamic knowledge management system similar to Wikipedia. The post generated a number of comments, which I’m going to try to address now.
The fundamental purpose of a standard is to offer standardized advice organizations can rely on. Standards are frequently referenced in contracts and other formal documentation, and they form the basis for certifications. The PMBOK® Guide fulfills all of these purposes. In this situation, stability is essential. Globally, standards are reviewed and updated every four to five years to balance the need for currency against the need for consistency.
The PMI Registered Education Provider (R.E.P.) community has a busy few months each time the PMBOK® Guide is updated, requiring them to go through their training materials to bring them all up to date. This is magnified many times over as organizations around the world update their documentation to align with the new standard.
But the American National Standards Institute (ANSI) standard is only one part of the PMBOK® Guide. The guidance section is the larger part of the book and also, in my opinion, the most useful. This element is a knowledge repository, and a knowledge repository should be as up to date as possible, with ready access to validated information. To achieve this, most knowledge management systems are web-based and assume that once information is printed, it is no longer current. Managing such a system requires skill and knowledge, and it should be a real-time, full-time function.
Given this, I suggest that PMI separate the standard part of the document from the knowledge element. The standard section would consist of the ANSI Standard (part two of the current PMBOK® Guide) and the supporting core knowledge that does not change much. This standard and supporting information would remain on the four- to five-year update cycle. The resulting document would be much thinner than the current PMBOK® Guide.
The knowledge element builds onto this as a cloud-based resource and should be the subject of continual improvement and updating. Allowing PMI members to contribute their knowledge on a continuous basis, subject to review and edit, would allow the body of knowledge to grow and adapt as project management grows and adapts.
A careful design of the knowledge structure based on the PMBOK® Guide—augmented with information from the other standards published by PMI and enhanced with current developments from industry—would create a very useful and dynamic source of knowledge for the global project management community.
If access to this project management knowledge bank is free to members and available for a fee to commercial users and non-members, the value of membership would be enhanced and PMI would be positioned to maintain its position as a global leader in the development of project management.
It’s an interesting challenge. What do you think?