Voices on Project Management

Voices on Project Management offers insights, tips, advice and personal stories from project managers in different regions and industries. The goal is to get you thinking, and spark a discussion. So, if you read something that you agree with--or even disagree with--leave a comment.






Viewing Posts by Lynda Bourne

3 Agile Disconnects We Need to Address

Categories: Agile

By Lynda Bourne

The never-ending debate between agile and waterfall seems to be fuelled by different groups of people talking about completely different concepts with little understanding of each other’s perspectives. From my viewpoint, some of the key disconnects are:

Agile vs. Agility: In the modern VUCA[1] environment, agility is important. But organizational agility is not the same as the organization choosing to use an agile project delivery process.

Organizational agility is constrained by the nature of the organization and its assets. A major mining company cannot suddenly decide to stop mining iron ore and focus on rare earths; it has billions invested in its existing mines, and new mines take many years to bring online. It can refocus investments “immediately,” but the results take decades to be fully realized, and suddenly reversing the decision in a few years’ time will waste millions. Adaptability is important, but decisions have to be nuanced.

Conversely, a small consulting business whose main asset is its people can decide to shift focus on an almost daily basis to keep up with fast-moving trends; think of applying AI in almost any sphere.

However, any type of organization can choose to use an agile methodology to help deliver those projects that benefit from an inherent flexibility in working.

Agile vs. Projects: Agile methods are not exclusive to projects, and not all projects benefit from agile.

Agile methods such as Kanban and Scrum can be used for operational maintenance (particularly of software) without the overhead of project management. The maintenance team uses its preferred method to prioritize the repair and upgrading of the operational system and keep track of the backlog. New requests are added to the backlog, prioritized, and completed in a stable business-as-usual function.

Where a project approach to delivering a defined scope of work is desirable, some projects are suitable for agile methods and others are not. Most “soft” projects creating an intangible product such as software will benefit from an agile approach to development. But heavy engineering projects where safety and structural considerations are paramount need a fully planned and disciplined approach to avoid disaster.

There is a continuum from projects that are suited to agile through to those where a tightly controlled planned approach is essential. Deciding on how to best manage projects along the spectrum is as much a cultural decision as a technical one.

Non-Agile Projects vs. Waterfall: Agile advocates continue to try to divide the world into “agile = good” and “waterfall = bad.” I discussed this issue in my post “The Problem with Waterfall, Agile & ‘Other.’”

The simple fact is very few software projects use waterfall; the concept was promulgated by the U.S. Department of Defense in 1988 for software development and abandoned in 1994, but some organizations have hung onto the perception of “control” for various reasons. However, outside of the software industry, no one uses waterfall.

Contrary to the view of most agile advocates, the concept of change as defined in the Agile Manifesto and change in almost all other projects is based on the same premise. From the Manifesto’s second principle: Agile processes harness change for the customer's competitive advantage. Change that destroys customer value is no more welcome in an agile project than in any other.

Every contract for the delivery of a project to a client I’ve seen in the last 50 years has included clauses for the management of change. What varies is the cost of implementing the change. If you have delivered 15 out of 20 software modules and the client asks for five more, there will be time and cost implications based on the 25% increase in scope. If you have built 15 stories in a 20-story high-rise building and the client demands an additional five stories be added, the only option is to demolish everything, install stronger foundations and start again. But if the client decides to change the building color scheme from pale grey to pale blue before the paint is ordered, the cost of the change will be minimal. Regardless of the project delivery approach, change is only beneficial if it creates additional value.

Where the Agile Manifesto adds value across all projects is in its focus on relationships, people, and communication. These concepts are becoming more important in all industries and across all project types.

We need to move on from the “agile/waterfall” debate and recognize:

  1. An appropriate level of organizational agility is essential in the modern VUCA world.
  2. Agile project delivery methods have benefit in the right situations; they are not a silver bullet to solve all project delivery challenges.
  3. Waterfall is not a synonym for bad project management; no one uses waterfall, but there are plenty of examples of bad project management around.
  4. Good project management focuses on relationships, communication, and people by motivating the right people and using the best approaches to deliver value to the project client. But the best approach depends on the nature of the project deliverable.

What do you think?

[1] VUCA stands for volatility, uncertainty, complexity, and ambiguity.

Posted by Lynda Bourne on: February 16, 2024 06:15 PM | Permalink | Comments (12)

Do Modern PMs Rely on Charts Too Much?

By Lynda Bourne

Ptolemy's world map (source: Wikipedia)

Do modern project managers and their clients rely on their charts and reports too much? We all know that project schedules, cost reports, risk assessments and other reports are produced by sophisticated computer software, these days increasingly enhanced by artificial intelligence. But does this sophisticated processing mean the charts are completely reliable?

The modern world is increasingly reliant on computer systems to direct and control many aspects of life—from self-driving cars, to autonomous warehouses, to the flight control systems in aircraft. But can this reliance on computer systems be translated to project controls information, or do we need a more ancient mindset?

Modern navigators rely on the accuracy of their GPS to know exactly where they are and where they are going. Autopilots are better than human pilots, but the data they use is precise and validated.

The same level of reliability and accuracy cannot be applied to project controls data. Every estimate is an assessment of what may occur in the future based on what happened in the past. Even when a sophisticated risk model is built, the P80 or P90 result is based on subjective range estimates taken from past events.

The future may unfold within the expected parameters, and it may not. We simply cannot determine the future in advance. While the quality of the project predictions is based on the quality of the data being used in the modelling processes (and the only guaranteed fact is the model will be incorrect), predictions do not control the future. The key question is: How useful are the models in helping navigate the project through to a successful conclusion? [Remember GIGO (garbage in, garbage out)?!]

In days gone by, navigators did not need accurate charts and satnav systems to reach their destinations. The Viking and Polynesian navigators crossed thousands of miles of open ocean to land on small islands using observations of the natural environment and tacit knowledge passed down from earlier generations. They knew certain seabird species only ventured relatively short distances from land, how clouds formed and changed over land, etc., augmented by primitive technologies.

Fast-forward a few centuries, and the early European navigators (Columbus, Magellan, Drake, Cook and countless others) had steadily improving charts that made navigating easier—but they also knew the best charts available were not accurate. The general shape of the world had been mapped since the time of Ptolemy (circa 150 CE), and as better information became available, better maps and charts were created. But charts were still being improved well into the 21st century.

So how did people navigate the globe without accurate maps and charts? I suggest there were four core elements in the approach, all of which can be applied to modern project management:

  1. Recognize the chart is a guide: get the best possible chart available and use it to plan your course—taking into account as much additional information and tacit knowledge as you can access.
  2. Then, assume the chart is incorrect. Keep a sharp look out for unexpected issues and dangers, adjust course as needed, and keep collecting information along the way. You only run into the rocks you do not see!
  3. Keep adapting and adjusting your course to make the best of the current circumstances, using both known and emerging information—the destination does not change, but how you get there may.
  4. Then use the new information you have gathered to update the chart to benefit future voyages in the same direction.

To move from assuming controls information is correct, to seeing it as a useful guide that can be improved as better knowledge becomes available, requires a paradigm shift in thinking that sits comfortably alongside many of the concepts of agile.

The future is inherently uncertain and we can learn a lot from the way early navigators used imprecise charts to sail the oceans. Navigating the globe in past centuries and leading a project to a successful conclusion are both risky endeavours; this fact needs to be accepted, and the risks minimized by using the best available charts—while being aware of their limitations.

What do you think?

Posted by Lynda Bourne on: September 14, 2023 09:52 PM | Permalink | Comments (10)

Predicting Completion in Agile Projects

Categories: Agile

By Dr. Lynda Bourne

The generally accepted way of assessing progress on a project, and predicting its completion, is to use a critical path method schedule. However, the CPM paradigm does not work across the wide range of projects where there is no predetermined sequence of working that must be followed. There may be a high-level “road map” outlining the desired route to completion and/or specific constraints on the sequencing of parts of the work, but in most agile projects the people doing the work have a high degree of flexibility in choosing how most of the work is accomplished.

The focus of this post is to offer a practical solution to the challenge of assessing progress, and calculating the likely completion date in agile projects.

WPM as an Alternative to ES and CPM
Work performance management (WPM) is designed as an alternative approach to project controls. It uses the same concept as earned schedule, but offers a simple, practical tool that uses project metrics that are already being used for other purposes.

The function of WPM is to assess progress and calculate a predicted completion date in a consistent, repeatable, and defensible way by comparing the amount of work achieved at a point in time with the amount of work planned to have been achieved at the same point in time. Then based on this data, you calculate an expected completion date.

The Theoretical Basis of WPM
WPM has been designed to fill an identified gap in the current controls systems used on agile projects. It is based on the same premise used in earned schedule and earned duration, and is expected to achieve a similar level of reliability by comparing the amount of work planned to be accomplished to the amount of work actually achieved in the period through to a data date (time now). However, unlike ES and ED, WPM focuses on the core elements of the work.

WPM Terminology
The terminology used for the data points in WPM is:

  • WP = Work Planned: measured in an appropriate unit, cumulative over time
  • WA = Work Accomplished: measured on the same basis as WP
  • PC = Planned Completion: project duration in time units (days, weeks, months)
  • TN = Time Now: the number of PC time units to the date of assessment
  • TE = Time Earned: the number of PC time units to the point where WA = WP

From this information, the work performance measures are calculated as follows:

  • WPV = Work Performed Variance = TE - TN;
    negative values show the schedule slip in PC time units
  • WPI = Work Performed Index = TE / TN;
    values less than 1.0 show less work has been accomplished than planned
  • EC = Expected Completion = PC / WPI;
    the expected project duration in PC time units

Applying WPM to a Project Using Scrum
Scheduling the work should be as realistic as possible, but in many situations a straightforward pragmatic approach will suffice. Take for example a 20-week software project that has 27 stories of various sizes, a total of 86 story points, and a resource plan based on two scrum teams. In the absence of any other information, you could assume:

  • The first two weeks are needed for team development, planning and other start-up processes
  • Sprints are expected to take two weeks each
  • The last two weeks will be for contingencies, bug fixes and other finalization work

This leaves 16 weeks for productive work; therefore, the first stories should be delivered at the end of the first productive sprint, Week 4, and all stories by the end of Week 18.

This means the rate of planned production between the end of Week 2 and the end of Week 18 is 86/16 = 5.375 story points per week. Based on these assumptions, at the end of Week 4 (two weeks of production), we can expect 10+ story points to be complete, and at the end of Week 18 all 86 story points complete. The rest of the planned distribution is simply a straight line between these two points.
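This straight-line plan can be sketched in a few lines of Python (the function name and defaults are illustrative, not from the post):

```python
def planned_story_points(week, total=86, start=2, finish=18):
    """Cumulative planned production under the straight-line assumption."""
    rate = total / (finish - start)  # 86 / 16 = 5.375 story points per week
    if week <= start:
        return 0.0  # Weeks 1-2 are start-up: no stories planned yet
    return min(total, (week - start) * rate)

print(planned_story_points(4))   # 10.75 -> the "10+ story points" expected by Week 4
print(planned_story_points(18))  # 86.0  -> all stories planned complete
```

The same function, with different totals and dates, works for any project where a linear planned-production rate is a reasonable approximation.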

We know sprints will not take exactly two weeks every time (some will overrun, and occasionally some will finish early), and we also know the number of story points generated in each sprint will vary. But on average, if the two sprint teams together are not completing a bit over 5.3 story points per week, every week, the project will finish late.

Once this basic rate of production has been determined for the project, WPM measures the actual work delivered (WA) and shows the time variance at time now (TN) and uses this information to predict the expected completion (EC).

For example, at the end of Week 8, three sprints should have been completed by both teams, and we are expecting 30 story points complete. But only 23 have been delivered. Velocity calculation will indicate more sprints will be needed, and the burndown chart will show the work is behind plan. But what does this mean from a time perspective?

A look at the planned rate of production will show 23 story points should have been finished during Week 7 (the actual fraction is 7.3). Therefore, the work is 0.7 weeks (3.5 working days) late. The work performance index (WPI) is 0.9125.

Dividing the original duration (20 weeks) by the WPI suggests the revised duration for the project is 21.9178 weeks; the variance at completion is -1.9178 weeks, or 13.4 calendar days late.

If these calculations look familiar, they are based on the well-tried formulas used in earned value management and earned schedule—all I’ve done is shift the metric to a direct measure of the work performed.
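The worked example above can be reproduced with a short sketch (the helper name is mine, not part of the WPM terminology):

```python
def wpm_measures(pc, tn, te):
    """Compute the WPM measures defined earlier: WPV, WPI and EC."""
    wpv = te - tn   # negative values show the schedule slip in PC time units
    wpi = te / tn   # values below 1.0 mean less work done than planned
    ec = pc / wpi   # expected project duration in PC time units
    return wpv, wpi, ec

# End of Week 8: the 23 delivered story points were planned to be done at week 7.3
wpv, wpi, ec = wpm_measures(pc=20, tn=8, te=7.3)
print(round(wpv, 1))   # -0.7 weeks (about 3.5 working days late)
print(round(wpi, 4))   # 0.9125
print(round(ec, 4))    # 21.9178 weeks; variance at completion about -1.92 weeks
```

Because the inputs are just three numbers already tracked for other purposes, the calculation is easy to repeat at every assessment date.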

WPM is designed to be a simple, robust performance measurement system that provides an accurate assessment of the project’s status from a time management perspective. It can assess how far ahead of or behind plan the work currently is—and, based on this information, the likely project completion date on the assumption that work will continue at the current rate.

The two requirements to implement WPM are:

  • A consistent metric to measure the work planned and accomplished
  • A simple but robust assessment of when the work was planned to be done

The metric used can be a core deliverable (e.g., 2,000 computers replaced in an organization), or a representation of work such as “story points,” or the monetary value of the components to be delivered to the client.

Peripheral and support activities can usually be ignored when establishing the WPM metric; they rarely impact the project delivery independently. Failures in the support areas typically manifest in delays to the primary delivery metric.

Has anyone seen or used something like this in the “real world”? I would love to hear if you have.

Posted by Lynda Bourne on: June 26, 2023 10:45 PM | Permalink | Comments (19)

Commercializing Agile

Categories: Agile

By Lynda Bourne

Agile in its various forms is becoming mainstream, and this means an increasing number of commercial contracts are being delivered by contractors who either choose, or are required, to use an agile methodology to create their contracted deliverables. While this is probably a good thing, this shift in approach can cause a number of problems. This post is a start in looking for practical solutions to some of these issues.

Two of the core tenets of agile are welcoming change to add value, and working with the client to discuss and resolve problems. While these are highly desirable attributes that should be welcomed in any contractual situation, what happens when the relationship breaks down, as it will on occasion?

The simple answer is that every contract is subject to law, and the ultimate solution to a dispute is a trial—after which a judge will decide the outcome based on applying the law to the evidence provided to the court. The process is impartial and focused on delivering justice, but justice is not synonymous with a fair and reasonable outcome. To obtain a fair and reasonable outcome, evidence is needed that can prove (or disprove) each of the propositions being put before the court.

The core elements disputed in 90% of court cases relating to contract performance are about money and time. The contractor claims the client changed, or did, something(s) that increased the time and cost of completing the work under the contract; the client denies this and counterclaims that the contractor was late in finishing because it failed to properly manage the work of the contract. 

The traditional approach to resolving these areas of dispute is to obtain expert evidence on the cost of the change and the time needed to implement it. The cost element is not particularly affected by the methodology used to deliver the work; the additional work involved in the change and its cost can still be determined. The major issues arise in assessing a reasonable delay.

For the last 50+ years, courts have been told—by many hundreds of experts—that the appropriate way to assess delay is by using a critical path (CPM) schedule. Critical path theory assumes that to deliver a project successfully, there is one best sequence of activities to be completed in a pre-defined way. Consequently, this arrangement of the work can be modeled in a logic network—and based on this model, the effect of any change can be assessed.

Agile approaches the work of a project from a completely different perspective. The approach assumes there is a backlog of work to be accomplished, and the best people to decide what to do next are the project team members when they are framing the next sprint or iteration. Ideally, the team making these decisions will have the active participation of a client representative, but this is not always the case. The best sequence of working emerges; it is not predetermined.

There are some control tools available in agile, but diagrams such as a burndown (or burnup) chart are not able to show the effect of a client instructing the team to stop work on a feature for several weeks, or adding some new elements to the work. The instructions may have no effect (the team simply works on other things), or they may have a major effect. The problem is quantifying the effect to a standard that will be accepted as evidence in court proceedings. CPM has major flaws, but it can be used to show a precise delay as a specific consequence of a change in the logic diagram. Nothing similar seems to have emerged in the agile domain.

The purpose of this post is twofold. The first is to raise the issue. Hoping there will never be a major issue on an agile project that ends up in court is not good enough—hope is not a strategy. The second is to see if there are emerging concepts that can address the challenge of assessing delay and disruption in agile projects. Do you know of any?

Posted by Lynda Bourne on: March 16, 2023 01:11 AM | Permalink | Comments (10)

Social and Environmental Awareness is Becoming Confusing

By Lynda Bourne

It is important that both professionals, and the organizations that employ them, are socially and environmentally aware—and act responsibly to protect the rights of others. The financial consequences of failing to be socially aware started to be felt in the 1950s. Around this time, investors started excluding stocks, or entire industries, from their portfolios based on business activities such as tobacco production or involvement in the South African apartheid regime.

These considerations developed into the concept of environmental, social, and corporate governance. Today, ESG is an umbrella term that refers to specific data designed to be used by investors for evaluating the material risk that the organization is taking on based on the externalities it is generating.

The term ESG was first popularized in a 2004 report titled Who Cares Wins[1], a joint initiative of financial institutions at the invitation of the United Nations. The UN’s 2006 report Principles for Responsible Investment (PRI) then required ESG to be incorporated into the financial evaluations of companies.

Under ESG reporting, organizations are required to present data from financial and non-financial sources that shows they are meeting the standards of agencies such as the Sustainability Accounting Standards Board, the Global Reporting Initiative, and the Task Force on Climate-related Financial Disclosures. The data must be made available to rating agencies and shareholders.

Corporate social responsibility is the flip side of ESG. CSR is the belief that corporations have a responsibility toward the society they operate within. This is not a new idea; it is possible to trace the concerns of some businesses toward society back to the Industrial Revolution and the work of primarily Quaker business owners to provide accommodation and reasonable living standards for their workers.

However, it was not until the 1970s that concepts such as social responsibility of businesses being commensurate with their power, and business functions by public consent, started to become mainstream. Today, CSR is a core consideration for most ethical businesses.

These concepts were turned into a structured set of guidelines in 1981, when Freer Spreckley suggested in Social Audit - A Management Tool for Co-operative Working[2] that enterprises should measure and report on financial performance, social wealth creation, and environmental responsibility.

These ideas have become the triple bottom line (TBL), which is considered essential to effective organizational governance these days. Most of the major corporate governance frameworks require the TBL to be included in corporate reporting.

In his foreword to Corporate Governance: A Framework – Overview (prepared by the World Bank in 2000), Sir Adrian Cadbury summarized these objectives in his statement: "The aim is to align as nearly as possible the interests of individuals, corporations, and society."

Similar concepts to the TBL also form a core component of most codes of ethics and professional conduct. For example, the current version of PMI’s Code of Ethics and Professional Conduct includes:

  • 2.2 Responsibility: Aspirational Standards: As practitioners in the global project management community:
    2.2.1 We make decisions and take actions based on the best interests of society, public safety, and the environment.
  • 4.3 Fairness: Mandatory Standards: As practitioners in the global project management community, we require the following of ourselves and our fellow practitioners:
    4.3.4 We do not discriminate against others based on, but not limited to, gender, race, age, religion, disability, nationality, or sexual orientation.

So far, so good. There has been a simple set of unambiguous requirements in place for 30-plus years that are straightforward and easy to understand. These simple (if difficult to achieve) concepts have been refined to make consideration of environmental (sustainability), social and financial outcomes important in every decision-making process, including those affecting the organization’s projects.

However, having become a hot topic for boards, investors and managers alike in the last couple of years, these ideas seem to be disappearing into a blizzard of acronyms that appear to be more about differentiating a consultant’s services than adding value. Some of the newer acronyms include:

  • 3Ps: People, Planet, Profit
  • DEI: Diversity, Equity and Inclusion (alternatively EDI)
  • DIB: Diversity, Inclusion and Belonging
  • D&I: Diversity & Inclusion
  • JEDI: Justice, Equity, Diversity and Inclusion
  • RAP: Reconciliation Action Plan
  • SDGs: Sustainable Development Goals (published by the UN)

My concern is that while the concepts defined by each of the acronyms above are in themselves valuable (once you work out what they mean), and a few—such as the UN’s sustainable development goals—add substantially to the TBL framework, most are either subsets of the overarching objectives defined in PMI’s Code of Ethics and Cadbury’s simple statement, or essentially cover the same concepts.

Do all of these extra acronyms add to the core objective of improving outcomes for people and the environment or not? What do you think?


[1] Download the 2004 report from:

[2] Spreckley, Freer (1981). Social Audit: A Management Tool for Co-operative Working. Beechwood College.

Posted by Lynda Bourne on: December 07, 2022 07:52 PM | Permalink | Comments (7)

"Life is to be lived. If you have to support yourself, you had bloody well better find some way that is going to be interesting. And you don't do that by sitting around wondering about yourself."

- Katharine Hepburn