The Evolution of ALM

Michael R. Wood is an independent consultant specializing in business process improvement and IT strategy. He is the creator of the business process improvement methodology HELIX and founder of The Natural Intelligence Group, a strategy, process improvement and technology consulting company. He is also a CPA, has served as an Adjunct Professor in Pepperdine's Management MBA program and as an Associate Professor at California Lutheran University, and has sat on the boards of numerous professional organizations. Mr. Wood is a sought-after presenter of HELIX workshops and seminars in both the U.S. and Europe.

One of the most well-thought-out, defined and formalized areas of IT is the body of knowledge surrounding application lifecycle management. Yet even in 2010, it still seems to mystify many. Most of us know ALM as the software or systems development lifecycle or SDLC. As Wikipedia so simply and eloquently puts it:
“Application lifecycle management (ALM) is the marriage of business management to software engineering made possible by tools that facilitate and integrate requirements management, architecture, coding, testing, tracking, and release management.”
However, few know the evolution of ALM and how the Structured Revolution of the 1970s and 80s was a major turning point in software development as we know it today. This article presents a retrospective on ALM/SDLC--the people who shaped it, the principles that influenced it, and the methodologies and tools the movement spawned.
While the roots of software development date back to before the 1960s, the real birth of ALM took place around 1977 and blossomed in the 1980s. I know this not because of what the history books say, but because I was there when it all happened. In 1977 I was introduced to a budding group of philosophies, principles and methods known as structured methodologies or SDLCs [Structured (software) Development Life Cycles], each with its own body of knowledge and each with its own concepts and tools. It was during this time that names like Ed Yourdon, Tom DeMarco, Ken Orr, Constantine, Jackson, Dijkstra and more dominated the dialogue on ways to build software applications that were functionally complete and correct or--as we think of it today—“aligned” with the needs of the business. This was an age of discovery and enlightenment for IT and was not without controversy.
Some focused on moving structured programming concepts into structured design models, maintaining that if the design is properly structured the subsequent program will automatically be structured, regardless of the language used (true enough). Others felt that a structured design did not guarantee that the application produced would meet the needs of the organization or users (also true). Still others argued as to the approach. Should designs be process driven or data driven? Should the designs be programming language independent? Should the focus be on developing tools that generate programs based on design specifications? Should the design be driven from output requirements? The list went on and on.
To complicate matters, each school of thought was in competition with the others to see whose approach would prevail and become the standard for the future. The answer? None and all. Each left its mark on the way systems are conceived, designed and developed.
Today’s CMM (Capability Maturity Model) is a direct outgrowth of the Structured Revolution (The Revolution). Computer-Aided Software Engineering (CASE) also was born of The Revolution. Even tools like Rational Rose have their roots in the genius of those leading The Revolution. During the 1990s, Business Process Reengineering and Business Process Improvement were greatly influenced by structured development methodologies. These formal software development methodologies came in a host of flavors and became almost a religion for their supporters.
My contribution was The Helix Methodology (Helix) that I first published in 1979. In actuality, Helix was a second generation SDLC in that it was an outgrowth of the works of Ed Yourdon and Warnier-Orr. Helix was the first process- and data-structured design methodology. It recognized the need to cross-functionally facilitate an organization’s management and knowledge workers in order to “discover” true needs and requirements--first in business terms and later in detailed specifications and application blueprints (logic and data structures). In the 1990s, Helix evolved into a formal Business Process Improvement method and in the 2000s, through the incorporation of organization development and other disciplines, a full-blown enterprise value improvement framework.
Unfortunately, most of what happened during the renaissance of the 70s and 80s is lost on today’s IT professionals--and it shows in the quality and resilience of the products that have been produced (like overweight legacy systems that are anything but adaptable to change). So, in a tribute to the “good fight”, here is a short history lesson as I remember it (with a little help from Wikipedia).
For me it all started in 1977 when I was invited to attend a workshop on structured system design that promised to teach me how to design and build systems that were functionally correct, complete, easy to understand and easy to maintain. Having been in the service bureau business for over five years—and contending with the continual customization of our programs--I had a keen appreciation for just how difficult it was to achieve those goals. The course was taught by Ken Orr of Warnier-Orr fame. It was the first time I had been exposed to something other than flowcharts as a primary logic diagramming technique. Using set theory combined with bracket diagrams to depict logic was amazing--and so intuitively obvious I was hooked.
At that time, the Warnier-Orr methodology was built around the premise that if you could identify all the output requirements of a system you could decompose those outputs into the data and program structures needed to support their production--and then through some basic process analysis, identify the inputs needed to capture the information required. For me, this was a total game-changer and I immediately became a Warnier-Orr zealot. Off to Kansas City I flew to attend a “train the trainer” workshop and immediately dedicated my entire consulting focus to helping companies implement mini-computer-based, enterprise-wide systems using my newfound skills.
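To make the idea concrete, here is a minimal sketch--in modern Python, with entirely hypothetical report and field names; Warnier-Orr itself used bracket diagrams, not code--of how the structure of a required output can drive both the data structure and the program structure:

    # Output requirement: a monthly statement listing, for each customer,
    # each order and its line items, with totals at every level.
    # The data structure is decomposed directly from that output (set within set).
    statement = {
        "month": "1979-06",
        "customers": [
            {
                "name": "Acme Mfg",
                "orders": [
                    {"order_no": 101, "lines": [("widget", 2, 5.00), ("gear", 1, 12.50)]},
                    {"order_no": 102, "lines": [("bolt", 10, 0.25)]},
                ],
            },
        ],
    }

    def print_statement(stmt):
        # The program's loop structure mirrors the nesting of the output:
        # statement -> customers -> orders -> lines.
        grand_total = 0.0
        print("Statement for " + stmt["month"])
        for customer in stmt["customers"]:          # one iteration per customer
            customer_total = 0.0
            print("  Customer: " + customer["name"])
            for order in customer["orders"]:        # one iteration per order
                order_total = sum(qty * price for _, qty, price in order["lines"])
                print("    Order %d: %.2f" % (order["order_no"], order_total))
                customer_total += order_total
            print("  Customer total: %.2f" % customer_total)
            grand_total += customer_total
        print("Grand total: %.2f" % grand_total)

    print_statement(statement)

Read in reverse, the same decomposition points to the inputs that must be captured (customers, orders, line items) for the output to be producible at all.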
At that same time, I became aware of others in the field on the same quest as Orr--the most prominent being Ed Yourdon, who approached the challenge a bit differently. Yourdon’s approach started with a more process-oriented focus and--through the use of what he called data flow diagrams--mapped out how objects of data worked their way through transactions and, later, programs. At the time, I felt Yourdon’s approach was too messy, hard to use and overly syntactical in its representations. Data flow diagrams (or as we called them, spaghetti-and-meatball diagrams) had no structure and didn’t promote structured outcomes. In fact, a good part of my consulting was derived from helping organizations migrate from Yourdon to Warnier-Orr and later Helix. Yet by far, Yourdon was the primary force in the structured thinking arena. Even his marketing was way out in front, as evidenced by his promotional ads in ComputerWorld circa 1976.
Other names influential in the Structured Movement were Jackson, Dijkstra, DeMarco and Constantine. Each had their own theories and methodologies for skinning the structured design cat. By 1978 the movement was in full swing: a revolution was afoot and I was smack-dab in the middle of it. The advertising of the period illustrates how the movement was coming together: instead of competing with each other, the gurus of the movement promoted a joint effort via what were called “Structured Forums”.
In a matter of 24 days in 18 cities, the blitz was on. I was fortunate enough to present at one of these forums, where I set forth the supposition that business process and data structure could be identified very early in the lifecycle by adding a requirements discovery phase to the standard SDLC phases of requirements definition, design, construction, deployment and renewal. During this phase, I contended, objects of information could be identified and traced as they moved through the organization, in the context of how they delivered value to stakeholders. From these “workflow” models, data structures could be identified and--using normalization rules--a data model could be accurately developed.
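As an illustration only--hypothetical tables and fields, sketched in Python rather than the notation of the day--here is how a single object of information observed in a workflow, an order form, might be normalized from the flat record seen on the shop floor into a small data model:

    # The order form as it circulates in the workflow: one flat row per line item,
    # with customer and product facts repeated on every row.
    order_form_rows = [
        {"order_no": 101, "customer": "Acme Mfg", "city": "Topeka",
         "item": "widget", "qty": 2, "unit_price": 5.00},
        {"order_no": 101, "customer": "Acme Mfg", "city": "Topeka",
         "item": "gear", "qty": 1, "unit_price": 12.50},
    ]

    # After applying normalization rules, each fact lives in exactly one place,
    # keyed by the thing it describes.
    customers   = {"C1": {"name": "Acme Mfg", "city": "Topeka"}}
    products    = {"widget": {"unit_price": 5.00}, "gear": {"unit_price": 12.50}}
    orders      = {101: {"customer_id": "C1"}}
    order_lines = [{"order_no": 101, "item": "widget", "qty": 2},
                   {"order_no": 101, "item": "gear", "qty": 1}]

The point is less the notation than the traceability: every table and field can be traced back to an output or workflow the business actually cares about.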
That supposition, of course, was heresy to all camps. The data-structured folks wanted nothing of the process side of the equation, and the process-driven folks wanted nothing of the data side. Neither believed what I was advocating could be done--until I presented examples from real-life projects. So, like a second-generation structured enthusiast, I blended the best of all the theories, and Helix was born.
But something happened along the way: the focus moved toward a new generation of application generators called C.A.S.E. (computer-aided software engineering) tools, fueled by a budding fourth-generation language market. This short-circuited the revolution as IT groups flocked to these power tools for building dynamic prototypes and evolving business applications into existence via heuristic methods.
While these C.A.S.E. tools were promising, they did nothing to ensure that what was built had traceability back to the needs of the organization. Most of the systems developed with these tools were nothing more than automated versions of manual systems. Programmers hated C.A.S.E. tools because they took away their art and produced what they considered ugly code.
Some C.A.S.E. providers tried in vain to integrate structured methods into their tool sets. Companies like Texas Instruments had promising entries that would generate code in a variety of languages and provide facilities for maintaining that code at the design level, removing the programmer from the equation. In 1988, even I threw my hat in the ring with a C.A.S.E. tool called Advantage that built applications with little or no code--and produced programming-language-independent design specifications.
Perhaps the lone survivor of the C.A.S.E. offerings came in 1994 with UML (Unified Modeling Language), a modeling language for capturing business processes and use cases from which object-oriented applications can be generated. Today, Microsoft’s 2010 version of Visual Studio also holds some new promise in terms of its tool sets and its integration of process modeling. Yet much of the body of knowledge developed during the Structured Revolution has been lost, watered down or homogenized to the point of being rather weak at best; hence the focus on CMMI and other frameworks that keep the last pieces of structured system development alive.
From structured programming to structured design to application lifecycle management, the evolution is clear but the goal has blurred. It isn’t about applications at all, but rather about building technology solutions that align with the needs of the organization and its stakeholders--and thus ALM needs to be tightly integrated with business process improvement methods and frameworks.
References
Wikipedia – Application Lifecycle Management
A Short History of Structured Flowcharts (Nassi-Shneiderman Diagrams)
Wikipedia – Software Development Methodology
Wikipedia – Structured Systems Analysis and Design Method (SSADM)
Wikipedia – Jackson System Development
Wikipedia – Data Structured Systems & Database Design
Wikipedia – Edward Yourdon
Wikipedia – Edsger W. Dijkstra
Wikipedia – Warnier/Orr Diagram
ComputerWorld – Yourdon’s State of the Future
Wikipedia – Tom DeMarco
Wikipedia – Larry Constantine
The Structured Forum
Article: “CASE crop a flop” (computer-aided system design)
Article: “Diagramming techniques” (systems analysis and design diagramming; includes brief outlines of available Computer Aided Software Engineering software)
Wikipedia – James Martin
Wikipedia – Computer-Aided Software Engineering (CASE)