Process/Project PKG - Application Solution Process

Stage PMP - Plan Project
Stage PMA - Activate Project
Stage PMC - Control Project
Stage PKS - Application Solution Strategy
Stage PKV - Application Solution Evaluation
Stage PKE - Application Solution Enablement
Stage PKD - Application Solution Deployment
Stage PME - End Project

Description

Application Solution Overview

Introduction to Application Solution Processes

Scope

The Application Solution Processes provide methods for evaluating, selecting, implementing, and deploying commercial off-the-shelf application packages. Considerable growth and maturity of the application package marketplace have prompted many organizations to adopt the policy of "package first, custom development second" in fulfilling business application needs. This product has been designed to help organizations select the right application package, tailor it to fit their needs, and integrate it with the business and technical environment where it will operate. Although the application solution processes are specifically designed for selecting and implementing business application packages, it may be possible to adapt the processes to address other types of packages such as infrastructure packages (e.g., DBMS, development tools, transaction monitors) or productivity packages (e.g., word processors, spreadsheets).

The reasons for using the application solution processes generally fall into three categories:

  • A business reengineering or business analysis project has identified the need for a system and the implementation of an application package is considered a logical and cost-effective solution;
  • An existing application package has become obsolete or unusable, and must be replaced; or,
  • A gap in automated support of the business has become a critical problem and needs to be plugged quickly.

Each of these scenarios is fully supported by this product.

Application Solution Processes

Processes for two types of application package projects are contained in this process library:

Application Solution Selection Process

The purpose of this process is to evaluate commercially available off-the-shelf packages against the organization's business and technical requirements, and select a package for implementation. This process is typically performed once for a given need, and encompasses the first two stages of this blueprint:

  • Application Solution Strategy - establishes the approach to be used in evaluating packages based on an understanding of business objectives, principles, scope, risks, and the marketplace.
  • Application Solution Evaluation - assesses qualified vendors and packages against the organization's requirements, and selects a package for implementation.

Application Solution Implementation Process

The purpose of this process is to tailor, prepare, and deploy the selected application package. This process may be performed more than once depending on the release strategy for the package, and encompasses the final two stages of this blueprint:

  • Application Solution Enablement - configures and customizes the selected package to meet the organization's needs, and develops a set of products that are used to deploy the package at specified sites.
  • Deployment - installs and makes the tailored package operational at specified sites.

The Application Solution Enablement and Deployment stages are typically instantiated as multiple projects when a package must be implemented in increments over time, or implemented on different technical platforms.

Integration with Other Processes

The Application Solution Selection and Application Solution Implementation processes are designed for integration with other processes.

The Application Solution Selection process may leverage the results of business reengineering or other types of business analysis projects to facilitate the definition of business requirements for the required package. The results of information strategy planning or infrastructure assessment projects may be used to facilitate the definition of technical requirements and standards for the required package. Likewise, the Application Solution Selection process may provide new ways of doing business or technology enablers that can be leveraged in business reengineering projects.

The Application Solution Implementation process may identify the need for one or more systems development or redevelopment projects. Once the selected package has been acquired, new functionality may need to be developed to fill gaps in the package. If this functionality cannot be easily developed using the package's configuration or customization features, one or more systems development projects will need to be initiated. In addition, if the package is replacing or being integrated with existing systems, changes may need to be applied to existing programs or files through one or more systems redevelopment projects.

Application Solution Selection Process

Scope

A small Application Solution Selection project will typically have a narrow scope (e.g., 3 high-level processes) and a small number of qualified vendors, and can be completed in just over a month. On the other hand, a large project will typically have a broad scope (e.g., 10 high-level processes) and a large number of qualified vendors (e.g., 10 to 12 vendors), and can be completed in about 3 1/2 months. Projects of larger scope (e.g., SAP, integrated packages) should be partitioned into two or more parallel and coordinated projects.

Several factors can increase the level of effort or duration of the project beyond those cited above. If an RFI or RFP is used as an information-gathering method, additional time should be allotted in the schedule to allow vendors to prepare their responses. Additional effort and duration should be added to projects where business and technical requirements, or models and documentation of vendor packages, are not readily available to the team. If the required application is functionally or technically complex, the evaluation team may need more time to understand the requirements and possible solutions.

Risk Based Approach

The experiences of organizations with application packages have generally been mixed. Because application packages are chosen as a quick path to implementation, pressure can be imposed on the evaluation team for a quick evaluation and selection. In some cases, this has led to the acquisition of a package that requires significant customization or to an implementation failure. Significant customization or redevelopment of an application package can be very costly and time-consuming because the development team usually lacks knowledge and understanding of the package's technical implementation. Organizations that customize more than 50% of the package's functionality generally spend more time and money on a package implementation than on a comparable custom development project.

The level of detail, rigor, and effort applied in evaluating and selecting packages should be commensurate with the level of risk. If the required application package is small in scope, interfaces with few systems, runs on a single platform at one site, and is inexpensive, a quick evaluation process can be used. Because the risk is controllable and the cost of selecting the wrong package is minimal, a process that uses checklists and focuses on a small number of market leaders can be completed in short order. On the other hand, if the required package is large in scope, interfaces with many systems, runs on multiple platforms at sites around the world, and is expensive to buy and maintain, a formal and rigorous evaluation process should be used. Because the business exposure is significant, an approach that employs sufficient detail and rigor can prevent the organization from making a costly mistake.

The Application Solution Strategy stage has been designed to evaluate business and technical risk and to select a package evaluation approach that is appropriate given an understanding of the key risk factors.
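As a rough illustration of how key risk factors might drive the choice of evaluation approach, the following Python sketch maps a few indicators (scope, interfaces, platforms, cost) to either a quick, checklist-based evaluation or a formal, rigorous one. The factor names and thresholds are invented for illustration and are not part of this process.

```python
# Illustrative sketch only: hypothetical risk factors mapped to an evaluation approach.
from dataclasses import dataclass

@dataclass
class RiskProfile:
    business_processes: int   # breadth of business scope
    interfaces: int           # number of systems the package must interface with
    platforms: int            # number of technical platforms/sites
    five_year_cost: float     # estimated purchase plus maintenance cost

def select_evaluation_approach(risk: RiskProfile) -> str:
    """Return a hypothetical evaluation approach based on simple, invented thresholds."""
    high_risk = (
        risk.business_processes > 5
        or risk.interfaces > 3
        or risk.platforms > 1
        or risk.five_year_cost > 500_000
    )
    return "formal, rigorous evaluation" if high_risk else "quick, checklist-based evaluation"

print(select_evaluation_approach(RiskProfile(3, 1, 1, 80_000)))      # quick, checklist-based evaluation
print(select_evaluation_approach(RiskProfile(10, 8, 3, 2_000_000)))  # formal, rigorous evaluation
```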

Principles-Based Evaluation

Deciding on "what's important" early in the evaluation and selection process is key to focusing the evaluation team and conducting a successful evaluation. Consequently, this Application Solution Selection process uses a "principles-based" evaluation process to ensure that key principles are established during the first few days of the project to guide and facilitate the team's decision-making. These principles address important package issues such as:

  • The acceptable degree of package customization versus the willingness to modify business processes to accommodate the package;
  • The expected life of the package solution;
  • The tradeoff between rapid implementation and accommodation of future requirements;
  • The type of relationship that is desired with the vendor;
  • The tradeoff between selecting the "best of breed" and integrating the package with existing systems; and,
  • The acquisition of the package through a third party (e.g., VAR) versus the package developer.

The first step within the Application Solution Strategy stage determines the principles that guide the evaluation team. In subsequent steps, these principles influence the definition of requirements, evaluation criteria, and scoring and weighting algorithms. A principles-based evaluation process gives the evaluation team a set of underlying standards to focus their time and energies on what counts.

Three Filters

The Application Solution Selection Process uses a three-filter approach to screen the field of candidate vendors and packages down to a single selected vendor and package. Speed and confidence in decision-making underlie the three-filter approach. Speed comes from eliminating non-qualifiers early in the process and selecting an appropriate evaluation approach based, in part, on the number of qualifiers. Confidence comes from a detailed evaluation of performance on a small number of finalist packages.

First Filter: The "Must Have" Filter

The first filter, which occurs in the Application Solution Strategy stage, focuses on eliminating vendors that do not support "must have" or strategic requirements, or that have an obvious flaw or "show stopper". These "must have" requirements are the features that the package must support and the business factors that the vendor must meet. From these requirements, a list of knockout criteria is developed against which candidate vendors are assessed. The objective is to screen the field of candidate vendors down to a list of no more than 12 qualified vendors. The number of resulting qualified vendors is one of the factors used to determine the type of evaluation approach to be used in the Application Solution Evaluation stage.
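The Python sketch below illustrates how a knockout filter of this kind might be applied. The criteria names and vendor profiles are hypothetical; any criterion not met eliminates the candidate.

```python
# Illustrative sketch only: a "must have" (knockout) filter over candidate vendors.
knockout_criteria = [
    "supports_multi_currency",
    "runs_on_target_platform",
    "vendor_financially_viable",
]

candidates = {
    "Vendor A": {"supports_multi_currency": True,  "runs_on_target_platform": True, "vendor_financially_viable": True},
    "Vendor B": {"supports_multi_currency": False, "runs_on_target_platform": True, "vendor_financially_viable": True},
}

def qualifies(profile: dict) -> bool:
    """A candidate is eliminated if any knockout criterion is not met."""
    return all(profile.get(criterion, False) for criterion in knockout_criteria)

qualified = [name for name, profile in candidates.items() if qualifies(profile)]
print(qualified)  # ['Vendor A']
```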

Second Filter: The "What" Filter

The second filter, which occurs in the first two steps of the Application Solution Evaluation stage, focuses on eliminating vendors that do not support a set of detailed requirements. Business, technical, and vendor requirements are defined and transformed into a set of evaluation criteria that are weighted for importance based on the principles defined in the Application Solution Strategy stage. Each vendor is assessed against the evaluation criteria and given a score using the scoring and weighting algorithms. The objective is to screen the field of qualified vendors down to a list of no more than four finalists. It is critical that this final short list consists of solutions that can be readily accepted and successfully implemented by the organization.
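A minimal sketch of such a weighted scoring pass appears below. The criteria, weights, and ratings are hypothetical, and a 0-10 rating scale is assumed; the document does not prescribe a specific algorithm.

```python
# Illustrative sketch only: weighted scoring of qualified vendors against detailed criteria.
weights = {"order_entry_fit": 0.4, "platform_support": 0.3, "vendor_stability": 0.3}

ratings = {
    "Vendor A": {"order_entry_fit": 8, "platform_support": 6, "vendor_stability": 9},
    "Vendor C": {"order_entry_fit": 5, "platform_support": 9, "vendor_stability": 7},
}

def weighted_score(scores: dict) -> float:
    """Combine 0-10 ratings using the importance weights derived from the principles."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

# Keep the highest-scoring vendors as finalists (no more than four).
finalists = sorted(ratings, key=lambda v: weighted_score(ratings[v]), reverse=True)[:4]
print({vendor: round(weighted_score(ratings[vendor]), 2) for vendor in finalists})
```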

Third Filter: The "How" Filter

The final filter, which occurs in the last five steps of the Application Solution Evaluation stage, determines the winning package and vendor. The most effort should be spent evaluating a small number of packages in this third and final filter. The emphasis of this filter is on "how" each finalist implements the set of requirements. The "fit" of each package is examined, and the business and technical changes required to accommodate each package are detailed and analyzed. The performance of vendors and packages in key performance areas is assessed using test drives, site visits, benchmarking, phone surveys, or questionnaires. Every test focuses on understanding the package implementation, vendor relationship, and cost to which the organization will be committing. In the final step, the evaluation team reviews all of the evaluation information and selects the package that will be implemented.

The "Rolling Flunk"

The three-filter approach provides a structured method for eliminating vendors in three passes to identify a winning package solution for the organization. While this approach is logical and practical, the real underlying principle for the approach is to eliminate non-qualified vendors at the earliest point possible. The concept of a "rolling flunk" is to allow the evaluation team to eliminate any vendor that is found deficient at any point in the process. The team has the authority and responsibility to throw out a solution before reaching the end of any given filtering process. For example, if phone surveys with customers of a finalist (e.g., a package that made the final short list) indicate performance problems with the package and vendor's customer service, the evaluation team can eliminate the vendor from further evaluation before reaching the end of the third filter.

The advantages of the rolling flunk are time and cost savings, and a continuous refocusing of the team on vendors and packages that are clear contenders. Because each subsequent step of the filtering process is more costly and time-consuming, products or vendors that do not measure up should not be allowed to consume any more of the organization's and team's vital resources. Losers should be flunked early to give the team more time to evaluate viable solutions.
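The sketch below illustrates the rolling-flunk idea during the third filter: a vendor is dropped as soon as any assessment finds it deficient, so no further evaluation effort is spent on it. The assessment names and outcomes are hypothetical.

```python
# Illustrative sketch only: eliminate a vendor at the first failed assessment ("rolling flunk").
assessments = {
    "Vendor A": {"test_drive": "pass", "site_visit": "pass", "phone_survey": "pass"},
    "Vendor C": {"test_drive": "pass", "site_visit": "fail", "phone_survey": "not run"},
}

def survivors(results: dict) -> list:
    remaining = []
    for vendor, checks in results.items():
        for assessment, outcome in checks.items():
            if outcome == "fail":
                print(f"{vendor} flunked at {assessment}; no further evaluation effort is spent on it")
                break
        else:
            remaining.append(vendor)  # no failures found; vendor continues
    return remaining

print(survivors(assessments))  # ['Vendor A']
```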

Modeling Issues

Modeling of business and technical requirements, and of each package's implementation, is likely to occur in high-risk, high-impact scenarios. While modeling is an appropriate approach to mitigating risk and increasing the evaluation team's understanding of the problem and potential solutions, the modeling approach must be carefully selected to ensure usable results and to avoid "analysis paralysis" at a time when speed of implementation is usually important. The evaluation team should carefully consider the following three modeling issues.

Level of Abstraction

It is essential that the models of business and technical requirements, and models of the package's implementation be at the same level of abstraction. This facilitates comparison of the problem against potential solutions -- typically called package fit analysis. For example, if the package is to provide a workflow solution for a customer service function, model the "desired" workflow (i.e., the business requirement) and the workflows that are provided in each package solution at the same level of abstraction. In this situation, comparing package workflow models against high-level business models will not provide the evaluation team with the information needed to easily determine package fit.
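As a crude illustration of why matching levels of abstraction matters, the Python sketch below compares a desired workflow and each package's workflow modeled at the same level (here, simply as lists of activity names) to gauge coverage. The activity names and packages are hypothetical, and real package fit analysis would be far richer than this.

```python
# Illustrative sketch only: comparing workflows modeled at the same level of abstraction.
desired_workflow = ["log call", "classify issue", "route to specialist", "confirm resolution"]

package_workflows = {
    "Package A": ["log call", "classify issue", "assign queue", "confirm resolution"],
    "Package B": ["log call", "route to specialist"],
}

for name, workflow in package_workflows.items():
    covered = [activity for activity in desired_workflow if activity in workflow]
    print(f"{name}: covers {len(covered)}/{len(desired_workflow)} desired activities -> {covered}")
```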

Modeling Techniques

Use modeling techniques that are appropriate for modeling both the problem and the potential solutions. If the requirement is to support a set of interrelated work activities, use event dependency or workflow diagrams to model the workflow, and map business rules to each workflow activity. On the other hand, if the requirement is to support simulation or reporting activities, use object relationship models and business rules to define the data to be stored, manipulated, and presented. Or, if the user interface is a critical concern, develop user task models to define the on-line workflow for key business activities. In all cases, use an approach that is appropriate for representing the requirements to be supported by the package, and use the same techniques in modeling both the requirements and the qualified package solutions.

Level of Detail and Rigor

The level of detail and rigor applied in modeling requirements and potential solutions does not have to be uniform throughout the specified business and technical scope. An approach in which all processes are detailed at the elementary process level, and all object types and business rules are fully defined, may result in a late project. Instead, the evaluation team should be flexible and apply more detail and rigor where requirements are important -- less detail where they are not. For example, if the structure of account codes for an accounting requirement cannot be changed, it is appropriate to detail the attribute types and domains for this portion of the requirement, as well as for each potential package solution. In other cases, an understanding of the object types and key relationships may be sufficient because the requirement is not essential and the business is willing to adapt to the package's implementation. In all cases, the depth and rigor of the models of each package should parallel the depth and rigor of the business and technical requirement models. Spend time where it matters the most.

Evaluation Team Composition

The composition of the Application Solution Evaluation Team should remain relatively consistent throughout the Application Solution Strategy and Application Solution Evaluation stages. This continuity is necessary to provide consistent evaluation results in each stage of the evaluation process, and to ensure good team productivity. The evaluation team is likely to take on additional team members at the beginning of the Application Solution Evaluation stage to address specific requirements and increased workload. In addition, the project team may call upon business and technical experts to augment the team's knowledge and expertise.

The team may be organized into subteams to address specific aspects of the evaluation (e.g., business, technical, vendor, and cost). To achieve the greatest productivity, team members should focus on the same set of requirements throughout the entire selection process. This ensures a consistent understanding of requirements and rating of package capabilities by subject area, and minimizes the need for team members to learn and understand new or additional requirements.

Application Solution Implementation Process

Scope

The Application Solution Implementation process is designed to configure, customize, and deploy the application package. Although the process identifies the need for new functionality to fill gaps and changes to existing systems to enable integration with the package, the Application Solution Implementation process is not designed to support systems development or redevelopment efforts.

A small Application Solution Implementation project typically involves minor configuration and customization (e.g., setting parameters, creating a few new reports or queries), one or two new interfaces, and deployment at a limited number of sites, and can be completed within 3 months. On the other hand, a large project involves significant configuration and customization, several new interfaces, and deployment to a large number of sites, and can be completed in 6 to 9 months. Projects of larger scope (e.g., SAP, integrated packages) should be partitioned into two or more projects using a release strategy approach.

Factors that can increase the duration and level of effort include complex applications, multi-platform implementation, use of new technologies, and synchronization with other system development or redevelopment projects.

Release Strategy

The Application Solution Implementation process uses a release strategy approach for delivering package functionality to users. The application package may need to be delivered in increments over time for the following reasons:

  • The package is large or complex and cannot be assimilated by the organization in a single implementation;
  • Some of the package's features depend on the development of new systems or changes to existing systems;
  • Some of the package's features must be significantly configured or customized before they can be deployed; or,
  • The package must be configured and customized to operate on multiple platforms.

A release strategy approach determines how new functionality and changes will be packaged, sequenced, and delivered to the organization. Each numbered release consists of a packaging of functionality that is meaningful and provides value to the organization. The release may consist of a standard version and one or more site-specific variants to address unique requirements at certain sites. The sequencing of the releases is determined by business and technical priorities, dependencies between technical components, the level of effort required for configuring and customizing the package, and the level of effort required for developing new systems or making changes to existing systems. Wherever possible, concurrent development approaches should be used in order to deliver functionality to users in a more efficient manner.
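One way a release plan of this kind might be represented is sketched below in Python: each release carries a business priority and dependencies on other releases, and the ordering respects dependencies first and priority second. The release names, priorities, and dependencies are hypothetical.

```python
# Illustrative sketch only: sequencing releases by dependency and business priority.
releases = {
    "R1-core-ledger":    {"priority": 1, "depends_on": []},
    "R2-procurement":    {"priority": 2, "depends_on": ["R1-core-ledger"]},
    "R2a-plant-variant": {"priority": 3, "depends_on": ["R2-procurement"]},  # site-specific variant
}

def sequence(plan: dict) -> list:
    """Order releases so that all dependencies come first, then by business priority."""
    ordered, placed = [], set()
    while len(ordered) < len(plan):
        ready = [r for r, spec in plan.items()
                 if r not in placed and all(dep in placed for dep in spec["depends_on"])]
        next_release = min(ready, key=lambda r: plan[r]["priority"])
        ordered.append(next_release)
        placed.add(next_release)
    return ordered

print(sequence(releases))  # ['R1-core-ledger', 'R2-procurement', 'R2a-plant-variant']
```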

Conducting a Trial Run

The Application Solution Implementation process contains a step for taking a "test drive" of the package before proceeding with the implementation activities. This optional step should be performed in situations where more information is required about the functionality and operating characteristics of the package before the terms and conditions of the vendor contract are accepted. Additional testing of the package may be required because:

  • The vendor's demonstration or documentation did not satisfactorily address certain critical requirements;
  • Apparent flaws were discovered during the Application Solution Evaluation stage that require further analysis;
  • The business functions supported by the package are strategic, and the risks associated with the implementation and use of the package are high; or,
  • The costs to implement the package are not clear and more information is needed to understand them.

The parameters of the trial run are usually written into the vendor contract signed at the end of the Application Solution Selection project. The trial run will typically be timeboxed to a 30-, 60-, or 90-day test period during which the organization is allowed to install the package and try out the features that must work well for the package to be accepted. Vendor assistance during this period may also be written into the trial run agreement.

A trial run gives the organization an opportunity to "explore and discover" package capabilities in an environment similar to the target environment. Users can try out package features to understand how they apply to their business environment. Developers can try out different configuration and customization options to understand how to make required changes to the package before it is deployed. The desired end result of a trial run is a confirmation that the right package was selected.

Customization and Upgrades

The Application Solution Implementation Team should carefully assess the degree and method of customization to be applied to the application package. Too much customization, or certain types of customization methods, may lock the organization into a situation where package upgrades are expensive and time-consuming. For example, if future releases of the package do not provide a capability to roll customized features forward into the next release, some or all of the configuration and customization work will need to be repeated.

The implementation team should examine different methods for implementing each type of change that must be made to the package. Different methods for implementing changes might include setting parameters at installation time, using customization features provided in the package, or developing small custom components that can be integrated with the package. The team should carefully consider the implication of each method on functionality, usability, performance, and maintainability. Be sure to check with the vendor to determine whether the approach that has been chosen will enable changes to be easily migrated to future releases of the package.
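As a simple illustration, the Python sketch below records the method chosen for each required change and flags those the vendor has not confirmed will migrate cleanly to future releases. The change names, methods, and upgrade indicators are hypothetical.

```python
# Illustrative sketch only: track customization methods and flag upgrade risks to revisit with the vendor.
changes = [
    {"change": "local tax rules",     "method": "installation parameter",      "vendor_confirms_upgrade_path": True},
    {"change": "custom credit check", "method": "package customization feature", "vendor_confirms_upgrade_path": True},
    {"change": "plant-floor labels",  "method": "source modification",          "vendor_confirms_upgrade_path": False},
]

upgrade_risks = [c["change"] for c in changes if not c["vendor_confirms_upgrade_path"]]
print("Re-verify with the vendor before committing:", upgrade_risks)
```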

Implementation Team Composition

The composition of the Application Solution Implementation Team should remain relatively consistent throughout the Application Solution Enablement and Deployment stages. This continuity is necessary to ensure consistency in the configuration, customization, and testing of package components, and to ensure good team productivity. The implementation team is likely to take on additional team members at the beginning of the Deployment stage to address the increased workload of deploying the package at specified sites. In addition, this core team may call upon business and technical experts to augment the team's knowledge and expertise.

The team may be organized into subteams to address specific aspects of the implementation (e.g., interfaces, customization, cultural changes, and system operations). To achieve the greatest productivity, team members should focus on the same components throughout the implementation process to provide a clear understanding of accountability for delivery.
