Just as BPM (business process management) technology is markedly different from conventional approaches to application support, the methodology of BPM development is markedly different from traditional software implementation techniques. With CPI (continuous process improvement) as the core discipline of BPM, the models that drive work through the company evolve constantly. Indeed, recent studies suggest that companies fine-tune their BPM-based applications at least once a quarter (and sometimes as often as eight times per year). The point is that there is no such thing as a “finished” process; it takes multiple iterations to produce highly effective solutions. Every working BPM-based process is just a starting point for the future. Moreover, with multiple processes that could benefit from BPM-style automated support, the issue becomes how to support dozens or even hundreds of engagements per year.
As the intervals of change get shorter and shorter, companies need to develop effective methodologies to get around the business optimization cycle quickly enough.
Figure 1 depicts a way of organizing activity and ensuring that a project stays on track. The methodology involves many smaller iterations focused on a particular topic, each with “playback” sessions in which the BPM team, subject matter experts, and business managers interactively validate newly developed functionality. This approach ensures that no surprises emerge along the way, while delivering the flexibility to change as needed. There are iterations in the requirements area (discover and understand), iterations during the design phase, iterations at build and test, etc. Further, managers need to monitor the process and control the way in which it operates. Before repeating the cycle, the analyze and optimize phases involve several additional iterations as business analysts and managers experiment with alternative scenarios.
Given four or five business optimization cycles per year, each overall cycle must complete within three months. To achieve this, it is necessary to time-box the different phases of activity. Otherwise, the temptation is always there to spend more time, which encourages scope creep and increases the risk of project failure.
Every organization has a different starting point and, as a result, different needs. Some already have a defined process; others are not as well developed. Some want to emphasize automation of the process, whereas others need better traceability, visibility, and performance measurement. In any case, the first objective is to understand the process. Instead of developing a 400-page requirements document with every detail tightly specified, the organization should focus on tying down the core functionality that will deliver the most value.
There is always a need to capture the as-is process. The underlying requirement is the ability to step outside of the process and understand it fully. First, the key protagonists need models that allow them to communicate effectively with each other. Second, these models form the underlying structure to underpin the capture of baseline metrics. It is important to gather reference metrics so that the team can later prove the performance improvements.
To achieve a rich appreciation of the process, it is necessary to model the process at a high level, from a number of different, complementary perspectives. Assessing the business situation through a set of complementary modeling techniques allows people to comprehend the fundamentals of the process better.
The emphasis here is on understanding the process, not building models for transformation into code or executable process definitions. This then enables both the business analyst and the business user to step outside of the business-as-usual view and see the process differently.
Given suitable access to subject matter experts, a good rule of thumb is that this phase of activity should complete within a week or two. Although this might sound challenging, it is entirely feasible. The trick is to ensure that models are at a suitably high level. The team should always keep in mind the purpose of the modeling and the intended audience. Models should be detailed enough to drive understanding and discussion, but no more detailed than is necessary to support this aim.
A comprehensive BPM suite is necessary to underpin the target process-enabled application. The BPM suite provides a configurable platform that executes procedural models, delivering work to the relevant employee or partner (or even customer). It ensures traceability of individual cases of work and guarantees compliance. Modern BPM suites include integrated process modeling and business rules environments, integration facilities, sophisticated user interface capabilities, and powerful analytics.
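The core execution behavior described here, running a procedural model, routing each work item to a role, and keeping a traceable audit trail per case, can be sketched in miniature. This is an illustrative toy, not any particular BPM suite's API; all names (`ProcessEngine`, `Step`, the loan steps and roles) are invented for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Step:
    name: str
    role: str  # group or role the work item is routed to

@dataclass
class Case:
    case_id: str
    audit: list = field(default_factory=list)  # traceability record
    position: int = 0

class ProcessEngine:
    """Toy engine: executes a procedural model, routing each work
    item to the role named in the current step and logging every
    transition so the case history can later be audited."""
    def __init__(self, steps):
        self.steps = steps

    def advance(self, case):
        step = self.steps[case.position]
        case.audit.append((datetime.now().isoformat(), step.name, step.role))
        case.position += 1
        return step.role  # who receives the work item next

# Hypothetical three-step loan process.
loan_process = ProcessEngine([
    Step("capture-application", "sales"),
    Step("credit-check", "underwriting"),
    Step("issue-offer", "fulfilment"),
])

case = Case("C-1001")
while case.position < len(loan_process.steps):
    loan_process.advance(case)

print([entry[1] for entry in case.audit])
# → ['capture-application', 'credit-check', 'issue-offer']
```

The audit list is what gives the traceability and compliance guarantees mentioned above: every hand-off is timestamped and attributable to a role.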
When developing the BPM-enabled application itself, the BPM team will find it much easier to gain clarity if it has the deep understanding of the process provided by the previous phase. Rather than attempting to define 80 to 90 percent of the final functionality, the team should start by tying down a more modest target of around 20 to 40 percent that delivers the vast majority of the value.
The process, however, is just one area where work is required in the development and implementation phases. Focused effort is essential to ensure effective integration with third-party applications. Similarly, the user interface needs special attention, as do the metrics gathered and the mechanisms provided to managers for controlling the functionality.
Rather than trying to address all issues together, the team should focus on just one aspect of the development before moving to the next. They can create a separate subphase of work for each facet:
• Process flow. In the initial iteration, the aim is to agree on the core 20 to 40 percent of functionality that will deliver the bulk of the value. Although a fair amount of modeling was undertaken when examining the as-is model, this effort is all about creating the to-be process. And although organizational change may be considered a deployment issue, the way in which activities are assigned to groups and roles is also important.
• Integration. This subphase focuses directly on information extraction and update from/to external applications. Associated with this subphase is design of the data for the process. Again, ensuring that the data model is developed separately from the as-is model ensures that team members design and support what is necessary, rather than taking for granted what is already there. The deliverable should concentrate on proving to the user community that the necessary data can be placed on a default user interface (i.e., do not attempt to customize the user interface).
• User interface. Ensure that the screens deliver the information required by the various roles involved in the process.
• Metrics. Explore the management information deemed necessary, how this data is gathered, who should have access to it, and how it is presented. The BPM suite should capture all the necessary information, often referred to as KPIs (key performance indicators). It is worth noting that the metrics used to track process efficiency and effectiveness may differ significantly from the data used to maintain the state of the process.
• Controls. Business managers will want a way of throttling performance of the process, allowing them to control process execution. They need mechanisms that help them cater to peaks and troughs in demand, or influence the way in which business rules apply.
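One way to see why these facets separate cleanly is to sketch them as independent sections of a single application definition, so each specialist sub-team owns one section. This structure and every name in it are hypothetical, chosen only to mirror the five subphases above.

```python
# Hypothetical application definition grouping the five facets.
# Each top-level key corresponds to one subphase and one specialist team.
app_definition = {
    "process_flow": ["capture-application", "credit-check", "issue-offer"],
    "integration": {  # extraction and update from/to external applications
        "credit_bureau": {"extract": ["score"], "update": []},
        "core_banking": {"extract": ["balance"], "update": ["loan_record"]},
    },
    "user_interface": {  # information required by each role's screens
        "underwriter_screen": ["applicant", "score", "history"],
    },
    "metrics": {  # KPIs — distinct from the data that holds process state
        "cycle_time_hours": "timestamp(issue-offer) - timestamp(capture)",
        "auto_approval_rate": "approved_automatically / total",
    },
    "controls": {  # knobs managers can adjust while the process runs
        "credit_score_threshold": 680,
    },
}

print(sorted(app_definition))
```

Because the sections share no structure, the integration team can change an adapter, or a manager can move a control knob, without touching the process flow or the screens.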
It is important to separate these portions of the development, as this will allow specialist resources in the team to focus their efforts. Depending on the situation, the order of these subphases may change. For example, if extracting data from third-party applications will present the greatest difficulty, then this subphase should probably run first. Of course, in many projects, individual subphases may reiterate based on feedback from their particular experts and managers.
Moreover, each subphase allows the team to present results to the subject matter experts and managers of the business area. Supported by a shared model approach, these interactive playback sessions ensure that users can see the requested functionality as it is implemented. Because the outputs are graphical, participants can quickly step through the changes made since the last iteration.
The capabilities of the BPM suite should ensure that project team participants have direct access to all related events, rules, user interfaces, process flows, code, and analysis from the same tool set, in the same context. The product should not force application developers and analysts to switch tools or contexts to see what is going on in the process.
The capabilities of the system to support effective business monitoring and control derive from an effective design and deployment phase. This section discusses the types of capabilities delivered. Several perspectives are important, including dashboards, alert and escalation mechanisms, control loops, and personnel management.
• Dashboard-style user interfaces deliver appropriate metrics to managers and supervisors. The assumption is that managers will intervene where necessary to expedite items of work as long as they have suitable visibility on work in the system. Of course, the system itself can help facilitate this through the provision of mechanisms that enable the manager to inspect the item of work, reassign that item, or interact directly with the worker responsible. Moreover, the system can prompt individual users directly, bringing to their attention items of work that are in danger of exceeding any milestone or SLA (service-level agreement) established.
• Monitoring and control mechanisms should also enable suitably authorized managers to direct the overall operation of the process. When most vendors talk about BPM and continuous process improvement, they are discussing the larger, overall business optimization loop. They have failed to grasp the importance of the secondary optimization loop where suitably qualified business managers control the running process directly. Generally, this is a design issue. The application should have built-in capabilities that provide managers with the controls they need to throttle business performance. (See the Pulte Mortgage sidebar for an example.)
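The SLA-watching behavior described in the dashboard bullet, prompting users about work items in danger of exceeding a milestone, reduces to a simple check over the work queue. The 48-hour SLA and 80 percent warning point below are assumed values for illustration.

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=48)   # assumed service-level agreement per case
WARN_AT = 0.8               # escalate once 80% of the SLA window is gone

def items_to_escalate(work_items, now):
    """Return IDs of items that have consumed at least 80% of their
    SLA window, so a dashboard can prompt the responsible manager
    or worker before the milestone is actually breached."""
    flagged = []
    for item in work_items:
        elapsed = now - item["started"]
        if elapsed >= SLA * WARN_AT:
            flagged.append(item["id"])
    return flagged

now = datetime(2006, 3, 1, 12, 0)
queue = [
    {"id": "C-1", "started": now - timedelta(hours=40)},  # 83% of SLA used
    {"id": "C-2", "started": now - timedelta(hours=10)},  # well inside SLA
]
print(items_to_escalate(queue, now))  # → ['C-1']
```

A real suite would attach actions to the flagged items (inspect, reassign, contact the worker); the check itself is this simple comparison run continuously.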
Analysis and optimization are usually the responsibility of the business analyst or process owner. In an ad hoc fashion, these people are looking to identify the problems and suggest changes for the next release of the application. They are looking at the overall business process, its historical performance, and related business data, with the aim of identifying ways to improve performance.
Of course, the KBOs (key business objectives) should drive performance optimization. For most companies, adding more people (resources) to a process to improve performance is just not an option. The best BPM products provide mechanisms to test alternatives (other than simply adding resources at bottlenecks).
In response to this need, most vendors have focused on providing simulation tools for comparing different what-if scenarios. At its core, simulation is a statistical technique that uses probabilities to predict average activity durations, queue lengths, resource utilization, etc.
Usage of simulation should come with a few health warnings, however. Simulation is most effective when used as a way of testing assumptions. For example, if interest rates fell, leading to an increase in mortgage applications, what would the impact be on the ability of the company to provide the same level of customer service? At what point would additional resources be required? But simulation models are notoriously difficult to construct, with each linkage in the model requiring careful statistical checking. In the Pulte Mortgage example in the sidebar, that might mean checking the sensitivity between interest rates and the number of mortgage applications. Solving this problem entails gathering historical data to support this testing. The best BPM simulation tools available today extract this data from the running process model to provide the baseline information. This dramatically changes the cost/benefit ratio associated with simulation.
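The interest-rate what-if above can be sketched as a crude Monte Carlo run: random daily arrivals against a fixed team capacity, with backlog as the output. Everything here, the rates, team size, hours per case, and the simplistic arrival model, is an assumed illustration, far cruder than a real simulation tool.

```python
import random
random.seed(42)

def simulate(arrival_rate, servers, service_hours, days=30):
    """Crude what-if run for a single approval queue: roughly Poisson
    arrivals per day (approximated by coin flips), a fixed per-case
    effort, and a limited team. Returns the average end-of-day backlog."""
    capacity_per_day = servers * 8 / service_hours  # cases the team clears daily
    backlog, total = 0.0, 0.0
    for _ in range(days):
        # ~Poisson(arrival_rate) via 2*rate coin flips (binomial approximation)
        arrivals = sum(1 for _ in range(int(arrival_rate * 2))
                       if random.random() < 0.5)
        backlog = max(0.0, backlog + arrivals - capacity_per_day)
        total += backlog
    return total / days

# What-if: interest rates fall and applications surge from 20 to 30 a day,
# while the same 6 underwriters spend 2 hours per case.
base = simulate(arrival_rate=20, servers=6, service_hours=2)
surge = simulate(arrival_rate=30, servers=6, service_hours=2)
print(f"avg backlog: baseline {base:.1f}, surge {surge:.1f}")
```

The point the article makes still applies: the arrival distribution and the rate-to-volume linkage are exactly the assumptions that need checking against historical data before the numbers mean anything.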
Best-in-class BPM suites have optimization mechanisms that provide automated support to help determine the best means of process improvement. This takes simulation a step further, addressing some of its inherent difficulties. Rather than leaving it to the analyst to determine options for improvement, the system should help identify areas to consider, and it should help compare different simulated scenarios in ways that are meaningful to the business, something that is otherwise difficult.
For example, business managers are usually most interested in assessing the effectiveness of a particular product or service. It is not good enough to know the average cycle time of all loans; they want to know the cycle time of the jumbo loan (since they know that has the best margin). Or they want to assess a particular line or service by sales channel.
Alternatively, they may want to analyze performance over time, comparing some slice of the past with current results or looking into the future. For example, rather than looking at the average over the last six months, the managers might want to compare the holiday sales season with the summer sales season. They might prefer to compare the amount of business in production today with the business situation three months ago, or the same period last year.
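The slicing described in the last two paragraphs, cycle time per product rather than one overall average, and one period compared against another, is just a group-by over case history. The records below are fabricated for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical case history: (product, period, cycle_time_hours)
history = [
    ("jumbo",    "2005-Q4", 30), ("jumbo",    "2006-Q1", 26),
    ("standard", "2005-Q4", 44), ("standard", "2006-Q1", 41),
    ("jumbo",    "2006-Q1", 22),
]

def cycle_time(by, records):
    """Average cycle time sliced by product (by=0) or period (by=1),
    instead of one overall average that hides the interesting cases."""
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec[by]].append(rec[2])
    return {key: mean(vals) for key, vals in buckets.items()}

print(cycle_time(0, history))  # per product: jumbo vs. standard loans
print(cycle_time(1, history))  # per period: compare one quarter with another
```

The overall average across these five cases would mask the fact that the jumbo loan, the high-margin product, already cycles considerably faster than the standard one.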
How does an organization go about improving its ongoing BPM delivery capabilities? Clearly, experience will grow over time. But responding to the situation outlined in the beginning of this article—where the organization needs to support dozens of BPM engagements per year—has its challenges. Companies should develop a proactive strategy to manage and grow the knowledge of the BPM team, capturing insights and developing effective engagement methods.
The most common response to this challenge is to develop a BPM COE (center of excellence) or process management office. The idea is that a group of committed individuals focus on the processes of the company as they drive bottom-line profitability and performance. The team can support a number of BPM projects across the business, keeping momentum going across a broad front. They are usually responsible for developing common principles, language, frameworks, and methodologies for process development and process architecture management.
In the early stages, however, the COE can present unnecessary overhead as it typically has a much wider scope than is necessary for a pilot project. The COE concept comes into its own as the BPM program starts to address the needs of the wider organization. With more and more projects, the need increases for a coordinated and integrated approach. The COE provides a central repository for knowledge and best practices development around BPM projects.
For companies to make the most of BPM initiatives, they must first realize that BPM involves a different way of working.2 The methodology of BPM application development is vastly different from that found in even the most agile programming environment. Rather than the traditional waterfall-style implementation—where application functionality appears in large monolithic blocks—iteration and adaptation are prevalent in every phase of the development life cycle. Instead of a timeline measured in years and months, application updates roll out in a few weeks or months.
From a BPM technology perspective, it is important to identify a product that supports the entire BPM life cycle, facilitating the interaction of the various individuals involved, as well as identifying process trends and optimization options. Alongside the technology component, the vendor should provide a robust development and implementation methodology with direct support provided by the platform itself. The technology and methodology components are interdependent.
The people at Pulte Mortgage set out to change the model of customer service by becoming proactive and concluding tasks before customers would expect completion. Without visibility into the metrics of the process, however, managers found it difficult to spot opportunities for improvement. Through the implementation of an automated case-tracking application, they could identify areas where service improvements were possible. Only when process metrics were gathered was it possible to identify improvement requirements.
For example, as a result of better visibility into the process, managers could monitor the number of hours it took to push a case through to the point of offer. As the number of cases awaiting approval rose in the queue, managers realized they could influence overall performance of the process by lowering the credit-scoring threshold (the score at which the system automatically accepts mortgage applications). Lowering the threshold reduced cycle time but raised financial risk. This does not mean that managers were interested in reconfiguring business rules in real time. Rather, slider controls on the manager's dashboard enabled this sort of adjustment.
This is a business judgment, however, trading off higher risk against more rapid response to customers and hence, more business. As the managers came to understand the dynamics of these decisions, they could then start to embed that enhanced understanding into a more dynamic set of business rules that supported the decision (even to suggesting the level of automatic credit-scoring approval).
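The trade-off behind that slider reduces to a single routing decision. The function and score values below are invented for illustration; Pulte's actual rules and thresholds are not public in this article.

```python
def route(application_score, auto_approve_threshold):
    """Route a mortgage application: scores at or above the manager-set
    threshold skip manual underwriting (faster cycle time, but more
    financial risk the lower the threshold is set)."""
    if application_score >= auto_approve_threshold:
        return "auto-approve"
    return "manual-review"

# Normal load: a conservative threshold sends most cases to underwriters.
print(route(690, auto_approve_threshold=720))  # → manual-review

# Queue building up: the manager lowers the slider, trading risk for speed.
print(route(690, auto_approve_threshold=680))  # → auto-approve
```

Embedding the "enhanced understanding" the article describes would mean replacing the manually set threshold with a rule that derives it from queue depth and risk appetite.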
DEREK MIERS (firstname.lastname@example.org) is an independent industry analyst and technology strategist. He has carried out a wide range of consulting roles—facilitating board-level conversations around BPM initiatives, establishing effective BPM project and expertise centers, and helping clients develop new business models that leverage business process strategies. Clients have included financial services companies, pharmaceutical companies, telecom providers, commercial businesses, product vendors, and government organizations.
Originally published in Queue vol. 4, no. 2.