The five dysfunctions of a team #GartnerBPM

Jeff Gibson of the Table Group gave the morning keynote based on some of the concepts in his colleague Patrick Lencioni’s book, The Five Dysfunctions of a Team: A Leadership Fable.

He started with the idea that there are two requirements for a company’s success: it has to be smart (strategy, marketing, finance, technology) and it has to be healthy (minimal politics, minimal confusion, high morale, high productivity, low turnover). Although a lot of management courses focus on the smart side, the healthy side is a multiplier of the smart side, boosting success far beyond what you can achieve by being smart alone.

He then moved on to the five dysfunctions of a team:

  1. Absence of trust, specifically personal trust and exposing vulnerability to other team members. The role of the leader is to go first in order to show that it’s okay to make mistakes.
  2. Fear of conflict, which can lead to misunderstandings because people don’t speak their mind. The role of the leader is to search out conflict amongst team members, draw out the issues and wrestle with them.
  3. Lack of commitment, particularly to tough decisions. The role of the leader is to force clarity and closure on those decisions to ensure that everyone is committed to upholding them.
  4. Avoidance of accountability. The role of the leader is to confront difficult issues, such as problematic team behaviors.
  5. Inattention to results. The role of the leader is to focus on collective outcomes, not allowing a “superstar” on the team to make themselves look good to the detriment of the team result.

Usually I find these external keynotes that are unrelated to the conference subject to be so-so, but I really enjoyed this one, and could have used this advice when I was heading up a 40-person company. I’ll be checking out the book.

Advanced decisioning #GartnerBPM

I managed to get out of bed and down to the conference in time for James Taylor’s 7am presentation on advanced decisioning. If you’ve been reading here for a while, you know that I’m a big proponent of using decisioning in the context of processes, and James sums up the reasons why: it makes your processes simpler, smarter and more agile.

Simpler: If you build all of your rules and decisioning logic within your processes – essentially turning your process map into a decision tree – then your processes will very quickly become completely unreadable. Separating decisions from the process map, so that they either drive the process or are invoked at specific points within it, makes the process itself simpler.

More agile: If your decisioning isn’t in your processes, then it has probably been written in code, either in legacy systems or in new code created just to support those decisions. In other words, you’ve effectively built your own decisioning system, but probably one that’s much harder to change than decisions built in a rules management system. Furthermore, decisions typically change more frequently than processes: consider a process like insurance underwriting, where the basic flow rarely changes, but the rules applied and the decisions made at each step may change frequently due to company policy or regulatory changes. Using decision management not only allows for easier modification of the rules and decisions, it also allows these to be changed without changing the processes. This is key, since many BPMS don’t allow processes that are already in progress to be changed: that nice graphical process modeler that they show you will change the process model for instances created after that point, but won’t impact in-flight instances. If a decision management system is called at specific points in a process, it will use the version of the rules and decisions current at that point in time, not the version in effect when the process was instantiated.
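To make that last point concrete, here’s a minimal sketch in Python of decision logic extracted from the process flow – the names (Quote, UnderwritingDecisionService and so on) are my own invention for illustration, not any vendor’s API. The process step only calls the decision service, so publishing a new rule version changes behavior for in-flight instances at their next decision point without touching the process model:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Quote:
    applicant_age: int
    coverage: float

class UnderwritingDecisionService:
    """Versioned decision logic, managed separately from the process model."""

    def __init__(self) -> None:
        self._versions: List[Callable[[Quote], str]] = []

    def publish(self, rule: Callable[[Quote], str]) -> None:
        # Publishing a new rule version never touches the process definition.
        self._versions.append(rule)

    def decide(self, quote: Quote) -> str:
        # Callers always get the latest rules, so an in-flight process
        # instance picks up rule changes at its next decision point.
        return self._versions[-1](quote)

service = UnderwritingDecisionService()
service.publish(lambda q: "refer" if q.coverage > 500_000 else "approve")

# A process step just invokes the decision service; the flow itself never changes.
def underwriting_step(quote: Quote) -> str:
    return service.decide(quote)

print(underwriting_step(Quote(applicant_age=40, coverage=300_000)))  # approve

# Policy change: publish a stricter rule. The process map is untouched, and any
# instance reaching this step from now on uses the new version.
service.publish(
    lambda q: "refer" if q.coverage > 250_000 or q.applicant_age > 70 else "approve"
)
print(underwriting_step(Quote(applicant_age=40, coverage=300_000)))  # refer
```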

Smarter: This is where analytics comes into play, with knowledge about processes fed into the decisioning in order to make better decisions in an automated fashion. Having more information about your processes increases the likelihood that you can implement straight-through processes with no human intervention. This is not just about automating decisions based on some initial data: it’s using the analytics that you continue to gather about the processes to feed into those decisions in order to constantly improve them. In other words, apply analytics to make decisions smarter and make more automated decisions.
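As a toy illustration of that feedback loop – all names and numbers below are invented for the sketch, not from the presentation – consider an auto-approval threshold that is periodically recalibrated from the outcomes of past automated decisions:

```python
from typing import List

class AdaptiveApprovalDecision:
    """A decision whose parameters are tuned by analytics on past outcomes."""

    def __init__(self, limit: float = 10_000.0) -> None:
        self.limit = limit              # auto-approve claims at or below this amount
        self.outcomes: List[bool] = []  # True = an auto-approved claim turned out fine

    def decide(self, amount: float) -> str:
        return "auto-approve" if amount <= self.limit else "manual review"

    def record_outcome(self, ok: bool) -> None:
        self.outcomes.append(ok)

    def recalibrate(self) -> None:
        # Crude analytics: if recent auto-approvals were overwhelmingly fine,
        # raise the limit to automate more work; otherwise, tighten it.
        if len(self.outcomes) < 100:
            return  # not enough data to recalibrate yet
        success_rate = sum(self.outcomes) / len(self.outcomes)
        self.limit *= 1.1 if success_rate > 0.98 else 0.9
        self.outcomes.clear()
```

Each recalibration widens the straight-through path when the data supports it, which is the continuous improvement loop described above.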

To wrap up, here are James’ five core principles of decisioning:

  • Identify, separate and manage decisions
  • Use business rules to define decisions
  • Use analytics to make decisions smarter
  • No answer is static
  • Decision-making is a process

He then walked through the steps to apply advanced decisioning, starting with identifying and automating the current manual decisions in the process, then applying analytics to constantly optimize those decisions.

He closed with an action plan for moving to decisioning:

  • Identify your decisions
  • Adopt decisioning technology
  • Think about decisions and processes, and how those can be managed as separate entities

Good presentation as always – well worth getting up early.

Using BPM to survive, thrive and capitalize #GartnerBPM

Last session of the day, a panel with Jim Sinur, Elise Olding and Michele Cantara on using BPM to survive, thrive and capitalize in a turbulent economy. I realize that this session has the same title as a webinar that Cantara and Janelle Hill did a while back, and there’s a lot of repeat material from that, so I won’t bother to recapture it here. There’s a link to the webinar replay in that post, and I recommend checking it out if you weren’t here in Orlando today.

Off to the vendor showcase; that’s it for day 1 of the Gartner BPM summit.

Using a center of excellence to deliver BPM #GartnerBPM

Michelle Lagna of JPMorgan Chase, a Pegasystems customer, gave a presentation on their CoE as one of the solution provider sessions. Their CoE focuses on the use of BPM tools (primarily Pegasystems) to support their 30+ active systems. It was instrumental in allowing them to break down the departmental silos within the organization, establishing standard governance models, standardizing training and contributing to reusable assets.

The CoE supports all lines of business in planning and implementing BPM initiatives:

  • Creating and maintaining architectural standards
  • Centralizing and formalizing the housing and reuse of business-configurable assets
  • Promoting standard methodologies, tools and education

They use the Agile development methodology (and promote and educate on Agile across the organization), and believe that it is instrumental to their success by reducing time to market and aligning business and IT. They’ve made a gradual transition from waterfall to Agile in order to ease into the new methodology.

They’ve developed a standard engagement model (unfortunately depicted on the presentation slide in micro-print and low-contrast colors):

  • Operational walkthrough and end-to-end review, including identification of process improvements and ROI
  • Impact analysis review, identifying execution gaps and automated solutions, plus IT and business sizing
  • Project initiation training, including both BPM and Agile training
  • Application profile, high level use case requirements and reusable asset review
  • Project setup and design review, including identifying assets leveraged from other projects, functionality specifications and a design compliance review
  • Environment build-out, including generating a base framework
  • Bootstrap session, which equips the project team to complete use cases on their own
  • Direct capture of objectives to elaborate use cases, design specifications and traceability matrix; this is specifically assisted by the Pega product
  • Identification of reusable assets, then harvesting those assets and making them available for reuse by other projects

The CoE is heavily involved in the early phases, but by the time they’re halfway through the project, the project team is running on its own and the CoE is just checking in occasionally to make sure that things are proceeding as planned, and to help resolve any issues. They had to make some organizational changes to ensure that the CoE is engaged at the right time and that siloed solutions are avoided.

She presented some of the key benefits provided by the CoE:

  • Common class structure for reusability
  • Library of reusable assets with tools to track usage
  • Standardized engagement model, including a “Perfect Start” training and certification stage
  • Monthly educational webcast
  • Improved release planning process (which I’ve seen listed as a key benefit of a CoE at other customers that use other BPM products)
  • Allowing for faster changes to improve business agility

The CoE has been backed by senior executive sponsors within JPMC, which has been key to its acceptance. They are run (and funded) as a shared service, so there are normally no direct chargebacks to the projects unless the CoE team is required to be onsite for an extended period of time due to a rush or urgent situation. Interestingly, the CoE is not all co-located: there are five offshore development resources that handle harvesting the reusable assets, although they are managed by an onshore resource.

Great case study, and a lot of material that is of use regardless of which BPM product you’re using.

Hidden costs of unstructured processes #GartnerBPM

Elise Olding and Carol Rozwell kicked off the afternoon with a session on the hidden costs of unstructured processes: although much of the time and money in BPM efforts goes to structured processes, as much as 60% of an organization’s processes are unstructured – and probably also unmonitored, unmanaged, unknown and unruly.

Gartner defines unstructured processes as “work activities that are complex, nonroutine processes, predominantly executed by an individual or group highly dependent on the interpretation and judgment of the humans doing the work for their successful completion”, and notes that most business processes are made up of both structured and unstructured processes. Unstructured processes are costing organizations a lot of money in lost productivity, lack of compliance and other factors, and you can’t afford to ignore them. Although most processes aimed at meeting regulatory requirements are structured, unstructured processes provide a company’s unique identity and often its competitive differentiation, as well as supporting operational activities.

In order to start managing unstructured processes, you need to get some visibility into them; start by understanding the critical path through the process. This can be a bit tricky, since as you start to map out your unstructured processes, there will be some points at which the process participant just has to wing it and make their own decisions. These are, after all, knowledge workers, and it’s not possible (or desirable) to map every possible process permutation. Instead, map the structured portions of the process, then the points at which it becomes unstructured, but don’t try to overengineer what happens in the unstructured parts. The unstructured parts can be modeled in terms of the notification mechanism (how someone is notified that a piece of work requires attention), the information provided to the participant to allow them to complete the unstructured work, and how the outcome is recorded.
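As a sketch of what modeling only the edges of an unstructured step might look like in data – the structure and field names below are my own illustration, not Gartner’s – the step reduces to its notification, its inputs and its recorded outcome:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UnstructuredStep:
    """Models only the touchpoints of unstructured work, not the work itself."""
    name: str
    notification: str                # how the participant learns that work is waiting
    inputs: List[str] = field(default_factory=list)  # information handed to the participant
    outcome: Optional[str] = None    # filled in when the knowledge worker finishes

    def complete(self, outcome: str) -> None:
        # The process records only the result, not the judgment calls
        # the knowledge worker made to reach it.
        self.outcome = outcome

step = UnstructuredStep(
    name="assess complex claim",
    notification="work item pushed to senior adjuster queue",
    inputs=["claim file", "policy history", "fraud score"],
)
step.complete("approved with conditions")
```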

They presented a number of analysis techniques for getting to the heart of unstructured and folklore processes:

  • Observe work being done, and challenge tasks that don’t make sense. Keep asking “why”.
  • Use storytelling (“tell me what happens when…”) to uncover decision-making logic, methods and best practices: these types of narratives are not well-captured in standard process documentation.
  • Analyze the unstructured interactions between people (e.g., customers and CSRs) and extract the themes and patterns. Rozwell wrote a report, “Business Narratives Supplement Traditional Data Analysis”, that discusses one technique for doing this, although exactly what that technique involves wasn’t quite clear from the discussion.
  • Get clarity around roles and who is the decision-maker in any given process.

There are a variety of different areas of knowledge that you need to consider when analyzing unstructured processes, from identifying what metadata is used for collaboration, to looking at alternative analysis techniques such as mind mapping and social network analysis. Understanding collaborative technologies is also key, since unstructured processes are often collaborative in nature, and make use of the participants’ social graphs.

Their final recommendations are to keep an eye on the technologies that can support unstructured processes, but not to go overboard on monitoring and managing these processes.

Navigating the BPM Wonderland #GartnerBPM

Alan Trefler of Pegasystems gave his traditional lunch address – entertaining as always, starting with a “White Rabbit” audio clip – with an Alice in Wonderland theme of how we have to chase our business goals down whatever rabbit hole they disappear into. Continuing on the theme, he contrasted the “one pill makes you larger” end of the spectrum with monolithic applications, and the “one pill makes you small” end of point solutions, and how you need to look at something in the middle. Don’t be afraid to ask for advice (even from hookah-smoking caterpillars), watch for those delusional Mad Hatter software salespeople, and be sure to meet the needs of the Red Queen boss lady so that you don’t get your head chopped off in the process.

Trefler is a former chess champion, so it’s inevitable that he introduced an Alice-themed chess analogy when examining his recommended steps for implementing BPM:

  • Directly capture objectives, so that your BPM implementation is focused on business intents and goals.
  • Automate the programming: the computer can generate code better than human beings can write it, and much less expensively in the long run, even if off-shoring development appears to make it cheaper up front. In other words, use a system that allows for model-driven development and zero-code (or near-zero-code) deployment.
  • Automate the work wherever steps can be automated.

I love the term that he introduced: “heritage systems”, which are just legacy systems that we like a little bit better, probably because we’ve wrapped them to allow them to be more easily integrated with other systems and processes.

Deciding on process modeling tools #GartnerBPM

Bill Rosser presented a decision framework for identifying when to use BPA (business process analysis), EA (enterprise architecture) and BPM modeling tools for modeling processes: all of them can model processes, but which should be used when?

It’s first necessary to understand why you’re modeling your processes, and the requirements for the model: these could be related to quality, project validation, process implementation, a larger enterprise architecture modeling effort, and many other reasons. In the land of BPM, we tend to focus on modeling for process implementation because of the heavy focus on model-driven development in BPMS, and hence model within our BPMS, but many organizations have other process modeling needs that are not directly related to execution in a BPMS. Much of this goes back to EA modeling, where several levels of process modeling occur in order to fulfill a number of different requirements: they’re all typically in one column of the EA framework (column 2 in Zachman, hence the name of this blog), but stretch across multiple rows of the framework, such as conceptual, logical and implementation.

Different types and levels of process models are used for different purposes, and different tools may be used to create those models. He showed a very high-level business anchor model that shows business context, a conceptual process topology model, a logical process model showing tasks within swimlanes, and a process implementation model that looked very similar to the conceptual model but included more implementation details.

As I’ve said before, introspection breeds change, and Rosser pointed out that the act of process modeling reaps large benefits in process improvement since the process managers and participants can now see and understand the entire process (probably for the first time), and identify problem areas. This premise is what’s behind many process modeling initiatives within organizations: they don’t plan to build executable processes in a BPMS, but model their processes in order to understand and improve the manual processes.

Process modeling tools can come in a number of different guises: BPA tools, which are about process analysis; EA tools, which are about processes in the larger architectural context; BPM tools, which are about process execution; and process discovery tools, which are about process mining. They all model processes, but they provide very different functionality around that process model, and are used for different purposes. The key problem is that there’s a lot of overlap between BPA, EA and BPM process modeling tools, making it more difficult to pick the right kind of tool for the job. EA tools often have the widest scope of modeling and analysis capabilities, but don’t do execution and tend to be more complex to use.

He finished by matching up process modeling tools with BPM maturity levels:

  • Level 1, acknowledging operational inefficiencies: simple process drawing tools, such as Visio
  • Level 2, process aware: BPA, EA and process discovery tools for consistent process analysis and definition of process measurement
  • Levels 3 and 4, process control and automation: BPMS and BAM/BI tools for execution, control, monitoring and analysis of processes
  • Levels 5 and 6, agile business structure: simulation and integrated value analysis tools for closed-loop connectivity of process outcomes to operational and strategic outcomes

He advocates using the simplest tools possible at first, creating some models and learning from the experience, then evaluating more advanced tools that cover more of the enterprise’s process modeling requirements. He also points out that you don’t have to wait until you’re at maturity level 3 to start using a BPMS; you just don’t have to use all the functionality up front.

Patterns for Business Process Implementations #GartnerBPM

Benoit Lheureux from Gartner’s Infrastructure and Architecture group gave a presentation on process implementation patterns. I think that he sees BPM as just part of SOA, and presents it as such, but I’m willing to give him a pass on that.

He discussed five styles of flow management in SOA:

  1. Microflows: fine-grained services implemented via flows amongst software components. This is a process from a software development standpoint, not a business-level process: probably 3GL code snippets assembled into what we old-timers might refer to as a “subroutine”. 🙂
  2. Service composition: coarse-grained services implemented by assembling fine-grained flows (microflows). This may be done with a BPMS tool, but is low-level service composition rather than business processes.
  3. Straight-through process: automating business processes involving multiple services across systems, but without human intervention.
  4. Workflow: pretty much the same as STP, but with human intervention at points in the process.
  5. Semi-structured processes: a combination of structured processes with unstructured activities or collaboration.

He has some good strategic planning assumptions based on these five styles, such as the prediction that 75% of companies will use at least three different products to implement at least three different styles of flows. His primary focus, however, is on B2B: how internal processes connect to multi-enterprise processes, and the ultimate goal of shared process execution across enterprises. This led to the four B2B flow management styles:

  1. Blind document/transaction exchange: loosely-coupled, with each partner managing their own internal processes, and no visibility outside their own processes.
  2. Intelligent document/transaction exchange: visibility across the shared process to provide a shared version of the truth, such as a BAM dashboard that provides an end-to-end view of an order-to-cash process across enterprises. Although this isn’t that popular yet, it is providing significant benefits for companies that are implementing it, and Lheureux estimates that 50% of B2B relationships will include this by 2013.
  3. Multi-enterprise applications: shared execution of a process that spans the enterprises, such as vendor-managed inventory. This may be hosted by one of the partners, or may be hosted by a third-party service provider.
  4. Multi-enterprise BPMS and rules: centralized processes and rules, such as shared compliance management on a shared process. By 2013, he predicts that at least 40% of new multi-enterprise integration projects will leverage BPMS technology.

He showed a chart that I’ve seen at earlier conferences on identifying process characteristics: classifying your processes as case management, form-driven workflow, content collaboration, multiparty transactional workflow, participant-driven workflow, or optimization of network relationships, based on the unit of work, process duration, degree of expertise required, exception rate, and the critical milestones that progress work. Then, based on specific process characteristics such as complexity and changeability, consider when to use BPMS technology rather than code.

The final recommendations: don’t try to use the same tool to handle every type of process implementation, but be aware of which ones can be best handled by a BPMS (and by different types of BPMS) and which are best handled in code.

BPM in Times of Rapid Change #GartnerBPM

For the next couple of days, I’m at the Gartner BPM Summit in Orlando. Jim Sinur and Janelle Hill gave the opening keynote this morning on BPM in times of rapid change, starting with a view of the global economy: basically, it’s down this year, although not as bad as expected, and the leading economic indicators are starting to trend up.

Gartner did a survey of CEOs in late 2008, and found that their top priority is shifting back from cutting operating costs to increasing revenues, although only by a slim margin. The resulting message: the time to return to business growth is now, and leveraging BPM to assist growth can provide a first-mover advantage if the economy does trend up in 2010 as predicted. BPM still assists with the restructuring of operations (including mergers and acquisitions) and the cost-cutting that go along with a down economy, so you might as well leverage what you’re already using to cut costs, and start looking forward and repositioning for growth. In many cases (in my experience), improving business processes using BPM reduces the costs of the specific processes, which can translate either to reduced operational costs through reduced headcount, or to increased revenues due to the increased capacity of the process to handle new business: these are just two sides of the same process improvement coin.

Going into 2010, most large enterprises have already completed their cutbacks – reduced headcounts, reduced infrastructure, renegotiated contracts and eliminated redundant technologies – but their budgets are going to be pretty flat. If you already have a BPMS in your organization, then this might mean some incremental expansion, but if you don’t, you need to look at how to justify the technology acquisition. Fortunately, that’s getting easier as the capabilities of the BPMS products expand: consider the value of process modeling (reduced redundancy and better use of people in the process) as well as process and application orchestration (automating the linkages between many existing applications) and composite application development environments (bringing together many applications into a single user view).

Focus on improving processes that defend revenue and cash without impacting customer experience, such as order-to-cash, sales processes, and customer service. Depending on your industry, this could also be the time to take some risks in order to gain that first-mover advantage: reconsider institutionalized behaviors and what you might think of as best practices, and see if there’s an innovative way to improve processes that provide a competitive edge. There should be no processes that are immune to change: challenge the status quo. I see this all the time with how companies are embracing social media in addressing customer relationships: the ones that are successful at it are those that throw away all the old ideas about how companies communicate and interact with their customers. These customer-facing processes are no longer about executing transactions, they’re about coordinating social interactions and developing social relationships.

The hot button these days is unstructured processes (which I’m sure that we’ll hear a lot more about this week), and how some new BPMS functionality allows for dynamic collaboration instead of, or within the context of, a structured process. This provides methods for gaining visibility into processes that might exist now only in email or other ad hoc methods, and likely aren’t managed well in their current state.

It’s not good enough, however, to use old-style BPMS/workflow products: you need to be considering products that have model-driven development, composite application development, process discovery and optimization, and customized dashboards for different roles and personas within a process. Otherwise, you’ll just be stuck back in the same old waterfall development methodology, and won’t achieve a lot of benefit from BPM. Interestingly, Sinur and Hill highlighted three specific products to show examples of what they consider BPMS innovation: Vitria’s composite application development, Pallas Athena’s process discovery and simulation, and Global 360’s persona-based user interfaces.

In the recession of the 1980s, business process reengineering was a high-profile, strategic activity with top executives involved; as the recession eased, the executives’ interest in BPR waned. The same cycle will repeat now: executives are very interested right now in process improvement and BPM, but that’s not going to last once the economy starts to recover, so you may want to take advantage of their interest and get something going now.

You can track the Twitter backchannel for the Gartner BPM summit here.

Gartner webinar on using BPM to survive, thrive and capitalize

Michele Cantara and Janelle Hill hosted a webinar this morning, which will be repeated at 11am ET (I was on the 8am ET version) – not sure if that will be just the recording of this morning’s session, or if they’ll do it all over again.

Cantara started by talking about the sorry state of the economy, complete with a picture of an ax-wielding executioner, and how many companies are laying off staff in an attempt to balance their budgets. Their premise is that BPM can turn the ax-man into a surgeon: you’ll still have cuts, but they’re more precise and less likely to damage the core of your organization. Pretty grim start, regardless.

They showed some quotes from customers, such as “the current economic climate is BPM nirvana” and “BPM is not a luxury”, pointing out that companies are recognizing that BPM can provide the means to do business more efficiently to survive the downturn, and even to grow and transform the organization by being able to outperform their competition. In other words, if a bear (market) is chasing you, you don’t have to outrun the bear, you only have to outrun the person running beside you.

Hill then took over to discuss some case studies of companies using BPM to avoid costs and increase the bottom line in order to survive the downturn. These are the types of business cases, centered on visibility and control over processes, typically used to justify implementing a BPMS within conservative organizations, although I found one interesting: a financial services company used process modelling to prove business cases, with the result that 33% of projects were not funded because they couldn’t prove their business case. Effectively, this provided a more data-driven approach to setting project funding priorities than the usual emotional and political decision-making, and did so through process modelling rather than automation using a BPMS.

There can be challenges to implementing BPM (as we all know so well), so she recommends a few things to ensure that your BPM efforts are successful: internal communication and education to address the cultural and political issues; establishing a center of excellence; and implementing some quick wins to give some street cred to BPM within your organization.

Cantara came back to discuss growth opportunities, rather than just survival: for organizations that are in reasonably good shape in spite of the economy, BPM can allow them to grow and gain relative market share if their competition is not able to do the same. One example was a hospital that increased surgical capacity by 20%, simply by manually modelling their processes and fixing the gaps and redundancies – like the earlier case of using modelling to set funding priorities, this project wasn’t about deploying a BPMS and automating processes, but just having a better understanding of their current processes so that they can optimize them.

In some cases, cost savings and growth opportunities are just two sides of the same coin, like a pharmaceutical company that used a BPMS to optimize their clinical trials and grant payments processes: this lowered the cost per process by reducing the resources required for each, and the increased capacity in turn allowed them to handle 2.5x more projects than before. A weaker company would have just used the cost savings to cut headcount and resource usage, but for a company in a stable financial position, the same savings allow for revenue growth without headcount increases.

In fact, rather than two sides of a coin, cost savings and growth opportunities could be considered two points on a spectrum of benefits. If you push further along the spectrum, as Hill returned to tell us, you start to approach business transformation, where companies gain market share by offering completely new processes that were identified or facilitated by BPM, such as a rail transport company that leveraged RFID-driven BPM to avoid derailments through early detection of overheating problems on the rail cars.

Hill finished up by reinforcing that BPM is a management discipline, not just technology, as shown by a few of their case studies that had nothing to do with automating processes with a BPMS, but really were about process modelling and optimization – the key is to tie it to corporate performance and continuous improvement, not view BPM as a one-off project. A center of excellence (or competency center, as Gartner calls it) is a necessity, as are explicit process models and metrics that can be shared between business and IT.

If you miss the later broadcast today, Gartner provides their webinars for replay. Worth the time to watch it.