Can We Achieve BPMS Pricing Transparency?

I saw – and responded to – this question on Quora today:

What are the license fees / costs for the top BPM solutions in the market? (Pegasystems, IBM (Lombardi), Appian, Oracle, etc.)

I am looking to do a benchmark of the top BPM (Business Process Management) software solutions on the market. It’s well known that it’s always hard to get a clear picture of the licensing prices that top software vendors offer on their solutions.

My response:

Good luck with getting public information on that — like most enterprise software vendors, BPMS vendors are very reluctant to release this sort of information. Typically, they sell a package made up of software, services, maintenance and possibly even future growth, then apply discounts based on how close they are to the end of their sales quarter/year and other things that are unimportant to the customer but unfortunately used to drive sales.

You might have better luck finding prices for those who have a cloud version, since that tends to be based just on a per user/per month figure rather than a mysterious formula. However, I notice that many of them, stuck in their old ways of not disclosing any prices, don’t even publish those but make you request a quote.

The lack of transparency in pricing in enterprise software in general, and BPMS in particular, is not a good thing for customers.

I’ve helped a number of end-customer organizations with their BPMS purchasing decisions over the years, although I’m typically focused on how well the functionality fits their business requirements rather than how well the price fits their budget. At some point, however, the customer always asks “How much does it cost?”, and my answer is almost always “I have no idea”. I might know how much it cost for some other customer, but the pricing calculations for a particular customer are usually so completely opaque that they can’t be generalized and – this is the killer – the vendor considers this confidential information that cannot be disclosed to other customers. Even cloud-based pricing, which should be publicly advertised on the vendors’ websites in easy-to-understand per-user-per-month terms, is sadly missing from some of the key cloud BPMS vendors.
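
To make the contrast concrete, here’s a back-of-the-envelope comparison in Python; every number in it is invented for illustration, not any vendor’s actual pricing, but it shows why a published per-user-per-month figure is so much easier to evaluate than the typical bundled deal.

```python
# Rough cost comparison: opaque perpetual-license bundle vs. published
# per-user-per-month cloud pricing. All figures are hypothetical.

def perpetual_deal(license_fee, services, maintenance_rate, years, discount=0.0):
    """Typical on-premise bundle: license + services up front (less the
    end-of-quarter discount), plus annual maintenance as a percentage
    of the undiscounted license fee."""
    upfront = (license_fee + services) * (1 - discount)
    return upfront + license_fee * maintenance_rate * years

def cloud_deal(users, per_user_per_month, years):
    """Transparent cloud pricing: users x rate x months. No mystery."""
    return users * per_user_per_month * 12 * years

# 200 users over 3 years, with an invented 15% end-of-quarter discount
print(perpetual_deal(500000, 150000, 0.20, years=3, discount=0.15))  # 852500.0
print(cloud_deal(200, 60.0, years=3))                                # 432000.0
```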

Remember the bad old days of buying a car, when you had no idea how much it cost when you walked into the showroom, and had to go through some weird pseudo-negotiation between the salesperson and his manager, where they would throw in the free floor mats if you did your financing with them, give you an extra discount if it was within a week of the end of their sales quarter, or bait-and-switch you into a more expensive model? Enterprise software has always felt a bit like that to me, and BPMS pricing and sales tactics sadly fall into that same category, at least for many of the major vendors.

Now, remember what happened when people started sharing car prices online? Soon, everyone knew the dealer price on a car in advance, and were in a much better position to negotiate a fair price, rather than have the salesperson negotiate what they think you would pay. It would be good if the BPMS vendors could get ahead of that trend, and start being a bit more transparent about their pricing, starting with cloud pricing.

Progress Analyst Day: Industry and Product View

Rick Reidy, Progress CEO, opened their one-day analyst event in New York by talking about operational responsiveness: how your enterprise needs to be able to respond to events that happen that are outside your control. You can’t control crises, but you can control your organization’s response to those crises. Supporting operational responsiveness are four technology trends – cloud, mobile, social media and collaboration – with the key being to extend the use of technology to the business user. If you remember Progress’ roots in 4GL development, this isn’t exactly a new idea to them; 4GLs were supposed to be for business users, although it didn’t really work out that way. Today’s tools have a better chance at achieving that goal.

Today, they’re announcing the upcoming releases of Responsive Process Management Suite 2.0 and Control Tower 2.0, plus their cloud platform, Arcade. Interestingly, much like IBM’s Lombardi acquisition turned BPM into the biggest story at this year’s Impact conference, Progress’ Savvion acquisition is having the same effect here: RPM is the top-line story, albeit supported by event processing. It’s not that the entire product suite is the former Savvion BPMS, but rather that BPM is providing the face for the product.

Reidy turned the stage over to “the two Johns” (his words, not mine): Dr. John Bates, CTO, and John Goodson, Chief Product Officer. Bates dug further into the ideas of operational responsiveness, and how the combination of analytics, event sense and respond, and process management help to achieve that. As he put it, BPM alone won’t achieve the responsive business; businesses are event-driven. They’re really trying to define a new “responsive process management” (RPM) market, at the overlap between BPM, business event processing, business transaction management, and business intelligence and analytics. Cue Venn diagram, with RPM at the intersection, then fade to another Venn diagram between custom applications and packaged applications, again with RPM at the intersection. Their estimates put the current market at $2.5B, with a rise to $6.5B by 2014.

Bates talked about the value of BPM, and how that’s often not enough because businesses are very event-driven: events flow in and out of your business every day – from applications, devices and external feeds – and how you respond to them can define your competitive advantage. Patterns in the relationships between events can identify threats and opportunities, and are especially important when those events are traditionally held in separate silos that typically don’t interact. He gave a great example around the FAA fines for airlines who hold passengers captive on planes on the ground for more than 3 hours: by looking at the events related to crew, maintenance, weather and flight operations, it’s possible to anticipate and avoid those situations, and therefore the fines (and bad press) that go along with them. You don’t need to rework existing legacy systems in order to have this sort of operational responsiveness: automated agents trap the events generated by existing systems, which can then be analyzed and correlated, and processes kicked off in a BPMS to respond to specific patterns.
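
Here’s a minimal sketch of that sense-and-respond pattern, using his tarmac-delay example; the event shapes, thresholds and the start_process() hook are my own invention for illustration, not Progress’s Apama or Savvion APIs.

```python
# Correlate events from separate silos (flight ops, weather, crew) and
# kick off a response process before the 3-hour tarmac limit is hit.
# Event fields and the start_process() hook are hypothetical.

from datetime import datetime, timedelta

DELAY_LIMIT = timedelta(hours=3)
WARNING_MARGIN = timedelta(minutes=45)

def start_process(name, **context):
    # Stand-in for handing off to a BPMS; a real agent would call its API.
    print("Started process %s with %s" % (name, context))

def on_event(state, event):
    """Fold each incoming event into per-flight state, then test the pattern."""
    flight = state.setdefault(event["flight"], {})
    flight[event["type"]] = event

    boarded = flight.get("doors_closed")
    if boarded and "wheels_up" not in flight:
        waiting = event["ts"] - boarded["ts"]
        weather_hold = "weather_hold" in flight
        # Respond while there is still time to act, not after the fine.
        if waiting >= DELAY_LIMIT - WARNING_MARGIN or (weather_hold and waiting >= timedelta(hours=1)):
            start_process("tarmac_delay_response", flight=event["flight"], waiting=str(waiting))

state = {}
t0 = datetime(2011, 4, 27, 14, 0)
for ev in [
    {"flight": "AB123", "type": "doors_closed", "ts": t0},
    {"flight": "AB123", "type": "weather_hold", "ts": t0 + timedelta(minutes=30)},
    {"flight": "AB123", "type": "position_update", "ts": t0 + timedelta(minutes=70)},
]:
    on_event(state, ev)
```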

Progress presents all this through their Control Tower, which brings together a business view of analytics and visualization, event sense and respond, and process management. I’m sure that I’ve written about this previously and would love to link to it, but the wifi in here is so crap that I can’t get a solid connection, so I can’t look up what I’ve written on my own blog. #fail

Goodson took over to discuss the product in more detail, showing how Savvion, Actional, Apama and other components make up their RPM suite and the event management layer below that. Control Tower is a new product (or was new in 1.0, last year) that overlays all of this, puts a consistent business-facing interface on it, and provides collaborative process design and other social media features. We saw a pre-recorded demo of Control Tower, showing how a dashboard can be created quickly by dragging analytics widgets onto the canvas. The key thing is that the widgets are pre-wired to specific analytics and processes, allowing drill-downs into the details and even into the process models. Process changes and simulations can be done interactively.

As the wifi flickers to life occasionally, it’s interesting to see the Twitter stream for the event (obviously being updated by people with their own connectivity solutions): Anne Thomas Manes says “Claims to be unique in the ‘responsiveness’ market, but the marketing story sounds very much like Tibco’s story”. Mike Gualtieri thinks that “Control Tower…looks good, but it would be cool to hold up an iPad and pass it into the audience”. Personally, I’m pretty much over the iPad gee-whiz demo phase.

We came back after the morning break to a continuation of the John and John show, with Bates initially responding to some of the tweets that had happened during the earlier session, then discussing their solution accelerators that include business processes, rules, analytics, alerts, interceptors and adapters. They have created accelerators for several verticals including capital markets front office, and communications and media order management, in part because of their history with those verticals and in part because of the suitability of RPM to those applications. Savvion had been going down this road long before the acquisition, and this makes a lot of sense for Progress in terms of competitive differentiation: a combination of industry-specific foundation services and adapters that are mostly productized (hence supported and maintained like product), and customizable solution accelerators that rest on top of that foundation. This makes them far more useful than the type of templates that are offered by other vendors, which are not upgradable after they’re customized; although not confirmed from the stage, my assumption is that the customization (and hence forking from the Progress-maintained code base) all happens in a thin layer at the top, not in the bulk of the foundation services.

They’re currently shipping six different accelerators, and are adding several more this year. These range across industry verticals including banking, capital markets, communications and media, insurance, supply chain, and travel and leisure. They’ve worked with partners and customers to develop these, as well as creating some internal industry experience. We saw a couple of canned demos, although it’s impossible to tell from this just how much effort is required to fit this to any particular company’s business. As part of the demos, there were some bits where a business user updated the event handling and created a new rule; I don’t think that this will be done by business users, although certainly a trained business analyst could handle it.

The solution accelerators form a big part of their go-to-market strategy, and they see these as taking share away from packaged applications; the accelerators, along with their focus on responsive process management, are positioned as differentiators for Progress. They haven’t forgotten their OpenEdge agile development environment, however: they’re announcing OpenEdge BPM to bring that development platform to BPM applications. I don’t know enough about OpenEdge to understand the implications here, but it will be interesting to see what synergies are possible as they bring together the entire Progress product suite.

To finish the industry and product section of the day, we heard about their cloud strategy, focused on applications in the cloud (rather than just infrastructure or a platform): creating vertical ecosystems for their various industry foci, incorporating both Progress solutions and partners to create composite solutions based on RPM, with Control Tower for end-to-end visibility and improvement. Progress Arcade is their cloud application platform for allowing these vertical ecosystems to be easily created and deployed across a variety of public and private cloud environments. It reminds me a bit of TIBCO’s Silver BPM environment, where you can do all the provisioning and setup right from their environment rather than having to hop around between different configuration tools. They stated that this is targeted at small and medium businesses who want to be able to leverage technology to be competitive: this is definitely an echo of a conversation that I had about BPM in the cloud with Neil Ward-Dutton earlier this morning, where I stated that most of the growth would be in SMB since this is the only way that most of them can afford to consider this technology.

Progress does a combo analyst day for both industry and financial analysts; this morning was more for the industry analysts while this afternoon is more for the financial analysts, although we’re all invited to be here all day. Since I don’t cover a lot of the financial side, I likely won’t be writing a lot about it, although I may be tweeting since the wifi seems to be a bit better behaved now.

BPM Success at BlueCross BlueShield of Tennessee

Rodney Woods of Tennessee BCBS started out talking about their 18-month history with Pegasystems SmartBPM by stating that you would have to pry Pega out of his cold, dead hands to get it away from him.

His laws of BPM success:

  1. The most important activity in business is improvement: improvement ensures competitiveness; your job is to drive improvement; if you improve the right things, in the right sequence, your business will take care of itself.
  2. Setbacks are not failures; failure is staying the same. Success requires setbacks; winning daily firefights is not progress.

There are four areas of improvement to consider: profit, product, process and people. His key is to make these sorts of innovation second nature, so that they occur routinely within your organization.

He had some good points about identifying the right BPM project, including:

  • Make sure that it’s related to a key strategic business issue: e.g., not just process efficiency, but tied to more effective customer service
  • Get customer and stakeholder input on the issue
  • State the problem as a threat or need, not as a solution
  • Define the process owner and key stakeholders
  • Focus on the process that is most critical and/or contributes the most

Most of his comments were about organizational issues, not technical issues: strategy, reporting relationships, continuous improvement, and executive support. Many of these were not specific to BPM projects, but applicable to any potentially transformational business-technology project. In fact, except for his initial comment, he didn’t really talk about their Pega solution at all; instead, he offered lots of great advice regardless of your technology selection.

That’s it for me at the Gartner BPM summit 2011 in Baltimore; there’s vendor hospitality suites tonight and a half-day of sessions tomorrow, but I’m headed home after a week on the road.

The Great Case Management Debate

With a title like that, how could I miss this session? Toby Bell (ECM), Kimberly Harris-Ferrante (insurance vertical) and Janelle Hill (BPM) took the stage for what was really a live research session rather than a debate. Is it a process pattern covered by BPM? Is it functionality within ECM? Is it an industry-specific vertical application? Gartner is still evolving their definition of case management (as are many people), and currently publishes the following definition:

Case management is the optimization of long-lived collaborative processes that require secure coordination of knowledge, content, correspondence and human resources and require adherence to corporate and regulatory policies/rules to achieve decisions about rights, entitlements or settlements.

The path of execution cannot completely be predefined; human judgment and external events and interactions will alter the flow.

Harris-Ferrante said that we need to first create industry-specific definitions or examples of what a case is, then this definition can be presented in that context in order to make sense.

Bell made the distinction between content-triggered automation (e.g., paper invoice scanning and processing), collaborative content-rich processes (e.g., specific projects such as construction), and case management: there’s a bit of a spectrum here, based on a variety of factors including cost, complexity, people involved and time to completion. Case management is distinguished from the others by (human) decisions supported by information: Hill felt that this decision-support nature of case management is a defining feature. Harris-Ferrante talked about the cost and risk factors: case management is used in situations where you have compliance requirements where you need to be able to show how and why you made a particular decision. She also pointed out that rules-based automated decisioning is really standard BPM, whereas rules-supported human decisioning falls into case management.

They showed a slide that talked about a continuum of business process styles, ranging from unstructured to structured; looks vaguely familiar. 😉 Okay, they use “continuum” rather than “spectrum”, have five instead of four categories, and put structured on the right instead of the left, but I am a bit flattered. Their continuum includes unstructured, content collaboration, event driven, decision intensive, and structured activities; they went on to discuss how case management is the most common example of an unstructured process style. I found that wording interesting, and aligned with my ideas: case management is a process style, not something completely different from process. Business process management, in its most generic form, doesn’t mean structured process management, although that’s how some people choose to define it.

Looking at the issue of products, they showed a slide that looked at overlaps in product spaces, and put BPM in the structured process/data quadrant, with case management far off in the opposite quadrant. As Hill pointed out, many of the BPM vendors are extending their capabilities to include case management functionality; Bell stated that this might fit better into the ECM space, but Hill countered (the first real bit of debate) that ECM vendors only think about how changes in content impact the case, which misses all of the rules and events that might impact the case and its outcome. She sees case management being added to ECM as just a way that the relatively small market (really just four or five key vendors) is trying to rejuvenate itself, whereas the case management advances from BPM vendors are much more about bringing the broad range of functionality within a BPMS – including rules and analytics – to unstructured processes.

Hill stated that Gartner doesn’t have an MQ for case management because there are so many different styles of case management: content-heavy, decision-heavy, and industry-specific packaged solutions. Besides, that way they could sell three reports instead of one. Not that they would think that way. Harris-Ferrante discussed the challenges to case management as an industry application, including the lack of shared definitions of both cases and case management, and Bell stated that buyers just don’t understand what case management is, and vendors are rejigging the definition to suit the customer context, so aren’t really helping in this regard.

In spite of stating that they don’t have a case management MQ, they did finish up with a slide showing the critical capabilities that customers are asking for in case management, such as a balance of content, collaboration and process services, and a highly configurable case-based user interface. They lay these out against four styles of case management – collaborative forms-based case management, knowledge workers collaborating on internal content, regulated customer-facing file folders and data, and costly processes initiated by customers – and indicate how important each of the factors is for each style. I definitely see the beginnings of an MQ (or four) here. They did state that they would be issuing a research report on the great case management debate; I’ll likely be giving my take on this topic later this year as the industry track chair at the academic BPM 2011 conference.

It’s clear that the definition of case management needs to firm up a bit. As I asked in a tweet during the session: case management: is it a floor wax or a dessert topping? As any old Saturday Night Live fan knows, it’s both, and that could be part of the problem.

Selecting a BPMS

Janelle Hill of Gartner gave a short presentation on selecting a BPMS. Some of her points:

  • The coolest BPMS may not be appropriate. Take advantage of the model-driven development environment that is appropriate for your business people rather than just what’s the most fun for the developers. A typical feature-function evaluation may not be the best way to go about it, since the functionality can vary widely while providing the same business capability.
  • A BPMS is a suite of technologies for supporting the entire lifecycle of process improvement: discovery, modeling, execution, monitoring and optimization. It’s a platform that includes both design-time and runtime. She showed the classic Gartner “gears” diagram showing all the components in a BPMS, and pointed out that you probably don’t need to do a deep dive into some of the components such as business rules, since that’s typically not the deciding factor when selecting a BPMS. A BPMS is a composition environment rather than a full development environment, where the components are used together to graphically assemble pre-existing building blocks from outside the BPMS together with some functionality built within the BPMS to create a process application. As a composition environment, the registry and repository are important for being able to locate and reuse assets, whether created inside or external to the BPMS.
  • A BPMS is not the same as a SOA suite: the latter is used to create services, while the former consumes those services at a higher level and also provides user interaction. As I’ve said (usually in front of my service-oriented friends), a BPMS provides the reason that the SOA layer exists.
  • A BPMS provides visibility, adaptability and accountability particularly well, so you should be considering how a BPMS can help you with these business capabilities.
  • If the business (or a combination of business and IT) needs to be able to manage process change, or processes change frequently, then a BPMS is a good fit. If process changes are purely under the control of IT and the processes change infrequently, then more traditional development tools (or an ERP system) can be considered. She talked about frequently changing processes as being served by systems that are built to change, whereas those with less frequently changing processes are served by systems built to last, but pointed out that “built to last” often translates to brittle systems that end up requiring a lot of workarounds or expensive changes.
  • She presented Gartner’s top four BPMS use cases: a specific process-based solution, continuous process improvement, redesign for a process-based SOA, and business transformation. Their latest MQ on BPMS has more information on each of these use cases; if you’re not a Gartner customer, it’s available through the websites of many of the leading BPMS vendors.

She then moved into some specific evaluation criteria:

  • Know your dominant process patterns: straight-through, long-running with human involvement, dynamically changing process flows, or collaboration within processes. She categorized these as composite-heavy, workflow-heavy, dynamic-composite-heavy and dynamic-collaborative-heavy, and showed some of the tools that they provide for helping to compare products against these patterns. She stated that you might end up with three different BPMS to match your specific project needs, something that I don’t completely agree with, depending on the size of your organization.
  • Don’t pick a BPMS because it’s a “safe” vendor or enterprise standard, or because of price, or because the developers like it.
  • Do pick a BPMS because it enables business-IT collaboration, because its capabilities match the needs of a defined process, it supports the level of change that you require, and it interoperates well with your other assets.
  • Do an onsite proof of concept (POC), 2-3 days per vendor where your people work side-by-side with the vendor, rather than relying on a prepared demo or proposal. She had a lot of great points here that line up well with what I recommend to my clients; this is really necessary in order to get a true picture of what’s required to build and change a process application.
  • Check for system scalability through reference checks, since you can’t do this during the POC.

She ended with some recommendations that summarize all of this: understand your requirements for change to determine if you need a BPMS; understand your resource interaction patterns to define the features most needed in a BPMS; ensure that your subject matter experts can use the tools; and have a POC to evaluate the authoring environment and the ease of creating process applications.

BPM and ERP at AmerisourceBergen

Gartner BPM always includes sessions for the vendor sponsors, and most of them are smart enough to put one of their customers on stage for those presentations. This afternoon, I listened to Manoj Kumar of AmerisourceBergen, a Metastorm (OpenText) customer, discuss how they used BPM as an alternative to customizing their SAP system, as well as to streamline and improve their processes, and enforce compliance. He went through how they built their business case: demonstrating the BPM tool, surveying departments on their business processes and how they might benefit from BPM, and some analysis to wrap it all up. He also covered the business and IT drivers for creating a BPM center of excellence, with a focus on alignment, shared resources and reusability.

Building the execution team was key; with a model-driven tool, he didn’t really want “hard-core developers”, or even people who had used the tool before, but rather those who could adapt quickly to new environments and use model-driven concepts to drive agile development. Having a focus on quick wins was important, rather than getting bogged down in a long development cycle when it’s not necessary.

They also had considerations about their server infrastructure; since they were using BPM across a wide variety of decentralized and non-integrated groups, they decided on separate virtual machines that could be taken down without impacting anything beyond the specific departmental process. This seems to indicate that they didn’t do much end-to-end work, but focused on departmental solutions; otherwise, I would have expected more integration and the requirement for shared process engines. When he showed his process stats – 200 different processes across 3000 users – it seemed to reinforce my assumption, although they are doing some end-to-end processes such as Procure To Pay.

He strongly encourages taking advantage of the BPM tool for what it does best, including change management for processes. They’ve obviously done a good job of that, since they’re managing their entire BPM program with 4 people on the implementation team. He recommends not allowing developers to write any code until you’ve prototyped what you can in the BPM tool, or else their tendency will be just to rewrite the BPMS functionality themselves; I am 100% behind this, since I see this happening on many BPM implementation projects and it’s a constant battle.

With an SAP/BPM integration like they’ve done at AmerisourceBergen, you need to be careful that you don’t get too carried away in the BPM tool and rebuild functionality that’s already in SAP (or whatever your ERP system is), but using BPM as a tool for orchestrating atomic ERP functions makes a lot of sense in terms of agility and visibility, and also provides the opportunity to build processes that just don’t exist in the ERP system.
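
As a sketch of what orchestrating atomic ERP functions might look like, here’s a toy Procure To Pay orchestration in Python; the function names and payloads are hypothetical stand-ins, not actual SAP BAPIs or Metastorm constructs.

```python
# Toy orchestration of atomic ERP functions from a process layer: the
# sequencing, visibility and approval logic live in the BPMS, while the
# business functions stay in the ERP. All names are hypothetical.

def call_erp(function, payload):
    # Stand-in for an RFC or web-service call into the ERP system.
    print("ERP call: %s(%s)" % (function, payload))
    return {"status": "ok", "doc": "4711"}

def procure_to_pay(requisition):
    """One end-to-end process orchestrating three atomic ERP functions,
    rather than rebuilding purchasing logic inside the BPM tool."""
    po = call_erp("CreatePurchaseOrder", requisition)
    call_erp("PostGoodsReceipt", {"po": po["doc"]})
    call_erp("PostInvoice", {"po": po["doc"], "amount": requisition["amount"]})
    return po

procure_to_pay({"material": "widgets", "quantity": 100, "amount": 2500})
```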

Advancing BPM Maturity

Janelle Hill of Gartner presented on how to advance your BPM maturity, starting with the concept that not only isn’t there one path to get to BPM maturity, but there’s more than one maturity destination. There are many different mind-sets that organizations have about their BPM programs, ranging from simple automation and improvement efforts up to strategic business optimization; how you think about BPM will have an enormous impact on the potential value of BPM within your organization. This is really an excellent point that is rarely explicitly stated: if you think of BPM as a low-level tool to do some automation – more of a developer tool than a business tool – then you can see benefits, but they’ll be limited to that domain. Conversely, if you think of BPM as a tool/methodology for transforming your business, your use of BPM will tend to be more aligned with that. The tricky part is that BPM is both (and everything in between), and you don’t want to lose sight of its use at a variety of levels and for many different sorts of benefits: as fashionable as it is to see BPM as purely a strategic, transformational methodology, there are also a lot of practical BPM tools that are used for automation and optimization at a more tactical level that have huge benefits.

Gartner’s business process maturity model – the same, I think, as the OMG BPMM – passes through five levels from process-aware, to coordinated processes, to cross-boundary process management, to goal-driven processes, to optimized processes. In line with this, benefits move from cost and productivity improvements at the low levels; to cycle time reductions, capacity and quality gains at the middle levels; to revenue gains, agility and predictability at the higher levels.

Advancing maturity requires work along six major dimensions:

  • Organization and culture
  • Process competencies
  • Methodologies
  • Technology and architecture
  • Metrics and measures
  • Governance

She then showed a mapping between the maturity levels and these dimensions, showing the level of effort required for each, with the critical transition points highlighted. There are some interesting transition points, such as the effort required for organization and culture increasing right up until you are well-entrenched in level 5 maturity, at which time the organization and culture aspects become systemic and mostly self-sustaining, and the explicit effort required to maintain them decreases sharply.

She broke out each of the dimensions in more detail, showing within the organization and culture dimension how the roles and responsibilities must be developed as the maturity level increases through education, establishing a BPCC (business process competency center) and becoming goal-aligned. Some dimensions, such as process competencies, methodologies and technology/architecture, follow fairly logical paths of increased effort as the maturity level increases, although there will be decisions within those, such as which particular methodologies to develop within your organization, and your tools may change as your maturity level increases. Metrics and measures tend to be more aligned with the maturity levels, changing from individual lagging indicators to shared real-time metrics tied to strategic objectives and SLAs, and are also heavily supported by technology. Governance is the most difficult of the dimensions, with a collection of very different initiatives, and probably won’t even properly start until you’re transitioning from level 1 to level 2. A lot of what she covered here is centered around the process governance committee, and some level of centralized stewardship for end-to-end processes: otherwise, it’s impossible to fund and push forward with processes that span functional (and budgetary) boundaries. It’s also necessary to create incentives to support this, so that the entire process doesn’t end up sub-optimized when one of the functional subprocesses is optimized.

Gartner’s research has shown the impact of a BPCC on achieving business process maturity, and in turn, delivering more successful BPM projects across the organization; I definitely agree with this, although I believe that you need to grow your BPCC more organically on the back of a BPM project rather than making it an independent project of its own. The BPCC should not be part of IT; although it contains some technical people with skills in the tools, it’s really about operational processes and should be under the auspices of the COO or other business leader.

She finished up with a contrast between functionally-driven and process-driven organizations in terms of roles and responsibilities, visibility, hand-offs, cost accounting, risk analysis and other areas, plus a great chart summarizing the linkages between maturity levels and the dimensions.

Excellent talk, and lots of great practical advice on what you need to do to increase your BPM maturity level.

Selling BPM to your Organization

Starting into the breakout sessions here at Gartner BPM 2011 in Baltimore, Elise Olding, with some help from Joel Kiernan of Altera, gave a presentation on selling BPM within your organization. This is about selling that first project internally as well as expanding your BPM initiative beyond the first project: leveraging your success so far and your business-focused BPM definition to see how it can be applied with other opportunities. Like any good sales pitch, you need to have content that is relevant, compelling and repeatable. I wrote about expanding BPM adoption within your organization in a recent article series for Global 360, and covered some of the same issues about generalizing beyond that first project into a BPM program.

Kiernan discussed their own case study at Altera (a semiconductor company), starting with how they had to understand their key business processes and communicate this to the steering committee responsible for the business process projects. They’re early in their journey, but have put together the storyline for how BPM will roll out in their organization: identify the right processes, do some as-is and to-be process analysis including external best practices, implement process/system changes, then move into ongoing process improvement.

As Olding discussed, there will need to be different messages for different internal audiences: senior executives are interested in how BPM will improve performance, competitiveness and operational flexibility; line of business managers are interested in operational goals including reducing errors and rework, and gaining visibility into processes for themselves and their management; front-line workers want to know how it will make their work easier, more interesting and more effective.

As an aside, I get the feeling that Gartner presenters have been coached by someone who really likes complex analogies woven throughout their presentations: in the keynote, Ken McGee used a courtroom analogy throughout, and here Olding is using a film-making analogy with “trailers”, “setting” and “engaging the cast”. It was also a bit of a strange segue to involve the Altera person for only about two minutes when they were really just starting in their process, although I have to give her credit for sharing the stage with a customer, since that’s pretty rare at any Gartner events that I’ve attended in the past. It would have been great to hear from someone further along in the process, and maybe a bit more from them than just two slides.

She covered some of what you actually want to communicate, as well as the who and how of the communication, stressing that you need to achieve buy-in (or at least understanding) from a lot of different stakeholders in order to reach that tipping point where BPM is seen by your organization as a key enabler for business improvement. She changed the format a bit to get people working on their own process issues, giving everyone time to jot down and discuss their challenges in each of the steps of selling BPM internally, then calling on a couple of audience members to share their thoughts with the room. This format shift caused a bit of loss of focus (and a bit of down time for those of us who aren’t really into this form of audience participation), although she was able to bring the experiences of the audience members into alignment with the material that she was presenting. Not surprisingly, one of the key messages is on the business process competency center (what others call a center of excellence) and the methodology that Gartner employs with customers to make a BPCC successful within an organization. Success, in that case, is measured completely by how well you can sell BPM inside the organization.

Gartner BPM 2011 Kicking Off

I’m at my first Gartner BPM show in a while: a couple of years ago, I noticed a lot of repeated information from one summit to the next and decided to sit a few out, but figured that there was enough refresh by now, and a good chance to catch up with a lot of people who I only ever see at these conferences.

The show kicked off with Michele Cantera, joined by Elise Olding, giving some opening remarks and introducing the winners of the Gartner BPM Excellence awards: Lincoln Trust, UPS, Carphone Warehouse, NY State Taxation, and Maximus.

The keynote was delivered by Ken McGee, Gartner fellow, who opened with the statement that this is the time for the business process professional. He backed this up with a look at the economic growth forecast, including some optimistic survey numbers from businesses stating that their revenues and IT spending are going to increase this year. This was a fairly general presentation on the impact of the economy on business environments and the need to seize new opportunities; not at all specific to BPM, except for one slide of the APQC process framework that didn’t really seem to fit with much else.

Gartner has obviously released a report on the Money-Making CIO recently, and that’s what he spent part of his presentation on: looking at the six styles of money-making CIOs (entrepreneur, cost optimization, revenue searching, innovation, business development, and public serving). He mentioned other Gartner research, such as pattern-based strategy, and told us that social networking and cloud computing are important (duh); this seemed like a bit of a grab-bag of concepts that could have been given to any IT audience at any conference.

I understand that it’s important to have presentations that show the larger context at a tightly-focused event like this BPM summit, but this didn’t have the cohesiveness or inspiration required to elevate it beyond just a summary of this year’s Gartner research.

Process Modeling With BPMN

I’m sitting in on Bruce Silver’s online BPMN course this week: this is the same as his onsite course, just using remote classroom tools to allow him to present and demonstrate to us, then get our feedback using typed chat. It’s a combination of lecture and hands-on, using a 60-day license for the business edition of the itp-commerce BPMN Visio add-in that is included with the course. The course runs 11am-3:30pm (Eastern) for three straight days, which took a bit of schedule juggling to be able to attend most of it; I’m not sure if he is recording this for us to access after the course, which would be a nice benefit, especially for those doing the certification exam. I use a lot of BPMN with my customers in my role as a process architect, but Bruce’s knowledge of the standard and its usage far outweighs mine, and I’m sure that I will learn a lot in addition to providing a review of the course for my readers.

He’s using the itp-commerce Visio tool, in spite of the hefty price tag ($975 for the Business Edition, $1,535 for the Professional Edition that also includes serialization; the free edition does not include full BPMN 2.0 support), because it natively supports Bruce’s methodology and style validation, which he covers in his book BPMN Method and Style and uses in this course. There are other Visio add-ons for BPMN 2.0 modeling, including one from Trisotech on the Business Process Incubator site that I’ve been using lately since it has a full-featured (but branded) version that customers can use for free, or the full non-branded version for the price of a BPI premium membership. Visio 2010 supports BPMN natively, but not the 2.0 version – if you’re a big Microsoft Visio customer, you might want to start agitating with Microsoft to include that via a service pack, since their current line seems to be that there isn’t sufficient demand for it yet. Bruce and I both believe that BPMN 2.0 support will become a significant differentiator for modeling products by the end of 2011, and Microsoft really needs to get on board with this if they’re going to be a player in the BPMN 2.0 market. There are some nice features in the itp-commerce tool that we didn’t cover in the course, such as simulation and BPMN 2.0 interchange, but many of those are available in lower-cost alternatives: I think that this is a race to the bottom price-wise, since Microsoft will eventually just include all of this in Visio natively.

He started with some basic definitions of BPMN and how it differs from flowcharts – especially in the area of collaboration, extra-process events and exception handling – highlighting the notions of standardization and of the hierarchical view that allows for inclusion of expandable subprocesses, rather than trying to put everything on one enormous top-level process model. He also covered how BPMN creates a bridge between business analysts who are creating these models, and developers who are making them executable, including the BPM systems that make the models directly executable without a lot of coding. He also discussed what’s not in the BPMN standard, such as user interface for human steps, data models, dynamic work assignments, rules, KPIs and anything to do with the higher-level strategy and goals. Although you may see some of these implemented in a BPMS, those will be done in a proprietary manner, and learning how to do that in that particular tool won’t be transferrable to other tools.

As I often do when I’m presenting a quick look at BPMN in a client presentation, he talked about the full BPMN 2.0 standard, with its new support for choreography and conversation diagrams, execution semantics and an XML schema for model interchange between tools, and highlighted that it’s possible to use the descriptive and analytic subclasses (subsets) of the standard if you don’t need to learn all 100 elements of the standard: the descriptive is for business analysts to be able to model processes as documentation, and the analytic is a minimum subset required to model executable processes.

Bruce keeps bringing it back to the value and benefits of BPMN: why it’s important both in terms of its modeling capabilities, and in the role as a standard for widespread understanding. There are a lot of BPMN detractors, but I don’t see the problem if you don’t try to shove the entire standard down the throats of business people: using the descriptive subclass (plus a few more elements), I’m able to have non-technical business people understand the notation in about 5 minutes, although it would take them a little bit longer to be able to create their own diagrams.

After an hour or so of initial presentation to provide the necessary background, Bruce shared his screen and had us all start up Visio with the itp-commerce add-in, and we started modeling some BPMN. As those of you familiar with BPMN know, there are only three main objects in a BPMN diagram: activities, gateways and events. The fun stuff comes with all the adornments that you can add to those three basic objects to indicate a huge variety of functionality. We started off with a high-level straight-through order process, then added gateways for the exception paths. We started to learn some of the guidelines from Bruce’s style guide, such as using a gateway not to indicate work but only as a question testing the output state of the previous activity (which I always do), and using a separate end event for each distinct end state (which I rarely do but probably will start, since you can label the end events with the states). I also learned a standard Visio trick for moving the placement of the text on a connector using the Text Block tool, which allows you to snug labels of flows leaving a gateway right up to the gateway regardless of the connector length – cool! There were some great questions from the attendees, such as whether you can eliminate the gateway symbol and just have the labeled flows leaving the preceding activity, as you might in traditional flowcharting; in BPMN, that would denote that all of the paths are executed in parallel, not that one path or the other is executed, so it’s not a legal representation of an exclusive OR gateway. Gateways can create a lot of confusion, because although they are often referred to as “decisions”, the decision is actually made in the previous activity, and the gateway just tests the result of that decision.
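
To make that parallel-versus-exclusive semantics concrete, here’s a toy token simulation in Python – my own illustration, not anything from the course or a real BPMS engine – showing that multiple plain flows out of an activity each get a token, while an XOR gateway routes its single token based on the result of the previous activity.

```python
# Toy BPMN token semantics: several unconditional flows leaving an
# activity all fire (parallel paths), whereas an exclusive (XOR) gateway
# sends its single token down exactly one path.

def leave_activity(outgoing):
    """Multiple plain flows out of an activity: a token on every path."""
    return list(outgoing)

def leave_xor_gateway(outgoing, result):
    """XOR gateway: test the result of the previous activity's decision
    and pick the first matching flow, falling back to the default (last)."""
    for condition, target in outgoing[:-1]:
        if condition(result):
            return [target]
    return [outgoing[-1][1]]

# Activity with two plain outgoing flows: both paths execute.
print(leave_activity(["ship order", "send rejection"]))  # both paths

# The same branches behind an XOR gateway: only one path gets the token.
flows = [(lambda r: r == "approved", "ship order"), (None, "send rejection")]
print(leave_xor_gateway(flows, "approved"))              # ['ship order']
```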

A great deal of day 1 alternated between some short presentations (a couple of slides each) on concepts, then exercises that allowed us to model those in diagrams ourselves, reinforcing the concepts immediately. While we were doing the modeling, Bruce would talk through additional information about the concept, such as explaining some of the benefits and rules of pools while we were adding pools and lanes to our diagram, or the same for subprocess syntax. We saw some of the less-used but essential constructs such as ad hoc subprocesses, in which the contained activities don’t have a flow, and may be completed in any order (or not at all): this is how BPMN represents case management-style processes, for example, where the possible tasks are known but the order and applicability of any given task is not known. He also pointed out (and quizzed us on) common errors, such as having the same activity within a subprocess and also in the process that calls it.

By the end of the first day, we had learned all of the Level 1 elements (effectively the BPMN 2.0 descriptive subclass), quite a bit of Bruce’s style guidelines around the use of those elements, and we were creating our own BPMN diagrams using those elements. At the start of day 2, after a recap, Bruce talked about having a BPMN method and style – whether it is his or not – so that there are standardized ways of using BPMN: in spite of it being a standard, it is possible to create diagrams that mean the same thing but look different, and having some standard usage makes it a more powerful communication tool within your organization. His method works toward four basic goals:

  • Structural consistency: a variety of the style points that he’s been covering, such as explicit end states and hierarchical composition
  • Readability: top-down traceability through levels of the process and subprocesses
  • Model completeness: diagram doesn’t require additional documentation to describe the process
  • Shareability with IT: models created by business analysts are aligned with the level 2 models used for executable processes

He then took us through the steps of his method for modeling processes that meets these goals; this was part of the essential intellectual property that he had to pass on to us (as opposed to the mostly standard BPMN on day 1), but it was too dense with slides and lecture rather than hands-on. Following that, he went through his BPMN style guidelines, which were also all lecture, but went much more quickly since these tended to be quick rules rather than the larger concepts that we saw in the method section, and we had already covered a lot of them in the exercises and the method. He did a blog post with a first cut of the rules and style of BPMN, both the standard BPMN rules and his style guidelines, plus a later post showing an example of reworking a process model to meet his style guidelines. The first is a great reference if you decide not to cough up for the itp-commerce product that will do the style validations for you; in reality, once you start using these for a while, they’ll become second nature and you won’t need to have them validated. He provided an updated list of the rules as part of the course, and has given me permission to republish, which I will do in a following post.

For the second half of day 2, we moved on to Level 2 BPMN elements (Analytic subclass) with more of the hands-on exercises on events: one of my favorite topics, since events are the most powerful yet the least understood of all BPMN elements. As Bruce pointed out, no one (not even him, and certainly not me) memorizes the table of 50 or so possible event permutations: for level 1 (the descriptive subclass used by business analysts), you only need to know six of them (all start and end events), although I usually teach business analysts a couple of the intermediate events from level 2 as well. He suggests focusing on message, timer and error events, adding another nine to the six we’ve already seen; if you master these 15 and look up the others as required, you’re way ahead of most people using BPMN today.

Day 3 saw us still covering events via a combination of lecture and exercises; after timers on day 2, we moved on to message events and had a ton of great discussions on some of the finer points of BPMN usage (e.g., a script task that executes purely within the BPMS versus a service task that calls an external service). Message events are critical if you want to start modeling executable processes; intermediate message events are essential for automated messaging outside the process or organization, and boundary message events manage external events that modify or interrupt processes while in flight.  We also covered error events, and Bruce provided some supplementary information on other event types. Interestingly, Bruce is constantly reevaluating how BPMN can and should be used, with some changes over what he published in his book. He was a bit short on time for the last part of day 3 – the course timing definitely needs a bit of work – but we branched into splits and joins, went around iterations, and waded through multi-participant pools (which had an unfortunate effect on my brain).

He finished up with model validation using the itp-commerce add-in to Visio, which optionally validates against his style guide as well as the standard BPMN rules. As he puts it, any modeling tool that doesn’t provide validation against the BPMN specification is a toy, suitable only for drawing nice pictures. I suppose you could argue that after Bruce’s course, you will be able to validate automatically as you model so don’t need a tool to do it, but think of it as being like a spell-checker for process models: we all need a little help once in a while. 😉
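
In the spirit of that spell-checker, here’s a minimal sketch of what automated style validation looks like; the model structure and the two rules are my own toy illustration (loosely inspired by the style points above), not the itp-commerce validator or Bruce’s actual rule set.

```python
# Toy "spell-checker" for process models: walk the model and flag
# violations of two style rules discussed in the course. The model
# structure and rule wording here are my own invented illustration.

def validate(model):
    issues = []
    for node in model["nodes"]:
        outgoing = [f for f in model["flows"] if f["from"] == node["id"]]
        # Rule 1: each distinct end state should be a labeled end event.
        if node["type"] == "endEvent" and not node.get("label"):
            issues.append("%s: end event should be labeled with its end state" % node["id"])
        # Rule 2: branching belongs on a gateway, not implicit on an activity.
        if node["type"] == "task" and len(outgoing) > 1:
            issues.append("%s: activity has %d outgoing flows; use a gateway" % (node["id"], len(outgoing)))
    return issues

model = {
    "nodes": [
        {"id": "t1", "type": "task", "label": "Review order"},
        {"id": "e1", "type": "endEvent"},
        {"id": "e2", "type": "endEvent", "label": "Order rejected"},
    ],
    "flows": [{"from": "t1", "to": "e1"}, {"from": "t1", "to": "e2"}],
}
for issue in validate(model):
    print(issue)
```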

He invited us all to go ahead and do the certification exam (no extra fee if done in the next 60 days), and showed one of the example multiple choice questions that had four possible answers; it received votes for all four of the answers from the class, showing that this is not quite as simple as it seems (yes, I got the right answer). If we pass that part, then we have to create a process model from one of our own processes at a specific level of complexity, following his method and style, and submit it for his review. Suffice it to say that certification via his BPMessentials exam will actually mean that you have mad BPMN skillz; it’s not just a certificate for showing up for the course.

Some potential improvements for the course:

  • It’s a bit hard to demo and talk at the same time, and Bruce could have captured screencams of some parts of the Visio demos to play back for us while he was discussing what we needed to do next, then just gone to Visio live for the finer points of demonstration; that would have made it easier for him to focus on describing what was happening rather than focusing on the actual drawing activity.
  • Some of the finer lecture points (such as going through the method and concepts) were a bit slow-moving, since Bruce would talk to one very dense slide for a number of minutes rather than having multiple slides with less information to absorb. Some restructuring of the slides would improve this, especially to show model snippets on the same page as the concept points, or possibly a much quicker summary to start, then return to the concepts later to reinforce.
  • The non-modeling exercises (e.g., defining the process scope given a specific scenario) didn’t work very well online, since there’s no fluid interaction with the participants, just the chat window with Bruce responding to the chat questions verbally when he sees them. In a regular classroom environment, he could ask for verbal solutions and write them out on a chart as they developed, more collaboratively; here, all he could do was walk through the exercise and his solution. I’m not sure that a more interactive online collaboration tool would make a big dent in this problem; some things are just made for face-to-face (or at least audio) interaction. These sections could be enhanced by showing the process model solution at the same time as the exercise description – or better yet, a screencam – so that as he walks through it, he could point out how it manifests in the process.
  • It would be great to see a summary of the redundant elements in BPMN 2.0, with the preferred one (if one is preferred) indicated. For example, send/receive tasks are the same as intermediate throwing/catching message events, except if you want to put boundary events (e.g., for error handling or timeouts) on the tasks in an executable process; a gateway is implied to be XOR if it has no marker; and a parallel split or an exclusive merge can be implied by multiple flows leaving or entering an activity, without showing the gateway. Although some of these are reflected in Bruce’s style guidelines, we just stumbled across some of them throughout the course.

I definitely learned some of the finer points of BPMN that I didn’t already know, and I will be going back to some BPMN diagrams that I’m working on with clients to clean up the style a bit with what I’ve learned. With this being an online course, I could multitask with other activities during the parts that were review for me; for a BPMN newbie (the target audience), the pace would have been just about right.

There are few people who have this depth of BPMN knowledge, and Bruce is the only one who I know who is doing this as a professional trainer: his is the only BPMN course that I recommend to my clients. He needs to work out a few bumps in how the online course works, but in general, I thought this was a great course, perfect for a business analyst who is already doing some process modeling but doesn’t know any BPMN, but also informative for those of us with some prior knowledge of BPMN.