The Great Case Management Debate

With a title like that, how could I miss this session? Toby Bell (ECM), Kimberly Harris-Ferrante (insurance vertical) and Janelle Hill (BPM) took the stage for what was really a live research session rather than a debate. Is case management a process pattern covered by BPM? Is it functionality within ECM? Is it an industry-specific vertical application? Gartner is still evolving its definition of case management (as are many people), and currently publishes the following:

Case management is the optimization of long-lived collaborative processes that require secure coordination of knowledge, content, correspondence and human resources and require adherence to corporate and regulatory policies/rules to achieve decisions about rights, entitlements or settlements.

The path of execution cannot completely be predefined; human judgment and external events and interactions will alter the flow.

Harris-Ferrante said that we first need to create industry-specific definitions or examples of what a case is; this definition can then be presented in that context so that it makes sense.

Bell made the distinction between content-triggered automation (e.g., paper invoice scanning and processing), collaborative content-rich processes (e.g., specific projects such as construction), and case management: there’s a bit of a spectrum here, based on a variety of factors including cost, complexity, the people involved and time to completion. Case management is distinguished from the others by (human) decisions supported by information: Hill felt that this decision-support nature of case management is a defining feature. Harris-Ferrante talked about the cost and risk factors: case management is used in situations where you have compliance requirements and need to be able to show how and why you made a particular decision. She also pointed out that rules-based automated decisioning is really standard BPM, whereas rules-supported human decisioning falls into case management.
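To make that last distinction concrete, here’s a minimal sketch in Python (entirely hypothetical names, fields and thresholds, not from any vendor’s product): in standard BPM the rule’s output routes the work directly, while in case management the same rule only produces a recommendation, and the human decision is what gets recorded for compliance.

```python
# Hypothetical sketch: rules-based automated decisioning vs. rules-supported
# human decisioning. All names, fields and thresholds are invented.

def rule_score(claim):
    # Stand-in for a rules engine evaluating a case.
    return 1.0 if claim["documented"] and claim["amount"] < 10000 else 0.4

def automated_decision(claim):
    # Standard BPM: the rule's output routes the work directly.
    return "approve" if rule_score(claim) > 0.8 else "refer"

def case_decision(claim, worker_decides):
    # Case management: the rule only informs the knowledge worker, and both
    # the recommendation and the human decision are recorded to show how
    # and why the decision was made.
    recommendation = "approve" if rule_score(claim) > 0.8 else "refer"
    decision = worker_decides(claim, recommendation)
    return {"recommendation": recommendation, "decision": decision}

print(automated_decision({"documented": True, "amount": 5000}))   # approve
print(case_decision({"documented": False, "amount": 5000},
                    lambda c, rec: "approve"))  # human overrides the "refer" suggestion
```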

They showed a slide that talked about a continuum of business process styles, ranging from unstructured to structured; looks vaguely familiar. 😉 Okay, they use “continuum” rather than “spectrum”, have five instead of four categories, and put structured on the right instead of the left, but I am a bit flattered. Their continuum includes unstructured, content collaboration, event driven, decision intensive, and structured activities; they went on to discuss how case management is the most common example of an unstructured process style. I found that wording interesting, and aligned with my ideas: case management is a process style, not something completely different from process. Business process management, in its most generic form, doesn’t mean structured process management, although that’s how some people choose to define it.

Looking at the issue of products, they showed a slide of overlaps in the product spaces, which puts BPM in the structured process/data quadrant, with case management far off in the opposite quadrant. As Hill pointed out, many of the BPM vendors are extending their capabilities to include case management functionality; Bell stated that this might fit better into the ECM space, but Hill countered (the first real bit of debate) that ECM vendors only think about how changes in content impact the case, which misses all of the rules and events that might impact the case and its outcome. She sees case management being added to ECM as just a way that the relatively small market (really just four or five key vendors) is trying to rejuvenate itself, whereas the case management advances from BPM vendors are much more about bringing the broad range of functionality within a BPMS – including rules and analytics – to unstructured processes.

Hill stated that Gartner doesn’t have an MQ for case management because there are so many different styles of case management: content-heavy, decision-heavy, and industry-specific packaged solutions. Besides, that way they could sell three reports instead of one. Not that they would think that way. Harris-Ferrante discussed the challenges to case management as an industry application, including the lack of shared definitions of both cases and case management, and Bell stated that buyers just don’t understand what case management is, and vendors are rejigging the definition to suit the customer context, so aren’t really helping in this regard.

In spite of stating that they don’t have a case management MQ, they did finish up with a slide showing the critical capabilities that customers are asking for in case management, such as a balance of content, collaboration and process services, and a highly configurable case-based user interface. They lay these out against four styles of case management – collaborative forms-based case management, knowledge workers collaborating on internal content, regulated customer-facing file folders and data, and costly processes initiated by customers – and indicate how important each of the factors is for each style. I definitely see the beginnings of an MQ (or four) here. They did state that they would be issuing a research report on the great case management debate; I’ll likely be giving my take on this topic later this year as the industry track chair at the academic BPM 2011 conference.

It’s clear that the definition of case management needs to firm up a bit. As I asked in a tweet during the session: case management: is it a floor wax or a dessert topping? As any old Saturday Night Live fan knows, it’s both, and that could be part of the problem.

Selecting a BPMS

Janelle Hill of Gartner gave a short presentation on selecting a BPMS. Some of her points:

  • The coolest BPMS may not be appropriate. Take advantage of a model-driven development environment that is appropriate for your business people, rather than just what’s most fun for the developers. A typical feature-function evaluation may not be the best way to go about it, since the functionality can vary widely while providing the same business capability.
  • A BPMS is a suite of technologies for supporting the entire lifecycle of process improvement: discovery, modeling, execution, monitoring and optimization. It’s a platform that includes both design-time and runtime. She showed the classic Gartner “gears” diagram showing all the components in a BPMS, and pointed out that you probably don’t need to do a deep dive into some of the components such as business rules, since that’s typically not the deciding factor when selecting a BPMS. A BPMS is a composition environment rather than a full development environment: pre-existing building blocks from outside the BPMS are graphically assembled, together with some functionality built within the BPMS, to create a process application. As a composition environment, the registry and repository are important for being able to locate and reuse assets, whether created inside or external to the BPMS.
  • A BPMS is not the same as a SOA suite: the latter is used to create services, while the former consumes those services at a higher level and also provides user interaction. As I’ve said (usually in front of my service-oriented friends), a BPMS provides the reason that the SOA layer exists.
  • A BPMS provides visibility, adaptability and accountability particularly well, so you should be considering how a BPMS can help you with these business capabilities.
  • If the business (or a combination of business and IT) needs to be able to manage process change, or processes change frequently, then a BPMS is a good fit. If process changes are purely under the control of IT and the processes change infrequently, then more traditional development tools (or an ERP system) can be considered. She talked about frequently changing processes as being served by systems that are built to change, whereas those with less frequently changing processes are built to last, but pointed out that “built to last” often translates to brittle systems that end up requiring a lot of workarounds or expensive changes.
  • She presented Gartner’s top four BPMS use cases: a specific process-based solution, continuous process improvement, redesign for a process-based SOA, and business transformation. Their latest MQ on BPMS has more information on each of these use cases; if you’re not a Gartner customer, it’s available through the websites of many of the leading BPMS vendors.

She then moved into some specific evaluation criteria:

  • Know your dominant process patterns: straight-through, long-running with human involvement, dynamically changing process flows, or collaboration within processes. She categorized these as composite-heavy, workflow-heavy, dynamic-composite-heavy and dynamic-collaborative-heavy, and showed some of the tools that they provide for helping to compare products against these patterns. She stated that you might end up with three different BPMS to match your specific project needs, something that I don’t completely agree with, depending on the size of your organization.
  • Don’t pick a BPMS because it’s a “safe” vendor or enterprise standard, or because of price, or because the developers like it.
  • Do pick a BPMS because it enables business-IT collaboration, because its capabilities match the needs of a defined process, it supports the level of change that you require, and it interoperates well with your other assets.
  • Do an onsite proof of concept (POC), 2-3 days per vendor where your people work side-by-side with the vendor, rather than relying on a prepared demo or proposal. She had a lot of great points here that line up well with what I recommend to my clients; this is really necessary in order to get a true picture of what’s required to build and change a process application.
  • Check for system scalability through reference checks, since you can’t do this during the POC.

She ended with some recommendations that summarize all of this: understand your requirements for change to determine if you need a BPMS; understand your resource interaction patterns to define the features most needed in a BPMS; ensure that your subject matter experts can use the tools; and have a POC to evaluate the authoring environment and the ease of creating process applications.

BPM and ERP at AmerisourceBergen

Gartner BPM always includes sessions for the vendor sponsors, and most of them are smart enough to put one of their customers on stage for those presentations. This afternoon, I listened to Manoj Kumar of AmerisourceBergen, a Metastorm (OpenText) customer, discuss how they used BPM as an alternative to customizing their SAP system, as well as to streamline and improve their processes, and enforce compliance. He went through how they built their business case: demonstrating the BPM tool, surveying departments on their business processes and how they might benefit from BPM, and some analysis to wrap it all up. He also covered the business and IT drivers for creating a BPM center of excellence, with a focus on alignment, shared resources and reusability.

Building the execution team was key; with a model-driven tool, he didn’t really want “hard-core developers”, or even people who had used the tool before, but rather those who could adapt quickly to new environments and use model-driven concepts to drive agile development. Having a focus on quick wins was important, rather than getting bogged down in a long development cycle when it’s not necessary.

They also had considerations about their server infrastructure: since they were using BPM across a wide variety of decentralized and non-integrated groups, they decided on separate virtual machines that could be taken down without impacting anything beyond the specific departmental process. This seems to indicate that they didn’t do much end-to-end work, but focused on departmental solutions; otherwise, I would have expected more integration and the requirement for shared process engines. When he showed his process stats – 200 different processes across 3000 users – it seemed to reinforce my assumption, although they are doing some end-to-end processes such as Procure To Pay.

He strongly encourages taking advantage of the BPM tool for what it does best, including change management for processes. They’ve obviously done a good job of that, since they’re managing their entire BPM program with 4 people on the implementation team. He recommends not allowing developers to write any code until you’ve prototyped what you can in the BPM tool, or else their tendency will be just to rewrite the BPMS functionality themselves; I am 100% behind this, since I see this happening on many BPM implementation projects and it’s a constant battle.

With an SAP/BPM integration like they’ve done at AmerisourceBergen, you need to be careful that you don’t get too carried away in the BPM tool and rebuild functionality that’s already in SAP (or whatever your ERP system is). Using BPM as a tool for orchestrating atomic ERP functions, however, makes a lot of sense in terms of agility and visibility, and also provides the opportunity to build processes that just don’t exist in the ERP system.
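As a rough illustration of that orchestration idea (with entirely hypothetical function names, not real SAP or Metastorm APIs): the BPM layer sequences the atomic ERP calls and owns the approvals and exceptions around them, rather than re-implementing the ERP logic.

```python
# Hypothetical sketch of BPM orchestrating atomic ERP functions; these calls
# do not correspond to any real SAP or Metastorm API.

class ERPStub:
    """Stand-in for the ERP system's existing atomic functions."""
    def create_purchase_order(self, requisition):
        return {"po": 4711, "amount": requisition["amount"]}

    def post_goods_receipt(self, po):
        print(f"goods receipt posted for PO {po['po']}")

    def post_invoice(self, po):
        print(f"invoice posted for PO {po['po']}")

def procure_to_pay(erp, requisition, approve):
    # The process layer adds sequencing, human approvals and visibility;
    # the transactional work stays in the ERP where it already exists.
    po = erp.create_purchase_order(requisition)
    if po["amount"] > 10000 and not approve(po):  # human step owned by BPM
        return "rejected"
    erp.post_goods_receipt(po)
    erp.post_invoice(po)
    return "completed"

print(procure_to_pay(ERPStub(), {"amount": 2500}, approve=lambda po: True))
```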

Advancing BPM Maturity

Janelle Hill of Gartner presented on how to advance your BPM maturity, starting with the concept that not only isn’t there one path to get to BPM maturity, but there’s more than one maturity destination. There are many different mind-sets that organizations have about their BPM programs, ranging from simple automation and improvement efforts up to strategic business optimization; how you think about BPM will have an enormous impact on the potential value of BPM within your organization. This is really an excellent point that is rarely explicitly stated: if you think of BPM as a low-level tool to do some automation – more of a developer tool than a business tool – then you can see benefits, but they’ll be limited to that domain. Conversely, if you think of BPM as a tool/methodology for transforming your business, your use of BPM will tend to be more aligned with that. The tricky part is that BPM is both (and everything in between), and you don’t want to lose sight of its use at a variety of levels and for many different sorts of benefits: as fashionable as it is to see BPM as purely a strategic, transformational methodology, there are also a lot of practical BPM tools that are used for automation and optimization at a more tactical level that have huge benefits.

Gartner’s business process maturity model – the same, I think, as the OMG BPMM – passes through five levels from process-aware, to coordinated processes, to cross-boundary process management, to goal-driven processes, to optimized processes. In line with this, benefits move from cost and productivity improvements at the low levels; to cycle time reductions, capacity and quality gains at the middle levels; to revenue gains, agility and predictability at the higher levels.

Advancing maturity requires work along six major dimensions:

  • Organization and culture
  • Process competencies
  • Methodologies
  • Technology and architecture
  • Metrics and measures
  • Governance

She then showed a mapping between the maturity levels and these dimensions, with the level of effort required for each, and the critical transition points highlighted. There are some interesting transition points, such as the effort required for organization and culture increasing right up until you are well-entrenched in level 5 maturity, at which time the organization and culture aspects become systemic and mostly self-sustaining, and the explicit effort required to maintain them decreases sharply.

She broke out each of the dimensions in more detail, showing within the organization and culture dimension how the roles and responsibilities must be developed as the maturity level increases, through education, establishing a BPCC and becoming goal-aligned. Some dimensions, such as process competencies, methodologies and technology/architecture, follow fairly logical paths of increased effort as the maturity level increases, although there will be decisions within those, such as which particular methodologies to develop within your organization, and your tools may change as your maturity level increases. Metrics and measures tend to be more aligned with the maturity levels, changing from individual lagging indicators to shared real-time metrics tied to strategic objectives and SLAs, and are also heavily supported by technology. Governance is the most difficult of the dimensions, with a collection of very different initiatives, and probably won’t even properly start until you’re transitioning from level 1 to level 2. A lot of what she covered here is centered on the process governance committee, and some level of centralized stewardship for end-to-end processes: otherwise, it’s impossible to fund and push forward with processes that span functional (and budgetary) boundaries. It’s also necessary to create incentives to support this, so that the entire process doesn’t end up sub-optimized when one of the functional subprocesses is optimized.

Gartner’s research has shown the impact of a BPCC on achieving business process maturity, and in turn, delivering more successful BPM projects across the organization; I definitely agree with this, although I believe that you need to grow your BPCC more organically on the back of a BPM project rather than making it an independent project of its own. The BPCC should not be part of IT; although it contains some technical people with skills in the tools, it’s really about operational processes and should be under the auspices of the COO or other business leader.

She finished up with a contrast between functionally-driven and process-driven organizations in terms of roles and responsibilities, visibility, hand-offs, cost accounting, risk analysis and other areas, plus a great chart summarizing the linkages between maturity levels and the dimensions.

Excellent talk, and lots of great practical advice on what you need to do to increase your BPM maturity level.

Selling BPM to your Organization

Starting into the breakout sessions here at Gartner BPM 2011 in Baltimore, Elise Olding, with some help from Joel Kiernan of Altera, gave a presentation on selling BPM within your organization. This is about selling that first project internally, as well as expanding your BPM initiative beyond it: leveraging your success so far and your business-focused BPM definition to see how it can be applied to other opportunities. Like any good sales pitch, you need to have content that is relevant, compelling and repeatable. I wrote about expanding BPM adoption within your organization in a recent article series for Global 360, and covered some of the same issues about generalizing beyond that first project into a BPM program.

Kiernan discussed their own case study at Altera (a semiconductor company), starting with how they had to understand their key business processes and communicate this to the steering committee responsible for the business process projects. They’re early in their journey, but have put together the storyline for how BPM will roll out in their organization: identify the right processes, do some as-is and to-be process analysis including external best practices, implement process/system changes, then move into ongoing process improvement.

As Olding discussed, there will need to be different messages for different internal audiences: senior executives are interested in how BPM will improve performance, competitiveness and operational flexibility; line of business managers are interested in operational goals including reducing errors and rework, and gaining visibility into processes for themselves and their management; front-line workers want to know how it will make their work easier, more interesting and more effective.

As an aside, I get the feeling that Gartner presenters have been coached by someone who really likes complex analogies woven throughout a presentation: in the keynote, Ken McGee used a courtroom analogy throughout, and here Olding is using a film-making analogy with “trailers”, “setting” and “engaging the cast”. It was also a bit of a strange segue to involve the Altera person for only about two minutes when they’re really just starting on their journey, although I have to give her credit for sharing the stage with a customer, since that’s pretty rare at any Gartner event that I’ve attended in the past. It would have been great to hear from someone further along in the process, and maybe a bit more from them than just two slides.

She covered some of what you actually want to communicate, as well as the who and how of the communication, stressing that you need to achieve buy-in (or at least understanding) from a lot of different stakeholders in order to reach that tipping point where BPM is seen by your organization as a key enabler for business improvement. She changed the format a bit to get people working on their own process issues, giving everyone time to jot down and discuss their challenges in each of the steps of selling BPM internally, then calling on a couple of audience members to share their thoughts with the room. This format shift caused a bit of loss of focus (and a bit of down time for those of us who aren’t really into this form of audience participation), although she was able to bring the experiences of the audience members in alignment with the material that she was presenting. Not surprisingly, one of the key messages is on the business process competency center (what Gartner calls the center of excellence) and the methodology that they employ with customers to make a BPCC successful within an organization. Success, in that case, is measured completely by how well you can sell BPM inside the organization.

Gartner BPM 2011 Kicking Off

I’m at my first Gartner BPM show in a while: a couple of years ago, I noticed a lot of repeated information from one summit to the next and decided to sit a few out, but decided that there was enough refreshed content by now, plus a good chance to catch up with a lot of people who I only ever see at these conferences.

The show kicked off with Michele Cantara, joined by Elise Olding, giving some opening remarks and introducing the winners of the Gartner BPM Excellence awards: Lincoln Trust, UPS, Carphone Warehouse, NY State Taxation, and Maximus.

Ken McGee, Gartner fellow, delivered the keynote, opening with the statement that this is the time for the business process professional. He backed this up with a look at the economic growth forecast, including some optimistic survey numbers from businesses stating that their revenues and IT spending are going to increase this year. This was a fairly general presentation on the impact of the economy on business environments and the need to seize new opportunities; not at all specific to BPM, except for one slide of the APQC process framework that didn’t really seem to fit with much else.

Gartner has obviously released a report on the Money-Making CIO recently, and that’s what he spent part of his presentation on: looking at the six styles of money-making CIOs (entrepreneur, cost optimization, revenue searching, innovation, business development, and public serving). He mentioned other Gartner research, such as pattern-based strategy, and told us that social networking and cloud computing are important (duh); this seemed like a bit of a grab-bag of concepts that could have been given to any IT audience at any conference.

I understand that it’s important to have presentations that show the larger context at a tightly-focused event like this BPM summit, but this didn’t have the cohesiveness or inspiration required to elevate it beyond just a summary of this year’s Gartner research.

Process Modeling With BPMN

I’m sitting in on Bruce Silver’s online BPMN course this week: this is the same as his onsite course, just using remote classroom tools to allow him to present and demonstrate to us, then get our feedback using typed chat. It’s a combination of lecture and hands-on, using a 60-day license for the business edition of the itp-commerce BPMN Visio add-in that is included with the course. The course runs 11am-3:30pm (Eastern) for three straight days, which took a bit of schedule juggling for me to be able to attend most of it; I’m not sure if he is recording this for us to access after the course, which would be a nice benefit, especially for those doing the certification exam. I use a lot of BPMN with my customers in my role as a process architect, but Bruce’s knowledge of the standard and its usage far outweighs mine, and I’m sure that I will learn a lot in addition to providing a review of the course for my readers.

He’s using the itp-commerce Visio tool, in spite of the hefty price tag ($975 for the Business Edition, $1,535 for the Professional Edition that also includes serialization; the free edition does not include full BPMN 2.0 support), because it natively supports Bruce’s methodology and style validation, which he covers in his book BPMN Method and Style and uses in this course. There are other Visio add-ons for BPMN 2.0 modeling, including one from Trisotech on the Business Process Incubator site that I’ve been using lately since it has a full-featured (but branded) version that customers can use for free, or the full non-branded version for the price of a BPI premium membership. Visio 2010 supports BPMN natively, but not the 2.0 version – if you’re a big Microsoft Visio customer, you might want to start agitating with Microsoft to include that via a service pack, since their current line seems to be that there isn’t sufficient demand for it yet. Bruce and I both believe that BPMN 2.0 support will become a significant differentiator for modeling products by the end of 2011, and Microsoft really needs to get on board with this if they’re going to be a player in the BPMN 2.0 market. There are some nice features in the itp-commerce tool that we didn’t cover in the course, such as simulation and BPMN 2.0 interchange, but many of those are available in lower-cost alternatives: I think that this is a race to the bottom price-wise, since Microsoft will eventually just include all of this in Visio natively.

He started with some basic definitions of BPMN and how it differs from flowcharts – especially in the areas of collaboration, extra-process events and exception handling – highlighting the notions of standardization and of the hierarchical view that allows for inclusion of expandable subprocesses, rather than trying to put everything on one enormous top-level process model. He also covered how BPMN creates a bridge between business analysts who are creating these models, and developers who are making them executable, including the BPM systems that make the models directly executable without a lot of coding. He also discussed what’s not in the BPMN standard, such as user interfaces for human steps, data models, dynamic work assignments, rules, KPIs and anything to do with higher-level strategy and goals. Although you may see some of these implemented in a BPMS, they will be done in a proprietary manner, and learning how to do that in a particular tool won’t be transferable to other tools.

Much as I do when presenting a quick look at BPMN to clients, he talked about the full BPMN 2.0 standard, with its new support for choreography and conversation diagrams, execution semantics and an XML schema for model interchange between tools, and highlighted that it’s possible to use the descriptive and analytic subclasses (subsets) of the standard if you don’t need to learn all 100 elements: the descriptive subclass is for business analysts to be able to model processes as documentation, and the analytic subclass is the minimum subset required to model executable processes.
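For a concrete sense of what that model interchange means, here’s a minimal sketch that emits the XML skeleton of a one-task process (the element names and namespace are from the BPMN 2.0 specification; the ids and task name are my own inventions):

```python
# A minimal sketch of BPMN 2.0 XML interchange: a one-task process serialized
# under the spec's namespace. Ids and the task name are invented; the element
# names come from the BPMN 2.0 specification.
import xml.etree.ElementTree as ET

BPMN = "http://www.omg.org/spec/BPMN/20100524/MODEL"
ET.register_namespace("", BPMN)

def q(tag):
    # Qualify a tag with the BPMN model namespace.
    return f"{{{BPMN}}}{tag}"

defs = ET.Element(q("definitions"), targetNamespace="http://example.com/bpmn")
proc = ET.SubElement(defs, q("process"), id="orderProcess", isExecutable="false")
ET.SubElement(proc, q("startEvent"), id="start")
ET.SubElement(proc, q("userTask"), id="review", name="Review Order")
ET.SubElement(proc, q("endEvent"), id="end")
ET.SubElement(proc, q("sequenceFlow"), id="f1", sourceRef="start", targetRef="review")
ET.SubElement(proc, q("sequenceFlow"), id="f2", sourceRef="review", targetRef="end")

print(ET.tostring(defs).decode())
```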

Bruce keeps bringing it back to the value and benefits of BPMN: why it’s important both in terms of its modeling capabilities, and in the role as a standard for widespread understanding. There are a lot of BPMN detractors, but I don’t see the problem if you don’t try to shove the entire standard down the throats of business people: using the descriptive subclass (plus a few more elements), I’m able to have non-technical business people understand the notation in about 5 minutes, although it would take them a little bit longer to be able to create their own diagrams.

After an hour or so of initial presentation to provide the necessary background, Bruce shared his screen and had us all start up Visio with the itp-commerce add-in, and we started modeling some BPMN. As those of you familiar with BPMN know, there are only three main objects in a BPMN diagram: activities, gateways and events. The fun stuff comes with all the adornments that you can add to those three basic objects to indicate a huge variety of functionality. We started off with a high-level straight-through order process, then added gateways for the exception paths. We started to learn some of the guidelines from Bruce’s style guide, such as using a gateway not to indicate work but only as a question testing the output state of the previous activity (which I always do), and using a separate end event for each distinct end state (which I rarely do but probably will start, since you can label the end events with the states). I also learned a standard Visio trick for moving the placement of the text on a connector using the Text Block tool, which allows you to snug labels of flows leaving a gateway right up to the gateway regardless of the connector length – cool! There were some great questions from the attendees, such as whether you can eliminate the gateway symbol and just have the labeled flows leaving the preceding activity, as you might in traditional flowcharting; in BPMN, that would denote that all of the paths execute in parallel, not that one path or the other executes, so it’s not a legal representation of an exclusive OR gateway. Gateways can create a lot of confusion because, in spite of how they are often referred to as “decisions”, the decision is actually made in the previous activity, and the gateway just tests the result of that decision.
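A tiny sketch of that last style point, with invented task, field and label names: the activity does the deciding and records an explicit end state, and the gateway is nothing more than a test of that state.

```python
# A sketch of the style point above: the decision is made in the activity,
# which records an explicit end state; the gateway only tests that state.
# Task, field and label names are invented.

def review_order(order):
    # Activity: does the actual work and sets its end state.
    order["state"] = "accepted" if order["total"] <= order["credit"] else "rejected"
    return order

def review_gateway(order):
    # Gateway: no work here, just a test of the prior activity's result;
    # exactly one labeled path is taken (exclusive, not parallel).
    return "fulfill order" if order["state"] == "accepted" else "notify rejection"

print(review_gateway(review_order({"total": 80, "credit": 100})))  # fulfill order
```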

A great deal of day 1 alternated between short presentations (a couple of slides each) on concepts, then exercises that allowed us to model those in diagrams ourselves, reinforcing the concepts immediately. While we were doing the modeling, Bruce would add other information about the concept, such as explaining some of the benefits and rules of pools while we were adding pools and lanes to our diagram, or the same for subprocess syntax. We saw some of the less-used but essential constructs such as ad hoc subprocesses, in which the contained activities don’t have a flow, and may be completed in any order (or not at all): this is how BPMN represents case management-style processes, for example, where the possible tasks are known but the order and applicability of any given task is not. He also pointed out (and quizzed us on) common errors, such as having the same activity within a subprocess and also in the process that calls it.
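Here’s a small sketch of those ad hoc semantics (task names invented): the task set is known up front, nothing constrains the order, and the subprocess completes when a condition is met rather than when every task has run.

```python
# Ad hoc subprocess sketch: the tasks are known, but no sequence flow
# constrains their order, and completion is a condition rather than
# "all tasks done". Task names are invented.
import random

tasks = {"gather_documents", "interview_claimant", "assess_damage", "get_police_report"}
done = set()

def completion_condition():
    # e.g., the case can close once the two essential tasks are complete,
    # whether or not the others ever ran.
    return {"gather_documents", "assess_damage"} <= done

while not completion_condition():
    done.add(random.choice(sorted(tasks - done)))  # any order, possibly a subset

print("case completed after:", sorted(done))
```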

By the end of the first day, we had learned all of the Level 1 elements (effectively the BPMN 2.0 descriptive subclass), quite a bit of Bruce’s style guidelines around the use of those elements, and we were creating our own BPMN diagrams using those elements. At the start of day 2, after a recap, Bruce talked about having a BPMN method and style – whether it is his or not – so that there are standardized ways of using BPMN: in spite of it being a standard, it is possible to create diagrams that mean the same thing but look different, and having some standard usage makes it a more powerful communication tool within your organization. His method works toward four basic goals:

  • Structural consistency: a variety of the style points that he’s been covering, such as explicit end states and hierarchical composition
  • Readability: top-down traceability through levels of the process and subprocesses
  • Model completeness: diagram doesn’t require additional documentation to describe the process
  • Shareability with IT: models created by business analysts are aligned with the level 2 models used for executable processes

He then took us through the steps of his method for modeling processes that meet these goals; this was part of the essential intellectual property that he had to pass on to us (as opposed to the more standard BPMN material on day 1), but it was too dense with slides and lecture rather than hands-on. Following that, he went through his BPMN style guidelines, which were also all lecture, but went much more quickly, since these tended to be quick rules rather than the larger concepts that we saw in the method section, and we had already covered a lot of them in the exercises and the method. He did a blog post with a first cut of the rules and style of BPMN, both the standard BPMN rules and his style guidelines, plus a later post showing an example of reworking a process model to meet his style guidelines. The first is a great reference if you decide not to cough up for the itp-commerce product that will do the style validations for you; in reality, once you use these for a while, they’ll become second nature and you won’t need to have them validated. He provided an updated list of the rules as part of the course, and has given me permission to republish, which I will do in a following post.

For the second half of day 2, we moved on to Level 2 BPMN elements (the analytic subclass) with more of the hands-on exercises on events: one of my favorite topics, since events are the most powerful yet least understood of all BPMN elements. As Bruce pointed out, no one (not even him, and certainly not me) memorizes the table of 50 or so possible event permutations: for level 1 (the descriptive subclass used by business analysts), you only need to know six of them (all start and end events), although I usually teach business analysts a couple of the intermediate events from level 2 as well. He suggests focusing on message, timer and error events, adding another nine to the six we’ve already seen; if you master these 15 and look up the others as required, you’re way ahead of most people using BPMN today.

Day 3 saw us still covering events via a combination of lecture and exercises; after timers on day 2, we moved on to message events and had a ton of great discussions on some of the finer points of BPMN usage (e.g., a script task that executes purely within the BPMS versus a service task that calls an external service). Message events are critical if you want to start modeling executable processes; intermediate message events are essential for automated messaging outside the process or organization, and boundary message events manage external events that modify or interrupt processes while in flight.  We also covered error events, and Bruce provided some supplementary information on other event types. Interestingly, Bruce is constantly reevaluating how BPMN can and should be used, with some changes over what he published in his book. He was a bit short on time for the last part of day 3 – the course timing definitely needs a bit of work – but we branched into splits and joins, went around iterations, and waded through multi-participant pools (which had an unfortunate effect on my brain).

He finished up with model validation using the itp-commerce add-in to Visio, which optionally validates against his style guide as well as the standard BPMN rules. As he puts it, any modeling tool that doesn’t provide validation against the BPMN specification is a toy, suitable only for drawing nice pictures. I suppose you could argue that after Bruce’s course, you will be able to validate automatically as you model so don’t need a tool to do it, but think of it as being like a spell-checker for process models: we all need a little help once in a while. 😉

He invited us all to go ahead and do the certification exam (no extra fee if done in the next 60 days), and showed one of the example multiple-choice questions: it had four possible answers, and received votes for all four from the class, showing that this is not quite as simple as it seems (yes, I got the right answer). If we pass that part, then we have to create a process model from one of our own processes of a specific level of complexity, following his method and style, and submit it for his review. Suffice it to say that certification via his BPMessentials exam will actually mean that you have mad BPMN skillz; it’s not just a certificate for showing up for the course.

Some potential improvements for the course:

  • It’s a bit hard to demo and talk at the same time; Bruce could have captured screencams of some parts of the Visio demos to play back for us while he was discussing what we needed to do next, then just gone to Visio live for the finer points of demonstration. That would have made it easier for him to focus on describing what was happening rather than focusing on the actual drawing activity.
  • Some of the finer lecture points (such as going through the method and concepts) were a bit slow-moving, since Bruce would talk to one very dense slide for a number of minutes rather than having multiple slides with less information to absorb. Some restructuring of the slides would improve this, especially to show model snippets on the same page as the concept points, or possibly a much quicker summary to start, then return to the concepts later to reinforce.
  • The non-modeling exercises (e.g., defining the process scope given a specific scenario) didn’t work very well online, since there’s no fluid interaction with the participants, just the chat window, with Bruce responding to the chat questions verbally when he sees them. In a regular classroom environment, he could ask for verbal solutions and write them out on a chart as they developed, more collaboratively; here, all he could do was walk through the exercise and his solution. I’m not sure that a more interactive online collaboration tool would make a big dent in this problem; some things are just made for face-to-face (or at least audio) interaction. These sections could be enhanced by showing the process model solution at the same time as the exercise description – or better yet, a screencam – so that as he walks through it, he could point out how it manifests in the process.
  • It would be great to see a summary of the redundant elements in BPMN 2.0, with the preferred one (if one is preferred) indicated. For example, send/receive tasks are the same as intermediate throwing/catching message events, except if you want to put boundary events (e.g., for error handling or timeouts) on the tasks in an executable process; a gateway is implied to be XOR if it has no marker; and parallel split gateways and exclusive merge gateways are implied without showing the gateway. Although some of these are reflected in Bruce’s style guidelines, we just stumbled across them throughout the course; the first of these equivalences is sketched below.
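To illustrate that first equivalence (element names are from the BPMN 2.0 spec; the ids and message reference are invented), the two constructs serialize differently even though the diagrams usually mean the same thing:

```python
# Sketch of the send task vs. throwing intermediate message event equivalence.
# Element names are from the BPMN 2.0 spec; ids and messageRef are invented.
import xml.etree.ElementTree as ET

BPMN = "http://www.omg.org/spec/BPMN/20100524/MODEL"
ET.register_namespace("", BPMN)

def q(tag):
    # Qualify a tag with the BPMN model namespace.
    return f"{{{BPMN}}}{tag}"

# Variant 1: a send task referencing a message
send_task = ET.Element(q("sendTask"), id="notifyCustomer", messageRef="msgNotify")

# Variant 2: a throwing intermediate message event referencing the same message
throw = ET.Element(q("intermediateThrowEvent"), id="notifyCustomer2")
ET.SubElement(throw, q("messageEventDefinition"), messageRef="msgNotify")

print(ET.tostring(send_task).decode())
print(ET.tostring(throw).decode())
```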

I definitely learned some of the finer points of BPMN that I didn’t already know, and I will be going back to some BPMN diagrams that I’m working on with clients to clean up the style a bit with what I’ve learned. With this being an online course, I could multitask with other activities during the parts that were review for me; for a BPMN newbie (the target audience), the pace would have been just about right.

There are few people who have this depth of BPMN knowledge, and Bruce is the only one I know of who is doing this as a professional trainer: his is the only BPMN course that I recommend to my clients. He needs to work out a few bumps in how the online course works, but in general, I thought this was a great course: perfect for a business analyst who is already doing some process modeling but doesn’t know any BPMN, but also informative for those of us with some prior knowledge of BPMN.

SAP Run Better Tour: Business Analytics Overview

Dan Kearnan, senior director of marketing for business analytics, provided an overview of SAP’s business analytics in the short breakout sessions following the keynote. Their “run smarter” strategy is based on three pillars of knowing your business, deciding with confidence and acting boldly; his discussion of the “act boldly” part seemed to indicate that the round-tripping from data to events back to processes is more prevalent than I would have thought based on my previous observations.

We covered a lot of this material in the bloggers briefing a couple of weeks ago with Steve Lucas; he delved into the strategy for specific customers, that is, whether you’re starting with SAP ERP, SAP NetWeaver BW or non-SAP applications as input into your analytics.

He briefly addressed the events/process side of things – I think that they finally realized that when they bought Sybase, they picked up Aleri CEP with it – and their Event Insight solution is how they’re starting to deliver on this. They could do such a kick-ass demo using all of their own products here: data generated from SAP ERP, analyzed with BusinessObjects, events generated with Event Insight, and exception processes instantiated in NetWeaver BPM. NW BPM, however, seems to be completely absent from any of the discussions today.

He went through a number of the improvements in the new BI releases, including a common (and easier to use) user interface across all of the analytics products, and deep integration with the ERP and BW environments; there is a more detailed session this afternoon to drill into some of these.

I’m going to stick around to chat with a few people, but won’t be staying for the afternoon, so my coverage of the SAP Run Better Tour ends here. Watch the Twitter stream for information from others onsite today and at the RBT events in other cities in the days to come, although expect Twitter to crash spectacularly today at 1pm ET/10am PT when the iPad announcement starts.

Blogger/Analyst Session with Mark Aboud at SAP Run Better Tour

A small group of bloggers and analysts (okay, I was probably the only one with “blogger” on my name tag) had the chance to sit down with Mark Aboud, Managing Director of SAP Canada, and Margaret Stuart, VP for the Canadian BusinessObjects division. Since this was a roundtable Q&A, I’ll just list some of the discussion points.

  • 50% of SAP Canadian customers are small and medium businesses, sold through their partner network. ERP sales tend to be made through larger partners, whereas analytics are also handled by a larger number of smaller partners.
  • Business ByDesign has only been launched in Canada within the past 60 days, making it difficult to tell much about the uptake here. There is one live production customer in Canada now, although they were not able to name names. Pricing and the minimum number of users are similar to the US offering.
  • It sounds like HANA is a focus in Canada, but nothing concrete to talk about yet – seems like the analytics sales team is being focused on it and has built a good pipeline. Maple Leaf Foods, who spoke at the keynote, is considering it. The use cases exist, but the customer may not realize that the solutions to big data analytics are within their reach.
  • StreamWork is pretty much a big zero in Canada right now: they’re starting to talk to customers, but it sounds like very early days here. I was promised a follow-up on this question.
  • They’re putting a lot of weight on mobile apps for the future, particularly in industries that have remote users. I’m envisioning an underground miner with an iPad. 😉
  • The use of analytics such as BusinessObjects has become much more agile: it’s no longer taking six months to create an analytical view, and end users now expect that it can be done in a much shorter time.
  • I posed the question about how (or whether) all these great analytics are being used to generate events that feed back automatically into business processes; although there was recognition that there’s some interesting potential, it was a bit of a blank. This is the same question that I posed at last year’s SAPPHIRE about creating a link between their sustainability initiatives and BPM – I’m seeing this as a critical missing link from analytics through events back to processes.

A good opportunity for Q&A with Aboud and Stuart about what’s happening with SAP in Canada. Since most of my focus with SAP has been through the US conferences, it was nice to see what’s happening closer to home.

SAP Run Better Tour Toronto

SAP is holding a Run Better Tour to highlight some of their new releases and customer success stories, and today it’s in Toronto, which allows me to check it out without having to get on an airplane. I attended the Women’s Leadership Forum breakfast this morning, featuring Amanda Lang of CBC News, and she’s speaking again in the general keynote, along with Mark Aboud, Managing Director of SAP Canada.

To go off on a tangent for a moment, Lang had an interesting anecdote at breakfast from an interview that she did with the ambassador from Norway. Apparently, Norway mandated that there be equal representation of women in senior government and corporate board positions; all of the cries of “but there are no women to take these roles” turned out to be completely untrue once they were actually required to look for them. Very reminiscent of the brouhaha around women speakers at tech conferences that inevitably arises several times per year.

In her general keynote, Lang focused on the economy and market forces (after making a quick joke about economists getting laid), and the factors that could impact a return to prosperity: world instability, a repeat of the financial crisis due to mismanagement, and a decrease in productivity. In the relatively small Canadian market, we have no control over the first two of these – a financial crisis that impacts us is unlikely to come from our conservatively-run banks, but from US or European financial institutions – but we can be more productive. However, our productivity has declined in the past 20-30 years, and we are at risk of leaving our children worse off than we are. This started when our currency was so cheap, and our exports were selling at $0.60 on the dollar: no need to increase productivity when you can keep doing the same old thing and still make money at it. However, the past 8 years or so have seen an exchange increase such that our dollar sits near par with the US, which makes our exports much less competitive. Since we haven’t increased productivity, we don’t have better widgets to sell for less in spite of the exchange leveling. Productivity and innovation, although not identical, are highly correlated: we need to have more people inside organizations who challenge the status quo and bring forward better ideas for how to do things.

Mark Aboud started his presentation with the idea that you can’t just get better, you have to get better faster than your competition. Some of this is based on taming the explosion of data that is resulting from the digitalization of human culture: all of that data needs to be gathered and analyzed, then made available to a variety of constituents via a number of different channels. Another contributor is social media, both in terms of the power that it has as a platform, and in raising the expectations for user experience: the consumer experience is very powerful, but the typical employee experience is pretty lame. He moved on to talk about SAP, and particularly SAP Canada, where only 40% of their business is based on ERP: much of the rest is business analytics. This stress on analytics became obvious as he talked about one of their customers, Children’s Hospital of Eastern Ontario, and how they’re using a graphical real-time dashboard as the key interface in their emergency department to indicate how well they’re operating and to highlight problem areas: a great analytics-in-action example, although it’s not clear where the underlying data is coming from. He also talked about CN Railways, and how they’re using BusinessObjects analytics to reduce their fuel costs.

Last up in the keynote was someone from Maple Leaf Foods (missed the name) talking about their ERP implementation, and how they use it to manage a company that has grown by acquisition and has very different types of operations in different regions, with 200 different systems and islands of data. They are trying to standardize their business processes across these units at some level, and started rolling out SAP in all of the business units early in 2011, with a planned completion date of early 2013. They’ve done 35 go-lives already, which necessitates a minimum of customization and, sometimes, changing their business processes to match out-of-the-box SAP rather than spending the time to customize SAP.

Good balance of keynotes; I’m now off to a bloggers’ briefing with Mark Aboud.