Robert Shapiro on BPMN 2.0

Robert Shapiro spoke on a webinar today about BPMN 2.0, including some of the history of how BPMN got to this point, changes and new features from the previous version and the challenges that those may create, the need for portability and conformance, and an update on XPDL 2.2. The webinar was hosted by the Workflow Management Coalition, where Shapiro chairs the conformance working group.

He started out with how WPDL started as an interchange format in the mid-90s, then became XPDL 1.0 around 2001, about the time that the BPMN 1.0 standard was being kicked off. For those of you not up on your standards, XPDL is an interchange format (i.e., the file format) and BPMN prior to version 2.0 is a notation format (i.e., the visual representation); since BPMN didn’t include an interchange format, XPDL was updated to provide serialization of all BPMN elements.

With BPMN 2.0, serialization is being added to the BPMN standard itself, along with many other new components including formalization of the execution semantics and the definition of a choreography model. In particular, there are significant changes to conformance, swimlanes and pools, data objects, subprocesses, and events; Shapiro walked through each of these in detail. I like some of the changes to events, such as the distinction between boundary and regular intermediate events, as well as the concept of interrupting and non-interrupting events. This makes for a more complex set of events, but one that is much more representative of real-world behavior.
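
Since serialization is new in 2.0, it’s worth seeing what the event distinctions look like on the wire. Here’s a minimal sketch in Python; the element names, the cancelActivity attribute and the namespace are taken from the final published BPMN 2.0 schema, so treat them as assumptions relative to the draft discussed in the webinar, and the tiny process is hand-written for illustration rather than validated against the official XSD:

```python
# Sketch: distinguishing interrupting from non-interrupting boundary events
# in a BPMN 2.0 XML serialization. Names follow the final published schema;
# the draft current at the time of the webinar may differ.
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"  # final-spec namespace

doc = f"""
<definitions xmlns="{BPMN_NS}">
  <process id="claims">
    <userTask id="review" name="Review claim"/>
    <!-- cancelActivity="false" marks a non-interrupting boundary event -->
    <boundaryEvent id="deadline" attachedToRef="review" cancelActivity="false"/>
    <!-- cancelActivity defaults to "true": an interrupting boundary event -->
    <boundaryEvent id="escalate" attachedToRef="review"/>
  </process>
</definitions>
"""

root = ET.fromstring(doc)
kinds = {}  # event id -> True if interrupting
for be in root.iter(f"{{{BPMN_NS}}}boundaryEvent"):
    kinds[be.get("id")] = be.get("cancelActivity", "true") != "false"

for event_id, interrupting in kinds.items():
    print(event_id, "interrupting" if interrupting else "non-interrupting")
```

The key point: interrupting is the default, and a non-interrupting boundary event is an explicit opt-out in the serialized model.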

Bruce Silver, who has been involved in the development of BPMN 2.0, wrote recently on what he thinks is missing from BPMN 2.0; definitely worth a read for some of what might be coming up in future versions (if Bruce has his way).

One key thing that is emerging, both as part of the standard and in practice, is portability conformance: one of the main reasons for these standards is to be able to move process models from one modeling tool to another without loss of information. This led to a discussion about BPEL, and how BPMN is not just for BPEL, or even just for executable processes. BPEL doesn’t fully support BPMN: there are things that you can model in BPMN that will be lost if you serialize to BPEL, since BPEL is intended as a web service orchestration language. For business analysts modeling processes – especially non-executable processes – a more complete serialization is critical.
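
One concrete example of that loss: BPEL is block-structured, so an arbitrary cycle in a BPMN flow graph (a loop back to an earlier gateway) has no direct BPEL equivalent and must either be restructured or rejected by a translator. The sketch below is a toy back-edge detector over a hypothetical process graph, just to illustrate the kind of structural check involved; real BPMN-to-BPEL translators perform much deeper analysis:

```python
# Toy check for a well-known BPMN-to-BPEL gap: an unstructured cycle in the
# flow graph. Returns an edge that closes a cycle, or None if the graph is
# acyclic. Illustrative only; the process graph below is invented.
def find_back_edge(edges):
    """Return a (from, to) edge that closes a cycle, or None if acyclic."""
    graph = {}
    for src, dst in edges:
        graph.setdefault(src, []).append(dst)

    visited, on_path = set(), set()

    def dfs(node):
        visited.add(node)
        on_path.add(node)
        for nxt in graph.get(node, []):
            if nxt in on_path:
                return (node, nxt)  # loops back into the current path
            if nxt not in visited:
                found = dfs(nxt)
                if found:
                    return found
        on_path.discard(node)
        return None

    for node in list(graph):
        if node not in visited:
            found = dfs(node)
            if found:
                return found
    return None

# A BPMN-style fragment that loops back from a review task to an earlier gateway:
flow = [("start", "gateway"), ("gateway", "task"), ("task", "review"),
        ("review", "gateway"), ("review", "end")]
print(find_back_edge(flow))
```

A BPEL serializer faced with that review-to-gateway edge has to either rewrite the loop as a structured while activity or drop information, which is exactly the portability problem for non-executable models.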

In case you’re wondering about BPDM, which was originally intended to be the serialization format for BPMN, it appears to have become too much of an academic exercise and not enough of a solution to the practical serialization problem at hand. Even as serialization is built into BPMN 2.0 and beyond, XPDL will likely remain a key interchange format because of the existing base of XPDL support among BPM and BPA vendors. Nonetheless, XPDL will need to work at remaining relevant to the BPM market in a world of BPEL and BPMN; even if the BPMN 2.0 serialization is picked up by a majority of vendors, XPDL is likely to remain a supported standard for years to come.

The webinar had about 60 attendees, including the imaginatively named “asdf” (check the left side of your keyboard) and several acquaintances from the BPM standards and vendor communities. The registration page for the webinar is here, and I imagine that it will eventually link to the replay of the webinar. The slides will also be available on the WfMC site.

If you want to read more about BPMN 2.0, don’t go searching on the OMG site: for some reason, they don’t want to share draft versions of the specification except with paid OMG members. Here’s a direct link to the 0.9 draft version from November 2008, but I also recommend tracking Bruce Silver’s blog for insightful commentary on BPMN.

BPM Think Tank: Business Benefits of BPM Standards

Derek Miers gave a short session that was supposed to be about the business benefits of BPM standards, but ended up as a bit of a BPM standards bun fight. As I mentioned in my first post this morning, I think that Think Tank needs more about standards, I’m just not sure that a few minutes of unstructured debate — mostly from vendors who are involved in the standards process — really satisfies the need.

Business Process Driven SOA using BPMN and BPEL

I just received a review copy of Matjaz Juric and Kapil Pant’s new book, Business Process Driven SOA using BPMN and BPEL. It’s on my list of recent books that I’ve received to review, and I hope to get to it soon.

According to the authors’ description, you’ll learn the following from this book:

  • Modeling business processes in an SOA-compliant way
  • A detailed understanding of the BPMN standard for business process modeling and analysis
  • Automatically translating BPMN into BPEL
  • Executing business processes on SOA platforms
  • Overcoming the semantic gap between process models and their execution, and following the closed-loop business process management life cycle
  • Understanding technologies complementary to BPM and SOA, such as business rules management and business activity monitoring

I’ll let you know if I learned all of that once I’ve had a chance to read it.

BPM Milan: Instantiation Semantics for Process Models

Jan Mendling of Queensland University of Technology presented a paper on Instantiation Semantics for Process Models, co-authored with Gero Decker of HPI Potsdam. Their main focus was on determining the soundness of process models, particularly based on the entry points to processes.

They considered six different process notations: open workflow nets, YAWL, event-driven process chains, BPEL (the code, not a graphical representation), UML activity diagrams, and BPMN. They determined how an entry point is represented in each of these notations, identifying three different types of entry points: a start place (such as in open workflow nets), a start event (such as in BPMN), and a start condition (such as in event-driven process chains). He walked through a generic process execution environment, showing the entry points to process execution.

They created a framework called CASU: Creation (what triggers a new process instance), Activation (which of the multiple entry points are activated on creation), Subscription (which other start events are waited for once one start event is triggered), and Unsubscription (how long the other start events are waited for). Each of these four activities has several possible patterns, e.g., Creation can be based on a single condition, multiple events, or other patterns of events.
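
To make the framework concrete, here’s a sketch of the CASU idea in Python: each notation gets a profile of supported patterns along the four dimensions, and comparing profiles shows whether models can be mapped losslessly between notations. The pattern names and the specific sets below are my own invented shorthand, not the paper’s actual taxonomy or findings:

```python
# Sketch of the CASU idea: a notation's instantiation semantics as a
# (Creation, Activation, Subscription, Unsubscription) profile of supported
# patterns. Pattern names and sets are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass(frozen=True)
class CasuProfile:
    creation: frozenset        # what triggers a new instance
    activation: frozenset      # which entry points are activated on creation
    subscription: frozenset    # which other start events are then waited for
    unsubscription: frozenset  # how long the others are waited for

    def supports(self, other: "CasuProfile") -> bool:
        # A lossless mapping needs every pattern of `other` supported here.
        return (other.creation <= self.creation
                and other.activation <= self.activation
                and other.subscription <= self.subscription
                and other.unsubscription <= self.unsubscription)

bpmn = CasuProfile(frozenset({"single-event", "multi-event"}),
                   frozenset({"single", "all"}),
                   frozenset({"none", "all-remaining"}),
                   frozenset({"consume", "discard"}))
bpel = CasuProfile(frozenset({"single-event"}),
                   frozenset({"single"}),
                   frozenset({"none"}),
                   frozenset({"consume"}))

print(bpmn.supports(bpel))  # these toy BPEL patterns fit within BPMN's...
print(bpel.supports(bpmn))  # ...but not the reverse
```

In the paper’s real classification table the relationship between notations is messier than this toy subset relation, which is exactly why the framework is useful.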

The CASU framework allows for the classification of the instantiation semantics of different modeling languages; he showed a classification table that evaluated each of the six process notations against the 5 Creation patterns, 5 Activation patterns, 3 Subscription patterns and 5 Unsubscription patterns, showing how well each notation supports each pattern. One important note is that BPEL and BPMN do not support the same patterns, meaning that there is not a 100% mapping between BPMN and BPEL: we all knew that, but it’s nice to see more research backing it up. 🙂

Having multiple start events in a process causes all sorts of problems in terms of understandability and soundness, and he doesn’t recommend it in general; however, since the notations allow it, and it is therefore done in practice, analysis of multi-start-point instantiation semantics is important for understanding how the different modeling languages handle these situations.

BPM Milan: Paul Harmon keynote

After a few brief introductions from the various conference organizers (in which we learned that next year’s conference is in Ulm, Germany), we had a keynote from Paul Harmon on the current state and future of BPM. It covered a lot of the past, too: from the origins of quality management and process improvement through every technique used in the past 100 years to the current methods and best practices. A reasonable summary of how we got to where we are.

His “future promise”, however, isn’t all that future: he talks about orchestrating ERP processes with a BPMS, something that’s already a well-understood functionality, if not widely implemented. He points out (and I agree) that many uses of BPMS today are not that innovative: they’re being used the same way as the workflow and EAI systems of 5 years ago, namely, as better programming tools to automate a process. He sees the value of today’s BPMS as helping managers to manage processes, both in terms of visibility and agility; of course, it’s hard to do that unless you have the first part in place. The problem is that a lot of companies spend too much effort on the first level of just automating the processes, and never get to the management part of BPM.

He discussed the importance of BPMN in moving BPMS into the hands of managers and business analysts, in that a basic — but still standards-compliant — BPMN diagram can be created without adornment by someone on the business side without having to consider many of the exception flows or technical implementation details: this “happy path” process will execute as it is, but won’t handle all situations. The exceptions and technical details can be added at a second modeling/design phase while still maintaining the core process as originally designed by the business person.

He also showed a different view of a business process: instead of modeling the internal processes, model the customer processes — what the customer goes through in order to achieve their goals — and align that with what goes on internally and what could be done to improve the customer experience. Since the focus is on the customer process and not the internal process, the need for change to internal process can become more evident: a variation on walking a mile in their shoes.

His definition of BPM is very broad, encompassing not just the core processes, but performance management, people, technology, facilities, management and suppliers/partners: an integration of quality, management and IT. Because of the broad involvement of people across an organization, it’s key to find a common language about process that spans IT and business management.

Although they’re not there yet, you should be able to find a copy of his slides later this week by searching for BPM2008HarmonKeynote.

Another new BPMN book

Another new BPMN book, this one by Stephen White (arguably the inventor of BPMN) and Derek Miers: BPMN Modeling and Reference Guide. It won’t be released until September, with a public launch at the Gartner BPM summit in DC. From the product description:

This book is for both business users and process modeling practitioners alike. Part I provides an easily understood introduction to the key components of BPMN (put forward in a user-friendly fashion). Starting off with simple models, it progresses into more sophisticated patterns. Exercises help cement comprehension and understanding (with answers available online). Part II provides a detailed and authoritative reference on the precise semantics and capabilities of the standard.

I wrote earlier this week about the just-released BPMN book by Tom Debevoise and Rick Geneva; this is obviously the year that BPMN goes mainstream, or at least makes the attempt. White and Miers’ book, although a bit longer than Debevoise and Geneva’s, is more than twice the price, and also doesn’t seem to offer an e-book option: it’s hard to become a staple of every process-oriented person in an organization at a $40 price point.

I’ll be very interested to read Bruce Silver’s review of these books. Unless, of course, he’s writing his own. 🙂

Microguide to BPMN

I noticed in one of Tom Debevoise’s posts last week that he recently co-authored the book The Microguide to Process Modeling in BPMN, and on closer examination, I see that his co-author is Rick Geneva of Intalio, with Ismael Ghalimi writing the foreword.

From the product description on Amazon:

With over fifty implementations listed, Business Process Modeling Notation (BPMN) is an increasingly successful Object Management Group (OMG) standard. Whether you are in government, manufacturing or retailing you can accurately depict your processes in BPMN! Yet, OMG BPMN specification 1.1 is abstract, lengthy and complicated. So, learning to use BPMN can be daunting. So you will need the strait forward [sic] information in this book. This guide gathers all the ideas, design, and problem solving of BPMN into one simple, focused book, and offers concrete true-life examples that explain BPMN’s approach to process modeling.

I haven’t had a chance to read it yet so can’t compare it to the many other sources of BPMN instruction out there, such as the recently-released BPMN, the Business Process Modeling Notation Pocket Handbook. Unfortunately, Debevoise and Geneva’s book doesn’t appear to be available as an e-book.

BPMN 1.1 poster

Previously, I posted about the free BPMN 1.0 poster available for download from ITPoster, and now the Business Process Technology Group at the Hasso Plattner Institute has published one for BPMN 1.1. Both provide a good quick reference; the BPT version has just the graphical object notation, while the ITPoster version also includes some patterns and antipatterns.

Also, check out BPT’s BPMN Corner, which has a number of good BPMN links, including Oryx, a web-based BPMN editor, and BPMN stencils for Visio and OmniGraffle.

Oracle BEA Strategy Briefing

Not only did Oracle schedule this briefing on Canada Day, the biggest holiday in Canada, but they forced me to download the Real Player plug-in in order to participate. The good part, however, was that it offered full streaming audio and video alongside the slides.

Charles Phillips, Oracle President, kicked off with a welcome and some background on Oracle, including their focus on database, middleware and applications, and how middleware is the fastest-growing of these three product pillars. He described how Oracle Fusion middleware is used both by their own applications as well as ISVs and customers implementing their own SOA initiatives.

He outlined their rationale for acquiring BEA: complementary products and architecture, internal expertise, strategic markets such as Asia, and the partner and channel ecosystem. He stated that they will continue to support BEA products under the existing support lifetimes, with no forced migration policies to move off of BEA platforms. They now consider themselves #1 in the middleware market in terms of both size and technology leadership, and Phillips gave a gentle slam to IBM for over-inflating their middleware market size by including everything but the kitchen sink in what they consider to be middleware.

The BEA developer and architect online communities will be merged into the Oracle Technology Network: Dev2Dev will be merged into the Oracle Java Developer community, and Arch2Arch will be broadened to the Oracle community.

Retaining all the BEA development centers, they now have 4,500 middleware developers; most BEA sales, consulting and support staff were also retained and integrated into the Fusion middleware teams.

Next up was Thomas Kurian, SVP of Product Development, on Fusion Middleware and BEA product directions, with a more detailed view of the Oracle middleware products and strategy. Their basic philosophy for middleware is that it’s a unified suite rather than a collection of disjoint products, it’s modular from a purchasing and deployment standpoint, and it’s standards-based and open. He started to talk about applications enabled by their products, unifying SOA, process management, business intelligence, content management and Enterprise 2.0.

They’ve categorized middleware products into 3 categories on their product roadmap (which I have reproduced here directly from Kurian’s slide):

  • Strategic products
    • BEA products being adopted immediately with limited re-design into Oracle Fusion middleware
    • No corresponding Oracle products exist in the majority of cases
    • Corresponding Oracle products converge with BEA products with rapid integration over 12-18 months
  • Continue and converge products
    • BEA products being incrementally re-designed to integrate with Oracle Fusion middleware
    • Gradual integration with existing Oracle Fusion middleware technology to broaden features with automated upgrades
    • Continue development and maintenance for at least 9 years
  • Maintenance products
    • Products that BEA had end-of-lifed due to limited adoption prior to the Oracle acquisition
    • Continued maintenance with appropriate fixes for 5 years

The “continue and converge” category is, of course, a bit different than “no forced migration”, but this is to be expected. My issue is with the overlap between the “strategic” category, which can include a convergence of an Oracle and a BEA product, and the “continue and converge” category, which includes products that will be converged into another product: when is a converged product considered “strategic” rather than “continue and converge”? Or is this just the spin they’re putting on things so as to not freak out BEA customers who have made huge investments in a BEA product that is going to be converged into an existing Oracle product?

He went on to discuss how each individual Oracle and BEA product would be handled under this categorization. I’ve skipped the parts on development tools, transaction processing, identity management, systems management and service delivery, and gone right to their plans for the Service-Oriented Architecture products:

Oracle SOA product strategy

  • Strategic:
    • Oracle Data Integrator for data integration and batch ETL
    • Oracle Service Bus, which unifies AquaLogic Service Bus and Oracle Enterprise Service Bus
    • Oracle BPEL Process Manager for service orchestration and composite application infrastructure
    • Oracle Complex Event Processor for in-memory event computation, integrated with WebLogic Event Server
    • Oracle Business Activity Monitoring for dashboards to monitor business events and business process KPIs
  • Continue and converge:
    • BEA WL-Integration will be converged with the Oracle BPEL Process Manager
  • Maintenance:
    • BEA Cyclone
    • BEA RFID Server

Note that the Oracle Service Bus is in the “strategic” category, but is a convergence of AL-SB and Oracle ESB, which means that customers of one of those two products (or maybe both) are not going to be happy.

Kurian stated that Oracle sees four types of business processes — system-centric, human-centric, document-centric and decision-centric (which match the Forrester divisions) — but believes that a single product/engine that can handle all of these is the way to go, since few processes fall purely into one of these four categories. They support BPEL for service orchestration and BPMN for modeling, and their plan is to converge a single platform that supports both BPEL and BPMN (I assume that he means both service orchestration and human-facing workflow). Given that, here’s their strategy for Business Process Management products:

Oracle BPM product strategy

  • Strategic:
    • Oracle BPA Designer for process modeling and simulation
    • BEA AL-BPM Designer for iterative process modeling
    • Oracle BPM, which will be the convergence of BEA AquaLogic BPM and Oracle BPEL Process Manager in a single runtime engine
    • Oracle Document Capture & Imaging for document capture, imaging and document workflow with ERP integration [emphasis mine]
    • Oracle Business Rules as a declarative rules engine
    • Oracle Business Activity Monitoring [same as in SOA section]
    • Oracle WebCenter as a process portal interface to visualize composite processes

Similar to the ESB categorization, I find the classification of the converged Oracle BPM product (BEA AL-BPM and Oracle BPEL PM) as “strategic” to be at odds with his original definition: it should be in the “continue and converge” category, since the products are being converged. This convergence is not, however, unexpected: having two separate BPM platforms would just be asking for trouble. In fact, I would say that having two process modelers is also a recipe for trouble: they should look at how to converge the Oracle BPA Designer and the BEA AL-BPM Designer.

In the portals and Enterprise 2.0 product area, Kurian was a bit more up-front about how WebLogic Portal and AquaLogic UI are going to be merged into the corresponding Oracle products:

Oracle portal and Enterprise 2.0 product strategy

  • Strategic:
    • Oracle Universal Content Management for content management repository, security, publishing, imaging, records and archival
    • Oracle WebCenter Framework for portal development and Enterprise 2.0 services
    • Oracle WebCenter Spaces & Suite as a packaged self-service portal environment with social computing services
    • BEA Ensemble for lightweight REST-based portal assembly
    • BEA Pathways for social interaction analytics
  • Continue and converge:
    • BEA WebLogic Portal will be integrated into the WebCenter framework
    • BEA AquaLogic User Interaction (AL-UI) will be integrated into WebCenter Spaces & Suite
  • Maintenance:
    • BEA Commerce Services
    • BEA Collabra

In SOA governance:

  • Strategic:
    • BEA AquaLogic Enterprise Repository to capture, share and manage the change of SOA artifacts throughout their lifecycle
    • Oracle Service Registry for UDDI
    • Oracle Web Services Manager for security and QOS policy management on services
    • EM Service Level Management Pack as a management console for service level response time and availability
    • EM SOA Management Pack as a management console for monitoring, tracing and change managing SOA
  • Maintenance:
    • BEA AquaLogic Services Manager

Kurian discussed the implications of this product strategy on Oracle Applications customers: much of this will be transparent to Oracle Applications, since many of these products form the framework on which the applications are built, but are isolated so that customizations don’t touch them. For those changes that will impact the applications, they’ll be introduced gradually. Of course, some Oracle Apps are already certified with BEA products that are now designated as strategic Oracle products.

Oracle has also simplified their middleware pricing and packaging, with products structured into 12 suites:

Oracle Middleware Suites

He summed up with their key messages:

  • They have a clear, well-defined, integrated product strategy
  • They are protecting and enhancing existing customer investments
  • They are broadening Oracle and BEA investment in middleware
  • There is a broad range of choice for customers

The entire briefing will be available soon for replay on Oracle’s website if you’re interested in seeing the full hour and 45 minutes. There’s more information about the middleware products here, and you can sign up to attend an Oracle BEA welcome event in your city.

Architecture & Process keynote: Bill Curtis

The second part of the morning keynote was by Bill Curtis, who was involved in developing CMM and CMMI, and now is working on the Business Process Maturity Model (BPMM). I’ve seen quite a bit about BPMM at OMG functions, but this is the first time that I’ve heard Curtis speak about it.

He started by talking about the process/function matrix, where functions focus on the performance of skills within an area of expertise, and processes focus on the flow and transformation of information or material. In other words, functions are the silos/departments in organizations (e.g., marketing, engineering, sales, admin, supply chain, finance, customer service), and processes are the flows that cut across them (e.g., concept to retire, campaign to order, order to cash, procure to pay, incident to close). Unfortunately, as we all know, the biggest problems occur in the white space between the silos when the processes aren’t structured properly, and a small error at the beginning of the process causes increasingly large amounts of rework in other departments later in the process: in his example, items left off the bill of sale by sales created missing information in legal, incomplete specs in delivery, and incorrect invoices in finance. Typical for many industries is 30% rework: an alarming figure that would never be tolerated in manufacturing, for example, where rework is measured and visible.
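
The compounding effect of an upstream error can be sketched with a back-of-envelope calculation; the cost figures below follow the classic 1-10-100-style escalation and are invented for illustration, not taken from Curtis’s talk:

```python
# Back-of-envelope sketch of how an upstream error compounds downstream.
# Stage names echo the bill-of-sale example; the fix costs are invented
# (a 1-10-100-style escalation), not figures from the keynote.
stages = ["sales", "legal", "delivery", "finance"]
fix_cost = {"sales": 1, "legal": 10, "delivery": 100, "finance": 1000}

def rework_cost(error_caught_at: str) -> int:
    """Cost of an error introduced in sales but only caught at a later stage:
    every stage up to and including the catch point must redo its work."""
    idx = stages.index(error_caught_at)
    return sum(fix_cost[s] for s in stages[: idx + 1])

for stage in stages:
    print(f"caught in {stage}: {rework_cost(stage)}x the cost of getting it right")
```

The exact multipliers don’t matter; the point is that the cost of the same defect grows by roughly an order of magnitude at each silo boundary it crosses undetected.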

Curtis’ point is that low maturity organizations have a staggering amount of rework, causing incredibly inefficient processes, and they don’t even know about it because they’re not measuring it. As with many things, introspection breeds change. And just as Ted Lewis was talking about EA as not just being IT architecture, but a business-IT decision-making framework, Curtis talked about how the concepts of CMM in IT were expanded into BPMM, a measurement of both business and IT maturity relative to business processes.

In case you haven’t seen the BPMM, here’s the five levels:

  • Level 1 – Initial: inconsistent management (I would have called this Level 0 for consistency with CMM, but maybe that was considered too depressing for business organizations). Curtis called the haphazard measures at this level “the march of 1000 spreadsheets”, which is pretty accurate.
  • Level 2 – Managed: work unit management, achieved through repeatable practices. Measurements in place tend to be localized status and operational reports that indicate whether local work is on target or not, allowing them to start to manage their commitments and capacity.
  • Level 3 – Standardized: process management based on standardized practices. Transitioning from level 2 to 3 requires tailoring guidelines, allowing the creation of standard processes while still allowing for exceptions: this tends to strip a lot of the complexity out of the processes, and makes it worth considering automation (automation of level 2 just paves the cowpaths). Measurements are now focused on process measures, usually based on reacting to thresholds, which allows both more accurate processes and more accurate cost-time-quality measures for better business planning.
  • Level 4 – Predictable: capability management through statistically controlled practices. Statistical measurements throughout a process — true process analytics — are now used to predict the outcome: not only are the measurements more sophisticated, but the process is sufficiently repeatable (low variance) that accurate prediction is possible. If you’re using Six Sigma, this is where the full set of tools and techniques are used (although some will be used at levels 2 and 3). This allows predictive models to be used both for predicting the results of work in progress, and for planning based on accurately estimated capabilities.
  • Level 5 – Innovative: innovation management through innovative practices. This is not just about innovation, but about the agility to implement that innovation. Measurements are used for what-if analysis to drive into proactive process experimentation and improvement.

The top two levels are really identical to innovative management practices, but the advantage of BPMM is that it provides a path to get from where we are now to these innovative practices. Curtis also sees this as a migration from a chaotic clash of cultures to a cohesive culture of innovation.

This was a fabulous, fast-paced presentation that left me with a much deeper understanding of — and appreciation for — BPMM. He had some great slides with this, which will apparently be available on the Transformation & Innovation website later this week.

Now the hard part starts: trying to pick between a number of interesting-sounding breakout sessions.