IBM Vision for BPM, ODM and SOA

Opening day at IBM Impact 2012 (there were some sessions yesterday, but today is the real start), and a good keynote focused on innovation. The wifi is appalling – if IBM can’t get this right with their messages about scalability, who can? – so not sure if I’ll have the chance to post any of this throughout the day, or if you’ll get it all when I get back to my hotel room.

This post is based on a pre-conference briefing that I had a week or two ago, a regular conference breakout session this morning, and the analyst briefing this afternoon, covering IBM’s vision for BPM, ODM (decision management) and SOA. Their customers are using technology to drive process innovation, and the IBM portfolio is working to address those needs. Cross-functional business outcomes, which in turn require cross-functional processes, are enabled by collaboration and by better technical integration across silos. And, not surprisingly, their message is moving towards Gartner’s upcoming iBPMS vision: support for structured and unstructured processes; flexible integration; and rules and analytics for repeatable, flexible decisions. Visibility, collaboration and governance are key, not just within departmental processes, but when linking together all processes in an organization into an enterprise process architecture.

The key capabilities that they offer to help clients achieve process innovation include:

  • Process discovery and design (Blueworks Live)
  • Business process management (Process Server and Process Center)
  • Operational decision management (Decision Server and Decision Center)
  • Advanced case management (Case Manager, which is the FileNet-based offering that is not part of this portfolio, but is integrated with it)
  • Business monitoring (Business Monitor)

Underpinning these are master data management, integration, analytics and enterprise content management, surrounded by industry expertise and solutions. IBM is using the term intelligent business operations (which was front and center at Gartner BPM last week) to describe the platform of process, events and decision, plus appropriate user interfaces for visibility and governance.

Blueworks Live is positioned not just as a front-end design tool for process automation, but as a tool for documenting processes. Many of the 300,000 processes that have been documented in Blueworks Live are never automated in IBM BPM or any other “real” BPMS, but it acts as a repository for discovering and documenting processes in a collaborative environment, and allows process stakeholders to track changes to processes and see how they impact their business. There is an expanded library of templates, plus an insurance framework and other templates/frameworks coming up.

One exciting new feature (okay, exciting to me) is that Blueworks Live now allows decision tasks to be defined in process models, including the creation of decision tables: this provides an integrated process/decision discovery environment. As with process, these decisions do not need to become automated in a decision management system; this may just document the business rules and decisions as they are applied in manual processes or other systems.
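For illustration, here’s the general shape of such a decision table – a made-up example of my own, not one from Blueworks Live – for a decision task like “route an insurance claim”, where condition columns map to an action regardless of whether the decision is ever automated:

```
claim amount        policyholder tenure    action: route to
------------        -------------------    ----------------
under $1,000        any                    auto-approve
$1,000 - $10,000    over 5 years           junior adjuster
$1,000 - $10,000    5 years or less        senior adjuster
over $10,000        any                    senior adjuster, with audit
```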

Looking at IBM BPM v8, which is coming up soon, Ottosson took us through the main features:

  • Social collaboration to allow users to work together on tasks via real-time interactions, view activity streams, and locate experts. This manifests in the redesigned task interface, or “coach”, with a sidebar that includes task details, the activity stream for the entire process, and experts that are either recommended by the system based on past performance or manually curated by others. An expert can be asked to collaborate on a task with another user – it includes presence, so that you can tell who is online at any given time – allowing the expert to view the work that the user is doing and offer assistance. Effectively, multiple people are given access to the same piece of work, and updates made by anyone are shown to all participants; this can be asynchronous or synchronous.
  • There is also a redesigned inbox UI, with a more up-to-date look and feel, lots of AJAX-y goodness, sorting and coloring by priority, plus the ability to respond to simple tasks inline directly in the inbox rather than opening a separate task view. It provides a single task inbox for a variety of sources, including IBM BPM, Blueworks workflows and Case Manager tasks.
  • Situational awareness with process monitoring and analysis in a performance data warehouse.
  • Mobile access via an iOS application that can interface with Blueworks Live and IBM BPM; if you search for “IBM BPM” in the iTunes app store (but not, unfortunately, in the Android Market), you’ll find it. It supports viewing the task list, completing tasks, attaching documents and adding comments. They are considering releasing the source code to allow developers to use it as a template, since there is likely to be demand for a customized or branded version of this. In conjunction with this, they’ve released a REST API tester similar to the sort of sandbox offered by Google, which allows developers to create REST-based applications (mobile or otherwise) without having to own the entire back-end platform (see the task-list sketch after this list). This will certainly open up the add-on BPM application market to smaller developers, where we are likely to see more innovation.
  • Enhancements to Process Center for federation of different Process Centers, each of which implies a different server instance. This allows departmental instances to share assets, as well as draw from an internal center of excellence plus one hosted by IBM for industry standards and best practices.
  • Support for the CMIS standard to link to any standard ECM repository, as well as direct integration with FileNet ECM, to link documents directly into processes through a drag-and-drop interface in the process designer (see the CMIS sketch after this list).
  • There are also some improvements to the mashup tool used for forms design using a variety of integration methods, which I saw in a pre-conference briefing last week. This uses some of the resources from IBM Mashup Centre development team, but the tool was built new within IBM BPM.
  • Cloud support through IBM SmartCloud, which appears to be more of a managed server environment if you want full IBM BPM, but does offer BPM Express as a pre-installed cloud offering. At last year’s Impact, their story was that they were not doing BPM (that is, execution, not the Blueworks-type modeling and lightweight workflow) in the cloud since their customers weren’t interested in that; at that time, I said that they needed to rethink their strategy on this and stop offering expensive custom hosted solutions. They’ve taken a small step by offering a pre-installed version of BPM Express, but I still think this needs to advance further.
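On the REST API tester mentioned in the mobile bullet above: as a sketch of what that style of development looks like, the following fetches a user’s task list over plain HTTP with basic authentication. The endpoint path, query parameters and credentials here are placeholders of my own invention, not IBM’s actual API; the point is simply that a task client needs nothing beyond standard HTTP and JSON handling.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TaskListClient {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint path and query parameters, for illustration only
        URL url = new URL("https://bpm.example.com/rest/tasks?user=jsmith&status=open");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/json");

        // Basic authentication with placeholder credentials
        String auth = Base64.getEncoder()
                .encodeToString("jsmith:secret".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        // Read the JSON response; a real client would parse it into task objects
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```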
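And on the CMIS bullet: because CMIS is a vendor-neutral standard, a process application can query any compliant ECM repository through a generic client library rather than a product-specific API. Here’s a minimal sketch using Apache Chemistry OpenCMIS, with the repository URL, credentials and query values as placeholders:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.chemistry.opencmis.client.api.ItemIterable;
import org.apache.chemistry.opencmis.client.api.QueryResult;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
import org.apache.chemistry.opencmis.commons.SessionParameter;
import org.apache.chemistry.opencmis.commons.enums.BindingType;

public class CmisLookup {
    public static void main(String[] args) {
        // Connection parameters for any CMIS-compliant repository (placeholder values)
        Map<String, String> params = new HashMap<>();
        params.put(SessionParameter.USER, "user");
        params.put(SessionParameter.PASSWORD, "password");
        params.put(SessionParameter.ATOMPUB_URL, "https://ecm.example.com/cmis/atom");
        params.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());
        params.put(SessionParameter.REPOSITORY_ID, "repo1");

        Session session = SessionFactoryImpl.newInstance().createSession(params);

        // CMIS Query Language: find documents related to a given claim number
        ItemIterable<QueryResult> results = session.query(
                "SELECT cmis:objectId, cmis:name FROM cmis:document "
                + "WHERE cmis:name LIKE 'claim-12345%'", false);

        for (QueryResult hit : results) {
            String name = hit.getPropertyValueByQueryName("cmis:name");
            System.out.println(name);
        }
    }
}
```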

WebSphere Operational Decision Management (ODM) is an integration/bundling of WebSphere Business Event Manager and ILOG, bringing together events and rules into a single decision management platform for creating policies and deploying decision services. It has a number of new features:

  • Social interface for business people to interact with rules design: decisions are assets that are managed and modified, and the event stream/conversation shows how those assets are being managed. This interface makes it possible to subscribe to changes on specific rules.
  • Full text searching across rules, rule flows, decision tables and folders within a project, with filtering by type, status and date.
  • Improved decision table interface, making it easier to see what a specific table is doing.
  • Track rule versions through a timeline (weirdly reminiscent of Facebook’s Timeline), including snapshots that provide a view of rules at a specific point in time.
  • Any rule can emit an event to be consumed/managed by the event execution engine; conversely, events can invoke rulesets, as sketched below. This close integration of the two engines within ODM (rules and events) is a natural fit for agile and rapid automated decisions.
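A minimal sketch of that closed loop, using hypothetical interfaces of my own rather than the actual ODM programming model: a rule action emits an event, and a detected event pattern invokes a ruleset in response.

```java
// Hypothetical types for illustration only; not the ODM API
interface EventEngine {
    void emit(String eventType, Object payload);          // rules can emit events
    void onEvent(String eventType, Runnable handler);     // react to detected events
}

interface RuleEngine {
    void invokeRuleset(String rulesetName, Object facts); // events can invoke rulesets
}

class ClosedLoopSketch {
    void wire(EventEngine events, RuleEngine rules, Object order) {
        // Conceptually inside a rule's action ("then") part: emit an event
        // for the event engine to correlate with other events
        events.emit("HighValueOrder", order);

        // Conversely, when the event engine detects a pattern, it can
        // invoke a ruleset to make the resulting decision
        events.onEvent("HighValueOrder", () ->
                rules.invokeRuleset("fraudScreening", order));
    }
}
```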

There’s also zOS news: IBM BPM v8 will run on zOS (not sure if that includes all server components), and the ODM support for zOS is improved, including COBOL support in rules. It would be interesting to see the cost relative to other server platforms, and the compelling reasons to deploy on zOS versus those other platforms, which I assume are mostly around integrating with other zOS applications for better runtime performance.

Since last year’s big announcement about bringing the platforms together, they appear to have been working on integration and design, putting a more consistent and seamless user interface on the portfolio as well as enhancing the capabilities. One of the other analysts (who will remain nameless unless he chooses to identify himself) pointed out that a lot of this is not all that innovative relative to market leaders – he characterized the activity stream social interface as being like Appian Tempo three years ago, and some of the functionality as just repackaged Lombardi – but I don’t think that it’s necessarily IBM’s role to be at the very forefront of technology innovation in application software. By being (fairly) fast followers, they have the effect of validating the market for the new features, such as mobile and social, and introducing their more conservative customer base to what might seem like pretty scary concepts.

Emerging Trends in BPM – Five Years Later

I just found a short article that I wrote for Savvion (now part of Progress Software) dated November 21, 2006, and decided to post it with some updated commentary on the 5th anniversary of the original paper. Enjoy!

Emerging trends in BPM
What happened in 2006, and what’s ahead in 2007

The BPM market continues to evolve, and although 2006 has seen some major events, there will be even more in 2007. This column takes a high-level view of four areas of ongoing significant change in BPM: the interrelationship between SOA and BPM; BPM standards; the spread of process modeling tools; and the impact of Web 2.0 on BPM.

SOA and BPM, together at last. A year ago, many CIOs couldn’t even spell SOA, much less understand what it could do for them. Now, Service-Oriented Architecture and BPM are seen as two ends of the spectrum of integration technologies that many organizations are using as an essential backbone for business agility.

SOA is the architectural philosophy of exposing functionality from a variety of systems as reusable services with standardized interfaces; these, in turn, can be orchestrated into higher-level services, or consumed by other services and applications. BPM systems consume the services from the SOA environment and add in any required human interaction to create a complete business process.

As with every year for the last several years, 2006 has seen ongoing industry consolidation, particularly with vendors seeking to bring SOA and BPM together in their product portfolios. This trend will continue as SOA and BPM become fully recognized as being two essential parts of any organization’s process improvement strategy.

There has certainly been consolidation in the BPM vendor portfolios, especially the integration vendors adding better human-centric capabilities through acquisitions: Oracle acquired BEA in 2008, IBM acquired Lombardi in 2009, Progress acquired Savvion in 2010, and TIBCO acquired Nimbus in 2011. Although BPM is being used in some cases to orchestrate and integrate systems using services, this is still quite a green field for many organizations who have implemented BPM but are still catching up on exposing services from their legacy applications, and orchestrating those with BPM.

BPM standards. 2006 was the year that the Business Process Modeling Notation (BPMN), a notational standard for the graphical representation of process models, went mainstream. Version 2 of the standard was released, and every major BPM vendor is providing some way for their users to make use of the BPMN standard, whether it’s through a third-party modeling tool or directly in their own process modelers.

But BPMN isn’t the only standard that gained importance this year. 2006 also saw the widespread adoption of XPDL (XML Process Definition Language) by BPM vendors as an interchange format: once a process is modeled in BPMN, it’s saved in the XPDL file format to move from one system to another. A possible competitor to XPDL, the Business Process Definition Metamodel (BPDM) had its first draft release this year, but we won’t know the impact of this until later in 2007. On the SOA side, the Business Process Execution Language (BPEL), a service orchestration language, is now widely accepted as an interchange format, if not a full execution standard.

The adoption of BPM standards is critical as we consider how to integrate multiple tools and multiple processes to run our businesses. There’s no doubt that BPMN will remain the predominant standard for the graphical representation of process models, but 2007 could hold an interesting battle between XPDL, BPDM and BPEL as serialization formats.

The “Version 2” that I referred to was actually the second released version of the BPMN standard, but the actual version number was 1.1. That battle for serialization formats still goes on: most vendors support XPDL (and will continue to do so) but are also starting to support the (finally released) BPMN file format as well. BPDM disappeared somewhere in the early days of BPMN 2.0. BPEL is used as a serialization and interchange format primarily between systems that use BPEL as their core execution language, which are a minority in the broader BPMS space.

Modeling for the masses. In March of 2006, Savvion released the latest version of their free, downloadable process modeler: an application that anyone, not just Savvion customers, could download, install and run on their desktop without requiring access to a server. This concept, pioneered by Savvion in 2004, lowers the barrier significantly for process modeling and allows anyone to get started creating process models and finding improvements to their processes.

Unlike generic modeling tools like Microsoft Visio, a purpose-built process modeler can enforce process standards, such as BPMN, and can partially validate the process models before they are even imported into a process server for implementation. It can also provide functionality such as process simulation, which is essential to determining improvements to the process.

2006 saw other BPM vendors start to copy this initiative, and we can expect more in the months to come.

Free or low-cost process modelers have proliferated: there are web-based tools, downloadable applications and Visio BPMN add-ons that have made process modeling accessible – at least financially – to the masses. The problem continues to be that many people using the process modeling tools lack the analysis skills to do significant process optimization (or even, in some cases, representation of an event-driven process): the hype about having all of your business users modeling your business processes has certainly exceeded the reality.

Web 2.0 hits BPM. Web 2.0, a set of technologies and concepts embodied within the next generation of internet software, is beginning to impact enterprise software, too.

Web 2.0 is causing changes in BPM by pushing the requirement for zero-footprint, platform-independent, rich user interfaces, typically built using AJAX (Asynchronous JavaScript and XML). Although browser-based interfaces for executing processes have been around for many years in BPM, the past year has seen many of these converted to AJAX for a lightweight interface with both functionality and speed.

There are two more Web 2.0 characteristics that I think we’re going to start seeing in BPM in 2007: tagging and process syndication. Tagging would allow anyone to add freeform keywords to a process instance (for example, one that required special handling) to make it easier to find that instance in the future by searching on the keywords. Process event syndication would allow internal and external process participants to “subscribe” to a process, and feed that process’ events into a standard feed reader in order to monitor the process, thereby improving visibility into the process through the use of existing feed technologies such as RSS (Really Simple Syndication).

Bringing Web 2.0 to BPM will require a few changes to corporate culture, especially those parts that require different – and more creative – types of end-user participation. As more people at all levels in the organization participate in all facets of process improvement, however, the value of this democratization of business processes will become clear.

I’ve been writing and presenting about the impact of social software on BPM for over five years now; adoption has been slower than I predicted, although process syndication (subscribing to a process’ events) has finally become mainstream. Tagging of processes is just starting to emerge; I’ve seen it in BonitaSoft but few other places.
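For a sense of what that process event syndication looks like in practice, here’s a rough, made-up example of a single process instance’s event feed as standard RSS 2.0 – not any vendor’s actual format – which any feed reader could then subscribe to:

```xml
<rss version="2.0">
  <channel>
    <title>Order 12345: process events</title>
    <link>http://bpm.example.com/processes/12345</link>
    <description>Event feed for one order-handling process instance</description>
    <item>
      <title>Task completed: credit check</title>
      <pubDate>Tue, 21 Nov 2006 14:03:00 GMT</pubDate>
      <guid>http://bpm.example.com/processes/12345/events/7</guid>
      <description>Completed by jsmith; result: approved</description>
    </item>
  </channel>
</rss>
```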

I rarely do year-end prediction posts, but it was fun to look back at one that I did five years ago to see how well I did.

Enterprise BPM Webinar Q&A Followup

I know, two TIBCO-related posts in one day, but I just received the link to the replay of the Enterprise BPM webinar that I did for TIBCO last week, along with the questions that we didn’t have time to answer during the webinar, and wanted to summarize here.

These were the questions that came in during the webinar via typed chat that are not related to TIBCO or its products; I think that we covered some of these during the session but will respond to all of them here.

Is it possible to implement BPM (business process management) without a BPMS?

How to capture process before/without technology?

These are both about doing BPM without a BPMS. I wrote recently about Elevations Credit Union (the fact that they are an IBM customer is completely immaterial in this context), which gained a huge part of their BPM success long before they touched any technology. Basically, they carved out some high-level corporate goals related to quality, modeled their value streams, then documented their existing business processes relative to those value streams. Every business process had to fit into a value stream (which was in turn related to a corporate goal), or else it didn’t survive. They saw how processes touched various different groups, and where the inefficiencies lay, and they did all of this using manual mapping on whiteboards, paper and sticky notes. In other words, they used the management discipline and methodology side of BPM before they (eventually) selected a tool for collaborative process modeling, which then helped them to spread the word further in their organization. There is a misperception in some companies that if you buy a BPMS, your processes will improve, but you really need to reorient your thinking, management and strategic goals around your business processes before you start with any technology, or you won’t get the benefits that you are expecting.

In enterprises that do not have SOA implemented horizontally across the organization, how can BPM be leveraged to implement process governance in the LOB silos, yet have enterprise control?

A BPM center of excellence (CoE) would be the best way to ensure process governance across siloed implementations. I wrote recently about a presentation that I was at where Roger Burlton spoke about BPM maturity; there was some advice that he had at the end of that about organizations that had only a level 1 or 2 in process maturity (which, if you’re still very siloed, you’re probably at): get a CoE in place and target it more at change initiatives than governance. However, you will be able to leverage the CoE to put standards in place, provide mentoring and training, and eventually build a repository of reusable process artifacts.

I work in the equipment finance industry. Companies in this space are typically classified as banks/bank-affiliates, captives and independents. With a few exceptions it’s my understanding that this particular industry has been rather slow at adopting BPMS. Have you noticed this in other industries and, if so, what do you see as being the “tipping point” for greater BPMS adoption rates? Does it ultimately come down to a solid ROI, or perhaps a few peer success stories?

My biggest customers are in financial services and insurance, so are also fairly conservative. Insurance, in particular, tends to adopt technology at the very end of the adoption tail. I have seen a couple of factors that can spur adoption of any sort of technology, not just BPMS, in slower-moving industries: first, if they just can’t do business the old way any more, and have to adopt the new technology. An example of this was a business process outsourcer for back-office mutual fund transactions that started losing bids for new work because it was actually written into the RFP that they had to have “imaging and workflow” technology rather than paper-based processes. Secondly, if they can’t change quickly enough to be competitive in the market, which is usually the case when many of their competitors have already started using the technology. So, yes, it does come down to a solid ROI and some peer success stories, but in many cases, the ROI is one of survival rather than just incremental efficiency improvements.

Large scale organizations tend to have multiple BPM / workflow engines. What insights can you share to make these different engines in different organizational BUs into an enterprise BPM capability?

Every large organization that I work with has multiple BPMS, and this is a problem that they struggle with constantly. Going back to the first question, you need to think about both sides of BPM: it’s the management discipline and methodology, then it’s the technology. The first of these, which is arguably the one with the biggest impact, is completely independent of the specific BPMS that you’re using: it’s about getting the organization oriented around processes, and understanding how the end-to-end business processes relate to the strategic goals. Building a common BPM CoE for the enterprise can help to bring all of these things together, including the expertise related to the multiple BPM products. By bringing them together, it’s possible to start looking at the target use cases for each of the systems currently in use, and selecting the appropriate system for each new implementation. Eventually, this may lead to some systems being replaced to reduce the number of BPMS used in the organization overall, but I rarely see large enterprises without at least two different BPMS in use, so don’t be fanatical about getting it down to a single system.

Typically what is the best order to implement; first BPM and last SOA, or vice versa.

I recommend a hybrid approach rather than purely top-down (BPM first) or bottom-up (SOA first). First, do an inventory of your environment for existing services, since there will almost always be some out there, even if just in your packaged applications such as ERP. While this is happening, start your BPM initiative by setting the goals and doing some top-down process modeling. Assuming that you have a particular process in mind for implementation, do the more detailed process design for that, taking advantage of any services that you have discovered, and identifying what other services need to be created. If possible, implement the process even without the services: it will be no worse from an efficiency standpoint than your current manual process, and will provide a framework both for adding services later and for process monitoring. As you develop the services for integration and automation, replace the manual steps in the process with the services.

Re: Enterprise BPM Goals – Develop, Execute, but what about Governance?

This was in response to the material on my agenda for the webinar. Yes, governance is important, but I only had 40 minutes and could barely cover the design/develop/execute parts of what we wanted to cover. Maybe TIBCO will have me back for another webinar on governance. 😉

Data/content centric processes vs. people-centric vs. EAI/integration centric re: multiple BPMS platforms. Any guidelines for when and where to demarcate?

These divisions are very similar to the Forrester divisions of the BPMS landscape from a few years ago, and grew mostly out of the different types of systems that were all lumped together as “BPMS” by the analysts in the early 2000’s. Many of today’s products offer strength in more than one area, but you need to have a good understanding of your primary use cases when selecting a product. Personally, I think that content-centric and human-centric isn’t the right way to split it: more like unstructured (case management) versus structured; even then, there is more of a spectrum of functionality in most cases than purely unstructured or purely structured. So really, the division is between processes that have people involved (human-centric) or those that are more for automated integration (system-centric), with the latter having to accommodate this wider spectrum of process types. If you have mostly automated integration processes, then certainly an integration-centric BPMS makes sense; if you have human-facing processes, then the question is a bit more complex, since you’re dealing with content/documents, process types, social/collaborative capabilities and a host of other requirements that you need to look at relative to your own use cases. In general, the market is moving towards the full range of human-facing processes being handled by a single product, although specialist product companies would differ.

Thoughts on the role of the application/solution architect within an LOB or COE vs. that of the enterprise architect assigned to the BPM domain?

An enterprise architect assigned to the BPM CoE/domain is still (typically) part of the EA team, and therefore involved with the broader scope of enterprise architecture issues. An application/solution architect tends to be more product and technology focused, and in many cases that is just a fancy term for a developer. In other words, the EA should be concerned with overall strategy and goals, whereas the solution architect is focused on implementation.

Role of the COE in governance? How far does/should it extend?

The CoE is core to governance: that’s what it’s there for. At the very least, the CoE will set the standards and procedures for governance, and may rely on the individual projects to enforce that governance.

Is it really IT giving up control? In many cases, the business does whatever they do — and IT has little (or aged) information about the actual processes.

This was in reference to slide #11 in my deck about cultural issues. Certainly the business can (and often does) go off and implement their own processes, but that happens outside the context of enterprise-wide systems. In order to have the business doing that within the enterprise BPMS, IT has to ensure that the business can access the process discovery and modeling tools that become the front end of process design. That way, business and IT share models of the business processes, which means that what gets implemented in the BPMS might actually resemble what is required by the business. In some cases, I see a company buy a BPMS but not allow the business users to use the business-level tools to participate in process modeling: this is usually the result of someone in IT thinking that this is beyond the capability of the business people.

Is following of any BPM notation standards part of BPM development? I saw that there was no mention of it.

There was so much that I did not have time to address with only 40 minutes or so to speak, and standards didn’t make the cut. In longer presentations, I always address the issue of standards, since a common process modeling notation is essential to communication between various stakeholders. BPMN is the obvious front-runner there, and if used properly, can be understood by both business and IT. It’s not just about process models, however: a BPMS implementation has to also consider data models, organizational models and more, around which there is less standardization.

Regarding Common UI: shouldn’t it be Common Architecture, accessed by different UIs that fit the user’s roles, knowledge, etc?

In the context of slide #6, I did mean a common UI, literally. In other words, using the BPMS’ composite application development and forms environment to create a user interface that hides multiple legacy applications behind a single user interface, so that the user deals with this new integrated UI instead of multiple legacy UIs. Your point seems to be more about persona-based (or role-based) interfaces into the BPMS, which is a valid, but different, point. That “single UI” that I mention would, in fact, be configurable for the different personas who need to access it.

How does a fully fledged BPM tool stack up against workflow tools part of other COTS application, e.g. workflow in a document management tool or in a trouble ticketing tool?

A full BPMS tends to be much more flexible than the embedded workflow within another platform, and is more of an application development platform than just a way to control processes within that application. On the other side, the workflow within those applications is typically already fully integrated with the other business objects within them (e.g., documents, trouble tickets), so the implementation may be faster for that particular type of process. If the only type of process management that you need is document approvals within your document management system, it may make sense to use that rather than purchase a full BPMS; if you have broader process management needs, start looking at a more general BPMS platform that can handle more of your use cases.

How do u see BPM tools surviving when CRM tools with more or less same capability is getting widely accepted by enterprises with out-of-box processes defined?

Similar to my response to the previous question, if the processes are related only to the business objects within the CRM, then you may be better off using the workflow tools within it. However, as soon as you want to integrate in other data sources, systems or users, you’ll start to get beyond the functional capabilities of the simpler workflow tools within the CRM. There’s room in the market for both; the trick is, for customers, to understand when to use one versus the other.

What are the reasons you see for BPM tools not getting quickly and widely accepted and what are the solutions to overcome that?

There are both cost and complexity components with BPMS adoption, but a big reason before you even start looking at tools is moving your organization to a process-driven orientation, as I discussed above. Once people start to look at the business as end-to-end processes, and those processes as assets and capabilities that the business offers to its customers, there will be a great pull for BPMS technologies to help that along. Once that motivation is in place, the cost and complexity barriers are still there, but are becoming less significant: first of all, more vendors are offering cloud-based versions of their software that allow you to try it out – and even do your full development and testing – without capital expenditures. If they offer the option, you can move your production processes on-premise, or leave them in the cloud to keep the total cost down. As for complexity, the products are getting easier to use, but are also offering a lot more functionality. This shifts the complexity from one of depth (learning how to do a particular function) to breadth (learning what all the functions are and when to use which), which is still complex but less of a technological complexity.

Is it possible to start introducing and implementing BPM in one department or module only and then extending the BPM to other departments or modules? Or this should be the enterprise wide decisions since it involves heavy cost to bring BPM technologies.

Almost every organization that I work with does their BPM implementation in one department first, or for one process first (which may span departments): it’s just not possible to implement everything that you will ever implement in BPM at the same time, first time. There needs to be ROI within that first implementation, but you also have to look at enterprise cost justification as with any horizontal technology: plan for the other projects that will use this, and allocate the costs accordingly. That might mean that some of the initial costs come from a shared services or infrastructure budget rather than the project budget, because they will eventually be allocated to future projects and processes.

How difficult would it be to replace legacy workflow system with BPM?

It depends (that’s always the consultant’s answer). Seriously, though, it depends on the level of integration between the existing workflow system and other systems, and how much of the user interface it provides. I have seen situations where a legacy workflow system is deeply embedded in a custom application platform, with fairly well-defined integration points to other systems, and the user interface hiding the workflow system from the end user. In this case, although it’s not trivial, it is a straightforward exercise to rip out the workflow system since it is being used purely as a process engine, replace it with a new one, refactor the integration points so that the new system calls the other systems in the environment (usually easier since modern BPMS’ have better integration capabilities) and refactor the custom UI so that it calls the new BPMS (also usually easier because of updated functionality). That’s the best case, and as I said, it’s still not trivial. If the legacy workflow system also provides the user interface, then you’re looking at redeveloping your entire UI either in the new BPMS or in some other UI development tool, plus the back-end systems integration work. A major consideration in either case is that you don’t just want to replace the same functionality of the old workflow system, since the new BPMS will have far greater functionality: you need to think about how you are going to leverage capabilities such as runtime collaboration that never existed in the old system, in order to see the greatest benefit from the upgrade.

Is it possible to switch between BPM vendors without having pain?

No. Similar to the previous answer, this is a non-trivial exercise and, depending on how much of the BPMS’ functionality you were using, could be pretty much a complete redevelopment. If the BPMS was used primarily for orchestration of automated processes, it will be much easier, but as soon as you get into custom integration/orchestration and user interfaces, it gets a lot more complicated (and painful).

Do we really need to go for BPM in a situation where we need only integration orchestration only?

One end of the BPMS market is integration-centric systems, which primarily do just integration orchestration. The advantage of using a BPMS for this instead of orchestrating directly in application code is that you get all of the other stuff that comes with the BPMS “for free”: graphical process modeling, execution monitoring, process governance and whatever other goodies are in the BPMS. It’s not really free, of course, but it’s valid to consider a comparison of all of that functionality against what parts of it you would have to custom-build if you were to do the orchestration in code.
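To make that comparison concrete, here’s a sketch of what hand-coded orchestration looks like, with placeholder service interfaces of my own invention. Every concern that a BPMS surfaces in its tooling – the audit trail, monitoring, branching that would be visible in a process diagram – becomes code to write and maintain yourself:

```java
// Placeholder service interfaces, for illustration only
interface CreditService { boolean check(String customerId); }
interface FulfillmentService { void ship(String orderId); }
interface AuditLog { void record(String step, String detail); }

class OrderOrchestrator {
    private final CreditService credit;
    private final FulfillmentService fulfillment;
    private final AuditLog audit;

    OrderOrchestrator(CreditService c, FulfillmentService f, AuditLog a) {
        credit = c;
        fulfillment = f;
        audit = a;
    }

    void process(String orderId, String customerId) {
        audit.record("start", orderId);            // hand-rolled execution monitoring
        if (!credit.check(customerId)) {           // branching invisible outside the code
            audit.record("rejected", orderId);
            return;
        }
        try {
            fulfillment.ship(orderId);             // integration call
            audit.record("shipped", orderId);
        } catch (RuntimeException e) {
            audit.record("error", e.getMessage()); // hand-rolled error handling
            throw e;
        }
    }
}
```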

That’s it for the Q&A. If you listen to the replay, or were on the live broadcast, my apologies for the rushed beginning: I got off on the wrong foot out of the gate, but settled down after the first few minutes.

TIBCO Product Strategy With Matt Quinn

Matt Quinn, CTO, gave us the product strategy presentation that will be seen in the general session tomorrow. He repeated the “capture many events, store few transactions” message as well as the five key components of a 21st century platform that we heard from Murray Rode in the previous session; this is obviously a big part of the new messaging. He drilled into their four broad areas of interest from a product technology standpoint: event platform innovation, big data and analytics, social networking, and cloud enablement.

On the event platform innovation front, they released BusinessEvents 5.0 in April this year, including the embedded TIBCO Datagrid technology, temporal pattern matching, stream processing and rules integration, and some performance and big data optimizations. One result is that application developers are now using BusinessEvents to build applications from the ground up, which is a change in usage patterns. For the future, they’re looking at supporting other models, such as BPMN and rule models, integrating statistical models, improving queries, improving the web design environment, and providing ActiveMatrix deployment options.

In ActiveMatrix, they’ve released a fully integrated stack of BusinessWorks, BPM and ServiceGrid with broader .Net and C++ support, optimized for large deployments and with better high-availability support and hot deployment capabilities. AMX/BPM has a number of new enhancements, mostly around the platform (such as the aforementioned HA and hot deployment), with their upcoming 1.2 release providing some functional enhancements such as custom forms and business rules based on BusinessEvents. We’ll see some Nimbus functionality integration before too much longer, although we didn’t see that roadmap; as Quinn pointed out, they need to be cautious about positioning which tools are for business users versus technical users. When asked about case management, he said that “case management brings us into areas where we haven’t yet gone as a company and aren’t sure that we want to go”. Interesting comment, given the rather wild bandwagon-leaping that has been going on in the ACM market by BPM and ECM vendors.

The MDM suite has also seen some enhancements, with ActiveSpaces integration and collaborative analytics with Spotfire, allowing MDM to become a hub for reference data from the other products. I’m very excited to see that one-click integration between MDM and AMX/BPM is on the roadmap; I think that MDM integration is going to be a huge productivity boost for overall process modeling, and when I reviewed AMX/BPM last year, I liked their process data modeling and stated that “the link between MDM and process instance data needs to be firmly established so that you don’t end up with data definitions within your BPMS that don’t match up with the other data sources in your organization”. In fact, the design-time tool for MDM is now the same as that used for business object data models that I saw in AMX/BPM, which will make it easier for those who move across the data and process domains.

TIBCO is trying to build out vertical solutions in certain industries, particularly those where they have acquired or built expertise. This not only changes what they can package and offer as products, but changes who (at the customer) they can have a relationship with: it’s now a VP of loyalty, for example, rather than (or in addition to) someone in IT.

Moving on to big data and analytics technology advances, they have released FTL 2.0 (low-latency messaging) to reduce inter-host latency below 2.2 microseconds as well as provide some user interface enhancements to make it easier to set up the message exchanges. They’re introducing TIBCO Web Messaging to integrate consumer mobile devices with TIBCO messaging. They’ve also introduced a new version of ActiveSpaces in-memory data grid, providing big data handling at in-memory speeds by easing the integration with other tools such as event processing and Spotfire.

They’ve also released Spotfire 4.0 visual analytics, with a big focus on ease of use and dashboarding, plus tibbr integration for social collaboration. In fact, tibbr is being used as a cornerstone for collaboration, with many of the TIBCO products integrating with tibbr for that purpose. In the future, tibbr will include collaborative calendars and events, contextual notifications, and other functionality, plus better usability and speed. Formvine has been integrated with tibbr for forms-based routing, and Nimbus Control integrates with tibbr for lightweight processes.

Quinn finished up discussing their Silver Fabric cloud platform to be announced tomorrow (today, if you count telling a group of tweet-happy industry analysts) for public, private and hybrid cloud deployments.

Obviously, there was a lot more information here than I could possibly capture (or that he could even cover; some of the slides just flew past), and I may have to get out of bed in time for his keynote tomorrow morning, since we didn’t even get to a lot of the forward-looking strategy. With a product suite as large as what TIBCO has now, we need much more than an hour to get through an analyst briefing.

TIBCO Corporate Strategy Session with Murray Rode

I’m in Vegas this week at TUCON, TIBCO’s user conference, and this afternoon I’m at the analyst event. For the corporate strategy session, they put the industry analysts and financial analysts together, meaning that there were way too many dark suits in the room for my taste (and my wardrobe).

Murray Rode, COO, gave us a good overview presentation on the corporate strategy, touching on market factors, their suite of products, and their growth in terms of products, geographies and verticals. Definitely, event-driven processes are a driving force behind businesses these days – matching with the “responsive business” message I saw at the Progress conference last week – and TIBCO sees their product suite as being ideally positioned to serve those needs.

Rode defined the key components of a 21st century platform as:

  • Automation (SOA, messaging, BPM) as core infrastructure
  • Event processing
  • Social collaboration
  • Analytics
  • Cloud

Their vision is to be the 21st century middleware company, continuing to redefine the scope and purpose of middleware, and to provide their customers with the “2-second advantage” based on event processing, real-time analytics and process management. They see the middleware market as taking a bite out of application development platforms and out-of-the-box suites by providing higher-functioning, more agile capabilities, and plan to continue their pure-play leadership in middleware.

Looking at their performance in verticals, financial services is now only 25% of their business as they diversify into telecom, government, energy, retail and other market segments. This is an interesting point, since many middleware (including many BPM) vendors grew primarily in financial services, and have struggled to break out of that sector in a significant way.

From a product standpoint, their highest growth is happening in CEP, analytics and MDM, while core stable growth continues in BPM and SOA. They are starting to see new growth in cloud, tibbr, low-latency messaging and Nimbus to drive their future innovation.

They see their key competitors as IBM and Oracle, and realize that they’re the small fish in that pond; however, they see themselves as being more innovative and in touch with current trends, and having a better pure-play focus on infrastructure. Their strategy is to keep defining the platform through a culture of continuous innovation, so as not to become a one-hit wonder like many other now-defunct (or acquired) middleware vendors of the past; to maximize sales execution strengths for growth by setting vertical go-to-market strategies across their product suite; to organize for innovation particularly through cross-selling the newer products into mature opportunities; to cultivate their brand; and to manage for growth and continued profitability, in part by branching beyond their direct sales force, which has been a significant strength for them in the past, to invest in partner and SI relationships to broaden their sales further.

Rode spoke briefly about acquisitions (we’re slated for a longer session on this later today), and positioned Nimbus as having applicability to core infrastructure in terms of analytics and events, not just BPM. It will be interesting to see how that plays out. In general, their focus is on smaller acquisitions to complement and enhance their core offering, rather than big ones that would be much harder to align with their current offerings.

OpenEdge BPM: Modifying an OE Application To Integrate With BPM

I sat in on a breakout session today at the Progress Revolution conference on OpenEdge BPM and migrating existing OpenEdge applications to work with (Savvion) BPM. There are some new ways of doing this that are coming in OE 11 that we are not seeing in this session, but I’ve had a few conversations with people since my blog post yesterday and expect to have a more in-depth briefing on this soon.

Traditionally, in OE development as in many other application development environments, the business process and logic are built directly in the application, intertwined with the user interface and other components. This pre-SOA app dev methodology is pretty old-school, but I think that’s what we’re dealing with in considering a lot of the existing OE apps that have been in production for years: not only were they designed before multi-tiered architecture was generally accepted practice, in many cases they may have been designed by less-technical business people who weren’t aware of these types of coding standards.

Now that Savvion BPM is integrated with OE, the plan will be that business processes will be made explicit in the BPM layer, and eventually much of the user interface will also be moved to the BPM forms layer, while the more traditional OE code will provide the services layer that is consumed by the process. This creates a more flexible and open service oriented architecture, allowing the BPM processes to consume both the OE services (currently with web services, but presumably with tighter couplings in the future) and services from other systems in an enterprise’s environment in order to orchestrate multiple applications.

If you were starting fresh with a new app, that would not be a significant problem: build your processes and UIs in BPM, build your services in OE, and you’re done. The problem, however, is the huge body of existing OE applications that need to start migrating in this direction. This problem exists far beyond the OpenEdge world, of course: those with legacy code and monolithic applications of any sort are having to deal with this as they struggle to become more service-oriented, and to integrate BPM as an overarching orchestration framework.

Brian Bowman, a Progress senior systems consultant, led this session and gave a demo of creating a process in BPM – all the while explaining some BPM basics to what I assumed was a room full of OE developers. Like a huge portion of the business/IT world, most OE customers and partners have no idea what BPM looks like or what it can do for them, meaning that Progress has a lot of education to do before anyone actually starts integrating BPM into their OE apps. A huge opportunity for Progress, undoubtedly, but also a huge challenge. I’m also struck by the idea that a lot of the Progress people, particularly the SCs who will be demoing this to customers and partners, need some better general process modeling training, including a bit more stringent BPMN education, not just training on pushing the mouse around in the BPM tool.

Brian was joined by Sandy (I missed her last name), another SSC, who took over from Brian’s “business analyst” role of creating the process in BPM, moving into a “developer” role in OE. She had a pre-built OE app with a button that instantiated a process and displayed the process ID; she showed the underlying OE code, which made a CreateInstance call followed by some updateDataSlot calls to update the parameters in the process instance with the OE database parameters. The rest of the integration happened on the BPM side, with the basic points of integration as follows:

  • Create a process instance from an OE app, and populate the data fields. I don’t know OE code, but it appears that it uses a couple of new or repurposed functions (CreateInstance and updateDataSlot) to call BPM, roughly as sketched after this list.
  • Call an OE function from a process step using a SOAP call. This requires that the function be exposed in OE as a web service, but BPM would not have had to be changed in order to make the call, since that’s a standard functionality in Savvion.
  • Update the OE database from a process step. This is based on the OE database connectivity functionality that has been added to BPM.
  • Embed a WebSpeed form page in a BPM UI form: basically, replacing a BPM UI form with an existing WebSpeed form to complete a BPM step. It is not possible to use an existing OE GUI form in this way, only a WebSpeed form since the HTML can be embedded as a URL. This is done by embedding the search parameters directly in the URL that is called to invoke the embedded WebSpeed form, which may be a security concern in some situations.
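Here’s the sketch promised above for the first integration point, in Java-style pseudocode: the method names follow the CreateInstance/updateDataSlot calls described in the session, but the wrapper interface and signatures are my own invention, not the actual product API.

```java
// Hypothetical client wrapper around the BPM server's process API
interface BpmClient {
    long createInstance(String processTemplate);   // returns the new process ID
    void updateDataSlot(long processId, String slot, Object value);
}

class OrderEntryIntegration {
    private final BpmClient bpm;

    OrderEntryIntegration(BpmClient bpm) {
        this.bpm = bpm;
    }

    // Mirrors the demo: create an instance, then push the application's
    // database fields into the process instance's data slots
    long startOrderProcess(String orderNumber, double amount) {
        long pid = bpm.createInstance("OrderApproval");
        bpm.updateDataSlot(pid, "orderNumber", orderNumber);
        bpm.updateDataSlot(pid, "orderAmount", amount);
        return pid; // displayed to the user, as in the demo app
    }
}
```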

There’s definitely an issue with those using the OE GUI (again, I’m not that familiar with OE, but I assume this is a thick client UI) since these can’t be used directly as BPM task UIs as you can with the WebSpeed forms, although you could add API calls to your OE GUI application to update the BPM process instance, effectively creating a shadow process in OE and BPM. Not best practice, I’m sure, but possibly a stop-gap measure for those migrating from the OE GUI either to WebSpeed forms or BPM UI forms.

OE and BPM have separate servers (as you would expect), and deployment is done independently, as far as I can tell. That makes sense, since the eventual goal is to have OE become more of the services layer consumed by BPM; there is no real need to have the deployment tightly coupled, although you do have to be concerned about dependencies, as you would with any SOA deployment.

Some of the questions at the end were related to how OE functionality would be replicated in BPM, such as roles and security; Progress definitely needs to do some work around educating the customers and partners on how the features that OE developers currently rely on will be done on the BPM side, for those functions such as UI and work management that are expected to move over to this new environment.

OpenEdge BPM Introduction with @KenWilner

Ken Wilner, Progress’ VP of Technology for the OpenEdge product, gave a breakout session on OpenEdge BPM, which integrates the former Savvion BPM platform into the OpenEdge application development environment to allow the business process to be made explicit – externalized from the application – in order to improve agility and visibility. It’s interesting to hear this message, which is no longer even a topic of conversation in mainstream BPM circles because it is so well-understood, presented to a group of OpenEdge application developers.

Does the integration of BPM just relegate OpenEdge to the scripting/coding language slaved to BPM? Maybe, but that’s not necessarily a bad thing. Instead of layering BPM on top of a monolithic application developed with OpenEdge, it’s about having an integrated development platform that includes BPM as a part of the toolkit. It will be interesting to see how well this message is received by the OpenEdge development community, and how long it takes to actually impact their development methods. I had a number of questions yesterday during my workshop on exactly this issue: how does BPM fit with an application developed in OpenEdge? It’s about pulling the process out of the app and into BPM, as Wilner pointed out, but also about orchestrating and integrating apps including OpenEdge and other systems such as CRM and accounting.

Although (Savvion) BPM Studio and the OpenEdge Architect development environment are both Eclipse-based, it doesn’t appear that they’ve been integrated in any significant manner. Similarly, there are two different servers – although a BPM process can call OpenEdge functionality, using web services at least – and two different end-user portal environments, although the BPM server functionality can be surfaced in the OpenEdge portal.

He gave a live demo of creating a process in BPM Studio. This was pretty straightforward, a BPMN diagram of a five-step order processing flow with roles assigned to human steps, plus a simple data model with specific fields exposed at the steps in order to auto-generate the related forms. He then assigned a system step to an OpenEdge function, using web services (SOAP) calls, and another system step using the standard Savvion email functionality. He ran the process in the BPM portal, showing how the tasks were routed to the different users, and how you can monitor the process as it moves through the steps. Nice, and probably new and exciting for the purely OpenEdge people in the audience, but so far, this is just standard BPM with no specific integration between OpenEdge and Savvion BPM, only the standard loosely-coupled web services that would have been there in BPM anyway.

Wilner discussed (but did not demo) the high level of reusability of existing OpenEdge application components in the context of a BPM process, including the use of existing UI forms, but it’s not clear whether this is a level of integration specific to OpenEdge, or just standard integration points and/or custom development.

There is no doubt that BPM provides value as a tool for any application developer, but this demo could have been done with any BPMS, and/or any application that exposes functionality as a web service. I know that this session was listed as an introduction to OpenEdge BPM, but it appeared to be more of an introduction to BPM for OpenEdge developers. I hope that there is more to OpenEdge BPM than this, as well as a comprehensive roadmap for further integration. His closing slides indicated that this was coming in OpenEdge 11 at the end of this year, and I look forward to seeing how they are going to push this forward.

Strategic Synergies Between BPM, EA and SOA

I just had to attend Claus Jensen’s presentation on actionable architecture with synergies between BPM, EA and SOA, since I read two of his white papers in preparing the workshop that I delivered here on Wednesday on BPM in an EA context. I also found out that he’s co-authored a new Redbook on EA and BPM.

Lots of great ideas here – I recommend that you read at least the first of the two white papers that I link to above, which is the short intro – about how planning (architecture) and solution delivery (BPM) are fundamentally different, and you can’t necessarily transform statements and goals from architecture into functions in BPM, but there is information that is passed in both directions between the two different lifecycles.

He went through descriptions of scenarios for aligning and interconnecting EA and BPM, also covered in the white papers; these are quite “build an (IBM-based) solution”-focused, but still contain some good nuggets of information.

IBM BPM: Merging the Paths

“Is there any point to which you would wish to draw my attention?” “To the curious incident of the dog in the night-time.” “The dog did nothing in the night-time.” “That was the curious incident,” remarked Sherlock Holmes.

Silver Blaze, Sir Arthur Conan Doyle

And so the fact of me (and others) not yet blogging about the IBM BPM release has itself become a point of discussion. 😉

To recount the history, I was briefed on the new IBM BPM strategy and product offerings a few weeks before the Impact conference, with a strict embargo until the first day of the conference when the announcements would be made. Then, the week before Impact, IBM updated their online product pages, and the sharp-eyed Scott Francis noticed this and jumped to the obvious – and correct – conclusion: IBM was about to integrate their WebSphere BPM offerings. That prerelease of information certainly defused the urgency of writing about the release at the moment of announcement, and gave many of us the chance to sit back and think about it a bit more. I only had a brief day and a half at Impact before making my way back east for another conference where I was giving a workshop, and here I am a week later, finally finishing up my thoughts on IBM BPM.

There’s been a fair amount written about it already by others who were there: Clay Richardson and his now-infamous “fresh coat of paint” post, which I’m sure did not make him any friends in some IBM circles; Neil Ward-Dutton with his counterpoint to Clay’s opinion; some quick notes from Scott Francis in the context of his keynote blogging (which also links to the video of Phil Gilbert making the announcement); and Tony Baer as part of his post on a week of BPM announcements.

It’s important to look at how the IBM organization has realigned to allow for the new product release: Phil Gilbert, former president and CTO of Lombardi, now has overall responsibility for all of WebSphere BPM – including both the former Lombardi and WebSphere BPM products – plus ILOG rules management. Neil Ward-Dutton referred to this as the reverse takeover of IBM by Lombardi; when I had a chance for a 1:1 with Phil at Impact, I told him that we’d all bet that he would be gone from IBM after a year. He admitted that he originally thought so too, until they gave him the opportunity to do exactly what he knew needed to be done: bring together all of the IBM BPM offerings into a unified offering. This new product announcement is the beginning of that unification, but they still have a ways to go.

Let’s take a look at the product offering, then. They’ve taken pretty much everything in the WebSphere BPM portfolio (Lombardi Edition, Dynamic Process Edition, Process Server, Integration Developer, Business Modeler, Business Compass, Business Fabric) and mostly rolled it into IBM BPM or replaced its functionality with something similar; a few exceptions, such as Business Compass, have simply disappeared. This reduces the entire IBM BPM portfolio to the following:

  • IBM Business Process Manager (which I’m covering here)
  • IBM Case Manager (the rebranding of some specialized functionality built on the IBM FileNet BPM platform, which is separate from the above IBM BPM offering)
  • IBM Blueworks Live
  • IBM Business Monitor
  • IBM BPM Industry Packs

Combining most of the WebSphere BPM components into IBM BPM V7.5, the new product offering has both a BPMN Process Designer and a BPEL Integration Designer, a common repository, and a process server that includes both the BPMN and BPEL engines. Now you can see where Clay Richardson is coming from with the “fresh coat of paint” characterization: the issue of one versus two process “servers” seemed to occupy an inordinate amount of time in discussions with IBM representatives, who stoically recited the party line that it’s one server. For those of us who actually used to write code like this for a living, it’s clear that there are two engines: one BPMN and one BPEL. From the customer standpoint, however, they are wrapped into a single Process Server, so if IBM ever gets around to refactoring into a single engine, the change could be made fairly transparent to customers, and would likely reduce IBM’s internal engineering costs of maintaining two engines. Personally, I believe that there is enough commonality between process design and service orchestration that both the designers and the engines could be combined into something that offers the full spectrum of functionality while reducing the underlying product complexity.
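To make the one-server/two-engines point concrete, here’s a purely illustrative facade sketch (my own, emphatically not IBM’s design): a single deployment API that dispatches to two distinct engines, which is roughly how the unified Process Server looks from the outside.

```java
// Illustrative only: one "Process Server" API fronting two separate engines,
// which is the one-server/two-engines arrangement described above.
interface ProcessEngine {
    void deploy(String artifact);
}

class BpmnEngine implements ProcessEngine {
    public void deploy(String artifact) { /* interpret BPMN semantics */ }
}

class BpelEngine implements ProcessEngine {
    public void deploy(String artifact) { /* interpret BPEL semantics */ }
}

class ProcessServer {
    private final ProcessEngine bpmn = new BpmnEngine();
    private final ProcessEngine bpel = new BpelEngine();

    // Callers see a single server; the BPMN/BPEL split is an internal detail,
    // which is why refactoring to one engine could be transparent to users.
    public void deploy(String artifact, String type) {
        ("BPEL".equals(type) ? bpel : bpmn).deploy(artifact);
    }
}
```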

In addition to the core process functionality, the ILOG rules engine is also present, plus monitoring tools and user interface options with both the process portal and the Business Space composite application environment.

I don’t want to understate their achievements in this product offering: the (Lombardi-flavored) Process Center, with its shared repository and process governance, is significant, allowing users to reuse artifacts from the two different sides of the BPM house. You can add a BPEL process orchestration created in Integration Designer to your BPMN process created in Process Designer; you can include a business object created in Process Designer as a data definition in your BPEL service orchestration in Integration Designer; or a BPEL orchestration can call a BPMN process for human task handling. The fact remains, however, that this is still a slightly uneasy combination of the two major BPM platforms, and it will likely take another version or two to work out the bumps.

Since this is IBM, they can’t just have one product configuration, but offer three:

  • The Express edition, offered at a price point that is probably less than your last car, is for starter BPM projects: full functionality of the Process Designer to build and run BPMN processes, but only one server with no clustering, so unlikely to be used for any mission-critical applications. If you’re just getting started and are doing human-centric BPM, then this is for you.
  • The Standard edition, which offers pretty much the same human-centric BPM and lightweight integration functionality as the former Lombardi Edition BPMS. Existing Lombardi Edition customers will be able to upgrade to this version seamlessly.
  • The Advanced edition, which adds the Integration Designer and its ability to create a SOA layer of BPEL service/process orchestrations that can then be called from the BPMN processes or run independently.

In the product architecture diagram above, the Advanced edition is the whole thing, whereas the Standard and Express editions are missing the Integration Designer. To complicate things further, current WebSphere Process Server/Integration Designer customers will be transitioned to the Advanced edition but with the Process Designer disabled: a fourth, shadow configuration that will not be available to new customers, but is offered only as an upgrade. Both engines are still there in all editions, but it appears that without both designers, one of the engines sits idle, since each designer targets its own engine. For current customers, IBM has published information on migrating existing configurations to the new BPM; there is a license migration path for all customers who currently have BPM products, but for some coming from the traditional WebSphere products, the actual migration of their applications may be a bit rocky.

The web-based Process Center is used for managing, deploying and interacting with processes of both types, although the Process Designer and Integration Designer are still applications that must be downloaded and installed locally. Within the Process Designer, there’s the familiar Lombardi “iTunes-style” view of the assets and dependencies. It’s important to point out that Toolkits are assets that could have originated in either the Process Designer or the Integration Designer; in other words, they could be human workflows running on the BPMN engine or service orchestrations running on the BPEL engine, and can simply be dragged and dropped onto BPMN processes as activities. The development environment includes versioning; shared concurrent editing, so you can see which assets other developers are editing that might impact your project; playback of previous process versions; and all versions of processes viewable for deployment in Process Center. The Process Center view is identical from either design tool, providing an initial common view between the two environments. Linking the two environments through shared assets in the Process Center also eases deployment: everything that a process application depends upon, regardless of its origin, can be deployed as a single package.

Not everything comes from the former Lombardi Edition, however: the user interface builder in IBM BPM is based on Business Space, IBM’s composite application development tool, instead of the old Lombardi forms and UI technology; this allows for easy reuse of widgets in portals, and there’s also a REST interface for rolling your own UI. Also, the proprietary rules engine in Lombardi is being replaced with ILOG, with the rules editor built right into the design environments; the ILOG engine is included in the Process Server, but can only be called from processes, not by external applications, so as not to cannibalize the standalone ILOG BRMS business. I’m sure that they will support the old UI and rules for a while, but if you’re using those, you’re going to be encouraged to start migrating at some point.
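For a sense of what “rolling your own UI” against that REST interface might involve, here’s a hedged sketch of fetching a task list over HTTP; the host, resource path and credentials are all assumptions for illustration, and the real resource URIs are whatever IBM documents in the product’s REST API reference.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TaskListClient {
    public static void main(String[] args) throws Exception {
        // Hypothetical host, path and credentials; consult the product's REST
        // API documentation for the actual resource URIs and parameters.
        URL url = new URL("http://bpm-host:9080/rest/tasks?user=jsmith");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/json");
        String credentials = Base64.getEncoder()
                .encodeToString("jsmith:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + credentials);

        // Dump the JSON task list; a custom portal page or widget would parse
        // this and render each task as an inbox item instead.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```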

There is currently no (announced) plan for IBM BPM process execution in the cloud (except for the simple user-created workflows in Blueworks Live), which I think will hurt IBM BPM adoption at some point: I understand that many of the large IBM customers are unlikely to go off premise for a production system, but more and more organizations that I work with are considering cloud-based solutions that they can provision and decommission near-instantaneously, at the very least as a platform for development and testing. They need to rethink their strategy on this, and stop offering expensive custom hosted or private “cloud” platforms as their only cloud alternatives.

Finally, there is the red-headed stepchild in the IBM BPM portfolio: IBM FileNet BPM, which has mostly been made over as the IBM Case Manager product. Interestingly, some of the people from the FileNet product side were present at Impact (usually they would only attend the IOD conference, which covers the Information Management software portfolio in which FileNet BPM is entombed), and there was talk about how Case Manager and the rest of the BPM suite could work together. In my opinion, bringing FileNet BPM into the overall IBM BPM fold makes a lot of sense; as I blogged back in 2006 at the time of the acquisition, and in 2008 when comparing it to the Oracle acquisition, they should have done that from the start, but there seemed (at the time) to be some fundamental misunderstandings about the product capabilities, and they chose to refocus it on content-centric BPM rather than combining it with WebSphere Process Server. Of course, if they had done the latter, we likely would be seeing a very different IBM BPM product mix today.

Towards Workflow Verification

Continuing on with the CASCON technical papers session focused on service-oriented systems, we saw a paper on Towards Workflow Verification by Leyla Naiza, Ahmed Mashiyat, Hao Wang and Wendy MacCaull of St. Francis Xavier University in Nova Scotia. The research is based on model verification: ensuring that a process model matches its specification properties and doesn’t produce undesirable effects at runtime. The problem with testing any sort of real process model (i.e., one with more than the 5 steps that you see in a demo) is that the state space explodes, making it difficult to check the model for all possible states.

Their research addresses this by decomposing process models into fundamental workflow patterns (e.g., sequence, synchronization), then translating these into DVE, the input language for DiVinE, a distributed and parallel model checker. They can then verify specific properties from the model specification, such as their clinical trials example: “the end of the workflow (patient release) cannot be reached without 5 drug tests”. They switched from Petri Nets to YAWL for process modeling in order to make the automated translation to DVE possible, and built the automated translator to test this verification method. They also introduced a time extension to the translation, allowing verification of time-related properties of the process model.
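To give a flavour of what such a property looks like to a model checker, here’s one plausible LTL rendering of the clinical trials example, assuming the translated model keeps a counter of completed drug tests and a proposition for the release state (both names are mine, not the paper’s):

```latex
% One plausible LTL phrasing, assuming the translated model exposes a counter
% "tests" of completed drug tests and a proposition "released" that holds in
% the patient-release state (illustrative names, not from the paper):
\[
  \mathbf{G}\,\bigl(\mathit{released} \rightarrow \mathit{tests} \geq 5\bigr)
\]
% Read: globally (in every reachable state), if the patient has been
% released, then at least five drug tests have already been completed.
```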

I see the biggest challenge as mapping the business’s view of its rules onto specific properties to be tested; this linkage between business requirements and technical implementation has been with us since the beginning of systems, and it’s not clear that this research does any validation of those rules and properties themselves, making verification against them less meaningful. Furthermore, these properties are really business rules, and their use here implies that the rules are encoded within the process model rather than externalized in a business rules system. Nonetheless, automating the verification of process models is a step in the right direction for improving the quality of complex process models.