A Quick Peek at Cordys BPM

A month ago, I had a chance for a comprehensive demo of the Cordys BPMS via Webex, and I saw them briefly at the Gartner show last week. Their suite is of particular interest to me because the entire process life cycle of modelling, execution and monitoring is completely browser-based. I’ve been pushing browser-based process modelling/design for quite a while, since I think that this is the key to widespread collaboration in process modelling across all stakeholders of a process. I’ve reviewed a couple of browser-based process modellers — a full-featured version from Appian, and a front-end process mapping/sketch tool from Lombardi — and if it wasn’t already clear from what Appian has done, Cordys also proves that you can create a fully-functional process designer that runs in a browser and can have participants outside the corporate firewall. Like Appian, however, they currently only support Internet Explorer (and hence Windows), which will limit the collaboration capabilities at some point.

[Screenshot: Cordys BPMN modeller]

Cordys’ claim is that their modeller is BPMN compliant and supports the entire set of BPMN elements including all of the complex constructs such as transactions and compensation rollback, although I saw a few non-standard visual notations. They also support both XPDL 2.0 and BPEL for import and export, but no word on BPDM. Given this dedication to standards, I find it surprising that they can integrate only with their own ESB and business rules engine, although you could call third-party products via web services. They also have their own content repository (although you can integrate with any repository that allows object access via URL) and their own BAM. In general, I find that when a smaller vendor tries to build everything in a BPM suite themselves, some of the components are going to be lacking; furthermore, many organizations already have corporate standards for some or all of these, and you’d better integrate with the major players or you won’t get in the door.

Like most BPM suites, much of the Cordys process design environment is too complex for the average business user/analyst, and would probably be used by someone on the IT side with input from the business people; a business analyst might draw some of the process models, but as soon as you start clicking on objects and pulling up SOAP syntax, they’re going to be out of there. Like most BPMS vendors, Cordys claims that the process design environment is “targeted towards business people”, but vendors have been making this claim for years now, and the business people have yet to be convinced. To be fair, I was given the demo by the very enthusiastic product architect, who knew that I’m technical, so he took every bell and whistle out for a ride; business users likely see a very different version of the demo.

There’s a lot of functionality here, although nothing that I haven’t seen in some form in other products. There’s support for human-facing tasks either via browser-based inbox and search functions, or by forwarding the tasks to any email system via SMTP (like Outlook). There also appear to be shared worklists, but I didn’t get a sense of how automated work allocation could be performed, something that’s required to support high-volume transaction processing environments. There’s also support for web services orchestration to handle the system integration side of the BPM equation.
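Since the demo didn’t show how automated work allocation might be done, here’s a generic sketch of the kind of logic that high-volume environments need: push tasks from a shared worklist to the least-loaded worker with the right skill. This is my own illustration, not Cordys functionality; the skill and load model is an assumption.

```python
from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    skills: set
    load: int = 0  # number of tasks currently assigned

def allocate(task_skill, workers):
    """Assign a task to the least-loaded worker holding the required skill;
    return None to leave it on the shared worklist if nobody qualifies."""
    eligible = [w for w in workers if task_skill in w.skills]
    if not eligible:
        return None
    chosen = min(eligible, key=lambda w: w.load)
    chosen.load += 1
    return chosen.name

workers = [Worker("alice", {"claims", "underwriting"}),
           Worker("bob", {"claims"})]
print(allocate("claims", workers))     # goes to the least-loaded claims handler
print(allocate("claims", workers))     # next claims task balances to the other
print(allocate("actuarial", workers))  # no skill match: stays queued
```

A real BPMS would layer priorities, deadlines and role hierarchies on top of this, but the core of push-style allocation is just an eligibility filter plus a load-balancing rule.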

One thing that I like is the visual process debugger: although you have to hack a bit of XML to kick things off, you can step through a process, calling web services and popping up user interfaces as you hit the corresponding steps, and stepping over or into subprocesses (very reminiscent of a code debugger, but in a visual form).

They do a good job of an object repository as well, which helps increase reusability of objects, and allows you to search for processes and artifacts (such as forms or web services) to see where they’re used. Any process that’s built can also be exposed as a web service: just add inputs and outputs at the start and end points and the WSDL is auto-generated, allowing the process to be called as a service from any other application or service.
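To make the auto-generated WSDL idea concrete, here’s a rough sketch of how a process’s start and end parameters could map onto a skeletal WSDL. This is a generic illustration, not Cordys’ actual generator; the element layout and the blanket `xsd:string` typing are my assumptions.

```python
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

def process_to_wsdl(process_name, inputs, outputs):
    """Build a skeletal WSDL for a process exposed as a web service:
    one operation whose request/response messages mirror the parameters
    declared at the process's start and end points."""
    ET.register_namespace("wsdl", WSDL_NS)
    defs = ET.Element(f"{{{WSDL_NS}}}definitions", {"name": process_name})
    for msg_name, parts in ((f"{process_name}Request", inputs),
                            (f"{process_name}Response", outputs)):
        msg = ET.SubElement(defs, f"{{{WSDL_NS}}}message", {"name": msg_name})
        for part in parts:
            ET.SubElement(msg, f"{{{WSDL_NS}}}part",
                          {"name": part, "type": "xsd:string"})
    port = ET.SubElement(defs, f"{{{WSDL_NS}}}portType",
                         {"name": f"{process_name}Port"})
    op = ET.SubElement(port, f"{{{WSDL_NS}}}operation", {"name": process_name})
    ET.SubElement(op, f"{{{WSDL_NS}}}input",
                  {"message": f"tns:{process_name}Request"})
    ET.SubElement(op, f"{{{WSDL_NS}}}output",
                  {"message": f"tns:{process_name}Response"})
    return ET.tostring(defs, encoding="unicode")

wsdl = process_to_wsdl("ApproveClaim", ["claimId", "amount"], ["decision"])
print(wsdl)
```

Once something like this is published, any SOAP client can invoke the process without knowing (or caring) that a BPMS sits behind the endpoint.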

[Screenshot: Cordys mashup]

<geek>Another thing that I really liked is the AJAX-based framework and modelling layer for UI/forms design, which is an extension of XForms. In addition to a nice graphical UI design environment, you can generate a working user interface directly from the WSDL of a web service — something that I’ve seen in other products such as webMethods, but I still think is cool — and run it immediately in the designer. In the demo that I saw, the architect found an external currency conversion web service, introspected it with the designer and generated a form representing the web service inputs and outputs that he popped directly onto the page, where he could then run it directly in debug mode, or rearrange and change the form objects. Any web service in the internal repository — including a process — can be dragged from the repository directly onto the page to auto-generate the UI. Linked data objects on a form communicate directly (when possible) without returning to the server in true AJAX fashion, and you can easily create mashups such as the example that I saw with the external currency converter, a database table, and MSN Messenger. For the hardcore among us, you can also jump directly to the underlying scripting code.</geek>
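The introspection direction is equally easy to sketch: read a WSDL’s input message and emit one form field per part, which is essentially what the designer did with the currency conversion service. Again, this is my own toy illustration (the inline WSDL and the HTML field shape are invented for the example), not the Cordys implementation.

```python
import xml.etree.ElementTree as ET

# A hypothetical WSDL fragment, standing in for the introspected service.
WSDL = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/">
  <message name="ConvertRequest">
    <part name="fromCurrency" type="xsd:string"/>
    <part name="toCurrency" type="xsd:string"/>
    <part name="amount" type="xsd:decimal"/>
  </message>
</definitions>"""

NS = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}

def wsdl_to_form(wsdl_text, message_name):
    """Introspect a WSDL message and emit one labelled form field per part."""
    root = ET.fromstring(wsdl_text)
    fields = []
    for msg in root.findall("wsdl:message", NS):
        if msg.get("name") == message_name:
            for part in msg.findall("wsdl:part", NS):
                name = part.get("name")
                fields.append(f'<label>{name} <input name="{name}"/></label>')
    return "\n".join(fields)

print(wsdl_to_form(WSDL, "ConvertRequest"))
```

A production tool would map the XSD types to appropriate widgets (date pickers, numeric fields and so on), but the part-to-field mapping is the heart of the trick.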

Unfortunately, the AJAX framework is not available as a separate offering, only as part of the BPMS; I think that Cordys could easily spin this off as a pretty nice browser-based development environment, particularly for mashups.

BPM in Action panel

If you enjoyed the free-for-all discussion at the end of the webinar that I moderated with Colin Teubner and Jim Rudden a few weeks back, you’re going to love the panel that I’m hosting next week on BPM and Enterprise 2.0 as part of the BPM in Action series. It’s not sponsored by a vendor, so I was able to pick whoever I wanted on the panel, and there will be no vendor product pitches or slides — just an interactive discussion between the four of us. Get over there and sign up.

Here’s the description that I wrote for it:

As Web 2.0 technologies and concepts advance in the consumer world, they are also impacting enterprise software as users change their expectations of how software should behave. This “Enterprise 2.0” movement is impacting BPM software in a variety of ways ranging from platforms to user functionality to integration. This panel will explore the following:

  • How Web 2.0 is becoming Enterprise 2.0
  • BPM platform changes: the impact of browser-based tools and software as a service
  • New tools and techniques for improving user participation in process design and execution
  • New ways of “mashing up” BPM data with internal and external data

I picked three people to join me whose opinions I value and who I think will be interesting in this format: Phil Gilbert of Lombardi, Phil Larson of Appian, and Ismael Ghalimi of Intalio. They’re all very opinionated and all have a stake in the Web 2.0/Enterprise 2.0 space: Lombardi’s very cool Web 2.0 Blueprint release for widespread collaboration, Appian’s kick-butt browser-based process designer for serious Enterprise 2.0 work in a browser, and Ismael’s involvement in radical office 2.0 and BPM 2.0 ideas.

We only have 45 minutes so I’m going to have to keep a tight rein on the conversation to cover off our proposed subject areas, which could be difficult because I inadvertently invited two Phils to the panel (“Phil, pipe down! No, not you Phil, the other Phil!”). No “call me Ishmael” jokes, I’ve heard them already.

By the way, congratulations to Ismael and his wife on their new arrival. No wonder he wasn’t at the Gartner conference.

BPM splog

If you surf around looking for BPM blogs, you may have noticed something strange: my blog posts from here on Column 2 reproduced in their entirety and without permission on the blog of Mark Bean, the VP of Sales for an ECM/BPM-related vendor, Altien. I’m not linking to them or to the fake blog itself, called “Office 2.0 and ECM News”, since I am definitely not encouraging traffic.

This is a clear violation of my intellectual property and copyright, and I’m amazed that anyone who works in this industry would propagate such an openly fraudulent and illegal activity. Maybe that tells you something about how Altien does business in general.

I spotted this a few weeks back, but only realized that he had started stealing the full posts (as opposed to significant chunks of them) with my Gartner coverage this week. I sent Bean a request this morning to stop stealing my blog posts, and he replied “Sure thing”, as if I’d asked him about the weather — no apology, no admission that he might have violated blogging etiquette, much less copyright law. I’ve asked him to remove all of my full posts from his site, although obviously there’s no law against him linking to any of my posts and publishing a short excerpt under fair use rules.

Imitation may be the sincerest form of flattery, but in this case, it’s also theft.

Update: According to Altien’s CEO, who left a comment on this post, Mark Bean is no longer in their employ. In my communications with Altien, it was clear that Bean’s activities do not reflect their general business practices.

Gartner Day 3: Microsoft session

I wanted to stop in on the Microsoft session, People-Ready Processes, in part because I’m a bit confused about what Microsoft is doing in this area, and in part because of the Business Process Alliance announcement from Monday. Microsoft sees themselves as a force for commoditizing (and in the subtext, dumbing down) technology so that it is accessible to a much wider audience, and this presentation was Burley Kawasaki’s take on how they’re doing that for BPM. He describes people-ready processes as a fusion of document-centric processes and system-centric processes, and I really wish that he (and many other people in the industry) would stop equating human-centric with document-centric. Although human-facing BPM grew out of the workflow that started in document imaging systems, that was a long time ago, and there are many instances of human-facing BPM that don’t include documents — depending, of course, on how you define a document.

My previous view of Microsoft BizTalk was as a B2B message broker or an internal ESB. My view of SharePoint was as a collaboration and document management platform. I wanted to see how Microsoft was bringing together the technologies and concepts from both of these to create a seamless BPM solution.

Kawasaki showed a spectrum of BPM application types, from collaborative to transactional processes: individual ad hoc processes (e.g., individual task lists), human semi-structured (e.g., vacation approval), system highly structured (e.g., expense reporting) and fixed processes (e.g., supply chain). He then overlaid a split between a collaboration server and a process server, with some overlap in the middle of the spectrum, and labelled these as SharePoint and BizTalk. My heart sank.

Okay, you can have a SharePoint collaboration or document kick off a BizTalk process, but that’s not the same as having a single end-to-end BPM solution. In the future, the Windows Workflow Foundation will be used as the underlying process infrastructure for both SharePoint and BizTalk, which might help to integrate them more closely.

He finished up with a light-speed overview of the Microsoft process platform roadmap, which includes Windows Workflow Foundation, the .Net framework, Office (including SharePoint) and BizTalk. He also made a big push for the benefits of a platform and partner ecosystem rather than a single vendor’s “closed and proprietary” BPM stack. Not sure that I’m convinced.

Gartner Day 3: Yvonne Genovese keynote

We started the last day at the Gartner summit with a keynote by Yvonne Genovese, Business Applications Through 2010: Major Changes Will Affect Your Process Environment. Early in her presentation, she made an important statement: “the technology keeps breaking our processes”. Her focus is on business applications, not specifically BPM, but she’s looking at trends of what’s happening with enterprise applications like ERP and CRM systems. Her point is that these business applications have, in the past, forced businesses to use rigid business processes implemented within those systems.

However, the current trend is towards unbundling some of this functionality, exposing it through services, then consuming those services using a BPMS. This allows you to not only call specific functionality from your business applications at any point in a process that you now control, you can actually replace or augment the functionality of those applications by calling other services. This also provides an opportunity to more easily integrate between business applications if you have multiple ones in your environment. Although the business application vendors have been pushing suites for some time now, that packaging model will be less compelling to their customers as organizations start to slice and dice the atomic functionality of the business applications and compose their own processes using BPM rather than use the suite in its monolithic form.

Business applications aren’t going away: there’s still a huge amount of good functionality available in them, and as long as that commoditized functionality can be consumed as services, you’re not going to be writing a replacement yourself. What I think will happen, however, is that the amount of the functionality used from any given business application platform will begin to erode as other internal or external services replace some of that functionality. This frees organizations from the vendor lock-in that they’re subjected to now, and adds a new possibility for creating business applications: instead of just “buy” or “build”, you can now also “compose”. And if the megavendors in this field are going to stay competitive, they need to embrace and encourage an ecosystem that allows smaller vendors to provide services that can easily be integrated with their larger platform. This isn’t going to be the old model of the vendor controlling the ecosystem by anointing their favourite technology partners, however: the customer organizations are going to build their own ecosystem from their preferred vendors in a truly best-of-breed fashion.

At the end of the day, BPM is an essential part of all this, since it will be used as a composition framework for combining functionality from business applications, along with internal and external services, into the processes that the business really needs.

Gartner Day 2: Jim Sinur (again)

I finished up the day by attending Jim Sinur’s session on continuous optimization. And thanks to Gartner, we have a brand new acronym: BOSS, for business optimization support systems.

He has an interesting take on optimization that I agree with: it’s an antidote to entropy. The laws of entropy say that systems tend to become more chaotic over time, so you have to have something in place that will actively head off that slide into chaos. Continuous improvement is not, however, a replacement for disruptive or radical change within an organization: the former provides some refinements along the way to a goal, while the latter causes changes in direction to a new goal.

He defined continuous optimization as “keeping a process efficient, effective and relevant under all possible and changing conditions,” and laid out a list of benefits of continuous process optimization, not the least of which is creating a balance amongst competing goals: sacrificing a local optimization in favour of an overall optimization.

There was some amount of repeated material from Bill Gassman’s BI/BAM presentation earlier today, but Sinur went into a number of other areas, such as understanding both the drivers for process optimization and the inhibitors to its adoption. It’s completely necessary to link processes to desired outcomes so that the goals of optimization are well understood, and you also have to anticipate the shift to indeterminate/ad hoc/collaborative processes that don’t have pre-determined process maps, but are usually triggered by events and are goal-driven.

He looked at how to discover the opportunities for optimization, and how to select the proper optimization capability from a set of optimization tools and techniques. He also made some good points about matching your optimization method to your risk profile, which I’ve heard in earlier presentations this week: if you’re very risk-averse, for example, you’re unlikely to have self-optimizing systems that change their own behaviour based on patterns of events in the system.

This is a growth area, and one that can provide some competitive advantage: only the leading organizations are using this technology now, and it has the potential to make a huge impact on a company’s agility.

Gartner Day 2: BEA sessions

I really wanted to attend Daryl Plummer’s analyst/user roundtable on BPM and Web 2.0, but they don’t let press into those sessions, so I ducked in to hear Jesper Joergensen of BEA talk about Best Practices in Business Transformation. Jesper, I know that you’re reading this — no offence intended on being my second choice 🙂  I stayed through both half-hour sessions this time, seeing Joergensen talk first, then BEA’s customer, Christophe Marcel of Integro Insurance Brokers, with Building the Business Case for BPM.

Joergensen started with a cooking theme for this “BPM secret sauce” talk: start with sharp knives, make big meals of small dishes, measure to taste and adjust as required, have a recipe, and follow the recipe. In BPM, this translates to: start with common tools, build a platform out of small projects, use simulation and measurement, have established best practices, and follow those best practices. Cute theme, and some nice cooking utensil graphics, although I have to admit that I rarely follow a recipe in the kitchen, even if I bother to have one.

He talked about the importance of modelling tools for business users, with a shared process model for the IT side for implementation to avoid the inevitably incomplete round-tripping that happens when you model in one tool and implement in another. He also discussed how to identify suitable first targets for BPM implementation — low complexity, high impact, and low maturity level — while planning for scale in both the tool selection and the methodology, since one successful project will breed demand. He briefly discussed process simulation and measurement/monitoring, and the importance of a process centre of excellence.

After a brief break, Christophe Marcel talked about their experiences with BPM. Their focus was on integration, tying together a number of existing systems with a minimum amount of new development. They made use of both human-facing tasks and web services calls to update data in the underlying enterprise systems, and built their own web-based user interface. In addition to the enterprise data systems, they integrated Microsoft SharePoint as their document management system.

One of the major challenges, which I’ve seen many times before when integrating BPM with enterprise systems, is the issue of data synchronization. When data is replicated into the BPMS for display or control purposes, any changes to the data, either in the BPMS or the underlying enterprise system, need to be considered for replication to the other system. Similarly, if an entire insurance program is sold, all tasks in the BPMS may need to be updated or deleted to reflect that change.
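Even the simplest sane policy, last-writer-wins on a per-record basis, takes some care. Here’s a minimal sketch of that policy (my own simplification, not Integro’s implementation; real synchronization usually needs field-level merge rules and conflict handling on top of this):

```python
from datetime import datetime

def reconcile(bpms_record, enterprise_record):
    """Last-writer-wins sync between the BPMS copy and the system of record.
    Returns the winning record plus which side needs the update pushed to it
    (None if the two copies are already in sync)."""
    if bpms_record["updated"] > enterprise_record["updated"]:
        return bpms_record, "enterprise"   # push the BPMS change down
    if enterprise_record["updated"] > bpms_record["updated"]:
        return enterprise_record, "bpms"   # refresh the stale BPMS copy
    return bpms_record, None

bpms = {"policy": "P-100", "status": "bound",
        "updated": datetime(2007, 3, 1, 10, 30)}
core = {"policy": "P-100", "status": "quoted",
        "updated": datetime(2007, 3, 1, 9, 0)}

winner, push_to = reconcile(bpms, core)
print(winner["status"], push_to)  # bound enterprise
```

The hard part in practice isn’t the comparison, it’s deciding which system is authoritative for which fields, and what to do with in-flight tasks when the underlying record changes out from under them.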

Marcel had some best practices to share: do a proof of concept; hire an experienced consultant; keep in mind that data synchronization is probably a lot more complex than you think it is; use your best business analysts on the workflow rather than the UI; and users want all of their tasks in a single system, whether that’s the BPMS or their email.

Gartner Day 2: Jim Sinur panel

This afternoon, Jim Sinur hosted a panel on Implementing an Enterprise-Transforming BPMS, which included Jeff Akin from American Home Shield, Alan Jones from Sandisk, Craig Edmonds from Symetra Financial and Jodi Starkman-Mendelsohn of West Park Assessment Centre.

American Home Shield’s goal was to double their revenue by 2010 with limited growth in their service centres, which they planned to accomplish by replacing older systems with more agile systems and moving towards a more process-centric view. They’ve just rolled things out so aren’t seeing the ROI yet, but are seeing more consistent customer handling and enforcement of best practices. They’re implementing Pegasystems as their BPMS.

Symetra’s objective was to improve customer satisfaction, since they recognize that it’s much easier to keep a customer than to get a new one, and they used goal management as their approach when building processes. They did what appears to be a fairly standard imaging+workflow type of implementation using Global 360, although with today’s BPM technology that provides greater agility than the older workflow systems. They’ve seen huge ROI numbers, and have increased levels of customer service in terms of transaction turnaround times.

Sandisk has deployed four mission-critical BPM applications using Handysoft, starting with the purchase requisition process, which was paper-based and not scalable. Their goal was to improve employee efficiency by improving the approval cycle time and reducing processing costs. Like American Home Shield, they considered different classes of solutions (a module in their ERP system, or online forms) before finally selecting a BPMS. They reduced the processing cycle time from three weeks to one week, and saw a number of other advantages.

West Park Assessment Centre needed to bolster their IT infrastructure to allow them to grow, and to improve the quality of their services such as scheduling. They also wanted to see cost savings with a three-year ROI, improve the productivity of remote users, and improve operating efficiencies. They wanted to automate their processes from the point that a referral arrived (regardless of channel), through scheduling, booking, reporting, invoicing and all the other tasks that are involved in providing their services. They went live in late 2002 using Ultimus, just in time for the SARS outbreak in early 2003 that locked them out of their hospital-based offices in Toronto. With no access to their physical records, or any space to provide assessment services, they set up shop in a local hotel and were up and running within two business days, due in no small part to their BPM implementation — effectively preventing total business failure. They did get their three-year ROI and reduced turnaround time by 27%; these efficiencies have increased their profitability. By externalizing their business rules and logic in the BPMS, they have improved their agility to the point where they can make changes to their systems within a couple of days.

Although I like to hear the customer case studies, I find these panels to be a pretty artificial construct: it’s like four mini-presentations by customers with a few questions from Sinur at the end of each section, joint questions from the audience at the end, but no interaction between the panelists. I’d really like to see fewer canned presentations and more conversation between the panelists.

Gartner Day 2: Bill Gassman

The afternoon started with several simultaneous sessions by Gartner analysts, and I sat in on Bill Gassman talking about Measuring Processes in Real Time, or as he put it later, learning to live in real time.

There’s no doubt that process visibility is a key benefit gained from BPM, and that visibility usually occurs through the integration of business intelligence (BI) or business activity monitoring (BAM) tools to assist in process monitoring. The goal of BAM is to monitor key objectives, anticipate operational risks, and reduce latency between events and actions, and there are a number of different channels for funneling this information back to those who need to know, such as business intelligence systems for predictive modelling and historical reports, real-time dashboards, and alerts.

So what’s the difference between BI and BAM? According to Gassman, BI is used for insight and planning, and is based on historical — rather than real-time — data. BAM is event driven, and issues alerts when events occur. Personally, I think that there’s a spectrum between his definitions of BI and BAM, and it’s not clear to me that it’s a useful distinction; in many cases, data is trickle-fed from operational systems to BI systems so that the data is near-real-time, allowing dashboards to be driven directly from the BI system. True, traditional BI tools will typically see update intervals more like 15 minutes than the near-real-time event alerts that you’ll find in BAM, but that’s not a problem in some cases.

Gassman discussed the different real-time analytic techniques that are widely used today: process monitoring, logistics optimization (often based on optimizing delivery times while minimizing penalties), situational awareness, pattern matching (complex event processing, or CEP), track and trace (typically used for B2B processes), and comparison between predictions and reality.

Gartner found in a survey 18 months ago that half of their customers surveyed don’t use BAM, and claim that they don’t use it because they don’t really know about it. Considering that BI has long been a technology that can be cost-justified in an extremely short time-frame, and BAM follows the same ROI patterns, I find this surprising (and I had the feeling that they were a bit surprised, too), although I have had large customers who fall into the same category.

Looking at it from a BPM standpoint, automating a process without having appropriate monitoring is risky business: there’s a business value to awareness of what’s happening in your processes, so that problems are detected early, or possibly before they even occur. There’s a natural synergy between BPM and BAM: BPM generates a mound of process instance data, often in an event-driven manner, that just begs to be analyzed, aggregated, sliced and diced.

Gassman discussed some best practices for BAM/BPM synergy before moving on to his definition of the four generations of BAM architecture: isolated silos, standalone, integrated, and composite. We’re still seeing lots of 1st and 2nd generation BAM tools, the 3rd generation has just started happening, and the 4th generation is still at least a year away. He points out that most BPM vendors are adding BAM, but are using a 1st generation BAM system that’s an isolated silo. He sees the potential to move through 5 different styles of BAM automation, that is, how the analysis from the BAM tool feeds back to change the business process. The potential benefits are great as you move from the simple BAM dashboards up through adaptive rules that choose a path based on goals, but the risks increase as well.

BAM is coming from a number of different types of vendors, in spite of the small size of the market, and there will definitely be some convergence and shakeouts in this market. An example of a trend that I think will continue is the recent acquisition of BAM vendor Celequest, used by some BPM vendors as their embedded BAM, by Cognos, a BI vendor. When you’re using BPM, you’re also going to have to face the question of whether to use a BPM vendor’s embedded BAM, or look for a more fully-functional standalone BAM tool. Gassman showed a spider graph of how BPM/BAM matches up against BI on 8 different dimensions, which indicates that you may want to look for a separate product if you need more analytical capability or need to monitor events outside of the process model.

Gartner Day 2: Catching up with BPM bloggers

Lunchtime today was spent chatting with two other BPM bloggers: first, I met with Jesper Joergensen of BEA for a chat about what they’re doing; then I spent some time with Keith Swenson of Fujitsu, mostly talking about BPM standards. Add this to the fact that I had breakfast with Jason Klemow, and there’s been some pretty good BPM blogger networking today.