A Quick Peek at Cordys BPM

A month ago, I had a chance for a comprehensive demo of the Cordys BPMS via Webex, and I saw them briefly at the Gartner show last week. Their suite is of particular interest to me because the entire process life cycle of modelling, execution and monitoring is completely browser-based. I’ve been pushing browser-based process modelling/design for quite a while, since I think that this is the key to widespread collaboration in process modelling across all stakeholders of a process. I’ve reviewed a couple of browser-based process modellers — a full-featured version from Appian, and a front-end process mapping/sketch tool from Lombardi — and if it wasn’t already clear from what Appian has done, Cordys also proves that you can create a fully-functional process designer that runs in a browser and can have participants outside the corporate firewall. Like Appian, however, they currently only support Internet Explorer (and hence Windows), which will limit the collaboration capabilities at some point.

[Screenshot: Cordys BPMN process modeller]

Cordys’ claim is that their modeller is BPMN compliant and supports the entire set of BPMN elements including all of the complex constructs such as transactions and compensation rollback, although I saw a few non-standard visual notations. They also support both XPDL 2.0 and BPEL for import and export, but no word on BPDM. Given this dedication to standards, I find it surprising that they can integrate only with their own ESB and business rules engine, although you could call third-party products via web services. They also have their own content repository (although you can integrate with any repository that allows object access via URL) and their own BAM. In general, I find that when a smaller vendor tries to build everything in a BPM suite themselves, some of the components are going to be lacking; furthermore, many organizations already have corporate standards for some or all of these, and you’d better integrate with the major players or you won’t get in the door.

Like most BPM suites, much of the Cordys process design environment is too complex for the average business user/analyst, and would probably be used by someone on the IT side with input from the business people; a business analyst might draw some of the process models, but as soon as you start clicking on objects and pulling up SOAP syntax, they’re going to be out of there. Like most BPMS vendors, Cordys claims that the process design environment is “targeted towards business people”, but vendors have been making this claim for years now, and the business people have yet to be convinced. To be fair, I was given the demo by the very enthusiastic product architect, who knew that I’m technical, so he took every bell and whistle out for a ride; business users likely see a very different version of the demo.

There’s a lot of functionality here, although nothing that I haven’t seen in some form in other products. There’s support for human-facing tasks either via a browser-based inbox and search functions, or by forwarding the tasks via SMTP to any email system (such as Outlook). There also appear to be shared worklists, but I didn’t get a sense of how automated work allocation could be performed, something that’s required to support high-volume transaction processing environments. There’s also support for web services orchestration to handle the system integration side of the BPM equation.
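To make the work allocation point concrete, here’s a minimal sketch of what I mean by automated allocation: route each new task to the qualified user with the lightest current workload. This is a generic illustration of the concept, not how Cordys (or any particular BPMS) actually does it; the field names and skill model are invented.

```python
# Hypothetical sketch of automated work allocation: route each new task to the
# qualified user with the fewest open tasks. Purely illustrative, not a real
# BPMS allocation engine.

def allocate_task(task, users):
    """Pick the least-loaded user whose skills cover the task's required skill."""
    qualified = [u for u in users if task["skill"] in u["skills"]]
    if not qualified:
        return None  # fall back to a shared worklist or supervisor queue
    chosen = min(qualified, key=lambda u: u["open_tasks"])
    chosen["open_tasks"] += 1
    return chosen["name"]

users = [
    {"name": "alice", "skills": {"claims", "billing"}, "open_tasks": 4},
    {"name": "bob",   "skills": {"claims"},            "open_tasks": 2},
]

print(allocate_task({"id": 101, "skill": "claims"}, users))  # -> "bob"
```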

One thing that I like is the visual process debugger: although you have to hack a bit of XML to kick things off, you can step through a process, calling web services and popping up user interfaces as you hit the corresponding steps, and stepping over or into subprocesses (very reminiscent of a code debugger, but in a visual form).

They’ve also done a good job on the object repository, which helps increase reusability of objects and allows you to search for processes and artifacts (such as forms or web services) to see where they’re used. Any process that’s built can also be exposed as a web service: just add inputs and outputs at the start and end points and the WSDL is auto-generated, allowing the process to be called as a service from any other application or service.
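For anyone who hasn’t worked with this pattern, calling a process exposed this way looks just like calling any other SOAP service. A minimal sketch using the Python zeep client follows; the WSDL URL, operation name and parameters are hypothetical placeholders, not actual Cordys endpoints.

```python
# Minimal sketch of invoking a process that has been exposed as a web service
# via an auto-generated WSDL, using the zeep SOAP client. Endpoint and
# operation names are made up for illustration.
from zeep import Client

client = Client("http://bpm.example.com/processes/OrderApproval?wsdl")

# The WSDL describes the process inputs/outputs, so starting the process is
# just another SOAP operation call.
result = client.service.StartOrderApproval(orderId="12345", amount=250.00)
print(result)
```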

[Screenshot: Cordys mashup]

<geek>Another thing that I really liked is the AJAX-based framework and modelling layer for UI/forms design, which is an extension of XForms. In addition to a nice graphical UI design environment, you can generate a working user interface directly from the WSDL of a web service — something that I’ve seen in other products such as webMethods, but still think is cool — and run it immediately in the designer. In the demo that I saw, the architect found an external currency conversion web service, introspected it with the designer and generated a form representing the web service inputs and outputs, which he popped directly onto the page; he could then run it in debug mode, or rearrange and change the form objects. Any web service in the internal repository — including a process — can be dragged from the repository directly onto the page to auto-generate the UI. Linked data objects on a form communicate directly (when possible) without returning to the server, in true AJAX fashion, and you can easily create mashups such as the example that I saw with the external currency converter, a database table, and MSN Messenger. For the hardcore among us, you can also jump directly to the underlying scripting code.</geek>
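The WSDL-to-form idea is easier to picture with a toy example. The sketch below assumes you’ve already pulled the input parameters out of a WSDL (a real implementation would introspect the WSDL itself with a SOAP toolkit) and simply renders a bare-bones HTML form from them; the operation and parameter names are hypothetical, and this is a drastic simplification of what the Cordys designer does.

```python
# Simplified illustration of "generate a form from a web service's inputs":
# given the input parameters read out of a WSDL, emit a minimal HTML form.

def form_from_inputs(operation, params):
    """params is a list of (parameter_name, html_input_type) pairs."""
    fields = "\n".join(
        f'  <label>{name}: <input name="{name}" type="{html_type}"></label>'
        for name, html_type in params
    )
    return (f'<form action="/invoke/{operation}" method="post">\n'
            f'{fields}\n  <button>Call</button>\n</form>')

print(form_from_inputs("ConvertCurrency",
                       [("fromCurrency", "text"),
                        ("toCurrency", "text"),
                        ("amount", "number")]))
```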

Unfortunately, the AJAX framework is not available as a separate offering, only as part of the BPMS; I think that Cordys could easily spin this off as a pretty nice browser-based development environment, particularly for mashups.

Gartner Day 3: Jim Sinur scenario-based rules panel

Jim Sinur hosted a case study panel on scenario-based rules with two presenters: David Luce at UTi (a logistics outsourcing firm) and Husayn Alvarez-Gomariz at Micron (a semiconductor manufacturer).

Luce started out talking about UTi, and how, as a logistics provider, they are actually a business process outsourcer. They pride themselves on customer intimacy, but that drives up their operational costs, since there are so many manual, special-case processes. They were looking for ways to maintain the same level of customer intimacy while automating processes and rules wherever possible in order to increase efficiency and drive down costs, and what they devised was a rules-driven architecture where they use business rules as a policy validation tool. They’ve externalized rules from legacy code into a business rules management system, which provides them with the level of agility that they need to provide customized service to their customers while still automating their processes.
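The value of externalizing rules is easiest to see in miniature: once policy checks live as data rather than as buried legacy code, they can be changed without redeploying the application. The toy sketch below illustrates the idea with invented shipping rules; it is not UTi’s actual rules architecture or rule set.

```python
# Generic illustration of externalizing policy-validation rules from code into
# data. The rules and shipment fields are invented for illustration only.

# Rules expressed as data: (description, predicate over the shipment)
rules = [
    ("Hazardous goods need a handling certificate",
     lambda s: not s["hazardous"] or s["has_certificate"]),
    ("Declared value over 100k requires manager approval",
     lambda s: s["declared_value"] <= 100_000 or s["manager_approved"]),
]

def validate(shipment):
    """Return the descriptions of all rules that the shipment violates."""
    return [desc for desc, check in rules if not check(shipment)]

shipment = {"hazardous": True, "has_certificate": False,
            "declared_value": 50_000, "manager_approved": False}
print(validate(shipment))  # -> ['Hazardous goods need a handling certificate']
```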

Alvarez-Gomariz discussed scenario analysis, and how to use scenarios to provide the agility to respond to changing market events. His talk was both detailed and abstract, not a good combination for holding my attention, although he had some good points about the intersection between BPM, BI and planning.

Like yesterday’s panel session, this was really more like two separate 30-minute presentations, with no interaction between the panelists. This format should definitely be changed to something more interactive, or be labelled as consecutive short presentations rather than a panel.

Although it’s only lunchtime, this was my last session of the day and of the conference: I’m on a flight back to Toronto in a couple of hours. I didn’t blog about the fun at the vendor hospitality suites, but suffice it to say that it included Michael Beckley in a very tropical hat (he also had a “Made in Mexico” sticker on his forehead at one point, but I couldn’t verify that statement with his parents), Scott the hotel bartender talking about SOA and Six Sigma, and a vendor ending up in my room for the night.

I hope that you enjoyed my coverage of the conference; I’ve had a lot of great feedback from people here, and I’ll soon catch up with the comments that you’ve added to my posts in the last couple of days.

Gartner Day 3: Fair Isaac customer session

For the second half of this morning’s vendor sessions, I sat in on Fair Isaac’s customer presentation, with Michele Sprayregen Edelman of Discover Financial Services speaking on Managing Business Rules and Analytics as an Enterprise Asset. As the largest proprietary credit card network in the US, with 50 million cardholders and 4 million merchant and cash access locations, they need to have a good handle not just on what their customers are doing, but on how current market trends will change what their customers want to do in the future.

To them, this means using an advanced decision management environment: start with criteria- and rule-based decisions, then automate processes with business rule management, then increase decision precision with predictive analytics, and finally use those analytics to optimize strategies. They’re only part of the way along this route, but are starting to automate decisions in a more sophisticated manner for things such as individual purchase approval/denial, in order to increase revenue and reduce losses.

They wanted a modelling environment that could be used by analysts without requiring IT support, as well as methods for integrating with the transactional systems to automate decisions. They use other decisioning tools besides Fair Isaac’s, including SAS, and combine the decisions from all of the systems in order to make the ultimate decision. When you look at what they’ve done, even in the simplified diagrams that Edelman showed us, it’s hugely complex, but it provides them with a significant competitive advantage: they’re using automated decisioning in a number of different areas across their organization, including portfolio scoring, dispute processing, customer contact strategy and many others.
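To give a flavour of what combining decisions from multiple systems can look like, here’s a hypothetical sketch: a rules engine gets a veto, and otherwise the scores from several models are blended against a cutoff. The thresholds, weighting and model names are entirely invented; this is not Discover’s actual decision logic.

```python
# Hypothetical sketch of combining decisions from multiple decisioning systems
# (a rules engine plus scoring models) into a single outcome. All values are
# invented for illustration.

def combine_decisions(rule_decision, scores, cutoff=0.6):
    """Rules can hard-decline; otherwise average the model scores."""
    if rule_decision == "decline":
        return "decline"
    blended = sum(scores.values()) / len(scores)
    return "approve" if blended >= cutoff else "refer"

print(combine_decisions("approve",
                        {"fraud_model": 0.72, "credit_model": 0.65}))  # approve
```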

She presented some final recommendations, the primary one being the importance of the data infrastructure that’s going to drive the decisioning.

Gartner Day 3: Microsoft session

I wanted to stop in on the Microsoft session, People-Ready Processes, in part because I’m a bit confused about what Microsoft is doing in this area, and in part because of the Business Process Alliance announcement from Monday. Microsoft sees themselves as a force for commoditizing (and, in the subtext, dumbing down) technology so that it is accessible to a much wider audience, and this presentation was Burley Kawasaki’s take on how they’re doing that for BPM. He describes people-ready processes as a fusion of document-centric processes and system-centric processes, and I really wish that he (and many other people in the industry) would stop equating human-centric with document-centric. Although human-facing BPM grew out of the workflow that started in document imaging systems, that was a long time ago, and there are many instances of human-facing BPM that don’t include documents — depending, of course, on how you define a document.

My previous view of Microsoft BizTalk was as a B2B message broker or an internal ESB. My view of SharePoint was as a collaboration and document management platform. I wanted to see how Microsoft was bringing together the technologies and concepts from both of these to create a seamless BPM solution.

Kawasaki showed a spectrum of BPM application types, from collaborative to transactional processes: individual ad hoc processes (e.g., individual task lists), human semi-structured processes (e.g., vacation approval), system highly-structured processes (e.g., expense reporting) and fixed processes (e.g., supply chain). He then overlaid a split between a collaboration server and a process server, with some overlap in the middle of the spectrum, and labelled these as SharePoint and BizTalk. My heart sank.

Okay, you can have a SharePoint collaboration or document kick off a BizTalk process, but that’s not the same as having a single end-to-end BPM solution. In the future, the Windows Workflow Foundation will be used as the underlying process infrastructure for both SharePoint and BizTalk, which might help to integrate them more closely.

He finished up with a light-speed overview of the Microsoft process platform roadmap, which includes Windows Workflow Foundation, the .Net framework, Office (including SharePoint) and BizTalk. He also made a big push for the benefits of a platform and partner ecosystem rather than a single-vendor “closed and proprietary” BPM stack. Not sure that I’m convinced.

Gartner Day 3: Yvonne Genovese keynote

We started the last day at the Gartner summit with a keynote by Yvonne Genovese, Business Applications Through 2010: Major Changes Will Affect Your Process Environment. Early in her presentation, she made an important statement: “the technology keeps breaking our processes”. Her focus is on business applications, not specifically BPM, but she’s looking at trends of what’s happening with enterprise applications like ERP and CRM systems. Her point is that these business applications have, in the past, forced businesses to use rigid business processes implemented within those systems.

However, the current trend is towards unbundling some of this functionality, exposing it through services, then consuming those services using a BPMS. This allows you not only to call specific functionality from your business applications at any point in a process that you now control, but also to replace or augment the functionality of those applications by calling other services. It also makes it easier to integrate between business applications if you have more than one in your environment. Although the business application vendors have been pushing suites for some time now, that packaging model will become less compelling as organizations start to slice and dice the atomic functionality of the business applications and compose their own processes using BPM rather than use the suite in its monolithic form.

Business applications aren’t going away: there’s still a huge amount of good functionality available in them, and as long as that commoditized functionality can be consumed as services, you’re not going to be writing a replacement yourself. What I think will happen, however, is that the amount of the functionality used from any given business application platform will begin to erode as other internal or external services replace some of that functionality. This frees organizations from the vendor lock-in that they’re subjected to now, and adds a new possibility for creating business applications: instead of just “buy” or “build”, you can now also “compose”. And if the megavendors in this field are going to stay competitive, they need to embrace and encourage an ecosystem that allows smaller vendors to provide services that can easily be integrated with their larger platform. This isn’t going to be the old model of the vendor controlling the ecosystem by anointing their favourite technology partners, however: the customer organizations are going to build their own ecosystem from their preferred vendors in a truly best-of-breed fashion.
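The “compose” option is really just a dependency on a service contract rather than on a vendor. The toy sketch below makes that concrete: a process step consumes a credit-check service that could come from the ERP suite today and from a different provider tomorrow. The service names and behaviour are hypothetical stand-ins, not any real vendor’s API.

```python
# Toy sketch of the "compose" option: a process step depends on a service
# contract, not on which vendor provides it. Everything here is a stand-in.

def credit_check_erp(customer_id):
    return {"source": "ERP", "approved": True}      # stand-in for an ERP-exposed service

def credit_check_external(customer_id):
    return {"source": "bureau", "approved": True}   # stand-in for a third-party service

# Swap implementations without changing the composed process:
credit_check = credit_check_external

def order_process(order):
    result = credit_check(order["customer_id"])
    return "fulfil" if result["approved"] else "reject"

print(order_process({"customer_id": "C-42"}))  # -> "fulfil"
```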

At the end of the day, BPM is an essential part of all this, since it will be used as a composition framework for combining functionality from business applications, along with internal and external services, into the processes that the business really needs.

Gartner Day 2: Jim Sinur (again)

I finished up the day by attending Jim Sinur’s session on continuous optimization. And thanks to Gartner, we have a brand new acronym: BOSS, for business optimization support systems.

He has an interesting take on optimization that I agree with: it’s an antidote to entropy. The laws of entropy say that systems tend to become more chaotic over time, so you have to have something in place that will actively head off that slide into chaos. Continuous improvement is not, however, a replacement for disruptive or radical change within an organization: the former provides refinements along the way to a goal, while the latter causes a change in direction towards a new goal.

He defined continuous optimization as “keeping a process efficient, effective and relevant under all possible and changing conditions,” and laid out a list of benefits of continuous process optimization, not the least of which is creating a balance amongst competing goals: sacrificing a local optimization in favour of an overall optimization.

There was some repeated material from Bill Gassman’s BI/BAM presentation earlier today, but Sinur went into a number of other areas, such as understanding both the drivers for process optimization and the inhibitors to its adoption. It’s essential to link processes to desired outcomes so that the goals of optimization are well understood, and you also have to anticipate the shift to indeterminate/ad hoc/collaborative processes that don’t have pre-determined process maps, but are usually triggered by events and are goal-driven.

He looked at how to discover the opportunities for optimization, and how to select the proper optimization capability from a set of optimization tools and techniques. He also made some good points about matching your optimization method to your risk profile, something that I’ve heard in earlier presentations this week: if you’re very risk-averse, for example, you’re unlikely to have self-optimizing systems that change their own behaviour based on patterns of events in the system.

This is a growth area, and one that can provide some competitive advantage: only the leading organizations are using this technology now, and it has the potential to make a huge impact on a company’s agility.

Gartner Day 2: BEA sessions

I really wanted to attend Daryl Plummer’s analyst/user roundtable on BPM and Web 2.0, but they don’t let press into those sessions, so I ducked in to hear Jesper Joergenson of BEA talk about Best Practices in Business Transformation. Jesper, I know that you’re reading this — no offence intended on being my second choice 🙂  I stayed through both half-hour sessions this time, seeing Jesper talk first, then BEA’s customer, Christophe Marcel of Integro Insurance Brokers with Building the Business Case for BPM.

Joergenson started with a cooking theme for this “BPM secret sauce” talk: start with sharp knives, make big meals of small dishes, measure to taste and adjust as required, have a recipe, and follow the recipe. In BPM, this translates to start with common tools, build a platform out of small projects, use simulation and measurement, have established best practices, and follow those best practices. Cute theme, and some nice cooking utensil graphics, although I have to admit that I rarely follow a recipe in the kitchen, even if I bother to have one.

He talked about the importance of modelling tools for business users, with a shared process model used by the IT side for implementation, to avoid the inevitably incomplete round-tripping that happens when you model in one tool and implement in another. He also discussed how to identify suitable first targets for BPM implementation — low complexity, high impact, and low maturity level — while planning for scale in both the tool selection and the methodology, since one successful project will breed demand. He briefly discussed process simulation and measurement/monitoring, and the importance of a process centre of excellence.

After a brief break, Christophe Marcel talked about their experiences with BPM. Their focus was on integration, tying together a number of existing systems with a minimum of new development. They made use of both human-facing tasks and web services calls to update data in the underlying enterprise systems, and built their own web-based user interface. In addition to the enterprise data systems, they integrated Microsoft SharePoint as their document management system.

One of the major challenges, which I’ve seen many times before when integrating BPM with enterprise systems, is the issue of data synchronization. When data is replicated into the BPMS for display or control purposes, any changes to the data in either the BPMS or the underlying enterprise system need to be considered for replication to the other system. Similarly, if an entire insurance program is sold, all tasks in the BPMS may need to be updated or deleted to reflect that change.
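A bare-bones sketch of the reconciliation problem helps show why this is harder than it first appears: even with just two copies of the data, you need some policy for deciding which side wins. The field names and last-writer-wins policy below are illustrative only, not how Integro (or any particular BPMS) handles it.

```python
# Generic sketch of BPMS/enterprise-system data synchronization: compare the
# replicated fields and decide which direction each change should flow.
# Assumes (purely for illustration) a per-field last-modified timestamp and a
# last-writer-wins policy.

def reconcile(bpms_record, system_record, fields):
    """Return the field updates to push to the BPMS and to the enterprise system."""
    to_bpms, to_system = {}, {}
    for f in fields:
        if bpms_record[f]["value"] == system_record[f]["value"]:
            continue  # already in sync
        if bpms_record[f]["updated"] > system_record[f]["updated"]:
            to_system[f] = bpms_record[f]["value"]
        else:
            to_bpms[f] = system_record[f]["value"]
    return to_bpms, to_system

bpms   = {"status": {"value": "bound",  "updated": 2}}
system = {"status": {"value": "quoted", "updated": 1}}
print(reconcile(bpms, system, ["status"]))  # push "bound" back to the enterprise system
```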

Marcel had some best practices to share: do a proof of concept; hire an experienced consultant; keep in mind that data synchronization is probably a lot more complex than you think it is; use your best business analysts on the workflow rather than the UI; and remember that users want all of their tasks in a single system, whether that’s the BPMS or their email.

Gartner Day 2: Jim Sinur panel

This afternoon, Jim Sinur hosted a panel on Implementing an Enterprise-Transforming BPMS, which included Jeff Akin from American Home Shield, Alan Jones from SanDisk, Craig Edmonds from Symetra Financial and Jodi Starkman-Mendelsohn of West Park Assessment Centre.

American Home Shield’s goal was to double their revenue by 2010 with limited growth in their service centres, which they planned to accomplish by replacing older systems with more agile ones and moving towards a more process-centric view. They’ve just rolled things out so aren’t seeing the ROI yet, but are seeing more consistent customer handling and enforcement of best practices. They’re implementing Pegasystems as their BPMS.

Symetra’s objective was to improve customer satisfaction, since they recognize that it’s much easier to keep a customer than to get a new one, and they used goal management as their approach when building processes. They did what appears to be a fairly standard imaging+workflow type of implementation using Global 360, although today’s BPM technology provides greater agility than the older workflow systems did. They’ve seen huge ROI numbers, and have increased levels of customer service in terms of transaction turnaround times.

SanDisk has deployed four mission-critical BPM applications using HandySoft, starting with the purchase requisition process, which was paper-based and not scalable. Their goal was to improve employee efficiency by improving the approval cycle time and reducing processing costs. Like American Home Shield, they considered different classes of solutions, including a module in their ERP system and online forms, before finally selecting a BPMS. They reduced the processing cycle time from three weeks to one week, and saw a number of other advantages.

West Park Assessment Centre needed to bolster their IT infrastructure to allow them to grow, and to improve the quality of services such as scheduling. They also wanted cost savings with a three-year ROI, improved productivity for remote users and improved operating efficiencies. They wanted to automate their processes from the point that a referral arrived (regardless of channel) through scheduling, booking, reporting, invoicing and all the other tasks involved in providing their services. They went live in late 2002 using Ultimus, just in time for the SARS outbreak in early 2003 that locked them out of their hospital-based offices in Toronto. With no access to their physical records, or any space to provide assessment services, they set up shop in a local hotel and were up and running within two business days, due in no small part to their BPM implementation — effectively preventing total business failure. They did get their three-year ROI and reduced turnaround time by 27%; these efficiencies have increased their profitability. By externalizing their business rules and logic in the BPMS, they have improved their agility to the point where they can make changes to their systems within a couple of days.

Although I like to hear the customer case studies, I find these panels to be a pretty artificial construct: it’s like four mini-presentations by customers, with a few questions from Sinur at the end of each section and joint questions from the audience at the end, but no interaction between the panelists. I’d really like to see fewer canned presentations and more conversation between the panelists.

Gartner Day 2: Bill Gassman

The afternoon started with several simultaneous sessions by Gartner analysts, and I sat in on Bill Gassman talking about Measuring Processes in Real Time, or as he put it later, learning to live in real time.

There’s no doubt that process visibility is a key benefit gained from BPM, and that visibility usually comes through the integration of business intelligence (BI) or business activity monitoring (BAM) tools to assist in process monitoring. The goal of BAM is to monitor key objectives, anticipate operational risks, and reduce the latency between events and actions, and there are a number of different channels for funnelling this information back to those who need to know, such as business intelligence systems for predictive modelling and historical reports, real-time dashboards, and alerts.

So what’s the difference between BI and BAM? According to Gassman, BI is used for insight and planning, and is based on historical — rather than real-time — data. BAM is event driven, and issues alerts when events occur. Personally, I think that there’s a spectrum between his definitions of BI and BAM, and it’s not clear to me that it’s a useful distinction; in many cases, data is trickle-fed from operational systems to BI systems so that the data is near-real-time, allowing dashboards to be driven directly from the BI system. True, traditional BI tools will typically see update intervals more like 15 minutes than the near-real-time event alerts that you’ll find in BAM, but in many cases that’s not a problem.
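For what it’s worth, the distinction Gassman is drawing can be boiled down to where the evaluation happens: per event as it arrives, or over accumulated history. The toy sketch below contrasts the two styles; the event fields and the 30-minute threshold are invented for illustration.

```python
# Toy contrast: BAM-style evaluation reacts to each event as it arrives, while
# BI-style reporting aggregates over accumulated history. All values invented.

history = []

def on_event(event):                      # BAM-style: evaluate per event
    history.append(event)
    if event["wait_minutes"] > 30:
        print(f"ALERT: instance {event['id']} waited {event['wait_minutes']} min")

def daily_report():                       # BI-style: aggregate over history
    avg = sum(e["wait_minutes"] for e in history) / len(history)
    return f"average wait today: {avg:.1f} min"

on_event({"id": "A-1", "wait_minutes": 12})
on_event({"id": "A-2", "wait_minutes": 45})   # triggers an alert
print(daily_report())
```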

Gassman discussed the different real-time analytic techniques that are widely used today: process monitoring, logistics optimization (often based on optimizing delivery times while minimizing penalties), situational awareness, pattern matching (complex event processing, or CEP), track and trace (typically used for B2B processes), and comparison between predictions and reality.

Gartner found in a survey 18 months ago that half of the customers surveyed don’t use BAM, claiming that they don’t use it because they don’t really know about it. Considering that BI has long been a technology that can be cost-justified in an extremely short time frame, and that BAM follows the same ROI patterns, I find this surprising (and I had the feeling that they were a bit surprised, too), although I have had large customers who fall into the same category.

Looking at it from a BPM standpoint, automating a process without having appropriate monitoring is risky business: there’s a business value to awareness of what’s happening in your processes, so that problems are detected early, or possibly before they even occur. There’s a natural synergy between BPM and BAM: BPM generates a mound of process instance data, often in an event-driven manner, that just begs to be analyzed, aggregated, sliced and diced.
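To show what “slicing and dicing” process instance data can look like in practice, here’s a quick sketch using pandas: given an event log with one row per step execution, compute the average step duration by step name. The column names and sample data are hypothetical, and this assumes pandas is available.

```python
# Quick sketch of aggregating BPM instance data: average step duration by step
# name from a (hypothetical) event log with one row per step execution.
import pandas as pd

events = pd.DataFrame([
    {"instance": 1, "step": "Review",  "start": "2007-02-28 09:00", "end": "2007-02-28 09:20"},
    {"instance": 1, "step": "Approve", "start": "2007-02-28 09:30", "end": "2007-02-28 09:35"},
    {"instance": 2, "step": "Review",  "start": "2007-02-28 10:00", "end": "2007-02-28 10:50"},
])
events["duration_min"] = (pd.to_datetime(events["end"]) -
                          pd.to_datetime(events["start"])).dt.total_seconds() / 60

print(events.groupby("step")["duration_min"].agg(["mean", "count"]))
```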

Gassman discussed some best practices for BAM/BPM synergy before moving on to his definition of the four generations of BAM architecture: isolated silos, standalone, integrated, and composite. We’re still seeing lots of 1st and 2nd generation BAM tools; the 3rd generation has just started happening, and the 4th generation is still at least a year away. He points out that most BPM vendors are adding BAM, but are using a 1st generation BAM system that’s an isolated silo. He also sees the potential to move through five different styles of BAM automation, that is, five ways that the analysis from the BAM tool can feed back to change the business process. The potential benefits are great as you move from simple BAM dashboards up through adaptive rules that choose a path based on goals, but the risks increase as well.

BAM is coming from a number of different types of vendors, in spite of the small size of the market, and there will definitely be some convergence and shakeouts in this market. An example of a trend that I think will continue is the recent acquisition of BAM vendor Celequest, used by some BPM vendors as their embedded BAM, by Cognos, a BI vendor. When you’re using BPM, you’re also going to have to face the question of whether to use a BPM vendor’s embedded BAM, or look for a more fully-functional standalone BAM tool. Gassman showed a spider graph of how BPM/BAM matches up against BI on 8 different dimensions, which indicates that you may want to look for a separate product if you need more analytical capability or need to monitor events outside of the process model.