A Quick Peek at Cordys BPM

A month ago, I had the chance to see a comprehensive demo of the Cordys BPMS via Webex, and I saw them briefly at the Gartner show last week. Their suite is of particular interest to me because the entire process life cycle of modelling, execution and monitoring is completely browser-based. I’ve been pushing browser-based process modelling/design for quite a while, since I think that this is the key to widespread collaboration in process modelling across all stakeholders of a process. I’ve reviewed a couple of browser-based process modellers — a full-featured version from Appian, and a front-end process mapping/sketch tool from Lombardi — and if it wasn’t already clear from what Appian has done, Cordys also proves that you can create a fully-functional process designer that runs in a browser and can have participants outside the corporate firewall. Like Appian, however, they currently only support Internet Explorer (and hence Windows), which will limit the collaboration capabilities at some point.

[Screenshot: Cordys BPMN process modeller]

Cordys’ claim is that their modeller is BPMN compliant and supports the entire set of BPMN elements including all of the complex constructs such as transactions and compensation rollback, although I saw a few non-standard visual notations. They also support both XPDL 2.0 and BPEL for import and export, but no word on BPDM. Given this dedication to standards, I find it surprising that they can integrate only with their own ESB and business rules engine, although you could call third-party products via web services. They also have their own content repository (although you can integrate with any repository that allows object access via URL) and their own BAM. In general, I find that when a smaller vendor tries to build everything in a BPM suite themselves, some of the components are going to be lacking; furthermore, many organizations already have corporate standards for some or all of these, and you’d better integrate with the major players or you won’t get in the door.

Like those of most BPMSs, much of the Cordys process design environment is too complex for the average business user/analyst, and would probably be used by someone on the IT side with input from the business people; a business analyst might draw some of the process models, but as soon as you start clicking on objects and pulling up SOAP syntax, they’re going to be out of there. Like most BPMS vendors, Cordys claims that the process design environment is “targeted towards business people”, but vendors have been making this claim for years now, and the business people have yet to be convinced. To be fair, I was given the demo by the very enthusiastic product architect, who knew that I’m technical, so he took every bell and whistle out for a ride; business users likely see a very different version of the demo.

There’s a lot of functionality here, although nothing that I haven’t seen in some form in other products. There’s support for human-facing tasks either via a browser-based inbox and search functions, or by forwarding tasks via SMTP to any email client (such as Outlook). There also appear to be shared worklists, but I didn’t get a sense of how automated work allocation could be performed, something that’s required to support high-volume transaction processing environments. There’s also support for web services orchestration to handle the system integration side of the BPM equation.

One thing that I like is the visual process debugger: although you have to hack a bit of XML to kick things off, you can step through a process, calling web services and popping up user interfaces as you hit the corresponding steps, and stepping over or into subprocesses (very reminiscent of a code debugger, but in a visual form).

They also do a good job with the object repository, which increases the reusability of objects and allows you to search for processes and artifacts (such as forms or web services) to see where they’re used. Any process that’s built can also be exposed as a web service: just add inputs and outputs at the start and end points and the WSDL is auto-generated, allowing the process to be called as a service from any other application or service.
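
To make that concrete, here’s a minimal sketch of what calling such a process-as-a-service might look like from a client, using only the Python standard library; the endpoint, namespace and operation name are hypothetical, since the real ones would come from the auto-generated WSDL:

```python
# A minimal sketch of calling a process that has been exposed as a web
# service. The endpoint, namespace and operation name are hypothetical;
# real values would come from the auto-generated WSDL.
import urllib.request

ENDPOINT = "http://bpm.example.com/services/OrderProcess"  # hypothetical

soap_envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:ord="http://example.com/orderprocess">
  <soapenv:Body>
    <ord:StartOrderProcess>
      <ord:customerId>C-1001</ord:customerId>
      <ord:amount>250.00</ord:amount>
    </ord:StartOrderProcess>
  </soapenv:Body>
</soapenv:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=soap_envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "StartOrderProcess"},
)
with urllib.request.urlopen(request) as response:
    # The SOAP response carries whatever outputs were defined at the
    # process end point.
    print(response.read().decode("utf-8"))
```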

[Screenshot: Cordys mashup]

<geek>Another thing that I really liked is the AJAX-based framework and modelling layer for UI/forms design, which is an extension of XForms. In addition to a nice graphical UI design environment, you can generate a working user interface directly from the WSDL of a web service — something that I’ve seen in other products such as webMethods, but I still think is cool — and run it immediately in the designer. In the demo that I saw, the architect found an external currency conversion web service, introspected it with the designer and generated a form representing the web service inputs and outputs that he popped directly onto the page, where he could then run it directly in debug mode, or rearrange and change the form objects. Any web service in the internal repository — including a process — can be dragged from the repository directly onto the page to auto-generate the UI. Linked data objects on a form communicate directly (when possible) without returning to the server in true AJAX fashion, and you can easily create mashups such as the example that I saw with the external currency converter, a database table, and MSN Messenger. For the hardcore among us, you can also jump directly to the underlying scripting code.</geek>
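
For flavour, here’s roughly what that introspection step amounts to: read the WSDL and turn each message part into a candidate form field. This is a hand-rolled simplification (top-level message parts only, with a hypothetical WSDL URL), not the actual Cordys mechanism:

```python
# A rough sketch of WSDL introspection: read a WSDL and turn each
# <message>/<part> element into a candidate form field. Real WSDLs often
# define parameters in embedded schemas, which this deliberately ignores.
import urllib.request
import xml.etree.ElementTree as ET

WSDL_URL = "http://www.example.com/CurrencyConvertor.wsdl"  # hypothetical
WSDL_NS = "{http://schemas.xmlsoap.org/wsdl/}"

with urllib.request.urlopen(WSDL_URL) as response:
    root = ET.parse(response).getroot()

for message in root.findall(f"{WSDL_NS}message"):
    print(f"Message: {message.get('name')}")
    for part in message.findall(f"{WSDL_NS}part"):
        # Each part's name and type map naturally to a form field and
        # its widget type (text box, number field, etc.).
        field_type = part.get("type") or part.get("element")
        print(f"  field: {part.get('name')}  type: {field_type}")
```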

Unfortunately, the AJAX framework is not available as a separate offering, only as part of the BPMS; I think that Cordys could easily spin this off as a pretty nice browser-based development environment, particularly for mashups.

CMU Masters in Software Management

Often, when I receive a request for a meeting on something that’s far outside of my usual BPM/Enterprise 2.0 interests, I’ll turn it down. However, when the meeting is with various deans and professors at Carnegie Mellon University West about their new Masters in Software Management program (press release here), I’m happy to make an exception. I graduated as an engineer over 20 years ago, and programs like this just weren’t available then; I was curious to see how engineering education has advanced. I had a call with Dr. Jim Morris, dean of the CMU west coast campus, Dr. Martin Griss, associate dean for education and director of the software engineering program, and Tony Wasserman, executive director of the Center for Open Source Investigation. Of course, they’re all professors at CMU as well, at the relatively new campus in Silicon Valley.

The Masters in Software Management is like a software engineering equivalent to an executive MBA: it’s intended for people who are already experienced practitioners but want to improve their management skills in a big way, and do so part-time while they continue to work so that they can start to see immediate application and benefit. It grew out of the high level of interest in the management courses offered as part of the Masters in Software Engineering program that’s been running since 2002, as well as interest from employers in the marketplace for the skills that they plan to teach. The Masters in Software Management is less technical than the Masters in Software Engineering, but offers some amazing courses that I think should work their way into any senior software engineering or computer science curriculum: open source, enterprise architecture, managing distributed teams, outsourcing, and many others. Since these are presented in a current business context, using long-running teams and simulating a small company experience, the goal is to produce the next generation of software leaders.

The program doesn’t kick off until later this year, so they don’t know the demographics of the student population yet, but are expecting that most will have a technical computer science/software engineering background, and that there will be a mix of those from small companies who want to improve their skills and build the next Google, and some from large companies who are either closet entrepreneurs or are serious about software management within their organization. About 1/3 of the Masters in Software Engineering program attendees are women, and they expect the percentage to be higher in the Software Management program. As in the Software Engineering program (where about 30% of the students are offsite), they’ll allow remote students, although they need to be onsite for the 4-day kickoff and a few more times during the 2-year program.

BPM in Action panel

If you enjoyed the free-for-all discussion at the end of the webinar that I moderated with Colin Teubner and Jim Rudden a few weeks back, you’re going to love the panel that I’m hosting next week on BPM and Enterprise 2.0 as part of the BPM in Action series. It’s not sponsored by a vendor, so I was able to pick whoever I wanted on the panel, and there will be no vendor product pitches or slides — just an interactive discussion between the four of us. Get over there and sign up.

Here’s the description that I wrote for it:

As Web 2.0 technologies and concepts advance in the consumer world, they are also impacting enterprise software as users change their expectations of how software should behave. This “Enterprise 2.0” movement is impacting BPM software in a variety of ways ranging from platforms to user functionality to integration. This panel will explore the following:

  • How Web 2.0 is becoming Enterprise 2.0
  • BPM platform changes: the impact of browser-based tools and software as a service
  • New tools and techniques for improving user participation in process design and execution
  • New ways of “mashing up” BPM data with internal and external data

I picked three people to join me whose opinions I value and who I think will be interesting in this format: Phil Gilbert of Lombardi, Phil Larson of Appian, and Ismael Ghalimi of Intalio. They’re all very opinionated and all have a stake in the Web 2.0/Enterprise 2.0 space: Lombardi’s very cool Web 2.0 Blueprint release for widespread collaboration, Appian’s kick-butt browser-based process designer for serious Enterprise 2.0 work in a browser, and Ismael’s involvement in radical office 2.0 and BPM 2.0 ideas.

We only have 45 minutes so I’m going to have to keep a tight rein on the conversation to cover off our proposed subject areas, which could be difficult because I inadvertently invited two Phils to the panel (“Phil, pipe down! No, not you Phil, the other Phil!”). No “call me Ishmael” jokes, I’ve heard them already.

By the way, congratulations to Ismael and his wife on their new arrival. No wonder he wasn’t at the Gartner conference.

BPM splog

If you surf around looking for BPM blogs, you may have noticed something strange: my blog posts from here on Column 2 reproduced in their entirety and without permission on the blog of Mark Bean, the VP of Sales for an ECM/BPM-related vendor, Altien. I’m not linking to them or to the fake blog itself, called “Office 2.0 and ECM News”, since I am definitely not encouraging traffic.

This is a clear violation of my intellectual property and copyright, and I’m amazed that anyone who works in this industry would perpetrate such an openly fraudulent and illegal activity. Maybe that tells you something about how Altien does business in general.

I noticed this a few weeks back, but only realized that he had started stealing full posts (as opposed to significant chunks of them) with my Gartner coverage this week. I sent Bean a request this morning to stop stealing my blog posts, and he replied “Sure thing”, as if I’d asked him about the weather — no apology, no admission that he might have violated blogging etiquette, much less copyright law. I’ve asked him to remove all of my full posts from his site, although obviously there’s no law against him linking to any of my posts and publishing a short excerpt under fair use rules.

Imitation may be the sincerest form of flattery, but in this case, it’s also theft.

Update: According to Altien’s CEO, who left a comment on this post, Mark Bean is no longer in their employ. In my communications with Altien, it was clear that Bean’s activities do not reflect their general business practices.

Gartner Day 3: Jim Sinur scenario-based rules panel

Jim Sinur hosted a case study panel on scenario-based rules with two presenters: David Luce at UTi (a logistics outsourcing firm) and Husayn Alvarez-Gomariz at Micron (a semiconductor manufacturer).

Luce started out talking about UTi, and how as a logistics provider, they are actually a business process outsourcer. They pride themselves on customer intimacy, but that drives up their operational costs since there are so many manual, special-case processes. They were looking for ways to maintain the same level of customer intimacy while automating processes and rules wherever possible in order to increase efficiency and drive down costs, and what they devised was a rules-driven architecture where they use business rules as a policy validation tool. They’ve externalized rules from legacy code into a business rules management system, which provides them with the level of agility that they need to provide customized service to their customers while still automating their processes.
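
To illustrate what externalizing rules buys you, here’s a minimal sketch of the rules-as-data idea for policy validation; the rules and fields are invented for illustration, and a real BRMS layers rule authoring, versioning and audit on top of this:

```python
# A minimal sketch of "externalized rules": policy checks live as data
# instead of being buried in legacy code, so they can be changed without
# redeploying. Rule names and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class Shipment:
    declared_value: float
    destination: str
    hazardous: bool

# Rules as data: (description, predicate). Editing this table changes
# policy without touching the process code that applies it.
POLICY_RULES = [
    ("High-value shipments require manual review",
     lambda s: s.declared_value <= 100_000),
    ("No hazardous goods to embargoed destinations",
     lambda s: not (s.hazardous and s.destination in {"XX", "YY"})),
]

def validate(shipment: Shipment) -> list[str]:
    """Return the descriptions of all rules that the shipment violates."""
    return [desc for desc, ok in POLICY_RULES if not ok(shipment)]

# Both rules fire here, so this shipment would be routed to one of those
# manual, special-case processes.
print(validate(Shipment(declared_value=250_000.0, destination="XX", hazardous=True)))
```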

Alvarez-Gomariz discussed scenario analysis, and how to use scenarios to provide the agility to respond to changing market events. His talk was both detailed and abstract, not a good combination for holding my attention, although he had some good points about the intersection between BPM, BI and planning.

Like yesterday’s panel session, this was really more like two separate 30-minute presentations, with no interaction between the panelists. This format should definitely be changed to something more interactive, or be labelled as consecutive short presentations rather than a panel.

Although it’s only lunchtime, this was my last session of the day and of the conference: I’m on a flight back to Toronto in a couple of hours. I didn’t blog about the fun at the vendor hospitality suites, but suffice to say that it included Michael Beckley in a very tropical hat (he also had a “Made in Mexico” sticker on his forehead at one point, but I couldn’t verify that statement with his parents), Scott the hotel bartender talking about SOA and Six Sigma, and a vendor ending up in my room for the night.

I hope that you enjoyed my coverage of the conference; I’ve had a lot of great feedback from people here, and I’ll soon catch up with the comments that you’ve added to my posts in the last couple of days.

Gartner Day 3: Fair Isaac customer session

For the second half of this morning’s vendor sessions, I sat in on Fair Isaac’s customer presentation, Michele Sprayregen Edelman of Discover Financial Services on Managing Business Rules and Analytics as an Enterprise Asset. As the largest proprietary credit card network in the US with 50 million cardholders and 4 million merchant and cash access locations, they need to have a good handle not just on what their customers are doing, but on how current market trends will change what their customers want to do in the future.

To them, this means using an advanced decision management environment: start with criteria- and rule-based decisions, then automate processes with business rule management, then increase decision precision with predictive analytics, and finally optimize strategies. They’re only partway along this route, but are starting to automate decisions in a more sophisticated manner for things such as individual purchase approval/denial, in order to increase revenue and reduce losses.
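
As a toy illustration of that progression, here’s a sketch with a hard rule layer deciding the clear cases and a predictive score refining the borderline ones; the thresholds and the scoring function are made up for illustration, not Discover’s:

```python
# A toy illustration of tiered decisioning: rules first, then a
# predictive score. Thresholds and the scoring function are invented.
def predicted_fraud_risk(amount: float, merchant_category: str) -> float:
    """Stand-in for a predictive model; returns a risk score in [0, 1]."""
    base = min(amount / 10_000.0, 1.0)
    return min(base + (0.3 if merchant_category == "high-risk" else 0.0), 1.0)

def approve_purchase(amount: float, credit_available: float,
                     merchant_category: str) -> str:
    # Tier 1: criteria- and rule-based decision.
    if amount > credit_available:
        return "DENY: over credit limit"
    # Tier 2: predictive analytics sharpen the borderline cases.
    risk = predicted_fraud_risk(amount, merchant_category)
    if risk > 0.8:
        return "DENY: high fraud risk"
    if risk > 0.5:
        return "REFER: manual review"
    return "APPROVE"

print(approve_purchase(4500.0, 6000.0, "high-risk"))  # REFER: manual review
```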

They wanted a modelling environment that could be used by analysts without requiring IT support, as well as methods for integrating with the transactional systems for automating decisions. They use other decisioning tools besides Fair Isaac’s, including SAS, and combine the decisions from all of the systems in order to make the ultimate decisions. When you look at what they’ve done, even in the simplified diagrams that Edelman showed us, it’s hugely complex but provides a significant competitive advantage: they’re using automated decisioning in a number of different areas across their organization, including portfolio scoring, dispute processing, customer contact strategy and many others.

She presented some final recommendations, the primary one being the importance of the data infrastructure that’s going to drive the decisioning.

Gartner Day 3: Microsoft session

I wanted to stop in on the Microsoft session, People-Ready Processes, in part because I’m a bit confused about what Microsoft is doing in this area, and in part because of the Business Process Alliance announcement from Monday. Microsoft sees itself as a force for commoditizing (and, in the subtext, dumbing down) technology so that it is accessible to a much wider audience, and this presentation was Burley Kawasaki’s take on how they’re doing that for BPM. He describes people-ready processes as a fusion of document-centric processes and system-centric processes, and I really wish that he (and many other people in the industry) would stop equating human-centric with document-centric. Although human-facing BPM grew out of the workflow that started in document imaging systems, that was a long time ago, and there are many instances of human-facing BPM that don’t include documents — depending, of course, on how you define a document.

My previous view of Microsoft BizTalk was as a B2B message broker or an internal ESB. My view of SharePoint was as a collaboration and document management platform. I wanted to see how Microsoft was bringing together the technologies and concepts from both of these to create a seamless BPM solution.

Kawasaki showed a spectrum of BPM application types, from collaborative to transactional processes: individual ad hoc processes (e.g., personal task lists), semi-structured human processes (e.g., vacation approval), highly structured system processes (e.g., expense reporting) and fixed processes (e.g., supply chain). He then overlaid a split between a collaboration server and a process server, with some overlap in the middle of the spectrum, and labelled these as SharePoint and BizTalk. My heart sank.

Okay, you can have a SharePoint collaboration or document kick off a BizTalk process, but that’s not the same as having a single end-to-end BPM solution. In the future, the Windows Workflow Foundation will be used as the underlying process infrastructure for both SharePoint and BizTalk, which might help to integrate them more closely.

He finished up with a light-speed overview of the Microsoft process platform roadmap, which includes Windows Workflow Foundation, the .Net framework, Office (including SharePoint) and BizTalk. He also made a big push for the benefits of a platform and partner ecosystem rather than a single-vendor “closed and proprietary” BPM stack. Not sure that I’m convinced.

Gartner Day 3: Yvonne Genovese keynote

We started the last day at the Gartner summit with a keynote by Yvonne Genovese, Business Applications Through 2010: Major Changes Will Affect Your Process Environment. Early in her presentation, she made an important statement: “the technology keeps breaking our processes”. Her focus is on business applications, not specifically BPM, but she’s looking at trends of what’s happening with enterprise applications like ERP and CRM systems. Her point is that these business applications have, in the past, forced businesses to use rigid business processes implemented within those systems.

However, the current trend is towards unbundling some of this functionality, exposing it through services, then consuming those services using a BPMS. This allows you not only to call specific functionality from your business applications at any point in a process that you now control, but also to replace or augment the functionality of those applications by calling other services. It also provides an opportunity to integrate more easily between business applications if you have multiple ones in your environment. Although the business application vendors have been pushing suites for some time now, that packaging model will become less compelling as organizations start to slice and dice the atomic functionality of the business applications and compose their own processes using BPM rather than use the suite in its monolithic form.
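
Here’s a quick sketch of what “compose” looks like in practice: a process you control calls unbundled application functionality as services, and any step can be swapped for a different service without touching the packaged application. All of the endpoints and payload fields here are hypothetical:

```python
# A sketch of "compose" as a third option beside buy and build: a process
# you control calls unbundled application functionality as services. All
# endpoints and payload fields here are hypothetical.
import json
import urllib.request

def call_service(url: str, payload: dict) -> dict:
    """Generic service call; a real deployment might use SOAP or an ESB."""
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

def onboard_customer(customer: dict) -> dict:
    # Step 1: credit check from the ERP vendor's exposed service...
    credit = call_service("http://erp.example.com/services/credit-check", customer)
    # ...which could be swapped for an external bureau without touching the ERP:
    # credit = call_service("http://bureau.example.com/score", customer)
    # Step 2: create the account via the CRM suite's service, enriched with
    # the credit decision from step 1.
    return call_service("http://crm.example.com/services/create-account",
                        {**customer, "credit_limit": credit["limit"]})
```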

Business applications aren’t going away: there’s still a huge amount of good functionality available in them, and as long as that commoditized functionality can be consumed as services, you’re not going to be writing a replacement yourself. What I think will happen, however, is that the amount of the functionality used from any given business application platform will begin to erode as other internal or external services replace some of that functionality. This frees organizations from the vendor lock-in that they’re subjected to now, and adds a new possibility for creating business applications: instead of just “buy” or “build”, you can now also “compose”. And if the megavendors in this field are going to stay competitive, they need to embrace and encourage an ecosystem that allows smaller vendors to provide services that can easily be integrated with their larger platform. This isn’t going to be the old model of the vendor controlling the ecosystem by anointing their favourite technology partners, however: the customer organizations are going to build their own ecosystem from their preferred vendors in a truly best-of-breed fashion.

At the end of the day, BPM is an essential part of all this, since it will be used as a composition framework for combining functionality from business applications, along with internal and external services, into the processes that the business really needs.

Gartner Day 2: Jim Sinur (again)

I finished up the day by attending Jim Sinur’s session on continuous optimization. And thanks to Gartner, we have a brand new acronym: BOSS, for business optimization support systems.

He has an interesting take on optimization that I agree with: it’s an antidote to entropy. The laws of entropy say that systems tend to become more chaotic over time, so you have to have something in place that will actively head off that slide into chaos. Continuous improvement is not, however, a replacement for disruptive or radical change within an organization: the former provides refinements along the way to a goal, while the latter changes direction towards a new goal.

He defined continuous optimization as “keeping a process efficient, effective and relevant under all possible and changing conditions,” and laid out a list of benefits of continuous process optimization, not the least of which is creating a balance amongst competing goals: sacrificing a local optimization in favour of an overall optimization.

There was some material repeated from Bill Gassman’s BI/BAM presentation earlier today, but Sinur went into a number of other areas, such as understanding both the drivers for process optimization and the inhibitors to its adoption. It’s essential to link processes to desired outcomes so that the goals of optimization are well understood; you also have to anticipate the shift to indeterminate/ad hoc/collaborative processes that have no pre-determined process maps, but are usually triggered by events and are goal-driven.

He looked at how to discover the opportunities for optimization, and selecting the proper optimization capability from a set of optimization tools and techniques. He also made some good points about matching your optimization method and your risk profile, which I’ve heard in earlier presentations this week: if you’re very risk-averse, for example, you’re unlikely to have self-optimizing systems that change their own behaviour based on patterns of events in the system.

This is a growth area, and one that can provide real competitive advantage: only leading organizations are using this technology now, and it has the potential to make a huge impact on a company’s agility.