Gartner BPM Day 1: Welcome and opening keynote

Scrambling down to the conference this morning — I arrived late last night and didn’t get enough sleep, much less a chance to register — I struck up a conversation in the elevator with someone who was already wearing a Gartner conference badge and asked him where the registration area was. He pointed me in the right direction, and said that he hoped that the process was faster than last night, saying that he didn’t know what they were running on their systems but that it was very slow. I tossed off my usual comment about systems that don’t work well — “probably Windows” — then turned to him and saw the Microsoft logo on his shirt. Great, I’m not even at the conference yet, and I’ve made my first enemy. 🙂

The conference kicked off with a welcome from Daryl Plummer, Bill Rosser and Pascal Winckel [all speakers that I reference at this conference are with Gartner unless otherwise noted]. Plummer started off with an audience vote that showed that there are way more business than technical people here, a great (and fairly unusual) thing for a BPM conference. Like most business-focussed conferences, however, the logistics are not blogging-friendly: there’s no wifi, only an internet area where I can plug into a physical cable, and there’s no power at the tables to keep my laptop juiced. In fact, when I ran into Jesper Joergensen from BEA at the break, the first thing that he said to me was “uh oh, no wifi — the conference is going to get a bad review!”

Plummer did tell the best BPM joke of the day so far (not a lot of competition there): What’s the lifecycle of a BPM project? About 2.5 CIOs.

After the preamble and logistics, the opening keynote was given by Janelle Hill. She started out with a great slide on the evolution of process improvement: from scientific management through computerized process flow to our current focus on flexible and adaptive BPM and the start of a focus on SOA (service oriented architecture), BAM (business activity monitoring) and EDA (event driven architecture).

She showed the results of some of Gartner’s recent research, which ranked increasing BPM discipline as the second most important business trend affecting the ability to compete over the next five years, behind only better project/portfolio management.

She went on to talk about how software can contribute to business value, and I had to laugh at one of the conclusions that they draw: “The focus of software must shift to enabling business process innovation rather than hindering it.” In theory, that’s what the software is for; what she’s pointing out is that in reality, software — especially large ERP systems — actually hinders business agility and therefore process improvement because of the difficulty of changing the software to meet current business needs. BPM software provides the opportunity to actually enable process innovation by allowing the business side to make frequent changes to the process to accommodate changing business conditions and regulations. This is based on two fundamental bits of functionality that are part of any BPMS: first, the decoupling of the process flow, represented as a graphical process map, from the underlying technology; with a direct link from the process map to an executable flow, the business can make changes in that graphical environment that move into production with a minimum of IT involvement. Second, tools that provide detailed visibility into processes in near real time, so that the business can determine where changes need to be made in order to improve processes.
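To make the decoupling idea concrete, here’s a minimal sketch (Python, purely illustrative; no real BPMS is anywhere near this simple) of a process flow represented as data, separate from the generic engine that executes it. Changing the process means editing the map, not the engine:

```python
# A process map as plain data: steps and transitions, decoupled from
# the engine that executes them. The business edits this structure
# (via a graphical modeller); the engine below never changes.
process_map = {
    "start":    {"action": "receive_claim",    "next": "validate"},
    "validate": {"action": "validate_claim",   "next": "approve"},
    "approve":  {"action": "manager_approval", "next": "end"},
    "end":      {"action": None,               "next": None},
}

def run_process(process_map, actions, state="start"):
    """Generic engine: walks the map and invokes each step's action."""
    while state is not None:
        step = process_map[state]
        if step["action"]:
            actions[step["action"]]()  # delegate to the implementation
        state = step["next"]

# The step implementations live on the IT side; the flow lives with the business.
actions = {
    "receive_claim":    lambda: print("claim received"),
    "validate_claim":   lambda: print("claim validated"),
    "manager_approval": lambda: print("approval requested"),
}
run_process(process_map, actions)
```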

Interestingly, Gartner is bringing the focus back to the people in processes: putting the person-to-process interaction back at centre stage in terms of both process analysis and execution, rather than just seeing people as bots that execute granular tasks in a process. In other words, the SOA view of human-interrupted processes isn’t what’s going to drive the new wave of process improvement; the people in processes are. Maybe that’s an admission that much of the SOA level of improvement is well-understood, so that there’s unlikely to be quantum leaps in process improvement in that area that haven’t already been identified; on the other hand, we’re just starting to discover how some of the human-facing functionality such as collaboration will result in process improvements that we can’t even envision today as the emergent applications of the future. This is exactly what I’m seeing in the Enterprise 2.0 space, namely, that the new generation of technologies provides the tools that allow the business users and analysts to have more control over how their systems work and therefore the effectiveness of their business processes.

Hill discussed the three types of vendors in the BPM market today: traditional packaged application vendors, middleware vendors, and BPM specialists. The market trend, as she points out, is that the BPM pure-play vendors are increasingly being acquired by the first two types of vendors. In spite of the acquisitions, however, she points out that this is not a consolidating market, since the number of companies who claim to have something to do with process management is still increasing.

She finished up with some of the standard Gartner material on what it means to be a process-driven organization and some of the organizational and management issues that need to be addressed in order to enable this; this is very similar to what I’ve seen at previous Gartner conferences and in webinars. With the last BPM conference being only 7 months ago, it’s certainly expected that we’ll see some degree of reused material, but based on this first session, it looks like there will be enough new information to keep everyone happy.

Gartner BPM and Event Processing summits

I’m headed off to Orlando tomorrow for the Gartner BPM summit that’s happening during the first half of the week, so watch for my blogging from there under the Gartner BPM category, which also holds my coverage from their February event. They’re also running the Event Processing summit at the same location for the rest of the week; I’ll likely catch a few of the sessions before I leave on Wednesday.

I have interviews set up with many of the BPM vendors while I’m there to get their latest updates, and thought that it would be a good idea to add a disclosure page on this site rather than having to remember to note which of them are my customers each time that I mention them in a post.

A Quick Peek at Cordys BPM

A month ago, I had a chance for a comprehensive demo of the Cordys BPMS via Webex, and I saw them briefly at the Gartner show last week. Their suite is of particular interest to me because the entire process life cycle of modelling, execution and monitoring is completely browser-based. I’ve been pushing browser-based process modelling/design for quite a while, since I think that this is the key to widespread collaboration in process modelling across all stakeholders of a process. I’ve reviewed a couple of browser-based process modellers — a full-featured version from Appian, and a front-end process mapping/sketch tool from Lombardi — and if it wasn’t already clear from what Appian has done, Cordys also proves that you can create a fully-functional process designer that runs in a browser and can have participants outside the corporate firewall. Like Appian, however, they currently only support Internet Explorer (and hence Windows), which will limit the collaboration capabilities at some point.

[Image: Cordys BPMN modeller]

Cordys’ claim is that their modeller is BPMN compliant and supports the entire set of BPMN elements including all of the complex constructs such as transactions and compensation rollback, although I saw a few non-standard visual notations. They also support both XPDL 2.0 and BPEL for import and export, but no word on BPDM. Given this dedication to standards, I find it surprising that they can integrate only with their own ESB and business rules engine, although you could call third-party products via web services. They also have their own content repository (although you can integrate with any repository that allows object access via URL) and their own BAM. In general, I find that when a smaller vendor tries to build everything in a BPM suite themselves, some of the components are going to be lacking; furthermore, many organizations already have corporate standards for some or all of these, and you’d better integrate with the major players or you won’t get in the door.

As with most BPMSs, much of the Cordys process design environment is too complex for the average business user/analyst, and would probably be used by someone on the IT side with input from the business people; a business analyst might draw some of the process models, but as soon as you start clicking on objects and pulling up SOAP syntax, they’re going to be out of there. Like most BPMS vendors, Cordys claims that the process design environment is “targeted towards business people”, but vendors have been making that claim for years now, and the business people have yet to be convinced. To be fair, I was given the demo by the very enthusiastic product architect who knew that I’m technical, so he pulled out every bell and whistle for a ride; business users likely see a very different version of the demo.

There’s a lot of functionality here, although nothing that I haven’t seen in some form in other products. There’s support for human-facing tasks either via a browser-based inbox and search functions, or by forwarding the tasks via SMTP to any email system (such as Outlook). There also appear to be shared worklists, but I didn’t get a sense of how automated work allocation could be performed, something that’s required to support high-volume transaction processing environments. There’s also support for web services orchestration to handle the system integration side of the BPM equation.
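To illustrate what I mean by automated work allocation, here’s a toy sketch (Python; the structure is entirely invented, not Cordys functionality) of routing tasks from a shared worklist based on skills and current load:

```python
# Toy work allocation: route each task from the shared worklist to the
# qualified user with the lightest current load. All data is invented.
shared_worklist = [
    {"id": "W-1", "skill": "claims"},
    {"id": "W-2", "skill": "claims"},
    {"id": "W-3", "skill": "underwriting"},
]
users = {
    "alice": {"skills": {"claims"}, "load": 2},
    "bob":   {"skills": {"claims", "underwriting"}, "load": 1},
}

def allocate(task):
    """Pick the least-loaded user qualified for the task's skill."""
    qualified = [u for u, info in users.items()
                 if task["skill"] in info["skills"]]
    if not qualified:
        return None  # leave the task in the shared worklist
    chosen = min(qualified, key=lambda u: users[u]["load"])
    users[chosen]["load"] += 1
    return chosen

for task in shared_worklist:
    print(task["id"], "->", allocate(task))
```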

One thing that I like is the visual process debugger: although you have to hack a bit of XML to kick things off, you can step through a process, calling web services and popping up user interfaces as you hit the corresponding steps, and stepping over or into subprocesses (very reminiscent of a code debugger, but in a visual form).

They also do a good job with the object repository, which increases the reusability of objects and lets you search for processes and artifacts (such as forms or web services) to see where they’re used. Any process that’s built can also be exposed as a web service: just add inputs and outputs at the start and end points and the WSDL is auto-generated, allowing the process to be called as a service from any other application or service.
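As a sketch of what consuming a process-as-a-service might look like from the caller’s side (Python with the zeep SOAP library; the URL, operation name and parameters are hypothetical, not Cordys’ actual API):

```python
# Hypothetical client-side view of invoking a process exposed as a web
# service. The WSDL URL, operation name and parameters are invented for
# illustration; only the general SOAP mechanics are real.
from zeep import Client

# The BPMS auto-generates this WSDL from the process's start/end points.
client = Client("http://bpms.example.com/processes/OrderApproval?wsdl")

# Calling the operation starts a process instance; inputs defined at the
# process start point become the operation's parameters.
result = client.service.StartOrderApproval(
    customer_id="C-1042",
    order_total=1875.00,
)
print(result)  # outputs defined at the process end point
```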

[Image: Cordys mashup]

<geek>Another thing that I really liked is the AJAX-based framework and modelling layer for UI/forms design, which is an extension of XForms. In addition to a nice graphical UI design environment, you can generate a working user interface directly from the WSDL of a web service — something that I’ve seen in other products such as webMethods, but I still think is cool — and run it immediately in the designer. In the demo that I saw, the architect found an external currency conversion web service, introspected it with the designer, and generated a form representing the web service inputs and outputs that he popped directly onto the page, where he could then run it in debug mode, or rearrange and change the form objects. Any web service in the internal repository — including a process — can be dragged from the repository directly onto the page to auto-generate the UI. Linked data objects on a form communicate directly (when possible) without returning to the server in true AJAX fashion, and you can easily create mashups such as the example that I saw with the external currency converter, a database table, and MSN Messenger. For the hardcore among us, you can also jump directly to the underlying scripting code.</geek>
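The generate-a-form-from-WSDL trick is easier to picture with a rough sketch (Python, standard library only; the WSDL URL is made up). This shows only the introspection half: pulling each operation’s input parts out of the WSDL, which is the information you’d generate form fields from:

```python
# Rough sketch of WSDL introspection: list each operation's input parts,
# which is what a designer needs to auto-generate form fields. The URL
# is hypothetical; the WSDL namespace is the real one.
import urllib.request
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

with urllib.request.urlopen("http://example.com/CurrencyConverter?wsdl") as f:
    tree = ET.parse(f)

# Map message name -> its parts (name and XSD type or element).
messages = {}
for msg in tree.iter(f"{{{WSDL_NS}}}message"):
    parts = [(p.get("name"), p.get("type") or p.get("element"))
             for p in msg.findall(f"{{{WSDL_NS}}}part")]
    messages[msg.get("name")] = parts

# For each operation, the input message's parts become form fields.
for op in tree.iter(f"{{{WSDL_NS}}}operation"):
    inp = op.find(f"{{{WSDL_NS}}}input")
    if inp is None or not inp.get("message"):
        continue  # binding-level operations carry no message attribute
    msg_name = inp.get("message").split(":")[-1]  # strip namespace prefix
    for name, xsd_type in messages.get(msg_name, []):
        print(f"form field: {name} ({xsd_type})")
```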

Unfortunately, the AJAX framework is not available as a separate offering, only as part of the BPMS; I think that Cordys could easily spin this off as a pretty nice browser-based development environment, particularly for mashups.

Gartner Day 3: Jim Sinur scenario-based rules panel

Jim Sinur hosted a case study panel on scenario-based rules with two presenters: David Luce at UTi (a logistics outsourcing firm) and Husayn Alvarez-Gomariz at Micron (a semiconductor manufacturer).

Luce started out talking about UTi, and how as a logistics provider, they are actually a business process outsourcer. They pride themselves on customer intimacy, but that drives up their operational costs since there are so many manual, special-case processes. They were looking for ways to maintain the same level of customer intimacy while automating processes and rules wherever possible in order to increase efficiency and drive down costs, and what they devised was a rules-driven architecture where they use business rules as a policy validation tool. They’ve externalized rules from legacy code into a business rules management system, which provides them with the level of agility that they need to provide customized service to their customers while still automating their processes.
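As a generic illustration of what externalizing rules buys you (a minimal Python sketch, not UTi’s actual system): once a policy lives in a rule table instead of legacy code, changing it is a data edit rather than a code release:

```python
# Minimal sketch of externalized business rules: each rule is data
# (condition + message), evaluated by a generic validator. Adding or
# changing a customer-specific policy means editing the table, not
# redeploying legacy code. Field names are invented for illustration.
RULES = [
    {"name": "max_weight",
     "test": lambda s: s["weight_kg"] <= 1000,
     "message": "shipment exceeds 1000 kg limit"},
    {"name": "hazmat_docs",
     "test": lambda s: not s["hazmat"] or s["has_hazmat_docs"],
     "message": "hazardous shipment missing documentation"},
]

def validate(shipment):
    """Run every rule; return the list of policy violations."""
    return [r["message"] for r in RULES if not r["test"](shipment)]

shipment = {"weight_kg": 1200, "hazmat": True, "has_hazmat_docs": False}
print(validate(shipment))
# ['shipment exceeds 1000 kg limit', 'hazardous shipment missing documentation']
```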

Alvarez-Gomariz discussed scenario analysis, and how to use scenarios to provide the agility to respond to changing market events. His talk was both detailed and abstract, not a good combination for holding my attention, although he had some good points about the intersection between BPM, BI and planning.

Like yesterday’s panel session, this was really more like two separate 30-minute presentations, with no interaction between the panelists. This format should definitely be changed to something more interactive, or be labelled as consecutive short presentations rather than a panel.

Although it’s only lunchtime, this was my last session of the day and of the conference: I’m on a flight back to Toronto in a couple of hours. I didn’t blog about the fun at the vendor hospitality suites, but suffice it to say that it included Michael Beckley in a very tropical hat (he also had a “Made in Mexico” sticker on his forehead at one point, but I couldn’t verify that statement with his parents), Scott the hotel bartender talking about SOA and Six Sigma, and a vendor ending up in my room for the night.

I hope that you enjoyed my coverage of the conference; I’ve had a lot of great feedback from people here, and I’ll soon catch up with the comments that you’ve added to my posts in the last couple of days.

Gartner Day 3: Fair Isaac customer session

For the second half of this morning’s vendor sessions, I sat in on Fair Isaac’s customer presentation: Michele Sprayregen Edelman of Discover Financial Services on Managing Business Rules and Analytics as an Enterprise Asset. As the largest proprietary credit card network in the US, with 50 million cardholders and 4 million merchant and cash access locations, Discover needs to have a good handle not just on what their customers are doing, but on how current market trends will change what their customers want to do in the future.

To them, this means using an advanced decision management environment: start with criteria- and rule-based decisions, then automate processes with business rule management, then increase decision precision with predictive analytics, and finally use those analytics to optimize strategies. They’re only a few steps along this route, but are starting to automate decisions in a more sophisticated manner for things such as individual purchase approval/denial, in order to increase revenue and reduce losses.
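Here’s a toy sketch of that progression (Python; all thresholds and fields are invented, and real credit decisioning is far more involved): a hard business rule gates the transaction first, then a predictive score sharpens the approve/deny decision:

```python
# Toy purchase-approval decision: a hard business rule first, then a
# predictive score for precision. All thresholds and fields are invented;
# production credit decisioning combines many models and rule sets.
def fraud_score(txn):
    """Stand-in for a predictive model: higher means riskier (0..1)."""
    score = 0.0
    if txn["amount"] > 5000:
        score += 0.4
    if txn["country"] != txn["home_country"]:
        score += 0.3
    return min(score, 1.0)

def decide(txn):
    # Step 1: criteria/rule-based decision (the easy, automatable part).
    if txn["amount"] > txn["credit_available"]:
        return "deny", "over limit"
    # Step 2: predictive analytics to increase decision precision.
    if fraud_score(txn) > 0.6:
        return "refer", "high fraud score"
    return "approve", "ok"

txn = {"amount": 7200, "credit_available": 10000,
       "country": "MX", "home_country": "US"}
print(decide(txn))  # ('refer', 'high fraud score')
```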

They wanted a modelling environment that analysts could use without requiring IT support, as well as methods for integrating with the transactional systems to automate decisions. They use other decisioning tools besides Fair Isaac’s, including SAS, and combine the decisions from all of the systems in order to make the ultimate decisions. When you look at what they’ve done, even in the simplified diagrams that Edelman showed us, it’s hugely complex but provides them with a significant competitive advantage: they’re using automated decisioning in a number of different areas across their organization, including portfolio scoring, dispute processing, customer contact strategy and many others.

She presented some final recommendations, the primary one being the importance of the data infrastructure that’s going to drive the decisioning.

Gartner Day 3: Microsoft session

I wanted to stop in on the Microsoft session, People-Ready Processes, in part because I’m a bit confused about what Microsoft is doing in this area, and in part because of the Business Process Alliance announcement from Monday. Microsoft sees itself as a force for commoditizing (and, in the subtext, dumbing down) technology so that it is accessible to a much wider audience, and this presentation was Burley Kawasaki’s take on how they’re doing that for BPM. He describes people-ready processes as a fusion of document-centric processes and system-centric processes, and I really wish that he (and many other people in the industry) would stop equating human-centric with document-centric. Although human-facing BPM grew out of the workflow that started in document imaging systems, that was a long time ago, and there are many instances of human-facing BPM that don’t include documents — depending, of course, on how you define a document.

My previous view of Microsoft BizTalk was as a B2B message broker or an internal ESB. My view of SharePoint was as a collaboration and document management platform. I wanted to see how Microsoft was bringing together the technologies and concepts from both of these to create a seamless BPM solution.

Kawasaki showed a spectrum of BPM application types, from collaborative to transactional processes: individual ad hoc processes (e.g., individual task lists), human semi-structured (e.g., vacation approval), system highly structured (e.g., expense reporting) and fixed processes (e.g., supply chain). He then overlaid a split between a collaboration server and a process server, with some overlap in the middle of the spectrum, and labelled these as SharePoint and BizTalk. My heart sank.

Okay, you can have a SharePoint collaboration or document kick off a BizTalk process, but that’s not the same as having a single end-to-end BPM solution. In the future, the Windows Workflow Foundation will be used as the underlying process infrastructure for both SharePoint and BizTalk, which might help to integrate them more closely.

He finished up with a light-speed overview of the Microsoft process platform roadmap, which includes Windows Workflow Foundation, the .Net framework, Office (including SharePoint) and BizTalk. He also made a big push for the benefits of a platform and partner ecosystem rather than a single vendor “closed and proprietary” BPM stack. Not sure that I’m convinced.

Gartner Day 3: Yvonne Genovese keynote

We started the last day at the Gartner summit with a keynote by Yvonne Genovese, Business Applications Through 2010: Major Changes Will Affect Your Process Environment. Early in her presentation, she made an important statement: “the technology keeps breaking our processes”. Her focus is on business applications, not specifically BPM, but she’s looking at trends of what’s happening with enterprise applications like ERP and CRM systems. Her point is that these business applications have, in the past, forced businesses to use rigid business processes implemented within those systems.

However, the current trend is towards unbundling some of this functionality, exposing it through services, then consuming those services using a BPMS. This allows you not only to call specific functionality from your business applications at any point in a process that you now control, but also to replace or augment the functionality of those applications by calling other services. It also provides an opportunity to integrate more easily between business applications if you have multiple ones in your environment. Although the business application vendors have been pushing suites for some time now, that packaging model will become less compelling as organizations start to slice and dice the atomic functionality of the business applications and compose their own processes using BPM rather than use the suite in its monolithic form.
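A minimal sketch of the composition idea (Python; every endpoint and function here is hypothetical): the process owns the flow and calls whichever service, packaged application or otherwise, does each job best:

```python
# Hypothetical composition: the process owns the flow; each step calls
# whichever service does the job best, whether it comes from the ERP
# vendor or elsewhere. All functions and data are invented stand-ins.
def erp_create_order(customer, items):        # unbundled ERP service
    return {"order_id": "SO-93121", "status": "created"}

def external_credit_check(customer):          # third-party service
    return {"approved": True}

def crm_log_activity(customer, note):         # CRM service
    pass

def order_process(customer, items):
    """The composed process: flow logic lives here, not in any one suite."""
    credit = external_credit_check(customer)
    if not credit["approved"]:
        crm_log_activity(customer, "order refused: credit check failed")
        return None
    order = erp_create_order(customer, items)
    crm_log_activity(customer, f"order {order['order_id']} created")
    return order

print(order_process("ACME Corp", ["widget-a", "widget-b"]))
```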

Business applications aren’t going away: there’s still a huge amount of good functionality available in them, and as long as that commoditized functionality can be consumed as services, you’re not going to be writing a replacement yourself. What I think will happen, however, is that the amount of the functionality used from any given business application platform will begin to erode as other internal or external services replace some of that functionality. This frees organizations from the vendor lock-in that they’re subjected to now, and adds a new possibility for creating business applications: instead of just “buy” or “build”, you can now also “compose”. And if the megavendors in this field are going to stay competitive, they need to embrace and encourage an ecosystem that allows smaller vendors to provide services that can easily be integrated with their larger platform. This isn’t going to be the old model of the vendor controlling the ecosystem by anointing their favourite technology partners, however: the customer organizations are going to build their own ecosystem from their preferred vendors in a truly best-of-breed fashion.

At the end of the day, BPM is an essential part of all this, since it will be used as a composition framework for combining functionality from business applications, along with internal and external services, into the processes that the business really needs.

Gartner Day 2: Jim Sinur (again)

I finished up the day by attending Jim Sinur’s session on continuous optimization. And thanks to Gartner, we have a brand new acronym: BOSS, for business optimization support systems.

He has an interesting take on optimization that I agree with: it’s an antidote to entropy. The laws of entropy say that systems tend to become more chaotic over time, so you have to have something in place that will actively head off that slide into chaos. Continuous improvement is not, however, a replacement for disruptive or radical change within an organization: the former provides refinements along the way to a goal, while the latter causes changes in direction to a new goal.

He defined continuous optimization as “keeping a process efficient, effective and relevant under all possible and changing conditions,” and laid out a list of benefits of continuous process optimization, not the least of which is creating a balance amongst competing goals: sacrificing a local optimization in favour of an overall optimization.
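That local-versus-overall tradeoff is easy to show with toy numbers (Python; the figures are invented): locally optimizing one department’s throughput can hurt the end-to-end process:

```python
# Toy illustration of local vs. overall optimization (invented numbers):
# speeding up one department's step raises its own throughput but
# floods the downstream step, hurting the end-to-end process.
def end_to_end_rate(dept_a_rate, dept_b_rate, queue_penalty=0.1):
    """Overall completion rate; work piling up between steps costs us."""
    backlog = max(0, dept_a_rate - dept_b_rate)
    return min(dept_a_rate, dept_b_rate) - queue_penalty * backlog

print(end_to_end_rate(50, 40))  # dept A locally optimized: 39.0
print(end_to_end_rate(40, 40))  # balanced: 40.0, better overall
```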

There was some repeated material from Bill Gassman’s BI/BAM presentation earlier today, but Sinur went into a number of other areas, such as understanding both the drivers for process optimization and the inhibitors to its adoption. It’s essential to link processes to desired outcomes so that the goals of optimization are well understood, and you also have to anticipate the shift to indeterminate/ad hoc/collaborative processes that don’t have pre-determined process maps, but are usually triggered by events and are goal-driven.

He looked at how to discover opportunities for optimization, and how to select the proper optimization capability from a set of optimization tools and techniques. He also made some good points about matching your optimization method to your risk profile, which I’ve heard in earlier presentations this week: if you’re very risk-averse, for example, you’re unlikely to have self-optimizing systems that change their own behaviour based on patterns of events in the system.

This is a growth area, and one that can provide some competitive advantage: only the leader organizations are using this technology now, and it has the potential to make a huge impact on a company’s agility.

Gartner Day 2: BEA sessions

I really wanted to attend Daryl Plummer’s analyst/user roundtable on BPM and Web 2.0, but they don’t let press into those sessions, so I ducked in to hear Jesper Joergensen of BEA talk about Best Practices in Business Transformation. Jesper, I know that you’re reading this — no offence intended on being my second choice 🙂  I stayed through both half-hour sessions this time, seeing Jesper talk first, then BEA’s customer, Christophe Marcel of Integro Insurance Brokers, with Building the Business Case for BPM.

Joergensen started with a cooking theme for this “BPM secret sauce” talk: start with sharp knives, make big meals of small dishes, measure to taste and adjust as required, have a recipe, and follow the recipe. In BPM, this translates to: start with common tools, build a platform out of small projects, use simulation and measurement, have established best practices, and follow those best practices. Cute theme, and some nice cooking utensil graphics, although I have to admit that I rarely follow a recipe in the kitchen, even if I bother to have one.

He talked about the importance of modelling tools for business users, with a shared process model used by the IT side for implementation, to avoid the inevitably incomplete round-tripping that happens when you model in one tool and implement in another. He also discussed how to identify suitable first targets for BPM implementation — low complexity, high impact, and low maturity level — while planning for scale in both the tool selection and the methodology, since one successful project will breed demand. He briefly discussed process simulation and measurement/monitoring, and the importance of a process centre of excellence.

After a brief break, Christophe Marcel talked about their experiences with BPM. Their focus was on integration, tying together a number of existing systems with a minimum of new development. They made use of both human-facing tasks and web services calls to update data in the underlying enterprise systems, and built their own web-based user interface. In addition to the enterprise data systems, they integrated Microsoft SharePoint as their document management system.

One of the major challenges, which I’ve seen many times before when integrating BPM with enterprise systems, is data synchronization. When data is replicated into the BPMS for display or control purposes, any changes to the data in either the BPMS or the underlying enterprise system need to be considered for replication to the other system. Similarly, if an entire insurance program is sold, all tasks in the BPMS may need to be updated or deleted to reflect that change.
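A bare-bones sketch of the pattern (Python; the field names are invented, and real synchronization also has to handle conflicts, ordering and failures): changes on either side get propagated, and a sold program closes its outstanding tasks:

```python
# Bare-bones data synchronization between a BPMS and an enterprise
# system. Field names are invented; real implementations must also
# handle conflicts, ordering, retries and partial failures.
bpms_tasks = {
    "T-1": {"program": "P-77", "insured_value": 100_000, "open": True},
    "T-2": {"program": "P-77", "insured_value": 250_000, "open": True},
    "T-3": {"program": "P-80", "insured_value": 50_000, "open": True},
}

def on_enterprise_update(program_id, field, value):
    """Enterprise system changed replicated data: push it to the BPMS copies."""
    for task in bpms_tasks.values():
        if task["program"] == program_id and task["open"]:
            task[field] = value

def on_program_sold(program_id):
    """The whole program was sold: close every outstanding BPMS task."""
    for task in bpms_tasks.values():
        if task["program"] == program_id:
            task["open"] = False

on_enterprise_update("P-77", "insured_value", 275_000)
on_program_sold("P-77")
print(bpms_tasks)
```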

Marcel had some best practices to share: do a proof of concept; hire an experienced consultant; keep in mind that data synchronization is probably a lot more complex than you think it is; use your best business analysts on the workflow rather than the UI; and remember that users want all of their tasks in a single system, whether that’s the BPMS or their email.