TUCON: Tom Laffey and Matt Quinn

Last in the morning’s general session were Tom Laffey, TIBCO’s EVP of products and technologies, and Matt Quinn, VP of product management and strategy. Like Ranadivé’s talk earlier, theirs was about enterprise virtualization: positioning messaging, for example, as virtualizing the network layer, and BPM as enterprise process virtualization. I’m not completely clear whether virtualization is just the current analyst-created buzzword in this context.

Laffey and Quinn tag-teamed quite a bit during the talk, so I won’t attribute specific comments to either. TIBCO products cover a much broader spectrum than I do, so I’ll focus just on the comments about BPM and SOA.

TIBCO’s been doing messaging and ESB for a long time, and some amount of the SOA talk is about incremental feature improvements such as easier use of adapters. Apparently, Quinn made a prediction some months ago that SOA would grow so fast that it would swallow up BPM, so that BPM would just be a subset of SOA. Now, he believes (and most of us from the BPM side agree 🙂 ) that BPM and SOA are separate but extremely synergistic practices/technologies, and both need to be developed to a position of strength. To quote Ismael Ghalimi, BPM is SOA’s killer application, and SOA is BPM’s enabling infrastructure, a phrase that I’ve included in my presentation later today; like Ismael, I see BPM as a key consumer of what’s produced via SOA, but they’re not the same thing.

They touched on the new release of Business Studio, with its support for BPMN, XPDL and BPEL, as well as UML for some types of data modelling. There are some new intelligent workforce management features, and some advanced user interface creation functionality using intelligent forms, which I think ties in with their General Interface AJAX toolkit.

Laffey just defined “mashup” as a browser-based event bus, which is an interesting viewpoint, and likely one that resonates better with this audience than the trendier descriptions.
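
If that sounds abstract, the mechanics are straightforward: under this view, the widgets in a mashup don’t call each other directly, they publish and subscribe to named topics on a shared in-page bus, much like applications on an enterprise message bus. Here’s a minimal sketch of the idea in TypeScript; the names are my own illustration, not anything TIBCO showed:

```typescript
// A minimal in-browser event bus: mashup widgets publish and subscribe
// to named topics instead of calling each other directly.
type Handler = (payload: unknown) => void;

class EventBus {
  private topics = new Map<string, Handler[]>();

  subscribe(topic: string, handler: Handler): void {
    const handlers = this.topics.get(topic) ?? [];
    handlers.push(handler);
    this.topics.set(topic, handlers);
  }

  publish(topic: string, payload: unknown): void {
    for (const handler of this.topics.get(topic) ?? []) {
      handler(payload);
    }
  }
}

// Two hypothetical widgets, coupled only through the bus.
const bus = new EventBus();
bus.subscribe("customer.selected", (payload) =>
  console.log("order history widget refreshing for", payload)
);
bus.publish("customer.selected", { customerId: "C-1234" });
```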

They discussed other functionality, including business rules management, dynamic virtual information spaces (the ability to tap into a real-time event message stream and extract just what you want), and the analytics that will be added with the acquisition of Spotfire. By the way, we now appear to be calling analytics “business insight”, which lets us keep the old BI acronym without the stigma of the business intelligence latency legacy. 🙂

They finished up with a 2-year roadmap of product releases, which I won’t reproduce here because I’d hate to have to embarrass them later, and some discussion of changes to their engineering and product development processes.

BrainStorm BPM Day 1: Neal McWhorter

I switched streams to the business rules symposium for the last breakout session of the day, The End of Requirements, because the description sounded too good to miss:

Business wants control of the business back. For years we’ve lived with a process where the business creates “requirements” and IT creates a business solution. While business processes are the lifeblood of an organization, rules are where the volume of business change is. If the business is going to take back control of its own fate, it all starts with making sure that the business rules they own are really under their control after they go into production. The current requirements process simply can’t handle that. It’s time to embrace a Business Engineering-based approach and move beyond the requirement-centric approach that we’ve loved to hate.

He makes the distinction between knowledge and behaviour: rules are about (reusable) knowledge, and processes are about behaviour. For example, you might have a rule that determines whether a customer is a “good” customer, and you might use that knowledge in a process to provide a discount (sketched in code after the list below). He discussed different views of rules and how they integrate with processes:

  • Rules as structural knowledge about business entities
  • Rules as business judgements
  • Rules that trigger events
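
To make the knowledge/behaviour split concrete, here’s the “good customer” example above as code; the customer attributes and thresholds are invented for illustration:

```typescript
// Knowledge: a reusable rule that classifies a customer. It says nothing
// about what to do with the answer.
interface Customer {
  yearsActive: number;
  totalSpend: number;
  openDisputes: number;
}

// Hypothetical thresholds, purely for illustration.
function isGoodCustomer(c: Customer): boolean {
  return c.yearsActive >= 2 && c.totalSpend > 10_000 && c.openDisputes === 0;
}

// Behaviour: a process step that consumes the knowledge to decide on a
// discount. The same rule could be reused by an escalation process, a
// marketing process, and so on.
function priceOrder(c: Customer, listPrice: number): number {
  return isGoodCustomer(c) ? listPrice * 0.9 : listPrice;
}

console.log(priceOrder({ yearsActive: 5, totalSpend: 25_000, openDisputes: 0 }, 100)); // 90
```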

Unfortunately, McWhorter didn’t ultimately talk about how the business is going to get control of the business rules, or how it’s going to work once they do, which is what I was expecting from the session description. He did finish up with a call to arms to bring a design function back into the business side — sort of like what business analysts are supposed to do, but don’t because they don’t have any design training — which would allow proper designs to be created before things are ever passed on to IT.

BrainStorm BPM Day 1: Bruce Silver track keynote

There’s an awful lot of keynotes in this conference: a couple of overall sessions this morning, now “track keynotes” for each of the four tracks within the BPM conference. I’m in Bruce Silver’s New Directions in BPM Tools and Technology session, where he started by taking a gentle poke at Gartner, saying that BPM is more than a management discipline (Gartner’s most recent definition of BPM).

He started out discussing process modelling, and how it’s inherently a business activity, not an IT activity, which speaks directly to the issue of the tools used for modelling: is there a handoff from a modelling-only tool to an execution environment at the point of business-to-IT handoff, or is the model actually just a business view of the actual implementation? With all of the vendor demos that I’ve done lately (I know, I have yet to document many of them here, but I’m getting to it), I’ve focussed quite a bit on the distinction between having a model shared between business and IT, and having a separate BPA tool that models much more than just the processes that will be implemented in a BPMS. Bruce positions this as “BPA versus BPMN” views of process modelling, and doesn’t see them in conflict; in fact, he thinks that they’re ignoring each other, a viewpoint that I’d have to agree with, given that BPA initiatives rarely result in any processes being transferred to some sort of execution engine.

Bruce, who often accuses me of being too nice, takes a stab at the vendors in a couple of areas. First is with their BPMN implementations, specifically that of events: he states that many of the execution engines just don’t support intermediate events, so the vendors conveniently forget to include those events in their BPMN modelling tools. Second is with simulation, and whether a vendor’s implementation is actually a useful tool, or a “fake” feature that’s there to enable a checkbox on an RFP, but not functional enough to be worth using.

He has a nice way of categorizing BPMS products: by vendor speciality (e.g., integration, human-centric), by process type/use case (e.g., production workflow) and by business/IT interaction method (collaborative shared model versus handoff). This was interesting, because I wrote almost identical words two days ago in my presentation for the Shared Insights Portals and Collaboration conference that I’ll be speaking at next month; great minds must think alike. 🙂  His point, like the one that I was making in my presentation, is that most BPM products have strengths and weaknesses that can make or break a particular process automation project; for example, a product focussed on human-centric workflow probably doesn’t do some nice integration tricks like mapping and transformation, or complex data objects.

He also makes a good distinction between business rules (implemented in a BRE) and routing rules (implemented in a BPMS): business rules represent corporate or departmental policies that may need to be shared across business processes, whereas routing rules are the internal logic within a process, required just to get through the process, and don’t represent policy in any way.
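
To make the distinction concrete, here’s a sketch with invented policy and process logic (my example, not Bruce’s): the first function is a business rule, a statement of policy that several processes might share and that belongs in a BRE; the second is a routing rule, pure process plumbing that belongs in the BPMS.

```typescript
// Business rule: corporate policy, owned outside any one process and shared
// across processes; in practice this would live in a BRE. The policy itself
// is hypothetical.
function requiresComplianceReview(amount: number, country: string): boolean {
  return amount > 50_000 || country === "high-risk";
}

// Routing rule: internal logic of one process, answering only "which step
// comes next", with no policy content; it belongs in the process definition.
function nextStep(task: { complete: boolean; assignee?: string }): string {
  if (!task.complete) return "rework";
  return task.assignee ? "notify-assignee" : "archive";
}
```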

Bruce thinks that BPM and SOA together is still vapour-ware for the most part: it’s what the vendors are selling but not typically what they’re delivering. In particular, he thinks that if the BPMS and the ESB are not from the same vendor, then “all bets are off” in terms of whether a BPMS will work with any particular ESB or other services environment.

The session turned out to be too short, and Bruce couldn’t even finish his materials, much less take questions: it was only 45 minutes to begin with, and shortened at the beginning while Bruce waited for stragglers from the previous session to make their way upstairs.

Gartner Day 3: Jim Sinur scenario-based rules panel

Jim Sinur hosted a case study panel on scenario-based rules with two presenters: David Luce at UTi (a logistics outsourcing firm) and Husayn Alvarez-Gomariz at Micron (a semiconductor manufacturer).

Luce started out talking about UTi, and how as a logistics provider, they are actually a business process outsourcer. They pride themselves on customer intimacy, but that drives up their operational costs since there are so many manual, special-case processes. They were looking for ways to maintain the same level of customer intimacy while automating processes and rules wherever possible in order to increase efficiency and drive down costs, and what they devised was a rules-driven architecture where they use business rules as a policy validation tool. They’ve externalized rules from legacy code into a business rules management system, which provides them with the level of agility that they need to provide customized service to their customers while still automating their processes.

Alvarez-Gomariz discussed scenario analysis, and how to use scenarios to provide the agility to respond to changing market events. His talk was both detailed and abstract, not a good combination for holding my attention, although he had some good points about the intersection between BPM, BI and planning.

Like yesterday’s panel session, this was really more like two separate 30-minute presentations, with no interaction between the panelists. This format should definitely be changed to something more interactive, or be labelled as consecutive short presentations rather than a panel.

Although it’s only lunchtime, this was my last session of the day and of the conference: I’m on a flight back to Toronto in a couple of hours. I didn’t blog about the fun at the vendor hospitality suites, but suffice it to say that it included Michael Beckley in a very tropical hat (he also had a “Made in Mexico” sticker on his forehead at one point, but I couldn’t verify that statement with his parents), Scott the hotel bartender talking about SOA and Six Sigma, and a vendor ending up in my room for the night.

I hope that you enjoyed my coverage of the conference; I’ve had a lot of great feedback from people here, and I’ll soon catch up with the comments that you’ve added to my posts in the last couple of days.

Gartner Day 3: Fair Isaac customer session

For the second half of this morning’s vendor sessions, I sat in on Fair Isaac’s customer presentation, Michele Sprayregen Edelman of Discover Financial Services on Managing Business Rules and Analytics as an Enterprise Asset. As the largest proprietary credit card network in the US with 50 million cardholders and 4 million merchant and cash access locations, they need to have a good handle not just on what their customers are doing, but on how current market trends will change what their customers want to do in the future.

To them, this means using an advanced decision management environment: start with criteria- and rule-based decisions, then automate processes with business rule management, then increase decision precision with predictive analytics, and finally optimize strategies with predictive analytics. They’re only part of the way along this route, but are starting to automate decisions in a more sophisticated manner for things such as individual purchase approval/denial, in order to increase revenue and reduce losses.

They wanted a modelling environment that analysts could use without requiring IT support, as well as methods for integrating with the transactional systems to automate decisions. They use other decisioning tools besides Fair Isaac’s, including SAS, and combine the decisions from all of the systems in order to make the ultimate decision. When you look at what they’ve done, even in the simplified diagrams that Edelman showed us, it’s hugely complex but provides them with a significant competitive advantage: they’re using automated decisioning in a number of different areas across their organization, including portfolio scoring, dispute processing, customer contact strategy and many others.
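
Edelman didn’t show the mechanics of combining decisions, but the general shape of blending a rules-based result with a predictive score looks something like this sketch; all of the names, scores and thresholds are my assumptions, not Discover’s actual logic:

```typescript
// Hypothetical blend of a rules-based decision and a predictive model score
// for individual purchase approval.
interface Purchase {
  amount: number;
  overLimit: boolean;  // from a rules-based system
  fraudScore: number;  // 0..1, from a predictive model (e.g., a SAS scorecard)
}

type Decision = "approve" | "refer" | "decline";

function decide(p: Purchase): Decision {
  if (p.overLimit) return "decline";         // hard policy rule wins outright
  if (p.fraudScore > 0.9) return "decline";  // model is near-certain of fraud
  if (p.fraudScore > 0.6) return "refer";    // ambiguous: route to an analyst
  return "approve";
}

console.log(decide({ amount: 150, overLimit: false, fraudScore: 0.2 })); // "approve"
```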

She presented some final recommendations, the primary one being the importance of the data infrastructure that’s going to drive the decisioning.

Gartner Day 1: Jim Sinur

Jim Sinur took the stage for Are Rules Inside-Out in BPM?, where he claimed that he’d push the envelope in how we think about rules. He began with the point that rules alone are just a start: agility requires a full business rule management strategy so that you can manage the rules that you’ve externalized, especially if you have multiple business rule engines. To be fair, many organizations haven’t even externalized the rules from their enterprise applications and business processes yet, but if they ever do, they’d better be ready to manage them or they’ll have a big mess of rules to deal with.

Today’s business rules landscape is pretty confusing, covering everything from neural nets and expert systems to business rule engines and business rule management systems. If business rules are too rigid (unchangeable), that impacts the agility of the business processes and the entire organization; if IT has to spend a huge amount of time and money to change rules, then you can be sure that it’s not going to happen very often. However, IT is often unwilling to put control of the business rules into the hands of the business; there needs to be proper governance over rule changes, but not so much control that it’s impossible to keep up with shifting business requirements. In many cases, the business has no idea how difficult it is to change any given rule, and some standardization of this — via rule externalization and management — would also improve service levels between business and IT.

The key is to understand where rules affect processes, and see where the ability to change rules for in-flight processes can greatly improve agility. Sinur went through the business benefits of rules, and some of the risks of fixed rules: primarily, business rule management is an enabler for governance. He also walked through different models for adopting rules: the safe and steady control model (slightly smarter process), the cautiously dynamic model (process with above average intelligence), and the aggressively predictive model (Mensa process). Obviously, the model has to suit the organization’s risk tolerance as well as the underlying process, since these range from just automating some well-understood decisions to suggesting and implementing new rules based on behaviour within the process.
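
The reason externalized rules matter so much for in-flight processes is mechanical: if a process resolves a rule by reference at the moment a decision point is reached, rather than having the logic compiled in at deployment, a rule change takes effect immediately for every running instance. A sketch of the difference, with invented names and thresholds:

```typescript
// Baked-in logic: every in-flight instance keeps the old behaviour until the
// process is redeployed.
function approvalThresholdHardCoded(): number {
  return 10_000;
}

// Externalized rule: the process holds only a reference ("approval.threshold")
// and resolves it when the decision point is reached, so in-flight instances
// pick up rule changes immediately. The map stands in for a BRMS repository.
const ruleRepository = new Map<string, number>([["approval.threshold", 10_000]]);

function autoApprove(amount: number): boolean {
  const threshold = ruleRepository.get("approval.threshold") ?? 0;
  return amount <= threshold;
}

// The business changes the rule; no process redeployment required.
ruleRepository.set("approval.threshold", 25_000);
console.log(autoApprove(20_000)); // true under the new rule
```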

He has some great recommendations for getting your rules under control, including the observation that 15% of the rules are the ones that the business really needs to change to remain agile: pick the right ones to externalize, and understand both the business benefits and the risks associated with changing each rule.

Watch for Gartner’s definition of what should be in a BRMS later in 2007, since this is becoming somewhat of a commodity.

Corticon Business Rules Foundation

Vendor announcements today seem to have a sci-fi theme: first it was IDS Scheer and Microsoft with the Alliance, and now it’s Corticon with the Foundation: the Business Rules Foundation, that is. I had a sneak preview last week with David Straus, Corticon’s SVP of Marketing, who did the most amazing thing for a guy with the word “marketing” in the title (much less an SVP): about a minute into our phone call, he said “wait, I can show you”, fired up Webex, sent me an invitation, and 2 minutes later he was giving me a product demo.

Corticon’s Studio product is a pretty capable business rules management system, and one of the 3 or 4 standard systems that end up integrated into everything else, such as BPM suites that lack their own rules engines. You can define rules in natural language, then use a decision table to map those rules onto conditions and actions, check for ambiguities, and generate unconsidered cases. You can save a set of rules as a compiled service to run within their engine or to be called as a web service. You can even import (or create) test data to run against the rules, with the tests showing which rules were triggered in order to explain the decision. So far, so good: standard functionality with some nice features.
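
For anyone who hasn’t used a decision-table tool, the ambiguity and completeness checks are the clever part: the tool looks for rows whose conditions can fire on the same input (ambiguities) and for inputs that no row covers (unconsidered cases). Here’s a toy version of the ambiguity check, a deliberate simplification of mine rather than Corticon’s actual algorithm:

```typescript
// A toy decision table over one numeric input; each row maps a condition
// range [min, max) to an action.
interface Row { label: string; min: number; max: number; action: string }

const table: Row[] = [
  { label: "small",  min: 0,   max: 1_000,  action: "auto-approve" },
  { label: "medium", min: 900, max: 10_000, action: "manager-approve" }, // overlaps "small"
  // nothing covers amounts of 10_000 and up: an unconsidered case
];

// Ambiguity check: any two rows whose ranges overlap could both fire.
function findAmbiguities(rows: Row[]): string[] {
  const clashes: string[] = [];
  for (let i = 0; i < rows.length; i++) {
    for (let j = i + 1; j < rows.length; j++) {
      if (rows[i].min < rows[j].max && rows[j].min < rows[i].max) {
        clashes.push(`"${rows[i].label}" overlaps "${rows[j].label}"`);
      }
    }
  }
  return clashes;
}

console.log(findAmbiguities(table)); // [ '"small" overlaps "medium"' ]
```

A real tool does this over many condition columns at once, and also enumerates the uncovered combinations and proposes rows for them, which is where the “generate unconsidered cases” feature comes in.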

What they’ve announced today, however, is a version that separates the underlying services from the user interface, and allows those services to be embedded and tightly bound inside another application. The SDK opens up the ability for any application vendor to embed Corticon’s rules capabilities within their product without having to use any of Corticon’s user interface: they can create their own user interface paradigm for rules definition and integrate it with other parts of their application, so that the user is unaware that they’re using software from two different vendors. The first big example of this is IDS Scheer’s ARIS, which embeds the Corticon Foundation (essentially) inside its ARIS Business Rules Designer: I saw this demonstrated at the IDS Scheer user conference a few weeks ago, but didn’t realize what I was seeing (although I knew that it was Corticon within ARIS).

Although the decision tables on the right look very much like the standard Corticon product, it’s completely and seamlessly housed within the framework of ARIS: that’s the ARIS repository tree view on the left. Since all of the new Corticon Studio is Eclipse-based, and most of the partner companies are using some sort of Eclipse tooling for their UI, this is a relatively painless integration.

There are some other interesting applications for this that Straus mentioned, such as Adobe integration for dynamic document creation (e.g., for contract creation with rules-based selection of clauses), and Microsoft Word integration. With it opening up for development as of today’s announcement, I’m sure that there will soon be a number of other application vendors trying it out. I’m waiting for the BPM vendors to start embedding this within their process designers instead of the paltry expression builders that most of them have: this seamless integration of business rules with business processes would eliminate the current barriers to using business rules in a BPM environment.
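
Of those, the rules-based clause selection is a nice illustration of rules as reusable knowledge living outside any process: each clause carries a predicate over the contract data, and document assembly just selects the clauses whose rules fire. The names and conditions here are invented, not from Straus’s demo:

```typescript
// Hypothetical contract data and clause library; each clause carries a rule
// deciding whether it belongs in the generated document.
interface Contract { jurisdiction: string; value: number; hasWarranty: boolean }

const clauseLibrary: { id: string; applies: (c: Contract) => boolean }[] = [
  { id: "governing-law-ny", applies: (c) => c.jurisdiction === "NY" },
  { id: "high-value-audit", applies: (c) => c.value > 1_000_000 },
  { id: "warranty-terms",   applies: (c) => c.hasWarranty },
];

// Document assembly selects exactly the clauses whose rules fire.
function selectClauses(c: Contract): string[] {
  return clauseLibrary.filter((cl) => cl.applies(c)).map((cl) => cl.id);
}

console.log(selectClauses({ jurisdiction: "NY", value: 2_000_000, hasWarranty: false }));
// ["governing-law-ny", "high-value-audit"]
```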

What Corticon is after, of course, is the Holy Grail of business rules: a common rules repository within an organization that is invoked by any enterprise application requiring a decision (think cross-organization compliance). By making it easier to integrate rules directly into any application, they may be that much closer.

ProcessWorld Day 2: Services industry breakout with Marshal Edgison of ELM Resources

For the first breakout of day 2, I attended “Optimizing Process Through Business Rules” with Marshal Edgison, Director of Application Development for ELM Resources, a not-for-profit organization focussed on facilitating and processing student loans. He described how they’re leveraging both the process modelling and business rules modelling functionality of ARIS to drive their modernization efforts. The rules engine integrated into ARIS as the ARIS Business Rules Designer is Corticon.

They selected ARIS because they wanted a modelling tool that was not closely associated with the technology (i.e., not from the process execution vendors) and could be used by business analysts. As a loan processing organization, their processes are very rules-based, and they found that their business rules were everywhere — in application code, in database triggers, in user interfaces — and were hard-coded into the systems: the classic situation where business rules can be of enormous benefit. They saw an opportunity not just to model their business processes in order to get them under control, but also to model their business rules and encapsulate them in a business rules management system.

They recognized that a BRMS could add agility to processes by automating recurring decisions, centralizing rules for easy management and consistent deployment, managing complex logic (they had over 1 million interdependent rules, although these fall into about 5 basic categories), increasing development speed, and reducing maintenance time and costs. With the ARIS Business Rules Designer, the rules could be seamlessly integrated into processes as automated decision points: ARIS defines the enterprise data model and vocabulary, and the BRMS leverages that vocabulary in transaction processing.
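
The shared-vocabulary point is worth making concrete: because the process model and the rule service are built on the same enterprise data model, a decision point in a process becomes a typed call into the externalized rules rather than logic buried in process code. A sketch with an invented data model, not ELM’s or ARIS’s actual vocabulary:

```typescript
// Shared vocabulary: one enterprise data model used by both the process
// model and the rule service (types, names and logic invented for illustration).
interface LoanApplication {
  principal: number;
  schoolCode: string;
  hasCosigner: boolean;
}

// The decision point delegates to the externalized eligibility rule instead
// of embedding the logic in the process itself.
function eligibilityDecision(app: LoanApplication): "eligible" | "ineligible" {
  const ok = app.principal <= 40_000 && (app.hasCosigner || app.principal <= 10_000);
  return ok ? "eligible" : "ineligible";
}

// The process simply branches on the decision outcome.
const route = eligibilityDecision({ principal: 12_000, schoolCode: "X1", hasCosigner: true });
console.log(route); // "eligible"
```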

Edgison went through a case study of a new federally-mandated graduate loan program that came into effect in February 2006, with all participants required to support it by July 2006. Many of the financial institutions who are ELM’s member organizations were unable to comply within that timeframe, and it took ELM six months and more than $500K to implement it. As part of the sales cycle for the ARIS Business Rules Designer, they redid this using ARIS and the BRMS: it took one day with four people.

He finished up with some notes on determining whether business rules are right for you:

  • Do you have decision-intensive processes?
  • Do you have operational inefficiencies around decisions?
  • Do you have dynamic, frequently changing rules?
  • Do you need better synergy between business requirements and IT implementation?

Although a bit wordy, and totally unable to control one audience member who asked about 12 questions, Edgison was a great speaker: very knowledgeable about both his projects and the importance of business rules in modelling processes, with the ability to communicate his ideas clearly and in a compelling manner.

A Short History of BPM, Part 8

Continued from Part 7.

Part 8 (the last): The Current State of BPM. Every analyst, vendor and customer defines BPM differently, because the current definition of BPM is very broad, and there are many vendors jostling for position within it. EAI/ESB-type vendors call their products BPM, but the products may contain only rudimentary human-facing functionality. Workflow-type vendors, also labelling themselves as BPM, lack the necessary infrastructure for integration, and often handle automated steps poorly. Some pure integration products call themselves workflow, just to confuse things further. There are a lot of complementary products, such as process analytics and simulation, and business rules engines: BPM vendors will either tell you that a particular capability must be part of the base BPM product (if their product has it), or should never be part of the base BPM product (if their product doesn’t have it). And now there’s the whole SOA wild card thrown into the mix.

BPM is definitely a case where the whole is greater than the sum of the parts. It’s not just workflow plus EAI plus B2Bi plus business rules, plus plus plus: it’s the near-seamless integration of all of these tools into a single suite that provides an organization with the ability to do things that they could never do before. That doesn’t mean that all the tools have to be from the same vendor, but it’s essential to deliver all of the BPM functionality in a single environment of closed-loop process improvement.

Smith and Fingar’s book Business Process Management: The Third Wave describes this “third wave” as providing the ability to create a single definition of a business process from which different views of that process can be rendered and new information systems can be built. This allows different people with different skills — business manager, business analyst, regular old user, programmer — to view and manipulate the same process in a representation suitable for them, all derived from the same source. They make a great analogy with HTML, where a business user may use a high-level tool like FrontPage to view and edit HTML, whereas a developer may edit the HTML code directly, but they’re still working from the same source. Round-tripping between a business analyst’s modelling tool and a developer’s runtime environment is one way to do this, although it violates “same source” in the purest sense, but we definitely have to get rid of the strictly one-way paths from business analysis to implementation that exist now in many organizations.

Furthermore, Smith and Fingar point out that in the world of BPM, the ability to change is far more prized than the ability to create in the first place, and that BPM has the potential to actually remove application development from the cycle — the “zero code” Holy Grail that gets a lot of press these days. They make an analogy with VisiCalc, which took customized data analysis out of the hands of the IT department and put it in the hands of the business users, thereby taking software development off the critical path for achieving results.

Getting back to the point of this post, what is the current state of BPM?

First of all, we have several companies from the pure-play BPM/BPM suites market: they provide excellent human-facing BPM and at least adequate integration capabilities, with some providing outstanding integration. At the Gartner BPM summit earlier this year, Gartner listed three “major players” in this category with revenues upwards of $100M — FileNet, Pegasystems and Global 360 — and five “up and comers” with revenues above $30M — Appian, Lombardi, Savvion, Metastorm and Ultimus — while ignoring anything smaller than that. All eight of these vendors land in the right zones in the Gartner and Forrester charts, which means that they either have the necessary functionality or are partnered with someone to provide it.

Second, we have a couple of integration-focussed BPM vendors who have purchased pure-play BPM vendors to create the complete range of functionality. The two highest-profile examples are the TIBCO acquisition of Staffware in 2004, and the BEA acquisition of Fuego earlier this year. In both cases, there seems to be a reasonable fit, but my concern is that the human-facing BPM side is going to become weaker since the main focus of these companies is on integration.

Third, we have the large software companies that have developed (or acquired) a BPM product: IBM, Microsoft and Fujitsu all spring to mind. In many cases, such as IBM and Microsoft, their BPM products are primarily integration-focussed without a lot of human-facing support, and likely started as a “would you like fries with that” sort of offering for customers who were already committed to their architecture. IBM’s MQ Series messaging is probably still the most commonly used piece of integration middleware in financial services, although I think that they call it (and everything else) “WebSphere” these days, and IBM rightly has it as a cornerstone of their BPM strategy. Fujitsu is the odd one out here, with what appears to be a fully-functional BPMS; unfortunately, they’ve been marketing it in stealth mode and most people are completely unaware of it: as I said in one of my posts about the Gartner BPM summit, “who knew that Fujitsu did BPM?”

We’ll continue to see most of the business functionality envelope being pushed by the vendors in the first category as they seek better ways to integrate business rules, analytics, performance management and other capabilities into BPM; in fact, the most innovation seems to be coming from the smaller vendors in this category because of the lack of baggage that I discussed in part 7.

Because of the current focus on process improvement in all organizations, I don’t think that there’s any great risk of any of the vendors that I’ve listed here going out of the BPM business any time soon. However, the integration vendors will acquire some of the smaller BPM suite vendors to round out their portfolios, and the large software companies will acquire some of everything, in a continuing Darwinian cycle.

Before you vendors start adding self-promoting comments to this post, keep in mind that this is not intended to be a comprehensive list or review of BPMS vendors, and I know that you’re all very special in your own way. 🙂