SEx machine

I just watched the replay of a presentation done by Peter Fingar, one of the most prominent names today in BPM and co-author of Business Process Management: The Third Wave. As usual, he had some interesting words about “How Work Works in Business”, because he understands that BPM is about business, not about IT: a fact of which many organizations have lost sight.

There was one false note in the presentation, however: he had a slide that described the work processor as the “strategy-execution machine”, and abbreviated that as “The SEx Machine™”. I kid you not, it even included the ™. With a slightly embarrassed pause, Peter pronounced the abbreviation as “the s-e-x machine”, as if he were the parent of a 4-year-old discussing a taboo subject at the dinner table. I was left wondering what marketing hack created that abbreviation, because I’m pretty sure that Peter wouldn’t coin a phrase for a public presentation that he obviously can’t even pronounce in public. The really funny thing was that he spent a great deal of time in the presentation emphasizing how we had to get rid of 3-letter acronyms in IT, and then went and turned “sex” into a TLA.

The inconsistency between the presentation slide and his obvious discomfort with its content really gave me pause: it made me wonder how many other presentations I see where the presenter is equally ill at ease with the content, but just hides it better.

BPM standards

I feel pretty strongly about the benefits of standards in all areas of technology, and BPM is no exception. For years, we’ve been using different notations in different tools to create BPM systems that don’t communicate with each other because they speak a different language. There’s been a lot of headway in standardizing the communications part — the messaging protocols, for example — but there’s still work to be done in the design end. How can we hope to make BPM systems truly interoperable when they don’t use the same notation to model the flow within the system?

Every BPM system has its process designer tool, but when it comes right down to it, most people model their process in Microsoft Visio before implementing it in the BPM system. First of all, a lot of people already have Visio on their desktop and know how to use it, so there are fewer licensing and training issues. Secondly, a process model will usually include all sorts of non-automated steps that will never be represented in the BPM system (especially the “as-is” model), but need to be modelled for proper business analysis. I’m not interested in tools, however, but in the actual BPM notation, which can be drawn in Visio or any number of other process modelling tools.

There are standards emerging for BPM notation, with two strong contenders: UML activity diagrams, and BPMN business process diagrams. If you have a technical background and haven’t had your head in a paper sack for the past 10 years, you know about UML; however, you probably know it as a modelling notation for software design, with activity diagrams used for computational processes, the purpose for which they were originally developed. UML activity diagrams have recently been repurposed for business process modelling: a type of object-oriented flowchart, if you will.

As much as I like UML for software design, I like the emerging BPMN standard (from BPMI, an industry standards body) better: it’s been designed as a business process modelling notation, not retrofitted from some other standard; it’s more understandable to business analysts and other non-technical participants; and it has a direct mapping to WSBPEL for process orchestration. It was only introduced last year and may take a while to catch on, but it’s worth knowing about.

If you’re interested in learning more about BPM standards, there’s a non-vendor webinar (a rarity!) called The Business Value of Process Standards at ebizQ on April 6th.

Human, Interrupted

I just read about yet another analyst BPM workshop that claims to be “the definitive education on BPM”. Oh, puh-leeze. There is no such thing as a 2-day definitive education on a topic as broad as BPM, and besides, the analysts can’t even agree on the definition of BPM. I wrote a short course on BPM for a client recently (no, it wasn’t the definitive education on BPM), and as part of that, I created a brief history of BPM to show how this space evolved. One thing that becomes painfully clear in looking at the evolution and current state of BPM is that although everyone agrees that BPM is about managing processes, there is no clear definition of the divisions within the space, or which technologies belong where (if anywhere) within it.

It’s almost biblical: in the beginning, there was human-to-human workflow. Some time after that, there was system-to-system EAI. They were distinct, and that was good because everyone understood which was which. In time, workflow begat EAI-like capabilities in order to facilitate human-to-system interfaces, and EAI begat human-facing workflow-like capabilities in order to handle exceptions within processes. Then, workflow begat BAM and simulation, and EAI begat B2Bi. Finally, workflow and EAI together adopted process modelling and business rules (they didn’t beget these technologies, they already existed in other fields).

Then, the abomination: the analysts created a Tower of Babel by lumping all of this together and calling it BPM.

Yes, it was confusing before the term BPM was applied to it all, since workflow and EAI overlapped significantly, but it’s now monumentally more confusing to customers because any vendor in the entire BPM space can claim that they “do BPM” and can therefore compete with any other vendor. I saw a particularly painful example of this at a large company that had chosen an integration broker suite for their BPM standard. The internal IT groups were fully indoctrinated that this was the only BPM tool that they could use, and they actually seemed to believe that BPM meant the considerably narrower field of EAI. One of the senior people, on hearing me describe the requirements of a human-facing BPM project, referred to it as “human-interrupted”. Not surprisingly, there are considerably more system-to-system BPM projects than any other type in that organization; who would willingly pick up that square peg and try to ram it into a round hole?

In an attempt to help with the confusion, the same analysts who lumped all of this together as BPM created divisions within the BPM space based on functionality. Unfortunately, they all created different divisions based on widely varying criteria. For example, Gartner, whose definition I tend to align with, created a taxonomy in a research note back in 2003 based on process type, dividing the space into pure-play (application-independent), integration-focussed, administrative, collaborative, and embedded. [For my work with back office systems, the key segments of the BPM space are pure-play and integration-focussed; pure-play is, more or less, what evolved from workflow, and integration-focussed is what evolved from EAI.] Delphi, on the other hand, makes divisions based on the degree of human interaction: person-to-person, person-to-system, and system-to-system. This is a very useful way to categorize applications of BPM, but I don’t agree with it as a way to categorize the products themselves, since all of them claim to do all of it.

There are many other BPM taxonomies: at least one per analyst, and usually one per vendor. Most of them are not created for the altruistic purpose of trying to clarify the space.

Creating a taxonomy is hard work, because it requires projecting a complex, multidimensional space onto a much simpler space of lower dimensionality in order to make it comprehensible and useful. BPM is definitely a case where the whole is greater than the sum of the parts, making the process even more difficult. BPM is not just workflow plus EAI plus BAM plus business rules, et cetera: it’s the near-seamless integration of all of these tools that is the real competitive differentiator, because that’s what enables an organization to do things that they could never do before.

Testing for real life

I watched the movie K-19: The Widowmaker on TV last night; it’s about a Russian nuclear submarine on its maiden voyage in 1961 where pretty much everything goes wrong. In the midst of watching reactor failures and other slightly less catastrophic mishaps, I started thinking about software testing. I’ve seen software that exhibited the functional equivalent of a reactor failure: a major point of failure that required immediate shutdown for repairs. Fortunately, since I have worked primarily on back-office BPM systems for financial services clients over the years, the impact of these catastrophic system failures is measured in lost efficiencies (time and money) by having to revert to paper-based processes, not in human lives.

When I owned a professional services company in the 90’s, I spent many years being directly responsible for the quality of the software that left our hands and was installed on our clients’ systems. In the early days, I did much of the design, although that was later spread over a team of designers, and I like to think that good design led to systems with a low “incident” rate. That’s only part of the equation, however. Without doubt, the single most important thing that I did to maximize the quality of our product was to create an autonomous quality assurance and testing team that was equivalent in rank (and capabilities) to the design and development teams, and had the power to stop the release of software to a client. Because of this, virtually all of our “showstopper” bugs occurred while the system was still in testing, saving our clients the expense of production downtime, and maintaining our own professional reputation. Although we always created emergency system failure plans that would allow our client to revert to a manual process, these plans were rarely executed due to faults in our software, although I did see them used in cases of hardware and environmental failures.

When I watched Liam Neeson’s character in K-19 try to stop the sea trials of the sub because it wasn’t ready, and be overruled for political reasons, I heard echoes of so many software projects gone wrong, so many systems put into production with inadequate testing despite a QA team’s protests. But not on my watch.

A lesson in disintermediation

I recall learning the real meaning of the word “disintermediation” in the mid-90’s, when I was helping mutual fund companies build systems to do exactly that: cut out the middle-man (the broker or dealer) in some of the transactions that they have with their end-customers. The primary vehicle for this disintermediation is, of course, the web, where now almost all financial services companies provide some sort of self-service, bypassing a mutual funds dealer, securities trader, bank teller or insurance broker whenever regulations allow. This trend is not restricted to financial services: travel agents, for example, have been practically disintermediated out of existence in some market segments.

The “bad guy” in this has always been the big company: by allowing their customers direct access to their services, they endanger the livelihood of the intermediary. (Having been self-employed for the better part of 20 years, I don’t believe in the sanctity of any job, but that’s a topic for another day.)

Now it’s the banks’ turn to be disintermediated. The Economist reports in a recent article on the launch of Zopa, a UK-based online lending and borrowing exchange: consider it peer-to-peer lending for the rest of us.

Since borrowers and lenders can get together without a bank in the middle, Zopa effectively disintermediates the bank, while providing bank-like security through credit checks, spreading each loan over multiple parties, and committing to collecting from overdue borrowers. Borrowers are classified by their risk, and lenders choose the level of risk that they want to assume when picking a market within Zopa. Zopa takes 1% of the deal, which means that borrowers get a better rate than a bank could offer them for a consumer loan, and lenders get a better rate than they would from a bank savings account. To quote their site:

Lenders can choose what rate to lend at and, by looking at the markets, decide what sort of people to lend to and when. Borrowers can choose to take a rate offered or to wait and see whether the rates drop. Both avoid paying needless chunks of commission to Financial MegaCorp plc and can get better rates of interest as a result.

I love that dig about “Financial MegaCorp plc”.
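The economics described above are simple enough to sketch in a few lines of code. This is a hypothetical illustration only: all of the rates and amounts are invented, and exactly how Zopa levies its stated 1% fee is my assumption for the example, not a statement of their actual mechanics.

```python
# A hypothetical sketch of the Zopa-style economics described above.
# All rates, amounts, and the way the 1% fee is charged are invented
# for illustration; the real exchange's mechanics may differ.

def bank_deal(deposit_rate, loan_rate, principal):
    # The bank keeps the spread between what it pays depositors
    # and what it charges borrowers.
    return {"lender_earns": principal * deposit_rate,
            "borrower_pays": principal * loan_rate}

def p2p_deal(agreed_rate, fee, principal):
    # Lender and borrower meet at a single agreed rate; the exchange
    # takes its small fee instead of the bank's much larger spread.
    return {"lender_earns": principal * agreed_rate,
            "borrower_pays": principal * (agreed_rate + fee)}

def spread_loan(principal, num_lenders):
    # Each loan is split across several lenders to dilute default risk.
    return [principal / num_lenders] * num_lenders

principal = 1000.0
bank = bank_deal(deposit_rate=0.03, loan_rate=0.08, principal=principal)
p2p = p2p_deal(agreed_rate=0.055, fee=0.01, principal=principal)

# With these made-up numbers, the lender earns roughly 55 per year
# instead of 30, and the borrower pays roughly 65 instead of 80:
# both sides beat the bank, even after the exchange's fee.
print(bank, p2p, spread_loan(principal, 50))
```

The point of the sketch is that as long as the exchange’s fee is smaller than the bank’s spread, there is room for both the lender’s and the borrower’s rates to improve simultaneously.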

You can be sure that the big financial institutions will fight the growth of exchanges like Zopa when it starts to impact their business, but then, all parties being disintermediated fight the trend. This style of borrowing and lending won’t be for everyone, but it will attract those who don’t want to spend the extra money just to have a bank in the middle, any more than I would pay a higher fare to have a travel agent book an airline ticket for me.

BPTrends’ 2005 BPM Suites Report

BPTrends published a report on BPM Suites this week that reviews 13 different BPM products:

  • Appian Enterprise from Appian
  • AgilePoint from Ascentn
  • XicoBPM from B2Binternet
  • Chordiant Enterprise Platform from Chordiant Software
  • TRAXION Enterprise Business Process Management Suite from CommerceQuest
  • eg work manager from eg Solutions
  • FileNet Business Process Manager from FileNet
  • WebSphere Business Integration from IBM
  • WorkPoint from Insession Technologies
  • Business Conversion Suite from M1 Global Solutions
  • SmartBPM Suite from Pegasystems
  • TIBCO Staffware Process Suite from TIBCO Software
  • Ultimus BPM Suite from Ultimus

The major players are definitely covered here, but there are a few of these that leave me wondering about the criteria for inclusion in this report. That cleared up a bit when I read the section Participating Vendors in the Foreword to the report:

BPTrends contacted all the BPM vendors we could identify and solicited their participation in this report at a cost to them of under $5,000. All products from participating vendors were evaluated in the same manner: Derek Miers and Paul Harmon prepared a detailed questionnaire which we asked each vendor to complete. They then reviewed the questionnaires, studied the product documentation and all other relevant materials provided by the vendors, and requested a product demonstration. Finally, they interviewed each vendor to eliminate any confusion and to make certain we had not overlooked anything. We did not conduct any actual product testing.

In other words, it appears that the vendors paid a fee to be included in the report, and the vendors provided the content for the report rather than it being based on independent observations. Although the authors have provided a nice summary that lists each product’s characteristics on a separate page, there are no negative comments about any product, and there are few comparisons between products: this is more a collation of each vendor’s product information than a product review as you might find from Gartner. Still, there is much useful information about the products in the report, and some excellent introductory material including the authors’ view of the BPM space and a summary of BPM drivers.

My assessment of BPTrends’ report: the first 38 pages (prior to the actual product information) contain some great background information. The product sections, although well organized and well written, don’t provide anything that you couldn’t get from the vendors’ websites.