James Taylor reporting from Gartner BI

James Taylor’s been at the Gartner Business Intelligence Summit this week. On Monday, he posted some great thoughts on process, rules, BI and agility:

You can use business rules to automate decisions in business processes and then use analytics to optimize these decisions and hence the processes…

You must be able to change a process that you are monitoring when your monitoring tells you that something is wrong. Real-time measurement should not be combined with systems that take weeks or months to change.

Although there are caveats to that last sentence — for example, some real-time measurement is intended to allow the human elements in a process to change rather than the system, such as work re-allocation — I’d still like to have it tattooed on my forehead for every client to read. In many of today’s BPM installations, making measurements with the intention of enabling agility is useless, not because the underlying BPMS isn’t agile, but because the customer chooses (or is coerced) to undertake a huge degree of customization that effectively pours concrete over the system.

Then later that day, he posts more on how BPM, BRE and analytics go together like chocolate and peanut butter (that’s my characterization, but I’m sure James would agree) — that seems to be a popular theme at the summit. He also posts about the Tuesday and Wednesday sessions, although those are less BPM-related than the Monday sessions.

Maybe because I come from the BPM side of the house, I don’t really understand the big fuss about renaming parts of the BI space: BI seems to be considered an outdated term now, referring only to reporting on historical information from a data warehouse or operational data store. Other terms like CPM (corporate performance management), BAM (business activity monitoring), CEP (complex event processing) and EDM (enterprise decision management, which also involves BRE) have sprung up to cover the near-real-time space that I still think of as BI — after all, much of the same data aggregation, analytics and other common technology is at the core. Many of these newer terms are touted as “[something more fabulous] BI”, such as James’ reference to EDM as “deployable BI”, but it feels a bit like the emperor’s new clothes. Maybe they’re all just BI 2.0.

Steps to BPM Success

I just watched a webinar hosted by BPMinstitute called “Proven Steps to BPM Success”. By the time the webinar started, it had been retitled “Breakaway BPM — Leveraging Business Process Innovation for Strategic Advantage”, although there wasn’t really a lot of content that fit that description. Unfortunately, the webinar started with a (short) presentation by the hosting vendor, Metastorm, then proceeded to a presentation by AMTI, one of their partners. Basically, vendor followed by vendor. Whatever happened to having customers talk about their experiences?

A couple of good ideas and graphics from the Metastorm CEO, including one on the evolution of BPM as driven by complex process initiatives.

However, pretty tame stuff from the “featured speaker” from AMTI talking about their process improvement efforts, like “reward success” and “process should be integral to the way you work”. And he totally didn’t understand why a recent Gartner survey (summarized in InfoWorld) showed that CIOs’ top business priority is improving processes but their top technology priority is business intelligence. Um, BPM and BI are related, dude — that’s what business activity monitoring (BAM) and corporate performance management (CPM) are all about.

You’ll be able to find a replay of the webinar on the BPMinstitute site within a few days, listed under their Round Tables section.

Multi-tasking during the webinar did give me a chance to glance through an interesting article in a recent copy of the Economist, Thinking for a living (paid subscription required), the title of which is based on the book of the same name by Tom Davenport. The article has a great nugget of truth from a consultant at Boston Consulting Group:

Mr. Morieux concludes that companies should concentrate on designing the processes that knowledge workers carry out, rather than measuring their performance.

Rather a different view on the whole BAM/CPM issue.

Hooked on Analytics

BAM, BI, CEP, analytics: call it what you will. An article by Tom Davenport in HBR, Competing on Analytics (free!), discusses the value of shining a bright light on what’s happening in your business processes:

At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the last remaining points of differentiation. And analytics competitors wring every last drop of value from those processes.

He makes the point that companies that become proficient at analyzing what’s happening with their business processes become the leaders in their field, like Capital One, Marriott and Amazon. These “analytics competitors” have top management buying into the concept of analytics as a strategic differentiator, have multiple analytics initiatives going on, and are doing it at the enterprise rather than the departmental level. I had a question sent to me via my Squidoo BPM lens recently about the BI/analytics marketplace in Toronto, and I had to admit that a lot of my local customers (and not-so-local ones) are still not seeing analytics as a strategic initiative, but are allowing it to languish in departmental applications where it provides ROI but (probably) not much strategic differentiation at an enterprise level. Read Tom’s article for his full analysis of what the analytics competitors are doing right, and the difference that it’s making for them. He even pooh-poohs the use of the term “business intelligence” as being “the term IT people use for analytics and reporting processes and software”. Ouch!

The linked sub-article, You Know You Compete on Analytics When…, has a summary of the traits of a successful analytics competitor.

BAM technical session

This seemed to be a morning for networking, and I arrived late for a technical session on FileNet’s BAM. I missed the hands-on session this morning, so I wanted to get a closer look at this before it’s released sometime in the next couple of months.

The key functional things in the product are dashboards, rules and alerts. The dashboard part is pretty standard BI presentation-layer stuff: pick a data source, pick a display/graph type, and position it on the dashboard. Rules are where the smarts come in: pick a data source, configure the condition for firing an alert, then set the content and recipient of the alert. Alerts can be displayed on the recipient’s dashboard, or sent as an email or SMS, or even launch other processes or services to handle an exception condition automatically.
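
To make the rules part concrete, here’s a minimal sketch in Python of that pattern: a condition evaluated against incoming process metrics, with one or more alert actions attached. This is not FileNet’s actual API; every name and field below is hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# A metric event as it might arrive from the process engine.
# The fields are hypothetical, not FileNet's actual data model.
@dataclass
class MetricEvent:
    queue: str
    waiting_items: int
    avg_wait_minutes: float

# A rule pairs a firing condition with the alert actions to run when it's met.
@dataclass
class Rule:
    name: str
    condition: Callable[[MetricEvent], bool]
    actions: List[Callable[[MetricEvent], None]] = field(default_factory=list)

    def evaluate(self, event: MetricEvent) -> None:
        if self.condition(event):
            for action in self.actions:
                action(event)

# Alert actions: a real product would push to a dashboard, send email/SMS,
# or launch an exception-handling process; here they just print.
def dashboard_alert(event: MetricEvent) -> None:
    print(f"[dashboard] backlog on {event.queue}: {event.waiting_items} items")

def email_alert(event: MetricEvent) -> None:
    print(f"[email] {event.queue} average wait is {event.avg_wait_minutes} minutes")

backlog_rule = Rule(
    name="Claims backlog",
    condition=lambda e: e.queue == "Claims" and e.waiting_items > 500,
    actions=[dashboard_alert, email_alert],
)

backlog_rule.evaluate(MetricEvent(queue="Claims", waiting_items=612, avg_wait_minutes=47.5))
```

The product does all of this declaratively through its configuration screens rather than in code, of course, but the shape of a rule is the same.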

There’s a nice interface for configuring the dimensions (aggregations) in the underlying OLAP cubes, and for configuring buckets for running statistics. The data kept on the BAM server is cycled out pretty quickly: it’s really for tracking work in progress with just enough historical data to do some statistical smoothing.
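
The buckets for running statistics are easy to picture, too. As a purely illustrative sketch (not how the product is actually built), think of a bounded window of recent observations per metric, with older data cycling out automatically:

```python
from collections import deque
from statistics import mean

class StatBucket:
    """Keeps only the most recent N observations for one metric, cycling older
    data out, which is enough for smoothing work-in-progress statistics
    without a full data warehouse behind it."""

    def __init__(self, max_samples: int = 100):
        self.samples = deque(maxlen=max_samples)  # old values fall off automatically

    def add(self, value: float) -> None:
        self.samples.append(value)

    def smoothed(self) -> float:
        return mean(self.samples) if self.samples else 0.0

wait_times = StatBucket(max_samples=50)
for observed in (12.0, 15.5, 9.0, 22.0):
    wait_times.add(observed)
print(wait_times.smoothed())  # running average over the retained window
```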

Because they’re using a third-party OEM product for BAM, it’s open to having other data sources plugged into the server, used in the OLAP cubes, combined on the dashboards, or used in the rules. However, this model adds yet another server: it pulls pre-processed work-in-progress data from the Process Analyzer (so PA is still required), and since it maintains the cubes in memory, its memory requirements are hefty enough that it’s probably not a good idea to co-locate it on a shared application server. I suppose that this demotes PA to a data mart for historical data as well as a pre-processor, which is not a completely bad thing, but I imagine that a full replacement for PA might have been better received by customers.

Hot BAM!

If there’s anything better than hearing about a hot new product like FileNet’s BAM, it’s hearing about it in Danny Pidutti’s lovely Aussie accent. There are a few misconceptions in his presentation around the differences between BI and BAM; I see BAM as just a process-oriented subset of BI, although its real-time nature means that we’re in the realm of operational BI, such as was discussed in an eBizq webinar “Improving Business Visibility Through Operational BI” on Oct 27th (www.ebizq.net/webinars/6298.html according to my calendar, sorry for the lack of a direct hyperlink but those are the limits of blogging via Blackberry email) and an earlier one about operational BI on Oct 12th, although I can’t recall who hosted it.

This looks like a pretty significant improvement on the old Process Analyzer: about 20 pre-configured reports, configurable role-based dashboards, KPIs for scorecard-like capabilities, alerts and other fun stuff. A bit of a catch-up from a competitive standpoint, but these days FileNet is known more for solid technology than for being first to market.

The demo starts with a Celequest login screen, telling you who the OEM vendor is. At this point, it’s really a standard BI demo, showing how dashboards are configured, how alerts are set, and other related functions.

My only question is, what took you guys so long?

High-level product info

Dave McCann, FileNet’s SVP of Products, is talking in some very broad strokes about product directions, and I’m yearning for more details on all the new announcements. I suppose that will come mostly in the breakout sessions; I just need to be patient. He’s also talking a lot about content, which is not my focus (in case you haven’t noticed already) — I consider content to be like the air we breathe: it’s always there, I just don’t think about it.

A few interesting factoids that he’s dropped into his talk based on his conversations with customers: a large insurance company that sits on the FileNet technical advisory board stated that the largest cost in their IT budget is integration between all of the vendor products that they own. Yikes! A European customer told him that 82% of their IT budget is committed to maintaining what’s already in place, with only the remaining 18% to spend on new technology. Taken together, these two facts point out the need for easier ways to integrate what’s already there, which would free up part of the budget for new technology to help companies maintain a competitive advantage. The need for consistent architectures and reusability has never been greater.

He’s finally onto the process stuff, and is talking about the recent and upcoming enhancements to the BPM product suite:

– Productization of the Business Process Framework, which is a BPM application development framework developed by FileNet’s Professional Services for use in their own customer engagements, including things like case management and skills/roles management. They’re being very careful about positioning this so that it’s not perceived as being too competitive with partner solutions, although I’m sure that there will be a few partners who are going to be a bit put out by this.

– Business Activity Monitoring as a new product, replacing the rudimentary Process Analyzer that has been holding the fort in the BAM area for the past few years. Shipping in December. I’ll definitely be going to the lab on this later this week, since this is something that I constantly talk to customers about.

– Enhanced integration with business intelligence, especially through their recent cozying up with Cognos. I’ll be talking about corporate performance management, and mentioning Cognos specifically, in my breakout session this afternoon, since I feel that this is a critical step for most organizations.

– eForms enhancements, which are always interesting but a bit peripheral to what I usually do.

– A business rules connectivity framework that integrates with Fair Isaac, Corticon and Resolution in addition to the longer-standing integration with ILOG. BRE is another functionality that I feel is essential to BPM, as I discussed in my course on the weekend.

He’s also talking about the FileNet Enterprise Reference Architecture, which fits nicely as a technical architecture for ECM within a full EA context.

The most exciting thing about the features that will be released next year is full BPMN support, which further validates my personal preference for BPMN over UML for process modelling.

All-in-all, I’m quite pleased with what they’ve announced in the BPM area, since it’s addressing some key weaknesses (like BAM) that have existed in the product suite to date.

Neural nets in BPM?

Just saw this article in eWeek about Fuego releasing neural net capabilities in their BPM product.

Neural Network works through a decision activity capability that lets users define a set of variables that can be analyzed for process improvement…Neural Network takes that set of variables and builds a learning activity set that can monitor decisions and suggest behavior to improve the process.

I haven’t heard the term “neural net” much since my days in graduate school when I was slogging through a thesis on pattern recognition; it usually refers to a hardware implementation consisting of a massively-parallel network of simple processors (modelled on the human brain and its highly-connected network of neurons): think grid computing on a very tiny scale. Because these terms are not widely understood, there’s a long history of misuse: in fact, the first company that I worked for after university had the word “perceptron” (a type of neural net) in its name, although we wrote pattern recognition and scientific image analysis software, with nary a neural net in sight.
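
For anyone who didn’t have to slog through that material: a single perceptron is about as simple as a learning algorithm gets. Here’s a toy version in Python that learns a logical AND, the classic textbook exercise, and which has nothing whatsoever to do with Fuego’s product:

```python
# A single perceptron learning the logical AND function.
# Weights are nudged toward the target whenever the prediction is wrong.

training_data = [            # (inputs, expected output)
    ((0, 0), 0),
    ((0, 1), 0),
    ((1, 0), 0),
    ((1, 1), 1),
]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

def predict(inputs):
    activation = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation > 0 else 0

for _ in range(20):                      # a few passes over the training set
    for inputs, expected in training_data:
        error = expected - predict(inputs)
        bias += learning_rate * error
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]

print([predict(inputs) for inputs, _ in training_data])  # expect [0, 0, 0, 1]
```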

That being said, I’m assuming that what Fuego is calling “Neural Network” is actually artificial intelligence (AI) or cognitive modelling, although I can understand why the marketing types would avoid the overused “AI”, with its shades of science fiction, and positively run screaming from the overly-geeky “cognitive”. The problem with introducing a functionality that is barely understood in the marketplace (besides having to explain it to your own marketing people) is that the customers have no clue what to do with it, and probably not much time to spend doing the out-of-the-box thinking required to come up with some real business scenarios that have the potential for ROI. If you keep reading the article, you’ll see that the VP of process management at an existing Fuego customer considered “the Neural Network technology” to be “intriguing but not essential”. See the problem? It’s still “technology” in the minds of that customer, not a solution to a business problem.

I think that AI has a great future in BPM, but it’s still very early in the hype cycle. As a natural extension to business activity monitoring (BAM), pushing it into the milieu of semi-automated corporate performance management (CPM), it’s going to be the next “must-do” on BPM vendors’ product plans.

By the way, I wrote this post on my tablet PC (in tablet mode) — the handwriting recognition is really good, although a bit slower than my typing. I would like copy-cut-paste soft keys on the handwriting input panel, however: I had to keep switching from handwriting mode to keyboard mode in order to use Ctrl+C, Ctrl+X and Ctrl+V.

Fractured Language

Yesterday, I was finishing off a presentation for a talk that I’ll be giving next month about corporate performance management, including some of the analytics tools that are used to build things like executive dashboards to display the key performance indicators of a company’s operations as charts and dials. Two tools/metaphors are used a lot: dashboards and scorecards, which both do exactly as they sound. Unfortunately, in my research I found at least one vendor of these products who verbs the nouns, and refers to “dashboarding” and “scorecarding” as the activities of creating these things for a company. Blech.

I felt better after this morning’s daily dose of Savage Chickens.

Convergence of BPM and BI

If you’re interested in more about the Forrester report on BPM and BI that I mentioned a couple of weeks ago, but still don’t want to shell out the cash for it, you can find a more complete summary here on BPM.com (registration required), written by a Forrester VP who was one of the original report authors.

The more that I look at compliance, which I’ve been doing a lot of lately, the more that I understand that BPM needs to feed its performance data into a larger BI infrastructure. And I can buy into what the Forrester report refers to as Process to Data, or P2D (as if we needed another TLA), which is the two-way synergy between BPM and BI: BPM feeds process performance data to BI, and BI invokes processes in BPM in order to gather information. However, I have to draw the line at their statement:

BI and BPM can no longer live without each other, and the time is right for these technologies to merge.

I don’t think so, any more than BPM will merge with any number of other technologies, although obviously there will be closer and closer integration ties in order to make all of this work smoothly. BPM and BI are strong, distinct markets served by a variety of strong vendors, and customers develop quite a bit of loyalty to their BPM and BI vendors because both of these technologies are pervasive in an organization’s IT infrastructure. If BPM vendor A merged with BI vendor B and suggested to their customers that they change all of their existing BI from vendor C to vendor B, there would be a great deal of rolling around on the floor and laughing. There’s a much stronger argument to be made for merging BPM and BR (business rules), but I don’t think that’s going to happen either, for much the same reasons: both technologies have strong, independent markets (that is, the technologies can exist successfully in organizations without each other) and there is little reason for a customer to want to buy their corporate-wide BR from their BPM vendor, or vice versa.
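
Coming back to P2D for a moment, the synergy itself is simple enough to sketch. Here’s a purely hypothetical illustration of the two directions, with made-up interfaces standing in for whatever a real BPM engine or BI store would actually expose:

```python
# Direction 1: BPM feeds process performance data to BI.
# Direction 2: BI invokes a process in BPM when its analysis warrants it.
# All class and method names here are hypothetical.

class BIStore:
    def __init__(self):
        self.facts = []

    def record(self, process: str, step: str, duration_minutes: float) -> None:
        # In a real deployment this would be an ETL feed or a web-service call.
        self.facts.append((process, step, duration_minutes))

    def average_duration(self, process: str) -> float:
        durations = [d for p, _, d in self.facts if p == process]
        return sum(durations) / len(durations) if durations else 0.0

class BPMEngine:
    def launch(self, process: str, payload: dict) -> None:
        print(f"launching {process} with {payload}")

bi = BIStore()
bpm = BPMEngine()

# BPM -> BI: process events flow into the analytical store.
bi.record("NewAccount", "CreditCheck", 42.0)
bi.record("NewAccount", "CreditCheck", 55.0)

# BI -> BPM: analysis triggers a process to gather more data or fix the problem.
if bi.average_duration("NewAccount") > 45.0:
    bpm.launch("InvestigateDelay", {"process": "NewAccount"})
```

Nothing in that pattern requires a single merged product; it just requires well-defined integration points.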

If the merging that Forrester suggests actually occurs, we’ll end up with monolithic Swiss-Army-knife-like vendor offerings from a few large players — how retro! — and a lot of unhappy customers. What most customers want is the best of both worlds: best-of-breed technology and a minimal amount of integration effort; hence the huge popularity of SOA. In the past, you couldn’t get both: you could buy best-of-breed from different vendors and spend a lot of effort integrating them, or you could buy a fully-integrated suite from a single vendor that included at least one sub-optimal component. Today, however, with the advent of integration standards and SOA, the integration effort for properly-constructed products from different vendors should be no more than that required for products from the same vendor.

My conclusion: as long as the vendors do what they’re supposed to in order to enable easy integration, there are very few customer drivers for merging BI and BPM.