TUCON: Tom Laffey and Matt Quinn

Last up in the morning’s general session were Tom Laffey, TIBCO’s EVP of products and technologies, and Matt Quinn, VP of product management and strategy. Like Ranadivé’s talk earlier, theirs was about enterprise virtualization: positioning messaging, for example, as virtualizing the network layer, and BPM as enterprise process virtualization. I’m not completely clear whether virtualization is just the current analyst-created buzzword in this context.

Laffey and Quinn tag-teamed quite a bit during the talk, so I won’t attribute specific comments to either. TIBCO products cover a much broader spectrum than I do, so I’ll focus just on the comments about BPM and SOA.

TIBCO’s been doing messaging and ESB for a long time, and some amount of the SOA talk is about incremental feature improvements such as easier use of adapters. Apparently, Quinn made a prediction some months ago that SOA would grow so fast that it would swallow up BPM, so that BPM would just be a subset of SOA. Now, he believes (and most of us from the BPM side agree 🙂 ) that BPM and SOA are separate but extremely synergistic practices/technologies, and both need to be developed to a position of strength. To quote Ismael Ghalimi, BPM is SOA’s killer application, and SOA is BPM’s enabling infrastructure, a phrase that I’ve included in my presentation later today; like Ismael, I see BPM as a key consumer of what’s produced via SOA, but they’re not the same thing.

They touched on the new release of Business Studio, with its support for BPMN, XPDL and BPEL as well as UML for some types of data modelling. There are some new intelligent workforce management features, and some advanced user interface creation functionality using intelligent forms, which I think ties in with their General Interface AJAX toolkit.

Laffey just defined “mashup” as a browser-based event bus, which is an interesting viewpoint, and likely one that resonates better with this audience than the trendier descriptions.
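To make that definition concrete, here’s a minimal sketch of a publish/subscribe event bus of the kind Laffey is alluding to, written in Python for brevity rather than browser JavaScript; the topic names and “widgets” are invented for illustration, not anything from TIBCO’s products:

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus: widgets subscribe to topics,
    and any widget can publish events that fan out to all subscribers."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

# Two "widgets" in a mashup react to the same event stream,
# without knowing anything about each other.
bus = EventBus()
log = []
bus.subscribe("order.selected", lambda e: log.append(f"map shows {e['region']}"))
bus.subscribe("order.selected", lambda e: log.append(f"chart shows {e['amount']}"))
bus.publish("order.selected", {"region": "EMEA", "amount": 1200})
```

The appeal of the framing is exactly this decoupling: mashup components coordinate through events rather than direct calls, which is the same pattern that this audience already knows from enterprise messaging.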

They discussed other functionality, including business rules management, dynamic virtual information spaces (the ability to tap into a real-time event message stream and extract just what you want), and the analytics that will be added with the acquisition of Spotfire. By the way, we now appear to be calling analytics “business insight”, which lets us keep the old BI acronym without the stigma of the business intelligence latency legacy. 🙂

They finished up with a 2-year roadmap of product releases, which I won’t reproduce here because I’d hate to have to embarrass them later, and some discussion of changes to their engineering and product development processes.

Convergence of BPM and BI

We’re 19 minutes into a webinar on “Adding Process Context to BI for Process Intelligence” that is supposed to be featuring Colin Teubner of Forrester, and the sponsor (Global 360) is still talking. Even worse, I’m not completely clear on how Global 360’s new “Business Process Intelligence” initiative is really any different from anyone else’s simulation, analytics and performance management offerings.

Colin did eventually get the floor, and talked about how BPM and BI are converging: at the basic level of implementation, they’re quite distinct, but in advanced implementations, they’re quite tightly intertwined. He spoke about the distinction between data-driven BI and process-centric BI, and how the latter (usually available as part of a BPMS) is sensitive to changes in a process and can self-adjust, hence providing better information about business processes. Colin is pushing the idea that BI and BPM will eventually merge into a single product class, which I’m not sure that I agree with: I think that there are a lot of valid data-driven applications for BI that aren’t, strictly speaking, process analytics. It is true, however, that there needs to be better BI integrated more closely with BPM, beyond the relatively simplistic BAM capabilities that are available out of the box.

The webinar was run by Shared Insights, but should be available for replay somewhere via the Global 360 website.

LucidEra launches today

I had the chance last week for a chat with Ken Rudin and Alex Moissis of LucidEra, and a preview of their SaaS business intelligence offering aimed at the SMB marketplace that is being released in general availability today. Rudin, LucidEra’s CEO, was previously with Salesforce.com, Oracle and Siebel CRM OnDemand, so you have to assume that he knows something about both BI and SaaS; Moissis, VP of Marketing, had a long run at Business Objects in product marketing and product strategy.

In most BI projects that I’ve seen, ROI comes quickly — usually less than a year, sometimes less than six months — since it allows analysis of costs, revenues and risks in ways that just aren’t possible using spreadsheets and paper reports. Once the patterns in the data are made visible, companies can act on these trends to cut costs and increase revenues, either in a manual or automated fashion. This is great if you have hundreds of thousands of dollars to spend on a big BI solution, and an IT team to put it in place and get the initial reports up and running, but not so great if you’re smaller, with less money to spend and little or no IT support for a BI project.

[Image: LucidEra report with quota field added]

What LucidEra showed me will help to address that issue for SMBs: a very Web 2.0-looking hosted BI application, supporting multiple data sources, and easy enough to use by anyone familiar with a spreadsheet. In short, they’re trying to simplify BI enough that a smaller company with little IT infrastructure can adopt it and start to reap the benefits. There’s a basic BI platform with pre-built solutions on top of the platform; some of the solutions, like their initial forecast-to-billing one, are included in the base price, whereas others may be at an additional cost, especially those created by third parties. The base price will be around $3,000 per month, which includes 100 users, 3 different data connections, and the aforementioned forecast-to-billing application. It seems like a lot of money, but think about it: the per-user price is about halfway between Salesforce.com and Blueprint. Welcome to the world of paying for your “enterprise” software monthly on your American Express card, and stopping it at any time that you’re not happy with it.

Setting up a new company in LucidEra is a self-service activity, and LucidEra doesn’t even offer professional services to assist with this, although they do provide telephone and online support. Typically for their beta customers (of which there are about a dozen, ranging in size from less than 50 to several hundred employees), this takes up to five person-days spread over as much as three weeks, and is mostly about getting the data sources properly hooked up and doing some data cleansing on the results. Although I didn’t review this process, it sounds as if you’re not going to need professional help for this one, just someone internally who understands your data sources already.

[Image: LucidEra graph by region]

We spent quite a bit of time looking at the forecast-to-billing application, doing some slicing and dicing on the data. In the sample that we looked at, the customer data (expected revenue) came from Salesforce.com, the financials (booked revenue) came from NetSuite, and the quota information came from an Excel spreadsheet. These are just three of the data sources that LucidEra can support in any combination: for example, the financials could have come from Oracle Financials instead.

The really cool thing is that there is no distinction between the design and view environment: if you’re viewing a report, you can change it interactively. We added fields to the report, filtered it, grouped by fields (creating the equivalent to an Excel pivot table) and viewed it as a graph, all through dragging things around on the screen. If we didn’t like our changes, we could undo them one at a time, or revert back to the original report.
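For readers who haven’t seen this style of report, here’s a rough sketch of the join-and-group operation underneath a forecast-to-billing view, using made-up numbers and plain Python dictionaries standing in for the three data connections; none of this reflects LucidEra’s actual implementation:

```python
# Hypothetical rows standing in for the three sources in the demo:
# expected revenue from a CRM, booked revenue from a financials system,
# and quotas from a spreadsheet, all keyed by sales region.
expected = {"East": 500, "West": 700}   # e.g. Salesforce.com
booked = {"East": 420, "West": 760}     # e.g. NetSuite
quota = {"East": 450, "West": 800}      # e.g. Excel

def forecast_to_billing(expected, booked, quota):
    """Join the three sources on region, the way a pivot-style report
    groups rows by a chosen field, and compute percent of quota booked."""
    report = {}
    for region in quota:
        b = booked.get(region, 0)
        report[region] = {
            "expected": expected.get(region, 0),
            "booked": b,
            "pct_of_quota": round(100 * b / quota[region]),
        }
    return report
```

Adding a field or re-grouping in the LucidEra UI amounts to changing the join key or the computed columns here, except done by dragging things on screen rather than editing code.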

A few technical notes: the client is purely browser-based, and will run in IE or Firefox on Windows. Ken was going to confirm whether it ran on other platforms (Mac and Linux) but I haven’t heard back yet. They developed their own back-end database based on the Broadbase data warehouse source code and some open-source technology, then rebuilt for multi-tenancy, ease-of-use and to optimize for the SaaS environment. All of this was put together in about 15 months, a timeline that they could not have accomplished except by using the code bases that they started with.

The press release isn’t up on their site yet, but you should be able to find it, and all the other information, there later today.

BI isn’t a field that I usually cover in depth, but keep in mind last week’s themes at the Gartner conference: visibility and agility. BI combined with BPM is one of the ways that visibility into business processes is being realized.

Gartner Day 2: Jim Sinur (again)

I finished up the day by attending Jim Sinur’s session on continuous optimization. And thanks to Gartner, we have a brand new acronym: BOSS, for business optimization support systems.

He has an interesting take on optimization that I agree with: it’s an antidote to entropy. The laws of entropy say that systems tend to become more chaotic over time, and you have to have something in place that will actively head off that slide into chaos. Continuous improvement is not, however, a replacement for disruptive or radical change within an organization: the former provides some refinements along the way to a goal, while the latter causes changes in direction to a new goal.

He defined continuous optimization as “keeping a process efficient, effective and relevant under all possible and changing conditions,” and laid out a list of benefits of continuous process optimization, not the least of which is creating a balance amongst competing goals: sacrificing a local optimization in favour of an overall optimization.

There was some amount of repeated material from Bill Gassman’s BI/BAM presentation earlier today, but Sinur went into a number of other areas, such as understanding both the drivers for process optimization and the inhibitors to its adoption. It’s completely necessary to link processes to desired outcomes so that the goals of optimization are well understood, and also to anticipate the shift to indeterminate/ad hoc/collaborative processes that don’t have pre-determined process maps, but are usually triggered by events and are goal-driven.

He looked at how to discover the opportunities for optimization, and selecting the proper optimization capability from a set of optimization tools and techniques. He also made some good points about matching your optimization method and your risk profile, which I’ve heard in earlier presentations this week: if you’re very risk-averse, for example, you’re unlikely to have self-optimizing systems that change their own behaviour based on patterns of events in the system.

This is a growth area, and one that can provide some competitive advantage: only the leader organizations are using this technology now, and it has the potential to make a huge impact on a company’s agility.

Gartner Day 2: Bill Gassman

The afternoon started with several simultaneous sessions by Gartner analysts, and I sat in on Bill Gassman talking about Measuring Processes in Real Time, or as he put it later, learning to live in real time.

There’s no doubt that process visibility is a key benefit gained from BPM, and that visibility usually occurs through the integration of business intelligence (BI) or business activity monitoring (BAM) tools to assist in process monitoring. The goal of BAM is to monitor key objectives, anticipate operations risks, and reduce latency between events and actions, and there’s a number of different channels for funneling this information back to those who need to know, such as business intelligence systems for predictive modelling and historical reports, real-time dashboards, and alerts.

So what’s the difference between BI and BAM? According to Gassman, BI is used for insight and planning, and is based on historical — rather than real-time — data. BAM is event driven, and issues alerts when events occur. Personally, I think that there’s a spectrum between his definitions of BI and BAM, and it’s not clear to me that it’s a useful distinction; in many cases, data is trickle-fed from operational systems to BI systems so that the data is near-real-time, allowing dashboards to be driven directly from the BI system. True, traditional BI tools will typically see update intervals more like 15 minutes than the near-real-time event alerts that you’ll find in BAM, but that’s not a problem in some cases.
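The distinction (and the spectrum between the two) can be sketched roughly as a BI-style periodic batch summary versus a BAM-style per-event threshold alert; the event data and threshold here are invented for illustration, not any vendor’s API:

```python
import statistics

# Hypothetical event stream from a process engine:
# each event is (timestamp_seconds, cycle_time_seconds).
events = [(0, 40), (60, 42), (120, 95), (180, 41)]

def batch_summary(events):
    """BI style: a periodic rollup over accumulated history,
    used for insight and planning rather than immediate action."""
    times = [cycle for _, cycle in events]
    return {"count": len(times), "mean": statistics.mean(times)}

def bam_alerts(events, threshold=60):
    """BAM style: evaluate each event as it arrives, and return the
    timestamps of events that breach the threshold and need an alert."""
    return [ts for ts, cycle in events if cycle > threshold]
```

Trickle-feeding the BI rollup every few minutes moves it toward the BAM end of the spectrum, which is exactly why I find the hard boundary between the two less useful than Gassman suggests.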

Gassman discussed the different real-time analytic techniques that are widely used today: process monitoring, logistics optimization (often based on optimizing delivery times while minimizing penalties), situational awareness, pattern matching (complex event processing, or CEP), track and trace (typically used for B2B processes), and comparison between predictions and reality.

Gartner found in a survey 18 months ago that half of their customers surveyed don’t use BAM, and claim that they don’t use it because they don’t really know about it. Considering that BI has long been a technology that can be cost-justified in an extremely short time-frame, and BAM follows the same ROI patterns, I find this surprising (and I had the feeling that they were a bit surprised, too), although I have had large customers who fall into the same category.

Looking at it from a BPM standpoint, automating a process without having appropriate monitoring is risky business: there’s a business value to awareness of what’s happening in your processes, so that problems are detected early, or possibly before they even occur. There’s a natural synergy between BPM and BAM: BPM generates a mound of process instance data, often in an event-driven manner, that just begs to be analyzed, aggregated, sliced and diced.
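For example, the kind of slicing and dicing that a BAM tool performs on that mound of process instance data might look something like this sketch, with invented events and field names:

```python
from collections import defaultdict

# Hypothetical process instance events emitted by a BPMS:
# (instance_id, step_name, duration_seconds)
events = [
    (1, "review", 30), (1, "approve", 10),
    (2, "review", 50), (2, "approve", 20),
    (3, "review", 40),
]

def by_step(events):
    """Aggregate instance events per process step: count and average
    duration, the basic rollup behind a process monitoring dashboard."""
    totals = defaultdict(lambda: [0, 0])  # step -> [count, total duration]
    for _, step, duration in events:
        totals[step][0] += 1
        totals[step][1] += duration
    return {step: {"count": c, "avg": t / c} for step, (c, t) in totals.items()}
```

An aggregation like this is where bottlenecks first show up: a step whose average duration creeps upward is exactly the early warning that process monitoring is meant to surface.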

Gassman discussed some best practices for BAM/BPM synergy before moving on to his definition of the four generations of BAM architecture: isolated silos, standalone, integrated, and composite. We’re still seeing lots of 1st and 2nd generation BAM tools, the 3rd generation has just started happening, and the 4th generation is still at least a year away. He points out that most BPM vendors are adding BAM, but are using a 1st generation BAM system that’s an isolated silo. He sees the potential to move through 5 different styles of BAM automation, that is, how the analysis from the BAM tool feeds back to change the business process. The potential benefits are great as you move from the simple BAM dashboards up through adaptive rules that choose a path based on goals, but the risks increase as well.

BAM is coming from a number of different types of vendors, in spite of the small size of the market, and there will definitely be some convergence and shakeouts in this market. An example of a trend that I think will continue is the recent acquisition of BAM vendor Celequest, used by some BPM vendors as their embedded BAM, by Cognos, a BI vendor. When you’re using BPM, you’re also going to have to face the question of whether to use a BPM vendor’s embedded BAM, or look for a more fully-functional standalone BAM tool. Gassman showed a spider graph of how BPM/BAM matches up against BI on 8 different dimensions, which indicates that you may want to look for a separate product if you need more analytical capability or need to monitor events outside of the process model.

BPM Think Tank Day 2: Connie Moore keynote

Today started with Connie Moore and Colin Teubner from Forrester delivering the keynote “Making Sense of the Business Process Management Landscape”. Moore addressed the ever-present (and ever-changing) issue of defining the BPM landscape. She thinks that BPM was co-opted by the integration vendors — a view that I’ve heard a few times over the past day, and with which I agree to some degree — and thinks that it needs to be given back to the business. She splits the landscape into pure-play BPM, integration, traditional B2B, enterprise content management, application platform, and enterprise application. I found her comments about ECM vendors interesting (paraphrasing): “they don’t really understand it, but they created some of the early workflow products”. That assessment doesn’t match up with the fact that FileNet, which they put in this “don’t get it” category, also ended up right on the border between “strong performer” and “leader” in their Wave for Human-Centric BPMS (I mention FileNet specifically because I worked there a long time ago and still work with some of their products, so have a good idea of their capabilities), so I’m not sure of the value of these categorizations.

She started out showing the results of a Forrester survey from last year about problems with enterprise application implementations, where several of the top responses were related to BPM in some way: inadequate support for cross-functional processes, limits on process change due to application inflexibility, lack of visibility and analytic insight into process results, and inability to extend business processes to external partners.

She showed how BPM evolved from workflow, although I think that her view is simplistic since it only considers the human-centric side. She then went on to talk about Ken Vollmer’s view, which is that BPM evolved from EAI; as you can imagine, I think that’s also a simplistic viewpoint. As I discussed in my history of BPM, I think that the market started to merge a few years back when workflow vendors started adding EAI, and EAI vendors started adding workflow, although all of them maintain an orientation in one direction or another. Forrester now ranks the integration vendors and the human-centric BPM vendors separately, and has very different analyst teams working on them, effectively tearing apart the originally artificial, but now well-accepted, combination of everything integration-related under the BPM umbrella that Gartner made a few years back. It feels like they’re trying to put the toothpaste back into the tube, and I don’t think that it’s going to work. Moore does make a valid point that one product won’t do it all, which is exactly what I’ve been telling my customers for some time: I think that most organizations need two in order to cover all the requirements currently, although they need to work together closely.

They showed a great diagram where BPM is positioned as the crossover technology between business and IT, whereas ESB and other more integration-focussed technologies are clearly on the IT side of the fence. Let’s face it, an IT person might talk to a business person about BPM, but they’re never going to talk to them about ESB or SOA with any degree of success: BPM lives in both of their worlds, although it may show different faces to each side.

Moore then said those words that always chill my heart when I hear them from an analyst: “I’m going to talk about where BPM vendors ought to be thinking”. I had a lengthy conversation yesterday about how I disagree with the power that Gartner has as a market-maker, as opposed to an organization that analyzes and reports on the market and trends, and here’s Forrester playing the same game. I was quite relieved when she presented a very vanilla view of a value pyramid of BPM-related functions plus some predictions like the user experience will change dramatically (without mentioning Web 2.0), and that better integration between BPM and BI is needed. Whew.

BPM and BI

Lots of interesting news recently on BPM and BI. Last month, Lombardi and Cognos signed an OEM agreement to embed Lombardi’s Teamworks into Cognos’ analytics applications. Bruce Silver had a good post about the implications of this agreement, and the blurring lines between BPM and analytics. Then last week, IBI announced that they’ve embedded their WebFocus business intelligence into iWay‘s Process Manager (iWay is a subsidiary of IBI), further indicating this blurring of technologies.

Gartner BPM summit day 1: Sinur and Melenovsky

The conference opened with the two key faces of Gartner’s BPM vision — Jim Sinur and Michael Melenovsky — giving a brief welcome talk that focussed on a BPM maturity model, or what they are calling BPM3. There was only one slide for their presentation (if you don’t count the cover slide) and it hasn’t been published for the conference attendees, so I’ll rely on my sketchy notes and somewhat imperfect memory to give an overview of the model:

  • Level 0: Acknowledge operational inefficiencies, with potential for the use of some business intelligence technology to measure and monitor business activities. I maintain that there is something lower than this, or maybe a redefinition of level 0 is required, wherein the organization is in complete denial about their operational inefficiencies. In CMM (the Capability Maturity Model for software development processes), for example, level 0 is equivalent to having no maturity around the processes; level 1 is the “initial” stage where an organization realizes that they’re really in a lot of trouble and need to do something about it.
  • Level 1: Process aware, using business process analysis techniques and tools to model and analyze business processes. Think Visio with some human intelligence behind it, or a more robust tool such as those from Proforma, iGrafx or IDS Scheer.
  • Level 2: Process control, the domain of BPMS, where process models and rules can now be executed, and some optimization can be done on the processes. They admitted that this is the level on which the conference focusses, since few organizations have moved very far beyond this point. Indeed, almost every customer that I have that uses BPM is somewhere in this range, although many of them are (foolishly) neglecting the optimization potential that this brings.
  • Level 3: Enterprise process management, where BPM moves beyond departmental systems and becomes part of the corporate infrastructure, which typically also opens up the potential for processes that include trading partners and customers. This is a concept that I’ve been discussing extensively with my customers lately, namely, the importance of having BPM (and BRE and BI) as infrastructure components, not just embedded within departmental applications, because it’s going to be nearly impossible to realize any sort of SOA vision without these basic building blocks available.
  • Level 4: Enterprise performance management, which starts to look at the bigger picture of corporate performance management (which is what Gartner used to call this — are they changing CPM to EPM??) and how processes tie into that. I think that this is a critical step that organizations have to be considering now: CPM is a great early warning indicator for performance issues, but also provides a huge leap forward in issues such as maintaining compliance. I just don’t understand why Cognos or other vendors in this space aren’t at this conference talking about this.
  • Level 5: Competitive differentiation, where the business is sufficiently agile due to control over the processes that new products and services can be easily created and deployed. Personally, I believe that competitive differentiation is a measure of how well you’re doing right from level 1 on up, rather than a separate level itself: it’s an indicator, not a goal per se.

That’s it for now, I’m off to lunch. At this rate, I’ll catch up on all the sessions by sometime next week. 🙂