Even Smart Enough Systems get the blues

Scott Selhorst dropped me a note last night to point to his interview with James Taylor about James' book, released this week, hoping that I could build on that rather than covering a lot of the same ground when I interview him today. Scott's podcast runs 52 minutes, though, so I haven't had time to listen to it yet, and may not before I talk to James later today, so I may end up repeating some material. However, I think that there's a big divide between text and voice interviews, and the same people don't necessarily consume both.

I did have a laugh when I imported the interview into iTunes for synching onto my iPod; I assume that Scott didn’t set a default genre on the mp3 file, and in iTunes it shows up as “Blues”.

Business Rules Forum schedule

The schedule for the Business Rules Forum in October in Orlando has been posted, and I’m speaking on Tuesday, in the first breakout session following the opening keynote.

I’ll be talking about how business process management, business rules and business intelligence all come together to create both agility and visibility into business processes. Although they’ve listed my session as being for an IT audience, there will be plenty here for the business side as well.

IQPC BPM Summit: Kirk Gould

Kirk Gould, a performance consultant with Pinnacle West Capital, talked about business processes and metrics. I like his definition of a metric: “A tool created to tie the performance of the organization to the business objectives”, and he had lots of great advice about how to — and how not to — develop metrics that work for your company.

I came right off of my own presentation before this one, so I'm a bit too juiced up to focus on his presentation as well as it deserves. However, his slides are great and I'll be reviewing them later. He also has a good handout that takes us through the 10 steps of metric development:

  1. Plan
  2. Perform
  3. Capture
  4. Analyze
  5. Display
  6. Level
  7. Automate
  8. Adjust
  9. Manage
  10. Achieve

He has a great deal more detail for each of these steps, both on the handout and in his presentation. He discussed critical success factors and performance indicators, and how they fit into a metrics framework, but the best parts were when he described the ways in which you can screw up your metrics programs: there were a lot of sheepish chuckles and head-shaking around the room, so I know that many of these hit home.

He went through the stages of metrics maturity, which I’ll definitely have to check out later since he flew through the too-dense slides pretty quickly. He quotes the oft-used (and very true) line that “what gets measured, gets managed”, a concept that is at the heart of metrics.

Enterprise 2.0: Case Studies, Part I

Another panel, this one with moderator Brian Gillooly from Optimize, and including panelists Jordan Frank of Traction, Mark Mader of Smartsheet.com, Suresh Chandrasekaran of Denodo, Todd Berkowitz of NewsGator and David Carter of iUpload (which I understood was going to undergo a name change based on what their CEO John Bruce said last month at EnterpriseCamp in Toronto). Since these are all product companies, I expect that this might be a bit less compelling than the previous panel, which was primarily focused on two Enterprise 2.0 end-user organizations.

I’m not going to list the details of each vendor’s product; suffice it to say that Traction is an enterprise wiki platform (although there’s some blog type functionality in there too), Smartsheet.com is a spreadsheet-style project management application offered as a hosted service, Denodo does enterprise data mashups for business intelligence applications (now that’s kind of interesting), NewsGator is a well-known web feed aggregator and reader, and iUpload is a hosted enterprise social software service.

Mader had some interesting comments on how making updates to a schedule completely transparent means that no one wants to be the last one to add their part, since everyone will know that they were last; this, however, is not unique to Enterprise 2.0 functionality, but has been a well-known characteristic of any collaboration environment since Og was carving pictures of his kills on the community cave wall.

There was an interesting question about who, within an organization, is driving the Enterprise 2.0 technology adoption: although the CxO might be writing the cheque, it’s often corporate communications who’s pushing for it. In the last session, we saw that in one organization, it was pushed by HR, but I suspect that’s unusual.

Enterprise 2.0: Derek Burney

We heard from IBM, so it’s inevitable that we’re going to hear from Microsoft too: namely, Derek Burney, GM of the SharePoint Platform and Tools group. More stuff on how we’re in a new world of work, how technology is changing how and where we interact, but he also touches on the issues of the need to retain and share knowledge as the baby boomers start to retire, and what the incoming MySpace generation is going to demand in terms of functionality on enterprise platforms.

He covers the idea of a business productivity infrastructure consisting of unified communications, business intelligence, ECM, collaboration (including wikis and blogs) and enterprise search — amazingly, that’s exactly what SharePoint offers 😉 He mentions BPM peripherally as it relates to content approval, but doesn’t address it explicitly. He does mention “workflow”, but that’s really Microsoft’s view of workflow, which is more web service orchestration than what I think of as workflow.

He discusses all the people who might be involved in some way in your organization — employees, partners, customers, and non-affiliated community — and how to better allow collaboration between, not just within, these groups. He directly addressed the concern that many IT (and business) managers have about bringing blogs and wikis inside the corporation, namely a loss of productivity, by showing an example from within Microsoft of how wikis can actually improve productivity, but I think that the Razorfish intranet example that I saw at a recent conference is much more compelling.

The presentation dragged a bit towards the end: I was losing the thread as the slides blurred past, and the guy beside me appeared to nod off. I would like to review his slides if they’re available online; I think that there’s a great deal of good information in there, I just need to dig it out.

Shared Insights PCC: RSS in the Enterprise

My session on the changing face of BPM went pretty well, except for one guy who said that I was wrong about pretty much everything 🙂

Today finishes early, so I’m at the last breakout session, Colin White discussing using RSS in the enterprise, and the broader subject of using web syndication to deliver content to users. It’s a bit distracting because he has exactly the same English accent as someone on my wine club board; I keep looking up and expecting to see my friend Bernard (who doesn’t even know how to spell RSS) at the front of the room.

White is looking at this from an architectural rather than an implementation viewpoint, and focussing on enterprise rather than internet data sources, treating RSS as a standardized and lightweight XML-based integration protocol. He spent an undue amount of time explaining generically what RSS feeds are and how internet syndication works in various RSS readers; is there anyone in this fairly technical, portal-savvy audience who doesn’t already know all this? He then moved on to the differences between RSS and Atom and the specific tags used in an RSS feed; 30 minutes into the presentation, we still haven’t seen anything to do with RSS in the enterprise.

Eventually he does get to enterprise uses of RSS; no surprise, one big use is to have it integrated into a business portal, although the XML can also be consumed by other tools: search engines, or ETL processes that capture the data and load it into a data warehouse or content management system — something that I hadn’t thought about previously, but which can be done with tools like Microsoft Integration Services. He points out how RSS is one piece in the integration puzzle, which is essentially what I’ve been saying with respect to using RSS feeds of process execution data as one way of providing visibility into processes.
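
To make the ETL idea concrete, here's a minimal sketch of pulling a feed and loading its items into a relational store, using Python's feedparser library and SQLite; the feed URL and table layout are invented for illustration:

```python
# Minimal sketch of the feed-to-warehouse ETL idea: pull an RSS/Atom feed,
# then load its items into a relational store. The feed URL and table name
# are hypothetical examples.
import sqlite3
import feedparser  # pip install feedparser

FEED_URL = "http://intranet.example.com/processes/feed.xml"  # hypothetical

conn = sqlite3.connect("content_warehouse.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS feed_items (
        guid TEXT PRIMARY KEY,   -- feed item id, used to de-duplicate
        title TEXT,
        link TEXT,
        published TEXT,
        summary TEXT
    )
""")

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    # INSERT OR IGNORE makes the load idempotent across polling runs
    conn.execute(
        "INSERT OR IGNORE INTO feed_items VALUES (?, ?, ?, ?, ?)",
        (
            entry.get("id", entry.get("link")),
            entry.get("title", ""),
            entry.get("link", ""),
            entry.get("published", ""),
            entry.get("summary", ""),
        ),
    )
conn.commit()
```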

White covers the different types of feed servers: external, internal, and hosted SaaS. Interestingly, NewsGator is now in all three areas: in addition to their well-known external internet version, they have both an enterprise server and an on-demand solution that can aggregate and syndicate internal as well as external content. That gives a variety of ways that a feed server can fit into an enterprise environment: either an external feed server providing only the external feeds, or an internal/hosted feed server that can handle both internal and external feeds. The latter has the advantage of reducing network traffic, since the feed server caches the feeds, as well as providing filtering and monitoring of the content that is consumed.
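
The traffic reduction comes from HTTP conditional GETs: a caching feed server polls with the validators from its last fetch, and an unchanged feed comes back as a bodyless 304. A rough sketch, with a hypothetical feed URL:

```python
# Sketch of the bandwidth-saving behaviour a caching feed server relies on:
# an HTTP conditional GET returns 304 Not Modified (no body) when the feed
# hasn't changed since the last poll. The URL is a hypothetical example.
import requests  # pip install requests

FEED_URL = "http://example.com/feed.xml"  # hypothetical

etag = None
last_modified = None

def poll_feed():
    global etag, last_modified
    headers = {}
    if etag:
        headers["If-None-Match"] = etag
    if last_modified:
        headers["If-Modified-Since"] = last_modified

    resp = requests.get(FEED_URL, headers=headers, timeout=10)
    if resp.status_code == 304:
        return None  # unchanged: the server sent no feed body at all
    # Remember the validators for the next poll
    etag = resp.headers.get("ETag")
    last_modified = resp.headers.get("Last-Modified")
    return resp.text  # changed: parse and re-cache this content
```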

I’m really aware of a push to give PCC a very Enterprise 2.0 flavour; having not been at any of the previous PCC conferences, or even the first half of this one, I don’t know if this is a new bandwagon that they’re leaping on, or something that’s a logical progression of where this conference has been in the past.

BEAParticipate: BAM

Eduardo Chiocconi of BEA gave us a technical view of the ALBPM BAM functionality: what’s available out of the box, the extensions, how to create customized dashboards, security, and a bit of the architecture underlying it all so that we have a bit of an understanding of what happens in the underlying services and data stores when a custom key performance indicator (KPI) is defined.

Like every other BPM vendor’s BAM, ALBPM’s BAM is visualized as a set of dashboards that show KPIs for the purpose of monitoring the health of a process and early problem detection. There are some out-of-the-box dashboards including widgets such as gauges and charts attached to a data source, and the ability to create custom dashboards.

As we saw in the architectural view this morning, there’s a BAM database to collect and aggregate the analytics data from one or more process engines, plus external data sources if you want a combined view. There is a single BAM database for each directory service, and an updater service that executes regularly to pull data from the associated engine database(s) to the BAM database. Data in the BAM database is very granular — down to the second, if required — but is flushed as it ages, typically after a day. The OLAP data mart, which has the same data as the BAM database and is updated by the same service, is much less granular and is not automatically purged; this is used for historical analytics rather than the near-real-time requirements of the BAM database.
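
To make the updater’s pull/aggregate/purge cycle concrete, here’s a rough sketch in Python over SQLite; all table and column names are my own invention, not ALBPM’s actual schema:

```python
# Sketch of the pull/aggregate/purge cycle described above. All table and
# column names are invented for illustration; ALBPM's actual schema differs.
import sqlite3

def run_updater(conn: sqlite3.Connection) -> None:
    # 1. Roll granular BAM rows up into an hourly OLAP mart that is kept
    #    long-term for historical analysis. (A production updater would
    #    track which hours it has already rolled up.)
    conn.execute("""
        INSERT INTO olap_mart (hour, activity, instance_count, avg_cycle_secs)
        SELECT strftime('%Y-%m-%d %H:00', event_time) AS hour,
               activity,
               COUNT(*) AS instance_count,
               AVG(cycle_secs) AS avg_cycle_secs
        FROM bam_events
        GROUP BY hour, activity
    """)
    # 2. Flush granular rows once they age out (typically after a day),
    #    keeping the BAM database small enough for near-real-time queries.
    conn.execute("""
        DELETE FROM bam_events
        WHERE event_time < datetime('now', '-1 day')
    """)
    conn.commit()
```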

The out-of-the-box dashboards cover instance workload, percentage workload by organizational unit, and performance (e.g., end-to-end cycle time) with drill-downs to more granular levels; there’s also a unified dashboard with all three of these measures. Surprisingly, these widgets are not currently provided as standard portlets, but they can be wrapped into a portlet if required.

Most organizations will want to define their own KPIs and create their own dashboards: KPIs can be defined by a business analyst in the Designer as dimensions (e.g., time or geographic aggregation) or measures (e.g., averages and other statistical aggregations), and can be based on standard process variables or business variables. This causes a new column to be created in each of the three main BAM database tables to capture the necessary data for the three display widgets for that measure or dimension.
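
In other words, the KPI definition reshapes the schema. A hypothetical sketch of what that implies at the database level, with invented names (ALBPM’s real tables will differ):

```python
# Hypothetical sketch of what a KPI definition implies at the database
# level: a new column added to the BAM tables to capture the data, then
# aggregated (measure) and grouped (dimension) for the dashboard widgets.
import sqlite3

conn = sqlite3.connect("bam.db")

# Defining a KPI adds columns to capture its data as instances execute
conn.execute("ALTER TABLE bam_events ADD COLUMN order_value REAL")  # measure
conn.execute("ALTER TABLE bam_events ADD COLUMN region TEXT")       # dimension

# A dashboard widget then aggregates the measure over the dimension
rows = conn.execute("""
    SELECT region, AVG(order_value), COUNT(*)
    FROM bam_events
    GROUP BY region
""").fetchall()
```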

It’s also possible to specify the points in the process where the KPI data are captured and sent to the BAM database in addition to allowing the automatic update process to occur, giving it a sort of audit functionality. Internally, the BAM data are generated from the process engine’s audit trail, so you’ll have to have auditing enabled for all of the processes and events that you want to track in BAM (in many cases, you would turn off auditing for processes and events that don’t require it in order to improve performance).

ALBPM allows for role-based security access to the BAM dashboards, so that only specific roles can see them.

Future directions are to allow ad hoc dashboard creation and move to event-driven BAM, although that will require some architectural changes to the underlying database and services in order to handle the increased load that will result from allowing everyone to roll their own analytics.

The more I look at it, the less I’m convinced that all the BPM vendors should be developing their own BAM like this; I think that there could be a market for a BAM product that can connect to many different BPM products as soon as we get some standardization around the process engine audit trails that are typically used to populate BAM databases.
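
To make that concrete, here’s one way a vendor-neutral process audit event might look; the field names are my own invention, not any existing or proposed standard:

```python
# One way a vendor-neutral process audit event might look; the field names
# are my own invention, not an existing standard.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ProcessAuditEvent:
    engine_id: str        # which BPM engine emitted the event
    process_name: str     # process definition identifier
    instance_id: str      # process instance identifier
    activity: str         # step/activity within the process
    event_type: str       # e.g. "instance_started", "step_completed"
    timestamp: datetime   # when the event occurred
    performer: str = ""   # user or role, if applicable
    attributes: dict = field(default_factory=dict)  # engine-specific data

# A cross-vendor BAM product could then consume events in this shape from
# any engine whose audit trail is mapped onto the common schema.
```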

TUCON: Continuous Process Improvement Using iProcess Analytics

For the last breakout session of the day, Mark Elder of TIBCO talked about reporting and analytics with iProcess Analytics, their historical (rather than real-time) analytics product. The crowd’s thin this time of day, although I understand that the lobby bar is well-populated; this was the same timeslot that Tim and I had yesterday, but the attendees now have an additional day of conference fatigue. It also doesn’t help that the presentation PC acted up and we were 10 minutes late starting.

He looks at the second half of any business process implementation: after it’s up and running, you need to measure what’s going on, then feed that back to the design stage for process improvement. iProcess Analytics has a number of built-in metrics, then wizard interfaces to allow a business analyst to build new KPIs by specifying dimensions and filters, then create interactive reports. It’s even possible to set different threshold values for filtered subsets of data, such as setting different cycle-time goals for different geographic regions.
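
The per-subset thresholds are easy to picture: the same cycle-time measure gets compared against a different goal depending on the filter value. A trivial sketch, with invented regions, goals and data:

```python
# Sketch of the "different thresholds for filtered subsets" idea: flag
# process instances that exceed a cycle-time goal that varies by region.
# Regions, goals and data are invented for illustration.
CYCLE_TIME_GOALS_HOURS = {
    "north_america": 24,
    "europe": 36,
    "asia_pacific": 48,
}

instances = [
    {"id": "A-101", "region": "europe", "cycle_time_hours": 30},
    {"id": "A-102", "region": "north_america", "cycle_time_hours": 30},
]

for inst in instances:
    goal = CYCLE_TIME_GOALS_HOURS[inst["region"]]
    if inst["cycle_time_hours"] > goal:
        print(f"{inst['id']} missed the {goal}h goal for {inst['region']}")

# Only A-102 is flagged: 30 hours exceeds the 24-hour North American goal
# but not the 36-hour European one.
```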

He moved on to a live demo after a few minutes of slideware to show us just how easy it is to create a chart or report for a process, or even a single step within a process. The wizard looks pretty easy to use, although chart generation isn’t exactly rocket science. There are some nice report distribution capabilities, much like what you’d see in a business intelligence suite, so that you can share a chart or report with other members of your team. You can’t do a lot of calculations on the data, but you can export tabular data to Excel for further calculations and aggregation.

One very cool feature is that for a given set of data that’s being used to generate a report, you can reconstruct the process map from the report data to see where the data is coming from, since process metadata is passed over to iProcess Analytics along with the execution data.

It appears that if you want to get real historical data into Business Studio for simulation, you’re going to have to create a tabular report in iProcess Analytics, export it to Excel, then save it as a text file for importing. Not as integrated as I would have expected — this needs to be fixed as more people start to use the simulation functionality within Business Studio.

It’s all browser-based, and can generate either the interactive web-based reports that we saw, or static HTML or PDF reports. It will be interesting to see how TIBCO moves forward with their analytics strategy, since they now have both iProcess Analytics and iProcess Insight (BAM). Although historical analytics and BAM serve different purposes, they’re opposite ends of the same spectrum, and analytics requirements will continue to creep from both ends towards the middle. Like many other vendors who started with an historical analytics tool then bolted on an OEM BAM tool, they’re going to have to reconcile this at some point in the future. There’s also the question, which was raised by an audience member, about the boundary between iProcess Analytics and a full BI suite like Cognos. Although there’s a lot of nice functionality built into iProcess Analytics that’s specific to analyzing processes, many customers are going to want to go beyond this fairly rudimentary BI capability.

That’s the end of day 2 at TUCON; tomorrow morning I’ll probably only be able to catch one session before I have to leave for the flight home. Tonight, 750 of us are off to the SF Giants game, where we’ll see if Vivek Ranadivé’s throwing practice paid off when he throws out the first pitch. Watch for all of us in our spiffy new TIBCO jackets; with free wifi in the stadium, there are likely to be some geeks there with their laptops, too.

TUCON: Mark Elder and Venkat Swaminathan

I played hooky for a couple of sessions to go over my presentation for later today; as I mentioned earlier, TIBCO’s product base is broader than my interests, so there were a couple of natural dead spots in the schedule for me.

Just before my presentation with Tim Stephenson, I sat in on Mark Elder and Venkat Swaminathan, both from TIBCO, talking about BAM. Since this room (which is also where I present next) has crappy wifi reception, publication will be delayed until after I present, since it might be considered a bit cavalier to dash out between presentations just to post.

They spent some time at the beginning explaining what BAM is and why you need it; I’ll assume that most of you already know that stuff.

The interesting part (for me) is the specifics about TIBCO’s BAM product, iProcess Insight, which is a plug-in to the BusinessFactor framework to provide BAM capability for monitoring iProcess process flows. Like most of the other BAM products that I’ve seen, it allows the definition of industry-specific KPIs in the business processes, then provides real-time monitoring of those KPIs with drill-downs from the aggregate statistics to the details. You can also use BusinessFactor to integrate external data sources, like a customer database. There’s no shared process model between the BPM execution environment and BAM: the first step is to download process definitions from the iProcess Engine to create a project, and changes to the iProcess model require the model to be re-downloaded to the iProcess Insight project and the project manually updated to suit the updated process model. With all the round-tripping problems that we have already with process modelling in one environment and execution in another, I would have favoured a shared model approach.

Once you have your project defined, the BAM runtime sends information over to the process engine about what to monitor, and the engine sends back the relevant events to be aggregated, analyzed and presented within iProcess Insight.
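
That subscribe-then-push pattern is easy to sketch; this is just the general shape of the flow, not TIBCO’s actual API, and all names are invented:

```python
# Minimal sketch of the subscribe-then-push flow described above: the BAM
# side registers what it wants to monitor, and the engine pushes matching
# events back as process instances execute. All names are invented.
from collections import defaultdict

class ProcessEngine:
    def __init__(self):
        self.subscribers = defaultdict(list)  # event_type -> callbacks

    def subscribe(self, event_type, callback):
        # BAM runtime tells the engine what to monitor
        self.subscribers[event_type].append(callback)

    def emit(self, event_type, payload):
        # Engine pushes relevant events back to all interested subscribers
        for callback in self.subscribers[event_type]:
            callback(payload)

class BamRuntime:
    def __init__(self, engine):
        self.step_durations = []
        engine.subscribe("step_completed", self.on_step_completed)

    def on_step_completed(self, payload):
        # Aggregate incoming events for dashboard display
        self.step_durations.append(payload["duration_secs"])

engine = ProcessEngine()
bam = BamRuntime(engine)
engine.emit("step_completed", {"step": "approve_order", "duration_secs": 42})
print(sum(bam.step_durations) / len(bam.step_durations))  # 42.0
```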

You can define different dashboards for different parts of the process, with different KPIs visible in each dashboard. There are some standard dashboard views, but it’s pretty configurable/customizable for views such as balanced scorecards or even geographical overlays.

Looking at the components of iProcess Insight, there’s a wizard interface to initially create a BusinessFactor project that will become your BAM dashboard, a process monitor for the start/end of procedures, a step monitor for the start/end of steps and their deadlines, a resource monitor for user/group metrics, and a supervisor capsule to allow someone with the appropriate credentials to change a specific process instance.

We then looked at a comparison between iProcess Insight and iProcess Analytics: basically, Insight is near-real-time, event-driven operational BAM, whereas Analytics is historical analysis and reporting based on batch statistics export from the process engine. Many BPM vendors (especially the more mature ones) end up with this same split of functionality, since they tend to have first built the analytics years ago when they built the process execution engine, then OEM’d or bought a BAM engine and strapped it on the side within the last year or two.

Based on the audience questions, and some earlier observations, I’m starting to get the idea that TIBCO’s “user” base is pretty technical, with not much representation from the business side of organizations. Given that many of their products are development and low-level integration tools, this isn’t surprising overall, but I expected a few more non-geeks in the BPM sessions. If this is any indication of who’s using TIBCO within customer organizations, TIBCO needs to focus more on the business side of their customers to really play in the BPM space.