BrainStorm BPM Day 1: Peter Gilbertson

Peter Gilbertson is a senior business analyst at CUNA Mutual, and his session was on Talking Process to Executives: Process, Tools and Techniques to Sell your BPM Project, which is about how to “sell” your BPM project to the people who write the cheques. Although he’s looking at it as an internal person selling to his own executives, I’m often in the position of helping people like him create exactly this sort of rationale to carry up the food chain, and I thought that it would be worthwhile to hear what they did.

He started by talking about their business review process, which is not really specific to BPM projects:

  • Build the business model, especially important for communication and education because they had a whole crop of new executives from the insurance industry, whereas CUNA deals with both insurance and investment products.
  • Determine industry benchmarks, by identifying key “success levers” and comparing themselves with their competitors.
  • Document core processes, which is where a lot of people start; Gilbertson felt that if they hadn’t done the first two steps, then the core processes wouldn’t have made sense to the executives.
  • Develop performance measures, but took the usual tabular data views of things such as customer service metrics and converted them to graphs to make the business health and trends more obvious.
  • Determine gaps, which they did using a pipe analogy: there were spots where the pipe leaked, or wasn’t big enough, or didn’t fit with other pipes. I liked the analogy, and Gilbertson said that it was immediately obvious to the executives and allowed them to move easily into process improvement specifics.
  • Develop strategy, again using an analogy: “shoot the moon” with a space shuttle and booster rocket visual. In order to show how the decisions were derived, they also created a decision tree to show the path to the strategy recommendations.
  • Create plan, which is the detailed execution plan and timelines.

He summed it up in his top 10 tips:

  • Start at 10,000 feet and parachute down
  • Show “industry standard” business models
  • Focus on (and show) how the customer gets served
  • Explain how you make money
  • Walk through high level processes
  • Pinpoint areas for improvements
  • Use visuals (e.g., graphs, charts, pictures)
  • Use consultants or industry experts to build credibility [I liked this one 🙂 ]
  • Create a detailed plan to execute project
  • Continually communicate, educate and position the project

This was really about how to present the material to executives, with very little on how to create the material, but there were some good points: use visual analogies where appropriate, feed them small bits of information at a time, and show why and how the decisions are being made.

BrainStorm BPM Day 1: BPMS vendor panel

Bruce and I switched between the same two sessions, although he was leading both sessions and I was sitting back blogging. This one is a panel of four BPMS vendors (Global 360, IBM, Savvion and BEA) to discuss What’s Next for BPM Suites. I arrived a few minutes late and didn’t catch all the names, although I recognized Pat Morrissey of Savvion. This was my first tough decision of the day, since I also wanted to attend the Business Process versus Business Rules panel, especially considering that the panel that I’m sitting in on could turn out to be just a regurgitation of the vendors’ marketing materials. There’s always the chance, however, that one of them will blurt out something unexpected.

The word “mashups” just came out of the IBM person’s mouth, in response to a question about collaboration; this has nothing to do with their BPM offering, of course, which doesn’t really have any collaborative aspects (if you discount the FileNet Team Collaboration Manager product, which she likely hasn’t even heard of), but they are offering a mashup tool, so I suppose that’s what was triggered when she heard “collaboration”. She’s not the only vendor to drag the conversation off topic and down the rathole of their own product’s functionality, but she’s definitely the most effective at it.

There was a bit of a discussion about Web 2.0 at the end of the session, with the BEA person making a key point that one of the big issues is that Web 2.0 consumer applications are changing user expectations, which in turn drives the inclusion of social networking features into BPMSs — a point that I’ve made here before and will be discussing at the Shared Insights conference next month.

Bruce finished up by asking each vendor to summarize what they thought was the next thing in BPMS:

  • Global 360: “process intelligence”, which appears to be their latest marketing buzzphrase
  • IBM: “frameworks for implementing BPMS”, since they like to sell frameworks plus lots and lots of professional services to make them work
  • Savvion: “process management that actually works” (on a global scale, as opposed to always being the next big idea or in a pilot stage)
  • BEA: “convergence of BPM and SOA” and “better collaboration”

All in all, I didn’t learn much in this session about what’s next for BPM suites, and it was much too polite: the vendors only very gently disputed each other’s opinions, and in fact rarely even directly addressed each other. It did give me a chance to get my queued blog posts published, and now I’m all caught up and ready for lunch.

BrainStorm BPM Day 1: Bruce Silver track keynote

There’s an awful lot of keynotes in this conference: a couple of overall sessions this morning, now “track keynotes” for each of the four tracks within the BPM conference. I’m in Bruce Silver’s New Directions in BPM Tools and Technology session, where he started by taking a gentle poke at Gartner, saying that BPM is more than a management discipline (Gartner’s most recent definition of BPM).

He started out discussing process modelling, and how it’s inherently a business activity, not an IT activity, which speaks directly to the issue of the tools used for modelling: is there a handoff from a modelling-only tool to an execution environment at the point of business to IT handoff, or is the model actually just a business view of the actual implementation? With all of the vendor demos that I’ve done lately (I know, I have yet to document many of them here, but I’m getting to it), I’ve had quite a focus on the distinction between having a model shared between business and IT, and having a separate BPA tool that models much more than just the processes that will be implemented in a BPMS. Bruce positions this as “BPA versus BPMN” views towards describing process modelling, and doesn’t see them in conflict; in fact, he thinks that they’re ignoring each other, a viewpoint that I’d have to agree with given that BPA initiatives rarely result in any processes being transferred to some sort of execution engine.

Bruce, who often accuses me of being too nice, takes a stab at the vendors in a couple of areas. First is with their BPMN implementations, specifically that of events: he states that many of the execution engines just don’t support intermediate events, so the vendors conveniently forget to include those events in their BPMN modelling tool. Second is with simulation, and looking at whether a vendor’s implementation is actually a useful tool, or a “fake” feature that’s there to enable it to be checked off on an RFP, but not functional enough to make it worth using.

He has a nice way of categorizing BPMS products: by vendor speciality (e.g., integration, human-centric), by process type/use case (e.g., production workflow) and by business/IT interaction method (collaborative shared model versus handoff). This was interesting, because I wrote almost identical words two days ago in my presentation for the Shared Insights Portals and Collaboration conference that I’ll be speaking at next month; great minds must think alike. 🙂  His point, like the one that I was making in my presentation, is that most BPM products have some strengths and some weaknesses that can make or break some process automation; for example, a product focussed on human-centric workflow probably doesn’t do some nice integration tricks like mapping and transformation, or complex data objects.

He also makes a good distinction between business rules (implemented in a BRE) and routing rules (implemented in a BPMS): business rules represent corporate or departmental policies that may need to be shared across business processes, whereas routing rules are the internal logic within a process that’s just required to get through the process but don’t represent policy in any way.
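Bruce’s distinction can be sketched in code (a hypothetical illustration in Python, not any particular BRE or BPMS, with made-up rule names and data): the business rule encodes a reusable policy that lives outside the process, while the routing rule is just the gateway logic that decides the next step inside one process.

```python
# Business rule: a corporate policy, reusable across many processes.
# In practice this would live in a BRE so every process shares one definition.
def requires_manager_approval(claim_amount: float) -> bool:
    """Hypothetical corporate policy: claims over $10,000 need manager sign-off."""
    return claim_amount > 10_000

# Routing rule: internal process logic, meaningful only inside this one process.
# In practice this would be a gateway condition inside the BPMS process model.
def next_step(claim: dict) -> str:
    if requires_manager_approval(claim["amount"]):  # delegate the policy decision
        return "manager_approval"
    return "auto_adjudication"

print(next_step({"amount": 15_000}))  # manager_approval
print(next_step({"amount": 500}))     # auto_adjudication
```

The point of the separation is that when the policy threshold changes, it changes once in the rules engine rather than in every process model that happens to reference it.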

Bruce thinks that BPM and SOA together is still vapour-ware for the most part: it’s what the vendors are selling but not typically what they’re delivering. In particular, he thinks that if the BPMS and the ESB are not from the same vendor, then “all bets are off” in terms of whether a BPMS will work with any particular ESB or other services environment.

The session turned out to be too short and Bruce couldn’t even finish his materials, much less take questions: it was only 45 minutes to begin with, and shortened at the beginning while Bruce waited for stragglers from the previous session to make their way upstairs.

BrainStorm BPM Day 1: Brett Champlin keynote

Since I arrived late, my day started with Brett Champlin’s keynote, BPM Triage — A Health and Wellness Model for Enterprise Business Processes. I know Brett through ABPMP.

Brett uses a healthcare analogy for applying process management to your business, and I found that the analogy stretched a bit thin, although it had some interesting points. The fact that he came out dressed in a white lab coat with a stethoscope around his neck made it a bit hokey, too.

He had a couple of interesting diagrams, one in particular of a business process architecture, but it went by too quickly to sketch out and I can’t seem to find the presentations in any of the handout materials.

In general, however, I’d have to say that if you have to do your presentation in costume, it could be that it’s lacking in some other way.

Convergence of BPM and BI

We’re 19 minutes into a webinar on “Adding Process Context to BI for Process Intelligence” that is supposed to be featuring Colin Teubner of Forrester, and the sponsor (Global 360) is still talking. Even worse, I’m not completely clear on how Global 360’s new “Business Process Intelligence” initiative is really any different from anyone else’s simulation, analytics and performance management offerings.

Colin did eventually get the floor, and talked about how BPM and BI are converging: at the basic level of implementation, they’re quite distinct, but in advanced implementations, they’re quite tightly intertwined. He spoke about the distinction between data-driven BI and process-centric BI, and how the latter (usually available as part of a BPMS) is sensitive to changes in a process and can self-adjust, hence providing better information about business processes. Colin is pushing the idea that BI and BPM will eventually merge into a single product class, which I’m not sure that I agree with: I think that there are a lot of valid data-driven applications for BI that aren’t, strictly speaking, process analytics. It is true, however, that better BI needs to be integrated more closely with BPM, beyond the relatively simplistic BAM capabilities that are available out of the box.
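To make the “process context” idea concrete, here’s a minimal sketch (hypothetical event data in Python, not any vendor’s API): plain data-driven BI aggregates values, while process-centric BI ties those values to process steps, so metrics like per-step cycle time fall out naturally from a process event log.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical process event log: (case id, step reached, timestamp)
events = [
    ("case-1", "triage",   datetime(2007, 4, 2, 9, 0)),
    ("case-1", "process",  datetime(2007, 4, 2, 9, 10)),
    ("case-1", "complete", datetime(2007, 4, 2, 9, 25)),
    ("case-2", "triage",   datetime(2007, 4, 2, 9, 5)),
    ("case-2", "process",  datetime(2007, 4, 2, 10, 35)),
    ("case-2", "complete", datetime(2007, 4, 2, 10, 50)),
]

# Group events by case, then measure the time spent in each step
# as the gap between consecutive events for that case.
by_case = defaultdict(list)
for case, step, ts in events:
    by_case[case].append((step, ts))

durations = defaultdict(list)
for steps in by_case.values():
    steps.sort(key=lambda s: s[1])
    for (step, start), (_, end) in zip(steps, steps[1:]):
        durations[step].append((end - start).total_seconds() / 60)

for step, mins in durations.items():
    print(f"{step}: avg {sum(mins) / len(mins):.0f} min")
```

A data-driven BI tool could compute the same averages, but only if someone hard-codes the step sequence into the report; a process-centric tool gets the sequence from the process model itself, which is what lets it adjust when the process changes.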

The webinar was run by Shared Insights, but should be available for replay somewhere via the Global 360 website.

Software AG acquires webMethods

Consolidation in the industry keeps grinding on, with Software AG announcing that they will acquire webMethods for $546M. Last year, when I interviewed webMethods’ EVP of Product Development, I wrote that their new BPM launch placed them squarely in competition with IBM and TIBCO. Not surprisingly, today’s press release states:

The combined company will create a market leader in the software industry, specifically in the fast growing services oriented architecture, business process management and software application integration markets – just behind IBM and Tibco.

I guess we know who the competition is…

Rewarding your customers for doing your job

I had a great experience at the Canadian passport office recently — words that are not often spoken in the same sentence — due to a bit of imaging and BPM technology that’s been possible for a long time, but is only just starting to be used in many industries.

My old passport was due to expire in July, and because many countries start to get antsy when your passport has less than six months to expiry, I decided to renew now while I had a 4-week window of no travel. I went online to Passport Canada and used the Passport Online application instead of downloading and filling out a PDF version on paper, and when I printed the final result, I noticed that it had a barcode embedded on each page of the form. I got all the necessary signatures and my photos, and headed for the passport office, taking along a book, my iPod, my Blackberry, a bottle of water, a snack and a fold-out bed. Okay, I’m kidding about the bed, but I was expecting a long wait — the downtown Toronto passport office is as crowded as an American Idol talent call.

The first step is triage: you wait in a relatively short line for someone to check over all your documents and make sure that you have everything and it’s all filled out properly, so that you don’t wait for an hour only to find out that you’re missing some vital bit. You wouldn’t think this would be necessary for responsible adults doing something as critical as getting a passport, but the person in front of me was missing half the information and had forgotten to have the guarantor sign the documents. You are then handed a number that puts you in the waiting queue for an available clerk to process your form, and I heard the triage clerk tell people that the wait was around an hour and a half.

I arrive at the triage desk, everything is in order, and the clerk says “you get a different number because you filled out the forms online” and hands me a numbered ticket prefaced with an “F”. I look up, and sure enough, there are “A” numbers, “B” numbers and “F” numbers on the call board. And I’m next in line in the “F” queue. Woohoo! Ten minutes later, I’m at the desk having my application processed, and less than two weeks later, my new passport is in my hands.

So what went on behind the scenes? I’m not privy to the internal workings of Passport Canada, but here’s my educated guess:

  • I filled out the form online, and the data went directly into their passport application database. In other words, I did their data entry for them.
  • They generated a unique application number, and printed that in barcode format on each page of the completed application that I printed to allow for automated matching with the data later in the process.
  • After I submitted my form at the passport office, the paperwork was scanned and the barcode used to automatically match it with the data that I had entered previously — much faster and more accurate than attempting to perform OCR on the form data itself. The scanning likely kicked off a process in a BPM system, or caused it to rendezvous with an existing process that had been started by the online data entry step.
  • At the next step in the process, someone viewed a scanned image of the sections of the document filled in by hand (my signature and that of my guarantor) and checked over all pages to make sure that I hadn’t made any hand annotations or changes.
  • Some applications — either triggered by certain data on the application, input from the initial reviewer or just a percentage of the total volume — would have the references and guarantor information checked via telephone calls, so would be passed to a step in the process prompting someone to make those calls. I didn’t ask my references and guarantor if they had been called, but as a professional engineer I often am a guarantor on friends’ passport applications, and I am usually called by Passport Canada during the process.
  • If all the data is verified and all the checks are passed, the application is approved, which would trigger the actual printing and mailing of the passport.
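Assuming my educated guess above is roughly right, the matching step might look something like this sketch (all names, IDs and functions are hypothetical illustrations, not Passport Canada’s actual system): the barcode on the scanned page recovers the application ID, which either rendezvouses with the process instance started by the online submission or kicks off a new one.

```python
# Hypothetical sketch of barcode-based matching of scanned paper to online data.
applications = {}   # application id -> data entered online by the applicant
processes = {}      # application id -> process instance state

def submit_online(app_id: str, form_data: dict) -> None:
    """Online form submission: the applicant does the data entry."""
    applications[app_id] = form_data
    processes[app_id] = {"state": "awaiting_paper"}  # process starts here

def scan_paper(barcode: str) -> None:
    """Scanning the printed form: the barcode carries the application id."""
    app_id = barcode  # in reality, decoded from the barcode symbology
    if app_id in processes:
        # Rendezvous with the existing process started by the online submission.
        processes[app_id]["state"] = "ready_for_review"
    else:
        # No matching online data: fall back to manual data entry.
        processes[app_id] = {"state": "manual_data_entry"}

submit_online("APP-12345", {"name": "J. Doe"})
scan_paper("APP-12345")
print(processes["APP-12345"]["state"])  # ready_for_review
```

The key design point is that the barcode makes the match deterministic: no OCR of handwriting, no fuzzy name matching, just a key lookup.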

My reward for making their job easier was to get into a fast-track line at the passport office, which greatly reduced my wait time, and possibly a faster end-to-end time since I received the passport several days before I had expected. This reward is key, because it completely motivates me to do it again, and tell all my friends about it — although if everyone did this, then the fast line would have everyone in it and become the slow line.

This is similar to what some airlines are doing with online check-in: if you check in online and print your own boarding pass, so that you just have to drop off checked baggage, they provide a separate line just for those who checked in online. Of course, I showed up at the airport one morning expecting a short line for dropping off my bag, and it turned out that everyone had used online check-in that morning, but it’s still a motivator.

This looks a bit like an old-fashioned imaging and workflow application, but it’s more than that: they’re integrating a web application, one or more back-end data systems, a content management system, some sort of BPM or workflow, and possibly even the passport production system itself. Furthermore, they’re changing their customer service model to motivate people to use this method, since it not only makes less work for Passport Canada, but significantly improves the speed of the process for the customer. It’s not just about the technology, it’s about how you can use that technology to make your customers’ lives easier, not just your own.

This sort of lesson seems to need re-learning every few years: if you automate a customer-facing business process to allow self-service, then you absolutely can’t make it more expensive or time-consuming for the customer, or you’ll have no one using it. If you actually make it cheaper or take less time than the non-automated service, like Passport Canada did, then you’ll have customers jumping on board faster than you ever dreamed possible.

Blueprint upgrade

I finally received my Blueprint beta account on the weekend, although I haven’t had time to do much more than sign in. There was an interactive webinar today for about 20 beta testers to see the new features in this release, and to hear a few things about what will be in the GA release on April 30th.

New in this release:

  • To do list on the home page.
  • Some easier-to-use UI controls.
  • Enhancements to collaboration to allow you to see who’s viewing a project right now and who’s online (via Google Talk), and provide the ability to easily invite new collaborators.
  • In the process view, there is a wiki-type view to add documentation to a process, which will appear in the PowerPoint presentation that you generate from this process.
  • The process outline now generates a more detailed BPMN model of the process, although it looks like it doesn’t support some of the more complex BPMN structures such as transactions (I’ll check this out in the product myself soon).
  • You can view differences between versions of a process.

In the GA release, they’ll be adding BPDM export so that the processes modelled here can be imported into a BPM execution environment such as TeamWorks, but no round-tripping from TeamWorks back into Blueprint until the next version of TeamWorks is released in May. I’m not sure if that means that Blueprint can’t import BPDM or if TW just can’t export it, but I think that they’re still working out how to do the round-tripping without losing any information that might be added in the TW environment.

I asked about a shared process repository between Blueprint and TeamWorks, and it sounds like it’s something that they’re thinking about or working on, but no definite dates. Ideally (in my mind), there should be an option for a shared model so that there’s no round-tripping at all, but that Blueprint and TW just provide different views on the same model.

I also asked about support for other IM clients besides Google Talk (since Skype is my fave): they’re looking at alternatives, and suggested that I throw my suggestion in via the feedback functionality within the product. I guess that I really need to get on and start playing around with it soon 🙂

EMC/Documentum’s first steps in BPM

It’s no coincidence that you find EMC’s BPM offering under the Content Management menu from their home page: for years, EMC/Documentum have focused on content management, with process management a distant second concern, typically used for routing documents through a creation/approval/publishing cycle. Now, however, they’re finally following the FileNet model and attempting to break out from ECM with a standalone BPM product.

They refer to the Documentum Process Suite as a collection of tools, and that starts right at design time: a business analyst uses the Process Analyzer (a Java application) to model the process using a non-BPMN, flowchart-like graphical notation. Then they manually export the process as XPDL, and it’s manually imported into the Business Process Manager (a desktop application) where the process appears as not-quite-BPMN — they referred to it as “BPMN in spirit”, which caused me to have to mute my phone temporarily — so that a developer can make it into an executable process by hooking up web services, exposing processes as web services, and hooking up custom UI applications.

Although I haven’t seen it, they apparently have a BPMN modeler available for free on their site that can also be imported into the Business Process Manager, plus a Visio Interpreter product (similar to the Zynium product) for remapping existing Visio diagrams into XPDL for import. To make things even more confusing, they also have a web-based Process Navigator that provides a read-only view of the process models that looks similar to the Process Analyzer, but doesn’t allow you to do anything with the models.

In other words, they provide three tools for a business analyst to model processes, all of which require exporting to XPDL for import into the Business Process Manager. The problem with this, as we all know, is with round-tripping: any round-tripping that requires manual exporting and importing, even if it is technically possible, is unlikely to actually occur. This greatly reduces process agility, since it eventually comes back to the same old game of the business analysts creating their requirements in a different tool than IT, then throwing it over the wall and hoping for the best. When I asked about a common model, the response was that that functionality is on their roadmap, and they might do it through Eclipse perspectives, which makes me think that it’s still a long way off. They also support BPEL import/export, although we didn’t discuss that in detail; I imagine that it’s similar to the XPDL usage.

As an ECM vendor, they do have some advantages when it comes to integrating with some of their other product functionality: process models are versioned, and eRoom can be used for collaboration during process modeling, with the eRoom participants viewing the process models via the read-only Process Navigator. However, there are some things that are missing or just don’t hang together: for example, a BAM dashboard doesn’t exist out of the box, although they demo’d one built in the BEA WebLogic portal environment using widgets from the Proactivity BAM product that EMC acquired last year and a report builder within Process Analyzer. An out of the box dashboard? Slated for a future version. The out of the box “webtop” user interface for those participating in a process is neither sophisticated nor particularly configurable, and it’s Documentum’s expectation that custom applications will be written for most process work.

Both portals and user interfaces are something that many companies want to have full control over, and may end up rewriting themselves anyway, but you can’t get something simple up and running if you don’t at least have a good set of the basic functions available: you’ll be off on the year+ development cycle in order to get the first version into production, which just doesn’t cut it these days.

They have Corticon pretty deeply embedded within the product for rules processing, although they can also support ILOG. No support yet for other rules engines, except via web services.

I briefly saw their simulation offering using preset values, although it apparently can also be driven by historical process data.

All in all, EMC/Documentum is far behind in the BPM field, and at this time is likely only attractive to current Documentum customers who are given a sweet deal on licensing. Since most of the major BPM vendors play well with Documentum content management however, even that might not be enough to make it worthwhile.