Blogging from BrainStorm BPM

The hotel wifi just kicked in, so I have a couple of posts queued up. I’m at the BrainStorm BPM conference in Chicago, and just flew in this morning so missed the initial keynotes (Enabling the Process-Centric Agile Enterprise and Engineering the Process-Centric Enterprise), but will be here the rest of today and most of tomorrow. One of the complaints that I have about this conference is that there are just too many tracks: they’re running six sessions simultaneously, two of them forming the BPM conference (BPM for business professionals and BPM for technology professionals), plus an organizational performance symposium, a business rules symposium, an SOA conference, and a business architecture conference. At any given time, there are at least two of the sessions that I want to attend, sometimes three, and the choices are going to be difficult. Unfortunately, the conference program doesn’t help, since it splits the session descriptions into five sections (the two BPM tracks are combined in the program), so I have to flip around all over the program to compare concurrent sessions. There are also some organizational issues, and I’m not just talking about the fact that they didn’t have a name badge for me when I showed up: the sessions are spread out over two floors of meeting rooms at the Drake Hotel, and there is zero time in the schedule between many of the sessions. Teleportation was not included.

Convergence of BPM and BI

We’re 19 minutes into a webinar on “Adding Process Context to BI for Process Intelligence” that is supposed to be featuring Colin Teubner of Forrester, and the sponsor (Global 360) is still talking. Even worse, I’m not completely clear on how Global 360’s new “Business Process Intelligence” initiative is really any different from anyone else’s simulation, analytics and performance management offerings.

Colin did eventually get the floor, and talked about how BPM and BI are converging: at the basic level of implementation, they’re quite distinct, but in advanced implementations, they’re quite tightly intertwined. He spoke about the distinction between data-driven BI and process-centric BI, and how the latter (usually available as part of a BPMS) is sensitive to changes in a process and can self-adjust, hence providing better information about business processes. Colin is pushing the idea that BI and BPM will eventually merge into a single product class, which I’m not sure that I agree with: I think that there are a lot of valid data-driven applications for BI that aren’t, strictly speaking, process analytics. It is true, however, that better BI needs to be integrated more closely with BPM, beyond the relatively simplistic BAM capabilities that are available out of the box.
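
To make the data-driven versus process-centric distinction a bit more concrete, here’s a minimal sketch of the sort of metric that process-centric analytics produces: per-activity cycle times computed from a BPMS audit trail. This is purely my own illustration with invented event data, not tied to Global 360 or any other product:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical BPMS audit-trail events: (instance id, activity, start, end)
events = [
    ("case-001", "Review application", "2007-04-02T09:00", "2007-04-02T09:40"),
    ("case-001", "Approve",            "2007-04-02T10:15", "2007-04-02T10:20"),
    ("case-002", "Review application", "2007-04-02T09:30", "2007-04-02T11:10"),
]

def cycle_times_by_activity(events):
    """Average working time (in minutes) per activity: a basic process-centric metric."""
    durations = defaultdict(list)
    for _, activity, start, end in events:
        minutes = (datetime.fromisoformat(end) - datetime.fromisoformat(start)).total_seconds() / 60
        durations[activity].append(minutes)
    return {activity: sum(d) / len(d) for activity, d in durations.items()}

print(cycle_times_by_activity(events))
# {'Review application': 70.0, 'Approve': 5.0}
```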

The webinar was run by Shared Insights, but should be available for replay somewhere via the Global 360 website.

Software AG acquires webMethods

Consolidation in the industry keeps grinding on, with Software AG announcing that they will acquire webMethods for $546M. Last year, when I interviewed webMethods’ EVP of Product Development, I wrote that their new BPM launch placed them squarely in competition with IBM and TIBCO. Not surprisingly, today’s press release states:

The combined company will create a market leader in the software industry, specifically in the fast growing services oriented architecture, business process management and software application integration markets – just behind IBM and Tibco.

I guess we know who the competition is…

Rewarding your customers for doing your job

I had a great experience at the Canadian passport office recently — words that are not often spoken in the same sentence — due to a bit of imaging and BPM technology that’s been possible for a long time, but is only just starting to be used in many industries.

My old passport was due to expire in July, and because many countries start to get antsy when your passport has less than six months to expiry, I decided to renew now while I had a 4-week window of no travel. I went online to Passport Canada and used the Passport Online application instead of downloading and filling out a PDF version on paper, and when I printed the final result, I noticed that it had a barcode embedded on each page of the form. I got all the necessary signatures and my photos, and headed for the passport office, taking along a book, my iPod, my Blackberry, a bottle of water, a snack and a fold-out bed. Okay, I’m kidding about the bed, but I was expecting a long wait — the downtown Toronto passport office is as crowded as an American Idol talent call.

The first step is triage: you wait in a relatively short line for someone to check over all your documents and make sure that you have everything and it’s all filled out properly, so that you don’t wait for an hour only to find out that you’re missing some vital bit. You wouldn’t think this would be necessary for responsible adults doing something as critical as getting a passport, but the person in front of me was missing half the information and had forgotten to have the guarantor sign the documents. You are then handed a number that puts you in the waiting queue for an available clerk to process your form, and I heard the triage clerk tell people that the wait was around an hour and a half.

I arrive at the triage desk, everything is in order, and the clerk says “you get a different number because you filled out the forms online” and hands me a numbered ticket prefaced with an “F”. I look up, and sure enough, there are “A” numbers, “B” numbers and “F” numbers on the call board. And I’m next in line in the “F” queue. Woohoo! Ten minutes later, I’m at the desk having my application processed, and less than two weeks later, my new passport is in my hands.

So what went on behind the scenes? I’m not privy to the internal workings of Passport Canada, but here’s my educated guess:

  • I filled out the form online, and the data went directly into their passport application database. In other words, I did their data entry for them.
  • They generated a unique application number, and printed that in barcode format on each page of the completed application that I printed to allow for automated matching with the data later in the process.
  • After I submitted my form at the passport office, the paperwork was scanned and the barcode used to automatically match it with the data that I had entered previously — much faster and more accurate than attempting to perform OCR on the form data itself. The scanning likely kicked off a process in a BPM system, or caused it to rendezvous with an existing process that had been started by the online data entry step (see the sketch after this list).
  • At the next step in the process, someone viewed a scanned image of the sections of the document filled in by hand (my signature and that of my guarantor) and checked over all pages to make sure that I hadn’t made any hand annotations or changes.
  • Some applications — either triggered by certain data on the application, input from the initial reviewer or just a percentage of the total volume — would have the references and guarantor information checked via telephone calls, so would be passed to a step in the process prompting someone to make those calls. I didn’t ask my references and guarantor if they had been called, but as a professional engineer I often am a guarantor on friends’ passport applications, and I am usually called by Passport Canada during the process.
  • If all the data is verified and all the checks are passed, the application is approved, which would trigger the actual printing and mailing of the passport.
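
Here’s a rough sketch of what that barcode-matching and process rendezvous step might look like. To be clear, this is pure speculation on my part: all of the data structures and function names are invented, not anything that Passport Canada has published:

```python
# Speculative sketch: scanned pages carry the application number as a barcode,
# so the system can match the paper with the data the applicant entered online,
# then start a process (or rendezvous with one already created by the web form).

applications = {  # populated when the applicant submits the online form
    "PC-2007-0012345": {"name": "S. Kemsley", "status": "submitted online"},
}
process_instances = {}  # application number -> process instance state

def handle_scanned_document(barcode: str, image_ref: str) -> str:
    """Match a scanned application to its online data and advance the process."""
    data = applications.get(barcode)
    if data is None:
        return f"{barcode}: no matching online application; route to manual data entry"
    data["scanned_image"] = image_ref
    # Rendezvous with an existing instance, or create one if the scan arrives first
    instance = process_instances.setdefault(barcode, {"step": "awaiting paperwork"})
    instance["step"] = "signature review"  # next: a person checks the handwritten parts
    return f"{barcode}: matched to online data, advanced to '{instance['step']}'"

print(handle_scanned_document("PC-2007-0012345", "scan-0001.tif"))
```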

My reward for making their job easier was to get into a fast-track line at the passport office, which greatly reduced my wait time, and possibly shortened the end-to-end time as well, since I received the passport several days sooner than I had expected. This reward is key, because it completely motivates me to do it again, and to tell all my friends about it — although if everyone did this, then the fast line would have everyone in it and become the slow line.

This is similar to what some airlines are doing with online check-in: if you check in online and print your own boarding pass, then you only have to drop off your checked baggage, and they provide a separate line just for those who checked in online. Of course, I showed up at the airport one morning expecting a short line for dropping off my bag, and it turned out that everyone had used online check-in that morning, but it’s still a motivator.

This looks a bit like an old-fashioned imaging and workflow application, but it’s more than that: they’re integrating a web application, one or more back-end data systems, a content management system, some sort of BPM or workflow, and possibly even the passport production system itself. Furthermore, they’re changing their customer service model to motivate people to use this method, since it not only means less work for Passport Canada, but also significantly speeds up the process for the customer. It’s not just about the technology, it’s about how you can use that technology to make your customers’ lives easier, not just your own.

This sort of lesson seems to need re-learning every few years: if you automate a customer-facing business process to allow self-service, then you absolutely can’t make it more expensive or time-consuming for the customer, or you’ll have no one using it. If you actually make it cheaper or take less time than the non-automated service, like Passport Canada did, then you’ll have customers jumping on board faster than you ever dreamed possible.

Blueprint upgrade

I finally received my Blueprint beta account on the weekend, although I haven’t had time to do much more than sign in. There was an interactive webinar today for about 20 beta testers to see the new features in this release, and to hear a few things about what will be in the GA release on April 30th.

New in this release:

  • To do list on the home page.
  • Some easier-to-use UI controls.
  • Enhancements to collaboration to allow you to see who’s viewing a project right now and who’s online (via Google Talk), and provide the ability to easily invite new collaborators.
  • In the process view, there is a wiki-type view to add documentation to a process, which will appear in the PowerPoint presentation that you generate from this process.
  • The process outline now generates a more detailed BPMN model of the process, although it looks like it doesn’t support some of the more complex BPMN structures such as transactions (I’ll check this out in the product myself soon).
  • You can view differences between versions of a process.

In the GA release, they’ll be adding BPDM export so that the processes modelled here can be imported into a BPM execution environment such as TeamWorks, but there will be no round-tripping from TeamWorks back into Blueprint until the next version of TeamWorks is released in May. I’m not sure if that means that Blueprint can’t import BPDM or if TW just can’t export it yet, but I think that they’re still working out how to do the round-tripping without losing any information that might be added in the TW environment.

I asked about a shared process repository between Blueprint and TeamWorks, and it sounds like it’s something that they’re thinking about or working on, but with no definite dates. Ideally (in my mind), there should be an option for a shared model so that there’s no round-tripping at all, with Blueprint and TW just providing different views on the same model.

I also asked about support for other IM clients besides Google Talk (since Skype is my fave): they’re looking at alternatives, and suggested that I throw my suggestion in via the feedback functionality within the product. I guess that I really need to get on and start playing around with it soon 🙂

EMC/Documentum’s first steps in BPM

It’s no coincidence that you find EMC’s BPM offering under the Content Management menu from their home page: for years, EMC/Documentum have focused on content management, with process management a distant second concern, typically used for routing documents through a creation/approval/publishing cycle. Now, however, they’re finally following the FileNet model and attempting to break out from ECM with a standalone BPM product.

(Screenshot: Process Analyzer, process closeup)

They refer to the Documentum Process Suite as a collection of tools, and that starts right at design time: a business analyst uses the Process Analyzer (a Java application) to model the process using a non-BPMN, flowchart-like graphical notation. Then, they manually export the process as XPDL, and it’s manually imported into the Business Process Manager (a desktop application), where the process appears as not-quite-BPMN — they referred to it as “BPMN in spirit”, which caused me to have to mute my phone temporarily — so that a developer can make it into an executable process by hooking up web services, exposing processes as web services, and hooking up custom UI applications.

(Screenshot: Process Manager)

Although I haven’t seen it, they apparently have a free BPMN modeler available on their site whose models can also be imported into the Business Process Manager, plus a Visio Interpreter product (similar to the Zynium product) for remapping existing Visio diagrams into XPDL for import. To make things even more confusing, they also have a web-based Process Navigator that provides a read-only view of the process models; it looks similar to the Process Analyzer, but doesn’t allow you to do anything with the models.

In other words, they provide three tools for a business analyst to model processes, all of which require exporting to XPDL for import into the Business Process Manager. The problem with this, as we all know, is round-tripping: any round-tripping that requires manual exporting and importing, even if it is technically possible, is unlikely to actually occur. This greatly reduces process agility, since it eventually comes back to the same old game of the business analysts creating their requirements in a different tool than IT, then throwing it over the wall and hoping for the best. When I asked about a common model, the response was that this functionality is on their roadmap, and they might do it through Eclipse perspectives, which makes me think that it’s still a long way off. They also support BPEL import/export; we didn’t discuss that in detail, but I imagine that it’s used in much the same way as XPDL.
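
For a sense of what actually gets shuffled back and forth in that export/import cycle, here’s a small sketch that lists the activities and transitions in an XPDL file. It assumes a plain XPDL 1.0 package and is a generic illustration of the interchange format, not Documentum’s actual export; the file name is hypothetical:

```python
import xml.etree.ElementTree as ET

# Generic illustration of reading an XPDL interchange file (assumes the XPDL 1.0
# namespace); this is not Documentum-specific, just the kind of content that any
# manual export/import step between modelling tools has to carry across.
NS = {"xpdl": "http://www.wfmc.org/2002/XPDL1.0"}

def summarize_xpdl(path: str) -> None:
    """Print the processes, activities and transitions found in an XPDL package."""
    root = ET.parse(path).getroot()
    for proc in root.findall(".//xpdl:WorkflowProcess", NS):
        print(f"Process: {proc.get('Name')}")
        for act in proc.findall(".//xpdl:Activity", NS):
            print(f"  activity {act.get('Id')}: {act.get('Name')}")
        for tr in proc.findall(".//xpdl:Transition", NS):
            print(f"  transition {tr.get('From')} -> {tr.get('To')}")

# summarize_xpdl("exported_process.xpdl")  # hypothetical file exported from a modelling tool
```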

As an ECM vendor, they do have some advantages when it comes to integrating with some of their other product functionality: process models are versioned, and eRoom can be used for collaboration during process modeling, with the eRoom participants viewing the process models via the read-only Process Navigator. However, there are some things that are missing or just don’t hang together: for example, a BAM dashboard doesn’t exist out of the box, although they demo’d one built in the BEA WebLogic portal environment using widgets from the Proactivity BAM product that EMC acquired last year, plus a report builder within Process Analyzer. An out-of-the-box dashboard? Slated for a future version. The out-of-the-box “webtop” user interface for those participating in a process is neither sophisticated nor particularly configurable, and it’s Documentum’s expectation that custom applications will be written for most process work.

Portals and user interfaces are both things that many companies want to have full control over, and may end up rewriting themselves anyway, but you can’t get something simple up and running if you don’t at least have a good set of the basic functions available: you’ll be off on a year-plus development cycle in order to get the first version into production, which just doesn’t cut it these days.

They have Corticon pretty deeply embedded within the product for rules processing, although they can also support ILOG. No support yet for other rules engines, except via web services.

I briefly saw their simulation offering, which uses preset values, although apparently it can also be driven by historical process data.

All in all, EMC/Documentum is far behind in the BPM field, and at this time is likely only attractive to current Documentum customers who are given a sweet deal on licensing. Since most of the major BPM vendors play well with Documentum content management, however, even that might not be enough to make it worthwhile.

Swedish mashups

No, this isn’t some new dish from the Swedish Chef; it’s a new way to fool around with domain names to get them to actually spell something: en.terpri.se (using a Swedish domain name). Following in the etymological footsteps of del.icio.us, en.terpri.se is a site launched by BEA to showcase their Enterprise 2.0 tools: Pages for web authoring (including wikis and blogs), Ensemble for mashups, and Pathways for information discovery via tags, bookmarks and activity analysis. I haven’t seen a demo yet, but you know I want one.

These all fall under the AquaLogic family, and were previously known as Project Builder, Project Runner and Project Graffiti, respectively; it remains to be seen whether this is something new or just the old stuff with some rounded corners. They’re showing all this off at the O’Reilly Emerging Technology conference this week.

BPMG Toronto

Earlier this month was the second meeting of the Toronto BPMG chapter, following a successful turnout at the first meeting. Unfortunately, this one fell on a Friday after a large snowfall when the city had warned people not to travel unless necessary, which encouraged many people to take a snow day in spite of it turning out to be a sunny day with all of the snow melting. (For those of you who live in places where you don’t have snow days, try to imagine the pure delight of missing a day of school and spending it at the local park sledding down the hills, then apply that feeling to missing a day of work. Of course, everyone claims to be working at home…) The result: all the vendors and consultants struggled through the slush and made it to the meeting, and almost none of the “practitioners” (end-customers) did, resulting in an embarrassing total of three practitioners — including the speaker — out of about 25 people. It was funny to hear, however, that three people at the meeting said that they attended because they read about it on my blog.

As with the last meeting, Jim Baird talked about BPMG, then Ultimus (the meeting sponsor, and the speaker’s BPM vendor) gave a short overview of BPM without too much of a product plug. The main speaker was Jodi Starkman-Mendelsohn of West Park Assessment Centre, whom I had heard speak earlier that same week at the Gartner conference in San Diego. Although I had heard some of the WPAC story, this was in much more detail:

  • One key business line for them was to assess injuries from auto accidents, with patient referrals from both insurance companies and lawyers.
  • The main driver for improved systems, including BPM, was 300% revenue growth over a 3-year period that basically broke their manual scheduling process. With increasing numbers of double-bookings and no-shows, they were finding it hard to maintain their service levels, and estimated that $3M/year in revenue was at risk.
  • They defined a strategic plan in 2001 with the objective to improve productivity and operating efficiencies by integrating multiple web-based applications together seamlessly to address their scheduling and financial management needs. They set a target of a 3-year return on investment.
  • In October 2002, they went live with a new scheduling system, a new financial system, a fax server, and Ultimus BPM for process management and to bind together the other components. Events in the scheduling system initiate processes, and there’s integration between BPM and scheduling throughout the processes (a toy sketch of this integration pattern follows the list). The fax server is kicked off at various points in the process to generate outbound documents. There are nightly uploads to the financial system (which is deemed an adequate frequency), and occasional downloads of the master file.
  • Currently, they have 79 different processes, 29 active users, and 6,000 active incidents/month. I was surprised at the large number of different processes: maintaining 79 business processes that may be only slight variations of each other would be a significant burden, although I don’t know how similar the individual processes are.
  • They saw huge benefits: ROI in 3 years, reduced turnaround times, improved business efficiency, and reduced errors, allowing them to grow beyond their previous capabilities and meet market demands. They’re also adding value to customers by providing better visibility into processes, and have better agility of the business logic by externalizing it in the BPM rather than embedding it within financial or scheduling systems.
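
To picture how that kind of event-driven integration hangs together, here’s a toy sketch in which a scheduling event starts a managed process, sends an outbound fax, and queues a record for the nightly financial upload. This is entirely my own invention for illustration; none of the names reflect Ultimus or WPAC’s actual integration:

```python
# Toy sketch of the integration pattern described above: a scheduling event starts
# a managed process, which sends an outbound fax and queues a record for the
# nightly upload to the financial system. All of the names here are invented.

nightly_financial_batch = []

def send_fax(recipient: str, document: str) -> None:
    print(f"fax server: sending '{document}' to {recipient}")

def on_appointment_booked(patient_id: str, referral_source: str, slot: str) -> dict:
    """Scheduling-system event handler: start a process instance for the assessment."""
    instance = {
        "patient": patient_id,
        "referral": referral_source,
        "slot": slot,
        "step": "confirm appointment",
    }
    send_fax(referral_source, f"appointment confirmation for {patient_id} at {slot}")
    nightly_financial_batch.append({"patient": patient_id, "event": "assessment booked"})
    return instance

booking = on_appointment_booked("patient-042", "ABC Insurance", "2007-03-09 10:00")
print(booking["step"], "|", len(nightly_financial_batch), "record(s) queued for nightly upload")
```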

The big success story that Starkman-Mendelsohn talked about at Gartner and here was what happened to them when SARS hit Toronto in 2003. Although SARS didn’t affect most of us in Toronto all that much, it had a severe impact on health-care facilities, where all but critical services were cancelled. Although a private business, WPAC operates within a public healthcare facility, which meant that they were shut out of their own offices with very little warning. Prior to their new systems, this would have put them out of commission for the entire seven weeks of the lock-out, costing them $600K in direct revenue and untold damage in lost opportunity; with all of their applications available online via web interfaces, however, they were managing their business processes as usual by the next day, and two days later had outfitted space in a local hotel with examining tables and equipment to allow them to continue operating more or less normally. Since they weren’t capturing the patient files electronically, they still needed access to the paper files, but were allowed to send one person back into their offices once per day to fetch the necessary files.

I created a short course on business continuity planning last year, and I talked about exactly this issue: how having your business processes and other applications online can save your butt when disaster hits. If you have mostly manual processes, consider that the process is actually embodied in the workers’ heads, and likely in paper files on their desks or notes saved only to their local PCs. Take away the physical desk, and they might have a hard time reconstructing a particular instance of a business process. Take away the specific worker as well, and you can forget about reconstructing that process instance until you can get access to either the desk or the person. If the business process is online, however, most of the notes and other instance-specific data is captured within the online process, making it possible to replace the original worker and/or remove them from their physical working environment with minimal impact on their ability to complete the business processes, as long as they have access to the online systems.

Starkman-Mendelsohn talked about challenges that they are facing now due to recent deregulation in the auto insurance industry: first of all, one part of their business is likely to decrease because of changing rules around the use of assessment centres for resolving insurance disputes. Secondly, if someone is required to have an assessment, there is now a maximum distance that they can be required to travel. They were able to change their business processes to suit the new requirements and remain competitive — although the decreased business has resulted in reduced staff numbers — and set up four satellite locations, enabled by the ability to access the business applications remotely. In other words, the easy adaptability of the systems is providing them with the business agility that they require in a rapidly-changing business environment.

She finished up by noting that a BPM system needs to evolve over time, it’s not a one-time project — a big vote for not over-customizing your systems. She also said that involving their subject matter experts in process mapping and implementation was a big part of their success, and resulted in good staff buy-in.

And the final big win for them: the Infoworld 100 Awards named them as a winner in the health-care sector in November.