ProcessWorld Day 1: Services industry breakout with Todd Lohr of Zurich North America

The second customer breakout session featured Todd Lohr of Zurich North America, who discussed various process modelling initiatives within Zurich. They’ve expended a ton of effort on detailed as-is process mapping in order to drive process improvement, and it appears to have paid off even before implementing process automation.

They had some interesting discoveries: four of the top five activities (by time spent) did not add value to the underwriting process; many activities done by an underwriter could be done by an underwriting assistant; the start time of certain processes was causing unnecessary delays due to the timing or unavailability of staff (underwriters work late, whereas the assistants work 7-3, so all assistant-level work after 3pm was done by an underwriter); and bad insurance applications (e.g., missing data) can be found and aborted earlier in the process through appropriate triage. Having worked with a lot of insurance customers, I don’t find any of these surprising, but I was impressed by the thoroughness of their as-is modelling and how they were able to exploit it to improve processes, technology and organizational structure.
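
That last point, early triage, is simple enough to sketch in a few lines. Here’s a minimal, hypothetical example (the field names are invented, not Zurich’s): validate the required data at intake so that an incomplete application is aborted before it ever reaches an underwriter.

```python
# A minimal, hypothetical sketch of intake triage. The field names are
# invented; the idea is the one above: reject incomplete applications
# before they consume underwriter time.
REQUIRED_FIELDS = ["applicant_name", "policy_type", "coverage_amount", "loss_history"]

def triage(application):
    """Return the list of missing required fields; empty means proceed."""
    return [f for f in REQUIRED_FIELDS if not application.get(f)]

app = {"applicant_name": "Acme Corp", "policy_type": "commercial property"}
missing = triage(app)
if missing:
    print("Aborted at intake; missing:", ", ".join(missing))
else:
    print("Routed to the underwriting assistant queue")
```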

They use ARIS to create the future-state models and to help with the transition from the as-is to the to-be processes. They see it as a tool for training, simulation and communication, as well as for determining staffing levels and the economic value of processes.

Future plans include integration of business rules, and getting some of these processes automated in a BPMS.

ProcessWorld Day 1: Services industry breakout with Marc Kase of SAIC

After lunch, I attended a couple of ARIS customer breakouts, the first of which was given by Marc Kase of SAIC. I won’t give a lot of detail about their business processes, since I’m not sure how much they really want to share outside the conference, but there were some great points and lessons learned that are more generic.

One of the first stats that hit me in the SAIC case is that they moved from 700 job roles to 70 (it may have been 78 — I was in the back and the print on the slide was small) as part of their process modelling efforts, which is an incredible success story.

They’ve focussed on building a business architecture, with process models created for projects stored in local repositories, then promoted to a central enterprise architecture repository at certain milestones. From this, they’ve been able to see a number of benefits:

  • Context, e.g., which systems use which data
  • Documentation that allows requirements and design to be traced back to business processes
  • Standards enforcement
  • Ability to cascade changes across models
  • Web publication of process and architecture content
  • Strengthened ties between IT and functional process owners

They also learned a few lessons, such as some of the difficulties in enforcing change control when moving from a single project to a portfolio of projects, and some practical issues around setting tightly-controlled standards in order to reduce the user learning curve; in fact, with the appropriate filters and standards in place, their users find ARIS “much easier to use than Visio”.

They have a number of plans for 2007, including simulation, integration with a number of other systems including their BPMS, building out the complete enterprise business architecture, and using “system of systems engineering” to track interdependencies between projects.

The Modelling-to-BPM cycle

I attended a webinar last week about Proforma and Workpoint, and I have to say that the “for more information” slide at the end of the presentation is the best one that I’ve ever seen in all my years of corporate presentations. Dan, that may not be your picture on the slide, but kudos. Update: screenshot of the slide removed at the behest of Workpoint, who claim that the slide that I saw on the screen didn’t exist in their slide deck.

The topic of the webinar was Bringing BPM and BPA Together (replay available on Feb 7th), and it focussed on how you can do your complex modelling, analysis and simulation in Proforma’s ProVision, then export/import your way over to Workpoint’s BPMS for execution.

I found this particularly interesting because it highlights the divide between the BPMS vendors who (attempt to) provide everything to do with process under their own banner, and those that rely on partner relationships through a best-of-breed approach.

At the Proforma user conference last year, one of the speakers asked the audience how many of them were exporting their process models to a BPMS for execution. Out of about 150 people, only a couple of hands went up, which surprised me (as I discussed in my post about that presentation, Is Anyone Executing Those Processes?). ProVision doesn’t yet support XPDL, and many of the BPM vendors are just getting on board with XPDL themselves so haven’t been ready to accept process models from a modelling tool, but there has been integration done between ProVision and a number of BPMSs using Proforma’s XML-based interchange format, CIF. Presuming that many of the companies using ProVision are also using a BPMS, this seems to imply that someone is taking the process models created in ProVision and recreating them manually in the BPMS. So why is this happening? Is it a technology disconnect (BPA and BPM can’t exchange models) or a human disconnect (modellers/architects and BPM jockeys don’t exchange models)?
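
To make the interchange problem a bit more concrete: an XPDL package is just XML, so even without vendor support you can get at the process structure programmatically. Here’s a minimal sketch of pulling activities and transitions out of an XPDL 1.0 file using only the Python standard library; the element and attribute names follow the WfMC schema, but treat the details as assumptions, since real vendor exports add their own extensions.

```python
# A minimal sketch of reading process structure from an XPDL 1.0 package
# with the Python standard library. The namespace and element names follow
# the WfMC schema; real vendor exports add extensions beyond what's shown.
import xml.etree.ElementTree as ET

NS = {"xpdl": "http://www.wfmc.org/2002/XPDL1.0"}

def load_processes(path):
    """Yield (process name, activity names, transitions) per process."""
    package = ET.parse(path).getroot()  # the root <Package> element
    for proc in package.iterfind(".//xpdl:WorkflowProcess", NS):
        activities = [a.get("Name") or a.get("Id")
                      for a in proc.iterfind(".//xpdl:Activity", NS)]
        transitions = [(t.get("From"), t.get("To"))
                       for t in proc.iterfind(".//xpdl:Transition", NS)]
        yield proc.get("Name"), activities, transitions

# Hypothetical usage, assuming a file exported from a modelling tool:
# for name, acts, trans in load_processes("claims.xpdl"):
#     print(name, "-", len(acts), "activities,", len(trans), "transitions")
```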

Proforma and Workpoint are obviously trying to buck this trend by promoting how much better things can be when you use the strengths of both of their products. I’m a fan of this approach if you’re using a full enterprise architecture modeller like ProVision as opposed to a process-only modeller, since you can do all of your EA models in it and your process models become part of the larger picture. I’m headed off to the ARIS ProcessWorld conference later this week, so I may discover more of the advantages of a process-only modeller, too.

It makes sense for smaller BPMS vendors like Workpoint (whose product I am completely unfamiliar with, except for last week’s webinar) to leverage relationships with modelling vendors like Proforma to fill in the gaps in their product, although larger vendors who are side-slipping into the BPM space are also going that gap-filling route, such as we see with the Oracle-IDS Scheer relationship.

At the other end of the spectrum are the mainstream BPMS vendors, particularly those categorized as “suites” or “pure-play” (depending on which version of which analyst report you read), which include modelling, simulation and all manner of process analysis as part of their product. Although you can’t escape having a process modeller (and likely simulation) in your BPMS in order to be considered a serious player, I’m still left with the feeling that there’s a lot to be gained by using a tool that is more specialized in process modelling (in order to capture non-automated steps, for example) and that provides a broader enterprise architecture modelling scope.

Strategic Planning with Enterprise Architecture

Laura Six-Stallings from QAD gave a presentation on how they are using enterprise architecture for strategic corporate planning, which absolutely fascinated me since most EA projects that I’ve been involved in have been focussed at much lower levels. She used some wonderfully funny war analogies, going so far as to call ProVision a “weapon of mass depiction”, which takes the prize for the best quote of the day.

Since I had been online earlier and determined that her presentation was not available on the Proforma website, I ended up taking a lot of notes, so I have a better memory of this presentation than of some of the others. I didn’t see anything in the presentation that would have made it particularly proprietary, since she didn’t show their actual strategic planning results, just talked about the methodology for achieving them, but some companies are more paranoid than others.

They started their EA initiative in 2002 with about a dozen business and technology architects, and started using ProVision just last year to implement the Zachman framework. They have a very holistic view of EA, from corporate strategy on down, and they hit their strategic planning process as an early target for EA. Like many organizations, they did their strategic planning in PowerPoint and Word; with over 60 pages of slides and 280 pages of backup documentation, it was a time-consuming and error-prone process to create in the first place, and then to map onto the more concrete goals of the organization. By implementing EA and ProVision, they were looking to improve the entire process, but also to gain some clarity of alignment between strategy, business and technology, and some clarity of ownership over processes and strategies.

She made several turns of phrase that elicited a knowing laugh from the audience — IKIWISI [I Know It When I See It] requirements; As-Was and Could-Be models — but really brought home the challenges that they had to overcome, and the wins that they are expecting as this process rolls out. The biggest issues weren’t surprising: a perception of complexity, based in part on the limited ProVision expertise within QAD, and the cultural shift required to embrace a new way of modelling their strategic plans. However, they now have a long-term strategic plan based roughly on balanced scorecard objectives, and a whole list of anticipated benefits:

  • Common taxonomy and semantics
  • A holistic multi-dimensional view of enterprise activities
  • Enforced alignment to the strategic plan model
  • Exposure of dependencies, relationships, impacts and conflicts
  • Improved communication and acceptance of the strategic plan
  • Improved priority management
  • Common processes
  • Effective reporting and analysis
  • Improved collaboration

Quite lofty goals, but achievable given the level at which they’re applying EA. What I took away from this, and from other conversations that I had during the two days, is that to many people, “EA” really translates to IT architecture, but not at QAD.

Enterprise Architecture in pharmaceuticals

It was during Craig Confoy’s presentation on Johnson & Johnson Pharma that I really started to get interested in the issue of where EA sits in the enterprise. Although the “E” in EA stands for “Enterprise”, it seems that most organizations, and J&J is no exception, start out with EA somewhere in the IT infrastructure group. Like many large conglomerates, they had a bit of a mess: five pharmaceutical R&D companies (out of J&J’s 200-odd companies), each with its own IT department supporting 14 different functional units, and little alignment between the companies’ functions. Since EA sat in IT infrastructure, anything in the business layers of EA, such as business modelling, was done on a project-by-project basis and not shared between business units or companies.

Sound familiar? Almost every large company that I deal with has the same issues: some real architecture going on at the lower infrastructure levels, but practically none at the business levels.

About 5 years ago, J&J Pharma decided to do something about it, and created a business architecture group. There were a few stumbles along the way, such as the use of a (seemingly inappropriate) CASE tool that resulted in business process documentation that stretched over 42 feet at 8pt font — unusable and unsustainable — before they started using Proforma.

One of their models that I really liked was an enterprise data model that could be overlaid with departmental ownership, so that you can easily see which departments would be impacted by changing any part of the model. I think that this is one of the basics required by any large organization, but it’s often missing; instead, companies tend to replicate data on a per-department basis since they don’t have an enterprise data model to tell them who is using what data.
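
The overlay doesn’t need sophisticated tooling to be useful. Here’s a toy sketch of the idea (entity and department names are invented, not J&J’s): record which departments use each data entity, and impact analysis for a proposed change falls out as a simple lookup.

```python
# A toy sketch of the ownership overlay. Entity and department names are
# invented; the point is that impact analysis becomes a simple lookup.
usage = {
    "Patient":   {"Clinical Ops", "Regulatory", "Safety"},
    "Compound":  {"Discovery", "Regulatory"},
    "TrialSite": {"Clinical Ops"},
}

def impact_of_change(entities):
    """Return the set of departments touched by a change to these entities."""
    affected = set()
    for entity in entities:
        affected |= usage.get(entity, set())
    return affected

print(impact_of_change({"Patient", "Compound"}))
# e.g. {'Safety', 'Discovery', 'Regulatory', 'Clinical Ops'} (set order varies)
```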

This was one customer presentation that showed some clear ROI from using the Proforma tools: they found that systems could be implemented 30% faster (a huge advantage in pharmaceuticals), that the modelling process identifies system integration points and allows them to create standard EAI models for reuse, and that the data models helped them meet their regulatory requirements more easily.

Proforma conference presentations

I totally slacked off after leaving the conference on Thursday afternoon, spending the early evening at the Voodoo Lounge catching the sunset from 51 floors, then hanging around the Masquerade mezzanine watching the Mardi Gras show before turning in early enough to make that 7am flight home on Friday. So here it is, Monday morning, and I’m catching up on a week’s worth of blogging.

This was a relatively small conference, with about 150 customers attending, but what an enthusiastic group! When one of the speakers talked about how ARIS had been abandoned on a project because of its complexity, there was clapping from the audience, and I don’t think that all of it came from Proforma employees. There were no breakout sessions, just a main stage, and almost half of the presentations were given over to customer presentations. Not only that, but all of them were talking about what they’ve actually done with Proforma’s products, not what they plan to do, so they had some pretty practical advice for the rest of the crowd.

The product presentations from the Proforma people were also pretty interesting, in part because I haven’t worked with the product that much so a lot of it was new to me.

More detail on the individual presentations to follow.

I also had a number of interesting conversations with customers, and I kept driving at the question of where enterprise architecture fits in their organization. For the most part, companies are keeping it under IT (which I think is a big mistake, and posted about previously, not surprisingly when I was reviewing a Proforma webinar), and there still seem to be a lot of conflicts in defining the roles of data, information, business and enterprise architects.

Proforma conference Day 1 quick look

There’s wifi in the conference room, but you have to sign up at the business centre for it ahead of time, which was just too much logistics for me to blog live. However, it’s 5am on Day 2 and my brain is still on Eastern time, so it’s time for a few updates. I’ll do a more complete review of the sessions after it’s all over. First, let’s start with the other conferences that were running in the same conference centre, which you can see in the photo on the left.

Best quote of the conference so far: “I can DODAF FEMA!”, from Tony Devino, an engineer with the Navy, in his presentation about creating a process for quality control on temporary housing installations in New Orleans following Katrina. First time that I’ve heard “DODAF” used as a verb, and a bit funny (well, to EA geeks), especially when you consider that they use DODAF for weapons systems.

Best dance (not usually a category that I assign at conferences): Judson Laipply, a motivational speaker who gave a keynote, also happens to be the originator of the Evolution of Dance, the most-viewed clip ever on YouTube. He talked about change, which is the theme of the conference, then did a live, extended-play version of the Evolution of Dance for us at the end of his talk. I really would have hated to follow him on stage as a speaker!

Dr. Geary Rummler spoke at the afternoon keynote (yes, that Rummler), which was pretty exciting for those of us who have been around process modelling and management long enough to have a view of his part in its history.

There was a bit of discussion about Proforma’s leading position in the new Forrester report, which is an amazing coup when they’re up against a competitor that’s many times their size.

I’m left with a great impression of Proforma as a company. Although considerably smaller than IDS Scheer, their major competitor, they have an enthusiastic customer base, judging by both the customer presenters and the attendees who I’ve met, and a really nice corporate culture. I sat at the dinner last night with Dave Ritter, one of the founders and currently VP of Enterprise Solutions; we had a lengthy chat before we realized that we had (sort of) met on a Proforma webinar where he spoke several months back, and in some follow-up emails to that webinar. Michelle Bretscher, their PR Director, has given me complete red-carpet treatment, offering to set up meetings with any of the executives, and making sure that I have whatever I need. I don’t think that a lot of press shows up to their user conferences, but when you’re a one-person consultant/analyst/blogger organization, it’s nice to be treated with that level of respect, something that larger vendors could take a lesson from. I also had the most pleasant surprise when I turned to page 6 of the program and saw the watermarked graphic behind the print.

Sessions today include a lot of material from Proforma on their upcoming Series 6, and I’m very eager to hear about their advances in zero-footprint clients and other Web 2.0-like features, considering my recent focus on Web 2.0 and BPM.

Spectrum Radio on the FBI case file debacle

I’ve been a member of IEEE for over 20 years, and browse the periodicals that arrive at my door monthly, but I’ve only just become aware of the content that they offer to members and non-members on Spectrum Online. I’ve downloaded a number of the podcasts from Spectrum Radio and thoroughly enjoyed the two-parter about the enormously expensive and unsuccessful FBI case file project (I’d love to link to it directly, but they have a stupid Flash interface that doesn’t allow that, so you’ll just have to find it on the Radio page).

IEEE actually had to litigate to have the report about this project released under freedom of information laws, and the software experts in discussion on the podcast dissect the report and talk about what went wrong, as well as lessons that can be learned for any large software project. It’s interesting that a company with CMM level 5 certification can end up developing a $170M project with no proper requirements, no enterprise architecture, and without a number of other things that seem like no-brainers for any large software project. Worth a listen.

Paul Harmon on BPM state and trends

I’m in a webinar (sponsored by Proforma) with BPTrends‘ Paul Harmon discussing their recent survey of business process trends. I expect to meet Paul face-to-face next week at the ABPMP chapter meeting in San Francisco, where he’ll be speaking on “Business Process Today and Tomorrow”.

The first part of the webinar is pretty much just a review of the report itself, with a minor degree of added value, although it’s good for those who find it hard to plough through a 54-page report and find the high points without nodding off. He highlighted that most people are still doing their process modelling in Visio or PowerPoint (see page 29 of the report), although he sees that as an indicator that an organization isn’t yet fully serious about its process modelling efforts, because of the lack of the enterprise view that you can get with a repository-based tool such as Proforma’s. He sees many of the survey results as indicators that the BPM market is still developing, not yet mature, and calls the market for tools “confusing” as he discusses the diagram on page 45. Considering that analysts tend to redefine “BPM” every couple of years, causing a vendor feature catch-up scramble, neither of these points is surprising, and I agree with him. Furthermore, I think that the large percentage of Visio modelling is as much an indicator of an immature market as of immature BPM initiatives within organizations.

He went through some results that I don’t recall seeing in the report, summarizing what people would be doing less of, the same amount of, or more of in 2006 (the survey was taken in February), grouped into enterprise, process-level and implementation activities as per their pyramid (pages 41-42 in the report). He sees most of these trends as further proof that we’re still in a developing market for BPM, such as the large number of companies that are planning to do more with BPM systems, major process redesign and automation projects, and process analysis and design training in 2006 than they’ve done previously, as well as to develop an enterprise architecture and enterprise performance management. I like the fact that he doesn’t show any bogus hockey-stick projections for BPM growth; those of us in the BPM business have been seeing those for many years now and are understandably wary.

The webinar will be available for replay at some point; check the original registration link or the Proforma website to find it in a few days.

Slightly off topic: I appreciate the collaborative spirit of many recent webinars that I’ve attended, where the dial-in line is opened up so that any of the attendees can speak up with their questions (rather than using a chat window), but it doesn’t work so well in practice due to the large number of people who can’t find the mute button on their phone, or just don’t consider the listening experience of others on the call. I can hear background conversations, papers rustling, computer noises of all sorts, and even a dog barking, all in spite of the speaker’s repeated requests for people to mute their phones. Even on an online demo that I attended the other day with only two other attendees besides myself, one of those two put his phone on hold during half of the demo, which treated the rest of us to the periodic “beep-beep” that most phone systems emit to the party on hold (and gave the speaker a pretty good indication of just how unimportant the material was to that attendee, since we could easily identify who had hit the hold button).

BPM Think Tank Day 2: BPMN Technology Roundtable

Since I’m here in part to firm up my knowledge about BPM standards, I chose to attend four technology roundtables and none of the executive (business-focussed) ones. The first one that I attended was on BPMN, led by Petko Chobantonov of Lombardi. Petko’s involved with the development of the BPMN standard and was really pushing us to find out what else should be added to the standard in the future. I was the scribe for that session, so I have a ton of notes; my problem is trimming them down and making them understandable in this post.

First of all, Petko made the statement that OMG is not recommending XPDL for serialization of BPMN (i.e., a file format in which to save BPMN), but recommends the use of BPDM (which isn’t released yet, although a very early draft is due next month). This sets up an interesting showdown between XPDL, which is already in use by 30+ modelling and BPM vendors, and BPDM, whenever the latter is finally released this year or next.

For the first time, I heard about BPRI, the Business Process Runtime Interface, which incorporates information gathered at runtime, such as metrics and statistics about a process (I think). Petko has a bit more about it on his blog here, and I’ll be looking at it in more detail, since I think that this is a necessary standard as well.

One of the participants from an end-user organization said that they have extended BPMN with 3-4 custom types for internal use, including one for applications and one for data elements. He also said that they have difficulties in publishing and communicating BPMN diagrams because of their complexity, and that there need to be easier ways to abstract a flow in order to present it to someone who is not intimately involved with the process, such as executive management. Although using just a linear set of milestones was suggested as an abstraction model, removing all of the split/merge and other flow information, I think that some of the flow information should be left in place even in a high-level diagram in order to provide sufficient value.
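
As a rough sketch of what I mean, here’s a hypothetical milestone abstraction (the process and its steps are invented) that keeps a marker wherever parallel flow was elided, rather than pretending that the process is purely linear:

```python
# A rough sketch of milestone abstraction. The process and its steps are
# invented; gateways are collapsed to a marker instead of being dropped.
flow = [
    ("Receive application", "activity"),
    ("Triage", "milestone"),
    ("Credit check", "activity"),
    ("Fraud check", "activity"),
    ("Checks gateway", "split/merge"),
    ("Underwrite", "milestone"),
    ("Issue policy", "milestone"),
]

def executive_view(steps):
    """Keep milestones; mark elided splits/merges rather than hiding them."""
    view = []
    for name, kind in steps:
        if kind == "milestone":
            view.append(name)
        elif kind == "split/merge" and (not view or view[-1] != "(parallel work)"):
            view.append("(parallel work)")
    return " -> ".join(view)

print(executive_view(flow))
# Triage -> (parallel work) -> Underwrite -> Issue policy
```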

This was also one of the times during the day when I heard about the crossover between BPMN and enterprise architecture. We discussed different perspectives (similar to the perspectives in the Zachman framework), and although Petko felt that the standard could be extended to become effectively a higher-level diagram from which you could invoke other EA perspectives, such as organizational and motivational models, I think that BPMN holds a place as a standard for creating artifacts in one or two of the Zachman cells in column 2 (process), not as an overarching EA model.

We had a discussion about the standard tree-type organizational chart, and how the boxes in it correspond to swimlanes in a BPMN diagram. From that, we talked about how to represent information in the org chart based on which processes a particular role participates in, and also discussed the stickier subject of assigning roles a bit more dynamically, based on a collection of capabilities rather than a pre-determined role. That got me thinking about whether we’re asking the question the wrong way around: instead of asking what capabilities exist in a role or person, should we be creating the roles or services based on what combinations of capabilities exist? Something to think about later.
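
Here’s a quick sketch of turning that question around (all names and capabilities are invented): rather than assigning people to predefined roles, derive candidate roles from the distinct combinations of capabilities that actually occur in the organization.

```python
# A quick sketch of deriving roles from capabilities. All names are
# invented; each distinct capability set becomes a candidate role.
from collections import defaultdict

people = {
    "Ana":   {"rate", "quote", "bind"},
    "Ben":   {"rate", "quote"},
    "Chloe": {"rate", "quote", "bind"},
    "Dev":   {"quote"},
}

roles = defaultdict(list)
for person, capabilities in people.items():
    roles[frozenset(capabilities)].append(person)

for capabilities, members in roles.items():
    print(sorted(capabilities), "->", members)
# ['bind', 'quote', 'rate'] -> ['Ana', 'Chloe'], and so on
```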

We talked about a dependency diagram for subprocesses used in multiple processes, and whether this should be a standard view defined in BPMN, or whether it’s informational rather than notational. If the audience for this information is primarily the business analysts who use BPMN, then perhaps a graphical standard is appropriate, although it’s a “report” of sorts, not a working model.
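
Whether or not it becomes notational, the underlying report is trivial to generate from the model data. A sketch, with invented process names: invert the process-to-subprocess mapping so that each shared subprocess lists the parent processes affected by a change to it.

```python
# A sketch of the subprocess dependency report, with invented process
# names: invert the process-to-subprocess mapping, then list any
# subprocess that more than one parent process depends on.
processes = {
    "New Business": ["Triage", "Credit Check", "Issue Policy"],
    "Renewal":      ["Credit Check", "Issue Policy"],
    "Endorsement":  ["Issue Policy"],
}

used_by = {}
for proc, subprocesses in processes.items():
    for sub in subprocesses:
        used_by.setdefault(sub, []).append(proc)

for sub, parents in used_by.items():
    if len(parents) > 1:  # only shared subprocesses are interesting here
        print(sub, "is used by:", ", ".join(parents))
```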

Petko finished up with some ideas about defining aspects of a process, such as security, escalation and exception handling, in order to simplify the primary representation. The aspects would be invoked whenever an activity is executed, but represented on separate diagrams. In that way, an aspect would effectively be a template for activities that could be overlaid on any of the activities in the main diagram and extend the meaning of the main diagram. Each activity in the main diagram would need a mechanism for passing some number of parameters to the instance of each aspect that may execute for that activity, for example, some measure of the time-criticality of an activity in order to trigger an escalation at the appropriate time.
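
To illustrate the parameter-passing part of that idea, here’s a minimal, hypothetical sketch (none of this comes from the roundtable or any real BPM engine): an escalation aspect defined once, overlaid on individual activities, with each activity supplying its own time-criticality parameter.

```python
# A minimal, hypothetical sketch of the aspect idea; nothing here comes
# from BPMN or a real engine. An escalation aspect is defined once, then
# overlaid on activities, each supplying its own time-criticality value.
from dataclasses import dataclass

@dataclass
class EscalationAspect:
    escalate_to: str

    def apply(self, activity, deadline_hours):
        # A real engine would arm a timer when the activity starts; this
        # just shows the per-activity parameter reaching the aspect.
        print(f"{activity}: escalate to {self.escalate_to} "
              f"after {deadline_hours}h without completion")

aspect = EscalationAspect(escalate_to="team lead")
aspect.apply("Underwrite", deadline_hours=4)    # time-critical step
aspect.apply("Issue policy", deadline_hours=24) # less urgent
```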

Tons of ideas came out here, as they did at the later roundtable that I attended on BPEL, and I’m looking forward to the roundtables today.

Time to head off to the conference (I’m already 5 minutes late and still have to finish packing and check out); more throughout the day as I get a chance.