A last few notes on Proforma

I received Proforma’s press release last week about the Forrester report on process modelling tools (PDF, free download), in which Proforma places well against their usual competitors, IDS Scheer and MEGA. All three are in the leaders category, with Proforma leading on current product offerings, and IDS Scheer leading on strategy. This result is quite different from Gartner’s Magic Quadrant for Enterprise Architecture tools published in April: many of the same tools are being evaluated, but the Forrester focus is purely on business process modelling, while the Gartner focus is on the broader topic of EA modelling. Gartner also published an MQ on business process analysis tools this year that has results closer to the Forrester report, not surprisingly.

All of this made me realize that I still had a few notes about the Proforma user conference that I attended a couple of weeks back in Las Vegas, mostly notes from the Proforma folks about the upcoming product release, ProVision Series 6. Here’s the rundown. [All inaccuracies in this information are due to my hurried notetaking, delayed transcription, and incomplete understanding of Proforma’s product, and I rely on those more knowledgeable to add any corrections in the comments.]

Software as a service was mentioned in the keynote on the first day, and Proforma’s push further into their Knowledge Exchange server-based product (an intended replacement for their ProServer product, and eventually their TeamWorks product with a “light” version) seems to support that concept architecturally, although the web client is not fully functional yet and web services interfaces won’t be supported until version 6.1. I asked a direct question about whether it would work across the firewall and the answer was “it should work”, which means to me that they haven’t actually tried it and you might want to wait until they do before trying that one at home.

The web client does have quite a bit of rich AJAX-y stuff going on: it shows all the inventory views in a browser, uses some nice UI controls such as elevator bars, drag+drop and double-clicking to open a property dialog, and allows property editing in the browser client, although no real modelling tasks. It uses scalable vector graphics to allow for fast zooming, panning and printing of complex models. I think that they might still be working on the licensing model for the web client: although a user must log in, there is no licence required for the web client, as there is for the desktop client, but this will certainly have to change when the web client becomes a full replacement for the desktop client.
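The choice of vector graphics is what makes the fast zooming and panning possible: the client only has to recompute the SVG viewBox, and the browser re-renders the vectors crisply at any scale. Here’s a minimal sketch of that arithmetic in Python; the function and its behaviour are my illustration, not anything from Proforma’s implementation.

```python
# Minimal sketch of viewBox-based zooming, the trick that makes SVG model
# viewers fast: zooming is just arithmetic on four numbers, and the browser
# re-renders the vectors at full quality. Nothing here is Proforma's code.

def zoom_viewbox(viewbox, factor, cx, cy):
    """Return a new (x, y, width, height) viewBox zoomed by `factor`
    around the point (cx, cy) in model coordinates."""
    x, y, w, h = viewbox
    new_w, new_h = w / factor, h / factor
    # Keep (cx, cy) at the same relative position within the viewport.
    new_x = cx - (cx - x) / factor
    new_y = cy - (cy - y) / factor
    return (new_x, new_y, new_w, new_h)

# Zoom in 2x around the centre of a 1000x800 model canvas:
print(zoom_viewbox((0, 0, 1000, 800), 2.0, 500, 400))  # (250.0, 200.0, 500.0, 400.0)
```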

They’ve introduced the concept of dimensions in models, which allows for alternative versions to be created based on specific dimensions, where a dimension may be, for example, geography, or as-is versus to-be. In one model, then, you can compare North American as-is models with European to-be models, or whatever else you want to define based on your dimensions. It took me a while to wrap my head around it, but it’s pretty powerful stuff. This replaces the less powerful concept of scenarios used in previous versions.
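The idea clicked for me once I thought of it as a lookup structure: each variant of a model is stored under a tuple of dimension values, so a comparison is just two lookups. A minimal sketch, with all names invented rather than taken from ProVision:

```python
# A minimal sketch of the dimensions idea (all names are hypothetical,
# not ProVision's internals): each model variant is stored under a tuple
# of dimension values, so comparing variants is a pair of lookups.

variants = {
    # (geography, lifecycle) -> model content
    ("North America", "as-is"): {"steps": ["receive order", "manual credit check", "ship"]},
    ("North America", "to-be"): {"steps": ["receive order", "auto credit check", "ship"]},
    ("Europe", "to-be"):        {"steps": ["receive order", "auto credit check", "customs", "ship"]},
}

def compare(a, b):
    """Show which steps differ between two variants of the same model."""
    steps_a, steps_b = set(variants[a]["steps"]), set(variants[b]["steps"])
    return {"only in " + str(a): steps_a - steps_b,
            "only in " + str(b): steps_b - steps_a}

# Compare the North American as-is process with the European to-be process:
print(compare(("North America", "as-is"), ("Europe", "to-be")))
```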

There were a number of enhancements that aren’t really meaningful to me since I’m not a regular Proforma user, but were welcomed by the audience: embedded Crystal Reports, federated search across repositories, more granular access rights down to the instance of an object, and the ability for a user to change their own password (?!).

There are some new business data modelling tools that are intended to allow designers to work in ProVision, then easily bridge to other technical design tools. This theme was picked up later during a lengthy discussion about interfacing with other applications, which is ultimately the key to making Proforma work as an integral part of any organization. They have developed an XML-based common interchange format (CIF) and made it openly available to anyone who wants to interface with them; this covers all model types, not just process models. They interface with an impressive number of BPMS, SOA suites, and business rules systems.
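To make the interchange idea concrete, here’s a sketch of what exporting a simple workflow model to XML looks like. The element and attribute names are invented for illustration; the real CIF schema is Proforma’s and will certainly differ.

```python
# A sketch of an XML interchange export for a workflow model. The element
# and attribute names here are invented; the real CIF schema is Proforma's.
import xml.etree.ElementTree as ET

def export_model(name, model_type, steps):
    """Serialize a list of (activity, swimlane) pairs as a toy XML model."""
    root = ET.Element("model", {"name": name, "type": model_type})
    for i, (step, lane) in enumerate(steps):
        ET.SubElement(root, "activity", {"id": f"a{i}", "name": step, "lane": lane})
    # Simple sequential flows between consecutive activities.
    for i in range(len(steps) - 1):
        ET.SubElement(root, "flow", {"from": f"a{i}", "to": f"a{i+1}"})
    return ET.tostring(root, encoding="unicode")

print(export_model("Order Fulfilment", "workflow",
                   [("receive order", "sales"),
                    ("credit check", "finance"),
                    ("ship", "warehouse")]))
```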

Because of the rise of process model standards, however, they’ve also done a BPEL interface. The CTO’s keynote made a strong statement in support of standards, mentioning BPEL, WS-CDL, XPDL, SBVR and others. However, during a technical presentation the following day, I asked a question about XPDL, only to find out that it’s under review, but not even on the roadmap yet. They might use CIF as a stepping stone to get to XPDL, as they did with BPEL, but who knows. By then, BPDM will probably be out, and they’ll have to address all three serialization formats at some point.

In my opinion, there are a few things that they’re going to have to address over the next few years in order to keep their product ahead of the big guys who are nipping at their heels, most of which are Web 2.0-type things that I’ve been talking about for BPMS:

  • Full functionality in a zero-footprint web client
  • Tagging to allow users to build up their own folksonomy around models
  • Syndication and feeds for alerts on changes to models, and to provide feedback to some of their new process monitoring capabilities
  • Support for XPDL now, and eventually BPDM

Is Anyone Executing Those Processes?

There’s just something about that Midwestern accent that I find endearing, and when Roy Massie from SunGard first pronounced “insurance” as a two-syllable word, I was hooked. Roy’s was the last non-Proforma presentation of the conference, and he was the only partner speaking (although I suppose that technically speaking, HP Consulting is likely a partner). If you’ve read Column 2 much in the past, you know that I have had a big focus on systems integration and implementation, so I was very interested in what SunGard had done to integrate ProVision with their products.

First of all, who knew that SunGard even had a BPMS product? Apparently the product of an acquisition, it doesn’t show up on the SunGard site, but has two other sites where it lives. Although SunGard did show up as a niche player on Gartner’s Magic Quadrant back in 2003, they’re not there any more; I imagine that niche might be limited to only SunGard customers for their other systems. I’ve seen SunGard transaction processing systems (not including any BPM functionality) in many of my mutual fund and other financial customers, so this isn’t a completely unexpected leap.

What was unexpected was the audience response when Roy asked how many of them export their processes from ProVision to a BPMS for execution; I was sitting more than halfway back in the room, and there were no hands up in front of me. I didn’t turn quickly enough to count, but Roy said “a couple of you” when characterizing the response. My question is: if ProVision users aren’t pushing their process models through to a BPMS for execution, aren’t they missing a lot of value? And what, exactly, are they doing with those process models? Or is this just exposing my bigotry over what process models are good for?

The integration seems pretty straightforward, and based on later information, is similar to what is done by other BPMS vendors: processes are modelled in ProVision, then exported using Proforma’s open Common Interchange Format (CIF) and imported into SunGard EXP Process Director.

I did like Roy’s description of practices (determined by experienced specialists) versus procedures (executed by trained workers), and how they combine to make up processes. I also liked his phrase “enterprise technology sprawl”, and his discussion of how an unstructured collage of technologies can start to dictate business processes. He made the great point that all compliance initiatives are based on process transparency, and (referencing the Aloha Airlines presentation about how they started modelling their business in order to organize themselves out of bankruptcy) that a near-death experience is a great motivator.

Proforma for ITIL

ITIL is not a subject that I spend a lot of time thinking about, but John Clark from HP does. John sat beside me for all of day 2 (except when he was presenting 🙂 ), and I had the chance to talk to him and his wife at lunch. After his presentation, we had a quick session of dueling devices: I showed him the lingerie show photos on Flickr on my Blackberry, and he surfed to the same site on his laptop via a Bluetooth connection to his smartphone.

John’s presentation was about work that the HP Consulting organization had done for Lucent in the area of IT service change management. We saw some of the workflow diagrams that they had created in ProVision for modelling ITIL controls and policies, for example, for Lucent’s incident management process. They integrated the launch and display of ProVision content directly into the HP OpenView Service Desk application for publishing a visualization of the process models directly to the users; this allowed users to see their role in the process in context without having to request that information from the modelling team.

As John put it, it makes the users “unconsciously competent”, something that we should all strive for when designing and building systems.

Strategic Planning with Enterprise Architecture

Laura Six-Stallings from QAD gave a presentation on how they are using enterprise architecture for strategic corporate planning, which absolutely fascinated me since most EA projects that I’ve been involved in have been focussed at much lower levels. She used some wonderfully funny war analogies, going so far as to call ProVision a “weapon of mass depiction”, which takes the prize for the best quote of the day.

Since I had been online earlier and determined that her presentation was not available on the Proforma website, I ended up taking a lot of notes, so I have a better memory of this presentation than of some of the others. I didn’t see anything in the presentation that would have made it particularly proprietary, since she didn’t show their actual strategic planning results, just talked about the methodology for achieving them, but some companies are more paranoid than others.

They started their EA initiative in 2002 with about a dozen business and technology architects, and started using ProVision just last year to implement the Zachman framework. They have a very holistic view of EA, from corporate strategy on down, and they hit their strategic planning process as an early target for EA. Like many organizations, they did their strategic planning in PowerPoint and Word; with over 60 pages of slides and 280 pages of backup documentation, it was a time-consuming and error-prone process to create it in the first place, then to map it onto the more concrete goals of the organization. By implementing EA and ProVision, they were looking to improve the entire process, but also to gain some clarity of alignment between strategy, business and technology, and some clarity of ownership over processes and strategies.

She made several turns of phrase that elicited a knowing laugh from the audience — IKIWISI [I Know It When I See It] requirements; As-Was and Could-Be models — but really brought home the challenges that they had to overcome, and the wins that they are expecting as this process rolls out. The biggest issues weren’t surprising: a perception of complexity, based in part on the limited ProVision expertise within QAD, and the cultural shift required to embrace a new way of modelling their strategic plans. However, they now have a long-term strategic plan based roughly on balanced scorecard objectives, and a whole list of anticipated benefits:

  • Common taxonomy and semantics
  • A holistic multi-dimensional view of enterprise activities
  • Enforced alignment to the strategic plan model
  • Exposure of dependencies, relationships, impacts and conflicts
  • Improved communication and acceptance of the strategic plan
  • Improved priority management
  • Common processes
  • Effective reporting and analysis
  • Improved collaboration

Quite lofty goals, but achievable given the level that they’re attacking with EA. What I took away from this, and from other conversations that I had during the two days, is that to many people, “EA” really translates to IT architecture, but not at QAD.

Six Sigma and Proforma

Day 2 of the Proforma conference included three additional customer presentations, one from a partner, then all the exciting stuff about the upcoming product release.

Following on the heels of the panel at the end of day 1, in which Paul Harmon and Geary Rummler slammed Six Sigma, Deb Berard from Seagate spoke about their successes with Six Sigma and Proforma. Seagate has been using Six Sigma since 1995, and has been seeing a lot of success with it and Lean — not surprising for a manufacturing organization, which is where Six Sigma originated. They use the Six Sigma framework in ProVision, and their initial process analysis and modelling efforts led to the improvement of some of their product development processes. Based on that success, they then pushed it out to an enterprise-wide initiative.

The only thing that I really had an issue with was her calling ProVision a business process management system (BPMS), which it’s not: it’s a modelling suite. Although BPM still doesn’t have a fully accepted definition, I believe that BPMS has a very specific meaning.

Proforma Conference Day 1: Geary Rummler x 2

Our after-lunch keynote on the first day was by Geary Rummler, co-creator of the well-known Rummler-Brache methodology and co-author of Improving Performance: How to Manage the White Space on the Organization Chart. In case you’re not getting the significance of this, the original swimlane diagrams are more properly called Rummler-Brache diagrams.

Rummler retired from Rummler-Brache a few years ago, then after “failing at retirement”, as he put it, went back into practice with the Performance Design Lab. His talk was a bit rambling, and he had 84 slides for a one-hour presentation, but I’m quite sure that he’s forgotten more about process than most of us will ever know.

He talked about how “as-is” process maps tend to drive issues out into the open, something that I have seen time and time again: management looks at what you’ve documented as their current process, and they say “We do that? Really?” One of the prime examples of this was a financial institution that I was working with on a BPM project a few years back. I documented all of their current processes on the way to the proposed processes, including their paper handling procedures. They sent the originals of all transaction documents offsite in order by date, but made a photocopy of each document and filed it locally for reference by account number. Of course, we planned to replace this with a document scanning operation, but I felt obligated to point out a pretty major problem with their current process: since they were so behind in their local filing, the photocopies of the documents were just being boxed in date order and stored onsite, which made the account-order files useless for any recent documents. Furthermore, they had stopped sending the originals offsite some months before that, so they now had both the original documents and a photocopy of each document, stored in the same order but in separate boxes, kept onsite. The management in charge of the area was truly shocked by what was going on, and I think that my fees were covered just by what I saved them in photocopy costs.

Back to Rummler: he showed a diagram of a business — any business — as a system, with financial stakeholders, the market, competition, resources and environmental influences as the inputs and outputs (since you can search the Proforma site and find the full presentation, I don’t think that I’m spilling the beans by including one of the diagrams here). I like this view, since it simplifies the business down to the important issues, namely the interactions with external people and organizations. He also spent quite a bit of time on the processing system hierarchy: the enterprise business model at the top, the value chain extracted from that, the primary processing systems derived out of the value chain, the processes from each primary processing system, and the sub-processes, tasks and sub-tasks that make up each process.

He went into organizational structure, specifically showing departments/resources on one axis and processes on the other, to illustrate how processes cut across departments, making the point that most organizations are managed by department rather than having true process owners.
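A toy version of that matrix makes his point quickly. With invented names, every process spans several departments, so managing column-by-column (by department) leaves no one accountable for an entire row (a process):

```python
# A toy version of Rummler's departments-vs-processes matrix (all names
# invented). Each process cuts across several departments, which is exactly
# why department-by-department management leaves processes without an owner.

matrix = {
    "order-to-cash":  {"sales", "finance", "warehouse"},
    "hire-to-retire": {"HR", "finance", "IT"},
    "procure-to-pay": {"purchasing", "finance", "warehouse"},
}

for process, departments in matrix.items():
    print(f"{process}: spans {len(departments)} departments -> {sorted(departments)}")
# Every row spans multiple columns: no single department manages any
# process end-to-end unless a process owner is explicitly assigned.
```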

There was one quote in particular that stuck with me: “Visibility is a prerequisite to optimizing and managing systems.”

We had a second dose of Rummler in the wrap-up panel on Day 1, where he joined Paul Harmon of BPTrends and one of the Proforma team who was filling in for the missing Aloha Airlines representative.

Harmon stated that none of the major business schools have any courses on process, but that they’re all function-based, and that most CEOs don’t see process as their primary concern. Rummler agreed, and made the point that being functionally-oriented, or siloed, leads to sub-optimization of the organization. Harmon’s initial comment led me to wonder if it’s necessary to have the CEO want to “do process”, or if a process approach is just an implementation detail, but Rummler ended up addressing exactly that issue by saying that it is necessary because methodologies are competing directly for the CEO’s attention, and it’s not always possible for the CEO to distinguish between the different methodologies at that level. Harmon made quite a rant against Six Sigma, saying that “Six Sigma people don’t understand high-level process”, blaming the widespread acceptance of Six Sigma on Jack Welch and GE strong-arming their suppliers into using it, and stating that Six Sigma people could be converted into a business process view, as if they were some sort of cult that had to be deprogrammed. I’m not sure that I would take such a hard line on Six Sigma versus a process-centric organization; “process” can’t be pushed into an organization as easily as Harmon implied, since it’s not a methodology, it’s a pretty fuzzy concept that a lot of consultants like to bandy about.

At the end of the day, I’d have to say that I also disagree with Harmon’s assessment that BPMS is still very early market. Although it’s not a mature market, I think that to call it “very early” is ignoring the many successful products and implementations that have been done in this space over the past several years.

Seeking a BPM definition

The last customer presentation of Day 1 at the Proforma conference was Mary Baumgartner Vellequette from Genentech’s Corporate Treasury division. Through curiosity on both of our parts, Mary and I later toured the show floor of the International Lingerie show that was going on down the hall, although they were in the process of tear-down so we didn’t see as much as we would have liked. 😉

Mary had some great material on establishing BPM programs within an organization, including governance, but the more that I listened to her, the more I realized that we still have a definition gap: her BPM (which does mean business process management) doesn’t really include one of the main foci of my BPM, namely the systems used to help automate business processes. Hers is really about analyzing and modelling the processes, integrating them into an overall architecture, documenting and communicating the processes, business reorganization and other non-automation tasks. Only in her long-term plans does she mention “business process automation” tools.

She does include some BPM measures and direct/indirect benefits in good detail, helpful to anyone who is looking to establish ROI for their BPM project. She also steps through the BPM project process in detail, and discusses change management and how to map process improvements to organizational change.

I am left with the feeling that we still don’t have a comprehensive definition of business process management: although I consider everything that Mary talked about to be part of BPM, I also consider the process automation and BPMS to be a significant part.

Dispensing with the As-Is

Sometimes I don’t pick up a lot from a presentation, or at least I don’t write a lot of notes about what I learned from it. Bill Riordan from HP talked about modelling customer support centre processes, but I came away with only a few points.

First, the term “happy path”, which in process modelling means the best-case/simplest route through a process map, originated at HP. Who knew.

Second, and likely more relevant to those who aren’t as fascinated by process modelling trivia as I am, is that they didn’t do any “as-is” process modelling, only the “to-be” models. That way, they were able to completely bypass the “but that’s the way we do it now” argument for not changing things. That really harkens back to the days of business process reengineering, when everything old was tossed out and the new processes started from a completely clean slate.

Enterprise Architecture in pharmaceuticals

It was during Craig Confoy’s presentation on Johnson & Johnson Pharma that I really started to get interested in the issue of where EA sits in the enterprise. Although the “E” in EA stands for “Enterprise”, it seems that most organizations, and J&J is no exception, start out with EA in the IT infrastructure group somewhere. Like many large conglomerates, they had a bit of a mess with five pharmaceutical R&D companies (out of J&J’s 200-odd companies), each with its own IT department supporting 14 different functional units per company, and little alignment between the company functions. Since EA was in IT infrastructure, anything in the business layers of EA, such as business modelling, was done on a project-by-project basis and not shared between business units or companies.

Sound familiar? Almost every large company that I deal with has the same issues: some real architecture going on at the lower infrastructure levels, but practically none at the business levels.

About 5 years ago, J&J Pharma decided to do something about it, and created a business architecture group. There were a few stumbles along the way, such as the use of a (seemingly inappropriate) CASE tool that resulted in business process documentation that stretched over 42 feet at 8pt font — unusable and unsustainable — before they started using Proforma.

One of their models that I really liked was an enterprise data model that could be overlaid with departmental ownership, so that you can easily see which departments would be impacted by changing any part of the model. I think that this is one of the basics required by any large organization, but often not used; instead, companies tend to replicate data on a per-department basis since they don’t have any enterprise data models that would tell them who is using what data.
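That kind of impact analysis is essentially a graph traversal over the data model. A minimal sketch with invented entities and owners, not J&J’s actual model:

```python
# A minimal sketch of change-impact analysis over an enterprise data model
# (entity names and ownership are invented). Entities that depend on a
# changed entity are found by graph traversal; the impacted departments
# are the owners of everything reachable.

depends_on = {           # entity -> entities it depends on
    "invoice":  ["customer", "order"],
    "order":    ["customer", "product"],
    "shipment": ["order"],
}
owner = {"customer": "sales", "order": "sales", "product": "marketing",
         "invoice": "finance", "shipment": "logistics"}

def impacted_departments(changed):
    """Departments owning any entity that directly or transitively
    depends on `changed`."""
    impacted = {e for e, deps in depends_on.items() if changed in deps}
    frontier = set(impacted)
    while frontier:
        nxt = {e for e, deps in depends_on.items()
               if frontier & set(deps)} - impacted
        impacted |= nxt
        frontier = nxt
    return sorted({owner[e] for e in impacted})

# Changing "customer" touches order, invoice and shipment:
print(impacted_departments("customer"))  # ['finance', 'logistics', 'sales']
```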

This was one customer presentation that showed some clear ROI of using the Proforma tools: they found that systems could be implemented 30% faster (a huge advantage in pharmaceuticals), that the modelling process identifies system integration points and allows them to create standard EAI models for reuse, and that the data models helped meet their regulatory requirements more easily.

Milk, butter and business processes

Mary Berger from Land O’ Lakes kicked off the customer presentations by talking about how they modelled several of their core business processes in spite of the lack of in-house resources, both analysts and SMEs. They backfilled their own resources with some of the Proforma team, then had sufficient success on the first three core processes that they split the efforts and did the next six processes, three product and three office, as two separate streams using Proforma and internal resources.

Mary summarized a number of key success factors that any organization attempting this could take to heart:

  • Do the modelling live during the sessions with the SMEs, rather than taking notes and trying to transcribe them later. This increases the accuracy, since there is immediate feedback on the process model, creates the final documentation as you go along, and forces those who are facilitating and documenting the session to become very familiar with the modelling tool. I can’t tell you how many nights I’ve spent in hotel rooms after a day of customer requirements elicitation sessions, transcribing my notes and trying to recreate every detail mentioned during the day; real-time business modelling is definitely the right way to go, assuming that you have both a facilitator and a scribe/modeller.
  • Use small teams, and involve the right people from the start. Smaller teams just get things done faster and more efficiently, and having everyone on board from the beginning means that you spend less time playing catch-up with those who join later.
  • In workflow models (the most common model type that they used), you can pinpoint the functions of highest risk as those with the most I/O outside their own swimlane. That seems obvious in retrospect, but she highlighted the point well; see the sketch after this list.
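Her risk heuristic reduces to a simple count over the model’s flows. A minimal sketch, with the workflow data invented for illustration:

```python
# A minimal sketch of the risk heuristic from the last point (model data is
# invented): rank a workflow's functions by how many of their inputs and
# outputs cross into another swimlane.
from collections import Counter

flows = [  # (from_function, from_lane, to_function, to_lane)
    ("receive order", "sales",     "credit check",  "finance"),
    ("credit check",  "finance",   "approve order", "sales"),
    ("approve order", "sales",     "pick stock",    "warehouse"),
    ("pick stock",    "warehouse", "pack",          "warehouse"),
]

cross_lane_io = Counter()
for src, src_lane, dst, dst_lane in flows:
    if src_lane != dst_lane:            # an I/O that leaves its own swimlane
        cross_lane_io[src] += 1         # counts as an output for the source...
        cross_lane_io[dst] += 1         # ...and an input for the destination

# Highest-risk functions first:
for function, n in cross_lane_io.most_common():
    print(f"{function}: {n} cross-lane I/O")
```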

They also found that there were a number of unexpected benefits that came out of the analysis and modelling efforts: a common corporate glossary and vocabulary; documented business procedures for use in training and procedures manuals; a visible link between business requirements and goals; and a set of business rules.