Proforma for ITIL

ITIL is not a subject that I spend a lot of time thinking about, but John Clark from HP does. John sat beside me for all of day 2 (except when he was presenting 🙂 ), and I had the chance to talk to him and his wife at lunch. After his presentation, we had a quick session of dueling devices: I showed him the lingerie show photos on Flickr on my Blackberry, and he surfed to the same site on his laptop via a Bluetooth connection to his smartphone.

John’s presentation was about work that the HP Consulting organization had done for Lucent in the area of IT service change management. We saw some of the workflow diagrams that they had created in ProVision for modelling ITIL controls and policies, for example, for Lucent’s incident management process. They integrated the launch and display of ProVision content directly into the HP OpenView Service Desk application for publishing a visualization of the process models directly to the users; this allowed users to see their role in the process in context without having to request that information from the modelling team.

As John put it, it makes the users “unconsciously competent”, something that we should all strive for when designing and building systems.

Strategic Planning with Enterprise Architecture

Laura Six-Stallings from QAD gave a presentation on how they are using enterprise architecture for strategic corporate planning, which absolutely fascinated me since most EA projects that I’ve been involved in have been focussed at much lower levels. She used some wonderfully funny war analogies, going so far as to call ProVision a “weapon of mass depiction”, which takes the prize for the best quote of the day.

Since I had been online earlier and determined that her presentation was not available on the Proforma website, I ended up taking a lot of notes, so I have a better memory of this presentation than of some of the others. I didn’t see anything in the presentation that would have made it particularly proprietary, since she didn’t show their actual strategic planning results, just talked about the methodology for achieving it, but some companies are more paranoid than others.

They started their EA initiative in 2002 with about a dozen business and technology architects, and started using ProVision just last year to implement the Zachman framework. They have a very holistic view of EA, from corporate strategy on down, and they hit their strategic planning process as an early target for EA. Like many organizations, they did their strategic planning in PowerPoint and Word; with over 60 pages of slides and 280 pages of backup documentation, it was a time-consuming and error-prone process to create it in the first place, then to map that onto the more concrete goals of the organization. By implementing EA and ProVision, they were looking to improve the entire process, but also to gain some clarity of alignment between strategy, business and technology, and some clarity of ownership over processes and strategies.

She had several turns of phrase that elicited a knowing laugh from the audience — IKIWISI [I Know It When I See It] requirements; As-Was and Could-Be models — but really brought home the challenges that they had to overcome, and the wins that they are expecting as this process rolls out. The biggest issues weren’t surprising: a perception of complexity, based in part on the limited ProVision expertise within QAD, and the cultural shift required to embrace a new way of modelling their strategic plans. However, they now have a long-term strategic plan based roughly on balanced scorecard objectives, and have a whole list of anticipated benefits:

  • Common taxonomy and semantics
  • A holistic multi-dimensional view of enterprise activities
  • Enforced alignment to the strategic plan model
  • Exposure of dependencies, relationships, impacts and conflicts
  • Improved communication and acceptance of the strategic plan
  • Improved priority management
  • Common processes
  • Effective reporting and analysis
  • Improved collaboration

Quite lofty goals, but achievable given the level that they’re attacking with EA. What I took away from this, and from other conversations that I had during the two days, is that to many people, “EA” really translates to IT architecture, but not at QAD.

Six Sigma and Proforma

Day 2 of the Proforma conference included three additional customer presentations, one from a partner, then all the exciting stuff about the upcoming product release.

Following on the heels of the panel at the end of day 1, in which Paul Harmon and Geary Rummler slammed Six Sigma, Deb Berard from Seagate spoke about their successes with Six Sigma and Proforma. Seagate has been using Six Sigma since 1995, and has been seeing a lot of success with it and Lean — not surprising for a manufacturing organization, which is where Six Sigma originated. They use the Six Sigma framework in ProVision, and their initial process analysis and modelling efforts led to the improvement of some of their product development processes. Based on that success, they then pushed it out to an enterprise-wide initiative.

The only thing that I really had an issue with was her calling ProVision a business process management system (BPMS), which it’s not: it’s a modelling suite. Although BPM still doesn’t have a fully accepted definition, I believe that BPMS has a very specific meaning.

Proforma Conference Day 1: Geary Rummler x 2

Our after-lunch keynote on the first day was by Geary Rummler, co-creator of the well-known Rummler-Brache methodology and author of Improving Performance: How to Manage the White Space in the Organization Chart. In case you’re not getting the significance of this, the original swimlane diagrams are more properly called Rummler-Brache diagrams.

Rummler retired from Rummler-Brache a few years ago, then after “failing at retirement”, as he put it, went back into practice with the Performance Design Lab. His talk was a bit rambling, and he had 84 slides for a one-hour presentation, but I’m quite sure that he’s forgotten more about process than most of us will ever know.

He talked about how “as-is” process maps tend to drive issues out into the open, something that I have seen time and time again: management looks at what you’ve documented as their current process, and they say “We do that? Really?” One of the prime examples of this was a financial institution that I was working with on a BPM project a few years back. I documented all of their current processes on the way to the proposed processes, including their paper handling procedures. They sent the original of all transaction documents offsite in order by date, but made a photocopy of each document and filed it locally for reference by account number. Of course, we planned to replace this with a document scanning operation, but I felt obligated to point out a pretty major problem with their current process: since they were so behind in their local filing, the photocopies of the documents were just being boxed in date-order and stored onsite, which made the account-order files useless for any recent documents. Furthermore, they had stopped sending the originals offsite some months before that, so they now had both the original documents and a photocopy of each document, stored in the same order but in separate boxes, kept onsite. The management in charge of the area was truly shocked by what was going on, and I think that my fees were covered just by what I saved them in photocopy costs.

Back to Rummler, he showed a diagram of a business — any business — as a system, with financial stakeholders, the market, competition, resources and environmental influences as the inputs and outputs (since you can search the Proforma site and find the full presentation, I don’t think that I’m spilling the beans here to include one of the diagrams). I like this view, since it simplifies the business down to the important issues, namely the interactions with external people and organizations. He also spent quite a bit of time on the processing system hierarchy: the enterprise business model at the top, the value chain extracted from that, the primary processing systems derived out of the value chain, the processes from each primary processing system, and the sub-processes, tasks and sub-tasks that make up each process.
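Rummler’s hierarchy decomposes cleanly, level by level, which makes it easy to sketch as nested data. A minimal illustration in Python — all of the system, process and task names here are invented for the example, not taken from his slides:

```python
# A toy rendering of Rummler's processing-system hierarchy:
# enterprise business model -> value chain -> primary processing
# systems -> processes -> tasks. All names are made up.
hierarchy = {
    "Enterprise Business Model": {
        "Value Chain": {
            "Order-to-Cash (primary processing system)": {
                "Order Entry (process)": ["Validate order", "Schedule fulfilment"],
                "Invoicing (process)": ["Generate invoice", "Post to ledger"],
            }
        }
    }
}

def walk(node, depth=0):
    """Print the hierarchy with indentation showing decomposition."""
    if isinstance(node, dict):
        for name, children in node.items():
            print("  " * depth + name)
            walk(children, depth + 1)
    else:  # leaf: a list of tasks
        for task in node:
            print("  " * depth + task)

walk(hierarchy)
```

The point of the nesting is the one he made on stage: every process should be traceable up through a primary processing system and the value chain to the enterprise model.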

He went into organizational structure, specifically showing departments/resources on one axis and processes on the other, to illustrate how processes cut across departments, but making the point that most organizations are managed by department rather than having true process owners.
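That department-versus-process view can be mocked up as a simple matrix: processes as rows, departments as columns, with a mark wherever a process touches a department. A hedged sketch in Python — the department and process names are invented, and a real organization would extract this from its models:

```python
# Invented example data: which departments each cross-functional
# process touches.
processes = {
    "Order Fulfilment": ["Sales", "Warehouse", "Finance"],
    "New Product Intro": ["R&D", "Marketing", "Sales"],
    "Customer Support": ["Support", "Warehouse"],
}
departments = sorted({d for ds in processes.values() for d in ds})

# Print a matrix: rows are processes, columns are departments.
print(f"{'':<20}" + "".join(f"{d:<12}" for d in departments))
for proc, touched in processes.items():
    print(f"{proc:<20}" + "".join(
        f"{'X' if d in touched else '.':<12}" for d in departments
    ))
```

Even on toy data, the rows make Rummler’s point visible: every process crosses more than one department, yet the columns are what most organizations actually manage.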

There was one quote in particular that stuck with me: “Visibility is a prerequisite to optimizing and managing systems.”

We had a second dose of Rummler in the wrap-up panel on Day 1, where he joined Paul Harmon of BPTrends and one of the Proforma team who was filling in for the missing Aloha Airlines representative.

Harmon stated that none of the major business schools offer any courses on process, that they’re all function-based, and that most CEOs don’t see process as their primary concern. Rummler agreed, and made the point that being functionally-oriented, or siloed, leads to sub-optimization of the organization. Harmon’s initial comment led me to wonder if it’s necessary to have the CEO want to “do process”, or if a process approach is just an implementation detail, but Rummler ended up addressing exactly that issue by saying that it is necessary because methodologies are competing directly for the CEO’s attention, and it’s not always possible for the CEO to distinguish between the different methodologies at that level. Harmon made quite a rant against Six Sigma, saying that “Six Sigma people don’t understand high-level process”, blaming the widespread acceptance of Six Sigma on Jack Welch and GE strong-arming their suppliers into using it, and stating that Six Sigma people could be converted into a business process view, as if they were some sort of cult that had to be deprogrammed. I’m not sure that I would take such a hard line on Six Sigma versus a process-centric organization; “process” can’t be pushed into an organization as easily as Harmon implied, since it’s not a methodology, it’s a pretty fuzzy concept that a lot of consultants like to bandy about.

At the end of the day, I’d have to say that I also disagree with Harmon’s assessment that BPMS is still very early market. Although it’s not a mature market, I think that to call it “very early” is ignoring the many successful products and implementations that have been done in this space over the past several years.

Seeking a BPM definition

The last customer presentation of Day 1 at the Proforma conference was Mary Baumgartner Vellequette from Genentech’s Corporate Treasury division. Through curiosity on both of our parts, Mary and I later toured the show floor of the International Lingerie show that was going on down the hall, although they were in the process of tear-down so we didn’t see as much as we would have liked. 😉

Mary had some great material on establishing BPM programs within an organization, including governance, but the more that I listened to her, the more I realized that we still have a definition gap: her BPM (which does mean business process management) doesn’t really include one of the main foci of my BPM, namely the systems used to help automate business processes. Hers is really about analyzing and modelling the processes, integrating them into an overall architecture, documenting and communicating the processes, business reorganization and other non-automation tasks. Only in her long-term plans does she mention “business process automation” tools.

She does include some BPM measures and direct/indirect benefits in good detail, helpful to anyone who is looking to establish ROI for their BPM project. She also steps through the BPM project process in detail, and discusses change management and how to map process improvements to organizational change.

I am left with the feeling that we still don’t have a comprehensive definition of business process management: although I consider everything that Mary talked about to be part of BPM, I also consider the process automation and BPMS to be a significant part.

Dispensing with the As-Is

Sometimes I don’t pick up a lot from a presentation, or at least I don’t write a lot of notes about what I learned from it. Bill Riordan from HP talked about modelling customer support centre processes, but I came away with only a few points.

First, the term “happy path”, which in process modelling, means the best-case/simplest route through a process map, originated at HP. Who knew.

Second, and likely more relevant to those who aren’t as fascinated by process modelling trivia as I, is that they didn’t do any “as-is” process modelling, only the “to-be” models. That way, they were able to completely bypass the “but that’s the way we do it now” argument for not changing things. That really harkens back to the days of business process reengineering, where everything old was tossed out and the new processes started from a completely clean slate.

Modelling for the masses

Cheryl Hamrick of Tennessee Valley Authority didn’t really tell us about their specific uses of Proforma, but took us through how they spread the use of Proforma through the enterprise. From conversations that I had at the conference, it seems that most organizations are using Proforma within a relatively small group; other people involved in modelling might use Visio or other tools, which then have to be imported or recreated within Proforma in order to include them in the global models. TVA didn’t want to do that, however, so decided to bring modelling to the masses. Cheryl had a fabulous description of why you want to reconsider doing business modelling in the same old way (in this case, “we” is a centralized business analysis group, and “the customer” is the business unit under analysis):

  • We have to learn the customer’s current process from scratch, whether we revise it or replace it
  • Then, the customer has to learn how to interpret the models
  • We can be seen as an intrusive, external critic
  • TVA has hundreds of complex processes, which could take years to model

In other words, it takes a long time and annoys a lot of people. Furthermore, the models are outdated as soon as the analysis team leaves the premises, with no way to feed back changes or collaborate on business models, which impacts business agility.

Their answer to this was to train the business units to use Proforma directly, and it sounds like they’re having some good success with that. The business units actually liked being able to do this themselves, and it allowed the central team to push out enterprise standards for modelling and have the business units do some of the modelling themselves. They’re not trying to turn the business users into analysts, but having their direct involvement makes for happier business units, and more accurate and up-to-date models.

Proforma is releasing a web viewing capability in their Series 6 that should help this out as well, since it allows for viewing and limited updates by anyone with a valid login, without requiring a licence for the web users.

 

N.B.: Windows Live Writer, which I use for creating these posts, has been giving me a lot of grief lately over my spelling of “modelling”. I know that this is beta software, but give me a break and detect that my standard dictionary is Canadian, eh?

Enterprise Architecture in pharmaceuticals

It was Craig Confoy’s presentation on Johnson & Johnson Pharma where I really started to get interested in the issue of where EA sits in the enterprise. Although the “E” in EA stands for “Enterprise”, it seems that most organizations, and J&J is no exception, start out with EA in the IT infrastructure group somewhere. Like many large conglomerates, they had a bit of a mess with five pharmaceutical R&D companies (out of J&J’s 200-odd companies), each with its own IT department supporting 14 different functional units per company, and little alignment between the company functions. Since EA was in IT infrastructure, anything in the business layers of EA, such as business modelling, was done on a project-by-project basis and not shared between business units or companies.

Sound familiar? Almost every large company that I deal with has the same issues: some real architecture going on at the lower infrastructure levels, but practically none at the business levels.

About 5 years ago, J&J Pharma decided to do something about it, and created a business architecture group. There were a few stumbles along the way, such as the use of a (seemingly inappropriate) CASE tool that resulted in business process documentation that stretched over 42 feet at 8pt font — unusable and unsustainable — before they started using Proforma.

One of their models that I really liked was an enterprise data model that could be overlaid with departmental ownership, so that you can easily see which departments would be impacted by changing any part of the model. I think that this is one of the basics required by any large organization, but often not used; instead, companies tend to replicate data on a per-department basis since they don’t have any enterprise data models that would tell them who is using what data.
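The idea behind that overlay can be sketched as a lookup from data entities to the departments that own and consume them, so that a proposed change to an entity immediately surfaces every affected group. A minimal illustration — all of the entity and department names are invented, and a real model would of course live in the EA tool rather than in code:

```python
# Invented enterprise data model fragment: each entity mapped to
# the department that owns it and the departments that consume it.
entity_usage = {
    "Customer": {"owner": "Sales", "consumers": ["Finance", "Support"]},
    "Invoice": {"owner": "Finance", "consumers": ["Sales"]},
    "Shipment": {"owner": "Logistics", "consumers": ["Finance", "Support"]},
}

def impacted_departments(entity):
    """Return every department touched by a change to this entity."""
    usage = entity_usage[entity]
    return sorted({usage["owner"], *usage["consumers"]})

print(impacted_departments("Customer"))  # ['Finance', 'Sales', 'Support']
```

This is also exactly the lookup that’s missing when departments replicate data: without it, nobody can answer “who else uses this?” before making a change.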

This was one customer presentation that showed some clear ROI of using the Proforma tools: they found that systems could be implemented 30% faster (a huge advantage in pharmaceuticals), that the modelling process identifies system integration points and allows them to create standard EAI models for reuse, and that the data models helped meet their regulatory requirements more easily.

Milk, butter and business processes

Mary Berger from Land O’ Lakes kicked off the customer presentations by talking about how they modelled several of their core business processes in spite of the lack of in-house resources, both analysts and SMEs. They backfilled their own resources with some of the Proforma team, then had sufficient success on the first three core processes that they split the efforts and did the next six processes, three product and three office, as two separate streams using Proforma and internal resources.

Mary summarized a number of key success factors that any organization attempting this could take to heart:

  • Do the modelling live during the sessions with the SMEs, rather than taking notes and trying to transcribe them later. This increases the accuracy, since there is immediate feedback on the process model, creates the final documentation as you go along, and forces those who are facilitating and documenting the session to become very familiar with the modelling tool. I can’t tell you how many nights I’ve spent in hotel rooms after a day of customer requirements elicitation sessions, transcribing my notes and trying to recreate every detail mentioned during the day; real-time business modelling is definitely the right way to go, assuming that you have both a facilitator and a scribe/modeller.
  • Use small teams, and involve the right people from the start. Smaller teams just get things done faster and more efficiently, and having everyone on board from the beginning means that you spend less time playing catch-up with those who join later.
  • In workflow models (the most common model type that they used), you can pinpoint the functions of highest risk as those with the most I/O outside their own swimlane. That seems obvious in retrospect, but she highlighted the point well.
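That last heuristic — the riskiest functions are the ones with the most I/O crossing their own swimlane boundary — reduces to a simple count over the flows in a workflow model. A sketch on invented data (the lanes, functions and flows below are mine, not from her presentation):

```python
from collections import Counter

# Each flow is (from_function, from_lane, to_function, to_lane).
# All names are invented for illustration.
flows = [
    ("Take order", "Sales", "Check credit", "Finance"),
    ("Check credit", "Finance", "Take order", "Sales"),
    ("Take order", "Sales", "Pick stock", "Warehouse"),
    ("Pick stock", "Warehouse", "Ship", "Warehouse"),
    ("Ship", "Warehouse", "Invoice", "Finance"),
]

# Count, per function, how many of its inputs/outputs cross lanes.
cross_lane = Counter()
for src, src_lane, dst, dst_lane in flows:
    if src_lane != dst_lane:
        cross_lane[src] += 1  # outgoing cross-lane flow
        cross_lane[dst] += 1  # incoming cross-lane flow

# Rank: most cross-lane I/O first -- the heuristic's "highest risk".
for func, n in cross_lane.most_common():
    print(f"{func}: {n} cross-lane flows")
```

On this toy model, “Take order” tops the list because it hands off to both Finance and Warehouse, which is exactly the kind of function the heuristic flags for extra attention.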

They also found that there were a number of unexpected benefits that came out of the analysis and modelling efforts: a common corporate glossary and vocabulary; documented business procedures for use in training and procedures manuals; a visible link between business requirements and goals; and a set of business rules.

Proforma conference presentations

I totally slacked off after leaving the conference on Thursday afternoon, spending the early evening at the Voodoo Lounge catching the sunset from 51 floors, then hanging around the Masquerade mezzanine watching the Mardi Gras show before turning in early enough to make that 7am flight home on Friday. So here it is, Monday morning, and I’m catching up on a week’s worth of blogging.

This was a relatively small conference, about 150 customers attending, but what an enthusiastic group! When one of the speakers talked about how ARIS had been abandoned on a project because of its complexity, there was clapping from the audience, and I don’t think that all of it came from Proforma employees. There were no breakout sessions, just a main stage, and almost half of the presentations were given over to customer presentations. Not only that, all of them were talking about what they’ve actually done with Proforma’s products, not what they plan to do, so they had some pretty practical advice for the rest of the crowd.

The product presentations from the Proforma people were also pretty interesting, in part because I haven’t worked with the product that much so a lot of it was new to me.

More detail on the individual presentations to follow.

I also had a number of interesting conversations with customers, and I kept coming back to the question of where enterprise architecture fits in their organization. For the most part, companies are keeping it under IT (which I think is a big mistake and posted about previously, not surprisingly when I was reviewing a Proforma webinar), and there still seem to be a lot of conflicts in defining the roles of data, information, business and enterprise architects.