Enterprise Architecture in pharmaceuticals

It was Craig Confoy’s presentation on Johnson & Johnson Pharma that really got me interested in the question of where EA sits in the enterprise. Although the “E” in EA stands for “Enterprise”, most organizations, and J&J is no exception, start out with EA somewhere in the IT infrastructure group. Like many large conglomerates, they had a bit of a mess: five pharmaceutical R&D companies (out of J&J’s 200-odd companies), each with its own IT department supporting 14 different functional units, and little alignment between the functions across companies. Since EA was in IT infrastructure, anything in the business layers of EA, such as business modelling, was done on a project-by-project basis and not shared between business units or companies.

Sound familiar? Almost every large company that I deal with has the same issues: some real architecture going on at the lower infrastructure levels, but practically none at the business levels.

About 5 years ago, J&J Pharma decided to do something about it, and created a business architecture group. There were a few stumbles along the way, such as the use of a (seemingly inappropriate) CASE tool that resulted in business process documentation that stretched over 42 feet at 8pt font — unusable and unsustainable — before they started using Proforma.

One of their models that I really liked was an enterprise data model that could be overlaid with departmental ownership, so that you can easily see which departments would be impacted by a change to any part of the model. I think that this is one of the basics required by any large organization, but it’s often missing; instead, companies tend to replicate data on a per-department basis because they don’t have an enterprise data model to tell them who is using what data.
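To make the idea concrete, here’s a minimal sketch of that kind of ownership overlay: a data model annotated with the departments that use each entity, plus a dependency map, so a proposed change can be traced transitively to every affected department. All entity and department names here are invented for illustration, not taken from the J&J model.

```python
# Hypothetical sketch of an enterprise data model with a departmental
# ownership overlay. Entity and department names are illustrative only.

# entity -> set of departments that own or use it
data_model = {
    "Compound":       {"Discovery", "Regulatory Affairs"},
    "Clinical Trial": {"Clinical Ops", "Regulatory Affairs", "Biostatistics"},
    "Adverse Event":  {"Pharmacovigilance", "Regulatory Affairs"},
    "Site":           {"Clinical Ops"},
}

# entity -> downstream entities that depend on it
dependencies = {
    "Compound":       ["Clinical Trial"],
    "Clinical Trial": ["Adverse Event", "Site"],
}

def impacted_departments(entity, model=data_model, deps=dependencies):
    """Return every department touched by changing `entity`,
    following downstream dependencies transitively."""
    impacted, stack = set(), [entity]
    while stack:
        e = stack.pop()
        impacted |= model.get(e, set())
        stack.extend(deps.get(e, []))
    return impacted

print(sorted(impacted_departments("Compound")))
```

Changing the Compound entity in this toy model ripples through Clinical Trial to Adverse Event and Site, pulling in five departments; that’s exactly the “who do I need to talk to before I change this” question that the overlay answers.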

This was one customer presentation that showed clear ROI from using the Proforma tools: they found that systems could be implemented 30% faster (a huge advantage in pharmaceuticals), that the modelling process identified system integration points and allowed them to create standard EAI models for reuse, and that the data models helped them meet regulatory requirements more easily.

Milk, butter and business processes

Mary Berger from Land O’ Lakes kicked off the customer presentations by talking about how they modelled several of their core business processes in spite of the lack of in-house resources, both analysts and SMEs. They backfilled their own resources with some of the Proforma team, then had sufficient success on the first three core processes that they split the efforts and did the next six processes, three product and three office, as two separate streams using Proforma and internal resources.

Mary summarized a number of key success factors that any organization attempting this could take to heart:

  • Do the modelling live during the sessions with the SMEs, rather than taking notes and trying to transcribe them later. This increases the accuracy, since there is immediate feedback on the process model, creates the final documentation as you go along, and forces those who are facilitating and documenting the session to become very familiar with the modelling tool. I can’t tell you how many nights I’ve spent in hotel rooms after a day of customer requirements elicitation sessions, transcribing my notes and trying to recreate every detail mentioned during the day; real-time business modelling is definitely the right way to go, assuming that you have both a facilitator and a scribe/modeller.
  • Use small teams, and involve the right people from the start. Smaller teams just get things done faster and more efficiently, and having everyone on board from the beginning means that you spend less time playing catch-up with those who join later.
  • In workflow models (the most common model type that they used), you can pinpoint the functions of highest risk as those with the most I/O outside their own swimlane. That seems obvious in retrospect, but she highlighted the point well.
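That last heuristic, flagging the functions with the most I/O crossing swimlane boundaries as the highest-risk ones, is simple enough to automate over any workflow model that records flows and lane assignments. Here’s a toy sketch; the function and lane names are made up, not from the Land O’ Lakes models.

```python
from collections import Counter

# Toy sketch of the "most cross-swimlane I/O = highest risk" heuristic.
# Function and lane names are invented for illustration.

lane_of = {
    "Take Order":   "Sales",
    "Check Credit": "Finance",
    "Pick Stock":   "Warehouse",
    "Ship Order":   "Warehouse",
    "Invoice":      "Finance",
}

# directed flows between functions in the workflow model
flows = [
    ("Take Order", "Check Credit"),
    ("Check Credit", "Pick Stock"),
    ("Pick Stock", "Ship Order"),
    ("Ship Order", "Invoice"),
    ("Invoice", "Take Order"),
]

def cross_lane_io(flows, lane_of):
    """Count, per function, the flows that cross into another swimlane."""
    counts = Counter()
    for src, dst in flows:
        if lane_of[src] != lane_of[dst]:
            counts[src] += 1   # outbound crossing
            counts[dst] += 1   # inbound crossing
    return counts

for func, n in cross_lane_io(flows, lane_of).most_common():
    print(f"{func}: {n} cross-lane flows")
```

In this example, Check Credit, Take Order and Invoice each have two cross-lane flows and rank above the warehouse functions, whose hand-offs mostly stay within their own lane.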

They also found that there were a number of unexpected benefits that came out of the analysis and modelling efforts: a common corporate glossary and vocabulary; documented business procedures for use in training and procedures manuals; a visible link between business requirements and goals; and a set of business rules.

Proforma conference presentations

I totally slacked off after leaving the conference on Thursday afternoon, spending the early evening at the Voodoo Lounge catching the sunset from 51 floors, then hanging around the Masquerade mezzanine watching the Mardi Gras show before turning in early enough to make that 7am flight home on Friday. So here it is, Monday morning, and I’m catching up on a week’s worth of blogging.

This was a relatively small conference, about 150 customers attending, but what an enthusiastic group! When one of the speakers talked about how ARIS had been abandoned on a project because of its complexity, there was clapping from the audience, and I don’t think that all of it came from Proforma employees. There were no breakout sessions, just a main stage, and almost half of the presentations were given over to customer presentations. Not only that, all of them were talking about what they’d actually done with Proforma’s products, not what they planned to do, so they had some pretty practical advice for the rest of the crowd.

The product presentations from the Proforma people were also pretty interesting, in part because I haven’t worked with the product that much so a lot of it was new to me.

More detail on the individual presentations to follow.

I also had a number of interesting conversations with customers, and I kept driving at the question of where enterprise architecture fits in their organizations. For the most part, companies are keeping it under IT (which I think is a big mistake, and posted about previously, not surprisingly when I was reviewing a Proforma webinar), and there still seem to be a lot of conflicts in defining the roles of data, information, business and enterprise architects.

Proforma conference Day 1 quick look

There’s wifi in the conference room, but you have to sign up at the business centre for it ahead of time, which was just too much logistics for me to blog live. However, it’s 5am on Day 2 and my brain is still on Eastern time, so time for a few updates. I’ll do a more complete review of the sessions after it’s all over. First, let’s start with the other conferences that were running in the same conference centre, which you can see in the photo on the left.

Best quote of the conference so far: “I can DODAF FEMA!”, from Tony Devino, an engineer with the Navy, in his presentation about creating a process for quality control on temporary housing installations in New Orleans following Katrina. First time that I’ve heard “DODAF” used as a verb, and a bit funny (well, to EA geeks), especially when you consider that they use DODAF for weapons systems.

Best dance (not usually a category that I assign at conferences): Judson Laipply, a motivational speaker who gave a keynote, also happens to be the originator of the Evolution of Dance, the most-viewed clip ever on YouTube. He talked about change, which is the theme of the conference, then did a live, extended-play version of the Evolution of Dance for us at the end of his talk. I really would have hated to follow him on stage as a speaker!

Dr. Geary Rummler spoke at the afternoon keynote (yes, that Rummler), which was pretty exciting for those of us who have been around in process modelling and management long enough to have a view of his part in its history.

There was a bit of discussion about Proforma’s leading position in the new Forrester report, which is an amazing coup for Proforma when they’re up against a company that’s many times their size.

I’m left with a great impression of Proforma as a company. Although considerably smaller than IDS Scheer, their major competitor, they have an enthusiastic customer base, judging by both the customer presenters and the attendees who I’ve met, and a really nice corporate culture. I sat at the dinner last night with Dave Ritter, one of the founders and currently VP of Enterprise Solutions; we had a lengthy chat before we realized that we had (sort of) met on a Proforma webinar where he spoke several months back, and in some follow-up emails to that webinar. Michelle Bretscher, their PR Director, has given me completely red-carpet treatment, offering to set up meetings with any of the executives, and making sure that I have whatever I need. I don’t think that a lot of press shows up to their user conferences, but when you’re a one-person consultant/analyst/blogger organization, it’s nice to be treated with that level of respect, something that larger vendors could take a lesson from. I also had the most pleasant surprise when I turned to page 6 of the program and saw the watermarked graphic behind the print.

Sessions today include a lot of material from Proforma on their upcoming Series 6, and I’m very eager to hear about their advances in zero-footprint clients and other Web 2.0-like features, considering my recent focus on Web 2.0 and BPM.

Office 2.0 no, Vision 2006 yes

This past weekend was Canadian Thanksgiving, so I was off for four days at the cottage. Now, I’m blogging in a hurry while I’m waiting for my airport taxi to arrive. However, I’m not headed for San Francisco; in spite of the hoopla about the Office 2.0 conference this week, I’ve decided not to attend in favour of going to Proforma’s Vision 2006 conference in Las Vegas. Ismael belatedly offered me a speaking spot at Office 2.0 on a technical panel, but it didn’t really fit what I felt that I had to offer and I declined. I probably would have attended anyway, just to float in the buzz, and I do like San Francisco a whole lot more than Vegas, but Vision 2006 is much more aligned with what I do and write about.

I haven’t been a big user of ProVision in the past, although I think that it’s a great product. There’s much more importance being placed on process modelling and enterprise architecture in my consulting practice these days, and the conference has a great lineup of BPM speakers.

I’ll be blogging from the conference, assuming that there’s any sort of decent connectivity. The hotel information said that they had dialup internet in the rooms (eeek!), so if that’s all that’s available, I’ll be hunting around for an internet cafe close by.

Although I won’t be at Office 2.0, I have contributed a podcast to the Office 2.0 Podcast Jam about Web 2.0 and BPM — a topic that I spoke about recently at the BPMG conference in London. Subscribe to the Jam’s podcast feed and listen to all the podcasts; there are some great ones being published all week.

Grow up, guys

I just finished moderating today’s Gartner/Appian webinar on ebizQ, which means that I did the intro at the beginning, then moderated the Q&A at the end. On ebizQ webinars, audience members can ask questions through a typed chat window, and then I (as the moderator) can review them and pick out ones to ask. I also throw in a few of my own, especially if the questions are a bit slow coming from the audience.

Today, we had a sophomoric jerk from another BPM vendor decide to pollute the Q&A with a bunch of really stupid questions that had nothing to do with the webinar content, but were just personal jabs at Appian. I’m not revealing the name of the vendor because I don’t think that they deserve the publicity, but to the person in question: you have to realize that you acted like a complete moron, and my respect for your company just dropped through the floor. Maybe my opinion doesn’t mean much to you, but keep in mind that Jim Sinur from Gartner was also a speaker on the call, and he could see all of the questions that came up, and who asked them.

Gartner/Appian BPM webinar today

I’m moderating a webinar today at 2pm (Eastern), Driving the BPM Initiative, featuring Jim Sinur of Gartner and Steve Seese of Appian. Although I don’t know Steve, I’ve heard Jim speak many times, both online and at conferences, and he synthesizes some great recommendations about BPM projects based on the Gartner research. Click here to sign up, and see you online.

Extra chunky user experience

I was on the treadmill this morning with Malcolm Gladwell. Actually, he was on my iPod, and I was watching his talk from TED 2004, which was posted recently on the TEDTalks site (not sure if the timing is correct — although the website claims that this was recorded in February 2004, Gladwell mentions his book Blink, which wasn’t published until January 2005).

The topic of his talk was spaghetti sauce, and although the words “long tail” are never mentioned, the story that he tells is about, to some degree, the lengthening of the spaghetti sauce tail: how a single style of tomato sauce in the ’70s became the myriad styles that you find on your supermarket shelf today. This came about not by asking people what style of sauce they wanted, since many had only ever been exposed to one type of sauce, but by offering them a huge variety of sauce styles and plotting the clusters of preferences, which turned out to be plain, spicy, and extra chunky. This in turn led to the explosion of styles of vinegars, mustards, coffees and many other food products, as the food industry learned a couple of valuable lessons:
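The offer-many-variants-then-cluster approach that Gladwell describes is really just clustering taste data. As a toy illustration (the data, the two preference dimensions, and the three-cluster count are all invented here), a plain k-means pass over simulated taster preferences recovers exactly the kind of plain/spicy/extra-chunky groupings he talks about:

```python
import random

# Toy sketch of "offer many variants, then cluster the preferences".
# The taster data and preference dimensions are simulated, not real.

random.seed(1)

# each point: (preferred spiciness, preferred chunkiness), on a 0-10 scale
tasters = (
    [(random.gauss(2, 0.5), random.gauss(2, 0.5)) for _ in range(20)]   # "plain"
  + [(random.gauss(8, 0.5), random.gauss(2, 0.5)) for _ in range(20)]   # "spicy"
  + [(random.gauss(3, 0.5), random.gauss(8, 0.5)) for _ in range(20)]   # "chunky"
)

def kmeans(points, k, iters=20):
    """Basic k-means: assign each point to its nearest centroid,
    then recompute centroids, for a fixed number of iterations."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2
                                + (p[1] - centroids[i][1]) ** 2)
            clusters[i].append(p)
        centroids = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

for cx, cy in sorted(kmeans(tasters, 3)):
    print(f"cluster centre: spiciness={cx:.1f}, chunkiness={cy:.1f}")
```

The point of the exercise is the one Gladwell makes: nobody in the simulated panel asked for “extra chunky”, but the cluster centres fall out of the preference data anyway.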

Lesson #1: The way to find out what people want is not to ask them, since we can’t always explain what we want, especially if we are completely unaware of what alternatives are possible.

Lesson #2: Different styles of products are not better or worse, just different. This democratized what might previously have been considered a hierarchy of product styles.

Although Gladwell’s discussion about sauces convinced me that he has never had a “culturally authentic” Italian pasta sauce (my favourite is a pureed tomato and red pepper sauce that I learned to make at cooking school in Tuscany), he makes an excellent point about how the food industry was trying to treat us all the same, when we really wanted variability. As he summed up, “in embracing the diversity of human beings, we will find a sure way to true happiness.”

All that I could think of as I listened was that the lessons learned by the food industry could be well applied in software design (you knew that I’d be getting around to something relevant soon). One of the major causes of bad system implementations is to allow the users to design the system based on their current knowledge, through a misguided notion of what a JAD session is supposed to achieve. I’ve had this experience many times over when introducing new technology such as BPM into an enterprise: the business users have no idea what such technology can do, since they’ve never used it or probably even seen it before, so it’s foolish to expect that they are going to be able to tell you what they want the new system to do. Instead, they need to be presented with a vision of the future, or rather, several alternative visions, and be allowed to get their heads around what’s possible in the new world that includes that technology. I’m not suggesting that the technology should reshape the business process to fit its limitations, but that the business processes can radically change — for the better — with the advent of the technology, if they are allowed to do so.

The lesson about variability is a good one to take to heart as well. Many implementation “experts” have a set of templates that they seek to apply to every implementation in which they are involved; this maximizes their profitability since they don’t need to do much original design work for each project, but ultimately leads to unhappy users since it’s not so easy — or smart — to make one organization’s processes identical to another’s. The diversity of processes both within and across organizations is part of what makes an organization unique, and their ability to create that diversity easily while maintaining a common business goal is what makes for an agile company.

Appian webinar on October 4th

Next week (can it really be the last week of September already?), I’m moderating a webinar on ebizQ called Driving the BPM Initiative, featuring Jim Sinur of Gartner and Steve Seese of Appian. ebizQ’s description of the event:

Learn the formula for successful BPM implementation. Distinguished Gartner analyst Jim Sinur will discuss how IT can work with business users and C-level executives to ensure a successful BPM engagement, from picking the right BPM project and selecting the best BPM suite to getting buy-in and facilitating change within the organization. Steve Seese of Appian will then share his experiences and discuss the lessons he has learned in the field over the past 25 years.

What you will learn:

  • What criteria should you use to pick the right BPM project and suite?
  • How can you get buy-in and facilitate change within the organization?
  • What BPM project management tactics have worked in the field?

As the moderator rather than a speaker, I won’t get much of a chance to put in my $0.02 about this topic, but it will be interesting to compare my experiences at BPM implementations over the years with theirs.

Funny, I just went back to an earlier post of mine in order to scoop up the link to Sinur’s profile page, and found my commentary on his talk at the Gartner BPM conference in March. Very interesting to re-read, especially in light of the IBM-FileNet deal that is going to be closing quite soon — my comment back in March was “which BPM vendor will Oracle buy”, and I sure thought that it was going to be FileNet.