Gadget week #1: HP/Compaq tc4200 convertible laptop/tablet

I adore this machine. I come from the good old days of presentations done on overhead projectors with transparencies, and there was never a time when my transparencies weren’t covered with ink by the time that I finished a presentation. Then, along came PowerPoint and I was forced to gesture wildly at the screen instead, which is amazingly unsatisfactory. The tc4200 works like a regular laptop, but the screen swivels around and folds down flat, covering the keyboard, allowing me to write directly on the screen with the provided stylus. Now, I do my presentations on the tablet and am able to write and highlight all over the slides again. I’ve used this for several days of training plus some casual presentations that I’ve done in the past few months, and it works like a charm. I’ve had several favourable comments and envious glances about it from the attendees, as well. It’s also great for curling up in an easy chair and poking through my feeds in the web version of Bloglines, which is mostly a point-and-click activity that can easily be done with the stylus. The handwriting recognition is quite good, although my typing speed is fast enough that I don’t use that a lot — I’d rather convert back to keyboard mode and do 60 wpm.

The down side: it runs XP, which is a pig compared to Windows 2000, my previous operating system. On my old machine, which was less than half the speed, the same amount of memory (0.5GB) was plenty for Windows 2000, but I had to drop another 1GB into this for a total of 1.5GB before XP started behaving tolerably when I’m running multiple applications simultaneously. As far as I know, Windows 2000 doesn’t support tablets so I’m stuck with XP, and now I’m mostly used to the user interface, so I guess that I’ll just have to live with the crappy multitasking.

Also on the down side is the lack of a firewire port, and I haven’t been able to get my PCMCIA firewire card working properly with it: my Canon Elura 50 digital camcorder is not recognized, although the same card and camcorder work fine together on other machines. HP support claims that no one has tested a firewire card on this machine, so they can’t even recommend a different card to buy. I suspect that it has something to do with power to the PCMCIA slot, and that an externally-powered card might do the trick, which will require making a trip to a computer store with my laptop and camera to try said card before I buy. Troubleshooting three devices from different manufacturers is always a hassle, since none of the vendors will provide anything that resembles technical support.

My own personal gadget week

In the spirit of blogging about something a bit lighter over the holiday week, I’m going to talk about my latest gadgets and their quirks next week, possibly sprinkled with some of my usual BPM-related fare. Due to some long-overdue upgrades, the past few months have been one big gadget-buying spree for me, so there’s lots to talk about.

Also check out my new Squidoo BPM lens, and sign yourself up at my Frappr map.

iPod in the Great White North

Completely unrelated to BPM, EA, or anything else that I usually blog about, I have a great idea for one of those iPod silhouette ads for Canadian viewers (readers from Michigan, Wisconsin and Switzerland will also appreciate it). This came to me last week after I received a video iPod as an early Christmas gift, then had to travel halfway across the city to see a client on the day of a 3-4″ snowfall. Picture a long parka with a snorkel hood (something like this, if you’re having trouble imagining it), with the white headset cables emerging from inside the hood, then trailing into an inside pocket of the coat where the iPod is, of course, out of sight so that it doesn’t freeze over. By now, you shouldn’t have to see the actual iPod to know what’s connected to those white cables.

This would be funnier if it weren’t quite so true.

Face-to-face bloggers

I had the unique experience of meeting two other BPM bloggers this week for the first time, one unexpected and one planned.

As I mentioned in a previous post, I was in Dallas to deliver my Making BPM Mean Business course for Imagine Solutions, and was pleasantly surprised when George Dearing of the ECM Blog walked into my classroom at the end of the first day to introduce himself. I link to his blog on my blogroll, so I knew that he works for Imagine, but I had totally forgotten about that in all of the activities around preparing for the course. He read my blog post from Sunday night (in which I didn’t mention that Imagine was my client), thought it was too much of a coincidence that I was in Dallas, and went scouting around his office to find out if I was there.

On a more planned note, I met up with Ethan Johnson of The Vision Thing for dinner; Ethan and I have been trading blog comments for months, and he interviewed me for one of his Sound of Vision podcasts, so when I knew that I was coming to Dallas, we made the arrangements to connect. I didn’t wear my Process For The People tank top (too cold!), but we still managed to identify each other. We had our own little geek dinner, and as he pointed out, it was unusual because we had equal gender representation, which likely doesn’t happen much at geek dinners. I also noted that we had equal representation from Canada and the U.S.: sort of a NAFTA geek dinner, if you will.

Great to finally meet both of you guys face to face!

O (Canada)

Although I’m based in Toronto, many of my clients are elsewhere, and the past year I’ve seen mostly American clients. For those of you who aren’t familiar with the significant cultural differences across the N-S divide, I won’t bore you with the details, but you’re likely aware that we talk different. There are different expressions (such as the American use of “uh-huh” as a replacement for “you’re welcome”, and the quintessential Canadian “eh?”), but I tend to notice pronunciation. All of the Americans reading this probably flashed immediately to “oot and aboot”, but my focus, as usual, is on process.

Today, in a meeting of about 15 people at a client (in Toronto), I heard — about 1000 times, considering the subject matter — the Canadian “PRO-cess” rather than the American “PRAW-cess”. Music to my ears! 🙂

Service-Oriented Business Architecture

I’ve been doing quite a bit of enterprise architecture work lately for a couple of clients, which has me thinking about how to “package” business processes as “services” for reusability: a service-oriented business architecture (SOBA), if you will. (I have no idea if anyone else has used that term before, but it fits in describing the componentization and reuse of various functions and processes within an organization, regardless of whether or not the processes are assisted by information systems.)

When we think about SOA, we think about automated processes and services: web services that can be called, or orchestrated, to execute a specific function such as mapping a set of input data to output data. SOBA, however, is for all those unautomated or semi-automated processes (what someone in a client IT department once referred to as “human-interrupted” processes) that may be reused, such as a credit adjudication process that requires human intervention. In many large organizations, the same (or very similar) processes are done by different groups of people in different departments, and if they’re not modeling some of this via enterprise architecture, then they likely have no idea that the redundancy even exists. There are exceptions to this, usually in very paper-intensive processes; most organizations, for example, have some sort of centralized mail room and some sort of centralized filing, although there will be pockets of redundancy even in such a structure.
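The redundancy-spotting idea above can be sketched in a few lines of code. This is purely an illustrative sketch, not any real tool; the capability names, departments, and the `register` helper are all hypothetical, but they show how cataloguing business "services" (whether automated or human-performed) makes duplicated processes across departments visible:

```python
from collections import defaultdict

# capability -> list of departments that perform it
registry = defaultdict(list)

def register(capability, department, automated=False):
    """Record that a department performs a given business capability."""
    registry[capability].append({"department": department, "automated": automated})

# Hypothetical entries: two lending groups each run their own credit adjudication.
register("credit adjudication", "Retail Lending")
register("credit adjudication", "Commercial Lending")
register("address change", "Customer Service", automated=True)

# Any capability performed in more than one place is a consolidation candidate.
redundant = {cap for cap, users in registry.items() if len(users) > 1}
```

Without some form of enterprise-wide catalogue like this, each department only sees its own entry, and the overlap never surfaces.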

From a Zachman framework standpoint, most web services are modeled at row 4 (technology model) of column 2 (function), whereas business “services” are modeled at row 2 (business model) of column 2. If you’ve spent some time with Zachman, you know that the lower (higher-numbered) rows are not just more descriptive versions of the upper rows; the rows describe fundamentally different perspectives on the enterprise, and often contain models that are unique to that particular row.

In talking about enterprise architecture, I often refer to business function reusability as a key benefit, but most people think purely about IT functions when they think about reusability, and overlook the benefits that could arise from reusing business processes. What’s required to get people thinking about reusing business processes, then? One thing for certain is a common process modeling language, as I discussed here, but there’s more to it than that. There needs to be some recognition of business functions and processes as enterprise assets, not just departmental assets. For quite a while now, information systems and even data have been recognized as belonging to the enterprise rather than a specific department, even if they primarily serve one department, but the same is not true of the human-facing processes around them: most departments think of their business processes as belonging to them, and have no concept of either sharing them with other departments or looking for ways to reduce the redundancy of similar business functions around the enterprise.

These ideas kicked into gear back in the summer when I read Tom Davenport’s HBR article on the commoditization of processes, and gained strength in the past few weeks as I contemplate enterprise architecture. His article focused mainly on how processes could be outsourced once they’re standardized, but I have a slightly different take on it: if processes within an organization are modeled and standardized, there’s a huge opportunity to identify the redundant business processes across an organization within the context of an enterprise architecture, consolidate the functionality into a single business “service”, then enable that service for identification and reuse where appropriate. Sure, some of these business functions may end up being outsourced, but many more may end up being turned into highly-efficient business services within the organization.

There’s common ground with some older (and slightly tarnished) techniques such as reengineering, but I believe that creating business services through enterprise architecture is ultimately a much more powerful concept.

More on the Proforma webinar

I found an answer to EA wanna be!’s comment on my post about the Proforma EA webinar last week: David Ritter responded that the webinar was not recorded, but he’ll be presenting the same webinar again on December 9th at 2pm Eastern. You can sign up for it here. He also said that he’s reworking the material and will be doing a version in January that will be recorded, so if you miss it on the 9th you can still catch it then or (presumably) watch the recorded version on their site.

There’s a couple of other interesting-looking webinars that they’re offering; I’ve signed up for “Accelerated Process Improvement” on December 8th.

Through a fog of BPM standards

If you’re still confused about BPM standards, this article by Bruce Silver at BPMInstitute.org may not help much, but it’s a start at understanding both modelling and execution languages including BPMN, UML, XPDL, BPEL and how they all fit together (or don’t fit together, in most cases). I’m not sure of the age of the article since it predates the OMG-BPMI merger that happened a few months ago, but I just saw it referenced on David Ogren’s BPM Blog and it caught my attention. David’s post is worth reading as a summary but may be influenced by his employer’s (Fuego’s) product, especially his negative comments on BPEL.

A second standards-related article of interest appeared on BPTrends last week authored by Paul Harmon. Harmon’s premise is that organizations can’t be process-oriented until managers visualize their business processes as process diagrams — something like not being able to be truly fluent in a spoken language until you think in that language — and that a common process modelling notation (like BPMN) must be widely known in order to foster communication via that notation.

That idea has a lot of merit; he uses the example of a common financial language (such as “balance sheet”), but it made me think about project management notation. I’m the last person in the world to be managing a project (I like to do the creative design and architecture stuff, not the managing of project schedules), but I learned critical path methods and notation — including hand calculations of such — back in university, and those same terms and techniques are now manifested in popular products such as MS-Project. Without these common terms (such as “critical path”) and the visual notation made popular by MS-Project, project management would be in a much bigger mess than it is today.
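Those hand calculations from university are easy to sketch: a forward pass computes each task's earliest finish, a backward pass computes its latest finish, and tasks with zero slack form the critical path. The tasks and durations here are made up for illustration:

```python
# Illustrative project plan: duration in days, plus predecessor tasks.
tasks = {
    "design":  {"duration": 3, "deps": []},
    "build":   {"duration": 5, "deps": ["design"]},
    "test":    {"duration": 2, "deps": ["build"]},
    "docs":    {"duration": 2, "deps": ["design"]},
    "release": {"duration": 1, "deps": ["test", "docs"]},
}

def critical_path(tasks):
    # Forward pass: earliest finish time for each task.
    earliest = {}
    def earliest_finish(name):
        if name not in earliest:
            t = tasks[name]
            start = max((earliest_finish(d) for d in t["deps"]), default=0)
            earliest[name] = start + t["duration"]
        return earliest[name]
    for name in tasks:
        earliest_finish(name)
    project_length = max(earliest.values())

    # Backward pass: latest finish that doesn't delay the project.
    latest = {name: project_length for name in tasks}
    for name in sorted(tasks, key=earliest.get, reverse=True):
        for d in tasks[name]["deps"]:
            latest[d] = min(latest[d], latest[name] - tasks[name]["duration"])

    # Zero-slack tasks are on the critical path.
    path = [n for n in tasks if latest[n] - earliest[n] == 0]
    return path, project_length

path, length = critical_path(tasks)
# "docs" has slack (it can slip without delaying the release), so it's
# off the critical path; everything else is on it.
```

This is exactly what MS-Project does under the hood when it highlights the critical path in red.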

The related effect in the world of BPM is that the sooner we all start speaking the same language (BPMN), the sooner we start being able to model our processes in a consistent fashion that’s understood by all, and therefore the sooner that we all start thinking in BPMN instead of some ad hoc graphical notation (or even worse, a purely text description of our processes). There’s a number of modelling tools, as well as the designer modules within various BPMS, that allow you to model in BPMN these days; there’s even templates that you can find online for Visio if you’re not ready for a full repository-based modelling environment. No more excuses.

Proforma Enterprise Architecture webinar

I’ve just finished viewing a webinar put on by Proforma that talks about building, using and managing an enterprise architecture, featuring David Ritter, Proforma’s VP of Enterprise Solutions. He came out of the EA group at United Airlines so really knows how this stuff works, which is a nice change from the usual vendor webinars where they need to bring in an outside expert to lend some real-world credibility to their message. He spent a full 20 minutes up front giving an excellent background of EA before moving on to their ProVision product, then walked through a number of their different models that are used for modelling strategic direction, business architecture, system (application and data) architecture and technology architecture. More importantly, he showed how the EA artifacts (objects or models) are linked together, and how they interact: how a workflow model links to a data model and a network model, for example. He also went through an EA benefits model based on some work by Mary Knox at Gartner, showing where the different types of architecture fit on the benefits map:

After the initial 30 minutes of “what is EA” and “what is ProVision”, he dug into a more interesting topic: how to use and manage EA within your organization. I loved one diagram that he showed about where EA governance belongs:

This reinforces what I’ve been telling people about how EA isn’t the same as IT architecture, and it can’t be “owned” by IT. He also showed the results of a survey by the Institute for Enterprise Architecture Developments, which indicates that the two biggest reasons why organizations are implementing EA are business-IT alignment (21%) and business change (17%): business reasons, not IT, are driving EA today. Even Gartner Group, following their ingestion of META Group and their robust EA practice earlier this year, has a Golden Rule of the New Enterprise Architecture that reflects this — “always make technology decisions based on business principles” — and goes on to state that by 2010, companies that have not aligned their technology with their business strategy will no longer be competitive.

Some of this information is available on the Proforma website as white papers (such as the benefits map), and some is from analyst reports. With any luck, the webinar will be available for replay soon.