Easy enough for Mom

There’s been a bit of a backlash lately about saying that some new technology is “easy enough for my mom to use” as if it denigrates women. However, when I use that phrase, I mean it quite literally: my mom turned 83 today, and for the past 15 years or so, I’ve been introducing her (and my dad) to more technology than they ever imagined possible, to the point where email and the internet are a daily part of their lives. At Christmas, she overheard me talking about my blog, and she asked what a blog was. I sent her a link to Steve Garfield’s 80-year-old mother’s blog as an example, and two weeks later she sent me the inevitable email:

As you know my computer skills are not too good but thought that learning how to blog might be fun. Can you send something about how to do this?

Today, she blogged about turning 83, the problems with their local hospital’s IVR system, and a variety of other topics. If you have a minute, pop over there and add a “Happy Birthday” comment; just say that you know me and heard that it was her birthday, so that she’s not wondering why strangers are sending her email — her comments are auto-emailed to her, and she’s still a bit confused about that particular piece of the technology.

Update: tell her where you’re commenting from; she still can’t believe that anyone outside her neighbourhood reads her blog.

SOA 2-point-uh-oh

The first bit of David Linthicum’s podcast today covers the SOA 2.0 naming nonsense, calling it “disingenuous” and a “land grab”, and pointing out that it will cause more confusion. Yesterday, I linked to Macehiter Ward-Dutton’s Stop the Madness petition to protest the use of the term SOA 2.0. Since then, Loek Bakker and I have been having a conversation about it.

To quote a comment that I put on Loek’s post, I’m assuming that the two Neils are being a bit tongue-in-cheek with the petition, and are using it more to raise awareness of the silliness of versioning a concept. Regardless, head over there and sign it as a strike against meaningless marketing-enabled terminology worldwide!

Update: I forgot to link to James Governor’s post, which links to several others that share this opinion on SOA 2.0, and on to their links, and so on.

SOA in OMG newsletter

The Spring OMG newsletter is available online (direct link to PDF) with a 2-page article “OMG and Service-Oriented Architecture”:

In essence, SOA is an architectural approach that seeks to align business processes with service protocols and the underlying software components and legacy applications that implement them.

So far, so good. Then they go on to say:

Both processes and services need to be carefully coordinated to assure an effective SOA implementation. You can’t really do SOA without a clear model of the business process to be supported.

Not sure that I fully agree with that: you have to have a clear model of your business process before you can implement SOA? Aren’t the underlying services supposed to be reusable even if the business process changes? Isn’t that really the whole point of SOA?

And you can’t link your business processes to your service models without the modeling standards the OMG is developing as part of its Model Driven Architecture® (MDA®).

Oh, I get it now.

They do include a nice diagram showing where the OMG standards fit in one representation of an SOA environment (see the newsletter for the full-size version). You can see where BPMN, BPDM and BPEL fit in, which I talked about in my posts from the BPM Think Tank last week, plus other standards such as SBVR (Semantics of Business Vocabulary and Rules) for business rules.

I also like that they’re platform-independent about this, and that they don’t equate SOA with WS-*.

You can check out the newly-formed OMG SIG on SOA if you want to get involved in discussing this MDA approach to SOA.

Irreversibility breeds complexity

This is brilliant: an article by Martin Fowler in IEEE Software magazine from a few years back (via Julian On Software) really nails the issue of agility and complexity by referencing, oddly enough, a speech given by economist Enrico Zaninotto at the XP 2002 conference. Fowler says:

One aspect I found particularly interesting was his [Zaninotto’s] comment that irreversibility was one of the prime drivers of complexity. He saw agile methods, in manufacturing and software development, as a shift that seeks to contain complexity by reducing irreversibility — as opposed to tackling other complexity drivers. I think that one of an architect’s most important tasks is to remove architecture by finding ways to eliminate irreversibility in software designs.

Most of my customers are large financial and insurance organizations that still use very waterfall methods for development. The requirements and functional design take months to develop, and have concrete poured firmly over them as soon as they are complete. In other words, the irreversibility starts at the requirements stage, long before development even starts. Of course, since a technical design follows the requirements stage and in turn is solidified before development begins, the irreversibility is built into this stage as well: any changes have to go back through (potentially) several layers of approval and redesign, which impacts project schedules and contracts.

Fowler referenced an example where a database administrator made it easy to change the database schema and migrate the data for a project; as Fowler put it, “the database schema is no longer architectural” since it could be changed on the fly to accommodate the requirements of the project, rather than being a pre-supposed part of the design.

When we used to do this, it was called “coding by the seat of our pants”; now it’s Agile!

Stop the SOA 2.0 bull

I’m with the Neils, Brenda, Ronan and David on this one: stop the meaningless “numbering” of an architectural philosophy that is being perpetrated primarily by Oracle and Gartner. Sign the SOA 2.0? No thanks petition:

We’ve created this online petition because we’re dumbfounded at the attempt by certain parts of the IT industry to create and give weight to the term “SOA 2.0”.

Industry does not, at this point, need more confusion around SOA. SOA has real value, but industry at large is only just coming to terms with what it means and what it can do. Inventing terms like “SOA 2.0” might help some analysts and vendors make money, but overall, in the long run it damages us all.

Consider that a big vendor and a big analyst started all this SOA 2.0 nonsense, and you can be pretty sure that their motives are not altruistic. And considering the trademark nonsense that just happened over Web 2.0, lawsuits probably aren’t far behind either.

An online petition on its own may not hold much weight, but if you have a blog, then blog about it; if you’re a customer of Oracle or Gartner, let them know what you think. If the market rejects the concept of SOA 2.0 as so much marketing bull, it won’t fly.

BPM Think Tank wrapup

Since I only finished posting about yesterday’s sessions late this morning, I decided to just do a final conference wrapup instead of separate wrapups for yesterday and today.

In general, the BPM Think Tank was great, and I’ll definitely attend again in the future. I learned a lot about some of the standards that I didn’t know much about before (like BPDM), and met some really smart people with lots of opinions on the topic of standards. It’s been so long since I was involved in any sort of standards work (AIIM in the early 90’s, and topographic data interchange formats for the Canadian Council of Surveying and Mapping back in the late 80’s) that I had forgotten about both the frustrations of dealing with standards committees and the excitement of being able to contribute to a little bit of computing history that will make things work better for a lot of people.

I’m still mulling over the XPDL/BPDM conundrum (and, to a lesser extent, BPEL), but the fact that different standards bodies are all here participating is a good indicator that there is the collective will to head off problems like this. At last year’s Think Tank, discussions between BPMI and OMG around the competing graphical process models of BPMN and UML activity diagrams helped lead to the absorption of BPMI into OMG, and the championing of a single standard, BPMN, being put forward by the merged organization. We can only hope that something similar will happen with XPDL and BPDM in order to avoid future problems in the BPMN serialization domain.

I had the chance to meet several people who I had connected with online but never met face-to-face: Dana Morris of OMG, Bruce Silver, John Evdemon (who I’ll be having ongoing discussions with about BPM and Web 2.0) and others. Jeanne Baker, who did such a great job at keeping things moving along during the sessions, even remembered one of my posts from last year about a webinar that she gave on standards — she turned to me at lunch yesterday and asked “Did you write that blog post called ‘Alphabet soup for lunch’?” — proof that people will remember if you mention them in print. I missed other people completely in the crowd (Phil, where were you?).

There were a few logistical problems (conference rooms way too cold, no free wifi, not enough herbal tea, and no free t-shirts with vendor logos, about which I heard a lot of whining when I got home), but these were only minor annoyances in an otherwise well-executed conference with excellent content.

BPM Think Tank Day 3: BPDM technology roundtable

The last of the four roundtables that I attended was on BPDM, led by Fred Cummins. I started with my (by now) usual question about the distinctions and overlap between XPDL and BPDM: his response was that XPDL is an XML specification, and BPDM is a metamodel that can be exported to XML via XMI. He seemed to imply that they could coexist, but given that BPDM will include a serialization specification for BPMN (in addition to other models that can be represented in BPDM), I’m not sure I see the need for both in the standards world. He later stated that there is an expectation that people will model in BPDM (as visualized by BPMN or other visualizations as appropriate) and transform to an execution language such as BPEL, rather than BPDM being an interchange format; this seems to leave no room in the landscape for XPDL if you adopt BPDM, unless you need it as a legacy interchange format.

Moving on to other points about BPDM, it will include both orchestration and choreography (process flow, messages and collaboration), and will include more concepts than can be represented in BPMN, hence will support other views, e.g., process dependency diagrams, roles/organization view, choreography. A draft submission of the standard is due on June 5th, with a rough plan to finalize the underpinnings to provide BPMN support within 3-4 months, although there is no plan to issue a version with just the serialization as a preliminary release. In order to complete the release, they will likely do BPEL export from BPDM and a UML mapping to BPDM in order to demonstrate usability of the standard on a broad enough basis to initiate its acceptance.

When Cummins provided a summary of all of his roundtables at the end of the conference, he pointed out a couple of questions that had arisen during the discussions:

  • Is there a potential for executable BPDM? [I say that if there can be abstract BPEL then why not executable BPDM?]
  • Is there a way to achieve compatibility between XPDL and BPDM? [I think that there better be]

BPM Think Tank Day 3: XPDL technology roundtable

This afternoon, I attended a technology roundtable on XPDL led by Keith Swenson.

Keith went around the table and asked how we (or our customers) are modelling processes now. The biggest faction by far use Visio, but PowerPoint (!), UML activity diagrams (using the IBM/Rational tools) and proprietary/internal tools specific to an industry were also mentioned. For the most part, people are concerned with sharing processes between tools, not between organizations, since most organizations are very protective of their processes. A major issue with most of these tools is round-tripping and process lifecycle issues, since in many cases it’s a one-way trip from the modelling tool to the execution engine. We talked about Byzio, the Zynium add-on to Visio that allows BPMN to be modelled in Visio and provides a mapping from either a BPMN template or any other set of Visio shapes to XPDL. I reviewed Byzio several weeks back, and Keith is quite familiar with the product too.

We discussed how XPDL could be used to aggregate process models from disparate BPMSs that might be in use within the same organization.

In discussing BPEL, Keith felt that XPDL provides all of the support for everything that BPEL can do with respect to the interface to web services; this further pushes the issue that BPEL is not really required if it’s not being used as an execution language and if there is a transformation from XPDL to the specific engine’s execution environment (which implies that the BPMS vendor’s design tool can import the XPDL file).

XPDL provides support for extensions modelled in a BPMS vendor’s design tool that are specific to that engine; these are preserved in XPDL and should not be affected if the XPDL is manipulated by another process design tool. This is critical for supporting round-tripping from a design tool to the BPMS vendor’s engine (via their design tool) and back again, since the design tools should preserve the extensions even if they don’t interpret them. An example of such an extension is assigning colour to swimlanes (which Fujitsu allows in its design tool): the file can be read into a tool that doesn’t interpret the colour information, but when it is saved and read back into the design environment that does support colour, the colour’s there. Vendor extensions such as this may be brought forward at XPDL TC meetings for inclusion in future versions of the standard.
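As a thought experiment, the preservation requirement can be sketched in a few lines: a tool edits only the elements it understands and writes everything else back untouched. The fragment below is simplified and hypothetical (real XPDL uses namespaces and a much richer schema), and the vendor colour attribute just stands in for the Fujitsu swimlane example.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical XPDL-like fragment (namespaces omitted).  The
# <ExtendedAttribute> carries a vendor-specific lane colour: not core process
# logic, but a conforming tool should round-trip it unchanged.
XPDL = """<Package>
  <Pool Id="P1" Name="Orders">
    <Lane Id="L1" Name="Fulfilment">
      <ExtendedAttributes>
        <ExtendedAttribute Name="VendorLaneColor" Value="#FFCC00"/>
      </ExtendedAttributes>
    </Lane>
  </Pool>
</Package>"""

def rename_lane(xpdl_text, lane_id, new_name):
    """Change one element the tool understands; everything else, including
    extensions it doesn't interpret, is carried through to the output."""
    root = ET.fromstring(xpdl_text)
    for lane in root.iter("Lane"):
        if lane.get("Id") == lane_id:
            lane.set("Name", new_name)
    return ET.tostring(root, encoding="unicode")

out = rename_lane(XPDL, "L1", "Shipping")
```

The point of the sketch is that the second tool never needs to know what `VendorLaneColor` means; it only needs to avoid dropping elements it doesn’t recognize.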

The most recent set of major changes to XPDL were BPMN-related enhancements including X-Y coordinates of lines, topology, etc.; however, they forgot to include scale, since some measures are in real-world units (inches/cm) and some are in pixels. This prompted further discussion on the separation of presentation and logic data, since both are included and intermingled in XPDL when it’s used to serialize BPMN, and on whether logic and presentation should be versioned separately, since purely cosmetic changes can be made to presentation without affecting logic. Other presentation-related information includes a “page” indicator, since a process may span multiple pages when visualized.
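One way to make the separate-versioning idea concrete: fingerprint the model with presentation elements stripped out, so that cosmetic edits (moving a shape, changing a colour) don’t register as a new logic version. This is a sketch only, with hypothetical element names rather than real XPDL tags.

```python
import hashlib
import xml.etree.ElementTree as ET

# Hypothetical convention: these tags hold presentation-only data
# (coordinates, colours, page indicators); everything else is process logic.
PRESENTATION_TAGS = {"Graphics", "Coordinates", "Page"}

def logic_fingerprint(xml_text):
    """Hash the model with presentation-only elements removed, so purely
    cosmetic changes don't produce a new logic version."""
    root = ET.fromstring(xml_text)

    def strip(elem):
        for child in list(elem):
            if child.tag in PRESENTATION_TAGS:
                elem.remove(child)
            else:
                strip(child)

    strip(root)
    return hashlib.sha256(ET.tostring(root, encoding="utf-8")).hexdigest()

# Moving the shape changes the presentation but should leave the logic
# fingerprint alone; renaming the activity changes the logic itself.
v1 = '<Activity Id="A1" Name="Ship"><Graphics X="10" Y="20"/></Activity>'
v2 = '<Activity Id="A1" Name="Ship"><Graphics X="99" Y="40"/></Activity>'
v3 = '<Activity Id="A1" Name="Bill"><Graphics X="10" Y="20"/></Activity>'
```

A version-control or governance layer built on this split could then store presentation revisions without triggering the re-approval cycle that a logic change would require.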

We had a lengthy discussion on additional versioning information that could be included in XPDL, and how this ties in with SOA governance initiatives for maintaining the integrity of interfaces and functionality.

I repeated what I said in an earlier post about blaming the large analysts for forcing (sometimes inappropriate) standards by creating RFP checklists that are used (somewhat blindly) by customers — Keith agreed with this view.

We ended up with a bunch of ideas that deserve more thought: Should Java be extended to subsume BPEL functionality? XPDL is graph oriented, and BPEL is block structured; BPEL4People implies that you can extend a block-structured language to represent human-facing process flows, which are inherently graph-oriented. Should BPDM be the metamodel behind XPDL? (This is not a viewpoint endorsed by OMG since XPDL uses some notation not recommended by OMG, and BPDM has a broader scope that includes BPMN serialization.) If XPDL were made MOF-compliant, could it replace the need for BPDM?

BPM Think Tank Day 3: Nancy Craft keynote

Following this morning’s panel, Nancy Craft of Volvo gave a keynote on Process Integration in the Supply Chain. She works for the IT department that supports three different truck brand divisions (Volvo, Mack and Renault), and they initiated a business process innovation project for sharing and optimizing their Order to Delivery processes while still maintaining separate identities for the brands.

They used Proforma’s ProVision as a modelling tool, but found that it was complex and they struggled with the tool especially when they tried to use it interactively during meetings. She recommends having a trained modeller in the room if you’re going to try to do this while gathering the information, and not letting the documentation get behind.

They made use of SCOR from the Supply Chain Council to drive their modelling, identifying 50 best practices as a comparison baseline before the study even started. They modelled their processes to level 3:

  • Level 1 = process types = the scope and content within each business domain
  • Level 2 = process categories = strategy or capability for level 1 process types
  • Level 3 = decomposed processes = process elements layer, used by companies to fine-tune their operations strategy

It appears that SCOR was a big part of their success in these modelling efforts by providing a framework for the information to be captured, standard language, and best practices. I don’t typically work in supply chain or manufacturing, so the SCOR details were new to me, but there are obvious benefits from such a framework in terms of analyzing and optimizing processes. She later highlighted it as a “significant accelerator”.

She covered off their analysis and design techniques, and gave some fascinating insights into how to get people from these three competing brands to collaborate on improving business processes: more than just working with different business cultures between divisions, but the harder task of overcoming the desire for secrecy between competitors.

They’ve also put together a six-year roadmap for improving the Sales to Order, Order to Delivery, and Delivery to Repurchase processes (which is essentially all the processes in the organization), which had a very enterprise architecture-like view of mapping from the strategic direction and business drivers to business processes, then used that to push through to IT requirements. Their initial take on this turned out to be much too complex (what she referred to as a “horrible methodology”), and they ended up with a simpler model to map business objectives through to specific IT application implementation projects. Not quite so EA-like, but at least providing some alignment between business and IT.

The rest of today will be the remaining two roundtables — XPDL and BPDM for me — and ongoing discussions, so the rest of my posts about the conference may be delayed until tomorrow.

BPM Think Tank Day 2: BPEL Technology Roundtable

I finished yesterday afternoon by attending a technology roundtable on BPEL led by John Evdemon. There was a lot of ground covered there that I had heard in his workshop on Day 1 and the panel earlier yesterday that I won’t repeat here, so just a few brief notes.

There are some things that can be described in BPEL that can’t be modelled in BPMN, which I didn’t realize. The example that Evdemon gave was an online order for a book, then a follow-on process kicked off the next day when the customer cancels the order. Although both of the processes can be modelled in BPMN, I think that his point is that the interaction between the processes can’t be modelled there. There are apparently a few use cases like this that are being considered for inclusion in BPMN (if I understood correctly), but I didn’t hear anything about this in the earlier BPMN roundtable. Stephen White’s mappings of BPMN (available on the old BPMI.org site, so I imagine all still available on OMG’s site) have many people thinking that BPMN models a superset of BPEL, which is not strictly true.

Like the BPMN roundtable and some hallway discussions, there were a lot of comments about the linkage between process standards and enterprise architecture.

The issues that we discussed, and the notes that I made from the discussion:

  • BPEL doesn’t provide all the functionality that can be modelled in BPMN.
  • If BPEL isn’t used as an execution language, but just as an import/export language as is done by Microsoft, IBM and others, what value does it add over XPDL (or eventually, BPDM)?
  • Are we eventually going to end up with just BPMN, BPDM (or XPDL, if you believe Bruce), and a vendor-specific execution language in the process chain?

I have some additional research to do, some of which will start in this afternoon’s roundtables on XPDL and BPDM, about whether BPEL does add value over these standards by providing more specific web services information such as endpoints or ports. You can definitely use BPEL as an import/export and exchange format, or to store the representation of a process for future rehydration, but it appears that you could also use XPDL or eventually BPDM to do the same thing and provide a richer interpretation of models created using BPMN.

At the end of the day, when we all reconvened as a group, Evdemon gave a summary of what we discussed:

  • What is the value of BPEL if XPDL is a direct serialization of BPMN? BPEL had a lot of press because of who’s backing it, not necessarily because of its capabilities. (A direct quote from him during the roundtable itself on this subject: “Unless you’re going cross-platform, you may not need BPEL.”)
  • Use BPEL to store current processes to be rehydrated later if needed for audit or other legal and compliance requirements. BPEL is also being used by other standards such as RosettaNet to provide process-related templates for those standards.
  • Process formats may just become different serialization formats with different capabilities, accessible from many tools just like all the document formats that are available if you select File…Save As within Word.