Oracle BEA Strategy Briefing

Not only did Oracle schedule this briefing on Canada Day, the biggest holiday in Canada, but they forced me to download the RealPlayer plug-in in order to participate. The good part, however, is that it was full streaming audio and video alongside the slides.

Charles Phillips, Oracle President, kicked off with a welcome and some background on Oracle, including their focus on database, middleware and applications, and how middleware is the fastest-growing of these three product pillars. He described how Oracle Fusion middleware is used both by their own applications as well as ISVs and customers implementing their own SOA initiatives.

He outlined their rationale for acquiring BEA: complementary products and architecture, internal expertise, strategic markets such as Asia, and the partner and channel ecosystem. He stated that they will continue to support BEA products under the existing support lifetimes, with no forced migration policies to move off of BEA platforms. They now consider themselves #1 in the middleware market in terms of both size and technology leadership, and Phillips gave a gentle slam to IBM for over-inflating their middleware market size by including everything but the kitchen sink in what they consider to be middleware.

The BEA developer and architect online communities will be merged into the Oracle Technology Network: Dev2Dev will be merged into the Oracle Java Developer community, and Arch2Arch will be broadened to the Oracle community.

They retained all of the BEA development centers and now have 4,500 middleware developers; most BEA sales, consulting and support staff were also retained and integrated into the Fusion middleware teams.

Next up was Thomas Kurian, SVP of Product Development, speaking on Fusion Middleware and BEA product directions with a more detailed view of the Oracle middleware products and strategy. Their basic philosophy for middleware is that it’s a unified suite rather than a collection of disjoint products, it’s modular from a purchasing and deployment standpoint, and it’s standards-based and open. He then talked about the applications enabled by their products, which unify SOA, process management, business intelligence, content management and Enterprise 2.0.

They’ve grouped the middleware products into three categories on their product roadmap (which I have reproduced here directly from Kurian’s slide):

  • Strategic products
    • BEA products being adopted immediately with limited re-design into Oracle Fusion middleware
    • No corresponding Oracle products exist in majority of cases
    • Corresponding Oracle products converge with BEA products with rapid integration over 12-18 months
  • Continue and converge products
    • BEA products being incrementally re-designed to integrate with Oracle Fusion middleware
    • Gradual integration with existing Oracle Fusion middleware technology to broaden features with automated upgrades
    • Continue development and maintenance for at least 9 years
  • Maintenance products
    • Products that BEA had already end-of-life’d due to limited adoption prior to the Oracle M&A
    • Continued maintenance with appropriate fixes for 5 years

The “continue and converge” category is, of course, a bit different from “no forced migration”, but this is to be expected. My issue is with the overlap between the “strategic” category, which can include a convergence of an Oracle and a BEA product, and the “continue and converge” category, which includes products that will be converged into another product: when is a converged product considered “strategic” rather than “continue and converge”? Or is this just the spin they’re putting on things so as to not freak out BEA customers who have invested heavily in a BEA product that is going to be converged into an existing Oracle product?

He went on to discuss how each individual Oracle and BEA product would be handled under this categorization. I’ve skipped the parts on development tools, transaction processing, identity management, systems management and service delivery, and gone right to their plans for the Service-Oriented Architecture products:

Oracle SOA product strategy

  • Strategic:
    • Oracle Data Integrator for data integration and batch ETL
    • Oracle Service Bus, which unifies AquaLogic Service Bus and Oracle Enterprise Service Bus
    • Oracle BPEL Process Manager for service orchestration and composite application infrastructure
    • Oracle Complex Event Processor for in-memory event computation, integrated with WebLogic Event Server
    • Oracle Business Activity Monitoring for dashboards to monitor business events and business process KPIs
  • Continue and converge:
    • BEA WL-Integration will be converged with the Oracle BPEL Process Manager
  • Maintenance:
    • BEA Cyclone
    • BEA RFID Server

Note that the Oracle Service Bus is in the “strategic” category, but is a convergence of AL-SB and Oracle ESB, which means that customers of one of those two products (or maybe both) are not going to be happy.

Kurian stated that Oracle sees four types of business processes — system-centric, human-centric, document-centric and decision-centric (which match the Forrester divisions) — but believes that a single product/engine that can handle all of these is the way to go, since few processes fall purely into one of these four categories. They support BPEL for service orchestration and BPMN for modeling, and their plan is to converge on a single platform that supports both BPEL and BPMN (I assume that he means both service orchestration and human-facing workflow). Given that, here’s their strategy for Business Process Management products:

Oracle BPM product strategy

  • Strategic:
    • Oracle BPA Designer for process modeling and simulation
    • BEA AL-BPM Designer for iterative process modeling
    • Oracle BPM, which will be the convergence of BEA AquaLogic BPM and Oracle BPEL Process Manager in a single runtime engine
    • Oracle Document Capture & Imaging for document capture, imaging and document workflow with ERP integration [emphasis mine]
    • Oracle Business Rules as a declarative rules engine
    • Oracle Business Activity Monitoring [same as in SOA section]
    • Oracle WebCenter as a process portal interface to visualize composite processes

Similar to the ESB categorization, I find the classification of the converged Oracle BPM product (BEA AL-BPM and Oracle BPEL PM) as “strategic” to be at odds with his original definition: it should be in the “continue & converge” category since the products are being converged. This convergence is not, however, unexpected: having two separate BPM platforms would just be asking for trouble. In fact, I would say that having two process modelers is also a recipe for trouble: they should look at how to converge the Oracle BPA Designer and the BEA AL-BPM Designer as well.

In the portals and Enterprise 2.0 product area, Kurian was a bit more up-front about how WebLogic Portal and AquaLogic User Interaction are going to be merged into the corresponding Oracle products:

Oracle portal and Enterprise 2.0 product strategy

  • Strategic:
    • Oracle Universal Content Management for content management repository, security, publishing, imaging, records and archival
    • Oracle WebCenter Framework for portal development and Enterprise 2.0 services
    • Oracle WebCenter Spaces & Suite as a packaged self-service portal environment with social computing services
    • BEA Ensemble for lightweight REST-based portal assembly
    • BEA Pathways for social interaction analytics
  • Continue and converge:
    • BEA WebLogic Portal will be integrated into the WebCenter framework
    • BEA AquaLogic User Interaction (AL-UI) will be integrated into WebCenter Spaces & Suite
  • Maintenance:
    • BEA Commerce Services
    • BEA Collabra

In SOA governance:

  • Strategic:
    • BEA AquaLogic Enterprise Repository to capture, share and manage the change of SOA artifacts throughout their lifecycle
    • Oracle Service Registry for UDDI
    • Oracle Web Services Manager for security and QOS policy management on services
    • EM Service Level Management Pack as a management console for service level response time and availability
    • EM SOA Management Pack as a management console for SOA monitoring, tracing and change management
  • Maintenance:
    • BEA AquaLogic Services Manager

Kurian discussed the implications of this product strategy on Oracle Applications customers: much of this will be transparent to Oracle Applications, since many of these products form the framework on which the applications are built, but are isolated so that customizations don’t touch them. For those changes that will impact the applications, they’ll be introduced gradually. Of course, some Oracle Apps are already certified with BEA products that are now designated as strategic Oracle products.

Oracle has also simplified their middleware pricing and packaging, with products structured into 12 suites:

Oracle Middleware Suites

He summed up with their key messages:

  • They have a clear, well-defined, integrated product strategy
  • They are protecting and enhancing existing customer investments
  • They are broadening Oracle and BEA investment in middleware
  • There is a broad range of choice for customers

The entire briefing will be available soon for replay on Oracle’s website if you’re interested in seeing the full hour and 45 minutes. There’s more information about the middleware products here, and you can sign up to attend an Oracle BEA welcome event in your city.

Architecture & Process keynote: Bill Curtis

The second part of the morning keynote was by Bill Curtis, who was involved in developing CMM and CMMI, and now is working on the Business Process Maturity Model (BPMM). I’ve seen quite a bit about BPMM at OMG functions, but this is the first time that I’ve heard Curtis speak about it.

He started by talking about the process/function matrix, where functions focus on the performance of skills within an area of expertise, and processes focus on the flow and transformation of information or material. In other words, functions are the silos/departments in organizations (e.g., marketing, engineering, sales, admin, supply chain, finance, customer service), and processes are the flows that cut across them (e.g., concept to retire, campaign to order, order to cash, procure to pay, incident to close). Unfortunately, as we all know, the biggest problems occur in the white space between the silos when the processes aren’t structured properly, and a small error at the beginning of the process causes increasingly large amounts of rework in other departments later in the process: items left off the bill of sale by sales created missing information in legal, incomplete specs in delivery, and incorrect invoices in finance. Rework of 30% is typical in many industries — an alarming figure that would never be tolerated in manufacturing, for example, where rework is measured and visible.

Curtis’ point is that low maturity organizations have a staggering amount of rework, causing incredibly inefficient processes, and they don’t even know about it because they’re not measuring it. As with many things, introspection breeds change. And just as Ted Lewis was talking about EA as not just being IT architecture, but a business-IT decision-making framework, Curtis talked about how the concepts of CMM in IT were expanded into BPMM, a measurement of both business and IT maturity relative to business processes.

In case you haven’t seen the BPMM, here are the five levels:

  • Level 1 – Initial: inconsistent management (I would have called this Level 0 for consistency with CMM, but maybe that was considered too depressing for business organizations). Curtis called the haphazard measures at this level “the march of 1000 spreadsheets”, which is pretty accurate.
  • Level 2 – Managed: work unit management, achieved through repeatable practices. Measurements in place tend to be localized status and operational reports that indicate whether local work is on target or not, allowing them to start to manage their commitments and capacity.
  • Level 3 – Standardized: process management based on standardized practices. Transitioning from level 2 to 3 requires tailoring guidelines, allowing the creation of standard processes while still allowing for exceptions: this tends to strip a lot of the complexity out of the processes, and makes it worth considering automation (automation of level 2 just paves the cowpaths). Measurements are now focused on process measures, usually based on reacting to thresholds, which allows both more accurate processes and more accurate cost-time-quality measures for better business planning.
  • Level 4 – Predictable: capability management through statistically controlled practices. Statistical measurements throughout a process — true process analytics — are now used to predict the outcome: not only are the measurements more sophisticated, but the process is sufficiently repeatable (low variance) that accurate prediction is possible. If you’re using Six Sigma, this is where the full set of tools and techniques is used (although some will be used at levels 2 and 3). This allows predictive models to be used both for predicting the results of work in progress, and for planning based on accurately estimated capabilities (I’ve included a toy illustration of this sort of statistical control right after this list).
  • Level 5 – Innovative: innovation management through innovative practices. This is not just about innovation, but about the agility to implement that innovation. Measurements are used for what-if analysis to drive into proactive process experimentation and improvement.
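
To make Level 4’s “statistically controlled practices” a bit more concrete, here’s a toy sketch (my own illustration, not from Curtis’ slides) of the simplest form of statistical process control applied to a business process: derive 3-sigma control limits from a baseline of cycle times gathered while the process was known to be stable, then flag new process instances that fall outside those limits as special-cause variation worth investigating.

```java
import java.util.Arrays;

// Toy illustration of Level 4-style statistical control (not from Curtis' material):
// derive 3-sigma control limits from a baseline of process cycle times, then flag
// new instances that fall outside the limits as special-cause variation.
public class CycleTimeControl {

    public static void main(String[] args) {
        // Cycle times (in days) from a period where the process was known to be stable
        double[] baseline = {4.1, 3.8, 4.5, 4.0, 3.9, 4.2, 4.1, 3.7, 4.3, 4.0};
        // Newly completed process instances to monitor
        double[] recentRuns = {4.2, 3.9, 7.6, 4.1};

        double mean = Arrays.stream(baseline).average().orElse(0);
        double variance = Arrays.stream(baseline)
                .map(t -> (t - mean) * (t - mean))
                .average().orElse(0);
        double sigma = Math.sqrt(variance);

        double upper = mean + 3 * sigma;
        double lower = mean - 3 * sigma;
        System.out.printf("baseline mean=%.2f days, control limits=[%.2f, %.2f]%n",
                mean, lower, upper);

        // Anything outside the limits signals special-cause variation worth investigating
        for (double t : recentRuns) {
            String status = (t > upper || t < lower) ? "OUT OF CONTROL" : "in control";
            System.out.printf("cycle time %.1f days: %s%n", t, status);
        }
    }
}
```

In a real Level 4 organization this would obviously be continuous and fed by the process engine’s own analytics rather than a hard-coded array, but the principle is the same: limits derived from measured history, used to predict outcomes and flag exceptions.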

The top two levels are essentially just innovative management practices, but the advantage of BPMM is that it provides a path to get from where we are now to those practices. Curtis also sees this as a migration from a chaotic clash of cultures to a cohesive culture of innovation.

This was a fabulous, fast-paced presentation that left me with a much deeper understanding of — and appreciation for — BPMM. He had some great slides with this, which will apparently be available on the Transformation & Innovation website later this week.

Now the hard part starts: trying to pick between a number of interesting-sounding breakout sessions.

BPEL for Java Developers Webinar

Active Endpoints is hosting a webinar this Thursday on BPEL Basics for Java Developers, featuring Ron Romano, their principal consulting architect. From their information:

A high-level overview of BPEL and its importance in a web-services environment is presented, along with a brief discussion of the basic BPEL activities and how they relate to Java concepts. The following topics will be covered:

  • Parsing the Language of SOA with Java as a guide
  • Breaking out of the VM: evolving from RPC to Web Services
  • BPEL Activities – Receive, Reply, Invoke
  • BPEL Facilities – Fault Handling and Compensation (“Undo”)

The VP of Marketing assures me that he was allowed only two slides at the end of the presentation, and that otherwise this is focused on the technical goodies.
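
If you’re curious in advance about how those BPEL activities map onto Java concepts, here’s a rough sketch of my own (this analogy is mine, not taken from the webinar material): a BPEL <receive> corresponds roughly to the container dispatching an incoming request to a service method, an <invoke> to a call out to another service, and a <reply> to the return value that becomes the response.

```java
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Rough analogy (mine, not Active Endpoints') between core BPEL activities and JAX-WS:
//   <receive> ~ an incoming SOAP request dispatched to processOrder()
//   <invoke>  ~ a call out to another service (stubbed here as checkCredit())
//   <reply>   ~ the return value, which becomes the SOAP response
@WebService
public class OrderService {

    public String processOrder(String orderId) {        // <receive>
        String creditStatus = checkCredit(orderId);      // <invoke>
        return "Order " + orderId + ": " + creditStatus; // <reply>
    }

    // Stand-in for an outbound call to a partner service; in BPEL this would be an
    // <invoke> against a partnerLink rather than a local method call
    private String checkCredit(String orderId) {
        return "credit approved";
    }

    public static void main(String[] args) {
        // Publish the service locally; JAX-WS (bundled with Java SE 6) handles the
        // SOAP plumbing and exposes the WSDL at .../orders?wsdl
        Endpoint.publish("http://localhost:8080/orders", new OrderService());
        System.out.println("OrderService listening at http://localhost:8080/orders");
    }
}
```

The “breaking out of the VM” topic presumably covers the key difference: a BPEL process is long-running and its state is managed by the engine, whereas a method call like this lives and dies inside a single JVM thread.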

You need to register in advance at the link above.

BPMN survey results

I really didn’t sit down this afternoon intending to write that last enormous post on the Great BPMN Debate: I just remembered that Jan Recker (co-author on the research paper that sparked the debate, although not a participant in the debate) had sent me a pre-release copy of a paper that he authored, “BPMN Modeling — Who, Where, How and Why”, which summarizes the results of the survey that he conducted last year. One thought led to another, and before you know it, I’d written an essay on the most exciting thing to happen in BPM standards in ages.

Back to Jan’s paper, however, which will be published this month on BPTrends. He surveyed 590 process modelers using BPMN from over 30 countries, and found some interesting results:

  • BPMN usage is split approximately in half over business and IT, which is a much higher percentage of IT users than I would have guessed. Business people are using it for process documentation, improvement, business analysis and stakeholder communications, whereas IT people are using it for process simulation, service analysis and workflow engineering.
  • As you might expect given that result, there’s a wide variation in the amount of BPMN used, ranging from just the core set for basic process models, to an extended set, to the full BPMN set. It would be interesting to see a correlation between this self-assessment and usage statistics based on the actual BPMN diagrams created, although as far as I know, the survey respondents didn’t submit any examples of their diagrams.
  • Not surprisingly, only 13.6% received any formal BPMN training, and I believe that this is the primary reason that most people are still using only a tiny subset of the BPMN constructs in order to create what are effectively old-fashioned flowcharts rather than full BPMN diagrams.

He finished with a list of the major obstacles that the respondents reported in using BPMN, or places that they would like to see improvement:

  • Support for specifying business rules, which echoes many of the other discussions that I’ve seen around having some standardization between process and rule vocabularies and modeling languages.
  • Support for process decomposition, although I really didn’t follow his argument on what this means.
  • Support for organizational modeling, particularly as that relates to the use of pools and lanes: sometimes, for example, a lane indicates a role; other times, a department. There are some things happening at OMG with the Business Motivation Model and the Organization Structure Metamodel that may help here.
  • There are some BPMN constructs that are less often used, although it’s not clear that anyone recommended getting rid of them.
  • The large number of different event types is problematic: “ease of use of process modeling is sacrificed for sheer expressive power”. This is a variation on the previous point (and on the crux of the Great BPMN Debate), indicating that actual BPMN users are a bit overwhelmed by the number of symbols.

I’ll publish a link to the paper when it appears on BPTrends; it’s fairly short and worth the read.

The Great BPMN Debate

If you have even a passing interest in BPMN, you’re probably aware of the great debate happening amongst a few of the BPM bloggers in the past week:

Michael zur Muehlen and Jan Recker published an academic research paper on BPMN usage, “How Much Language is Enough? Theoretical and Practical Use of the Business Process Modeling Notation”, to be presented at an upcoming conference. To quote the introduction in the paper, its aim is “to examine, using statistical techniques, which elements of BPMN are used in practice”, and they laid out their methods for gathering the underlying data. They used some standard cluster analysis techniques to identify clusters of BPMN objects based on usage, and determined that the practical complexity (what’s really used) was significantly different from the theoretical complexity (the total set) of BPMN. Michael teaches in the BPM program at Stevens Institute of Technology, so I wasn’t surprised to see a stated objective related to BPMN training: “BPMN training programs could benefit from a structure that introduces students to the most commonly used subset first before moving on to advanced modeling concepts.” Note that he says “before moving on to”, not “while completely disregarding”.

Michael then blogged about the paper but went further by listing three implications that were not expressed in the paper:

  • Practitioners should start with the more commonly-used BPMN elements, and leave the more specialized constructs for analysts who will presumably be doing more complex modeling.
  • Vendors that support BPMN can make a realistic determination of what percentage of BPMN diagrams can be represented in their tool based on today’s usage of BPMN.
  • Standards bodies should consider if they should be creating additional complexity if no one is using it.

It was these implications that sparked the arguments that followed, starting with Bruce Silver’s post directly challenging much of what Michael said in his post. It appeared to me that Bruce hadn’t read the full research paper, but was commenting only on Michael’s blog post, and hence didn’t fully appreciate that the paper was really just analyzing what people are doing now, not making any value judgements about it. Bruce was a bit harsh, especially where he suggests that Michael’s “BPMN Overhead” label on the least-used objects was “clearly meant to mean ‘useless appendages’.” Bruce had some valid rebuttals to Michael’s three implications, and although I disagree somewhat with Bruce’s first two points (as I commented on his post, and was rewarded by Bruce telling me that I was stating the bloody obvious), I agree that the standard makers have not included unnecessary complexity, but rather have created a standard that the market still needs to grow into. However, I find the BPMN specification to be overly verbose, creating a greater degree of perceived complexity than may actually exist.

Michael responded to Bruce’s post by pointing out that the aim of their research was to find out how people actually use BPMN, not how vendors, consultants and standards bodies think that they use it (or think that they should use it). Michael restates his three implications in greater detail, the first two of which seem to align with what I thought that he said (and stated in my comment on Bruce’s original post). His clarification on his third point was interesting:

We actually like BPMN’s advanced vocabulary. But have you asked end users what they think? Well, we did. Not only in this study but also in Jan’s large-scale BPMN usability studies we did find that users are in fact very troubled by the sheer number of, for example, event constructs. Are they used at a large scale? No. Do users understand their full capacity? Typically not. Why is this not at all reflected in BPMN development? That is exactly our point. Sure, our argument is a somewhat provocative statement. But if it helps to channel some attention to end usage, that’s fair by our standards.

Bruce responds in turn, saying that if Michael had presented this as “statistical correlations between diagram elements in a sample of BPMN collected in the wild”, then it would have been okay, but that the conclusions drawn from the data are flawed. In other words, he’s saying that the research paper is valid and interesting, but the post that Michael wrote promoting the paper (and including those unintentionally provocative implications) is problematic. As it turns out, in terms of Michael’s group of the 17 least-used BPMN constructs, Bruce could live without 15 of them, but will fight to the end for the other two: exception flow and intermediate error event. However, Michael doesn’t say that these are useless — that’s Bruce’s paraphrasing — just that they’re not used.

There’s a bit of chicken-and-egg going on here, since I believe that business analysts aren’t using these constructs because they don’t know that they exist, not because they’re useless. Many analysts don’t receive any sort of formal training in BPMN, but are given a BPMN-compliant tool and just use the things that they know from their swimlane flowcharting experience.

Anyway, Bruce finishes up by misinterpreting (I believe) the conclusion of Michael’s post:

Michael ends his piece by asserting that the real BPMN is not what vendors, consultants, and trainers like me say it is, but the way untrained practitioners are using it today.

What Michael actually said was:

[O]ur own experiences with BPMN and with those organizations using it gave us this hunch that the theoretical usage (what vendors and consultants and trainers tell us) often has little to do with what the end users think or do (the practical usage). And why is it important to know what the end users think and do? Because it can help the researchers, vendors, consultants and trainers of this world to channel their attention and efforts to those problems real users face. Instead of the problems we think exist in practice.

Although it’s not completely clear, I believe that Michael is saying that we need to understand what people are doing with BPMN now in order to design both training and systems.

This was an interesting debate to watch, since I know and respect both Michael and Bruce, and I found merit in the arguments on both sides although I don’t fully agree with either.

There was an interesting coda on the validity of BPMN for model-driven development with Tom Baeyens weighing in on the debate and stating that BPMN should stick to being a modeling notation and not be concerned with the underlying execution details: “[t]he properties can be removed and the mapping approach to concrete executable process languages should be left up to the vendors.” Bruce responded with some thoughts on model portability, and how that drives his categorization of BPMN constructs.

If you’re at all interested in BPMN, it’s worth taking the time to work your way through this debate, and keep an eye on Bruce’s and Michael’s blogs for any further commentary.

BPMN and the Business Process Expert

There’s something funny about chatting via IM with someone as you’re listening to them give a public webinar, even when you do know that the presentation is pre-recorded — I was on Skype with Bruce Silver today during his webinar The Business Process Expert and the Future of BPM on ebizQ, where he was speaking with Marco ten Vaanholt of SAP’s BPX community.

Except for one “happy smiling faces” graphic worthy only of Jim Sinur’s blog-pimping marketing team, I really enjoyed Bruce’s presentation, although I’ve heard at least parts of it before. He started with a comprehensive description of BPM and why model-driven design is so critical to process agility, then segued into a description of BPMN and its importance in making process models executable: the heart of model-driven design. He feels that it’s necessary to define the role of the Business Process Expert (BPX): someone who bridges business and IT, creating executable requirements for BPM solutions. Obviously, BPMN is a critical skill for the BPX, and Bruce offers a number of resources, including a free series of articles and e-learning modules that he’s done for the SAP BPX community, and the longer paid courses that he offers online and as public classes through BPM Institute. No wonder he hasn’t blogged for months: he’s been too busy creating all this.

Marco ten Vaanholt talked about the importance of BPM and SOA — fairly motherhood sort of stuff — then dug into some details of the SAP BPX community, which is an incredibly well-developed resource for anyone involved in BPM, whether you’re an SAP customer or not. The core of the BPX community is collaboration and collective learning on business scenarios, process lifecycles, change leadership, social responsibility, horizontal and vertical practices, modeling tools, methodologies and a variety of other topics. It’s not just a discussion forum, however: there’s a lot of really valuable content, such as Bruce’s articles and e-learning, from both SAP and the community in general.

Marilyn Pratt, the BPX community evangelist, has been keeping me up to date on what’s happening on BPX and the worldwide community events in which she’s been involved, and I’m looking forward to catching up with her and seeing more of BPX in action when I attend SAPPHIRE in May.

There was some good Q&A at the end about process modeling and the BPX community. Definitely worth watching the replay, which should be available online at the original webinar link above.

IIR BPM: Me and the role of standards in BPM

I’m up now, and here’s what I’ll be presenting:

I know, it’s long, but I’ll breeze past a number of the slides that I put in there just for reference. If this isn’t enough on standards for you, I highly recommend Michael zur Muehlen’s BPM standards tutorial. I liked it so much, I stole a couple of his slides, although he’ll probably sit in on my session to keep me honest.

IIR BPM: Facilitated session on standards

Alec Sharp led a facilitated session on standards that we love, hate, or wish were there (or don’t care about). This was a bit similar to the BPM Think Tank roundtables, but we were seated at about six small tables, so we had a chance for some mini-breakout sessions to discuss ideas, then gather them together.

The notes that came out of this:

  • One group had some general comments about standards, stating that a common language can simplify, but that the alphabet soup of standards is too complicated and IT driven.
  • Another group hates BPMN because they feel that a 200-page specification isn’t understandable by business users, and that BPMN is really for specifying automated process execution but is not for business consumption. It’s stifling and constrains what can be modelled.
  • Standards aren’t written in plain English. There are two sets of standards: methodology standards and tool standards, and we often confuse the two. One is focussed on human-driven processes, and the other on technology-driven processes. A great analogy: the people coming up with the tools have never baked the cake, or even eaten one.
  • Standards are often misunderstood, both in terms of who they’re for and what they’re for: they’re misinterpreted by marketing types. [I see this a lot with BPEL having become a standard “check box” on BPM RFPs rather than a real requirement.]
  • Standards can seem inflexible.
  • Interchange standards are either insufficient or improperly used by the tools, making it near-impossible to do round-tripping between different tools. They’re intended to be used for translation between the business and technology domains, but notational standards are possibly becoming less understandable because they are targeted at flowing into interchange standards. [I’m not sure that I agree with this: IT may require that business model in specific forms rather than just allowing business to use BPMN in the way that best fits the organization.]
  • Standards should be discovered, not invented [Vint Cerf, via Michael zur Muehlen], and BPM standards have been mostly invented.
  • In defense of standards, one person noted that the form of a sonnet is one of the most constrained/standardized forms of writing, but that Shakespeare wrote some of his most beautiful works as sonnets.
  • I got in a few comments about the importance of interchange standards, and how round-tripping is one of the primary problems with these standards — or rather their implementation within the BPA and BPM tools.
  • There’s an issue with the priority when adopting standards: is it to empower the business users, or to support IT implementation? If the former, then it will likely work out, but if it’s for the latter, then the business is not going to totally buy in to the standards.
  • The relationship with the business has changed: it used to be treated as a black box, but now has to be more integrated with IT, which means that they have to bite the bullet and start using some of these standards rather than abdicate responsibility for process modelling.

I don’t necessarily agree with all of these points, since this turned into mostly a standards-bashing session, but it was an interesting debate.

BPM standards tutorial slides

Michael zur Muehlen, who I met at the BPM Think Tank in August, gave a tutorial on standards at the recent BPM conference in Brisbane, Australia. He noticed that I’m giving a presentation on standards at the upcoming Shared Insights/IIR BPM conference in San Diego next month, and invited me to check out his slides on Slideshare:

Considering that I have to submit my slides this week, these could really come in handy to help fine-tune my thoughts.