Integration World Day 1: Richard Maranville, FedEx Kinko’s

Richard Maranville, SVP and CIO of FedEx Kinko’s (FedEx acquired Kinko’s in 2004), gave the first customer presentation of the day, talking about how they use webMethods to integrate their systems.

FedEx has always been big on technology, and changed the courier world with paradigms like huge centralized auto-sorting centers, where all packages are sent to one central location for sorting and redistribution. You can’t do that without a significant amount of well-oiled technology. Since FedEx has grown significantly through acquisitions such as Kinko’s, they’ve also had to be able to integrate these acquisitions — and their technology — into the mother ship.

From Kinko’s point of view, the struggles since the acquisition have been primarily about integrating systems and data with the parent company to allow for better functional integration. They had a pretty low level of technology in a lot of business areas, with many purely manual processes and re-entering of data from one system to another, which introduced both errors and latency. Now, however, the applications are integrated so that data flows from one to another in real-time, making it available not just within one location but to other locations that might be serving the same customer or that have similar requirements. This has even been extended out to their customers, so that customers can enter and track orders directly, and the underlying process is handled in exactly the same way as if it were happening in a FedEx Kinko’s location. They reused components and stitched them together with webMethods, allowing them to add order tracking using FedEx’s existing world-class tracking system rather than building it themselves, and to do the entire integration in a matter of days. They’re also integrating with FedEx services: you can upload a job for printing and have it shipped back to you.

They’re going to be building some new dashboard functionality using the webMethods BAM tools that leverages all the bits and pieces that they already have, from their legacy systems to the new integrations.

I really like the format of these keynotes, by the way: at the end of each formal presentation, Mark Jeffries pops back up on stage to sit down with the presenter for a few minutes and ask him some questions.

Policies, procedures, processes and rules

Most of my customers are large financial institutions, and I help them to select and implement BPM within their organizations. The technology is only one part of that, however: I’m almost always helping them with their business processes as well. Since policies and procedures drive processes, I often end up in the thick of their policies and procedures, and that’s where the confusion starts.

First of all, there’s the definition of terms. What’s the difference between a policy and a procedure, when many people lump them together as if PoliciesAndProcedures were one word? I like this definition:

Policy is a mandate and directive from the top of the organization. Its purpose is to influence behaviour. From it, management provide the overarching principles under which the business operates. It should not vary in its message or enforcement model.

Procedures are process specific and detail the steps taken to achieve an objective. Procedures include operations manual, user manual, and all manner of process documentation.

I see policies as the rules, or laws, of an organization, whereas the procedures are the processes used to enact the policies. The problem is, however, that many companies see policies and procedures as belonging to the Legal/Compliance department, and create another set of processes — usually referred to as an "operational guide" — that are created and maintained by the operational area that executes the actual processes. If you throw in a BPMS, then some (but rarely all) of these operational procedures may be further documented in the process descriptions within the BPMS or a BPA tool.

What’s the distinction between a policy and a procedure? Is there a difference between a procedure and the operational description of a business process? What about between the business process and the process model in a BPMS?

Secondly, there’s the responsibility issue that I referred to above: who’s responsible for each of these essential bits of corporate documentation? Legal/Compliance is almost always handed policies and procedures, but what about the case when the procedures are actually descriptions of the operational business processes? Should policies be left with Legal, and procedures given to the operational areas, with Compliance there to make sure that everything matches up? Or are the operational process descriptions a separate, more fine-grained version of the procedures, leaving the procedures with Legal and the operational processes with Operations?

If process maps are created within a BPMS, do they become part of the business process documentation, replace part of it, or stay as a separate "implementation view" of the processes? I’ve definitely seen cases where the process maps in a BPMS bear little resemblance to what the business perceives as its processes, either due to limitations in the BPMS environment or to the business having an incomplete view of the process.

And if there are four separate types of documentation — policies, procedures, business processes and BPMS process definitions — who’s responsible for keeping them all in synch?

Third is the whole technology issue: how is all of this information captured, published and synchronized? There are tools such as RulesArts and RuleBurst (both of which I saw last week at the Business Rules Forum) that help to capture policies as high-level non-executable rules — an approach that makes more sense than just trying to document them free-form in a word processor while praying for consistency. Check out the Flash demo on the RuleBurst site to see what this can look like. Some of these systems are also business rules engines, that is, they execute the rules and can be called from other applications; some are just platforms for non-technical users to document policies, detect gaps and exceptions, and help to ensure compliance.
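As a sketch of the two flavours described above (policies captured as structured but non-executable records, versus rules that actually execute), here’s what the distinction might look like in code. Everything here — the rule IDs, the fields, the sample policies — is hypothetical, not taken from RuleBurst or any other product:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class PolicyRule:
    """A policy captured as a structured record rather than free-form text."""
    rule_id: str
    statement: str                                  # natural-language policy text
    owner: str                                      # e.g., Legal/Compliance
    check: Optional[Callable[[dict], bool]] = None  # optional executable form

def evaluate(rules, case):
    """Return the IDs of executable rules that the given case violates."""
    return [r.rule_id for r in rules
            if r.check is not None and not r.check(case)]

rules = [
    PolicyRule("POL-001", "Orders over $10,000 require manager approval",
               owner="Compliance",
               check=lambda c: c["amount"] <= 10000 or c.get("approved", False)),
    PolicyRule("POL-002", "All customer data must be retained for 7 years",
               owner="Legal"),  # documented only; no executable form yet
]

print(evaluate(rules, {"amount": 25000}))  # → ['POL-001']
```

The point of the structure is that even the non-executable POL-002 is queryable and linkable, which is already a step up from prose in a word processor.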

As we move into procedures, operational guides and process definitions, it’s all about processes. Processes based on rules (and what process isn’t?), but processes nonetheless. Those organizations documenting their policies in a word processor are likely also documenting their procedures in the same way — in fact, possibly in the same document — using descriptive text and a few diagrams. At some level of detail, someone starts drawing process maps, although these are usually as illustrations to the descriptive text rather than a replacement for it.

The two biggest issues in all of this technology are synchronization (usually manual, and therefore almost certainly out of date) and publishing (ditto). From the synchronization standpoint, there needs to be something that links the policies (rules) with the various granularities of process descriptions (both text and graphical) and either keeps them in synch or alerts someone when related pieces are modified. For publication, none of this information is of any use unless it’s in the hands of the people who need it; that means that there needs to be an easy (or automated) way for all of this information to be published within an enterprise and accessed with nothing more than a browser and network authentication.
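To make the synchronization idea concrete, here’s a minimal sketch of a link registry that records which documents are derived from which policies, and flags the dependents for review when a policy changes. The document IDs and the registry design are invented for illustration; no product that I know of works exactly this way:

```python
from collections import defaultdict

class LinkRegistry:
    """Tracks dependencies between policies and the documents derived from
    them, and flags dependents for review when a policy changes."""
    def __init__(self):
        self.depends_on = defaultdict(set)   # policy ID -> dependent doc IDs
        self.stale = set()

    def link(self, policy_id, doc_id):
        self.depends_on[policy_id].add(doc_id)

    def policy_changed(self, policy_id):
        # Everything derived from this policy now needs review
        self.stale |= self.depends_on[policy_id]

    def mark_reviewed(self, doc_id):
        self.stale.discard(doc_id)

reg = LinkRegistry()
reg.link("POL-007", "ops-guide-claims")      # operational guide
reg.link("POL-007", "bpms-claims-process")   # process model in the BPMS
reg.policy_changed("POL-007")
print(sorted(reg.stale))  # → ['bpms-claims-process', 'ops-guide-claims']
```

Even something this crude would beat the status quo of nobody being alerted at all when a policy revision quietly invalidates an operational guide.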

What starts to become shockingly apparent as you dig into the technology is that policies are about rules, and procedures are about processes. Yeah, I know, I said that at the start of this post, but it’s not just some abstract concept, it’s about how you need to document and implement policies and procedures. The crux of the issue is in the crossover from rules to process, since a rule (policy) usually doesn’t dictate the operational procedure required to enact it, hence there’s not a clear technology path to map from policies to procedures. If policies are maintained in a high-level rule repository and procedures are maintained in some combination of descriptive text and process maps, what’s the missing link between them?

Policy and procedure documentation is just one place where business rules and business processes intersect (they touch again at the point of process execution), and I’m interested in exploring the ideas around this. I’ve put forward more questions than answers — feel free to join the conversation by commenting on this post, tracking back from your own post, or dropping me an email.

BRF Day 3: Good Business Rules in Process — Eliminate 65% of the Activities

I couldn’t quite drag myself out of bed for the 8am sessions, but I did want to hear Kathy Long of the Process Renewal Group talking about process and rules. She talked about how to derive rules from processes, and use them as guides to the process. There are a number of process-related problems that can occur when the rules are not explicit: assumed policies, activities with experience as the only guide, and inconsistent (and therefore likely non-compliant) processes. The key things to consider when analyzing the guides for a process focus on what happens at a given activity (what knowledge is required, what decisions have to be made, what reports have to be generated) as well as a number of other factors; she presented a number of different questions to ask in order to drive out the rules and make them explicit.

She also made a distinction between policies and rules, where the key differentiator is that rules are actionable, whereas policies must be interpreted into more concrete business rules in order to take action. Within rules, there are both structural rules (can’t be broken) and operative rules (which have a bit more wiggle room); this sounds a bit like the distinction between a fact and a rule that I heard in a session yesterday, which makes me unsure that there’s a really common vocabulary for some of these things.

Looking at some of their process analysis techniques, she presented categories of activities as real value-added (impact the customer’s requirements), business value-added (required to run the business, such as regulations), and non value-added (that 65-85% of work that doesn’t contribute to either RVA or BVA). There’s a whole list of verbs — adjust, approve, expedite, inspect, verify and many others — that tend to indicate that activities are NVA and should be considered for elimination. Many of these exist because something wasn’t done right the first time; a lot of the NVA activities can be cut if there are ways to reduce the error rates in the RVA and BVA activities. This isn’t, of course, really about rules: it’s about process improvement. Sure, the appropriate addition of business rules can certainly lead to process improvement, but it’s also about the myriad other ways that we improve processes, such as establishing accountability and eliminating unneeded steps that are only there for historical reasons. Some thoughts that Long gave us to take away:

  • The greatest opportunity to improve a process is by changing the rules
  • Challenge all policies
  • Validate all compliance interpretations
  • Eliminate the use of assumed policies
  • Ensure that all rules are documented including the use of experience/knowledge
  • Create consistent rules across the enterprise
  • Structure rules so that they can be easily changed
  • Allow the business to design its own processes
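Long’s verb test for NVA activities lends itself to a quick screening sketch. The verb list below is an illustrative subset (not her complete list), and the sample process is invented:

```python
# Verbs that often signal non-value-added (NVA) work — typically rework or
# inspection caused by an upstream error (illustrative subset only)
NVA_VERBS = {"adjust", "approve", "expedite", "inspect", "verify",
             "rework", "review", "check", "audit"}

def flag_nva(activities):
    """Return activities whose leading verb suggests they may add no value."""
    return [a for a in activities
            if a.split()[0].lower() in NVA_VERBS]

process = ["Receive order", "Verify customer data", "Enter order",
           "Inspect shipment", "Ship product", "Expedite backorders"]

for activity in flag_nva(process):
    print(f"Candidate for elimination: {activity}")
```

Obviously a flagged activity isn’t automatically waste (some approvals are regulatory BVA), but as a first pass over a few hundred documented activities, it’s a cheap way to build a review list.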

I was surprised that she didn’t talk at all about some of the technology issues such as how BPM and BR can be used together to improve processes, but her focus was not at all on technology: her only case study was about improving a process based on a manual procedural change.

I have to head off for a mid-day flight home, so that was the end of my Business Rules Forum experience. I’ve actually learned a lot here, which made my time definitely worthwhile. However, I’m still left with the feeling that I mentioned in my first post back on Tuesday: we need to start having much more crossover between different technology areas such as BPM and BR. I’ve been writing since mid-2005 about the importance of looking at BPM and BR together, but in spite of the technology advances that have occurred since then to facilitate this, I’m not seeing much happening in the real world.

BRF Day 2: How Business Rules Re(Define) Business Processes: A Service Oriented View

For the last session today, I attended Jan Venthienen’s session; he’s a professor at Katholieke Universiteit Leuven. He talked about different representations of rules, particularly decision tables (at length, although in an interesting way). He talked about the problems with maintaining decision trees, then as he moved on to business processes, he showed how a business process with the rules encoded in the process as routing logic was really just a form of decision tree, and therefore difficult to maintain from a rules integrity standpoint. As rules are distilled out of and separated from the processes, the processes become thinner and thinner, until you have a single branch straight-through flow. I have the feeling that he’d like to reduce the process to a single activity, where everything else is done in a complex rule called from that step. I’m not sure that I agree with that level of stripping of logic out of the process and into the rules; there’s value in having a business process that’s understandable by business users, and the more that the logic is encapsulated in rules, the harder it is to understand how the process flow works by looking at the process map. The critical thing is knowing which rules to strip out of the business process, and which to leave in.
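Venthienen’s observation that routing logic in a process is really a disguised decision tree can be illustrated with a toy claim-routing example: the same decision expressed as branching baked into the flow, and as a decision table held as data. The conditions and outcomes are invented for illustration:

```python
# Routing encoded in the process: every rule change means editing the flow
def route_branching(claim):
    if claim["amount"] > 10000:
        if claim["fraud_score"] > 0.7:
            return "investigate"
        return "senior_adjuster"
    return "auto_approve"

# The same logic as a decision table: the rules live in data, not in the flow
DECISION_TABLE = [
    # (min_amount, min_fraud_score, outcome) — first matching row wins
    (10000, 0.7, "investigate"),
    (10000, 0.0, "senior_adjuster"),
    (0,     0.0, "auto_approve"),
]

def route_table(claim):
    for min_amount, min_fraud, outcome in DECISION_TABLE:
        if claim["amount"] > min_amount and claim["fraud_score"] >= min_fraud:
            return outcome
    return "auto_approve"

claim = {"amount": 15000, "fraud_score": 0.2}
assert route_branching(claim) == route_table(claim) == "senior_adjuster"
```

In the table form, adding a row or changing a threshold doesn’t touch the process structure at all, which is exactly the maintainability argument for pulling rules out of routing logic.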

He’s doing research now to determine if it’s possible to specify business rules, then automatically derive the business process from the rules; an interesting concept. In order to do this, there must be rules that constrain the permissions and obligations of the actors in the process, e.g., an order must be accepted before the product is shipped. This presents two possible architectural styles: process first, or rules first. In either case, what is developed is an architecture of rules, events and services, with a top layer of business rules and processes, a middle layer of services and components, and a bottom layer of enterprise applications.

BRF Day 2: Intelligent Process Automation: The Key to Business Process Optimization

The opening keynote today was Steve Hendrick of IDC, discussing their acronym du jour, IPA (intelligent process automation), which is a combination of BPM, BI and decisioning. He lists four key constructs of IPA:

  • Event processing, providing a sense and respond approach
  • Decisioning, covering both rules and actions that might be derived from those rules
  • BPM (I knew that he’d get to this eventually)
  • Advanced analytics, including profiling and segmentation, predictive analytics and modeling, and decision optimization

I’m not sure how this differs from Gartner’s definition of BPMS technology, which includes all these factors; do we really need another acronym for this? I suppose that the analyst firms need to make these distinctions to play in the marketplace, but I’m not sure that a new term specific to one analyst firm provides benefit to the end customers of these systems.

He just put a non-linear programming equation up on the screen. It’s 9:19am, we were all up late last night at various vendor dinners, and he’s talking about the specifics of how to solve this optimization model. I really think that he’s overestimating the number of fellow analytics geeks in the audience.

He moved on to discuss BPM, which he characterizes as a context for putting advanced analytics to work. 🙂 He lists IBM, TIBCO and Adobe (huh?) as the leaders, Global 360 as “right on their heels”, and BEA just behind that with Lombardi somewhere back from that. Hmm, not necessarily everyone’s view of the BPM market.

He then discussed complex event processing for ultra-low latency applications, pointing out characteristics such as how it’s queue based (much like BPM) to allow asynchronous processing of events, and how this allows for extremely fast response to events as they occur. The tie-in to the other technologies that he’s discussing is that events can trigger processes, and can also trigger decisions, the latter of which he feels is more important.
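Hendrick’s point that events can trigger decisions can be sketched with a trivial queue consumer. In a real CEP engine the consumption would be asynchronous and the event volumes enormous; the event types and thresholds below are made up:

```python
from queue import Queue

def decide(event):
    """A decision invoked per event; in practice it could launch a process."""
    if event["type"] == "trade" and event["value"] > 1_000_000:
        return "escalate"
    return "log"

events = Queue()
for e in [{"type": "trade", "value": 2_500_000},
          {"type": "trade", "value": 500}]:
    events.put(e)

results = []
while not events.empty():       # a real system would consume asynchronously
    results.append(decide(events.get()))
print(results)  # → ['escalate', 'log']
```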

He talked about a number of case studies about how analytics — in addition to other technologies and processes — made a difference for companies.

He ended with some predictions of a bright future for IPA, which included a hockey stick-like projection of BPMS sales increases of about 6x between now and 2011.

University of Exeter to offer Masters in BPM

Exeter’s Centre for Research in Strategic Processes and Operations is partnering with BPTG (the part of BPMG that didn’t make off with the domain name) as part of their soon-to-be-launched Masters in Business Process Management. From the BPTG press release:

The Masters Programme has 10 modules and a dissertation and can be completed over three years. The modules include:

  • Business Process Foundations
  • Business Process Measurement
  • Business Process Improvement
  • Operations Management
  • Business Process Modelling
  • Business Process Change Management
  • Services Management
  • Research Methods
  • Customer Value and Process
  • Business Process Leadership

Looks like some good content here, although I think that BPTG needs to just get over the whole BPMG debacle and stop including phrases like this in their emails: “Some providers distribute course and other certificates like confetti, the authenticity and veracity of which, beyond simple attendance, have no discernable pedigree.”

Good and bad government processes

A few months ago, I blogged about the unexpectedly good experience that I’d had at the Canadian passport office, where the process actually worked the way it was supposed to, and rewarded the consumer (me) by accelerating my wait time since I did my own data entry online.

Last week, I had two other government business process experiences: one good, one bad.

The good one was my NEXUS card: NEXUS is a joint program between the Canadian and American governments to allow frequent travellers to replace the long immigration line-ups in both directions with a retinal scan for authentication and a few questions on a touch-screen kiosk. Since I travel across the border fairly regularly, I decided to apply for this, especially after being stuck in a line of 500 people waiting for immigration checks a few times. Friends warned that it took 6-8 weeks for the preliminary approval, and that the follow-up interviews were already being scheduled for December. Wrong. I applied using the online form about 3-1/2 weeks ago, and received (by email) my approval and invitation to schedule an interview about two weeks later. I went online the next day, a Saturday, and found an appointment for that Monday — a 2-day wait rather than the 2 months that I was expecting. I went out to their office at the Toronto airport for the interview, again expecting an hours-long delay, and was out of there so fast that my parking cost was $3 — that’s the minimum, which means that it was less than 30 minutes to park the car, find their office, have my eyes and fingers scanned, answer some questions and have my card issued.

Before the passport office experience, I never believed that the Canadian government could behave so efficiently. Before last week, I never believed that two governments in collaboration could possibly do something like this in less than 3 weeks, but they did. I have to imagine that part of this is because I chose to fill in the online application — thereby doing their data entry for them and hopefully allowing them to automate some parts of the process — rather than the paper application form; I’d be very curious to hear what the average application-to-interview time is for the paper method. I’d also love to know if they’re using some sort of BPM technology to help this process along.

The bad government business process experience that I had was with the Indian consulate in Toronto, and it has killed my planned trip to speak at SOA India in Bangalore in November. I was getting my trip plans in place, and knew that I had to get a visa in order to enter India. On the website of the Indian consulate, however, I saw that the process is to mail in my passport, which they keep for 3-5 days before mailing it back. To be conservative, that’s 2+5+2 = 9 business days (if nothing goes wrong). My problem is that I don’t have a stretch of 9 business days in the next 3-4 weeks when I’m not flying between Canada and the US — which now requires a passport — because of the conferences that I’m attending, so I can’t go through the usual process. I emailed the Vice Consul for Visas to see if there was an expedited process for this situation; he responded “Possibility can be explored but without any promises” and invited me to come into the consulate. We scrambled around to get our visa applications filled out, got the requisite photos and money orders, then arrived shortly after the consulate opened one morning last week. There was a huge lineup just to get to the triage desk; we waited for over an hour just to speak with someone, who then wrote my name on a list for an interview. We sat in the waiting room for an additional 3 hours before my name was called, then I entered the office of someone who may or may not have been the Vice Consul. I explained the same thing that I had said in my email — I travel to the US frequently and can’t give up my passport for a week and a half, so I’m looking for an expedited process — and he immediately responded “I’ve had 10 people in here today with the same issue, and I had to turn them all down, so it’s not fair if I do it for you; we can only expedite the process for family emergencies.” The interview was over in 30 seconds. WTF?
Why didn’t he tell me that in the email, so that I didn’t waste a couple of hours of prep time, four hours of sitting in their waiting room, and $50 on photos, money orders and prepaid return envelopes? For that matter, why isn’t there an expedited process (for a fee, of course) for those of us who can’t give up our passports for a long time due to frequent cross-border travel? My travel to India was to speak at a business conference, which presumably benefits the Indian economy in some small way.

What we have is the case of a business process gone horribly wrong, one that doesn’t really serve all of the constituents that it’s meant to serve. The process appears to be completely manual, and doesn’t apply the same rules to everyone: some visas were being expedited, but not for business reasons. There’s a mismatch between the information that was offered by email and what the consulate worker was actually empowered to do, or possibly what he chose to do at that moment. There’s excessive unscheduled wait time for participants in the process. And, in the end, it’s the Indian conference organizer (and potentially the attendees) who suffers through no fault of his own: he now needs to find a replacement speaker to come to India on six weeks’ notice.

I’m sure that the Indian government has challenges that the Canadian and American governments can’t even imagine, and I don’t expect to see the same level of technology and automation. However, there are huge opportunities for process improvement here that don’t involve technology, just standardization and a focus on efficiency.

BPM Chapter meeting in Toronto, October 12

On the morning of October 12th, the BPT Group — which rose from the ashes of the BPMG — is having a general-purpose BPM chapter meeting in Toronto open to anyone interested in BPM. Although the agenda is heavily weighted in favour of a demonstration by Metastorm, who are hosting the event, it will be an opportunity to meet other people interested in BPM. Jim Baird, who is organizing this, is actively looking for BPM practitioners to speak at future chapter meetings, so please let him know if you’re willing to talk about your BPM project or know of someone else who might.

Here’s the details for the meeting on October 12th:

Location:

O Beirao Restaurant, 5468 Dundas St. West (side banquet hall entrance). If you’re on public transit, you can walk west from the Kipling station in 10-15 minutes or catch a westbound bus out of the station (such as the 111 East Mall). I’m not sure of the parking situation but there’s sure to be some around.

Agenda:

  • 8:30 – 9:00 Registration and Continental Breakfast
  • 9:00 – 9:10 Welcome and Introduction, James Baird – BPTG North America and Australia
  • 9:10 – 9:15 Welcome from Metastorm (session host), Michael Szczerba – Account Executive – Americas
  • 9:15 – 9:45 BPM or Requirements Analysis – Where to Start?, James Baird
    • The links between BPM and gathering business and system requirements
    • Different approaches to documenting processes
    • How organizations are implementing BPM
    • Measuring BPM success
    • The importance of Business and IT involvement
    • Choosing your first BPM project
  • 9:45 – 10:00 Question and Answer / Discussion period
  • 10:00 – 10:15 Coffee Break
  • 10:15 – 11:00 Demonstration of ProVision BPM by Metastorm, Michael Szczerba
  • 11:00 – 11:45 Networking opportunity and discussion of future topics

You’ll need to RSVP by Tuesday, October 9th if you want to attend, to Judith Baird at 416-252-8405 or Judith.Baird@BPM3inc.

BPMG held a couple of chapter meetings here before the big blow-up earlier this year; one of them was very well attended, the other (due to bad weather) was much less so. In any case, I think that these meetings are a good way to get more of a BPM community going in Toronto, although we definitely need to get some practitioners speaking at them.

Forrester Day 2: Colin Teubner

The last session of the last day, and a significant portion of the audience (especially those headed east) have bailed out but there’s still a few of us hanging around to hear Colin Teubner talk about optimizing business with BPM. I think that he drew the short straw as the junior guy, presenting in the last two sessions back-to-back. 🙂  I think that the two-day format is just the right length; the 2-1/2 days of Gartner last week was just a bit long. Also, starting on Tuesday so that people can travel on Monday rather than having to burn up half their weekend is nice, too.

The central theme is that the ultimate goal of BPM initiatives should be transformation, not just efficiency. As he points out, many companies focus purely on efficiency, trying to trim costs for small wins rather than looking to make a transformative change that can drastically improve the organization’s competitive differentiation and revenue. BPMS is more than just modeling and automation; it includes the whole cycle of monitoring and optimization feeding back to the modeling stage. He showed a slightly different version of the BPM maturity/adoption chart that Connie Moore showed yesterday; I’m still unsure why this is a two-dimensional graph, since it is really a projection on the diagonal axis, but I suppose a one-dimensional representation just doesn’t look as nice.

He then mapped BPM onto the “design for people, build for change” theme of the conference: UI creation and process mindset belong to the design for people side of things, whereas agile processes, SOA connections, business-friendly tools and comprehensive monitoring map to building for change. Different from, but compatible with, the view of BPM in the D4PB4C theme that I covered on slide 26 of my presentation this morning. He talked about why simulation tools are not used as widely as the BPM vendors would have you believe: they’re too hypothetical and require a certain amount of guesswork (although using detailed execution data to populate your simulation can help with this). I also think that they’re a bit too complex and analytical for many of the business analysts who are targeted to use them.

Teubner covered a number of use cases for BPM integrated with other technologies — forms technology to integrate data from other sources, content, BI and more — and the ways in which BPM enables an improved customer experience, both through direct interaction and by informing the environment of an internal employee who is dealing with the customer.

He showed (for about 10 seconds) their Q3 Wave for BPM vendors; I think that this is the human-centric BPM wave, although it really went by so fast that I didn’t catch it, much less any of the vendors’ positions on it.

He had some interesting words about end-to-end organizational visibility and how it allows executives to understand processes and systems by making the link from strategy goals down through other layers; not surprisingly, he discussed this in the context of enterprise architecture.

His final recommendations:

  • Don’t just make processes run faster, make them better and pay special attention to users’ work environments.
  • Use BPM for business improvement, not process improvement, and focus on customer experience.
  • Take an end-to-end view of process and plan BPM as a management discipline (by which he means a business initiative).

That’s it for the Forrester Technology Leadership Forum. I’ve enjoyed it, found the content solid, and the culture a refreshing change from some other large analyst conferences.

Forrester Day 2: The three B’s

I ended up skipping the session after mine at the end of the morning, but had some great hallway conversations with some of the business rules vendors who indicated that they think that I’m on track with what I’m saying about BPM and BR.

For the first of the afternoon sessions, I’m attending a panel discussion on the convergence of the three B’s — BI, BPM and BR — featuring Mike Gilpin (EA and application development), Boris Evelson (BI) and Colin Teubner (BPM). I covered a tiny bit of this topic in slides 22-24 of my presentation this morning, and will be doing a full-length presentation on this same topic at the Business Rules Forum next month in Orlando, so I’m interested to see if the Forrester analysts have the same thoughts on this subject as I do.

They start with the statement that “design for people, build for change” will drive the convergence of the three B’s. Interestingly, although a few people in the room stated that they use BPM and BI together, almost no one raised their hand to the combination of BPM and BR — a combination that I feel is critical to process agility. Gilpin went through a few introductory slides, pointing out that almost no business rules are explicitly defined, but are instead buried within processes and enterprise applications. He sees BI as driving effectiveness in businesses, and the combination of BPM and BR as driving efficiency.

Forrester will be publishing some reports about the convergence of the three B’s, and although there are some two-way combinations in vendor products now, there are no vendors that combine all three in a single product. I’m not sure that this is a bad thing: I don’t think that we necessarily want to see BR or BI become a part of BPM because it ultimately limits the usefulness of BR and BI. Instead, I see BR and BI as services to be consumed by BPM, with BI having the additional role of combining process execution statistics generated by the BPMS with other business data. An explicit question was asked about when to use the BR and BI included in the BPMS versus when to use a third-party best-of-breed BR or BI system; Teubner and Gilpin offered some guidelines for this as well as some examples of each situation, but it’s not completely clear if there’s a distinct boundary between when to use the BPMS’ in-built functionality versus the third-party specialist product.

My message on this topic is that BR is the key to process agility, and BI is the key to process visibility as well as feeding back into BR in order to further increase agility. By using the BR and BI functionality within your BPMS, however, you’re typically not getting full BR or BI functionality, but some limited subset that the BPMS vendor has selected to implement. Furthermore, you can’t reuse that functionality outside the BPMS, and in the case of business rules, a change to the BPMS’ rules often requires retesting and redeploying the process models, and does not apply to in-flight processes. However, if you’re not sure whether you need BI or BR (hint: you do), then using the in-built functionality in the BPMS gives you an easy-to-integrate and lower-cost way to get started.

Moving to a separate third-party business rules system gives you a couple of key advantages: you can reuse the same rules across different processes and across other applications in your enterprise, and changes to a rule affect in-flight processes, since the rule is not executed by the BRE until that point in the process is reached. Moving to a separate third-party business intelligence system also provides the advantage of being able to analyze the process data in the context of other business data, and potentially feed back the results of complex analytics to inform the business rules, which in turn drive the business processes.

The bottom line: BR and BI are used for many applications in the enterprise that are not explicitly process-related, or that combine data from many systems of which the BPMS is just one source. For example, although there are processes embedded within your ERP system, your BPMS may not have direct access to all the information in those processes, and hence the BI that’s part of your BPMS can’t (easily) include that data in its analytics and reporting; a general-purpose BI platform may be much better suited to combining your BPMS statistics with your ERP statistics.
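The in-flight point is worth a concrete sketch: if a process step evaluates a rule by name through a call out to a rules service, then instances started before a rule change still pick up the new behaviour when they reach that step. The rule and process names below are invented, and real BRE and BPMS APIs will differ:

```python
# A stand-in for an external business rules engine: the rule is looked up
# at call time, so a change applies to any process instance that has not
# yet reached this step — including in-flight instances.
RULES = {"credit_limit": lambda case: case["amount"] <= 50000}

def evaluate_rule(rule_name, case):
    return RULES[rule_name](case)

def approval_step(case):
    # The process model knows only the rule's name, not its logic
    return "approved" if evaluate_rule("credit_limit", case) else "referred"

case = {"amount": 60000}        # an in-flight instance
print(approval_step(case))      # → referred

# The business relaxes the rule; no process redeployment needed
RULES["credit_limit"] = lambda case: case["amount"] <= 75000
print(approval_step(case))      # → approved
```

Contrast this with a rule compiled into the process model itself, where the same change would require redeploying the model and would leave already-running instances on the old logic.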

A lot of the conversation in this session, which was very interactive with the audience members, was around whether to use converged products versus separate products. It’s not a completely simple answer, and I’ll definitely be thinking about the use case boundaries between converged and separate products before I show up at the Business Rules Forum to continue this discussion.

Evelson and Teubner will be publishing an initial paper in this area in the next few weeks, using the concepts that they’ve presented here today, but see it as a springboard for more discussion in this area rather than an end-point.