SAPPHIRE: Perfectly Ordinary BPM

I took a guide dog and a GPS, and ventured out onto the show floor to visit the NetWeaver Theater to see Wolfgang Hilpert discuss SAP NetWeaver BPM (that’s the official name) in more detail. Since this is my first real exposure to the product, I’m trying to get as many views of this as possible over the couple of days that I’m here, both to learn about the functionality of the product and hear the messaging and positioning around the product.

He started with similar material to Ginger Gatling’s presentation yesterday, looking at the key dimensions of business processes — structured versus non-structured, human-centric versus system-to-system, and business versus IT driver/focus — and how business processes (and the systems that support them) need to cover all of these dimensions. He also covered key process metamodel concepts — events, rules, data, roles, workflow/task and UI — and why they’re important.

He moved on to talk about the goals and value proposition of NetWeaver BPM, including model-driven development and business-IT collaboration. BPM is an integrated part of the NetWeaver composition environment, which makes it an easy transition for existing NetWeaver developers, although the Eclipse environment needs to provide a simplified perspective for business analysts who are involved in modeling.

There are three parts to the BPM suite:

  • Process Composer, which is the modeling environment;
  • Process Server, where processes are executed;
  • Process Desk, the user interface that is integrated into the portal environment and becomes part of the Universal Worklist.

[Screenshot: SAP BPM process modeling]

I have a huge amount of respect for anyone who talks and demos at the same time, and particularly so when they have a title to the north of VP; Hilpert falls into this category, and he fired up the demo and showed us what this actually looks like.

There’s a nice graphical interface for projects that shows a hierarchy of processes, user interfaces, business logic and services and how they interact, which makes it possible to detect dependencies of reusable components. As I mentioned yesterday, the process modeler is a pretty standard BPMN modeler, although I really like the context-sensitive mini-icons that are displayed around the selected object on the process model for creating the next connected object — nice usability feature. He also showed an interesting feature for creating a data/document artifact with automatic mapping from a process step to the artifact.

[Screenshot: SAP rules management]

The NetWeaver BRM integration is nice, since the rules and processes are designed within a common environment and can be linked up by dragging and dropping a rule onto a process condition instead of having to specify a web service call to the rules engine (although that could be what’s happening under the covers).

We then moved on to the process participant’s user interface, using the NetWeaver Business Client portal. He filled out a simple (text fields only) form — although we didn’t see how this form was designed — which kicked off a business process that appears in the appropriate person’s Universal Worklist. There is a fairly standard task UI once the process is opened from a person’s inbox, which allows adding attachments (files or links), setting parameters using a variety of UI widgets (calendars, drop-down lists), viewing the current step in the graphical context of the entire process, and completing the work step.

[Screenshot: SAP BPM end-user interface]

Update: found some screenshots from Ginger Gatling’s presentation deck and dropped them in.

Moving away from the demo and back to presentation/discussion, he looked at the personas/roles within an enterprise that are involved in BPM, specifically the process architect (just on the IT side of the line), the business analyst (just on the business side of the line) and the business expert/SME (deep within business-land). Although this product release will address the needs of the process architect and business analyst, they are planning for other tools to expand business collaboration supporting the needs of the business analyst and the business expert. Part of that will be runtime visualization tools, such as dashboards, but it sounds like there’s also consideration of some design collaboration that will happen in the business area.

He went through the architecture that Gatling covered yesterday, and reinforced that the process integration layer is a rebranding of their current XI integration layer. The language around the future versions, where there will be a common process platform and SAP application core processes will be extended using the composition environment, is still a bit fuzzy, but we’re talking about things that will happen more than a year away.

There is nothing earth-shatteringly innovative about the SAP NetWeaver BPM suite: this is a perfectly ordinary BPMS. That’s not a criticism, especially considering that this is the first released version: it’s a reasonably full-functioned BPMS out of the box, and that’s all that SAP needs in order to compete within its existing customer base. They’re not trying to be the best BPMS on the market, they want to be the best BPMS for SAP customers.

SAPPHIRE: Henning Kagermann Keynote

I watched the general keynote this morning from the press room rather than finding my place amongst the 15,000 attendees; the first hour covered other announcements, but we did see about 10 minutes on the upcoming BPM product. Since a lot of the audience was likely unfamiliar with BPM, this was a pretty high-level architectural view plus a quick demo of the process modeler and services/rules integration, accompanied by the proclamation "you can change the process without changing code!" This message on the benefit of BPM would have been fresh five years ago; although it’s likely new to a lot of SAP customers, it shows that SAP is definitely playing catch-up in BPM. As I mentioned yesterday, this year’s release of SAP’s BPM will offer little advantage over using a more established BPMS with SAP (and in fact might be less functional), but they’ll hit the sweet spot with the future releases that have a tight integration with SAP core applications.

The demo theme throughout the keynote was feeding information to a portal/dashboard from multiple sources, including their core applications and BPM.

The message across the keynote is one that resonates with what we’ve been seeing in BPM for a while: transparency into business operations, agility in business processes, collaboration amongst stakeholders, and less coding required for implementation. Add this to the strength that SAP has in building software for running enterprises, and it’s pretty powerful. Whether they can shift from a legacy of highly-customized rigid ERP applications to this new world of flexible composite applications remains to be seen.

SAPPHIRE: Wolfgang Hilpert on BPM Overview

I’m picking and choosing my sessions carefully, in part because I have some prearranged meetings specifically about BPM. Here with all my Enterprise Irregulars blogging compatriots (many of whom I’m meeting in person for the first time), we all were given a personalized schedule of meetings tuned to our particular area of interest — very well organized, and still leaving some time in the schedule for me to go to some additional sessions.

I had a one-on-one meeting with Wolfgang Hilpert, SVP of NetWeaver BPM, this afternoon; funnily enough, just after I attended Ginger Gatling’s session this morning, I had lunch in the press area, and when I mentioned that I’d seen the session on the new SAP BPM, three pairs of ears at the table swiveled around. These three, whom I didn’t know (name tags, unfortunately, hang below the level of the table when seated), gave me a light grilling on my opinions of what I had seen; although I figured that they worked for SAP, it wasn’t until they stood up that I saw Hilpert’s name tag.

By the time that we had our meeting, then, he knew that I’d seen a product overview, and he’d already heard my views on it, so we could jump right to some of the good stuff. As I suspected, and wrote in my earlier post, they’re not looking to compete in the general BPMS market for non-SAP customers, but see themselves as becoming the BPM tool of choice for SAP customers. More than just an appendage to SAP, their BPM will allow for orchestration of web services within heterogeneous environments (as do all other BPMS), plus provide the services repository and UDDI registry. He also sees them as eventually being able to identify SAP business objects directly as part of the orchestration, allowing for easy passing of the object from one step to another; another tight coupling with SAP applications that will win them an advantage. Their long history with enterprise software likely does give them a unique insight into how enterprises work, and they have innovations such as user role abstractions on a business level through interaction with other enterprise systems that contain role information, not just LDAP directories.

They built the BPM part of the product themselves after surveying the market and not finding what they wanted at the right price (I asked if Lombardi was too expensive but didn’t get much of a reply 🙂 ), then bought in the rules through the Yasu acquisition. The business rules can still be used as a standalone rules/decisioning engine, but they’re also tightly integrating it with the Eclipse process modeling environment for integrated lifecycle management of processes and rules. They also perceive their product as being more scalable, but frankly, every vendor says that to me so I’d like to see some benchmark data on that.

The composition environment is currently in closed beta, will open up to a few more beta sites over the summer, then be released in September. Because of the extensive beta period, they’re hoping to shortcut the usual ramp-up process and have this generally available shortly after that.

SAPPHIRE: BPM in SAP NetWeaver

My first time at SAPPHIRE, and I have one initial impression: this conference is huge. Most of you probably already knew that, but for me, 1,500 people at a conference is big, and this one is 10 times that size. The press room is the size of a regular conference’s general session ballroom. I just hiked 15 minutes to get to a session. More sessions run simultaneously than you’ll find in total at most conferences. There are 30 official conference hotels. Wow. And I have to report that there’s 5 bars of free wifi coverage everywhere in the conference center.

After a review of the massive schedule, I finally made it to a session: Ginger Gatling, SAP NetWeaver BPM Product Manager, giving an overview of the BPM component in SAP, including a demo and some thoughts on the future functionality. She started with a discussion of the evolution of BPM, including the drivers that have moved us from the old-style workflow and EAI to the present-day collaborative design environment where multiple people might be working on modeling different components, from human-facing processes to rules. For SAP, however, a lot of this is future state, not what they have now in the shipping product.

Currently, they offer the following functionality through various products:

  • Manage business tasks across applications: manage and resolve business tasks centrally in one work list
  • Business process integration/automation: integrate SAP and non-SAP business applications and automate the message flow between systems with an executable process model
  • Involve business users in automated processes for managing by exception: alerting in case of exceptions
  • Application workflow management: manage production workflow in your SAP application

Moving forward, there will be a more comprehensive BPM platform:

  • A composition environment for human-centric process modeling
  • Packaged processes, which are the same old processes embedded within SAP applications
  • Process integration for system-centric service calls

From a layering standpoint, the composition layer creates the process at the top layer, which calls the SAP core processes via web services calls; either layer can call into the process integration layer, which makes calls to other systems for system-centric integration. Around all of this is an enterprise services repository for governance, which can contain both the services that access SAP applications and those that access third-party systems.

IDS Scheer still holds an important position for enterprise modeling and business process analysis (including direct modeling of processes within SAP applications), but it appears that for now, composite processes will have to be remodeled in the new SAP composition environment. In the future, when BPMN 2.0 is released, they’ll use that (which will include BPDM for serialization) for transferring models between ARIS and the SAP composition environment. In other words, you can use ARIS to model the core SAP processes, then use the composition environment (NetWeaver BPM or whatever the "Galaxy" will be called on release) to extend the core processes, but these are two separate activities using two separate process modeling environments.

The new BPM product will include a graphical Eclipse-based BPMN modeler that directly translates to process execution using a shared model approach, and is embedded within SAP NetWeaver CE for an integrated composition experience, service-based connectivity and enterprise services repository. There will also be integration of business rules into the composite processes, using the Yasu technology acquired by SAP last year.

Eventually, they will evolve to a common process layer where it will be possible to use the BPM tool to extend core SAP processes, and have a single process model across both — this part is pretty exciting.

She gave us a demo of the process modeler, which was likely a lot more exciting to the rest of the audience than to me. 🙂 Pretty standard BPMN, with some nice context-sensitive tools for creating the next step when you have one selected. There’s a direct link to the enterprise services repository to call a service at a step, or to a services registry, or directly to a WSDL file. There’s also a direct link to the business rules decision tables within the same Eclipse environment. From inside the rules modeling environment, it appears that there’s a connector to directly grab data from the core SAP environment, which would remove the need to have the composite process extract the data from SAP and pass it to the rules engine.

There were lots of audience questions, running longer than the allotted time; many of these didn’t mean much to me since I’m a SAP newbie, but there was some amount of excitement about how SAP BPM will replace Guided Procedures, which will be phased out as the BPM product releases become fully functional.

My immediate impression is that in the near term, they’re creating a BPM platform that’s fairly loosely coupled (via web services) with core SAP applications, which doesn’t appear to provide any advantage over using a third-party BPMS with SAP applications; in fact, more mature BPM suites are likely to provide greater functionality. In the longer term, however, there will be much tighter integration of BPM and SAP core applications, moving to a common process model and platform: this will be a significant driver for the adoption of this product by existing SAP customers.

Business Rules Webinar Q&A

It was a busy week last week at TUCON and I completely forgot about the questions from the Q&A of the Business Rules Forum webinar that I did on the 24th. I’m not sure if the replay is available yet; I’ll post a link when I hear about it.

Here are my answers to the questions that came up during the presentation, although I responded to some of these at the end. Where the question wasn’t clear, I took a stab at an interpretation; if I missed the point, please add a comment to this post with your clarification and I’ll follow up.

Explain the relationship between business models and BPM.

Not sure of the exact intent, but I think that this is asking about the relationship between business (process?) modeling and BPM. Business models of various sorts, including business process models, are often created by an organization to provide a high-level, business-oriented view of their operation. From an enterprise architecture standpoint, these are the models in the highest level of the architecture that may be created by, and are always understandable by, a non-technical business analyst. In the case of business process models, these are created to model the flow of a business process, usually in a flow-chart or swimlane type of diagram. In many cases, these are created in a standalone modeling tool — either a simple desktop application like Microsoft Visio, or a more comprehensive tool such as IDS Scheer’s ARIS — but may also be modeled directly in the process modeling environment of a BPM suite (BPMS). In this latter case, a process model can be directly translated to an executable process.

Is there any benefit to implementing a BRM without a BPM?

Yes, there are many cases of using a BRMS separately from BPMS: the rules/decisions may be accessed directly as part of a manual process, where a user enters in the required parameters and is given a decision back in return, or they may be called from other applications such as a CRM.

Please mention vendors or products by name, even if caveats apply.

and

Can you name products that support what has been presented?

and

What are the methods & technology tools used for BRM & BPM?

I can’t recall if we were talking about BPMS or BRMS vendors here, so I’ll try to cover both. To hit the major vendors, take a look at which ones are included in the reports by the big analysts. Gartner includes the following BPMS vendors in its Magic Quadrant for BPMS, published in December 2007: Adobe, Appian, Ascentn, AuraPortal, BEA, Captaris, EMC, Fujitsu, Global 360, IBM, Intalio, Lombardi, Metastorm, Microgen, Oracle, Pegasystems, Savvion, Singularity, Software AG, SunGard, TIBCO and Ultimus. Forrester splits up the market into four categories with several vendors in each, which I’ve listed in a previous post.

On the BRMS side, Forrester recently issued a report on BRMS vendors in which they evaluated CA, Corticon Technologies, Experian, Fair Isaac, Haley Limited, ILOG, Innovations Software Technology, InRule Technology, Intelligent Results, Pegasystems, and SAP.

There are other vendors of both types, but this covers the major players. Also notice that Pegasystems plays in both markets — and in fact is a leader in both — since its BPMS is based on a rules engine.

Who are some of the vendors with tight integration between BPM and BRM?

Pegasystems is the obvious starting point, since they use a rules engine as an underlying platform for their BPMS. Many BPMS vendors don’t want to talk about a tight integration with a third-party BRM since that implies a weakness in their own rules capabilities. All BPMS vendors, through their support for invoking web services, can integrate loosely with BRM.
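
To make the loose-coupling point concrete, here’s a minimal Java sketch of what a web service call from a process step to a rules engine might look like. The service interface, endpoint URL and rule set name are all invented for illustration; in practice the client-side interface would be generated from the vendor’s WSDL with a tool like wsimport.

```java
import java.net.URL;
import javax.jws.WebService;
import javax.xml.namespace.QName;
import javax.xml.ws.Service;

// Hypothetical service endpoint interface for a rules-engine web service;
// the real one would come from the BRMS vendor's WSDL.
@WebService
interface DecisionService {
    String evaluate(String ruleSetName, String factsXml);
}

public class LooseRulesIntegration {
    public static void main(String[] args) throws Exception {
        // Endpoint URL and service name are illustrative placeholders.
        URL wsdl = new URL("http://rules.example.com/DecisionService?wsdl");
        QName serviceName = new QName("http://rules.example.com/", "DecisionService");
        Service service = Service.create(wsdl, serviceName);
        DecisionService port = service.getPort(DecisionService.class);

        // A BPMS decision step would make exactly this kind of call,
        // passing process instance data as the rule inputs.
        String decision = port.evaluate("LoanApproval",
                "<facts><creditScore>640</creditScore></facts>");
        System.out.println("Decision: " + decision);
    }
}
```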

In your opinion, have any BRMS suites achieved robust BPM capabilities?

Only Pegasystems, to my knowledge. It’s more likely that a BPMS will achieve BRM capabilities rather than the other way around, in my opinion.

How could you change a business rule and have it only affect new BPM processes and not in flight process instances?

There are two ways to do this. First, if all parameters that drive the rules are known at the beginning of the process, the process instance could invoke the rules immediately after it is created, and store the decision results until they are required; since the rules are executed at the time that the process instance is created, the instance will not be affected by any changes to rules while it is in progress. Second, a process can call a specific version of a rule, assuming that the BRMS supports rules versioning. That way, any process instances created from a specific version of a process definition can call a specific version of a rule, even if the rule has changed since then. Newer process definitions could be changed to call a later version of the rule.
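
Here’s a rough Java sketch of both approaches; the BRMS client interface and rule set name are hypothetical, standing in for whatever versioning API a particular rules engine exposes.

```java
import java.util.Map;

// Hypothetical BRMS client interface: many commercial rules engines let a
// caller pin a specific deployed version of a rule set.
interface VersionedRuleService {
    Map<String, Object> evaluate(String ruleSet, int version, Map<String, Object> facts);
}

class OrderProcessInstance {
    private final VersionedRuleService rules;
    private final int pinnedRuleVersion;              // captured at instance creation
    private final Map<String, Object> storedDecision; // option 1: decide up front and cache

    OrderProcessInstance(VersionedRuleService rules, int currentRuleVersion,
                         Map<String, Object> initialFacts) {
        this.rules = rules;
        this.pinnedRuleVersion = currentRuleVersion;
        // Option 1: if all rule inputs are known at creation time, invoke the
        // rules immediately and store the result; later rule changes cannot
        // affect this in-flight instance.
        this.storedDecision = rules.evaluate("ApprovalRules", pinnedRuleVersion, initialFacts);
    }

    Map<String, Object> storedDecision() {
        return storedDecision;
    }

    Map<String, Object> decideMidProcess(Map<String, Object> facts) {
        // Option 2: evaluate later in the process, but always against the
        // version pinned at creation, so the instance behaves consistently
        // even after the rule set is updated for new instances.
        return rules.evaluate("ApprovalRules", pinnedRuleVersion, facts);
    }
}
```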

You said that a BPMS will call a BRMS (typical scenarios). How would the BRMS know the scope of what needs to be checked? For example, if you have the rule "some applicant of each loan application must have a credit score of 600". When the business process for loan applications calls the BRMS, how does it determine the set of applicants that need to be checked?

I think that the question is about where the BRMS gets its information that is used as parameters for the decisions. This would typically be passed to the BRMS from the calling application, in this case, the BPMS. The BPMS may need to make calls to other systems in order to get this information, then forward it to the BRMS: remember that part of the role of the BPMS is to orchestrate multiple systems and pass data along between them, including the BRMS.
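
A minimal sketch of that orchestration pattern, using the loan application example from the question; the adapter and rule service interfaces are invented for illustration, not any vendor’s API:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative interfaces standing in for the integration adapter and rules
// call that the BPMS would orchestrate.
interface CreditBureau { int creditScore(String applicantId); }
interface LoanRuleService { boolean someApplicantMeetsMinimumScore(List<Integer> scores); }

class LoanApplicationStep {
    private final CreditBureau bureau;
    private final LoanRuleService rules;

    LoanApplicationStep(CreditBureau bureau, LoanRuleService rules) {
        this.bureau = bureau;
        this.rules = rules;
    }

    boolean checkApplicants(List<String> applicantIds) {
        // The process instance knows the scope: which applicants belong to
        // this particular loan application. It fetches the data the rules
        // will need from other systems...
        List<Integer> scores = new ArrayList<>();
        for (String id : applicantIds) {
            scores.add(bureau.creditScore(id));
        }
        // ...then forwards the assembled facts to the BRMS as parameters.
        return rules.someApplicantMeetsMinimumScore(scores);
    }
}
```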

Do you normally see that the same business users are maintaining both the processes and rules or are they normally different business users?

If you’re talking about the business analysts that would be designing the processes or rules, it is best if they are the same — so that they can decide what happens in a process versus what happens in a rule — but often are different people due to the training requirements. If these are separate roles, then the process analysts need to learn enough about rules to know when to request that a rule set be created for them to call from their processes.

How would you define the relationship between BPM, BRM and RBA (Run Book Automation)?

I’m not that familiar with RBA, but it is focused on IT and systems processes, not business processes. At TUCON last week, however, one of the presentations covered using TIBCO products for IT processes, although the presenter didn’t refer to it as RBA.

Do you agree that BRM and BPM have to be married with the SME from the business side and the SME from the IT side to be successful?

I’m unclear on what this question means. "SME from the business side" means, to me, someone who is an expert on the business being performed; I’m not sure what "SME from the IT side" means. Both BRM and BPM are most successful when there is collaboration between business and IT: the business analysts doing the high-level modeling of the rules and processes to ensure that they meet the business requirements, and IT making sure that the technical underpinnings (such as calls to web services) are in place.

Do you have a list of feature set for BRM and BPM for product evaluation?

and

Could we get a list of recommended BPMS and Rules Management systems and why they are recommended?

Gartner and Forrester both publish comparative reports on BPMS and BRMS; I suggest that you start there.

What is the difference between BRM and procedures / governing document / policies? Give examples of BRM.

Policies, procedures and governing documents are the "rules" by which the business operates, but may not be automated in any way: many organizations just have people refer to a policies and procedures guide to tell them what to do in a manual process. BRM allows you to codify those policies and procedures so that they can be automated, and are executed the same way every time.

What about open source offerings? Have you worked or reviewed any of those?

Drools (from JBoss) and NxBRE are two open source BRMS offerings. jBPM (also from JBoss) and Intalio are both open source BPMS offerings. I recently did a review of Intalio but haven’t yet published what I saw; I haven’t worked with any of the other products. Many open source offerings don’t have the full functionality of their commercial counterparts, so they may not be included in the analysts’ comparative reports; the recent Gartner Magic Quadrant for BPMS is the first one in which Intalio has been included, for example.

Can you provide a simple example of how BPM and BRM are applied in practice?

A typical example that I’ve seen is in claims processing. There are many specific policies and procedures that must be followed to process a claim, and many BPM implementations just leave the decisions based on those policies to a person who is participating in the process: for example, give the work step to a person and have them decide the type of the claim and what region should be processing it. By adding BRM, these decisions can be automated based on data that is already known.
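
As a rough illustration of what codifying such a policy looks like, here’s a Java sketch of claims-routing decisions pulled out of a person’s head and into executable rules; the fields, thresholds and regions are all invented:

```java
// A minimal sketch of a claims-routing policy as executable rules.
class Claim {
    String lossType;     // e.g. "vehicle", "property", "injury"
    String stateCode;    // where the loss occurred
    double claimedAmount;
}

class ClaimRoutingRules {
    // Decide the type of the claim from data already known at intake.
    static String claimCategory(Claim c) {
        if ("injury".equals(c.lossType)) return "BODILY_INJURY";
        if (c.claimedAmount > 25_000) return "COMPLEX";
        return "STANDARD";
    }

    // Decide which region should process the claim.
    static String processingRegion(Claim c) {
        switch (c.stateCode) {
            case "CA":
            case "OR":
            case "WA":
                return "PACIFIC";
            default:
                return "CENTRAL";
        }
    }
}
```

Because the decision is code rather than a procedures guide, it executes the same way every time, and changing the policy means changing the rule in one place.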

Do you feel BPM can be used as a tool to integrate compliance management systems? e.g. OSH, Environment, Quality etc.

I’m not a compliance specialist, but I see many organizations using both BPM and BRM to help with their compliance efforts, since both can help to standardize processes and allow for easy auditing of what was actually done. As for integration with compliance management systems, that would depend on those systems and whether they provide a web services interface or other API.

What are some of the software packages you can purchase for extracting business rules?

The major BRMS products typically include tools for mining business rules from existing systems; you’ll need to check their functionality against the particular systems and platforms from which you want to extract rules to see if they’ll work for you.

How are the BRMS incorporated with any testing tools?

Many of the BRMS vendors have simulation and testing tools as part of their suite, specifically to test if the rules are complete and non-contradictory.
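
For a sense of what such a test does, here’s a sketch that sweeps the input space of the illustrative claims-routing rules from the earlier answer and checks that every combination yields a decision; real BRMS testing tools do this (plus contradiction detection) far more systematically:

```java
// Claim and ClaimRoutingRules are the illustrative rule set sketched in the
// earlier claims-processing answer.
public class RuleCoverageCheck {
    public static void main(String[] args) {
        String[] lossTypes = {"vehicle", "property", "injury"};
        double[] amounts = {1_000, 25_001};
        for (String lossType : lossTypes) {
            for (double amount : amounts) {
                Claim c = new Claim();
                c.lossType = lossType;
                c.claimedAmount = amount;
                c.stateCode = "CA";
                // Completeness: every input combination must yield a decision.
                String category = ClaimRoutingRules.claimCategory(c);
                if (category == null) {
                    throw new AssertionError(
                        "No rule fired for " + lossType + " / " + amount);
                }
            }
        }
        System.out.println("All input combinations produced a decision.");
    }
}
```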

TUCON: Process Plans using iProcess Conductor

The last session of the day — and likely the last one of the conference for me, since I think that the engineering roundtables tomorrow morning are targeted at customers — was Enrique Goizueta of TIBCO discussing a "Lego approach" to creating business processes: dynamic BPM using the iProcess Conductor. Bruce Silver raved about the Conductor after seeing it at the solutions showcase on Tuesday, and it seems to have been a well-kept secret from those of us who watch the BPM space.

Goizueta started by discussing complex processes such as the cross-selling bundling processes seen in telecommunications and financial services, or claims management that may include both property damage and bodily injury exposures. In many cases, there are too many alternatives to realistically model all process possibilities explicitly, or the process is dynamic and specific instances may change during execution. The key is to identify reusable parts of the process and publish them as discrete processes in a process library, then mix them together at runtime as required for the specific situation. Each of these is a fully-functional, self-contained process, but the Conductor packages up a set of these at runtime and manages them as a "plan", presenting this package as a Gantt chart similar to a project plan. As with tasks in a project plan, you can set dependencies within a plan in Conductor, e.g., not starting one process until another is completed, or starting one process two weeks after another process starts. The iProcess engine still executes the processes, but Conductor is a layer above that to allow you to manage and monitor all the processes together in order to manage dependencies and identify the critical path across the set of processes.

[Screenshot: TIBCO iProcess Conductor]

This is very cool just as it is, but the Conductor also allows you to change a plan while it’s executing, adding and canceling processes on the fly.
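
To make the plan concept concrete, here’s a data-model sketch in Java of self-contained processes composed at runtime with project-plan-style dependencies, including adding and canceling processes in a running plan. This is just an illustration of the idea, not the iProcess Conductor API:

```java
import java.util.ArrayList;
import java.util.List;

// Each planned process is a fully-functional, self-contained process
// definition published in the process library.
class PlannedProcess {
    final String name;
    PlannedProcess(String name) { this.name = name; }
}

// Project-plan-style dependencies between processes in a plan.
class Dependency {
    enum Type { FINISH_TO_START, START_TO_START }
    final PlannedProcess predecessor, successor;
    final Type type;
    final int lagDays; // e.g. start two weeks after the predecessor starts

    Dependency(PlannedProcess pred, PlannedProcess succ, Type type, int lagDays) {
        this.predecessor = pred;
        this.successor = succ;
        this.type = type;
        this.lagDays = lagDays;
    }
}

class Plan {
    final List<PlannedProcess> processes = new ArrayList<>();
    final List<Dependency> dependencies = new ArrayList<>();

    // Dynamic BPM: the plan can change while it's executing.
    void addProcess(PlannedProcess p) { processes.add(p); }
    void cancelProcess(PlannedProcess p) {
        processes.remove(p);
        dependencies.removeIf(d -> d.predecessor == p || d.successor == p);
    }
}
```

In the claims demo described below, for example, the vehicle damage and personal injury processes would be FINISH_TO_START predecessors of the liability claim process.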

He gave us a demo of Conductor for auto insurance claims management, where both vehicle damage and personal injury claims have been made, and both must be completed before processing of the liability claim can begin.

For processes that always run together as single instances, such as a loss adjustment report followed by a vehicle repair claim, I’m not sure why you would represent these as separate processes chained end-to-end in the plan rather than as subprocesses called by a single process. There are other parts of this, however, where the benefit of using Conductor is clearer, such as the ability to dynamically add a second liability claim a week into the process.

As Bruce pointed out, this is really case management, but it’s pretty nice case management. SLAs and critical paths can now be managed across the entire plan as well as for each individual process within it, and there are lots of examples of complex processes that could benefit from this type of dynamic BPM.

Tonight we’re all off to the Exploratorium, where TIBCO is hosting a private party for us to check out the fun and interactive science exhibits. I’m flying back to Toronto tomorrow, which might give me a few hours on the flight to finish up some other blog posts that I’ve been working on, and watch for my coverage of SAPPHIRE next week from Orlando.

TUCON: BPM Health Insurance Case Study

Both Patrick Stamm (CTO) and Kevin Maloney (CIO) of Golden Rule Insurance were on hand to discuss their experiences in building a BPM infrastructure. They started out looking at BPM because of the multiple redundant systems and applications that they have, which is endemic in insurance: multiple ratings engines, multiple policy systems and multiple claims systems due to acquisitions and leapfrogging technologies. They needed to be more responsive and agile to changing business requirements, and increase end-to-end process visibility and management.

As they started looking at enterprise-wide BPM, they had a number of objectives:

  • Improving scalability
  • Improving cycle time and quality of process
  • Facilitating self-service on the web
  • Harvesting rules from custom legacy systems
  • Reducing reliance on paper

This presentation focused on their new business process, from application submission through underwriting to issuance of the policy. Not surprisingly, adding BPM to underwriting was one of their significant challenges here; underwriting is often perceived as being as much of an art as a science, and I’ve seen a lot of resistance to introducing BPM into underwriting in many organizations that I’ve worked with.

They wanted to be strategic about how they implemented BPM, and established governance for the entire BPM program early on in the process. This allowed them to take a big-picture approach, and led them to change how they do development by incorporating offshore development for the first time. The architecture of the TIBCO toolset allows them to get a lot of reusability across the different business silos (which still stay separate above the common platform), and the scalability helped them with both business continuity and business growth.

They have a 5-layer logical architecture:

  • UI layer, including General Interface, VB and other UI platforms
  • Services layer, strangely shown above the BPM layer, although it is called directly from the UI layer in some cases as well as from the BPM layer
  • BPM layer, which seems to actually show their queues rather than their business processes, which makes me wonder what the processes actually look like beyond a simple one-step queue
  • EAI layer, including all the adapters
  • Data access layer

Some of the highlights of their New Business process in BPM:

  • Mainframe integration to eliminate redundant data entry, triggering multiple mainframe transactions from a single BPM interface
  • Integration of business rules to eliminate errors from incorrect riders, saving underwriters time in researching which riders are applicable in which state
  • Integration with third parties, such as MIB (Medical Information Bureau) to automatically retrieve data from these sources rather than having users look it up manually on those parties’ web pages

The results that they’ve seen in less than a year since they’ve deployed:

  • New business volume is up over 50% with essentially the same number of staff
  • Applications processed per FTE is up over 30%
  • Cycle time is significantly reduced, as much as 30% in some cases
  • Better quality and consistency, with several error types eliminated
  • Improved visibility into business processes through better and more timely metrics and reporting

Their lessons learned:

  • Implementation partner selection is key: they’ve been happy with TIBCO as a product partner, but they had a bit of a rocky time with their first TIBCO integration partner and started over four months later. They still did the implementation in 11 months total, so really seven months from the point of restart.
  • You need to develop internal expertise in the tool and technology.
  • The first project should not be mission critical, and there must be a contingency plan. Funny, they didn’t consider New Business to be mission critical, but in reality, reverting to paper is an easy fallback in that case.
  • Don’t underestimate the impact that BPM will have on operational management and work culture.

This sounds like a fairly standard insurance implementation (I’ve done a few of these), but I like how they’re moving into the use of rules, and see the introduction of rules as having a significant impact on their process efficiency and cycle time.

TUCON: Predictive Trade Lifecycle Management

I switched over to the architecture stream to see the session on trade lifecycle management using BusinessWorks and iProcess, jointly presented by Cognizant and TIBCO. Current-day trading systems are under a great deal of stress because of increased volume of trades, more complex cross-border trades, and greater compliance requirements.

When trades fail, for a variety of reasons, there is a risk of increased costs to both the buyer and seller, and failed trade management is a key process that bridges between systems and people, potentially in different companies and locations. If the likelihood of a trade failure could be predicted ahead of time — based on some combination of securities, counterparties and other factors — those most likely to fail can be escalated for remediation before the trade hits the value date, avoiding some percentage of failed trades.

The TIBCO Event Fabric platform can be used to do some of the complex event processing required to do this; in fact, failed trade management is an ideal candidate for CEP since the underlying reasons for failure have been studied extensively and are fairly well-understood (albeit complex). Adding BPM into the mix allows the predicted failed trade situation to be pumped into BPM for exception processing.
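
Here’s a rough Java sketch of that pattern: score incoming trade events against known failure indicators, and hand likely failures to a BPM exception process before the value date. The thresholds, field names and process-start interface are all invented for illustration, not TIBCO APIs:

```java
// An incoming trade event with a few of the factors that correlate with
// settlement failure (all illustrative).
class TradeEvent {
    String counterparty;
    String securityId;
    boolean crossBorder;
    int daysToValueDate;
}

// Stand-in for whatever mechanism kicks off a BPM exception process.
interface ExceptionProcessStarter {
    void startRemediation(TradeEvent t, double failureScore);
}

class FailedTradePredictor {
    private final ExceptionProcessStarter bpm;
    FailedTradePredictor(ExceptionProcessStarter bpm) { this.bpm = bpm; }

    void onTrade(TradeEvent t) {
        double score = 0;
        if (t.crossBorder) score += 0.3;                           // complex settlement chain
        if ("CPTY-HIGH-RISK".equals(t.counterparty)) score += 0.4; // known problem counterparty
        if (t.daysToValueDate <= 1) score += 0.2;                  // little time left to fix problems

        // Above a threshold, escalate to a BPM remediation process so a
        // person can intervene before the trade actually fails.
        if (score >= 0.5) {
            bpm.startRemediation(t, score);
        }
    }
}
```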

The only thing that surprised me is that they’re not doing automated detection of the problem scenarios, but relying on experienced users to identify which combinations of parameters are likely to result in failed trades.

TUCON: Using BPM to Prioritize Service Creation

Immediately after the Spotfire-BPM session, I was up to talk about using BPM to drive top-down service discovery and definition. I would have posted my slides right away, but one of the audience members pointed out that the arrows in the two diagrams should be bidirectional (I begged forgiveness on the grounds that I’m an engineer, not a graphic artist), so I fixed that up before posting to Slideshare:

My notes that I jotted down before the presentation included the following:

  • SOA should be business focused (even owned by the business): a top-down approach to service definition provides better alignment of services with business needs.
  • The key is to create business-granular services corresponding to business functions: a business abstraction of SOA. This requires business-IT collaboration.
  • Build thin applications/processes and fat services to enable agile business processes. Fat services may have multiple operations for different requirements, e.g., retrieving/updating just the customer name versus the full customer record in an underlying system (see the sketch after this list).
  • Shared business semantics are key to identifying reusable business services: ensure that business analysts creating the process models are using the same terminology.
  • Seek services that have the greatest business value.
  • Use cases can be used to identify candidates for services, as can boundary crossings in activity diagrams.
  • Process decomposition can help identify reusable services, but it’s not possible to decompose and reengineer every process: look for ineffective processes with high strategic value as targets for decomposition.
  • Build the SOA roadmap based on business value.
  • SOA isn’t (just) about creating services, it’s about building business processes and applications from services.
  • Services should be loosely-coupled and location-independent.
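
As promised above, here’s a minimal Java sketch of a fat business service with operations at different granularities, so that thin processes can ask for exactly what they need; the operations and types are hypothetical:

```java
// A "fat" business-granular service: multiple operations covering different
// requirements against the same underlying system.
interface CustomerService {
    // Lightweight operation for processes that only need the name.
    String getCustomerName(String customerId);

    // Full-record operation for processes that need everything.
    CustomerRecord getCustomerRecord(String customerId);

    // Narrow update, so a thin process doesn't round-trip the whole record.
    void updateCustomerName(String customerId, String newName);
}

class CustomerRecord {
    String id;
    String name;
    String address;
    String segment;
}
```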

There were some interesting questions arising from this, one being when to put service orchestration in the services layer (i.e., have one service call another) and when to put it in the process layer (i.e., have a process call the services). I see two facets to this: is this a business-level service, and do you want transparency into the service orchestration from the process level? If it’s not a business-level service, then you don’t want business analysts having to learn enough about it to use it in a process. You can still do orchestration of technical services into a business service using BPM, but do that as a subprocess, then expose the subprocess to the business analyst; or push that down to the service level. If you’re orchestrating business-level services into coarser business-level services, then the decision whether to do this at the service or process level is about transparency: do you want the service orchestration to be visible at the process level for monitoring and process tracing?
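
A small sketch of the push-it-down-to-the-service-level option: the process model sees one business-granular call, and the technical orchestration is hidden inside the service (or could equally be exposed as a subprocess when monitoring transparency matters). All names here are illustrative:

```java
// Technical services that a business analyst shouldn't need to understand.
interface PolicySystem { String fetchPolicyXml(String policyId); }
interface RatingEngine { double rate(String policyXml); }

// One business-granular service composed from the technical services; the
// process layer sees a single "get a quote" call.
class QuoteBusinessService {
    private final PolicySystem policies;
    private final RatingEngine rating;

    QuoteBusinessService(PolicySystem policies, RatingEngine rating) {
        this.policies = policies;
        this.rating = rating;
    }

    double quote(String policyId) {
        // The orchestration of the two technical calls is invisible at the
        // process level; expose it as a subprocess instead if you need to
        // monitor or trace the intermediate steps.
        String policyXml = policies.fetchPolicyXml(policyId);
        return rating.rate(policyXml);
    }
}
```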

This was the first time that I’ve given this presentation, but it was so easy because it came directly out of my experiences. Regardless, it’s good to have that behind me so that I can focus on the afternoon sessions.