Business Rules and Business Events: Where CEP Helps Decisions #brf

To finish off the second day of Business Rules Forum, Paul Vincent of TIBCO spoke about events and event-driven architecture as a useful way of dealing with business rules. TIBCO is best known for their messaging bus (although some of us know it more for the BPM side), and events are obviously one of the things that can be carried by the bus, or generated from other messages on the bus. The three major analysts who presented here this week – Jim Sinur of Gartner, Mike Gualtieri of Forrester, and Steve Hendrick of IDC – all stressed the importance of events and CEP; in fact, Gualtieri stated that CEP is the future of business rules in his breakfast roundtable this morning.

Going back to the basics of business rules, rules can be restrictions, guidelines, computations, inferences, timings and triggers; the last two are where events start to come into play. Rules are defined through terms and facts; some facts may be events, and rules are enforced as events occur. Business rules drive process definitions and the decisions made within business processes, and mapping between rules, processes and decisions is most easily done from an event perspective. Events are key to business rule evaluation and enforcement, where events are triggers for both processes and the rules that determine the decisions within those processes: an event triggers a process, which in turn calls a decision service; or an event triggers a change to a rule, which in turn changes the decisions returned to a process. In fact, there's a fine line between business processes and event processing if you consider how an event might impact an in-flight event-triggered process, and Paul declared that BPM is really just a constrained case of CEP.
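
To make the event-to-process-to-decision chain concrete, here is a minimal sketch in Python; the event shape, the rule threshold and all of the names are my own inventions for illustration, not Paul's example or any TIBCO API:

```python
def decision_service(facts: dict) -> str:
    """Decision service: evaluate rules against the facts, return a decision."""
    if facts.get("order_total", 0) > 10_000:   # hypothetical rule
        return "manual_review"
    return "auto_approve"

def order_process(event: dict) -> None:
    """Process instance triggered by an event; it calls the decision service."""
    decision = decision_service(event["payload"])
    print(f"order {event['payload']['order_id']}: decision = {decision}")

# An event arrives and triggers the process.
event = {"type": "OrderReceived", "payload": {"order_id": 42, "order_total": 15_000}}
if event["type"] == "OrderReceived":   # trigger rule: which events start which process
    order_process(event)
```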

Having taken over the world of BPM, he moved on to BRM, and showed how CEP systems are better for managing automated rules (when all you have is a CEP system, everything looks like an event, I suppose 🙂 ) since all decisions are event-driven, and CEP systems monitor simple events and decisions to identify patterns in real time by combining rules, events and real-time data in the same system, allowing organizations to react intelligently to business events. He walked through an example architecture for real-time event processing (which happens to be TIBCO's CEP architecture): a distributed CEP framework including an event bus and data grid, plus rule maintenance and execution, and real-time analytics. This allows historic patterns to be detected in real time (which sounds like a contradiction), while providing the decision management interfaces, rule agents and real-time dashboards. Rather than having a listener application feeding a rules engine, events are fed directly to the event processing engine in an event-driven architecture. He also covered other aspects, such as rule definition and decision services, showing how EDA provides a simpler and more powerful environment than standard BRMS and SOA.
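
As a rough illustration of what detecting a pattern in real time means in code (a toy sketch, not TIBCO's engine; the event type, window size and threshold are all assumed), simple events are correlated in a sliding time window and a higher-level business event is emitted when the pattern matches:

```python
# Toy pattern detection: N failures in a sliding window; all thresholds made up.
from collections import deque

WINDOW_SECONDS = 60
FAILURE_THRESHOLD = 3
recent_failures: dict[str, deque] = {}   # account id -> timestamps of failures

def on_event(event: dict) -> None:
    """Called for every raw event on the bus; no listener application in between."""
    if event["type"] != "LoginFailed":
        return
    window = recent_failures.setdefault(event["account"], deque())
    window.append(event["ts"])
    while window and event["ts"] - window[0] > WINDOW_SECONDS:
        window.popleft()                 # expire events outside the window
    if len(window) >= FAILURE_THRESHOLD:
        print(f"pattern detected: possible attack on account {event['account']}")

for ts, account in [(0, "a1"), (10, "a1"), (20, "a2"), (25, "a1")]:
    on_event({"type": "LoginFailed", "account": account, "ts": ts})
```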

Business rules are used in sense and respond, track and trace, and situation awareness CEP applications; business users (or at least business analysts) need to be able to understand and model events independent of any particular infrastructure. I completely agree with this, since I find that business analysts focused on process are woefully unaware of how to identify and model events and how those events impact their business processes.

The Decision Dilemma #brf

The Business Rules Forum has started here in Las Vegas, and I’m here all week giving a presentation in the BPM track, facilitating a workshop and sitting on a panel. James Taylor and Eric Charpentier are also here presenting and blogging, with a focus more purely on rules and decision management; you will want to check out their blogs as well since we’ll likely all be at different sessions. I’m really impressed with what this conference has grown into: attendance is fairly low, as it has been at every conference that I’ve attended this year due to the economy, but there is a great roster of speakers and five concurrent tracks of breakout sessions including the new BPM track. As I’ve been blogging about for a while (as has James), process and rules belong together; this conference is the opportunity to learn about both as well as their overlap.

We kicked off with a welcome from Gladys Lam, followed by a keynote from Jim Sinur on making better decisions in the face of uncertainty. One thing that’s happened during the economic meltdown is that a great deal of uncertainty has been introduced into not just financial markets, but many aspects of how we do business. The result is that business processes need to be more dynamic, and need to be able to respond to emerging patterns rather than the status quo. At the Appian conference last week, Jim spoke about some of their new research on pattern-based strategies, and that’s the heart of what he’s talking about today.

One of the effects of increased connectivity on business is that it speeds the impact of change: as soon as something changes in how business works in one part of the world, it's everywhere. This makes instability – driven by that change – the normal state rather than an exception. Although he focused on dynamic processes at the Appian conference, here he focused more on the role of rules in dealing with uncertainty, which I think is a valid point since rules and decision management are much of what allow processes to dynamically shift to accommodate changing conditions; although perhaps it is more accurate to consider the role of complex event processing as well. I am, however, left with the impression that Gartner is spinning pattern-based strategy onto pretty much every technology and special interest group.

The discussion about pattern-based strategies was the same as last week (and the same, I take it, as at the recent Gartner IT Expo where this was introduced), covering the cycle of seek, model and adapt, as well as the four disciplines of pattern seeking, performance-driven culture, optempo advantage and transparency.

There’s lots of Twitter activity about the conference, and it’s especially interesting to see reactions from other analysts such as Mike Gualtieri of Forrester.

Leveraging process improvements depends on events

I was on a(nother) Gartner webinar today, and one slide particularly resonated.

The key point is that although there’s a lot to be gained from automating and supporting business processes with BPMS, if you want to leverage productivity improvements, you need to be doing something with the events that are generated from the business processes. Without this, you’ll be blind to many of the delayed effects and unintended consequences of process decisions.
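
As a hedged sketch of what "doing something with the events" could look like, the following listens for start and completion events from a BPMS (the event names and the drift test are invented, not any product's actual event format) and flags cycle-time drift, one of those delayed effects:

```python
from statistics import mean

starts: dict[str, float] = {}
cycle_times: list[float] = []

def on_process_event(event: dict) -> None:
    if event["type"] == "ProcessStarted":
        starts[event["instance"]] = event["ts"]
    elif event["type"] == "ProcessCompleted":
        cycle_times.append(event["ts"] - starts.pop(event["instance"]))
        # Crude drift test: recent completions much slower than the long-run average.
        if len(cycle_times) >= 5 and mean(cycle_times[-5:]) > 2 * mean(cycle_times):
            print("cycle time is drifting up -- investigate the last process change")

for e in [{"type": "ProcessStarted", "instance": "i1", "ts": 0.0},
          {"type": "ProcessCompleted", "instance": "i1", "ts": 4.0}]:
    on_process_event(e)
```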

Oracle BEA Strategy Briefing

Not only did Oracle schedule this briefing on Canada Day, the biggest holiday in Canada, but they forced me to download the Real Player plug-in in order to participate. The good part, however, is that it was full streaming audio and video alongside the slides.

Charles Phillips, Oracle President, kicked off with a welcome and some background on Oracle, including their focus on database, middleware and applications, and how middleware is the fastest-growing of these three product pillars. He described how Oracle Fusion middleware is used both by their own applications as well as ISVs and customers implementing their own SOA initiatives.

He outlined their rationale for acquiring BEA: complementary products and architecture, internal expertise, strategic markets such as Asia, and the partner and channel ecosystem. He stated that they will continue to support BEA products under the existing support lifetimes, with no forced migration policies to move off of BEA platforms. They now consider themselves #1 in the middleware market in terms of both size and technology leadership, and Phillips gave a gentle slam to IBM for over-inflating their middleware market size by including everything but the kitchen sink in what they consider to be middleware.

The BEA developer and architect online communities will be merged into the Oracle Technology Network: Dev2Dev will be merged into the Oracle Java Developer community, and Arch2Arch will be broadened to the Oracle community.

Retaining all the BEA development centers, they now have 4,500 middleware developers; most BEA sales, consulting and support staff were also retained and integrated into the Fusion middleware teams.

Next up was Thomas Kurian, SVP of Product Development, speaking on Fusion middleware and BEA product directions, with a more detailed view of the Oracle middleware products and strategy. Their basic philosophy for middleware is that it's a unified suite rather than a collection of disjoint products, it's modular from a purchasing and deployment standpoint, and it's standards-based and open. He started to talk about applications enabled by their products, unifying SOA, process management, business intelligence, content management and Enterprise 2.0.

They've grouped middleware products into three categories on their product roadmap (which I have reproduced here directly from Kurian's slide):

  • Strategic products
    • BEA products being adopted immediately with limited re-design into Oracle Fusion middleware
    • No corresponding Oracle products exist in majority of cases
    • Corresponding Oracle products converge with BEA products with rapid integration over 12-18 months
  • Continue and converge products
    • BEA products being incrementally re-designed to integrate with Oracle Fusion middleware
    • Gradual integration with existing Oracle Fusion middleware technology to broaden features with automated upgrades
    • Continue development and maintenance for at least 9 years
  • Maintenance products
    • Products that BEA had end-of-life'd due to limited adoption prior to the Oracle M&A
    • Continued maintenance with appropriate fixes for 5 years

For the “continue and converge” category, that is, of course, a bit different than “no forced migration”, but this is to be expected. My issue is with the overlap between the “strategic” category, which can include a convergence of an Oracle and a BEA product, and the “continue and converge” category, which includes products that will be converged into another product: when is a converged product considered “strategic” rather than “continue and converge”, or is this just the spin they’re putting on things so as to not freak out BEA customers who have put huge investments into a BEA product that is going to be converged into an existing Oracle product?

He went on to discuss how each individual Oracle and BEA product would be handled under this categorization. I’ve skipped the parts on development tools, transaction processing, identity management, systems management and service delivery, and gone right to their plans for the Service-Oriented Architecture products:

Oracle SOA product strategy

  • Strategic:
    • Oracle Data Integrator for data integration and batch ETL
    • Oracle Service Bus, which unifies AquaLogic Service Bus and Oracle Enterprise Service Bus
    • Oracle BPEL Process Manager for service orchestration and composite application infrastructure
    • Oracle Complex Event Processor for in-memory event computation, integrated with WebLogic Event Server
    • Oracle Business Activity Monitoring for dashboards to monitor business events and business process KPIs
  • Continue and converge:
    • BEA WL-Integration will be converged with the Oracle BPEL Process Manager
  • Maintenance:
    • BEA Cyclone
    • BEA RFID Server

Note that the Oracle Service Bus is in the “strategic” category, but is a convergence of AL-SB and Oracle ESB, which means that customers of one of those two products (or maybe both) are not going to be happy.

Kurian stated that Oracle sees four types of business processes — system-centric, human-centric, document-centric and decision-centric (which match the Forrester divisions) — but believes that a single product/engine that can handle all of these is the way to go, since few processes fall purely into one of these four categories. They support BPEL for service orchestration and BPMN for modeling, and their plan is to converge on a single platform that supports both BPEL and BPMN (I assume that he means both service orchestration and human-facing workflow). Given that, here's their strategy for Business Process Management products:

Oracle BPM product strategy

  • Strategic:
    • Oracle BPA Designer for process modeling and simulation
    • BEA AL-BPM Designer for iterative process modeling
    • Oracle BPM, which will be the convergence of BEA AquaLogic BPM and Oracle BPEL Process Manager in a single runtime engine
    • Oracle Document Capture & Imaging for document capture, imaging and document workflow with ERP integration [emphasis mine]
    • Oracle Business Rules as a declarative rules engine
    • Oracle Business Activity Monitoring [same as in SOA section]
    • Oracle WebCenter as a process portal interface to visualize composite processes

Similar to the ESB categorization, I find the classification of the converged Oracle BPM product (BEA AL-BPM and Oracle BPEL PM) as "strategic" to be at odds with his original definition: it should be in the "continue & converge" category since the products are being converged. This convergence is not, however, unexpected: having two separate BPM platforms would just be asking for trouble. In fact, I would say that having two process modelers is also a recipe for trouble: they should look at how to converge the Oracle BPA Designer and the BEA AL-BPM Designer as well.

In the portals and Enterprise 2.0 product area, Kurian was a bit more up-front about how WebLogic Portal and AquaLogic UI are going to be merged into the corresponding Oracle products:

Oracle portal and Enterprise 2.0 product strategy

  • Strategic:
    • Oracle Universal Content Management for content management repository, security, publishing, imaging, records and archival
    • Oracle WebCenter Framework for portal development and Enterprise 2.0 services
    • Oracle WebCenter Spaces & Suite as a packaged self-service portal environment with social computing services
    • BEA Ensemble for lightweight REST-based portal assembly
    • BEA Pathways for social interaction analytics
  • Continue and converge:
    • BEA WebLogic Portal will be integrated into the WebCenter framework
    • BEA AquaLogic User Interaction (AL-UI) will be integrated into WebCenter Spaces & Suite
  • Maintenance:
    • BEA Commerce Services
    • BEA Collabra

In SOA governance:

  • Strategic:
    • BEA AquaLogic Enterprise Repository to capture, share and manage the change of SOA artifacts throughout their lifecycle
    • Oracle Service Registry for UDDI
    • Oracle Web Services Manager for security and QOS policy management on services
    • EM Service Level Management Pack as a management console for service level response time and availability
    • EM SOA Management Pack as a management console for monitoring, tracing and change managing SOA
  • Maintenance:
    • BEA AquaLogic Services Manager

Kurian discussed the implications of this product strategy on Oracle Applications customers: much of this will be transparent to Oracle Applications, since many of these products form the framework on which the applications are built, but are isolated so that customizations don’t touch them. For those changes that will impact the applications, they’ll be introduced gradually. Of course, some Oracle Apps are already certified with BEA products that are now designated as strategic Oracle products.

Oracle has also simplified their middleware pricing and packaging, with products structured into 12 suites:

Oracle Middleware Suites

He summed up with their key messages:

  • They have a clear, well-defined, integrated product strategy
  • They are protecting and enhancing existing customer investments
  • They are broadening Oracle and BEA investment in middleware
  • There is a broad range of choice for customers

The entire briefing will be available soon for replay on Oracle’s website if you’re interested in seeing the full hour and 45 minutes. There’s more information about the middleware products here, and you can sign up to attend an Oracle BEA welcome event in your city.

TUCON: Predictive Trade Lifecycle Management

I switched over to the architecture stream to see the session on trade lifecycle management using BusinessWorks and iProcess, jointly presented by Cognizant and TIBCO. Current-day trading systems are under a great deal of stress because of increased volume of trades, more complex cross-border trades, and greater compliance requirements.

When trades fail, for a variety of reasons, there is a risk of increased costs to both the buyer and seller, and failed trade management is a key process that bridges between systems and people, potentially in different companies and locations. If the likelihood of a trade failure could be predicted ahead of time — based on some combination of securities, counterparties and other factors — those most likely to fail can be escalated for remediation before the trade hits the value date, avoiding some percentage of failed trades.

The TIBCO Event Fabric platform can be used to do some of the complex event processing required to do this; in fact, failed trade management is an ideal candidate for CEP since the underlying reasons for failure have been studied extensively and are fairly well-understood (albeit complex). Adding BPM into the mix allows the predicted failed trade situation to be pumped into BPM for exception processing.
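
A toy sketch of that combination (the scoring factors, weights and kick-off call are illustrative assumptions, not Cognizant's or TIBCO's actual model): score each trade for failure risk, and escalate the risky ones into an exception process before the value date:

```python
def failure_risk(trade: dict) -> float:
    """Hypothetical risk score combining a few of the factors mentioned above."""
    score = 0.0
    if trade["cross_border"]:
        score += 0.3
    if trade["counterparty_fail_rate"] > 0.05:
        score += 0.4
    if trade["security_type"] == "illiquid":
        score += 0.3
    return score

def start_remediation_process(trade: dict) -> None:
    """Stand-in for starting a BPM exception process (e.g. via a web service call)."""
    print(f"escalating trade {trade['id']} for remediation before value date")

for trade in [{"id": "T1", "cross_border": True, "counterparty_fail_rate": 0.08,
               "security_type": "illiquid"},
              {"id": "T2", "cross_border": False, "counterparty_fail_rate": 0.01,
               "security_type": "liquid"}]:
    if failure_risk(trade) >= 0.5:   # invented escalation threshold
        start_remediation_process(trade)
```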

The only thing that surprised me is that they’re not doing automated detection of the problem scenarios, but relying on experienced users to identify which combinations of parameters are likely to result in failed trades.

Agent Logic’s RulePoint and RTAM

This post has been a long time coming: I missed talking to Agent Logic at the Gartner BPM event in Orlando in September since I didn't stick around for the CEP part of the week, but they persisted, and we had both an intro phone call and a longer demo session in the weeks following. Then I had a crazy period of travel, came home to a backlog of client work and a major laptop upgrade, and seemed to lose my blogging mojo for a month.

If you're not yet familiar with the relatively new field of CEP (complex event processing), there are many references online, including a recent ebizQ white paper based on their event processing survey, which determined that a majority of the survey respondents believe that event-driven architecture comprises all three of the following (a short code sketch of the stream-processing style follows the list):

  • Real-time event notification – A business event occurs and those individuals or systems who are interested in that event are notified, and potentially act on the event.
  • Event stream processing – Many instances of an event occur, such as a stock trade, and a process filters the event stream and notifies individuals or systems only about the occurrences of interest, such as a stock price reaching a certain level.
  • Complex event processing – Different types of events, from unrelated transactions, are correlated together to identify opportunities, trends, anomalies or threats.
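
Here's the promised sketch of the middle style, event stream processing, using the stock quote example from the definition above (the quote format and the threshold are assumed):

```python
THRESHOLD = 90.0   # invented price level of interest

def watch_stream(quotes, symbol: str):
    """Yield a notification only when the watched symbol crosses the threshold."""
    for quote in quotes:
        if quote["symbol"] == symbol and quote["price"] > THRESHOLD:
            yield f"{symbol} reached {quote['price']}"

stream = [{"symbol": "MSFT", "price": 88.0},
          {"symbol": "ORCL", "price": 22.0},
          {"symbol": "MSFT", "price": 91.5}]
for alert in watch_stream(stream, "MSFT"):
    print(alert)   # only the third quote produces a notification
```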

And although the survey shows that the CEP market is dominated by IBM, BEA and TIBCO, there are a number of other significant smaller players, including Agent Logic.

In my discussions with Agent Logic, I had the chance to speak with Mike Appelbaum (CEO), Chris Bradley (EVP of Marketing) and Chris Carlson (Director of Product Management). My initial interest was to gain a better understanding of how BPM and CEP come together, as well as how their product worked; I was more than a bit amused when they referred to BPM as an "event generator". I was somewhat mollified when they also pointed out that business rules engines are event generators too: both types of systems (and many others) generate thousands of events to their history logs as they operate, most of which are of no importance whatsoever; CEP helps to find the few unique combinations of events from multiple data feeds that are actually meaningful to the business, such as detecting credit card fraud based on geographic data, spending patterns, and historical account information.
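
As a toy version of that credit card example (all fields and thresholds invented), two independent card-present events can be correlated on timing and distance to flag physically impossible travel:

```python
MAX_KMH = 900.0   # roughly the speed of a commercial flight

def impossible_travel(tx1: dict, tx2: dict) -> bool:
    """Flag two card-present transactions too far apart for the elapsed time."""
    hours = abs(tx2["ts"] - tx1["ts"]) / 3600
    return tx2["km_from_prev"] > MAX_KMH * max(hours, 0.01)

tx_a = {"card": "1234", "ts": 0, "km_from_prev": 0}
tx_b = {"card": "1234", "ts": 1800, "km_from_prev": 6000}   # 6000 km in 30 minutes
if impossible_travel(tx_a, tx_b):
    print("possible fraud: card 1234 used in two distant locations")
```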


Agent Logic has been around since 1999, and employs about 50 people. Although they initially targeted defence and intelligence industries, they're now working with financial services and manufacturing as well. Their focus is on providing an end-user-driven CEP tool that lets non-technical users, rather than developers, write rules — something that distinguishes them from the big three players in the market. After taking a look at the product, I think that they got their definition of "non-technical user" from the same place as the BPM vendors: the prime target audience for their product would be a technically-minded business analyst. Even so, this pushes control and enforcement of policies and procedures closer to the business user.

They also seem to be more focused on allowing people to respond to events in real time (rather than, for example, spawning automated processes to react to events, although the product is certainly capable of that). As with other CEP tools, they allow multiple data feeds to be combined and analyzed, with rules set so that alerts and actions fire on specific business events corresponding to combinations of events in the data feeds.

Agent Logic has two separate user environments (both browser-based): RulePoint, where the rules are built that will trigger alerts, and RTAM, where the alerts are monitored.

RulePoint is structured to allow more technical users to work together with less technical users. Not only can users share rules, but a more technical user can create "topics", which are aggregated, filtered data sources, and then expose these to less technical users as input for their rules. Rules can be further combined to create higher-level rules.
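
Here's a rough sketch of that layering, with an invented orders feed; RulePoint's actual topic mechanism will certainly differ:

```python
raw_feed = [{"region": "EU", "amount": 120.0}, {"region": "US", "amount": 40.0},
            {"region": "EU", "amount": 300.0}]

def eu_orders_topic(feed):
    """The technical user's work: filter the raw feed down to EU orders only."""
    return (e for e in feed if e["region"] == "EU")

# The less technical user's rule: just a condition and a response on the topic.
for order in eu_orders_topic(raw_feed):
    if order["amount"] > 200:
        print(f"alert: large EU order of {order['amount']}")
```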

RulePoint has three modes for creating rules: templates, wizards and advanced. In all cases, you’re applying conditions to a data source (topic) and creating a response, but they vary widely in terms of ease of use and flexibility.

  • Templates can be used by non-technical users, who can only set parameter values for controlling filtering and responses, and save their newly-created rule for immediate use.
  • The wizard creation tool allows for much more complex conditions and responses to be created. As I mentioned previously, this is not really end-user friendly — more like business analyst friendly — but not bad.
  • The advanced creation mode allows you to write DRQL (detect and response query language) directly, for example: when 1 "Stock Quote" s with s.symbol = "MSFT" and s.price > 90 then "Instant Message" with to="[email protected]", body="MSFT is at ${s.price}". Not for everyone, but the interesting thing is that by using template variables within the DRQL statements, you can convert rules created in advanced mode into templates for use by non-technical users: another example of how different levels of users can work together (sketched below).
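
And here's the sketch promised above: a Python stand-in (not actual DRQL tooling) for how template variables can turn an advanced-mode rule into a fill-in-the-blanks template:

```python
from string import Template

# The technical user writes the rule once, with placeholders.
rule_template = Template(
    'when 1 "Stock Quote" s with s.symbol = "$symbol" and s.price > $limit '
    'then "Instant Message" with to="$recipient"'
)

# The non-technical user just supplies parameter values.
rule = rule_template.substitute(symbol="MSFT", limit=90, recipient="[email protected]")
print(rule)
```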

Watchlists are lists that can be used as parameter sets, such as a list of approved airlines for rules related to travel expenses, which then become drop-down selection lists when used in templates. Watchlists can be dynamically updated by rules, such as adding a company to a list of high-risk companies if a SWIFT message is received that references both that company and a high-risk country.
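
A small sketch of that dynamic update, with invented message fields rather than real SWIFT structure:

```python
high_risk_countries = {"XX", "YY"}        # a static watchlist
high_risk_companies: set[str] = set()     # a watchlist maintained by a rule

def on_swift_message(msg: dict) -> None:
    # Rule 1: grow the watchlist when a company appears with a high-risk country.
    if msg["country"] in high_risk_countries:
        high_risk_companies.add(msg["company"])
    # Rule 2: alert on any payment involving a watchlisted company.
    if msg["type"] == "payment" and msg["company"] in high_risk_companies:
        print(f"alert: payment involving high-risk company {msg['company']}")

on_swift_message({"type": "advice", "company": "Acme", "country": "XX"})
on_swift_message({"type": "payment", "company": "Acme", "country": "ZZ"})
```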

RulePoint includes a large number of predefined services that can be used as data sources or responders, including SQL, web services and RSS feeds. You can also create your own services. Because web services can act both as a data source and as a method of responding to an alert, Agent Logic can do things like kick off a new fraud review process in a BPMS when a set of events across a range of systems indicates potential fraud.

Lastly, in terms of rule creation, there are both standard and custom responses that can be attached to a rule, ranging from sending an alert to a specific user in RTAM, to sending an email message, to writing a database record.

Although most of the power of Agent Logic shows up in RulePoint, we spent a bit of time looking at RTAM, the browser-based real-time alert manager. Some Agent Logic customers don't use RTAM at all, or use it only for high-priority alerts, preferring to use RulePoint to send responses to other systems. However, compared to a typical BAM environment, RTAM provides pretty rich functionality: it can link to underlying data sources (for example, pulling up an external web site with criminal record data on receiving an alert that a job candidate has a record), and it allows for mashups with external services such as Google maps.

It's also more of an alert management system than just a monitoring tool: you can filter alerts by the rules that triggered them, and perform other actions such as acknowledging an alert or forwarding it to another user.

Admittedly, I haven't seen many other CEP products in this depth, so I can't offer a fair comparison, but there were a couple of things that I really liked about Agent Logic. First of all, RulePoint provides a high degree of functionality with three different levels of interfaces for three different skill levels, allowing more technical users to create aggregated, easier-to-use data sources and services for less technical users to include in their rules. Rule creation ranges from dead simple (but inflexible) with templates to roll-your-own in advanced mode.

Secondly, the separation of RulePoint and RTAM allows the use of any BI/BAM tool instead of RTAM, or just feeding the alerts out as RSS feeds or to a portal such as Google Gadgets or Pageflakes. I saw a case study of how Bank of America is using RSS for company-wide alerts at the Enterprise 2.0 conference earlier this year, and see a natural fit between CEP and this sort of RSS usage.

Update: Agent Logic contacted me and requested that I remove a few of the screenshots that they don’t want published. Given that I always ask vendors during a demo if there is anything that I can’t blog about, I’m not sure how that misunderstanding occurred, but I’ve complied with their request.

BRF Day 2: How Business Rules Re(Define) Business Processes: A Service Oriented View

For the last session today, I attended Jan Vanthienen's session; he's a professor at Katholieke Universiteit Leuven. He talked about different representations of rules, particularly decision tables (at length, although in an interesting way). After covering the problems with maintaining decision trees, he moved on to business processes and showed that a business process with the rules encoded as routing logic is really just a form of decision tree, and therefore difficult to maintain from a rules integrity standpoint. As rules are distilled out of and separated from the processes, the processes become thinner and thinner, until you have a single-branch, straight-through flow. I have the feeling that he'd like to reduce the process to a single activity, where everything else is done in a complex rule called from that step. I'm not sure that I agree with that level of stripping of logic out of the process and into the rules; there's value in having a business process that's understandable by business users, and the more that the logic is encapsulated in rules, the harder it is to understand the process flow by looking at the process map. The critical thing is knowing which rules to strip out of the business process, and which to leave in.
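
To picture the distillation, here's a before/after sketch with an invented claims-routing example: the same decision hard-wired into the process as branching, then pulled out into a decision table so that the flow itself stays thin:

```python
def route_claim_v1(claim: dict) -> str:
    """Rules buried in the process as branching -- effectively a decision tree."""
    if claim["amount"] > 10_000:
        return "senior_adjuster"
    elif claim["type"] == "auto":
        return "auto_team"
    else:
        return "general_queue"

# The same logic as an external decision table: first matching row wins.
DECISION_TABLE = [
    (lambda c: c["amount"] > 10_000, "senior_adjuster"),
    (lambda c: c["type"] == "auto",  "auto_team"),
    (lambda c: True,                 "general_queue"),
]

def route_claim_v2(claim: dict) -> str:
    """The process just calls the rule service; the table changes independently."""
    return next(result for cond, result in DECISION_TABLE if cond(claim))

claim = {"amount": 5_000, "type": "auto"}
assert route_claim_v1(claim) == route_claim_v2(claim) == "auto_team"
```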

He's doing research now to determine if it's possible to specify business rules, then automatically derive the business process from the rules; an interesting concept. In order to do this, there must be rules that constrain the permissions and obligations of the actors in the process, e.g., an order must be accepted before the product is shipped. This presents two possible architectural styles: process first, or rules first. In either case, what is developed is an architecture of rules, events and services, with a top layer of business rules and processes, a middle layer of services and components, and a bottom layer of enterprise applications.
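
As a toy version of the rules-first style, precedence rules can be treated as edges in a graph and a valid process sequence derived by topological sort; the rule set below is invented for illustration:

```python
from graphlib import TopologicalSorter

# Each rule "A must happen before B" maps activity B to its predecessors.
rules = {
    "ship product": {"accept order", "pick stock"},
    "accept order": {"receive order"},
    "pick stock":   {"accept order"},
    "invoice":      {"ship product"},
}

# Derive one valid process sequence that satisfies every precedence rule.
process = list(TopologicalSorter(rules).static_order())
print(" -> ".join(process))
# e.g. receive order -> accept order -> pick stock -> ship product -> invoice
```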

BRF Day 2: Intelligent Process Automation: The Key to Business Process Optimization

The opening keynote today was Steve Hendrick of IDC, discussing their acronym du jour, IPA (intelligent process automation), which is a combination of BPM, BI and decisioning. He lists four key constructs of IPA:

  • Event processing, providing a sense and respond approach
  • Decisioning, covering both rules and actions that might be derived from those rules
  • BPM (I knew that he’d get to this eventually)
  • Advanced analytics, including profiling and segmentation, predictive analytics and modeling, and decision optimization

I’m not sure how this differs from Gartner’s definition of BPMS technology, which includes all these factors; do we really need another acronym for this? I suppose that the analyst firms need to make these distinctions to play in the marketplace, but I’m not sure that a new term specific to one analyst firm provides benefit to the end customers of these systems.

He just put a non-linear programming equation up on the screen. It’s 9:19am, we were all up late last night at various vendor dinners, and he’s talking about the specifics of how to solve this optimization model. I really think that he’s overestimating the number of fellow analytics geeks in the audience.

He moved on to discuss BPM, which he characterizes as a context for putting advanced analytics to work. 🙂 He lists IBM, TIBCO and Adobe (huh?) as the leaders, Global 360 as "right on their heels", and BEA just behind, with Lombardi somewhere further back. Hmm, not necessarily everyone's view of the BPM market.

He then discussed complex event processing for ultra-low latency applications, pointing out characteristics such as how it's queue-based (much like BPM) to allow asynchronous processing of events, and how this allows for extremely fast response to events as they occur. The tie-in to the other technologies that he's discussing is that events can trigger processes, and can also trigger decisions; the latter, he feels, is more important.
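
A minimal sketch of that queue-based, asynchronous style (illustrative only, not any vendor's runtime): the producer enqueues events without waiting, and a consumer drains the queue as fast as it can:

```python
import queue
import threading

events: queue.Queue = queue.Queue()

def consumer() -> None:
    while True:
        event = events.get()
        if event is None:          # sentinel: shut down
            break
        print(f"reacting to {event}")
        events.task_done()

worker = threading.Thread(target=consumer)
worker.start()
for i in range(3):
    events.put(f"event-{i}")       # producer never blocks waiting for processing
events.put(None)
worker.join()
```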

He talked through a number of case studies showing how analytics — in addition to other technologies and processes — made a difference for companies.

He ended with some predictions of a bright future for IPA, which included a hockey stick-like projection of BPMS sales increases of about 6x between now and 2011.