BRF Day 2: How Business Rules Re(Define) Business Processes: A Service Oriented View

For the last session today, I attended Jan Vanthienen’s session; he’s a professor at Katholieke Universiteit Leuven. He talked about different representations of rules, particularly decision tables (at length, although in an interesting way). He talked about the problems with maintaining decision trees, then as he moved on to business processes, he showed how a business process with the rules encoded in the process as routing logic was really just a form of decision tree, and therefore difficult to maintain from a rules integrity standpoint. As rules are distilled out of and separated from the processes, the processes become thinner and thinner, until you have a single branch straight-through flow. I have the feeling that he’d like to reduce the process to a single activity, where everything else is done in a complex rule called from that step. I’m not sure that I agree with that level of stripping of logic out of the process and into the rules; there’s value in having a business process that’s understandable by business users, and the more that the logic is encapsulated in rules, the harder it is to understand how the process flow works by looking at the process map. The critical thing is knowing which rules to strip out of the business process, and which to leave in.
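
To make the distinction concrete, here’s a rough sketch in Python (all names, thresholds and the rule service API are invented, not anything Vanthienen showed) of the same routing decision embedded in the process versus externalized to a rule service:

```python
# The routing logic embedded in the process: effectively a decision tree, so
# any rule change means changing the process definition itself.
def claims_process_embedded(claim):
    if claim["amount"] < 1000 and claim["history"] == "clean":
        return "auto_approve"
    elif claim["amount"] < 10000:
        return "adjuster_review"
    return "senior_review"

# The thinned-out version: a straight-through flow where the routing decision
# is externalized to a rule service (decide() is a stand-in, not a real
# product API), so the rules can change without touching the process.
def claims_process_thin(claim, rule_service):
    return rule_service.decide("claim_routing", claim)

class StubRuleService:
    def decide(self, rule_set, claim):
        # the same logic as above, now owned and maintained on the rules side
        if claim["amount"] < 1000 and claim["history"] == "clean":
            return "auto_approve"
        return "adjuster_review" if claim["amount"] < 10000 else "senior_review"

print(claims_process_thin({"amount": 500, "history": "clean"}, StubRuleService()))
```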

He’s doing research now to determine if it’s possible to specify business rules, then automatically derive the business process from the rules; an interesting concept. In order to do this, there must be rules that constrain the permissions and obligations of the actors in the process, e.g., an order must be accepted before the product is shipped. This presents two possible architectural styles: process first, or rules first. In either case, what is developed is an architecture of rules, events and services, with a top layer of business rules and processes, a middle layer of services and components, and a bottom layer of enterprise applications.
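
That shipping example hints at how the derivation might work: if each rule is a precedence constraint on activities, then one valid process is simply a topological ordering of those constraints. A toy sketch (my own guess at the mechanics, not his research):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical precedence rules constraining the obligations of the actors:
# each activity maps to the activities that must happen before it.
precedence_rules = {
    "accept_order": set(),
    "ship_product": {"accept_order"},   # an order must be accepted before shipping
    "send_invoice": {"ship_product"},
}

# One business process consistent with the rules, derived rather than drawn:
print(list(TopologicalSorter(precedence_rules).static_order()))
# -> ['accept_order', 'ship_product', 'send_invoice']
```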

BRF Day 2: Using Business Rules to Enable a Closed Loop of Compliance

I’m eager to learn more about the relationship between policies, procedures and rules, and how they relate to compliance, so I sat in on a presentation by Peter Still of RuleBurst. There’s a pretty high percentage of vendors on the speaker roster, but so far the quality has been good so no complaints.

The theme of Still’s talk is that the business rules approach will only gain critical mass if it stops being a technical implementation tool and starts being a business problem-solving tool. The current pitch from the business rules vendors is that this is a way to implement systems faster and cheaper, while allowing the business to access some tuning parameters, but this is really focussed on the technological capabilities and not the business value of business rules. This is such a perfect mirror of the BPM field, where BPM has just barely moved from a purely technical sell to something that’s now being sold more and more to the business side of an organization, so I can completely understand where the business rules market is and the challenges that lie ahead in shifting the focus of their marketing message. The worldwide market for business rules product revenue is $250M — not a lot when you consider the size of related markets — and it could be a lot larger if there were greater recognition of the business benefits of business rules.

A perfect business case for re-targeting the business rules message is compliance: it’s an enterprise-wide initiative with executive support where business rules can be included in the decisioning at key points of the process. Although business rules aren’t the complete answer to compliance since compliance is a very process-focussed initiative, rules can be a significant contributor to compliance efforts. One of the difficulties with compliance is that many regulations, such as Sarbanes-Oxley, are pretty vague since they have to deal with such a broad range of companies, and it’s difficult to determine precise business rules to implement them. Compliance at a transactional level is a mostly automated application of BPM and business rules; as you move up to risk management and higher-level compliance factors, there’s less automation, but still opportunities for business rules to be wrapped in a compliance framework: for example, using business rules to classify a risk even though the management of that risk may be done manually. Still maintains that there’s a link between transactional and operational compliance, and believes that business rules can help with that link although that’s not recognized by most business rules vendors.

As with most other complex applications of technology, you can solve this with an integrated compliance and rules solution from a single vendor, or go for a best-of-breed approach. Still recommends the former approach, and invited us to drop by his booth to check out what RuleBurst has to offer in this area.

BRF Day 2: True Adventures in Business Rules

Paul Armborst of the Westfield Group presented on his experiences in implementing business rules for their various insurance applications over the past 6 years. He sees one of the key problems as documenting business rules, with two main choices for how to do it:

  • Define the rules in a repository first (a rules management/modelling tool). Although this is the most likely approach for the first definition of rules within an organization, they’ll become out of sync as soon as they are implemented since the rules will be modified in the executing code or BRE rather than in the repository.
  • Define the rules directly in a business rules engine, then generate the documentation in some automated fashion (likely provided by the BRE vendor)

He sees that the best way would be a combination of both approaches: a throw-away repository to document rules as they are discovered, and an extract from the BRE to provide ongoing documentation.
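
The “extract from the BRE” half of that combination is easy to imagine if the engine stores rules as structured objects; here’s a minimal sketch with an entirely hypothetical rule structure, not any vendor’s API:

```python
# Hypothetical rule metadata; the point is just that structured rules in the
# engine can yield generated, always-current documentation.
rules = [
    {"name": "SSN required", "when": "applicant SSN is missing",
     "then": "refer to manual review", "owner": "underwriting"},
    {"name": "Small claim fast-track", "when": "claim amount < $1,000",
     "then": "auto-approve", "owner": "claims"},
]

def export_rule_docs(rules):
    lines = ["Business Rule Catalogue", ""]
    for r in rules:
        lines.append(f"{r['name']} (owner: {r['owner']}): when {r['when']}, then {r['then']}.")
    return "\n".join(lines)

print(export_rule_docs(rules))
```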

He pointed out one of the problems with introducing business rules: “real” developers want to write Java code, not this airy-fairy business rules stuff, which is an argument that I see against BPM as well. As IT budgets get squeezed, however, development teams will start to look for ways to reduce the amount of code that they have to write and pass some of the tuning capabilities directly over to the business; both BRE and BPM provide these types of capabilities.

He discussed various methods of implementing business rules:

  • Decision tables, noting that the BRE vendor needs to provide some sort of analysis tool to detect gaps and overlaps in the rules (there’s a rough sketch of that sort of check after this list), with the caveat that it’s possible to construct a decision table that’s too big to reasonably maintain.
  • Decision trees, which can provide the same functionality as a decision table, but in a graphical form; if the decision points are not in the right order, the number of nodes can multiply unnecessarily, and overly-large trees can be difficult to follow.
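
For discrete conditions, that gap/overlap analysis amounts to enumerating the condition space and counting how many rows of the table match each combination; here’s a minimal sketch (the table and conditions are invented):

```python
from itertools import product

# Zero matching rows for a combination = gap; more than one = overlap.
conditions = {"age_band": ["<25", "25-65", ">65"], "claims": ["0", "1+"]}

# Each row: (condition values, action); '*' is a wildcard. Hypothetical table.
table = [
    ({"age_band": "<25", "claims": "*"}, "refer"),
    ({"age_band": "25-65", "claims": "0"}, "accept"),
    ({"age_band": "25-65", "claims": "1+"}, "refer"),
    # no row covers age_band '>65': a gap this check will find
]

def matches(row, case):
    return all(v == "*" or case[k] == v for k, v in row.items())

for combo in product(*conditions.values()):
    case = dict(zip(conditions.keys(), combo))
    hits = [action for row, action in table if matches(row, case)]
    if len(hits) == 0:
        print("GAP:", case)
    elif len(hits) > 1:
        print("OVERLAP:", case, hits)
```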

He also discussed stateful and stateless implementations of business rules: although stateless is simpler to implement, stateful allows for running only the rules that use data that has changed since the last evaluation of each rule.
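
Here’s a toy illustration of why the stateful approach pays off: each rule declares which fields it reads, and only the rules touching changed data get re-run (hypothetical, not any particular vendor’s engine):

```python
rules = {
    "credit_check": {"reads": {"income", "debt"},
                     "eval": lambda d: d["income"] > 3 * d["debt"]},
    "age_check":    {"reads": {"age"},
                     "eval": lambda d: d["age"] >= 18},
}

def reevaluate(data, changed_fields):
    # a stateless engine would run every rule here; a stateful one runs
    # only the rules whose inputs actually changed
    return {name: rule["eval"](data)
            for name, rule in rules.items()
            if rule["reads"] & changed_fields}

data = {"income": 90000, "debt": 20000, "age": 44}
print(reevaluate(data, {"debt"}))   # only credit_check runs: {'credit_check': True}
```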

There were some last comments on end user rule maintenance: all of their rules are written by developers, but they’re thinking about how to offer some rule creation and modification capabilities to end users. It’s important to have a BRE that allows some sort of restricted view for less technical users, but it’s also necessary for the techies to do some helpful things like naming objects in a manner that users can understand rather than, say, reverse Hungarian notation. Users who have access to create rules need to have some notion of the logic required to do so, and there needs to be some provision for testing.

BRF Day 2: Rules Management Without a Rule Engine

I moved over to the über-geeky “chief architect” track to hear Rik Gerrits of RuleArts and Petr Choteborsky of Microsoft, but 10 minutes into the session, Gerrits is still giving some fairly basic definitions of business rules management: where rules live, how they’re developed, and how they’re managed and used. He does make the point that business processes consume business rules but that the two should be distinct in terms of methodology and implementation, and cites other motivations for business rules management such as compliance and efficiency improvements.

Choteborsky took over with a case study about Microsoft’s internal development (as in applications for their own internal use, like software licence authorization), and instantly endeared himself to the audience by saying that he was in corporate IT at Microsoft, and was just as much a victim of the Microsoft product groups as we were. They had issues with software development lifecycle documents and the rules that were embedded within those documents: multiple, conflicting instances of rules in different documents; rules not explicitly defined hence less agile; no common vocabulary leading to inconsistency and miscommunication. Over time, the business logic is lost, and the business requirements documentation becomes completely out of sync with the application and the user manual, so that the only true representation of the business logic is embedded within the application as coded.

He stepped through an example, showing how to break down the prose in a requirements document to determine what is a rule set (a group of related rules), what’s a rule, what’s a fact (unchangeable, therefore may be hard-coded), what is usability behaviour (which may include hidden rules and facts), and what is contextual information that describes capability without being something that will be explicitly coded. Very cool example, since he shows the tendency for the prose in what we think of as a fairly well-written requirements document to actually be a confusing mix of facts, rules, behaviour and context that doesn’t really provide adequate information about what should be written to be easily changeable versus what can be hard-coded into an application.

He went on to show how the same paragraph should be restructured as facts and rules (which describe the pure essence of how business must be conducted, independent of implementation detail), requirements (the UI and application requirements to implement the rules) and context (information that makes it easier to understand the facts, rules and requirements; redundant information that is not coded). The rules mantra (which I’m just learning today) is “rules build on facts, facts build on terms”, and he shows the terms sprinkled throughout the facts and rules.
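
To make the mantra concrete, here’s a made-up example of the decomposition (my own illustration, not from his slides):

```python
# Terms are the vocabulary; facts build on terms; rules build on facts.
terms = ["customer", "order", "credit limit"]

facts = [
    "A customer has exactly one credit limit.",       # builds on terms
    "An order is placed by exactly one customer.",
]

rules = [
    "An order must be rejected if its total exceeds the customer's credit limit.",  # builds on facts
]

requirements = [
    "The order entry screen must display the rejection reason to the clerk.",  # UI detail, not a rule
]

context = [
    "Most rejections historically occur at month end.",  # aids understanding; never coded
]
```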

They’re attempting to change their requirements documents to this form of structured requirements using business rules (for going-forward documents, not retrofitting the existing ones), but it’s a painful process: there needs to be some common vocabulary and a significant amount of training in some cases to have people start thinking in this way. There was a comment from the audience that once the vocabulary — particularly standardization of terms — was established, there’s usually a pretty good uptake from the business community since they really like language that can help them to define their business more precisely and unambiguously.

There was another comment from the audience that what he is calling a requirement is actually a specification, which is an argument that I’ve had about a zillion times in my years of software development: I completely agree with the comment, as did Choteborsky, but he stated that this was the common terminology in use at Microsoft today and he wasn’t trying to fix everything at once. I have to respect the pragmatism in that, although there should likely be some sort of migration of terminology to be more accurate.

He went into more detail on terms, facts and rules, including descriptions of each, and the use of graphical term models and fact models. He also made a distinction between a rule and a policy: a rule can produce an action or decision, whereas a policy is more general but might sound rule-like.

He stepped through the before and after of a fact model, where he went through and marked each object and relationship in the model as correct, sort of incorrect, or outright wrong, then found new relationship pathways and defined new terms in the model to make it a better reflection of the actual facts and provide a more logical structure for developing rules. He’s just using Visio for creating the fact models, although I’m sure that some more comprehensive modeling tools could make this process a bit easier.

They’re starting to use RuleXpress (the RuleArts product) for terms, facts and rules, although the rules themselves are actually encoded within applications: rules management without a rule engine. As he pointed out, although some business rules may end up in a business rules engine, some end up directly in the code of an application, and some are never codified but become part of an operational manual. We see exactly the same thing in BPM, where a process model may include steps that are transferred to a BPMS, but also ones that are completely manual and never represented within a BPMS. Having a modelling tool separate from the execution environment provides greater flexibility in what can be modelled, but I suspect that the same issues of synchronization and round-tripping occur in rules modelling environments as exist in process modelling.

Choteborsky was a great speaker: knowledgeable, able to explain some fairly complex concepts, and funny (when one slide came up, he said “I don’t know why PowerPoint made the font on this slide bold and ugly, but I’ve learned that I don’t need to win every battle”). The great thing is that he presented a methodology for developing business specifications that everyone in the room involved in software development could take away and start examining for their own use.

BRF Day 2: Business Rules and Business Intelligence Make Great Bedfellows

David Straus of Corticon gave an engaging presentation about BR and BI, starting with the Wikipedia definitions about each, then characterizing BI as “understanding” and BR as “action” (not unlike my statement that BI in BPM is about visibility and BR in BPM is about agility). He started with the basic drivers for a business rules management system — agility (speed and cost), business control while maintaining IT compliance, transparency, and business improvement (reduce costs, reduce risk, increase revenue) — and went on to some generalized use cases for rules-driven analysis:

  • Analyze transaction compliance, i.e., are the human decisions in a business process compliant with the policies and regulations?
  • Analyze the effect of automation with business rules, i.e., when a previously manual step is automated through the application of rules
  • Analyze business policy rules change (automated or non-automated)

He walked through a simplified claims scenario where the claims agent is not replaced with rules but still makes a decision in the process; their decision is then compared against a decision made by a rules system, and any discrepancies are investigated. In other words, although there’s still a person making the decision in the process, the rules system is acting as a watchdog to ensure that their decisions are compliant with the corporate policy. After some time, there can be some analysis of the results to detect patterns in non-compliance: is it an individual agent that’s causing non-compliance, or a particular product, or are the rules not aligned with the requirements? In some cases, the policies given to the agents are actually in conflict, so that they have two different “right” answers; in other cases, agents may have information that’s just not represented in the rules. By modeling the policies in a business rules system, these conflicts can be driven out to establish integrity across the entire set of rules. This can also be used in cases where an organization just isn’t ready to replace a human decision with a business rules system, in order to validate the rules and compare them to the human decisions; this can establish some trust in the decisioning system that may eventually lead them to replace some of the human decisions with automated ones to create more consistent and compliant decisions.
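
The watchdog pattern itself is simple enough to sketch; here’s a rough Python version with invented rule and field names:

```python
# The rule system doesn't replace the claims agent: it re-decides each case
# and flags discrepancies for investigation.
def policy_decision(claim):
    return "approve" if claim["amount"] <= 5000 and claim["police_report"] else "deny"

def audit(history):
    return [c for c in history if c["agent_decision"] != policy_decision(c)]

history = [
    {"id": 1, "amount": 3000, "police_report": True,  "agent_decision": "approve", "agent": "kim"},
    {"id": 2, "amount": 9000, "police_report": True,  "agent_decision": "approve", "agent": "kim"},
    {"id": 3, "amount": 2000, "police_report": False, "agent_decision": "deny",    "agent": "raj"},
]

# Aggregating discrepancies by agent or product then reveals non-compliance patterns:
print([c["id"] for c in audit(history)])  # [2]
```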

David had a number of case studies for this combination of rules and analytics, such as investment portfolio risk management, where mergers and acquisitions in the portfolio holdings may drive the portfolio out of compliance with the underlying risk profile: information about the holdings is fed back through the rules on a daily basis to establish if the portfolio is still in compliance, and trigger a (manual) rebalancing if it is out of compliance.

By combining business intelligence (and the data that it’s based on) and business rules, it’s also possible to analyze what-if scenarios for changes to rules, since the historical data can be fed through the new version of the rules to see what would have changed.
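
In code, that what-if analysis is just replaying history through both rule versions and diffing the outcomes; a minimal sketch with invented thresholds:

```python
def rules_v1(case):
    return "approve" if case["score"] >= 600 else "refer"

def rules_v2(case):  # proposed change: raise the cutoff
    return "approve" if case["score"] >= 650 else "refer"

historical = [{"id": i, "score": s} for i, s in enumerate([580, 610, 640, 700])]

# Cases whose outcome the new rules would flip:
changed = [c["id"] for c in historical if rules_v1(c) != rules_v2(c)]
print(changed)  # [1, 2]
```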

He’s challenged the BI vendors to do this sort of rules-based analysis; none of them do it now, but it would provide a hugely powerful tool for providing greater insight into businesses.

There was a question from the audience that led to a discussion about the iterative process of discovering rules in a business, particularly the ones that are just in people’s heads rather than encoded in existing systems; David did take this opportunity to make a plug for the modeling environment in their product and how it facilitates rules discovery. I’m seeing some definite opportunities for rules modeling tools when working with my customers on policies and procedures.

BRF Day 2: Intelligent Process Automation: The Key to Business Process Optimization

The opening keynote today was Steve Hendrick of IDC, discussing their acronym du jour, IPA (intelligent process automation), which is a combination of BPM, BI and decisioning. He lists four key constructs of IPA:

  • Event processing, providing a sense and respond approach
  • Decisioning, covering both rules and actions that might be derived from those rules
  • BPM (I knew that he’d get to this eventually)
  • Advanced analytics, including profiling and segmentation, predictive analytics and modeling, and decision optimization

I’m not sure how this differs from Gartner’s definition of BPMS technology, which includes all these factors; do we really need another acronym for this? I suppose that the analyst firms need to make these distinctions to play in the marketplace, but I’m not sure that a new term specific to one analyst firm provides benefit to the end customers of these systems.

He just put a non-linear programming equation up on the screen. It’s 9:19am, we were all up late last night at various vendor dinners, and he’s talking about the specifics of how to solve this optimization model. I really think that he’s overestimating the number of fellow analytics geeks in the audience.

He moved on to discuss BPM, which he characterizes as a context for putting advanced analytics to work. 🙂 He lists IBM, TIBCO and Adobe (huh?) as the leaders, Global 360 as “right on their heels”, and BEA just behind that with Lombardi somewhere back from that. Hmm, not necessarily everyone’s view of the BPM market.

He then discussed complex event processing for ultra-low latency applications, pointing out characteristics such as how it’s queue based (much like BPM) to allow asynchronous processing of events, and how this allows for extremely fast response to events as they occur. The tie-in to the other technologies that he’s discussing is that events can trigger processes, and can also trigger decisions, the latter of which he feels is more important.
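
Here’s a toy version of that queue-based pattern, using Python’s asyncio rather than a real CEP engine, just to show events asynchronously triggering a decision rule:

```python
import asyncio

async def producer(queue):
    # events arrive asynchronously from some source
    for price in [101.2, 99.8, 94.5]:
        await queue.put({"symbol": "XYZ", "price": price})
    await queue.put(None)  # sentinel: no more events

async def consumer(queue):
    while (event := await queue.get()) is not None:
        # decisioning on the event stream: a simple threshold rule
        if event["price"] < 95.0:
            print(f"decision: sell {event['symbol']} at {event['price']}")

async def main():
    queue = asyncio.Queue()
    await asyncio.gather(producer(queue), consumer(queue))

asyncio.run(main())
```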

He talked about a number of case studies about how analytics — in addition to other technologies and processes — made a difference for companies.

He ended with some predictions of a bright future for IPA, which included a hockey stick-like projection of BPMS sales increases of about 6x between now and 2011.

BRF Day 1: Leveraging Predictive Modeling and Rules Management for Commercial Insurance Underwriting

For the last presentation today, I listened to John Lucker of Deloitte discuss what they’ve developed in the area of predictive pricing models for property and casualty insurance. Pricing insurance is a bit trickier than pricing widgets: it’s more than just cost of goods sold plus a profit factor, there’s also the risk factor, and calculating these risks and how they affect pricing is what actuaries do for a living. However, using predictive models can make this pricing more accurate and more consistent, and therefore provides insurance companies with a way to be more competitive and more profitable at the same time.

I know pretty much nothing about predictive modeling, although I think that the algorithms are related to the pattern recognition and clustering stuff that I used to do back in grad school. There’s a ton of recent books on analytics, ranging from pop culture ones like Freakonomics to the somewhat more scholarly Competing on Analytics. I’m expecting Analytics for Dummies to come out any time now.

Predictive modeling is used heavily in credit scoring — based on your current assets, spending habits and debt load, how likely are you to pay on time — and in the personal insurance business, but it hasn’t really hit the commercial insurance market yet. However, the insurance industry recognizes that this is the future, and all the big players are at least dabbling in it. Although a lot of them have previously considered this just to do more consistent pricing, what they’re trying to do now is have the predictive models integrate with business rules in order to drive results. This is helping to reduce the number of lost customers (by providing more competitive pricing), reduce expenses (by providing straight-through processing), increase growth (by targeting new business areas), and increase profitability (by providing more accurate pricing).

He talked about how the nature of targeting insurance products is moving towards micro-segmentation, such as finding the 18-year-old male drivers who aren’t bad drivers or the roofing companies with low accident rates, then selling to them at a better price than most insurance companies would offer to a broader segment, such as all 18-year-old male drivers or all roofers. He didn’t use the words long tail, but that’s what he’s talking about: this is the long tail of insurance underwriting. There’s so much data about everything that we do these days, both personal and business, that it’s possible to do that sort of micro-segmentation by gathering up all that data, applying some predictive modeling to extract many more parameters of the data than would have been done in a manual evaluation, and developing the loss predictive model that allows a company to figure out whether you’re a good risk or not, and what price to charge you in order to mitigate that risk. Violation of privacy? Maybe. Good insurance business? Definitely.

The result of all this is a segmented view of the market that allows a company to decide which parts they want to focus on, and how to price any of those parts. Now it gets really interesting, because these models can be fed into the business rules in order to determine the price for any given policy: a non-negotiable price, much like Saturn does with its cars. This disintermediates both the agents and the underwriters in the sales process, since all of the decisions about what risks to accept and how to price the policies are automated based on the predictive models and the business rules. Rules can even be made self-optimizing based on emerging trends in the data, which I discussed in my presentation this morning, although this practice is not yet mainstream.
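
A much-simplified sketch of that pipeline, with an invented scoring model and rule thresholds standing in for the real actuarial work:

```python
def loss_score(applicant):
    # stand-in for the predictive model's output, not a real actuarial model
    return 0.5 + 0.4 * applicant["claims_per_year"] - 0.02 * applicant["years_in_business"]

def underwriting_decision(applicant, base_premium=1000.0):
    score = loss_score(applicant)
    if score > 1.5:
        return ("decline", None)   # rule: risk outside appetite
    # rule: non-negotiable, risk-loaded price
    return ("accept", round(base_premium * (1 + score), 2))

print(underwriting_decision({"claims_per_year": 0.2, "years_in_business": 12}))
# -> ('accept', 1340.0)
```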

Lucker’s message is that business rules are what leverages the power of the predictive models into something that makes a difference for a business, namely, improving business processes: reducing manual processes and associated costs, enhancing service and delivery channels, targeting sales on profitable niches (that long tail), and improving point-of-sale decision-making at an agency.

He finished up by describing a top-down approach for designing business rules, starting with organizational strategy, decomposing to the functional areas (business operations, sales, customer service, distribution), then developing the business rules required to help meet the objectives of each of the areas.

BRF Day 1: How Many Business Rule Analysts Does It Take to Change a Lightbulb?

Seriously, that was the name of Giovanni Diviacchi’s session that I attended this afternoon, which looked at his experience as a business analyst at both Freddie Mac and Fannie Mae (the two big government-backed mortgage companies in the US). He had a number of good pointers on how to extract rules from the business and document them in a way that will be properly implemented by the developers.

They developed a “business action language” for the business analysts to communicate with the developers in an unambiguous way, including statements such as “present” (i.e., >0 and not null), “mutually exclusive”, “is required”, and my personal fave, “read my mind”.
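
Here’s my guess at how a couple of those statements might map to code (the definitions below are mine, based only on the “present” definition he gave; “read my mind” is left as an exercise):

```python
# Hypothetical implementations of "business action language" statements;
# only the definition of "present" (> 0 and not null) comes from the talk.
def present(value):
    return value is not None and value > 0

def mutually_exclusive(*values):
    # at most one of the given fields may be supplied
    return sum(1 for v in values if v is not None) <= 1

def is_required(value):
    return value is not None

print(present(0))                     # False: zero doesn't count as present
print(mutually_exclusive("a", None))  # True
print(mutually_exclusive("a", "b"))   # False
```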

He pointed out that the old axiom “rules are meant to be broken” is true even for business rules, in that you can’t ever plan for all the ways in which a rule might need to be overridden; he discussed one case of a woman who was born prior to 1936, never worked and never had a Social Security Number, which meant having to override the rule that SSN is required for a mortgage. There’s a lot that can be learned from this one example: I see so many rules embedded directly in applications — especially web applications — that make some assumptions that aren’t necessarily true, such as assuming that all people have a US address.
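
One way to plan for the unplannable is to build an explicit, audited override path into the rule itself; a sketch (the field names and override mechanism are hypothetical):

```python
# The override carries an explicit reason, so there's an audit trail for
# every case where the rule was deliberately broken.
def check_ssn_required(application, overrides):
    if application.get("ssn"):
        return "pass"
    if "ssn_required" in overrides:
        return "overridden: " + overrides["ssn_required"]
    return "fail"

app = {"name": "J. Doe", "ssn": None}
print(check_ssn_required(app, {}))  # fail
print(check_ssn_required(app, {"ssn_required": "born pre-1936, never issued an SSN"}))
```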

I often work through issues of policies, procedures and processes with my customers, and it was interesting to hear his comments on the relationship between policies and rules. He said that if the policies are well-written, the rules pretty much write themselves, and by spending more time on the policy behind the rules, you end up with a better set of rules. That definitely caused an “aha” moment for me in my emerging role as an evangelist for business rules in a BPM world, and will help to form some of my ideas on how all these things come together.

BRF Day 1: Ron Ross keynote

After a brief intro by Gladys Lam, the executive director of the Business Rules Forum, the conference kicked off with a keynote from Ron Ross, the driving force behind this event and a big name in the business rules community. A couple of things are distracting my attention from his talk: I’m up directly after him, and I’m presenting in this room, which is the main (read: big) conference hall. Let me make my ever-present complaint about passworded wifi in the meeting room and no free wifi or wired internet in the hotel, since I know that my regular readers would be disappointed without that news from the front lines. 🙂

Ron and I have exchanged email over the years, but this is our first opportunity to meet face-to-face; I’ll also have the chance to meet James Taylor and a few others who I only know from afar. Today, Ron’s talking about the shift from business rules to enterprise decisioning. This is the first business rules conference that I’ve ever attended, which means that most of the attendees likely know a lot more about the subject matter than I do, and most of the sessions will be new material for me.

Ron predicted that no one will be talking about SOA at a major conference in 15 years, but they will be talking about business rules or decisioning; I certainly agree with the first point, and the second makes a lot of sense.

When he said “we want our business processes to be smarter”, it was like music to my ears, and a direct lead-in to my presentation. He talked about three trends in the decisioning space:

  • The shift from an exclusive focus on BPM to a balanced approach on enterprise decision management (EDM). He mock-grudgingly admitted that business processes are important, but pointed out that the “smarts” provided by business rules provide agility in the processes (which is exactly the point that I will be making in about 45 minutes — maybe the material here won’t be all that foreign after all).
  • The shift from an exclusive focus on data quality and accessibility to a balanced approach on decision deployment. This is the whole convergence of BI and BR into decisioning — again, a key point in my presentation. I think that Ron is scooping me! He included a great quote from Taylor and Raden’s new book, Smart Enough Systems: “Making information more readily available is important, but making better decisions based on information is what pays the bills.”
  • The shift from an exclusive focus on rules to a balanced approach on decisions. My key takeaway for this conference is figuring out a good distinction between business rules and decisioning, since these terms seem to be used interchangeably in some cases; it seems that decisioning is about (not surprisingly) decisions, which in turn are based on business rules and some information about the current scenario.

He finished up with some pointers on where to think about applying decisioning in your business through a few use cases, such as creating “rules of record” for compliance purposes.

Like every other technology-specific conference (especially the BPM ones that I typically attend), this one holds at its heart the belief that its subject matter is the most important technology in an enterprise, and that herein lies the key to business perfection. I’m being a bit facetious, but we really do need to start getting a bit more cross-over between some of these conferences and technologies.

Business Rules Forum this week

It’s been a busy couple of weeks, between a presentation on the Art of Process Modeling to a group of TIBCO customers in NYC one day, and a full-day version of my Making BPM Mean Business course with a local client, so blogging has been pretty light.

Tomorrow I head to Orlando for the Business Rules Forum, where I give a presentation on Tuesday morning about BPM, business rules and business intelligence. I’ll be staying on until Thursday, so check back for some live blogging from the conference under the BusinessRulesForum category.

If you’re in Toronto this week, I highly recommend the CASCON 2007 event put on by the IBM Centers for Advanced Studies. I went last year and it was great, and (bonus) it’s free, although you need to register in advance.