Webinar on Enterprise Decision Management tomorrow

After attending the Business Rules Forum last week, either I’m more aware of related events or I’m on more mailing lists for rules/decisioning vendors. In either case, Fair Isaac is putting on an Introduction to Enterprise Decision Management webinar tomorrow at 2pm Eastern. From their description of the event:

Learn how your organization can automate and improve decisions by:

  • Taking control of your decisions and making them a corporate asset
  • Separating operational decisions from processes and systems for maximum agility
  • Using business rules management systems to ensure decision consistency and speed
  • Applying different kinds of analytics – descriptive, predictive and decision – to make more precise and profitable decisions

It’s free, just sign up online.

Integration World next week

This week I’m not travelling — a chance to catch up on the work that I’m actually paid for 🙂 — but next week, I’m off to the Software AG/webMethods user conference, Integration World. Watch for my blogging coverage from there.

Disclosure: as with all vendor user conferences that I attend, Software AG is covering my expenses, although I am not paid to attend.

Meeting the bloggers at BRF

Last week at the Business Rules Forum gave me a chance to meet many people whom I’d never met face-to-face but felt that I knew from our exchanges of blog comments and emails: at one point, I was standing around talking to James Taylor, Rolando Hernandez and Scott Sehlhorst.

James was certainly the most prolific in blogging about the conference: he live-blogged the sessions that he attended (even mine), so you can compare with the posts on those sessions that I wrote. He has a wrap-up post with pointers to all of the blogs that he found with coverage of the event.

Smart Enough Systems

I’m sure that James Taylor has almost given up on me ever writing a book review of Smart Enough Systems: I wrote a brief advance review back in April that’s printed in the book, but nothing since it was released. This week, I’ve been immersed in business rules and decisioning, and had a chance to finally meet James face-to-face after a couple of years of emailing back and forth. Also, James’ situation has changed since the book was released: he’s left Fair Isaac and is now an independent, working (I think) with his co-author, Neil Raden. Neil, whom I met very briefly this week, is an independent consultant and analyst who’s been focussed on business intelligence for quite a while; James refers to his work as “BI 2.0” (a term that I think I invented here in early 2006). The two of them met through James’ blog, and started the conversation about how someone needed to write a book about this crossover area between business rules and business intelligence.

Just to get started, here’s my pre-release review:

Taylor and Raden’s central manifesto highlights that it’s critical to embody more intelligence in today’s business decision-making and have consistent, automated decisioning built into business processes in order to remain agile and competitive in today’s fast-moving market. They take you through the core concepts of enterprise decision management (EDM), dive into the underlying technologies, then address how to integrate EDM into your business processes to create your own Smart (Enough) Systems.

By focusing on operational decisions that contribute to corporate strategy, Smart (Enough) Systems provide the ability not only to create agile business processes, but to have these processes be self-learning based on historical results. Instead of simply capturing operational process statistics in a data warehouse for later analysis, Smart (Enough) Systems use that knowledge to inform the business rules and allow them to adapt their guidance of the decision-making process. By extracting the decisions from legacy applications, static enterprise applications and manual procedures, and managing them within a shared enterprise decision management system, operational decisions can be applied consistently — and modified easily for processes in flight — across the enterprise.

What I like about the book is that it provides an overview that will get business people interested in the topic, as well as a practical guide to getting started. There are already a lot of books focussed on analytics and business rules, but most of them assume that you already know what you’re going to do with these technologies; in practice, it’s hard to discover the right decisions on which to focus, since some things turn out to be more important than you perceived when starting the project.

As I heard this week, there’s still a strong tendency to sell rules technology to IT as a way to implement systems faster and cheaper, instead of selling decisioning to the business as a way to solve business problems. For the most part, decisions in business processes are unconsciously relegated either to a developer coding them into an application, or to a front-line worker executing them on an ad hoc basis. Decisions, and the rules that underlie them, need to be made explicit: therein lies the path to both compliance and agility. I joked yesterday about Jan Vanthienen’s presentation that he’d like to reduce all processes to a single activity and have all business logic in a rule set, but there’s definitely value in what he’s saying: we need to seek out the decisions in processes, capture the ever-changing business rules that are embedded within them, and move these out of the processes and into rules/decisioning systems. The reality is that process maps don’t change all that often if the decisions and business rules are stripped out of the processes: most business agility is based on rule changes, not process changes, especially for high-volume core transactional processes. However, since we often code rules directly into the business processes — making these processes some combination of processes and decision trees — we perceive the need to be that of changing processes rather than changing rules. And so (and I’ll be totally blackballed in the BPM community for this particular blasphemy), the entire BPM industry has focussed on building tools that allow for easy modification of process maps by the business, when maybe they really should have been focussed on pushing their customers towards the integration of business rules for decisioning in order to greatly reduce the need to modify process maps.
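To make the process-versus-rules distinction concrete, here's a minimal sketch with entirely invented names and thresholds: the first version welds the decision into the process logic, while the second invokes a named decision against a separately maintained rule set, so the thresholds can change without touching the process.

```python
# A minimal, hypothetical sketch: the same approval decision hard-coded into
# process logic versus externalized into a rule set. All names and thresholds
# are invented for illustration; a real BRMS would manage the rules in a
# versioned repository with a business-friendly editor.

# Before: a policy change means a code change and a redeployment.
def approve_claim_hardcoded(claim: dict) -> bool:
    return claim["amount"] < 1000 and claim["customer_years"] >= 2

# After: the rules live in a separately managed structure.
# Each rule is (name, predicate, outcome); first match wins.
RULES = [
    ("reject_new_customer", lambda c: c["customer_years"] < 1, False),
    ("auto_approve_small",  lambda c: c["amount"] < 1000,      True),
]

def decide(rules, claim: dict, default: bool = False) -> bool:
    for name, predicate, outcome in rules:
        if predicate(claim):
            return outcome
    return default

def approve_claim(claim: dict) -> bool:
    # The process step (and the process map) stays the same when rules change.
    return decide(RULES, claim)

print(approve_claim({"amount": 500, "customer_years": 3}))  # True
```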

Climbing off my soapbox for a moment and returning to James and Neil’s book, they focus on a key benefit of this sort of smart decisioning in operational processes: the ability to replace the “irreplaceables”, such as insurance underwriters, before all the boomers take their knowledge and retire to Palm Beach. This goes a long way towards controlling risk: not just the risk of the workers leaving, but the risk of bad or inconsistent decision-making. Because decisions can be personalized based on the customer and scenario, this approach also allows an organization to be more customer-centric, and provides agility in the face of changing regulations and market conditions.

They see a number of roadblocks to this sort of smart decisioning at the operational level:

  • Everyone is focussed on strategic issues rather than taking their operations seriously; however, the aggregate effect of a small but poor decision repeated across millions of operational transactions is, in fact, strategic. In other words, execution matters.
  • Business and IT have an antagonistic relationship. This is often blamed on IT being arrogant and not allowing the business to participate in technology initiatives, but there’s also some amount of the business not wanting to take responsibility, since that would mean that they couldn’t blame IT if something went wrong.

The ideas that James and Neil put forward in their book are a great place for business and IT to start collaborating on how to make smart enough systems.

You may also want to listen to the interview that Scott Sehlhorst did with James back in June (I know, I know, when the book was actually released).

BRF Day 2: Using Business Rules to Enable a Closed Loop of Compliance

I’m eager to learn more about the relationship between policies, procedures and rules, and how they relate to compliance, so I sat in on a presentation by Peter Still of RuleBurst. There’s a pretty high percentage of vendors on the speaker roster, but so far the quality has been good so no complaints.

The theme of Still’s talk is that the business rules approach will only gain critical mass if it stops being a technical implementation tool and starts being a business problem-solving tool. The current pitch from the business rules vendors is that this is a way to implement systems faster and cheaper while allowing the business to access some tuning parameters, but that’s focussed on the technological capabilities rather than the business value of business rules. This is a perfect mirror of the BPM field, where BPM has only just moved from a purely technical sell to something that’s being sold more and more to the business side of an organization, so I can completely understand where the business rules market is and the challenges that lie ahead in shifting its marketing message. The worldwide market for business rules products is $250M in revenue — not a lot when you consider the size of related markets — and it could be a lot larger if there were greater recognition of the business benefits of business rules.

A perfect business case for re-targeting the business rules message is compliance: it’s an enterprise-wide initiative with executive support, where business rules can be included in the decisioning at key points of the process. Although business rules aren’t the complete answer to compliance, since compliance is a very process-focussed initiative, rules can be a significant contributor to compliance efforts. One of the difficulties with compliance is that many regulations, such as Sarbanes-Oxley, are pretty vague since they have to deal with such a broad range of companies, and it’s difficult to determine precise business rules to implement them. Compliance at a transactional level is a mostly automated application of BPM and business rules, but as you move up to risk management and higher-level compliance factors, there’s less automation, although there are still opportunities for business rules to be wrapped in a compliance framework, such as using business rules to classify a risk even though the management of that risk may be done manually. Still maintains that there’s a link between transactional and operational compliance, and believes that business rules can help with that link, although that’s not recognized by most business rules vendors.

As with most other complex applications of technology, you can solve this with an integrated compliance and rules solution from a single vendor, or go for a best-of-breed approach. Still recommends the former approach, and invited us to drop by his booth to check out what RuleBurst has to offer in this area.

BRF Day 2: True Adventures in Business Rules

Paul Armborst of the Westfield Group presented on his experiences in implementing business rules for their various insurance applications over the past six years. He sees documenting business rules as one of the key problems, with two main choices for how to do it:

  • Define the rules in a repository first (a rules management/modelling tool). Although this is the most likely approach for the first definition of rules within an organization, the repository will fall out of sync as soon as the rules are implemented, since changes will be made in the executing code or BRE rather than in the repository.
  • Define the rules directly in a business rules engine, then generate the documentation in some automated fashion (likely provided by the BRE vendor)

He sees that the best way would be a combination of both approaches: a throw-away repository to document rules as they are discovered, and an extract from the BRE to provide ongoing documentation.

He pointed out one of the problems with introducing business rules: “real” developers want to write Java code, not this airy-fairy business rules stuff, which is an argument that I see against BPM as well. As IT budgets get squeezed, however, development teams will start to look for ways to reduce the amount of code that they have to write and pass some of the tuning capabilities directly over to the business; both BRE and BPM provide these types of capabilities.

He discussed various methods of implementing business rules:

  • Decision tables, noting that the BRE vendor needs to provide some sort of analysis tool to detect gaps and overlaps in the rules (see the sketch after this list), with the caveat that it’s possible to construct a decision table that’s too big to reasonably maintain.
  • Decision trees, which can provide the same functionality as a decision table, but in graphical form; if the decision points are not in the right order, the number of nodes can multiply unnecessarily, and overly-large trees can be difficult to follow.
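To illustrate the kind of gap and overlap analysis Armborst is asking vendors for, here's a deliberately brute-force sketch over a toy decision table; the conditions, domains and actions are all invented, and real BRE analysis tools work symbolically over value ranges rather than enumerating combinations.

```python
# A toy gap/overlap check over a hypothetical decision table: enumerate every
# combination of condition values and verify that exactly one rule fires.
from itertools import product

# Each rule: ({condition: required value, or None for "don't care"}, action)
decision_table = [
    ({"risk": "low",  "region": None},   "approve"),
    ({"risk": "high", "region": "east"}, "refer"),
    ({"risk": None,   "region": "east"}, "review"),  # overlaps both rules above
]

domains = {"risk": ["low", "high"], "region": ["east", "west"]}

def matches(conditions, case):
    return all(v is None or case[k] == v for k, v in conditions.items())

for values in product(*domains.values()):
    case = dict(zip(domains.keys(), values))
    hits = [action for conditions, action in decision_table if matches(conditions, case)]
    if not hits:
        print(f"gap: no rule covers {case}")
    elif len(hits) > 1:
        print(f"overlap: {case} matched by {hits}")
# -> overlaps on both region=east cases, and a gap at risk=high/region=west
```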

He also discussed stateful and stateless implementations of business rules: although stateless is simpler to implement, stateful allows for running only the rules that use data that has changed since the last evaluation of each rule.
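Here's a minimal sketch of that stateful optimization, with invented rule and fact names: each rule declares the facts it depends on, and an update re-evaluates only the rules that reference a fact that actually changed, rather than re-running the whole rule set.

```python
# A hypothetical stateful rule session: rules fire only when one of the facts
# they depend on has changed since the last evaluation.

class StatefulRuleSession:
    def __init__(self, rules):
        # rules: list of (name, set of fact names it depends on, action)
        self.rules = rules
        self.facts = {}

    def update(self, **changed_facts):
        self.facts.update(changed_facts)
        dirty = set(changed_facts)
        for name, depends_on, action in self.rules:
            if depends_on & dirty:  # skip rules whose inputs didn't change
                action(self.facts)

rules = [
    ("credit_check", {"credit_score"},
     lambda f: print("credit ok" if f["credit_score"] > 650 else "refer")),
    ("address_check", {"postcode"},
     lambda f: print("postcode", f["postcode"], "validated")),
]

session = StatefulRuleSession(rules)
session.update(credit_score=700)  # only credit_check runs
session.update(postcode="M5V")    # only address_check runs
```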

There were some final comments on end-user rule maintenance: all of their rules are written by developers, but they’re thinking about how to offer some rule creation and modification capabilities to end users. It’s important to have a BRE that allows some sort of restricted view for less technical users, but it’s also necessary for the techies to do some helpful things like naming objects in a manner that users can understand rather than, say, reverse Hungarian notation. Users who have access to create rules need to have some notion of the logic required to do so, and there needs to be some provision for testing.

BRF Day 2: Rules Management Without a Rule Engine

I moved over to the über-geeky “chief architect” track to hear Rik Gerrits of RuleArts and Petr Choteborsky of Microsoft, but 10 minutes into the session, Gerrits is still giving some fairly basic definitions of business rules management: where rules live, how they’re developed, and how they’re managed and used. He does make a point that business processes consume business rules but that they should be distinct in terms of methodology and implementation, as well as other motivations for business rules management such as compliance and efficiency improvements.

Choteborsky took over with a case study about Microsoft’s internal development (as in applications for their own internal use, like software licence authorization), and instantly endeared himself to the audience by saying that he was in corporate IT at Microsoft, and was just as much a victim of the Microsoft product groups as we were. They had issues with software development lifecycle documents and the rules that were embedded within those documents: multiple, conflicting instances of rules in different documents; rules not explicitly defined hence less agile; no common vocabulary leading to inconsistency and miscommunication. Over time, the business logic is lost, and the business requirements documentation becomes completely out of sync with the application and the user manual, so that the only true representation of the business logic is embedded within the application as coded.

He stepped through an example, showing how to break down the prose in a requirements document to determine what is a rule set (a group of related rules), what’s a rule, what’s a fact (unchangeable, therefore may be hard-coded), what is usability behaviour (which may include hidden rules and facts), and what is contextual information that describes capability without being something that will be explicitly coded. Very cool example, since he shows the tendency for the prose in what we think of as a fairly well-written requirements document to actually be a confusing mix of facts, rules, behaviour and context that doesn’t really provide adequate information about what should be written to be easily changeable versus what can be hard-coded into an application.

He went on to show how the same paragraph should be restructured as facts and rules (which describe the pure essence of how business must be conducted, independent of implementation detail), requirements (UI and application requirements to implement the rules) and context (information that makes it easier to understand the facts, rules and requirements; redundant information that is not coded). The rules mantra (which I’m just learning today) is “rules build on facts, facts build on terms”, and he showed the terms sprinkled throughout the facts and rules.
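As a toy rendering of that layering (not Choteborsky's actual example, which I didn't capture verbatim), here's an invented vocabulary expressed as terms, facts built from those terms, and a rule built on the facts, with a crude check that each statement sticks to the agreed vocabulary.

```python
# A hypothetical illustration of "rules build on facts, facts build on terms".
# The vocabulary is invented; the point is the layering, not the content.

# Terms: the standardized business vocabulary.
terms = {"customer", "licence", "product"}

# Facts: unchangeable statements that relate terms to each other.
facts = [
    "customer holds licence",
    "licence authorizes product",
]

# Rules: conditional logic built on top of the facts.
rules = [
    "a customer may activate a product only if the customer holds a licence "
    "that authorizes that product",
]

# Crude check: report which defined terms each statement uses, which is one
# way a common vocabulary keeps documents consistent and unambiguous.
for statement in facts + rules:
    used = sorted(terms & set(statement.split()))
    print(f"{statement!r} uses terms: {used}")
```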

They’re attempting to change their requirements documents to this form of structured requirements using business rules (for going-forward documents, not retrofitting the existing ones), but it’s a painful process: there needs to be some common vocabulary and a significant amount of training in some cases to have people start thinking in this way. There was a comment from the audience that once the vocabulary — particularly standardization of terms — was established, there’s usually a pretty good uptake from the business community since they really like language that can help them to define their business more precisely and unambiguously.

There was another comment from the audience that what he is calling a requirement is actually a specification, which is an argument that I’ve had about a zillion times in my years of software development: I completely agree with the comment, as did Choteborsky, but he stated that this was the common terminology in use at Microsoft today and he wasn’t trying to fix everything at once. I can see the pragmatism in that, although there should likely be some sort of migration towards more accurate terminology.

He went into more detail on terms, facts and rules, including descriptions of each, and the use of graphical term models and fact models. He also made a distinction between a rule and a policy: a rule can produce an action or decision, whereas a policy is more general but might sound rule-like. He stepped through the before and after of a fact model, where he went through and marked each object and relationship in the model as correct, sort of incorrect, or outright wrong, then found new relationship pathways and defined new terms in the model to make it a better reflection of the actual facts and provide a more logical structure for developing rules.

He’s just using Visio for creating the fact models, although I’m sure that some more comprehensive modelling tools could make this process a bit easier. They’re starting to use RuleXpress (the RuleArts product) for terms, facts and rules, although the rules themselves are actually encoded within applications: rules management without a rule engine. As he pointed out, although some business rules may end up in a business rules engine, some end up directly in the code of an application, and some are never codified but become part of an operational manual. We see exactly the same thing in BPM, where a process model may include steps that are transferred to a BPMS, but also ones that are completely manual and never represented within a BPMS. Having a modelling tool separate from the execution environment provides greater flexibility in what can be modelled, but I suspect that the same issues of synchronization and round-tripping occur in rules modelling environments as exist in process modelling.

Choteborsky was a great speaker: knowledgeable, able to explain some fairly complex concepts, and funny (when one slide came up, he said “I don’t know why PowerPoint made the font on this slide bold and ugly, but I’ve learned that I don’t need to win every battle”). The great thing is that he presented a methodology for developing business specifications that everyone in the room involved in software development could take away and start examining for their own use.

BRF Day 2: Business Rules and Business Intelligence Make Great Bedfellows

David Straus of Corticon gave an engaging presentation about BR and BI, starting with the Wikipedia definitions of each, then characterizing BI as “understanding” and BR as “action” (not unlike my statement that BI in BPM is about visibility and BR in BPM is about agility). He started with the basic drivers for a business rules management system — agility (speed and cost), business control while maintaining IT compliance, transparency, and business improvement (reduce costs, reduce risk, increase revenue) — and went on to some generalized use cases for rules-driven analysis:

  • Analyze transaction compliance, i.e., are the human decisions in a business process compliant with the policies and regulations?
  • Analyze the effect of automation with business rules, i.e., when a previously manual step is automated through the application of rules
  • Analyze business policy rules change (automated or non-automated)

He walked through a simplified claims scenario where the claims agent is not replaced with rules and still makes the decision in the process, but that decision is compared against a decision made by a rules system, and any discrepancies are investigated. In other words, although there’s still a person making the decision in the process, the rules system is acting as a watchdog to ensure that their decisions are compliant with corporate policy. After some time, there can be some analysis of the results to detect patterns in non-compliance: is it an individual agent that’s causing non-compliance, or a particular product, or are the rules not aligned with the requirements? In some cases, the policies given to the agents are actually in conflict, so that there are two different “right” answers; in other cases, agents may have information that’s just not represented in the rules. By modeling the policies in a business rules system, these conflicts can be driven out to establish integrity across the entire set of rules. This can also be used where an organization just isn’t ready to replace a human decision with a business rules system, in order to validate the rules and compare them to the human decisions; this can establish some trust in the decisioning system that may eventually lead them to replace some of the human decisions with automated ones, creating more consistent and compliant decisions.
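A minimal sketch of that watchdog pattern, with entirely invented claims data and a stand-in rule set: each human decision is replayed through the rules, discrepancies are logged, and a simple aggregation hints at whether non-compliance clusters on an agent, a product, or a badly specified rule.

```python
# Hypothetical watchdog: the agent decides, the rules decide in parallel,
# and disagreements are collected for pattern analysis.
from collections import Counter

def rules_decision(claim: dict) -> str:
    # Stand-in for the corporate-policy rule set.
    if claim["amount"] > 5000 or claim["prior_fraud"]:
        return "refer"
    return "approve"

claims = [
    {"id": 1, "agent": "kim", "amount": 900,  "prior_fraud": False, "agent_decision": "approve"},
    {"id": 2, "agent": "kim", "amount": 8000, "prior_fraud": False, "agent_decision": "approve"},
    {"id": 3, "agent": "lee", "amount": 300,  "prior_fraud": True,  "agent_decision": "approve"},
]

discrepancies = [c for c in claims if c["agent_decision"] != rules_decision(c)]
for c in discrepancies:
    print(f"claim {c['id']}: agent {c['agent']} said {c['agent_decision']}, "
          f"rules said {rules_decision(c)}")

# Aggregate to see where the non-compliance clusters.
print(Counter(c["agent"] for c in discrepancies))  # Counter({'kim': 1, 'lee': 1})
```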

David had a number of case studies for this combination of rules and analytics, such as investment portfolio risk management, where mergers and acquisitions in the portfolio holdings may drive the portfolio out of compliance with the underlying risk profile: information about the holdings is fed back through the rules on a daily basis to establish if the portfolio is still in compliance, and trigger a (manual) rebalancing if it is out of compliance.

By combining business intelligence (and the data that it’s based on) and business rules, it’s also possible to analyze what-if scenarios for changes to rules, since the historical data can be fed through the new version of the rules to see what would have changed.
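Here's a sketch of that what-if replay, with invented transactions and rule versions: run the history through the current and proposed rules and summarize which decisions would have changed.

```python
# Hypothetical what-if analysis: replay historical transactions through the
# current and proposed rule versions and report the differences.

def rules_v1(txn: dict) -> str:
    return "approve" if txn["amount"] < 1000 else "refer"

def rules_v2(txn: dict) -> str:
    # Proposed change: raise the auto-approve threshold.
    return "approve" if txn["amount"] < 2500 else "refer"

history = [{"amount": a} for a in (400, 1200, 1800, 3000, 900)]

changed = [(t, rules_v1(t), rules_v2(t)) for t in history
           if rules_v1(t) != rules_v2(t)]
print(f"{len(changed)} of {len(history)} historical decisions would change:")
for txn, old, new in changed:
    print(f"  amount={txn['amount']}: {old} -> {new}")
# -> 2 of 5 historical decisions would change (the 1200 and 1800 transactions)
```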

He’s challenged the BI vendors to do this sort of rules-based analysis; none of them do it now, but it would be a hugely powerful tool for gaining greater insight into businesses.

There was a question from the audience that led to a discussion about the iterative process of discovering rules in a business, particularly the ones that are just in people’s heads rather than encoded in existing systems; David did take this opportunity to make a plug for the modeling environment in their product and how it facilitates rules discovery. I’m seeing some definite opportunities for rules modeling tools when working with my customers on policies and procedures.

BRF Day 1: Leveraging Predictive Modeling and Rules Management for Commercial Insurance Underwriting

For the last presentation today, I listened to John Lucker of Deloitte discuss what they’ve developed in the area of predictive pricing models for property and casualty insurance. Pricing insurance is a bit trickier than pricing widgets: it’s more than just cost of goods sold plus a profit factor, since there’s also the risk factor, and calculating these risks and how they affect pricing is what actuaries do for a living. However, using predictive models can make this pricing more accurate and more consistent, and therefore provides insurance companies with a way to be more competitive and more profitable at the same time.

I know pretty much nothing about predictive modeling, although I think that the algorithms are related to the pattern recognition and clustering stuff that I used to do back in grad school. There’s a ton of recent books on analytics, ranging from pop culture ones like Freakonomics to the somewhat more scholarly Competing on Analytics. I’m expecting Analytics for Dummies to come out any time now.

Predictive modeling is used heavily in credit scoring — based on your current assets, spending habits and debt load, how likely are you to pay on time? — and in the personal insurance business, but it hasn’t really hit the commercial insurance market yet. However, the insurance industry recognizes that this is the future, and all the big players are at least dabbling in it. Although a lot of them have previously considered this just as a way to do more consistent pricing, what they’re trying to do now is integrate the predictive models with business rules in order to drive results. This helps reduce the number of lost customers (by providing more competitive pricing), reduce expenses (by providing straight-through processing), increase growth (by targeting new business areas), and increase profitability (by providing more accurate pricing).

He talked about how the nature of targeting insurance products is moving towards micro-segmentation, such as finding the 18-year-old male drivers who aren’t bad drivers, or the roofing companies with low accident rates, then selling to them at a better price than most insurance companies would offer to a broader segment, such as all 18-year-old male drivers or all roofers. He didn’t use the words “long tail”, but that’s what he’s talking about: this is the long tail of insurance underwriting. There’s so much data about everything that we do these days, both personal and business, that it’s possible to do that sort of micro-segmentation by gathering up all that data, applying some predictive modeling to extract many more parameters from the data than a manual evaluation would, and developing the predictive loss model that allows a company to figure out whether you’re a good risk or not, and what price to charge you in order to mitigate that risk. Violation of privacy? Maybe. Good insurance business? Definitely.

The result of all this is a segmented view of the market that allows a company to decide which parts they want to focus on, and how to price any of those parts. This is where it gets really interesting, because these models can be fed into the business rules in order to determine the price for any given policy: a non-negotiable price, much like Saturn does with its cars. This disintermediates both the agents and the underwriters in the sales process, since all of the decisions about which risks to accept and how to price the policies are automated based on the predictive models and the business rules. Rules can even be made self-optimizing based on emerging trends in the data, which I discussed in my presentation this morning, although this practice is not yet mainstream.
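A toy sketch of that model-plus-rules pricing: the scoring function below is an invented stand-in for a real predictive loss model, and the thresholds are made-up underwriting rules, but it shows how rules turn a model score into a non-negotiable accept/decline-and-price decision.

```python
# Hypothetical model-plus-rules pricing: a stand-in predictive loss score
# feeds business rules that decide risk appetite and premium.

def predicted_loss_score(applicant: dict) -> float:
    # Stand-in for a predictive model trained on historical loss data.
    score = 0.2
    score += 0.3 if applicant["years_in_business"] < 2 else 0.0
    score += 0.4 * min(applicant["prior_claims"], 10) / 10
    return min(score, 1.0)

def price_policy(applicant: dict, base_premium: float):
    score = predicted_loss_score(applicant)
    if score > 0.8:
        return ("decline", None)               # risk rule: outside appetite
    if score > 0.5:
        return ("accept", base_premium * 1.5)  # pricing rule: load the premium
    return ("accept", base_premium * 0.9)      # good micro-segment: discount

print(price_policy({"years_in_business": 5, "prior_claims": 0}, 10_000.0))
# -> ('accept', 9000.0)
```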

Lucker’s message is that business rules are what turn the power of the predictive models into something that makes a difference for a business, namely improving business processes: reducing manual processes and associated costs, enhancing service and delivery channels, targeting sales on profitable niches (that long tail), and improving point-of-sale decision-making at an agency.

He ended by describing a top-down approach for designing business rules, starting with organizational strategy, decomposing to the functional areas (business operations, sales, customer service, distribution), then developing the business rules required to help meet the objectives of each of the areas.