BRF Day 2: Rules Management Without a Rule Engine

I moved over to the über-geeky “chief architect” track to hear Rik Gerrits of RuleArts and Petr Choteborsky of Microsoft, but 10 minutes into the session, Gerrits is still giving some fairly basic definitions of business rules management: where rules live, how they’re developed, and how they’re managed and used. He does make the point that business processes consume business rules but that the two should remain distinct in methodology and implementation, and he covers other motivations for business rules management, such as compliance and efficiency improvements.

Choteborsky took over with a case study about Microsoft’s internal development (as in applications for their own internal use, like software licence authorization), and instantly endeared himself to the audience by saying that he was in corporate IT at Microsoft, and was just as much a victim of the Microsoft product groups as we were. They had issues with software development lifecycle documents and the rules that were embedded within those documents: multiple, conflicting instances of rules in different documents; rules not explicitly defined, hence less agile; no common vocabulary, leading to inconsistency and miscommunication. Over time, the business logic is lost, and the business requirements documentation becomes completely out of sync with the application and the user manual, so that the only true representation of the business logic is embedded within the application as coded.

He stepped through an example, showing how to break down the prose in a requirements document to determine what is a rule set (a group of related rules), what’s a rule, what’s a fact (unchangeable, therefore may be hard-coded), what is usability behaviour (which may include hidden rules and facts), and what is contextual information that describes capability without being something that will be explicitly coded. Very cool example, since it shows how the prose in what we think of as a fairly well-written requirements document tends to be a confusing mix of facts, rules, behaviour and context, one that doesn’t provide adequate information about what should be implemented in an easily changeable form versus what can be hard-coded into an application.

He went on to show how the same paragraph should be restructured as facts and rules (which describe the pure essence of how business must be conducted, independent of implementation detail), requirements (the UI and application requirements to implement the rules) and context (information that makes it easier to understand the facts, rules and requirements, which is redundant and never coded). The rules mantra (which I’m just learning today) is “rules build on facts, facts build on terms”, and he showed the terms sprinkled throughout the facts and rules.
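To make the mantra concrete for myself, here’s a rough sketch in Python of how rules layer on facts and facts layer on terms; the vocabulary and the rule below are my own invention, not Choteborsky’s actual example.

```python
from dataclasses import dataclass

# Terms: the agreed-on business vocabulary (nouns).
@dataclass
class Customer:
    country: str
    years_active: int

@dataclass
class Order:
    total: float
    customer: Customer

# Facts build on terms: relationships and properties that are simply true of
# the business ("a customer places orders", "a customer may be established").
def is_established(customer: Customer) -> bool:
    return customer.years_active >= 2

# Rules build on facts, independent of any UI or implementation detail:
# "An order over 1000 from a non-established customer requires review."
def requires_review(order: Order) -> bool:
    return order.total > 1000 and not is_established(order.customer)

new_customer = Customer(country="CA", years_active=0)
print(requires_review(Order(total=1500.0, customer=new_customer)))  # True
```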

They’re attempting to change their requirements documents to this form of structured requirements using business rules (for going-forward documents, not retrofitting the existing ones), but it’s a painful process: there needs to be some common vocabulary and a significant amount of training in some cases to have people start thinking in this way. There was a comment from the audience that once the vocabulary — particularly standardization of terms — was established, there’s usually a pretty good uptake from the business community since they really like language that can help them to define their business more precisely and unambiguously.

There was another comment from the audience that what he calls a requirement is actually a specification, which is an argument that I’ve had about a zillion times in my years of software development: I completely agree with the comment, as did Choteborsky, but he stated that this is the common terminology in use at Microsoft today and he wasn’t trying to fix everything at once. I can see the pragmatism in that, although there should probably be some migration toward more accurate terminology over time.

He went into more detail on terms, facts and rules, including descriptions of each, and the use of graphical term models and fact models. He also made a distinction between a rule and a policy: a rule can produce an action or decision, whereas a policy is more general but might sound rule-like. He stepped through the before and after of a fact model, where he marked each object and relationship in the model as correct, sort of incorrect, or outright wrong, then found new relationship pathways and defined new terms in the model to make it a better reflection of the actual facts and provide a more logical structure for developing rules. He’s just using Visio for creating the fact models, although I’m sure that some more comprehensive modelling tools could make this process a bit easier.

They’re starting to use RuleXpress (the RuleArts product) for terms, facts and rules, although the rules themselves are actually encoded within applications: rules management without a rule engine. As he pointed out, although some business rules may end up in a business rules engine, some end up directly in the code of an application, and some are never codified but become part of an operational manual. We see exactly the same thing in BPM, where a process model may include steps that are transferred to a BPMS, but also ones that are completely manual and never represented within a BPMS. Having a modelling tool separate from the execution environment provides greater flexibility in what can be modelled, but I suspect that the same issues of synchronization and round-tripping occur in rules modelling environments as exist in process modelling.

Choteborsky was a great speaker: knowledgeable, able to explain some fairly complex concepts, and funny (when one slide came up, he said “I don’t know why PowerPoint made the font on this slide bold and ugly, but I’ve learned that I don’t need to win every battle”). The great thing is that he presented a methodology for developing business specifications that everyone in the room involved in software development could take away and start examining for their own use.

BRF Day 2: Business Rules and Business Intelligence Make Great Bedfellows

David Straus of Corticon gave an engaging presentation about BR and BI, starting with the Wikipedia definitions of each, then characterizing BI as “understanding” and BR as “action” (not unlike my statement that BI in BPM is about visibility and BR in BPM is about agility). He started with the basic drivers for a business rules management system — agility (speed and cost), business control while maintaining IT compliance, transparency, and business improvement (reduce costs, reduce risk, increase revenue) — and went on to some generalized use cases for rules-driven analysis:

  • Analyze transaction compliance, i.e., are the human decisions in a business process compliant with the policies and regulations?
  • Analyze the effect of automation with business rules, i.e., when a previously manual step is automated through the application of rules
  • Analyze business policy rules change (automated or non-automated)

He walked through a simplified claims scenario where the claims agent is not replaced with rules but still makes a decision in the process; that decision is compared against a decision made by a rules system, and any discrepancies are investigated. In other words, although there’s still a person making the decision in the process, the rules system is acting as a watchdog to ensure that their decisions are compliant with corporate policy. After some time, the results can be analyzed to detect patterns in non-compliance: is it an individual agent that’s causing non-compliance, or a particular product, or are the rules not aligned with the requirements? In some cases, the policies given to the agents are actually in conflict, so that there are two different “right” answers; in other cases, agents may have information that’s just not represented in the rules. By modeling the policies in a business rules system, these conflicts can be driven out to establish integrity across the entire set of rules. This approach can also be used where an organization just isn’t ready to replace a human decision with a business rules system, in order to validate the rules and compare them to the human decisions; this can establish enough trust in the decisioning system to eventually replace some of the human decisions with automated ones, creating more consistent and compliant decisions.
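Here’s a minimal sketch of that watchdog pattern (my own illustration, not anything from Corticon’s product): the agent still decides, the rules render their own verdict on the same claim, and discrepancies are tallied per agent so that patterns of non-compliance show up in the analysis.

```python
from collections import Counter

def rules_decision(claim: dict) -> str:
    # Hypothetical policy: auto-approve small claims on active policies.
    if claim["policy_active"] and claim["amount"] <= 5000:
        return "approve"
    return "refer"

def audit(claims: list[dict]) -> Counter:
    """Count discrepancies by agent so patterns of non-compliance show up."""
    discrepancies = Counter()
    for claim in claims:
        if claim["agent_decision"] != rules_decision(claim):
            discrepancies[claim["agent_id"]] += 1
    return discrepancies

claims = [
    {"agent_id": "a1", "policy_active": True, "amount": 1200, "agent_decision": "approve"},
    {"agent_id": "a2", "policy_active": True, "amount": 900, "agent_decision": "refer"},
]
print(audit(claims))  # Counter({'a2': 1})
```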

David had a number of case studies for this combination of rules and analytics, such as investment portfolio risk management, where mergers and acquisitions in the portfolio holdings may drive the portfolio out of compliance with the underlying risk profile: information about the holdings is fed back through the rules on a daily basis to establish if the portfolio is still in compliance, and trigger a (manual) rebalancing if it is out of compliance.

By combining business intelligence (and the data that it’s based on) and business rules, it’s also possible to analyze what-if scenarios for changes to rules, since the historical data can be fed through the new version of the rules to see what would have changed.
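Conceptually, that what-if analysis is just a replay-and-diff over the historical data; here’s a toy sketch in which both rule sets are invented placeholders.

```python
def current_rules(txn: dict) -> str:
    return "approve" if txn["score"] >= 600 else "decline"

def proposed_rules(txn: dict) -> str:
    return "approve" if txn["score"] >= 640 else "decline"

def what_if(history: list[dict]) -> list[dict]:
    """Return the historical transactions whose outcome would change."""
    return [t for t in history if current_rules(t) != proposed_rules(t)]

history = [{"id": 1, "score": 620}, {"id": 2, "score": 700}, {"id": 3, "score": 590}]
print(what_if(history))  # [{'id': 1, 'score': 620}]
```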

He’s challenged the BI vendors to do this sort of rules-based analysis; none of them do it now, but it would provide a hugely powerful tool for providing greater insight into businesses.

There was a question from the audience that led to a discussion about the iterative process of discovering rules in a business, particularly the ones that are just in people’s heads rather than encoded in existing systems; David did take this opportunity to make a plug for the modeling environment in their product and how it facilitates rules discovery. I’m seeing some definite opportunities for rules modeling tools when working with my customers on policies and procedures.

BRF Day 2: Intelligent Process Automation: The Key to Business Process Optimization

The opening keynote today was Steve Hendrick of IDC, discussing their acronym du jour, IPA (intelligent process automation), which is a combination of BPM, BI and decisioning. He lists four key constructs of IPA:

  • Event processing, providing a sense and respond approach
  • Decisioning, covering both rules and actions that might be derived from those rules
  • BPM (I knew that he’d get to this eventually)
  • Advanced analytics, including profiling and segmentation, predictive analytics and modeling, and decision optimization

I’m not sure how this differs from Gartner’s definition of BPMS technology, which includes all these factors; do we really need another acronym for this? I suppose that the analyst firms need to make these distinctions to play in the marketplace, but I’m not sure that a new term specific to one analyst firm provides benefit to the end customers of these systems.

He just put a non-linear programming equation up on the screen. It’s 9:19am, we were all up late last night at various vendor dinners, and he’s talking about the specifics of how to solve this optimization model. I really think that he’s overestimating the number of fellow analytics geeks in the audience.

He moved on to discuss BPM, which he characterizes as a context for putting advanced analytics to work. 🙂 He lists IBM, TIBCO and Adobe (huh?) as the leaders, Global 360 as “right on their heels”, and BEA just behind that with Lombardi somewhere back from that. Hmm, not necessarily everyone’s view of the BPM market.

He then discussed complex event processing for ultra-low latency applications, pointing out characteristics such as how it’s queue-based (much like BPM) to allow asynchronous processing of events, and how this allows for extremely fast response to events as they occur. The tie-in to the other technologies that he’s discussing is that events can trigger processes, and can also trigger decisions; he feels that the latter is more important.
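As a toy illustration of that queue-based, asynchronous style (real CEP engines add windowing, correlation and pattern matching on top, none of which is shown here), here’s a producer that enqueues events without blocking while a consumer thread dequeues each one and triggers a decision:

```python
import queue
import threading

events: queue.Queue = queue.Queue()

def decide(event: dict) -> None:
    # Placeholder decision logic triggered by the event.
    if event["type"] == "price_drop" and event["pct"] > 5:
        print(f"decision: rebalance triggered by {event}")

def consumer() -> None:
    while True:
        event = events.get()
        if event is None:  # shutdown sentinel
            break
        decide(event)
        events.task_done()

t = threading.Thread(target=consumer)
t.start()
events.put({"type": "price_drop", "pct": 7})  # producer returns immediately
events.put(None)
t.join()
```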

He talked through a number of case studies showing how analytics — in addition to other technologies and processes — made a difference for companies.

He ended with some predictions of a bright future for IPA, which included a hockey stick-like projection of BPMS sales increases of about 6x between now and 2011.

BRF Day 1: Leveraging Predictive Modeling and Rules Management for Commercial Insurance Underwriting

For the last presentation today, I listened to John Lucker of Deloitte discuss what they’ve developed in the area of predictive pricing models for property and casualty insurance. Pricing insurance is a bit trickier than pricing widgets: it’s more than just cost of goods sold plus a profit factor, since there’s also the risk factor, and calculating these risks and how they affect pricing is what actuaries do for a living. However, using predictive models can make this pricing more accurate and more consistent, and therefore provides insurance companies with a way to be more competitive and more profitable at the same time.

I know pretty much nothing about predictive modeling, although I think that the algorithms are related to the pattern recognition and clustering stuff that I used to do back in grad school. There’s a ton of recent books on analytics, ranging from pop culture ones like Freakonomics to the somewhat more scholarly Competing on Analytics. I’m expecting Analytics for Dummies to come out any time now.

Predictive modeling is used heavily in credit scoring — based on your current assets, spending habits and debt load, how likely are you to pay on time — and in the personal insurance business, but it hasn’t really hit the commercial insurance market yet. However, the insurance industry recognizes that this is the future, and all the big players are at least dabbling in it. Although a lot of them have previously considered this just as a way to do more consistent pricing, what they’re trying to do now is integrate the predictive models with business rules in order to drive results. This helps to reduce the number of lost customers (by providing more competitive pricing), reduce expenses (by enabling straight-through processing), increase growth (by targeting new business areas), and increase profitability (by providing more accurate pricing).

He talked about how the nature of targeting insurance products is moving towards micro-segmentation, such as finding the 18-year-old male drivers who aren’t bad drivers or the roofing companies with low accident rates, then selling to them at a better price than most insurance companies would offer to a broader segment, such as all 18-year-old male drivers or all roofers. He didn’t use the words long tail, but that’s what he’s talking about: this is the long tail of insurance underwriting. There’s so much data about everything that we do these days, both personal and business, that it’s possible to do that sort of micro-segmentation by gathering up all that data, applying some predictive modeling to extract many more parameters from the data than a manual evaluation would, and developing the predictive loss model that allows a company to figure out whether you’re a good risk or not, and what price to charge you in order to mitigate that risk. Violation of privacy? Maybe. Good insurance business? Definitely.

The result of all this is a segmented view of the market that allows a company to decide which parts they want to focus on, and how to price any of those parts. Now it gets really interesting, because these models can be fed into the business rules in order to determine the price for any given policy: a non-negotiable price, much like Saturn does with its cars. This disintermediates both the agents and the underwriters in the sales process, since all of the decisions about what risks to accept and how to price the policies are automated based on the predictive models and the business rules. Rules can even be made self-optimizing based on emerging trends in the data, which I discussed in my presentation this morning, although this practice is not yet mainstream.
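Here’s a simplified sketch of that pipeline as I understand it: a stand-in predictive model estimates a loss ratio, and business rules turn that estimate into an accept/decline decision and a non-negotiable price. The scoring formula, thresholds and pricing rule are all invented for illustration.

```python
def loss_model(applicant: dict) -> float:
    """Stand-in for a fitted predictive model: returns a predicted loss ratio."""
    ratio = 0.55
    ratio += 0.10 if applicant["prior_claims"] > 2 else 0.0
    ratio -= 0.05 if applicant["years_in_business"] > 10 else 0.0
    return ratio

def underwriting_rules(applicant: dict, expected_losses: float) -> dict:
    risk = loss_model(applicant)
    if risk > 0.70:  # rule: decline risks above the company's appetite
        return {"decision": "decline"}
    # rule: set the premium so the predicted loss ratio holds at that price
    premium = expected_losses / risk
    return {"decision": "accept", "premium": round(premium, 2)}

print(underwriting_rules({"prior_claims": 0, "years_in_business": 12}, 10000.0))
# {'decision': 'accept', 'premium': 20000.0}
```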

Lucker’s message is that business rules are what turn the power of the predictive models into something that makes a difference for a business, namely, improving business processes: reducing manual processes and associated costs, enhancing service and delivery channels, targeting sales on profitable niches (that long tail), and improving point-of-sale decision-making at an agency.

He wrapped up by describing a top-down approach for designing business rules: starting with organizational strategy, decomposing it into the functional areas (business operations, sales, customer service, distribution), then developing the business rules required to help meet the objectives of each area.

BRF Day 1: How Many Business Rule Analysts Does It Take to Change a Lightbulb?

Seriously, that was the name of Giovanni Diviacchi’s session that I attended this afternoon, which looked at his experience as a business analyst at both Freddie Mac and Fannie Mae (the two big government-backed mortgage companies in the US). He had a number of good pointers on how to extract rules from the business and document them in a way that will be properly implemented by the developers.

They developed a “business action language” for the business analysts to communicate with the developers in an unambiguous way, including statements such as “present” (i.e., >0 and not null), “mutually exclusive”, “is required”, and my personal fave, “read my mind”.
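A small sketch of how such a language takes the ambiguity out of a requirement by mapping each statement to a testable predicate; “present” follows the definition given in the talk, while “is required” and “mutually exclusive” are my own reading of the phrases (“read my mind”, sadly, remains unimplemented):

```python
from typing import Optional

def present(value: Optional[float]) -> bool:
    """'Present' per the talk's definition: not null and greater than zero."""
    return value is not None and value > 0

def is_required(value: Optional[object]) -> bool:
    """'Is required': the field must be supplied at all."""
    return value is not None

def mutually_exclusive(*values: Optional[object]) -> bool:
    """At most one of the fields may be supplied."""
    return sum(v is not None for v in values) <= 1

assert present(42.0)
assert not present(0)
assert mutually_exclusive("cash", None, None)
```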

He pointed out that the old axiom “rules are meant to be broken” is true even for business rules, in that you can’t ever plan for all the ways in which a rule might need to be overridden; he discussed one case of a woman who was born prior to 1936, never worked and never had a Social Security Number, which meant having to override the rule that SSN is required for a mortgage. There’s a lot that can be learned from this one example: I see so many rules embedded directly in applications — especially web applications — that make some assumptions that aren’t necessarily true, such as assuming that all people have a US address.

I often work through issues of policies, procedures and processes with my customers, and it was interesting to hear his comments on the relationship between policies and rules. He said that if the policies are well-written, the rules pretty much write themselves, and by spending more time on the policy behind the rules, you end up with a better set of rules. That definitely caused an “aha” moment for me in my emerging role as an evangelist for business rules in a BPM world, and will help to form some of my ideas on how all these things come together.

BRF Day 1: Ron Ross keynote

After a brief intro by Gladys Lam, the executive director of the Business Rules Forum, the conference kicked off with a keynote from Ron Ross, the driving force behind this event and a big name in the business rules community. A couple of things are distracting my attention from his talk: I’m up directly after him, and I’m presenting in this room, which is the main (read: big) conference hall. Let me make my ever-present complaint about passworded wifi in the meeting room and no free wifi or wired internet in the hotel, since I know that my regular readers would be disappointed without that news from the front lines. 🙂

Ron and I have exchanged email over the years, but this is our first opportunity to meet face-to-face; I’ll also have the chance to meet James Taylor and a few others who I only know from afar. Today, Ron’s talking about the shift from business rules to enterprise decisioning. This is the first business rules conference that I’ve ever attended, which means that most of the attendees likely know a lot more about the subject matter than I do, and most of the sessions will be new material for me.

Ron predicted that no one will be talking about SOA at a major conference in 15 years, but they will be talking about business rules or decisioning; I certainly agree with the first point, and the second makes a lot of sense.

When he said “we want our business processes to be smarter”, it was like music to my ears, and a direct lead-in to my presentation. He talked about three trends in the decisioning space:

  • The shift from an exclusive focus on BPM to a balanced approach on enterprise decision management (EDM). He mock-grudgingly admitted that business processes are important, but pointed out that the “smarts” provided by business rules provide agility in the processes (which is exactly the point that I will be making in about 45 minutes — maybe the material here won’t be all that foreign after all).
  • The shift from an exclusive focus on data quality and accessibility to a balanced approach on decision deployment. This is the whole convergence of BI and BR into decisioning — again, a key point in my presentation. I think that Ron is scooping me! He included a great quote from Taylor and Raden’s new book, Smart Enough Systems: “Making information more readily available is important, but making better decisions based on information is what pays the bills.”
  • The shift from an exclusive focus on rules to a balanced approach on decisions. My key goal for this conference is figuring out a good distinction between business rules and decisioning, since these terms seem to be used interchangeably in some cases; it seems that decisioning is about (not surprisingly) decisions, which in turn are based on business rules and some information about the current scenario.

He finished up with some pointers on where to think about applying decisioning in your business through a few use cases, such as creating “rules of record” for compliance purposes.

Like every other technology-specific conference (especially the BPM ones that I typically attend), this one has at its heart the premise that its subject matter is the most important technology in an enterprise, and that herein lies the key to business perfection. I’m being a bit facetious, but we really do need to start getting a bit more cross-over between some of these conferences and technologies.

Business Rules Forum this week

It’s been a busy couple of weeks, between a presentation on the Art of Process Modeling to a group of TIBCO customers in NYC one day, and a full-day version of my Making BPM Mean Business course with a local client, so blogging has been pretty light.

Tomorrow I head to Orlando for the Business Rules Forum, where I give a presentation on Tuesday morning about BPM, business rules and business intelligence. I’ll be staying on until Thursday, so check back for some live blogging from the conference under the BusinessRulesForum category.

If you’re in Toronto this week, I highly recommend the CASCON 2007 event put on by the IBM Centers for Advanced Studies. I went last year and it was great, and (bonus) it’s free, although you need to register in advance.

University of Exeter to offer Masters in BPM

Exeter’s Centre for Research in Strategic Processes and Operations is partnering with BPTG (the part of BPMG that didn’t make off with the domain name) as part of their soon-to-be-launched Masters in Business Process Management. From the BPTG press release:

The Masters Programme has 10 modules and a dissertation and can be completed over three years. The modules include:

  • Business Process Foundations
  • Business Process Measurement
  • Business Process Improvement
  • Operations Management
  • Business Process Modelling
  • Business Process Change Management
  • Services Management
  • Research Methods
  • Customer Value and Process
  • Business Process Leadership

Looks like some good content here, although I think that BPTG needs to just get over the whole BPMG debacle and stop including phrases like this in their emails: “Some providers distribute course and other certificates like confetti, the authenticity and veracity of which, beyond simple attendance, have no discernable pedigree.”

Collaboration software survey

Jive Software recently did a survey about “collaboration software”, which includes social networking tools such as blogs and wikis, although it’s not clear whether it also includes other collaboration tools such as ECM. I suspect that it does, since 63% of the respondents said that they have access to some type of collaboration software, 78% use it at least weekly, and half use it on a daily basis.

Social networking is definitely starting to make an impact in enterprises: 98% of the respondents know what a blog is, and 63% know what a wiki is. The survey didn’t ask, however, whether those people can define any enterprise uses for blogs and wikis.

Atlassian releases a SharePoint plug-in for Confluence

I had an update from Jeffrey Walker of Atlassian about today’s joint announcement with Microsoft at O’Reilly’s Web 2.0 Summit: Microsoft is partnering with a few Web 2.0 innovators, including Atlassian (a pretty big vote of confidence, since Atlassian is Java-based), in order to position SharePoint as a social computing platform. As part of this initiative, Atlassian is releasing a SharePoint connector/plug-in for their Confluence enterprise wiki product that provides single sign-on to both products, plus two pretty interesting capabilities: federated search and content sharing.

Every customer that I deal with has multiple content repositories of some sort — most of them including some amount of SharePoint — so the issue with bringing in any new content repository such as an enterprise wiki is that users will need to search in multiple locations to find information. The Atlassian plug-in allows for federating a search across Confluence and SharePoint repositories, regardless of where the search originated, while respecting each product’s security.
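I haven’t seen the plug-in’s internals, but conceptually a federated search is straightforward: fan the query out to each repository’s own search as the current user, so that each system enforces its own security, then merge the hits. Here’s a rough sketch with hypothetical client classes standing in for the real connectors:

```python
from concurrent.futures import ThreadPoolExecutor

class ConfluenceClient:  # hypothetical stand-in, not Atlassian's API
    def search(self, query: str, user: str) -> list[dict]:
        return [{"source": "confluence", "title": f"Wiki page about {query}"}]

class SharePointClient:  # hypothetical stand-in, not Microsoft's API
    def search(self, query: str, user: str) -> list[dict]:
        return [{"source": "sharepoint", "title": f"Document about {query}"}]

def federated_search(query: str, user: str) -> list[dict]:
    """Query every repository in parallel as `user`, then merge the hits."""
    repos = [ConfluenceClient(), SharePointClient()]
    with ThreadPoolExecutor() as pool:
        result_sets = pool.map(lambda r: r.search(query, user), repos)
    return [hit for hits in result_sets for hit in hits]

print(federated_search("onboarding", user="skemsley"))
```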

The second major capability of the plug-in is to allow content sharing. From the SharePoint side, this allows Confluence pages to be embedded into SharePoint pages, including in combination with other SharePoint content. From the Confluence side, you can link directly to SharePoint content, which is a bit lighter-weight integration, but allows for things such as a single click to edit an Office document that is stored within SharePoint.

This plug-in is available today in beta for free (assuming that you already have Confluence and SharePoint, of course), and will become a for-fee plug-in when it reaches version 1.0 at some point in the future.

The other Enterprise 2.0 vendor included in this latest Microsoft initiative is NewsGator, although I don’t know much about their part except what I read in the Microsoft press release:

NewsGator, a leading RSS company that helps individuals and businesses improve the way they access information and communicate, today announced the general availability of NewsGator Social Sites. NewsGator Social Sites is a collection of site templates, profiles, Web parts and middleware that will enhance the social computing capabilities of Microsoft Office SharePoint Server 2007 and Windows