Integration World Day 1: Peter Kurpick

Peter Kurpick, CPO (Chief Product Officer) of webMethods Business Division, gave an overview of the technology direction. He talked about the paradigm for SOA governance, with the layers of technical services, business services and policies being consumed by business processes: the addition of the policy layer (which is the SOA governance part) sets this apart from many of the visions of SOA that you see.

He brought along Susan Ganeshan, the SVP of Product Management and Product Marketing, to give a (canned) demo similar to one that we saw yesterday at the end of the analyst sessions. She showed the process map as modelled in their BPM layer, indicating where the appropriate services were called and other points of integration with webMethods, then we saw the custom portal-type interfaces for customers, suppliers and internal workers. They have Fair Isaac’s Blaze Advisor integrated with the BPMS, which allows them to change rules for in-flight processes, and their own monitoring and analytics as well as some new Cognos analytics integration. She also showed us the CentraSite integration, where information about services and their policies is stored; CentraSite can be used to dynamically select from multiple equivalent services based on policies, such as selecting from one of several suppliers. The idea of the demo is to show how all of the pieces can come together — people, web services, B2B services, legacy services, and policy governance — all using the webMethods suite.
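To make the dynamic selection idea concrete, here's a minimal sketch of policy-based selection among functionally equivalent services. All of the names, attributes and the selection rule are hypothetical illustrations of the concept, not the actual webMethods or CentraSite API.

```python
# Hypothetical sketch: pick one of several equivalent services based on policy.
from dataclasses import dataclass, field

@dataclass
class Service:
    name: str
    endpoint: str
    attributes: dict = field(default_factory=dict)  # policy-relevant metadata

def select_service(candidates, policy):
    """Select a service whose attributes satisfy the policy's requirements,
    ranked by the policy's preference attribute (e.g. lowest price)."""
    eligible = [s for s in candidates
                if all(s.attributes.get(k) == v
                       for k, v in policy["require"].items())]
    return min(eligible, key=lambda s: s.attributes[policy["rank_by"]])

suppliers = [
    Service("SupplierA", "https://a.example/order", {"approved": True, "price": 102}),
    Service("SupplierB", "https://b.example/order", {"approved": True, "price": 98}),
    Service("SupplierC", "https://c.example/order", {"approved": False, "price": 90}),
]
policy = {"require": {"approved": True}, "rank_by": "price"}
print(select_service(suppliers, policy).name)  # SupplierB
```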

The original core functionality provided by webMethods is the ESB (originally from the EAI space), but now that’s surrounded by BPM, composite applications, B2B integration and legacy modernization tools (from the Software AG side). Around that is BAM, which is being raised in importance from being just an adjunct to BPM to being an event-related technology in its own right. Around all of this is SOA governance, which is what CentraSite brings.

The next release, due sometime in 2008, will be a fully-integrated suite of the Software AG and webMethods products, although Kurpick didn’t provide a lot of information.

Webinar on Enterprise Decision Management tomorrow

After attending the Business Rules Forum last week, either I’m more aware of related events or I’m on more mailing lists for rules/decisioning vendors. In either case, Fair Isaac is putting on an Introduction to Enterprise Decision Management webinar tomorrow at 2pm Eastern. From their description of the event:

Learn how your organization can automate and improve decisions by:

  • Taking control of your decisions and making them a corporate asset
  • Separating operational decisions from processes and systems for maximum agility
  • Using business rules management systems to ensure decision consistency and speed
  • Applying different kinds of analytics – descriptive, predictive and decision – to make more precise and profitable decisions

It’s free, just sign up online.

Smart Enough Systems

I’m sure that James Taylor has almost given up on me ever writing a book review of Smart Enough Systems: I wrote a brief advance review back in April that’s printed in the book, but nothing since it was released. This week, I’ve been immersed in business rules and decisioning, and had a chance to finally meet James face-to-face after a couple of years of emailing back and forth. Also, James’ situation has changed since the book was released: he’s left Fair Isaac and is now an independent, working (I think) with his co-author, Neil Raden. Neil, who I met very briefly this week, is an independent consultant and analyst who’s been focussed on business intelligence for quite a while; James refers to his work as “BI 2.0” (a term that I think I invented here in early 2006). The two of them met through James’ blog and started the conversation about how someone needed to write a book about this crossover area between business rules and business intelligence.

Just to get started, here’s my pre-release review:

Taylor and Raden’s central manifesto highlights that it’s critical to embody more intelligence in today’s business decision-making and have consistent, automated decisioning built into business processes in order to remain agile and competitive in today’s fast-moving market. They take you through the core concepts of enterprise decision management (EDM), dive into the underlying technologies, then address how to integrate EDM into your business processes to create your own Smart (Enough) Systems.

By focusing on operational decisions that contribute to corporate strategy, Smart (Enough) Systems provide the ability not only to create agile business processes, but to have these processes be self-learning based on historical results. Instead of simply capturing operational process statistics in a data warehouse for later analysis, Smart (Enough) Systems use that knowledge to inform the business rules and allow them to adapt their guidance of the decision-making process. By extracting the decisions from legacy applications, static enterprise applications and manual procedures, and managing them within a shared enterprise decision management system, operational decisions can be applied consistently — and modified easily for processes in flight — across the enterprise.

What I like about the book is that it provides an overview that will get business people interested in the topic, as well as a practical guide to getting started. There are a lot of books focussed on analytics and business rules already, but most of them assume that you already know what you’re going to do with these technologies; in many cases, it’s hard to discover the right decisions on which to focus, since some decisions turn out to be more important than you perceived at the start of a project.

As I heard this week, there’s still a strong tendency to sell rules technology to IT as a way to implement systems faster and cheaper, instead of selling decisioning to the business as a way to solve business problems. For the most part, decisions in business processes are unconsciously relegated either to a developer coding them into an application, or to a front-line worker executing them on an ad hoc basis. Decisions, and the rules that underlie them, need to be made explicit: therein lies the path to both compliance and agility.

I joked yesterday about Jan Vanthienen’s presentation that he’d like to reduce all processes to a single activity and have all business logic in a rule set, but there’s definitely value in what he’s saying: we need to seek out the decisions in processes, capture the ever-changing business rules that are embedded within them, and move these out of the processes and into rules/decisioning systems. The reality is that process maps don’t change all that often if the decisions and business rules are stripped out of the processes: most business agility is based on rule changes, not process changes, especially for high-volume core transactional processes. However, since we often code rules directly into the business processes — making these processes some combination of processes and decision trees — we perceive the need to be that of changing processes rather than changing rules. And so (and I’ll be totally blackballed in the BPM community for this particular blasphemy), the entire BPM industry has focussed on building tools that allow for easy modification of process maps by the business, when maybe it really should have been focussed on pushing its customers towards the integration of business rules for decisioning in order to greatly reduce the need to modify process maps.
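As a concrete illustration of that argument, here's a minimal sketch (all names hypothetical, not any vendor's API) of the difference between a decision hard-coded into a process and one externalized into a rule set that the business can change without touching the process map.

```python
# Before: the decision is coded into the process as branching logic, so a
# policy change means changing (and redeploying) the process itself.
def route_claim_hardcoded(claim):
    if claim["amount"] < 1000 and claim["customer_years"] > 5:
        return "auto-approve"
    return "manual-review"

# After: the process has a single "decide" step that consults a rule set;
# the business changes the rules, and the process map stays stable.
RULES = [
    (lambda c: c["amount"] < 1000 and c["customer_years"] > 5, "auto-approve"),
    (lambda c: True, "manual-review"),  # default rule
]

def route_claim(claim, rules=RULES):
    for condition, outcome in rules:
        if condition(claim):
            return outcome

print(route_claim({"amount": 500, "customer_years": 7}))  # auto-approve
```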

Climbing off my soapbox for a moment and returning to James and Neil’s book, they focus on a key benefit of this sort of smart decisioning in operational processes, which is the ability to replace the “irreplaceables”, such as insurance underwriters, before all the boomers take their knowledge and retire to Palm Beach. This greatly controls risk: not just the risk of the workers leaving, but the risk of bad or inconsistent decision-making. Allowing decisions to become more personalized to the customer and scenario also makes an organization more customer-centric, and provides agility in the face of changing regulations and market conditions.

They see a number of roadblocks to this sort of smart decisioning at the operational level:

  • Everyone is focussed on strategic issues and not taking their operations seriously; however, the aggregate effect of a small but poor decision on millions of operational transactions is, in fact, strategic. In other words, execution matters.
  • Business and IT have an antagonistic relationship. This is often blamed on IT being arrogant and not allowing the business to participate in technology initiatives, but there’s also some amount of the business not wanting to take responsibility since that means that they can’t blame IT if something goes wrong.

The ideas that James and Neil put forward in their book are a great place for business and IT to start collaborating on how to make smart enough systems.

You may also want to listen to the interview that Scott Sehlhorst did with James back in June (I know, I know, when the book was actually released).

BRF Day 2: Business Rules and Business Intelligence Make Great Bedfellows

David Straus of Corticon gave an engaging presentation about BR and BI, starting with the Wikipedia definitions about each, then characterizing BI as “understanding” and BR as “action” (not unlike my statement that BI in BPM is about visibility and BR in BPM is about agility). He started with the basic drivers for a business rules management system — agility (speed and cost), business control while maintaining IT compliance, transparency, and business improvement (reduce costs, reduce risk, increase revenue) — and went on to some generalized use cases for rules-driven analysis:

  • Analyze transaction compliance, i.e., are the human decisions in a business process compliant with the policies and regulations?
  • Analyze the effect of automation with business rules, i.e., when a previously manual step is automated through the application of rules
  • Analyze business policy rules change (automated or non-automated)

He walked through a simplified claims scenario in which the claims agent is not replaced by rules but still makes the decision in the process; that decision is compared against a decision made by a rules system, and any discrepancies are investigated. In other words, although there’s still a person making the decision in the process, the rules system is acting as a watchdog to ensure that their decisions are compliant with corporate policy. After some time, the results can be analyzed to detect patterns in non-compliance: is it an individual agent that’s causing non-compliance, or a particular product, or are the rules not aligned with the requirements? Sometimes the policies given to the agents are actually in conflict, so that there are two different “right” answers; in other cases, agents may have information that’s just not represented in the rules. By modeling the policies in a business rules system, these conflicts can be driven out to establish integrity across the entire set of rules. This approach can also be used where an organization just isn’t ready to replace a human decision with a business rules system, in order to validate the rules and compare them to the human decisions; this can establish enough trust in the decisioning system to eventually lead them to replace some of the human decisions with automated ones, creating more consistent and compliant decisions.
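Here's a hedged sketch of that watchdog pattern: the rules engine shadows the human decision rather than replacing it, and discrepancies are logged and aggregated for later pattern analysis. The rule set, field names and decision values are all hypothetical.

```python
from collections import Counter

def rules_decision(claim):
    # Stand-in for the corporate-policy rule set.
    return "approve" if claim["amount"] <= 5000 and not claim["flagged"] else "refer"

def audit(decisions):
    """decisions: list of (agent, claim, human_decision) tuples."""
    discrepancies = []
    for agent, claim, human in decisions:
        expected = rules_decision(claim)
        if human != expected:
            discrepancies.append((agent, claim, human, expected))
    # Aggregate by agent to see whether non-compliance clusters on an
    # individual, a product, or suggests the rules themselves need fixing.
    by_agent = Counter(agent for agent, *_ in discrepancies)
    return discrepancies, by_agent

log = [
    ("alice", {"amount": 800, "flagged": False}, "approve"),
    ("bob",   {"amount": 9000, "flagged": False}, "approve"),  # discrepancy
]
_, counts = audit(log)
print(counts)  # Counter({'bob': 1})
```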

David had a number of case studies for this combination of rules and analytics, such as investment portfolio risk management, where mergers and acquisitions among the portfolio holdings may drive the portfolio out of compliance with the underlying risk profile: information about the holdings is fed back through the rules on a daily basis to determine whether the portfolio is still in compliance, and to trigger a (manual) rebalancing if it isn’t.

By combining business intelligence (and the data that it’s based on) and business rules, it’s also possible to analyze what-if scenarios for changes to rules, since the historical data can be fed through the new version of the rules to see what would have changed.
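A minimal sketch of that what-if idea: replay historical cases through the current and proposed rule sets and diff the outcomes. The rules and cutoff values here are made up for illustration.

```python
def rules_v1(case):
    return "approve" if case["score"] >= 600 else "decline"

def rules_v2(case):  # proposed change: raise the cutoff
    return "approve" if case["score"] >= 650 else "decline"

def what_if(history, old_rules, new_rules):
    """Return the historical cases whose outcome would change under the new rules."""
    return [(case, old_rules(case), new_rules(case))
            for case in history
            if old_rules(case) != new_rules(case)]

history = [{"score": s} for s in (580, 620, 640, 700)]
for case, old, new in what_if(history, rules_v1, rules_v2):
    print(case, old, "->", new)  # the 620 and 640 cases flip to decline
```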

He’s challenged the BI vendors to do this sort of rules-based analysis; none of them do it now, but it would be a hugely powerful tool for gaining greater insight into businesses.

There was a question from the audience that led to a discussion about the iterative process of discovering rules in a business, particularly the ones that are just in people’s heads rather than encoded in existing systems; David did take this opportunity to make a plug for the modeling environment in their product and how it facilitates rules discovery. I’m seeing some definite opportunities for rules modeling tools when working with my customers on policies and procedures.

BRF Day 2: Intelligent Process Automation: The Key to Business Process Optimization

The opening keynote today was Steve Hendrick of IDC, discussing their acronym du jour, IPA (intelligent process automation), which is a combination of BPM, BI and decisioning. He lists four key constructs of IPA:

  • Event processing, providing a sense and respond approach
  • Decisioning, covering both rules and actions that might be derived from those rules
  • BPM (I knew that he’d get to this eventually)
  • Advanced analytics, including profiling and segmentation, predictive analytics and modeling, and decision optimization

I’m not sure how this differs from Gartner’s definition of BPMS technology, which includes all these factors; do we really need another acronym for this? I suppose that the analyst firms need to make these distinctions to play in the marketplace, but I’m not sure that a new term specific to one analyst firm provides benefit to the end customers of these systems.

He just put a non-linear programming equation up on the screen. It’s 9:19am, we were all up late last night at various vendor dinners, and he’s talking about the specifics of how to solve this optimization model. I really think that he’s overestimating the number of fellow analytics geeks in the audience.

He moved on to discuss BPM, which he characterizes as a context for putting advanced analytics to work. 🙂 He lists IBM, TIBCO and Adobe (huh?) as the leaders, Global 360 as “right on their heels”, and BEA just behind that with Lombardi somewhere back from that. Hmm, not necessarily everyone’s view of the BPM market.

He then discussed complex event processing for ultra-low latency applications, pointing out characteristics such as how it’s queue based (much like BPM) to allow asynchronous processing of events, and how this allows for extremely fast response to events as they occur. The tie-in to the other technologies that he’s discussing is that events can trigger processes, and can also trigger decisions, the latter of which he feels is more important.
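A toy sketch of that queue-based, asynchronous style: producers enqueue events without blocking, and a consumer evaluates each event against decision logic as it arrives. The event shape and threshold are invented for illustration; real CEP engines add windowing, correlation and much lower latency.

```python
import queue
import threading

events = queue.Queue()

def on_event(event):
    # An event can trigger a decision (and potentially a process).
    if event["type"] == "price_tick" and event["move"] > 0.05:
        print("decision: hedge position", event["symbol"])

def consumer():
    while True:
        event = events.get()
        if event is None:   # sentinel to shut down
            break
        on_event(event)

t = threading.Thread(target=consumer)
t.start()
events.put({"type": "price_tick", "symbol": "XYZ", "move": 0.07})
events.put(None)
t.join()
```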

He talked through a number of case studies showing how analytics — in addition to other technologies and processes — made a difference for companies.

He ended with some predictions of a bright future for IPA, which included a hockey stick-like projection of BPMS sales increases of about 6x between now and 2011.

BRF Day 1: Leveraging Predictive Modeling and Rules Management for Commercial Insurance Underwriting

For the last presentation today, I listened to John Lucker of Deloitte discuss what they’ve developed in the area of predictive pricing models for property and casualty insurance. Pricing insurance is a bit trickier than pricing widgets: it’s more than just cost of goods sold plus a profit margin; there’s also the risk factor, and calculating those risks and how they affect pricing is what actuaries do for a living. However, using predictive models can make this pricing more accurate and more consistent, and therefore provides insurance companies with a way to be more competitive and more profitable at the same time.
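For the intuition, here's a deliberately simplified premium calculation: the price carries expense and profit loadings plus a risk load on top of the predicted claims cost. The loadings and numbers are made up; real actuarial pricing is far more involved.

```python
def premium(expected_loss, expense_load=0.25, profit_margin=0.05, risk_load=0.10):
    """expected_loss: predicted claims cost for this risk, e.g. from a model.
    All loading factors here are illustrative, not real actuarial values."""
    return expected_loss * (1 + expense_load + profit_margin + risk_load)

print(round(premium(1000.0), 2))  # 1400.0
```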

I know pretty much nothing about predictive modeling, although I think that the algorithms are related to the pattern recognition and clustering stuff that I used to do back in grad school. There’s a ton of recent books on analytics, ranging from pop culture ones like Freakonomics to the somewhat more scholarly Competing on Analytics. I’m expecting Analytics for Dummies to come out any time now.

Predictive modeling is used heavily in credit scoring — based on your current assets, spending habits and debt load, how likely are you to pay on time — and in the personal insurance business, but it hasn’t really hit the commercial insurance market yet. However, the insurance industry recognizes that this is the future, and all the big players are at least dabbling in it. Although a lot of them have previously considered this just as a way to do more consistent pricing, what they’re trying to do now is integrate the predictive models with business rules in order to drive results. This is helping to reduce the number of lost customers (by providing more competitive pricing), reduce expenses (by enabling straight-through processing), increase growth (by targeting new business areas), and increase profitability (by providing more accurate pricing).

He talked about how the targeting of insurance products is moving towards micro-segmentation, such as finding the 18-year-old male drivers who aren’t bad drivers, or the roofing companies with low accident rates, then selling to them at a better price than most insurance companies would offer to a broader segment, such as all 18-year-old male drivers or all roofers. He didn’t use the words long tail, but that’s what he’s talking about: this is the long tail of insurance underwriting. There’s so much data about everything that we do these days, both personal and business, that it’s possible to do that sort of micro-segmentation by gathering up all that data, applying predictive modeling to extract many more parameters from the data than would be considered in a manual evaluation, and developing the loss-predictive model that allows a company to figure out whether you’re a good risk or not, and what price to charge you in order to mitigate that risk. Violation of privacy? Maybe. Good insurance business? Definitely.

The result of all this is a segmented view of the market that allows a company to decide which parts it wants to focus on, and how to price any of those parts. Now it gets really interesting, because these models can be fed into the business rules in order to determine the price for any given policy: a non-negotiable price, much like Saturn does with its cars. This disintermediates both the agents and the underwriters in the sales process, since all of the decisions about what risks to accept and how to price the policies are automated based on the predictive models and the business rules. Rules can even be made self-optimizing based on emerging trends in the data, which I discussed in my presentation this morning, although this practice is not yet mainstream.
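Here's a sketch of that model-feeds-rules pipeline: a loss-predictive model scores the applicant, and the underwriting rules decide whether to accept the risk and what non-negotiable price to quote. The model, thresholds and pricing tiers are all hypothetical.

```python
def loss_model(applicant):
    # Stand-in for a real loss-predictive model; returns expected annual loss.
    base = 1200.0
    return base * (1.5 if applicant["prior_claims"] > 2 else 0.9)

def underwriting_rules(expected_loss):
    if expected_loss > 2500:
        return None                       # decline: risk outside appetite
    tier = "preferred" if expected_loss < 1200 else "standard"
    load = {"preferred": 1.25, "standard": 1.45}[tier]
    return round(expected_loss * load, 2)  # the quoted, non-negotiable price

print(underwriting_rules(loss_model({"prior_claims": 0})))  # 1350.0
```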

Lucker’s message is that business rules are what leverages the power of the predictive models into something that makes a difference for a business, namely, improving business processes: reducing manual processes and associated costs, enhancing service and delivery channels, targeting sales on profitable niches (that long tail), and improving point-of-sale decision-making at an agency.

He ended by describing a top-down approach for designing business rules: starting with organizational strategy, decomposing it to the functional areas (business operations, sales, customer service, distribution), then developing the business rules required to help meet the objectives of each area.

Forrester Day 2: The three B’s

I ended up skipping the session after mine at the end of the morning, but had some great hallway conversations with some of the business rules vendors who indicated that they think that I’m on track with what I’m saying about BPM and BR.

For the first of the afternoon sessions, I’m attending a panel discussion on the convergence of the three B’s — BI, BPM and BR — featuring Mike Gilpin (EA and application development), Boris Evelson (BI) and Colin Teubner (BPM). I covered a tiny bit of this topic in slides 22-24 of my presentation this morning, and will be doing a full-length presentation on this same topic at the Business Rules Forum next month in Orlando, so I’m interested to see if the Forrester analysts have the same thoughts on this subject as I do.

They start with the statement that “design for people, build for change” will drive the convergence of the three B’s. Interestingly, although a few people in the room stated that they use BPM and BI together, almost no one raised their hand for the combination of BPM and BR — a combination that I feel is critical to process agility. Gilpin went through a few introductory slides, pointing out that almost no business rules are explicitly defined; instead, they’re buried within processes and enterprise applications. He sees BI as driving effectiveness in businesses, and the combination of BPM and BR as driving efficiency.

Forrester will be publishing some reports about the convergence of the three B’s, and although there are some two-way combinations in vendor products now, there are no vendors that combine all three in a single product. I’m not sure that this is a bad thing: I don’t think that we necessarily want to see BR or BI become a part of BPM because it ultimately limits the usefulness of BR and BI. Instead, I see BR and BI as services to be consumed by BPM, with BI having the additional role of combining process execution statistics generated by the BPMS with other business data. An explicit question was asked about when to use the BR and BI included in the BPMS versus when to use a third-party best-of-breed BR or BI system; Teubner and Gilpin offered some guidelines for this as well as some examples of each situation, but it’s not completely clear if there’s a distinct boundary between when to use the BPMS’ in-built functionality versus the third-party specialist product.

My message on this topic is that BR is the key to process agility, and BI is the key to process visibility as well as feeding back into BR in order to further increase agility. By using the BR and BI functionality within your BPMS, however, you’re typically not getting full BR or BI functionality, but some limited subset that the BPMS vendor has selected to implement. Furthermore, you can’t reuse that functionality outside the BPMS, and in the case of business rules, a change to the BPMS’ rules often requires retesting and redeploying the process models, and does not apply to in-flight processes. However, if you’re not sure whether you need BI or BR (hint: you do), then using the in-built functionality in the BPMS gives you an easy-to-integrate and lower-cost way to get started.

Moving to a separate third-party business rules system gives you a couple of key advantages: you can reuse the same rules across different processes and across other applications in your enterprise, and changes to the rules impact in-flight processes, since a rule is not executed by the BRE until that point in the process is reached. Moving to a separate third-party business intelligence system also provides the advantage of being able to analyze the process data in the context of other business data, and potentially feed back the results of complex analytics to inform the business rules, which in turn drive the business processes.

The bottom line: BR and BI are used for many applications in the enterprise that are not explicitly process-related, or combine data from many systems of which the BPMS is just one source. For example, although there are processes embedded within your ERP system, your BPMS may not have direct access to all the information in those processes, and hence the BI that’s part of your BPMS can’t (easily) include that data in its analytics and reporting; a general-purpose BI platform may be much better suited to combining your BPMS statistics with your ERP statistics.
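To illustrate why an external rules service affects in-flight processes, here's a minimal sketch (all names invented, not any BRE's API): the rule is resolved by name at the moment the process step executes, so a rule published mid-flight governs every instance that hasn't yet reached that step.

```python
RULE_REPOSITORY = {}   # stand-in for the BRE's rule store

def publish_rule(name, fn):
    RULE_REPOSITORY[name] = fn

def evaluate(name, context):
    # Late binding: the rule is looked up at call time, not at process deploy time.
    return RULE_REPOSITORY[name](context)

publish_rule("credit_check", lambda c: c["score"] >= 600)

# ... a process instance starts and does other work; meanwhile the business
# publishes a stricter rule before the instance reaches its decision step:
publish_rule("credit_check", lambda c: c["score"] >= 650)

print(evaluate("credit_check", {"score": 620}))  # False: the new rule applies
```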

A lot of the conversation in this session, which was very interactive with the audience members, was around whether to use converged products versus separate products. It’s not a completely simple answer, and I’ll definitely be thinking about the use case boundaries between converged and separate products before I show up at the Business Rules Forum to continue this discussion.

Evelson and Teubner will be publishing an initial paper in this area in the next few weeks, using the concepts that they’ve presented here today, but see it as a springboard for more discussion in this area rather than an end-point.

BPM Think Tank Day 3: Colin Teubner

Colin Teubner of Forrester gave us a lunchtime presentation, hence my notes were on paper and it’s taken a bit of time to transcribe them. However, I’m on a roll to get all my Think Tank coverage wrapped up today so that I can take 4 days off for the holiday weekend.

Colin’s talk was on issues, challenges and trends in BPM, and the general opinion around my lunch table is that it was a bit lightweight, although a reasonable summary of the current state of affairs. I certainly don’t envy him the task of speaking over the clanking of cutlery and buzz of other conversations as people eat their lunch.

He sees that a maturing of tools and practitioners is pulling more tool types into BPM, particularly a convergence of BPM and BI, and a convergence of content management, collaboration and human-centric BPM. Seeing as how we’ve only just managed to pry content management and human-centric BPM apart, I’m not sure the latter is good news. As he pointed out, BPM is more than modelling and automation, although a lot of projects (and products) get stuck there and don’t do the monitor/manage/optimize parts very well.

He returned to the discussion on BI and BPM that came out of the previous day’s roundtable that he led:

  • BI on a process
  • BI triggering a process
  • BI affecting a process (e.g., event)
  • BI inside a process decision
  • BI inside a human task assignment (inform rather than automate decision)
  • BI to help humans with process work
  • BI to predict the future of process work

BI is positioned as making data actionable. Data-driven BI is bereft of process, and focussed on reports and presentation. Process-centric BI (mainly from BPM vendors) has awareness of BPM and the processes; there may be a tie-in with BPRI although there’s no standard linkage between process models and runtime data that could be consumed by a 3rd party BI product. No BI vendors are doing real BPM-aware BI yet.

He then discussed collaboration and information, showing that BPM is typically only used for the structured part of processes. Interestingly, he just redivided the BPM marketplace into ad hoc/collaborative, production and integration workflow, which is where we were 7 years ago before this all got lumped together as BPM. The future of BPM is a 360-degree view of business processes; the main barrier to that now is that there’s no collaboration in BPM products and no process management in collaboration products. Some BPM vendors are starting to pull in collaborative functions, such as discussion threads, process wikis, email notifications, embedded analytics, dynamic task support, and portal integration.

A few wrapup points on what all this means:

  • BPM vendors must partner to integrate with other functions, such as content management
  • Standards are essential to driving the integration partnerships
  • End users need to think about process, collaboration and ECM together, not as separate issues

Can RSS replace trickle feeds for BI?

I was having a conversation late last week with a SaaS BI vendor about how organizations get data into their online data warehouse (ftp seems to be the most popular method), when it struck me: why couldn’t they use an RSS feed (or Atom, for that matter) from a transactional system to feed data into the data warehouse for BI purposes? Near-real-time data is essential for many types of BI analysis, so there has to be something better than once-daily uploads.
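A minimal sketch of the idea: poll an Atom/RSS feed exposed by a transactional system and upsert new entries into a warehouse staging table. The feed URL and table schema are hypothetical; feedparser is a real library (pip install feedparser), and sqlite3 stands in here for the warehouse.

```python
import sqlite3
import feedparser

db = sqlite3.connect("warehouse.db")
db.execute("""CREATE TABLE IF NOT EXISTS staging_orders
              (entry_id TEXT PRIMARY KEY, updated TEXT, payload TEXT)""")

feed = feedparser.parse("https://erp.example.com/orders.atom")  # hypothetical feed
for entry in feed.entries:
    # INSERT OR REPLACE keyed on the entry id makes repeated polling idempotent.
    db.execute("INSERT OR REPLACE INTO staging_orders VALUES (?, ?, ?)",
               (entry.get("id", entry.get("link", "")),
                entry.get("updated", ""),
                entry.get("summary", "")))
db.commit()
```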

LucidEra on AppExchange

Back in March, I spoke with Ken Rudin of LucidEra about their product launch, and this week I followed up on what’s been going on since then.

The big news is that LucidEra is now available through Salesforce.com’s AppExchange. Although Salesforce.com’s reporting capabilities might be enough if you’re just reporting on the data that you have in their system, LucidEra lets you combine your Salesforce.com data with your corporate data for more of a full business intelligence solution. All of this is hosted, of course, which makes it much more accessible (financially and from an internal IT skills standpoint) for small and medium businesses who want the advantages of BI but can’t afford to play in the BI big leagues.

Rudin was previously with Salesforce.com; combine that with the fact that the first (and, so far, only) LucidEra vertical solution is forecast-to-billing, and that a half dozen of their customers are also Salesforce.com customers, and it’s no real surprise to see LucidEra appear on AppExchange. It’s a natural partnership, and should be of real benefit to some of the Salesforce.com customers who need a bit more in the way of reporting and analytics. Salesforce.com led the first wave of SaaS applications, with a focus on transactional applications that were capturing information; LucidEra is focussed on getting that data out through analytics.

A key feature of LucidEra, whether it’s used in the Salesforce.com context or not, is that you can combine data from a number of different sources, both other hosted data sources and your own on-premise systems. LucidEra becomes your data mart for reporting and analytics, and you just need to get the data in there. They provide automated extract-transform-load for a number of common financial systems, and have some customizable scripts that can be used for systems that they don’t support directly, which may use an Excel spreadsheet as an interim part of the ETL process.
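Purely for illustration, here's a tiny custom ETL script of the kind described, using an Excel export as the interim format; the file names and column mapping are invented, and LucidEra's actual scripts are of course their own. pandas is a real library (reading .xlsx also requires openpyxl).

```python
import pandas as pd

raw = pd.read_excel("gl_export.xlsx")                            # extract
raw = raw.rename(columns={"Acct": "account", "Amt": "amount"})   # transform
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")
raw.dropna(subset=["amount"]).to_csv("upload.csv", index=False)  # load (e.g. via ftp)
```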

SMBs have never really had the opportunity to do any advanced business intelligence before; companies like this are full of Excel spreadsheets hacked together to mimic a subset of BI functionality. In that situation, the CFO (or any senior manager) is always 6 or 7 reports away from what they really need, and with data manually extracted and massaged in spreadsheets, it can take hours or days to drill down into the existing data. In many cases, they could greatly benefit from having BI, but the strategic advantage is not in having that as an in-house solution; it’s in having the analysis capability and knowing what to do with it. LucidEra gives them a chance to experience better data integration and better data analysis, all in a way that can be easily shared amongst people in the organization. Furthermore, it can be done by the business managers themselves, without relying on IT once the data ETL process into LucidEra’s forecast-to-billing solution is set up.

Forecast-to-billing is still the only vertical solution available, although LucidEra obviously wants to create an environment that will encourage other companies to build solutions on their underlying platform. They will be working on other solution applications, and also plan to extend forecast-to-billing to cover everything from the beginning of the cycle (prospects) to the end (cash). They’re also working on some time-based analysis functionality, although Ken wouldn’t talk much about that.

Since our first conversation in March, they have their first customers up and running, with more on the way. Being on AppExchange should certainly help that along.

I believe that SaaS is the way to go for much of the IT infrastructure for SMBs. If I were running a 40-person company now, as I did 7 years ago, I’d be using hosted solutions as much as possible to reduce the internal IT footprint, and therefore costs.