Business Rules Forum: James Taylor and Neil Raden keynote

Opening the second conference day, James Taylor and Neil Raden gave a keynote about competing on decisions. First up was James, who started with a definition of what a decision is (and isn’t), speaking particularly about the operational decisions that we often see in the context of automated business processes. He made a good point that your customers react to your business decisions as if they were deliberate and personal to them, when often they’re not; James’ premise is that you should be making these decisions deliberate and personal, providing the level of micro-targeting that’s appropriate to your business (without getting too creepy about it), but that there’s a mismatch between what customers want and what most organizations provide.

Decisions have to be built into processes and systems that manage your business, so although business may drive change, IT gets to manage it. James used the term “orthogonal” when talking about the crossover between process and rules; I used the same expression in a discussion with him yesterday about how processes and decisions should not be dependent upon each other: if a decision and a process are interdependent, then you’re likely dealing with a process decision that should be embedded within the process, rather than a business decision.

A decision-centric organization is focused on the effectiveness of its decisions rather than aggregated, after-the-fact metrics; decision-making is seen as a specific competency, and resources are dedicated to making those decisions better.

Enterprise decision management, as James and Neil now define it, is an approach for managing and improving the decisions that drive your business:

  • Making the decisions explicit
  • Tracking the effectiveness of the decisions in order to improve them
  • Learning from the past to increase the precision of the decisions
  • Defining and managing these decisions for consistency
  • Ensuring that they can be changed as needed for maximum agility
  • Knowing how fast the decisions must be made in order to match the speed of the business context
  • Minimizing the cost of decisions

Using an airline pilot analogy, he discussed how business executives need a number of decision-related tools to do their job effectively:

  • Simulators (what-if analysis), to learn what impact an action might have
  • Auto-pilot, so that their business can (sometimes) work effectively without them
  • Heads-up display, so they can see what’s happening now, what’s coming up, and the available options
  • Controls, simple to use but able to control complex outcomes
  • Time, to be able to take a more strategic look at their business

Continuing on the pilot analogy, he pointed out that the term dashboard is used in business to really mean an instrument cluster: display, but no control. A true dashboard must include not just a display of what’s happening, but controls that can impact what’s happening in the business. I saw a great example of that last week at the Ultimus conference: their dashboard includes a type of interactive dial that can be used to temporarily change thresholds that control the process.
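
To make the display-plus-control idea concrete, here’s a minimal sketch in Python (all names invented, and nothing to do with Ultimus’ actual implementation) of a threshold store that process logic reads on every use, so that a dashboard dial writing to it actually changes process behavior:

```python
# Minimal sketch of a dashboard control that writes back to the process,
# not just displays it. All names here are hypothetical.
import threading

class ThresholdStore:
    """Shared, runtime-adjustable thresholds that process logic reads on each use."""
    def __init__(self, **initial):
        self._lock = threading.Lock()
        self._values = dict(initial)

    def get(self, name):
        with self._lock:
            return self._values[name]

    def set(self, name, value):  # called by the dashboard's "interactive dial"
        with self._lock:
            self._values[name] = value

thresholds = ThresholdStore(approval_limit=10_000)

def route_invoice(amount):
    """A process step whose routing depends on a live threshold."""
    if amount > thresholds.get("approval_limit"):
        return "manager_approval"
    return "auto_approve"

# A manager turns the dial during a month-end volume spike:
thresholds.set("approval_limit", 25_000)
print(route_invoice(15_000))  # now "auto_approve" instead of "manager_approval"
```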

James turned the floor over to Neil, who dug further into the agility imperative: rethinking BI for processes. He sees today’s BI tools as insufficient for monitoring and analyzing business processes, because of the agile and interconnected nature of those processes. This comes through in the results of a survey that they did on how often people use the related tools: the average marketing analyst spends 1.2 hours per week in their BI tool, versus 17.4 in Excel, 4.2 in Access and 6.2 in other data administration tools. I see Excel everywhere in most businesses, whereas BI tools are typically only used by specialists, so this result does not come as a big surprise.

The analytical needs of processes are inherently complex, requiring an understanding of the resources involved and the process instance data, as well as the actual process flow. Processes are complex causal systems: much more than just that simple BPMN diagram that you see. A business process may span multiple automated (monitored) processes, and may be created or modified frequently. Stakeholders require different views of those processes; simple tactical needs can be served by BAM-type dashboards, but strategic needs — particularly predictive analysis — are not well served by this technology. This is beyond BI: it’s process intelligence, where you must understand the other factors affecting a process, not just measure the aggregated outcomes. He sees process intelligence as a distinct product type, not the same as BI; unfortunately, the market is being served (or not really served) by traditional query-based approaches against a relatively static data model, or what Neil refers to as a “tortured OLAP cube-based approach”.

What process intelligence really needs is the ability to analyze the timing of the traffic flow within a process model in order to provide more accurate flow predictions, while allowing for more agile process views that are generated automatically from the BPMN process models. The analytics of process intelligence are based on the process logs, not pre-determined KPIs.
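
As a rough illustration of what “analytics based on the process logs” might look like, here’s a minimal sketch that derives activity-to-activity timings from a hypothetical event log; real process intelligence tools obviously do far more, but the raw material is the same:

```python
# Sketch: deriving flow timings from raw process logs rather than
# pre-determined KPIs. The log format here is hypothetical.
from collections import defaultdict
from statistics import mean

# Each event: (instance_id, activity, timestamp_in_seconds)
events = [
    ("case1", "Receive", 0),   ("case1", "Review", 3600), ("case1", "Approve", 5400),
    ("case2", "Receive", 100), ("case2", "Review", 7300), ("case2", "Approve", 7900),
]

# Group events by process instance, in timestamp order.
by_case = defaultdict(list)
for case, activity, ts in events:
    by_case[case].append((ts, activity))

# Measure every observed activity-to-activity transition.
durations = defaultdict(list)
for case, trace in by_case.items():
    trace.sort()
    for (t0, a0), (t1, a1) in zip(trace, trace[1:]):
        durations[(a0, a1)].append(t1 - t0)

# Average transition times are the basis for flow predictions on in-flight work.
for edge, times in durations.items():
    print(edge, mean(times))
```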

Neil ended by tying this back to decisions: basically, you can’t make good decisions if you don’t understand how your processes work in the first place.

It’s interesting that James and Neil cover two very important aspects of business processes between them: James covers decisions, and Neil covers analytics. I’ve done presentations in the past on the crossover between BPM, BRM and BI, but they’ve dug into these concepts in much more detail. If you haven’t read their book, Smart Enough Systems, there’s a lot of great material in there on this same theme; if you’re here at the forum, you can pick up a copy at their table at the expo this afternoon.

Business Rules Forum: Vendor Panel

All the usual suspects joined a panel at the end of the day to discuss the vendor view of business rules: Pegasystems, InRule, Corticon, Fair Isaac, ILOG (soon to be IBM) and Delta-R, moderated by John Rymer of Forrester.

The focus was on what’s happening in the rules market, especially in light of the big guys like SAP and IBM joining the rules fray. Most of them think that it’s a good thing to have the large vendors in there, because it raises the profile of rules as a technology and validates the market; the smaller players can likely innovate faster, so they can still carve out a reasonable piece of the market. Having seen exactly this same scenario play out in the BPM space, I think that they’re right about this.

The ILOG/IBM speaker talked about the integration of business rules and BPM as a primary driver — which of course Pega agreed with — but also the integration of rules, ETL and other technologies. Other speakers discussed the importance of decision management as opposed to just rules management, especially with regard to detecting and ameliorating (if not actually avoiding) situations like the current financial crisis; the use of predictive analytics in the context of being able to change decisions in response to changing conditions; and the current state of standards in rules management. There was a discussion about the difference between rules management and decision management, which I don’t believe answered the question with any certainty for most of the audience: when a speaker says “there’s a subtle but important difference” while making hand motions but doesn’t really elaborate, you know that you’re deep in the weeds. The Delta-R speaker characterized decision management as rules management plus predictive modeling; I think that all of the vendors agree that decision management is a superset of rules management, but there are at least three different views on what makes up that superset.

As a BPM bigot, I see rules as just another part of the services layer; I think that there’s an opportunity for BRM in the cloud to be deployed and used much more easily than BPM in the cloud (making a web services call from a process or app to an external rules system isn’t very different from making a web services call to an internal rules system), but I didn’t hear that from any of the vendors.
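
For what it’s worth, here’s a minimal sketch of what I mean: the calling code is identical whether the rules service is inside the firewall or in the cloud, with only the endpoint URL changing. The JSON-over-HTTP contract here is invented, not any particular vendor’s API:

```python
# Sketch: from the caller's side, a cloud-hosted rules service looks the same
# as an internal one; only the endpoint URL changes. The service contract
# (JSON over HTTP) is hypothetical.
import json
import urllib.request

def evaluate_decision(endpoint, payload):
    """POST the decision inputs to a rules service and return its verdict."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Internal deployment vs. BRM-in-the-cloud: the process code is identical.
INTERNAL = "http://rules.internal.example.com/decide"
CLOUD = "https://brm.example-saas.com/decide"
# decision = evaluate_decision(CLOUD, {"customer_id": 42, "amount": 15000})
```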

That’s it for the day; I know that the blogging was light today, but it should be back to normal tomorrow. I’m off to the vendor expo to check out some of the products.

Business Rules Forum: Mixing Rules and Process

I had fun with my presentation on mixing rules and process, and it was a good tweetup (meeting arranged via Twitter) opportunity: Mike Kavis sat in on the session, Miko Matsumura of Software AG caught up with me afterwards, and James Taylor even admitted to stepping in for the last few minutes.

I’ve removed most of the screen snapshots from the presentation since they don’t make any sense without the discussion; the text itself is pretty straightforward and, in the end, not all that representative of what I talked about. I guess you just had to be there.

Business Rules Forum: Ron Ross keynote

The good news is that it’s a lovely sunny, breezy and cool day: perfect fall weather for Toronto. The bad news is that I’m in Orlando, and was hoping to wear shorts more than sweaters this week. However, I’m here to attend — and speak at — the Business Rules Forum, not sit by the pool.

Ron Ross started the conference with a keynote called From Here to Agility; agility, of course, is one of the key reasons that you consider implementing business rules, whether in the context of BPM or other applications. It’s pretty well attended — probably 200 people here at the opening keynote, and likely a lot of vendors off setting up their booths for later today.

He started with a couple of case studies: companies that could really use rules due to the lack of agility in their legacy systems, and companies that have successfully implemented rules and achieved their ROI on the first project. He then looked at what might be motivating people to attend this conference and what they can expect; a bit of an unnecessary sales pitch, considering that these people are already here.

He talked about the importance of decisioning, and how it’s a much better opportunity for business improvement than process; I’d have to agree that it’s a much greater contributor to agility, but not necessarily a better opportunity for improvement overall. I’ll have to think that through before my presentation this afternoon on mixing rules and process. He did have some convincing quotes from Tom Davenport’s “Competing on Analytics”, such as Davenport’s conclusion that automated decisioning will be the next competitive battleground for organizations.

The goals for creating business agility:

  • No artificial constraints in the representation of business products and your capacity to deliver them to customers — this is primarily a cultural issue, including a vocabulary to define your business practices, not a technical issue.
  • All operational business practices represented as rules.
  • All rules in a form such that they can be readily found, analyzed, modified and redeployed by qualified business people and product specialists.

Examples of operational business decisions:

  • How do we price our product for this transaction?
  • What credit do we give to this customer at this point in time?
  • What resource do we assign to this task right now?
  • Do we suspect fraud on this particular transaction?
  • What product configuration do we recommend for this request?
  • Can we confirm this reservation?

Note that these really are low-level, moderate complexity operational decisions, not strategic decisions: thousands or even millions of these decisions may be made every day in your business processes, and having agility in this type of decision can provide significant agility and competitive differentiation.

James Taylor and Neil Raden will be here later to talk about enterprise decision management (EDM), but Ron gave us some of the basics: closed-loop decisioning that captures data about decisions, analyzes that data, then uses those results to change the operational decisions in a timely manner. The “in a timely manner” part of that is where business rules come in, of course. That round trip from analysis to deployment to execution to capture is key: we talk about it in BPM, but the analysis and deployment parts often require a great deal of an analyst’s time in order to determine the necessary improvements.
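
A toy sketch of that closed loop, with an invented credit rule and adjustment policy, just to show the capture-analyze-adjust cycle in code:

```python
# Toy sketch of closed-loop decisioning; the credit rule and the
# adjustment policy are invented purely for illustration.
records = []  # captured decisions: (cutoff_used, approved, defaulted)

def decide(credit_score, cutoff):
    """Execute the operational decision."""
    return credit_score >= cutoff

def capture(cutoff, approved, defaulted):
    """Capture the decision and its eventual outcome for later analysis."""
    records.append((cutoff, approved, defaulted))

def analyze_and_adjust(cutoff):
    """Learn from the past: if too many approved loans defaulted,
    tighten the cutoff for future decisions."""
    approved = [r for r in records if r[1]]
    if not approved:
        return cutoff
    default_rate = sum(1 for r in approved if r[2]) / len(approved)
    return cutoff + 20 if default_rate > 0.05 else cutoff

cutoff = 620
approved = decide(640, cutoff)
capture(cutoff, approved, defaulted=True)  # outcome observed later
cutoff = analyze_and_adjust(cutoff)        # redeploy the updated decision
```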

He went on to talk in more detail about why a focus on “business process” isn’t enough: it doesn’t make the business adaptive, doesn’t create consistent and reusable rules, and falls short on a number of other factors that are better served by business rules. To achieve business agility, then, he feels that you need:

  • Business-level rule management: having the business make changes to rules
  • Business-level change deployment: having the business in charge of the governance process for changing and rolling out changes to rules
  • Business-level organizational function to support the previous two activities

To find the problem decisions in existing legacy systems, look for redundant, overlapping and conflicting rules; these may manifest as data quality problems, frequent change requests, or customer service problems. In many cases, these conflicting rules run on different platforms and address different channels. The key is to externalize the rules from the legacy systems into a decision service: a business rules management system that maintains the rules repository and is available to any application via a standard web services interface. This allows for a gradual transition from rules embedded within the legacy systems to a common repository that ensures consistent results regardless of channel or application. That provides consistency across channels, selective customer treatment and competitive time-to-market, as well as rather painless compliance, since your policies are embedded within the rules themselves and the rules management system can track which rules are executed at any given point in time.

Now, think of your BPMS as your legacy system in the context of the above paragraph…
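
To make the externalization concrete, here’s a minimal sketch of such a decision service: the rules live in one repository, any channel or application (including your BPMS) calls the same interface, and every evaluation is logged. The rules, route and framework choice (Flask) are mine, purely for illustration:

```python
# Sketch of a decision service: rules externalized from legacy apps into one
# repository, exposed over a standard web interface so every channel gets the
# same answer. Real BRMSs also version rules and record which rules fired.
from flask import Flask, jsonify, request

app = Flask(__name__)

# The "repository": one place to change a policy for all channels.
RULES = {
    "max_unsecured_credit": 25_000,
    "min_score_for_credit": 600,
}

@app.route("/decisions/credit", methods=["POST"])
def credit_decision():
    data = request.get_json()
    approved = (
        data["credit_score"] >= RULES["min_score_for_credit"]
        and data["amount"] <= RULES["max_unsecured_credit"]
    )
    # Logging each evaluation is what makes compliance reporting painless.
    app.logger.info("credit decision: %s -> %s", data, approved)
    return jsonify({"approved": approved, "rules_version": "2008-10-27"})

if __name__ == "__main__":
    app.run(port=8080)
```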

Logistics: no wifi (there is wifi in the conference area but BRF didn’t spring for the password), requiring a trip to the lobby or my room in order to post — obviously, that will delay things somewhat. No power at the tables, which is not a big deal since I don’t use a lot of power with the wifi off. My blogging will be a bit light today until after my presentation this afternoon.

HD antenna

For those of you in the conversation at last week’s after-conference drinks about HD digital over-the-air (OTA) antennae, and how my husband built one out of a salad spoon and tin foil, here are the details (on his blog).

And yes, for those of you who read his text, he really did make a working antenna out of a tea strainer and a metal tape measure, but I was laughing too hard to take the picture.

FeedBurner kills my feed. Again.

When I switched to the new Google-hosted version of FeedBurner (which everyone using FeedBurner will soon be forced to convert to), they screwed up my feed, causing my subscriptions to drop by about 20%.

Since then, my numbers have come back to around what they were — presumably through organic growth from the people who figured out that it wasn’t updating and re-added me to their feed reader — until yesterday, when they dropped by 25%, and even further today. Grrrrrrrr.

Even worse, my feed hasn’t updated in my own Google Reader since my first post of the day, so even if you are still subscribed, you may not be seeing the posts in a timely manner.

If you’re a victim of this, of course you won’t know, you’ll just think that I’m not blogging. If you check on my site directly and see this, try removing and re-adding my feed to your reader, hopefully that will fix it. At least until the next time GoogleBurner screws up.

Update: After two days of the “25% off sale”, subscriptions jumped back to normal.

Ultimus: V8 technical demo

I got wrapped up in a discussion at the break that had me arriving late to the last session of the day; Steve Jones of Ultimus is going through many of the technical underpinnings of V8 for designers and developers, particularly those that are relevant to the people in the audience who will be upgrading from those old V7 systems soon.

V8 has a nice way to integrate with web services, where the WSDL can be interrogated and a data structure matching the interface parameters created directly from it; most other systems that I’ve seen require that you define the process parameters explicitly, then map from one to the other. Of course, there are lots of cases where you don’t want a full representation of the web services interface, or where you want to filter or combine parameters during the mapping, but this gives you the option of setting up a lot of web services calls really quickly.
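
Here’s a rough sketch of the “interrogation” step at the API level, using only the Python standard library; it reads the WSDL’s message definitions and builds a matching parameter structure. A real tool would also walk the XML Schema types, and none of this is Ultimus’ actual mechanism:

```python
# Sketch of "interrogating" a WSDL: read the message definitions and build a
# matching parameter structure automatically. This only reads top-level
# <wsdl:message> parts; production tools also resolve the schema types.
import xml.etree.ElementTree as ET

WSDL_NS = "{http://schemas.xmlsoap.org/wsdl/}"

def message_signatures(wsdl_path):
    """Map each WSDL message name to its part names and types/elements."""
    root = ET.parse(wsdl_path).getroot()
    sigs = {}
    for msg in root.findall(f"{WSDL_NS}message"):
        parts = {
            p.get("name"): p.get("element") or p.get("type")
            for p in msg.findall(f"{WSDL_NS}part")
        }
        sigs[msg.get("name")] = parts
    return sigs

# for name, parts in message_signatures("service.wsdl").items():
#     print(name, parts)
```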

The integrated rules editor allows you to drag and drop process variables — including recipients — onto a graphical decision tree; you don’t have the full power of a business rules system, but this may be enough for a lot of human-centric processes where most of the complex decisions in the process are made by people rather than the system.

For interfacing with any of the external components, such as the email connector or a form, it’s possible to drag and drop data fields from the process instance schema or org chart/Active Directory directly onto the variables for that component, which is a pretty intuitive way to make the link between the data sources and the external calls. They’ve also eliminated some of the coding required for things like getting the current user’s supervisor’s email address, which used to require a bit of code in V7.

Ultimus provides a virtual machine with the software pre-installed as part of their training offerings, which is a great way to learn how to work with all of this; I don’t understand why more vendors don’t provide this to their customers.

I looked back at some old notes from early 2007 when I had a demo of Ultimus V7; my impression at that time was that it was very code-like, with very little functionality appropriate for business analysts; V8 looks like a significant improvement over that. They’re still behind the curve relative to many of their competitors, but that’s not completely surprising considering their management upheavals over the past year. If you’re a pure Microsoft shop, however, you’ll likely be willing to overlook some of those issues; Forrester placed Ultimus in the leaders sector (in an admittedly small field) in their report on human-centric BPM on Microsoft platforms. In the broader market of all BPM vendors, Gartner placed them in the visionaries quadrant: good completeness of vision, but not quite enough ability to execute to make it into the leaders quadrant, although this latter assessment seemed to be based on the performance of the previous management team.

Steve spent a bit of time showing the V8 end-user interface: reconfigurable columns in task lists, including queries and filters; shared views to allow a personal view to be shared with another user (and allow that other user to complete work on your behalf); and the ability to run reports directly out of the standard user environment, not a separate interface.

They’ve also done some performance improvements, such as moving completed process instances to a separate set of tables (or even archived out to another database) for historical reporting without impacting the performance of work in progress.

That’s it for me for the conference (and the week); tonight, we’ll be down by the Riverwalk drinking margaritas while listening to a Mariachi band. Tomorrow is an Ultimus partner day and I’ll be on an early morning flight home. Next week, I’ll be at the Business Rules Forum in Orlando, where I’m giving a presentation on mixing rules and process. The following week, I’m headed to Miami for the Software AG analyst/blogger roundtable and a day at their user conference, a late addition to my schedule.

Ultimus: Process optimization

Chris Adams is back to talk to us about process optimization, both as a concept and in the context of the Ultimus tools available to assist with it. I’m a bit surprised by the tone/content of this presentation, in which Chris is explaining why you need to optimize processes; I would have thought that anyone who has bought a BPMS probably gets the need for process optimization.

The strategies that they support:

  • Classic: updating your process and republishing it without changing work in progress
  • Iterative: focused and more specific changes updating live process instances
  • Situational/temporary: managers changing the runtime logic (really, the thresholds applied using rules) in live processes, such as changing an approval threshold during a month-end volume increase
  • Round-trip optimization: comparing live data against modeling result sets in simulation

There are a number of tools for optimizing and updating processes:

  • Ultimus Director, allowing a business manager to change the rules in active processes
  • Studio Client, the main process design environment, which allows for versioning each artifact of a process; it also allows changes to be published back to update work in progress
  • iBAM, providing visibility into work in progress; it’s a generic dashboarding tool that can also be used for visualization of other data sets, not just Ultimus BPM instance data

He finished up with some best practices:

  • Make small optimizations to the process and update often, particularly because Ultimus allows for the easy upgrade of existing process instances
  • Use Ultimus Director to get notifications of
  • Use Ultimus iBAM interactive dials to allow executives to make temporary changes to rule thresholds that impact process flow

There was a great question from the audience about the use of engineering systems methodology in process optimization, such as the theory of constraints; I don’t think that most of the vendors are addressing this explicitly, although the ideas are creeping into some of the more sophisticated simulation products.

Ultimus: Customer roundtable

I’m in a roundtable session with one brand new Ultimus customer, one who’s six months in and just getting their first processes rolled out, and one who’s done a lot of processes already. As was requested previously, and again not-so-subtly by Chris Heivly, who is moderating this session, I won’t be documenting the names of the customers or the details.

Instead, I am forced to blog about my cat:

She’s not much into BPM, but you can see that she’s a bit of an efficiency expert.

Seriously, though, there were a few good nuggets in the session that won’t tell any tales out of school:

  • Product demos aren’t enough to bridge the gap to understanding what BPM can do for you: you need to prototype your processes in a product to really understand it.
  • Going paperless is a huge cultural challenge, but once the users get used to it, they wouldn’t give it up.
  • HR onboarding provides a good opportunity for automation, and can justify the cost of the BPM system alone in a larger organization.
  • Implementing one relatively straightforward process up front can be used to help bolster the business case for additional implementations. One example that we heard was implementing the approval process for the requirements for the processes that you’re there to build in the first place.
  • Sometimes it’s worthwhile to implement the current process without reengineering — pave the cowpaths, as it were — in order to start gathering statistics on the bottlenecks in the process and highlight the potential for improvements.
  • The center of excellence approach is critical for rolling out a large number of processes efficiently, using a small core of dedicated resources, then moving other people in and out of the team as their skills were required.
  • A rapid, agile-like approach with a minimum of structured requirements works well, especially for getting the initial happy path process up and running; you can go back and fill in the exception cases later in the design and implementation cycle.