Gartner Day 2: Jim Sinur panel

This afternoon, Jim Sinur hosted a panel on Implementing an Enterprise-Transforming BPMS, which included Jeff Akin from American Home Shield, Alan Jones from Sandisk, Craig Edmonds from Symetra Financial and Jodi Starkman-Mendelsohn of West Park Assessment Centre.

American Home Shield’s goal was to double their revenue by 2010 with limited growth in their service centres, which they planned to accomplish by replacing older systems with more agile ones and moving towards a more process-centric view. They’ve just rolled things out so aren’t seeing the ROI yet, but are seeing more consistent customer handling and enforcement of best practices. They’re implementing Pegasystems as their BPMS.

Symetra’s objective was to improve customer satisfaction, since they recognize that it’s much easier to keep a customer than to get a new one, and they used goal management as their approach when building processes. They did what appears to be a fairly standard imaging+workflow type of implementation using Global 360, although built with today’s BPM technology, which provides greater agility than the older workflow systems. They’ve seen huge ROI numbers, and have increased levels of customer service in terms of transaction turnaround times.

Sandisk has deployed 4 mission-critical BPM applications using Handysoft, starting with the purchase requisition process, which was paper-based and not scalable. Their goal was to improve employee efficiency by improving the approval cycle time and reducing processing costs. Like American Home Shield, they considered different classes of solutions, including a module in their ERP system and online forms, before finally selecting a BPMS. They reduced the processing cycle time from 3 weeks to 1 week, and saw a number of other advantages.

West Park Assessment Centre needed to bolster their IT infrastructure to allow them to grow, and improve the quality of their services such as scheduling. They also wanted to achieve cost savings with a 3-year ROI, improve productivity of remote users and improve operating efficiencies. They wanted to automate their processes from the point that a referral arrived (regardless of channel) through scheduling, booking, reporting, invoicing and all the other tasks that are involved in providing their services. They went live in late 2002 using Ultimus, just in time for the SARS outbreak in early 2003 that locked them out of their hospital-based offices in Toronto. With no access to their physical records, or any space to provide assessment services, they set up shop in a local hotel and were up and running within two business days due in no small part to their BPM implementation — effectively preventing total business failure. They did get their 3-year ROI and reduced turnaround time by 27%; these efficiencies have increased their profitability. By externalizing their business rules and logic in the BPMS, they have improved their agility to the point where they can make changes to their systems within a couple of days.

Although I like to hear the customer case studies, I find these panels to be a pretty artificial construct: it’s like 4 mini-presentations by customers, with a few questions from Sinur at the end of each section and joint questions from the audience at the end, but no interaction between the panelists. I’d really like to see fewer canned presentations and more conversation between the panelists.

Gartner Day 2: Bill Gassman

The afternoon started with several simultaneous sessions by Gartner analysts, and I sat in on Bill Gassman talking about Measuring Processes in Real Time, or as he put it later, learning to live in real time.

There’s no doubt that process visibility is a key benefit gained from BPM, and that visibility usually occurs through the integration of business intelligence (BI) or business activity monitoring (BAM) tools to assist in process monitoring. The goal of BAM is to monitor key objectives, anticipate operational risks, and reduce latency between events and actions, and there are a number of different channels for funneling this information back to those who need to know, such as business intelligence systems for predictive modelling and historical reports, real-time dashboards, and alerts.

So what’s the difference between BI and BAM? According to Gassman, BI is used for insight and planning, and is based on historical — rather than real-time — data. BAM is event driven, and issues alerts when events occur. Personally, I think that there’s a spectrum between his definitions of BI and BAM, and it’s not clear to me that it’s a useful distinction; in many cases, data is trickle-fed from operational systems to BI systems so that the data is near-real-time, allowing dashboards to be driven directly from the BI system. True, traditional BI tools will typically see update intervals more like 15 minutes than the near-real-time event alerts that you’ll find in BAM, but that’s not a problem in some cases.
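
To make that distinction concrete, here’s a toy sketch (entirely my own illustration, not from Gassman’s session, with invented metric names): the BAM-style function evaluates each event as it arrives and alerts immediately, while the BI-style function rolls history up into 15-minute buckets for after-the-fact reporting.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Event:
    ts: datetime          # when the operational event occurred
    queue_depth: int      # a sample metric from the process engine

def bam_alerts(events, threshold):
    """BAM style: evaluate every event as it arrives, alert with minimal latency."""
    return [e.ts for e in events if e.queue_depth > threshold]

def bi_report(events):
    """BI style: aggregate history into 15-minute buckets for later analysis."""
    buckets = {}
    for e in events:
        bucket = e.ts.replace(minute=e.ts.minute - e.ts.minute % 15,
                              second=0, microsecond=0)
        buckets.setdefault(bucket, []).append(e.queue_depth)
    return {b: sum(v) / len(v) for b, v in buckets.items()}
```

Trickle-feeding the BI side narrows the latency gap between the two, which is exactly why I see a spectrum rather than a hard boundary.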

Gassman discussed the different real-time analytic techniques that are widely used today: process monitoring, logistics optimization (often based on optimizing delivery times while minimizing penalties), situational awareness, pattern matching (complex event processing, or CEP), track and trace (typically used for B2B processes), and comparison between predictions and reality.

Gartner found in a survey 18 months ago that half of their customers surveyed don’t use BAM, claiming that they don’t use it because they don’t really know about it. Considering that BI has long been a technology that can be cost-justified in an extremely short time-frame, and BAM follows the same ROI patterns, I find this surprising (and I had the feeling that they were a bit surprised, too), although I have had large customers who fall into the same category.

Looking at it from a BPM standpoint, automating a process without having appropriate monitoring is risky business: there’s a business value to awareness of what’s happening in your processes, so that problems are detected early, or possibly before they even occur. There’s a natural synergy between BPM and BAM: BPM generates a mound of process instance data, often in an event-driven manner, that just begs to be analyzed, aggregated, sliced and diced.
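
As a trivial illustration of that slicing and dicing (my own sketch, with invented field names, not any vendor’s API), here’s how raw process-instance events might be rolled up into per-step cycle times:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical event log emitted by a BPMS: one record per completed step,
# as (instance_id, step_name, start_seconds, end_seconds).
event_log = [
    ("A1", "data entry", 0, 120),
    ("A1", "approval", 120, 600),
    ("A2", "data entry", 30, 90),
    ("A2", "approval", 90, 1500),
]

def cycle_time_by_step(log):
    """Aggregate raw instance events into average duration per process step."""
    durations = defaultdict(list)
    for _instance, step, start, end in log:
        durations[step].append(end - start)
    return {step: mean(times) for step, times in durations.items()}
```

A real BAM tool does this continuously as events arrive, of course, rather than in batch over a list, but the aggregation itself is no more exotic than this.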

Gassman discussed some best practices for BAM/BPM synergy before moving on to his definition of the four generations of BAM architecture: isolated silos, standalone, integrated, and composite. We’re still seeing lots of 1st and 2nd generation BAM tools, the 3rd generation has just started happening, and the 4th generation is still at least a year away. He points out that most BPM vendors are adding BAM, but are using a 1st generation BAM system that’s an isolated silo. He sees the potential to move through 5 different styles of BAM automation, that is, how the analysis from the BAM tool feeds back to change the business process. The potential benefits are great as you move from the simple BAM dashboards up through adaptive rules that choose a path based on goals, but the risks increase as well.

BAM is coming from a number of different types of vendors, in spite of the small size of the market, and there will definitely be some convergence and shakeouts in this market. An example of a trend that I think will continue is the recent acquisition of BAM vendor Celequest, used by some BPM vendors as their embedded BAM, by Cognos, a BI vendor. When you’re using BPM, you’re also going to have to face the question of whether to use a BPM vendor’s embedded BAM, or look for a more fully-functional standalone BAM tool. Gassman showed a spider graph of how BPM/BAM matches up against BI on 8 different dimensions, which indicates that you may want to look for a separate product if you need more analytical capability or need to monitor events outside of the process model.

Gartner Day 2: Catching up with BPM bloggers

Lunchtime today was spent chatting with two other BPM bloggers: first, I met with Jesper Joergensen of BEA for a chat about what they’re doing; then I spent some time with Keith Swenson of Fujitsu, mostly talking about BPM standards. Add this to the fact that I had breakfast with Jason Klemow, and there’s been some pretty good BPM blogger networking today.

Gartner Day 2: Savvion customer presentation

Arun Mathews of Motorola, a Savvion customer, presented on the experiences of implementing BPM at Motorola. He started out with a list of reasons why they started with BPM, ranging from Six Sigma projects driving the need for new policies and procedures, through metrics and measurement needs. They started by mapping the as-is business processes, analyzing the processes, designing the to-be processes, and implementing in Savvion: pretty standard stuff.

Then they started on process monitoring and continuous process improvement; as a big Six Sigma shop, continuous improvement and innovation are part of their corporate culture and they’re a process-oriented company, which gives them a huge advantage over many other large companies. They also have a focussed methodology for doing all of this, which appears to be a key differentiator for them over other organizations implementing (or attempting to implement) BPM.

They have a number of successful BPM projects that they’ve implemented, including core supply chain processes. Although he couldn’t share many of the numbers with us, since it’s proprietary information, he did discuss the metrics that they used as direct input into process improvement, such as timeliness and reworks, both of which impact customer satisfaction.

What did they learn from all this? First of all, this is a major paradigm shift that needs some amount of change management at all levels, but the business loves it once they start to see the benefits. Training in BPM methodology is key to this acceptance. Incorporate BPM into a long-term architecture plan, but start small on implementation projects with high ROI and/or high visibility.

Motorola has obviously made a big commitment to BPM, and is reaping the benefits of it: they’ve redefined their process automation and management to use a collaborative methodology with the business taking on much more ownership, which in turn reduces project timelines and costs. Motorola IT is recognized as an industry leader, and in 2005, their CIO recognized BPM as one of the top reasons for their innovation. The bottom line, however, is that it’s not enough to just buy a BPMS and start implementing: you have to have a process view on things.

Gartner Day 2: Appian customer panel

Next up was the Appian customer panel, hosted by Michael Beckley, co-founder and VP of Product Strategy at Appian, who I had met last night at the Appian booth.

Beckley started out with a few remarks about Appian, but the main part of the panel was hearing from their customers: Tom Bolger and Todd McGinnis of West Monroe Partners, Bruce Grenfell of Concur Technologies, Terry Jost of Talisen, and Dennis Nickel of Telus (who I met at breakfast this morning).

  • West Monroe is automating loan origination processes for their customers.
  • Telus is adding Appian to their IT outsourcing business process not for the purpose of automation, but to make sure that the hand-offs between people are done in a standard fashion so that details are not lost.
  • Concur Technologies delivers travel and expense management services via SaaS to their customers, and manages business processes such as approvals involved in those services.
  • Talisen is adding BPM to procurement processes for their customers.

Interestingly, only one of the five is actually an end-customer; the others are technology companies who are implementing BPM for their customers in some way.

They really just got rolling when the first 30-minute session was up; like the ones I complained about yesterday, they structured an hour-long session spanning the two 30-minute session timeslots. With no logical breakpoint, I stepped out in the middle of the panel in order to catch the presentation by Savvion’s customer, Motorola.

Gartner Day 2: Bruce Williams

The second keynote speaker of the day is Bruce Williams of Savvi International, author of Six Sigma for Dummies (and the accompanying workbook) and Lean for Dummies, speaking on What BPM Means to Business Innovation. Funny, at last year’s Gartner BPM summit, everything was about Six Sigma; this year, this is the first time that I’ve heard it mentioned.

He points out one view of BPM, that it’s just a faster, better treadmill, but we’re still doing the same old things. BPM is more than that: not just operational efficiencies and defect reductions, but measurements and activity monitoring, process controls, and integration between systems and services. Furthermore, he goes on to say that the biggest value from BPM is in business innovation, not process improvement: the introduction of something new and useful and the process by which it is brought to life.

But why is innovation important? Why not just milk the cash cows? The answer is pretty obvious, although ignored by many traditional organizations: the lifecycle of every product or service eventually comes to an end, often because someone else introduces a disruptive product or service to the marketplace that obsolesces the old way of doing things. As James Morse of the Harvard Business Review said many years ago (a quote that I have referred to many times), “the only sustainable competitive advantage comes from out-innovating your competition.” Ultimately, innovation trumps optimization.

Williams continues on with a lot of stuff about why the innovation cycle looks like it does, but there’s really nothing new here: this is just the classic stuff for why products or services pass their peak: fatigue, customer demands, market redirections, competitive pressures, technological changes, globalization effects, organizational changes, demographic shifts, regulatory constraints, economic effects, supply drifts and many other factors. He does point out, however, that most US firms have no program in place for fostering innovation, and don’t even have a clear idea of how to become more innovative. Tom Davenport did a study last year that showed that companies are focussing primarily on product innovation, and mostly ignoring things like business model innovation, or even business process innovation; Williams added some things that didn’t even make the list, like innovation in accounting practices or risk management.

He went through some of the different dimensions of innovation — reactive versus proactive; incremental/sustaining versus radical/disruptive; formal versus informal — and looked at how these dimensions mapped onto some specific cases. When he referred to Americans as the kings of innovation, however, it made me doubt his world view overall and left me with a bit of a bad taste: it came across as ethnocentric flag-waving that has no place at a business conference. I recognize that Americans lead innovation in a number of areas, but there are many other countries in the world that are leaders in their own areas of innovation. He’s also under the deluded notion that everyone wants what Americans have, driving SUVs full of consumer goods back to their monster homes in the suburbs, and laughingly pointed out a survey that he had done that concluded that if everyone in the world lived like he did, we’d need over 7 planets worth of resources to accommodate them. Yikes.

At the end of it all, although he had a pithy quote about how BPM is the grand unification theory for business (which is apparently trademarked?!), Williams had very little to say about BPM, but a lot to say about innovation: one of the prime motivators for why you might be considering BPM.

Gartner Day 2: Daryl Plummer

The second day started with a keynote by Daryl Plummer, BPM/SOA Elixir (unfortunately, I missed the breakfast session with Michele Cantara about the BPMS market, but I ended up in a fascinating discussion about requirements collaboration using wikis with Jason Klemow, and the concept of subscribing to processes with specific attributes in a BPMS with Dennis Nickel of Telus).

Plummer obviously likes to play with the new technology, which I suppose is a prerequisite for his job: he talks about TiVo and Second Life as things that are fast becoming essential parts of culture, although it’s clear that few people in the audience (except maybe me) embrace both the concept of downloading and consuming what I want when I want it, and the importance of online social networks.

He starts with some basic definitions of “service”, especially the relationship between processes and services, to drive home the idea that SOA impacts people too, not just systems. He also made an excellent distinction between a model-driven (process-centric) view and a typical programmer view of things: a model-driven view describes what a person does (the business process); a programmer view describes what the system does (the code that attempts to implement that process). Having had a discussion earlier about defining processes based on the data values that flow through them, during which I couldn’t quite elucidate why that wasn’t ultimately the right way to do things, this strikes me as an important distinction.

He summarized the results of Gartner’s 2006 CIO survey (“CIOs are programmers that are passin’ for normal folk”), where the top business trend is improving business processes: there’s a lot of pressure all around to automate processes and improve them along the way, and this is going to happen only with both SOA and BPM. Plummer takes the enterprise architecture view of this, which I really appreciate — business context and functions driving from the top of the architectural view, and services as a way to fulfill the needs of the business. Processes need services to be implemented quickly and effectively, and services aren’t of use unless they are consumed by processes. SOA allows us to build an infrastructure of shared services for ready consumption by processes.

One of the key reasons for SOA and shared services is that legacy apps are still hanging around, in spite of all our efforts to get rid of them. Adding a service layer over the legacy apps allows us to create higher-level services and processes that consume these services without having to know how, or even on what platform, the data is accessed.
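
A minimal sketch of that service layer (class and field names are invented for illustration): consuming processes program against a stable facade, and the messy legacy access is hidden behind it.

```python
class LegacyMainframeClient:
    """Stand-in for an awkward, platform-specific legacy API."""
    def fetch_record(self, raw_key: bytes) -> bytes:
        # Imagine a screen-scraped or COBOL-copybook payload here.
        return b"CUST|" + raw_key

class CustomerService:
    """Service facade: higher-level processes call this, never the legacy client."""
    def __init__(self, backend: LegacyMainframeClient):
        self._backend = backend

    def get_customer(self, customer_id: str) -> dict:
        raw = self._backend.fetch_record(customer_id.encode())
        record_type, key = raw.decode().split("|")
        return {"type": record_type, "id": key}
```

When the mainframe is finally retired, only the facade’s internals change; the processes consuming the service are untouched.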

SOA, as Plummer reinforces, is an architectural style: not web services (although web services can be used to implement SOA), not a particular product, but encapsulated functionality accessed through a standardized interface that allows for loose coupling of services and applications. He goes through a checklist of how to know SOA when you see it, although some of the items on his list (such as a centrally managed runtime middleware network) are not, strictly speaking, an essential part of SOA.

He continues on with a discussion of event-driven processes (he refers to them as event-driven services in counterpoint to BPM-driven services, which is not necessarily a separation that I agree with). Services, properly implemented, can be combined into event-driven processes rather than structured, pre-defined processes, in order to be able to respond to events that happen in an unexpected order or manner. I did, however, like his view of the “new” application container: UI and navigation via a portal, security and management as part of the network, and logic and data accessed via services from wherever they might exist. Explicit orchestration ties all this together, which provides agility in the process model.
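
The contrast between a structured, pre-defined process and an event-driven one can be sketched like this (my own toy example, not tied to any particular product):

```python
def structured_process(steps, execute):
    """Structured: the sequence is baked into the process model."""
    for step in steps:
        execute(step)

class EventDrivenProcess:
    """Event-driven: each event is handled whenever it arrives, in any order."""
    def __init__(self):
        self._handlers = {}

    def on(self, event_type, handler):
        self._handlers[event_type] = handler

    def dispatch(self, event_type, payload):
        # Unexpected ordering is fine: handlers don't assume a sequence.
        return self._handlers[event_type](payload)
```

The structured version fails gracelessly when reality doesn’t follow the model; the event-driven version simply reacts to whatever shows up, which is the responsiveness Plummer was getting at.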

He points out that an SOA is never finished: in fact, it’s designed to never be finished. He uses the analogy of the Sagrada Familia in Barcelona, a cathedral that’s been under construction for more than 100 years now, and continues to change as it is built. He covers off some things about development techniques to be used when developing services within an SOA infrastructure, and highlights the business service repository as a centrepiece for BPM’s use of SOA.

His final recommendation is that the critical path to competitive business advantage goes through SOA and BPM: you need to use SOA as the underlying mechanism, BPM as the methodology for tying things together in order to get the business process improvement that’s required to differentiate your organization in the marketplace.

Gartner Day 1: Peter Schwartz closing keynote

Sessions for the day finished with Peter Schwartz of Global Business Network talking about Scenario Planning — Business and IT Strategies in an Uncertain World. He might seem like a strange choice to speak at a BPM conference, but his job is to help companies to think about the future: something that a lot of people here are obviously thinking about.

He started with an old map of North America that showed California as an island (due to some incomplete exploration at the time), and talked about how this caused some missionaries to head east across California, across the Sierra Nevada, lugging boats along with them so that they could cross the sea that would eventually separate them from the mainland: in other words, they based their processes on an incorrect map of the environment, and suffered for it. Sound familiar in your organization?

He showed Henry Mintzberg’s concept of emergent strategy, which is when environmental forces impact intended strategy to create something that wasn’t originally envisioned. He introduced the concept of scenario thinking to overcome decision traps, and discussed the challenge that what is not foreseen is unlikely to be seen in time. You need a systematic methodology for developing scenarios, and a deep understanding of both what you’re trying to accomplish and the forces impacting the scenario. The test of a good scenario is not whether it is right or wrong, but whether it leads to better decisions. Although Schwartz has been involved in developing scenarios for (mostly) science fiction movies, he has spent much more of his time looking at political scenarios, including predicting the fall of the Soviet Union. As an aeronautical engineer, he assured us that this was not rocket science. 🙂

He discussed how scenario thinking is used as a tool for strategic alignment across an organization in general, then took a crack at scenarios for IT and BPM.

Environmental forces have different impacts at different points in the business process maturity model, and they drive changes in BPM and IT. Change in BPM is a function of change in customers, plus change in technology, plus change in competition, plus change in business environment:

  • Change in customers is driven by an aging population that is working longer, the long tail effect of stratified preferences, cultural diversification, supply chain integration, preferences for environmentally “green” products and services, and widening income gaps.
  • Change in technology is driven by hardware advances beyond Moore’s law (including orders of magnitude bandwidth increases as well as computing power), convergence of platforms, the shift from broadcast to download, and change accelerated by breakthroughs in related areas such as biotechnology and nanotechnology.
  • Change in competition is driven by new business models, new competitive sets (e.g., Apple in the music business), new nations in a flattening world, new sources of competitive differentiation (e.g., design), consolidation of players, and increasing pressure for innovation.
  • Change in business environment is driven by interest and inflation rates, currency conversion rates, geopolitical uncertainties and climate change, fluctuations in input costs (e.g., energy, silicon), and ubiquitous high-speed broadband.

He sees three possible scenarios for how BPM is deployed within organizations — slow & moderate change, slow & radical change, and rapid & radical change — and went through how the customer, technology, competition and business environment factors play in these scenarios. Definitely an interesting view on what we’re doing with BPM.

Schwartz was an incredibly captivating speaker, and obviously appreciated by the audience. He even told us an anecdote about where he got his ideas for the scenarios in War Games: he was on node #2 of ARPANET in 1973, and hacked his way through to someone in missile control.

Off to the vendor showcase and reception.

Pegasystems Lunch

This is out of chronological order, but I didn’t have my laptop at lunch so had to reconstruct this from memory and a few scratched notes.

I squeezed my way into the Pegasystems lunch, which was completely full, in order to hear Alan Trefler speak. He was very engaging, with lots of amusing graphics, but one phrase at the end of his talk really grabbed me: “you have to get away from the poisonous import/export environment so that BPM doesn’t become the next CASE”.

What he meant by this is that by modelling in one tool, then exporting it to an execution environment where there is imperfect round-tripping, there’s a danger of having the processes caught in the execution environment where you’ll be stuck maintaining them in a more code-like environment: presumably, the execution environment has imperfect modelling, or you wouldn’t be using another modelling tool in the first place. This makes the modelling tool useless except for the initial design process, and therefore hinders the future agility of the process. He makes the analogy to CASE (think back to the 80’s and 90’s), where nice-looking tools generated code that could then be “tweaked”, but you then ended up doing any further code maintenance in the code environment rather than the CASE environment because there wasn’t proper round-tripping between the environments.

This is part of the problem that I have with external modelling tools: unless you can round-trip seamlessly, you don’t have process agility.

Food for thought.

Gartner Day 1: Jim Sinur

Jim Sinur took the stage for Are Rules Inside-Out in BPM?, where he claimed that he’d push the envelope in how we thought about rules. He started with how rules are a start, but agility requires a full business rule management strategy so that you can manage the rules that you’ve externalized, especially if you have multiple business rule engines. Now to be fair, many organizations haven’t yet externalized their rules from their enterprise applications and business processes, but if they ever do, they’d better be ready to manage them or they’ll have a big mess of rules to deal with.

Today’s business rules landscape is pretty confusing, covering everything from neural nets and expert systems to business rule engines and business rule management systems. If these business rules are too rigid (unchangeable), it impacts the agility of the business processes and the entire organization; if IT has to spend a huge amount of time and money to change rules, then you can be sure that it’s not going to happen very often. However, IT is often unwilling to put control of the business rules into the hands of the business; there needs to be a way to have proper governance over changing of rules, but not so much control that it’s impossible to keep up with shifting business requirements. In many cases, the business has no idea how difficult it is to change any given rule, and some standardization of this — via rule externalization and management — would also improve service levels between business and IT.
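
A small sketch of what externalization buys you (the threshold and field names are invented): the same rule, first buried in code where only IT can change it, then pulled out as data that the business can manage under appropriate governance.

```python
def approve_hardcoded(claim):
    # Changing this limit means a change request, a release cycle, and IT time.
    return claim["amount"] < 5000

# Externalized: the rule lives in a repository that the business maintains,
# so the limit can change without touching or redeploying application code.
RULES = {"auto_approve_limit": 5000}

def approve_externalized(claim, rules=RULES):
    return claim["amount"] < rules["auto_approve_limit"]
```

A real BRMS adds versioning, auditing and approval workflows around that repository, which is exactly the governance layer that makes IT comfortable handing over control.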

The key is to understand where rules affect processes, and see where the ability to change rules for in-flight processes can greatly improve agility. Sinur went through the business benefits of rules, and some of the risks of fixed rules: primarily, business rule management is an enabler for governance. He also walked through different models for adopting rules: the safe and steady control model (slightly smarter process), the cautiously dynamic model (process with above average intelligence), and the aggressively predictive model (Mensa process). Obviously, the model has to suit the organization’s risk tolerance as well as the underlying process, since these range from just automating some well-understood decisions to suggesting and implementing new rules based on behaviour within the process.

He has some great recommendations for getting your rules under control, including such thoughts as 15% of the rules are the ones that the business really needs to change to remain agile, so pick the right ones to externalize, and understand both the business benefits and risks associated with changing that rule.

Watch for Gartner’s definition of what should be in a BRMS later in 2007, since this is becoming somewhat of a commodity.