Gartner MQ for BPMS Leaders

Gartner is pretty strict about allowing anyone to publish anything about their Magic Quadrants (even if you could argue that excerpts, such as the MQ graph, constitute fair use). However, three of the leaders have a lot to say about it:

  • Pegasystems Positioned as a Leader in Prominent Analyst Firm’s 2010 Magic Quadrant for Business Process Management Suites (which includes a mini version of the graph)
  • Software AG Named in the Leaders Quadrant for Business Process Management Suites
  • IBM Lombardi positioned in the Leaders Quadrant of Gartner Magic Quadrant for Business Process Management Suites (also containing a mini graph)

At some point, you could probably reconstruct the Leaders quadrant based on press releases; many of the vendors in the other quadrants don’t bother to do a release about it (do they have to pay Gartner for that?): consider that IBM placed all three of its major BPM products in this MQ, but I only saw a press release about the one in the Leaders quadrant.

Adam Deane published something about it, which I missed before he had to pull it; the comments on his post are particularly interesting, especially the one from a vendor who believes that they were dropped from the MQ because they stopped being a Gartner customer.

IBM IOD Opening Session: ACM and Analytics

I’m at IBM’s Information On Demand (IOD) conference this week, attending the opening session. There are 10,000 attendees here (including, I assume, IBM employees) for a conference that covers information management of all sorts: databases, analytics and content management. As at other large vendor conferences, they feel obligated to assault our senses in the morning with loud performance art: today, it’s Japanese drummers (quite talented, and thankfully short). From a logistics standpoint, the wifi was brought to its knees before the opening session even started (what, like you weren’t expecting this many people??); IBM could learn a few lessons about supporting social media attendees from SAP, which provided a social media section with tables, power and wired internet to ensure that our messages got out in a timely fashion.

Getting back to the session, it was hosted by Mark Jeffries, who provided some interesting and amusing commentary between sessions, told us the results of the daily poll, and moderated some of the Q&A sessions; I’ve seen him at other conferences and he does a great job. First up from IBM was Robert LeBlanc (I would Google his title, but did I mention that there’s no wifi in here as I type?), talking about how the volume of information is exploding, and yet people are starved for the right information at the right time: most business people say that it’s easier to get information on the internet than out of their own internal systems. Traditional information management – database and ECM – is becoming tightly tied with analytics, since you need analytics to make decisions based on all that information, and gain insights that help to optimize business.

They ran some customer testimonial videos, and the term “advanced case management” came up early and often: I sense that this is going to be a theme for this conference, along with the theme of being analytics-driven to anticipate and shape business outcomes.

LeBlanc was then joined on stage by two customers: Mike Dreyer of Visa and Steve Pratt of CenterPoint Energy. In both cases, these organizations are leveraging information in order to do business better: for example, Visa used analytics to determine that “swipe-and-go” for low-value neighborhood transactions, such as those at Starbucks, was so low risk that it didn’t need immediate verification, speeding each transaction and therefore getting your morning latte to you faster. CenterPoint, an energy distributor, uses advanced metering and analytics not only for end-customer metering, but to monitor the health of the delivery systems so as to avoid downtimes and optimize delivery costs. They provided insights into how to plan and implement an information management strategy, from collecting the right data to analyzing and acting on that information.

We then heard from Arvind Krishna, IBM’s GM of Information Management, discussing the cycle of information management and predictive analytics, including using analytics and event processing to optimize real-time decisions and improve enterprise visibility. He was then joined on a panel by Rob Ashe, Fred Balboni and Craig Hayman, moderated by Mark Jeffries; this started to become more of the same message about the importance of information management and analytics. I think that they put the bloggers in the VIP section right in front of the stage so that we don’t bail out when it starts to get repetitive. I’m looking forward to attending some of the more in-depth sessions to hear about the new product releases and what customers are doing with them.

Since the FileNet products are showcased at IOD, this is giving me a chance to catch up with a few of my ex-FileNet friends from when I worked there in 2000-1: last night’s reception was like old home week with lots of familiar faces, and I’m looking forward to meeting up with more of them over the next three days. Looking at the all-male group of IBM executives speaking at the keynote, however, reminded me why I’m not there any more.

Disclosure: In addition to providing me with a free pass to the conference, IBM paid my travel expenses to be here this week. I flew Air Canada coach and am staying at the somewhat tired Luxor, so that’s really not a big perk.

Integrating BPM and Enterprise Architecture

Michael zur Muehlen presented this morning on integrating BPM and enterprise architecture, based on work that he’s done with the US Department of Defense. Although they use the DoDAF architecture framework in particular, the concepts are applicable to other similar EA frameworks. Like the Zachman framework, DoDAF prescribes the perspectives that are required, but doesn’t specify the artifacts (models) required for each of those perspectives; this is particularly problematic in DoD EA initiatives where there are likely to be many contractors and subcontractors involved, all of whom may use different model types to represent the same EA perspective.

He talked briefly about what makes a good model: the information must be correct, relevant (and complete) and economical (with respect to level of detail), as well as clear, comparable (linked to reality) and systematic. From there, he moved on to their selection of BPMN as the dominant standard for process modeling, since it has better event handling than UML activity diagrams, better organizational modeling than IDEF0, and better cross-organizational modeling than simple flowcharts. However, many tools support only a subset of BPMN – particularly those intended for process execution rather than just process modeling – and some tools have non-standard enhancements to BPMN that inhibit interoperability. Another issue is that the BPMN specification is enormous, with over 100 elements, including different constructs that mean the same thing, such as explicit versus implicit gateways.
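To make the explicit-versus-implicit gateway point concrete, here’s a simplified, illustrative BPMN 2.0 XML sketch (my own toy example, not from the DoD work; element IDs and the condition expressions are invented, and namespace/type attributes are omitted for readability). Both fragments describe exactly the same routing behavior, which is why having two ways to say it inhibits interoperability:

```xml
<!-- Explicit form: an exclusiveGateway element does the routing -->
<task id="review" name="Review claim"/>
<sequenceFlow id="f1" sourceRef="review" targetRef="gw"/>
<exclusiveGateway id="gw" name="Approved?"/>
<sequenceFlow id="f2" sourceRef="gw" targetRef="pay">
  <conditionExpression>approved</conditionExpression>
</sequenceFlow>
<sequenceFlow id="f3" sourceRef="gw" targetRef="reject">
  <conditionExpression>not(approved)</conditionExpression>
</sequenceFlow>

<!-- Implicit form: conditional sequence flows leave the task
     directly, with no gateway element at all -->
<task id="review2" name="Review claim"/>
<sequenceFlow id="f4" sourceRef="review2" targetRef="pay">
  <conditionExpression>approved</conditionExpression>
</sequenceFlow>
<sequenceFlow id="f5" sourceRef="review2" targetRef="reject">
  <conditionExpression>not(approved)</conditionExpression>
</sequenceFlow>
```

A modeling tool that only understands one of these forms will mangle or reject models produced by a tool that prefers the other, which is one reason the DoD primitives standardize on a single style.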

They set out to design primitives for the use of BPMN, “outlawing” certain symbols such as complex gateways, and developing best practices for BPMN usage. They also mapped the frequency of BPMN symbol usage across internal DoD models, models that Michael sees in his practice as a professor of BPM at Stevens Institute of Technology, and samples found on the web, and came up with a distribution of the BPMN elements by frequency of usage. This research led to the creation of the subsets that are now part of the BPMN standard, as well as usage guidelines for BPMN in terms of both primitives and patterns.
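As a toy illustration of that kind of frequency analysis (my own sketch, not their actual tooling), you could tally BPMN element types across a set of model files with a few lines of Python:

```python
from collections import Counter
from xml.etree import ElementTree

def bpmn_element_frequency(xml_strings):
    """Count BPMN element types (bare tag names, namespaces stripped)
    across a collection of BPMN 2.0 XML documents."""
    counts = Counter()
    for xml in xml_strings:
        for elem in ElementTree.fromstring(xml).iter():
            # elem.tag looks like '{http://...}task'; keep only 'task'
            counts[elem.tag.rsplit('}', 1)[-1]] += 1
    return counts.most_common()

# A minimal made-up model, just to show the shape of the output
sample = """<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <process id="p1">
    <startEvent id="s"/><task id="t1"/><task id="t2"/><endEvent id="e"/>
  </process>
</definitions>"""

print(bpmn_element_frequency([sample]))
```

Run over a large corpus of real models, the resulting ranked distribution is exactly the kind of evidence that drove the selection of the BPMN subsets.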

In addition to the BPMN subsets (e.g., the most commonly implemented Descriptive subclass), they developed naming conventions to use within models, driven by the vocabulary related to their domain content. This idea of separating the control of model structure from the vocabulary makes sense: the first is more targeted at an implementer, while the second is targeted at a domain/business expert. This in turn led to vocabulary-driven development, where the relationship between capabilities, activities, resources and performers (CARP analysis) is established as a starting point for the labels used in process models, data models (or ontologies/taxonomies), security models and more as the enterprise architecture artifacts are built out.

Having defined how to draw the right models and how to select the right words to put in the models, they looked at different levels of models to be used for different purposes: models focused on milestones, handoffs, decisions and procedures. These are not just more detailed versions of the same, but rather different views on the process. The milestones view is a high-level view of the major process phases; the handoffs view looks at transitions between lanes, with all activities within a lane rolled up to a single activity, primarily showing the happy path; the decisions view looks at major decision points and exception/escalation paths; and the procedures view shows a full requirements-level view of the process, i.e., the greatest level of detail that a business analyst is likely to create before involving technical resources to add things such as service calls.

To finish up, he tied this back to the six measures of model quality and how this approach based on primitives conforms to these measures. They’ve achieved a number of benefits, including minimizing modeling errors, ensuring that models are clear and consistent, and ensuring that the models can be converted to an executable form. I’m seeing an increased interest among my clients and in the marketplace in how BPM and EA can work together, so this was a great example of how one large organization manages to do it.

Michael posted earlier this year on the DoDAF subset of BPMN (in response to a review that I wrote of a BPMN update presentation by Robert Shapiro). If we go back a couple of years before that, there was quite a dust-up in the BPMN community when Michael first published the usage distribution statistics – definitely worth following the links to see where all this came from.

The Rules For Process

I’ve been pretty busy here at the Building Business Capability conference the past two days with little time for blogging, and with two presentations to do today, I don’t have much time, but wanted to attend Roger Burlton’s “The Rules For Process” keynote, which he refers to as his business process manifesto. After some terms and meta-rules (e.g., short, jargon-free and methodology-neutral), he got into his eight main principles:

  1. A business process is a unique capability of an organization.
  2. A business process exists within a clearly defined business context.
  3. The name of a business process must be consistently structured, unambiguous and commonly used.
  4. A model of a business process holds knowledge about a business process.
  5. A model of a business process associates a business process with other capabilities of the organization.
  6. A business process is guided by the business’ strategy and its policies.
  7. The performance of a business process is measured and assessed in business terms.
  8. A business process is an essential asset of the organization.

He spent quite a bit of time delving into each of these principles in detail, such as describing a business process as an action, not a policy, business rule or technology application.

I’m not sure if Roger is considering publishing a paper on this; definitely lots of good information about what business processes are and are not, which could help many people with their business process capture efforts. There’s apparently a discussion about this on the BPTrends LinkedIn group where you can find out more and join in the conversation, although I haven’t found it yet.

Building Business Capability Conference: Rules and Process and Analysis, Oh My!

After a welcome by Gladys Lam, the conference kicked off with a keynote by Ron Ross, Kathleen Barret and Roger Burlton, chairs of the three parts of the conference: Business Rules Forum, Business Analysis Forum and Business Process Forum. This is the first time that these three conferences have come together, although last year BPF emerged as more than just a special interest track at BRF, and it makes a lot of sense to see them together when you consider the title of the conference: Building Business Capability.

The keynote was done as a bit of a panel, with each of the three providing a view on the challenges facing organizations today, the capabilities required to tackle these challenges, and how this conference can help you to take these on. Some themes:

  • Lack of agility is a major challenge facing organizations today. To become more agile, design for change, including techniques like externalizing rules from processes and applications for late binding.
  • Consider effectiveness before efficiency, i.e., make sure that you’re doing the right thing before seeking to optimize it. In the vein of “doing the right thing”, we need to change corporate culture to focus on customer service.
  • Structured business vocabularies are important for effectiveness, including things like rules vocabularies and BPMN. Roger pointed out that we need to keep things simple within the usage of these vocabularies, and jokingly challenged us to create a valid process model containing all 100+ BPMN elements.
  • The business analyst’s role will transform over the next five years as process, rules and decision tools converge and business architecture gains more significance. BAs need to step up to the challenge of using these tools and related methodologies, not just write requirements, and need to be able to assess return on investment of previous business decisions to assist with future directions.
  • There is no conflict between the rules and process domains, they’re complementary. I often joke that business process people want to embed all rules into their process maps and just turn them into big decision trees, whereas business rules people want the business process to have a single step that calls a rules system, but the truth is somewhere in between. I’ve written and presented a lot about how rules and process should work together, and know that using them together can significantly increase business process agility.
  • It’s not about aligning business and IT, it’s about aligning business strategy with IT capability. Don’t focus on delivering IT systems, focus on delivering business solutions.

Julian Sammy of IIBA tweeted that he was recording the keynote and will put some of it up on YouTube and the IIBA site, so watch for that if you want to see the highlights on video. You can also follow the conference Twitter stream at #bbc2010.

Thriving In A Process-Driven World

Clay Richardson and Dave West (definitely the two snappiest dressers at Forrester) opened the second day of the Forrester Business Process and Application Delivery Forum with a keynote on thriving in a process-driven world by shifting both your business and IT culture. These shifts are hard work and fraught with risk, but necessary in order to achieve transformation. It’s critical to lead change, not just manage it, by creating change agents inside your organization.

They discussed some tools for doing this: identifying value streams that can point everyone in the same direction; using process as a common language for transformation, although not necessarily a common process representation; extending agile thinking to the entire enterprise; and lean governance that starts at the top but pushes down responsibility to empower teams to make decisions.

To achieve agility, it’s necessary to align business and IT into integrated process teams and adopt agile processes for those integrated teams, as well as selecting tools and architectures that support change.

Good governance is less about telling people what to do (and what not to do), and more about educating people on why they need to do certain things and empowering them to make the right choices. Many successful organizations adopt not just centers of excellence, but build communities of practice around those CoEs.

Since Richardson focuses on business process and West on agile software development, this was an interesting hybrid of ideas that spanned both business and IT.

Dynamic Case Management In the Public Sector

There was nothing that could have enticed me to attend the Lean Six Sigma presentation this late in the afternoon, so instead I opted for the public sector track (which is not really my area of interest) for a discussion on dynamic case management (which is my area of interest) by Craig Le Clair.

Government workers have low levels of empowerment and resourcefulness, for both cultural reasons and lack of technology to support such activities. So why is dynamic case management important in government? He listed several reasons, including the increased need to manage the costs and risks of servicing higher numbers of citizen requests, the less structured nature of government jobs, demographic trends that will see many experienced government workers retiring soon, and new regulations that impact their work.

Forrester defines dynamic case management as “a semistructured but also collaborative, dynamic, human and information-intensive process that is driven by outside events and requires incremental and progressive responses from the business domain handling the case”. A case folder at the center of the case is surrounded by people, content, collaboration, reporting, events, policies, process and everything else that might impact that case or be consumed during the working of the case. It’s a combination of BPM, ECM and analytics, plus new collaborative user interface paradigms. Forrester is just wrapping up a wave report on dynamic case management (or adaptive case management, as it is also known), and we’re seeing a bit of the research that’s going into it.

Le Clair discussed the three case management categories – service requests, incident management and investigative – and showed several government process examples that fit into each type as well as some case studies. He moved on to more generic Forrester BPM advice that I heard earlier in sessions today, such as leveraging centers of excellence, but included some specific to case management such as using it as a Lean approach for automating processes.

39 minutes into a 45-minute presentation, he turned it over to his co-presenter, Christine Johnson from Iron Data, although he assured her that she still had 15-20 minutes. 🙂 She walked through a case lifecycle for a government agency dealing with unemployment and disability claims through retraining and other activities: this includes processes for referral/intake, needs assessment and service plan, appeals, and so on. Some portions, such as intake, had more of a structured workflow, whereas others were less structured cases where the case worker determined the order and inclusion of activities. There were some shockingly detailed diagrams, not even considering the time of day, but she redeemed herself with a good list of best practices for case management implementations (including, ironically, “clutter-free screens”), covering technology, design and process improvement practices.

Interestingly, Johnson’s key case study on a federal agency handling disability claims started as an electronic case file project – how many of those have we all seen? – and grew into a case management project as the client figured out that it was possible to actually do stuff with that case file in addition to just pushing documents into it. The results: paperless cases, and reduced case processing times and backlogs.

Building Process Skills To Scale Transformation

Connie Moore (or “Reverend Connie” as we now think of her 😉 ) gave a session this afternoon on process skills at multiple levels within your organization, and how entire new process-centric career paths are emerging. Process expertise isn’t necessarily something that can be quickly learned and overlaid on existing knowledge; it requires a certain set of underlying skills, and a certain amount of practical experience. Furthermore, process skills are migrating out of IT into the business areas, such as process improvement specialists and business architects.

Forrester recently did a role deep dive to take a look at the process roles that exist within organizations, and found that different organizations have very different views of business process:

  • Immature, usually smaller organizations with a focus on automation, not the process; these follow a typical build cycle with business analysts as traditional requirements gatherers.
  • Aspiring organizations that understand the importance of process but don’t really know fully what to do with it: they’ve piloted BPM projects and may have started a center of excellence, but are still evolving the roles of business analysts and other participants, and searching for the right methodologies.
  • Mature organizations already have process methodologies, and the process groups sit directly in the business areas, with clear roles defined for all of the participants. They will have robust process centers of excellence with well-defined methodologies such as Lean, offering internal training on their process frameworks and methods.

She talked about the same five roles/actors that we saw in the Peters/Miers talk, and about how different types of business process professionals learn and develop skills in different ways. She mentioned the importance of certification and training programs, citing ABPMP as the up-and-coming player here with about 200 people certified to date (I’m also involved in a new effort to build a more open process body of knowledge), and listed the specific needs of the five actors in terms of their skills, job titles and business networks using examples from some of the case studies that we’ve been hearing about, such as Medco. The job titles, as simple as that seems, are pretty important: they’re part of the language that you create around process improvement within your organization.

Process roles are often concentrated in a process center of excellence, which can start small: Moore told the story of one organization that started with four developers, one business analyst and one enterprise architect. Audience members echoed that, with CoEs usually in the under-10 size, and many without a CoE at all. You also need to have a mix of business and IT skills in a CoE: as one of her points stated, you can do this without coding, but that doesn’t mean that a business person can do it, which is especially true as you start using more complete versions of BPMN, for example. There’s definitely a correlation (although not necessarily causation) between CoEs and BPM project success; I talked about this and some other factors in building a BPM CoE in a webinar and white paper that I did for Appian last year.

She had a lot of great quotes from companies that they interviewed in their process roles study:

“These suites still required you to have [a] software engineering skill set”

“The biggest challenge is how to develop really good process architects”

“They [process/business analysts] usually analyze one process and have limited ability to see beyond the efforts in front of them”

“Process experts are a rare type of talent”

“We thought the traditional business analyst would be the right source, but we were horribly disappointed”

A number of these comments are focused on the shortcomings of trying to retrain more traditionally-skilled people, such as business analysts, for process work: it’s not as easy as it sounds, and requires significantly better tooling than they are likely using now. You probably don’t need the 20+ years of experience that I have in process projects, but you’re not going to just be able to take one of your developers or business analysts, send them on a 3-day course, and have them instantly become a process professional. There are ways to jump-start this: for example, looking at cloud-based BPM so that you need less of the back-end technical skills to get things going, and considering alternatives for mentoring and pairing with existing process experts (either internal or external) to speed the process.

Phil Gilbert On The Next Decade Of BPM

I missed Phil’s keynote at BPM 2010 in Hoboken a few weeks ago (although Keith Swenson very capably blogged it), so I was glad to be able to catch it here at the Forrester BP&AD forum. His verdict: the next decade of BPM will be social, visible and turbulent.

Over the past 40-50 years, the hard-core developers have become highly leveraged such that one developer can support about five other IT types, which in turn support 240 business end users. Most of the tools to build business technology, however, are focused on those 6 people on the technical side rather than the 240 business people. One way to change this is to allow for self-selected collaboration and listening: allowing anyone to “follow” whoever or whatever that they’re interested in to create a stream of information that is customized to their needs and interests.

Earlier today, I received an email about IBM’s new announcement of IBM Blueworks Live, and Phil talked about how it incorporates this idea of stream communication to allow you to both post and follow information. It will include information from a variety of sources, such as BPM-related Twitter hashtags and links to information written by BPM thought leaders. Launching on November 20th, Blueworks Live will include both the current BPM BlueWorks site as well as the IBM Blueprint cloud-based process modeling capability. From their announcement email that went out to current Blueprint users:

The new version will be called IBM Blueworks Live and you’ll be automatically upgraded to it.  Just like in past releases, all your process data and account settings are preserved. All of the great Blueprint features you use today will be there, plus some new capabilities that I think you’ll be very excited to use.

Blueworks Live will allow your team to not only collaborate on daily tasks, but also gain visibility into the status of your work. You’ll be able to automate processes that you run over e-mail today using the new checklist and approval Process App templates. Plus, you’ll have real-time access to expert online business process communities right on your desktop, so you can participate in the conversation, share best practices, or ask questions.

It’s good to see IBM consolidating these social BPM efforts; the roadmap for doing this wasn’t really clear before this, but now we’re seeing the IBM Blueworks community coming together with the Lombardi Blueprint tools. I’m sure that there will still be some glitches in integration, but this is a good first step. Also, Phil told me in the hallway before the session that he’s been made VP of BPM at IBM, with both product management and development oversight, which is a good move in general and likely required to keep a high-powered individual like Phil engaged.

With the announcement out of the way, he moved on with some of the same material from his BPM 2010 talk: a specific large multi-national organization has highly repeatable processes representing about 2.5% of their work, somewhat repeatable processes are 22.5%, while barely repeatable processes form the remaining 75%, and are mostly implemented with tools like Excel over email. Getting back to the issue from the beginning of the presentation, we need to have more and better tooling for those 75% of the processes that impact many more people than the highly repeatable processes that we’re spending so much time and money implementing.

With Blueworks Live, of course, you can automate these long tail processes in a matter of seconds 😉 but I think that the big news here is the social stream generated by these processes rather than the ease of creating the processes, which mostly already existed in Blueprint. Instant visibility through activity streams.

BPM: The New Language Of IT To Business Technology

Alex Peters and Derek Miers presented in the business process track with a session on BPM as the new language of IT to business technology. Forrester has been pushing the phrase “business technology” instead of “information technology” for the past year or so, and it was funny this morning to hear John Rymer say that he didn’t like the term at first, but he’s really bought into it now, since it really describes the role of IT in supporting the business, rather than off in their own world.

Peters discussed three recent technologies that have become game changers: social computing to expand the customer interaction channels, dynamic business applications for cross-functional processes, and the cloud as a delivery platform. There are also new key actors in business process transformation initiatives, characterized as VP process improvement (“change agent”), business architect (“guru”), process architect (“prodigy”), business analyst (“wannabe”), and manager of IT business systems (“operator”). Business analyst = “wannabe”? That’s gotta hurt, although it was Forrester that claimed that more than half of all business analysts couldn’t cut it as a process analyst.

In moving to this new world order, where technology is focused on business, it’s necessary to evaluate the maturity of the organization’s business process management, and start the journey by eliminating waste in processes. Suddenly, this is sounding a lot like Lean. He showed some examples of companies at various stages of the journey: an organization with immature processes, where IT uses a plan-build-run structure; an aspiring organization starting to move from reactive to proactive, with more of a demand-supply structure and the beginnings of centers of excellence; and an organization with mature process management, leveraging cross-business process centers of excellence and shared services.

Miers took over to explain how the language of BPM can be used along this journey to process maturity and a business technology focus. He’s not talking about a graphical notation for process models like BPMN; he’s talking about the natural language words that we use to describe processes and process improvements, and how we need to decide what they mean. In other words, what do you mean by process? Task? Process model? Object? Capability? And does everyone else in your organization use the same words to describe the same concepts? If not, you’re going to have some alignment problems, since language is key to building a common understanding between different roles.

He stepped through each of the five actors, the challenges that they encounter in doing their business transformation work, and the language that they need to use to describe their world. Although the call to action at the end was to do your process maturity assessment and portfolio analysis, there was little of that in the rest of the presentation.

A bit of a meta topic, and a bit unfocused due in part to logistical problems at the beginning of the session, but some interesting nuggets of information.