Advanced Case Management Empowering The Business Analyst

We’re still a couple of hours away from the official announcement about the release of IBM Case Manager, and I’m at a session on how business analysts will work with Case Manager to build solutions based on templates.

Like the other ACM sessions, this one starts with an overview of IBM’s case management vision as well as the components that make up the Case Manager product: ECM underlying it all, with Lotus Sametime for real-time presence and chat, ILOG JRules for business rules, Cognos Real Time Monitor for dashboards, IBM Content Analytics for unstructured content analysis, IBM (Lotus) Mashup Center for the user interface, and some new case management task and workflow functionality that uses P8 BPM under the covers. Outside the core of Case Manager, WebSphere Process Server can be invoked for integration/SOA applications, although it appears that this is done by calling it from P8 BPM, which was existing functionality. On top of this, there are pre-built solutions and solution templates, as well as a vast array of services from IBM GBS and partners.

IBM Case Management Vision

The focus in this session is on the tools for the business analyst in the design-time environment, either based on a template or from scratch, including the user interface creation in the Mashup Center environment, analytics for both real-time and historical views of cases, and business rules. This allows a business analyst to capture requirements from the business users and create a working prototype that will form the shell of the final case application, if not the full executing application. The Case Builder environment that a business analyst works in to design case solutions also allows for testing and deploying the solution, although in most cases you won’t have your BAs deploying directly to a production environment.

Defining a case solution starts with the top-level case solution creation, including name, description and properties, then completing the following (a rough code sketch follows the list):

  • Define case types
  • Specify roles
    • Define role inbasket
  • Define personal inbasket
  • Define document types
  • Associate runtime UI pages
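
To make that concrete, here is a minimal sketch of the kind of information the wizard captures. All class and field names here are my own, hypothetical, and not the actual Case Builder API:

```python
# Hypothetical sketch of a case solution definition; these are not IBM's APIs.
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str
    inbasket: str  # the role's shared work list (backed by a queue, I assume)

@dataclass
class CaseType:
    name: str
    document_types: list = field(default_factory=list)
    ui_pages: list = field(default_factory=list)  # runtime UI page associations

@dataclass
class CaseSolution:
    name: str
    description: str
    properties: dict = field(default_factory=dict)
    roles: list = field(default_factory=list)
    case_types: list = field(default_factory=list)

solution = CaseSolution(
    name="AccountOpening",
    description="Handles new account requests",
    properties={"region": "string", "priority": "integer"},
    roles=[Role("Underwriter", inbasket="UnderwriterQueue")],
    case_types=[CaseType("NewAccount",
                         document_types=["Application", "ProofOfIdentity"],
                         ui_pages=["CaseDetails", "DocumentViewer"])],
)
print(solution.name, [r.name for r in solution.roles])
```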

We didn’t see the ILOG JRules integration, and for good reason: in the Q&A, they admitted that this first version of Case Manager didn’t quite have that up to scratch, so I imagine that you have to work in both design environments, then call JRules from a BPM step or something of that nature.

The more that I see of Case Manager, the more I see the case management functionality that was starting to migrate into the FileNet ECM/BPM product from the Business Process Framework (BPF). I predicted that BPF would become part of the core product when I reviewed P8 BPM v4.5 a year and a half ago, and while Case Manager is being released as a separate product rather than as part of the core ECM product, BPF is definitely being pushed to the side, and IBM won’t be encouraging the creation of any new applications based on BPF. There’s no direct migration path from BPF to ACM; BPF technology is a bit old, and the time has come for it to be abandoned in favor of a more modern architecture, even if some of the functionality is replicated in the new system.

The step editor used to define the tasks associated with cases provides swimlanes for roles or workgroups (for underlying queue assignment, I assume), then allows the designer to add steps into the lanes and assign properties to the steps. The step properties are a simplified version of a step definition in P8 BPM, so I assume that this is actually a shared model (as opposed to export/import) that can be opened directly by the more technical BPM Process Designer. In P8 BPM 4.5, they introduced a “diagram mode” for business analysts in the Process Designer; this appears to be an even simpler process diagramming environment. It’s not BPMN compliant, which I think is a huge mistake: since it’s a workflow-style model with lanes, activities and split/merge, this would have been a great opportunity to use the standard BPMN shapes and start getting BAs used to them.

I still have my notes from last week’s analyst briefing and my meeting with Ken Bisconti from yesterday which I will publish; these are more aligned with the “official” announcement that will be coming out today in conjunction with the press release.

IBM’s New Case Manager Product Overview

The day before the official announcement of IBM’s Case Manager product, Jake Levirne, Senior Product Manager, walked us through the capabilities. He started by defining case management, and discussing how it is about providing context to enable better outcomes rather than prescribing the exact method for achieving that outcome. For those of you who have been following ACM for a while, this wasn’t anything new, although I’m imagining that it is for some of the audience here at IOD.

Case Manager is an extension of the core (FileNet) ECM product through the integration of functionality from several other software products across multiple IBM software groups, specifically analytics, rules and collaboration. There is a new design tool targeted at business analysts, and a user interface environment that is the next generation of the old ECM widgets. There’s a new case object model in the repository, allowing the case construct to exist purely in the content repository, and be managed using the full range of content management capabilities including records management. Case tasks can be triggered by a number of different event types: user actions, new content, or updates to the case metadata. By having tasks as objects within the case, each task can then correspond to a structured subprocess in FileNet BPM, or just be part of a checklist of actions to be completed by the case worker (further discussion left it unclear whether even the simple checklist tasks were implemented as a single-step BPM workflow). A task can also call a WebSphere Process Server task; in fact, from what I recall of how the Content Manager objects work, you can call pretty much anything if you want to write a Java wrapper around it, or possibly this is done by triggering a BPM process that in turn calls a web service. The case context – a collection of all related metadata, tasks, content, comments, participants and other information associated with the case – is available to any case worker, giving them a complete view of the history and the current state of the case. Some collaboration features are built into the runtime, including presence and synchronous chat, as well as simple asynchronous commenting; these collaborations are captured as part of the case context.
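
As a rough illustration of that event-driven task model (my own hedged sketch, not IBM’s object model), tasks held on the case fire when a matching event occurs:

```python
# Hedged sketch of tasks that live on a case and fire on events; all names
# here are mine, not IBM's case object model.
class Task:
    def __init__(self, name, trigger, backed_by_process=True):
        self.name = name
        self.trigger = trigger  # "user_action" | "new_content" | "property_update"
        self.backed_by_process = backed_by_process  # BPM subprocess vs. checklist item
        self.started = False

class Case:
    def __init__(self, case_id):
        self.case_id = case_id
        self.tasks = []
        self.history = []  # part of the case context available to any case worker

    def on_event(self, event_type, detail):
        self.history.append((event_type, detail))
        for task in self.tasks:
            if task.trigger == event_type and not task.started:
                task.started = True
                action = ("launch BPM subprocess" if task.backed_by_process
                          else "add checklist item")
                print(f"{task.name}: {action} ({detail})")

case = Case("C-1001")
case.tasks.append(Task("ReviewApplication", trigger="new_content"))
case.on_event("new_content", "Application.pdf filed into the case folder")
```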

As you would expect, cases are dynamic and allow case workers to add new tasks for the case at any time. Business rules, although they may not even be visible to the end user, can be defined during design time in order to set properties and trigger events in the case. Rules can be changed at runtime, although we didn’t see an example of how that would be done or why it might be necessary.

There are two perspectives in the Case Manager Builder design environment: a simplified view for the business analysts to define the high level view of the case, and a more detailed view for the technologists to build in more complex integrations and complex decision logic. This environment allows for either start-from-scratch or template-based case solution definitions, and is targeted at the business analyst with a wizard-based interface. Creating a case solution includes defining the following from the business analyst’s view:

  • case properties (metadata)
  • roles that will work on this case, which will be bound to users at runtime
  • case types that can exist within the same case solution
  • document types that can be included in the case or may even trigger the case
  • case data and search views
  • which case views each role will see
  • default folders to be included in the case
  • tasks that can be added to this case, each of which is a process (even if only a one-step process), and any triggering events for the tasks
  • the process behind each of the tasks, which is a simple step editor directly in Case Builder; a system lane in this editor can represent the calling of a web service or a WPS process
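
To illustrate the last item, here is a hedged sketch of the lane-and-step idea, with a system lane standing in for a web service or WPS call; the representation is my own simplification:

```python
# Illustrative lane-and-step model; a "System" lane stands in for a web
# service or WPS invocation. Not the actual step editor's representation.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    lane: str  # a role name, or "System" for automated steps

@dataclass
class TaskProcess:
    task_name: str
    steps: list = field(default_factory=list)

    def run(self):
        for step in self.steps:
            if step.lane == "System":
                print(f"[{step.name}] invoke external service")  # e.g. a WPS process
            else:
                print(f"[{step.name}] route to the {step.lane} inbasket")

process = TaskProcess("VerifyIdentity", steps=[
    Step("Check documents", lane="CaseWorker"),
    Step("Call credit bureau", lane="System"),
    Step("Approve or reject", lane="Supervisor"),
])
process.run()
```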

All of these can be defined on an ad hoc basis, or stubbed out initially using a wizard interface that walks the business analyst through and prompts for which of these things needs to be included in the case solution. Comments can be added to design-time objects such as tasks, allowing for collaboration between designers.

As was made clear in an audience question, the design that a business analyst is doing will actually create object classes in both Content Manager and BPM; this is not a requirements definition that then needs to be coded by a developer. From that standpoint, you’ll need to be sure that you don’t let them do this in your production environment since you may want to have someone ensure that the object definitions aren’t going to cause performance problems (that seemed screamingly obvious to me, but maybe wasn’t to the person asking the question).

From what Levirne said, it sounds as if the simple step editor view of the task process can then be opened in the BPM Process Designer by someone more technical to add other information, implying that every task has a BPM process behind it. It’s not clear whether this is an import/export to Process Designer or two perspectives on the same model, nor whether a task always generates a BPM process or can exist without one, e.g., as a simple checklist item. There were a lot of questions during the session and he didn’t have time to take them all, but I’m hoping for a more in-depth demo/briefing in the weeks to come.

Case analytics, including both dashboards (Cognos BAM) and reports (Excel and Cognos BI reports) based on case metadata, and more complex analytics based on the actual content (Content Analytics), are provided to allow you to review operational performance and determine root causes of inefficiencies. From a licensing standpoint, you would need a Cognos BI license to use that for reporting, and a limited-license Content Analytics version is included out of the box that can only be used for analyzing cases, not all your content. He didn’t cover much about the analytics in this session, which was primarily focused on the design time and runtime of the case management itself.

The end-user experience for Case Manager is in the IBM Mashup Center, a mashup/widget environment that allows the inclusion of both IBM’s widgets and any others that support the iWidget standard and expose their properties via REST APIs. IBM has had the FileNet ECM widgets available for a while to provide some standard ECM and BPM capabilities; the new version provides much more functionality, exposing more of the case context, including metadata and tasks. A standard case widget provides access to the summary, documents, activities and history views of the case, and can link to a case data widget, a document viewer widget for any given document related to the case, and e-forms for creating more complex user interfaces for presenting and entering data as part of the case.
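
The widget wiring works roughly like publish/subscribe. Here is a minimal sketch of that pattern; it is my own construction, and no actual iWidget or REST APIs appear in it:

```python
# Minimal publish/subscribe wiring between widgets; my own construction,
# not the iWidget event mechanism itself.
class EventBus:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers.get(topic, []):
            handler(payload)

bus = EventBus()

# Document viewer widget: listens for document selections from the case widget.
bus.subscribe("case.documentSelected", lambda doc: print(f"Viewer opens {doc}"))

# Case widget: the user clicks a document in the case's Documents view.
bus.publish("case.documentSelected", "LoanApplication.pdf")
```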

Someone I know who has worked with FileNet for years commented that Case Manager looks a lot like the integrated demos that they’ve been building for a couple of years now; although there’s some new functionality here and the whole thing is presented as a neat package, it’s likely that you could have done most of this on your own already if you were proficient with FileNet ECM and some of the other products involved.

We also heard from Brian Benoit of Pyramid Solutions, a long-time FileNet partner who has been an early adopter of Case Manager and responsible for building some of the early templates that will be available when the product is released. He demonstrated a financial account management template, including account opening, account maintenance, financial transaction requests and correspondence handling. In spite of IBM’s claim that there is no migration path from Business Process Framework (BPF), there is a very BPF-like nature to this application; clearly, the case management experience that they gained from BPF usage has shaped the creation of Case Manager, or possibly Pyramid was so familiar with BPF that they built something similar to what they knew already. Benoit said that the same functionality could be built out of the box with Case Manager, but that what they have provided is an accelerator for this sort of application.

Levirne assured me that everything in his presentation could be published immediately, although I’ve had analyst briefings on Case Manager that are under embargo until the official announcement tomorrow, so I’ll fill in any missing details then.

IBM IOD Opening Session: ACM and Analytics

I’m at IBM’s Information On Demand (IOD) conference this week, attending the opening session. There are 10,000 attendees here (including, I assume, IBM employees) for a conference that covers information management of all sorts: databases, analytics and content management. As at other large vendor conferences, they feel obligated to assault our senses in the morning with loud performance art: today, it’s Japanese drummers (quite talented, and thankfully short). From a logistics standpoint, the wifi fell to its knees before the opening session even started (what, like you weren’t expecting this many people??); IBM could learn a few lessons about supporting social media attendees from SAP, which provided a social media section with tables, power and wired internet to ensure that our messages got out in a timely fashion.

Getting back to the session, it was hosted by Mark Jeffries, who provided some interesting and amusing commentary between sessions, told us the results of the daily poll, and moderated some of the Q&A sessions; I’ve seen him at other conferences and he does a great job. First up from IBM was Robert LeBlanc (I would Google his title, but did I mention that there’s no wifi in here as I type?), talking about how the volume of information is exploding, and yet people are starved for the right information at the right time: most business people say that it’s easier to get information on the internet than out of their own internal systems. Traditional information management – database and ECM – is becoming tightly tied to analytics, since you need analytics to make decisions based on all that information and gain insights that help to optimize the business.

They ran some customer testimonial videos, and the term “advanced case management” came up early and often: I sense that this is going to be a theme for this conference, along with the theme of being analytics-driven to anticipate and shape business outcomes.

LeBlanc was then joined on stage by two customers: Mike Dreyer of Visa and Steve Pratt of CenterPoint Energy. Both organizations are leveraging information in order to do business better. For example, Visa used analytics to determine that “swipe-and-go” for low-value neighborhood transactions, such as those at Starbucks, was so low-risk that they didn’t need immediate verification, speeding each transaction and therefore getting your morning latte to you faster. CenterPoint, an energy distributor, uses advanced metering and analytics not only for end-customer metering, but to monitor the health of the delivery systems so as to avoid downtime and optimize delivery costs. They provided insights into how to plan and implement an information management strategy, from collecting the right data to analyzing and acting on that information.

We then heard from Arvind Krishna, IBM’s GM of Information Management, discussing the cycle of information management and predictive analytics, including using analytics and event processing to optimize real-time decisions and improve enterprise visibility. He was then joined on a panel by Rob Ashe, Fred Balboni and Craig Hayman, moderated by Mark Jeffries; this started to become more of the same message about the importance of information management and analytics. I think that they put the bloggers in the VIP section right in front of the stage so that we don’t bail out when it starts to get repetitive. I’m looking forward to attending some of the more in-depth sessions to hear about the new product releases and what customers are doing with them.

Since the FileNet products are showcased at IOD, this is giving me a chance to catch up with a few of my ex-FileNet friends from when I worked there in 2000-1: last night’s reception was like old home week with lots of familiar faces, and I’m looking forward to meeting up with more of them over the next three days. Looking at the all-male group of IBM executives speaking at the keynote, however, reminded me why I’m not there any more.

Disclosure: In addition to providing me with a free pass to the conference, IBM paid my travel expenses to be here this week. I flew Air Canada coach and am staying at the somewhat tired Luxor, so that’s really not a big perq.

Integrating BPM and Enterprise Architecture

Michael zur Muehlen presented this morning on integrating BPM and enterprise architecture, based on work that he’s done with the US Department of Defense. Although they use the DoDAF architecture framework in particular, the concepts are applicable to other similar EA frameworks. Like the Zachman framework, DoDAF prescribes the perspectives that are required, but doesn’t specify the artifacts (models) required for each of those perspectives; this is particularly problematic in DoD EA initiatives where there are likely to be many contractors and subcontractors involved, all of whom may use different model types to represent the same EA perspective.

He talked briefly about what makes a good model: the information must be correct, relevant (and complete) and economical (with respect to level of detail), as well as clear, comparable (linked to reality) and systematic. From there, he moved on to their selection of BPMN as the dominant standard for process modeling, since it has better event handling than UML activity diagrams, better organizational modeling than IDEF0, and better cross-organizational modeling than simple flowcharts. However, many tools support only a subset of BPMN – particularly those intended for process execution rather than just process modeling – and some tools have non-standard enhancements to BPMN that inhibit interoperability. Another issue is that the BPMN specification is enormous, with over 100 elements, including different constructs that mean the same thing, such as explicit versus implicit gateways.

They set out to design primitives for the use of BPMN, “outlawing” certain symbols such as complex gateways and developing best practices for BPMN usage. They also mapped the frequency of BPMN symbol usage across internal DoD models, models that Michael sees in his practice as a professor of BPM at Stevens Institute of Technology, and samples found on the web, and came up with a distribution of the BPMN elements by frequency of usage. This research led to the creation of the subsets that are now part of the BPMN standard, as well as usage guidelines for BPMN in terms of both primitives and patterns.
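
A back-of-the-envelope version of that frequency analysis is easy to sketch, assuming the models are available as BPMN XML files; the file layout is hypothetical:

```python
# Sketch of counting BPMN element usage across a set of models, assuming
# they are stored as BPMN XML files; paths are hypothetical.
import xml.etree.ElementTree as ET
from collections import Counter
from pathlib import Path

def element_frequency(model_paths):
    counts = Counter()
    for path in model_paths:
        for elem in ET.parse(path).iter():
            counts[elem.tag.split("}")[-1]] += 1  # strip the XML namespace
    return counts

# Example usage (no sample models ship with this sketch):
# counts = element_frequency(Path("models").glob("*.bpmn"))
# for tag, n in counts.most_common(10):
#     print(tag, n)
```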

In addition to the BPMN subsets (e.g., the most commonly implemented Descriptive subclass), they developed naming conventions to use within models, driven by the vocabulary related to their domain content. This idea of separating the control of model structure from the vocabulary makes sense: the first is more targeted at an implementer, while the second is targeted at a domain/business expert. This in turn led to vocabulary-driven development, where the relationship between capabilities, activities, resources and performers (CARP analysis) is established as a starting point for the labels used in process models, data models (or ontologies/taxonomies), security models and more as the enterprise architecture artifacts are built out.
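
As a rough illustration of vocabulary-driven labeling (entirely my own construction), CARP-style tuples could seed consistent verb-noun activity labels:

```python
# CARP-style tuples (capability, activity, resource, performer) used to
# generate consistent verb-noun activity labels; entirely illustrative.
carp = [
    ("Claims handling", "assess", "damage report", "adjuster"),
    ("Claims handling", "approve", "payment", "supervisor"),
]

for capability, activity, resource, performer in carp:
    label = f"{activity.capitalize()} {resource}"
    print(f"{label}  (lane: {performer}, capability: {capability})")
```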

Having defined how to draw the right models and how to select the right words to put in them, they looked at different levels of models for different purposes: models focused on milestones, handoffs, decisions and procedures. These are not just progressively more detailed versions of the same model, but different views on the process. The milestones view is a high-level view of the major process phases; the handoffs view looks at transitions between lanes, with all activities within a lane rolled up to a single activity, primarily showing the happy path; the decisions view adds major decision points and exception/escalation paths; and the procedures view shows a full requirements-level view of the process, i.e., the greatest level of detail that a business analyst is likely to create before involving technical resources to add things such as service calls.
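
The handoffs view, in particular, is mechanical enough to sketch: collapse consecutive activities in the same lane so that only the transitions between lanes remain. The model representation below is my own simplification:

```python
# Sketch of deriving the "handoffs" view: collapse consecutive activities
# in the same lane into one, leaving only the inter-lane transitions.
def handoff_view(steps):
    """steps: ordered list of (lane, activity) tuples for the happy path."""
    rolled_up = []
    for lane, activity in steps:
        if rolled_up and rolled_up[-1][0] == lane:
            continue  # same lane: roll into the previous activity
        rolled_up.append((lane, f"{lane} work"))
    return rolled_up

detailed = [
    ("Intake", "Receive claim"), ("Intake", "Validate forms"),
    ("Adjuster", "Assess damage"), ("Adjuster", "Estimate cost"),
    ("Finance", "Issue payment"),
]
print(handoff_view(detailed))
# [('Intake', 'Intake work'), ('Adjuster', 'Adjuster work'), ('Finance', 'Finance work')]
```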

To finish up, he tied this back to the six measures of model quality and showed how this approach based on primitives conforms to those measures. They’ve achieved a number of benefits, including minimizing modeling errors, ensuring that models are clear and consistent, and ensuring that the models can be converted to an executable form. I’m seeing increased interest among my clients and in the marketplace in how BPM and EA can work together, so this was a great example of how one large organization manages to do it.

Michael posted earlier this year on the DoDAF subset of BPMN (in response to a review that I wrote of a BPMN update presentation by Robert Shapiro). If we go back a couple of years before that, there was quite a dust-up in the BPMN community when Michael first published the usage distribution statistics – definitely worth following the links to see where all this came from.

The Rules For Process

I’ve been pretty busy here at the Building Business Capability conference the past two days with little time for blogging, and with two presentations to do today, I don’t have much time, but wanted to attend Roger Burlton’s “The Rules For Process” keynote, which he refers to as his business process manifesto. After some terms and meta-rules (e.g., short, jargon-free and methodology-neutral), he got into his eight main principles:

  1. A business process is a unique capability of an organization.
  2. A business process exists within a clearly defined business context.
  3. The name of a business process must be consistently structured, unambiguous and commonly used.
  4. A model of a business process holds knowledge about a business process.
  5. A model of a business process associates a business process with other capabilities of the organization.
  6. A business process is guided by the business’ strategy and its policies.
  7. The performance of a business process is measured and assessed in business terms.
  8. A business process is an essential asset of the organization.

He spent quite a bit of time delving into each of these principles in detail, such as describing a business process as an action, not a policy, business rule or technology application.

I’m not sure if Roger is considering publishing a paper on this; definitely lots of good information about what business processes are and are not, which could help many people with their business process capture efforts. There’s apparently a discussion about this on the BPTrends LinkedIn group where you can find out more and join in the conversation, although I haven’t found it yet.

Building Business Capability Conference: Rules and Process and Analysis, Oh My!

After a welcome by Gladys Lam, the conference kicked off with a keynote by Ron Ross, Kathleen Barret and Roger Burlton, chairs of the three parts of the conference: Business Rules Forum, Business Analysis Forum and Business Process Forum. This is the first time that these three conferences have come together, although last year BPF emerged as more than just a special interest track at BRF, and it makes a lot of sense to see them together when you consider the title of the conference: Building Business Capability.

The keynote was done as a bit of a panel, with each of the three providing a view on the challenges facing organizations today, the capabilities required to tackle these challenges, and how this conference can help you to take these on. Some themes:

  • Lack of agility is a major challenge facing organizations today. To become more agile, design for change, including techniques like externalizing rules from processes and applications for late binding; a small code sketch of this idea follows the list.
  • Consider effectiveness before efficiency, i.e., make sure that you’re doing the right thing before seeking to optimize it. In the vein of “doing the right thing”, we need to change corporate culture to focus on customer service.
  • Structured business vocabularies are important for effectiveness, including things like rules vocabularies and BPMN. Roger pointed out that we need to keep things simple within the usage of these vocabularies, and jokingly challenged us to create a valid process model containing all 100+ BPMN elements.
  • The business analyst’s role will transform in the next five years as process, rules and decision tools converge and business architecture gains more significance. BAs need to step up to the challenge of using these tools and related methodologies, not just write requirements, and need to be able to assess the return on investment of previous business decisions to assist with future directions.
  • There is no conflict between the rules and process domains, they’re complementary. I often joke that business process people want to embed all rules into their process maps and just turn them into big decision trees, whereas business rules people want the business process to have a single step that calls a rules system, but the truth is somewhere in between. I’ve written and presented a lot about how rules and process should work together, and know that using them together can significantly increase business process agility.
  • It’s not about aligning business and IT, it’s about aligning business strategy with IT capability. Don’t focus on delivering IT systems, focus on delivering business solutions.
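
On the late-binding point, here is a tiny sketch of externalizing a rule from the process logic; the rule store and all names are illustrative only:

```python
# Sketch of "externalize rules for late binding": the process step consults
# a rule looked up at runtime instead of hardcoding the decision, so the
# rule can change without redeploying the process. Purely illustrative.
RULES = {
    "auto_approve_limit": lambda amount: amount < 1000,
}

def approval_step(amount):
    rule = RULES["auto_approve_limit"]  # bound at runtime, not at design time
    return "auto-approved" if rule(amount) else "route to underwriter"

print(approval_step(250))    # auto-approved
print(approval_step(5000))   # route to underwriter

# Changing behavior means updating the rule repository, not the process:
RULES["auto_approve_limit"] = lambda amount: amount < 2500
print(approval_step(2000))   # now auto-approved
```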

Julian Sammy of IIBA tweeted that he was recording the keynote and will put some of it up on YouTube and the IIBA site, so watch for that if you want to see the highlights on video. You can also follow the conference Twitter stream at #bbc2010.

Thriving In A Process-Driven World

Clay Richardson and Dave West (definitely the two snappiest dressers at Forrester) opened the second day of the Forrester Business Process and Application Delivery Forum with a keynote on thriving in a process-driven world by shifting both your business and IT culture. These shifts are hard work and fraught with risk, but necessary in order to achieve transformation. It’s critical to lead change, not just manage it, by creating change agents inside your organization.

They discussed some tools for doing this: identifying value streams that can point everyone in the same direction; using process as a common language for transformation, although not necessarily a common process representation; extending agile thinking to the entire enterprise; and lean governance that starts at the top but pushes down responsibility to empower teams to make decisions.

To achieve agility, it’s necessary to align business and IT into integrated process teams and adopt agile processes for those integrated teams, as well as selecting tools and architectures that support change.

Good governance is less about telling people what to do (and what not to do), and more about educating people on why they need to do certain things and empowering them to make the right choices. Many successful organizations adopt not just centers of excellence, but build communities of practice around those CoEs.

Since Richardson focuses on business process and West on agile software development, this was an interesting hybrid of ideas that spanned both business and IT.

Dynamic Case Management In the Public Sector

There was nothing that could have enticed me to attend the Lean Six Sigma presentation this late in the afternoon, so instead I opted for the public sector track (which is not really my area of interest) for a discussion on dynamic case management (which is my area of interest) by Craig Le Clair.

Government workers have low levels of empowerment and resourcefulness, for both cultural reasons and lack of technology to support such activities. So why is dynamic case management important in government? He listed several reasons, including the increased need to manage the costs and risks of servicing higher numbers of citizen requests, the less structured nature of government jobs, demographic trends that will see many experienced government workers retiring soon, and new regulations that impact their work.

Forrester defines dynamic case management as “a semistructured but also collaborative, dynamic, human and information-intensive process that is driven by outside events and requires incremental and progressive responses from the business domain handling the case”. A case folder at the center of the case is surrounded by people, content, collaboration, reporting, events, policies, process and everything else that might impact that case or be consumed during the working of the case. It’s a combination of BPM, ECM and analytics, plus new collaborative user interface paradigms. Forrester is just wrapping up a wave report on dynamic case management (or adaptive case management, as it is also known), and we’re seeing a bit of the research that’s going into it.

Le Clair discussed the three case management categories – service requests, incident management and investigative – and showed several government process examples that fit into each type as well as some case studies. He moved on to more generic Forrester BPM advice that I heard earlier in sessions today, such as leveraging centers of excellence, but included some specific to case management such as using it as a Lean approach for automating processes.

39 minutes into a 45-minute presentation, he turned it over to his co-presenter, Christine Johnson from Iron Data, although he assured her that she still had 15-20 minutes. 🙂 She walked through a case lifecycle for a government agency dealing with unemployment and disability claims through retraining and other activities: this includes processes for referral/intake, needs assessment and service plan, appeals, and so on. Some portions, such as intake, had more of a structured workflow, whereas others were less structured cases where the case worker determined the order and inclusion of activities. There were some shockingly detailed diagrams, not even considering the time of day, but she redeemed herself with a good list of best practices for case management implementations (including, ironically, “clutter-free screens”), covering technology, design and process improvement practices.

Interestingly, Johnson’s key case study on a federal agency handling disability claims started as an electronic case file project – how many of those have we all seen? – and grew into a case management project as the client figured out that it was possible to actually do stuff with that case file in addition to just pushing documents into it. The results: paperless cases, and reduced case processing times and backlogs.

Building Process Skills To Scale Transformation

Connie Moore (or “Reverend Connie” as we now think of her 😉 ) gave a session this afternoon on process skills at multiple levels within your organization, and how entire new process-centric career paths are emerging. Process expertise isn’t necessarily something that can be quickly learned and overlaid on existing knowledge; it requires a certain set of underlying skills, and a certain amount of practical experience. Furthermore, process skills are migrating out of IT into the business areas, such as process improvement specialists and business architects.

Forrester recently did a role deep dive to take a look at the process roles that exist within organizations, and found that different organizations have very different views of business process:

  • Immature, usually smaller organizations with a focus on automation, not the process; these follow a typical build cycle with business analysts as traditional requirements gatherers.
  • Aspiring organizations that understand the importance of process but don’t really know fully what to do with it: they’ve piloted BPM projects and may have started a center of excellence, but are still evolving the roles of business analysts and other participants, and searching for the right methodologies.
  • Mature organizations already have process methodologies, and the process groups sit directly in the business areas, with clear roles defined for all of the participants. They will have robust process centers of excellence with well-defined methodologies such as Lean, offering internal training on their process frameworks and methods.

She talked about the same five roles/actors that we saw in the Peters/Miers talk, and discussed how different types of business process professionals learn and develop skills in different ways. She mentioned the importance of certification and training programs, citing ABPMP as the up-and-coming player here with about 200 people certified to date (I’m also involved in a new effort to build a more open process body of knowledge), and listed the specific needs of the five actors in terms of their skills, job titles and business networks, using examples from some of the case studies that we’ve been hearing about, such as Medco. The job titles, as simple as that seems, are pretty important: they’re part of the language that you create around process improvement within your organization.

Process roles are often concentrated in a process center of excellence, which can start small: Moore told the story of one organization that started with four developers, one business analyst and one enterprise architect. Audience members echoed that, with CoEs usually under 10 people, and many organizations without a CoE at all. You also need a mix of business and IT skills in a CoE: as one of her points stated, you can do this without coding, but that doesn’t mean that a business person can do it, which is especially true as you start using more complete versions of BPMN, for example. There’s definitely a correlation (although not necessarily causation) between a CoE and BPM project success; I talked about this and some other factors in building a BPM CoE in a webinar and white paper that I did for Appian last year.

She had a lot of great quotes from companies that they interviewed in their process roles study:

“These suites still required you to have [a] software engineering skill set”

“The biggest challenge is how to develop really good process architects”

“They [process/business analysts] usually analyze one process and have limited ability to see beyond the efforts in front of them”

“Process experts are a rare type of talent”

“We thought the traditional business analyst would be the right source, but we were horribly disappointed”

A number of these comments focus on the shortcomings of trying to retrain more traditionally-skilled people, such as business analysts, for process work: it’s not as easy as it sounds, and requires significantly better tooling than they are likely using now. You probably don’t need the 20+ years of experience that I have in process projects, but you’re not going to be able to just take one of your developers or business analysts, send them on a 3-day course, and have them instantly become a process professional. There are ways to jump-start this: for example, looking at cloud-based BPM so that you need fewer back-end technical skills to get things going, and considering mentoring and pairing arrangements with existing process experts (either internal or external) to speed the process.

Phil Gilbert On The Next Decade Of BPM

I missed Phil’s keynote at BPM 2010 in Hoboken a few weeks ago (although Keith Swenson very capably blogged it), so I was glad to be able to catch it here at the Forrester BP&AD forum. His verdict: the next decade of BPM will be social, visible and turbulent.

Over the past 40-50 years, hard-core developers have become highly leveraged, such that one developer can support about five other IT types, who in turn support 240 business end users. Most of the tools to build business technology, however, are focused on those six people on the technical side rather than the 240 business people. One way to change this is to allow for self-selected collaboration and listening: allowing anyone to “follow” whoever or whatever they’re interested in, creating a stream of information that is customized to their needs and interests.
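
A minimal sketch of that follow model, using my own data structures rather than anything from IBM’s implementation, fans events out to whoever has opted in:

```python
# Sketch of a self-selected "follow" model: users follow people or artifacts,
# and events fan out into each follower's personalized stream. The names and
# data model are mine, not Blueworks Live internals.
from collections import defaultdict

follows = defaultdict(set)   # subject -> set of followers
streams = defaultdict(list)  # user -> their personalized stream

def follow(user, subject):
    follows[subject].add(user)

def post_event(subject, event):
    for user in follows[subject]:
        streams[user].append(f"{subject}: {event}")

follow("sandy", "AccountOpening process")
follow("sandy", "#BPM")
post_event("AccountOpening process", "revised by phil")
post_event("#BPM", "new article on social BPM")
print(streams["sandy"])
```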

Earlier today, I received an email about IBM’s new announcement on IBM Blueworks Live, and Phil talked about how it incorporates this idea of stream communication to allow you to both post and follow information. It will include information from a variety of sources, such as BPM-related Twitter hashtags and links to information written by BPM thought leaders. Launching on November 20th, Blueworks Live will include both the current BPM BlueWorks site as well as the IBM BluePrint cloud-based process modeling capability. From their announcement email that went out to current Blueprint users:

The new version will be called IBM Blueworks Live and you’ll be automatically upgraded to it.  Just like in past releases, all your process data and account settings are preserved. All of the great Blueprint features you use today will be there, plus some new capabilities that I think you’ll be very excited to use.

Blueworks Live will allow your team to not only collaborate on daily tasks, but also gain visibility into the status of your work. You’ll be able to automate processes that you run over e-mail today using the new checklist and approval Process App templates. Plus, you’ll have real-time access to expert online business process communities right on your desktop, so you can participate in the conversation, share best practices, or ask questions.

It’s good to see IBM consolidating these social BPM efforts; the roadmap for doing this wasn’t really clear before this, but now we’re seeing the IBM Blueworks community coming together with the Lombardi Blueprint tools. I’m sure that there will still be some glitches in integration, but this is a good first step. Also, Phil told me in the hallway before the session that he’s been made VP of BPM at IBM, with both product management and development oversight, which is a good move in general and likely required to keep a high-powered individual like Phil engaged.

With the announcement out of the way, he moved on with some of the same material from his BPM 2010 talk: a specific large multi-national organization has highly repeatable processes representing about 2.5% of their work, somewhat repeatable processes are 22.5%, while barely repeatable processes form the remaining 75%, and are mostly implemented with tools like Excel over email. Getting back to the issue from the beginning of the presentation, we need to have more and better tooling for those 75% of the processes that impact many more people than the highly repeatable processes that we’re spending so much time and money implementing.

With Blueworks Live, of course, you can automate these long tail processes in a matter of seconds 😉 but I think that the big news here is the social stream generated by these processes rather than the ease of creating the processes, which mostly already existed in Blueprint. Instant visibility through activity streams.