Toronto Wiki Tuesday: Using Wikis with Enterprise Content and Process Management

I’m presenting at Toronto Wiki Tuesday tonight on the interaction between wikis, ECM and BPM.

The organizers might be setting up to broadcast this live on Ustream; if so, I’ll update this post later today with the URL of the video stream.

Update: the presentation will be streamed live here at 7pm ET tonight.

Pegasystems SmartBPM V6

I’m wrapping up my coverage of this month’s round of four back-to-back conferences with the product reviews, which typically come from multiple meetings before, during and after a vendor conference, as well as some time spent pondering my notes and screen snapshots.

I had a remote product demo of SmartBPM prior to PegaWORLD, then a briefing from Kerim Akgonul at the conference. A lot of the changes to the product over the past year and a half have been focused on making it easier to use, trying to fight the perception that it’s a great product but that the inherent complexity makes it hard to use. In fact, the two main themes that I saw for this version are that it’s easy to use, and easy to share through design and runtime collaboration.

They started to address the complexity issue and promote the business agility message in Version 5 more than a year ago with a number of new tools:

  • Application Profiler to link requirements directly to developed applications and processes, replacing written specifications; the Application Accelerator then generates an application from this profile, as well as documentation
  • Improved non-technical process mapping with a shared model between business analysts and developers, including having the BA create UI forms
  • Visual Case Manager for mapping data from other systems into a case management application via various shared keys and search terms
  • Internet Application Composer for creating mashups with their own portlets and web components, plus other internal or external web components

Version 6 continues this direction, focusing on deploying solutions faster: a number of new gadgets allow building out the user experience and provide better visibility into what’s happening within business processes, and direct feedback from process participants to developers/analysts on required changes puts more of the ability to change processes into the hands of the business.

They’ve also started to use their own agile-like methodology and tools for internal projects, since the tools provide frameworks for project management, test management and documentation. Not only has this resulted in more rapid development of their own products and better alignment with the product requirements, it has eliminated the monolithic product release cycle in favor of smaller incremental releases that deliver new functionality sooner: they’ve released Pega Chat and other new features as modules without doing a full product release. With 1,200 Pega employees and 200 new ones from their Chordiant acquisition, introducing ways to shorten their product release cycle is an encouraging sign that they’re not letting their increasing size weigh them down in product innovation.

Discovery map view

Taking a look at the product, there’s a new Discovery Map view of a process, very similar to what you would see in the outline view of other process discovery tools. The difference from other tools, however, is that this is a directly executable process: a shared model, rather than requiring a transfer to an execution environment (and the problems that come along with round-tripping). That ties in neatly with the “easy to use” theme, along with role-based views, reduced navigation complexity and case manager functionality.

The other theme, “easy to share”, comes out in a number of ways. First of all, there’s a Facebook-style news feed of system-generated and team member alerts that shows who’s working with which processes and comments that participants have on them, including RSS feeds of the news feed and other sharing options to make it easy for people to consume that information in the format and tool that they choose; I’ve seen this in ARISalign and I suspect it will become standard functionality in most process discovery and design tools. With some of the sharing and bookmarking options, I don’t think that Pega even knows how (or if) customers are going to use them, but realizes that you have to offer the functionality in order to start seeing the emergent usages.

User adds change request to designer

The second collaboration win is direct feedback from the participant of an executing process to the process designer. This is the type of functionality that I commented on a couple of months ago when Google came out with a way to provide feedback directly on their services from within the service (my tweet at the time was please, please, please can enterprise apps add a feature like this?). In SmartBPM, a user within an executing process drags a pushpin icon to the location of the problem on the screen, types a note in a popup and adds a category; when they click “Send” on the note, the current state (including a screenshot) is captured, and an item is added to the process designer’s feedback list in their news feed. Hot.
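
The capture-and-route step can be sketched as a simple data structure; everything here (the field names, and a plain list standing in for the designer’s news feed) is my illustration, not Pega’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FeedbackItem:
    process: str
    step: str
    screen_location: tuple        # (x, y) where the pushpin was dropped
    note: str
    category: str
    screenshot_ref: str           # pointer to the captured screen image
    captured_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

designer_feedback = []  # stands in for the designer's feedback list

def send_feedback(item: FeedbackItem):
    # Capture the item and route it to the process designer's inbox
    designer_feedback.append(item)

send_feedback(FeedbackItem(
    process="ClaimIntake", step="VerifyCoverage",
    screen_location=(420, 185),
    note="Policy number field is too short for new policy format",
    category="UI defect",
    screenshot_ref="screens/claimintake-20100614-0932.png",
))
print(len(designer_feedback), designer_feedback[0].category)  # 1 UI defect
```

The key point is that the feedback carries the execution context (process, step, screen state) along with the note, so the designer sees the problem in situ rather than a free-floating change request.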

We also reviewed process optimization: optimization criteria are selected from the process attributes, and calculated based on actual process execution. A decision tree/table can be directly generated from the optimization results and added as a rule to the process: effectively, this automates the discovery of business rules for currently manual steps such as assignment, allowing for more process automation.
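
The rule-discovery idea can be illustrated outside any product: given a history of manually-made assignment decisions, derive a decision table that picks the most common choice for each combination of process attributes. This is a minimal Python sketch, not Pega’s algorithm; the attributes and data are invented:

```python
from collections import Counter, defaultdict

def derive_assignment_table(history):
    """Derive a decision table from observed manual assignments: for each
    combination of process attributes, pick the assignee chosen most often
    in actual process executions."""
    votes = defaultdict(Counter)
    for attributes, assignee in history:
        votes[attributes][assignee] += 1
    return {attrs: counts.most_common(1)[0][0]
            for attrs, counts in votes.items()}

# Observed executions: ((region, amount band), assignee picked by a supervisor)
history = [
    (("EMEA", "high"), "senior_team"),
    (("EMEA", "high"), "senior_team"),
    (("EMEA", "low"), "general_pool"),
    (("APAC", "high"), "senior_team"),
    (("APAC", "low"), "general_pool"),
    (("EMEA", "low"), "general_pool"),
]

table = derive_assignment_table(history)
# The derived table can now act as a rule, automating the assignment step.
print(table[("EMEA", "high")])   # senior_team
print(table[("APAC", "low")])    # general_pool
```

A production implementation would obviously need confidence thresholds and human review before a derived rule replaces a manual step, but the mechanics are exactly this: mine the execution history, generate the rule.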

User builds custom subprocess in discovery map view

The third collaboration-type functionality shown was the ability to spin off ad hoc subprocesses from any point in a structured process: just select “Build a Custom Process” from the action menu on a step to open up a new discovery map for creating the new subprocess, then add steps to create the flow. There’s only an outline view, no flow logic, and the step names are the people to whom the steps are assigned: pretty simple, with little or no training required for everyday users.

Later, all custom subprocesses created for a given process can be consolidated and summarized into suggested changes, and directed to a process designer for review; this may result in the original structured process being reworked to include some of the common patterns, or just have them left as ad hoc if they are not frequent enough to justify the changes.
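
The consolidation step can be sketched simply: treat each ad hoc subprocess as its ordered list of steps, and surface the patterns that recur often enough to justify changing the structured process. A toy illustration (the threshold and step names are invented, not SmartBPM’s logic):

```python
from collections import Counter

# Each ad hoc subprocess is just the ordered sequence of steps (named for
# the person or role each step was assigned to) that a user spun off.
ad_hoc_runs = [
    ("legal_review", "manager_signoff"),
    ("legal_review", "manager_signoff"),
    ("translate_document",),
    ("legal_review", "manager_signoff"),
]

def suggest_changes(runs, threshold=3):
    """Consolidate ad hoc subprocesses: patterns recurring at least
    `threshold` times become suggested additions to the structured
    process; rarer ones stay ad hoc."""
    counts = Counter(runs)
    return [pattern for pattern, n in counts.items() if n >= threshold]

print(suggest_changes(ad_hoc_runs))  # [('legal_review', 'manager_signoff')]
```

Here the legal review/signoff pair occurs often enough to be flagged to the process designer, while the one-off translation step remains ad hoc.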

Akgonul sees BPM and CRM converging; that’s certainly the direction that Pega has been taking recently, including (but not limited to) the Chordiant acquisition, and similar opinions are popping up elsewhere. As BPM products continue to turn into application development suites meant for building full enterprise applications, the boundaries start to blur.

One thing that I liked about the remote demo that has nothing to do with the product is that it was hosted on an Amazon EC2 instance: it’s only a short step from an EC2-based demo to providing a preloaded EC2 instance for those of us who like to get our hands on the products but don’t want to handle our own installation. For technical analysts like me, that’s a game-changer for doing product reviews.


As a matter of disclosure, Pega paid my travel expenses to attend their conference, and they are a client of mine for creating webinars, but I am not compensated for writing about them here on my blog.

Will Social Revive Interest In BPM? Will BPM Make Social Relevant?

Social BPM saw a flurry of activity last week in the BPM blogosphere for some reason; I’ve been writing and presenting on social BPM for about four years now, so most of this isn’t new to me, but it’s good to see the ideas starting to permeate.

Keith Swenson writes on who is socializing in social BPM, and how the major analysts’ view of social BPM is that the BPM application developers are socializing, not the end users; that misses the point, in Keith’s (and my) opinion, since it ignores the runtime social/collaborative aspects as well as the blurring of the boundary between designing and participating in processes. He writes:

The proper use of social software in the business will eliminate the need for process designers.  Everyone will be a designer, in the way that everyone is a writer in the blogosphere.

This last part is not strictly true: everyone could be a writer in the blogosphere, but in reality, only a tiny fraction of those who read blogs actually write blogs, or even comment on blogs. The same will likely occur in runtime collaboration in BPM: only a fraction of users will design processes, even though all have the capability to do so, but all will benefit from it.

Then, at SAPPHIRE this week, I had a conversation with Enterprise 2.0 adoption expert Susan Scrupski, founder of the 2.0 Adoption Council, about her characterization of SAPPHIRE as 2.0 Reality Rehab, and her distressing discovery that 0 out of 20 SAP customers who she interviewed on the show floor had ever heard of Enterprise 2.0.

Distressing to her, but not so surprising to me: enterprise social software is not exactly mainstream with a lot of large companies that I work with, where wikis are used only by IT for tracking projects but not permitted into the user base at large, and blogs are viewed as disreputable sources of information. Imagine the reception that I get when I start talking to these companies about social BPM concepts: they don’t exactly warm up to the idea that users should design their own processes.

Before you jump all over me with examples of successful Enterprise 2.0 and social BPM adoption stories, I’m talking about mainstream adoption, not just in the echo chamber of those of us who think that this stuff is great, and root out the case studies like the rare and valuable gems that they are.

As a champion for Enterprise 2.0, and with only a few short weeks to go before the Enterprise 2.0 conference, Susan is keen to see more meaningful adoption within enterprises: not just more, but in applications that really make a difference for the core business of the company. This is, I believe, where social BPM can help: it’s an application that lends itself particularly well to collaboration and other social aspects, while providing a core critical function within enterprises. I’d love to see Enterprise 2.0 software vendors start to tackle core enterprise software, such as BPM, CRM and ERP, and stop building more enterprise wiki and blogging platforms. Think of it as 2.0 Reality Rehab for the whole Enterprise 2.0 industry.

SAPPHIRENOW Day 1 Wrapup

Not a lot of blogging yesterday; a couple of good keynotes (but I’m not going to blog about Richard Branson, Al Gore and Colin Powell), a press conference, the sustainability roundtable, a couple of other short meetings and networking at Blogger Central. Some links to items of interest:

Today, I’ll be getting a briefing on NetWeaver BPM, what’s happened in the last few months and what’s coming up in future releases; I haven’t heard a peep since TechEd last fall.

Can We Make A Sustainability-BPM Connection?

Peter Graf, SAP’s Chief Sustainability Officer, and Scott Bolick, VP Sustainability, spoke to a group of bloggers and analysts at a sustainability roundtable today. Graf started with SAP’s definition of sustainability: increase short and long-term profitability by holistically managing economic, social and environmental risks and opportunities. Sustainability changes business processes drastically, especially those processes that span multiple organizations. SAP is leading by example, improving their own internal efficiencies by enacting sustainability measures such as reducing carbon emissions, but also sees its software as an enabler for other organizations to implement sustainable solutions. SAP has a number of customers that are using SAP solutions across five general areas of sustainability: carbon impact, environmental compliance, people health and safety, product safety, and sustainability performance management. In addition to cost savings, sustainability can become a recruitment factor: younger people, in particular, want to work for a company that shares their environmental concerns.

They have made sustainability a focus of presentations at this conference, but also have made a number of sustainable logistics choices at the actual event. They have a new sustainability report that has already become hugely popular for fostering stakeholder dialog, and a sustainability map structured by line of business and business case. They are the first technology company to join the Sustainability Consortium, and we heard about acquisitions, customers and partners that are all focused on sustainability.

SAP sees Business Objects Explorer as being a key tool for helping to identify areas for sustainability; for example, providing an analytical view into office and plant costs to determine where unusual electricity consumption is occurring. SAP uses this internally for their own sustainability data analysis, and had a nice spiffy iPad version to show us, since you can’t have a conference these days without showing an iPad at least once. Analytics, especially real-time dashboards that allow for drilling into data, have been gaining popularity in a number of areas lately: we’ve seen everything from academic papers to mainstream reports in The Economist discussing analytics, and this is just one more high-profile example.

Bolick then took the stage to talk about their new sustainability report in more detail; if you want more information on everything from the basic definitions of sustainability to measuring performance to more complex solutions, check it out online. This is not a static PDF that you’ll never read; this is an interactive website that includes up-to-date SAP sustainability news and social content, as well as their own analytics tools allowing a drill-down into performance (e.g., carbon footprint reduction) numbers. The sustainability map is pretty interesting (under the Solutions tab), showing all the different targets for sustainability, organized by who is responsible for solutions in that area.

SAP Sustainability Map

There’s a pretty strong commitment to corporate transparency from SAP: they show both positive and negative performance measures in the report, such as the significant drop in employee engagement. This would make a great tool for other companies to measure and publish their own sustainability measures; Tom Raftery asked when they planned to productize a sustainability report generator for their customers, but since this is currently pretty specific to SAP’s operations, it’s not clear how easy that would be to do; they spoke about the potential to provide at least part of this as an on-demand solution, as well as providing benchmark performance data to help companies measure their “return on sustainability”.

The conversation came back to business processes, and the impact of IT in enabling more efficient and sustainable processes. There’s a key piece missing, however: their focus today was on analyzing sustainability performance data for human consumption, but I’m not hearing anything about using those analytics as events to feed back into any sort of automated process optimization, where optimization in this sense would be sustainability performance optimization rather than the usual type of process optimization that we do. I suspect that much of this sort of optimization is still fairly manual due to the nature of the measurement and what is required to optimize it (e.g., number of women in the workforce in order to create a more sustainable workforce), and also since many of these are such high level measures that they don’t relate to just a single process: optimizing sustainability performance is up in the first row of your enterprise architecture, and over in those columns dealing with motivation, and we haven’t yet worked out all the transformations needed to map that down to the nitty-gritty of actual business processes and rules.

Credit to Jon Reed for the title of this blog post; I was in the blogger area of the communications center (did I mention that SAP’s treatment of media in general and social media in particular really rocks?) and I told him my impressions of the roundtable and how I thought they should have more of a focus on a round-trip push back to BPM, and he popped out the phrase “the sustainability-BPM connection”. Thanks, Jon!

BPM Summer Camp Is Starting!

As the weather gets warmer, don’t you just naturally think of BPM? Okay, maybe that’s just me.

I’ll be presenting on a series of three webinars for Active Endpoints over the summer, starting this week on Thursday, that we’re calling BPM Summer Camp. First up: “Team Dynamics in BPM Projects: Avoiding the Pitfalls of Forced Marriage,” in which I’ll discuss what it takes to make a great cross-departmental BPM team, and some of the challenges that you might face in building that team.

We’ll do live Q&A at the end, and I’ll warn you that Active Endpoints will keep the session going until all the questions are answered (last time, we ran over by 30 minutes), so you might want to plan a bit of slack in your schedule. They always make their webinars available for replay afterwards on iTunes; just search for VOSibilities on iTunes and subscribe to the free podcast to get both audio and video podcasts, including webinar replays.

Open Source BPM with Alfresco’s Activiti

When Tom Baeyens announced that he and Joram Barrez stepped down from the jBPM project, he hinted about a new project, but details have been sparse until now except for a post that stated that they’re working on an open source BPMN 2.0 offering, plus one that gave unprecedented (for Tom) attention to ECM, which should have tipped me off as to their direction. Turns out that they have both joined Alfresco and are spearheading Activiti, an Apache-licensed open source BPM project that announced its Alpha 1 release today with a planned November GA date. From the press release:

An independently-run and branded open source project, Activiti will work independently of the Alfresco open source ECM system. Activiti will be built from the ground up to be a light-weight, embeddable BPM engine, but also designed to operate in scalable Cloud environments. Activiti will be liberally licensed under Apache License 2.0 to encourage widespread usage and adoption of the Activiti BPM engine and BPMN 2.0, which is being finalized as standard by OMG.

I met Tom face-to-face a couple of years ago when we ended up at different conferences in the same conference center and had a chat about total BPM world domination; interestingly, at the time he expressed that “BPMN should stick to being a modeling notation…and the mapping approach to concrete executable process languages should be left up to the vendors”; obviously, BPMN 2.0 execution semantics have changed his mind. 😉

Activiti Modeler - process design

John Newton, CTO of Alfresco, and Tom Baeyens, in his new role as Chief Architect of BPM, briefed me last week on Activiti. The project is led by Alfresco and includes SpringSource, Signavio and Camunda; Alfresco’s motivation was to have a more liberally-licensed default process engine, although they will continue to support jBPM. Alfresco will build a business around Activiti only for content-centric applications by tightly integrating it with their ECM, leaving other applications of BPM to other companies. I’ll be very interested to see the extent of their content-process integration, and if it includes triggering of process events based on document state changes as well as links from processes into the content repository.

They believe that BPEL will be replaced by BPMN for most general-purpose BPM applications, with BPEL being used only for pure service orchestration. Although that’s a technically virtuous viewpoint that I can understand, there’s already a lot of commitment to BPEL by some major vendors, so I don’t expect that it’s going to go away any time soon. Although they are only supporting a subset of the BPMN 2.0 standard now – which could be said of any of the process modelers out there, since the standard is vast – they are committed to supporting the full standard, including execution semantics and the interchange format.
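
To make that interchange format concrete, here’s a minimal BPMN 2.0 process in the standard’s XML serialization (namespace per the OMG spec), walked with a few lines of Python; this illustrates the format only, and is not Activiti’s engine code:

```python
import xml.etree.ElementTree as ET

BPMN = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"
bpmn_xml = """
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <process id="vacationRequest">
    <startEvent id="start"/>
    <userTask id="approve" name="Approve request"/>
    <endEvent id="end"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="approve"/>
    <sequenceFlow id="f2" sourceRef="approve" targetRef="end"/>
  </process>
</definitions>
"""

root = ET.fromstring(bpmn_xml)
process = root.find(f"{BPMN}process")
# Build a source -> target map from the sequence flows
flows = {f.get("sourceRef"): f.get("targetRef")
         for f in process.findall(f"{BPMN}sequenceFlow")}

# Walk from the start event to show the execution order
node, path = "start", []
while node:
    path.append(node)
    node = flows.get(node)
print(path)  # ['start', 'approve', 'end']
```

The point of BPMN 2.0’s execution semantics is that an engine can take exactly this XML (not a translation of it) and run it, which is what eliminates the BPMN-to-BPEL mapping step.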

Activiti includes a modeler, a process engine, an end-user application for participating in processes, and an administration console. Not surprisingly, we spent quite a bit of time talking about Activiti Modeler, which is really a branded version of Signavio’s browser-based BPMN 2.0 process modeler. This uses AJAX in a browser to provide similar functionality to an Eclipse-based process modeler, but without the desktop installation hassles and the geeky window dressing. It is possible to create a fully executable process model in the Activiti Modeler, although in most cases a developer will add the technical underpinnings, likely in a more developer-oriented environment rather than the Modeler. Signavio includes a file-based model repository, which has been customized for inclusion in the Activiti Modeler; it would be great to see if they can do something a bit more robust to manage the process models, especially for cloud deployments. They are including support for certain proprietary scripting instead of using Java code for some interfaces, such as their Alfresco interface.

Activiti Explorer - end-user interface

Activiti Explorer provides a basic end-user application for managing task lists, working on tasks, and starting new processes. Without a demo, it was hard to see much of the functionality, although it appears to have support for private task lists as well as shared lists of unassigned tasks; a typical paradigm for managing tasks is to allow someone to claim an unassigned task from the shared list, thereby moving it to their personal list.
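
That claim paradigm is simple enough to sketch in a few lines; this is illustrative only, not Activiti’s actual API:

```python
class TaskLists:
    """Minimal sketch of the claim paradigm: unassigned tasks sit in a
    shared list until a user claims one, moving it to a personal list."""
    def __init__(self):
        self.shared = []       # unassigned tasks, visible to a group
        self.personal = {}     # user -> claimed tasks

    def offer(self, task):
        self.shared.append(task)

    def claim(self, user, task):
        # Removing from the shared list is what makes the claim exclusive;
        # a second claim of the same task raises ValueError.
        self.shared.remove(task)
        self.personal.setdefault(user, []).append(task)

lists = TaskLists()
lists.offer("review-expense-4711")
lists.claim("kermit", "review-expense-4711")
print(lists.shared, lists.personal["kermit"])  # [] ['review-expense-4711']
```

In a real engine the claim has to be atomic across concurrent users, which is exactly why the pattern exists: it gives group-based work distribution without a dispatcher.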

The Activiti Engine, which is the underlying process execution engine, is packaged as a JAR file with small classes that can be embedded within other applications, such as is done in Alfresco for content management workflows. It can be easily deployed in the cloud, allowing for cross-enterprise processes. The only thing that I saw of Activiti Probe, the technical administration console, was its view on the underlying database tables, although it will have a number of other capabilities to manage the process engine as it develops. Not surprisingly, they don’t have all the process engine functionality available yet, but have been focusing on stabilizing the API in order to allow other companies to start working with Activiti before the GA release.

Activiti Cycle mockup - design collaboration

I also saw a mockup of Activiti Cycle, a design-time collaboration tool that includes views (but not editing) of process models, related documents from Alfresco, and discussion topics. Activiti Cycle can show multiple models and establish traceability between them, since their expectation is that an analyst and a developer would have different versions of the model. This is an important point: models are manually forward-engineered from an analyst’s to developer’s version, and there are no inherent automated updates when the model changes, although there are alerts to notify when other versions of the same model are updated. This assumption that there can be no fully shared model between analyst and developer has formed a part of a long-standing discussion between Tom and me since before we met; although I believe that a shared model provides the best possible technical solution, it’s not so easy for a non-technical analyst to understand BPMN models once you get past the basic subset of elements. Activiti Cycle may not be in GA until after the other components, although they are working on it concurrently.

The screen shots that I saw looked nice, although I haven’t seen a demo yet; Tom gave credit to Alfresco’s UI designers for raising this above just another developer’s BPM tool into something that could be used by non-developers without a lot of customization. I’m looking forward to a demo next month, and seeing how this progresses to the November release and beyond.

BPM and Case Management Webinar Q&A

I presented a webinar on business process management and case management today, hosted by Pegasystems. Great fun as always, and a ton of questions that we didn’t have time to answer. I captured a lot of them and will address them here; if you attended the webinar, Pega will also send out their responses as well as a link to the recording of the webinar. First of all, here are my slides from the presentation:

Update: the webinar replay is here on the Pega site (registration required).

And here’s the Q&A. I’ve grouped together questions that I’ve responded to in a single answer.

What work is being done on the BPMN standard to improve support for case management?

What about modeling? What effect do you think this dynamic/structured divide has on modeling? Can you model structured processes in the same way as dynamic cases?

Is there any standard for case management like BPM?

Do you work on Case Management Process Modeling as part of OMG standardization?

Although you can use BPMN to model ad hoc processes, it doesn’t currently lend itself that well to modeling case management situations: it doesn’t include good support for the rich content required in most case management scenarios, nor for completely on-the-fly subprocess definition by a participant. BPM products that only support BPMN are going to struggle with representing cases as well as structured processes. I’m not sure what OMG is doing (if anything) to address CM within the BPMN standard; the fact that they have issued an RFP for Case Management Process Modeling indicates that they’re going to do something, and in my mind, it makes sense to consider some sort of extension to BPMN since there are so many processes that include aspects of both. I am not involved in that standards work.

Is “Case Management” just another name for Forrester’s “Human-Centric BPM” ?

Not really, or at least not based on their last definition of human-centric BPM. There are many structured BPM situations that involve a large number of human-facing steps; that doesn’t make them case management since they are neither dynamic nor collaborative. Forrester does have a recent report on dynamic case management that is separate from their BPM reports.

Case Management seems limited to user self-selected processes that are pre-defined. Wouldn’t a truly dynamic case management system be guiding users based on current case context and customizing responses to the specific need, rather than simply inserting pre-defined segments?

Absolutely. Although I showed the situation where someone could add pre-defined process fragments, I didn’t mean to imply that that’s the only method. Most of the time, the user has a pre-defined set of actions (which are more granular than process fragments) from which they can select; however, it is possible in most case management systems to allow a user to define actions and subprocesses on the fly.

Is it typical for someone other than the “case worker” to initiate or create a case, e.g. by submitting some sort of request directly into the CM application, or is it more typical for the case worker to create the case based on input via some other channel such as phone, email, etc.?

To what extent do organizations integrate the case with legacy systems – claims, investment etc.

I’ve grouped these two questions together because my response to the first one is “yes, but not only someone – it could be a trigger from another system”. In almost every situation, a case is created in response to an event; if that event occurs in another system, it could be used to create the case directly. Otherwise, an event such as a phone call could be used by someone other than the case worker to create the case, such as the CSR who took a call from a customer. Depending on the case management system, there may be further integration with other business systems in order to update information or trigger other events, or it may rely on the case worker to check those systems for additional actions and information.
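
The event-to-case pattern from my response is straightforward to sketch; here the event source is what distinguishes a system-raised case from a human-raised one (names and structure are invented for illustration):

```python
# Cases are created in response to events, whether raised directly by
# another system or by a person (e.g., a CSR taking a phone call).
cases = []

def create_case(event):
    case = {
        "id": len(cases) + 1,
        "trigger": event["source"],   # e.g. "claims_system", "phone", "email"
        "subject": event["subject"],
    }
    cases.append(case)
    return case

# Another system raises an event directly...
create_case({"source": "claims_system", "subject": "claim 7781 flagged"})
# ...or a CSR creates a case from a customer phone call.
create_case({"source": "phone", "subject": "customer billing inquiry"})
print([c["trigger"] for c in cases])  # ['claims_system', 'phone']
```

The case itself doesn’t care which channel raised it; recording the trigger just preserves the provenance for later reporting and integration.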

If you’re trying to provide tools for front line service agents who take a wide variety of requests, including routine and knowledge-based, is CM the best approach (linking in to BPM to support the routine workflows), or is it better to have BPM? The main challenge is that the frontline worker could receive queries on certain query types very rarely.

Great question, to which the answer is “that depends”. That’s a design issue that would depend on the nature of the requests as well as the need to cross over between BPM and CM within those requests. For example, if the requests are independent from each other, you could spawn individual processes or cases depending on the type of request; whether the two different types are handled by one or two different systems could be completely transparent to the service agent. However, if a request could come in that needs to be combined with or related to an earlier request, then CM would likely be the way to go.

Will presenters discuss measurability for individuals participating in the case, time and actions needed to close – and sense of ownership of the customer solution?

Just because a case is dynamic doesn’t mean that it’s not measured: keep in mind that a case is based on goals, and there should be KPIs associated with those goals that can be measured. For example, it may be important that a case be completed within a specific timeframe, although not important that any given action within the case be done within a specific time as long as it doesn’t jeopardize the case milestones. The reverse may also be true: a case could have no specific deadline since it is open-ended (as in managing a chronic care patient), although there may be deadlines and milestones on actions and subprocesses within the case. As for ownership, usually a case has a specific case manager who holds ultimate responsibility, even if some of the actions are performed by other people, although that’s not always the case. In situations where there is not a single case manager, the identification and monitoring of KPIs becomes more important, with alerts being raised to someone who can take responsibility if required in order to achieve the case result.
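
One way to picture goal-based measurement: KPIs attach to milestones and the case goal rather than to individual dynamic actions. A toy sketch with invented names, dates and structure:

```python
from datetime import date

# A case measured against goal-level KPIs, even though the actions
# taken inside it are dynamic and unmodeled.
case = {
    "goal_deadline": date(2010, 7, 1),
    "milestones": {"initial_assessment": date(2010, 6, 10)},
    "completed": {"initial_assessment": date(2010, 6, 12)},
}

def kpi_alerts(case, today):
    """Raise alerts on missed milestones and the overall case goal."""
    alerts = []
    for name, due in case["milestones"].items():
        done = case["completed"].get(name)
        if done and done > due:
            alerts.append(f"milestone '{name}' missed by {(done - due).days} days")
        elif not done and today > due:
            alerts.append(f"milestone '{name}' overdue")
    if today > case["goal_deadline"]:
        alerts.append("case goal deadline passed")
    return alerts

print(kpi_alerts(case, date(2010, 6, 15)))
# ["milestone 'initial_assessment' missed by 2 days"]
```

In the no-single-case-manager situation described above, these alerts are exactly what gets routed to whoever can take responsibility.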

BPM fits easily in a Quality Management System – Plan Do Check Act. How would Case Management fit into a QM system and repository?

I’m not a QM expert, but much of what I see of how QM is applied is through the development and application of fairly specific procedures. In the presentation, I spoke about structured subprocesses that could be invoked from a case in order to complete specific actions; these would obviously fit well within a QM framework. A structured PDCA model isn’t going to fit for most case management, although could be applied at a higher level since there is often some design of a framework or template for cases that is done, and KPIs against which you measure the success of the case.

If you had a process that was this complicated would you not rationalise it using something like six sigma?

This was related to the scenario that Emily Burns from Pegasystems presented, but I’ll address the more general issue of complexity and measurement, in part based on my previous response to the QM question. Six Sigma in particular is based on statistical measurement of processes, with a goal of reducing defects in the process. Although you could apply some of the Six Sigma measurement principles, in general, since you don’t have predefined processes, it’s difficult to make a lot of statistical calculations about those processes. Case management isn’t a replacement for process analysis: if you have a highly complex but structured process, then analyze it and implement it using more standard structured BPM techniques. Case management is for when, regardless of the amount of analysis that you do, it’s just not possible to model every possible pathway through the business process. That being said, there are situations where using case management for a while does end up producing some emergent processes: processes that weren’t understood to be predictable and structured until they were done enough times in a case management framework to see the patterns emerge.

since case is so dynamic, what is the best practice when designing system to handle CM?

how do you decide the granularity of a case ?

I’ve grouped these two together since they’re both concerned with case design. As I mentioned in my previous response, CM is not a replacement for analysis: you still need to understand your business processes before you start designing your CM system. You will need to design a case framework that doesn’t restrict what the case managers can do, while collecting the information that is required in order to document and act upon the case. Things to design into your case will include an overall data model (which will determine the ability of people to find and monitor a specific case), any required actions or subprocesses that need to be executed at some point, and content that needs to be collected before the case can be completed. Other things to include will be case context (the information from other systems that may be used by the case manager in order to complete their work) as well as events between the case and other systems, both inbound and outbound. You will also want to set KPIs, milestones and related alerts or escalations on specific actions or the entire case. Emily will likely respond with more specifics on how they set out cases and subcases within Pega, but I suspect that you might find that your definition of case may shift once you start doing case management for a while. I had a chance to speak with the person from BAA who presented the case study (the one that Emily showed at the end), and he said that they were in the process of rolling up the previous separate cases that they had for things such as passenger handling and luggage handling into a single case for each flight, with those as subcases.
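As a rough illustration of the design elements above, here is a hypothetical sketch in Python (all names invented, not Pega's actual case model) of a case framework that carries a searchable data model and tracks required actions and content before a case can close; the usage mirrors the BAA flight example:

```python
from dataclasses import dataclass, field

@dataclass
class Case:
    case_id: str
    data: dict = field(default_factory=dict)            # searchable case data model
    required_actions: set = field(default_factory=set)  # subprocesses that must run
    required_content: set = field(default_factory=set)  # documents to collect
    completed_actions: set = field(default_factory=set)
    collected_content: set = field(default_factory=set)
    milestones: list = field(default_factory=list)      # for KPIs and escalations

    def can_close(self) -> bool:
        # A case can only complete once all required actions have run
        # and all required content has been collected.
        return (self.required_actions <= self.completed_actions
                and self.required_content <= self.collected_content)

flight = Case("FLIGHT-117",
              data={"flight": "BA117", "date": "2010-05-12"},
              required_actions={"passenger-handling", "luggage-handling"},
              required_content={"incident-report"})
flight.completed_actions |= {"passenger-handling", "luggage-handling"}
print(flight.can_close())  # False: incident report not yet collected
flight.collected_content.add("incident-report")
print(flight.can_close())  # True
```

The point of the sketch is that the framework constrains *completion*, not the path taken: the case manager remains free to perform actions in any order, or add ones the designer never anticipated.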

What i understand from a case is that they are basically business scenarios. Can we assume that?

If you mean “business scenario” in the enterprise architecture sense, then a case and business scenario could be considered equivalent in some situations, although business scenarios usually end up with some sort of structured process model defined. There are many common aspects, however, and I think that we can learn much about defining CM standards by looking at what has been done in EA.

To understand cases or to handle case management solutions, some extra tools are needed that handle things like case history, status etc., so, what generic list of tools do you think are needed from a holistic case management tool?

The list of tools and functionality is still emerging, and will continue to evolve over the next while, but Forrester’s report on dynamic case management has a useful diagram showing what to expect in a case management platform:

They also list the capabilities that would translate directly from BPMS in that part of the framework, such as human interaction, integration and analytics.

With a Case Management structure, is work typically or ever completed in the Case Management tool itself? If not, does the Case Management tool depend on the users updating the case periodically to indicate what stage they are in & how far they are toward completion, etc.?

Work is typically done both within the CM structure and in other systems. Since part of the expected functionality is integration between the CM system and other systems, there may be some degree of automated exchange of events and information between them, or users may be required to update the case directly with their progress in non-integrated systems. Since the case file serves as a permanent record of the case, it is often considered the system of record rather than transient information, as a typical process instance might be: that means that updating the case isn’t just a matter of documenting what was done in other systems, but could be the only place in which that information is captured.
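A minimal sketch of that system-of-record idea (invented names, not any product's API): the case file is an append-only history that captures both automated events from integrated systems and manual updates for non-integrated ones, in the same permanent record:

```python
import datetime

class CaseFile:
    """The case file as permanent record: events from integrated systems
    and manual user updates both land in a single append-only history."""
    def __init__(self, case_id):
        self.case_id = case_id
        self.history = []

    def record(self, source, description):
        self.history.append({
            "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "source": source,
            "description": description,
        })

case = CaseFile("CLAIM-2010-0042")
case.record("claims-system", "payment issued")                 # automated event
case.record("user:jsmith", "called adjuster; report mailed")   # manual update
print(len(case.history))  # 2
```

For the manual update, this history entry may be the only record anywhere that the phone call happened, which is exactly why the case file ends up as the system of record.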

If you were on the call and have other questions, feel free to add them in the comments and I’ll respond.

TIBCO BPM Now and Future: iProcess, Meet ActiveMatrix BPM

The session that I’ve been waiting all day for is with Roger King, who runs BPM product management and strategy for TIBCO, where he discussed the new ActiveMatrix BPM and TIBCOSilver BPM offerings for on-premise and cloud deployments. They’ve been working on this for a couple of years, and are obviously keen to get it out of the gate. As I tweeted earlier after taking a look at ActiveMatrix BPM in the solutions showcase, this isn’t a complementary product to iProcess: it’s the successor to iProcess, in spite of what was said about this yesterday. Have no doubt: AMX BPM is not an upgrade to iProcess, it’s a new product, based on a new technical architecture, and already (at version 1) provides more functionality than iProcess.

With both AMX BPM and Silver BPM, Business Studio is used for modeling the process; ActiveMatrix versus Silver is simply a choice made at deployment time, which means that you can deploy the exact same process to an on-premise ActiveMatrix application server or to the cloud. In fact, if you’re modeling your iProcess processes now in Business Studio, rather than in the iProcess Modeler, you can deploy those directly to AMX or Silver, too. What’s changed from iProcess is that they’ve bundled much more into the BPM offering: it’s a full composite application development and deployment platform, including forms-based user interface, rules and SOA capabilities, so that all of the process-related artifacts can be modeled in a single environment. Their previous focus on support for process patterns is now extended to include resource, business and data patterns, too, and there’s more work management and workforce optimization functionality. Their tag line: “Business Friendly, Enterprise Strength”.

This model-driven development is based on five types of models: process (which we’re used to in BPM), form, data, organizational and client application. In order to do this, they reused some pieces that will be familiar for iProcess customers, but some new stuff too:

  • Business Studio for modeling, extended for new functionality
  • New OSGi-based deployment model, where an application package (process, rules, services, etc.) is deployed rather than just a process
  • New container-based grid platform
  • New runtime, which is an ActiveMatrix application
  • Workspace, similar to that used by iProcess, but extended
  • New Openspace gadget-based client, including interfaces for mobile devices

The architecture starts with the OSGi runtime with the ActiveMatrix service platform as the basic platform, with the ActiveMatrix BPM SCA composite application as the BPM platform running on that platform, including Process Manager, Openspace, Event Collector, Work Manager and Workspace components. Everything used by the AMX BPM components is visible to other applications, meaning that it can be easily embedded or integrated with other AMX BPM applications.

Both business analysts and process developers create executable process models with the other supporting models and forms user interfaces, while the SOA developer creates process-based services, all within the AMX BPM environment. Work is managed and executed by various levels of workers, using organizational models that can be extracted from LDAP. Users may access work using Workspace (the same interface as is used for iProcess), Openspace (a mashup-type interface) or Mobilespace (the mobile version of Openspace, currently available for iPhone), or through a custom interface. Performance data is visible to different levels of monitoring roles, again through standard dashboards or custom interfaces.

One of the interesting things that can be done is modeling of page flows. Since AMX BPM allows both user interface and process to be modeled, some parts of the flow aren’t run in the process engine, but are executed in the web tier as a series of pages/views linked by rules and services, presented to the same user during a single session, with the state information maintained during the flow. This provides smart capabilities to an otherwise simple forms user interface, without having to round-trip to the process engine for basic decisioning and screen flows. It also allows for a more seamless interface in the modeler: a page flow model is shown almost as if it were an expanded subprocess from a task in the main process model, so that you can view the whole process – the parts that run on the process engine as well as in the web tier – in a common environment. This reminds me somewhat of the screen flow capabilities that are starting to emerge as part of web application platforms such as Salesforce and NetSuite, although in the context of a larger process rather than a packaged application.
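As a generic sketch of the page flow concept (this assumes nothing about TIBCO's actual implementation; all names are invented): each page runs in the web tier, applies a simple rule to pick the next page, and only hands control back to the process engine when the flow completes:

```python
# Each page function returns the name of the next page based on a rule;
# the whole flow runs in one web session, with state held in a dict,
# without any round-trip to the process engine.
def amount_page(state):
    state["amount"] = 15000  # stand-in for a value entered by the user
    return "approval_page" if state["amount"] > 10000 else "confirm_page"

def approval_page(state):
    state["needs_manager"] = True
    return "confirm_page"

def confirm_page(state):
    return None  # flow complete; hand control back to the process engine

PAGES = {"amount_page": amount_page,
         "approval_page": approval_page,
         "confirm_page": confirm_page}

def run_page_flow(start, state):
    page, visited = start, []
    while page is not None:
        visited.append(page)
        page = PAGES[page](state)
    return visited

print(run_page_flow("amount_page", {}))  # all three pages, in order
```

The design benefit described in the text falls out naturally: the decisioning between pages (here, the amount threshold) lives in the web tier, so the process engine only sees the page flow as a single task.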

I also like the data modeling capabilities in their business object models: you can interrogate an existing database directly in order to derive the data model for your process instance data, which saves a lot of redefinition (and the errors that can be introduced) of the data model as part of the process model. You can also import the data model from UML and other formats. Eventually, this needs to be able to integrate with enterprise MDM initiatives, but this is a good start.
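To illustrate the general technique of deriving a data model by interrogating a database (a generic sketch using Python's sqlite3 introspection, not TIBCO's implementation):

```python
import sqlite3

# A throwaway in-memory database standing in for an existing
# line-of-business database.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE claims (
    claim_id INTEGER PRIMARY KEY,
    customer TEXT NOT NULL,
    amount   REAL,
    status   TEXT DEFAULT 'open')""")

def derive_data_model(conn, table):
    """Interrogate the database for field names and types, rather than
    redefining them by hand in the process model."""
    # PRAGMA table_info returns (cid, name, type, notnull, default, pk)
    return [(row[1], row[2])
            for row in conn.execute(f"PRAGMA table_info({table})")]

print(derive_data_model(conn, "claims"))
```

Any tool that does this eliminates an entire class of transcription errors, since the process data model can't drift from the database it was derived from, at least at the moment of derivation.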

The forms-based UI designer has some nice features as well, being able to automatically generate master-detail forms with grids for detail records based on joins in the data model. Although it’s not a really complex forms designer, it does allow styling with a style sheet, and I expect to see some improvements here as they figure out what their customers really want. They can separate presentation from page flow, and some companies may decide to use the AMX BPM page flow but do their own presentation screens.

They’ve moved away from the concept of queues that supported iProcess to dynamic work lists that are generated on the fly; this makes sense given the advances in dynamic data access. In general, creating a new BPM product from the ground up today not only makes their 20-year-old iProcess architecture look dated, but also the 10-year-old generation of products from other vendors that started the current BPM revolution in the early 2000s.
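The difference between static queues and dynamic work lists can be sketched generically (invented names, not TIBCO's API): instead of routing each item into a fixed queue at creation time, a work list is just a query evaluated when the user asks for work, so changing routing rules never requires re-queueing existing items:

```python
work_items = [
    {"id": 1, "type": "claim",  "region": "east", "priority": 2},
    {"id": 2, "type": "claim",  "region": "west", "priority": 1},
    {"id": 3, "type": "review", "region": "east", "priority": 1},
]

def work_list(items, **criteria):
    """Build a work list on the fly: filter all outstanding work items
    against arbitrary criteria, then order by priority."""
    matches = [i for i in items
               if all(i.get(k) == v for k, v in criteria.items())]
    return sorted(matches, key=lambda i: i["priority"])

print([i["id"] for i in work_list(work_items, region="east")])  # [3, 1]
```

With static queues, reorganizing work by region instead of by type would mean migrating queue contents; with the query approach, it is just a different filter.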

Tons of interesting stuff here, more than I can absorb on the fly for a live blogging post, but I’ll nail down a full briefing in the next couple of weeks.

AMX BPM shares common components with the AMX SOA product line, but does not include AMX Service Grid (which includes more containers) or AMX Service Bus – if you’re a TIBCO customer (or planning to become one), these are details that you’ll want to work out in terms of licensing to make sure that you have all the right pieces, and aren’t paying for things that you’re not using. If you’re an iProcess customer, then don’t look for AMX BPM as part of your upgrade maintenance: it’s not an upgrade, it’s a new product. iProcess is not being end-of-lifed; it will be maintained and have minor enhancements for some time to come, but I don’t get the idea that you’re going to see a lot happening here since King stated that the major BPM investment for them will be in AMX BPM. If you have one of the other BPM products, such as InConcert, you may want to start saying your prayers now (although there has been no EOL notice as yet). In any case, at some point you’re going to want to consider a migration path off these older platforms for processes that you want to continuously improve, since they are not going to see any significant upgrades in the future, even though the official line is that iProcess “is not going away for a long, long time”.

The current plan is to provide for coexistence of iProcess and AMX BPM in Workspace so that users can pull work from either system without having to worry about which one it is on. And, although you could take an iProcess model in Business Studio and deploy it in AMX BPM, you’d probably want to take advantage of much more of the new functionality, such as the forms-based user interface designer, which means essentially rewriting everything except the process model. Although there is some service composition capability in AMX BPM, you’re probably going to leave most of the service composition heavy lifting in BusinessWorks, since AMX BPM really is geared towards turning processes into services, not general composition.

Interestingly, when I saw a quick demo at the booth earlier today, I detected essence of BPEL in the process model (such as catch and throw events); King confirmed that at the composition level, this is heavily extended BPEL.

Essentially, AMX BPM provides BPM on an SOA platform, but without the BPM designers having to worry about the SOA parts. From that standpoint, the BPM functionality competes well with the pure play BPM suites, but it provides a great deal more flexibility in dealing with services than you’ll see from the pure plays. They see their competition as the other stack vendors, IBM and Oracle, but with the lack of innovation and cohesion in both of those vendors’ BPM offerings, TIBCO seems to come out ahead in BPM functionality. Seems like the best of both worlds.