camunda BPM 7.0 Refactors Engine And Process Administration

On August 31, camunda released camunda BPM platform 7.0 (community open source and enterprise editions), the first major release of the software since it was forked from the Activiti project in March, although there were nine community releases between the fork and the release of 7.0. I had the chance for a couple of briefings with Daniel Meyer, the project lead, over the period following that, and promised that I’d actually get this post written in time for Christmas. 🙂

The 7.0 release contains a significant amount of new code, but their focus remains the same: a developer-friendly BPM platform rather than a tool positioned for use by end users or non-technical analysts. As I discussed in a recent webinar, BPMSs have become model-driven application development environments, so a BPMS positioned explicitly for developers addresses a large market segment, especially for complex core processes.

The basic tasklist and other modeler and runtime features are mostly unchanged in this version, but there are big changes to the engine and to Cockpit, the technical process monitoring/administration module. Here’s what’s new:

Cockpit:

  • Inspect/repair process instances, including retrying failed service calls.
  • Create instance variables at runtime, and update variable values.
  • Reassign human activities.
  • Send a link directly to a specific instance or view.
  • Create a business key corresponding to a line-of-business system variable, providing a fast and easy way to search on LOB data.
  • Extensible via third-party plug-ins. The aim with Cockpit is to solve 80% of use cases, then allow plug-ins from consulting partners and customers to handle the remainder; they provide full instructions on how to develop a Cockpit plug-in.
  • Add tabs to the detailed views of process instance, e.g., a link to LOB or other external data.

Engine:

  • A new authorization framework (also manifesting in admin capabilities for users/groups/authorizations): this is a preview feature in 7.0, supporting only application, group and group membership authorization. In the future, this will be expanded to include process definition and instance authorization. Users can be maintained in an internal camunda database or via a direct link to LDAP.
  • A complete rewrite of the history/audit log, splitting the history and runtime databases, which is potentially a huge performance booster. Updates to the history are triggered from events on running instances, whereas previously, writing history records required querying and updating existing records for that instance. The history log can be redirected to a database shared by multiple process engines; since some of the Cockpit monitoring is done on the history database, pointing several engines at the same history database makes it easier to consolidate monitoring across them. The logs can also be written directly to an external database based on the new history event stream API. Writes to the history log are asynchronous, which also improves performance. At the time of release, they were seeing preliminary benchmarks of 10-20% performance improvement in the process engine, and a significant reduction in the runtime database index size.
  • There is some increase in the coverage of the BPMN 2.0 standard; their reference document shows supported elements in orange, with a link through on each element to the description and usage, including code snippets where appropriate. Data objects/stores are still not supported, nor are about half of the event types, but their track record is similar to that of most vendors in this regard.
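To make the decoupling behind the new history architecture concrete, here is a minimal sketch of the idea: the engine emits history events onto a stream as instances progress, and a separate asynchronous writer drains that stream into the history store, so runtime transactions never wait on history writes or query existing history rows. Note that this is an illustration of the pattern only, not camunda’s actual classes; `HistoryStreamSketch`, `HistoryEvent` and the in-memory "store" are all invented here for the example.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class HistoryStreamSketch {

    // A minimal stand-in for an engine history event: instance id plus what happened.
    public record HistoryEvent(String processInstanceId, String type) {}

    // The "external store": an in-memory list standing in for a separate history database.
    public static final List<HistoryEvent> externalStore = new ArrayList<>();

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<HistoryEvent> stream = new LinkedBlockingQueue<>();

        // Writer thread: drains the event stream into the external store asynchronously,
        // so the engine's runtime work never blocks on history writes.
        Thread writer = new Thread(() -> {
            try {
                while (true) {
                    HistoryEvent e = stream.poll(1, TimeUnit.SECONDS);
                    if (e == null) break; // idle for a second: stop, for this sketch
                    synchronized (externalStore) { externalStore.add(e); }
                }
            } catch (InterruptedException ignored) { }
        });
        writer.start();

        // The engine side: append-only event emission as an instance progresses,
        // with no querying or updating of existing history records.
        stream.add(new HistoryEvent("instance-1", "PROCESS_INSTANCE_START"));
        stream.add(new HistoryEvent("instance-1", "ACTIVITY_INSTANCE_END"));
        stream.add(new HistoryEvent("instance-1", "PROCESS_INSTANCE_END"));

        writer.join();
        System.out.println("events stored: " + externalStore.size());
    }
}
```

Because the stream is append-only, multiple engines could feed the same store, which is what makes the consolidated-monitoring scenario above straightforward.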

Version 7.0 is all open source, but a consolidation release (7.1) is already in alpha and will contain some proprietary administration features in Cockpit not available in the open source version: bulk edit/restart of instances, complex search/filter across instances from different process definitions, and a process-level authorizations UI (although the authorization structure will be built into the common engine). camunda is pretty open about their development, as you might expect from an open source company; you can even read some quite technical discussions about design decisions, such as how a new Activity Instance Execution Model was implemented in the process engine in order to improve performance.

In September, camunda released a cloud application for collaborating on process models, camunda share. This is not a full collaborative authoring environment, but a place to upload, view and discuss process models. The camunda team created it during their “ShipIt-Day”, where they are tasked with creating something awesome within 24 hours. There’s no real security: your uploaded model generates a unique URL that you can send to others, but it provides the option to anonymize the process model by removing labels if your process contains proprietary information. A cool little side project that could let you avoid sending around PDFs of your process models for review.

camunda’s business model is in providing and supporting the enterprise edition of the software, which includes some proprietary functions in Cockpit but is otherwise identical to the community open source edition, plus in consulting and training services to help you get started with camunda BPM. They provide a great deal of the effort behind the community edition, while encouraging and supporting platform extensions such as fluent testing, PHP developer support and enterprise integration via Apache Camel.

camunda BPM 7.0

Webinar On Business-IT Alignment In Process Applications

This afternoon, I’m giving a webinar (hosted by Software AG) on business-IT alignment when developing process-centric applications: you can sign up for it or see the replay here.

Some interesting stuff on model-driven development and also why we usually need to use separate modeling tools when we’re building applications for complex core processes.

We’re also developing a white paper on this topic, to be released in the next few weeks; I’ll post a link to that when it’s out.

Breakfast Seminar On Intelligent Business Processes (Toronto) – December 3

I recently wrote a white paper and gave a webinar on intelligent business processes, sponsored by Software AG (although not about their products), and I’m now giving a breakfast seminar for them on the same topic in Toronto on December 3rd. If you’re Toronto-based, or are going to be there that day, you can see more information on the free seminar here and sign up for it here.

From Brazil To Vegas: BPM Global Trends and Building Business Capability

My frenzy of seven conferences in six weeks (eight, if you count the two different cities in Brazil as different conferences) is drawing to a close, but the past two weeks have been a bit brutal. Last week, I was in São Paulo and Brasília for the BPM Global Trends seminar series, where I presented in both cities along with Jan vom Brocke from University of Liechtenstein. It was arranged by ELO Group with the strong support of ABPMP Brazil, and was most interesting because I was presenting from a Portuguese version of my slides (with an English version visible to me) while United Nations-style simultaneous translators worked their magic from a booth at the back of the room.

I did a longer presentation in São Paulo earlier in the week with a workshop in the afternoon, then split it into two presentations with some added material for the public sector seminar in Brasília:

Many thanks to my hosts, and to those voices in my head: Leonardo and Daniel, the wonderful translators who brought my material alive for the Portuguese audience, and translated the questions and comments into English for me.

Unfortunately, I didn’t get to see a lot of Brazil except for hotels, airports and conference rooms, although I did get a short tour (thanks, Jones!) of the weird and wonderful modernist architecture of Brasília on the day that I flew out.

I arrived home in Toronto on Sunday morning, then 24 hours later was on a flight to Las Vegas for the Building Business Capability conference – my third trip to Vegas in a month. I presented a half-day seminar yesterday on emerging BPM technology, an ever-changing topic that continues to fascinate me:

I finished up today with a breakout session on the interplay of rules, process and content in case management, which is the combination of a number of different themes that I’ve been playing with over the past few years, but the first time for the presentation in this form:

I’m off to the evening reception to meet up with my peeps here, then tomorrow I get to take it easy and listen to someone else present for a change. Or maybe sit by the pool and let my brain relax for a day before I fly home to get back to my regular client work, and start to work through that backlog of product briefings that I have piled up in my drafts folder.

That’s the last of my conference travel for the year, but not the last of my conferences: I’ll be attending at least one day of CASCON next week for a workshop on Real Time Patient Flow Management using Business Process Management, Location Tags, and Complex Events Processing and to hear some of the research papers, then the Technicity Focus on Cyber Security event on November 26th. I’m also speaking at a Toronto breakfast seminar on intelligent business processes on December 3rd for Software AG.

Whew!

Intelligent Business Processes Webinar Q&A

Earlier this week, I gave a webinar on intelligent business processes, sponsored by Software AG; the slides are embedded following, and you can get a related white paper that I wrote here.

There were a number of questions at the end that we didn’t have time to answer, and I promised to answer them here, so here goes. I have made wording clarifications and grammatical corrections where appropriate.

First of all, here are the questions that we did have time for and a brief response to each – listen to the replay of the webinar to catch my full answer to those.

  • How do you profile people for collaboration so that you know when to connect them? [This was in response to me talking about automatically matching up people for collaboration as part of intelligent processes – some cool stuff going on here with mining information from enterprise social graphs as well as social scoring]
  • How complex is it to orchestrate a BPMS with in-house systems? [Depends on the interfaces available on the in-house systems, e.g., web services interfaces or other APIs]
  • Are Intelligent Business Processes less Dynamic Business Processes? [No, although many intelligent processes rely on an a priori process model, there’s a lot of intelligence that can be applied via rules rather than process, so that the process is dynamic]
  • How to quantify the visibility to the management? [I wasn’t completely sure of the intention of this one, but discussed the different granularities of visibility to different personas]
  • Where does real-time streaming fit within the Predictive Analytics model? [I see real-time streaming as how we get events from systems, devices or whatever as input to the analytics that, in turn, feed back to the intelligent process]

And here are the ones that we didn’t get to, with more complete responses. Note that I was not reading the questions as I was presenting (I was, after all, busy presenting), so some of them may be referring to a specific point in the webinar and may not make sense out of context. If you wrote the question, feel free to elaborate in the comments below. If something was purely a comment or completely off topic, I have probably removed it from this list, but ping me if you require a follow-up.

There were a number of questions about dynamic processes and case management:

We are treating exceptions as more as normal business activity pattern called dynamic business process to reflect the future business trend [not actually a question, but may have been taken out of context]

How does this work with case management?

I talked about dynamic processes in response to another question; although I primarily described intelligent processes through the concepts of modeling a process, then measuring and providing predictions/recommendations relative to that model, a predefined process model is not necessarily required for intelligent processes. Rules form a strong part of the intelligence in these processes, and even if you don’t have a predefined process, you can consider measuring a process relative to accomplishments of goals that are aligned with rules rather than a detailed flow model. As long as you have some idea of your goals – whether those are expressed as completing a specific process, executing specific rules or other criteria – and can measure against those goals, then you can start to build intelligence into the processes.

Is process visibility about making process visible (documented and communicated) or visibility about operational activities through BPM adoption?

In my presentation, I was mainly addressing visibility of processes as they execute (operational activities), but not necessarily through BPM adoption. The activities may be occurring in any system that can be measured; hence my point about the importance of having instrumentation on systems and their activities in order to have them participate in an intelligent process. For example, your ERP system may generate events that can be consumed by the analytics that monitor your end-to-end process. The process into which we are attempting to gain visibility is that end-to-end process, which may include many different systems (one or more of which may be BPM systems, but that’s not required) as well as manual activities.
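As a rough sketch of what that cross-system visibility looks like in practice: events from each instrumented system carry a shared business key, and the monitoring analytics correlate on that key to measure the end-to-end process. The class and event shape below are invented for illustration (this is not any particular monitoring product’s API), but the correlation idea is the same.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class VisibilitySketch {

    // An event emitted by any instrumented system (BPM, ERP, CRM, manual step logger),
    // correlated across systems by a shared business key.
    public record SystemEvent(String source, String businessKey, String step, long timestampMillis) {}

    // Compute end-to-end elapsed time per business key across all source systems:
    // the kind of cross-system measurement that gives visibility into the whole process,
    // without requiring that any single system own the end-to-end flow.
    public static Map<String, Long> endToEndMillis(List<SystemEvent> events) {
        Map<String, long[]> bounds = new HashMap<>(); // key -> {earliest, latest}
        for (SystemEvent e : events) {
            bounds.merge(e.businessKey(),
                    new long[]{e.timestampMillis(), e.timestampMillis()},
                    (a, b) -> new long[]{Math.min(a[0], b[0]), Math.max(a[1], b[1])});
        }
        Map<String, Long> result = new HashMap<>();
        bounds.forEach((k, v) -> result.put(k, v[1] - v[0]));
        return result;
    }

    public static void main(String[] args) {
        // Events for one order, emitted by three different systems.
        List<SystemEvent> events = List.of(
                new SystemEvent("crm", "order-42", "captured", 1_000),
                new SystemEvent("bpm", "order-42", "approved", 4_000),
                new SystemEvent("erp", "order-42", "shipped", 9_000));
        System.out.println(endToEndMillis(events)); // {order-42=8000}
    }
}
```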

Do we have real-world data to show how accurate the prediction is from Intelligent Processes?

There’s not a simple (or single) answer to this. In straightforward scenarios, predictions can be very accurate. For example, I have seen predictions that make recommendations about staff reallocation in order to handle the current workload within a certain time period; however, predictions such as that often don’t include “wild card” factors such as “we’re experiencing a hurricane right now”. The accuracy of the predictions is going to depend greatly on the complexity of the models used as well as the amount of historical information that can be used for analysis.

What is the best approach when dealing with a cultural shift?

I did the keynote last week at the APQC process conference on changing incentives for knowledge workers, which covers a variety of issues around dealing with cultural shifts. Check it out.

In terms of technology and methodology, how do you compare intelligent processes with the capabilities that process modeling and simulation solutions (e.g., ARIS business process simulator) provide?

Process modeling and simulation solutions provide part of the picture – as I discussed, modeling is an important first step to provide a baseline for predictions, and simulations are often used for temporal predictions – but they are primarily process analysis tools and techniques. Intelligent processes are operational, running processes.

What is the role of intelligent agents in intelligent processes?

Considering the standard definition of “intelligent agent” from artificial intelligence, I think that it’s fair to say that intelligent processes are (or at least border on being) intelligent agents. If you implement intelligent processes fully, they are goal-seeking and take autonomous actions in order to achieve those goals.

Can you please talk about the learning curves to the Intelligent Business process?

I assume that this is referring to the learning curve of the process itself – the “intelligent agent” – and not the people involved in the process. Similar to my response above regarding the accuracy of predictions, this depends on the complexity of the process and its goals, and the amount of historical data that you have available to analyze as part of the predictions. As with any automated decisioning system, it may be good practice to have it run in parallel with human decision-making for a while in order to ensure that the automated decisions are appropriate, and fine-tune the goals and goal-seeking behavior if not.

Any popular BPM Tools from the industry and also any best practices?

Are ERP solutions providers and CRMs doing anything about it?

I grouped these together since they’re both dealing with products that can contribute to intelligent processes. It’s fair to say that any BPM system and most ERP and CRM systems could participate in intelligent processes, but are likely not the entire solution. Intelligent processes combine processes and rules (including processes and rules from ERP and CRM systems), events, analytics and (optionally) goal-seeking algorithms. Software AG, the sponsor of the webinar and white paper, certainly have products that can be combined to create intelligent processes, but so do most of the “stack” software vendors that have BPM offerings, including IBM, TIBCO and SAP. It’s important to keep in mind that an intelligent process is almost never a single system: it’s an end-to-end process that may combine a variety of systems to achieve a specific business goal. You’re going to have BPM systems in there, but also decision management, complex event processing, analytics and integration with other enterprise systems. That is not to say that the smaller, non-stack BPM vendors can’t piece together intelligent processes, but the stack vendors have a bit of an edge, even if their internal product integration is lightweight.

How to quantify the intelligent business process benefits for getting funding?

I addressed some of the benefits on slide 11, as well as in the white paper. Some of the benefits are very familiar if you’ve done any sort of process improvement project: management visibility and workforce control, improved efficiency by providing information context for knowledge workers (who may be spending 10-15% of their day looking for information today), and standardized decisioning. However, the big bang from intelligent processes comes in the ability to predict the future, and avoid problems before they occur. Depending on your industry, this could mean higher customer satisfaction ratings, reduced risk/cost of compliance, or a competitive edge based on the ability for processes to dynamically adapt to changing conditions.

What services, products do you offer for intelligent business processes?

I don’t offer any products (although Software AG, the webinar sponsor, does). You can get a better idea of my services on my website or contact me directly if you think that I can add value to your process projects.

How are Enterprise Intelligent Processes related to Big Data?

If your intelligent process is consuming external events (e.g., Twitter messages, weather data), or events from devices, or anything else that generates a lot of events, then you’re probably having to deal with the intersection between intelligent processes and big data. Essentially, the inputs to the analytics that provide the intelligence in the process may be considered big data, and have some specific data cleansing and aggregation required on the way in. You don’t necessarily have big data with intelligent processes, but one or more of your inputs might be big data. 

And my personal favorite question from the webinar:

Humans have difficulty acting in an intelligent manner; isn’t it overreaching to claim processes can be “intelligent”?

I realize that you’re cracking a joke here (it did make me smile), but intelligence is just the ability to acquire and apply knowledge and skills, which are well within the capabilities of systems that combine process, rules, events and analytics. We’re not talking HAL 9000 here.

To the guy who didn’t ask a question, but just said “this is a GREAT GREAT webinar ” – thanks, dude. 🙂

Webinar And White Paper On Intelligent Business Processes

I recently wrote a white paper on intelligent business processes: making business processes smarter through the addition of visibility, automation and prediction. On Wednesday (October 30), I’ll be giving a webinar to discuss the concepts in more detail. You can sign up for the webinar here, which should cause a link to the replay to be sent to you even if you can’t make it to the webinar. Software AG sponsored the white paper and webinar (as well as another one coming up next month); you can download the white paper from Software AG directly or on CIO.com.

As with all of my vendor-sponsored white papers/webinars, these are my opinions in an educational/thought leadership format, not vendor promotional pieces.

APQC Process Conference

This week, I started in Vegas with the huge SAP TechEd conference, then moved on to Houston for the much more intimate APQC Process Conference, attended by 150 or so quality practitioners who are focused on process. I arrived too late for the first day’s sessions, but caught up with people at the reception, then gave the keynote this morning on how we need to change incentives for knowledge workers within the social enterprise:

This is an area that I’ve been pondering over for quite a while, but the first presentation that I’ve done explicitly on this topic. I’m going to do a separate post on this including all of the research pointers to open it up for more discussion; for a technology geek like me, looking at HR issues such as employee incentives makes me feel a bit out of my depth, but it’s been tapping away at my hindbrain since I first started talking about social BPM more than seven years ago, and I’m intensely interested in some of the research that can start to make its way into enterprise process software.

We had a full 25-30 minutes of Q&A after the keynote; there is a huge amount of interest amongst this audience, and a lot of related experiences to share.

I had the huge pleasure of hearing Jack Grayson, founder of APQC and productivity guru, speak about his ongoing work as well as his skydiving experience at the age of 90 (!), and he graciously gave me a tour of the Houstonian conference center and the adjacent APQC offices that he has helped to build over the years. Impressive and inspirational, although a bit intimidating to follow onto stage.

Keeping focus long enough to blog right after doing a presentation can be a bit challenging, but I sat in on the joint APQC/ASQ breakout session just after the keynote, where Travis Colton presented their research linking quality practices to quality performance. Quality measurement systems tend to be related pretty strongly to process improvement and BPM initiatives, and this was a much more detailed view of the process of quality management (as opposed to quality within the enterprise processes) than I usually see, with some interesting points. He finished up, quite by coincidence, with a bit on employee incentives for quality; it’s interesting how much my message from earlier seemed to resonate with a lot of people who I talked to, as well as showing up in other presentations. You can see more about their research and results here.

The final session of the day (and the conference) was a wrap-up led by Elisabeth Swan, a process improvement consultant. She applied her background in improvisational comedy to tease out the main themes from the breakout sessions based on post-it notes that people had created during each session, and give an opportunity for people who attended the sessions to speak up about what they heard there. Good interactive wrap-up, and an opportunity to hear about all of the sessions that I missed.

APQC holds a knowledge management conference each year as well as this process conference, plus a number of webinars related to productivity and quality improvement.

NetWeaver Process Orchestration Update At SAPTechEd

It’s been a busy first day here at SAP TechEd, although some of that time was spent in briefings that I haven’t yet fully digested, and I’m finishing off with an update on NetWeaver Process Orchestration from Alexander Bundschuh (from the PI side) and Benjamin Notheis (from the BPM side). The goal of Process Orchestration is to improve processes and save integration costs, and it includes a number of tools:

SAP Process Orchestration

Customers currently running a PI and BPM dual-stack system who move to the consolidated Process Orchestration stack can reduce their operational footprint by running both on the same platform, and also improve performance and stability, because what were previously web service calls between PI and BPM are now direct Java calls on the same engine. There are some good resources on implementing enterprise integration patterns in Process Orchestration, and they discussed a number of techniques for the migration:

  • Migrating the ccBPM BPEL models over to BPMN
  • Designing integration flows graphically in Eclipse
  • Using conditional process starts (for aggregator/correlation patterns where an inbound message event either matches with an existing process instance or starts a new one), which is a new feature in NW BPM
  • Using the new NW BPM inbox for human-centric tasks, using either a traditional inbox view or a stream view, and providing features such as enabling users to manage substitution rules for their out-of-office times (note that this is for BPM only, and not a UWL replacement for BW tasks)
  • Integrated monitoring and administration between PI and BPM, allowing navigation from a BPM process instance to all associated PI messages, or from a PI message to the associated BPM process, plus graphical dashboards
  • End-to-end integration visibility to allow transactions to be correlated and tracked across systems, including B2B scenarios
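The conditional process start in that list is essentially the classic aggregator pattern. As a minimal illustration (invented class names, not SAP’s actual API), the dispatch logic amounts to: look up the inbound message’s correlation key among running instances, deliver to a match if one exists, otherwise start a new instance.

```java
import java.util.HashMap;
import java.util.Map;

public class ConditionalStartSketch {

    // A process instance keyed by a correlation value, counting the messages it aggregates.
    public static class Instance {
        public final String correlationKey;
        public int messagesReceived;
        Instance(String key) { this.correlationKey = key; }
    }

    // Running instances, indexed by correlation key.
    public static final Map<String, Instance> running = new HashMap<>();

    // The aggregator/correlation pattern: an inbound message either correlates
    // with an existing process instance or conditionally starts a new one.
    public static Instance onMessage(String correlationKey) {
        Instance i = running.computeIfAbsent(correlationKey, Instance::new);
        i.messagesReceived++;
        return i;
    }

    public static void main(String[] args) {
        onMessage("order-1"); // no match: starts a new instance
        onMessage("order-1"); // correlates with the existing instance
        onMessage("order-2"); // different key: starts another instance
        System.out.println(running.size() + " instances, "
                + running.get("order-1").messagesReceived + " messages for order-1");
    }
}
```

In a real engine the correlation key would be extracted from the message payload by a mapping condition, and instance state would be persisted rather than held in a map, but the match-or-start decision is the heart of the feature.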

Process Orchestration is available using HANA as the database (BPM can use HANA as a database, although there are no BPM-specific services available yet on HANA), although that was done for infrastructure simplification purposes and doesn’t show any significant performance improvement; in the future, further refactoring will exploit HANA capabilities fully and should show a greater performance increase. Also on their roadmap are technical error handling in BPM, and some improved B2B trading partner and intelligent mapping capabilities.

Good update, with lots of great information for customers who are using both PI and BPM, and want to bring them together on a common stack.

There are a number of other sessions on Process Orchestration here at TechEd Las Vegas, as well as at the upcoming Amsterdam and Bangalore events, and a section of the SAP Community Network (SCN) on Process Orchestration.

BPM For Product Lifecycle Management At Johnson & Johnson

In this last breakout of Innovation World, simultaneous sessions from Johnson & Johnson and Johnson Controls were going on in adjacent rooms. I’m guessing that a few people might have ended up in the wrong session.

I was in the J&J session, where Pieter Boeykens and Sanjay Mandloi presented on web collaboration and process automation for global product development in the highly regulated health and pharmaceutical industry. They have a standardized set of processes for developing and launching products, with four different IT systems supporting the four parts of the PLM. A lot of this focuses on collecting documents from employees and suppliers all over the world, but there was no control over the process for doing this and the form of the information collected – they had five different processes for this in four regions. They rationalized this into a single standardized global process, modeled in webMethods BPM, then spent a significant amount of time on the human interaction at each step in the process: creating wireframes, then going through several versions of the UI design in collaboration with the business users to ensure that it was intuitive and easy to use. They integrated BrainTribe for content management, which apparently handles the documents (the architecture diagram indicated that the actual documents are in Documentum) but also integrates structured content from other systems such as SAP.

In conjunction with this, they performed a webMethods upgrade from 8.2.x to 9 for their existing integration applications, migrating over their existing applications with little impact. Interestingly, this aspect generated far more questions from the audience than any of the functionality of the new BPM implementation, which gives you an idea of the business-technical mix in the audience. 🙂

That’s it for Software AG’s Innovation World 2013. Next week, I’ll be in Vegas for TIBCO’s TUCON conference, where I’ll be on an analyst panel on Wednesday, then back to Vegas the following week for SAP TechEd (not next week, as I tweeted earlier) with a detour through Houston on the way home to speak at the APQC process conference. If you’re at any of those events, look me up and say hi.

High-Value Solution Consulting At Amdocs With An ARIS-Based Solution Book

Down to the last two breakout sessions at Innovation World, and we heard from Ophir Edrey of Amdocs, a company providing business support software with a focus on the communications, media and entertainment industries. They wanted to be able to leverage their own experience across multiple geographies, leading their customers towards a best practice-based implementation. To do this, they created a solution book that brings together best practices, methodologies, business processes and other information within an enterprise architecture, allowing Amdocs consultants to work together with customers to collaborate on how that architecture needs to be modified to fit the customer’s specific needs.

The advantage of this is that Amdocs doesn’t just offer a software solution, but an entire advisory service around the best practices related to the solution. The solution book is created in ARIS, including the process models, solution design, solution traceability, customer collaboration (which they are migrating to ARIS Connect, not Process Live), and review and approval management.

He showed us a demo of the Amdocs Solution Book, specifically the business process framework. It contains four levels of decomposition, starting with a value chain of the entire operator landscape mapped onto the full set of process model families. Drilling through into a specific set of processes for, in this example, a mobile customer upgrading a handset, he showed the KPIs and the capabilities provided by their solution for that particular process; this starts the proof of Amdocs value to the customer as more than just a software provider. Drilling further into the specific process model, the Amdocs consultant can gather feedback from the customer on how this might need to be modified for their specific needs, with comments added directly on the models for others to see and respond to.

They have had some pushback from customers on this – some people really just want a paper document – but generally have had very enthusiastic feedback and a strong demand to use the tool for projects. The result is faster, better, value-added implementations of their software solutions, giving them a competitive edge. Certainly an interesting model for the services arm of any complex enterprise software provider.