ActiveMatrix BPM at Citibank Brazil

Roberto Mercadante, SVP of operations and technology at Citibank Brazil, presented a session on their journey with AMX BPM. I also had a chance to talk to him yesterday about their projects, so I have a bit of additional information beyond what he covered in the presentation. They are applying AMX BPM to their commercial account opening/onboarding processes for “mid-sized” companies (between $500M and $1B in annual revenue), a segment where the Brazilian market is very competitive and requires fast turnaround, especially for establishing credit. As a global company operating in 160 countries, they are accustomed to dealing with very large multi-national organizations; unfortunately, some of the very robust processes built for that scale manifest as delays when handling smaller single-country transactions, such as their requirement to have a unique customer ID generated in their Philippines operation for any account opening. Even for functions performed completely within Brazil, they found that processes created for handling large corporate customers were just too slow and cumbersome for the mid-size market.

Prior to BPM implementation, the process was very paper-intensive, with 300+ steps to open an account, requiring as many as 15 signatures by the customer’s executives. Because it took so long, the commercial banking salespeople would try to bypass the process by collecting the paperwork and walking it through the operations center personally; this is obviously not a sustainable method for expediting processes, and wasn’t available to those people far from their processing center in Sao Paulo. Salespeople were spending as much as 50% of their time on operations, rather than building customer relationships.

They use an Oracle ERP, but found that it really only handled about 70% of their processes and was not, in the opinion of the business heads, a good fit for the remainder; they brought in AMX BPM to help fill that gap, which typically consists of localized processes driven by unique market needs or regulations. In fact, they really consider AMX BPM to be their application development environment for building agile, flexible, localized apps around the centralized ERP.

When Citi implemented AMX BPM last year — for which they won an award — they were seeking to standardize and automate processes, with the primary target of reducing cycle time, which could be as long as 40 days. Interestingly, instead of reengineering the entire process, they did some overall modeling and process improvement (e.g., removing or parallelizing steps), but only did a complete rework on activities that would impact their goal of reducing cycle time, while enforcing their regulatory and compliance standards.

A key contributor to reducing cycle time, not surprisingly, was to remove the paper documents as early as possible in the process, which meant scanning documents in the branches and pushing them directly into their IBM FileNet repository, then kicking off the related AMX BPM processes. The custom scanning application included a checklist so that the branch-based salespeople could immediately know which documents they were missing. Because they had some very remote branches with low communications bandwidth, they also had to create some custom store-and-forward mechanisms to defer document transmission to times of low bandwidth usage, although that was eventually retired as their telecom infrastructure was upgraded. I’ve seen similar challenges with some of my Canadian banking customers regarding branch capture, with solutions ranging from using existing multifunction printers to actually faxing in documents to a central operational facility; paper capture still represents some of the hairiest problems in business processes, in spite of the fact that we’re all supposed to be paperless.

They built BPM analytics in Spotfire (this was prior to the Jaspersoft acquisition, which might have been a better fit for some parts of this) to display a real-time dashboard that identifies operational bottlenecks — they felt strongly about including this from the start, since they needed to show real benefits in order to prove the value of BPM and justify future development. The result: a 70% reduction in their onboarding cycle time within 3 months of implementation, from as much as 40 days down to a best time of about 3 days. It’s unlikely that they can reduce it much further, since some of that time is spent waiting for the customers to provide necessary documentation, although they do all the steps possible even in the absence of some documents so that the process can complete quickly as soon as the documents arrive. They also saw a 90% reduction in the standard deviation of cycle time, since no one was skewing the results by personally escorting documents through the operations center. Their customer rejection rate was reduced by 58%, so they captured a much larger portion of the companies that applied.

The benefits, however, extended beyond operational efficiency: it enabled decentralization of some front-office functions and relocation of some back-office operations, which in turn allows for leveraging shared services in other Citibank offices, relocating operations to less-expensive locations, and even outsourcing some operations completely.

They’re now looking at implementing additional functionality in the onboarding process, including FATCA compliance, mobile analytics, more legacy integration, and ongoing process improvement. They’re also looking at related problems that they can solve in order to achieve the same level of productivity, and considering how they can expand the BPMS implementation practices to support other regions. For this, they need to implement better BPM governance on a global basis, possibly through some center of excellence practices. They plan to do a survey of Citibank worldwide to identify the critical processes not handled by the ERP, and try to leverage some coordinated efforts for development as well as sharing experiences and best practices.

There’s one more breakout slot but nothing catches my eye, so I’m going to call it quits for TIBCO NOW 2014, and head out to enjoy a bit of San Francisco before I head home tomorrow morning. This is my last conference for the year, but I have a backlog of half-written product reviews that I will try to get up here before too long.

AMX BPM and Analytics at TIBCONOW

Nicolas Marzin, from the TIBCO BPM field group, presented a breakout session on the benefits of combining BPM and analytics — I’m not sure that anyone really needs to be convinced of the benefits, although plenty of organizations don’t implement this very well (or at all), so it obviously isn’t given a high priority in some situations.

BPM analytics have a number of different audiences — end users, team leaders, line-of-business managers, and customer service managers — and each of them is interested in different things, from operational performance to customer satisfaction measures. Since we’re talking about BPM analytics, most of these focus on the work being processed, but from different views and aspects of that process-related information. Regardless of the information that they seek, the analytics need to be easy to use as well as informative; analytics is driven more by questions than static reporting is.

There are some key BPM metrics regardless of industry:

  • Work backlog breakdown, including by priority, segment and skillset (required to determine resourcing requirements) or SLA status (required to calculate risk)
  • Resource pool and capacity
  • Aggregate process performance
  • Business data-specific measures, e.g., troublesome products or top customers

Monitoring and analytics are important not just for managing daily operations, but also to feed back into process improvement: actions taken based on the analytics can include work reprioritization, resource reallocation, or a request for process improvement. Some of these actions can be automated, particularly the first two; there’s also value in doing an in situ simulation to predict the impacts of these actions on the SLAs or costs.
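
To make the first couple of metrics, and the automated actions just described, a bit more concrete, here is a minimal and purely illustrative Java sketch. The WorkItem record and its field names are my own invention rather than any particular BPMS API; the point is just the shape of the calculation that would sit behind a backlog dashboard or an automated reprioritization trigger.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical work item shape, invented for illustration only.
record WorkItem(String id, String priority, String skillset, Instant slaDeadline) {}

public class BacklogMetrics {

    // Work backlog breakdown by priority (the first metric in the list above).
    static Map<String, Long> backlogByPriority(List<WorkItem> backlog) {
        return backlog.stream()
                .collect(Collectors.groupingBy(WorkItem::priority, Collectors.counting()));
    }

    // SLA risk: items due within a warning window; a figure like this could trigger
    // the automated work reprioritization or resource reallocation described above.
    static long itemsAtSlaRisk(List<WorkItem> backlog, Duration warningWindow) {
        Instant cutoff = Instant.now().plus(warningWindow);
        return backlog.stream()
                .filter(w -> w.slaDeadline().isBefore(cutoff))
                .count();
    }
}
```

In a real implementation these figures would come from the BPMS work management or history APIs rather than an in-memory list.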

By appropriately combining BPM and analytics, you can improve productivity, improve visibility, reduce time to action and improve the user experience. A good summary of the benefits; as I mentioned earlier, this is likely not really news to the customers in the audience, but I am guessing that a lot of them are not yet using analytics to the full extent in their BPM implementations, and this information might help them to justify it.

In AMX BPM, Spotfire was previously positioned for analytics and visualization, but TIBCO’s acquisition of Jaspersoft means that they are now bundling Jaspersoft with AMX BPM. You can use either (or both), and I think that TIBCO needs to get on top of identifying the use cases for each so that customers are not confused by two apparently overlapping BPM analytics solutions. Spotfire allows for very rich interactive visualizations of data from multiple sources, including drill-downs and what-if scenarios, especially when the analysis is more ad hoc and exploratory; Jaspersoft is better suited for pre-defined dashboards for monitoring well-understood KPIs.

TIBCONOW ActiveMatrix BPM Roadmap

On Monday, we heard an update on the current state of AMX BPM from Roger King; today, he gave us more on the new release and future plans in his “BPM for Tomorrow” breakout session. He started out introducing ActiveMatrix BPM 3.1, including the following key themes:

  • Case management
  • Data
  • Usability and productivity

As we saw in the previous breakout, the addition of ad hoc activities to process models enables case management capabilities. Ad hoc (disconnected) activities are fully supported in BPMN; TIBCO provides tooling to add preconditions and a choice of manual or automatic invocation, which allows an activity to be started manually or to start itself once the preconditions are met. If there are no preconditions, the activity will start (or be available to start) as soon as the process is instantiated. Manually-startable activities are surfaced for the user in the case UI, in the task list and in the process list. Case states and actions are defined in the case model, specifying the states, the actions, and which actions are valid for each state. Support for CMIS has been extended to allow the addition of content (in an external ECM system) to a case object via a case folder paradigm; this includes some new document operations such as linking/unlinking to a case object.
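
To make the precondition and invocation behaviour a bit more concrete, here is a rough sketch of the logic as I understand it. This is not TIBCO’s tooling or API, just a hypothetical Java illustration of ad hoc activities that either auto-start or become manually startable once their preconditions over the case data are met:

```java
import java.util.List;
import java.util.function.Predicate;

// Hypothetical case data; in a real case this would be the case object's business data.
record CaseData(boolean documentsReceived, boolean creditCheckComplete) {}

// Hypothetical ad hoc activity: a precondition over case data plus an invocation mode.
record AdHocActivity(String name, Predicate<CaseData> precondition, boolean automatic) {

    boolean shouldAutoStart(CaseData data) {
        return automatic && precondition.test(data);
    }

    boolean availableToStartManually(CaseData data) {
        return !automatic && precondition.test(data);
    }
}

class AdHocActivityExample {
    public static void main(String[] args) {
        CaseData data = new CaseData(true, false);
        List<AdHocActivity> activities = List.of(
            // No precondition: available to start as soon as the case/process is instantiated.
            new AdHocActivity("Request missing documents", d -> true, false),
            // Auto-starts itself once its precondition is met.
            new AdHocActivity("Run credit check", CaseData::documentsReceived, true),
            // Manually startable, but only once its precondition is met.
            new AdHocActivity("Approve account", CaseData::creditCheckComplete, false)
        );
        activities.forEach(a -> System.out.printf("%s: auto-start=%b, manually startable=%b%n",
                a.name(), a.shouldAutoStart(data), a.availableToStartManually(data)));
    }
}
```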

Data and self-service reporting are now enabled with the inclusion of the full capabilities of Jaspersoft — acquired by TIBCO in April 2014 — in AMX BPM (limited in use to BPM) and a number of out-of-the-box reports and dashboards. This works with case data as well as process data. The messaging and capabilities of Spotfire for BPM analytics have been a bit lacking in the past, and obviously Jaspersoft is being positioned as the “right” way to do BPM analytics (which is probably not happy news for the customers that sweated through the BPM-Spotfire implementations).

On the usability side, they have improved some BPM developer tools such as developer server configuration, and added “live development” capability for iterative development of UI forms without needing to rebuild and redeploy: just edit, save and test directly.

He then talked about their future product direction, which is predicated on their role in managing the “crown jewel” core business processes, necessitating a lot of non-functional capability such as high availability and scalability. As for market trends, they are seeing the cloud being used to drive innovation through experimentation because of the low cost of failure, and the rise of disposable enterprise apps. As enterprise processes become more and more digital, organizations are starting to respond with more automated business processes as well as case management for more dynamic processes. Not surprisingly, they are seeing BPMS with HTML5 as an enterprise rapid application development platform: I have been seeing a merging of the high end of the BPMS market with the application development platform market for some time.

Every organization has a lot of non-differentiating applications with standardized experiences, such as those that support procurement and HR; TIBCO’s target is the differentiating apps within an enterprise, which may not be the systems of record but likely are the systems of engagement. The key to this is enterprise case management and process-centric apps, which include data, process, organizational and social aspects, but also UI composition capabilities, since out-of-the-box UI is rarely differentiating. They are moving toward having a large part of their development environment on the web rather than in Eclipse, which will roll out around the time that Microsoft finally forces companies onto Internet Explorer 11, where HTML5 is properly supported. Through this, they will support more of the composable situational apps that can be built, rolled out, used and discarded in less time than it used to take to write the requirements for an app.

Declarative (data and rules-driven) versus imperative (predefined flow) process models are on their roadmap, and they will start to roll out declarative models in the context of case management: not to the exclusion of imperative models, but to augment them where they provide a better fit. Tied into this, at least in my mind, they are providing stronger support for rules integrated into BPM development.

He restated the official TIBCO party line that BPMN is not for business users, and that they need something more like Nimbus UPN instead; however, those are currently offered by two separate and non-integrated products that can’t exchange models, making Nimbus less useful for process discovery that will lead to automation. In the future, they will address this with enterprise BPM in the cloud, providing a “Nimbus-style” experience for business users and business-IT collaboration to start, then more analyst-style BPMN modeling, design and implementation. It’s not clear how they are going to reconcile UPN and BPMN, however.

King then announced TIBCO Cloud BPM — not yet available, but soon — which will be a BPM service powered by AMX BPM. They deprecated their Silver Fabric BPM support, which allowed you to run AMX BPM in the Amazon cloud; it wasn’t a particularly flexible or supportable cloud BPM offering, and a true SaaS offering will be a good addition when it comes along.

Case Management at TIBCONOW 2014

Yesterday, I attended the analyst sessions (which were mostly Q&A with Matt Quinn on the topics that he covered in the keynote), then was on the “Clash of the BPM Titans” panel, so not a lot of writing. No keynotes today, on this last day of TIBCO NOW 2014, but some BPM breakouts on the calendar — stay tuned.

I started the day with Jeremy Smith and Nam Ton That presenting on case management. They discussed customer journeys, and how their Fast Data platform allows you to detect and respond to that journey: this often includes semi-structured, dynamic processes that need to change based on external events and the process to date. It’s more than just process, of course; there needs to be context, actionable analytics, internal and external collaboration, and recommended actions, all working adaptively towards the customer-centric goal.

TIBCO addresses case management with additions to AMX BPM, not with a separate product; I believe that this is the best way to go for a lot of case management use cases that might need to combine more traditional structured processes with adaptive cases. The new capabilities added to support case management are:

  • Case data, providing context for performing actions. The case data model is created independently of a process model; the modeling uses UML to create relational-style ERDs, but also scripting and other functions beyond simple data modeling. This appears to be where the power — and the complexity — of the case management capabilities lie.
  • Case folders, integrating a variety of document sources, including from multiple ECM systems using CMIS, to act as the repository for case-related artifacts.
  • Case state and actions, allowing a user (or agent) to view and set the state of a case — e.g., received, in process, closed — and take any one of a number of actions allowed for the case when it is in that state (a rough sketch of this idea follows the list). This is modeled graphically with a state/action model, which can also apply user/role permissions, in a very similar fashion to their existing page flows capability. Actions can include social interactions, such as requesting information from an expert, accessing a Nimbus-based operations manual related to the current action, applying/viewing analytics to provide context for the action at that state, or providing recommendations such as next best action. Rules can be integrated through preconditions that prevent, require or invoke actions.
  • Ad hoc tasks, allowing the case user to instantiate a user task or subprocess; it appears they are doing this by pre-defining these in the process model (as ad hoc, or disconnected, tasks) so although they can be invoked on an ad hoc basis, they can’t be created from scratch by the user during execution. Given that multiple process models can be invoked from a case, there is still a lot of flexibility here.
  • Case UI, providing some out of the box user interfaces, but also providing a framework for building custom UIs or embedding these capabilities within another UI or portal.
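
For the case state/action model mentioned above, here is a rough, hypothetical Java sketch of the underlying idea: states, the actions that are valid in each state, and a check before an action is offered to a user. TIBCO models this graphically, and the states and actions here are made up purely for illustration.

```java
import java.util.EnumMap;
import java.util.Map;
import java.util.Set;

// Hypothetical case states and actions, invented for illustration; TIBCO defines these
// graphically in the case model rather than in code.
enum CaseState { RECEIVED, IN_PROCESS, CLOSED }
enum CaseAction { ASSIGN, REQUEST_EXPERT_INPUT, ESCALATE, CLOSE, REOPEN }

class CaseStateModel {

    private static final Map<CaseState, Set<CaseAction>> ALLOWED = new EnumMap<>(Map.of(
        CaseState.RECEIVED,   Set.of(CaseAction.ASSIGN, CaseAction.ESCALATE),
        CaseState.IN_PROCESS, Set.of(CaseAction.REQUEST_EXPERT_INPUT, CaseAction.ESCALATE, CaseAction.CLOSE),
        CaseState.CLOSED,     Set.of(CaseAction.REOPEN)
    ));

    // User/role permissions would further filter this set; omitted here for brevity.
    static boolean isAllowed(CaseState state, CaseAction action) {
        return ALLOWED.getOrDefault(state, Set.of()).contains(action);
    }
}
```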

Related cases can be linked via an association field created in the case data model; since this is, at heart, an integration application development environment, you can do pretty much anything although it looks like some of it might result in a fairly complex and technical case data model.

They didn’t do an actual demo during the presentation; I’ll drop by the showcase and take a peek at it later today.

BPM For Today At TIBCONOW

Roger King, who heads up TIBCO’s BPM product strategy, gave us an update on ActiveMatrix BPM, and some of the iProcess to AMX BPM tooling (there is a separate session on this tomorrow that I may attend, so possibly more on that then). It’s been four years since they launched AMX BPM; that forms the model-driven implementation side of their BPM offering, augmented by Nimbus for business stakeholders for procedure documentation and business-IT collaboration. AMX BPM provides a number of process patterns (e.g., maker-checker) built in, intelligent work and resource management, actionable analytic insights and more. This is built on an enterprise-strength platform — as you would expect from TIBCO — to support 24×7 real-time operations.

In May of this year, they released AMX BPM 3.0 with a number of new features:

  • Support all styles of processes in a single solution: human workflow, case management, rules-based processes, automation, etc.
  • To support case management, they enable global data to allow the creation of a case data model in a central repository separate from processes, allowing cases to exist independent of processes, although they can be acted upon by processes. Work items representing actions on cases can retrieve and update case data on demand, since it references the case data rather than having it copied to local instance data.
  • In work management enhancements, support for elastic organizations (branches, such as you see in retail banking). This allows defining a model for a branch — you could have different models for different sizes of branches, for example — then link to those from branch nodes in the static organization model. Work can then be managed relative to the features of those underlying models, e.g., “send to manager”.
  • Also in work management, they have added dynamic performers to allow for distribution based on business data in a running instance rather than pre-determined role assignments. This is supported by dynamic RQL (resource query language), a query language specifically for manipulating resource assignments.
  • Some new LDAP functions.

There will be another session on Wednesday that covers the features added since May, including a lot about case management; I’ll report more from that.

He also gave us some of the details of the iProcess to AMX BPM “conversion” tools, which migrate the process models (although not the applications that use those models): I assume that the conversion rate of their iProcess customers to AMX BPM has been lower than they expected, and they are hoping that this will move things along.

We then heard an update from Dan Egan on Nimbus, which will release version 9.5 this month: it is positioned as a “how to” guide for the enterprise, showing process models in a more consumable format than a full technical BPMN model. They have added collaboration capabilities so that users can review and provide feedback on the business processes, and the ability to model multiple process variants as multiple drill-downs from a single object. The idea is that you use Nimbus both as a place to document manual procedures that people need to perform, and as a process discovery tool for eventual automation, although the former is what Nimbus was originally designed for and seems to still be the main use case. They’ve spiffed up the UI, and will soon be offering their authoring, admin and governance functions on the web, allowing them to offer a fully web-based solution.

Nimbus uses their universal process notation (UPN) rather than BPMN for process models; King responded to a question about Nimbus supporting BPMN by stating that they do not believe that BPMN is a user-consumable format. They don’t have tooling — or at least haven’t talked about it — to convert UPN to BPMN; they’re going to need that if they want to position UPN as being for business-led process discovery as well as procedural documentation.

If you want to see the replay of this morning’s keynote, or watch tomorrow’s keynotes live or on demand, you can see them here.

BPM COE at TIBCONOW 2014

Raisa Mahomed of TIBCO presented a breakout session on best practices for building a BPM center of excellence. She started with a description of different types of COEs based on Forrester’s divisions (I’m too lazy to hack the HTML to add a table in WordPress for Android, so imagine a 2×2 quadrant with one axis being centralized versus decentralized, the other tactical, i.e., focused on cost and efficiency, versus strategic, i.e., focused on revenue and growth):

  • Center of Expertise (decentralized, strategic) – empowers business stakeholders with expert assistance, provides best practice, governance, technology that is configurable and consumable by business
  • Center of Excellence (centralized, strategic) – governs all processes in organization, enforces strict guidelines and process methodology governance, owns the BPMS, engagement models foster trust and collaboration including internal evangelists
  • Community of Practice (decentralized, tactical) – small teams, departmental priorities and scope, basic workflow capabilities, little or no governance
  • Process Factory (centralized, tactical) – optimized for process automation projects, processes as application development, frameworks

Center of Expertise and Process Factory work well together and are often seen in combination.


Best practices (these went by pretty quickly with a lot of detail on the slides, so I’ve just tried to capture some of the high points):

  • Find executive sponsorship for the COE: the sponsor must be influential across the organization, and positioned in the right place for the COE within your organization (e.g., COO, CIO, a separate architecture group)
  • Create a governance framework – style will be based on the type(s) of COEs in use
  • Establish a methodology, which may have to accommodate different levels of BPM maturity within the organization; be sure to address reusability and common components
  • Start with a core process, but relatively low complexity – this is exactly what I recommend, and I’m always frustrated by the “experts” that recommend starting with a non-core process even if the core processes are the target for implementation.
  • Encourage innovation and introduce disruptive technology.
  • Collaboration is key, via co-location and online collaboration spaces.
  • Don’t skip the metrics: remember that measuring project success is essential for future funding, as well as day-to-day operations and feeding the continuous improvement cycle.
  • Don’t let the program go stale, or become an ivory tower; rotate SMEs from the COE back into the business.
  • There’s not a single BPM skillset: you need a variety of skills spread across multiple people and roles.
  • Make a business case to provide justification for BPM projects.
  • Empower and educate through training and change management.
  • Avoid the “build it and they will come” mentality: just because you create some cool technology, that doesn’t mean that business people will stop doing the things that they’re doing to take it up.
  • Institute formal reviews of process models and solutions.

Nothing revolutionary here, but a good introduction and review of the best practices.

What’s Next In camunda – Wrapping Up Community Day

We finished the camunda community day with an update from camunda on features coming in 7.2 next month, and the future roadmap. camunda releases the community edition in advance of the commercial edition; this is the way that open source should work, but some commercial open source vendors switch that around so that the community version lags by as much as a full version.

The highlights of the 7.2 release are as follows:

  • CMMN-based case management engine, which includes the core activities (stages, human tasks, process tasks, case tasks, milestones and sentries), the base case instance and plan item lifecycle, and a CMMN model API and REST API on a common process engine. They demonstrated a basic case manager UI that can manage cases and the related tasks; I assume that this is really just a demo of what can be done rather than intended as production code. They also don’t have case modeling in their modeler yet, so it’s early days. (Minimal API sketches for this and the connectors follow the list.)
  • A variety of functions for speeding development: connectors (currently REST and SOAP), data formats, templating and scripting (calling external scripts, currently Groovy or JavaScript, with others to come)
  • New tasklist, updating the tasklist UI that they released just before announcing camunda as an open source project. It allows filters to be defined, including specifying who can see the results of a filter in addition to the search criteria; each filter then appears as a tab on the task list, in the color defined by the filter author. The sort order can’t currently be defined as part of the filter, but can be set on the general tasklist interface. This adds a third (left) column to the tasklist UI, which also shows the list of tasks and the form for the selected task. There’s still work to be done, but the new filters capability is a big step up, providing conceptually similar (although graphically quite different) functionality to the Brazos portal filters.
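
To give a sense of what the new CMMN engine looks like from code, here is a minimal sketch using the camunda CaseService as documented around the 7.2 release; the case definition key and plan item id are made up, and the exact method names should be checked against the current Javadoc.

```java
import org.camunda.bpm.engine.CaseService;
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.runtime.CaseExecution;
import org.camunda.bpm.engine.runtime.CaseInstance;

public class CaseApiSketch {

    public static void run(ProcessEngine engine) {
        CaseService caseService = engine.getCaseService();

        // Create a case instance from a deployed CMMN model; "loanApplication" is a made-up key.
        CaseInstance caseInstance = caseService.createCaseInstanceByKey("loanApplication");

        // Find a plan item (a human task with a made-up id) and start it manually...
        CaseExecution humanTask = caseService.createCaseExecutionQuery()
                .caseInstanceId(caseInstance.getId())
                .activityId("reviewDocuments")
                .singleResult();
        caseService.manuallyStart(humanTask.getId());

        // ...then complete it, which allows sentries and milestones to react.
        caseService.completeCaseExecution(humanTask.getId());
    }
}
```

Similarly, a minimal sketch of the REST (HTTP) connector via the camunda Connect Java API, again assuming the API roughly as documented, with a placeholder URL:

```java
import org.camunda.connect.Connectors;
import org.camunda.connect.httpclient.HttpConnector;
import org.camunda.connect.httpclient.HttpResponse;

public class ConnectorSketch {

    public static void main(String[] args) {
        // The HTTP (REST) connector from camunda Connect; the URL is a placeholder.
        HttpConnector http = Connectors.http();
        HttpResponse response = http.createRequest()
                .url("http://example.com/api/orders/42")
                .get()
                .execute();
        System.out.println(response.getStatusCode() + ": " + response.getResponse());
    }
}
```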

There was a list of other smaller enhancements and fixes, from platform support to performance improvements to new functions.

We also saw some work in progress from the labs. First of all, an update on bpmn.io, which I saw at bpmNEXT earlier this year: a BPMN viewer and web modeler. The viewer allows embedding of a BPMN diagram into a web page, including adding annotations, overlays and markers on the diagram, via a JavaScript API. Check out a live demo here, demonstrating a BPMN diff function based on two similar process diagrams. From the viewer, you can export the model to a file. You can also create BPMN diagrams from scratch or import from a file, either directly on their site or embedded in another page. The modeler is still a bit basic, and doesn’t handle containers (pools, lanes, subprocesses) very well yet, but that’s all coming; keep up with new functionality on the bpmn.io blog.

Another lab project is the camunda BPM workbench, a debugging tool that allows inspection of the runtime state of processes alongside the process model, allowing breakpoints to be set in the process model (rather than in code). A console interface allows for interrogation and updating of the process variables as the developer steps through the process. The process model is displayed using the bpmn.io viewer.

At the end of all the roadmap sessions, the audience had a chance to say what was most important for them in terms of what will be implemented when; there were questions about case management, centralized model repositories, bulk runtime operations and other features.

A great half-day; this is the first time that I’ve attended an open source code community day, and it’s quite a different environment from a typical vendor conference. We’re about to enter the beer-drinking portion of the day, so I will sign off for today; I’m giving the keynote at the main camunda user conference tomorrow morning, and I’m not sure how much blogging I’ll do during or after that.

Disclaimer: camunda paid my travel expenses to be here today and tomorrow, and is providing a speaking fee for tomorrow’s keynote. I was not compensated for blogging, and the opinions here (and in my keynote) are my own.

camunda Community Day technical presentations

The second customer speaker at camunda’s community day was Peter Hachenberger from 1&1 Internet, describing how they use Signavio and camunda BPM to create their Process Platform, which is in turn used by their clients’ developers for building and executing automated processes. His presentation was primarily about the details of their technical implementation of the platform; they have built some fairly comprehensive tools for monitoring and managing executing processes, many of which are facilitated by changes that they made to the core process engine, including retry behavior, process ID generator, multiple business keys, an asynchronous process starter API, an extended REST API and a few new commands. Since camunda BPM is open source, any customer such as 1&1 can take a copy of the code and make changes to it, optionally returning them to the community if they are valuable to others. There’s a bit of danger in this, in that if you make changes to core functionality (such as the engine) rather than create an extension or plug-in, and those changes do not end up back in the community version, you’re not only on your own for future development on those components but may not be able to upgrade to future versions.

We had a number of short (10 minute) presentations from community members to discuss extensions that they are working on:

  • Grails plugin to add camunda functionality to Grails applications
  • OSGi module extension for greater flexibility and configurability at runtime, including sharing process engines as services
  • Elasticsearch extension to write camunda BPM history data to an elasticsearch cluster to allow full-text searching, enabling more comprehensive analytics
  • camunda mocking extensions for process testing with mockito
  • Cockpit plugin to add interactive graphs and some statistical calculations (e.g., aggregation, regression, min/max) for process monitoring directly on the camunda history database

Some of these extension projects were done by camunda employees, but great to see the external community contributions as well.

Australia Post at camunda Community Day

I am giving the keynote at camunda’s BPMcon conference tomorrow, and since I arrived in Berlin a couple of days early, camunda invited me to attend their community day today, which is the open source community meeting. Nice to see such a great turnout — 70 or 80 people? — and I definitely wasn’t the one to travel the furthest to get here, since today’s opening presentation is from Rob Parker of Australia Post. Australia has a lot of the same issues as Canada when it comes to nation-wide services such as post, since we both have a lot of geography and not a lot of population: this means that a lot of services have to be delivered at a fiscal loss to sparsely-populated areas, with specific rules about what percentage of the population has to be within a certain distance of a postal outlet.

Post offices in particular are hard-hit by digital disruption; Australia Post has seen their letter delivery service decline by 1 billion articles (and the related revenue), even though the number of addresses to cover has increased. However, they have seen their parcel delivery business increase, even though this is a competitive business with courier companies. They’re also offering a number of other products, such as electronic bill payment, digital mail delivery and even passport interviews, which has driven them to create a more integrated multi-channel/multi-product architecture to be able to quickly bring new products to market. They’re using camunda BPM for their order management processes, both for customer orders and service fulfillment orders. Customer order processes support the various customer channels, then drive out one or more service order processes to fulfill a customer order.

They decided to use BPM in order to externalize processes from applications, making for more agile development and better reusability. They picked camunda because they wanted “just enough technology”: that is, they wanted to add process management to their existing Java application development environment, not rewrite all of their apps in a proprietary, monolithic BPMS app dev environment. camunda BPM is used to implement the multiple service order processes that might be kicked off by any given customer order, with their overall architecture handling the communication between the two order management layers: the customer order layer as a consumer for the service order layer producer.

Parker went into a lot of detail of how they have implemented this architecture, putting their BPM usage into the context of their overall technical architecture, and walked through the general process model for their service order that instantiates a dispatcher process for each customer order, which in turn instantiates a subprocess for each line item in the order. They really want to implement all of this in camunda, but are still using TIBCO for the dispatching process while they work out some of the sticky bits such as how subprocess cancelations are propagated to the parent process. They are also having some challenges with handling process versions, considering that they run 7×24: they need a mapping table that takes these temporal anomalies into consideration, so that the process version in use may be tied to the order date for longer-running order processes. They also created a business dashboard by modifying Cockpit, the camunda IT operations dashboard, to remove all of the “dangerous” operations while exposing the work in progress, and adding some additional functions such as searching by a business key.
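
As a rough illustration of the dispatcher pattern that Parker described, with one service order process per line item correlated back to the customer order, here is a hypothetical sketch using the standard camunda RuntimeService; the process definition key, business key format and variable names are mine, not Australia Post’s actual implementation.

```java
import java.util.List;
import java.util.Map;

import org.camunda.bpm.engine.RuntimeService;

// Hypothetical order line item, invented for illustration.
record LineItem(String sku, int quantity) {}

class ServiceOrderDispatcher {

    private final RuntimeService runtimeService;

    ServiceOrderDispatcher(RuntimeService runtimeService) {
        this.runtimeService = runtimeService;
    }

    // Start one service order process per line item, correlated back to the customer order
    // via the business key; the process key and variable names are made up.
    void dispatch(String customerOrderId, List<LineItem> items) {
        for (LineItem item : items) {
            runtimeService.startProcessInstanceByKey(
                    "serviceOrderFulfilment",
                    customerOrderId + "-" + item.sku(),
                    Map.of("sku", item.sku(), "quantity", item.quantity()));
        }
    }
}
```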

Parker ended with their outcomes, some expected, some less so: basically, BPMN 2.0 is really working for them, both for business-IT collaboration and for model-driven development; this level of business-IT alignment means that error handling can be shared, with business handling business errors and IT handling IT errors. They found that developers became productive very quickly since they were just adding some tools to their existing, familiar Java application development environment, although some had to be gently reminded to use the BPM capabilities instead of writing code.

It was great to see the reactions and interactions of the camunda team during the presentation: Australia Post is a “do-it-themselves” open source user of camunda, and as Parker discussed some of the shortcomings, they were obviously taking notes for future work. The presentation finished with him being presented with an award for the non-camunda person who contributed most to the community forum discussions, suggesting that you get out of open source what you put into it.

Survey on Mobile BPM and DM

James Taylor of Decision Management Solutions and I are doing some research into the use and integration of BPM (business process management) and DM (decision management) technology into mobile applications, as background for co-authoring a white paper. For this study we are focused on the mobile applications that you use to interact with businesses (as a consumer), government (as a constituent) or your employer (as an employee), NOT games or personal productivity tools.

As part of this research, we are doing a short survey on the use, development and support of mobile applications; please help us out by taking the survey:

surveymonkey.com/s/smartermobile

The first few questions are about your own use of mobile applications. If you are involved with the development of mobile applications, there are a few additional questions. It should take no more than 5-10 minutes to complete.

Thanks!