June BPM Conferences

After a month at home, I’m hitting the road for a few vendor events. First, a few where I’m attending but not presenting:

  • IBM Content 2014 in Toronto (so technically not hitting much of the road to get there), May 30 – this will travel to Austin, Minneapolis and Chicago in early June (but not with me)
  • SAP’s SAPPHIRENOW in Orlando, June 2-6
  • PegaWORLD in Washington DC, June 8-10

This gives me a chance to catch up on what’s been happening with their products since my last briefing, and talk to the internal teams as well as customers. For the latter two events, the vendors are covering my travel expenses but not compensating me for my time, so anything that I blog here (as usual) is my own opinion and not influenced by them.

After that, I have a couple of speaking gigs:

  • Two seminars hosted by IBM in Boston and Seattle, June 17 and 19 respectively, on new business operations imperatives (cloud, mobile, social and analytics with BPM)
  • DST’s ADVANCE Forum Europe in London, June 25, where I’ll be presenting “Designing Process-Based Applications: The Dos and Don’ts”, an updated version of the presentation that I gave at their North American conference in March

I likely won’t be blogging much from these ones since I’ll be busy presenting, but may post my slides online. Obviously, the vendors are paying my expenses as well as a speaking fee, but not for any specific coverage on my blog.

The Case For Smarter Process At IBM Impact 2014

At analyst events, I tend not to blog every presentation; rather, I listen, absorb and take some time to reflect on the themes. Since I spent the first part of last week at the analyst event at IBM Impact, then the second half across the country at Appian World, I waited until now to pull all the threads together here. IBM keeps the analysts busy at Impact, although I did get to the general session and a couple of keynotes, which were useful to provide context for the announcements that they made in pre-conference briefings and the analyst event itself.

A key theme at Impact this year was that of “composable business” (I have to check carefully every time I type that to make sure I don’t write “compostable business”, but someone did point out that it is about reuse…). I’m not sure that’s a very new concept: it seems to be about assembling the building blocks of business capabilities, processes and technologies in order to meet the current needs without completely retooling, which is sort of what we’ve all been saying that BPM, ODM and SOA can do for some years now.

Smarter Process is positioned as an enabler of composable business, and is IBM’s approach for “reinventing business operations” by enabling the development of customer-centric applications that push top-line growth, while still providing the efficiency and optimization table stakes. Supporting knowledge workers has become a big part of this, which leads to IBM’s biggest new feature in BPM: the inclusion of “basic” case management within BPM. The idea is that you will be able to support a broader range of work types on a single platform: pre-defined “structured” processes, structured processes with some ad hoc activities, ad hoc (case) work that can invoke structured process segments, and fully ad hoc work. I’ve been talking about this range of work types for quite a while, and how we need products that can range across them, because I see so few real-world processes that fit into the purely structured or the purely unstructured ends of the spectrum: almost everything lies somewhere in the middle, where there is a mix of both. In fact, what IBM is providing is “production case management”, where a designer (probably not technical, or not very technical) creates a case template that pre-defines all of the possible ad hoc activities and structured process fragments; the end user can choose which activities to run in which order, although some may be required or have dependencies. This isn’t the “adaptive case management” extreme end of the spectrum, where the end user has complete control and can create their own activities on the fly, but PCM covers a huge range of use cases in business today.

“But wait,” you say, “IBM already has case management with IBM Case Manager. What’s the difference?” Well, IBM BPM (Lombardi heritage) provides full BPM capabilities including process analytics and governance, plus basic case capabilities, on the IBM BPM platform; IBM Case Manager (FileNet heritage) provides full content and case capabilities including content analytics and governance, plus basic workflow capabilities, on the IBM ECM platform. Hmmm, sounds like something that Marketing would say. The Smarter Process portfolio graphic includes the three primary components of Operational Decision Management, Business Process Management and Case Management, but doesn’t actually specify which product provides which functionality, leaving it open for case management to come from either BPM or ICM. Are we finally seeing the beginning of the end of the split between process management in BPM and ICM? The answer to that is likely more political than technical – these products report up through different parts of IBM, turning the merging/refactoring of them into a turf war – and I don’t have a crystal ball, but I’m guessing that we’ll gradually see more case capabilities in BPM and a more complete integration with ECM, such that the current ICM capabilities become redundant, and IBM BPM will expand to manage the full spectrum of work types. The 1,000th cut may finally be approaching. Unfortunately for ICM users, there’s no tooling or migration path to move from ICM to BPM (presumably, no one is even talking about going the other way) since they are built on different infrastructure. There wasn’t really a big fuss made about this new functionality, or how it might overlap with ICM, outside the BPM analyst group; in fact, Bruce Silver quipped “IBM Merges Case into BPM but forgets to announce it”.

The new case management functions are embedded within the BPM environment, and appear fairly well integrated: although a simple web-based case design tool is used instead of the BPM Eclipse authoring environment, the runtime appears within the BPM process portal. The case detail view shows the case data, case documents and subfolders, running tasks, activities that can be started manually (including processes), and an overall status – similar enough to what you would see with any work item that it won’t be completely foreign, but with the information and controls required for case management. Under the covers, the ad hoc activities execute in the BPM (not ICM) process engine, and a copy of ECM is embedded within BPM to support the case folder and document artifacts.

The design environment is pretty simple, and very similar to some parts of the ICM design environment: required and optional ad hoc activities are defined, and the start trigger of each activity is specified (manual, or automatic based on declarative logic or an event). Preconditions can be set, so that an activity can’t be started (or won’t automatically start) until certain conditions are met. If you need ad hoc activities in the context of a structured process, these can be defined in the usual BPM design environment – there’s no real magic about this, since ad hoc (that is, not connected by flow lines) activities are part of the BPMN standard and have been available for some time in IBM BPM. The case design environment is integrated with Process Designer and Process Center for repository and versioning, and case management is being sold as an add-on to IBM BPM Advanced.
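To make the precondition mechanics concrete, here’s a minimal TypeScript sketch of how a production case template of this kind could be represented; the structure, names and data are my own illustration, not IBM’s actual model or API.

```typescript
// Hypothetical illustration of a production case management template:
// ad hoc activities with required/optional flags and declarative preconditions.
type CaseData = Record<string, unknown>;

interface AdHocActivity {
  id: string;
  name: string;
  required: boolean;                          // must be completed before the case can close
  autoStart: boolean;                         // start automatically once the precondition holds
  precondition?: (data: CaseData) => boolean; // declarative start condition
}

const loanCaseTemplate: AdHocActivity[] = [
  { id: "collectDocs", name: "Collect supporting documents", required: true, autoStart: false },
  {
    id: "fraudReview",
    name: "Fraud review",
    required: false,
    autoStart: true,
    precondition: (d) => (d.loanAmount as number) > 100_000, // only relevant for large loans
  },
  {
    id: "finalApproval",
    name: "Final approval",
    required: true,
    autoStart: false,
    precondition: (d) => d.docsComplete === true,
  },
];

// Which activities a case worker could start (or the engine would auto-start) right now.
function startableActivities(template: AdHocActivity[], data: CaseData): AdHocActivity[] {
  return template.filter((a) => !a.precondition || a.precondition(data));
}

console.log(
  startableActivities(loanCaseTemplate, { loanAmount: 250_000, docsComplete: false }).map((a) => a.name),
);
// -> [ 'Collect supporting documents', 'Fraud review' ]
```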

Aside from the case management announcement, there are some new mobile capabilities in IBM BPM: the ability to design and playback responsive Coaches (UI) for multiple form factors, and pushing some services down to the browser. These will make the UI look better and work faster, so all good there. IBM also gave a shout out to BP3’s mobile portal product, Brazos, for developing iOS and Android apps for IBM BPM; depending on whether you want to go with responsive browser or native apps as a front-end for BPM, you’re covered.

They also announced some enhancements to Business Monitor: a more efficient, high-performance pub-sub style of event handling, and the ability to collect events from any source, although the integration into case management (either in BPM or ICM) at design time still seems a bit rudimentary. They’ve also upgraded to Cognos BI 10.2.1 as the underlying platform, which brings more powerful visualizations via the RAVE engine. I have the impression that Business Monitor isn’t as popular as expected as a BPM add-on: possibly by the time that organizations get their processes up and running, they don’t have the time, energy or funds for a full-on monitoring and analytics solution. That’s too bad, since monitoring can drive a lot of process improvement benefits; it might make sense to bundle in some of this capability to at least give a teaser to BPM customers.

In BPM cloud news, there are some security enhancements to the Softlayer-based BPM implementations, including 2-factor authentication and SAML for identity management, plus new pricing at $199/user/month with concurrent user pricing scenarios for infrequent users. What was more interesting is what was not announced: the new Bluemix cloud development platform offers decision services, but no process services.

Blueworks Live seemed to have the fewest announcements, although it now has review and approval processes for models, which is a nice governance addition. IBM can also now provide Blueworks Live in a private cloud – still hosted but isolated as a single tenant – for those who are really paranoid about their process models.

bpmNEXT 2014 Wrapup And Best In Show

I couldn’t force myself to write about the last two sessions of bpmNEXT: the first was a completely incomprehensible (to me) demo, and the second spent half of the time on slides and half on a demo that didn’t inspire me enough to actually put my hands on the keyboard. Maybe it’s just conference fatigue after two full days of this.

However, we did get a link to the Google Hangout recording of the BPMN model interchange demo from yesterday (be sure to set it to HD or you’ll miss a lot of the screen detail).

We had a final wrapup address from Bruce Silver, and he announced our vote for the best in show: Stefan Andreasen of Kapow – congrats!

I’m headed home soon to finish my month of travel; I’ll be Toronto-based until the end of April when IBM Impact rolls around.

bpmNEXT 2014 Thursday Session 2: Decisions And Flexibility

In the second half of the morning, we started with James Taylor of Decision Management Solutions showing how to use decision modeling for simpler, smarter, more agile processes. He showed what a process model looks like in the absence of externalized decisions and rules: it’s a mess of gateways and branches that basically creates a decision tree in BPMN. A cleaner solution is to externalize the decisions so that they are called as a business rules activity from the process model, but the usual challenge is that the decision logic is opaque from the viewpoint of the process modeler. James demonstrated how the DecisionsFirst modeler can be used to model decisions using the Decision Model and Notation standard, then link a read-only view of that to a process model (which he created in Signavio) so that the process modeler can see the logic behind the decision as if it were a callable subprocess. He stepped through the notation within a decision called from a loan origination process, then took us into the full DecisionsFirst modeler to add another decision to the diagram. The interesting thing about decision modeling, which is exploited in the tool, is that it is based on firmer notions of reusability of data sources, decisions and other objects than we see in process models: although reusability can definitely exist in process models, the modeling tools often don’t support it well. DecisionsFirst isn’t a rules/decision engine itself: it’s a modeling environment where decisions are assembled from the rules and decisions in other environments, including external engines, spreadsheet-based decision tables, or knowledge sources describing the decision. It also allows linking to the processes from which it is invoked, objectives and organizational context; since this is a collaborative authoring environment, it can also include comments from other designers.
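The contrast James described – a decision tree encoded as BPMN gateways versus an externalized decision called from a single business rules activity – is easy to see in code. The following TypeScript sketch uses an invented loan eligibility table to illustrate the idea; it isn’t the DecisionsFirst or DMN tooling itself.

```typescript
// Instead of encoding eligibility as a cascade of gateways in the process model,
// the process calls a single decision; the logic lives in an externalized table.
interface Applicant { creditScore: number; income: number; existingCustomer: boolean; }

// Hypothetical decision table: first matching row wins (in the spirit of a DMN "first" hit policy).
const loanEligibilityRules: Array<{ when: (a: Applicant) => boolean; outcome: string }> = [
  { when: (a) => a.creditScore < 580, outcome: "Decline" },
  { when: (a) => a.creditScore >= 700 && a.income > 50_000, outcome: "Approve" },
  { when: (a) => a.existingCustomer, outcome: "Refer to underwriter" },
  { when: () => true, outcome: "Manual review" }, // default row
];

// The business rules activity in the process model reduces to one call.
function decideLoanEligibility(a: Applicant): string {
  return loanEligibilityRules.find((r) => r.when(a))!.outcome;
}

console.log(decideLoanEligibility({ creditScore: 720, income: 80_000, existingCustomer: false }));
// -> "Approve"
```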

François Chevresson-Aubain and Aurélien Pupier of Bonitasoft were up next to show how to build flexibility into deployed processes through a few simple but powerful features. First, adding collaboration tasks at runtime, so that a user at a pre-defined step who needs to include other users can do so even if collaboration wasn’t built into the process at that point. Second, process model parameters can be changed (by an administrator) at runtime, which will impact all running processes based on that model: the situation demonstrated was to change an external service connector when the service call failed, then replay the tasks that failed on that service call. Both of these features are intended to address dynamic environments where the situation at runtime may be different from that at design time, and how to adjust both manual and automated tasks to accommodate those differences.

We finished the morning with Robert Shapiro of Process Analytica on improving resource utilization and productivity using his Optima workbench. Optima is a tool for a serious analyst – likely with some amount of statistical or data science background – to import a process model and runtime data, set optimization parameters (e.g., reduce resource idleness without unduly impacting cycle time), simulate the process, analyze the results, and determine how to best allocate resources in order to optimize relative to the parameters. Although a complex environment, it provides a lot of visualization of the analytics and optimization; Robert actually encourages “eyeballing” the charts and playing around with parameters to fine-tune the process, although he has a great deal more experience at that than the average user. There are a number of analytical tools that can be applied to the data, such as critical path modeling, and financial parameters to optimize revenues and costs. It can also do quite a bit of process mining based on event log inputs in XES format, including deriving a BPMN process model and data correlation based on the event logs; this type of detailed offline analysis could be applied with the data captured and visualized through an intelligent business operations dashboard for advanced process optimization.

We have one more short session after lunch, then best in show voting before bpmNEXT wraps up for another year.

bpmNEXT 2014 Thursday Session 1: Intelligence And A Bit More BPMN

Harsh Jegadeesan of SAP set the dress code bar high by kicking off the Thursday demos in a suit jacket, although I did see Thomas Volmering and Patrick Schmidt straightening his collar before the start. He also set a high bar for the day’s demos by showing how to illuminate business operations with process intelligence. He discussed a scenario of a logistics hub (such as Amazon’s), and the specific challenges of the hub operations manager who has to deal with inbound and outbound flights, and sorting all of the shipments between them throughout the day. Better visibility into the operations across multiple systems allows problems to be detected and resolved while they are still developing by reallocating the workforce. Harsh showed a HANA-based hub operations dashboard, where the milestones for shipments demarcate the phases of the value chain: from arrival to ground handling to warehouse to outbound buffer to loading and takeoff. Real-time information is pulled from each of the systems involved, and KPIs are displayed; drill-downs can show lower-level aggregate or even individual instance data to determine what is causing missed KPIs – in the demo, shipments from certain other hubs are not being unloaded quickly enough. But more than just a dashboard, this allows the hub operations manager to add a task directly in the context of the problem and assign it (via an @mention) to someone else, for example, to direct more trucks to unload the shipments. The dashboard can also make recommendations, such as changing the flights for specific shipments to improve the overall flow and KPIs. He showed a flight map view of all inbound and outbound flights, where the hub operations manager can click on a specific flight and see the related data. He showed the design environment for creating the intelligent business operations process by assembling SAP and non-SAP systems using BPMN, mapping events from those systems onto the value chain phases (using BPAF where available), thereby providing visibility into those systems from the dashboard; this builds a semantic data mart inside HANA for the different scenarios to support the dashboard but also for more in-depth analytics and optimization. They’ve also created a specification for Process Façade, an interface for unifying process platforms by integrating using BPMN, BPAF and other standards, plus their own process-based systems; at some point, expect this to open up for broader vendor use. Some nice case studies of process visualization in large-scale enterprises.

Dominic Greenwood of Whitestein on intelligent process execution, starting by defining an intelligent process: it has experiences (acquired data), knowledge (actionable information, or analytical interpretation of acquired data), goals (adoptable intentions, or operationally-relevant behavioral directives), plans (ways to achieve goals through reusable action sequences, such as BPMN processes) and actions (result of executing plans). He sees intelligent process execution as an imperative because of the complexity of real-world processes; processes need to dynamically adapt, and process governance needs to actively apply constraints in this shifting environment. An intelligent process controller, or reflective agent, passes through a continuous cycle of observe, comprehend, deliberate, decide, act and learn; it can also collaborate with other intelligent process controllers. He discussed a case study in transportation logistics – a massively complex version of the travelling salesman problem – where a network of multi-modal vehicles has to be optimized for delivery of goods that are moved through multiple legs to reach their destinations. This involves knowledge of the goods and their specific requirements, vehicle sensors of various types, fleet management, hub/port systems, traffic and weather, and personnel assignments. DHL in Europe is using this to manage 60,000 orders per day, allocated between 17,500 vehicles that are constantly in motion, managed by 300 dispatchers across 24 countries with every order changing at least once while en route. The intelligent process controllers are automating many of the dispatching decisions, providing a 25-30% operational efficiency boost and a 12% reduction in transportation costs. A too-short demo that just walked through their process model to show how some of these things are assigned, but an interesting look into intelligent processes, and a nice tie-in to Harsh’s demonstration immediately preceding.

Next up was Jakob Freund of camunda on BPMN everywhere; camunda provides an open-source BPM framework intended to be used by Java developers to incorporate process automation into their applications, but he’s here today to talk about bpmn.io: an open-source toolkit in Javascript that provides a framework for developers and a BPMN web modeler, all published on GitHub. The first iteration is kicking off next week, and the web modeler will be available later this year. Unlike yesterday’s demonstrators who firmly expressed the value of no-code BPM implementations, Jakob jumped straight into code to show how to use the Javascript classes to render BPMN XML as a graphical diagram and add annotations around the display of elements. He showed how these concepts are being used in their Cockpit process monitoring product; it could also be used to demonstrate or teach BPMN, making use of functions such as process animation. He demonstrated uploading a BPMN diagram (as XML) to the camunda community site; the site uses the Javascript libraries to render the diagram, and allows selecting specific elements in the diagram and adding comments, which are then shown as a numeric badge indicating the comment count on each annotated element. He demonstrated some of the starting functionality of the web modeler, but there’s a lot of work to do there still; once it’s released, any developer can download the code and embed that web modeler into their own applications.
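To give a flavour of what building on the toolkit looks like, here’s a rough TypeScript sketch along the lines of what Jakob showed: rendering BPMN XML in a page and badging one element with a comment count via the bpmn-js viewer. The container ID, element ID and count are placeholders, and the exact API details may differ between bpmn-js versions.

```typescript
import BpmnViewer from "bpmn-js";

// Render a BPMN 2.0 XML string into a page element and attach a comment-count badge
// to one task, roughly what the camunda community site does.
// '#canvas' and 'Task_ReviewOrder' are placeholders for this sketch.
async function showDiagramWithComments(bpmnXml: string): Promise<void> {
  const viewer = new BpmnViewer({ container: "#canvas" });

  // Newer bpmn-js versions return a promise from importXML; older ones took a callback.
  await viewer.importXML(bpmnXml);

  // The overlays service attaches arbitrary HTML to a diagram element.
  const overlays = viewer.get("overlays") as any;
  overlays.add("Task_ReviewOrder", {
    position: { top: -10, right: -10 },
    html: '<span class="comment-count">3</span>',
  });
}
```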

We finished the first morning session with Keith Swenson of Fujitsu on letting go of control: we’re back on the topic of agents, which Keith initially defined as autonomous, goal-directed software that does something for you, before pointing out that that describes a lot of software today. He expanded that definition to mean something more…human-like: a personal assistant that can coordinate your communications with those of other domains. These types of agents do a lot of communication amongst themselves in a rules-based dynamic fashion, simplifying and reducing the communication that the people need to do in order to achieve their goals. The key to determining what the personal assistants should be doing is to observe emergent behavior through analytics. Keith demonstrated a healthcare scenario using Cognoscenti, an open-source adaptive case management project; a patient and several different clinicians could set goals, be assigned tasks, review documents and perform other activities centered around the patient’s care. It also allows the definition of personal assistants to do specific rules-based actions, such as cloning cases and synchronizing documents between federated environments (since both local and cloud environments may be used by different participants in the same case), accepting tasks, and more; copying between environments is essential so that each participant can have their information within their own domain of control, but with the ability to synchronize content and tasks. The personal assistants are pretty simple at this point, but the concept is that they are helping to coordinate communications, and the communications and documents are all distributed via encrypted channels, making them safer than email. A lot of similarities with Dominic’s intelligent process controllers, but on a more human scale. As many thousands of these personal assistant interactions occur, patterns will begin to emerge of the process flows between the people involved, which can then be used to build more intelligence into the agents and the flows.

bpmNEXT 2014 Wednesday Afternoon 2: Unstructured Processes

We’re in the Wednesday home stretch; this session didn’t have a specific theme but it seemed to mostly deal with unstructured processes and event-driven systems.

The session started with John Reynolds and Amy Dickson of IBM on blending structured flow and event-condition-action patterns within process models. John showed how they are modeling ad hoc activities using BPMN (rather than CMMN): basically, disconnected activities can be identified as optional or mandatory, and can have precondition events and expressions that specify when and how they are triggered. It’s not completely standard BPMN, but uses a relatively small number of extensions to indicate how the activity is started and whether it is optional or required. The user sees activities with different visual indicators to show which are required or optional, and whether an activity is still waiting for a precondition. This exposes the natural capabilities of the execution engine as an event handling engine; BPMN just provides a model for what happens next after an action occurs, as well as handling the flow model portions of the process. They’re looking at adding milestones and other constructs; this is an early pre-release version and I expect that we’ll see some of these ideas rolling into their products over the months to come. An interesting way to combine process flows and ad hoc activities in the same (pre-defined) process while hiding some of the complexity of events from the users; also interesting in that this indicates some of IBM’s direction for handling ad hoc cases in BPM.

Ashok Anand and R.V.S. Mani of Inswit presented their beta appiyo “business response platform”, which is an application development platform for small, simple BPM apps that can interconnect with social media such as Facebook; however, an overly-short demo followed an overly-long presentation, so it was difficult to grasp much of the capability.

We finished the day with Jason Bloomberg of EnterpriseWeb discussing agent-oriented architecture for cross-process governance: a “style of EA that drives business agility by leveraging policy-based, data-driven intelligent agents”. They call their intelligent agent SmartAlex; it’s like Siri for the enterprise, dynamically connecting people and content at the right time in a goal-driven manner rather than with pre-defined processes. Every step is just an event that calls SmartAlex; SmartAlex interprets models, evaluates and applies policies and rules, then delivers results or makes recommendations using a custom interface and payload depending on the context. Agents can not only coordinate local processes, but also track what’s happening in all related processes across an enterprise to provide overall governance and support integrated functions. EnterpriseWeb isn’t a BPM tool; it’s a tool for building tools, including workflows. Bill Malyk joined remotely to do the demo, which centered on detecting conflicts of interest declaratively: he showed creating an application related to cases in the system, and declaring that potential conflict of interest cases are those where relationships exist between the people involved. This immediately identified existing cases where there is a potential conflict of interest, and allowed navigation through the graph that links the cases and the criteria. He then demonstrated creating a process related to the application, which can then run flow-oriented processes based on potential conflicts of interest found using the declarative logic specified earlier. Some powerful capabilities for declarative, agent-based applications that take advantage of a variety of data sources and fact models, with greater flexibility and ease of use than complex event processing platforms.

My brain is full, so it must be time for dinner and another evening of drinks and conversation; I’ll be back tomorrow with another full morning and half afternoon of sessions.

bpmNEXT 2014 Wednesday Afternoon 1: Mo’ Models

Denis Gagne of Trisotech was back after lunch at bpmNEXT demonstrating socializing process change with their BPMN web modeler. He showed their process animation feature, which allows you to follow the flow through a process and see what happens at each step, and view rich media that has been attached at any given step to explain that step. He showed a process for an Amazon order, where each step had a slideshow or video attached to show the actual work that was being performed at that step; the tool supports YouTube, Slideshare, Dropbox and a few others natively, plus any URL as an attachment to any element in the process. The animated process can be referenced by a URL, allowing it to be easily distributed and socialized. This provides a way for people to learn more about the process, and can be used as an employee training tool or a customer experience enhancement. Even without the rich media enhancements, the process animation can be used to debug processes and find BPMN logical errors (e.g., deadlocks, orphan branches) by allowing the designer to walk through the process and see how the tokens are processed through the model – most modeling tools only check that the BPMN is syntactically correct, not for more complex logical errors that can result in unexpected and unwanted scenarios. Note that this is different from process simulation (which they also offer), which is typically used to estimate performance based on aggregate instances.

Bruce Silver took a break from moderating to do a demo together with Stephan Fischli and Antonio Palumbo of itp commerce on wizard-based generation of “good BPMN” that they’ve done through their BPMessentials collaboration for BPMN training and certification. Bruce’s book BPMN Method and Style as well as his courses attempt to teach good BPMN, where the process logic is evident from the printed diagram in spite of things that can tend to confuse a reader, such as hierarchical modeling forms. He uses a top-down methodology where you identify the start and end states of a process instance, then decompose the process into 6-10 steps where each is an activity aligned with the process instance (i.e., no multi-instance activities), and enumerate the possible end states of each activity if there is more than one, so that end states within subprocesses can be matched to gateways that immediately follow the subprocesses. This all takes a bit of a developer’s mindset that’s typically not seen in business analysts who might be creating BPMN models, meaning that we can still end up with spaghetti process models even in BPMN. Bruce walked through an order-to-cash scenario, then Stephan and Antonio took over to demonstrate how their tool creates a BPMN model based on a wizard that walks through the steps of the BPMN method and style: first the process start and (one or more) end states; then a list of the major steps, where each is named, the end states enumerated and (optionally) the performer identified; then the activity-end state pairs are listed so that the user can specify the target (following step), which effectively creates the process flow diagram; then, each activity can be expanded as a subprocess by listing the child activities and the end states; finally, the message flows and lanes are specified by stating which activities have incoming and outgoing message flows. The wizard then creates the BPMN process model in the itp commerce Visio tool where all of the style rules are enforced. Without doubt, this creates better BPMN, although specifying a branching process model via a list of activities and end states might not be much more obvious than creating the process model directly. I know that the itp commerce tool and some other BPMN modelers can also check a BPMN model for violations of the style rules; I assume that detecting and fixing the rule violations from a model is just another way of achieving the same goal.

Last up before the afternoon break was Gero Decker of Signavio to demonstrate combining process modeling and enterprise architecture. Signavio’s only product is their process modeler – used to model, collaborate, publish and implement models – which means that they typically deal with process designers and process centers of excellence. However, they are finding that they are now running into EA modelers as they start to move into process architecture and governance, and application owners for application lifecycle management. EA modelers have to deal with the issue of whether to use a unified tool with a single object repository for all modeling and unified model governance, or multiple best-of-breed tools where metamodels can be synchronized and may be slaved between tools. Signavio is pushing the second alternative, where their tool integrates with or overlays other tools such as SAP Solution Manager and leanIX. Signavio has added ArchiMate open standard enterprise architecture model types to their tool for EA modeling, creating linkages and tracing from ArchiMate objects to BPMN models. Gero demonstrated what the ArchiMate models look like in Signavio, then how processes in leanIX can be directly linked to Signavio process models, as well as having applications from the EA portfolio available as performers to specify in a Signavio process model. Process models created in Signavio that use applications from the portfolio then show up (via automated synchronization) in leanIX as references to those applications. He also showed an integration with Effektif for approving changes to the process model in Signavio, comparing the before and after versions of the flow, since there is a pluggable connector to Signavio from Effektif processes. Connections to other tools could be built using the Signavio REST API. Nice integration between process models and application portfolio models in separate tools, as well as the model approval workflow.

bpmNEXT 2014: BPMN MIWG Demo

The BPMN Model Interchange Working Group is all about (as you might guess from the name) interchanging BPMN models between different vendors’ products: something that OMG promised with the BPMN standard, but which never actually worked out of the box due to contradictions in the standard and misinterpretations by some vendors. To finish off Wednesday morning at bpmNEXT, we have a live demo involving 12 different tools with participants in different locations, with Denis Gagne of Trisotech (who chairs the working group) and Falko Menge of camunda (who heads up the test automation subgroup) on the stage, a few others here on the sidelines, some at the OMG meeting in Reston, and some in their offices in France and Poland.

To start, different lanes of the process were designed by four different people on IBM Blueworks Live, Activiti, camunda and W4; each then exported their process models and saved them to Dropbox. Denis switched back and forth between the different screens (they were all on a Google Hangout) to show us what was happening as they proceeded, and we could see the notifications from Dropbox as the different files were added. In the second stage, Bonitasoft was used to augment the Blueworks Live model, itp-commerce edited the Activiti model, and Signavio edited the camunda model. In the third stage, ADONIS was used to merge together the lanes created in several of the models (I lost track of which ones) into a single process model, and Yaoqiang was used to merge the Signavio and camunda models. Then, the Trisotech Visio BPMN modeler was used to assemble the ADONIS and Yaoqiang models into the final model with multiple pools. At the end, the final model was imported into a number of different tools: the Trisotech web modeler, the Oracle BPM modeler, the bpmn.io environment from camunda, and directly into the W4 execution engine (without passing through a modeling environment). Wow.

The files exchanged were BPMN XML files, and the only limitation on which tool could be used at which stage was that some tools support only a single pool, so they had to be used at the earlier stages where each tool was modeling only a single lane or pool. This is how BPMN was supposed to work, but the MIWG has found a number of inconsistencies in the standard and also some issues with the vendors’ tools that had to be corrected.
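As a concrete illustration of what interchange means at the file level, here’s a small browser-side TypeScript sketch that inspects an exported BPMN XML file before handing it to another tool; the summary check is my own example, not part of the MIWG test suite.

```typescript
// Quick sanity check on an exported BPMN 2.0 file before importing it elsewhere:
// parse the XML and count the pools, tasks and flows it declares.
const BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL";

function summarizeBpmn(bpmnXml: string): Record<string, number> {
  const doc = new DOMParser().parseFromString(bpmnXml, "application/xml");
  const count = (tag: string) => doc.getElementsByTagNameNS(BPMN_NS, tag).length;
  return {
    participants: count("participant"),  // pools
    tasks: count("task") + count("userTask") + count("serviceTask"),
    sequenceFlows: count("sequenceFlow"),
    messageFlows: count("messageFlow"),
  };
}
```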

They have developed a number of test cases that cover the descriptive and analytic classes within BPMN, and automated tools to test the outcome of different vendors’ modelers. Over 20 BPMN modelers have been tested for import, export and roundtrip capabilities; if you’re a BPMS vendor supporting BPMN 2.0 (or claiming to), you should be on this list because there are a lot of us who just aren’t going to write our own XSLT to translate your models into something that can be read by another tool. If you’re a process designer using a BPMS, shame your vendor into participating because it creates a more flexible and inclusive environment for your design and modeling efforts.

This is hugely valuable work that they’re doing in the working group; note that you don’t have to be an OMG member to get involved, and the BPMN MIWG would love to have others join in to help make this work even better.

We’re off for lunch and a break now, then back for six more sessions this afternoon. Did I mention how awesome bpmNEXT is?

bpmNEXT 2014 Wednesday Morning: Cloud, Synthetic APIs and Models

I’m not going to say anything about last night, but it’s a bit of a subdued crowd here this morning at bpmNEXT.

We started the day with Tom Baeyens of Effektif talking about cloud workflow simplified. I reviewed Effektif in January at the time of launch and liked the simple and accessible capabilities that it offers; Tom’s premise is that BPM is just as useful as email, and it needs to be just as simple to use as email so that we are not reliant on a handful of power users inside an organization to make processes work. To do this, we need to strip out features rather than add features, and reduce the barriers to trying it out by offering it in the cloud. Inspired by Trello (simple task management) and IFTTT (simple cloud integration, which basically boils down every process to a trigger and an action), Effektif brings personal DIY workflow to the enterprise, and also provides a bridge to enterprise process management through layers of functionality. Individual users can get started building their own simple workflows to automate their day-to-day tasks, then more technical resources can add functionality to turn these into fully-integrated business processes. Tom gave a demo of Effektif, starting with creating a checklist of items to be completed, with the ability to add comments, include participants and add attachments to the case. There have been a few changes since my review: you can use Markdown to format comments (I think that understanding of Markdown is very uncommon in business, so this may not be as well-adopted as, for example, a TinyMCE formatted text field); cases can now be started by a form as well as manually or via email; and Google Drive integration is emerging to support usage patterns such as storing an email attachment when the email is used to instantiate the case. He also talked about some roadmap items, such as migrating case instances from one version of a process definition to another.

Next up was Stefan Andreasen of Kapow (now part of Kofax) on automation of manual processes with synthetic APIs – I’m happy for the opportunity to see this because I missed seeing anything about Kapow during my too-short trip to the Kofax Transform conference a couple of weeks ago. He walked through a scenario of a Ferrari dealership that looks up SEC filings to see who sold their stock options lately (hence has some ready cash to spend on a Ferrari), narrows that down with Bloomberg data on age, salary and region to find some pre-qualified sales leads, then loads them into Salesforce. Manually, this would be an overwhelming task, but Kapow can create synthetic APIs on top of each of these sites/apps to allow for data extraction and manipulation, then run those on a pre-set schedule. He started with a “Kapplet” (an application for business analysts) that extracts the SEC filing data, allows easy manual filtering by criteria such as filing amount and age, then lets the user select records for committal to Salesforce. The idea is that there are data sources out there that people don’t think of as data sources, and many web applications that don’t easily integrate with each other, so people end up manually copying and pasting (or re-keying) information from one screen to another; Kapow provides the modern-day equivalent of screen scraping that taps into the presentation logic and data (not the physical layout or style, hence less likely to break when the website changes) of any web app to add an API using a graphical visual flow/rules editor. Building by example, elements on a web page are visually tagged as being list items (requiring a loop), data elements to extract, and much more. It can automate a number of other things as well: Stefan showed how a local directory of cryptically-named files can be renamed to their actual titles based on a table-of-contents HTML document; this is very common for conference proceedings, and I have hundreds of file sets like this that I would love to rename. The synthetic APIs are exposed as REST services, and can be bundled into Kapplets so that the functionality is exposed through an application that is useable by non-technical users. Just as Tom Baeyens talked about lowering the barriers for BPM inside enterprises in the previous demo, Kapow is lowering the bar for application integration to service the unmet needs.
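Kapow builds these wrappers graphically, but the underlying pattern – treat a page’s presentation layer as a data source and expose it as a callable service that returns structured records – can be sketched in a few lines of TypeScript. The URL, table structure and field names below are invented for illustration and have nothing to do with Kapow’s actual tooling.

```typescript
// Generic illustration of a "synthetic API": fetch a page that was never meant to be
// an API, pull structured records out of its markup, and return them as JSON.
// The URL and table layout are hypothetical.
interface Filing { insider: string; amount: number; }

async function fetchRecentFilings(): Promise<Filing[]> {
  const html = await (await fetch("https://example.com/sec-filings")).text();
  const doc = new DOMParser().parseFromString(html, "text/html");

  // Each row in the filings table becomes one record.
  return Array.from(doc.querySelectorAll("table#filings tbody tr")).map((row) => {
    const cells = row.querySelectorAll("td");
    return {
      insider: cells[0]?.textContent?.trim() ?? "",
      amount: Number(cells[1]?.textContent?.replace(/[$,]/g, "") ?? 0),
    };
  });
}
```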

It would be great if Tom and Stefan put their heads together over lunch and whipped up an Effektif-Kapow demo; it seems like a natural fit.

Next was Scott Menter of BP Logix on a successor to flowcharts, namely their Process Director GANTT chart-style process interface – he said that he felt like he was talking about German Shepherds to a conference of cat-lovers – as a different way to represent processes that is less complex to build and modify than a flow diagram, and also provides better information on the temporal aspects and dependencies, such as when a process will complete and the impacts of delays. Rather than a “successor” model such as a flow chart, which models what comes after what, a GANTT chart is a “predecessor” model, which models the preconditions for each task. A subtle but important difference when the temporal dependencies are critical. Although you could map between the two model types on some level, BP Logix has a completely different model designer and execution engine, optimized for a process timeline. One cool thing about it is that it incorporates past experience: the time required to do a task in the past is overlaid on the process timeline, and predictions are made for how well this process is doing based on current instance performance and past performance, including for tasks that are far in the future. In other words, predictive analytics are baked right into each process since it is a temporal model, not an add-on such as you would have in a process based on a flow model.
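The predecessor idea is easiest to see in code: each task declares what must finish before it can start, and earliest start dates (and hence a predicted completion) fall out of a simple forward pass. This is a generic TypeScript sketch of the concept, not BP Logix’s engine.

```typescript
// A predecessor (GANTT-style) model: tasks declare what must finish before they start,
// and a forward pass computes each task's earliest start day.
interface TimelineTask { id: string; durationDays: number; predecessors: string[]; }

function earliestStarts(tasks: TimelineTask[]): Map<string, number> {
  const byId = new Map<string, TimelineTask>(tasks.map((t) => [t.id, t]));
  const start = new Map<string, number>();

  const startOf = (id: string): number => {
    if (start.has(id)) return start.get(id)!;
    const t = byId.get(id)!;
    // A task starts when its latest-finishing predecessor ends (day 0 if it has none).
    const s = Math.max(0, ...t.predecessors.map((p) => startOf(p) + byId.get(p)!.durationDays));
    start.set(id, s);
    return s;
  };

  tasks.forEach((t) => startOf(t.id));
  return start;
}

const plan: TimelineTask[] = [
  { id: "intake", durationDays: 2, predecessors: [] },
  { id: "review", durationDays: 3, predecessors: ["intake"] },
  { id: "legal", durationDays: 5, predecessors: ["intake"] },
  { id: "approve", durationDays: 1, predecessors: ["review", "legal"] },
];
console.log(earliestStarts(plan)); // "approve" starts on day 7, gated by the 5-day legal task
```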

For the last demo of this session, Jean-Loup Comeliau of W4 on their BPMN+ product, which provides model-driven development using BPMN 2, UML 2, CMIS and other standards to generate web-based process applications without generating code: the engine interprets and executes the models directly. The BPMN modeling is pretty standard compared to other process modeling tools, but they also allow UML modeling of the data objects within the process model; I see this in more complete stack tools such as TIBCO’s, but this is less common from the smaller BPM vendors. Resources can be assigned to user tasks using various rules, and user interface forms are generated based on the activities and data models, and can be modified if required. The entire application is deployed as a web application. The data-centricity is key, since if the models change, the interface and application will automatically update to match. There is definitely a strong message here on the role of standards, and how we need more than just BPMN if we’re going to have fully model-driven application development.

We’re taking a break, and will be back for the Model Interchange Working Group demonstration with participants from around the world.

bpmNEXT 2014: Work Management And Smart Processes

Bruce Silver always makes me break the rules, and tonight I’m breaking the “everything is off the record after the bar opens” rule since he scheduled sessions after dinner and with an open bar in the back of the room. Rules, as they say, are made to be broken.

Roger King of TIBCO attempted to start this demo during the earlier session but there were problems with the fancy projector setup. He’s back now to talk about model-driven work management. TIBCO’s core customer base (like mine) is traditional enterprises such as financial services, and they’re seeing a lot of them retiring legacy enterprise apps now in favor of process-centric apps built on platforms such as TIBCO. They see specific problems with work management in very large, branch-network organizations like retail banks; by work management and resource management, they mean the way that work is distributed to and accessed by end users, one of the things that BPMN doesn’t cover when you define processes. With tens of thousands of participants, just a small increment in productivity through better work management can produce a significant ROI in absolute terms, but traditionally this has been done through custom user interfaces and distribution/matching. There are a number of resource patterns that have been studied and developed, e.g., separation of duties and round robin; Roger demonstrated how these are being incorporated into TIBCO’s AMX BPM (modeled within their Business Studio product) through organizational models, where you can define the resources, groups and custom organizational units in your organization, bringing your business vocabulary to how work is distributed. The idea is that once you have this defined, you can then use very fine-grained rules for determining which person gets which piece of work, or who has access to what. This now becomes something that you can attach to an activity in a process model using simple assignments or with a resource query language that assigns it dynamically, including based on process instance variables – essential when you have 100’s or 1000’s of branches and can’t realistically administer your organizational model and work distribution methods manually. Furthermore, you need to be looking at having people go to the work rather than having work sent to the people; this is the only work distribution approach that makes sense when you’re creating declarative processes, where configuration needs to be much more dynamic than what might be drawn in the process model.
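The kind of fine-grained distribution rule Roger described can be pictured as a query over the organizational model, filtered by instance data and constrained by patterns such as separation of duties. The sketch below is a generic TypeScript illustration of those patterns, not TIBCO’s resource query language or organizational model.

```typescript
// Generic work-distribution sketch: pick eligible users from an organizational model
// using instance data (branch and role), then apply separation of duties and round robin.
interface User { id: string; roles: string[]; branch: string; }
interface WorkItem { id: string; branch: string; requiredRole: string; previousActors: string[]; }

let rrCounter = 0; // naive round-robin pointer

function assign(users: User[], item: WorkItem): User | undefined {
  const eligible = users.filter(
    (u) =>
      u.roles.includes(item.requiredRole) &&
      u.branch === item.branch &&
      !item.previousActors.includes(u.id), // separation of duties: not the same person twice
  );
  if (eligible.length === 0) return undefined;
  return eligible[rrCounter++ % eligible.length]; // round robin across the eligible set
}
```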

We finished off the short opening day of bpmNEXT with a keynote by Jim Sinur, late of Gartner (but not hesitant to use the materials that he helped to create there) and now an independent analyst, on how his processes are smarter than he is. Processes based on machine learning, however, can only go so far: although machines are more accurate and consistent (and never complain when you ask them to work overtime), people are better at unexpected situations. The key is to have computers and people work together within intelligent processes: let the computers work on the parts that they do best, including events, analytics, standardized decisions, pre-defined processes and the resulting actions from combining all of these; exploit emerging technologies such as cognitive systems, what-if scenarios via simulation, intelligent business operations, visualization and social analytics. Intelligent agents are a big part of this, but we need to have goal-directed processes to really make this work, or abandon the concept of processes altogether except for the footprints that they leave behind.

Rule-breaking done. Back tomorrow for a full day of bpmNEXT 2014.