DecisionCAMP 2019: collaborative decision making and temporal reasoning in DMN

Collaborative decisions: coordinating automated and human decision-making. Alan Fish, FICO

Alan Fish presented on the coordination of decisions between automation, individuals and groups. He considered how DMN alone isn't enough to model these interactions, since it can't represent certain characteristics: for example, partitioning decisions over time is best done with a combination of BPMN and DMN, where temporal dependencies can be represented, while combining CMMN and DMN can represent the partitioning of decisions between decision-makers.

Partitioning decisions over time, modeled with BPMN and DMN. From Alan Fish’s presentation.

He also looked at how to represent the partition between decisions and meta-decisions — which is not currently covered in DMN — where meta-decisions may be an analytical human activity that then determines some of the rules around how decisions are made. He defines an organization as a network of decision-making entities passing information to each other, with the minimum requirement for success based on having models of processes, case management, decisions and data. The OMG “Triple Crown” of DMN, BPMN and CMMN figures significantly in his ideas on this level of organizational modeling, and in the success of the organizations that embrace these standards as part of their overall modeling and improvement efforts.

He sees radical process reengineering as a risky operation, and posits that it's better to do process reengineering once, then constantly update decision models to adapt to changing conditions. It was an interesting discussion on organizational models and how decision management fits into larger representations of organizations, with some good follow-on Q&A about whether to model state in decision models or leave that to the process and case models, and about the value of modeling human decisions along with automated ones.

Making the Right Decision at the Right Time: Introducing Temporal Reasoning to DMN. Denis Gagné, Trisotech

Denis Gagné covered the concepts of temporal reasoning in DMN, including a new proposal to the DMN RTF for adding temporal reasoning concepts. Temporal logic is “any system of rules and symbolism for representing, and reasoning about, propositions qualified in terms of time”: that is, representing events in terms of whether they happened sequentially or concurrently, or at what time a particular event occurred.

The proposal will be for an extension to FEEL — which already has some basic temporal constructs with date and time types — that provides a more comprehensive representation based on Allen’s interval algebra and Zaidi’s point-interval logic. This would have built-in functions regarding intervals and points, with two levels of abstraction for expressiveness and business friendliness, allowing for DMN to represent temporal relationships between points, between points and intervals, and between intervals.

Proposed DMN syntax for temporal relationships. From Denis Gagné‘s presentation.

The proposal also includes a more “business person common sense” interpretation for interval overlaps and other constructs: note that 11 of the possible interval-interval relationships fall into this category, which makes this into a simpler before/after/overlap designation. Given all of these representations, plus more robust temporal functions, the standard can then allow expressions such as “interval X starts 3 days before interval Y” or “did this happen in September”.
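To make the flavor of this concrete, here's a minimal Python sketch of a few of Allen's thirteen interval relations; the Interval type and function names are my own illustration, not the proposed FEEL syntax:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Interval:
    start: date  # inclusive
    end: date    # inclusive

# Four of Allen's thirteen interval-interval relations
def before(x: Interval, y: Interval) -> bool:
    return x.end < y.start

def meets(x: Interval, y: Interval) -> bool:
    return x.end == y.start

def overlaps(x: Interval, y: Interval) -> bool:
    return x.start < y.start < x.end < y.end

def during(x: Interval, y: Interval) -> bool:
    return y.start < x.start and x.end < y.end

claim = Interval(date(2019, 9, 3), date(2019, 9, 10))
september = Interval(date(2019, 9, 1), date(2019, 9, 30))
print(during(claim, september))  # True: "did this happen in September"
```

The value of the proposed FEEL extension is that these relationships would be expressible directly in decision models, at both the precise interval-algebra level and the simplified business-friendly level, rather than hand-coded like this.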

This is my first time at DecisionCAMP (formerly RulesFest), and I'm totally loving it. It's full of technology practitioners — vendors, researchers and consultants — who are more interested in discussing ways to improve decision management and the DMN standard than in plugging their own products. I'm not as much of a decision management expert as I am a process management one, so this is a great learning opportunity for me.

CamundaCon 2019 breakout: DMN and BPMN for reusable survey forms at Indiana Farm Bureau

Sowmya Raghunathan and Corinna Cohn presented on a claims intake implementation that uses BPMN and DMN in an interesting way: driving the intake forms used by a claims administrator when gathering first notice of loss (FNOL) information from a claimant. The idea is that the claims admin doesn’t need to have any information about the claim type, and the claimant doesn’t get asked any irrelevant questions, because the form always presents the next best question based on previous responses: a wizard-like model, but driven by BPMN and DMN.

As the application and technical architects at Indiana Farm Bureau Insurance, they were able to give us a good view of how they use the tools for this: BPMN for orchestrating the DMN and UI communication as well as storing the responses, DMN for defining the questions and question/response mapping, and a UI component for implementing the survey forms. They consider this a headless application, but of course, it does surface via the form UI; from a Camunda process standpoint, however, it is a decoupled piece of the architecture that interfaces with the claims system.

Technical architecture of the survey DMN/BPMN system

We saw a demo of one of the claim forms at work, where the previous questions and responses can be seen, and changes to the previous responses may cause changes to subsequent questions based on the DMN decision tables behind the scenes. They use a couple of DMN tables as configuration tables for the UI, defining the questions and response options (e.g., radio buttons versus free-form responses), then a Next Question decision table determines the next question based on the previous response: this table encodes a directed acyclic graph that links questions (nodes) via answers (edges), which allows for easy re-navigation of the graph if an earlier response is changed.

DMN decision table for FNOL questions related to auto claim
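To make the directed acyclic graph idea concrete, here's a minimal Python sketch of a next-question lookup keyed on (question, response) pairs; the question IDs and graph structure are invented for illustration, not taken from their actual decision tables:

```python
# Each entry maps (current question, response) -> next question;
# questions are the nodes of the DAG, answers are the edges.
NEXT_QUESTION = {
    ("claim_type", "auto"): "vehicle_drivable",
    ("claim_type", "property"): "property_damage_type",
    ("vehicle_drivable", "yes"): "injuries_reported",
    ("vehicle_drivable", "no"): "tow_required",
}

def next_question(question: str, response: str) -> str | None:
    """Return the next question to ask, or None when the survey is complete."""
    return NEXT_QUESTION.get((question, response))

# Because the graph is acyclic and keyed on responses, changing an earlier
# answer just means re-walking the graph from that node onward.
q = "claim_type"
for answer in ("auto", "no"):
    q = next_question(q, answer)
    print(q)  # vehicle_drivable, then tow_required
```

In the actual implementation, this mapping lives in a DMN decision table evaluated from the BPMN process rather than in application code, which is what makes it maintainable by business analysts.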

BPMN is used to navigate and determine the next question in a dynamic question subprocess, and to determine whether the survey can be exited; once sufficient information has been collected, the FNOL is initiated in the claims system.

Dynamic Questions BPMN subprocess

The use of DMN means that the questions can be changed very easily since they’re not embedded in the code; this means that they can be created and modified by business analysts rather than requiring developers to code these into the UI directly.

FNOL BPMN process

The entire framework is reusable, and could be quickly reconfigured for any type of survey. That's great, because a few years ago I saw a very similar use case in a clinical setting: a stroke assessment questionnaire. When someone arrives at a hospital emergency department and is suspected of having had a stroke, there are standard questions to ask in order to evaluate the patient's condition. At the time, I thought that this would be a perfect use case for DMN and BPMN, although it was beyond the scope of that project.

A match made in BPMN/DMN heaven: @bpmswatch joining @Trisotech

Trisotech recently announced that Bruce Silver – who writes and teaches the gold standard Method & Style books and courses on BPMN and DMN, and who has forgotten more about BPMN than most people ever learned – is joining Trisotech as a principal consultant. Congrats all around, although Bruce may regret this when he's needed at Trisotech Montreal headquarters in January when it's -30C. 😉

Bruce even has his first post on the Trisotech blog, about practical DMN basics. Essential reading for getting started with DMN.

Disclosure: Trisotech is a consulting client of mine. I’m not being paid for writing this post, I just like these guys because they’re smart and do great work. You can read about my relationship with vendors here.

bpmNEXT 2019 demo: intelligent BPM by @SAP plus DMN TCK working group

ML, Conversational UX, and Intelligence in BPM, with Andre Hofeditz and Seshadri Sreeniva of SAP plus DMN TCK update

We’re at the end of bpmNEXT for another year, and we have one last demo. Seshadri showed a demo of their intelligent BPM for an employee onboarding process (integrated with SuccessFactors), where the process can vary widely depending on level, location and other characteristics. This exposes the pre-defined business processes in SuccessFactors, with configuration tools for customizing the process by adding and modifying building blocks to create a process variant for a special case. Decisions involved in the processes can also be configured, as well as dashboards for viewing the processes in flight. Extension workflows can be created by selecting a standard process “recipe” from a SuccessFactors library, then configuring it for the specific use; he showed an example here for adding an equipment provisioning extension that can be added as a service task to one of the top-level process models. He demonstrated a voice-controlled chatbot interface for interacting with processes, allowing a manager to ask what’s happening for them today, and get back information on the new employee onboardings in progress, and expected delays and a link to his task inbox. Tasks can be displayed in the chat interface, and approvals accepted via voice or typed chat. The chatbot is using AI for determining the intent of the input and providing a precise and accurate response, and using ML to provide predictions on the time required to complete processes that are in flight if asked about completion times and possible delays. The chatbot can also make decision table-based recommendations such as creating an IT ticket to assign roles to the new employee and find a desk location. He showed the interface for designing and training the bot capabilities, where a designer can create a new conversational AI skill based on conditions, triggers and actions to take. This is currently a lab preview, but will be rolled out as part of their cloud platform workflow (not unique to the SuccessFactors environment) in the coming months.

Decision Model and Notation Technology Compatibility Kit update with Keith Swenson

We finished off bpmNEXT 2019 with an update on the DMN TCK, that is, the set of tools provided for free for vendors to test their implementations of DMN. The TCK provides DMN 1.2 models plus sets of input data and expected results; a runner app calls the vendor engine, compares the results and exports them as a CSV file to show compliance. In the three years since this was kicked off, eight vendors are showing results against over 1000 test cases, with another vendor about to join the list and add another 600 test cases. The test cases are determined through manual examination of the standard specification, so this robust set of compliance tests represents a significant amount of work to create. The TCK group is not creating the standard, but testing it; however, Keith identified some opportunities for the TCK to be more proactive in defining some things, such as error handling behavior, that the revision task force (RTF) at OMG is unlikely to address in the near term. He also pointed out that there are many more vendors claiming DMN compatibility than have demonstrated that compatibility with the TCK.
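The runner pattern is simple to picture: feed each test's inputs to the vendor's engine, compare against the expected results, and write a row per test to the CSV. A minimal sketch, with the engine call standing in for whatever API a given vendor exposes (the real TCK runner's file formats differ):

```python
import csv

def run_tck(tests, evaluate):
    """tests: iterable of (test_id, model, inputs, expected) tuples;
    evaluate: vendor-specific callable (model, inputs) -> actual result."""
    with open("tck_results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["test", "outcome"])
        for test_id, model, inputs, expected in tests:
            try:
                outcome = "SUCCESS" if evaluate(model, inputs) == expected else "FAILURE"
            except Exception:
                outcome = "ERROR"  # engine crashed or rejected the model
            writer.writerow([test_id, outcome])
```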

That’s it for bpmNEXT 2019 – always feels like it’s over too soon, yet I leave with my brain stuffed full of so many good ideas. We’ve done the wrapup survey and heading off to lunch, but the results on Best in Show won’t come out until I’m already on my way to the airport.

bpmNEXT 2019 demos: automation services with @Trisotech and @bpmswatch

The day started with my keynote on rolling your own digital automation platform using BPM and microservices, which set the stage for the two demos and the round table discussion that followed.

Business Automation as a Service, with Denis Gagne of Trisotech

Denis demoed a new product release from Trisotech, their business automation as a service platform: competing with services such as Zapier and IFTTT, but with better process and decision management, and more complex service types available for integration. He showed creating a service built on a Twitter trigger, using BPMN to model the orchestration and FEEL as the scripting language in script activities, and incorporating a machine learning sentiment score and a decision service for categorizing the results, with the result displayed in the color of a flashing smart light bulb. Every service created exposes an Open API and REST API by default, and is deployed as a self-contained microservice. He showed a more complex example of marketing automation that extracts data from an input form, uses a geo-locator to find the customer location, uses a DMN decision model to assign the lead to a sales team based on geography and other form parameters, then creates a lead in Microsoft Dynamics CRM. He finished up with an RPA task example that included the funniest execution of an “I am not a robot” CAPTCHA ever. The key point here is that Trisotech has moved from being a pure modeling vendor into the execution space, integrated with any Open API service, and deployable across a number of different cloud platforms using standard protocols. Looking forward to playing around with this.

Business-Composable Services for the Mortgage Industry, with Bruce Silver of Method and Style

Bruce showed the business automation services that he's created using Trisotech's platform for the mortgage industry. Although he started by looking at decision services around how to determine whether someone should be approved for a mortgage (and for how large a mortgage), process was also required for things like mapping and validation of data. Everything is driven by a standard application form and a standard set of underwriting rules used in the US mortgage industry, although this could be modified to suit other markets with different rules. The DMN rules are written in business-readable language, allowing them to be changed by non-developers. The BPMN process does the data validation and mapping before invoking the underwriting decision service. The entire process can be published as a service to be called from any environment, such as a web app used by underwriters inside a financial company, or an online prequalification review done directly by the consumer. The plan is to make these models and services available to see what the adoption is like, to help highlight the value and drive the usage of BPMN and DMN in practice.
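Once published, such a service is just an HTTP endpoint that any environment can call. A consumer might look something like this sketch, where the endpoint URL and field names are placeholders rather than the actual service contract:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical prequalification request; the real service's data model differs.
payload = {
    "applicant": {"annualIncome": 85000, "creditScore": 710},
    "loan": {"amount": 320000, "termMonths": 360},
}
req = Request(
    "https://example.com/mortgage/underwriting",  # placeholder endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    decision = json.load(resp)
print(decision)  # e.g., {"approved": true, "reasons": [...]}
```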

Industry Round Table: The Coming Impact of Decision Services and Machine Learning on Business Automation

We finished the morning of day 2 with a discussion that included three of the earlier demo presenters: Denis Gagne, Bruce Silver and Scott Menter. They each gave a short talk on how decision services and machine learning are changing the automation landscape. Some ideas discussed:

  • It’s still up in the air whether DMN will “cross the chasm” and become generally used (to the same degree as, for example, BPMN); this means that vendors need to fully support it, potentially as an execution as well as requirements language.
  • Having machine learning algorithms expressed as DMN can improve transparency of decisions, which is essential under some regulations (e.g., GDPR). There is a need for “explainable AI”.
  • The population using DMN is smaller than the one using BPMN, and the required skill level is higher, although still well within the capabilities of data-focused business people who are comfortable with formulas and expression languages.
  • There’s a distinction between symbolic (rules-based) and sub symbolic (neural network) AI algorithms in terms of what they can do and how they perform; however, sub symbolic AI is less of a black box in terms of decision transparency.
  • If we here at bpmNEXT aren’t thinking about the ethics of automation, who will? Consider the labor disruption of automation, or decisions that make a choice involving the value of life (the AI “trolley problem”), or old norms used as training data to create biased machine learning.
  • We’re still in a culture of having people at a certain skill level (e.g., surgeons, pilots) make their own decisions, although they might be advised by AI. How soon before we accept automated decisions at that level?
  • Individually-targeted decisions are happening now by what is presented to specific people through platforms like Google Search and Amazon. How is our behavior being controlled by the limited set of options presented to us?
  • The closer a technology gets to the end effect, the more responsibility the creator of that technology needs to take for how it is used.
  • Machine learning may be the best way to discover transparent decision logic from human actions (unfortunately, that will also capture human biases), allowing people to understand how and why specific decisions are made.
  • When AI is a black box, it needs to be understood as being a black box, so that adequate constructs can be created around it for testing and usage.

Great discussion and audience participation, and a good follow-on from the two demos that showed decision services in action.

bpmNEXT 2018: Last session with a Red Hat demo, Serco presentation and DMN TCK review

We’re on the final session of bpmNEXT 2018 — it’s been an amazing three days with great demos and wonderful conversations.

Exploiting Cloud Infrastructure for Efficient Business Process Execution, Red Hat

Kris Verlaenen, project lead for jBPM as part of Red Hat, presented on cloud BPM infrastructure, specifically for execution and monitoring. Cloud makes BPM lightweight, scalable, embeddable and able to take advantage of the larger cloud app ecosystem. They are introducing some new cloud infrastructure, including a controller for managing server deployments, a smart router for delegating and aggregating requests from applications to servers, and monitoring that aggregates process statistics across servers and containers. The demo showed using Red Hat's OpenShift container application platform (actually MiniShift running on his laptop) to create a new environment and deploy an IT hardware ordering BPM application. He walked through using the application to create a new order and see the milestone-based monitoring of the order, then the hardware provider's view of their steps in the process to provide information and advance the process to the next stage. The process engine and monitoring engine can be deployed in different containers on different hardware, in any combination of cloud providers and on-premise infrastructure. Applications and servers can be bundled into a single immutable image for easy provisioning — more of a microservices style — or can be deployed independently. Multiple versions of the same application can be deployed, allowing current instances to play out in the original version while new instances use the most recent version, or other strategies that would allow new instances of any version to be created, while monitoring can aggregate instance data from all versions in all containers.

Kris is also live-blogging the conference; check out his posts. He has gone back and included the video of each presentation as they are released (something that I didn't do, for page load performance reasons), as well as providing his commentary on each presentation.

Dynamic Work Assignment, Serco

Lloyd Dugan of Serco has the unenviable position of being the last presenter of the conference, although he gave a presentation on a dynamic work assignment implementation rather than an actual demo (with a quick view of the simple process model in the Trisotech animator near the end, plus an animation of the work assignment in action). His company is a call center business process outsourcer, where knowledge workers use a case management application implemented in BPMN, driven by events such as inbound calls and documents, as well as timers. Real-time work prioritization and assignment is necessary because of SLAs around inbound calls, and the task management model is moving from work being selected (and potentially cherry-picked) by workers to work being pushed to them. Tasks are scored and assigned using decision models that include task type and SLAs, plus worker eligibility based on each individual's skills and training. Although work assignment products exist, this one was built specifically for the complex rules of US Affordable Care Act administration, which requires a combination of decision tables, database table-driven rules, and lower-level coding to provide the right combination of flexibility and performance.
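A heavily simplified sketch of the scoring-and-assignment idea, with invented weights and fields rather than Serco's actual decision models:

```python
from datetime import datetime, timezone

def score(task: dict) -> float:
    """Higher score = more urgent: task type weight scaled by SLA pressure."""
    minutes_left = (task["sla_deadline"] - datetime.now(timezone.utc)).total_seconds() / 60
    type_weight = {"inbound_call": 3.0, "document": 1.0, "timer": 0.5}[task["type"]]
    return type_weight * max(0.0, 100.0 - minutes_left)

def assign(tasks: list[dict], workers: list[dict]):
    """Push the highest-scoring tasks to eligible (skilled, free) workers."""
    for task in sorted(tasks, key=score, reverse=True):
        eligible = [w for w in workers if task["skill"] in w["skills"] and w["free"]]
        if eligible:
            eligible[0]["free"] = False
            yield task["id"], eligible[0]["id"]
```

The real implementation blends decision tables with database-driven rules and custom code precisely because a pure in-memory scoring pass like this wouldn't meet the flexibility and performance requirements.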

DMN TCK (Technical Compatibility Kit) Working Group

Keith Swenson of Fujitsu (but presenting here in his role in the DMN standards effort) started on the idea of a set of standardized DMN technical compatibility tests based on conversations at bpmNEXT in 2016, and he presented today on where they're at with the TCK. Basically, the TCK provides a way for DMN vendors to demonstrate their compliance with the standard by providing a set of DMN models, input data, and expected results, testing decision tables, boxed expressions and FEEL. Vendors who can demonstrate that they pass all of the TCK tests are listed on a GitHub site along with information about individual test results, providing a way for DMN customers to assess the compliance level of vendors. Keith wrote an update on this last September that provides a good summary up to that point, and in today's presentation he walked through some of the additional things that they've done, including identifying sections of the DMN specification that require clarifications or additions due to ambiguity that can lead to different implementations. DMN 1.2 is coming out this year, which will require a new set of tests specifically for that version while maintaining the previous version's tests; they are also trying to improve testing of error cases and introduce more real-world decision models. If you create and use DMN models, or make a DMN-compliant decision management product, or you're otherwise interested in the DMN TCK, you can find out here how to get involved in the working group.

That’s it for bpmNEXT 2018. There will be voting for the best in show and some wrapup after lunch, but we’re pretty much done for this year. Another amazing year that makes me proud to be a part of this community.

bpmNEXT 2018: All DMN all the time, with Trisotech, Bruce Silver Associates and Red Hat

First session of the afternoon on the first day of bpmNEXT 2018, and this entire section is on DMN (Decision Model and Notation) and the requirement for decision automation based on DMN.

Decision as a Service (DaaS): The DMN Platform Revolution, Trisotech

Denis Gagne of Trisotech, who knows as much about DMN and other related standards as anyone around, started off the session with his ideas on the need for decision automation driven by requirements such as GDPR. He walked through their suite of decision-related products that can be used to create decision services to be consumed by other applications, as well as their conformance to the DMN standard. His demo showed a decision model for determining the best price to offer a rental vehicle customer, and walked through the capabilities of their platform with this model: DMN style check, import/export, execution, team collaboration, and governance through versioning. He also showed how decision models can be reused, so that elements from one model can be used in another model. Then, he showed how to take portions of the model and define them as a service using a visual wrapper, much like a subprocess wrapper visualization in BPMN, where the relationship lines that cross the service boundary become the inputs and outputs to the service. Cool. The service can then be deployed as an executable service using (in his demo) the Red Hat platform; he showed testing its execution from a generated HTML form, generating the REST API or Open API interface code, running predefined test cases based on the DMN TCK, promoting the service from test to production, and publishing it to an API publisher platform such as WSO2 for public consumption. The execution environment includes debugging and audit logs, providing traceability on the decision services.

Timing the Stock Market with DMN, Bruce Silver Associates

Bruce Silver, also a huge contributor to the BPMN and DMN standards, and author of the BPMN Method & Style books and now the DMN Method & Style, presented an application for buying a stock at the right time based on price patterns. For investors who time the market based on pricing, the best way to do this is to look at daily min/max trends and fit them to one of several base pattern models. Bruce figured that this could be done with a decision table applied to a manipulated version of the data, and automated this for a range of stocks using a one-year history, processing in Excel, and decision services in the Trisotech cloud. This is a practical example of using decision services in a low-code environment by non-programmers to do something useful. His demo showed us the decision model for doing this, then the data processing (smoothing) done in Excel. However, for an application that you want to run every day, you're probably not going to want to do the manual import/export of data, so he showed how to automate and orchestrate this with Microsoft Flow, which can still use the Excel sheet for data manipulation but automates the data import, executes the decision service, and publishes the results back to the same Excel file. A good demonstration of the democratization of creating decisioning applications through easy-to-use tools such as the graphical DMN modeler, Excel and Flow, highlighting that DMN is an execution language as well as a requirements language. Bruce has also just published a new book, the DMN Cookbook, co-authored with Edson Tirelli of Red Hat, on getting started with DMN business implementations using lightweight stateless decision services called via REST APIs.
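A rough Python analogue of the approach: smooth the daily series, then classify the most recent trend with simple decision table-style rules. The window size and patterns here are invented for illustration, not Bruce's actual model:

```python
def moving_average(prices: list[float], window: int = 3) -> list[float]:
    """Simple smoothing of a daily price series."""
    return [sum(prices[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(prices))]

def classify(smoothed: list[float]) -> str:
    """Decision table-style pattern check on the smoothed tail."""
    a, b, c = smoothed[-3], smoothed[-2], smoothed[-1]
    if a > b < c:
        return "trough: possible buy signal"
    if a < b > c:
        return "peak: possible sell signal"
    return "no pattern: hold"

daily_close = [101, 100, 98, 97, 96, 97, 99, 102, 104]
print(classify(moving_average(daily_close)))  # no pattern: hold
```

In the actual demo this split the other way: Excel did the smoothing, a DMN decision table in the Trisotech cloud did the classification, and Microsoft Flow orchestrated the round trip.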

Smarter Contracts with DMN, Red Hat

Edson Tirelli of Red Hat, Bruce Silver's co-author on the above-mentioned DMN Cookbook, finished this section of DMN presentations with a combination of blockchain and DMN, where DMN is used to define the business language for calculations within a smart contract. His demo showed a smart land registry case, specifically a transaction for selling a property involving a seller, a buyer, and a settlement service created in DMN that calculates taxes and insurance, with the purchase being executed using cryptocurrency. He mentioned Vanessa Bridge's demo from earlier today, which showed using BPMN to define smart contract flows; this adds another dimension to the same problem, and there's likely no reason why you wouldn't use them all together in the right situation. Edson said that he was inspired, in part, by this post on smart contracts by Paul Lachance, in which Lachance said “a visual model such as a BPMN and/or DMN diagram could be used to generate the contract source code via a process-engine”. He used Ethereum for the blockchain smart contract and the Ether cryptocurrency, Trisotech for the DMN models, and Drools for the rules execution. All in all, not such a far-fetched idea.
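Separated from the blockchain plumbing, the DMN-defined part of the example is essentially the settlement arithmetic, which might be sketched like this (rates invented for illustration):

```python
def settlement(sale_price: float, tax_rate: float = 0.015,
               insurance_rate: float = 0.004) -> dict:
    """Toy version of the settlement logic a DMN model might define
    for a land-sale smart contract: taxes plus insurance on the price."""
    taxes = sale_price * tax_rate
    insurance = sale_price * insurance_rate
    return {"taxes": taxes, "insurance": insurance,
            "total_due": sale_price + taxes + insurance}

print(settlement(250_000))  # {'taxes': 3750.0, 'insurance': 1000.0, 'total_due': 254750.0}
```

The point of expressing this in DMN rather than directly in contract code is that the business logic stays readable and changeable by the people who own the rules.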

I’m still catching flak for suggesting the now-ubiquitous Ignite style for presentations here at bpmNEXT; my next lobbying effort will be around restricting the maximum number of words per slide. 🙂

Financial decisions in DMN with @JanPurchase

Trisotech and their partner Lux Magi held a webinar today on the role of decision modeling and management in financial services firms. Jan Purchase of Lux Magi, co-author (with James Taylor) of Real-World Decision Modeling with DMN, gave us a look at why decision management is important for financial services. One of the key places for applying decision management is in compliance, which is all about decision-making: assessing risks, applying regulations, sharing data, and ensuring that rules are applied in a uniform manner. There are a lot of other areas where decision management can be applied, and potentially automated, wherever there is a high volume/speed of transactions with a non-zero cost of errors. Decision management lets you make decisions explicit: it separates them from other business software to increase transparency and agility, and makes it easier for business people to understand what decisions are being applied and how they link to overall business goals. In particular, if decisions are automated with a decision management system, business people can quickly make changes to decision-making when compliance regulations change, with much smaller IT involvement than would be required to modify legacy business systems.

There is a great deal of value in modeling decisions even if they are embedded within business systems and won't be automated using a decision management system: decision models provide a way for business people to specify how systems should behave based on business data. Luckily, there is now a standard for decision modeling: Decision Model and Notation (DMN). This notation allows a decision to be modeled as a Decision Requirements Diagram (DRD) of the sub-decisions and knowledge sources required to reach that decision, and the possible paths for reaching it. Within each of the decision nodes in the DRD, a definition of the decision can be specified using a decision table or the Friendly Enough Expression Language (FEEL), which may then be linked to an automated decision management system.
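To ground the terminology: a decision table maps input conditions to an output according to a hit policy. Here is a toy Python evaluator mimicking DMN's “first” hit policy, with invented rules (FEEL and a real DMN engine do far more than this):

```python
# Toy "first hit" decision table: each rule is (condition, output);
# the first rule whose condition matches wins.
RISK_TABLE = [
    (lambda amount, score: amount > 1_000_000, "refer to compliance"),
    (lambda amount, score: score < 600, "decline"),
    (lambda amount, score: True, "approve"),  # default rule
]

def decide(amount: float, score: int) -> str:
    for condition, output in RISK_TABLE:
        if condition(amount, score):
            return output

print(decide(50_000, 720))  # approve
```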

We then saw what a decision model looks like in Trisotech’s DMN Modeler, which allows for a standard DRD to be created, then augmented with additional information such as decision makers and owners. Purchase walked us through a number of the features of DMN as well as specific features of Trisotech’s tool, including analysis of decisions relative to Bruce Silver’s Method and Style best practices, and decision animation.

Lux Magi/Trisotech DMN 2017-10

If you know a bit about DMN already but want to understand some of the practical aspects of working with it in financial services, I assume that a replay of the webinar will be available at the original registration link or the Lux Magi event page.

bpmNEXT 2016 demo: Capital BPM and Fujitsu

Our final demo session of bpmNEXT — can’t believe it’s all over.

How I Learned to Tell the Truth with BPM – Gene Rawls, Capital BPM

Their Veracity tool overlays architecture and process layers using visual models, integrated with a few different BPMSs (primarily IBM); you create models in the tool for the process and underlying technical architecture (SOA, rules and data) layers, then create linkages between them to indicate interactions. There's direct integration of IBM ODM into the rules layer.

Business Process and Self-Managed Teams – Keith Swenson, Fujitsu and WfMC

Finishing bpmNEXT with a presentation on self-managed teams in the context of BPM, not a demo. He contrasted organizational styles of “early structured” (aka “structured”) versus “late structured” (aka unstructured), with respective characteristics of centralized versus decentralized, and machine-style versus garden-style. He covered the concepts of sociocracy (on which holacracy is based): a formal method for running self-managed teams that are structured around social relationships, aka dynamic governance. It's extremely agile, and allows ideas to boil up from the bottom. It replaces voting with consensus, where there is open discussion of options and everyone must consent that a proposal is acceptable; objections must come with a better proposal. Defining principles: consent governs policy decision making; organizing in circles; double-linking; and elections by consent. Self-managed organizations are inherently agile, since good decisions are made where needed and everyone agrees. There may be implications for DMN as to how decisions are modeled and captured. BPMN and CMMN can cover some of the domains of predictability; we saw other demos this week using other model types that extend further into unpredictable work, such as a process timeline view. There are open issues of whether BPMN should be extended to handle less predictable work, or whether CMMN can handle this. Keith ended with the observation that this was the year of DMN at bpmNEXT, and issued a call to action for an open-source implementation of DMN execution with a conformance suite; this is likely more feasible than for BPMN, since DMN is more constrained. A lot of great discussion ensued, and Keith will be spearheading a WfMC committee to look at this.

bpmNEXT 2016 demos: IBM, Orquestra, Trisotech and BPM.com

On the home stretch of the Wednesday agenda, with the final session of four demos for the day.

BPM in the Cloud: Changing the Playing Field – Eric Herness, IBM

IBM Bluemix process-related cloud services, including cognitive services leveraging Watson. The claims process demo starts by uploading an image of a vehicle and passing it to Watson image recognition for visual classification; the returned values show confidence in the vehicle classification, such as “car”, and any results over 90% are sent to the Alchemy taxonomy service to align them — in the demo, Watson returned “cars” and “sedan” with more than 90% confidence, and the taxonomy service determined that sedan is a subset of cars. This allows routing of the claim to the correct process for the type of vehicle. If Watson has not been trained for the specific type of vehicle, the image classification won't be determined with a sufficient level of confidence, and it will be passed to a work queue for manual classification. Unrecognized images can be used to improve the classifier, either as examples of an existing classification or as a new classification. Predictive models based on Spark machine learning and analytics of past cases create predictions of whether a claim should be approved, and the degree of confidence in that decision; at some point, as this confidence increases, some of the claims could be approved automatically. Good examples of how to incorporate cognitive computing to make business processes smarter, using cognitive services that could be called from any BPM system, or any other app that can call REST services.
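The routing logic described is a simple confidence threshold; sketched in Python, with the 90% threshold from the demo and everything else invented:

```python
def route_claim(classifications: dict[str, float], threshold: float = 0.9):
    """classifications: label -> confidence score from the image classifier."""
    confident = {k: v for k, v in classifications.items() if v >= threshold}
    if confident:
        best = max(confident, key=confident.get)
        return ("auto", best)    # route to the claims process for this vehicle type
    return ("manual", None)      # work queue for human classification / retraining

print(route_claim({"cars": 0.97, "sedan": 0.94, "truck": 0.12}))  # ('auto', 'cars')
```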

Model, Generate, Compile in the Cloud and Deploy Ready-To-Use Mobile Process Apps – Rafael Bortolini and Leonardo Luzzatto, CRYO/Orquestra

Demo of the Orquestra BPMS implementation for Rio de Janeiro's municipal processes, e.g., business license requests. From a standard worklist style of process management, they generate a process app for a mobile platform: specify the app name and logo, select app functionality based on templates, then preview it and compile for iOS or Android. The .ipa or .apk files are generated ready for uploading to the Apple or Google app stores, although that upload can't be automated. There's full functionality to allow a mobile user to sign up or log in, then access the functionality defined for the app to request a business license. Although an app is generated, the data entry forms are responsive HTML5, identical to the desktop version. A very quick implementation of a mobile app from an existing process application without having to learn the Orquestra APIs or even do any real mobile development; it can also produce the source code in case this is just wanted as a quick starting point for a mobile development project.

Dynamic Validation of Integrated BPMN, CMMN and DMN – Denis Gagné, Trisotech

Their Kommunicator tool is based on their model animation technology, allowing tracing directly from a case step in the BPMN model to the CMMN model, or from a decision step to the DMN model. It also links to the semantic layer, such as a Sparx SOA architecture model or other enterprise architecture reference models. This allows manually stepping through an entire business model in order to learn and communicate the procedures, and to validate the dynamic behavior of the model against the business case. Stepping through a CMMN model requires selecting the ad hoc tasks as a case worker would, in order to step through the tasks and see the results; many different flow patterns can emerge depending on the tasks selected and the order of selection, and stages will appear as eligible to close only when the required tasks have been completed. Stepping through a DMN model allows selecting the input parameters in a decision table and running the decision to see the behavior. Their underlying semantic graph shows the interconnectivity of all of the models, as well as goals and other business information.

Simplified CMMN – Lloyd Dugan, BPM.com

Last up is not a demo (by design), but a proposal for a simplified version of CMMN, starting with a discussion of BPMN's limitations in case management modeling: primarily that BPMN treats activities but not events as first-class citizens, making it difficult to model event-driven cases. This creates challenges for event subprocesses, event-driven process flow and ad hoc subprocesses, which rely on “exotic” and rarely used BPMN structures and events that many BPMN vendors don't even support. Moving a business case – such as an insurance claim – to a CMMN model makes it much clearer and easier to model; the more unstructured the situation, the harder it is to capture in BPMN, and the easier it is to capture in CMMN. The proposal for simplifying CMMN for use by business analysts includes removing PlanFragment and removing all notational attributes (AutoComplete, Manual Activation, Required, Repetition) that are really execution-oriented logic. This leaves the core set of elements plus the related decorators. I'm not enough of a CMMN expert to know if this makes complete sense, but it seems similar in nature to the subsets of BPMN commonly used by business analysts rather than the full palette.