bpmNEXT 2018: Complex Modeling with MID GmbH, Signavio and IYCON

The final session of the first day of bpmNEXT 2018 was focused on advanced modeling techniques.

Designing the Data-Driven Company, MID GmbH

Elmar Nathe of MID GmbH presented on their enterprise decision maps, which provide an aggregated visualization of strategic, tactical and operational decisions along with business events. They provide a variety of modeling tools, but see decisions as key to understanding how organizations are driven by data and events. Clearly a rich decision modeling environment, including support for PMML to incorporate predictive models and other data scientist analysis tools, plus links to other model types such as ERDs that can show what data contributes to which decision model, and business process models. Much more of an enterprise architecture approach to model-driven design that can incorporate the work of data scientists.

Using Customer Journeys to Connect Theory with Reality, Signavio

Till Reiter and Enrico Teterra of Signavio started with a great example of an Ignite presentation, with few words, lots of graphics and a bit of humor, discussing their new notation for modeling an outside-in view of the customer journey rather than just having an undifferentiated “customer” swimlane in a BPMN diagram. The demo walked through their customer journey mapping tool, and how their collaboration hub overlays on that to allow information about each component of the journey map to be discussed amongst process modeling users. The journey map contains a lot of information about KPIs and other process metrics in a form most consumable by process owners and modelers, but also has a notebook/dashboard view for analysts to determine problems with the process and identify potential resolution actions. This includes a variety of analysis tools including process discovery, where process mining techniques are applied to determine which paths in the process model may be contributing to specific problems such as cycle time, then overlays the results on the process model to assist with root cause analysis. Although their product does a good job of combining customer journey maps, process models and process analysis, this was more of a walkthrough of a set of pre-calculated dashboard screens rather than an actual demo — a far cry from the experimental features that Gero Decker showed off in their demo at the first bpmNEXT.

Discovering the Organizational DNA, IYCON and Knowledge Consultants

The final presentation of this section was from Jude Chagas Pereira of IYCON and Frank Kowalkowski of Knowledge Consultants, presenting IYCON’s Afterspyre modeling tool for creating a catalog of complex business objects, their attributes and their linkages to create organizational DNA diagrams. Ranking these with machine learning algorithms for semantic and sentiment analysis allows identification of process improvement opportunities. They have a number of standard business analysis techniques built in, and robust analytics focused on problem solving. The demo walked through their catalog, drilling down into the “Strategy DNA” section and its “Technology Solutions” subsection to show an enumeration of the platforms currently in place together with attributes such as technology risk and obsolescence, which can be used to rank technology upgrade plans. Relationships between business objects can be auto-detected based on existing data. Levels including Objectives, Key Processes, Technology Solutions, Database Technology and Datacenter, and their interrelationships, are mapped into a DNA diagram and an alluvial diagram, starting at any point in the catalog and drilling down a specific number of levels as selected by the modeling analyst. These diagrams can then be refined further based on factors such as scaling the individual markers based on actual performance.

They showed sentiment analysis for a hotel’s ranking on a review site, which included extracting specific phrases that related to certain sentiments. They also demonstrated a two-model comparison, which compared the models for two different companies to determine the overlapping and unique processes: a good indicator of the level of difficulty of a merger/acquisition (or even divestiture). They finished up with affinity modeling, of the type used by Amazon when it tells you which books other people bought who also bought the book that you’re looking at: easy to do in matrix form with a small data set, but computationally intensive once you get into non-trivial amounts of data. Affinity modeling is most commonly used in marketing to analyze buying habits and offer people something that they are likely to buy, even if it’s not what they planned to buy at first; this sort of “would you like fries with that” technique can increase purchase value by 30-40%. Related to that is correlation modeling, which can be used as a first step in determining causation. Impressive semantic data-driven analytics tool for modeling a lot of different organizational characteristics.
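
For anyone who hasn’t worked with affinity modeling, here’s a minimal sketch (mine, not IYCON’s implementation) of the co-occurrence idea behind it: count how often pairs of items show up in the same transaction, then rank what else was bought alongside any given item. The item names and transactions are made up, and the pairwise counting also hints at why this gets computationally intensive on non-trivial data volumes.

```python
from collections import defaultdict
from itertools import combinations

# Each transaction is the set of items bought together in one order (made-up data).
transactions = [
    {"book_a", "book_b"},
    {"book_a", "book_b", "book_c"},
    {"book_b", "book_c"},
    {"book_a", "book_c"},
]

# Build a symmetric co-occurrence count: how often each pair appears together.
cooccurrence = defaultdict(int)
for basket in transactions:
    for item_1, item_2 in combinations(sorted(basket), 2):
        cooccurrence[(item_1, item_2)] += 1

def also_bought(item, top_n=3):
    """Rank other items by how often they were bought with `item`."""
    scores = defaultdict(int)
    for (a, b), count in cooccurrence.items():
        if item == a:
            scores[b] += count
        elif item == b:
            scores[a] += count
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

print(also_bought("book_a"))  # e.g. [('book_b', 2), ('book_c', 2)]
```

At this toy scale the whole matrix fits in a dictionary; with real purchase histories the same idea typically needs sparse matrices or association-rule mining to stay tractable, which is the scaling issue mentioned above.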

That’s it for day one; if everyone else is as overloaded with information as I am, we’re all ready for tonight’s wine tasting! Check the Twitter stream for opinions and photos from other attendees.

bpmNEXT 2018: All DMN all the time, with Trisotech, Bruce Silver Associates and Red Hat

First session of the afternoon on the first day of bpmNEXT 2018, and this entire section is on DMN (Decision Model and Notation) and the requirement for decision automation based on DMN.

Decision as a Service (DaaS): The DMN Platform Revolution, Trisotech

Denis Gagne of Trisotech, who knows as much about DMN and other related standards as anyone around, started off the session with his ideas on the need for decision automation driven by requirements such as GDPR. He walked through their suite of decision-related products that can be used to create decision services to be consumed by other applications, as well as their conformance to the DMN standards. His demo showed a decision model for determining the best price to offer a rental vehicle customer, and walked through the capabilities of their platform with this model: DMN style check, import/export, execution, team collaboration, and governance through versioning. He also showed how decision models can be reused, so that elements from one model can be used in another model. Then, he showed how to take portions of the model and define them as a service using a visual wrapper, much like a subprocess wrapper visualization in BPMN, where the relationship lines that cross the service boundary become the inputs and outputs to the service. Cool. The service can then be deployed as an executable service using (in his demo) the Red Hat platform; from there, he tested its execution using a generated HTML form, generated the REST API or Open API interface code, ran predefined test cases based on the DMN TCK, promoted the service from test to production, and published it to an API publisher platform such as WSO2 for public consumption. The execution environment includes debugging and audit logs, providing traceability on the decision services.
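
To make the “decision as a service” idea concrete, here’s a rough sketch of what consuming a deployed decision service could look like from application code. The endpoint URL, input field names and response shape are all hypothetical; in practice they would come from the generated REST/Open API interface that Denis showed, so treat this as illustrative only.

```python
import json
import urllib.request

# Hypothetical endpoint and payload shape -- the real contract comes from the
# generated REST/Open API interface of the deployed decision service.
DECISION_SERVICE_URL = "https://decisions.example.com/rental-pricing/v1/evaluate"

def best_rental_price(customer_tier, vehicle_class, rental_days):
    payload = {
        "customerTier": customer_tier,
        "vehicleClass": vehicle_class,
        "rentalDays": rental_days,
    }
    request = urllib.request.Request(
        DECISION_SERVICE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # e.g. {"offeredPrice": 42.50}

# Example call (would require a real deployed service):
# print(best_rental_price("gold", "compact", 3))
```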

Timing the Stock Market with DMN, Bruce Silver Associates

Bruce Silver, also a huge contributor to BPMN and DMN standards, and author of the BPMN Method & Style books and now the DMN M&S, presented an application for buying a stock at the right time based on price patterns. For investors who time the market based on pricing, the best way to do this is to look at daily min/max trends and fit them to one of several base type models. Bruce figured that this could be done with a decision table applied to a manipulated version of the data, and automated this for a range of stocks using a one-year history, processing in Excel, and decision services in the Trisotech cloud. This is a practical example of using decision services in a low-code environment by non-programmers to do something useful. His demo showed us the decision model for doing this, then the data processing (smoothing) done in Excel. However, for an application that you want to run every day, you’re probably not going to want to do the manual import/export of data, so he showed how to automate/orchestrate this with Microsoft Flow, which can still use the Excel sheet for data manipulation but automate the data import, execute the decision service, and publish the results back to the same Excel file. Good demonstration of the democratization of creating decisioning applications through easy-to-use tools such as the graphical DMN modeler, Excel and Flow, highlighting that DMN is an execution language as well as a requirements language. Bruce has also just published a new book, DMN Cookbook, co-authored with Edson Tirelli of Red Hat, on getting started with DMN business implementations using lightweight stateless decision services called via REST APIs.
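
As a toy illustration of the pattern Bruce described, smoothing the raw price data and then applying decision-table-style logic to the result, here’s a sketch in Python rather than Excel plus DMN. The moving-average window, thresholds and pattern labels are invented for illustration and aren’t taken from his model.

```python
# Made-up daily closing prices for illustration.
prices = [101.2, 100.8, 101.9, 103.0, 104.1, 103.8, 105.2, 106.0, 105.5, 107.1]

def moving_average(series, window=3):
    """Simple smoothing step, standing in for the Excel data manipulation."""
    return [
        sum(series[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(series))
    ]

def classify_trend(smoothed):
    """Decision-table-style rules: first matching row wins."""
    change = (smoothed[-1] - smoothed[0]) / smoothed[0]
    if change > 0.03:
        return "uptrend: candidate buy"
    if change < -0.03:
        return "downtrend: avoid"
    return "sideways: wait"

print(classify_trend(moving_average(prices)))  # "uptrend: candidate buy"
```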

Smarter Contracts with DMN, Red Hat

Edson Tirelli of Red Hat, Bruce Silver’s co-author on the above-mentioned DMN Cookbook, finished this section of DMN presentations with a combination of blockchain and DMN, where DMN is used to define the business language for calculations within a smart contract. His demo showed a smart land registry case, specifically a transaction for selling a property involving a seller, a buyer and a settlement service created in DMN that calculates taxes and insurance, with the purchase being executed using cryptocurrency. He mentioned Vanessa Bridge’s demo from earlier today, which showed using BPMN to define smart contract flows; this adds another dimension to the same problem, and there’s likely no reason why you wouldn’t use them all together in the right situation. Edson said that he was inspired, in part, by this post on smart contracts by Paul Lachance, in which Lachance said “a visual model such as a BPMN and/or DMN diagram could be used to generate the contract source code via a process-engine”. He used Ethereum for the blockchain smart contract and the Ether cryptocurrency, Trisotech for the DMN models, and Drools for the rules execution. All in all, not such a far-fetched idea.
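
To give a flavor of the kind of business logic the DMN settlement service encapsulates, here’s a hypothetical sketch of a tax-and-insurance calculation in plain Python. The tiered tax rates and insurance fee are entirely made up; in the actual demo this logic lives in DMN models executed by Drools and is invoked from the Ethereum smart contract flow.

```python
# Hypothetical settlement logic; the rates below are invented for illustration.
def land_transfer_tax(sale_price):
    # Made-up tiered rate table: 1% up to 250k, 2% on the remainder.
    if sale_price <= 250_000:
        return sale_price * 0.01
    return 250_000 * 0.01 + (sale_price - 250_000) * 0.02

def title_insurance(sale_price):
    # Made-up premium: 0.2% of the sale price with a $300 floor.
    return max(300.0, sale_price * 0.002)

def settlement_amount(sale_price):
    return sale_price + land_transfer_tax(sale_price) + title_insurance(sale_price)

print(settlement_amount(400_000))  # 400000 + 5500 + 800 = 406300.0
```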

I’m still catching flak for suggesting the now-ubiquitous Ignite style for presentations here at bpmNEXT; my next lobbying effort will be around restricting the maximum number of words per slide. 🙂

bpmNEXT 2018: Here’s to the oddballs, with ConsenSys, XMPro and BPLogix

And we’re off with the demo sessions!

Secure, Private Decentralized Business Processes for Blockchains, ConsenSys

Vanessa Bridge of ConsenSys spoke about using BPMN diagrams to create smart contracts and other blockchain applications, while also including privacy, security and other necessary elements: essentially, using BPM to enable Ethereum-based smart contracts (rather than using blockchain as a ledger for BPM transactions and other BPM-blockchain scenarios that I’ve seen in the past). She demonstrated using Camunda BPM for a token sale application, and for a boardroom voting application. For each of the applications, she used BPMN to model the process, particularly the use of BPMN timers to track and control the smart contract process — something that’s not native to blockchain itself. Encryption and other steps were called as services from the BPMN diagram, and the results of each contract were stored in the blockchain. Good use of BPM and blockchain together in a less-expected manner.

Turn IoT Technology into Operational Capability, XMPro

Pieter van Schalkwyk of XMPro looked at the challenges of operationalizing IoT, with a virtual flood of data from sensors and machines that needs to be integrated into standard business workflows. This involves turning big data into smart data via stream processing before passing it on to the business processes in order to achieve business outcomes. XMPro provides smart listeners and agents that connect the data to the business processes, forming the glue between realtime data and resultant actions. His demo showed data being collected from a fan on a cooling tower, bringing in data from the sensor logs and comparing it to the manufacturer’s information and historical information in order to predict whether the fan is likely to fail, create a maintenance work order, and even optimize maintenance schedules. They can integrate with a large library of action agents, including their own BPM platform or other communication and collaboration platforms such as Slack. They provide a lot of control over their listener agents, which can be used for any type of big data, not just industrial device data, and integrate complex and accurate prediction models regarding likelihood of failure and remaining useful life. He showed their BPM platform being used downstream from the analytical processing, where the internet of things can interact with the internet of people to make additional decisions required in the context of additional information such as 3D drawings. Great example of how to filter through hundreds of millions of data points in streaming mode to find the few combinations that require action to be taken. He threw out a comment at the end that this could be used for non-industrial applications, possibly for GDPR data, which definitely made me think about content analytics on content as it’s captured in order to pick out which events might trigger a downstream process, such as a regulatory process.
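
The listener/agent pattern he described boils down to evaluating streaming readings against known limits and emitting an action only when a condition persists. Here’s a minimal sketch of that idea, not XMPro’s product: the vibration limit, field names and work-order action are all assumptions for illustration.

```python
from statistics import mean

# Made-up limit, standing in for a manufacturer's spec sheet value.
MAX_VIBRATION_MM_S = 7.1

def listener(readings, window=5):
    """Yield an action when the rolling average vibration exceeds the limit."""
    buffer = []
    for reading in readings:
        buffer.append(reading["vibration_mm_s"])
        if len(buffer) > window:
            buffer.pop(0)
        if len(buffer) == window and mean(buffer) > MAX_VIBRATION_MM_S:
            yield {
                "action": "create_work_order",
                "asset": reading["asset_id"],
                "reason": f"rolling vibration {mean(buffer):.2f} mm/s over limit",
            }

stream = [{"asset_id": "fan-07", "vibration_mm_s": v}
          for v in [5.0, 5.2, 6.8, 7.4, 7.9, 8.3, 8.1]]
for action in listener(stream):
    print(action)
```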

Business Milestones as Configuration, BPLogix

Scott Menter and Joby O’Brien of BPLogix finished up this section on new BPM ideas with their approach to goal orientation in BPM, which is milestone-based and requires understanding the current state of a case before deciding how to move forward. Their Process Director BPM is not BPMN-based, but rather an event-based platform where events are used to determine milestones and drive the process forward: much more of a case management view, usually visualized as a project management-style GANTT chart rather than a flow model. They demonstrated the concept of app events, where changes in state of any of a number of things — form fields, activities, document attachments, etc. — can record a journal entry that uses business semantics and process instance data. This allows events from different parts of the platform to be brought together in a single case journal that shows the significant activity within the case, but also to be triggers for other events such as determining case completion. The journal can be configured to show only certain types of events for specific users — for example, if they’re only interested in events related to outgoing correspondence — and also becomes a case collaboration discussion. Users can select events within the journal and add their own notes, such as taking responsibility for a meeting request. They also showed how machine learning and rules can be used for dynamically changing data; although shown as interactions between fields on forms, this can also be used to generate new app events. Good question from the audience on how to get customers to think about their work in terms of events rather than procedures; case management proponents will say that business people inherently think about events/state changes rather than processes. Interesting representation of creating a selective journal based on business semantics rather than just logging everything and expecting users to wade through it for the interesting bits.
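
Conceptually, the app-event journal is a stream of semantically named events coming from different parts of the platform, with filtered views per user. Here’s a minimal sketch of that idea; the event types, fields and classes are my own illustrative assumptions, not Process Director’s actual model.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class AppEvent:
    event_type: str          # e.g. "correspondence.sent", "form.field_changed"
    description: str
    timestamp: datetime = field(default_factory=datetime.utcnow)
    notes: List[str] = field(default_factory=list)  # user collaboration notes

class CaseJournal:
    def __init__(self):
        self.events: List[AppEvent] = []

    def record(self, event: AppEvent):
        self.events.append(event)

    def view(self, event_types=None):
        """Filtered view, e.g. only outgoing-correspondence events for a user."""
        return [e for e in self.events
                if event_types is None or e.event_type in event_types]

journal = CaseJournal()
journal.record(AppEvent("form.field_changed", "Claim amount updated to $12,000"))
journal.record(AppEvent("correspondence.sent", "Settlement offer mailed to claimant"))
for entry in journal.view({"correspondence.sent"}):
    print(entry.description)
```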

We’re off to lunch. I’m a bit out of practice at live-blogging, but hope that I captured some of the flavor of what’s going on here. Back with more this afternoon!

bpmNEXT 2018 kicks off: keynotes with @JimSinur and @NeilWD

It’s the first day of bpmNEXT, the conference for BPM visionaries and free thinkers to get together, share ideas, show their cool new stuff, meet new friends and get reacquainted with old ones. This is an opportunity for technologists (primarily senior technical people from BPM vendors) to give demos in a very structured format, but it’s not really a place for customers: more like the BPM Think Tanks of old. Organized by Bruce Silver and Nathaniel Palmer, themselves both long-time contributors to the industry, with content provided by a lot of people who are loud and proud about their technology.

That very structured format, in case you haven’t read about or attended bpmNEXT before, is a strictly limited Ignite-style presentation followed by a live demo. This limits the amount of time that presenters can spend showing slides and forces them to get to the good stuff.

You can see the agenda here, and we started out the first day with a few keynotes from industry thought leaders before getting to the demo presentations. I’ll cover those in this post, then do individual posts for each section of demos (usually three in a section). These will be rough notes since there’s a lot of information that goes by quickly; you’ll be able to see video of all of the sessions, most likely on the bpmNEXT YouTube channel (where you can also see previous years’ sessions).

The Future of Process in Digital Business, Jim Sinur

Jim Sinur, a long-time Gartner analyst who is now with Aragon Research, spoke about trends in digital businesses. Most of this was a plug for Aragon and their research reports that seemed focused on customer organizations, which doesn’t seem like a good fit for this audience, where most of the people in the room are well-versed in these technologies and how to apply them in real life.

I’d really like to see more conversational sessions rather than presentations for the keynotes, or at least content that is more directly relevant to the audience.

A New Architecture for Automation, Neil Ward-Dutton

Neil Ward-Dutton, who heads up MWD Advisors, presented a distillation of the conversations that they’re having with customer organizations, starting with the difficult choices that they have to make in terms of which technologies to choose: for example, when RPA vendors tell them that they don’t need BPM any more. He went through some insights into the technologies that are impacting CIOs’ strategic decisions — no surprises there — then presented a schematic model for how work happens in organizations as a basis for understanding how different technologies impact different parts of their work. The framework categorizes work into exploratory, transactional and programmatic, and he walked through what each of those types defines up front, and how the technologies are used within that. Good view of how to help organizations think about their work and how to develop automation strategies that address different work styles and applications.

Although a lot of his presentation was aimed at a general audience that could include customers, Neil finished up with a bit on next moves for vendors and technologists as the technology market changes: there are a lot of mergers and acquisitions going on, and older technologies are being replaced with newer ones in specific instances. He had some recommendations about rearchitecting products and adding value, shifting from one-size-fits-all products to collections of independent runtime services in order to support cloud architectures (especially elastic computing requirements) and provide more flexibility in product offerings.

bpmNEXT 2016 demo: Capital BPM and Fujitsu

Our final demo session of bpmNEXT — can’t believe it’s all over.

How I Learned to Tell the Truth with BPM – Gene Rawls, Capital BPM

Their Veracity tool overlays architecture and process layers using visual models, integrated with a few different BPMS (primarily IBM); create models in the tool for process and underlying technical architecture (SOA, rules and data) layers, and create linkages between them to indicate interactions. Direct integration of IBM ODM into rules layer.

Business Process and Self-Managed Teams – Keith Swenson, Fujitsu and WfMC

Finishing bpmNEXT with a presentation on self-managed teams in the context of BPM, not a demo. Contrasting organizational styles of “early structured” (aka “structured”) versus “late structured” (aka unstructured), with respective characteristics of centralized versus decentralized, and machine-style versus garden-style. Concepts of sociocracy (on which holacracy is based): a formal method for running self-managed teams that are structured around social relationships, aka dynamic governance. Extremely agile, allows ideas to boil up from the bottom. Replaces voting with consensus, where there is open discussion of options and everyone must consent that it is acceptable; objections must be accompanied by a better proposal. Defining principles: consent governs policy decision making; organizing in circles; double-linking; and elections by consent. Self-managed organizations are inherently agile since good decisions are made where needed and everyone agrees. May be implications for DMN as to how decisions are modeled and captured. BPMN and CMMN can cover some of the domains of predictability; we saw other demos this week using other model types that extend further into unpredictable work, such as a process timeline view. Outstanding issues of whether BPMN should be extended to handle less predictable work, or if CMMN can handle this. Keith ended with the observation that this was the year of DMN at bpmNEXT, and issued a call to action for an open-source implementation of DMN execution with a conformance suite; likely more possible than for BPMN since it is more constrained. A lot of great discussion ensued, and Keith will be spearheading a WfMC committee to look at this.

bpmNEXT 2016 demos: Appian, Bonitasoft, Camunda and Capital BPM

Last day of bpmNEXT 2016 already, and we have a full morning of demos in two sessions, the first of which has a focus on more technical development.

Intent-Driven, Future-Proof User Experience – Malcolm Ross and Suvajit Gupta, Appian

Appian’s SAIL UI development environment. Build interfaces with smart components that detect the capabilities of the runtime device (e.g., camera, Bluetooth) and enable/disable/configure components on the fly. Supports a variety of UI rendering architectures/frameworks for desktop, and generates native mobile apps for Android and iOS. Directly supports their underlying constructs such as Records and process models when building forms. Dynamic content based on selections and data on form. Fast rebranding of forms with color and logos. Full functionality on mobile devices, and offline support via caching data down to device, and saving any offline transactions to automatically synchronize when reconnected. Switch between design (tree/graphical) view and code view in IDE to support different technical capabilities of UI designers. Not a focus on BPM per se, since Appian is repositioning as more of a process-centric application development tool than BPMS, although used as the UI development environment for their process applications.

Continuous Integration: Tools to Empower DevOps in Process-Based Application Development – Charles Souillard, Bonitasoft

Embodying continuous integration for live updates of applications, enabling easier development and automated testing supported by Docker images. Demo of simple shopping cart application created using BonitaBPM, with a combination of forms, pages, layouts, custom widgets and fragments that can be rendered on desktop and mobile devices. Underlying BPMN process model with human activities connected to UI artifacts. Versioned using Subversion. The continuous integration functionality monitors checked-in changes to the application and integrates them into the dev/test repository to allow immediate testing; in the demo, a new input parameter was added to a process step; the updated code was detected and tested, with testing errors raised due to the unknown parameter. Potential to accelerate the dev-test cycle, since code can be checked in by developers several times each day, with the results automatically tested and fed back to them.

Combining DMN with BPMN and CMMN: The Open Source Way – Jakob Freund, Camunda

Camunda’s “developer-friendly” BPM for developers to add process, case and decision capabilities to their applications. Their DMN decision tables allow changing decision tables at runtime for increased agility, depending on the binding specified by the process designer. Decisions executed as decision tasks from a process are logged as part of process history, and visible in their admin Cockpit interface to trace through decisions for a specific process instance. DMN engine also available outside decision tasks in a process, such as a REST API call from a form to dynamically update values as parameters change; when deploying a table, both a public ID for executing the table and a private ID for editing the table are generated for the REST access. Nice traceability directly into the decision table, and fast changes to production decision tables. Open source, with a free (non-production) DMN cloud version. Extra points for creating an online dungeon game using BPMN, and playing a round during the demo.

bpmNEXT 2016 demos: IBM, Orquestra, Trisotech and BPM.com

On the home stretch of the Wednesday agenda, with the last session of four demos for the day.

BPM in the Cloud: Changing the Playing Field – Eric Herness, IBM

IBM Bluemix process-related cloud services, including cognitive services leveraging Watson. Claims process demo that starts by uploading an image of a vehicle and passing it to Watson image recognition for visual classification; returned values show confidence in vehicle classification, such as “car”, and any results over 90% are sent to the Alchemy taxonomy service to align those — in the demo, Watson returned “cars” and “sedan” with more than 90% confidence, and the taxonomy service determined that sedan is a subset of cars. This allows routing of the claim to the correct process for the type of vehicle. If Watson has not been trained for the specific type of vehicle, the image classification won’t be determined with a sufficient level of confidence, and it will be passed to a work queue for manual classification. Unrecognized images can be used to add to the classifier, either as an example of an existing classification or as a new classification. Predictive models based on Spark machine learning and analytics of past cases create predictions of whether a claim should be approved, and the degree of confidence in that decision; at some point, as this confidence increases, some of the claims could be approved automatically. Good examples of how to incorporate cognitive computing to make business processes smarter, using cognitive services that could be called from any BPM system, or any other app that can call REST services.
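
The routing step he showed is essentially a confidence threshold plus a taxonomy lookup. Here’s a small sketch of that decision logic, with a made-up taxonomy and classifier output standing in for the Watson and Alchemy services.

```python
CONFIDENCE_THRESHOLD = 0.90

# Made-up taxonomy mapping specific labels to broader vehicle classes.
TAXONOMY = {"sedan": "cars", "suv": "cars", "pickup": "trucks"}

def route_claim(classifier_results):
    """classifier_results: list of (label, confidence) pairs from image recognition."""
    confident = [(label, score) for label, score in classifier_results
                 if score >= CONFIDENCE_THRESHOLD]
    if not confident:
        # Low confidence: send to a work queue for manual classification.
        return {"queue": "manual_classification", "labels": classifier_results}
    # Align confident labels against the taxonomy to pick the claims process.
    for label, _ in confident:
        if label in TAXONOMY:
            return {"queue": f"claims_{TAXONOMY[label]}", "classified_as": label}
    return {"queue": "manual_classification", "labels": confident}

print(route_claim([("cars", 0.94), ("sedan", 0.91), ("boat", 0.12)]))
# {'queue': 'claims_cars', 'classified_as': 'sedan'}
```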

Model, Generate, Compile in the Cloud and Deploy Ready-To-Use Mobile Process Apps – Rafael Bortolini and Leonardo Luzzatto, CRYO/Orquestra

Demo of Orquestra BPMS implementation for Rio de Janeiro’s municipal processes, e.g., business license requests. From a standard worklist style of process management, generate a process app for a mobile platform: specify app name and logo, select app functionality based on templates, then preview it and compile for iOS or Android. The .ipa or .apk files are generated ready for uploading to the Apple or Google app stores, although that upload can’t be automated. Full functionality to allow mobile user to sign up or login, then access the functionality defined for the app to request a business license. Although an app is generated, the data entry forms are responsive HTML5 to be identical to the desktop version. Very quick implementation of a mobile app from an existing process application without having to learn the Orquestra APIs or even do any real mobile development, but it can also produce the source code in case this is just wanted as a quick starting point for a mobile development project.

Dynamic Validation of Integrated BPMN, CMMN and DMN – Denis Gagné, Trisotech

Kommunicator tool based on their animation technology that animates models, which allows tracing the animation directly from a case step in the BPMN model to the CMMN model, or from a decision step to the DMN model. Also links to the semantic layer, such as the Sparx SOA architecture model or other enterprise architecture reference models. This allows manually stepping through an entire business model in order to learn and communicate the procedures, and to validate the dynamic behavior of the model against the business case. Stepping through a CMMN model requires selecting the ad hoc tasks as the case worker would in order to step through the tasks and see the results; there are many different flow patterns that can emerge depending on the tasks selected and the order of selection, and stages will appear as being eligible to close only when the required tasks have been completed. Stepping through a DMN model allows selecting the input parameters in a decision table and running the decision to see the behavior. Their underlying semantic graph shows the interconnectivity of all of the models, as well as goals and other business information.

Simplified CMMN – Lloyd Dugan, BPM.com

Last up is not a demo (by design), but a proposal for a simplified version of CMMN, starting with a discussion of BPMN’s limitations in case management modeling: primarily that BPMN treats activities but not events as first-class citizens, making it difficult to model event-driven cases. This creates challenges for event subprocesses, event-driven process flow and ad hoc subprocesses, which rely on “exotic” and rarely used BPMN structures and events that many BPMN vendors don’t even support. Moving a business case – such as an insurance claim – to a CMMN model makes it much clearer and easier to model; the more unstructured the situation is, the harder it is to capture in BPMN, and the easier it is to capture in CMMN. The proposal for simplifying CMMN for use by business analysts includes removing PlanFragment and removing all notational attributes (AutoComplete, Manual Activation, Required, Repetition) that are really execution-oriented logic. This leaves the core set of elements plus the related decorators. I’m not enough of a CMMN expert to know if this makes complete sense, but it seems similar in nature to the subsets of BPMN commonly used by business analysts rather than the full palette.

bpmNEXT 2016 demos: Oracle, OpenRules and Sapiens DECISION

This afternoon’s first demo session shifts the focus to decision management and DMN.

Decision Modeling Service – Alvin To, Oracle

Oracle Process Cloud as an alternative to their Business Rules, implementing the DMN standard and the FEEL expression language. Exposes decisions as services that can be called from a BPMN process. Create a space (container) to contain all related decision models, then create a DMN decision model in that space. Create test data records in the space, which will be deleted before final deployment. Define decisions using expressions, decision tables, if-then-else constructs and functions. Demo example was a loyalty program, where discounts and points accumulation were decided based on program tier and customer age. The decisions can be manually executed using the test data, and the rules changed and saved to immediately change the decision logic. A second demo example was an order approval decision, where an order number could be fed into the decision and an approval decision returned, including looping through all of the line items in the order and making decisions at that level as well as an overall decision based on the subdecisions. Once created, expose the decisions or subdecisions as services to be called from external systems, such as a step in a BPMN model (or presumably any other application). Good way to introduce standard DMN decision modeling into any application without having an on-premise decision management system.
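
For a sense of what the loyalty decision might look like once expressed as rules, here’s a plain-Python sketch of the demo example. The tiers, discount percentages and point multipliers are invented; in the demo this is modeled as DMN decision tables and expressions rather than code.

```python
def loyalty_decision(program_tier, customer_age):
    # Discount rules: first matching row wins, like a DMN "first" hit policy.
    if program_tier == "gold":
        discount = 0.15
    elif program_tier == "silver":
        discount = 0.10
    else:
        discount = 0.05

    # Points accumulation rules, with a made-up bonus for older customers.
    multiplier = {"gold": 3, "silver": 2}.get(program_tier, 1)
    if customer_age >= 65:
        multiplier += 1

    return {"discount": discount, "points_multiplier": multiplier}

print(loyalty_decision("silver", 70))  # {'discount': 0.1, 'points_multiplier': 3}
```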

Dynamic Decision Models: Activation/Deactivation of Business Rules in Real Time – Jacob Feldman, OpenRules

What-If Analyzer for decision modeling, for optimization, to show conflicts between rules, and to enable/disable rules dynamically. Interface shows glossary of decision variables, and a list of business rules with a checkbox to activate/deactivate each. Deactivating rules using the checkboxes updates the values of the decision results to find a desired solution, and can find minimum and maximum values for specified decision variables that will still yield the same decision result. The demo example was a loan approval calculation, where several rules were disabled in order to have the decision result of “approved”, then a maximum value generated for accumulated debt that would still give an “approved” result. Second example was how to build a good burger, optimizing cost for specific health and taste standards by selecting different rules and optimizing the resulting sets of decision variables. Third example was a scheduling problem, optimizing activities when building a house in order to maintain precedence and resulting in the earliest possible move-in date, working within budget and schedule constraints. Interesting analysis tool for gaining a deep understanding of how your rules/decisions interact, far beyond what can be done using decision tables, especially for goal-seeking optimization problems. All open source.
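
Here’s a toy version of the activate/deactivate idea, far short of the analyzer’s optimization and goal-seeking capabilities: each rule can veto a loan approval, and toggling which rules are active shows how the outcome changes. The rule names, thresholds and applicant data are all made up.

```python
# Each rule rejects the application when its condition fires (invented thresholds).
RULES = {
    "min_credit_score": lambda a: a["credit_score"] < 650,
    "max_debt_ratio":   lambda a: a["debt"] / a["income"] > 0.4,
    "min_income":       lambda a: a["income"] < 30_000,
}

def decide(applicant, active_rules):
    fired = [name for name in active_rules if RULES[name](applicant)]
    return ("declined", fired) if fired else ("approved", [])

applicant = {"credit_score": 610, "income": 45_000, "debt": 25_000}

# With all rules active, the credit score and debt ratio rules both fire.
print(decide(applicant, {"min_credit_score", "max_debt_ratio", "min_income"}))

# Deactivating the firing rules flips the outcome, like unchecking them in the
# analyzer to see which rules stand between this applicant and "approved".
print(decide(applicant, {"min_income"}))  # ('approved', [])
```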

The Dirty Secret in Process and Decision Management: Integration is Difficult – Larry Goldberg, Sapiens DECISION

Data virtualization to create in-memory logical units of data related to specific business entities. Demo started with a decision model for an insurance policy renewal, with input variables included for each decision and subdecision. Acquiring the data for those input variables can require a great deal of import/export and mapping from source systems containing that data; their InfoHub creates the data model and allows setup of the integration with external sources by connecting data sources and defining mapping and transformation between source and destination data fields. When deployed to the InfoHub server, web service interfaces are created to allow calling from any application; at runtime, InfoHub ensures that the logical unit of data required for a decision is maintained in memory to improve performance and reduce implementation complexity of the calling application. There are various synchronization strategies to update their logical units when the source data changes — effectively, a really smart caching scheme that synchronizes only the data that is required for decisions.

bpmNEXT 2016 demos: W4 and BP3

Second round of demos for the day, with more case management. This time with pictures!

BPM and Enterprise Social Networks for Flexible Case Management – Francois Bonnet, W4 (now ITESOFT Group)

Adding ESN to case management (via Jamespot plugin) to improve collaboration and flexibility, enhancing a timeline of BPM events with the comments and other collaboration events that occur as the process executes. Initiates social routing as asynchronous event call. Example shows collaborative ownership assignment on an RFP, where an owner must self-select within the ESN before a process deadline is reached, or the assignment is made automatically. Case ID shared between W4 BPM and Jamespot ESN, so that case assignments, comments and other activities are sent back to BPM for logging in the process engine to create a consolidated timeline. Can create links between content artifacts, such as between RFP and proposal. Nice use of BPMN events to link to ESN, and a good example of how to use an external (but integrated) ESN for collaborative steps within a standard BPMN process, while capturing events that occur in the ESN as part of the process audit trail.

A Business Process Application with No Process – Scott Francis, BP3 Global

Outpatient care example with coordination of resources (rooms, labs) and people (doctors, patients); BPMN may not be the best way to model and coordinate resources since it can end up as a single-task anti-pattern. Target UI on tablet, using their Brazos tools with responsive UI, but can be used on desktop or phone. Patient list allows the provider to manage the high-level state of waiting versus in progress by assigning a room, then add substates such as “Chaperone Required”, with immediate updates regardless of platform used. Patient and doctor notifications can be initiated from the action menu. A beautiful UI implementation of a fairly simple state management application built on IBM BPM, although the infrastructure is there to tie in events and data from other systems.

bpmNEXT 2016 demos: Salesforce, BP Logix and RedHat

Day 2 of bpmNEXT is all demos! Four sessions with a total of 12 demos coming up, with most of the morning focused on case management.

Cloud Architecture Accelerating Innovation in Application Development – Linus Chow, Salesforce

App dev environment that allows integration of Salesforce data with other sources, such as SAP. Schema builder allows data models to be visualized and linked in an ERD format, with field-level security and audit capabilities. Process Builder is an environment for visual creation of Salesforce-related data-driven processes, typically simple update actions triggered by data updates. User experiences created using Lightning App Builder, including support for mobile devices. Work-Relay as a more traditional process orchestration environment leveraging the Salesforce environment. Although mostly a live demo, the entire Work-Relay section was a pre-recorded screencast, which was a disappointing violation of the bpmNEXT format.

One Model, Three Dimensions: Combining Flow, Case and Time Into a Unified Development Paradigm – Scott Menter and Joby O’Brien, BP Logix

Process Timeline as a GANTT chart view of process, where highly-parallel tasks must have conditions of precedence, eligibility and necessity met in order to execute, as the underlying structure for case management. An application can include a goal (objective, KPI) that can drive actions and impose conditions while being evaluated independent of any process. Define process as a timeline where activities have “start when” (precedence), “completed when”, “needed when” conditions plus due date, forms and participants. Drag and drop activities on each other to establish precedence dependencies, and group into parent/child relationships to organize sections of process. Can use predictions of completion dates for activities, based on historical data, as triggers for actions. Data virtualization for external data sources, allowing more technical designer to publish the results of queries/views on external sources for other designers to use in applications. Integrated form builder with validation rules based on the shared data and rules previously defined. External events of various types can trigger actions in an event-condition-action paradigm.
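
The precedence/eligibility/necessity idea can be sketched as a simple condition check over case data: an activity becomes startable when its predecessors are either complete or no longer needed, and its own “needed when” condition holds. This is an illustrative sketch with invented activity names and conditions, not Process Timeline’s actual semantics.

```python
# Each timeline activity declares its predecessors and a "needed when" condition
# over the case data (activity names and conditions are made up).
activities = {
    "collect_documents": {"predecessors": [],
                          "needed_when": lambda c: True},
    "site_inspection":   {"predecessors": ["collect_documents"],
                          "needed_when": lambda c: c["claim_amount"] > 10_000},
    "approve_claim":     {"predecessors": ["collect_documents", "site_inspection"],
                          "needed_when": lambda c: True},
}

def startable(name, completed, case_data):
    activity = activities[name]
    predecessors_done = all(
        p in completed or not activities[p]["needed_when"](case_data)
        for p in activity["predecessors"]
    )
    return (predecessors_done
            and activity["needed_when"](case_data)
            and name not in completed)

case_data = {"claim_amount": 8_000}
completed = {"collect_documents"}
print(startable("site_inspection", completed, case_data))  # False: not needed
print(startable("approve_claim", completed, case_data))    # True: inspection skipped
```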

Building Advanced Case-Driven Applications – Kris Verlaenen, RedHat

Extension of jBPM from structured process to dynamic case management, seen as a spectrum rather than distinct functionality. Building blocks to add ad hoc choices, milestones, case participants and other case constructs on top of the core process capabilities. Workbench for authoring case definitions, including creating BPMN process models with ad hoc tasks and structured process snippets, and decision tables that can include automatic task triggering. Roles are defined to limit access to data, tasks and functionality. UI for admins, but they demonstrated a UI built for end users using their UI building blocks that allows selection of the ad hoc tasks in the context of the case data; this extracts the structured data from the case definition and will self-adjust if new data or tasks are added. UI functionality is limited, and likely more useful as a prototype than a full production UI. As with other open source tools, more targeted at developers than a low-code environment. Interesting use of BPMN ad hoc tasks for case tasks rather than CMMN, supporting their basic premise that it’s a spectrum of capabilities rather than two distinct work modes.