bpmNEXT 2016 demo session: Signavio and Princeton Blue

Second demo round, and the last for this first day of bpmNEXT 2016.

Process Intelligence – Sven Wagner-Boysen, Signavio

Signavio allows creating a BPMN model with definitions of KPIs for the process, such as backlog size and end-to-end cycle time. The demo today was their process intelligence application, which allows a process model to be uploaded along with an activity log of historical process instance data from an operational system — either a BPMS or some other system such as an ERP or CRM — in CSV format. Since the process model is already known (in theory), this doesn’t do process mining to derive the model, but rather aggregates the instance data and creates a dashboard that shows the problem areas relative to the KPIs defined in the process model. Drilling down into a particular problem area shows some aggregate statistics as well as the individual instance data. Hovering over an instance shows the trace overlaid on the defined process model, that is, what path that instance took as it executed. There’s an interesting feature to show instances that deviate from the process model, typically by skipping or repeating steps where there is no explicit path in the process model to allow that. This is similar in nature to what SAP demonstrated in the previous session, although it uses imported process log data rather than a direct connection to the history data. Given that Signavio can model DMN integrated with BPMN, future versions of this could include intelligence around decisions as well as processes; this is a first version with some limitations.
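To make the aggregation concrete, here’s a minimal sketch (with invented column names and KPI values, not Signavio’s actual format) of how an uploaded CSV activity log might be rolled up into per-instance cycle times, KPI violations and path deviations:

```python
import csv
import io
from datetime import datetime

# Toy activity log in the CSV shape such tools typically ingest:
# one row per completed activity, keyed by case (instance) id.
LOG = """case_id,activity,timestamp
1,Receive Order,2016-04-01T09:00
1,Check Credit,2016-04-01T10:00
1,Ship,2016-04-02T09:00
2,Receive Order,2016-04-01T09:30
2,Ship,2016-04-05T09:30
"""

EXPECTED_PATH = ["Receive Order", "Check Credit", "Ship"]
CYCLE_TIME_KPI_HOURS = 48  # end-to-end KPI defined on the model

def analyze(log_csv, expected_path, kpi_hours):
    """Aggregate per-instance cycle time and flag KPI/path deviations."""
    cases = {}
    for row in csv.DictReader(io.StringIO(log_csv)):
        cases.setdefault(row["case_id"], []).append(
            (datetime.fromisoformat(row["timestamp"]), row["activity"]))
    report = {}
    for case_id, events in cases.items():
        events.sort()
        hours = (events[-1][0] - events[0][0]).total_seconds() / 3600
        trace = [a for _, a in events]
        report[case_id] = {
            "cycle_time_h": hours,
            "kpi_violation": hours > kpi_hours,
            "deviates": trace != expected_path,  # e.g. skipped steps
        }
    return report

report = analyze(LOG, EXPECTED_PATH, CYCLE_TIME_KPI_HOURS)
```

Case 2 skips the credit check and blows the cycle-time KPI, so it would surface both in the dashboard’s problem areas and in the deviating-instances view.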

Leveraging Cognitive Computing and Decision Management to Deliver Actionable Customer Insight – Pramod Sachdeva, Princeton Blue

Sentiment analysis of unstructured social media data, creating a dashboard of escalations and activities integrated with internal customer data. Uses Watson for much of the analysis, IBM ODM to apply rules for escalation, and future enhancements may add IBM BPM to automatically spawn action/escalation processes. Includes a history of sentiment for the individual, tied to service requests that responded to social media activity. There are other social listening and sentiment analysis tools that have been around for a while, but they mostly just drive dashboards and visualizations; the goal here is to apply decisions about escalations, and trigger automated actions based on the results. Interesting work, but this was not a demo up to the standards of bpmNEXT: it was only static screenshots and some additional PowerPoint slides after the Ignite portion, effectively just an extended presentation.
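As a rough illustration of the decisioning layer described above, here’s a toy escalation rule; the thresholds, scoring scale and customer fields are my own assumptions, not the IBM ODM rules in the demo:

```python
# Hypothetical escalation decision: sentiment scores for a customer's
# recent social posts (e.g. from an NLP service, scaled to [-1, 1])
# plus internal customer data decide whether to open an escalation.
def should_escalate(post_sentiments, customer):
    """post_sentiments: list of scores in [-1, 1]; customer: dict."""
    negatives = [s for s in post_sentiments if s < -0.3]
    if not negatives:
        return False
    # Escalate on a single strongly negative post, or on a pattern of
    # repeated negativity from a high-value customer.
    if min(negatives) < -0.8:
        return True
    return len(negatives) >= 3 and customer.get("tier") == "gold"
```

An automated action process (the future IBM BPM enhancement mentioned) would then be spawned whenever this returns true.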

bpmNEXT 2016 demo session: 8020 and SAP

My panel done — which probably set some sort of record for containing exactly 50% of the female attendees at the conference — we’re on to the bpmNEXT demo session: each is 5 minutes of Ignite-style presentation, 20 minutes of demo, and 5 minutes for Q&A. For the demos, I’ll just try to capture some of the high points of each, and I highly recommend that you check out the videos of the presentations when they are published after the conference.

Process Design & Automation for a New Economy – Ian Ramsay, 8020 BPM

A simplified, list-based process designer that defines a list of real-world business entities (e.g., application), a list of states unique to each entity (e.g., approved), lists of individuals and groups, and lists of stages and the tasks associated with each stage. Each new process has a list of start events that happen when a process is instantiated, one or more tasks in the middle, then a list of end events that define when the process is done. Dragging from the lists of entities, states, groups, individuals, stages and tasks onto the process model creates the underlying flow and events, building a more comprehensive process model behind the scenes. This allows a business specialist to create a process model without understanding process modeling or even simple flowcharting, just by identifying the relationships between the different states of a business entity, the stages of a business process, and the people involved. Removing an entity from a process modifies the model to remove that entity while keeping the model syntactically correct. An interesting alternative to BPMN-style process modeling, from someone who helped create the BPMN standard, where the process model is a byproduct of entity-state modeling.
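A minimal sketch of the idea, assuming a derivation rule of one transition per consecutive entity state (8020’s actual derivation logic wasn’t shown in detail):

```python
# List-based modeling in miniature: the modeler supplies lists
# (an entity, its ordered states, tasks per stage), and a conventional
# flow model is derived behind the scenes. The derivation rule here is
# an illustrative assumption, not 8020's algorithm.
def derive_flow(entity, states, tasks_by_stage):
    flow = {
        "start": f"{entity} created ({states[0]})",
        "end": f"{entity} reaches {states[-1]}",
        "transitions": [],
    }
    for src, dst in zip(states, states[1:]):
        flow["transitions"].append((f"{entity}:{src}", f"{entity}:{dst}"))
    flow["tasks"] = [t for stage in tasks_by_stage.values() for t in stage]
    return flow

flow = derive_flow(
    "application",
    ["submitted", "reviewed", "approved"],
    {"intake": ["verify identity"], "review": ["assess risk"]})
```

Removing an entity would simply drop its transitions from the derived model, which is why the result stays syntactically correct.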

Process Intelligence for the Digital Age: Combining Intelligent Insights with Process Mining – Tarun Kamal Khiani and Joachim Meyer, SAP, and Bastian Nominacher, Celonis

Combining SAP’s Operational Process Intelligence analytics and dashboard (which was shown at last year’s bpmNEXT as well as in some other briefings that I’ve documented) with Celonis’ process mining. Drilling down on a trouble item from the OPInt dashboard, such as late completion of a specific process type, to determine the root cause of the problem; this includes actionable insights, that is, being able to trigger an operational activity to fix the problem. That allows a case-by-case problem resolution, but adding in the Celonis HANA-based process mining capability allows past process instance data to be mined and analyzed. Adjusting the view on the mined data allows outliers and exceptions to be identified, transforming the straight-through process model to a full model of the instance data. For root cause analysis, this involved filtering down to only processes that took longer than a specific number of days to complete, then manually identifying the portions of the model where lag times at certain activities may be causing the overly long cycle time. Similar to other process mining tools, but nicely integrated with SAP S4 processes via the in-memory HANA data mart: no export or preprocessing of the process instance history log, since the process mining is applied directly to the realtime data. This has the potential to be taken further by looking at doing realtime recommendations based on the process mining data and some predictive modeling, although that’s just my opinion.
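The root-cause filtering step can be sketched like this: keep only the instances over a cycle-time threshold, then rank the activity-to-activity lags to see where the time goes. The event shape below is a toy stand-in for the in-memory instance history, not the Celonis/HANA interface:

```python
from collections import defaultdict

def slow_case_lags(events, threshold_hours):
    """events: list of (case_id, activity, t_hours) tuples."""
    by_case = defaultdict(list)
    for case_id, activity, t in events:
        by_case[case_id].append((t, activity))
    lags = defaultdict(list)
    for trace in by_case.values():
        trace.sort()
        if trace[-1][0] - trace[0][0] <= threshold_hours:
            continue  # only mine the overly long instances
        for (t1, a1), (t2, a2) in zip(trace, trace[1:]):
            lags[(a1, a2)].append(t2 - t1)
    # average lag per hand-off, worst first
    return sorted(((sum(v) / len(v), k) for k, v in lags.items()),
                  reverse=True)

EVENTS = [(1, "create", 0), (1, "approve", 2), (1, "ship", 4),
          (2, "create", 0), (2, "approve", 70), (2, "ship", 72)]
ranking = slow_case_lags(EVENTS, threshold_hours=48)
```

Here only case 2 exceeds the threshold, and the create-to-approve hand-off immediately stands out as the bottleneck, which is the kind of manual identification the demo walked through.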

Good start to the demos with some new ideas on modeling and realtime process mining.

Building a Value-Added BPM Business panel at bpmNEXT

BPM implementations aren’t just about the software vendors, since the vendor vision of “just take it out of the box and run it” or “have your business analyst build operational systems with our low-code platform” is rarely realized in practice. Instead, systems integrators and other value-added service companies bring product knowledge, industry knowledge and pre-built solutions to make these implementations happen better and faster. On a panel about value-added BPM businesses, Pramod Sachdeva of Princeton Blue, Scott Francis of BP3 and Jonathan Sapir of SilverTree brought their perspectives on the role of service providers in the BPM market.

Points covered on the panel included:

  • Customers want to integrate multiple systems, not just build using the BPMS; typically, a BPMS vendor’s professional services group will work only with their own systems, whereas the service providers will help to integrate other capabilities.
  • Service providers can identify and harvest the best capabilities from different systems to provide an integrated solution, rather than trying to do everything with the BPMS tool.
  • BPMS software vendors typically underestimate the level of effort — and the skills required — to bring a solution to full implementation. It’s more than just a demo, and involves more than just the BPMS product.
  • Building a BPM product for developers and building a solution for end-users are quite different, and often the BPMS vendors don’t have the skills to do the latter.
  • Service providers often bring business knowledge about the customer’s industry, and can better put themselves in the customer’s position rather than just focus on selling the technology “feeds and speeds”. Part of this is creating more innovative and engaging user experiences on top of the core BPMS platform, although (in my opinion) these are more likely to come from the smaller boutique firms than the large systems integrators.
  • Business analysts and end users can be involved in building solutions in low-code environments, although these are often simpler or template-based applications.
  • Service providers choose to work with a BPMS platform because it gives them agility and speed in building solutions. Often, they can build a solution that can be reconfigured by the customer, such as through simple rule changes.

Having run a boutique BPM service provider in the past, I have a lot of my own opinions on this topic too, although many of them were covered on the panel. My experience is that in situations that require full development efforts (as opposed to purely low-code), service providers can typically provide solutions that are superior to those from either the vendor or the customer’s internal development group, in terms of quality and innovation of technology and often in terms of business fit. Also, it’s hard to hire the same type of skills within a customer organization, since the ideal skill set for a service provider employee is a degree of curiosity that spans multiple businesses.

After lunch is the BPM analyst panel that I’m speaking on, so I’ll be back once the demo sessions start after that. In the meantime, follow the #bpmNEXT hashtag to hear the buzz.

Positioning Business Modeling panel at bpmNEXT

We had a panel of Clay Richardson of Forrester, Kramer Reeves of Sapiens and Denis Gagne of Trisotech, moderated by Bruce Silver, discussing the current state of business modeling in the face of digital transformation, where we need to consider modeling processes, cases, content, decisions, data and events in an integrated fashion rather than as separate activities. The emergence of the CMMN and DMN standards, joining BPMN, is driving the emergence of modeling platforms that not only include all three of these, but provide seamless integration between them in the modeling environment: a decision task in a BPMN or CMMN model links directly to the DMN model that represents that decision; a predefined process snippet in a CMMN model links directly to the BPMN model, and an ad hoc task in a BPMN model links directly to the CMMN model. The resulting models may be translated to (or even created in) a low-code executable environment, or may be purely for the purposes of understanding and optimizing the business.

Some of the points covered on the panel:

  • The people creating these models are often in a business architecture role if they are being created top down, although bottom-up modeling is often done by business analysts embedded within business areas. There is a large increase in interest in modeling within architecture groups.
  • One of the challenges is how to justify the time required to create these models. A potential positioning is that business models are essential to capturing knowledge and understanding the business even if they are not directly executable, and as organizations’ use of modeling matures and gains visibility with executives, it will be easier to justify without having to show an immediate tangible ROI. Executable models are easier to justify since they are an integrated part of an application development lifecycle.
  • Models may be non-executable because they model across multiple implementation systems, or are used to model activities in systems that do not have modeling capabilities, such as many ERP, CRM and other core operational systems, or are at higher levels of abstraction. These models have strategic value in understanding complexity and interrelationships.
  • Models may be initiated using a model derived from process/data mining to reduce the time required to get started.
  • Modeling vendors aren’t competing against each other, they’re competing against old methods of text-based business requirements.
  • Many models are persistent, not created just for a specific point in time and discarded after use.

A panel including two vendors and an analyst made for some lively conversation, and not a small amount of finger-pointing. 🙂

bpmNEXT 2016

It’s back! My favorite conference of the year, where the industry insiders get together to exchange stories and show what cool stuff they’re working on, bpmNEXT is taking place this week in Santa Barbara. This morning is a special session on the Business of BPM, looking forward at what’s coming in the next few years, with an analyst panel just after lunch that I’ll be participating in. After that, we’ll start on the demos: each presenter has a 5-minute Ignite-style presentation as an intro (20 auto-advancing slides of 15 seconds each) followed by a live demo.

After a brief intro by Bruce Silver, the morning kicked off with Nathaniel Palmer providing an outlook on the next five years of BPM, starting with what we can learn from other areas of digital disruption, where new companies are leveraging infrastructure built by the long-time industry players. He discussed how the nature of work (and processes) is becoming data-driven, goal-oriented, adaptive, and incorporating intelligent automation. His take on what will drive BPM in the next five years is the three R’s: robots (and other smart things), rules, and relationships (really, the data about the relationships). The modern BPMS framework is much more than just process, but includes goal-seeking optimization, event processing, decision management and process management working on events captured from systems and smart devices. We need to redefine work and how we manage tasks, moving away from (or at least redefining) the worklist paradigm. He also suggests moving away from the monolithic integrated BPMS platform in favor of assembling best-of-breed components, although there was some discussion as to whether this changed the definition of a BPMS to steer away from the recent trend that is turning most BPMS into full-fledged application development platforms.

Up next was Neil Ward-Dutton, providing insights into how the CxO focus and influences are changing. Although many companies have a separate perspective and separate teams working on digital business strategy based on their focus — people and knowledge versus processes and things, internal versus external — these are actually all interconnected. The companies most successful at digital transformation recognize this, and create integrated experiences across what other companies may think of as separate parts of their organization, such as breaking down the barriers between employee engagement and external engagement. Smart connected things fill in the gaps of digital transformation, allowing us to not only create digital representations of physical experiences, but also create physical representations of digital experiences. Neil also looked at the issue of changing how we define work and how it gets done: automation, collaboration, making customers full participants in processes, and embracing new interfaces. Companies are also changing how they think about what they do and where their value lies: in the past 40 years, the S&P 500’s market value has changed from primarily tangible assets to primarily intangible assets, with a focus on optimizing customer experiences. In the face of that, there is a high employee turnover in call centers that are responsible for some of those customer experiences, driving the need for new ways to serve and collaborate with customers. He finished with five imperatives for digital work environments: openness, agility, measurability, collaboration and augmentation. Successful implementation of these digital transformation imperatives may allow breaking the rules of corporate strategy, allowing an organization to show excellence in products, customer engagement and operations rather than just along a single axis.

Great start to the conference, with lots of ideas and themes that I’m sure we’ll see echoed in the presentations over the next couple of days.

bpmNEXT 2015 Day 3 Demos: Camunda, Fujitsu and Best In Show

Last demo block of the conference, and we’re focused on case management and unstructured processes.

Camunda, CMMN and BPMN Combined

Jakob Freund presented on OMG’s (relatively) new standard for case management modeling, CMMN, and how they combine it with BPMN to create processes that have a combination of pre-defined flows and case structures. They use the Trisotech CMMN modeler embedded in their environment, running both the CMMN and BPMN on the same engine; they are looking at adding DMN for decision modeling as well. He demonstrated an insurance application example where BPMN is used to model the overall process, with the underwriting subprocess actually being a CMMN model within a BPMN model. The user task list can show a consolidated view of both BPMN tasks and CMMN tasks, or a dedicated UI can be used for a case since it can also show enabled activities that are not yet instantiated (hence would not appear in a task list) as available user actions. BPMN processes can also be triggered from the CMMN model, providing pre-defined process fragments that can be triggered by the case worker to perform standard operations. He also showed their developer workbench, including a full-featured debugger that includes stepwise execution and the ability to execute code at any step. Since their paradigm is to provide process management services to a developer writing in Java, their tooling is more technical than what is found in a no-code or low-code environment. Also, a BPMN font.

Fujitsu: Using Agents to Coordinate Patient Care across Independent Specialists

Keith Swenson finished the demos presenting healthcare research from the University of Granada, which helps to create patient treatment plans based on rules and iterative goal-seeking rather than pre-defined processes. This allows different medical specialists to have their own sets of rules and best practices for dealing with their own specialization; automated agents can combine and negotiate the rules from multiple specialists to create a consolidated treatment plan for patients with multiple conditions, allowing each of the participants to monitor progress. He demonstrated a prototype/sample application that allows each specialist to set out a schedule of actions that make up a treatment plan; the multiple treatment plans are conciliated against each other — basically, modifying a plan by adding steps from another plan — and presented back to the referring physician, who can then select one of the plan processes for execution. He used the IActive Knowledge Studio to show how the plans and rules are designed, and discussed how the processes for the interacting agents would be emergent as they communicate and negotiate.

That’s it for bpmNEXT for me. Great conference, as always. As a matter of disclosure, I was not charged the conference fee to attend, although I paid my own travel and living expenses. A number of the vendors that I have written about here over the past three days are my clients or have been so in the past, but that did not allow them to escape the snarky Twitter comments.

Update: waiting to take off at Santa Barbara airport, and I see from the Twitter stream that SAP won the Best In Show award for their Internet of Everything demo – congratulations! Top five presentations: W4, Camunda, Trisotech, Bonitasoft and BP-3. Kudos all around. 

bpmNEXT 2015 Day 3 Demos: IBM (again), Safira, Cryo

It’s the last (half) day of bpmNEXT 2015, and we have five presentations this morning followed by the Best in Show award. Unfortunately, I have to leave at lunchtime to catch a flight, so you will have to check the Twitter hashtag to see who won — or maybe I’ll do a wrapup post from the road.

IBM: BPM, say Hello to Watson. A New Era of Cognitive Work – Here Today

First up was Chris Vavra discussing how Watson and cognitive computing and natural language analysis capabilities can be used in the context of BPM, acting as an expert advisor to knowledge workers to enhance, scale and accelerate their work with its (or as Chris said, “his”) reasoning capabilities. There are a number of Watson services offered on their Bluemix cloud development platform; he demonstrated an example of an HR hiring process where the HR person uses Watson to analyze a candidate’s personality traits as part of the evaluation process. This is based on a written personal statement provided by the candidate; Watson analyzes that text (or could link through to a personal website or blog) to provide a personality analysis. From the Bluemix developer dashboard, you can create applications that include any of the services, including Watson Personality Insights, which provides rankings on several factors in the five basic personality traits of Openness, Conscientiousness, Extraversion, Agreeableness and Emotional Range, with a graphical representation to highlight values and needs that may be of concern in the hiring process. It’s unlikely that a hiring manager would use solely this information to make a decision, but it’s interesting for exploring a candidate’s personality characteristics as part of the process. There are a number of other Watson-based services available on Bluemix to bind into BPM (and other) applications; in the IBM cloud BPM designer, this just appears as a service connector that can be configured with the Watson authentication information, and invoked at a service step in a process flow. Lots of other potential applications for bringing this level of expert recommendations into processes, such as healthcare condition diagnoses or drug interactions.

Safira: Managing Unstructured Processes with AdHoc BPM Framework

Filipe Pinho Pereira addressed the issue of the long tail of organizations’ processes, where only the high-volume, high-value structured processes are being implemented as full BPM projects by IT, and the long tail of less critical and ad hoc processes that end up being handled manually. Using IBM BPM, he demonstrated their Ad-Hoc BPM Framework add-on that allows a business user to create a new ad-hoc process based on a predefined request-intervention process pattern, which has only an initial data capture/launch step, then a single “do it” human step with a loop that keeps returning to the same step until explicitly completed. The example was an expense report process, where a blank expense spreadsheet was attached, a form created to capture basic data, and SLAs specified. Routing is created by specifying the primary recipient, and notifications that will be issued on start, end and SLA violations. Users can then create an instance of that process (that is, submit their own expense report), which is then routed to the primary recipient; the only routing options at that point are Postpone, Forward and Complete, since it’s in the main human task loop part of the process pattern. This distills ad-hoc processes to their simplest form, where the current recipient of the main task decides on who the next recipient is or whether to complete the task; this is functionally equivalent to an email-based process, but with proper process monitoring and SLA analytics. By looking at the analytics for the process, we saw the number of interventions (the number of times that the human step loop was executed for an instance), and the full history log could be exported to perform mining to detect patterns for process improvement. 
Good example of very simple user-created ad hoc processes based on an industrial-strength infrastructure; you’re not going to buy IBM BPM just to run this, but if you’re already using IBM BPM for your high-volume processes, this add-on allows you to leverage the infrastructure for the long tail of your processes.
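The request-intervention pattern distils to a tiny state machine: one capture step, then a single human step that loops until someone completes it. A sketch (structure and action names inferred from the demo, not Safira’s actual implementation), counting “interventions” the way the demo’s analytics did:

```python
# Toy model of the request-intervention pattern: the current assignee
# can only postpone, forward, or complete; the task loops until done.
class AdHocRequest:
    def __init__(self, initiator, recipient):
        self.history = [("start", initiator)]
        self.assignee = recipient
        self.done = False

    def act(self, action, forward_to=None):
        assert not self.done and action in ("postpone", "forward", "complete")
        self.history.append((action, self.assignee))
        if action == "forward":
            self.assignee = forward_to
        elif action == "complete":
            self.done = True

    @property
    def interventions(self):
        # number of times the human-step loop executed for this instance
        return len(self.history) - 1

req = AdHocRequest("alice", "bob")
req.act("forward", forward_to="carol")
req.act("postpone")
req.act("complete")
```

The history list is exactly the audit trail that lifts this above an email-based process: it can be exported and mined for recurring patterns worth promoting to structured processes.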

Cryo: Tools for Flexibility in Collaborative Processes

Rafael Fazzi Bortolini and Leonardo Luzzatto presented on processes that lie somewhere in the middle of the structured-unstructured spectrum, and how to provide flexibility and dynamic aspects within structured constraints through decision support, flexible operations, ad-hoc task execution and live changes to processes. Demonstrating with their Orquestra BPMS, they showed a standard process task interface with the addition of localized analytics based on the history of that task in order to help the user decide on their actions at that point. Flexible routing options allow the user to return the process to an earlier step, or forward the current task to a colleague for consultation before returning it to the original user at the same step; this does not change the underlying process model, but may move the instance between activities in a non-standard fashion or reassign it to users who were not included in the original process definition. They also have an ad-hoc process pattern, but unlike Safira, they are using actual ad-hoc activities in BPMN, that is, tasks that are not connected by flow lines. Users are presented with the available ad hoc tasks in the process model, allowing them to “jump” between the activities in any order. They also demonstrated live changes to production processes; the examples were adding a field to a form and changing the name of a task in the process, both of which are presumably loaded at runtime rather than embedded within the instantiated process to allow these types of changes.

bpmNEXT 2015 Day 2 Demos: Omny.link, BP-3, Oracle

We’re finishing up this full day of demos with a mixed bag of BPM application development topics, from integration and customization that aims to have no code, to embracing and measuring code complexity, to cloud BPM services.

Omny.link: Towards Zero Coding

Tim Stephenson discussed how extremely low-code solutions could be used to automate marketing processes, in place of using more costly marketing automation solutions. Their Omny.link solution integrates workflow and decisioning with WordPress using JavaScript libraries, with detailed tracking and attribution, by providing forms, tasks, decision tables, business processes and customer management. He demonstrated an actual client solution, with custom forms created in WordPress, then referenced in a WordPress page (or post) that is used as the launch page for an email campaign. Customer information can be captured directly in their solution, or interfaced to another CRM such as Sugar or Salesforce. Marketers interact with a custom dashboard that allows them to define tasks, workflows, decisions and customer information that drive the campaigns; Tim sees the decision tables as a key interface for marketers to create the decision points in a campaign based on business terms, using a format that is similar to an Excel spreadsheet that they might now be using to track campaign rules.
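A decision table of the kind a marketer might maintain can be evaluated with a first-match hit policy; the campaign columns and outcomes here are invented for illustration, not Omny.link’s actual format:

```python
# Rows read like the spreadsheet a marketer already keeps: conditions
# left to right, outcome in the last column, first matching row wins.
WILDCARD = "*"

RULES = [  # (lead_source, score_at_least, next_action)
    ("webinar", 50, "call"),
    ("webinar", 0, "nurture email"),
    (WILDCARD, 80, "call"),
    (WILDCARD, 0, "newsletter"),
]

def decide(lead_source, score):
    """First-match evaluation over the decision table."""
    for rule_source, min_score, action in RULES:
        if rule_source in (lead_source, WILDCARD) and score >= min_score:
            return action
    return None
```

Because the rules live in data rather than code, the marketer can reorder or retune rows without touching the workflow that calls `decide`, which is the low-code appeal Tim described.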

BP-3: Sleep at Night Again: Automated Code Analysis

Scott Francis and Ivan Kornienko presented their new code analysis tool, Neches, which applies a set of rules, based on best practices and anti-patterns from their years of development experience, to identify code and configuration issues in IBM BPM implementations that could adversely impact performance and maintainability. They propose that proper code reviews — including Neches reviews — at the end of each iteration of development can find design flaws as well as implementation flaws. Neches is a SaaS cloud tool that analyzes uploads of snapshots exported from the IBM BPM design environment; it scores each application based on complexity, which is compared to the aggregate of other applications analyzed, and can visualize the complexity score over time compared to found, resolved and fixed issues. The findings are organized by category, and you can drill into the categories to see the specific rules that have been triggered, such as UI page complexity or JavaScript block length, which can indicate potential problems with the code. The specific rules are categorized by severity, so that the most critical violations can be addressed immediately, while less critical ones are considered for future refactoring. Specific unused services, such as test harnesses, can be excluded from the complexity score calculation. An interesting tool for training new IBM BPM developers as well as reviewing the code quality and maintainability of existing projects, leveraging the experience of BP-3 and Lombardi/IBM developers as well as general best coding practices.
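The scoring idea in miniature: each triggered rule contributes a severity weight to an application’s score, with excluded artifacts skipped. Rule names and weights below are illustrative assumptions, not Neches’ actual rule set:

```python
# Severity weights are an assumption; the point is that the complexity
# score is a weighted sum of triggered findings, grouped for drill-down.
SEVERITY = {"critical": 10, "major": 3, "minor": 1}

def score(findings, excluded=()):
    """findings: list of (artifact, rule, severity) tuples."""
    by_severity = {}
    total = 0
    for artifact, rule, sev in findings:
        if artifact in excluded:
            continue  # e.g. test harnesses excluded from the score
        total += SEVERITY[sev]
        by_severity.setdefault(sev, []).append(rule)
    return total, by_severity

total, cats = score(
    [("ui", "page-complexity", "major"),
     ("service", "long-js-block", "critical"),
     ("test-harness", "long-js-block", "critical")],
    excluded={"test-harness"})
```

Tracking `total` per snapshot over successive iterations gives the trend-over-time view described above, with the drill-down coming from the per-severity grouping.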

Oracle: Rapid Process Excellence with BPM in the Public Cloud

Linus Chow presented Oracle’s public cloud BPM service for developing both processes and rules, deployable in a web workspace or via mobile apps. He demonstrated an approval workflow, showing the portal interface, a monitoring view overlaid on the process model, and a mobile view that can include offline mode. The process designer is fully web-based, including forms and rules design; there are also web-based administration and deployment capabilities. This is Oracle’s first cloud BPM release and looks pretty full-featured in terms of human workflow; it’s a lightweight, public cloud refactoring of their existing Oracle BPM on-premise solution, but doesn’t include the business architecture or SOA functionality at this time.

Great day of demos, and lots of amazing conversations at the breaks. We’re all off to enjoy a free night in Santa Barbara before returning for a final morning of five more demos tomorrow.

bpmNEXT 2015 Day 2 Demos: Kofax, IBM, Process Analytica

Our first afternoon demo session included two mobile presentations and one on analytics, hitting a couple of the hot buttons of today’s BPM.

Kofax: Integrating Mobile Capture and Mobile Signature for Better Multichannel Customer Engagement Processes

John Reynolds highlighted the difficulty in automating processes that involve customers if you can’t link the real world — in the form of paper documents and signatures — with your digital processes. Kofax started in document scanning, and they’ve expanded their repertoire to include all manner of capture that can make processes more automated and faster to complete. Smartphones become intelligent scanners and signature capture devices, reducing latency in capturing information from customers. John demonstrated the Kofax Mobile Capture app, both natively and embedded within a custom application, using physical documents and his iPhone; it captures images of a financial statement, a utility bill and a driver’s license, then pre-processes them on the device to remove irregularities that might impact automated character recognition and threshold them to binary images to reduce the data transmission size. These can then be directly injected into a customer onboarding process, with both the scanned image and the extracted data included, for automated or manual validation of the documents to continue the process. He showed the back-end tool used to train the recognition engine by manually identifying the data fields on sample images, which can accept a variety of formats for the same type of document, e.g., driver’s licenses from different states. This is done by a business person who understands the documents, not developers. Similarly, you can also use their Kapow Design Studio to train their system on how to extract information from a website (John was having the demo from hell, and his Kapow license had expired) by marking the information on the screen and walking through the required steps to extract the required data fields.
They take on a small part of the process automation, mostly around the capture of information for front-end processes such as customer onboarding, but are seeing many implementations moving toward an “app” model of several smaller applications and processes being used for an end-to-end process, rather than a single monolithic process application.

IBM: Mobile Case Management and Capture in Insurance

Mike Marin and Jonathan Lee continued on the mobile theme, stressing that mobile is no longer optional for customer-facing and remote worker functionality. They demonstrated IBM Case Manager with an insurance example, showing how mobile capture, content management and case handling can enhance the claims process. Unlike the Kofax scenario, where the customer uses the mobile app, this is a mobile app for a knowledge worker, the claims adjuster, who may need a richer informational context and more functionality, such as document type classification, than a customer would use. They captured the (printed and filled) claims form and a photo of the vehicle involved in the claim using a smartphone, then switched to the more complete case view on a tablet, which showed more case data and related tasks. The supervisor view shows related cases plus a case visualizer with a timeline view of the case. They finished with a look at the new IBM mobile UI design concepts, which presented a more modern mobile interface style, including a high-level card view and smoother transitions between information and functions.

Process Analytica: Process Discovery and Analytics in Healthcare Systems

Robert Shapiro shifted the topic to process mining/discovery and analytics, specifically in healthcare applications. He started with an overview of process mining, simulation and other analytical techniques, and how they integrate with different types of healthcare systems via their history logs. By examining existing processes based on the history data, missed KPIs and their root causes can be identified, and potential solutions derived and compared in a systematic, analytic manner. Using their Optima process analytics workbench, he demonstrated importing and analyzing an event log to create a BPMN model based on the history of events: a complete model that includes interrupting and non-interrupting boundary events, and split and merge gateways based on the patterns of events, with probabilistic weights and/or decision logic calculated for the splitting gateways. Keeping in mind that the log events come from systems that have no explicit process model, the automatic derivation of the boundary events and gateways and their characteristics is a significant step in process improvement efforts, and can be analyzed further using their simulation capabilities. Most of the advanced analysis and model derivation (e.g., for gateway and boundary conditions) depends on capturing data value changes in the event logs, not just activity transitions; this is an important distinction, since many event logs don’t capture that information.
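As a rough illustration of one piece of this derivation — assigning probabilistic weights to the branches of a splitting gateway — the branch frequencies can be counted directly from the traces in an event log. This is a simplified sketch with invented healthcare trace data, not the Optima algorithm:

```python
from collections import Counter, defaultdict

def branch_probabilities(traces):
    """For each activity, estimate the probability of each successor
    activity from observed traces; a derived XOR gateway after that
    activity would carry these weights."""
    successors = defaultdict(Counter)
    for trace in traces:
        for src, dst in zip(trace, trace[1:]):
            successors[src][dst] += 1
    return {src: {dst: n / sum(counts.values()) for dst, n in counts.items()}
            for src, counts in successors.items()}

# Hypothetical patient-flow traces: triage is followed by lab work
# twice and by imaging once, so the split is weighted 2/3 vs 1/3.
traces = [["triage", "lab", "discharge"],
          ["triage", "imaging", "discharge"],
          ["triage", "lab", "discharge"]]
print(branch_probabilities(traces)["triage"])  # lab ≈ 0.67, imaging ≈ 0.33
```

Real miners work the same way in spirit but must also handle noise, concurrency and loops, which is why deriving boundary events and merge gateways is the hard part.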

bpmNEXT 2015 Day 2 Demos: Sapiens Decision, Signavio

We finished the morning demo sessions with two on the theme of decision modeling and management.

Sapiens: How to Manage Business Logic

Michael Grohs highlighted the OMG release of the Decision Model and Notation (DMN) standard, and how a decision model is really a business logic model. However, business rule management systems are typically technical solutions, and don’t do much for business users and analysts trying to model their decision logic and rules based on their policies and procedures. Decision-aware processes extract declarative knowledge from process models, greatly simplifying the process models and moving the declarative information to a model format more suitable to business logic, such as a decision table. BPMS and DMS are complementary, and can be combined to create a complete model of the business process. He demoed their decision modeling and repository tooling, which starts with the definition of a community space that shares a glossary, attributes and models, and has governance workflows for decision model approval and deployment. The glossary allows for the definition of fact types, including multiple synonyms so that different stakeholders can use their own terminology. The decision models are made up of rule families that capture the business logic, with a visual syntax that indicates the rules and conditions making up a particular decision. This can be expanded into a full decision table view that shows the if-then-else logic using the business terms. Different instances of decisions and rule sets can be created — in his demo example, the base insurance policy renewal logic versus that for a hurricane-prone state such as Florida — and visually compared in the graphical or tabular view, with changes highlighted or listed in detail in a report. Rule sets can be validated to highlight conflicts and missing information, then exported in a variety of formats for importing into a DMS for execution.
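The if-then-else logic of a rule family can be pictured as an ordered decision table: each row pairs a condition over the glossary’s fact types with an outcome, and the first matching row wins. The rule content below is invented for illustration (loosely following the renewal example from the demo) and is not Sapiens syntax:

```python
def decide(table, facts):
    """Return the outcome of the first rule whose condition matches."""
    for condition, outcome in table:
        if condition(facts):
            return outcome
    return None  # a validation pass would flag this gap as missing logic

# Base renewal logic: two rules, the last one a catch-all.
renewal_base = [
    (lambda f: f["claims_last_3y"] > 2, "refer to underwriter"),
    (lambda f: True,                    "auto-renew"),
]

# Florida variant: same rule family with an extra hurricane-exposure rule;
# comparing the two instances highlights exactly this added row.
renewal_florida = [
    (lambda f: f["coastal_property"],   "refer to underwriter"),
    (lambda f: f["claims_last_3y"] > 2, "refer to underwriter"),
    (lambda f: True,                    "auto-renew"),
]

facts = {"claims_last_3y": 0, "coastal_property": True}
print(decide(renewal_base, facts))     # auto-renew
print(decide(renewal_florida, facts))  # refer to underwriter
```

Representing both instances in the same tabular shape is what makes the side-by-side comparison and change report shown in the demo possible.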

Signavio: Business Decision Management

Gero Decker talked about their collaborative process design and SAP upgrade tools as an introduction, but mainly addressed decision modeling and how they are embracing the DMN standard: modeling decisions, inputs and knowledge sources, then linking these to a decision activity in a BPMN model. DMN provides a graphical model form, and also allows for decision tables for detailed steps. Like Sapiens, Signavio does only decision modeling, not execution, and exports in standard formats for importing into a DMS such as Drools for execution. They are releasing the Signavio Decision Manager in a few weeks, and he gave us a preview demo of modeling and testing rules integrated with their process modeling environment. Similar to the modeling that we saw from Comindware earlier this morning, Signavio can be used to model higher-level enterprise architecture constructs such as value chains, plus full BPMN models for specific capabilities within those value chains; he used a BPMN model as a jumping-off point for demonstrating decision modeling by creating a business rule task. From that point, you can specify a decision table directly in situ, or choose to create a DMN model, which launches the DMN modeler with the top-level question/answer in the DMN model linked to the business rule activity in the BPMN model. The DMN model can be built out graphically, with data objects defined, rules added with decision tables, and sub-decisions added as required. The DMN modeler can make use of the existing glossary in the Signavio environment for data objects and attributes. The decision tables can be validated to detect conflicts, and test cases can be exported in a spreadsheet format to drive manual or automated testing. They are also doing some work on detecting complex decision logic within BPMN models, with the goal of refactoring the models to externalize the decision logic into DMN models where it makes the BPMN model unnecessarily complex.
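Decision table validation of the kind both vendors showed — detecting conflicting rules — reduces to interval checks when a table has a single numeric input: two rules conflict if their input ranges overlap under a hit policy that requires a unique match. A toy sketch with invented rules, not Signavio’s implementation:

```python
def find_conflicts(rules):
    """Return pairs of rules whose half-open input ranges [lo, hi) overlap;
    with a 'unique' hit policy, any overlap is a conflict to report."""
    conflicts = []
    for i, (name1, lo1, hi1) in enumerate(rules):
        for name2, lo2, hi2 in rules[i + 1:]:
            if lo1 < hi2 and lo2 < hi1:
                conflicts.append((name1, name2))
    return conflicts

# Hypothetical rules over a single 'risk score' input:
# 'low' and 'medium' overlap on [40, 50), so they would be flagged.
rules = [("low",    0, 50),
         ("medium", 40, 80),
         ("high",   80, 100)]
print(find_conflicts(rules))  # [('low', 'medium')]
```

A real validator also checks for gaps (inputs matched by no rule), and a multi-input table needs this overlap test applied across every input column simultaneously.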