OpenText Enterprise World 2019 day 1 keynote

OpenText is holding their global Enterprise World back in Toronto for the third year in a row (meaning that they’ll probably move on to another city for next year — please not Vegas) and I’m here for a couple of days for briefings with the product teams and to sit in on some of the sessions.

I attended a session earlier on connecting content and process that was mostly market research presented by analysts John Mancini and Connie Moore — some interesting points from both of them — before going to the opening keynote with CEO/CTO Mark Barrenechea and a few guests including Sir Tim Berners-Lee.

Barrenechea started with some information about where OpenText is now, including their strong positions in analyst rankings for content services platforms (Content Services), supply chain commerce networks (Business Network) and digital process automation (AppWorks). He believes that we’re “beyond digital”, with a focus on information rather than automation. He announced cloud-first versions of their products coming in April 2020, although some products will also be available on premises. Their OT2 Cloud Platform will be sold on a service model; I’m not sure if it’s a full microservice implementation, but it sounds like it’s at least moving in that direction. They’ve also announced a new partnership with Google, with Google Cloud being their preferred platform for customers and the integration of Google services (such as machine learning) into OpenText EIM; this is on a similar scale to what we’ve seen between Alfresco and Amazon AWS.

The keynote finished with a talk by Sir Tim Berners-Lee, inventor of the World Wide Web, on how the web started, how it’s now used and abused, and what we all can do to make it better.

What’s hot this summer? @Camunda Day NYC 2019

Robert Gimbel of Camunda

I popped down to a steamy New York today for the half-day local Camunda Day, which was a good opportunity to see an update on their customer-facing messaging and also hear from some of their customers. It was a packed agenda, starting with Robert Gimbel (Chief Revenue Officer) on best practices for successful Camunda projects. Since he’s in charge of sales, some amount of this was about why to choose the enterprise edition over the community edition, but there were lots of good insights for all types of customers, many even applicable to other BPM products. Although he characterized the community edition as suited to lower complexity and business criticality, I know there are Camunda customers using the open source version on mission-critical processes; however, these organizations have made a larger developer commitment to have in-house experts who can diagnose and fix problems as required.

Gimbel outlined the four major types of projects, which are similar to those that I’ve seen with most enterprise clients:

  • Automating manual work
  • Migrating processes from other systems, whether a legacy BPMS, an embedded workflow within another system, or a homegrown workflow system
  • Adding process management to a software product that has no (or inflexible) workflows, such as an accounts payable system
  • Providing a centralized workflow infrastructure as part of a digital automation platform, which is what I talked about in my bpmNEXT keynote

They are seeing a typical project timeline of 3-9 months from initiation to go-live, with the understanding that the initial deployment will continue to be analyzed and improved in an agile manner. He walked through the typical critical success factors for projects, which include “BPMN and DMN proficiency for all participants”: something that is not universally accepted among BPM vendors and practitioners. I happen to agree that there is a lot of benefit in everyone involved learning some subset of BPMN and DMN; it’s a matter of what that subset is and how it’s used.

We had a demo by Joe Pappas, a New York-based senior technical consultant, which walked us through using Cawemo (now free!) for collaborative modeling by the business, then importing, augmenting, deploying and managing an application that included both a BPMN and a DMN model. He showed how to detect and resolve problems in operational systems, and finished with building new reports and dashboards to display process analytics.
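
For anyone who hasn’t seen that deployment step, here’s a minimal sketch of deploying and starting a process application with the Camunda Java API; the engine configuration, file names and process key are illustrative rather than taken from Joe’s demo.

```java
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.runtime.ProcessInstance;

public class DeployAndRun {
  public static void main(String[] args) {
    // Assumes a camunda.cfg.xml (or Spring Boot starter) on the classpath to configure the engine
    ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();

    // Deploy the BPMN process and DMN decision models to the engine (illustrative file names)
    engine.getRepositoryService().createDeployment()
        .addClasspathResource("order-fulfilment.bpmn")
        .addClasspathResource("discount-rules.dmn")
        .deploy();

    // Start an instance of the deployed process by its process definition key
    ProcessInstance instance = engine.getRuntimeService()
        .startProcessInstanceByKey("order-fulfilment");
    System.out.println("Started process instance " + instance.getId());
  }
}
```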

John Fontaine, Capital One

The first half of the morning finished with a presentation from John Fontaine, Master Software Engineer at Capital One (a Camunda customer), on organizing a Camunda hackathon. As an aside, this is a great topic for getting a customer involved who can’t talk directly about their BPM implementation due to privacy or intellectual property concerns. They had a 2-day event with 42 developers in 6 teams, plus product and process owners/managers — the latter are a bit less common as hackathon participants, but everyone was expected to work collaboratively and have fun.

Capital One started with a problem brief in terms of the business case and required technical elements, and a specific judging rubric for evaluating the projects. Since many of the participants were relatively new to Camunda and BPMN, they included some playful uses of BPMN such as the agenda. The first morning was spent on ideation and solution selection, with the afternoon spent creating the BPMN models and creating application wireframes. On the second day, the morning was spent on completing the coding and preparing their demo, with the afternoon for the team demos.

Fontaine finished up with lessons learned across all aspects of the hackathon, from logistics and staffing to attendee recruiting and organization, agenda pacing and milestones, judging, and resource materials such as code samples. Their goal was not to create applications ready for deployment, but a couple of the teams created applications that have become a trigger for ongoing projects.

After the break, we heard from Bernd Ruecker, co-founder of Camunda and now in the role of developer evangelist, on workflow automation in a microservices architecture. He has been writing and speaking on this topic for a while now, including some key points that run counter to many BPM vendors’ views of microservices, and even counter to some of Camunda’s previous views:

  • Every workflow must be owned by one microservice, and workflows live inside service boundaries. This means no monolithic (end-to-end) BPMN models for execution, although the business will likely create higher-level non-executable models that show an end-to-end view.
  • Event-driven architecture for passing information between services in a decoupled manner, although it’s necessary to keep a vision of an overall flow to avoid unexpected emergent behaviors. This can still be accomplished with messaging, but you need to think about some degree of coupling by emitting commands rather than just events: a balance of orchestration and choreography.
  • Microservices are, by their nature, distributed systems; however, there is usually a need for some amount of stateful orchestration, such as is provided by a BPM engine.

From Bernd Ruecker’s blog post

Ruecker talked about the different modes of communication — message/event bus, REST-ish command-type events between services, or using a BPM engine as a work distributor for external services — with the note that it’s possible to do good microservices architecture with any of these methods. He noted that the last scenario (using a BPM engine as the overall service orchestrator) is not necessarily best practice; he is looking more at using the engine at a lower granularity, where a BPM engine is encapsulated in each service that requires it. Check out his blog post on microservices workflow automation for more details.
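
To make the per-service engine and work-distribution pattern concrete, here’s a minimal sketch using Camunda’s Java external task client, where a microservice subscribes to a topic and completes work items; the endpoint, topic name and process variable are illustrative assumptions, not from Ruecker’s talk.

```java
import org.camunda.bpm.client.ExternalTaskClient;

public class PaymentWorker {
  public static void main(String[] args) {
    // Each microservice runs its own worker, long-polling the engine's REST API for work on its topic
    ExternalTaskClient client = ExternalTaskClient.create()
        .baseUrl("http://localhost:8080/engine-rest")   // illustrative engine endpoint
        .asyncResponseTimeout(10_000)
        .build();

    client.subscribe("charge-payment")                   // illustrative topic name
        .lockDuration(20_000)
        .handler((externalTask, externalTaskService) -> {
          String orderId = externalTask.getVariable("orderId");
          System.out.println("Charging payment for order " + orderId);
          // ... the service's own business logic goes here ...
          externalTaskService.complete(externalTask);    // report completion back to the engine
        })
        .open();
  }
}
```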

The (half) day finished with Frederic Meier, Camunda’s head of sales for North America, in conversation with Michael Kirven, VP of IT Business Solutions at People’s United Bank, about their Camunda implementation in lending, insurance, wealth management and other business applications. They opened it up to the audience of mostly financial services customers to talk about their use cases, which included esoteric scenarios such as video processing (passing a video through transcoding and other services), and more mainstream examples such as a multi-account closure. This gave an opportunity for prospects and less-experienced customers to ask questions of the battle-hardened veterans who have deployed multiple Camunda applications.

Great content, and definitely worthwhile for the 40-50 people in attendance.

bpmNEXT 2019 wrapup: coverage from others plus my keynote video

Finally getting around to going through all of the other coverage of bpmNEXT, and reviewing the video of my keynote.

This is the first time that I’ve presented these concepts in this presentation format, and I definitely have ideas about how to make this clearer: there are some good use cases to include in more detail, plus counter-use cases where a microservices approach doesn’t fit.

All of the presentation videos are now available online; check out the entire playlist here.

Kris Verlaenen from Red Hat, in addition to presenting his own session on automating human-centric processes with machine learning, posted his impressions in five posts. He also went back and updated them with the videos of each session:

  • Day 1, covering the two keynotes by Nathaniel Palmer and Jim Sinur, and the initial demo session by Appian.
  • Day 1 Part 2, covering demo sessions by BP Logix, Minit, Cognitive Technology, Kissflow, Wizly and IBM.
  • Day 2, covering my keynote, demo sessions by Trisotech and Method & Style, and a panel on decision services and machine learning.
  • Day 2 Part 2, covering demo sessions by Bonitasoft, Signavio and Flowable, plus a panel on the value proposition of intelligent automation.
  • Day 3, covering demo sessions by Serco, Fujitsu, Red Hat (his own presentation) and SAP, wrapping up with the discussion on the DMN TCK.

Great coverage; he and I sometimes see different things in the same demo, and it’s good to read someone else’s views.

Keith Swenson wrote a summary post for the three keynotes including some detailed criticisms of my keynote; I’ll definitely be reviewing these for improving the presentation and reworking how I present some of the concepts. He also wrote a post about the DMN TCK (technical compatibility kit) efforts, now three years in, and some of the success that they’re seeing in helping to standardize the use of DMN.

Another great year of bpmNEXT.

bpmNEXT 2019 demo: intelligent BPM by @SAP plus DMN TCK working group

ML, Conversational UX, and Intelligence in BPM, with Andre Hofeditz and Seshadri Sreeniva of SAP plus DMN TCK update

We’re at the end of bpmNEXT for another year, and we have one last demo. Seshadri showed a demo of their intelligent BPM for an employee onboarding process (integrated with SuccessFactors), where the process can vary widely depending on level, location and other characteristics. This exposes the pre-defined business processes in SuccessFactors, with configuration tools for customizing the process by adding and modifying building blocks to create a process variant for a special case. Decisions involved in the processes can also be configured, as well as dashboards for viewing the processes in flight. Extension workflows can be created by selecting a standard process “recipe” from a SuccessFactors library, then configuring it for the specific use; he showed an example of adding an equipment provisioning extension as a service task to one of the top-level process models. He demonstrated a voice-controlled chatbot interface for interacting with processes, allowing a manager to ask what’s happening for them today, and get back information on the new employee onboardings in progress, expected delays, and a link to his task inbox. Tasks can be displayed in the chat interface, and approvals accepted via voice or typed chat. The chatbot uses AI to determine the intent of the input and provide a precise and accurate response, and uses ML to predict the time required to complete processes that are in flight when asked about completion times and possible delays. The chatbot can also make decision table-based recommendations, such as creating an IT ticket to assign roles to the new employee and find a desk location. He showed the interface for designing and training the bot capabilities, where a designer can create a new conversational AI skill based on conditions, triggers and actions to take. This is currently a lab preview, but will be rolled out as part of their cloud platform workflow (not unique to the SuccessFactors environment) in the coming months.

Decision Model and Notation Technology Compatibility Kit update with Keith Swenson

We finished off bpmNEXT 2019 with an update on the DMN TCK, that is, the set of tools provided for free for vendors to test their implementation of DMN. The TCK provides DMN 1.2 models plus sets of input data and expected results; a runner app calls the vendor engine, compares the results and exports them as a CSV file to show compliance. In the three years since this was kicked off, eight vendors are showing results across more than 1000 test cases, with another vendor about to join the list and add another 600 test cases. The test cases are determined through manual examination of the standard specification, so this represents a significant amount of work to create a robust set of compliance tests. The TCK group is not creating the standard, but testing it; however, Keith identified some opportunities for the TCK to be more proactive in defining some things such as error handling behavior that the revision task force (RTF) at OMG are unlikely to address in the near term. He also pointed out that there are many more vendors claiming DMN compatibility than have demonstrated that compatibility with the TCK.
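
For illustration, a runner of the kind described has roughly this shape: load a test case, call the vendor engine, compare against the expected results, and write a CSV row. This is a hypothetical sketch, not the actual TCK runner code; the adapter interface and test case record are my own assumptions.

```java
import java.io.PrintWriter;
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class TckRunnerSketch {

  // Hypothetical adapter that each vendor would implement around their own DMN engine
  interface DmnEngineAdapter {
    Map<String, Object> evaluate(String modelPath, String decisionName, Map<String, Object> inputs);
  }

  // Hypothetical shape of one test case: a model, input data, and the expected results
  record TestCase(String id, String modelPath, String decisionName,
                  Map<String, Object> inputs, Map<String, Object> expected) {}

  static void run(DmnEngineAdapter engine, List<TestCase> cases, String csvPath) throws Exception {
    try (PrintWriter out = new PrintWriter(csvPath)) {
      out.println("testCase,result");
      for (TestCase tc : cases) {
        Map<String, Object> actual =
            engine.evaluate(tc.modelPath(), tc.decisionName(), tc.inputs());
        // A test passes only if the engine's results match the expected values
        boolean pass = Objects.equals(actual, tc.expected());
        out.printf("%s,%s%n", tc.id(), pass ? "SUCCESS" : "FAILURE");
      }
    }
  }
}
```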

That’s it for bpmNEXT 2019 – always feels like it’s over too soon, yet I leave with my brain stuffed full of so many good ideas. We’ve done the wrapup survey and are heading off to lunch, but the results for Best in Show won’t come out until I’m already on my way to the airport.

bpmNEXT 2019 demos focused on creating smarter processes: decisions, RPA, emergent processes and machine learning with Serco, @FujitsuAmerica and @RedHat

A Well-Mixed Cocktail: Blending Decision and RPA Technologies in 1st Gen Design Patterns, with Lloyd Dugan of Serco

Lloyd showed a scenario of using decision management to determine if a step could be done by RPA or a human operator, then modeling the RPA “operator” as a role (performer) for a specific task and dynamically assigning work – this is instead of refactoring the BPMS process to include specific RPA robot service tasks. This is shown from an actual case study that uses Sapiens for decision management and Appian for case/process management, with Kapow for RPA. The focus here is on the work assignment decisioning, since the real-world scenario is managing work for thousands of heads-down users, and the redirection of work to RPA can have huge overall cost savings and efficiency improvement even for small tasks such as logging in to the multiple systems required for a user to do work. The RPA flow was created, in part, via the procedural documentation wiki that is provided to train and guide users, and if the robot can’t work a task through to completion then it is passed off to a human operator. The “demo” was actually a pre-recorded screen video, so more like a presentation with a few dynamic bits, but gave an insight into how DM and RPA can be added to an existing complex process in a BPMS to improve efficiency and intelligence. Using this method, work can gradually be carved off and performed by robots (either completely or partially) without significantly refactoring the BPMS process for specific robot tasks.
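
Conceptually, the work-assignment decisioning might look like the sketch below; the attributes, task types and rules are invented for illustration and are far simpler than the Sapiens decision models in the actual system.

```java
public class WorkRouter {

  enum Performer { ROBOT, HUMAN }

  // Decide who performs a task; in the real system this is a decision model,
  // here it is just a couple of invented rules
  static Performer assign(String taskType, boolean structuredInput, int priorExceptions) {
    boolean robotCapableTask = "system-login".equals(taskType) || "data-entry".equals(taskType);
    if (robotCapableTask && structuredInput && priorExceptions == 0) {
      return Performer.ROBOT;   // dynamically route to the RPA "operator" role
    }
    return Performer.HUMAN;     // anything messy falls back to a human work queue
  }

  public static void main(String[] args) {
    System.out.println(assign("system-login", true, 0));  // ROBOT
    System.out.println(assign("data-entry", false, 2));   // HUMAN
  }
}
```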

Emergent Synthetic Process, with Keith Swenson of Fujitsu

Keith’s demo is based on the premise that although business processes can appear to be simple on the surface when you look at that original clean model, the reality is considerably messier. Instead of predefining a process and forcing workers to follow that in order, he shows defining service descriptions as tasks with their required participants and predecessor tasks. From that, processes can be synthesized at any point during execution that meet the requirements of the remaining tasks; this means that any given process instance may have the tasks in a different order and still be compliant. He showed a use case of a travel authorization process from within Fujitsu, where a travel request automatically generates an initial process – all processes are a straight-through series of steps – but any changes to the parameters of the request may modify the model. This is all based on satisfying the conditions defined by the dependency graph (e.g., departmental manager requires that the manager approve before they can approve it), starting with the end point and chaining backwards through the graph to create the series of steps that have to be performed. Different divisions had different rules around their processes; for example, the Mexico group did not have departmental levels, so it did not have one of the levels of approval. Adding a step to a process is a matter of adding it as a prerequisite for another task; the new step will then be added to the process and the underlying dependency graph. As an instance executes, the completed tasks become fixed as history but the future tasks can change if there are changes to the tasks dependencies or participants. This methodology allows multiple stakeholders to define and change service descriptions without having a single process owner controlling the end-to-end process orchestration, and have new and in-flight processes generate the optimal path forward.
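
Here’s a minimal sketch of that backward-chaining idea: each task declares its prerequisites, and a step sequence is synthesized by walking back from the goal. The task names and dependency structure are made up, and the real implementation handles conditions and participants, not just ordering.

```java
import java.util.*;

public class EmergentProcessSketch {

  static List<String> synthesize(String goal, Map<String, List<String>> prerequisites) {
    List<String> order = new ArrayList<>();
    visit(goal, prerequisites, new HashSet<>(), order);
    return order;  // prerequisites first, goal last
  }

  static void visit(String task, Map<String, List<String>> prereqs,
                    Set<String> seen, List<String> order) {
    if (!seen.add(task)) return;                       // already scheduled
    for (String p : prereqs.getOrDefault(task, List.of())) {
      visit(p, prereqs, seen, order);                  // schedule prerequisites first
    }
    order.add(task);
  }

  public static void main(String[] args) {
    Map<String, List<String>> prereqs = Map.of(
        "book-travel", List.of("director-approval"),
        "director-approval", List.of("manager-approval"),
        "manager-approval", List.of("submit-request"));
    System.out.println(synthesize("book-travel", prereqs));
    // [submit-request, manager-approval, director-approval, book-travel]
  }
}
```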

Automating Human-Centric Processes with Machine Learning, with Kris Verlaenen of Red Hat

Kris demonstrated working towards an automated process using machine learning (random forest model) in incremental small steps: first, augmenting data, then recommending the next step, and finally learning from what happened in order to potentially automate a task. The scenario was provisioning a new laptop inside an organization through their IT department, including approval, ordering and deployment to the employee. He started with the initial manual process for the first part of this – order by employee, quote provided by vendor, and approval by manager – and looked at how ML could monitor this process over many execution instances, then start providing recommendations to the manager on whether to approve a purchase or not based on parameters such as the requester and the laptop brand. Very consistent history will result in high confidence levels of the recommendation, although more realistic history may have lower confidence levels; the manager can be presented with the confidence level and the parameters on which that was based along with the recommendation itself. In case management scenarios with dynamic task creation, the ML can also make recommendations about creating tasks at a certain stage, such as a new task to notify the legal department when the employee is in a certain country. Eventually, this can make recommendations about how to change the initial process/case model to encode that knowledge as new rules and activities, such as adding ad hoc tasks for the tasks that were being added manually, triggered based on new rules detected in the historical instances. Kris finished with the caveat that machine learning algorithms can be biased by the training data and may not learn the correct behavior; this is why they look at using ML to assist users before incorporating this learned behavior into the pre-defined process or case models.
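
To make the recommendation step concrete, here is a deliberately simplified stand-in: instead of the random forest model Kris described, it just computes the approval rate over matching historical requests and surfaces that consistency as a confidence value for the manager. The field names and sample data are invented.

```java
import java.util.List;

public class ApprovalRecommender {

  record PastRequest(String requester, String laptopBrand, boolean approved) {}
  record Recommendation(boolean approve, double confidence) {}

  // Look at historical requests with the same parameters and turn their consistency
  // into a recommendation plus a confidence value
  static Recommendation recommend(String requester, String laptopBrand, List<PastRequest> history) {
    List<PastRequest> similar = history.stream()
        .filter(r -> r.requester().equals(requester) && r.laptopBrand().equals(laptopBrand))
        .toList();
    if (similar.isEmpty()) {
      return new Recommendation(false, 0.0);   // no history, no confidence
    }
    double approvalRate = similar.stream().filter(PastRequest::approved).count()
        / (double) similar.size();
    return new Recommendation(approvalRate >= 0.5, Math.max(approvalRate, 1 - approvalRate));
  }

  public static void main(String[] args) {
    List<PastRequest> history = List.of(
        new PastRequest("kris", "Lenovo", true),
        new PastRequest("kris", "Lenovo", true),
        new PastRequest("kris", "Lenovo", false));
    System.out.println(recommend("kris", "Lenovo", history));  // approve=true, confidence around 0.67
  }
}
```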

bpmNEXT 2019 demos: microservices, robots and intentional processes with @Bonitasoft @Signavio and @Flowable

BPM, Serverless and Microservices: Innovative Scaling on the Cloud with Philippe Laumay and Thomas Bouffard of Bonitasoft

Turns out that my microservices talk this morning was a good lead-in to a few different presentations: Bonitasoft has moved to a serverless microservices architecture, and they walked through the pros and cons of this approach. Their key reason was scalability, especially where platform load is unpredictable. The demo showed an example of starting a new case (process instance) in a monolithic model under no load conditions, then the same with a simulated load, where the user response in the new case was significantly degraded. They then demoed the same scenario but scaling the BPM engine by deploying it multiple times in componentized “pods” in Kubernetes, where Kubernetes can automatically scale up further as load increases. This time, the user experience on the loaded system was considerably faster. This isn’t a pure microservices approach in that they are scaling a common BPM engine (hence a shared database even if there are multiple process servers), not embedding the engine within the microservices, but it does allow for easy scaling of the shared server platform. This requires cluster management for communicating between the pods and keeping state in sync. The final step of the demo was to externalize the execution completely to AWS Lambda by creating a BPM Lambda function for a serverless execution.

Performance Management for Robots, with Mark McGregor and Alessandro Manzi of Signavio

Just like human performers, robots in an RPA scenario need to have their performance monitored and managed: they need the right skills and training, and if they aren’t performing as expected, they should be replaced. Signavio does this by using their Process Intelligence (process mining) to discover potential bottleneck tasks where RPA could be applied, and to create a baseline for the pre-RPA processes. Having identified tasks that could be automated using robots, Alessandro demonstrated how they could simulate scenarios with and without robots, including cost and time. All of the simulation results can be exported as an Excel sheet for further visualization and analysis, although their dashboard tools provide a good view of the results. Once robots have been deployed, they can use process mining again to compare against the earlier analysis results as well as to see performance trends. In the demo, we saw that the robots at different tasks (potentially from different vendors) could have different performance results, with some requiring replacement, upgrading or removal. He finished with a demo of their “Lights-On” view that combines process modeling and mining, where traffic lights linked to the mining performance analysis are displayed in place on the model in order to make changes more easily.
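
The with-and-without-robots comparison comes down to arithmetic along these lines; the durations, costs and volumes below are invented for illustration, not numbers from the Signavio demo.

```java
public class RpaSimulationSketch {

  // One scenario: who performs the task, how long it takes, and what a minute of their time costs
  record Scenario(String name, double minutesPerTask, double costPerMinute) {
    double totalCost(int volume)  { return volume * minutesPerTask * costPerMinute; }
    double totalHours(int volume) { return volume * minutesPerTask / 60.0; }
  }

  public static void main(String[] args) {
    int monthlyVolume = 10_000;
    Scenario human = new Scenario("human clerk", 6.0, 0.75);
    Scenario robot = new Scenario("RPA robot", 1.5, 0.10);
    for (Scenario s : new Scenario[] {human, robot}) {
      System.out.printf("%s: %.0f hours, $%.0f per month%n",
          s.name(), s.totalHours(monthlyVolume), s.totalCost(monthlyVolume));
    }
  }
}
```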

The Case of the Intentional Process, with Paul Holmes-Higgin and Micha Kiener of Flowable

The last demo of the day was Flowable showing how they combined trigger, sentry, declarative and stage concepts from CMMN with microprocesses (process fragments) to contain chatbot processes. Essentially, they’re using a CMMN case folder and stages as intelligent containers for small chatbot processes; this allows, for example, separation and coordination of multiple chatbot roles when dealing with a multi-product client, such as a banking client that does both business banking and personal investments with the bank. The chat needs to switch context in order to provide the required separation of information between business and personal accounts. “Intents” as identified by the chatbot AI are handled as inbound signals to the CMMN stages, firing off the associated process fragment for the correct chatbot role. The process fragment can then drive the chatbot to walk the client through a process for the requested service, such as KYC and signing a waiver for onboarding with a new investment category, in a context-sensitive manner that is aware of the customer scenario and what has happened already. The chatbot processes can even hand the chat over to a human financial advisor or other customer support person, who would see the chat history and be able to continue the conversation in a manner that is seamless to the client. The digital assistant is still there for the advisor, and can detect their intentions and privately offer to kick off processes for them, such as preparing a proposal for the client, or prevent messages that may violate privacy or regulatory compliance. The advisor’s task list contains tasks that may be the result of conversations such as this, but will also include internally created and assigned tasks. The advisor can also provide a QR code to the client via chat that will link to a WhatsApp (or other messaging platform) version of the conversation: less capable than the full Flowable chat interface since it’s limited to text, but preferred by some clients. If the client changes context, in this case switching from private banking questions to a business banking request, the chatbot can switch seamlessly to responding to that request, although the advisor’s view would show separate private and business banking cases for regulatory reasons. Watch the video when it comes out for a great discussion at the end on using CMMN stages in combination with BPMN for reacting to events and context switching. It appears that chatbots have officially moved from “toy” to “useful”, and CMMN just got real.

bpmNEXT 2019 demos: automation services with @Trisotech and @bpmswatch

The day started with my keynote on rolling your own digital automation platform using BPM and microservices, which set the stage for the two demos and the round table discussion that followed.

Business Automation as a Service, with Denis Gagne of Trisotech

Denis demoed a new product release from Trisotech, their business automation as a service platform: competing with services such as Zapier and IFTTT but with better process and decision management, and more complex service types available for integration. He showed creating a service built on a Twitter trigger, using BPMN to model the orchestration and FEEL as the scripting language in script activities, and incorporating a machine learning sentiment score and a decision service for categorizing the results, with the result displayed in the color of a flashing smart light bulb. Every service created exposes an OpenAPI definition and a REST API by default, and is deployed as a self-contained microservice. He showed a more complex example of marketing automation that extracts data from an input form, uses a geo-locator to find the customer location, uses a DMN decision model to assign it to a sales team based on geography and other form parameters, then creates a lead in Microsoft Dynamics CRM. He finished up with an RPA task example that included the funniest execution of an “I am not a robot” CAPTCHA ever. The key point here is that Trisotech has moved from being a pure modeling vendor into the execution space, integrating with any OpenAPI-described service, and deployable across a number of different cloud platforms using standard protocols. Looking forward to playing around with this.
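
Since each deployed service exposes a REST API, calling one from outside would look roughly like this; the endpoint URL and payload are hypothetical, with the real contract coming from the generated OpenAPI definition.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LeadAssignmentClient {
  public static void main(String[] args) throws Exception {
    // Hypothetical input for a deployed lead-assignment decision service
    String body = """
        {"region": "Northeast", "productInterest": "automation", "companySize": 250}
        """;
    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://example.com/automation/lead-assignment/run"))  // illustrative URL
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build();

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());
    System.out.println(response.body());  // e.g. {"salesTeam": "NE-Enterprise"}
  }
}
```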

Business-Composable Services for the Mortgage Industry, with Bruce Silver of Method and Style

Bruce showed the business automation services that he’s created using Trisotech’s platform for the mortgage industry. Although he started looking at decision services around how to determine if someone should be approved for a mortgage (or how large of a mortgage), process was also required to do things like handle mapping and validation of data. Everything is driven by a standard application form and a standard set of underwriting rules used in the US mortgage industry, although this could be modified to suit other markets with different rules. The DMN rules are written in business-readable language, allowing them to be changed by non-developers. The BPMN process does the data validation and mapping before invoking the underwriting decision service. The entire process can be published as a service to be called from any environment, such as a web app used by underwriters inside a financial company or by an online prequalification review done directly by the consumer. The plan is to make these models and services available to see what the adoption is like, to help highlight the value and drive the usage of BPMN and DMN in practice.

Industry Round Table: The Coming Impact of Decision Services and Machine Learning on Business Automation

We finished the morning of day 2 with a discussion that included three of the earlier demo presenters: Denis Gagne, Bruce Silver and Scott Menter. They each gave a short talk on how decision services and machine learning are changing the automation landscape. Some ideas discussed:

  • It’s still up in the air whether DMN will “cross the chasm” and become generally used (to the same degree as, for example, BPMN); this means that vendors need to fully support it, potentially as an execution as well as requirements language.
  • Having machine learning algorithms expressed as DMN can improve transparency of decisions, which is essential in some jurisdictions (e.g., GDPR). There is a need for “explainable AI”.
  • The population using DMN is smaller than that using BPMN, and the required skill level is higher, although still well within the capabilities of data-focused business people who are comfortable with formulas and expression languages.
  • There’s a distinction between symbolic (rules-based) and sub-symbolic (neural network) AI algorithms in terms of what they can do and how they perform; sub-symbolic AI is more of a black box in terms of decision transparency.
  • If we here at bpmNEXT aren’t thinking about the ethics of automation, who will? Consider the labor disruption of automation, or decisions that make a choice involving the value of life (the AI “trolley problem”), or old norms used as training data to create biased machine learning.
  • We’re still in a culture of having people at a certain skill level (e.g., surgeons, pilots) make their own decisions, although they might be advised by AI. How soon before we accept automated decisions at that level?
  • Individually-targeted decisions are happening now by what is presented to specific people through platforms like Google Search and Amazon. How is our behavior being controlled by the limited set of options presented to us?
  • The closer that a technology gets to the end effect, the more responsibility that the creator of the technology needs to take in how it is used.
  • Machine learning may be the best way to discover the best transparent decision logic from human action (unfortunately that will also include the human biases), allowing for people to understand how and why specific decisions are made.
  • When AI is a black box, it needs to be understood as being a black box, so that adequate constructs can be created around it for testing and usage.

Great discussion and audience participation, and a good follow-on from the two demos that showed decision services in action.

bpmNEXT 2019 demos: citizen development, process analysis and AI-driven automation with @kissflow Wizly and @IBM

Is the Citizen Developer Story a Fairytale? by Neil Miller of Kissflow

Given that Kissflow provides a low-code BPM platform, Neil’s answer is that citizen developers are not, in fact, unicorns: given the right tools, non-developers can build their own applications. Their platform allows a citizen developer to create a process-based application by defining a form, then a related process using a flowchart notation. Forms can link to internally-defined (or imported) data sources, and process steps can include links to webhooks to access external services. Simple but reasonably powerful capabilities, easy enough for non-technical analysts and business users to create and deploy single-form applications for their own use and to share with others. He also showed us the new version that is being released next month with a number of new features and tools, including more powerful integration capabilities that are still well within the reach of citizen developers. The new version also includes completely new functionality for unstructured collaborative scenarios, which can include conversation streams and tasks, plus Kanban boards for managing projects and tasks. There’s still a lot missing for this to handle core processes (e.g., versioning, testing), but it’s good for administrative, situational and collaboration processes.

Insightful Process Analysis, by Jude Chagas-Pereira of Wizly, Frank Kowalkowski of Knowledge Consultants, Inc., and Gil Laware of Information by Design

Wizly provides a suite of analysis tools including process analytics, using process mining and other techniques in a demo focused on improving an airline’s call center performance. Jude showed how they can compare process history data against a pre-defined model for conformance checking, and apply a broad range of analysis techniques to discover correlations between activities and customer satisfaction. They can also generate a “DNA analysis” and other data visualizations, then filter and re-slice the data to home in on the problems. The main tabular interface is similar to Excel-type filtering and pivot charts, so understandable to most business data analysts, with visualizations and extra analytical tools to drive out root causes. This set of process analytics is just part of their suite: they can apply the same tools to other areas such as master data management. We had a previous look at this last year under the name Aftespyre: Frank pointed out that he and Gil develop the intellectual property of the analytical models, while Jude’s company does the tool implementation.

Improving the Execution of Work with an AI Driven Automation Platform, by Kramer Reeves, Michael Lim and Jeff Goodhue of IBM

Jeff took us through a demo of their Business Automation Workflow Case Builder, which is a citizen developer tool for creating case and content-centric applications that can include processes, decisions and services created by technical developers, layered on a simpler milestone-based flow. Checklists are built in for task management and assignment, allowing a business user to create an ad hoc checklist and assign tasks to other users at any point in the case. We also saw the process task interface with an attended RPA bot invoked by the user as a helper to open the task, extract related data from a legacy interface, then update and dispatch the task. Alongside the process task interface, he showed us using a conversational interface to their Watson AI to ask what types of accounts the client has, and what documents they have for the client. We also saw the integration of AI into a dashboard to make decision recommendations based on historical data. He finished with their new Business Automation Studio low-code design environment, where we saw how the citizen developer can add integrations that were created by technical developers, and create new pages in a page flow application. It’s taken a long time for IBM to bring together their entire automation platform based on a number of past acquisitions, but now they appear to have a fairly seamless integration between case/content and process (BPM) applications, with low code and Watson sprinkled in for good measure. They’re also trying to move away from their monolithic pricing models to a microservices pricing model, even though their platforms are pretty monolithic in structure: Mike made the point that customers only pay for what they use.

That’s it for day 1 of bpmNEXT 2019; tomorrow morning I’ll be giving a keynote before we start back into demo rounds.

Machine learning and process mining at bpmNEXT 2019 with BP Logix, Minit and Cognitive Technology

Note that Kris Verlaenen, jBPM project lead at Red Hat, is also blogging from here, check out his coverage for a different view.

Democratizing Machine Learning with BPM, by Scott Menter and Joby O’Brien of BP Logix

We’re now into the full demo sessions at bpmNEXT, and Scott and Joby are up to talk about how they’re making machine learning more accessible to non-data scientists and integrating it into their BPM tool, Process Director. They do this by creating a learner object that pulls in data from an external source, then configuring the system to select the predicted data field, the algorithm to use and the input data features to use for prediction. Their example is whether an employee is at risk of leaving the company (possibly a gentle dig at a bigger company making the same sort of predictions), so they select one or more input values from the employee data set such as amount of travel and income. They have some nice visualization tools to use while building the learner object, selecting a couple of input features to see which may be the most interesting in the prediction, and can then create the learner object so that it can update forms as data is entered, such as during a performance review. This now allows the output from a fairly sophisticated ML object that is analyzing past data to be used just like any other rule or data source in their BPMS. In general, their tools can be used by someone who knows data science to create learner objects for other people to consume in their processes, but they can also be used by those without a lot of data science knowledge to create simple but powerful machine learning predictions on their own.

Leveraging Process Mining to Enable Human and Robot Collaboration, by Michal Rosik of Minit

Michal started with the analysis of an invoice approval process as seen through their process mining tool, but the point of his demo was to perform data mining on UI session recording data, that is, the data collected when a recorder is monitoring a person’s activities to figure out exactly the steps they are taking to perform a task. Unlike a strict RPA training/scripting session, this can use data from users just doing their day-to-day work, filter out the activities that aren’t related to the task, and create a definition of the best RPA path. Or, it can use data from the process when RPA is performing the tasks to see where there are potential problems within the bot’s actions or if the existence of the bot is causing bottlenecks to be shifted to other parts of the process. It can use process variant analysis to look at the differences between the process pre- and post-bot implementation. He also showed their Minit dashboard, being released now, which combines process mining and business intelligence to provide a much more predictive environment for business managers.

Process Mining and DTO — How to Derive Business Rules and ROI from the Data, with Massimiliano Delsante and Luca Fontanili of Cognitive Technology

DTO – the digital twin of an organization – is the focus of Massimiliano and Luca’s presentation, and how to get from process mining to DTO for analyzing and governing processes in their myInvenio tool. From their process mining model, they can show a number of visualizations: non-conformant cases within the process, manual steps (not yet automated, showing potential for improvement), steps that are in violation of their SLA, and a dashboard combining activity cost and other performance data with the process mining model. They demonstrated how a reference model can be created using BPMN and DMN to allow conformance checking and simulation, or how the BPMN model – including branching rules – can be derived directly from the discovered process model. They’re using machine learning to discover the correlations from which the branching conditions are determined, but the business user/analyst can override the discovered branching rules to define more precise decision rules and decision tables. This “decision mining” is a unique capability in the process mining world (for now). The analyst can also add manual steps to the discovered process model in BPMN mode, which will update the related analytics and visualizations. Their simulation allows each step to be simulated not just as it currently runs, but also with potential robot replacements for some of the human operators at an activity, comparing the different scenarios.

As a comment on the latter two process mining sessions, I’m really happy to see process mining moving from a purely post-execution analytical tool to an interactive process health check and prediction tool. I’ve done some presentations in the past in which I suggested that process mining would be a great tool for forward-looking simulations and what-if scenarios, and there’s so much more that can be done in this area.

bpmNEXT 2019 demos: Appian

Usually I blog about the demos in groups, but Malcolm Ross of Appian was the lone demo between the panel and lunch so he gets his own post.

As a reminder, demos are a five-minute Ignite-style presentation (20 slides with an auto-advance every 15 seconds) followed by a live demo and Q&A. Malcolm had a lot to say, however, so there were five minutes of slides followed by another four minutes of talk in front of a looping video before he started the actual demo.

Malcolm’s demo is on realigning BPM in the age of intelligent automation, in the context of different automation technologies (RPA, AI, BPM, integration) that are being sold as separate solutions into organizations. Not surprisingly, he positions BPM as the core technology and integration platform, but they also OEM Blue Prism’s RPA into their product suite and can integrate with many other web services to take part in the automation. He demonstrated an invoice processing application: he uploaded an invoice PDF, an RPA bot captured the data, and BPM was used for exception handling when the bot couldn’t complete its task, as well as for overall monitoring of processes including the bot tasks. He walked through some of their design-time experience that is focused on integration, showing how connections to services from Blue Prism, Automation Anywhere, AWS machine learning, Google NLP and others can be used to create integration points that can then be called from their BPM processes. This is a good use case for BPM and RPA together – they are much more complementary than competitive – allowing RPA tasks to be orchestrated and monitored as part of a larger BPM process. He also had a great analogy when asked about deciding when to use RPA versus BPM: RPA is like a pain reliever that provides temporary relief, while BPM (and SOA) is like an antibiotic that cures the underlying problem.