Virtual conference best practices: 2020 in review

Wow, it’s been over two months since my last post. I took a long break over the end of the year since there wasn’t a lot going on that inspired me to write, and we were in conference hiatus. Now that (virtual) conferences are ramping up again for 2021, I wanted to share some of the best practices that I gathered from attending — and in one case, organizing — virtual conferences over 2020. Having sent this information by email to multiple people who were organizing their own conferences, I decided to just put it here where everyone could enjoy it. Obviously, these are all conferences about intelligent automation platforms, but the best practices are applicable to any technical conference, and likely to many non-technical conferences.

In summary, I saw three key things that make a virtual conference work well:

  1. Live presentations, not pre-recorded. This is essential for the amount of energy in the presentation, and makes the difference between a cohesive conference and just a bunch of webinars. Screwups happen when you’re live, but they happen at in-person conferences, too.
  2. Separate and persistent discussion platform, such as Slack (or Pega’s community in the case of their conference). Do NOT use the broadcast vendor’s chat/discussion platform, since a) it will disappear once your conference is over, and b) it probably sucks.
  3. Replays of the video posted as soon as possible, so that people who missed a live session can watch it and jump into the discussion later the same day while others are still talking about it. Extra points for also publishing the presentation slides at the same time.

A conference is not a one-way broadcast, it’s a big messy collaborative conversation

Let’s start with the virtual conferences that I wrote about over 2020: Camunda, Alfresco, Celonis, IBM, Appian, Signavio, Pega, Fluxicon and Bizagi. My notes on each are below.

What I saw by attending these helped me when I was asked to organize DecisionCAMP, which ran in late June: we ran the sessions on Zoom with livestreaming to YouTube (participants could watch either way), used Slack as a discussion platform (which is still being used for ongoing discussions and to run monthly events), and posted the on-demand videos to YouTube. Fluxicon used a similar setup for their Process Mining Camp: Skype (I think) instead of Zoom to capture the speakers’ sessions, with all participants watching through the YouTube livestream and discussing on Slack.

Some particular notes excerpted from my posts on the vendor conferences follow. If you want to see the full blog posts, use the tag links above or just search.

Camunda

  • “Every conference organizer has had to deal with either cancelling their event or moving it to some type of online version as most of us work from home during the COVID-19 pandemic. Some of these have been pretty lacklustre, using only pre-recorded sessions and no live chat/Q&A, but I had expectations for Camunda being able to do this in a more “live” manner that doesn’t completely replace an in-person event, but has a similar feel to it. They did not disappoint: although a few of the CamundaCon presentations were pre-recorded, most were done live, and speakers were available for live Q&A. They also hosted a Slack workspace for live chat, which is much better than the Q&A/chat features on the webinar broadcast platform: it’s fundamentally more feature-rich, and also allows the conversations to continue after a particular presentation completes.”
  • “As you probably gather from my posts today, I’m finding the CamundaCon online format to be very engaging. This is due to most of the presentations being performed live (not pre-recorded as is seen with most of the online conferences these days) and the use of Slack as a persistent chat platform, actively monitored by all Camunda participants from the CEO on down.”
  • “I mentioned on Twitter today that CamundaCon is now the gold standard for online conferences: all you other vendors who have conferences coming up, take note. I believe that the key contributors to this success are live (not pre-recorded) presentations, use of a discussion platform like Slack or Discord alongside the broadcast platform, full engagement of a large number of company participants in the discussion platform before/during/after presentations, and fast upload of the videos for on-demand watching. Keep in mind that a successful conference, whether in-person or online, allows people to have unscripted interactions: it’s not a one-way broadcast, it’s a big messy collaborative conversation.”
  • Note that things did go wrong occasionally — one presentation was cut off partway through when the presenter’s home internet died. However, the energy level of the presentations was really high, making me want to keep watching. It was also hilarious when one speaker talked about improving their “shittiest process”, which is probably only something that would come out spontaneously during a live presentation.

Alfresco

  • “Alfresco Modernize didn’t have much of a “live” feel to it: the sessions were all pre-recorded which, as I’ve mentioned in my coverage of other online conferences, just doesn’t have the same feel. Also, without a full attendee discussion capability, this was more like a broadcast of multiple webinars than an interactive event, with a short Q&A session at the end as the only point of interaction.”

Celonis

  • “A few notes on the virtual conference format. Last week’s CamundaCon Live had sessions broadcast directly from each speaker’s home plus a multi-channel Slack workspace for discussion: casual and engaging. Celonis has made it more like an in-person conference by live-broadcasting the “main stage” from a studio with multiple camera angles; this actually worked quite well, and the moderator was able to inject live audience questions. Some of the sessions appeared to be pre-recorded, and there’s definitely not the same level of audience engagement without a proper discussion channel like Slack — at an in-person event, we would have informal discussions in the hallways between sessions that just can’t happen in this environment. Unfortunately, the only live chat is via their own conference app, which is mobile-only and has a single chat channel, plus a separate Q&A channel (via in-app Slido) for speakers that is separated by session and is really more of a webinar-style Q&A than a discussion. I abandoned the mobile app early and took to Twitter. I think the Celosphere model is probably what we’re going to see from larger companies in their online conferences, where they want to (attempt to) tightly control the discussion and demonstrate the sort of high-end production quality that you’d have at a large in-person conference. However, I think there’s an opportunity to combine that level of production quality with an open discussion platform like Slack to really improve the audience experience.”
  • “Camunda and Celonis have both done a great job, but for very different reasons: Camunda had much better audience engagement and more of a “live” feel, while Celonis showed how to incorporate higher production quality and studio interviews to good effect.”
  • “Good work by Celonis on a marathon event: this ran for several hours per day over three days, although the individual presentations were pre-recorded then followed by live Q&A. Lots of logistics and good production quality, but it could have had better audience engagement through a more interactive platform such as Slack.”

IBM

  • “As I’ve mentioned over the past few weeks of virtual conferences, I don’t like pre-recorded sessions: they just don’t have the same feel as live presentations. To IBM’s credit, they used the fact that they were all pre-recorded to add captions in five or six different languages, making the sessions (which were all presented in English) more accessible to those who speak other languages or who have hearing impairments. The platform is pretty glitchy on mobile: I was trying to watch the video on my tablet while using my computer for blogging and looking up references, but there were a number of problems with changing streams that forced me to move back to desktop video for periods of time. The single-threaded chat stream was completely unusable, with 4,500 people simultaneously typing “Hi from Tulsa” or “you are amazing”.”
  • “IBM had to pivot to a virtual format relatively quickly since they already had a huge in-person conference scheduled for this time, but they could have done better both for content and format given the resources that they have available to pour into this event. Everyone is learning from this experience of being forced to move events online, and the smaller companies are (not surprisingly) much more agile in adapting to this new normal.”

Appian

  • “This was originally planned as an in-person conference, and Appian had to pivot on relatively short notice. They did a great job with the keynotes, including a few of the Appian speakers appearing (appropriately distanced) in their own auditorium. The breakout sessions didn’t really grab me: too many, all pre-recorded, and you’re basically an audience of one when you’re in any of them, with little or no interactivity. Better as a set of on-demand training/content videos rather than true breakout sessions, and I’m sure there’s a lot of good content here for Appian customers or prospects to dig deeper into product capabilities but these could be packaged as a permanent library of content rather than a “conference”. The key for virtual conferences seems to be keeping it a bit simpler, with more timely and live sessions from one or two tracks only.”

Signavio

  • “Signavio has a low-key format of live presentations that started at 11am Sydney time with a presentation by Property Exchange Australia: I tuned in from my timezone at 9pm last night, stayed for the Deloitte Australia presentation, then took a break until the last part of the Coca-Cola European Partners presentation that started at 8am my time. In the meantime, there were continuous presentations from APAC and Europe, with the speakers all presenting live in their own regular business hours.”
  • “The only thing missing is a proper discussion platform — I have mentioned this about several of the online conferences that I’ve attended, and liked what Camunda did with a Slack workspace that started before and continued after the conference — although you can ask questions via the GoToWebinar Question panel. To be fair, there is very little social media engagement (the Twitter hashtag for the conference is mostly me and Signavio people), so possibly the attendees wouldn’t get engaged in a full discussion platform either. Without audience engagement, a discussion platform can be a pretty lonely place. In summary, the GTW platform seems to behave well and is a streamlined experience if you don’t expect a lot of customer engagement, or you could use it with a separate discussion platform.”

Pega

  • “In general, I didn’t find the prerecorded sessions to be very compelling. Conference organizers may think that prerecording sessions reduces risk, but it also reduces spontaneity and energy from the presenters, which is a lot of what makes live presentations work so well. The live Q&A interspersed with the keynotes was okay, and the live demos in the middle breakout section as well as the live Tech Talk were really good. PegaWorld also benefited from Pega’s own online community, which provided a more comprehensive discussion platform than the broadcast platform chat or Q&A.”

Fluxicon

  • “The format is interesting: there is only one presentation each day, presented live using YouTube Live (no registration required), with some Q&A at the end. The next day starts with Process Mining Café, which is an extended Q&A with the previous day’s presenter based on the conversations in the related Slack workspace (which you do need to register to join), then a break before moving on to that day’s presentation. The presentations are available on YouTube almost as soon as they are finished.”
  • “The really great part was engaging in the Slack discussion while the keynote was going on. A few people were asking questions (including me), and Mieke Jans posted a link to a post that she wrote on a procedure for cleansing event logs for multi-case processes – not the same as what van der Aalst was talking about, but a related topic. Anne Rozinat posted a link to more reading on these types of many-to-many situations in the context of their process mining product from their “Process Mining in Practice” online book. Not surprisingly, there was almost no discussion on the Twitter hashtag, since the attendees had a proper discussion platform; contrast this with some of the other conferences where attendees had to resort to Twitter to have a conversation about the content. After the keynote, van der Aalst even joined in the discussion and answered a few questions, plus added the link for the IEEE task force on process mining that promotes research, development, education and understanding of process mining: definitely of interest if you want to get plugged into more of the research in the field. As a special treat, Ferry Timp created visual notes for each day and posted them to the related Slack channel.”

Bizagi

  • “The broadcast platform fell over completely…I’m not sure if Bizagi should be happy that they had so many attendees that they broke the platform, or furious with the platform vendor for offering something that they couldn’t deliver. The “all-singing, all-dancing” platforms look nice when you see the demo, but they may not be scalable enough.”

Final thoughts

Just to wrap things up, it’s fair to say that things aren’t going to go back to the way that they were any time soon. Part of this is due to organizations understanding that things can be done remotely just as effectively (or nearly so) as they can in person, if done right. Also, a lot of people are still reluctant to even think about travelling and spending days in poorly-ventilated rooms with a bunch of strangers from all over the world.

The vendors who ran really good virtual conferences in 2020 are almost certain to continue to run at least some of their events virtually in the future, or find a way to have both in-person and remote attendees simultaneously. If you run a virtual conference that doesn’t get the attendee engagement that you expected, the problem may not be that “virtual conferences don’t work”: it could be that you just aren’t doing it right.

My writing on the Trisotech blog: better analysis and design of processes

I’ve been writing some guest posts over on the Trisotech blog, but haven’t mentioned them here for a while. Here’s a recap of what I’ve posted over there the past few months:

In May, I wrote about designing loosely-coupled processes to reduce fragility. I had previously written about Conway’s Law and the problems of functional silos within an organization, but then the pandemic disruption hit and I wanted to highlight how we can avoid the sorts of cascading supply chain process failures that we saw early on. A big part of this is not having tightly-coupled end-to-end processes, but separating out different parts of the process so that they can be changed and scaled independently of each other while still forming part of an overall process.

In July, I helped to organize the DecisionCAMP conference, and wrote about the BPMN-CMMN-DMN “triple crown”: not just the mechanics of how the three standards work together, but why you would choose one over the other in a specific design situation. There are some particular challenges with the skill sets of business analysts who are expected to model organizations using these standards, since they will end up using more of the one that they’re most familiar with regardless of its suitability to the task at hand. There are also challenges for the understandability of multi-model representations, which require a business operations reader of the models to be able to see how this BPMN diagram, that CMMN model and this other DMN definition all fit together.

In August, I focused on better process analysis using analytical techniques, namely process mining, and gave a quick intro to process mining for those who haven’t seen it in action. For several months now, we haven’t been able to do a lot of business “as is” analysis through job shadowing and interviews, and I put forward the idea that this is the time for business analysts to start learning about process mining as another tool in their kit of analysis techniques.

In early September, I wrote about another problem that can arise due to the current trend towards distributed (work from home) processes: business email compromise fraud, and how to foil it with better processes. I don’t usually write about cybersecurity topics, but I have my own in-home specialist, and this topic overlapped nicely with my process focus and the need for different types of compliance checks to be built in.

Then, at the end of September, I finished up the latest run of posts with one about the process mining research that I had seen at the (virtual) academic BPM 2020 conference: mining processes out of unstructured emails, and queue mining to see the impact of queue congestion on processes.

Recently, I gave a keynote on aligning intelligent automation with incentives and business outcomes at the Bizagi Catalyst virtual conference, and I’ve been putting together some more detailed thoughts on that topic for this month’s post. Stay tuned.

Disclosure: Trisotech is my customer, and I am compensated for writing posts for publication on their site. However, they have no editorial control or input into the topics that I wrote about, and no input into what I write here on my own blog.

Next week at DecisionCAMP 2020, hosted by @DecisionMgtCom

We’re reaching the end of what would have been the usual spring season of tech conferences, although all of them moved online with varying degrees of success. After the first few that I attended, I promised a summary of the best and worst practices, and I still plan to do that, but Jacob Feldman convinced me to help him out with the logistics for the online version of DecisionCAMP, which was supposed to be in Oslo next week. I first attended DecisionCAMP last year in Bolzano since I was already in Berlin the week before for CamundaCon, and managed to spend a few days’ vacation in northern Italy as a bonus. This year, I won’t be blogging about it live, because I’ll be running the logistics and the on-screen monitoring. This is giving me a chance to test-drive some of my ideas about how to run a good online event without spending a fortune.

Note that the last day of the conference, Wednesday July 1, is Canada Day: a major national holiday sometimes referred to as “Canada’s birthday”, but I’ll be online moderating the conference because who’s really going anywhere these days. I do expect everyone on the Zoom meeting that day to be sporting a red maple leaf, or at least be wearing red and white, at risk of having their video disabled by the diabolical Canadian moderator.

Here’s how we’re running it:

  • Registration is via the Declarative AI 2020 site, and is open until tomorrow, June 27.
  • All presentations will be live on Zoom, with simultaneous livestreaming on YouTube. If you are registered, you will receive the Zoom link; if you’re not registered or prefer to watch on YouTube, subscribe to the DecisionCAMP YouTube channel and watch it there.
  • Discussions and Q&A will be on the DecisionCAMP Slack workspace, with dedicated channels for discussions about each day’s presentations. We are encouraging presenters to engage with their audience there after their presentation to answer any open questions, and we already have some discussions going on. This type of persistent, multi-threaded platform is much better for emulating the types of hallway conversations and presenter Q&A that might occur at an in-person conference.
  • For Zoom attendees, there will also be the option to use the “raise hand” feature and ask a question verbally during the presentation.

We already have four pre-conference presentations that you can see on the YouTube channel; all of the presentations from next week will join them for on-demand viewing except where the presenter asks us not to record their session.

I’ve learned a lot about online conference tools in the past month or so, including:

  • Zoom features, settings and all variations on recording to have the best possible experience during and after each presentation. I will share all of those in my “best practices” post that I’ll create after DecisionCAMP is over, based on what I’ve seen from all the online conferences this spring.
  • Slack, which I had used before, but I’d never created or administered a workspace or added apps.
  • YouTube livestreaming, or rather, stream-through from Zoom. This is a very cool feature of how Zoom and YouTube work together, but you have to learn a few things, such as to manually end the stream over on YouTube once you’ve closed the Zoom meeting so that it doesn’t keep running with no data input for several hours. Oops.
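In our case the fix was just remembering to end the stream manually over on YouTube, but if you want a scripted safety net, here’s a minimal sketch using the YouTube Data API v3 to find any broadcasts still marked as live and transition them to complete. This was not part of the DecisionCAMP setup; it assumes you already have OAuth credentials for your channel saved locally (the token.json filename is just a placeholder).

```python
# Minimal sketch: end any YouTube broadcasts that are still "live" after the
# Zoom meeting has closed. Assumes OAuth credentials with the YouTube scope
# are already saved in token.json (a placeholder name, not an actual setup).
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file("token.json")
youtube = build("youtube", "v3", credentials=creds)

# Find broadcasts on the authorized channel that are still active.
active = youtube.liveBroadcasts().list(
    part="id,snippet", broadcastStatus="active", maxResults=5
).execute()

for broadcast in active.get("items", []):
    # Transition each lingering broadcast to "complete" so it stops
    # streaming dead air for hours.
    youtube.liveBroadcasts().transition(
        broadcastStatus="complete", id=broadcast["id"], part="status"
    ).execute()
    print("Ended broadcast:", broadcast["snippet"]["title"])
```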

I’m not being financially compensated for working on DecisionCAMP: I’ve been treating it as a learning experience.

DecisionCAMP 2020 Call for Presentations

I really enjoyed my first trip to DecisionCAMP earlier this year, and not just because it was in beautiful Bolzano. In 2020, it will be in Oslo, Norway in late June – just after the summer solstice, which is a great time to visit a northern country where the sun (almost) never sets.

You can check out all the information on the conference at the DecisionCAMP 2020 website and submit your presentation proposal based on the Call for Presentations.

Summary of DecisionCAMP from @DecisionMgtCom

It seems like just yesterday that we were in beautiful Bolzano, but I’ve been back at my desk for more than a month and still wading through some of the news stories from when I was away.

I noticed this wrap-up of the sessions from Jacob Feldman, which includes a link to all of the presentation slide decks, plus a more in-depth explanation of his own presentation.

He also covers some of the particularly interesting topics in more detail, including the need for DMN 2.0, the user-friendliness of FEEL and several real-world use cases.

I also had a note from Dario Campagna regarding the post that I wrote about his presentation; I’ve updated it to reflect that the work that he presented is part of the COMPOSELECTOR project, to which ESTECO is a contributor. My apologies for the omission of the overarching project in the original version of the post, although I was live-blogging so some detail is always missed.

DecisionCAMP 2019: Decision test tools, complex payroll decisioning, and decisions as a service

Modeling Decisions With Embedded Testing. Daniel Schmitz-Hübsch and Ulrich Striffler, Materna

Daniel Schmitz-Hübsch and Ulrich Striffler of Materna, who presented earlier this week on whether FEEL is friendly enough, returned to discuss testing of decision models using a tool that they have developed. The typical lifecycle for developing and testing decision models has a business analyst modeling the decisions and creating test cases, but then passing them off to a developer to execute the test cases and draw conclusions to feed back into the design. To cut the developer out of the cycle — and therefore shorten it — they have developed declab, a browser-based test harness for decision models and test cases. Business analysts can perform ad hoc tests, or build a tree of test cases.

Live demo of declab

This includes FEEL testing: the business analyst can enter and test a variety of constructs to try out a FEEL function without having to deploy an entire model — envision an analyst with their modeler on one screen and declab on another screen, doing micro-testing as they design decision models.

Materna has released the tool as open source; it’s based on Red Hat’s Drools engine, which performs the tests in the background. You can try it out online here. Lots of great suggestions and comments from the audience; hopefully some of them will contribute to the open source project.

Exploiting payroll knowledge with Viren. Tim Stuyckens, Teal Partners

Tim Stuyckens presented on their Viren decision-based tool for modeling and executing knowledge, specifically for calculating expat tax in payroll software. Payroll tax in Belgium is particularly complex, and sometimes it’s difficult to know which statutes to apply to make the most beneficial calculation.

Payroll tax calculation. From Tim Stuyckens’ presentation.

In addition to straightforward tax calculations, the tool can work backwards from a desired point to the necessary conditions, such as how many days to work in order to earn a certain income, or the optimal day rate to minimize taxes and earn a certain income. Business analysts can enter and modify the knowledge rules and data, while the platform handles versioning, compilation and deployment.

They use declarative rules and structured data to represent knowledge in the system, and apply constraint solvers for optimization with non-linear constraints. They only discovered DMN earlier this year and have embraced it in their tool, providing a unified DRD and decision tables to allow business analysts to more easily step through the decision logic.
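The “work backwards from a desired result” idea is easy to illustrate with a toy inversion. This is entirely my own invented example, with a flat tax rate and a brute-force search rather than anything resembling Viren’s actual statutes or constraint solving: given a simple net-income function, find the smallest number of working days that reaches a target.

```python
# Toy illustration of working backwards from a desired result to the inputs
# that produce it. The day rate and flat "tax" here are invented for the
# example; a real payroll engine would apply actual statutes.
def net_income(days_worked, day_rate=500.0, tax_rate=0.4):
    """Gross pay for the days worked, minus a flat illustrative tax."""
    gross = days_worked * day_rate
    return gross * (1 - tax_rate)

def days_needed(target_net, max_days=365):
    """Search for the smallest number of days that reaches the target net income."""
    for days in range(max_days + 1):
        if net_income(days) >= target_net:
            return days
    return None  # target not reachable within max_days

print(days_needed(30_000))  # 100 days at these invented rates
```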

Decision Management as a Service. Dennis Aarts, The Business Analysts

The last presentation in this session was by Dennis Aarts on a use case of decision management shared services model at Informatie Vlaanderen, an entity of the Flemish government in Belgium. They provide digital services to other parts of the government, and they were looking at ways to improve the quality and consistency of their services. The solution is Automatisch Advies (Automated Advice) which includes authentic and authorized data sources, orchestration using Camunda BPM, and business rules using IBM ODM. It has an extensible architecture to allow other capabilities to be integrated in the future, such as AI/ML.

There were several goals for the project, including productivity (reducing cycle time, reuse of data), regulatory (GDPR requirements) and ease of use (business can make modifications). The solution provides a centralized platform where rules can be developed and used by multiple entities.

Automatisch Advies decision as a service platform. From Dennis Aarts’ presentation.

Having decisions as a shared service amongst many government entities has many benefits in terms of reuse across entities, and not requiring expertise or maintenance skills for the platform in each entity. Like any shared services IT, however, there are complexities in allocating costs, governance of the decision models, security of models specific to a subset of entities, and maintenance of the rule sets.

This was my last session of DecisionCAMP 2019 — I’m skipping the final vendor statements and the closing remarks to head off and have a few days of vacation before I have to return to real life sometime next week. It’s been a great experience, and thanks to Jacob Feldman for inviting me. It’s been several years since I’ve been in Bolzano, and it’s just as beautiful as I remember.

Beautiful Bolzano!

This has been a bit of an epic trip, having left home almost three weeks ago to attend the academic BPM conference in Vienna, give a keynote at CamundaCon in Berlin, then here for DecisionCAMP. You can find my coverage for each of those events at the links.

DecisionCAMP 2019: DMN TCK, BPO with AI and rules, and business logic hidden in spreadsheets

Close Is Not Close Enough. Keith Swenson, Fujitsu

A few months ago at bpmNEXT, I saw Keith Swenson give an update on the DMN Technology Compatibility Kit, and we’re seeing a bit of a repeat of that presentation here at DecisionCAMP. The TCK defines a set of test cases (as DMN decision models, input data and expected results) that assure conformance to the specification, plus a sample runner application that will pass the models and data to the vendor’s engine and evaluate the results.

DMN TCK. From Keith Swenson’s presentation.

There are about 120 test models and 1600 test cases, supporting only DMN 1.2; these tests come from examining the specification as well as cases from practice. It’s easy for a vendor to get involved in the TCK, both in terms of running it against their engine and in terms of participating through submitting new test models and cases. You can see the vendors that have submitted their results; although many more vendors claim that they “have DMN”, their actual level of compatibility may be suspect.

The TCK committee is getting ready for DMN 1.3, and considering tests for modeling tools in addition to the current tests for the engine. He also floated the idea of a standardized API for DMN as a service, so that the calling application doesn’t need to know which engine it’s calling — possibly something that’s not going to be a big hit with vendors.
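To make the TCK mechanics a bit more concrete, here’s a rough sketch of the basic runner loop, entirely my own simplification rather than the actual TCK runner: load a test case’s inputs and expected results, hand the model and inputs to the vendor’s engine through an adapter that you write, and compare. The real TCK defines its own test-case file format; JSON is used here only to keep the sketch short.

```python
# Rough sketch of the TCK idea, not the actual TCK runner: each test case
# bundles input data and expected results for a DMN model, and the runner
# feeds them to the vendor's engine and checks the output.
import json

def evaluate_with_vendor_engine(model_path, inputs):
    """Hypothetical adapter: call your DMN engine with the model and input
    data, and return {decision name: result} for the evaluated decisions."""
    raise NotImplementedError("wire this to the engine you want to test")

def run_test_case(model_path, case_path):
    with open(case_path) as f:
        case = json.load(f)  # e.g. {"inputs": {...}, "expected": {...}}
    actual = evaluate_with_vendor_engine(model_path, case["inputs"])
    passed = all(actual.get(name) == expected
                 for name, expected in case["expected"].items())
    print("PASS" if passed else "FAIL", model_path, case_path)
    return passed
```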

Business innovation of BPO realized by Task Center and AI and Rule Engine. Yoshihito Nakayama, NTT DATA INTRAMART

Yoshihito Nakayama presented on the current challenges of BPO with respect to improving productivity, and how they are resolving this using AI and a rules engine to aggregate and assign human tasks from multiple systems to different team members. This removes the requirement to manually review and assign work, and also provides a dashboard for visualizing work in progress and future forecasts.

Intramart’s Task Center for aggregating and assigning work. From Yoshihito Nakayama’s presentation.

AI is used to predict and optimize task classification and assignment, based on time required to complete the task and the individual worker skill level and productivity. It is also used to predict workload for task types and individual workers.

Their visualization dashboard shows drilldowns on current and past productivity, plus future forecasts. The simulation models for forecasting can be fine-tuned to optimize for cost, performance and other factors. It brings together work monitoring from all systems, including RPA processes. They’re also using process mining on a variety of systems to create a digital twin of the organization for better tracking and predictions, as well as other tools such as voice and image identification to recognize what tasks are being done that are not being recorded in any system logs.

They have a variety of case studies across industries, looking at automating non-routine work using case management, BPM, RPA, AI and rules.

Spaghetti Spreadsheets Untangled – Benefits of decision modeling when uncovering complex business logic hidden in spreadsheets. Charlotte Bouvy, M.C. Bouvy Consultancy

Charlotte Bouvy presented on her work done with SVB, the Netherlands social insurance administrator, on implementing business rules management. They are using DMN-based wizards for supporting 1,500 case workers, and the specific case was around the operational control and audit departments and the “lawfulness” of how the assessment work is done. Excel spreadsheets were used to do this, which had obvious problems in terms of being error prone and lacking domain-specific business logic. They implemented their SARA system to replace the spreadsheets with Oracle OPA, which allowed them to more accurately represent knowledge, as well as separate the data from the decision model while creating an executable model.

Decision model to determine lawfulness. From Charlotte Bouvy’s presentation.

These types of audit processes require sampling over a wide variety of case files to compare actual payments against expected amounts, with some degree of aggregation within specific laws being applied. Moving to a rules engine allowed them to model calculations and decisions, and to separate data and model to avoid errors that occurred when copying and pasting data in spreadsheets. The executable model is now a single source of truth to which version control and change management can be applied. They are trying out different ways of using the SARA system: directly in Oracle Policy Modeler for building and debugging; via a web interview and an RPA robot for data input; and eventually via direct integration with the SVB’s case management system to load data.

DecisionCAMP 2019: Industry use cases in airport gate allocation, financial risk monitoring, and composite material design

The Decision Model for Gate Allocation. Silvie Spreeuwenberg, Librt

Day 3 of DecisionCAMP 2019 started with three use cases from industry. First, Silvie Spreeuwenberg presented on decision models for allocating airport gates, specifically at Schiphol airport in Amsterdam. Although gate plans are made a day in advance based on flight schedules, they change constantly due to early arrivals, late departures and other unexpected disruptions to the schedule. On any given day, there are 50-100 gate changes made in the hour before an aircraft’s arrival; although this was seen as a disruption, it could also be considered an opportunity for optimization.

Day-ahead decision model for assigning gates. From Silvie Spreeuwenberg’s presentation.

There were a lot of rules used for the planning and reassignment that had more to do with preferences than actual optimization; they really wanted to drive towards the objective of optimizing asset usage and therefore airport capacity. There are a lot of factors involved, such as having sufficient gate area capacity to handle the number of passengers for a flight, or having buses available to offload flights that can’t be assigned a gate. They have created a policy for aircraft stand allocation which includes some identifiable decision tables, although these are just at the strategy documentation phase.

Definitely a complex problem that has applicability at every major airport around the world.

A hybrid implementation of multi-channel, multi-modal, high volume financial risk monitoring. Martijn Tromm and Marten Schokking, Oracle

Marten Schokking and Martijn Tromm presented a use case from Rabobank using decision management and machine learning for customer risk assessment in terms of KYC (know your client) and AML (anti-money laundering). This is used during client onboarding, but also during periodic reviews as well as reviews triggered by specific events. There are scoring rules that use data input from a variety of sources, including client information from a CRM, interview responses and policies.

Risk model for customer risk assessment. From Marten Schokking and Martijn Tromm’s presentation.

There are government regulations requiring that this be done for all clients at certain times. A triggering event, such as a change in the customer’s circumstances, will cause a customer interview and other data analysis to recalculate the risk; this may result in a more detailed manual review of the risk. At this point, there is still a lot of employee work which is creating a challenge in completing the customer risk assessments within the regulatory deadlines; they are looking at how to automate the basic assessment using machine learning in order to reduce the manual work required.

The risk model has been built using Oracle Policy Automation rules engine integrated with the Siebel CRM. They are reusing rules across channels where possible, and the use of natural language in the rules definition helps with traceability to the policies. They are continuing to innovate with rules, such as having context-driven rules based on user behavior on specific channels, and having a fast two-day turnaround for rule changes related to certain types of policy changes. The ability to predict the impact of policy changes based on actual data allows for operational planning to accommodate those changes.

The Role of DMN and BPMN in the Design of Composite Materials. Dario Campagna, ESTECO

Dario Campagna presented on how the COMPOSELECTOR project is integrating material modeling and business process management in a decision support system for composite material design; this type of design can have complex requirements, business decisions and simulation workflows. Using application cases from Dow, Airbus and Goodyear, they modeled the business flow using BPMN and DMN. ESTECO, which creates software tools for engineering design, is a contributor to the COMPOSELECTOR project.

Subprocess in Dow flow showing decision invocation. From Dario Campagna’s presentation.

While BPMN is used to model the flow at the business level, DMN decision tables are used to make decisions on the class of materials and manufacturing process, then on the simulation workflows to use based on business and engineering KPIs. DMN provides the link from the business layer to the engineering layer, then to the simulation layer. Using DMN provides a higher level of consistency in decision-making, which leads to better design and lower costs.

Decision table used to select simulation workflow. From Dario Campagna’s presentation.

We saw a brief video of a demo of the system in use: a business-level manager selects high-level parameters and KPIs for the proposed design; this selects one or more simulation models for the material design, which is then confirmed or decided by an engineer; and the results of the simulation are passed back to the manager for final decision-making. This has the effect of integrating the business and technical sides of the design process, and including modeling and simulation results in the business-level (human) decisions in a standardized way.

DecisionCAMP 2019: model-based optimization, and decision management in claims

Model-Based Optimization for Effective and Reliable Decision-Making. Robert Fourer, AMPL

Robert Fourer presented on model-based optimization, starting with a bit of background on mathematical optimization techniques and the optimization cycle in practice. He looked at method-based approaches — which define how a solution should be found — and model-based approaches — which define what a solution should satisfy. There are several solvers (often algebraic in nature) that allow you to solve within a broad, well-understood problem class by providing your specific constraints. When a problem changes, a method-based approach requires rethinking and updating the implementation, whereas a model-based approach will require new variables, expressions and constraints, but this is typically easier than updating methods.

I’m fairly sure that the subtleties of how this relates to decision management have escaped me, although I can see some conceptual links with the earlier discussions on declarative rules: potentially declarative rules could be used to generate the algebraic notation that was presented in order to create inputs for a model-based optimization.
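As a reminder to myself of what the model-based style looks like in practice, here’s a toy example with invented numbers (my own, not from Fourer’s presentation): the model states only the objective and the constraints that a solution must satisfy, and a general-purpose solver does the rest. If the problem changes, you change the data and constraints rather than the algorithm.

```python
# Toy model-based optimization: choose production quantities of two products
# to maximize profit, subject to machine-hour and labour-hour limits. All
# numbers are invented for illustration; only the model changes if they do.
from scipy.optimize import linprog

# Decision variables: x = [units of product A, units of product B]
# Objective: maximize 40*A + 30*B; linprog minimizes, so negate the coefficients.
c = [-40, -30]

# Constraints that any solution must satisfy, as A_ub @ x <= b_ub:
#   2*A + 1*B <= 100   (machine hours available)
#   1*A + 2*B <= 80    (labour hours available)
A_ub = [[2, 1],
        [1, 2]]
b_ub = [100, 80]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("units of A and B:", result.x, "profit:", -result.fun)
```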

Decision Management Journey at Hiscox Claims. Larry Goldberg, Sapiens, and Harriet Parkinson, Hiscox

The last presentation on day 2 of DecisionCAMP is a customer case study, with a Sapiens implementation at Hiscox UK Claims. It sounds like they were in the same position as pretty much every insurance claims operation that I’ve seen: little to no automation, and decisions based on the expertise of the claims managers. In other words, a great opportunity for decision management (and process management). As a specialized insurer of high-net-worth customers, however, they have additional drivers for automating the routine administrative parts of claims: their claims adjudicators are highly skilled and well-paid, often lawyers, who do not want to be spending time on admin. There’s also market pressure to start processing claims digitally to reduce the cycle time for less complex claims, or at least triage and process the FNOL (first notice of loss) automatically.

Decision management is especially important for claims in order to ensure consistency: whether the DM system is providing a recommendation to a human claims adjudicator or automating the decision, the decision should be the same from one instance to another given the same inputs and context. Automated decision management is key to increasing the number of day 1 settlements for straightforward claims, while more complex claims will still be done with that human touch. They need to have the ability to change the rules to account for surge scenarios, such as a flood that impacts a large number of customers and causes a large number of new claims; this could just change thresholds for determining whether a claim could be automated, or could do some other form of triage on the claim. I talked about scalability for resilience in my CamundaCon keynote last week, and definitely having the ability to quickly change decision parameters is part of that.

Factors in deciding whether to pay a claim immediately (automated) or send to a claims adjudicator. From Harriet Parkinson’s presentation.

They’ve been through the design sessions and the implementation is underway; the final decision models will be built and tested this year, with the full implementation integrated with their claims management system in 2020. Changes to the rules can be done by business analysts, most without IT involvement. In addition to managing decisions that are part of the claims process, the Sapiens system will provide next-best-question support for interactive customer self-service forms (or maybe a chatbot in the future) that can perform an initial triage to determine if a claim can be handled automatically or requires a claims adjudicator to talk to the claimant. One lesson learned is that the initial models took much longer than they expected: almost a year versus the estimate of a couple of months to get a consistent model that was accepted by all business users.

I’m up next to moderate a vendor panel, which will close out this second day of DecisionCAMP. Back tomorrow for a last full day of sessions.

DecisionCAMP 2019: the evolving DMN standard and the quality of decision models

DMN 2.0? Gary Hallmark, Oracle

Gary Hallmark presented on the next major version of DMN that’s in the works, starting with a timeline of what’s happened so far since the original RFP in 2011 up to the expected 1.3 release later this year. He added the question mark to his title because whether to issue a major (fix the mistakes of the past) or minor (patch and live with the mistakes of the past) release is still under consideration. He went through the top 10 requested features, half of which can be backward-compatible with DMN 1.x (i.e., DMN 1.x models can be ingested and executed in the new version) and half of which can’t.

He mentioned the issue of harmonization of DMN, BPMN and CMMN, a topic that I am especially interested in, and plan to ask the vendors about later this afternoon when I am moderating the panel. That can include a common item definition model that is used by all three notations, tighter integration of DMN with BPMN gateways and CMMN sentries, and easier reference and interchange between the model types. This could also include the use of FEEL and decision tables for expressions and logic in BPMN and CMMN, such as at gateways and in data associations. We already have the problem of keeping a collection of different model types sorted out, and there may need to be a “model of models” concept to tie these together.

Concept for how to use a DMN-style decision table directly in a BPMN gateway to model the logic. From Gary Hallmark’s presentation.

Better model validation is another request for the next version of DMN; I don’t have a lot of experience with DMN, but judging by the comments here, the “null” returned for all types of errors is definitely a touchy issue. This could be improved with a “required” property in item definitions and model validation with respect to the item definitions. Additional datatypes would also be useful, such as integers and some types of ranges. There are suggestions for better ways to deal with iteration and recursion in a new version of DMN, some of which are already being done by vendors such as Oracle in their products to make it easier for business analysts to understand and model.

Something that seems simple but would break compatibility is moving to case-insensitive names (indicating just how IT-centric the original, and possibly the current, committee was), and handling some things such as single-quoted strings unambiguously. Moving to XPath-like sequences instead of the current lists also wouldn’t be compatible, nor would many types of recursion and cyclic information requirements.

As mentioned earlier, half of the top requested features could be done in 1.x because they don’t break compatibility; one option is to implement those in a 1.x version and leave the others for now. The alternative is to start the DMN 2.x RFP process, a much larger undertaking that will delay the implementation of those features but will open the door (or Pandora’s box) for a completely new version of DMN. Lots of great discussion at the end of this presentation: many of the people in the room are active contributors to the standard and/or vendors who implement the standard, so they definitely have both knowledge and opinions on the subject.

Quality of Decision Models. Jan Vanthienen, KU Leuven

Jan Vanthienen presented on measuring the quality of decision models, starting with notions of information quality, resulting in measures such as complexity and traceability. For example, you can look at the complexity of a DRD based on the number of decisions, number of elements and cyclomatic complexity; the complexity of a decision table can be based on hit policy usage and the total number of input variables.
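To make that concrete, here’s a small sketch that treats a DRD as a directed graph of decisions and input data and computes a few simple structural metrics: node and edge counts plus the classic cyclomatic complexity E - N + 2P. This is my own toy formulation for illustration, not the exact measures from the presentation.

```python
# A rough sketch of graph-style complexity metrics for a DRD, treating decisions
# and input data as nodes and information requirements as edges. The example
# DRD and the choice of metrics are my own simplifications.

drd = {
    # decision or input-data node -> the nodes it requires (information requirements)
    "Routing decision": ["Eligibility", "Risk score"],
    "Eligibility": ["Applicant data"],
    "Risk score": ["Applicant data", "Credit report"],
    "Applicant data": [],
    "Credit report": [],
}

nodes = len(drd)
edges = sum(len(requirements) for requirements in drd.values())
components = 1  # assume a single connected DRD for this toy example
cyclomatic = edges - nodes + 2 * components  # classic E - N + 2P formulation

print(f"nodes={nodes}, edges={edges}, cyclomatic complexity={cyclomatic}")
```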

Consistency and interpretability in decision tables can be measured by a unique hit table (no overlapping rules) and a natural order for easy visual reading of the rules — it is more important to be correct and consistent than compact.

There’s a contextual quality factor when we look at a decision model that is related to a process model, where the connections between the two models can be fairly complex. He presented a set of guidelines for integrating processes and decisions, including avoiding embedding decisions in gateways: something that happens all the time, in my experience with process modeling.

Integration between a process model and a decision model. From Jan Vanthienen’s presentation.

He covered some ideas on decision modeling methodology for creating the models of highest quality: usually this will involve working back and forth between the DRD and the decision tables, rather than trying to do a pure top-down or bottom-up approach. There’s a lot of past research that covers many of the issues of creating quality models, most of which predates BPMN and DMN but the same principles apply. The DMN and BPMN standards embody some of these principles, such as separating decision and process logic.