My upcoming webinar sponsored by Signavio – How to Thrive During Times of Rapid Change

This will be the fourth in a series of webinars that I’m doing for Signavio, this time focused on the high-tech industry but with lessons that span other industries. From the description:

High-Tech businesses are renowned disruptors. But what happens when the disruptors become the disrupted? For example, let’s say a global pandemic surfaces and suddenly changes your market dynamics and your business model.

Can your business handle an instant slowdown or a hyper-growth spurt? What about your operating systems? Are they nimble enough for you to scale? Can you onboard new customers en masse or handle a high volume of service tickets overnight? What about your supply chain; how agile are your systems and supplier relationships?

The first two webinars, on banking in February and insurance in March, discussed the role that intelligent processes play in improving business, with a brief mention in the March webinar of addressing business disruption caused by the pandemic. By the time we hit the third webinar on financial services in April, we had pivoted to look at the necessity of process improvement technologies and methodologies in times of business disruption such as the current crisis. Unlike a lot of industries, many high-tech sectors have been booming during the pandemic: their problems are around being able to scale up operations to meet demand without sacrificing customer service. Although they share some of the issues that I looked at in the earlier webinars, they also have unique ones where process intelligence and automation can help.

Tune in on May 20th at 11am Eastern; if you can’t make it then, sign up anyway and you’ll get a link to the on-demand version.

CelosphereLive 2020 — Day 3: extending process mining with multi-event logs and task mining

Traditionally, process mining is fed from history logs from a single system. However, most businesses aren’t run on a single system, and Celonis Product Lead for Discovery Sabeth Steiner discussed how they are allowing multi-event log process mining, where logs from multiple systems are ingested and correlated to do a more comprehensive analysis. This can be useful to find friction between parallel (inbound) procurement and (outbound) sales processes, or customer service requests that span multiple process silos. Different parallel processes appear in Celonis process discovery in different colors, with the crossover points between them highlighted.

Each of the processes can be analyzed independently, but the power comes when they are analyzed in tandem: optimizing the delivery time within an order-to-cash process while seeing the points that it interacts with the procure-to-pay process of the vendors providing materials for that order. Jessica Kaufmann, Senior Software Developer, joined Steiner to show the integrated data model that exists behind the integrated analysis of multiple processes, and how to set this up for multiple event logs. She discussed the different types of visualization: whether to visualize the different processes as a single process (by merging the event logs), or as multiple interacting processes. KPIs can also be combined, so that overall KPIs of multiple interacting processes can be tracked. Great Q&A at the end where they addressed a number of audience questions on the mechanics of using multi-event logs, and they confirmed that this will be available in the free Celonis Snap offering.
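To make the multi-event log idea concrete, here’s a minimal sketch (in pandas, not Celonis’ actual implementation; all column names and data are invented) of correlating an order-to-cash log with a procure-to-pay log on a shared purchase order reference, so that crossover points between the two processes become visible:

```python
import pandas as pd

# Hypothetical event logs from two systems; the schema is an assumption.
o2c = pd.DataFrame({
    "order_id":  ["SO-1", "SO-1", "SO-1"],
    "activity":  ["Create Sales Order", "Check Materials", "Schedule Delivery"],
    "timestamp": pd.to_datetime(["2020-04-01 09:00", "2020-04-01 09:30",
                                 "2020-04-06 10:00"]),
    "po_ref":    [None, "PO-9", "PO-9"],  # link to the vendor's P2P process
})
p2p = pd.DataFrame({
    "po_ref":    ["PO-9", "PO-9"],
    "activity":  ["Create Purchase Order", "Goods Receipt"],
    "timestamp": pd.to_datetime(["2020-04-01 10:00", "2020-04-05 14:00"]),
})
o2c["process"] = "order-to-cash"
p2p["process"] = "procure-to-pay"

# One combined log, correlated on the PO reference and ordered in time;
# a crossover is any point where consecutive events switch processes
# (the first row trivially counts as a switch).
combined = pd.concat([o2c, p2p], ignore_index=True).sort_values("timestamp")
combined["crossover"] = combined["process"].ne(combined["process"].shift())
print(combined[["timestamp", "process", "activity", "po_ref", "crossover"]])
```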

Another analysis capability not traditionally covered by process mining is task mining: what are the users doing on the desktop to interact between multiple systems? Andreas Plieninger, Product Manager, talked about how they capture user interaction data with their new Celonis Task Mining. I’ve been seeing user interaction capture being done by a few different vendors, both process mining/analysis and RPA vendors, and this really is the missing link in understanding processes: lack of this type of data capture is the reason that I spend a lot of time job-shadowing when I’m looking at an enterprise customer’s processes.

Task Mining is installed on the user’s desktop (Windows only for now), and when certain white-listed applications are used, the interaction information is captured as well as data from the desktop files, such as Excel spreadsheets. AI/ML helps to group the activities together and match them to other system processes, providing context for analysis. “Spyware” that tracks user actions on the desktop is not uncommon in productivity monitoring, but Celonis Task Mining is a much more secure and restricted version of that, capturing just the data required for analyzing processes, and respecting the privacy of both the user and the data on their screen.

Once the user interaction data is captured, it can be analyzed in the same way as a process event log: it can discover the process and its variants, and trigger alerts if process compliance rules are violated. It’s in the same data layer as process mining data, so it can be analyzed and exposed using the same AI, boards and apps structure as process data. Task Mining also captures screen snapshots to show what was actually happening as the user clicked around and entered data, and can be used to check what the user was seeing while they were working. This can be used to determine root causes for the longer-running variants, find opportunities for task automation, and check compliance.
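As a toy illustration of what task mining data looks like once it has been grouped into activities (Celonis uses AI/ML for this grouping and matching; the naive application-switch heuristic and all fields below are my own invention), consider turning raw click-level events into task-level entries:

```python
import pandas as pd

# Invented desktop interaction capture; fields are assumptions for illustration.
ui_events = pd.DataFrame({
    "user":      ["alice"] * 5,
    "app":       ["SAP GUI", "SAP GUI", "Excel", "Excel", "SAP GUI"],
    "action":    ["open VA01", "enter order", "open sheet", "copy price",
                  "paste price"],
    "timestamp": pd.to_datetime([
        "2020-05-05 09:00:00", "2020-05-05 09:01:10", "2020-05-05 09:02:00",
        "2020-05-05 09:02:30", "2020-05-05 09:03:00"]),
}).sort_values("timestamp")

# Naive grouping: start a new "task" whenever the user switches application.
ui_events["task_id"] = ui_events["app"].ne(ui_events["app"].shift()).cumsum()

task_log = (ui_events.groupby(["user", "task_id", "app"])
            .agg(start=("timestamp", "min"), end=("timestamp", "max"),
                 actions=("action", "count"))
            .reset_index())
print(task_log)  # now analyzable like any other process event log
```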

Plieninger showed a use case for finding automation opportunities in a procure-to-pay process, similar to the concept of multi-event logs where one of those logs is the user interaction data. The user interaction data is treated a bit differently, however, since it represents manual activities where you may want to apply automation. A Celonis automation could then be used to address some of the problem areas identified by the task mining, where some of the cases are completely automated, while others require human intervention. This ability to triage cases, sending only those that really need human input for someone to process, while automatically pushing actions back to the core systems to complete the others automatically, can result in significant cost savings and shortened cycle time.

Celonis Task Mining is still in an early adopter program, but is expected to be in beta by August 2020 and generally available in November. I’m predicting a big uptake in this capability, since remote work is removing the ability to use techniques such as job shadowing to understand what steps workers are taking to complete tasks. Adding Task Mining data to Process Mining data creates the complete picture of how work is actually getting done.

That’s it for me at CelosphereLive 2020; you can see replays of the presentation videos on the conference site, with the last of them likely to be published by tomorrow. Good work by Celonis on a marathon event: this ran for several hours per day over three days, although the individual presentations were pre-recorded then followed by live Q&A. Lots of logistics and good production quality, but it could have had better audience engagement through a more interactive platform such as Slack.

CelosphereLive 2020 — Day 3: Process AI for automation

I started my day early to see Dr. Steffen Schenk, Celonis Head of Product Management, talk about the Celonis Action Engine and process automation. In short, they are seeing that because they integrate directly with core systems (especially ERP systems, which have their own processes built in), they can do things that RPA and BPM systems can’t do: namely, data-driven sense and act capabilities. However, these processes are only as timely as the data connection from the core systems into Celonis, so there can be latency.

He walked through an example of an order management process where he filtered SAP order data to show those with on-time delivery problems, due to order approval or credit check required, and created a query to detect those conditions in the future. Then, he created a process automation made up of system connectors that would be triggered based on a signal from that query in the future. In addition to system connectors (including webhooks), the automation can also directly call Celonis actions that might prompt a user to take an action. The automation can do branching based on data values: in his example, a customer credit block was automatically removed if the customer had a history of on-time payment, and that data was pushed back to SAP. That, in turn, would cause SAP to move the invoice along: it’s effectively a collaborative process automation between SAP and Celonis. The non-automated path sends a task to an order manager to approve or deny the credit, which in turn will trigger other automated actions. This process automation is now a “Skill” in Celonis, and can be set to execute for all future SAP order data that flows through Celonis.
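The sense-and-act pattern he demonstrated boils down to something like the following sketch. The Python API here is entirely invented (Skills are configured in Celonis’ own tooling, not written as code), but it captures the branching logic:

```python
class FakeERP:
    """Stand-in for the SAP connector that pushes updates back."""
    def release_credit_block(self, order_id):
        print(f"[ERP] credit block released on {order_id}")

def has_on_time_payment_history(customer_id, payment_history):
    # Placeholder: in practice this is a query over the mined payment data.
    records = payment_history.get(customer_id, [])
    return bool(records) and all(p["on_time"] for p in records)

def create_user_task(assignee, action, order_id):
    print(f"[Task] {assignee}: {action} for {order_id}")

def on_order_signal(order, erp, payment_history):
    """Triggered when the monitoring query detects a blocked order."""
    if has_on_time_payment_history(order["customer_id"], payment_history):
        # Automated branch: release the block and push the update to SAP,
        # which then moves the order along (collaborative automation).
        erp.release_credit_block(order["order_id"])
    else:
        # Non-automated branch: route to a human, whose decision can in
        # turn trigger further automated actions.
        create_user_task("order_manager", "approve or deny credit",
                         order["order_id"])

history = {"C-42": [{"on_time": True}, {"on_time": True}]}
on_order_signal({"order_id": "SO-1", "customer_id": "C-42"}, FakeERP(), history)
```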

Once this automation has been set up, the before and after processes can be compared: we see a higher degree of automation that has led to improving the on-time delivery KPI without increasing risk. It’s data-driven, so that only customers that continue to have an on-time payment record will be automatically approved for credit on a specific order. This is an interesting approach to automation that provides more comprehensive task automation than RPA, and a better fit than BPM when processes primarily exist in a line-of-business or legacy system. If you have an end-to-end process to orchestrate and need a comprehensive model, then BPM may be a better choice, but there are a lot of interesting applications for the Celonis model of automating the parts of an existing process that the core system would have “delegated” to human action. I can definitely see applications for this in insurance claims, where most of the claim process lives in a third-party claims management system, but there are many decisions and ancillary processes that need to happen around that system.

This level of automation can be set up by a trained Celonis analyst: if you’re already creating analysis and dashboards, then you have the level of skill required to create these automations. This is also available both for cloud and on-premise deployments. There was an interesting discussion in the Q&A about credentials for updating the connected systems: this could be done with the credentials of the person who approves a task to execute (attended automation) or with generic system credentials for fully-automated tasks.

This was a really fascinating talk and a vision of the future for this type of process automation, where the core process lives within an off-the-shelf or legacy system, and there’s a need to do additional automation (or recommendations) of supporting decisions and actions. Very glad that I got up early for the 7:15am start.

I listened in on the following talk on machine learning and intelligent automation by Nicolas Ilmberger, Celonis Senior Product Manager of Platform and Technology, where he showed some of their pre-built ML tools such as duplicate checkers (for duplicate invoices, for example), root cause analysis and intelligent audit sampling. These are used to detect specific scenarios in the data flowing into Celonis, then either raise an action to a user or automate an action in the background. They have a number of pre-configured connectors and filters, for example, to find a duplicate vendor invoice in an SAP system; these will usually need some customization since many organizations have modified their SAP systems.

He showed a demonstration of using some of these tools, and also discussed a case study of a manufacturing customer that had significant cost savings due to duplicate invoice checking: their ERP system only found exact matches, but slight differences in spelling or other typographical errors meant that true duplicates escaped detection and needed more intelligent comparison. A second case study was for on-time delivery by an automotive supplier, where customer orders at risk were detected and signals sent to customer service with recommendations for the agent for resolution.
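The near-duplicate problem is easy to picture: a minimal sketch of fuzzy matching with Python’s standard library (the real Celonis duplicate checker is ML-based; the fields and threshold here are assumptions) shows how a single substituted character defeats exact matching:

```python
from difflib import SequenceMatcher
from itertools import combinations

invoices = [
    {"id": 1, "vendor": "Acme Industries GmbH", "ref": "INV-2020-0117", "amount": 1540.00},
    {"id": 2, "vendor": "ACME Industries GmbH", "ref": "INV-2020-O117", "amount": 1540.00},  # letter O vs zero
    {"id": 3, "vendor": "Globex Corp",          "ref": "GX-33",         "amount": 99.50},
]

def similarity(a, b):
    # Compare a normalized vendor+reference key; 1.0 means identical.
    key_a = f'{a["vendor"].lower()}|{a["ref"].lower()}'
    key_b = f'{b["vendor"].lower()}|{b["ref"].lower()}'
    return SequenceMatcher(None, key_a, key_b).ratio()

# Exact matching would miss invoices 1 and 2; fuzzy comparison flags them.
for a, b in combinations(invoices, 2):
    if a["amount"] == b["amount"] and similarity(a, b) > 0.9:
        print(f'Possible duplicate: invoice {a["id"]} and invoice {b["id"]}')
```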

It’s important to note that both these ML tools and the process automation that we saw in the previous session are only as timely as the data connection from the core processing system to Celonis: if you’re only doing daily data feeds from SAP to Celonis, for example, that’s how often these decisions and automations will be triggered. For orders of physical goods that may take several days to fulfill, this is not a problem, but this is not a truly real-time process due to that latency. If an order has already moved on to the next stage in SAP before Celonis can act, for example, there will need to be checks to ensure that any updates pushed back to SAP will not negatively impact the order status.

There was a studio discussion following with Hala Zeine and Sebastian Walter. Zeine said that customers are saying “we’re done with discovery, what’s next?”, and have the desire to leverage machine learning and automation for day-to-day operations. This drove home the point that Celonis is repositioning from being an analysis tool to an operational tool, which gives them a much broader potential in terms of number of users and applications. Procure-to-pay and order-to-cash processes are a focus for them, and every mid-sized and large enterprise has problems with these processes.

The next session was with Stefan Maurer, Vice President of Enterprise Effectiveness for AVNET, a distributor of electronic components. He spoke about how they are using Celonis in their procure-to-pay process to react to supplier delivery date changes due to the impact of COVID-19 on global supply chains. He started with a number of good points on organizational readiness and how to take on process mining and transformation projects. He walked us through their process improvement maturity lifecycle, showing what they achieved with fundamental efforts such as Lean Six Sigma, then where they started adding Celonis to the mix to boost the maturity level. He said that they could have benefited from adding Celonis a bit earlier in their journey, but feels that people need a basic understanding of process improvement before adding new tools and methodologies. In response to an audience question later, he clarified that this could be done earlier in an organization that is ready for process improvement, but the results of process mining could be overwhelming if you’re not already in that mindset.

Their enterprise effectiveness efforts focus on the activities of different team members in a cycle of success that takes business ideas and needs through analysis stages and into implementation within tools and procedures. At several points in that cycle, Celonis is used for process mining but not automation; they are using Kofax and UiPath for RPA as their general task automation strategy.

Maurer showed a case study for early supplier deliveries: although early deliveries might seem like a good thing, if you don’t have an immediate use for the goods and the supplier invoices on delivery, this can have a working capital impact. They used Celonis to investigate details of the deliveries to determine the impact, and identify the target suppliers to work with on resolving the discrepancies. They also use Celonis to monitor procure-to-pay process effectiveness, using a throughput time KPI compared across time windows a year apart: in this case, they are using the analytical capabilities to show the impact of their overall process improvement efforts. By studying the process variants, they can see what factors are impacting their effectiveness. They are starting to use the Celonis Action Engine for some delivery alerts, and hope to use more Celonis alerts and recommendations in the future.

Almost accidentally, Celonis also provided an early warning of the changes in the supply chain due to COVID-19. Using the same type of data set as they used for their early delivery analysis, they were able to find which suppliers and materials had a significant delay to their expected deliveries. They could then prioritize the needs of their medical and healthcare customers, manually intervening in their system logic to shift their supply chain to guarantee those customers while delaying others. He thinks that additional insights into materials acquisition supply chains will help them through the crisis.

I’m taking a break from Celosphere to attend the online Alfresco Modernize event, but I plan to return for a couple of the afternoon sessions.

CelosphereLive 2020 – Day 2: From process mining to intelligent operations

I’m back for the Celonis online conference, CelosphereLive, for a second day. They started much earlier since they are running on a European time zone, but I started in time to catch the Q&A portion of Ritu Nibber’s presentation (VP of Global Process and Controls at Reckitt Benckiser) and may go back to watch the rest of it since there were a lot of interesting questions that came up.

There was a 15-minute session back in their studio with Celonis co-CEO Bastian Nominacher and VP of Professional Services Sebastian Walter, then on to a presentation by Peter Tasev, SVP of Procure to Pay at Deutsche Telekom Services Europe. DTSE is a shared services organization providing process and service automation across many of their regional organizations, and they are now using Celonis to provide three key capabilities to their “process bionics”:

  1. Monitor the end-to-end operation and efficiency of their large, heterogeneous processes such as procure-to-pay. They went through the process of identifying the end-to-end KPIs to include in an operational monitoring view, then used the dashboard and reports to support data-driven decisions.
  2. Use of process mining to “x-ray” their actual processes, allowing for process discovery, conformance checking and process enhancement.
  3. Track real-time breaches of rules in the process, and alert the appropriate people or trigger automated activities.

Interesting to see their architecture and roadmap, but also how they have structured their center of excellence with business analysts being the key “translator” between business needs and the data analysts/scientists, crossing the boundary between the business areas and the CoE.

He went through their financial savings, which were significant, and also mentioned the ability of process mining to identify activities that were not necessary or could be automated, thereby freeing up the workforce to do more value-added activities such as negotiating prices. Definitely worth watching the replay of this presentation to understand the journey from process mining to real-time operational monitoring and alerting.

It’s clear that Celonis is repositioning from just process mining — a tool for a small number of business analysts in an organization — into operational process intelligence that would be a daily dashboard tool for a much larger portion of the workforce. Many other process mining products are attempting an equivalent pivot, although Celonis seems to be a bit farther along than most.

Martin Klenk, Celonis CTO, gave an update on their technology strategy, with an initial focus on how the Celonis architecture enables the creation of these real-time operational apps: real-time connectors feed into a data layer, which is analyzed by the Process AI Engine, and then exposed through Boards that integrate data and other capabilities for visualization. Operational and analytical apps are then created based on Boards. Although Celonis has just released two initial Accounts Payable and Supply Chain operational apps, this is something that customers and partners can build in order to address their particular needs.

He showed how a custom operational app can be created for a CFO, using real-time connectors to Salesforce for order data and Jira for support tickets. He showed their multi-event log analytical capability, which makes it much easier to bring together data sources from different systems and automatically correlate them without a lot of manual data cleansing — the links between processes in different systems are identified without human intervention. This allows detection of anomalies that occur on boundaries between systems, rather than just within systems.

Signals can be created based on pre-defined patterns or from scratch, allowing a real-time data-driven alert to be issued when required, or an automation push to another system to be triggered. This automation capability is a critical differentiator, allowing for a simple workflow based on connector steps, and can replace the need for some other process automation technologies such as RPA in cases where those are not a good fit.

He was joined by Martin Rowlson, Global Head of Process Excellence at Uber; they are consolidating data from all of their operational arms (drive, eats, etc.) to analyze their end-to-end processes, and using process mining and task mining to identify areas for process improvement. They are analyzing some critical processes, such as driver onboarding and customer support, to reduce friction and improve the process for both Uber and the driver or customer.

Klenk’s next guest was Philipp Grindemann, head of Business Development at Lufthansa CityLine, discussing how they are using Celonis to optimize their core operations. They track maintenance events on their aircraft, plus all ground operations activities. Ground operations are particularly complex due to the high degree of parallelism: an aircraft may be refueled at the same time that cargo is being loaded. I have to guess that their operations are changing radically right now and they are having to re-structure their processes, although that wasn’t discussed.

His last guest was Dr. Lars Reinkemeyer, author of Process Mining in Action — his book has collected and documented many real-world use cases for process mining — to discuss some of the expected directions of process mining beyond just analytics.

They then returned to a studio session for a bit more interactive Q&A; the previous technology roadmap keynote was pre-recorded and didn’t allow for any audience questions, although I think that the customers that he interviewed will have full presentations later in the conference.

#CelosphereLive lunch break

As we saw at CamundaCon Live last week, there is no break time in the schedule: if you want to catch all of the presentations and discussions in real time, be prepared to carry your laptop with you everywhere during the day. The “Live from the Studio” sessions in between presentations are actually really interesting, and I don’t want to miss those. Today, I’m using their mobile app on my tablet just for the streaming video, which lets me take screenshots as well as carry it around with me, then using my computer for blogging, Twitter, screen snap editing and general research. This means that I can’t use their chat or Q&A functions since the app does not let you stream the video and use the chat at the same time, and the chat wasn’t very interesting yesterday anyway.

The next presentation was by Zalando, a European online fashion retailer, with Laura Henkel, their Process Mining Lead, and Alejandro Basterrechea, Head of Procurement Operations. They have moved beyond just process mining, and are using Celonis to create machine learning recommendations to optimize procurement workflows: the example that we saw provided Amazon-like recommendations for internal buyers. They also use the process automation capabilities to write information back to the source systems, showing how Celonis can be used for automating multi-system integration where you don’t already have process automation technology in place to handle this. Their key benefits in adding Celonis to their procurement processes have been efficiency, quality and value creation. Good interactive audience Q&A at the end where they discussed their journey and what they have planned next with the ML/AI capabilities. It worked well with two co-presenters, since one could be identifying a question for their area while the other was responding to a different question, leaving few gaps in the conversation.

We broke into two tracks, and I attended the session with Michael Götz, Engineering Operations Officer at Celonis, providing a product roadmap. He highlighted their new operational apps, and how they collaborated with customers to create them from real use cases. There is a strong theme of moving from just analytical apps to operational apps that sense and act. He walked through a broad set of the new and upcoming features, starting with data and connectivity, through the process AI engine, and on to boards and the operational apps. I’ve shown some of his slides that I captured below, but if you’re a Celonis customer, you’ll want to watch this presentation and hear what he has to say about specific features. Pretty exciting stuff.

I skipped the full-length Uber customer presentation to see the strategies for how to leverage Celonis when migrating legacy systems such as CRM or ERP, presented by Celonis Data Scientist Christoph Hakes. As he pointed out, moving between systems isn’t just about migrating the data, but also requires changing (and improving) processes. One of the biggest areas of risk in these large-scale migrations is around understanding and documenting the existing and future-state processes: if you’re not sure what you’re doing now, then likely anything that you design for the new system is going to be wrong. 60% of migrations fail to meet the needs of the business, in part due to that lack of understanding, and 70% fail to achieve their goals due to resistance from employees and management. Using process mining to explore the actual current process and — more importantly — understand the variants means that at least you’re starting from an accurate view of the current state. They’ve created a Process Repository for storing process models, including additional data and attachments.

Hakes moved on to talk about their redesign tools, such as process conformance checking to align the existing processes to the designed future state. After rollout, their real-time dashboards can monitor adoption to locate the trouble spots, and send out alerts to attempt remediation. All in all, they’ve put together a good set of tools and best practices: their customer Schlumberger saved $40M on their migration by controlling costs, driving user adoption and performing ongoing optimization using Celonis. Large-scale ERP system migration is a great use case for process mining in the pre-migration and redesign areas, and Celonis’ monitoring capabilities also make it valuable for post-migration conformance monitoring.

The last session of the day was also a dual track, and I selected the best practices presentation on how to get your organization ready for process mining, featuring Celonis Director of Customer Success Ankur Patel. The concurrent session was Erin Ndrio on getting started with Celonis Snap, and I covered that based on a webinar last month. Patel’s session was mostly for existing customers, although he had some good general points on creating a center of excellence, and how to foster adoption and governance for process mining practices throughout the organization. Some of this was about how a customer can work with Celonis, including professional services, training courses, the partner network and their app store, to move their initiatives along. He finished with a message about internal promotion: you need to make people want to use Celonis because they see benefits to their own part of the business. This is no different than the internal evangelism that needs to be done for any new product or methodology, but Patel actually laid out methods for how some of their customers are doing this, such as road shows, hackathons and discussion groups, and how the Celonis customer marketing team can help.

He wrapped up with thoughts on a Celonis CoE. I’m not a big fan of product-specific CoEs, instead believing that there should be a more general “business automation” or “process optimization” CoE that covers a range of process improvement and automation tools. Otherwise, you tend to end up with pockets of overlapping technologies cropping up all over a large organization, and no guidance on how best to combine them. I wrote about this in a guest post on the Trisotech blog last month. I do think that Patel had some good thoughts on a centralized CoE in general to support governance and adoption for a range of personas.

I will check back in for a few sessions tomorrow, but have a previous commitment to attend Alfresco Modernize for a couple of hours. Next week is IBM Think Digital, the following week is Appian World, then Signavio Live near the end of May, so it’s going to be a busy few weeks. This would normally be the time when I am flying all over to attend these events in person, and it’s nice to be able to do it from home although some of the events are more engaging than others. I’m gathering a list of best practices for online conferences, including the things that work and those that don’t, and I’ll publish that after this round of virtual events. So far, I think that Camunda and Celonis have both done a great job, but for very different reasons: Camunda had much better audience engagement and more of a “live” feel, while Celonis showed how to incorporate higher production quality and studio interviews to good effect, even though I think it’s a bit early to be having in-person interviews.

CelosphereLive 2020 – Day 1

I expect to be participating in a lot of virtual vendor conferences over the next months, and today I tuned in to the Celonis CelosphereLive. They are running on European time, with today’s half day starting at a reasonable 9am Eastern North American time, but the next two days will be starting at 4am my time; I may be catching some of the sessions on demand.

We had a keynote from co-CEO Alexander Rinke that included a short discussion with the godfather of process mining, Wil van der Aalst. I liked Rinke’s characterization that every process in every company is waiting to be improved: this is what process mining (aka process intelligence, depending on which vendor is talking) is all about in terms of discovering processes. Miguel Milano, Celonis Chief Revenue Officer, joined him to talk about their new Celonis Value Architects certification program. The fact that this announcement takes such a prominent place in the keynote highlights that there’s still a certain amount of expertise required to do process mining effectively, even though the tools have become much easier to use.

There were also some new product announcements, first around the availability of their real-time data connectors. This is a direction that many of the process mining vendors are taking, moving from just an analytical process discovery role to more of an operational monitoring process intelligence role. Chief Product Officer Hala Zeine joined Rinke to talk about their connectivity — out of the box, the product connects to 80 different data sources — and their process AI engine that fits the data to a set of desired outcomes and makes recommendations. Their visualization boards then let you view the analysis and explore root causes of problem areas.

Their process AI engine does some amount of automation, and they have just released operational apps that help to automate some activities of the workflow. These operational apps are an overlay on business processes that monitor work in progress, and display analysis of the state of the (long-running) processes that they are monitoring. The example shown was an Accounts Payable operational app that looks at invoices that are scheduled for payment, and allows a financial person to change parameters (such as date of payment) in bulk, which would then push that update back to the underlying A/P system. Think of these operational apps as smart dashboards, where you can do some complex analysis for monitoring work in progress, and also push some updates and actions back to the underlying operational system. These first two apps are already available to Celonis customers in their app store, and tomorrow there will be a session with the CTO showing how to build your own operational app.

To finish off the day we had two product demos/discussions. First was JP Thomsen, Celonis’ VP Business Models, giving a more in-depth demo of their Accounts Payable operational application. He was joined by Jan Fuhr, Process Mining Lead at global healthcare company Fresenius Kabi, which collaborated on the creation of the A/P operational application; Fuhr discussed their process mining journey and how they are now able to better support their A/P function and manage cash flow. The sweet spot for these operational apps appears to be when you don’t have end-to-end management of your process with another system such as a BPMS: the operational app monitors what’s happening in any core systems (such as SAP) and replaces ad hoc “management by spreadsheet” with AI and rules that highlight problem areas and make suggestions for remediation. They’ve had some great cost savings through taking advantage of early-payment discounts and optimizing their payment terms.

Last up was Trevor Miles, Celonis’ head of Supply Chain and Manufacturing Solutions, talking about the supply chain operational application: obviously these operational apps are a big deal for Celonis and their customers, since they’ve been the focus of most of this first half-day. Process mining can provide significant value in supply chain management since it typically involves a number of different systems without an explicit end-to-end process orchestration, and can have a lot of exceptions or variants. Understanding those variants and being able to analyze and reroute things on the fly is critical to maintaining a resilient supply chain. This has been highlighted during the COVID-19 pandemic, where supply chains have been disrupted, overloaded or underused, depending on the commodity and the route.

Process mining is used to generate a digital twin for the supply chain, which can then be used to analyze past performance and use as a point of comparison with current activities. The Celonis operational app for supply chain is about closing the gap between sensing and actions, so that process mining and simulation isn’t just an analytical tool, but a tool for guiding actions to improve processes. It’s also a tool for bridging across multiple systems of the same type: many large organizations have, for example, multiple instances of SAP for different parts of their processes, and need to knit together all of that information to make better decisions.

Not quite social distancing…

They finished up with a discussion in the studio between Hala Zeine, co-CEO Bastian Nominacher and CTO Martin Klenk, covering some of the new announcements and what’s coming up in the next two days. I’ll be back for some of the sessions tomorrow, although likely not before 8am Eastern.

A few notes on the virtual conference format. Last week’s CamundaCon Live had sessions broadcast directly from each speaker’s home plus a multi-channel Slack workspace for discussion: casual and engaging. Celonis has made it more like an in-person conference by live-broadcasting the “main stage” from a studio with multiple camera angles; this actually worked quite well, and the moderator was able to inject live audience questions. Some of the sessions appeared to be pre-recorded, and there’s definitely not the same level of audience engagement without a proper discussion channel like Slack — at an in-person event, we would have informal discussions in the hallways between sessions that just can’t happen in this environment. Unfortunately, the only live chat is via their own conference app, which is mobile-only and has a single chat channel, plus a separate Q&A channel (via in-app Slido) for speakers that is separated by session and is really more of a webinar-style Q&A than a discussion. I abandoned the mobile app early and took to Twitter. I think the Celosphere model is probably what we’re going to see from larger companies in their online conferences, where they want to (attempt to) tightly control the discussion and demonstrate the sort of high-end production quality that you’d have at a large in-person conference. However, I think there’s an opportunity to combine that level of production quality with an open discussion platform like Slack to really improve the audience experience.

Snap! @Celonis offers free process mining in the cloud

I attended a webinar today where Celonis showed Snap, the new free tier of their cloud-based process mining platform. It can work with flat files (CSV) or connect directly to ServiceNow, Google Sheets and a few other data sources, plus it supports teams for collaboration. You’re limited to uploading 500MB of data (which is a lot of records when you’re uploading just case ID, activity name, resource, and start and end times), and there’s no continuous data integration for event collection the way there is with their full-blown IBC Enterprise version; additionally, several functions from the main product are visible but not enabled. However, if you want to dip your toe into process mining with real-sized datasets, this is a good choice.
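If you’re wondering what that minimal event data looks like, here’s an illustrative flat-file log and how to load it with pandas (column names vary by tool; these are just for illustration):

```python
import io
import pandas as pd

# A minimal flat-file event log of the kind a process mining tool ingests;
# the exact required columns vary by product, these are illustrative.
csv_data = """case_id,activity,resource,start_time,end_time
2001,Receive Order,system,2020-04-01 09:00,2020-04-01 09:00
2001,Credit Check,mary,2020-04-01 09:05,2020-04-01 09:45
2001,Ship Order,warehouse,2020-04-02 14:00,2020-04-02 15:30
2002,Receive Order,system,2020-04-01 10:00,2020-04-01 10:00
2002,Ship Order,warehouse,2020-04-01 16:00,2020-04-01 17:00
"""
log = pd.read_csv(io.StringIO(csv_data), parse_dates=["start_time", "end_time"])
print(log.groupby("case_id")["activity"].apply(list))  # one trace per case
```

At well under 100 bytes per event, a 500MB limit leaves room for millions of events.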

The process mining functionality is similar to what you see with competitive products, plus some nice dashboard capabilities for visualizing not just the overall discovered process flow, but for drilling into problem areas to do root cause analysis.

You can sign up for a free Celonis Snap account on their website. It self-deploys within a couple of minutes, creating a team account and setting you up as the admin, with the ability to add other users to the team. It doesn’t support everything in the paid platform, but it’s definitely a good way to get started with process mining. There’s also an online community to ask (and answer) questions.

They are also offering free online training (not sure if that’s just now or if they will do this on an ongoing basis) that covers their full paid product suite; if you’re using Snap, the parts of the training related to process discovery and analysis will be relevant. They are launching free Snap-specific training next week, and adding new features to the product such as connectors to SAP Ariba. Obviously, they’d like to leverage free accounts into paid accounts, but this is more than just a 30-day trial or non-functional demo account; you can do some real work with the free version and move to the paid version when your data or processing needs exceed that.

APQC webinar: 2020 process and performance management priorities, with @hlykehogland

I listened in on a webinar today with APQC’s Holly Lyke-Ho-Gland looking at the results of their 2020 process and performance management priorities survey (conducted in late 2019). Some good insights here, looking at the top three challenges in business process management and continuous improvement. Process modeling and mining vendors will be happy to see that the highest priority challenge in BPM is defining and mapping end-to-end processes.

She covered a number of tips and solutions to address these challenges, including points on developing end-to-end processes, how to develop a culture of continuous improvement, and governance alignment. She included a lot of great case studies and examples across all of these areas, and what types of resources and expertise are required to achieve them.

After covering the business process management and continuous improvement side, she moved on to discuss the organizational performance management challenges and solutions. Performance management is more about analytics and metrics, and using those measures to support decision making; apparently this year’s challenges are the same as last year’s, meaning that organizations are still struggling with these issues.

Some interesting points here about change management plans and what needs to be done in order to be successful in performance management; check out the webinar replay for details.

The last part of the webinar was on their “special interest” section, which this year is process management. The first point was on the purpose of process teams and work, the most important of which is supporting strategic initiatives. This is definitely what I see in my own consulting practice, with process gaining a much higher profile as companies focus on digital transformation efforts: at their heart, many transformation efforts are process-centric. The APQC research also showed information on measuring process work, and she notes (as I often see) that the top measures are still focused on bottom-line savings rather than more strategic measures, meaning that process metrics are misaligned with strategic focus. She also covered the impact of technology on process work: not just process automation, but collaboration, data management and visualization, and cloud computing topped the technology list, since they are looking at the entire process management lifecycle. She made a direct call-out to process mining (although it wasn’t in the top five list) as a cross-over between data analysis and process modeling; I’m definitely in agreement with that as you can see from my post earlier this week.

She finished with a summary of the survey results, plus a peek at their research agenda for 2020 with lots of interesting and timely topics. I like that their research uses a lot of real-world case studies.

I couldn’t find a direct link to the webinar replay yet, but it will likely be available on APQC’s On-Demand Webinars page soon; definitely worth checking out for Lyke-Ho-Gland’s insights and discussion. While you’re over there, check out their Process and Performance Management Conference, coming up in October. I spoke at their conference back in 2013 and really enjoyed the experience: good sessions, and a smaller conference that’s great for networking.

Process is cool (again), and the coolest kid on the block is process mining

I first saw process mining software in 2008, when Fujitsu was showing off their process discovery software/services package, plus an interesting presentation by Anne Rozinat from that year’s academic BPM conference where she tied in concepts of process mining and simulation without really using the term process mining or discovery. Rozinat went on to form Fluxicon, which developed one of the earliest process mining products and really opened up the market, and she spent time with me providing my early process mining education. Fast forward 10+ years, and process mining is finally a hot topic: I’m seeing it from a few mining-only companies (Celonis), and as a part of a suite from process modeling companies (Signavio) or even a larger process automation suite (Software AG). Eindhoven University of Technology, arguably the birthplace of process mining, even offers a free process mining course which is quite comprehensive and covers usage as well as many of the underlying algorithms — I did the course and found it offered some great insights and a few challenges.

Today, Celonis hosted a webinar, featuring Rob Koplowitz of Forrester in conversation with Celonis’ CMO Anthony Deighton, on the role of process improvement in improving digital operations. Koplowitz started with some results from a Forrester survey showing that digital transformation is now the primary driver of process improvement initiatives, and the importance of process mining in that transformation. Process mining continues its traditional role in process discovery and conformance checking but also has a role in process enhancement and guidance. Lucky for those of us who focus on process, process is now cool (again).

Unlike just examining analytics for the part of a process that is automated in a BPMS, process mining allows for capturing information from any system and tracing the entire customer journey, across multiple systems and forms of interaction. Process discovery using a process mining tool (like Celonis) lets you take all of that data and create consolidated process models, highlighting the problem areas such as wait states and rework. It’s also a great way to find compliance problems, since you’re looking at how the processes actually work rather than how they were designed to work.
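If you want to experiment with this kind of discovery outside of a commercial tool, the open-source pm4py library does the same basic job on a consolidated event log; a minimal sketch, assuming pm4py’s 2.x simplified interface:

```python
import pandas as pd
import pm4py  # open-source process mining library

# Toy consolidated event data; a real log would be pulled from multiple systems.
df = pd.DataFrame({
    "case":     ["c1", "c1", "c1", "c2", "c2", "c2", "c2"],
    "activity": ["Submit", "Review", "Approve",
                 "Submit", "Review", "Rework", "Review"],
    "when":     pd.to_datetime([
        "2020-05-01 09:00", "2020-05-01 10:00", "2020-05-01 11:00",
        "2020-05-01 09:30", "2020-05-01 10:30", "2020-05-02 09:00",
        "2020-05-03 09:00"]),
})
log = pm4py.format_dataframe(df, case_id="case", activity_key="activity",
                             timestamp_key="when")

# Discover the directly-follows graph: a rework problem area shows up as a
# Review -> Rework -> Review loop in case c2.
dfg, start_activities, end_activities = pm4py.discover_dfg(log)
print(dfg)
```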

Koplowitz had some interesting insights and advice in his presentation, not the least of which was to engage business experts to drive change and automation, not just technologists, and use process analytics (including process mining) as a guide to where problems lie and what should/could be automated. He showed how process mining fits into the bigger scope of process improvement, contributing to the discovery and analysis stages that are a necessary precursor to reengineering and automation.

Good discussion on the webinar, and there will probably be a replay available if you head to the landing page.

DecisionCAMP 2019: DMN TCK, BPO with AI and rules, and business logic hidden in spreadsheets

Close Is Not Close Enough. Keith Swenson, Fujitsu

A few months ago at bpmNEXT, I saw Keith Swenson give an update on the DMN Technology Compatibility Kit, and we’re seeing a bit of a repeat of that presentation here at DecisionCAMP. The TCK defines a set of test cases (as DMN decision models, input data and expected results) that assure conformance to the specification, plus a sample runner application that will pass the models and data to the vendor’s engine and evaluate the results.

DMN TCK. From Keith Swenson’s presentation.

There are about 120 test models and 1600 test cases, supporting only DMN 1.2; these tests come from examining the specification as well as cases from practice. It’s easy for a vendor to get involved in the TCK, both in terms of running it against their engine and in terms of participating through submitting new test models and cases. You can see the vendors that have submitted their results; although many more vendors claim that they “have DMN”, their actual level of compatibility may be suspect.
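Conceptually, the runner is just a compare loop over the test cases; a hedged sketch (the actual TCK runner is a Java application that reads the TCK’s XML test-case files, and `ToyEngine` below is obviously a stand-in for a real DMN engine):

```python
def run_tck(test_cases, engine):
    """For each test case, evaluate the engine on the model and compare results."""
    passed, failed = 0, 0
    for tc in test_cases:
        actual = engine.evaluate(tc["model"], tc["inputs"])
        if actual == tc["expected"]:
            passed += 1
        else:
            failed += 1
            print(f'FAIL {tc["model"]}: expected {tc["expected"]}, got {actual}')
    print(f"{passed} passed, {failed} failed")

class ToyEngine:
    """A trivial stand-in 'engine' that handles one hard-coded decision."""
    def evaluate(self, model, inputs):
        if model == "adult.dmn":
            return {"isAdult": inputs["age"] >= 18}
        raise NotImplementedError(model)

run_tck([{"model": "adult.dmn", "inputs": {"age": 21},
          "expected": {"isAdult": True}}], ToyEngine())
```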

The TCK committee is getting ready for DMN 1.3, and considering tests for modeling tools in addition to the current tests for the engine. He also floated the idea of a standardized API for DMN as a service, so that the calling application doesn’t need to know which engine it’s calling — possibly something that’s not going to be a big hit with vendors.

Business innovation of BPO realized by Task Center and AI and Rule Engine. Yoshihito Nakayama, NTT DATA INTRAMART

Yoshihito Nakayama presented on the current challenges of BPO with respect to improving productivity, and how they are resolving this using AI and a rules engine to aggregate and assign human tasks from multiple systems to different team members. This removes the requirement to manually review and assign work, and also provides a dashboard for visualizing work in progress and future forecasts.

Intramart’s Task Center for aggregating and assigning work. From Yoshihito Nakayama’s presentation.

AI is used to predict and optimize task classification and assignment, based on time required to complete the task and the individual worker skill level and productivity. It is also used to predict workload for task types and individual workers.

Their visualization dashboard shows drilldowns on current and past productivity, plus future forecasts. The simulation models for forecasting can be fine-tuned to optimize for cost, performance and other factors. It brings together work monitoring from all systems, including RPA processes. They’re also using process mining on a variety of systems to create a digital twin of the organization for better tracking and predictions, as well as other tools such as voice and image identification to recognize what tasks are being done that are not being recorded in any system logs.

They have a variety of case studies across industries, looking at automating non-routine work using case management, BPM, RPA, AI and rules.

Spaghetti Spreadsheets Untangled – Benefits of decision modeling when uncovering complex business logic hidden in spreadsheets. Charlotte Bouvy, M.C. Bouvy Consultancy

Charlotte Bouvy presented on her work done with SVB, the Netherlands social insurance administrator, on implementing business rules management. They are using DMN-based wizards for supporting 1,500 case workers, and the specific case was around the operational control and audit departments and the “lawfulness” of how the assessment work is done. Excel spreadsheets were used to do this, which had obvious problems in terms of being error prone and lacking domain-specific business logic. They implemented their SARA system to replace the spreadsheets with Oracle OPA, which allowed them to more accurately represent knowledge, as well as separate the data from the decision model while creating an executable model.

Decision model to determine lawfulness. From Charlotte Bouvy’s presentation.

These types of audit processes require sampling over a wide variety of case files to compare actual payments against expected amounts, with some degree of aggregation within specific laws being applied. Moving to a rules engine allowed them to model calculations and decisions, and separate data and model to avoid errors that occurred when copying and pasting data in spreadsheets. The executable model is now a single source of truth to which version control and change management can be applied. They are trying out different ways of using the SARA system: directly in Oracle Policy Modeler for building and debugging; via a web interview and an RPA robot for data input; and eventually via direct integration with the SVB’s case management system to load data.
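The core benefit here (one executable, version-controlled decision model, with the case data kept separate) can be shown with a toy sketch; SVB’s actual models are built in Oracle Policy Modeler, not code, and the tolerance rule below is invented for illustration:

```python
DECISION_MODEL_VERSION = "lawfulness-check 1.2"  # version control applies to the model

def lawful_payment(case):
    """Single source of truth for the decision; every sampled case runs through it."""
    tolerance = 0.005  # assumed 0.5% rounding tolerance, for illustration only
    deviation = abs(case["actual_paid"] - case["expected_amount"])
    return deviation <= tolerance * case["expected_amount"]

# Case data stays separate from the decision logic: no copy-and-paste errors.
sampled_cases = [
    {"file": "A-1", "expected_amount": 820.00, "actual_paid": 820.00},
    {"file": "B-7", "expected_amount": 450.00, "actual_paid": 495.00},
]
for case in sampled_cases:
    verdict = "lawful" if lawful_payment(case) else "flag for audit"
    print(case["file"], verdict, f"({DECISION_MODEL_VERSION})")
```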

Industry forum presentations at @BPMConf – process improvement benefits realization, IoT process mining, quality management, and deep learning of shipping container movement

To finish up my time at the academic research BPM 2019 conference, I attended one of the industry forum sessions, which highlights initiatives that bring together academic research and practical applications in industry. These are shorter presentations than the research sessions, although still have formal published papers documenting their research and findings; check those proceedings for the full author list for each paper and the details of the research.

Process Improvement Benefits Realization: Insights from an Australian University. Wasana Bandara, QUT

The first presentation was about process improvement at the author’s university. They took on an enterprise business process improvement project in 2017, and have developed a Business Process Improvement Office (BPIO — aka centre of excellence). They wanted to be able to have measurable benefits, and created a benefits realization framework that ran in parallel with their process improvement lifecycle to have the idea and measurement of benefits built in from the beginning of any project.

Alignment of BR lifecycle with BPI lifecycle
(From the industry forum research paper)

They found that identifying and aligning the ideas of benefits realization early in a project created awareness and increased receptiveness to unexpected benefits. Good discussion following on the details of their framework and how it impacts the business areas as they move their manual processes to more automation and management.

Enabling Process Mining in Aircraft Manufacture: Extracting Event Logs and Discovering Processes from Complex Data. Maria Teresa Gómez López, Universidad de Sevilla

The second presentation was about using process mining to discover the processes used in aircraft manufacture. The data underlying the process mining is generated by IoT manufacturing devices, hence had much higher volumes than a usual business process mining application, requiring preprocessing to aggregate the raw log data into events suitable for process mining. They also wanted to be able to integrate knowledge from engineers involved in the manufacturing process to capture best practices and help extract the right data from the raw data logs.

Result of process discovery for three test cases
(From the industry forum research paper)

They had some issues with analyzing the log data, such as incorrect data (an aircraft was in two stations at the same time, or was moving backwards through the assembly process), or incomplete or insufficient information. Future research and work on this will include potential integration with big data architectures to handle the volume of raw log data, and finding new ways of analyzing the log data to have cleaner input to the process discovery algorithms.

The adoption of globally unified process standards via a multilingual management system: the case of Marabu, a worldwide manufacturer of printing inks and creative colours of the highest quality. Klaus Cee, Marabu, and Andreas Schachermeier, ConSense

The next presentation was about how Marabu, a printing ink company, standardized and aligned their multinational subsidiaries’ business processes with those of the parent company. This was not a research project per se but a process and knowledge management implementation project, although ConSense is a consulting company spun off from a university project several years ago. They had some particular challenges with developing uniform multi-lingual processes that could have local variants, integrated with the needs of quality, environmental and occupational safety management.

Data-driven Deep Learning for Proactive Terminal Process Management. Andreas Metzger, University of Duisburg-Essen

The final paper in this industry forum session was on the digitalization of the Duisburg intermodal container shipping port, a large inland port dealing with about 10,000 containers arriving and departing by rail and truck each month. Data streams from terminal equipment captured information about the movement of containers, cranes and trains; their goal was to predict based on current state whether a given train would be loaded and could depart on time, and proactively dispatch resources to speed up loading when necessary. This sounds like a straightforward problem, but there is a tradeoff: predictions made early on limited data can be erroneous, while waiting for more data improves accuracy but leaves less time to intervene and resolve the problem.

They applied multiple parallel deep learning models (recurrent neural networks) to improve the predictions, dynamically trading off between accuracy and earliness of detection. They were able to increase the number of trains leaving on time by 4.7%, which is a great result when you consider the cost of a delayed train.

Terminal Productivity Cockpit excerpt
(From the industry forum research paper)

They used RNNs as their deep learning models because they can handle arbitrary length process instances without sequence or trace encoding, and can perform predictions at any checkpoint with a single model; there’s a long training time and it’s compute-intensive, but that pays off in this scenario. Lessons that they learned included the fact that the deep learning ensembles worked well out of the box, but also that the quality of the data used as input is a key concern for accuracy: if you’re going to spend time working on something, work on data cleansing before you work on the deep learning algorithms.
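To make the “arbitrary length prefix, single model” property concrete, here’s a minimal sketch of such a recurrent predictor (in PyTorch; the event vocabulary, architecture sizes and data are all invented, and the model is untrained, so its outputs here are arbitrary):

```python
import torch
import torch.nn as nn

EVENTS = ["container_arrived", "crane_assigned", "loading_started",
          "loading_delayed", "loading_done"]
VOCAB = {e: i for i, e in enumerate(EVENTS)}

class DeparturePredictor(nn.Module):
    """GRU over a prefix of terminal events, predicting P(on-time departure)."""
    def __init__(self, vocab_size, emb=16, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.rnn = nn.GRU(emb, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, prefix):                  # prefix: (batch, seq_len) event ids
        _, h = self.rnn(self.embed(prefix))     # h: final hidden state
        return torch.sigmoid(self.head(h[-1]))  # probability of on-time departure

model = DeparturePredictor(len(VOCAB))

# The same model scores the same train at an early and a later checkpoint,
# with no sequence encoding needed; an ensemble of these is what lets you
# trade off accuracy against earliness of detection.
early = torch.tensor([[VOCAB["container_arrived"], VOCAB["crane_assigned"]]])
later = torch.tensor([[VOCAB["container_arrived"], VOCAB["crane_assigned"],
                       VOCAB["loading_delayed"]]])
print(model(early).item(), model(later).item())
```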

The Zaha Hadid-designed Library and Learning Center at WU Wien, our home for the main conference

The last segment following this is a closing panel, so this is the last of my coverage from BPM 2019. I haven’t attended this conference in many years, and I’m glad to be back. Looking forward to next year in Seville!

It’s been great to catch up with a lot of people who I haven’t seen since the last time that I attended, plus a few who I see more often. WU Wien has been a welcoming host, with a campus full of extraordinary modern architecture, a well-organized conference and great evening social events.