BPM2023 Day 3 BPM Forum: come for the chatbots, stay for the weasels

I moved to the BPM Forum session for another rapid-fire succession of 15-minute presentations, a similar format to yesterday’s Journal First session. No detailed notes with such short presentations, but I captured a few photos as things progressed. So many great research ideas!

Conversational Process Modelling: State of the Art, Applications, and Implications in Practice (Nataliia Klievtsova, Janik-Vasily Benzin, Timotheus Kampik, Juergen Mangler and Stefanie Rinderle-Ma), presented by Nataliia Klievtsova.

Large Language Models for Business Process Management: Opportunities and Challenges (Maxim Vidgof, Stefan Bachhofner and Jan Mendling), presented by Maxim Vidgof.

Towards a Theory on Process Automation Effects (Hoang Vu, Jennifer Haase, Henrik Leopold and Jan Mendling), presented by Hoang Vu.

Process Mining and the Transformation of Management Accounting: A Maturity Model for a Holistic Process Performance Measurement System, presented by Simon Wahrstoetter.

Business Process Management Maturity and Process Performance – A Longitudinal Study (Arjen Maris, Guido Ongena and Pascal Ravesteijn), presented by Arjen Maris.

From Automatic Workaround Detection to Process Improvement: A Case Study (Nesi Outmazgin, Wouter van der Waal, Iris Beerepoot, Irit Hadar, Inge van de Weerd and Pnina Soffer), presented by Pnina Soffer.

Detecting Weasels at Work: A Theory-driven Behavioural Process Mining Approach (Michael Leyer, Arthur H. M. ter Hofstede and Rehan Syed), presented by Michael Leyer.

BPM2023 Utrecht Workshop on Business Process Optimization

11 years ago, Marlon Dumas from the University of Tartu chaired the BPM2012 conference in Tallinn, Estonia, which I was able to attend. We had met at a previous conference (maybe Milan?), and since then I’ve worked with his company Apromore to create a white paper on process mining. Today, he gave a keynote at the workshop here at BPM2023 on the status and perspectives of business process optimization.

He started with the origins of process optimization from 20+ years ago: start with discovery to create the as-is model, then development of the to-be model, testing of the new model, and eventually deployment. Adding simulation to the loop allows for some testing and predicted performance to feed back into the design prior to deployment. This type of process analysis suffered from some fundamentally flawed assumptions about the correctness of the process model and simulation parameters, the skills and behaviour of the process participants, and general resource management. Marlon (and many others) endorsed these imperfect methods in the past, and he invited us to tear up his earlier book on BPM. 😆

Since then, he has been working on a better sort of simulation model based on discovery from event logs: think of it as using process mining as an automated generator for more complex simulation parameters rather than just the base process model. They have shared their work for other researchers to review and extend.
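
To make the idea concrete, here is a minimal sketch using the open-source pm4py library and pandas: my own illustration of the general technique, not the tooling from his research group. It discovers a control-flow model from an event log, then mines basic simulation parameters (activity durations and case inter-arrival times) from the same log. All column names and data are placeholders.

```python
# Minimal sketch: discover a control-flow model from an event log, then mine
# basic simulation parameters from the same log. Illustrative only; column
# names and values are placeholders, not any specific research tool's format.
import pandas as pd
import pm4py

# Tiny inline event log; in practice this would be exported from a system
df = pd.DataFrame([
    {"case_id": "1", "activity": "Register", "start_time": "2023-09-13 09:00", "end_time": "2023-09-13 09:10"},
    {"case_id": "1", "activity": "Approve",  "start_time": "2023-09-13 09:30", "end_time": "2023-09-13 09:45"},
    {"case_id": "2", "activity": "Register", "start_time": "2023-09-13 09:20", "end_time": "2023-09-13 09:25"},
    {"case_id": "2", "activity": "Approve",  "start_time": "2023-09-13 10:00", "end_time": "2023-09-13 10:05"},
])
for col in ("start_time", "end_time"):
    df[col] = pd.to_datetime(df[col])

# The base process model, as produced by classic process discovery
log = pm4py.format_dataframe(df, case_id="case_id", activity_key="activity", timestamp_key="end_time")
bpmn_model = pm4py.discover_bpmn_inductive(log)

# Simulation parameters mined from the same log: activity processing times...
df["duration_s"] = (df["end_time"] - df["start_time"]).dt.total_seconds()
print(df.groupby("activity")["duration_s"].agg(["mean", "std"]))

# ...and case inter-arrival times, to parameterize the arrival process
arrivals = df.groupby("case_id")["start_time"].min().sort_values()
print("mean inter-arrival (s):", arrivals.diff().dt.total_seconds().mean())
```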

This has opened the door to more automated process optimization techniques: search-based optimization, which adds domain knowledge to the simulation model discovery to generate a set of possible process changes that can then be simulated and tested to determine the best improvement opportunities. Optimization, as he pointed out, is a multi-dimensional problem, since we are always working towards the improvement of more than one performance indicator; dimensions of improvement may include optimization of decision rules, flow, tasks and/or resources. They’ve done some additional work on an optimization engine that’s also shared on GitHub.
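
And a toy illustration of the search-based loop, with a hypothetical simulate() standing in for a discovered simulation model: enumerate candidate changes across two dimensions (staffing and an auto-approval decision rule), simulate each, and keep the scenarios that are Pareto-best across two KPIs. This is a sketch of the general idea, not their optimization engine.

```python
# Toy search-based optimization: evaluate candidate process changes with a
# simulation, then keep the Pareto-optimal scenarios across two KPIs.
# simulate() is a hypothetical placeholder for a discovered simulation model.
from itertools import product

def simulate(num_clerks: int, auto_approve_limit: float) -> tuple[float, float]:
    # Placeholder model: returns (cycle time in hours, cost per case)
    cycle_time = 48.0 / num_clerks - (4.0 if auto_approve_limit > 0 else 0.0)
    cost = 10.0 * num_clerks + 0.002 * auto_approve_limit
    return max(cycle_time, 1.0), cost

# Candidate changes: resource levels crossed with decision-rule thresholds
candidates = list(product([2, 4, 8], [0.0, 500.0, 1000.0]))

def dominates(a: tuple[float, float], b: tuple[float, float]) -> bool:
    # a dominates b if it is at least as good on both KPIs and better on one
    return all(x <= y for x, y in zip(a, b)) and a != b

results = {c: simulate(*c) for c in candidates}
pareto = [c for c, kpis in results.items()
          if not any(dominates(other, kpis) for other in results.values())]
for clerks, limit in pareto:
    print(f"{clerks} clerks, auto-approve under {limit}: {results[(clerks, limit)]}")
```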

He moved on to talking about conversational process optimization, which makes search-based optimization just a step in a broader approach that puts a human expert in the loop to guide the exploration of the optimization space. In this approach, a conversational UI has an interactive discussion with a human expert, then combines that with the search-based optimization techniques, then presents that back to the expert for review and further conversation and optimization.

As the presentation finished and we were moving to questions, security kicked us out of our room for overcrowding, so we adjourned to the outdoor square. Lots of great discussion, with Marlon mentioning that the field of Operations Research is okay except that it’s the domain of a bunch of mathematicians, and urging us to cast off the shackles of process models. Also a good bit about the optimization of resource workload and allocation to maximize efficiency: people work best when they are “happy” (a proxy for “unstressed and productive”), which means having neither too much nor too little work, and the right mix of work. 

Marlon published his slide deck on Slideshare, which allowed me to steal a few screenshots rather than trying to photograph the live presentation.

Accelerating Process Modeling with Process Mining

Back in 2008, I started attending the annual academic research BPM conference, which was in Milan that year. I’m not an academic, but this wasn’t just an excuse for a week in Europe: the presentations I saw there generated so many ideas about the direction that the industry would/should take. Coincidentally, 2008 was also the first year that I saw process mining offered as a product: I had a demo with Keith Swenson of Fujitsu showing me their process discovery product/service in June, then saw Anne Rozinat’s presentation at the academic conference in September (she was still at Eindhoven University then, but went on to create Fluxicon and their process mining tool).

Over the years, I met a lot of people at this conference who accepted me as a bit of a curiosity; I brought the conference some amount of publicity through my blog posts, and pushed a lot of software vendors to start showing up to see the wild and wonderful ideas on display. They even invited me to give a keynote in 2011 on the changing nature of work. Two of the people I met along the way, Marlon Dumas of the University of Tartu and Marcello La Rosa of the University of Melbourne, went on to form their own process mining company, Apromore.

I’ve recently written a white paper for Apromore to help demystify the use of process mining alongside more traditional process modeling techniques by business analysts. From the introduction:

Process modeling and process mining are complementary, not competitive, techniques: a business analyst needs both in their toolkit. Process mining provides exact models of the system-based portions of processes, while manual modeling and analysis captures human activities, documents informal procedures, and identifies the many ways that people “work around” systems.

Head on over to their site to read the full paper (registration required).

My upcoming webinar sponsored by Signavio – How to Thrive During Times of Rapid Change

This will be the fourth in a series of webinars that I’m doing for Signavio, this time focused on the high-tech industry but with lessons that span other industries. From the description:

High-Tech businesses are renowned disruptors. But what happens when the disruptors become the disrupted? For example, let’s say a global pandemic surfaces and suddenly changes your market dynamics and your business model.

Can your business handle an instant slowdown or a hyper-growth spurt? What about your operating systems? Are they nimble enough for you to scale? Can you onboard new customers en masse or handle a high volume of service tickets overnight? What about your supply chain; how agile are your systems and supplier relationships?

The first two webinars, on banking in February and insurance in March, discussed the role that intelligent processes play in improving business, with a brief mention in the March webinar about addressing business disruption caused by the pandemic. By the time we hit the third webinar on financial services in April, we had pivoted to look at the necessity of process improvement technologies and methodologies in times of business disruption such as the current crisis. Unlike a lot of industries, many high-tech sectors have been booming during the pandemic: their problems are around being able to scale up operations to meet demand without sacrificing customer service. Although they share some of the issues that I looked at in the earlier webinars, they have some unique issues where process intelligence and automation can help.

Tune in on May 20th at 11am Eastern; if you can’t make it then, sign up anyway and you’ll get a link to the on-demand version.

CelosphereLive 2020 — Day 3: extending process mining with multi-event logs and task mining

Traditionally, process mining is fed from history logs from a single system. However, most businesses aren’t run on a single system, and Celonis Product Lead for Discovery Sabeth Steiner discussed how they are allowing multi-event log process mining, where logs from multiple systems are ingested and correlated to do a more comprehensive analysis. This can be useful to find friction between parallel (inbound) procurement and (outbound) sales processes, or customer service requests that span multiple process silos. Different parallel processes appear in Celonis process discovery in different colors, with the crossover points between them highlighted.

Each of the processes can be analyzed independently, but the power comes when they are analyzed in tandem: optimizing the delivery time within an order-to-cash process while seeing the points that it interacts with the procure-to-pay process of the vendors providing materials for that order. Jessica Kaufmann, Senior Software Developer, joined Steiner to show the integrated data model that exists behind the integrated analysis of multiple processes, and how to set this up for multiple event logs. She discussed the different types of visualization: whether to visualize the different processes as a single process (by merging the event logs), or as multiple interacting processes. KPIs can also be combined, so that overall KPIs of multiple interacting processes can be tracked. Great Q&A at the end where they addressed a number of audience questions on the mechanics of using multi-event logs, and they confirmed that this will be available in the free Celonis Snap offering.
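
As a rough illustration of the multi-event log idea (my own pandas sketch, not the Celonis data model): correlate a procure-to-pay log with an order-to-cash log through a shared purchase order id, merge them into one timeline per order, and flag the crossover points where a case switches between processes.

```python
# Sketch: correlate two event logs through a shared key (po_id), merge them
# into one timeline per order, and flag crossovers between the two processes.
# Column names and data are illustrative, not the Celonis data model.
import pandas as pd

o2c = pd.DataFrame({
    "order_id": ["SO-1"] * 3,
    "po_id": ["PO-9"] * 3,
    "activity": ["Create Order", "Await Materials", "Ship Goods"],
    "timestamp": pd.to_datetime(["2020-05-01", "2020-05-02", "2020-05-06"]),
})
p2p = pd.DataFrame({
    "po_id": ["PO-9"] * 2,
    "activity": ["Create PO", "Goods Receipt"],
    "timestamp": pd.to_datetime(["2020-05-02", "2020-05-05"]),
})

# Tag each log so the interacting processes stay distinguishable after merging
o2c["process"], p2p["process"] = "O2C", "P2P"

# Pull the P2P events into the order cases they supply, then build one timeline
linked = p2p.merge(o2c[["order_id", "po_id"]].drop_duplicates(), on="po_id")
combined = (pd.concat([o2c, linked], ignore_index=True)
              .sort_values(["order_id", "timestamp"]))

# Crossover points: consecutive events in one case that switch processes
combined["crossover"] = combined.groupby("order_id")["process"].transform(
    lambda s: s.ne(s.shift()) & s.shift().notna())
print(combined)
```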

Another analysis capability not traditionally covered by process mining is task mining: what are users doing on the desktop as they work across multiple systems? Andreas Plieninger, Product Manager, talked about how they capture user interaction data with their new Celonis Task Mining. I’ve been seeing user interaction capture being done by a few different vendors, both process mining/analysis and RPA vendors, and this really is the missing link in understanding processes: lack of this type of data capture is the reason that I spend a lot of time job-shadowing when I’m looking at an enterprise customer’s processes.

Task Mining is installed on the user’s desktop (Windows only for now), and when certain white-listed applications are used, the interaction information is captured, as well as data from desktop files such as Excel spreadsheets. AI/ML helps to group the activities together and match them to other system processes, providing context for analysis. “Spyware” that tracks user actions on the desktop is not uncommon in productivity monitoring, but Celonis Task Mining is a much more secure and restricted version of that, capturing just the data required for analyzing processes, and respecting the privacy of both the user and the data on their screen.

Once the user interaction data is captured, it can be analyzed in the same way as a process event log: it can be used to discover the process and its variants, and trigger alerts if process compliance rules are violated. It’s in the same data layer as process mining data, so it can be analyzed and exposed using the same AI, boards and apps structure as process data. Task Mining also captures screen snapshots to show what was actually happening as the user clicked around and entered data, and can be used to check what the user was seeing while they were working. This can be used to determine root causes for the longer-running variants, find opportunities for task automation, and check compliance.
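
A minimal sketch of how that works in principle: reshape captured desktop interactions into standard event log columns (case id, activity, timestamp) so that the usual discovery and conformance techniques apply. The field names and grouping rule here are my own assumptions, not the Celonis Task Mining schema.

```python
# Sketch: turn raw desktop interaction records into an event log by choosing
# a correlation key as the case id and grouping low-level actions into
# activities. Field names are illustrative assumptions only.
import pandas as pd

clicks = pd.DataFrame([
    {"user": "anna", "app": "Excel", "action": "copy:VendorName",
     "ts": "2020-05-06 09:00:40", "doc_id": "INV-1001"},
    {"user": "anna", "app": "SAP GUI", "action": "paste:Vendor",
     "ts": "2020-05-06 09:01:02", "doc_id": "INV-1001"},
    {"user": "anna", "app": "SAP GUI", "action": "click:Save",
     "ts": "2020-05-06 09:01:12", "doc_id": "INV-1001"},
])
clicks["ts"] = pd.to_datetime(clicks["ts"])

event_log = (
    clicks.assign(
        case_id=clicks["doc_id"],  # correlate interactions by business object
        activity=clicks["app"] + " / " + clicks["action"],  # group actions
    )[["case_id", "activity", "ts", "user"]]
    .sort_values(["case_id", "ts"])
)
print(event_log)  # now analyzable like any other process event log
```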

He showed a use case for finding automation opportunities in a procure-to-pay process, similar to the concept of multi-event logs where one of those logs is the user interaction data. The user interaction data is treated a bit differently, however, since it represents manual activities where you may want to apply automation. A Celonis automation could then be used to address some of the problem areas identified by the task mining, where some of the cases are completely automated, while others require human intervention. This ability to triage cases, sending only those that really need human input for someone to process, while automatically pushing actions back to the core systems to complete the others automatically, can result in significant cost savings and shortened cycle time.

Celonis Task Mining is still in an early adopter program, but is expected to be in beta by August 2020 and generally available in November. I’m predicting a big uptake in this capability, since remote work is removing the ability to use techniques such as job shadowing to understand what steps workers are taking to complete tasks. Adding Task Mining data to Process Mining data creates the complete picture of how work is actually getting done.

That’s it for me at CelosphereLive 2020; you can see replays of the presentation videos on the conference site, with the last of them likely to be published by tomorrow. Good work by Celonis on a marathon event: this ran for several hours per day over three days, although the individual presentations were pre-recorded then followed by live Q&A. Lots of logistics and good production quality, but it could have had better audience engagement through a more interactive platform such as Slack.

CelosphereLive 2020 — Day 3: Process AI for automation

I started my day early to see Dr. Steffen Schenk, Celonis Head of Product Management, talk about the Celonis Action Engine and process automation. In short, they are seeing that because they integrate directly with core systems (especially ERP systems, which have their own processes built in), they can do things that RPA and BPM systems can’t: namely, data-driven sense and act capabilities. However, these processes are only as timely as the data connection from the core systems into Celonis, so there can be latency.

He walked through an example of an order management process where he filtered SAP order data to show those with on-time delivery problems, due to order approval or a required credit check, and created a query to detect those conditions in the future. Then, he created a process automation made up of system connectors that would be triggered based on a signal from that query in the future. In addition to system connectors (including webhooks), the automation can also directly call Celonis actions that might prompt a user to take an action. The automation can do branching based on data values: in his example, a customer credit block was automatically removed if the customer had a history of on-time payment, and that data was pushed back to SAP. That, in turn, would cause SAP to move the invoice along: it’s effectively a collaborative process automation between SAP and Celonis. The non-automated path sends a task to an order manager to approve or deny the credit, which in turn will trigger other automated actions. This process automation is now a “Skill” in Celonis, and can be set to execute for all future SAP order data that flows through Celonis.
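
Here is a minimal sketch of that sense-and-act pattern as I understood it from the demo. All function and field names are hypothetical placeholders, not the Celonis API: a signal flags a blocked order, a rule branches on payment history, and the result is either pushed back to the source system or routed to a person.

```python
# Sketch of the sense-and-act branching from the demo: auto-remove a credit
# block for customers with a strong on-time payment history, otherwise route
# to an order manager. All names are hypothetical, not the Celonis API.
from dataclasses import dataclass

@dataclass
class Order:
    order_id: str
    credit_blocked: bool
    on_time_payment_rate: float  # customer's historical on-time payment ratio

def push_to_sap(order_id: str, action: str) -> None:
    print(f"SAP update: {action} for {order_id}")  # stand-in for a connector

def create_user_task(order_id: str, task: str) -> None:
    print(f"Order manager task: {task} ({order_id})")  # stand-in for an action

def handle_signal(order: Order, threshold: float = 0.95) -> None:
    if not order.credit_blocked:
        return  # signal only fires on blocked orders
    if order.on_time_payment_rate >= threshold:
        push_to_sap(order.order_id, "remove_credit_block")  # automated path
    else:
        create_user_task(order.order_id, "review credit block")  # human path

handle_signal(Order("SO-42", credit_blocked=True, on_time_payment_rate=0.98))
```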

Once this automation has been set up, the before and after processes can be compared: we see a higher degree of automation that has led to improving the on-time delivery KPI without increasing risk. It’s data-driven, so that only customers that continue to have an on-time payment record will be automatically approved for credit on a specific order. This is an interesting approach to automation that provides more comprehensive task automation than RPA, and a better fit than BPM when processes primarily exist in a line-of-business or legacy system. If you have an end-to-end process to orchestrate and need a comprehensive model, then BPM may be a better choice, but there are a lot of interesting applications for the Celonis model of automating the parts of an existing process that the core system would have “delegated” to human action. I can definitely see applications for this in insurance claims, where most of the claim process lives in a third-party claims management system, but there are many decisions and ancillary processes that need to happen around that system.

This level of automation can be set up by a trained Celonis analyst: if you’re already creating analysis and dashboards, then you have the level of skill required to create these automations. This is also available both for cloud and on-premise deployments. There was an interesting discussion in the Q&A about credentials for updating the connected systems: this could be done with the credentials of the person who approves a task to execute (attended automation) or with generic system credentials for fully-automated tasks.

This was a really fascinating talk and a vision of the future for this type of process automation, where the core process lives within an off-the-shelf or legacy system, and there’s a need to do additional automation (or recommendations) of supporting decisions and actions. Very glad that I got up early for the 7:15am start.

I listened in on the following talk on machine learning and intelligent automation by Nicolas Ilmberger, Celonis Senior Product Manager of Platform and Technology, where he showed some of their pre-built ML tools such as duplicate checkers (for duplicate invoices, for example), root cause analysis and intelligent audit sampling. These are used to detect specific scenarios in the data that is flowing into Celonis, then either raising an action to a user, or automating an action in the background. They have a number of pre-configured connectors and filters, for example, to find a duplicate vendor invoice in an SAP system; these will usually need some customization since many organizations have modified their SAP systems.

He showed a demonstration of using some of these tools, and also discussed a case study of a manufacturing customer that had significant cost savings due to duplicate invoice checking: their ERP system only found exact matches, but slight differences in spelling or other typographical errors could result in true duplicates that needed more intelligent comparison. A second case study was for on-time delivery by an automotive supplier, where customer orders at risk were detected and signals sent to customer service with recommendations for the agent for resolution.

It’s important to note that both these ML tools and the process automation that we saw in the previous session are only as timely as the data connection from the core processing system to Celonis: if you’re only doing daily data feeds from SAP to Celonis, for example, that’s how often these decisions and automations will be triggered. For orders of physical goods that may take several days to fulfill, this is not a problem, but it is not a truly real-time process due to that latency. If an order has already moved on to the next stage in SAP before Celonis can act, for example, there will need to be checks to ensure that any updates pushed back to SAP will not negatively impact the order status.
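
A sketch of the kind of check I mean, with fetch_current_status() as a hypothetical stand-in for a live lookup: verify both the age of the extracted record and its current status before pushing an update back.

```python
# Sketch of a latency guard: before pushing an update back to the source
# system, re-check that the extracted record is fresh enough and that the
# order hasn't already moved on. fetch_current_status() is hypothetical.
from datetime import datetime, timedelta, timezone

def fetch_current_status(order_id: str) -> str:
    return "CREDIT_CHECK"  # placeholder for a live query to the source system

def safe_to_push(order_id: str, expected_status: str, extracted_at: datetime,
                 max_age: timedelta = timedelta(hours=1)) -> bool:
    if datetime.now(timezone.utc) - extracted_at > max_age:
        return False  # extracted data too stale to act on blindly
    if fetch_current_status(order_id) != expected_status:
        return False  # order already moved to the next stage
    return True  # safe: the update matches the system's current state

extracted = datetime.now(timezone.utc) - timedelta(minutes=10)
print(safe_to_push("SO-42", "CREDIT_CHECK", extracted))  # True
```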

There was a studio discussion following with Hala Zeine and Sebastian Walter. Zeine said that customers are saying “we’re done with discovery, what’s next?”, and have the desire to leverage machine learning and automation for day-to-day operations. This drove home the point that Celonis is repositioning from being an analysis tool to an operational tool, which gives them a much broader potential in terms of number of users and applications. Procure-to-pay and order-to-cash processes are a focus for them, and every mid-sized and large enterprise has problems with these processes.

The next session was with Stefan Maurer, Vice President of Enterprise Effectiveness for AVNET, a distributor of electronic components. He spoke about how they are using Celonis in their procure-to-pay process to react to supplier delivery date changes due to the impact of COVID-19 on global supply chains. He started with a number of good points on organizational readiness and how to take on process mining and transformation projects. He walked us through their process improvement maturity lifecycle, showing what they achieved with fundamental efforts such as Lean Six Sigma, then where they started adding Celonis to the mix to boost the maturity level. He said that they could have benefited from adding Celonis a bit earlier in their journey, but feels that people need a basic understanding of process improvement before adding new tools and methodologies. In response to an audience question later, he clarified that this could be done earlier in an organization that is ready for process improvement, but the results of process mining could be overwhelming if you’re not already in that mindset.

Their enterprise effectiveness efforts focus on the activities of different team members in a cycle of success that gets business ideas and needs through analysis stages and into implementation within tools and procedures. At several points in that cycle, Celonis is used for process mining but not automation; they are using Kofax and UiPath for RPA as their general task automation strategy.

Maurer showed a case study for early supplier deliveries: although early deliveries might seem like a good thing, if you don’t have an immediate use for the goods and the supplier invoices on delivery, this can have a working capital impact. They used Celonis to investigate details of the deliveries to determine the impact, and identify the target suppliers to work with on resolving the discrepancies. They also use Celonis to monitor procure-to-pay process effectiveness, using a throughput time KPI compared across time windows a year apart: in this case, they are using the analytical capabilities to show the impact of their overall process improvement efforts. By studying the process variants, they can see what factors are impacting their effectiveness. They are starting to use the Celonis Action Engine for some delivery alerts, and hope to use more Celonis alerts and recommendations in the future.

Almost accidentally, Celonis also provided an early warning of the changes in the supply chain due to COVID-19. Using the same type of data set as they used for their early delivery analysis, they were able to find which suppliers and materials had a significant delay to their expected deliveries. They could then prioritize the needs of their medical and healthcare customers, manually intervening in their system logic to shift their supply chain to guarantee supply to those customers while delaying others. He thinks that additional insights into materials acquisition supply chains will help them through the crisis.

I’m taking a break from Celosphere to attend the online Alfresco Modernize event, but I plan to return for a couple of the afternoon sessions.

CelosphereLive 2020 – Day 2: From process mining to intelligent operations

I’m back for the Celonis online conference, CelosphereLive, for a second day. They started much earlier since they’re running on a European time zone, but I started in time to catch the Q&A portion of the presentation by Ritu Nibber, VP of Global Process and Controls at Reckitt Benckiser, and may go back to watch the rest of it since there were a lot of interesting questions that came up.

There was a 15-minute session back in their studio with Celonis co-CEO Bastian Nominacher and VP of Professional Services Sebastian Walter, then on to a presentation by Peter Tasev, SVP of Procure to Pay at Deutsche Telekom Services Europe. DTSE is a shared services organization providing process and service automation across many of their regional organizations, and they are now using Celonis to provide three key capabilities to their “process bionics”:

  1. Monitor the end-to-end operation and efficiency of their large, heterogeneous processes such as procure-to-pay. They went through the process of identifying the end-to-end KPIs to include in an operational monitoring view, then used the dashboard and reports to support data-driven decisions.
  2. Use of process mining to “x-ray” their actual processes, allowing for process discovery, conformance checking and process enhancement.
  3. Track real-time breaches of rules in the process, and alert the appropriate people or trigger automated activities.

Interesting to see their architecture and roadmap, but also how they have structured their center of excellence with business analysts being the key “translator” between business needs and the data analysts/scientists, crossing the boundary between the business areas and the CoE.

He went through their financial savings, which were significant, and also mentioned the ability of process mining to identify activities that were not necessary or could be automated, thereby freeing up the workforce to do more value-added activities such as negotiating prices. Definitely worth watching the replay of this presentation to understand the journey from process mining to real-time operational monitoring and alerting.

It’s clear that Celonis is repositioning from just process mining — a tool for a small number of business analysts in an organization — into operational process intelligence that would be a daily dashboard tool for a much larger portion of the workforce. Many other process mining products are attempting an equivalent pivot, although Celonis seems to be a bit farther along than most.

Martin Klenk, Celonis CTO, gave an update on their technology strategy, with an initial focus on how the Celonis architecture enables the creation of these real-time operational apps: real-time connectors feed into a data layer, which is analyzed by the Process AI Engine, and then exposed through Boards that integrate data and other capabilities for visualization. Operational and analytical apps are then created based on Boards. Although Celonis has just released two initial Accounts Payable and Supply Chain operational apps, this is something that customers and partners can build in order to address their particular needs.

He showed how a custom operational app can be created for a CFO, using real-time connectors to Salesforce for order data and Jira for support tickets. He showed their multi-event log analytical capability, which makes it much easier to bring together data sources from different systems and automatically correlate them without a lot of manual data cleansing — the links between processes in different systems are identified without human intervention. This allows detection of anomalies that occur on boundaries between systems, rather than just within systems.

Signals can be created based on pre-defined patterns or from scratch, allowing a real-time data-driven alert to be issued when required, or an automation push to another system to be triggered. This automation capability is a critical differentiator: it allows for a simple workflow based on connector steps, and can replace some other process automation technologies such as RPA in cases where those are not a good fit.

He was joined by Martin Rowlson, Global Head of Process Excellence at Uber; they are consolidating data from all of their operational arms (drive, eats, etc.) to analyze their end-to-end processes, and using process mining and task mining to identify areas for process improvement. They are analyzing some critical processes, such as driver onboarding and customer support, to reduce friction and improve the process for both Uber and the driver or customer.

Klenk’s next guest was Philipp Grindemann, Head of Business Development at Lufthansa CityLine, discussing how they are using Celonis to optimize their core operations. They track maintenance events on their aircraft, plus all ground operations activities. Ground operations are particularly complex due to the high degree of parallelism: an aircraft may be refueled at the same time that cargo is being loaded. I have to guess that their operations are changing radically right now and they are having to re-structure their processes, although that wasn’t discussed.

His last guest was Dr. Lars Reinkemeyer, author of Process Mining in Action — his book has collected and documented many real-world use cases for process mining — to discuss some of the expected directions of process mining beyond just analytics.

They then returned to a studio session for a bit more interactive Q&A; the previous technology roadmap keynote was pre-recorded and didn’t allow for any audience questions, although I think that the customers that he interviewed will have full presentations later in the conference.

#CelosphereLive lunch break

As we saw at CamundaCon Live last week, there is no break time in the schedule: if you want to catch all of the presentations and discussions in real time, be prepared to carry your laptop with you everywhere during the day. The “Live from the Studio” sessions in between presentations are actually really interesting, and I don’t want to miss those. Today, I’m using their mobile app on my tablet just for the streaming video, which lets me take screenshots as well as carry it around with me, then using my computer for blogging, Twitter, screen snap editing and general research. This means that I can’t use their chat or Q&A functions, since the app does not let you stream the video and use the chat at the same time, and the chat wasn’t very interesting yesterday anyway.

The next presentation was by Zalando, a European online fashion retailer, with Laura Henkel, their Process Mining Lead, and Alejandro Basterrechea, Head of Procurement Operations. They have moved beyond just process mining, and are using Celonis to create machine learning recommendations to optimize procurement workflows: the example that we saw provided Amazon-like recommendations for internal buyers. They also use the process automation capabilities to write information back to the source systems, showing how Celonis can be used for automating multi-system integration where you don’t already have process automation technology in place to handle this. Their key benefits in adding Celonis to their procurement processes have been efficiency, quality and value creation. Good interactive audience Q&A at the end where they discussed their journey and what they have planned next with the ML/AI capabilities. It worked well with two co-presenters, since one could be identifying a question for their area while the other was responding to a different question, leaving few gaps in the conversation.

We broke into two tracks, and I attended the session with Michael Götz, Engineering Operations Officer at Celonis, providing a product roadmap. He highlighted their new operational apps, and how they collaborated with customers to create them from real use cases. There is a strong theme of moving from just analytical apps to operational apps that sense and act. He walked through a broad set of the new and upcoming features, starting with data and connectivity, through the process AI engine, and on to boards and the operational apps. I’ve shown some of his slides that I captured below, but if you’re a Celonis customer, you’ll want to watch this presentation and hear what he has to say about specific features. Pretty exciting stuff.

I skipped the full-length Uber customer presentation to see the strategies for how to leverage Celonis when migrating legacy systems such as CRM or ERP, presented by Celonis Data Scientist Christoph Hakes. As he pointed out, moving between systems isn’t just about migrating the data: it also requires changing (and improving) processes. One of the biggest areas of risk in these large-scale migrations is around understanding and documenting the existing and future-state processes: if you’re not sure what you’re doing now, then likely anything that you design for the new system is going to be wrong. 60% of migrations fail to meet the needs of the business, in part due to that lack of understanding, and 70% fail to achieve their goals due to resistance from employees and management. Using process mining to explore the actual current process and — more importantly — understand the variants means that at least you’re starting from an accurate view of the current state. They’ve created a Process Repository for storing process models, including additional data and attachments.

Hakes moved on to talk about their redesign tools, such as process conformance checking to align the existing processes to the designed future state. After rollout, their real-time dashboards can monitor adoption to locate the trouble spots, and send out alerts to attempt remediation. All in all, they’ve put together a good set of tools and best practices: their customer Schlumberger saved $40M by controlling migration costs, driving user adoption and performing ongoing optimization using Celonis. Large-scale ERP system migration is a great use case for process mining in the pre-migration and redesign areas, and Celonis’ monitoring capabilities also make it valuable for post-migration conformance monitoring.

The last session of the day was also a dual track, and I selected the best practices presentation on how to get your organization ready for process mining, featuring Celonis Director of Customer Success Ankur Patel. The concurrent session was Erin Ndrio on getting started with Celonis Snap, and I covered that based on a webinar last month. Patel’s session was mostly for existing customers, although he had some good general points on creating a center of excellence, and how to foster adoption and governance for process mining practices throughout the organization. Some of this was about how a customer can work with Celonis, including professional services, training courses, the partner network and their app store, to move their initiatives along. He finished with a message about internal promotion: you need to make people want to use Celonis because they see benefits to their own part of the business. This is no different than the internal evangelism that needs to be done for any new product and methodology, but Patel actually laid out methods for how some of their customers are doing this, such as road shows, hackathons and discussion groups, and how the Celonis customer marketing team can help.

He wrapped up with thoughts on a Celonis CoE. I’m not a big fan of product-specific CoEs, instead believing that there should be a more general “business automation” or “process optimization” CoE that covers a range of process improvement and automation tools. Otherwise, you tend to end up with pockets of overlapping technologies cropping up all over a large organization, and no guidance on how best to combine them. I wrote about this in a guest post on the Trisotech blog last month. I do think that Patel had some good thoughts on a centralized CoE in general to support governance and adoption for a range of personas.

I will check back in for a few sessions tomorrow, but have a previous commitment to attend Alfresco Modernize for a couple of hours. Next week is IBM Think Digital, the following week is Appian World, then Signavio Live near the end of May, so it’s going to be a busy few weeks. This would normally be the time when I am flying all over to attend these events in person, and it’s nice to be able to do it from home although some of the events are more engaging than others. I’m gathering a list of best practices for online conferences, including the things that work and those that don’t, and I’ll publish that after this round of virtual events. So far, I think that Camunda and Celonis have both done a great job, but for very different reasons: Camunda had much better audience engagement and more of a “live” feel, while Celonis showed how to incorporate higher production quality and studio interviews to good effect, even though I think it’s a bit early to be having in-person interviews.

CelosphereLive 2020 – Day 1

I expect to be participating in a lot of virtual vendor conferences over the next months, and today I tuned in to the Celonis CelosphereLive. They are running on European time, with today’s half day starting at a reasonable 9am Eastern North American time, but the next two days will be starting at 4am my time, so I may be catching some of the sessions on demand.

We had a keynote from co-CEO Alexander Rinke that included a short discussion with the godfather of process mining, Wil van der Aalst. I liked Rinke’s characterization that every process in every company is waiting to be improved: this is what process mining (aka process intelligence, depending on which vendor is talking) is all about in terms of discovering processes. Miguel Milano, Celonis Chief Revenue Officer, joined him to talk about their new Celonis Value Architects certification program. The fact that this announcement takes such a prominent place in the keynote highlights that there’s still a certain amount of expertise required to do process mining effectively, even though the tools have become much easier to use.

There were also some new product announcements, first around the availability of their real-time data connectors. This is a direction that many of the process mining vendors are taking, moving from just an analytical process discovery role to more of an operational monitoring process intelligence role. Chief Product Officer Hala Zeine joined Rinke to talk about their connectivity — out of the box, the product connects to 80 different data sources — and their process AI engine that fits the data to a set of desired outcomes and makes recommendations. Their visualization boards then let you view the analysis and explore root causes of problem areas.

Their process AI engine does some amount of automation, and they have just released operational apps that help to automate some activities of the workflow. These operational apps are an overlay on business processes that monitor work in progress, and display analysis of the state of the (long-running) processes being monitored. The example shown is an Accounts Payable operational app that looks at invoices that are scheduled for payment, and allows a financial person to change parameters (such as date of payment) in bulk, which would then push that update back to the underlying A/P system. Think of these operational apps as smart dashboards, where you can do some complex analysis for monitoring work in progress, and also push some updates and actions back to the underlying operational system. These first two apps are already available to Celonis customers in their app store, and tomorrow there will be a session with the CTO showing how to build your own operational app.

To finish off the day we had two product demos/discussions. First was JP Thomsen, Celonis’ VP Business Models, giving a more in-depth demo of their Accounts Payable operational application. He was joined by Jan Fuhr, Process Mining Lead at global healthcare company Fresenius Kabi, which collaborated on the creation of the A/P operational application; Fuhr discussed their process mining journey and how they are now able to better support their A/P function and manage cash flow. The sweet spot for these operational apps appears to be when you don’t have end-to-end management of your process with another system such as a BPMS: the operational app monitors what’s happening in any core systems (such as SAP) and replaces ad hoc “management by spreadsheet” with AI and rules that highlight problem areas and make suggestions for remediation. They’ve had some great cost savings, through taking advantage of paying within a specified time frame to receive a discount, and optimizing their payment terms.

Last up was Trevor Miles, Celonis’ head of Supply Chain and Manufacturing Solutions, talking about the supply chain operational application: obviously these operational apps are a big deal for Celonis and their customers, since they’ve been the focus of most of this first half day. Process mining can provide significant value in supply chain management since it typically involves a number of different systems without an explicit end-to-end process orchestration, and can have a lot of exceptions or variants. Understanding those variants and being able to analyze and reroute things on the fly is critical to maintaining a resilient supply chain. This has been highlighted during the COVID-19 pandemic, where supply chains have been disrupted, overloaded or underused, depending on the commodity and the route.

Process mining is used to generate a digital twin of the supply chain, which can then be used to analyze past performance and serve as a point of comparison with current activities. The Celonis operational app for supply chain is about closing the gap between sensing and actions, so that process mining and simulation isn’t just an analytical tool, but a tool for guiding actions to improve processes. It’s also a tool for bridging across multiple systems of the same type: many large organizations have, for example, multiple instances of SAP for different parts of their processes, and need to knit together all of that information to make better decisions.

They finished up with a discussion in the studio between Hala Zeine, co-CEO Bastian Nominacher and CTO Martin Klenk, covering some of the new announcements and what’s coming up in the next two days. I’ll be back for some of the sessions tomorrow, although likely not before 8am Eastern.

A few notes on the virtual conference format. Last week’s CamundaCon Live had sessions broadcast directly from each speaker’s home plus a multi-channel Slack workspace for discussion: casual and engaging. Celonis has made it more like an in-person conference by live-broadcasting the “main stage” from a studio with multiple camera angles; this actually worked quite well, and the moderator was able to inject live audience questions. Some of the sessions appeared to be pre-recorded, and there’s definitely not the same level of audience engagement without a proper discussion channel like Slack — at an in-person event, we would have informal discussions in the hallways between sessions that just can’t happen in this environment. Unfortunately, the only live chat is via their own conference app, which is mobile-only and has a single chat channel, plus a separate Q&A channel (via in-app Slido) for speakers that is separated by session and is really more of a webinar-style Q&A than a discussion. I abandoned the mobile app early and took to Twitter. I think the Celosphere model is probably what we’re going to see from larger companies in their online conferences, where they want to (attempt to) tightly control the discussion and demonstrate the sort of high-end production quality that you’d have at a large in-person conference. However, I think there’s an opportunity to combine that level of production quality with an open discussion platform like Slack to really improve the audience experience.

Snap! @Celonis offers free process mining in the cloud

I attended a webinar today where Celonis showed Snap, the new free tier of their cloud-based process mining platform. It can work with flat files (CSV) or connect manually to ServiceNow, Google Sheets and a few other data sources directly, plus it supports teams for collaboration. You’re limited to uploading 500MB of data (which is a lot of records when you’re uploading just case ID, activity name, resource, and start and end times), and there’s no continuous data integration for event collection the way there is with their full-blown IBC Enterprise version; additionally, several functions from the main product are visible but not enabled. However, if you want to dip your toe into process mining with real-sized datasets, this is a good choice.
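
For reference, here is roughly what such a minimal flat-file event log looks like: one row per activity instance with case ID, activity name, resource and timestamps. Column names and data are purely illustrative, not a required Snap schema.

```python
# Sketch of a minimal CSV event log: one row per activity instance, with
# case ID, activity, resource, and start/end timestamps. Illustrative only.
import csv

rows = [
    ["case_id", "activity", "resource", "start_time", "end_time"],
    ["1001", "Receive Order", "anna", "2020-03-02 09:00", "2020-03-02 09:05"],
    ["1001", "Check Credit", "system", "2020-03-02 09:05", "2020-03-02 09:06"],
    ["1001", "Ship Goods", "ben", "2020-03-03 14:00", "2020-03-03 14:30"],
    ["1002", "Receive Order", "anna", "2020-03-02 09:30", "2020-03-02 09:40"],
]
with open("snap_event_log.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)  # ready to upload as a flat-file data source
```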

The process mining functionality is similar to what you see with competitive products, plus some nice dashboard capabilities for visualizing not just the overall discovered process flow, but for drilling into problem areas to do root cause analysis.

You can sign up for a free Celonis Snap account on their website. It self-deploys within a couple of minutes, creating a team account and setting you up as the admin, with the ability to add other users to the team. It doesn’t support everything in the paid platform, but definitely a good way to get started with process mining. There’s also an online community to ask (and answer) questions.

They are also offering free online training (not sure if that’s just now or if they will do this on an ongoing basis) that covers their full paid product suite; if you’re using Snap, the parts of the training related to process discovery and analysis will be relevant. They are launching free Snap-specific training next week, and adding new features to the product such as connectors to SAP Ariba. Obviously, they’d like to leverage free accounts into paid accounts, but this is more than just a 30-day trial or non-functional demo account; you can do some real work with the free version and move to the paid version when your data or processing needs exceed that.

APQC webinar: 2020 process and performance management priorities, with @hlykehogland

I listened in on a webinar today with APQC‘s Holly Lyke-Ho-Gland looking at the results of their 2020 process and performance management priorities survey (conducted in late 2019). Some good insights here, looking at the top three challenges in business process management and continuous improvement. Process modeling and mining vendors will be happy to see that the highest priority challenge in BPM is defining and mapping end-to-end processes.

She covered a number of tips and solutions to address these challenges, from developing end-to-end processes, to building a culture of continuous improvement, to governance alignment. She included a lot of great case studies and examples across all of these areas, and the types of resources and expertise required to achieve them.

After covering the business process management and continuous improvement side, she moved on to discuss the organizational performance management challenges and solutions. Performance management is more about analytics and metrics, and using those measures to support decision making; apparently this year’s challenges are the same as last year’s, meaning that organizations are still struggling with these issues.

Some interesting points here about change management plans and what needs to be done in order to be successful in performance management; check out the webinar replay for details.

The last part of the webinar was on their “special interest” section, which this year is process management. The first point was on the purpose of process teams and work, the most important of which is supporting strategic initiatives. This is definitely what I see in my own consulting practice, with process gaining a much higher profile as companies focus on digital transformation efforts: at their heart, many transformation efforts are process-centric. The APQC research also showed information on measuring process work, and she notes (as I often see) that the top measures are still focused on bottom-line savings rather than more strategic measures, meaning that process metrics are misaligned with strategic focus. She also covered the impact of technology on process work: not just process automation, but collaboration, data management and visualization, and cloud computing topped the technology list, since they are looking at the entire process management lifecycle. She made a direct call-out to process mining (although it wasn’t in the top five list) as a cross-over between data analysis and process modeling; I’m definitely in agreement with that, as you can see from my post earlier this week.

She finished with a summary of the survey results, plus a peek at their research agenda for 2020 with lots of interesting and timely topics. I like that their research uses a lot of real-world case studies.

I couldn’t find a direct link to the webinar replay yet, but it will likely be available on APQC’s On-Demand Webinars page soon; definitely worth checking out for Lyke-Ho-Gland’s insights and discussion. While you’re over there, check out their Process and Performance Management Conference, coming up in October. I spoke at their conference back in 2013 and really enjoyed the experience: good sessions, and a smaller conference that’s great for networking.