CamundaCon Live 2020 – Day 1: Optimize, RPA, and how 24 Hour Fitness executes 5B process nodes per month

We continued the first day of CamundaCon Live (virtual) 2020 with Felix Mueller, senior product manager, presenting on how to use Camunda Optimize to drive continuous improvement in processes. I attended the Optimize 3.0 release webinar a couple of weeks ago, and saw some of the new things that they’re doing with monitoring and optimization of event-based processes — this allows processes that are not part of Camunda to be included in Optimize. The CamundaCon session started with a broader view of Optimize functionality, showing how it collects information that can then be used for root-cause analysis of process bottlenecks as well as for displaying real-time metrics. They have some good case studies for Optimize, including insurance provider Visana Group.

He then moved on to show event-based process monitoring, and how Optimize can ingest and aggregate information from any external system via a connector, such as the one they have built for RabbitMQ. His demo showed a customer onboarding process that could be triggered either by an online form, which instantiated a Camunda process directly, or by a mailed-in form scanned into another system, which emitted an event to trigger the process.
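Under the hood, Optimize 3.0 exposes an event ingestion REST API that accepts batches of events in CloudEvents JSON format. Here's a rough sketch in Java of pushing a "form scanned" event from an external system into Optimize, assuming a local Optimize instance and an access token configured on the Optimize side; the exact endpoint path and field names should be verified against the docs for your version:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OptimizeEventIngestion {
  public static void main(String[] args) throws Exception {
    // One-element CloudEvents batch; "traceid" and "group" are the Optimize
    // extension attributes used to correlate events into process instances.
    String batch = """
        [{
          "specversion": "1.0",
          "id": "evt-0001",
          "source": "mailroom-scanner",
          "type": "onboarding:form-scanned",
          "traceid": "customer-42",
          "group": "onboarding",
          "time": "2020-04-28T10:15:00Z"
        }]""";

    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:8090/api/ingestion/event/batch"))
        .header("Content-Type", "application/cloudevents-batch+json")
        .header("Authorization", "Bearer mySecretAccessToken") // token configured in Optimize
        .POST(HttpRequest.BodyPublishers.ofString(batch))
        .build();

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());
    System.out.println("Optimize responded: " + response.statusCode());
  }
}
```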

It was very obvious that this was a live presentation, because Mueller was scrambling against the clock after the previous session ran a bit long, having to speed through his demo and take a couple of shortcuts. Although you might think of this as a logistical “bug”, I maintain that it’s an interactivity “feature”, one that made the experience much closer to an in-person conference than a set of pre-recorded presentations just queued up in sequence.

This was followed by a presentation by Kris Barczynski of Nokia Bell Labs about a really interesting use case: they are using Camunda to guide visiting groups on tours through the Nokia Campus customer experience spaces, and to interact with devices including the guests’ wearables, drones and robots. Visitors are welcomed and guided by a robot, and they can interact with voice-controlled drones; Camunda orchestrates the processes behind the scenes. He talked about some of their design decisions, such as using Camunda JavaScript workers to call external services, and building a custom Android app. It’s a really interesting combination of physical and virtual processes.

Next was a panel discussion on the future of RPA, with Vittorio Dal Bianco of Nokia, Marco Einacker of Deutsche Telekom, Paul Jones of NatWest Group, and Camunda CEO Jakob Freund, moderated by Jason Bloomberg of Intellyx Research. The three customer panelists are involved with the RPA initiatives at their own organizations, and are also looking at how to integrate those with their Camunda processes. Panels are always a challenge to live-blog, but here are some of the points discussed (attributed where I remembered):

  • The customer panelists agreed that RPA has allowed people to move to more interesting/valuable work, rather than doing routine tasks such as copying and pasting between application screens. Task automation through RPA reduces resources/costs, decreases cycle time, and also improves quality/compliance.
  • RPA is a “short-term bandaid” driven from outside the IT organization in order to get some immediate efficiency benefits. It’s maintenance-intensive, since any change to the applications being integrated means that the bots need to be reprogrammed. Deutsche Telekom is moving from RPA front-end integration/automation to the more strategic BPMS/API automation, so sees RPA as an important step on the strategic journey but not the endpoint. NatWest recognizes RPA as a key automation technology, but sees it as a short-term tactical tool; they classify RPA as part of their technical debt, and it is not a part of their long-term architecture. Nokia thinks that RPA will remain in niche pockets for applications that will never have a proper API, such as Excel-based applications.
  • Nokia uses Blue Prism for RPA. NatWest uses UIPath, and has a group that is building the integration for having Camunda execute a UIPath task — although I would have thought this would be a relatively simple service call or external task (see the sketch after this list). Deutsche Telekom is using seven different RPA platforms, three of which are commercial, including Another Monday and Kryon; they are just starting to look at the integration between Camunda and RPA, with a plan to have Camunda orchestrate the steps and a “microbot” perform an atomic task at each step. As their core systems gain APIs for those tasks, the RPA bots will be replaced with direct API calls. This last approach is definitely aligned with Camunda’s vision of how their BPMS can work with RPA bots as well as any other “task performers”.
  • More discussion on the role of RPA in digital transformation: the recommendation is to go ahead and use it, but to treat it as a stop-gap measure for a quick win until the APIs are built out in the systems being integrated. It’s considered technical debt because it will be replaced as the APIs of the core systems become available. It’s a painkiller, not a cure.
  • Although some of the companies are using business people to build their own bots, that has had mixed success, and other companies do not classify RPA as citizen developer technology. This is pretty much the same as we’re seeing with other low-code environments, which are often sold as application development platforms for non-professional developers, but the reality is that many applications require a professional developer because of the technical complexity of the systems being integrated.
  • Cost and effort of RPA bot maintenance can be significant, in some cases more than back-end integration. Bot fixes may be fairly quick, but are required much more frequently, such as when a password changes: bots require babysitting.
  • The customers had a few Camunda product requests, such as better connectors to more of the RPA tools. In general, however, they don’t want Camunda to build/acquire their own RPA offering, but just see it as another example of where you can pick a best-of-breed RPA tool and use it for task automation at individual steps within a Camunda process.
  • Best practices/lessons learned:
    • Separate the process orchestration layer from the bot execution layer from the beginning, with the process orchestration being done by Camunda and the bot task execution being done by the RPA tool.
    • Use process mining first to objectively identify what should be automated; of course, this would also require that you mine the user interaction processes that would be automated with bots, not just the system logs.
    • Have a centralized control center for managing bots.
    • Develop bot templates that can be more quickly modified and deployed.
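For the pattern the panel converged on, with Camunda orchestrating and a bot executing an atomic task at each step, Camunda's external task mechanism is the natural fit. Here's a minimal sketch using the Java external task client, with a hypothetical RpaGateway class standing in for whichever RPA vendor's job API you're calling; replacing the bot with a direct API call later means changing only the handler body:

```java
import org.camunda.bpm.client.ExternalTaskClient;

public class RpaBotWorker {
  // Hypothetical stand-in for the RPA platform's job API (UIPath, Blue Prism, etc.)
  static class RpaGateway {
    static String startBot(String botName, Object input) {
      System.out.println("Starting bot " + botName + " with input " + input);
      return "job-123"; // pretend job id returned by the RPA platform
    }
    static void awaitCompletion(String jobId) {
      System.out.println("Bot job " + jobId + " finished");
    }
  }

  public static void main(String[] args) {
    ExternalTaskClient client = ExternalTaskClient.create()
        .baseUrl("http://localhost:8080/engine-rest") // assumed engine REST endpoint
        .asyncResponseTimeout(10_000)
        .build();

    client.subscribe("invoke-rpa-bot") // topic on the process's external task
        .lockDuration(60_000) // hold the lock while the bot runs
        .handler((task, taskService) -> {
          String jobId = RpaGateway.startBot(
              task.getVariable("botName"), task.getVariable("inputRecord"));
          RpaGateway.awaitCompletion(jobId);
          taskService.complete(task); // hand control back to the process
        })
        .open();
  }
}
```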

Looking at how the panel worked, there are definitely aspects of online panels that work better than in-person panels, specifically how they respond to audience questions. Some people don’t want to speak up in front of an audience, while others get up and bloviate without actually asking a question. With online-only questions, the moderator can browse through and aggregate them, then select the ones that are best suited to the panel. With video on each of the presenters (except for one who lost his connection and had to dial in), it was still possible to see reactions and have a sense of the live nature of the panel.

The last session of the day was Jimmy Floyd of 24 Hour Fitness on their massive Camunda implementation of five billion (with a “B”) process node executions per month. You can see his presentation from CamundaCon Berlin 2018 as a point of comparison with today’s numbers. Pretty much everything that happens at 24 Hour Fitness is controlled by a Camunda process, from their internal processes to customer-facing activities such as a member swiping their card to gain access to a club. It hasn’t been without hiccups along the way: they had to turn off process history logging to achieve this volume, and can’t easily drill down into processes that call a lot of other processes, but the use of BPMN and DMN has greatly improved the interactions between product owners and developers, sometimes allowing business people to make a rule change without involving developers.
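For context, history is a standard engine configuration setting in Camunda; here's a minimal sketch of spinning up an engine with history fully off (the in-memory setup is just for illustration; a real deployment would set this in bpm-platform.xml or its Spring configuration instead):

```java
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.ProcessEngineConfiguration;

public class NoHistoryEngine {
  public static void main(String[] args) {
    // HISTORY_NONE skips all history writes, trading audit/drill-down
    // capability for throughput, which is the trade-off described above.
    ProcessEngine engine = ProcessEngineConfiguration
        .createStandaloneInMemProcessEngineConfiguration()
        .setHistory(ProcessEngineConfiguration.HISTORY_NONE)
        .buildProcessEngine();

    System.out.println("Engine history level: "
        + engine.getProcessEngineConfiguration().getHistory());
    engine.close();
  }
}
```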

He had a lot of technical information on how they built this and their overall architecture. Their implementation is definitely custom code, but using Camunda with BPMN and DMN gave them a huge step up versus just writing code. Even logic inside their microservices is implemented with Camunda, not written in code. Their entire architecture is based on Camunda, so it’s not a matter of deciding whether or not to use it for a new application or a new external integration. They are taking a look at Zeebe to decide if it’s the right choice for them moving forward, but it’s early days on that: it would be a significant migration for them, they would likely lose functionality (for BPMN elements not yet implemented in Zeebe, among other things), and Zeebe has only just achieved production readiness.

Camunda is changing how they handle history data relative to the transactional data, in part likely due to input from high-throughput customers, and this may allow 24 Hour Fitness to turn history logging back on. They’re starting to work with Optimize via Kafka to gain insights into their processes.

Day 1 finished with a quick wrapup from Jakob Freund; in spite of the fact that it’s probably been a really long day for him, he seemed pretty happy about how well things went today. Tomorrow will cover more on microservices orchestration, and have customer case studies from Cox Automotive, Capital One and Goldman Sachs.

As you can probably gather from my posts today, I’m finding the CamundaCon online format to be very engaging. This is due to most of the presentations being performed live (not pre-recorded, as with most online conferences these days) and the use of Slack as a persistent chat platform, actively monitored by all Camunda participants from the CEO on down. They do need a little more slack in the schedule, however: from 10am to 3:45pm there was only one 15-minute break scheduled mid-way, and it didn’t happen because the morning sessions ran overtime. If you’re attending tomorrow, be prepared to carry your computer to the kitchen and bathroom with you if you don’t want to miss a minute of the presentations.

As I finish off my day at the virtual CamundaCon, I notice that the videos of presentations from earlier today are already available — including the panel session that only happened an hour ago. Go to the CamundaCon hub, then change the selection from “Upcoming” to “On Demand” above the Type/Day/Track selectors.

CamundaCon Live 2020 – Day 1: Jakob Freund keynote and customer presentations

Every conference organizer has had to deal with either cancelling their event or moving it to some type of online version as most of us work from home during the COVID-19 pandemic. Some of these have been pretty lacklustre, using only pre-recorded sessions with no live chat/Q&A, but I expected that Camunda would be able to do this in a more “live” manner that, while not completely replacing an in-person event, would have a similar feel to it. They did not disappoint: although a few of the CamundaCon presentations were pre-recorded, most were done live, and speakers were available for live Q&A. They also hosted a Slack workspace for live chat, which is much better than the Q&A/chat features on the webinar broadcast platform: it’s fundamentally more feature-rich, and also allows the conversations to continue after a particular presentation completes.

Very capably hosted by Director of Developer Relations Mary Thengvall, presentations were all done from the speakers’ individual locations, starting with CEO Jakob Freund’s keynote. He covered a bit of Camunda’s history and direction, and discussed their main focus of providing end-to-end process orchestration, using the example of Camunda working together with RPA, then gradually migrating the RPA bots (widely used as a stop-gap process automation measure) to more robust API integrations. He also shared some news on new and timely product offerings, including a starter package for work-from-home human workflow, and an early adopter package for Camunda Cloud. I’ve shared a few of his slides below, but you can also go and see the recording: they are getting the videos and slides up within about an hour after each presentation, directly on the conference hub.

Next up was Simon Letort, Chief Digital Officer at Société Générale, on how they implemented their corporate investment banking division’s core process automation platform using Camunda. They use Camunda as the core of their managed workflow platform, with 500+ processes deployed throughout their operations worldwide. They also use bpmn.io and form.io as their built-in process and forms modelers. Letort responded to an audience question about why they didn’t use another large BPMS product that was already in use internally: they wanted a best-of-breed solution rather than a proprietary walled garden, and also wanted to leverage open source tools so that they weren’t building everything from scratch. They transitioned from some of the proprietary tools by first replacing the underlying engine with Camunda, then swapping in other components, such as form.io when a more flexible UI was required.

Interestingly, about half of their workflows are created by 30 expert modelers within centers of expertise, and half by 1200 “amateur” modelers, or citizen developers. This really points out the potential for companies to mix together the experts (focused on core processes) and amateurs (focused on tactical or administrative processes) using the same engine, although they likely use quite different tools for the full development cycle. The SG Workflow “product” offers three main features to support these different modeler/developer types: the (Camunda) process engine, a workflow aggregator for grouping tasks and cases from multiple systems, and UI web components and apps. Their platform also auto-generates process documentation. The core product is created and maintained by a team of about 10, distributed between France and Canada.

He shared some good information on their architecture and roadmap: I did a few presentations last year (one of them at CamundaCon in Berlin) and wrote a paper on building your own process-centric platform using a BPMS and an assembly of other tools, inspired in part by companies like Société Générale that have done this to create a much more flexible application development environment for their large enterprises.

We moved from the main stage to the track sessions, and I sat in on a presentation by Jeremy Warren of Keller Williams Realty (a Camunda customer that works with integrator BP3) about their “SmartPlans” dynamic processes, which allow real estate agents to create their own plans and tasks. These aren’t actually dynamic at runtime; rather, they use a “flower” process model that loops back to allow any task to lead to any other task.

This is a great example of automating some of the processes that real estate agents use to drive new business, such as contacting prospects on a regular schedule, which would normally be done (or not done) manually. Agents can decide what tasks to do in what order; the branching logic in the model then executes the plan as specified. He also shared some of their experiences in rolling out and debugging applications on this scale.
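To make the flower pattern concrete, here's a minimal sketch of what the central routing step might look like, assuming a JavaDelegate that advances through the agent's chosen plan and sets a variable that the outbound gateway conditions test (e.g. ${nextTask == 'callProspect'}); all names are hypothetical, not Keller Williams' actual implementation:

```java
import java.util.List;
import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.JavaDelegate;

// Sits at the centre of the "flower": every task loops back to this step,
// which picks the next task from the agent's plan; the exclusive gateway
// after it routes outward based on the nextTask variable.
public class SmartPlanRouter implements JavaDelegate {
  @Override
  public void execute(DelegateExecution execution) {
    @SuppressWarnings("unchecked")
    List<String> plan = (List<String>) execution.getVariable("agentPlan");
    Integer cursor = (Integer) execution.getVariable("planCursor");
    int next = (cursor == null) ? 0 : cursor + 1;

    if (plan == null || next >= plan.size()) {
      execution.setVariable("nextTask", "done"); // routes to the end event
    } else {
      execution.setVariable("nextTask", plan.get(next));
      execution.setVariable("planCursor", next);
    }
  }
}
```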

The second track session was Derek Hunter and Uzma Khan of Ontario Teachers’ Pension Plan (who have been an occasional client of mine over a number of years, including when I introduced them to startup Camunda back in 2013). They have a number of case-management-style processes to handle requests from members (teachers) regarding their pensions. They have 144 BPMN templates, and execute 70,000 process instances per year, with up to 20,000 active instances at any time, since these are generally long-running workflows. Some of the extremely long-running processes are actually terminated after a specific stage, then a scheduler restarts a corresponding instance when new work needs to be done. Other processes may be suspended in the workflow engine, making them invisible to a user’s worklist until work needs to be done.
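Both of those patterns map onto standard Camunda runtime operations; a minimal sketch, with the process key and the scheduler wrapper being hypothetical rather than OTPP's actual code:

```java
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.RuntimeService;

public class DormantInstanceScheduler {
  private final RuntimeService runtimeService;

  public DormantInstanceScheduler(ProcessEngine engine) {
    this.runtimeService = engine.getRuntimeService();
  }

  // Park a long-running instance so its work drops out of users' worklists.
  public void putToSleep(String processInstanceId) {
    runtimeService.suspendProcessInstanceById(processInstanceId);
  }

  // Wake it up again when the scheduler decides there is work to do.
  public void wakeUp(String processInstanceId) {
    runtimeService.activateProcessInstanceById(processInstanceId);
  }

  // The other pattern: a stage was terminated earlier, so start a fresh
  // instance keyed to the same member when new work arrives.
  public void startNextStage(String memberId) {
    runtimeService.startProcessInstanceByKey("pensionRequest", memberId);
  }
}
```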

Camunda is really just an engine buried within the OTPP workflow system, completely hidden from calling applications by a workflow intermediary. This was essential during their migration off other platforms: at one time, they had three different workflow engines running simultaneously, and could migrate everything to Camunda without having to retool the higher-level applications. In particular, end users are never aware of the specific workflow engine, but work within applications that integrate business data from multiple systems.
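The intermediary amounts to an engine-agnostic facade; here's a sketch of what such a contract might look like (entirely hypothetical names, just to illustrate the idea):

```java
import java.util.List;
import java.util.Map;

// Calling applications depend only on this interface, never on the engine's
// own API, so the engine behind it (Camunda, or a legacy BPMS during the
// migration) can be swapped without retooling the applications above it.
public interface WorkflowIntermediary {
  String startWorkflow(String workflowType, String businessKey);
  void completeStep(String workItemId, Map<String, Object> output);
  List<String> openWorkItemsFor(String userId);
}
```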

They take advantage of in-flight instance migration due to the long-running nature of their processes, something that Camunda offers but that is missing from many other BPMS products. Because of the large number of process templates and the complex architecture with many other systems and components, they have implemented automated testing practices, including modeling user interactions through their workflow interface service (which sits above the workflow intermediary and the Camunda engine), and handling workarounds for emulating external task processing in their core services.
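For those who haven't seen it, Camunda's migration API works by mapping activities in the old process definition to activities in the new one, then migrating the running instances in place; a minimal sketch with hypothetical activity ids:

```java
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.migration.MigrationPlan;

public class InFlightMigration {
  public static void migrate(ProcessEngine engine,
                             String oldDefinitionId, String newDefinitionId) {
    MigrationPlan plan = engine.getRuntimeService()
        .createMigrationPlan(oldDefinitionId, newDefinitionId)
        .mapEqualActivities() // auto-map activities whose ids are unchanged
        .mapActivities("reviewRequest", "reviewMemberRequest") // renamed task
        .build();

    engine.getRuntimeService()
        .newMigration(plan)
        .processInstanceQuery(engine.getRuntimeService()
            .createProcessInstanceQuery()
            .processDefinitionId(oldDefinitionId))
        .execute(); // or executeAsync() to run as a batch for large volumes
  }
}
```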

They’ve developed a lot of best practices for automated testing, and built tools such as a BPMN navigator to use during unit testing. Another of their colleagues, Zain Esmail, will be presenting more about this on the technical track tomorrow. They have also developed tools for administrative monitoring and reporting on external tasks, to allow these to be integrated with the internal Camunda workflow metrics in Prometheus.
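One common way to emulate external task processing in a test, which may be along the lines of what they described, is to play the worker's role directly against the engine's external task service, so the process can advance without the real core service being present; a sketch:

```java
import java.util.List;
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.externaltask.LockedExternalTask;

public class ExternalTaskTestSupport {
  // Fetch and lock whatever is waiting on the topic, then complete it with
  // no output, standing in for the real worker during an automated test.
  public static void completePendingExternalTask(
      ProcessEngine engine, String topic, String workerId) {
    List<LockedExternalTask> tasks = engine.getExternalTaskService()
        .fetchAndLock(1, workerId)
        .topic(topic, 10_000L) // 10s lock is plenty for a unit test
        .execute();

    for (LockedExternalTask task : tasks) {
      engine.getExternalTaskService().complete(task.getId(), workerId);
    }
  }
}
```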

We’re taking a short break between the morning and afternoon sessions, so I’ll close this out now and be back in another post as things progress, either this afternoon or tomorrow.

My post on @Trisotech blog: The Changing Nature of Work – 2020 Version

I’ve been interested for a long time in how the work that people do changes when we introduce new types of technology. In 2011, I gave a keynote at the academic BPM conference in Clermont-Ferrand called “The Changing Nature of Work”, and I’ve written and presented on the topic many times since then.

The current pandemic crisis has me thinking about how work is changing, not due to the disruptive forces of technology, but due to the societal disruption. Technology is enabling that change, and I have some ideas on how that’s going to work and what that will mean going forward even when things are “back to [new] normal”. I believe there’s a big part for process to play in this, including process mining and automation, and you can find my thoughts about this in a guest post over on the Trisotech blog.

Free BPM (and other) textbooks and journals from SpringerLink

Springer, a publisher of academic journals and books, is offering many textbooks and papers for free right now; normally these come at a hefty price. Although they state that they are offering these “to support learning and teaching at higher education institutions worldwide”, there are no restrictions on anyone downloading them. The first of these that I found of particular interest is Fundamentals of Business Process Management (2018) by Marlon Dumas, Marcello La Rosa, Jan Mendling and Hajo Reijers. This is a classic textbook used for teaching BPM at the university level, although it’s also useful to practitioners looking to dig a bit deeper into the subject.

The second one is Business Process Management Cases (2018) by Jan vom Brocke and Jan Mendling. This is a collection of real-world case studies highlighting best practices and lessons learned, hence aimed at a professional practitioner audience as well as students. Each case study chapter is written by a different author, and they are grouped into sections focused on Strategy and Governance, Methods, Information Technology, and People and Culture.

If you check out the SpringerLink search page, you’ll find a wide variety of books and papers available to download as PDF and/or EPUBs. To see the ones that are free, uncheck the “Include Preview-Only content” box near the top left. You can filter to see only certain types of content (e.g., books), as well as by discipline and language.

Thanks to Marlon Dumas and Jan Mendling for giving the shout out on Twitter about this.

My upcoming webinar on Financial Services and the need for flexible processes to weather a crisis

It’s a busy time for webinars, and I’ll be doing two more in the next two months sponsored by Signavio. On April 15, I’ll be talking about financial services, and how flexible processes are even more important now because of the current pandemic situation:

The current pandemic crisis and resultant market downturn have customers scrambling for safe investments, and financial services companies being forced to support remote working, all while legacy platforms creak precariously under the strain. Now, more than ever, it is critical for companies to understand their end-to-end processes and apply intelligent automation just to survive. Market leaders with flexible processes will be able to adjust quickly to the rapidly-changing environment, and even offer new business models and products suited to the uncertain times ahead.

I just had someone in financial services reach out to me and say “new automation strategy initiatives are the furthest thing from my leadership’s mind right now”, but I agree with other analysts that this is the perfect time to be pushing innovation in your organization. Regardless of your industry, you’re in the middle of a massive disruption; you can either respond to that disruption and innovate to stay ahead of your competition, or you’re going to be one of the many economic casualties in the months ahead.

Interested in hearing more about this, and participating in the conversation? Register for the webinar at the link above, and feel free to send me your ideas and experiences in advance to discuss during the webinar.

Catch me on a @SoftwareAG webinar next week talking about operational excellence

With so many cancelled conferences and people working from home, this is a great time to tune up your skills through online courses and webinars. I’ll be presenting with Eric Roovers, business transformation leader at Software AG, on a webinar about operational excellence next Wednesday, April 1, at 10:00 Eastern (16:00 in Central Europe, 07:00 Pacific).

We are doing this as a conversation rather than a presentation, so be prepared for Eric and me to riff on our opinions about the key issues today in achieving and maintaining operational excellence. This is the first of a series of webinars that Software AG will be hosting on operational excellence, although the only one where I’ll be speaking.

The webinar is sponsored by Software AG, and you’ll need to register at the link above. If you can’t make it at that time, go ahead and register and you’ll be sent a link to the on-demand version later.

Snap! @Celonis offers free process mining in the cloud

I attended a webinar today where Celonis showed Snap, the new free tier of their cloud-based process mining platform. It can work with flat files (CSV), or manually connect to ServiceNow, Google Sheets and a few other data sources; it also supports teams for collaboration. You’re limited to uploading 500MB of data (which is a lot of records if you’re uploading just case ID, activity name, resource, and start and end times), and there’s no continuous data integration for event collection the way there is with their full-blown IBC Enterprise version; additionally, several functions from the main product are visible but not enabled. However, if you want to dip your toe into process mining with real-sized datasets, this is a good choice.
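For reference, the upload is essentially a flat event log; here's a minimal sketch of the kind of CSV involved (column names are illustrative, so check Celonis' documentation for its exact requirements):

```csv
case_id,activity,resource,start_time,end_time
ORD-1001,Receive Order,jsmith,2020-03-02 09:14,2020-03-02 09:20
ORD-1001,Check Credit,credit-bot,2020-03-02 09:21,2020-03-02 09:22
ORD-1001,Ship Order,warehouse,2020-03-03 14:05,2020-03-03 14:30
ORD-1002,Receive Order,jsmith,2020-03-02 10:02,2020-03-02 10:07
```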

The process mining functionality is similar to what you see with competitive products, plus some nice dashboard capabilities for visualizing not just the overall discovered process flow, but for drilling into problem areas to do root cause analysis.

You can sign up for a free Celonis Snap account on their website. It self-deploys within a couple of minutes, creating a team account and setting you up as the admin, with the ability to add other users to the team. It doesn’t support everything in the paid platform, but it’s definitely a good way to get started with process mining. There’s also an online community for asking (and answering) questions.

They are also offering free online training (not sure if that’s just for now or if they will do this on an ongoing basis) that covers their full paid product suite; if you’re using Snap, the parts of the training related to process discovery and analysis will be relevant. They are launching free Snap-specific training next week, and adding new features to the product, such as connectors to SAP Ariba. Obviously, they’d like to convert free accounts into paid accounts, but this is more than just a 30-day trial or a non-functional demo account: you can do some real work with the free version, then move to the paid version when your data or processing needs exceed it.

My new post on the @Trisotech blog: the case for a business automation center of excellence

I’m doing a series of posts on the Trisotech blog, and this month I’m digging into the concept of a broader center of excellence (CoE) around business automation because I’ve seen a lot of problems when CoEs are implemented around a specific methodology, technology or tool, e.g., a BPM CoE, or even worse, a [insert vendor name here] BPMS CoE.

In addition to my consulting practice, where I see CoEs in action and help organizations create them, I work with a lot of vendors. In the post, I included a diagram of how most vendors see themselves relative to their market.

Head on over to the Trisotech blog to read the entire article.

Tune in on March 10 for my webinar on intelligent insurance processes

I’ve been sponsored by Signavio for a short webinar series. The first, on intelligent banking processes, was last month; this month on March 10, I’ll be presenting on intelligent insurance processes:

With customer churn rates approaching 25% in some insurance sectors, insurers are attempting to increase customer retention by offering innovative products that better address today’s market. The ability to create and support innovative products has become a top-level goal for many insurance company executives, and requires agile and automated end-to-end processes for a personalized customer journey.

I’m focusing these webinars on very real use cases that I’ve observed in my years working with financial and insurance customers, linking up the top-level goals in organizations with the process tools that can help to achieve those goals. There’s such confusion in the landscape of intelligent automation tools, and many of the vendors don’t help much, obfuscating the differences between product categories (not just the differences between products in the same category). I’ll try to bring a bit of clarity to all of that, plus walk through an insurance use case to see how different types of process tools can be applied.

Head over to the link above to register for the webinar; even if you can’t attend next week, you’ll get a follow-up link to the recorded webinar for viewing later.

Replay of my webinar on banking processes with @Signavio

I did a webinar last week, sponsored by Signavio, that focused on banking processes. In particular, I looked at the top-level goals such as revenue, costs, compliance and competitive differentiation, and lined those up with some of the departmental goals and the process-related tools that can help to reach those goals. As usual, you can find my slides on Slideshare:

The replay is up; you can find it on the Signavio website (registration required). If you registered prior to the webinar, you will have already received an email with a link to the replay.