SAP acquiring Signavio into Business Process Intelligence unit

I first met Signavio CEO Gero Decker in 2008, when he was a researcher at the Hasso Plattner Institut and emailed me about promoting their BPMN poster — a push to have BPMN (then version 1.1) recognized as a standard for process modeling. I attended the academic BPM conference in Milan that year but Gero wasn’t able to attend, although his name was on a couple of that year’s modeling-related demo sessions and papers related to Oryx, an open source process modeling project. By the 2009 conference in Ulm we finally met face-to-face, and he told me about the process modeling ideas he was working on, which would eventually evolve into Signavio. By the 2010 BPM conference in Hoboken, he was showing me a Signavio demo, and we ended up running into each other at many other BPM events over the years, as well as having many online briefings as they released new products. The years of hard work that he and his team have put into Signavio have paid off this week with the announcement of Signavio’s impending acquisition by SAP (Signavio press release, SAP press release). There have been rumors floating around for a couple of days, and this morning I had the chance for a quick chat with Gero in advance of the official announcement.

The combination of business process intelligence from SAP and Signavio creates a leading end-to-end business process transformation suite to help our customers achieve the requirements needed to gain a competitive edge.

Luka Mucic, CFO of SAP

SAP is launching RISE with SAP today, with the Signavio acquisition a part of the announcement. RISE with SAP is billed as “business transformation as a service”, providing business process redesign (including Signavio), technical migration (which appears to be a push to get reluctant customers onto their current platform), and building an intelligent enterprise (which is mostly a cloud infrastructure message).

This is a full company acquisition, including all Signavio employees (numbering about 500). Gero and the only other co-founder still at Signavio, CTO Willi Tscheschner, will continue in their roles to drive forward the product vision and implementation, becoming part of SAP’s relatively new Business Process Intelligence unit, which reports directly to the executive board. Since that unit previously contained about 100 people, the Signavio acquisition will swell those ranks considerably, and Gero will co-lead the unit with the existing GM, Rouven Morato. A long-time SAP employee, Morato can no doubt help navigate the sometimes murky organizational waters that might otherwise trip up a newcomer. Morato was also a significant force in SAP’s own internal transformation through analytics and process intelligence, moving them from the dinosaur of old to a (relatively) more nimble and responsive company, and hence understands the importance of products like Signavio’s in transforming large organizations.

Existing Signavio customers probably won’t see much difference right now. Over time, capabilities from SAP will become integrated into the process intelligence suite, such as deeper integration to introspect and analyze SAP S/4 processes. Eventually product names and SKUs will change, but as long as Gero is involved, you can expect the same laser focus on linking customer experience and actions back to processes. The potential customer base for Signavio will broaden considerably, especially as they start to offer dashboards that collect information on processes that include, but are not limited to, the SAP suite. In the past, SAP has been very focused on providing “best practice” processes within their suite; however, if there’s anything that this past year of pandemic-driven disruption has taught us, it’s that those best practices aren’t always best for every organization, and that processes almost always include steps outside of SAP. Having a broader view of end-to-end processes will help organizations in their digital transformations.

Obviously, this is going to have an impact on SAP’s current partnership with Celonis, since SAP Process Mining by Celonis would be directly in competition with Signavio’s Process Intelligence. Of course, Signavio also has a long history with SAP, but their partnership has not been as tightly branded as the Celonis arrangement. Until now. Celonis arguably has a stronger process mining product than Signavio, especially with their launch into task mining, and has a long history of working with SAP customers on their process improvement. There’s always room for partners that provide different functionality even if somewhat in competition with internal functionality, but Celonis will need to build a strong case for why an SAP customer should pick them over the Signavio-based, SAP-branded process intelligence offering.

Keep in mind that SAP hasn’t had a great track record with process products that aren’t part of their core suite: remember SAP NetWeaver BPM? Yeah, I didn’t think so. However, Signavio’s products are focused on modeling and analyzing processes, not automating them, so they might have a better chance of being positioned as discovering improvements to processes that are automated in the core suite, as well as giving SAP more visibility into how their customers’ businesses run outside of the SAP suite. There’s definitely great potential here, but also the risk of just becoming buried within SAP — time will tell.

Disclosure: Signavio has been a client of mine within the last year for creating a series of webinars. I was not compensated in any way for writing this post (or anything else on this blog, for that matter), and it represents my own opinions.

Making experience matter by building the right incentives into processes

Last month at the Bizagi virtual conference, I gave a keynote on aligning intelligent process automation with employee incentives and business goals. I decided to expand on those themes a bit for my monthly post on the Trisotech blog. Rather than the usual sort of performance metrics, I suggest the following:

The key to designing metrics and incentives is to figure out the problems that the workers are there to solve, which are often tied in some way to customer satisfaction, then use that to derive performance metrics and employee incentives.

There are a lot of challenges in figuring out how to measure and reward experience and innovative thinking: if it’s done wrong, then companies end up measuring how long you spent with a particular app open on your screen, or how many keystrokes you typed.

We’re going through a lot of process disruption right now, and smart companies are using this opportunity to retool the way that they do things. They also need to be thinking about how their employee incentives are lined up with that redesign, and whether business goals are being served appropriately.

You can check out the whole post over at Trisotech’s blog.

Disclaimer: Trisotech is my client.

(Post image by my talented friend Alison Garwood-Jones).

The State of Process Automation, from Camunda

Camunda has just published a 20-page report on the state of process automation, which is pretty balanced (i.e., not particularly biased towards their own products). They get right to the point up front:

Process automation has emerged as a linchpin for digital transformation, powering innovation across a company. Process automation is equally sought after to improve an organization’s top line as well as its bottom line – helping to improve customer service, lower costs and drive business growth.

I’m definitely on board with this statement. Companies that are most likely to emerge successfully from the current disruption are taking a hard look at their business processes, and considering how to include more intelligent automation.

The report is based on the results of a survey that they commissioned, which included 400 IT decision makers in the US and Europe. Almost all of those interviewed (97%) agreed that process automation is vital to digital transformation, and I was encouraged that half of the current initiatives are focused on growth rather than just efficiency or firefighting. As I’ve been saying for a while, efficiency and productivity are table stakes: you have to consider those, but you’re not going to get the biggest benefit until you start looking at what intelligent automation can do for top-line growth and customer satisfaction.

The survey included a few questions on the impact of the pandemic, with 80% of respondents saying that they are doing more automation because of remote work and (I assume) fewer workers in some cases. This is not unexpected, with 68% reporting that key business processes had breakdowns due to remote work, and most companies are working harder on automation initiatives in order to survive the current disruption.

Definitely worth a read.

Pandemic-driven digital transformation in the legal world: this genie is not going back in the bottle

When I write or present about the type of digital transformation that the pandemic is forcing on firms in order to survive, I usually use examples from financial services and insurance, since that’s where I do most of my consulting. However, we see examples all around us as consumers, as every business of every size struggles to transform to an online model to be able to continue providing us with goods and services. And once both the consumers and the businesses see the benefits of doing some (not all) transactions online, there will be no going back to the old way of doing things.

I recently moved, and just completed the closing on the sale of my previous home. It’s been quite a while since I last did this, but it was always (and I believe still was, until a few months ago) a very paper-driven, personal-service type of transaction. This time was much easier, and almost all online; in fact, I’ve never even met anyone from my lawyer’s office face-to-face, I didn’t use a document courier, and I only saw my real estate agent in person once. All documents were digitally signed, and I had a video call with my lawyer to walk through the documents and verify that it was me doing the signing. I downloaded the signed documents directly, although the law office would have been happy to charge me to print and mail a copy. To hand over the keys, my real estate agent just left their lockbox (which contained the keys for other agents to do showings) and gave the code to my lawyer to pass on to the other party once the deal closed. Payments were all done as electronic transfers.

My lawyer’s firm is obviously still struggling with this paradigm, and provided the option to deliver paper documents, payments and keys by courier (in fact, I had to remind them to remove the courier fee from their standard invoice). At the same time, they no longer offer in-person meetings: it has to be a video call. Yes, you can still sign physical documents and courier them back and forth, but that’s going to add a couple of days to the process and is more cumbersome than signing them digitally. Soon, I expect to see pricing from law firms that strongly encourages their clients to do everything digitally, since it costs them more to handle the paper documents and can create health risks for their employees.

Having gone through a real estate closing once from the comfort of my own home, I am left with one question: why would we ever go back to the old way of doing this? I understand that there are consumers who won’t or can’t adapt to new online methods of doing business with organizations, but those are becoming fewer every day. That’s not because the millennial demographic is taking over, but because people of all ages are learning that some of the online methods are better for them as well as for the companies that they deal with.

Generalizing from my personal anecdote, this is happening in many businesses now: they are making the move to online business models in response to the pandemic, then finding that for many operations, this is a much better way of doing things. Along the way, they may also be automating some processes or eliminating manual tasks, like my lawyer’s office eliminating the document handling steps that used to be done. Not just more efficient for the company, but better for the clients.

As you adjust your business to compensate for the pandemic, design your customer-facing processes so that they make it easier (if possible) for your customer to do things online than the old way of doing things. That will almost always be more efficient for your business, and can greatly improve customer satisfaction. This does not mean that you don’t need people in your organization, or that your customers can’t talk to someone when required: automating processes and tasks means that you’re freeing up people to focus on resolving problems and improving customer communications, rather than performing routine tasks.

As one of my neighbourhood graffiti artists so eloquently put it, “6 feet apart but close 2 my ❤”.

IBM #Think2020 Day 2: Smarter Business is apparently only about AI

This is now my third day attending IBM’s online Think 2020 conference: I attended the analyst preview on Monday, then the first day of the conference yesterday. We started the day with Mark Foster, SVP of IBM Services, giving a keynote on building resilient and smarter businesses. He pointed out that we are in uncertain times, and many companies are still figuring out whether to freeze new initiatives, or take advantage of this disruption to build smarter businesses that will be more competitive as we emerge from the pandemic. This message coming from a large software/services vendor is a bit self-serving, since they are probably seeing this quarter’s sales swirling down the drain, but I happen to agree with him: this is the time to be bold with digital transformation. He referred to what can be done with new technologies as “business process re-engineering on steroids”, and said that it’s more important than ever to build more intelligent processes to run our organizations. Resistance to change is at a particularly low point, except (in my experience) at the executive level: workers and managers are embracing the new ways of doing things, from virtual experiences to bots, although they may be hampered somewhat by skittish executives who think that change at a time of disruption is too risky, while holding the purse strings of that change.

He had a discussion with Svein Tore Holsether, CEO of Yara, a chemical company with a primary business in nitrogen crop fertilizers. They are also building informational platforms for sustainable farming, and providing apps such as a hyper-local farm weather app in India, since factors such as temperature and rainfall can vary greatly due to microclimates. The current pandemic means that they can no longer have their usual meetings with farmers — apparently a million visits per year — but they are moving to virtual meetings to ensure that the farmers still have what they need to maximize their crop yields.

Foster was then joined by Carol Chen, VP of Global Lubricants Marketing at Shell. She talked about the specific needs of the mining industry for one of their new initiatives, leveraging the ability to aggregate data from multiple sources — many of them IoT — to make better decisions, such as predictive maintenance on equipment fleets. This allows the decisions about a mining operation to be made from a digital twin in the home office, rather than just by on-site operators who may not have the broader context: this improves decision quality and local safety.

He then talked to Michael Lindsey, Chief Transformation and Strategy Officer at PepsiCo North America, with a focus on their Frito-Lay operations. This operation has a huge fleet, controlling the supply chain from the potato farms to the store. Competition has driven them to have a much broader range of products, in terms of content and flavors, to maintain their 90%+ penetration into the American household market. Previously, any change would have been driven from their head office, moving out to the fringes in a waterfall model. They now have several agile teams based on IBM’s Garage Methodology that are more distributed, taking input from field associates to know what is needed at each point in the supply chain, driving demand from the store shelves back to the production chain. The pandemic crisis means that they have had to move their daily/weekly team meetings online, but that has actually made them more inclusive by not requiring everyone to be in the same place. They have also had to adjust the delivery end of their supply chains in order to keep up with demand for their products: based on my Facebook feed, there are a lot of people out there eating snacks at home, fueling a Frito-Lay boom.

Rob Thomas, SVP of IBM Cloud & Data Platform, gave a keynote on how AI and automation is changing how companies work. Some of this was a repeat from what we saw in the analyst preview, plus some interviews with customers including Mirco Bharpalania, Head of Data & Analytics at Lufthansa, and Mallory Freeman, Director of Data Science and Machine Learning in the Advanced Analytics Group at UPS. In both cases, they are using the huge amount of data that they collect — about airplanes and packages, respectively — to provide better insights into their operations, and perform optimization to improve scheduling and logistics.

He was then joined by Melissa Molstad, Director of Common Platforms, Data Strategy & Vendor Relations at PayPal. She spoke primarily about their AI-driven chatbots, with the virtual assistants handling 1.4M conversations per month. This relieves the load on customer service agents, especially for simple and common queries, which is especially helpful now that they have moved their customer service to distributed home-based work.

He discussed AIOps, which was announced yesterday by Arvind Krishna; I posted a bit about that in yesterday’s post, including some screenshots from a demo that we saw at the analyst preview on Monday. They inserted the video of Jessica Rockwood, VP of Development for Hybrid Multicloud Management, giving the same demo that we saw on Monday, which is worth watching if you want to hear the entire narrative behind the screenshots.

Thomas’ last interview segment was with Aaron Levie, CEO of Box, and Stewart Butterfield, CEO of Slack, both ecosystem partners of IBM. Interesting that they chose to interview Slack rather than use it as an engagement channel for the conference attendees. ¯\_(ツ)_/¯  They both had interesting things to add on how work is changing with the push to remote cloud-based work, and the pressures on their companies from helping a lot of customers move to cloud-based collaboration all at once. There seems to be general agreement (I agree as well) that work is never going back to exactly how it was before, even when there is no longer a threat from the pandemic. We are learning new ways of working, and also learning that things that companies thought could not be done effectively — like work from home — actually work pretty well. Companies that embrace the changes and take advantage of the disruption can jump ahead on their digital transformation timeline by a couple of years. One of them quoted Roy Amara’s adage that “we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run”; as distributed work methods, automation and the supporting technology get a foothold now, they will have profound effects on how work is done in the future. This is not going to be about which organizations have the most money to spend: it will hinge on the ability and will to embrace AI and automation to remake intelligent end-to-end processes. Software vendors will need to accept the fact that customers want to do best-of-breed assembly of services from different vendors, meaning that the vendors that integrate into a standard fabric are going to do much better in the long run.

I switched over to the industry/customer channel to hear a conversation between Donovan Roos, VP of Enterprise Automation at US Bank, and Monique Ouellette, VP of Global Digital Workplace Solutions at IBM. She invited us at the beginning to submit questions, so this may have been one of the few sessions that has not been prerecorded, although they never seemed to take any audience questions so I’m not sure. Certainly much lower audio and video quality than most of the prerecorded sessions. US Bank has implemented Watson AI-driven chatbots for internal and external service requests, and has greatly reduced wait times for requests where a chatbot can assist with self-service rather than waiting for a live agent. Roos mentioned that they really make use of the IBM-curated content that comes as part of the Watson platform, and many of the issues are resolved without even hitting internal knowledge bases. Like many other banks during the current crisis, they have had to scale up their ability to process small business loans; although he had some interesting things to mention about how they scaled up their customer service operations using AI chatbots, I would also be interested to hear how they have scaled up the back-end processes. He did mention that you need to clean up your business processes first before starting to apply AI, but no specifics.

I stayed on the industry channel for a presentation on AI in insurance by Sandeep Bajaj, CIO of Everest Re Group. I do quite a bit of work with insurance companies as a technical strategist/architect so have some good insights into how their business works, and Bajaj started with the very accurate statement that insurance is an information-driven industry, based not just on standard business information but also on IoT and telematics data, especially for vehicle and P&C coverage. This provides great opportunities for better insights and decisions based on AI that leverages that data. He believes that AI is no longer optional in insurance because of the many factors and data sources involved in decisions. He did discuss the necessity to review and improve your business processes to find opportunities for AI: it’s not a silver bullet, and needs relatively clean processes to start with — the same message that we heard from US Bank in the previous presentation. Everest reviewed some of their underwriting processes and split the automation opportunities between robotic process automation and AI, although I would have thought that using them together, along with other automation technologies, could provide a better solution. They used an incremental approach, which let them see results sooner and feed initial results back into ongoing development. One side benefit is that they now capture much more of the unstructured information from each submission, whereas previously they would only capture the information entered for those submissions that led to business; this allows them to target their marketing and pricing accordingly. They’re starting to use AI-driven processes for claims first notice of loss (FNOL is a classic claims problem) in addition to underwriting, and are seeing operational efficiency improvements as well as better accuracy and time to market. Looking ahead, he sees that AI is here to stay in their organization since it’s providing proven value. Really good case study; worth watching if you’re in the insurance business and want to see how AI can be applied effectively.

That’s it for me at IBM Think 2020, and I’ve really noticed a laser focus on AI and cloud at this event. I was hoping to see more of the automation portfolio, such as process modeling, process management, robotic process automation, decision management and even content management, but it’s as if they don’t exist.

IBM had to pivot to a virtual format relatively quickly since they already had a huge in-person conference scheduled for this time, but they could have done better both for content and format given the resources that they have available to pour into this event. Everyone is learning from this experience of being forced to move events online, and the smaller companies are (not surprisingly) much more agile in adapting to this new normal. I’ll be at the virtual Appian World next week, then will write an initial post on virtual conference best — and worst — practices that I’ve seen over the five events that I’ve attended recently. In the weeks following that, I’ll be attending Signavio Live, PegaWorld iNspire and DecisionCAMP, so will have a chance to add on any new things that I see in those events.

Alfresco Modernize 2020

I’ve been attending the online Celonis conference for the past couple of days, but taking a break for Alfresco‘s short event, Alfresco Modernize. We started with an insightful keynote from CTO John Newton on patterns of digital transformation. As we likely enter a recession triggered by the global pandemic, he pointed out that most companies fail to execute properly through a recession, and showed some Harvard Business Review research on what actually works. This includes investing in digital transformation, decentralizing decision making, and being sure to retain knowledge and experience. The responses of digital leaders to disruptions such as what we’re now seeing focus on improving business processes, modernizing infrastructure, and making it easier to connect with customers and suppliers.

He discussed the concept of digital transformation patterns that can be derived from successful journeys, such as customer onboarding or improving manufacturing operations. He addressed the different layers of patterns shown in the chart at the left, and how they interact. We’ve used patterns in software development for a long time, and Newton shows us that it’s time to start documenting, understanding and applying digital transformation patterns. Alfresco wants to start documenting these in a very open source manner, and create solutions to address the common patterns.

Up next was a presentation by Dinesh Selvakumar, Global Head of Enterprise Content Management at Invesco, a global investment management firm. They are a relatively new Alfresco ECM customer, implementing in their own AWS instance during 2018-2019, and migrated content from other systems. They still have a lot of content silos, plus ad hoc routing and approval workflows, and have created an ECM CoE to improve standardization and governance. They want to integrate their systems to provide a unified user experience, and moved from an ECM mindset to that of Enterprise Content Services (ECS) that provides unified capabilities across the disparate platforms. They realize that there are some content and collaboration platforms that they’re never going to get rid of, but still need to have them integrated into the big picture connected by Alfresco. Eventually, enterprise content may be created in other applications, but then sent to Alfresco for enterprise-level management. They are quantifying the benefits of the move to an ECS, although some of the benefits are difficult to measure, such as decreased time searching for content. He shared some of the lessons that they learned and their key success factors, several of which are based on having a global focus and deployment.

Tony Grout, Alfresco Chief Product Officer, provided a product roadmap for their digital business platform. I found the slide on content and process interesting, in that it mentioned “processes relating to a document”: it seems like they have really trimmed off any of the pure process management messaging that they had previously, although Alfresco Process Services (based on Activiti) is still alive and well. Part of their core value proposition is the ability to start with open source and transition to the fully-supported (and more functional) enterprise version: this is true of any commercial open source vendor, but it’s front and center with Alfresco.

There are a number of new features on the roadmap: Federation Services (launching today) to federate different repositories, managing content in place instead of having to migrate everything to Alfresco; regulatory compliance in AWS; and the Enterprise Viewer that we saw demoed a bit later. Some of these capabilities likely came from their acquisition of Technology Services Group, a former Alfresco partner that built out a lot of value-added functionality.

Mark Stevens, General Manager for the Alfresco Cloud, introduced how they are rolling out the Alfresco Digital Business Platform as a service, and why cloud provides such great benefits for content management: resiliency, availability, and lower costs. Their platform is cloud-native, not just a containerized version of an on-premises platform, which provides better scalability and extensibility. Removing most of the overhead of managing an ECM platform means that you have more time (and money) for innovation and digital transformation. He walked us through their overall architecture, and what a typical implementation looks like, in terms of what’s managed by the customer and what’s managed by Alfresco. They’ve had some pretty high-profile wins over other ECM vendors, such as OpenText Documentum and IBM FileNet, with transitioned customers seeing a lot of hard benefits from Alfresco Cloud.

Last up in this short event were Paul Hampton, Senior Director of Product Marketing, and Ben Allen, Technical Architect, talking about the new Federation Services and Enterprise Viewer products that were announced earlier by Tony Grout. These are both pretty significant new capabilities: Federation Services allows all content repositories to be federated through Alfresco, so that users have a single user experience, and all of the sources can be managed directly by policies set in Alfresco. Content is managed in place rather than all migrated into Alfresco, although in some cases this will likely be a first step on the way to a migration.

We saw a demo of the Enterprise Viewer, which has a lot of interesting capabilities for both internal and external participants. It’s fast to browse large document sets and to load individual large documents, since pages are cached across the network. Documents can be redacted for external participants, for example, removing personal information from an insurance claim when sending it to an external party for a repair quote. Video can be annotated to add comments at specific points, with the ability to jump directly to the point of a comment. Annotations are collaborative, so that a user can reply to an existing annotation.

I didn’t stick around for the live Q&A since I wanted to get back to CelosphereLive for a session starting at the same time. Alfresco Modernize didn’t have much of a “live” feel to it: the sessions were all pre-recorded which, as I’ve mentioned in my coverage of other online conferences, just doesn’t have the same energy. Also, without a full attendee discussion capability, this was more like a broadcast of multiple webinars than an interactive event, with a short Q&A session at the end as the only point of interaction. To their credit, each speaker was in their own home, practicing social distancing; although I liked the Celonis studio environment, I do feel that it’s a bit too early to have people in the same location for an event, no matter how controlled.

CamundaCon Live 2020 – Day 2: Microservices Orchestration, new stuff from Camunda, and legacy BPM migration

Day 2 of CamundaCon Live kicked off with Camunda co-founder Bernd Rücker talking about microservices orchestration and integration using workflow automation. This is a common theme for him, and I’ve seen earlier versions of this presentation, but he always brings something fresh to the discussion. He discussed reactive applications that are responsive, resilient, elastic and message-driven, then covered different styles of event-driven architecture.

He gave a (live) demo of autonomous services communicating using Kafka, and showed the issue with peer-to-peer choreography: there is no sense of the end-to-end orchestration to ensure that all services that should have run did actually run. He created an event-based process in Camunda Optimize that modeled the expected end-to-end process; by connecting that to the Kafka messages, he had a visualization of the expected workflow that showed what happens when one service isn’t running: effectively, the virtual workflow is stuck at the previous service, since no message arrives to indicate that the (stopped) service has picked up its messages.
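To make the choreography style concrete, here is a minimal sketch of one such autonomous service using the standard Kafka Java client. This is my own illustration rather than code from the demo; the topic names, group id and payment logic are assumptions. Note that nothing in this service (or any of its peers) knows whether the end-to-end flow ever completes, which is exactly the blindness that the Optimize visualization exposed.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

// Hypothetical payment service: reacts to "order-placed" events and emits
// "payment-received" events. No component ever sees the whole flow.
public class PaymentService {
  public static void main(String[] args) {
    Properties consumerProps = new Properties();
    consumerProps.put("bootstrap.servers", "localhost:9092");
    consumerProps.put("group.id", "payment-service");
    consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    Properties producerProps = new Properties();
    producerProps.put("bootstrap.servers", "localhost:9092");
    producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
         KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
      consumer.subscribe(List.of("order-placed"));
      while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
        for (ConsumerRecord<String, String> record : records) {
          // ... charge the customer here ...
          // Emit the next event; downstream services react to it, but nothing
          // here checks that the overall order process actually finished.
          producer.send(new ProducerRecord<>("payment-received", record.key(), record.value()));
        }
      }
    }
  }
}
```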

One solution is to extract the end-to-end responsibility into its own service: really, this implies some level of orchestration via commands rather than purely reacting to events, even if it’s not a completely tightly-coupled workflow. If you use an engine like Camunda to do that top-level orchestration, then you can move the monitoring of the process within that engine (Cockpit rather than Optimize) although it’s likely that anyone using an event-based architecture is going to be looking at an event monitoring system like Optimize as well. You can see his slides below, and the video will be available on the CamundaCon Live hub probably by the time that I publish this post.

The morning session continued with CTO Daniel Meyer on some of the new product capabilities. Camunda’s goal has moved from just being a BPM engine for Java developers to a much broader orchestration platform that can integrate any technology stack and any endpoints.

He introduced a new distribution called Camunda Run (or Lil’ Camboot, as Niall Deehan calls it), which provides a lightweight package (50MB) that includes the BPMN and DMN workflow and decision engines, Cockpit, Tasklist and the REST API. It can even be run in headless mode, which disables the web apps, if you just want the engines. It’s Open API enabled, CORS enabled, and SSL enabled out of the box. He gave a quick demo of downloading, starting and running Camunda Run: it’s pretty familiar if you’ve spent any time with Camunda, and it starts fast; there’s a sketch of calling its REST API after the list below. From the blog post announcement, the target audience for Run is if at least one of the following is true:

  • You need a standalone process engine accessible via REST API
  • You don’t have extensive Java knowledge (or none at all) but still want to use Camunda BPM
  • You don’t want to configure an application server yourself
  • You want to configure everything in one place
  • You just want to Run Camunda BPM
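To give a sense of what a standalone, REST-accessible engine means in practice, here is a minimal sketch of starting a process instance against a local Camunda Run installation using Java’s built-in HTTP client. The process definition key “invoice” and the variable payload are hypothetical; the /engine-rest base path and port 8080 are the Camunda Run defaults.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class StartProcessViaRest {
  public static void main(String[] args) throws Exception {
    // hypothetical process definition key "invoice" with one typed variable
    String json = "{\"variables\": {\"amount\": {\"value\": 100, \"type\": \"Integer\"}}}";

    // Camunda Run exposes the engine's REST API at /engine-rest by default
    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:8080/engine-rest/process-definition/key/invoice/start"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(json))
        .build();

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());

    // the response body contains the new process instance, including its id
    System.out.println(response.body());
  }
}
```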

Meyer also talked about Camunda Optimize, specifically the event-based process monitoring. We saw a bit of that yesterday in Felix Müller’s presentation, and I had a more complete view of the event-based features of Optimize a few weeks ago on the 3.0 release webinar. Basically, you add the event source to Optimize (such as Kafka), and Optimize exposes messages and allows them to be attached to the entry/exit points of elements on a BPMN diagram that represents the event-driven process. They are offering a 30-day free trial for Optimize now if you want to try it out.

Meyer’s third topic was about process automation as a service via Camunda Cloud, which is powered by Zeebe (rather than Camunda BPM). Having cloud-native Zeebe behind the scenes means that it’s highly scalable and fault-tolerant, and uses pub-sub orchestration to let you include endpoints from anywhere. He demonstrated how to spin up a new Zeebe cluster, then deploy a BPMN model that was created in the Zeebe Modeler and start instances of the process using the zbctl command line. These instances were then visible in Camunda Operate (the Zeebe process monitoring tool), and he ran JavaScript workers and published messages to complete tasks in the process and show the instance progressing through the process model. There’s a free trial for Camunda Cloud, and an early access version for $699/month that includes access to larger clusters and technical support.
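For comparison with the zbctl and JavaScript workers in the demo, here is roughly what a job worker looks like with the Zeebe Java client. This is a sketch under assumptions: the “charge-payment” job type is made up, the plaintext local gateway address is the simplest local setup, and a Camunda Cloud cluster would instead use the OAuth connection settings generated in its console.

```java
import io.zeebe.client.ZeebeClient;

public class ChargePaymentWorker {
  public static void main(String[] args) throws InterruptedException {
    // connect to a local Zeebe broker; Camunda Cloud would use OAuth credentials
    try (ZeebeClient client = ZeebeClient.newClientBuilder()
        .gatewayAddress("127.0.0.1:26500")
        .usePlaintext()
        .build()) {

      // subscribe to service task jobs of a hypothetical "charge-payment" type
      client.newWorker()
          .jobType("charge-payment")
          .handler((jobClient, job) -> {
            // ... call the payment system here ...
            jobClient.newCompleteCommand(job.getKey()).send().join();
          })
          .open();

      // keep the worker process alive; jobs are handled on background threads
      Thread.sleep(Long.MAX_VALUE);
    }
  }
}
```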

He fielded some questions that came up on the Slack workspace during his talk. Moving from an existing Camunda BPM implementation to Camunda Run is apparently as easy as just redirecting to the new application server. You can’t use Java delegates, but will have to switch those out for external tasks. There was a question about BPM versus Zeebe, which I think is a question that a lot of Camunda customers have: although most are likely familiar with the technical and functional differences, there is an open question of whether Camunda will continue to support two workflow engines in the future, and if they are going to shift focus more towards Zeebe use cases.
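For anyone facing that switch, here is a minimal sketch of an external task worker using Camunda’s external task client library; the topic name “creditCheck” and the handler body are hypothetical stand-ins for whatever logic the Java delegate used to contain.

```java
import org.camunda.bpm.client.ExternalTaskClient;

public class CreditCheckWorker {
  public static void main(String[] args) {
    // point the client at Camunda Run's REST API (default base path)
    ExternalTaskClient client = ExternalTaskClient.create()
        .baseUrl("http://localhost:8080/engine-rest")
        .asyncResponseTimeout(10000) // use long polling to reduce request churn
        .build();

    // subscribe to the topic that an external service task in the BPMN model references
    client.subscribe("creditCheck")
        .lockDuration(20000) // ms that a fetched task stays locked to this worker
        .handler((externalTask, externalTaskService) -> {
          // ... the logic that previously lived in a Java delegate goes here ...
          externalTaskService.complete(externalTask);
        })
        .open();
  }
}
```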

The morning finished by breaking out into two tracks; I stayed with the customer presentations rather than the technical breakout to hear some of the case studies. The one that I was most interested in was Fareed Saeed, head of Product and Tools for Advanced Process Solutions at Fidelity Investments, talking about migrating their monolithic legacy BPM to Camunda, in part because I did some early technical architecture consulting with them on their digital process automation platform over a year ago, although I’m not involved at this time. For those of you who know me mostly through this blog and as an independent industry analyst, you may not be aware that the other half of my business is as a consultant to large enterprises, mostly financial services and insurance, on technical architecture and strategy, or anything else to help make their process-centric implementation projects a success.

James Watson of Doculabs, who advised Fidelity on migration strategies, joined him for the discussion. Saeed talked about their current home-built workflow system, which runs thousands of different processes for most of their back office operations, and the need to move away from a monolithic architecture and fragile, non-agile systems to a more flexible platform. This talk was not about the architecture or platform, but about migration planning and execution: a key subject for any large enterprise moving off a legacy platform, but one that is often not fully considered during a new digital automation platform implementation.

There are a few different strategies for migrating process-based applications, and it’s not the same story for each process. Watson shared his thoughts on this (see the slide at right), but this is my take on it:

  • High-volume processes, that usually represent a smaller number of process models but most of the transaction volume, are usually rewritten from scratch while incorporating some degree of re-engineering and process improvement along the way. These are the core business processes that need to be done right, and will most benefit from the more agile and scalable new platform.
  • Lower volume processes can be reviewed to see if they’re still required, may possibly be combined with similar processes, then a straightforward “lift and shift” rewrite done to just duplicate the functionality as is. In short, these aren’t worth the time to re-engineer unless there are obvious wins, since the volume is relatively low. These are also candidates for low-code business-led development if that’s available on the automation platform, rather than the professional development teams required for the high-volume transactional processes.
  • Very low volume processes can be retired, especially if their functionality can be rolled into processes in one of the first two categories.

Although they are looking at a “factory model” for some level of automation around the migration, Saeed believes that this is an opportunity to re-engineer the processes rather than just rewriting the same (broken) process on a new platform. They want to have smaller, distributed groups for developing and delivering new applications, which means that they need to have the right governance and standards in place to support a distributed model. He sees the need for early pilots and successes to allow everyone to see how this can work, and learn how to make it successful. A strong diverse team of business leaders is also a plus, since there will be some degree of pain in the business units as the migration happens.

That’s it for the morning of Day 2; they must have read my comments yesterday and actually made sure that we finished on time so that we got our 15-minute lunch break. 🙂 I’ll be back for the afternoon to finish off CamundaCon Live 2020.


CamundaCon Live 2020 – Day 1: Jakob Freund keynote and customer presentations

Every conference organizer has had to deal with either cancelling their event or moving it to some type of online version as most of us work from home during the COVID-19 pandemic. Some of these have been pretty lacklustre, using only pre-recorded sessions and no live chat/Q&A, but I expected that Camunda would do this in a more “live” manner that, while not completely replacing an in-person event, has a similar feel to it. They did not disappoint: although a few of the CamundaCon presentations were pre-recorded, most were done live, and speakers were available for live Q&A. They also hosted a Slack workspace for live chat, which is much better than the Q&A/chat features on the webinar broadcast platform: it’s fundamentally more feature-rich, and also allows the conversations to continue after a particular presentation completes.

Very capably hosted by Director of Developer Relations Mary Thengvall, presentations were all done from the speakers’ individual locations, starting with CEO Jakob Freund’s keynote. He covered a bit of Camunda’s history and direction, and discussed their main focus of providing end-to-end process orchestration, using the example of Camunda together with RPA, then gradually migrating the RPA bots (widely used as a stop-gap process automation measure) to more robust API integrations. He also shared some news on new and timely product offerings, including a starter package for work-from-home human workflow, and an early adopter package for Camunda Cloud. I’ve shared a few of his slides below, but you can also go and see the recording: they are getting the videos and slides up within about an hour after each presentation, directly on the conference hub.

Next up was Simon Letort, Chief Digital Officer at Société Générale, on how they implemented their corporate investment banking’s core process automation platform using Camunda. They use Camunda as the core of their managed workflow platform, with 500+ processes deployed throughout their operations worldwide. They also use bpmn.io and form.io as their built-in process and forms modelers. Letort responded to an audience question about why not use another large BPMS product that was already in use; they wanted a best-of-breed solution rather than a proprietary walled garden, and also wanted to leverage open source tools as part of that so that they weren’t building everything from scratch. They transitioned from some of the proprietary tools by first replacing the underlying engine with Camunda, then swapping in other components, such as form.io, as a more flexible UI was required.

Interestingly, about half of their workflows are created by 30 expert modelers within centers of expertise, and half by 1200 “amateur” modelers, or citizen developers. This really points out the potential for companies to mix together the experts (focused on core processes) and amateurs (focused on tactical or administrative processes) using the same engine, although they likely use quite different tools for the full development cycle. The SG Workflow “product” offers three main features to support these different modeler/developer types: the (Camunda) process engine, a workflow aggregator for grouping tasks and cases from multiple systems, and UI web components and apps. Their platform also auto-generates process documentation. The core product is created and maintained by a team of about 10, distributed between France and Canada.

He shared some good information on their architecture and roadmap: I did a few presentations last year (one of them at CamundaCon in Berlin) and wrote a paper on building your own process-centric platform using a BPMS and an assembly of other tools, inspired in part by companies like Société Générale that have done this to create a much more flexible application development environment for their large enterprises.

We moved from the main stage to the track sessions, and I sat in on a presentation by Jeremy Warren of Keller Williams Realty (a Camunda customer that works with integrator BP3) about their “SmartPlans” dynamic processes — these aren’t actually dynamic at runtime, but use a flower process model that loops back to allow any task to lead to any other task — which allow real estate agents to create their own plans and tasks.

This is a great example of automating some of the processes that real estate agents use to drive new business, such as contacting prospects on a regular schedule, which would normally be done (or not done) manually. Agents can decide what tasks to do in what order; the branching logic in the model then executes the plan as specified. He also shared some of their experiences in rolling out and debugging applications on this scale.

The second track session featured Derek Hunter and Uzma Khan of Ontario Teachers’ Pension Plan (who have been an occasional client of mine over a number of years, including introducing them to startup Camunda back in 2013). They have a number of case management-style processes to handle requests from members (teachers) regarding their pensions. They have 144 BPMN templates, and execute 70,000 process instances per year with up to 20,000 active instances at any time, since these are generally long-running workflows. Some of the extremely long-running processes are actually terminated after a specific stage, then a scheduler restarts a corresponding instance when new work needs to be done. Other processes may be suspended in the workflow engine, making them invisible to a user’s worklist until work needs to be done.

Camunda is really just an engine buried within the OTPP workflow system, completely hidden from calling applications by a workflow intermediary. This was essential during their migration off other platforms: at one time, they had three different workflow engines running simultaneously, and could migrate everything to Camunda without having to retool the higher-level applications. In particular, end users are never aware of the specific workflow engine, but work within applications that integrate business data from multiple systems.

They take advantage of in-flight instance migration due to the long-running nature of their processes, which is something that Camunda offers that is missing from many other BPMS products. Because of the large number of process templates and the complex architecture with many other systems and components, they have implemented automated testing practices including modeling user interactions through their workflow interface service (that sits above the workflow intermediary and the Camunda engine), and handling work-arounds for emulating external task processing in their core services.
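In-flight instance migration is driven through the engine’s Java API. Here is a minimal sketch of what that can look like; the process definition ids are placeholders and the map-equal-activities strategy is the simplest option, so consider it an illustration of the mechanism rather than OTPP’s actual migration tooling.

```java
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.RuntimeService;
import org.camunda.bpm.engine.migration.MigrationPlan;

public class MigrateRunningInstances {
  public static void main(String[] args) {
    ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();
    RuntimeService runtimeService = engine.getRuntimeService();

    // placeholder ids for the old and new versions of a deployed process model
    String sourceDefinitionId = "memberRequest:12:abc123";
    String targetDefinitionId = "memberRequest:13:def456";

    // map activities whose ids are unchanged between the two versions
    MigrationPlan plan = runtimeService
        .createMigrationPlan(sourceDefinitionId, targetDefinitionId)
        .mapEqualActivities()
        .build();

    // migrate every instance still running on the old definition
    runtimeService.newMigration(plan)
        .processInstanceQuery(runtimeService.createProcessInstanceQuery()
            .processDefinitionId(sourceDefinitionId))
        .execute();
  }
}
```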

They’ve developed a lot of best practices for automated testing, and built tools such as a BPMN navigation tool to use during unit testing. Another of their colleagues, Zain Esmail, will be presenting more about this on the technical track tomorrow. They have also developed tools for administrative monitoring and reporting on external tasks, to allow these to be integrated with the internal Camunda workflow metrics in Prometheus.
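To give a flavor of this kind of automated process testing, here is a minimal JUnit sketch using Camunda’s test support together with the community camunda-bpm-assert helpers. The process key, BPMN file and activity ids are hypothetical, and it assumes a camunda.cfg.xml on the test classpath; OTPP’s own tooling, such as their BPMN navigation tool, goes well beyond this.

```java
import org.camunda.bpm.engine.runtime.ProcessInstance;
import org.camunda.bpm.engine.test.Deployment;
import org.camunda.bpm.engine.test.ProcessEngineRule;
import org.junit.Rule;
import org.junit.Test;

import static org.camunda.bpm.engine.test.assertions.bpmn.BpmnAwareTests.*;

public class MemberRequestProcessTest {

  @Rule
  public ProcessEngineRule engine = new ProcessEngineRule();

  @Test
  @Deployment(resources = "member-request.bpmn") // hypothetical model file
  public void requestWaitsAtReviewThenAdvances() {
    // start an instance of a hypothetical long-running member request process
    ProcessInstance instance = runtimeService()
        .startProcessInstanceByKey("memberRequest");

    // the instance should be waiting at the first user task
    assertThat(instance).isWaitingAt("ReviewRequestTask");

    // complete that task and verify the process advances to the next one
    complete(task(instance));
    assertThat(instance).isWaitingAt("ApproveRequestTask");
  }
}
```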

We’re taking a short break between the morning and afternoon sessions, so I’ll close this out now and be back in another post as things progress, either this afternoon or tomorrow.

Free COVID-19 apps from @Trisotech @Appian and @Pega

Yesterday, I passed on a link to The Master Channel’s free e-learning courses that you can use to start skilling up if you’re on the bench right now due to COVID-19. I’m also aware of a few companies in our industry who are offering free apps — some just to customers, some to everyone — that can help to fight COVID-19 in different ways.

The ability to build apps quickly is a cornerstone in our industry of model-driven development and low-code, and it’s encouraging to see some good offerings on the table already in response to our current situation.

Appian was first out of the blocks with a COVID-19 Response Management application for collecting and managing employee health status, travel history and more in a HIPAA-compliant cloud. You can read about it on their blog, and sign up for it online. Their blog post says that it’s free to any enterprise or government agency, although the signup page says that it’s free to organizations with over 1,000 employees — not sure which is accurate, since the latter seems to exclude organizations with fewer than 1,000 employees. It’s free only for six months at this point.

Pegasystems followed closely behind with a COVID-19 Employee Safety and Business Continuity Tracker, which seems to have similar functionality to the Appian application. It’s an accelerator, so you download it and configure it for your own needs, a familiar process if you’re an existing Pega customer — which you will have to be, because it’s only available to Pega customers. The page linked above has a link to get the app from the Pega Marketplace, where it will be free through December 31, 2020.

Trisotech is going in a different direction by offering several free online COVID-19 assessment tools based on clinical guidelines: some for the general public, and some to be used by healthcare professionals.

As a founding member of OMG’s BPM+ Health community, Trisotech has been involved in developing shareable clinical pathways for other medical conditions (using visual models in BPMN, CMMN and/or DMN), and I imagine that these new tools might be the first bits of new shareable clinical pathways targeted at COVID-19, possibly packaged as consumable microservices. You can click on the tools and try them out without any type of registration or preparation: they ask a series of questions and provide an assessment based on the underlying business rules, and you can also upload files containing data and download the results.

My personal view is that making these apps available to non-customers is sure to be a benefit, since they will get a chance to work with your company’s platform and you’ll gain some goodwill all around.

Free online digital transformation courses from @TheMasterChnnl

E-learning platform The Master Channel (which inexplicably has only 8 Twitter followers after I followed them, so get over there and connect) is offering free courses, exams, certificates and downloads to anyone affected by COVID-19. That is pretty much everyone on the planet by now. You can find out more details at the link above and in a recent LinkedIn post by Jan Moons, and he writes in more detail about e-learning in the time of the current pandemic in another post.

There’s a very real possibility that a lot of people will be “on the bench” in the near future: either their work requires travel, or their company has to make tough decisions about staffing. This is a great time to consider skilling up, and The Master Channel has courses on process and decision modeling, business analysis, analytics and more. I have never taken one of their courses so can’t vouch for the quality, and I am not being compensated in any way for writing this post, but probably worth checking out what they have to offer.

Their current offer is only until April 5th, although it’s clear to most people that our period of confinement is going to last much longer than that. Get them while you can.

If you know of other e-learning companies making similar offers, please add them in the comments of this post (including a link if you have one). I know of several universities that offer free online courses for related topics although they tend to be longer and much more detailed — I had to dedicate four weeks and relearn a lot of forgotten graph theory to get through the Eindhoven University of Technology’s course in process mining, which is more than a lot of people have time (or patience) for.