CamundaCon 2024 Day 2: Technical Keynote

We’re kicking off day 2 of CamundaCon in New York with the technical keynote, featuring Bernd Rücker, Co-Founder and Chief Technologist; Daniel Meyer, CTO; and Bastian Körber, Principal Product Manager. Bernd opened the session talking about organizations’ conflicting goals to continue innovating their business while also transforming and modernizing their technical architecture. This was an interesting although possibly unintentional tie-in with the SAP integration session that I attended at the end of the day yesterday, where the migration example from SAP ECC to S/4HANA falls into the latter category, but the business leaders are pushing for the business innovation and don’t want to “waste” time on technology modernization. Adding RPA/AI bots and moving to an orchestrated architecture allows for gradual architecture modernization while making the business processes much more agile by externalizing the processes from the legacy systems.

We saw a demonstration of claims handling showing their upcoming IDP (Intelligent Document Processing) capability, which calls AI to extract information from incoming documents, then figures out what to do with the information. The goal is to map that information onto the data elements in the process model, which then allows documents to be automatically integrated into processes with little or no human intervention.
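To make the IDP idea concrete, here is a rough sketch of what "mapping extracted information onto process data elements" could look like. This is my own illustration, not Camunda's actual IDP API; all field and variable names are invented:

```python
# Illustrative only: match IDP-extracted document fields to the
# closest-named process variable; unmatched fields are flagged for
# human review. Not Camunda's API -- names are invented.
from difflib import SequenceMatcher

def map_fields_to_variables(extracted, variable_names, threshold=0.6):
    mapped, needs_review = {}, {}
    for field, value in extracted.items():
        best, score = None, 0.0
        for var in variable_names:
            s = SequenceMatcher(None, field.lower(), var.lower()).ratio()
            if s > score:
                best, score = var, s
        if score >= threshold:
            mapped[best] = value          # confident match: auto-populate
        else:
            needs_review[field] = value   # route to a human task
    return {"mapped": mapped, "needs_review": needs_review}

result = map_fields_to_variables(
    {"claim_number": "C-1234", "claimant": "J. Smith", "notes": "..."},
    ["claimNumber", "claimantName", "policyId"],
)
```

In a real implementation, the extraction itself would be the AI/LLM call, and the unmatched fields would be exactly where the "little or no human intervention" caveat kicks in.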

We also saw some of their upcoming lightweight RPA capabilities built on the open source Robot Framework. The addition of IDP and RPA — necessary if Camunda wants to work their way into the new Gartner BOAT category — is intended to be relatively lightweight, not a replacement for more robust IDP and RPA products: if an organization is already using third-party products, those can just be treated as external services to be orchestrated as part of a Camunda process. Hopefully these will actually be “good enough” to be generally used, rather than being toy versions that are just there to chase the analyst categorization, as we’ve seen from many other vendors in the past.

The demo also includes other AI calls and SAP integration, highlighting their new/upcoming features. Worth watching the replay of the demo when the sessions are released next week to see Bernd walk through it all (with a bit of help from Daniel).

Daniel took over to discuss the next generation of automation platform, which expands their orchestration environment through the addition of AI at a number of different points. This is exposed in the modeler as IDP, RPA and AI connectors and services.

Bastian described the AI offerings in more detail, starting with the BPMN Copilot, which can be used to create BPMN diagrams based on text descriptions. There have been natural language processing interfaces to BPMN model generation around for quite a while, both in research and as some released products, but this adds LLMs behind the text processing for better results — the more text that is provided, the less AI hallucination. Output is not (necessarily) intended to be the final version, but a fairly advanced starting point for a human modeler to then continue modifying and completing. The LLM is using publicly available information to provide best practices for process models. The BPMN Copilot demos well but feels like a bit of a party trick. A cool party trick, but maybe not something that’s going to be mainstream for a while. Some of the underlying technology can definitely be used, however, for automated process optimization or at least optimization recommendations, by bringing process mining data and some natural language to bear.

Daniel referred to Forrester’s definitions of AI Agents (task automation) and Agentic AI systems (orchestration of multiple types of tasks including AI agents). AI agents may be descendants of RPA bots, where some level of AI is already in use, while Agentic AI is focused on autonomous systems that optimize themselves without human intervention.

We also saw a demo of a travel booking process that uses AI agents to organize, research and present options based on a general description of a desired trip booking. These agents are orchestrated into a process with some human touch points, where the AI options are shown as recommendations: calls to third-party AI/LLMs as part of an orchestrated process, demonstrating AI agents and agentic AI in the context of end-to-end business orchestration.

There was another example of a claims process with an ad hoc subprocess, blending deterministic and ad hoc in the same process where AI can be used to decide which activities are executed in which order within the ad hoc subprocess. The ad hoc subprocess has been in BPMN for a long time, but usually used to represent case management with human decisions or standard decision management on which activity to perform next; now, an LLM acts as the Next Best Action selector.
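As an illustration of the pattern (not Camunda's implementation), the control loop for an LLM-driven ad hoc subprocess might look something like this, with the LLM call stubbed out and the activity names invented:

```python
# Sketch of an ad hoc subprocess where a selector (in practice, an LLM)
# picks the next best action from the remaining activities, or signals
# completion. Activity names and the selection policy are illustrative.
def run_adhoc_subprocess(activities, select, case_data):
    executed = []
    remaining = set(activities)
    while remaining:
        choice = select(remaining, case_data)  # in practice: an LLM call
        if choice is None:                     # selector decides we're done
            break
        remaining.discard(choice)
        executed.append(choice)                # here: actually run the activity
    return executed

# Deterministic stand-in for the LLM: always prioritize fraud checks.
def fake_llm(remaining, case_data):
    for step in ("check_fraud", "assess_damage", "approve_payment"):
        if step in remaining:
            return step
    return None

order = run_adhoc_subprocess(
    {"assess_damage", "check_fraud", "approve_payment"}, fake_llm, {"claim": 1}
)
# order == ["check_fraud", "assess_damage", "approve_payment"]
```

The interesting design question is what guardrails sit around `select`: a real deployment would constrain the LLM's choices to the activities actually enabled in the model, exactly as the BPMN ad hoc subprocess semantics require.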

Daniel finished up with release dates: all of the features discussed will be released in 8.7 or 8.8 within the 2025 calendar year.

As we kick off the second day with an informative keynote, I also want to give a shout out to the Camunda events team, who keep everything running smoothly when I’m sure there are mini disasters happening behind the scenes every minute. Kudos!

CamundaCon 2024 Day 1: CEO Keynote

It seems that a lot of my posts are about Camunda lately, mostly because these are the events that I’m attending in person. Like a lot of people, I’m a bit over online conferences since too many of them are pre-recorded and the speaking is uninspired – as a long-time conference presenter, there’s just something about presenting to a live audience that livens up a presentation. This week, I’m in New York for CamundaCon 2024, which is also being streamed live if you want to participate remotely. The livestream really is live, and they use Slido to field questions from audience members regardless of location.

Day 1 opened with a keynote by CEO Jakob Freund. It’s been a while since I’ve had a Camunda briefing, and several people hinted in advance that there would be some interesting updates. He opened with their growth stats: Camunda now has more than $100M in revenue and 500 employees, which is a pretty stellar path from its humble origins. He then went on to discuss waves of change, primarily waves of AI opportunities such as the current agentic AI. The goal is to drive the right process architecture, which doesn’t mean just throwing AI at a spaghetti architecture, although many organizations will be unable to resist that path since it gives the illusion of progress. The same has happened with many new technologies in the past: think of how RPA (robotic process automation) has been added to existing overly-complicated architectures and just serves to make them more complex and rigid.

Process orchestration is a potential path to taming the complexity by providing a layer above the complicated and disparate legacy and “helper” technologies, loosely binding and coordinating them. Instead of the “spaghetti bot” architecture that results from many RPA implementations, process orchestration allows the bots to be separated from the process orchestration layer. Then, the underlying bots can gradually be replaced with APIs while maintaining the same process layer focused on customer journeys. This is not really new — in fact, I think I wrote a paper or two on exactly this method of continuous improvement rather than a big bang approach.

How is Camunda responding to this new reality? They’ve come from a core vision of process automation through microservice orchestration as a developer tool, to adding out-of-the-box connectors, low-code capabilities, human-facing tooling, and lightweight decision management with DMN. Their next steps are the addition of RPA and IDP (intelligent document processing) to their core stack. Their RPA, in particular, isn’t a competitive independent RPA product, but a built-in capability to integrate legacy applications without having to expose an API. You can still integrate bots from other RPA vendors into Camunda processes, but they are providing a lightweight capability for customers who need a small amount of RPA without having to work with a second product. This is not — or at least should not be — particularly heavy lifting for a capable process orchestration product company, since it’s just process on a different level.

Another big announcement was about Camunda’s move further into the business solution space. Their marketplace has allowed partners to provide templates and solutions for some time, but now Camunda is taking a bigger role in providing solutions themselves. As part of this, they are providing a “process orchestration for SAP” solution. There’s another session on this later today that I plan to attend.

Jakob wrapped up his keynote with an overview of their expansion in AI, which we will be hearing more about over the next two days. This includes chatbot-supported process modeling as well as process orchestration runtime capabilities from lightweight helpers to full agentic AI: the operationalization of AI.

Some exciting announcements, and we’ll see more detail and demos at the technical keynote tomorrow.

This was followed by a keynote from Gartner on their latest move to rebrand the market under a new acronym: BOAT, or Business Orchestration and Automation Technologies.

Not completely surprisingly, Camunda’s new product announcements seem to be aligned with BOAT, and Gartner’s presence here may be an indicator that Camunda is going down the path of chasing the analysts and aiming for a good Magic Quadrant placement.

BPM2023 Day 2: RPA Forum

In the last session of the day, I attended another part of the RPA Forum, with two presentations. 

The first presentation was “Is RPA Causing Process Knowledge Loss? Insights from RPA Experts” (Ishadi Mirispelakotuwa, Rehan Syed, Moe T. Wynn), presented by Moe Wynn. RPA has a lot of measurable benefits – efficiency, compliance, quality – but what about the “dark side” of RPA? Can it make organizations lose knowledge and control over their processes because people have been taken out of the loop? RPA is often quite brittle, and when (not if) it stops working, it’s possible that organizational amnesia has set in: no one remembers how the process works well enough to do it manually. The resulting process knowledge loss (PKL) can have a number of negative organizational impacts.

The study created a conceptual model for RPA-related PKL, and she walked us through the sets of human, organizational and process factors that may contribute. In layman’s terms, use it or lose it.

In my opinion, this is different from back-end or more technical automation (e.g., deploying a BPMS or creating APIs into enterprise system functionality) in that back-end automation is usually fully specified, rigorously coded and tested, and maintained as a part of the organization’s enterprise systems. Conversely, RPA is often created by the business areas directly and can be inherently brittle due to changes in the systems with which it interfaces. If an automated process goes down, there are likely service level agreements in place and IT steps in to get the system back online. If an RPA bot goes down, a person is expected to manually perform the tasks that had been done by the bot, and there is less likely to be a robust SLA for getting the bot fixed and back online. There was an interesting discussion around this in the Q&A, although it was not part of the area of study for the paper as presented.

The second presentation was “A Business Model of Robotic Process Automation” (Helbig & Braun), presented by Eva Katarina Helbig of BurdaSolutions, an internal IT service provider for an international media group. Their work was based on a case study within their own organization, looking at establishing RPA as a driver of digitization and automation within a company based on an iterative, holistic view of business models with the Business Model Canvas as analysis tool.

They interviewed several people across the organization, mostly in operational areas, to develop a more structured model for how to approach, develop and deploy RPA projects, starting with the value proposition and expanding out to identify the customers, resources and key activities.

That’s it for day two of the main BPM2023 conference, and we’re off later to the Spoorwegmuseum for the conference dinner and a tour of the railway museum.

BPM2023 Day 1: RPA Forum

In the afternoon breakouts, I attended the RPA (robotic process automation) forum for three presentations.

The first presentation was “What Are You Gazing At? An Approach to Use Eye-tracking for Robotic Process Automation”, presented by Antonio Martínez-Rojas. RPA typically includes a training agent that captures what and where a human operator is typing based on UI logs, and uses that to create the script of actions that should be executed when that task is automated using the RPA “bot” without the person being involved – a type of process mining but based on UI event logs. In this presentation, we heard about using eye tracking — what the person is looking at and focusing on — during the training phase to understand where they are looking for information. This is especially interesting in less structured environments such as reading a letter or email, where the information may be buried in non-relevant text, and it’s difficult to filter out the relevant information. Unlike the UI event log methods, this can find what the user is focusing on while they are working, which may not be the same things in the screen that they are clicking on – an important distinction.

The second presentation was “Accelerating The Support of Conversational Interfaces For RPAs Through APIs”, presented by Yara Rizk. She presented the problem that many business people could be better supported through easier access to all types of APIs, including unattended RPA bots, and proposed a chatbot interface to APIs. The chatbot’s intents can be extracted by automatically interrogating the OpenAPI specifications, with some optional addition of phrases from people, to create natural language sentences: what is the intent of the action, based on the API endpoint name and description plus sample sentences provided by people. Then the sentences are analyzed and filtered, typically with some involvement from human experts, and used to train the intent recognition models required to drive a chatbot interface.
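A minimal sketch of the first step, deriving candidate training sentences from an OpenAPI spec, might look like this (the spec fragment and phrasing templates are invented for illustration, not taken from the paper):

```python
# Illustrative only: for each OpenAPI operation, build candidate
# natural-language sentences from its operationId and summary. In the
# approach presented, these would then be filtered and reviewed by
# human experts before training intent recognition models.
def sentences_from_openapi(spec):
    intents = {}
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            intent = op.get("operationId", f"{method} {path}")
            candidates = [op["summary"]] if op.get("summary") else []
            # turn a camelCase operationId into a plain phrase
            words = "".join(c if not c.isupper() else " " + c.lower()
                            for c in intent).strip()
            candidates.append(f"I want to {words}")
            intents[intent] = candidates
    return intents

spec = {"paths": {"/orders": {"post": {
    "operationId": "createOrder",
    "summary": "Create a new customer order"}}}}
intents = sentences_from_openapi(spec)
```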

The last presentation in this session was “Migrating from RPA to Backend Automation: An Exploratory Study”, presented by Andre Strothmann. He discussed how RPA robots need to be designed and prioritized so that they can be easily replaceable, with the goal to move to back-end automation as soon as it is available. I’ve written and presented many times about how RPA is a bridging technology, and most of it will go away in the 5-10 year horizon, so I’m pretty happy to see this presented in a more rigorous way than my usual hand-waving. He discussed the analysis of their interview data that resulted in some fundamental design requirements for RPA bots, design guidelines for the processes that orchestrate those bots, and migration considerations when moving from RPA bots to APIs. If you’re developing RPA bots now and understand that they are only a stopgap solution, you should be following this research.

Maximizing success in automation projects: my presentation from CommunityLIVE 2022

Hey, I gave a presentation yesterday, first time in person in almost three years! Here’s the slides, and feel free to contact me if you have questions. I can’t figure out how to get the embed short code on mobile, but when I’m back in the office I’ll give it another try and you may see the slideshow embedded below. Update: found the short code!

Live! From Nashville! It’s CommunityLIVE

It’s been a long 2.5 years since I was last at a conference in person, and I’m kicking off the new era with Hyland’s CommunityLIVE in Nashville. I came in early to attend today’s Executive Forum, where we were welcomed by Stephanie Dedmon, CIO of the state of Tennessee. She gave us a brief view of their IT initiatives, one of which is process automation (specifically RPA). I will be giving a presentation tomorrow about some of the best practices around intelligent automation, and one of those is having process automation right on your strategic initiatives list, as Dedmon tells us is the case with the Tennessee state government.

We had a corporate update from Hyland’s CEO, Bill Priemer. I haven’t been to a Hyland event before — I came to this from my past relationship with Alfresco prior to their acquisition by Hyland — and it’s good to see a more complete briefing including how their recent acquisitions are being handled. He covered some financials and other numbers that I have not included here since I usually just focus on the technology, and I’m not sure if I’m cleared to discuss those outside this venue.

Priemer said that they are “solely focused on content services”, which does not sound all that great for the process side of the former Alfresco product; recall that the absorption of Activiti into Alfresco, which turned it into essentially (just) a content-centric process engine, was controversial, and led to the departure of some of the original Activiti architects and developers. I expect that many Activiti customers/users that were not doing content-centric projects have already migrated to other platforms that came from the same core code base, such as Camunda and Flowable.

Their corporate priorities around product development are focused on developing their next-gen SaaS experience platform, and building a cloud core engine to migrate existing customers. I’m a bit surprised that they’re this far behind the curve on cloud technology, but they have a pretty significant on-premise customer base for their legacy OnBase product. Having acquired Perceptive (2017) and Nuxeo (2021) in addition to Alfresco (2020), they are also still busy digesting those: supporting (and advancing) each of them as separate products, while planning out a product roadmap for convergence. Interestingly, they have committed to their current 80% remote workforce (which used to be 80% in the office), and are likely learning to “eat their own dog food” and therefore coming to a full understanding of what their customers are facing as they move to cloud platforms to support remote work. If nothing else, they could become their own best testbed for cloud.

There was a panel hosted by Ed McQuiston, Chief Commercial Officer (which includes sales, marketing, customer success and a few other things); panels are difficult to capture in a post like this, but there was an interesting bit of the discussion on how automation is becoming paramount: costs are being cut after a couple of years of “drunken sailor” spending just to stay in business, and if you don’t start automating, you’re going to be in trouble. The easy stuff needs to get automated, to leave the hard stuff for the staff remaining after the Great Resignation. In my presentation tomorrow, I’m going to be talking about the “automation imperative”, which expands these ideas a bit more.

I stepped out while they did some roundtable sessions, then returned at the end of the afternoon for the product update with Hyland’s Chief Product Officer, John Phelan. He will be covering some of this same territory in the general keynote tomorrow morning, but I’ve grabbed what I could from this session and can fill in some of the blanks tomorrow. He spoke quite a bit about platform extensibility, allowing many other types of capabilities to plug into Hyland’s content services core. Or rather, cores, since this could be any of their (competing) content services engines. I’m looking forward to hearing more about the roadmap for convergence of the engines; with content engines, this is a tough one because full platform convergence requires a migration pathway — at a reasonable cost — for clients. He showed a slide with different use cases for platform extensibility, being able to plug in RPA, or records management, or intelligent capture, or case management. But not mentioned (obvious to my process-centric ears) was process management, a capability that they now have in the Activiti/Process Services that came with the Alfresco acquisition. Even if they call it workflow, a term that most people in process management feel is a bit too simplistic, it was still missing from his slide. Case management and process management are highly related, but not the same thing, unless you’re going to restrict your process management to case management paradigms in order to have process exist only as an adjunct to content. RPA is, of course, task automation, not process management. I’m seeing a bit of a gap in the strategy, or maybe it’s a terminology issue; I’d like to see a more detailed briefing of the whole platform to gain a better understanding.

Phelan was followed by Hyland’s Chief Innovation Officer, Sam Babic, who gave a bit of a review of Gartner’s definition of hyperautomation (a term that still makes me giggle a bit in spite of having written a paper on the topic recently). Every vendor has their spin on hyperautomation, and Babic spoke about some of the practical aspects of how to implement solutions in a hyperautomation fashion: leveraging multiple leading-edge technologies (IoT, event-driven architecture, AI/ML, RPA, chatbots, etc.) to be able to swiftly create new business solutions. He does include workflow as (I believe) a headless orchestration of triggers that can then instantiate a case, so that’s something, and included the phrase BPM/BPA/Workflow on his product capability word salad slide. Obviously, they have a very content-centric view of the product space, whereas I’m a column 2 kind of girl.

I’ll be presenting tomorrow afternoon in the Business Transformation track — in the least desirable time spot at the end of the day, where I’m contractually obligated to tell the attendees that I’m the only thing standing between them and the bar — on the topic of maximizing success in automation projects. I’ve spent 30+ years building automation software (content and process) and building solutions using that same type of software, so have seen a lot of things go wrong, and some things go right. If you’re here at CommunityLIVE, stop by to hear about my best practices, plus a few anti-patterns to watch out for.

Camunda Platform 7.15: now low-code (-ish)

I had a quick briefing with Daniel Meyer, CTO of Camunda, about today’s release. With this new version 7.15, they are rebranding from Camunda BPM to Camunda Platform (although most customers just refer to the product as “Camunda” since they really bundle everything in one package). This follows the lead of other vendors who have distanced themselves from the BPM (business process management) moniker, in part because what the platforms do is more than just process management, and in part because BPM is starting to be considered an outdated term. We’ve seen the analysts struggle with naming the space, or even defining it in the same way, with terms like “digital process automation”, “hyperautomation” and “digitalization” being bandied about.

An interesting pivot for Camunda in this release is their new support for low-code developers — which they distinguish as having a more technical background than citizen developers — after years of primarily serving the needs of professional technical (“pro-code”) developers. The environment for pro-code developers won’t change, but now it will be possible for more collaboration between low-code and pro-code developers within the platform with a number of new features:

  • Create a catalog of reusable workers (integrations) and RPA bots that can be integrated into process models using templates. This allows pro-code developers to create the reusable components, while low-code developers consume those components by adding them to process models for execution. RPA integration is driving some amount of this need for collaboration, since low-code developers are usually the ones on the front end of RPA initiatives in terms of determining and training bot functionality, but previously may have had more difficulty integrating those into process orchestrations. Camunda is extending their RPA Bridge to add Automation Anywhere integration to their existing UiPath integration, which gives them coverage of a significant portion of the RPA market. I covered a bit of their RPA Bridge architecture and their overall view on RPA in one of my posts from their October 2020 CamundaCon. I expect that we will soon see Blue Prism integration to round out the main commercial RPA products, and possibly an open source alternative to appeal to their community customers.
  • DMN support, including DRD and decision tables, in their Cawemo collaborative modeler. This is a good way to get the citizen developers and business analysts involved in modeling decisions as well as processes.
  • A form builder. Now, I’m pretty sure I’ve heard Jakob Freund claim that they would never do this, but there it is: a graphical form designer for creating a rudimentary UI without writing code. This is just a preliminary release, only supporting text input fields, so isn’t going to win any UI design awards. However, it’s available in the open source and commercial versions as well as accessible as a library in bpmn.io, and will allow a low-code developer to do end-to-end development: create process and decision models, and create reusable “starter” UIs for attaching to start events and user activities. When this form builder gets a bit more robust in the next version, it may be a decent operational prototyping tool, and possibly even make it into production for some simple situations.
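For context, the reusable-worker catalog in the first item builds on Camunda 7’s external task mechanism, where a worker polls the engine over REST for work on a named topic, does the work, and reports completion. A minimal sketch of the two request bodies involved, following the Camunda 7 external task REST API; the base URL, worker ID and topic name are placeholders:

```python
# Sketch of a Camunda 7 external task worker's two REST payloads.
# Endpoints: POST {BASE_URL}/external-task/fetchAndLock and
#            POST {BASE_URL}/external-task/{id}/complete
BASE_URL = "http://localhost:8080/engine-rest"   # placeholder

def fetch_and_lock_payload(worker_id, topic, max_tasks=10, lock_ms=30_000):
    """Ask the engine for up to max_tasks tasks on the given topic,
    locked to this worker for lock_ms milliseconds."""
    return {
        "workerId": worker_id,
        "maxTasks": max_tasks,
        "topics": [{"topicName": topic, "lockDuration": lock_ms}],
    }

def complete_payload(worker_id, variables):
    """Report a task complete, passing result variables back to the
    process instance in the engine's typed-value format."""
    return {
        "workerId": worker_id,
        "variables": {k: {"value": v} for k, v in variables.items()},
    }

body = fetch_and_lock_payload("worker-1", "send-invoice")
done = complete_payload("worker-1", {"invoiceSent": True})
```

The catalog/template feature essentially lets pro-code developers publish workers like this, while low-code developers just drop the matching topic onto a service task in the model.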

They’ve also added some nice enhancements to Optimize, their monitoring and analytics tool, and have bundled it into the core commercial product. Optimize was first released mid-2017 and is now used by about half of their customers. Basically, it pumps the operational data exhaust out of the BPM engine database and into an Elasticsearch environment; with the advent of Optimize 3.0 last year, they could also collect tracking events from other (non-Camunda) systems into the same environment, allowing end-to-end processes to be tracked across multiple systems. The new version of Optimize, now part of Camunda Platform 7.15, adds some new visualizations and filtering for problem identification and tracking.
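As I understand it, the external tracking events that Optimize ingests follow a CloudEvents-style JSON format. As a rough illustration (the correlation attribute name and all values here are my assumptions, and should be checked against the Optimize docs):

```python
# Illustrative only: a CloudEvents-style tracking event from an
# external (non-Camunda) system. The "traceid" correlation attribute
# is an assumption; field values are invented.
import json
from datetime import datetime, timezone

def tracking_event(event_type, source, trace_id, event_id):
    return {
        "specversion": "1.0",                          # CloudEvents core attributes
        "id": event_id,
        "source": source,
        "type": event_type,
        "time": datetime.now(timezone.utc).isoformat(),
        "traceid": trace_id,   # correlates events into one end-to-end instance
    }

evt = tracking_event("order-shipped", "warehouse-system", "order-42", "evt-1")
payload = json.dumps([evt])    # ingestion APIs typically accept batches
```

The key point is the correlation identifier: it is what lets Optimize stitch events from multiple systems into a single end-to-end process view.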

Overall, there are some important things in this release, although it might appear to be just a collection of capabilities that many of the all-in-one low-code platforms have had all along. It’s not really in Camunda’s DNA to become a proprietary all-in-one application development platform like Appian or IBM BPM, or even make low-code a primary target, since they have a robust customer base of technical developers. However, these new capabilities create an important bridge between low-code developers who have a better understanding of the business needs, and pro-code developers with the technical chops to create robust systems. It also provides a base for Camunda customers who want to build their own low-code environment for internal application development: a reasonably common scenario in large companies that just can’t fit their development needs into a proprietary application development platform.

OpenText Enterprise World 2020, Day 1

The last time that I was on a plane was mid-February, when I attended the OpenText analyst summit in Boston. For people even paying attention to the virus that was sweeping through China and spreading to other Asian countries, it seemed like a faraway problem that wasn’t going to impact us. How wrong we were. Eight months later, many businesses have completely changed their products, their markets and their workforce, much of this with the aid of technology that automates processes and supply chains, and enables remote work.

By early April, OpenText had already moved their European regional conference online, and this week, I’m attending the virtual version of their annual OpenText World conference, in a completely different world than in February. Similar to many other vendors that I cover (and have attended virtual conferences for in the past several months), OpenText’s broad portfolio of enterprise automation products has the opportunity to make gains during this time. The conference opened with a keynote from CEO Mark Barrenechea, “Time to Rethink Business”, highlighting that we are undergoing a fundamental technological (and societal) disruption, and small adjustments to how businesses work aren’t going to cut it. Instead of the overused term “new normal”, Barrenechea spoke about “new equilibrium”: how our business models and work methods are achieving a stable state that is fundamentally different than what it was prior to 2020. I’ve presented about a lot of these same issues, but I really like his equilibrium analogy with the idea that the landscape has changed, and our ball has rolled downhill to a new location.

He announced OpenText Cloud Edition (CE) 20.4, which includes five domain-oriented cloud platforms focused on content, business network, experience, security and development. All of these are based on the same basic platform and architecture, allowing them to be updated on a quarterly basis.

  • The Content Cloud provides the single source of truth across the organization (via information federation), enables collaboration, automates processes and provides information governance and security.
  • The Business Network Cloud deals directly with the management and automation of supply chains, which has increased in importance exponentially in these past several months of supply chain disruption. OpenText has used this time to expand the platform in terms of partners, API integrations and other capabilities. Although this is not my usual area of interest, it’s impossible to ignore the role of platforms such as the Business Network Cloud in making end-to-end processes more agile and resilient.
  • The Experience Cloud is their customer communications platform, including omnichannel customer engagement tools and AI-driven insights.
  • The Security and Protection Cloud provides a collection of security-related capabilities, from backup to endpoint protection to digital forensics. This is another product class that has become incredibly important with so many organizations shifting to work from home, since protecting information and transactions is critical regardless of where the worker happens to be working.
  • The Developer Cloud is a new bundling/labelling of their software development (including low-code) tools and APIs, with 32 services across eight groupings including capture, storage, analysis, automation, search, integration, communication and security. The OpenText products that I’ve covered in the past mostly live here: process automation, low-code application development, and case management.

Barrenechea finished with their Voyager program, which appears to be an enthusiastic rebranding of their training programs.

Next up was a prerecorded AppWorks strategy and roadmap with Nic Carter and Nick King from OpenText product management. It was fortunate that this was prerecorded (as much as I feel it decreases the energy of the presentation and doesn’t allow for live Q&A) since the keynote ran overtime, and the AppWorks session could be started when I was ready, which raises the question of why it was “scheduled” to start at a specific time. I do like the fact that OpenText puts the presentation slides in the broadcast platform with the session, so if I miss something it’s easy to skip back a slide or two on my local copy.

Process Suite (based on the Cordys-heritage product) was rolled into the AppWorks branding starting in 2018, and the platform and UI consolidated with the low-code environment between then and now. The sweet spot for their low-code process-centric applications is around case management, such as service requests, although the process engine is capable of supporting a wide range of application styles and developer skill levels.

They walked through a number of developer and end-user feature enhancements in the 20.4 version, then covered new automation features. This includes enhanced content and Brava viewer integration, but more significantly, their RPA service. They’re not creating/acquiring their own RPA tool, or just focusing on one tool, but have created a service that enables connectors to any RPA product. Their first connector is for UiPath and they have more on the roadmap — a rollout very similar to what we saw at CamundaCon and Bizagi Catalyst a few weeks ago. By release 21.2 (mid-2021), they will have an open source RPA connector so that anyone can build a connector to their RPA of choice if it’s not provided directly by OpenText.

There are some AppWorks demos and discussion later, but they’re in the “Demos On Demand” category so I’m not sure if they’re live or “live”.

I checked out the content service keynote with Stephen Ludlow, SVP of product management; there’s a lot of overlap between their content, process, AI and appdev messages, so it’s important to see how they approach it from all directions. His message is that content and process are tightly linked in terms of their business usage (even if on different systems), and business users should be able to see content in the context of business processes. They integrate with and complement a number of mainstream platforms, including Microsoft Office/Teams, SAP, Salesforce and SuccessFactors. They provide digital signature capabilities, allowing an external party to digitally sign a document that is stored in an OpenText content server.

An interesting industry event that was not discussed was the recent acquisition of Alfresco by Hyland. Alfresco bragged about the Documentum customers that they were moving onto Alfresco on AWS, and now OpenText may be trying to reclaim some of that market by offering support services for Alfresco customers and providing an OpenText-branded version of Alfresco Community Edition, unfortunately via a private fork. In the 2019 Forrester Wave for ECM, OpenText takes the lead spot, Microsoft and Hyland are some ways back but still in the leaders category, and Alfresco is right on the border between leaders and strong performers. Clearly, Hyland believes that acquiring Alfresco will allow it to push further up into OpenText’s territory, and OpenText is coming out swinging.

I’m finding it a bit difficult to navigate the agenda: there’s no way to browse the entire agenda by time; instead, you need to know which product category you’re interested in to see what’s coming up in a time-based format. That’s probably fine for customers who only have one or two of their products and would just search in those areas, but for someone like me who is interested in a broader swath of topics, I’m sure that I’m missing some things.

That’s it for me for today, although I may try to tune in later for Poppy Crum’s keynote. I’ll be back tomorrow for Muhi Majzoub’s innovation keynote and a few other sessions.

CamundaCon 2020.2 Day 1

I listened to Camunda CEO Jakob Freund’s opening keynote from the virtual CamundaCon 2020.2 (the October edition), and he really hit it out of the park. I’ve known Jakob a long time and many of our ideas are aligned, and a lot of his keynote resonated with me. He used the phrase “reinvent [your business] or die”, whereas I’ve been using “modernize or perish”, with a focus not just on legacy systems and infrastructure, but also legacy organizational culture. Not to hijack this post with a plug for another company, but I’m doing a keynote at the virtual Bizagi Catalyst next week on aligning intelligent automation with incentives and business outcomes, which looks at issues of legacy organizational culture as well as the technology around automation. Processes are, as he pointed out, the algorithms of an organization: they touch everything and are everywhere (even if you haven’t automated them), and a lot of digital-native companies are successful precisely because they have optimized those algorithms.

Jakob’s advice for achieving reinvention/modernization is to pursue a gradual transformation rather than a big bang approach, which fails more often than it succeeds; he positions Camunda (of course) as the bridge between the worlds of legacy and new technology. In my years of technology consulting on BPM implementations, I have also recommended a gradual approach: build bridges between new and old technology, then swap out the legacy bits as you develop or buy replacements. This is where, for example, you can use RPA to create stop-gap task automation with your existing legacy systems, then gradually replace the underlying legacy systems or at least create APIs to replace the RPA bots.
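That “RPA as stop-gap, APIs as the end state” idea can be sketched as a thin integration interface: the process logic depends only on the interface, so a bot-backed implementation can later be swapped for an API-backed one without touching the process. This is a toy illustration of the pattern; all class and method names here are hypothetical, not from Camunda or any other product.

```python
# Toy sketch of the stop-gap pattern: the process layer depends only on a
# small interface, so a screen-scraping RPA bot can later be replaced by a
# proper API client without changing the process logic.
from abc import ABC, abstractmethod


class LegacySystemClient(ABC):
    """What the process layer needs from the legacy system."""

    @abstractmethod
    def update_customer_address(self, customer_id: str, address: str) -> str: ...


class RpaBotClient(LegacySystemClient):
    """Stop-gap: drives the legacy UI via a bot (simulated here)."""

    def update_customer_address(self, customer_id, address):
        return f"bot updated {customer_id} via screen automation"


class ApiClient(LegacySystemClient):
    """Replacement: calls a real integration endpoint (simulated here)."""

    def update_customer_address(self, customer_id, address):
        return f"api updated {customer_id} via REST call"


def change_of_address_process(client: LegacySystemClient) -> str:
    # The process logic is identical regardless of which client is wired in.
    return client.update_customer_address("C-1001", "123 Main St")


print(change_of_address_process(RpaBotClient()))  # today: bot-backed
print(change_of_address_process(ApiClient()))     # later: API-backed
```

The point is that swapping the bot for an API is then a wiring change, not a process change.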

The second opening keynote was with Marco Einacker and Christoph Anzer of Deutsche Telekom, discussing how they are using process and task automation by combining Camunda for the process layer and RPA at the task layer. They started out using RPA for automating tasks and processes, ending up with more than 3,000 bots and an estimated €93 million in savings. It was a very decentralized approach, with bots initially being created by business areas without IT involvement, but as they scaled up, they started to look for ways to centralize some of the ideas and technology. First was to identify the most important tasks to start with, namely those that were true pain points in the business (Einacker used the phrase “look for the shittiest, most painful process and start there”), not just the easy copy-paste applications. They also looked at how other smart technologies, such as OCR and AI, could be integrated to create completely unattended bots that add significant value.

The decentralized approach resulted in seven different RPA platforms and too much process automation happening in the RPA layer, which increased the amount of technical debt, so they adapted their strategy to consolidate RPA platforms and separate the process layer from the bot layer. In short, they are now using Camunda for process orchestration, and the RPA bots have become tasks that are orchestrated by the process engine. Gradually, they are (or will be) replacing the RPA bots with APIs, which moves the integration from front-end to back-end, making it more robust with less maintenance.
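The separation they describe is essentially Camunda’s external task pattern: the process engine queues work items under named topics, and the RPA bots act as workers that poll a topic, do the work, and report completion. Here is a minimal in-memory toy of that pattern, not the actual Camunda API; all names are illustrative.

```python
# Toy version of the external-task pattern: the engine owns the process
# state and queues tasks by topic; bot workers poll their topic and
# complete tasks, keeping process logic out of the RPA layer.
from collections import deque


class ProcessEngine:
    def __init__(self):
        self.topics = {}      # topic name -> queue of (task_id, payload)
        self.completed = []   # (task_id, result) pairs

    def create_task(self, topic, task_id, payload):
        self.topics.setdefault(topic, deque()).append((task_id, payload))

    def fetch_and_lock(self, topic):
        queue = self.topics.get(topic)
        return queue.popleft() if queue else None

    def complete(self, task_id, result):
        self.completed.append((task_id, result))


def rpa_bot_worker(engine, topic):
    """A bot that drains its topic, simulating legacy-UI automation."""
    while (task := engine.fetch_and_lock(topic)) is not None:
        task_id, payload = task
        engine.complete(task_id, f"bot processed {payload}")


engine = ProcessEngine()
engine.create_task("invoice-entry", "t1", "invoice 42")
engine.create_task("invoice-entry", "t2", "invoice 43")
rpa_bot_worker(engine, "invoice-entry")
print(engine.completed)
```

Because the bot is just another worker on a topic, replacing it with an API-based worker later (as Deutsche Telekom plans) changes nothing in the process model.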

I moved off to the business architecture track for a presentation by Srivatsan Vijayaraghavan of Intuit, where they are using Camunda for three different use cases: their own internal processes, some customer-facing processes for interacting with Intuit, and — most interesting to me — enabling their customers to create their own workflows across different applications. Their QuickBooks customers are primarily small and mid-sized businesses that don’t have the skills to set up their own BPM system (although arguably they could use one of the many low-code process automation platforms to do at least part of this), which opened the opportunity for Intuit to offer a workflow solution based on Camunda but customizable by the individual customer organizations. Invoice approvals was an obvious place to start, since Accounts Payable is a problem area in many companies, then they expanded to other approval types and integration with non-Intuit apps such as e-signature and CRM. Customers can even build their own workflows: a true workflow as a service model, with pre-built templates for common workflows, integration with all Intuit services, and a simplified workflow designer.

Intuit customers don’t interact directly with Camunda services; Camunda is a separately hosted and abstracted service, and they’ve used Kafka messages and external task patterns to create the cut-out layer. They’ve created a wrapper around the modeling tools, so that customers use a simplified workflow designer instead of the BPMN designer to configure the process templates. There is an issue with a proliferation of process definitions as each customer creates their own version of, for example, an invoice approval workflow — he mentioned 70,000 process definitions — and they will likely need to do some sort of automated cleanup as the platform matures. Really interesting use case, and one that could be used by large companies that want their internal customers to be able to create/customize their own workflows.
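The 70,000-definition proliferation problem has a fairly mechanical flavor: every customer who saves a copy of a template gets their own definition, even if they never changed the defaults. One plausible cleanup approach (my sketch, not Intuit’s implementation) is to fingerprint each definition in canonical form and collapse byte-identical copies to one shared version:

```python
# Illustrative sketch: per-customer copies of a workflow template, with a
# content fingerprint used to detect duplicates for automated cleanup.
# The template structure and customer names are entirely hypothetical.
import hashlib
import json

TEMPLATE = {"name": "invoice-approval",
            "steps": ["submit", "approve", "pay"],
            "threshold": 1000}


def customize(overrides):
    """Each customer gets their own copy of the template, possibly tweaked."""
    definition = dict(TEMPLATE)
    definition.update(overrides)
    return definition


def fingerprint(definition):
    """Hash a canonical serialization so identical definitions collide."""
    canonical = json.dumps(definition, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()


# Three customers: two keep the defaults, one raises the approval threshold.
deployed = {
    "acme": customize({}),
    "globex": customize({}),
    "initech": customize({"threshold": 5000}),
}

distinct = {fingerprint(d) for d in deployed.values()}
print(len(deployed), "deployed definitions,", len(distinct), "distinct")
# → 3 deployed definitions, 2 distinct
```

At scale, unmodified copies likely dominate, so deduplicating them against the shared template would shrink the definition count dramatically.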

The next presentation was by Stephen Donovan of Fidelity Investments and James Watson of Doculabs. I worked with Fidelity in 2018-19 to help create the architecture for their digital automation platform (in my other life, I’m a technical architecture/strategy consultant); it appears that they’re not up and running with anything yet, but they have been engaging the business units on thinking about digital transformation and how the features of the new Camunda-based platform can be leveraged when the time comes to migrate applications from their legacy workflow platform. This doesn’t seem to have advanced much since they talked about it at the April CamundaCon, although Donovan had more detailed insights into how they are doing this.

At the April CamundaCon, I watched Patrick Millar’s presentation on using Camunda for blockchain ledger automation, or rather I watched part of it: his internet died partway through and I missed the part about how they are using Camunda, so I’m back to see it now. The RiskStream Collaborative is a not-for-profit consortium collaborating on the use of blockchain in the insurance industry; their parent organization, The Institutes, provides risk management and insurance education and is guided by senior executives from the property and casualty industry. To copy from my original post, RiskStream is creating a distributed network platform, called Canopy, that allows their insurance company members to share data privately and securely, and participate in shared business processes. Whenever you have multiple insurance companies in an insurance process, like a claim for a multi-vehicle accident, having shared business processes — such as first notice of loss and proof of insurance — between the multiple insurers means that claims can be settled quicker and at a much lower cost.

I do a lot of work with insurance companies, as well as with BPM vendors to help them understand insurance operations, and this really resonates: the FNOL (first notice of loss) process for multi-party claims continues to be a problem in almost every company, and using enterprise blockchain to facilitate interactions between the multiple insurers makes a lot of sense. Note that they are not creating or replacing claims systems in any way; rather, they are connecting the multiple insurance companies, who would then integrate Canopy to their internal claims systems such as Guidewire.

Camunda is used in the control framework layer of Canopy to manage the flows within the applications, such as the FNOL application. The control framework is just one slice of the platform: there’s the core distributed ledger layer below that, where the blockchain data is persisted, and an integration layer above it to integrate with insurers’ claims systems as well as the identity and authorization registry.

There was a Gartner keynote, which gave me an opportunity to tidy up the writing and images for the rest of this post, then I tuned back in for Niall Deehan’s session on Camunda Hackdays over on the community tech track, and some of the interesting creations that came out of the recent virtual edition. This drives home the point that Camunda is, at its heart, open source software that relies on a community of developers both within and outside Camunda to extend and enhance the core product. The examples presented here were all done by Camunda employees, although many of them are not part of the development team but come from areas such as customer-facing consulting. These were pretty quick demos so I won’t go into detail, but here are the projects on Github:

If you’re a Camunda customer (open source or commercial) and you like one of these ideas, head on over to the related github page and star it to show your interest.

There was a closing keynote by Capgemini; like the Gartner keynote, I felt that it wasn’t a great fit for the audience, but those are my only real criticisms of the conference so far.

Jakob Freund came back for a conversation with Mary Thengvall to recap the day. If you want to see the recorded videos of the live sessions, head over to the agenda page and click on Watch Now for any session.

There’s a lot of great stuff on the agenda for tomorrow, including CTO Daniel Meyer talking about their new RPA orchestration capabilities, and I’ll be back for that.

IBM acquires WDG Automation RPA

The announcement that IBM was acquiring WDG Automation for their RPA capabilities was weeks ago, but for some reason the analyst briefing was delayed, then delayed again. Today, however, we had a briefing with Mike Gilfix, VP Cloud Integration and Automation Software, Mike Lim, Acquisition Integration Executive, and Tom Ivory, VP IBM Automation Services, on the what, why and how of this. Interestingly, none of the pre-acquisition WDG executives/founders were included on the call.

IBM is positioning this as part of a “unified platform” for integration, but the reality is likely far from that: companies that grow product capabilities through acquisition, like IBM, usually end up with a mixed bag of lightly-integrated products that may not be better for a given use case than a best-of-breed approach from multiple vendors.

The briefing started with the now-familiar pandemic call to action: customer demand is volatile, industries are being disrupted, and remote employees are struggling to get work done. Their broad solution makes sense, in that it is focused on digitizing and automating work, applying AI where possible, and augmenting the workforce with automation and bots. RPA for task automation was their missing piece: IBM already had BPM, AI and automated decisioning, but needed to address task automation. Now, they are offering their Cloud Pak for Automation, which includes all of these intelligent automation-related components.

Mike Lim walked through their reasons for selecting WDG — a relatively unknown Brazilian company — and it appears that the technology is a good fit for IBM because it’s cloud-native, offers multi-channel AI-powered chatbots integrated with RPA, and has a low-code bot builder with 650+ pre-built commands. There will obviously be some work to integrate this with some of the overlapping Watson capabilities, such as the Watson Assistant that offers AI-powered chatbots. WDG also has some good customer cases, with super-fast ROI. It offers unattended and attended bots, OCR (although it stops short of full-on document capture), and operational dashboards. The combination of AI and RPA has become increasingly important in the market, to the point where some vendors and analysts use “intelligent automation” to mean AI and RPA to the exclusion of other types of automation. I’m not arguing that it’s not important, but more that AI and other forms of intelligence need to be integrated across the automation suite, not just with RPA.

IBM is envisioning their new RPA having use cases both in business operations, as you usually see, and also with a strong focus on IT operations, such as semi-automated real-time event incident management. To get there, they have a roadmap to bring the RPA product into the IBM fold to offer IBM RPA as a service, integrate into the Cloud Pak, and roll it out via their GBS professional services arm. Tom Ivory from GBS gave us a view into their Services Essentials for Automation platform that includes a “hosted RPA” bucket: WDG will initially just be added to that block of available tools, although GBS will continue to offer competitive RPA products as part of the platform too.

It’s a bit unusual for IBM GBS and the software group to play together nicely: my history with IBM tends to show otherwise, and Mike Lim even commented on the (implied: unusual) cooperation and collaboration on this particular initiative.

There’s no doubt that RPA will play a strong role in the frantic reworking of business operations that’s going on now within many large organizations to respond to the pandemic crisis. Personally, I don’t think it’s a super long-term growth play: as more applications offer proper APIs and integration points, the need for RPA (which basically integrates with applications that don’t have integration points) will decrease. However, IBM needs to have it in their toolbox to show completeness, even if GBS ends up using their competitors’ RPA products in projects.