Camunda Cloud beta goes public

It’s definitely webinar season! I’ve seen a lot of webinar invitations pass by recently, and I’ll be speaking on a couple in the coming weeks. Today, I listened in on a webinar about the Camunda Cloud public beta, with Daniel Meyer (Camunda CTO) discussing their drivers for creating it, and Immanuel Monma providing a demo. I heard about the Camunda Cloud at CamundaCon last September, and it’s good to see that they’re launching it so soon.

Meyer spoke about using cloud-based process automation for modernizing legacy infrastructure, and the requirements that they had for re-inventing process automation for the cloud:

  • Externalize processes from business applications (this isn’t really new, since it’s been a driver for BPM systems all along).
  • Maximize developer productivity by allowing them to work within their programming language of choice.
  • Support hybrid orchestration with both cloud and on-premise applications, and across multiple public cloud platforms.
  • Native BPMN execution.
  • Cloud scalability and resilience.

This is where their Zeebe workflow engine comes in, which is at the core of Camunda Cloud. By supporting hybrid orchestration, Camunda Cloud allows for a gradual migration of legacy on-premise IT by first externalizing the processes, then migrating some of the legacy functionality to cloud-based microservices while still supporting direct integration with the legacy IT, then gradually (if possible) migrating all of the legacy functionality to the cloud. This provides the advantages of both a microservices architecture for modularity and scalability, and process orchestration to knit things together for loose coupling with end-to-end visibility.

The live demo showed the interaction between Zeebe and Operate, the two main execution components of Camunda Cloud, plus Cawemo for collaborative modeling of the processes (although the process could have just been modeled in the Zeebe modeler). Monma walked us through how to create, deploy and execute a simple BPMN process in Camunda Cloud; watching the webinar replay would be a great place to start if you want to play around with the beta. Note that aside from creating the BPMN model in Cawemo, which may involve business people, this is a technical developer toolset for service orchestration and automated processes at this point. You can plug into their Zeebe Slack community or forum to interact with other developers who are trying things out.
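The create/deploy/execute cycle that Monma demonstrated relies on Zeebe's job-worker pattern: the engine walks the BPMN model and hands each service task out as a job to whichever worker has registered for that task type. Here's a toy in-memory Python sketch of that pattern; all of the names are invented for illustration, and this is not the actual Zeebe client API (which talks to a remote broker over gRPC):

```python
# Toy in-memory sketch of the job-worker pattern behind Zeebe:
# a client "deploys" a process (here, just an ordered list of service task
# types), workers register handlers per task type, and the engine activates
# and completes jobs. Hypothetical names only -- not the real Zeebe API.

class ToyEngine:
    def __init__(self):
        self.processes = {}   # process id -> ordered list of task types
        self.workers = {}     # task type -> handler function

    def deploy(self, process_id, task_types):
        self.processes[process_id] = task_types

    def register_worker(self, task_type, handler):
        self.workers[task_type] = handler

    def create_instance(self, process_id, variables):
        # Walk the process sequentially, handing each job to its worker;
        # the real engine does this asynchronously across a cluster.
        for task_type in self.processes[process_id]:
            variables = self.workers[task_type](variables)
        return variables

engine = ToyEngine()
engine.deploy("order-process", ["collect-payment", "ship-order"])
engine.register_worker("collect-payment",
                       lambda v: {**v, "paid": True})
engine.register_worker("ship-order",
                       lambda v: {**v, "shipped": v.get("paid", False)})

result = engine.create_instance("order-process", {"orderId": 42})
print(result)  # {'orderId': 42, 'paid': True, 'shipped': True}
```

The key design point this illustrates is the decoupling: workers can be written in any language, since they only subscribe to a task type, which is what makes the "work in your programming language of choice" requirement achievable.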

Future Camunda Cloud components

Meyer returned with the product roadmap, then handled questions from attendees. Right now, Camunda Cloud is a free public beta with some limitations; they will be launching the GA version shortly (he said “hopefully within the next month”) that will allow better control over clusters plus offer SLA-based technical support. They are also adding human workflow with a tasklist, providing both an API and a simple out-of-the-box UI, which will also push the addition of the human task type into Zeebe’s BPMN coverage. They will be adding analytics via a cloud version of Optimize. The Camunda components run in their own cloud, currently on Google Cloud with an automated Kubernetes infrastructure; in the future, they will expand this to multiple geographic regions to better support applications in different regions. They may consider running on other cloud platforms, although since this is hidden from Camunda Cloud customers, it may not be necessary. He also fielded a number of other good questions on hybrid orchestration, the use of RPA, and how the underlying event-streaming distributed architecture of Zeebe provides vastly greater scalability than most BPM systems.

You’ll be able to see the webinar replay (typically without registration) on the webinar information page as soon as they publish it.

Focus on Banking Processes: Improve Revenue, Costs and Compliance – my upcoming webinar with @Signavio

I’ll be presenting on two webinars sponsored by Signavio in the upcoming weeks, starting with one on banking processes on February 25 at 1pm ET. In this first webinar, I’ll be taking a look not just at the operational improvements, but at the (executive) management-level concerns of improving revenue, controlling costs and maintaining compliance. From the webinar description:

Today’s retail banks face more challenges than ever before: in addition to competing with each other, they are competing with fintech startups that provide alternatives to traditional banking products and methods. The concerns in the executive suite continue to focus on revenue, costs and compliance, but those top-level goals are more than just numbers. Revenue is tied closely to customer satisfaction and wallet share, with today’s customers expecting personalized banking products and modern omnichannel experiences.

You can sign up for the webinar here. This will be a concise 35 minutes plus Q&A, and I’ll include some use case examples from client onboarding and KYC in retail banking.

ARIS Elements: the cloud “starter edition” for process design

I decided not to get up at 4am (Eastern time) earlier this week to watch the ARIS Elements launch webinar presented by ARIS senior product manager Tom Thaler, but Software AG has published it here — no registration required — and I took a quick look at it as well as checking out the ARIS Elements website, which is already live.

Creating a model in ARIS Elements, showing the seven supported model types

As seen in the webinar, model creation allows you to create seven different types of models: process landscape, BPMN process, event-driven process (EPC), organizational chart, system diagram, data model, and structuring model. It does not include DMN or CMMN models; DMN is in ARIS Advanced and Enterprise editions.

Thaler demonstrated creating a BPMN model, which is similar to many of the other cloud-based modelers, although the extent of their BPMN coverage isn’t clear (for some of the more esoteric event types, for example). What they do provide that is unique, however, is analysis-focused information for different steps such as RACI responsibility assignments that link directly to an organizational chart. BPMN models are checked for validity, even though these are probably not expected to be directly-executable models. Once a model is created, it can be previewed and then published (unless the database has been set for auto-publication). In addition to the visual model, the preview/published versions of BPMN models show a summary tabular view of the process steps, with roles, input, output and IT systems for each. The RACI chart is also generated from the values entered in the process model.
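Deriving a RACI chart from attributes captured on the model steps is conceptually straightforward: each step carries its responsibility assignments, and the chart is just a pivot of steps against roles. A minimal Python sketch of the idea, with invented step/role data — this illustrates the concept, not ARIS Elements’ actual implementation:

```python
# Sketch of deriving a RACI chart from per-step responsibility attributes
# on a process model. Invented data for illustration; not ARIS's code.

steps = [
    {"step": "Review application", "R": "Clerk",   "A": "Manager"},
    {"step": "Approve credit",     "R": "Manager", "A": "Manager", "C": "Risk officer"},
    {"step": "Notify customer",    "R": "Clerk",   "A": "Manager", "I": "Sales"},
]

# Collect every role mentioned in any step (ignoring the step name itself).
roles = sorted({role for s in steps for k, role in s.items() if k != "step"})

# Build one chart row per step: role -> its RACI letter(s) for that step.
raci = {}
for s in steps:
    row = {role: "" for role in roles}
    for letter in "RACI":
        if letter in s:
            row[s[letter]] += letter
    raci[s["step"]] = row

for step, row in raci.items():
    print(step, row)
```

The same pivot produces the summary tabular view: once responsibilities live as structured attributes on the model rather than in a separate spreadsheet, both artifacts stay in sync with the process automatically.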

A process landscape/map model can be created to show groups of processes (in the demo, the top level groups were management, core and supporting processes); these can in turn be nested groups of processes for more complex areas such as HR.

A user can set specific models as favorites, which will then appear on their home page for easy access. There is a hierarchical view of the repository by model type.

There are fairly standard user management features to add new users and assign permissions, although this edition does not provide single sign-on.

There are a number of video tutorials available to show how to create different model types and manage settings, and a free trial if you want to get started quickly.

There were a number of good questions in the Q&A (starting at around 38 minutes into the webinar) that exposed some of the other features and limitations of ARIS Elements. Many of these were obviously from current ARIS users looking to see if Elements fits into their plans:

  • Commenting by viewers is not supported
  • BPMN models can be imported
  • There is only one database (multiple databases to separate business units is a feature in ARIS Advanced/Enterprise)
  • Upgrading to a more expensive version would allow all models that were already created to be migrated
  • There is no automation of model review cycles (or any other workflow control), such as having a model reviewed by one or more others before publication; this would have to be done manually
  • There is no document storage (supporting documents can be stored directly in ARIS Advanced/Enterprise)
  • There is no process comparison (available in higher level versions)
  • Migrating from an ARIS on-premise edition to Elements could result in data loss since not all of the model types and features are supported, and is not recommended
  • There are a small number of pre-defined reports available for immediate use, but no report customization

If you look at the pricing page, which also shows a feature comparison chart, you’ll see that ARIS Elements is considered the low-end edition of their cloud process modeling product suite. It’s fairly limited (up to 20 users, one database, other limitations) and is priced at €100 (about US$110) per designer user per month and €50 per 10 viewer users; that seems somewhat high, but they offer a broader range of model types than competitive process modeling tools, and include a shared repository for collaborative designing and viewing.

ARIS Elements is being positioned in an interesting space: it’s more than just process modeling, but less than the more complete enterprise architecture modeling that you’ll find in ARIS Advanced/Enterprise and competitive EA modeling products. It’s being targeted at “beginners”, although arguably beginners would not be creating a lot of these model types (though they might be viewing them). Possibly they’ve had feedback that the Advanced version is just a bit too complex for many situations, and they are attempting to hit the part of the market that doesn’t need full capabilities; or they are offering Elements as a starting point with the goal of migrating many of these customers onto the Advanced/Enterprise editions as soon as they run up against the limitations.

APQC webinar: 2020 process and performance management priorities, with @hlykehogland

I listened in on a webinar today with APQC’s Holly Lyke-Ho-Gland looking at the results of their 2020 process and performance management priorities survey (conducted in late 2019). Some good insights here, looking at the top three challenges in business process management and continuous improvement. Process modeling and mining vendors will be happy to see that the highest priority challenge in BPM is defining and mapping end-to-end processes.

She covered a number of tips and solutions to address these challenges, from developing end-to-end processes and building a culture of continuous improvement to governance alignment. She included a lot of great case studies and examples across all of these areas, and discussed what types of resources and expertise are required to achieve them.

After covering the business process management and continuous improvement side, she moved on to discuss the organizational performance management challenges and solutions. Performance management is more about analytics and metrics, and using those measures to support decision making; apparently this year’s challenges are the same as last year’s, meaning that organizations are still struggling with these issues.

Some interesting points here about change management plans and what needs to be done in order to be successful in performance management; check out the webinar replay for details.

The last part of the webinar was on their “special interest” section, which this year is process management. The first point was on the purpose of process teams and work, the most important of which is supporting strategic initiatives. This is definitely what I see in my own consulting practice, with process gaining a much higher profile as companies focus on digital transformation efforts: at their heart, many transformation efforts are process-centric. The APQC research also showed information on measuring process work, and she notes (as I often see) that the top measures are still focused on bottom-line savings rather than more strategic measures, meaning that process metrics are misaligned with strategic focus. She also covered the impact of technology on process work: not just process automation, but collaboration, data management and visualization, and cloud computing topped the technology list, since they are looking at the entire process management lifecycle. She made a direct call-out to process mining (although it wasn’t in the top five list) as a cross-over between data analysis and process modeling; I’m definitely in agreement with that as you can see from my post earlier this week.

She finished with a summary of the survey results, plus a peek at their research agenda for 2020 with lots of interesting and timely topics. I like that their research uses a lot of real-world case studies.

I couldn’t find a direct link to the webinar replay yet, but it will likely be available on APQC’s On-Demand Webinars page soon; definitely worth checking out for Lyke-Ho-Gland’s insights and discussion. While you’re over there, check out their Process and Performance Management Conference, coming up in October. I spoke at their conference back in 2013 and really enjoyed the experience: good sessions, and a smaller conference is great for networking.

Process is cool (again), and the coolest kid on the block is process mining

I first saw process mining software in 2008, when Fujitsu was showing off their process discovery software/services package, plus an interesting presentation by Anne Rozinat from that year’s academic BPM conference where she tied in concepts of process mining and simulation without really using the term process mining or discovery. Rozinat went on to form Fluxicon, which developed one of the earliest process mining products and really opened up the market, and she spent time with me providing my early process mining education. Fast forward 10+ years, and process mining is finally a hot topic: I’m seeing it from a few mining-only companies (Celonis), and as a part of a suite from process modeling companies (Signavio) or even a larger process automation suite (Software AG). Eindhoven University of Technology, arguably the birthplace of process mining, even offers a free process mining course which is quite comprehensive and covers usage as well as many of the underlying algorithms — I did the course and found it offered some great insights and a few challenges.

Today, Celonis hosted a webinar, featuring Rob Koplowitz of Forrester in conversation with Celonis’ CMO Anthony Deighton, on the role of process improvement in improving digital operations. Koplowitz started with some results from a Forrester survey showing that digital transformation is now the primary driver of process improvement initiatives, and the importance of process mining in that transformation. Process mining continues its traditional role in process discovery and conformance checking but also has a role in process enhancement and guidance. Lucky for those of us who focus on process, process is now cool (again).

Unlike just examining analytics for the part of a process that is automated in a BPMS, process mining allows for capturing information from any system and tracing the entire customer journey, across multiple systems and forms of interaction. Process discovery using a process mining tool (like Celonis) lets you take all of that data and create consolidated process models, highlighting the problem areas such as wait states and rework. It’s also a great way to find compliance problems, since you’re looking at how the processes actually work rather than how they were designed to work.
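The core discovery step — turning raw event data from multiple systems into a consolidated process model — can be illustrated with a directly-follows graph, the simplest building block behind most process mining algorithms. A toy Python sketch with invented log data (this is the general concept, not Celonis’s actual algorithm):

```python
from collections import Counter, defaultdict

# Toy event log: (case id, activity), already ordered by timestamp within
# each case. A directly-follows graph counts how often activity A is
# immediately followed by activity B across all cases -- the raw material
# for process discovery. Invented data for illustration.

event_log = [
    ("c1", "Receive order"), ("c1", "Check credit"), ("c1", "Ship"),
    ("c2", "Receive order"), ("c2", "Check credit"),
    ("c2", "Check credit"),  # rework: the same step repeated
    ("c2", "Ship"),
    ("c3", "Receive order"), ("c3", "Ship"),  # credit check skipped
]

# Group events into per-case traces.
traces = defaultdict(list)
for case, activity in event_log:
    traces[case].append(activity)

# Count directly-follows pairs across all traces.
dfg = Counter()
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1

# Self-loops flag rework; unexpected edges (like skipping the credit
# check) flag conformance problems against the designed process.
rework = {edge for edge in dfg if edge[0] == edge[1]}
print(dict(dfg))
print(rework)
```

Even this trivial example surfaces the two things the webinar emphasized: the rework loop on the credit check, and the non-conforming path where a case shipped without a credit check at all — things you’d never see by only instrumenting the automated portion of the process.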

Koplowitz had some interesting insights and advice in his presentation, not the least of which was to engage business experts to drive change and automation, not just technologists, and use process analytics (including process mining) as a guide to where problems lie and what should/could be automated. He showed how process mining fits into the bigger scope of process improvement, contributing to the discovery and analysis stages that are a necessary precursor to reengineering and automation.

Good discussion on the webinar, and there will probably be a replay available if you head to the landing page.

Building Scalable Business Automation with Microservices – a paper I created for @Camunda

Last year, I did a few presentations for Camunda: a keynote at their main conference in Berlin, a webinar together with CEO Jakob Freund, and a presentation at their Camunda Day in Toronto, all on the similar theme of building a scalable digital automation platform using microservices and a BPMS.

I wrapped up my ideas for those presentations into a paper, which you can download from the Camunda website. Enjoy!

Blast from the past: what I was writing about on Column 2 in 2010

After I posted earlier this week about top 10 blog posts for 2019, I decided to take a look back at my archives and see what was happening 10 years ago. We were just starting to crawl out of a recession, and interesting things were happening in the industry.

Acquisitions! IBM had announced the acquisition of Lombardi just before Christmas 2009, and closed the deal in January 2010. Also that month, Progress Software acquired Savvion. Later that year, I ranted briefly about how when vendors acquire multiple overlapping products, it’s not good for the customers.

Standards! BPMN 2.0 neared release, and started to gain traction with many vendors. There was a fiery online debate about the use of BPMN by business people later in the year.

Conferences! The academic BPM conference came to North America for the first time, landing at the Stevens Institute in New Jersey. I spoke at the Software 2010 conference in Oslo on BPM and Enterprise 2.0 (what we would now call social BPM), a topic I’d been covering since 2006 and was “discovered” by the large analyst firms around 2010. I went to a lot of vendor conferences that year, and blogged about them while there.

Cloud! Faced with a growing number of vendors offering cloud BPM products, I climbed up on my usual soapbox about how geography does matter when it comes to cloud, at least US versus non-US hosting locations. It took some vendors a long time (and a few EU regulations) to realize this.

Case management! This was definitely the year that case management started to hit the BPM vendors’ radar, with many of them adding or acquiring capabilities to handle less-structured processes. I did a webinar with Keith Swenson on the topic of agile and social BPM: while Keith and I don’t always agree, we always have interesting conversations.

New products! The creators of jBPM moved over to Alfresco and started the Activiti project. The reverberations of this are still felt today, with both of those creators having moved on, and at least two notable forks of Activiti currently available.

Column 2 turned five that year, which means that this year will be fifteen years that I’ve been blogging.

The top Column 2 posts of 2019

A couple of weeks ago, I wrote about my post that has had the most visits over all time, a 2007 post on policies, procedures, processes and rules.

Here’s what was the most popular in 2019:

  1. That same 2007 post, Policies, procedures, processes and rules. Obviously, this theme strikes a chord with a broad range of people, and a recent comment on that post was from someone who had used it as a source in developing definitions for the PMI’s Project Management Body of Knowledge (PMBOK).
  2. An even older post from 2005, Adaptive approaches. This is about application development and deployment methodologies, and would now be called “Agile”. I also talk about my usual method of “get something simpler into production sooner”, which would now be called “minimum viable product”. The only thing that has really changed here is terminology.
  3. The first product-related post, my 2012 post Introduction to AWD 10. I was at the user conference for DST Systems, now part of SS&C Technologies, and wrote about what I saw at a session where they presented the new product version.
  4. Another terminology post, this one from 2017: What’s in a name? BPM and DPA. This was prompted by Forrester’s move to relabel business process management (BPM) systems as Digital Process Automation (DPA), and the ongoing confusion in terminology. This problem continues today, with Gartner sticking to the term iBPMS (Intelligent BPM Suite) but shifting it to mean low-code application development platform.
  5. The first that was originally published in 2019, Snowed in at the OpenText Analyst Summit 2019. In the midst of a massive snowstorm (I arrived on one of the last flights before the airport shut down), I attended OpenText’s analyst meetup in Boston, and this post was on the main keynote featuring CEO Mark Barrenechea’s vision for their future product direction.
  6. From the 2019 bpmNEXT conference, bpmNEXT 2019 demos: microservices, robots and intentional processes with Bonitasoft, Signavio and Flowable. My conference live-blogging is usually popular with those who can’t make it to the conference themselves, but this post was likely read more than most because it covered the Flowable chatbot + CMMN demo that went on to win the “Best in Show” award.
  7. Another throwback to 2005, Shallow vs. Deep Knowledge. I was writing in response to a post from EDS that said that they believe that someone working on a business application based on vendor components really had to see the vendor’s source code to do this right. I disagreed.
  8. A post on service-oriented architecture standards from 2009, The Open Group’s Service Integration Maturity Model and SOA Governance Framework. This was the result of a briefing that I had with them in advance of the standards’ release; to be honest, I’ve never used these frameworks and have no idea how broadly they were adopted.
  9. A post from this year’s academic BPM conference in Vienna, Day 1 @BPMConf opening keynote: Kalle Lyytinen on the role of BPM in design and business model innovation. The keynote discussed the concept of digital intensity, namely, the degree to which digitalization is required to perform a task, and how technology is changing the way that we do things on a micro level.
  10. Another post from the academic BPM conference, Workshop at @BPMConf on BPM in the era of Digital Innovation and Transformation. This workshop day preceded the keynote mentioned above, and covered a number of talks on digital transformation. This is the only one of the top 10 posts for 2019 that covers a presentation that I made, since I was invited to give a short talk at the end of the workshop.

I blogged quite a bit less in 2019 (in fact, my blogging has been a bit slow the past couple of years) although I had a lot of activity around conferences and a few product briefings. I’ve been fairly active on Twitter, and I’m looking at ways to bring together some of the links that I post there onto the blog for more discussion.

Looking forward to 2020!

Policies, procedures, processes and rules (redux)

For some reason, the most popular post of all time on this blog is from 2007, on policies, procedures, process and rules: definitions, differences, and how processes and decisions intersect both in modeling/documentation and implementation. If I look at the past year, quarter, month or week, it’s still the most viewed post in those time periods, even though it’s over 12 years old. Interesting to see what is still relevant after all this time, since blog posts typically have a half-life of only a couple of days.

A few days ago, someone named Jason Gorman (who did not note his affiliation) added a lengthy comment to the post, describing how he is reviewing the 6th edition of the PMBOK Guide and has documented its definitions of policy, process, procedure and rule based on various sources, including my original blog post. Glad I could be of service!

Conway’s Law and end-to-end processes – my new post on the @Trisotech blog

I’ve written another post for the Trisotech blog on end-to-end processes, and how Conway’s Law can prevent good horizontal integration across your organization’s processes. Conway’s Law is an adage that states that organizations design systems that mirror their own communication structure: basically, if your interdepartmental communication is not that great, the underlying systems are unlikely to do a good job of it either.

I go into some detail on the problems that occur when an organization is functionally siloed with poor communication between areas, and how you can start to find your way out of it.

You can find all of the posts that I’m creating for them here.