Opening up open source development with Bonitasoft 2021.1

I had a briefing with Bonitasoft CEO Miguel Valdes earlier this week to hear more about their 2021.1 release, announced today. Note that they have changed their version numbers to align with the year plus the release number within that year, a versioning scheme that I’ve seen a number of vendors adopt recently.

The most important part of this release, in my opinion, is their shift in what’s open source versus commercial. Like most open source vendors, Bonitasoft is built on an open source core engine and has a number of other open source capabilities, but creates proprietary commercial components that they license to paying customers in a commercial version of the system. In many cases, the purely open source version is great for technical developers who are using their own development tools, e.g., for data modeling or UI development, but it’s a non-starter for business developers since you can’t build and deploy a complete application using just the open source platform. Bonitasoft is shaking up this concept by taking everything to do with development — page/form/layout UI design, data object models, application descriptors, process diagrams, organization models, system connectors, and Git integration — and adding it to the open source version. In other words, instead of having one version of their Studio development environment for open source and another for commercial, there is now a single unified Studio, with everything necessary to develop and deploy process automation applications included in the open source version. Having a unified development environment also makes it easy to move a Bonitasoft application from the open source to the commercial version (and the reverse) without project redevelopment, allowing customers to start with one version and shift to the other as project requirements change.

This is a big deal for semi-technical business developers, who can now use Bonita Studio to create complete applications on the same underlying platform as technical developers. Bonitasoft has removed anything that requires coding from Studio, recognizing that business developers don’t want to write code, and technical developers don’t use the visual development environment anyway. [As a Java developer at one of my enterprise clients once said when presented with a visual development environment, “yeah, it looks nice, but I’ll never use it”.] That doesn’t mean that these two types of developers don’t collaborate, however: technical developers are critical for the development of connectors, for example, which allow an application to connect to an external system. Once connectors are created to link to, for example, a legacy system, the business developers can consume those connectors within their applications. Bonitasoft provides connector archetypes that allow technical developers to create their own connectors, but what is missing is a way to facilitate collaboration between business and technical developers when specifying the functionality of a connector: for example, allowing the business developer to create a placeholder in an application/process and specify the desired behavior, which would then be passed on to the technical developer to build.
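
To give a sense of the technical developer’s side of that collaboration, here is a minimal sketch of a custom connector built on the Bonita engine’s AbstractConnector base class; the connector name, parameters and legacy-system call are my own invented example, not one of Bonitasoft’s archetypes:

```java
import org.bonitasoft.engine.connector.AbstractConnector;
import org.bonitasoft.engine.connector.ConnectorException;
import org.bonitasoft.engine.connector.ConnectorValidationException;

// Hypothetical connector for illustration: the class name and the
// customerId/customerRecord parameters are invented, not a Bonitasoft archetype.
public class LegacyLookupConnector extends AbstractConnector {

    static final String CUSTOMER_ID_INPUT = "customerId";
    static final String CUSTOMER_RECORD_OUTPUT = "customerRecord";

    @Override
    public void validateInputParameters() throws ConnectorValidationException {
        // Fail fast if the process passed a bad input parameter
        Object id = getInputParameter(CUSTOMER_ID_INPUT);
        if (!(id instanceof String) || ((String) id).isEmpty()) {
            throw new ConnectorValidationException("customerId must be a non-empty string");
        }
    }

    @Override
    protected void executeBusinessLogic() throws ConnectorException {
        String id = (String) getInputParameter(CUSTOMER_ID_INPUT);
        // The call to the legacy system would go here (stubbed for illustration)
        String record = "record-for-" + id;
        // Output parameters become available to the business developer's process
        setOutputParameter(CUSTOMER_RECORD_OUTPUT, record);
    }
}
```

Once a connector like this is packaged and imported, the business developer wires its input and output parameters to process variables from within Studio, without touching the Java code.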

Miguel took me through a demo of Bonita Studio using only the open source version, showing their new starting point of MyProcessAutomation that includes sections for Organization, Business Data Model, Applications, Diagrams, and Pages/Forms/Layouts. There are also separate sections for Custom Widgets and Environments: the latter allows separate environments to be defined for development, testing, production, etc., and was previously only in the commercial edition. It’s a very unified look and feel, and the components appear to integrate well: for example, the expression editor allows direct access to business object variables in a way that will be understandable to business developers, and they are looking at ways to simplify this further.

They are moving their UI away from Angular to Web Components, and are even using their own UI tools to create their new user and administrator portals. The previous Bonita Portal is now being replaced by two applications: one is a user portal that provides case and task list functionality, while the other is for administrators to monitor and repair any process instance problems. These two applications can be freely modified by customers to personalize and extend them within their own environment.

There are definitely things remaining in their commercial edition, with a focus on security (single sign-on), scalability (clustering, elasticity), monitoring, and automated continuous deployment. There are a few maintenance tools that are being moved from the open source to the commercial version, and maintenance releases (bug fixes between releases) will be limited to commercial customers. They also have a new subscription model that helps with self-managed elastic deployments (e.g., Amazon Cloud); provides update and migration services for on-premise customers; and includes platform audits for best practices, performance tuning and code review. Along with this are some new packaged professional services offerings: Fast Track for first implementation, on-premise to Bonita Cloud upgrade, and on-premise upgrades for customers not on the new subscription model.

The last thing that Miguel mentioned was an upcoming open source project that they are launching related to … wait, that might not have been for public consumption yet. Suffice to say that Bonitasoft is disrupting the open source process automation space to be more inclusive of non-technical developers, and we’ll be seeing more from them along these lines in the near future.

SAP acquiring Signavio into Business Process Intelligence unit

I first met Signavio CEO Gero Decker in 2008, when he was a researcher at Hasso Plattner Institut and emailed me about promoting their BPMN poster — a push to have BPMN (then version 1.1) recognized as a standard for process modeling. I attended the academic BPM conference in Milan that year but Gero wasn’t able to attend, although his name was on a couple of that year’s modeling-related demo sessions and papers related to Oryx, an open source process modeling project. By the 2009 conference in Ulm we finally met face-to-face, and he told me about what he was working on: process modeling ideas that would eventually evolve into Signavio. By the 2010 BPM conference in Hoboken, he was showing me a Signavio demo, and we ended up running into each other at many other BPM events over the years, as well as having many online briefings as they released new products. The years of hard work that he and his team have put into Signavio have paid off this week with the announcement of Signavio’s impending acquisition by SAP (Signavio press release, SAP press release). There have been rumors floating around for a couple of days, and this morning I had the chance for a quick chat with Gero in advance of the official announcement.

The combination of business process intelligence from SAP and Signavio creates a leading end-to-end business process transformation suite to help our customers achieve the requirements needed to gain a competitive edge.

Luka Mucic, CFO of SAP

SAP is launching RISE with SAP today, with the Signavio acquisition a part of the announcement. RISE with SAP is billed as “business transformation as a service”, providing business process redesign (including Signavio), technical migration (which appears to be a push to get reluctant customers onto their current platform), and building an intelligent enterprise (which is mostly a cloud infrastructure message).

This is a full company acquisition, including all Signavio employees (numbering about 500). Gero and the only other co-founder still at Signavio, CTO Willi Tscheschner, will continue in their roles to drive forward the product vision and implementation, becoming part of SAP’s relatively new Business Process Intelligence unit, which reports directly to the executive board. Since that unit previously contained about 100 people, the Signavio acquisition will swell those ranks considerably, and Gero will co-lead the unit with the existing GM, Rouven Morato. A long-time SAP employee, Morato can no doubt help navigate the sometimes murky organizational waters that might otherwise trip up a newcomer. Morato was also a significant force in SAP’s own internal transformation through analytics and process intelligence, moving them from the dinosaur of old to a (relatively) more nimble and responsive company, and hence understands the importance of products like Signavio’s in transforming large organizations.

Existing Signavio customers probably won’t see much difference right now. Over time, capabilities from SAP will become integrated into the process intelligence suite, such as deeper integration to introspect and analyze SAP S/4 processes. Eventually product names and SKUs will change, but as long as Gero is involved, you can expect the same laser focus on linking customer experience and actions back to processes. The potential customer base for Signavio will broaden considerably, especially as they start to offer dashboards that collect information on processes that include, but are not limited to, the SAP suite. In the past, SAP has been very focused on providing “best practice” processes within their suite; however, if there’s anything that this past year of pandemic-driven disruption has taught us, it’s that those best practices aren’t always best for every organization, and that processes always include things outside of SAP. Having a broader view of end-to-end processes will help organizations in their digital transformations.

Obviously, this is going to have an impact on SAP’s current partnership with Celonis, since SAP Process Mining by Celonis would be directly in competition with Signavio’s Process Intelligence. Of course, Signavio also has a long history with SAP, but their partnership has not been as tightly branded as the Celonis arrangement. Until now. Celonis arguably has a stronger process mining product than Signavio, especially with their launch into task mining, and has a long history of working with SAP customers on their process improvement. There’s always room for partners whose functionality competes somewhat with internal capabilities, but Celonis will need to build a strong case for why an SAP customer should pick them over the Signavio-based, SAP-branded process intelligence offering.

Keep in mind that SAP hasn’t had a great track record of process products that aren’t part of their core suite: remember SAP NetWeaver BPM? Yeah, I didn’t think so. However, Signavio’s products are focused on modeling and analyzing processes, not automating them, so they might have a better chance of being positioned as discovering improvements to processes that are automated in the core suite, as well as giving SAP more visibility into how their customers’ businesses run outside of the SAP suite. There’s definitely great potential here, but also the risk of just becoming buried within SAP — time will tell.

Disclosure: Signavio has been a client of mine within the last year for creating a series of webinars. I was not compensated in any way for writing this post (or anything else on this blog, for that matter), and it represents my own opinions.

Making experience matter by building the right incentives into processes

Last month at the Bizagi virtual conference, I gave a keynote on aligning intelligent process automation with employee incentives and business goals. I decided to expand on those themes a bit for my monthly post on the Trisotech blog. Rather than the usual sort of performance metrics, I suggest the following:

The key to designing metrics and incentives is to figure out the problems that the workers are there to solve, which are often tied in some way to customer satisfaction, then use that to derive performance metrics and employee incentives.

There are a lot of challenges in figuring out how to measure and reward experience and innovative thinking: if it’s done wrong, then companies end up measuring how long you spent with a particular app open on your screen, or how many keystrokes you typed.

We’re going through a lot of process disruption right now, and smart companies are using this opportunity to retool the way that they do things. They also need to be thinking about how their employee incentives are lined up with that redesign, and whether business goals are being served appropriately.

You can check out the whole post over at Trisotech’s blog.

Disclaimer: Trisotech is my client.

(Post image by my talented friend Alison Garwood-Jones).

Happy birthday to BPMN 2.0

Bruce Silver points out that it’s been 10 years since the finalization of BPMN 2.0, the standard notation that we use for modeling (and sometimes executing) business processes. The standard wasn’t published until some time after that, and there have been revisions over the years, but BPMN 2.0 was the start of a wave of standardization in the BPMS market, since it included not just the notation but also a standard serialization file format that made model interchange between products possible. There’s always been some amount of controversy swirling around BPMN: some consider it too difficult for non-technical people to understand, some consider it too restrictive for technical implementations, and more. I believe that a subset of BPMN is a good tool for process modeling by business people, and that it’s a powerful process implementation tool for developers, but I agree that it’s not the only tool you should have in your toolbox.
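
To make the interchange point concrete, here is a minimal sketch using the open source camunda-bpmn-model Java library (my own choice of tooling for illustration; any BPMN 2.0-compliant product can produce or consume the same XML):

```java
import org.camunda.bpm.model.bpmn.Bpmn;
import org.camunda.bpm.model.bpmn.BpmnModelInstance;

public class BpmnInterchangeDemo {
    public static void main(String[] args) {
        // Build a trivial process: start -> user task -> end
        BpmnModelInstance model = Bpmn.createExecutableProcess("invoice-approval")
                .startEvent().name("Invoice received")
                .userTask().name("Approve invoice")
                .endEvent().name("Done")
                .done();

        // Serialize to the standard BPMN 2.0 XML interchange format
        String xml = Bpmn.convertToString(model);
        System.out.println(xml);
    }
}
```

The XML that this prints conforms to the BPMN 2.0 schema, which is exactly what allows a model built in one vendor’s tool to be opened in another’s.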

Bruce’s post takes us back to some basic definitions of a process, and why BPMN is well suited to modeling them. He also covers some of the things that are not great about the standard:

The ability to understand the process behavior in detail simply by inspecting the diagram, unfortunately, was not top of mind in the BPMN 2.0 task force in 2010. They were solely focused on making the diagrams executable on an automation engine. But to most BPMN users today, precise description based on the diagram alone is much more important.

To help with adoption, Bruce developed conventions for using the standard, published in his BPMN Method & Style books and training. Some modeling vendors have even incorporated these conventions into their products, so that you can validate your models against them.

Regardless of whether you love or hate BPMN, it’s impossible to deny the impact that it has had on the BPMS market.

On the folly of becoming the “product expert”

This post by Charity Majors of Honeycomb popped up in my feed today, and really resonated with me, given our somewhat inbred world of process automation. She is talking about the need to move between software development teams in order to keep building skills, even if it means that you move from a “comfortable” position as the project expert to a newbie role:

There is a world of distance between being expert in this system and being an actual expert in your chosen craft. The second is seniority; the first is merely .. familiarity

I see this a lot with people becoming technical experts at a particular vendor product, when it’s really a matter of familiarity with the product rather than a superior skill at application development or even process automation technology. Being dedicated to a single product means that you think about solving problems in the context of that product, not about how process automation problems in general could be solved with a wider variety of technology. Dedication to a single product may make you a better technician but does not make you a senior engineer/architect.

Majors uses a great analogy of escalators: becoming an expert on one project (or product) is like riding one long escalator. When you get to the top, you either plateau out, or move laterally and start another escalator ride from its bottom up to the next level. Applying this to vendor products in our area, this would be like building expertise in IBM BPM for a couple of years, then moving to building Bizagi expertise for a couple of years, then moving to Camunda for a couple of years. At the end of this, you would have an incredibly broad knowledge of how to solve process automation projects on a variety of different platforms, which makes you much more capable of making the type of decisions required at the senior architecture and design level.

This broader knowledge base also reduces risk: if one vendor product falls out of favor in the market, you can shift to others that are already in your portfolio. More importantly, because you already understand how a number of different products work, it’s easier to take on a completely new product. Even if that means starting at the bottom of another escalator.

The State of Process Automation, from Camunda

Camunda has just published a 20-page report on the state of process automation, which is pretty balanced (i.e., not particularly biased to their products). They get right to the point up front:

Process automation has emerged as a linchpin for digital transformation, powering innovation across a company. Process automation is equally sought after to improve an organization’s top line as well as its bottom line – helping to improve customer service, lower costs and drive business growth.

I’m definitely on board with this statement. Companies that are most likely to emerge successfully from the current disruption are taking a hard look at their business processes, and considering how to include more intelligent automation.

The report is based on the results of a survey that they commissioned, which included 400 IT decision makers in the US and Europe. Almost all of those interviewed (97%) agreed that process automation is vital to digital transformation, and I was encouraged that half of the current initiatives are focused on growth rather than just efficiency or firefighting. As I’ve been saying for a while, efficiency and productivity are table stakes: you have to consider those, but you’re not going to get the biggest benefit until you start looking at what intelligent automation can do for top-line growth and customer satisfaction.

The survey included a few questions on the impact of the pandemic, with 80% of respondents saying that they are doing more automation because of remote work and (I assume) fewer workers in some cases. This is not unexpected: 68% reported that key business processes had breakdowns due to remote work, and most companies are working harder on automation initiatives in order to survive the current disruption.

Definitely worth a read.

OpenText Enterprise World 2020, Day 1

The last time that I was on a plane was mid-February, when I attended the OpenText analyst summit in Boston. Even for those of us paying attention to the virus that was sweeping through China and spreading to other Asian countries, it seemed like a faraway problem that wasn’t going to impact us. How wrong we were. Eight months later, many businesses have completely changed their products, their markets and their workforce, much of this with the aid of technology that automates processes and supply chains, and enables remote work.

By early April, OpenText had already moved their European regional conference online, and this week, I’m attending the virtual version of their annual OpenText World conference, in a completely different world than in February. Similar to many other vendors that I cover (and whose virtual conferences I’ve attended over the past several months), OpenText’s broad portfolio of enterprise automation products has the opportunity to make gains during this time. The conference opened with a keynote from CEO Mark Barrenechea, “Time to Rethink Business”, highlighting that we are undergoing a fundamental technological (and societal) disruption, and small adjustments to how businesses work aren’t going to cut it. Instead of the overused term “new normal”, Barrenechea spoke about a “new equilibrium”: how our business models and work methods are settling into a stable state that is fundamentally different from what they were prior to 2020. I’ve presented about a lot of these same issues, but I really like his equilibrium analogy, with the idea that the landscape has changed and our ball has rolled downhill to a new location.

He announced OpenText Cloud Edition (CE) 20.4, which includes five domain-oriented cloud platforms focused on content, business network, experience, security and development. All of these are based on the same basic platform and architecture, allowing them to be updated on a quarterly basis.

  • The Content Cloud provides the single source of truth across the organization (via information federation), enables collaboration, automates processes and provides information governance and security.
  • The Business Network Cloud deals directly with the management and automation of supply chains, which has increased in importance exponentially in these past several months of supply chain disruption. OpenText has used this time to expand the platform in terms of partners, API integrations and other capabilities. Although this is not my usual area of interest, it’s impossible to ignore the role of platforms such as the Business Network Cloud in making end-to-end processes more agile and resilient.
  • The Experience Cloud is their customer communications platform, including omnichannel customer engagement tools and AI-driven insights.
  • The Security and Protection Cloud provides a collection of security-related capabilities, from backup to endpoint protection to digital forensics. This is another product class that has become incredibly important with so many organizations shifting to work from home, since protecting information and transactions is critical regardless of where the worker happens to be working.
  • The Developer Cloud is a new bundling/labelling of their software development (including low-code) tools and APIs, with 32 services across eight groupings including capture, storage, analysis, automation, search, integration, communication and security. The OpenText products that I’ve covered in the past mostly live here: process automation, low-code application development, and case management.

Barrenechea finished with their Voyager program, which appears to be an enthusiastic rebranding of their training programs.

Next up was a prerecorded AppWorks strategy and roadmap session with Nic Carter and Nick King from OpenText product management. It was fortunate that this was prerecorded (as much as I feel it decreases the energy of the presentation and doesn’t allow for live Q&A) since the keynote ran overtime, and the AppWorks session could be started when I was ready, which raises the question of why it was “scheduled” to start at a specific time. I do like the fact that OpenText puts the presentation slides in the broadcast platform with the session, so if I miss something it’s easy to skip back a slide or two on my local copy.

Process Suite (based on the Cordys-heritage product) was rolled into the AppWorks branding starting in 2018, and the platform and UI consolidated with the low-code environment between then and now. The sweet spot for their low-code process-centric applications is around case management, such as service requests, although the process engine is capable of supporting a wide range of application styles and developer skill levels.

They walked through a number of developer and end-user feature enhancements in the 20.4 version, then covered new automation features. These include enhanced content and Brava viewer integration, but more significantly, their RPA service. They’re not creating/acquiring their own RPA tool, or just focusing on one tool, but have created a service that enables connectors to any RPA product. Their first connector is for UiPath and they have more on the roadmap — a very similar rollout to what we saw at CamundaCon and Bizagi Catalyst a few weeks ago. By release 21.2 (mid-2021), they will have an open source RPA connector so that anyone can build a connector to their RPA of choice if it’s not provided directly by OpenText.
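
OpenText hasn’t published the details of this connector service, so purely as a hypothetical sketch (the interface and names below are mine, not OpenText’s API), a vendor-neutral RPA connector layer of this kind usually reduces to a small interface that each RPA product implements:

```java
import java.util.Map;

// Hypothetical illustration of a vendor-neutral RPA connector layer;
// these interfaces and names are invented, not OpenText's published API.
interface RpaConnector {
    // Start a bot on the target RPA platform with the given inputs
    String startJob(String botName, Map<String, Object> inputs);

    // Retrieve the bot's output values once the job completes
    Map<String, Object> getJobResult(String jobId);
}

// The process engine depends only on the interface, so supporting a new
// RPA product means adding one implementation class, not changing processes.
class UiPathConnector implements RpaConnector {
    @Override
    public String startJob(String botName, Map<String, Object> inputs) {
        // A real implementation would call the UiPath Orchestrator REST API (stubbed)
        return "job-123";
    }

    @Override
    public Map<String, Object> getJobResult(String jobId) {
        // A real implementation would poll the job status endpoint (stubbed)
        return Map.of("status", "completed");
    }
}
```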

There are some AppWorks demos and discussion later, but they’re in the “Demos On Demand” category so I’m not sure if they’re live or “live”.

I checked out the content services keynote with Stephen Ludlow, SVP of product management; there’s a lot of overlap between their content, process, AI and appdev messages, so it’s important to see how they approach it from all directions. His message is that content and process are tightly linked in terms of their business usage (even if on different systems), and business users should be able to see content in the context of business processes. They integrate with and complement a number of mainstream platforms, including Microsoft Office/Teams, SAP, Salesforce and SuccessFactors. They provide digital signature capabilities, allowing an external party to digitally sign a document that is stored in an OpenText content server.

An interesting industry event that was not discussed was the recent acquisition of Alfresco by Hyland. Alfresco bragged about the Documentum customers that they were moving onto Alfresco on AWS, and now OpenText may be trying to reclaim some of that market by offering support services for Alfresco customers and providing an OpenText-branded version of Alfresco Community Edition, unfortunately via a private fork. In the 2019 Forrester Wave for ECM, OpenText takes the lead spot, Microsoft and Hyland are some ways back but still in the leaders category, and Alfresco is right on the border between leaders and strong performers. Clearly, Hyland believes that acquiring Alfresco will allow it to push further up into OpenText’s territory, and OpenText is coming out swinging.

I’m finding it a bit difficult to navigate the agenda: there’s no way to browse the entire schedule by time, so you need to know which product category you’re interested in to see what’s coming up in a time-based format. That’s probably best for customers who only have one or two of their products and would just search in those areas, but for someone like me who is interested in a broader swath of topics, I’m sure that I’m missing some things.

That’s it for me for today, although I may try to tune in later for Poppy Crum’s keynote. I’ll be back tomorrow for Muhi Majzoub’s innovation keynote and a few other sessions.

My writing on the Trisotech blog: better analysis and design of processes

I’ve been writing some guest posts over on the Trisotech blog, but haven’t mentioned them here for a while. Here’s a recap of what I’ve posted over there the past few months:

In May, I wrote about designing loosely-coupled processes to reduce fragility. I had previously written about Conway’s Law and the problems of functional silos within an organization, but then the pandemic disruption hit and I wanted to highlight how we can avoid the sorts of cascading supply chain process failures that we saw early on. A big part of this is not having tightly-coupled end-to-end processes, but rather separating out the different parts of the process so that they can be changed and scaled independently of each other while still forming part of an overall process.

In July, I helped to organize the DecisionCAMP conference, and wrote about the BPMN-CMMN-DMN “triple crown”: not just the mechanics of how the three standards work together, but why you would choose one over the other in a specific design situation. There are some particular challenges with the skill sets of business analysts who are expected to model organizations using these standards, since they will end up using more of the one that they’re most familiar with regardless of its suitability to the task at hand. There are also challenges for the understandability of multi-model representations, which require a business operations reader of the models to see how this BPMN diagram, that CMMN model and this other DMN definition all fit together.

In August, I focused on better process analysis using analytical techniques, namely process mining, and gave a quick intro to process mining for those who haven’t seen it in action. For several months now, we haven’t been able to do a lot of business “as is” analysis through job shadowing and interviews, and I put forward the idea that this is the time for business analysts to start learning about process mining as another tool in their kit of analysis techniques.

In early September, I wrote about another problem that can arise due to the current trend towards distributed (work from home) processes: business email compromise fraud, and how to foil it with better processes. I don’t usually write about cybersecurity topics, but I have my own in-home specialist, and this topic overlapped nicely with my process focus and the need for different types of compliance checks to be built in.

Then, at the end of September, I finished up the latest run of posts with one about the process mining research that I had seen at the (virtual) academic BPM 2020 conference: mining processes out of unstructured emails, and queue mining to see the impact of queue congestion on processes.

Recently, I gave a keynote on aligning intelligent automation with incentives and business outcomes at the Bizagi Catalyst virtual conference, and I’ve been putting together some more detailed thoughts on that topic for this month’s post. Stay tuned.

Disclosure: Trisotech is my customer, and I am compensated for writing posts for publication on their site. However, they have no editorial control or input into the topics that I wrote about, and no input into what I write here on my own blog.

Closing comments from Bizagi Catalyst 2020

By the time we got to day 3 of the virtual Bizagi Catalyst 2020, Bizagi had given up on their event streaming platform and just published all of the pre-recorded presentations for on-demand viewing. We were supposed to do a live wrap-up at the end of the day with Rob Koplowitz of Forrester Research, Bizagi CEO Gustavo Gómez and myself, moderated by Bizagi’s Senior Director of Product Marketing Rachel Brennan, so we went ahead and recorded that yesterday. It’s now up on the on-demand page, check it out:

This was my first time speaking at — or attending! — Bizagi Catalyst, and I’m looking forward to more of them in the future. Hopefully somewhere more exciting than my own home office.

Bizagi Catalyst 2020 Day 2 keynotes and hackathon

With a quick nod to Ada Lovelace Day (which was yesterday) and women in technology, Bizagi’s Catalyst virtual conference kicked off today with a presentation by Rachel Brennan, Senior Director of Product Marketing, and Marlando Rhule, Professional Services Director, on some of the new industry accelerators available from Bizagi. The first of these was an onboarding and KYC (know your client) process, including verification and risk assessment of both business and individual clients. The second was a permit lifecycle management process, specifically for building (and related) permits for municipal and state governments; it orchestrates communications between multiple applications for zoning and inspections, gathers information and approvals, generates letters and permits, and drives the overall process.

Coming soon, they will be releasing the APQC Process Classification Framework for Bizagi Modeler: the APQC frameworks are a good source of pre-built processes for specific industries as well as cross-industry frameworks.

Rachel also announced the Bizagi hackathon, which runs from October 19 to November 14. From the hackathon website:

It can be a new and innovative Widget to improve Bizagi’s forms, a new connector that extends Bizagi’s capabilities to connect with external systems or an experience-centric process using Bizagi Sites and Stakeholder concepts.

As with yesterday, the platform was pretty unstable, but eventually the second session started, with Luigi Mule of Blue Prism and their customer Royston Clark from Old Mutual, a financial services group in African and Asian markets. As I mentioned yesterday (and last week at another virtual conference), BPM vendors and RPA vendors are finally learning how to cooperate rather than position themselves as competitors: BPM orchestrates processes, and invokes RPA bots to perform tasks as steps in the process. Eventually, many of the bots will be replaced with proper APIs, but in the meantime, bots provide value by integrating with legacy systems that don’t have exposed APIs.

At Old Mutual, they have 170 bots, 70% of which are integrated in a Bizagi process. Since they started with Blue Prism, they have automated the equivalent of eight million minutes of worker time: effectively, they have given that amount of time back to the business for more value-added activities. The combination of Bizagi and Blue Prism has given them a huge increase in agility, able to change and automate processes in a very short time frame.

Next up was supposed to be my keynote on aligning intelligent automation with incentives and business outcomes, but the broadcast platform failed quite spectacularly and Bizagi had to cancel the remainder of the day (I’d like to imagine that I’m so popular that I broke the internet). That means we also missed the planned live Q&A after my presentation, but feel free to ask questions in the comments here, or on Twitter. You can see my slides below, and the keynote is recorded and available for replay.

I’ve been writing and presenting about aligning incentives with business processes for a long time, since I recognized that more collaborative and ad hoc processes needed to have vastly different metrics than our old-school productivity widget-counting. This was a good opportunity to revisit and update some of those ideas through the pandemic lens, since worker metrics and incentives have shifted quite a bit with work from home scenarios.

Assuming they have the event platform back online tomorrow, I’ll be back for a few of the sessions, and to catch up on some of today’s sessions that I missed.