13 years of reporting on @BPTrends BPM annual reports: from vendor reviews to state of the BPM market

The very first post that I wrote on this blog was in March 2005, and it covered BPTrends’ 2005 BPM Suites report. I think that this was the first year that they published a BPM annual report, although I can’t even find a link to it in their web archive and I don’t have it in my own archive. However, at the time, I listed the 13 vendors that were included (about half of which still exist in some form) and noted that it was a “pay for play” report that required vendors to pay $5,000 to participate. I don’t think that BPTrends does vendor reviews any more – the last that I have a record of was in 2007 – but they do provide a free, if somewhat out-of-date, page of vendor info and links.

By 2006, BPTrends was conducting surveys of practitioners, consultants and others involved in BPM, and had published their first State of Business Process Management report based on the survey results, with new editions following every two years (2008, 2010, 2012, 2014, 2016). They’ve now published the 2018 State of Business Process Management report, sponsored by Red Hat and Signavio, who get top billing on the front page of the report but presumably no editorial control or special treatment, since this is not a product review but a state of the industry/market report based on survey results. Since BPTrends has been creating and analyzing BPM surveys since 2005, they have a good view of how the market is evolving.

My guest post on the @Alfresco blog: BPM Cloud Architectures and Microservices

The second of the short articles that I wrote for Alfresco has been published on their blog, on BPM cloud architectures and microservices. I walk through the basics of cloud architectures (private, public and hybrid), containerization and microservices, and why this is relevant for BPM implementations. As I point out:

Not all BPM solutions are built for cloud-native architectures: a monolithic BPMS stuffed into a Docker container will not be able to leverage the advantages of modern cloud infrastructures.
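To make that distinction a bit more concrete, here’s a minimal sketch (mine, not Alfresco’s, and using only the Java standard library) of how a cloud-native setup looks from the outside: the process engine runs as its own containerized service, and other microservices talk to it over REST rather than being bundled into a single monolithic deployment. The service name, endpoint path and payload are all hypothetical, not any specific vendor’s API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical claims-intake microservice: it owns its own small job
// (receiving a claim) and delegates orchestration to a separately
// deployed, containerized BPM engine via a REST call.
public class ClaimsIntakeService {

    // Assumed service name as it might resolve inside a cluster;
    // the URL and JSON body are illustrative only.
    private static final String ENGINE_URL =
            "http://process-engine:8080/api/process-definitions/claim-handling/start";

    private final HttpClient http = HttpClient.newHttpClient();

    public void submitClaim(String claimId, double claimAmount) throws Exception {
        String body = String.format(
                "{\"variables\":{\"claimId\":\"%s\",\"claimAmount\":%.2f}}",
                claimId, claimAmount);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(ENGINE_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Engine responded: " + response.statusCode());
    }
}
```

The point is the decoupling: the intake service, the engine and any worker services can each be containerized, scaled and upgraded independently – which is exactly what a monolith stuffed into a container can’t do.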

Check out the full article on the Alfresco site.

Summarizing OPEXWeek 2018

I only had 1.5 days at OPEX Week 2018 in Orlando this week, and spent part of my time giving a presentation as well as sitting on a panel, so didn’t attend many sessions. However, I struck up a conversation with Eric Thompson at the reception last night without realizing that he was one of the original co-founders of Lombardi Software — now a part of IBM, with the Lombardi BPMS forming a good part of the core of IBM BPM — and had such an interesting talk that I sat in on the presentation that he did today with Doug Drolett about continuous improvement at Shell. Both Thompson and Drolett have senior CI roles at Shell.

Shell has been working on process improvement for more than 10 years, with business-centric process improvements during 2005-2009, moving to more end-to-end global process improvement during 2010-2013, and now focused on continuous improvement to the way that everyone works. Although driven from the top, with the CEO fully engaged, the idea is that it’s an ongoing cultural shift at every level. As they moved to this mindset, it became less about programmatic improvement (rolling out new systems to improve the business processes) and more about how that embedded culture impacts operational excellence. This results in everyone being focused on delivering value to the customer — however the customer is defined — through a perpetual cycle of plan-do-improve.

They talked about improving the order-to-cash process in their commercial business, and how they improved that process on a global scale, including standardization. They use customer journey mapping and “thinking like the customer” extensively to determine how and why to deliver value in those processes, which has an interesting tie-in with the session that I gave yesterday on how customer journey mapping and process improvement fit together. They also use value stream representations of customer-facing processes, and assign owners to those processes. Their front-line staff include Lean practitioners, with a smaller number of CI coaches overlaid on ongoing initiatives and projects. Since they’re a global operation, they use technology to enable collaboration, so that a single CI initiative can involve participants from several countries.

As you might expect from a process-centric conference, OPEX Week is exceptionally well-run, and attracts a lot of attendees because of the quality of the content. The conference originated several years ago with a focus on the Lean Six Sigma community, and many of the attendees and speakers (such as those from Shell) have roles in their company such as continuous improvement, change management and business transformation. Although technology is definitely a component in most of the projects that people talk about here, it’s not the main thing; that’s what makes the typical attendee and speaker here different from those at more technology-focused conferences. There’s a smallish display area for vendor booths and a relatively low-key vendor sponsor element that is integrated into the breakout tracks. They also have the talented visual facilitator Kimberly Dornisch capturing the themes of sessions while they’re going on. Here she is with the one that she did for the low code panel that I was on:

I also gave a presentation yesterday on customer journey mapping, and you can see my slides here:

Transforming Insurance with Cloud BPM: my guest post on the @Alfresco blog

I recently wrote three short articles for Alfresco, which they are publishing on their blog. The first one is about insurance and cloud BPM, looking at how new business models are enabled and customer-facing processes improved using a containerized cloud architecture and microservices. From the intro:

In this blog post, I plan to explore the role BPMS plays in integrating packaged software, custom-built systems, and external services into a seamless process that includes both internal and external participants. What if you need to include customers in your process without having to resort to email or manual reconciliation with an otherwise automated process? What if you need employees and partners to participate in processes regardless of their location, and from any device? What if some of the functions that you want to use, such as machine learning for auto-adjudication, industry comparative analytics on claims, or integration with partner portals, are available primarily in the public cloud?
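Picking up the machine learning example from that excerpt, here’s a rough sketch of what calling a public cloud scoring service from a process step might look like, written in the JavaDelegate style used by Activiti-based engines (Alfresco Process Services among them) for service tasks. The scoring endpoint, variable names and threshold are invented for illustration – this is a pattern sketch, not code from the article.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;

// Sketch of a service task that asks an external (public cloud) machine
// learning service whether a claim can be auto-adjudicated. The scoring
// endpoint and the process variable names are hypothetical.
public class AutoAdjudicationDelegate implements JavaDelegate {

    private static final String SCORING_URL =
            "https://ml.example.com/claims/score"; // hypothetical endpoint

    private final HttpClient http = HttpClient.newHttpClient();

    @Override
    public void execute(DelegateExecution execution) throws Exception {
        String claimId = (String) execution.getVariable("claimId");

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(SCORING_URL + "?claimId=" + claimId))
                .GET()
                .build();
        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());

        // Assume the service returns a plain numeric confidence score;
        // route the process based on a (made-up) threshold.
        double confidence = Double.parseDouble(response.body().trim());
        execution.setVariable("autoAdjudicate", confidence > 0.9);
    }
}
```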

Head over there to read more about my 4-step plan for insurance technology modernization, although the same can be applied in many other types of organizations. They also have a webinar coming up next week on legacy ECM modernization at Liberty Mutual; with some luck, Liberty Mutual will read my article and think about how cloud BPM can help modernize their processes too.

The other two posts that I wrote for them – one that dives more into BPM cloud architectures and microservices, and one that examines use cases for content in process applications – will be published over the next couple of months. Obviously, Alfresco paid me to write the content that is published on their site, although it’s educational and thought leadership in nature, not about their products.

On the Alfresco topic, I’ll likely be at Alfresco Day in New York on March 28, since they’re holding an analyst briefing there the day before.

A variety of opinions on what’s ahead for BPM in 2018

I was asked to contribute to 2018 prediction posts on a couple of different sites, along with various other industry pundits. Here’s a summary.

BPM.com: Predictions

BPM.com published The Year Ahead for BPM – 2018 Predictions from Top Influencers, introduced by BPM.com’s Nathaniel Palmer and featuring mostly people who work for vendors, but whose opinions I (mostly) respect. Many of the vendors’ predictions align with their product direction, either through good planning or happy coincidence.

A few ideas that stood out:

Blockchain may be almost ready for its close-up. Miguel Valdés Faura (Bonitasoft) and Setrag Khoshafian (Pega) both mentioned the potential for integrating blockchain with processes using DPA (digital process automation) platforms. I’ve been watching this space for a couple of years, waiting for the connections to be made between BPM and blockchain, and in addition to these mentions in the predictions article, Bernd Ruecker (Camunda) published a post yesterday with a practical use case and MWD Advisors published a report on IBM Blockchain Platform that mentions its integration with IBM BPM.

Automated decisioning, whether DMN-based or AI/ML, is going to improve process automation significantly but there’s still a lot of trepidation. Denis Gagné (Trisotech) said that decision auditability – a legal requirement in some countries – could favor DMN-based decision services over AI/ML, while Roger King (TIBCO) sees AI-based automation as the key to having RPA replace workers. Keith Swenson (Fujitsu) predicts that deep learning will be both the most important and most disappointing innovation in 2018, while Peter Fingar is bullish on intelligent (AI) agents integrated with BPM. James Taylor (Decision Management Solutions) seemed a bit disheartened that DM is being trivialized as a “feature” of BPM rather than an independent stateless service where it can have the greatest impact.

Microservices architectures are replacing monolithic BPM systems. Brian Reale (ProcessMaker) predicts that microservices will disrupt the BPM market this year, and Roger King gave a nod to dynamically-orchestrated process fragments although didn’t explicitly mention microservices. I’m seeing microservices approaches from a few of the BPM vendors, and I agree that this has a lot of potential to shift away from the monolithic (and proprietary) platforms; watch for an article that I wrote for Alfresco on BPM and microservices to be published shortly on their blog.

Low code is allowing business users (analysts, really) to participate in DevOps directly. Malcolm Ross (Appian) sees low code as a catalyst for developer diversity and the blurring of lines between business and IT. Phil Simpson (Red Hat) states that low code and citizen developers are the only way to meet the need for constantly-changing applications. My concern is that, much like how business analysts were going to develop their own BPM applications when model-driven development came around several years ago, this isn’t actually going to happen in such an optimistic fashion.

Customer journey matters. Gero Decker (Signavio) is seeing top-level value chains being replaced by customer journey maps. To me, customer journey mapping feels like a bit of old wine in new bottles, 10 years after outside-in process modeling and other customer-centric views, but whatever it takes to get some traction around modeling processes to include the customer and optimizing from their point of view. I’m speaking on this topic at next week’s OPEX Week conference.

BPM is no longer BPM. The term is being replaced by digital process automation, digital transformation and a number of others as the platforms expand beyond just process modeling and execution. Neil Ward-Dutton (MWD Advisors) envisions different types of tools vying for the place that BPM platforms occupy now within organizations, from RPA to model-driven application development tools. As I wrote in my section of the article, “BPM is dead…long live BPM!”

Lots of great insights in there, check out the entire article on BPM.com.

BPMtips.com: Skills

Zbigniew Misiak of BPM Tips takes a slightly different approach to predictions in BPM Skills in 2018 – Hot or Not, asking what BPM-related skills and techniques will be most in demand in 2018, where to learn those skills, and what’s no longer relevant.

Unlike the BPM.com list, which was dominated by vendors, this one has mostly opinions from consultants with some practitioners thrown in. There’s a lot of generic “people need to keep up on the latest technology trends” (duh), but some specific advice stood out:

Process/business architecture to connect processes to value. This ties in with the customer journey mapping trends that we saw in the BPM.com article; here, Roger Burlton (Process Renewal/BPTrends) stresses the importance of including the customer and other external stakeholders in the processes and value definition, and Sandeep Johal (PPB Advisory) reminds us that the focus of process management is (or should be) on improving customer experience via a variety of technologies. Ian Gotts (Q9 Elements) identifies business analysis and critical questioning as key skills, linking to broader business requirements such as GDPR, and Jim Sinur (Aragon) lists journey mapping.

BPMN, CMMN and DMN for standardized modeling. Alan Fish (FICO) sees formal modeling of processes and decisions as important, and Juergen Pitschke (Process Renewal) believes both BPMN and DMN are important, but some say that this is no longer required in low code process application development tools. BJ Biernatowski (Nordstrom) wonders if BPMN has a future, Roger Burlton thinks that process analysts only need to know how to use the core elements, and Sandeep Johal advocates getting rid of manual current-state modeling as automated process discovery, analysis and improvement takes over.

RPA bot training. Abhijit Kakhandiki (Automation Anywhere) suggests this somewhat depressing skill – train the robot to do your job! – but I agree that learning this would help a lot of people create helper functions for their tasks or even completely automate some tasks. Much in the same way that we used to create Excel macros… On a similar note, I recommended that people gathering requirements become proficient with the low code BPM platforms to at least create prototypes, if not the full applications, and Phil Simpson (Red Hat) recommends that less-technical BPM practitioners start to gain an understanding of things that are likely to significantly impact how applications are designed and deployed, such as RPA and microservices.

Interpersonal and soft skills in BPM and change management, in addition to technical skills. Adrian Reed (Blackmetric) listed influencing, stakeholder engagement and conflict resolution as important to making sure that the technology part of a project fits with the business and the people involved. I discussed some similar skills: the ability to translate need into action, and the need for developers to learn more about what the business does.

Again, lots more to read here, check out the original on BPMtips.com.

ITESOFT | W4 Secure Capture and Process Automation digital business platform

It’s been three years since I looked at ITESOFT | W4’s BPMN+ product, which was prior to W4’s acquisition by ITESOFT. At that time, I had just seen W4 for the first time at bpmNEXT 2014, and had this to say about it:

For the last demo of this session, Jean-Loup Comeliau of W4 demonstrated their BPMN+ product, which provides model-driven development using BPMN 2, UML 2, CMIS and other standards to generate web-based process applications without generating code: the engine interprets and executes the models directly. The BPMN modeling is pretty standard compared to other process modeling tools, but they also allow UML modeling of the data objects within the process model; I see this in more complete stack tools such as TIBCO’s, but this is less common from the smaller BPM vendors. Resources can be assigned to user tasks using various rules, and user interface forms are generated based on the activities and data models, and can be modified if required. The entire application is deployed as a web application. The data-centricity is key, since if the models change, the interface and application will automatically update to match. There is definitely a strong message here on the role of standards, and how we need more than just BPMN if we’re going to have fully model-driven application development.

A couple of weeks ago, I spoke with Laurent Hénault and François Bonnet (the latter of whom I met when he demoed at bpmNEXT in 2015 and 2016) about what’s happened with their product since then. From their company beginnings over 30 years ago in document capture and workflow, they have expanded their platform capabilities and relabelled it as digital process automation since it goes beyond BPM technology, a trend I’m seeing with many other BPM vendors. It’s not clear how many of their 650+ customers are using the full capabilities of the new platform versus just the traditional imaging and workflow functions, but they seem to be expanding on the original capabilities rather than replacing them, which will make transitioning customers easier.

The new platform, Secure Capture and Process Automation (SCPA), provides capabilities for capture, business automation (process, content and decisions), analytics and collaborative modeling, and adds some nice extras in the areas of document recognition, fraud detection and computer-aided process design. Using the three technology pillars of omni-channel capture, process automation, and document fraud detection, they offer several solutions, including eContract for paperless customer purchase contracts, with automatic fraud detection on documents uploaded by the customer; and the cloud-based Streamline for Invoices for automated invoice processing.

Their eContract solution provides online forms with e-signature, document capture, creation of an eIDAS-compliant contract and other services required to complete a complex purchase contract bundled into a single digital case. The example shown was an online used car purchase with the car loan offered as part of the contract process: by bundling all components of the contract and the loan into a single online transaction, they were able to double the purchase close rate. Their document fraud detection comes into play here, using graphometric handwriting analysis and content verification to detect if a document uploaded by a potential customer has been falsified or modified. Many different types of documents can be analyzed for potential fraud based on content: government ID, tax forms, pay slips, bank information, and public utility invoices may contain information in multiple formats (e.g., plain text plus encoded barcode); other documents such as medical records often contain publicly-available information such as the practitioner’s registration ID. They have a paper available for more information on combatting incoming document fraud.
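To illustrate the cross-checking principle only – this is my own toy sketch, not ITESOFT’s algorithm – here’s what comparing a document’s printed (OCR’d) fields against the same fields decoded from its machine-readable barcode might look like, flagging any mismatch as a possible alteration. The field names and values are invented.

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of content verification: if a document carries the same
// data twice (printed text and an encoded barcode), a mismatch between the
// two is a strong signal that one of them has been altered.
public class DocumentContentCheck {

    public static Map<String, String> findMismatches(
            Map<String, String> ocrFields, Map<String, String> barcodeFields) {
        Map<String, String> mismatches = new HashMap<>();
        for (Map.Entry<String, String> entry : barcodeFields.entrySet()) {
            String ocrValue = ocrFields.get(entry.getKey());
            if (ocrValue != null && !ocrValue.equalsIgnoreCase(entry.getValue())) {
                mismatches.put(entry.getKey(),
                        "text says '" + ocrValue + "', barcode says '" + entry.getValue() + "'");
            }
        }
        return mismatches;
    }

    public static void main(String[] args) {
        // A tampered pay slip: the printed amount was edited but the barcode was not.
        Map<String, String> ocr = Map.of("netPay", "4350.00", "employer", "ACME SA");
        Map<String, String> barcode = Map.of("netPay", "2350.00", "employer", "ACME SA");
        System.out.println(findMismatches(ocr, barcode));
    }
}
```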

Their invoice processing solution also relies heavily on understanding certain types of documents: 650,000 different supplier invoice types are recognized, and they maintain a shared supplier database in their cloud capture environment to allow these formats to be added and modified for use by all of their invoice processing customers. There’s also a learning environment to capture new invoice types as they occur. Keep in mind that the heavy lifting in invoice processing is all around interpreting the vendor invoice: once you have that sorted out, the rest of the process of interacting with the A/P system is straightforward, and the payment of most invoices that relate to a purchase order can be fully automated. Streamline for Invoices won the Accounts Payable/Invoicing product of the year at the 2017 Document Manager Awards.
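As a side note on why that PO-backed automation is straightforward, here’s a toy sketch of the kind of check involved: once capture has turned the invoice into structured data, auto-approval is little more than comparing invoice lines to purchase order lines within a tolerance. The data structures, item keys and 2% tolerance are all made up.

```java
import java.util.Map;

// Toy sketch of why PO-backed invoices can be paid without human review:
// once capture has produced structured data, approval is a simple comparison
// against the purchase order, within a tolerance. All values are invented.
public class InvoiceAutoApproval {

    static final double TOLERANCE = 0.02; // allow 2% variance, arbitrary

    public static boolean canAutoApprove(Map<String, Double> poLineAmounts,
                                         Map<String, Double> invoiceLineAmounts) {
        for (Map.Entry<String, Double> line : invoiceLineAmounts.entrySet()) {
            Double ordered = poLineAmounts.get(line.getKey());
            if (ordered == null) {
                return false; // invoiced item not on the purchase order
            }
            if (Math.abs(line.getValue() - ordered) > ordered * TOLERANCE) {
                return false; // amount differs too much from what was ordered
            }
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, Double> po = Map.of("ITEM-1", 1200.00, "ITEM-2", 350.00);
        Map<String, Double> invoice = Map.of("ITEM-1", 1200.00, "ITEM-2", 355.00);
        System.out.println(canAutoApprove(po, invoice)); // true: within tolerance
    }
}
```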

After a discussion of their solutions and some case studies, we dug into a more technical demo. A few highlights:

  • The Web Modeler provides a fully BPMN-compliant collaborative process modeling environment, with concurrent model changes and a (persistent) discussion thread between users. This is a standalone business analyst tool, and the model must be exported as a BPMN file for import into the engine for execution, so there’s no round-tripping. A cool feature is the ability to scroll back through the history of changes to the model by dragging a timeline slider: each changed snapshot is shown with the specific author.
  • Once the business analyst’s process model has been imported into the BPMN+ Composer tool, the full application can be designed: data model, full process model, low code forms-based user experience, and custom code (if required). This allows a more complex BPMN model to be integrated into a low code application – something that isn’t allowed by many of the low code platforms that provide only simple linear flows – as well as developer code for “beyond the norm” integration such as external portals.
  • Supervisor dashboards provide human task monitoring, including task assignment rules and skills matrix that can be changed in real time, and performance statistics.

The applications developed with their tools generally fall into the case management category, although they are document/data-driven rather than based on CMMN. Like many BPM vendors, they are finding that there is not the same level of customer demand for CMMN as there was for BPMN, and data-driven case management paradigms are often more understandable to business people.

They’ve OEM’d some of the components (the capture OCR, which is from ABBYY, and the web modeler from another French company) but put them together into a seamless offering. The platform is built on a standard Java stack; some of the components can be scaled independently and containerized (using Microsoft Azure), allowing customers to choose which data should exist on which private and public cloud infrastructure.

ITESOFT | W4 SCPA 2017-12 briefing

They also showed some of the features that they demoed at bpmNEXT 2017 (which I unfortunately missed): process guidance and correction that goes beyond just BPMN validation to attempt to add data elements, missing tasks, missing pathways and more; a Gantt-type timeline model of a process (which I’ve seen in BP Logix for years, but is sadly absent in many products) to show expected completion times and bottlenecks; and the same visualization directly in a live instance, auto-updating as tasks are completed within the instance. I’m not sure if these features are fully available in the commercial product, but they show some interesting views on providing automated assistance to process modeling.


What’s in a name? BPM and DPA

The term “business process management” (BPM) has always been a bit problematic because it means two things: the operations management practice of discovering, modeling and improving business processes, which may have no technology involved whatsoever; and the suite of technologies associated with automating processes. I’ve often heard – and sometimes participated in – arguments on the distinction between BPM-the-discipline and BPM-the-technology. Many people use “BPMS” (BPM system or suite) to define the technology while reserving “BPM” for the discipline, but that’s not sufficiently universal to avoid confusion.

To compound the confusion, the components of a BPMS have grown from completely process-focused modeling and execution to more complete application development suites that may include decision management, analytics, content management and much more. Gartner relabelled this market “iBPMS” starting around 2011 when they realized that BPM suites were doing much more than just BPM:

The intelligent business process management suite (iBPMS) market is the natural evolution of the earlier BPMS market, adding more capabilities for greater intelligence within business processes. Capabilities such as validation (process simulation, including “what if”) and verification (logical compliance), optimization, and the ability to gain insight into process performance have been included in many BPMS offerings for several years. iBPMSs have added enhanced support for human collaboration such as integration with social media, mobile-enabled process tasks, streaming analytics and real-time decision management.

The term iBPMS makes it sound like what we were doing before wasn’t intelligent, which clearly is not the case, but it also makes it obvious that we needed a different name to describe these technologies that we’re using to automate our business functions.

Since then, we’ve moved through a number of different names and acronyms in an attempt to describe these systems. For the more case-oriented systems (with little or no predefined process), we have “case management” (confused with the non-technical term used in social sciences and healthcare), sometimes abbreviated as CM (confused with the abbreviation for content management, which is also abbreviated as ECM but has now been rebranded as content services), plus the variations of advanced or adaptive case management (ACM) and dynamic case management (DCM). Although there are differences between case management and BPM, there are also a lot of similarities, and the distinction between products is sometimes a bit fuzzy. However, using the term “process” causes a certain amount of angst amongst the case managementerati.

This year, Forrester started using the term “digital process automation” (DPA), which is pretty much what Gartner is calling iBPMS. Forrester’s use of DPA seems to have been slightly preceded by the term “digital business automation”. Although “digital” and “automation” are a bit redundant in this context – we’re not going to do analog mechanical automation of most businesses – I think that the use of “business” rather than “process” is a much better fit. However, due to Forrester’s recent DPA wave report, vendors are leaping onto the DPA bandwagon, so we might be stuck there for a while.

In their February 2017 report, “Traditional BPM Gives Way To Digital Process Automation”, Forrester describes why this shift is necessary without actually describing the differences between [i]BPM[S] and DPA; instead, the shift seems to be coming about because organizations took what should have been model-driven development (aka low-code) BPMS and used it in waterfall development environments, thereby turning what should have been agile into legacy. In other words, they seem to be hoping that changing the name of the class of tools will change how organizations use the tools. Call me a cynic, but I’m not completely hopeful about that.

I’m not arguing that the current low code, process/case-centric platforms that combine a full suite of business automation tools aren’t a step forward from yesterday’s BPM platforms in terms of enabling automation as a part of digital transformation. But what is going to change within customer organizations to prevent them from undermining the inherent rapid application development capabilities by enforcing antiquated software development lifecycle methods?

Bonus reading: check back on my review of a Gartner presentation from 2006 on the future of BPM, which looked forward as far as 2017! They were correct that the primary value of BPM moved from productivity to visibility to innovation, and I correctly predicted that their predictions would come to pass much faster than they expected.

Tune in for the 2017 WfMC Global Awards for Excellence in BPM and Workflow

I had the privilege this year of judging some of the entries for WfMC’s Global Awards for Excellence in BPM and Workflow, and next Tuesday the 12 winners will be announced in a webinar. Tune in to hear the results from Nathaniel Palmer and Keith Swenson, as well as a presentation on industry trends from Connie Moore of Digital Clarity Group.

Presenting at OPEXWeek in January: customer journey mapping and low code

I’ll be on stage for a couple of speaking slots at the OPEX Week Business Transformation Summit 2018 in Orlando the week of January 22nd:

  • Tuesday afternoon, I’ll lead a breakout session in the customer-centric transformation track on increasing customer satisfaction through customer journey mapping and process improvement.
  • Wednesday morning, I’ll be on a panel in the RPA track on how low-code platforms are transforming BPM.

I was last at OPEX Week in 2012, when it was still called PEX Week (for Process Excellence Network) – I was on a BPM blogger panel that time around – and it will be interesting to see how it’s changed since then. Looks like a lot more automation technology in the current version, with the expectation that digital transformation isn’t going to come about just by modeling your business.

If you’re going to be there, look me up at one of my sessions or around the conference on Tuesday and Wednesday.

Release webinar: @CamundaBPM 7.8

I listened in on the Camunda 7.8 release webinar this morning – they issue product releases every six months like clockwork – to hear about the new features and upgrades from CEO Jakob Freund and VP of engineering Daniel Meyer.

They’re obviously getting a broader audience for these release webinars than just their current customers and open source community members, and started with a bit about the company, the product stack and their clients. We heard about a recent case study presented at their first San Francisco community day: 24 Hour Fitness is using Camunda process and decision management for high volume real-time orchestration of their core business processes. With over 190 processes in production, executing 20 million BPMN and 18 million DMN instances per day, this is clearly an enterprise-strength application; they are using the Camunda Enterprise Edition rather than the Community Edition for the additional features and SLA-based support, but the underlying engine and much of the tooling is identical between the products.

The key new features and updates are as follows:

  • Workflow engine performance improvements. A new batch mode allows 3-4 times more process instances to be executed per minute on several of the supported databases. This is based on grouping database transactions for the same database table (including both operational and audit tables), then doing a single round-trip call between the Camunda server and the database server to execute the batch of inserts, updates and deletes; see the JDBC sketch after this list for the general idea.
  • Cockpit batch operations. It’s now possible to do bulk operations for suspending/activating and modifying running process instances, and restarting completed process instances. Process instances can be selected by process definition name and by more complex search and filtering operations such as instance variable values, then a batch command issued to suspend, restart, modify or delete those instances. A new feature also allows all instances that are at a specific task to be dragged to a new task directly in the process model, whereas this was previously only possible for single instances; this can be used either to move instances to a new task to correct for an error condition or a changed process flow, or to restart instances that are sitting at the final end node.
  • More Cockpit features. In addition to the batch operations, Cockpit now has faster BPMN model rendering (from 8 seconds down to 2 seconds), the ability to delete process definitions, and a number of other administrative functions.
  • Spring Boot Starter. Originally created as a community extension in 2015 (with significant contributions from community members Jan Galinski and Oliver Steinhauer), Camunda adopted this project into the main code base to create an officially-supported version of the Camunda Spring Boot Starter, documented here.
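As referenced in the first bullet above, here’s a minimal sketch of the general batching idea using plain JDBC (assuming a JDBC driver on the classpath): statements against the same table are accumulated and sent to the database in one round trip instead of one network call per row. This is an illustration of the principle only, not Camunda’s actual implementation, and the connection details and table name are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Illustration of the batching principle with plain JDBC: many inserts into
// the same table are accumulated and flushed in a single round trip to the
// database, instead of one network call per statement.
public class BatchInsertExample {

    public static void main(String[] args) throws Exception {
        // Connection details and table name are placeholders.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/engine", "user", "password")) {
            conn.setAutoCommit(false);

            String sql = "INSERT INTO audit_log (instance_id, detail) VALUES (?, ?)";
            try (PreparedStatement stmt = conn.prepareStatement(sql)) {
                for (int i = 0; i < 1000; i++) {
                    stmt.setString(1, "instance-" + i);
                    stmt.setString(2, "some audit detail");
                    stmt.addBatch();          // queue locally, no network call yet
                }
                stmt.executeBatch();          // one round trip for the whole batch
            }
            conn.commit();
        }
    }
}
```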

The first two updates are focused squarely on improving performance and administration for high volume operations – likely driven by clients such as 24 Hour Fitness – and will serve Camunda well as they push into more core enterprise business processes. The Spring Boot integration positions them well for deploying BPM services in a microservice architecture.
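For anyone wondering what that looks like in practice, here’s roughly the shape of a Camunda Spring Boot application built on the starter: one class that boots Spring and the embedded process engine together, making a self-contained process service that’s easy to drop into a container. It assumes the camunda-bpm-spring-boot-starter dependency is on the classpath and that a BPMN file with the (made-up) process key “invoice-approval” is included in the deployment resources.

```java
import org.camunda.bpm.engine.RuntimeService;
import org.camunda.bpm.spring.boot.starter.annotation.EnableProcessApplication;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

// Sketch of a self-contained process service: Spring Boot plus an embedded
// Camunda engine in one deployable unit, suitable for running in a container.
@SpringBootApplication
@EnableProcessApplication
public class ProcessServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(ProcessServiceApplication.class, args);
    }

    // Start one instance of a deployed process on startup, just to show the API;
    // "invoice-approval" is a hypothetical process definition key.
    @Bean
    public CommandLineRunner startDemoInstance(RuntimeService runtimeService) {
        return args -> runtimeService.startProcessInstanceByKey("invoice-approval");
    }
}
```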

Camunda BPM 7.8

Good summary of the new features in 7.8, and a great Spring Boot coding demo by Meyer, in spite of his grumbling about having to do it on Windows for the webinar.

The webinar will be available for replay soon; check their website for availability. You can also see their release blog post that links to the release notes and describes many of the things that I saw today in the webinar.

Disclaimer: Camunda has been, but is not currently, a client. They did not provide any incentive to attend and write about this webinar, and these are my own opinions. That’s always the case for what I write here, but it’s good to make it explicit every once in a while.