WfMC Business Transformation Awards 2019, and farewell to WfMC

I listened in today on the annual awards webinar for the Workflow Management Coalition’s Business Transformation Awards. Nathaniel Palmer and Keith Swenson hosted the webinar, with assistance from Layna Fischer, and they announced that the WfMC is being disbanded: the original goals of the organization around process standards development have been achieved, and the standards are now being successfully managed by other standards bodies such as OMG.

I was a judge on some of the case management case studies for the awards, and it’s always interesting to read about how BPM, case management and related technologies are used in different scenarios. The winners of these final awards for excellence in business transformation were presented and discussed:

  • Banco Galicia, nominated by IBM India
  • Banmedica Chile, nominated by Pectra Technology
  • Becton Dickinson, nominated by Newgen Software
  • BeeHIVE, nominated by IBM Singapore (I’m very curious as to whether this is related to their Beehive enterprise collaboration tool that I first saw in 2008)
  • City of Fort Worth, nominated by BP Logix (also recognized last year)
  • Immunization Information Systems Support Branch of the Centers for Disease Control (CDC), self-nominated
  • EsPozo Alimentacion, nominated by AuraPortal
  • ERSP, City of Buenos Aires, nominated by Pectra Technology
  • EVRAZ, nominated by BPM’online
  • Maury, Donnely and Parr, nominated by ProcessMaker
  • NEM Solutions, nominated by AuraPortal
  • Quote-to-Cash Operations, nominated by IBM Philippines
  • Remaza Group, nominated by Vianuvem
  • Sicoob Credicitrus, nominated by Lecom Tecnologia
  • Signature Care Management, self-nominated

They had a few other awards in addition to the case studies, focused on people involved in business transformation:

  • Michael Pang of Protiviti Greater China, awarded for BT CEO – Technology User
  • Jude Chagas Pereira of IYCON and Wizly, awarded for BT CEO – Technology Provider
  • Layna Fischer of Future Strategies and WfMC, awarded the Manheim Award for Significant Contributions in the Field of Workflow/BPM (yay Layna!)
  • Shaun Campbell of City of Fort Worth, awarded for Outstanding BT Team Leader
  • Me (!), awarded for Outstanding BT Consultant

Congrats to all the winners, and a heartfelt thanks to Nathaniel, Keith and Layna for their amazing contributions to WfMC over the years.

The slides and recording of the awards webinar will be available at the Business Transformation Awards website, and watch for the new version of the Intelligent Automation book as well as previous books on this topic at Future Strategies’ BPM Books site.

Prepping for #BPM2019 and the BPMinDIT workshop

It’s less than two weeks until the academic/research-oriented International Conference on BPM in Vienna, and as of two days ago, they’ve closed the registration with 459 participants from 48 countries. It will still be possible to get a ticket on site, but you’ll miss out on the social events.

The conference organizers graciously provided me with a conference pass (I’m covering my own travel expenses), and invited me to give a talk at the workshop on BPM in the era of Digital Innovation and Transformation (BPMinDIT). I’ll be talking about how BPM systems are being used as the keystone for digital automation platforms, covering both technical architecture and how this contributes to business agility. My aim is to provide an industry experience perspective to complement the research papers in the workshop, and hopefully generate some interest and ideas, all in about 25 minutes!

There are a ton of interesting things to see at the conference: a doctoral consortium on Sunday, workshops on Monday, tutorials and sessions Tuesday through Thursday, then a meeting on teaching fundamentals of BPM on Friday. I’ll be there Monday through Thursday, look me up if you’re there, or watch for my blog posts about what I’m finding interesting.

Cake

My first Sacher Torte in Vienna, 2007. Yes, if you buy a whole one it comes in a fancy box.

Lately, I’ve been thinking about cake. Not (just) because I’m headed to Vienna, home of the incomparable Sacher Torte, nor because I’ll be celebrating my birthday while attending the BPM2019 academic research conference there. No, I’ve been thinking about technical architectural layer cake models.

In 2014, an impossibly long time ago in computer-years, I wrote a paper about what one of the analyst firms was then calling Smart Process Applications (SPA). The idea is that a vendor would provide a SPA platform, then the vendor, customer or third parties would create applications using this platform — not necessarily using low-code tooling, but at least using an integrated set of tools layered on top of the customer’s infrastructure and core business systems. Instances of these applications — the actual SPAs — could then be deployed by semi-technical analysts who just needed to configure the SPA with the specifics of the business function. The paper that I wrote was sponsored by Kofax, but many other vendors provided (and still provide) similar functionality.

Layer cake diagram from my 2014 white paper on Smart Process Application platforms.

The SPA platforms included a number of integrated components to be used when creating applications: process management (BPM), content capture and management (ECM), event handling, decision management (DM), collaboration, analytics, and user experience.

The concept (or at least the name) of SPA platforms has now morphed into “digital transformation”, “digital automation” or “digital business” platforms, but the premise is the same: you buy a monolithic platform from a vendor that sits on top of your core business systems, then you build applications on top of that to deploy to your business units. The tooling offered by the platform is now more likely to include a low-code development environment, which means that the applications built on the platform may not need a separate “configure and deploy” layer above them as in the SPA diagram here. Or this same model could be used, with non-low-code applications developed in the layer above the platform, then low-code configuration and deployment of those just as in the SPA model. Due to pressure from analysts, many BPMS platforms became these all-in-one platforms under the guise of iBPMS, but some ended up with a set of tools with uneven capabilities: great functionality for their core strengths (BPM, etc.), but weaker functionality in areas that they had to partner to include or hastily build in order to be included in the analyst rankings.

The monolithic vendor platform model is great for a lot of businesses that are not in the business of software development, but some very large organizations (or small software companies) want to create their own platform layer out of best-of-breed components. For example, they may want to pick BPM and DM from one vendor, ECM from multiple others, collaboration and user experience from still another, plus event handling and analytics using open source tools. In the SPA diagram above, that turns the dark blue platform layer into “Build” rather than “Buy”, although the impact is much the same for the developers who are building the applications on top of the platform. This is the core of what I’m going to be presenting at CamundaCon next month in Berlin, with some ideas on how the market divides between monolithic and best-of-breed platforms, and how to make a best-of-breed approach work (since that’s the focus of this particular audience).
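To make the build-your-own idea concrete, here’s a minimal sketch (in Python, with entirely hypothetical interface names and stub components — not any actual vendor’s API) of how a best-of-breed platform layer can hide component choices behind common interfaces, so that the application developers working above the platform don’t care which vendor supplied each capability:

```python
from typing import Protocol


class ProcessEngine(Protocol):
    """Common interface for any BPM component, regardless of vendor."""
    def start_process(self, definition: str, data: dict) -> str: ...


class DecisionEngine(Protocol):
    """Common interface for any decision management (DM) component."""
    def evaluate(self, decision: str, data: dict) -> dict: ...


class AutomationPlatform:
    """The 'build' platform layer: composes best-of-breed components
    behind one facade that application developers code against."""

    def __init__(self, bpm: ProcessEngine, dm: DecisionEngine):
        self.bpm = bpm
        self.dm = dm

    def handle_claim(self, claim: dict) -> str:
        # Evaluate routing rules first, then kick off the chosen process.
        route = self.dm.evaluate("claim-routing", claim)
        return self.bpm.start_process(route["process"], claim)


# Stubs standing in for two different vendors' products; swapping in a
# different vendor means changing only these adapters, not the apps above.
class StubBPM:
    def start_process(self, definition: str, data: dict) -> str:
        return f"instance-of-{definition}"


class StubDM:
    def evaluate(self, decision: str, data: dict) -> dict:
        low_value = data.get("amount", 0) < 1000
        return {"process": "fast-track" if low_value else "full-review"}


platform = AutomationPlatform(StubBPM(), StubDM())
print(platform.handle_claim({"amount": 500}))   # instance-of-fast-track
print(platform.handle_claim({"amount": 5000}))  # instance-of-full-review
```

The point of the sketch is the seam: whether the dark blue layer is bought or built, the applications above it see the same facade.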

And yes, there will be cake, or at least some updated technical architectural layer cake models.

Goals and metrics

I’ve been spending some time recently helping a few companies think about how their corporate goals are aligned with key performance indicators (KPIs) at all levels of their organization, like this:

Hierarchy of corporate goals and KPIs (diagram)

Top-level goals, or what keeps the corporate executives awake at night, usually fall into the following categories:

  • Revenue growth
  • Competitive differentiation
  • Product agility
  • Customer retention

As we move down the hierarchy, different levels of business managers are also concerned with operating margin/profitability, service time, compliance, and operational scalability; you can see a pretty direct line between these KPIs and the top-level corporate goals. For example, improved profitability is likely going to improve (net) revenue, while better service time means happier customers. When we reach the level of front-line workers, their KPIs are usually based on individual performance and skills advancement.
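As a rough sketch of that alignment (all of the KPI names here are illustrative, not drawn from any specific client), the hierarchy can be modeled as a simple parent map, where every lower-level KPI should trace up to one of the top-level corporate goals:

```python
# Each KPI points to the higher-level KPI or goal it supports;
# top-level corporate goals have no parent.
KPI_PARENT = {
    "claims processed per day": "service time",  # front-line worker KPI
    "profitability": "revenue growth",           # business manager KPI
    "service time": "customer retention",        # business manager KPI
    "revenue growth": None,                      # corporate goal
    "customer retention": None,                  # corporate goal
    "competitive differentiation": None,         # corporate goal
}


def trace_to_goal(kpi: str) -> list[str]:
    """Walk up the hierarchy from a KPI to its top-level corporate goal."""
    path = [kpi]
    while KPI_PARENT.get(path[-1]) is not None:
        path.append(KPI_PARENT[path[-1]])
    return path


print(trace_to_goal("claims processed per day"))
# ['claims processed per day', 'service time', 'customer retention']
```

A worker-level KPI that produces no such path is exactly the kind of misaligned metric discussed next.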

The problem arises when those worker-level KPIs are not aligned with the corporate goals; I’ve written about this in several presentations and papers in the past, in particular about how we need to change worker metrics in more collaborative work environments so that they’re rewarded for more than just personal performance. In doing some research on this, I came across Goodhart’s Law (via the book The Tyranny of Metrics), which is basically about how people will game measurement systems to their own benefit, particularly when goals are complex and the metrics are crude. That’s so true. In other words, given the choice between maximizing a poorly-designed metric that will benefit them personally, or doing the right thing for the customer/company, people will almost always choose the former.

Examples:

  • An organization has a “same day” SLA for incoming customer inquiries, except if the inquiry needs to be reviewed by the legal or accounting departments. Business units are measured on how well they meet the SLA, so everyone forwards all of their unfinished work to legal or accounting at the end of the day in order to meet their SLA, even if the inquiry does not require it. This decreases productivity and increases customer service time, but maximizes the departmental time-based SLA.
  • An HR department is measured by the number of candidates that are hired, but not on the quality of the candidates. I don’t need to explain how that goes wrong, but suffice it to say that it has a big impact on customer satisfaction as well as productivity.
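The first example can even be simulated with a few lines of Python and some entirely made-up numbers: forwarding unfinished work to an exempt department improves the department’s measured SLA while making the customer’s actual wait worse.

```python
def departmental_sla(inquiries, game_the_metric: bool):
    """Return (fraction of inquiries meeting the same-day SLA,
    average end-to-end days the customer actually waits).

    Forwarding to legal stops the departmental clock, so the SLA is
    'met' -- even though the customer is still waiting, plus extra
    time in legal's queue."""
    met = 0
    total_customer_days = 0
    for days_needed in inquiries:
        if days_needed <= 1:
            met += 1
            total_customer_days += days_needed
        elif game_the_metric:
            met += 1                                 # forwarded: SLA "met"
            total_customer_days += days_needed + 2   # extra legal queue time
        else:
            total_customer_days += days_needed
    return met / len(inquiries), total_customer_days / len(inquiries)


inquiries = [1, 1, 3, 2, 1]  # days of work each inquiry actually needs

print(departmental_sla(inquiries, game_the_metric=False))  # (0.6, 1.6)
print(departmental_sla(inquiries, game_the_metric=True))   # (1.0, 2.4)
```

The gamed metric looks perfect (100% SLA compliance) while the average customer wait gets 50% longer — Goodhart’s Law in five lines.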

Any metric that is based on individual (or departmental) performance but can’t be aligned up the hierarchy to a corporate goal is probably going to be detrimental to overall performance, or at least neutral. If you can’t show how a task is contributing to the good of the enterprise, then why are you doing it?

Spreadsheets and email

I had a laugh at the xkcd comic from a few days ago:

Spreadsheets

It made me think of my standard routine when I’m walking through a business operations area and want to pinpoint where the existing systems aren’t doing what the workers really need them to do: I look for the spreadsheets and email. These are the best indicators of shadow IT at work, where someone in the business area creates an application that is not sanctioned or supported by IT, usually because IT is too busy to "do it right". Instead of accessing data from a validated source, the data is copied into a spreadsheet, where scripts perform calculations using business logic that was probably valid at the point that it was written, but hasn’t been updated since the person who wrote it left the company. Multiple copies of the spreadsheet (or a link to an unprotected copy on a shared drive) are forwarded to people via email, but there’s no way to track who has it or what they’ve done with it. If the data in the source system changes, the spreadsheet and all of its copies stay the same unless manually updated.

Don’t get me wrong: I love spreadsheets. I once claimed that you could take away every other tool on my desktop and I could reproduce them all in Excel. Spreadsheets and email fill the gaps between brittle legacy systems, but they aren’t a great solution. That’s where low-code platforms fit really well: they let semi-technical business analysts (or semi-business technical analysts) create applications that can access realtime business data, assign and track tasks, and integrate other capabilities such as decision management and analytics.

I gave a keynote at bpmNEXT this year about creating your own digital automation platform using a BPMS and other technology components, which is what many large enterprises are doing. However, there are many other companies — and even departments within those large companies — for which a low-code platform fills an important gap. I’ll be doing a modified version of that presentation at this year’s CamundaCon in Berlin, and I’m putting together a bit of a chart on how to decide when to build your own platform and when to use a monolithic low-code platform for building business applications. Just don’t use spreadsheets and email.

September in Europe: @BPMConf in Vienna, @Camunda in Berlin, @DecisionCAMP in Bolzano

Many people vacation in Europe in September once the holiday-making families are back home. Personally, I like to cram in a few conferences between sightseeing.

Brandenburger Tor in Berlin

Primarily, my trip is to present a keynote at CamundaCon in Berlin on September 12-13. Last time that I attended, it was one day for Camunda open source users, followed by one day for commercial customers, the latter of which was mostly in German (Ich spreche nur Deutsch, wenn Google mir hilft — I only speak German when Google helps me). Since then, they’ve combined the programs into a two-day conference that includes keynotes and tracks that appeal across the board; lucky for me, it’s all in English. I’m speaking on the morning of the first day, but plan to stay for most of the conference to hear some updates from Camunda and their customers, and blog about the sessions. Also, I can’t miss the Thursday night BBQ!

Staatsoper in Vienna

Once I had agreed to be in Berlin, I realized that the international academic BPM conference is the previous week in Vienna. I attended my first one in Milan in 2008, then Ulm in 2009, Hoboken in 2010, Clermont-Ferrand in 2011 (where I had the honor of keynoting) and Tallinn in 2012, before I fell off the wagon and have missed every one since then. This year, however, I’ll be back to check out the latest BPM-related research, see workshop presentations, and attend presentations across a number of technical and management tracks.

Waltherplatz in Bolzano

Then I saw a tweet about DecisionCAMP being held in Bolzano the week after CamundaCon, and a few tweets later, I was signed up to attend. Although I’m not focused on decision management, it’s part of what I consult on and write about, and this is a great chance to hear about some of the new trends and best practices.

Look me up if you’re going to be at any of these three conferences, or want to meet up nearby.

Microservices meets case management: my post on the @Alfresco blog

Image lifted from my post on Alfresco’s blog, charmingly named “analyst_meme1”

I wrote a post on a microservices approach to intelligent case management applications for Alfresco, which they’ve published on their blog. It covers the convergence of three key factors: the business requirement to support case management paradigms for knowledge work; the operational drive to increase automation and ensure compliance; and the technology platform trend to adopt microservices.

It’s a pretty long read; I originally wrote it as a 3-4 page paper to cover the scope of the issues, with case management examples in insurance claims, citizen services, and customer onboarding. My conclusion:

Moving from a monolithic application to microservices architecture makes good sense for many business systems today; for intelligent case management, where no one supplier can provide a good solution for all of the required capabilities, it’s essential.

Before you ask:

  • Yes, I was paid for it, which is why it’s there and not here.
  • No, it’s not about Alfresco products, it’s technology/business analysis.

Wrapping up OpenText Enterprise World 2019

It’s the last day of OpenText Enterprise World for this year. I started the day attending one of the developer labs, where I created a JavaScript app using OT2 services, then attended a couple of AppWorks-related sessions: Duke Energy’s transition from MetaStorm to AppWorks, and using AppWorks for process/case and content integration in the public sector. I also got to meet the adorable Great Dane that was here as part of the Paws for a Break program: she’s a cross between a Harlequin and Merle in color, so they call her a Merlequin.

Mark Barrenechea was back to close the conference with a quick recap: 3,500 attendees, OpenText Cloud Edition, Google partnership, ethical supply chains, and the talk by Sir Tim Berners-Lee. Plus Berners-Lee’s quote of the real reason that the web was created: cat videos!

In addition to the announcements that we heard during the week, Barrenechea also told us about their new partnership with MasterCard to provide integrated payment services in B2B supply chains, and had two MasterCard Enterprise Partnership executives on stage to talk more about it.

The closing ceremonies finished off with another very special guest: singer, songwriter and activist Peter Gabriel. I was familiar with his music career — having had the pleasure to see him live in concert in the past — but didn’t realize the extent of his human rights activism. He talked about his start and career in music, and some of the ways that he’s woven human rights into his career, from writing the timeless anti-apartheid hit about Stephen Biko to starting the WOMAD festival. He’s been involved in the creation of an inter-species internet, and showed a video of a bonobo composing music with him.

Then his band joined him and he played a set! Amazing finish to the week.

OpenText Enterprise World 2019: AppWorks roadmap and technical deep dive

I had an afternoon with AppWorks at OpenText Enterprise World: a roadmap session followed by a technical deep dive. AppWorks is their low-code tool that includes process management, case management, and access to content and other information, supported across mobile and desktop platforms. It contains a number of pre-packaged components, and a technical developer can create new components that can be accessed as services from the AppWorks environment. They’ve recently made it into the top-right corner of the Forrester Wave for [deep] digital process automation platforms, with their case management and content integration cited among their strongest features, along with Magellan’s AI and analytics, and the OpenText Cloud deployment platform.

The current release has focused on improving end-user flexibility and developer ease-of-use, but also on integration capabilities with the large portfolio of other OpenText tools and products. There are new developer features such as an expression editor and a mobile-first design paradigm, plus an upcoming framework for end-user UI customization in terms of themes and custom forms. Runtime performance has been improved by making applications into true single-page applications.

There are four applications built on the current on-premise AppWorks: Core for Legal, Core for Quality Management, Contract Center and People Center. These are all some combination of content (from the different content services platforms available) plus case or process management, customized for a vertical application. I didn’t hear a commitment to migrate these to the cloud, but there’s no reason that this won’t happen.

There are some interesting future plans, such as how AppWorks will be used as a low-code development tool for OT2 applications. They have a containerized version of AppWorks available as a developer preview, as a stepping stone to next year’s cloud edition. There was a mention of RPA, although not a clear direction at present: they can integrate with third-party RPA tools now, and may be mulling over whether to build or buy their own capability. There’s also the potential to build process intelligence/mining and reporting functionality based on their Magellan machine learning and analytics. There were a lot of questions from the audience, such as whether they will support GitHub for source code control (probably, but not yet scheduled) and whether there will be better REST support.

Nick King, the director of product management for AppWorks, took us through a technical session that was primarily an extended live demonstration of creating a complex application in AppWorks. Although the initial part of creating the layout and forms is pretty accessible to non-technical people, the creation of BPMN diagrams, web service integration, and case lifecycle workflows are clearly much more technical; even the use of expressions in the forms definition is starting to get pretty technical. Also, based on the naming of components visible at various points, there is still a lot of the legacy Cordys infrastructure under the covers of AppWorks; I can’t believe it’s been 12 years since I first saw Cordys (and thought it was pretty cool).

There are a lot of nice things that just happen without configuration, much less coding, such as the linkages between components within a UI layout. Basically, if an application contains a number of different building blocks such as properties, forms and lifecycle workflows, those components are automatically wired together when assembled on a single page layout. Navigation breadcrumbs and action buttons are generated automatically, and changes in one component can cause updates to other components without a screen refresh.

OpenText, like every other low-code application development vendor, will likely continue to struggle with the issue of what a non-technical business analyst versus a technical developer does within a low-code environment. As a Java developer at one of my enterprise clients said recently upon seeing a low-code environment, “That’s nice…but we’ll never use it.” I hope that they’re wrong, but fear that they’re right. To address that, it is possible to use the AppWorks environment to write “pro-code” (technical lower-level code) to create services that can be added to a low-code application, or to create an app with a completely different look and feel than is possible using AppWorks low-code. If you were going to do a full-on BPMN process model, or make calls to Magellan for sentiment analysis, it would be more of a pro-code application.

OpenText Enterprise World 2019 day 2: technology keynote

We started day 2 of OpenText Enterprise World with a technology keynote by Muhi Majzoub, EVP of Engineering. He opened with a list of their major releases over the last year. He highlighted the upcoming shift to cloud-first containerized deployments of the next generation of their Release 16 that we heard about in Mark Barrenechea’s keynote yesterday, and described the new applications that they have created on the OT2 platform.

We heard about and saw a demo of their Core for Federated Compliance, which allows for federated records and retention management across CMS Core, Content Suite and Documentum repositories, with future potential to connect to other (including non-OpenText) repositories. I’m still pondering the question of when they might force customers to migrate off some of the older platforms, but in the meantime, the content compliance and disposition can be managed in a consolidated manner.

Next was a demo of Documentum D2 integrated with SAP — this already existed for their other content products, but was a direct request from customers — allowing content imported into D2 in support of transactions such as purchase orders to be viewed as related documents by an SAP user in a Smart View. They have a strong partnership with SAP, providing enterprise-scale content management as a service on the SAP cloud, integrated with SAP S/4HANA and other applications. They are providing content management as OT2-based microservices, allowing content to be integrated anywhere in the SAP product stack.

AppWorks also made an appearance: this is OpenText’s low-code application development platform that also includes their process management capabilities. They have new interfaces for developers and users, including better mobile applications. No demo, however; given that I missed my pre-conference briefing, I’ll have to wait until later today for that.

Majzoub walked through the updates of many of the other products in their portfolio: EnCase, customer experience management, AI, analytics, eDocs, Business Network and more. They have such a vast portfolio that there are probably few analysts or customers here that are interested in all of them, but there are many customers that use multiple OpenText products in concert.

He finished up with more on OT2, positioning it as a platform and repository of services for building applications in any of their product areas. These services can be consumed by any application development environment, whether their AppWorks low-code platform or more technical development tools such as Java. An interesting point made in yesterday’s keynote challenges the idea of non-technical users as “citizen developers”: they see low-code as something that is used by [semi-]technical developers to build applications. The reality of low-code may finally be emerging.

They are featuring six new cloud-based applications built on OT2 that are available to customers now: Core for Capital Projects, Core for Supplier Exchange, Core Enhances Integration with CSP, Core Capture, Core for SAP SuccessFactors, and Core Experience Insights. We saw a demo that included the Capital Projects and Supplier Exchange applications, where information was shared and integrated between a project manager on a project and a supplier providing documentation on proposed components. The Capital Projects application includes analytics dashboards to track progress on deliverables and issues.

Good start to the day, although I’m looking forward to more of a technical drill-down on AppWorks and OT2.