I attended the tutorial on Exploring Explorative Business Process Management with Max Röglinger, Thomas Grisold, Steven Gross and Katharina Stelzl, representing Universität Liechtenstein, Universität Bayreuth and Wirtschaftsuniversität Wien. This was a good follow-on from Kalle Lyytinen’s keynote with its themes of emerging process routines, and looked at the contrast between maximizing operational efficiency on pre-defined processes and finding completely new ways of doing things: doing things right versus doing the right thing.
They frame these as exploitative BPM — the traditional approach of maximizing operational efficiency — and explorative BPM as a driver for innovation. Anecdotal evidence aside, there is research that shows that (pre-defined) BPM activities impede radical innovation because the lack of variance in processes means that we rarely have “unusual experiences” that might inspire radical new ways of doing things. In fact, most of the BPM research and literature is on incrementally improving business processes (usually via analytics), not process innovation or even process redesign.
The division between exploitative (improvement) and explorative (innovation) activities comes from organizational science, with “organizational ambidexterity” a measure of how well an organization can balance exploitation and exploration as they manage change. This can be seen to align with people and teams with delivery skills (analyzing, planning, implementation, self-discipline) versus those with discovery skills (associating, questioning, observing, networking, experimenting), and the need for both sides to work together.
In their research, they have defined three dimensions of process design — trigger, action and value proposition — with combinations of these for problem-driven versus opportunity-driven process improvement/innovation. Explorative BPM results in reengineered (not just incrementally improved) processes or completely new processes to offer the same, enhanced or new value propositions. In general, their definition of opportunities is tied to top-level business goals, while problems are essentially operational pain points. There was a discussion around the nature of problems and opportunities, and how the application of BPM (and BPM research) is expanding to more than just classical process management but also supporting business model innovation.
Having set the stage for what explorative BPM is and why it is important, we had a group exercise to explore explorative BPM, and generate ideas for how to go about process innovation. To finish the tutorial, they presented the results of their tutorial research paper with a potential method for explorative BPM: the triple diamond model.
Future research directions for explorative BPM may include ideas from other domains/disciplines; organizational capabilities required for explorative BPM; and the tools, methods and techniques required for its implementation.
With the workshops finished yesterday, we kicked off the main BPM 2019 program with a welcome from co-organizers Jan Mendling and Stefanie Rinderle-Ma, and greetings from the steering committee chair Mathias Weske. We heard briefly about next year’s conference in Sevilla, and 2021 in Rome — I’m already looking forward to both of those — then remarks from WU Rectorate (and Vice-Rector Digitalization) Tatjana Oppitz on the importance of BPM in transforming businesses. This is the largest year for this academic/research BPM conference, with more than 400 submissions and almost 500 attendees from 50 countries, an obvious indication of interest in this field. Also great to see so many BPM vendors participating as sponsors and in the industry track, since I’m an obvious proponent of two-way collaboration between academia and practitioners.
The main keynote speaker was Kalle Lyytinen of Case Western Reserve University, discussing digitalization and BPM. He showed some of the changes in business due to process improvement and design (including the externalization of processes to third parties), and the impacts of digitalization, that is, deeply embedding digital data and rules into organizational context. He went through some history of process management and BPM, with the goals focused on maximizing use of resources and optimization of processes. He also covered some of the behavioral and economic views of business routines/processes in terms of organizational responses to stimuli, and a great analogy (to paraphrase slightly) that pre-defined processes are like maps, while the performance of those processes forms the actual landscape. This results in two different types of approaches for organized activity: the computational metaphor of BPM, and the social/biological metaphor of constantly-evolving routines.
He defined digital intensity as the degree to which digitalization is required to perform a task, and considered how changes in digital intensity impact routines: in other words, how is technology changing the way we do things on a micro level? Lyytinen specializes in part in the process of designing systems (since my degree is in Systems Design Engineering, I find this fascinating), and showed some examples of chip design processes and how they changed based on the tools used.
He discussed a research study and paper that he and others had performed looking at the implementation of an SAP financial system in NASA. Their conclusion is that routines — that is, the things that people do within organizations to get their work done — adapted dynamically to adjust to the introduction of the IT-imposed processes. Digitalization initially increases variation in routines, but then the changes decrease over time, perhaps as people become accustomed to the new way of doing things and start using the digital tools. He sees automation and workflow efficiency as an element of a broader business model change, and transformation of routines as complementary to but not a substitute for business model change.
The design of business systems and models needs to consider both the processes provided by digitalization (BPM) and the interactions with those digital processes that are made up by the routines that people perform.
There was a question — or more of a comment — from Wil van der Aalst (the godfather of process mining) on whether Lyytinen’s view of BPM is based on the primarily pre-defined BPM of 20 years ago, and if process mining and more flexible process management tools are a closer match to the routines performed by people. In other words, we have analytical techniques that can then identify and codify processes that are closer to the routines. In my opinion, we don’t always have the ability to easily change our processes unless they are in a BPM or similar system; Lyytinen’s SAP at NASA case study, for example, was very unlikely to have very flexible processes. However, van der Aalst’s point about how we now have more flexible ways of digitally managing processes is definitely having an impact in encoding routines rather than forcing the change of routines to adapt to digital processes.
There was also a good discussion on digital intensity sparked by a question from Michael Rosemann, and how although we might not all become Amazon-like in the digital transformation of our businesses, there are definitely now activities in many businesses that just can’t be done by humans. This represents a much different level of digital intensity from many of our organizational digital processes, which are just automated versions of human routines.
I’m starting off the BPM2019 academic/research conference in the workshop day attending the session on BPM in the era of digital innovation and transformation, organized by Oktay Turetken of Eindhoven University of Technology. Great to walk into the room and be greeted by a long-time acquaintance, Wasana Bandara, who is here from the Queensland University of Technology in Australia to deliver several papers in other sessions later this week.
The opening keynote in the workshop was by Maximilian Röglinger of Universität Bayreuth and Fraunhofer FIT on BPM capabilities for the digital age (also covered in a BP Trends article by the authors). He and his research team have a focus on inter-disciplinary process-related topics with a management focus, closely collaborating with industry. Their main interest in recent research is on how BPM is changing (and needs to change) as organizations move into the digital age, and they performed a Delphi study on the question of what’s next for BPM in this new environment. As with all Delphi studies, this was done in rounds: the first identifying the challenges and opportunities, and the second identifying and ranking capabilities. This was based on the six core elements of BPM capabilities — strategic alignment, governance, methods, information technology, people and culture — identified in Michael Rosemann’s research from 2007.
Röglinger made some interesting points about specific capabilities that were considered important by industry versus research participants in the study, and presented an updated BPM capability framework that included merging the methods and information technology areas. Only three of the capabilities from the original framework are identical; the other 27 are either new (13) or enhanced (14). Many of the new capabilities are data-related: not new in research or practice, of course, but newly recognized as fundamental BPM capabilities rather than something that just happens to occur alongside it.
They have a number of results with both theoretical and managerial implications, a couple of which I found particularly interesting:
Industry and academia have different perceptions about the challenges and opportunities for BPM in the digital age. This seems obvious at first glance, but highlights the disconnect between research and practice. The first time I attended this conference, I recall writing about ideas that I saw in the research papers and wondering why there weren’t more BPM vendors and other practitioners at the conference to help drive this alignment (thankfully, now there is an industry track at the conference). Of course, there will always be academic research that has no (obvious) immediate industry use, and therefore little interest until it becomes closer to practical use, but we need to have better collaboration between industry and academia to inspire research and promote adoption of new ideas.
The scope of BPM needs to expand to other disciplines and technologies, and not be completely focused on process-centric technologies. There’s a blockchain forum here at BPM 2019, and in industry we’re seeing the inclusion of many other technologies in what have traditionally been BPM suites. I’m giving a short presentation later in the workshop on BPM as the keystone of digital/business automation platforms, in which I discuss some of the practical architectural considerations for these broader capabilities.
Following the keynote, we had a paper presentation from Ralf Laue of Westsächsische Hochschule Zwickau on The Power of the Ideal Final Result for Identifying Process Optimization Potential. He had some interesting comments on customer journey modeling and other techniques based on an existing process, in that it typically only results in incremental improvement, not innovative results (which is similar to my views on Lean Six Sigma for process improvement). Based on ideas from TRIZ methodology for pattern-based design, Laue stressed that it’s necessary to “not stop thinking too early” in the design process. Applied to customer journeys, this means including not only customer touch-points, but the “non-touch-points”, that is, other things that the customer does that are not (currently) involved in their interaction with an organization. Modeling the entire customer process, rather than just their touch-points with the organization, allows us to look at process optimization that is centered on the customer.
Next was a paper on an empirical analysis toward understanding the need for new perspectives on BPM in the digital age, presented by Christian Janiesch from Technische Universität Dresden. They looked at a number of questions about BPM, such as whether BPM is still the driving force in digital transformation projects, and interviewed companies that had deployed BPM successfully — possibly a perspective that would tend to support the central positioning of BPM in digital transformation. They have found that the emergence of digital technology has led to organizations having less overall process control, although I would posit that the amount of process control is the same, just that some previously manual processes (uncontrolled but also unobserved) are now managed with digital tools that don’t use explicit process control. They found a need for a funded BPM CoE for coordinating process improvement activities, and for the inclusion of more than just the automation/management of core processes. Two interesting conclusions:
At present, BPM isn’t seen as the driving force for successful digital transformation initiatives. My presentation after this shows how a BPM engine is a keystone for a digital automation platform, but I agree that BPM is not necessarily front-of-mind for companies implementing these projects.
BPM needs to expand beyond its focus on core processes to encompass more processes and more universal involvement. This is being done in part by the low-code “citizen developer” products that we’re seeing from BPM vendors.
The last paper was presented by Anna-Maria Exler from Wirtschaftsuniversität Wien (where we are hosted this week) on The Use of Distance Metrics in Managing Business Process Transfer, that is, migrating a business process from one organization to another, as will happen when one company acquires another, or a centralized process is rolled out to branches of an organization. This is still fairly early research, and is looking at factors such as the acceptance of the processes by the target organization. They consider six factors and are mapping their interaction, such as more stakeholder integration having a positive impact on acceptance. They have also derived some proximity/distance factors that will impact the process transfer, such as geographic, cultural and technical distance, and visualization of dimensional proximity using a network diagram. Future research will include additional organizations to help derive measurement and weighting of the distance factors.
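The notion of combining proximity/distance factors into an overall measure of transfer difficulty can be sketched in a few lines. This is purely illustrative, assuming normalized per-factor scores and made-up weights; it is not the authors' actual model, and measurement and weighting of the factors is exactly what their future research aims to derive:

```python
# Hypothetical sketch: combine per-factor distance scores between a
# source and target organization into one weighted distance. Factor
# names and weights are illustrative assumptions, not from the paper.

FACTOR_WEIGHTS = {"geographic": 0.2, "cultural": 0.5, "technical": 0.3}

def process_transfer_distance(scores, weights=FACTOR_WEIGHTS):
    """Combine per-factor distance scores (each normalized to 0..1)
    into a single weighted distance; higher suggests a harder transfer."""
    total_weight = sum(weights.values())
    return sum(weights[f] * scores[f] for f in weights) / total_weight

# Example: organizations close geographically, but far apart
# culturally and technically.
d = process_transfer_distance(
    {"geographic": 0.1, "cultural": 0.8, "technical": 0.6}
)
print(round(d, 2))  # 0.6
```

A network-diagram visualization like the one described in the talk would plot each factor as an axis or edge rather than collapsing them into one number, but a scalar like this is the simplest way to compare candidate transfers.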
Trisotech recently announced that Bruce Silver – who writes and teaches the gold standard Method & Style books and courses on BPMN and DMN, and who has forgotten more about BPMN than most people ever learned – is joining Trisotech as a principal consultant. Congrats all around, although Bruce may regret this when he’s needed at Trisotech Montreal headquarters in January when it’s -30C.
Bruce even has his first post on the Trisotech blog, about practical DMN basics. Essential reading for getting started with DMN.
Disclosure: Trisotech is a consulting client of mine. I’m not being paid for writing this post, I just like these guys because they’re smart and do great work. You can read about my relationship with vendors here.
I listened in today on the annual awards webinar for the Workflow Management Coalition’s Business Transformation Awards. Nathaniel Palmer and Keith Swenson hosted the webinar, with assistance from Layna Fischer, and they announced that the WfMC is being disbanded: the original goals of the organization around process standards development have been achieved, and the standards are now being successfully managed by other standards bodies such as OMG.
I was a judge on some of the case management case studies for the awards, and it’s always interesting to read about how BPM, case management and related technologies are used in different scenarios. The winners of these final awards for excellence in business transformation were presented and discussed:
The conference organizers graciously provided me with a conference pass (I’m covering my own travel expenses), and invited me to give a talk at the workshop on BPM in the era of Digital Innovation and Transformation (BPMinDIT). I’ll be talking about how BPM systems are being used as the keystone for digital automation platforms, covering both technical architecture and how this contributes to business agility. My aim is to provide an industry experience perspective to complement the research papers in the workshop, and hopefully generate some interest and ideas, all in about 25 minutes!
There are a ton of interesting things to see at the conference: a doctoral consortium on Sunday, workshops on Monday, tutorials and sessions Tuesday through Thursday, then a meeting on teaching fundamentals of BPM on Friday. I’ll be there Monday through Thursday, look me up if you’re there, or watch for my blog posts about what I’m finding interesting.
Lately, I’ve been thinking about cake. Not (just) because I’m headed to Vienna, home of the incomparable Sacher Torte, nor because I’ll be celebrating my birthday while attending the BPM2019 academic research conference while there. No, I’ve been thinking about technical architectural layer cake models.
In 2014, an impossibly long time ago in computer-years, I wrote a paper about what one of the analyst firms was then calling Smart Process Applications (SPA). The idea is that a vendor would provide a SPA platform, then the vendor, customer or third parties would create applications using this platform — not necessarily using low-code tooling, but at least using an integrated set of tools layered on top of the customer’s infrastructure and core business systems. Instances of these applications — the actual SPAs — could then be deployed by semi-technical analysts who just needed to configure the SPA with the specifics of the business function. The paper that I wrote was sponsored by Kofax, but many other vendors provided (and still provide) similar functionality.
The SPA platforms included a number of integrated components to be used when creating applications: process management (BPM), content capture and management (ECM), event handling, decision management (DM), collaboration, analytics, and user experience.
The concept (or at least the name) of SPA platforms has now morphed into “digital transformation”, “digital automation” or “digital business” platforms, but the premise is the same: you buy a monolithic platform from a vendor that sits on top of your core business systems, then you build applications on top of that to deploy to your business units. The tooling offered by the platform is now more likely to include a low-code development environment, which means that the applications built on the platform may not need a separate “configure and deploy” layer above them as in the SPA diagram here. Or this same model could be used, with non-low-code applications developed in the layer above the platform, then low-code configuration and deployment of those just as in the SPA model. Due to pressure from analysts, many BPMS platforms became these all-in-one platforms under the guise of iBPMS, but some ended up with a set of tools with uneven capabilities: great functionality for their core strengths (BPM, etc.) but weaker in functionality that they had to partner to include or hastily build in order to be included in the analyst ranking.
The monolithic vendor platform model is great for a lot of businesses that are not in the business of software development, but some very large organizations (or small software companies) want to create their own platform layer out of best-of-breed components. For example, they may want to pick BPM and DM from one vendor, ECM from multiple others, collaboration and user experience from still another, plus event handling and analytics using open source tools. In the SPA diagram above, that turns the dark blue platform layer into “Build” rather than “Buy”, although the impact is much the same for the developers who are building the applications on top of the platform. This is the core of what I’m going to be presenting at CamundaCon next month in Berlin, with some ideas on how the market divides between monolithic and best-of-breed platforms, and how to make a best-of-breed approach work (since that’s the focus of this particular audience).
And yes, there will be cake, or at least some updated technical architectural layer cake models.
I’ve been spending some time recently helping a few companies think about how their corporate goals are aligned with key performance indicators (KPIs) at all levels of their organization, like this:
Top-level goals, or what keeps the corporate executives awake at night, usually fall into the following categories:
As we move down the hierarchy, different levels of business managers are also concerned with operating margin/profitability, service time, compliance, and operational scalability; you can see a pretty direct line between these KPIs and the top-level corporate goals. For example, improved profitability is likely going to improve (net) revenue, while better service time means happier customers. When we reach the level of front-line workers, their KPIs are usually based on individual performance and skills advancement.
The problem arises when those worker-level KPIs are not aligned with the corporate goals; I’ve written about this in several presentations and papers in the past, in particular about how we need to change worker metrics in more collaborative work environments so that they’re rewarded for more than just personal performance. In doing some research on this, I came across Goodhart’s Law (via the book The Tyranny of Metrics), which is basically about how people will game measurement systems to their own benefit, particularly when goals are complex and the metrics are crude. That’s so true. In other words, given the choice between maximizing a poorly-designed metric that will benefit them personally, or doing the right thing for the customer/company, people will almost always choose the former.
An organization has a “same day” SLA for incoming customer inquiries, except if the inquiry needs to be reviewed by the legal or accounting departments. Business units are measured on how well they meet the SLA, so everyone forwards all of their unfinished work to legal or accounting at the end of the day in order to meet their SLA, even if the inquiry does not require it. This decreases productivity and increases customer service time, but maximizes the departmental time-based SLA.
An HR department is measured by the number of candidates that are hired, but not on the quality of the candidates. I don’t need to explain how that goes wrong, but suffice it to say that it has a big impact on customer satisfaction as well as productivity.
Any metric that is based on individual (or departmental) performance but can’t be aligned up the hierarchy to a corporate goal is probably going to be detrimental to overall performance, or at least neutral. If you can’t show how a task is contributing to the good of the enterprise, then why are you doing it?
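That alignment test can be made mechanical: record, for each KPI, which metric it rolls up to, then walk the chain and flag any metric that never reaches a corporate goal. This is a minimal sketch with hypothetical KPI names, not a real metrics system:

```python
# Hypothetical sketch: each KPI records the metric it rolls up to.
# Walking that chain flags any worker- or department-level metric
# that never reaches a top-level corporate goal.

KPIS = {
    "net_revenue":   {"rolls_up_to": None, "corporate_goal": True},
    "profitability": {"rolls_up_to": "net_revenue", "corporate_goal": False},
    "service_time":  {"rolls_up_to": "profitability", "corporate_goal": False},
    # A departmental SLA metric that rolls up to nothing above it:
    "daily_sla_met": {"rolls_up_to": None, "corporate_goal": False},
}

def reaches_corporate_goal(kpi, kpis=KPIS):
    """Follow the roll-up chain; True if it ends at a corporate goal."""
    seen = set()  # guard against circular roll-up definitions
    while kpi is not None and kpi not in seen:
        seen.add(kpi)
        entry = kpis[kpi]
        if entry["corporate_goal"]:
            return True
        kpi = entry["rolls_up_to"]
    return False

unaligned = [k for k in KPIS if not reaches_corporate_goal(k)]
print(unaligned)  # ['daily_sla_met']
```

The dangling `daily_sla_met` metric is exactly the kind of measure that invites Goodhart-style gaming: it can be maximized (forward everything to legal at 5pm) without contributing anything to the goals above it.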
It made me think of my standard routine when I’m walking through a business operations area and want to pinpoint where the existing systems aren’t doing what the workers really need them to do: I look for the spreadsheets and email. These are the best indicator of shadow IT at work, where someone in the business area creates an application that is not sanctioned or supported by IT, usually because IT is too busy to “do it right”. Instead of accessing data from a validated source, it’s being copied to a spreadsheet, where scripts are performing calculations using business logic that was probably valid at the point it was written but hasn’t been updated since that person left the company. Multiple copies of the spreadsheet (or a link to an unprotected copy on a shared drive) are forwarded to people via email, but there’s no way to track who has it or what they’ve done with it. If the data in the source system changes, the spreadsheet and all of its copies stay the same unless manually updated.
Don’t get me wrong: I love spreadsheets. I once claimed that you could take away every other tool on my desktop and I could just reproduce it in Excel. Spreadsheets and email fill the gaps between brittle legacy systems, but they aren’t a great solution. That’s where low-code platforms fit really well: they let semi-technical business analysts (or semi-business technical analysts) create applications that can access realtime business data, assign and track tasks, and integrate other capabilities such as decision management and analytics.
I gave a keynote at bpmNEXT this year about creating your own digital automation platform using a BPMS and other technology components, which is what many large enterprises are doing. However, there are many other companies — and even departments within those large companies — for which a low-code platform fills an important gap. I’ll be doing a modified version of that presentation at this year’s CamundaCon in Berlin, and I’m putting together a bit of a chart on how to decide when to build your own platform and when to use a monolithic low-code platform for building business applications. Just don’t use spreadsheets and email.
Many people vacation in Europe in September once the holiday-making families are back home. Personally, I like to cram in a few conferences between sightseeing.
Primarily, my trip is to present a keynote at CamundaCon in Berlin on September 12-13. Last time that I attended, it was one day for Camunda open source users, followed by one day for commercial customers, the latter of which was mostly in German (Ich spreche nur Deutsch, wenn Google mir hilft: I only speak German when Google helps me). Since then, they’ve combined the programs into a two-day conference that includes keynotes and tracks that appeal across the board; lucky for me, it’s all in English. I’m speaking on the morning of the first day, but plan to stay for most of the conference to hear some updates from Camunda and their customers, and blog about the sessions. Also, I can’t miss the Thursday night BBQ!
Then I saw a tweet about DecisionCAMP being held in Bolzano the week after CamundaCon, and a few tweets later, I was signed up to attend. Although I’m not focused on decision management, it’s part of what I consult on and write about, and this is a great chance to hear about some of the new trends and best practices.
Look me up if you’re going to be at any of these conferences, or want to meet up nearby.