I attended the tutorial on Exploring Explorative Business Process Management with Max Röglinger, Thomas Grisold, Steven Gross and Katharina Stelzl, representing Universität Liechtenstein, Universität Bayreuth and Wirtschaftsuniversität Wien. This was a good follow-on from Kalle Lyytinen’s keynote with its themes of emergent processes and routines, and looked at the contrast between maximizing operational efficiency on pre-defined processes and finding completely new ways of doing things: doing things right versus doing the right thing.
They frame these as exploitative BPM — the traditional approach of maximizing operational efficiency — and explorative BPM as a driver for innovation. Anecdotal evidence aside, there is research that shows that (pre-defined) BPM activities impede radical innovation because the lack of variance in processes means that we rarely have “unusual experiences” that might inspire radical new ways of doing things. In fact, most of the BPM research and literature is on incrementally improving business processes (usually via analytics), not process innovation or even process redesign.
The division between exploitative (improvement) and explorative (innovation) activities comes from organizational science, with “organizational ambidexterity” a measure of how well an organization can balance exploitation and exploration as they manage change. This can be seen to align with people and teams with delivery skills (analyzing, planning, implementation, self-discipline) versus those with discovery skills (associating, questioning, observing, networking, experimenting), and the need for both sides to work together.
In their research, they have defined three dimensions of process design — trigger, action and value proposition — with combinations of these for problem-driven versus opportunity-driven process improvement/innovation. Explorative BPM results in reengineered (not just incrementally improved) processes or completely new processes to offer the same, enhanced or new value propositions. In general, their definition of opportunities is tied to top-level business goals, while problems are essentially operational pain points. There was a discussion around the nature of problems and opportunities, and how the application of BPM (and BPM research) is expanding to more than just classical process management but also supporting business model innovation.
Having set the stage for what explorative BPM is and why it is important, we had a group exercise to explore explorative BPM, and generate ideas for how to go about process innovation. To finish the tutorial, they presented the results of their tutorial research paper with a potential method for explorative BPM: the triple diamond model.
Future research directions for explorative BPM may include ideas from other domains/disciplines; organizational capabilities required for explorative BPM; and the tools, methods and techniques required for its implementation.
With the workshops finished yesterday, we kicked off the main BPM 2019 program with a welcome from co-organizers Jan Mendling and Stefanie Rinderle-Ma, and greetings from the steering committee chair Mathias Weske. We heard briefly about next year’s conference in Sevilla, and 2021 in Rome — I’m already looking forward to both of those — then remarks from WU Vice-Rector for Digitalization Tatjana Oppitz on the importance of BPM in transforming businesses. This is the largest year for this academic/research BPM conference, with more than 400 submissions and almost 500 attendees from 50 countries, an obvious indication of interest in this field. It’s also great to see so many BPM vendors participating as sponsors and in the industry track, since I’m an obvious proponent of two-way collaboration between academia and practitioners.
The main keynote speaker was Kalle Lyytinen of Case Western Reserve University, discussing digitalization and BPM. He showed some of the changes in business due to process improvement and design (including the externalization of processes to third parties), and the impacts of digitalization, that is, deeply embedding digital data and rules into organizational context. He went through some history of process management and BPM, with the goals focused on maximizing use of resources and optimization of processes. He also covered some of the behavioral and economic views of business routines/processes in terms of organizational responses to stimuli, and a great analogy (to paraphrase slightly) that pre-defined processes are like maps, while the performance of those processes forms the actual landscape. This results in two different types of approaches for organized activity: the computational metaphor of BPM, and the social/biological metaphor of constantly-evolving routines.
He defined digital intensity as the degree to which digitalization is required to perform a task, and considered how changes in digital intensity impact routines: in other words, how is technology changing the way we do things on a micro level? Lyytinen specializes in part in the process of designing systems (since my degree is in Systems Design Engineering, I find this fascinating), and showed some examples of chip design processes and how they changed based on the tools used.
He discussed a research study and paper that he and others had performed looking at the implementation of an SAP financial system in NASA. Their conclusion is that routines — that is, the things that people do within organizations to get their work done — adapted dynamically to adjust to the introduction of the IT-imposed processes. Digitalization initially increases variation in routines, but then the changes decrease over time, perhaps as people become accustomed to the new way of doing things and start using the digital tools. He sees automation and workflow efficiency as an element of a broader business model change, and transformation of routines as complementary to but not a substitute for business model change.
The design of business systems and models needs to consider both the processes provided by digitalization (BPM) and the interactions with those digital processes that are made up by the routines that people perform.
There was a question — or more of a comment — from Wil van der Aalst (the godfather of process mining) on whether Lyytinen’s view of BPM is based on the primarily pre-defined BPM of 20 years ago, and if process mining and more flexible process management tools are a closer match to the routines performed by people. In other words, we have analytical techniques that can identify and codify processes that are closer to the routines. In my opinion, we don’t always have the ability to easily change our processes unless they are in a BPM or similar system; Lyytinen’s SAP at NASA case study, for example, was unlikely to have very flexible processes. However, van der Aalst’s point about how we now have more flexible ways of digitally managing processes is definitely having an impact, encoding routines rather than forcing routines to change to adapt to digital processes.
There was also a good discussion on digital intensity sparked by a question from Michael Rosemann, and how although we might not all become Amazon-like in the digital transformation of our businesses, there are definitely now activities in many businesses that just can’t be done by humans. This represents a much different level of digital intensity from many of our organizational digital processes, which are just automated versions of human routines.
I’m starting off the BPM2019 academic/research conference in the workshop day attending the session on BPM in the era of digital innovation and transformation, organized by Oktay Turetken of Eindhoven University of Technology. Great to walk into the room and be greeted by a long-time acquaintance, Wasana Bandara, who is here from the Queensland University of Technology in Australia to deliver several papers in other sessions later this week.
The opening keynote in the workshop was by Maximilian Röglinger of Universität Bayreuth and Fraunhofer FIT on BPM capabilities for the digital age (also covered in a BP Trends article by the authors). He and his research team have a focus on inter-disciplinary process-related topics with a management focus, closely collaborating with industry. Their main interest in recent research is on how BPM is changing (and needs to change) as organizations move into the digital age, and they performed a Delphi study on the question of what’s next for BPM in this new environment. As with all Delphi studies, this was done in rounds: the first identifying the challenges and opportunities, and the second identifying and ranking capabilities. This was based on the six core elements of BPM capabilities — strategic alignment, governance, methods, information technology, people and culture — identified in Michael Rosemann’s research from 2007.
Röglinger made some interesting points about specific capabilities that were considered important by industry versus research participants in the study, and presented an updated BPM capability framework that included merging the methods and information technology areas. Only three of the capabilities from the original framework are identical; the other 27 are either new (13) or enhanced (14). Many of the new capabilities are data-related: not new in research or practice, of course, but newly recognized as fundamental BPM capabilities rather than something that just happens to occur alongside it.
They have a number of results with both theoretical and managerial implications, a couple of which I found particularly interesting:
Industry and academia have different perceptions about the challenges and opportunities for BPM in the digital age. This seems obvious at first glance, but highlights the disconnect between research and practice. The first time I attended this conference, I recall writing about ideas that I saw in the research papers and wondering why there weren’t more BPM vendors and other practitioners at the conference to help drive this alignment (thankfully, now there is an industry track at the conference). Of course, there will always be academic research that has no (obvious) immediate industry use, and therefore little interest until it becomes closer to practical use, but we need to have better collaboration between industry and academia to inspire research and promote adoption of new ideas.
The scope of BPM needs to expand to other disciplines and technologies, and not be completely focused on process-centric technologies. There’s a blockchain forum here at BPM 2019, and in industry we’re seeing the inclusion of many other technologies in what have traditionally been BPM suites. I’m giving a short presentation later in the workshop on BPM as the keystone of digital/business automation platforms, in which I discuss some of the practical architectural considerations for these broader capabilities.
Following the keynote, we had a paper presentation from Ralf Laue of Westsächsische Hochschule Zwickau on The Power of the Ideal Final Result for Identifying Process Optimization Potential. He had some interesting comments on customer journey modeling and other techniques based on an existing process, in that it typically only results in incremental improvement, not innovative results (which is similar to my views on Lean Six Sigma for process improvement). Based on ideas from TRIZ methodology for pattern-based design, Laue stressed that it’s necessary to “not stop thinking too early” in the design process. Applied to customer journeys, this means including not only customer touch-points, but the “non-touch-points”, that is, other things that the customer does that are not (currently) involved in their interaction with an organization. Modeling the entire customer process, rather than just their touch-points with the organization, allows us to look at process optimization that is centered on the customer.
Next was a paper on an empirical analysis to understand the need for new perspectives on BPM in the digital age, presented by Christian Janiesch from Technische Universität Dresden. They looked at a number of questions about BPM, such as whether BPM is still the driving force in digital transformation projects, and interviewed companies that had deployed BPM successfully — possibly a perspective that would tend to support the central positioning of BPM in digital transformation. They have found that the emergence of digital technology has led to organizations having less overall process control, although I would posit that the amount of process control is the same, just that some previously manual processes (uncontrolled but also unobserved) are now managed with digital tools that don’t use explicit process control. They found a need for a funded BPM CoE for coordinating process improvement activities, and for the inclusion of more than just the automation/management of core processes. Two interesting conclusions:
At present, BPM isn’t seen as the driving force for successful digital transformation initiatives. My presentation after this shows how a BPM engine is a keystone for a digital automation platform, but I agree that BPM is not necessarily front-of-mind for companies implementing these projects.
BPM needs to expand its focus on core processes to encompass more processes and a more universal involvement. This is being done in part by the low-code “citizen developer” products that we’re seeing from BPM vendors.
The last paper was presented by Anna-Maria Exler from Wirtschaftsuniversität Wien (where we are hosted this week) on The Use of Distance Metrics in Managing Business Process Transfer, that is, migrating a business process from one organization to another, as will happen when one company acquires another, or a centralized process is rolled out to branches of an organization. This is still fairly early research, and is looking at factors such as the acceptance of the processes by the target organization. They consider six factors and are mapping their interactions, such as more stakeholder integration having a positive impact on acceptance. They have also derived some proximity/distance factors that will impact the process transfer, such as geographic, cultural and technical distance, and have visualized dimensional proximity using a network diagram. Future research will include additional organizations to help derive measurement and weighting of the distance factors.
Trisotech recently announced that Bruce Silver – who writes and teaches the gold standard Method & Style books and courses on BPMN and DMN, and who has forgotten more about BPMN than most people ever learned – is joining Trisotech as a principal consultant. Congrats all around, although Bruce may regret this when he’s needed at Trisotech Montreal headquarters in January when it’s -30C.
Bruce even has his first post on the Trisotech blog, about practical DMN basics. Essential reading for getting started with DMN.
Disclosure: Trisotech is a consulting client of mine. I’m not being paid for writing this post, I just like these guys because they’re smart and do great work. You can read about my relationship with vendors here.
I listened in today on the annual awards webinar for the Workflow Management Coalition’s Business Transformation Awards. Nathaniel Palmer and Keith Swenson hosted the webinar, with assistance from Layna Fischer, and they announced that the WfMC is being disbanded: the original goals of the organization around process standards development have been achieved, and the standards are now being successfully managed by other standards bodies such as OMG.
I was a judge on some of the case management case studies for the awards, and it’s always interesting to read about how BPM, case management and related technologies are used in different scenarios. The winners of these final awards for excellence in business transformation were presented and discussed:
The conference organizers graciously provided me with a conference pass (I’m covering my own travel expenses), and invited me to give a talk at the workshop on BPM in the era of Digital Innovation and Transformation (BPMinDIT). I’ll be talking about how BPM systems are being used as the keystone for digital automation platforms, covering both technical architecture and how this contributes to business agility. My aim is to provide an industry experience perspective to complement the research papers in the workshop, and hopefully generate some interest and ideas, all in about 25 minutes!
There are a ton of interesting things to see at the conference: a doctoral consortium on Sunday, workshops on Monday, tutorials and sessions Tuesday through Thursday, then a meeting on teaching fundamentals of BPM on Friday. I’ll be there Monday through Thursday, look me up if you’re there, or watch for my blog posts about what I’m finding interesting.
Lately, I’ve been thinking about cake. Not (just) because I’m headed to Vienna, home of the incomparable Sacher Torte, nor because I’ll be celebrating my birthday while attending the BPM2019 academic research conference while there. No, I’ve been thinking about technical architectural layer cake models.
In 2014, an impossibly long time ago in computer-years, I wrote a paper about what one of the analyst firms was then calling Smart Process Applications (SPA). The idea is that a vendor would provide a SPA platform, then the vendor, customer or third parties would create applications using this platform — not necessarily using low-code tooling, but at least using an integrated set of tools layered on top of the customer’s infrastructure and core business systems. Instances of these applications — the actual SPAs — could then be deployed by semi-technical analysts who just needed to configure the SPA with the specifics of the business function. The paper that I wrote was sponsored by Kofax, but many other vendors provided (and still provide) similar functionality.
The SPA platforms included a number of integrated components to be used when creating applications: process management (BPM), content capture and management (ECM), event handling, decision management (DM), collaboration, analytics, and user experience.
The concept (or at least the name) of SPA platforms has now morphed into “digital transformation”, “digital automation” or “digital business” platforms, but the premise is the same: you buy a monolithic platform from a vendor that sits on top of your core business systems, then you build applications on top of that to deploy to your business units. The tooling offered by the platform is now more likely to include a low-code development environment, which means that the applications built on the platform may not need a separate “configure and deploy” layer above them as in the SPA diagram here. Or this same model could be used, with non-low-code applications developed in the layer above the platform, then low-code configuration and deployment of those just as in the SPA model. Due to pressure from analysts, many BPMS platforms became these all-in-one platforms under the guise of iBPMS, but some ended up with a set of tools with uneven capabilities: great functionality for their core strengths (BPM, etc.) but weaker functionality in areas that they had to partner to include or hastily build in order to be included in the analyst rankings.
The monolithic vendor platform model is great for a lot of businesses that are not in the business of software development, but some very large organizations (or small software companies) want to create their own platform layer out of best-of-breed components. For example, they may want to pick BPM and DM from one vendor, ECM from multiple others, collaboration and user experience from still another, plus event handling and analytics using open source tools. In the SPA diagram above, that turns the dark blue platform layer into “Build” rather than “Buy”, although the impact is much the same for the developers who are building the applications on top of the platform. This is the core of what I’m going to be presenting at CamundaCon next month in Berlin, with some ideas on how the market divides between monolithic and best-of-breed platforms, and how to make a best-of-breed approach work (since that’s the focus of this particular audience).
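To make the best-of-breed idea concrete, here’s a minimal sketch of what that “build your own platform layer” might look like in code. Everything here is hypothetical — the class names, the `submit_claim` operation and the `claims-intake` process are illustrative, not any vendor’s actual API — but it shows the key design choice: applications are written against the platform facade, so individual components can be swapped without touching the applications above.

```python
from abc import ABC, abstractmethod

class ProcessEngine(ABC):
    """Abstract BPM capability; could be backed by any vendor's engine."""
    @abstractmethod
    def start_process(self, definition_id: str, variables: dict) -> str:
        """Start a process instance; return its instance id."""

class ContentRepository(ABC):
    """Abstract ECM capability; could be backed by a different vendor."""
    @abstractmethod
    def store(self, doc_id: str, content: bytes) -> None:
        """Persist a document under the given id."""

class AutomationPlatform:
    """The 'build' platform layer: composes best-of-breed components
    behind one facade that business applications are written against."""
    def __init__(self, bpm: ProcessEngine, ecm: ContentRepository):
        self.bpm = bpm
        self.ecm = ecm

    def submit_claim(self, doc_id: str, content: bytes) -> str:
        # One application-level operation spanning two components:
        # store the incoming document, then kick off the process.
        self.ecm.store(doc_id, content)
        return self.bpm.start_process("claims-intake", {"documentId": doc_id})

# Trivial in-memory stand-ins for the real vendor adapters:
class InMemoryEngine(ProcessEngine):
    def start_process(self, definition_id: str, variables: dict) -> str:
        return f"{definition_id}-1"

class InMemoryRepo(ContentRepository):
    def __init__(self):
        self.docs: dict[str, bytes] = {}
    def store(self, doc_id: str, content: bytes) -> None:
        self.docs[doc_id] = content
```

Swapping the BPM or ECM vendor then means writing a new adapter that implements the abstract interface, with no change to the application layer — which is exactly why the “Build” platform layer has much the same impact on application developers as the “Buy” one.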
And yes, there will be cake, or at least some updated technical architectural layer cake models.
Many people vacation in Europe in September once the holiday-making families are back home. Personally, I like to cram in a few conferences between sightseeing.
Primarily, my trip is to present a keynote at CamundaCon in Berlin on September 12-13. Last time that I attended, it was one day for Camunda open source users, followed by one day for commercial customers, the latter of which was mostly in German (I only speak German when Google helps me). Since then, they’ve combined the programs into a two-day conference that includes keynotes and tracks that appeal across the board; lucky for me, it’s all in English. I’m speaking on the morning of the first day, but plan to stay for most of the conference to hear some updates from Camunda and their customers, and blog about the sessions. Also, I can’t miss the Thursday night BBQ!
Then I saw a tweet about DecisionCAMP being held in Bolzano the week after CamundaCon, and a few tweets later, I was signed up to attend. Although I’m not focused on decision management, it’s part of what I consult on and write about, and this is a great chance to hear about some of the new trends and best practices.
Look me up if you’re going to be at any of these three conferences, or want to meet up nearby.
We started day 2 of OpenText Enterprise World with a technology keynote by Muhi Majzoub, EVP of Engineering. He opened with a list of their major releases over the last year. He highlighted the upcoming shift to cloud-first containerized deployments of the next generation of their Release 16 that we heard about in Mark Barrenechea’s keynote yesterday, and described the new applications that they have created on the OT2 platform.
We heard about and saw a demo of their Core for Federated Compliance, which allows for federated records and retention management across CMS Core, Content Suite and Documentum repositories, with future potential to connect to other (including non-OpenText) repositories. I’m still pondering the question of when they might force customers to migrate off some of the older platforms, but in the meantime, the content compliance and disposition can be managed in a consolidated manner.
Next was a demo of Documentum D2 integrated with SAP — this already existed for their other content products but this was a direct request from customers — allowing content imported into D2 in support of transactions such as purchase orders to be viewed as related documents by an SAP user from within a Smart View. They have a strong partnership with SAP, providing enterprise-scale content management as a service on the SAP cloud, integrated with SAP S/4HANA and other applications. They are providing content management as OT2-based microservices, allowing content to be integrated anywhere in the SAP product stack.
AppWorks also made an appearance: this is OpenText’s low-code application development platform that also includes their process management capabilities. They have new interfaces for developers and users, including better mobile applications. No demo, however; given that I missed my pre-conference briefing, I’ll have to wait until later today for that.
Majzoub walked through the updates of many of the other products in their portfolio: EnCase, customer experience management, AI, analytics, eDocs, Business Network and more. They have such a vast portfolio that there are probably few analysts or customers here that are interested in all of them, but there are many customers that use multiple OpenText products in concert.
He finished up with more on OT2, positioning it as a platform and repository of services for building applications in any of their product areas. These services can be consumed by any application development environment, whether their AppWorks low-code platform or more technical development tools such as Java. An interesting point made in yesterday’s keynote challenges the idea of non-technical users as “citizen developers”: they see low-code as something that is used by [semi-]technical developers to build applications. The reality of low-code may finally be emerging.
They are featuring six new cloud-based applications built on OT2 that are available to customers now: Core for Capital Projects, Core for Supplier Exchange, Core Enhances Integration with CSP, Core Capture, Core for SAP SuccessFactors, and Core Experience Insights. We saw a demo that included the Capital Projects and Supplier Exchange applications, where information was shared and integrated between a project manager on a project and a supplier providing documentation on proposed components. The Capital Projects application includes analytics dashboards to track progress on deliverables and issues.
Good start to the day, although I’m looking forward to more of a technical drill-down on AppWorks and OT2.
OpenText is holding their global Enterprise World back in Toronto for the third year in a row (meaning that they’ll probably move on to another city for next year — please not Vegas) and I’m here for a couple of days for briefings with the product teams and to sit in on some of the sessions.
I attended a session earlier on connecting content and process that was mostly market research presented by analysts John Mancini and Connie Moore — some interesting points from both of them — before going to the opening keynote with CEO/CTO Mark Barrenechea and a few guests including Sir Tim Berners-Lee.
Barrenechea started with some information about where OpenText is now, including their well-ranked positions in analyst rankings for content services platforms (Content Services), supply chain commerce networks (Business Network) and digital process automation (AppWorks). He believes that we’re “beyond digital”, with a focus on information rather than automation. He announced cloud-first versions of their products coming in April 2020, although some products will also be available on-premises. Their OT2 Cloud Platform will be sold on a service model; I’m not sure if it’s a full microservice implementation, but it sounds like it’s at least moving in that direction. They’ve also announced a new partnership with Google, with Google Cloud being their preferred platform for customers and the integration of Google services (such as machine learning) into OpenText EIM; this is on a similar scale to what we’ve seen between Alfresco and Amazon AWS.
The keynote finished with a talk by Sir Tim Berners-Lee, inventor of the World Wide Web, on how the web started, how it’s now used and abused, and what we all can do to make it better.