Industry forum presentations at @BPMConf – process improvement benefits realization, IoT process mining, quality management, and deep learning of shipping container movement

To finish up my time at the academic research BPM 2019 conference, I attended one of the industry forum sessions, which highlight initiatives that bring together academic research and practical applications in industry. These are shorter presentations than the research sessions, although they still have formal published papers documenting their research and findings; check those proceedings for the full author list for each paper and the details of the research.

Process Improvement Benefits Realization: Insights from an Australian University. Wasana Bandara, QUT

The first presentation was about process improvement at the author’s university. They took on an enterprise business process improvement project in 2017, and have developed a Business Process Improvement Office (BPIO — aka centre of excellence). They wanted measurable benefits, so they created a benefits realization framework that runs in parallel with their process improvement lifecycle, so that the identification and measurement of benefits is built in from the beginning of any project.

Alignment of BR lifecycle with BPI lifecycle
(From the industry forum research paper)

They found that identifying and aligning the ideas of benefits realization early in a project created awareness and increased receptiveness to unexpected benefits. Good discussion followed on the details of their framework and how it impacts the business areas as they move their manual processes to more automation and management.

Enabling Process Mining in Aircraft Manufacture: Extracting Event Logs and Discovering Processes from Complex Data. Maria Teresa Gómez López, Universidad de Sevilla

The second presentation was about using process mining to discover the processes used in aircraft manufacture. The data underlying the process mining is generated by IoT manufacturing devices, and hence has much higher volumes than a usual business process mining application, requiring preprocessing to aggregate the raw log data into events suitable for process mining. They also wanted to be able to integrate knowledge from engineers involved in the manufacturing process to capture best practices and help extract the right data from the raw data logs.
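To make that preprocessing step a bit more concrete, here’s a minimal sketch of my own (not from the paper) of how raw IoT readings might be collapsed into a case/activity/timestamp event log; the file name, column names and aggregation rule are my assumptions, and the output naming just follows the XES-style convention used by many process mining tools.

```python
import pandas as pd

# Raw IoT log: one row per sensor reading (hypothetical file and column names)
raw = pd.read_csv("station_sensor_log.csv", parse_dates=["reading_time"])

# Collapse bursts of readings for the same aircraft at the same station into a
# single "aircraft entered station" event, keeping the first reading of each
# (aircraft, station) block as the event timestamp.
raw = raw.sort_values(["aircraft_id", "reading_time"])
block_change = raw["station"] != raw.groupby("aircraft_id")["station"].shift()
events = raw[block_change].rename(columns={
    "aircraft_id": "case:concept:name",  # case id for process mining
    "station": "concept:name",           # activity name
    "reading_time": "time:timestamp",    # event timestamp
})[["case:concept:name", "concept:name", "time:timestamp"]]

events.to_csv("event_log.csv", index=False)
```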

Result of process discovery for three test cases
(From the industry forum research paper)

They had some issues with analyzing the log data, such as incorrect data (an aircraft recorded in two stations at the same time, or moving backwards through the assembly process) and incomplete or insufficient information. Future research and work on this will include potential integration with big data architectures to handle the volume of raw log data, and finding new ways of analyzing the log data to have cleaner input to the process discovery algorithms.
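The paper describes these anomalies in more detail; purely as an illustration (my own, with an assumed station layout and the same hypothetical column names as above), checks like the following could flag the two kinds of inconsistency they mention before the log is fed to process discovery:

```python
import pandas as pd

# Hypothetical sanity checks on the derived event log before process discovery;
# the station names and their order along the assembly line are assumptions.
events = pd.read_csv("event_log.csv", parse_dates=["time:timestamp"])
station_order = {"station_1": 1, "station_2": 2, "station_3": 3}

for aircraft, trace in events.groupby("case:concept:name"):
    trace = trace.sort_values("time:timestamp")
    # An aircraft recorded at two different stations at the same time
    stations_per_timestamp = trace.groupby("time:timestamp")["concept:name"].nunique()
    if (stations_per_timestamp > 1).any():
        print(f"{aircraft}: recorded at multiple stations at the same time")
    # An aircraft apparently moving backwards through the line
    seq = trace["concept:name"].map(station_order)
    if (seq.diff().dropna() < 0).any():
        print(f"{aircraft}: appears to move backwards through the assembly line")
```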

The adoption of globally unified process standards via a multilingual management system: The case of Marabu, a worldwide manufacturer of printing inks and creative colours of the highest quality. Klaus Cee, Marabu, and Andreas Schachermeier, ConSense

The next presentation was about how Marabu, a printing ink company, standardized and aligned their multinational subsidiaries’ business processes with those of the parent company. This was not a research project per se (although ConSense is a consulting company spun off from a university project several years ago) but a process and knowledge management implementation project. They had some particular challenges with developing uniform multilingual processes that could have local variants, integrated with the needs of quality, environmental and occupational safety management.

Data-driven Deep Learning for Proactive Terminal Process Management. Andreas Metzger, University of Duisburg-Essen

The final paper in this industry forum session was on the digitalization of the Duisburg intermodal container shipping port, a large inland port dealing with about 10,000 containers arriving and departing by rail and truck each month. Data streams from terminal equipment captured information about the movement of containers, cranes and trains; their goal was to predict, based on the current state, whether a given train would be loaded and able to depart on time, and to proactively dispatch resources to speed up loading when necessary. This sounds like a straightforward problem, but predictions made too early on partial data can be erroneous: waiting for more data to be collected leads to more accurate predictions, but earlier intervention can resolve the problem faster.

They applied multiple parallel deep learning models (recurrent neural networks) to improve the predictions, dynamically trading off between accuracy and earliness of detection. They were able to increase the number of trains leaving on time by 4.7%, which is a great result when you consider the cost of a delayed train.

Terminal Productivity Cockpit excerpt
(From the industry forum research paper)

They used RNNs as their deep learning models because they can handle arbitrary length process instances without sequence or trace encoding, and can perform predictions at any checkpoint with a single model; there’s a long training time and it’s compute-intensive, but that pays off in this scenario. Lessons that they learned included the fact that the deep learning ensembles worked well out of the box, but also that the quality of the data used as input is a key concern for accuracy: if you’re going to spend time working on something, work on data cleansing before you work on the deep learning algorithms.
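For readers who want a feel for what such a model looks like, here’s a minimal sketch of my own, not the architecture from the paper: a single LSTM that scores the event prefix seen so far, plus a hypothetical ensemble/voting step. The feature dimensions, ensemble size and confidence threshold are all made up for illustration.

```python
import torch
import torch.nn as nn

class DelayPredictor(nn.Module):
    """Score P(train departs late) from the event prefix seen so far; a single
    model handles any prefix length, so it can be queried at every checkpoint."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.rnn = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, prefix: torch.Tensor) -> torch.Tensor:
        # prefix: (batch, events_so_far, n_features)
        _, (h, _) = self.rnn(prefix)
        return torch.sigmoid(self.head(h[-1]))  # (batch, 1) probability of delay

# Hypothetical ensemble queried as terminal events stream in; in practice each
# member would be trained (this sketch uses untrained models and dummy data).
models = [DelayPredictor(n_features=12) for _ in range(5)]
prefix = torch.randn(1, 7, 12)                     # 7 events observed so far
votes = [m(prefix).item() > 0.8 for m in models]   # assumed confidence threshold
if sum(votes) >= 3:                                # assumed majority rule
    print("dispatch extra resources to speed up loading")
```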

The Zaha Hadid-designed Library and Learning Center at WU Wien, our home for the main conference

The only thing left after this is a closing panel, so this is the last of my coverage from BPM 2019. I haven’t attended this conference in many years, and I’m glad to be back. Looking forward to next year in Seville!

It’s been great to catch up with a lot of people who I haven’t seen since the last time that I attended, plus a few who I see more often. WU Wien has been a welcoming host, with a campus full of extraordinary modern architecture, a well-organized conference and great evening social events.

Research track on management at @BPMConf – BPM skills, incentives and hackathons

We’re into the second day of the main BPM 2019 academic conference, and I attended the research session on management, made up of three papers, each presented by one of the authors with time for questions and discussion. Note that I’ve only listed the presenting author in each case, but you can find all of the contributors on the detailed program and read the full papers in the proceedings.

Regulatory Instability, BPM Technology, and BPM Skill Configuration. Michael zur Muehlen, Stevens Institute of Technology

Organizations have a choice on how they configure their BPM functionality and skills — this is something that I am asked about all the time in my consulting practice — and this research looked at BPM-related roles and responsibilities within three companies across different industries to see how regulatory environments can impact this. The work was based on LinkedIn data for Walmart (retail), Pfizer (pharmaceutical) and Goldman Sachs (financial), with analysis on how the regulatory environments impact the ostensive and performative aspects of processes: in other words, whether compliance issues may be due to an ostensive misfit (design of processes) or a performative misfit (exceptions and workarounds to designed processes). Some industries have a huge number of regulation/rule changes each year: more than 2,000 per year for both financial and pharma, while trade regulations (impacting retail) have few rule changes and therefore may have more stable processes. They also looked at job ads on Monster.com for a broader selection of pharma, retail and financial services companies to generate a skills taxonomy for framing the analysis.

Graph of BPM roles and skills at Walmart
(From research paper)

They did some interesting text analysis on the job ads and the resumes, then mapped the roles and skills for the three companies. They found that some organizations have a bilateral configuration, where manager and analyst roles are involved in both operations and change, while others have a trilateral configuration, where the change functions are performed by managers and project managers but not analysts. The diagram above shows the graph for Walmart; check the proceedings for the other two companies’ graphs and the full details of the analysis. They’re expanding their research to include other organizations to see what other factors (besides regulatory environment) may be impacting how roles and responsibilities are distributed, as well as studies of how organizations change over a 20-year time period.
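As a rough illustration of the kind of role/skill graph that comes out of this sort of text analysis, here’s a toy sketch of my own: the job-ad snippets and skill list are invented, and the paper’s actual method is far more sophisticated than keyword matching.

```python
import re
import networkx as nx

# Toy job-ad snippets (invented); the study mined real ads and profiles.
ads = {
    "Process Analyst": "Requires BPMN modelling, process mining and SQL.",
    "Process Manager": "Owns process change, stakeholder management and KPIs.",
    "Project Manager": "Leads change projects, stakeholder management, budgeting.",
}
skills = ["BPMN", "process mining", "SQL", "process change",
          "stakeholder management", "KPIs", "budgeting", "change projects"]

# Build a bipartite role/skill graph by simple keyword matching
G = nx.Graph()
for role, text in ads.items():
    G.add_node(role, kind="role")
    for skill in skills:
        if re.search(re.escape(skill), text, flags=re.IGNORECASE):
            G.add_node(skill, kind="skill")
            G.add_edge(role, skill)

# Roles that share skills become connected through common skill nodes, which is
# roughly what the role/skill graphs in the paper visualize.
print(nx.shortest_path(G, "Process Manager", "Project Manager"))
```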

Understanding the Alignment of Employee Appraisals and Rewards with Business Processes. Aygun Shafagatova, Ghent University

This research looks at the crossover between HR research on employee rewards and BPM research on people/culture aspects and BP maturity. They analyzed case studies and conducted interviews in a number of companies, then performed a multi-dimensional analysis considering processes, employee roles/levels, BP maturity, reward type, and performance appraisal. They identified patterns of alignment for processes, employee roles and BP maturity, then mapped this to the performance appraisal dimension and the type of rewards (financial or non-financial) that are used for process-related performance and competence.

They identified a number of critical success factors, including rewarding the right performance and behaviors, and recommendations on evaluation and reward techniques based on BPM maturity of an organization.

Recommendations for aligning appraisals and rewards to lower and higher levels of BPM maturity
(From the research paper)

I find this a very interesting topic, and have written papers and presentations on how employee incentives need to change in collaborative process environments. I also see a lot of issues related to this in my consulting practice, where employee appraisals and rewards are not aligned with newly-deployed business processes in the face of more automation and greater need for collaboration in process knowledge work.

What the Hack? – Towards a Taxonomy of Hackathons. Christoph Kollwitz, Chemnitz University of Technology

This research looked at the question of how hackathons can be used to drive/support innovation processes within organizations. The nature of hackathons combines different open innovation tools, including contests, short-period intensive workshops, communities and teamwork, and participation by both the supply and demand side of the solution space. They looked at hackathons from an organizational perspective, that is, how organizations design and execute hackathons, not from the perspective of the participants.

Based on literature reviews, they developed a taxonomy of hackathons based on strategic design decisions (SDD in the diagram) and operational design decisions (ODD). They want to conduct future research with actual case studies and interviews, and study the relationships between the individual dimensions.

Taxonomy of hackathons
(From research paper)

I recently attended a session at a vendor conference where we heard about an internal hackathon at financial services firm Capital One, focused on their use of the vendor’s BPM tool. The presenter didn’t (or couldn’t) share the details of the solutions created, but focused on the organizational aspects — very similar to the topic under study in Kollwitz’s research, and possibly a good target for a case study in their future research.

Research track on process improvement at @BPMConf

The first research session that I attended at the academic BPM 2019 conference included three papers, each presented by one of the authors with time for questions and discussion. Note that I’ve only listed the presenting author in each case, but you can find all of the contributors on the detailed program and read the full papers in the proceedings.

Trust-Aware Process Design. Michael Rosemann, QUT

The research context for trust concepts in processes comes from a variety of disciplines, with trust being defined as confidence in a relationship where some uncertainty exists. He presented a four-stage model for trust-aware process design:

  • Identify moments of trust, including the points where it materializes, the requirements/concerns, and the stakeholders
  • Reduce uncertainty, including operational, behavioral and perceived uncertainty
  • Reduce vulnerability, which is the cost to the process consumer in case the process does not perform as expected
  • Build confidence through additional information and sources such as trust in an expert

He discussed different types of trust that can be used to increase confidence at points in a process, and summarized with a meta model that shows the relationship between uncertainty, trustworthiness and confidence. He finished with some of their potential future research, such as a taxonomy for trust.

Meta model for trust-aware process design
(From the research paper)

Design Patterns for Business Process Individualization. Bastian Wurm, WU Wien

There is an increased demand for individualized products and services, but this creates complexity and the need for process variants within a company: a balance between individualization to maximize revenue and standardization to minimize cost. There are different stages of individualization, from mass production (one size fits all) to mass customization (e.g., configure to order) to mass personalization (one of a kind production).

They started with a standard business process meta model, and noted that a lot of process improvement was based on reducing variants, and therefore pushing towards the mass production end of the spectrum. They generated design patterns for business process individualization:

  • Sequence and activity individualization, where mass personalization has activities and processes unique to each customer.
  • Flow unit individualization, where the sequence (process) may be standardized, but what is delivered to the customer at a specific point is unique.
  • Resource individualization, where the process is standardized but the resource assigned to complete a task is the most suitable match to the customer’s needs.
  • Data individualization, where the data requested from/presented to the customer, and the decisions that may be based on it, are unique to the customer and context.

Their future research will include collecting primary data to test their models and hypotheses, and they’re interested in experiences or ideas from others on how this will impact BPM.

Design patterns for business process individualization
(From the research paper)

Business Process Improvement Activities: Differences in Organizational Size, Culture and Resources. Iris Beerepoot, Utrecht University

This research was based on studying the daily work of healthcare professionals in five hospitals, including human activities and how they worked with (or around) the healthcare information systems. Any time that a user had to do a workaround on the system, this was considered a deviation and examined to see if there was a poorly-designed process or system, if the individual was overstepping their authority, if there was an organizational context that required a deviation, or if there was a unique situation that required a one-time solution.

They considered the context — size, culture (flat or hierarchical organizational structure) and resources — within the five different hospitals; then they looked at snapshots of the workarounds as they occurred within specific organizations and how this impacted business process improvement activities. For example, the more hierarchical the organization, the greater the importance of management commitment and vision; in less hierarchical organizations it was more important to adhere to the current culture. Larger and smaller organizations had demonstrably different ways of addressing process improvement.

Together with the studies and focus groups, they identified a fourth contextual factor (in addition to size, culture and resources): the maturity of an organization. I assume there is some degree of correlation between some of these factors, for example, a larger organization may be more likely to have a more hierarchical culture and greater resources.

Good audience discussion at the end about the nature of workarounds — intentional/unintentional versus positive/negative — and the potential inclusion of endogenous factors in addition to the exogenous ones used in this study. As with all of the research papers, there are plenty of future research directions possible.

Day 1 @BPMConf tutorial on explorative BPM

I attended the tutorial on Exploring Explorative Business Process Management with Max Röglinger, Thomas Grisold, Steven Gross and Katharina Stelzl, representing Universität Liechtenstein, Universität Bayreuth and Wirtschaftsuniversität Wien. This was a good follow-on from Kalle Lyytinen’s keynote with its themes of emerging process routines, and looked at the contrast between maximizing operational efficiency on pre-defined processes, and finding completely new ways of doing things: doing things right versus doing the right thing.

They frame these as exploitative BPM — the traditional approach of maximizing operational efficiency — and explorative BPM as a driver for innovation. Anecdotal evidence aside, there is research that shows that (pre-defined) BPM activities impede radical innovation because the lack of variance in processes means that we rarely have “unusual experiences” that might inspire radical new ways of doing things. In fact, most of the BPM research and literature is on incrementally improving business processes (usually via analytics), not process innovation or even process redesign.

The division between exploitative (improvement) and explorative (innovation) activities comes from organizational science, with “organizational ambidexterity” a measure of how well an organization can balance exploitation and exploration as they manage change. This can be seen to align with people and teams with delivery skills (analyzing, planning, implementation, self-discipline) versus those with discovery skills (associating, questioning, observing, networking, experimenting), and the need for both sides to work together.

In their research, they have defined three dimensions of process design — trigger, action and value proposition — with combinations of these for problem-driven versus opportunity-driven process improvement/innovation. Explorative BPM results in reengineered (not just incrementally improved) processes or completely new processes to offer the same, enhanced or new value propositions. In general, their definition of opportunities is tied to top-level business goals, while problems are essentially operational pain points. There was a discussion around the nature of problems and opportunities, and how the application of BPM (and BPM research) is expanding to more than just classical process management but also supporting business model innovation.

Three dimensions of BPM
(From the tutorial research paper)

Having set the stage for what explorative BPM is and why it is important, we had a group exercise to explore explorative BPM, and generate ideas for how to go about process innovation. To finish the tutorial, they presented the results of their tutorial research paper with a potential method for explorative BPM: the triple diamond model.

Triple Diamond Model for explorative BPM
(From the tutorial research paper)

Future research directions for explorative BPM may include ideas from other domains/disciplines; organizational capabilities required for explorative BPM; and the tools, methods and techniques required for its implementation.

Day 1 @BPMConf opening keynote: Kalle Lyytinen on the role of BPM in design and business model innovation

With the workshops finished yesterday, we kicked off the main BPM 2019 program with a welcome from co-organizers Jan Mendling and Stefanie Rinderle-Ma, and greetings from the steering committee chair Mathias Weske. We heard briefly about next year’s conference in Sevilla, and 2021 in Rome — I’m already looking forward to both of those — then remarks from WU Rectorate member (and Vice-Rector for Digitalization) Tatjana Oppitz on the importance of BPM in transforming businesses. This is the largest year yet for this academic/research BPM conference, with more than 400 submissions and almost 500 attendees from 50 countries, an obvious indication of interest in this field. Also great to see so many BPM vendors participating as sponsors and in the industry track, since I’m an obvious proponent of two-way collaboration between academia and practitioners.

Kalle Lyytinen at BPM 2019 keynote

The main keynote speaker was Kalle Lyytinen of Case Western Reserve University, discussing digitalization and BPM. He showed some of the changes in business due to process improvement and design (including the externalization of processes to third parties), and the impacts of digitalization, that is, deeply embedding digital data and rules into organizational context. He went through some history of process management and BPM, with the goals focused on maximizing use of resources and optimization of processes. He also covered some of the behavioral and economic views of business routines/processes in terms of organizational responses to stimuli, and a great analogy (to paraphrase slightly) that pre-defined processes are like maps, while the performance of those processes forms the actual landscape. This results in two different types of approaches for organized activity: the computational metaphor of BPM, and the social/biological metaphor of constantly-evolving routines.

Lyytinen’s research conclusions regarding the impact of digital intensity

He defined digital intensity as the degree to which digitalization is required to perform a task, and considered how changes in digital intensity impact routines: in other words, how is technology changing the way we do things on a micro level? Lyytinen specializes in part in the process of designing systems (since my degree is in Systems Design Engineering, I find this fascinating), and showed some examples of chip design processes and how they changed based on the tools used.

He discussed a research study and paper that he and others had performed looking at the implementation of an SAP financial system in NASA. Their conclusion is that routines — that is, the things that people do within organizations to get their work done — adapted dynamically to adjust to the introduction of the IT-imposed processes. Digitalization initially increases variation in routines, but then the changes decrease over time, perhaps as people become accustomed to the new way of doing things and start using the digital tools. He sees automation and workflow efficiency as an element of a broader business model change, and transformation of routines as complementary to but not a substitute for business model change.

The design of business systems and models needs to consider both the processes provided by digitalization (BPM) and the interactions with those digital processes that are made up by the routines that people perform.

There was a question — or more of a comment — from Wil van der Aalst (the godfather of process mining) on whether Lyytinen’s view of BPM is based on the primarily pre-defined BPM of 20 years ago, and if process mining and more flexible process management tools are a closer match to the routines performed by people. In other words, we have analytical techniques that can then identify and codify processes that are closer to the routines. In my opinion, we don’t always have the ability to easily change our processes unless they are in a BPM or similar system; Lyytinen’s SAP at NASA case study, for example, was very unlikely to have very flexible processes. However, van der Aalst’s point about how we now have more flexible ways of digitally managing processes is definitely having an impact in encoding routines rather than forcing the change of routines to adapt to digital processes.

There was also a good discussion on digital intensity sparked by a question from Michael Rosemann, and how although we might not all become Amazon-like in the digital transformation of our businesses, there are definitely now activities in many businesses that just can’t be done by humans. This represents a much different level of digital intensity from many of our organizational digital processes, which are just automated versions of human routines.

Workshop at @BPMConf on BPM in the era of Digital Innovation and Transformation

I’m starting off the BPM2019 academic/research conference in the workshop day attending the session on BPM in the era of digital innovation and transformation, organized by Oktay Turetken of Eindhoven University of Technology. Great to walk into the room and be greeted by a long-time acquaintance, Wasana Bandara, who is here from Queensland University of Technology in Australia to deliver several papers in other sessions later this week.

The opening keynote in the workshop was by Maximilian Röglinger of Universität Bayreuth and Fraunhofer FIT, on BPM capabilities for the digital age (also covered in a BP Trends article by the authors). He and his research team have a focus on inter-disciplinary process-related topics with a management focus, closely collaborating with industry. Their main interest in recent research is on how BPM is changing (and needs to change) as organizations move into the digital age, and they performed a Delphi study on the question of what’s next for BPM in this new environment. As with all Delphi studies, this was done in rounds: first identifying the challenges and opportunities, then a second phase identifying and ranking capabilities. This was based on the six core elements of BPM capabilities — strategic alignment, governance, methods, information technology, people and culture — identified in Michael Rosemann’s research from 2007.

Updated BPM capability framework based on Delphi study. See http://digital-bpm.com/bpm-capability-framework/

Röglinger made some interesting points about specific capabilities that were considered important by industry versus research participants in the study, and presented an updated BPM capability framework that included merging the methods and information technology areas. Only three of the capabilities from the original framework are identical; the other 27 are either new (13) or enhanced (14). Many of the new capabilities are data-related: not new in research or practice, of course, but newly recognized as fundamental BPM capabilities rather than something that just happens to occur alongside it.

They have a number of results with both theoretical and managerial implications, a couple of which I found particularly interesting:

  • Industry and academia have different perceptions about the challenges and opportunities for BPM in the digital age. This seems obvious at first glance, but highlights the disconnect between research and practice. The first time I attended this conference, I recall writing about ideas that I saw in the research papers and wondering why there weren’t more BPM vendors and other practitioners at the conference to help drive this alignment (thankfully, now there is an industry track at the conference). Of course, there will always be academic research that has no (obvious) immediate industry use, and therefore little interest until it becomes closer to practical use, but we need to have better collaboration between industry and academia to inspire research and promote adoption of new ideas.
  • The scope of BPM needs to expand to other disciplines and technologies, and not be completely focused on process-centric technologies. There’s a blockchain forum here at BPM 2019, and in industry we’re seeing the inclusion of many other technologies in what have traditionally been BPM suites. I’m giving a short presentation later in the workshop on BPM as the keystone of digital/business automation platforms, in which I discuss some of the practical architectural considerations for these broader capabilities.

Following the keynote, we had a paper presentation from Ralf Laue of Westsächsische Hochschule Zwickau on The Power of the Ideal Final Result for Identifying Process Optimization Potential. He had some interesting comments on customer journey modeling and other techniques based on an existing process, in that they typically only result in incremental improvement, not innovative results (which is similar to my views on Lean Six Sigma for process improvement). Based on ideas from the TRIZ methodology for pattern-based design, Laue stressed that it’s necessary to “not stop thinking too early” in the design process. Applied to customer journeys, this means including not only customer touch-points, but the “non-touch-points”, that is, other things that the customer does that are not (currently) involved in their interaction with an organization. Modeling the entire customer process, rather than just their touch-points with the organization, allows us to look at process optimization that is centered on the customer.

Next was a paper on an empirical analysis of the need for new perspectives on BPM in the digital age, presented by Christian Janiesch from Technische Universität Dresden. They looked at a number of questions about BPM, such as whether BPM is still the driving force in digital transformation projects, and interviewed companies that had deployed BPM successfully — possibly a perspective that would tend to support the central positioning of BPM in digital transformation. They found that the emergence of digital technology has led to organizations having less overall process control, although I would posit that the amount of process control is the same, just that some previously manual processes (uncontrolled but also unobserved) are now managed with digital tools that don’t use explicit process control. They found a need for a funded BPM CoE for coordinating process improvement activities, and for the inclusion of more than just the automation/management of core processes. Two interesting conclusions:

  • At present, BPM isn’t seen as the driving force for successful digital transformation initiatives. My presentation after this shows how a BPM engine is a keystone for a digital automation platform, but I agree that BPM is not necessarily front-of-mind for companies implementing these projects.
  • BPM needs to expand its focus on core processes to encompass more processes and a more universal involvement. This is being done in part by the low-code “citizen developer” products that we’re seeing from BPM vendors.

The last paper was presented by Anna-Maria Exler from Wirtschaftsuniversität Wien (where we are hosted this week) on The Use of Distance Metrics in Managing Business Process Transfer, that is, migrating a business process from one organization to another, as happens when one company acquires another, or when a centralized process is rolled out to branches of an organization. This is still fairly early research, and is looking at factors such as the acceptance of the processes by the target organization. They consider six factors and are mapping their interaction, such as more stakeholder integration having a positive impact on acceptance. They have also derived some proximity/distance factors that will impact the process transfer, such as geographic, cultural and technical distance, with visualization of dimensional proximity using a network diagram. Future research will include additional organizations to help derive measurement and weighting of the distance factors.

The workshop finished with a short presentation by me on BPM systems as the keystone for digital automation platforms, which covers a small bit of the material that I’ll be presenting next week at CamundaCon, with some additional bits that I thought would be more interesting for this audience.

A match made in BPMN/DMN heaven: @bpmswatch joining @Trisotech

Trisotech recently announced that Bruce Silver – who writes and teaches the gold standard Method & Style books and courses on BPMN and DMN, and who has forgotten more about BPMN than most people ever learned – is joining Trisotech as a principal consultant. Congrats all around, although Bruce may regret this when he’s needed at Trisotech Montreal headquarters in January when it’s -30C.

Bruce even has his first post on the Trisotech blog, about practical DMN basics. Essential reading for getting started with DMN.

Disclosure: Trisotech is a consulting client of mine. I’m not being paid for writing this post, I just like these guys because they’re smart and do great work. You can read about my relationship with vendors here.

WfMC Business Transformation Awards 2019, and farewell to WfMC

I listened in today on the annual awards webinar for the Workflow Management Coalition’s Business Transformation Awards. Nathaniel Palmer and Keith Swenson hosted the webinar, with assistance from Layna Fischer, and they announced that the WfMC is being disbanded: the original goals of the organization around process standards development have been achieved, and the standards are now being successfully managed by other standards bodies such as OMG.

I was a judge on some of the case management case studies for the awards, and it’s always interesting to read about how BPM, case management and related technologies are used in different scenarios. The winners of these final awards for excellence in business transformation were presented and discussed:

  • Banco Galicia, nominated by IBM India
  • Banmedica Chile, nominated by Pectra Technology
  • Becton Dickinson, nominated by Newgen Software
  • BeeHIVE, nominated by IBM Singapore (I’m very curious as to whether this is related to their Beehive enterprise collaboration tool that I first saw in 2008)
  • City of Fort Worth, nominated by BP Logix (also recognized last year)
  • Immunization Information Systems Support Branch of the Centers for Disease Control (CDC), self-nominated
  • EsPozo Alimentacion, nominated by AuraPortal
  • ERSP, City of Buenos Aires, nominated by Pectra Technology
  • EVRAZ, nominated by BPM’online
  • Maury, Donnely and Parr, nominated by ProcessMaker
  • NEM Solutions, nominated by AuraPortal
  • Quote-to-Cash Operations, nominated by IBM Philippines
  • Remaza Group, nominated by Vianuvem
  • Sicoob Credicitrus, nominated by Lecom Tecnologia
  • Signature Care Management, self-nominated

They had a few other awards in addition to the case studies, focused on people involved in business transformation:

  • Michael Pang of Protiviti Greater China, awarded for BT CEO – Technology User
  • Jude Chagas Pereira of IYCON and Wizly, awarded for BT CEO – Technology Provider
  • Layna Fischer of Future Strategies and WfMC, awarded the Manheim Award for Significant Contributions in the Field of Workflow/BPM (yay Layna!)
  • Shaun Campbell of City of Fort Worth, awarded for Outstanding BT Team Leader
  • Me (!), awarded for Outstanding BT Consultant

Congrats to all the winners, and a heartfelt thanks to Nathaniel, Keith and Layna for their amazing contributions to WfMC over the years.

The slides and recording of the awards webinar will be available at the Business Transformation Awards website, and watch for the new version of the Intelligent Automation book as well as previous books on this topic at Future Strategies’ BPM Books site.

Prepping for #BPM2019 and the BPMinDIT workshop

It’s less than two weeks until the academic/research-oriented International Conference on BPM in Vienna, and as of two days ago, they’ve closed the registration with 459 participants from 48 countries. It will still be possible to get a ticket on site, but you’ll miss out on the social events.

The conference organizers graciously provided me with a conference pass (I’m covering my own travel expenses), and invited me to give a talk at the workshop on BPM in the era of Digital Innovation and Transformation (BPMinDIT). I’ll be talking about how BPM systems are being used as the keystone for digital automation platforms, covering both the technical architecture and how this contributes to business agility. My aim is to provide an industry experience perspective to complement the research papers in the workshop, and hopefully generate some interest and ideas, all in about 25 minutes!

There are a ton of interesting things to see at the conference: a doctoral consortium on Sunday, workshops on Monday, tutorials and sessions Tuesday through Thursday, then a meeting on teaching fundamentals of BPM on Friday. I’ll be there Monday through Thursday, look me up if you’re there, or watch for my blog posts about what I’m finding interesting.

Cake

My first Sacher Torte in Vienna, 2007. Yes, if you buy a whole one it comes in a fancy box.

Lately, I’ve been thinking about cake. Not (just) because I’m headed to Vienna, home of the incomparable Sacher Torte, nor because I’ll be celebrating my birthday while attending the BPM2019 academic research conference there. No, I’ve been thinking about technical architectural layer cake models.

In 2014, an impossibly long time ago in computer-years, I wrote a paper about what one of the analyst firms was then calling Smart Process Applications (SPA). The idea is that a vendor would provide a SPA platform, then the vendor, customer or third parties would create applications using this platform — not necessarily using low-code tooling, but at least using an integrated set of tools layered on top of the customer’s infrastructure and core business systems. Instances of these applications — the actual SPAs — could then be deployed by semi-technical analysts who just needed to configure the SPA with the specifics of the business function. The paper that I wrote was sponsored by Kofax, but many other vendors provided (and still provide) similar functionality.

Layer cake diagram from my 2014 white paper on Smart Process Application platforms.

The SPA platforms included a number of integrated components to be used when creating applications: process management (BPM), content capture and management (ECM), event handling, decision management (DM), collaboration, analytics, and user experience.

The concept (or at least the name) of SPA platforms has now morphed into “digital transformation”, “digital automation” or “digital business” platforms, but the premise is the same: you buy a monolithic platform from a vendor that sits on top of your core business systems, then you build applications on top of that to deploy to your business units. The tooling offered by the platform is now more likely to include a low-code development environment, which means that the applications built on the platform may not need a separate “configure and deploy” layer above them as in the SPA diagram here. Or this same model could be used, with non-low-code applications developed in the layer above the platform, then low-code configuration and deployment of those just as in the SPA model. Due to pressure (er, suggestions) from analysts, many BPMS platforms became these all-in-one platforms under the guise of iBPMS, but some ended up with a set of tools with uneven capabilities: great functionality for their core strengths (BPM, etc.) but weaker functionality in areas that they had to partner to include or hastily build in order to be included in the analyst rankings.

The monolithic vendor platform model is great for a lot of businesses that are not in the business of software development, but some very large organizations (or small software companies) want to create their own platform layer out of best-of-breed components. For example, they may want to pick BPM and DM from one vendor, ECM from multiple others, collaboration and user experience from still another, plus event handling and analytics using open source tools. In the SPA diagram above, that turns the dark blue platform layer into “Build” rather than “Buy”, although the impact is much the same for the developers who are building the applications on top of the platform. This is the core of what I’m going to be presenting at CamundaCon next month in Berlin, with some ideas on how the market divides between monolithic and best-of-breed platforms, and how to make a best-of-breed approach work (since that’s the focus of this particular audience).
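To illustrate what that “build” platform layer might look like in code, here’s a small sketch of my own: hypothetical interfaces for the best-of-breed components, composed behind a single platform facade that application builders use. All of the names and method signatures are made up; in practice each component would be an adapter over a specific vendor or open source API.

```python
from typing import Protocol

# Hypothetical component interfaces for a "build your own" platform layer: each
# best-of-breed product or open source tool sits behind a thin adapter so that
# application builders see one consistent platform API.
class ProcessEngine(Protocol):
    def start_process(self, definition_key: str, variables: dict) -> str: ...

class DecisionService(Protocol):
    def evaluate(self, decision_key: str, inputs: dict) -> dict: ...

class ContentStore(Protocol):
    def store(self, doc: bytes, metadata: dict) -> str: ...

class DigitalAutomationPlatform:
    """The platform layer, composed from separately sourced components."""
    def __init__(self, bpm: ProcessEngine, dm: DecisionService, ecm: ContentStore):
        self.bpm, self.dm, self.ecm = bpm, dm, ecm

    def open_claim(self, claim_doc: bytes, claim_data: dict) -> str:
        # One business-facing operation that spans content, decisions and process
        doc_id = self.ecm.store(claim_doc, {"type": "claim"})
        routing = self.dm.evaluate("claim-routing", claim_data)
        return self.bpm.start_process("claims-handling",
                                      {**claim_data, "docId": doc_id, **routing})
```

The point of a structure like this is that swapping one component for another only touches its adapter, not the applications built on top of the platform layer.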

And yes, there will be cake, or at least some updated technical architectural layer cake models.