CamundaCon 2023 Day 2: Process and Decision Standards

Falko Menge and Marco Lopes from Camunda gave a presentation on Camunda’s involvement in the development of OMG’s core process and decision standards, BPMN and DMN. Camunda (and Falko in particular) has been involved in OMG standards for a long time, and embraces these two standards in its products. Sadly, at least to me, they gave up support for the case management standard, CMMN, due to lackluster market adoption; other vendors such as Flowable support all three standards in their products and have viable use cases for CMMN.

Falko and Marco gave a shout-out to universities and the BPM academic research conference that I attended recently as promoters of both the concepts of standards and future research into the standards. Camunda has not only participated in the standards efforts, but the co-founders also wrote the book Real-Life BPMN as they discovered the ways that the standard can best be used.

They gave a good history of the development of the BPMN standard and of Camunda’s implementation of it, from the early days of the Eclipse-based BPMN modeler to the modern web-based modelers. Camunda became involved in the BPMN Model Interchange Working Group (MIWG) to be able to exchange models between different modeling platforms, because they recognized that a lot of organizations do much broader business modeling in tools aimed at business analysts, then want to transfer the models to a process execution platform like Camunda. Different vendors choose to participate in the BPMN MIWG tests, and the results are published so that the level of interoperability is understood.
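
To make the interchange point concrete: every MIWG-compliant tool reads and writes the same BPMN 2.0 XML serialization, so a model exported from a business analyst’s tool can be inspected or imported programmatically without any vendor-specific code. Here’s a minimal sketch in Python using only the standard library; the process and task names in the inline XML fragment are invented for illustration.

```python
import xml.etree.ElementTree as ET

# BPMN 2.0 model namespace defined by the OMG specification
BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

# A tiny, hand-written fragment of the standard XML serialization; a real
# file exported from any compliant modeler has the same basic structure.
BPMN_XML = f"""
<definitions xmlns="{BPMN_NS}" id="defs1" targetNamespace="http://example.com/bpmn">
  <process id="orderProcess" isExecutable="true">
    <userTask id="task1" name="Approve order"/>
    <userTask id="task2" name="Ship goods"/>
  </process>
</definitions>
"""

def list_user_tasks(xml_text: str) -> list[str]:
    """Return the names of all user tasks in a BPMN 2.0 XML document."""
    root = ET.fromstring(xml_text)
    return [task.get("name", "") for task in root.iter(f"{{{BPMN_NS}}}userTask")]

print(list_user_tasks(BPMN_XML))  # ['Approve order', 'Ship goods']
```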

DMN is another critical standard, allowing modelers to create standardized decision models; it also includes the Friendly Enough Expression Language (FEEL) for scripting within the models. The DMN Technology Compatibility Kit (TCK) is a set of decision models and expected results that provides test results similar to those of the BPMN MIWG tests: each vendor’s test coverage is published so that their implementation of DMN can be assessed by potential customers.
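
For readers who haven’t seen DMN in action, here’s a rough sketch of the logic that a simple decision table expresses, written as plain Python purely for illustration rather than with a real DMN/FEEL engine; the decision, its inputs and its rules are invented, and each condition stands in for a FEEL unary test such as "< 10000".

```python
# A minimal, hypothetical sketch of the logic in a DMN decision table,
# written as plain Python for illustration (not a real DMN/FEEL engine).
# The decision, its inputs, and its rules are invented for this example.

def assess_risk(amount: float, credit_score: int) -> str:
    """Mimics a four-rule decision table with a 'unique' hit policy:
    exactly one rule is expected to match any combination of inputs."""
    rules = [
        # (condition, output): each tuple corresponds to one table row;
        # the conditions stand in for FEEL unary tests like "< 10000".
        (lambda a, s: a < 10_000 and s >= 700,  "low"),
        (lambda a, s: a < 10_000 and s < 700,   "medium"),
        (lambda a, s: a >= 10_000 and s >= 700, "medium"),
        (lambda a, s: a >= 10_000 and s < 700,  "high"),
    ]
    for condition, output in rules:
        if condition(amount, credit_score):
            return output
    raise ValueError("no rule matched")  # would violate the unique hit policy

print(assess_risk(5_000, 720))   # -> low
print(assess_risk(25_000, 640))  # -> high
```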

Although standards are sometimes decried as being too difficult for business people to understand and use (they’re really not), they create an environment where common executable models of processes and decisions can be created and exchanged across many different vendor platforms. Although there are many other parts of a technology stack that can create vendor lock-in, process and decision models don’t need to be part of that. Also, someone working at a company that uses BPMN and DMN modeling tools can easily move to a different organization that uses different tools without having to relearn a proprietary modeling language. From a model richness standpoint, many vendors and researchers working together towards a common goal can create a much better and more extensive standard (as long as they’re not squabbling over details).

They went on to discuss some of the upcoming new standards: BPM+ Harmonization Model and Notation (BHMN), Shared Data Model and Notation (SDMN), and Knowledge Package Model and Notation (KPMN), all of which are in some way involved in integrating BPMN and DMN due to the high degree of overlap in how these standards are used in many organizations. Since these standards aren’t yet close to release, they’re not planned for a near-future version of Camunda, but they’ll be added to the product management roadmap as the standards evolve and the customer requirements for them become clear.

BPM2023 Day 2: RPA Forum

In the last session of the day, I attended another part of the RPA Forum, with two presentations. 

The first presentation was “Is RPA Causing Process Knowledge Loss? Insights from RPA Experts” (Ishadi Mirispelakotuwa, Rehan Syed, Moe T. Wynn), presented by Moe Wynn. RPA has a lot of measurable benefits – efficiency, compliance, quality – but what about the “dark side” of RPA? Can it make organizations lose knowledge and control over their processes because people have been taken out of the loop? RPA is often quite brittle, and when (not if) it stops working, it’s possible that organizational amnesia has set in: no one remembers how the process works well enough to do it manually. The resulting process knowledge loss (PKL) can have a number of negative organizational impacts.

The study created a conceptual model for RPA-related PKL, and she walked us through the sets of human, organizational and process factors that may contribute. In layman’s terms, use it or lose it.

In my opinion, this is different from back-end or more technical automation (e.g., deploying a BPMS or creating APIs into enterprise system functionality) in that back-end automation is usually fully specified, rigorously coded and tested, and maintained as part of the organization’s enterprise systems. Conversely, RPA is often created directly by the business areas and can be inherently brittle due to changes in the systems with which it interfaces. If an automated process goes down, there are likely service level agreements in place and IT steps in to get the system back online. If an RPA bot goes down, a person is expected to manually perform the tasks that the bot had been doing, and there is less likely to be a robust SLA for getting the bot fixed and back online. There was an interesting discussion around this in the Q&A, although it was not part of the area of study for the paper as presented.

The second presentation was “A Business Model of Robotic Process Automation” (Helbig & Braun), presented by Eva Katarina Helbig of BurdaSolutions, an internal IT service provider for an international media group. Their work was based on a case study within their own organization, looking at establishing RPA as a driver of digitization and automation within a company based on an iterative, holistic view of business models, with the Business Model Canvas as the analysis tool.

They interviewed several people across the organization, mostly in operational areas, to develop a more structured model for how to approach, develop and deploy RPA projects, starting with the value proposition and expanding out to identify the customers, resources and key activities.

That’s it for day two of the main BPM2023 conference, and we’re off later to the Spoorwegmuseum for the conference dinner and a tour of the railway museum.

BPM2023 Day 1: RPA Forum

In the afternoon breakouts, I attended the RPA (robotic process automation) forum for three presentations.

The first presentation was “What Are You Gazing At? An Approach to Use Eye-tracking for Robotic Process Automation”, presented by Antonio Martínez-Rojas. RPA typically includes a training agent that captures what and where a human operator is typing based on UI logs, and uses that to create the script of actions that should be executed when that task is automated using the RPA “bot” without the person being involved – a type of process mining, but based on UI event logs. In this presentation, we heard about using eye tracking — what the person is looking at and focusing on — during the training phase to understand where they are looking for information. This is especially interesting in less structured environments such as reading a letter or email, where the information may be buried in non-relevant text, and it’s difficult to filter out the relevant information. Unlike the UI event log methods, this can find what the user is focusing on while they are working, which may not be the same things on the screen that they are clicking on – an important distinction.

The second presentation was “Accelerating The Support of Conversational Interfaces For RPAs Through APIs”, presented by Yara Rizk. She presented the problem that many business people could be better supported through easier access to all types of APIs, including unattended RPA bots, and proposed a chatbot interface to APIs. The chatbot’s intents can be derived by automatically interrogating the OpenAPI specifications to create natural language sentences from each API endpoint’s name and description, optionally augmented with sample phrases provided by people. The sentences are then analyzed and filtered, typically with some involvement from human experts, and used to train the intent recognition models required to drive a chatbot interface.
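
As a rough sketch of the general idea (my own simplification, not the specific method from the paper), here is how the name and description of each operation in an OpenAPI specification, already parsed into a Python dict, could seed candidate training sentences for intent recognition; the sentence template and the sample spec fragment are invented for illustration.

```python
from typing import Dict, List

HTTP_METHODS = {"get", "put", "post", "delete", "patch", "head", "options"}

def intent_sentences(spec: Dict) -> Dict[str, List[str]]:
    """Generate candidate natural-language sentences for each API operation."""
    sentences: Dict[str, List[str]] = {}
    for path, item in spec.get("paths", {}).items():
        for method, op in item.items():
            if method not in HTTP_METHODS:
                continue  # skip path-level keys such as "parameters"
            intent = op.get("operationId", f"{method} {path}")
            # start from the human-written summary/description fields
            candidates = [op.get(k) for k in ("summary", "description") if op.get(k)]
            # naive template built from the operation id, e.g. "getInvoice"
            candidates.append(f"I want to {intent}")
            sentences[intent] = candidates
    return sentences

# A tiny, invented spec fragment for demonstration
spec = {
    "paths": {
        "/invoices/{id}": {
            "get": {
                "operationId": "getInvoice",
                "summary": "Retrieve an invoice by its identifier",
            }
        }
    }
}
print(intent_sentences(spec))
# {'getInvoice': ['Retrieve an invoice by its identifier', 'I want to getInvoice']}
```

In practice, these raw candidates would then be filtered and augmented by human experts before being used to train the intent recognition model, as described in the presentation.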

The last presentation in this session was “Migrating from RPA to Backend Automation: An Exploratory Study”, presented by Andre Strothmann. He discussed how RPA robots need to be designed and prioritized so that they can be easily replaced, with the goal of moving to back-end automation as soon as it is available. I’ve written and presented many times about how RPA is a bridging technology, and most of it will go away in the 5-10 year horizon, so I’m pretty happy to see this presented in a more rigorous way than my usual hand-waving. He discussed the analysis of their interview data that resulted in some fundamental design requirements for RPA bots, design guidelines for the processes that orchestrate those bots, and migration considerations when moving from RPA bots to APIs. If you’re developing RPA bots now and understand that they are only a stopgap solution, you should be following this research.

More on Mastodon

I’m not very active on Mastodon yet, and still use Twitter as a broadcast platform (rather than for discussion, which is how it started), but I’ve moved over to mastodon.social – you can find me at mastodon.social/@skemsley.

My plan is to start doing a bit of engagement to see how that works out; as always, my main content will be here on my blog, with pointers to new posts on social media.

Playing with AI filters on photos, too. 🙂

Will the elephant replace the bird?

tl;dr: I’m on Mastodon at fosstodon.org/@skemsley

I’ve been on Twitter since March 2007 and have amassed over 7,500 followers (probably half of them bots, but whatever). There’s a current push to move off Twitter and onto Mastodon, an open source microblogging social network, because of the declining standards of content and new ownership over at the bird site. Can we successfully make the shift from tweeting to tooting?

In the old days of Twitter, which don’t seem that long ago, I used to engage in a lot of conversations: my timeline was mostly tweets from my friends and business colleagues, and we would banter back and forth. These days, however, I use Twitter mostly as a broadcast platform, where I post links to new blog posts, videos and other publications. I respond if someone mentions me directly, but it’s no longer the place that I go to start a conversation. It’s just too noisy, full of promoted tweets, and retweets about topics that I don’t care about by people who I barely know. To be fair, some of that is my fault: I tended to follow most people who followed me and had some sort of similar interests, and it’s a lot of work to go back and pare down that list of 2,000 who I follow to a more reasonable number. When lists came out, I started putting people on lists rather than following them directly, but it was probably already too late. Same, by the way, for LinkedIn: I was indiscriminate about who I added to my network, and it’s just too noisy over there for a real conversation.

Enter the elephant. Mastodon is an open source, decentralized social platform that has functionality quite similar to Twitter: posts are “toots” instead of tweets; you can like, share and reply to posts, and can see a running feed of posts. The big difference is that Mastodon isn’t one company, or one instance: anyone can create a Mastodon instance, either privately for use within a smaller group, or as part of a group of federated servers that share posts and (to some extent) account information. When you sign up, you need to choose the server where you want your account, although you can follow accounts from other servers. If you want to change servers at some point in the future, you can; however, it doesn’t appear that you can move your posts to the new server (although you will move your following/follower lists), so there is less incentive to do this once you start posting a lot.

I looked at the available servers that follow the Mastodon Server Covenant and are part of the fediverse (the group of federated servers), and picked fosstodon.org, which is a technology-focused server that includes a lot of (but is not exclusive to) free and open source software. I’m not exclusive to open source, but I do cover a number of process automation vendors with open source offerings and this felt like a good fit. You can find and follow me there at fosstodon.org/@skemsley. Will I be better at curating my follows on this platform, which I so miserably failed at on Twitter and LinkedIn? I have way less FOMO these days, so maybe.

I’m already starting to have some conversations over there, but finding it difficult to find who from my current circle of friends and colleagues is on Mastodon, and on which server, since searching by name really only finds people on your own server unless someone else on your server mentions them. I also have a lot to learn about curating my feed, since the defaults are Home (my posts, plus posts and re-toots from the people I follow), Local (all posts from everyone on my server) and Federated (holy crap, everything in the fediverse). I’ve discovered an unofficial but quite good source of helpful info at fedi.tips and will be reviewing more of that.

On a side note, Twitter tends to be a good platform for contacting customer service for some organizations, so I’m not going to abandon it outright, and I’ll still use it for broadcasts. Just covering my bases.

Transforming Digital Transformation: A Panel on Women in Tech

I’ve been asked to participate in a panel on March 15 in honor of International Women’s Day. Sponsored by Camunda and Infosys (and technically by me, since I’m giving my time and effort to this without compensation), this panel brings together actual technical women: nothing against women in tech marketing or other non-technical roles, but this is the first “women in tech” panel that I’ve been on where every single one of the non-sponsor participants has a degree in engineering or computer science, and has worked in a technical role at some point in her career.

I’m honored to be invited to join this group of trailblazers in the tech world to discuss the challenges and experiences for women in technology.

Regardless of your gender, the topics that we will discuss have an impact on you. Technology and automation are huge drivers of innovation, and companies are starving for good technical talent, regardless of gender. In fact, women in technology and leadership roles foster diversity, collaboration and innovation in ways that result in higher revenues for companies. Yet in an environment that seems like a natural fit for encouraging more technical women, many companies still throw up barriers, from hiring biases to an unfriendly “bro” culture.

Register at the link above, and tune in on March 15 at 12:30pm (Eastern) for our live discussion with Q&A to follow.

The (old) new software industry

Facebook is a hot mess most of the time, but I usually enjoy the “memories” that remind me what I posted on this date in past years. A couple of days ago, on April 30, I was reminded that in 2007 I attended the New Software Industry conference at the Microsoft campus in Mountain View. These were the days when SaaS and other cloud platforms were emerging as a significant packaging concept, and companies were rethinking their delivery models as well as their split between product and services.

In reviewing those old posts, there were a lot of points that are still valid today, and topics ranged from development practices to software company organization to venture capital. The discussion about the spectrum of software development practices was especially on point: there are some things that lend themselves to a heavily-specified waterfall-like model (e.g., infrastructure development), while others benefit from an agile approach (e.g., most applications). I also liked this bit that I picked up from one of the sessions about software industry qualifications:

In World of Warcraft you can tell if someone has a Master’s in Dragon Slaying, and how good they are at it, whereas the software industry in general, and the open source community in particular, has no equivalent (but should).

I finished my coverage by pointing out that this was a very Valley-centric view of the software industry, and that new software industry conferences in the future would need to be much more inclusive of the global software industry.

I was already live-blogging at conferences by this point in time, and you can read all my posts for the conference here.

It’s My 16th Blogaversary

Sixteen years ago, with some trepidation, I hit the Publish button for the first time on Column 2, posting a review of the BPTrends 2005 BPM Suites Report. This was early in the social media days, and I wasn’t sure if anyone would be interested in anything that I had to write about. Since then, I’ve written 2,236 posts and 1,026,740 words. My readers from all over the world have contributed 2,095 comments. The readership stats are not completely accurate, since I’ve transferred platforms twice and they would have been reset at those points, although the last change was quite a number of years ago. Based on the current site stats, aside from the Home and About Me pages, the most popular post of all time is Policies, procedures, processes and rules from 2007. More readers are from the US than any other country, although India and Germany have respectable second and third place showings.

Social publishing platforms come and go, and I occasionally dabble in other places such as LinkedIn and Medium, but I believe that maintaining control over my content is important. I choose to make this open platform (self-hosted WordPress) my main platform, rather than a proprietary walled garden that may limit who sees what I write, or go out of business and take my content with them.

When I started writing this blog, I was doing technical strategy and architecture consulting work similar to what I do now. The exposure that I have from writing here has leveraged me into a different and complementary business, and I now spend half my time as an independent industry analyst. That started with me asking for free press passes to vendor and industry conferences, since I was writing about the conferences; eventually, the value of what I was writing was recognized, and vendors started to invite me to attend (covering my travel expenses) and include me in analyst sessions and product briefings. Now, they hire me to help with internal strategy, as well as to write and present on thought-leadership and educational topics regarding our industry.

Writing this blog has expanded my business social circle enormously, and I count as friends (or at least friendly business colleagues) many people who I have met because I hit that Publish button. Without a doubt, it has been transformational for me, both in business and in life.

Virtual conference best practices: 2020 in review

Wow, it’s been over two months since my last post. I took a long break over the end of the year since there wasn’t a lot going on that inspired me to write, and we were in conference hiatus. Now that (virtual) conferences are ramping up again for 2021, I wanted to share some of the best practices that I gathered from attending — and in one case, organizing — virtual conferences over 2020. Having sent this information by email to multiple people who were organizing their own conferences, I decided to just put it here where everyone could enjoy it. Obviously, these are all conferences about intelligent automation platforms, but the best practices are applicable to any technical conference, and likely to many non-technical conferences.

In summary, I saw three key things that make a virtual conference work well:

  1. Live presentations, not pre-recorded. This is essential for the amount of energy in the presentation, and makes the difference between a cohesive conference and just a bunch of webinars. Screwups happen when you’re live, but they happen at in-person conferences, too.
  2. Separate and persistent discussion platform, such as Slack (or Pega’s community in the case of their conference). Do NOT use the broadcast vendor’s chat/discussion platform, since a) it will disappear once your conference is over, and b) it probably sucks.
  3. Replays of the video posted as soon as possible, so that people who missed a live session can watch it and jump into the discussion later the same day while others are still talking about it. Extra points for also publishing the presentation slides at the same time.

A conference is not a one-way broadcast, it’s a big messy collaborative conversation

Let’s start with the list of the virtual conferences that I wrote about, with links to the posts:

What I saw by attending these helped me when I was asked to organize DecisionCAMP, which ran in late June: we did the sessions using Zoom with livestreaming to YouTube (participants could watch either way), used Slack as a discussion platform (which is still being used for ongoing discussions and to run monthly events), and used YouTube for the on-demand videos. Fluxicon used a similar setup for their Process Mining Camp: Skype (I think) instead of Zoom to capture the speakers’ sessions, with all participants watching through the YouTube livestream and discussions on Slack.

Some particular notes excerpted from my posts on the vendor conferences follow. If you want to see the full blog posts, use the tag links above or just search.

Camunda

  • “Every conference organizer has had to deal with either cancelling their event or moving it to some type of online version as most of us work from home during the COVID-19 pandemic. Some of these have been pretty lacklustre, using only pre-recorded sessions and no live chat/Q&A, but I had expectations for Camunda being able to do this in a more “live” manner that doesn’t completely replace an in-person event, but has a similar feel to it. They did not disappoint: although a few of the CamundaCon presentations were pre-recorded, most were done live, and speakers were available for live Q&A. They also hosted a Slack workspace for live chat, which is much better than the Q&A/chat features on the webinar broadcast platform: it’s fundamentally more feature-rich, and also allows the conversations to continue after a particular presentation completes.”
  • “As you probably gather from my posts today, I’m finding the CamundaCon online format to be very engaging. This is due to most of the presentations being performed live (not pre-recorded as is seen with most of the online conferences these days) and the use of Slack as a persistent chat platform, actively monitored by all Camunda participants from the CEO on down.”
  • “I mentioned on Twitter today that CamundaCon is now the gold standard for online conferences: all you other vendors who have conferences coming up, take note. I believe that the key contributors to this success are live (not pre-recorded) presentations, use of a discussion platform like Slack or Discord alongside the broadcast platform, full engagement of a large number of company participants in the discussion platform before/during/after presentations, and fast upload of the videos for on-demand watching. Keep in mind that a successful conference, whether in-person or online, allows people to have unscripted interactions: it’s not a one-way broadcast, it’s a big messy collaborative conversation.”
  • Note that things did go wrong occasionally — one presentation was cut off part way through when the presenter’s home internet died. However, the energy level of the presentations was really high, making me want to keep watching. Also hilarious when one speaker talked about improving their “shittiest process” which is probably only something that would come out spontaneously during a live presentation.

Alfresco

  • “Alfresco Modernize didn’t have much of a “live” feel to it: the sessions were all pre-recorded which, as I’ve mentioned in my coverage of other online conferences, just doesn’t have the same feel. Also, without a full attendee discussion capability, this was more like a broadcast of multiple webinars than an interactive event, with a short Q&A session at the end as the only point of interaction.”

Celonis

  • “A few notes on the virtual conference format. Last week’s CamundaCon Live had sessions broadcast directly from each speaker’s home plus a multi-channel Slack workspace for discussion: casual and engaging. Celonis has made it more like an in-person conference by live-broadcasting the “main stage” from a studio with multiple camera angles; this actually worked quite well, and the moderator was able to inject live audience questions. Some of the sessions appeared to be pre-recorded, and there’s definitely not the same level of audience engagement without a proper discussion channel like Slack — at an in-person event, we would have informal discussions in the hallways between sessions that just can’t happen in this environment. Unfortunately, the only live chat is via their own conference app, which is mobile-only and has a single chat channel, plus a separate Q&A channel (via in-app Slido) for speakers that is separated by session and is really more of a webinar-style Q&A than a discussion. I abandoned the mobile app early and took to Twitter. I think the Celosphere model is probably what we’re going to see from larger companies in their online conferences, where they want to (attempt to) tightly control the discussion and demonstrate the sort of high-end production quality that you’d have at a large in-person conference. However, I think there’s an opportunity to combine that level of production quality with an open discussion platform like Slack to really improve the audience experience.”
  • “Camunda and Celonis have both done a great job, but for very different reasons: Camunda had much better audience engagement and more of a “live” feel, while Celonis showed how to incorporate higher production quality and studio interviews to good effect.”
  • “Good work by Celonis on a marathon event: this ran for several hours per day over three days, although the individual presentations were pre-recorded then followed by live Q&A. Lots of logistics and good production quality, but it could have had better audience engagement through a more interactive platform such as Slack.”

IBM

  • “As I’ve mentioned over the past few weeks of virtual conferences, I don’t like pre-recorded sessions: they just don’t have the same feel as live presentations. To IBM’s credit, they used the fact that they were all pre-recorded to add captions in five or six different languages, making the sessions (which were all presented in English) more accessible to those who speak other languages or who have hearing impairments. The platform is pretty glitchy on mobile: I was trying to watch the video on my tablet while using my computer for blogging and looking up references, but there were a number of problems with changing streams that forced me to move back to desktop video for periods of time. The single-threaded chat stream was completely unusable, with 4,500 people simultaneously typing “Hi from Tulsa” or “you are amazing”.”
  • “IBM had to pivot to a virtual format relatively quickly since they already had a huge in-person conference scheduled for this time, but they could have done better both for content and format given the resources that they have available to pour into this event. Everyone is learning from this experience of being forced to move events online, and the smaller companies are (not surprisingly) much more agile in adapting to this new normal.”

Appian

  • “This was originally planned as an in-person conference, and Appian had to pivot on relatively short notice. They did a great job with the keynotes, including a few of the Appian speakers appearing (appropriately distanced) in their own auditorium. The breakout sessions didn’t really grab me: too many, all pre-recorded, and you’re basically an audience of one when you’re in any of them, with little or no interactivity. Better as a set of on-demand training/content videos rather than true breakout sessions, and I’m sure there’s a lot of good content here for Appian customers or prospects to dig deeper into product capabilities but these could be packaged as a permanent library of content rather than a “conference”. The key for virtual conferences seems to be keeping it a bit simpler, with more timely and live sessions from one or two tracks only.”

Signavio

  • “Signavio has a low-key format of live presentations that started at 11am Sydney time with a presentation by Property Exchange Australia: I tuned in from my timezone at 9pm last night, stayed for the Deloitte Australia presentation, then took a break until the last part of the Coca-Cola European Partners presentation that started at 8am my time. In the meantime, there were continuous presentations from APAC and Europe, with the speakers all presenting live in their own regular business hours.”
  • “The only thing missing is a proper discussion platform — I have mentioned this about several of the online conferences that I’ve attended, and liked what Camunda did with a Slack workspace that started before and continued after the conference — although you can ask questions via the GoToWebinar Question panel. To be fair, there is very little social media engagement (the Twitter hashtag for the conference is mostly me and Signavio people), so possibly the attendees wouldn’t get engaged in a full discussion platform either. Without audience engagement, a discussion platform can be a pretty lonely place. In summary, the GTW platform seems to behave well and is a streamlined experience if you don’t expect a lot of customer engagement, or you could use it with a separate discussion platform.”

Pega

  • “In general, I didn’t find the prerecorded sessions to be very compelling. Conference organizers may think that prerecording sessions reduces risk, but it also reduces spontaneity and energy from the presenters, which is a lot of what makes live presentations work so well. The live Q&A interspersed with the keynotes was okay, and the live demos in the middle breakout section as well as the live Tech Talk were really good. PegaWorld also benefited from Pega’s own online community, which provided a more comprehensive discussion platform than the broadcast platform chat or Q&A.”

Fluxicon

  • “The format is interesting, there is only one presentation each day, presented live using YouTube Live (no registration required), with some Q&A at the end. The next day starts with Process Mining Café, which is an extended Q&A with the previous day’s presenter based on the conversations in the related Slack workspace (which you do need to register to join), then a break before moving on to that day’s presentation. The presentations are available on YouTube almost as soon as they are finished.”
  • “The really great part was engaging in the Slack discussion while the keynote was going on. A few people were asking questions (including me), and Mieke Jans posted a link to a post that she wrote on a procedure for cleansing event logs for multi-case processes – not the same as what van der Aalst was talking about, but a related topic. Anne Rozinat posted a link to more reading on these types of many-to-many situations in the context of their process mining product from their “Process Mining in Practice” online book. Not surprisingly, there was almost no discussion on the Twitter hashtag, since the attendees had a proper discussion platform; contrast this with some of the other conferences where attendees had to resort to Twitter to have a conversation about the content. After the keynote, van der Aalst even joined in the discussion and answered a few questions, plus added the link for the IEEE task force on process mining that promotes research, development, education and understanding of process mining: definitely of interest if you want to get plugged into more of the research in the field. As a special treat, Ferry Timp created visual notes for each day and posted them to the related Slack channel.”

Bizagi

  • “The broadcast platform fell over completely…I’m not sure if Bizagi should be happy that they had so many attendees that they broke the platform, or furious with the platform vendor for offering something that they couldn’t deliver. The “all-singing, all-dancing” platforms look nice when you see the demo, but they may not be scalable enough.”

Final thoughts

Just to wrap things up, it’s fair to say that things aren’t going to go back to the way that they were any time soon. Part of this is due to organizations understanding that things can be done remotely just as effectively (or nearly so) as they can in person, if done right. Also, a lot of people are still reluctant to even think about travelling and spending days in poorly-ventilated rooms with a bunch of strangers from all over the world.

The vendors who ran really good virtual conferences in 2020 are almost certain to continue to run at least some of their events virtually in the future, or find a way to have both in-person and remote attendees simultaneously. If you run a virtual conference that doesn’t get the attendee engagement that you expected, the problem may not be that “virtual conferences don’t work”: it could be that you just aren’t doing it right.

Pandemic-driven digital transformation in the legal world: this genie is not going back in the bottle

When I write or present about the type of digital transformation that the pandemic is forcing on firms in order to survive, I usually use examples from financial services and insurance, since that’s where I do most of my consulting. However, we see examples all around us as consumers, as every business of every size struggles to transform to an online model to be able to continue providing us with goods and services. And once both the consumers and the businesses see the benefits of doing some (not all) transactions online, there will be no going back to the old way of doing things.

I recently moved, and just completed the closing on the sale of my previous home. It’s been quite a while since I last did this, but it was always (and I believe still was until a few months ago) a very paper-driven, personal service type of transaction. This time was much easier, and almost all online; in fact, I’ve never even met anyone from my lawyer’s office face-to-face, I didn’t use a document courier, and I only saw my real estate agent in person once. All documents were digitally signed, and I had a video call with my lawyer to walk through the documents and verify that it was me doing the signing. I downloaded the signed documents directly, although the law office would have been happy to charge me to print and mail a copy. To hand over the keys, my real estate agent just left their lockbox (which contained the keys for other agents to do showings) and gave the code to my lawyer to pass on to the other party once the deal was closed. Payments were all done as electronic transfers.

My lawyer’s firm is obviously still straddling this paradigm shift, and provided the option to deliver paper documents, payments and keys by courier (in fact, I had to remind them to remove the courier fee from their standard invoice). On the other hand, they no longer offer in-person meetings: it has to be a video call. Yes, you can still sign physical documents and courier them back and forth, but that’s going to add a couple of days to the process and is more cumbersome than signing them digitally. Soon, I expect to see pricing from law firms that strongly encourages their clients to do everything digitally, since it costs them more to handle the paper documents and can create health risks for their employees.

Having gone through a real estate closing once from the comfort of my own home, I am left with one question: why would we ever go back to the old way of doing this? I understand that there are consumers who won’t or can’t adapt to new online methods of doing business with organizations, but those are becoming fewer every day. That’s not because the millennial demographic is taking over, but because people of all ages are learning that some of the online methods are better for them as well as the companies that they deal with.

Generalizing from my personal anecdote, this is happening in many businesses now: they are making the move to online business models in response to the pandemic, then finding that for many operations, this is a much better way of doing things. Along the way, they may also be automating some processes or eliminating manual tasks, like my lawyer’s office eliminating the document handling steps that used to be done. Not just more efficient for the company, but better for the clients.

As you adjust your business to compensate for the pandemic, design your customer-facing processes so that they make it easier (if possible) for your customer to do things online than the old way of doing things. That will almost always be more efficient for your business, and can greatly improve customer satisfaction. This does not mean that you don’t need people in your organization, or that your customers can’t talk to someone when required: automating processes and tasks means that you’re freeing up people to focus on resolving problems and improving customer communications, rather than performing routine tasks.

As one of my neighbourhood graffiti artists so eloquently put it, “6 feet apart but close 2 my ❤”.