We’re reaching the end of what would have been the usual spring season of tech conferences, although all of them moved online with varying degrees of success. After the first few that I attended, I promised a summary of the best and worst practices, and I still plan to do that, but Jacob Feldman convinced me to help him out with the logistics for the online version of DecisionCAMP, which was supposed to be in Oslo next week. I first attended DecisionCAMP last year in Bolzano since I was already in Berlin the week before for CamundaCon, and managed to spend a few days’ vacation in northern Italy as a bonus. This year, I won’t be blogging about it live, because I’ll be running the logistics and the on-screen monitoring. This is giving me a chance to test-drive some of my ideas about how to run a good online event without spending a fortune.
Note that the last day of the conference, Wednesday July 1, is Canada Day: a major national holiday sometimes referred to as “Canada’s birthday”, but I’ll be online moderating the conference because who’s really going anywhere these days. I do expect everyone on the Zoom meeting that day to be sporting a red maple leaf, or at least be wearing red and white, at risk of having their video disabled by the diabolical Canadian moderator.
All presentations will be live on Zoom, with simultaneous livestreaming on YouTube. If you are registered, you will receive the Zoom link; if you’re not registered or prefer to watch on YouTube, subscribe to the DecisionCAMP YouTube channel and watch it there.
Discussions and Q&A will be on the DecisionCAMP Slack workspace, with dedicated channels for discussions about each day’s presentations. We are encouraging presenters to engage with their audience there after their presentation to answer any open questions, and we already have some discussions going on. This type of persistent, multi-threaded platform is much better for emulating the types of hallway conversations and presenter Q&A that might occur at an in-person conference.
For Zoom attendees, there will also be the option to use the “raise hand” feature and ask a question verbally during the presentation.
I’ve learned a lot about online conference tools in the past month or so, including:
Zoom features, settings and all variations on recording to have the best possible experience during and after each presentation. I will share all of those in my “best practices” post that I’ll create after DecisionCAMP is over, based on what I’ve seen from all the online conferences this spring.
Slack, which I have used before but I’ve never created/administered a workspace or added apps.
YouTube livestreaming, or rather, stream-through from Zoom. This is a very cool feature of how Zoom and YouTube work together, but you have to learn a few things, such as the need to manually end the stream over on YouTube once you’ve closed the Zoom meeting, so that it doesn’t keep running with no data input for several hours. Oops.
I’m not being financially compensated for working on DecisionCAMP: I’ve been treating it as a learning experience.
Last week, I was busy preparing and presenting webinars for two different clients, so I ended up missing Software AG’s ARIS international user groups (IUG) conference and most of Fluxicon’s Process Mining Camp online conference, although I did catch a bit of the Lufthansa presentation. However, Process Mining Camp continues this week, giving me a chance to tune in for the remaining sessions. The format is interesting: there is only one presentation each day, presented live using YouTube Live (no registration required), with some Q&A at the end. The next day starts with Process Mining Café, which is an extended Q&A with the previous day’s presenter based on the conversations in the related Slack workspace (which you do need to register to join), then a break before moving on to that day’s presentation. The presentations are available on YouTube almost as soon as they are finished, but are being shared via Slack using unlisted links, so I’ll let Fluxicon make them public at their own pace (subscribe to their YouTube channel since they will likely end up there).
Anne Rozinat, co-founder of Fluxicon, was moderator for the event, and was able to bring life to the Q&A since she’s an expert in the subject matter and had questions of her own. Each day’s session runs a maximum of two hours starting at 10am Eastern, which makes it a reasonable time for all of Europe and North America (having lived in California, I know the west coasters are used to getting up for 7am events to sync with east coast times). Also, each presentation is by a practitioner who uses process mining (specifically, Fluxicon’s Disco product) in real applications, meaning that they have stories to share about their data analysis, and what worked and didn’t work.
Monday started with Q&A with Zsolt Varga of the European Court of Auditors, who presented last Friday. It was a great discussion and made me want to go back and see Varga’s presentation: he had some interesting comments on how they track and resolve missing historical data, as well as one of the more interesting backgrounds. There was then a presentation by Hilda Klasky of the Oak Ridge National Laboratory on process mining for electronic health records, with some cool data clustering and abstraction to extract case management state transition patterns from what seemed to be a massive spaghetti mess. Tuesday, Klasky returned for Q&A, followed by a presentation by Harm Hoebergen and Redmar Draaisma of Freo (an online loans subsidiary of Rabobank) on loan and credit processes across multiple channels. It was great to track Slack during a presentation and see the back-and-forth conversations as well as watch the questions accumulate for the presenter; after each presentation, it was common to see the presenter respond to questions and discussion points that weren’t covered in the live Q&A. For online conferences, this type of “chaotic engagement” (rather than tightly controlled broadcasts from the vendor, or non-functional single-threaded chat streams) replaces the “hallway chats” and is essential for turning a non-engaging set of online presentations into a more immersive conference experience.
The conference closing keynote today was by Wil van der Aalst, who headed the process mining group at Eindhoven University of Technology where Fluxicon’s co-founders did their Ph.D. studies. He’s now at RWTH Aachen University, although he remains affiliated with Eindhoven. I’ve had the pleasure of meeting van der Aalst several times at the academic/research BPM conferences (including last year in Vienna), and always enjoy hearing him present. He spoke about some of the latest research in object-centric process mining, which addresses the issue of handling events that refer to multiple business objects, such as multiple items in a single order that may be split into multiple deliveries. Traditionally in process mining, each event record from a history log that forms the process mining data has a single case ID, plus a timestamp and an activity name. But what happens if an event impacts multiple cases?
He started with an overview of process mining and many of the existing challenges, such as performance issues with conformance checking, and the fact that data collection/cleansing still takes 80% of the effort. However, process mining (and, I believe, task mining as a secondary method of data collection) may have to work with event logs where a single event refers to multiple cases, requiring that the data be “flattened”: one of the cases is picked as the identifier for the event record, and the record is duplicated for each case referred to in the event. The problem arises because events can disappear or be duplicated when cases diverge and converge, which will cause problems in generating accurate process models. Consider your standard Amazon order, like the one that I’m waiting for right now. I placed a single order containing eight items a couple of days ago, which were supposed to be delivered in a single shipment tomorrow. However, the single order was split into three separate orders the day after I placed it, then two of those orders are being sent in a single shipment that is arriving today, while the third order will be in its own shipment tomorrow. Think about the complexity of tracking by order, or item, or shipment: processes diverge and converge in these many-to-many relationships. Is this one process (my original order), or two (shipments), or three (final orders)?
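To make the flattening problem concrete, here’s a minimal sketch in Python of what happens when a multi-object event log is forced into the classic single-case-ID format. The log structure and field names are illustrative only, not from any particular process mining tool: each event carries a list of the orders it touches, and flattening duplicates the event once per order.

```python
# Hypothetical mini event log: each event may reference multiple objects.
# Structure and field names are illustrative, not from any specific tool.
events = [
    {"activity": "place order",   "timestamp": 1, "orders": ["o1"]},
    {"activity": "split order",   "timestamp": 2, "orders": ["o1", "o2", "o3"]},
    {"activity": "ship together", "timestamp": 3, "orders": ["o1", "o2"]},
    {"activity": "ship alone",    "timestamp": 4, "orders": ["o3"]},
]

def flatten(events, object_type):
    """Flatten to a classic log with one case ID per record:
    each event is copied once for every object it references."""
    flat = []
    for e in events:
        for case_id in e[object_type]:
            flat.append({"case": case_id,
                         "activity": e["activity"],
                         "timestamp": e["timestamp"]})
    return sorted(flat, key=lambda r: (r["case"], r["timestamp"]))

flat = flatten(events, "orders")
# The single "split order" event now appears three times (once per order),
# even though it happened once -- the convergence/divergence distortion
# that object-centric process mining is meant to avoid.
print(len(events), len(flat))  # 4 events become 7 flattened records
```

Run against a real log, this kind of duplication inflates activity frequencies and fabricates repetition in the discovered model, which is exactly why keeping the many-to-many object relationships intact matters.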
The really great part was engaging in the Slack discussion while the keynote was going on. A few people were asking questions (including me), and Mieke Jans posted a link to a post that she wrote on a procedure for cleansing event logs for multi-case processes – not the same as what van der Aalst was talking about, but a related topic. Anne Rozinat posted a link to more reading on these types of many-to-many situations in the context of their process mining product from their “Process Mining in Practice” online book. Not surprisingly, there was almost no discussion on the Twitter hashtag, since the attendees had a proper discussion platform; contrast this with some of the other conferences where attendees had to resort to Twitter to have a conversation about the content. After the keynote, van der Aalst even joined in the discussion and answered a few questions, plus added the link for the IEEE task force on process mining that promotes research, development, education and understanding of process mining: definitely of interest if you want to get plugged into more of the research in the field. As a special treat, Ferry Timp created visual notes for each day and posted them to the related Slack channel – you can see the one from today at the left.
Great keynote and discussion afterwards; I recommend tracking Fluxicon’s blog and/or YouTube channel to watch it – and all of the other presentations – when published.
Did you ever make some last-minute changes before a presentation, only to look on in horror when a slide pops up that was not exactly what you were expecting? That was me today, on a webinar about process intelligence that I did with Signavio. In the webinar, I was taking a step back from process automation — my usual topic of discussion — to talk more about the analytical tools such as process mining. This morning, I decided that I wanted to add a brief backgrounder on process mining, and pulled in some slides that I had created for a presentation back in 2013 on (what were then) evolving technologies related to process management. I got a bit too fancy, and created a four-image build but accidentally didn’t have the animation set on what should have been the last image added to the build, so it obscured all the good stuff on the slide.
I thought it was a pretty interesting topic, and I rebuilt the slide and recorded it. Check it out (it’s only 3-1/2 minutes long):
How do you get a handle on your company’s disrupted processes? How do you get real-time visibility into your organization’s strengths and weaknesses? How do you confidently chart a path to the future? The key is process intelligence: seeing your processes clearly and understanding what is actually happening versus what’s supposed to happen.
For example, your order-to-cash process is showing increased sales but decreasing customer satisfaction. Why? What is the root cause? Or, you have an opportunity to offer a new product but aren’t sure if your manufacturing process can handle it. To make this decision, you need a clear line of sight into what your organization can do. These areas are where process intelligence shines.
This webinar will help you answer questions like these, showing you – with examples – how process intelligence can help you drive real business results.
Rather than my usual focus on process automation, I’m digging a bit more into the process analysis side, particularly around process mining. With many businesses now operating with a largely distributed workforce, processes have changed, and it’s not possible to do Gemba walks or job shadowing to collect information on what the adjusted processes look like. Process mining and task mining provide the capabilities to do that remotely and accurately, and identify any problems with conformance/compliance as well as discover root causes. You can sign up at the link above to attend or receive the on-demand replay after the event.
I also posted last week about the webinar that I’m presenting on Wednesday for ABBYY on digital intelligence in the insurance industry, which is a related but different spin on the same issue: how are processes changing now, and what methodologies and technologies are available to handle this disruption. In case it’s not obvious, I don’t work for either of these vendors (who have some overlap in products) but provide “thought leadership” presentations to help introduce and clarify concepts for audiences. Looking forward to seeing everyone on either or both of these webinars later this week.
PegaWorld, in shifting from an in-person to virtual event, dropped down to a short 2.5 hours. The keynotes and many of the breakouts appeared to be mostly pre-recorded, hosted live by CTO Don Schuerman who provided some welcome comic relief and moderated live Q&A with each of the speakers after their session.
The first session was a short keynote with CEO Alan Trefler. It’s been a while since I’ve had a briefing with Pega, and their message has shifted strongly to the combination of AI and case management as the core of their digital platform capabilities. Trefler also announced Pega Process Fabric that allows the integration of multiple systems not just from Pega, but other vendors.
Next up was SVP of Products Kerim Akgonul, discussing their low-code Pega Express approach and how it’s helping customers to stand up applications faster. We heard briefly from Anna Gleiss, Global IT Head of Master Data Management at Siemens, who talked about how they are leveraging Pega to ensure reusability and speed deployment across the 30 different applications that they’re running in the Pega Cloud. Akgonul continued with use cases for self-service — especially important with the explosion in customer service in some industries due to the pandemic — and some of their customers such as Aflac who are using Pega to further their self-service efforts.
There was a keynote by Rich Gilbert, Chief Digital and Information Officer at Aflac, on the reinvention that they have gone through. There’s a lot of disruption in the insurance industry now, and they’ve been addressing this by creating a service-based operating model to deliver digital services as a collaboration between business and IT. They’ve been using Pega to help them with their key business drivers of settling claims faster and providing excellent customer service with offerings such as “Claims Guest Checkout”, which lets someone initiate a claim through self-service without knowing their policy number or logging in, and a Claims Status Tracker available on their mobile app or website. They’ve created a new customer service experience using a combination of live chat and virtual assistants, the latter of which is resolving 86% of inquiries without moving to a live agent.
Akgonul also provided a bit more information on the Process Fabric, which acts as a universal task manager for individual workers, with a work management dashboard for managers. There was no live Q&A at this point; it was deferred to a Tech Talk later in the agenda. In the interim was a one-hour block of breakouts that had one track of three live build sessions, plus a large number of short prerecorded sessions from Pega, partners and customers. I’m interested in more information on the Process Fabric, which I believe will be in the later Tech Talk, although I did grab some screenshots from Akgonul’s keynote:
The live build sessions seemed to be overloaded and there was a long delay getting into them, but once started, they were good-quality demos of building Pega applications. I came in part way through the first one on low-code using App Studio, and it was quite interactive, with a moderator dropping in occasionally with live questions, and eventually hurrying the presenter along to finish on time. I was only going to stay for a couple of minutes, but it was pretty engaging and I watched all of it. The next live demo was on data and integration, and built on the previous demo’s vehicle fleet manager use case to add data from a variety of back-end sources. The visuals were fun, too: the presenter’s demo was most of the screen, with a bubble at the bottom right containing a video of the speaker, then a bubble popping in at the bottom left with the moderator when he had a question or comment. Questions from the audience helped to drive the presentation, making it very interactive. The third live demo was on user experience, which had a few connectivity issues so I’m not sure we saw the entire demo as planned, but it showed the creation of the user interface for the vehicle manager app using the Cosmos system, moving a lot of logic out of the UI and into the case model.
The final session was the Tech Talk on product vision and roadmap with Kerim Akgonul, moderated by Stephanie Louis, Senior Director of Pega’s Community and Developer Programs. He discussed Process Fabric, Project Phoenix, Cosmos and other new product releases in addition to fielding questions from social media and Pega’s online community. This was very interactive and engaging, much more so than his earlier keynote which seemed a bit stiff and over-rehearsed. More of this format, please.
In general, I didn’t find the prerecorded sessions to be very compelling. Conference organizers may think that prerecording sessions reduces risk, but it also reduces spontaneity and energy from the presenters, which is a lot of what makes live presentations work so well. The live Q&A interspersed with the keynotes was okay, and the live demos in the middle breakout section as well as the live Tech Talk were really good. PegaWorld also benefited from Pega’s own online community, which provided a more comprehensive discussion platform than the broadcast platform chat or Q&A. If you missed today’s event, you should be able to find all of the content on demand on the PegaWorld site within the next day or two.
I have a long history working with insurance companies on their digitization and process automation initiatives, and there are a lot of interesting things happening in insurance as a result of the pandemic and associated lockdown: more automation of underwriting and claims, increased use of digital documents instead of paper, and trying to discover the “new normal” in insurance processes as we move to a world that will retain, at least in part, a distributed workforce for some time in the future. At the same time, there is an increase in some types of insurance business activity, and decreases in other areas, requiring reallocation of resources.
On June 17, I’ll be presenting a webinar for ABBYY on some of the ways that insurance companies can navigate this perfect storm of business and societal disruption using digital intelligence technologies including smarter content capture and process intelligence. Here’s what we plan to cover:
Helping you understand how to transform processes, instead of falling into the trap of just automating existing, often broken processes
Getting your organization one step ahead of your competition with the latest content intelligence capabilities that help transform your customer experience and operational effectiveness
Completely automating your handling of essential documents used in onboarding, policy underwriting, claims, adjudication, and compliance
Having a direct, real-time overview of your processes to discover where bottlenecks and repetitions occur, where content needs to be processed, and where automation can be most effective
I missed last year’s Signavio Live event, and it turns out that it gave them a year’s head start on the virtual conference format now being adopted by other tech vendors. Now that everyone has switched to online conferences, many have decided to go the splashy-but-prerecorded route, which includes a lot of flying graphics and catchy music but canned presentations that fall a bit flat. Signavio has a low-key format of live presentations that started at 11am Sydney time with a presentation by Property Exchange Australia: I tuned in from my timezone at 9pm last night, stayed for the Deloitte Australia presentation, then took a break until the last part of the Coca-Cola European Partners presentation that started at 8am my time. In the meantime, there were continuous presentations from APAC and Europe, with the speakers all presenting live in their own regular business hours.
Signavio started their product journey with a really good process modeler, and now have process mining and some degree of operational monitoring for a more complete process intelligence suite. In his keynote, CEO Gero Decker talked about how the current pandemic — even as many countries start to emerge from it — is highlighting the need for operational resilience: companies need to design for flexibility, not just efficiency. For example, many companies are reinventing customer touchpoints, such as curbside delivery for retail as an alternative to in-store shopping, or virtual walk-throughs for looking at real estate. Companies are also reinventing products and services, allowing businesses that rely on in-person interactions to take their services online; I’ve been seeing this shift with everything from yoga classes to art gallery lectures. Decker highlighted two key factors to focus on in order to emerge from the crisis stronger: operational excellence, and customer experience. One without the other does not provide the benefit, but they need to be combined into the concept of “Customer Excellence”. In the Q&A, he discussed how many companies started really stepping up their process intelligence efforts in order to deal with the COVID-19 crisis, then realized that they should be doing this in the normal course of business.
There was a session with Jan ten Sythoff, Senior TEI Consultant at Forrester, and Signavio’s Global SVP of Customer Service, Stefan Krumnow, on the total economic impact of the Signavio Suite (TEI is the Forrester take on ROI). Krumnow started with the different factors that might be part of what a customer organization might be getting out of Signavio — RPA at scale, operational excellence, risk and compliance, ERP transformation, and customer excellence — then ten Sythoff discussed the specific TEI report that Forrester created for Signavio in October 2019 with a few updates for the current situation. The key quantifiable benefits identified by Forrester were external resources cost avoidance, higher efficiency in implementing new processes, and cost avoidance of alternative tools; they also found non-quantifiable benefits such as a better culture of process improvement across organizations. For their aggregate case study created from all of their interviews, they calculated a payback of less than six months for implementing Signavio: this would depend, of course, on how closely a particular organization matched their fictitious use case, which was a 100,000-employee company.
There are a number of additional sessions running until 5pm Eastern North American time; I might peek back in for a few of those, and will write another post if there’s anything of particular interest. I expect that everything will be available on demand after the event if you want to check out any of the sessions.
On the conference format, there is a single track of live presentations, and a Signavio moderator on each one to introduce the speaker and help wrangle the Q&A. Each presentation is 40 minutes plus 10 minutes of Q&A, with a 10-minute break between each one. Great format, schedule-wise, and the live sessions make it very engaging. They’re using GoToWebinar, and I’m using it on my tablet where it works really well: I can control the screen split between speaker video and slides (assuming the speaker is sharing their video), it supports multiple simultaneous speakers, I can see at the top of the screen who is talking in case I join a presentation after the introduction, and the moderator can collect feedback via polls and surveys. Because it’s a single track, it’s a single GTW link, allowing attendees to drop in and out easily throughout the day. The only thing missing is a proper discussion platform — I have mentioned this about several of the online conferences that I’ve attended, and liked what Camunda did with a Slack workspace that started before and continued after the conference — although you can ask questions via the GTW Question panel. To be fair, there is very little social media engagement (the Twitter hashtag for the conference is mostly me and Signavio people), so possibly the attendees wouldn’t get engaged in a full discussion platform either. Without audience engagement, a discussion platform can be a pretty lonely place. In summary, the GTW platform seems to behave well and is a streamlined experience if you don’t expect a lot of customer engagement, or you could use it with a separate discussion platform.
The second day of the Appian World 2020 virtual conference started with CTO Michael Beckley, who immediately set me straight on something that I assumed yesterday: at least some of the keynotes were pre-recorded, not live. So their statement on their website, that keynotes are “live” from 10am-noon, and other references to “live” keynotes just mean that they are being broadcast at that time, not being broadcast live. Since there’s no interaction with the audience during keynotes, it’s difficult to tell, and the content of most keynotes has been well done in any case. This may have been a special case for Beckley, since he has health conditions that make him higher risk, although this was still recorded in the Appian auditorium where there would have been some number of support staff.
Beckley went into more detail on the COVID-19 apps that they have developed, with a highlight on their latest Workforce Safety and Readiness that helps to manage how workers return to a workplace. He walked through the employee view of the app, where they can record their own health check information, plus the HR manager view that allows them to set the questions, policies and information that will be seen by the employees. They’ve put this together pretty quickly using their own low-code platform, and are offering it at a reasonable price to their customers.
Next up was a customer presentation by Michael Woolley, Principal of IT Retail Systems at The Vanguard Group. They’re a huge wealth management firm spread over several countries, and they’re building Appian applications including ones that will be used by 6,000 employees. It appears that they are replacing their legacy workflow system of 20 years, which has hundreds of workflows. [I think the legacy system may be an IBM FileNet system, since I have a memory of doing some work for Vanguard over 20 years ago to develop requirements and technical design for just such a system – flashback!] They wanted to move to a modern low-code cloud platform, and although their standard workflows are pretty straightforward financial services transactional flows, they are incorporating business rules as well as BPM and case management, and RPA for interacting with legacy line of business systems. They are also planning to include AI/ML within the case management stages. He discussed their basic architecture as well as their development organization, and finished with some best practices for large projects such as this: it’s a multi-year program that covers many different workflows, so isn’t a greenfield application and has complex migration considerations.
Deputy CTO Malcolm Ross returned to follow on from his talk yesterday, when he talked about AI and RPA, to discuss how they’re improving low-code development. He showed some pretty cool AI-augmented development that they are releasing in June, which looks at the design of a process as you’re building it, and recommends the next node that you will want to add based on the current content and goals of the process. I’m definitely interested in seeing where they go with this. He had a number of detailed product updates, including cloud security, details on testing/deployment cycles for application packages, and administrative tools such as (system) Health Check. They continue to push new features into their SAIL user interface layer, making it easier for developers to create new experiences on any form factor — one of the strikes against most low-code platforms is that their UI development is not as flexible as customers require, and Appian is definitely raising the bar on what’s possible. He finished up with their multi-channel communication add-ons, which allow the use of tools such as Twilio directly within an Appian application.
The final presentation of the morning keynote was Kristie Grinnell, Global CIO and Chief Supply Chain Officer at General Dynamics Information Technology with a presentation on how they are using Appian to help manage their 30,000 employees spread over 400 customer locations. They are a government contractor, and have to manage all things around being an outsourced IT company, such as assigning people to customer projects, timesheet adjustments and invoicing, while maintaining compliance and auditability. She spoke about some of their specific Appian applications that they have developed, and the benefits: an employee pay adjustment request application (to adjust people’s pay for when they work more hours than they were paid for) reduced backlog from three weeks to three days, and reduced errors. They also developed an international travel approval app (likely not getting used much these days), since most of their employees have a high security clearance and specific risks need to be managed during travel, which reduced the approval time from days to hours. Most of their applications to date have been administrative, but they are keen to look at how applying AI/ML to their existing data can help them to make better decisions in the future.
CMO Kevin Spurway and Malcolm Ross closed the keynotes with announcements of their awards to partners, resellers, app market contributors, and hackathon winners. On an optimistic note, Spurway announced that next year’s Appian World will be in San Diego, April 11-14, 2021. Here’s hoping.
This is the end of my Appian World 2020 coverage — some good information in the keynotes. As noted yesterday, the breakout session format isn’t sufficiently compelling to have me spend time there, but if you’re an Appian customer, you’ll probably find things of interest.
Another week, another virtual event! Appian World is happening two days this week, and will be available on-demand after. This has a huge number of sessions on several parallel tracks, which are pre-recorded, with keynotes in advance (not clear if the keynotes are actually live, or pre-recorded). From their site:
Keynote sessions are live from 10:00 AM – 12:00 PM EDT on May 12th and 13th. All breakout sessions will become available on-demand at 12:00 PM EDT on their scheduled day, immediately following the live keynote. Speakers will be available from 12:00 PM – 3:00 PM EDT for live Q&A on their scheduled session day.
They’re using the INXPO platform, and apparently using every possible feature. Here’s a bit of navigation help:
There’s a Lobby tab with a video greeting from Appian CMO Kevin Spurway. It has links to the agenda, solutions showcase and lounge, which is a bit superfluous since those are all top-level tabs visible at all times.
The Agenda tab lists the sessions for today, including the keynote (for some reason it showed as Live from 8:30am although the keynotes didn’t start until 10am), then all of the breakout sessions for the day, which you can dip into in any order since they are all pre-recorded and are made available at the same time.
The Sessions tab is where you can drill down and watch any of the sessions when they are live, but you can also do this directly from the Agenda tab. Sessions has them organized into tracks, such as Full Stack Automation Theater and Low-Code Development Theater.
The Solutions Showcase tab is a virtual exhibit hall, with booths for partners and a pavilion of Appian product booths. These can have a combination of pre-recorded video, documents to download, and links to chat with the exhibitors. It’s a bit overwhelming, although I suppose people will go through some of the virtual booths after the sessions, since the sessions run only 10-3 each day. I suspect that many of these partners signed on for Appian World before it moved to a virtual event, so Appian needed to provide a way for them to show their offerings.
The Lounge tab is a single-threaded chat for all attendees. Not a great forum for discussion: as I’ve mentioned on all of the other virtual conference coverage in the past couple of weeks, a separate discussion platform like Slack that allows for multi-threaded discussions where audience members can both lead and participate in discussions with each other is much, much better for audience engagement.
The Games tab has results for some games that they’re running — this is common at conferences, such as seeing how many people send out tweets with the conference hashtag, or getting your ID scanned by a certain number of booths, but not something that adds value to my conference experience.
The keynote speakers appeared on a stage in Appian’s own auditorium, empty except (supposedly) for each other and production staff. CEO Matt Calkins was up first, and talked about how the world has changed in 2020, and how their low-code application development can help with the changes that are being forced on organizations by the pandemic. He talked about the applications that they have built in the past couple of months: a COVID-19 workforce tracking app, a loan coordination app that uses AI and RPA for automation, and a workforce safety & readiness app that manages how businesses reopen as their workforce comes back to work. They have made these free or low-cost for their customers for the near term.
His theme for the keynote was automation: using human and digital workers, including RPA and AI, to get the best results. He mentioned BPM as part of their toolbox, and focused on the idea that the goal is to automate, and the specific tool doesn’t matter. They bought an RPA company and have rebranded it as AppianRPA: it’s cloud-native and Java-based, which is different from many other RPA products, but more appealing to the CIO-level buyer for organization-wide implementations. They are pushing an open agenda, where they can interact with other RPA products and cloud platforms: certainly as a BPM vendor, interaction with other automation tools is part of the fabric.
They have a few new things that I haven’t seen in briefings (to be fair, I think I’ve dropped off their analyst relations radar). Their “Automation Planner” can make recommendations for what type of automation to use for any particular task. Calkins also spoke about their intelligent document processing (IDP), which addresses what they believe is one of the biggest automation challenges that companies have today.
The Appian platform offers full-stack automation — workflow, case management, RPA, AI, business rules — with a “data anywhere” philosophy of integrating with systems to allow processing data in place, and the low-code development for which they have become known. If you’re a fan of the all-in-one proprietary platform, Appian is definitely one of the main contenders. They have a number of vertical solutions now, and are starting to offer standardized all-inclusive subscription pricing for different sizes of installations, which removes a lot of uncertainty about total cost of ownership. He also highlighted some of the vertical applications created by partners PWC, Accenture and KPMG.
I always like hearing Calkins talk (or chatting with him in person), because he’s smart and well-spoken, and ties together a lot of complex ideas well. He covered a lot of information about Appian products, direction, customers and partners in a 30-minute keynote, and it’s definitely worth watching the replay.
Next up was a “stories from the front line of COVID-19” panel featuring Darin Cline, EVP of Operations of Bank of the West (part of BNP Paribas), and Darren Blake, COO of Bexley Health Neighbourhood Care in the UK National Health Service, moderated by Appian co-founder Mark Wilson. This was done remotely rather than in their studio, with each of the three having an Appian World backdrop: a great branding idea that was similar to what Celonis did with their remote speakers at Celosphere, although each person’s backdrop also had their own company’s logo — nice touch.
Blake talked about how they saw the wave of COVID-19 coming based on data that they were seeing from around the world, and put plans in place to leverage their existing Appian-based staff tracker to implement emergency measures around staff management and redeployment. They support home-based services as well as their patients’ visits to medical facilities, and had to manage staff and patient visits for non-COVID ailments as well as COVID responses and even dedicated COVID testing clinics without risking cross-contamination. Cline talked about how they needed to change their operations to allow people to continue accessing banking services even with lockdowns that happened in their home state of California. He said this disruption has pushed them to become a much more agile organization, both in business and IT departments: this is one of those things that likely is never going back to how it was pre-COVID. He credited their use of Appian for low-code development as part of this, and said that they are now taking advantage of it as never before. Blake echoed that they also have become much more agile, creating and deploying new capabilities in their systems in a matter of a few days: the vision of all low-code, but rarely the reality.
Interesting to hear these customer stories, where they stepped up and started doing agile development in the low-code platform that they were already using, listening to the voice of the customer in cooperation with their business people, executives and implementation partners such as Appian. So many things that companies said were just not possible actually are: fast low-code implementation, work from home, and other changes that are here to stay. These are organizations that are going to hit the ground running as the pandemic recedes — as Blake points out, this is going to be with us for at least two years until a vaccine is created, and will have multiple waves — since they have experienced a digital revolution that has fundamentally changed how they work.
Great customer panel: often these are a bit dry and unfocused, but this one was fascinating since they’ve had a bit of time to track how the pandemic has impacted their business and how they’ve been able to respond to it. In both cases, this is the new normal: Cline explicitly said that they are never going back to having so many people in their offices again, since both their distributed workforce and their customers have embraced online interactions.
Next up was deputy CTO Malcolm Ross (who I fondly remember as providing my first Appian demo in 2006) with a product update. He showed a demo that included integration of RPA, AI, IDP, Salesforce and SAP within the low-code BPM framework that ties it all together. It’s been a while since I’ve had an Appian briefing, and there’s some nice new functionality for how integrations are created and configured with a minimum of coding. They have built-in integrations (i.e., “no code”) to many other systems. Their AI is powered by Google’s AI services, and includes all of the capabilities that you would find there, bundled into Appian’s environment. This “Appian AI” is at the core of their IDP offering, which does classification and data extraction on unstructured documents to map them into structured data: they provide a packaged use case with their product that includes manual correction when AI classification/extraction doesn’t have a sufficient level of confidence. Because there’s AI/ML behind IDP, it will become smarter as human operators correct the errors.
He went through a demo of their RPA, including how the bots can interact with other Appian automation components such as IDP. There is, as expected, another orchestration (process) model within RPA that shows the screen/task flow: it would be good if they could look at converging this modeling format with their BPM modeling, even though it would be a simple subset. Regardless, a lot of interesting capabilities here for management of robotic resources. If you’re an existing Appian customer, you’re probably already looking at their RPA. Even if you’re already using another RPA product, Appian’s Robotic Workforce Manager allows you to manage Blue Prism, Automation Anywhere and UiPath bots as well as AppianRPA bots.
The last part of the morning keynotes was a panel featuring Austan Goolsbee, former Chairman of President Obama’s White House Council of Economic Advisers, and Arthur Laffer, economist and member of President Reagan’s Economic Policy Advisory Board, moderated by Matt Calkins. This was billed as a “left versus right” economists’ discussion on how to reopen the (US) economy, and quickly lost my interest: it’s not that I’m not interested in the topic, but I prefer to hear a much wider set of opinions than these two Americans, who turned it into a political debate, flinging around phrases such as “Libertarian ideal”. Not really a good fit as part of a keynote at a tech vendor conference. I think this really highlights some of the differences between in-person and virtual conferences: virtual tech conferences should stick to their products and customers, and drop the “thought leaders” from unrelated areas. Celebrity speakers have a slight appeal to some attendees in person, but not so much in the virtual forum, even if they are live conversations. IBM Think had a couple of high-profile speakers that I skipped, since I can just go and watch their TED Talk or YouTube channel, and they didn’t really fit into the flow of the conference.
The remaining three hours of day 1 were (pre-recorded) breakout sessions available simultaneously on demand, with live Q&A with the speakers for the entire period. This allows them to have a large number of sessions — an overwhelming 30+ of them — but I expect that engagement for each specific session will be relatively low. It’s not clear if the Q&A with the speaker is private or shared with other people who happen to be looking at that session at the same time; even if it is shared, the session starts when you pop in, so everyone would be at a different point in the presentation and probably asking different questions. It looks like a similar lineup of breakout sessions will be available tomorrow for the afternoon portion, following another keynote.
I poked into a couple of the breakout sessions, but they’re just a video that starts playing from the beginning when you enter, with no way to engage with other audience members, and no motivation to watch at a particular time. I sent a question for one speaker off into the void, but never saw a response. Some of them are really short (I saw one that was 8 minutes) and others are longer (Neil Ward-Dutton‘s session was 36 minutes), but there’s no way to know how long each one is without starting it. This is a good way to push out a lot of content simultaneously, but there’s extremely low audience engagement. I was also invited to a “Canada Drop-In Centre” via Google Meet; I’m not that interested in a Canadian-specific experience, and a broader-based engagement platform (like Slack) would have been a better choice, possibly with separate channels for regions as well as for session discussions and Q&A. They also don’t seem to be providing slide decks for any of the presentations, which I like to have to remind me of what was said (or to skip back if I missed something).
This was originally planned as an in-person conference, and Appian had to pivot on relatively short notice. They did a great job with the keynotes, including a few of the Appian speakers appearing (appropriately distanced) in their own auditorium. The breakout sessions didn’t really grab me: too many, all pre-recorded, and you’re basically an audience of one when you’re in any of them, with little or no interactivity. Better as a set of on-demand training/content videos rather than true breakout sessions, and I’m sure there’s a lot of good content here for Appian customers or prospects to dig deeper into product capabilities but these could be packaged as a permanent library of content rather than a “conference”. The key for virtual conferences seems to be keeping it a bit simpler, with more timely and live sessions from one or two tracks only.
I’ll be back for tomorrow’s keynotes, and will have a look through the breakout sessions to see if there’s anything that I want to watch right now as opposed to just looking it up later.
This is now my third day attending IBM’s online Think 2020 conference: I attended the analyst preview on Monday, then the first day of the conference yesterday. We started the day with Mark Foster, SVP of IBM Services, giving a keynote on building resilient and smarter businesses. He pointed out that we are in uncertain times, and many companies are still figuring out whether to freeze new initiatives, or take advantage of this disruption to build smarter businesses that will be more competitive as we emerge from the pandemic. This message coming from a large software/services vendor is a bit self-serving, since they are probably seeing this quarter’s sales swirling down the drain, but I happen to agree with him: this is the time to be bold with digital transformation. He referred to what can be done with new technologies as “business process re-engineering on steroids”, and said that it’s more important than ever to build more intelligent processes to run our organizations. Resistance to change is at a particularly low point, except (in my experience) at the executive level: workers and managers are embracing the new ways of doing things, from virtual experiences to bots, although they may be hampered somewhat by skittish executives who think that change at a time of disruption is too risky, while holding the purse strings of that change.
He had a discussion with Svein Tore Holsether, CEO of Yara, a chemical company with a primary business in nitrogen crop fertilizers. They are also building informational platforms for sustainable farming, and providing apps such as a hyper-local farm weather app in India, since factors such as temperature and rainfall can vary greatly due to microclimates. The current pandemic means that they can no longer have their usual in-person meetings with farmers — apparently a million visits per year — but they are moving to virtual meetings to ensure that the farmers still have what they need to maximize their crop yields.
Foster was then joined by Carol Chen, VP of Global Lubricants Marketing at Shell. She talked about the specific needs of the mining industry for one of their new initiatives, leveraging the ability to aggregate data from multiple sources — many of them IoT — to make better decisions, such as predictive maintenance on equipment fleets. This allows the decisions about a mining operation to be made from a digital twin in the home office, rather than just by on-site operators who may not have the broader context: this improves decision quality and local safety.
He then talked to Michael Lindsey, Chief Transformation and Strategy Officer at PepsiCo North America, with a focus on their Frito-Lay operations. This operation has a huge fleet, controlling the supply chain from the potato farms to the store. Competition has driven them to offer a much broader range of products, in terms of content and flavors, to maintain their 90%+ penetration into the American household market. Previously, any change would have been driven from their head office, moving out to the fringes in a waterfall model. They now have several agile teams based on IBM’s Garage Methodology that are more distributed, taking input from field associates to learn what is needed at each point in the supply chain, driving need from the store shelves back to the production chain. The pandemic crisis means that they have had to move their daily/weekly team meetings online, but that has actually made them more inclusive by not requiring everyone to be in the same place. They have also had to adjust the delivery end of their supply chains in order to keep up with the demand for their products: based on my Facebook feed, there are a lot of people out there eating snacks at home, fueling a Frito-Lay boom.
Rob Thomas, SVP of IBM Cloud & Data Platform, gave a keynote on how AI and automation is changing how companies work. Some of this was a repeat from what we saw in the analyst preview, plus some interviews with customers including Mirco Bharpalania, Head of Data & Analytics at Lufthansa, and Mallory Freeman, Director of Data Science and Machine Learning in the Advanced Analytics Group at UPS. In both cases, they are using the huge amount of data that they collect — about airplanes and packages, respectively — to provide better insights into their operations, and perform optimization to improve scheduling and logistics.
He was then joined by Melissa Molstad, Director of Common Platforms, Data Strategy & Vendor Relations at PayPal. She spoke primarily about their AI-driven chatbots, with the virtual assistants handling 1.4M conversations per month. This relieves the load on customer service agents, especially for simple and common queries, which is especially helpful now that they have moved their customer service to distributed home-based work.
He discussed AIOps, which was already announced yesterday by Arvind Krishna; I posted a bit about that in yesterday’s post including some screenshots from a demo that we saw at the analyst preview on Monday. They inserted the video of Jessica Rockwood, VP of Development for Hybrid Multicloud Management, giving the same demo that we saw on Monday, worthwhile watching if you want to hear the entire narrative behind the screenshots.
Thomas’ last interview segment was with Aaron Levie, CEO of Box, and Stewart Butterfield, CEO of Slack, both ecosystem partners of IBM. Interesting that they chose to interview Slack rather than use it as an engagement channel for the conference attendees. ¯\_(ツ)_/¯ They both had interesting things to add on how work is changing with the push to remote cloud-based work, and the pressures on their companies from helping a lot of customers move to cloud-based collaboration all at once. There seems to be a general agreement (I also agree) that work is never going back to exactly how it was before, even when there is no longer a threat from the pandemic. We are learning new ways of working, and also learning that things that companies thought could not be done effectively — like work from home — actually work pretty well. Companies that embrace the changes and take advantage of the disruption can jump ahead on their digital transformation timeline by a couple of years. One of them quoted Roy Amara’s adage that “we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run”; as distributed work methods, automation and the supporting technology get a foothold now, they will have profound effects on how work will be done in the future. This is not going to be about which organizations have the most money to spend: it will hinge on the ability and will to embrace AI and automation to remake intelligent end-to-end processes. Software vendors will need to accept the fact that customers want to do best-of-breed assembly of services from different vendors, meaning that the vendors that integrate into a standard fabric are going to do much better in the long run.
I switched over to the industry/customer channel to hear a conversation between Donovan Roos, VP of Enterprise Automation at US Bank, and Monique Ouellette, VP of Global Digital Workplace Solutions at IBM. She invited us at the beginning to submit questions, so this may have been one of the few sessions that was not prerecorded, although they never seemed to take any audience questions, so I’m not sure. Certainly much lower audio and video quality than most of the prerecorded sessions. US Bank has implemented Watson AI-driven chatbots for internal and external service requests, and has greatly reduced wait times for requests where a chatbot can assist with self-service rather than waiting for a live agent. Roos mentioned that they really make use of the IBM-curated content that comes as part of the Watson platform, and many of the issues are resolved without even hitting internal knowledge bases. Like many other banks during the current crisis, they have had to scale up their ability to process small business loans; although he had some interesting things to say about how they scaled up their customer service operations using AI chatbots, I would also be interested to hear how they have scaled up the back-end processes. He did mention that you need to clean up your business processes before starting to apply AI, but gave no specifics.
I stayed on the industry channel for a presentation on AI in insurance by Sandeep Bajaj, CIO of Everest Re Group. I do quite a bit of work with insurance companies as a technical strategist/architect, so have some good insights into how their business works, and Bajaj started with the very accurate statement that insurance is an information-driven industry: not only standard business information, but also IoT and telematics data, especially for vehicle and P&C coverage. This provides great opportunities for better insights and decisions based on AI that leverages that data. He believes that AI is no longer optional in insurance because of the many factors and data sources involved in decisions. He did discuss the necessity of reviewing and improving your business processes to find opportunities for AI: it’s not a silver bullet, and needs relatively clean processes to start with — the same message that we heard from US Bank in the previous presentation. Everest reviewed some of their underwriting processes and split the automation opportunities between robotic process automation and AI, although I would have thought that using them together, along with other automation technologies, could provide a better solution. They used an incremental approach, which let them see results sooner and feed initial results back into ongoing development. One side benefit is that they now capture much more of the unstructured information from each submission, whereas previously they would only capture the information entered for those submissions that led to business; this allows them to target their marketing and pricing accordingly. They’re starting to use AI-driven processes for claims first notice of loss (FNOL is a classic claims problem) in addition to underwriting, and are seeing operational efficiency improvements as well as better accuracy and time to market. Looking ahead, he sees that AI is here to stay in their organization, since it’s providing proven value.
Really good case study; worth watching if you’re in the insurance business and want to see how AI can be applied effectively.
IBM had to pivot to a virtual format relatively quickly since they already had a huge in-person conference scheduled for this time, but they could have done better both for content and format given the resources that they have available to pour into this event. Everyone is learning from this experience of being forced to move events online, and the smaller companies are (not surprisingly) much more agile in adapting to this new normal. I’ll be at the virtual Appian World next week, then will write an initial post on virtual conference best — and worst — practices that I’ve seen over the five events that I’ve attended recently. In the weeks following that, I’ll be attending Signavio Live, PegaWorld iNspire and DecisionCAMP, so will have a chance to add on any new things that I see in those events.