How do you get a handle on your company’s disrupted processes? How do you get real-time visibility into your organization’s strengths and weaknesses? How do you confidently chart a path to the future? The key is process intelligence: seeing your processes clearly and understanding what is actually happening versus what’s supposed to happen.
For example, your order-to-cash process is showing increased sales but decreasing customer satisfaction. Why? What is the root cause? Or, you have an opportunity to offer a new product but aren’t sure if your manufacturing process can handle it. To make this decision, you need a clear line of sight into what your organization can do. These areas are where process intelligence shines.
This webinar will help you answer questions like these, showing you – with examples – how process intelligence can help you drive real business results.
Rather than my usual focus on process automation, I’m digging a bit more into the process analysis side, particularly around process mining. With many businesses now operating with a largely distributed workforce, processes have changed, and it’s not possible to do Gemba walks or job shadowing to collect information on what the adjusted processes look like. Process mining and task mining provide the capabilities to do that remotely and accurately, identify any problems with conformance/compliance, and discover root causes. You can sign up at the link above to attend or receive the on-demand replay after the event.
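As a concrete illustration of what that conformance checking involves, here’s a minimal sketch using the open-source pm4py library (not any of the vendors’ products discussed in these posts), assuming you can export an XES event log from your systems; the file name is a hypothetical stand-in.

```python
# Minimal process mining sketch with the open-source pm4py library:
# discover a model from an event log, then replay the log against it
# to flag non-conforming cases. "order_to_cash.xes" is a hypothetical
# export from your own systems.
import pm4py

log = pm4py.read_xes("order_to_cash.xes")

# Discover a Petri net model of the as-is process from the log
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

# Token-based replay flags the cases that deviate from the discovered model
diagnostics = pm4py.conformance_diagnostics_token_based_replay(
    log, net, initial_marking, final_marking)

non_conforming = [d for d in diagnostics if not d["trace_is_fit"]]
print(f"{len(non_conforming)} of {len(diagnostics)} cases deviate from the model")
```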
I also posted last week about the webinar that I’m presenting on Wednesday for ABBYY on digital intelligence in the insurance industry, which is a related but different spin on the same issue: how are processes changing now, and what methodologies and technologies are available to handle this disruption. In case it’s not obvious, I don’t work for either of these vendors (who have some overlap in products) but provide “thought leadership” presentations to help introduce and clarify concepts for audiences. Looking forward to seeing everyone on either or both of these webinars later this week.
PegaWorld, in shifting from an in-person to a virtual event, dropped down to a short 2.5 hours. The keynotes and many of the breakouts appeared to be mostly pre-recorded, hosted live by CTO Don Schuerman, who provided some welcome comic relief and moderated live Q&A with each of the speakers after their session.
The first session was a short keynote with CEO Alan Trefler. It’s been a while since I’ve had a briefing with Pega, and their message has shifted strongly to the combination of AI and case management as the core of their digital platform capabilities. Trefler also announced Pega Process Fabric, which allows the integration of multiple systems, not just from Pega but from other vendors as well.
Next up was SVP of Products Kerim Akgonul, discussing their low-code Pega Express approach and how it’s helping customers to stand up applications faster. We heard briefly from Anna Gleiss, Global IT Head of Master Data Management at Siemens, who talked about how they are leveraging Pega to ensure reusability and speed deployment across the 30 different applications that they’re running in the Pega Cloud. Akgonul continued with use cases for self-service — especially important with the explosion in customer service in some industries due to the pandemic — and some of their customers such as Aflac who are using Pega to further their self-service efforts.
There was a keynote by Rich Gilbert, Chief Digital and Information Officer at Aflac, on the reinvention that they have gone through. There’s a lot of disruption in the insurance industry now, and they’ve been addressing this by creating a service-based operating model to deliver digital services as a collaboration between business and IT. They’ve been using Pega to help them with their key business drivers of settling claims faster and providing excellent customer service with offerings such as “Claims Guest Checkout”, which lets someone initiate a claim through self-service without knowing their policy number or logging in, and a Claims Status Tracker available on their mobile app or website. They’ve created a new customer service experience using a combination of live chat and virtual assistants, the latter of which is resolving 86% of inquiries without moving to a live agent.
Akgonul also provided a bit more information on the Process Fabric, which acts as a universal task manager for individual workers, with a work management dashboard for managers. There was no live Q&A at this point; instead, it was deferred to a Tech Talk later in the agenda. In the interim was a one-hour block of breakouts that had one track of three live build sessions, plus a large number of short prerecorded sessions from Pega, partners and customers. I’m interested in more information on the Process Fabric, which I believe will be covered in the later Tech Talk, although I did grab some screenshots from Akgonul’s keynote:
The live build sessions seemed to be overloaded and there was a long delay getting into them, but once started, they were good-quality demos of building Pega applications. I came in part way through the first one on low-code using App Studio, and it was quite interactive, with a moderator dropping in occasionally with live questions, and eventually hurrying the presenter along to finish on time. I was only going to stay for a couple of minutes, but it was pretty engaging and I watched all of it. The next live demo was on data and integration, and built on the previous demo’s vehicle fleet manager use case to add data from a variety of back-end sources. The visuals were fun, too: the presenter’s demo was most of the screen, with a bubble at the bottom right containing a video of the speaker, then a bubble popping in at the bottom left with the moderator when he had a question or comment. Questions from the audience helped to drive the presentation, making it very interactive. The third live demo was on user experience, which had a few connectivity issues so I’m not sure we saw the entire demo as planned, but it showed the creation of the user interface for the vehicle manager app using the Cosmos system, moving a lot of logic out of the UI and into the case model.
The final session was the Tech Talk on product vision and roadmap with Kerim Akgonul, moderated by Stephanie Louis, Senior Director of Pega’s Community and Developer Programs. He discussed Process Fabric, Project Phoenix, Cosmos and other new product releases in addition to fielding questions from social media and Pega’s online community. This was very interactive and engaging, much more so than his earlier keynote which seemed a bit stiff and over-rehearsed. More of this format, please.
In general, I didn’t find the prerecorded sessions to be very compelling. Conference organizers may think that prerecording sessions reduces risk, but it also reduces spontaneity and energy from the presenters, which is a lot of what makes live presentations work so well. The live Q&A interspersed with the keynotes was okay, and the live demos in the middle breakout section as well as the live Tech Talk were really good. PegaWorld also benefited from Pega’s own online community, which provided a more comprehensive discussion platform than the broadcast platform chat or Q&A. If you missed today’s event, you should be able to find all of the content on demand on the PegaWorld site within the next day or two.
I have a long history working with insurance companies on their digitization and process automation initiatives, and there are a lot of interesting things happening in insurance as a result of the pandemic and associated lockdown: more automation of underwriting and claims, increased use of digital documents instead of paper, and trying to discover the “new normal” in insurance processes as we move to a world where the workforce will remain, at least in part, distributed for some time to come. At the same time, there is an increase in some types of insurance business activity, and decreases in other areas, requiring reallocation of resources.
On June 17, I’ll be presenting a webinar for ABBYY on some of the ways that insurance companies can navigate this perfect storm of business and societal disruption using digital intelligence technologies including smarter content capture and process intelligence. Here’s what we plan to cover:
Helping you understand how to transform processes, instead of falling into the trap of just automating existing, often broken processes
Getting your organization one step ahead of your competition with the latest content intelligence capabilities that help transform your customer experience and operational effectiveness
Completely automating your handling of essential documents used in onboarding, policy underwriting, claims, adjudication, and compliance
Having a direct, real-time overview of your processes to discover where bottlenecks and repetitions occur, where content needs to be processed, and where automation can be most effective
I missed last year’s Signavio Live event, and it turns out that it gave them a year’s head start on the virtual conference format now being adopted by other tech vendors. Now that everyone has switched to online conferences, many have decided to go the splashy-but-prerecorded route, with a lot of flying graphics and catchy music but canned presentations that fall a bit flat. Signavio has a low-key format of live presentations that started at 11am Sydney time with a presentation by Property Exchange Australia: I tuned in from my timezone at 9pm last night, stayed for the Deloitte Australia presentation, then took a break until the last part of the Coca-Cola European Partners presentation that started at 8am my time. In the meantime, there were continuous presentations from APAC and Europe, with the speakers all presenting live during their own regular business hours.
Signavio started their product journey with a really good process modeler, and now have process mining and some degree of operational monitoring for a more complete process intelligence suite. In his keynote, CEO Gero Decker talked about how the current pandemic — even as many countries start to emerge from it — is highlighting the need for operational resilience: companies need to design for flexibility, not just efficiency. For example, many companies are reinventing customer touchpoints, such as curbside delivery for retail as an alternative to in-store shopping, or virtual walk-throughs for looking at real estate. Companies are also reinventing products and services, allowing businesses that rely on in-person interactions to take their services online; I’ve been seeing this shift with everything from yoga classes to art gallery lectures. Decker highlighted two key factors to focus on in order to emerge from the crisis stronger: operational excellence, and customer experience. One without the other does not provide the benefit, but they need to be combined into the concept of “Customer Excellence”. In the Q&A, he discussed how many companies started really stepping up their process intelligence efforts in order to deal with the COVID-19 crisis, then realized that they should be doing this in the normal course of business.
There was a session with Jan ten Sythoff, Senior TEI Consultant at Forrester, and Signavio’s Global SVP of Customer Service, Stefan Krumnow, on the total economic impact of the Signavio Suite (TEI is the Forrester take on ROI). Krumnow started with the different factors that might drive what a customer organization gets out of Signavio — RPA at scale, operational excellence, risk and compliance, ERP transformation, and customer excellence — then ten Sythoff discussed the specific TEI report that Forrester created for Signavio in October 2019 with a few updates for the current situation. The key quantifiable benefits identified by Forrester were external resources cost avoidance, higher efficiency in implementing new processes, and cost avoidance of alternative tools; they also found non-quantifiable benefits such as a better culture of process improvement across organizations. For their aggregate case study created from all of their interviews, they calculated a payback of less than six months for implementing Signavio: this would depend, of course, on how closely a particular organization matched their fictitious use case, which was a 100,000-employee company.
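For readers who haven’t seen a TEI report, the payback claim boils down to simple arithmetic: divide the implementation investment by the monthly net benefit. The numbers below are purely hypothetical placeholders, not figures from Forrester’s study.

```python
# Payback-period arithmetic behind a TEI/ROI-style claim.
# All figures are hypothetical, for illustration only.
initial_investment = 600_000   # licences, implementation, training
monthly_net_benefit = 120_000  # cost avoidance plus efficiency gains per month

payback_months = initial_investment / monthly_net_benefit
print(f"Payback period: {payback_months:.1f} months")  # 5.0 months
```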
There are a number of additional sessions running until 5pm Eastern North American time; I might peek back in for a few of those, and will write another post if there’s anything of particular interest. I expect that everything will be available on demand after the event if you want to check out any of the sessions.
On the conference format, there is a single track of live presentations, and a Signavio moderator on each one to introduce the speaker and help wrangle the Q&A. Each presentation is 40 minutes plus 10 minutes of Q&A, with a 10-minute break between each one. Great format, schedule-wise, and the live sessions make it very engaging. They’re using GoToWebinar, and I’m using it on my tablet where it works really well: I can control the screen split between speaker video and slides (assuming the speaker is sharing their video), it supports multiple simultaneous speakers, I can see at the top of the screen who is talking in case I join a presentation after the introduction, and the moderator can collect feedback via polls and surveys. Because it’s a single track, it’s a single GTW link, allowing attendees to drop in and out easily throughout the day. The only thing missing is a proper discussion platform — I have mentioned this about several of the online conferences that I’ve attended, and liked what Camunda did with a Slack workspace that started before and continued after the conference — although you can ask questions via the GTW Question panel. To be fair, there is very little social media engagement (the Twitter hashtag for the conference is mostly me and Signavio people), so possibly the attendees wouldn’t get engaged in a full discussion platform either. Without audience engagement, a discussion platform can be a pretty lonely place. In summary, the GTW platform seems to behave well and is a streamlined experience if you don’t expect a lot of customer engagement, or you could use it with a separate discussion platform.
Disclaimer: Signavio is my customer, and I’ve created several webinars for them over the past few months. We have another one coming up next month on Process Intelligence. However, they have not compensated me in any way for listening in on Signavio Live today, or writing about it here.
The second day of the Appian World 2020 virtual conference started with CTO Michael Beckley, who immediately set me straight on something that I assumed yesterday: at least some of the keynotes were pre-recorded, not live. So their statement on their website, that keynotes are “live” from 10am-noon, and other references to “live” keynotes just mean that they are being broadcast at that time, not being broadcast live. Since there’s no interaction with the audience during keynotes, it’s difficult to tell, and the content of most keynotes has been well done in any case. This may have been a special case for Beckley, since he has health conditions that make him higher risk, although this was still recorded in the Appian auditorium where there would have been some number of support staff.
Beckley went into more detail on the COVID-19 apps that they have developed, with a highlight on their latest Workforce Safety and Readiness that helps to manage how workers return to a workplace. He walked through the employee view of the app, where they can record their own health check information, plus the HR manager view that allows them to set the questions, policies and information that will be seen by the employees. They’ve put this together pretty quickly using their own low-code platform, and are offering it at a reasonable price to their customers.
Next up was a customer presentation by Michael Woolley, Principal of IT Retail Systems at The Vanguard Group. They’re a huge wealth management firm spread over several countries, and they’re building Appian applications including ones that will be used by 6,000 employees. It appears that they are replacing their legacy workflow system of 20 years, which has hundreds of workflows. [I think the legacy system may be an IBM FileNet system, since I have a memory of doing some work for Vanguard over 20 years ago to develop requirements and technical design for just such a system – flashback!] They wanted to move to a modern low-code cloud platform, and although their standard workflow is pretty straightforward financial services transactional flows, they are incorporating business rules as well as BPM and case management, and RPA for interacting with legacy line of business systems. They are also planning to include AI/ML within the case management stages. He discussed their basic architecture as well as their development organization, and finished with some best practices for large projects such as this: it’s a multi-year program that covers many different workflows, so it isn’t a greenfield application and has complex migration considerations.
Deputy CTO Malcolm Ross returned to follow on from his talk yesterday, when he talked about AI and RPA, to discuss how they’re improving low-code development. He showed some pretty cool AI-augmented development that they are releasing in June, which looks at the design of a process as you’re building it, and recommends the next node that you will want to add based on the current content and goals of the process. I’m definitely interested in seeing where they go with this. He had a number of detailed product updates, including cloud security, details on testing/deployment cycles for application packages, and administrative tools such as (system) Health Check. They continue to push new features into their SAIL user interface layer, making it easier for developers to create new experiences on any form factor — one of the strikes against most low-code platforms is that their UI development is not as flexible as customers require, and Appian is definitely raising the bar on what’s possible. He finished up with their multi-channel communication add-ons, which allow the use of tools such as Twilio directly within an Appian application.
The final presentation of the morning keynote was Kristie Grinnell, Global CIO and Chief Supply Chain Officer at General Dynamics Information Technology with a presentation on how they are using Appian to help manage their 30,000 employees spread over 400 customer locations. They are a government contractor, and have to manage all things around being an outsourced IT company, such as assigning people to customer projects, timesheet adjustments and invoicing, while maintaining compliance and auditability. She spoke about some of their specific Appian applications that they have developed, and the benefits: an employee pay adjustment request application (to adjust people’s pay for when they work more hours than they were paid for) reduced backlog from three weeks to three days, and reduced errors. They also developed an international travel approval app (likely not getting used much these days), since most of their employees have a high security clearance and specific risks need to be managed during travel, which reduced the approval time from days to hours. Most of their applications to date have been administrative, but they are keen to look at how applying AI/ML to their existing data can help them to make better decisions in the future.
CMO Kevin Spurway and Malcolm Ross closed the keynotes with announcements of their awards to partners, resellers, app market contributors, and hackathon winners. On an optimistic note, Spurway announced that next year’s Appian World will be in San Diego, April 11-14, 2021. Here’s hoping.
This is the end of my Appian World 2020 coverage — some good information in the keynotes. As noted yesterday, the breakout session format isn’t sufficiently compelling to have me spend time there, but if you’re an Appian customer, you’ll probably find things of interest.
Another week, another virtual event! Appian World is happening over two days this week, and will be available on-demand after. This has a huge number of sessions on several parallel tracks, which are pre-recorded, with keynotes beforehand (not clear if the keynotes are actually live, or pre-recorded). From their site:
Keynote sessions are live from 10:00 AM – 12:00 PM EDT on May 12th and 13th. All breakout sessions will become available on-demand at 12:00 PM EDT on their scheduled day, immediately following the live keynote. Speakers will be available from 12:00 PM – 3:00 PM EDT for live Q&A on their scheduled session day.
They’re using the INXPO platform, and apparently using every possible feature. Here’s a bit of navigation help:
There’s a Lobby tab with a video greeting from Appian CMO Kevin Spurway. It has links to the agenda, solutions showcase and lounge, which is a bit superfluous since those are all top-level tabs visible at all times.
The Agenda tab lists the sessions for today, including the keynote (for some reason it showed as Live from 8:30am although the keynotes didn’t start until 10am), then all of the breakout sessions for the day, which you can dip into in any order since they are all pre-recorded and are made available at the same time.
The Sessions tab is where you can drill down and watch any of the sessions when they are live, but you can also do this directly from the Agenda tab. Sessions has them organized into tracks, such as Full Stack Automation Theater and Low-Code Development Theater.
The Solutions Showcase tab is a virtual exhibit hall, with booths for partners and a pavilion of Appian product booths. These can have a combination of pre-recorded video, documents to download, and links to chat with them. It’s a bit overwhelming, although I suppose people will go through some of the virtual booths after the sessions, since the sessions run only from 10 to 3 each day. I expect that many of these partners signed on for Appian World before it moved to a virtual event, so Appian needed to provide a way for them to show their offerings.
The Lounge tab is a single-threaded chat for all attendees. Not a great forum for discussion: as I’ve mentioned on all of the other virtual conference coverage in the past couple of weeks, a separate discussion platform like Slack that allows for multi-threaded discussions where audience members can both lead and participate in discussions with each other is much, much better for audience engagement.
The Games tab has results for some games that they’re running — this is common at conferences, such as tracking how many people send out tweets with the conference hashtag, or whether you’ve had your ID scanned at a certain number of booths, but it’s not something that adds value to my conference experience.
The keynote speakers appeared on a stage in Appian’s own auditorium, empty (except supposedly for each other and production staff). CEO Matt Calkins was up first, and talked about how the world has changed in 2020, and how their low-code application development can help with the changes that are being forced on organizations by the pandemic. He talked about the applications that they have built in the past couple of months: a COVID-19 workforce tracking app, a loan coordination app that uses AI and RPA for automation, and a workforce safety and readiness app that manages how businesses bring their workforce back into the workplace as they reopen. They have made these free or low-cost for their customers for the near term.
His theme for the keynote was automation: using human and digital workers, including RPA and AI, to get the best results. He mentioned BPM as part of their toolbox, and focused on the idea that the goal is to automate, and the specific tool doesn’t matter. They bought an RPA company and have rebranded it as AppianRPA: it’s cloud-native and Java-based, which is different from many other RPA products, but is more appealing to the CIO-level buyer for organization-wide implementations. They are pushing an open agenda, where they can interact with other RPA products and cloud platforms: certainly for a BPM vendor, interaction with other automation tools is part of the fabric.
They have a few new things that I haven’t seen in briefings (to be fair, I think I’ve dropped off their analyst relations radar). Their “Automation Planner” can make recommendations for what type of automation to use for any particular task. Calkins also spoke about their intelligent document processing (IDP), which addresses what they believe is one of the biggest automation challenges that companies have today.
The Appian platform offers full-stack automation — workflow, case management, RPA, AI, business rules — with a “data anywhere” philosophy of integrating with systems to allow processing data in place, and their low-code development for which they have become known. If you’re a fan of the all-in-one proprietary platform, Appian is definitely one of the main contenders. They have a number of vertical solutions now, and are starting to offer standardized all-inclusive subscription pricing for different sizes of installations that removes a lot of uncertainty about total cost of ownership. He also highlighted some of the vertical applications created by partners PWC, Accenture and KPMG.
I always like hearing Calkins talk (or chatting with him in person), because he’s smart and well-spoken, and ties together a lot of complex ideas well. He covered a lot of information about Appian products, direction, customers and partners in a 30-minute keynote, and it’s definitely worth watching the replay.
Next up was a “stories from the front line of COVID-19” panel featuring Darin Cline, EVP of Operations of Bank of the West (part of BNP Paribas), and Darren Blake, COO of Bexley Health Neighbourhood Care in the UK National Health Service, moderated by Appian co-founder Mark Wilson. This was done remotely rather than in their studio, with each of the three having an Appian World backdrop: a great branding idea that was similar to what Celonis did with their remote speakers at Celosphere, although each person’s backdrop also had their own company’s logo — nice touch.
Blake talked about how they saw the wave of COVID-19 coming based on data that they were seeing from around the world, and put plans in place to leverage their existing Appian-based staff tracker to implement emergency measures around staff management and redeployment. They support home-based services as well as their patients’ visits to medical facilities, and had to manage staff and patient visits for non-COVID ailments as well as COVID responses and even dedicated COVID testing clinics without risking cross-contamination. Cline talked about how they needed to change their operations to allow people to continue accessing banking services even with lockdowns that happened in their home state of California. He said this disruption has pushed them to become a much more agile organization, both in business and IT departments: this is one of those things that likely is never going back to how it was pre-COVID. He credited their use of Appian for low-code development as part of this, and said that they are now taking advantage of it as never before. Blake echoed that they also have become much more agile, creating and deploying new capabilities in their systems in a matter of a few days: the low-code vision, but rarely the reality.
Interesting to hear these customer stories, where they stepped up and started doing agile development in the low-code platform that they were already using, listening to the voice of the customer in cooperation with their business people, executives and implementation partners such as Appian. So many things that companies said were just not possible actually are: fast low-code implementation, work from home, and other changes that are here to stay. These are organizations that are going to hit the ground running as the pandemic recedes — as Blake points out, this is going to be with us for at least two years until a vaccine is created, and will have multiple waves — since they have experienced a digital revolution that has fundamentally changed how they work.
Great customer panel: often these are a bit dry and unfocused, but this one was fascinating since they’ve had a bit of time to track how the pandemic has impacted their business and how they’ve been able to respond to it. In both cases, this is the new normal: Cline explicitly said that they are never going back to having so many people in their offices again, since both their distributed workforce and their customers have embraced online interactions.
Next up was deputy CTO Malcolm Ross (who I fondly remember as providing my first Appian demo in 2006) with a product update. He showed a demo that included integration of RPA, AI, IDP, Salesforce and SAP within the low-code BPM framework that ties it all together. It’s been a while since I’ve had an Appian briefing, and there’s some nice new functionality for how integrations are created and configured with a minimum of coding. They have built-in integrations (i.e., “no code”) to many different other systems. Their AI is powered by Google’s AI services, and includes all of the capabilities that you would find there, bundled into Appian’s environment. This “Appian AI” is at the core of their IDP offering, which does classification and data extraction on unstructured documents, to map into structured data: they have a packaged use case that they provide with their product that includes manual correction when AI classification/extraction doesn’t have a sufficient level of confidence. Because there’s AI/ML behind IDP, it will become smarter as human operators correct the errors.
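As a generic illustration of that confidence-threshold pattern (this is not Appian’s actual API, which I haven’t seen), a document classification result might be routed something like the sketch below; the function names and the 0.85 threshold are hypothetical.

```python
# Generic sketch of confidence-based routing in an IDP flow: documents whose
# AI classification/extraction falls below a threshold are routed to a human
# review queue, and the corrections can later feed back into model training.
# Illustrative only; not Appian's API. Names and threshold are hypothetical.
CONFIDENCE_THRESHOLD = 0.85

def route_document(doc_id: str, doc_type: str, fields: dict, confidence: float) -> None:
    if confidence >= CONFIDENCE_THRESHOLD:
        post_to_case(doc_id, doc_type, fields)          # straight-through processing
    else:
        send_to_review_queue(doc_id, doc_type, fields)  # manual correction task

def post_to_case(doc_id, doc_type, fields):
    print(f"{doc_id}: auto-processed as {doc_type}")

def send_to_review_queue(doc_id, doc_type, fields):
    print(f"{doc_id}: routed to human review ({doc_type}, low confidence)")

route_document("DOC-001", "claim form", {"policy_number": "A123"}, 0.93)
route_document("DOC-002", "invoice", {"amount": "1,250.00"}, 0.61)
```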
He went through a demo of their RPA, including how the bots can interact with other Appian automation components such as IDP. There is, as expected, another orchestration (process) model within RPA that shows the screen/task flow: it would be good if they could look at converging this modeling format with their BPM modeling, even though it would be a simple subset. Regardless, a lot of interesting capabilities here for management of robotic resources. If you’re an existing Appian customer, you’re probably already looking at their RPA. Even if you’re already using another RPA product, Appian’s Robotic Workforce Manager allows you to manage Blue Prism, Automation Anywhere and UiPath bots as well as AppianRPA bots.
The last part of the morning keynotes was a panel featuring Austan Goolsbee, Former Chairman of President Obama’s White House Council of Economic Advisers, and Arthur Laffer, Economist and Member of President Reagan’s Economic Policy Advisory Board, moderated by Matt Calkins. This was billed as a “left versus right” economists’ discussion on how to reopen the (US) economy, and quickly lost my interest: it’s not that I’m not interested in the topic, but I prefer to find a much wider set of opinions than these two Americans who turned it into a political debate, flinging around phrases such as “Libertarian ideal”. Not really a good fit as part of a keynote at a tech vendor conference. I think this really highlights some of the differences between in-person and virtual conferences: the virtual tech conferences should stick to their products and customers, and drop the “thought leaders” from unrelated areas. The celebrity speakers have a slight appeal to some attendees in person, but not so much in the virtual forum even if they are live conversations. IBM Think had a couple of high profile speakers that I skipped, since I can just go and watch their TED Talk or YouTube channel, and they didn’t really fit into the flow of the conference.
The remaining three hours of day 1 were (pre-recorded) breakout sessions available simultaneously on demand, with live Q&A with the speakers for the entire period. This allows them to have a large number of sessions — an overwhelming 30+ of them — but I expect that engagement for each specific session will be relatively low. It’s not clear if the Q&A with the speaker is private or if you would share the same Q&A with other people who happened to be looking at that session at the same time; even if it were shared, the session starts when you pop in, so everyone would be at a different point in the presentation and probably asking different questions. It looks like a similar lineup of breakout sessions will be available tomorrow for the afternoon portion, following another keynote.
I poked into a couple of the breakout sessions, but they’re just a video that starts playing from the beginning when you enter, no way to engage with other audience members, and no motivation to watch at a particular time. I sent a question for one speaker off into the void, but never saw a response. Some of them are really short (I saw one that was 8 minutes) and others are longer (Neil Ward-Dutton‘s session was 36 minutes) but there’s no way to know how long each one is without starting it. This is a good way to push out a lot of content simultaneously, but there’s extremely low audience engagement. I was also invited to a “Canada Drop-In Centre” via Google Meet; I’m not that interested in any sort of Canadian-specific experience, and a broader-based engagement platform (like Slack) would have been a better choice, possibly with separate channels for regions but also session discussions and Q&A. They also don’t seem to be providing slide decks for any of the presentations, which I like to have to remind me of what was said (or to skip back if I missed something).
This was originally planned as an in-person conference, and Appian had to pivot on relatively short notice. They did a great job with the keynotes, including a few of the Appian speakers appearing (appropriately distanced) in their own auditorium. The breakout sessions didn’t really grab me: too many, all pre-recorded, and you’re basically an audience of one when you’re in any of them, with little or no interactivity. Better as a set of on-demand training/content videos rather than true breakout sessions, and I’m sure there’s a lot of good content here for Appian customers or prospects to dig deeper into product capabilities but these could be packaged as a permanent library of content rather than a “conference”. The key for virtual conferences seems to be keeping it a bit simpler, with more timely and live sessions from one or two tracks only.
I’ll be back for tomorrow’s keynotes, and will have a look through the breakout sessions to see if there’s anything that I want to watch right now as opposed to just looking it up later.
This is now my third day attending IBM’s online Think 2020 conference: I attended the analyst preview on Monday, then the first day of the conference yesterday. We started the day with Mark Foster, SVP of IBM Services, giving a keynote on building resilient and smarter businesses. He pointed out that we are in uncertain times, and many companies are still figuring out whether to freeze new initiatives, or take advantage of this disruption to build smarter businesses that will be more competitive as we emerge from the pandemic. This message coming from a large software/services vendor is a bit self-serving, since they are probably seeing this quarter’s sales swirling down the drain, but I happen to agree with him: this is the time to be bold with digital transformation. He referred to what can be done with new technologies as “business process re-engineering on steroids”, and said that it’s more important than ever to build more intelligent processes to run our organizations. Resistance to change is at a particularly low point, except (in my experience) at the executive level: workers and managers are embracing the new ways of doing things, from virtual experiences to bots, although they may be hampered somewhat by skittish executives who think that change at a time of disruption is too risky, while holding the purse strings of that change.
He had a discussion with Svein Tore Holsether, CEO of Yara, a chemical company with a primary business in nitrogen crop fertilizers. They are also building informational platforms for sustainable farming, and providing apps such as a hyper-local farm weather app in India, since factors such as temperature and rainfall can vary greatly due to microclimates. The current pandemic means that they can no longer have their usual meetings with farmers — apparently a million visits per year — but they are moving to virtual meetings to ensure that the farmers still have what they need to maximize their crop yields.
Foster was then joined by Carol Chen, VP of Global Lubricants Marketing at Shell. She talked about the specific needs of the mining industry for one of their new initiatives, leveraging the ability to aggregate data from multiple sources — many of them IoT — to make better decisions, such as predictive maintenance on equipment fleets. This allows the decisions about a mining operation to be made from a digital twin in the home office, rather than just by on-site operators who may not have the broader context: this improves decision quality and local safety.
He then talked to Michael Lindsey, Chief Transformation and Strategy Officer at PepsiCo North America, with a focus on their Frito-Lay operations. This operation has a huge fleet, controlling the supply chain from the potato farms to the store. Competition has driven them to have a much broader range of products, in terms of content and flavors, to maintain their 90%+ penetration into the American household market. Previously, any change would have been driven from their head office, moving out to the fringes in a waterfall model. They now have several agile teams based on IBM’s Garage Methodology that are more distributed, taking input from field associates to know what is needed at each point in the supply chain, driving need from the store shelves back to the production chain. The pandemic crisis means that they have had to move their daily/weekly team meetings online, but that has actually made them more inclusive by not requiring everyone to be in the same place. They have also had to adjust the delivery end of their supply chains in order to keep up with the need for their products: based on my Facebook feed, there are a lot of people out there eating snacks at home, fueling a Frito-Lay boom.
Rob Thomas, SVP of IBM Cloud & Data Platform, gave a keynote on how AI and automation is changing how companies work. Some of this was a repeat from what we saw in the analyst preview, plus some interviews with customers including Mirco Bharpalania, Head of Data & Analytics at Lufthansa, and Mallory Freeman, Director of Data Science and Machine Learning in the Advanced Analytics Group at UPS. In both cases, they are using the huge amount of data that they collect — about airplanes and packages, respectively — to provide better insights into their operations, and perform optimization to improve scheduling and logistics.
He was then joined by Melissa Molstad, Director of Common Platforms, Data Strategy & Vendor Relations at PayPal. She spoke primarily about their AI-driven chatbots, with the virtual assistants handling 1.4M conversations per month. This relieves the load on customer service agents, especially for simple and common queries, which is especially helpful now that they have moved their customer service to distributed home-based work.
He discussed AIOps, which was already announced yesterday by Arvind Krishna; I posted a bit about that in yesterday’s post including some screenshots from a demo that we saw at the analyst preview on Monday. They inserted the video of Jessica Rockwood, VP of Development for Hybrid Multicloud Management, giving the same demo that we saw on Monday, worth watching if you want to hear the entire narrative behind the screenshots.
Thomas’ last interview segment was with Aaron Levie, CEO of Box, and Stewart Butterfield, CEO of Slack, both ecosystem partners of IBM. Interesting that they chose to interview Slack rather than use it as an engagement channel for the conference attendees. ¯\_(ツ)_/¯ They both had interesting things to add on how work is changing with the push to remote cloud-based work, and the pressures on their companies for helping a lot of customers to move to cloud-based collaboration all at once. There seems to be a general agreement (I also agree) that work is never going back to exactly how it was before, even when there is no longer a threat from the pandemic. We are learning new ways of working, and also learning that things that companies thought could not be done effectively — like work from home — actually work pretty well. Companies that embrace the changes and take advantage of the disruption can jump ahead on their digital transformation timeline by a couple of years. One of them quoted Roy Amara’s adage that “we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run”; as distributed work methods, automation and the supporting technology get a foothold now, they will have a profound effect on how work is done in the future. This is not going to be about which organizations have the most money to spend: it will hinge on the ability and will to embrace AI and automation to remake intelligent end-to-end processes. Software vendors will need to accept the fact that customers want to do best-of-breed assembly of services from different vendors, meaning that the vendors that integrate into a standard fabric are going to do much better in the long run.
I switched over to the industry/customer channel to hear a conversation between Donovan Roos, VP of Enterprise Automation at US Bank, and Monique Ouellette, VP of Global Digital Workplace Solutions at IBM. She invited us at the beginning to submit questions, so this may have been one of the few sessions that has not been prerecorded, although they never seemed to take any audience questions so I’m not sure. Certainly much lower audio and video quality than most of the prerecorded sessions. US Bank has implemented Watson AI-driven chatbots for internal and external service requests, and has greatly reduced wait times for requests where a chatbot can assist with self-service rather than waiting for a live agent. Roos mentioned that they really make use of the IBM-curated content that comes as part of the Watson platform, and many of the issues are resolved without even hitting internal knowledge bases. Like many other banks during the current crisis, they have had to scale up their ability to process small business loans; although he had some interesting things to mention about how they scaled up their customer service operations using AI chatbots, I would also be interested to hear how they have scaled up the back-end processes. He did mention that you need to clean up your business processes first before starting to apply AI, but no specifics.
I stayed on the industry channel for a presentation on AI in insurance by Sandeep Bajaj, CIO of Everest Re Group. I do quite a bit of work with insurance companies as a technical strategist/architect so have some good insights into how their business works, and Bajaj started with the very accurate statement that insurance is an information-driven industry, not just in the sense of standard business information, but also IoT and telematics data, especially for vehicle and P&C coverage. This provides great opportunities for better insights and decisions based on AI that leverages that data. He believes that AI is no longer optional in insurance because of the many factors and data sources involved in decisions. He did discuss the necessity to review and improve your business processes to find opportunities for AI: it’s not a silver bullet, but needs to have relatively clean processes to start with — same message that we heard from US Bank in the previous presentation. Everest reviewed some of their underwriting processes and split the automation opportunities between robotic process automation and AI, although I would have thought that using them together, as well as other automation technologies, could provide a better solution. They used an incremental approach, which let them see results sooner and feed back initial results into ongoing development. One side benefit is that they now capture much more of the unstructured information from each submission, whereas previously they would only capture the information entered for those submissions that led to business; this allows them to target their marketing and pricing accordingly. They’re starting to use AI-driven processes for claims first notice of loss (FNOL is a classic claims problem) in addition to underwriting, and are seeing operational efficiency improvements as well as better accuracy and time to market. Looking ahead, he sees that AI is here to stay in their organization since it’s providing proven value. Really good case study; worth watching if you’re in the insurance business and want to see how AI can be applied effectively.
IBM had to pivot to a virtual format relatively quickly since they already had a huge in-person conference scheduled for this time, but they could have done better both for content and format given the resources that they have available to pour into this event. Everyone is learning from this experience of being forced to move events online, and the smaller companies are (not surprisingly) much more agile in adapting to this new normal. I’ll be at the virtual Appian World next week, then will write an initial post on virtual conference best — and worst — practices that I’ve seen over the five events that I’ve attended recently. In the weeks following that, I’ll be attending Signavio Live, PegaWorld iNspire and DecisionCAMP, so will have a chance to add on any new things that I see in those events.
The first day of IBM’s online conference Think 2020 kicked off with a keynote by CEO Arvind Krishna on enterprise technology for digital transformation. He’s new to the position of CEO, but has decades of history at IBM, including heading IBM Research and, most recently, the Cloud and Cognitive Computing group. He sees hybrid cloud and AI as the key technologies for enterprises to move forward, and was joined by Rajeev Ronanki, Chief Digital Officer at Anthem, a US healthcare provider, discussing what they’re doing with AI to harness data and provide better insights. Anthem is using Red Hat OpenShift containerization that allows them to manage their AI “supply chain” effectively, working with technology partners to integrate capabilities.
Krishna announced AIOps, which infuses Watson AI into mission-critical IT operations, providing predictions, recommendations and automation to allow IT to get ahead of problems, and resolve them quickly. We had a quick demo of this yesterday during the analyst preview, and it looks pretty interesting: integrating trouble notifications into a Slack channel, then providing recommendations on actions based on previous similar incidents:
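As a generic illustration of the kind of integration shown in that demo (pushing an incident notification plus a recommended action into a Slack channel), a minimal sketch using a standard Slack incoming webhook might look like the code below. This is not IBM’s AIOps implementation; the webhook URL, incident details and recommendation text are all placeholders.

```python
# Generic sketch: post an incident alert with a recommended action to a Slack
# channel via an incoming webhook. Not IBM's AIOps code; the webhook URL,
# incident details and recommendation are placeholders.
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def notify_incident(service: str, error_rate: float, recommendation: str) -> None:
    message = (
        f":rotating_light: Anomaly detected on *{service}* "
        f"(error rate {error_rate:.1%}).\n"
        f"Suggested action, based on similar past incidents: {recommendation}"
    )
    response = requests.post(SLACK_WEBHOOK_URL, json={"text": message}, timeout=10)
    response.raise_for_status()

notify_incident("payments-api", 0.12, "roll back last night's deployment")
```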
He finished up with an announcement about their new cloud satellite, and edge and telco solutions for cloud platforms. This enables development of future 5G/edge applications that will change how enterprises work internally and with their customers. As our last several weeks of work-from-home have taught us, better public cloud connectivity can make a huge difference in how well a company can continue to do business in times of disruption; in the future, we won’t require a disruption to push us to a distributed workforce.
There was a brief interview with Michelle Peluso, CMO, on how IBM has pivoted to focus on what their customers need: managing during the crisis, preparing for recovery, and enabling transformation along the way. Cloud and AI play a big part of this, with hybrid cloud providing supply chain resiliency, and AI to better adapt to changing circumstances and handle customer engagement. I completely agree with one of her key points: things are not just going back to normal after this crisis, but this is forcing a re-think of how we do business and how things work. Smart companies are accelerating their digital transformation right now, using this disruption as a trigger. I wrote a bit more about this on a guest post on the Trisotech blog recently, and included many of my comments in a webinar that I did for Signavio.
The next session was on scaling innovation at speed with hybrid cloud, featuring IBM President Jim Whitehurst, with a focus on how this can provide the level of agility and resiliency needed at any time, but especially now. Their OpenShift-based hybrid cloud platform will run across any of the major cloud providers, as well as on premise. He announced a technology preview of a cloud marketplace for Red Hat OpenShift-based applications, and had a discussion with Vishant Vora, CTO at Vodafone Idea, India’s largest telecom provider, on how they are building infrastructure for low-latency applications. The session finished up with Hillery Hunter, CTO of IBM Cloud, talking about their public cloud infrastructure: although their cloud platform will run on any vendor’s cloud infrastructure, they believe that their own cloud architecture has some advantages for mission-critical applications. She gave us a few more details about the IBM Cloud Satellite that Arvind Krishna had mentioned in his keynote: a distributed cloud that allows you to run workloads where it makes sense, with simplified and consolidated deployment and monitoring options. They have security and privacy controls built in for different industries, and currently have offerings such as a financial services-ready public cloud environment.
I tuned in briefly to an IDC analyst talking about the new CEO agenda, although targeted at IBM business partners; then a few minutes with the chat between IBM’s past CEO Ginni Rometty and will.i.am. I skipped Amal Clooney‘s talk — she’s brilliant, but there are hours of online video of other presentations that she has made that are likely very similar. If I had been in the audience at a live event, I wouldn’t have walked out of these, but they did not hold my interest enough to watch the virtual versions. Definitely virtual conferences need to be more engaging and offer more targeted content: I attend tech vendor conferences for information about their technology and how their customers are using it, not to hear philanthropic rap singers and international human rights lawyers.
The last session that I attended was on reducing operational cost and ensuring supply chain resiliency, introduced by Kareem Yusuf, General Manager of AI applications. He spoke about the importance of building intelligence into systems using AI, both for managing work in flight through end-to-end visibility, and providing insights on transactions and data. The remainder of the session was a panel hosted by Amber Armstrong, CMO of AI applications, featuring Jonathan Wright, who heads up cognitive process re-engineering in supply chains for IBM Global Business Services, Jon Young of Telstra, and Joe Harvey of Southern Company. Telstra (a telecom company) and Southern Company (an energy company) have both seen supply chain disruptions due to the pandemic crisis, but have intelligent supply chain and asset management solutions in place that have allowed them to adapt quickly. IBM Maximo, a long-time asset management product, has been supercharged with IoT data and AI to help reduce downtime and increase asset utilization. This was an interesting panel, but really was just three five-minute interviews with no interaction between the panelists, and no audience questions. If you want to see an example of a much more engaging panel in a virtual conference, check out the one that I covered two weeks ago at CamundaCon Live.
The sessions ran from 11am-3pm in my time zone, with replays starting at 7pm (well, they’re all technically replays because everything was pre-recorded). That’s a much smaller number of sessions than I expected, with many IBM products not really covered, such as the automation products that I normally focus on. I even took a lengthy break in the middle when I didn’t see any sessions that interested me, so only watched about 90 minutes of content. Today was really all cloud and AI, interspersed with some IBM promotional videos, although a few of the sessions tomorrow look more promising.
As I’ve mentioned over the past few weeks of virtual conferences, I don’t like pre-recorded sessions: they just don’t have the same feel as live presentations. To IBM’s credit, they used the fact that they were all pre-recorded to add captions in five or six different languages, making the sessions (which were all presented in English) more accessible to those who speak other languages or who have hearing impairments. The platform is pretty glitchy on mobile: I was trying to watch the video on my tablet while using my computer for blogging and looking up references, but there were a number of problems with changing streams that forced me to move back to desktop video for periods of time. The single-threaded chat stream was completely unusable, with 4,500 people simultaneously typing “Hi from Tulsa” or “you are amazing” (directed to the speaker, presumably).
This will be the fourth in a series of webinars that I’m doing for Signavio, this time focused on the high-tech industry but with lessons that span other industries. From the description:
High-Tech businesses are renowned disruptors. But what happens when the disruptors become the disrupted? For example, let’s say a global pandemic surfaces and suddenly changes your market dynamics and your business model.
Can your business handle an instant slowdown or a hyper-growth spurt? What about your operating systems? Are they nimble enough for you to scale? Can you onboard new customers en masse or handle a high volume of service tickets overnight? What about your supply chain; how agile are your systems and supplier relationships?
The first two webinars discussed banking in February and insurance in March, and the role that intelligent processes play in improving business, with a brief mention in the March webinar about addressing business disruption caused by the pandemic. By the time we hit the third webinar on financial services in April, we had pivoted to look at the necessity of process improvement technologies and methodologies in times of business disruption such as the current crisis. Unlike a lot of industries, many high-tech sectors have been booming during the pandemic: their problems are around being able to scale up operations to meet demand without sacrificing customer service. Although they share some of the same issues that I looked at in the earlier webinars, they have some unique issues where process intelligence and automation can help them.
Tune in on May 20th at 11am Eastern; if you can’t make it then, sign up anyway and you’ll get a link to the on-demand version.
I had an early look at IBM’s virtual Think conference by attending the analyst preview today, although I will need to embargo the announcements until they are officially released at the main event tomorrow. The day kicked off with a welcome from Harriet Fryman, VP of Analyst Relations, followed by a welcome from IBM President Jim Whitehurst before the first presentation from Mark Foster, SVP of Services, on building resilient and smarter businesses. Foster led with the need for innovative and intelligent workflow automation, and a view of end-to-end processes, and how work patterns are changing and will continue to change as we emerge from the current pandemic crisis.
Whitehurst returned to discuss their offerings in hybrid cloud environments, including both the platforms and the applications that run on those platforms. There’s no doubt that every company right now is laser-focused on the need for cloud environments, with many workforces being distributed to a work-from-home model. IBM offers Cloud Paks, containerized software solutions to get organizations up and running quickly. Red Hat OpenShift is a big part of their strategy for cloud.
Hillery Hunter, CTO and VP of Cloud Infrastructure, followed on with more details on the IBM cloud. She doubled down on their commitment to open source, and to how they have hardened open source cloud tools for enterprise readiness. If enterprises want to be flexible, scalable and resilient, they need to move their core business operations to the public cloud, and IBM hopes to provide the platform for them to do that. This should not just be lift-and-shift from on-premise systems, but this is an opportunity to modernize systems and operations. The impacts of COVID-19 have shown the cracks in many companies’ mission-critical capabilities and infrastructure, and the smart ones will already be finding ways to push towards more modern cloud platforms to allow them to weather business disruptions and gain a competitive edge in the future.
Rob Thomas, SVP of IBM Cloud and Data Platform, gave a presentation on AI and automation, and how they are changing the way that organizations work. By infusing AI into workflows, companies can outperform their competitors by 165% in terms of revenue growth and productivity, plus improve their ability to innovate and manage change. For example, in a very short time, they’ve deployed Watson Assistant to field questions about COVID-19 using information published by the CDC and other sources. Watson Anywhere combines with their Cloud Pak for Data to allow Watson AI to be applied to any of your data sources. He finished with a reminder of the “AI Ladder” which is basically a roadmap for adding operationalized AI.
The final session was with Dario Gil, Director of IBM Research. IBM has been an incredible source of computing research over 75 years, and employs 3,000 researchers in 19 locations. Some of this research is around the systems for high-performance computing, including their support for the open source Linux community. Other research is around AI, having moved from narrow AI to broader multi-domain AI, with more general AI with improved learning and autonomy in the future. They are also investing in quantum computing research, and he discussed this convergence of bits, neurons and qubits for things such as AI-assisted programming and accelerated discovery.
These were all pre-recorded presentations, which are not as compelling as live video, and there was no true discussion platform or even live Q&A; these are my two most common complaints about many of the virtual conferences. I’m expecting that the next two days of the main IBM Think event will be more of the same format. I’ll be tuning in for some of the sessions of the main event, starting with CEO Arvind Krishna tomorrow morning.