Insurance technology: is this very conservative industry finally ready for its close-up?

I’ve worked with insurance clients for a long time, first helping them with automation in their underwriting, policy administration and claims processes, and now helping them with digital transformation to create new business models and platforms. One thing that has always struck me is how far behind the times most insurance companies are: usually old companies (by today’s standards), they trend toward the conservative end of the business and technology innovation scale. However, new entrants have been stirring the pot for a couple of years – such as Lemonade in the urban consumer property insurance market – and it seems that everywhere I look, there’s something popping up about innovation in insurance.

Capgemini has a significant insurance practice and publishes an annual World Insurance Report that is about to be updated for 2018; a couple of their consultants blog about different aspects of how insurance is changing and the technology enabling that change. They’ve just started a three-part series on the insurance customer of the future, which echoes some of the points that I made in my recent post on the Alfresco blog about transforming insurance with cloud BPM. Although they use the apocryphal “millennial” definition to describe who these customers are in their first post, they point out four main characteristics:

  • Smart shoppers
  • Lower loyalty
  • Self-centred
  • Caring consumers – which appears contrary to the previous point, but check out their post for a description

They have another post on how new InsurTech models can decrease risk for the insurer, which explains more about the social risk pool models that are used by companies like Lemonade, and how risk can be proactively mitigated through the use of connected devices.

We’re also seeing platform innovation for some insurers, such as Liberty Mutual moving their documents to Alfresco on the AWS cloud. As I’ve experienced for many years, just getting insurance companies to move from paper to digital files can provide huge operational benefits, and moving those files to the cloud lets a global insurer provide access wherever it’s required. There are a lot of regulatory issues with data sovereignty – that is, where the content is actually stored and what laws/regulations apply to it because of that – but vendors are starting to solve those problems with regional data centers and secure, encrypted transport. With digital content comes the issue of digital preservation, which John Mancini on the AIIM blog points out is a big issue for financial and insurance companies because of the typically long time span over which they deal with customers: consider that a personal injury claim can go on for years, requiring that all documents be retained for future review. After hearing that one former insurance customer of mine had a flood in their basement storage, destroying years of customer files, I wished that they had moved a bit faster on my advice about digital documents.

Cutting edge technologies such as blockchain are also getting into the insurance mix: blockchain can be used to show proof of insurance, improve transparency and reduce risk of fraud, and speed up claims with smart contracts. I can also imagine that as cars get smarter and insurance companies can tie in directly to the on-board systems, there may be less opportunity for auto repair shop fraud, which reduces overall costs to the insurer and consumer.
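
To make the proof-of-insurance idea a bit more concrete, here’s a minimal sketch – my own illustration under loose assumptions, not any particular vendor’s implementation. The insurer anchors a hash of each issued policy on a shared ledger (simulated here by a plain dict), and any third party can later verify a presented document against that record; any tampering breaks the match.

```python
import hashlib
import json
from datetime import date

# Stand-in for a real blockchain: an append-only record of policy hashes.
# In practice, each registration would be a transaction on a shared ledger.
ledger = {}

def _fingerprint(policy_doc: dict) -> str:
    """Deterministic hash of the policy contents."""
    return hashlib.sha256(
        json.dumps(policy_doc, sort_keys=True).encode()
    ).hexdigest()

def register_policy(policy_doc: dict) -> str:
    """Insurer writes a fingerprint of the issued policy to the ledger."""
    digest = _fingerprint(policy_doc)
    ledger[digest] = {"registered": date.today().isoformat()}
    return digest

def verify_proof_of_insurance(policy_doc: dict) -> bool:
    """Any third party can check a presented policy against the ledger."""
    return _fingerprint(policy_doc) in ledger

policy = {"policy_no": "AUTO-12345", "insured": "J. Driver",
          "coverage": "collision", "expires": "2018-12-31"}
register_policy(policy)
print(verify_proof_of_insurance(policy))  # True: document is unaltered
policy["coverage"] = "comprehensive"
print(verify_proof_of_insurance(policy))  # False: tampering breaks the hash
```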

If you work in insurance and know that you’re behind the curve, there are a lot of things that you can do to help bring yourself into at least the last century, if not this one:

  • Convert all of your files to digital format at the front end of the process, that is, when they arrive (or are created). This will allow you to automatically extract data from the files, which can then be used for classifying and routing content as it arrives. Files can now be shared by anyone who needs to see them, and there will be no piles of completed documents/files waiting to be scanned at the end of a process. This is a big cultural shift as your workers move from working on paper to working on the screen, but if you give them a couple of big screens and a properly-designed workspace, it can be just as productive as paper.
  • With all of your content arriving in digital form, or being converted to digital immediately on arrival, you can now automate your processes (a minimal sketch of this kind of classify-and-route automation appears after this list):
    • New policy application? Look up any previous information for this customer, create a new business case, and route to the appropriate underwriter if required. If this is a simple policy, such as consumer renter insurance, it can usually be automatically adjudicated and issued immediately.
    • Policy changes? Extract information from the policy administration system, classify the type of change, and either complete the change automatically or forward to a policy administration clerk.
    • A first notice of loss arriving for a claim? Use that to automatically extract information from your policy administration system, set up a claim in your claims system, and route the claim to the appropriate claims manager. Simple claims, such as auto windshield replacement, can be settled automatically and immediately.
    • Additional documents arriving for a claim? Automatically recognize the document type and claim number, and add to the claim case directly.
  • Find the best ways to integrate your digital content and processes with your legacy systems. This is a huge part of what I do with any insurance customer (really, with any customer at all), and it’s not trivial but can result in huge rewards. This will be some combination of exposing APIs, digging directly into operational databases, RPA to integrate “at the glass”, and other methods that are specific to your environment. In the end, you want to be sure that no one is re-entering data manually from one system to another, even by copy and paste.
  • Automate, automate, automate. In case I haven’t made that clear already. There should be no such thing as manual work assignment or routing, except in special cases. Data exchange with legacy systems should be automated. Decisions should be automated where possible, or at least used to make recommendations to workers. Incorporate artificial intelligence and machine learning to learn how your most skilled workers make decisions, align that with your policies and regulatory compliance, and use as input to automated decisions and recommendations. The workers will be left doing the work that actually requires a person to do it, not all of the low-level administrative work.
  • Use some type of low-code application development platform that allows semi-technical business analysts – there are a ton of these working in insurance business areas – to create their own situational apps.
  • Now that you have your operational processes sorted out, start looking for new ways to leverage your digital content and processes:
    • Interact with reinsurers and other business partners using digital content and processes, not paper files and faxes.
    • Provide customers with the option for completely paperless policy application, issuance and renewal. Although I’m far from being a millennial in age, the huge stack of paper sent by my previous home insurer on renewal was a key reason that I ran directly towards an online insurer that could do it all without paper.
    • Streamline claims processes, automating where possible. Many insurance companies don’t spend a lot of time fixing their claims processes, preferring to spend their time on attracting new customers; however, in this age of online consumer reviews, an inefficient claims process is going to hit hard. Automating claims also reduces operational costs: claims managers are highly skilled, and it can take 6-12 months to train a new one.
    • Automate and streamline your ancillary processes that support the main processes, such as recovery of assets, and negotiating contracts with preferred repair vendors.
    • Build in process monitoring, and provide automated dashboards and reports to different levels of management. As well as giving management a real-time view of operations, this reduces the time that line supervisors spend manually compiling reports. It also, amazingly, reduces the amount of time that individual workers spend tracking their own work: in many of the insurance companies that I visit, claims managers and other front-line workers keep a manual log of their work because they don’t trust the system to do it for them.
  • Tie your process performance back to business goals: loss ratio, customer satisfaction, regulatory SLAs (such as communicating with customers in a timely manner), net promoter score, fraud rate, closure rate. It’s too easy to get bogged down making a particular activity or process more efficient when it shouldn’t be done at all. Although you can use your existing procedures guides as a starting point for your new digital processes, you really need to link everything back to the high-level goals rather than just paving the cow paths.
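
As promised above, here’s a minimal sketch of the kind of classify-and-route logic that digitizing content at the front end enables, once document type and key data have been extracted. All of the document types, queue names and the auto-settlement threshold are hypothetical; in practice these rules would live in your BPMS or capture platform and be driven by your own business rules.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Document:
    """An incoming document after a capture/recognition step."""
    doc_type: str                       # e.g. assigned by a classifier
    claim_number: Optional[str] = None  # extracted if present
    amount: float = 0.0                 # extracted claim value, if any

def route(doc: Document) -> str:
    """Decide where an incoming document goes; only exceptions reach a person."""
    if doc.doc_type == "policy_application":
        return "underwriting_queue"
    if doc.doc_type == "first_notice_of_loss":
        # Simple, low-value claims can be settled automatically and immediately.
        return "auto_settlement" if doc.amount < 1000 else "claims_manager_queue"
    if doc.doc_type == "supporting_document" and doc.claim_number:
        return f"attach_to_claim:{doc.claim_number}"
    return "manual_review"  # anything unrecognized becomes a human exception

print(route(Document("first_notice_of_loss", amount=450.0)))        # auto_settlement
print(route(Document("supporting_document", claim_number="CLM-42"))) # attach_to_claim:CLM-42
```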

This started out as a short post because I was seeing a flurry of insurance-related items in my news feed, and grew into a bit of a monster as I thought about my own experiences with insurance customers over the past couple of years. Nonetheless, there are likely some useful tidbits in here.

Integrating process and content: exploring the use cases

I recently wrote a series of short articles sponsored by Alfresco and published on their blog. Today, the third of the series was published, discussing some use cases for integrating content into your processes:

  • Document-driven processes
  • Case management
  • Document lifecycle processes
  • Support documentation for exceptions in data-driven processes
  • Classification and analysis processes for non-document content

Head over there to read all the details on each of these use cases. As I write at the end:

Over the years, I’ve learned two things about integrating process and content: first, almost every process application has some sort of content associated with it; and second, most process-centric developers underestimate the potential complexity of handling the content in the context of the process application.

While you’re over there, you can also check out the other two articles that I wrote: transforming insurance with cloud BPM, and BPM cloud architectures and microservices.

13 years of reporting on @BPTrends BPM annual reports: from vendor reviews to state of the BPM market

The very first post that I wrote on this blog was in March 2005, and it covered BPTrends’ 2005 BPM Suites report. I think that this was the first year that they published a BPM annual report, although I can’t even find a link to it in their web archive and I don’t have it in my own archive. However, at the time, I listed the 13 vendors that were included (about half of which still exist in some form) and noted that it was a “pay for play” report that required vendors to pay $5,000 to participate. I don’t think that BPTrends does vendor reviews any more – the last that I have a record of was in 2007 – but they do have an out-of-date page of vendor info and links that is provided for free.

By 2006, BPTrends was conducting surveys of practitioners, consultants and others involved in BPM, and had published their first State of Business Process Management report based on the results, with updates following every two years (2008, 2010, 2012, 2014, 2016). They’ve now published the 2018 State of Business Process Management report, sponsored by Red Hat and Signavio. The sponsors get top billing on the front page of the report, but presumably no editorial control or special treatment, since this is not a product review but rather a state-of-the-industry report based on survey results. Since BPTrends has been creating and analyzing BPM surveys since 2005, they have a good view of how the market is evolving.

Intelligent Capture für die digitale Transformation: my intelligent capture paper for @ABBYY_Software, now in German

A little over a year ago, I wrote a paper on intelligent capture for digital transformation, sponsored by ABBYY, and gave a keynote at their conference on the same topic. The original English version is on their site here, and if you read German (or want to pass it along to German-speaking colleagues), you can find the German version here. As usual, this paper is not about ABBYY’s products, but about how intelligent capture is the on-ramp for any type of automated process, and hence required for digital transformation. From the abstract:

Data capture from paper or electronic documents is an essential step for most business processes, and often is the initiator for customer-facing business processes. Capture has traditionally required human effort – data entry workers transcribing information from paper documents, or copying and pasting text from electronic documents – to expose information for downstream processing. These manual capture methods are inefficient and error-prone, but more importantly, they hinder customer engagement and self-service by placing an unnecessary barrier between customers and the processes that serve them.

Intelligent capture – including recognition, document classification, data extraction and text analytics – replaces manual capture with fully-automated conversion of documents to business-ready data. This streamlines the essential link between customers and your business, enhancing the customer journey and enabling digital transformation of customer-facing processes.

Or, in German:

Die Erfassung von Daten aus papierbasierten oder elektronischen Dokumenten steht als zentraler Schritt am Anfang zahlreicher kundenorientierter Geschäftsprozesse. Dies ist üblicherweise mit großem manuellen Aufwand verbunden – Mitarbeiter übertragen und kopieren per Hand Daten und Texte, um sie so nachgelagerten Systemen und Prozessen zur Verfügung zu stellen. Diese manuelle Vorgehensweise ist jedoch nicht nur ineffizient und fehleranfällig, sie bremst auch den Kundendialog aus und verhindert Self-Service-Szenarien durch unnötige Barrieren zwischen Kunden und Dienstleistern.

Intelligent-Capture-Lösungen – mit Texterkennung, Dokumentenklassifizierung, Datenextraktion und Textanalyse – ersetzen die manuelle Datenerfassung. Dokumente werden vollautomatisch in geschäftlich nutzbare Daten umgewandelt. So können Unternehmen die Beziehung zu ihren Kunden stärken, das Benutzererlebnis steigern und die digitale Transformation kundenorientierter Prozesse vorantreiben.
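
To ground the abstract in something concrete, here’s a minimal sketch of the recognition, classification and extraction pipeline it describes – my own illustration, not ABBYY’s technology. The keyword rules and regular expressions are deliberately simplistic stand-ins for the machine learning that real intelligent capture products use, and the OCR/recognition step is assumed to have already produced text.

```python
import re

def classify(text: str) -> str:
    """Toy document classifier: real products use trained ML models."""
    lowered = text.lower()
    if "notice of loss" in lowered:
        return "first_notice_of_loss"
    if "application" in lowered:
        return "policy_application"
    return "unknown"

def extract_fields(text: str) -> dict:
    """Pull structured data out of recognized text with simple patterns."""
    fields = {}
    policy = re.search(r"Policy\s*(?:No\.?|Number)[:\s]+(\S+)", text)
    if policy:
        fields["policy_number"] = policy.group(1)
    loss_date = re.search(r"Date of Loss[:\s]+([\d/-]+)", text)
    if loss_date:
        fields["date_of_loss"] = loss_date.group(1)
    return fields

# Text as it might come out of an OCR/recognition step:
ocr_text = "FIRST NOTICE OF LOSS\nPolicy No: HO-998877\nDate of Loss: 2018-02-12"
print(classify(ocr_text))        # first_notice_of_loss
print(extract_fields(ocr_text))  # {'policy_number': 'HO-998877', 'date_of_loss': '2018-02-12'}
```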

Recently, I was interviewed by KVD, a major European professional association for customer service professionals. Although most of their publication is in German, the interview was in English, and you can find it on their site here.

My guest post on the @Alfresco blog: BPM Cloud Architectures and Microservices

The second of the short articles that I wrote for Alfresco has been published on their blog, on BPM cloud architectures and microservices. I walk through the basics of cloud architectures (private, public and hybrid), containerization and microservices, and why this is relevant for BPM implementations. As I point out:

Not all BPM solutions are built for cloud-native architectures: a monolithic BPMS stuffed into a Docker container will not be able to leverage the advantages of modern cloud infrastructures.
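
To illustrate one concrete aspect of that difference, here’s a toy sketch – mine, not from the article – of the property that cloud-native services rely on: state lives in an external shared store rather than in process memory, so any number of container replicas can serve any request.

```python
# Toy illustration of why statelessness matters for scaling out;
# all names here are hypothetical, not from any particular BPMS.

class SharedStore:
    """Stand-in for an external store (database, cache) shared by all replicas."""
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data.get(key)

class MonolithTaskService:
    """Monolith-style: state is trapped in process memory, so requests about
    the same task must keep hitting the same container instance."""
    def __init__(self):
        self._tasks = {}

    def claim_task(self, task_id, user):
        self._tasks[task_id] = user

class CloudNativeTaskService:
    """Cloud-native style: no in-process state, so the service can be
    replicated freely behind a load balancer."""
    def __init__(self, store):
        self._store = store

    def claim_task(self, task_id, user):
        self._store.put(task_id, user)  # any replica sees the same state

store = SharedStore()
replica_a = CloudNativeTaskService(store)
replica_b = CloudNativeTaskService(store)
replica_a.claim_task("T-1", "alice")
print(store.get("T-1"))  # "alice": replica_b could pick up where replica_a left off
```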

Check out the full article on the Alfresco site.

Cleaning up the deadwood…dead links, that is

I’ve been writing Column 2 for almost 13 years, and quite a bit of crud has accumulated. I’ve also been seeing some performance problems that are completely out of line with the amount of traffic on the site, so I’m doing some tuning as well.

Please be patient if you see any glitches on this site as well as my corporate website while I complete the following:

  • Moved to the more modern Twenty Sixteen WordPress theme, which is supposed to have better performance than the Twenty Thirteen theme that I was using. I’ve also replaced JPG graphics on the page design with much smaller GIF graphics.
  • Set up Cloudflare as a Content Delivery Network (CDN) to cache all images from the site plus a lot of the other content. This only includes images that are stored on my WordPress site, not those embedded from Flickr, but it should reduce the load on the site and improve loading performance.
  • Added CAPTCHAs for certain countries and IP ranges that were pummeling the site with content scraping/indexing. If you’re in one of those, you’ll need to click an “I am not a robot” checkbox.
  • Enforced SSL (https). This was a bit of a process, since I had to track down all of the internal links and embedded objects that used http. If I link to your site and it’s http, that will still work, but I really recommend that you update to SSL. I may just change http:// to // to create protocol-relative URLs, which resolve to https where it’s available and to http otherwise, making the links a bit more future-proof (see the sketch after this list).
  • Added an EU Cookie Law banner, which notifies you that this site generates cookies; you need to accept that to dismiss the banner. I don’t explicitly place cookies, but some of the WordPress services and embedded objects do. To my knowledge, there isn’t anything particularly nefarious in there.
  • Removed the “links” posts: these were older posts generated from Delicious and other link-saving services. I haven’t posted these since some time in 2010, when Twitter took over this type of sharing, and many of the links were dead.
  • Stripped out the worst of the dead links. I’m using a broken link checker to find the most common of these (usually when a company changes its URL or ceases to exist) and will gradually get rid of them. This is a longer-term project: I’ll keep combing through to find them in my spare time, but will likely only fix up the past couple of years of posts.
  • Replaced the old Flash-based Flickr slideshow plugin with the newer embed code from Flickr. I tried different plugins, but they just don’t work as well; the only disadvantages of the direct Flickr embed are that the slideshow doesn’t auto-advance – you have to click to move forwards and backwards through the images – and that the sizing is sometimes wonky when the images are different sizes. I’m also gradually moving the screen snapshots from my personal Flickr account to a dedicated Column 2 Flickr account, although cleaning up the related links within posts is a bit of a pain.
  • Removed other old Flash embeds, such as the original method of embedding a Slideshare presentation.
  • Tuned SEO through better use of post tags.
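
For anyone tackling the same http-to-https cleanup, here’s the minimal sketch promised above of the protocol-relative rewrite. It treats post content as plain HTML text; in a real WordPress cleanup you would run something like this over the posts table, after backing it up, of course.

```python
import re

def make_protocol_relative(html: str) -> str:
    """Rewrite http:// links and embeds to protocol-relative // URLs."""
    # Only touch URLs inside href/src attributes, not prose mentions of http://
    return re.sub(r'(href|src)=(["\'])http://', r'\1=\2//', html)

post = '<a href="http://example.com/report">report</a> <img src="http://example.com/pic.gif">'
print(make_protocol_relative(post))
# <a href="//example.com/report">report</a> <img src="//example.com/pic.gif">
```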

My goal is to create a faster, cleaner experience for readers, with a minimum of clutter. If there are other tools that you’d like to see on the site, let me know; I’ve initially set it up with search, top posts and categories for navigation.

Summarizing OPEXWeek 2018

I only had 1.5 days at OPEX Week 2018 in Orlando this week, and spent part of my time giving a presentation and sitting on a panel, so I didn’t attend many sessions. However, I struck up a conversation with Eric Thompson at the reception last night without realizing that he was one of the original co-founders of Lombardi Software – now part of IBM, with the Lombardi BPMS forming a good part of the core of IBM BPM – and had such an interesting talk that I sat in on the presentation that he gave today with Doug Drolett about continuous improvement at Shell. Both Thompson and Drolett have senior CI roles at Shell.

Shell has been working on process improvement for more than 10 years, with business-centric process improvements during 2005-2009, moving to more end-to-end global process improvement during 2010-2013, and now focused on continuous improvement to the way that everyone works. Although driven from the top, with the CEO fully engaged, the idea is that it’s an ongoing cultural shift at every level. As they moved to this mindset, it became less about programmatic improvement (rolling out new systems to improve the business processes) and more about how that embedded culture impacts operational excellence. This results in everyone being focused on delivering value to the customer — however the customer is defined — through a perpetual cycle of plan-do-improve.

They talked about improving the order-to-cash process in their commercial business, and how they improved that process on a global scale, including standardization. They use customer journey mapping and “thinking like the customer” extensively to determine how and why to deliver value in those processes, which ties in nicely with the session that I gave yesterday on how customer journey mapping and process improvement fit together. They also use value stream representations of customer-facing processes, and assign owners to those processes. Their front-line staff includes Lean practitioners, with a smaller number of CI coaches overlaid on ongoing initiatives and projects. Since they’re a global operation, they use technology to enable collaboration, so that a single CI initiative can involve participants from several countries.

As you might expect from a process-centric conference, OPEX Week is exceptionally well run, and attracts a lot of attendees because of the quality of the content. The conference originated several years ago with a focus on the Lean Six Sigma community, and many of the attendees and speakers (such as those from Shell) have roles in their companies such as continuous improvement, change management and business transformation. Although technology is definitely a component in most of the projects that people talk about here, it’s not the main thing; that’s what makes the typical attendee and speaker different from those at more technology-focused conferences. There’s a smallish display area for vendor booths, and a relatively low-key vendor sponsor element that is integrated into the breakout tracks. They also have the talented visual facilitator Kimberly Dornisch capturing the themes in sessions while they’re going on. Here she is with the one that she did for the low-code panel that I was on:

I also gave a presentation yesterday on customer journey mapping, and you can see my slides here:

Transforming Insurance with Cloud BPM: my guest post on the @Alfresco blog

I recently wrote three short articles for Alfresco, which they are publishing on their blog. The first one is about insurance and cloud BPM, looking at how new business models are enabled and customer-facing processes improved using a containerized cloud architecture and microservices. From the intro:

In this blog post, I plan to explore the role BPMS plays in integrating packaged software, custom-built systems, and external services into a seamless process that includes both internal and external participants. What if you need to include customers in your process without having to resort to email or manual reconciliation with an otherwise automated process? What if you need employees and partners to participate in processes regardless of their location, and from any device? What if some of the functions that you want to use, such as machine learning for auto-adjudication, industry comparative analytics on claims, or integration with partner portals, are available primarily in the public cloud?

Head over there to read more about my 4-step plan for insurance technology modernization, although the same can be applied in many other types of organizations. They also have a webinar coming up next week on legacy ECM modernization at Liberty Mutual; with some luck, Liberty Mutual will read my article and think about how cloud BPM can help modernize their processes too.

The other two posts that I wrote for them – one that dives more into BPM cloud architectures and microservices, and one that examines use cases for content in process applications – will be published over the next couple of months. Obviously, Alfresco paid me to write the content that is published on their site, although it’s educational and thought leadership in nature, not about their products.

On the Alfresco topic, I’ll likely be at Alfresco Day in New York on March 28, since they’re holding an analyst briefing there the day before.

A variety of opinions on what’s ahead for BPM in 2018

I was asked to contribute to 2018 prediction posts on a couple of different sites, along with various other industry pundits. Here’s a summary.

BPM.com: Predictions

BPM.com published The Year Ahead for BPM – 2018 Predictions from Top Influencers, introduced by BPM.com’s Nathaniel Palmer and featuring mostly people who work for vendors, but whose opinions I (mostly) respect. Many of the vendors’ predictions align with their product direction, either through good planning or happy coincidence. ;)

A few ideas that stood out:

Blockchain may be almost ready for its close-up. Miguel Valdés Faura (Bonitasoft) and Setrag Khoshafian (Pega) both mentioned the potential for integrating blockchain with processes using DPA (digital process automation) platforms. I’ve been watching this space for a couple of years, waiting for the connections to be made between BPM and blockchain, and in addition to these mentions in the predictions article, Bernd Ruecker (Camunda) published a post yesterday with a practical use case and MWD Advisors published a report on IBM Blockchain Platform that mentions its integration with IBM BPM.

Automated decisioning, whether DMN-based or AI/ML, is going to improve process automation significantly but there’s still a lot of trepidation. Denis Gagné (Trisotech) said that decision auditability – a legal requirement in some countries – could favor DMN-based decision services over AI/ML, while Roger King (TIBCO) sees AI-based automation as the key to having RPA replace workers. Keith Swenson (Fujitsu) predicts that deep learning will be both the most important and most disappointing innovation in 2018, while Peter Fingar is bullish on intelligent (AI) agents integrated with BPM. James Taylor (Decision Management Solutions) seemed a bit disheartened that DM is being trivialized as a “feature” of BPM rather than an independent stateless service where it can have the greatest impact.

Microservices architectures are replacing monolithic BPM systems. Brian Reale (ProcessMaker) predicts that microservices will disrupt the BPM market this year, and Roger King gave a nod to dynamically-orchestrated process fragments, although he didn’t explicitly mention microservices. I’m seeing microservices approaches from a few of the BPM vendors, and I agree that this has a lot of potential to drive a shift away from the monolithic (and proprietary) platforms; watch for an article that I wrote for Alfresco on BPM and microservices to be published shortly on their blog.

Low code is allowing business users (analysts, really) to participate in DevOps directly. Malcolm Ross (Appian) sees low code as a catalyst for developer diversity and the blurring of lines between business and IT. Phil Simpson (Red Hat) states that low code and citizen developers are the only way to meet the need for constantly-changing applications. My concern is that, much like the promise that business analysts would develop their own BPM applications when model-driven development came along several years ago, this isn’t actually going to play out in such an optimistic fashion.

Customer journey matters. Gero Decker (Signavio) is seeing top-level value chains being replaced by customer journey maps. To me, customer journey mapping feels like a bit of old wine in new bottles, 10 years after outside-in process modeling and other customer-centric views, but whatever it takes to get some traction around modeling processes to include the customer and optimizing from their point of view. I’m speaking on this topic at next week’s OPEX Week conference.

BPM is no longer BPM. The term is being replaced by digital process automation, digital transformation and a number of others as the platforms expand beyond just process modeling and execution. Neil Ward-Dutton (MWD Advisors) envisions different types of tools vying for the place that BPM platforms now occupy within organizations, from RPA to model-driven application development tools. As I wrote in my section of the article, “BPM is dead…long live BPM!”

Lots of great insights in there; check out the entire article on BPM.com.

BPMtips.com: Skills

Zbigniew Misiak on BPM Tips takes a slightly different approach to predictions in BPM Skills in 2018 – Hot or Not, asking what BPM-related skills and techniques will be most in demand in 2018, where to learn those skills, and what’s no longer relevant.

Unlike the BPM.com list, which was dominated by vendors, this one has mostly opinions from consultants with some practitioners thrown in. There’s a lot of generic “people need to keep up on the latest technology trends” (duh), but some specific advice stood out:

Process/business architecture to connect processes to value. This ties in with the customer journey mapping trends that we saw in the BPM.com article; here, Roger Burlton (Process Renewal/BPTrends) stresses the importance of including the customer and other external stakeholders in the processes and value definition, and Sandeep Johal (PPB Advisory) reminds us that the focus of process management is (or should be) on improving customer experience via a variety of technologies. Ian Gotts (Q9 Elements) identifies business analysis and critical questioning as key skills, linking to broader business requirements such as GDPR, and Jim Sinur (Aragon) lists journey mapping.

BPMN, CMMN and DMN for standardized modeling. Alan Fish (FICO) sees formal modeling of processes and decisions as important, and Juergen Pitschke (Process Renewal) believes both BPMN and DMN are important, but some say that this is no longer required in low code process application development tools. BJ Biernatowski (Nordstrom) wonders if BPMN has a future, Roger Burlton thinks that process analysts only need to know how to use the core elements, and Sandeep Johal advocates getting rid of manual current-state modeling as automated process discovery, analysis and improvement takes over.

RPA bot training. Abhijit Kakhandiki (Automation Anywhere) suggests this somewhat depressing skill – train the robot to do your job! – but I agree that learning this would help a lot of people create helper functions for their tasks or even completely automate some tasks. Much in the same way that we used to create Excel macros… On a similar note, I recommended that people gathering requirements become proficient with the low code BPM platforms to at least create prototypes, if not the full applications, and Phil Simpson (Red Hat) recommends that less-technical BPM practitioners start to gain an understanding of things that are likely to significantly impact how applications are designed and deployed, such as RPA and microservices.

Interpersonal and soft skills in BPM and change management, in addition to technical skills. Adrian Reed (Blackmetric) listed influencing, stakeholder engagement and conflict resolution as important to making sure that the technology side of a project fits with the business and its people. I discussed some similar skills: the ability to translate need into action, and the need for developers to learn more about what the business does.

Again, there’s lots more to read; check out the original on BPMtips.com.