bpmNEXT 2018 day 1 videos are up!

In an amazing feat of real-time editing, the videos from yesterday’s sessions were posted yesterday evening. I tweeted a link to the YouTube channel, but here’s the full list sent to us by Nathaniel Palmer. In his words, “It is a rougher product than we’ve had in the past, and we will continue to edit after the event.  But we’re excited to have these videos available immediately” – as are we all.

The Future of Process in Digital Business: Jim Sinur, Aragon Research
https://www.youtube.com/watch?v=iBJBbXeVYUA

A New Architecture for Automation: Neil Ward-Dutton, MWD Advisors
https://www.youtube.com/watch?v=-AeijpL4b98

Secure, Private, Decentralized Business Processes for Blockchains: Vanessa Bridge, ConsenSys
https://www.youtube.com/watch?v=oww8zMzxvZA

Turn IoT Technology into Operational Capability: Pieter van Schalkwyk, XMPro
https://www.youtube.com/watch?v=G7C01e8qyac

Business Milestones as Configuration: Joby O’Brien and Scott Menter, BPLogix
https://www.youtube.com/watch?v=D_heO33fyC0

Timing the Stock Market with DMN: Bruce Silver, methodandstyle.com
https://www.youtube.com/watch?v=vHCIC1HGbHQ

Smarter Contracts with DMN: Edson Tirelli, Red Hat
https://www.youtube.com/watch?v=tdpZgbQbF9Q

Decision as a Service (DaaS): The DMN Platform Revolution: Denis Gagné, Trisotech
https://www.youtube.com/watch?v=sYAIcBhVhIc

Designing the Data-Driven Company: Elmar Nathe, MID GmbH
https://www.youtube.com/watch?v=zb__xVsOEA0

Using Customer Journeys to Connect Theory with Reality: Till Reiter and Enrico Teterra, Signavio
https://www.youtube.com/watch?v=ov0SqJCMmoY

Discovering the Organizational DNA: Jude Chagas Pereira, IYCON; Frank Kowalkowski, KCI
https://www.youtube.com/watch?v=NsCDgKPsTCs

bpmNEXT 2018: Complex Modeling with MID GmbH, Signavio and IYCON

The final session of the first day of bpmNEXT 2018 was focused on advanced modeling techniques.

Designing the Data-Driven Company, MID GmbH

Elmar Nathe of MID GmbH presented on their enterprise decision maps, which provide an aggregated visualization of strategic, tactical and operational decisions together with business events. They provide a variety of modeling tools, but see decisions as key to understanding how organizations are driven by data and events. Clearly a rich decision modeling environment, including support for PMML to incorporate predictive models and other data science outputs, plus links to other model types, such as ERDs that can show which data contributes to which decision model, and business process models. Much more of an enterprise architecture approach to model-driven design that can incorporate the work of data scientists.
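
On the PMML point, here’s a minimal sketch of how a predictive model exported by a data scientist might be scored inside a decision like the ones MID models. This is illustrative only, not MID’s actual integration: it assumes the open-source pypmml scoring library, and the model file, field names and thresholds are all invented.

```python
# Illustrative only: scoring a data scientist's PMML model inside a
# decision. The model file and field names below are hypothetical.
from pypmml import Model

model = Model.fromFile("churn_risk.pmml")  # exported from a data science tool

def retention_decision(customer: dict) -> str:
    """A simple rules layer on top of the predictive score, in the spirit
    of combining decision models with predictive analytics."""
    result = model.predict({
        "tenure_months": customer["tenure_months"],
        "monthly_spend": customer["monthly_spend"],
        "support_tickets": customer["support_tickets"],
    })
    churn_probability = result.get("probability", 0.0)  # output field name is hypothetical
    if churn_probability > 0.7 and customer["monthly_spend"] > 100:
        return "priority retention offer"
    if churn_probability > 0.7:
        return "standard retention offer"
    return "no action"
```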

Using Customer Journeys to Connect Theory with Reality, Signavio

Till Reiter and Enrico Teterra of Signavio started with a great example of an Ignite presentation, with few words, lots of graphics and a bit of humor, discussing their new notation for modeling an outside-in view of the customer journey rather than just having an undifferentiated “customer” swimlane in a BPMN diagram. The demo walked through their customer journey mapping tool, and how their collaboration hub overlays on it to allow information about each component of the journey map to be discussed amongst process modeling users. The journey map contains a lot of information about KPIs and other process metrics in a form most consumable by process owners and modelers, but also has a notebook/dashboard view for analysts to determine problems with the process and identify potential resolution actions. This includes a variety of analysis tools, including process discovery, where process mining techniques are applied to determine which paths in the process model may be contributing to specific problems such as cycle time, then overlaid on the process model to assist with root cause analysis. Although their product does a good job of combining CJMs, process models and process analysis, this was more of a walkthrough of a set of pre-calculated dashboard screens than an actual demo — a far cry from the experimental features that Gero Decker showed off in their demo at the first bpmNEXT.
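
To make that process discovery step concrete, here’s a generic sketch (not Signavio’s implementation; the event log and activity names are invented) of the basic mechanic behind it: group an event log into path variants and compare cycle times per variant to find the paths driving delays.

```python
# Generic process-mining sketch: group cases by the path they took through
# the process, then compare cycle times per path. Toy data, invented names.
from collections import defaultdict
from datetime import datetime, timedelta

# Each event is (case_id, activity, timestamp)
log = [
    ("c1", "Receive", datetime(2018, 4, 1, 9)),
    ("c1", "Approve", datetime(2018, 4, 1, 12)),
    ("c2", "Receive", datetime(2018, 4, 1, 9)),
    ("c2", "Rework", datetime(2018, 4, 2, 9)),
    ("c2", "Approve", datetime(2018, 4, 3, 9)),
]

# Reassemble each case's ordered sequence of activities
cases = defaultdict(list)
for case_id, activity, ts in sorted(log, key=lambda e: e[2]):
    cases[case_id].append((activity, ts))

# Group cases into path variants, collecting end-to-end cycle times
variants = defaultdict(list)
for events in cases.values():
    path = " > ".join(activity for activity, _ in events)
    variants[path].append(events[-1][1] - events[0][1])

for path, times in variants.items():
    avg = sum(times, timedelta(0)) / len(times)
    print(f"{path}: {len(times)} case(s), average cycle time {avg}")
```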

Discovering the Organizational DNA, IYCON and Knowledge Consultants

The final presentation of this section was from Jude Chagas Pereira of IYCON and Frank Kowalkowski of Knowledge Consultants, presenting IYCON’s Afterspyre modeling tool for creating a catalog of complex business objects, their attributes and their linkages to create organizational DNA diagrams. Ranking these with machine learning algorithms for semantic and sentiment analysis allows identification of process improvement opportunities. They have a number of standard business analysis techniques built in, and robust analytics focused on problem solving. The demo walked through their catalog, drilling down into the “Strategy DNA” section and the “Technology Solutions” subsection to show an enumeration of the platforms currently in place together with attributes such as technology risk and obsolescence, which can be used to rank technology upgrade plans. Relationships between business objects can be auto-detected based on existing data. Levels including Objectives, Key Processes, Technology Solutions, Database Technology and Datacenter, and their interrelationships, are mapped into a DNA diagram and an alluvial diagram, starting at any point in the catalog and drilling down a specific number of levels as selected by the modeling analyst. These diagrams can then be refined further based on factors such as scaling the individual markers based on actual performance. They showed sentiment analysis for a hotel’s ranking on a review site, which included extracting specific phrases that related to certain sentiments. They also demonstrated a two-model comparison, comparing the models of two different companies to determine their overlapping and unique processes: a good indicator of the level of difficulty of a merger/acquisition (or even a divestiture). They finished up with affinity modeling, of the type used by Amazon to tell you which books were bought by other people who also bought the book you’re looking at: easy to do in matrix form with a small data set, but computationally intensive once you get into non-trivial amounts of data. Affinity modeling is most commonly used in marketing to analyze buying habits and offer people something that they are likely to buy, even if it’s not what they planned to buy at first — this sort of “would you like fries with that” technique can increase purchase value by 30-40%. Related to that is correlation modeling, which can be used as a first step in determining causation. Impressive semantic data-driven analytics tool for modeling a lot of different organizational characteristics.

That’s it for day one; if everyone else is as overloaded with information as I am, we’re all ready for tonight’s wine tasting! Check the Twitter stream for opinions and photos from other attendees.

bpmNEXT 2018: Here’s to the oddballs, with ConsenSys, XMPro and BPLogix

And we’re off with the demo sessions!

Secure, Private, Decentralized Business Processes for Blockchains, ConsenSys

Vanessa Bridge of ConsenSys spoke about using BPMN diagrams to create smart contracts and other blockchain applications, while also including privacy, security and other necessary elements: essentially, using BPM to enable Ethereum-based smart contracts (rather than using blockchain as a ledger for BPM transactions and other BPM-blockchain scenarios that I’ve seen in the past). She demonstrated using Camunda BPM for a token sale application, and for a boardroom voting application. For each of the applications, she used BPMN to model the process, particularly the use of BPMN timers to track and control the smart contract process — something that’s not native to blockchain itself. Encryption and other steps were called as services from the BPMN diagram, and the results of each contract were stored in the blockchain. Good use of BPM and blockchain together in a less-expected manner.
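
As a rough sketch of that pattern, here’s what a service called from the BPMN diagram might look like in Python using the web3.py library; the node URL, contract address and ABI are placeholders rather than anything from the demo, which used Camunda BPM.

```python
# Sketch only: a function that a BPMN service task might call to record a
# boardroom vote on an Ethereum smart contract. The node URL, address and
# ABI are placeholders, not from the demo.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))  # local Ethereum node

VOTING_ABI = [{
    "name": "vote",
    "type": "function",
    "inputs": [{"name": "proposalId", "type": "uint256"}],
    "outputs": [],
    "stateMutability": "nonpayable",
}]
CONTRACT_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder

contract = w3.eth.contract(address=CONTRACT_ADDRESS, abi=VOTING_ABI)

def record_vote(voter_address: str, proposal_id: int) -> str:
    """Submit the vote transaction and return its hash, so the process
    instance can store the on-chain reference as a process variable."""
    tx_hash = contract.functions.vote(proposal_id).transact({"from": voter_address})
    receipt = w3.eth.wait_for_transaction_receipt(tx_hash)
    return receipt.transactionHash.hex()
```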

Turn IoT Technology into Operational Capability, XMPro

Pieter van Schalkwyk of XMPro looked at the challenges of operationalizing IoT, with a virtual flood of data from sensors and machines that needs to be integrated into standard business workflows. This involves turning big data into smart data via stream processing before passing it on to the business processes in order to achieve business outcomes. XMPro provides smart listeners and agents that connect the data to the business processes, forming the glue between realtime data and resultant actions. His demo showed data being collected from a fan on a cooling tower, bringing in data from the sensor logs and comparing it to the manufacturer’s information and historical information in order to predict if the fan is likely to fail, create a maintenance work order and even optimize maintenance schedules. They can integrate with a large library of action agents, including their own BPM platform or other communication and collaboration platforms such as Slack. They provide a lot of control over their listener agents, which can be used for any type of big data, not just industrial device data, and integrate complex prediction models covering failure likelihood and remaining useful life. He showed their BPM platform being used downstream from the analytical processing, where the internet of things can interact with the internet of people to make additional decisions in the context of additional information such as 3D drawings. Great example of how to filter through hundreds of millions of data points in streaming mode to find the few combinations that require action to be taken. He threw out a comment at the end that this could be used for non-industrial applications, possibly for GDPR data, which definitely made me think about content analytics on content as it’s captured in order to pick out which of the events might trigger a downstream process, such as a regulatory process.
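
The listener pattern itself is easy to sketch generically; this is not XMPro’s API, and the asset names and thresholds below are invented.

```python
# A minimal, generic sketch of the "smart listener" pattern: filter a
# high-volume sensor stream down to the few readings that should trigger
# a business process. Names and thresholds are invented.
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Reading:
    asset_id: str
    vibration_mm_s: float   # measured vibration
    rated_max_mm_s: float   # manufacturer's rated maximum

def anomalies(stream: Iterable[Reading]) -> Iterator[Reading]:
    """Pass through only readings that exceed the manufacturer's rating;
    everything else is dropped before it reaches the BPM layer."""
    for reading in stream:
        if reading.vibration_mm_s > reading.rated_max_mm_s:
            yield reading

def create_work_order(reading: Reading) -> None:
    # In the demo this step called into a BPM platform; here we just
    # print the action to keep the sketch self-contained.
    print(f"Maintenance work order raised for {reading.asset_id}")

# Usage: wire the listener to an action agent
sample = [Reading("fan-07", 12.4, 9.0), Reading("fan-08", 3.1, 9.0)]
for r in anomalies(sample):
    create_work_order(r)
```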

Business Milestones as Configuration, BPLogix

Scott Menter and Joby O’Brien of BPLogix finished up this section on new BPM ideas with their approach to goal orientation in BPM, which is milestone-based and requires understanding the current state of a case before deciding how to move forward. Their Process Director BPMS is not BPMN-based, but rather an event-based platform where events are used to determine milestones and drive the process forward: much more of a case management view, usually visualized as a project management-style Gantt chart rather than a flow model. They demonstrated the concept of app events, where changes in state of any of a number of things — form fields, activities, document attachments, etc. — can record a journal entry that uses business semantics and process instance data. This allows events from different parts of the platform to be brought together in a single case journal that shows the significant activity within the case, and also to act as triggers for other events, such as determining case completion. The journal can be configured to show only certain types of events for specific users — for example, if they’re only interested in events related to outgoing correspondence — and also becomes a case collaboration discussion. Users can select events within the journal and add their own notes, such as taking responsibility for a meeting request. They also showed how machine learning and rules can be used for dynamically changing data; although shown as interactions between fields on forms, this can also be used to generate new app events. Good question from the audience on how to get customers to think about their work in terms of events rather than procedures; case management proponents will say that business people inherently think about events/state changes rather than process. Interesting representation of creating a selective journal based on business semantics rather than just logging everything and expecting users to wade through it for the interesting bits.
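
Here’s a rough sketch of that app event journal idea (not BPLogix’s actual API; the event kinds and milestone logic are invented): state changes append business-meaningful entries, which can then be filtered per user and evaluated against milestones.

```python
# Sketch (not BPLogix's API) of an app event journal: state changes
# anywhere in a case append semantically meaningful entries that can be
# filtered per user or used to evaluate milestones. All names invented.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Set

@dataclass
class AppEvent:
    case_id: str
    kind: str          # e.g. "field-changed", "document-attached", "correspondence-sent"
    description: str   # business-semantics description, not a raw log line
    timestamp: datetime = field(default_factory=datetime.utcnow)

class CaseJournal:
    def __init__(self) -> None:
        self.events: List[AppEvent] = []

    def record(self, event: AppEvent) -> None:
        self.events.append(event)

    def view(self, kinds: Set[str]) -> List[AppEvent]:
        """A user-specific view, e.g. only outgoing correspondence."""
        return [e for e in self.events if e.kind in kinds]

    def milestone_reached(self, required_kinds: Set[str]) -> bool:
        """A milestone fires once all required event kinds have occurred."""
        seen = {e.kind for e in self.events}
        return required_kinds <= seen
```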

We’re off to lunch. I’m a bit out of practice at live-blogging, but hope that I captured some of the flavor of what’s going on here. Back with more this afternoon!

bpmNEXT 2018 kicks off: keynotes with @JimSinur and @NeilWD

It’s the first day of bpmNEXT, the conference for BPM visionaries and free thinkers to get together, share ideas, show their cool new stuff, meet new friends and get reacquainted with old ones. This is an opportunity for technologists (primarily senior technical people from BPM vendors) to give demos in a very structured format, but it’s not really a place for customers: more like the BPM Think Tanks of old. Organized by Bruce Silver and Nathaniel Palmer, themselves both long-time contributors to the industry, with content provided by a lot of people who are loud and proud about their technology.

That very structured format, in case you haven’t read about or attended bpmNEXT before, is a strictly limited Ignite-style presentation followed by a live demo. This limits the amount of time that presenters can spend showing slides and forces them to get to the good stuff.

You can see the agenda here, and we started out the first day with a few keynotes from industry thought leaders before getting to the demo presentations. I’ll cover those in this post, then do individual posts for each section of demos (usually three in a section). These will be rough notes since there’s a lot of information that goes by quickly; you’ll be able to see video of all of the sessions, most likely on the bpmNEXT YouTube channel (where you can also see previous years’ sessions).

The Future of Process in Digital Business, Jim Sinur

Jim Sinur, a long-time Gartner analyst who is now with Aragon Research, spoke about trends in digital business. Most of this was a plug for Aragon and their research reports, which seemed aimed at customer organizations; that’s not a great fit for this audience, where most of the people in the room are well-versed in these technologies and how to apply them in real life.

I’d really like to see more conversational sessions rather than presentations for the keynotes, or at least content that is more directly relevant to the audience.

A New Architecture for Automation, Neil Ward-Dutton

Neil Ward-Dutton, who heads up MWD Advisors, presented a distillation of the conversations that they’re having with customer organizations, starting with the difficult choices that customers have to make in terms of which technologies to choose: for example, when RPA vendors tell them that they don’t need BPM any more. He went through some insights into the technologies that are impacting CIOs’ strategic decisions — no surprises there — then presented a schematic model for how work happens in organizations as a basis for understanding how different technologies impact different parts of that work. The framework categorizes work into exploratory, transactional and programmatic, and he walked through what each of those types defines up front, and how the technologies are used within each. Good view of how to help organizations think about their work and how to develop automation strategies that address different work styles and applications.

Although a lot of his presentation was aimed at a general audience that could include customers, Neil finished up with a bit on next moves for vendors and technologists as the technology market changes: there are a lot of mergers and acquisitions going on, and older technologies are being replaced with newer ones in specific instances. He had some recommendations about rearchitecting products and adding value, shifting from one-size-fits-all products to collections of independent runtime services in order to support cloud architectures (especially elastic computing requirements) and provide more flexibility in product offerings.

Insurance technology: is this very conservative industry finally ready for its close-up?

I’ve worked with insurance clients for a long time, first helping them with automation in their underwriting, policy administration and claims processes, and now helping them with digital transformation to create new business models and platforms. One thing that has always struck me is how far behind the times most insurance companies are: usually old companies (by today’s standards), they sit far toward the conservative end of the business and technology innovation scale. However, new entrants to the market have been stirring the pot for a couple of years – such as Lemonade for the urban consumer property insurance market – and it seems that everywhere I look, there’s something popping up about innovation in insurance.

Capgemini has a significant insurance practice, and publishes an annual World Insurance Report that is about to be updated for 2018; a couple of their consultants write about different aspects of how insurance is changing and the technology enabling that change. They’ve just started a three-part series on the insurance customer of the future, which echoes some of the points that I made in my recent post on the Alfresco blog about transforming insurance with cloud BPM, and although they use the apocryphal “millennial” definition to describe who these customers are in their first post, they point out four main characteristics:

  • Smart shoppers
  • Lower loyalty
  • Self-centred
  • Caring consumers – which appears contrary to the previous point, but check out their post for a description

They have another post on how new InsurTech models can decrease risk for the insurer, which explains more about the social risk pool models that are used by companies like Lemonade, and how risk can be proactively mitigated through the use of connected devices.

We’re also seeing platform innovation for some insurers, such as Liberty Mutual moving their documents to Alfresco on the AWS cloud. As I’ve experienced for many years, just getting insurance companies to move from paper to digital files can provide huge operational benefits, and moving those files to the cloud allows a global insurer to provide access wherever it’s required. There are a lot of regulatory issues with data sovereignty, that is, where the content is actually stored and what laws/regulations apply to it because of that, but the vendors are starting to solve those problems with regional data centers and secure, encrypted transport. With digital content comes the issue of digital preservation, which John Mancini on the AIIM blog points out is a big issue for financial and insurance companies because of the typically long time spans over which they deal with customers: consider that a personal injury insurance claim can go on for years, requiring that all documents be retained for future review. After hearing about one former insurance customer of mine that had a flood in their basement storage, destroying years of customer files, I wished that they had decided to move a bit faster on my advice about digital documents.

Cutting edge technologies such as blockchain are also getting into the insurance mix: blockchain can be used to show proof of insurance, improve transparency and reduce risk of fraud, and speed up claims with smart contracts. I can also imagine that as cars get smarter and insurance companies can tie in directly to the on-board systems, there may be less opportunity for auto repair shop fraud, which reduces overall costs to the insurer and consumer.

If you work in insurance and know that you’re behind the curve, there are a lot of things that you can do to help bring yourself into at least the last century, if not this one:

  • Convert all of your files to digital format at the front end of the process, that is, when they arrive (or are created). This will allow you to automatically extract data from the files, which can then be used for classifying and routing content as it arrives. Files can now be shared by anyone who needs to see them, and there will be no piles of completed documents/files waiting to be scanned at the end of a process. This is a big cultural shift as your workers move from working on paper to working on the screen, but if you give them a couple of big screens and a properly-designed workspace, it can be just as productive as paper.
  • With all of your content arriving in digital form, or being converted to digital immediately on arrival, you can now automate your processes (see the intake-and-routing sketch after this list):
    • New policy application? Look up any previous information for this customer, create a new business case, and route to the appropriate underwriter if required. If this is a simple policy, such as consumer renter insurance, it can usually be automatically adjudicated and issued immediately.
    • Policy changes? Extract information from the policy administration system, classify the type of change, and either complete the change automatically or forward to a policy administration clerk.
    • A first notice of loss arriving for a claim? Use that to automatically extract information from your policy administration system, set up a claim in your claims system, and route the claim to the appropriate claims manager. Simple claims, such as auto windshield replacement, can be settled automatically and immediately.
    • Additional documents arriving for a claim? Automatically recognize the document type and claim number, and add to the claim case directly.
  • Find the best ways to integrate your digital content and processes with your legacy systems. This is a huge part of what I do with any insurance customer (really, with any customer at all), and it’s not trivial but can result in huge rewards. This will be some combination of exposing APIs, digging directly into operational databases, RPA to integrate “at the glass”, and other methods that are specific to your environment. In the end, you want to be sure that no one is re-entering data manually from one system to another, even by copy and paste.
  • Automate, automate, automate. In case I haven’t made that clear already. There should be no such thing as manual work assignment or routing, except in special cases. Data exchange with legacy systems should be automated. Decisions should be automated where possible, or at least used to make recommendations to workers. Incorporate artificial intelligence and machine learning to learn how your most skilled workers make decisions, align that with your policies and regulatory compliance, and use as input to automated decisions and recommendations. The workers will be left doing the work that actually requires a person to do it, not all of the low-level administrative work.
  • Use some type of low-code application development platform that allows semi-technical business analysts – there are a ton of these working in insurance business areas – to create their own situational apps.
  • Now that you have your operational processes sorted out, start looking for new ways to leverage your digital content and processes:
    • Interact with reinsurers and other business partners using digital content and processes, not paper files and faxes.
    • Provide customers with the option for completely paperless policy application, issuance and renewal. Although I’m far from being a millennial in age, the huge stack of paper sent by my previous home insurer on renewal was a key reason that I ran directly towards an online insurer that could do it all without paper.
    • Streamline claims processes, automating where possible. Many insurance companies don’t spend a lot of time fixing their claims processes, preferring to spend their time on attracting new customers; however, in this age of online consumer reviews, an inefficient claims process is going to hit hard. Automating claims also reduces operational costs: claims managers are highly skilled, and it can take 6-12 months to train a new one.
    • Automate and streamline your ancillary processes that support the main processes, such as recovery of assets, and negotiating contracts with preferred repair vendors.
    • Build in process monitoring, and provide automated dashboards and reports to different levels of management. As well as giving management a real-time view of operations, this reduces the time that line supervisors spend manually compiling reports. It also, amazingly, will reduce the amount of time that individual workers spend tracking their own work: in many of the insurance companies that I visit, claims managers and other front-line workers keep a manual log of their work because they don’t trust the system to do it for them.
  • Tie your process performance back to business goals: loss ratio, customer satisfaction, regulatory SLAs (such as communicating with customer in a timely manner), net promoter score, fraud rate, closure rate. It’s too easy to get bogged down in making a particular activity or process more efficient when it shouldn’t even be done at all. Although you can use your existing procedures guides as a starting point for your new digital processes, you really need to link everything back to the high-level goals rather than just paving the cow paths.
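
Here’s the minimal intake-and-routing sketch promised above: classify an incoming digital document and route it to an automated process. The keyword classifier stands in for a real capture or ML product, and the document types and process names are invented.

```python
# Minimal sketch of front-end intake: classify an incoming digital
# document and route it to the right automated process. The classifier,
# document types and process names are all illustrative.
def classify(document_text: str) -> str:
    """Stand-in for a real ML classifier or capture product."""
    text = document_text.lower()
    if "first notice of loss" in text:
        return "fnol"
    if "application" in text:
        return "new-policy"
    if "change" in text or "endorsement" in text:
        return "policy-change"
    return "claim-document"

ROUTES = {
    "fnol": "claims-intake-process",
    "new-policy": "underwriting-process",
    "policy-change": "policy-admin-process",
    "claim-document": "claim-case-attachment",
}

def route(document_text: str) -> str:
    """Return the target process for a document; in a real system this
    would start or update a process instance rather than return a name."""
    return ROUTES[classify(document_text)]

print(route("First Notice of Loss - auto windshield damage"))  # claims-intake-process
```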

This started out as a short post because I was seeing a flurry of insurance-related items in my news feed, and grew into a bit of a monster as I thought of my own experiences with insurance customers over the past couple of years. Nonetheless, likely some useful tidbits in here.

Integrating process and content: exploring the use cases

I recently wrote a series of short articles sponsored by Alfresco and published on their blog. Today, the third of the series was published, discussing some use cases for integrating content into your processes:

  • Document-driven processes
  • Case management
  • Document lifecycle processes
  • Support documentation for exceptions in data-driven processes
  • Classification and analysis processes for non-document content

Head over there to read all the details on each of these use cases. As I write at the end:

Over the years, I’ve learned two things about integrating process and content: first, almost every process application has some sort of content associated with it; and second, most process-centric developers underestimate the potential complexity of handling the content in the context of the process application.

While you’re over there, you can also check out the other two articles that I wrote: transforming insurance with cloud BPM, and BPM cloud architectures and microservices.

13 years of reporting on @BPTrends BPM annual reports: from vendor reviews to state of the BPM market

The very first post that I wrote on this blog was in March 2005, and it covered BPTrends’ 2005 BPM Suites report. I think that this was the first year that they published a BPM annual report, although I can’t even find a link to that report in their web archive and I don’t have it in my own archive. However, at the time, I listed the 13 vendors that were included (about half of which still exist in some form) and noted that it was a “pay for play” report that required vendors to pay $5,000 to participate. I don’t think that BPTrends does vendor reviews any more – the last that I have a record of was in 2007 – but they have an out-of-date page of vendor info and links that is provided for free.

By 2006, BPTrends was conducting surveys of practitioners, consultants and others involved in BPM, and had published their first State of Business Process Management report based on survey results, with others following every two years (2008, 2010, 2012, 2014, 2016). They’ve now published the 2018 State of Business Process Management report, sponsored by Red Hat and Signavio, who get top billing on the front page but presumably no editorial control or special treatment, since this is not a product review but rather a state-of-the-market report based on survey results. Because BPTrends has been creating and analyzing BPM surveys since 2005, they have a good view of how the market is evolving.

My guest post on the @Alfresco blog: BPM Cloud Architectures and Microservices

The second of the short articles that I wrote for Alfresco has been published on their blog, on BPM cloud architectures and microservices. I walk through the basics of cloud architectures (private, public and hybrid), containerization and microservices, and why this is relevant for BPM implementations. As I point out:

Not all BPM solutions are built for cloud-native architectures: a monolithic BPMS stuffed into a Docker container will not be able to leverage the advantages of modern cloud infrastructures.

Check out the full article on the Alfresco site.

Summarizing OPEXWeek 2018

I only had 1.5 days at OPEX Week 2018 in Orlando this week, and spent part of my time giving a presentation as well as sitting on a panel, so didn’t attend many sessions. However, I struck up a conversation with Eric Thompson at the reception last night without realizing that he was one of the original co-founders of Lombardi Software — now a part of IBM, with the Lombardi BPMS forming a good part of the core of IBM BPM — and had such an interesting talk that I sat in on the presentation that he did today with Doug Drolett about continuous improvement at Shell. Both Thompson and Drolett have senior CI roles at Shell.

Shell has been working on process improvement for more than 10 years, with business-centric process improvements during 2005-2009, moving to more end-to-end global process improvement during 2010-2013, and now focused on continuous improvement to the way that everyone works. Although driven from the top, with the CEO fully engaged, the idea is that it’s an ongoing cultural shift at every level. As they moved to this mindset, it became less about programmatic improvement (rolling out new systems to improve the business processes) and more about how that embedded culture impacts operational excellence. This results in everyone being focused on delivering value to the customer — however the customer is defined — through a perpetual cycle of plan-do-improve.

They talked about improving the order-to-cash process in their commercial business, and how they improved that process on a global scale, including standardization. They use customer journey mapping and “thinking like the customer” extensively to determine how and why to deliver value in those processes, which has an interesting tie-in with the session that I gave yesterday on how customer journey mapping and process improvement fit together. They also use value stream representations of customer-facing processes, and assign owners to those processes. Their front-line staff include Lean practitioners, with a smaller number of CI coaches overlaid on ongoing initiatives and projects. Since they’re a global operation, they use technology to enable collaboration so that a single CI initiative can involve participants from several countries.

As you might expect from a process-centric conference, OPEX Week is exceptionally well-run, and attracts a lot of attendees because of the quality of the content. The conference originated several years ago with a focus on the Lean Six Sigma community, and many of the attendees and speakers (such as those from Shell) have roles in their company such as continuous improvement, change management and business transformation. Although technology is definitely a component in most of the projects that people talk about here, it’s not the main thing; that’s what makes the typical attendee and speaker different from those at more technology-focused conferences. There’s a smallish display area for vendor booths and a relatively low-key vendor sponsor element that is integrated into the breakout tracks. They also have the talented visual facilitator Kimberly Dornisch capturing the themes in sessions while they’re going on. Here she is with the one that she did for the low-code panel that I was on:

I also gave a presentation yesterday on customer journey mapping, and you can see my slides here:

Transforming Insurance with Cloud BPM: my guest post on the @Alfresco blog

I recently wrote three short articles for Alfresco, which they are publishing on their blog. The first one is about insurance and cloud BPM, looking at how new business models are enabled and customer-facing processes improved using a containerized cloud architecture and microservices. From the intro:

In this blog post, I plan to explore the role BPMS plays in integrating packaged software, custom-built systems, and external services into a seamless process that includes both internal and external participants. What if you need to include customers in your process without having to resort to email or manual reconciliation with an otherwise automated process? What if you need employees and partners to participate in processes regardless of their location, and from any device? What if some of the functions that you want to use, such as machine learning for auto-adjudication, industry comparative analytics on claims, or integration with partner portals, are available primarily in the public cloud?

Head over there to read more about my 4-step plan for insurance technology modernization, although the same can be applied in many other types of organizations. They also have a webinar coming up next week on legacy ECM modernization at Liberty Mutual; with some luck, Liberty Mutual will read my article and think about how cloud BPM can help modernize their processes too.

The other two posts that I wrote for them – one that dives more into BPM cloud architectures and microservices, and one that examines use cases for content in process applications – will be published over the next couple of months. Obviously, Alfresco paid me to write the content that is published on their site, although it’s educational and thought leadership in nature, not about their products.

On the Alfresco topic, I’ll likely be at Alfresco Day in New York on March 28, since they’re holding an analyst briefing there the day before.