Prepping for OPEXWeek presentation on customer journey mapping – share your ideas!

I’m headed off to OPEX Week in Orlando later this month, where I’ll give a presentation on customer journey mapping and how it results in process improvement as well as customer satisfaction and value. Although customer journey mapping is commonly used to talk about user experience and navigation on customer-facing websites, I want to look at the bigger picture of what we used to call “outside-in processes”, where internal processes are turned on their head to show the process from the customer’s point of view. Once you start thinking about what the customer is trying to accomplish, it can completely change how you perform and prioritize the internal work, as well as change the user experience presented to the customer.

I’m preparing a few slides to guide the presentation, and if you have any good stories to share, feel free to let me know by commenting on this post or tweeting to me.

I’m also sitting on a panel the following day on low code and BPM, a topic that I’ve recently written a paper on (sponsored by TIBCO).

Presenting at OPEXWeek in January: customer journey mapping and low-code

I’ll be on stage for a couple of speaking slots at the OPEX Week Business Transformation Summit 2018 in Orlando the week of January 22nd:

  • Tuesday afternoon, I’ll lead a breakout session in the customer-centric transformation track on increasing customer satisfaction through customer journey mapping and process improvement.
  • Wednesday morning, I’ll be on a panel in the RPA track on how low-code platforms are transforming BPM.

I was last at OPEX Week in 2012, when it was still called PEX Week (for Process Excellence Network) – I was on a BPM blogger panel that time around – and it will be interesting to see how it’s changed since then. Looks like a lot more automation technology in the current version, with the expectation that digital transformation isn’t going to come about just by modeling your business.

If you’re going to be there, look me up at one of my sessions or around the conference on Tuesday and Wednesday.

TIBCO Nimbus for regulatory compliance at Bank of Montreal

It’s the first afternoon of breakout sessions at TIBCO NOW 2016, and Alex Kurm from Bank of Montreal is presenting how the bank has used Nimbus for process documentation, to serve the goals of regulatory compliance and process transformation. They are one of the largest Nimbus users, and Kurm leads a team of process experts deploying Nimbus across the enterprise as part of their in-house process excellence strategy.

He provided a good overview of regulatory and compliance requirements: to quote his slide, you need to have “evidence of robust, documented standard processes to ensure compliance to risk and regulatory requirements” as a minimum. Overlaid on that, there’s an evolving set of consumer demands, moving from traditional in-person, telephone and ATM banking to web and mobile platforms. As a Canadian resident, I can attest that our banks haven’t been as responsive as desired to customer needs in the past; their focus is on operational risk and security.

BMO’s process centre of excellence maintains a knowledge hub of process best practices (including how to use Nimbus in their environment), leads and supports process-related projects, and heads up governance of all process efforts. They have about 16 people in the CoE, plus process specialists out in the business areas; they have even internalized the Nimbus training. Although there are a variety of tools being used for process models in the bank, they selected Nimbus because of its business-understandable notation, the ability to put all process content in one place, the built-in governance and control over the content (key for auditors to be able to review), and the direct link between process architecture and process maps.

They started on Nimbus 3 years ago with about 20 process authors working on a couple of opportunistic projects; this quickly ramped up to 300 authors by the next year, and they now have more than 500 authors (including business analysts and project managers as well as process specialists), although only about 160 are active in any given month since this work is often project-based. There are 1800 end users looking at Nimbus maps each month; capital markets, a highly regulated area, has both the largest number of users and the highest number of distinct initiatives. They organize their 20,000 Nimbus maps by core business capability, such as onboarding, then drill down into the business area; they’re looking at ways of improving that to allow content to be found by any search path. They’re also adding Spotfire to be able to interrogate the content and find non-compliant and high-risk maps for review by the CoE.

Their key use cases are:

  • Process documentation for use as a high-level procedural guide
  • A guide for compliance auditors to verify that specific checks and balances are being done
  • Requirements gathering prior to automation (they are also an ActiveMatrix BPM customer), and as ongoing documentation of the automated process

Nimbus is now a core part of their process transformation and risk mitigation strategies; interestingly, the only resistance came from other “process gurus” in the bank who had their own favorite modeling tools.

Good case study of the benefit of process documentation – even in the absence of process automation – in highly regulated industries.

bpmNEXT 2016 demo session: Signavio and Princeton Blue

Second demo round, and the last for this first day of bpmNEXT 2016.

Process Intelligence – Sven Wagner-Boysen, Signavio

Signavio allows creating a BPMN model with definitions of KPIs for the process, such as backlog size and end-to-end cycle time. The demo today was their process intelligence application, which allows a process model to be uploaded along with an activity log of historical process instance data, in CSV format, from an operational system — either a BPMS or some other system such as an ERP or CRM. Since the process model is already known (in theory), this doesn’t do process mining to derive the model, but rather aggregates the instance data and creates a dashboard that shows the problem areas relative to the KPIs defined in the process model. Drilling down into a particular problem area shows some aggregate statistics as well as the individual instance data. Hovering over an instance shows the trace overlaid on the defined process model, that is, what path that instance took as it executed. There’s an interesting feature to show instances that deviate from the process model, typically by skipping or repeating steps where there is no explicit path in the process model to allow that. This is similar in nature to what SAP demonstrated in the previous session, although it uses imported process log data rather than a direct connection to the history data. Given that Signavio can model DMN integrated with BPMN, future versions of this could include intelligence around decisions as well as processes; this is a first version with some limitations.
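
To make the two checks concrete (the cycle-time KPI and the deviation detection), here’s a minimal sketch of doing the same thing by hand against a CSV activity log. This is my own illustration, not Signavio’s implementation; the column names (case_id, activity, timestamp), the expected path and the 10-day threshold are all assumptions.

```python
import csv
from collections import defaultdict
from datetime import datetime

EXPECTED_PATH = ["Received", "Reviewed", "Approved", "Completed"]  # assumed path from the BPMN model

def load_traces(path):
    """Group an activity log CSV into per-case traces, ordered by timestamp."""
    events = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            events[row["case_id"]].append((datetime.fromisoformat(row["timestamp"]), row["activity"]))
    return {case_id: sorted(evts) for case_id, evts in events.items()}

def cycle_time_days(trace):
    """End-to-end cycle time of one case, in days."""
    return (trace[-1][0] - trace[0][0]).total_seconds() / 86400

def deviates(trace):
    """True if the activity sequence is not exactly the expected path (skipped or repeated steps)."""
    return [activity for _, activity in trace] != EXPECTED_PATH

if __name__ == "__main__":
    for case_id, trace in load_traces("activity_log.csv").items():
        days = cycle_time_days(trace)
        if deviates(trace) or days > 10:  # 10-day KPI threshold, purely illustrative
            print(case_id, f"{days:.1f} days", [a for _, a in trace])
```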

Leveraging Cognitive Computing and Decision Management to Deliver Actionable Customer Insight – Pramod Sachdeva, Princeton Blue

Sentiment analysis of unstructured social media data, creating a dashboard of escalations and activities integrated with internal customer data. It uses Watson for much of the analysis and IBM ODM to apply rules for escalation; future enhancements may add IBM BPM to automatically spawn action/escalation processes. It includes a history of sentiment for the individual, tied to service requests that responded to social media activity. There are other social listening and sentiment analysis tools that have been around for a while, but they mostly just drive dashboards and visualizations; the goal here is to apply decisions about escalations, and trigger automated actions based on the results. Interesting work, but this was not a demo up to the standards of bpmNEXT: it was only static screenshots and some additional PowerPoint slides after the Ignite portion, effectively just an extended presentation.

bpmNEXT 2016 demo session: 8020 and SAP

My panel done — which probably set some sort of record for containing exactly half of all the female attendees at the conference — we’re on to the bpmNEXT demo session: each is 5 minutes of Ignite-style presentation, 20 minutes of demo, and 5 minutes for Q&A. For the demos, I’ll just try to capture some of the high points of each, and I highly recommend that you check out the videos of the presentations when they are published after the conference.

Process Design & Automation for a New Economy – Ian Ramsay, 8020 BPM

A simplified, list-based process designer that defines a list of real-world business entities (e.g., application), a list of states unique to each entity (e.g., approved), lists of individuals and groups, and lists of stages and the tasks associated with each stage. Each new process has a list of start events that happen when a process is instantiated, one or more tasks in the middle, then a list of end events that define when the process is done. Dragging from the lists of entities, states, groups, individuals, stages and tasks onto the process model creates the underlying flow and events, building a more comprehensive process model behind the scenes. This allows a business specialist to create a process model without understanding process modeling or even simple flowcharting, just by identifying the relationships between the different states of a business entity, the stages of a business process, and the people involved. Removing an entity from a process modifies the model to remove that entity while keeping the model syntactically correct. An interesting alternative to BPMN-style process modeling, from someone who helped create the BPMN standard, where the process model is a byproduct of entity-state modeling.
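
As a rough sketch of what “the process model is a byproduct” could mean in practice, here’s my own interpretation (not 8020’s implementation) of how a list-based designer might derive a simple sequential flow from an entity, its states, and the people responsible for each stage; all names and structures below are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    states: list  # ordered states, e.g. ["submitted", "reviewed", "approved"]

@dataclass
class ProcessDefinition:
    name: str
    entity: Entity
    owners: dict = field(default_factory=dict)  # state -> group or individual responsible

    def derive_flow(self):
        """Generate a simple sequential flow model: one task per state transition."""
        flow = [{"type": "start", "label": f"{self.entity.name} created"}]
        for prev, nxt in zip(self.entity.states, self.entity.states[1:]):
            flow.append({"type": "task",
                         "label": f"move {self.entity.name} from {prev} to {nxt}",
                         "owner": self.owners.get(nxt, "unassigned")})
        flow.append({"type": "end", "label": f"{self.entity.name} is {self.entity.states[-1]}"})
        return flow

application = Entity("application", ["submitted", "reviewed", "approved"])
process = ProcessDefinition("loan approval", application,
                            {"reviewed": "underwriting team", "approved": "credit manager"})
for step in process.derive_flow():
    print(step)
```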

Process Intelligence for the Digital Age: Combining Intelligent Insights with Process Mining – Tarun Kamal Khiani and Joachim Meyer, SAP, and Bastian Nominacher, Celonis

Combining SAP’s Operational Process Intelligence analytics and dashboard (which was shown at last year’s bpmNEXT as well as in some other briefings that I’ve documented) with Celonis’ process mining. Drilling down on a trouble item from the OPInt dashboard, such as late completion of a specific process type, helps determine the root cause of the problem; this includes actionable insights, that is, being able to trigger an operational activity to fix the problem. That allows case-by-case problem resolution, but adding in the Celonis HANA-based process mining capability allows past process instance data to be mined and analyzed. Adjusting the view on the mined data allows outliers and exceptions to be identified, transforming the straight-through process model into a full model of the instance data. For root cause analysis, this involved filtering down to only the processes that took longer than a specific number of days to complete, then manually identifying the portions of the model where lag times or certain activities may be causing the overly long cycle time. Similar to other process mining tools, but nicely integrated with SAP S4 processes via the in-memory HANA data mart: no export or preprocessing of the process instance history log, since the process mining is applied directly to the real-time data. This has the potential to be taken further by doing real-time recommendations based on the process mining data and some predictive modeling, although that’s just my opinion.
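
For illustration, here’s a hedged sketch of the root-cause filtering step described in the demo: keep only the cases that took longer than a threshold, then aggregate the lag time between consecutive activities to see where those slow cases spend their time. This is plain Python over an in-memory event log, not the Celonis/HANA implementation, and the field names and thresholds are assumptions.

```python
from collections import defaultdict
from datetime import timedelta

def long_running_cases(traces, threshold_days=30):
    """Keep only cases whose end-to-end duration exceeds the threshold.
    traces: {case_id: [(timestamp, activity), ...]} with events sorted by timestamp."""
    return {case_id: events for case_id, events in traces.items()
            if events[-1][0] - events[0][0] > timedelta(days=threshold_days)}

def average_lag_by_transition(traces):
    """Average lag (in hours) between each pair of consecutive activities across the given cases."""
    lags = defaultdict(list)
    for events in traces.values():
        for (t1, a1), (t2, a2) in zip(events, events[1:]):
            lags[(a1, a2)].append((t2 - t1).total_seconds() / 3600)
    return {transition: sum(values) / len(values) for transition, values in lags.items()}

# Usage idea: slow = long_running_cases(all_traces, threshold_days=30), then sort
# average_lag_by_transition(slow) by value to find the transitions where the slow
# cases lose the most time -- the manual inspection step described in the demo.
```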

Good start to the demos, with some new ideas on modeling and real-time process mining.

Positioning Business Modeling panel at bpmNEXT

We had a panel of Clay Richardson of Forrester, Kramer Reeves of Sapiens and Denis Gagné of Trisotech, moderated by Bruce Silver, discussing the current state of business modeling in the face of digital transformation, where we need to consider modeling processes, cases, content, decisions, data and events in an integrated fashion rather than as separate activities. The emergence of the CMMN and DMN standards, joining BPMN, is driving modeling platforms that not only include all three of these, but provide seamless integration between them in the modeling environment: a decision task in a BPMN or CMMN model links directly to the DMN model that represents that decision; a predefined process snippet in a CMMN model links directly to the BPMN model; and an ad hoc task in a BPMN model links directly to the CMMN model. The resulting models may be translated to (or even created in) a low-code executable environment, or may be purely for the purposes of understanding and optimizing the business.

Some of the points covered on the panel:

  • The people creating these models are often in a business architecture role if they are being created top down, although bottom-up modeling is often done by business analysts embedded within business areas. There is a large increase in interest in modeling within architecture groups.
  • One of the challenges is how to justify the time required to create these models. A potential positioning is that business models are essential to capturing knowledge and understanding the business even if they are not directly executable, and as organizations’ use of modeling matures and gains visibility with executives, it will be easier to justify without having to show an immediate tangible ROI. Executable models are easier to justify since they are an integrated part of an application development lifecycle.
  • Models may be non-executable because they model across multiple implementation systems, or are used to model activities in systems that do not have modeling capabilities, such as many ERP, CRM and other core operational systems, or are at higher levels of abstraction. These models have strategic value in understanding complexity and interrelationships.
  • Models may be initiated using a model derived from process/data mining to reduce the time required to get started.
  • Modeling vendors aren’t competing against each other; they’re competing against old methods of text-based business requirements.
  • Many models are persistent, not created just for a specific point in time and discarded after use.

A panel including two vendors and an analyst made for some lively conversation, and not a small amount of finger-pointing. 🙂

HP Consulting’s Standards-Driven Requirements Method at BPMCM15

Tim Price from HP’s enterprise transformation consulting group presented in the last slot of day 2 of the BPM and case management summit (and what will be my last session, since I’m not staying for the workshops tomorrow) with a discussion on how to improve requirements management by applying standards. There are a lot of potential problems with requirements: inconsistency, not meeting the actual needs, not being designed for change, and especially the short-term thinking of treating requirements as project rather than architecture assets. Price is pretty up-front that you can’t take a “garden variety” business analyst and have them create BPMN diagrams without training, and that 50% of business analysts are unable to create lasting and valuable requirements.

Although I haven’t done any quantitative studies on this, I would tend to agree that the term “business analyst” covers a wide variety of skill levels, and you can’t just assume that anyone with that title can create reusable requirements models and assets. This becomes especially important when you move past written requirements — that need the written language skills that many BAs do have — to event-driven BPMN and other models; the main issue is that these models are actually code, albeit visual code, that may be beyond the technical analysis capabilities of most BAs.

Getting back to Price’s presentation, he established traceability as key to requirements: between BPMN or UML process models and UML use cases, for example, or upwards from processes to capabilities. Data needs to be modeled at the same time as processes, and processes should be modeled as soon as the high-level use case is defined. You can’t always create a one-to-one relationship between different types of elements: an atomic BPMN activity may translate to a use case (system or human), to more than one use case, or to only a portion of a use case; lanes and pools may translate to use case actors, but not necessarily; events may represent states and implied state transitions, although also not necessarily. Use prose for descriptions, but not for control flow: that’s what you use process models for, with the prose just explaining the process model. Develop the use case and process models first, then write text to explain whatever is not obvious in the diagrams.

He walked through a case study of a government welfare and benefits organization that went through multiple failed implementations, which were traced back to poor requirements: structural problems, consistency issues, and designs embedded in the specification. Price and his team spent 12 months getting their analysts back on track by establishing standards for creating requirements — with a few of the analysts not making the transition — that led to CMMI recognition of their new techniques. Another case study applied BPMN process models and UML use cases for a code modernization process: basically, their SDLC was the process being improved. A third case study used BPMN to document as-is and to-be processes, then added use case models with complete traceability from the to-be processes to the use cases, with UML domain class models developed in parallel.

The lessons learned from HP’s experiences:

  • Apply existing standards consistently, including BPMN, CMMN, DMN, UML
  • Use graph-structured languages for structure and logic, and prose for description
  • Use repository-based modeling tools to allow for reusability and collaboration
  • Be concise, be precise, be consistent
  • Create requirements models that are architecture assets, not just short-term project assets

Some good lessons for requirements analysis; although this was developed for complex, more waterfall-y SDLCs, some of these can definitely be adapted for more agile implementations.

The Digital Enterprise Graph with @denisgagne at BPMCM15

Yesterday, Denis Gagné demonstrated the modeling tools in the Trisotech Digital Enterprise Suite, and today he showed us the Digital Enterprise Graph, the semantic layer that underlies the modeling tools and allows for analysis of relationships between them. There are many stakeholders involved in defining and implementing a digital enterprise, including enterprise architects, business architects and process analysts; each of these roles has a different view on transformation of the enterprise and different goals for their work. He sees a need for a more emergent enterprise architecture rather than a structured top-down architecture effort: certainly, architects need to create the basic structure, but rather than trying to build out every artifact that might exist in the architecture before making use of it, a more pragmatic approach is a “just-in-time” architecture that is a bit more self-organizing.

A graph, in general, is a powerful but simple construct: it consists only of nodes and links, but can provide meaningful connections of loosely-coupled business entities that can be easily modified. Think about a social graph, such as Facebook’s social graph: it’s just people and their connections, but it’s a rich context for analyzing the relationships between nodes (people) in the graph depending on the nature of the links (friends, likes, etc.) between them. Trisotech’s Digital Enterprise Graph links the who, what, when, where, why and how of an organization by mapping every model that is added to the Graph onto those types of nodes and links, whether the model originates with one of their own modelers (BPMN, CMMN, DMN) or an external EA modeling tool (Casewise, SAP PowerDesigner, Sparx). This provides an intelligent fabric for automated reasoning about the current relationships between parts of the organization, but also allows estimation of the impact of changes in one area on other parts of the organization. Their Insight Analyzer tool allows you to introspect the graph, providing views such as interconnectivity between nodes as part of impact analysis, or tracing responsibility for a capability up through the organizational structure. The analysis isn’t automated, but provides visualization tools for analysts and planners, based on a single integrated scheme that allows for straightforward queries.
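
To show just how simple the underlying nodes-and-links construct can be, here’s a toy sketch of an enterprise graph with typed links and a reachability-style impact analysis. It’s purely illustrative, not Trisotech’s implementation; the node names and link types are made up.

```python
from collections import defaultdict

class EnterpriseGraph:
    """Toy nodes-and-links graph: nodes are model elements, links are typed relationships."""

    def __init__(self):
        self.links = defaultdict(list)  # node -> [(link_type, neighbour), ...]

    def add_link(self, source, link_type, target):
        # Store both directions so impact analysis can traverse either way.
        self.links[source].append((link_type, target))
        self.links[target].append((f"inverse-of {link_type}", source))

    def impacted_by(self, node):
        """All nodes reachable from `node`, i.e. potentially affected by a change to it."""
        seen, stack = set(), [node]
        while stack:
            for _, neighbour in self.links[stack.pop()]:
                if neighbour not in seen and neighbour != node:
                    seen.add(neighbour)
                    stack.append(neighbour)
        return seen

g = EnterpriseGraph()
g.add_link("Onboarding process (BPMN)", "contains", "Verify identity task")
g.add_link("Verify identity task", "invokes", "KYC decision (DMN)")
g.add_link("Compliance officer (role)", "responsible for", "Verify identity task")
print(g.impacted_by("KYC decision (DMN)"))  # everything touched if the decision model changes
```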

He gave us a demo of the Graph in action, starting with a BPMN model that uses the Sparx EA accelerator for SOA architecture artifacts, and tracing through that loose coupling to the architectural components in the EA framework, with similar linkages for roles from a Casewise business architecture framework and definitions of contracts from the Financial Industry Business Ontology (FIBO). The idea is that the Graph provides an underlying mesh of semantic linkages from elements in a model to other frameworks, ontologies and models while still retaining business understandability at the model level. In the Insight Analyzer, we saw how to explore linkages between different types of elements, such as RACI-type relationships between roles and activities, as well as a more detailed introspection that allows drilling down on any node to see what other nodes and models it is linked to, and traversing those links.

Interesting ideas about how to bring together all of the architecture, process, case and decision models and frameworks into a single graph for analysis of your digital enterprise.

Fannie Mae Case Study on Effective Process Modeling at BPMCM15

Amit Mayabhate from Fannie Mae (a US government-sponsored mortgage lender that buys mortgages from the banks and packages them for sale as securities) gave a session at the BPM and Case Management Summit on outcome-based process modeling for delivering business value. He had a few glitches getting started — apparently Fannie Mae doesn’t allow employees to download a presentation to their laptop, so he had to struggle through getting connected to the conference wifi and then the Fannie Mae VPN to open a PDF of his presentation — but did tell the best joke of the day when he was restarting his computer in front of us and said “now you know my password…it’s 8 dots in a row”.

Back on track, he discussed their business architecture efforts and how process modeling fits into them. Specifically, he talked about their multifamily housing division, which had its own outdated and inflexible technology platform that they wanted to change out for a simpler infrastructure that would give them better access to information for better decision-making. To get there, they decided to start with the best possible outcome in mind, but first had to have the organization understand not only that they had problems, but some quantification of how big those problems were in order to set those future goals. They identified several key metrics where they could compare today’s measurements with their desired future goals, such as operational efficiency (manual versus automated) and severability. To map from the current to future state, they needed a transformation roadmap and a framework for achieving the steps on the roadmap; this included mapping their journey to greater levels of process maturity, and creating a business capability model that included 17 capabilities, 65 functions, 262 sub-functions, and around 300 process flows.

Their business architecture transformation framework started with the business model (how do we make money), the operating model (how do we behave to make money) and the business capability model (what abilities are needed), using the Business Model Canvas. They used other architecture analysis tools, such as analyzing their operating model by plotting business process standardization against business process integration for both their current state and desired future state, to help them develop the strategy for moving between them. They used Mega’s business strategy module for most of the architecture analysis, which helps them identify business processes that are ripe for automation, then move to a BPMS for process modeling and automation. In that way, they can do just the process modeling that provides them with some architectural change that they know will provide value, rather than attempting to boil the ocean by modeling all processes in the organization.

Going Beyond Process Modeling, Part 1

I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is known for its free downloadable process modeler, but also has a full-featured BPMS for process execution.

My papers are not at all specific to Bizagi products; the first one, which you can find here (registration required), outlines the business benefits of automating and managing processes, and presents some use cases. In my experience, almost every organization models their processes in some way, but most never move beyond process analysis to process management. This paper will provide some information that can help build a business case to do just that.

The second paper will be released in a few weeks, covering a more technical view of exactly how you go about starting on process automation projects, and moving from an initial project to a broader program or center of excellence.

We’re also scheduling a webinar to expand on the concepts in the paper; I’ll post the date when that’s available.

If you want to learn more about how Bizagi stacks up in the BPMS marketplace, check out the report on Bizagi from the Fraunhofer Institute for Experimental Software Engineering, available in both English and German. Spoiler alert: relative to the participating vendors, Bizagi scored above average in six of the nine categories, with the remaining three around average. This is a more rigorous academic view than you might find in a typical analyst report on a vendor, including test scenarios and scripts for workshops where they created and ran sample process applications. Fraunhofer sells a book with the complete market analysis of all vendors studied, although I could only find a German edition on their site.