Wearable Workflow by @wareFLO at BPMCM15

Charles Webster gave a breakout session on wearable workflow, looking at some practical examples of combining wearables — smart glasses, watches and even socks — with enterprise processes, allowing people wearing these devices to have device events integrated directly into their work without having to break to consult a computer (or at least a device that self-identifies as a computer). Webster is a doctor, and has a lot of great case studies in healthcare, such as detecting when a healthcare worker hasn’t washed their hands before approaching a patient by instrumenting the soap dispenser and the worker. Interestingly, the technology for the hand hygiene project came from smart dog collars, and we’re now seeing devices such as Intel’s Curie that are making this much more accessible by combining sensors and connectivity as we commercialize the internet of things (IoT).
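
To make that concrete, here’s a minimal sketch (my own illustration, not from Webster’s talk) of the kind of event correlation that could sit behind a hand-hygiene monitor, assuming timestamped events from an instrumented soap dispenser and a worker’s badge; the event names and the two-minute window are invented:

```python
# Illustrative sketch only: flag a patient approach with no recent wash event.
HYGIENE_WINDOW_SECONDS = 120  # assumed: a wash "counts" for two minutes

def hygiene_alert(worker_events, approach_time):
    """Return True if the worker approaches a patient without a recent wash event."""
    recent_wash = any(
        e["type"] == "soap_dispensed"
        and 0 <= approach_time - e["time"] <= HYGIENE_WINDOW_SECONDS
        for e in worker_events
    )
    return not recent_wash

events = [{"type": "soap_dispensed", "time": 100}]
print(hygiene_alert(events, approach_time=150))  # False: washed 50s before approach
print(hygiene_alert(events, approach_time=400))  # True: last wash too long ago
```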

He was an early adopter of Google Glass, and talked to us about the experience of having a wearable integrated into his lifestyle, such as for voice-controlled email and photography, plus some of his ideas for using Google Glass in healthcare workflows, where electronic health records (EHR) and other device information can be integrated with work patterns. Google Glass, however, was not a commercial success: it was too bulky and geeky-looking, and required frequent recharging with heavy use. These devices need further miniaturization before most people will consider them, but that’s a matter of time, and probably a short amount of time, especially if they’re integrated directly into eyeglass frames, which likely have a lot of unused volume that could be filled with electronic components.

Webster talked about a university curriculum for healthcare technology and IoT that he designed, which would include the following courses:

  • Wearable human factors and workflow ergonomics
  • Data and process mining of wearable data, since wearables generate so much interesting data that needs to be analyzed and correlated
  • Designing and prototyping wearable products

He is working on a prototype for a 3D-printed, Arduino-based wearable interactive robot, MrRIMP, intended to be used by pediatric healthcare professionals to amuse and distract their young patients during medical examinations and procedures. He showed us a video demo of himself interacting with MrRIMP, and the different versions that he’s created. Great ideas about IoT, wearables and healthcare.

Day 2 Keynote at BPMCM15

Second day at the BPM and Case Management summit in DC, and our morning keynote started with Jim Sinur — former Gartner BPM analyst — discussing opportunities in BPM and case management. He pointed out the proven benefits of process and case management, in terms of improving revenue, costs, time to market, innovation and visibility, while paving a path to digital transformation. However, these tried-and-true ROI measures just aren’t enough these days: we also need to consider customer loyalty, IoT, disruptive companies and business models, and in general, maintaining competitive differentiation in whatever way necessary to thrive in the emerging marketplace. To accommodate this, as well as attract good workers, it’s necessary to break the specialist mindset and allow people to become knowledge workers. I gave a workshop last week at the IRM BPM conference on the future of work, and I agree that this is a key part of it: more of the routine work is being automated, leaving the knowledge work for the people in the process; this requires a work environment that allows people to do the right thing at the right time to achieve a goal, not just work at a pre-defined task in a pre-defined way. Sinur cited a number of examples of processes that are leveraging emerging technologies, including knowledge workers’ workbenches that incorporate smart automated agents and predictive analytics, and IoT applications in healthcare and farming. The idea is to create goal-driven and proactive “swarming” processes that figure out on their own how to accomplish a goal through both human and automated intelligence, then assemble the resources to do it. Instead of pre-defining processes, you provide goals, constraints, analytics and contexts; the agents — including people, services, bots and sensors — create each process instance on the fly to best meet the situation. Although his case studies included a number of other technologies, he finished with a comment on how BPM and case management can be used to coordinate and orchestrate these processes as we move to a new world of digital transformation of the customer experience.

Next up was Tom Debevoise, now with Signavio to help promote their recently-released DMN modeler; we had a sneak peek of the DMN modeler at bpmNEXT. He talked about three levels of decisions — strategic (e.g., should we change our business model), tactical (e.g., which customers to target) and operational (e.g., which discount to apply to this transaction) — and how these tend to be embedded within process models and business application logic, rather than externalized into decision models where they can be explicitly managed. Most organizations manage their decisions, both human and automated, very poorly, resulting in inconsistent or just plain wrong decisions being made. In other words, our business decisions are at the same point now as business processes were a decade or more ago, before BPM systems became widespread, and the path to improving this is to treat decision management as a discipline, supported by systems to model and automate decisions. We now have a decision modeling standard, DMN 1.0; this is expected to drive the adoption of decision modeling in organizations in the same way that BPMN did for process modeling. He proposed a decision management lifecycle similar to a BPM lifecycle, starting with decision discovery, where models are built from the DMN-standard elements: decisions, input data, knowledge sources, information requirements, authority requirements and knowledge requirements. He wrapped up with the linkage between process and decision models, particularly using the Signavio BPMN and DMN modelers: decisions that are defined externally to a process can be used to assign process activity participants, decide on next steps, select the process pathway, define data access control, or detect and respond to events. We saw yesterday how Trisotech’s tools combine BPMN, CMMN and DMN, and today how Signavio combines BPMN and DMN; as more process modeling vendors expand to include decision modeling, we are going to see more integrated implementations of these modeling standards.
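
To make those DMN elements a bit more concrete, here’s a minimal sketch (my own illustration, not from Debevoise’s talk) of a decision requirements graph expressed as plain data; all of the element names are invented:

```python
# Illustrative sketch only: the DMN element types as a tiny decision
# requirements graph (DRD), with the three requirement link types.
drd = {
    "decisions": {
        "Apply Discount": {
            "information_requirements": ["Customer Tier", "Order Total"],  # input data
            "knowledge_requirements":   ["Discount Policy Table"],         # reusable logic (BKM)
            "authority_requirements":   ["Pricing Guidelines"],            # knowledge source
        }
    },
    "input_data":        ["Customer Tier", "Order Total"],
    "knowledge_sources": ["Pricing Guidelines"],
}

# A modeler would render this as a DRD; a business rule task in the companion
# BPMN process model would then invoke the "Apply Discount" decision.
for name, deps in drd["decisions"].items():
    print(name, "depends on", deps["information_requirements"])
```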

The last speaker in the keynote was Lloyd Dugan, on how business architecture and BPM work together, in response to a paper that he wrote last year with Neal McWhorter. Although dense (I recommend checking out the paper at the link), his presentation discussed some of the issues with reconciling business architecture and BPM, such as mapping value stream, balanced scorecard and other BA models to activities within a process model. He reviewed a number of definitions and model types, cutting a wide swath through pretty much everything even remotely related to process and architecture, and highlighting some of the failures of mapping enterprise architecture frameworks to BPMN. He finished with a spectrum from the business model perspective (what the business is doing) to the operational model perspective (how the business is doing it), and how the business architecture and BPM viewpoints differ but can still both use BPMN as a modeling language. Pretty sure of two things from this: 1) I missed a lot of the detail, and 2) Dugan has never heard that you’re supposed to have fewer than 500 words on each PowerPoint slide.

BPMN, CMMN and DMN with @denisgagne at BPMCM15

Last session of day 1 of the BPM and Case Management Summit 2015 in DC, and Denis Gagne of Trisotech is up to talk about the three big standards: the Business Process Model and Notation (BPMN), the Case Management Model and Notation (CMMN), and the Decision Model and Notation (DMN). BPMN has been around for a few years and is well-established — pretty much every business process modeling and automation vendor uses BPMN in some form in their process modelers, and it is OMG’s most-adopted standard — but CMMN and DMN are much newer and less widespread in the market: there are a few vendors offering CMMN modelers, and even fewer offering DMN. There are two major benefits to standards such as BPMN, CMMN and DMN, in addition to the obvious benefit of providing an unambiguous format for modeling processes, cases and decisions: they can be used to create models that can be interchanged between different vendors’ products; and they provide a common and readily-transferable “language” that is learned by analysts. This interchangeability, both of models and skills, means that organizations don’t need to be quite so worried about which modeling tool they use, or the people that they hire to use it. Denis was at OMG’s Model Interchange Working Group (MIWG) meeting in Berlin last week, where they showed all types of interchange for BPMN; with luck, we’ll be seeing the same activities for the other standards as they become widely adopted.

There are some grey areas about when to use BPMN versus CMMN, since both are (sort of) process-based. However, the main focus in BPMN is on activities within processes, whereas CMMN focuses on events that impact cases. He showed a chart comparing different facets of the three standards:

BPMN          CMMN                            DMN
Processes     Cases                           Decisions
Activities    Events                          Rules
Transitional  Contextual                      Applied
Data          Information                     Knowledge
Procedural    Declarative                     Functional
Token         Event Condition Action (ECA)    First Order Logic (FOL)

The interesting part (at least to me) comes when we look at the bridges between these standards: in BPMN, there is a business rule task that can call a decision in DMN; in CMMN, there is a process task that can call a process defined in BPMN. Trisotech’s version of all of these modelers (not yet in the standards, but count on Denis to get them in there) also provides for a case task type in BPMN that can call a CMMN case, and a decision task in CMMN that can call a DMN decision. There are some patterns to watch for when modeling that might indicate that you should be using another model type:

  • In BPMN, if you have a lot of gateways expressing business logic, then consider moving the gateway logic to DMN (see the sketch following this list)
  • In BPMN, if you have a lot of events, especially boundary events, then consider encapsulating that portion into a CMMN case
  • In BPMN, if you have a lot of ad hoc subprocesses, then consider using CMMN to allow for greater specification of the ad hoc activities
  • In CMMN, if you have a lot of task interdependencies, consider using BPMN to replace the temporal dependencies with flow diagrams

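Taking the first of those patterns as an example, here’s a quick illustrative sketch (mine, not from Denis’s talk) of what moving gateway logic into a DMN-style decision table looks like; the discount rules, field names and first-match hit policy are all invented:

```python
# Before: hypothetical discount logic buried in process gateways, shown
# here as chained conditionals inside the process flow.
def discount_via_gateways(customer):
    if customer["tier"] == "gold" and customer["order_total"] >= 1000:
        return 0.15
    elif customer["tier"] == "gold":
        return 0.10
    elif customer["order_total"] >= 1000:
        return 0.05
    else:
        return 0.0

# After: the same logic externalized as a DMN-style decision table that a
# BPMN business rule task would invoke. Rows are rules, evaluated top-down
# with first-match ("first" hit policy) semantics.
DISCOUNT_TABLE = [
    # (tier or None as wildcard, minimum order total) -> discount
    (("gold", 1000), 0.15),
    (("gold", 0),    0.10),
    ((None,   1000), 0.05),
    ((None,   0),    0.00),
]

def discount_via_decision_table(customer):
    for (tier, min_total), discount in DISCOUNT_TABLE:
        if (tier is None or customer["tier"] == tier) and customer["order_total"] >= min_total:
            return discount

print(discount_via_decision_table({"tier": "gold", "order_total": 1500}))   # 0.15
print(discount_via_decision_table({"tier": "silver", "order_total": 500}))  # 0.0
```

The payoff of this refactoring is that the table can be versioned and changed as a managed decision without touching the process model, which is exactly the externalization that Debevoise argued for earlier.
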
The recognition and refactoring of these patterns are pretty critical for using the right model type, and are likely an area where a more trained technical analytical eye might be able to suggest improvements to models created by a less-technical analyst who isn’t familiar with all of the model types, or with how to think about this sort of decomposition and linking.

He demonstrated integration between the three model types using the Trisotech BPMN, CMMN and DMN modelers, where a decision task in the BPMN modeler can link directly to a decision within a model in the DMN modeler, and a case task in BPMN can link directly to a case model in the CMMN modeler. Nice integration, although it remains to be seen what analyst skill level is required to model across all three types, or how to coordinate different analysts who might each be modeling in only one of the three model types, so that their loosely-coupled models still work together.

Disclosure: I’m doing some internal work with Trisotech, which means that I have quite a bit of knowledge about their products, although I have not been compensated in any way for writing about them here on my blog.

Fannie Mae Case Study on Effective Process Modeling at BPMCM15

Amit Mayabhate from Fannie Mae (a US government-sponsored mortgage lender that buys mortgages from the banks and packages them for sale as securities) gave a session at the BPM and Case Management Summit on outcome-based process modeling for delivering business value. He had a few glitches getting started — apparently Fannie Mae doesn’t allow employees to download a presentation to their laptop, so he had to struggle through getting connected to the conference wifi and then the Fannie Mae VPN to open a PDF of his presentation — but did tell the best joke of the day when he was restarting his computer in front of us and said “now you know my password…it’s 8 dots in a row”.

Back on track, he discussed their business architecture efforts and how process modeling fits into them. Specifically, he talked about their multifamily housing division, which had its own outdated and inflexible technology platform that they wanted to swap out for a simpler infrastructure that would give them better access to information for better decision-making. To get there, they decided to start with the best possible outcome in mind, but first had to have the organization understand not only that they had problems, but also some quantification of how big those problems were, in order to set those future goals. They identified several key metrics where they could compare today’s measurements with their desired future goals, such as operational efficiency (manual versus automated) and severability. To map from the current to the future state, they needed a transformation roadmap and a framework for achieving the steps on the roadmap; this included mapping their journey to greater levels of process maturity, and creating a business capability model that included 17 capabilities, 65 functions, 262 sub-functions, and around 300 process flows.

Their business architecture transformation framework started with the business model (how do we make money), the operating model (how do we behave to make money) and the business capability model (what abilities are needed) using the Business Model Canvas framework. They used other architecture analysis tools, such as analyzing their operating model by plotting business process standardization against business process integration both for their current state and desired future state, to help them develop the strategy for moving between them. They used Mega’s business strategy module for most of the architecture analysis, which helps them identify business processes that are ripe for automation, then move to a BPMS for process modeling and automation. In that way, they can do just the process modeling that provides them with some architectural change that they know will provide value, rather than attempting to boil the ocean by modeling all processes in the organization.

PCM Requirements Linking Capability Taxonomy and Process Hierarchy at BPMCM15

I’m in Washington DC for a couple of days at the BPM and Case Management Summit; I missed this last year because I was at the IRM BPM conference in London, and in fact I was home from IRM less than 36 hours this weekend before I got back on a plane to head down to DC this morning.

I’m in a breakout session to hear John Matthias from the Court Consulting Services of the National Center for State Courts, who focuses on developing requirements for court case management systems. As might be expected, the usual method for courts to acquire their case management systems is to just pick the commercial off-the-shelf (COTS) software from the leading packaged solution vendor, then customize it to suit. Except that, in general, the leading vendor’s software doesn’t meet the current needs of courts’ case workers and court clerks, and Matthias is trying to rework the best practices to create definitive links and traceability between requirements, processes and the business capability taxonomy.

As he noted, justice case management is a prime example of production case management (PCM), wherein there are well-worn paths and complicated scenarios; multiple agents are involved (court and clerks, prosecution, public defender) and the specific order of activities is not always pre-defined, so the key role of the PCM system is to track and respond to state changes in the system of record. There are, however, some points at which there are very specific rules of procedure and deadlines, including actions that need to be taken in the event of missed deadlines. The problem comes with the inflexibility of the existing COTS justice case management software available in the market: regardless of how much study and customization is done at the time of original installation (or, perhaps, in spite of it), the needs change over time and there is no way for the courts to make adjustments to how the customized COTS package behaves.

To address the issue of requirements, Matthias has developed a taxonomy of business capabilities: a tree structure that breaks each business capability down into increasingly specialized capabilities that can be mapped to the capability’s constituent requirements. He’s also looked at a process hierarchy, where process stages break down into process groups, and then into elementary processes. This process hierarchy is necessary for organizing the processes, particularly when it comes to reusability across various case types. Process groups show hand-offs between workers on a case, while the elementary processes are the low-level workflows that may be able to be fully automated, or at least represent atomic tasks performed by workers. The elementary processes are explicitly designed to be reusable, so that a process such as “Issue Warrant” can be related to a variety of business capabilities. Managing the relationships between requirements gets complex fast, and they’re looking at requirements management software that allows them to establish relationships between business capabilities, business rules, processes, system requirements and more, then understand traceability when there is a change to one component.
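
As an illustration of why that traceability matters, here’s a minimal sketch (my own, not from Matthias’s session) of capability-to-requirement-to-process links with a simple change-impact query; apart from the “Issue Warrant” example above, all of the names are invented:

```python
# Illustrative sketch only: hypothetical traceability links between
# business capabilities, requirements and reusable elementary processes.
capability_to_requirements = {
    "Warrant Management": ["REQ-101", "REQ-102"],
    "Case Initiation":    ["REQ-102", "REQ-201"],
}
requirement_to_processes = {
    "REQ-101": ["Issue Warrant"],
    "REQ-102": ["Issue Warrant", "Record Disposition"],
    "REQ-201": ["Open Case"],
}

def impacted_capabilities(process_name):
    """Trace back from a changed elementary process to the capabilities it supports."""
    reqs = {r for r, procs in requirement_to_processes.items() if process_name in procs}
    return sorted(cap for cap, cap_reqs in capability_to_requirements.items()
                  if reqs & set(cap_reqs))

# A change to the reusable "Issue Warrant" process touches both capabilities:
print(impacted_capabilities("Issue Warrant"))  # ['Case Initiation', 'Warrant Management']
```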

Unlike systems with completely pre-defined processes, the requirements for PCM systems need to have the right degree of granularity (not so much that it overconstrains the workers, nor so little that it provides insufficient guidance), have performance measurement built in, and link to systems of record to provide state awareness and enable process automation. The goal is to achieve some amount of process discipline and standardization while still allowing variations in how the case managers operate: provide guidance, but allow for flexible selection of actions. Besides that ability to provide guidance without overconstraining, developing requirements for a PCM isn’t that much different from other enterprise systems: consider the future state, build to change, and understand the limits of the system’s configurability. I would also argue that requirements for any user-facing system shouldn’t be developed using a waterfall methodology, where complete detailed requirements are necessary before implementation, but rather with a more Agile approach where you collect the high-level requirements, then just enough detailed requirements to get you to your first implementation in an iterative development cycle. At which time all of the requirements will change anyway.

Top 10 Trends of Digital Enterprise with @setrag at PegaWorld 2015

I finished my visit to PegaWorld 2015 in the breakout session by Setrag Khoshafian, Pega’s chief BPM evangelist, on the top 10 trends for the adaptive digital enterprise:

  1. Context matters. Analyze and understand the information about your customers and your interactions with them.
  2. Connecting customers to operations. Think of your customers’ actions online as part of your business process, and implement your processes accordingly.
  3. The rise of things. Specifically, the process of things: devices become actors in sense-and-respond processes, and channels for customer engagement, across the seven levels of the IoT World Forum reference model.
  4. Design time blurs with runtime. Model-driven development, and moving from explicit requirements collection to having the business make changes on the fly.
  5. The digital executive. Making it so simple that even a C-level executive can understand it. Or better yet, finding executives that actually have a digital clue.
  6. Two-speed IT. Setrag talked about mapping business KPIs and value streams directly onto case phases; being able to do this using Agile model-driven development while still doing more traditional maintenance on underlying legacy systems allows for fast-tracking strategic projects.
  7. The database is dead, long live the database. Moving from transactional SQL-based RDBMS to the new crop of NoSQL databases optimized for big data, analytics and more dynamic applications.
  8. Smart enterprises avoid the app-ocalypse. Allow for native apps as well as responsive web for the most flexibility.
  9. The hybrid cloud. Well, that goes back to what we can really call “cloud”.
  10. The rise of design. Users expect beautiful design; if you don’t live up to those expectations, you degrade the customer experience and therefore your relationship with them.

He was joined by Adam Field from Pega’s technology innovation area, who showed some examples of what Pega is doing in mobile user experience and device management.

Fast-paced and fun view of what’s driving the digital enterprise, and a bit of how Pega is helping its customers meet those needs.

That’s it for PegaWorld 2015: it was a quick but information-filled trip. I’ll be at the IRM BPM conference in London next week, then the BPM and Case Management Summit in DC the week following; watch for my coverage from those events.

The Personology of @RBSGroup at PegaWorld 2015

Andrew McMullan, director of analytics and decisioning (aka “personologist”) at Royal Bank of Scotland, gave a presentation on how they are building a central (Pega-based) decisioning capability to improve customer engagement and change their culture along the way. He started with a personal anecdote about how RBS did the right thing for a family member and gained a customer for life – a theme echoed from this morning’s keynote that also included RBS. He showed a short video of their current vision, which stated goals of making RBS easier to do business with, and to work for, in addition to being more efficient. In that order, in case you other banks are following along.

RBS is now government owned, having been bailed out during the financial crisis; I’m not sure how much this has allowed them to focus on customer engagement rather than short-term profits, but they do seem to be talking the right talk.

RBS uses Pega’s Chordiant – primarily the decision management components, if I am reading it correctly – although they are implementing Pega 7 for an August 2015 rollout to bring in more robust Next Best Action capabilities; they also use SAS Visual Analytics for reporting. This highlights the huge role of decisioning, as well as process, in customer engagement, especially when you’re applying analytics to a broad variety of customer information in order to determine how to interact with the customer (online or IRL) at any particular moment. RBS is proactive about having their customers do things that will save them money, such as renewing a mortgage at a lower rate, or choosing a package of banking services that doesn’t overlap with other services that they are paying for elsewhere. Contrary to what nay-sayers within RBS predicted about lost revenue, this tends to make customers more loyal and ultimately do more business with them.

There was a good question from the audience about how much of this was changes to organizational culture, and how much was the data science: McMullan said that it’s really critical to win the hearts and minds of the employees, although obviously you need to have at least the beginnings of the analytics and recommendations to get that started. Also, they use Net Promoter Score as their main internal metric, which tends to reward relationship-building over short-term profits; having the right incentives for employees goes a long way towards helping them to do the right thing.

TD Bank at PegaWorld 2015

I attended a breakout presented by TD Bank (there was also a TCS presenter, since they’ve done the implementation) on their workflow system for customer maintenance requests – it’s a bit of a signal about the customer, and possibly the SI, that this is called “workflow” – and how they have implemented this using Pega. They started with PRPC 6.3 and there was no indication that they’ve upgraded to Pega 7, which would give them a whole raft of new functionality.

Customer maintenance requests include any non-financial transaction that a customer may request, such as an address change, which may be fulfilled manually, semi-automatically, or automatically based on Pega business rules. They’re measuring the ROI primarily in terms of improving efficiency (increased throughput, reduced processing time, reduced paper) and improving quality and regulatory compliance (reconciliation of work received and processed, data capture validation, identification of trends, better reporting to compliance). He did mention the improved customer experience, although mostly in terms of the call center/branch staff rather than the actual customer, and turned that back to branch efficiency and productivity. There was a mention that this would result in lower wait times for customers while they were in the branch making the request, but this is far out of touch with the realities of customer experience these days, as evidenced by the keynote that we saw this morning with AIG and RBS. This part was presented (I think) by a technical presenter from TCS, and it was depressing to see the lack of awareness of how far they are from understanding the customer journey. This is one of the dangers in treating internal stakeholders as the customer rather than having an awareness of the actual customer and their requirements: the internal operations customer is mostly motivated by improving efficiency and compliance, not by making sure that their real customer isn’t walking out the door to a bank that pays attention to their needs. We can’t throw away the concepts of efficiency and compliance, but I find in dealing with my banks (yes, more than one, because none of them gives me everything that I need) that there are still too many processes that require my presence in a branch, a physical signed document or a call to a call center, when they have already authenticated me in so many ways online.

They talked about their development process and some of the best practices and lessons learned: allowing time for visual screen mockups during inception in order to reduce rework later (they seriously didn’t know that?), participation from other groups such as application integration (?!), and including a Pega deployment architect to make sure that things get into production the right way. TD Bank has been using Pega for about eight years, and they seem to be rooted in older versions and older development methodologies. Definitely in need of some digital transformation.

I didn’t attend this session with the goal of poking fun at TD or TCS, but this is really an example of old-school (probably waterfall) development methods that are not going to give them big wins in the long run. It’s clear that there is very deep integration with their other systems, and a lot of use of the Pega CPM framework and rules, but also that there has been a lot of custom work here: PRPC used as an application development tool. This is pretty typical of what I have seen with Pega customers in the past, although their recent shift to providing applications rather than frameworks is an obvious strategy to move to less-customized solutions that can be deployed faster. For the customers still plugging away on 6.x, that might be more of a dream than reality.

PegaWorld 2015 Day 2 Customer Keynotes: Big Data and Analytics at AIG and RBS

After the futurist view of Brian Solis, we had a bit more down-to-earth views from two Pega customers, starting with Bob Noddin from AIG Japan on how to turn the information that they have about customers into an opportunity to do something unexpected and good. Insurance companies have the potential to help their customers reduce risk, and therefore insurance claims: they have a lot of information about general trends in risk reduction (e.g., telling an older customer that if they have a dog and walk it regularly, they will stay healthier and live longer) as well as customer-specific actions (e.g., suggesting a different route for someone to drive to work in order to reduce the likelihood of an accident, based on where they live and work, and the accident rates for the roads in between). This is not a zero-sum game: fewer claims is good for both AIG and the customers. Noddin was obviously paying close attention to Solis, since he wove elements of that talk into his presentation on how they are engaging customers in the way that the customer chooses, and have reworked their customer experience – and their employee and agent experience – with that in mind.

Between the two customers, we heard from Rob Walker, VP of Decision Management and Analytics at Pega, about the always-on customer brain, and strategies for engaging with customers:

  • Know your customer: collect and analyze their data, then put it in the context of their entire customer journey
  • Reach your customer: break down the silos between different channels, and also between inbound and outbound communications, to form a single coherent conversation
  • Delight your customer: target their needs and wants based on what you know about them, using the channels through which you know that they can be reached.

He discussed how to use Pega solutions to achieve this through data, analytics and decisioning; obviously, the principles are universal.

The second customer on stage was Christian Nelissen from Royal Bank of Scotland, who I also saw yesterday (but didn’t blog about) on the big data panel. RBS has a good culture of knowing their customers, from their roots as a smaller, more localized bank: instead of the branch manager knowing every customer personally, however, they now rely on data about customers to create 1:1 personalized experiences based on predictive and adaptive analytics in the ever-changing context of the customer. He talked about the three pillars of their approach:

  • It’s about the conversation. If you focus on doing the right thing for the customer, not always explicitly selling to them, you build the relationship for the long term.
  • One customer, one bank. A customer may have products in different bank divisions, such as retail banking, credit cards and small business banking, and you need to be cognizant of their complete relationship with the bank and avoid internal turf wars.
  • You can do a lot with a little. Data collection and analytics technologies have become increasingly cheaper, allowing you to start small and learn a lot before expanding your customer analytics program.

Alan Trefler closed out the keynote before sending us off to the rest of the day of breakout sessions. Next year, PegaWorld is in Las Vegas; not my favorite place, but I’ll be back for the quality of the presentations and interactions here.

These two keynotes this morning have been great to listen to, and also closely aligned with the future of work workshop that I’m doing at IRM BPM in London next week, as well as the session on changing incentives for knowledge workers. Always good when the planets align.

PegaWorld 2015 Keynote with @BrianSolis: Innovate or Die!

Brian Solis from Altimeter Group was the starting keynote, talking about disruptive technology and how businesses can undergo digital transformation. One of the issues with companies and change is that executives don’t live the way the rest of us do, and have to think of the shareholders first, but may not have sufficient insight into how changing customer attitudes and the supporting technology will impact their profitability, or even their ability to survive. “A Kodak moment” is now about how you go bankrupt when you ignore disruptive technology: not something that you want to capture for posterity.

Digital Darwinism

Customer experience can just happen by accident, or it can be something that we design in order to achieve a “higher purpose” of being customer centric. That doesn’t mean that we have complete control over that customer experience any more, since our brands are made up of what we put out there, and what other people say about us. Customer experience is not about what we say, but about what we do, since that’s what will be examined under the social media microscope. Altimeter’s research shows that almost all companies are undergoing their digital transformation specifically because of customer experience, but that few of them really understand what the problem is. 67% of a buyer’s customer journey is now done online, with buyers consulting 11 different sources for information even if they purchase IRL, and your online customer experience is the difference between surviving or not. Part of this is omni-channel presence, since almost none of those pre-buying search journeys happen on a single device. You can’t force customers to do business your way: you have to do it their way. And in order to do it their way, you have to understand what that is (that sounds kind of obvious, but many companies don’t get that). You have to think through the eyes of your customers: as Solis said, “Think like a customer. Act like a startup.”

Innovate or Die

Solis’ message, in short: if you don’t disrupt yourself, someone else will do it for you. Innovate or die.