BPMN, CMMN and DMN with @denisgagne at BPMCM15

Last session of day 1 of the BPM and Case Management Summit 2015 in DC, and Denis Gagne of Trisotech is up to talk about the three big standards: Business Process Model and Notation (BPMN), Case Management Model and Notation (CMMN), and Decision Model and Notation (DMN). BPMN has been around for a few years and is well-established — pretty much every business process modeling and automation vendor uses BPMN in some form in their process modelers, and it is OMG’s most-adopted standard — but CMMN and DMN are much newer and less widespread in the market. There are a few vendors offering CMMN modelers and even fewer offering DMN. There are two major benefits to standards such as BPMN, CMMN and DMN, in addition to the obvious benefit of providing an unambiguous format for modeling processes, cases and decisions: they can be used to create models that can be interchanged between different vendors’ products; and they provide a common and readily-transferable “language” learned by analysts. This interchangeability, both of models and skills, means that organizations don’t need to be quite so worried about which modeling tool they use, or the people that they hire to use it. Denis was at the OMG Model Interchange Working Group (MIWG) meeting in Berlin last week, where they showed all types of interchange for BPMN; with luck, we’ll be seeing the same activities for the other standards as they become widely adopted.

There are some grey areas about when to use BPMN versus CMMN, since both are (sort of) process-based. However, the main focus in BPMN is on activities within processes, whereas CMMN focuses on events that impact cases. He showed a chart comparing different facets of the three standards:

BPMN         | CMMN                         | DMN
Processes    | Cases                        | Decisions
Activities   | Events                       | Rules
Transitional | Contextual                   | Applied
Data         | Information                  | Knowledge
Procedural   | Declarative                  | Functional
Token        | Event Condition Action (ECA) | First Order Logic (FOL)

The interesting part (at least to me) comes when we look at the bridges between these standards: in BPMN, there is a business rule task that can call a decision in DMN; in CMMN, there is a process task that can call a process defined in BPMN. Trisotech’s version of all of these modelers (not yet in the standards, but count on Denis to get them in there) also provides for a case task type in BPMN that can call a CMMN case, and a decision task in CMMN that can call a DMN decision. There are some patterns to watch for when modeling that might indicate that you should be using another model type:

  • In BPMN, if you have a lot of gateways expressing business logic, then consider moving the gateway logic to DMN
  • In BPMN, if you have a lot of events, especially boundary events, then consider encapsulating that portion into a CMMN case
  • In BPMN, if you have a lot of ad hoc subprocesses, then consider using CMMN to allow for greater specification of the ad hoc activities
  • In CMMN, if you have a lot of task interdependencies, consider using BPMN to replace the temporal dependencies with flow diagrams
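The first refactoring above — moving gateway logic into a decision model — can be sketched in a few lines. This is a hypothetical, engine-neutral illustration of a DMN-style decision table with a first-hit policy, not any vendor’s API; the rule conditions and routing outputs are invented for the example:

```python
# Hypothetical sketch: gateway logic expressed as a DMN-style decision table.
# Each rule is (condition, output); the first matching rule wins,
# mirroring DMN's "first" hit policy.

def decide(rules, default, **facts):
    """Evaluate a decision table against a dict of input facts."""
    for condition, output in rules:
        if condition(facts):
            return output
    return default

# Instead of nesting XOR gateways in the BPMN diagram, the routing logic
# lives in one table that a business rule task can call.
claim_routing = [
    (lambda f: f["amount"] > 10000, "senior-adjuster"),
    (lambda f: f["amount"] > 1000 and f["risk"] == "high", "adjuster"),
    (lambda f: f["risk"] == "low", "auto-approve"),
]

route = decide(claim_routing, default="manual-review", amount=500, risk="low")
```

The payoff is that changing the routing thresholds becomes a decision-table edit rather than a diagram change, which is exactly the separation of concerns that the BPMN-to-DMN bridge is meant to provide.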

The recognition and refactoring of these patterns are pretty critical for using the right model type, and are likely a place where a more trained technical eye can suggest improvements to models created by a less-technical analyst who isn’t familiar with all of the model types, or with how to think about this sort of decomposition and linking.

He demonstrated integration between the three model types using the Trisotech BPMN, CMMN and DMN modelers, where a decision task in the BPMN modeler can link directly to a decision within a model in the DMN modeler, and a case task in BPMN can link directly to a case model in the CMMN modeler. Nice integration, although it remains to be seen what analyst skill level is required to model across all three types, or how to coordinate different analysts who each model in only one of the three types, leaving the loosely-coupled models with different authors.

Disclosure: I’m doing some internal work with Trisotech, which means that I have quite a bit of knowledge about their products, although I have not been compensated in any way for writing about them here on my blog.

PCM Requirements Linking Capability Taxonomy and Process Hierarchy at BPMCM15

I’m in Washington DC for a couple of days at the BPM and Case Management Summit; I missed this last year because I was at the IRM BPM conference in London, and in fact I was home from IRM less than 36 hours this weekend before I got back on a plane to head down to DC this morning.

I’m in a breakout session to hear John Matthias from the Court Consulting Services of the National Center for State Courts, who focuses on developing requirements for court case management systems. As might be expected, the usual method for courts to acquire their case management systems is to pick the commercial off-the-shelf (COTS) software from the leading packaged solution vendor, then customize it to suit. In general, however, the leading vendor’s software doesn’t meet the current needs of courts’ case workers and court clerks, and Matthias is trying to rework the best practices to create definitive links and traceability between requirements, processes and the business capability taxonomy.

As he noted, justice case management is a prime example of production case management (PCM), wherein there are well-worn paths and complicated scenarios; multiple agents are involved (court and clerks, prosecution, public defender) and the specific order of activities is not always pre-defined, so the key role of the PCM system is to track and respond to state changes in the system of record. There are, however, some points at which there are very specific rules of procedure and deadlines, including actions that need to be taken in the event of missed deadlines. The problem comes with the inflexibility of the existing COTS justice case management software available in the market: regardless of how much study and customization is done at the time of original installation (or, perhaps, in spite of it), the needs change over time and there is no way for the courts to make adjustments to how the customized COTS package behaves.

To address the issue of requirements, Matthias has developed a taxonomy of business capabilities: a tree structure that breaks each business capability down into increasingly specialized capabilities that can be mapped to the capability’s constituent requirements. He’s also looked at a process hierarchy, where process stages break down into process groups, and then into elementary processes. This process hierarchy is necessary for organization of the processes, particularly when it comes to reusability across various case types. Process groups show hand-offs between workers on a case, while the elementary processes are the low-level workflows that may be fully automatable, or at least represent atomic tasks performed by workers. The elementary processes are definitely designed to be reusable, so that a process such as “Issue Warrant” can be related to a variety of business capabilities. Managing the relationships between requirements gets complex fast, and they’re looking at requirements management software that allows them to establish relationships between business capabilities, business rules, processes, system requirements and more, then understand traceability when there is a change to one component.
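The two hierarchies and the traceability query between them can be sketched very simply. All of the capability and process names below are invented for illustration (except “Issue Warrant”, which is Matthias’s own example of a reusable elementary process); a real requirements management tool would obviously hold much richer relationships:

```python
# Hypothetical sketch: a capability taxonomy mapped to reusable elementary
# processes, so a change to one process can be traced back to every
# business capability that depends on it.

capability_tree = {
    "Case Initiation": {"File Case": ["Receive Filing", "Docket Case"]},
    "Hearings": {"Schedule Hearing": ["Docket Case", "Notify Parties"]},
    "Enforcement": {"Enforce Order": ["Issue Warrant", "Notify Parties"]},
    "Pretrial": {"Set Conditions": ["Issue Warrant"]},
}

def capabilities_using(process):
    """Traceability query: which capabilities depend on an elementary process?"""
    return sorted(
        cap
        for cap, subcaps in capability_tree.items()
        for procs in subcaps.values()
        if process in procs
    )

# "Issue Warrant" is reusable across case types, so a change to it
# impacts more than one business capability:
affected = capabilities_using("Issue Warrant")
```

This is the essence of the impact analysis he described: when a rule of procedure changes, you walk the relationships backward to find everything touched by the change.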

Unlike systems with completely pre-defined processes, the requirements for PCM systems need to have the right degree of granularity (not too much, which would overconstrain the workers, and not too little, which would provide insufficient guidance), have performance measurement built in, and link to systems of record to provide state awareness and enable process automation. The goal is to achieve some amount of process discipline and standardization while still allowing variations in how the case managers operate: provide guidance, but allow for flexible selection of actions. Besides that ability to provide guidance without overconstraining, developing requirements for a PCM system isn’t that much different from other enterprise systems: consider the future state, build to change, and understand the limits of the system’s configurability. I would also argue that requirements for any user-facing system shouldn’t be done using a waterfall methodology where complete detailed requirements are necessary before implementation, but rather a more Agile approach where we collect the high-level requirements, then just enough detailed requirements to get to a first implementation in an iterative development cycle. At which time all of the requirements will change anyway.

Pega 7 Express at PegaWORLD 2015

Adam Kenney and Dennis Grady of Pega gave us the first look at Pega 7 Express: a new tool for building apps on top of the Pega infrastructure to allow Pega to push into the low-code end of the BPM/ACM market. In part, this is likely driven by the somewhat high degree of technical skill that has traditionally been required to create applications using Pega, but also by the fact that customer experience is becoming a key differentiator, creating pressure to build good customer-facing applications faster. Customer experience, of course, is much more than just the type of apps that you’re going to create using Pega 7 Express: it’s new devices and methods of interaction, but all of these are setting the bar high and changing customer expectations for how they should be able to deal with vendors of goods and services. Pega 7 Express is part of the Pega 7 platform, using the same underlying infrastructure: it’s (just) a simpler authoring experience that requires little or no advance training.

We saw an introductory video, then a live demo. It includes graphical data modeling, form building and case configuration, all with multi-device support.

Call it end-user computing (EUC), citizen computing or low-code model-driven development: Express is addressing the problem area of applications that were traditionally built using email, spreadsheets and local desktop databases (I’m looking at you, Excel and Access). I’m not going to enumerate the problems with building apps like these; suffice it to say that Express allows you to leverage your existing Pega infrastructure while allowing non-Java developers to build applications. They even include some badges for gamifying achievements – when you build your first app, or personalize your dashboard. Just-in-time learning is integrated so that you can see an instructional video or read the help as you need it, plus in-context guidance while you’re working.

In the demo, we created a new case-based app by specifying the following:

  • Application name, description and logo
  • Case type
  • Major phases (a straight-through process view)
  • Steps in each phase
  • Form design for specific steps – the data model is created behind the scenes from the form fields
  • Routing/assignment to reviewers and approvers
  • Milestones and deadlines
  • Device support

In part, this looks a lot like their Directly Capture Objectives (DCO) tools, but with more tooling to create an actual executable app rather than just input to the more technical Designer Studio development environment. We also saw dashboard customization, which was a pretty standard portal configuration.

As with any good Pega demo, however, Kenney went off-screen to “create a little data table” while Grady showed us the graphical form/case builder; they are definitely the masters of “pay no attention to the man behind the curtain” during demos, where one person does the user-friendly stuff on-screen, while a couple of others do a bit of heavy lifting in the background. Lucky for us (unluckily for Kenney), he couldn’t connect to the wifi, so we did get to see the data table definition, which was straightforward.

This does look like a pretty usable low-code application development environment. Like any other low-code model-driven development tool, however, it’s not really for complete non-techies: you need to understand data types, how to design a form, the concept of linking case types and separately-defined data types, and how to decompose a case into phases and steps. It wasn’t clear from the brief demo how this would interact with any sort of expected case automation or other parts of the underlying Pega infrastructure: predictions, automated steps/service calls, more complex process flow or temporal dependencies, access control, etc. It’s also unclear whether there is a migration path from Express to the full Designer Studio, which would allow Express to be used as an operational prototyping tool for more complex development. They did respond to a question about reporting: there is some out of the box, and they will be adding more, as well as ad hoc reporting.

Pega 7 Express was announced today, with the cloud version available immediately; there’s a 30-day free trial followed by subscription pricing, and when Pega 7.19 rolls out to on-premise installations, it will also include Express. They’re not really pushing it yet, but will start to roll out the marketing around it in Q3.

PegaWORLD 2015 Keynote: CRM Evolved and Pega 7 Express

Orlando in June? Check. Overloaded wifi? Check. Loud live band at 8am? Check. I must be at PegaWORLD 2015!

Alan Trefler kicked off the first day (after the band) by looking at the new world of customer engagement, and how both organizations and the supporting technology need to change to support this. He took direct aim at the silos of old-school companies such as traditional financial services (“What *is* a middle office, anyway?”, a question that I’ve often asked), and how many applications and platforms fail to move them beyond that model: conforming (to how an application works out of the box) versus strategic (mix your own DNA into the software). Like many other vendors in this space who are repositioning as process-centric application development platforms, the term BPM (business process management) didn’t come up; Pega is repositioning as “CRM Evolved”. To be fair, Pega has always had a strong CRM (customer relationship management) bias, but it looks like they’re rebranding the entire business of their customers as CRM, from sales and onboarding through support and back into operations. This includes anticipating and operationalizing customer actions, so that you can respond to a potential problem before it ever occurs, and moving from conforming to strategic software in order to allow you to evolve quickly to meet those needs. He warned against implementing the Frankenstack, pieced together from “dead software products”, and decried the term BPM in favor of case management as how customer engagement and operations need to work, although arguably there is a lot of what we think of as traditional BPM implemented as part of Pega’s customers’ solutions.

We’re definitely seeing the BPM market (broadly defined to include dynamic and ad hoc process management including case management) bifurcating into the application development platforms such as Pega, and the more out-of-the-box, low-code process platforms. BPM is really about much more than just process management, of course: many of these platforms include mobile, social, IoT, analytics, big data and all of the other popular features that are being built into almost all enterprise applications. Trefler talked about Pega 7 Express – I’ll be going to a session on that after the keynote – which is a simpler user experience for application development. Having seen their more complex user experience in a few client projects, this is definitely needed to cut through the complexity in order to address the end-user computing/citizen computing needs. In other words, although they are primarily in the heavy-duty application development space, they also realize that they can’t ignore the “low end” of the market if they want to achieve greater awareness and penetration in their customer environments beyond the IT development group.

Trefler also talked about Pega’s vertical industry applications, and we heard from Dr. Mark Boxer from Cigna Healthcare. He discussed how they use Pega’s Smart Claims App, although we mostly saw a lot of futuristic videos of what healthcare could be like, including big data and gamification. Plus Apollo 13. It’s not clear how much of this Cigna has implemented (presumably they are not working on the moon shot), although I know that some US healthcare companies are reducing premiums for customers who use wearables to monitor their health, since it allows for early problem detection.

Don Schuerman, Pega’s CTO and VP of Product Marketing, took the stage to talk about their technology, with a big focus on strategic applications rather than the platform itself – Trefler did make a comment earlier about how their marketing used to be really bad, and I think that someone told them that applications show better than platforms – plus their cloud infrastructure. He was joined by Jim Smith, CIO of the State of Maine, who was not afraid to talk about BPM: he sees BPM plus agile plus legacy system modernization as the cornerstones of their enterprise strategy, underpinned by a cloud platform for speed and security. He showed some pictures of their filing cabinets, pending files in paper folders and other paper-based inefficiencies; it’s interesting to see that there is still so much of their digital transformation – and that of many other organizations that I work with – that is relying on getting paper into digital form, either natively (i.e., online forms replacing paper ones) or through image and data capture.

Brian Matsubara, head of Global Technology Alliances at Amazon, talked briefly about their Amazon Web Services offerings, and their partnership with Pega to create the Pega Cloud on which Pega 7 Express and other products are domiciled. I don’t need to be sold on cloud in general or AWS in particular since I trust critical business data to AWS, but there are still a lot of skittish organizations who think that their own data centers are better, faster, cheaper and more secure than AWS. (Hint: they’re not.) I just finished up the materials for a workshop that I’m giving in London next week on the Future of Work, and I agree with what Matsubara said about (public) cloud: it’s not just cheaper infrastructure, it provides ways of doing business that just weren’t possible before, especially consumer mobile and external collaboration applications. Schuerman stressed at the end that they need to help their customers make cloud strategic.

The keynote finished with Kerim Akgonul, SVP of Products, who discussed changing customer attitudes: customers now expect more, and will quickly make their displeasure public when the experience is less than awesome. He talked about their suite of applications – Marketing, Sales Automation, Customer Service, and Operations – and how decision-based Next Best Action predictions and recommendations are an underlying feature that drives all of them. The Pega Marketing application brings tools to help improve customer engagement, including next best action and 1:1 targeted marketing. Their Sales Automation application offers guided selling through the end-to-end sales process. Their Customer Service application uses case management paradigms and next best actions for guided customer conversations, while interacting with social media and other channels. Akgonul is always willing to participate in the on-stage hijinks: last year, it was a wild motorcycle ride, and this year it’s a wellness app on an Apple Watch and iPhone that tied in with a customer service agent’s screen, with some assistance from his colleagues David Wells and Don Schuerman. Fun, and it drove home the point about how these technologies can be used to improve customer engagement: mobile, omni-channel, next best action, gamification and more. He wrapped up with a more serious, if somewhat breathless, look at some of the newer features, including offline mobile apps that can synchronize data later, pattern detection in real-time streaming data such as dropped calls, dashboard personalization, and the new Pega 7 Express lightweight application builder.

bpmNEXT 2015 Day 3 Demos: Camunda, Fujitsu and Best In Show

Last demo block of the conference, and we’re focused on case management and unstructured processes.

Camunda, CMMN and BPMN Combined

Jakob Freund presented on OMG’s (relatively) new standard for case management modeling, CMMN, and how Camunda combines it with BPMN to create processes that have a combination of pre-defined flows and case structures. They use the Trisotech CMMN modeler embedded in their environment, running both the CMMN and BPMN on the same engine; they are looking at adding DMN for decision modeling as well. He demonstrated an insurance application example where BPMN is used to model the overall process, with the underwriting subprocess actually being a CMMN model within a BPMN model. The user task list can show a consolidated view of both BPMN tasks and CMMN tasks, or a dedicated UI can be used for a case since it can also show enabled activities that are not yet instantiated (hence would not appear in a task list) as available user actions. BPMN processes can also be triggered from the CMMN model, providing pre-defined process fragments that can be triggered by the case worker to perform standard operations. He also showed their developer workbench, including a full-featured debugger that includes stepwise execution and the ability to execute code at any step. Since their paradigm is to provide process management services to a developer writing in Java, their tooling is more technical than what is found in a no-code or low-code environment. Also, a BPMN font.

Fujitsu: Using Agents to Coordinate Patient Care across Independent Specialists

Keith Swenson finished the demos by presenting healthcare research from the University of Granada, which helps to create patient treatment plans based on rules and iterative goal-seeking rather than pre-defined processes. This allows different medical specialists to have their own sets of rules and best practices for dealing with their own specialization; automated agents can combine and negotiate the rules from multiple specialists to create a consolidated treatment plan for patients with multiple conditions, allowing each of the participants to monitor progress. He demonstrated a prototype/sample application that allows each specialist to set out a schedule of actions that make up a treatment plan; the multiple treatment plans are reconciled against each other — basically, modifying a plan by adding steps from another plan — and presented back to the referring physician, who can then select one of the plan processes for execution. He used the IActive Knowledge Studio to show how the plans and rules are designed, and discussed how the processes for the interacting agents would be emergent as they communicate and negotiate.

That’s it for bpmNEXT for me. Great conference, as always. As a matter of disclosure, I was not charged the conference fee to attend, although I paid my own travel and living expenses. A number of the vendors that I have written about here over the past three days are my clients or have been so in the past, but that did not allow them to escape the snarky Twitter comments.

Update: waiting to take off at Santa Barbara airport, and I see from the Twitter stream that SAP won the Best In Show award for their Internet of Everything demo – congratulations! Top five presentations: W4, Camunda, Trisotech, Bonitasoft and BP-3. Kudos all around. 

bpmNEXT 2015 Day 3 Demos: IBM (again), Safira, Cryo

It’s the last (half) day of bpmNEXT 2015, and we have five presentations this morning followed by the Best in Show award. Unfortunately, I have to leave at lunchtime to catch a flight, so you will have to check the Twitter hashtag to see who won — or maybe I’ll do a wrapup post from the road.

IBM: BPM, say Hello to Watson. A New Era of Cognitive Work – Here Today

First up was Chris Vavra discussing how Watson and cognitive computing and natural language analysis capabilities can be used in the context of BPM, acting as an expert advisor to knowledge workers to enhance, scale and accelerate their work with its (or as Chris said, “his”) reasoning capabilities. There are a number of Watson services offered on their Bluemix cloud development platform; he demonstrated an example of an HR hiring process where the HR person uses Watson to analyze a candidate’s personality traits as part of the evaluation process. This is based on a written personal statement provided by the candidate; Watson analyzes that text (or could link through to a personal website or blog) to provide a personality analysis. From the Bluemix developer dashboard, you can create applications that include any of the services, including Watson Personality Insights that provides ranking on several factors in the five basic personality traits of Openness, Conscientiousness, Extraversion, Agreeableness and Emotional Range, with a graphical representation to highlight values and needs that may be of concern in the hiring process. It’s unlikely that a hiring manager would use solely this information to make a decision, but it’s interesting for exploring a candidate’s personality characteristics as part of the process. There are a number of other Watson-based services available on Bluemix to bind into BPM (and other) applications; in the IBM cloud BPM designer, this just appears as a service connector that can be configured with the Watson authentication information, and invoked at a service step in a process flow. Lots of other potential applications for bringing this level of expert recommendations into processes, such as healthcare condition diagnoses or drug interactions.
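To make the “highlight values and needs that may be of concern” idea concrete, here is a purely illustrative sketch of how a hiring dashboard might flag Big Five trait percentiles against thresholds. The thresholds and candidate scores are invented, and this is not the Watson Personality Insights API, which returns a much richer hierarchical profile:

```python
# Illustrative sketch only: flag Big Five trait percentiles that fall
# outside a (hypothetical) comfortable band, the way a hiring dashboard
# might surface traits worth discussing.

TRAITS = ["Openness", "Conscientiousness", "Extraversion",
          "Agreeableness", "Emotional Range"]

def flag_traits(scores, low=0.25, high=0.9):
    """Return traits whose percentile is below `low` or above `high`."""
    return {t: ("low" if scores[t] < low else "high")
            for t in TRAITS
            if scores[t] < low or scores[t] > high}

candidate = {"Openness": 0.82, "Conscientiousness": 0.15,
             "Extraversion": 0.95, "Agreeableness": 0.6,
             "Emotional Range": 0.4}

flags = flag_traits(candidate)
# {'Conscientiousness': 'low', 'Extraversion': 'high'}
```

As noted above, no sensible hiring manager would decide on this alone; the value is in prompting a conversation, not replacing one.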

Safira: Managing Unstructured Processes with AdHoc BPM Framework

Filipe Pinho Pereira addressed the issue of the long tail of organizations’ processes, where only the high-volume, high-value structured processes are being implemented as full BPM projects by IT, and the long tail of less critical and ad hoc processes that end up being handled manually. Using IBM BPM, he demonstrated their Ad-Hoc BPM Framework add-on that allows a business user to create a new ad-hoc process based on a predefined request-intervention process pattern, which has only an initial data capture/launch step, then a single “do it” human step with a loop that keeps returning to the same step until explicitly completed. The example was an expense report process, where a blank expense spreadsheet was attached, a form created to capture basic data, and SLAs specified. Routing is created by specifying the primary recipient, and notifications that will be issued on start, end and SLA violations. Users can then create an instance of that process (that is, submit their own expense report), which is then routed to the primary recipient; the only routing options at that point are Postpone, Forward and Complete, since it’s in the main human task loop part of the process pattern. This distills ad-hoc processes to their simplest form, where the current recipient of the main task decides on who the next recipient is or whether to complete the task; this is functionally equivalent to an email-based process, but with proper process monitoring and SLA analytics. By looking at the analytics for the process, we saw the number of interventions (the number of times that the human step loop was executed for an instance), and the full history log could be exported to perform mining to detect patterns for process improvement. 
Good example of very simple user-created ad hoc processes based on an industrial-strength infrastructure; you’re not going to buy IBM BPM just to run this, but if you’re already using IBM BPM for your high-volume processes, this add-on allows you to leverage the infrastructure for the long tail of your processes.
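The request-intervention pattern that Safira described is simple enough to sketch as a tiny state machine. This is my own minimal illustration of the pattern (not Safira’s framework or IBM BPM code): one capture step, then a single human step that loops, with only postpone, forward and complete available to the current recipient:

```python
# Minimal sketch of the request-intervention pattern: a single human step
# loops until the current recipient chooses "complete". Every pass through
# the loop is an "intervention", logged for later process mining.

class AdHocProcess:
    def __init__(self, requester, primary_recipient):
        self.current = primary_recipient
        self.history = [("start", requester)]
        self.done = False

    def act(self, action, forward_to=None):
        assert action in ("postpone", "forward", "complete")
        self.history.append((action, self.current))
        if action == "forward":
            self.current = forward_to       # route to the next recipient
        elif action == "complete":
            self.done = True                # exit the human-step loop

# Submitting an expense report, then routing it around:
expense = AdHocProcess(requester="sandy", primary_recipient="manager")
expense.act("postpone")
expense.act("forward", forward_to="finance")
expense.act("complete")
interventions = len(expense.history) - 1   # passes through the human step
```

Functionally this is the email-based process it replaces, but the history log is exactly what enables the SLA analytics and pattern mining described above.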

Cryo: Tools for Flexibility in Collaborative Processes

Rafael Fazzi Bortolini and Leonardo Luzzatto presented on processes that lie somewhere in the middle of the structured-unstructured spectrum, and how to provide flexibility and dynamic aspects within structured constraints through decision support, flexible operations, ad-hoc task execution and live changes to processes. Demonstrating with their Orquestra BPMS, they showed a standard process task interface with the addition of localized analytics based on the history of that task in order to help the user decide on their actions at that point. Flexible routing options allow the user to return the process to an earlier step, or forward the current task to a colleague for consultation before returning it to the original user at the same step; this does not change the underlying process model, but may move the instance between activities in a non-standard fashion or reassign it to users who were not included in the original process definition. They also have an ad-hoc process pattern, but unlike Safira, they are using actual ad-hoc activities in BPMN, that is, tasks that are not connected by flow lines. Users are presented with the available ad hoc tasks in the process model, allowing them to “jump” between the activities in any order. They also demonstrated live changes to production processes; the examples were adding a field to a form and changing the name of a task in the process, both of which are presumably loaded at runtime rather than embedded within the instantiated process to allow these types of changes.
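The ad hoc activity semantics that Orquestra uses — tasks with no sequence flows between them, completable in any order — can be sketched like this. The task names are invented for illustration; in BPMN terms this approximates an ad hoc sub-process that completes when all of its contained tasks are done:

```python
# Hedged sketch of BPMN ad hoc sub-process semantics: no flow lines between
# tasks, so the user may "jump" between them in any order; the sub-process
# is done when no tasks remain.

class AdHocSubprocess:
    def __init__(self, tasks):
        self.pending = set(tasks)
        self.completed = []     # records the order the user actually chose

    def available(self):
        return sorted(self.pending)   # any remaining task can be chosen next

    def complete(self, task):
        self.pending.remove(task)
        self.completed.append(task)

    @property
    def done(self):
        return not self.pending

review = AdHocSubprocess(["check references", "verify address", "score risk"])
review.complete("score risk")          # order is entirely up to the user
review.complete("check references")
review.complete("verify address")
```

The contrast with Safira’s approach is worth noting: there, flexibility comes from a routing loop around one task; here, it comes from a set of modeled tasks with the ordering constraint removed.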

Kofax Claims Agility SPA

Continuing with the breakout sessions at Kofax Transform, I’m in a presentation on the Claims Agility smart process application (SPA) that Kofax is creating for US healthcare claims processing, based on the Kofax TotalAgility (KTA) platform. They have built in a lot of the forms and rules for compliance with US healthcare regulations; I suspect that this means that the SPA would not be of much value for non-US organizations.

Claims Agility is still in development, but we were able to get an early look at a demo. The capture workflow is pretty simple: scan, classify and extract data fields from a form, then pass it on to a claims worker for their activities, presenting both the scanned document and the data fields. This is a pretty standard scanned document workflow application, but has the advantage of having a lot of knowledge of the US healthcare forms, data validation, rules and processes built in so that little setup and system training would be required for the standard forms and workflows. Incomplete or incorrect forms can be held, allowing the validated forms in the same batch to be completed. The final step in the predefined workflow performs the EDI transactions.

They will do updates for some components of the system, such as the CMS codes that drive the validation; they haven’t finalized the hot update capabilities, and it’s not clear that they will be able to do much more than update code tables.

We looked at the customizability of the processes and rules: customers can modify the standard processes using the graphical process designer, including building their own processes. Since the out of the box process is really simple — four steps — there’s no real issue of upgradability of the process at this point, but it’s likely that any processes provided should be considered templates rather than productized frameworks. Configuration for data extraction and validation is provided as part of the core package, but again, the customer can override the defaults. I was going to ask the question about the separation of base product from customizations with respect to product upgrades but a customer in the audience beat me to it; there are separate areas for custom versus core code, as well as versioning, so it appears that they have thought through some of this but it will be interesting to see how this plays out after the product is being used at customer sites through a couple of upgrade cycles.

There will be an initial release in June or July this year, and Kofax is looking for early adopters now; full release will be near the year end. Claims Agility is the fifth SPA that Kofax is offering on the KTA platform, and they’re learning more about how to do these right with each implementation, plus how to integrate the new technologies such as e-signature.

TotalAgility Product Update at KofaxTransform

In a breakout session at Kofax Transform, Dermot McCauley gave us an update on the TotalAgility product vision and strategy. He described five vital communities impacted by their product innovation: information all-stars who ensure that the right information is seen by the right people at the right time, performance improvers focused on operational excellence, customer obsessives who focus on customer satisfaction, visionary leaders who challenge the status quo, and change agents using technology thought-leadership to drive business value. I think that this is a great way to think about product vision, and Dermot stated that he spends his time thinking about how to serve these five communities and help them to achieve their goals.

TotalAgility is positioned to be the link between systems of engagement and systems of record, making that first mile of customer engagement faster, simpler, more efficient, and customer-friendly. It includes four key components: multichannel capture and output, adaptive process management, embedded actionable analytics, and collaboration. Note that some of this represents product vision rather than released product, but this gives you an idea of where they are and what they’re planning.

Multichannel capture and output includes scanning in all forms, plus capture from electronic formats including documents, forms and even social media, with a goal to be able to ingest information in any type and any format. On the processing and output side, their recent acquisitions fill in the gaps with e-signature and signature verification, and outbound correspondence management.

Adaptive process management includes pre-defined routine workflows and ad hoc collaboration, plus goal-based and analytics-driven adaptive processes. These can be automated intelligent processes, or richer context used when presenting tasks to a knowledge worker.

Embedded actionable analytics are focused on the process at hand, driving next-best-action decisions or recommendations, and detecting and predicting patterns within processes.

Collaboration includes identifying suitable and available collaborators, and supporting unanticipated participants.

The goal is to provide a platform for building smart process applications (SPAs), both for Kofax with their Mortgage Agility and other SPAs, and for partners to create their own vertical solutions. McCauley walked through how Kofax AP Agility uses the TotalAgility platform for AP processing with ERP integration, procurement, invoice capture and actionable analytics; then Mortgage Agility that brings in newer capabilities of the platform such as e-signature and customer correspondence management with a focus on customer engagement as well as internal efficiencies.

He walked through deployment options of on-premise (including multi-tenancy on-premise for a BPO or shared service center) and Microsoft Azure public cloud (multi-tenant or own instance), and touched on the integration into and usage of Kapow and e-signatures in the TotalAgility platform. They’re also working on bringing more of the analytics into TotalAgility to allow for predictions, pattern detection, recommendations and other analytics-based processing.

Going forward, they have four main innovation themes:

  • Platform optimization for better performance
  • Portfolio product integrations for a harmonized design time and runtime
  • Pervasive mobility
  • Context-aware analytics

He showed some specific examples that could be developed in the future as part of the core platform, including real-time information extraction during document capture on a mobile device, and process improvement analytics for lightweight process mining; the audience favorite (from a show of hands) was the real-time extraction during mobile capture.

Kicking off KofaxTransform 2015: Day 1 Keynotes

I’m in Vegas for a couple of days for the Kofax Transform conference. Kofax has built their business beyond their original scanning and capture capabilities (although many customers still use them for just that): they have made a play in the past couple of years in the smart process application space to extend that “first mile” capability further into the customer journey, and recently acquired Aia, a customer communications management software company, to help round out their ability to support the entire cycle.

The morning keynote on the first day kicked off with introductions from Howard Dratler, EVP of Field Operations, then on to the CEO, Reynolds Bish, for a company update and vision. He’s been with Kofax since 2007, and led them through getting rid of their hardware business and building a more modern software organization including a number of acquisitions, and a public listing on NASDAQ. A key part of this was repositioning in the smart process application (SPA) area with their TotalAgility platform and the vertical solutions that they and their partners build on that. I suspect that there are still a lot of partners (and customers) with solutions built on the older technology; as other software vendors have found, it’s often difficult to get people to refactor onto a new platform when the old one is currently meeting their needs. Their core capture license revenue is still almost 60% of their revenue, but the new and acquired products have increased from zero in 2011 to over 40%. They’ve increased their R&D spending significantly, as well as spending on acquisitions, and sales and marketing activities; given their strong cash position, they’ve been able to fund all of these expenditures using cash generated from operations. Overall, a good financial position and good repositioning from their traditional capture business to a broader part of their customers’ business processes, particularly in the systems of engagement.

Bish talked about the “first mile challenge” of getting from the systems of engagement to the systems of record, and how SPAs created with their TotalAgility platform can fill that gap, providing a range of capabilities including mobility, capture, transformation, collaboration, e-signature (via their recent acquisition of Softpro SignDoc) integration and analytics. Although they present TotalAgility primarily as a platform for partners and customers to build SPAs, they also offer vertical applications where they have experience: Kofax Mortgage Agility and Claims Agility. He went into more detail on how the Softpro SignDoc and Aia customer communications acquisitions provide new capabilities for handling signed digital communications with customers, allowing legally-binding transactions to be completed within applications built on TotalAgility: what he called digital transaction management. He also talked about their new mobile capture platform, available as an off-the-shelf capability as well as an SDK for embedding within other mobile apps.

They’re really pushing to shift people onto the new TotalAgility platform, offering some good deals for customers to switch their KC/KTM licenses for TotalAgility licenses at a reasonable price point, but the effort required to refactor existing applications will still likely impede these migration efforts. There is no doubt that TotalAgility is a superior platform that offers new functionality, better flexibility, and faster development and deployment: the issue is that many customers find their current KC/KTM functionality to be adequate, and moving onto the new platform is a sufficiently large development project that they may consider looking at competitive solutions. I see this same challenge with other vendors, and it’s going to be a lot slower and less successful than Kofax likely imagines.

He summed up by stating that their short-term growth strategy is incremental rather than transformative. There’s also some sessions later today for the analysts to meet with Bish, so I’ll likely be able to expand on this further in later posts.

Next up was Guy Kawasaki, currently at Canva but well-known for his past gigs as an Apple evangelist and other high-profile engagements. His presentation was on customer enchantment: like engagement, but better. He laid out 10 rules for enchanting customers:

  1. Achieve likability, by being genuine, accepting and positive
  2. Achieve trustworthiness, by trusting others
  3. Perfect your product, because it is easier to engage people with good stuff than with crap
  4. Launch your product, by telling a story containing salient points, and planting the seeds of that story with many social channels
  5. Overcome resistance, using social proof or data to change people’s minds, and understanding who the true influencers are
  6. Endure, by enabling dedicated supporters and building an ecosystem
  7. Present, and become a great presenter by customizing the introduction to the audience, selling the dream of how your product changes people’s lives, and using the 10 slides/20 minutes/30 point font rule
  8. Use technology, removing the speed bumps to engagement and providing value in the form of information, insights and assistance
  9. Enchant up (that is, enchant the boss), by dropping everything else, prototyping fast and delivering bad news early
  10. Enchant down (that is, the people who work for you), by providing mastery and autonomy towards a higher purpose, and being willing to do the dirty jobs

He wrapped up with the three pillars of enchantment: be likable, trustworthy and competent. Great words to live by.

Kawasaki’s a very engaging and funny speaker, with some sharp cracks at Apple and Microsoft alike, and good examples from a number of industries. Great talk.

There’s an analysts session for the rest of the morning, looking forward to hearing more about Kofax’s plans for the future.

Case Management at TIBCONOW 2014

Yesterday, I attended the analyst sessions (which were mostly Q&A with Matt Quinn on the topics that he covered in the keynote), then was on the “Clash of the BPM Titans” panel, so not a lot of writing. No keynotes today, on this last day of TIBCO NOW 2014, but some BPM breakouts on the calendar — stay tuned.

I started the day with Jeremy Smith and Nam Ton That presenting on case management. They discussed customer journeys, and how their Fast Data platform allows you to detect and respond to that journey: this often includes semi-structured, dynamic processes that need to change based on external events and the process to date. It’s more than just process, of course; there needs to be context, actionable analytics, internal and external collaboration, and recommended actions, all working adaptively towards the customer-centric goal.

TIBCO addresses case management with additions to AMX BPM, not with a separate product; I believe that this is the best way to go for a lot of case management use cases that might need to combine more traditional structured processes with adaptive cases. The new capabilities added to support case management are:

  • Case data, providing context for performing actions. The case data model is created independently of a process model; the modeling uses UML to create relational-style ERDs, but also supports scripting and other functions beyond simple data modeling. This appears to be where the power — and the complexity — of the case management capabilities lie.
  • Case folders, integrating a variety of document sources, including from multiple ECM systems using CMIS, to act as the repository for case-related artifacts.
  • Case state and actions, allowing a user (or agent) to view and set the state of a case — e.g., received, in process, closed — and take any one of a number of actions allowed for the case when it is in that state. This is modeled graphically with a state/action model, which can also apply user/role permissions, in a very similar fashion to their existing page flows capability. Actions can include social interactions, such as requesting information from an expert, accessing a Nimbus-based operations manual related to the current action, applying/viewing analytics to provide context for the action at that state, or providing recommendations such as next best action. Rules can be integrated through pre-conditions that prevent, require or invoke actions.
  • Ad hoc tasks, allowing the case user to instantiate a user task or subprocess; it appears they are doing this by pre-defining these in the process model (as ad hoc, or disconnected, tasks) so although they can be invoked on an ad hoc basis, they can’t be created from scratch by the user during execution. Given that multiple process models can be invoked from a case, there is still a lot of flexibility here.
  • Case UI, providing some out of the box user interfaces, but also providing a framework for building custom UIs or embedding these capabilities within another UI or portal.
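The state/action idea with rule pre-conditions can be sketched in a few lines. This is a conceptual illustration only, assuming hypothetical state, action and rule names — it is not the AMX BPM API or data model.

```python
# Sketch of a case state/action model: each case state permits a set of
# actions, and rule pre-conditions can prevent an action from appearing.
# All names are hypothetical, for illustration only.

RULES = {
    # action -> pre-condition evaluated against the case data
    "close": lambda case: case["documents_complete"],
}

STATE_ACTIONS = {
    "received": {"assign", "request_info"},
    "in_process": {"request_info", "close"},
    "closed": set(),
}

def allowed_actions(case):
    """Actions permitted by the current state, filtered by pre-conditions."""
    candidates = STATE_ACTIONS[case["state"]]
    return {a for a in candidates if RULES.get(a, lambda c: True)(case)}

case = {"state": "in_process", "documents_complete": False}
print(sorted(allowed_actions(case)))  # → ['request_info']
```

Here the "close" action is defined for the in-process state but suppressed by its pre-condition until the case folder is complete, which matches the described pattern of rules that prevent or require actions at a given state.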

Related cases can be linked via an association field created in the case data model; since this is, at heart, an integration application development environment, you can do pretty much anything although it looks like some of it might result in a fairly complex and technical case data model.

They didn’t do an actual demo during the presentation; I’ll drop by the showcase and take a peek later today.