Going Beyond Process Modeling, Part 1

I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is best known for its free downloadable process modeler, but also has a full-featured BPMS for executing processes.

My papers are not at all specific to Bizagi products; the first one, which you can find here (registration required), outlines the business benefits of automating and managing processes, and presents some use cases. In my experience, almost every organization models its processes in some way, but most never move beyond process analysis to process management. This paper will provide some information that can help build a business case to do just that.

The second paper will be released in a few weeks, covering a more technical view of exactly how you go about starting on process automation projects, and moving from an initial project to a broader program or center of excellence.

We’re also scheduling a webinar to expand on the concepts in the paper; I’ll post the date when that’s available.

If you want to learn more about how Bizagi stacks up in the BPMS marketplace, check out the report on Bizagi from the Fraunhofer Institute for Experimental Software Engineering, available in both English and German. Spoiler alert: relative to the participating vendors, Bizagi scored above average in six of the nine categories, and around average in the remaining three. This is a more rigorous academic view than you might find in a typical analyst report on a vendor, including test scenarios and scripts for workshops where they created and ran sample process applications. Fraunhofer sells a book with the complete market analysis of all vendors studied, although I could only find a German edition on their site.

Effektif BPM Goes Open Source

On a call with Tom Baeyens last week, he told me about their decision to turn the engine and APIs of Effektif BPM into an open source project: not a huge surprise since he was a driver behind two major open source BPM projects prior to starting Effektif, but an interesting turn of events. When Tom launched Effektif two years ago, it was a bit of a departure from his previous open source BPM projects: subscription-based pricing, cloud platform, business-friendly tooling for creating executable task lists and workflows with little IT involvement, and an integrated development environment rather than an embeddable engine. In the past, his work has been focused on building clean and fast BPM engines, but building the Effektif user-facing tooling taught them a lot about how to make a better engine (a bit to his surprise, I think).

The newly-launched open source project includes the fully-functional BPM engine with Java and REST APIs; the REST APIs are a bit minimal at this point, but more will come from Effektif or from community contributions. It also includes a developer cloud account for creating and exporting workflows to an on-premise engine (although it sounds like you can create them in any standard BPMN editor), or process instances can be run in the cloud engine for a subscription fee (after a 30-day free trial). They will also offer developer support for a fee. Effektif will continue to offer the existing suite of cloud tools for building and running workflows at subscription pricing, allowing them to address both the simple, out-of-the-box development environment and the developer-friendly embeddable engine – the best of both worlds, although it’s unclear how easy it will be for both types of “developers” to share projects.

You can read more about the technical details on Tom’s blog or check out the wiki on the open source project.

This definitely puts Effektif back in direct competition with the other open source BPM projects that he has been involved with in the past – jBPM and Activiti (and Camunda, which forked from Activiti) – since they all use a similar commercial open source business model, although Tom considers the newer Effektif engine to have a more up-to-date architecture as well as simpler end-user tooling. How well Effektif can compete against these companies offering commercial open source BPM will depend on its ability to build the community as well as continue to offer easy and compelling citizen developer tools.

KofaxTransform 2015 In Pictures

As I prepared to depart Las Vegas, I flicked through some of my photos from the past couple of days and decided to share. First, the great work of the ImageThink team of graphic recorders:

[photos: ImageThink graphic recordings]

There were more of these that I didn’t capture; great idea and nice execution.

We had a fun evening event on Monday at Tao nightclub at the Venetian, with an impressive turnout considering that it wasn’t in the same hotel:

[photos: evening event at Tao]

I also captured some Vegas day and night shots from my hotel room at the Aria:

[photos: day and night views from the Aria]

Lastly, our Kofax-branded tiramisu dessert from the awards dinner last night:

[photo: Kofax-branded tiramisu]
A good mix of work and play!

Analytics For Kofax TotalAgility With @Altosoft

Last session here at Kofax Transform, and as much as I’d like to be sitting around the pool, I also like to squeeze every bit out of these events, and support the speakers who get this most unenviable timeslot. I’ve been in a couple of the analytics sessions over the past two days, which are based on the Kofax Altosoft Insight product. Married with TotalAgility (KTA) for process analytics, it comes in three flavors: a simple version with some pre-defined dashboards, a more complete version tied only to the KTA databases, and the full version with all of the Insight functionality against any data source, including KTA. The focus seems to be only on document capture workflow analytics, with many of the default reports on things like productivity, extraction rates and field accuracy in the scan and extraction modules; although these are definitely important, and likely of primary importance to Kofax’s current customer base of capture clients, the use cases for their demos need to push further into the post-capture business processes if they expect to be taken seriously as a BPM vendor. I know that KTA is a “first mile” solution and the capture processes are essential, but there should be more to apply analytics to across the customer journey managed within a smart process application (SPA).

The visualization and dynamic filtering are pretty nice, as you would expect in the Altosoft environment, allowing you to drill into specific processes and tasks to find problem areas in process quality and operator performance. Traditional capture customers in the audience are going to like this, since it provides a lot of information on those front-end processes that can become an expensive bottleneck to downstream processing.

We had another look at the process intelligence that I saw in an earlier session, monitoring event logs from capture workflows plus downstream processing in KTA or another system such as a third-party BPM or ERP system. Although that’s all good stuff, it does highlight that the Kofax end-to-end solution is made up of a number of systems strung together, rather than an integrated platform with shared infrastructure. It’s also completely document-centric since it uses document ID as the instance ID: again, well-suited for their current capture customers, but not necessarily the mind-set required to approach a more general BPM/case management market that is more data-centric than document-centric.

This wraps up Kofax Transform 2015. There is a customer awards dinner tonight that I plan to attend, then head home tomorrow. Thanks to the entire Kofax team, especially the amazing analyst relations crew, for inviting me here and making sure my time was well-spent. As a matter of disclosure, Kofax paid my travel expenses to be here, but did not otherwise compensate me for my time or for anything that I wrote here on my blog. Kofax has been a customer of mine in the past for presentations at Transform as well as webinars and white papers.

My next event is bpmNEXT in Santa Barbara at the end of the month — if you’re interested in the next generation of BPM or just want to hang with a bunch of BPM geeks in a relatively non-partisan environment, I highly recommend that you check it out.

Smarter Processes With Kapow Integration

I’m in a Kofax Transform breakout session on Kapow integration together with KTA; I missed documenting the first part of the session when my Bluetooth keyboard stopped talking to my Android tablet, until I figured out how to pair it with my iPhone (which is not supposed to be possible), so I’m blogging on that instead. I feel like MacGyver.

Kapow provides a way to create “robots” that perform a sophisticated sort of automated control and screen scraping of web pages: a robot interacts with a web page on behalf of other applications (such as those built on Kofax TotalAgility), instead of a user having to interact with the page directly. In the demonstration that we saw, a robot was created to enter data to generate pay stubs on a site, then scroll through the full set of stubs created to take a screen snapshot or PDF of each. This allows any web application to use robots to harvest information from a web site without user interaction: for example, going to a series of bank web sites and entering the provided credentials to gather bank statements as input to a mortgage process. The use case shown had a web application that was presented to the customer, gathered their credentials for a number of banking sites, then went to each of those behind the scenes to grab the bank statements using the robots’ knowledge of how to navigate each site. Although the web sites being remotely controlled are hidden from the user, the robot can show a clip of the underlying site to, for example, display an error message such as incorrect credentials.
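
The robot designer is graphical rather than code-based, but as a rough analogy, here’s the log-in/navigate/harvest pattern that such a robot encapsulates, expressed as a plain Python sketch. This is my own generic illustration, not Kapow’s tooling or API, and the site, URLs, form fields and selector are all hypothetical:

```python
# Generic sketch of the log-in/navigate/harvest pattern that a
# Kapow-style robot automates; the site, URLs, form fields and CSS
# selector here are all hypothetical.
import requests
from bs4 import BeautifulSoup

BANK_URL = "https://bank.example.com"  # hypothetical banking site

def fetch_statement_links(username: str, password: str) -> list[str]:
    session = requests.Session()  # keeps cookies across requests, like a browser

    # Log in by posting the credentials the customer provided
    response = session.post(f"{BANK_URL}/login",
                            data={"user": username, "pass": password})
    if "Invalid credentials" in response.text:
        raise ValueError("incorrect credentials")  # the error case a robot must surface

    # Navigate to the statements page and parse it
    page = session.get(f"{BANK_URL}/statements")
    soup = BeautifulSoup(page.text, "html.parser")

    # Harvest the statement PDF links for downstream processing
    return [a["href"] for a in soup.select("a.statement-pdf")]
```

The difference, of course, is that a Kapow robot is assembled graphically rather than hand-coded.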

The design is all pretty much drag-and-drop, meaning that a semi-technical data or business analyst could work through the creation of a robot: they just need to know how to navigate through the web site to be controlled, and be able to understand how to handle all of the possible error cases. There are more technical implementations for complex scenarios that would require developer skills, but a lot can be done without that.

In my past life as a systems integrator, we did a lot of screen scraping, mostly of green-screen systems that could not be easily integrated with; funny that we have exactly the same problem even though we have leapfrogged a few generations of technologies from terminal emulators to browsers. Plus ça change.

Process Intelligence at KofaxTransform

It’s after lunch on the second (and last) day of Kofax Transform, and the bar for keeping my attention in a session has gone up somewhat. To that end, I’m in a session with Scott Opitz and Rich Rabin from the Kofax Altosoft division, but not sure it’s going to meet that bar, since Opitz started out by stating that what the TotalAgility (KTA) sessions call process is much more complex than what they call process, and I’m a bit more on KTA’s side of this definition.

Altosoft process intelligence is really about the simple milestone-based monitoring of operational intelligence, applied to processes that execute across multiple systems, much like SAP Operational Process Intelligence based on HANA, or IBM Business Monitor; you rarely have all of your process milestones in a single system, and even if you do, that system may not have adequate operational intelligence capabilities. Instead, operational intelligence systems pick up the breadcrumbs left by the processes — such as events, database records or log files — and provide an analytics layer, usually after importing that data into a dedicated analytics datamart.

There are really two main things to measure with process intelligence: performance and quality/compliance. To get there, however, you need to know what the process is supposed to look like in order to measure patterns of behavior. Altosoft’s process intelligence does what they call “swimlane analysis” — looking at which tasks are done in which order, a form of process mining discovery algorithm since there is no a priori process model — to identify operational patterns and derive a process model from runtime data, showing the most common/expected paths as well as the outliers. Not just process mining as an analysis tool, it then shows the live process monitoring data points against those models, and provides some good interactive filtering capabilities, allowing you to find missing steps that may indicate that the task wasn’t performed or (more likely for steps with manual logging) that the task was not documented.
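
As a minimal sketch of the underlying idea (my own illustration in Python, not Altosoft’s actual algorithm), deriving a model from runtime data amounts to grouping the event log by process instance and counting which task directly follows which; the frequent pairs form the expected paths, and the rare ones are the outliers:

```python
# Minimal directly-follows discovery over an event log: a simplified
# illustration of process mining, not Altosoft's actual algorithm.
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Each event is (instance_id, task), assumed already sorted by timestamp;
# the instance IDs and task names are invented for the example.
events = [
    ("doc-1", "Scan"), ("doc-1", "Extract"), ("doc-1", "Validate"),
    ("doc-2", "Scan"), ("doc-2", "Extract"), ("doc-2", "Rescan"),
]

# Group the task sequence for each process instance
traces: dict[str, list[str]] = {}
for instance, task in events:
    traces.setdefault(instance, []).append(task)

# Count directly-follows pairs across all instances
edges = Counter(pair for trace in traces.values() for pair in pairwise(trace))

# Frequent edges trace the expected path; rare edges are the outliers
for (a, b), count in edges.most_common():
    print(f"{a} -> {b}: {count}")
```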

Since the Insight platform is a complete BI environment, this information can also be combined with more traditional BI analytics and dashboards, providing real-time alerts as well as historical analysis. They also have ways to use a predefined process model and measure against that; this then becomes more of a conformance analysis to see how closely the actual runtime data matches the a priori model.

Kofax Claims Agility SPA

Continuing with breakout sessions at Kofax Transform is a presentation on the Claims Agility smart process application that Kofax is creating for US healthcare claims processing, based on the KTA platform. They have built in a lot of the forms and rules for compliance with US healthcare regulations; I suspect that this means that the SPA would not be of much value for non-US organizations.

Claims Agility is still in development, but we were able to get an early look at a demo. The capture workflow is pretty simple: scan, classify and extract data fields from a form, then pass it on to a claims worker for their activities, presenting both the scanned document and the data fields. This is a pretty standard scanned document workflow application, but has the advantage of having a lot of knowledge of the US healthcare forms, data validation, rules and processes built in so that little setup and system training would be required for the standard forms and workflows. Incomplete or incorrect forms can be held, allowing the validated forms in the same batch to be completed. The final step in the predefined workflow performs the EDI transactions.
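
The hold behavior is worth a closer look: a hypothetical sketch of the pattern in Python (not Kofax’s implementation; the validation rule is a placeholder for the built-in US healthcare rules) might look like this:

```python
# Hypothetical sketch of the "hold incomplete forms, complete the rest"
# batch pattern described above; not Kofax's actual implementation.
from dataclasses import dataclass, field

@dataclass
class Claim:
    claim_id: str
    fields: dict
    errors: list = field(default_factory=list)

def validate(claim: Claim) -> bool:
    # Placeholder rule: the real product applies its built-in US healthcare
    # validation (e.g. CMS code tables) at this step.
    required = ("patient_id", "provider_id", "cms_code")
    claim.errors = [f for f in required if f not in claim.fields]
    return not claim.errors

def process_batch(batch: list[Claim]) -> tuple[list[Claim], list[Claim]]:
    completed, held = [], []
    for claim in batch:
        (completed if validate(claim) else held).append(claim)
    # Completed claims go on to the EDI transaction step; held claims
    # wait for correction without blocking the rest of the batch.
    return completed, held
```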

They will do updates for some components of the system, such as the CMS codes that drive the validation; they haven’t finalized the hot update capabilities, and it’s not clear that they will be able to do much more than update code tables.

We looked at the customizability of the processes and rules: customers can modify the standard processes using the graphical process designer, including building their own processes. Since the out-of-the-box process is really simple — four steps — there’s no real issue of upgradability of the process at this point, but it’s likely that any processes provided should be considered templates rather than productized frameworks. Configuration for data extraction and validation is provided as part of the core package, but again, the customer can override the defaults. I was going to ask about the separation of base product from customizations with respect to product upgrades, but a customer in the audience beat me to it: there are separate areas for custom versus core code, as well as versioning, so it appears that they have thought through some of this. It will be interesting to see how it plays out once the product has been used at customer sites through a couple of upgrade cycles.

There will be an initial release in June or July this year, and Kofax is looking for early adopters now; full release will be near the year end. Claims Agility is the fifth SPA that Kofax is offering on the KTA platform, and they’re learning more about how to do these right with each implementation, plus how to integrate the new technologies such as e-signature.

TotalAgility Product Update At KofaxTransform

In a breakout session at Kofax Transform, Dermot McCauley gave us an update on the TotalAgility product vision and strategy. He described five vital communities impacted by their product innovation: information all-stars who ensure that the right information is seen by the right people at the right time, performance improvers focused on operational excellence, customer obsessives who focus on customer satisfaction, visionary leaders who challenge the status quo, and change agents using technology thought-leadership to drive business value. I think that this is a great way to think about product vision, and Dermot stated that he spends his time thinking about how to serve these five communities and help them to achieve their goals.

[slide: TotalAgility product vision]

TotalAgility is positioned to be the link between systems of engagement and systems of record, making that first mile of customer engagement faster, simpler, more efficient, and customer-friendly. It includes four key components: multichannel capture and output, adaptive process management, embedded actionable analytics, and collaboration. Note that some of this represents product vision rather than released product, but this gives you an idea of where they are and what they’re planning.

Multichannel capture and output includes scanning in all forms, plus capture from electronic formats including documents, forms and even social media, with a goal to be able to ingest information of any type in any format. On the processing and output side, their recent acquisitions fill in the gaps with e-signature and signature verification, and outbound correspondence management.

[slide: TotalAgility product components and SPAs]

Adaptive process management includes pre-defined routine workflows and ad hoc collaboration, plus goal-based and analytics-driven adaptive processes. These can be automated intelligent processes, or richer context used when presenting tasks to a knowledge worker.

Embedded actionable analytics are focused on the process at hand, driving next-best-action decisions or recommendations, and detecting and predicting patterns within processes.

Collaboration includes identifying suitable and available collaborators, and supporting unanticipated participants.

[slide: AP Agility]

The goal is to provide a platform for building smart process applications (SPAs), both for Kofax with their Mortgage Agility and other SPAs, and for partners to create their own vertical solutions. McCauley walked through how Kofax AP Agility uses the TotalAgility platform for AP processing with ERP integration, procurement, invoice capture and actionable analytics; then Mortgage Agility, which brings in newer capabilities of the platform such as e-signature and customer correspondence management, with a focus on customer engagement as well as internal efficiencies.

[slide: TotalAgility deployment options]

He walked through deployment options of on-premise (including multi-tenancy on-premise for a BPO or shared service center) and Microsoft Azure public cloud (multi-tenant or own instance), and touched on the integration into and usage of Kapow and e-signatures in the TotalAgility platform. They’re also working on bringing more of the analytics into TotalAgility to allow for predictions, pattern detection, recommendations and other analytics-based processing.

[slide: TotalAgility innovation themes]

Going forward, they have four main innovation themes:

  • Platform optimization for better performance
  • Portfolio product integrations for a harmonized design time and runtime
  • Pervasive mobility
  • Context-aware analytics

[slide: Kofax Agility mobile extraction innovation]

He showed some specific examples that could be developed in the future as part of the core platform, including real-time information extraction during document capture on a mobile device, and process improvement analytics for lightweight process mining; the audience favorite (from a show of hands) was the real-time extraction during mobile capture.

KofaxTransform 2015: Day 2 Customer Keynotes

I had a chance to hear Tom Knapp from Waterstone Mortgage speak yesterday at the analyst briefing here at Kofax Transform, and he kicked off this morning’s keynote. They started their journey with Kofax a year ago, and participated in the Kofax sales kickoff in January of this year to show their use of Kofax Mortgage Agility. It will be going beta this month in a couple of their branches, and rolling out across all branches later this year. They are looking to improve the loan process pipeline, which goes from loan application to initial processing to underwriting to approval to closing; they started their improvement process by examining what data and documentation is required at each step in the process, and the challenges in collecting those documents, such as replacing illegible documents and clarifying the applicant’s cash flow. Given that the time to correct documentation errors can range from two days to four weeks, this can cause problems because there is typically a fixed closing date that needs to be met for funding. He expects the market to shift so that millennials will dominate household formation, and therefore the mortgage market, and that smartphones are a key way for those potential customers to perform their financial transactions. There are also some changing mortgage regulations in the US that deal with costs and fee disclosure, and getting better information earlier in the process helps Waterstone to make those disclosures as required.

They expect four major classes of benefits from implementing Mortgage Agility:

  • Market: improved borrower experience, and a consistent borrower portal for life of loan
  • Technology: leveraging mobile capture, e-disclosure and e-signatures
  • Loan process flow: lifting data from documents to populate the application, eliminating paper where possible, and using intelligence and automation in the process to assist underwriters and other knowledge workers
  • Compliance: support new regulatory and compliance requirements

Waterstone has not implemented Mortgage Agility yet, so we will have to wait until next year to hear about their success.

Next up was Tim Dewey from Safe-Guard Products, talking about their journey in transforming claims processing for today’s connected customer. They sell automotive-related insurance, including such products as wear-and-tear coverage during a lease, and wheel insurance. (As a non-car-owner, I didn’t even realize that these things existed.) They wanted to improve the claims process by reducing service “friction”, particularly in handoffs, touchpoints and requests for information from the customer, and by improving the customer experience. To address the service friction, they created a new adjudication process using Kofax TotalAgility, with new job roles to match the claims process and reduce handoffs and touches, plus automation of the content management. For customer engagement, they created a self-service capability for all stakeholders, allowing both document uploads and status checks, plus automated milestone notifications to the customer; as an aside, this also improves back-office efficiency by reducing status calls and paper handling.

They measured performance before and after, so they can tell how well this is working: significant improvements in claim touches and adjudication cycle time, and reduced customer calls. They are working to improve this further through more back-office automation and additional transparency and self-service. A great success story.

We finished the keynote with a panel moderated by Anthony Macciola, Kofax CTO, including Knapp and Dewey, plus Craig LeClair from Forrester and Dave Caldeira, Kofax Product Marketing. Some discussions on disruption in business — again, supposedly “millennial”-related — and the value proposition of improving the first mile of business that Kofax is addressing. Macciola gave a shorter version of the product capability briefing that we heard in the analyst session yesterday, which you can read about at that post, then had Knapp and Dewey share their experiences in digital transaction management.

I’ll be sticking around for the rest of the day and will be blogging from some of the breakout sessions; looks like there’s a great lineup.

As an aside, the “us versus them” discussions about millennials are getting a bit tired. We really need to stop this characterization, because everything that Knapp, Dewey and Macciola said about millennials is also true for me and many other Boomers/Gen Xers that I know — smartphone use (I am live-blogging from my phone and tablet right now), self-service (faster and more accurate if I do it myself), preferred modes of customer service (online and asynchronous so that I can get service even if I’m travelling and in other time zones), social media participation (did I mention that I’m blogging from my phone?) and more — and I’m 25+ years too old to hit the millennial demographic. I’ve written about this before, but can we stop having customers and vendors stand up at conferences and talk in a slightly bemused and condescending tone about how their adult kids use their smartphones? It’s really a split between those who embrace new technology and those who are dragging their feet (even senior executives in technology companies, who should be setting an example rather than resisting new technology and the ways of business that it enables), and although there is some degree of age correlation, it’s not as simple as just birth year. </rant>

Kofax Altosoft For Operational Intelligence

Wayne Chambliss and Rich Rabin of Kofax Altosoft gave a presentation at Kofax Transform, most of which was a demo, on becoming an operational intelligence guru. This is my first real look at the Altosoft analytics product, acquired by Kofax about two years ago, since that’s not my main focus unless it’s particularly tied to process in some way.

Rabin used their graphical design tool to define the location of the metrics datamart and the data source (a variety of databases, or a file drop location), then defined metrics by mapping the data fields and applying aggregations and formatting. Although there is inherent complexity in understanding the use of the underlying data, the tool seems to make it pretty easy and fast to define the data and metrics, then load the data into the datamart and calculate the initial metrics. Once that was done, he created a graphical dashboard based on the defined metrics, and could preview and run it directly. No SQL, no coding. If you want to get more complex, there’s a full expression editor, but everything is still done graphically within the same tool. You can directly examine the underlying generated SQL if you really want to.
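
To make the metric concept concrete, here is a hypothetical Python/pandas equivalent of what such a definition amounts to; the real tool generates SQL rather than Python, and the field names here are invented:

```python
# Hypothetical illustration of a defined metric: map source fields,
# then aggregate and derive; the column names here are invented.
import pandas as pd

scan_log = pd.DataFrame({
    "operator": ["ann", "ann", "bob", "bob"],
    "pages":    [120, 80, 200, 40],
    "errors":   [2, 1, 5, 0],
})

# Metric: total pages and error rate per operator
metrics = scan_log.groupby("operator").agg(
    total_pages=("pages", "sum"),
    total_errors=("errors", "sum"),
)
metrics["error_rate"] = metrics["total_errors"] / metrics["total_pages"]
print(metrics)
```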

It’s also possible to create a record to use as a data source, which is a similar abstraction concept to a database view, but with the additional functionality of heterogeneous data sources, derived fields, mappings and even field renaming. This allows someone to create metrics and dashboards based on records, without having to understand the underlying data sources.
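
A rough sketch of the record idea, again as my own hypothetical Python illustration rather than Insight’s implementation: join heterogeneous sources, rename fields, and add derived fields, so that the dashboard author only ever sees the record:

```python
# Hypothetical sketch of a "record": a view over heterogeneous sources
# with renamed and derived fields; the sources and fields are invented.
import pandas as pd

# Two heterogeneous sources: a database extract and a dropped CSV file
db_rows = pd.DataFrame({"doc_id": [1, 2], "scan_secs": [34, 52]})
csv_rows = pd.DataFrame({"document": [1, 2], "fields_ok": [18, 20],
                         "fields_total": [20, 20]})

record = (
    db_rows.merge(csv_rows, left_on="doc_id", right_on="document")
           .rename(columns={"scan_secs": "scan_time"})               # field renaming
           .assign(accuracy=lambda r: r.fields_ok / r.fields_total)  # derived field
           [["doc_id", "scan_time", "accuracy"]]                     # the record's fields
)
```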

Lots of other functionality here, including setting user authentication and access rights, scheduled loading of data sources into the metrics mart, and dynamic filtering on dashboards and pivoting charts, much of which is available directly to the end users on the dashboard.

He wrapped up with a very brief process intelligence demo, where it’s possible to specify metrics directly based on a Kofax document capture process.