bpmNEXT 2015 Day 2 Demos: Trisotech, Comindware, Bonitasoft

The first group of demos on bpmNEXT day 2 had a focus on the links between architecture and process: from architectural modeling, to executable architecture, to loosely-coupled development architecture for process applications.

Trisotech: Digital Enterprise Graph Semantic Layer for the Business/IT Divide

Denis Gagné kicked off by talking about Trisotech’s Digital Enterprise Graph, a semantic layer for transforming and combining information and models that allows information to be shared and enriched for use by both business and IT stakeholders. The issue with current standards is that they only allow for structured exchange of information between different parts of the business; a graph structure, on the other hand, allows information in widely varying formats to be distilled down to the who, what, when, where and why of the organization, so that new relationships and interactions can be discovered and explored. Trisotech’s current modeling tools — Discovery Accelerator, BPMN Modeler and CMMN Modeler — can all contribute models to the Digital Enterprise Graph, but it can also accept models from a variety of other enterprise architecture and modeling tools. This brings business architecture, enterprise architecture and case/process modeling outputs together into a consolidated semantic graph, allowing each group to use their own models and terminology.

Denis gave a demo of the Discovery Accelerator for capturing and discovering business information: a text description can be highlighted with the actors, activities and artifacts to iteratively build a conceptual model; a balanced scorecard, W5 or SIPOC board can be used as a starting template; or an accelerator can pull in reference models from Casewise, APQC and others to provide a framework and ontology from which to begin discovery and modeling. RACI charts can be created from the actors, activities and goals. The resulting information can be exported to BPMN, CMMN, UML, XPDL or GO-BPMN for more detailed modeling in another tool. If an EA reference framework (such as Casewise or Sparx) was used in the Discovery Accelerator, semantic links are maintained from activities back to the original framework, even if the activities have been renamed and reorganized.

He finished up with a demo of their new Insight Analyzer tool, which is used to explore information in the Digital Enterprise Graph: a node in the graph can be selected to see its origin as well as its interrelationships with other nodes that may have come from different modeling tools. New relationships can be inferred from the graph as more information is added, without having to make explicit links, for example, identifying risk points based on their level of interconnectivity with other activities.
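
As a rough illustration of that inference idea (a toy sketch of my own, not Trisotech’s actual graph model or APIs): activities, actors and artifacts contributed by different tools become nodes in a single graph, and potential risk points fall out of connectivity alone:

```python
from collections import defaultdict

# Toy semantic graph: edges link activities to the actors, artifacts and
# other activities they touch, regardless of which modeling tool contributed them.
edges = [
    ("Approve Loan", "Credit Officer"),
    ("Approve Loan", "Loan File"),
    ("Approve Loan", "Assess Risk"),
    ("Assess Risk", "Risk Analyst"),
    ("Assess Risk", "Loan File"),
    ("Notify Customer", "Loan File"),
]

# Build an undirected adjacency map.
graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

# Infer "risk points" from connectivity alone: no explicit link required.
RISK_THRESHOLD = 3
risk_points = [n for n, nbrs in graph.items() if len(nbrs) >= RISK_THRESHOLD]
print(risk_points)  # highly interconnected nodes, e.g. 'Approve Loan', 'Loan File'
```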

[Update to change Trisotech “BPM Graph” to “Digital Enterprise Graph” to match Denis’ presentation materials and current product naming.]

Comindware: Between Architecture and Execution: Tale of 3 Gaps

Anatoly Belaychuk and Konstantin Bredyuk discussed three gaps between architecture and execution: within process models — the process model round-trip problem; between process, project and case models; and between process-based and object-based work. They see architecture and architectural maturity as important to an organization’s ability to model and execute processes. In their demo, they showed a different representation of processes by modeling capabilities, resources and inputs/outputs; this is not an execution sequence to replace BPMN, but rather an architectural view of how organizational capabilities link together, more like a value chain diagram with major milestones identified. Drilling down into a capability, we may see a submodel using the same model syntax, or a link to a BPMN process. This is like a slice through enterprise architecture, with a variety of process-related model types linked into a business architecture capability model; it also creates executable processes and cases, not just models. This “executable architecture” can be used by both architects and process modelers; it also includes data modeling to define record objects and attributes, and a forms modeler, providing a complete application development environment. This provides a link between architects — who are unlikely to learn or even care about BPMN — and executable process models, although there is no direct link to existing enterprise architecture products or models that would maintain the sort of semantic links we saw in the earlier Trisotech demo.
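
As a toy rendering of the drill-down idea (my own sketch, not Comindware’s actual model syntax), each capability either decomposes into sub-capabilities in the same form or bottoms out in a link to an executable BPMN process:

```python
# Each capability either decomposes into sub-capabilities in the same
# syntax, or links to an executable BPMN process at the bottom of the drill-down.
capability_model = {
    "name": "Order to Cash",
    "children": [
        {"name": "Take Order", "bpmn": "take_order.bpmn"},
        {
            "name": "Fulfill Order",
            "children": [
                {"name": "Pick and Pack", "bpmn": "pick_pack.bpmn"},
                {"name": "Ship", "bpmn": "ship.bpmn"},
            ],
        },
        {"name": "Invoice", "bpmn": "invoice.bpmn"},
    ],
}

def executable_processes(node, path=""):
    """Walk the architecture and list where it bottoms out in executable BPMN."""
    path = f"{path}/{node['name']}"
    if "bpmn" in node:
        yield path, node["bpmn"]
    for child in node.get("children", []):
        yield from executable_processes(child, path)

for path, bpmn in executable_processes(capability_model):
    print(path, "->", bpmn)
```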

Bonitasoft: Building Sustainable Process-Based Apps

Miguel Valdés Faura finished this block of demos discussing process-based applications: how it’s still hard to create engaging user interfaces and easily-updated applications, in spite of the low-code/no-code promises. He demoed some capabilities that are still in their labs, which allow for more agile applications by separating data, business logic and user interfaces. He started with a procurement application: BPMN process models for the business logic, data object models and user interfaces are defined separately, interacting via JSON contracts and REST APIs. The contract between an activity in the process model and the user interface is defined as inputs and constraints; as long as the contract does not change, the UI can be changed with no impact on the process model. Mobile interfaces can be built independently of desktop interfaces, using the same contracts to interface with the business logic, and REST APIs for access to the data objects.

Their page builder provides environments for different form factors, offering standard UI widgets plus allowing for custom widgets; the page can either be deployed directly in their environment, or the page definition can be exported for further hand-coding outside it. Page fragments can be created for reusability across pages. Custom pages built outside their environment, such as with AngularJS, can be imported by an administrator into the runtime environment and immediately deployed. Although a full process application can be built purely in their environment, by loosely coupling the logic, data and UI they are able to make changes to any of those layers, including adding custom components and UIs, without impacting the others, as long as they respect the existing contracts and APIs. A good example of why we use multi-tier architectures rather than tightly coupled layers: greater flexibility and agility.
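
To make the contract idea concrete, here is a minimal sketch of an activity/UI contract check, with hypothetical field names rather than Bonitasoft’s actual contract format:

```python
# Hypothetical contract for a "Validate Purchase Request" activity:
# any UI (desktop, mobile, hand-coded AngularJS) can change freely
# as long as its payload still satisfies these inputs and constraints.
contract = {
    "inputs": {"requestId": str, "approved": bool, "comment": str},
    "constraints": [
        ("comment required on rejection",
         lambda p: p["approved"] or len(p["comment"]) > 0),
    ],
}

def fulfills(contract, payload):
    """Check a UI-submitted payload against the activity's declared contract."""
    for field, ftype in contract["inputs"].items():
        if not isinstance(payload.get(field), ftype):
            return False
    return all(check(payload) for _, check in contract["constraints"])

# A rejection without a comment violates the contract, whatever UI sent it:
print(fulfills(contract, {"requestId": "PR-42", "approved": False, "comment": ""}))  # False
```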

bpmNEXT 2015 Day 1 Demos: SAP, W4 and Whitestein

The demo program kicked off in the afternoon, with time for three of them sandwiched between two afternoon keynotes. Demos are strictly limited to 30 minutes, with a 5-minute, 20-slide, auto-advancing Ignite-style presentation (which I am credited with suggesting after some of last year’s slideware dragged on), followed by a 15-minute demo and 10 minutes for Q&A and changeover to the next speaker.

SAP: BPM and the Internet of Everything

Harsh Jegadeesan and Benjamin Notheis were in the unenviable first position, given the new presentation format; they gave an introduction to the internet of everything, referring to things, people, places and content. Events are at the core of BPM systems that sense and respond to their environment: patterns of events are detected, then managed with rules and workflow. They introduced Smart Process Services on HANA Cloud Platform, including an app marketplace, and looked at a case study of pipeline incident management, where equipment sensor events trigger maintenance processes: a machine-to-process scenario.

The demo showed a dashboard for pipeline management, with a geographic view of a pipeline overlaid with pump locations and details, highlighting abnormal readings and predicted failures. This is combined with cost data, including the cost of various risk scenarios such as a pipeline break or pump failure. The operator can drill down into abnormal readings for a pump, see the predicted failure and maintenance records, then trigger an equipment repair or replacement. The incident case can be tracked, and tasks assigned and escalated. Aggregates for incident cases show the number of critical cases or those approaching deadlines, and can be used to cluster the incidents to detect contributing factors. Nice demo; an expansion of the operational intelligence dashboards that I’ve seen from SAP previously, with good integration of predictions. Definitely a two-person demo, with the inclusion of a tablet, a laptop and a wearable device.

They finished with a developer view of the process-related services available on the HANA cloud portal, plus the standard Eclipse environment for assembling services using BPMN. This does not have their BPM engine (the former NetWeaver engine) behind it: the workflow microservices compile to JavaScript and run in an in-memory cloud workflow engine. However, they see that some of the concepts from the more agile development that they are doing on the cloud platform could make their way back to the enterprise BPM product.
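
That machine-to-process pattern is simple to sketch; the reading format and REST endpoint below are invented for illustration, not SAP’s actual services:

```python
import requests  # assumes: pip install requests

# Invented sensor readings; in the demo these came from pipeline pump sensors.
readings = [
    {"pump": "P-17", "pressure_psi": 410, "predicted_failure_days": 45},
    {"pump": "P-23", "pressure_psi": 610, "predicted_failure_days": 3},
]

PRESSURE_LIMIT = 550
FAILURE_WINDOW_DAYS = 7

for r in readings:
    # An abnormal reading or an imminent predicted failure triggers
    # an incident case in the process platform (hypothetical endpoint).
    if r["pressure_psi"] > PRESSURE_LIMIT or r["predicted_failure_days"] <= FAILURE_WINDOW_DAYS:
        requests.post(
            "https://example.com/workflow/incidents",
            json={"pump": r["pump"], "reason": "abnormal reading", "reading": r},
        )
```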

W4: Events, IoT, and Intelligent Business Operations

Continuing on the IoT theme, François Bonnet talked about making business operations more intelligent by binding physical device events together with people and business events in a BPMS. His example was fall management — usually for the elderly — where a device event triggers a business process in a call center; the device events can be integrated into BPMN models using standard event constructs. He demonstrated with a sensor made from a Raspberry Pi tied to positional sensors that detect orientation; by tipping over the sensor, a process instance was created that triggered a call to the subscriber, using GPS data to indicate the location on a map. If the call operator indicated that the subscriber did not answer, they would be prompted to call a neighbour, and then emergency services. KPIs such as the number of falls within a specified period are tracked, along with a history of the events for the subscriber’s device. The sensor being out of range or having no movement over a period of time can also trigger a new task instance, while returning the sensor to the upright orientation within a few seconds of a fall being detected cancels the process. Looking at the BPMN for managing events from the sensor, they are using the event objects in standard BPMN to their fullest extent, including both in-line and boundary events, with the device events translating to BPMN signal events. A great example of responsive event handling using BPMN.
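
The cancel-on-upright behavior maps naturally onto a short timer on the device side; here is a rough sketch with a simulated sensor and a hypothetical engine endpoint, not W4’s actual integration:

```python
import time
import requests  # assumes: pip install requests

ENGINE = "https://example.com/engine/signals"  # hypothetical endpoint
CANCEL_WINDOW_SECONDS = 5

def read_orientation():
    """Stub for the Raspberry Pi positional sensor; returns 'upright' or 'tipped'."""
    return "tipped"  # simulate a fall for illustration

def on_tip_detected(subscriber_id, gps):
    """Wait briefly before signalling: righting the sensor cancels the alert."""
    deadline = time.time() + CANCEL_WINDOW_SECONDS
    while time.time() < deadline:
        if read_orientation() == "upright":
            return  # fall cancelled, no process instance started
        time.sleep(0.5)
    # Translate the device event into a signal the BPMN model catches,
    # starting the call-center process with the subscriber's location.
    requests.post(ENGINE, json={"signal": "FALL_DETECTED",
                                "subscriber": subscriber_id, "gps": gps})

on_tip_detected("subscriber-123", {"lat": 48.86, "lon": 2.35})
```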

Whitestein: Demonstrating Measurable Intelligence in an Enterprise Process Platform

The last demo of the day, from Dan Neason of Whitestein, was also on the theme of events, but more focused on intelligent agents and measurable intelligence in processes. Their LSPS solution models and executes goal-driven processes, where the system uses previous events to evolve its methods for reaching the goals, predicting outcomes and recommending alternatives. The scenario was a mortgage application campaign, where information about applicants is gathered and the success of the campaign is determined by the number of completed mortgages; potential fraud cases are detected, and recommended actions are presented to a user to handle the case. Feedback from the user, in the form of accepting or rejecting recommendations, is used to tune the predictions. In addition to showing standard dashboards of events that have occurred, it can also give a dashboard view of predictions, such as how many mortgage applications are expected to fail, including those that may be resolved favorably through some recommended actions. The system is self-learning, based on statistical models and domain knowledge, so it can detect predefined patterns or completely emergent patterns; it can be applied to provide predictive analytics and goal-seeking behavior across multiple systems, including other BPM systems.
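
The accept/reject feedback loop can be sketched as a simple online update; this is a deliberately naive stand-in for Whitestein’s statistical models, with invented recommendation names:

```python
# Confidence per recommendation type, tuned by user feedback.
confidence = {"escalate_to_fraud_team": 0.5, "request_extra_documents": 0.5}
LEARNING_RATE = 0.1

def record_feedback(recommendation, accepted):
    """Nudge confidence toward 1 on acceptance, toward 0 on rejection."""
    target = 1.0 if accepted else 0.0
    confidence[recommendation] += LEARNING_RATE * (target - confidence[recommendation])

# Users keep accepting fraud escalations and rejecting document requests...
for _ in range(10):
    record_feedback("escalate_to_fraud_team", accepted=True)
    record_feedback("request_extra_documents", accepted=False)

# ...so future cases rank the escalation recommendation first.
print(sorted(confidence, key=confidence.get, reverse=True))
```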

Wrapping up this set of demos on intelligent, event-driven processes, we had a keynote from Jim Sinur (formerly of Gartner, now an independent consultant) on goal-directed processes. He covered concepts of hybrid processes, made up of multiple heterogeneous systems and processes that may exhibit both orchestration and collaboration to solve business problems.

Great first set of demos, definitely setting the bar high for tomorrow’s full day of 11 demos, and a good first day. We’re all off to the roof deck for a reception, wine tasting and dinner, so that’s it for blogging for today.

Canary roof deck

By the way, I realize that we completely forgot to create bpmNEXT bingo cards, although it did take until after 4pm for “ontology” to come up.

bpmNEXT 2015 Day 1: More Business of BPM

Talking with people at the first break of the first day, I feel so lucky to be part of a community with so many people who are friends, and with whom I can have both enlightening and amusing conversations.

Building a BPM Ecosystem

Continuing on the Business of BPM program, we had a panel on the BPM ecosystem with Miguel Valdés Faura of Bonitasoft, Scott Francis of BP-3 Global and Denis Gagné of Trisotech. Although billed as a panel, each participant had a 10-minute presentation slot before the joint Q&A.

BPM ecosystem panel

Not surprisingly, Miguel sees open source as an important part of the BPM ecosystem because it creates more of a meritocracy in the development of BPM capabilities, allowing many more people to participate actively in BPMS development and be recognized for their contributions. Being part of an open source community doesn’t necessarily mean that you’re writing core code: there are many people who contribute by developing extensions and add-ons, providing requirements, testing code, writing documentation and training materials for developers and users, and creating vertical solutions based on the open source offering. They may do this as volunteer contributors, or create businesses around the added-value components that they offer.

Scott talked about BP-3’s journey as former Lombardi employees who became Lombardi (then IBM BPM) partners, and now build add-on products for IBM BPM including user dashboards and code quality checkers. He talked about the things that they have done to build a successful business as a partner and ISV for a large vendor, including being consistent, adding value, building their own customer base rather than subcontracting to the vendor’s professional services arm, and marketing what they do. Having run a boutique BPM implementation services firm in the past, I agree that companies like BP-3 are an essential part of the BPM community, providing an alternative to the vendor’s PS that can often provide higher-quality services at a lower cost.

Denis, with his background in standards as well as building the Business Process Incubator resource community, has worked for years at explicitly building the BPM ecosystem. He has a “rising tide lifts all boats” philosophy of providing resources that allow potential customers to educate themselves and exchange information, which broadens the reach of the industry and helps to lift it out of the BPM 101 discussion stage. He also talked about the problem of BPM standards becoming divergent, that is, vendors take an agreed-upon standard such as BPMN, then create their own proprietary extensions that detract from the standard, and therefore from the community in general. Vendors that do this rather than participating in the standards development effort are not good community members; in my opinion, they are working from a fear-based philosophy of market scarcity rather than Denis’ more generous view that there will be a lot more of the BPM market to go around if we all help to educate and commoditize.

There was a wide-ranging discussion following their mini-presentations, although I only captured a couple of points:

  • Ensuring that the BPM ecosystem that we’re talking about covers process improvement, enterprise/business architecture and related topics, not just BPM software.
  • Why the push towards (mobile) apps isn’t more oriented to, or supported by, BPM technologies; and the problem of mobile app developers who don’t think at all about the back-end processes behind the transactions that they initiate; low-code BPM solutions might even be making this worse, since they remove the focus from developers. Mobile development fiefdoms have formed in many organizations, and these barriers need to be removed to integrate mobile apps and process.

Schrödinger’s BPM

We finished off the Business of BPM half-day program with Neil Ward-Dutton of MWD Advisors, talking about whether we are at the end of BPM or the end of transformation, and where we go next. The term “BPM” is starting to disappear from communications, and the market for platforms is growing slowly, with maintenance revenue dominating license revenue; but there are still plenty of inquiries about how to get started with BPM, including from non-traditional (read: not financial services) sectors. He sees this as an indication that we’re in the middle of mainstream adoption of BPM, with the conversation shifting from pure technology to domain-specific expertise, success stories, stakeholder education and how to develop cost-effective skills. A key challenge is that a BPMS isn’t like most other enterprise technologies, because it includes aspects of many different technologies and methodologies, and can be positioned both as the “one suite to rule them all” application development platform and as an enabler for significant organizational change. Since mainstream adoption means approaching the more conservative half of the market, this is a scary proposition.

He presented two organizations that both embarked on BPM projects: a retail group that successfully implemented a cloud-hosted case management system specifically to improve the delivery of in-home customer services; and a banking group whose expensive, IT-led technology transformation project failed after they built their COE before implementing anything, without focusing on a specific business problem to solve. For organizations that approach problems the way the bank did, enterprise-wide BPM looks too big and too disruptive; for more nimble organizations like the retailer, it’s a tool that can be used to solve a business problem while moving to low-code platforms, Agile development methodologies, cloud and mobile.

The lines are blurring between different product classes: BPMS, BPA, low-code, operational intelligence, task management, project management, enterprise social collaboration, and cloud orchestration. Customers are picking products from different categories to solve the same problems, and products are spanning multiple categories. It’s not so easy any more to put boundaries around what any particular product can do. The digital business era is also creating new threats and opportunities: new customer expectations, and new ways to gather information from devices, for example. This requires two capabilities working in concert: instrumentation of products, services and processes; and agility of services, processes and business models. This is a fundamentally different view of transformation, with continuous change and improvement based on instrumentation of a quickly-implemented solution rather than pre-planned to-be/as-is multi-year transformation projects.

His summary: enterprise-wide BPM initiatives are just not happening in the way that transformation efforts happened 10 years ago, but organizations are actively transforming business processes using more agile, iterative techniques, particularly in the area of work coordination. His advice: keep an eye on the non-traditional vendors, and start with simpler solutions while linking them to broader digital strategies.

Neil Ward-Dutton and Schrödinger's cat

bpmNEXT 2015 Day 1: The Business of BPM

I can’t believe it’s already the third year of bpmNEXT, my favorite BPM conference, organized by Nathaniel Palmer and Bruce Silver. It’s a place to meet up with other BPM industry experts and hear about some of the new things that are coming up in the industry: a meeting of peers, including CEOs and CTOs from smaller BPM companies, BPM architects and product management experts from larger vendors, industry analysts and more. The goal is a non-partisan friendly meeting of the minds rather than a competitive arena, and it’s great to see a lot of familiar faces here, plus some new faces of people who I only know online or through phone calls.

Hanging with Denis and Jakob

We’re at the lovely Canary Hotel in Santa Barbara, and will have the chance for a wine tasting with some of the local wineries tonight: Slone Vineyards, Happy Canyon, Grassini, Au Bon Climat, and Margerum. But first, we have some work to do.

This year, we started with an optional half day program on the business of BPM, including keynotes and a panel, before kicking off the usual DEMO-style presentations. Because of the large volume of great content, I’ll just publish summaries at the break points; all of the presentations will be available online after the conference (as they were in 2014 and 2013) if you want to learn more.

BPM 2020: Outlook for the Next Five Years

Bruce Silver opening remarks

Bruce Silver kicked off the conference and summarized the themes and presenters here at bpmNEXT:

  • Breaking old barriers: between BPM and (business and enterprise) architecture, which will be covered in presentations by Comindware and Trisotech; between process modeling and decision modeling, with Sapiens and Signavio presentations; and between BPM and case management, with Camunda, Safira, Cryo, Kofax and IBM presentations.
  • Expanding BPM horizons: the internet of things, with presentations from SAP and W4; cognitive computing and expert systems, with BP3, Fujitsu, IBM and Living Systems; and resourcing optimization with process mining, from Process Analytica.
  • Reaffirming core values: business empowerment, covered by Omny.link and Oracle; and embracing continual change, with Bonitasoft.

Hearing Bruce talk about the future of BPM in the context of the presentations to be given here over the next couple of days makes you realize just how much thought goes into the bpmNEXT program, and into selecting presenters that provide maximum value. If this fascinates you, you should consider being here next year, as an attendee or a presenter.

Nathaniel Palmer then gave us his view of what BPM will look like in five years: data-driven, goal-oriented, adaptive and with intelligent automation, so that processes understand, evolve and self-optimize to meet the work context and requirements. He sees the key challenges as the integration of rules, relationships and robots into processes and operations, including breaking down the artificial barrier that exists between the modeling and automation of rules and process. Today’s consumers — and business people — are expecting to interact with services through their mobile devices, and are starting to treat the quality of mobile services as a primary decision criterion. Although we primarily do that via our phones and tablets now, there are also devices such as Amazon Echo, a dedicated, voice-controlled gateway to Amazon that lowers the threshold to interaction (and therefore to purchasing); Jibo, a home-automation “robot” that aims to become a personal assistant for your home, interfacing with tasks rather than automating them; and wearables that can notify and accept instructions.

bpmNEXT attendees

Today, most BPM is deployed as a three-tier, MVC-type architecture that presents tasks via a worklist/inbox metaphor; Nathaniel thinks that we need to re-envision this as a four-tier architecture: a client tier native to each platform, a delivery tier that optimizes delivery for the platform, an aggregation tier that integrates services and data, and a services tier that provides the services (which is, arguably, the same as the bottom two tiers of a standard three-tier architecture). Tasks are machine-discoverable for automated integration and actions, and designed by context rather than procedure. Key enablers for this include standards such as BPAF, and techniques for automated analysis including process mining.

Reinventing BPM for the Age of the Customer

Clay Richardson of Forrester — marking what I think is the first participation by a large analyst firm at bpmNEXT — presented some of Forrester’s research on how organizations are retooling for the age of the customer. Although still critical for automation and information management, BPM has evolved to support customer engagement, especially via mobile applications and innovation. 42% of their customers surveyed consider it either critical or high priority to reengineer business processes for mobile, meaning that this is no longer just about putting a mobile interface on an existing product, but about reworking these processes to leverage things such as events generated by sensors and devices, providing a much richer informational context for processes. Digital transformation provides new opportunities for using BPM to drive rapid customer-centric innovation: digitizing the customer lifecycle and end-to-end experiences, as well as quickly integrating services behind the scenes. Many companies now are using customer journey maps to connect the dots between process changes and customer experience, using design thinking paradigms.

We saw Forrester’s BPM TechRadar — similar to Gartner’s Hype Cycle — showing the key technologies related to BPM, and where they are on their maturity curves: BPM suites, business rules, process modeling and document capture are all at or past their peak, whereas predictive analytics, social collaboration, low-code platforms and dynamic case management are still climbing. They see BPM platforms moving towards more customer-centricity, being used to create customer-facing applications in addition to automated integration and internal human-centric workflow. There’s also an interesting focus on the low-code application development platform market, as some BPM vendors reposition their products as process-centric application development platforms — targeting both traditional technical developers and less technical citizen developers — rather than as BPMS.

We’re off on a break now, but will be back to finish the Business of BPM program with a panel and a keynote before we start on the demo program this afternoon.

Going Beyond Process Modeling, Part 1

I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is known for their free downloadable process modeler, but also offers a full-featured BPMS for process execution.

My papers are not at all specific to Bizagi products; the first one, which you can find here (registration required), outlines the business benefits of automating and managing processes, and presents some use cases. In my experience, almost every organization models their processes in some way, but most never move beyond process analysis to process management. This paper provides some information that can help build a business case to do just that.

The second paper will be released in a few weeks, covering a more technical view of exactly how you go about starting on process automation projects, and moving from an initial project to a broader program or center of excellence.

We’re also scheduling a webinar to expand on the concepts in the paper; I’ll post the date when that’s available.

If you want to learn more about how Bizagi stacks up in the BPMS marketplace, check out the report on Bizagi from the Fraunhofer Institute for Experimental Software Engineering, available in both English and German. Spoiler alert: relative to the participating vendors, Bizagi scored above average in six of the nine categories, and around average in the remaining three. This is a more rigorous academic view than you might find in a typical analyst report on a vendor, including test scenarios and scripts for workshops where they created and ran sample process applications. Fraunhofer sells a book with the complete market analysis of all vendors studied, although I could only find a German edition on their site.

Effektif BPM Goes Open Source

On a call with Tom Baeyens last week, he told me about their decision to turn the engine and APIs of Effektif BPM into an open source project: not a huge surprise since he was a driver behind two major open source BPM projects prior to starting Effektif, but an interesting turn of events. When Tom launched Effektif two years ago, it was a bit of a departure from his previous open source BPM projects: subscription-based pricing, cloud platform, business-friendly tooling for creating executable task lists and workflows with little IT involvement, and an integrated development environment rather than an embeddable engine. In the past, his work has been focused on building clean and fast BPM engines, but building the Effektif user-facing tooling taught them a lot about how to make a better engine (a bit to his surprise, I think).

The newly-launched open source project includes the fully-functional BPM engine with Java and REST APIs; the REST APIs are a bit minimal at this point, but more will come from Effektif or from community contributions. It also includes a developer cloud account for creating and exporting workflows to an on-premise engine (although it sounds like you can create them in any standard BPMN editor), or process instances can be run in the cloud engine for a subscription fee (after a 30-day free trial). They will also offer developer support for a fee. Effektif will continue to offer the existing suite of cloud tools for building and running workflows at subscription pricing, allowing them to address both the simple, out-of-the-box development environment and the developer-friendly embeddable engine – the best of both worlds, although it’s unclear how easy it will be for both types of “developers” to share projects.

You can read more about the technical details on Tom’s blog or check out the wiki on the open source project.

This definitely puts Effektif back in direct competition with the other open source BPM projects that he has been involved with in the past – jBPM and Activiti (and, since it forked from Activiti, Camunda) – since they all use a similar commercial open source business model, although Tom considers the newer Effektif engine to have a more up-to-date architecture as well as simpler end-user tooling. How well Effektif can compete against these companies offering commercial open source BPM will depend on its ability to build a community as well as continue to offer easy and compelling citizen developer tools.

Kofax Transform 2015 In Pictures

As I prepared to depart Las Vegas, I flicked through some of my photos from the past couple of days and decided to share. First, the great work of the ImageThink team of graphic recorders:

There were more of these that I didn’t capture; great idea and nice execution. 

We had a fun evening event on Monday at Tao nightclub at the Venetian, with an impressive turnout considering that it wasn’t in the same hotel:

I also captured some Vegas day and night shots from my hotel room at the Aria:

Lastly, our Kofax-branded tiramisu dessert from the awards dinner last night:

A good mix of work and play!

Analytics For Kofax TotalAgility With @Altosoft

Last session here at Kofax Transform, and as much as I’d like to be sitting around the pool, I also like to squeeze every bit out of these events, and support the speakers who get this most unenviable timeslot. I’ve been in a couple of the analytics sessions over the past two days, which are based on the Kofax Altosoft Insight product. Married with TotalAgility (KTA) for process analytics, they offer a simple version with some pre-defined dashboards; a more complete version that is tied only to the KTA databases; and the full version that has the full Insight functionality with any data sources, including KTA. The focus seems to be only on document capture workflow analytics, with many of the default reports on things like productivity, extraction rates and field accuracy in the scan and extraction modules; although these are definitely important, and likely of primary importance to Kofax’s current customer base of capture clients, the use cases for their demos need to push further into the post-capture business processes if they expect to be taken seriously as a BPM vendor. I know that KTA is a “first mile” solution and the capture processes are essential, but there should be more applying analytics across the entire customer journey managed within a smart process application (SPA).

The visualization and dynamic filtering are pretty nice, as you would expect in the Altosoft environment, allowing you to drill into specific processes and tasks to find problem areas in process quality and operator performance. Traditional capture customers in the audience are going to like this, since it provides a lot of information on those front-end processes that can become an expensive bottleneck to downstream processing.

We had another look at the process intelligence that I saw in an earlier session, monitoring event logs from capture workflows plus downstream processing in KTA or another system such as a third-party BPM or ERP system. Although that’s all good stuff, it does highlight that the Kofax end-to-end solution is made up of a number of systems strung together, rather than an integrated platform with shared infrastructure. It’s also completely document-centric since it uses document ID as the instance ID: again, well-suited for their current capture customers, but not necessarily the mind-set required to approach a more general BPM/case management market that is more data-centric than document-centric.

This wraps up Kofax Transform 2015. There is a customer awards dinner tonight that I plan to attend, then head home tomorrow. Thanks to the entire Kofax team, especially the amazing analyst relations crew, for inviting me here and making sure my time was well-spent. As a matter of disclosure, Kofax paid my travel expenses to be here, but did not otherwise compensate me for my time or for anything that I wrote here on my blog. Kofax has been a customer of mine in the past for presentations at Transform as well as webinars and white papers.

My next event is bpmNEXT in Santa Barbara at the end of the month — if you’re interested in the next generation of BPM or just want to hang with a bunch of BPM geeks in a relatively non-partisan environment, I highly recommend that you check it out.

Smarter Processes With Kapow Integration

I’m in a Kofax Transform breakout session on Kapow integration with KTA; I missed documenting the first part of the session when my Bluetooth keyboard stopped talking to my Android tablet, until I figured out how to pair it with my iPhone (which is not supposed to be possible), so I’m blogging on my phone. I feel like MacGyver.

Kapow provides a way to create “robots” for a sophisticated sort of automated control and screen scraping of web pages: a robot interacts with a web page on behalf of other applications (such as those built on Kofax TotalAgility) instead of a user having to interact with the page directly. In the demonstration that we saw, a robot was created to enter data to generate pay stubs on a site, then scroll through the full set of stubs created to take a screen snapshot or PDF of each. This allows any application to use the robots to harvest information from a web site without user interaction, for example, going to a series of bank web sites and entering the provided credentials to gather bank statements as input to a mortgage process. The use case shown had a web application that was presented to the customer and gathered their credentials for a number of banking sites, then went to each of those sites behind the scenes to grab the bank statements using the robot’s knowledge of how to navigate each site. Although the web sites being remotely controlled are hidden from the user, the robot can show a clip of the underlying site to, for example, display an error message such as incorrect credentials.

The design is all pretty much drag-and-drop, meaning that a semi-technical data or business analyst could work through the creation of a robot: they just need to know how to navigate through the web site to be controlled, and be able to understand how to handle all of the possible error cases. There are more technical implementations for complex scenarios that would require developer skills, but a lot can be done without that.
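
For a sense of what a robot automates, here is the kind of navigate-and-harvest code you would otherwise write by hand, sketched with generic Python libraries against an imaginary bank site rather than Kapow’s actual tooling:

```python
import requests                 # assumes: pip install requests
from bs4 import BeautifulSoup   # assumes: pip install beautifulsoup4

def fetch_statements(base_url, username, password):
    """Log in with the customer-provided credentials and harvest statement links."""
    session = requests.Session()
    # Imaginary login form; a real robot would also handle error cases such as
    # bad credentials and surface them back to the calling application.
    session.post(f"{base_url}/login", data={"user": username, "pass": password})
    page = session.get(f"{base_url}/statements")
    soup = BeautifulSoup(page.text, "html.parser")
    return [a["href"] for a in soup.select("a.statement-download")]

# One robot per bank site, all feeding the same mortgage process:
for link in fetch_statements("https://bank.example.com", "jdoe", "secret"):
    print("would fetch:", link)
```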

In my past life as a systems integrator, we did a lot of screen scraping, mostly of green-screen systems that could not be easily integrated with; funny that we have exactly the same problem even though we have leapfrogged a few generations of technologies from terminal emulators to browsers. Plus ça change.

Process Intelligence at Kofax Transform

It’s after lunch on the second (and last) day of Kofax Transform, and the bar for keeping my attention in a session has gone up somewhat. To that end, I’m in a session with Scott Opitz and Rich Rabin from the Kofax Altosoft division, but I’m not sure that it’s going to meet that bar, since Opitz started out by stating that what the TotalAgility (KTA) sessions call process is much more complex than what they call process, and I’m a bit more on KTA’s side of this definition.

Altosoft process intelligence is really about the simple milestone-based process monitoring of operational intelligence, with the processes being executed on multiple systems, more like SAP Operational Process Intelligence based on HANA or IBM Business Monitor; you rarely have all of your process milestones in a single system, and even if you do, that system may not have adequate operational intelligence capabilities. Instead, operational intelligence systems pick up the breadcrumbs left by the processes — such as events, database records or log files — and provide an analytics layer, usually after importing that data into a dedicated analytics datamart.

There are really two main things to measure with process intelligence: performance and quality/compliance. To get there, however, you need to know what the process is supposed to look like in order to measure patterns of behavior. Altosoft’s process intelligence does what they call “swimlane analysis” — looking at which tasks are done in which order, a form of process mining discovery algorithm since there is no a priori process model — to identify operational patterns and derive a process model from runtime data, showing the most common/expected paths as well as the outliers. This is not just process mining as an analysis tool: it then shows the live process monitoring data points against those models, and provides some good interactive filtering capabilities, allowing you to find missing steps that may indicate that the task wasn’t performed or (more likely for steps with manual logging) that the task was not documented.
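
The core of that discovery step can be sketched as a minimal directly-follows analysis over an invented event log; Altosoft’s actual algorithm is certainly more sophisticated than this:

```python
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Invented event log: (case id, task), already ordered by timestamp per case.
log = [
    ("c1", "Scan"), ("c1", "Extract"), ("c1", "Validate"), ("c1", "Post"),
    ("c2", "Scan"), ("c2", "Extract"), ("c2", "Post"),          # skipped Validate
    ("c3", "Scan"), ("c3", "Extract"), ("c3", "Validate"), ("c3", "Post"),
]

# Group tasks by case, then count directly-follows pairs to derive the model.
cases = {}
for case, task in log:
    cases.setdefault(case, []).append(task)

follows = Counter(pair for tasks in cases.values() for pair in pairwise(tasks))
for (a, b), n in follows.most_common():
    print(f"{a} -> {b}: {n}")
# Common paths dominate; a rare pair like Extract -> Post exposes the outlier
# case where Validate was skipped (or simply not logged).
```
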

Since the Insight platform is a complete BI environment, this information can also be combined with more traditional BI analytics and dashboards, providing real-time alerts as well as historical analysis. They also have ways to use a predefined process model and measure against that; this then becomes more of a conformance analysis to see how closely the actual runtime data matches the a prioiri model.