bpmNEXT 2015 Day 3 Demos: IBM (again), Safira, Cryo

It’s the last (half) day of bpmNEXT 2015, and we have five presentations this morning followed by the Best in Show award. Unfortunately, I have to leave at lunchtime to catch a flight, so you will have to check the Twitter hashtag to see who won — or maybe I’ll do a wrapup post from the road.

IBM: BPM, say Hello to Watson. A New Era of Cognitive Work – Here Today

First up was Chris Vavra, discussing how Watson’s cognitive computing and natural language analysis capabilities can be used in the context of BPM, acting as an expert advisor to knowledge workers to enhance, scale and accelerate their work with its (or as Chris said, “his”) reasoning capabilities. There are a number of Watson services offered on IBM’s Bluemix cloud development platform; he demonstrated an example of an HR hiring process where the HR person uses Watson to analyze a candidate’s personality traits as part of the evaluation process. This is based on a written personal statement provided by the candidate; Watson analyzes that text (or could link through to a personal website or blog) to provide a personality analysis. From the Bluemix developer dashboard, you can create applications that include any of the services, including Watson Personality Insights, which provides rankings on several factors within the five basic personality traits of Openness, Conscientiousness, Extraversion, Agreeableness and Emotional Range, with a graphical representation to highlight values and needs that may be of concern in the hiring process. It’s unlikely that a hiring manager would use this information alone to make a decision, but it’s interesting for exploring a candidate’s personality characteristics as part of the process. There are a number of other Watson-based services available on Bluemix to bind into BPM (and other) applications; in the IBM cloud BPM designer, this just appears as a service connector that can be configured with the Watson authentication information, and invoked at a service step in a process flow. There are lots of other potential applications for bringing this level of expert recommendation into processes, such as healthcare condition diagnoses or drug interactions.
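
Out of curiosity, here is roughly what invoking such a service from application code could look like: a minimal Java client posting the candidate’s statement and getting JSON back. The endpoint shown and the response handling are illustrative assumptions based on the Bluemix service-binding model, not IBM’s documented API:

```java
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PersonalityInsightsClient {
    // Endpoint and credentials are illustrative; real values would come from
    // the Bluemix service binding for the Personality Insights service.
    private static final String ENDPOINT =
        "https://gateway.watsonplatform.net/personality-insights/api/v2/profile";

    public static String analyze(String candidateStatement, String user, String pass)
            throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/plain");
        conn.setRequestProperty("Accept", "application/json");
        String auth = Base64.getEncoder()
            .encodeToString((user + ":" + pass).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(candidateStatement.getBytes(StandardCharsets.UTF_8));
        }
        // The JSON response contains a tree of the Big Five traits
        // (openness, conscientiousness, etc.) with percentile scores.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder json = new StringBuilder();
            for (String line; (line = in.readLine()) != null; ) json.append(line);
            return json.toString();
        }
    }
}
```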

Safira: Managing Unstructured Processes with AdHoc BPM Framework

Filipe Pinho Pereira addressed the long tail of organizations’ processes: only the high-volume, high-value structured processes are implemented as full BPM projects by IT, while the long tail of less critical and ad hoc processes ends up being handled manually. Using IBM BPM, he demonstrated their Ad-Hoc BPM Framework add-on, which allows a business user to create a new ad hoc process based on a predefined request-intervention process pattern: an initial data capture/launch step, then a single “do it” human step with a loop that keeps returning to the same step until explicitly completed. The example was an expense report process, where a blank expense spreadsheet was attached, a form created to capture basic data, and SLAs specified. Routing is created by specifying the primary recipient, plus notifications to be issued on start, end and SLA violations. Users can then create an instance of that process (that is, submit their own expense report), which is then routed to the primary recipient; the only routing options at that point are Postpone, Forward and Complete, since it’s in the main human task loop part of the process pattern. This distills ad hoc processes to their simplest form, where the current recipient of the main task decides who the next recipient is or whether to complete the task; this is functionally equivalent to an email-based process, but with proper process monitoring and SLA analytics. By looking at the analytics for the process, we saw the number of interventions (the number of times that the human step loop was executed for an instance), and the full history log could be exported for mining to detect patterns for process improvement. Good example of very simple user-created ad hoc processes based on an industrial-strength infrastructure; you’re not going to buy IBM BPM just to run this, but if you’re already using IBM BPM for your high-volume processes, this add-on allows you to leverage that infrastructure for the long tail of your processes.
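
To make the pattern concrete, here is a minimal sketch of the request-intervention loop in plain Java; the class and method names are invented for illustration, not the framework’s actual API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// A minimal sketch of the request-intervention pattern: one launch step, then a
// single human task that loops until explicitly completed. Names are illustrative.
public class AdHocRequest {
    enum Action { POSTPONE, FORWARD, COMPLETE }

    private final List<String> history = new ArrayList<>(); // audit trail for later mining
    private int interventions = 0;                           // times the human step executed

    public void run(String initiator, String primaryRecipient,
                    Function<String, Action> humanStep) {
        history.add(initiator + " launched request");
        String assignee = primaryRecipient;
        Action action;
        do {
            interventions++;                                 // each pass is one "intervention"
            action = humanStep.apply(assignee);              // Postpone, Forward or Complete
            history.add(assignee + " chose " + action);
            if (action == Action.FORWARD) {
                assignee = chooseNextRecipient(assignee);    // current recipient picks the next
            }
        } while (action != Action.COMPLETE);                 // loop back to the same step
    }

    public int interventions() { return interventions; }
    public List<String> history() { return history; }

    private String chooseNextRecipient(String current) {
        return "colleague-of-" + current;                    // stand-in for a user picker UI
    }
}
```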

Cryo: Tools for Flexibility in Collaborative Processes

Rafael Fazzi Bortolini and Leonardo Luzzatto presented on processes that lie somewhere in the middle of the structured-unstructured spectrum, and how to provide flexibility and dynamic aspects within structured constraints through decision support, flexible operations, ad-hoc task execution and live changes to processes. Demonstrating with their Orquestra BPMS, they showed a standard process task interface with the addition of localized analytics based on the history of that task in order to help the user decide on their actions at that point. Flexible routing options allow the user to return the process to an earlier step, or forward the current task to a colleague for consultation before returning it to the original user at the same step; this does not change the underlying process model, but may move the instance between activities in a non-standard fashion or reassign it to users who were not included in the original process definition. They also have an ad-hoc process pattern, but unlike Safira, they are using actual ad-hoc activities in BPMN, that is, tasks that are not connected by flow lines. Users are presented with the available ad hoc tasks in the process model, allowing them to “jump” between the activities in any order. They also demonstrated live changes to production processes; the examples were adding a field to a form and changing the name of a task in the process, both of which are presumably loaded at runtime rather than embedded within the instantiated process to allow these types of changes.
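
Their ad-hoc pattern boils down to a set of enabled tasks with no enforced ordering; a toy sketch of that idea (illustrative names, not Orquestra’s API):

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Sketch of BPMN ad-hoc activities: tasks with no sequence flows between them,
// which the user may execute ("jump" between) in any order until the case closes.
public class AdHocCase {
    private final Set<String> availableTasks = new LinkedHashSet<>();
    private boolean closed = false;

    public AdHocCase(Set<String> tasks) { availableTasks.addAll(tasks); }

    /** Tasks the user may currently jump to; no ordering is enforced. */
    public Set<String> availableTasks() { return Set.copyOf(availableTasks); }

    /** Execute any available task, in any order. */
    public void execute(String task) {
        if (closed || !availableTasks.remove(task))
            throw new IllegalStateException("task not available: " + task);
    }

    /** The user, not the model, decides when the case is done. */
    public void close() { closed = true; }
}
```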

bpmNEXT 2015 Day 2 Demos: Omny.link, BP-3, Oracle

We’re finishing up this full day of demos with a mixed bag of BPM application development topics, from integration and customization that aims to have no code, to embracing and measuring code complexity, to cloud BPM services.

Omny.link: Towards Zero Coding

Tim Stephenson discussed how extremely low-code solutions could be used to automate marketing processes, in place of using more costly marketing automation solutions. Their Omny.link solution integrates workflow and decisioning with WordPress using JavaScript libraries, with detailed tracking and attribution, by providing forms, tasks, decision tables, business processes and customer management. He demonstrated an actual client solution, with custom forms created in WordPress, then referenced in a WordPress page (or post) that is used as the launch page for an email campaign. Customer information can be captured directly in their solution, or interfaced to another CRM such as Sugar or Salesforce. Marketers interact with a custom dashboard that allows them to define tasks, workflows, decisions and customer information that drive the campaigns; Tim sees the decision tables as a key interface for marketers to create the decision points in a campaign based on business terms, using a format that is similar to an Excel spreadsheet that they might now be using to track campaign rules.
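
Decision tables of the kind Tim described are easy to picture as spreadsheet rows evaluated top to bottom until one matches; here is a small Java sketch of that first-match semantics, with invented field and action names:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;

// Sketch of a marketer-maintained decision table: ordered rows of
// condition -> action, evaluated first-match, like rows in a spreadsheet.
public class CampaignDecisionTable {
    private final Map<Predicate<Map<String, Object>>, String> rows = new LinkedHashMap<>();

    public CampaignDecisionTable rule(Predicate<Map<String, Object>> condition, String action) {
        rows.put(condition, action);
        return this;
    }

    public String decide(Map<String, Object> contact) {
        return rows.entrySet().stream()
            .filter(r -> r.getKey().test(contact))
            .map(Map.Entry::getValue)
            .findFirst()
            .orElse("no-action");
    }

    public static void main(String[] args) {
        CampaignDecisionTable table = new CampaignDecisionTable()
            .rule(c -> "gold".equals(c.get("segment")), "send-premium-offer")
            .rule(c -> ((int) c.get("daysSinceLastVisit")) > 30, "send-reminder-email");
        System.out.println(table.decide(Map.of("segment", "silver", "daysSinceLastVisit", 45)));
        // prints: send-reminder-email
    }
}
```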

BP-3: Sleep at Night Again: Automated Code Analysis

Scott Francis and Ivan Kornienko presented their new code analysis tool, Neches, which applies a set of rules, based on best practices and anti-patterns drawn from their years of development experience, to identify code and configuration issues in IBM BPM implementations that could adversely impact performance and maintainability. They propose that proper code reviews — including Neches reviews — at the end of each iteration of development can find design flaws as well as implementation flaws. Neches is a SaaS cloud tool that analyzes uploads of snapshots exported from the IBM BPM design environment; it scores each application based on complexity, which is compared to the aggregate of other applications analyzed, and can visualize the complexity score over time compared to found, resolved and fixed issues. The findings are organized by category, and you can drill into the categories to see the specific rules that have been triggered, such as UI page complexity or JavaScript block length, which can indicate potential problems with the code. The specific rules are categorized by severity, so that the most critical violations can be addressed immediately, while less critical ones are considered for future refactoring. Specific unused services, such as test harnesses, can be excluded from the complexity score calculation. Interesting tool for training new IBM BPM developers as well as reviewing the code quality and maintainability of existing projects, leveraging the experience of BP-3 and Lombardi/IBM developers as well as general best coding practices.
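
The shape of such a rule engine is straightforward to sketch; here is a toy example of a single severity-graded check. The rule name, threshold and severity cutoffs are invented, not Neches’ actual rules:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the kind of static check such a tool might run; the rule and
// thresholds here are invented for illustration.
public class CodeAnalysis {
    enum Severity { CRITICAL, WARNING, INFO }

    record Finding(String rule, Severity severity, String location) {}

    static final int MAX_JS_BLOCK_LINES = 50;  // assumed threshold

    /** Flag JavaScript blocks that exceed the length guideline, graded by how far over they are. */
    static List<Finding> checkJavaScriptBlocks(List<String> blockNames, List<Integer> blockLines) {
        List<Finding> findings = new ArrayList<>();
        for (int i = 0; i < blockNames.size(); i++) {
            if (blockLines.get(i) > MAX_JS_BLOCK_LINES) {
                findings.add(new Finding("js-block-length",
                    blockLines.get(i) > 2 * MAX_JS_BLOCK_LINES ? Severity.CRITICAL : Severity.WARNING,
                    blockNames.get(i)));
            }
        }
        return findings;
    }
}
```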

Oracle: Rapid Process Excellence with BPM in the Public Cloud

Linus Chow presented Oracle’s public cloud BPM service for developing both processes and rules, deployable in a web workspace or via mobile apps. He demonstrated an approval workflow, showing the portal interface, a monitoring view overlaid on the process model, and a mobile view that can include offline mode. The process designer is fully web-based, including forms and rules design; there are also web-based administration and deployment capabilities. This is Oracle’s first cloud BPM release and looks pretty full-featured in terms of human workflow; it’s a lightweight, public cloud refactoring of their existing Oracle BPM on-premise solution, but doesn’t include the business architecture or SOA functionality at this time.

Great day of demos, and lots of amazing conversations at the breaks. We’re all off to enjoy a free night in Santa Barbara before returning for a final morning of five more demos tomorrow.

bpmNEXT 2015 Day 2 Demos: Kofax, IBM, Process Analytica

Our first afternoon demo session included two mobile presentations and one on analytics, hitting a couple of the hot buttons of today’s BPM.

Kofax: Integrating Mobile Capture and Mobile Signature for Better Multichannel Customer Engagement Processes

John Reynolds highlighted the difficulty in automating processes that involve customers if you can’t link the real world — in the form of paper documents and signatures — with your digital processes. Kofax started in document scanning, and they’ve expanded their repertoire to include all manner of capture that can make processes more automated and faster to complete. Smartphones become intelligent scanners and signature capture devices, reducing latency in capturing information from customers. John demonstrated the Kofax Mobile Capture app, both natively and embedded within a custom application, using physical documents and his iPhone; it captures images of a financial statement, a utility bill and a driver’s license, then pre-processes them on the device to remove irregularities that might impact automated character recognition, and thresholds them to binary images to reduce the data transmission size. These can then be directly injected into a customer onboarding process, with both the scanned image and the extracted data included, for automated or manual validation of the documents to continue the process. He showed the back-end tool used to train the recognition engine by manually identifying the data fields on sample images, which can accept a variety of formats for the same type of document, e.g., driver’s licenses from different states. This is done by a business person who understands the documents, not developers. Similarly, you can also use their Kapow Design Studio to train their system on how to extract information from a website (John was having the demo from hell, and his Kapow license had expired) by marking the information on the screen and walking through the required steps to extract the required data fields. They take on a small part of the process automation, mostly around the capture of information for front-end processes such as customer onboarding, but are seeing many implementations moving toward an “app” model of several smaller applications and processes being used for an end-to-end process, rather than a single monolithic process application.
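
The on-device thresholding step is the easiest piece to picture in code: reduce each pixel to black or white around a global luminance cutoff, so a multi-megabyte photo becomes a compact binary image. A bare-bones Java sketch, with the caveat that real capture SDKs add deskew, perspective correction and adaptive thresholding:

```java
import java.awt.image.BufferedImage;

// Sketch of on-device pre-processing: binarize a captured photo using a
// global luminance threshold, shrinking it before transmission.
public class Binarize {
    public static BufferedImage threshold(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        int[] lum = new int[w * h];
        long sum = 0;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = src.getRGB(x, y);
                // standard luma weights for red, green, blue
                int l = (((rgb >> 16) & 0xFF) * 299 + ((rgb >> 8) & 0xFF) * 587
                        + (rgb & 0xFF) * 114) / 1000;
                lum[y * w + x] = l;
                sum += l;
            }
        }
        int cutoff = (int) (sum / ((long) w * h));   // global mean as the threshold
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_BYTE_BINARY);
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                out.setRGB(x, y, lum[y * w + x] > cutoff ? 0xFFFFFF : 0x000000);
        return out;
    }
}
```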

IBM: Mobile Case Management and Capture in Insurance

Mike Marin and Jonathan Lee continued on the mobile theme, stressing that mobile is no longer an option for customer-facing and remote worker functionality. They demonstrated IBM Case Manager for an insurance example, showing how mobile functionality could be used to enhance the claims process through mobile capture, content management and case handling. Unlike the Kofax scenario where the customer uses the mobile app, this is a mobile app for a knowledge worker, the claims adjuster, who may need a richer informational context and more functionality, such as document type classification, than a customer would use. They captured the (printed and filled) claims form and a photo of the vehicle involved in the claim using a smartphone, then showed the more complete case view on a tablet, with more case data and related tasks. The supervisor view shows related cases plus a case visualizer that presents a timeline view of the case. They finished with a look at the new IBM mobile UI design concepts, which presented a more modern mobile interface style, including a high-level card view and a smoother transition between information and functions.

Process Analytica: Process Discovery and Analytics in Healthcare Systems

Robert Shapiro shifted the topic to process mining/discovery and analytics, specifically in healthcare applications. He started with a view of process mining, simulation and other analytical techniques, and how to integrate with different types of healthcare systems via their history logs. By examining existing processes through that history data, missed KPIs and their root causes can be identified, and potential solutions derived and compared in a systematic and analytic manner. Using their Optima process analytics workbench, he demonstrated importing and analyzing an event log to create a BPMN model based on the history of events: this is a complete model that includes interrupting and non-interrupting boundary events, and split and merge gateways based on the patterns of events, with probabilistic weights and/or decision logic calculated for the splitting gateways. Keeping in mind that the log events come from systems that have no explicit process model, the automatic derivation of the boundary events and gateways and their characteristics provides a significant step in process improvement efforts, and can be further analyzed using their simulation capabilities. Most of the advanced analysis and model derivation (e.g., for gateway and boundary conditions) is dependent on capturing data value changes in the event logs, not just activity transitions; this is an important distinction, since many event logs don’t capture that information.
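
One small piece of this is easy to show in code: deriving the probabilistic weights on a splitting gateway from transition frequencies in the event log. A minimal Java sketch follows; real discovery algorithms like those in Optima do far more than this:

```java
import java.util.*;

// Sketch: compute activity-transition probabilities from an event log. An
// activity with more than one nonzero-probability successor implies a split
// gateway, and these ratios become its probabilistic weights.
public class GatewayWeights {
    /** log: per case, the ordered list of activity names executed. */
    public static Map<String, Map<String, Double>> transitionProbabilities(
            Collection<List<String>> log) {
        Map<String, Map<String, Integer>> counts = new HashMap<>();
        for (List<String> trace : log) {
            for (int i = 0; i + 1 < trace.size(); i++) {
                counts.computeIfAbsent(trace.get(i), k -> new HashMap<>())
                      .merge(trace.get(i + 1), 1, Integer::sum);
            }
        }
        Map<String, Map<String, Double>> probs = new HashMap<>();
        counts.forEach((from, targets) -> {
            int total = targets.values().stream().mapToInt(Integer::intValue).sum();
            Map<String, Double> p = new HashMap<>();
            targets.forEach((to, n) -> p.put(to, (double) n / total));
            probs.put(from, p);
        });
        return probs;
    }
}
```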

bpmNEXT 2015 Day 2 Demos: Sapiens Decision, Signavio

We finished the morning demo sessions with two on the theme of decision modeling and management.

Sapiens: How to Manage Business Logic

Michael Grohs highlighted the OMG release of the Decision Model and Notation (DMN) standard, and how the decision model is really a business logic model. However, business rule management systems are typically technical solutions, and don’t do much for business users and analysts trying to model their decision logic and rules based on their policies and procedures. Decision-aware processes extract declarative knowledge from process models, greatly simplifying the process models and moving the declarative information to a model format more suitable to business logic, such as a decision table. BPMS and DMS are complementary, and can be combined to create a complete model of the business process. He provided a demo of their decision modeling and repository tooling, which starts with the definition of a community space that shares a glossary, attributes and models, and has governance workflows for decision model approval and deployment. The glossary allows for the definition of fact types, including multiple synonyms to allow different stakeholders to use their own terminology. The decision models are made up of rule families that capture the business logic, with a visual syntax that indicates the rules and conditions that make up a particular decision. This can be expanded into a full decision table style that shows the if-then-else logic using the business terms. Different instances of decisions and rule sets can be created — in his demo example, the insurance policy renewals base logic versus that for a hurricane-prone state such as Florida — and visually compared in the graphical or tabular view, with changes highlighted or listed in detail in a report. Rule sets can be validated to highlight conflicts and missing information, then exported in a variety of formats for importing into a DMS for execution.
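
The instance-comparison step (base renewal logic versus the Florida variant) amounts to diffing condition/conclusion rows between two rule sets; here is a toy Java sketch with invented rule content:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of comparing two instances of a rule set by diffing their
// condition -> conclusion rows. Rule content is invented for illustration.
public class RuleSetDiff {
    /** Returns condition -> {before, after} for every row that changed or was added. */
    public static Map<String, String[]> compare(Map<String, String> base,
                                                Map<String, String> variant) {
        Map<String, String[]> changes = new LinkedHashMap<>();
        for (String condition : variant.keySet()) {
            String before = base.get(condition);          // null means the row was added
            String after = variant.get(condition);
            if (!after.equals(before)) {
                changes.put(condition, new String[] { before, after });
            }
        }
        return changes;
    }

    public static void main(String[] args) {
        Map<String, String> base = Map.of(
            "claims in last 3 years > 2", "refer to underwriter",
            "otherwise", "auto-renew");
        Map<String, String> florida = Map.of(
            "claims in last 3 years > 2", "refer to underwriter",
            "property in coastal zone", "require wind mitigation inspection",
            "otherwise", "auto-renew");
        compare(base, florida).forEach((cond, ba) ->
            System.out.println(cond + ": " + ba[0] + " -> " + ba[1]));
    }
}
```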

Signavio: Business Decision Management

Gero Decker talked about their collaborative process design and SAP upgrade tools as an introduction, but mainly addressed decision modeling and how they are embracing the DMN standard: modeling decisions, inputs and knowledge sources, then linking that to a decision activity in a BPMN model. DMN provides a graphical model form, and also allows for decision tables for detailed steps. Like Sapiens, Signavio does only decision modeling, not execution, and exports in standard formats for importing into a decision management system such as Drools for execution. They are releasing Signavio Decision Manager in a few weeks, and he gave us a preview demo of modeling and testing rules integrated with their process modeling environment. Similar to the modeling that we saw from Comindware earlier this morning, Signavio can be used to model higher-level enterprise architecture constructs such as value chains, plus full BPMN models for specific capabilities within those models; he used a BPMN model as a jumping-off point for demonstrating decision modeling by creating a business rule task. From that point, you can specify a decision table directly in situ, or choose to create a DMN model, which launches the DMN modeler with the top-level question/answer in the DMN model linked to the business rule activity from the BPMN model. The DMN model can be built out graphically, data objects defined and rules added with decision tables, and sub-decisions added as required. The DMN modeler can make use of the existing glossary in the Signavio environment for data objects and attributes. The decision tables can be validated to detect conflicts, and can export test cases in a spreadsheet format to drive manual or automated testing. They are also doing some work on detecting complex decision logic within BPMN models, with the goal of refactoring the models to externalize the decision logic into DMN models where it makes the BPMN model unnecessarily complex.
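
Decision table validation of the sort both vendors mentioned boils down to checking that rules neither overlap nor leave gaps over the input domain; here is a toy sketch for rules over a single numeric input, with invented ranges:

```java
// Sketch of decision table validation: for rules that partition a numeric
// input (say, credit score), detect overlapping rules and gaps where no rule
// fires. Ranges are [low, high) and purely illustrative.
public class TableValidator {
    record Range(int low, int high) {}   // [low, high)

    public static void validate(Range[] rules, int domainLow, int domainHigh) {
        java.util.Arrays.sort(rules, java.util.Comparator.comparingInt(Range::low));
        int covered = domainLow;
        for (Range r : rules) {
            if (r.low() > covered)
                System.out.println("gap: no rule covers [" + covered + ", " + r.low() + ")");
            else if (r.low() < covered)
                System.out.println("overlap: rules conflict on [" + r.low() + ", "
                    + Math.min(covered, r.high()) + ")");
            covered = Math.max(covered, r.high());
        }
        if (covered < domainHigh)
            System.out.println("gap: no rule covers [" + covered + ", " + domainHigh + ")");
    }

    public static void main(String[] args) {
        validate(new Range[] { new Range(300, 580), new Range(550, 700), new Range(720, 850) },
                 300, 850);
        // reports an overlap on [550, 580) and a gap on [700, 720)
    }
}
```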

bpmNEXT 2015 Day 2 Demos: Trisotech, Comindware, Bonitasoft

The first group of demos on bpmNEXT day 2 had a focus on the links between architecture and process: from architectural modeling, to executable architecture, to loosely-coupled development architecture for process applications.

Trisotech: Digital Enterprise Graph Semantic Layer for Business/IT divide

Denis Gagné kicked off talking about Trisotech’s Digital Enterprise Graph, which is a semantic layer for transforming and combining information and models, allowing information to be shared and enriched for use by both business and IT stakeholders. The issue with current standards is that they only allow for structured exchange of information between different parts of the business, but a graph structure allows for information in widely varying formats to be distilled down to the who, what, when, where and why of the organization, allowing new relationships and interactions to be discovered and explored. Trisotech’s current modeling tools — Discovery Accelerator, BPMN Modeler and CMMN Modeler — can all contribute models to the Digital Enterprise Graph, but it can also accept models from a variety of other enterprise architecture and modeling tools. This brings together business architecture, enterprise architecture and case/process modeling outputs into a consolidated semantic graph, allowing each group to use their own models and terminology. Denis gave a demo of the Discovery Accelerator for capturing/discovering business information, where a text description can be highlighted with the actors, activities and artifacts to iteratively build a conceptual model; a balanced scorecard, W5 or SIPOC board can be used as a starting template; or an accelerator to reference models from Casewise, APQC and others to provide a framework and ontology to begin discovery and modeling. RACI charts can be created from the actors, activities and goals. The resulting information can be exported into BPMN, CMMN, UML, XPDL or GO-BPMN for more detailed modeling in another tool. If an EA reference framework (such as Casewise or Sparx) was used in the Discovery Accelerator, semantic links are maintained from activities to the original framework, even if the activities have been renamed and reorganized. He finished up with a demo of their new Insight Analyzer tool, which is used to explore information in the Digital Enterprise Graph; a node in the graph can be selected to see its origin as well as interrelationships with other nodes that may have come from different modeling tools. New relationships can be inferred from the graph as more information is added, without having to make explicit links, for example, identifying risk points based on their level of interconnectivity with other activities.
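
The graph idea itself is simple to sketch: nodes keep their origin tool, edges capture relationships across models, and some insights, like the risk-by-interconnectivity example, fall out of graph queries rather than explicit modeling. A toy Java version, with invented structure and threshold:

```java
import java.util.*;

// Sketch of a semantic graph spanning models from different tools: nodes keep
// their origin, edges link elements across models, and insights are inferred
// rather than explicitly modeled. Structure and threshold are illustrative.
public class EnterpriseGraph {
    record Node(String id, String kind, String originTool) {} // e.g., ("approve", "activity", "BPMN Modeler")

    private final Map<String, Node> nodes = new HashMap<>();
    private final Map<String, Set<String>> edges = new HashMap<>();

    public void add(Node n) { nodes.put(n.id(), n); }

    public void relate(String from, String to) {
        edges.computeIfAbsent(from, k -> new HashSet<>()).add(to);
        edges.computeIfAbsent(to, k -> new HashSet<>()).add(from);
    }

    /** Infer risk points: activities whose interconnectivity exceeds a threshold. */
    public List<Node> riskPoints(int minDegree) {
        List<Node> risks = new ArrayList<>();
        for (Node n : nodes.values())
            if ("activity".equals(n.kind())
                    && edges.getOrDefault(n.id(), Set.of()).size() >= minDegree)
                risks.add(n);
        return risks;
    }
}
```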

[Update to change Trisotech “BPM Graph” to “Digital Enterprise Graph” to match Denis’ presentation materials and current product naming.]

Comindware: Between Architecture and Execution: Tale of 3 Gaps

Anatoly Belaychuk and Konstantin Bredyuk discussed three gaps: between architecture and execution in terms of process models — the process model round-trip problem; between process, project and case models; and between process-based and object-based work. They see architecture and architectural maturity as important to an organization’s ability to model and execute processes. In their demo, they showed a different representation of processes by modeling capabilities, resources and inputs/outputs; this is not an execution sequence to replace BPMN, but rather an architectural view of how organizational capabilities link together, more like a value chain diagram with major milestones identified. Drilling down into a capability, we may see a submodel using the same model syntax, or it may link to a BPMN process. This is like a slice through enterprise architecture, with a variety of process-related model types linked into a business architecture capability model, but it also creates executable processes and cases, not just models. This “executable architecture” can be used by both architects and process modelers; it also includes data modeling to define record objects and attributes, and a forms modeler, providing a complete application development environment. This provides a link between architects — who are unlikely to learn or even care about BPMN — and executable process models, although there is no direct link to existing enterprise architecture products or models to maintain the sort of semantic links that we saw in the Trisotech demo earlier.

Bonitasoft: Building Sustainable Process-Based Apps

Miguel Valdés Faura finished this block of demos discussing process-based applications: how it’s still hard to create engaging user interfaces and easily-updated applications in spite of the low-code/no-code promises. He demoed some of their capabilities still in their labs, which allow for more agile applications by separating data, business logic and user interfaces. He started with a procurement application: BPMN process models for the business logic, data object models, and user interfaces defined separately, interacting via JSON contracts and REST APIs. The contract between an activity in the process model and the user interface is defined as inputs and constraints; as long as the contract does not change, the UI can be changed with no impact on the process model. Mobile interfaces can be built independently of desktop interfaces, using the same contracts to interface with the business logic, and REST APIs for access to the data objects. Their page builder provides environments for different form factors, providing standard UI widgets plus allowing for custom widgets; the page can either be deployed directly in their environment, or the page definition can be exported for further hand-coding outside their environment. Page fragments can be created for reuse across pages. Custom pages built outside their environment, such as with AngularJS, can be imported by an administrator into the runtime environment and immediately deployed. Although a full process application can be built purely in their environment, by loosely coupling the logic, data and UI, they are able to make changes to any of those layers, including adding custom components and UIs, without impacting the others, as long as they respect the existing contract and APIs. Good example of why we use multi-tier architectures rather than tightly-coupled layers for greater flexibility and agility.
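
The contract idea is worth a sketch: the activity declares required inputs and constraints, and any UI that produces a payload satisfying them can drive the task. A minimal Java illustration; the names are invented, not Bonita’s API:

```java
import java.util.*;
import java.util.function.Predicate;

// Sketch of the contract idea: the process activity declares required inputs
// and constraints; any UI (web, mobile, custom AngularJS page) that satisfies
// the contract can drive the task, and can change freely underneath it.
public class TaskContract {
    private final Map<String, Predicate<Object>> constraints = new LinkedHashMap<>();

    public TaskContract input(String name, Predicate<Object> constraint) {
        constraints.put(name, constraint);
        return this;
    }

    /** Validate a JSON-like payload submitted by the UI before the task completes. */
    public List<String> violations(Map<String, Object> payload) {
        List<String> errors = new ArrayList<>();
        constraints.forEach((name, check) -> {
            Object value = payload.get(name);
            if (value == null || !check.test(value))
                errors.add("constraint failed on input: " + name);
        });
        return errors;
    }

    public static void main(String[] args) {
        TaskContract approvePurchase = new TaskContract()
            .input("amount", v -> v instanceof Number n && n.doubleValue() > 0)
            .input("supplierId", v -> v instanceof String s && !s.isBlank());
        System.out.println(approvePurchase.violations(Map.of("amount", -5, "supplierId", "ACME")));
        // prints: [constraint failed on input: amount]
    }
}
```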

bpmNEXT 2015 Day 1 Demos: SAP, W4 and Whitestein

The demo program kicked off in the afternoon, with time for three of them sandwiched between two afternoon keynotes. Demos are strictly limited to 30 minutes, with a 5-minute, 20-slide, auto-advancing Ignite-style presentation (which I am credited with suggesting after some of last year’s slideware dragged on), followed by a 15-minute demo and 10 minutes for Q&A and changeover to the next speaker.

SAP: BPM and the Internet of Everything

Harsh Jegadeesan and Benjamin Notheis were in the unenviable first position, given the new presentation format; they gave an introduction to the internet of everything, referring to things, people, places and content. Events are at the core of many BPM systems that sense and respond to events; patterns of events are detected, and managed with rules and workflow. They introduced Smart Process Services on HANA Cloud Platform, including an app marketplace, and looked at a case study of pipeline incident management, where equipment sensor events trigger maintenance processes: a machine-to-process scenario. The demo showed a dashboard for pipeline management, with a geographic view of a pipeline overlaid with pump locations and details, highlighting abnormal readings and predicted failures. This is combined with cost data, including the cost of various risk scenarios such as a pipeline break or pump failure. The operator can drill down into abnormal readings for a pump, see predicted failure and maintenance records, then trigger an equipment repair or replacement. The incident case can be tracked, and tasks assigned and escalated. Aggregates for incident cases show the number of critical cases or those approaching deadlines, and can be used to cluster the incidents to detect contributing factors. Nice demo; an expansion of the operational intelligence dashboards that I’ve seen from SAP previously, with good integration of predictions. Definitely a two-person demo, with the inclusion of a tablet, a laptop and a wearable device. They finished with a developer view of the process-related services available on the HANA cloud portal, plus the standard Eclipse environment for assembling services using BPMN. This does not have their BPM engine (the former NetWeaver engine) behind it: the workflow microservices compile to JavaScript and run in an in-memory cloud workflow engine. However, they see that some of the concepts from the more agile development that they are doing on the cloud platform could make their way back to the enterprise BPM product.
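
The machine-to-process pattern at the heart of the demo (screen sensor readings, start an incident case when something looks wrong) fits in a few lines; the threshold and trend heuristic below are invented for illustration:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of the machine-to-process pattern: sensor readings are screened for
// abnormal values, and an abnormal reading triggers a maintenance incident case.
public class PumpMonitor {
    private static final double MAX_VIBRATION = 7.5;   // assumed alert threshold
    private final Deque<Double> recent = new ArrayDeque<>();

    public void onReading(String pumpId, double vibration) {
        if (recent.size() == 10) recent.removeFirst(); // keep a sliding window of 10 readings
        recent.addLast(vibration);
        if (vibration > MAX_VIBRATION || trendingUp()) {
            startIncidentProcess(pumpId, vibration);   // machine event -> process instance
        }
    }

    /** Crude stand-in for predictive failure detection: a sustained upward trend. */
    private boolean trendingUp() {
        if (recent.size() < 10) return false;
        return recent.peekLast() - recent.peekFirst() > 2.0;
    }

    private void startIncidentProcess(String pumpId, double reading) {
        System.out.printf("starting incident case for %s (vibration %.1f)%n", pumpId, reading);
    }
}
```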

W4: Events, IOT, and Intelligent Business Operations

Continuing on the IoT theme, Francois Bonnet talked about making business operations more intelligent by binding physical device events together with people and business events in a BPMS. His example was fall management — usually for the elderly — where a device event triggers a business process in a call center; the device events can be integrated into BPMN models using standard event constructs. He demonstrated with a sensor made from a Raspberry Pi tied to positional sensors that detect orientation; by tipping over the sensor, a process instance was created that triggered a call to the subscriber, using GPS data to indicate the location on a map. If the call operator indicated that the subscriber did not answer, they would be prompted to call a neighbour, and then emergency services. KPIs such as falls within a specified period are tracked, along with a history of the events for the subscriber’s device. The sensor being out of range or having no movement over a period of time can also trigger a new task instance, while reorienting the sensor to the upright position within a few seconds after a fall is detected can cancel the process. Looking at the BPMN for managing events from the sensor, they are using the event objects in standard BPMN to their fullest extent, including both in-line and boundary events, with the device events translating to BPMN signal events. Great example of responsive event handling using BPMN.
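
The start-then-cancel behavior maps neatly onto a signal start event plus a cancelling boundary event; here is a toy sketch of that event logic, with an assumed grace period and invented names:

```java
import java.time.Duration;
import java.time.Instant;

// Sketch of the event logic in the demo: a tip-over event starts a call-center
// process, and a return to upright within a grace period cancels it — the
// analogue of a BPMN signal start event plus a cancelling boundary event.
public class FallMonitor {
    private static final Duration GRACE = Duration.ofSeconds(10);  // assumed cancel window
    private Instant fallDetectedAt;
    private String openCaseId;

    public void onTipOver(String deviceId, Instant now) {
        fallDetectedAt = now;
        openCaseId = startCallProcess(deviceId);        // device event -> process instance
    }

    public void onUpright(Instant now) {
        if (openCaseId != null && Duration.between(fallDetectedAt, now).compareTo(GRACE) <= 0) {
            cancelProcess(openCaseId);                  // boundary event cancels the instance
            openCaseId = null;
        }
    }

    private String startCallProcess(String deviceId) { return "case-" + deviceId; }
    private void cancelProcess(String caseId) { System.out.println("cancelled " + caseId); }
}
```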

Whitestein: Demonstrating Measurable Intelligence in an Enterprise Process Platform

The last demo of the day, from Dan Neason of Whitestein, was also on the theme of events, but more focused on intelligent agents and measurable intelligence in processes. Their LSPS solution models and executes goal-driven processes, where the system uses previous events to evolve its methods for reaching the goals, predicting outcomes and recommending alternatives. The scenario used was a mortgage application campaign, where information about applicants is gathered and the success of the campaign is determined by the number of completed mortgages; potential fraud cases are detected and recommended actions presented to a user to handle the case. Feedback from the user, in the form of accepting or rejecting recommendations, is used to tune the predictions. In addition to showing standard dashboards of events that have occurred, it can also give a dashboard view of predictions, such as how many mortgage applications are expected to fail, including those that may be resolved favorably through some recommended actions. The system is self-learning based on statistical models and domain knowledge, so it can detect predefined patterns or completely emergent patterns; it can be applied to provide predictive analytics and goal-seeking behavior across multiple systems, including other BPMSs.
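
A deliberately simple stand-in for that feedback loop: track accept/reject outcomes per recommended action, and rank future recommendations by smoothed acceptance rate. The real product uses statistical models and domain knowledge; this sketch just shows the shape of the loop:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the feedback loop described: each accept/reject of a recommended
// action updates that action's score, so the system ranks recommendations by
// how often users have accepted them. Purely illustrative.
public class RecommendationTuner {
    private final Map<String, int[]> stats = new HashMap<>(); // action -> {accepted, shown}

    public void feedback(String action, boolean accepted) {
        int[] s = stats.computeIfAbsent(action, k -> new int[2]);
        if (accepted) s[0]++;
        s[1]++;
    }

    /** Laplace-smoothed acceptance rate, used to rank future recommendations. */
    public double score(String action) {
        int[] s = stats.getOrDefault(action, new int[2]);
        return (s[0] + 1.0) / (s[1] + 2.0);
    }
}
```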

Wrapping up this set of demos on intelligent, event-driven processes, we had a keynote from Jim Sinur (formerly of Gartner, now an independent consultant) on goal-directed processes. He covered concepts of hybrid processes, made up of multiple heterogeneous systems and processes that may exhibit both orchestration and collaboration to solve business problems.

Great first set of demos, definitely setting the bar high for tomorrow’s full day of 11 demos, and a good first day. We’re all off to the roof deck for a reception, wine tasting and dinner, so that’s it for blogging for today.

Canary roof deck

By the way, I realize that we completely forgot to create bpmNEXT bingo cards, although it did take until after 4pm for “ontology” to come up.

bpmNEXT 2015 Day 1: More Business of BPM

Talking with people at the first break of the first day, I feel so lucky to be part of a community with so many people who are friends, and with whom you can have both enlightening and amusing conversations.

Building a BPM Ecosystem

Continuing on the Business of BPM program, we had a panel with Miguel Valdés Faura of Bonitasoft, Scott Francis of BP-3 Global and Denis Gagné of Trisotech on the BPM ecosystem. Although billed as a panel, each participant had a 10-minute presentation slot before joint Q&A.

BPM ecosystem panel

Not surprisingly, Miguel sees open source as an important part of the BPM ecosystem because it creates more of a meritocracy in the development of BPM capabilities, allowing many more people to participate actively in BPMS development and be recognized for their contributions. Being part of an open source community doesn’t necessarily mean that you’re writing core code: there are many people who contribute through developing extensions and add-ons, providing requirements, testing code, writing documentation and training materials for developers and users, and creating vertical solutions based on the open source offering. They may do this as volunteer contributors, or create businesses around the added-value components that they offer.

Scott talked about BP-3’s journey as former Lombardi employees who became Lombardi (then IBM BPM) partners, and now build add-on products for IBM BPM including user dashboards and code quality checkers. He talked about the things that they have done to build a successful business as a partner and ISV for a large vendor, including being consistent, adding value, building their own customer base rather than subcontracting to the vendor’s professional services arm, and marketing what they do. Having run a boutique BPM implementation services firm in the past, I agree that companies like BP-3 are an essential part of the BPM community, providing an alternative to the vendor’s PS that can often provide higher-quality services at a lower cost.

Denis, with his background in standards as well as building the Business Process Incubator resource community, has worked for years at explicitly building the BPM ecosystem. He has a “rising tide lifts all boats” philosophy of providing resources that allow potential customers to educate themselves and exchange information, which broadens the reach of the industry and helps to lift it out of the BPM 101 discussion stage. He also talked about the problem of BPM standards being divergent, that is, vendors take an agreed-upon standard such as BPMN, then create their own proprietary extensions that detract from the standard, and therefore from the community in general. Vendors that do this rather than participating in the standards development effort are not good community members; in my opinion, they are working from a fear-based philosophy of market scarcity rather than Denis’ more generous view that there will be a lot more of the BPM market to go around if we all help to educate and commoditize.

There was a wide-ranging discussion following their mini-presentations, although I only captured a couple of points:

  • Ensuring that the BPM ecosystem that we’re talking about covers process improvement, enterprise/business architecture and related topics, not just BPM software.
  • Why the push towards (mobile) apps isn’t more oriented to, or supported by, BPM technologies; and the problem of mobile app developers who don’t think at all about the back-end processes of the transactions that they initiate — low-code BPM solutions might be hindering this, since they remove the focus from developers. Mobile development fiefdoms have formed in many organizations, and these barriers need to be removed to integrate mobile apps and process.

Schrodinger’s BPM

We finished off the Business of BPM half-day program with Neil Ward-Dutton of MWD Advisors, talking about whether we are at the end of BPM or the end of transformation, and where we go next. The term “BPM” is starting to disappear from communications, and the market for platforms is growing slowly, with maintenance revenue dominating license revenue, but there are still plenty of inquiries about how to get started with BPM, including from non-traditional (read: not financial services) sectors. He sees this as an indication that we’re in the middle of mainstream adoption of BPM, with the conversation shifting from pure technology to domain-specific expertise, success stories, stakeholder education and how to develop cost-effective skills. A key challenge is that a BPMS isn’t like most other enterprise technologies, because it includes aspects of many different technologies and methodologies, and can be positioned as the “one suite to rule them all” application development platform as well as an enabler for significant organizational change. Since mainstream adoption means approaching the more conservative half of the market, this is a scary proposition.

He presented two organizations that both embarked on BPM projects: a retail group that successfully implemented a cloud-hosted case management system to specifically improve the delivery of in-home customer services; and a banking group that failed with an expensive IT-led technology transformation project, building their COE before implementing anything and not focusing on a specific business problem to solve. For organizations that approach problems the way the bank did, enterprise-wide BPM looks too big and too disruptive; for more nimble organizations like the retailer, it’s a tool that can be used to solve a business problem while moving to low-code platforms, Agile development methodologies, cloud and mobile.

The lines are blurring between different product classes: BPMS, BPA, low-code, operational intelligence, task management, project management, enterprise social collaboration, and cloud orchestration. Customers are picking products from different categories to solve the same problems, and products are spanning multiple categories. It’s not so easy any more to put boundaries around what any particular product can do. The digital business era is also creating new threats and opportunities: new customer expectations, and new ways to gather information from devices, for example. This requires two capabilities working in concert: instrumentation of products, services and processes; and agility of services, processes and business models. This is a fundamentally different view of transformation, with continuous change and improvement based on instrumentation of a quickly-implemented solution rather than pre-planned to-be/as-is multi-year transformation projects.

His summary: enterprise-wide BPM initiatives are just not happening the way that transformation efforts happened 10 years ago, but organizations are actively transforming business processes using more agile iterative techniques, particularly in the area of work coordination. Keep an eye on the non-traditional vendors, and start with simpler solutions while linking to broader digital strategies.

Neil Ward-Dutton and Schrodinger's cat

bpmNEXT 2015 Day 1: The Business of BPM

I can’t believe it’s already the third year of bpmNEXT, my favorite BPM conference, organized by Nathaniel Palmer and Bruce Silver. It’s a place to meet up with other BPM industry experts and hear about some of the new things that are coming up in the industry: a meeting of peers, including CEOs and CTOs from smaller BPM companies, BPM architects and product management experts from larger vendors, industry analysts and more. The goal is a non-partisan friendly meeting of the minds rather than a competitive arena, and it’s great to see a lot of familiar faces here, plus some new faces of people who I only know online or through phone calls.

Hanging with Denis and Jakob

We’re at the lovely Canary Hotel in Santa Barbara, and will have the chance for a wine tasting with some of the local wineries tonight: Slone Vineyards, Happy Canyon, Grassini, Au Bon Climat, and Margerum. But first, we have some work to do.

This year, we started with an optional half day program on the business of BPM, including keynotes and a panel, before kicking off the usual DEMO-style presentations. Because of the large volume of great content, I’ll just publish summaries at the break points; all of the presentations will be available online after the conference (as they were in 2014 and 2013) if you want to learn more.

BPM 2020: Outlook for the Next Five Years

Bruce Silver opening remarks

Bruce Silver kicked off the conference and summarized the themes and presenters here at bpmNEXT:

  • Breaking old barriers: between BPM and (business and enterprise) architecture, which will be covered in presentations by Comindware and Trisotech; between process modeling and decision modeling, with Sapiens and Signavio presentations; and between BPM and case management, with Camunda, Safira, Cryo, Kofax and IBM presentations.
  • Expanding BPM horizons: the internet of things, with presentations from SAP and W4; cognitive computing and expert systems, with BP3, Fujitsu, IBM and Living Systems; and resourcing optimization with process mining, from Process Analytica.
  • Reaffirming core values: business empowerment, covered by Omny.link and Oracle; and embracing continual change, with Bonitasoft.

Hearing Bruce talk about the future of BPM in the context of the presentations to be given here over the next couple of days makes you realize just how much thought goes into the bpmNEXT program, and into selecting presenters who provide maximum value. If this fascinates you, you should consider being here next year, as an attendee or a presenter.

Nathaniel Palmer then gave us his view of what BPM will look like in five years: data-driven, goal-oriented, adaptive and with intelligent automation, so that processes understand, evolve and self-optimize to meet the work context and requirements. He sees the key challenges as the integration of rules, relationships and robots into processes and operations, including breaking down the artificial barrier that exists between the modeling and automation of rules and process. Today’s consumers — and business people — are expecting to interact with services through their mobile devices, and are starting to include the quality of mobile services as a primary decision criterion. Although we are primarily doing that via our phones and tablets now, there are also devices such as Amazon Echo that lower the threshold to interaction (and therefore to purchasing) by being a dedicated, voice-controlled gateway to Amazon; Jibo, a home-automation “robot” that aims to become a personal assistant for your home, interfacing with rather than automating tasks; and wearables that can notify and accept instructions.

bpmNEXT attendees

Today, most BPM is deployed as a three-tier, MVC-type architecture that presents tasks via a worklist/inbox metaphor; Nathaniel thinks that we need to re-envision this as a four-tier architecture: a client tier native to each platform, a delivery tier that optimizes delivery for the platform, an aggregation tier that integrates services and data, and a services tier that provides the services (which is, arguably, the same as the bottom two tiers of a standard three-tier architecture). Tasks are machine-discoverable for automated integration and actions, and designed by context rather than procedure. Key enablers for this include standards such as BPAF, and techniques for automated analysis including process mining.
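
To make the tiers concrete, here is the four-tier split sketched as Java interfaces; all names are mine and purely illustrative:

```java
// Sketch of the four tiers as interfaces: the aggregation tier composes
// back-end services, the delivery tier shapes the result per platform, and
// each client gets a native experience without talking to services directly.
public interface FourTier {
    interface ServicesTier    { Object invoke(String service, Object request); }
    interface AggregationTier { Object composeTask(String taskId); }           // integrates services and data
    interface DeliveryTier    { byte[] render(Object task, String platform); } // optimizes per device
    interface ClientTier      { void present(byte[] payload); }                // native to each platform
}
```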

Reinventing BPM for the Age of the Customer

Clay Richardson of Forrester — marking what I think is the first participation by a large analyst firm at bpmNEXT — presented some of Forrester’s research on how organizations are retooling to improve customer experience. Although still critical for automation and information management, BPM has evolved to support customer engagement, especially via mobile applications and innovation. 42% of their customers surveyed consider it either critical or high priority to reengineer business processes for mobile, meaning that this is no longer about just putting a mobile interface on an existing product, but reworking these processes to leverage things such as events generated by sensors and devices, providing a much richer informational context for processes. Digital transformation provides new opportunities for using BPM to drive rapid customer-centric innovation: digitizing the customer lifecycle and end-to-end experiences, as well as quickly integrating services behind the scenes. Many companies now are using customer journey maps to connect the dots between process changes and customer experience, using design thinking paradigms.

We saw Forrester’s BPM TechRadar — similar to Gartner’s Hype Cycle — showing the key technologies related to BPM, and where they are on their maturity curves: BPM suites, business rules, process modeling and document capture are all at or past their peak, whereas predictive analytics, social collaboration, low-code platforms and dynamic case management are still climbing. They see BPM platforms as moving towards more customer-centricity, being used to create customer-facing applications in addition to automated integration and internal human-centric workflow. There’s also an interesting focus on the low-code application development platform market, as some BPM vendors reposition their products as process-centric app dev — targeting both traditional technical developers and less technical citizen developers — rather than BPMS.

We’re off on a break now, but will be back to finish the Business of BPM program with a panel and a keynote before we start on the demo program this afternoon.

Going Beyond Process Modeling, Part 1

I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is known for their free downloadable process modeler, but also have a full-featured BPMS for process execution.

My papers are not at all specific to Bizagi products; the first one, which you can find here (registration required), outlines the business benefits of automating and managing processes, and presents some use cases. In my experience, almost every organization models their processes in some way, but most never move beyond process analysis to process management. This paper will provide some information that can help build a business case to do just that.

The second paper will be released in a few weeks, covering a more technical view of exactly how you go about starting on process automation projects, and moving from an initial project to a broader program or center of excellence.

We’re also scheduling a webinar to expand on the concepts in the paper; I’ll post the date when that’s available.

If you want to learn more about how Bizagi stacks up in the BPMS marketplace, check out the report on Bizagi from the Fraunhofer Institute for Experimental Software Engineering, available in both English and German. Spoiler alert: relative to the participating vendors, Bizagi scored above average in six of the nine categories, and around average in the rest. This is a more rigorous academic view than you might find in a typical analyst report on a vendor, including test scenarios and scripts for workshops where they created and ran sample process applications. Fraunhofer sells a book with the complete market analysis of all vendors studied, although I could only find a German edition on their site.

Effektif BPM Goes Open Source

On a call with Tom Baeyens last week, he told me about their decision to turn the engine and APIs of Effektif BPM into an open source project: not a huge surprise since he was a driver behind two major open source BPM projects prior to starting Effektif, but an interesting turn of events. When Tom launched Effektif two years ago, it was a bit of a departure from his previous open source BPM projects: subscription-based pricing, cloud platform, business-friendly tooling for creating executable task lists and workflows with little IT involvement, and an integrated development environment rather than an embeddable engine. In the past, his work has been focused on building clean and fast BPM engines, but building the Effektif user-facing tooling taught them a lot about how to make a better engine (a bit to his surprise, I think).

The newly-launched open source project includes the fully-functional BPM engine with Java and REST APIs; the REST APIs are a bit minimal at this point, but more will come from Effektif or from community contributions. It also includes a developer cloud account for creating and exporting workflows to an on-premise engine (although it sounds like you can create them in any standard BPMN editor), or process instances can be run in the cloud engine for a subscription fee (after a 30-day free trial). They will also offer developer support for a fee. Effektif will continue to offer the existing suite of cloud tools for building and running workflows at subscription pricing, allowing them to address both the simple, out-of-the-box development environment and the developer-friendly embeddable engine – the best of both worlds, although it’s unclear how easy it will be for both types of “developers” to share projects.
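
To give a sense of what an embeddable engine with a Java API offers, here is a purely hypothetical sketch; the class and method names below are invented for illustration and are not the actual Effektif API (see Tom’s blog and the project wiki for the real interfaces):

```java
// Hypothetical sketch of embedding a workflow engine via a Java API. These
// interfaces are invented to show the shape of the idea, NOT Effektif's API.
public class EmbeddedEngineExample {
    interface WorkflowEngine {
        String deploy(String bpmnXml);                   // returns a workflow id
        String start(String workflowId);                 // returns an instance id
        void complete(String instanceId, String taskId); // advance the instance
    }

    public static void run(WorkflowEngine engine, String bpmnXml) {
        String workflowId = engine.deploy(bpmnXml);      // model from any standard BPMN editor
        String instanceId = engine.start(workflowId);
        engine.complete(instanceId, "review-request");   // drive the instance from app code
    }
}
```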

You can read more about the technical details on Tom’s blog or check out the wiki on the open source project.

This definitely puts Effektif back in direct competition with the other open source BPM projects that he has been involved with in the past – jBPM and Activiti (and Camunda, which forked from Activiti) – since they all use a similar commercial open source business model, although Tom considers the newer Effektif engine as having a more up-to-date architecture as well as simpler end-user tooling. How well Effektif can compete against these companies offering commercial open source BPM will depend on its ability to build the community as well as continue to offer easy and compelling citizen developer tools.