bpmNEXT 2015 Day 1 Demos: SAP, W4 and Whitestein

The demo program kicked off in the afternoon, with time for three of them sandwiched between two afternoon keynotes. Demos are strictly limited to 30 minutes, with a 5-minute, 20-slide, auto-advancing Ignite-style presentation (which I am credited with suggesting after some of last year’s slideware dragged on), followed by a 15-minute demo and 10 minutes for Q&A and changeover to the next speaker.

SAP: BPM and the Internet of Everything

Harsh Jegadeesan and Benjamin Notheis were in the unenviable first position, given the new presentation format; they gave an introduction to the internet of everything, referring to things, people, places and content. Events are at the core of many BPM systems that sense and respond to events; patterns of events are detected, and managed with rules and workflow. They introduced Smart Process Services on HANA Cloud Platform, including an app marketplace, and looked at a case study of pipeline incident management, where equipment sensor events trigger maintenance processes: a machine-to-process scenario. The demo showed a dashboard for pipeline management, with a geographic view of a pipeline overlaid with pump locations and details, highlighting abnormal readings and predicted failures. This is combined with cost data, including the cost of various risk scenarios such as a pipeline break or pump failure. The operator can drill down into abnormal readings for a pump, see predicted failure and maintenance records, then trigger an equipment repair or replacement. The incident case can be tracked, and tasks assigned and escalated. Aggregates for incident cases show the number of critical cases or those approaching deadlines, and can be used to cluster the incidents to detect contributing factors. Nice demo; an expansion of the operational intelligence dashboards that I’ve seen from SAP previously, with good integration of predictions. Definitely a two-person demo with the inclusion of a tablet, a laptop and a wearable device. They finished with a developer view of the process-related services available on the HANA cloud portal plus the standard Eclipse environment for assembling services using BPMN. This does not have their BPM engine (the former NetWeaver engine) behind it: the workflow microservices compile to JavaScript and run in an in-memory cloud workflow engine.
However, they see that some of the concepts from the more agile development that they are doing on the cloud platform could make their way back to the enterprise BPM product.
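The machine-to-process pattern at the heart of this scenario (sensor events triggering a maintenance process) can be sketched in a few lines; the threshold, event fields and process name below are purely illustrative, not SAP’s actual services, which use predictive models rather than a fixed limit:

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    pump_id: str
    pressure_psi: float

# Hypothetical threshold for an abnormal reading.
PRESSURE_LIMIT = 900.0

def triggered_incidents(events):
    """Map abnormal pump readings onto maintenance-process triggers:
    the machine-to-process pattern, with a process started per incident."""
    return [
        {"process": "pump-maintenance", "pump": e.pump_id}
        for e in events
        if e.pressure_psi > PRESSURE_LIMIT
    ]
```

In a real deployment the detection side would be a streaming rule or prediction inside HANA, but the shape of the contract is the same: an event pattern on one side, a process instantiation on the other.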

W4: Events, IOT, and Intelligent Business Operations

Continuing on the IoT theme, Francois Bonnet talked about making business operations more intelligent by binding physical device events together with people and business events in a BPMS. His example was for fall management — usually for the elderly — where a device event triggers a business process in a call center; the device events can be integrated into BPMN models using standard event constructs. He demonstrated with a sensor made from a Raspberry Pi tied to positional sensors that detect orientation; by tipping over the sensor, a process instance was created that triggered a call to the subscriber, using GPS data to indicate the location on a map. If the call operator indicated that the subscriber did not answer, they would be prompted to call a neighbour, and then emergency services. KPIs such as falls within a specified period are tracked, along with a history of the events for the subscriber’s device. The sensor being out of range or having no movement over a period of time can also trigger a new task instance, while returning the sensor to the upright position within a few seconds after a fall is detected cancels the process. Looking at the BPMN for managing events from the sensor, they are using the event objects in standard BPMN to their fullest extent, including both in-line and boundary events, with the device events translating to BPMN signal events. Great example of responsive event handling using BPMN.
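The cancel-on-upright behaviour is exactly what a BPMN boundary signal event expresses; as a rough sketch of that semantics (the event names and the cancellation window are hypothetical, not W4’s implementation):

```python
def fall_cases(events, cancel_window=5.0):
    """events: list of (timestamp, kind) with kind 'fall' or 'upright'.
    A fall starts a case; an upright signal within cancel_window cancels it,
    mirroring a boundary signal event cancelling the process instance.
    Returns the timestamps of falls that resulted in an open case."""
    cases = []
    pending = None  # timestamp of an unresolved fall, if any
    for ts, kind in sorted(events):
        if kind == "fall":
            if pending is not None:
                cases.append(pending)
            pending = ts
        elif kind == "upright" and pending is not None and ts - pending <= cancel_window:
            pending = None  # sensor righted in time: cancel the case
    if pending is not None:
        cases.append(pending)
    return cases
```

A sensor tipped at t=0 and righted at t=3 produces no case, while a later fall with no upright signal does.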

Whitestein: Demonstrating Measurable Intelligence in an Enterprise Process Platform

The last demo of the day, from Dan Neason of Whitestein, was also on the theme of events, but more focused on intelligent agents and measurable intelligence in processes. Their LSPS solution models and executes goal-driven processes, where the system uses previous events to evolve its methods for reaching the goals, predicting outcomes, and recommending alternatives. The scenario used was a mortgage application campaign, where information about applicants is gathered and the success of the campaign determined by the number of completed mortgages; potential fraud cases are detected and recommended actions presented to a user to handle the case. Feedback from the user, in the form of accepting or rejecting recommendations, is used to tune the predictions. In addition to showing standard dashboards of events that have occurred, it can also give a dashboard view of predictions such as how many mortgage applications are expected to fail, including those that may be resolved favorably through some recommended actions. The system is self-learning based on statistical models and domain knowledge, so can detect predefined patterns or completely emergent patterns; it can be applied to provide predictive analytics and goal-seeking behavior across multiple systems, including other BPMS.

Wrapping up this set of demos on intelligent, event-driven processes, we had a keynote from Jim Sinur (formerly of Gartner, now an independent consultant) on goal-directed processes. He covered concepts of hybrid processes, made up of multiple heterogeneous systems and processes that may exhibit both orchestration and collaboration to solve business problems.

Great first set of demos, definitely setting the bar high for tomorrow’s full day of 11 demos, and a good first day. We’re all off to the roof deck for a reception, wine tasting and dinner, so that’s it for blogging for today.

Canary roof deck

By the way, I realize that we completely forgot to create bpmNEXT bingo cards, although it did take until after 4pm for “ontology” to come up.

bpmNEXT 2015 Day 1: More Business of BPM

Talking with people at the first break of the first day, I feel so lucky to be part of a community with so many people who are friends, and with whom you can have both enlightening and amusing conversations.

Building a BPM Ecosystem

Continuing on the Business of BPM program, we had a panel with Miguel Valdés Faura of Bonitasoft, Scott Francis of BP-3 Global and Denis Gagne of Trisotech on the BPM ecosystem. Although billed as a panel, each participant had a 10-minute presentation slot before joint Q&A.

BPM ecosystem panel

Not surprisingly, Miguel sees open source as an important part of the BPM ecosystem because it creates more of a meritocracy in the development of BPM capabilities, allowing many more people to participate actively in BPMS development and be recognized for their contributions. Being part of an open source community doesn’t necessarily mean that you’re writing core code: there are many people who contribute through developing extensions and add-ons, providing requirements, testing code, writing documentation and training materials for developers and users, and creating vertical solutions based on the open source offering. They may do this as volunteer contributors, or create businesses around the added-value components that they offer.

Scott talked about BP-3’s journey as former Lombardi employees who became Lombardi (then IBM BPM) partners, and now build add-on products for IBM BPM including user dashboards and code quality checkers. He talked about the things that they have done to build a successful business as a partner and ISV for a large vendor, including being consistent, adding value, building their own customer base rather than subcontracting to the vendor’s professional services arm, and marketing what they do. Having run a boutique BPM implementation services firm in the past, I agree that companies like BP-3 are an essential part of the BPM community, providing an alternative to the vendor’s PS that can often provide higher-quality services at a lower cost.

Denis, with his background in standards as well as building the Business Process Incubator resource community, has worked for years at explicitly building the BPM ecosystem. He has a “rising tide lifts all boats” philosophy of providing resources that allow potential customers to educate themselves and exchange information, which broadens the reach of the industry and helps to lift it out of the BPM 101 discussion stage. He also talked about the problem of BPM standards being divergent, that is, vendors take an agreed-upon standard such as BPMN, then create their own proprietary extensions that detract from the standard, and therefore the community in general. Vendors that do this rather than participating in the standards development effort are not good community members; in my opinion, they are working from a fear-based philosophy of market scarcity rather than Denis’ more generous view that there will be a lot more of the BPM market to go around if we all help to educate and commoditize.

There was a wide-ranging discussion following their mini-presentations, although I only captured a couple of points:

  • Ensuring that the BPM ecosystem that we’re talking about covers process improvement, enterprise/business architecture and related topics, not just BPM software.
  • Why the push towards (mobile) apps isn’t more oriented to, or supported by, BPM technologies; and the related problem of mobile app developers who don’t think at all about the back-end processes behind the transactions that they initiate. Low-code BPM solutions might be aggravating this, since they remove the focus from developers. Mobile development fiefdoms have formed in many organizations, and these barriers need to be removed to integrate mobile apps and process.

Schrodinger’s BPM

We finished off the Business of BPM half-day program with Neil Ward-Dutton of MWD Advisors, talking about whether we are at the end of the road for BPM, and where we go next. The term “BPM” is starting to disappear from communications and the market for platforms is growing slowly, with maintenance revenue dominating license revenue, but there are still plenty of inquiries about how to get started with BPM, including from non-traditional (read: not financial services) sectors. He sees this as an indication that we’re in the middle of mainstream adoption of BPM, with the conversation shifting from pure technology to domain-specific expertise, success stories, stakeholder education and how to develop cost-effective skills. A key challenge is that a BPMS isn’t like most other enterprise technologies, because it includes aspects of many different technologies and methodologies, and can be positioned as the “one suite to rule them all” application development platform as well as an enabler for significant organizational change. Since mainstream adoption means approaching the more conservative half of the market, this is a scary proposition.

He presented two organizations that both embarked on BPM projects: a retail group that successfully implemented a cloud-hosted case management system to specifically improve the delivery of in-home customer services; and a banking group that failed with an expensive IT-led technology transformation project, building their COE before implementing anything and not focusing on a specific business problem to solve. For organizations like the bank, enterprise-wide BPM looks like it’s too big and too disruptive; for more nimble organizations like the retailer, it’s a tool that can be used to solve a business problem while moving to low-code platforms, Agile development methodologies, cloud and mobile.

The lines are blurring between different product classes: BPMS, BPA, low-code, operational intelligence, task management, project management, enterprise social collaboration, and cloud orchestration. Customers are picking products from different categories to solve the same problems, and products are spanning multiple categories. It’s not so easy any more to put boundaries around what any particular product can do. The digital business era is also creating new threats and opportunities: new customer expectations, and new ways to gather information from devices, for example. This requires two capabilities working in concert: instrumentation of products, services and processes; and agility of services, processes and business models. This is a fundamentally different view of transformation, with continuous change and improvement based on instrumentation of a quickly-implemented solution rather than pre-planned to-be/as-is multi-year transformation projects.

His summary: enterprise-wide BPM initiatives are just not happening in the way that transformation efforts happened 10 years ago, but organizations are actively transforming business processes using more agile iterative techniques, particularly in the area of work coordination. He recommends keeping an eye on the non-traditional vendors, and starting with simpler solutions while linking them to broader digital strategies.

Neil Ward-Dutton and Schrodinger's cat

bpmNEXT 2015 Day 1: The Business of BPM

I can’t believe it’s already the third year of bpmNEXT, my favorite BPM conference, organized by Nathaniel Palmer and Bruce Silver. It’s a place to meet up with other BPM industry experts and hear about some of the new things that are coming up in the industry: a meeting of peers, including CEOs and CTOs from smaller BPM companies, BPM architects and product management experts from larger vendors, industry analysts and more. The goal is a non-partisan friendly meeting of the minds rather than a competitive arena, and it’s great to see a lot of familiar faces here, plus some new faces of people who I only know online or through phone calls.

Hanging with Denis and Jakob

We’re at the lovely Canary Hotel in Santa Barbara, and will have the chance for a wine tasting with some of the local wineries tonight: Slone Vineyards, Happy Canyon, Grassini, Au Bon Climat, and Margerum. But first, we have some work to do.

This year, we started with an optional half day program on the business of BPM, including keynotes and a panel, before kicking off the usual DEMO-style presentations. Because of the large volume of great content, I’ll just publish summaries at the break points; all of the presentations will be available online after the conference (as they were in 2014 and 2013) if you want to learn more.

BPM 2020: Outlook for the Next Five Years

Bruce Silver opening remarks

Bruce Silver kicked off the conference and summarized the themes and presenters here at bpmNEXT:

  • Breaking old barriers: between BPM and (business and enterprise) architecture, which will be covered in presentations by Comindware and Trisotech; between process modeling and decision modeling, with Sapiens and Signavio presentations; and between BPM and case management, with Camunda, Safira, Cryo, Kofax and IBM presentations.
  • Expanding BPM horizons: the internet of things, with presentations from SAP and W4; cognitive computing and expert systems, with BP3, Fujitsu, IBM and Living Systems; and resourcing optimization with process mining, from Process Analytica.
  • Reaffirming core values: business empowerment, covered by Omny.link and Oracle; and embracing continual change, with Bonitasoft.

Hearing Bruce talk about the future of BPM in the context of the presentations to be given here over the next couple of days makes you realize just how much thought goes into the bpmNEXT program, and selecting presenters that provide maximum value. If this fascinates you, you should consider being here next year, as an attendee or a presenter.

Nathaniel Palmer then gave us his view of what BPM will look like in five years: data-driven, goal-oriented, adaptive and with intelligent automation, so that processes understand, evolve and self-optimize to meet the work context and requirements. He sees the key challenges as the integration of rules, relationships and robots into processes and operations, including breaking down the artificial barrier that exists between the modeling and automation of rules and process. Today’s consumers — and business people — are expecting to interact with services through their mobile devices, and are starting to include the quality of mobile services as a primary decision criterion. Although we are primarily doing that via our phones and tablets now, there are also devices such as Amazon Echo that lower the threshold to interaction (and therefore to purchasing) by being a dedicated, voice-controlled gateway to Amazon; Jibo, a home-automation “robot” that aims to become a personal assistant for your home, interfacing with rather than automating tasks; and wearables that can notify and accept instructions.

bpmNEXT attendees

Today, most BPM is deployed as a three-tier, MVC-type architecture that presents tasks via a worklist/inbox metaphor; Nathaniel thinks that we need to re-envision this as a four-tier architecture: a client tier native to each platform, a delivery tier that optimizes delivery for the platform, an aggregation tier that integrates services and data, and a services tier that provides the services (which is, arguably, the same as the bottom two tiers of a standard three-tier architecture). Tasks are machine-discoverable for automated integration and actions, and designed by context rather than procedure. Key enablers for this include standards such as BPAF, and techniques for automated analysis including process mining.

Reinventing BPM for the Age of the Customer

Clay Richardson of Forrester — marking what I think is the first participation by a large analyst firm at bpmNEXT — presented some of Forrester’s research on how organizations are retooling to improve customer engagement. Although still critical for automation and information management, BPM has evolved to support customer engagement, especially via mobile applications and innovation. 42% of customers surveyed consider it either critical or high priority to reengineer business processes for mobile, meaning that this is no longer about just putting a mobile interface on an existing product, but reworking these processes to leverage things such as events generated by sensors and devices, providing a much richer informational context for processes. Digital transformation provides new opportunities for using BPM to drive rapid customer-centric innovation: digitizing the customer lifecycle and end-to-end experiences as well as quickly integrating services behind the scenes. Many companies now are using customer journey maps to connect the dots between process changes and customer experience, using design thinking paradigms.

We saw Forrester’s BPM TechRadar — similar to Gartner’s Hype Cycle — showing the key technologies related to BPM, and where they are on their maturity curves: BPM suites, business rules, process modeling and document capture are all at or past their peak, whereas predictive analytics, social collaboration, low-code platforms and dynamic case management are still climbing. They see BPM platforms as moving towards more customer-centricity, being used to create customer-facing applications in addition to automated integration and internal human-centric workflow. There’s also an interesting focus on the low-code application development platform market, as some BPM vendors reposition their products as process-centric app dev — including both traditional technical developers and less technical citizen developers — rather than BPMS.

We’re off on a break now, but will be back to finish the Business of BPM program with a panel and a keynote before we start on the demo program this afternoon.

bpmNEXT 2014 Wrapup And Best In Show

I couldn’t force myself to write about the last two sessions of bpmNEXT: the first was a completely incomprehensible (to me) demo, and the second spent half of the time on slides and half on a demo that didn’t inspire me enough to actually put my hands on the keyboard. Maybe it’s just conference fatigue after two full days of this.

However, we did get a link to the Google Hangout recording of the BPMN model interchange demo from yesterday (be sure to set it to HD or you’ll miss a lot of the screen detail).

We had a final wrapup address from Bruce Silver, and he announced our vote for the best in show: Stefan Andreasen of Kapow – congrats!

I’m headed home soon to finish my month of travel; I’ll be Toronto-based until the end of April when IBM Impact rolls around.

bpmNEXT 2014 Thursday Session 2: Decisions And Flexibility

In the second half of the morning, we started with James Taylor of Decision Management Solutions showing how to use decision modeling for simpler, smarter, more agile processes. He showed what a process model looks like in the absence of externalized decisions and rules: it’s a mess of gateways and branches that basically creates a decision tree in BPMN. A cleaner solution is to externalize the decisions so that they are called as a business rules activity from the process model, but the usual challenge is that the decision logic is opaque from the viewpoint of the process modeler. James demonstrated how the DecisionsFirst modeler can be used to model decisions using the Decision Model and Notation standard, then link a read-only view of that to a process model (which he created in Signavio) so that the process modeler can see the logic behind the decision as if it were a callable subprocess. He stepped through the notation within a decision called from a loan origination process, then took us into the full DecisionsFirst modeler to add another decision to the diagram. The interesting thing about decision modeling, which is exploited in the tool, is that it is based on firmer notions of reusability of data sources, decisions and other objects than we see in process models: although reusability can definitely exist in process models, the modeling tools often don’t support it well. DecisionsFirst isn’t a rules/decision engine itself: it’s a modeling environment where decisions are assembled from the rules and decisions in other environments, including external engines, spreadsheet-based decision tables, or knowledge sources describing the decision. It also allows linking to the processes from which it is invoked, objectives and organizational context; since this is a collaborative authoring environment, it can also include comments from other designers.
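The core idea of externalizing decisions, replacing a decision tree of BPMN gateways with a single business rules activity, can be illustrated with a first-hit decision table; the scores, ratios and outcomes below are invented for illustration and have nothing to do with DecisionsFirst’s actual models:

```python
# Each row pairs a predicate over the inputs with an outcome.
# First matching row wins, like a DMN decision table with hit policy FIRST.
LOAN_DECISION_TABLE = [
    (lambda score, ltv: score >= 720 and ltv <= 0.80, "approve"),
    (lambda score, ltv: score >= 650 and ltv <= 0.90, "refer"),
]

def decide_loan(score, ltv):
    """Evaluate a loan origination decision from credit score and
    loan-to-value ratio, falling through to 'decline' if no row matches."""
    for predicate, outcome in LOAN_DECISION_TABLE:
        if predicate(score, ltv):
            return outcome
    return "decline"
```

The process model then needs only one activity ("decide loan") instead of a thicket of gateways, and the table can be changed without touching the process.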

François Chevresson-Aubain and Aurélien Pupier of Bonitasoft were up next to show how to build flexibility into deployed processes through a few simple but powerful features. First, adding collaboration tasks at runtime, so that a user in a pre-defined step who needs to include other users at that point can do so even if collaboration wasn’t built into the process model. Second, process model parameters can be changed (by an administrator) at runtime, which will impact all running processes based on that model: the situation demonstrated was to change an external service connector when the service call failed, then replay the tasks that failed on that service call. Both of these features are intended to address dynamic environments where the situation at runtime may be different from that at design time, adjusting both manual and automated tasks to accommodate those differences.
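The replay behaviour can be sketched as follows, assuming failed automated tasks are simply re-run against the updated parameter set; the task structure and names here are hypothetical, not Bonitasoft’s API:

```python
def replay_failed(tasks, parameters):
    """tasks: list of {'state': 'failed'|'done', 'call': callable(params)}.
    After an administrator updates a process parameter (e.g. a service
    connector endpoint), failed automated tasks are replayed against the
    new value; tasks that still fail are left for a later replay."""
    for task in tasks:
        if task["state"] == "failed":
            try:
                task["call"](parameters)
                task["state"] = "done"
            except Exception:
                pass  # still failing; state unchanged
    return tasks
```

The key design point is that the parameter lives on the process model, not in each instance, so one administrative change repairs every stuck instance at once.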

We finished the morning with Robert Shapiro of Process Analytica on improving resource utilization and productivity using his Optima workbench. Optima is a tool for a serious analyst – likely with some amount of statistical or data science background – to import a process model and runtime data, set optimization parameters (e.g., reduce resource idleness without unduly impacting cycle time), simulate the process, analyze the results, and determine how to best allocate resources in order to optimize relative to the parameters. Although a complex environment, it provides a lot of visualization of the analytics and optimization; Robert actually encourages “eyeballing” the charts and playing around with parameters to fine-tune the process, although he has a great deal more experience at that than the average user. There are a number of analytical tools that can be applied to the data, such as critical path modeling, and financial parameters to optimize revenues and costs. It can also do quite a bit of process mining based on event log inputs in XES format, including deriving a BPMN process model and data correlation based on the event logs; this type of detailed offline analysis could be applied with the data captured and visualized through an intelligent business operations dashboard for advanced process optimization.
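The trade-off that this kind of tool explores can be shown with a toy simulation: greedily assign task durations to the earliest-free resource, then measure cycle time and idleness. This assumes all work is available at time zero and ignores arrival patterns and task dependencies, so it is only a sketch of the idea, not Optima’s engine:

```python
def simulate(durations, n_resources):
    """Greedy assignment of task durations to the earliest-free resource.
    Returns (cycle_time, idle_fraction): the span until the last task
    finishes, and the fraction of total resource-time spent idle.
    Assumes at least one task with nonzero duration."""
    free_at = [0.0] * n_resources
    busy = [0.0] * n_resources
    for d in durations:
        i = min(range(n_resources), key=lambda j: free_at[j])
        free_at[i] += d
        busy[i] += d
    cycle_time = max(free_at)
    idle_fraction = sum(cycle_time - b for b in busy) / (n_resources * cycle_time)
    return cycle_time, idle_fraction
```

Sweeping `n_resources` and eyeballing the resulting cycle-time/idleness curve is a miniature version of the optimization loop described above: more resources shorten cycle time but raise idleness, and the analyst looks for the knee of the curve.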

We have one more short session after lunch, then best in show voting before bpmNEXT wraps up for another year.

bpmNEXT 2014 Thursday Session 1: Intelligence And A Bit More BPMN

Harsh Jegadeesan of SAP set the dress code bar high by kicking off the Thursday demos in a suit jacket, although I did see Thomas Volmering and Patrick Schmidt straightening his collar before the start. He also set a high bar for the day’s demo by showing how to illuminate business operations with intelligent process intelligence. He discussed a scenario of a logistics hub (such as Amazon’s), and the specific challenges of the hub operations manager who has to deal with inbound and outbound flights, and sorting all of the shipments between them throughout the day. Better visibility into the operations across multiple systems allows problems to be detected and resolved while they are still developing by reallocating the workforce. Harsh showed a HANA-based hub operations dashboard, where the milestones for shipments demarcate the phases of the value chain: from arrival to ground handling to warehouse to outbound buffer to loading and takeoff. Real-time information is pulled from each of the systems involved, and KPIs are displayed; drill-downs can show the lower-level aggregate or even individual instance data to determine what is causing missed KPIs – in the demo, shipments from certain other hubs are not being unloaded quickly enough. But more than just a dashboard, this allows the hub operations manager to add a task directly in the context of the problem and assign it (via an @mention) to someone else, for example, to direct more trucks to unload the shipments. The dashboard can also make recommendations, such as changing the flights for specific shipments to improve the overall flow and KPIs. He showed a flight map view of all inbound and outbound flights, where the hub operations manager can click on a specific flight and see the related data.
He showed the design environment for creating the intelligent business operations process by assembling SAP and non-SAP systems using BPMN, mapping events from those systems onto the value chain phases (using BPAF where available), thereby providing visibility into those systems from the dashboard; this builds a semantic data mart inside HANA for the different scenarios to support the dashboard but also for more in-depth analytics and optimization. They’ve also created a specification for Process Façade, an interface for unifying process platforms by integrating using BPMN, BPAF and other standards, plus their own process-based systems; at some point, expect this to open up for broader vendor use. Some nice case studies from process visualization in large-scale enterprises.

Dominic Greenwood of Whitestein spoke on intelligent process execution, starting by defining an intelligent process: it has experiences (acquired data), knowledge (actionable information, or analytical interpretation of acquired data), goals (adoptable intentions, or operationally-relevant behavioral directives), plans (ways to achieve goals through reusable action sequences, such as BPMN processes) and actions (result of executing plans). He sees intelligent process execution as an imperative because of the complexity of real-world processes; processes need to dynamically adapt, and process governance needs to actively apply constraints in this shifting environment. An intelligent process controller, or reflective agent, passes through a continuous cycle of observe, comprehend, deliberate, decide, act and learn; it can also collaborate with other intelligent process controllers. He discussed a case study in transportation logistics – a massively complex version of the travelling salesman problem – where a network of multi-modal vehicles has to be optimized for delivery of goods that are moved through multiple legs to reach their destinations. This involves knowledge of the goods and their specific requirements, vehicle sensors of various types, fleet management, hub/port systems, traffic and weather, and personnel assignments. DHL in Europe is using this to manage 60,000 orders per day, allocated between 17,500 vehicles that are constantly in motion, managed by 300 dispatchers across 24 countries with every order changing at least once while en route. The intelligent process controllers are automating many of the dispatching decisions, providing a 25-30% operational efficiency boost and a 12% reduction in transportation costs. A too-short demo that just walked through their process model to show how some of these things are assigned, but an interesting look into intelligent processes, and a nice tie-in to Harsh’s demonstration immediately preceding.
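The observe/comprehend/deliberate/decide/act/learn cycle can be sketched as a minimal agent skeleton; the situation classification and policy below are invented for illustration and bear no relation to LSPS internals:

```python
class ReflectiveAgent:
    """Skeleton of the reflective-agent cycle: each step observes an event,
    comprehends it into a situation, and decides an action from a policy.
    Accumulated observations stand in for the 'learn' phase."""

    def __init__(self, policy):
        self.policy = policy     # maps a situation label to an action
        self.experience = []     # acquired data (observations seen so far)

    def step(self, observation):
        self.experience.append(observation)           # observe / learn
        situation = self.comprehend(observation)      # comprehend
        return self.policy.get(situation, "wait")     # deliberate, decide, act

    def comprehend(self, observation):
        # Toy interpretation: a shipment more than 30 minutes late is 'late'.
        return "late" if observation.get("delay_min", 0) > 30 else "on-time"
```

In a dispatching setting, many such controllers would run concurrently and negotiate with one another; here a single agent just shows the shape of the loop.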

Next up was Jakob Freund of camunda on BPMN everywhere; camunda provides an open-source BPM framework intended to be used by Java developers to incorporate process automation into their applications, but he’s here today to talk about bpmn.io: an open-source toolkit in JavaScript that provides a framework for developers and a BPMN web modeler, all published on GitHub. The first iteration is kicking off next week, and the web modeler will be available later this year. Unlike yesterday’s demonstrators who firmly expressed the value of no-code BPM implementations, Jakob jumped straight into code to show how to use the JavaScript classes to render BPMN XML as a graphical diagram and add annotations around the display of elements. He showed how these concepts are being used in their cockpit process monitoring product; it could also be used to demonstrate or teach BPMN, making use of functions such as process animation. He demonstrated uploading a BPMN diagram (as XML) to their camunda community site; the site uses the JavaScript libraries to render the diagram, and allows selecting specific elements in the diagram and adding comments, which are then seen via a numeric indicator (indicating the number of comments) attached to the elements with comments. He demonstrated some of the starting functionality of the web modeler, but there’s a lot of work to do there still; once it’s released, any developer can download the code and embed that web modeler into their own applications.
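Any element-level rendering or annotation of a diagram starts from the BPMN XML itself. As a language-neutral illustration of that first step (this is generic XML parsing, not bpmn.io’s actual JavaScript API), here is how flow elements can be pulled out of a BPMN document:

```python
import xml.etree.ElementTree as ET

# The BPMN 2.0 model namespace from the OMG specification.
BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

# A minimal, hand-written example document.
SAMPLE = f"""<definitions xmlns="{BPMN_NS}">
  <process id="p1">
    <startEvent id="start"/>
    <userTask id="review" name="Review order"/>
    <endEvent id="end"/>
  </process>
</definitions>"""

def flow_elements(xml_text):
    """Return (tag, id) for each flow element in document order: the kind
    of element-level access a renderer or comment overlay needs."""
    root = ET.fromstring(xml_text)
    wanted = {"startEvent", "userTask", "endEvent", "task", "exclusiveGateway"}
    out = []
    for el in root.iter():
        tag = el.tag.split("}")[-1]  # strip the namespace prefix
        if tag in wanted:
            out.append((tag, el.get("id")))
    return out
```

A web renderer does the same walk, mapping each element id to an SVG shape, which is also what makes per-element comment indicators straightforward.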

We finished the first morning session with Keith Swenson of Fujitsu on letting go of control: we’re back on the topic of agents, which Keith initially defined as autonomous, goal-directed software that does something for you, before pointing out that that describes a lot of software today. He expanded that definition to mean something more…human-like. A personal assistant that can coordinate your communications with those of other domains. These types of agents do a lot of communication amongst themselves in a rules-based dynamic fashion, simplifying and reducing the communication that the people need to do in order to achieve their goals. The key to determining what the personal assistants should be doing is to observe emergent behavior through analytics. Keith demonstrated a healthcare scenario using Cognoscenti, an open-source adaptive case management project; a patient and several different clinicians could set goals, be assigned tasks, review documents and other activities centered around the patient’s care. It also allows the definition of personal assistants to do specific rules-based actions, such as cloning cases and synchronizing documents between federated environments (since both local and cloud environments may be used by different participants in the same case), accepting tasks, and more; copying between environments is essential so that each participant can have their information within their own domain of control, but with the ability to synchronize content and tasks. The personal assistants are pretty simple at this point, but the concept is that they are helping to coordinate communications, and the communications and documents are all distributed via encrypted channels, making them safer than email. A lot of similarities with Dominic’s intelligent process controllers, but on a more human scale.
As many thousands of these personal assistant interactions occur, patterns of the process flows between the people involved will begin to emerge, which can then be used to build more intelligence into the agents and the flows.

bpmNEXT 2014 Wednesday Afternoon 2: Unstructured Processes

We’re in the Wednesday home stretch; this session didn’t have a specific theme but it seemed to mostly deal with unstructured processes and event-driven systems.

The session started with John Reynolds and Amy Dickson of IBM on blending structured flow and event-condition-action patterns within process models. John showed how they are modeling ad hoc activities using BPMN (rather than CMMN): basically, disconnected activities can have precondition events and expressions specified as to when and how they are triggered, and can be identified as optional or mandatory. It’s not completely standard BPMN, but uses a relatively small number of extensions to indicate how the activity is started and whether it is optional or required. The user sees activities with different visual indicators to show which are required or optional, and whether an activity is still waiting for a precondition. This exposes the natural capabilities of the execution engine as an event handling engine; BPMN just provides a model for what happens next after an action occurs, as well as handling the flow model portions of the process. They’re looking at adding milestones and other constructs; this is an early pre-release version, and I expect that we’ll see some of these ideas rolling into their products over the months to come. An interesting way to combine process flows and ad hoc activities in the same (pre-defined) process while hiding some of the complexity of events from the users; also interesting in that this indicates some of IBM’s direction for handling ad hoc cases in BPM.
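The event-condition-action pattern behind those ad hoc activities is straightforward: each disconnected activity carries a precondition over the case data and an optional/required flag, and the engine enables the activity when an incoming event makes its precondition true. A minimal sketch of that mechanic (my own illustration, not IBM’s BPMN extension syntax):

```python
class AdHocActivity:
    """A disconnected activity with a precondition and an optional/required flag."""
    def __init__(self, name, precondition, required=False):
        self.name = name
        self.precondition = precondition  # predicate over the case data
        self.required = required
        self.enabled = False

class CaseEngine:
    def __init__(self, activities):
        self.activities = activities
        self.data = {}

    def on_event(self, key, value):
        """Each incoming event updates case data, then re-evaluates preconditions."""
        self.data[key] = value
        for act in self.activities:
            if not act.enabled and act.precondition(self.data):
                act.enabled = True   # activity now appears as startable to the user

activities = [
    AdHocActivity("Review claim", lambda d: d.get("claim_filed"), required=True),
    AdHocActivity("Escalate", lambda d: d.get("amount", 0) > 10000),  # optional
]
engine = CaseEngine(activities)
engine.on_event("claim_filed", True)
engine.on_event("amount", 25000)
print([(a.name, a.enabled, a.required) for a in activities])
```

The visual indicators described in the demo map directly onto the `enabled` and `required` flags here.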

Ashok Anand and R.V.S. Mani of Inswit presented their beta appiyo “business response platform”, an application development platform for small, simple BPM apps that can interconnect with social media such as Facebook; unfortunately, an overly-short demo followed an overly-long presentation, making it difficult to grasp much of the capability.

We finished the day with Jason Bloomberg of EnterpriseWeb discussing agent-oriented architecture for cross-process governance: a “style of EA that drives business agility by leveraging policy-based, data-driven intelligent agents”. They call their intelligent agent SmartAlex; it’s like Siri for the enterprise, dynamically connecting people and content at the right time in a goal-driven manner rather than with pre-defined processes. Every step is just an event that calls SmartAlex; SmartAlex interprets models, evaluates and applies policies and rules, then delivers results or makes recommendations using a custom interface and payload depending on the context. Agents can not only coordinate local processes, but also track what’s happening in all related processes across an enterprise to provide overall governance and support integrated functions. EnterpriseWeb isn’t a BPM tool; it’s a tool for building tools, including workflows. Bill Malyk joined remotely to do a demo on resolving conflicts of interest declaratively: he showed creating an application related to cases in the system, and declaring that potential conflict-of-interest cases are those with relationships between the people involved in the case. This immediately identified existing cases where there is a potential conflict of interest, and allowed navigation through the graph that links the cases and the criteria. He then demonstrated creating a process related to the application, which can then run flow-oriented processes based on potential conflicts of interest found using the declarative logic specified earlier. Some powerful capabilities for declarative, agent-based applications that take advantage of a variety of data sources and fact models, with greater flexibility and ease of use than complex event processing platforms.
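The declarative part of the demo — “potential conflict-of-interest cases are those with relationships between the people involved” — is essentially a query over a fact model rather than a procedural check. A rough sketch of that idea (hypothetical data and names, not EnterpriseWeb’s actual model):

```python
# Fact model: cases -> people involved; relationships -> pairs of related people
cases = {
    "case-1": {"alice", "bob"},
    "case-2": {"carol", "dave"},
    "case-3": {"erin", "frank"},
}
relationships = {frozenset({"alice", "bob"}), frozenset({"carol", "erin"})}

def potential_conflicts(cases, relationships):
    """Flag any case where two of its participants are related to each other."""
    flagged = []
    for case_id, people in cases.items():
        for rel in relationships:
            if rel <= people:          # both related people are on this case
                flagged.append(case_id)
                break
    return flagged

print(potential_conflicts(cases, relationships))  # alice and bob are related
```

Because the rule is declarative, adding a new relationship fact immediately re-flags existing cases, which is the behavior shown in the demo.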

My brain is full, so it must be time for dinner and another evening of drinks and conversation; I’ll be back tomorrow with another full morning and half afternoon of sessions.

bpmNEXT 2014 Wednesday Afternoon 1: Mo’ Models

Denis Gagne of Trisotech was back after lunch at bpmNEXT demonstrating socializing process change with their BPMN web modeler. He showed their process animation feature, which allows you to follow the flow through a process and see what happens at each step, and view rich media that has been attached at any given step to explain that step. He showed a process for an Amazon order, where each step had a slideshow or video attached to show the actual work that was being performed at that step; the tool supports YouTube, Slideshare, Dropbox and a few others natively, plus any URL as an attachment to any element in the process. The animated process can be referenced by a URL, allowing it to be easily distributed and socialized. This provides a way for people to learn more about the process, and can be used as an employee training tool or a customer experience enhancement. Even without the rich media enhancements, the process animation can be used to debug processes and find BPMN logical errors (e.g., deadlocks, orphan branches) by allowing the designer to walk through the process and see how the tokens are processed through the model – most modeling tools only check that the BPMN is syntactically correct, not for more complex logical errors that can result in unexpected and unwanted scenarios. Note that this is different from process simulation (which they also offer), which is typically used to estimate performance based on aggregate instances.
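Token-based walkthroughs catch logical errors that a syntax check misses: for example, an exclusive gateway feeding a parallel join deadlocks, because the join waits for tokens on both incoming paths but only ever receives one. A toy token simulator (my own sketch, not Trisotech’s animation engine) makes the problem visible:

```python
# Flows: XOR split "x" routes the token down only ONE branch,
# but the parallel join "j" waits for tokens on BOTH incoming flows.
flows = [("start", "x"), ("x", "a"), ("x", "b"), ("a", "j"), ("b", "j"), ("j", "end")]
parallel_joins = {"j"}

def incoming(node):
    return [f for f in flows if f[1] == node]

def simulate(chosen_branch):
    """Push one token from start; return True if it ever reaches 'end'."""
    tokens = {("start", "x")}      # tokens live on sequence flows
    arrived = {}                   # join -> flows that have delivered a token
    frontier = ["x"]
    while frontier:
        node = frontier.pop()
        if node == "end":
            return True
        if node in parallel_joins:
            arrived.setdefault(node, set()).update(t for t in tokens if t[1] == node)
            if arrived[node] != set(incoming(node)):
                continue           # join still waiting: deadlock if nothing else moves
        outs = [f for f in flows if f[0] == node]
        if node == "x":            # exclusive gateway: token takes one path only
            outs = [f for f in outs if f[1] == chosen_branch]
        for f in outs:
            tokens.add(f)
            frontier.append(f[1])
    return False

print(simulate("a"))  # the parallel join never gets its second token
```

Whichever branch the exclusive gateway chooses, the token never reaches the end event — exactly the kind of logical error the animated walkthrough surfaces.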

Bruce Silver took a break from moderating to do a demo together with Stephan Fischli and Antonio Palumbo of itp commerce, showing wizard-based generation of “good BPMN”, an outgrowth of their BPMessentials collaboration for BPMN training and certification. Bruce’s book BPMN Method and Style, as well as his courses, attempts to teach good BPMN, where the process logic is evident from the printed diagram in spite of things that can tend to confuse a reader, such as hierarchical modeling forms. He uses a top-down methodology where you identify the start and end states of a process instance, then decompose the process into 6-10 steps where each is an activity aligned with the process instance (i.e., no multi-instance activities), and enumerate the possible end states of each activity if there is more than one, so that end states within subprocesses can be matched to gateways that immediately follow the subprocesses. This all takes a bit of a developer’s mindset that’s typically not seen in business analysts who might be creating BPMN models, meaning that we can still end up with spaghetti process models even in BPMN. Bruce walked through an order-to-cash scenario, then Stephan and Antonio took over to demonstrate how their tool creates a BPMN model based on a wizard that walks through the steps of the BPMN method and style: first, the process start and (one or more) end states; then a list of the major steps, where each is named, the end states enumerated and (optionally) the performer identified; then the activity-end state pairs are listed so that the user can specify the target (following step), which effectively creates the process flow diagram; then each activity can be expanded as a subprocess by listing the child activities and the end states; finally, the message flows and lanes are specified by stating which activities have incoming and outgoing message flows.
The wizard then creates the BPMN process model in the itp commerce Visio tool, where all of the style rules are enforced. Without doubt, this creates better BPMN, although specifying a branching process model via a list of activities and end states might not be much more obvious than creating the process model directly. I know that the itp commerce and some other BPMN modeling tools can also check a BPMN model for violations of the style rules; I assume that detecting and fixing the rule violations in an existing model is just another way of achieving the same goal.
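The wizard’s core transformation is mechanical: given each activity’s end states and a target for every (activity, end state) pair, the sequence flows fall out directly, with an exclusive gateway inserted wherever an activity has more than one end state. A sketch of that mapping (my own simplification of the method, not itp commerce’s implementation):

```python
from collections import defaultdict

# (activity, end_state) -> next activity; multiple end states imply a gateway
targets = {
    ("Check Credit", "approved"): "Fulfill Order",
    ("Check Credit", "rejected"): "Notify Customer",
    ("Fulfill Order", "done"): "END",
    ("Notify Customer", "done"): "END",
}

def build_flows(targets):
    """Emit (source, target, label) flows; activities with more than one
    end state get an XOR gateway with one labeled branch per end state."""
    by_activity = defaultdict(list)
    for (act, state), nxt in targets.items():
        by_activity[act].append((state, nxt))
    flows = []
    for act, outcomes in by_activity.items():
        if len(outcomes) == 1:
            flows.append((act, outcomes[0][1], None))
        else:
            gateway = act + "?"
            flows.append((act, gateway, None))
            for state, nxt in sorted(outcomes):
                flows.append((gateway, nxt, state))   # branch labeled by end state
    return flows

for src, dst, label in build_flows(targets):
    print(src, "->", dst, f"[{label}]" if label else "")
```

Matching subprocess end states to the gateways that immediately follow — the heart of the method-and-style rules — is exactly this pairing applied one level down.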

Last up before the afternoon break was Gero Decker of Signavio, demonstrating the combination of process modeling and enterprise architecture. Signavio’s only product is their process modeler – used to model, collaborate, publish and implement models – which means that they typically deal with process designers and process centers of excellence. However, they are now running into EA modelers as they start to move into process architecture and governance, and application owners for application lifecycle management. EA modelers have to deal with the issue of whether to use a unified tool with a single object repository for all modeling and unified model governance, or multiple best-of-breed tools where metamodels can be synchronized and may be slaved between tools. Signavio is pushing the second alternative, where their tool integrates with or overlays other tools such as SAP Solution Manager and leanIX. Signavio has added ArchiMate open standard enterprise architecture model types to their tool for EA modeling, creating linkages and tracing from ArchiMate objects to BPMN models. Gero demonstrated what the ArchiMate models look like in Signavio, then how processes in leanIX can be directly linked to Signavio process models, as well as having applications from the EA portfolio available as performers to specify in a Signavio process model. Process models created in Signavio that use applications from the portfolio then show up (via automated synchronization) in leanIX as references to those applications. He also showed an integration with Effektif for approving changes to the process model in Signavio, comparing the before and after versions of the flow, since there is a pluggable connector to Signavio from Effektif processes. Connections to other tools could be built using the Signavio REST API. Nice integration between process models and application portfolio models in separate tools, as well as the model approval workflow.

bpmNEXT 2014: BPMN MIWG Demo

The BPMN Model Interchange Working Group is all about (as you might guess from the name) interchanging BPMN models between different vendors’ products: something that OMG promised with the BPMN standard, but which never actually worked out of the box due to contradictions in the standard and misinterpretations by some vendors. To finish off Wednesday morning at bpmNEXT, we have a live demo involving 12 different tools with participants in different locations, with Denis Gagne of Trisotech (who chairs the working group) and Falko Menge of camunda (who heads up the test automation subgroup) on the stage, a few others here on the sidelines, some at the OMG meeting in Reston, and some in their offices in France and Poland.

To start, different lanes of the process were designed by four different people on IBM Blueworks Live, Activiti, camunda and W4; each then exported their process models and saved them to Dropbox. Denis switched back and forth between the different screens (they were all on a Google Hangout) to show us what was happening as they proceeded, and we could see the notifications from Dropbox as the different files were added. In the second stage, Bonitasoft was used to augment the Blueworks Live model, itp-commerce edited the Activiti model, and Signavio edited the camunda model. In the third stage, ADONIS was used to merge the lanes created in several of the models (I lost track of which ones) into a single process model, and Yaoqiang was used to merge the Signavio and camunda models. Then, the Trisotech Visio BPMN modeler was used to assemble the ADONIS and Yaoqiang models into the final model with multiple pools. At the end, the final model was imported into a number of different tools: the Trisotech web modeler, the Oracle BPM modeler, the bpmn.io environment from camunda, and directly into the W4 execution engine (without passing through a modeling environment). Wow.
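The merge steps amount to XML surgery on the BPMN interchange format: each single-pool file contributes its `process` element, plus a `participant` entry in a shared `collaboration`, to one combined `definitions` document. A rough, minimal sketch of that assembly, assuming well-formed single-pool inputs:

```python
import xml.etree.ElementTree as ET

NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"
ET.register_namespace("", NS)

def single_pool(proc_id):
    """Stand-in for a BPMN file exported by one of the single-pool tools."""
    return f'<definitions xmlns="{NS}"><process id="{proc_id}"/></definitions>'

def merge_pools(docs):
    """Combine the <process> elements of several BPMN files under one
    <definitions>, with a <collaboration> holding one participant per pool."""
    merged = ET.Element(f"{{{NS}}}definitions")
    collab = ET.SubElement(merged, f"{{{NS}}}collaboration", id="collab")
    for doc in docs:
        proc = ET.fromstring(doc).find(f"{{{NS}}}process")
        ET.SubElement(collab, f"{{{NS}}}participant",
                      id="pool_" + proc.get("id"), processRef=proc.get("id"))
        merged.append(proc)
    return merged

merged = merge_pools([single_pool("claims"), single_pool("underwriting")])
print(ET.tostring(merged, encoding="unicode"))
```

Real tools also have to merge the diagram interchange (BPMN DI) layout section, which is where much of the interchange pain historically lived.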

The files exchanged were BPMN XML files, and the only limitation on which tool could be used at which stage was that some tools only support a single pool, so they had to be used at the earlier stages where each tool was modeling only a single lane or pool. This is how BPMN was supposed to work, but the MIWG has found a number of inconsistencies in the standard, as well as some issues with the vendors’ tools that had to be corrected.

They have developed a number of test cases that cover the descriptive and analytic classes within BPMN, and automated tools to test the outcome of different vendors’ modelers. Over 20 BPMN modelers have been tested for import, export and roundtrip capabilities; if you’re a BPMS vendor supporting BPMN 2.0 (or claiming to), you should be on this list because there are a lot of us who just aren’t going to write our own XSLT to translate your models into something that can be read by another tool. If you’re a process designer using a BPMS, shame your vendor into participating because it creates a more flexible and inclusive environment for your design and modeling efforts.

This is hugely valuable work that they’re doing in the working group; note that you don’t have to be an OMG member to get involved, and the BPMN MIWG would love to have others join in to help make this work even better.

We’re off for lunch and a break now, then back for six more sessions this afternoon. Did I mention how awesome bpmNEXT is?

bpmNEXT 2014 Wednesday Morning: Cloud, Synthetic APIs and Models

I’m not going to say anything about last night, but it’s a bit of a subdued crowd here this morning at bpmNEXT.

We started the day with Tom Baeyens of Effektif talking about cloud workflow simplified. I reviewed Effektif in January at the time of launch and liked the simple and accessible capabilities that it offers; Tom’s premise is that BPM is just as useful as email, and it needs to be just as simple to use as email so that we are not reliant on a handful of power users inside an organization to make it work. To do this, we need to strip out features rather than add them, and reduce the barriers to trying it out by offering it in the cloud. Inspired by Trello (simple task management) and IFTTT (simple cloud integration, which basically boils every process down to a trigger and an action), Effektif brings personal DIY workflow to the enterprise while also providing a bridge to enterprise process management through layers of functionality. Individual users can get started building their own simple workflows to automate their day-to-day tasks, then more technical resources can add functionality to turn these into fully-integrated business processes. Tom gave a demo of Effektif, starting with creating a checklist of items to be completed, with the ability to add comments, include participants and add attachments to the case. There have been a few changes since my review: you can use Markdown to format comments (I think that understanding of Markdown is very uncommon in business, and it may not be as well-adopted as, for example, a TinyMCE formatted text field); cases can now be started by a form as well as manually or via email; and Google Drive support is emerging to support usage patterns such as storing an email attachment when the email is used to instantiate the case. He also talked about some roadmap items, such as migrating case instances from one version of a process definition to another.
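The IFTTT-style reduction — every process starts as a trigger plus an action, with richer layers added later by technical users — is easy to picture in code. A toy dispatcher (my own sketch of the pattern, not Effektif’s engine):

```python
workflows = []

def workflow(trigger, action):
    """Register a workflow as a (trigger, action) pair;
    triggers are predicates over incoming events."""
    workflows.append((trigger, action))

log = []
# A form submission starts a case; an email gets attached to one.
workflow(trigger=lambda e: e.get("type") == "form_submitted",
         action=lambda e: log.append(f"start case for {e['user']}"))
workflow(trigger=lambda e: e.get("type") == "email_received",
         action=lambda e: log.append(f"attach email from {e['from']}"))

def dispatch(event):
    for trigger, action in workflows:
        if trigger(event):
            action(event)

dispatch({"type": "form_submitted", "user": "alice"})
dispatch({"type": "email_received", "from": "bob@example.com"})
print(log)
```

The “layers of functionality” Tom described amount to letting developers replace these one-line actions with integrated service calls without changing the user-facing trigger model.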

Next up was Stefan Andreasen of Kapow (now part of Kofax) on automation of manual processes with synthetic APIs – I’m happy for the opportunity to see this because I missed seeing anything about Kapow during my too-short trip to the Kofax Transform conference a couple of weeks ago. He walked through a scenario of a Ferrari dealership that looks up SEC filings to see who sold their stock options lately (and hence has some ready cash to spend on a Ferrari), narrows that down with Bloomberg data on age, salary and region to find some pre-qualified sales leads, then loads them into Salesforce. Manually, this would be an overwhelming task, but Kapow can create synthetic APIs on top of each of these sites/apps to allow for data extraction and manipulation, then run those on a pre-set schedule. He started with a “Kapplet” (an application for business analysts) that extracts the SEC filing data, allows easy manual filtering by criteria such as filing amount and age, then lets the user select records for committal to Salesforce. The idea is that there are data sources out there that people don’t think of as data sources, and many web applications that don’t easily integrate with each other, so people end up manually copying and pasting (or re-keying) information from one screen to another; Kapow provides the modern-day equivalent of screen-scraping that taps into the presentation logic and data (not the physical layout or style, hence less likely to break when the website changes) of any web app to add an API using a graphical visual flow/rules editor. Building by example, elements on a web page are visually tagged as being list items (requiring a loop), data elements to extract, and much more.
It can automate a number of other things as well: Stefan showed how a local directory of cryptically-named files can be renamed to their actual titles based on a table-of-contents HTML document; this is very common for conference proceedings, and I have hundreds of file sets like this that I would love to rename. The synthetic APIs are exposed as REST services, and can be bundled into Kapplets so that the functionality is exposed through an application that is usable by non-technical users. Just as Tom Baeyens talked about lowering the barriers for BPM inside enterprises in the previous demo, Kapow is lowering the bar for application integration to serve those unmet needs.
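That file-renaming scenario is a good illustration of what a synthetic API does: extract structured data from a presentation format, then act on it. A stdlib-only sketch of the idea (hypothetical file names and TOC markup, nothing Kapow-specific):

```python
import os, re
from html.parser import HTMLParser

# A table-of-contents page linking cryptic file names to real session titles
TOC = """<ul>
<li><a href="s01.pdf">Opening Keynote</a></li>
<li><a href="s02.pdf">BPMN Everywhere</a></li>
</ul>"""

class TocParser(HTMLParser):
    """Collect href -> link-text pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.href, self.titles = None, {}
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.href = dict(attrs).get("href")
    def handle_data(self, data):
        if self.href:
            self.titles[self.href] = data.strip()
            self.href = None

def rename_plan(toc_html):
    """Map each cryptic file name to a filesystem-safe human-readable one."""
    parser = TocParser()
    parser.feed(toc_html)
    return {old: re.sub(r"[^\w\- ]", "", title) + os.path.splitext(old)[1]
            for old, title in parser.titles.items()}

print(rename_plan(TOC))  # then: os.rename(old, new) for each pair
```

Kapow’s value-add over a hand-rolled script like this is the by-example visual editor and the scheduling/packaging around it.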

It would be great if Tom and Stefan put their heads together at lunch and whipped up an Effektif-Kapow demo; it seems like a natural fit.

Next was Scott Menter of BP Logix on a successor to flowcharts, namely their Process Director Gantt-chart-style process interface – he said that he felt like he was talking about German Shepherds to a conference of cat-lovers – as a different way to represent processes that is less complex to build and modify than a flow diagram, and also provides better information on temporal aspects and dependencies, such as when a process will complete and the impacts of delays. Rather than a “successor” model such as a flowchart, which models what comes after what, a Gantt chart is a “predecessor” model, which models the preconditions for each task: a subtle but important difference when the temporal dependencies are critical. Although you could map between the two model types on some level, BP Logix has a completely different model designer and execution engine, optimized for a process timeline. One cool thing about it is that it incorporates past experience: the time required to do a task in the past is overlaid on the process timeline, and predictions are made for how well this process is doing based on current instance performance and past performance, including for tasks that are far in the future. In other words, predictive analytics are baked right into each process since it is a temporal model, not an add-on such as you would have in a process based on a flow model.
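A predecessor model makes temporal prediction a first-class calculation: with an expected duration per task (drawn, say, from past instances), a forward pass over the dependency graph yields an expected finish time for every task, including ones far in the future. A simplified sketch of that forward pass (my own illustration, not BP Logix’s engine):

```python
# task -> (predecessors, expected duration in days, e.g. from past instances)
tasks = {
    "gather":  ([], 2),
    "review":  (["gather"], 3),
    "approve": (["review"], 1),
    "publish": (["review", "approve"], 1),
}

def earliest_finish(tasks):
    """Forward pass: a task starts once all of its predecessors have finished."""
    finish = {}
    def visit(name):
        if name in finish:
            return finish[name]
        preds, duration = tasks[name]
        start = max((visit(p) for p in preds), default=0)
        finish[name] = start + duration
        return finish[name]
    for name in tasks:
        visit(name)
    return finish

print(earliest_finish(tasks))
```

Replaying the same pass with actual durations substituted for completed tasks is what lets a timeline engine continuously re-predict the end date as the instance runs.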

For the last demo of this session, Jean-Loup Comeliau of W4 presented their BPMN+ product, which provides model-driven development using BPMN 2, UML 2, CMIS and other standards to generate web-based process applications without generating code: the engine interprets and executes the models directly. The BPMN modeling is pretty standard compared to other process modeling tools, but they also allow UML modeling of the data objects within the process model; I see this in fuller-stack tools such as TIBCO’s, but it is less common from the smaller BPM vendors. Resources can be assigned to user tasks using various rules, and user interface forms are generated based on the activities and data models, and can be modified if required. The entire application is deployed as a web application. The data-centricity is key: if the models change, the interface and application will automatically update to match. There is definitely a strong message here on the role of standards, and how we need more than just BPMN if we’re going to have fully model-driven application development.
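Generating the user interface from the data model is what keeps everything in sync: change the model, regenerate, and the forms follow. A minimal sketch of that model-driven step (my own illustration with a made-up data model, not W4’s generator):

```python
# A UML-class-like data model for a task's payload: field name -> type
order_model = {"customer": "string", "quantity": "integer", "rush": "boolean"}

# Map model types to HTML input widgets
WIDGETS = {"string": "text", "integer": "number", "boolean": "checkbox"}

def generate_form(model):
    """Derive one HTML input per model field; regenerating after a model
    change keeps the form and the data model automatically consistent."""
    rows = [f'<label>{name} <input name="{name}" type="{WIDGETS[ftype]}"></label>'
            for name, ftype in model.items()]
    return "<form>\n" + "\n".join(rows) + "\n</form>"

print(generate_form(order_model))
```

Adding a field to `order_model` and rerunning the generator is the whole change process — the same property that makes the W4 approach attractive at application scale.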

We’re taking a break, and will be back for the Model Interchange Working Group demonstration with participants from around the world.