SapphireNow 2015 Day 2 Keynote with Bernd Leukert

The second day of SAP’s SAPPHIRENOW conference started with Bernd Leukert discussing how some customers’ employees worry about being disintermediated by the digital enterprise, and how the digital economy can instead be used to accentuate the promise of your original business to make your customers happier without spending the same amount of time (and hopefully, money) on enterprise applications. It’s not just about changing technologies but about changing business models and leveraging business networks to address the changing world of business. All true, but I still see a lot of resistance to the digital enterprise in large organizations, with both mid-level management and front-line workers feeling threatened by new technologies and business models until they can see how these can benefit them.

Although Leukert is on the stage, the real star of the show is S/4HANA: the new generation of their Business Suite ERP solutions based natively on the in-memory HANA data and transaction engine for faster processing, a simplified data model for easier analytics and faster reconciliation, and a new user interface built on their Fiori user experience platform. With the real-time analytical capabilities of HANA, spanning non-SAP as well as S/4HANA data from finance and logistics, they are moving from being just a system of record to a full decision support system. We saw a demo of a manufacturing scenario that walked through a large order process, with financial and logistics data presented together in real time to make recommendations on how to deal with a shortage in fulfilling an order. Potential solutions — in this case, reallocating stock from one customer to another, higher-priority customer — are presented with a predicted financial score, allowing the user to select one of the options. Nice demo of analytics and financial predictions directly integrated with order processing.

(Images: order processing dashboard; order processing recommendations; order process simulation results)

The new offering is modular, with plug-ins for their other products such as Concur and SuccessFactors to enhance the suite capabilities, and it runs both in the cloud and on-premise. There are lots of reasons to transition, but this type of new functionality requires significant work to adopt the new programming model: both on SAP’s side in building the new platform, and on the customers’ side in refactoring their applications to take advantage of the new features. This will likely take months, if not years, for widespread adoption by customers that have highly customized solutions (isn’t that all of them?), in spite of the obvious advantages. As we have seen with other vendors who completely re-architect their product, new customers are generally very happy to start on the new platform, but existing customers can take years even when there is a certified migration path. Still, since the February launch, 400 customers have committed to S/4HANA, and they now support all 25 industries that they serve.

As we saw last year, SAP is pushing existing customers to first migrate to HANA as the underlying database in their existing systems (typically displacing Oracle), a non-trivial but straightforward operation that is likely to improve performance; then, to reconsider whether the customizations in their current system are handled out of the box by S/4HANA or can be easily re-implemented based on the simpler data model and more functional capabilities. Sounds good, and I imagine that they will get a reasonable share of their existing customers to take the first step and migrate to HANA, but the second step starts to look more like a new implementation than a simple migration, which will scare off a lot of customers. Leukert invited a representative from their customer Asian Paints to the stage to talk about their migration: they have moved to HANA and the simplified finance core functionality, and are still working on implementing the simplified logistics and other modules with a vision of soon being completely on S/4HANA. A good success story, but indicative of the length of time and amount of work required to migrate. For them, definitely worth the trip, since they have been able to re-imagine their business model to reach new markets through a better understanding of their customers and their own business data.

He moved on to talk about the HANA Cloud Platform (HCP), a general-purpose application development platform that can be used to build applications unrelated to SAP applications, or to build extensions to SAP functionality. He mentioned an E&Y application built on HCP for fraud detection that is directly integrated with core SAP solutions, just one of 1,000 or more third-party applications available on the HCP marketplace. HCP provides structured and unstructured data models, geospatial and predictive capabilities, the Fiori UX platform as a service, mobile support, an analytics portfolio, and integration layers that connect directly to your business, both on the device side through IoT events and into the operational business systems. Following the big IoT push that we saw in the panel yesterday, Siemens has selected HCP as their cloud platform for IoT: the Siemens Cloud for Industry. Peter Weckesser of Siemens joined Leukert on stage to talk more about this newly-launched platform, and how it can be added to their customer installations as a monitoring (not control) layer: remote devices, such as sensors on manufacturing equipment, push their event streams to the Siemens cloud (based on HCP) in public, hybrid or on-premise configurations; analytics can then be applied for predictive maintenance scheduling as well as aggregate operational optimization.
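
As a minimal sketch of that device-to-cloud pattern (the endpoint below is entirely hypothetical, and this is not the actual HCP IoT services API), a device-side agent pushing JSON event records might look something like this:

```python
import json
import random
import time
import urllib.request

# Hypothetical ingestion endpoint; the real HCP IoT services define their own
# API. This only illustrates the device-to-cloud event push pattern.
INGEST_URL = "https://iot.example.com/ingest/device-events"

def read_sensor():
    """Simulate a vibration/temperature sensor on manufacturing equipment."""
    return {
        "deviceId": "press-014",
        "timestamp": time.time(),
        "vibrationMm": round(random.gauss(2.0, 0.4), 3),
        "temperatureC": round(random.gauss(65.0, 2.0), 1),
    }

def push_event(event):
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    while True:
        push_event(read_sensor())
        time.sleep(10)  # sampling interval; real devices would batch and buffer
```

The predictive maintenance analytics then run against the accumulated event store in the cloud, rather than on the device itself.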

We saw a demo based on the CenterPoint IoT example from the panel yesterday, showing monitoring and maintenance of energy distribution networks: tracking the health of transformers, grid storage and other devices, and identifying equipment failures, sometimes before they even happen. CenterPoint already has 100,000 sensors out in the field, and since this is integrated with S/4HANA, it’s not just monitoring: an operator can trigger a work order directly from the predictive equipment maintenance analytics dashboard.

(Images: energy grid analytics; energy grid analytics drill-down)

Leukert touched on the HANA roadmap, with the addition of Hadoop and Spark Cluster Manager to handle infinite volumes of data, then welcomed Walmart CIO Karenann Terrell to discuss what it is like to run a really large HANA implementation. Walmart serves 250 million customers per week through 11,000 locations with 2.2 million employees, generating a staggering amount of data in their daily operations, including literally trillions of financial transactions. Because technology is so core to managing this well, she pointed out that Walmart is creating a technology company in the middle of the world’s largest retail company, which allows them to stay focused on the customer experience while reducing costs. Their supply chain is extensive, since they are directly plugged into many of their suppliers, and innovating along that supply chain has driven them to partner with SAP more closely than most other customers. HANA allows them to have 5,000 people hitting data stores of a half-billion records simultaneously with sub-second response time for a real-time view of their supply chain, making them a true data-driven retailer and shooting them to the top of yesterday’s HANA Innovation Awards. She finished by saying that seeing S/4HANA implemented at Walmart in her lifetime is on her bucket list, which got a good laugh from the audience but highlighted the fact that this is not a trivial transition for most companies.

Leukert finished with an invitation — or maybe it was a challenge — to use S/4HANA and HCP to reinvent your business: “clean your basement” by removing unnecessary customization in your current SAP solutions or converting it to extensions on HCP or S/4HANA; change your business model to become more data-driven; and leverage business networks to expand the edges of your value chain. Thrive, don’t just survive.

Steve Singh, CEO of Concur (acquired by SAP last December) then took over to look at reinventing the employee travel experience, from booking through trip logistics to expense reporting. For companies with large numbers of traveling employees, managing travel can be a serious headache from both a logistics and a financial standpoint. Concur does this by creating a business network (or a network of networks) that directly integrates with suppliers — such as airlines and car rental companies — for booking and direct invoice capture, plus easy functions for inputting travel expenses that are not captured directly from the supplier. I heard comments yesterday that SAP already has travel and expense management, and although Concur’s functionality in that area is likely a bit better, the networks that they bring are the real prize here. The networks, for example, allow for managing the extraction of an employee who finds themselves in a disaster or other dangerous travel scenario, and so become part of a broader human resources risk management strategy.

At the press Q&A later, Leukert fielded questions about how they have simplified the complete core of their ERP solution in terms of data model and functionality, but still have work to do on some industry modules: although all 25 industries are supported as of now in the on-premise version, the cloud version still needs some tinkering under the hood and additional migration work. They’re also still working on the cloud version of everything else, and are recommending the HCM and CRM standalone products where the older Business Suite versions don’t meet requirements. In other words, it’s not done yet, although core portions are fully functional. Singh talked about the value of business networks such as Ariba in changing business models, and sees that products such as Concur using HCP and the SAP business networks will help drive broader adoption.

There was a question on the ROI of migration to S/4HANA: it’s supposed to run 1,800 times faster than previous versions, but customers may not be seeing much (if any) savings, opening things up to competitive displacement. I heard this same sentiment from some customers last night at the HANA Innovation Awards reception; since there is little or no cost reduction in terms of license and deployment costs, they need to make the case based on the additional capabilities that HANA enables, such as real-time analytics and predictions that allow companies to run their businesses differently, plus a longer-term reduction in IT complexity and maintenance costs. Since a lot of more traditional companies don’t yet see the need to change their business models, this can be a hard sell, but eventually most companies will need to come around to the need for real-time insights and actions.

IoT Solutions Panel at SapphireNow 2015

Steve Lucas, president of platform solutions at SAP, led a panel on the internet of things at SAPPHIRENOW 2015. He kicked off with some of their new IoT announcements: SAP HANA Cloud Platform (HCP) for IoT with free access to SAP SQL Anywhere embeddable database for edge intelligence; a partner ecosystem that includes Siemens and Intel; and customer success stories from Tennant and Tangoe. Their somewhat complex marketecture diagram shows a fairly comprehensive IoT portfolio that includes connecting to people and things at the edges of your value chain, and integrating the events that they generate to optimize your core corporate planning and reporting, providing real-time insights and automated decisioning. The cloud platform is key to enabling this, since it provides the fabric that weaves all of the data, actions, rules and decisions into a single connected enterprise.

(Image: SAP IoT marketecture diagram)

He was joined on stage by Austin Swope, who demonstrated remote equipment monitoring using a tiny but operational truck on the stage, complete with onboard sensors that pushed events and data to the cloud for remote monitoring and problem detection. We saw some of the real-time analytics (when the wifi cooperated) on-screen while the truck ran around the stage, and some of the other types of dashboards and analytics that would be used for broader equipment management programs. Since the equipment is now fully instrumented, analytics can be used to visualize and optimize operations: reducing costs, improving maintenance cycles, and increasing equipment load factors through a better understanding of what each piece of equipment is doing at any given time.

Next, Lucas was joined by Gary Hayes, CIO of CenterPoint Energy; Paul Wellman, CIO of Tennant; and Peter Weckesser, CEO of Customer Service, Digital Factory at Siemens. Hayes talked about how CenterPoint is using smart meters, grid storage, digital distribution networks and other IoT-enabled technologies to drastically reduce costs and improve service, while maintaining safety and security standards; they’re starting to use predictive analytics on HANA to model and predict underground cable failures, among several other innovations in intelligent energy management. Wellman discussed how Tennant, which has fleets of large-scale cleaning machines such as you would see in conference centers and airports, has added telemetry to provide machine monitoring and predictive maintenance, and exposes this information to customers so that they can understand and reduce costs themselves through fleet management and usage. Last up, Weckesser talked about how Siemens devices (of which there are millions out there in a variety of industrial applications) generate events that can be analyzed to optimize industrial plants and machines as well as energy and resources. As an SAP partner, Siemens is offering an open cloud platform for industry customers based on HANA; customers can easily connect their existing Siemens devices to the Siemens Cloud for Industry apps via public cloud, private cloud or on-premise infrastructure. This allows them to do analysis for predictive maintenance on individual machines, as well as aggregate fleet operations optimization, through apps provided by Siemens, SAP, SAP partners or the customers themselves.

I was disappointed not to see the SAP Operational Process Intelligence offering involved in this discussion: it seems a natural fit since it can be used to monitor events and control processes from a variety of underlying systems and sources, including event data in HANA. However, good to see that SAP is providing some real-world examples of how they are supporting their customers’ and partners’ IoT efforts through the HANA Cloud Platform.

Consolidated Inbox in SAP Fiori at SapphireNow 2015

I had a chance to talk with Benny Notheis at lunchtime today about the SAP Operational Process Intelligence product directions, and then attended his session on a consolidated inbox that uses SAP’s Fiori user experience platform to provide access to SAP’s Business Suite workflow, BPM and Operational Process Intelligence work items, as well as work items from non-SAP workflow systems. SAP has offered a few different consolidated inboxes over the years — some prettier than others — but they all serve the same purpose: to make things easier for users by providing a single point of contact for all work items, and easier for IT by reducing maintenance and support. In the case of the Fiori My Inbox, it also provides a responsive interface across mobile and desktop devices. Just as the underlying database and transaction platform for SAP is converging on HANA, all user experience for applications and analytics is moving to Fiori. Fiori (and therefore the consolidated My Inbox) is not yet available on the cloud platform, but that’s in the works.

As a consolidated work list manager, My Inbox supports multiple device types including mobile, manages work items from multiple systems in a single list, and is fully integrated into the Fiori launchpad. It has some nice features such as mass approvals, full-text searching, sorting and filtering, and sharing tasks via email and SAP JAM; work items can have attachments, comments and custom attributes that are exposed in the work list UI or by launching the UI specific to the work item.

We saw a demo of My Inbox, with a user-configurable view that allows workers to create filtered lists within their inbox for specific task types or source systems in order to organize their work in the way that they want to view it. Work items can be viewed and managed in the work list view within Fiori, or the work item launched for full interaction using its native UI. Tasks can be forwarded to other users or suspended, and task type-specific actions such as approve and reject can be applied. Attachments can be added and viewed directly from the work list view, as can direct links into other systems. The history for a work item is maintained directly in My Inbox for viewing by the user, although the underlying workflow systems are likely also maintaining their own separate history logs; this provides a more collaborative history by allowing users to add comments that become part of the My Inbox history.

Emailing a task to a user sends a direct link to the task but does not interrogate or allocate access rights; I assume that this could mean that a task could be sent to someone who does not have rights to open or edit the task, and the original sender would not be informed. Within any list view, a multi-select function can be used to select multiple items for approval; these all have to be approval-type items rather than notifications, so this might be most useful in a list view that is filtered for a single task type. There is no view of tasks that a user delegated or completed — a sort of Sent Items box — so a user can’t monitor the progress of something that they forward to someone else. Substitutions for out-of-office times are set in My Inbox, meaning that the user does not need to visit each of the underlying systems of record to set up substitution rules; these rules can be applied based on task groups, which are established by how task profiles are set up during the initial technical configuration.

A good demonstration of the new generation of SAP user experience, and how Fiori can be used in a production transaction-oriented environment. There obviously needs to be a fair amount of cooperation between the Fiori-based My Inbox and the systems of record that contribute work items: My Inbox needs to be able to interrogate quite a bit of data from each work item, send actions, and manage user substitution rules via a common task consumption model that interacts with gateways to each type of underlying system. There is likely still quite a bit of work to do in those integration points to make this a fully-functional universal inbox, especially for systems of record that are more reluctant to yield their secrets to other systems; SAP has published specifications for building task gateways that could then be plugged into this model, which would expose work items from any system in My Inbox via a compatible gateway.
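
To make the common task model idea concrete, here is a minimal sketch with invented names (it is not SAP’s published task gateway specification): each system of record is wrapped in a gateway that maps its work items into a shared task shape, and the inbox simply merges and sorts across gateways.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Iterable, Protocol

@dataclass
class Task:
    """Minimal common task model shared across all source systems."""
    task_id: str
    source_system: str
    subject: str
    created: datetime
    priority: int
    status: str = "READY"

class TaskGateway(Protocol):
    """Each system of record exposes its work items through a gateway
    implementing this interface (fetch, plus actions such as complete)."""
    def fetch_tasks(self, user: str) -> Iterable[Task]: ...
    def complete(self, task_id: str, outcome: str) -> None: ...

def consolidated_inbox(gateways: list, user: str) -> list:
    """Merge work items from all registered gateways into a single list,
    sorted by priority then age, as a unified work list would display them."""
    tasks = [task for gw in gateways for task in gw.fetch_tasks(user)]
    return sorted(tasks, key=lambda t: (t.priority, t.created))
```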


The next good trick will be to have a consolidated history log, combining the logs from My Inbox with those in the systems of record to build a more complete history of a work item for reporting and decisioning.
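
A rough sketch of the merge itself, assuming (hypothetically) that each system’s audit entries have already been mapped into a shared timestamp/source/event shape:

```python
from datetime import datetime
from heapq import merge

# Invented record shapes: (timestamp, source, event text). In practice each
# system's audit entries would first need mapping into a shared schema.
inbox_history = [
    (datetime(2015, 5, 6, 9, 15), "My Inbox", "Comment: please expedite"),
    (datetime(2015, 5, 6, 10, 2), "My Inbox", "Forwarded to A. Chen"),
]
workflow_history = [
    (datetime(2015, 5, 6, 8, 50), "Business Suite", "Work item created"),
    (datetime(2015, 5, 6, 11, 30), "Business Suite", "Approved"),
]

# heapq.merge lazily interleaves the already-sorted logs by timestamp
for ts, source, event in merge(inbox_history, workflow_history):
    print(f"{ts:%Y-%m-%d %H:%M}  [{source}]  {event}")
```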

SapphireNow 2015 Day 1 Keynote with Bill McDermott

Happy Cinco de Mayo! I’m back in Orlando for the giant SAP SAPPHIRE NOW and ASUG conference to catch up with the product people and hear about what organizations are doing with SAP solutions. If you’re not here, you can catch the keynotes and some of the other sessions online, either in real time or on demand. The wifi is swamped as usual, and my phone kicked from LTE down to 3G and on down to Edge before declaring No Service during the keynote. Since I’m blogging from my tablet/keyboard configuration, I didn’t have connectivity at the keynote (hardwired connections are provided for media/analysts, but my tablet doesn’t have a suitable port), so this will be posted sometime after the keynote and the press conference that follows.

We kicked off the 2015 conference with CEO Bill McDermott asking what the past can teach us about the present, plus a cat anecdote from his days as a door-to-door Xerox salesman, highlighting the need for empathy and understanding in business, in addition to innovation in products and services. From their Run Simple message last year, SAP is moving on to Making Digital Simple, since all organizations have a lot of dark data that could be exploited to make them data-driven and seamless across the entire value chain: doing very sophisticated things while making them look easy. There is a sameness about vendors’ messaging these days around the digital enterprise — data, events, analytics, internet of things, mobile, etc. — but SAP has a lot of the pieces to bridge the data divide, considering that their ERP systems are at the core of so many enterprises and that they have so many of the other pieces, including in-memory computing, analytics, BPM, B2B networks, HR systems and more. Earlier this year, SAP announced S/4HANA: the next generation of their core ERP suite, running on the HANA in-memory database and integrating with their Fiori user experience layer, providing a more modular architecture that runs faster, costs less to run and looks better. It’s a platform for innovation because of the functionality and platform support, and it’s also a platform for generating and exposing so much of the data that you need to make your organization data-driven. The HANA Cloud Platform also provides infrastructure for customer engagement, while allowing organizations to run their SAP solutions in on-premise, hybrid and cloud configurations.

SAP continues to move forward with HR solutions, and recently acquired Concur — the company that owns TripIt (an app that I LOVE) as well as a number of other travel planning and expense reporting tools — to better integrate travel-related information into HR management. Like many other large vendors, SAP is constantly acquiring other companies; as always, the key is how well they can integrate these into their other products and services, rather than simply adding “An SAP Company” to the banner. Done well, this provides more seamless operations for employees, and also provides an important source of data for analyzing and improving operations.

There were a few good customer endorsements, but the keynote was pretty light on content, and some of the new messaging (“Can a business have a soul?”) seemed a bit glib. The Stanley Cup made a short and somewhat superfluous appearance, complete with white-gloved handler. Also, there was a Twitter pool running on how many times the word “simple” was used in the keynote, another indication that the messaging might need a bit of fine-tuning.

There was a press conference afterwards, where McDermott was joined by Jonathan Becher and Steve Lucas to talk about some other initiatives (including a great SAP Store demo by Becher) and answer questions from press and analysts both here in Orlando and in Germany. There was a question about supporting Android and other third-party development; Lucas noted that HANA Cloud Platform is available now for free to developers as a full-stack platform for building applications, and that there are already hundreds of apps built on HCP that do not necessarily have anything to do with SAP ERP solutions. Building on HCP provides access to other information sources such as IoT data: Siemens, for example, is using HCP for their IoT event data. There’s an obvious push by SAP to their cloud platform, but even more so to HANA, either cloud or on-premise: HANA enables real-time transactions and reconciliations, something rarely available in ERP systems, while allowing for far superior analytics and data integration without complex customization and add-ons. Parts of the partner channel are likely a bit worried about this, since they exploit SAP’s past platform weaknesses by providing add-on products, customization and services that may no longer be necessary. In fact, an SAP partner that relies on the complexity of SAP solutions by providing maintenance services just released a survey claiming to show a lack of customer interest in S/4HANA; although this resulted in a flurry of sensational headlines today, if you look at the numbers, which show some adoption and quite a bit of non-committed interest — not bad for three months after release — it starts to look more like an act of desperation. It will be more interesting to ask this question a few quarters from now. HANA may also be seen as a threat to SAP’s customers’ middle management, who will be increasingly disintermediated as more information is gathered, analyzed and used to automatically generate decisions and recommendations, replacing the manually-collated reports that form the information fiefdoms within many organizations.

Becher and Lucas offered welcome substance as a follow-on to McDermott’s keynote; I expect that we’ll see much more of the product direction details in tomorrow’s keynote with Bernd Leukert.

London Calling To The Faraway Towns…For EACBPM

I missed the IRM Business Process Management Europe conference in London last June, but will be there this year from June 15-18 with a workshop, plus a breakout session and a panel session. It’s collocated with the Enterprise Architecture Europe conference, and you can attend sessions from either conference.

There are five tracks and 40 case studies over the three days of the conference, plus a day of pre-conference workshops. Here’s what I’m presenting:

  • On the morning of June 15, I’ll present a half-day workshop/tutorial on The Future of Work, looking at how work is changing in the face of changing technology and culture, and how to adapt your organization for this brave new world.
  • On the morning of June 17, I’ll give a breakout session that excerpts some of the material from the workshop on Changing Incentives for Knowledge Workers.
  • Also on the morning of June 17, I’ll be on a panel of “BPM Gurus” with Roger Burlton, Ron Ross and Howard Smith, moderated by Chris Potts, discussing ten years of BPM.

IRM runs a good conference with a lot of great content; hope to see you there. If you plan to attend, I have a 10% discount code that I can provide to colleagues: send me a note or add a comment here and I’ll send it to you.

bpmNEXT 2015 Day 3 Demos: Camunda, Fujitsu and Best In Show

Last demo block of the conference, and we’re focused on case management and unstructured processes.

Camunda, CMMN and BPMN Combined

Jakob Freund presented on OMG’s (relatively) new standard for case management modeling, CMMN, and how Camunda combines it with BPMN to create processes that have a combination of pre-defined flows and case structures. They use the Trisotech CMMN modeler embedded in their environment, running both CMMN and BPMN models on the same engine, and are looking at adding DMN for decision modeling as well. He demonstrated an insurance application example where BPMN is used to model the overall process, with the underwriting subprocess actually being a CMMN model within a BPMN model. The user task list can show a consolidated view of both BPMN tasks and CMMN tasks, or a dedicated UI can be used for a case, since it can also show enabled activities that are not yet instantiated (hence would not appear in a task list) as available user actions. BPMN processes can also be triggered from the CMMN model, providing pre-defined process fragments that the case worker can trigger to perform standard operations. He also showed their developer workbench, including a full-featured debugger with stepwise execution and the ability to execute code at any step. Since their paradigm is to provide process management services to a developer writing in Java, their tooling is more technical than what is found in a no-code or low-code environment. Also, a BPMN font.
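
Although the demo centered on Java tooling, Camunda also exposes a REST API; as a sketch, starting a process instance like the one demonstrated could look like the following. The base URL is Camunda’s default for the standalone distribution, while the process key and variables are invented for the example.

```python
import json
import urllib.request

BASE = "http://localhost:8080/engine-rest"  # default Camunda REST endpoint

def start_process(key: str, business_key: str, variables: dict) -> dict:
    """Start a BPMN process instance by its process definition key."""
    body = {
        "businessKey": business_key,
        "variables": {
            name: {"value": value, "type": "String"}
            for name, value in variables.items()
        },
    }
    req = urllib.request.Request(
        f"{BASE}/process-definition/key/{key}/start",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# "insuranceApplication" is a hypothetical process key; in the demo, the
# underwriting CMMN case was embedded within the deployed BPMN model itself.
instance = start_process("insuranceApplication",
                         business_key="policy-4711",
                         variables={"applicantName": "Jane Doe"})
print(instance["id"])  # the engine returns the new process instance id
```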

Fujitsu: Using Agents to Coordinate Patient Care across Independent Specialists

Keith Swenson finished the demos by presenting healthcare research from the University of Granada that helps create patient treatment plans based on rules and iterative goal-seeking rather than pre-defined processes. This allows different medical specialists to have their own sets of rules and best practices for dealing with their own specialization; automated agents can combine and negotiate the rules from multiple specialists to create a consolidated treatment plan for patients with multiple conditions, allowing each of the participants to monitor progress. He demonstrated a prototype/sample application that allows each specialist to set out a schedule of actions that make up a treatment plan; the multiple treatment plans are conciliated against each other — basically, modifying a plan by adding steps from another plan — and presented back to the referring physician, who can then select one of the plan processes for execution. He used the IActive Knowledge Studio to show how the plans and rules are designed, and discussed how the processes for the interacting agents would be emergent as they communicate and negotiate.
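
The conciliation step is the interesting part. Purely as an illustration of the idea, and nothing like the IActive implementation, folding one specialist’s plan into another’s while flagging same-day collisions for review might look like:

```python
# Illustrative only: each treatment plan is a mapping of day -> action
# proposed by one specialist; all data here is invented.

def conciliate(base_plan: dict, other_plan: dict):
    """Fold other_plan's steps into base_plan, flagging collisions."""
    merged = dict(base_plan)
    conflicts = []
    for day, action in other_plan.items():
        if day in merged and merged[day] != action:
            conflicts.append((day, merged[day], action))  # needs negotiation
        else:
            merged[day] = action
    return merged, conflicts

cardiology = {1: "start beta blocker", 7: "stress test"}
nephrology = {2: "renal panel", 7: "adjust diuretic"}

plan, conflicts = conciliate(cardiology, nephrology)
for day, a, b in conflicts:
    print(f"Day {day}: specialist review needed ({a!r} vs {b!r})")
```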

That’s it for bpmNEXT for me. Great conference, as always. As a matter of disclosure, I was not charged the conference fee to attend, although I paid my own travel and living expenses. A number of the vendors that I have written about here over the past three days are my clients or have been so in the past, but that did not allow them to escape the snarky Twitter comments.

Update: waiting to take off at Santa Barbara airport, and I see from the Twitter stream that SAP won the Best In Show award for their Internet of Everything demo – congratulations! Top five presentations: W4, Camunda, Trisotech, Bonitasoft and BP-3. Kudos all around. 

bpmNEXT 2015 Day 3 Demos: IBM (again), Safira, Cryo

It’s the last (half) day of bpmNEXT 2015, and we have five presentations this morning followed by the Best in Show award. Unfortunately, I have to leave at lunchtime to catch a flight, so you will have to check the Twitter hashtag to see who won — or maybe I’ll do a wrapup post from the road.

IBM: BPM, say Hello to Watson. A New Era of Cognitive Work – Here Today

First up was Chris Vavra, discussing how Watson’s cognitive computing and natural language analysis capabilities can be used in the context of BPM, acting as an expert advisor to knowledge workers to enhance, scale and accelerate their work with its (or as Chris said, “his”) reasoning capabilities. There are a number of Watson services offered on their Bluemix cloud development platform; he demonstrated an example of an HR hiring process where the HR person uses Watson to analyze a candidate’s personality traits as part of the evaluation process. This is based on a written personal statement provided by the candidate; Watson analyzes that text (or could link through to a personal website or blog) to provide a personality analysis. From the Bluemix developer dashboard, you can create applications that include any of the services, including Watson Personality Insights, which provides rankings on several factors within the five basic personality traits of Openness, Conscientiousness, Extraversion, Agreeableness and Emotional Range, with a graphical representation to highlight values and needs that may be of concern in the hiring process. It’s unlikely that a hiring manager would rely solely on this information to make a decision, but it’s interesting for exploring a candidate’s personality characteristics as part of the process. There are a number of other Watson-based services available on Bluemix to bind into BPM (and other) applications; in the IBM cloud BPM designer, this just appears as a service connector that can be configured with the Watson authentication information, and invoked at a service step in a process flow. Lots of other potential applications for bringing this level of expert recommendation into processes, such as healthcare condition diagnoses or drug interactions.
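
Outside of the BPM designer’s service connector, Personality Insights could also be called directly over REST. A rough sketch of such a call follows; the endpoint path, API version and credentials are placeholders, since each Bluemix service instance supplied its own at binding time.

```python
import base64
import json
import urllib.request

# Placeholders: real values came from the Bluemix service instance binding.
URL = "https://gateway.watsonplatform.net/personality-insights/api/v2/profile"
USER, PASSWORD = "service-username", "service-password"

def personality_profile(text: str) -> dict:
    """Send a candidate's written statement for personality trait analysis."""
    auth = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    req = urllib.request.Request(
        URL,
        data=text.encode("utf-8"),
        headers={
            "Content-Type": "text/plain",
            "Accept": "application/json",
            "Authorization": f"Basic {auth}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# personal_statement.txt is a placeholder input file; the response is a tree
# of traits rooted in the five basic personality dimensions.
profile = personality_profile(open("personal_statement.txt").read())
```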

Safira: Managing Unstructured Processes with AdHoc BPM Framework

Filipe Pinho Pereira addressed the long tail of organizations’ processes: only the high-volume, high-value structured processes are implemented as full BPM projects by IT, while the long tail of less critical and ad hoc processes ends up being handled manually. Using IBM BPM, he demonstrated their Ad-Hoc BPM Framework add-on that allows a business user to create a new ad-hoc process based on a predefined request-intervention process pattern, which has only an initial data capture/launch step, then a single “do it” human step with a loop that keeps returning to the same step until explicitly completed. The example was an expense report process, where a blank expense spreadsheet was attached, a form created to capture basic data, and SLAs specified. Routing is created by specifying the primary recipient, plus the notifications that will be issued on start, end and SLA violations. Users can then create an instance of that process (that is, submit their own expense report), which is then routed to the primary recipient; the only routing options at that point are Postpone, Forward and Complete, since it’s in the main human task loop part of the process pattern. This distills ad-hoc processes to their simplest form, where the current recipient of the main task decides who the next recipient is or whether to complete the task; this is functionally equivalent to an email-based process, but with proper process monitoring and SLA analytics. By looking at the analytics for the process, we saw the number of interventions (the number of times that the human step loop was executed for an instance), and the full history log could be exported for mining to detect patterns for process improvement. A good example of very simple user-created ad hoc processes on an industrial-strength infrastructure; you’re not going to buy IBM BPM just to run this, but if you’re already using IBM BPM for your high-volume processes, this add-on allows you to leverage that infrastructure for the long tail of your processes.
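
The pattern itself is simple enough to sketch in a few lines. This is just the shape of the request-intervention loop described above, with action names matching the demo; it is not Safira’s implementation:

```python
def run_ad_hoc_request(form_data: dict, first_recipient: str, get_action):
    """One capture step, then a single human step that loops until completed.
    get_action(recipient) returns ('postpone'|'forward'|'complete', arg)."""
    recipient = first_recipient
    interventions = 0
    while True:
        interventions += 1
        action, arg = get_action(recipient)
        if action == "complete":
            return {"form": form_data, "interventions": interventions}
        elif action == "forward":
            recipient = arg      # hand the single task to someone else
        elif action == "postpone":
            pass                 # same recipient, task stays in their queue

# Example: an expense report forwarded once before being completed.
script = iter([("forward", "manager"), ("complete", None)])
result = run_ad_hoc_request({"amount": 412.50}, "team-lead",
                            lambda recipient: next(script))
print(result["interventions"])  # -> 2, the metric shown in the analytics
```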

Cryo: Tools for Flexibility in Collaborative Processes

Rafael Fazzi Bortolini and Leonardo Luzzatto presented on processes that lie somewhere in the middle of the structured-unstructured spectrum, and how to provide flexibility and dynamic aspects within structured constraints through decision support, flexible operations, ad-hoc task execution and live changes to processes. Demonstrating with their Orquestra BPMS, they showed a standard process task interface with the addition of localized analytics based on the history of that task in order to help the user decide on their actions at that point. Flexible routing options allow the user to return the process to an earlier step, or forward the current task to a colleague for consultation before returning it to the original user at the same step; this does not change the underlying process model, but may move the instance between activities in a non-standard fashion or reassign it to users who were not included in the original process definition. They also have an ad-hoc process pattern, but unlike Safira, they are using actual ad-hoc activities in BPMN, that is, tasks that are not connected by flow lines. Users are presented with the available ad hoc tasks in the process model, allowing them to “jump” between the activities in any order. They also demonstrated live changes to production processes; the examples were adding a field to a form and changing the name of a task in the process, both of which are presumably loaded at runtime rather than embedded within the instantiated process to allow these types of changes.

bpmNEXT 2015 Day 2 Demos: Omny.link, BP-3, Oracle

We’re finishing up this full day of demos with a mixed bag of BPM application development topics, from integration and customization that aims to have no code, to embracing and measuring code complexity, to cloud BPM services.

Omny.link: Towards Zero Coding

Tim Stephenson discussed how extremely low-code solutions can be used to automate marketing processes in place of more costly marketing automation suites. Their Omny.link solution integrates workflow and decisioning with WordPress using JavaScript libraries, with detailed tracking and attribution, by providing forms, tasks, decision tables, business processes and customer management. He demonstrated an actual client solution, with custom forms created in WordPress, then referenced in a WordPress page (or post) that is used as the launch page for an email campaign. Customer information can be captured directly in their solution, or interfaced to another CRM such as Sugar or Salesforce. Marketers interact with a custom dashboard that allows them to define the tasks, workflows, decisions and customer information that drive the campaigns; Tim sees the decision tables as a key interface for marketers to create the decision points in a campaign based on business terms, using a format similar to the Excel spreadsheets that they might now be using to track campaign rules.
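
As an illustration of why a decision table works as a marketer-facing format (the columns and actions below are invented, not Omny.link’s schema), a first-match-wins campaign rule table is both easy to read and trivial to evaluate:

```python
# First matching row wins, reading top to bottom, just like a spreadsheet.
DECISION_TABLE = [
    # (segment,   opened_last_email, next_action)
    ("customer",  True,  "send-upsell-offer"),
    ("customer",  False, "send-reminder"),
    ("prospect",  True,  "assign-to-sales"),
    ("prospect",  False, "keep-nurturing"),
]

def decide(segment: str, opened_last_email: bool) -> str:
    for row_segment, row_opened, action in DECISION_TABLE:
        if row_segment == segment and row_opened == opened_last_email:
            return action
    return "no-action"  # fall-through when no rule matches

print(decide("prospect", True))  # -> assign-to-sales
```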

BP-3: Sleep at Night Again: Automated Code Analysis

Scott Francis and Ivan Kornienko presented their new code analysis tool, Neches, which applies a set of rules, based on best practices and anti-patterns from their years of development experience, to identify code and configuration issues in IBM BPM implementations that could adversely impact performance and maintainability. They propose that proper code reviews — including Neches reviews — at the end of each iteration of development can find design flaws as well as implementation flaws. Neches is a SaaS cloud tool that analyzes uploads of snapshots exported from the IBM BPM design environment; it scores each application based on complexity, which is compared to the aggregate of other applications analyzed, and can visualize the complexity score over time compared to found, resolved and fixed issues. The findings are organized by category, and you can drill into the categories to see the specific rules that have been triggered, such as UI page complexity or JavaScript block length, which can indicate potential problems with the code. The specific rules are categorized by severity, so that the most critical violations can be addressed immediately, while less critical ones are considered for future refactoring. Specific unused services, such as test harnesses, can be excluded from the complexity score calculation. An interesting tool for training new IBM BPM developers as well as for reviewing the code quality and maintainability of existing projects, leveraging the experience of BP-3 and Lombardi/IBM developers as well as general best coding practices.
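
The general shape of such a score (not Neches’ actual model; rule names and weights here are invented) is a severity-weighted sum over findings, with an exclusion list for things like test harnesses:

```python
SEVERITY_WEIGHTS = {"critical": 10, "major": 5, "minor": 1}

# Invented findings, standing in for what a snapshot analysis might report.
findings = [
    {"rule": "javascript-block-length", "severity": "major",    "count": 7},
    {"rule": "ui-page-complexity",      "severity": "critical", "count": 2},
    {"rule": "unused-variable",         "severity": "minor",    "count": 31},
]

def complexity_score(findings, excluded_rules=()):
    """Sum severity-weighted findings, skipping excluded rules (e.g., rules
    triggered only by test harnesses that shouldn't count against the app)."""
    return sum(
        SEVERITY_WEIGHTS[f["severity"]] * f["count"]
        for f in findings
        if f["rule"] not in excluded_rules
    )

print(complexity_score(findings))                                      # -> 86
print(complexity_score(findings, excluded_rules={"unused-variable"}))  # -> 55
```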

Oracle: Rapid Process Excellence with BPM in the Public Cloud

Linus Chow presented Oracle’s public cloud BPM service for developing both processes and rules, deployable in a web workspace or via mobile apps. He demonstrated an approval workflow, showing the portal interface, a monitoring view overlaid on the process model, and a mobile view that can include offline mode. The process designer is fully web-based, including forms and rules design; there are also web-based administration and deployment capabilities. This is Oracle’s first cloud BPM release and looks pretty full-featured in terms of human workflow; it’s a lightweight, public cloud refactoring of their existing Oracle BPM on-premise solution, but doesn’t include the business architecture or SOA functionality at this time.

Great day of demos, and lots of amazing conversations at the breaks. We’re all off to enjoy a free night in Santa Barbara before returning for a final morning of five more demos tomorrow.

bpmNEXT 2015 Day 2 Demos: Kofax, IBM, Process Analytica

Our first afternoon demo session included two mobile presentations and one on analytics, hitting a couple of the hot buttons of today’s BPM.

Kofax: Integrating Mobile Capture and Mobile Signature for Better Multichannel Customer Engagement Processes

John Reynolds highlighted the difficulty of automating processes that involve customers if you can’t link the real world — in the form of paper documents and signatures — with your digital processes. Kofax started in document scanning, and they’ve expanded their repertoire to include all manner of capture that can make processes more automated and faster to complete. Smartphones become intelligent scanners and signature capture devices, reducing latency in capturing information from customers. John demonstrated the Kofax Mobile Capture app, both natively and embedded within a custom application, using physical documents and his iPhone: it captures images of a financial statement, a utility bill and a driver’s license, then pre-processes them on the device to remove irregularities that might impact automated character recognition, and thresholds them to binary images to reduce the data transmission size. These can then be directly injected into a customer onboarding process, with both the scanned image and the extracted data included, for automated or manual validation of the documents to continue the process. He showed the back-end tool used to train the recognition engine by manually identifying the data fields on sample images, which can accept a variety of formats for the same type of document, e.g., driver’s licenses from different states; this is done by a business person who understands the documents, not developers. Similarly, you can also use their Kapow Design Studio to train their system on how to extract information from a website (John was having the demo from hell, and his Kapow license had expired) by marking the information on the screen and walking through the required steps to extract the required data fields. They take on a small part of the process automation, mostly around the capture of information for front-end processes such as customer onboarding, but are seeing many implementations moving toward an “app” model of several smaller applications and processes being used for an end-to-end process, rather than a single monolithic process application.
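
The on-device cleanup step is a standard image-processing move. A rough open-source approximation, using OpenCV rather than Kofax’s actual pipeline (which also does perspective correction and much more), is grayscale conversion plus automatic binarization:

```python
import cv2

# Convert a phone photo of a document into a clean binary image before
# transmission; file names are placeholders.
image = cv2.imread("utility_bill.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# A light blur suppresses sensor noise so the thresholding is cleaner
gray = cv2.GaussianBlur(gray, (3, 3), 0)

# Otsu's method picks the black/white threshold automatically, which copes
# with the varying lighting and paper stock of different documents
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)

# The binarized PNG is a fraction of the size of the original JPEG photo
cv2.imwrite("utility_bill_binary.png", binary)
```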

IBM: Mobile Case Management and Capture in Insurance

Mike Marin and Jonathan Lee continued on the mobile theme, stressing that mobile is no longer optional for customer-facing and remote worker functionality. They demonstrated IBM Case Manager for an insurance example, showing how mobile functionality could enhance the claims process through mobile capture, content management and case handling. Unlike the Kofax scenario where the customer uses the mobile app, this is a mobile app for a knowledge worker, the claims adjuster, who may need a richer informational context and more functionality, such as document type classification, than a customer would use. They captured the (printed and filled) claims form and a photo of the vehicle involved in the claim using a smartphone, then viewed the more complete case on a tablet, which showed more case data and related tasks. The supervisor view shows related cases plus a case visualizer that presents a timeline view of the case. They finished with a look at the new IBM mobile UI design concepts, which presented a more modern mobile interface style including a high-level card view and smoother transitions between information and functions.

Process Analytica: Process Discovery and Analytics in Healthcare Systems

Robert Shapiro shifted the topic to process mining/discovery and analytics, specifically in healthcare applications. He started with an overview of process mining, simulation and other analytical techniques, and how to integrate with different types of healthcare systems via their history logs. By examining existing processes based on the history data, missed KPIs and their root causes can be identified, and potential solutions derived and compared in a systematic and analytic manner. Using their Optima process analytics workbench, he demonstrated importing and analyzing an event log to create a BPMN model based on the history of events: this is a complete model that includes interrupting and non-interrupting boundary events, and split and merge gateways based on the patterns of events, with probabilistic weights and/or decision logic calculated for the splitting gateways. Keeping in mind that the log events come from systems that have no explicit process model, the automatic derivation of the boundary events and gateways and their characteristics provides a significant step in process improvement efforts, and can be further analyzed using their simulation capabilities. Most of the advanced analysis and model derivation (e.g., for gateway and boundary conditions) is dependent on capturing data value changes in the event logs, not just activity transitions; this is an important distinction, since many event logs don’t capture that information.
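
For readers who want to experiment with the basic discovery step (though not the gateway-condition derivation that makes Optima interesting), the open-source pm4py library can mine a BPMN model from a standard XES event log; the file names here are placeholders:

```python
import pm4py

# Read an event log exported from the source system in XES format
log = pm4py.read_xes("hospital_history.xes")

# Derive a BPMN model from the observed activity sequences using the
# inductive miner, then save it for viewing in any BPMN editor
bpmn_model = pm4py.discover_bpmn_inductive(log)
pm4py.write_bpmn(bpmn_model, "discovered_process.bpmn")
```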

bpmNEXT 2015 Day 2 Demos: Sapiens Decision, Signavio

We finished the morning demo sessions with two on the theme of decision modeling and management.

Sapiens: How to Manage Business Logic

Michael Grohs highlighted the OMG release of the Decision Model and Notation (DMN) standard, and how the decision model is really a business logic model. However, business rule management systems are typically technical solutions, and don’t do much for business users and analysts trying to model their decision logic and rules based on their policies and procedures. Decision-aware processes extract declarative knowledge from process models, greatly simplifying the process models and moving the declarative information to a model format more suitable to business logic, such as a decision table. BPMS and DMS are complementary, and can be combined to create a complete model of the business process. He provided a demo of their decision modeling and repository tooling, which starts with the definition of a community space that shares a glossary, attributes and models, and has governance workflows for decision model approval and deployment. The glossary allows for the definition of fact types, including multiple synonyms so that different stakeholders can use their own terminology. The decision models are made up of rule families that capture the business logic, with a visual syntax that indicates the rules and conditions that make up a particular decision. This can be expanded into a full decision table style that shows the if-then-else logic using the business terms. Different instances of decisions and rule sets can be created — in his demo example, the insurance policy renewals base logic versus that for a hurricane-prone state such as Florida — and visually compared in the graphical or tabular view, with changes highlighted or listed in detail in a report. Rule sets can be validated to highlight conflicts and missing information, then exported in a variety of formats for importing into a DMS for execution.
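
Rule-set validation lends itself to a quick illustration. As a toy example, with invented conditions that have nothing to do with Sapiens’ actual rule families, checking a rule set for gaps in its condition coverage might look like:

```python
# Rules map a (state, risk_tier) condition to an outcome; all data invented.
RULES = {
    ("FL", "high"):     "decline",
    ("FL", "standard"): "surcharge-20pct",
    ("TX", "high"):     "surcharge-35pct",
    # ("TX", "standard") is deliberately missing
}

STATES = ["FL", "TX"]
TIERS = ["high", "standard"]

def find_gaps(rules):
    """Return condition combinations that no rule covers. With a dict, one
    condition can't map to two outcomes; with a rule *list*, you would also
    group by condition and flag duplicate conditions as conflicts."""
    return [(s, t) for s in STATES for t in TIERS if (s, t) not in rules]

for gap in find_gaps(RULES):
    print(f"No rule covers {gap}")  # -> No rule covers ('TX', 'standard')
```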

Signavio: Business Decision Management

Gero Decker talked about their collaborative process design and SAP upgrade tools as an introduction, but mainly addressed decision modeling and how they are embracing the DMN standard: modeling decisions, inputs and knowledge sources, then linking that to a decision activity in a BPMN model. DMN provides a graphical model form, and also allows for decision tables for detailed steps. Like Sapiens, Signavio does only decision modeling, not execution, and exports in standard formats for importing into a decision management system such as Drools for execution. They are releasing the Signavio Decision Manager in a few weeks, and he gave us a preview demo of modeling and testing rules integrated with their process modeling environment. Similar to the modeling that we saw from Comindware earlier this morning, Signavio can be used to model higher-level enterprise architecture constructs such as value chains, plus full BPMN models for specific capabilities within those models; he used a BPMN model as a jumping-off point for demonstrating decision modeling by creating a business rule task. From that point, you can specify a decision table directly in situ, or choose to create a DMN model, which launches the DMN modeler with the top-level question/answer in the DMN model linked to the business rule activity from the BPMN model. The DMN model can be built out graphically, data objects defined, rules added with decision tables, and sub-decisions added as required. The DMN modeler can make use of the existing glossary in the Signavio environment for data objects and attributes. The decision tables can be validated to detect conflicts, and test cases can be exported in a spreadsheet format to drive manual or automated testing. They are also doing some work on detecting complex decision logic within BPMN models, with the goal of refactoring the models to externalize the decision logic into DMN models where it makes the BPMN model unnecessarily complex.