Upcoming webinar on digital transformation in financial services featuring @BPMdotcom and @ABBYY_USA – and my white paper

Something strange about receiving an email about an upcoming webinar, featuring two people who I know well…

 …then scrolling down to see that ABBYY is featuring the paper that I wrote for them as follow-on bonus material!

Nathaniel Palmer and Carl Hillier are both intelligent speakers with long histories in the industry; tune in to hear them talk about the role that content capture and content analytics play in digital transformation.

bpmNEXT 2018: All about bots with Cognitive Technology, PMG.net, Flowable

We’re into the afternoon of day 2 of bpmNEXT 2018, with another demo section.

RPA Enablement: Focus on Long-Term Value and Continuous Process Improvement, Cognitive Technology

Massimiliano Delsante of Cognitive Technology presented their myInvenio product for analyzing processes to determine where gaps exist and creating models for closing those gaps through RPA task automation. The demo started with loading historical process data for process mining, which created a process model from the data together with activity resources, counts and other metrics; the model was then checked for conformance against a reference model to determine the frequency and performance of conformant and non-conformant cases. The process discovery model can be transformed to a BPMN model and its performance simulated. With a baseline data set of all manual activities, the system identified the cost of each activity, helping to identify which activities would result in the greatest savings if automated, and fed the data for actual resources used into the simulation scenario; adjusting the required resources by specifying the number of RPA robots that could be deployed at specific tasks allows what-if simulation of process performance with an RPA implementation. An analytics dashboard provides visualization of the original process discovery and the simulated changes, with performance trends over time. Predictive analytics can be applied to running processes to, for example, predict which cases will not meet their deadlines, and to perform some root cause analysis of the problems. Doing this analysis requires that you have information about the cost of the RPA robots as well as being able to identify which tasks could be automated with RPA. Good integration of process discovery, simulation, analysis and ongoing monitoring.
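
To give a sense of the arithmetic behind this kind of what-if analysis, here's a minimal sketch; the activity names, volumes, costs and robot parameters are invented for illustration, and this is not myInvenio's algorithm.

```python
# Illustrative what-if arithmetic only, not myInvenio's actual algorithm.
# Activity volumes, costs and robot parameters below are invented for the sketch.

ACTIVITIES = {
    # name: (cases per month, manual cost per case, manual minutes per case)
    "Enter invoice data": (12000, 4.50, 6.0),
    "Validate PO match":  (12000, 3.00, 4.0),
    "Resolve exceptions": (1800, 11.00, 15.0),
}

ROBOT_COST_PER_MONTH = 900.0       # assumed license + maintenance per robot
ROBOT_MINUTES_PER_MONTH = 40000.0  # assumed robot capacity per month

def what_if(automate, robots):
    """Estimate monthly savings if the named activities are handled by `robots` robots."""
    automated_minutes = sum(ACTIVITIES[a][0] * ACTIVITIES[a][2] for a in automate)
    if automated_minutes / ROBOT_MINUTES_PER_MONTH > robots:
        return None  # not enough robot capacity under these assumptions
    saved_labour = sum(ACTIVITIES[a][0] * ACTIVITIES[a][1] for a in automate)
    return saved_labour - robots * ROBOT_COST_PER_MONTH

for n in (1, 2, 3):
    print(n, "robots:", what_if(["Enter invoice data", "Validate PO match"], n))
```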

Integration is Still Cool, and Core in your BPM Strategy, PMG.net

Ben Alexander from PMG.net focused on integration within BPM as a key element for driving innovation by increasing the speed of application development: integrating services for RPA, ML, AI, IoT, blockchain, chatbots and whatever other hot new technologies can be brought together in a low-code environment such as PMG. His demo showed a vendor onboarding application, adding a function/subprocess for assessing probability of vendor approval using machine learning by calling AzureML, user task assignment using Slack integration or SMS/phone support through a Twilio connector, and RPA bot invocation using a generic REST API. Nice demo of how to put all of these third-party services together using a BPM platform as the main application development and orchestration engine.
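
As a hypothetical illustration of the generic REST invocation of an RPA bot, here's a short sketch; the endpoint, payload shape and credential are placeholders, not PMG's or any particular RPA vendor's API.

```python
# Hypothetical example of kicking off an RPA bot through a generic REST API,
# as described in the demo; the URL, payload and token below are placeholders.
import requests

RPA_ENDPOINT = "https://rpa.example.com/api/v1/jobs"   # placeholder URL
API_TOKEN = "..."                                      # placeholder credential

def start_bot(bot_name: str, inputs: dict) -> str:
    """Queue a bot run and return the job id reported by the (hypothetical) RPA service."""
    resp = requests.post(
        RPA_ENDPOINT,
        json={"bot": bot_name, "inputs": inputs},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["jobId"]

job_id = start_bot("vendor-onboarding-checks", {"vendorId": "V-1234"})
print("started RPA job", job_id)
```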

Making Process Personal, Flowable

Paul Holmes-Higgin and Micha Keiner from Flowable presented on their Engage product for customer engagement via chat, using chatbots to augment rather than replace human chat, and modeling the chatbot behavior using standard modeling tools. In particular, they have found that a conversation can be modeled as a case with dynamic injection of processes, with the ability to bring intelligence into conversations, and the added benefit of the chat being completely audited. The demo was around the use case of a high-wealth banking client talking to their relationship manager using chat, with simultaneous views of both the client and relationship manager UI in the Flowable Engage chat interface. The client mentioned that she moved to a new home, and the RM initiated the change address process by starting a new case right in the chat by invoking a context-sensitive digital assistant. This provided advice to the RM about address change regulatory rules, and provided a form in situ to collect the address data. The case then progressed through a combination of chat messages for collaboration between the human participants, forms filled directly in the chat window, and confirmation by the client via chat when presented with the information to be updated. Potential issues, such as compliance regulations due to a country move, are raised to the RM, and related processes execute behind the scenes that include a compliance officer via a more standard task inbox interface. Once the compliance process completes, the RM is informed via the chat interface. Behind the scenes, there’s a standard address change BPMN diagram, where the chat interface is integrated through service activities. They also showed replacing the human compliance decision with a decision table that was created (and manually edited if necessary) based on a decision tree generated by machine learning on 200,000 historical address change cases; rerunning the scenario skipped the compliance officer step and approved the change instantaneously. Other automated chat tasks that the RM can invoke include setting reminders, retrieving customer information and more using natural language processing, as well as other types of more structured cases and processes. Great demo, and an excellent look at the future of chat interfaces in process and case management.
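
To give a rough idea of how a decision table could be derived from historical cases, here's a sketch that trains a decision tree on synthetic address-change data and prints it as readable rules; the feature names, data and thresholds are invented, and this is not Flowable's actual tooling.

```python
# Sketch only: learn a decision tree from (synthetic) historical address-change
# cases and print it as human-readable rules that could seed a decision table.
# Feature names, data and the labeling rule are invented, not Flowable's implementation.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 5000
country_change = rng.integers(0, 2, n)        # 1 = client moved country
high_risk_country = rng.integers(0, 2, n)     # 1 = destination flagged
account_value = rng.uniform(0, 5_000_000, n)  # assets under management

# Synthetic label: refer to compliance if country changed and risk or value is high
refer = ((country_change == 1) &
         ((high_risk_country == 1) | (account_value > 1_000_000))).astype(int)

X = np.column_stack([country_change, high_risk_country, account_value])
tree = DecisionTreeClassifier(max_depth=3).fit(X, refer)

print(export_text(tree, feature_names=["country_change", "high_risk_country", "account_value"]))
```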

bpmNEXT 2018: Here’s to the oddballs, with ConsenSys, XMPro and BPLogix

And we’re off with the demo sessions!

Secure, Private Decentralized Business Processes for Blockchains, ConsenSys

Vanessa Bridge of ConsenSys spoke about using BPMN diagrams to create smart contracts and other blockchain applications, while also including privacy, security and other necessary elements: essentially, using BPM to enable Ethereum-based smart contracts (rather than using blockchain as a ledger for BPM transactions and other BPM-blockchain scenarios that I’ve seen in the past). She demonstrated using Camunda BPM for a token sale application, and for a boardroom voting application. For each of the applications, she used BPMN to model the process, particularly the use of BPMN timers to track and control the smart contract process — something that’s not native to blockchain itself. Encryption and other steps were called as services from the BPMN diagram, and the results of each contract were stored in the blockchain. Good use of BPM and blockchain together in a less-expected manner.
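
As a rough illustration of how a BPMN service step might hand off to blockchain code, here's a sketch of an external task worker against the Camunda 7 REST API that stubs out the on-chain call; the topic name, URLs, variables and contract interaction are placeholders, not the implementation shown in the demo.

```python
# Sketch of a worker that services a BPMN external task (Camunda 7 REST API) and
# records the result on chain. Topic name, URLs and the contract call are
# placeholders; this is not the ConsenSys implementation shown in the demo.
import requests

CAMUNDA = "http://localhost:8080/engine-rest"   # placeholder engine URL
WORKER_ID = "blockchain-worker-1"

def record_on_chain(vote: str) -> str:
    """Placeholder: submit the vote to an Ethereum smart contract (e.g. via web3)
    and return the transaction hash."""
    return "0x..."  # assumption: real code would sign and send a transaction here

def poll_once():
    # lock up to one task on the (hypothetical) "record-vote" topic
    tasks = requests.post(f"{CAMUNDA}/external-task/fetchAndLock", json={
        "workerId": WORKER_ID,
        "maxTasks": 1,
        "topics": [{"topicName": "record-vote", "lockDuration": 10000}],
    }, timeout=30).json()

    for task in tasks:
        vote = task["variables"]["vote"]["value"]
        tx_hash = record_on_chain(vote)
        # complete the BPMN task, passing the transaction hash back to the process
        requests.post(f"{CAMUNDA}/external-task/{task['id']}/complete", json={
            "workerId": WORKER_ID,
            "variables": {"txHash": {"value": tx_hash, "type": "String"}},
        }, timeout=30)

poll_once()
```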

Turn IoT Technology into Operational Capability, XMPro

Pieter van Schalkwyk of XMPro looked at the challenges of operationalizing IoT, with a virtual flood of data from sensors and machines that needs to be integrated into standard business workflows. This involves turning big data into smart data via stream processing before passing it on to the business processes in order to achieve business outcomes. XMPro provides smart listeners and agents that connect the data to the business processes, forming the glue between realtime data and resultant actions. His demo showed data being collected from a fan on a cooling tower, bringing in data from the sensor logs and comparing it to the manufacturer’s information and historical information in order to predict if the fan is likely to fail, create a maintenance work order, and even optimize maintenance schedules. They can integrate with a large library of action agents, including their own BPM platform or other communication and collaboration platforms such as Slack. They provide a lot of control over their listener agents, which can be used for any type of big data, not just industrial device data, and integrate complex and accurate prediction models for failure likelihood and remaining useful life. He showed their BPM platform that would be used downstream from the analytical processing, where the internet of things can interact with the internet of people to make additional decisions required in the context of additional information such as 3D drawings. Great example of how to filter through hundreds of millions of data points in streaming mode to find the few combinations that require action to be taken. He threw out a comment at the end that this could be used for non-industrial applications, possibly for GDPR data, which definitely made me think about content analytics on content as it’s captured in order to pick out which of the events might trigger a downstream process, such as a regulatory process.
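
For flavor, here's a minimal sketch of the kind of listener logic described: a rolling window of sensor readings compared against an assumed manufacturer limit, raising an action only for a sustained anomaly. The thresholds, field names and work-order call are invented, not XMPro's agents.

```python
# Illustrative stream-listener logic only: compare a rolling window of fan
# vibration readings against an assumed manufacturer limit and raise an action.
# Thresholds, field names and the work-order call are invented for the sketch.
from collections import deque

MANUFACTURER_LIMIT = 7.1   # assumed vibration limit (mm/s RMS)
WINDOW = 50                # readings in the rolling window

recent = deque(maxlen=WINDOW)

def create_work_order(asset_id, reason):
    print(f"work order raised for {asset_id}: {reason}")  # stand-in for a BPM/CMMS call

def on_reading(asset_id, vibration):
    """Called for every sensor event; only the rare sustained anomaly triggers action."""
    recent.append(vibration)
    if len(recent) == WINDOW and sum(recent) / WINDOW > 0.8 * MANUFACTURER_LIMIT:
        create_work_order(asset_id, "sustained vibration near manufacturer limit")
        recent.clear()

# simulate a stream of slowly rising readings
for i in range(200):
    on_reading("cooling-tower-fan-3", 5.0 + i * 0.01)
```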

Business Milestones as Configuration, BPLogix

Scott Menter and Joby O’Brien of BPLogix finished up this section on new BPM ideas with their approach to goal orientation in BPM, which is milestone-based and requires understanding the current state of a case before deciding how to move forward. Their Process Director product is not BPMN-based, but rather an event-based platform where events are used to determine milestones and drive the process forward: much more of a case management view, usually visualized as a project management-style Gantt chart rather than a flow model. They demonstrated the concept of app events, where changes in state of any of a number of things — form fields, activities, document attachments, etc. — can record a journal entry that uses business semantics and process instance data. This allows events from different parts of the platform to be brought together in a single case journal that shows the significant activity within the case, but also to be triggers for other events such as determining case completion. The journal can be configured to show only certain types of events for specific users — for example, if they’re only interested in events related to outgoing correspondence — and also becomes a case collaboration discussion. Users can select events within the journal and add their own notes, such as taking responsibility for a meeting request. They also showed how machine learning and rules can be used for dynamically changing data; although shown as interactions between fields on forms, this can also be used to generate new app events. Good question from the audience on how to get customers to think about their work in terms of events rather than procedures; case management proponents will say that business people inherently think about events/state changes rather than processes. Interesting representation of creating a selective journal based on business semantics rather than just logging everything and expecting users to wade through it for the interesting bits.
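
To make the app-event idea concrete, here's a toy sketch of a case journal where every state change is recorded with business semantics, filtered per user interest, and used to detect a milestone; the event names and completion rule are invented, not Process Director's model.

```python
# Toy model of an app-event journal: every state change is recorded with business
# semantics, the journal can be filtered per user interest, and certain event
# combinations mark a milestone. Names and rules are invented, not BPLogix's model.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AppEvent:
    case_id: str
    kind: str          # e.g. "correspondence.sent", "form.field_changed"
    description: str
    timestamp: datetime = field(default_factory=datetime.utcnow)

class CaseJournal:
    def __init__(self):
        self.events: list[AppEvent] = []

    def record(self, event: AppEvent):
        self.events.append(event)
        if self.milestone_reached(event.case_id):
            print(f"milestone reached for case {event.case_id}")

    def view(self, kinds: set[str]):
        """Selective view, e.g. a user who only cares about outgoing correspondence."""
        return [e for e in self.events if e.kind in kinds]

    def milestone_reached(self, case_id: str) -> bool:
        # invented rule: case is complete once approval and correspondence both happened
        kinds = {e.kind for e in self.events if e.case_id == case_id}
        return {"approval.granted", "correspondence.sent"} <= kinds

journal = CaseJournal()
journal.record(AppEvent("C-42", "approval.granted", "Manager approved request"))
journal.record(AppEvent("C-42", "correspondence.sent", "Decision letter emailed"))
print(journal.view({"correspondence.sent"}))
```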

We’re off to lunch. I’m a bit out of practice at live-blogging, but hope that I captured some of the flavor of what’s going on here. Back with more this afternoon!

The collision of capture, content and analytics

Martyn Christian of UNDRSTND Group, whom I worked with at FileNet back in 2000-2001, gave a keynote at ABBYY Technology Summit 2017 on the evolution and ultimate collision of capture, content and analytics. He started by highlighting some key acquisitions in the industry, including the entry of private capital, as well as a move to artificial intelligence in the capture space, as harbingers of the changes in the capture market. Gartner has since declared enterprise content management dead — long live content services platforms! — and introduced new players in the Magic Quadrant alongside the traditional ECM players, while shifting IBM from the leaders quadrant back to the challengers quadrant.

Intelligent capture is gaining visibility and importance, particularly as a driver for digital transformation. Interestingly, capture was traditionally about converting analog (paper) to digital (data); now, however, many forms of information are natively digital, and capture is not only about performing OCR on scanned paper documents but about extracting and analyzing actionable data from both analog and digital content. High-volume in-house production scanning operations are being augmented — or replaced — by customers doing their own capture, such as we now see with depositing a check using a mobile banking application. Information about customer actions and sentiment is being automatically gleaned from their social media activity. Advanced machine learning is being used to classify content, reducing the need for manual intervention further downstream, and enabling straight-through processing or the use of autonomous agents.
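
As a minimal illustration of ML-based content classification (not any particular capture product's implementation), here's a sketch that routes a confidently classified document straight through and sends low-confidence ones to manual review; the training snippets, labels and threshold are invented.

```python
# Minimal sketch of ML-based content classification for captured documents;
# the training snippets, labels and confidence threshold are invented, and
# production capture products use far richer models than this bag-of-words classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "invoice number 1234 total due 30 days",
    "please remit payment for attached invoice",
    "claim form for vehicle accident on highway",
    "insurance claim policy number damage estimate",
]
train_labels = ["invoice", "invoice", "claim", "claim"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_texts, train_labels)

# A confidently classified document can be routed straight through;
# low confidence falls back to human review.
doc = "total amount due on invoice 98765"
proba = clf.predict_proba([doc]).max()
label = clf.predict([doc])[0]
print(label if proba > 0.6 else "route to manual review", round(proba, 2))
```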

As a marketing guy, he had a lot of advice on how this can be positioned and sold to customers; UNDRSTND apparently ran a workshop yesterday for some of the channel partner companies on bringing this message to their customers who are seeking to move beyond simple capture solutions to digital transformation.

Strategy to execution – and back: it’s all about alignment

I recently wrote a paper sponsored by Software AG called Strategy To Execution – And Back, which you can find here (registration required). From the introduction:

When planning for business success, corporate management sets business strategy and specifies goals in terms of critical success factors and key performance indicators (KPIs). Although senior management is not concerned with the technical details of how business operations are implemented, they must have confidence that the operations are aligned with the strategy, and be able to monitor performance relative to the goals in real time.

In order to achieve operational alignment, there must be a clear path that maps strategy to execution: a direct link from the strategic goals in the high-level business model, through IT development and management practices, to the systems, activities and roles that make the business work. However, that’s only half the story: there must also be a path back from execution to strategy, allowing operational performance to be measured against the objectives in order to guide future strategy. Without both directions of traceability, there’s a disconnect between strategy and operations that can allow a business to drift off course without any indication until it’s far too late.

I cover how you need to have links from your corporate strategy through various levels of architecture to implementation, then be able to capture the operational metrics from running processes and roll those up relative to the corporate goals. If you don’t do that, then your operations could just be merrily going along their own path rather than working towards corporate objectives.
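
A toy example of that execution-to-strategy rollup, with invented KPIs, targets and measurements, might look like this:

```python
# Toy rollup of operational metrics against corporate KPIs, to illustrate the
# execution-to-strategy feedback path; goals, metrics and numbers are invented.
GOALS = {
    # KPI: (target, direction)
    "onboarding_cycle_time_days": (5.0, "max"),
    "first_contact_resolution":   (0.80, "min"),
}

def rollup(process_metrics):
    """Compare measured operational values to strategic targets."""
    report = {}
    for kpi, (target, direction) in GOALS.items():
        value = process_metrics[kpi]
        ok = value <= target if direction == "max" else value >= target
        report[kpi] = ("on track" if ok else "off track", value, target)
    return report

measured = {"onboarding_cycle_time_days": 6.3, "first_contact_resolution": 0.84}
for kpi, (status, value, target) in rollup(measured).items():
    print(f"{kpi}: {status} (measured {value}, target {target})")
```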

Analytics customer keynote at TIBCONOW 2016

Michael O’Connell hosted the last general session for TIBCO NOW 2016, focusing on analytics customer stories with the help of five customers: State Street, Shell, Vestas, Monsanto and Western Digital. I’m not going to try to attribute specific comments to the customer representatives, just capture a few thoughts as they go by.


  • Spotfire is allowing self-service analytics to be pushed down to the business users
  • Typically, the analysis going on in a number of different solutions — from Excel to BI tools — can be consolidated onto a single analytics platform
  • Analytics is allowing the business to discover the true nature of their business, especially with outliers
  • Real-time analytics on physical processes (e.g., supply chain) generates significant benefits
  • Providing visual analytics to business changes the way that they use data and collaborate across the organization
  • The enterprise-class back-end and the good visualizations in Spotfire are helping it to win over both IT and business areas
  • Data and events are being generated faster and in greater volumes from more devices, making desktop analytics solutions impractical
  • Business users who are not data specialists can understand — and leverage — fairly complex analytical models when it concerns their own data
  • Analytics about manufacturing quality can be used to identify potential problems before they occur

We finished up with a brief presentation from Fred Ehlers, VP of IT at Norfolk Southern, about their use of TIBCO products to help manage their extensive railway operations. He talked about optimizing their intermodal terminals, where goods shipped in containers are moved between trains, trucks and ships; asset utilization, to ensure that empty cars are distributed to the right place at the right time for expected demand; and their customer service portal that shows an integrated view of a shipment lifecycle to give customers a more accurate, real-time view. As an old company, they have a lot of legacy systems, and used TIBCO to integrate them, centralizing operational events, data and business rules. For them, events can come from their physical assets (locomotives and railway sensors), legacy reporting systems, partner networks for assets not under their ownership, and external information including weather. On this, they build asset state models, and create applications that automatically correlate information and optimize operations. They now have one source of data and rules, and a reusable set of data and services to make application development faster. Their next steps are predictive maintenance, gathering information from locomotives, signal systems, switches and trackside defect detectors to identify problems prior to an equipment failure; and real-time visual analytics with alerts on potential problem areas. They also want to improve operational forecasting to support better allocation of resources, allowing them to divert traffic and take other measures to avoid service disruptions. Great case study that incorporates the two conference themes of interconnecting everything and augmenting intelligence.

We’re at the end of day 2, and the end of my blogging at TIBCO NOW; there are breakout sessions tomorrow but I’ll be on my way home. Some great new stuff in BPM and analytics, although far too many sessions going on at once to capture more than a fraction of what I wanted to see.

Intelligent Business Operations at TIBCONOW 2016

Nicolas Marzin of TIBCO gave a breakout session on making business operations intelligent, starting with the drivers of efficiency, agility, quality and transparency. There are a number of challenges to achieving this in terms of work management: workers may have too many queues to monitor and not know which is most important, or people may be having work assigned to them that they are either over- or under-qualified to complete. This can result in missed SLAs and unhappy customers, lower efficiency, and lack of agility since business priorities aren’t enforced.

Looking at a day in the life of an operational business user, they need to know their own and their team’s performance goals, and what work they should be completing that day in order to achieve those goals. Managers are concerned about their team as a whole, including whether they are meeting goals and SLAs, whether they have sufficient resources, and how to prioritize work. Managers need tools for real-time metrics, workforce administration, workload balancing, and changing priorities on the fly. ActiveMatrix BPM provides the ability to model your workforce in terms of roles, groups, privileges, relationships and capabilities; rules are applied to create a distribution strategy that determines what work is assigned to which resource at any point in a business process. Typically, work is assigned to a subset of the workforce whose skills match the requirement, since allocating work to an individual creates an operational risk if that person is absent or overloaded with work. AMX BPM includes process patterns for resource management: separation of duties, retain familiar, chaining and piling.
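
A simplified sketch of what such a rule-based distribution strategy might look like, offering work to the least-loaded member of the skill-matched subset while enforcing separation of duties, is below; the roles, skills and tie-break rule are invented, not AMX BPM's actual logic.

```python
# Sketch of a rule-based distribution strategy: offer a work item to the subset of
# the workforce whose capabilities match, excluding the previous actor (separation
# of duties). Roles, skills and the tie-break rule are invented, not AMX BPM's code.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    skills: set
    queue_length: int

def distribute(work_item, resources, previous_actor=None):
    required = work_item["required_skills"]
    candidates = [
        r for r in resources
        if required <= r.skills and r.name != previous_actor
    ]
    if not candidates:
        return None  # escalate: nobody eligible
    # offer to the least-loaded eligible resource rather than a named individual
    return min(candidates, key=lambda r: r.queue_length)

team = [
    Resource("ana", {"claims", "fraud"}, 7),
    Resource("raj", {"claims"}, 3),
    Resource("mei", {"claims", "fraud"}, 2),
]
item = {"id": "W-88", "required_skills": {"claims", "fraud"}}
print(distribute(item, team, previous_actor="mei").name)   # -> ana
```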

Analytics comes into play in the management dashboard, where Spotfire is used to monitor operational performance and trigger actions directly from the dashboard. Typical visualizations include work backlog and SLAs, resource pool workload and capacity, process and case performance, and business data in context. Marzin showed examples of dashboards for real-time tracking of work backlog and staffing, plus as-is forecasting that identifies bottlenecks. The charts show the factors that are most important for a manager to make resource allocation decisions, understand staffing needs based on combinations of skills, and reprioritize specific work types, which can then be pushed back to AMX BPM.

This is fairly traditional BPM and case management, with rule-based workforce management, but that’s a huge part of where AMX BPM is being used in practice. However, their workforce management is fairly advanced compared to many competitive solutions, and using Spotfire for operational analytics raises the bar in active manager dashboards while allowing for what-if prediction and simulation on the fly. This ties in to the “closing the loop” theme of the day, where manager dashboard actions feed directly back to adjust the workforce management rules. This level of integrated visual analytics for AMX BPM is long overdue, but it looks like they’ve turned the previous demo-ware into something much more robust and generally applicable.

As an aside, I’ve done some presentations recently about the need to align incentives with corporate goals; although individual performance statistics are important, it’s key to ensure that they match up with overall goals, and include measurements of collaboration and teamwork too. Metrics for collaboration are just starting to emerge, and are not included in most BPM or other work management platforms.

Closing the loop with analytics: TIBCONOW 2016 day 2 keynote

Yesterday at TIBCO NOW 2016, we heard about the first half of TIBCO’s theme — interconnect everything — and today, Matt Quinn introduced the second half — augment intelligence — before turning the stage over to Mark Palmer, SVP engineering for streaming analytics.


Palmer talked about the role of analytics over history, and how today’s smart visual analytics allow you to be first to insight, then first to action. We then had a quick switch to Brad Hopper, VP strategy for analytics, for a demo of Spotfire visual analytics while wearing a long blond wig (attempting to make a point about the importance of beauty, I think). He built an analytics dashboard while he talked, showing how easy it is to create visual analytics and trigger smart actions. He went on to talk about data preparation and cleansing, which can often take as much as 50% of an analyst’s time, and demonstrated importing a CSV file and using quick visualizations to expose and correct potential problems in the underlying data. As always, the Spotfire demos are very impressive; I don’t follow Spotfire closely enough to know what’s new, but it all looks pretty slick.
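
For comparison, the same sort of quick data-prep checks can be sketched in plain pandas (this is not Spotfire, and the file and column names are placeholders):

```python
# Everyday data-prep checks of the kind described, done here in plain pandas rather
# than Spotfire; the file name and column names are placeholders.
import pandas as pd

df = pd.read_csv("sales.csv")            # placeholder file

# quick profile: missing values and suspicious outliers per numeric column
print(df.isna().sum())
numeric = df.select_dtypes("number")
zscores = (numeric - numeric.mean()) / numeric.std()
print((zscores.abs() > 4).sum())         # count of extreme values per column

# a common fix-up: normalize inconsistent category labels before analysis
df["region"] = df["region"].str.strip().str.title()
```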

Michael O’Connell, TIBCO’s chief analytics officer, came up to demonstrate a set of analytics applications for a fictitious coffee company: sales figures and drilldowns, with what-if predictions for planning promotions; and supply chain management and smart routing of product deliveries.

Palmer came back to talk about TIBCO Jaspersoft, the other side of their analytics portfolio that provides business intelligence capabilities built into applications, but it was a pretty quick mention with no demo. A Jaspersoft demo would look pretty mundane after seeing all of the sexy Spotfire features, but it undoubtedly is a workhorse for analytics with many customers. He moved on to ways that TIBCO is helping customers to roll analytics out, from accelerators and sample source code to engagement in the community.



He continued on with streaming analytics (Palmer was the CEO of Streambase before it was acquired by TIBCO), and O’Connell came back to show an oil industry application that leverages sensor analytics to maximize equipment productivity by initiating preventative maintenance when the events emitted by the device indicate that failure may be imminent. He showed a more comprehensive interface that would be used in the head office for real-time monitoring and analysis, and a simpler tablet interface for field service personnel to receive information about wells requiring service. Palmer finished the analytics segment with a brief look at LiveView Web, a zero-code environment for building operational intelligence dashboards.


Quinn returned to talk about their B-tree-based Graph Database, which is in preview mode now with an open API, and other areas where they are looking to provide innovative solutions. He went through a history of how they’ve grown as a technology organization, and got quite verklempt when thanking his team for how awesome they’ve continued to be over the past 18 months since the acquisition, which was really touching.

After the break, Adam Steltzner, NASA’s lead engineer on the Mars Rover and author of The Right Kind of Crazy: A True Story of Teamwork, Leadership, and High-Stakes Innovation, talked about innovation, collaboration and decision-making under pressure. Check out the replay of the keynote for his talk, a fascinating story of the team that built and landed the Mars landing vehicles, along with some practical tips for leaders to foster exploration and innovation in teams.

Murray Rode returned to close out the keynote by announcing the winners of their Trailblazer customer awards:

  • Norfolk Southern (Pioneer) for implementing a real-time view of their railway operations
  • CargoSmart (Innovator) for incorporating real-time optimization of shipping logistics into their cargo management software
  • First Citizens Bank (Impact) for simplifying IT structure to allow for quick creation and delivery of new branch services
  • University of Chicago Medicine (Visionary) for optimizing operating room turnover to save costs and improve service
  • TUI Group (Transformer) for transforming their platforms through integration to enable new customer-facing tourism applications

That’s it for the morning keynote, and I’m off to catch some of the breakout sessions for most of the rest of the day before we come back for the customer panel and closing keynote at the end of the day.

The Enterprise Digital Genome with Quantiply at BPMCM15

“An operating system for a self-aware quantifiable predictive enterprise” definitely gets the prize for the most intriguing presentation subtitle, for an afternoon session that I went to with Surendra Reddy and David Chaney from Quantiply (a stealth startup that has just publicly launched), and their customer, a discount brokerage service whose name I have been requested to remove from this post.

Said customer has some significant event data challenges, with a million customers and 100,000 customer interactions per day across a variety of channels, and five billion log messages generated every day across all of their product systems and platforms. Having this data exist in silos with no good aggregation tools means fragmented and poor customer support, and also significant challenges in system and internal support.

To address these types of heterogeneous data analysis problems, Quantiply has a two-layer tool: Edge Cloud for the actual data analysis, which can then be exposed to different roles based on access control (business users, operational users, data scientists, etc.); and Pulse for connecting to various data sources including data warehouses, transactional databases, BPM systems and more. It appears that they’re using some sort of dimensional fact models, which are fairly standard data warehouse analytical constructs, but their Pulse connectors allow them to pour in data on a near-real-time basis, then make the connections between capabilities and services to be able to do fast problem resolution on their critical trading platforms. Because of the nature of the graph connectivity that they’re deriving from the data sources, they’re able to not only resolve the problem by drilling down, but also determine which customers were impacted by the problem in order to follow up. In response to a question, the customer said that they had used Splunk and other log analytics tools, but that this was “not Splunk”, in terms of both the real-time nature and the front-end user experience, plus deeper analytical capabilities such as long-term interaction trending. In some cases, the Quantiply representation is sufficient analysis; in other cases, it’s a starting point for a data scientist to dig in and figure out some of the more complex correlations in the data.
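
To illustrate the graph-connectivity idea (not Quantiply's engine), here's a sketch using networkx that models dependencies between components, services and customers, then walks the graph to find which customers are affected by a failing component; all node names are invented.

```python
# Illustration only, using networkx: model dependencies between platforms, services
# and customers as a graph, then walk it to find who is affected by a failing
# component. Node names are invented; Quantiply's actual engine works differently.
import networkx as nx

g = nx.DiGraph()
# platform/service dependency edges (A -> B means B depends on A)
g.add_edges_from([
    ("order-gateway", "trade-service"),
    ("trade-service", "web-trading"),
    ("trade-service", "mobile-trading"),
])
# customer usage edges
g.add_edges_from([
    ("web-trading", "customer:1001"),
    ("mobile-trading", "customer:1002"),
])

def impacted_customers(failed_component):
    downstream = nx.descendants(g, failed_component)
    return sorted(n for n in downstream if n.startswith("customer:"))

print(impacted_customers("order-gateway"))  # -> ['customer:1001', 'customer:1002']
```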

There was a lot of detail in the presentation about the capabilities of the platform and what the customer is doing with it, and the benefits that they’re seeing; there’s not a lot of information on the Quantiply website since they’re just publicly launching.

Update: The original version of this post included the name of the customer and their representative. Since this was a presentation at a public conference with no NDA or confidentiality agreements in place, not even a verbal request at any time during the session, I live-blogged as usual. A day later, the vendor, under pressure from the customer’s PR group, admitted that they did not have clearance to have this customer speak publicly, which is a pretty rookie mistake on their part, although it lines up with my general opinion on their social media skills. As a favor to the conference organizers, who put a lot of effort into making a great experience for all of us, I’ve decided to remove the customer’s name from this post. I’m sure that those of you who really want to know it won’t have any trouble finding it, because of this thing called “the internet”.

The Personology of @RBSGroup at PegaWorld 2015

Andrew McMullan, director of analytics and decisioning (aka “personologist”) at Royal Bank of Scotland, gave a presentation on how they are building a central (Pega-based) decisioning capability to improve customer engagement and change their culture along the way. He started with a personal anecdote about how RBS did the right thing for a family member and gained a customer for life – a theme echoed from this morning’s keynote that also included RBS. He showed a short video of their current vision, which stated goals of making RBS easier to do business with, and to work for, in addition to being more efficient. In that order, in case you other banks are following along.

RBS is now government owned, having been bailed out during the financial crisis; I’m not sure how much this has allowed them to focus on customer engagement rather than short-term profits, but they do seem to be talking the right talk.

RBS uses Pega’s Chordiant – primarily the decision management components, if I am reading it correctly – although they are implementing Pega 7 for an August 2015 rollout to bring in more robust Next Best Action capabilities; they also use SAS Visual Analytics for reporting. This highlights the huge role of decisioning, as well as process, in customer engagement, especially when you’re applying analytics to a broad variety of customer information in order to determine how to interact with the customer (online or IRL) at any particular moment. RBS is proactive about having their customers do things that will save them money, such as renewing a mortgage at a lower rate, or choosing a package of banking services that doesn’t overlap with other services that they are paying for elsewhere. Contrary to what naysayers within RBS said about lost revenue, this tends to make customers more loyal and ultimately do more business with them.

There was a good question from the audience about how much of this was changes to organizational culture, and how much was the data science: McMullan said that it’s really critical to win the hearts and minds of the employees, although obviously you need to have at least the beginnings of the analytics and recommendations to get that started. Also, they use Net Promoter Score as their main internal metric, which tends to reward relationship-building over short-term profits; having the right incentives for employees goes a long way towards helping them to do the right thing.