Show me the money: Financials, sales and support at @OpenText Analyst Summit 2019

We started the second day of the OpenText Analyst Summit 2019 with their CFO, Madhu Ranganathan, talking about their growth, both organic and via acquisitions. She claimed that their history of acquisitions shows that M&A does work — a point with which some industry specialists may not agree, given the still-overlapping collection of products in their portfolio — but there’s no doubt that they’re growing well based on their six-year financials, across a broad range of industries and geographies. She sees this as positioning them, on their existing 25-year run, to continue scaling toward an ambitious but achievable target of $1B in operating cash flow by June 2021.

Ted Harrison, EVP of Worldwide Sales, was up next with an update on their customer base: 85 of the 100 largest companies in the world, 17 of the top 20 financial services companies, 20 of the top 20 life sciences companies, etc. He walked through the composition of the 1,600 sales professionals in their teams, from the account executives and sales reps to the solution consultants and other support roles. They also have an extensive partner channel bringing domain expertise and customer relationships. He highlighted a few customers in some of the key product areas — GM for digital identity management, Nestle for supply chain management, Malaysia Airports for AI and analytics, and British American Tobacco for SuccessFactors-OT2 integration — with a focus on customers that are using OpenText in ways that span their business operations.

James McGourlay, EVP of Customer Operations, covered how their global technical support and professional services organization has aligned with the customer journey from deployment to adoption to expansion of their OpenText products. With 1,400 professional services people, they have 3,000 engagements going on at any given time across 30 countries. As with most large vendors’ PS groups, they have a toolbox of solution accelerators, best practices, and expert resources to help with initial implementation and ongoing operations. This is also where they partner with systems integrators such as CGI, Accenture and Deloitte, and platform partners like Microsoft and Oracle. He addressed the work of their 1,500 technical support professionals across four major centers of excellence for round-the-clock support, co-located with engineering teams to provide a more direct link to technical solutions. They have a strong focus on customer satisfaction in PS and technical support because they realize that happy customers tend to buy more stuff; this is particularly important when you have a lot of different products to sell to those customers to expand your footprint within their organizations.

Good to hear more about the corporate and operations side than I normally cover, but looking forward to this afternoon’s deeper dives into product technology.

Product Innovation session at @OpenText Analyst Summit 2019

Muhi Majzoub, EVP of Engineering, continued the first day of the analyst summit with a deeper look at their technology progress in the past year as well as future direction. I only cover a fraction of OpenText products; even in the ECM and BPM space, they have a long history of acquisitions and it’s hard to keep on top of all of them.

Their Content Services platform provides information integration into a variety of key business applications, including Salesforce and SAP; this allows users to work in those applications and see relevant content in that context without having to worry about where or how it’s stored and secured. Majzoub covered a number of the new features of their content platforms (alas, there are still at least two content platforms, and let’s not even talk about process platforms) as well as user experience, digital asset management, AI-powered content analytics and eDiscovery. He talked about their solutions for LegalTech and digital forensics (not areas that I follow closely), then moved on to the much broader areas of AI, machine learning and analytics as they apply to capture, content and process, as well as their business network transactions.

He talked about AppWorks, which is their low-code development environment but also includes their BPM platform capabilities since they have a focus on process- and content-centric applications such as case management. They have a big push on vertical application development, both in terms of enabling it for their customers and also for building their own vertical offerings. Interestingly, they are also allowing for citizen development of micro-apps in their Core cloud content management platform that includes document workflows.

The product session was followed by a showcase and demos hosted by Stephen Ludlow, VP of Product Marketing. He emphasized that they are a platform company, but since line-of-business buyers want to buy solutions rather than platforms, they need to be able to demonstrate applications that bring together many of their capabilities. We had five quick demos:

  • AI-augmented capture using Captiva capture and Magellan AI/analytics: creating an insurance claim first notice of loss from an unstructured email, while gathering aggregate analytics for fraud detection and identifying vehicle accident hotspots.
  • Unsupervised machine learning for eDiscovery to identify concepts in large sets of documents in legal investigations, then using supervised learning/classification to further refine search results and prioritize review of specific documents.
  • Integrated dashboard and analytics for supply chain visibility and management, including integrating, harmonizing and cleansing data and transactions from multiple internal and external sources, and drilling down into details of failed transactions.
  • HR application integrating SAP SuccessFactors with content management to store and access documents that make up an employee HR file, including identifying missing documents and generating customized documents.
  • Dashboard for logging and handling non-conformance and corrective/preventative actions for Life Sciences manufacturing, including quality metrics and root cause analysis, and linking to reference documentation.

Good set of business use cases to finish off our first (half) day of the analyst summit.

Upcoming webinar on digital transformation in financial services featuring @BPMdotcom and @ABBYY_USA – and my white paper

Something strange about receiving an email about an upcoming webinar, featuring two people who I know well…

 …then scrolling down to see that ABBYY is featuring the paper that I wrote for them as follow-on bonus material!

Nathaniel Palmer and Carl Hillier are both intelligent speakers with long histories in the industry; tune in to hear them talk about the role that content capture and content analytics play in digital transformation.

bpmNEXT 2018: All about bots with Cognitive Technology, PMG.net, Flowable

We’re into the afternoon of day 2 of bpmNEXT 2018, with another demo section.

RPA Enablement: Focus on Long-Term Value and Continuous Process Improvement, Cognitive Technology

Massimiliano Delsante of Cognitive Technology presented their myInvenio product for analyzing processes to determine where gaps exist and create models for closing those gaps through RPA task automation. The demo started with loading historical process data for process mining, which created a process model from the data together with activity resources, counts and other metrics; then comparing the model for conformance with a reference model to determine the frequency and performance of conformant and non-conformant cases. The process discovery model can be transformed to a BPMN model and its performance simulated. With a baseline data set of all manual activities, the system identified the cost of each activity, helping to identify which activities would result in the greatest savings if automated, and fed the data for actual resources used into the simulation scenario; adjusting the resources required by specifying the number of RPA robots that could be deployed at specific tasks allows for a what-if simulation of the process performance with an RPA implementation. An analytics dashboard provides visualization of the original process discovery and the simulated changes, with performance trends over time. Predictive analytics can be applied to running processes to, for example, predict which cases will not meet their deadlines, and perform some root cause analysis on the problems. Doing this analysis requires that you have information about the cost of the RPA robots as well as being able to identify which tasks could be automated with RPA. Good integration of process discovery, simulation, analysis and ongoing monitoring.
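The core of that what-if calculation is simple enough to sketch: given per-activity costs mined from the event log, compare a baseline scenario against one where selected tasks are handled by robots. This is a minimal illustration, not myInvenio's actual model; all activity names, volumes and rates are invented, and a real simulation would also model queueing and robot capacity.

```python
# Hypothetical what-if simulation: savings from automating selected
# activities with RPA robots. All names and rates are invented.

MANUAL_RATE = 40.0   # cost per hour of a human resource (assumption)
ROBOT_RATE = 8.0     # cost per hour of an RPA robot (assumption)

activities = [
    # (name, executions per month, hours per execution, automatable?)
    ("validate invoice", 1200, 0.25, True),
    ("approve payment",   900, 0.10, False),  # requires human judgment
    ("enter order data", 1500, 0.20, True),
]

def monthly_cost(automate: bool) -> float:
    """Total monthly cost, optionally routing automatable tasks to robots."""
    total = 0.0
    for name, runs, hours, can_automate in activities:
        rate = ROBOT_RATE if (automate and can_automate) else MANUAL_RATE
        total += runs * hours * rate
    return total

baseline = monthly_cost(automate=False)
with_rpa = monthly_cost(automate=True)
print(f"baseline: {baseline:.2f}, with RPA: {with_rpa:.2f}, "
      f"savings: {baseline - with_rpa:.2f}")
```

Ranking activities by `runs * hours * (MANUAL_RATE - ROBOT_RATE)` is what surfaces the biggest automation candidates first.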

Integration is Still Cool, and Core in your BPM Strategy, PMG.net

Ben Alexander from PMG.net focused on integration within BPM as a key element for driving innovation by increasing the speed of application development: integrating services for RPA, ML, AI, IoT, blockchain, chatbots and whatever other hot new technologies can be brought together in a low-code environment such as PMG. His demo showed a vendor onboarding application, adding a function/subprocess for assessing probability of vendor approval using machine learning by calling AzureML, user task assignment using Slack integration or SMS/phone support through a Twilio connector, and RPA bot invocation using a generic REST API. Nice demo of how to put all of these third-party services together using a BPM platform as the main application development and orchestration engine.
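The pattern underneath that demo is that each third-party capability is just a REST endpoint the orchestration engine posts step data to. Here's a minimal sketch of such a generic connector; the endpoint URLs and payload shapes are entirely hypothetical, not PMG's, AzureML's or Twilio's actual APIs.

```python
# Sketch of a generic REST connector, as a low-code/BPM flow might use to
# call ML scoring, messaging or RPA services. Endpoints are invented.

import json
from urllib import request

def build_call(service: str, payload: dict) -> request.Request:
    """Package a step's data as a JSON POST to the named service endpoint."""
    endpoints = {
        "ml_scoring": "https://example.com/azureml/score",    # hypothetical
        "sms_notify": "https://example.com/twilio/messages",  # hypothetical
        "rpa_bot":    "https://example.com/rpa/start-job",    # hypothetical
    }
    body = json.dumps(payload).encode("utf-8")
    return request.Request(endpoints[service], data=body,
                           headers={"Content-Type": "application/json"},
                           method="POST")

# Orchestration step: score a vendor's approval probability via the ML service.
req = build_call("ml_scoring", {"vendor": "Acme Corp", "region": "EMEA"})
print(req.full_url, req.get_method())
```

The value of the low-code platform is that this plumbing is hidden behind a configured connector, so the process designer just drops a service step onto the flow.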

Making Process Personal, Flowable

Paul Holmes-Higgin and Micha Keiner from Flowable presented on their Engage product for customer engagement via chat, using chatbots to augment rather than replace human chat, and modeling the chatbot behavior using standard modeling tools. In particular, they have found that a conversation can be modeled as a case with dynamic injection of processes, with the ability to bring intelligence into conversations, and the added benefit of the chat being completely audited. The demo was around the use case of a high-wealth banking client talking to their relationship manager using chat, with simultaneous views of both the client and relationship manager UI in the Flowable Engage chat interface. The client mentioned that she moved to a new home, and the RM initiated the change address process by starting a new case right in the chat by invoking a context-sensitive digital assistant. This provided advice to the RM about address change regulatory rules, and provided a form in situ to collect the address data. The case then progresses through a combination of chat messages for collaboration between the human participants, forms filled directly in the chat window, and confirmation by the client in the chat by presenting them with the information to be updated. Potential issues, such as compliance regulations due to a country move, are raised to the RM, and related processes execute behind the scenes that include a compliance officer via a more standard task inbox interface. Once the compliance process completes, the RM is informed via the chat interface. Behind the scenes, there’s a standard address change BPMN diagram, where the chat interface is integrated through service activities.
They also showed replacing the human compliance decision with a decision table that was created (and manually edited if necessary) based on a decision tree generated by machine learning on 200,000 historical address change cases; rerunning the scenario skipped the compliance officer step and approved the change instantaneously. Other chat automated tasks that the RM can invoke include setting reminders, retrieving customer information and more using natural language processing, as well as other types of more structured cases and processes. Great demo, and an excellent look at the future of chat interfaces in process and case management.
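The decision-table idea is worth a quick sketch: rules mined from a decision tree (which could have been trained on those historical address-change cases) get flattened into an ordered, first-match-wins table that a business analyst can review and hand-edit. Everything here is invented for illustration; the fields and outcomes are not Flowable's actual model.

```python
# Hypothetical decision table replacing a manual compliance review.
# In practice such a table could be generated from a decision tree
# trained on historical cases, then manually edited. Rules are invented.

RULES = [
    # (predicate over the case, outcome) — first match wins
    (lambda c: c["country_changed"] and c["high_risk_country"], "refer"),
    (lambda c: c["country_changed"], "review"),
    (lambda c: True, "approve"),  # default: same-country moves auto-approve
]

def decide(case: dict) -> str:
    """Evaluate the case against the table; first matching rule wins."""
    for predicate, outcome in RULES:
        if predicate(case):
            return outcome
    return "refer"  # unreachable given the default rule; kept as a safeguard

print(decide({"country_changed": False, "high_risk_country": False}))
```

A first-match table like this is easy to audit, which matters when it replaces a human approver in a regulated step.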

bpmNEXT 2018: Here’s to the oddballs, with ConsenSys, XMPro and BPLogix

And we’re off with the demo sessions!

Secure, Private Decentralized Business Processes for Blockchains, ConsenSys

Vanessa Bridge of ConsenSys spoke about using BPMN diagrams to create smart contracts and other blockchain applications, while also including privacy, security and other necessary elements: essentially, using BPM to enable Ethereum-based smart contracts (rather than using blockchain as a ledger for BPM transactions and other BPM-blockchain scenarios that I’ve seen in the past). She demonstrated using Camunda BPM for a token sale application, and for a boardroom voting application. For each of the applications, she used BPMN to model the process, particularly the use of BPMN timers to track and control the smart contract process — something that’s not native to blockchain itself. Encryption and other steps were called as services from the BPMN diagram, and the results of each contract were stored in the blockchain. Good use of BPM and blockchain together in a less-expected manner.

Turn IoT Technology into Operational Capability, XMPro

Pieter van Schalkwyk of XMPro looked at the challenges of operationalizing IoT, with a virtual flood of data from sensors and machines that needs to be integrated into standard business workflows. This involves turning big data into smart data via stream processing before passing it on to the business processes in order to achieve business outcomes. XMPro provides smart listeners and agents that connect the data to the business processes, forming the glue between realtime data and resultant actions. His demo showed data being collected from a fan on a cooling tower, bringing in data from the sensor logs and comparing it to the manufacturer’s information and historical information in order to predict if the fan is likely to fail, create a maintenance work order and even optimize maintenance schedules. They can integrate with a large library of action agents, including their own BPM platform or other communication and collaboration platforms such as Slack. They provide a lot of control over their listener agents, which can be used for any type of big data, not just industrial device data, and integrate complex and accurate prediction models for likelihood of failure and remaining useful life. He showed their BPM platform that would be used downstream from the analytical processing, where the internet of things can interact with the internet of people to make additional decisions required in the context of additional information such as 3D drawings. Great example of how to filter through hundreds of millions of data points in streaming mode to find the few combinations that require action to be taken. He threw out a comment at the end that this could be used for non-industrial applications, possibly for GDPR data, which definitely made me think about content analytics on content as it’s captured in order to pick out which of the events might trigger a downstream process, such as a regulatory process.
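The "smart listener" pattern is essentially a streaming filter: millions of readings come in, and only the rare combinations that cross the manufacturer's limits survive to become work orders. A toy sketch of that idea, with invented thresholds and fields, not XMPro's actual agent model:

```python
# Hypothetical stream listener: reduce a flood of sensor readings to the
# few events that warrant a maintenance work order. Values are invented.

SPEC_MAX_VIBRATION = 7.0   # manufacturer's vibration limit (assumption)
WINDOW = 3                 # consecutive out-of-spec readings before acting

def detect_actions(readings):
    """Yield a work order when vibration exceeds spec WINDOW times in a row."""
    streak = 0
    for r in readings:
        streak = streak + 1 if r["vibration"] > SPEC_MAX_VIBRATION else 0
        if streak == WINDOW:
            yield {"action": "create_work_order", "asset": r["asset"]}

stream = [{"asset": "fan-17", "vibration": v}
          for v in [5.2, 7.4, 7.9, 8.1, 6.0, 7.5]]
print(list(detect_actions(stream)))
```

Requiring a run of out-of-spec readings, rather than reacting to a single spike, is what keeps the downstream business process from drowning in false alarms.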

Business Milestones as Configuration, BPLogix

Scott Menter and Joby O’Brien of BPLogix finished up this section on new BPM ideas with their approach to goal orientation in BPM, which is milestone-based and requires understanding the current state of a case before deciding how to move forward. Their Process Director BPM is not BPMN-based, but rather an event-based platform where events are used to determine milestones and drive the process forward: much more of a case management view, usually visualized as a project management-style GANTT chart rather than a flow model. They demonstrated the concept of app events, where changes in state of any of a number of things — form fields, activities, document attachments, etc. — can record a journal entry that uses business semantics and process instance data. This allows events from different parts of the platform to be brought together in a single case journal that shows the significant activity within the case, but also to be triggers for other events such as determining case completion. The journal can be configured to show only certain types of events for specific users — for example, if they’re only interested in events related to outgoing correspondence — and also becomes a case collaboration discussion. Users can select events within the journal and add their own notes, such as taking responsibility for a meeting request. They also showed how machine learning and rules can be used for dynamically changing data; although shown as interactions between fields on forms, this can also be used to generate new app events. Good question from the audience on how to get customers to think about their work in terms of events rather than procedures; case management proponents will say that business people inherently think about events/state changes rather than process. Interesting representation of creating a selective journal based on business semantics rather than just logging everything and expecting users to wade through it for the interesting bits.
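The app-event journal boils down to a simple structure: state changes from anywhere in the platform are recorded with business semantics, then filtered into per-user views. A minimal sketch of that shape, with invented event types, not BPLogix's actual implementation:

```python
# Hypothetical case journal: record state-change events with business
# semantics, then present filtered views per user interest.

from dataclasses import dataclass, field

@dataclass
class CaseJournal:
    entries: list = field(default_factory=list)

    def record(self, event_type: str, description: str):
        """Append a journal entry for a state change anywhere in the case."""
        self.entries.append({"type": event_type, "description": description})

    def view(self, interested_in: set) -> list:
        """A user-specific view showing only the selected event types."""
        return [e for e in self.entries if e["type"] in interested_in]

journal = CaseJournal()
journal.record("field_change", "Claim amount updated to $12,400")
journal.record("correspondence", "Denial letter sent to claimant")
journal.record("attachment", "Police report attached")

print(journal.view({"correspondence"}))
```

The same entries could also serve as triggers, e.g. marking the case complete once a required set of event types has appeared in the journal.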

We’re off to lunch. I’m a bit out of practice at live-blogging, but hope that I captured some of the flavor of what’s going on here. Back with more this afternoon!

The collision of capture, content and analytics

Martyn Christian of UNDRSTND Group, who I worked with at FileNet back in 2000-1, gave a keynote at ABBYY Technology Summit 2017 on the evolution and ultimate collision of capture, content and analytics. He started by highlighting some key acquisitions in the industry, including the entry of private capital, as well as a move to artificial intelligence in the capture space, as harbingers of the changes in the capture market. Gartner, meanwhile, declared enterprise content management dead — long live content services platforms! — introduced new players in the magic quadrant alongside the traditional ECM players, and shifted IBM from the leaders quadrant back to the challengers quadrant.

Intelligent capture is gaining visibility and importance, particularly as a driver for digital transformation. Interestingly, capture was traditionally about converting analog (paper) to digital (data); now, however, many forms of information are natively digital, and capture is not only about performing OCR on scanned paper documents but about extracting and analyzing actionable data from both analog and digital content. High-volume in-house production scanning operations are being augmented — or replaced — with customers doing their own capture, such as we now see with depositing a check using a mobile banking application. Information about customer actions and sentiment is being automatically gleaned from their social media actions. Advanced machine learning is being used to classify content, reducing the need for manual intervention further downstream, and enabling straight-through processing or the use of autonomous agents.

As a marketing guy, he had a lot of advice on how this can be positioned and sold into customers; UNDRSTND apparently ran a workshop yesterday for some of the channel partner companies on bringing this message to their customers who are seeking to move beyond simple capture solutions to digital transformation.

Strategy to execution – and back: it’s all about alignment

I recently wrote a paper sponsored by Software AG called Strategy To Execution – And Back, which you can find here (registration required). From the introduction:

When planning for business success, corporate management sets business strategy and specifies goals in terms of critical success factors and key performance indicators (KPIs). Although senior management is not concerned with the technical details of how business operations are implemented, they must have confidence that the operations are aligned with the strategy, and be able to monitor performance relative to the goals in real time.

In order to achieve operational alignment, there must be a clear path that maps strategy to execution: a direct link from the strategic goals in the high-level business model, through IT development and management practices, to the systems, activities and roles that make the business work. However, that’s only half the story: there must also be a path back from execution to strategy, allowing operational performance to be measured against the objectives in order to guide future strategy. Without both directions of traceability, there’s a disconnect between strategy and operations that can allow a business to drift off course without any indication until it’s far too late.

I cover how you need to have links from your corporate strategy through various levels of architecture to implementation, then be able to capture the operational metrics from running processes and roll those up relative to the corporate goals. If you don’t do that, then your operations could just be merrily going along their own path rather than working towards corporate objectives.
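As a toy illustration of that execution-to-strategy path, operational metrics captured from running processes can be averaged and compared against the KPI targets set at the strategy level. The metric names and targets here are invented; a real implementation would sit on process monitoring infrastructure rather than a list of dicts.

```python
# Hypothetical roll-up of operational metrics against corporate KPI
# targets, illustrating the execution-to-strategy feedback path.

KPI_TARGETS = {
    "avg_handle_time_min": 30.0,    # target ceiling: lower is better
    "first_call_resolution": 0.80,  # target floor: higher is better
}

def roll_up(process_metrics: list) -> dict:
    """Average per-process metrics and flag each against its KPI target."""
    summary = {}
    for kpi, target in KPI_TARGETS.items():
        avg = sum(m[kpi] for m in process_metrics) / len(process_metrics)
        lower_is_better = kpi.endswith("_min")  # naming convention (assumption)
        on_track = avg <= target if lower_is_better else avg >= target
        summary[kpi] = {"actual": avg, "target": target, "on_track": on_track}
    return summary

metrics = [{"avg_handle_time_min": 28, "first_call_resolution": 0.75},
           {"avg_handle_time_min": 35, "first_call_resolution": 0.88}]
print(roll_up(metrics))
```

The point of the roll-up is exactly the "back" direction of the paper's title: without it, operations can look busy while quietly missing the strategic targets.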

Analytics customer keynote at TIBCONOW 2016

Michael O’Connell hosted the last general session for TIBCO NOW 2016, focusing on analytics customer stories with the help of five customers: State Street, Shell, Vestas, Monsanto and Western Digital. I’m not going to try to attribute specific comments to the customer representatives, just capture a few thoughts as they go by.


  • Spotfire is allowing self-service analytics to be pushed down to the business users
  • Analysis that was spread across a number of different solutions — from Excel to BI tools — can typically be consolidated onto a single analytics platform
  • Analytics is allowing the business to discover the true nature of their business, especially with outliers
  • Real-time analytics on physical processes (e.g., supply chain) generates significant benefits
  • Providing visual analytics to business changes the way that they use data and collaborate across the organization
  • The enterprise-class back-end and the good visualizations in Spotfire are helping it to win over both IT and business areas
  • Data and events are being generated faster and in greater volumes from more devices, making desktop analytics solutions impractical
  • Business users who are not data specialists can understand — and leverage — fairly complex analytical models when it concerns their own data
  • Analytics about manufacturing quality can be used to identify potential problems before they occur

We finished up with a brief presentation from Fred Ehlers, VP of IT at Norfolk Southern, about their use of TIBCO products to help manage their extensive railway operations. He talked about optimizing their intermodal terminals, where goods shipped in containers are moved between trains, trucks and ships; asset utilization, to ensure that empty cars are distributed to the right place at the right time for expected demand; and their customer service portal that shows an integrated view of a shipment lifecycle to give customers a more accurate, real-time view. As an old company, they have a lot of legacy systems, and used TIBCO to integrate them, centralizing operational events, data and business rules. For them, events can come from their physical assets (locomotives and railway sensors), legacy reporting systems, partner networks for assets not under their ownership, and external information including weather. On this, they build asset state models, and create applications that automatically correlate information and optimize operations. They now have one source of data and rules, and a reusable set of data and services to make application development faster. Their next steps are predictive maintenance, gathering information from locomotives, signal systems, switches and trackside defect detectors to identify problems prior to an equipment failure; and real-time visual analytics with alerts on potential problem areas. They also want to improve operational forecasting to support better allocation of resources, allowing them to divert traffic and take other measures to avoid service disruptions. Great case study that incorporates the two conference themes of interconnecting everything and augmenting intelligence.

We’re at the end of day 2, and the end of my blogging at TIBCO NOW; there are breakout sessions tomorrow but I’ll be on my way home. Some great new stuff in BPM and analytics, although far too many sessions going on at once to capture more than a fraction of what I wanted to see.

Intelligent Business Operations at TIBCONOW 2016

Nicolas Marzin of TIBCO gave a breakout session on making business operations intelligent, starting with the drivers of efficiency, agility, quality and transparency. There are a number of challenges to achieving this in terms of work management: workers may have too many queues to monitor and not know which is most important, or people may have work assigned to them for which they are either over- or under-qualified. This can result in missed SLAs and unhappy customers, lower efficiency, and lack of agility since business priorities aren’t enforced.

Looking at a day in the life of an operational business user, they need to know their own and their team’s performance goals, and what work that they should be completing that day in order to achieve those goals. Managers are concerned about their team as a whole, including whether they are meeting goals and SLAs, whether they have sufficient resources, and how to prioritize work. Managers need tools for real-time metrics, workforce administration, workload balancing, and changing priorities on the fly. ActiveMatrix BPM provides the ability to model your workforce in terms of roles, groups, privileges, relationships and capabilities; rules are applied to create a distribution strategy that determines what work is assigned to what resource at any point in a business process. Typically, work is assigned to a subset of the workforce whose skills match the requirement, since allocating work to an individual creates an operational risk if that person is absent or overloaded with work. AMX BPM includes process patterns for resource management: separation of duties, retain familiar, chaining and piling.
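The distribution-strategy idea described above can be sketched in a few lines: offer each work item to the subset of the workforce whose skills match, ordered by current load, rather than binding it to one named individual. The skills, names and backlog numbers are invented; AMX BPM's actual resource model is far richer.

```python
# Hypothetical skill-based work distribution: find all qualified
# resources for a work item, least-loaded first. Data is invented.

workforce = {
    "ana":   {"skills": {"claims", "fraud"}, "backlog": 4},
    "ben":   {"skills": {"claims"},          "backlog": 1},
    "carol": {"skills": {"underwriting"},    "backlog": 0},
}

def eligible(required_skill: str) -> list:
    """All resources qualified for the work item, lightest backlog first."""
    matches = [name for name, r in workforce.items()
               if required_skill in r["skills"]]
    return sorted(matches, key=lambda n: workforce[n]["backlog"])

print(eligible("claims"))
```

Offering work to a pool instead of a person is exactly the operational-risk point above: if one qualified worker is absent or overloaded, the item simply surfaces for the next one.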

Analytics comes into play in the management dashboard, where Spotfire is used to monitor operational performance and trigger actions directly from the dashboard. Typical visualizations include work backlog and SLAs, resource pool workload and capacity, process and case performance, and business data in context. Marzin showed examples of dashboards for real-time tracking of work backlog and staffing, plus as-is forecasting that identifies bottlenecks. The charts show the factors that are most important for a manager to make resource allocation decisions, understand staffing needs based on combinations of skills, and reprioritize specific work types, which can then be pushed back to AMX BPM.

This is fairly traditional BPM and case management, with rule-based workforce management, but that’s a huge part of where AMX BPM is being used in practice. However, their workforce management is fairly advanced compared to many competitive solutions, and using Spotfire for operational analytics raises the bar in active manager dashboards while allowing for what-if prediction and simulation on the fly. This ties in to the “closing the loop” theme of the day, where manager dashboard actions feed directly back to adjust the workforce management rules. This level of integrated visual analytics for AMX BPM is long overdue, but it looks like they’ve turned the previous demo-ware into something much more robust and generally applicable.

As an aside, I’ve done some presentations recently about the need to align incentives with corporate goals; although individual performance statistics are important, it’s key to ensure that they match up with overall goals, and include measurements of collaboration and teamwork too. Metrics for collaboration are just starting to emerge, and are not included in most BPM or other work management platforms.

Closing the loop with analytics: TIBCONOW 2016 day 2 keynote

Yesterday at TIBCO NOW 2016, we heard about the first half of TIBCO’s theme — interconnect everything — and today, Matt Quinn introduced the second half — augment intelligence — before turning the stage over to Mark Palmer, SVP engineering for streaming analytics.


Palmer talked about the role of analytics over history, and how today’s smart visual analytics allow you to be first to insight, then first to action. We then had a quick switch to Brad Hopper, VP strategy for analytics, for a demo of Spotfire visual analytics while wearing a long blond wig (attempting to make a point about the importance of beauty, I think). He built an analytics dashboard while he talked, showing how easy it is to create visual analytics and trigger smart actions. He went on to talk about data preparation and cleansing, which can often take as much as 50% of an analyst’s time, and demonstrated importing a CSV file and using quick visualizations to expose and correct potential problems in the underlying data. As always, the Spotfire demos are very impressive; I don’t follow Spotfire closely enough to know what’s new, but it all looks pretty slick.

Michael O’Connell, TIBCO’s chief analytics officer, came up to demonstrate a set of analytics applications for a fictitious coffee company: sales figures and drilldowns, with what-if predictions for planning promotions; and supply chain management and smart routing of product deliveries.

Palmer came back to talk about TIBCO Jaspersoft, the other side of their analytics portfolio that provides business intelligence capabilities built in to applications, but it was a pretty quick mention with no demo. A Jaspersoft demo would look pretty mundane after seeing all of the sexy Spotfire features, but it undoubtedly is a workhorse for analytics with many customers. He moved on to ways that TIBCO is helping customers to roll analytics out, from accelerators and sample source code to engagement in the community.



He continued on with streaming analytics (Palmer was the CEO of Streambase before it was acquired by TIBCO), and O’Connell came back to show an oil industry application that leverages sensor analytics to maximize equipment productivity by initiating preventative maintenance when the events emitted by the device indicate that failure may be imminent. He showed a more comprehensive interface that would be used in the head office for real-time monitoring and analysis, and a simpler tablet interface for field service personnel to receive information about wells requiring service. Palmer finished the analytics segment with a brief look at LiveView Web, a zero-code environment for building operational intelligence dashboards.


Quinn returned to talk about their B-tree-based Graph Database, which is in preview mode now with an open API, and other areas where they are looking to provide innovative solutions. He went through a history of how they’ve grown as a technology organization, and got quite verklempt when thanking his team for how awesome they’ve continued to be over the past 18 months since the acquisition, which was really touching.

After the break, Adam Steltzner, NASA’s lead engineer on the Mars Rover and author of The Right Kind of Crazy: A True Story of Teamwork, Leadership, and High-Stakes Innovation, talked about innovation, collaboration and decision-making under pressure. Check out the replay of the keynote for his talk, a fascinating story of the team that built and landed the Mars landing vehicles, along with some practical tips for leaders to foster exploration and innovation in teams.

Murray Rode returned to close out the keynote by announcing the winners of their Trailblazer customer awards:

  • Norfolk Southern (Pioneer) for implementing a real-time view of their railway operations
  • CargoSmart (Innovator) for incorporating real-time optimization of shipping logistics into their cargo management software
  • First Citizens Bank (Impact) for simplifying IT structure to allow for quick creation and delivery of new branch services
  • University of Chicago Medicine (Visionary) for optimizing operating room turnover to save costs and improve service
  • TUI Group (Transformer) for transforming their platforms through integration to enable new customer-facing tourism applications

That’s it for the morning keynote, and I’m off to catch some of the breakout sessions for most of the rest of the day before we come back for the customer panel and closing keynote at the end of the day.