bpmNEXT 2019 demos: microservices, robots and intentional processes with @Bonitasoft @Signavio and @Flowable

BPM, Serverless and Microservices: Innovative Scaling on the Cloud with Philippe Laumay and Thomas Bouffard of Bonitasoft

Turns out that my microservices talk this morning was a good lead-in to a few different presentations: Bonitasoft presented how they have moved to a serverless microservices architecture, and the pros and cons of this approach. Their key reason was scalability, especially where platform load is unpredictable. The demo showed an example of starting a new case (process instance) in a monolithic model under no load conditions, then the same with a simulated load, where the user response in the new case was significantly degraded. They then demoed the same scenario but scaling the BPM engine by deploying it multiple times in componentized “pods” in Kubernetes, where Kubernetes can automatically scale up further as load increases. This time, the user experience on the loaded system was considerably faster. This isn’t a pure microservices approach in that they are scaling a common BPM engine (hence a shared database even if there are multiple process servers), not embedding the engine within the microservices, but it does allow for easy scaling of the shared server platform. This requires cluster management for communicating between the pods and keeping state in sync. The final step of the demo was to externalize the execution completely to AWS Lambda by creating a BPM Lambda function for serverless execution.
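
Bonitasoft didn’t show the code behind the Lambda function, but the idea reduces to a stateless handler that starts a process instance on demand, letting Lambda handle the scale-out that the Kubernetes pods provided earlier in the demo. A minimal sketch, assuming the engine is reached over a REST endpoint (the URL, path and payload shape are placeholders, not Bonitasoft’s actual API):

```python
import json
import os
import urllib.request

# Placeholder endpoint and process ID; the demo's actual Lambda function
# packaged the BPM execution itself rather than calling out to a server.
ENGINE_URL = os.environ.get("ENGINE_URL", "https://bpm.example.com/api")
PROCESS_ID = os.environ.get("PROCESS_ID", "new-case-process")

def handler(event, context):
    """AWS Lambda entry point: start one process instance per invocation.

    Each invocation is stateless, so Lambda can scale out under load
    without the cluster management needed to keep pods in sync.
    """
    payload = json.dumps({"variables": event.get("variables", {})}).encode()
    req = urllib.request.Request(
        f"{ENGINE_URL}/process/{PROCESS_ID}/instantiation",  # assumed path
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status, "body": resp.read().decode()}
```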

Performance Management for Robots, with Mark McGregor and Alessandro Manzi of Signavio

Just like human performers, robots in an RPA scenario need to have their performance monitored and managed: they need the right skills and training, and if they aren’t performing as expected, they should be replaced. Signavio does this by using their Process Intelligence (process mining) to discover potential bottleneck tasks where RPA could be applied, and to create a baseline for the pre-RPA processes. Having identified tasks that could be automated using robots, Alessandro demonstrated how they could simulate scenarios with and without robots, including cost and time. All of the simulation results can be exported as an Excel sheet for further visualization and analysis, although their dashboard tools provide a good view of the results. Once robots have been deployed, they can use process mining again to compare against the earlier analysis results as well as seeing the performance trends. In the demo, we saw that robots at different tasks (potentially from different vendors) could have different performance results, with some requiring replacement, upgrading or removal. He finished with a demo of their “Lights-On” view that combines process modeling and mining, where traffic lights linked to the mining performance analysis are displayed in place on the model in order to make changes more easily.
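
The cost/time side of the simulation is easy to picture with a toy calculation. A sketch of the with/without-robot comparison, with entirely invented figures (Signavio’s simulator is far richer than this):

```python
# Toy with/without-robot scenario comparison; all figures are invented.
CASES_PER_MONTH = 10_000

scenarios = {
    "human": {"minutes_per_case": 12.0, "cost_per_minute": 0.75},
    "robot": {"minutes_per_case": 1.5, "cost_per_minute": 0.10},
}

for name, s in scenarios.items():
    hours = CASES_PER_MONTH * s["minutes_per_case"] / 60
    cost = CASES_PER_MONTH * s["minutes_per_case"] * s["cost_per_minute"]
    print(f"{name}: {hours:,.0f} hours/month, ${cost:,.2f}/month")
```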

The Case of the Intentional Process, with Paul Holmes-Higgin and Micha Kiener of Flowable

The last demo of the day was Flowable, showing how they combined trigger, sentry, declarative and stage concepts from CMMN with microprocesses (process fragments) to contain chatbot processes. Essentially, they’re using a CMMN case folder and stages as intelligent containers for small chatbot processes; this allows, for example, separation and coordination of multiple chatbot roles when dealing with a multi-product client, such as a banking client that does both business banking and personal investments with the bank. The chat needs to switch context in order to provide the required separation of information between business and personal accounts. “Intents” as identified by the chatbot AI are handled as inbound signals to the CMMN stages, firing off the associated process fragment for the correct chatbot role. The process fragment can then drive the chatbot to walk the client through a process for the requested service, such as KYC and signing a waiver for onboarding with a new investment category, in a context-sensitive manner that is aware of the customer scenario and what has happened already. The chatbot processes can even hand the chat over to a human financial advisor or other customer support person, who would see the chat history and be able to continue the conversation in a manner that is seamless to the client. The digital assistant is still there for the advisor, and can detect their intentions and privately offer to kick off processes for them, such as preparing a proposal for the client, or prevent messages that may violate privacy or regulatory compliance. The advisor’s task list contains tasks that may be the result of conversations such as this, but will also include internally created and assigned tasks. The advisor can also provide a QR code to the client via chat that links to a WhatsApp (or other messaging platform) version of the conversation: less capable than the full Flowable chat interface since it’s limited to text, but preferred by some clients. If the client changes context, in this case switching from private banking questions to a business banking request, the chatbot can switch seamlessly to responding to that request, although the advisor’s view would show separate private and business banking cases for regulatory reasons. Watch the video when it comes out for a great discussion at the end on using CMMN stages in combination with BPMN for reacting to events and context switching. It appears that chatbots have officially moved from “toy” to “useful”, and CMMN just got real.
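
Flowable didn’t walk through code, but the intent-to-fragment wiring can be pictured as a small dispatch layer between the chatbot’s language understanding and the engine. A sketch against Flowable’s REST API, where a detected intent is translated into a signal event that the CMMN stages listen for; the signal names, variables and credentials are invented, and the endpoint layout should be checked against the Flowable version in use:

```python
import requests  # third-party: pip install requests

FLOWABLE_REST = "http://localhost:8080/flowable-rest/service"  # assumed local install

# Map chatbot-detected intents to signal names; each signal fires the
# process fragment for the matching chatbot role within the case's stages.
INTENT_TO_SIGNAL = {
    "open_investment_account": "startKycFragment",  # hypothetical names
    "sign_waiver": "startWaiverFragment",
}

def dispatch_intent(intent: str, case_id: str, utterance: str) -> None:
    """Translate a chatbot intent into a signal event for the engine."""
    requests.post(
        f"{FLOWABLE_REST}/runtime/signals",
        json={
            "signalName": INTENT_TO_SIGNAL[intent],
            "variables": [
                {"name": "caseId", "value": case_id},
                {"name": "utterance", "value": utterance},
            ],
        },
        auth=("rest-admin", "test"),  # default demo credentials; adjust
    ).raise_for_status()
```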

bpmNEXT 2019 demos: citizen development, process analysis and AI-driven automation with @kissflow, Wizly and @IBM

Is the Citizen Developer Story a Fairytale? by Neil Miller of Kissflow

Given that Kissflow provides a low-code BPM platform, Neil’s answer is that citizen developers are not, in fact, unicorns: given the right tools, non-developers can build their own applications. Their platform allows a citizen developer to create a process-based application by defining a form, then a related process using a flowchart notation. Forms can link to internally-defined (or imported) data sources, and process steps can include links to webhooks to access external services. Simple but reasonably powerful capabilities, easy enough for non-technical analysts and business users to create and deploy single-form applications for their own use and to share with others. He also showed us the new version that is being released next month with a number of new features and tools, including more powerful integration capabilities that are still well within the reach of citizen developers. The new version also includes completely new functionality for unstructured collaborative scenarios, which can include conversation streams and tasks, plus Kanban boards for managing projects and tasks. There’s still a lot missing for this to handle any type of core processes (e.g., versioning, testing) but good for administrative, situational and collaboration processes.
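
The webhook steps are the main escape hatch from the citizen-developer sandbox: at a given process step, the engine POSTs the form data to an external URL and can use the response downstream. A sketch of what such an external service might look like on the receiving end (the endpoint, payload shape and scoring rule are all assumptions, since Kissflow’s actual webhook contract wasn’t shown):

```python
from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)

@app.post("/webhooks/credit-check")  # hypothetical external service
def credit_check():
    # Payload shape is assumed: the process engine POSTs the form fields
    # as JSON and reads the JSON response back into process variables.
    form = request.get_json(force=True)
    score = 700 if form.get("annual_income", 0) > 50_000 else 550  # toy rule
    return jsonify({"credit_score": score})

if __name__ == "__main__":
    app.run(port=5000)
```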

Insightful Process Analysis, by Jude Chagas-Pereira of Wizly, Frank Kowalkowski of Knowledge Consultants, Inc., and Gil Laware of Information by Design

Wizly provides a suite of analysis tools including process analytics, using process mining and other techniques in a demo focused on improving an airline’s call center performance. Jude showed how they can compare process history data against a pre-defined model for conformance checking, and apply a broad range of analysis techniques to discover correlations between activities and customer satisfaction. They can also generate a “DNA analysis” and other data visualizations, then filter and re-slice the data to home in on the problems. The main tabular interface is similar to Excel-type filtering and pivot charts, so understandable to most business data analysts, with visualizations and extra analytical tools to drive out root causes. This set of process analytics is just part of their suite: they can apply the same tools to other areas such as master data management. We had a previous look at this last year under the name Aftespyre; Frank pointed out that he and Gil develop the intellectual property of the analytical models, while Jude’s company does the tool implementation.
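
Conformance checking at its simplest is replaying each case history against the transitions that the reference model allows. A hand-rolled sketch with a made-up call center model (Wizly’s implementation is certainly more sophisticated than this):

```python
# Reference model expressed as allowed activity transitions.
MODEL = {
    "START": {"Receive Call"},
    "Receive Call": {"Identify Customer"},
    "Identify Customer": {"Resolve Issue", "Escalate"},
    "Escalate": {"Resolve Issue"},
    "Resolve Issue": {"END"},
}

def is_conformant(trace: list[str]) -> bool:
    """Replay one case history against the model's allowed transitions."""
    steps = ["START"] + trace + ["END"]
    return all(b in MODEL.get(a, set()) for a, b in zip(steps, steps[1:]))

cases = [
    ["Receive Call", "Identify Customer", "Resolve Issue"],  # conformant
    ["Receive Call", "Resolve Issue"],                       # skipped a step
]
for trace in cases:
    print(trace, "->", "ok" if is_conformant(trace) else "deviation")
```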

Improving the Execution of Work with an AI Driven Automation Platform, by Kramer Reeves, Michael Lim and Jeff Goodhue of IBM

Jeff took us through a demo of their Business Automation Workflow Case Builder, which is a citizen developer tool for creating case and content-centric applications that can include processes, decisions and services created by technical developers, layered on a simpler milestone-based flow. Checklists are built in for task management and assignment, allowing a business user to create an ad hoc checklist and assign tasks to other users at any point in the case. We also saw the process task interface with an attended RPA bot invoked by the user as a helper to open the task, extract related data from a legacy interface, then update and dispatch the task. Alongside the process task interface, he showed us a conversational interface to their Watson AI, used to ask what types of accounts the client has and what documents they have for the client. We also saw the integration of AI into a dashboard to make decision recommendations based on historical data. He finished with their new Business Automation Studio low-code design environment, where we saw how the citizen developer can add integrations that were created by technical developers, and create new pages in a page flow application. It’s taken a long time for IBM to bring together their entire automation platform based on a number of past acquisitions, but now they appear to have a fairly seamless integration between case/content and process (BPM) applications, with low code and Watson sprinkled in for good measure. They’re also trying to move away from their monolithic pricing models to a microservices pricing model, even though their platforms are pretty monolithic in structure: Mike made the point that customers only pay for what they use.

That’s it for day 1 of bpmNEXT 2019; tomorrow morning I’ll be giving a keynote before we start back into demo rounds.

Machine learning and process mining at bpmNEXT 2019 with BP Logix, Minit and Cognitive Technology

Note that Kris Verlaenen, jBPM project lead at Red Hat, is also blogging from here; check out his coverage for a different view.

Democratizing Machine Learning with BPM, by Scott Menter and Joby O’Brien of BP Logix

We’re now into the full demo sessions at bpmNEXT, and Scott and Joby are up to talk about how they’re making machine learning more accessible to non-data scientists and integrating it into their BPM tool, Process Director. They do this by creating a learner object that pulls in data from an external source, then configuring the system to select the predicted data field, the algorithm to use and the input data features to use for prediction. Their example is predicting whether an employee is at risk of leaving the company (possibly a gentle dig at a bigger company making the same sort of predictions), so they select one or more input values from the employee data set, such as amount of travel and income. They have some nice visualization tools to use while building the learner object, selecting a couple of input features to see which may be the most interesting for the prediction; the learner object can then update forms as data is entered, such as during a performance review. This allows the output of a fairly sophisticated ML model that is analyzing past data to be used just like any other rule or data source in their BPMS. In general, their tools can be used by someone with data science knowledge to create learner objects for other people to consume in their processes, but can also be used by those without much data science background to create simple but powerful machine learning predictions on their own.
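
The learner object pattern maps cleanly onto standard ML tooling: pick a target field, pick input features, fit a model, then expose its predictions to forms and rules. A rough equivalent in scikit-learn with synthetic data (the algorithm choice and feature set are assumptions; BP Logix didn’t detail theirs):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic employee records mirroring the demo's features (amount of
# travel, income); the target is whether the employee left the company.
n = 1_000
travel_days = rng.integers(0, 120, n)
income = rng.normal(70_000, 15_000, n)
left = (travel_days > 80) & (income < 65_000)  # invented ground truth

X = np.column_stack([travel_days, income])
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, left)

# At form-fill time (e.g., during a performance review), the "learner
# object" scores the current record like any other rule or data source.
employee = np.array([[95, 58_000]])
print("attrition risk:", model.predict_proba(employee)[0][1])
```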

Leveraging Process Mining to Enable Human and Robot Collaboration, by Michal Rosik of Minit

Michal started with the analysis of an invoice approval process as seen through their process mining tool, but the point of his demo was to perform mining on UI session recording data, that is, the data collected when a recorder is monitoring a person’s activities to figure out exactly the steps they take to perform a task. Unlike a strict RPA training/scripting session, this can use data from users just doing their day-to-day work, filter out the activities that aren’t related to the task, and create a definition of the best RPA path. Or, it can use data from the process while RPA is performing the tasks to see where there are potential problems within the bot’s actions, or whether the existence of the bot is causing bottlenecks to shift to other parts of the process. It can use process variant analysis to look at the differences between the process pre- and post-bot implementation. He also showed their Minit dashboard, being released now, which combines process mining and business intelligence to provide a much more predictive environment for business managers.
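
The pre/post-bot variant analysis boils down to counting distinct activity sequences in each period’s event log. A minimal sketch with an invented log (Minit’s analysis is far richer than this):

```python
from collections import Counter

def variants(event_log: list[tuple[str, str]]) -> Counter:
    """Group (case_id, activity) events by case and count each sequence."""
    traces: dict[str, list[str]] = {}
    for case_id, activity in event_log:  # assumes events arrive in order
        traces.setdefault(case_id, []).append(activity)
    return Counter(tuple(t) for t in traces.values())

pre_bot = [("c1", "Open Invoice"), ("c1", "Check PO"), ("c1", "Approve"),
           ("c2", "Open Invoice"), ("c2", "Approve")]
post_bot = [("c3", "Bot: Open Invoice"), ("c3", "Bot: Check PO"),
            ("c3", "Approve")]

print("pre-bot: ", variants(pre_bot))
print("post-bot:", variants(post_bot))
```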

Process Mining and DTO — How to Derive Business Rules and ROI from the Data, with Massimiliano Delsante and Luca Fontanili of Cognitive Technology

DTO – the digital twin of an organization – is the focus of Massimiliano and Luca’s presentation, and how to get from process mining to DTO for analyzing and governing processes in their myInvenio tool. From their process mining model, they can show a number of visualizations: non-conformant cases within the process, manual steps (not yet automated, showing potential for improvement), steps that are in violation of their SLA, and a dashboard combining activity cost and other performance data with the process mining model. They demonstrated how a reference model can be created using BPMN and DMN to allow conformance checking and simulation, or how the BPMN model – including branching rules – can be derived directly from the discovered process model. They’re using machine learning to discover the correlations from which the branching conditions are determined, but the business user/analyst can override the discovered branching rules to define more precise decision rules and decision tables. This “decision mining” is a unique capability in the process mining world (for now). The analyst can also add manual steps to the discovered process model in BPMN mode, which will update the related analytics and visualizations. Their simulation allows each step to be simulated not just as it currently runs, but also with potential robot replacements for some of the human operators at an activity, so that the different scenarios can be compared.
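
The decision mining step, learning branching conditions at a gateway from case attributes, can be approximated with a decision tree whose learned splits read directly as candidate business rules for the analyst to refine. A sketch with synthetic data (myInvenio’s actual technique wasn’t detailed):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)

# Synthetic cases arriving at a gateway: amount and customer tier decide
# the branch taken, under an invented ground-truth rule.
n = 500
amount = rng.uniform(0, 10_000, n)
is_gold = rng.integers(0, 2, n)
branch = np.where((amount < 2_000) | (is_gold == 1),
                  "auto-approve", "manual review")

X = np.column_stack([amount, is_gold])
tree = DecisionTreeClassifier(max_depth=2).fit(X, branch)

# The tree's splits are the discovered branching conditions, which the
# analyst could override or tighten into proper decision tables.
print(export_text(tree, feature_names=["amount", "is_gold"]))
```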

As a comment on the latter two process mining sessions, I’m really happy to see process mining moving from a purely post-execution analytical tool to an interactive process health check and prediction tool. I’ve done some presentations in the past in which I suggested that process mining would be a great tool for forward-looking simulations and what-if scenarios, and there’s so much more that can be done in this area.

2019 @Alfresco Analyst Day: partner strategy

Darren Yetzer, Alfresco’s VP Channel, took us through their partner strategy, and hosted conversations with Bibhakar Pandey of Cognizant and Matt Yanchyshyn of AWS.

The Alfresco partner ecosystem has three divisions:

  • Global system integrators
  • Local/regional system integrators
  • Technology and ISVs

In the breakout discussions that we had earlier in the afternoon, there was a lot of conversation about the strategy’s focus on vertical use cases, and how partners will need to fill a big part of that need since Alfresco isn’t in the vertical application business. Obviously, Alfresco needs more than just a logo salad of partners: they need partners that can work with customers to develop vertical solutions, make sales and extend functionality.

Pandey sees Alfresco’s platform as a way for them to quickly create domain-specific content-centric solutions for their clients that include integration with other platforms and systems. They don’t want to have to build that base level of capabilities before starting on their solution development, so they see a platform like Alfresco as table stakes for working effectively as an SI. Cognizant works with a broad scope of customer organizations, but Pandey highlighted insurance as one vertical that is ripe for the content capabilities offered via Alfresco’s platform. He focused on the cloud-native capabilities of Alfresco as essential, as well as the microservices architecture that allows specific functions to be deployed and scaled independently.

The Amazon AWS partnership is quite different and even more significant from a platform functionality point of view: we saw this beginning at last year’s Alfresco Day with the significant announcements about native AWS containerization, and continuing now with the integration of their AI/ML services. Yanchyshyn discussed how Amazon is now developing domain-specific intelligence services, such as Comprehend Medical, and how organizations will start to take advantage of this higher-level starting point for mapping their business use cases onto Amazon’s AI/ML services. He sees Alfresco as an early adopter of these services, and Amazon uses them as an example of what can be done to integrate those services into larger platforms.

2019 @Alfresco Day: Go To Market Strategy

Jennifer Smith, Alfresco’s CMO, gave us an expansion of the GTM strategy that Bernadette Nixon spoke about earlier today.

Their offering is based on a single cloud-native platform combining content, process and governance services, on which they identify three pillars of their horizontal platform approach:

  • Modernization and migration, providing tools for migrating to Alfresco quickly and with minimal risk
  • Coexistence and integration, allowing for easy integration with third-party services and legacy systems
  • Cloud-native and AWS-first, with deep integration and support for AWS cloud platform, storage and AI/ML services

Their vertical use case approach is based on a typical land-and-expand strategy: they take an existing implementation with a customer and find other use cases within that organization to leverage the platform benefits, then work with a large enterprise or partner to develop managed vertical solutions.

We saw a demo of a citizen services scenario: to paraphrase, a government agency has old, siloed systems and bad processes, but citizens want to interact with that agency in the same way that they interact with other services such as their bank. In a modernized passport application example, the process would include document upload directly by the citizen, intelligent classification and extraction from the documents, fraud detection by integration with other data sources, natural language translation to communicate with foreign agencies, and tasks for manual review. Although the process and content bits are handled natively by Alfresco, much of the intelligence is based on Amazon services such as Comprehend and Textract — Alfresco’s partnership with Amazon and AWS-native platform make this a natural fit.
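
The Amazon services in that pipeline are all callable through boto3, which gives a feel for how an Alfresco-style integration invokes them; the bucket and document names below are placeholders, and Alfresco’s actual connector code wasn’t shown:

```python
import boto3  # pip install boto3; assumes AWS credentials are configured

textract = boto3.client("textract", region_name="us-east-1")
comprehend = boto3.client("comprehend", region_name="us-east-1")

# Extract text from a citizen-uploaded document in S3 (placeholder names).
result = textract.detect_document_text(
    Document={"S3Object": {"Bucket": "citizen-uploads", "Name": "application.png"}}
)
text = " ".join(
    block["Text"] for block in result["Blocks"] if block["BlockType"] == "LINE"
)

# Pull out entities (names, dates, places) to feed classification and
# fraud-detection lookups against other data sources.
entities = comprehend.detect_entities(Text=text, LanguageCode="en")
for e in entities["Entities"]:
    print(e["Type"], e["Text"], round(e["Score"], 2))
```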

We’re off to some breakouts now then partner strategy this afternoon, so it might be quiet here until tomorrow.

2019 @Alfresco Analyst Day: update and strategy with @bvnixon

Bernadette Nixon, who assumed the role of CEO after Alfresco’s acquisition last year, opened the analyst day with the company strategy. They seem to be taking a shot at several of their competitors by pushing the idea that they’re one platform, built from the ground up as a single integrated platform rather than being a “Frankenplatform” pieced together from acquisitions. Arguably, Activiti grew up inside Alfresco as quite a separate project from the content side and I’m not sure it’s really as integrated as the other bits, but Alfresco sometimes forgets that content isn’t everything.

Nixon walked through what’s happened in the past year, starting with some of their customer success stories — wins against mainstream competitors, fast implementations and happy customers — and how they’ve added 126 new customer logos in the past year while maintaining a high customer renewal rate. They’ve maintained a good growth rate, and moved to profitability in order to invest back into the company: customer success, team development, a brand refresh, engineering and more. They’ve added many of the big SIs as new partners and are obviously working with the partner channel for success, having doubled their partner win rate. They’ve added five new products, including their Application Development Framework, which is the core of some of the other products as well as the cornerstone of partner and customer success for fast implementation.

They commissioned a study that showed that most organizations want to be deployed in the cloud, have better control over their processes, and be able to create applications faster (wait…they paid for that advice?); more interestingly, they found that 35% of enterprises want to switch out their BPM and ECM platforms in the next few years, providing a huge opportunity for Alfresco and other disruptive vendors.

Alfresco is addressing the basic strategic question of a horizontal platform approach versus a vertical use case approach: are they a platform vendor or an application vendor? Their product strategy bets on their Alfresco Digital Business Platform targeted at the technical buyer, but they are also developing a go-to-market approach that highlights use cases, primarily in government and insurance, for the business/operational buyer. They don’t have off-the-shelf apps — that’s for their partners or their customers to develop — but will continue to present use cases that resonate with their target market of financial services, insurance, government and manufacturing.

A good start to the day — I’ll be here all day at the analyst conference, then staying on tomorrow for the user conference.

Show me the money: Financials, sales and support at @OpenText Analyst Summit 2019

We started the second day of the OpenText Analyst Summit 2019 with their CFO, Madhu Ranganathan, talking about their growth via acquisitions and organic growth. She claimed that their history of acquisitions shows that M&A does work — a point with which some industry specialists may not agree, given the still-overlapping collection of products in their portfolio — but there’s no doubt that they’re growing well based on their six-year financials, across a broad range of industries and geographies. She sees this as positioning them to continue scaling to $1B in operating cash flow by June 2021, an ambitious but achievable target, building on their existing 25-year run.

Ted Harrison, EVP of Worldwide Sales, was up next with an update on their customer base: 85 of the 100 largest companies in the world, 17 of the top 20 financial services companies, 20 of the top 20 life sciences companies, etc. He walked through the composition of the 1,600 sales professionals in their teams, from the account executives and sales reps to the solution consultants and other support roles. They also have an extensive partner channel bringing domain expertise and customer relationships. He highlighted a few customers in some of the key product areas — GM for digital identity management, Nestle for supply chain management, Malaysia Airports for AI and analytics, and British American Tobacco for SuccessFactors-OT2 integration — with a focus on customers that are using OpenText in ways that span their business operations in a significant way.

James McGourlay, EVP of Customer Operations, covered how their global technical support and professional services organization has aligned with the customer journey from deployment to adoption to expansion of their OpenText products. With 1,400 professional services people, they have 3,000 engagements going on at any given time across 30 countries. As with most large vendors’ PS groups, they have a toolbox of solution accelerators, best practices, and expert resources to help with initial implementation and ongoing operations. This is also where they partner with systems integrators such as CGI, Accenture and Deloitte, and platform partners like Microsoft and Oracle. He addressed the work of their 1,500 technical support professionals across four major centers of excellence for round-the-clock support, co-located with engineering teams to provide a more direct link to technical solutions. They have a strong focus on customer satisfaction in PS and technical support because they realize that happy customers tend to buy more stuff; this is particularly important when you have a lot of different products to sell to those customers to expand your footprint within their organizations.

Good to hear more about the corporate and operations side than I normally cover, but looking forward to this afternoon’s deeper dives into product technology.

Product Innovation session at @OpenText Analyst Summit 2019

Muhi Majzoub, EVP of Engineering, continued the first day of the analyst summit with a deeper look at their technology progress in the past year as well as future direction. I only cover a fraction of OpenText products; even in the ECM and BPM space, they have a long history of acquisitions and it’s hard to keep on top of all of them.

Their Content Services platform provides information integration with a variety of key business applications, including Salesforce and SAP; this allows users to work in those applications and see relevant content in that context without having to worry where or how it’s stored and secured. Majzoub covered a number of the new features of their content platforms (alas, there are still at least two content platforms, and let’s not even talk about process platforms) as well as user experience, digital asset management, AI-powered content analytics and eDiscovery. He talked about their solutions for LegalTech and digital forensics (not areas that I follow closely), then moved on to the much broader areas of AI, machine learning and analytics as they apply to capture, content and process, as well as their business network transactions.

He talked about AppWorks, which is their low-code development environment but also includes their BPM platform capabilities since they have a focus on process- and content-centric applications such as case management. They have a big push on vertical application development, both in terms of enabling it for their customers and also for building their own vertical offerings. Interestingly, they are also allowing for citizen development of micro-apps in their Core cloud content management platform that includes document workflows.

The product session was followed by a showcase and demos hosted by Stephen Ludlow, VP of Product Marketing. He emphasized that they are a platform company, but since line-of-business buyers want to buy solutions rather than platforms, they need to be able to demonstrate applications that bring together many of their capabilities. We had five quick demos:

  • AI-augmented capture using Captiva capture and Magellan AI/analytics: creating an insurance claim first notice of loss from an unstructured email, while gathering aggregate analytics for fraud detection and identifying vehicle accident hotspots.
  • Unsupervised machine learning for eDiscovery to identify concepts in large sets of documents in legal investigations, then using supervised learning/classification to further refine search results and prioritize review of specific documents.
  • Integrated dashboard and analytics for supply chain visibility and management, including integrating, harmonizing and cleansing data and transactions from multiple internal and external sources, and drilling down into details of failed transactions.
  • HR application integrating SAP SuccessFactors with content management to store and access documents that make up an employee HR file, including identifying missing documents and generating customized documents.
  • Dashboard for logging and handling non-conformance and corrective/preventative actions for Life Sciences manufacturing, including quality metrics and root cause analysis, and linking to reference documentation.

Good set of business use cases to finish off our first (half) day of the analyst summit.

Snowed in at the @OpenText Analyst Summit 2019

Mark Barrenechea, OpenText’s CEO and CTO, kicked off the analyst summit with his re:imagine keynote here in Boston amidst a snowy winter storm that ensures a captive audience. He gave some of the current OpenText stats – 100M end users across 120,000 customers, and $2.8B in revenue last year – before expanding into a review of how the market has shifted over the past 10 years, fueled by changes in technology and infrastructure. What’s happened on the way to digital and AI is what he calls the zero theorem: zero trust (guard against security and privacy breaches), zero IT (bring your own device, work in the cloud), zero people (automate everything possible) and zero downtime (everything always available).

Their theme for this year is to help their customers re:imagine work, re:imagine their workforce, and re:imagine automation and AI. This starts with OpenText’s intelligent information core (automation, AI, APIs and data management), then expands with both their EIM platforms and EIM applications. OpenText has a pretty varied product portfolio (to say the least) and is bringing many of these components together into a more cohesive integrated vision in both the content services and business network spaces. More importantly, they are converging their many, many engines so that in the future, customers won’t have to decide which ECM or BPM engine to use, for example.

They are providing a layer of RESTful services on top of their intelligent information core services (ECM, BPM, Capture, Business Network, Analytics/AI, IoT), then allowing those services to be consumed either by standard development tools in a technical IDE or through the AppWorks low-code environment. The OT2 cloud architecture provides about 40 services for consumption in these development environments or by OpenText’s own vertical applications such as People Center.
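
None of the OT2 service contracts were shown, so any concrete call is speculative, but consuming such a layer from a technical IDE reduces to authenticated REST requests. A generic sketch in which every name and endpoint is hypothetical:

```python
import requests  # pip install requests

OT2_BASE = "https://ot2.example.com/api"  # placeholder, not a real endpoint
TOKEN = "..."  # bearer token from the platform's auth service

def invoke_service(service: str, resource: str, payload: dict) -> dict:
    """Call one of the core services (ECM, BPM, capture, ...) via REST."""
    resp = requests.post(
        f"{OT2_BASE}/{service}/{resource}",
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# e.g., file a document into a hypothetical content service
invoke_service("content", "documents", {"name": "claim.pdf", "folder": "claims"})
```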

Barrenechea finished up with a review of how OpenText is using OpenText to transform their own business, using AI to examine some of their financial and people management data to help guide them towards improvements. They’ll be investing $2B in R&D over the next five years to help them become even bigger in the $100B EIM market, both through the platform and increasingly through vertical applications.

Next up was Ted Harrison, EVP of Worldwide Sales, interviewing one of their customers: Gopal Padinjaruveetil, VP and Chief Information Security Officer at The Auto Club Group. AAA needs no introduction as a roadside assistance organization, but they also have insurance, banking, travel, car care and advocacy business areas, with coordinated member access to services across multiple channels. It’s this concept of the connected member that has driven their focus on digital identity for both people and devices, and how AI can help them reduce risk and improve security by detecting abnormal patterns.

We’ll be digging into more of the details later today and tomorrow as the summit continues, so stay tuned.

TechnicityTO 2018: Cool tech projects

The afternoon session at Technicity started with a few fast presentations on cool projects going on in the city. Too quick to grab details from the talks, but here’s who we heard from:

  • Dr. Eileen de Villa, medical officer of health at Toronto Public Health, and Lawrence Eta, deputy CIO at the City of Toronto, on using AI to drive public health outcomes.
  • Angela Chung, project director at Toronto Employment and Social Services, Children’s Services, Shelter Support and Housing, on client-centric support through service platform integration.
  • Matthew Tenney, data science and visualization team supervisor, on IoT from streetcars to urban forestry for applications such as environmental data sensing.
  • Arash Farajian, policy planning consultant, on Toronto Water’s use of GIS, smart sensors, drones (aerial and submersible) and augmented reality.

The rest of the afternoon was the 10th annual Toronto’s Got IT Awards of Excellence, but unfortunately I had to duck out for other meetings, so that’s it for my Technicity 2018 coverage.