bpmNEXT 2019 demos: citizen development, process analysis and AI-driven automation with @kissflow, Wizly and @IBM

Is the Citizen Developer Story a Fairytale? by Neil Miller of Kissflow

Given that Kissflow provides a low-code BPM platform, Neil’s answer is that citizen developers are not, in fact, unicorns: given the right tools, non-developers can build their own applications. Their platform allows a citizen developer to create a process-based application by defining a form, then a related process using a flowchart notation. Forms can link to internally-defined (or imported) data sources, and process steps can include links to webhooks to access external services. These are simple but reasonably powerful capabilities, easy enough for non-technical analysts and business users to create and deploy single-form applications for their own use and to share with others. He also showed us the new version that is being released next month with a number of new features and tools, including more powerful integration capabilities that are still well within the reach of citizen developers. The new version also includes completely new functionality for unstructured collaborative scenarios, which can include conversation streams and tasks, plus Kanban boards for managing projects and tasks. There’s still a lot missing before this could handle core processes (e.g., versioning, testing), but it’s good for administrative, situational and collaboration processes.
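To make the webhook capability concrete, here’s a minimal sketch of what a process step calling an external service might look like under the covers; the endpoint, payload fields and Python code are all illustrative assumptions on my part, not Kissflow’s actual mechanism.

```python
import requests

# Hypothetical example: a process step posts selected form fields to an
# external service via a webhook. The URL and field names are invented.
WEBHOOK_URL = "https://example.com/hooks/credit-check"

def call_credit_check(form_data: dict) -> dict:
    """Send form fields to the external service and return its response."""
    payload = {
        "applicant_name": form_data["name"],
        "requested_amount": form_data["amount"],
    }
    response = requests.post(WEBHOOK_URL, json=payload, timeout=10)
    response.raise_for_status()  # surface failures so the process can take an exception path
    return response.json()
```

The point is that the citizen developer only ever sees a form field mapped to a webhook URL; the platform handles the HTTP plumbing shown here.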

Insightful Process Analysis, by Jude Chagas-Pereira of Wizly, Frank Kowalkowski of Knowledge Consultants, Inc., and Gil Laware of Information by Design

Wizly provides a suite of analysis tools including process analytics, using process mining and other techniques in a demo focused on improving an airline’s call center performance. Jude showed how they can compare process history data against a pre-defined model for conformance checking, and apply a broad range of analysis techniques to discover correlations between activities and customer satisfaction. They can also generate a “DNA analysis” and other data visualizations, then filter and re-slice the data to home in on the problems. The main tabular interface is similar to Excel-type filtering and pivot charts, so understandable to most business data analysts, with visualizations and extra analytical tools to drive out root causes. This set of process analytics is just part of their suite: they can apply the same tools to other areas such as master data management. We had a previous look at this last year under the name Aftespyre; Frank pointed out that he and Gil develop the intellectual property of the analytical models, while Jude’s company does the tool implementation.
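As a rough illustration of the conformance checking idea (my own sketch, not Wizly’s implementation), event traces can be checked against a reference model’s allowed activity transitions and then related to customer satisfaction scores:

```python
# Toy conformance check: a case is conformant if every consecutive pair of
# activities in its trace is an allowed transition in the reference model.
ALLOWED = {
    ("Call Received", "Identify Customer"),
    ("Identify Customer", "Resolve Issue"),
    ("Resolve Issue", "Close Case"),
}

cases = [  # invented call-center data with a CSAT score per case
    {"trace": ["Call Received", "Identify Customer", "Resolve Issue", "Close Case"], "csat": 9},
    {"trace": ["Call Received", "Resolve Issue", "Identify Customer", "Close Case"], "csat": 4},
]

def is_conformant(trace):
    return all(pair in ALLOWED for pair in zip(trace, trace[1:]))

for group in (True, False):
    scores = [c["csat"] for c in cases if is_conformant(c["trace"]) == group]
    label = "conformant" if group else "non-conformant"
    print(f"avg CSAT, {label}: {sum(scores) / len(scores):.1f}")
```

A real tool does this at scale with statistical tests, but the core comparison is the same.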

Improving the Execution of Work with an AI Driven Automation Platform, by Kramer Reeves, Michael Lim and Jeff Goodhue of IBM

Jeff took us through a demo of their Business Automation Workflow Case Builder, which is a citizen developer tool for creating case and content-centric applications that can include processes, decisions and services created by technical developers, layered on a simpler milestone-based flow. Checklists are built in for task management and assignment, allowing a business user to create an ad hoc checklist and assign tasks to other users at any point in the case. We also saw the process task interface with an attended RPA bot invoked by the user as a helper to open the task, extract related data from a legacy interface, then update and dispatch the task. Alongside the process task interface, he showed us using a conversational interface to their Watson AI to ask what types of accounts the client has, and what documents they have for the client. We also saw the integration of AI into a dashboard to make decision recommendations based on historical data. He finished with their new Business Automation Studio low-code design environment, where we saw how a citizen developer can add integrations that were created by technical developers, and create new pages in a page flow application. It’s taken a long time for IBM to bring together their entire automation platform based on a number of past acquisitions, but now they appear to have a fairly seamless integration between case/content and process (BPM) applications, with low code and Watson sprinkled in for good measure. They’re also trying to move away from their monolithic pricing models to a microservices pricing model, even though their platforms are pretty monolithic in structure: Mike made the point that customers only pay for what they use.

That’s it for day 1 of bpmNEXT 2019; tomorrow morning I’ll be giving a keynote before we start back into demo rounds.

Machine learning and process mining at bpmNEXT 2019 with BP Logix, Minit and Cognitive Technology

Note that Kris Verlaenen, jBPM project lead at Red Hat, is also blogging from here; check out his coverage for a different view.

Democratizing Machine Learning with BPM, by Scott Menter and Joby O’Brien of BP Logix

We’re now into the full demo sessions at bpmNEXT, and Scott and Joby are up to talk about how they’re making machine learning more accessible to non-data scientists and integrating it into their BPM tool, Process Director. They do this by creating a learner object that pulls in data from an external source, then configuring the system to select the predicted data field, the algorithm to use and the input data features to use for prediction. Their example is predicting whether an employee is at risk of leaving the company (possibly a gentle dig at a bigger company making the same sort of predictions), so they select one or more input values from the employee data set, such as amount of travel and income. They have some nice visualization tools to use while building the learner object, selecting a couple of input features to see which may be the most interesting for the prediction; they can then create the learner object so that it can update forms as data is entered, such as during a performance review. This allows the output from a fairly sophisticated ML object that is analyzing past data to be used just like any other rule or data source in their BPMS. In general, their tools can be used by someone with data science knowledge to create learner objects for other people to consume in their processes, but can also be used by those without a lot of data science knowledge to create simple but powerful machine learning predictions on their own.
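For a concrete picture of that workflow, here’s a minimal sketch using scikit-learn; the data file, field names and model choice are my own assumptions, since Process Director doesn’t expose its learner objects this way:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical employee extract with an 'attrited' label (1 = left the company)
df = pd.read_csv("employees.csv")
features = ["travel_days_per_year", "income", "years_at_company"]  # selected input features

# Train on historical data, holding some back to sanity-check the model
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["attrited"], random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Score a single employee as their review form is filled in
new_employee = pd.DataFrame(
    [{"travel_days_per_year": 80, "income": 65000, "years_at_company": 2}])
print(f"attrition risk: {model.predict_proba(new_employee)[0][1]:.0%}")
```

The learner object wraps this train-then-score loop so that a form designer can treat the prediction like any other data source.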

Leveraging Process Mining to Enable Human and Robot Collaboration, by Michal Rosik of Minit

Michal started with the analysis of an invoice approval process as seen through their process mining tool, but the point of his demo was to perform data mining on UI session recording data, that is, the data collected when a recorder is monitoring a person’s activities to figure out exactly the steps they are taking to perform a task. Unlike a strict RPA training/scripting session, this can use data from users just doing their day-to-day work, filter out the activities that aren’t related to the task, and create a definition of the best RPA path. Or, it can use data from the process when RPA is performing the tasks to see where there are potential problems within the bot’s actions, or whether the existence of the bot is causing bottlenecks to be shifted to other parts of the process. It can use process variant analysis to look at the differences between the process pre- and post-bot implementation. He also showed their Minit dashboard, being released now, which combines process mining and business intelligence to provide a much more predictive environment for business managers.
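A toy version of that idea (mine, not Minit’s): filter the recorded UI actions down to task-related steps, then take the most frequent variant as the candidate RPA path.

```python
from collections import Counter

# Invented UI session recordings: each tuple is one user's recorded actions
sessions = [
    ("open_invoice", "copy_vendor", "paste_erp", "save"),
    ("open_invoice", "check_email", "copy_vendor", "paste_erp", "save"),  # off-task step
    ("open_invoice", "copy_vendor", "paste_erp", "save"),
]

TASK_ACTIONS = {"open_invoice", "copy_vendor", "paste_erp", "save"}

# Drop actions unrelated to the task, then count the remaining variants
variants = Counter(tuple(a for a in s if a in TASK_ACTIONS) for s in sessions)
best_path, freq = variants.most_common(1)[0]
print(f"candidate RPA path ({freq}/{len(sessions)} sessions): " + " -> ".join(best_path))
```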

Process Mining and DTO — How to Derive Business Rules and ROI from the Data, with Massimiliano Delsante and Luca Fontanili of Cognitive Technology

DTO – the digital twin of an organization – is the focus of Massimiliano and Luca’s presentation, and how to get from process mining to DTO for analyzing and governing processes in their myInvenio tool. From their process mining model, they can show a number of visualizations: non-conformant cases within the process, manual steps (not yet automated, showing potential for improvement), steps that are in violation of their SLA, and a dashboard combining activity cost and other performance data with the process mining model. They demonstrated how a reference model can be created using BPMN and DMN to allow conformance checking and simulation, or how the BPMN model – including branching rules – can be derived directly from the discovered process model. They’re using machine learning to discover the correlations from which the branching conditions are determined, but the business user/analyst can override the discovered branching rules to define more precise decision rules and decision tables. This “decision mining” is a unique capability in the process mining world (for now). The analyst can also add manual steps to the discovered process model in BPMN mode, which will update the related analytics and visualizations. Their simulation allows each step to be simulated not just as it currently runs, but also with potential robot replacements for some of the human operators at an activity, comparing the different scenarios.
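To show what decision mining could look like in principle (an illustrative sketch, not myInvenio’s implementation), a decision tree can learn a branching condition from historical case attributes and render it as a readable rule that an analyst could refine into a DMN decision table:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented history of which branch each case took at an XOR gateway
history = pd.DataFrame({
    "invoice_amount": [120, 5400, 90, 8800, 300, 7600],
    "branch_taken": ["auto_approve", "manual_review", "auto_approve",
                     "manual_review", "auto_approve", "manual_review"],
})

# Learn the branching condition, then print it as human-readable rules,
# e.g. invoice_amount <= 2850 -> auto_approve, otherwise manual_review
tree = DecisionTreeClassifier(max_depth=2).fit(
    history[["invoice_amount"]], history["branch_taken"])
print(export_text(tree, feature_names=["invoice_amount"]))
```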

As a comment on the latter two process mining sessions, I’m really happy to see process mining moving from a purely post-execution analytical tool to an interactive process health check and prediction tool. I’ve done some presentations in the past in which I suggested that process mining would be a great tool for forward-looking simulations and what-if scenarios, and there’s so much more that can be done in this area.

bpmNEXT 2019 demos: Appian

Usually I blog about the demos in groups, but Malcolm Ross of Appian was the lone demo between the panel and lunch, so he gets his own post. :)

As a reminder, demos are a five-minute Ignite-style presentation (20 slides with an auto-advance every 15 seconds) followed by a live demo and Q&A. Malcolm had a lot to say, however, so he had five minutes of slides followed by another four minutes of talk in front of a looping video before he started the actual demo.

Malcolm’s demo is on realigning BPM in the age of intelligent automation, in the context of different automation technologies (RPA, AI, BPM, integration) that are being sold as separate solutions into organizations. Not surprisingly, he positions BPM as the core technology and integration platform, but they also OEM Blue Prism’s RPA into their product suite and can integrate with many other web services to take part in the automation. He demonstrated an invoice processing application in which he uploaded an invoice PDF, the data was captured by an RPA bot, and BPM was used for exception handling when the bot couldn’t complete its task, as well as for overall monitoring of processes including the bot tasks. He walked through some of their design-time experience that is focused on integration, showing how connections to services from Blue Prism, Automation Anywhere, AWS machine learning, Google NLP and others can be used to create integration points that can then be called from their BPM processes. It’s a good use case of BPM and RPA together – they are much more complementary than competitive – allowing RPA tasks to be orchestrated and monitored as part of a larger BPM process. He also had a great analogy when asked about deciding when to use RPA versus BPM: RPA is like a pain reliever that provides temporary relief, while BPM (and SOA) is like an antibiotic that cures the underlying problem.
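The BPM-plus-RPA pattern he showed can be sketched in a few lines (illustrative only, with invented function names rather than Appian’s or Blue Prism’s actual APIs): the process invokes a bot, and a bot failure becomes an exception path that routes to a human task, keeping the failure visible within the process.

```python
class BotFailure(Exception):
    """Raised when the RPA bot cannot complete its assigned work."""

def run_rpa_bot(invoice_pdf: str) -> dict:
    # Stand-in for an RPA bot that extracts invoice data from a PDF
    raise BotFailure("unreadable scan")  # simulate a bot exception

def create_human_task(invoice_pdf: str, reason: str) -> None:
    print(f"routing {invoice_pdf} to a reviewer: {reason}")

def process_invoice(invoice_pdf: str) -> None:
    try:
        data = run_rpa_bot(invoice_pdf)
        print("bot extracted:", data)
    except BotFailure as exc:
        create_human_task(invoice_pdf, str(exc))  # BPM exception-handling path

process_invoice("invoice-1042.pdf")
```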

Cloud architecture panel at bpmNEXT 2019

As a twist on the usual bpmNEXT format, we heard from a panel of the demo participants: Michael Lim of IBM, Philippe Laumay of Bonitasoft and Phil Simpson of Red Hat. A few notes from the panel – no attribution of specific comments, but you can likely make some guesses – on what vendors are facing with cloud architectures.

  • Platform architecture needs to have cloud-level scalability through containerization
  • Cloud is pushing vendors from a monolithic BPMS platform to a microservice architecture for elasticity
  • A “boil the ocean” monolithic digital business platform doesn’t make sense; it’s better to provide easily-consumable services on a pay-per-use basis
  • Services are assembled into solutions but may be guided by a platform strategy to know what will work well together
  • A single-vendor platform requires pricing for only the components used
  • Monolithic platforms provide a common data model used by a single vendor’s tools for better application of machine learning and AI to the data
  • Low-code application development, solution accelerators or partner-created vertical solutions are required to sell the cloud platform
  • Cloud microservice architecture enables collaboration between vendors and customers in more of an open source model
  • Picking the right best-of-breed service for your use case can be a competitive differentiator
  • Systems integrators are going to shift to more of a consulting role to focus on best practices (including which service to pick for which application) rather than building solutions
  • Vendors can help to build relationships between partners with complementary skills to build solutions together
  • Cloud doesn’t necessarily mean that you don’t know where the data is (e.g., hybrid cloud), just that it is managed in a consistent fashion and transparent to the users
  • Capture (from physical documents/objects) is one area where physical location is particularly relevant since the physical documents will be stored somewhere for a period of time
  • BPMN isn’t necessarily used for end-to-end modeling of executable processes, since that implies orchestration at that level; at the high level, it is more commonly used to model milestones and business behaviors

In summary: cloud and microservices are good, but the single-vendor platform versus best-of-breed services is still up for debate.

Now, on to the demos.

bpmNEXT 2019 keynote: @JimSinur on technology combinations that digitally deliver

Our second keynote on the first day of bpmNEXT 2019 is with long-time presenter Jim Sinur, looking at technology combinations that digitally deliver. Unlike his usual focus on future directions, he’s driving down into what technologies work for companies that are undergoing digital transformation. This is a great lead-in to what I’ll be talking about tomorrow morning, and I fully expect to be fine-tuning my presentation before then to incorporate ideas from Jim’s talk as well as Nathaniel Palmer’s presentation that preceded it.

Digital business platforms – something bigger than a BPMS – provide the real pathway to digital transformation, combining a variety of technologies. The traditional BPMS products are strong in work/process management, but they also need proactive intelligence, integration, automation, IoT enablement and business functionality. He looks at technical streams and their benefits, ranging from computational technologies to consumer delivery channels. He had a draft version of a matrix that he’s working on that shows attributes for these different technologies, from the skill level required to get started with the technology to the likelihood of the vendors in this category partnering with other category vendors successfully, leading to a list of top productive pairs and triplets that we’re seeing in the market today: BPM and AI, for example, for processes with smart resources and actions; or architecture, low code and RPA for incremental transformation of legacy systems.

He finished up with how we will be leveraging the trends for marketplace collaboration between vendor products, encouraging the vendors in the room (mostly everybody) to collaborate along the lines of his top pairs and triplets. In my opinion, this won’t necessarily be the vendors deciding to partner to offer joint solutions, but larger enterprises deciding to roll their own platforms using a combination of best-of-breed technologies that they select themselves: the vendors will need to make sure that their products can be sliced, diced and re-integrated in the way that the customers want.

Slide decks and videos of all presentations will be online within a day or two; I’ll come back and update all of the posts with links then.

Kicking off bpmNEXT 2019 with @NathanielPalmer

Except for a hiatus in 2017, I’ve been at every bpmNEXT since its inception in 2013, created and hosted by Bruce Silver and Nathaniel Palmer as a showcase for new ideas in BPM and related technologies. This is not a conference for (potential) customers, but a place for vendors, researchers and analysts to come together to exchange ideas about what’s happening in the marketplace and the technology labs. Most of the agenda is made up of 30-minute demo sessions with a few panels and keynotes sprinkled in.

Nathaniel Palmer started our first day with a look forward at the next five years of BPM by considering the five-year span from 2015 to 2020 and how the predictions from his first predictions keynote are playing out. In 2015, he talked about intelligent automation; today, we’re seeing robots and rules-based automation as an integral part of how business is done. This is pretty crucial, because the average number of systems required to present a complete view of a customer is 13.2 (!), 8 of which are external, with 80% of firms stating that they use more than 10 systems to get that 360-degree view. He talks about the need for an intelligent automation platform that includes robotic automation, AI and machine learning, decision management, and process management, communicating with events and data via an event gateway/bus. He believes that the role of a BPMS is also to provide the framework for development and to build the user interface – an idea that I’ll be debating somewhat in my keynote tomorrow – but sees always-on, context-driven devices such as smart speakers as the future of how we interact with systems, rather than traditional computers and smartphones. That means that conversational interaction will take over from worklist metaphors for common processes for consumers and employees; my interpretation of this is that the task-focused activities are those that will be automated, leaving the more fluid activities for people to deal with.

A consideration with this changing nature of automation is how to model it. Our traditional workflows have a pre-defined path, whereas intelligent automation (with more of a case management/ad hoc paradigm) has more adaptable processes driven by rules and business context. It’s more like using Waze for dynamically-adjusted driving directions rather than following a pre-conceived idea of what route to take. The danger with this – in my experience with both Waze and adaptable business processes – is that you could end up on a route that is not generally followed, that messes up the people who have to get involved along the way, and that definitely isn’t repeatable or scalable: better for that specific instance and its participants, but possibly detrimental to others. The potential gain is, of course, that the process as a whole is more resilient because it responds to events by determining an action that will reach the goal, and you may just find a new and better way of doing something. Respond to events, definitely, but at some point take a step back and consider the impact of the new pathways that you’re carving out.

He spoke about problems with AI/ML and training data biases – robots are only as smart as your training data – and highlighted that BPM platforms are a great source of training data via process mining and analysis.

Insightful as always, and it will be interesting to see these themes play out in the demos over the next three days.

2019 @Alfresco Day: RBC Capital Markets

Yesterday at the analyst day, Alfresco CEO Bernadette Nixon had a fireside chat with Jim Williams of RBC about their Alfresco journey, and today at the user conference, Williams gave us more of the details of what they’re doing. They had an aging platform (built on Pega) that wasn’t able to support their derivatives business operations adequately, having been designed for a single purpose without the ability to easily change, resulting in many manual processes.

They wanted to have a single BPM and ECM platform that would span all of their business areas for handling regulatory documentation, and they started in 2015 with their equities operations: not because it was easy, low-hanging fruit, but because it was complex and essential to get it right. They now have 14 applications built on the same framework, and 3,500+ users. Williams said that they specifically liked Alfresco because it doesn’t try to be everything but integrates with other products and services to do functions such as reporting or OCR; this is particularly interesting in the face of other vendor platforms that want to be everything to everyone, and don’t do some of the functions very well.

By 2016, they had rolled out applications in tax operations, driven by changing IRS rules that required foreign banks like RBC to withhold tax on US investments unless clients could prove that they met non-resident requirements. This had to integrate with many of their other operational processes that followed. They also implemented content and process applications for HR, due to some complex job role management requirements in the UK, reducing dependency on spreadsheets and email for what are essentially core processes.

Like all of the very conservative Canadian financial institutions, their Alfresco implementation is all on premise rather than cloud, although they have cloud ambitions. It’s also important to note that although RBC is Canada’s largest bank, Capital Markets is a relatively small part of it; it will be interesting to see if Williams can carry the Alfresco message to other parts of the organization.

2019 @Alfresco Analyst Day: vision for the future with @JohnNewton

We wrapped up the 2019 Analyst Day with founder John Newton talking about Alfresco’s vision for the future.

Most digital transformation efforts today are focused on external experiences, that is, how a company interacts with its customers. However, there’s more to it than that: the external experience has to interact with employee experiences and operational systems; this linkage is what Newton calls digital operations. Looking at the ubiquitous onboarding use case, digital transformation is not just about the nice app that the customer sees to upload their documents: it’s also about the straight-through processing that manages what happens after the customer does that upload, or requests a service. He points out that it’s all about the process, and that content follows the process. This, obviously, is music to my ears.

Customers need to think about their digital business platform, which is not the same as any vendor’s digital business platform: it’s more than that, and it may be made up of more than one vendor’s platform. It needs to handle the digital outside (customer-facing) as well as the digital inside (employee-facing), and the end-to-end processes and content repositories that link them. There are a number of disruptive technologies that are driving digital operations — cloud, microservices, edge computing, blockchain — and there will always be a new one to add to this list.

That took us to their strategic themes:

  1. Process-first digital operations, including process, content, search, governance and insight capabilities
  2. Global-scale, multi-cloud digital operations, which removes the enterprise infrastructure concerns such as scalability and global replication
  3. Artificial intelligence powering digital operations, with the modern range of AI services now widely available from the internet giants being applied to content and process
  4. Empowering business users with targeted solutions, and improved user experience
  5. Empowering builders to accelerate solutions, with development and deployment tools
  6. Differentiate open source and enterprise (note that this is the first mention of open source all day), with add-on capabilities to the open source core services and engines

Always an insightful speaker, and I’m particularly interested in how the layers above the API “surface” (such as the Alfresco Digital Framework and Digital Workspace built on the ADF) are adopted in practice versus direct API usage.

That’s it for the analyst day; I’ll be back tomorrow for the regular user conference.

2019 @Alfresco Analyst Day: partner strategy

Darren Yetzer, Alfresco’s VP Channel, took us through their partner strategy, and hosted conversations with Bibhakar Pandey of Cognizant and Matt Yanchyshyn of AWS.

The Alfresco partner ecosystem has three divisions:

  • Global system integrators
  • Local/regional system integrators
  • Technology and ISVs

In the breakout discussions that we had earlier in the afternoon, there was a lot of conversation about the strategy’s focus on vertical use cases, and how partners will need to fill a big part of that need since Alfresco isn’t in the vertical application business. Obviously, Alfresco needs to have more than just a logo salad of partners: they need partners that can work with customers to develop vertical solutions, make sales and extend functionality.

Pandey sees Alfresco’s platform as a way for them to quickly create domain-specific content-centric solutions for their clients that include integration with other platforms and systems. They don’t want to have to build that base level of capabilities before starting their solution development, so they see a platform like Alfresco as table stakes for working effectively as an SI. Cognizant works with a broad scope of customer organizations, but Pandey highlighted insurance as one vertical that is ripe for the content capabilities offered via Alfresco’s platform. He focused on the cloud-native capabilities of Alfresco as essential, as well as the microservices architecture that allows specific functions to be deployed and scaled independently.

The Amazon AWS partnership is quite different, and even more significant from a platform functionality point of view: we saw this beginning at last year’s Alfresco Day with the significant announcements about native AWS containerization, and continuing now with the integration of their AI/ML services. Yanchyshyn discussed how Amazon is now developing domain-specific intelligence services, such as Comprehend Medical, and how organizations will start to take advantage of this higher-level starting point for mapping their business use cases onto Amazon’s AI/ML services. He sees Alfresco as an early adopter of these services, and holds them up as an example of what can be done to integrate them into larger platforms.

2019 @Alfresco Day: Go To Market Strategy

Jennifer Smith, Alfresco’s CMO, gave us an expansion of the GTM strategy that Bernadette Nixon spoke about earlier today.

Their offering is built on a single cloud-native platform combining content, process and governance services, on which they identify three pillars of their horizontal platform approach:

  • Modernization and migration, providing tools for migrating to Alfresco quickly and with minimal risk
  • Coexistence and integration, allowing for easy integration with third-party services and legacy systems
  • Cloud-native and AWS-first, with deep integration and support for AWS cloud platform, storage and AI/ML services

Their vertical use case approach is based on a typical land-and-expand strategy: they take an existing implementation with a customer and find other use cases within that organization to leverage the platform benefits, then work with a large enterprise or partner to develop managed vertical solutions.

We saw a demo of a citizen services scenario: to paraphrase, a government agency has old, siloed systems and bad processes, but citizens want to interact with that agency in the same way that they interact with other services such as their bank. In a modernized passport application example, the process would include document upload directly by the citizen, intelligent classification and extraction from the documents, fraud detection by integration with other data sources, natural language translation to communicate with foreign agencies, and tasks for manual review. Although the process and content bits are handled natively by Alfresco, much of the intelligence is based on Amazon services such as Comprehend and Textract — Alfresco’s partnership with Amazon and AWS-native platform make this a natural fit.
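For a sense of what that looks like at the API level, here’s a minimal sketch using the standard boto3 clients for Textract and Comprehend; the bucket and document names are hypothetical, and the Alfresco process and content wiring around these calls is not shown.

```python
import boto3

textract = boto3.client("textract", region_name="us-east-1")
comprehend = boto3.client("comprehend", region_name="us-east-1")

# Extract key-value pairs (name, date of birth, etc.) from an uploaded form image
extraction = textract.analyze_document(
    Document={"S3Object": {"Bucket": "citizen-uploads",
                           "Name": "passport-application.png"}},
    FeatureTypes=["FORMS"],
)
print(len(extraction["Blocks"]), "blocks extracted")

# Detect entities in accompanying correspondence, e.g. to support routing or fraud checks
letter = "Please renew the passport for Jane Doe, born in Lyon, France."
entities = comprehend.detect_entities(Text=letter, LanguageCode="en")
print([(e["Type"], e["Text"]) for e in entities["Entities"]])
```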

We’re off to some breakouts now, then partner strategy this afternoon, so it might be quiet here until tomorrow.