OpenText Analyst Summit 2020 day 2: Digital Accelerants

Although technically a product breakout, the session on OpenText’s Digital Accelerants product collection was presented as our last full-audience session before the afternoon breakouts. It was split into three sections: cloud, AI and analytics, and process automation.

Jon Schupp, VP of Cloud GTM, spoke about how information is transforming the world: not just cloud, but a number of other technologies, a changing workforce, growing customer expectations and privacy concerns. Cloud, however, is the destination for innovation. Moving to cloud allows enterprise customers to take advantage of the latest product features, guaranteed availability, global reach and scalability while reducing their operational IT footprint. OpenText provides a number of different deployment platforms: “off-cloud” (aka on-premise), public cloud, private cloud, managed services, and SaaS.

Dave Moyers and Paul O’Hagan were up next to talk about AI and analytics, and how they are addressing data variety, ease of use, embedding AI/ML in processes, and deploying anywhere that it’s required. Their AI and analytics capabilities are provided by the Magellan product, and have been integrated with other OpenText products as well as built into vertical solutions. Magellan has been integrated into the ECM products with AI-augmented capture and the AI-powered “magic folder” auto-categorization and filing; into the Business Network products with asset performance optimization and predictive maintenance; into AppWorks by instantiating processes based on insights; and into several other products. They also have some new standard features for identifying PII (personally identifiable information), which is crucial for compliance and privacy. In addition to the analysis capabilities, there is a wide range of dashboard and visualization options, and full-fledged ETL for connecting to enterprise and third-party data sources and organizing data flows. We also saw some examples yesterday of using Magellan for e-discovery and sentiment analysis. Interestingly, this is one of the product portfolios where they’ve taken advantage of integrating with open source tools to extend the core products.
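OpenText hasn’t published how Magellan’s PII identification works internally, but as a point of comparison, the simplest possible approach is rule-based pattern matching, as in the hypothetical Java sketch below; ML-based detection like Magellan’s presumably goes well beyond this to catch contextual identifiers that regular expressions miss. The rule names and patterns here are my own illustrative assumptions:

```java
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

// Naive rule-based PII flagging -- purely illustrative, not Magellan's method.
public class PiiScan {
    private static final Map<String, Pattern> RULES = Map.of(
            "email", Pattern.compile("[\\w.+-]+@[\\w-]+\\.[\\w.-]+"),
            "ssn", Pattern.compile("\\b\\d{3}-\\d{2}-\\d{4}\\b"),
            "phone", Pattern.compile("\\b\\d{3}[-.]\\d{3}[-.]\\d{4}\\b"));

    // Returns the names of any PII rules that match the given text.
    public static List<String> flag(String text) {
        return RULES.entrySet().stream()
                .filter(e -> e.getValue().matcher(text).find())
                .map(Map.Entry::getKey)
                .toList();
    }
}
```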

Saving the best for last (okay, maybe that’s just my bias), Lori McKellar and Nick King presented on business process automation. This is not just about back-office automation, but includes customer-facing processes, IoT and other more complex intelligent processes. AppWorks, which includes the process automation capabilities, is an application development environment for use by semi-technical citizen developers (low-code) as well as professional developers (pro-code). We saw the all-new developer experience last year, and now they’ve had a chance to incorporate actual customer usage to fine-tune both the developer and end-user AppWorks experience. One significant change is that as their customers start to build larger apps, they now provide more granular access to the entities under development, so that multiple developers can work on the same application simultaneously without collisions. They’ve added some new UI capabilities, such as a card view option and an optimized tablet view. Integration with Documentum has been improved for easier document check-in and repository access. Privacy features, including dynamic instance-level permissions and document redaction, are now available in AppWorks. In the upcoming 20.2 version, they’ll be adding an RPA connector framework, then expanding the RPA integrations in 20.3.

The session finished with a Q&A with all of the participants, including discussions on RPA connectors, operationalization of machine learning, hybrid cloud models, the role of unstructured content in AI training, and natural language processing.

This afternoon, I’ll be attending the breakout session on content services, so stay tuned for those final notes from the OpenText Analyst Summit 2020.

OpenText Analyst Summit 2020 day 1: technology strategy

After the break, OpenText EVP and Chief Product Officer Muhi Majzoub took the stage at the analyst summit to talk about innovation within their products, a strategic projects update and a bit of a roadmap. They have innovation that comes from customer requirements as well as their own drivers, but they also have a lot to do in order to integrate new acquisitions.

He stressed that Documentum and Content Suite are both being maintained, with innovation (e.g., UI, Core Share integration) being applied to both product lines; although this is probably a great relief for customers of either product, I can’t believe that this will go on forever. This is the real challenge for OpenText going forward: how to consolidate some of their overlapping/competing acquisitions without alienating customers, especially in the content space where information is persistent for a long time. Branding everything as “Cloud Edition” doesn’t fix the problem; it just obscures it.

Majzoub spoke about their four strategic projects:

  • Cloud Edition (CE) is their cloud-native platform for running all manner of solutions and applications, which runs on a variety of cloud vendor platforms (OpenText, Google, AWS, Azure, Anthos) and includes containerized deployment models.
  • OT2 is their cloud-native application development platform, with 231 of their own services and pre-built SaaS applications. This includes their Core services and applications, such as content and workflow services and many others.
  • Business Network, which includes a range of B2B services from fax to EDI, plus a huge directory of global trading partners that can be linked via OpenText’s platform.
  • Carbonite integration, which brings some new capabilities in cybersecurity, as well as SMB/consumer content management.

He finished with five new innovations to watch for from OpenText, covering new features in Documentum, IoT-connected supply chain, Exstream, Magellan, and EnCase.

Craig Stilwell, formerly Chief Revenue Officer at Carbonite and now OpenText’s EVP and GM of the SMB and Consumer division, was on next to highlight some (more) of what Carbonite is bringing to OpenText. This acquisition is obviously energizing a lot of people, since we’ve heard about it in every presentation so far today. Carbonite, through their original product and their acquisition of Webroot last year, covers two of the main concerns of many SMBs: backup/disaster recovery, and endpoint protection. Unlike large organizations that own their own data centers, SMBs (and consumers) were much earlier adopters of cloud storage and computation, and therefore some of the early victims of downtime due to disaster or cyber attack.

We finished the day with SVP of product management Stephen Ludlow and demos by his four VPs of product management, each of whom is responsible for a different product category. With their broad range of products, they obviously didn’t cover everything, but each showed an interesting capability with a large potential business impact:

  • Marc Diefenbruch demonstrated the intelligent folder in the content suite, which is AI-powered content classification and filing.
  • Dawn Andre demonstrated the identification and connection with potential trading partners based on multiple qualification criteria in the Trading Grid global partner directory.
  • Guy Hellier demonstrated personalized omnichannel communications with Exstream for improving customer satisfaction, using customer data sources and Magellan speech analytics tied together with Core Experience Insights.
  • Michael Cybala demonstrated secure cloud sharing, collaboration and signing of documents using Core Share and Core Signature working with Content Server.

That’s it for our first (half) day at the 2020 OpenText analyst summit. We’ll be back tomorrow for another full day.

2020 State of Business Process Management report from @BPTrends

The very first post that I wrote here, back in 2005, was on the BPTrends 2005 BPM Suites Report, which has evolved into their State of Business Process Management report (free, but registration required). Back in 2005, I noted that the vendors included in the report were “pay to play”, whereas the current report is mostly BPM background/thought leadership information plus the detailed results of surveys of BPM practitioners. Although survey respondents are asked about the tools that they use, and the capabilities that they require for future work, the specific vendors are not discussed in any detail. For this report, Paul Harmon of BPTrends worked with Jorge Garcia of Technology Evaluation Centers (TEC): it appears that Harmon focused on the first section, How Organizations Understand Business Processes, while Garcia covered the second section on Business Process Software Tools. The report was sponsored by Creatio (formerly bpm’online), Signavio and Trisotech, so thanks to them for helping to make this report free to everyone (as an aside, Signavio and Trisotech are both customers of mine, but that’s not why I’m writing this post).

This quote from the executive summary really highlights why processes are such an important part of understanding how businesses are going digital:

BPTrends started tracking the process market in 2005 when BPMS tools first appeared on the scene. In the years since, enthusiasm has driven a wide variety of process initiatives. Underneath it all, however, was the interest in Internet-based tools that could model, track and control major business processes. The tools have gone through a variety of changes and are, today, powerful, widespread and widely used. The initial enthusiasm for new process work has declined a bit and current interest is perhaps better characterized by the term digital transformation, but the underlying impulse – to improve how businesses perform their work – remains.

There’s some good analysis in the report, including trends that they’ve noted from their surveys since 2005. Interestingly, the percentage of companies that see BPM efforts as a major strategic commitment by executive management has actually decreased, and is now at 23%; the largest current category (34%) is companies that are working on a limited number of mid- or low-level projects. That’s right, 1/3 of the respondents to the BPTrends survey – who are presumably engaged to some degree in BPM efforts – are using BPM methods/technologies only on lower-level, non-strategic projects. They spin the results in a slightly different way, pointing out that the “major strategic commitment” and “significant commitment” categories combined account for 50% of the respondents. Their discussion in this section explored the idea that there was a big uptick in process interest in 2007-9, but interest hasn’t really grown since then: in many organizations, a senior manager brings in process management methodologies or tools as a “pet project”, then after they move on, no one takes up the reins to continue the process improvement efforts.

Another interesting result is the major business drivers for business process change (each organization could choose multiple): reducing costs/improving productivity has steadily risen to the current high of 69%, although product innovation and customer satisfaction are also top choices for more than 1/3 of organizations. This really highlights what I see in practice: productivity and cost are table stakes in any process improvement effort, and although they might not be the front-of-mind reason for many executives, they are expected outcomes.

The second section on BPM tools has quite a bit of information on how the respondents’ companies are using process modeling and analysis tools, but much less on process automation.

I’ve included a couple of their results here as a teaser, but I highly recommend that you head over to BPTrends and grab a (free) copy of the full report.

Focus on Insurance Processes: Product Innovation While Managing Risk and Costs – my upcoming webinar with @Signavio

I know, I just announced a banking webinar with Signavio on February 25; this is another one with an insurance focus on March 10 at 1pm ET. From the webinar description:

With customer churn rates approaching 25% in some insurance sectors, insurers are attempting to increase customer retention by offering innovative products that better address today’s market. The ability to create and support innovative products has become a top-level goal for many insurance company executives, and requires agile and automated end-to-end processes for a personalized customer journey.

Similar to the banking webinar, the focus is more on management-level concerns, and I’ll look at some use cases around insurance product innovation and claims.

Head on over to the landing page to sign up for the webinar. If you’re unable to attend, you’ll receive a link to the replay.

My post on the @Trisotech blog: Designing Processes for Agility

In my continuing series over on the Trisotech blog, I’ve just published a post on issues to consider when designing processes for agility, especially the tradeoffs between what goes in a process model versus a decision model. I’ve spent quite a bit of time thinking about this in the past when working with enterprise customers on their processes, and a conversation after a conference presentation last year really brought some of the ideas into focus. From that post:

Assuming that you’re using model-driven process and decision management systems (BPMN and DMN, respectively) for design and implementation, you might assume that it doesn’t really matter how you design your applications, since either could be quickly changed to accommodate changing business needs. It’s true that model-driven process and decision management systems give you agility in that they can be changed relatively quickly with little or no coding, then tested and redeployed in a matter of hours or days. But your design choices can impact understandability of the models as well as agility of the resulting application, and it’s important to have both.
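To make the tradeoff from that quote concrete: logic that changes frequently (pricing rules, eligibility thresholds) is often better externalized into a DMN decision table that the process invokes, so the process model itself stays stable. As a generic illustration — the post itself is vendor-neutral, and the decision key, file name and variable names here are my own invention — this is a minimal sketch of evaluating a standalone DMN decision table using the open-source Camunda DMN engine:

```java
import java.io.InputStream;

import org.camunda.bpm.dmn.engine.DmnDecision;
import org.camunda.bpm.dmn.engine.DmnDecisionTableResult;
import org.camunda.bpm.dmn.engine.DmnEngine;
import org.camunda.bpm.dmn.engine.DmnEngineConfiguration;
import org.camunda.bpm.engine.variable.VariableMap;
import org.camunda.bpm.engine.variable.Variables;

public class DiscountDecisionDemo {
    public static void main(String[] args) {
        DmnEngine dmnEngine = DmnEngineConfiguration
                .createDefaultDmnEngineConfiguration()
                .buildEngine();

        // Parse a decision table from a DMN file (hypothetical file and key).
        InputStream dmnFile = DiscountDecisionDemo.class
                .getResourceAsStream("/discount.dmn");
        DmnDecision decision = dmnEngine.parseDecision("discount", dmnFile);

        // The volatile business rules live in the DMN table, so they can be
        // changed without touching the BPMN process that calls this decision.
        VariableMap variables = Variables.createVariables()
                .putValue("customerType", "gold")
                .putValue("orderTotal", 900);

        DmnDecisionTableResult result =
                dmnEngine.evaluateDecisionTable(decision, variables);
        System.out.println("Discount: " + result.getSingleResult().getSingleEntry());
    }
}
```

The same separation applies regardless of vendor: the process model references the decision by name, and the frequently-changing rules stay out of the process flow.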

Head on over to their blog to read the entire post.

If you have comments or questions about the post, feel free to engage in the comments on this post, since Trisotech doesn’t allow commenting on their blog.

Camunda Cloud beta goes public

It’s definitely webinar season! I’ve seen a lot of webinar invitations pass by recently, and I’ll be speaking on a couple in the coming weeks. Today, I listened in on a webinar about the Camunda Cloud public beta, with Daniel Meyer (Camunda CTO) discussing their drivers for creating it, and Immanuel Monma providing a demo. I heard about the Camunda Cloud at CamundaCon last September, and it’s good to see that they’re launching it so soon.

Meyer spoke about using cloud-based process automation for modernizing legacy infrastructure, and the requirements that they had for re-inventing process automation for the cloud:

  • Externalize processes from business applications (this isn’t really new, since it’s been a driver for BPM systems all along).
  • Maximize developer productivity by allowing them to work within their programming language of choice.
  • Support hybrid orchestration with both cloud and on-premise applications, and across multiple public cloud platforms.
  • Native BPMN execution.
  • Cloud scalability and resilience.

This is where their Zeebe workflow engine comes in, which is at the core of Camunda Cloud. By supporting hybrid orchestration, Camunda Cloud allows for a gradual migration of legacy on-premise IT: first externalizing the processes, then migrating some of the legacy functionality to cloud-based microservices while still supporting direct integration with the legacy IT, then gradually (if possible) migrating all of the legacy functionality to the cloud. This provides the advantages of both a microservices architecture, for modularity and scalability, and process orchestration, to knit things together for loose coupling with end-to-end visibility.

The live demo showed the interaction between Zeebe and Operate, the two main execution components of Camunda Cloud, plus Cawemo for collaborative modeling of the processes (although the process could have just been modeled in the Zeebe modeler). Monma walked us through how to create, deploy and execute a simple BPMN process in Camunda Cloud; watching the webinar replay would be a great place to start if you want to play around with the beta. Note that aside from creating the BPMN model in Cawemo, which may involve business people, this is a technical developer toolset for service orchestration and automated processes at this point. You can plug into their Zeebe Slack community or forum to interact with other developers who are trying things out.
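For developers who want to try the beta, the deploy-and-execute cycle that Monma demonstrated looks roughly like the sketch below using the Zeebe Java client (one of several language clients). The gateway address, BPMN file name and process ID are placeholders, and some builder method names vary across client versions:

```java
import io.zeebe.client.ZeebeClient;

public class ZeebeDemo {
    public static void main(String[] args) {
        // Connect to a Zeebe gateway; Camunda Cloud supplies real cluster
        // credentials, and this local plaintext connection is for testing.
        try (ZeebeClient client = ZeebeClient.newClientBuilder()
                .brokerContactPoint("localhost:26500")
                .usePlaintext()
                .build()) {

            // Deploy the BPMN model (hypothetical file name).
            client.newDeployCommand()
                    .addResourceFromClasspath("order-process.bpmn")
                    .send()
                    .join();

            // Start an instance of the deployed process with some variables;
            // its progress can then be monitored in Operate.
            client.newCreateInstanceCommand()
                    .bpmnProcessId("order-process")
                    .latestVersion()
                    .variables("{\"orderId\": 12345}")
                    .send()
                    .join();
        }
    }
}
```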

Future Camunda Cloud components

Meyer returned with the product roadmap, then handled questions from attendees. Right now, Camunda Cloud is a free public beta with some limitations; they will be launching the GA version shortly (he said “hopefully within the next month”), which will allow better control over clusters plus have SLA-based technical support. They are also adding human workflow with a tasklist, providing both an API and a simple out-of-the-box UI, which will also push the addition of the human task type in the Zeebe BPMN coverage. They will be adding analytics via a cloud version of Optimize. The Camunda components are running in their own cloud, currently on Google Cloud with an automated Kubernetes infrastructure; in the future, they will expand this to run in multiple (geographic) regions to better support applications in different regions. They may consider running on different cloud platforms, although since this is hidden from the Camunda Cloud customers, it may not be necessary. There were a number of other good questions covering hybrid orchestration, the use of RPA, and how Zeebe’s underlying event-streaming distributed architecture provides for vastly greater scalability than most BPM systems.

You’ll be able to see the webinar replay (typically without registration) on the webinar information page as soon as they publish it.

Focus on Banking Processes: Improve Revenue, Costs and Compliance – my upcoming webinar with @Signavio

I’ll be presenting on two webinars sponsored by Signavio in the upcoming weeks, starting with one on banking processes on February 25 at 1pm ET. In this first webinar, I’ll be taking a look not just at the operational improvements, but at the (executive) management-level concerns of improving revenue, controlling costs and maintaining compliance. From the webinar description:

Today’s retail banks face more challenges than ever before: in addition to competing with each other, they are competing with fintech startups that provide alternatives to traditional banking products and methods. The concerns in the executive suite continue to focus on revenue, costs and compliance, but those top-level goals are more than just numbers. Revenue is tied closely to customer satisfaction and wallet share, with today’s customers expecting personalized banking products and modern omnichannel experiences.

You can sign up for the webinar here. This will be a concise 35 minutes plus Q&A, and I’ll include some use case examples from client onboarding and KYC in retail banking.

ARIS Elements: the cloud “starter edition” for process design

I decided not to get up at 4am (Eastern time) earlier this week to watch the ARIS Elements launch webinar presented by ARIS senior product manager Tom Thaler, but Software AG has published it here — no registration required — and I had a quick look at it, as well as checking out the ARIS Elements website, which is already live.

Creating a model in ARIS Elements, showing the seven supported model types

As seen in the webinar, model creation allows you to create seven different types of models: process landscape, BPMN process, event-driven process (EPC), organizational chart, system diagram, data model, and structuring model. It does not include DMN or CMMN models; DMN is in ARIS Advanced and Enterprise editions.

Thaler demonstrated creating a BPMN model, which is similar to many of the other cloud-based modelers, although the extent of their BPMN coverage isn’t clear (for some of the more esoteric event types, for example). What they do provide that is unique, however, is analysis-focused information for different steps, such as RACI responsibility assignments that link directly to an organizational chart. BPMN models are checked for validity, even though these are probably not expected to be directly-executable models. Once a model is created, it can be previewed and then published (unless the database has been set for auto-publication). In addition to the visual model, the preview/published versions of BPMN models show a summary tabular view of the process steps, with roles, input, output and IT systems for each. The RACI chart is also generated from the values entered in the process model.

A process landscape/map model can be created to show groups of processes (in the demo, the top level groups were management, core and supporting processes); these can in turn be nested groups of processes for more complex areas such as HR.

A user can set specific models as favorites, which will then appear on their home page for easy access. There is a hierarchical view of the repository by model type.

There are fairly standard user management features to add new users and assign permissions, although this edition does not provide single sign-on.

There are a number of video tutorials available to show how to create different model types and manage settings, and a free trial if you want to get started quickly.

There were a number of good questions in the Q&A (starting at around 38 minutes into the webinar) that exposed some of the other features and limitations of ARIS Elements. Many of these were obviously from people who are currently ARIS users, and looking to see if Elements fits into their plans:

  • Commenting by viewers is not supported
  • BPMN models can be imported
  • There is only one database (multiple databases to separate business units is a feature in ARIS Advanced/Enterprise)
  • Upgrading to a more expensive version would allow all models that were already created to be migrated
  • There is no automation of model review cycles (or any other workflow control), such as having a model reviewed by one or more others before publication; this would have to be done manually
  • There is no document storage (supporting documents can be stored directly in ARIS Advanced/Enterprise)
  • There is no process comparison (available in higher level versions)
  • Migrating from an ARIS on-premise edition to Elements could result in data loss since not all of the model types and features are supported, and is not recommended
  • There are a small number of pre-defined reports available for immediate use, but no report customization

If you look at the pricing page, which also shows a feature comparison chart, you’ll see that ARIS Elements is considered the low-end edition of their cloud process modeling product suite. It’s fairly limited (up to 20 users, one database, other limitations) and is priced at 100€ (about US$110) per designer user per month and 50€ per 10 viewer users; that seems somewhat high, but they offer a broader range of model types than competitive process modeling tools, and include a shared repository for collaborative designing and viewing.

ARIS Elements is being positioned in an interesting space: it’s more than just process modeling, but less than the more complete enterprise architecture modeling that you’ll find in ARIS Advanced/Enterprise and competitive EA modeling products. It’s being targeted at “beginners”, although arguably beginners would not be creating a lot of these model types, even if they might be viewing them. Possibly they’ve had feedback that the Advanced version is just a bit too complex for many situations, and they are attempting to hit the part of the market that doesn’t need full capabilities; or they are offering Elements as a starting point, with the goal of migrating many of these customers onto the Advanced/Enterprise editions as soon as they run up against the limitations.

APQC webinar: 2020 process and performance management priorities, with @hlykehogland

I listened in on a webinar today with APQC‘s Holly Lyke-Ho-Gland looking at the results of their 2020 process and performance management priorities survey (conducted in late 2019). Some good insights here, looking at the top three challenges in business process management and continuous improvement. Process modeling and mining vendors will be happy to see that the highest priority challenge in BPM is defining and mapping end-to-end processes.

She covered a number of tips and solutions to address these challenges, from developing end-to-end processes, to building a culture of continuous improvement, to governance alignment. She included a lot of great case studies and examples across all of these areas, and what types of resources and expertise are required to achieve them.

After covering the business process management and continuous improvement side, she moved on to discuss the organizational performance management challenges and solutions. Performance management is more about analytics and metrics, and using those measures to support decision making; apparently this year’s challenges are the same as last year’s, meaning that organizations are still struggling with these issues.

Some interesting points here about change management plans and what needs to be done in order to be successful in performance management; check out the webinar replay for details.

The last part of the webinar was on their “special interest” section, which this year is process management. The first point was on the purpose of process teams and work, the most important of which is supporting strategic initiatives. This is definitely what I see in my own consulting practice, with process gaining a much higher profile as companies focus on digital transformation efforts: at their heart, many transformation efforts are process-centric. The APQC research also showed information on measuring process work, and she notes (as I often see) that the top measures are still focused on bottom-line savings rather than more strategic measures, meaning that process metrics are misaligned with strategic focus. She also covered the impact of technology on process work: not just process automation, but collaboration, data management and visualization, and cloud computing topped the technology list, since they are looking at the entire process management lifecycle. She made a direct call-out to process mining (although it wasn’t in the top five list) as a cross-over between data analysis and process modeling; I’m definitely in agreement with that, as you can see from my post earlier this week.

She finished with a summary of the survey results, plus a peek at their research agenda for 2020 with lots of interesting and timely topics. I like that their research uses a lot of real-world case studies.

I couldn’t find a direct link to the webinar replay yet, but it will likely be available on APQC’s On-Demand Webinars page soon; definitely worth checking out for Lyke-Ho-Gland’s insights and discussion. While you’re over there, check out their Process and Performance Management Conference, coming up in October. I spoke at their conference back in 2013, and really enjoyed the experience: good sessions, and a smaller conference that’s great for networking.

Process is cool (again), and the coolest kid on the block is process mining

I first saw process mining software in 2008, when Fujitsu was showing off their process discovery software/services package, plus an interesting presentation by Anne Rozinat from that year’s academic BPM conference, where she tied together concepts of process mining and simulation without really using the terms process mining or discovery. Rozinat went on to form Fluxicon, which developed one of the earliest process mining products and really opened up the market, and she spent time providing much of my early process mining education. Fast forward 10+ years, and process mining is finally a hot topic: I’m seeing it from a few mining-only companies (Celonis), as part of a suite from process modeling companies (Signavio), and even within a larger process automation suite (Software AG). Eindhoven University of Technology, arguably the birthplace of process mining, even offers a free process mining course which is quite comprehensive and covers usage as well as many of the underlying algorithms — I did the course and found it offered some great insights and a few challenges.

Today, Celonis hosted a webinar, featuring Rob Koplowitz of Forrester in conversation with Celonis’ CMO Anthony Deighton, on the role of process improvement in improving digital operations. Koplowitz started with some results from a Forrester survey showing that digital transformation is now the primary driver of process improvement initiatives, and the importance of process mining in that transformation. Process mining continues its traditional role in process discovery and conformance checking but also has a role in process enhancement and guidance. Lucky for those of us who focus on process, process is now cool (again).

Unlike just examining analytics for the part of a process that is automated in a BPMS, process mining allows for capturing information from any system and tracing the entire customer journey, across multiple systems and forms of interaction. Process discovery using a process mining tool (like Celonis) lets you take all of that data and create consolidated process models, highlighting the problem areas such as wait states and rework. It’s also a great way to find compliance problems, since you’re looking at how the processes actually work rather than how they were designed to work.
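The core computation behind that discovery step is simpler than it might appear: most discovery algorithms start by building a directly-follows graph from the event log, counting how often one activity immediately follows another within each case. Here’s a naive Java sketch of that step (real tools like Celonis add frequency thresholds, concurrency detection and much more; the field names and log structure are illustrative assumptions):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// One row of an event log: which case, which activity, when.
record Event(String caseId, String activity, long timestamp) {}

public class DirectlyFollows {
    // Count each directly-follows pair "a -> b" across all cases.
    public static Map<String, Integer> discover(List<Event> log) {
        // Group events into traces, one per case.
        Map<String, List<Event>> traces = new HashMap<>();
        for (Event e : log) {
            traces.computeIfAbsent(e.caseId(), k -> new ArrayList<>()).add(e);
        }
        Map<String, Integer> dfg = new HashMap<>();
        for (List<Event> trace : traces.values()) {
            trace.sort(Comparator.comparingLong(Event::timestamp));
            for (int i = 0; i < trace.size() - 1; i++) {
                String edge = trace.get(i).activity()
                        + " -> " + trace.get(i + 1).activity();
                dfg.merge(edge, 1, Integer::sum);
            }
        }
        return dfg;
    }
}
```

The high-frequency edges become the consolidated process model, while the low-frequency deviations surface the rework loops and compliance exceptions mentioned above.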

Koplowitz had some interesting insights and advice in his presentation, not the least of which was to engage business experts to drive change and automation, not just technologists, and use process analytics (including process mining) as a guide to where problems lie and what should/could be automated. He showed how process mining fits into the bigger scope of process improvement, contributing to the discovery and analysis stages that are a necessary precursor to reengineering and automation.

Good discussion on the webinar, and there will probably be a replay available if you head to the landing page.