OpenText Analyst Summit 2020 day 2: Digital Accelerants

Although technically a product breakout, the session on OpenText’s Digital Accelerants product collection was presented to the entire audience as our last full-audience session before the afternoon breakouts. This was split into three sections: cloud, AI and analytics, and process automation.

Jon Schupp, VP of Cloud GTM, spoke about how information is transforming the world: not just cloud, but a number of other technologies, a changing workforce, growing customer expectations and privacy concerns. Cloud, however, is the destination for innovation. Moving to cloud allows enterprise customers to take advantage of the latest product features, guaranteed availability, global reach and scalability while reducing their operational IT footprint. OpenText provides a number of different deployment platforms: “off-cloud” (aka on-premise), public cloud, private cloud, managed services, and SaaS.

Dave Moyers and Paul O’Hagan were up next to talk about AI and analytics, and how they are addressing data variety, ease of use, embedding AI/ML in processes, and deploying anywhere that it’s required. Their AI and analytics capabilities are provided by the Magellan product, and have been integrated with other OpenText products as well as built into vertical solutions. Magellan has been integrated into the ECM products with AI-augmented capture and the AI-powered “magic folder” auto-categorization and filing; into the Business Network products with asset performance optimization and predictive maintenance; into AppWorks by instantiating processes based on insights; and several other integrations. They also have some new standard features for identifying PII (personally identifiable information), which is crucial for compliance and privacy. In addition to the analysis capabilities, there is a wide range of dashboard and visualization options, and full-fledged ETL for connecting to enterprise and third-party data sources and organizing data flows. We also saw some examples yesterday of using Magellan for e-discovery and sentiment analysis. Interestingly, this is one of the product portfolios where they’ve taken advantage of integrating with open source tools to extend the core products.
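To give a sense of what PII identification involves at the simplest level, here’s a toy sketch in Python. To be clear, this is not how Magellan works — products like it use trained NLP models and far broader pattern coverage — and all of the patterns and names below are invented for illustration:

```python
import re

# Toy patterns for a few common PII types; a real product uses trained
# models and far broader coverage -- this is only an illustration.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def find_pii(text):
    """Return a list of (pii_type, matched_value) pairs found in the text."""
    hits = []
    for pii_type, pattern in PII_PATTERNS.items():
        hits.extend((pii_type, m) for m in pattern.findall(text))
    return hits

def redact(text):
    """Replace each detected PII value with a [TYPE] placeholder."""
    for pii_type, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{pii_type.upper()}]", text)
    return text

sample = "Contact Jane at jane.doe@example.com or 555-867-5309."
print(redact(sample))
```

Even this trivial version shows why PII features pair naturally with redaction and compliance reporting: once a value is classified, masking or flagging it is the easy part.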

Saving the best for last (okay, maybe that’s just my bias), Lori McKellar and Nick King presented on business process automation. This is not just about back-office automation, but includes customer-facing processes, IoT and other more complex intelligent processes. AppWorks, which includes the process automation capabilities, is an application development environment for use by semi-technical citizen developers (low-code) as well as professional developers (pro-code). We saw the all-new developer experience last year, and now they’ve had a chance to incorporate actual customer usage to fine-tune both the developer and end-user AppWorks experience. One significant change is that as their customers start to build larger apps, they now allow more granular access to the entities under development, so that multiple developers can work on the same application simultaneously without collisions. They’ve added some new UI capabilities, such as a card view option and an optimized tablet view. Integration with Documentum has been improved for easier document check-in and repository access. Privacy features, including dynamic instance-level permissions and document redaction, are now available in AppWorks. In the upcoming 20.2 version, they’ll be adding an RPA connector framework, then expanding the RPA integrations in 20.3.

The session finished with a Q&A with all of the participants, including discussions on RPA connectors, operationalization of machine learning, hybrid cloud models, the role of unstructured content in AI training, and natural language processing.

This afternoon, I’ll be attending the breakout session on content services, so stay tuned for those final notes from the OpenText Analyst Summit 2020.

OpenText Analyst Summit 2020 day 2: sales, finance and operations

We started the second day of the OpenText analyst summit with EVP of sales Ted Harrison outlining their sales value proposition, both through their direct sales force and their partner channel. Customers tend to start with one OpenText product, but often expand to additional product lines to create more of a strategic partnership. OpenText is a prolific user of their own technology, providing a good template for some of their large customers in how their products can be used throughout an organization. With the growth in their cloud platform, they expect cloud to be their largest business in FY21. Harrison finished his presentation with a few customer case studies: Pacific Life doing a huge migration to OpenText Cloud, JPMorgan Chase using AI for automated redaction, and Google using TeamSite for their partner portal.

James McGourlay, EVP of customer operations, covered their support, professional services and customer experience teams. They’ve done more than 40,000 engagements, which has created a depth of knowledge in successful deployment of their products. To fuel the move to the OpenText Cloud, professional services is helping customers with “Cloudification” strategy, migration, integration and adoption. McGourlay spoke about their commitment to data sovereignty, especially for European customers that have strict location regulations for certain data types. They perform customer satisfaction surveys for both professional services and technical support, with the goal of constantly improving their approval rating — currently at 96.4% for their technical support, for example, which he considers “not good enough”.

The last presentation in this session was CFO Madhu Ranganathan with a business and financial update. You can get more of the investor and financial details on their website (or read something written by one of the many blue-suited financial analysts in the audience), and she provided a summary of that publicly-available information: profitable and cash flow-positive, 25+ years of solid performance, and a proven M&A track record which is their dominant growth strategy. They have specific criteria for acquisitions: market leadership, value for OpenText’s customers, mission-critical capabilities, financially compelling, larger customer base, and longer operating history. Ranganathan showed a timeline of successful acquisitions; interestingly, none of the three BPM/workflow buys (Global360 and Metastorm in 2011, Cordys in 2013) were mentioned. It’s probably fair to say that workflow is not a primary product category for OpenText; it’s really just functionality within their AppWorks application development platform, most often used in content-centric applications such as document lifecycle and case management. In summary, OpenText is solid financially, and has cash in the bank to leverage more acquisitions as part of their growth strategy.

OpenText Analyst Summit 2020 day 1: technology strategy

After the break, OpenText EVP and Chief Product Officer Muhi Majzoub took the stage at the analyst summit to talk about innovation within their products, a strategic projects update and a bit of a roadmap. They have innovation that comes from customer requirements as well as their own drivers, but they also have a lot to do in order to integrate new acquisitions.

He stressed that Documentum and Content Suite are both being maintained, with innovation (e.g., UI, Core Share integration) being applied to both product lines; although this is probably a great relief for customers of either product, I can’t believe that this will go on forever. This is the real challenge for OpenText going forward: how to consolidate some of their overlapping/competing acquisitions without alienating customers, especially in the content space where information is persistent for a long time. Branding everything as “Cloud Edition” doesn’t fix the problem, it just obscures it.

Majzoub spoke about their four strategic projects:

  • Cloud Edition (CE) is their cloud-native platform for running all manner of solutions and applications, which runs on a variety of cloud vendor platforms (OpenText, Google, AWS, Azure, Anthos) and includes containerized deployment models.
  • OT2 is their cloud-native application development platform, including 231 of their own services and pre-built SaaS applications. This includes their Core services and applications, such as content and workflow services and many others.
  • Business Network, which includes a range of B2B services from fax to EDI, plus a huge directory of global trading partners that can be linked via OpenText’s platform.
  • Carbonite integration, which brings some new capabilities in cybersecurity, as well as SMB/consumer content management.

He finished with five new innovations to watch for from OpenText, including new features in Documentum, IoT connected supply chain, Exstream, Magellan, and EnCase.

Craig Stilwell, formerly Chief Revenue Officer at Carbonite and now OpenText’s EVP and GM of the SMB and Consumer division, was on next to highlight some (more) of what Carbonite is bringing to OpenText. This acquisition is obviously energizing a lot of people, since we’ve heard about it in every presentation so far today. Carbonite, through their original product and their acquisition of Webroot last year, covers two of the main concerns of many SMBs: backup/disaster recovery, and endpoint protection. Unlike large organizations that own their own data centers, SMBs (and consumers) were much earlier adopters of cloud storage and computation, and therefore some of the early victims of downtime due to disaster or cyber attack.

We finished the day with SVP of product management Stephen Ludlow and demos by his four VPs of product management, each of which is responsible for a different product category. With their broad range of products, they obviously didn’t cover everything, but each showed an interesting capability with a large potential business impact:

  • Marc Diefenbruch demonstrated the intelligent folder in the content suite, which is AI-powered content classification and filing.
  • Dawn Andre demonstrated the identification and connection with potential trading partners based on multiple qualification criteria in the Trading Grid global partner directory.
  • Guy Hellier demonstrated personalized omnichannel communications with Exstream for improving customer satisfaction, using customer data sources and Magellan speech analytics tied together with Core Experience Insights.
  • Michael Cybala demonstrated secure cloud sharing, collaboration and signing of documents using Core Share and Core Signature working with Content Server.

That’s it for our first (half) day at the 2020 OpenText analyst summit. We’ll be back tomorrow for another full day.

OpenText Analyst Summit 2020 day 1: corporate strategy and customer spotlight

I’m in Boston for the next two days for OpenText’s annual analyst summit; Patty Nagle, CMO, kicked things off in the first session, then we had a keynote from CEO/CTO Mark Barrenechea. They’re coming up on 30 years in existence, which is definitely a milestone for any technology company, and they’ve grown to 15,000 employees in over 30 countries, in part through their pattern of growth through acquisition. They sell through a large direct salesforce, as well as through their 27,000 partners and directly from their website.

The latest acquisition is Carbonite, which seems a pretty good fit with their cloud/edge content strategy, and Barrenechea discussed where Carbonite fits into their strategy in some detail: decentralized computing, small/medium business and consumer audience, and cyber-resilience. OpenText has promoted the term enterprise information management (EIM) in the past, and is now dropping the “E” to be just information management as they enter the smaller end of the market.

They are following the lead of smaller (arguably more nimble) vendors with a move to quarterly product releases for their core content management, and their product versioning will reflect that with a YY.Q version number (e.g., 20.2). Their release 16 will become Cloud Edition 20.2 with the April release, with OT2 and Business Network following the same version numbering. The push to the cloud continues, and if you go to their website now, you’ll see a link to their cloud logins. I’m not sure that having quite so many different logins is a good thing, but I get that there are different audiences for this.

He also covered their business network and cyber resilience offerings, which are a bit peripheral to my interests; then on to their digital accelerants, which is a mixed bag of capabilities including low-code development, AI, IoT, process automation and analytics. They showed a demo of Magellan analytics visualizing World Health Organization data on COVID-19 — a timely example — showing the trends of the disease spread in human healthcare terms, but also the impact on business and markets.

Their key corporate priorities include maintaining market leadership in information management, with expansion to all sizes of customers; continued move to the cloud; and becoming more of an applications company. I’ve seen a few horizontal technology vendors fail spectacularly on building applications, so it will be interesting to see what they can accomplish there.

We heard briefly about BrightCloud Threat Intelligence, part of the Carbonite acquisition, and saw a demonstration of the Webroot BrightCloud Threat Investigator. Webroot was acquired by Carbonite less than a year ago, and the branding didn’t even have time to change to Carbonite before becoming part of OpenText. OpenText plans to integrate this into their other offerings to provide better security for content and access to third-party sites and services.

Barrenechea ended with a call to arms to address climate change, ethical supply chains, overuse of plastics and other issues threatening society at large. Not what you usually hear from a technology CEO, but they are pushing a brand of “technology for the good”.

Ted Harrison, EVP of sales, finished the session by hosting a customer panel featuring Peter Chen of Stericycle, Shyam Pitchaimuthu of Chevron, and Gurreet Sidhu of BMO Financial Group. Stericycle and Chevron are both OpenText content management customers, with broad usage across their organizations and deep integration into other systems and processes. BMO is using the OpenText Trading Grid for B2B payment solutions, and appreciates the elastic scalability of the platform as business sectors expand and contract. Stericycle and Chevron both moved to cloud content management as part of their cloud-first strategy, with Chevron doing a conversion from on-premise Documentum to Azure. BMO went with OpenText’s managed services to allow them greater customization and security without running the core infrastructure themselves. Good discussion of how they’re using OpenText products, and the transition to their current state.

2020 State of Business Process Management report from @BPTrends

The very first post that I wrote here, back in 2005, was on the BPTrends 2005 BPM Suites Report, which has evolved into their State of Business Process Management report (free, but registration required). Back in 2005, I noted that the vendors included in the report were “pay to play”, whereas by now the report is mostly BPM background/thought leadership information plus the detailed results of surveys with BPM practitioners. Although survey respondents are asked about the tools that they use, and capabilities that they require for future work, the specific vendors are not discussed in any detail. For this report, Paul Harmon of BPTrends worked together with Jorge Garcia of Technology Evaluation Centers (TEC): it appears that Harmon focused on the first section, How Organizations Understand Business Processes, while Garcia covered the second section on Business Process Software Tools. The report was sponsored by Creatio (formerly bpm’online), Signavio and Trisotech, so thanks to them for helping to make this report free to everyone (as an aside, Signavio and Trisotech are both customers of mine, but that’s not why I’m writing this post).

This quote from the executive summary really highlights why processes are such an important part of understanding how businesses are going digital:

BPTrends started tracking the process market in 2005 when BPMS tools first appeared on the scene. In the years since, enthusiasm has driven a wide variety of process initiatives. Underneath it all, however, was the interest in Internet-based tools that could model, track and control major business processes. The tools have gone through a variety of changes and are, today, powerful, widespread and widely used. The initial enthusiasm for new process work has declined a bit and current interest is perhaps better characterized by the term digital transformation, but the underlying impulse – to improve how businesses perform their work – remains.

There’s some good analysis in the report, including trends that they’ve noted from their surveys since 2005. Interestingly, the percentage of companies that see BPM efforts as a major strategic commitment by executive management has actually decreased, and is now at 23%; the biggest current characterization (34%) is companies that are working on a limited number of mid or low-level projects. That’s right, 1/3 of the respondents to the BPTrends survey – who are presumably engaged to some degree in BPM efforts – are using BPM methods/technologies only on lower-level, non-strategic projects. They spin the results in a slightly different way, pointing out that if you combine the “major strategic commitment” and “significant commitment” categories, they account for 50% of the respondents. Their discussion in this section explored the idea that there was a big uptake in process interest in 2007-9, but interest hasn’t really grown since then: in many organizations, a senior manager brings in process management methodologies or tools as a “pet project”, then after they move on, no one takes up the reins to continue the process improvement efforts.

Another interesting result is the major business drivers for business process change (each organization could choose multiple): reducing costs/improving productivity has steadily risen to the current high of 69%, although product innovation and customer satisfaction are also top choices for more than 1/3 of organizations. This really highlights what I see in practice: productivity and cost are table stakes in any process improvement, and although they might not be the front-of-mind reason for many executives, they are expected outcomes.

The second section on BPM tools has quite a bit of information on how the respondents’ companies are using process modeling and analysis tools, but much less on process automation.

I’ve included a couple of their results here as a teaser, but I highly recommend that you head over to BPTrends and grab a (free) copy of the full report.

Focus on Insurance Processes: Product Innovation While Managing Risk and Costs – my upcoming webinar with @Signavio

I know, I just announced a banking webinar with Signavio on February 25; this is another one with an insurance focus on March 10 at 1pm ET. From the webinar description:

With customer churn rates approaching 25% in some insurance sectors, insurers are attempting to increase customer retention by offering innovative products that better address today’s market. The ability to create and support innovative products has become a top-level goal for many insurance company executives, and requires agile and automated end-to-end processes for a personalized customer journey.

Similar to the banking webinar, the focus is on more management-level concerns, and I’ll look at some use cases around insurance product innovation and claims.

Head on over to the landing page to sign up for the webinar. If you’re unable to attend, you’ll receive a link to the replay.

My post on the @Trisotech blog: Designing Processes for Agility

In my continuing series over on the Trisotech blog, I’ve just published a post on issues to consider when designing processes for agility, especially the tradeoffs between what goes in a process model versus a decision model. I’ve spent quite a bit of time thinking about this in the past when working with enterprise customers on their processes, and a conversation after a conference presentation last year really brought some of the ideas into focus. From that post:

Assuming that you’re using model-driven process and decision management systems (BPMN and DMN, respectively) for design and implementation, you might assume that it doesn’t really matter how you design your applications, since either could be quickly changed to accommodate changing business needs. It’s true that model-driven process and decision management systems give you agility in that they can be changed relatively quickly with little or no coding, then tested and redeployed in a matter of hours or days. But your design choices can impact understandability of the models as well as agility of the resulting application, and it’s important to have both.
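To make that tradeoff concrete, here’s a toy sketch of the same routing logic expressed two ways: embedded in the process flow as gateway conditions, and externalized as a DMN-style decision table. The claim-routing rules and thresholds are invented for the example:

```python
# Option 1: decision logic buried in the process flow (gateway conditions).
# Changing a threshold means changing and redeploying the process model.
def route_claim_embedded(amount, customer_tier):
    if amount < 1000:
        return "auto-approve"
    elif amount < 10000 and customer_tier == "gold":
        return "fast-track review"
    else:
        return "manual review"

# Option 2: the same logic externalized as a decision table with a
# first-hit policy; changing a threshold now means editing a table row,
# and the process model just calls the decision.
DECISION_TABLE = [
    # (max_amount, required_tier, outcome)
    (1000,  None,   "auto-approve"),
    (10000, "gold", "fast-track review"),
    (None,  None,   "manual review"),  # default rule
]

def route_claim_table(amount, customer_tier):
    for max_amount, required_tier, outcome in DECISION_TABLE:
        if max_amount is not None and amount >= max_amount:
            continue  # rule doesn't apply; try the next row
        if required_tier is not None and customer_tier != required_tier:
            continue
        return outcome
```

Both versions produce the same results; the difference is where a business change lands, which is exactly the understandability-versus-agility design choice discussed in the post.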

Head on over to their blog to read the entire post.

If you have comments or questions about the post, feel free to engage in the comments on this post, since Trisotech doesn’t allow commenting on their blog.

Camunda Cloud beta goes public

It’s definitely webinar season! I’ve seen a lot of webinar invitations pass by recently, and I’ll be speaking on a couple in the coming weeks. Today, I listened in on a webinar about the Camunda Cloud public beta, with Daniel Meyer (Camunda CTO) discussing their drivers for creating it, and Immanuel Monma providing a demo. I heard about the Camunda Cloud at CamundaCon last September, and it’s good to see that they’re launching it so soon.

Meyer spoke about using cloud-based process automation for modernizing legacy infrastructure, and the requirements that they had for re-inventing process automation for the cloud:

  • Externalize processes from business applications (this isn’t really new, since it’s been a driver for BPM systems all along).
  • Maximize developer productivity by allowing them to work within their programming language of choice.
  • Support hybrid orchestration with both cloud and on-premise applications, and across multiple public cloud platforms.
  • Native BPMN execution.
  • Cloud scalability and resilience.

This is where their Zeebe workflow engine comes in, which is at the core of Camunda Cloud. By supporting hybrid orchestration, Camunda Cloud allows for a gradual migration of legacy on-premise IT by first externalizing the processes, then migrating some of the legacy functionality to cloud-based microservices while still supporting direct contact with the legacy IT, then gradually (if possible) migrating all of the legacy functionality to the cloud. This gains the advantage of both a microservices architecture for modularity and scalability, and process orchestration to knit things together for loose coupling with end-to-end visibility.

The live demo showed the interaction between Zeebe and Operate, the two main execution components of Camunda Cloud, plus Cawemo for collaborative modeling of the processes (although the process could have just been modeled in the Zeebe modeler). Monma walked us through how to create, deploy and execute a simple BPMN process in Camunda Cloud; watching the webinar replay would be a great place to start if you want to play around with the beta. Note that aside from creating the BPMN model in Cawemo, which may involve business people, this is a technical developer toolset for service orchestration and automated processes at this point. You can plug into their Zeebe Slack community or forum to interact with other developers who are trying things out.
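For a sense of what gets deployed in a demo like this, here’s a minimal BPMN XML sketch of a one-step process with a Zeebe service task. The process id and task type are invented for illustration, and I’ve omitted the diagram (BPMNDI) section that modelers normally generate:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL"
                  xmlns:zeebe="http://camunda.org/schema/zeebe/1.0"
                  targetNamespace="http://example.com/demo">
  <bpmn:process id="demo-process" isExecutable="true">
    <bpmn:startEvent id="start"/>
    <bpmn:sequenceFlow id="flow1" sourceRef="start" targetRef="say-hello"/>
    <bpmn:serviceTask id="say-hello" name="Say hello">
      <bpmn:extensionElements>
        <!-- the task type binds this step to a job worker written in any language -->
        <zeebe:taskDefinition type="hello-worker"/>
      </bpmn:extensionElements>
    </bpmn:serviceTask>
    <bpmn:sequenceFlow id="flow2" sourceRef="say-hello" targetRef="end"/>
    <bpmn:endEvent id="end"/>
  </bpmn:process>
</bpmn:definitions>
```

The task type is the hook for the “work in your programming language of choice” requirement: a worker written against any of the Zeebe client libraries subscribes to that type and completes the jobs, while the engine handles state and sequencing.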

Future Camunda Cloud components

Meyer returned with the product roadmap, then handled questions from attendees. Right now, Camunda Cloud is a free public beta although there are some limitations; they will be launching the GA version shortly (he said “hopefully within the next month”) that will allow better control over clusters plus have SLA-based technical support. They are also adding human workflow with a tasklist, providing both an API and a simple out-of-the-box UI, which will also push the addition of the human task type in the Zeebe BPMN coverage. They will be adding analytics via a cloud version of Optimize. The Camunda components run in their cloud, currently hosted on Google Cloud with an automated Kubernetes infrastructure; in the future, they will expand this to run in multiple (geographic) regions to better support applications in different regions. They may consider running on different cloud platforms, although since this is hidden from the Camunda Cloud customers, it may not be necessary. There were a number of other good questions on hybrid orchestration, the use of RPA, and how the underlying event-streaming distributed architecture of Zeebe provides for vastly greater scalability than most BPM systems.

You’ll be able to see the webinar replay (typically without registration) on the webinar information page as soon as they publish it.

Focus on Banking Processes: Improve Revenue, Costs and Compliance – my upcoming webinar with @Signavio

I’ll be presenting on two webinars sponsored by Signavio in the upcoming weeks, starting with one on banking processes on February 25 at 1pm ET. In this first webinar, I’ll be taking a look not just at the operational improvements, but at the (executive) management-level concerns of improving revenue, controlling costs and maintaining compliance. From the webinar description:

Today’s retail banks face more challenges than ever before: in addition to competing with each other, they are competing with fintech startups that provide alternatives to traditional banking products and methods. The concerns in the executive suite continue to focus on revenue, costs and compliance, but those top-level goals are more than just numbers. Revenue is tied closely to customer satisfaction and wallet share, with today’s customers expecting personalized banking products and modern omnichannel experiences.

You can sign up for the webinar here. This will be a concise 35 minutes plus Q&A, and I’ll include some use case examples from client onboarding and KYC in retail banking.

ARIS Elements: the cloud “starter edition” for process design

I decided not to get up at 4am (Eastern time) earlier this week to watch the ARIS Elements launch webinar presented by ARIS senior product manager Tom Thaler, but Software AG has published it here — no registration required — and I took a quick look at it, as well as checking out the ARIS Elements website, which is already live.

Creating a model in ARIS Elements, showing the seven supported model types

As seen in the webinar, model creation allows you to create seven different types of models: process landscape, BPMN process, event-driven process (EPC), organizational chart, system diagram, data model, and structuring model. It does not include DMN or CMMN models; DMN is in ARIS Advanced and Enterprise editions.

Thaler demonstrated creating a BPMN model, which is similar to many of the other cloud-based modelers, although the extent of their BPMN coverage isn’t clear (for some of the more esoteric event types, for example). What they do provide that is unique, however, is analysis-focused information for different steps such as RACI responsibility assignments that link directly to an organizational chart. BPMN models are checked for validity, even though these are probably not expected to be directly-executable models. Once a model is created, it can be previewed and then published (unless the database has been set for auto-publication). In addition to the visual model, the preview/published versions of BPMN models show a summary tabular view of the process steps, with roles, input, output and IT systems for each. The RACI chart is also generated from the values entered in the process model.
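The generated tabular and RACI views are essentially flattenings of the attributes attached to each process step. As a toy sketch of the idea — not ARIS’s actual data model; the step records and field names below are invented — deriving RACI rows from step attributes looks something like this:

```python
# Invented step records standing in for attributes captured on model steps.
steps = [
    {"step": "Receive application", "responsible": "Clerk",
     "accountable": "Team Lead", "consulted": [], "informed": ["Manager"]},
    {"step": "Assess risk", "responsible": "Underwriter",
     "accountable": "Team Lead", "consulted": ["Actuary"], "informed": []},
]

def raci_rows(steps):
    """Flatten step attributes into (step, role, RACI letter) rows."""
    rows = []
    for s in steps:
        rows.append((s["step"], s["responsible"], "R"))
        rows.append((s["step"], s["accountable"], "A"))
        rows += [(s["step"], role, "C") for role in s["consulted"]]
        rows += [(s["step"], role, "I") for role in s["informed"]]
    return rows

for step, role, letter in raci_rows(steps):
    print(f"{step:20s} {role:12s} {letter}")
```

The appeal of generating the chart rather than drawing it is the same as with the tabular process view: the model stays the single source of truth, and the reports can’t drift out of sync with it.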

A process landscape/map model can be created to show groups of processes (in the demo, the top level groups were management, core and supporting processes); these can in turn be nested groups of processes for more complex areas such as HR.

A user can set specific models as favorites, which will then appear on their home page for easy access. There is a hierarchical view of the repository by model type.

There are fairly standard user management features to add new users and assign permissions, although this edition does not provide single sign-on.

There are a number of video tutorials available to show how to create different model types and manage settings, and a free trial if you want to get started quickly.

There were a number of good questions in the Q&A (starting at around 38 minutes into the webinar) that exposed some of the other features and limitations of ARIS Elements. Many of these were obviously from people who are currently ARIS users, and looking to see if Elements fits into their plans:

  • Commenting by viewers is not supported
  • BPMN models can be imported
  • There is only one database (multiple databases to separate business units is a feature in ARIS Advanced/Enterprise)
  • Upgrading to a more expensive version would allow all models that were already created to be migrated
  • There is no automation of model review cycles (or any other workflow control), such as having a model reviewed by one or more others before publication; this would have to be done manually
  • There is no document storage (supporting documents can be stored directly in ARIS Advanced/Enterprise)
  • There is no process comparison (available in higher level versions)
  • Migrating from an ARIS on-premise edition to Elements could result in data loss since not all of the model types and features are supported, and is not recommended
  • There are a small number of pre-defined reports available for immediate use, but no report customization

If you look at the pricing page which also shows a feature comparison chart, you’ll see that ARIS Elements is considered the low-end edition of their cloud process modeling product suite. It’s fairly limited (up to 20 users, one database, other limitations) and is priced at 100€ (about $US110) per designer user per month and 50€ per 10 viewer users; that seems somewhat high, but they offer a broader range of model types than competitive process modeling tools, and include a shared repository for collaborative designing and viewing.

ARIS Elements is being positioned in an interesting space: it’s more than just process modeling, but less than the more complete enterprise architecture modeling that you’ll find in ARIS Advanced/Enterprise and competitive EA modeling products. It’s being targeted at “beginners”, although arguably beginners would not be creating a lot of these model types (although might be viewing them). Possibly they’ve had feedback that the Advanced version is just a bit too complex for many situations, and they are attempting to hit the part of the market that doesn’t need full capabilities; or they are offering Elements as a starting point with the goal to migrate many of these customers onto the Advanced/Enterprise editions as soon as they run up against the limitations.