bpmNEXT 2019 keynote: @JimSinur on technology combinations that digitally deliver

Our second keynote on the first day of bpmNEXT 2019 is with long-time presenter Jim Sinur, looking at technology combinations that digitally deliver. Unlike his usual focus on future directions, he’s driving down into what technologies work for companies that are undergoing digital transformation. This is a great lead-in to what I’ll be talking about tomorrow morning, and I fully expect to be fine-tuning my presentation before then to incorporate ideas from Jim’s presentation as well as Nathaniel Palmer’s presentation that preceded it.

Digital business platforms – something bigger than a BPMS – provide the real pathway to digital transformation, combining a variety of technologies. The traditional BPMS products are strong in work/process management, but they also need proactive intelligence, integration, automation, IoT enablement and business functionality. He looked at technical streams and their benefits, ranging from computational technologies to consumer delivery channels, and showed a draft of a matrix that he’s working on that rates these technologies on attributes such as the skill level required to get started and the likelihood that vendors in one category will partner successfully with vendors in another. This leads to a list of the top productive pairs and triplets that we’re seeing in the market today: BPM and AI, for example, for processes with smart resources and actions; or architecture, low code and RPA for incremental transformation of legacy.

He finished up with how we will be leveraging the trends for marketplace collaboration between vendor products, and encouraged the vendors in the room (mostly everybody) to collaborate along the lines of his top pairs and triplets. In my opinion, this won’t necessarily be the vendors deciding to partner to offer joint solutions, but larger enterprises deciding to roll their own platforms using a combination of best-of-breed technologies that they select themselves: the vendors will need to make sure that their products can be sliced, diced and re-integrated in the way that the customers want.

Slide decks and videos of all presentations will be online within a day or two; I’ll come back and update all of the posts with links then.

Kicking off bpmNEXT 2019 with @NathanielPalmer

Except for a hiatus in 2017, I’ve been at every bpmNEXT since its inception in 2013, created and hosted by Bruce Silver and Nathaniel Palmer as a showcase for new ideas in BPM and related technologies. This is not a conference for (potential) customers, but a place for vendors, researchers and analysts to come together to exchange ideas about what’s happening in the marketplace and the technology labs. Most of the agenda is made up of 30-minute demo sessions with a few panels and keynotes sprinkled in.

Nathaniel Palmer started our first day with a look forward at the next five years of BPM by considering the five-year span from 2015 to 2020 and how the predictions from his first keynote are playing out. In 2015, he talked about intelligent automation; today, we’re seeing robots and rules-based automation as an integral part of how business is done. This is pretty crucial, because the average number of systems required to present a complete view of a customer is 13.2 (!), 8 of which are external, with 80% of firms stating that they use more than 10 systems to get that 360-degree view. He talks about the need for an intelligent automation platform that includes robotic automation, AI and machine learning, decision management, and process management, communicating with events and data via an event gateway/bus. He believes that the role of a BPMS is also to provide the framework for development and to build the user interface – an idea that I’ll be debating somewhat in my keynote tomorrow – but sees always-on, context-driven devices such as smart speakers as the future of how we interact with systems rather than traditional computers and smartphones. That means that conversational interaction will take over from worklist metaphors for common processes for consumers and employees; my interpretation of this is that the task-focused activities are those that will be automated, leaving the more fluid activities for people to deal with.
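
To make that platform picture a bit more concrete, here’s a minimal sketch of the event gateway/bus idea: business events come in, and each capability (process, decisions, machine learning) subscribes to the ones it cares about. This is purely my own illustration with hypothetical event types and handlers, not anything shown in the keynote.

```python
# Minimal, hypothetical sketch of an event gateway routing business events to
# the capabilities of an intelligent automation platform. Event types and
# handler names are illustrative only.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class BusinessEvent:
    event_type: str   # e.g. "claim.received"
    payload: dict


class EventGateway:
    """Routes each event to every capability subscribed to its type."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[BusinessEvent], None]]] = {}

    def subscribe(self, event_type: str, handler: Callable[[BusinessEvent], None]) -> None:
        self._subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event: BusinessEvent) -> None:
        for handler in self._subscribers.get(event.event_type, []):
            handler(event)


# Stand-ins for the real services that would sit behind the gateway.
def start_process(event: BusinessEvent) -> None:
    print(f"BPMS: starting a case for {event.event_type}")

def evaluate_rules(event: BusinessEvent) -> None:
    print(f"Decision service: evaluating routing rules for {event.event_type}")

def score_risk(event: BusinessEvent) -> None:
    print(f"ML service: scoring risk on {event.payload}")


gateway = EventGateway()
gateway.subscribe("claim.received", start_process)
gateway.subscribe("claim.received", evaluate_rules)
gateway.subscribe("claim.received", score_risk)
gateway.publish(BusinessEvent("claim.received", {"amount": 12000}))
```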

A consideration of this changing nature of automation is how to model this. Our traditional workflows have a pre-defined path, whereas intelligent automation (with more of a case management/ad hoc paradigm) has more adaptable processes driven by rules and business context. It’s more like using Waze for dynamically-adjusted driving directions rather than a pre-conceived idea of what route to follow. The danger with this – in my experience with Waze and adaptable business processes – is that you could end up on a route that is not generally followed, messes up the people who have to get involved along the route, and definitely isn’t repeatable or scalable: better for that specific instance and its participants, but possibly detrimental to others. The potential gain is, of course, that the process as a whole is more resilient because it responds to events by determining an action that will reach the goal, and you may just find a new and better way of doing something. Respond to events, definitely, but at some point take a step back and consider the impact of the new pathways that you’re carving out.
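
To make the Waze analogy concrete, here’s a tiny sketch (my own illustration, nothing from the keynote) contrasting a path fixed at design time with a next step chosen at run time from rules and business context:

```python
# Illustrative only: a fixed workflow path versus a context-driven one.

# Traditional workflow: the route is decided at design time.
FIXED_PATH = ["capture", "verify", "approve", "fulfil"]

def next_step_fixed(current: str) -> str:
    i = FIXED_PATH.index(current)
    return FIXED_PATH[i + 1] if i + 1 < len(FIXED_PATH) else "done"

# Adaptive case: the next step is chosen at run time from rules and context,
# much like Waze re-routing around current traffic conditions.
def next_step_adaptive(current: str, context: dict) -> str:
    if context.get("fraud_score", 0) > 0.8:
        return "manual_review"            # detour triggered by an event
    if current == "verify" and context.get("documents_complete"):
        return "approve"                  # skip ahead when the goal allows it
    return next_step_fixed(current)       # otherwise follow the usual route

print(next_step_fixed("verify"))                            # approve
print(next_step_adaptive("verify", {"fraud_score": 0.9}))   # manual_review
```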

He spoke about problems with AI/ML and training data biases – robots are only as smart as your training data – and highlighted that BPM platforms are a great source of training data via process mining and analysis.

Insightful as always, and it will be interesting to see these themes play out in the demos over the next three days.

2019 @Alfresco Day: RBC Capital Markets

Yesterday at the analyst day, Alfresco CEO Bernadette Nixon had a fireside chat with Jim Williams of RBC about their Alfresco journey, and today at the user conference, Williams gave us more of the details of what they’re doing. They had an aging platform (built on Pega) that wasn’t able to support their derivatives business operations adequately, having been designed for a single purpose without the ability to easily change, resulting in many manual processes.

They wanted to have a single BPM and ECM platform that would span all of their business areas for handling regulatory documentation, and they started in 2015 with their equities operations: not because it was easy, low-hanging fruit, but because it was complex and essential to get it right. They now have 14 applications built on the same framework, and 3,500+ users. Williams said that they specifically liked Alfresco because it doesn’t try to be everything but integrates with other products and services to do functions such as reporting or OCR; this is particularly interesting in the face of other vendor platforms that want to be everything to everyone, and don’t do some of the functions very well.

By 2016, they had rolled out applications in tax operations, driven by changing IRS rules that required foreign banks like RBC to withhold tax on US investments unless clients could prove that they met non-resident requirements; this had to integrate with many of the other operational processes that followed. They also implemented content and process applications for HR due to some of their complex job role management in the UK, reducing dependency on spreadsheets and email for what are essentially core processes.

Like all of the very conservative Canadian financial institutions, their Alfresco implementation is all on premise rather than cloud, although they have cloud ambitions. It’s also important to note that although RBC is Canada’s largest bank, Capital Markets is a relatively small part of it; it will be interesting to see if Williams can carry the Alfresco message to other parts of the organization.

2019 @Alfresco Day: Go To Market Strategy

Jennifer Smith, Alfresco’s CMO, gave us an expansion of the GTM strategy that Bernadette Nixon spoke about earlier today.

Their offering is a single cloud-native platform combining content, process and governance services, on which they identify three pillars of their horizontal platform approach:

  • Modernization and migration, providing tools for migrating to Alfresco quickly and with minimal risk
  • Coexistence and integration, allowing for easy integration with third-party services and legacy systems
  • Cloud-native and AWS-first, with deep integration and support for AWS cloud platform, storage and AI/ML services

Their vertical use case approach is based on a typical land-and-expand strategy: they take an existing implementation with a customer and find other use cases within that organization to leverage the platform benefits, then work with a large enterprise or partner to develop managed vertical solutions.

We saw a demo of a citizen services scenario: to paraphrase, a government agency has old, siloed systems and bad processes, but citizens want to interact with that agency in the same way that they interact with other services such as their bank. In a modernized passport application example, the process would include document upload directly by the citizen, intelligent classification and extraction from the documents, fraud detection by integration with other data sources, natural language translation to communicate with foreign agencies, and tasks for manual review. Although the process and content bits are handled natively by Alfresco, much of the intelligence is based on Amazon services such as Comprehend and Textract — Alfresco’s partnership with Amazon and its AWS-native platform make this a natural fit.
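
For those curious what the intelligent steps look like at the API level, here’s a rough sketch using documented AWS SDK (boto3) calls for Textract, Comprehend and Translate. The orchestration, thresholds and review decision are my own illustration rather than Alfresco’s actual implementation, and it assumes AWS credentials and a default region are already configured.

```python
# Rough sketch of the intelligent steps in the passport-application example,
# using documented boto3 calls. Flow, thresholds and the manual-review
# hand-off are illustrative, not Alfresco's code.
import boto3

textract = boto3.client("textract")
comprehend = boto3.client("comprehend")
translate = boto3.client("translate")

def process_uploaded_document(image_bytes: bytes) -> dict:
    # 1. Extract text from the citizen's uploaded document (Amazon Textract).
    ocr = textract.detect_document_text(Document={"Bytes": image_bytes})
    text = " ".join(b["Text"] for b in ocr["Blocks"] if b["BlockType"] == "LINE")

    # 2. Pull out names, dates and locations for classification and
    #    fraud checks against other data sources (Amazon Comprehend).
    entities = comprehend.detect_entities(Text=text, LanguageCode="en")["Entities"]

    # 3. Translate an outgoing notice for a foreign agency (Amazon Translate).
    notice_fr = translate.translate_text(
        Text="Passport application received and under review.",
        SourceLanguageCode="en",
        TargetLanguageCode="fr",
    )["TranslatedText"]

    # 4. Decide whether a person needs to look at it; in the demo this would
    #    become a manual review task in the Alfresco process.
    needs_review = any(e["Type"] == "PERSON" and e["Score"] < 0.9 for e in entities)
    return {"text": text, "entities": entities,
            "notice_fr": notice_fr, "needs_review": needs_review}
```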

We’re off to some breakouts now, then partner strategy this afternoon, so it might be quiet here until tomorrow.

2019 @Alfresco Analyst Day: use case with RBC Capital Markets

Jim Williams, Head of Operations and Shared Services Technology for RBC Capital Markets, had an on-stage fireside chat with Bernadette Nixon about what they’ve been doing with Alfresco over the past five years.

The focus of their implementation is back office operations, including trade confirmations, settlement and other transactions, especially with all of the regulatory implications and changes. They started looking at this in 2015 for a specific use case (equity trade confirmations) when they had no cohesive platform and many manual processes, and now have several different applications on Alfresco technology. Their transactions tend to be more complex than simple financial transactions, so they have specific concerns with integrating multiple sources of information and applying multiple business rules regarding regulations and compliance. They were an early customer for the Application Development Framework (ADF), and it has allowed them to build apps more quickly due to shared components such as single sign-on. They’re now replacing some of their 10-year-old legacy processes that were initially on Pega, providing more agility in the deployed processes.

He shared some great feedback from the actual users of the applications on their experience and the benefits that they’re seeing, which included the usual operational hot buttons of simplification, cost reduction, productivity increase, reduced risk and scalability, plus innovation and transformation. He joked that they’ve reduced their organizational dependency on Excel, but that’s a very real measure: when I work with enterprise customers on improving processes, I always look for the “spreadsheet and email” processes that we need to replace.

They explored RPA technology but came to the inevitable conclusion that it was just a stopgap: it can make a bad process work a bit faster or better, but it doesn’t fundamentally make it a good process. This was an interesting comment following on a side conversation that I had with Nixon at the break about how Lean Six Sigma initiatives — still all the rage in many financial organizations — are more about incremental improvement than transformation.

Happy to see a process-centric use case taking top billing here: I may need to reassess my earlier statement that Alfresco sometimes forgets about process. 🙂

2019 @Alfresco Analyst Day: update and strategy with @bvnixon

Bernadette Nixon, who assumed the role of CEO after Alfresco’s acquisition last year, opened the analyst day with the company strategy. They seem to be taking a shot at several of their competitors by pushing the idea that they’re one platform, built from the ground up as a single integrated platform rather than being a “Frankenplatform” pieced together from acquisitions. Arguably, Activiti grew up inside Alfresco as quite a separate project from the content side and I’m not sure it’s really as integrated as the other bits, but Alfresco sometimes forgets that content isn’t everything.

Nixon walked through what’s happened in the past year, starting with some of their customer success stories — wins against mainstream competitors, fast implementations and happy customers — and how they’ve added 126 new customer logos in the past year while maintaining a high customer renewal rate. They’ve maintained a good growth rate, and moved to profitability in order to invest back into the company for customer success, developing their teams, brand refresh, engineering and more. They’ve added many of the big SIs as new partners and are obviously working with the partner channel for success, since they’ve doubled their partner win rate. They’ve added five new products, including their Application Development Framework which is the core for some of the other products as well as the cornerstone of partner and customer success for fast implementation.

They commissioned a study that showed that most organizations want to be deployed in the cloud, have better control over their processes, and be able to create applications faster (wait…they paid for that advice?); more interestingly, they found that 35% of enterprises want to switch out their BPM and ECM platforms in the next few years, providing a huge opportunity for Alfresco and other disruptive vendors.

Alfresco is addressing the basic strategy of a horizontal platform approach versus a use case vertical approach: are they a platform vendor or an application vendor? Their product strategy is betting on their Alfresco Digital Business Platform targeted at the technical buyer, but also developing a go-to-market approach that highlights use cases primarily in government and insurance for the business/operational buyer. They don’t have off-the-shelf apps — that’s for their partners or their customers to develop — but will continue to present use cases that resonate with their target market of financial services, insurance, government and manufacturing.

A good start to the day — I’ll be here all day at the analyst conference, then staying on tomorrow for the user conference.

Shifting (back) from buy to build for digital automation

One of the advantages of being in the software industry for a long time is that I can watch trends come and go. Some of them many times. Take the buy versus build argument: is it better for an organization to build a system (for its own use) using best-of-breed components, or buy a best-in-class monolithic system from a single vendor? As with all things software, the answer is “it depends”: it depends on how well the company’s needs are accommodated by a single-vendor solution, and how much the company’s needs are expected to change on an ongoing basis.

Almost every end-customer organization that I talk to now, either in my consulting practice or through industry contacts, is deploying an in-house digital automation platform that allows them to quickly assemble capabilities into new business applications. Since business applications tend to be process- and case-centric, organizations have often ended up with a BPMS (or what Gartner might call an iBPMS) as a single-vendor solution for the core of their digital automation platform, although ERP and CRM platforms such as SAP and Salesforce are also making a play in this space.

BPMS — once (more or less) single-purpose systems for modeling and managing processes — have, Borg-like, assimilated so many other technologies and capabilities that they have become the monolith. If you sign up for their process management capabilities, you may also get decision management, analytics, event handling, user experience, social media and many other capabilities in the same box. This is what allows BPMS vendors to market their products as complete digital automation platforms, requiring only a bit of wiring to connect up with line-of-business systems and data.

If there’s one constant in how organizations work, it’s that they will outgrow their systems as their business environment (constantly) changes. And that’s exactly the problem with any monolithic system: there will be certain capabilities that no longer meet your changing needs, or a disruptive new vendor or product that could replace specific capabilities with something transformative. Without the ability to decouple the components of the monolith, you may be stuck using the unwanted capabilities; at the very least, you’ll still be paying maintenance and upgrades for the entire suite rather than just the parts that you’re using.

The result of all this is that I’m seeing organizations starting to build their digital automation platforms with much more granular components, and BPMS vendors offering their products with a granularity to match. It’s this pattern that I’ll be talking about in my bpmNEXT keynote in Santa Barbara on April 16, “Best of Breed: Rolling Your Own Digital Automation Platform using BPMS and Microservices”. Hope to see you there.
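
As a taste of what rolling your own looks like, here’s a hypothetical sketch of a thin orchestration layer calling separately-selected decision, content and process services over REST. The endpoint URLs are placeholders, not any particular vendor’s API.

```python
# Hypothetical composition of best-of-breed services behind one thin facade.
# Endpoint URLs are placeholders, not any specific vendor's API.
import requests

PROCESS_SVC = "https://process.internal.example.com/api"
DECISION_SVC = "https://decisions.internal.example.com/api"
CONTENT_SVC = "https://content.internal.example.com/api"

def open_claim(claim: dict, attachment: bytes) -> str:
    # Evaluate routing rules in a stand-alone decision service...
    route = requests.post(f"{DECISION_SVC}/evaluate/claim-routing", json=claim).json()

    # ...store the supporting document in a separate content service...
    doc = requests.post(f"{CONTENT_SVC}/documents",
                        files={"file": ("claim.pdf", attachment)}).json()

    # ...then start the process in whichever BPMS was chosen for this capability.
    instance = requests.post(f"{PROCESS_SVC}/process-instances", json={
        "definitionKey": "claim-handling",
        "variables": {"route": route["decision"], "documentId": doc["id"]},
    }).json()
    return instance["id"]
```

The point is the seams: because each capability sits behind its own interface, any one of them can be swapped for a disruptive newcomer without touching the rest.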

Webinar: Unlocking Back Office Value by Automating Processes

I’ve been quiet here for a while – the result of having too much real work, I suppose 😉 – but wanted to highlight a webinar that I’ll be doing on December 13th with TrackVia and one of their customers, First Guaranty Mortgage Corporation, on automating back office processes:

With between 300 to 800 back-office processes to monitor and manage, it’s no wonder financial services leaders look to automate error-prone manual processes. Yet, IT resources are scarce and reserved for only the most strategic projects. Join Sandy Kemsley, industry analyst, Pete Khanna, CEO of TrackVia, and Sarah Batangan, COO of First Guaranty Mortgage Corporation, for an interactive discussion about how financial services are digitizing the back-office to unlock great economic value — with little to no IT resources.

During this webinar, you’ll learn about:

  • Identifying business-critical processes that need to be faster
  • Key requirements for automating back office processes
  • Role of low-code workflow solutions in automating processes
  • Results achieved by automating back office processes

I had a great discussion with Pete Khanna, CEO of TrackVia, while sitting on a panel with him back in January at OPEX Week, and we’ve been planning to do this webinar ever since then. The idea is that this is more of a conversational format: I’ll do a bit of context-setting up front, then it will become more of a free-flowing discussion between Sarah Batangan (COO of First Guaranty), Pete and myself based around the topics shown above.

You can register for the webinar here.

AI and BPM: my article for @Bonitasoft on making processes more intelligent

Part of my work as an industry analyst is to write papers and articles (and present webinars), sponsored by vendors, on topics that will be of interest to their clients as well as a broader audience. I typically don’t talk about the sponsor’s products or give them any sort of promotion; it’s intended to be educational thought leadership that will help their clients and prospects to understand the complex technology environment that we work in.

I’ve recently written an article on AI and BPM for Bonitasoft that started from a discussion we had after I contributed articles on adding intelligent technologies to process management to a couple of books, as well as writing here on my blog and giving a few presentations on the topic. From the intro of the article:

In 2016, I was asked to contribute to the Workflow Management Coalition’s book “Best Practices for Knowledge Workers.” My section, “Beyond Checklists”, called for more intelligent adaptive case management to drive innovation while maintaining operational efficiency. The next year, they published “Intelligent Adaptability,” and I contributed a section called “Machine Intelligence and Automation in ACM [Adaptive Case Management] and BPM” that carried these ideas further. Another year on, it’s time to take a look at how the crossover between BPM and artificial intelligence (AI) — indeed, between BPM and a wide range of intelligent technologies — is progressing.

I go on to cover the specific technologies involved and what types of business innovation we can expect from more intelligent processes. You can read the entire article on Bonita’s website, on their LinkedIn feed and their Medium channel. If you prefer to read it in French, it’s also on the Decideo.fr industry news site, and apparently there’s a Spanish version in the works too.

Integrating process and content for digital transformation: my upcoming webinar

As much as I love chatting with the newer crop of entrepreneurs about their products and ideas, sometimes it’s nice to have a conversation with someone who remembers when OS/2 was the cheapest way to buy 3.5” high density disks. You know who’s been hip-deep in the technology of content and process as long as I have? John Newton, founder and CTO of Alfresco, that’s who. John started Documentum back in 1990, around the time that I was selling off my imaging/workflow product startup and starting my services company, and while he’s stayed on the product side and I’ve stayed on the services/industry analyst side (except for a brief period as FileNet’s BPM evangelist), we’re both focused on how this technology helps companies in their digital transformation journey.

John and I will get together on a webinar about integrating process and content on July 24, sponsored by Alfresco, which will combine structured content with a free-ranging conversation. We’re planning to talk about use cases for applications that integrate process and content, some best practices for designing these applications, and overall architectural considerations for process/content applications including cloud and microservices. Add a comment here or on Twitter if there’s something in particular that you’d like us to discuss, and we’ll see if we can work it in.

I wrote a blog post for Alfresco a couple of months ago on use cases for content in process applications, stressing the importance of integrating process and content rather than leaving them as siloed applications; in general, this is what I’ve seen over the years in my practice as a systems architect and consultant helping organizations to get their content digitized and their processes automated. If you have digital content that’s locked up without any way to take actions on it, or automated processes that still require manual lookups of related content, then you should be thinking about how to integrate process and content. Tune in to our webinar for pointers from a couple of industry gray-hairs.
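
To illustrate the two integration directions that we’ll be discussing, here’s a small hypothetical sketch (placeholder endpoints, not Alfresco’s actual API): content events that start processes, and process tasks that fetch their related content automatically rather than forcing a manual lookup.

```python
# Hypothetical sketch of the two integration directions discussed above.
# Endpoints are placeholders, not Alfresco's actual API.
import requests

CONTENT_API = "https://ecm.example.com/api"
PROCESS_API = "https://bpm.example.com/api"

def on_document_uploaded(doc_id: str, doc_type: str) -> None:
    """Content drives process: a newly filed document starts the right
    process instead of sitting in a repository waiting to be noticed."""
    requests.post(f"{PROCESS_API}/process-instances", json={
        "definitionKey": f"handle-{doc_type}",
        "variables": {"documentId": doc_id},
    })

def documents_for_task(case_id: str) -> list:
    """Process reaches into content: a task fetches its related documents
    automatically rather than asking the worker to go search for them."""
    return requests.get(f"{CONTENT_API}/documents",
                        params={"caseId": case_id}).json()
```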