2019 @Alfresco Analyst Day: use case with RBC Capital Markets

Jim Williams, Head of Operations and Shared Services Technology for RBC Capital Markets, had an on-stage fireside chat with Bernadette Nixon about what they’ve been doing with Alfresco over the past five years.

The focus of their implementation is back office operations, including trade confirmations, settlement and other transactions, especially with all of the regulatory implications and changes. They started looking at this in 2015 for a specific use case (equity trade confirmations) when they had no cohesive platform and many manual processes, and now have several different applications on Alfresco technology. Their transactions tend to be more complex, not just simple financial transactions, so they have specific concerns with integrating multiple sources of information, and multiple business rules regarding regulations and compliance. They were an early customer for the Application Development Framework (ADF), and it has allowed them to build apps more quickly due to shared components such as single sign-on. They’re now replacing some of their 10-year-old legacy processes that were initially on Pega, providing more agility in the deployed processes.

He shared some great feedback from the actual users of the applications on their experience and the benefits that they’re seeing, which included the usual operational hot buttons of simplification, cost reduction, productivity increase, reduced risk and scalability, plus innovation and transformation. He joked that they’ve reduced their organizational dependency on Excel, but that’s a very real measure: when I work with enterprise customers on improving processes, I always look for the “spreadsheet and email” processes that we need to replace.

They explored RPA technology but came to the inevitable conclusion that it was just a stopgap: it can make a bad process work a bit faster or better, but it doesn’t fundamentally make it a good process. This was an interesting comment following on a side conversation that I had with Nixon at the break about how Lean Six Sigma initiatives — still all the rage in many financial organizations — are more about incremental improvement than transformation.

Happy to see a process-centric use case taking top billing here: I may need to reassess my earlier statement that Alfresco sometimes forgets about process. 🙂

2019 @Alfresco Analyst Day: update and strategy with @bvnixon

Bernadette Nixon, who assumed the role of CEO after Alfresco’s acquisition last year, opened the analyst day with the company strategy. They seem to be taking a shot at several of their competitors by pushing the idea that they’re one platform, built from the ground up as a single integrated platform rather than being a “Frankenplatform” pieced together from acquisitions. Arguably, Activiti grew up inside Alfresco as quite a separate project from the content side, and I’m not sure it’s really as integrated as the other bits, but Alfresco sometimes forgets that content isn’t everything.

Nixon walked through what’s happened in the past year, starting with some of their customer success stories — wins against mainstream competitors, fast implementations and happy customers — and how they’ve added 126 new customer logos in the past year while maintaining a high customer renewal rate. They’ve maintained a good growth rate, and moved to profitability in order to invest back into the company for customer success, developing their teams, brand refresh, engineering and more. They’ve added many of the big SIs as new partners and are obviously working with the partner channel for success, since they’ve doubled their partner win rate. They’ve added five new products, including their Application Development Framework which is the core for some of the other products as well as the cornerstone of partner and customer success for fast implementation.

They commissioned a study that showed that most organizations want to be deployed in the cloud, have better control over their processes, and be able to create applications faster (wait…they paid for that advice?); more interestingly, they found that 35% of enterprises want to switch out their BPM and ECM platforms in the next few years, providing a huge opportunity for Alfresco and other disruptive vendors.

Alfresco is addressing the basic strategy of a horizontal platform approach versus a use case vertical approach: are they a platform vendor or an application vendor? Their product strategy is betting on their Alfresco Digital Business Platform targeted at the technical buyer, but also developing a go-to-market approach that highlights use cases primarily in government and insurance for the business/operational buyer. They don’t have off-the-shelf apps — that’s for their partners or their customers to develop — but will continue to present use cases that resonate with their target market of financial services, insurance, government and manufacturing.

A good start to the day — I’ll be here all day at the analyst conference, then staying on tomorrow for the user conference.

Shifting (back) from buy to build for digital automation

One of the advantages of being in the software industry for a long time is that I can watch trends come and go. Some of them many times. Take the buy versus build argument: is it better for an organization to build a system (for its own use) using best-of-breed components, or buy a best-in-class monolithic system from a single vendor? As with all things software, the answer is “it depends”: it depends on how well the company’s needs are accommodated by a single-vendor solution, and how much the company’s needs are expected to change on an ongoing basis.

Almost every end-customer organization that I talk to now, either in my consulting practice or through industry contacts, is deploying an in-house digital automation platform that allows them to quickly assemble capabilities into new business applications. Since business applications tend to be process- and case-centric, organizations have often ended up with a BPMS (or what Gartner might call an iBPMS) as a single-vendor solution for the core of their digital automation platform, although ERP and CRM platforms such as SAP and Salesforce are also making a play in this space.

BPMS — once (more or less) single-purpose systems for modeling and managing processes — have, Borg-like, assimilated so many other technologies and capabilities that they have become the monolith. If you sign up for their process management capabilities, you may also get decision management, analytics, event handling, user experience, social media and many other capabilities in the same box. This is what allows BPMS vendors to market their products as complete digital automation platforms, requiring only a bit of wiring to connect up with line-of-business systems and data.

If there’s one constant in how organizations work, it’s that they will outgrow their systems as their business environment (constantly) changes. And that’s exactly the problem with any monolithic system: there will be certain capabilities that no longer meet your changing needs, or a disruptive new vendor or product that could replace specific capabilities with something transformative. Without the ability to decouple the components of the monolith, you may be stuck using the unwanted capabilities; at the very least, you’ll still be paying maintenance and upgrades for the entire suite rather than just the parts that you’re using.

The result of all this is that I’m seeing organizations starting to build their digital automation platforms with much more granular components, and BPMS vendors offering their products in a granularity to match that. It’s this pattern that I’ll be talking about in my bpmNEXT keynote in Santa Barbara on April 16, “Best of Breed: Rolling Your Own Digital Automation Platform using BPMS and Microservices”. Hope to see you there.

Show me the money: Financials, sales and support at @OpenText Analyst Summit 2019

We started the second day of the OpenText Analyst Summit 2019 with their CFO, Madhu Ranganathan, talking about their growth, both via acquisitions and organically. She claimed that their history of acquisitions shows that M&A does work — a point with which some industry specialists may not agree, given the still overlapping collection of products in their portfolio — but there’s no doubt that they’re growing well based on their six-year financials, across a broad range of industries and geographies. She sees this as a position for continuing to scale to $1B in operating cash flow by June 2021, an ambitious but achievable target, building on their 25-year run.

Ted Harrison, EVP of Worldwide Sales, was up next with an update on their customer base: 85 of the 100 largest companies in the world, 17 of the top 20 financial services companies, 20 of the top 20 life sciences companies, etc. He walked through the composition of the 1,600 sales professionals in their teams, from the account executives and sales reps to the solution consultants and other support roles. They also have an extensive partner channel bringing domain expertise and customer relationships. He highlighted a few customers in some of the key product areas — GM for digital identity management, Nestle for supply chain management, Malaysia Airports for AI and analytics, and British American Tobacco for SuccessFactors-OT2 integration — with a focus on customers that are using OpenText in ways that span their business operations in a significant way.

James McGourlay, EVP of Customer Operations, covered how their global technical support and professional services organization has aligned with the customer journey from deployment to adoption to expansion of their OpenText products. With 1,400 professional services people, they have 3,000 engagements going on at any given time across 30 countries. As with most large vendors’ PS groups, they have a toolbox of solution accelerators, best practices, and expert resources to help with initial implementation and ongoing operations. This is also where they partner with systems integrators such as CGI, Accenture and Deloitte, and platform partners like Microsoft and Oracle. He addressed the work of their 1,500 technical support professionals across four major centers of excellence for round-the-clock support, co-located with engineering teams to provide a more direct link to technical solutions. They have a strong focus on customer satisfaction in PS and technical support because they realize that happy customers tend to buy more stuff; this is particularly important when you have a lot of different products to sell to those customers to expand your footprint within their organizations.

Good to hear more about the corporate and operations side than I normally cover, but looking forward to this afternoon’s deeper dives into product technology.

Product Innovation session at @OpenText Analyst Summit 2019

Muhi Majzoub, EVP of Engineering, continued the first day of the analyst summit with a deeper look at their technology progress in the past year as well as future direction. I only cover a fraction of OpenText products; even in the ECM and BPM space, they have a long history of acquisitions and it’s hard to keep on top of all of them.

Their Content Services provides information integration into a variety of key business applications, including Salesforce and SAP; this allows users to work in those applications and see relevant content in that context without having to worry where or how it’s stored and secured. Majzoub covered a number of the new features of their content platforms (alas, there are still at least two content platforms, and let’s not even talk about process platforms) as well as user experience, digital asset management, AI-powered content analytics and eDiscovery. He talked about their solutions for LegalTech and digital forensics (not areas that I follow closely), then moved on to the much broader areas of AI, machine learning and analytics as they apply to capture, content and process, as well as their business network transactions.

He talked about AppWorks, which is their low-code development environment but also includes their BPM platform capabilities since they have a focus on process- and content-centric applications such as case management. They have a big push on vertical application development, both in terms of enabling it for their customers and also for building their own vertical offerings. Interestingly, they are also allowing for citizen development of micro-apps in their Core cloud content management platform that includes document workflows.

The product session was followed by a showcase and demos hosted by Stephen Ludlow, VP of Product Marketing. He emphasized that they are a platform company, but since line-of-business buyers want to buy solutions rather than platforms, they need to be able to demonstrate applications that bring together many of their capabilities. We had five quick demos:

  • AI-augmented capture using Captiva capture and Magellan AI/analytics: creating an insurance claim first notice of loss from an unstructured email, while gathering aggregate analytics for fraud detection and identifying vehicle accident hotspots.
  • Unsupervised machine learning for eDiscovery to identify concepts in large sets of documents in legal investigations, then using supervised learning/classification to further refine search results and prioritize review of specific documents.
  • Integrated dashboard and analytics for supply chain visibility and management, including integrating, harmonizing and cleansing data and transactions from multiple internal and external sources, and drilling down into details of failed transactions.
  • HR application integrating SAP SuccessFactors with content management to store and access documents that make up an employee HR file, including identifying missing documents and generating customized documents.
  • Dashboard for logging and handling non-conformance and corrective/preventative actions for Life Sciences manufacturing, including quality metrics and root cause analysis, and linking to reference documentation.

Good set of business use cases to finish off our first (half) day of the analyst summit.

Snowed in at the @OpenText Analyst Summit 2019

Mark Barrenechea, OpenText’s CEO and CTO, kicked off the analyst summit with his re:imagine keynote here in Boston amidst a snowy winter storm that ensures a captive audience. He gave some of the current OpenText stats — 100M end users across 120,000 customers, $2.8B in revenue last year — before expanding into a review of how the market has shifted over the past 10 years, fueled by changes in technology and infrastructure. What’s happened on the way to digital and AI is what he calls the zero theorem: zero trust (guard against security and privacy breaches), zero IT (bring your own device, work in the cloud), zero people (automate everything possible) and zero down time (everything always available).

Their theme for this year is to help their customers re:imagine work, re:imagine their workforce, and re:imagine automation and AI. This starts with OpenText’s intelligent information core (automation, AI, APIs and data management), then expands with both their EIM platforms and EIM applications. OpenText has a pretty varied product portfolio (to say the least) and is bringing many of these components together into a more cohesive integrated vision in both the content services and the business network spaces. More importantly, they are converging their many, many engines so that in the future, customers won’t have to choose between multiple ECM or BPM engines, for example.

They are providing a layer of RESTful services on top of their intelligent information core services (ECM, BPM, Capture, Business Network, Analytics/AI, IoT), then allow that to be consumed either by standard development tools in a technical IDE, or using the AppWorks low-code environment. The Cloud OT2 architecture provides about 40 services for consumption in these development environments or by OpenText’s own vertical applications such as People Center.

Barrenechea finished up with a review of how OpenText is using OpenText to transform their own business, using AI for looking at some of their financial and people management data to help guide them towards improvements. They’ll be investing $2B in R&D over the next five years to help them become even bigger in the $100B EIM market, both through the platform and increasingly through vertical applications.

Next up was Ted Harrison, EVP of Worldwide Sales, interviewing one of their customers: Gopal Padinjaruveetil, VP and Chief Information Security Officer at The Auto Club Group. AAA needs no introduction as a roadside assistance organization, but they also have insurance, banking, travel, car care and advocacy business areas, with coordinated member access to services across multiple channels. It’s this concept of the connected member that has driven their focus on digital identity for both people and devices, and how AI can help them to reduce risk and improve security by detecting abnormal patterns.

We’ll be digging into more of the details later today and tomorrow as the summit continues, so stay tuned.

TechnicityTO 2018: Cool tech projects

The afternoon session at Technicity started with a few fast presentations on cool projects going on in the city. Too quick to grab details from the talks, but here’s who we heard from:

  • Dr. Eileen de Villa, medical officer of health at Toronto Public Health, and Lawrence Eta, deputy CIO at the city of Toronto, on using AI to drive public health outcomes.
  • Angela Chung, project director at Toronto Employment and Social Services, Children’s Services, Shelter Support and Housing, on client-centric support through service platform integration.
  • Matthew Tenney, data science and visualization team supervisor, on IoT from streetcars to urban forestry for applications such as environmental data sensing.
  • Arash Farajian, policy planning consultant, on Toronto Water’s use of GIS, smart sensors, drones (aerial and submersible) and augmented reality.

The rest of the afternoon was the 10th annual Toronto’s Got IT Awards of Excellence, but unfortunately I had to duck out for other meetings, so that’s it for my Technicity 2018 coverage.

TechnicityTO 2018: CIO @RobMeikle keynote

Rob Meikle, CIO at the city of Toronto, gave a fast-paced and inspiring keynote to close out the morning at Technicity. I can’t do justice to his talk here (hopefully there will be a video, because he’s a great speaker), but a few points did resonate with me.

  • There’s a correlation between digital access and socioeconomic level, and we need to use technology to drive digital inclusion.
  • Interactions between government and constituents need to be more digital and more responsive.
  • The most inclusive cities are the most successful.
  • Focus on meaningful and measurable outcomes to make the city prosperous.
  • IT organization is being reworked to support a digital city model.
  • Policies need to be transformed faster to keep up with data usage: innovation is in policies, not just technology.
  • Increasing digital literacy is a mandate for the city in order to benefit residents.
  • The city creates a lot of opportunities, but also needs to focus on outcomes to benefit all residents — such as the one in four children in the city who live in poverty.

Good focus on how public sector technology should focus on social good as well as making government more efficient.

If I see a link to the video published, I’ll come back and update this post.

Update: here’s the video!

TechnicityTO 2018: Administrative Penalty System case study

We had a quick review of the City’s Administrative Penalty System (APS), which lets you pay or dispute your parking ticket online, with a panel made up of Lenny Di Marco, senior systems integrator; Kelli Chapman, director of prosecution services; and Susan Garossino, director of court services.

Technologically, this was a challenge to integrate old COBOL systems and newer systems across both city and provincial agencies, but there was also a cultural change to do some level of dispute resolution online rather than in the courts. Paying online isn’t new (I seem to remember paying a ticket online years ago when I still had a car), but the process of requesting a review and appealing a review result now happens in a matter of weeks rather than years. In addition to the obvious benefit of a timely outcome — better for citizens, who get things sorted out sooner; for the city, which resolves tickets faster; and for police officers, who don’t have to attend court if the issue is resolved online — this also frees up court time for more serious charges. It’s still possible to do this in person, but a lot of people don’t have the time to get to a city office during business hours, or don’t want to go through the face-to-face process.

This is not just a matter of keeping up with regular day-to-day parking violations, but managing peaks that occur when the city has ticketing blitzes (usually caused when an elected official wants to make a statement about being tough on parking offenders).

The whole project took 12-14 months from inception to rollout, and is based on integrating and extending their COBOL back end and other existing systems, rather than purchasing new technology or bringing in outside help. There were definitely some technology challenges, but also the work of assessing the needs of the stakeholders from the city, the province and the police so that they can do their jobs, including the new online review and adjudication roles.

Cool stuff, even if you don’t like paying parking tickets. Sounds like they’re already working on another integration project for next year related to Vision Zero, although we didn’t get the details.

TechnicityTO 2018: Innovative Toronto

The second session at today’s Technicity conference highlighted some of the technology innovation going on at the city, with a panel featuring Grant Coffey, director of strategy and program management at the City of Toronto; Tina Scott, Blockchain proof of concept lead for the city; and Gabe Sawhney, executive director of Code for Canada and a representative for Civic Hall Toronto. Jim Love, CIO of IT World Canada, moderated.

There are a number of different technology innovations underway at the city: some of them are public services, such as public WiFi and the offerings of Code for Canada and Civic Hall, while others are about how the city does business internally and with its commercial partners, such as blockchain in procurement processes.

Civic Hall has some interesting programs for connecting city government with other organizations for the purpose of building solutions together — I’ve been aware of and involved in things like this over several years, and they can yield great results in conjunction with the open data initiative at the city. Toronto also has a Civic Innovation Office as an in-house accelerator to help come up with innovative solutions to tough problems. These private and public programs aren’t in competition: they both foster innovation, and support different constituents in different ways.

Blockchain is starting to gain a foothold in the city through some training and an internal hackathon earlier this year to develop proofs of concept; this provided exposure to both business and technology areas about the potential for blockchain applications. Now, they are trading ideas with some of the other levels of government, such as provincial ministries, about using blockchain, and developing use cases for initial applications. They’re still just coming out of the experimental stage, and are looking at uses such as cross-jurisdictional/cross-organizational information sharing as near-term targets.

It’s not all positive, of course: challenges exist in evolving the city employee culture to take advantage of innovation and do things differently (which is pretty much the same as in private industry), as well as changing policies and governance best practices to be ready for innovation rather than playing catch-up. Sharing success stories is one of the best ways to help promote those changes.