AI and the Content Creator

There’s a lot of buzz about how we’re all going to be replaced in our jobs by artificial intelligence. As a long-time practitioner in business process automation, I see this as possibly just another step in the trend of automating work to make it better, faster and cheaper, as we’ve been doing for centuries. From Jacquard looms to Ford’s assembly lines to automated business workflows, it’s a bit more of the same applied to different fields, although increasingly sophisticated. Most people who are in some sort of creative role — writing, graphics, innovation — assume that they’re immune from this type of automation. They may be wrong.

I was listening to a podcast by Tim Harford, who creates the excellent Cautionary Tales series, and in this episode he was talking with Jacob Goldstein, who has recently written a book called Money: The True Story of a Made-Up Thing. Jacob described uploading a chapter of his book to Google’s NotebookLM and asking it for an audio summary; NotebookLM generated a two-person conversational podcast with entirely AI hosts that summarized and discussed his book chapter. Whoa.

I decided to give it a try, and uploaded this paper that I wrote back in 2012 on BPM and application development. It generated this podcast-like audio summary that is, quite frankly, pretty cool.

CamundaCon 2023 Day 1: Behavior-Driven Development at Zurich Insurance

Zurich has been working on their end-to-end processes for a long time, and found that the lead time for changes became a problem as they attempted to move to continuous improvement through continuously changing processes. Laurenz Harnischmacher and Stefan Post of Zurich presented their approach to improving this. They measured Change Lead Time, the time from when a developer starts working on a change until it is released to production, and found that it was generally about 10 weeks. This was a problem for agile processes in general, but they also had a mismatch between what the developers were creating and what the users actually needed, which meant that solving a single problem could require multiple of those 10-week cycles. In other words, like many other organizations, they had no appropriate and complete method of communicating needs from the business to the developers.

They adopted Behavior-Driven Development, which provides a methodology for the business to define the desired system behavior using structured test scenarios, aided by the use of Cucumber. These scenarios are written in given…when…then language to specify the context, constraints and expected behavior. This takes a bit longer up front: the business unit has to write complete test cases, then the developers/testers create the automated testing scenarios and feed back to the business if they find inconsistencies or omissions, then development occurs, and finally the automated tests can be applied to the finished code. This is much more likely to result in code that works right for the business the first time, rather than requiring multiple passes through the full development cycle.
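
To make the given…when…then structure concrete, here’s a minimal sketch in plain Python of how such a scenario maps to executable steps. This is a hypothetical insurance-flavored example, not Zurich’s actual test suite — a real Cucumber setup would use Gherkin feature files plus step definitions — but the shape of the collaboration is the same: the business writes the scenario, and each clause becomes an automated step.

```python
# A Gherkin-style scenario, as the business might write it:
SCENARIO = """
Given a policyholder with 5 claim-free years
When they request a renewal quote
Then the premium includes a 10% no-claims discount
"""

def given_policyholder(claim_free_years):
    # Context: set up the state the scenario starts from
    return {"claim_free_years": claim_free_years, "base_premium": 1000.0}

def when_renewal_quote_requested(policy):
    # Action: the behavior under test (a simplified discount rule)
    discount = 0.10 if policy["claim_free_years"] >= 5 else 0.0
    return policy["base_premium"] * (1 - discount)

def then_premium_has_discount(premium, expected):
    # Expected outcome: the business-defined assertion
    assert premium == expected, f"expected {expected}, got {premium}"

# Execute the scenario end to end
policy = given_policyholder(claim_free_years=5)
premium = when_renewal_quote_requested(policy)
then_premium_has_discount(premium, expected=900.0)
print("scenario passed")
```

The point of the structure is that the assertion in the “then” step is written against the business’s stated expectation, so a failing test surfaces a requirements gap before a full development cycle is wasted.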

Although they’ve achieved a huge improvement in lead time, I’m left with the thought that somewhere along the Agile journey, we lost the ability to properly create business requirements for software. Agile in the absence of written requirements works okay if your development cycle is super-short, but 10 weeks is not really that Agile. I’m definitely not advocating a return to waterfall development, where requirements are completely documented and signed off before any development begins, but we need to bake some of the successful bits of waterfall into a more (lower-case A) agile iterative approach. This is independent of Camunda or any other platform, and more a comment on how application development needs to work in order to avoid excessively lengthy rework cycles. Also, the concepts of BDD borrow a lot from earlier test-driven development methods; in software development, like much else in life, everything old is new again.

Interesting to note from the comments at the end that the automated test cases are only for automated/service steps, and that the business people do not create process models directly.

Back to Berlin! It’s time for CamundaCon 2022

I realize that I’m completely remiss for not posting about last week’s DecisionCAMP, but in my defense, I was co-hosting it and acting as “master of ceremonies”, so was a bit busy. This was the third year for a virtual DecisionCAMP, with a plan to be back in person next year, in Oslo. And speaking of in-person conferences, I’m in Berlin! Yay! I dipped my toe back into travel three weeks ago by speaking at Hyland’s CommunityLive conference in Nashville, and this week I’m on a panel at Camunda’s annual conference. I’ve been in Berlin for this conference several times in the past, from the days when they held the Community Day event in their office by just pushing back all the desks. Great to be back and hear about some of their successes since that time.

Day 1 started with an opening keynote by Jakob Freund, Camunda’s CEO. This is a live/online hybrid conference, and considering that Camunda did one of the first successful online conferences back in 2020 by keeping it live and real, this is shaping up to be a forerunner in the hybrid format, too. A lot of companies need to learn to do this, since many people aren’t getting back on a plane any time soon to go to a conference when it can be done online just as well.

Anyway, back to the keynote. Camunda just published the Process Orchestration Handbook, a marketing piece that distills some of the current messaging around process automation and highlights some of the themes from Jakob’s keynote. He points out the problems with a lot of current process automation: there’s no end-to-end automation, no visibility into the overall process, and little flexibility to rework those processes as business conditions change. As a result, a lot of process automation is brittle: it falls over whenever there’s a problem outside the scope of the automation of a specific set of tasks.

Jakob focused on a couple of things that make process orchestration powerful as a part of business automation: endpoint diversity (being able to connect a lot of different types of tasks into an integrated process) and process complexity (being able to include dynamic parallel execution, event-driven message correlation, and time-based escalation). These sound pretty straightforward, and for those of us who have been in process automation for a long time these are accepted characteristics of BPMN-based processes, but these are not the norm in a lot of process orchestration.

He also walked through the complexities that arise due to long-running processes, that is, anything that involves manual steps with knowledge workers: not the same as straight-through API-only process orchestration that doesn’t have to worry about state persistence. There are a few good customer stories here this week that bring all of these things together, and I plan to be attending some of those sessions.

He presented a view of the process automation market: BPMS, low-code platforms, process mining, iPaaS/integration, RPA, microservices orchestration, and process orchestration. Camunda doesn’t position itself in BPMS any more – mostly since the big analysts have abandoned this term – but in the process orchestration space. Looking at the intersection between the themes of endpoint diversity and process complexity that he talked about earlier, you can see where different tools end up. He even gives a nod to Gartner’s hyperautomation term and how process orchestration fits into the tech stack.

He finished up with a bit of Camunda’s product vision. They released V8 this year with the Zeebe engine, but much more than that is under development. More low-code, especially around modeling and front-end human task technology, to enable less technical developers. Decision automation tied into process orchestration. And stronger coverage of AI, process mining and other parts of the hyperautomation tech stack through partnerships as well as their own product development.

Definitely some shift in the messaging going on here, and in some of Camunda’s direction. A big chunk of their development is going into supporting low-code and other “non-developer” personas, which they resisted for many years. They have a crossover point for pro-code developers to create connectors that are then used by low-code developers to assemble into process orchestrations – a collaboration that has been recognized by larger vendors for some time. Sensible plans and lots of cool new technology.

The rest of the day is pretty packed, and I’m having trouble deciding which sessions to attend since there are several concurrent that all sound interesting. Since most of them are also virtual for remote attendees, I assume the recordings will be available later and I can catch up on what I missed. It’s not too late to sign up to attend the rest of today and tomorrow virtually, or to see the recorded sessions after the fact.

Videos from the past

Now that I’m starting to produce more video — see my short videos on the Trisotech blog and the citizen developer series on Bizagi — I’ve been combing through my portfolio of previous interviews and presentations, and it’s been a real blast from the past. These stretch back to my days at FileNet (2000-2001, or what I refer to as “the longest 16 months of my life”) where I did a lot of public conference presentations and internal educational courses on the emerging field of BPM, but most of the pre-YouTube content has been lost to time.

I’ve created a playlist of all of the ones that I can find on my YouTube channel and I’ll add new content of my own on the main video page. Click Subscribe over there to be notified of new videos when I publish them.

Here’s the earliest video that I can find of an interview, talking about TIBCO’s first release of ActiveMatrix BPM at the 2010 TIBCO conference. This was recorded and published by (now retired) Den Howlett. I discussed the trend of BPM suites moving to an all-in-one application development environment, a trend that swept through most of the mainstream vendors over the ensuing years and is still popular with many of them.

By the way, the review that I wrote of AMX BPM a few months later, after a few more in-depth briefings, is still one of the most-read posts on this blog.

Camunda Platform 7.15: now low-code (-ish)

I had a quick briefing with Daniel Meyer, CTO of Camunda, about today’s release. With this new version 7.15, they are rebranding from Camunda BPM to Camunda Platform (although most customers just refer to the product as “Camunda” since they really bundle everything in one package). This follows the lead of other vendors who have distanced themselves from the BPM (business process management) moniker, in part because what the platforms do is more than just process management, and in part because BPM is starting to be considered an outdated term. We’ve seen the analysts struggle with naming the space, or even defining it in the same way, with terms like “digital process automation”, “hyperautomation” and “digitalization” being bandied about.

An interesting pivot for Camunda in this release is their new support for low-code developers — which they distinguish as having a more technical background than citizen developers — after years of primarily serving the needs of professional technical (“pro-code”) developers. The environment for pro-code developers won’t change, but now it will be possible for more collaboration between low-code and pro-code developers within the platform with a number of new features:

  • Create a catalog of reusable workers (integrations) and RPA bots that can be integrated into process models using templates. This allows pro-code developers to create the reusable components, while low-code developers consume those components by adding them to process models for execution. RPA integration is driving some amount of this need for collaboration, since low-code developers are usually the ones on the front-end of RPA initiatives in terms of determining and training bot functionality, but previously may have had more difficulty integrating those into process orchestrations. Camunda is extending their RPA Bridge to add Automation Anywhere integration to their existing UiPath integration, which gives them coverage of a significant portion of the RPA market. I covered a bit of their RPA Bridge architecture and their overall view on RPA in one of my posts from their October 2020 CamundaCon. I expect that we will soon see Blue Prism integration to round out the main commercial RPA products, and possibly an open source alternative to appeal to their community customers.
  • DMN support, including DRD and decision tables, in their Cawemo collaborative modeler. This is a good way to get the citizen developers and business analysts involved in modeling decisions as well as processes.
  • A form builder. Now, I’m pretty sure I’ve heard Jakob Freund claim that they would never do this, but there it is: a graphical form designer for creating a rudimentary UI without writing code. This is just a preliminary release, only supporting text input fields, so it isn’t going to win any UI design awards. However, it’s available in the open source and commercial versions as well as accessible as a library in bpmn.io, and will allow a low-code developer to do end-to-end development: create process and decision models, and create reusable “starter” UIs for attaching to start events and user activities. When this form builder gets a bit more robust in the next version, it may be a decent operational prototyping tool, and possibly even make it into production for some simple situations.
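
The collaboration pattern in the first point above boils down to a catalog: pro-code developers register reusable connectors, and low-code developers reference them by name from a process model without touching the implementation. Here’s a minimal Python sketch of that split — illustrative only, with made-up names, not Camunda’s actual connector or template API:

```python
# Registry that pro-code developers publish reusable workers into
CATALOG = {}

def register(name):
    """Pro-code side: publish a worker implementation under a template name."""
    def wrap(fn):
        CATALOG[name] = fn
        return fn
    return wrap

@register("send-invoice")
def send_invoice(variables):
    # Real implementation would call a legacy system or external service
    return {"sent_to": variables["email"]}

# Low-code side: a process model references connectors by template name only
process_model = ["send-invoice"]

def run(model, variables):
    # The low-code developer assembles tasks; implementations come from the catalog
    for task in model:
        variables.update(CATALOG[task](variables))
    return variables

result = run(process_model, {"email": "ap@example.com"})
print(result["sent_to"])  # ap@example.com
```

The design point is the contract boundary: the catalog name (plus its expected inputs/outputs) is all that crosses between the two personas.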

They’ve also added some nice enhancements to Optimize, their monitoring and analytics tool, and have bundled it into the core commercial product. Optimize was first released mid-2017 and is now used by about half of their customers. Basically, it pumps the operational data exhaust out of the BPM engine database and into an Elasticsearch environment; with the advent of Optimize 3.0 last year, they could also collect tracking events from other (non-Camunda) systems into the same environment, allowing end-to-end processes to be tracked across multiple systems. The new version of Optimize, now part of Camunda Platform 7.15, adds some new visualizations and filtering for problem identification and tracking.
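
The end-to-end tracking idea is simple at its core: events from multiple systems are correlated on a shared business key into a single timeline. A minimal Python sketch, with made-up event shapes rather than Optimize’s actual ingestion format:

```python
from collections import defaultdict

# Events emitted by different systems, sharing a business key
events = [
    {"source": "camunda", "key": "order-17", "ts": 1, "event": "order received"},
    {"source": "crm",     "key": "order-17", "ts": 2, "event": "customer notified"},
    {"source": "camunda", "key": "order-17", "ts": 3, "event": "order shipped"},
    {"source": "crm",     "key": "order-99", "ts": 1, "event": "customer notified"},
]

# Correlate into one end-to-end trace per business key, ordered by time
traces = defaultdict(list)
for e in sorted(events, key=lambda e: e["ts"]):
    traces[e["key"]].append(f'{e["source"]}: {e["event"]}')

print(traces["order-17"])
# ['camunda: order received', 'crm: customer notified', 'camunda: order shipped']
```

Once the events land in a search index keyed this way, the end-to-end view exists even though no single engine executed the whole process.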

Overall, there are some important things in this release, although it might appear to be just a collection of capabilities that many of the all-in-one low-code platforms have had all along. It’s not really in Camunda’s DNA to become a proprietary all-in-one application development platform like Appian or IBM BPM, or even make low-code a primary target, since they have a robust customer base of technical developers. However, these new capabilities create an important bridge between low-code developers who have a better understanding of the business needs, and pro-code developers with the technical chops to create robust systems. It also provides a base for Camunda customers who want to build their own low-code environment for internal application development: a reasonably common scenario in large companies that just can’t fit their development needs into a proprietary application development platform.

Opening up open source development with Bonitasoft 2021.1

I had a briefing with Bonitasoft CEO Miguel Valdes earlier this week to hear more about their 2021.1 release, announced today. Note that they have changed their version numbers to align with the year and release number relative to that year, something I’ve seen starting to happen with a number of vendors.

The most important part of this release, in my opinion, is their shift in what’s open source versus commercial. Like most open source vendors, Bonitasoft is built on an open source core engine and has a number of other open source capabilities, but creates proprietary commercial components that they license to paying customers in a commercial version of the system. In many cases, the purely open source version is great for technical developers who are using their own development tools, e.g., for data modeling or UI development, but it’s a non-starter for business developers since you can’t build and deploy a complete application using just the open source platform. Bonitasoft is shaking up this concept by taking everything to do with development — page/form/layout UI design, data object models, application descriptors, process diagrams, organization models, system connectors, and Git integration — and adding it to the open source version. In other words, instead of having one version of their Studio development environment for open source and another for commercial, everything is unified, and the open source version now includes everything necessary to develop and deploy process automation applications. Having a unified development environment also makes it easy to move a Bonitasoft application from the open source to the commercial version (and the reverse) without project redevelopment, allowing customers to start with one version and shift to the other as project requirements change.

This is a big deal for semi-technical business developers, who can now use Bonita Studio to create complete applications on the same underlying platform as technical developers. Bonitasoft has removed anything that requires coding from Studio, recognizing that business developers don’t want to write code, and technical developers don’t use the visual development environment anyway. [As a Java developer at one of my enterprise clients once said when presented with a visual development environment, “yeah, it looks nice, but I’ll never use it”.] That doesn’t mean that these two types of developers don’t collaborate, however: technical developers are critical for the development of connectors, for example, which allow an application to connect to an external system. Once connectors are created to link to, for example, a legacy system, the business developers can consume those connectors within their applications. Bonitasoft provides connector archetypes that allow technical developers to create their own connectors, but what is missing is a way to facilitate collaboration between business and technical developers for specifying the functionality of a connector. For example, allowing the business developer to create a placeholder in an application/process and specify the desired behavior, which would then be passed on to the technical developer.

Miguel took me through a demo of Bonita Studio using only the open source version, showing their new starting point of MyProcessAutomation that includes sections for Organization, Business Data Model Applications, Diagrams, and Pages/Forms/Layouts. There are also separate sections for Custom Widgets and Environments: the latter is to be able to define separate environments for development, testing, production, etc., and was previously only in the commercial edition. It’s a very unified look and feel, and seems to integrate well between the components: for example, the expression editor allows direct access to business object variables that will be understandable to business developers, and they are looking at ways to simplify this further.

They are moving their UI away from Angular to Web Components, and are even using their own UI tools to create their new user and administrator portals. The previous Bonita Portal is now being replaced by two applications: one is a user portal that provides case and task list functionality, while the other is for administrators to monitor and repair any process instance problems. These two applications can be freely modified by customers to personalize and extend them within their own environment.

There are definitely things remaining in their commercial edition, with a focus on security (single sign-on), scalability (clustering, elasticity), monitoring, and automated continuous deployment. There are a few maintenance tools that are being moved from the open source to commercial version, and maintenance releases (bug fixes between releases) will be limited to commercial customers. They also have a new subscription model that helps with self-managed elastic deployments (e.g., Amazon Cloud); provides update and migration services for on-premise customers; and includes platform audits for best practices, performance tuning and code review. Along with this are some new packaged professional services offerings: Fast Track for first implementation, on-premise to Bonita Cloud upgrade, and on-premise upgrades for customers not on the new subscription model.

The last thing that Miguel mentioned was an upcoming open source project that they are launching related to … wait, that might not have been for public consumption yet. Suffice to say that Bonitasoft is disrupting the open source process automation space to be more inclusive of non-technical developers, and we’ll be seeing more from them along these lines in the near future.

On the folly of becoming the “product expert”

This post by Charity Majors of Honeycomb popped up in my feed today, and really resonated with me relative to our somewhat inbred world of process automation. She talks about the need to move between software development teams in order to keep building skills, even if it means moving from a “comfortable” position as the project expert to a newbie role:

There is a world of distance between being expert in this system and being an actual expert in your chosen craft. The second is seniority; the first is merely … familiarity.

I see this a lot with people becoming technical experts at a particular vendor product, when it’s really a matter of familiarity with the product rather than a superior skill at application development or even process automation technology. Being dedicated to a single product means that you think about solving problems in the context of that product, not about how process automation problems in general could be solved with a wider variety of technology. Dedication to a single product may make you a better technician but does not make you a senior engineer/architect.

Majors uses a great analogy of escalators: becoming an expert on one project (or product) is like riding one long escalator. When you get to the top, you either plateau, or move laterally and start another escalator ride from its bottom up to the next level. Considering this with vendor products in our area, this would be like building expertise in IBM BPM for a couple of years, then moving to building Bizagi expertise for a couple of years, then moving to Camunda for a couple of years. At the end of this, you would have an incredibly broad knowledge of how to solve process automation projects on a variety of different platforms, which makes you much more capable of making the types of decisions required at the senior architecture and design level.

This broader knowledge base also reduces risk: if one vendor product falls out of favor in the market, you can shift to others that are already in your portfolio. More importantly, because you already understand how a number of different products work, it’s easier to take on a completely new product. Even if that means starting at the bottom of another escalator.

CamundaCon 2020.2 Day 1

I listened to Camunda CEO Jakob Freund’s opening keynote from the virtual CamundaCon 2020.2 (the October edition), and he really hit it out of the park. I’ve known Jakob a long time and many of our ideas are aligned, and a lot in his keynote resonated with me. He used the phrase “reinvent [your business] or die”, whereas I’ve been using “modernize or perish”, with a focus not just on legacy systems and infrastructure, but also legacy organizational culture. Not to hijack this post with a plug for another company, but I’m doing a keynote at the virtual Bizagi Catalyst next week on aligning intelligent automation with incentives and business outcomes, which looks at issues of legacy organizational culture as well as the technology around automation. Processes are, as he pointed out, the algorithms of an organization: they touch everything and are everywhere (even if you haven’t automated them), and a lot of digital-native companies are successful precisely because they have optimized those algorithms.

Jakob’s advice for achieving reinvention/modernization is to do a gradual transformation rather than a big bang approach that fails more often than it succeeds, and he positions Camunda (of course) as the bridge between the worlds of legacy and new technology. In my years of technology consulting on BPM implementations, I have also recommended a gradual approach: build bridges between new and old technology, then swap out the legacy bits as you develop or buy replacements. This is where, for example, you can use RPA to create stop-gap task automation with your existing legacy systems, then gradually replace the underlying legacy or at least create APIs to replace the RPA bots.

The second opening keynote was with Marco Einacker and Christoph Anzer of Deutsche Telekom, discussing how they are using process and task automation by combining Camunda for the process layer and RPA at the task layer. They started out with using RPA for automating tasks and processes, ending up with more than 3,000 bots and an estimated €93 million in savings. It was a very decentralized approach, with bots initially being created by business areas without IT involvement, but as they scaled up, they started to look for ways to centralize some of the ideas and technology. First was to identify the most important tasks to start with, namely those that were true pain points in the business (Einacker used the phrase “look for the shittiest, most painful process and start there”), not just the easy copy-paste applications. They also looked at how other smart technologies, such as OCR and AI, could be integrated to create completely unattended bots that add significant value.

The decentralized approach resulted in seven different RPA platforms and too much process automation happening in the RPA layer, which increased the amount of technical debt, so they adapted their strategy to consolidate RPA platforms and separate the process layer from the bot layer. In short, they are now using Camunda for process orchestration, and the RPA bots have become tasks that are orchestrated by the process engine. Gradually, they are (or will be) replacing the RPA bots with APIs, which moves the integration from front-end to back-end, making it more robust with less maintenance.
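
That layering can be sketched abstractly: if bots and APIs implement the same task contract, the process layer doesn’t change when a bot is later replaced by a back-end API. A minimal Python sketch with illustrative names — this is my abstraction, not Deutsche Telekom’s actual architecture:

```python
from abc import ABC, abstractmethod

class TaskWorker(ABC):
    """Contract the orchestration layer sees, regardless of implementation."""
    @abstractmethod
    def execute(self, variables: dict) -> dict: ...

class ScreenScrapingBot(TaskWorker):
    """Front-end integration: drives a legacy UI (simulated here)."""
    def execute(self, variables):
        return {"status": "done", "via": "rpa-bot"}

class BackendApiWorker(TaskWorker):
    """Back-end integration: calls a proper API (simulated here)."""
    def execute(self, variables):
        return {"status": "done", "via": "api"}

def run_task(worker: TaskWorker, variables: dict) -> dict:
    # The process engine depends only on the TaskWorker contract,
    # so swapping bot -> API is a configuration change, not a process change
    return worker.execute(variables)

print(run_task(ScreenScrapingBot(), {"customer_id": 42})["via"])  # rpa-bot
print(run_task(BackendApiWorker(), {"customer_id": 42})["via"])   # api
```

Keeping the orchestration above this boundary is what lets the technical debt of front-end integration be retired task by task.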

I moved off to the business architecture track for a presentation by Srivatsan Vijayaraghavan of Intuit, where they are using Camunda for three different use cases: their own internal processes, some customer-facing processes for interacting with Intuit, and — most interesting to me — enabling their customers to create their own workflows across different applications. Their QuickBooks customers are primarily small and mid-sized businesses that don’t have the skills to set up their own BPM system (although arguably they could use one of the many low-code process automation platforms to do at least part of this), which opened the opportunity for Intuit to offer a workflow solution based on Camunda but customizable by the individual customer organizations. Invoice approvals was an obvious place to start, since Accounts Payable is a problem area in many companies, then they expanded to other approval types and integration with non-Intuit apps such as e-signature and CRM. Customers can even build their own workflows: a true workflow-as-a-service model, with pre-built templates for common workflows, integration with all Intuit services, and a simplified workflow designer.

Intuit customers don’t interact directly with Camunda services; Camunda is a separately hosted and abstracted service, and they’ve used Kafka messages and external task patterns to create the cut-out layer. They’ve created a wrapper around the modeling tools, so that customers use a simplified workflow designer instead of the BPMN designer to configure the process templates. There is an issue with a proliferation of process definitions as each customer creates their own version of, for example, an invoice approval workflow — he mentioned 70,000 process definitions — and they will likely need to do some sort of automated cleanup as the platform matures. Really interesting use case, and one that could be used by large companies that want their internal customers to be able to create/customize their own workflows.
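
The proliferation problem is easy to see in miniature: if each customer’s configured template is deployed as its own process definition, identical configurations still multiply. One hypothetical cleanup approach is to hash the effective configuration so identical setups share a definition — this is my speculation, not Intuit’s actual design:

```python
import hashlib
import json

TEMPLATE = {"name": "invoice-approval", "steps": ["submit", "approve", "notify"]}

def configure(template: dict, settings: dict) -> dict:
    # A customer's workflow = shared template + their settings
    return {**template, "settings": settings}

def definition_key(workflow: dict) -> str:
    # Two customers with identical settings map to the same definition
    payload = json.dumps(workflow, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:12]

deployed = {}
for customer, settings in [
    ("acme",    {"threshold": 500}),
    ("globex",  {"threshold": 500}),   # same settings as acme
    ("initech", {"threshold": 2500}),
]:
    workflow = configure(TEMPLATE, settings)
    deployed.setdefault(definition_key(workflow), workflow)

print(len(deployed))  # 2 distinct definitions instead of 3
```

At 70,000 definitions, even a modest rate of duplicate configurations would make this kind of content-addressed deployment worthwhile.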

The next presentation was by Stephen Donovan of Fidelity Investments and James Watson of Doculabs. I worked with Fidelity in 2018-19 to help create the architecture for their digital automation platform (in my other life, I’m a technical architecture/strategy consultant); it appears that they’re not up and running with anything yet, but they have been engaging the business units on thinking about digital transformation and how the features of the new Camunda-based platform can be leveraged when the time comes to migrate applications from their legacy workflow platform. This doesn’t seem to have advanced much since they talked about it at the April CamundaCon, although Donovan had more detailed insights into how they are doing this.

At the April CamundaCon, I watched Patrick Millar’s presentation on using Camunda for blockchain ledger automation, or rather I watched part of it: his internet died partway through and I missed the part about how they are using Camunda, so I’m back to see it now. The RiskStream Collaborative is a not-for-profit consortium collaborating on the use of blockchain in the insurance industry; their parent organization, The Institutes, provides risk management and insurance education and is guided by senior executives from the property and casualty industry. To copy from my original post, RiskStream is creating a distributed network platform, called Canopy, that allows their insurance company members to share data privately and securely, and participate in shared business processes. Whenever you have multiple insurance companies in an insurance process, like a claim for a multi-vehicle accident, having shared business processes — such as first notice of loss and proof of insurance — between the multiple insurers means that claims can be settled quicker and at a much lower cost.

I do a lot of work with insurance companies, as well as with BPM vendors to help them understand insurance operations, and this really resonates: the FNOL (first notice of loss) process for multi-party claims continues to be a problem in almost every company, and using enterprise blockchain to facilitate interactions between the multiple insurers makes a lot of sense. Note that they are not creating or replacing claims systems in any way; rather, they are connecting the multiple insurance companies, who would then integrate Canopy to their internal claims systems such as Guidewire.

Camunda is used in the control framework layer of Canopy to manage the flows within the applications, such as the FNOL application. The control framework is just one slice of the platform: there’s the core distributed ledger layer below that, where the blockchain data is persisted, and an integration layer above it to integrate with insurers’ claims systems as well as the identity and authorization registry.

There was a Gartner keynote, which gave me an opportunity to tidy up the writing and images for the rest of this post, then I tuned back in for Niall Deehan’s session on Camunda Hackdays over on the community tech track, and some of the interesting creations that came out of the recent virtual version. This drives home the point that Camunda is, at its heart, open source software that relies on a community of developers both within and outside Camunda to extend and enhance the core product. The examples presented here were all done by Camunda employees, although many of them are not part of the development team, but come from areas such as customer-facing consulting. These were pretty quick demos so I won’t go into detail, but here are the projects on GitHub:

If you’re a Camunda customer (open source or commercial) and you like one of these ideas, head on over to the related GitHub page and star it to show your interest.

There was a closing keynote by Capgemini; like the Gartner keynote, I felt that it wasn’t a great fit for the audience, but those are my only real criticisms of the conference so far.

Jakob Freund came back for a conversation with Mary Thengvall to recap the day. If you want to see the recorded videos of the live sessions, head over to the agenda page and click on Watch Now for any session.

There’s a lot of great stuff on the agenda for tomorrow, including CTO Daniel Meyer talking about their new RPA orchestration capabilities, and I’ll be back for that.

#PegaWorld iNspire 2020

PegaWorld, in shifting from an in-person to a virtual event, dropped down to a short 2.5 hours. The keynotes and many of the breakouts appeared to be mostly pre-recorded, hosted live by CTO Don Schuerman, who provided some welcome comic relief and moderated live Q&A with each of the speakers after their session.

The first session was a short keynote with CEO Alan Trefler. It’s been a while since I’ve had a briefing with Pega, and their message has shifted strongly to the combination of AI and case management as the core of their digital platform capabilities. Trefler also announced Pega Process Fabric, which allows the integration of multiple systems, not just from Pega but from other vendors as well.

Next up was SVP of Products Kerim Akgonul, discussing their low-code Pega Express approach and how it’s helping customers to stand up applications faster. We heard briefly from Anna Gleiss, Global IT Head of Master Data Management at Siemens, who talked about how they are leveraging Pega to ensure reusability and speed deployment across the 30 different applications that they’re running in the Pega Cloud. Akgonul continued with use cases for self-service — especially important with the explosion in customer service in some industries due to the pandemic — and some of their customers such as Aflac who are using Pega to further their self-service efforts.

There was a keynote by Rich Gilbert, Chief Digital and Information Officer at Aflac, on the reinvention that they have gone through. There’s a lot of disruption in the insurance industry now, and they’ve been addressing this by creating a service-based operating model to deliver digital services as a collaboration between business and IT. They’ve been using Pega to help them with their key business drivers of settling claims faster and providing excellent customer service with offerings such as “Claims Guest Checkout”, which lets someone initiate a claim through self-service without knowing their policy number or logging in, and a Claims Status Tracker available on their mobile app or website. They’ve created a new customer service experience using a combination of live chat and virtual assistants, the latter of which is resolving 86% of inquiries without moving to a live agent.

Akgonul also provided a bit more information on the Process Fabric, which acts as a universal task manager for individual workers, with a work management dashboard for managers. There was no live Q&A at this point; it was deferred to a Tech Talk later in the agenda. In the interim was a one-hour block of breakouts that had one track of three live build sessions, plus a large number of short prerecorded sessions from Pega, partners and customers. I’m interested in more information on the Process Fabric, which I believe will be in the later Tech Talk, although I did grab some screenshots from Akgonul’s keynote:

The live build sessions seemed to be overloaded and there was a long delay getting into them, but once started, they were good-quality demos of building Pega applications. I came in partway through the first one on low-code using App Studio, and it was quite interactive, with a moderator dropping in occasionally with live questions, and eventually hurrying the presenter along to finish on time. I was only going to stay for a couple of minutes, but it was pretty engaging and I watched all of it. The next live demo was on data and integration, and built on the previous demo’s vehicle fleet manager use case to add data from a variety of back-end sources. The visuals were fun, too: the presenter’s demo took up most of the screen, with a bubble at the bottom right containing a video of the speaker, then a bubble popping in at the bottom left with the moderator when he had a question or comment. Questions from the audience helped to drive the presentation, making it very interactive. The third live demo was on user experience, which had a few connectivity issues so I’m not sure we saw the entire demo as planned, but it showed the creation of the user interface for the vehicle manager app using the Cosmos system, moving a lot of logic out of the UI and into the case model.

The final session was the Tech Talk on product vision and roadmap with Kerim Akgonul, moderated by Stephanie Louis, Senior Director of Pega’s Community and Developer Programs. He discussed Process Fabric, Project Phoenix, Cosmos and other new product releases in addition to fielding questions from social media and Pega’s online community. This was very interactive and engaging, much more so than his earlier keynote which seemed a bit stiff and over-rehearsed. More of this format, please.

In general, I didn’t find the prerecorded sessions to be very compelling. Conference organizers may think that prerecording sessions reduces risk, but it also reduces spontaneity and energy from the presenters, which is a lot of what makes live presentations work so well. The live Q&A interspersed with the keynotes was okay, and the live demos in the middle breakout section as well as the live Tech Talk were really good. PegaWorld also benefited from Pega’s own online community, which provided a more comprehensive discussion platform than the broadcast platform chat or Q&A. If you missed today’s event, you should be able to find all of the content on demand on the PegaWorld site within the next day or two.

Building Scalable Business Automation with Microservices – a paper I created for @Camunda

Last year, I did a few presentations for Camunda: a keynote at their main conference in Berlin, a webinar together with CEO Jakob Freund, and a presentation at their Camunda Day in Toronto, all on the similar theme of building a scalable digital automation platform using microservices and a BPMS.

I wrapped up my ideas for those presentations into a paper, which you can download from the Camunda website. Enjoy!