Category Archives: BPA

business process analysis

HP Consulting’s Standards-Driven Requirements Method at BPMCM15

Tim Price from HP’s enterprise transformation consulting group presented in the last slot of day 2 of the BPM and case management summit (and what will be my last session, since I’m not staying for the workshops tomorrow) with a discussion on how to improve requirements management by applying standards. There are a lot of potential problems with requirements: inconsistency, not meeting the actual needs, not designed for change, and especially the short-term thinking of treating requirements as project rather than architecture assets. Price is pretty up-front about how you can’t take a “garden variety” business analyst and have them create BPMN diagrams without training, and that 50% of business analysts are unable to create lasting and valuable requirements.

Although I haven’t done any quantitative studies on this, I would tend to agree that the term “business analyst” covers a wide variety of skill levels, and you can’t just assume that anyone with that title can create reusable requirements models and assets. This becomes especially important when you move past written requirements — which need the written language skills that many BAs do have — to event-driven BPMN and other models; the main issue is that these models are actually code, albeit visual code, that may be beyond the technical analysis capabilities of most BAs.

Getting back to Price’s presentation, he established traceability as key to requirements: between BPMN or UML process models and UML use cases, for example; or upwards from processes to capabilities. Data needs to be modeled at the same time as processes, and processes should be modeled as soon as the high-level use case is defined. You can’t always create a one-to-one relationship between different types of elements: an atomic BPMN activity may translate to a use case (system or human), or to more than one use case, or to only a portion of a use case; lanes and pools may translate to use case actors, but not necessarily; events may represent states and implied state transitions, although also not necessarily. Use prose for descriptions, but not for control flow: that’s what you use process models for, with the prose just explaining the process model. Develop the use case and process models first, then write text to explain whatever is not obvious in the diagrams.
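As a rough illustration of activity-to-use-case traceability as a reusable asset, here is a minimal Python sketch with hypothetical element names (my own example, not anything from Price’s method): a traceability matrix that can be checked for gaps and used for impact analysis when a use case changes.

```python
# Minimal traceability sketch: BPMN activities traced to UML use cases.
# Illustrative only -- the element names are hypothetical, not HP's method.

# One atomic BPMN activity may trace to one use case, to several, or to part of one.
traceability = {
    "Register Claim":  ["UC-01 Capture Claim"],
    "Assess Claim":    ["UC-02 Assess Eligibility", "UC-03 Calculate Benefit"],
    "Notify Claimant": ["UC-04 Send Notification"],   # could be a human or system use case
    "Archive Claim":   [],                            # gap: not yet traced to anything
}

def untraced_activities(matrix):
    """Activities with no use case, i.e. gaps in the requirements."""
    return [activity for activity, use_cases in matrix.items() if not use_cases]

def impacted_activities(matrix, use_case):
    """Reverse lookup: which process activities are affected if a use case changes."""
    return [activity for activity, use_cases in matrix.items() if use_case in use_cases]

print(untraced_activities(traceability))                              # ['Archive Claim']
print(impacted_activities(traceability, "UC-03 Calculate Benefit"))   # ['Assess Claim']
```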

He walked through a case study of a government welfare and benefits organization that went through multiple failed implementations, which were traced back to poor requirements: structural problems, consistency issues, and designs embedded in the specification. Price and his team spent 12 months getting their analysts back on track by establishing standards for creating requirements — with a few of the analysts not making the transition — that led to CMMI recognition of their new techniques. Another case study applied BPMN process models and UML use cases for a code modernization process: basically, their SDLC was the process being improved. A third case study used BPMN to document as-is and to-be processes, then use case models with complete traceability from the to-be processes to the use cases, with UML domain class models being developed in parallel.

The lessons learned from HP’s experiences:

    • Apply existing standards consistently, including BPMN, CMMN, DMN, UML

    • Use graph-structured languages for structure and logic, and prose for description

    • Use repository-based modeling tools to allow for reusability and collaboration

    • Be concise, be precise, be consistent

    • Create requirements models that are architecture assets, not just short-term project assets

    Some good lessons for requirements analysis; although this was developed for complex, more waterfall-y SDLCs, some of these can definitely be adapted for more agile implementations.

    The Digital Enterprise Graph with @denisgagne at BPMCM15

    Yesterday, Denis Gagné demonstrated the modeling tools in the Trisotech Digital Enterprise Suite, and today he showed us the Digital Enterprise Graph, the semantic layer that underlies the modeling tools and allows for analysis of relationships between them. There are many stakeholders involved in defining and implementing a digital enterprise, including enterprise architects, business architects and process analysts; each of these roles has a different view on transformation of the enterprise and different goals for their work. He sees a need for a more emergent enterprise architecture rather than a structured top-down architecture effort: certainly, architects need to create the basic structure, but rather than trying to build out every artifact that might exist in the architecture before making use of it, a more pragmatic approach is for a “just-in-time” architecture that is a bit more self-organizing.

    A graph, in general, is a powerful but simple construct: it consists only of nodes and links, but can provide meaningful connections of loosely-coupled business entities that can be easily modified. Think about a social graph, such as Facebook’s social graph: it’s just people and their connections, but it’s a rich context for analyzing the relationships between nodes (people) in the graph depending on the nature of the links (friends, likes, etc.) between them. Trisotech’s Digital Enterprise Graph links the who, what, when, where, why and how of an organization by mapping every model that is added to the Graph onto those types of nodes and links, whether the model originates with one of their own modelers (BPMN, CMMN, DMN) or an external EA modeling tool (Casewise, SAP PowerDesigner, Sparx). This provides an intelligent fabric for automated reasoning about the current relationships between parts of the organization, but also allows estimation of the impact of changes in one area on other parts of the organization. Their Insight Analyzer tool allows you to introspect the graph, providing views such as interconnectivity between nodes as part of impact analysis, or tracing responsibility for a capability up through the organizational structure. The analysis isn’t automated, but provides visualization tools for analysts and planners, based on a single integrated scheme that allows for straightforward queries.
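As a rough sketch of the nodes-and-links idea (my own illustration with made-up element names, not Trisotech’s actual implementation), here is how elements from different models could land in one typed graph and be traversed for simple impact analysis:

```python
# Toy enterprise graph: typed nodes and typed links between them.
# Hypothetical elements; a sketch of the general idea, not the Digital Enterprise Graph itself.

nodes = {
    "Approve Loan":    "activity",   # from a BPMN model (the "how")
    "Credit Officer":  "role",       # from a business architecture model (the "who")
    "Credit Decision": "decision",   # from a DMN model (the "why")
    "Loan System":     "system",     # from an EA tool (the "what")
}

links = [
    ("Credit Officer", "performs",     "Approve Loan"),
    ("Approve Loan",   "invokes",      "Credit Decision"),
    ("Approve Loan",   "supported by", "Loan System"),
]

def reachable_from(node, links, seen=None):
    """Follow links outward from a node to estimate what a change to it could touch."""
    seen = seen if seen is not None else set()
    for source, _, target in links:
        if source == node and target not in seen:
            seen.add(target)
            reachable_from(target, links, seen)
    return seen

print(reachable_from("Approve Loan", links))   # {'Credit Decision', 'Loan System'} (order may vary)
```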

    He gave us a demo of the Graph in action, starting with a BPMN model that uses the Sparx EA accelerator for SOA architecture artifacts, and tracing through that loose coupling to the architectural components in the EA framework, with similar linkages for roles from a Casewise business architecture framework and definitions of contracts from the Financial Industry Business Ontology (FIBO). The idea is that the Graph provides an underlying mesh of semantic linkages from elements in a model to other frameworks, ontologies and models while still retaining business understandability at the model level. In the Insight Analyzer, we saw how to explore linkages between different types of elements, such as RACI-type relationships between roles and activities, as well as a more detailed introspection that allows drilling down on any node to see what other nodes and models it is linked to, and traversing those links.

    Interesting ideas about how to bring together all of the architecture, process, case and decision models and frameworks into a single graph for analysis of your digital enterprise.

    Fannie Mae Case Study on Effective Process Modeling at BPMCM15

    Amit Mayabhate from Fannie Mae (a US government-sponsored mortgage lender that buys mortgages from the banks and packages them for sale as securities) gave a session at the BPM and Case Management Summit on outcome-based process modeling for delivering business value. He had a few glitches getting started — apparently Fannie Mae doesn’t allow employees to download a presentation to their laptop, so he had to struggle through getting connected to the conference wifi and then the Fannie Mae VPN to open a PDF of his presentation — but did tell the best joke of the day when he was restarting his computer in front of us and said “now you know my password…it’s 8 dots in a row”.

    Back on track, he discussed their business architecture efforts and how process modeling fits into it. Specifically, he talked about their multifamily housing division, which had its own outdated and inflexible technology platform that they wanted to change out for a simpler infrastructure that would give them better access to information for better decision-making. To get there, they decided to start with the best possible outcome in mind, but first had to have the organization understand not only that they had problems, but some quantification of how big those problems were in order to set those future goals. They identified several key metrics where they could compare today’s measurements with their desired future goals, such as operational efficiency (manual versus automated) and severability. To map from the current to future state, they needed a transformation roadmap and a framework for achieving the steps on the roadmap; this included mapping their journey to greater levels of process maturity, and creating a business capability model that included 17 capabilities, 65 functions, 262 sub-functions, and around 300 process flows.

    Their business architecture transformation framework started with the business model (how do we make money), the operating model (how do we behave to make money) and the business capability model (what abilities are needed) using the Business Model Canvas framework. They used other architecture analysis tools, such as analyzing their operating model by plotting business process standardization against business process integration both for their current state and desired future state, to help them develop the strategy for moving between them. They used Mega’s business strategy module for most of the architecture analysis, which helps them identify business processes that are ripe for automation, then move to a BPMS for process modeling and automation. In that way, they can do just the process modeling that provides them with some architectural change that they know will provide value, rather than attempting to boil the ocean by modeling all processes in the organization.

    Going Beyond Process Modeling, Part 1

    I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is known for its free downloadable process modeler, but also has a full-featured BPMS for process execution.

    My papers are not at all specific to Bizagi products; the first one, which you can find here (registration required), outlines the business benefits of automating and managing processes, and presents some use cases. In my experience, almost every organization models their processes in some way, but most never move beyond process analysis to process management. This paper will provide some information that can help build a business case to do just that.

    The second paper will be released in a few weeks, covering a more technical view of exactly how you go about starting on process automation projects, and moving from an initial project to a broader program or center of excellence.

    We’re also scheduling a webinar to expand on the concepts in the paper; I’ll post the date when that’s available.

    If you want to learn more about how Bizagi stacks up in the BPMS marketplace, check out the report on Bizagi from the Fraunhofer Institute for Experimental Software Engineering, available in both English and German. Spoiler alert: relative to the participating vendors, Bizagi scored above average in six of the nine categories, with the remaining three around average. This is a more rigorous academic view than you might find in a typical analyst report on a vendor, including test scenarios and scripts for workshops where they created and ran sample process applications. Fraunhofer sells a book with the complete market analysis of all vendors studied, although I could only find a German edition on their site.

    Software AG Analyst Day: The Enterprise Gets Digital

    After the DST Advance conference in Phoenix two weeks ago, I headed north for a few days vacation at the Grand Canyon. Yes, there was snow, but it was lovely:

    Grand Canyon

    Back at work, I spent a day last week in Boston for the first-ever North American Software AG analyst event, attended by a collection of industry and financial analysts. It was a long-ish half day followed by lunch and opportunities for one-on-one meetings with executives: worth the short trip, especially considering that I managed to fly in and out between the snow storms that have been plaguing Boston this year. I didn’t live-blog this since there was a lot of material spread over the day, so had a chance to see some of the other analysts’ coverage published after the event, such as this summary from Peter Krensky of Aberdeen Group.

    The focus of the event was squarely on the digital enterprise, a trend that I’m seeing at many other vendors but not so many customers yet. Software AG’s CEO, Karl-Heinz Streibich, kicked off the day talking about how everywhere you turn, you hear about the digital enterprise: not just using digital technology, but having enough real-time data and devices integrated into our work and lives that they can be said to be truly digital. Streibich feels that companies with a basis in integration middleware – like Software AG with webMethods and other products – are in a good position to enable digital enterprises by integrating data, devices and systems of all types.

    Although Software AG is not a household consumer name, its software is in 70% of the Fortune 1000, with a community of over 2M developers; it’s fair to say that you will likely interact with a company that uses Software AG products at least once per day: banks, airports and airlines, manufacturing, telecommunications, energy and more. Their revenues are split fairly evenly between Europe and the Americas, with a small amount in Asia Pacific. License revenues are 32% of the total, with maintenance and consulting splitting the remainder; this relatively low proportion of license revenue is an indicator of a mature software company, and not unexpected from a company more than 40 years old. I found a different representation of their revenues more interesting: they had 66% of their business in the “digital business” segment in 2014, expected to climb to 75% this year, which includes their portfolio minus the legacy ADABAS/NATURAL mainframe development tools. Impressive, considering that it was about a 50:50 split in 2010. Part of this increase is likely due to their several acquisitions over that period, but also because they are repositioning their portfolio as the Digital Business Platform, a necessary shift towards the systems of engagement where more of the customer spend is happening. Based on the marketecture diagram, this platform forms a cut-out layer between back office core operational systems and front office customer engagement systems. Middleware, by any other name; but according to Streibich, more business logic is moving to the middleware layer, although this is what middleware vendors have been telling us for decades.

    There are definitely a lot of capable products in the portfolio that form this “development platform for digital business” – webMethods (integration and BPM), ARIS (BPA), Terracotta (in memory big data), Longjump (application PaaS), Metaquark (mobility), Alfabet, Apama, JackBe and more – but the key will be to see how well they can make them all work together to be a true platform rather than just a collection of Software AG-branded tools.

    We had an in-depth presentation on their Digital Business Platform from Wolfram Jost, Software AG’s CTO; you can read the long version on their site, so I’ll just hit the high points. He started with some industry quotes, such as “every company will become a software company”, and one analyst firm’s laughable brainstorm for 2014, “Big Change”, but moved on to define digital business as having the following characteristics:

    • Blurring the digital and physical world
    • More influence of customers (on business direction as well as external perceptions)
    • Combining people, business and physical things
    • Agility, speed, scale, responsiveness
    • “Supermaneuverable” business processes
    • Disrupting existing business models

    The problem with this shift in business models is that conventional business applications don’t support the way that the new breed of business applications are designed, developed, used and operated. Current applications and development techniques are still valuable, but are being pushed behind the scenes as core operational systems and packaged applications.

    Software AG’s Digital Business Platform, then, is based on the premise that few packaged applications are useful in the face of business transformation and the required agility. We need tools to create adaptive applications – built to change, not to last – especially in front office customer engagement applications, replacing or augmenting packaged CRM and other applications. This is not fundamentally different from the message about any agile/adaptive/mashup/model-driven application development environment over the past few years, including BPMS; it’s interesting to see how a large vendor such as Software AG positions their entire portfolio around that message. In fact, one of their slides refers to the adaptive application platform as iBPMS, since the definition of iBPMS has expanded to include everything related to model-driven application development.

    The core capabilities of their platform include intelligent business operations (webMethods Operational Intelligence, Apama Streaming Analytics); agile processes (webMethods BPM and AgileApps); integration (webMethods Integration and API Management); in-memory data fabric (Terracotta); and business and IT transformation (ARIS BPA and GRC, Alfabet IT Portfolio Management and EA Management). In a detailed slide overlaying their products, they also added a transaction processing capability to allow the inclusion of ADABAS-NATURAL, as well as the cloud offerings that they’ve released over the past year.

    Jost dug further into definitions of business application layers and architectural requirements. They provide the structure and linkages for event routing and event persistence frameworks, using relatively loose event-based coupling between their own products to allow them to be deployed selectively, but also (I imagine) to reduce the amount of refactoring of the products that would be required for tighter coupling. Their cloud IoT offering plays an interesting role by ingesting events from smart devices – developed via co-innovation with device companies such as Bosch and Siemens – for integration with on-premise business applications.
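The loose event-based coupling described here is essentially a publish/subscribe pattern; a bare-bones sketch of that general pattern (my own illustration with hypothetical topics and handlers, not Software AG’s actual event routing or persistence framework):

```python
# Minimal publish/subscribe sketch of loose, event-based coupling between components.
# Hypothetical topics and handlers; not Software AG's event framework.
from collections import defaultdict

subscribers = defaultdict(list)
event_store = []   # naive "event persistence": keep every published event

def subscribe(topic, handler):
    subscribers[topic].append(handler)

def publish(topic, payload):
    event_store.append((topic, payload))    # persist first
    for handler in subscribers[topic]:      # then route to whichever components care
        handler(payload)

# Components share only topic names, not direct references to each other.
subscribe("device.temperature", lambda event: print("analytics got", event))
subscribe("device.temperature", lambda event: print("process engine got", event))

publish("device.temperature", {"device": "sensor-42", "value": 87.5})
```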

    We then heard two shorter presentations, each followed by a panel. First was Eric Duffaut, the Chief Customer Officer, presenting their go-to-market strategy then moderating a panel with two partners, Audi Lucas of Wipro and Chris Brinton of Mosaic Data Science. Their GTM plan was fairly standard for a large enterprise software vendor, although they are improving effectiveness by having a single marketing team across all products as well as improving the sales productivity processes. Their partners are critical for scalability in this plan, and provide the necessary industry experience and solutions; both of the partner panelists talked about co-innovation with Software AG, rather than just providing resources trained on the products.

    The second presentation and panel was led by John Bates, CMO and head of industry solutions; he was joined by a customer panel including Bryan Zigler of Boeing, Mark DuBrock of Standard & Poor’s, and Greg James of Outerwall. Bates discussed the role of industry solutions and solution accelerators, built by Software AG and/or partners, that provide a pre-built, customizable and adaptive application for fast deployment. They’re not using the Smart Process Application terminology that other vendors adopted from the Forrester trend from a couple of years ago, but it’s a very similar concept, and Bates announced the solution marketplace that they are launching to allow these to be easily discovered and purchased by customers.

    My issue with solution accelerators and industry solutions in general is that many of these solutions are tied to a specific version of the underlying technology, and are templates rather than frameworks in that you change the solution itself during implementation: upgrades to the platform may not be easily performed, and upgrades to the actual solution likely require re-customizing for each deployed instance. I didn’t get a chance to ask Bates how SAG helps partners and customers to create and deploy more upgradable solutions, e.g., recommended technology guardrails; this is a sticky problem that every technology vendor needs to deal with.

    Bates also discussed the patterns of digital disruption that can be seen in the marketplace, and how these are manifesting in three specific areas that they can help to address with their Digital Business Platform:

    • Connected customers, providing opportunities for location-based marketing and offers, automated concierge service, customer location tracking, demographic marketing
    • Internet of Things/Machine-to-Machine (IoT/M2M), with real-time monitoring and diagnostics, and predictive maintenance
    • Proactive risk and compliance, including proactive financial trade surveillance for unusual/rogue behavior

    After a wrapup by Streibich, we received copies of his latest book, The Digital Enterprise, plus Thingalytics by Bates; ironically, these were paper rather than digital copies.

    Disclosure: Software AG paid my airfare and hotel to attend this event, plus gave me a nice lunch and two books, but did not otherwise compensate me for my time nor for anything that I have written here.

    This week, I’m in Las Vegas for Kofax Transform, although just as an attendee this year rather than a speaker; expect to see a few notes from here over the two days of the conference.

    Webinar On Collaborative Business Process Analysis In The Cloud

    I’m giving a webinar on Wednesday, June 18 (11am Eastern) on social cloud-based BPA, sponsored by Software AG – you can register here to watch it live. I’ve written a white paper going into this theme in more detail, which will be available from Software AG after the webinar. They will also be presenting a bit on the webinar about their Process Live cloud-based BPA service, which is their full-featured ARIS process analysis toolset running in the cloud, with some additional collaboration features.

    bpmNEXT 2014 Thursday Session 2: Decisions And Flexibility

    In the second half of the morning, we started with James Taylor of Decision Management Solutions showing how to use decision modeling for simpler, smarter, more agile processes. He showed what a process model looks like in the absence of externalized decisions and rules: it’s a mess of gateways and branches that basically creates a decision tree in BPMN. A cleaner solution is to externalize the decisions so that they are called as a business rules activity from the process model, but the usual challenge is that the decision logic is opaque from the viewpoint of the process modeler. James demonstrated how the DecisionsFirst modeler can be used to model decisions using the Decision Model and Notation standard, then link a read-only view of that to a process model (which he created in Signavio) so that the process modeler can see the logic behind the decision as if it were a callable subprocess. He stepped through the notation within a decision called from a loan origination process, then took us into the full DecisionsFirst modeler to add another decision to the diagram. The interesting thing about decision modeling, which is exploited in the tool, is that it is based on firmer notions of reusability of data sources, decisions and other objects than we see in process models: although reusability can definitely exist in process models, the modeling tools often don’t support it well. DecisionsFirst isn’t a rules/decision engine itself: it’s a modeling environment where decisions are assembled from the rules and decisions in other environments, including external engines, spreadsheet-based decision tables, or knowledge sources describing the decision. It also allows linking to the processes from which it is invoked, objectives and organizational context; since this is a collaborative authoring environment, it can also include comments from other designers.
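To make the contrast concrete, here is a rough Python sketch of the general idea (my own illustration, not DMN syntax or the DecisionsFirst tool): the rules live in a decision table that the process invokes as a single decision step, rather than being spread across a tree of gateways in the process model.

```python
# Sketch of an externalized decision: the process calls one decision activity,
# and the rules live in a table that can change without touching the process model.
# Illustrative only -- not DMN or DecisionsFirst syntax; names are hypothetical.

# Each rule: (condition on the applicant, resulting outcome); first match wins.
loan_decision_table = [
    (lambda a: a["credit_score"] < 580,                      "decline"),
    (lambda a: a["credit_score"] >= 720 and a["dti"] < 0.35, "approve"),
    (lambda a: True,                                         "refer to underwriter"),  # default rule
]

def decide(applicant, table):
    """Evaluate the decision table against one applicant."""
    for condition, outcome in table:
        if condition(applicant):
            return outcome

def loan_origination_process(applicant):
    # ...capture application, verify identity...
    outcome = decide(applicant, loan_decision_table)   # one decision step, not a gateway tree
    # ...route the case based on the outcome...
    return outcome

print(loan_origination_process({"credit_score": 740, "dti": 0.28}))   # approve
```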

    François Chevresson-Aubain and Aurélien Pupier of Bonitasoft were up next to show how to build flexibility into deployed processes through a few simple but powerful features. First, adding collaboration tasks at runtime, so that a user at a pre-defined step who needs to involve other users can do so even if collaboration wasn’t built into the model at that point. Second, process model parameters can be changed (by an administrator) at runtime, which will impact all running processes based on that model: the situation demonstrated was to change an external service connector when the service call failed, then replay the tasks that failed on that service call. Both of these features are intended to address dynamic environments where the situation at runtime may be different from that at design time, and how to adjust both manual and automated tasks to accommodate those differences.
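A rough sketch of how that second pattern can work in general (my own illustration, not Bonitasoft’s actual API): if running instances read connector settings from the shared process definition at execution time rather than keeping a private copy, an administrator can repoint the connector and replay the failed tasks.

```python
# Sketch: process parameters held on the shared definition, so changing them at
# runtime affects every running instance. Hypothetical classes, not Bonitasoft's API.

class ProcessDefinition:
    def __init__(self, parameters):
        self.parameters = parameters              # shared, mutable at runtime

class ProcessInstance:
    def __init__(self, definition):
        self.definition = definition
        self.failed_tasks = []

    def call_service(self, task):
        url = self.definition.parameters["service_url"]   # looked up at execution time
        try:
            invoke(url, task)                              # stand-in for a service connector call
        except ConnectionError:
            self.failed_tasks.append(task)

    def replay_failed(self):
        tasks, self.failed_tasks = self.failed_tasks, []
        for task in tasks:
            self.call_service(task)

def invoke(url, task):
    if "old-endpoint" in url:
        raise ConnectionError(f"{url} unreachable")
    print(f"called {url} for {task}")

definition = ProcessDefinition({"service_url": "https://old-endpoint.example/api"})
instance = ProcessInstance(definition)
instance.call_service("check credit")                                      # fails and is recorded

definition.parameters["service_url"] = "https://new-endpoint.example/api"  # admin repoints the connector
instance.replay_failed()                                                   # replay now succeeds
```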

    We finished the morning with Robert Shapiro of Process Analytica on improving resource utilization and productivity using his Optima workbench. Optima is a tool for a serious analyst – likely with some amount of statistical or data science background – to import a process model and runtime data, set optimization parameters (e.g., reduce resource idleness without unduly impacting cycle time), simulate the process, analyze the results, and determine how to best allocate resources in order to optimize relative to the parameters. Although a complex environment, it provides a lot of visualization of the analytics and optimization; Robert actually encourages “eyeballing” the charts and playing around with parameters to fine-tune the process, although he has a great deal more experience at that than the average user. There are a number of analytical tools that can be applied to the data, such as critical path modeling, and financial parameters to optimize revenues and costs. It can also do quite a bit of process mining based on event log inputs in XES format, including deriving a BPMN process model and data correlation based on the event logs; this type of detailed offline analysis could be applied with the data captured and visualized through an intelligent business operations dashboard for advanced process optimization.
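To give a flavor of what deriving a model from an event log involves, here is a generic sketch of the usual first step in process mining, building a directly-follows graph; the data is made up, and this is not specific to Optima or a real XES parser.

```python
# Sketch of the usual first step in process mining: count which activities
# directly follow which within each case of an event log. Hypothetical data.
from collections import defaultdict

# Each event: (case id, activity, timestamp)
event_log = [
    ("case-1", "Receive Order", 1), ("case-1", "Check Credit", 2), ("case-1", "Ship", 3),
    ("case-2", "Receive Order", 1), ("case-2", "Check Credit", 2),
    ("case-2", "Request Info", 3),  ("case-2", "Check Credit", 4), ("case-2", "Ship", 5),
]

def directly_follows(log):
    """Count how often activity B directly follows activity A within the same case."""
    by_case = defaultdict(list)
    for case, activity, timestamp in log:
        by_case[case].append((timestamp, activity))

    counts = defaultdict(int)
    for events in by_case.values():
        trace = [activity for _, activity in sorted(events)]
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

for (a, b), n in sorted(directly_follows(event_log).items()):
    print(f"{a} -> {b}: {n}")
# e.g. Receive Order -> Check Credit: 2, Check Credit -> Ship: 2, Check Credit -> Request Info: 1
```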

    We have one more short session after lunch, then best in show voting before bpmNEXT wraps up for another year.

    Webinar On Business-IT Alignment In Process Applications

    This afternoon, I’m giving a webinar (hosted by Software AG) on business-IT alignment when developing process-centric applications: you can sign up for it or see the replay here.

    Some interesting stuff on model-driven development and also why we usually need to use separate modeling tools when we’re building applications for complex core processes.

    We’re also developing a white paper on this topic, to be released in the next few weeks; I’ll post a link to that when it’s out.

    High-Value Solution Consulting At Amdocs With An ARIS-Based Solution Book

    Down to the last two breakout sessions at Innovation World, and we heard from Ophir Edrey of Amdocs, a company providing software for business support, with a focus on the communications, media and entertainment industries. They wanted to be able to leverage their own experience across multiple geographies, leading their customers towards a best practice-based implementation. To do this, they created a solution book that brings together best practices, methodologies, business processes and other information within an enterprise architecture to allow Amdocs consultants to work together with customers to collaborate on how that architecture needs to be modified to fit the customer’s specific needs.

    The advantage of this is that Amdocs doesn’t just offer a software solution, but an entire advisory service around the best practices related to the solution. The solution book is created in ARIS, including the process models, solution design, solution traceability, customer collaboration (which they are migrating to ARIS Connect, not Process Live), and review and approval management.

    He showed us a demo of the Amdocs Solution Book, specifically the business process framework. It contains four levels of decomposition, starting with a value chain of the entire operator landscape mapped onto the full set of process model families. Drilling through into a specific set of processes for, in this example, a mobile customer upgrading a handset, he showed the KPIs and the capabilities provided by their solution for that particular process; this starts the proof of Amdocs’ value to the customer as more than just a software provider. Drilling further into the specific process model, the Amdocs consultant can gather feedback from the customer on how it might need to be modified for their specific needs, with comments added directly on the models for others to see and respond to.

    They have had some pushback from customers on this – some people really just want a paper document – but generally have had very enthusiastic feedback and a strong demand to use the tool for projects. The result is faster, better, value-added implementations of their software solutions, giving them a competitive edge. Certainly an interesting model for the services arm of any complex enterprise software provider.

    Still More Conference Within A Conference: ARIS World

    The irrepressible Joerg Klueckmann, Director of Product Marketing for ARIS, hosted the ARIS World session, third in the sub-conferences that I’ve attended here at Innovation World.

    Georg Simon, SVP of Product Marketing, discussed some of the drivers for ARIS 9: involving occasional users in processes through social collaboration, shortening the learning curve with a redesigned UI, modernizing the look and feel of the UI with new colors and shapes, lowering the TCO with centralized user and license management, and speeding content retrieval with visual and ad hoc search capabilities. There are new role-specific UI perspectives, allowing users to decide what capabilities they want to see on their interface (based on what they have been allocated by an administrator). There’s a new flexible metamodel, allowing you to create new object types beyond what is provided in the standard metamodel.

    He also briefly mentioned Process Live, which moves this newly re-architected ARIS into the public cloud and went live yesterday, then discussed their plans to release a mobile ARIS framework that will expose some functionality on mobile devices: consuming, collaborating and designing on tablets, and approvals on smartphones as well.

    Their recent acquisition, Alfabet, is being integrated with ARIS so that its repository of IT systems can be synchronized with the ARIS process repository for a more complete enterprise architecture view. This allows for handoffs in the UI between activities in an ARIS process model and systems in an Alfabet object model, with direct navigation between them.

    Klueckmann gave us a demo of Process Live and how it provides social process improvement in the cloud. This is hardly a market first – cloud-based process discovery/modeling collaboration started with Lombardi Blueprint (now IBM’s Blueworks Live) around 2007 – but it is definitely significant that a leading BPA player like ARIS is moving into the cloud. They’re also offering a reasonable price point: about $140/month for designers, and less than $6/month for viewers, which you can buy directly on their site with a credit card – and there’s a one-month free trial available. Contrast this with Blueworks Live, where an editor is $50/month, a contributor (who can comment) is $10/month, and a viewer is $2/month (but has to be purchased in multiples of 1,000): Process Live is considerably more expensive for the designer, but likely offers much more functionality since it brings much of the ARIS capability to the cloud.

    Process Live offers three templates for creating new project databases, ranging from a simple one with four model types, to the full-on professional one with 74 model types. Process Live doesn’t provide the full functionality of ARIS 9: it lacks direct support from Software AG, instead relying on community support; it is missing a number of advanced modeling and analysis features; and it can’t be customized since it’s a multi-tenanted cloud service. You can check out some of their video tutorials for more information on how it works. Data is stored on the Amazon public cloud, which might offer challenges for those who don’t want to include the NSA as a collaborator.

    We heard from Fabian Erbach of Litmus Group, a large consulting organization using Process Live with their customers. For them, the cloud aspect is key since it reduces setup time by eliminating installation and providing pre-defined templates for initiating projects; furthermore, the social aspects promote engagement with business users, especially occasional ones. Since it’s accessible on mobile (although not officially supported), it is becoming ubiquitous rather than just a tool for BPM specialists. The price point and self-provisioning make it attractive for companies to try it out without having to go through a software purchasing cycle.

    ARIS World ended with a panel of three ARIS customers plus audience participation, mostly discussing future features that customers would like to have in ARIS as well as Process Live. This took on the feel of a user group meeting, which offered a great forum for feedback from actual users, although I missed a lot of the nuances since I’m not a regular ARIS user. Key topics included the redesigned ARIS 9 UI, and the distinction between ARIS and Process Live capabilities.