Category Archives: BPA

TIBCO Nimbus for regulatory compliance at Bank of Montreal

It’s the first afternoon of breakout sessions at TIBCO NOW 2016, and Alex Kurm from Bank of Montreal is presenting how the bank has used Nimbus for process documentation, to serve the goals of regulatory compliance and process transformation. They are one of the largest Nimbus users, and Kurm leads a team of process experts deploying Nimbus across the enterprise as part of their in-house process excellence strategy.

He provided a good overview of regulatory and compliance requirements: to quote his slide, you need to have “evidence of robust, documented standard processes to ensure compliance to risk and regulatory requirements” as a minimum. Overlaid on that, there’s an evolving set of consumer demands, moving from traditional in-person, telephone and ATM banking to web and mobile platforms. As a Canadian resident, I can attest that our banks haven’t been as responsive as desired to customer needs in the past; their focus is on operational risk and security.

BMO’s process centre of excellence maintains a knowledge hub of process best practices (including how to use Nimbus in their environment), leads and supports process-related projects, and heads up governance of all process efforts. They have about 16 people in the CoE, then process specialists out in business areas; they have even internalized the Nimbus training. Although there are a variety of tools being used for process models in the bank, they selected Nimbus because of its business-understandable notation, the ability to put all process content in one place, the built-in governance and control over the content (key for auditors to be able to review), and the direct link between process architecture and process maps.

They started on Nimbus 3 years ago with about 20 process authors working on a couple of opportunistic projects; this quickly ramped up to 300 authors by the next year, and they now have more than 500 authors (including business analysts and project managers as well as process specialists), although only about 160 are active in any given month since this work is often project-based. There are 1800 end users looking at Nimbus maps each month, with both the largest number of users and the highest number of distinct initiatives in the highly regulated area of capital markets. They organize their 20,000 Nimbus maps by core business capability, such as onboarding, then drill down into the business area; they’re looking at ways of improving that to allow for finding content by any search path. They’re also adding Spotfire to be able to interrogate the content to find non-compliant and high-risk maps for review by the CoE.

Their key use cases are:

  • Process documentation for use as a high-level procedural guide
  • A guide for compliance auditors to verify that specific checks and balances are being done
  • Requirements gathering prior to automation (they are also an ActiveMatrix BPM customer), and as ongoing documentation of the automated process

Nimbus is now a core part of their process transformation and risk mitigation strategies; interestingly, the only resistance came from other “process gurus” in the bank who had their own favorite modeling tools.

Good case study of the benefit of process documentation, even in the absence of process automation, in highly-regulated industries.

bpmNEXT 2016 demo session: Signavio and Princeton Blue

Second demo round, and the last for this first day of bpmNEXT 2016.

Process Intelligence – Sven Wagner-Boysen, Signavio

Signavio allows creating a BPMN model with definitions of KPIs for the process, such as backlog size and end-to-end cycle time. The demo today was their process intelligence application, which allows a process model to be uploaded as well as an activity log of historical process instance data from an operational system — either a BPMS or some other system such as an ERP or CRM system — in CSV format. Since the process model is already known (in theory), this doesn’t do process mining to derive the model, but rather aggregates the instance data and creates a dashboard that shows the problem areas relative to the KPIs defined in the process model. Drilling down into a particular problem area shows some aggregate statistics as well as the individual instance data. Hovering over an instance shows the trace overlaid on the defined process model, that is, what path that instance took as it executed. There’s an interesting feature to show instances that deviate from the process model, typically by skipping or repeating steps where there is no explicit path in the process model to allow that. This is similar in nature to what SAP demonstrated in the previous session, although it is using imported process log data rather than a direct connection to the history data. Given that Signavio can model DMN integrated with BPMN, future versions of this could include intelligence around decisions as well as processes; this is a first version with some limitations.
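
To make the approach concrete, here’s a rough sketch of the kind of analysis described above; this is my own illustration, not Signavio’s implementation, and the CSV column names, the modeled activity sequence and the KPI threshold are all assumptions:

# Illustrative sketch only -- not Signavio's implementation. Assumes a CSV
# activity log with columns case_id, activity, timestamp (ISO 8601), plus a
# hypothetical modeled "happy path" and cycle-time KPI.
import csv
from collections import defaultdict
from datetime import datetime, timedelta

MODELED_PATH = ["Receive Order", "Check Credit", "Approve", "Ship"]  # assumed
CYCLE_TIME_KPI = timedelta(days=3)                                   # assumed

def load_log(path):
    cases = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            cases[row["case_id"]].append(
                (datetime.fromisoformat(row["timestamp"]), row["activity"]))
    for events in cases.values():
        events.sort()  # order each trace by timestamp
    return cases

def analyze(cases):
    for case_id, events in cases.items():
        cycle_time = events[-1][0] - events[0][0]
        trace = [activity for _, activity in events]
        # A trace "deviates" if it does not follow the modeled path exactly,
        # e.g. skipped or repeated steps.
        deviates = trace != MODELED_PATH
        if cycle_time > CYCLE_TIME_KPI or deviates:
            print(f"{case_id}: cycle={cycle_time}, deviation={deviates}")

if __name__ == "__main__":
    analyze(load_log("activity_log.csv"))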

Leveraging Cognitive Computing and Decision Management to Deliver Actionable Customer Insight – Pramod Sachdeva, Princeton Blue

Sentiment analysis of unstructured social media data, creating a dashboard of escalations and activities integrated with internal customer data. Uses Watson for much of the analysis, IBM ODM to apply rules for escalation, and future enhancements may add IBM BPM to automatically spawn action/escalation processes. Includes a history of sentiment for the individual, tied to service requests that responded to social media activity. There are other social listening and sentiment analysis tools that have been around for a while, but they mostly just drive dashboards and visualizations; the goal here is to apply decisions about escalations, and trigger automated actions based on the results. Interesting work, but this was not a demo up to the standards of bpmNEXT: it was only static screenshots and some additional PowerPoint slides after the Ignite portion, effectively just an extended presentation.
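
As a rough illustration of the decision step described here (my own sketch, not Princeton Blue’s solution or actual Watson/ODM code), a sentiment score is run through simple escalation rules; the thresholds, field names and escalate action are all invented:

# Minimal sketch of sentiment-driven escalation rules; thresholds, field
# names and the escalate() action are illustrative assumptions standing in
# for Watson sentiment analysis and IBM ODM decision rules.
from dataclasses import dataclass

@dataclass
class SocialPost:
    customer_id: str
    sentiment: float      # -1.0 (very negative) .. 1.0 (very positive)
    mentions_churn: bool  # e.g. "switching banks", "cancel my account"
    open_service_requests: int

def should_escalate(post: SocialPost) -> bool:
    # Escalate strongly negative posts, or moderately negative posts from
    # customers who already have open service requests or threaten to leave.
    if post.sentiment <= -0.8:
        return True
    if post.sentiment < -0.3 and (post.mentions_churn or post.open_service_requests > 0):
        return True
    return False

def escalate(post: SocialPost) -> None:
    # In the demo's future state this would spawn a BPM process; here it
    # just records the action.
    print(f"Escalating customer {post.customer_id} (sentiment {post.sentiment:+.2f})")

if __name__ == "__main__":
    post = SocialPost("cust-42", sentiment=-0.6, mentions_churn=True, open_service_requests=1)
    if should_escalate(post):
        escalate(post)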

bpmNEXT 2016 demo session: 8020 and SAP

My panel done — which probably set some sort of record by including exactly half of all the female attendees at the conference — we’re on to the bpmNEXT demo session: each is 5 minutes of Ignite-style presentation, 20 minutes of demo, and 5 minutes for Q&A. For the demos, I’ll just try to capture some of the high points of each, and I highly recommend that you check out the video of the presentations when they are published after the conference.

Process Design & Automation for a New Economy – Ian Ramsay, 8020 BPM

A simplified, list-based process designer that defines a list of real-world business entities (e.g., application), a list of states unique to each entity (e.g., approved), lists of individuals and groups, and lists of stages and tasks associated with each stage. Each new process has a list of start events that happen when a process is instantiated, one or more tasks in the middle, then a list of end events that define when the process is done. Dragging from the lists of entities, states, groups, individuals, stages and tasks onto the process model creates the underlying flow and events, building a more comprehensive process model behind the scenes. This allows a business specialist to create a process model without understanding process modeling or even simple flowcharting, just by identifying the relationships between the different states of a business entity, the stages of a business process, and the people involved. Removing an entity from a process modifies the model to remove that entity while keeping the model syntactically correct. Interesting alternative to BPMN-style process modeling, from someone who helped create the BPMN standard, where the process model is a byproduct of entity-state modeling.
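
As a rough sketch of the idea (mine, not 8020’s product), the modeler captures only lists describing an entity, its states, the stages, tasks and participants, and a simple flow is derived from the stage order rather than drawn explicitly; all of the names below are invented:

# Sketch of list-based process definition: the "model" is just lists of
# entities, states, stages and participants, and the flow is derived from
# the stage order rather than drawn explicitly. All names are invented.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    tasks: list[str]
    performed_by: str

@dataclass
class ListBasedProcess:
    entity: str                      # real-world business entity, e.g. "Application"
    states: list[str]                # states unique to that entity
    stages: list[Stage] = field(default_factory=list)

    def derived_flow(self) -> list[tuple[str, str]]:
        """Derive sequence-flow edges from the order of the stages."""
        names = ["Start"] + [s.name for s in self.stages] + ["End"]
        return list(zip(names, names[1:]))

process = ListBasedProcess(
    entity="Application",
    states=["Received", "Under Review", "Approved", "Declined"],
    stages=[
        Stage("Intake", ["Capture applicant data"], performed_by="Front Office"),
        Stage("Review", ["Check credit", "Verify documents"], performed_by="Underwriting"),
        Stage("Decision", ["Approve or decline"], performed_by="Credit Manager"),
    ],
)
print(process.derived_flow())
# [('Start', 'Intake'), ('Intake', 'Review'), ('Review', 'Decision'), ('Decision', 'End')]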

Process Intelligence for the Digital Age: Combining Intelligent Insights with Process Mining – Tarun Kamal Khiani and Joachim Meyer, SAP, and Bastian Nominacher, Celonis

Combining SAP’s Operational Process Intelligence analytics and dashboard (which was shown in last year’s bpmNEXT as well as some other briefings that I’ve documented) with Celonis’ process mining. Drilling down on a trouble item from the OPInt dashboard, such as late completion of a specific process type, to determine the root cause of the problem; this includes actionable insights, that is, being able to trigger an operational activity to fix the problem. That allows a case-by-case problem resolution, but adding in the Celonis HANA-based process mining capability allows past process instance data to be mined and analyzed. Adjusting the view on the mined data allows outliers and exceptions to be identified, transforming the straight-through process model into a full model of the instance data. For root cause analysis, this involved filtering down to only processes that took longer than a specific number of days to complete, then manually identifying the portions of the model where lag times or certain activities may be causing the overly-long cycle time. Similar to other process mining tools, but nicely integrated with SAP S/4HANA processes via the in-memory HANA data mart: no export or preprocessing of the process instance history log, since the process mining is applied directly to the realtime data. This has the potential to be taken further by adding realtime recommendations based on the process mining data and some predictive modeling, although that’s just my opinion.
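
Here’s a rough sketch of the root-cause filtering step described above; it’s my own illustration rather than the SAP/Celonis implementation, and the sample data, field layout and five-day threshold are assumptions:

# Sketch of duration-based root-cause filtering over mined instance data.
# The sample events, field layout and the 5-day threshold are assumptions.
from collections import defaultdict
from datetime import datetime, timedelta

# (activity, completed_at) events per case, assumed already mined and sorted
events = {
    "case-1": [("Create Order", "2016-04-01T09:00"), ("Credit Check", "2016-04-04T17:00"),
               ("Ship", "2016-04-08T12:00")],
    "case-2": [("Create Order", "2016-04-01T10:00"), ("Credit Check", "2016-04-01T15:00"),
               ("Ship", "2016-04-02T11:00")],
}

THRESHOLD = timedelta(days=5)  # assumed KPI for "took too long"

def parse(ts):
    return datetime.fromisoformat(ts)

lag_per_activity = defaultdict(list)
for case_id, trace in events.items():
    times = [parse(ts) for _, ts in trace]
    if times[-1] - times[0] <= THRESHOLD:
        continue  # keep only the overly long cases
    for (_, prev_ts), (act, ts) in zip(trace, trace[1:]):
        lag_per_activity[act].append(parse(ts) - parse(prev_ts))

# Rank activities by the average lag they introduce in the slow cases.
for act, lags in sorted(lag_per_activity.items(),
                        key=lambda kv: sum(kv[1], timedelta()) / len(kv[1]),
                        reverse=True):
    print(act, sum(lags, timedelta()) / len(lags))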

Good start to the demos with some new ideas on modeling and realtime process mining.

Positioning Business Modeling panel at bpmNEXT

We had a panel of Clay Richardson of Forrester, Kramer Reeves of Sapiens and Denis Gagne of Trisotech, moderated by Bruce Silver, discussing the current state of business modeling in the face of digital transformation, where we need to consider modeling processes, cases, content, decisions, data and events in an integrated fashion rather than as separate activities. The emergence of the CMMN and DMN standards, joining BPMN, is driving the emergence of modeling platforms that not only include all three of these, but provide seamless integration between them in the modeling environment: a decision task in a BPMN or CMMN model links directly to the DMN model that represents that decision; a predefined process snippet in a CMMN model links directly to the BPMN model, and an ad hoc task in a BPMN model links directly to the CMMN model. The resulting models may be translated to (or even created in) a low-code executable environment, or may be purely for the purposes of understanding and optimizing the business.

Some of the points covered on the panel:

  • The people creating these models are often in a business architecture role if they are being created top down, although bottom-up modeling is often done by business analysts embedded within business areas. There is a large increase in interest in modeling within architecture groups.
  • One of the challenges is how to justify the time required to create these models. A potential positioning is that business models are essential to capturing knowledge and understanding the business even if they are not directly executable, and as organizations’ use of modeling matures and gains visibility with executives, it will be easier to justify without having to show an immediate tangible ROI. Executable models are easier to justify since they are an integrated part of an application development lifecycle.
  • Models may be non-executable because they model across multiple implementation systems, or are used to model activities in systems that do not have modeling capabilities, such as many ERP, CRM and other core operational systems, or are at higher levels of abstraction. These models have strategic value in understanding complexity and interrelationships.
  • Models may be initiated using a model derived from process/data mining to reduce the time required to get started.
  • Modeling vendors aren’t competing against each other, they’re competing against old methods of text-based business requirements.
  • Many models are persistent, not created just for a specific point in time and discarded after use.

A panel including two vendors and an analyst made for some lively conversation, and not a small amount of finger-pointing. 🙂

HP Consulting’s Standards-Driven Requirements Method at BPMCM15

Tim Price from HP’s enterprise transformation consulting group presented in the last slot of day 2 of the BPM and case management summit (and what will be my last session, since I’m not staying for the workshops tomorrow) with a discussion on how to improve requirements management by applying standards. There are a lot of potential problems with requirements: inconsistency, not meeting the actual needs, not designed for change, and especially the short-term thinking of treating requirements as project rather than architecture assets. Price is pretty up-front about how you can’t take a “garden variety” business analyst and have them create BPMN diagrams without training, and that 50% of business analysts are unable to create lasting and valuable requirements.

Although I haven’t done any quantitative studies on this, I would tend to agree that the term “business analyst” covers a wide variety of skill levels, and you can’t just assume that anyone with that title can create reusable requirements models and assets. This becomes especially important when you move past written requirements — that need the written language skills that many BAs do have — to event-driven BPMN and other models; the main issue is that these models are actually code, albeit visual code, that may be beyond the technical analysis capabilities of most BAs.

Getting back to Price’s presentation, he established traceability as key to requirements: between BPMN or UML process models and UML use cases, for example; or upwards from processes to capabilities. Data needs to be modeled at the same time as processes, and processes should be modeled as soon as the high-level use case is defined. You can’t always create a one-to-one relationship between different types of elements: an atomic BPMN activity may translate to a use case (system or human), or to more than one use case, or to only a portion of a use case; lanes and pools may translate to use case actors, but not necessarily; events may represent states and implied state transitions, although also not necessarily. Use prose for descriptions, but not for control flow: that’s what you use process models for, with the prose just explaining the process model. Develop the use case and process models first, then write text to explain whatever is not obvious in the diagrams.
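
As a simple illustration of the traceability idea (my own sketch, not HP’s method or tooling), each BPMN activity is mapped to zero or more use cases and then checked for coverage gaps; the element names and mapping structure are invented:

# Sketch of a requirements traceability check between BPMN activities and
# use cases; the element names and mapping structure are invented for
# illustration, not taken from HP's method.
bpmn_activities = ["Register Claim", "Assess Claim", "Approve Payment", "Notify Claimant"]

# Each activity may trace to one use case, several, or (a gap) none at all.
activity_to_use_cases = {
    "Register Claim": ["UC-01 Capture Claim"],
    "Assess Claim": ["UC-02 Evaluate Claim", "UC-03 Request Additional Documents"],
    "Approve Payment": ["UC-04 Authorize Payment"],
    # "Notify Claimant" intentionally unmapped, to show the gap report
}

def trace_report(activities, mapping):
    for activity in activities:
        targets = mapping.get(activity, [])
        status = ", ".join(targets) if targets else "NO TRACEABILITY -- review required"
        print(f"{activity:20s} -> {status}")

trace_report(bpmn_activities, activity_to_use_cases)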

He walked through a case study of a government welfare and benefits organization that went through multiple failed implementations, which were traced back to poor requirements: structural problems, consistency issues, and designs embedded in the specification. Price and his team spent 12 months getting their analysts back on track by establishing standards for creating requirements — with a few of the analysts not making the transition — that led to CMMI recognition of their new techniques. Another case study applied BPMN process models and UML use cases for a code modernization process: basically, their SDLC was the process being improved. A third case study used BPMN to document as-is and to-be processes, then use case models with complete traceability from the to-be processes to the use cases, with UML domain class models being developed in parallel.

The lessons learned from HP’s experiences:

  • Apply existing standards consistently, including BPMN, CMMN, DMN, UML
  • Use graph-structured languages for structure and logic, and prose for description
  • Use repository-based modeling tools to allow for reusability and collaboration
  • Be concise, be precise, be consistent
  • Create requirements models that are architecture assets, not just short-term project assets

Some good lessons for requirements analysis; although this was developed for complex, more waterfall-y SDLCs, some of these can definitely be adapted for more agile implementations.

The Digital Enterprise Graph with @denisgagne at BPMCM15

Yesterday, Denis Gagné demonstrated the modeling tools in the Trisotech Digital Enterprise Suite, and today he showed us the Digital Enterprise Graph, the semantic layer that underlies the modeling tools and allows for analysis of relationships between them. There are many stakeholders involved in defining and implementing a digital enterprise, including enterprise architects, business architects and process analysts; each of these roles has a different view on transformation of the enterprise and different goals for their work. He sees a need for a more emergent enterprise architecture rather than a structured top-down architecture effort: certainly, architects need to create the basic structure, but rather than trying to build out every artifact that might exist in the architecture before making use of it, a more pragmatic approach is a “just-in-time” architecture that is a bit more self-organizing.

A graph, in general, is a powerful but simple construct: it consists only of nodes and links, but can provide meaningful connections of loosely-coupled business entities that can be easily modified. Think about a social graph, such as Facebook’s social graph: it’s just people and their connections, but it’s a rich context for analyzing the relationships between nodes (people) in the graph depending on the nature of the links (friends, likes, etc.) between them. Trisotech’s Digital Enterprise Graph links the who, what, when, where, why and how of an organization by mapping every model that is added to the Graph onto those types of nodes and links, whether the model originates with one of their own modelers (BPMN, CMMN, DMN) or an external EA modeling tool (Casewise, SAP PowerDesigner, Sparx). This provides an intelligent fabric for automated reasoning about the current relationships between parts of the organization, but also allows estimation of the impact of changes in one area on other parts of the organization. Their Insight Analyzer tool allows you to introspect the graph, providing views such as interconnectivity between nodes as part of impact analysis, or tracing responsibility for a capability up through the organizational structure. The analysis isn’t automated, but provides visualization tools for analysts and planners, based on a single integrated scheme that allows for straightforward queries.
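
To make the node-and-link idea concrete, here’s a minimal sketch of a typed graph with a rough impact-analysis traversal; this is my own illustration, not Trisotech’s Graph or its API, and the node and link names are invented:

# Minimal sketch of a typed enterprise graph with an impact-analysis
# traversal; node/link types and names are invented for illustration and
# are not Trisotech's actual Graph structure or API.
from collections import defaultdict, deque

class EnterpriseGraph:
    def __init__(self):
        self.links = defaultdict(list)   # node -> [(link_type, node), ...]

    def add_link(self, source, link_type, target):
        self.links[source].append((link_type, target))
        self.links[target].append((f"inverse of {link_type}", source))

    def impacted_by(self, node):
        """Everything reachable from a changed node, i.e. a rough impact set."""
        seen, queue = {node}, deque([node])
        while queue:
            for _, neighbour in self.links[queue.popleft()]:
                if neighbour not in seen:
                    seen.add(neighbour)
                    queue.append(neighbour)
        return seen - {node}

g = EnterpriseGraph()
g.add_link("Role: Underwriter", "performs", "Activity: Assess Application")
g.add_link("Activity: Assess Application", "part of", "Process: Client Onboarding")
g.add_link("Activity: Assess Application", "applies", "Decision: Credit Risk Rating")
g.add_link("Process: Client Onboarding", "realizes", "Capability: Onboarding")

# Which parts of the organization are touched if this decision changes?
print(g.impacted_by("Decision: Credit Risk Rating"))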

He gave us a demo of the Graph in action, starting with a BPMN model that uses the Sparx EA accelerator for SOA architecture artifacts, and tracing through that loose coupling to the architectural components in the EA framework, with similar linkages for roles from a Casewise business architecture framework and definitions of contracts from the Financial Industry Business Ontology (FIBO). The idea is that the Graph provides an underlying mesh of semantic linkages from elements in a model to other frameworks, ontologies and models while still retaining business understandability at the model level. In the Insight Analyzer, we saw how to explore linkages between different types of elements, such as RACI-type relationships between roles and activities, as well as a more detailed introspection that allows drilling down on any node to see what other nodes and models it is linked to, and traversing those links.

Interesting ideas about how to bring together all of the architecture, process, case and decision models and frameworks into a single graph for analysis of your digital enterprise.

Fannie Mae Case Study on Effective Process Modeling at BPMCM15

Amit Mayabhate from Fannie Mae (a US government-sponsored mortgage lender that buys mortgages from the banks and packages them for sale as securities) gave a session at the BPM and Case Management Summit on outcome-based process modeling for delivering business value. He had a few glitches getting started — apparently Fannie Mae doesn’t allow employees to download a presentation to their laptop, so he had to struggle through getting connected to the conference wifi and then the Fannie Mae VPN to open a PDF of his presentation — but did tell the best joke of the day when he was restarting his computer in front of us and said “now you know my password…it’s 8 dots in a row”.

Back on track, he discussed their business architecture efforts and how process modeling fits into it. Specifically, he talked about their multifamily housing division, which had its own outdated and inflexible technology platform that they wanted to change out for a simpler infrastructure that would give them better access to information for better decision-making. To get there, they decided to start with the best possible outcome in mind, but first had to have the organization understand not only that they had problems, but some quantification of how big those problems were in order to set those future goals. They identified several key metrics where they could compare today’s measurements with their desired future goals, such as operational efficiency (manual versus automated) and severability. To map from the current to future state, they needed a transformation roadmap and a framework for achieving the steps on the roadmap; this included mapping their journey to greater levels of process maturity, and creating a business capability model that included 17 capabilities, 65 functions, 262 sub-functions, and around 300 process flows.

Their business architecture transformation framework started with the business model (how do we make money), the operating model (how do we behave to make money) and the business capability model (what abilities are needed) using the Business Model Canvas framework. They used other architecture analysis tools, such as analyzing their operating model by plotting business process standardization against business process integration both for their current state and desired future state, to help them develop the strategy for moving between them. They used Mega’s business strategy module for most of the architecture analysis, which helps them identify business processes that are ripe for automation, then move to a BPMS for process modeling and automation. In that way, they can do just the process modeling that provides them with some architectural change that they know will provide value, rather than attempting to boil the ocean by modeling all processes in the organization.

Going Beyond Process Modeling, Part 1

I recently wrote two white papers for Bizagi on going beyond process modeling to process execution: Bizagi is known for their free downloadable process modeler, but also has a full-featured BPMS for process execution.

My papers are not at all specific to Bizagi products; the first one, which you can find here (registration required), outlines the business benefits of automating and managing processes, and presents some use cases. In my experience, almost every organization models their processes in some way, but most never move beyond process analysis to process management. This paper will provide some information that can help build a business case to do just that.

The second paper will be released in a few weeks, covering a more technical view of exactly how you go about starting on process automation projects, and moving from an initial project to a broader program or center of excellence.

We’re also scheduling a webinar to expand on the concepts in the paper; I’ll post the date when that’s available.

If you want to learn more about how Bizagi stacks up in the BPMS marketplace, check out the report on Bizagi from the Fraunhofer Institute for Experimental Software Engineering, available in both English and German. Spoiler alert: relative to the participating vendors, Bizagi scored above average in six of the nine categories, with the remainder around average. This is a more rigorous academic view than you might find in a typical analyst report on a vendor, including test scenarios and scripts for workshops where they created and ran sample process applications. Fraunhofer sells a book with the complete market analysis of all vendors studied, although I could only find a German edition on their site.

Software AG Analyst Day: The Enterprise Gets Digital

After the DST Advance conference in Phoenix two weeks ago, I headed north for a few days vacation at the Grand Canyon. Yes, there was snow, but it was lovely:

Grand Canyon

Back at work, I spent a day last week in Boston for the first-ever North American Software AG analyst event, attended by a collection of industry and financial analysts. It was a long-ish half day followed by lunch and opportunities for one-on-one meetings with executives: worth the short trip, especially considering that I managed to fly in and out between the snow storms that have been plaguing Boston this year. I didn’t live-blog this since there was a lot of material spread over the day, so had a chance to see some of the other analysts’ coverage published after the event, such as this summary from Peter Krensky of Aberdeen Group.

The focus of the event was squarely on the digital enterprise, a trend that I’m seeing at many other vendors but not so many customers yet. Software AG’s CEO, Karl-Heinz Streibich, kicked off the day talking about how everywhere you turn, you hear about the digital enterprise: not just using digital technology, but having enough real-time data and devices integrated into our work and lives that they can be said to be truly digital. Streibich feels that companies with a basis in integration middleware – like Software AG with webMethods and other products – are in a good position to enable digital enterprises by integrating data, devices and systems of all types.

Although Software AG is not a household consumer name, its software is in 70% of the Fortune 1000, with a community of over 2M developers; it’s fair to say that you will likely interact with a company that uses Software AG products at least once per day: banks, airports and airlines, manufacturing, telecommunications, energy and more. Their revenues are split fairly evenly between Europe and the Americas, with a small amount in Asia Pacific. License revenues are 32% of the total, with maintenance and consulting splitting the remainder; this relatively low proportion of license revenue is an indicator of a mature software company, and not unexpected from a company more than 40 years old. I found a different representation of their revenues more interesting: they had 66% of their business in the “digital business” segment in 2014, expected to climb to 75% this year, which includes their portfolio minus the legacy ADABAS/NATURAL mainframe development tools. Impressive, considering that it was about a 50:50 split in 2010. Part of this increase is likely due to their several acquisitions over that period, but also because they are repositioning their portfolio as the Digital Business Platform, a necessary shift towards the systems of engagement where more of the customer spend is happening. Based on the marketecture diagram, this platform forms a cut-out layer between back office core operational systems and front office customer engagement systems. Middleware, by any other name; but according to Streibich, more business logic is moving to the middleware layer, although this is what middleware vendors have been telling us for decades.

There’s definitely a lot of capable products in the portfolio that form this “development platform for digital business” – webMethods (integration and BPM), ARIS (BPA), Terracotta (in memory big data), Longjump (application PaaS), Metaquark (mobility), Alfabet, Apama, JackBe and more – but the key will be to see how well they can make them all work together to be a true platform rather than just a collection of Software AG-branded tools.

We had an in-depth presentation on their Digital Business Platform from Wolfram Jost, Software AG’s CTO; you can read the long version on their site, so I’ll just hit the high points. He started with some industry quotes, such as “every company will become a software company”, and one analyst firm’s laughable brainstorm for 2014, “Big Change”, but moved on to define digital business as having the following characteristics:

  • Blurring the digital and physical world
  • More influence of customers (on business direction as well as external perceptions)
  • Combining people, business and physical things
  • Agility, speed, scale, responsiveness
  • “Supermaneuverable” business processes
  • Disrupting existing business models

The problem with this shift in business models is that conventional business applications don’t support the way that the new breed of business applications are designed, developed, used and operated. Current applications and development techniques are still valuable, but are being pushed behind the scenes as core operational systems and packaged applications.

Software AG’s Digital Business Platform, then, is based on the premise that few packaged applications are useful in the face of business transformation and the required agility. We need tools to create adaptive applications – built to change, not to last – especially in front office customer engagement applications, replacing or augmenting packaged CRM and other applications. This is not fundamentally different from the message about any agile/adaptive/mashup/model-driven application development environment over the past few years, including BPMS; it’s interesting to see how a large vendor such as Software AG positions their entire portfolio around that message. In fact, one of their slides refers to the adaptive application platform as iBPMS, since the definition of iBPMS has expanded to include everything related to model-driven application development.

The core capabilities of their platform include intelligent business operations (webMethods Operational Intelligence, Apama Streaming Analytics); agile processes (webMethods BPM and AgileApps); integration (webMethods Integration and API Management); in-memory data fabric (Terracotta); and business and IT transformation (ARIS BPA and GRC, Alfabet IT Portfolio Management and EA Management). In a detailed slide overlaying their products, they also added a transaction processing capability to allow the inclusion of ADABAS-NATURAL, as well as the cloud offerings that they’ve released over the past year.

Jost dug further into definitions of business application layers and architectural requirements. They provide the structure and linkages for event routing and event persistence frameworks, using relatively loose event-based coupling between their own products to allow them to be deployed selectively, but also (I imagine) to reduce the amount of refactoring of the products that would be required for tighter coupling. Their cloud IoT offering plays an interesting role by ingesting events from smart devices – developed via co-innovation with device companies such as Bosch and Siemens – for integration with on-premise business applications.
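
As a generic illustration of that loose event-based coupling (my own sketch, not Software AG’s actual event routing framework or APIs), a simple publish/subscribe broker lets products react to each other’s events without being directly wired together; all names here are invented:

# Generic publish/subscribe sketch of loose event-based coupling between
# products; the broker, topic names and handlers are invented for
# illustration and are not Software AG's event framework.
from collections import defaultdict

class EventBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> [handler, ...]

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Products that are not deployed simply have no subscription;
        # publishers do not need to know who (if anyone) is listening.
        for handler in self.subscribers[topic]:
            handler(payload)

broker = EventBroker()
broker.subscribe("device.temperature", lambda e: print("analytics:", e))
broker.subscribe("device.temperature", lambda e: print("process trigger:", e))
broker.publish("device.temperature", {"device": "sensor-17", "value": 81.5})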

We then heard two shorter presentations, each followed by a panel. First was Eric Duffaut, the Chief Customer Officer, presenting their go-to-market strategy, then moderating a panel with two partners, Audi Lucas of Wipro and Chris Brinton of Mosaic Data Science. Their GTM plan was fairly standard for a large enterprise software vendor, although they are improving effectiveness by having a single marketing team across all products as well as improving the sales productivity processes. Their partners are critical for scalability in this plan, and provide the necessary industry experience and solutions; both of the partner panelists talked about co-innovation with Software AG, rather than just providing resources trained on the products.

The second presentation and panel was led by John Bates, CMO and head of industry solutions; he was joined by a customer panel including Bryan Zigler of Boeing, Mark DuBrock of Standard & Poor’s, and Greg James of Outerwall. Bates discussed the role of industry solutions and solution accelerators, built by Software AG and/or partners, that provide a pre-built, customizable and adaptive application for fast deployment. They’re not using the Smart Process Application terminology that other vendors adopted from the Forrester trend from a couple of years ago, but it’s a very similar concept, and Bates announced the solution marketplace that they are launching to allow these to be easily discovered and purchased by customers.

My issue with solution accelerators and industry solutions in general is that many of these solutions are tied to a specific version of the underlying technology, and are templates rather than frameworks in that you change the solution itself during implementation: upgrades to the platform may not be easily performed, and upgrades to the actual solution likely require re-customizing for each deployed instance. I didn’t get a chance to ask Bates how SAG helps partners and customers to create and deploy more upgradable solutions, e.g., recommended technology guardrails; this is a sticky problem that every technology vendor needs to deal with.

Bates also discussed the patterns of digital disruption that can be seen in the marketplace, and how these are manifesting in three specific areas that they can help to address with their Digital Business Platform:

  • Connected customers, providing opportunities for location-based marketing and offers, automated concierge service, customer location tracking, demographic marketing
  • Internet of Things/Machine-to-Machine (IoT/M2M), with real-time monitoring and diagnostics, and predictive maintenance
  • Proactive risk and compliance, including proactive financial trade surveillance for unusual/rogue behavior

After a wrapup by Streibich, we received copies of his latest book, The Digital Enterprise, plus Thingalytics by Bates; ironically, these were paper rather than digital copies. 😉

Disclosure: Software AG paid my airfare and hotel to attend this event, plus gave me a nice lunch and two books, but did not otherwise compensate me for my time nor for anything that I have written here.

This week, I’m in Las Vegas for Kofax Transform, although just as an attendee this year rather than a speaker; expect to see a few notes from here over the two days of the conference.

Webinar On Collaborative Business Process Analysis In The Cloud

I’m giving a webinar on Wednesday, June 18 (11am Eastern) on social cloud-based BPA, sponsored by Software AG – you can register here to watch it live. I’ve written a white paper going into this theme in more detail, which will be available from Software AG after the webinar. They will also spend a bit of the webinar presenting their Process Live cloud-based BPA service, which is their full-featured ARIS process analysis toolset running in the cloud, with some additional collaboration features.