bpmNEXT 2018: Complex Modeling with MID GmbH, Signavio and IYCON

The final session of the first day of bpmNEXT 2018 was focused on advanced modeling techniques.

Designing the Data-Driven Company, MID GmbH

Elmar Nathe of MID GmbH presented on their enterprise decision maps, which provide an aggregated visualization of strategic, tactical and operational decisions together with business events. They provide a variety of modeling tools, but see decisions as key to understanding how organizations are driven by data and events. It's clearly a rich decision modeling environment, including PMML support for bringing in predictive models and other output from data scientists' analysis tools, plus links to other model types, such as ERDs that can show which data contributes to which decision model, and business process models. Much more of an enterprise architecture approach to model-driven design, one that can incorporate the work of data scientists.

Using Customer Journeys to Connect Theory with Reality, Signavio

Till Reiter and Enrico Teterra of Signavio started with a great example of an Ignite presentation, with few words, lots of graphics and a bit of humor, discussing their new notation for modeling an outside-in view of the customer journey rather than just having an undifferentiated “customer” swimlane in a BPMN diagram. The demo walked through their customer journey mapping tool, and how their collaboration hub overlays on it to allow information about each component of the journey map to be discussed amongst process modeling users. The journey map contains a lot of information about KPIs and other process metrics in a form most consumable by process owners and modelers, but also has a notebook/dashboard view for analysts to identify problems with the process and potential resolution actions. This includes a variety of analysis tools, including process discovery, where process mining techniques are applied to determine which paths in the process model may be contributing to specific problems such as cycle time; the results are then overlaid on the process model to assist with root cause analysis. Although their product does a good job of combining customer journey maps, process models and process analysis, this was more of a walkthrough of a set of pre-calculated dashboard screens than an actual demo — a far cry from the experimental features that Gero Decker showed off in their demo at the first bpmNEXT.

Discovering the Organizational DNA, IYCON and Knowledge Consultants

The final presentation of this section was from Jude Chagas Pereira of IYCON and Frank Kowalkowski of Knowledge Consultants, presenting IYCON’s Afterspyre modeling tool for creating a catalog of complex business objects, their attributes and their linkages, in order to create organizational DNA diagrams. Ranking these with machine learning algorithms for semantic and sentiment analysis allows identification of process improvement opportunities. They have a number of standard business analysis techniques built in, and robust analytics focused on problem solving. The demo walked through their catalog, drilling down into the “Strategy DNA” section and the “Technology Solutions” subsection to show an enumeration of the platforms currently in place together with attributes such as technology risk and obsolescence, which can be used to rank technology upgrade plans. Relationships between business objects can be auto-detected based on existing data. Levels including Objectives, Key Processes, Technology Solutions, Database Technology and Datacenter, and their interrelationships, are mapped into a DNA diagram and an alluvial diagram, starting at any point in the catalog and drilling down a specific number of levels as selected by the modeling analyst. These diagrams can then be refined further based on factors such as scaling the individual markers based on actual performance. They showed sentiment analysis for a hotel ranked on a review site, which included extracting specific phrases that related to certain sentiments. They also demonstrated a two-model comparison, comparing the models of two different companies to determine the overlapping and unique processes: a good indicator of the level of difficulty of a merger/acquisition (or even a divestiture). They finished up with affinity modeling, of the type used by Amazon when it tells you which books were bought by other people who also bought the book you’re looking at: easy to do in matrix form with a small data set, but computationally intensive once you get into non-trivial amounts of data. Affinity modeling is most commonly used in marketing to analyze buying habits and offer people something that they are likely to buy, even if it’s something they didn’t plan to buy at first; this sort of “would you like fries with that” technique can increase purchase value by 30-40%. Related to that is correlation modeling, which can be used as a first step in determining causation. An impressive semantic data-driven analytics tool for modeling a lot of different organizational characteristics.
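
To make the affinity modeling point concrete, here's a minimal sketch of the matrix-style approach in Python, using made-up order data: it counts item-pair co-occurrences across orders and ranks "people who bought X also bought" recommendations. The quadratic growth of item pairs is exactly why this gets computationally intensive on non-trivial data.

```python
from collections import Counter
from itertools import combinations

# Hypothetical order history: each order is the set of items bought together.
orders = [
    {"book_a", "book_b"},
    {"book_a", "book_b", "book_c"},
    {"book_b", "book_c"},
    {"book_a", "book_d"},
]

# Count how often each pair of items appears in the same order.
# The number of pairs grows quadratically with basket/catalog size,
# which is why this is cheap on small data and expensive at scale.
pair_counts = Counter()
for order in orders:
    for pair in combinations(sorted(order), 2):
        pair_counts[pair] += 1

def recommend(item, top_n=3):
    """Items most often bought alongside `item`, ranked by co-occurrence."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if item == a:
            scores[b] += count
        elif item == b:
            scores[a] += count
    return scores.most_common(top_n)

print(recommend("book_a"))  # [('book_b', 2), ('book_c', 1), ('book_d', 1)]
```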

That’s it for day one; if everyone else is as overloaded with information as I am, we’re all ready for tonight’s wine tasting! Check the Twitter stream for opinions and photos from other attendees.

Strategy to execution – and back: it’s all about alignment

I recently wrote a paper sponsored by Software AG called Strategy To Execution – And Back, which you can find here (registration required). From the introduction:

When planning for business success, corporate management sets business strategy and specifies goals in terms of critical success factors and key performance indicators (KPIs). Although senior management is not concerned with the technical details of how business operations are implemented, they must have confidence that the operations are aligned with the strategy, and be able to monitor performance relative to the goals in real time.

In order to achieve operational alignment, there must be a clear path that maps strategy to execution: a direct link from the strategic goals in the high-level business model, through IT development and management practices, to the systems, activities and roles that make the business work. However, that’s only half the story: there must also be a path back from execution to strategy, allowing operational performance to be measured against the objectives in order to guide future strategy. Without both directions of traceability, there’s a disconnect between strategy and operations that can allow a business to drift off course without any indication until it’s far too late.

I cover how you need to have links from your corporate strategy through various levels of architecture to implementation, then be able to capture the operational metrics from running processes and roll those up relative to the corporate goals. If you don’t do that, then your operations could just be merrily going along their own path rather than working towards corporate objectives.

The Digital Enterprise Graph with @denisgagne at BPMCM15

Yesterday, Denis Gagné demonstrated the modeling tools in the Trisotech Digital Enterprise Suite, and today he showed us the Digital Enterprise Graph, the semantic layer that underlies the modeling tools and allows for analysis of the relationships between models. There are many stakeholders involved in defining and implementing a digital enterprise, including enterprise architects, business architects and process analysts; each of these roles has a different view on transformation of the enterprise and different goals for their work. He sees a need for a more emergent enterprise architecture rather than a structured top-down architecture effort: certainly, architects need to create the basic structure, but rather than trying to build out every artifact that might exist in the architecture before making use of it, a more pragmatic approach is a “just-in-time” architecture that is a bit more self-organizing.

A graph, in general, is a powerful but simple construct: it consists only of nodes and links, but can provide meaningful connections between loosely-coupled business entities that can be easily modified. Think about a social graph, such as Facebook’s: it’s just people and their connections, but it’s a rich context for analyzing the relationships between nodes (people) in the graph depending on the nature of the links (friends, likes, etc.) between them. Trisotech’s Digital Enterprise Graph links the who, what, when, where, why and how of an organization by mapping every model that is added to the Graph onto those types of nodes and links, whether the model originates with one of their own modelers (BPMN, CMMN, DMN) or an external EA modeling tool (Casewise, SAP PowerDesigner, Sparx). This provides an intelligent fabric for automated reasoning about the current relationships between parts of the organization, but also allows estimation of the impact of changes in one area on other parts of the organization. Their Insight Analyzer tool allows you to introspect the graph, providing views such as interconnectivity between nodes as part of impact analysis, or tracing responsibility for a capability up through the organizational structure. The analysis isn’t automated, but provides visualization tools for analysts and planners, based on a single integrated schema that allows for straightforward queries.
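
As a rough illustration of that node-and-link idea (my own sketch, not Trisotech's actual implementation), a typed graph can be expressed in a few lines of Python, with node types for the who/what/how and typed links between them:

```python
from collections import defaultdict

# A minimal typed graph: node types for who/what/how etc., typed links.
class EnterpriseGraph:
    def __init__(self):
        self.nodes = {}                 # name -> node type ("who", "what", "how", ...)
        self.links = defaultdict(list)  # source name -> [(link type, target name)]

    def add_node(self, name, node_type):
        self.nodes[name] = node_type

    def add_link(self, source, link_type, target):
        self.links[source].append((link_type, target))

    def neighbors(self, name, link_type=None):
        """Nodes linked from `name`, optionally filtered by link type."""
        return [t for lt, t in self.links[name] if link_type in (None, lt)]

# Models from different tools map onto the same node and link types.
g = EnterpriseGraph()
g.add_node("Claims Adjuster", "who")   # role, e.g. from a business architecture model
g.add_node("Assess Claim", "how")      # activity, e.g. from a BPMN model
g.add_node("Claim", "what")            # entity, e.g. from a data model
g.add_link("Claims Adjuster", "performs", "Assess Claim")
g.add_link("Assess Claim", "uses", "Claim")

print(g.neighbors("Claims Adjuster", "performs"))  # ['Assess Claim']
```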

He gave us a demo of the Graph in action, starting with a BPMN model that uses the Sparx EA accelerator for SOA architecture artifacts, and tracing through that loose coupling to the architectural components in the EA framework, with similar linkages for roles from a Casewise business architecture framework and definitions of contracts from the Financial Industry Business Ontology (FIBO). The idea is that the Graph provides an underlying mesh of semantic linkages from elements in a model to other frameworks, ontologies and models, while still retaining business understandability at the model level. In the Insight Analyzer, we saw how to explore linkages between different types of elements, such as RACI-type relationships between roles and activities, as well as a more detailed introspection that allows drilling down on any node to see which other nodes and models it is linked to, and traversing those links.

Interesting ideas about how to bring together all of the architecture, process, case and decision models and frameworks into a single graph for analysis of your digital enterprise.

London Calling To The Faraway Towns…For EACBPM

I missed the IRM Business Process Management Europe conference in London last June, but will be there this year from June 15-18 with a workshop, plus a breakout session and a panel session. It’s co-located with the Enterprise Architecture Europe conference, and attendees can sit in on sessions from either conference.

There are five conference tracks and 40 case studies over the three days of the conference, plus a day of pre-conference workshops. Here’s what I’m presenting:

  • On the morning of June 15, I’ll present a half-day workshop/tutorial on The Future of Work, looking at how work is changing in the face of evolving technology and culture, and how to adapt your organization for this brave new world.
  • On the morning of June 17, I’ll give a breakout session, Changing Incentives for Knowledge Workers, that excerpts some of the material from the workshop.
  • Also on the morning of June 17, I’ll be on a panel of “BPM Gurus” with Roger Burlton, Ron Ross and Howard Smith, moderated by Chris Potts, discussing ten years of BPM.

IRM runs a good conference with a lot of great content; I hope to see you there. If you plan to attend, I have a 10% discount code that I can provide to colleagues: send me a note or add a comment here, and I’ll send it to you.

bpmNEXT 2015 Day 2 Demos: Trisotech, Comindware, Bonitasoft

The first group of demos on bpmNEXT day 2 had a focus on the links between architecture and process: from architectural modeling, to executable architecture, to loosely-coupled development architecture for process applications.

Trisotech: Digital Enterprise Graph Semantic Layer for Business/IT divide

Denis Gagné kicked off talking about Trisotech’s Digital Enterprise Graph, which is a semantic layer for transforming and combining information and models, allowing information to be shared and enriched for use by both business and IT stakeholders. The issue with current standards is that they only allow for structured exchange of information between different parts of the business, but a graph structure allows information in widely varying formats to be distilled down to the who, what, when, where and why of the organization, allowing new relationships and interactions to be discovered and explored. Trisotech’s current modeling tools — Discovery Accelerator, BPMN Modeler and CMMN Modeler — can all contribute models to the Digital Enterprise Graph, but it can also accept models from a variety of other enterprise architecture and modeling tools. This brings together business architecture, enterprise architecture and case/process modeling outputs into a consolidated semantic graph, allowing each group to use their own models and terminology. Denis gave a demo of the Discovery Accelerator for capturing and discovering business information: a text description can be highlighted to mark the actors, activities and artifacts, iteratively building a conceptual model; a balanced scorecard, W5 or SIPOC board can be used as a starting template; or an accelerator can reference models from Casewise, APQC and others to provide a framework and ontology to begin discovery and modeling. RACI charts can be created from the actors, activities and goals. The resulting information can be exported into BPMN, CMMN, UML, XPDL or GO-BPMN for more detailed modeling in another tool. If an EA reference framework (such as Casewise or Sparx) was used in the Discovery Accelerator, semantic links are maintained from activities back to the original framework, even if the activities have been renamed and reorganized. He finished up with a demo of their new Insight Analyzer tool, which is used to explore information in the Digital Enterprise Graph; a node in the graph can be selected to see its origin as well as its interrelationships with other nodes that may have come from different modeling tools. New relationships can be inferred from the graph as more information is added, without having to make explicit links; for example, identifying risk points based on their level of interconnectivity with other activities.
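
That interconnectivity-based inference is easy to picture with a toy example. This is my own sketch with invented data and an assumed threshold, not Trisotech's algorithm: flag any node whose link degree crosses a cutoff, since a change there touches many other parts of the organization.

```python
from collections import Counter

# (source, link type, target) triples as they might accumulate in the graph
links = [
    ("Onboard Customer", "uses", "CRM"),
    ("Onboard Customer", "feeds", "Billing"),
    ("Onboard Customer", "performed_by", "Sales Rep"),
    ("Close Account", "uses", "CRM"),
]

# Count how many links touch each node, in either direction.
degree = Counter()
for source, _, target in links:
    degree[source] += 1
    degree[target] += 1

RISK_THRESHOLD = 3  # assumed cutoff, purely for illustration
risk_points = [node for node, d in degree.items() if d >= RISK_THRESHOLD]
print(risk_points)  # ['Onboard Customer'] -- highly interconnected, so high change impact
```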

[Update to change Trisotech “BPM Graph” to “Digital Enterprise Graph” to match Denis’ presentation materials and current product naming.]

Comindware: Between Architecture and Execution: Tale of 3 Gaps

Anatoly Belaychuk and Konstantin Bredyuk discussed three gaps between architecture and execution: the process model round-trip problem; the gap between process, project and case models; and the gap between process-based and object-based work. They see architecture and architectural maturity as important to an organization’s ability to model and execute processes. In their demo, they showed a different representation of processes by modeling capabilities, resources and inputs/outputs; this is not an execution sequence to replace BPMN, but rather an architectural view of how organizational capabilities link together, more like a value chain diagram with major milestones identified. Drilling down into a capability, we may see a submodel using the same model syntax, or it may link to a BPMN process. This is like a slice through enterprise architecture, with a variety of process-related model types linked into a business architecture capability model, but it also creates executable processes and cases, not just models. This “executable architecture” can be used by both architects and process modelers; it also includes data modeling to define record objects and attributes, and a forms modeler, providing a complete application development environment. This provides a link between architects — who are unlikely to learn or even care about BPMN — and executable process models, although there is no direct link to existing enterprise architecture products or models that would maintain the sort of semantic links we saw in the Trisotech demo earlier.

Bonitasoft: Building Sustainable Process-Based Apps

Miguel Valdés Faura finished this block of demos discussing process-based applications: how it’s still hard to create engaging user interfaces and easily-updated applications in spite of the low-code/no-code promises. He demoed some capabilities that are still in their labs, allowing for more agile applications by separating data, business logic and user interfaces. He started with a procurement application: BPMN process models for the business logic, data object models, and user interfaces are defined separately, interacting via JSON contracts and REST APIs. The contract between an activity in the process model and the user interface is defined as inputs and constraints; as long as the contract does not change, the UI can be changed with no impact on the process model. Mobile interfaces can be built independently of desktop interfaces, using the same contracts to interface with the business logic, and REST APIs for access to the data objects. Their page builder provides environments for different form factors, providing standard UI widgets plus allowing for custom widgets; a page can either be deployed directly in their environment, or the page definition can be exported for further hand-coding outside their environment. Page fragments can be created for reuse across pages. Custom pages built outside their environment, such as with AngularJS, can be imported by an administrator into the runtime environment and immediately deployed. Although a full process application can be built purely in their environment, by loosely coupling the logic, data and UI, they are able to make changes to any of those layers, including adding custom components and UIs, without impacting the others, as long as they respect the existing contract and APIs. A good example of why we use multi-tier architectures rather than tightly-coupled layers for greater flexibility and agility.
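
The contract idea is worth a small illustration. Here's a hedged sketch of what an inputs-plus-constraints contract might look like, with hypothetical field names and a toy validator standing in for the engine (this is not Bonita's actual contract format):

```python
# Hypothetical activity/UI contract: declared inputs and constraints.
# Any UI (desktop, mobile, custom AngularJS page) that satisfies the
# contract can be swapped in without touching the process model.
contract = {
    "inputs": {
        "supplier_id": str,
        "amount": float,
    },
    "constraints": [
        ("amount must be positive", lambda d: d["amount"] > 0),
    ],
}

def validate(payload, contract):
    """Check a UI-submitted payload (e.g. parsed JSON) against the contract."""
    for name, expected_type in contract["inputs"].items():
        if not isinstance(payload.get(name), expected_type):
            raise ValueError(f"input '{name}' missing or not {expected_type.__name__}")
    for description, rule in contract["constraints"]:
        if not rule(payload):
            raise ValueError(f"constraint failed: {description}")
    return True

print(validate({"supplier_id": "S-42", "amount": 150.0}, contract))  # True
```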

Australia Post at camunda Community Day

I am giving the keynote at camunda’s BPMcon conference tomorrow, and since I arrived in Berlin a couple of days early, camunda invited me to attend their community day today, which is the open source community meeting. Nice to see such a great turnout — 70 or 80 people? — and I definitely wasn’t the one who travelled the furthest to get here, since today’s opening presentation was from Rob Parker of Australia Post. Australia has a lot of the same issues as Canada when it comes to nation-wide services such as post, since we both have a lot of geography and not a lot of population: this means that a lot of services have to be delivered at a fiscal loss to sparsely-populated areas, with specific rules about what percentage of the population has to be within a certain distance of a postal outlet.

Post offices in particular are hard-hit by digital disruption; Australia Post has seen their letter delivery service decline by 1 billion articles (and the related revenue), even though the number of addresses to cover has increased. However, they have seen their parcel delivery business increase, even though this is a competitive business shared with courier companies. They’re also offering a number of other products, such as electronic bill payment, digital mail delivery and even passport interviews, which has driven them to create a more integrated multi-channel/multi-product architecture to be able to bring new products to market quickly. They’re using camunda BPM for their order management processes, both for customer orders and service fulfillment orders. Customer order processes support the various customer channels, then drive out one or more service order processes to fulfill a customer order.

They decided to use BPM in order to externalize processes from applications, making for more agile development and better reusability. They picked camunda because they wanted “just enough technology”: that is, they wanted to add process management to their existing Java application development environment, not rewrite all of their apps in a proprietary, monolithic BPMS application development environment. camunda BPM is used to implement the multiple service order processes that might be kicked off by any given customer order, with their overall architecture handling the communication between the two order management layers: the customer order layer acts as a consumer of the service order layer’s producer.

Parker went into a lot of detail about how they have implemented this architecture, putting their BPM usage into the context of their overall technical architecture, and walked through the general process model for their service order, which instantiates a dispatcher process for each customer order, which in turn instantiates a subprocess for each line item in the order. They really want to implement all of this in camunda, but are still using TIBCO for the dispatching process while they work out some of the sticky bits, such as how subprocess cancellations are propagated to the parent process. They are also having some challenges with handling process versions, considering that they run 7×24: they need a mapping table that takes these temporal anomalies into consideration, so that the process version in use may be tied to the order date for longer-running order processes. They also created a business dashboard by modifying Cockpit, the camunda IT operations dashboard, to remove all of the “dangerous” operations while exposing the work in progress, and adding some additional functions such as searching by a business key.
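
The version mapping table is a neat pattern, and easy to sketch. Here's a minimal illustration, with invented dates and version names rather than Australia Post's actual implementation, of routing an order to the process definition version that was current on the order date:

```python
from datetime import date

# (effective_from, process definition version), kept sorted by date.
# Dates and version names are invented for illustration.
version_table = [
    (date(2014, 1, 1), "serviceOrder:v1"),
    (date(2014, 6, 1), "serviceOrder:v2"),
    (date(2014, 9, 15), "serviceOrder:v3"),
]

def version_for(order_date):
    """Return the process definition version in effect on the order date."""
    chosen = None
    for effective_from, version in version_table:
        if order_date >= effective_from:
            chosen = version
    if chosen is None:
        raise ValueError("no process version in effect for that date")
    return chosen

print(version_for(date(2014, 7, 3)))   # serviceOrder:v2
print(version_for(date(2014, 10, 1)))  # serviceOrder:v3
```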

Parker wrapped up with their outcomes, some expected, some less so: basically, BPMN 2.0 is really working for them, both for business-IT collaboration and for model-driven development; this level of business-IT alignment means that error handling can be shared, with business handling business errors and IT handling IT errors. They found that developers became productive very quickly, since they were just adding some tools to their existing familiar Java application development environment, although some had to be gently reminded to use the BPM capabilities instead of writing code.

It was great to see the reactions and interactions of the camunda team during the presentation: Australia Post is a “do-it-themselves” open source user of camunda, and as Parker discussed some of the shortcomings, they were obviously taking notes for future work. The presentation finished with him being presented with an award as the non-camunda person who contributed most to the community forum discussions, suggesting that you get out of open source what you put into it.

bpmNEXT 2014 Wednesday Afternoon 1: Mo’ Models

Denis Gagné of Trisotech was back after lunch at bpmNEXT, demonstrating how to socialize process change with their BPMN web modeler. He showed their process animation feature, which allows you to follow the flow through a process, see what happens at each step, and view rich media attached at any given step to explain that step. He showed a process for an Amazon order, where each step had a slideshow or video attached to show the actual work being performed at that step; the tool supports YouTube, Slideshare, Dropbox and a few others natively, plus any URL as an attachment to any element in the process. The animated process can be referenced by a URL, allowing it to be easily distributed and socialized. This provides a way for people to learn more about the process, and can be used as an employee training tool or a customer experience enhancement. Even without the rich media enhancements, the process animation can be used to debug processes and find BPMN logical errors (e.g., deadlocks, orphan branches) by allowing the designer to walk through the process and see how the tokens are processed through the model – most modeling tools only check that the BPMN is syntactically correct, not for more complex logical errors that can result in unexpected and unwanted scenarios. Note that this is different from process simulation (which they also offer), which is typically used to estimate performance based on aggregate instances.
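
One of the logical checks mentioned above, finding orphan branches, amounts to a reachability walk over the flow graph. Here's a minimal sketch of that check (my own illustration, not Trisotech's animation feature) that flags activities unreachable from the start event:

```python
from collections import deque

# Hypothetical flow: node -> downstream nodes (sequence flows).
flow = {
    "start": ["receive_order"],
    "receive_order": ["check_stock"],
    "check_stock": ["ship", "reject"],
    "ship": ["end"],
    "reject": ["end"],
    "end": [],
    "audit": ["end"],  # orphan: nothing flows into it from the start path
}

def unreachable(flow, start="start"):
    """Breadth-first walk from the start event; report unvisited nodes."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in flow[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(set(flow) - seen)

print(unreachable(flow))  # ['audit']
```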

Bruce Silver took a break from moderating to do a demo together with Stephan Fischli and Antonio Palumbo of itp commerce on wizard-based generation of “good BPMN”, which they’ve developed through their BPMessentials collaboration for BPMN training and certification. Bruce’s book BPMN Method and Style, as well as his courses, attempts to teach good BPMN, where the process logic is evident from the printed diagram in spite of things that can tend to confuse a reader, such as hierarchical modeling forms. He uses a top-down methodology where you identify the start and end states of a process instance, then decompose the process into 6-10 steps where each is an activity aligned with the process instance (i.e., no multi-instance activities), and enumerate the possible end states of each activity if there is more than one, so that end states within subprocesses can be matched to gateways that immediately follow the subprocesses. This all takes a bit of a developer’s mindset that’s typically not seen in the business analysts who might be creating BPMN models, meaning that we can still end up with spaghetti process models even in BPMN. Bruce walked through an order-to-cash scenario, then Stephan and Antonio took over to demonstrate how their tool creates a BPMN model based on a wizard that walks through the steps of the BPMN method and style: first the process start and (one or more) end states; then a list of the major steps, where each is named, its end states enumerated and (optionally) the performer identified; then the activity/end-state pairs are listed so that the user can specify the target (following step) for each, which effectively creates the process flow diagram; then each activity can be expanded as a subprocess by listing the child activities and end states; finally, the message flows and lanes are specified by stating which activities have incoming and outgoing message flows. The wizard then creates the BPMN process model in the itp commerce Visio tool, where all of the style rules are enforced. Without doubt, this creates better BPMN, although specifying a branching process model via a list of activities and end states might not be much more obvious than creating the process model directly. I know that the itp commerce tool and some other BPMN modeling tools can also run a check on a BPMN model for violations of the style rules; I assume that detecting and fixing the rule violations from a model is just another way of achieving the same goal.
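
The wizard's core trick, turning activity/end-state pairs into a flow, can be sketched roughly as follows. This is my own toy reconstruction of the method's logic, not itp commerce's code: each activity with more than one end state gets an exclusive gateway, with one labeled flow per end state.

```python
# Toy reconstruction of the wizard logic: activities, their end states,
# and the target step for each end state, as entered in the wizard.
steps = {
    "Check Credit": {"approved": "Fulfill Order", "declined": "Notify Customer"},
    "Fulfill Order": {"done": "END"},
    "Notify Customer": {"done": "END"},
}

flows = []
for activity, end_states in steps.items():
    if len(end_states) > 1:
        # Multiple end states: route through an exclusive gateway,
        # with one labeled sequence flow per end state.
        gateway = f"{activity}?"
        flows.append((activity, gateway, ""))
        for end_state, target in end_states.items():
            flows.append((gateway, target, end_state))
    else:
        # Single end state: a direct sequence flow to the target.
        (end_state, target), = end_states.items()
        flows.append((activity, target, ""))

for source, target, label in flows:
    print(f"{source} -> {target}" + (f" [{label}]" if label else ""))
# Check Credit -> Check Credit?
# Check Credit? -> Fulfill Order [approved]
# Check Credit? -> Notify Customer [declined]
# Fulfill Order -> END
# Notify Customer -> END
```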

Last up before the afternoon break was Gero Decker of Signavio, demonstrating the combination of process modeling and enterprise architecture. Signavio’s only product is their process modeler – used to model, collaborate, publish and implement models – which means that they typically deal with process designers and process centers of excellence. However, they are finding that they are now running into EA modelers as they start to move into process architecture and governance, and application owners for application lifecycle management. EA modelers have to deal with the issue of whether to use a unified tool with a single object repository for all modeling and unified model governance, or multiple best-of-breed tools where metamodels can be synchronized and may be slaved between tools. Signavio is pushing the second alternative, where their tool integrates with or overlays other tools such as SAP Solution Manager and leanIX. Signavio has added ArchiMate open standard enterprise architecture model types to their tool for EA modeling, creating linkages and tracing from ArchiMate objects to BPMN models. Gero demonstrated what the ArchiMate models look like in Signavio, then how processes in leanIX can be directly linked to Signavio process models, as well as having applications from the EA portfolio available as performers to specify in a Signavio process model. Process models created in Signavio that use applications from the portfolio then show up (via automated synchronization) in leanIX as references to those applications. He also showed an integration with Effektif for approving changes to the process model in Signavio, comparing the before and after versions of the flow, since there is a pluggable connector to Signavio from Effektif processes. Connections to other tools could be built using the Signavio REST API. Nice integration between process models and application portfolio models in separate tools, as well as the model approval workflow.

High-Value Solution Consulting At Amdocs With An ARIS-Based Solution Book

Down to the last two breakout sessions at Innovation World, and we heard from Ophir Edrey of Amdocs, a company providing business support software with a focus on the communications, media and entertainment industries. They wanted to be able to leverage their own experience across multiple geographies, leading their customers towards a best practice-based implementation. To do this, they created a solution book that brings together best practices, methodologies, business processes and other information within an enterprise architecture, allowing Amdocs consultants and customers to collaborate on how that architecture needs to be modified to fit the customer’s specific needs.

The advantage of this is that Amdocs doesn’t just offer a software solution, but an entire advisory service around the best practices related to the solution. The solution book is created in ARIS, including the process models, solution design, solution traceability, customer collaboration (which they are migrating to ARIS Connect, not Process Live), and review and approval management.

He showed us a demo of the Amdocs Solution Book, specifically the business process framework. It contains four levels of decomposition, starting with a value chain of the entire operator landscape mapped onto the full set of process model families. Drilling through into a specific set of processes (in this example, a mobile customer upgrading a handset), he showed the KPIs and the capabilities provided by their solution for that particular process; this starts the proof of Amdocs’ value to the customer as more than just a software provider. Drilling further into the specific process model, the Amdocs consultant can gather feedback from the customer on how it might need to be modified for their specific needs, with comments added directly on the models for others to see and respond to.

They have had some pushback from customers on this – some people really just want a paper document – but generally have had very enthusiastic feedback and a strong demand to use the tool for projects. The result is faster, better, value-added implementations of their software solutions, giving them a competitive edge. Certainly an interesting model for the services arm of any complex enterprise software provider.

The Digital Agility Layer: Time To Get Intentionally Digital

Wolfram Jost, CTO of Software AG, started us off on the first full day of Innovation World with a keynote on innovations for the digital enterprise. As I mentioned yesterday, the use of the term “digital enterprise” (and even more, “digitization”) is a bit strange, since pretty much everything is digital these days; it’s just not necessarily the right type of digital. We still need to think about integration between systems to make automation seamless, but more importantly, we need to think about interaction patterns that put control in the hands of customers, and mobile and social platforms that make the digital forms ubiquitous. So maybe the right phrase is that we have to start being intentionally digital enterprises, rather than letting it happen accidentally.

I definitely agree with Jost’s key point: it’s all about the process. We need end-to-end processes at the business/customer layer, but have to interact with a plethora of silos down below, both on premise and in the cloud, some of which are decades old. Software AG, naturally, provides tools to help that happen: in-memory data management, integration/SOA, BPM, EA and intelligent business operations (IBO, including event processing and analytics). This is made up of a number of acquisitions – Apama, alfabet, LongJump, Nirvana, JackBe – plus the pre-existing portfolio including ARIS and webMethods. Now, we’re seeing some of that in their Software AG Live PaaS vision for a unified cloud offering: Process Live for modeling and process publishing; Portfolio Live for IT portfolio management; AgileApps Live for application development and case management; and Integration Live for cloud-to-cloud and cloud-to-on premise integration. Integration Live is coming next year, but the rest of the platform is available as of today.

We had a demo of Process Live, which provides cloud-based BPMN process modeling including collaboration, and Portfolio Live, used to see the systems with which the modeled processes may interact, including a wide variety of portfolio management functions such as assessing the usage and future development potential of any given system or application. We also saw an AgileApps Live application, including an analytics dashboard plus forms data entry and task/case management; interestingly, this is still sporting a longjump.com URL. I last reviewed LongJump in 2007 in conjunction with the Enterprise 2.0 conference, and obviously there have been some advances since then: it’s still an application development tool for web-based apps, but it now includes a lot of ad hoc task/case management functionality that allows the knowledge worker to create their own multi-step tasks (subprocesses, in effect) as well as perform other case-type functionality such as gathering artifacts and completing tasks related to a case resolution/completion.

Although Integration Live isn’t there yet, we did hear about the different deployment styles that will be supported: development and/or operations can be in the cloud; there can be an on premise ESB or direct connections to systems.

Jost drilled down into several of the specific products, starting with the overarching premise that Software AG is moving from a more traditional multi-tier architecture to an event-driven architecture (EDA), where everything is based around the event bus (a minimal sketch of the pattern follows the list below). Product highlights included:

  • ARIS positioning and use cases from process modeling to governance, and the radical UI redesign in ARIS 9 that matches the Process Live UI
  • Mobile and social BPM UI
  • Elastic ESB using virtual private cloud as well as public and private cloud
  • API management, representing an extension to the Centrasite concepts
  • Intelligent business operations architecture including in-memory analytics and event processing
  • Terracotta strategy for in-memory data management
  • Integration of Apama, big memory (Terracotta) and messaging for big data/event correlation
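
To make the event-bus premise concrete, here's a minimal publish/subscribe sketch in Python. It's an illustration of the EDA pattern only, not Software AG's actual messaging layer:

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus: everything reacts to events."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # event type -> handler list

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.subscribers[event_type]:
            handler(payload)

bus = EventBus()
# Analytics and process layers react to the same event independently,
# without being wired directly to each other.
bus.subscribe("order.placed", lambda e: print("analytics: record", e))
bus.subscribe("order.placed", lambda e: print("process: start fulfillment for", e["id"]))
bus.publish("order.placed", {"id": 42, "amount": 99.0})
```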


I’m sure that we’ll see a lot more about these over the next two days so I’m not trying to cover everything here.

We had a brief demo from John Bates on audience sentiment analysis for price level setting using Apama, then wrapped up with a presentation from Edy Liongosari, Managing Director at Accenture, on how to bring some of this into practice. One thing that Liongosari said really resonated: next year, none of us are going to be talking about cloud, because it will be so ubiquitous. The same is true, I believe, of the terms social and mobile. Not to mention digital.

Business Architecture Bridging Strategic Vision And Operational Excellence

I made it to the Gartner BPM Summit 2013 in Washington DC today just in time for the 11am session that Betsy Burton gave on bridging the gap between strategic vision and operational excellence with business architecture (BA). I like her view on this: strategic vision really isn’t much good unless you have a plan (or at least a direction) for how you’re going to achieve it. She points out that most organizations don’t execute on their vision — only about 10% do, if you believe the studies by Hammer and others — and you’re not going to get there unless, along with vision, you also define implications, constraints, risks and interdependencies. Business strategy, which is a big part of business architecture, requires a diagnosis, guiding policy, coherent actions and target outcomes. I also like her distinction between “deliberate strategy” (that which is foreseen and planned) and “emergent strategy” (that which happens in response to actual conditions, a.k.a. “how we get stuff done”), although I’m not sure that I’d consider the emergent part to be strategy, strictly speaking.

She showed a good example of a business capability model that had been developed for a financial services firm, where capabilities are “things the business does”, not processes or departments. Overlaid on that was color coding showing the level of investment in each capability, bolding to show the capabilities with strategic importance, and physical grouping of capabilities related to a specific business goal. This gives a view, on one chart, of how the business vision is aligned with capabilities and spending. For example, in the group “Sell and Service Products” were six capabilities. One of those was “Onboard Customers”, which was bolded to indicate that it’s of strategic importance, but white to indicate that it is getting only a minimal amount of investment. Then, overlaid on that, she showed how processes intersect with capabilities by adding numbered bubbles to indicate which process impacts each capability. Keep in mind that a process can span multiple capabilities, and a capability may require multiple processes. So that Onboard Customers capability intersects with A1, an account management process, as do 10 other capabilities. Next, she overlaid information sources and consumers and their linkages, that is, which capabilities create or consume information from other capabilities. As you add in the application portfolio, the inconsistencies in the architecture start to emerge, and low-risk, non-strategic capabilities are exposed as targets for cloud or outsourcing.
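
The overlay analysis she described is essentially a join over capability attributes. A small sketch with made-up data shows how the "Onboard Customers" gap and the outsourcing candidates fall out of it:

```python
# Made-up capability data: strategic flag, investment level, risk, processes.
capabilities = [
    {"name": "Onboard Customers", "strategic": True,  "investment": "minimal",
     "risk": "high", "processes": ["A1"]},
    {"name": "Manage Accounts",   "strategic": True,  "investment": "high",
     "risk": "high", "processes": ["A1", "A3"]},
    {"name": "Print Statements",  "strategic": False, "investment": "minimal",
     "risk": "low",  "processes": ["B2"]},
]

# Strategic capabilities with minimal investment: misalignment worth flagging.
gaps = [c["name"] for c in capabilities
        if c["strategic"] and c["investment"] == "minimal"]
print("strategic but under-invested:", gaps)  # ['Onboard Customers']

# Low-risk, non-strategic capabilities: targets for cloud or outsourcing.
candidates = [c["name"] for c in capabilities
              if not c["strategic"] and c["risk"] == "low"]
print("cloud/outsourcing candidates:", candidates)  # ['Print Statements']
```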

Gartner provides a classification for applications (their Pace Layering): applications exist for innovation, for differentiation, or for record (commodity). Extending this to the capability map allows the processes and capabilities to be categorized the same way. To quote her presentation notes, “processes associated with innovative business capabilities will be more likely to change, will be more complex and potentially high value.” This identifies the processes that really drive business growth and goal achievement. Making the link between capabilities, processes and applications, the impact on people and processes of changing capabilities and swapping out applications becomes obvious.

Since this is a BPM conference, she made the link to what this means for BP professionals, and ended with some specific recommendations for BP directors, starting with “work with your EA team to understand the role of business architecture” and understanding the link between BA and BPM. I’m impressed with the level of integration that she’s made between BPM and BA, and she provided some good ideas on how to connect these up as part of the business strategy.