Australia Post at camunda Community Day

I am giving the keynote at camunda’s BPMcon conference tomorrow, and since I arrived in Berlin a couple of days early, camunda invited me to attend their community day today, which is the open source community meeting. Nice to see such a great turnout — 70 or 80 people? — and I definitely wasn’t the one who travelled the furthest to get here, since today’s opening presentation is from Rob Parker of Australia Post. Australia has a lot of the same issues as Canada when it comes to nation-wide services such as post, since we both have a lot of geography and not a lot of population: this means that a lot of services have to be delivered at a fiscal loss to sparsely-populated areas, with specific rules about what percentage of the population has to be within a certain distance of a postal outlet.

Post offices in particular are hard-hit by digital disruption; Australia Post has seen their letter delivery service decline by 1 billion articles (and the related revenue), even though the number of addresses to cover has increased. However, they have seen their parcel delivery business increase, even though this is a competitive business with courier companies. They’re also offering a number of other products, such as electronic bill payment, digital mail delivery and even passport interviews, which has driven them to create a more integrated multi-channel/multi-product architecture to be able to quickly bring new products to market. They’re using camunda BPM for their order management processes, both for customer orders and service fulfillment orders. Customer order processes support the various customer channels, then drive out one or more service order processes to fulfill a customer order.

They decided to use BPM in order to externalize processes from applications, making for more agile development and better reusability. They picked camunda because they wanted “just enough technology”: that is, they wanted to add process management to their existing Java application development environment, not rewrite all of their apps in a proprietary, monolithic BPMS app dev environment. camunda BPM is used to implement the multiple service order processes that might be kicked off by any given customer order, with their overall architecture handling the communication between the two order management layers: the customer order layer as a consumer for the service order layer producer.
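
The two-layer split can be pictured as a toy producer/consumer sketch — the names, queue mechanics and event shapes here are my own invention for illustration, not Australia Post’s actual code. The customer order process drives out one service order per line item, and the service order layer produces fulfillment events that the customer order layer consumes:

```python
from queue import Queue

# Service order layer (producer) posts status events; customer order
# layer (consumer) reads them. All names are illustrative.
status_events = Queue()

def fulfill_service_order(order_id, line_item):
    # ... the service order process itself would run here (in camunda BPM) ...
    status_events.put((order_id, line_item, "FULFILLED"))

def process_customer_order(order_id, line_items):
    """Customer order layer: spawn one service order per line item,
    then consume the fulfillment events to complete the customer order."""
    for item in line_items:
        fulfill_service_order(order_id, item)
    done = [status_events.get() for _ in line_items]
    return all(status == "FULFILLED" for _, _, status in done)
```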

Parker went into a lot of detail about how they have implemented this architecture, putting their BPM usage into the context of their overall technical architecture, and walked through the general process model for their service order, which instantiates a dispatcher process for each customer order, which in turn instantiates a subprocess for each line item in the order. They really want to implement all of this in camunda, but are still using TIBCO for the dispatching process while they work out some of the sticky bits, such as how subprocess cancellations are propagated to the parent process. They are also having some challenges with handling process versions, considering that they run 7×24: they need a mapping table that takes these temporal anomalies into consideration, so that the process version in use may be tied to the order date for longer-running order processes. They also created a business dashboard by modifying Cockpit, the camunda IT operations dashboard, to remove all of the “dangerous” operations while exposing the work in progress, and adding some additional functions such as searching by a business key.
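
The version mapping table idea can be sketched in a few lines: each entry says which process definition version takes effect on a given date, and a long-running order keeps executing against the version in effect on its order date. The table contents and naming below are invented for illustration, not Australia Post’s actual schema:

```python
from bisect import bisect_right
from datetime import date

# Hypothetical mapping table: effective date -> process definition version.
VERSION_TABLE = [
    (date(2013, 1, 1), "serviceOrder:v1"),
    (date(2013, 6, 15), "serviceOrder:v2"),
    (date(2013, 11, 1), "serviceOrder:v3"),
]

def process_version_for(order_date: date) -> str:
    """Return the process version in effect on the order date, so that
    longer-running orders stay on the version they started with."""
    effective_dates = [d for d, _ in VERSION_TABLE]
    idx = bisect_right(effective_dates, order_date) - 1
    if idx < 0:
        raise ValueError(f"no process version effective on {order_date}")
    return VERSION_TABLE[idx][1]
```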

Parker ended with their outcomes, some expected, some less so: basically, BPMN 2.0 is really working for them, both for business-IT collaboration and for model-driven development; this level of business-IT alignment means that error handling can be shared, with business handling business errors and IT handling IT errors. They found that developers became productive very quickly, since they were just adding some tools to their existing familiar Java application development environment, although some had to be gently reminded to use the BPM capabilities instead of writing code.

It was great to see the reactions and interactions of the camunda team during the presentation: Australia Post is a “do-it-themselves” open source user of camunda, and as Parker discussed some of the shortcomings, they were obviously taking notes for future work. The presentation finished with him being presented with an award as the non-camunda person who contributed most to the community forum discussions, suggesting that you get out of open source what you put into it.

bpmNEXT 2014 Wednesday Afternoon 1: Mo’ Models

Denis Gagne of Trisotech was back after lunch at bpmNEXT demonstrating socializing process change with their BPMN web modeler. He showed their process animation feature, which allows you to follow the flow through a process and see what happens at each step, and view rich media that has been attached at any given step to explain that step. He showed a process for an Amazon order, where each step had a slideshow or video attached to show the actual work that was being performed at that step; the tool supports YouTube, Slideshare, Dropbox and a few others natively, plus any URL as an attachment to any element in the process. The animated process can be referenced by a URL, allowing it to be easily distributed and socialized. This provides a way for people to learn more about the process, and can be used as an employee training tool or a customer experience enhancement. Even without the rich media enhancements, the process animation can be used to debug processes and find BPMN logical errors (e.g., deadlocks, orphan branches) by allowing the designer to walk through the process and see how the tokens are processed through the model – most modeling tools only check that the BPMN is syntactically correct, not for more complex logical errors that can result in unexpected and unwanted scenarios. Note that this is different from process simulation (which they also offer), which is typically used to estimate performance based on aggregate instances.
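
The underlying idea of walking tokens through the model to surface structural problems can be sketched with a simple reachability check over the process graph — this is my own toy illustration of the concept (orphan-branch detection), not Trisotech’s actual algorithm:

```python
from collections import deque

def unreachable_nodes(graph: dict, start: str) -> set:
    """Nodes with no path from the start event: candidate orphan branches
    that a purely syntactic BPMN check would not flag."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return set(graph) - seen

# Toy process: an adjacency list of flow nodes, with one orphan branch.
process = {
    "start": ["task_a"],
    "task_a": ["gateway"],
    "gateway": ["task_b", "end"],
    "task_b": ["end"],
    "orphan_task": ["end"],  # modelled, but no token can ever reach it
    "end": [],
}
print(unreachable_nodes(process, "start"))  # → {'orphan_task'}
```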

Bruce Silver took a break from moderating to do a demo together with Stephan Fischli and Antonio Palumbo of itp commerce on wizard-based generation of “good BPMN” that they’ve done through their BPMessentials collaboration for BPMN training and certification. Bruce’s book BPMN Method and Style as well as his courses attempt to teach good BPMN, where the process logic is evident from the printed diagram in spite of things that can tend to confuse a reader, such as hierarchical modeling forms. He uses a top-down methodology where you identify the start and end states of a process instance, then decompose the process into 6-10 steps where each is an activity aligned with the process instance (i.e., no multi-instance activities), and enumerate the possible end states of each activity if there is more than one so that end states within subprocesses can be matched to gateways that immediately follow the subprocesses. This all takes a bit of a developer’s mindset that’s typically not seen in business analysts who might be creating BPMN models, meaning that we can still end up with spaghetti process models even in BPMN. Bruce walked through an order-to-cash scenario, then Stephan and Antonio took over to demonstrate how their tool creates a BPMN model based on a wizard that walks through the steps of the BPMN method and style: first the process start and (one or more) end states; then a list of the major steps, where each is named, the end states enumerated and (optionally) the performer identified; then the activity-end state pairs are listed so that the user can specify the target (following step), which effectively creates the process flow diagram; then, each activity can be expanded as a subprocess by listing the child activities and the end states; finally, the message flows and lanes are specified by stating which activities have incoming and outgoing message flows. 
The wizard then creates the BPMN process model in the itp commerce Visio tool, where all of the style rules are enforced. Without doubt, this creates better BPMN, although specifying a branching process model via a list of activities and end states might not be much more obvious than creating the process model directly. I know that the itp commerce and some other BPMN modeling tools can also check a BPMN model for violations of the style rules; I assume that detecting and fixing the rule violations in a model is just another way of achieving the same goal.
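
The wizard’s inputs can be pictured as two small tables — activities with their enumerated end states, and a target step for each (activity, end state) pair — from which the sequence flows fall out mechanically. This is my own sketch of the data involved, with invented names, not the itp commerce tool’s internals:

```python
# Activities and their enumerated end states, per the method and style.
activities = {
    "Receive Order": ["received"],
    "Check Credit": ["approved", "rejected"],
    "Ship Order": ["shipped"],
}

# Target (following step) for each activity / end-state pair.
targets = {
    ("Receive Order", "received"): "Check Credit",
    ("Check Credit", "approved"): "Ship Order",
    ("Check Credit", "rejected"): "END: order cancelled",
    ("Ship Order", "shipped"): "END: order fulfilled",
}

def flow_edges(activities, targets):
    """Expand the pair list into sequence-flow edges; an activity with more
    than one end state implies a gateway immediately following it, so its
    edges carry the end-state label as the gateway condition."""
    edges = []
    for activity, end_states in activities.items():
        needs_gateway = len(end_states) > 1
        for state in end_states:
            label = state if needs_gateway else None
            edges.append((activity, targets[(activity, state)], label))
    return edges
```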

Last up before the afternoon break was Gero Decker of Signavio to demonstrate combining process modeling and enterprise architecture. Signavio’s only product is their process modeler – used to model, collaborate, publish and implement models – which means that they typically deal with process designers and process centers of excellence. However, they are finding that they are now running into EA modelers as they start to move into process architecture and governance, and application owners for application lifecycle management. EA modelers have to deal with the issue of whether to use a unified tool with a single object repository for all modeling and unified model governance, or multiple best of breed tools where metamodels can be synchronized and may be slaved between tools. Signavio is pushing the second alternative, where their tool integrates with or overlays other tools such as SAP Solution Manager and leanIX. Signavio has added ArchiMate open standard enterprise architecture model types to their tool for EA modeling, creating linkages and tracing from ArchiMate objects to BPMN models. Gero demonstrated what the ArchiMate models look like in Signavio, then how processes in leanIX can be directly linked to Signavio process models, as well as having applications from the EA portfolio available as performers to specify in a Signavio process model. Process models created in Signavio that use applications from the portfolio then show up (via automated synchronization) in leanIX as references to those applications. He also showed an integration with Effektif for approving changes to the process model in Signavio, comparing the before and after versions of the flow, since there is a pluggable connector to Signavio from Effektif processes. Connections to other tools could be built using the Signavio REST API. Nice integration between process models and application portfolio models in separate tools, as well as the model approval workflow.

High-Value Solution Consulting At Amdocs With An ARIS-Based Solution Book

Down to the last two breakout sessions at Innovation World, and we heard from Ophir Edrey of Amdocs, a company providing software for business support, with a focus on the communications, media and entertainment industries. They wanted to be able to leverage their own experience across multiple geographies, leading their customers towards a best practice-based implementation. To do this, they created a solution book that brings together best practices, methodologies, business processes and other information within an enterprise architecture, allowing Amdocs consultants and customers to collaborate on how that architecture needs to be modified to fit the customer’s specific needs.

The advantage of this is that Amdocs doesn’t just offer a software solution, but an entire advisory service around the best practices related to the solution. The solution book is created in ARIS, including the process models, solution design, solution traceability, customer collaboration (which they are migrating to ARIS Connect, not Process Live), and review and approval management.

He showed us a demo of the Amdocs Solution Book, specifically the business process framework. It contains four levels of decomposition, starting with a value chain of the entire operator landscape mapped onto the full set of process model families. Drilling through into a specific set of processes for, in this example, a mobile customer upgrading a handset, he showed the KPIs and the capabilities provided by their solution for that particular process; this starts the proof of Amdocs’ value to the customer as more than just a software provider. Drilling further into the specific process model, the Amdocs consultant can gather feedback from the customer on how this might need to be modified for their specific needs, and comments can be added directly on the models for others to see and respond to.

They have had some pushback from customers on this – some people really just want a paper document – but generally have had very enthusiastic feedback and a strong demand to use the tool for projects. The result is faster, better, value-added implementations of their software solutions, giving them a competitive edge. Certainly an interesting model for the services arm of any complex enterprise software provider.

The Digital Agility Layer: Time To Get Intentionally Digital

Wolfram Jost, CTO of Software AG, started us off on the first full day of Innovation World with a keynote on innovations for the digital enterprise. As I mentioned yesterday, the use of the term “digital enterprise” (and even more, “digitization”) is a bit strange, since pretty much everything is digital these days, it’s just not necessarily the right type of digital. We still need to think about integration between systems to make automation seamless, but more importantly, we need to think about interaction patterns that put control in the hands of customers, and mobile and social platforms that make the digital forms ubiquitous. So maybe the right phrase is that we have to start being intentionally digital enterprises, rather than let it happen accidentally.

I definitely agree with Jost’s key point: it’s all about the process. We need end-to-end processes at the business/customer layer, but have to interact with a plethora of silos down below, both on premise and in the cloud, some of which are decades old. Software AG, naturally, provides tools to help that happen: in-memory data management, integration/SOA, BPM, EA and intelligent business operations (IBO, including event processing and analytics). This is made up of a number of acquisitions – Apama, alfabet, LongJump, Nirvana, JackBe – plus the pre-existing portfolio including ARIS and webMethods. Now, we’re seeing some of that in their Software AG Live PaaS vision for a unified cloud offering: Process Live for modeling and process publishing; Portfolio Live for IT portfolio management; AgileApps Live for application development and case management; and Integration Live for cloud-to-cloud and cloud-to-on premise integration. Integration Live is coming next year, but the rest of the platform is available as of today.

We had a demo of Process Live, which provides cloud-based BPMN process modeling including collaboration; and Portfolio Live to see the systems with which the modeled processes may interact, including a wide variety of portfolio management functions such as assessing the usage and future development potential of any given system or application. We also saw an AgileApps Live application, including an analytics dashboard plus forms data entry and task/case management; interestingly, this is still sporting a URL. I last reviewed LongJump in 2007 in conjunction with the Enterprise 2.0 conference, and obviously there have been some advances since then: it’s still an application development tool for web-based apps, but includes a lot of ad hoc task/case management functionality that allows the knowledge worker to create their own multi-step tasks (subprocesses, in effect) as well as perform other case-type functionality such as gathering artifacts and completing tasks related to a case resolution/completion.

Although Integration Live isn’t there yet, we did hear about the different deployment styles that will be supported: development and/or operations can be in the cloud; there can be an on premise ESB or direct connections to systems.

Jost drilled down into several of the specific products, starting with the overarching premise that Software AG is moving from a more traditional multi-tier architecture to an event-driven architecture (EDA), where everything is based around the event bus. Product highlights included:

  • ARIS positioning and use cases from process modeling to governance, and the radical UI redesign in ARIS 9 that matches the Process Live UI
  • Mobile and social BPM UI
  • Elastic ESB using virtual private cloud as well as public and private cloud
  • API management, representing an extension to the Centrasite concepts
  • Intelligent business operations architecture including in-memory analytics and event processing
  • Terracotta strategy for in-memory data management
  • Integration of Apama, big memory (Terracotta) and messaging for big data/event correlation


I’m sure that we’ll see a lot more about these over the next two days so I’m not trying to cover everything here.

We had a brief demo from John Bates on audience sentiment analysis for price level setting using Apama, then wrapped up with a presentation from Edy Liongosari, Managing Director at Accenture, on how to bring some of this into practice. One thing that Liongosari said really resonated: next year, none of us are going to be talking about cloud, because it will be so ubiquitous. The same is true, I believe, of the terms social and mobile. Not to mention digital.

Business Architecture Bridging Strategic Vision And Operational Excellence

I made it to the Gartner BPM Summit 2013 in Washington DC today just in time for the 11am session that Betsy Burton gave on bridging the gap between strategic vision and operational excellence with business architecture (BA). I like her view on this: strategic vision really isn’t much good unless you have a plan (or at least a direction) for how you’re going to do it. She points out that most organizations don’t execute on their vision — only about 10% if you believe the studies by Hammer and others — and you’re not going to get there unless along with vision, you also define implications, constraints, risks and interdependencies. Business strategy, which is a big part of business architecture, requires a diagnosis, guiding policy, coherent actions and target outcomes. I also like her distinction between “deliberate strategy” (that which is foreseen and planned) and “emergent strategy” (that which happens in response to actual conditions, a.k.a., “how we get stuff done”), although I’m not sure that I’d consider the emergent part to be strategy, strictly speaking.

She showed a good example of a business capability model that had been developed for a financial services firm, where capabilities are “things the business does”, not processes or departments. Overlaid with that was color coding showing the level of investment in each capability, and bolding to show the capabilities with strategic importance, plus physical grouping of capabilities related to a specific business goal. This gives a view, on one chart, of how the business vision is aligned with capabilities and spending. For example, in the group “Sell and Service Products” were six capabilities. One of those was “Onboard Customers”, which was bolded to indicate that it’s of strategic importance, but white to indicate that it is getting only a minimal amount of investment. Then, overlaid on that, she showed how processes intersect with capabilities by adding numbered bubbles to indicate which process impacts each capability. Keep in mind that a process can span multiple capabilities, and a capability may require multiple processes. So that Onboard Customers capability intersects with A1, an account management process, as do 10 other capabilities. Next, she overlaid information sources and consumers and their linkages, that is, which capabilities create or consume information from other capabilities. As you add in the application portfolio, the inconsistencies in the architecture start to emerge, and low-risk, non-strategic capabilities are exposed as targets for cloud or outsourcing.
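
The value of the overlay is that misalignments become a simple query over the map. A toy version, with data invented for illustration (Burton’s actual example was a financial services firm’s full capability model):

```python
# Each capability carries its strategic flag, investment level, and the
# processes that touch it — the three overlays from the one-chart view.
capabilities = {
    "Onboard Customers": {
        "strategic": True, "investment": "minimal", "processes": {"A1"},
    },
    "Bill Customers": {
        "strategic": False, "investment": "high", "processes": {"A1", "B3"},
    },
}

def misaligned(caps):
    """Strategic capabilities receiving only minimal investment: exactly
    the vision/spending gaps the overlaid chart is meant to expose."""
    return [name for name, cap in caps.items()
            if cap["strategic"] and cap["investment"] == "minimal"]

print(misaligned(capabilities))  # → ['Onboard Customers']
```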

Gartner provides a classification for applications (their Pace Layering): they’re there for innovation, for differentiation, or for record (commodity). Extending this to the capability map allows the processes and capabilities to also be categorized this way. To quote her presentation notes, “processes associated with innovative business capabilities will be more likely to change, will be more complex and potentially high value.” This identifies the processes that really drive business growth and goal achievement. By making the link between capabilities, processes and applications, the impact on people and processes of changing capabilities and swapping out applications becomes obvious.

Since this is a BPM conference, she made the link to what this means for BP professionals, and ended with some specific recommendations for BP directors, starting with “work with your EA team to understand the role of business architecture” and understanding the link between BA and BPM. I’m impressed with the level of integration that she’s made between BPM and BA; she provided some good ideas on how to connect these up as part of the business strategy.

Aligning BPM and EA Tutorial at BBCCon11

I reworked my presentation on BPM in an enterprise architecture context (a.k.a., “why this blog is called ‘Column 2’”) that I originally did at the IRM BPM conference in London in June, and presented it at the Building Business Capability conference in Fort Lauderdale last week. I removed much of the detailed information on BPMN, refined some of the slides, and added in some material from Michael zur Muehlen’s paper on primitives in BPM and EA. Some nice improvements, I thought, and it came in right on time at 3 hours without having to skip over some material as I did in London.

Here are some of the invaluable references that I used in creating this presentation:

That should give you plenty of follow-on reading if you find my slides to be too sparse on their own.

NSERC BI Network at CASCON2011 (Part 1)

I only have one day to attend CASCON this year due to a busy schedule this week, so I am up in Markham (near the IBM Toronto software lab) to attend the NSERC Business Intelligence Network workshop this morning. CASCON is the conference run by IBM’s Centers for Advanced Studies throughout the world, including the Toronto lab (where CAS originated), as a place for IBM researchers, university researchers and industry to come together to discuss many different areas of technology. Sometimes, this includes BPM-related research, but this year the schedule is a bit light on that; however, the BI workshop promises to provide some good insights into the state of analytics research.

Eric Yu from University of Toronto started the workshop, discussing how BI can enable organizations to become more adaptive. Interestingly, after all the talk about enterprise architecture and business architecture at last week’s Building Business Capability conference, that is the focus of Yu’s presentation, namely, that BI can help enterprises to better adapt and align business architecture and IT architecture. He presented a concept for an adaptive enterprise architecture that is owned by business people, not IT, and geared at achieving measurable business success. He discussed modeling variability at different architectural layers, and the traceability between them, and how making BI an integral part of an organization – not just the IT infrastructure – can support EA adaptability. He finished by talking about maturity models, and how a closed loop deployment of BI technologies can help meet adaptive enterprise requirements. Core to this is the explicit representation of change processes and their relationship to operational processes, as well as linking strategic drivers to specific goals and metrics.

Frank Tompa from University of Waterloo followed with a discussion of mapping policies (from a business model, typically represented as high-level business rules) to constraints (in a data model) so that these can be enforced within applications. My mind immediately went to why you would be mapping these to a database model rather than a rules management system; his view seems to be that a DBMS is what monitors at a transactional level and ensures compliance with the business model (rules). His question: “how do we make the task of database programming easier?” My question: “why aren’t you doing this with a BRMS instead of a DBMS?” Accepting his premise that this should be done by a database programmer, the approach is to start with object definitions, where an object is a row (tuple) defined by a view over a fixed database schema, and represents all of the data required for policy making. Secondly, consider the states that an object can assume by considering that an object x is in state S if its attributes satisfy S(x). An object can be in multiple states at once; the states seem to be more like functions than states, but whatever. Thirdly, the business model has to be converted to an enforcement model through a sort of process model that also includes database states; really more of a state diagram that maps business “states” to database states, with constraints on states and state transitions denoted explicitly. I can see some value in the state transition constraint models in terms of representing some forms of business rules and their temporal relationships, but his representation of a business process as a constraint diagram is not something that a business analyst is ever going to read, much less create. However, the role of the business person seems to be restricted to “policy designer” listing “states of interest”, and the goal of this research is to “form a bridge between the policy manager and the database”.
Their future work includes extracting workflows from database transaction logs, which is, of course, something that is well underway in the BPM data mining community. I asked (explicitly to the presenter, not just snarkily here in my blog post) about the role of rules engines: he said that one of the problems was in vocabulary definition, which is often not done in organizations at the policy and rules level; by the time things get to the database, the vocabulary is sufficiently constrained that you can ensure that you’re getting what you need. He did say that if things could be defined in a rules engine using a standardized vocabulary, then some of the rules/constraints could be applied before things reached the database; there does seem to be room for both methods as long as the business rules vocabulary (which does exist) is not well-entrenched.
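
The states-as-predicates idea can be sketched in a few lines — here the predicates, the object attributes and the allowed-transition table are all invented for illustration; the actual research expresses these as constraints over a fixed database schema rather than Python functions:

```python
# An object is in state S if S(x) holds over its attributes; it can be
# in several states at once.
states = {
    "active": lambda o: o["status"] == "active",
    "suspended": lambda o: o["status"] == "suspended",
    "overdue": lambda o: o["balance"] > 0 and o["days_past_due"] > 30,
}

# Explicitly denoted constraints on state transitions: anything not
# listed here violates policy.
allowed = {("active", "suspended"), ("suspended", "active")}

def states_of(obj):
    return {name for name, pred in states.items() if pred(obj)}

def transition_ok(before, after):
    """Check every state change implied by an update against the
    transition constraints."""
    left = states_of(before) - states_of(after)
    entered = states_of(after) - states_of(before)
    return all((s, t) in allowed for s in left for t in entered)
```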

Jennifer Horkoff from University of Toronto was up next discussing strategic models for BI. Her research is about moving BI from a technology practice to a decision-making process that starts with strategic concerns, generates BI queries, interprets the results relative to the business goals and decides on necessary actions. She started with the OMG Business Motivation Model (BMM) for building governance models, and extended that to a Business Intelligence Model (BIM), or business schema. The key primitives include goals, situations (which can model SWOT), indicators (quantitative measures), influences (relationships) and more. This model can be used at the high-level strategic level, or at a more tactical level that links more directly to activities. There is also the idea of a strategy, which is a collection of processes and quality constraints that fulfill a root-level goal. Reasoning can be done with BIMs, such as determining whether a specific strategy can fulfill a specific goal, and influence diagrams with probabilities on each link can be used to help determine decisions. They are using BIM concepts to model a case study with Rouge Valley Health System to improve patient flow and reduce wait times; results from this will be seen in future research.
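
One way to picture the influence-diagram reasoning is a naive probability chain over influence links — this is my own sketch with invented goals and numbers, not the actual BIM formalism:

```python
# Each influence link carries a probability that success at the source
# propagates to the target goal; chaining them gives a rough estimate of
# whether a strategy fulfills a root-level goal. Data is illustrative.
influences = {
    "Reduce triage time": [("Improve patient flow", 0.8)],
    "Improve patient flow": [("Reduce wait times", 0.7)],
}

def fulfillment(goal_from, goal_to, prob=1.0):
    """Naive chain: multiply link probabilities along the strongest path
    of influences from one goal to another (0.0 if no path exists)."""
    if goal_from == goal_to:
        return prob
    return max((fulfillment(nxt, goal_to, prob * p)
                for nxt, p in influences.get(goal_from, [])),
               default=0.0)

print(fulfillment("Reduce triage time", "Reduce wait times"))
```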

Each of these presentations could have filled a much bigger time slot, and I could only capture a flavor of their discussions. If you’re interested in more detail, you can contact the authors directly (links to each above) to get the underlying research papers; I’ve always found researchers to be thrilled that anyone outside the academic community is interested in what they’re doing, and are happy to share.

We’re just at the mid-morning break, but this is getting long, so I’ll post this and continue in a second post. Lots of interesting content; I’m looking forward to the second half.

Process and Information Architectures

Last day of the Building Business Capability conference, and I attended Louise Harris’ session on process and information architectures as the missing link to improving enterprise performance. She was on the panel on business versus IT architecture that I moderated yesterday, and had a lot of great insight into business architecture and enterprise architecture.

Today’s session highlighted how business processes and information are tightly interconnected – business processes create and maintain information, and information informs and guides business processes – but that different types of processes use information differently. This is a good distinction: looking at what she called “transactional” (structured) versus “creative” (case management) versus “social” (ad hoc) processes, where transactional processes require exact data, but the creative and social processes may require interpretation of a variety of information sources that may not be known at design time. She showed the Burlton Hexagon to illustrate how information is not just input to be processed into output, but also used to guide processes, inform decisions and measure process results.

This led to Harris’ definition of a business process architecture as “defining the business processes delivering results to stakeholders and supported by the organization/enterprise showing how they are related to each other and to the strategic goals of the organization/enterprise”. (whew) This includes four levels of process models:

  • Business capability models, also called business service models or end-to-end business process models, which form the top level of the work hierarchy that defines what business processes are, but not how they are performed. Louise referenced this to a classic EA standpoint as row 1 of Zachman (in column 2).
  • Business process models, which provide deeper decomposition of the end-to-end models that tie them to the KPIs/goals. This has the effect of building process governance into the architecture directly.
  • Business process flow models, showing the flow of business processes at the level of logistical flow, such as value chains or asset lifecycles, depending on the type of process.
  • Business process scope models (IGOEs, that is, Inputs, Guides, Outputs, Enablers), identifying the resources involved in the process, including information, people and systems.

She moved on to discuss information architecture, and its value in defining information assets as well as content and usage standards. This includes three models:

  • Information concept model, with the top level of the information related to the business, often organized into domains such as finance or HR. For example, in the information domain of finance, we might have information subject areas (concepts) of invoicing, capital assets, budget, etc.
  • Information relationship model defines the relationships between the concepts identified in the information concept model, which can span different subject areas. This can look like an ERD, but the objects being connected are higher-level business objects rather than database objects: this makes it fairly tightly tied to the processes that those business objects undergo.
  • Information governance model, which defines what has to be done to maintain information integrity: governance structure, roles responsible, and policy and business standards.

Next was bringing together the process and information architectures, which is where IGOEs (business process scope models) come into play: they align information subject areas with top-level business processes or business capabilities, allowing identification of gaps between process and information. This creates a framework for ensuring alignment at the design and operational levels, but it does not map information subject areas to business functions, since that is too dependent on the organizational structure.

Harris presented these models as being the business architecture, corresponding to rows 1 and 2 of Zachman (for example), which can then be used to provide context for the remainder of the enterprise architecture and into design. For example, once these models are established, the detailed process design can be integrated with logical data models.

She finished up by looking at how process and information architectures need to be developed in lock step, since business process ensures information quality, while information ensures process effectiveness.

Accepting The Business Architecture Challenge with @logicalleap

Forrester analyst Jeff Scott presented at Building Business Capability on what business architecture is and what business architects do. According to their current research, interest in business architecture is very high – more than half of organizations consider it “very important”, and all organizations surveyed showed some interest – and more than half also have an active business architecture initiative. This hasn’t changed all that much since their last survey on this in 2008, although the numbers have crept up slightly. Surprisingly, less than half of the business architecture activities are on an enterprise-wide level, although if you combine that with those that have business architecture spanning multiple lines of business, it hits about 85%. When you look at where these organizations plan to take their business architecture programs, over 80% are planning for them to be enterprise-wide, but that number hasn’t changed in three years, suggesting that although the intention is there, it may not actually be happening with any speed.

He moved on to a definition of business architecture, and how it has changed in the past 15 years. In the past, it used to be more like business analysis and requirements, but now it’s considered an initiative (either by business, EA or IT) to improve business performance and business/IT alignment. The problem is, in my opinion, that the term “business/IT alignment” has become a bit meaningless in the past few years as every vendor uses it in their marketing literature. Process models are considered a part of business architecture by a large majority of organizations with a business architecture initiative, as are business capability models and business strategy, application portfolio assessments, organizational models and even IT strategy.

Business architecture has become the hot new professional area to get into, whether you’re a business analyst or an enterprise architect, which means that it’s necessary to have a better common understanding of what business architecture actually is and who the business architects are. I’m moderating a panel on this topic with three business/IT/enterprise architects today at 4:30pm, and plan to explore this further with them. Scott showed their research on what people did before they became (or labeled themselves as) business architects: most held a business analyst role, although many were also enterprise architects, application architects and other positions. Less than half of the business architects are doing it full time, so many may still be fulfilling some of those other roles in addition. Many of them reside in the EA group, and more than half of organizations consider EA to be responsible for the outcomes of business architecture.

Figuring out what business architects do involves a complex set of factors: some of them are working on enterprise-wide business transformation, while others are looking at efficiency within a business unit or project. The background of the business architect – that is, what they did before they became a business architect – can (obviously) hugely impact the scope and focus of their work as a business architect. In fact, is business architecture a function performed by many players, or is it a distinct role? Who is really involved in business architecture besides those who hold the title, and where do they fit in the organization? As Scott pointed out, these are unanswered questions that will plague business architecture for a few years still to come.

He presented several shifts to make in thinking:

  • Give up your old paradigms (think different; act different to get different results)
  • Start with “why” before you settle on the how and what
  • “Should” shouldn’t matter when mapping from “what is” to “what can be”
  • Exploration, not standardization, since enterprise architecture is still undergoing innovation on its way to maturity
  • Business architecture, not technology architecture, is what provides insight, risk management and leadership (rather than engineering, knowledge and management)
  • Stress on “business” in business architecture, not “architecture”, which may not fit into the EA frameworks that are more focused on knowledge
  • Focus on opportunity rather than efficiency, which is aligned with the shift in focus for BPM benefits that I’ve been seeing in the past few years
  • Complex problems need different solutions, including looking at the problems in context rather than just functional decomposition.
  • Solve the hard “soft” problems of building business architect skills and credibility, leveraging local successes to gain support and sponsorship, and overcoming resistance to change
  • Think like the business before applying architectural thinking to the business problems and processes

He finished up with encouragement to become more business savvy: not just the details of business, but innovation and strategy. This can be done via some good reading resources, finding a business mentor and building relationships, while keeping in mind that business architecture should be an approach to clarify and illuminate the organization’s business model.

He wrote a blog post on some of the challenges facing business architects back in July, definitely worth a read as well.

Strategic Synergies Between BPM, EA and SOA

I just had to attend Claus Jensen’s presentation on actionable architecture with synergies between BPM, EA and SOA, since I read two of his white papers in preparing the workshop that I delivered here on Wednesday on BPM in an EA context. I also found out that he has co-authored a new IBM Redbook on EA and BPM.

Lots of great ideas here about how planning (architecture) and solution delivery (BPM) are fundamentally different: you can’t necessarily transform statements and goals from architecture into functions in BPM, but information is passed in both directions between the two different lifecycles. I recommend that you read at least the first of the two white papers that I link to above, which is the short intro.

He went through descriptions of scenarios for aligning and interconnecting EA and BPM, also covered in the white papers, which are quite focused on building an (IBM-based) solution, but still contain some good nuggets of information.