Building a Business Architecture Capability and Practice Within Shell

For the first breakout of the day, I attended Dan Jeavons’ session on Shell’s business architecture practice. For such a massive company – 93,000 employees in 90 countries – this was a big undertaking, and they’ve been at this for five years.

He defines business architecture as the business strategy, governance, organization and key business process information, as well as the interactions between these concepts – a definition taken directly from TOGAF 9. In essence, business architecture involves design, must be implemented rather than remain purely conceptual, and requires flexibility based on business agility requirements. They started on their business architecture journey because of factors that affect many other companies: globalization, competition, regulatory requirements, realization of current inefficiencies, and the emergence of a single governance board for the multinational company.

Their early efforts centered on a huge new ERP system, especially the problems caused by local variations from the global standard process models. “Process” (and ERP) became naughty words to many people, with connotations of bloated, not-quite-successful projects. Building on the points of success from that effort, their central business architecture initiative actually started with process modeling/design: standard processes across the different business areas, incorporating global best practices. This was used to create and roll out a standard set of financial processes, with a small core team doing the process redesign and coordinating with IT to create a common metamodel and architectural standards. As they found out, many other parts of the company – HR, IT and others – had similar process issues, so they branched out to start building a business architecture for those areas as well.

They had a number of challenges in creating a process design center of excellence:

  • Limited experience with the tool and the methodology; initial projects weren’t sufficiently structured, which reduced the benefits.
  • Perceived value to the business, especially near-term versus long-term ROI.
  • Impact of new projects, and ensuring that they follow the methodology.
  • Governance and high-level sponsorship.

They also found a number of key steps to implementing their CoE and process architecture:

  • Sponsorship
  • Standard methodology, embedded within standard project delivery framework
  • Communication of success stories

Then, they migrated their process architecture initiative to a full business architecture by looking at the relationships to other elements of business architecture; this led them to do business architecture (mostly) as part of process design initiatives. Recent data quality/management initiatives have also brought a renewed focus on architecture, and Jeavons feels that although the past five years have been about process, the next several years will be more about data.

He showed a simplified version of their standard metamodel, including aspects of process hierarchy models, process flow models, strategy models and organization models. He also showed a high-level view of their enterprise process model in a value stream format, with core processes surrounded by governing and supporting processes. From there, he showed how they link the enterprise process model to their enterprise data catalogue, which links to the “city plan” of their IT architecture and portfolio investment cycle; this allows for traceability as well as transparency. They’ve also been linking processes to strategy – this is one of the key points of synergy between EA and BPM – so that business goals can be driven down into process performance measures.
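To make the traceability idea concrete, here’s a minimal sketch – purely my own illustration, not Shell’s actual metamodel, with all element names invented – of typed, bidirectional links between processes, data entities, systems and goals, so that a change to any one element can be traced to everything it touches:

```python
# Hypothetical traceability links between architecture elements; the
# element kinds and names are invented for illustration.
from collections import defaultdict

links = defaultdict(set)  # (kind, name) -> set of linked (kind, name) pairs

def link(a, b):
    """Record a bidirectional traceability link between two elements."""
    links[a].add(b)
    links[b].add(a)

link(("process", "Invoice to Cash"), ("data", "Customer"))
link(("process", "Invoice to Cash"), ("system", "ERP Financials"))
link(("process", "Invoice to Cash"), ("goal", "Reduce days sales outstanding"))

# Impact analysis: if the Customer data entity changes, what's affected?
print(links[("data", "Customer")])  # {('process', 'Invoice to Cash')}
```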

The EA and process design CoEs have been combined (interesting idea) into a single EA CoE, including process architects and business architects, among other architect positions; I’m not sure that you could include an entire BPM CoE within an EA CoE, due to BPM’s operational implementation focus, but there are certainly a lot of overlapping activities and functions, and there should be overlapping roles and resources.

He shared lots of great lessons learned, as well as a frank assessment of the problems that they ran into. I found it particularly interesting how they morphed a process design effort into an entire business architecture, based on their experience that the business really is driven by its processes.

Designing a Breakout Business Strategy

The keynote this morning was A Strategic Toolkit for Designing and Delivering A Breakout Strategy by Professor Thomas Lawton of EMLYON Business School. This was about business strategy, starting with a view of how different companies responded to the recent/ongoing recession: panic, protect, cloak or conquer, where the first three are reactive but with different results (negative, neutral and positive, respectively), and the last is proactive. He had examples of each; for example, how Sony used a “cloak” response to take business cutback measures that would have been difficult during good times, improving the business overall. He challenged the audience to consider which of the four responses our organizations have adopted, and some strategies for dealing with the current economic conditions. Although it’s not easy to think about success when you’re fighting for survival, you need to be proactively preparing for the inevitable upturn so as to be able to seize the market when it starts improving. I definitely started thinking about BPM at this point; organizations that implement BPM during a down market in order to control costs often find themselves well-positioned to improve their market share during the upswing because they are more efficient and more agile in responding to customer needs.

He introduced a few different tools that form a strategy system:

  • Identify your route to breakout and market success. He showed a quadrant comparing breakout styles, “taking by storm” and “laggard to leader” (often an ailing company that is turned around), against emergent and established markets; all of these indicate significant opportunities for growth. Again, he had great examples for each of these, and discussed issues of adapting these strategies to different corporate cultures and geographic/regulatory environments. He presented a second quadrant for those organizations that are staying out in front of their market, with the breakout styles “expanding horizons” and “shifting shape”, also against emergent and established markets. For each of the squares in each of these quadrants, he has an evocative moniker, such as “boundary breakers” or “conquistadors”, to describe the companies that fit that growth strategy profile.
  • Identify your corporate vision, providing a sense of purpose, and considering the viewpoints of all stakeholders. The vision wheel is his technique for finding the corporate vision by breaking down the organization, culture, markets and relationships into their constituent parts, considering both current and future state, and ending up with four worksheets across which you will see common threads that guide the future strategy. Vision can be a bit of a fuzzy concept, but it is a guiding star that is critical for direction setting and strategic coordination.
  • Align your value proposition with the needs of your customers. Aspire to create a “magnet company”: one that excites markets, attracts and retains customers, repels new entrants, and renders competitors unable to respond. This doesn’t mean that you have to be the best in all aspects of what you do, but you do have to lead in the features that your customers care about, drawn from the general areas of price, features, quality, support, availability and reputation.
  • Assemble an IT-enabled business model that is both efficient and effective; think about your business model as a vehicle for delivering your value proposition, and focus on alignment between those two. He discussed the six pillars of a business model: cost, innovation, reliability, relationships, channels and brand (which are just the other side of the six features discussed in the value proposition); some of these will emerge as your core competencies and become the source of competitive advantage.
  • Every business is both a techno and socio system: you need to consider both hard and soft aspects. He pointed out that it’s necessary to embed IT in strategy implementation, since almost all businesses these days are highly dependent on technology; technology can be used to realize an energized and productive socio-system (e.g., inspiring trust and loyalty) as well as an efficient and productive techno-system.

The breakout strategy system that he lays out has strategic leadership at the center, with products and programs, vision, value proposition, and business model surrounding it.

He finished up with the interaction between business and IT strategy:

  • Breakout strategies are enabled by IT
  • IT contributes to improved financial performance
  • IT supports strategy implementation

Unfortunately, only 19% of companies involve IT in the early strategy phase of growth initiatives; in other words, executives are not really considering how IT can help them with strategy. The impact of IT on business strategies, corporate structure and culture should be better understood. In particular, EA should be involved in strategy at this level, and BPM can be an important enabler of breakout strategies if that is understood early enough in the strategy development cycle.

Really great presentation, and I’ll definitely be tracking down some of his books for more reading on the topic.

By the way, some great tweets are starting to flow at the conference; you can find them at the hashtags #IRMBPM and #IRMEAC.

IRM BPM and EA Conferences Kickoff

Sally Bean and Roger Burlton opened IRM’s colocated BPM and EA conferences in London this morning with a tag-team presentation on the synergies between EA and BPM – fitting nicely with the 3-hour workshop that I gave yesterday on BPM in an EA context.

EA provides a framework for the transition from strategy to implementation. BPM – from architecture through implementation – is a process-centric slice that intersects EA at points, but also includes process-specific operational activities. They presented EA and BPM as collaborative, synergistic disciplines:

  • Common, explicit view of business drivers and business strategy
  • Shared understanding of business design
  • Disciplined approach to change prioritization and road maps
  • Coherent view of the enterprise through shared models
  • Monitoring fit between current performance and business environment

They briefly introduced John Zachman to the stage, but wouldn’t actually let him speak for more than a minute, because we’d never get to the keynote 😉. I had the pleasure of a conversation with John yesterday evening while having a drink with Roger and a few others (which was a bit weird because I had just been talking about his framework in my workshop, and this blog is named after the process column therein); during that time, I helped him get his iPhone onto the hotel wifi, which probably says something about the differences between EA and BPM…

Learning to Love BPMN 2.0

The last presentation of the IRM BPM London conference before the final panel: Chris Bradley and Tim Franklin of IPL presenting on BPMN 2.0. Bradley started with a brief history of BPMN, from its 1.0 release in 2004 by BPMI to the present-day 2.0 release, now under OMG. It was interesting to see their list of what BPMN does not do – state transitions, functional decomposition, organizational hierarchies and data modelling – which explains why some BPMS products are starting to build those functions into their integrated development environments to be defined along with the process models. [Note that although I normally use US spelling due to the geographic location of most of my audience, I’m using “modelling” here after Bradley pointed out that the US spelling, “modeling”, should rhyme with “yodeling” 🙂 ]

Franklin took over to get into the details of the notation, particularly the differences between the 1.x and 2.0 versions and the new elements and diagram types in 2.0. I’m not going to review all of that; there’s a lot of existing material both on this blog and in other locations, including a webinar that Robert Shapiro gave earlier this year on BPMN 2.0.

Bradley took the stage again to discuss all the other things that have to happen after you get started on BPMN 2.0, particularly modelling data and aligning that with the process models, whether that’s being done in an integrated tool or in two different modelling tools. I agree with him that it’s critical for process, data and organizational modelling efforts to be linked, although I think that’s more likely to happen via MDM than by having a single integrated modelling tool.

His summary said it all: BPMN is simple (if you can read a flowchart, you can understand BPMN); BPMN is robust (can be used for both high-level design/architecture and detailed process model execution/implementation); and most importantly, BPMN and process models are only part of the big picture, and need to be linked to other information assets such as data and organizational models.
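For those who haven’t looked under the covers: BPMN 2.0 also defines a standard XML serialization, which is part of what lets the same model span high-level design and execution. Here’s a minimal sketch of a trivial start→task→end process in that format – the process content is invented, and the parsing code just uses the Python standard library:

```python
# A trivial BPMN 2.0 process (start -> user task -> end), serialized in
# the standard BPMN 2.0 XML format and parsed with the standard library.
# The process content is invented for illustration.
import xml.etree.ElementTree as ET

NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"  # BPMN 2.0 model namespace

bpmn_xml = f"""
<definitions xmlns="{NS}" targetNamespace="http://example.com/bpmn">
  <process id="claimIntake" isExecutable="true">
    <startEvent id="start"/>
    <sequenceFlow id="flow1" sourceRef="start" targetRef="reviewClaim"/>
    <userTask id="reviewClaim" name="Review claim"/>
    <sequenceFlow id="flow2" sourceRef="reviewClaim" targetRef="end"/>
    <endEvent id="end"/>
  </process>
</definitions>
"""

root = ET.fromstring(bpmn_xml)
process = root.find(f"{{{NS}}}process")
for elem in process:
    # Strip the namespace from the tag for readable output
    print(elem.tag.split("}")[1], elem.get("id"))
```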

You may not have come out of this session actually loving BPMN 2.0, but at least you’ll respect it in the morning.

BPM for Small and Medium Businesses

Tom Bellinson of IT Methods looked at some of the reasons why BPM in small businesses (under 500 people) is different from that in larger businesses, based on some of the fundamental differences in how small and large businesses work, and therefore how they deal with process improvement. There are some advantages to looking at process improvement in large companies – more human and financial resources, and a longer time frame – but small businesses have the advantage that their processes can usually be understood and tackled end to end rather than piecemeal.

With fewer available resources, a big bang approach sometimes doesn’t work for small companies: the big bang is good for just getting it done all at once, but it causes a great deal more stress and requires more concentrated effort. Bellinson proposes using a big bang approach for the analysis and planning phases, then implementing using a more incremental approach. This can still cause significant stress, particularly during the analysis phase, when you may be challenging the company owner’s personality directly as it manifests in company culture and processes. Although process analysis always challenges management in some way in any sized company, when the owner personally created specific processes, and those processes are now being called out as broken, that can be uncomfortable all around.

To do a big bang approach to process mapping in a small business, it needs to be made the #1 priority in the company so that it doesn’t get pushed aside when the inevitable emergencies occur; you’ll also need to hire an external consultant to guide the process and gather the information, since the odds of those skills being inside your company and readily available are near zero. This is really a 2-4 week effort, not the months that it might take in a larger company, so although it will be a bit stressful and disruptive during that time, you need to bite the bullet and just get it done. The analysis itself isn’t unique to small businesses – map as-is processes, find and eliminate the non-value-added activities, determine ROI – but sometimes the roles are a bit different, with the process consultant actually doing the process improvement exercise and presenting it to the company, rather than internal participants being involved in the reengineering efforts.

I’ve been approached by a few smaller businesses lately that were interested in BPM, and I think that the tools are finally at a price point where SMBs can consider implementing BPM inside their organizations. I agree with Bellinson that many of the techniques are just different for smaller businesses; having started and run two small businesses of up to 40-50 people, I can certainly understand how the owner’s personality can have a big influence on the corporate culture, and therefore on the way that business process improvement has to happen. However, there are still a lot of standard BPM principles and methodologies that can be applied, just on a smaller scale.

SaaS BPM at Surrenda-link

Bruce Spicer of Keystar Consultancy presented on a project that he did with Surrenda-link Investment Management to implement Appian cloud-based BPM for the process of procuring US life settlement assets (individual life insurance policies) to become part of their investment funds. They were specifically looking at a software-as-a-service offering in order to reduce cost and risk (considering the small size of their IT group), since SaaS allows them to scale up and down seamlessly without increasing costs significantly. They’ve built their own portal/user interface, using Appian Anywhere as the underlying process and analytics engine; it surprises me a bit that they’re not using more of the out-of-the-box UI.

They were over time and over budget, mostly because they (admittedly) screwed up the process mapping due to immature processes, inexperience with process analysis, and inexperience with gathering requirements as opposed to just documenting the as-is state. Even worse, someone senior signed off on these incorrect process models, which were then used for initial development in the proof of concept before corrections were made. They made some methodology corrections after that, improving their process analysis by looking at broad processes before doing a detailed view of a functional silo, and moving to agile development methodologies. Even with the mistakes that were made, they’re in production and on track to achieve their three-year ROI.

This should have been a compelling case study, but maybe because it was just after lunch, or maybe because his presentation averaged 120+ words per slide, I had a hard time getting into it.

Resolving Case Management Challenges with Dynamic BPM

Dermot McCauley of Singularity discussed case management and its need for dynamism. He’s one of the co-authors of Mastering the Unpredictable: How Adaptive Case Management Will Revolutionize The Way That Knowledge Workers Get Things Done, and started with a definition of case management:

Case management is the management of long-lived collaborative processes that require secure coordination of knowledge, content, correspondence and resources to achieve an objective or goal. The path of execution cannot be predefined. Human judgment is required in determining how to proceed, and the state of a case can be affected by external events.

As he pointed out, cases are inherently unpredictable, emerging and changing over time, and must allow case workers to chart their own course through the process of managing the case, deciding on the right tasks to do and the right information to include at the right time. He discussed 14 key characteristics of case management, including “goal driven”, “information complexity” and “fluid participants and roles”, and how a case management technology platform must include aspects of BPMS, ECM and collaboration technologies in order to effectively support knowledge workers. He also discussed the criticality of a case’s history, even more so than with structured processes, since cases are typically long-running and might have several workers added partway through the case timeline. Case workers need a flexible work environment, since that’s the nature of their work, which means that they need to be able to configure their own desktop environment via mashup-like functionality in order to organize their work effectively.

He also showed a bit of their own product; it was interesting to see that there is a process map underlying a case, with a “happy path” showing what the case should be doing, but providing the user at any point with the ability to skip forward or back in the process map, initiate other (predefined) tasks, reassign the task to another user, and change case characteristics such as priority and expected completion time. This is not purely unstructured process, where there is no predefined model, but dynamic BPM, where the model is predefined but can be readily changed while in flight. They have implemented a solution with the UK Insolvency Service, dealing with individual bankruptcy; this was targeted at a new low-cost program that the Insolvency Service was putting in place to handle the large number of low-asset individual insolvency cases in the face of the recent economic crisis. They used an agile approach, moving the case files from paper to electronic, and providing a more flexible and efficient case management process that was live within 12 months of the original government legislation that enacted the program.
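As a thought experiment, here’s roughly what that style of dynamic BPM boils down to – my own minimal sketch, not Singularity’s actual product model, with all names invented: a case with a predefined happy path that the worker can jump around, reassign and reprioritize, with every deviation captured in the case history:

```python
# An illustrative dynamic-BPM case: a predefined happy path that a case
# worker can deviate from at runtime. All names are invented.
from dataclasses import dataclass, field

@dataclass
class Case:
    case_id: str
    happy_path: list          # predefined task sequence (the process map)
    position: int = 0         # current step on the happy path
    assignee: str = "unassigned"
    priority: str = "normal"
    history: list = field(default_factory=list)  # case history is critical

    def current_task(self):
        return self.happy_path[self.position]

    def skip_to(self, task):
        """Jump forward or back on the predefined process map."""
        self.history.append(f"skipped from {self.current_task()} to {task}")
        self.position = self.happy_path.index(task)

    def reassign(self, worker):
        self.history.append(f"reassigned from {self.assignee} to {worker}")
        self.assignee = worker

case = Case("INS-0001", ["register", "assess", "decide", "close"])
case.reassign("alice")
case.skip_to("decide")  # worker judgment overrides the model, but is logged
print(case.current_task(), case.history)
```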

Bridging Process Modeling and IT Solutions Design at adidas

Eduardo Gonzalez of the adidas Group talked about how they are implementing BPM within their organization, particularly the transition from business process models to solution design, which ties in nicely with the roundtable that I moderated yesterday. The key issue is that process models are created for the purpose of modeling the existing and future business processes, but the linkage between them and requirements documents – and therefore on to solution design – is tenuous at best. One problem is traceability: there is no way to connect the process models to the thick stack of text-based requirements documents, or the requirements documents to the solution modules; this means that when something changes in a process model, it’s difficult to propagate that change through to the requirements and solution design. Also, the requirements leave a bit too much to the developers’ imaginations, so often the solution doesn’t really meet the requirements.

The question becomes how to insert the business process models into the software development lifecycle. Different levels of the process model are required, from high-level process flows to executable workflows; they wanted to tie this into their V-cycle model of solution design and development, which appears to be a modified waterfall model with integrated testing. Increasingly granular process models are built as the solution design moves from requirements and architecture to design and implementation; the smaller, more granular process building blocks, translated into solution building blocks, are then reassembled into a complete solution that includes a BPMS, a rules engine, a portal, and several underlying databases and other operational systems that are orchestrated by the BPMS.

Gonzalez has based some of their object-driven project decomposition methods on Martyn Ould’s Business Process Management: A Rigorous Approach, although he found some shortcomings in that approach and modified it to suit adidas’ needs. Their approach uses business and solution objects in an enterprise architecture sort of way (not surprising when he mentioned at the end of the presentation that he is an enterprise architect), moving from purely conceptual object models to logical object models to physical object models. Once the solution objects have been identified, they model each object’s states through its lifecycle, plus object handling cases (analogous to use cases) that describe how the system handles an object through its full lifecycle, including both system and human interaction; a sketch of the state-lifecycle idea follows below. He made the point that you have to have the linkage to master data; this is becoming recognized as a critical part of process applications, and some BPMS vendors are starting to consider MDM connectivity.
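As a rough illustration of what the object state lifecycle modeling boils down to (my own invented example, not adidas’ actual models): each solution object carries an explicit set of allowed state transitions, and an object handling case – whether a system step or a human task – can only move the object along those transitions:

```python
# An invented solution object ("order") with an explicit state lifecycle;
# any handling step, system or human, must follow the allowed transitions.
ORDER_LIFECYCLE = {
    "created":   {"validated", "rejected"},
    "validated": {"fulfilled", "cancelled"},
    "fulfilled": set(),   # terminal states below
    "rejected":  set(),
    "cancelled": set(),
}

class SolutionObject:
    def __init__(self, initial="created"):
        self.state = initial

    def transition(self, target):
        """Move the object to a new state if the lifecycle allows it."""
        if target not in ORDER_LIFECYCLE[self.state]:
            raise ValueError(f"illegal transition: {self.state} -> {target}")
        self.state = target

order = SolutionObject()
order.transition("validated")
order.transition("fulfilled")
print(order.state)  # fulfilled
```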

The end solution includes a portal, BPMS, BRMS, ESB, MDM, BI and back-end systems – a fairly typical implementation – and although the cycle for moving from process model to solution design isn’t automated, at least they have a methodology that they use to ensure that all the components are covered and in synchronization. Specific models at particular points in their cycle include models from multiple domains, including process and data. They did a proof of concept with this methodology last year, and are currently running a live project using it, further refining the techniques.

Their cycle currently includes the model and execute phases of a standard BPM implementation cycle; next, they want to take on the monitor and optimize phases, and add modeling techniques to derive KPIs from functional and non-functional requirements. They also plan to look at more complex object state modeling techniques, as well as how adaptive case management fits into some of their existing concepts.

I posed a question at the end of my roundtable yesterday: if a tool existed that allowed for the definition of the process model, user interface, business rules and data model, then generated an executable system from that, would there still be a need for written requirements? Once we got past the disbelief that such tools exist (BPMS vendors – you have a job to do here), the main issue identified was one of granularity: some participants in the process modeling and requirements definition cycle just don’t need to see the level of detail that will be present in these models at an executable level. Obviously, there are still many challenges in moving seamlessly from conceptual process models to an executable process application; although some current BPMS provide a partial solution for relatively simple processes, this typically breaks down as processes (and related integrations) become more complex.

Conversation with Keith Harrison-Broninski

You may have noticed that I haven’t been blogging for the first two days of the IRM BPM conference here in London: that’s because I gave a half-day seminar on the BPM technology landscape on Monday, then presented a session on collaboration and BPM yesterday morning, then moderated a roundtable on transforming process models to IT requirements yesterday afternoon. Last night, a small group of us had dinner at the lovely Institute of Directors club, where we had a fascinating conversation about all things related to BPM – off the record, of course. 🙂

This morning, we started the day with Roger Burlton, the conference organizer, interviewing Keith Harrison-Broninski about the future of work. Keith, whom I first met at the BPMG conference here in London four years ago, created the theory of Human Interaction Management (HIM), with the idea that you start with the complex human relationships – strategy, goals and deliverables – and work your way out to the transactional stuff. In other words, get a handle on the collaborative human-to-human processes first, with no technology involved, then use the successes in that sort of process improvement to gain support for the greater funding and time commitments required for implementing a BPMS. When Roger said that HIM sounds a lot like project management, Keith replied that project management is a use case of HIM.

Keith comes across as a bit of an old-school technophobe: he pooh-poohs blogging, tweeting and all other social media, and (based on his involvement in my roundtable yesterday afternoon) considers BPMS implementations to take much too long and cost too much, although he appears to have little practical experience with any modern-day model-driven BPMS. That aside, he does have some interesting ideas that get back to the definition of BPM that we all give lip service to, but often ignore: the management practice of improving processes, separate from the technology. This is about knowledge work, however, not routine work: people are given goals and deliverables, and work out how to achieve those based on their own knowledge. He refers to these as information-based processes, and everything that could be represented by a process model as task-based processes, where the mundane task-based processes are merely programs (in the software sense) to be implemented with much time and effort by the lowly engineers and developers. The answer to all this, of course, is his software, HumanEdj, and the workshops and services that he provides to help you implement it.

An interesting discussion, showing some of the huge gaps that exist in BPM today, especially between how we deal with knowledge work versus routine work.