BPM: The New Language Of IT To Business Technology

Alex Peters and Derek Miers presented in the business process track with a session on BPM as the new language of IT to business technology. Forrester has been pushing the phrase “business technology” instead of “information technology” for the past year or so, and it was funny this morning to hear John Rymer say that he didn’t like the term at first, but he’s really bought into it now, since it really describes the role of IT in supporting the business, rather than off in their own world.

Peters discussed three recent technologies that have become game changers: social computing to expand the customer interaction channels, dynamic business applications for cross-functional processes, and the cloud as a delivery platform. There are also new key actors in business process transformation initiatives, characterized as VP process improvement (“change agent”), business architect (“guru”), process architect (“prodigy”), business analyst (“wannabe”), and manager of IT business systems (“operator”). Business analyst = “wannabe”? That’s gotta hurt, although it was Forrester that claimed that more than half of all business analysts couldn’t cut it as a process analyst.

In moving to this new world order, where technology is focused on business, it’s necessary to evaluate the maturity of the organization’s business process management, and start the journey by eliminating waste in processes. Suddenly, this is sounding a lot like Lean. He showed some examples of companies at various stages of the journey: an organization with immature processes, where IT uses a plan-build-run structure; an aspiring organization starting to move from reactive to proactive, with more of a demand-supply structure and the beginnings of centers of excellence; and an organization with mature process management, leveraging cross-business process centers of excellence and shared services.

Miers took over to explain how the language of BPM can be used along this journey to process maturity and a business technology focus. He’s not talking about a graphical notation for process models like BPMN; he’s talking about the natural language words that we use to describe processes and process improvements, and how we need to decide what they mean. In other words, what do you mean by process? Task? Process model? Object? Capability? And does everyone else in your organization use the same words to describe the same concepts? If not, you’re going to have some alignment problems, since language is key to building a common understanding between different roles.

He stepped through each of the five actors, the challenges that they encounter in doing their business transformation work, and the language that they need to use to describe their world. Although the call to action at the end was to do your process maturity assessment and portfolio analysis, there was little of that in the rest of the presentation.

A bit of a meta topic, and a bit unfocused due in part to logistical problems at the beginning of the session, but some interesting nuggets of information.

Fidelity Investments’ Evolution To Product-Focused Software Delivery

Darrell Fernandes, SVP of advisory solutions technology at Fidelity Investments, finished up the morning at Forrester’s BP&AD Forum with a discussion of their IT transformation: how they changed their software delivery process to become more like a software product company. They created “fences” around their projects in terms of centers of excellence and project management offices, with the idea that this would drive excellence on their projects; what they found is that the communication overhead started to bog them down, and that the silos of technology expertise became obsolete as technologies became more integrated. This is a really interesting counterpoint to Medco’s experience, where they leveraged the centers of excellence to create a more agile enterprise.

For Fidelity, the answer was to structure their software delivery to look more like that of a software product company, rather than focusing specifically on projects. They looked at and introduced best practices not just from other organizations like themselves, but also from software companies such as Microsoft. Taking a broader product portfolio view, they were able to look for synergies across projects and products, as well as take a longer-term, more disciplined view of the product portfolio development. A product vision maps to the product roadmap, then to the release plans, then ties into the project high-level plans. They’ve created an IT product maturity model, moving through initiation, emerging, defined, managed and optimizing; Fernandes admitted that they don’t have any in the optimizing category, but told about how they’ve moved up the maturity scale significantly in the past few years. They also started as an IT-led initiative before coming around to a business focus, and he recommends involving the business from the start, since their biggest challenges came when they started the business engagement so far along in their process.

They’ve had some cultural shifts in moving to the concept of IT products, rather than IT providing services via projects to the business, and disengaged the project/product cycle from annual IT budgets. Also, they drove the view of business capabilities that span multiple IT products, rather than a siloed view of applications that tended to happen with a project and application-oriented view. Next up for them is to align the process owners and product owners; he didn’t have any answers yet about how to do that, since they’re just starting on the initiative. They’re a long way from being done, but are starting to shift from the mode of IT process transformation to that of it just being business as usual.

Interesting view of how to shift the paradigm for software development and delivery within large organizations.

Texas Education Agency’s Process Transformation Journey

After a somewhat lengthy introduction by Marie Wieck from IBM’s middleware group, Rick Goldgar, CTO of the Texas Education Agency, talked about their process transformation. This was mostly about good software development practices – componentize, use a shared bus, agile methods, providing tools that empower the users to create their own solutions – but also about focusing on business process rather than UI when first prototyping. They start with a business process model of all business activities, then an implementation model to show what will be automated, then an operational model that translates directly to BPEL for execution. This idea of different perspectives on the process model is key to success at process modeling, but I hope that they’re using tools that allow for a shared model or some sort of automated translation, not having to recreate the process model three times.
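The three-perspective idea can be sketched as one shared model with per-activity annotations, so the implementation and operational views are derived rather than redrawn. This is an illustrative Python sketch, not TEA’s actual tooling; the activity and service names are hypothetical:

```python
# One shared process model, annotated per activity; the implementation
# and operational views are projections of it, not separate redraws.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Activity:
    name: str
    automated: bool = False        # implementation-model annotation
    service: Optional[str] = None  # operational-model binding (e.g. a service endpoint)

@dataclass
class ProcessModel:
    name: str
    activities: list = field(default_factory=list)

    def implementation_view(self):
        # Everything the business does, flagged as automated or manual.
        return [(a.name, "automate" if a.automated else "manual")
                for a in self.activities]

    def operational_view(self):
        # Only the automated activities survive into the executable layer.
        return [(a.name, a.service) for a in self.activities if a.automated]

model = ProcessModel("Grant Application", [
    Activity("Receive application", automated=True, service="IntakeService"),
    Activity("Review eligibility"),  # stays human
    Activity("Issue decision", automated=True, service="DecisionService"),
])
```

The point of the single source of truth is that changing one annotation updates every downstream view, instead of editing three diagrams by hand.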

In addition to their core WebSphere process modeling, they had the happy accident of using products that were eventually bought by IBM, such as ILOG and Cognos, so IBM is actually doing some of their integration for them as the product portfolio matures. Goldgar pointed out that it’s critical to choose technologies that integrate well; a timely comment after hearing a presenter from Scotiabank at a seminar earlier this week say that “most vendors integrate really well with themselves”. 🙂

In response to an audience question about the speed of system changes, he responded that many of the changes now are not limited by the technology – they can change a business rule or report format in minutes – but by the users of the technology, who may need to be trained on the changes, or consider the full business impact of the changes relative to governing regulations. That’s the way it should be: the speed of technology change shouldn’t get in the way of the speed of business change.

Medco’s Agile Enterprise

Kenny Klepper, president and COO of Medco, gave the second keynote at the Forrester BP&AD Forum today on their business transformation. I saw him speak at PegaWorld earlier this year (and Pega even published the video), so this was a good update on what they’re doing – check out those references for background on Medco and more information.

They’ve created an enabling architecture – frameworks, service bus, data fabric, data management and data warehouse – that enables their agile enterprise, and he believes that this level of supporting technology just couldn’t have been created a few years ago. We’ve reached a tipping point, where the technology has empowered the business for self-service, leading back to the themes from the opening keynote earlier: they’ve moved hundreds of people out of their IT groups and into centers of excellence, turning them into mentors and innovators rather than just back-room techies. What’s key is that they didn’t create some new group for the fancy new technology, but changed the roles of their existing people to allow them to take on the challenge. This resulted in business process centers of excellence, business innovation and agility centers, core IT centers of excellence, and operations centers, all working in concert. They don’t see this as a technology play, however; it’s financial: he sees these four centers as key to their return on invested capital, and an earnings-generating activity.

They use embedded “imagineers” with no technology constraints and rapid prototyping tools to rethink their business processes, not just apply some incremental process improvement techniques. This links up the market view of the business innovation and agility centers with the internal view of their operations centers, then pushes the innovations back through the business process and core IT centers of excellence. The result: they’re seeing business changes in days, not weeks or months.

This new agile enterprise structure has changed how they deploy and manage capital: they allocate capital both for growth and productivity, rewarding agile methods in order to incent movement away from legacy projects and into the new infrastructure. Interesting idea: choke off the funds going to the old legacy development, and people will start to focus on moving off the platform.

This has obviously been a huge success for them, both financially and in the enthusiasm of the people in their organization: the video clips that he showed were mostly of the business people who are impacted positively by this, one of whom claimed that you would have to “pry [the system] out of my cold dead hands”. We should all have such passionate stakeholders.

Forrester BP&AD Forum Keynote: The Empowered Future

I’m in DC at the Forrester Business Process and Application Delivery Forum – always a good conference in my experience – and Connie Moore opened the event with the morning keynote on business transformation and IT transformation. She showed some really great imagery about agility: a video clip of running water to represent where we should be, moving easily within a fluid environment, then a still shot of boot-covered feet mired in concrete. Also a good quote from someone at Linklaters:

Business transformation is not a series of discrete process improvement efforts.

That’s a great point, since we sometimes get too focused on a specific process improvement project and lose sight of the bigger picture of improving our entire organization.

Up next were John Rymer and Mike Gualtieri to talk about succeeding – and leading – in the empowered future. Empowerment is a big theme here, which I’m sure isn’t exactly a coincidence, given the recent release of Empowerment by Forrester’s Josh Bernoff and Ted Schadler. They talked about the rise of social media in empowerment, such as how Heather Armstrong kicked Whirlpool’s butt over a broken new washing machine via her hugely popular blog (although they neglected to mention why she ended up with 1.5 million Twitter followers, which is a great story on its own), and about finding the empowered people within your company and your customer base. They point out that empowered people accelerate everything, by creating crises (I admit to doing that sometimes myself 🙂 ) and by publicly promoting those who respond appropriately. We need to have empowered (or rather, empowering) technology and empowered employees in order to properly engage with empowered customers; otherwise, we risk missing out on the conversation altogether and allowing an empowered competitor to take over.

For many organizations, the old non-agile ways haven’t been working all that well. Business is going around IT to get things done, and innovation is at a standstill. They have four recommendations for achieving a newer, more responsive organization:

  1. Design for faster change. This allows you to change at the pace that the business requires, which virtually assures business-IT alignment. The keys here are flexible platforms and tools that enable continuous transformation, and allow business professionals to share the responsibility of delivery. Create ever-evolving programs that deliver streams of value.
  2. Get passionate about people experience. Experiences need to be useful, usable and desirable, allowing people to accomplish their goals, easily perform tasks, and enjoy their tasks. That’s right, enjoyment of the experience actually makes a difference, both for your customers and your employees.
  3. Deliver smart solutions. This is about creating solutions that have a lot of flexibility built in to allow the business people to configure and extend them, through goal-driven processes rather than strictly structured processes. Events and analytics have a big part of this, by delivering key information at the right time to process participants, using suggestions for guided experiences as well as awareness of the process context. The result: huge productivity gains, both for IT (who do less development) and business (who can do more without having to wait for IT to change the applications).
  4. Make proposals to the business. Innovation comes from a combination of business and technology knowledge, and IT needs to learn the business in a very deep way in order to be able to recommend new technologies that will really make a difference. I can personally attest to this: my work with clients, which is a lot about helping implement BPM technology, relies on me having a deep understanding of what the business does; otherwise, I can’t visualize a solution that will have a significant impact on their business. That means that by the end of a project, I can do the job of half the people in the business area: knowledge that I’m unlikely to use every day, but invaluable in helping them to innovate their business. To generalize, the right combination of analytic skills, technology know-how and business knowledge allows IT professionals to propose breakthrough innovations that the business just won’t come up with on their own because they didn’t even know that they were possible.

They were directly addressing the IT professionals in the crowd; given that this is also a conference on business process, I’m not sure that’s everyone who’s here, but great suggestions nonetheless.

They finished with some thoughts on changing language from the old school IT speak as part of creating the new empowered ways:

  • “User” becomes “Person” to stop some of the alienation between business and IT
  • “Project” becomes “Program”, which requires a change in focus as well as language
  • “Application” becomes “Business capability”, since the iPhone has ruined the word “app” for us 😉
  • “IT (Information Technology)” becomes “BT (Business Technology)”, since it’s really about the business, not just the information underlying the business
  • Industrial metaphors become creative metaphors, since we’re not just cogs in the wheels of business – a message on the Twitter stream suggested that we do away with “Lean” while we’re at it

This was a call to arms for IT to do things better, and lead us to the empowered future.

Learning to Love BPMN 2.0

The last presentation of the IRM BPM London conference before the final panel, and Chris Bradley and Tim Franklin of IPL are presenting on BPMN 2.0. Bradley started with a brief history of BPMN from its 1.0 release in 2004 by BPMI to the present day 2.0 release, now under OMG. It was interesting to see their list of what BPMN does not do: state transitions, functional decomposition, organizational hierarchies and data modelling, which explains why some BPMS products are starting to build those functions into their integrated development environment to be defined along with the process models. [Note that although I normally use US spelling due to the geographic location of most of my audience, I’m using “modelling” here after Bradley pointed out that the US spelling, “modeling”, should rhyme with “yodeling” 🙂 ]

Franklin took over to get into the details of the notation, particularly the differences between the 1.x and 2.0 versions and the new elements and diagram types in 2.0. I’m not going to review all of that; there’s a lot of existing material both on this blog and in other locations, including a webinar that Robert Shapiro gave earlier this year on BPMN 2.0.

Bradley took the stage again to discuss all the other things that have to happen after you get started on BPMN 2.0, particularly modelling data and aligning that with the process models, whether that’s being done in an integrated tool or two different modelling tools. I agree with him that it’s critical for process, data and organizational modelling efforts to be linked, although I think that’s more likely to happen via MDM rather than by having a single integrated modelling tool.

His summary said it all: BPMN is simple (if you can read a flowchart, you can understand BPMN); BPMN is robust (can be used for both high-level design/architecture and detailed process model execution/implementation); and most importantly, BPMN and process models are only part of the big picture, and need to be linked to other information assets such as data and organizational models.
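For readers who haven’t seen BPMN 2.0’s serialized form, the “if you can read a flowchart, you can understand BPMN” claim holds up at the XML level too. Here’s a minimal process parsed with Python’s standard library (an illustrative sketch; the element IDs and names are made up):

```python
# A minimal BPMN 2.0 process – start event, one task, end event –
# parsed with the standard library to list its tasks and flows.
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"
doc = f"""
<definitions xmlns="{BPMN_NS}">
  <process id="claims" isExecutable="false">
    <startEvent id="start"/>
    <task id="review" name="Review claim"/>
    <endEvent id="end"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="review"/>
    <sequenceFlow id="f2" sourceRef="review" targetRef="end"/>
  </process>
</definitions>
"""

root = ET.fromstring(doc)
ns = {"bpmn": BPMN_NS}
tasks = [t.get("name") for t in root.iterfind(".//bpmn:task", ns)]
flows = [(f.get("sourceRef"), f.get("targetRef"))
         for f in root.iterfind(".//bpmn:sequenceFlow", ns)]
```

The same serialization format carries both the high-level diagram and the execution detail, which is exactly the “robust” half of Bradley’s summary.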

You may not have come out of this session actually loving BPMN 2.0, but at least you’ll respect it in the morning.

BPM for Small and Medium Businesses

Tom Bellinson of IT Methods looked at some of the reasons why BPM in small business (under 500 people) is different from that in larger businesses, based on some of the fundamental differences in how small and large businesses work, and therefore how they deal with process improvement. There are some advantages to looking at process improvement in large companies – more human and financial resources, and a longer time frame – but small businesses have the advantage that usually their processes can be understood and tackled end to end rather than piecemeal.

Because of fewer available resources, a big bang approach sometimes doesn’t work for small companies: the big bang is good for just getting it done all at once, but causes a great deal more stress and requires more concentrated effort. Bellinson proposes using a big bang approach for the analysis and planning phases, then implementing using a more incremental approach. This can still cause significant stress, particularly during the analysis phase when you may be challenging the company owner’s personality directly as it manifests in company culture and processes. Although process analysis always challenges management in some way in any sized company, when the owner personally created specific processes, and those processes are now being called out as broken, that can be uncomfortable all around.

To do a big bang approach to process mapping in a small business, it needs to be made a #1 priority in the company so that it doesn’t get pushed aside when the inevitable emergencies occur; you’ll also need to hire an external consultant to guide this process and gather the information, since the odds of those skills being inside your company and readily available are near zero. This is really a 2-4 week effort, not the months that it might take in a larger company, so although it will be a bit stressful and disruptive during that time, you need to bite the bullet and just get it done. The analysis itself isn’t unique to small businesses – map as-is processes, find and eliminate the non-value added activities, determine ROI – but sometimes the roles are a bit different, with the process consultant actually doing the process improvement exercise and presenting it to the company, rather than internal participants being involved in the reengineering efforts.

I’ve been approached by a few smaller businesses lately who were interested in BPM, and I think that the tools are finally at a price point that SMBs can consider implementing BPM inside their organizations. I agree with Bellinson that many of the techniques are just different for smaller businesses; having started and run two small businesses up to 40-50 people in size, I can certainly understand how the owner’s personality can have a big influence on the corporate culture and therefore the way that business process improvement has to happen. However, there are still a lot of standard BPM principles and methodologies that can be applied, just on a smaller scale.

SaaS BPM at Surrenda-link

Bruce Spicer of Keystar Consultancy presented on a project that he did with Surrenda-link Investment Management to implement Appian cloud-based BPM for the process around procuring US life settlement assets (individual life insurance policies) to become part of their investment funds. They were specifically looking at a software as a service offering for this, in order to reduce cost and risk (considering the small size of their IT group), since SaaS allows them to scale up and down seamlessly without increasing costs significantly. They’ve built their own portal/user interface, using Appian Anywhere as the underlying process and analytics engine; it surprises me a bit that they’re not using more of the out-of-the-box UI.

They ran over time and over budget, mostly because they (admittedly) screwed up the process mapping due to immature processes, inexperience with process analysis, and inexperience with gathering requirements versus just documenting the as-is state. Even worse, someone senior signed off on these incorrect process models, which were then used for initial development in the proof of concept before corrections were made. They made some methodology corrections after that, improving their process analysis by looking at broad processes before doing a detailed view of a functional silo, and moving to Agile development methodologies. Even with the mistakes that were made, they’re in production and on track to achieve their three-year ROI.

This should be a compelling case study, but maybe because it was just after lunch, or maybe because his presentation averaged 120+ words per slide, I had a hard time getting into this.

Resolving Case Management Challenges with Dynamic BPM

Dermot McCauley of Singularity discussed case management and its need for dynamism. He’s one of the co-authors of Mastering the Unpredictable: How Adaptive Case Management Will Revolutionize The Way That Knowledge Workers Get Things Done, and started with a definition of case management:

Case management is the management of long-lived collaborative processes that require secure coordination of knowledge, content, correspondence and resources to achieve an objective or goal. The path of execution cannot be predefined. Human judgment is required in determining how to proceed, and the state of a case can be affected by external events.

As he pointed out, cases are inherently unpredictable, emerging and changing over time, and must allow case workers to chart their own course through the process of managing the case, deciding on the right tasks to do and the right information to include at the right time. He discussed 14 key characteristics of case management, including “goal driven”, “information complexity” and “fluid participants and roles”, and how a case management technology platform must include aspects of BPMS, ECM and collaboration technologies in order to effectively support the knowledge workers. He also discussed the criticality of the history of a case, even more so than with structured processes, since cases are typically long-running and might include several workers brought in partway through the case timeline. Case workers need a flexible work environment, since that’s the nature of their work, which means that they need to be able to configure their own desktop environment via mashup-like functionality in order to organize their work effectively.

He also covered off a bit of their own product; interesting to see that there is a process map underlying a case, with a “happy path” showing what the case should be doing, but providing the user at any point with the ability to skip forward or back in the process map, initiate other (pre-defined) tasks, reassign the task to another user, and change case characteristics such as priority and expected completion time. This is not purely unstructured process, where there is no predefined model, but dynamic BPM where the model is predefined but can be readily changed while in flight. They have implemented a solution with the UK Insolvency Service, dealing with individual bankruptcy; this was targeted at a new low-cost program that the Insolvency Service was putting in place to handle the large number of low-asset individual insolvency cases in the face of the recent economic crisis. They used an agile approach, moving the case files from paper to electronic and providing a more flexible and efficient case management process that was live within 12 months of the original government legislation that enacted the program.
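The “happy path plus in-flight deviation” idea can be sketched in a few lines. This is illustrative Python, not Singularity’s actual product API; the step names are hypothetical:

```python
# A case follows a predefined happy path, but a worker can jump
# forward or back, or insert a pre-defined task while in flight.
# Every action is recorded, since case history is critical.
class DynamicCase:
    def __init__(self, happy_path):
        self.path = list(happy_path)
        self.pos = 0
        self.history = []

    def complete_current(self):
        self.history.append(("done", self.path[self.pos]))
        self.pos = min(self.pos + 1, len(self.path) - 1)

    def jump_to(self, step):
        # Skip forward or back anywhere on the predefined map.
        self.history.append(("jump", step))
        self.pos = self.path.index(step)

    def add_task(self, task, after=None):
        # Initiate another pre-defined task mid-flight.
        idx = self.path.index(after) + 1 if after else self.pos + 1
        self.path.insert(idx, task)
        self.history.append(("added", task))

case = DynamicCase(["intake", "assess", "decide", "close"])
case.complete_current()              # intake done, now at "assess"
case.add_task("request documents")   # inserted after the current step
case.jump_to("decide")               # worker skips ahead
```

The distinction from purely unstructured process is visible in the code: the model exists up front, and the worker’s freedom is expressed as controlled deviations from it.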

Bridging Process Modeling and IT Solutions Design at adidas

Eduardo Gonzalez of the adidas Group talked about how they are implementing BPM within their organization, particularly the transition from business process models to designing a solution, which ties in nicely with the roundtable that I moderated yesterday. The key issue is that process models are created for the purpose of modeling the existing and future business processes, but the linkage between that and requirements documents – and therefore on to solution design – is tenuous at best. One problem is with traceability: there is no way to connect the process models to the thick stack of text-based requirements documents, and from the requirements documents to the solution modules; this means that when something changes in a process model, it’s difficult to propagate that through to the requirements and solution design. Also, the requirements leave a bit too much to the developers’ imaginations, so often the solution doesn’t really meet the requirements.

The question becomes how to insert the business process models into the software development lifecycle. Different levels of the process model are required, from high-level process flows to executable workflows; they wanted to tie this in to their V-cycle model of solution design and development, which appears to be a modified waterfall model with integrated testing. Increasingly granular process models are built as the solution design moves from requirements and architecture to design and implementation; the smaller and more granular process building blocks, translated into solution building blocks, are then reassembled into a complete solution that includes a BPMS, a rules engine, a portal, and several underlying databases and other operational systems that are being orchestrated by the BPMS.

Gonzalez has based some of their object-driven project decomposition methods on Martyn Ould’s Business Process Management: A Rigorous Approach, although he found some shortcomings to that approach and modified it to suit adidas’ needs. Their approach uses business and solution objects in an enterprise architecture sort of approach (not surprising when he mentioned at the end of the presentation that he is an enterprise architect), moving from purely conceptual object models to logical object models to physical object models. Once the solution objects have been identified, they model the object states through its lifecycle, and object handling cases (analogous to use cases) that describe how the system handles an object through its full lifecycle, including both system and human interaction. He made the point that you have to have the linkage to master data; this is becoming recognized as a critical part of process applications now, and some BPMS vendors are starting to consider MDM connectivity.
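Object-state modeling of this kind can be sketched as an explicit state machine, so an object handling case can be checked against the allowed transitions. This is an illustrative Python sketch, not adidas’ methodology artifacts; the object and state names are hypothetical:

```python
# A solution object's lifecycle as an explicit state machine:
# the logical object model defines which transitions are legal,
# and the handling case is validated against them.
from enum import Enum

class OrderState(Enum):
    CREATED = "created"
    APPROVED = "approved"
    FULFILLED = "fulfilled"
    CLOSED = "closed"

# Allowed transitions, derived from the logical object model.
TRANSITIONS = {
    OrderState.CREATED: {OrderState.APPROVED},
    OrderState.APPROVED: {OrderState.FULFILLED},
    OrderState.FULFILLED: {OrderState.CLOSED},
    OrderState.CLOSED: set(),
}

def advance(state, target):
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state.name} -> {target.name}")
    return target

s = advance(OrderState.CREATED, OrderState.APPROVED)
```

Making the lifecycle explicit like this is what gives the traceability he was after: a change to the object model is a change to `TRANSITIONS`, which immediately flags any handling case that no longer conforms.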

The end solution includes a portal, BPMS, BRMS, ESB, MDM, BI and back-end systems – a fairly typical implementation – and although the cycle for moving from process model to solution design isn’t automated, at least they have a methodology that they use to ensure that all the components are covered and in synchronization. Specific models at particular points in their cycle include models from multiple domains, including process and data. They did a proof of concept with this methodology last year, and are currently running a live project using it, further refining the techniques.

Their cycle currently includes the model and execute phases of a standard BPM implementation cycle; next, they want to take on the monitor and optimize phases, and add modeling techniques to derive KPIs from functional and non-functional requirements. They also plan to look at more complex object state modeling techniques, as well as how adaptive case management fits into some of their existing concepts.

I posed a question at the end of my roundtable yesterday: if a tool existed that allowed for the definition of the process model, user interface, business rules and data model, then generated an executable system from that, would there still be a need for written requirements? Once we got past the disbelief that such tools exist (BPMS vendors – you have a job to do here), the main issue identified was one of granularity: some participants in the process modeling and requirements definition cycle just don’t need to see the level of detail that will be present in these models at an executable level. Obviously, there are still many challenges in moving seamlessly from conceptual process models to an executable process application; although some current BPMS provide a partial solution for relatively simple processes, this typically breaks down as processes (and related integrations) become more complex.