Applying Lean Six Sigma Methodology to Transactional Processes

Next up was a panel discussion with David Haigh of Johnson & Johnson, Sabrina Lemos of United Airlines, and Gary Kucera of Kaplan Higher Education, moderated by Charles Spina of e-Zsigma.

United Airlines has a unique project under way in one of their freight-related operations: they decided to outsource the operation in order to completely remake the process to meet specific KPIs, but also decided to allow the existing employees to bid on their own jobs. This has the effect of shifting them out of their current ways of doing things and into proposing the best possible way to do it, since they’ll be in a competitive bidding situation with outsiders. Lemos also spoke about the importance of getting to the real data. She did an exercise of tracing a particular biweekly report – which took several hours to compile – up to the VP and what he reports on, then tracked what he actually reports on back down to the reports and metrics being gathered at the lower levels. Not surprisingly, she found that there was zero alignment: nothing in the biweekly reports was used by the VP in his report, or anywhere else in the chain of command. She spoke about using gauge R&R, walking the process, and value stream mapping to analyze processes, and the necessity of coming to agreement on the meaning of things such as process start points.

Haigh spoke about accounts payable processes at J&J Canada, and how an in-depth review of those processes was triggered by someone forgetting to pay the electricity bill: they showed up at the office one day to find a notice that the power would be cut if the bill weren’t paid immediately. It wasn’t that they didn’t have the money to pay the bill, just that the process for doing so wasn’t working. Accounts payable is often one of those processes that is ignored in major process improvement efforts because it’s not revenue generating, but it’s important to recognize that enormous cost savings can be found by taking advantage of early payment discounts, and by avoiding late penalties and service disruptions. They have found that doing some of the work onsite, where the business processes are being performed, is helpful, since the process participants can see what’s involved in their process overall. They use the same techniques as discussed by Lemos, plus Kaizen Blitz and some activity-based costing.

Kucera spoke about aligning the corporate and executive goals with efforts at all levels, and how Jack Welch suggested making your bonus be some percentage of your process improvement savings in order to incent people to align their behavior and metrics with the ultimate goals. He spoke about some of the modeling and display tools that they use, such as fishbone and Pareto diagrams, and how doing these early and engaging with the business management can greatly speed the process improvement efforts. In many cases, since they’re dealing with simple transactional processes, they can use fairly simple analysis tools, but have some of the more sophisticated tools and techniques available as required.

They all had examples of process improvement efforts that have had a direct customer impact. Lemos had a great example of processing freight insurance claims, where they had a metric of processing five claims per day, resulting in the claims people cherry-picking claims in order to meet their quota; enforcing first-in, first-out claims processing resulted in an immediate and dramatic improvement in customer satisfaction. Listening to her stories of their paper-based inefficiencies, where emails are printed, signed and passed around, reminds me so much of the processes in some of my financial services and insurance customers.
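
Lemos’s cherry-picking example is easy to reproduce with a toy simulation (all numbers invented): staff close up to five claims a day from a backlog, either picking the easiest claims first to hit the quota, or working strictly first-in, first-out.

```python
import random

def simulate(policy, days=60, quota=5):
    """Close up to `quota` claims per day from a growing backlog.

    'cherry-pick' closes the lowest-effort claims first (gaming a
    claims-per-day quota); 'fifo' closes the oldest first. Returns
    (worst, average) wait in days over all claims.
    """
    rng = random.Random(0)   # identical arrival stream for both policies
    backlog, waits = [], []
    for day in range(days):
        # 3-7 new claims arrive, each with a random "effort" score
        backlog.extend((day, rng.random()) for _ in range(rng.randint(3, 7)))
        key = (lambda c: c[1]) if policy == "cherry-pick" else (lambda c: c[0])
        backlog.sort(key=key)
        closed, backlog = backlog[:quota], backlog[quota:]
        waits.extend(day - arrived for arrived, _ in closed)
    # claims still open at the horizon count as waiting since arrival
    waits.extend(days - arrived for arrived, _ in backlog)
    return max(waits), sum(waits) / len(waits)

for policy in ("cherry-pick", "fifo"):
    worst, avg = simulate(policy)
    print(f"{policy:12s} worst wait {worst:3d} days, average {avg:4.1f} days")
```

By construction both policies close the same number of claims per day, so the average wait comes out identical – a mean-based metric would never surface the problem – while the worst-case wait is what suffers under cherry-picking, which is exactly what the customers were feeling.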

In all cases – and I think that this is a key criticism of Lean and Six Sigma – they’re looking for incremental process improvements, not completely disruptive reengineering that would discover new ways to do business. However, in many of today’s standard transactional processes, incremental improvement is the only alternative.

Lean Six Sigma & Process Improvement: David Brown of Motorola

I missed the first morning of the IQPC Lean Six Sigma & Process Improvement conference in Toronto today, but with my usual impeccable timing, showed up just in time for lunch (where we had to explain the rules of curling to the American attendees). The first session this afternoon is with David Brown, a black belt at Motorola, where the term “Six Sigma” was first coined and is still used to make their processes more effective, efficient, productive, and transparent.

There has been a transformation for them in how they analyze their processes: ranging from just looking at transactions to high-level intelligence including complex simulations and forecasting. Since they run SAP for their ERP, they have a number of SAP business intelligence (Xcelsius and Business Objects) products, although their most complex analysis is done with Oracle Crystal Ball.

Brown’s presentation was short – less than 10 minutes – and the rest of the session was an interactive one-on-one interview with questions from Charles Spina of e-Zsigma, the conference chair. The Q&A explored much more about how Motorola uses business analytics tools, and opened it up to the (small) audience for their experience with analytics. Not surprisingly, there has been quite a bit of success through the introduction of analytics to process improvement teams: sometimes it’s the black belts themselves doing the analysis, sometimes it’s a separate analytics group that works closely with them to develop the reports, analysis, and more complex intelligence based on the large volumes of data collected as part of any process improvement project.

Reporting tools range from Excel, for simple needs, to more complex solutions that include ETL from multiple data sources and regularly scheduled reports, such as Crystal Reports and Xcelsius. Legacy systems can make that a bit of a challenge; often these end up as extracts to Excel or Access, which are then remixed with other sources. Extracts such as this can be really problematic, as I’ve seen first-hand with many of my customers, since there’s no way to keep the data completely in sync with the underlying systems, and typically no single legacy system has all the relevant data, so there can be a real problem in matching up related data from multiple systems. Brown underlined that the key is to get all of your data into a central data warehouse in order to determine whether your data is complete and clean, and to facilitate reporting and analytics. This is especially important for process engineers doing time studies over long periods: if you don’t have a consistent representation of the processes over the time period in question, then your analysis will suffer.
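
The matching problem Brown describes can be sketched with a toy reconciliation – the systems, keys and records below are all invented – joining an extract from an order system against one from a shipping system and reporting the orphans on each side:

```python
# Toy reconciliation of two legacy-system extracts (hypothetical data):
# orders known to the order system vs. shipments known to the shipping
# system, joined on order id. Orphans on either side are exactly the
# out-of-sync data that a central warehouse is meant to surface.
orders = {
    "A100": {"customer": "Acme", "total": 1200.0},
    "A101": {"customer": "Birch", "total": 340.0},
    "A102": {"customer": "Cedar", "total": 88.5},
}
shipments = {
    "A100": {"carrier": "rail", "shipped": "2010-02-01"},
    "A102": {"carrier": "truck", "shipped": "2010-02-03"},
    "A103": {"carrier": "air", "shipped": "2010-02-04"},  # unknown order!
}

matched = sorted(orders.keys() & shipments.keys())
unshipped = sorted(orders.keys() - shipments.keys())   # in orders only
unknown = sorted(shipments.keys() - orders.keys())     # in shipments only

print("matched:  ", matched)
print("unshipped:", unshipped)
print("unknown:  ", unknown)
```

A central warehouse essentially institutionalizes this check: load every extract, join on the shared key, and the orphans tell you immediately whether your data is complete.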

Motorola is using their data analytics to improve operational processes, such as order shipping, but also what-if scenarios to inform salespeople on the impact of discount levels to the bottom line. In many cases, this is an issue of data integration: Sabrina Lemos from United Airlines (who will be on the panel following) shared what they were able to recover in late container fees just by integrating their container tracking system with a database (Access, alas) that generates their invoices. Interestingly, I wouldn’t have thought of this as a process improvement initiative – although it is – but rather just as an artifact of doing some clever system integration.

They also discussed the challenges with presenting the results of analytics to the less numerically inclined, which often entails rolling data up to some simpler charts that can be drilled into as required, or just presented in a PowerPoint or PDF file. The real ROI may come from more interactive tools, however, such as dashboards that show operational alerts, or real-time what-if analysis to support human and automated decisions. Since Lean and Six Sigma tools are inherently analytical, this isn’t a new problem for the people in this audience; this is a matter of building relationships early with the non-analytical business managers, getting some early successes in projects to encourage adoption, and using different presentation and learning styles to present the information.

Because of the nature of this audience, the analytics that they’re discussing are typically for human consumption; in the BPM world, this is more and more moving to using the analytics to generate events that feed back into processes, or to inform automated decisioning. Either way, it’s all about improving the business processes.

Cloud-Based BPM Vendors: Geography Matters

I’ve spoken with a lot of cloud-based BPM vendors over the past few years, and I inevitably ask where their services are hosted. Since almost all of these are American companies, or are primarily targeting the American market, the answer is, almost inevitably, in the United States. I continue to point out that that’s a problem for many non-American companies: my Canadian customers are mostly financial services and insurance, and not one of them would consider hosting any of their data – even non-executing process models – outside Canada. Yes, I’ve asked them. Similarly, many EU companies require that their data be hosted in the EU. The problem is not, as many believe, safe harbor regulations that attempt to bring US data privacy in line with the stricter laws of other countries; it’s the Patriot Act, which allows U.S. intelligence and law enforcement authorities to view personal data held by U.S. organizations without a court order, and without informing people or organizations that their data has been shared. This is in violation of Canadian privacy standards, as well as those of many other countries.

Where to host servers for Canadian clients

Yesterday, I had the chance to speak with someone at Human Resources and Skills Development Canada (our federal department dealing with labour and employment, which is pretty big due to the social benefits such as unemployment insurance and government pensions that we enjoy). They’re doing process modeling on a large scale across their department, and looking at how they can collaborate with other departments. Currently, they collaborate on process models using desktop sharing software for real-time collaboration between a modeler and a mentor who is helping them on a process, plus an internal repository and web publishing of the process models for viewing. I asked if they would consider using something like Lombardi Blueprint or one of the other online process modeling environments that are emerging, and he said, unequivocally, “only if it’s hosted in Canada.” I’m not sure if that’s an explicit Canadian government policy, but that’s their practice.

So to all the vendors who think that geography doesn’t matter for hosted solutions, a news flash: geography does matter if you plan to sell to non-American organizations, whether private sector or public sector.

</soapbox>

BPM and Business Analysis Conferences, London: Call For Speakers

IRM is running both a BPM conference and a business analysis conference in London on September 27-29, and the calls for speakers are open until March 8th.

The BPM conference is looking for presentations on:

  • Building BPM capabilities
  • Using BPM to change how businesses are managed
  • BPM governance and the centre of expertise
  • BPM success stories
  • Process modelling and improvement techniques and best practices
  • Business process design innovations
  • Process-centric approaches to business rules and business analysis
  • BPM implementation
  • BPM human change
  • BPM and emerging trends

The Business Analysis conference, in its second year and organized in conjunction with IIBA, has three proposed speaker tracks:

  • Techniques for business analysis
  • Shaping the future of business analysis
  • Business agility and business analysis

You can submit a proposal for a presentation at BPM Europe 2010 here, or at Business Analysis London 2010 here.

IBM BlueWorks Online BPM Community

I had a briefing a couple of weeks ago on IBM BlueWorks by Angel Diaz and Janine Sneed from the BlueWorks team. BlueWorks is IBM’s cloud-based BPM environment, providing the following capabilities:

  • Browser-based modeling, including strategy maps, capability maps, process maps and BPMN processes.
  • Pre-built content to supplement or replace a BPM center of excellence (CoE), including the ability to submit your own content.
  • Online community for collaboration and exchange of ideas.

[Screenshot: BlueWorks content view]

BlueWorks was launched last July, and has several thousand people signed up, although I didn’t get a good feel for the level of activity. It’s based on Lotus Business Space, with the modeling editor and repository from the WebSphere BPM suite, which allows IBM to offer both a hosted and an on-premise version.

They’ve kicked off the content part of BlueWorks by seeding it with a lot of content from internal and external contributors, including information provided by their professional services arm. The result is a large repository of articles, sample strategy maps, business measures such as KPIs, and forums and blogs, with more information than you could hope to scavenge through. It’s all categorized and tagged in multiple ways, however, making it easy to filter the library to just what you’re looking for, whether by topic, industry, or type of content. They also include industry content packs, which are bundles of industry-specific strategy maps and other content.

[Screenshot: BlueWorks process model]

The process designer is Flash-based, and it only took me about 5 minutes to crash it; luckily, it saved as I worked, so I didn’t lose anything. Some of the operations are not very intuitive (I had to go to the help file to figure out how to add a new activity), but once I learned a few of the basics, it was pretty efficient to use, and I could use the keyboard for entering my activity list, which I like. The process is shown in both a text outline view and a process outline view, very similar to other process discovery/outlining tools such as Lombardi Blueprint (which should make the integration of Blueprint into this environment straightforward from a user interface standpoint, if not a technical one). Once complete, I could export to a PowerPoint presentation (which includes slides for the process model and the details that I entered), to a Business Document Archive (a binary format that I’m not familiar with) on my local file system or in the asset repository, or to a WebSphere Business Modeler XML format.

[Screenshot: BlueWorks BPMN model]

This is where I found things a bit strange: I couldn’t export or otherwise convert the process model that I had created for use in the BPMN modeler, which is a separate tool. Maybe this is something that the Blueprint folks can teach them about. I found the BPMN modeler a bit clunky – resizing and placement of elements was awkward – although it did allow me to validate my model as correct BPMN. There definitely needs to be a way to move between these two process model types, both to eliminate redrawing and to allow a process analyst to quickly flip between the different perspectives. From the BPMN model, I could save to the shared repository, or export to BPMN 2.0 XML, WebSphere Business Modeler XML, or a Process Diagram Archive XML format.

I didn’t spend a lot of time on the strategy or capability maps; a strategy map is a mind-map type of model that allows you to model business SWOT factors as well as business goals, whereas a capability map shows the business capabilities and can link them to process models. The strategy, capability and process maps all have a similar user experience, and are all shown as siblings within folders in the BlueWorks space under the Design tab; BPMN models, on the other hand, are shown in a separate tab and have a completely different UI. The BPMN model seems like a bit of an add-on: obviously, there’s a need for BPMN modeling in an online BPM community, but they haven’t quite got it integrated yet. The three Design map types are really intended for business users, and allow functions such as pasting an indented bulleted list from a PowerPoint presentation into a strategy map to create an initial map. Links and attachments (including documents and folders) can be added to any node in any of the three Design diagram types. All four model types have versioning, and models of all types are visible in my dashboard view.
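
IBM’s actual import surely differs, but the core idea behind pasting a bulleted list into a strategy map – an indented outline is already a tree – fits in a few lines; the parser below is a hypothetical sketch, not BlueWorks code:

```python
# Hypothetical sketch: turn an indented outline, like one pasted from a
# presentation, into a nested map keyed by heading. Each node's children
# are the headings indented one level beneath it.
def outline_to_map(text, indent=2):
    root = {}
    stack = [(-1, root)]  # (indent level, children dict)
    for line in text.splitlines():
        if not line.strip():
            continue
        level = (len(line) - len(line.lstrip())) // indent
        node = {}
        while stack[-1][0] >= level:   # climb back up to this line's parent
            stack.pop()
        stack[-1][1][line.strip()] = node
        stack.append((level, node))
    return root

outline = """Strengths
  Brand recognition
  Distribution network
Weaknesses
  Legacy systems"""

print(outline_to_map(outline))
```

Walking the resulting nested dicts gives the initial map nodes; everything after that – links, attachments, versioning – is the tool’s own layering on top.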

[Screenshot: BlueWorks share model dialog]

Aside from the functionality of the modelers, there’s the ability to collaborate on models: each person has their own private space in BlueWorks, or they can share their models with their team members. The upcoming version 7 of BlueWorks will add more fine-grained privacy controls to allow sharing only with specific groups.

The content and community parts of BlueWorks form the basis of a CoE: smaller companies could use this as their only CoE, whereas larger ones might want to use content from BlueWorks with their own internal content. Content submitted to the content section is not only visible to anyone on BlueWorks, but also is explicitly licensed to IBM for redistribution, so this isn’t a place for your private intellectual property, but a good place to share ideas with people from other companies. IBM partner companies are starting to use it for sales material and starter content.

The hosted version of BlueWorks is free, and you don’t even need to be an IBM customer, but if you want to take this capability inside your own firewall, IBM would be happy to sell you WebSphere Business Compass (formerly WebSphere Publisher). It’s also based on Business Space, and Diaz described it as an in-house version of BlueWorks, although it has many more tools, such as forms designers, organization charts and other process modeling tools. You don’t need to use WebSphere Business Compass – it’s possible to go directly from BlueWorks to an executable system using the WebSphere and BPMN export formats – but for some companies, BlueWorks will act as the “gateway drug” that gets them hooked on the bigger and better functionality of Business Compass.

I was briefed on Software AG’s online community, ARISalign, earlier this week and will post my thoughts on that soon; in both cases, these competing online communities lack some key functionality, but need to get their platforms out there for people to start using and feeding back on what’s needed. The best online community will result not from who has the most advanced starting point, but from who can be most responsive to their community’s needs.

You can sign up for your own BlueWorks account for free, and there’s a webinar tomorrow at 1pm ET on getting started with BlueWorks that will be recorded and available for replay later.

Henk de Man of Cordys at Software 2010

Only one other presentation at the Software 2010 conference in Oslo today was in English, which likely would have attracted me anyway, but I especially wanted to see Henk de Man of Cordys speak about adaptive BPM and case management in the cloud, which provides a nice bookend to my talk at the start of the day.

I couldn’t believe that it’s been three years since I last looked at Cordys, and I was looking for a bit of an update. Cordys Process Factory (CPF) is now tightly integrated with Google Apps, and they have some examples of customers using Google Apps, CPF and on-premise applications with data and transaction exchange between the cloud-based and on-premise software in a “hybrid cloud” configuration.

His focus today, however, was on case management: a higher-level coordination of activities that can’t be shown in a single structured process, where many bits of content and process work towards a common goal, as defined by OMG. This is emerging as a type of process modeling that is separate from, but adjacent to, the structured process modeling that we see in BPMN. In case management, there is a case file that contains all the relevant content, but multiple ways to achieve the ultimate goal, which might depend on the contents of the case file, current conditions, and the decisions of the individual participants working on the case. Forrester just released a research note on dynamic case management, and some of the older document management and workflow solutions are being repositioned into this “new” area, but the successful players will be those that can bring quality analytics, collaboration and a modern user experience to bear: areas where Cordys is making inroads.

This is a bit of old wine in new bottles, but new technologies are definitely breathing life back into case management; the challenge will be to differentiate true case management processes from potentially structured complex processes that someone is just too lazy to model. Expect to see much more of this in 2010.

BPM and Enterprise 2.0 at Software 2010 in Oslo

I’m at the Software 2010 conference held by the Norwegian Computing Society in Oslo this week, and gave the opening keynote on one of the tracks this morning, on how Business Process Management is being impacted by social software and social networking.

I gave a similar talk last November at the Business Rules Forum, but I find this topic to be endlessly changing and endlessly fascinating. I’ve written two related papers on it recently, too: one for the Springer BPM Handbook, and one for the Cutter IT Journal (specifically on runtime collaboration in BPM).

I won’t be attending most of the other sessions because they’re in Norwegian, but may pop out this afternoon and visit the Edvard Munch works at the National Gallery. I spent a few days in London earlier this week, visiting the Victoria & Albert, Tate Modern and British Museums, so that would round out my week nicely.

Lean Six Sigma and Process Improvement conference, Toronto

In a nice break from the past two years as a road warrior, I’ve only been on one trip since November. Even better, some conferences are coming to Toronto so that I don’t even need to travel (although I’m not sure that February up here is a big draw if you don’t already live here).

This month, IQPC is hosting a Lean Six Sigma and process improvement conference on February 22-24 at the Westin Harbour Castle, with a focus on achieving a sustainable and transparent Lean Six Sigma and process improvement culture:

    • Increase Organizational Synergies by Applying LSS and Process Re-engineering to Consolidation and Organizational Restructuring
    • Maximize Benefits and Savings of Process Improvement Projects by Identifying and Implementing Low Cost Solutions
    • Bring the Quality of Your Products to a New Level of Efficiency by Applying Innovative Methodologies, such as TRIZ, to Your Transactional Processes and Engage Your Customers in Transactional Projects
    • Maximize the Efficiency of Internal and External Benchmarking by Expanding the Use of Dashboards

My readers can get 20% off the “All Access” price by using the code LSSCCol2 when you register here.

Disclosure: IQPC is providing me with a free pass to the show.

Another Call for Papers: Americas Conference on Information Systems

Although it’s very well-hidden on the information site, the 16th Americas Conference on Information Systems, to be held in Lima in August, will have a mini-track on BPM (it’s within the Systems Analysis and Design track):

This mini-track seeks contributions that discuss the management of business processes as well as technologies for process automation. We encourage submissions from both a managerial and a technical perspective.

Suggested topics include, but are not limited to, the following:

- Business process automation and workflow management systems
- Business process and rule modeling, languages and design patterns
- Strategies for business process design and innovation
- Service-oriented architectures for BPM
- Resource management and capacity planning in BPM
- Information security and assurance in BPM
- Business process monitoring and controlling
- Process mining and its applications
- Business process governance, risk and compliance management
- Management of adaptive and flexible processes
- Management of ad-hoc and collaboration processes
- Management of knowledge-intensive processes
- Formal evaluation of BPM methods and technologies
- BPM adoption and critical success factors
- BPM maturity
- Standardization of BPM, web services and workflow technology
- Industry case studies on BPM technology or BPM applications

March 1st is the submission deadline for papers.

BPM 2010 Call for Papers: Research, Education and Industry

I’ve previously extolled the benefits of attending the annual international research conference on BPM, and for those of you in North America who just weren’t ready to shell out for a trip to Europe, you’re in luck: it’s coming to Stevens Institute in New Jersey in September. Although this has always been an academic research conference, rife with papers full of statistical analysis, this year the organizers are creating an industry track for practitioners to discuss the adoption and use of BPM:

The industry track will provide practitioners with the opportunity to present insight gained through BPM projects. We are particularly interested in case studies from the perspective of user organizations. While contributions from consultants and vendors are appreciated, pure product demonstrations, method tutorials, or vendor showcases will not be accepted in the industry track. All contributions to the industry track have to describe experiences with BPM methods and/or technologies from the viewpoint of the adopting organization.

This is not the usual conference PowerPoint deck: you have to actually write a paper. If you want to present in the industry track, you must submit an abstract by February 15th.

If you’re submitting a paper for the regular research tracks, the paper (not just an abstract) is due by March 14th. You can also submit a paper in the new education track, specifically about education and training methods for the BPM professional, also due by March 14th.

Even if you’re not giving a paper, I highly recommend that BPM vendors send along someone from their design/engineering team. This conference shows BPM research that (in some cases) indicates where product functionality could go in the future; best to get in there and see it first hand.