Applying Lean Six Sigma Methodology to Transactional Processes

Next up was a panel discussion with David Haigh of Johnson & Johnson, Sabrina Lemos of United Airlines, and Gary Kucera of Kaplan Higher Education, moderated by Charles Spina of e-Zsigma.

United Airlines has a unique project going on in one of their freight-related operations: they decided to outsource the operation in order to be able to completely remake the process and have it meet specific KPIs, but also decided to allow the existing people to bid on their own jobs. This has the effect of shifting them out of their current ways of doing things and into proposing the best possible way to do it, since they would be in a competitive bidding situation with outsiders. Lemos also spoke about the importance of getting to the real data. She did an exercise of tracing a particular biweekly report – which took several hours to compile – up to the VP and what he reports on, then tracked what he actually reports on back down to the reports and metrics that are being gathered at the lower levels. Not surprisingly, she found that there was zero alignment: nothing in the biweekly reports was used by the VP in his report, or anywhere else in the chain of command. She spoke about using gauge R&R, walk-the-process, and value stream mapping techniques to analyze processes, and the necessity of coming to agreement on the meaning of things such as process start points.

Haigh spoke about accounts payable processes at J&J Canada, and how an in-depth review of those processes was triggered by someone actually forgetting to pay the electricity bill, and showing up at the office one day to find a notice that the power would be cut if the bill weren’t paid immediately: not that they didn’t have the money to pay the bill, just that the process to do so wasn’t working. Accounts payable is often one of those processes in companies that is ignored when looking at major process improvement because it’s not revenue-generating, but it’s important to recognize that enormous cost savings can be found through taking advantage of early payment discount levels, and avoiding any late penalties or service disruptions. They have found that doing some amount of the work onsite where the business processes are being done is helpful, since the process participants can see what’s involved in their process overall. They use the same techniques as discussed by Lemos, plus Kaizen Blitz and some activity-based costing.

Kucera spoke about aligning the corporate and executive goals with efforts at all levels, and how Jack Welch suggested making your bonus be some percentage of your process improvement savings in order to incent people to align their behavior and metrics with the ultimate goals. He spoke about some of the modeling and display tools that they use, such as fishbone and Pareto diagrams, and how doing these early and engaging with the business management can greatly speed the process improvement efforts. In many cases, since they’re dealing with simple transactional processes, they can use fairly simple analysis tools, but have some of the more sophisticated tools and techniques available as required.
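Pareto analysis, like the diagrams that Kucera mentioned, boils down to a very simple computation. Here’s a minimal sketch in Python, using made-up invoice-error counts, that finds the “vital few” causes behind 80% of the defects:

```python
# Toy Pareto analysis of defect causes (hypothetical data): rank causes
# by frequency and keep only the "vital few" that account for ~80% of
# the total defects.

def pareto_vital_few(counts, threshold=0.80):
    """Return causes in descending order, up to the cumulative threshold."""
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    vital, cumulative = [], 0
    for cause, n in ranked:
        vital.append(cause)
        cumulative += n
        if cumulative / total >= threshold:
            break
    return vital

defects = {  # hypothetical invoice-error tallies
    "missing PO number": 120,
    "wrong cost centre": 45,
    "duplicate entry": 20,
    "bad vendor code": 10,
    "other": 5,
}
print(pareto_vital_few(defects))  # the few causes worth attacking first
```

For simple transactional processes, this is often all the analysis that’s needed to decide where to focus an improvement effort.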

They all had examples of process improvement efforts that have had a direct customer impact. Lemos had a great example of processing freight insurance claims, where they had a metric of processing five claims per day, resulting in the claims people cherry-picking claims in order to meet their quota; enforcing first-in, first-out claims processing resulted in an immediate and dramatic improvement in customer satisfaction. Listening to her stories of their paper-based inefficiencies, where emails are printed, signed and passed around, reminds me so much of the processes in some of my financial services and insurance customers.
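The cherry-picking effect that Lemos described can be illustrated with a toy model (the claims and effort values below are invented, not United’s data): under a claims-per-day quota, sorting by ease pushes the oldest hard claims to the back of the line, while first-in, first-out clears them in arrival order.

```python
# Toy sketch: a 5-claims-per-day quota invites cherry-picking easy
# claims, so the oldest hard claims finish last; first-in-first-out
# processing clears them in arrival order. All numbers are hypothetical.

backlog = [  # (claim_id, effort) in arrival order; higher effort = harder
    ("A", 9), ("B", 1), ("C", 1), ("D", 8), ("E", 1),
    ("F", 1), ("G", 1), ("H", 1), ("I", 7), ("J", 1),
]

def completion_day(order, quota=5):
    """Map claim id -> the day it gets processed, given a processing order."""
    return {cid: i // quota for i, (cid, _) in enumerate(order)}

fifo = completion_day(backlog)                                # arrival order
cherry = completion_day(sorted(backlog, key=lambda c: c[1]))  # easiest first

# Claim "A" arrived first but is hard: FIFO clears it on day 0,
# while cherry-picking pushes it to the last day.
print(fifo["A"], cherry["A"])
```

Scale the backlog up and keep new claims arriving, and the hardest claims under a quota regime can languish indefinitely, which is exactly the customer-satisfaction problem she described.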

In all cases – and I think that this is a key criticism of Lean and Six Sigma – they’re looking for incremental process improvements, not completely disruptive reengineering that would discover new ways to do business. However, in many of today’s standard transactional processes, incremental improvement is the only alternative.

Lean Six Sigma & Process Improvement: David Brown of Motorola

I missed the first morning of the IQPC Lean Six Sigma & Process Improvement conference in Toronto today, but with my usual impeccable timing, showed up just in time for lunch (where we had to explain the rules of curling to the American attendees). The first session this afternoon is with David Brown, a black belt at Motorola, where the term “Six Sigma” was first coined and is still used to make their processes more effective, efficient, productive, and transparent.

There has been a transformation for them in how they analyze their processes: ranging from just looking at transactions to high-level intelligence including complex simulations and forecasting. Since they run SAP for their ERP, they have a number of SAP business intelligence (Xcelsius and Business Objects) products, although their most complex analysis is done with Oracle Crystal Ball.

Brown’s presentation was short – less than 10 minutes – and the rest of the session was an interactive one-on-one interview with questions from Charles Spina of e-Zsigma, the conference chair. The Q&A explored much more about how Motorola uses business analytics tools, and opened it up to the (small) audience for their experience with analytics. Not surprisingly, there has been quite a bit of success through the introduction of analytics to process improvement teams: sometimes it’s the black belts themselves, sometimes it’s a separate analytics group that works closely with those teams to develop the reports, analysis, and more complex intelligence based on the large volumes of data collected as part of any process improvement project.

Reporting tools range from Excel – for simple needs – to more complex solutions that include ETL from multiple data sources and regularly scheduled reports, such as Crystal Reports and Xcelsius. Legacy systems can make that a bit of a challenge; often these end up as extracts to Excel or Access, which are then remixed with other sources. Extracts such as this can be really problematic, as I’ve seen first-hand with many of my customers, since there’s no way to keep the data completely in sync with the underlying systems, and typically any one legacy system doesn’t have all the relevant data, so there can be a real problem in matching up related data from multiple systems. Brown underlined that the key issue is to get all of your data into a central data warehouse in order to determine if your data is complete and clean, and to facilitate reporting and analytics. This is especially important for process engineers when trying to do time studies over long periods of time: if you don’t have some consistent representation of the processes over the time period in question, then your analysis will suffer.
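The matching problem with multi-system extracts can be sketched in a few lines. Here’s a toy reconciliation of two hypothetical extracts (the field names and values are invented), flagging the records that exist in only one system or disagree between the two – exactly the out-of-sync data that a central warehouse is meant to eliminate:

```python
# Sketch of reconciling two system extracts keyed on order id:
# an ERP extract vs. a legacy billing extract (hypothetical data).

erp = {      # order_id -> amount, as extracted from the ERP
    "1001": 250.00, "1002": 99.50, "1003": 410.00,
}
billing = {  # order_id -> amount, from the legacy billing system
    "1001": 250.00, "1003": 395.00, "1004": 88.00,
}

matched    = {k for k in erp if k in billing}
erp_only   = set(erp) - set(billing)      # in ERP but never billed?
bill_only  = set(billing) - set(erp)      # billed but not in the ERP?
mismatched = {k for k in matched if erp[k] != billing[k]}

print(sorted(erp_only), sorted(bill_only), sorted(mismatched))
```

Every one of those exception buckets is a record that someone has to chase down by hand when the data lives in disconnected extracts.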

Motorola is using their data analytics to improve operational processes, such as order shipping, but also what-if scenarios to inform salespeople on the impact of discount levels to the bottom line. In many cases, this is an issue of data integration: Sabrina Lemos from United Airlines (who will be on the panel following) shared what they were able to recover in late container fees just by integrating their container tracking system with a database (Access, alas) that generates their invoices. Interestingly, I wouldn’t have thought of this as a process improvement initiative – although it is – but rather just as an artifact of doing some clever system integration.

They also discussed the challenges with presenting the results of analytics to the less numerically inclined, which often entails rolling data up to some simpler charts that can be drilled into as required, or just presented in a PowerPoint or PDF file. The real ROI may come from more interactive tools, however, such as dashboards that show operational alerts, or real-time what-if analysis to support human and automated decisions. Since Lean and Six Sigma tools are inherently analytical, this isn’t a new problem for the people in this audience; this is a matter of building relationships early with the non-analytical business managers, getting some early successes in projects to encourage adoption, and using different presentation and learning styles to present the information.

Because of the nature of this audience, the analytics that they’re discussing are typically for human consumption; in the BPM world, this is more and more moving to using the analytics to generate events that feed back into processes, or to inform automated decisioning. Either way, it’s all about improving the business processes.

Lean Six Sigma and Process Improvement conference, Toronto

In a nice break from the past two years as a road warrior, I’ve only been on one trip since November. Even better, some conferences are coming to Toronto so that I don’t even need to travel (although I’m not sure that February up here is a big draw if you don’t already live here).

This month, IQPC is hosting a Lean Six Sigma and process improvement conference on February 22-24 at the Westin Harbour Castle, with a focus on achieving a sustainable and transparent Lean Six Sigma and process improvement culture:

    • Increase Organizational Synergies by Applying LSS and Process Re-engineering to Consolidation and Organizational Restructuring
    • Maximize Benefits and Savings of Process Improvement Projects by Identifying and Implementing Low Cost Solutions
    • Bring the Quality of Your Products to a New Level of Efficiency by Applying Innovative Methodologies, such as Triz, to Your Transactional Processes and Engage Your Customers in Transactional Projects
    • Maximize the Efficiency of Internal and External Benchmarking by Expanding the Use of Dashboards

My readers can get 20% off the “All Access” price by using the code LSSCCol2 when you register here.

Disclosure: IQPC is providing me with a free pass to the show.

BPMG… I mean, BPEE Conference in London

I received an email this morning about the upcoming Business Process Excellence Exchange (BPEE) conference organized by IQPC in London this October — obviously replacing the BPMG conference that they used to organize. The only problem is the speaker list, which has both Steve Towers and Terry Schurter listed as being with BPMG.

You’d think that the conference organizers would have figured out a bit of what’s happening (or not happening) with BPMG these days, since they had to actually change the name of the conference from last year…

IQPC BPM Summit: David Haigh

Last speaker of the day — and of the conference — was David Haigh, Global Director of Continuous Improvement at W.E.T. Automotive Systems, discussing Lean Product Development. It’s actually refreshing to be at a BPM conference where, of all the speakers that I heard (I missed Jodi Starkman-Mendelsohn’s talk this morning), I was the only one who talked about the technology.

They previously tried out a lot of different quality programs, including ISO 9000, Six Sigma, Lean, BPR and other techniques, but these were always initiated by the subsidiaries and didn’t really catch on, so in 2006 they started on a global program that included the shop floor, logistics and product creation. Whereas they had always focused on the production/fulfillment value stream, they expanded the scope to include the entire order-to-cash cycle, particularly the design portion of the cycle, which has the smallest cost element but the largest cost influence.

I loved his analogy for hand-offs in the business process: it’s like the telephone game that we played as kids, whispering a message from one person to the next to see how the message changes by the time it reaches the end; any hand-off results in a reduction in information clarity, as well as being a big time-waster.

Since he’s in an engineering manufacturing environment, there’s some interesting ideas that at first seem unique, but have value in many other areas: set-based design, for example, where you spend the engineers’ time researching and pushing boundaries on the technology that underlies customer solutions, rather than spending the time building one-off customer solutions. The equivalent in the BPM world would likely be having them focus on building out the service layer, not assembling the services using a BPMS. He also spoke about Toyota’s practice of streaming engineers up to higher levels of engineering rather than “promoting” them to sales or management — I always tried to do that when I ran a company, since there’s always some people who just want to stay technical, and don’t want their career to suffer for it.

They’ve built a “workflow” and project planning tool in Excel that has some interesting concepts: no dependencies between tasks, just points of integration, and the team sets the deadlines (can you say “collaboration”?). This helped them by providing tools for visualizing waste in the process, and driving to reduce the waste, which is the main focus of Lean.
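As a rough sketch of what a “no dependencies, just points of integration” model might look like (the task names, dates, and integration points below are my invention, since I haven’t seen their actual tool), tasks carry team-set finish dates, and integration points are simply dates where a set of tasks must converge:

```python
# A guess at the data model behind a dependency-free plan: tasks have
# team-committed finish dates (no dependency edges), and "integration
# points" are dates where named tasks must all be done. All names and
# dates are hypothetical.
from datetime import date

tasks = {                      # task -> team-committed finish date
    "heater-mat design": date(2009, 3, 1),
    "wiring harness":    date(2009, 3, 10),
    "controller spec":   date(2009, 2, 20),
}
integration_points = [         # (meeting date, tasks that must be done)
    (date(2009, 3, 5), ["heater-mat design", "controller spec"]),
    (date(2009, 3, 8), ["wiring harness"]),
]

def at_risk(tasks, points):
    """Integration points where some task finishes after the meeting date."""
    return [(when, [t for t in needed if tasks[t] > when])
            for when, needed in points
            if any(tasks[t] > when for t in needed)]

for when, late in at_risk(tasks, integration_points):
    print(when, late)   # make the waste visible: late work at each point
```

The appeal of this structure is that there’s no dependency network to maintain; the plan only has to answer one question, namely which integration points are at risk, which is what makes the waste visible.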

This has been an interesting conference, although the attendance is quite a bit less than I had expected, but that makes for a much better environment for asking questions and networking. And speaking of networking, I think that I just have time to run home before the Girl Geek Dinner tonight…

IQPC BPM Summit: Kirk Gould

Kirk Gould, a performance consultant with Pinnacle West Capital, talked about business processes and metrics. I like his definition of a metric: “A tool created to tie the performance of the organization to the business objectives”, and he had lots of great advice about how to — and how not to — develop metrics that work for your company.

I came right off of my presentation before this one, so I’m a bit too juiced up to focus on his presentation as well as it deserves. However, his slides are great and I’ll be reviewing them later. He also has a good handout that takes us through the 10 steps of metric development:

  1. Plan
  2. Perform
  3. Capture
  4. Analyze
  5. Display
  6. Level
  7. Automate
  8. Adjust
  9. Manage
  10. Achieve

He has a great deal more detail for each of these steps, both on the handout and in his presentation. He discussed critical success factors and performance indicators, and how they fit into a metrics framework, but the best parts were when he described the ways in which you can screw up your metrics programs: there were a lot of sheepish chuckles and head-shaking around the room, so I know that many of these hit home.

He went through the stages of metrics maturity, which I’ll definitely have to check out later since he flew through the too-dense slides pretty quickly. He quotes the oft-used (and very true) line that “what gets measured, gets managed”, a concept that is at the heart of metrics.

IQPC BPM Summit: Manish Mehta

Manish Mehta, a project manager at the government of the Region of Peel (which covers a huge chunk of the bedroom communities north and west of Toronto, about 1.2 million people in urban and rural areas), gave a presentation on implementing process management at Peel.

For them, process management was part of their quality management, which already included ISO and some other quality programs. They wanted to strengthen corporate thinking, reduce their silo departmental focus, increase alignment and connection, and measure employee and client satisfaction.

The steps in their process management project were as follows:

  • Phase 1: develop standard terms, definitions and symbols
  • Phase 2: provide process management training, and develop a process management framework
  • Phase 3: develop their service improvement initiative (SII) to apply process management

As with the previous OPG talk, this was not about a BPM implementation, but about putting standard, optimized processes into place in an organization in order to not only improve service delivery, but measure it as well. Specifically, their three goals were to implement process management across the organization, to implement a consistent approach to client satisfaction measurement and management, and to develop and monitor a corporate client satisfaction rating.

They first applied this to their TransHelp program, which provides transportation for those who are unable to use public transportation, then to their waste management program; in both cases, they found that doing client satisfaction surveys identified factors that clients found important that had not been considered by the inside workers: definitely a good reason to get out there and talk to people rather than sitting in the ivory tower and deciding the best process to implement.

They are seeing some measurable improvements: with TransHelp, their no-show/cancel rate has dropped by half, and with waste management, their number of complaints has dropped. For process improvement work that they did with children’s services within their financial assistance area, the time to complete an application and assess eligibility dropped dramatically with the new process. Once they’d done these three pilots, they found that other areas started to come to them to ask for help in process improvement: you really need to show some successes in order to get started in a diverse organization such as a regional government.

Measuring client satisfaction for regional governments is still in an early stage, and Mehta said that they were working with other government organizations to develop methods for doing this. He also showed a great public sector reference model that linked resources, processes, services and programs and how they interact with providers and clients.

They used outside facilitation for process redesign, but mostly created their own methodology and guidelines, and do have some capacity for the process redesign internally. They’ve developed a fairly structured project management approach in terms of defining scope and schedules. They have discovered, not surprisingly, that pulling subject matter experts away from their regular jobs for several days in order to do the process redesign is much more effective than trying to have people add this on to their real jobs, in spite of the grumbling that will inevitably occur when you try to get a group of people to a multi-day offsite meeting.

Something I really like about what they’ve done is to split their SII approach into three stages, depending on the state and complexity of the original process: repair (quick fixes to smaller broken processes through understanding customer needs), improvement (considering root cause analysis and piloting a solution), and design/redesign (similar to improvement but with best practices and benchmarks). They see both improvement and redesign projects as requiring a trained facilitator.

Mehta showed some private sector studies that showed that employee satisfaction leads to client satisfaction, which in turn provides even greater employee satisfaction, and client satisfaction also leads to greater profitability. Applied to the public sector, you can replace the profitability part of the equation with trust and confidence: greater customer satisfaction results in more trust in the government body and their ability to execute work on behalf of the citizens.

Off to lunch, then I’m up at the front of the room after that.

IQPC BPM Summit: Ron McGillis

I missed Day 1 of the IQPC BPM summit due to a road trip to Detroit, but I’m here this morning to hear the speakers, and this afternoon to give my own presentation. It’s a small group, probably fewer than 40 people, but I’ve already been hearing great things from the attendees about how much they’ve learned from the two days so far.

I missed Jodi Starkman-Mendelsohn of West Park Assessment Centre speak first thing, but I’ve heard her speak twice this year already so just had a quick update from her at a break on what they’ve done since I last saw her.

Ron McGillis of Ontario Power Generation talked about their contractor management program, where they manage all contracts related to power generation: both construction and non-construction servicing of the 3 nuclear power generation plants, 4 fossil fuel plants, 64 hydroelectric and 3 wind power facilities in the province. This came out of their safety compliance programs, since McGillis stated that safety was their greatest concern with their contractors (the WSIB people in the audience applauded at that point), and that it needed to be at the same level for contractors as for OPG employees.

In 2002, they did a safety audit that showed some serious problems — “systemic, cultural-based problems with the existing contractor safety management process” — and recommended some standardized processes around the safety standards for contractors. This resulted in OPG’s current policy statement that their contractors and subcontractors must maintain a level of safety equivalent to that of OPG employees while at OPG workplaces, and set out requirements that the contractors are accountable for the health and safety of their own employees while at OPG, including ensuring that they don’t harm OPG employees or the public.

They’ve come up with a contract management process that’s documented and freely available on their website. They have consistent pre-qualification processes (which would pre-qualify a contractor for working with any division of OPG), processes for awarding contracts, standard performance reporting, and training for anyone involved in the contract management process. Using the ISO standards as a guideline, they created a Plan – Do – Check – Act program for contract management, and defined roles and responsibilities for each contract: owner, administrator, monitor and buyer.

Every contract stipulates these roles explicitly, and also safety clauses in terms of reporting, inspections, procedures and rules. This trickles down to any subcontractors, too.

Their contract process has five steps: planning, procurement, post award (which ensures that all parties are ready to go to work on the execution), execution, and close-out.

The contract management manual is only 13 pages long, and is at a “contract management for dummies” level, with the following content:

  • The steps in each stage
  • Who is accountable for each step
  • Forms (mandatory)
  • Worksheets (mandatory)
  • Job aids (good practices)
  • Check lists
  • Notes and references

Training was a key part of their success, including contract administration and monitoring courses at various levels of detail, ranging from 4 hours to 4 days.

From an automation standpoint, this isn’t a BPM system implementation: this is BPM in the sense of “management discipline” as defined by Gartner, where there’s a structured business process that is providing a huge benefit to the organization, but none of this process is automated. They have database applications that provide some analysis — for example, a contractor database allows for input of various scoring factors and provides a pre-qualification rating — but most of this is about getting people to follow the correct business processes. Their contract management process is so successful that it’s been adopted by some large companies and other power generation companies.
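As an illustration of how a pre-qualification rating like the one their contractor database produces might be computed (the factors, weights, and threshold below are my invention, not OPG’s actual scheme), a weighted score over safety-related factors could look like this:

```python
# Hypothetical sketch of a contractor pre-qualification rating: a
# weighted average over scoring factors, compared against a pass
# threshold. Factor names, weights, and the threshold are illustrative.

WEIGHTS = {              # factor -> weight (weights sum to 1.0)
    "safety_record":    0.40,   # e.g. normalized incident-rate score, 0-100
    "training":         0.25,
    "past_performance": 0.20,
    "financial":        0.15,
}

def prequalification_rating(scores):
    """Weighted average of factor scores (each 0-100)."""
    return sum(WEIGHTS[f] * scores[f] for f in WEIGHTS)

contractor = {"safety_record": 90, "training": 80,
              "past_performance": 70, "financial": 60}
rating = prequalification_rating(contractor)
print(round(rating, 1), rating >= 75)   # rating, and pass/fail vs. threshold
```

The value of even a simple scheme like this is consistency: every division of OPG scores a contractor the same way, which is what makes a single pre-qualification usable across the whole organization.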

Their lessons learned for any business process change:

  • Let the process “soak”, giving it some time for people to get used to it before making changes (since people will always chafe against a new process when they’re first getting used to it).
  • Listen to and engage the stakeholders.
  • Benchmark against other similar companies, and don’t reinvent the wheel (including using ideas from other successful organizations).
  • Ensure senior management is fully committed, or you will fail.
  • Ensure that you’re adequately resourced for all stages of the project, including post-implementation.

At the end of the day, they’ve cleaned up all of the problems identified by the 2002 audit, and now have a consistent pre-qualification process for contractors that benefits the entire organization.

McGillis travels extensively both to make sure that the program is being implemented consistently within OPG, and as an evangelist with external companies and by speaking at conferences.

Could parts of this process be automated to some benefit? Possibly, although they’ve likely gained so much of their ROI already in terms of cleaning up the process and capturing the relevant data in their database application. Process automation might provide them with some additional visibility into the processes, although likely not much more efficiency.