Applying Lean Six Sigma Methodology to Transactional Processes

Next up was a panel discussion with David Haigh of Johnson & Johnson, Sabrina Lemos of United Airlines, and Gary Kucera of Kaplan Higher Education, moderated by Charles Spina of e-Zsigma.

United Airlines has a unique project going on in one of their freight-related operations: they decided to outsource the operation in order to completely remake the process and have it meet specific KPIs, but also decided to allow the existing people to bid on their own jobs. Since they're in a competitive bidding situation with outsiders, this pushes them out of their current ways of doing things and forces them to propose the best possible way to do the work. Lemos also spoke about the importance of getting to the real data. She did an exercise of tracing a particular biweekly report – which took several hours to compile – up to the VP and what he reports on, then tracked what he actually reports on back down to the reports and metrics being gathered at the lower levels. Not surprisingly, she found zero alignment: nothing in the biweekly reports was used by the VP in his report, or anywhere else in the chain of command. She spoke about using gauge R&R, process walks, and value stream mapping to analyze processes, and the necessity of coming to agreement on the meaning of things such as process start points.
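
Gauge R&R, for readers who haven't run one, decomposes measurement variation into repeatability (the same operator re-measuring the same item) and reproducibility (differences between operators). A simplified Python sketch of that decomposition, using invented data (and skipping the full AIAG ANOVA method), looks something like this:

```python
import statistics as st

# Simplified gauge R&R sketch: split measurement variation into
# repeatability (same operator re-measuring the same part) and
# reproducibility (differences between operators). The data is
# invented (2 operators x 3 parts x 2 trials).
data = {  # (operator, part): [trial measurements]
    ("op1", "p1"): [10.1, 10.2], ("op1", "p2"): [12.0, 11.9], ("op1", "p3"): [9.8, 9.9],
    ("op2", "p1"): [10.4, 10.5], ("op2", "p2"): [12.3, 12.2], ("op2", "p3"): [10.1, 10.2],
}

# Repeatability: pooled variance of repeated trials within each cell
repeatability = st.mean(st.variance(trials) for trials in data.values())

# Reproducibility: variance between the operators' overall means
by_operator = {}
for (op, _), trials in data.items():
    by_operator.setdefault(op, []).extend(trials)
reproducibility = st.variance([st.mean(v) for v in by_operator.values()])

print(f"repeatability var = {repeatability:.4f}")
print(f"reproducibility var = {reproducibility:.4f}")
```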

Haigh spoke about accounts payable processes at J&J Canada, and how an in-depth review of those processes was triggered by someone actually forgetting to pay the electricity bill: they showed up at the office one day to find a notice that the power would be cut if the bill weren't paid immediately. It wasn't that they didn't have the money to pay the bill, just that the process for doing so wasn't working. Accounts payable is often one of those processes that gets ignored in major process improvement efforts because it's not revenue generating, but enormous cost savings can be found by taking advantage of early payment discounts and avoiding late penalties or service disruptions. They have found that doing some of the work onsite, where the business processes are actually performed, is helpful, since the process participants can see what's involved in their process overall. They use the same techniques as discussed by Lemos, plus Kaizen Blitz and some activity-based costing.
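
Those early payment discounts are worth more than they look. A minimal Python sketch of the arithmetic, using the common "2/10 net 30" terms as a hypothetical example rather than a figure from the panel:

```python
def annualized_discount_cost(discount_pct: float, discount_days: int, net_days: int) -> float:
    """Effective annual rate of forgoing an early payment discount.

    Paying on day net_days instead of day discount_days costs
    discount_pct of the invoice for the extra days of financing.
    """
    period_rate = discount_pct / (100.0 - discount_pct)       # cost per period
    periods_per_year = 365.0 / (net_days - discount_days)     # periods in a year
    return period_rate * periods_per_year

# "2/10 net 30": 2% off if paid within 10 days, full amount due in 30.
# Skipping the discount is like borrowing at roughly 37% per year.
print(f"{annualized_discount_cost(2, 10, 30):.1%}")  # 37.2%
```

At rates like that, a payables process that reliably captures discounts pays for its own improvement effort quickly.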

Kucera spoke about aligning corporate and executive goals with efforts at all levels, and how Jack Welch suggested tying bonuses to some percentage of process improvement savings in order to incent people to align their behavior and metrics with the ultimate goals. He spoke about some of the modeling and display tools that they use, such as fishbone and Pareto diagrams, and how creating these early and engaging with business management can greatly speed process improvement efforts. In many cases, since they're dealing with simple transactional processes, fairly simple analysis tools suffice, with the more sophisticated tools and techniques available as required.
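
A Pareto analysis is exactly that kind of simple tool: rank the causes of defects by frequency to find the "vital few" that account for most of the problems. A minimal Python sketch, with invented accounts payable defect categories:

```python
# Rank defect causes by frequency and compute the cumulative share,
# to identify the "vital few" worth attacking first. The categories
# and counts below are invented for illustration.
defects = {
    "missing PO number": 120,
    "wrong account code": 85,
    "duplicate invoice": 40,
    "illegible scan": 15,
    "other": 10,
}

total = sum(defects.values())
cumulative = 0
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:20s} {count:4d}  {cumulative / total:6.1%} cumulative")
```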

They all had examples of process improvement efforts with direct customer impact. Lemos had a great example from processing freight insurance claims, where a metric of processing five claims per day resulted in the claims people cherry-picking claims in order to meet their quota; enforcing first-in, first-out claims processing brought an immediate and dramatic improvement in customer satisfaction. Listening to her stories of their paper-based inefficiencies, where emails are printed, signed and passed around, reminds me so much of the processes at some of my financial services and insurance customers.

In all cases – and I think that this is a key criticism of Lean and Six Sigma – they're looking for incremental process improvements, not the completely disruptive reengineering that would discover new ways to do business. However, for many of today's standard transactional processes, incremental improvement is the only realistic option.

Lean Six Sigma & Process Improvement: David Brown of Motorola

I missed the first morning of the IQPC Lean Six Sigma & Process Improvement conference in Toronto today, but with my usual impeccable timing, showed up just in time for lunch (where we had to explain the rules of curling to the American attendees). The first session this afternoon is with David Brown, a black belt at Motorola, where the term "Six Sigma" was coined, and where it's still used to make their processes more effective, efficient, productive, and transparent.

There has been a transformation in how they analyze their processes, ranging from just looking at transactions to high-level intelligence that includes complex simulations and forecasting. Since they run SAP for their ERP, they use a number of SAP business intelligence products (Xcelsius and Business Objects), although their most complex analysis is done with Oracle Crystal Ball.
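
Crystal Ball's core technique is Monte Carlo simulation: model uncertain inputs as probability distributions, run thousands of scenarios, and read the forecast off the resulting distribution. A minimal Python sketch of the idea, with all figures invented for illustration:

```python
import random

# Minimal Monte Carlo forecast in the spirit of what a tool like
# Crystal Ball automates: model uncertain inputs as distributions,
# sample many scenarios, and read percentiles off the results.
# All figures are invented for illustration.
def simulate_monthly_margin():
    orders = random.gauss(10_000, 1_500)       # uncertain order volume
    price = random.triangular(45, 60, 50)      # uncertain unit price (low, high, mode)
    unit_cost = random.gauss(40, 2)            # uncertain unit cost
    return orders * (price - unit_cost)

trials = sorted(simulate_monthly_margin() for _ in range(100_000))
p10, p50, p90 = (trials[int(len(trials) * p)] for p in (0.10, 0.50, 0.90))
print(f"margin forecast: P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f}")
```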

Brown's presentation was short – less than 10 minutes – and the rest of the session was an interactive one-on-one interview with questions from Charles Spina of e-Zsigma, the conference chair. The Q&A explored much more about how Motorola uses business analytics tools, and opened it up to the (small) audience for their experience with analytics. Not surprisingly, there has been quite a bit of success through the introduction of analytics to process improvement teams: sometimes it's the black belts themselves, sometimes it's a separate analytics group that works closely with the belts to develop the reports, analysis, and more complex intelligence based on the large volumes of data collected as part of any process improvement project.

Reporting tools range from Excel – for simple needs – to more complex solutions that include ETL from multiple data sources and regularly scheduled reports, such as Crystal Reports and Xcelsius. Legacy systems can make that a bit of a challenge; often these end up as extracts to Excel or Access, which are then remixed with other sources. Extracts like these can be really problematic, as I've seen first-hand with many of my customers: there's no way to keep the data completely in sync with the underlying systems, and typically no one legacy system has all the relevant data, so matching up related data from multiple systems becomes a real problem. Brown underlined that the key is to get all of your data into a central data warehouse in order to determine whether your data is complete and clean, and to facilitate reporting and analytics. This is especially important for process engineers doing time studies over long periods: if you don't have a consistent representation of the processes over the period in question, your analysis will suffer.
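
That matching problem shows up the moment you try to join extracts from two systems. Here's a toy pandas sketch (system names, fields, and values all invented) that reconciles two extracts on an order key and surfaces the records that don't line up:

```python
import pandas as pd

# Toy reconciliation of extracts from two systems, as you might do
# before loading a warehouse: join on a shared key and surface the
# rows that fail to match. Field names and values are invented.
erp = pd.DataFrame({
    "order_id": ["A100", "A101", "A102"],
    "ship_date": ["2009-11-02", "2009-11-03", "2009-11-05"],
})
legacy = pd.DataFrame({
    "order_id": ["A100", "A102", "A103"],   # A101 missing, A103 extra
    "invoice_amt": [1200.0, 850.0, 430.0],
})

merged = erp.merge(legacy, on="order_id", how="outer", indicator=True)
print(merged[merged["_merge"] != "both"])   # the unmatched rows are the real work
```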

Motorola is using their data analytics to improve operational processes, such as order shipping, but also for what-if scenarios that inform salespeople on the impact of discount levels on the bottom line. In many cases, this is an issue of data integration: Sabrina Lemos from United Airlines (who also appeared on the panel covered above) shared what they were able to recover in late container fees just by integrating their container tracking system with the database (Access, alas) that generates their invoices. Interestingly, I wouldn't have thought of this as a process improvement initiative – although it is – but rather just as an artifact of doing some clever system integration.
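
Here's a hedged sketch of that kind of integration (the records, fields, and fee policy are my invention, not United's actual systems): join container tracking data against what was actually invoiced, and flag the gap:

```python
# Join container tracking data with invoiced charges and flag late
# returns that were never billed. Records, fields, and the fee policy
# are invented for illustration.
FREE_DAYS, DAILY_FEE = 5, 75.0

tracking = [          # (container_id, days the container was out)
    ("CNT-001", 4),
    ("CNT-002", 9),
    ("CNT-003", 12),
]
invoiced = {"CNT-002": 300.0}   # late fees actually billed, by container

for container, days_out in tracking:
    owed = max(0, days_out - FREE_DAYS) * DAILY_FEE
    billed = invoiced.get(container, 0.0)
    if owed > billed:
        print(f"{container}: owed {owed:.2f}, billed {billed:.2f}, "
              f"recoverable {owed - billed:.2f}")
```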

They also discussed the challenges of presenting analytics results to the less numerically inclined, which often means rolling data up into simpler charts that can be drilled into as required, or just presented in a PowerPoint or PDF file. The real ROI may come from more interactive tools, however, such as dashboards that show operational alerts, or real-time what-if analysis to support human and automated decisions. Since Lean and Six Sigma tools are inherently analytical, this isn't a new problem for the people in this audience; it's a matter of building relationships early with the non-analytical business managers, getting some early successes in projects to encourage adoption, and using different presentation and learning styles to present the information.

Because of the nature of this audience, the analytics that they're discussing are typically for human consumption; in the BPM world, this is increasingly moving toward using analytics to generate events that feed back into processes, or to inform automated decisioning. Either way, it's all about improving the business processes.
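
As a minimal sketch of that feedback loop (the SLA threshold and event shape are hypothetical): watch a process metric and emit an event that a BPM engine could act on:

```python
# Watch claim cycle times and emit an escalation event when a claim
# exceeds the SLA. The threshold and event shape are hypothetical.
SLA_HOURS = 48

def sla_breach_events(open_claims):
    """Yield an event dict for each claim older than the SLA."""
    for claim_id, age_hours in open_claims:
        if age_hours > SLA_HOURS:
            yield {"event": "sla_breach", "claim": claim_id, "age_hours": age_hours}

claims = [("CL-101", 12), ("CL-102", 55), ("CL-103", 49)]
for event in sla_breach_events(claims):
    print(event)   # a BPM engine would route this to an escalation task
```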