The power of Lean IT #BTF09

John Swainson, CEO of CA, gave a presentation on how Lean can help companies build long-term competitive advantage during tough economic times in industries as diverse as manufacturing, healthcare, retail and IT, and how Lean IT – or what he referred to as the industrialization of IT – can deliver greater value at lower cost. As he pointed out, it’s about time that we applied some discipline to IT, and we can learn from how Lean helped other types of organizations to create just-in-time IT that deploys the right solutions at the right time.

Lean IT is about a “sense and respond” philosophy that has IT paying attention to what’s happening in the business: managing variable volumes, prioritizing and creating new business services, and ensuring ongoing quality levels.

CA commissioned a study on waste, and found that 96% of IT executives agree that there is significant waste in their organization, primarily due to inefficient processes, duplication of effort, redundant applications, and underutilized assets; Swainson sees that the keys to resolving these issues are to analyze, automate, integrate and optimize (respectively).

He was then joined on stage by John Parkinson, CTO of TransUnion. TransUnion is a credit rating/tracking service that follows individual consumer credit ratings around the world, so IT is absolutely central and critical to their operations, not just a support function. As the recession approached in 2007, however, they had to consider how to grow new sources of revenue and increase operating margins in order to decrease their dependency on the failing consumer credit market. Lean was part of their strategy, helping them to pinpoint wasted effort and other forms of waste, and allowing them to optimize their operations. Given their corporate culture, they needed this to be more of a grassroots initiative that made sense to people: adopting Lean because it helped them do their jobs better, not because of a corporate mandate. There is, however, monitoring and measurement in place, and performance and compensation are tied to improvements: the ultimate incentive. Their idea is to instill these Lean ideas into their culture, so that the good habits learned in tough times will serve them well when times improve.

Parkinson pointed out specifically that Lean sets you up for taking advantage of cloud computing, and Swainson took over to talk about the opportunities and challenges of working in the cloud. It’s pretty hard to embrace Lean without at least taking a look at using the cloud, where you can provision the right resources at the right time, rather than having a lot of excess capacity sitting around in your data center. Consider non-production environments such as test labs, for example: being able to create test environments as required – either through internal virtualization (which I don’t really consider to be cloud) or in an environment such as Amazon EC2 – rather than having them pre-installed and ready long before required, and sized for peak rather than actual load. Considering that test environments can be 2/3 or more of the server footprint, this is huge.
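The arithmetic behind that claim is easy to sketch. With some made-up numbers (a 20-server test lab, fully busy only a fraction of the month), the gap between peak-sized always-on environments and on-demand provisioning looks like this:

```python
def always_on_server_hours(servers_at_peak, hours):
    # Pre-installed environments sized for peak load run around the clock.
    return servers_at_peak * hours

def on_demand_server_hours(usage_by_hour):
    # Provision only what each hour actually needs, and release the rest.
    return sum(usage_by_hour)

# Illustrative month (720 hours): the 20-server lab is fully busy for
# roughly 10% of the time, and nearly idle otherwise.
usage = [20 if h % 72 < 7 else 2 for h in range(720)]

always_on = always_on_server_hours(20, 720)
on_demand = on_demand_server_hours(usage)
savings = 1 - on_demand / always_on
print(f"always-on: {always_on} server-hours, on-demand: {on_demand}, saving {savings:.0%}")
```

The exact savings depend entirely on how spiky the test workload is, but any lab sized for peak and left running pays for its idle hours.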

Mike Gilpin joined them for a discussion, which briefly continued on the topic of using virtualized or cloud test environments, but also covered the issues of how well contract IT employees can adapt to Lean cultures (if they don’t, then find someone else), using other techniques such as Six Sigma together with Lean (they’re all tools focused on process optimization, pick and choose what works for you), and the security challenges of using cloud infrastructure.

George Colony on the CEO’s brain #BTF09

We had a brief address from George Colony, CEO of Forrester, on changing from IT to BT, with one key message: if your CEO doesn’t understand what you’re talking about, then you’re probably not using “BT speak”.

The CEO is focused on two things: higher profits and revenue growth. You need to translate your projects and technology strategy into those terms, or risk being marginalized within the organization.

A brief 10-minute address, but a good message.

Lean and the CIO #BTF09

Tom Hughes, currently with CSC but formerly the CIO of the US Social Security Administration, spoke to us about Lean and the CIO. The imperative here is driven by surveys that show that (to paraphrase) business thinks that IT is important, but that they’re doing a crappy job. He believes that CIOs need to break out of the technology pack and focus on business outcomes (e.g., market share) rather than outputs (e.g., number of workstations): exactly the same message as Connie Moore gave us in the opening keynote. CIOs need to be valid members of the executive team, reporting to the board rather than the COO, HR, general counsel or any of a number of other non-effective reporting structures.

He believes that the CIO of the future must:

  • Be a strategic thinker, not an IT techie
  • Be at the table of chief executives
  • Partner in agency or business transformation
  • Have broad experience

The CIO’s focus needs to be on four things: strategy, budget, architecture and security. Delivery and maintenance, on the other hand, are operational issues, and should be handled below the CIO level, even directly in the business units by promoting cross-functional ownership. The CIO needs to be forward-thinking and set strategy for new technologies such as cloud computing and unified communications, but doesn’t need to be responsible for delivering all of it: for things that the business can handle on their own, such as business process analysis, let the business take the lead, even if it means acquiring and deploying some form of technology on their own.

He concluded with the statements that the CIO needs to work with the CEO and develop a collaborative operational model, be at the table with other senior executives, and get other executives to take accountability for how technology impacts their business area. The CIO needs to be seen by the CEO as a partner in business transformation, not the guy fixing his Blackberry.

Questions from the audience included how to transition the current technology-focused IT teams to have more of a business focus: Hughes’s response is that some of them will never change, and won’t make the cut; others can benefit by being seconded to the business for a while.

On a side note, I like the format of the keynotes: Mike Gilpin pops up on stage at the end of each one, he and the speaker move to a couple of comfy chairs at center stage, and he asks some questions to continue the conversation. Questions from the audience are collected on cards and vetted by Forrester analysts, who then distill them into a few key questions to ask.

There’s still a bit of confusion over the Twitter hashtag: the website says #BTF09, then Gilpin announced in the opening address that it is #FBTF09, but then @forrester DM’ed me that it is actually #BTF09 and that Gilpin will correct this, although that hasn’t happened yet.

Why Lean is the new business technology imperative #BTF09

I’ve moved from the Gartner BPM summit in Orlando to Forrester’s Business Technology Forum in Chicago, where the focus is on Lean as the new business imperative: how to use Lean concepts and methods to address the overly complex things in our business environment.

Mike Gilpin opened the conference with a short address on how our businesses and systems got to be so bloated that lean has become such an imperative, then Connie Moore took over for the keynote. From the keynote’s description on the event agenda site:

Lean is not a new business concept — but it is enduring. By embracing Lean years ago, Toyota reached No. 1, while rivals GM and Chrysler collapsed into wards of the state. In its broadest sense, Lean seeks to better satisfy customer needs, improve process and information flows, support continuous improvement, and reduce waste. Today’s recession is a clarion call for businesses and government to reexamine and reapply Lean thinking across people, processes, and technology. When maintenance eats 80% to 90% of IT budgets, it’s beyond time to examine Lean approaches — like process frameworks, cloud computing, SaaS, Agile methodologies, open source, or other fresh ideas. And when the sheer complexity of technology overwhelms information workers, it’s time to simplify and understand what workers really need to get their jobs done. And by focusing on Lean now, your organization will be positioned to power out of the recession and move quickly into the next new era of IT: business technology — where business is technology and technology is business.

She started with discussions about how Lean started in manufacturing, and you can see the obvious parallels in information technology. In Lean manufacturing, the focus is on eliminating waste, and everyone owns quality and problems are fixed at the source. Lean software isn’t a completely new idea either, but Forrester is pushing that further to change “information technology” to “business technology”.

Lean is not just operational, however, it’s also strategic, with a focus on understanding value. However, it’s usually easier to get started on it at the operational level, where it’s focused on eliminating waste through improving quality, eliminating non-productive time, and other factors. Lean can be counterintuitive, especially if you’ve been indoctrinated with an assembly line mentality: it can be much more efficient, for example, for individuals or small teams to complete an entire complex task from start to finish, rather than have each person or team perform only a single step in that task.

Moving on to the concepts of Lean software, she started with the results of a recent Forrester survey that showed that 92% think that enterprise software has an excessive cost of ownership (although personally, I’m not sure why they bothered to take a survey on something so incredibly obvious 🙂 ), and discussed some of the alternatives: SaaS such as Google Apps, open source or free software and other lighter weight tools that can be deployed at much less cost, both in licensing costs and internal resource usage. Like Goldilocks, we need to all start buying what’s just right: not too much or too little, in spite of all those licenses that the vendor wants to unload at a discount before quarter-end.

Looking at the third part of their trifecta, there’s a need to change IT to BT (business technology). That’s mostly about governance – who has responsibility for the technology that is deployed – and turning technology back into a tool that serves the business rather than some separate entity off doing technology for its own sake. What this looks like in practice is that the CIO is most likely now focused on business process improvement, with success being measured in business terms (like customer retention) rather than IT terms (like completing that ERP upgrade on time, not that that ever happens). Stop leading with technology solutions, and focus on value, flexibility and eliminating waste. You can’t do this just by having a mandate for business-IT alignment: you need to actually fuse business and IT, and radically change behaviors and reporting structures. We’re stuck in a lot of old models, both in terms of business processes and organizational models, and these are unsustainable practices in the new world order.

There were some good questions from the audience on how this works in practice: whether IT can be Lean even if this isn’t practiced elsewhere in the organization (yes, but with less of an effect), what this means for IT staff (they need to become much more business focused, or even move to business areas), and how to apply Lean in a highly regulated environment (don’t consider required compliance as waste, but consider how to have less assembly-line business processes and look for waste within automated parts of processes).

Getting Business Process Value From Social Networks #GartnerBPM

For the last session of the day, I attended Carol Rozwell’s presentation on social network analysis and the impact of understanding network processes. I’ll be doing a presentation at Business Rules Forum next month on social networking and BPM, so this is especially interesting even though I’ll be covering a lot of other information besides social graphs.

She started with the (by now, I hope obvious) statement that what you don’t know about your social network can, in fact, hurt you: there are a lot of stories around about how companies have and have not made good use of their social network, and the consequences of those activities.

She posited that while business process analysis tells us about the sequence of steps, what can be eliminated and where automation can help, social network analysis tells us about the intricacies of working relationships, the complexity and variability of roles, the critical people and untapped resources, and operational effectiveness. Many of us are working very differently than we were several years ago, but this isn’t just about “digital natives” entering the workforce, it’s about the changing work environment and resources available to all of us. We’re all more connected (although many Blackberry slaves don’t necessarily see this as an advantage), more visual in terms of graphical representations and multimedia, more interactively involved in content creation, and we do more multitasking in an increasingly dynamic environment. The line between work and personal life blurs, and although some people decry this, I like it: I can go to many places in the world, meet up with someone who I met through business, and enjoy some leisure time together. I have business contacts on Facebook in addition to personal friends, and I know that many business contacts read my personal blog (especially the recent foodie posts) as well as my business blog. I don’t really have a lot to hide, so I don’t have a problem with that level of transparency; I’m also not afraid to turn off my phone and stop checking my email if I want to get away from it all.

Your employees are already using social media, whether you allow it within your firewall or not, so you might as well suck it up and educate them on what they can and can’t say about your company on Twitter. If you’re on the employee side, then you need to embrace the fact that you’re connected, and stop publishing those embarrassing photos of yourself on Facebook even if you’re not directly connected to your boss.

She showed a chart of social networks, with the horizontal axis ranging from emergent to engineered, and the vertical axis from interest-driven to purpose-driven. I think that she’s missing a few things here: for example, open source communities are emergent and purpose-driven, that is, at the top left of the graph, although all of her examples range roughly along the diagonal from bottom left to top right.

There are a lot of reasons for analyzing social networks, such as predicting trends and identifying new potential sources of resources, and a few different techniques for doing this:

  • Organizational network analysis (ONA), which examines the connections amongst people in groups
  • Value network analysis (VNA), which examines the relationships used to create economic value
  • Influence analysis, a type of cluster analysis that pinpoints people, associations and trends

Rozwell showed an interesting example of a company’s organizational chart, then the same players represented in an ONA. Although it’s not clear exactly what the social network is based on – presumably some sort of interpersonal interaction – it highlights issues within the company in that some people have no direct relationship with their direct reports, and one person who was low in the organizational chart was a key linkage between different departments and people.

She showed an example of VNA, where the linkages between a retailer, distributor, manufacturer and contract manufacturer were shown: orders, movements of goods, and payments. This allows the exchanges of value, whether tangible or intangible, to be highlighted and analyzed.

Her influence analysis example discussed the people who monitor social media – either within a company or their PR agency – to analyze the contributors, determine which are relevant and credible, and use that to drive engagement with the social media contributors. I get a few emails per day from people who start with “I read your blog and think that you should talk to my customer about their new BPM widget”, so I know that there are a lot of these around.

There are some basic features that you look for when doing network analysis: central connectors (those people in the middle of a cluster), peripheral players (connected to only one or two others), and brokers (people who form the connection between two clusters).
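These features fall out of simple graph measures. Here’s a minimal sketch on a toy network (all names and edges invented, and none of this reflects Rozwell’s actual tooling): degree picks out central connectors and peripheral players, and a naive cut-vertex test flags brokers whose removal splits the clusters apart.

```python
from collections import defaultdict

# Toy network: two clusters (a*, b*) joined only through one node, "broker".
edges = [("a1", "a2"), ("a1", "a3"), ("a2", "a3"), ("a2", "a4"), ("a3", "a4"),
         ("b1", "b2"), ("b1", "b3"), ("b2", "b3"),
         ("a1", "broker"), ("a2", "broker"), ("broker", "b1"), ("broker", "b2")]

adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

degree = {n: len(nbrs) for n, nbrs in adj.items()}

# Central connectors: the most-connected people in the network.
top = max(degree.values())
central = sorted(n for n, d in degree.items() if d == top)

# Peripheral players: connected to only one or two others.
peripheral = sorted(n for n, d in degree.items() if d <= 2)

# Brokers: removing them disconnects part of the network (cut-vertex test).
def reachable_without(skip):
    start = next(n for n in adj if n != skip)
    seen, stack = {start}, [start]
    while stack:
        for nbr in adj[stack.pop()]:
            if nbr != skip and nbr not in seen:
                seen.add(nbr)
                stack.append(nbr)
    return seen

brokers = sorted(n for n in adj if len(reachable_without(n)) < len(adj) - 1)
print(central, peripheral, brokers)
```

Real SNA tools use richer measures (betweenness, clustering), but the intuition is the same: structure, not the org chart, tells you who actually holds the network together.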

There are some pretty significant differences between ONA, VNA and business process analysis, although there are some clear linkages: VNA could have a direct impact on understanding the business process flows, while ONA could help to inform the roles and responsibilities. She discussed a case study of a company that did a business process analysis and an ONA, and used the ONA on the redesigned process in order to redesign roles to reduce variability, identify roles most impacted by automation, and expose critical vendor relationships.

Determining how to measure a social network can be a challenge: one telecom company used records of voice calls, SMS and other person-to-person communications in order to develop marketing campaigns and pricing strategies. That sounds like a complete invasion of privacy to me, but we’ve come to expect that from our telecom providers.

The example of using social networks to find potential resources is something that a lot of large professional services firms are testing out: she showed an example that looked vaguely familiar where employees indicated their expertise and interests, and other employees could look for others with specific sets of skills. I know that IBM does some of this with their internal Beehive system, and I saw a presentation on this at the last Enterprise 2.0 conference.

There are also a lot of examples of how companies use social networks to engage their customers, and a “community manager” position has been created at many organizations to help manage those relationships. There are a lot of ways to do this poorly – such as blasting advertising to your community – but plenty of ways to make it work for you. Once things get rolling in such a public social network, the same sort of social network analysis techniques can be applied in order to find the key people in your social network, even if they don’t work for you, and even if they primarily take an observer role.

Tons of interesting stuff here, and I have a lot of ideas of how this impacts BPM – but you’ll have to come to Business Rules Forum to hear about that.

Fujitsu process discovery case study #GartnerBPM

I first saw Fujitsu’s process discovery offering last year, and it looked pretty useful at the time, but it didn’t have much of a track record yet. Today’s session brought forward Greg Mueller of Electro Scientific Industries (ESI), a manufacturer of photonic and laser systems for microengineering applications, to talk about their successes with it.

Basically, the Automated Process Discovery (APD) uses log files and similar artifacts from a wide variety of systems in order to derive a process model, analyzing frequencies of process variations, and slicing and dicing the data based on any of the contributing parameters. I’ve written a lot about why you would want to do process discovery, including some of the new research that I saw at BPM 2009 in Germany last month.

ESI wanted to reduce inventory and improve manufacturing cycle time, and needed to understand their opportunity-to-order process better in order to do that. They used APD to determine the actual process flows based on about 15 months of data from SAP and other systems, then validated those flows with the teams who actually work those processes. They wanted to look at variations based on business unit and other factors to figure out what was causing some of their cycle time and inventory problems.

They assumed a relatively simple four-step process of opportunity-quote-order-shipment, possibly with 3-4 additional steps to allow revisions at each of these steps; what they actually found when they looked at about 11,500 process instances is that they had over 1,300 unique process flows. Yikes. Some of this was cycling through steps such as order change: you would expect an order to be changed, but not 120 times as they found in some of their instances. There were also loopbacks from order to quote, each of these representing wasted employee time and increased cycle time. They found that one task took an average of 58 days to complete, with a standard deviation of 68 days – again, a sign of a process out of control. They realize that they’re never going to get it down to 25 unique process flows, but they are aiming for something far lower than 1,300.
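The variant counting that produces a number like “1,300 unique flows” is conceptually simple, even though APD’s actual implementation is much richer: group log rows by case, order them by timestamp, and count the distinct activity sequences. A sketch on invented data:

```python
from collections import Counter

# Invented event log rows: (case id, timestamp, activity). A case's
# "variant" is its ordered activity sequence.
log = [
    (1, "09:00", "opportunity"), (1, "09:30", "quote"),
    (1, "10:00", "order"), (1, "11:00", "shipment"),
    (2, "09:10", "opportunity"), (2, "09:40", "quote"),
    (2, "10:05", "order"), (2, "10:30", "order change"),
    (2, "10:45", "order change"), (2, "12:00", "shipment"),
    (3, "09:20", "opportunity"), (3, "09:50", "quote"),
    (3, "10:10", "order"), (3, "13:00", "shipment"),
]

# Group activities by case in timestamp order.
cases = {}
for case_id, timestamp, activity in sorted(log):
    cases.setdefault(case_id, []).append(activity)

# Count how many cases follow each distinct sequence.
variants = Counter(tuple(seq) for seq in cases.values())
for variant, count in variants.most_common():
    print(count, "x", " -> ".join(variant))
```

Run this over 11,500 real cases instead of three and you see how a “simple four-step process” fans out into hundreds of variants, each with its own frequency to analyze.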

They did a lot of data slicing and analysis: by product, by region, by sales manager and many other factors. APD allows for that sort of analysis pretty easily (from what I saw last year), much like any sort of dimensional modeling that you would do in a data warehouse.

They observed that less than 20% of their opportunities followed the happy path, and the rest were taking too long, duplicating efforts, having too many rework loopbacks, and sometimes not even shipping after a great deal of up-front work.

In their process improvement phase, they established 22 projects including a number of improvements such as automating processes to reduce repeated steps, improving entry flow to reduce time intervals, and requiring the entry of initial data early in the process in order to reduce loopbacks and rework. Since their business runs on SAP, a lot of this was implemented there (which raises the question of who did such a crappy SAP implementation for them in the first place such that they had problems like this – seriously, insufficient required data entry at the start of a process?), and they’re able to keep extracting and analyzing the logs from there in order to see what level of improvement they’re experiencing.

After a much too short presentation by ESI, Ivar Alexander from Fujitsu gave us a demo of APD with ESI’s basic process; I’ve seen a demo before, but it’s still fascinating to see how the system correlates data and extracts the process flows, then performs detailed dimensional analysis on the data. All of this is done without having to do a lot of interviews of knowledge workers, so it’s non-invasive from both a people and a systems standpoint.

It’s important to recognize that since APD is using the system logs to generate the process flows, only process steps that have some sort of system touch-point will be recorded: purely manual process steps will not. Ultimately, although they can make big improvements to their SAP-based processes based on the analysis through APD, they will probably need to combine this with some manual analysis of off-system process steps in order to fully optimize their operations.

Dynamic BPM versus agility #GartnerBPM

Jim Sinur led a session this morning on dynamic BPM and how to deal with the demands for change. He started with the statement that dynamic BPM is more than just another type of BPM technology, it’s a requirement for a transformational advantage, and took a look at how BPM will become more dynamic in the future.

Change is driven by unexpected exceptions in processes, and patterns of these unexpected events can indicate trends in your business environment that the processes need to accommodate. Typical change cycles in IT, however, tend to be slow and steady, which doesn’t at all match either the business dynamics or the external forces that shape them. Being able to handle these spiky demands drives the requirement for more dynamism in how processes and rules are managed, and drives the requirement for the business to be able to manage these directly rather than having to engage IT for all changes.

Gartner’s definition of dynamic BPM is the ability to support process change by any role, at any time, with very low latency. Change agents include everyone from customers and business people through business and process analysts, and on to architects and developers; if the people at the business end of this spectrum aren’t allowed to make process changes, then they’ll just work around it and invent their own processes using their own tools. This isn’t just about each individual’s personal preferences for how they work, however: if knowledge workers can make changes to their processes, they will tend to make them more efficient and effective, which has enterprise benefits.

A significant part of this is the inclusion of explicit rules within processes, so that scenario-driven rule sets can detect and respond to conditions, even without the process participants having to make those changes themselves: the basis of what James Taylor was saying in his presentation this morning. What used to be monolithic lumps of code can be split into several parts, each of which has the potential to be agile: user interface is managed by portals and the web; decision points are handled by rules engines; paths of execution are managed by BPMS; and data definitions are handled in databases or XML data representations. All of those parts used to be under the control of the developers, but turning it inside out and using more agile technologies allows people to customize their UI, change their rules on a daily basis, modify their processes, and define their own data structures. Dynamic BPM isn’t just about changing process models, it spans requirements, recompilation, data binding, loading and versioning.
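To make the “decision points handled by rules engines” part concrete, here’s a minimal sketch of rules held as data outside the process definition. This isn’t any particular rules product, and the claim fields, thresholds and action names are invented for illustration:

```python
# Rules as data: (condition, action) pairs, first match wins. The business
# can edit this list without redeploying the process that calls it.
rules = [
    (lambda claim: claim["amount"] > 10_000, "refer to adjuster"),
    (lambda claim: claim["type"] == "auto" and claim["amount"] <= 1_000,
     "auto-approve"),
]
default_action = "manual review"

def decide(claim):
    # The process invokes this at a decision point and routes on the result;
    # the routing logic itself never changes when the rules do.
    for condition, action in rules:
        if condition(claim):
            return action
    return default_action

print(decide({"type": "auto", "amount": 500}))
print(decide({"type": "auto", "amount": 50_000}))
```

A real rules engine adds authoring tools, conflict resolution and audit trails on top, but the separation of concerns is exactly this: the process owns the flow, the rule set owns the decision.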

There was quite a bit about services composition environments and CEP that I felt didn’t really belong in a presentation on dynamic BPM: yes, you need to have services and CEP in order to build agile processes in the first place, but it seems like filler.

One brief slide on “Web 2.0”, really just a quick laundry list of enterprise social software aspects that could impact BPM, including collaborative process design and execution, but no meat. Sinur merely read the list and pointed out that there are vendors at the showcase showing some of these capabilities. That was a bit of a disappointment, considering that the term “dynamic BPM” is being used by many (including Forrester and several vendors) to describe collaborative processes that are created or modified at runtime by the user.

He finished up with some sensible advice about separating rules and other application components from the processes in order to push towards more agile processes, although not different from the message that we’ve been hearing for quite a while now.

This wasn’t a new presentation: it was mostly recycled material that I had seen in previous Gartner presentations (either at conferences or on webinars) about agile BPM using rules, services and complex event processing. There’s been some new verbiage put around it and a few new slides, but only the briefest nod to the type of user-created ad hoc collaborative processes that represent the most dynamic form of BPM.

The five dysfunctions of a team #GartnerBPM

Jeff Gibson of the Table Group gave the morning keynote based on some of the concepts in his colleague’s book, The Five Dysfunctions of a Team: A Leadership Fable.

He started with the idea that there are two requirements for a company’s success: it has to be smart (strategy, marketing, finance, technology) and it has to be healthy (minimal politics, minimal confusion, high morale, high productivity, low turnover). Although a lot of management courses are focused on the smart side, the healthy side is a multiplier of the smart side, boosting the success far beyond what you can do by being smart alone.

He then moved on to the five dysfunctions of a team:

  1. Absence of trust, specifically personal trust and exposing vulnerability to other team members. The role of the leader is to go first in order to show that it’s okay to make mistakes.
  2. Fear of conflict, which can lead to misunderstandings because people don’t speak their mind. The role of the leader is to search out conflict amongst team members, draw out the issues and wrestle with them.
  3. Lack of commitment, particularly to tough decisions. The role of the leader is to force clarity and closure on those decisions to ensure that everyone is committed to upholding them.
  4. Avoidance of accountability. The role of the leader is to confront difficult issues, such as problematic team behaviors.
  5. Inattention to results. The role of the leader is to focus on collective outcomes, not allowing a “superstar” on the team to make themselves look good to the detriment of the team result.

Usually I find these external keynotes that are unrelated to the conference subject to be so-so, but I really enjoyed this one, and could have used this advice when I was heading up a 40-person company. I’ll be checking out the book.

Advanced decisioning #GartnerBPM

I managed to get out of bed and down to the conference in time for James Taylor’s 7am presentation on advanced decisioning. If you’ve been reading here for a while, you know that I’m a big proponent of using decisioning in the context of processes, and James sums up the reasons why: it makes your processes simpler, smarter and more agile.

Simpler: If you build all of your rules and decisioning logic within your processes – essentially turning your process map into a decision tree – then your processes will very quickly become completely unreadable. Separating decisions from the process map, allowing them to become the driver for the process or available at specific points within the process, makes the process itself simpler.

More agile: If you don’t put your decisioning in your processes, then you may have written it in code, either in legacy systems or in new code that you create just to support these decisions. In other words, you tried to write your own decisioning system in some format, but probably created something that’s much harder to change than if you’re using a rules management system to build your decisions. Furthermore, decisions typically change more frequently than processes; consider a process like insurance underwriting, where the basic flow rarely changes, but the rules that are applied and the decisions made at each step may change frequently due to company policy or regulatory changes. Using decision management not only allows for easier modification of the rules and decisions, it also allows these to be changed without changing the processes. This is key, since many BPMS don’t allow processes that are already in progress to be easily changed: that nice graphical process modeler that they show you will make changes to the process model for process instances created after that point, but won’t impact in-flight instances. If a decision management system is called at specific points in a process, it will use the correct version of the rules and decisions at that point in time, not the point at which the process was instantiated.
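That versioning behavior can be sketched in a few lines: the decision service keeps dated rule versions and resolves each call against the date of the call, not the date the process instance started. The dates and score thresholds below are invented:

```python
from datetime import date

# Versioned rule set: (effective-from date, minimum credit score).
score_floor_versions = [
    (date(2009, 1, 1), 620),
    (date(2009, 7, 1), 680),  # policy tightened mid-year
]

def min_score(as_of):
    # Latest version effective on or before the decision date.
    return [floor for effective, floor in score_floor_versions
            if effective <= as_of][-1]

def underwriting_decision(score, as_of):
    return "approve" if score >= min_score(as_of) else "decline"

# A long-running instance started in March gets each step decided under
# whatever rules are in force when that step executes:
print(underwriting_decision(650, date(2009, 3, 15)))  # 620 floor: approve
print(underwriting_decision(650, date(2009, 9, 15)))  # 680 floor: decline
```

The process never needs to be redeployed when the policy changes; the decision service simply answers differently after the new version’s effective date.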

Smarter: This is where analytics comes into play, with knowledge about processes fed into the decisioning in order to make better decisions in an automated fashion. Having more information about your processes increases the likelihood that you can implement straight-through processes with no human intervention. This is not just about automating decisions based on some initial data: it’s using the analytics that you continue to gather about the processes to feed into those decisions in order to constantly improve them. In other words, apply analytics to make decisions smarter and make more automated decisions.

To wrap up James’ five core principles of decisioning:

  • Identify, separate and manage decisions
  • Use business rules to define decisions
  • Use analytics to make decisions smarter
  • No answer is static
  • Decision-making is a process

He then walked through the steps to apply advanced decisioning, starting with identifying and automating the current manual decisions in the process, then applying analytics to constantly optimize those decisions.

He closed with an action plan for moving to decisioning:

  • Identify your decisions
  • Adopt decisioning technology
  • Think about decisions and processes, and how those can be managed as separate entities

Good presentation as always – well worth getting up early.

Using BPM to survive, thrive and capitalize #GartnerBPM

Last session of the day, a panel with Jim Sinur, Elise Olding and Michele Cantara on using BPM to survive, thrive and capitalize in a turbulent economy. I realize that this session has the same title as a webinar that Cantara and Janelle Hill did a while back, and there’s a lot of repeat material from that so I won’t bother to recapture it here. There’s a link to the webinar replay in that post, and I recommend checking it out if you weren’t here in Orlando today.

Off to the vendor showcase; that’s it for day 1 of the Gartner BPM summit.