CMU Masters in Software Management

Often, when I receive a request for a meeting on something that’s far outside of my usual BPM/Enterprise 2.0 interests, I’ll turn it down. However, when the meeting is with various deans and professors at Carnegie Mellon University West about their new Masters in Software Management program (press release here), I’m happy to make an exception. I graduated as an engineer over 20 years ago, and programs like this just weren’t available then; I was curious to see how engineering education has advanced. I had a call with Dr. Jim Morris, dean of the CMU west coast campus, Dr. Martin Griss, associate dean for education and director of the software engineering program, and Tony Wasserman, executive director of the Center for Open Source Investigation. Of course, they’re all professors at CMU as well, at the relatively new campus in Silicon Valley.

The Masters in Software Management is like a software engineering equivalent to an executive MBA: it’s intended for people who are already experienced practitioners but want to improve their management skills in a big way, and do so part-time while they continue to work so that they can start to see immediate application and benefit. It grew out of the high level of interest in the management courses offered as part of the Masters in Software Engineering program that’s been running since 2002, as well as interest from employers in the marketplace for the skills that they plan to teach. The Masters in Software Management is less technical than the Masters in Software Engineering, but offers some amazing courses that I think should work their way into any senior software engineering or computer science curriculum: open source, enterprise architecture, managing distributed teams, outsourcing, and many others. Since these are presented in a current business context, using long-running teams and simulating a small company experience, the goal is to produce the next generation of software leaders.

The program doesn’t kick off until later this year, so they don’t know the demographics of the student population yet, but are expecting that most will have a technical computer science/software engineering background, and that there will be a mix of those from small companies who want to improve their skills and build the next Google, and some from large companies who are either closet entrepreneurs or are serious about software management within their organization. About 1/3 of the Masters in Software Engineering program attendees are women, and they expect the percentage to be higher in the Software Management program. As in the Software Engineering program (where about 30% of the students are offsite), they’ll allow remote students, although they need to be onsite for the 4-day kickoff and a few more times during the 2-year program.

BPM in Action panel

If you enjoyed the free-for-all discussion at the end of the webinar that I moderated with Colin Teubner and Jim Rudden a few weeks back, you’re going to love the panel that I’m hosting next week on BPM and Enterprise 2.0 as part of the BPM in Action series. It’s not sponsored by a vendor, so I was able to pick whoever I wanted on the panel, and there will be no vendor product pitches or slides — just an interactive discussion between the four of us. Get over there and sign up.

Here’s the description that I wrote for it:

As Web 2.0 technologies and concepts advance in the consumer world, they are also impacting enterprise software as users change their expectations of how software should behave. This “Enterprise 2.0” movement is impacting BPM software in a variety of ways ranging from platforms to user functionality to integration. This panel will explore the following:

  • How Web 2.0 is becoming Enterprise 2.0
  • BPM platform changes: the impact of browser-based tools and software as a service
  • New tools and techniques for improving user participation in process design and execution
  • New ways of “mashing up” BPM data with internal and external data

I picked three people to join me whose opinions I value and who I think will be interesting in this format: Phil Gilbert of Lombardi, Phil Larson of Appian, and Ismael Ghalimi of Intalio. They’re all very opinionated and all have a stake in the Web 2.0/Enterprise 2.0 space: Lombardi’s very cool Web 2.0 Blueprint release for widespread collaboration, Appian’s kick-butt browser-based process designer for serious Enterprise 2.0 work in a browser, and Ismael’s involvement in radical office 2.0 and BPM 2.0 ideas.

We only have 45 minutes so I’m going to have to keep a tight rein on the conversation to cover off our proposed subject areas, which could be difficult because I inadvertently invited two Phils to the panel (“Phil, pipe down! No, not you Phil, the other Phil!”). No “call me Ishmael” jokes, I’ve heard them already.

By the way, congratulations to Ismael and his wife on their new arrival. No wonder he wasn’t at the Gartner conference.

BPM splog

If you surf around looking for BPM blogs, you may have noticed something strange: my blog posts from here on Column 2 reproduced in their entirety and without permission on the blog of Mark Bean, the VP of Sales for an ECM/BPM-related vendor, Altien. I’m not linking to them or to the fake blog itself, called “Office 2.0 and ECM News”, since I am definitely not encouraging traffic.

This is a clear violation of my intellectual property and copyright, and I’m amazed that anyone who works in this industry would propagate such an openly fraudulent and illegal activity. Maybe that tells you something about how Altien does business in general.

I noticed this a few weeks back, but I only noticed that he started stealing the full posts (as opposed to significant chunks of them) with my Gartner coverage this week. I sent Bean a request this morning to stop stealing my blog posts, and he replied "Sure thing", like I'd asked him about the weather — no apology, no admission that he might have violated blogging etiquette, much less copyright law. I've asked him to remove all of my full posts from his site, although obviously there's no law against him linking to any of my posts and publishing a short excerpt under fair use rules.

Imitation may be the sincerest form of flattery, but in this case, it’s also theft.

Update: According to Altien’s CEO, who left a comment on this post, Mark Bean is no longer in their employ. In my communications with Altien, it was clear that Bean’s activities do not reflect their general business practices.

Gartner Day 3: Jim Sinur scenario-based rules panel

Jim Sinur hosted a case study panel on scenario-based rules with two presenters: David Luce at UTi (a logistics outsourcing firm) and Husayn Alvarez-Gomariz at Micron (a semiconductor manufacturer).

Luce started out talking about UTi, and how as a logistics provider, they are actually a business process outsourcer. They pride themselves on customer intimacy, but that drives up their operational costs since there are so many manual, special-case processes. They were looking for ways to maintain the same level of customer intimacy while automating processes and rules wherever possible in order to increase efficiency and drive down costs, and what they devised was a rules-driven architecture where they use business rules as a policy validation tool. They’ve externalized rules from legacy code into a business rules management system, which provides them with the level of agility that they need to provide customized service to their customers while still automating their processes.
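
The externalized-rules approach that UTi describes can be sketched in a few lines. This is a minimal illustration only (all names and rules here are hypothetical, not UTi's actual system): the key idea is that each policy rule is data held outside the process code, so the business can change rules without redeploying the legacy applications that call the validator.

```python
# Hypothetical sketch of rules-as-policy-validation: rules are data,
# external to the process logic that evaluates them.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # predicate over a shipment record
    message: str                       # shown when the condition fails

@dataclass
class PolicyValidator:
    rules: list = field(default_factory=list)

    def validate(self, shipment: dict) -> list:
        """Return the messages of all rules this shipment violates."""
        return [r.message for r in self.rules if not r.condition(shipment)]

# Rules live in one place, separate from the systems that invoke them,
# so changing a policy doesn't mean changing legacy code.
validator = PolicyValidator(rules=[
    Rule("max_weight", lambda s: s["weight_kg"] <= 1000,
         "Shipment exceeds 1000 kg; requires manual review"),
    Rule("customer_terms", lambda s: s["customer"] != "ACME" or s["insured"],
         "ACME shipments must be insured"),
])

violations = validator.validate(
    {"weight_kg": 1200, "customer": "ACME", "insured": False})
```

In a real BRMS the rules would be authored and versioned by business analysts rather than hard-coded as lambdas, but the separation of rule from process is the same.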

Alvarez-Gomariz discussed scenario analysis, and how to use scenarios to provide the agility to respond to changing market events. His talk was both detailed and abstract, not a good combination for holding my attention, although he had some good points about the intersection between BPM, BI and planning.

Like yesterday’s panel session, this was really more like two separate 30-minute presentations, with no interaction between the panelists. This format should definitely be changed to something more interactive, or be labelled as consecutive short presentations rather than a panel.

Although it’s only lunchtime, this was my last session of the day and of the conference: I’m on a flight back to Toronto in a couple of hours. I didn’t blog about the fun at the vendor hospitality suites, but suffice it to say that it included Michael Beckley in a very tropical hat (he also had a “Made in Mexico” sticker on his forehead at one point, but I couldn’t verify that statement with his parents), Scott the hotel bartender talking about SOA and Six Sigma, and a vendor ending up in my room for the night.

I hope that you enjoyed my coverage of the conference; I’ve had a lot of great feedback from people here, and I’ll soon catch up with the comments that you’ve added to my posts in the last couple of days.

Gartner Day 3: Fair Isaac customer session

For the second half of this morning’s vendor sessions, I sat in on Fair Isaac’s customer presentation, Michele Sprayregen Edelman of Discover Financial Services on Managing Business Rules and Analytics as an Enterprise Asset. As the largest proprietary credit card network in the US with 50 million cardholders and 4 million merchant and cash access locations, they need to have a good handle not just on what their customers are doing, but on how current market trends will change what their customers want to do in the future.

To them, this means using an advanced decision management environment: start with criteria- and rule-based decisions, then automate processes with business rule management, then increase decision precision with predictive analytics, and finally optimize strategies with predictive analytics. They’re only a few steps of the way along this route, but are starting to automate decisions in a more sophisticated manner for things such as individual purchase approval/denial, in order to increase revenue and reduce losses.

They wanted a modelling environment that could be done by analysts without requiring IT support, as well as methods for integrating with the transactional systems for automating decisions. They use other decisioning tools besides Fair Isaac’s, including SAS, and combine the decisions from all of the systems in order to make the ultimate decisions. When you look at what they’ve done, even in the simplified diagrams that Edelman showed us, it’s hugely complex but provides them with a huge competitive advantage: they’re using automated decisioning in a number of different areas across their organization, including portfolio scoring, dispute processing, customer contact strategy and many others.

She presented some final recommendations, the primary one being the importance of the data infrastructure that’s going to drive the decisioning.

Gartner Day 3: Microsoft session

I wanted to stop in on the Microsoft session, People-Ready Processes, in part because I’m a bit confused about what Microsoft is doing in this area, and in part because of the Business Process Alliance announcement from Monday. Microsoft sees themselves as a force for commoditizing (and in the subtext, dumbing down) technology so that it is accessible to a much wider audience, and this presentation was Burley Kawasaki’s take on how they’re doing that for BPM. He describes people-ready processes as a fusion of document-centric processes and system-centric processes, and I really wish that he (and many other people in the industry) would stop equating human-centric with document-centric. Although human-facing BPM grew out of the workflow that started in document imaging systems, that was a long time ago, and there are many instances of human-facing BPM that don’t include documents — depending, of course, on how you define a document.

My previous view of Microsoft BizTalk was as a B2B message broker or an internal ESB. My view of SharePoint was as a collaboration and document management platform. I wanted to see how Microsoft was bringing together the technologies and concepts from both of these to create a seamless BPM solution.

Kawasaki showed a spectrum of BPM application types, from collaborative to transactional processes: individual ad hoc processes (e.g., individual task lists), human semi-structured (e.g., vacation approval), system highly structured (e.g., expense reporting) and fixed processes (e.g., supply chain). He then overlaid a split between a collaboration server and a process server, with some overlap in the middle of the spectrum, and labelled these as SharePoint and BizTalk. My heart sank.

Okay, you can have a SharePoint collaboration or document kick off a BizTalk process, but that’s not the same as having a single end-to-end BPM solution. In the future, the Windows Workflow Foundation will be used as the underlying process infrastructure for both SharePoint and BizTalk, which might help to integrate them more closely.

He finished up with a light-speed overview of the Microsoft process platform roadmap, which includes Windows Workflow Foundation, the .Net framework, Office (including SharePoint) and BizTalk. He also made a big push for the benefits of a platform and partner ecosystem rather than a single vendor “closed and proprietary” BPM stack. Not sure that I’m convinced.

Gartner Day 3: Yvonne Genovese keynote

We started the last day at the Gartner summit with a keynote by Yvonne Genovese, Business Applications Through 2010: Major Changes Will Affect Your Process Environment. Early in her presentation, she made an important statement: “the technology keeps breaking our processes”. Her focus is on business applications, not specifically BPM, but she’s looking at trends of what’s happening with enterprise applications like ERP and CRM systems. Her point is that these business applications have, in the past, forced businesses to use rigid business processes implemented within those systems.

However, the current trend is towards unbundling some of this functionality, exposing it through services, then consuming those services using a BPMS. This allows you to not only call specific functionality from your business applications at any point in a process that you now control, you can actually replace or augment the functionality of those applications by calling other services. This also provides an opportunity to more easily integrate between business applications if you have multiple ones in your environment. Although the business application vendors have been pushing suites for some time now, that packaging model will be less compelling to their customers as organizations start to slice and dice the atomic functionality of the business applications and compose their own processes using BPM rather than use the suite in its monolithic form.
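
The unbundling trend Genovese describes can be sketched abstractly. In this illustrative (and deliberately toy) example, the "process" is owned by a composition layer rather than baked into a monolithic suite, so any step can be swapped for an internal or external service; all function names here are hypothetical:

```python
# Toy sketch of composition: process steps are services the organization
# controls, not flows hard-wired inside a business application suite.

def check_credit(order):
    """Could be the ERP's exposed credit-check service..."""
    return order["amount"] < 5000

def check_inventory(order):
    """...or a replacement service from a different vendor entirely."""
    return order["qty"] <= 10

def place_order(order):
    return {"status": "placed", **order}

# The process definition lives in the composition layer (the BPMS's role),
# so replacing or augmenting a step doesn't touch the applications.
PROCESS = [check_credit, check_inventory]

def run_order_process(order):
    for step in PROCESS:
        if not step(order):
            return {"status": "rejected", "failed_step": step.__name__, **order}
    return place_order(order)

result = run_order_process({"amount": 1200, "qty": 3})
```

A real BPMS would invoke these steps as web services with monitoring and human tasks in the mix, but the "buy, build, or compose" choice reduces to exactly this: who owns the list of steps.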

Business applications aren’t going away: there’s still a huge amount of good functionality available in them, and as long as that commoditized functionality can be consumed as services, you’re not going to be writing a replacement yourself. What I think will happen, however, is that the amount of the functionality used from any given business application platform will begin to erode as other internal or external services replace some of that functionality. This frees organizations from the vendor lock-in that they’re subjected to now, and adds a new possibility for creating business applications: instead of just “buy” or “build”, you can now also “compose”. And if the megavendors in this field are going to stay competitive, they need to embrace and encourage an ecosystem that allows smaller vendors to provide services that can easily be integrated with their larger platform. This isn’t going to be the old model of the vendor controlling the ecosystem by anointing their favourite technology partners, however: the customer organizations are going to build their own ecosystem from their preferred vendors in a truly best-of-breed fashion.

At the end of the day, BPM is an essential part of all this, since it will be used as a composition framework for combining functionality from business applications, along with internal and external services, into the processes that the business really needs.

Gartner Day 2: Jim Sinur (again)

I finished up the day by attending Jim Sinur’s session on continuous optimization. And thanks to Gartner, we have a brand new acronym: BOSS, for business optimization support systems.

He has an interesting take on optimization that I agree with: it’s an antidote to entropy. Laws of entropy say that systems tend to become more chaotic over time, and you have to have something in place that will actively head off that slide into chaos. Continuous improvement is not, however, a replacement for disruptive or radical change within an organization: the former provides refinements along the way to a goal, while the latter causes a change in direction to a new goal.

He defined continuous optimization as “keeping a process efficient, effective and relevant under all possible and changing conditions,” and laid out a list of benefits of continuous process optimization, not the least of which is creating a balance amongst competing goals: sacrificing a local optimization in favour of an overall optimization.

There was some repeated material from Bill Gassman’s BI/BAM presentation earlier today, but Sinur went into a number of other areas, such as understanding both the drivers for process optimization and the inhibitors to its adoption. It’s completely necessary to link processes to desired outcomes so that the goals of optimization are well understood, and also to anticipate the shift to indeterminate/ad hoc/collaborative processes that don’t have pre-determined process maps, but are usually triggered by events and are goal-driven.

He looked at how to discover the opportunities for optimization, and selecting the proper optimization capability from a set of optimization tools and techniques. He also made some good points about matching your optimization method and your risk profile, which I’ve heard in earlier presentations this week: if you’re very risk-averse, for example, you’re unlikely to have self-optimizing systems that change their own behaviour based on patterns of events in the system.

This is a growth area, and one that can provide some competitive advantage: only the leader organizations are using this technology now, and it has the potential to make a huge impact on a company’s agility.

Gartner Day 2: BEA sessions

I really wanted to attend Daryl Plummer’s analyst/user roundtable on BPM and Web 2.0, but they don’t let press into those sessions, so I ducked in to hear Jesper Joergenson of BEA talk about Best Practices in Business Transformation. Jesper, I know that you’re reading this — no offence intended on being my second choice 🙂  I stayed through both half-hour sessions this time, seeing Jesper talk first, then BEA’s customer, Christophe Marcel of Integro Insurance Brokers with Building the Business Case for BPM.

Joergenson started with a cooking theme for this “BPM secret sauce” talk: start with sharp knives, make big meals of small dishes, measure to taste and adjust as required, have a recipe, and follow the recipe. In BPM, this translates to start with common tools, build a platform out of small projects, use simulation and measurement, have established best practices, and follow those best practices. Cute theme, and some nice cooking utensil graphics, although I have to admit that I rarely follow a recipe in the kitchen, even if I bother to have one.

He talked about the importance of modelling tools for business users, with a shared process model for the IT side for implementation to avoid the inevitably incomplete round-tripping that happens when you model in one tool and implement in another. He also discussed how to identify suitable first targets for BPM implementation — low complexity, high impact, and low maturity level — while planning for scale in both the tool selection and the methodology, since one successful project will breed demand. He briefly discussed process simulation and measurement/monitoring, and the importance of a process centre of excellence.

After a brief break, Christophe Marcel talked about their experiences with BPM. Their focus was on integration, tying together a number of existing systems with a minimum amount of new development. They made use of both human-facing tasks and web services calls to update data in the underlying enterprise systems, and built their own web-based user interface. In addition to the enterprise data systems, they integrated Microsoft SharePoint as their document management system.

One of the major challenges, which I’ve seen many times before whenever you integrate BPM with enterprise systems, is the issue of data synchronization. When data is replicated into the BPMS for display or control purposes, any changes to the data either in the BPMS or the underlying enterprise system need to be considered for replication to the other system. Similarly, if an entire insurance program is sold, all tasks in the BPMS may need to be updated or deleted to reflect that change.
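
The synchronization concern above can be made concrete with a small sketch. This is purely illustrative (the field names and the "enterprise system wins" policy are my own assumptions, not Integro's design): detect which fields differ between the two copies of a record, then reconcile.

```python
# Hypothetical sketch of BPMS/enterprise-system data reconciliation.

def diff(record_a: dict, record_b: dict) -> dict:
    """Fields whose values differ between the two copies of a record."""
    return {k: (record_a.get(k), record_b.get(k))
            for k in set(record_a) | set(record_b)
            if record_a.get(k) != record_b.get(k)}

def reconcile(bpms_copy: dict, system_of_record: dict) -> dict:
    """Naive policy: the enterprise system wins for any differing field.
    A real implementation needs timestamps or version numbers to know
    which side actually changed, and a conflict path when both did."""
    merged = dict(bpms_copy)
    merged.update({k: enterprise_value
                   for k, (_, enterprise_value)
                   in diff(bpms_copy, system_of_record).items()})
    return merged

bpms = {"policy_id": 42, "status": "in_review", "premium": 100}
enterprise = {"policy_id": 42, "status": "sold", "premium": 100}
merged = reconcile(bpms, enterprise)  # BPMS copy picks up status="sold"
```

Even this toy version shows why the problem is "a lot more complex than you think": the moment both sides can change the same field, a last-writer-wins rule silently loses data.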

Marcel had some best practices to share: do a proof of concept; hire an experienced consultant; keep in mind that data synchronization is probably a lot more complex than you think it is; use your best business analysts on the workflow rather than the UI; and remember that users want all of their tasks in a single system, whether that’s the BPMS or their email.