The New Software Industry: David Messerschmitt

David Messerschmitt, a prof at UC Berkeley and the Helsinki University of Technology, finished the formal presentations for the day with a talk on how inter-firm cooperation can be improved in the software industry. This is an interesting wrap-up, since we’ve been hearing about technology, applications and business opportunities all day, and this takes a look at how all these new software industry companies can cooperate to the benefit of all parties.

He started out by proposing a mission statement: focus the software industry’s attention and resources on providing greater value to the user and consumer. This has two aspects: do less harm, and do more direct provision of value to the customer rather than the computational equivalent of administrivia.

In general, the software industry has a fairly low customer satisfaction rate of around 75%, whereas specific software sectors such as internet travel and brokerage rank significantly higher. In general, services provided by people have a lower satisfaction rate (likely due to the variability of service levels), and the satisfaction rates are decreasing each year. Complaints are focussed on gratuitous change (change driven by platform changes rather than anything that enhances user value) and security, and to some extent on having to change business processes to match an application’s process rather than having the application adapt to the business. Certainly, there are lessons here for BPM implementations.

Messerschmitt raised the issue of declining enrolment of women in computer science, which he thinks is in part due to the perception that computer science is more about heads-down programming rather than about dealing with users’ requirements. He sees this as a bit of a canary in a coal mine, indicating some sort of upcoming problem for the computing industry in general if it is driving away those who want to deal with the user-facing side of software development. Related to that, he recommends the book Democratizing Innovation by Eric von Hippel, for its study of how customers are providing innovation that feeds back into product design and development, not just in software but in many areas of products.

He ended up by discussing various ways to improve inter-firm cooperation, such as the Global Environment for Networking Innovations (GENI) initiative and ways to accomplish seamless operation of enterprise systems, and referred to a paper that he recently wrote, Rethinking Components: From Hardware and Software to Systems, which will be published in the July Proceedings of the IEEE. He then listed elements of collective action that can be pursued by industry players, academia and professional organizations to help achieve this end:

  • Systematically look at knowledge gaps and ensure that the research is addressing those gaps
  • Create/educate the human resources that are needed by the industry
  • Understand and encourage complementarities, like broadband and certain types of software
  • Structures and processes: capture end-user innovations for incorporation into a product, and achieve a more orderly evolution of technology with the goal of leaving behind many fewer legacies in the future

He’s definitely of the “a rising tide lifts all boats” mindset.

The New Software Industry: Investment Opportunities Panel

Jason Maynard of Credit Suisse moderated a panel on investment opportunities in the new software industry, which included Bill Burnham of Inductive Capital, Scott Russell (who was with two different venture capital firms but doesn’t appear to be with one at this time, although his title is listed as “venture capitalist”), and Ann Winblad of Hummer Winblad Venture Partners.

This was more of an open Q&A between the moderator and the panel with no presentation by each of them, so again, difficult to blog about since the conversation wandered around and there were no visual aids.

Winblad made a comment early on about how content management and predictive analytics are all part of the collaboration infrastructure; I think that her point is that there’s growth potential in both of those areas as Web 2.0 and Enterprise 2.0 applications mature.

There was a lengthy discussion about open source, how it generates revenue and whether it’s worth investing in; Burnham and Russell are against investing in open source, whereas Winblad is quite bullish on it but believes that you can’t just lump all open source opportunities together. Like any other market sector, there are going to be winners and losers here. They all seem to agree, however, that many startups are benefiting from open source components even though they are not offering an open source solution themselves, and that there are great advantages to be had by bootstrapping startup development using open source. So although they might not invest in open source companies, they’d certainly invest in a startup that used open source to accelerate its development process and reduce development costs.

Russell feels that there are a number of great opportunities in companies whose value is based on content or knowledge rather than on their software.

SaaS startups create a whole new wrinkle for venture investors: working capital management is much trickier due to the delay in revenue recognition, since payments tend to trickle in rather than being paid up front, even though the SaaS company needs to invest in infrastructure up front. Of course, I’m seeing some SaaS companies that are using hosted infrastructure rather than buying their own; Winblad discussed these sorts of rented environments, and other ways to reduce startup costs such as using virtualization to create different testing environments. There are still a lot of the same old problems, however, such as sales models. She advises keeping low to the ground: get something out to a customer in less than a year, and get a partner to help bring the product to market in less than two years. As she put it, frugality counts; the days of spending megabucks on unnecessary expenses went away in 2000 when the first bubble burst, and VCs are understandably nervous about investing in startups that exhibit that same sort of profligate spending.
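
To make the working capital point concrete, here’s a back-of-the-envelope sketch of my own (not from the panel) comparing cumulative cash received under a perpetual licence paid up front with the same contract value trickling in as a monthly subscription:

```python
# Back-of-the-envelope comparison of cash received over the first year for a
# hypothetical $120K contract: perpetual licence paid up front vs. a SaaS
# subscription billed monthly. All numbers are illustrative only.

contract_value = 120_000                          # total contract value, dollars
monthly_subscription = contract_value / 12        # equivalent monthly billing

upfront_cash = []   # cumulative cash received, licence model
saas_cash = []      # cumulative cash received, subscription model

for month in range(1, 13):
    upfront_cash.append(contract_value)             # all cash arrives in month 1
    saas_cash.append(monthly_subscription * month)  # cash trickles in monthly

for month, (lic, saas) in enumerate(zip(upfront_cash, saas_cash), start=1):
    print(f"Month {month:2d}: licence ${lic:>9,.0f}   SaaS ${saas:>9,.0f}")
```

The gap in those early months is exactly the working capital that the SaaS startup has to fund while it’s also paying for its infrastructure.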

Maynard challenged them each to name one public company to invest in for the next five years, and why:

  • Russell: China and other emerging markets require banking and other financial data, which companies like Reuters and Bloomberg (more favoured) will be able to serve. He later made comments about how there are plenty of opportunities in niche markets for companies that own and provide data/information rather than software.
  • Burnham: mapping/GPS companies like Tele Atlas, which have both valuable data and good software. He would not invest in the existing middleware market, and specifically suggested shorting TIBCO and BEA (unless they are bought by HP) — the two companies whose user conferences I’m attending this week and next.
  • Winblad: although she focusses on private rather than public investments, she thinks Amazon is a good bet since they are expanding their range of services to serve bigger markets, and have a huge amount of data about their customers. She thinks that Bezos has a good vision of where to take the company. She recommends shorting companies like CA, because they’re in the old data, infrastructure and services business.

Audience questions following that discussion focussed a lot on asking the VCs’ opinions of various public companies, such as Yahoo. Burnham feels that Yahoo is now in the entertainment industry, not the software industry, so it’s not a real competitor to Google. He feels that Google versus Microsoft is the most interesting battle to come. Russell thinks that Yahoo is a keeper, nonetheless.

Questions about investments in mobile produced a pretty fuzzy answer: at some point, someone will get the interface right, and it will be a huge success; it’s very hard for startups to get involved, though, since it means long negotiations with the big providers.

Burnham had some interesting comments about investing in the consumer versus the business space, and how the metrics are completely different because marketing, distribution and other factors differ so much. Winblad added that it’s very difficult to build a consumer destination site now, like MySpace or YouTube. Not only are they getting into a crowded market, but many of the startups in this area have no idea how to answer basic questions about the details of an advertising revenue model, for example.
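
For what it’s worth, the basic questions aren’t hard ones; here’s the kind of napkin math (my own hypothetical numbers, not anything from the panel) that an ad-supported consumer startup should be able to rattle off:

```python
# Napkin math for an ad-supported consumer site (all numbers are hypothetical):
# how many monthly page views it takes just to cover a modest burn rate.
cpm = 2.00              # assumed revenue per 1,000 ad impressions, dollars
ads_per_page = 3        # assumed ad units per page view
monthly_burn = 250_000  # assumed monthly operating costs, dollars

revenue_per_page = (cpm / 1000) * ads_per_page
pages_needed = monthly_burn / revenue_per_page
print(f"~{pages_needed:,.0f} page views/month just to break even")
```

Every number in there is a guess, but it’s the sort of calculation that Winblad is talking about.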

Burnham had a great comment about what type of Web 2.0 companies not to invest in: triple-A’s, that is, AdSense, AJAX and arrogance.

Winblad feels that there’s still a lot of the virtualization story to unfold, since it is seriously changing the value chain in data centres. Although VMware has become the big success story in this market, there are a number of other niches that have plenty of room for new players. She also thinks that companies providing specialized analytics — her example was basically about improving financial services sales by analyzing what worked in the past — can provide a great deal of revenue enhancement for their customers. As a final point on that theme, Maynard suggested checking out Swivel, which provides some cool data mashups.

The New Software Industry: Bob Glushko and Shelley Evenson

Bob Glushko, a prof at UC Berkeley, and Shelley Evenson, a prof at CMU, discussed different views on bridging the front stage and back stage in service system design. As a side note, I have to say that it’s fun to be back (temporarily) in an academic environment: many of these presentations are much more like grad school lectures than standard conference presentations. And like university lectures, they cover way too much material in a very short time by speaking at light speed and flipping slides so fast that there’s no time to even read what’s on the slide, much less absorb or document it. If I had a nickel for every time that a presenter today said “I don’t have time to go into this but it’s an important concept” while flipping past an interesting-looking slide, I could probably buy myself the drink that I need to calm myself after the information overload. 🙂

Glushko posits that greater predictability produces a better experience, even if the average level of service is lower, using the example of a self-service hotel check-in versus the variability of dealing with a reception clerk. Although he doesn’t mention it, this is exactly the point of Six Sigma: reducing variability, not necessarily improving service quality.

He goes on to discuss the front stage of services, which is the interaction of the customer (or of other services) with a service, and the back stage, which is the execution of the underlying services themselves. I love his examples: he uses an analogy of a restaurant, with the front stage being the dining room and the back stage being the kitchen. Front stage designers focus on usability and other user interface factors, whereas back stage designers focus on efficiency, standardization, data models and the like. This tends to create a tension between the two design perspectives, and raises the question of whether those tensions are intrinsic or avoidable.

From a design standpoint, he feels that it’s essential to create information flow and process models that span both the back and front stages. The focus of back stage design is to design modular and configurable services that enable flexibility and customization in the front stage, and to determine which back stage services you will perform yourself and which you will outsource or reuse from other service providers. Front stage design, on the other hand, is focussed on designing the level of service intensity (the intensity of information exchange between the customer and the service, whether the service is human or automated), and on implementing model-based user interfaces, using those models to generate, configure or specify the service APIs behind the user interfaces. Exposing more back stage information in the front stage can improve the immediate experience for a specific customer as well as subsequent experiences, and data mining and business intelligence can also improve service for future customers.
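
As an aside, here’s a toy sketch of my own (not anything that Glushko showed) of what that separation might look like in code: a modular back stage service with a narrow API, consumed by two front stage designs of different service intensity.

```python
from dataclasses import dataclass
from typing import Optional

# Toy illustration only: a modular back stage service with a narrow API,
# consumed by two front stages of different "service intensity" --
# a self-service kiosk and an assisted reception desk.

@dataclass
class Reservation:
    guest: str
    room: str

class BackStageReservations:
    """Back stage: standardized, efficient operations on reservation data."""
    def __init__(self) -> None:
        self._by_guest = {"Alice": Reservation("Alice", "412")}

    def lookup(self, guest: str) -> Optional[Reservation]:
        return self._by_guest.get(guest)

    def check_in(self, guest: str) -> str:
        res = self._by_guest.get(guest)
        return f"{guest} checked into room {res.room}" if res else "no reservation found"

def self_service_kiosk(backstage: BackStageReservations, guest: str) -> str:
    # Low-intensity front stage: minimal information exchange, very predictable.
    return backstage.check_in(guest)

def reception_desk(backstage: BackStageReservations, guest: str) -> str:
    # High-intensity front stage: more interaction, built on the same back stage API.
    res = backstage.lookup(guest)
    greeting = f"Welcome back, {guest}!" if res else f"Let me look into that, {guest}."
    return f"{greeting} {backstage.check_in(guest)}"

if __name__ == "__main__":
    backstage = BackStageReservations()
    print(self_service_kiosk(backstage, "Alice"))
    print(reception_desk(backstage, "Alice"))
```

Both front ends call the same modular back stage, which is where the flexibility and customization that he’s describing comes from.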

Evenson, who specializes in interaction design, has a very different perspective from Glushko, who focusses on back stage design; rather than being opposing views, however, these are just different perspectives on the same issues of designing service systems.

She started out with a hilarious re-rendering of Glushko’s restaurant example, applying colour to make the division of co-production between front and back stage more visible.

Her slides really went by so fast that I was only able to capture a few snippets:

  • Sensors will improve the degree of interaction and usefulness of web-based services.
  • Technology influences our sense of self.
  • Services are activities or events that form a product through interaction with a customer.
  • Services are performances: choreographed interactions manufactured at the point of delivery.
  • Services are the visible front end of a process that co-produces value.
  • A service system is a framework that connects service touchpoints so that they can sense, respond and reinforce one another; it must be dynamic enough to efficiently reflect the expectations that people bring to the experience at any given moment.
  • Service systems enable people to have experiences and achieve goals.

She discussed the difficulties of designing a service system, such as the difficulty of prototyping and the difficulty of representing the experience, and pointed out that it requires combining aspects of business, technology and experience. She feels that it’s helpful to create an integrated service design language: systems of elements with meanings (that designers use to communicate and users “read”) plus sets of organizing principles.

The New Software Industry: Martin Griss and Adam Blum

Martin Griss of CMU West and Adam Blum of Mobio Networks had a fairly interactive discussion about integrating traditional software engineering practices into modern service oriented development.

Griss is a big proponent of agile development, and believes that the traditional software development process is too ponderous; Blum admits to benefits from smaller teams and lightweight process for faster delivery, but he believes that some of the artifacts of traditional development methods provide value to the process.

Griss’ problems with traditional development are:

  • Too many large documents
  • It’s too hard to keep the documents in synch with each other and the development
  • People spend too much time in document reviews
  • Use cases are too complex
  • Can’t react well to changes in requirements
  • Schedule and features become all-important, rather than actual user requirements

In response, Blum had his list of problems with agile development:

  • Some things really do need upfront analysis/architecture to create requirements and specification, particularly the lower layers in the stack
  • Team management needs to be more complex on larger projects
  • Many agile artifacts are simply “old wine in new bottles”, and it’s just a matter of determining the right level of detail
  • If you have a team that’s currently delivering well, the introduction of agile processes can disrupt the team and impact productivity — if it ain’t broke, don’t fix it
  • Some of the time-boxing of agile development (e.g., Scrum’s monthly sprints and daily 10-minute meetings) creates artificial schedule constraints
  • Agile development theory is mostly pseudo-science without many facts to back it up
  • Modern tools can make older artifacts lighter-weight and more usable

Writing requirements and specifications is something that I’ve spent probably thousands of hours doing over the years, and many of my customers still require this methodology, so I’m sympathetic to Blum’s viewpoint: sometimes it’s not appropriate, or not possible, to go agile.

An interesting point emerged from the back-and-forth discussion: it may not be possible to build the development platforms and frameworks themselves (such as what Mobio builds) in an agile fashion, but the applications built on those high-level platforms lend themselves well to agile development. Features to be added to the platform are effectively prototyped in an agile way in applications built on the platform, then are handed off to the more traditional, structured development cycle of the platform itself.

Griss, who was partially just looking to stir up discussion earlier, pointed out that it’s necessary to take the best parts of both ends of the software development methodology spectrum. In the end, they appear to agree that there are methodologies and artifacts that are important; it’s just a matter of the degree of ceremony to use on any given part of the software development process.

The New Software Industry: Open Source panel

First up after lunch is a panel on the role of open source in service management, moderated by Martin Griss of CMU West, and including Kim Polese of SpikeSource, and Jim Herbsleb and Tony Wasserman of CMU West.

Polese is included in the panel because her company is focussed on creating new business models for packaging and supporting open source software, whereas the other two are profs involved in open source research and projects.

The focus of the session is on how open source is increasingly being used to quickly and inexpensively create applications, both by established companies and startups: think of the number of web-based applications based on Apache and MySQL, for example. In many of these cases, a dilemma is created by the lack of traditional support models for open source components — that’s certainly an issue with the acceptance of open source for internal use within many organizations — so new models are emerging for development, distribution and support of open source.

Open source is helping to facilitate unbundling and modularization of software components: it’s very common to see open source components from multiple projects integrated with both commercial software components and custom components to create a complete application.

A question from the audience asked if there is a sense of misguided optimism about the usefulness of open source; Polese pointed out in response that open source projects that aren’t useful end up dying on the vine, so there’s some amount of self-selection through market acceptance that tends to promote successful open source components and suppress those that are less successful.

As I mentioned during the Brainstorm BPM conference a few weeks back, it’s very difficult to blog about a panel — much less structure than a regular presentation, so the post tends to be even more disjointed than usual. With luck, you’ll still get some of the flavour of the panel.

The New Software Industry: Timothy Chou

The morning finished with Timothy Chou, author of The End of Software and the former president of Oracle’s online services group, discussing the radical changes in the software industry due to software-as-a-service. Anyone who entitles his talk “To Infinity and Beyond” and has a picture of Buzz Lightyear on the title slide is okay with me. 🙂

He looks at the economics of why the transformation is occurring, and encourages becoming a student of those economics in order to understand the shift. Considering a sort of Moore’s law for software, traditional software (e.g., SAP) costs around $100/user/month to license, install and support in various configurations; SaaS (e.g., Salesforce.com) costs around $10/user/month; and internet applications (e.g., Google) are more like $1/user/month.
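
Just to play with his numbers for a minute (his cost figures, my arithmetic):

```python
# Quick arithmetic on Chou's per-user cost figures: annual cost for 1,000 users
# under each delivery model. The per-user figures are from his talk; the rest
# is just my spreadsheet-style illustration.
cost_per_user_month = {
    "traditional (e.g., SAP)": 100,
    "SaaS (e.g., Salesforce.com)": 10,
    "internet apps (e.g., Google)": 1,
}
users = 1_000
for model, cost in cost_per_user_month.items():
    annual = cost * users * 12
    print(f"{model:30s} ${annual:>12,} per year")
```

Each step down is an order of magnitude cheaper for the same thousand users, which is the whole point of his Moore’s-law-for-software framing.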

He makes the point that the SaaS revolution is already occurring by listing nine SaaS companies that have gone public (including Webex and Salesforce.com); these nine went from just over $200M in revenues in 2002 to $1.4B in 2006.

Chou gives us three lessons for the future:

  • Specialization matters. Think Google, which was originally an insanely simple interface for a single task: searching. Or eBay, which just does auctioning. This isn’t just a product functionality or distribution issue, however; the software development process has fundamentally changed. It’s now easier to become a software developer because of the tools, and this drives the development of niche applications. In a world where Citibank has more developers than Oracle, we’re not just buying software from the “professionals” any more; we’re creating it ourselves or buying it from much smaller players.
  • Games matter. Chou uses World of Warcraft as a collaboration example, and it’s a great one. People from all over the world, with different languages and ethnicities, come together for a common goal, then disperse when that goal is achieved. WoW makes specialized skills and skill levels transparent, so that you immediately know whether another player’s skills are complementary to your own, and how good that player is at them. In general, you can’t do that now in business collaboration environments, but it would be great if you could. Also of interest is the world of currency within these games, and how that currency is valued in the real world.
  • Service matters. The service economy is not just about human labour; service is information. Consider the information that Amazon has about books, from finding them, to other users’ reviews, to recommendations. The information is there, but some of it is hard to find and analyze. The “surface web” of approximately 100TB is what you can find on Google, but there’s a much deeper web of more than a million TB, mostly inside corporate firewalls. How much better service could we have if we had access to more of that information in the deep web?

The New Software Industry: John Zysman

John Zysman, a professor of political science at UC Berkeley, immediately followed Maglio with a related discussion on services transformation. The expectation was that Maglio and Zysman had diametrically opposed views and that their joint question period would degrade into fisticuffs — or at least a lively debate — but it turns out that they’re pretty closely aligned on many issues.

A generation ago, services (within a software product company) were seen as a sinkhole of productivity, but they are now considered to be sources of productivity. It’s not that the service sector has grown or has changed from agriculture to IT; it’s that the sector has been reorganized in significant ways. In order to navigate this, we need to understand three things: strategy and organization; tools; and rules and roles (social-political dynamics).

An example of this sort of transformation is what Zysman referred to as the “American Comeback”, driven by the new consumer electronics, with a shift from electro-mechanical to digital (think Walkman to iPod) as well as modularization and commoditization within the supply chain. He listed stages of service transformations, although I can’t do justice to an explanation of these:

  • Outsourcing
  • Changes in consumption patterns
  • Outsourcing household work
  • The algorithmic transformation: from revolution to delusion

Most of this transformation is based on a change in how services are performed, and the application of technology to allow services to be performed in different ways and locations. I heard an interesting example of this last night while having dinner with some of the TIBCO people who I’ll be seeing at TUCON later this week: two of them were from the U.K., one of those two now living in the U.S., and we had a discussion about healthcare in the U.K., U.S. and Canada. One of them made the point that in the U.K., patients sit in the waiting room until the doctor comes out and calls them in, whereas in both Canada and the U.S., multiple patients are taken simultaneously to separate examination rooms and prepped by medical assistants, then the doctor just goes from one room to another to do the more specialized part of the work. What’s really interesting is that the U.K. and Canada both have socialized medicine, which would tend to favour the less efficient but full-service U.K. model, except that Canada has a shortage of doctors and so has moved to the more efficient U.S. service model.

A couple of random ideas from his talk that I want to capture here for later thought:

  • Should we conceive a services stack?
  • Automating the codifiable parts of a process is the first step in the transformation.
  • By commoditizing a service, you may be “moving the whiteboards of innovation”, i.e., disabling the ability to have innovation in a service.

In discussing rules and roles, Zysman talked about how services are embedded social processes, and how we need to change the way that processes work. How did we end up talking about business process reengineering? I thought that I was taking a break from process today, but as it turns out, there is no escape.

The New Software Industry: Paul Maglio

Paul Maglio, a senior manager of service systems research at IBM’s Almaden Research Center, spoke to us on the science of service systems, looking at the services sector of the economy, including everything from high-end professional services to McJobs in the hospitality industry. The focus of much of his research is on high-value services that simply can’t be automated.

Harkening back to Cusumano’s talk, he showed that services generate 53% of IBM’s gross revenue, but only 35% of its pretax net income; because of that, they’re focussing on service innovation in order to squeeze a bigger margin out of the services portion of the business.
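
A quick bit of arithmetic on those two percentages (the 53%/35% split is from the slide; the implied-margin calculation is my own illustration) shows why that margin squeeze matters:

```python
# Relative profit margins implied by services generating 53% of revenue but
# only 35% of pretax income. Illustrative back-of-the-envelope math only.
services_revenue_share = 0.53
services_income_share = 0.35

services_margin_index = services_income_share / services_revenue_share            # ~0.66
other_margin_index = (1 - services_income_share) / (1 - services_revenue_share)   # ~1.38

print(f"services margin index:     {services_margin_index:.2f}")
print(f"non-services margin index: {other_margin_index:.2f}")
print(f"services earn roughly {services_margin_index / other_margin_index:.0%} "
      "as much profit per revenue dollar")
```

In other words, a dollar of services revenue generates roughly half the pretax profit of a dollar of everything else, which explains the push for service innovation to close that gap.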

He showed a model of services as a system of relationships between a service provider, a service client and a service target (the reality to be transformed or operated on by the service provider for the sake of the service client). Service systems depend on value co-creation between the provider and the client: if the client wins to the detriment of the provider, it’s a loss leader; in the reverse situation, it’s coercion. If they both win, it’s co-creation.

Although there’s no equivalent to Moore’s Law for services, telling us where the efficiencies will be created in the future, there are some known factors that can be applied to make services more effective, both related to people (location, education) and technology.

In mapping profits against revenues, the steepest curve (biggest return) is information, then technology, then SaaS, then labour. However, most services are a combination of all of these things, so it’s considerably more complex to model.

The New Software Industry: Michael Cusumano

Michael Cusumano is with the MIT Sloan School of Management, and has written several books on the changing software industry; he spoke today about the changing business of software.

In general, there is a decline of new enterprise software product revenues, and growth in services and maintenance sales. There are a number of new business models, including SaaS and ad-supported software.

Software companies tend to move from being product companies to services or hybrid product/services companies (maintenance revenue is usually included in services). However, there’s a different evolution curve that shows where companies focus on product innovation, then on process innovation (e.g., making the product more efficiently), then on services innovation.

The number of publicly-owned software companies peaked in 1997 at around 400 companies. IT services firms peaked in 1999 at around 500 companies. Web companies, which can be launched with significantly less capital (due to distribution mechanisms and development tools/methodologies), had a peak in 1999 before dropping in the crash, but are now climbing to an even higher peak.

Cusumano showed a graph of three business model dimensions: revenue model, delivery model, and customers, with traditional software product vendors at the origin of the graph, and various other models scattered throughout the cube. He also asked the question, is the rise in services and new business models temporary or permanent? The “temporary” argument says that we’re in a transition phase between platform and business model innovations; the “permanent” argument (with which I agree) says that software is now commoditized and prices will fall to close to zero as we embrace SaaS and ad-supported models.

Being an MIT geek, Cusumano had slide after slide of data analysis about his research on software product companies. For example, average product company revenue crossed over in 2002 so that services revenue was larger than product revenue; also, firms at 24+ years of age have more services than product revenue. The age phenomenon contributes to the date-based phenomenon, since many of the large enterprise product vendors are reaching this level of maturity now. There’s an interesting cycle where services are very attractive for revenue generation, but then reach a point (in terms of percentage of revenue) where they are performed relatively inefficiently and, due to lower profit margins, are not as profitable as product; eventually, as companies become better at providing services (e.g., through reusability), services swing back to being a more positive contributor to profitability. Market cap follows a similar pattern, although the centre (where services are undesirable) portion of the curve is broader.

Similar things are happening with hardware companies: more than 50% of IBM’s revenue, for example, is from services.

He had some interesting comments on the way that software product companies should incorporate services into their business model: it should be planned and exploited as opposed to just happening by accident, as it does with many product companies.

He ended up with some key questions:

  • How to manage the mix of products, services and maintenance efforts and revenue within a product company.
  • How to “servitize” products, to make them less generic and more customizable.
  • How to productize services; a great point that he made here is that it’s best served by creating two professional services organizations with different mandates.

The New Software Industry: Ray Lane

I’m at the Microsoft campus in Mountain View attending the New Software Industry conference, put on by Carnegie Mellon West and the Haas School of Business. I interviewed a few of the people from CMU West a few months ago about the new Masters of Software Management program, and ended up being invited to attend here today. Since I’m down here for TUCON this week, it was just a matter of coming in a day early and fighting the traffic down from the city this morning (although I left San Francisco at 7:30, I still arrived late, around 9:15).

Unfortunately, I missed the brief opening address which, according to the program, featured Jim Morris, Dean of CMU West, and Tom Campbell, Dean of Haas, so my day started with Ray Lane of Kleiner Perkins (formerly of Oracle) talking about the personal enterprise, or what I would call Enterprise 2.0.

Lane started with a discussion about how the software industry is changing, covering factors such as packaging (including SaaS) and vertical focus. I found it interesting, if not exactly surprising, that he has a very American-centric view of the industry, so that he’s really talking about the software industry in the U.S., not the global industry; he spoke about India and China gaining market share in software as some sort of external force as opposed to part of the industry.

He had some interesting points: a call to action, which included leveraging community power via mashups and other collaborative methods; and a look at how platforms are moving from monoliths to clouds (i.e., services exist in the cloud and are called as required). He covered some basics about Web 2.0 and web-driven capabilities. Since I’ve been so immersed in this for such a long time, there wasn’t much new here for me, although he had some interesting examples, particularly about collaboration and user-driven content.

He talked about the “personal enterprise”, where consumer web applications inspire new enterprise applications, or what many of us have been talking about as Enterprise 2.0. He makes a great point that somehow, being at home allows us to just try something new online, whereas the act of going into the office makes us want to spend a year evaluating rather than just trying something, and how we need to change that notion.

He gave seven laws for Enterprise 2.0 applications:

  • serves individual needs
  • viral/organic adoption
  • contextual, personalized information
  • no data entry or training required
  • delivers instantaneous value
  • utilizes community, social relationships
  • minimum IT footprint

I’d love to expand further on each of these, but I’m trying to get this conference blogging back to something like real-time, so that will have to wait for another post.

He finished up with some examples of personal enterprise applications, with some discussion about what each of them contributed to advancing software industry business models:

  • Services: Webex, Skype, RIM, Google
  • Applications: Salesforce.com, NetSuite, RightNow
  • Collaboration: SuiteTwo, Visible Path

Access to the Microsoft guest wifi is tightly guarded and requires an hour or so of turnaround to get login credentials, so this first post is coming out late and the others will trickle along throughout the day. All of the posts for this conference are available here.