Appian Analyst Update

Matt Calkins and Samir Gulati from Appian were on a short analyst call today to give us a summary of 2008 and a preview of 2009. They had some big changes this year: expanding their marketing efforts, launching their SaaS offering with customers like Starbucks and Manulife, and pushing geographically into Europe and Asia. Much of this is fuelled by the $10M in VC funding that they took on in 2008, the first external funding in their 10-year history; based on the timing, I’m guessing that they got a much better valuation than if it had happened a few months later.

Their sales numbers are counter-cyclical, with their Q4 in 2008 being their biggest closing quarter ever. Although they built their business on US federal government work, they’ve broadened out to a number of commercial clients in financial services, manufacturing and other verticals. They’ve also seen some milestones with systems already in place, such as a total of 1B logins to the system that they have at the US Army. I think that they’re just getting started with BPM there, so this is likely mostly on their portal platform; still, that’s a lot of logins.

Appian’s big push in 2008 was their SaaS platform, Appian Anywhere, which now accounts for an estimated 30% of their new business. Currently, it’s only available to selected large customers in a dedicated and fault-tolerant hosting environment: in other words, not a multi-tenanted SaaS solution that you can just sign up for online at any time, but more like having your BPM servers sitting in someone else’s data center. They’ll be releasing a lower-end offering hosted on Amazon EC2 in early February, with 30-day free trials for small businesses, where each customer is hosted on their own instance. This is the same sort of configuration approach adopted by Intalio, as discussed in the comments on a post that I wrote for the BPM Think Tank; there are many who would say that this is not multi-tenancy, it’s virtualization, and that it doesn’t provide the level of scalability (both up and down) needed for true SaaS. The subscription cost for Appian Anywhere on EC2 will be $35/user/month.
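To make the cost-effectiveness question concrete, here’s a rough back-of-the-envelope sketch in Python comparing the announced $35/user/month subscription against the cost of running a dedicated instance for each customer. The per-instance hosting cost is my own assumption for illustration, not a published Appian or Amazon figure.

```python
# Back-of-the-envelope economics of one-instance-per-customer hosting.
# The $35/user/month subscription is the announced Appian Anywhere EC2 price;
# the per-instance hosting cost below is an assumed figure for illustration.

SUBSCRIPTION_PER_USER = 35.0    # $/user/month (announced pricing)
ASSUMED_INSTANCE_COST = 300.0   # $/month per dedicated instance (assumption)

def monthly_margin(users: int) -> float:
    """Subscription revenue minus hosting cost for one single-tenant customer."""
    return users * SUBSCRIPTION_PER_USER - ASSUMED_INSTANCE_COST

for users in (5, 10, 25, 100):
    print(f"{users:>3} users: margin ${monthly_margin(users):,.2f}/month")

# Under these assumptions, a 5-user customer runs at a loss on its own instance,
# while a true multi-tenant architecture would amortize one environment across
# many small customers -- which is exactly the scalability concern noted above.
```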

Regardless of the platform – on-premise Appian Enterprise, the high-end hosted Appian Anywhere, or the EC2-hosted Appian Anywhere – it’s the same code base, so there shouldn’t be a problem moving from one to another as the need arises. This also means that they’re not trying to split their engineering team in three directions to serve three markets: it’s all the same code.

At the same time as the EC2 launch, Appian will be launching an application framework to allow for faster development and deployment of vertical applications, and an application marketplace to provide applications developed by Appian or partners on a subscription basis. Some initial applications will be free, with others coming in at around $10/user/month on top of the base subscription price.

Appian’s focus is on making BPM frictionless: allowing it to be purchased and deployed within an organization without all the usual hoopla that it takes for on-premise systems. I think that there could be some challenges ahead, however, with the lack of multi-tenancy causing additional administrative overhead and setting limits on how big (or small) you can get with your Appian Anywhere system and still have it be cost-effective all around.

Gartner webinar: First 100 days as BP director

In keeping with other recently-installed change agents, Elise Olding of Gartner delivered a webinar today on your first 100 days as a business process director. As she points out, you have 100 days to make some key first impressions and get things rolling, and although you may not necessarily deliver very much in that time, it sets the tone for the ongoing BPM efforts.

She breaks this down into what you should be doing and delivering in each of the first three months:

  • The first month is about planning and getting a number of activities kicked off. If you’re new to the business area (often, the BP director is coming in from another part of the organization or from outside), then learn about the organization and the business. Start an assessment of how BPM will impact the business, interview key executives, and make sure that you understand the key drivers for BPM to ensure that the project actually has a long-term vision and goals. By the end of that first month, you should have delivered a high-level plan, figured out who’s going to be on the team and how it will be staffed (internal, external consultants, new hires), and created a “what is BPM” presentation to use for education within the organization.
  • The second month is about getting the strategy in place. The team should be mostly in place, with roles and responsibilities defined, and you should have ties established with complementary groups such as enterprise architecture and strategic planning. Some amount of documentation needs to be created by this point, including the BPM charter, methodology for BPM projects and the BPM governance structure (including a competency center) that dovetails with other governance within your organization. At this point, you should also have a first draft of your BPM strategic plan and a communication plan.
  • The third month is about starting to deliver results. With the internal team largely in place and some hiring likely still in progress, you’ll need to determine training needs both for the team and for a larger rollout. The actual process improvement work should be started, looking at the details of processes in the business areas and applying BPM practices (we’re not talking technology implementations here) to start understanding and improving processes, and you should try to complete two “quick win” projects that show value to the organization. The business process competency center should be kicked off and its charter drafted, governance bodies such as steering committees should be in place, and you should finish your final strategic plan.

In some organizations, this will seem a wildly optimistic schedule for all of these activities, and Olding admitted that she has seen many cases of this stretching to around 18 months. I’m sure that hiring Gartner to help you out will speed things along, however. 🙂

She ended with some recommendations that are pretty good advice for any type of project: understand the organization and have a plan that is flexible enough to accommodate their specific needs; communicate, particularly showing BPM in the context of business imperatives; and find advocates within the business to help with the adoption process. Gartner has published quite a bit of research on getting started with your BPM initiatives, including governance and competency centers, but she recommends actions such as getting a collaboration site (e.g., SharePoint, or a hosted solution such as Google Sites if you have external participants) set up early to gather ideas and information about BPM.

Elise went into quite a bit of detail on each of these; definitely worth checking out the replay of the webinar in full (the registration was here, so the replay will likely show up there somewhere). Also, they have two BPM conferences coming up: February 23-25 in London, and March 23-25 in San Diego, and there’s a discount code given at the end of the webinar for $300 off the San Diego conference.

West Bend Insurance does BPM

I attended a webinar today, sponsored by Lombardi, featuring Stacie Kenney, a senior business process analyst at West Bend Mutual Insurance, discussing how they used BPM to allow them to tap into new insurance markets. West Bend has been around since 1894 and has a strong customer base in P&C insurance in the Midwest, but you can imagine the legacy processes and systems that build up over 115 years of operation.

They’ve seen significant growth in the past five years, and wanted to get a bigger piece of the small commercial policies market. However, they couldn’t do small commercial policies cost-effectively with their old business processes: the application process is time-consuming for the agents, and the commissions are small relative to the amount of time spent on the application. The underwriters spent a lot of time re-entering data into a variety of systems, including their mainframe policy administration system, a standalone and inflexible workflow system, and Word and Excel forms. They looked at BPM to provide a more agile solution that could adapt more easily through rule and process changes, make the referral process fast and easy, and provide visibility into operations. She didn’t give a lot of detail on what they actually did, although it was focused on the quoting and underwriting processes, with the goal of reducing the quote-to-issue time from days or weeks down to just minutes or hours.

They use both Blueprint (for process discovery and modeling) and Teamworks (for full process design and execution), and Kenney talked about what they liked about both products. She likes Teamworks because it allowed her, as a non-technical business analyst, to design the actual screens that they would be using, not just sketch a mock-up that would have to be coded by developers. She likes Blueprint for the ability to keep all process documentation in one place, including using it for what-if scenarios by modeling multiple versions of the same process to allow people to see them. Iterative process development was key for them, with playbacks every 4-6 weeks to ensure that the business was fully engaged, and that there was the opportunity to include their feedback all through the development cycle. They did less formal playbacks weekly, and targeted 3-4 month delivery cycles with at least 3 playbacks during that time. Quite an impressive move to an agile-like development cycle, from an organization that had a fairly traditional development methodology prior to that.

They used an architect and a couple of developers from Lombardi’s professional services to get them started and mentor their team; she noted that while anyone could use Blueprint, you do need some developers on the Teamworks side. One of the biggest challenges that they had was getting their heads wrapped around BPM: not just the tools and technology, but BPM as a new way of doing business. She believes (and I agree) that process analysis should be a core competency of any trained business analyst, but there’s some transition to move away from an application development mindset to more of a process focus in order to become a true business process analyst in the context of BPM. BPM shouldn’t be part of an application development project, especially one that has more of a waterfall methodology, since it will tend to lose momentum and you’ll tend to lose the agility benefits that BPM brings.

The BPM project for Small Commercial business was just the start for West Bend, and having it as a showcase project means that other areas are now coming to them to request BPM projects of their own. IT is also using BPM internally for their “Road to Excellence” program, which is focused on consolidating the functional silos of resources and tools within IT. They are using Blueprint as a collaborative tool to model their IT processes, redesigning 14 processes in 4 weeks; implementation is underway, and they expect to have their IT processes running in BPM by March.

Much of what they experienced isn’t unique to Lombardi, although Blueprint provides some extra benefits over many other BPM vendors through a more collaborative modeling environment and a process documentation repository. However, the BPM philosophy and agile methods that they used can be applied with pretty much any BPM product: that’s more an issue of corporate culture than of the specific product, as long as it provides model-driven process development.

The original registration page for the webinar is here, and they’ll have a replay available soon.

BPM and Workflow Handbook 2009

Another paper that you can write, even if you’re not up for heading to Ulm this year: the Workflow Management Coalition’s BPM and Workflow Handbook for 2009. The spotlight this year is on BPM in government, and the deadline for abstracts is December 31st, which doesn’t leave a lot of time. Completed papers are due February 22nd, and the book is out in June.

Contextware for process documentation

I’ve had a look at Contextware previously, but yesterday had a chance to talk in depth with David Austin, the president and COO, and see a demo of their current product. Contextware can be described as a process documentation tool, although it’s also being used as a process discovery tool, but it’s more than just a static document of your operational procedures: for each step in a process, it displays a narrative that might include context or instructions, plus links to various resources including content, applications, and people. You capture processes in Contextware for the purpose of communication and training, not automation; it doesn’t automate processes in any way, but might be combined with a BPMS in order to provide instructions for complex human steps in the process.

There are a couple of key use cases for this sort of process documentation, whether you’re doing it for completely manual processes or for the human steps in a BPMS:

  1. You need to capture information from those aging boomer knowledge workers (who will be retiring as soon as the stock market comes back up), since many of the manual processes exist only in their heads.
  2. You want to standardize processes across the organization, and need to provide operational procedures documentation for those processes.

[Screenshot: Contextware - Author Steps and Narrative]
There is no automated capture of information: you have to lay the process out step by step, but it’s done in a simple hierarchical list format where you create a list of the main steps, then can add sub-steps to any step as an indented sub-list, and so on. Then, for each step/sub-step, you add the narrative and link up the assets from the list of available resources. Assets linked to a step are inherited by its sub-steps, where they can be kept, discarded, or have a subtype created during the authoring process.

Assets can be predefined — these are in a common repository that can be reused by any process — or defined and saved to the repository on the fly. Content such as documents is typically not stored in Contextware: the asset is actually metadata pointing to external content via a URL or URI, which allows the author to set a meaningful name for the asset that is shared wherever that asset is used within any process. Although it’s common for organizations to have samples and procedural documentation online somewhere on their intranet, this removes the hunt for the file or page, since it’s linked directly to the step in the process where it’s required.

The resources are a sort of extended IDEF model with six dimensions, and the author can suppress the display of any of them (there’s a rough sketch of how this fits together after the list):

  • Inputs (often suppressed), which may link to content or to a system depending on the context
  • Guidelines: additional procedural documentation
  • Content: typically samples
  • People, which allows for a mailto: link to directly create an email
  • Tools, providing links to invoke systems and executables, or describe offline resources or tools
  • Outputs (often suppressed)
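
Here’s a minimal sketch, in Python, of how the step hierarchy, asset metadata and six resource dimensions described above might hang together; the class names, fields and inheritance rule are my own illustration, not Contextware’s actual data model or API.

```python
# A minimal illustration of the step/resource structure described above.
# Class names, fields and the inheritance rule are my own sketch of the
# concepts, not Contextware's actual data model or API.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Dimension(Enum):
    INPUTS = "Inputs"
    GUIDELINES = "Guidelines"
    CONTENT = "Content"
    PEOPLE = "People"
    TOOLS = "Tools"
    OUTPUTS = "Outputs"


@dataclass
class Asset:
    """Metadata pointing at external content: a friendly name plus a URL/URI."""
    name: str
    uri: str                      # e.g. an intranet page, a mailto: link, an executable
    dimension: Dimension


@dataclass
class Step:
    title: str
    narrative: str = ""
    assets: List[Asset] = field(default_factory=list)
    sub_steps: List["Step"] = field(default_factory=list)
    parent: Optional["Step"] = None

    def add_sub_step(self, step: "Step") -> "Step":
        step.parent = self
        self.sub_steps.append(step)
        return step

    def effective_assets(self) -> List[Asset]:
        """A step sees its own assets plus those inherited from its parent steps."""
        inherited = self.parent.effective_assets() if self.parent else []
        return inherited + self.assets


# Example (hypothetical names/URLs): a sub-step inherits the guideline
# linked at its parent step, and adds a People resource of its own.
underwrite = Step("Underwrite application",
                  assets=[Asset("Underwriting guidelines",
                                "http://intranet.example.com/uw-guidelines",
                                Dimension.GUIDELINES)])
refer = underwrite.add_sub_step(Step("Refer to senior underwriter",
                                     assets=[Asset("Email senior underwriter",
                                                   "mailto:senior.uw@example.com",
                                                   Dimension.PEOPLE)]))
for asset in refer.effective_assets():
    print(asset.dimension.value, "->", asset.name)
```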

There’s no auto-login or single sign-on for any of the systems that are linked from resources (e.g., content management, email, line of business applications): these are just links to launch the appropriate application or content, and the user is responsible for doing the login themselves on the invoked application if they’re not already logged in.

I’m not sure that I completely understand the role of inputs and outputs in the resource list; it might be stretching the IDEF metaphor a bit too much (and for little purpose these days, when many people don’t know what IDEF is).

An author can clone an existing process to start a new design, and there are some lightweight versioning and roll-back capabilities for managing the processes. It’s also possible to call one process (or subprocess) from another by listing it as an asset in the resources.

[Screenshot: Contextware - Step Narrative and Resources]
The end users select their group/department from a list, then their process, then select the step that they’re working on (or expand the step hierarchy and click on a sub-step) to see the narrative and the list of resources available. Clicking on a resource for that step in the right-hand panel will invoke the URL or URI that’s specified for the resource: it could be a link to a web page or document, an executable desktop application, or a mailto: link for a person. The text in the center panel changes depending on which step or resource they’ve selected. Users can also add comments to a step via a context-sensitive “bulletin board” at each step, providing a bit of a wiki-like experience to capture the users’ feedback directly in the context of the process, although they can’t change the main narrative.

Users are not constrained to follow the steps in order; they can see the entire hierarchy and select any step that they need, since this is process documentation and doesn’t actually drive their work. In fact, this could be used as a contextual help system without regard to a process, organizing the help topics in the hierarchy on the left and attaching narrative and assets to each topic and subtopic.

Both authors and end users can search on processes and metadata to locate a specific process or resource.

An audit (management) function shows who has accessed which processes, allowing a manager to tell whether someone has viewed a new process after it has been rolled out. Since this can be considered training material, the manager is basically checking to see if everyone did their homework and understands the new process.

Both the authoring and end-user environments are completely web-based with a rich AJAX interface, so there’s no desktop installation. We didn’t talk about what’s lurking on the server side of this, but their website lists the operating systems, databases, application servers and web servers supported. Their site also talks about delivering content to mobile devices, but we didn’t discuss that.

There’s an obvious play in the training space and for documentation of manual processes, but Contextware would also like to see how they can put this together with some of the BPMS products in order to provide additional documentation for human-facing steps in cases where it’s difficult to build that into the BPMS user interface itself. I also believe that they need to focus on importing from some of the BPA tools, such as ARIS and Mega: although the process model is only a starting point for Contextware, it would be helpful to have that starting point in the case of large complex processes with a lot of manual steps. Of course, this may start to conflict with what some of the BPA vendors are trying to do in terms of process documentation, but I think that most of them are really focused on showing the business processes rather than providing complete operational procedures. I also see potential for integration with a process discovery tool such as Process Master, although there might be too much overlap in functionality.

There needs to be much better management of screen real estate as well: this product smacks of developers with huge dual monitor setups who don’t realize that the average underwriter in an insurance company works on a single 800×600 15″ screen. I suggested that they be able to minimize to some sort of floating widget (which could be difficult considering that they’re web-based, but hey, that’s what Adobe AIR and Google Gadgets are for, right?) that the user could float their mouse over to pop up the narrative and resources, and click to the next step. Otherwise, you’d be trying to deploy this on a screen where the user has to constantly flip back and forth between Contextware as their procedural guide and their actual applications. Printing would ensue.

[Screenshot: Contextware - Dolly]
They also need to do a bit more with versioning of processes, where a process could be modified and tested by a specific group of people, then promoted to the production version. In the current system, you’d clone the process (and therefore have to use ad hoc naming conventions to indicate that this was a new version of the older process), make the changes, and release selectively by security groups in order to promote it through test and production. Once in production, the users of the old process would have to be notified to start using the new process documentation.

One final criticism about the cutesy interface icons: using a graphic of a sheep (think Dolly) to clone a process just doesn’t cut it for me.

Survey on business process modeling

Three universities with BPM programs — Humboldt University, Eindhoven University of Technology and the Queensland University of Technology — are running a survey on how business process models can be improved in terms of understandability. You can take the survey here, although it’s specifically for those who model using event-driven process chains (EPC). As a participant, you’ll have access to the results of the survey, plus the chance to win a recently-released book, Metrics for Process Models.

Lombardi Analyst Call

Lombardi had a call today for analysts, with Rod Favaron covering business, the market and customers, and Phil Gilbert on the product and technology. Lombardi continues to grow — 60% in license revenue and 40% overall — although their services business isn’t growing as fast as license sales, since they are bringing on more partners to provide services rather than doing it all themselves, especially in geographies that they can’t cover well. They’ve increased their headcount by 25% and increased productivity (which allows them to grow revenues faster than headcount), and are profitable for 2008. They believe that BPM will be counter-cyclical to the current economic crisis, and has the potential to grow in more difficult financial times due to a closer focus on ROI: a position that all the BPM vendors are taking (especially with their investors), although I think that a lot of the increased BPM activity will be new projects with already-purchased software rather than a lot of new license purchases. Although it might not drive a lot of short-term license revenue, this will be good for the BPM vendors in the long run due to greater proliferation of projects within customers.

From a product standpoint, they have four active engineering teams:

  • TeamWorks 6, where there’s still some innovation going on around active management of in-flight process instances, to allow business owners to take more granular control at the instance level. There will be another release of TeamWorks 6 before mid-year 2009, which is good news for all those existing 6 users who aren’t ready to make a major platform shift yet.
  • Office add-ins, TeamWorks for Office and TeamWorks for SharePoint, where some upgrades are happening for non-English-speaking users.
  • TeamWorks 7, which will be released in beta next month. This version has been in the works for a couple of years, and Phil thinks that it’s “the biggest leap in BPMS since BPMSs began” due to the governance and BPM program capabilities that are built in. They’ve rolled in a lot of repository management and code sharing capabilities.
  • Blueprint, where they’re pushing out releases in an Agile development environment every 5-6 weeks. Because of this, the rate of innovation is high, and the product capabilities are growing quickly. The next release is targeted for the third week of December, and next spring they’ll be announcing capabilities that position Blueprint as the central place where people in a process-centric enterprise go to discuss process, making it relevant to everyone’s job, not just that of process analysts.

From a services standpoint, their own professional services staff is increasing, and they’ve moved from having 5-7 partner staff delivering billable services around Lombardi solutions for every one Lombardi billable professional services staff, to having about 15 partner people to one Lombardi professional services person. They expect this ratio to grow further, and are increasing their efforts in training and certification to support this partner growth.

The first questions from the listeners were around the impact of the economic situation on Lombardi’s business and the BPM market in general; then there was an interesting discussion on the uptake of Blueprint: it’s mostly directly with people in operational areas, not IT, as people see this as a way to get started with collaborating without a lot of up-front capital investment. The interest that they’re getting from the federal government will lead them to offer Blueprint in a “more secure” environment for customers who don’t want their processes in the public cloud — this is good news for non-US customers as well, since there are many European and Canadian organizations that would not consider putting their processes on US-based servers due to privacy regulations.

Good call; it would be great if more vendors did this on a quarterly or semi-annual basis.

SAP Tools for Process Definition, Modeling and Management

I spent the morning presenting an introduction to BPM in a jumpstart session at the SAP BPM 2008 event put on by SAP Insider and was going to spend the afternoon by the pool, but was tempted by Ann Rosenberg’s invitation to her afternoon session, A Complete Guide to SAP Tools for Business Process Definition, Modeling and Management. Ann is in the Business Transformation Consulting group at SAP, and was joined in the session by Marilyn Pratt (SAP BPX Community Evangelist), Greg Prickril (SAP NetWeaver BPM product management) and Charles Möller (Center for Industrial Production at Aalborg University).

Several people in the audience — including Ann and Marilyn — were in my session this morning, so had some context for this; Ann did a quick overview of BPM to start, and it was a good complement to my session since she covered many of the topics that I didn’t have time to address, such as the link between BPM and quality management programs like Six Sigma, and business process maturity models. One interesting quote from Ann: “The way we will run SAP projects going forward will be different from how we did it in the past”, due to the process orchestration capabilities that are now available.

She positions IDS Scheer’s ARIS as the place where you will do your business process modeling, which includes both manual and automated activities (by “manual”, I believe that she means those that are not touched in any way by the BPMS); automated activities typically make up less than 20% of all activities. Of those automated activities, you’ll then use NetWeaver BPM to model and execute less than 20% — the ones that are a competitive differentiator — whereas the remaining 80+% are standard activities/processes within SAP’s standard business suite.
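
To put rough numbers on that layering, here’s a quick arithmetic sketch in Python using the percentages as she stated them; the 1,000-activity starting point is an arbitrary figure of my own, not an SAP number.

```python
# Rough arithmetic on the layering Rosenberg described, using the stated
# percentages; the 1,000-activity starting point is an arbitrary example.
total_modeled = 1000                       # activities modeled in ARIS
automated = int(total_modeled * 0.20)      # "less than 20%" touched by a BPMS at all
differentiating = int(automated * 0.20)    # "less than 20%" of those in NetWeaver BPM
standard = automated - differentiating     # remainder handled by the SAP business suite

print(f"Modeled in ARIS:             {total_modeled}")
print(f"Automated at all:            {automated}")
print(f"  in NetWeaver BPM:          {differentiating}")
print(f"  in the SAP business suite: {standard}")
# i.e. only about 4% of what you model ends up in NetWeaver BPM under this
# view -- which is the number I push back on in my comments below.
```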

My thoughts on this:

  • I don’t agree that only 20% of what most organizations model are candidates for any sort of automation if you include the manual tasks executed within a BPMS, but I haven’t done any definitive survey on this; the percentage would depend on how much process modeling your organization is doing as a standalone initiative, but I would expect a much higher percentage if your organization has some sort of BPMS initiative.
  • The 80% or more of the automated activities that are targeted for SAP’s business suite rather than BPM are those that are intended to be more “cost-effective”, which implies that it’s much more expensive to develop and execute business processes in NetWeaver BPM than in the core business systems. I don’t know enough about SAP to make that sort of cost comparison, but given the time and effort that I’ve heard is required to deploy and maintain an SAP business suite system, I find it hard to believe that a more agile BPM system is more expensive if you are going to do a comparison of a realistic (read: not static) process. I imagine that for truly standard processes — those where you could use SAP business suite out of the box — that would be true, but it’s not my impression that that happens a lot.

She had some good comments on business process maturity and how it relates to SAP: the core business products cover off the first three or four levels to get your processes standardized, then BPM kicks in when you move into the upper levels of continuous improvement. I think that’s a good context for SAP customers moving into BPM; if they’re using SAP’s business suite properly, then they already have some degree of business process maturity, but have no hope of achieving that continuous improvement nirvana without something more agile, like BPM.

Charles Möller was up next with an academic review of the management discipline of BPM that links to the book “Business Process Management – The SAP Roadmap” that he recently co-authored with Rosenberg and two others; this covered some of the history of quality management methodologies and their connection to business process, the current analyst views, some ongoing research, and more on process maturity models. He included some research on architectural maturity models, which are related to process maturity, particularly around how IT budgets decrease with architectural maturity up to the point of a centralized, optimized core set of services, but increase when you reach a maturity level of business modularity, since individual business units can’t have flexible business processes without increasing IT costs. Möller’s premise from his book is that this is just not going to fly, and that we need new paradigms for business process maturity: a new sort of IT value chain that moves beyond business process management to business process innovation, where innovation and change are the standard, rather than a specific set of processes or services. He sees enterprise architecture as the enabler in moving from process management to process innovation.

Ann Rosenberg was back up to talk about BPM governance, particularly in SAP’s structured approach to moving from a functional organization to a process organization. She talked about how SAP applied this approach to their own organization, and their experiences with it. She also had an interesting point about how there are no longer IT projects: every project is a process improvement project, otherwise you shouldn’t be doing it. It’s critical to build a process-centric IT department, not the old-style functional IT where each person is a specialist in a particular system or function. IT needs to recognize that they are an enabler for business change, not a driver of change, and hand the control back to the business. I resisted the urge to stand up and cheer.

Greg Prickril gave us a view of NetWeaver BPM, starting with some of their basic philosophy — their main target is existing SAP customers who want to add the orchestration capabilities of BPM to extend their current business processes in the SAP business suite. In the context of BPM, the SAP business suite can be exposed as just another set of services to be invoked from BPM (which, of course, any other BPMS vendor who works with SAP customers knows already). I’ve had some extensive briefings on NetWeaver BPM from some of the other product management team members, and I’ll be publishing some of my observations on it this week in the context of this conference.

He pointed out that although they intend to address the needs of many personas across business and IT, their first version will be optimized for the process architect: an IT role that designs processes. In other words, they don’t yet have their business analyst perspective ready in the modeling environment. He showed us a demo of the Eclipse-based process modeling environment, and a look at the end-user experience in the context of the NetWeaver universal task list. My assessment of this first version of the product, which is in beta now and will be released in Q1 2009, is that it has some nice integration capabilities (although no asynchronous web services calls), but that the human-centric capabilities are barely adequate, and they don’t meet the minimum requirements to be considered a BPMS in the eyes of some of the analysts. However, this is version 1.0, and you don’t expect them to land in the top right of anyone’s quadrant the first time out; from what I’ve seen, they have a good roadmap to getting to the functionality that will make them competitive with other BPMS vendors when it comes to SAP customers. Will they ever be competitive with non-SAP customers? Probably not, but then, that’s not their target market.

It’s interesting to see a BPMS demo given to a group of mostly technical people who have no idea what a BPMS looks like: usually, I’m seeing demos like this at other BPMS vendors’ conferences, where they’re showing the next version of their product but everyone is familiar with the current version and basic BPM concepts. Things that those of us familiar with BPMS don’t even think about any more — like the concept of process instance parameters — have to be explained, which is a good reminder to be aware of the context and the audience background when discussing BPM.

Ann Rosenberg came back up to cover some of the BPM training curriculum, and handed it over to Marilyn Pratt to discuss the SDN BPX community. I’m a big fan of Marilyn’s: she’s one of the most active and enthusiastic community managers that I’ve met, and manages to ensure that SAP’s corporate party line doesn’t overshadow the independent discussions and interactions on the BPX site.

The afternoon jumpstart session ended with a panel that included the four speakers plus me, which gave the audience a chance to ask everything from specific SAP product questions to more philosophical questions on the differences between BPR and what we’re doing now with process improvement.

BPM events for 2009

I know that it’s only mid-November, but the conference season is pretty much over for 2008. I’m trying to get conference dates for 2009 on the shared BPM Events calendar (click through to Google Calendar to add it to your calendars), so if you have a date, please add it if you’re already an author, or send the details to me and I’ll add it. If you’ll be a regular contributor of events to the calendar, then I’ll add you as an editor on the calendar.