Zero History

Really enjoying reading William Gibson’s latest, “Zero History”, which includes some lovely gems of writing such as:

He looked down at the screen, the glowing map. Saw it as a window into the city’s underlying fabric, as though he held something from which a rectangular chip of London’s surface had been pried, revealing a substrate of bright code. But really, wasn’t the opposite true, the city the code that underlay the map?

IOD Keynote: Computational Mathematics and Freakonomics

I attended the keynote this morning, on the theme of looking forward: first we heard from Mike Rhodin, an exec in the IBM Software group, then from Brenda Dietrich, a mathematician (and VP – finally, a female IBM exec on stage) from the analytics group in IBM Research. IBM Research has nine labs around the world, including a new one just launched in Brazil, and a number of collaborative research facilities, or “collaboratories”, where they work with universities, government agencies and private industry on research that can be leveraged into the market more quickly.

I’ve met a few of the BPM researchers from the Zurich lab at the annual academic BPM conference, but the range of the research across the IBM labs is pretty vast: from nanotechnology, to the cloud, to all of the event generation that leads to the “smarter planet” that IBM has been promoting. Dietrich is here from the analytics group because analytics is at the top of this pyramid of research areas, especially in the context of the smarter planet: all of our devices are generating a flood of events and data, and some pretty smart analytics have to be in place to make sense of it all.

The future of analytics is moving from today’s static model of collect-analyze-present results, to more predictive analytics that can create models of the future based on what’s happened in the past, and use that flood of data (such as Twitter) as input to these analytical models.

I have a lot of respect for IBM for trying out their own ideas on themselves, using the company as one big guinea pig, and this analytics research is no exception. They’re using data from all sorts of internal systems, from manufacturing plants to software development processes to human resources, to feed into this research, and benefiting from the results. When this starts to hit the outside market, it will have an impact on a much wider variety of industries, such as telco and oil field development. Not surprisingly, this ties in with master data management, since you need to deal with common data models if you’re going to perform complex analytics and queries across all of this data, and their research on using the data stream to actually generate the queries is pretty cool.

She showed a short video clip of Watson, an AI “question answering system” that they’ve built, playing Jeopardy: interpreting the natural language questions – including colloquialisms – and responding to them quickly, beating out some top human Jeopardy players. She closed with a great quote that is inspirational in so many ways, especially to girls in mathematics: “It’s a great time to be a computational mathematician”.

The high-profile speakers of the keynote were up next: Steven Levitt and Stephen Dubner, authors of Freakonomics and Superfreakonomics, with some interesting anecdotes about how they started working together (Levitt’s the genius economist, and Dubner’s the writer who collaborated with him on the books). They talked about turning data into ideas, tying in with the analytics theme; they had lots of interesting and humorous stories on an economic theme, such as teaching monkeys about money as a token to be exchanged for goods and (ahem) services, and what that teaches us about risk and loss aversion in people.

I have a noon flight home to Toronto, so this ends my time at IOD 2010. This is my first IOD: I used to attend FileNet’s UserNet conference before the acquisition, but hadn’t been to IOD or Impact until this year. With over 10,000 people registered, this is a massive conference that covers a pretty wide range of information management technologies, including the FileNet ECM, BPM and now Case Manager software that is my main focus here. I’ve had a look at the new IBM Case Manager, as you’ve read in my posts from yesterday, and give it a bit of a mixed review, although it’s not even released yet. I’m hoping for an in-depth demo sometime in the coming weeks, and will be watching to see how IBM launches itself into the case management space.

Customizing the IBM Case Manager UI

Dave Perman and Lauren Mayes had the unenviable position of presenting at the end of the day, and at the same time as the expo reception was starting (a.k.a. “open bar”), but I wanted to round out my view of the new Case Manager product by looking at how the user interfaces are built. This is all about the Mashup Center and the Case Manager widgets; I’ve played around with the ECM widgets in the past, which provide an easy way to build a composite application that includes FileNet ECM capabilities.

Perman walked through the Case Manager Builder briefly to show how everything hangs together – or at least, the parts that are integrated into the Builder environment, which are the content and process parts, but not rules or analytics – then described the mashup environment. The composite application development (mashup) environment is pretty standard functionality in BPM and ACM these days, but Case Manager comes with a pre-configured set of pages that make it easy to build case application UIs. A business analyst can easily customize the standard Case Manager pages, selecting which widgets are included and their placement on the page, including external (non-Case Manager) widgets.

The designer can also override the standard case view pages either for all users or for specific roles; this requires creating the page in the mashup environment and registering it for use in Case Manager, then using the Case Manager Builder to assign that page to the specific actions associated with a case. In other words, the UI design is not integrated into the Case Builder environment, although the end result is linked within that environment.

Mayes then went through the process of building and integrating 3rd party widgets; there’s a lot of material on the IBM website now on how to build widgets, and this was just a high-level view of that process and the architecture of integrating between the Mashup Center and the ACM widgets, themes and ECM services on the application server. This uses lightweight REST services that return JSON, which makes them easier to deal with in the browser, including CMIS REST services for content access, PE REST services for process access, and some custom case-specific REST services. Since there are widgets for Sametime presence and chat functionality, they link through to a Sametime proxy server on the application server. For you FileNet developer geeks, know that you also have to have an instance of Workplace XT running on the application server. I’m not going to repeat all the gory details, but basically once you have your custom widget built, you can deploy it so that it appears on the Mashup Center palette, and it can be used like any other pre-existing widget. There’s also a command widget that retrieves all the case information so that it’s not loaded multiple times by all of the other widgets; it also acts as a controller for moving between list and detail pages.
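
To make that architecture a bit more concrete, here’s a minimal sketch of what a custom widget’s back end might do to pull case data over those REST services. The host name, endpoint paths, parameters and JSON field names below are invented placeholders, not the documented Case Manager, CMIS or PE REST URLs; the point is just the pattern Mayes described of lightweight REST calls returning JSON.

```python
import requests

# Hypothetical base URLs; the real Case Manager, CMIS and PE REST paths differ.
CASE_REST_BASE = "https://appserver.example.com/CaseManager/CASEREST/v1"
CMIS_BASE = "https://appserver.example.com/fncmis/resources"

session = requests.Session()
session.auth = ("caseworker", "password")  # placeholder credentials

def get_case(case_id: str) -> dict:
    """Fetch case metadata as JSON from a (hypothetical) case-specific REST service."""
    resp = session.get(f"{CASE_REST_BASE}/cases/{case_id}", params={"format": "json"})
    resp.raise_for_status()
    return resp.json()

def list_case_documents(case_folder_id: str) -> list[dict]:
    """List documents filed in the case folder via a (hypothetical) CMIS-style call."""
    resp = session.get(f"{CMIS_BASE}/folders/{case_folder_id}/children",
                       params={"format": "json"})
    resp.raise_for_status()
    return resp.json().get("objects", [])

if __name__ == "__main__":
    case = get_case("CASE-000123")
    docs = list_case_documents(case.get("caseFolderId", ""))
    print(case.get("caseTitle"), len(docs))
```

A command-widget-style component would do something like this once, then hand the resulting case context to the other widgets on the page rather than having each of them repeat the calls.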

This was a bit more information than I was counting on absorbing this late in the day, and I ducked out early when the IBM partner started presenting about what they’ve done with custom widgets.

That’s it for today; tomorrow will be a short day since I fly home mid-day, but I’ll likely be at one or two sessions in the morning.

IBM’s New Case Manager Product Overview

The day before the official announcement of IBM’s Case Manager product, Jake Levirne, Senior Product Manager, walked us through the capabilities. He started by defining case management, and discussing how it is about providing context to enable better outcomes rather than prescribing the exact method for achieving that outcome. For those of you who have been following ACM for a while, this wasn’t anything new, although I’m imagining that it is for some of the audience here at IOD.

Case Manager is an extension of the core (FileNet) ECM product through the integration of functionality from several other software products across multiple IBM software groups, specifically analytics, rules and collaboration. There is a new design tool targeted at business analysts, and a user interface environment that is the next generation of the old ECM widgets. There’s a new case object model in the repository, allowing the case construct to exist purely in the content repository, and be managed using the full range of content management capabilities including records management. Case tasks can be triggered by a number of different event types: user actions, new content, or updates to the case metadata. By having tasks as objects within the case, each task can then correspond to a structured subprocess in FileNet BPM, or just be part of a checklist of actions to be completed by the case worker (further discussion left it unclear whether even the simple checklist tasks were implemented as a single-step BPM workflow).

A task can also call a WebSphere Process Server task; in fact, from what I recall of how the Content Manager objects work, you can call pretty much anything if you want to write a Java wrapper around it, or possibly this is done by triggering a BPM process that in turn calls a web service. The case context – a collection of all related metadata, tasks, content, comments, participants and other information associated with the case – is available to any case worker, giving them a complete view of the history and the current state of the case. Some collaboration features are built in to the runtime, including presence and synchronous chat, as well as simple asynchronous commenting; these collaborations are captured as part of the case context.
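
To picture that case object model, here’s an illustrative sketch in Python of how I understood the description: a case holds metadata, content and comments plus a set of tasks, each task fires on a user action, new content or a metadata update, and may or may not be backed by a structured subprocess. The class and field names are my own shorthand, not IBM’s actual object classes.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class TaskTrigger(Enum):
    USER_ACTION = "user_action"          # case worker starts the task manually
    NEW_CONTENT = "new_content"          # a document of a given type is filed into the case
    METADATA_UPDATE = "metadata_update"  # a case property changes value

@dataclass
class CaseTask:
    name: str
    trigger: TaskTrigger
    # If set, the task launches a structured subprocess (e.g., a FileNet BPM workflow);
    # if None, it is just a checklist item completed directly by the case worker.
    subprocess_id: Optional[str] = None
    completed: bool = False

@dataclass
class Case:
    case_id: str
    properties: dict = field(default_factory=dict)   # case metadata
    documents: list = field(default_factory=list)    # content filed into the case folder
    comments: list = field(default_factory=list)     # asynchronous collaboration
    tasks: list = field(default_factory=list)

    def on_metadata_update(self, key: str, value) -> None:
        """Update a property and start any tasks triggered by metadata changes."""
        self.properties[key] = value
        for task in self.tasks:
            if task.trigger is TaskTrigger.METADATA_UPDATE and not task.completed:
                print(f"Triggering task '{task.name}' on case {self.case_id}")
```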

As you would expect, cases are dynamic and allow case workers to add new tasks for the case at any time. Business rules, although they may not even be visible to the end user, can be defined during design time in order to set properties and trigger events in the case. Rules can be changed at runtime, although we didn’t see an example of how that would be done or why it might be necessary.

There are two perspectives in the Case Manager Builder design environment: a simplified view for the business analysts to define the high level view of the case, and a more detailed view for the technologists to build in more complex integrations and complex decision logic. This environment allows for either start-from-scratch or template-based case solution definitions, and is targeted at the business analyst with a wizard-based interface. Creating a case solution includes defining the following from the business analyst’s view:

  • case properties (metadata)
  • roles that will work on this case, which will be bound to users at runtime
  • case types that can exist within the same case solution
  • document types that can be included in the case or may even trigger the case
  • case data and search views
  • which case views each role will see
  • default folders to be included in the case
  • tasks that can be added to this case, each of which is a process (even if only a one-step process), and any triggering events for the tasks
  • the process behind each of the tasks, which is a simple step editor directly in Case Builder; a system lane in this editor can represent the calling of a web service or a WPS process

All of these can be defined on an ad hoc basis, or stubbed out initially using a wizard interface that walks the business analyst through and prompts for which of these things needs to be included in the case solution. Comments can be added on the objects during design time, such as tasks, allowing for collaboration between designers.
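
As a way of visualizing what the analyst ends up defining, here’s an illustrative sketch of a case solution expressed as a plain Python data structure, covering the items in the list above. The field names and the sample “credit card dispute” solution are invented for illustration; this is not the actual Case Manager solution schema.

```python
# Illustrative only: field names are invented shorthand, not the Case Manager schema.
credit_dispute_solution = {
    "name": "Credit Card Dispute",
    "properties": ["customer_id", "dispute_amount", "dispute_reason"],   # case metadata
    "roles": ["intake_clerk", "dispute_analyst", "supervisor"],          # bound to users at runtime
    "case_types": ["billing_dispute", "fraud_claim"],
    "document_types": ["dispute_form", "statement", "correspondence"],   # may also trigger a case
    "views": {   # which case views each role will see
        "intake_clerk": ["summary", "documents"],
        "dispute_analyst": ["summary", "documents", "tasks", "history"],
    },
    "default_folders": ["Correspondence", "Evidence"],
    "tasks": [
        {   # each task is backed by a process, even if only a one-step process
            "name": "Review dispute",
            "trigger": "new_content:dispute_form",
            "steps": ["Assess claim", "Approve or deny"],   # simple step editor equivalent
        },
        {
            "name": "Notify customer",
            "trigger": "metadata_update:dispute_status",
            "steps": ["Call notification service"],   # a system lane step could call a web service or WPS
        },
    ],
}
```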

As was made clear in an audience question, the design that a business analyst is doing will actually create object classes in both Content Manager and BPM; this is not a requirements definition that then needs to be coded by a developer. From that standpoint, you’ll need to be sure that you don’t let them do this in your production environment since you may want to have someone ensure that the object definitions aren’t going to cause performance problems (that seemed screamingly obvious to me, but maybe wasn’t to the person asking the question).

From what Levirne said, it sounds as if the simple step editor view of the task process can then be opened in the BPM Process Designer by someone more technical to add other information, implying that every task does have a BPM process behind it. It’s not clear if this is an import/export to Process Designer, or just two perspectives on the same model, or if a task always generates a BPM process or if it can exist without one, e.g., as a simple checklist item. There were a lot of questions during the session and he didn’t have time to take them all, but I’m hoping for a more in-depth demo/briefing in the weeks to come.

Case analytics, including both dashboards (Cognos BAM) and reports (Excel and Cognos BI reports) based on case metadata, and more complex analytics based on the actual content (Content Analytics), are provided to allow you to review operational performance and determine root causes of inefficiencies. From a licensing standpoint, you would need a Cognos BI license to use that for reporting, while a limited-license version of Content Analytics is included out of the box that can only be used for analyzing cases, not all your content. He didn’t cover much about the analytics in this session, since it was primarily focused on the design time and runtime of the case management itself.

The end-user experience for Case Manager is in the IBM Mashup Center, a mashup/widget environment that allows the inclusion of both IBM’s widgets and any others that support the iWidget standard and expose their properties via REST APIs. IBM has had the FileNet ECM widgets available for a while to provide some standard ECM and BPM capabilities; the new version provides much more functionality to include more of the case context, including metadata and tasks. A standard case widget provides access to the summary, documents, activities and history views of the case, and can link to a case data widget, a document viewer widget for any given document related to the case, and e-forms for creating more complex user interfaces for presenting and entering data as part of the case.
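
The widgets in such a page cooperate by passing events and shared case context rather than calling each other directly; below is a toy illustration of that wiring pattern in Python. This is not the iWidget API, just a generic publish/subscribe sketch showing how a case list selection could drive a case data widget and a document viewer widget.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal publish/subscribe hub standing in for the mashup framework's event wiring."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable) -> None:
        self._subscribers[event].append(handler)

    def publish(self, event: str, payload: dict) -> None:
        for handler in self._subscribers[event]:
            handler(payload)

bus = EventBus()

# A "case list" widget publishes the selected case; other widgets react to it.
def case_data_widget(payload: dict) -> None:
    print("Case data widget loads properties for", payload["case_id"])

def document_viewer_widget(payload: dict) -> None:
    print("Document viewer lists documents in folder", payload["case_folder_id"])

bus.subscribe("case.selected", case_data_widget)
bus.subscribe("case.selected", document_viewer_widget)

# Simulate a user clicking a case in the list widget:
bus.publish("case.selected", {"case_id": "CASE-000123", "case_folder_id": "FOLDER-42"})
```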

Someone I know who has worked with FileNet for years commented that Case Manager looks a lot like the integrated demos that they’ve been building for a couple of years now; although there’s some new functionality here and the whole thing is presented as a neat package, it’s likely that you could have done most of this on your own already if you were proficient with FileNet ECM and some of the other products involved.

We also heard from Brian Benoit of Pyramid Solutions, a long-time FileNet partner who has been an early adopter of Case Manager and responsible for building some of the early templates that will be available when the product is released. He demonstrated a financial account management template, including account opening, account maintenance, financial transaction requests and correspondence handling. In spite of IBM’s claim that there is no migration path from Business Process Framework (BPF), there is a very BPF-like nature to this application; clearly, the case management experience that they gained from BPF usage has shaped the creation of Case Manager, or possibly Pyramid was so familiar with BPF that they built something similar to what they knew already. Benoit said that the same functionality could be built out of the box with Case Manager, but that what they have provided is an accelerator for this sort of application.

Levirne assured me that everything in his presentation could be published immediately, although I’ve had analyst briefings on Case Manager that are under embargo until the official announcement tomorrow, so I’ll fill in any of the missing details then.

IBM IOD Opening Session: ACM and Analytics

I’m at IBM’s Information On Demand (IOD) conference this week, attending the opening session. There are 10,000 attendees here (including, I assume, IBM employees) for a conference that covers information management of all sorts: databases, analytics and content management. As at other large vendor conferences, they feel obligated to assault our senses in the morning with loud performance art: today, it’s Japanese drummers (quite talented, and thankfully short). From a logistics standpoint, the wifi fell to its knees before the opening session even started (what, like you weren’t expecting this many people??); IBM could learn a few lessons about supporting social media attendees from SAP, which provided a social media section with tables, power and wired internet to ensure that our messages got out in a timely fashion.

Getting back to the session, it was hosted by Mark Jeffries, who provided some interesting and amusing commentary between sessions, told us the results of the daily poll, and moderated some of the Q&A sessions; I’ve seen him at other conferences and he does a great job. First up from IBM was Robert LeBlanc (I would Google his title, but did I mention that there’s no wifi in here as I type?), talking about how the volume of information is exploding, and yet people are starved for the right information at the right time: most business people say that it’s easier to get information on the internet than out of their own internal systems. Traditional information management – database and ECM – is becoming tightly tied to analytics, since you need analytics to make decisions based on all that information, and gain insights that help to optimize business.

They ran some customer testimonial videos, and the term “advanced case management” came up early and often: I sense that this is going to be a theme for this conference, along with the theme of being analytics-driven to anticipate and shape business outcomes.

LeBlanc was then joined on stage by two customers: Mike Dreyer of Visa and Steve Pratt of CenterPoint Energy. In both cases, these organizations are leveraging information in order to do business better: for example, Visa used analytics to determine that “swipe-and-go” for low-value neighborhood transactions, such as your morning Starbucks run, was so low risk that it didn’t need immediate verification, speeding up each transaction and therefore getting your latte to you faster. CenterPoint, an energy distributor, uses advanced metering and analytics not only for end-customer metering, but to monitor the health of the delivery systems so as to avoid downtime and optimize delivery costs. They provided insights into how to plan and implement an information management strategy, from collecting the right data to analyzing and acting on that information.

We then heard from Arvind Krishna, IBM’s GM of Information Management, discussing the cycle of information management and predictive analytics, including using analytics and event processing to optimize real-time decisions and improve enterprise visibility. He was then joined on a panel by Rob Ashe, Fred Balboni and Craig Hayman, moderated by Mark Jeffries; this started to become more of the same message about the importance of information management and analytics. I think that they put the bloggers in the VIP section right in front of the stage so that we don’t bail out when it starts to get repetitive. I’m looking forward to attending some of the more in-depth sessions to hear about the new product releases and what customers are doing with them.

Since the FileNet products are showcased at IOD, this is giving me a chance to catch up with a few of my ex-FileNet friends from when I worked there in 2000-1: last night’s reception was like old home week with lots of familiar faces, and I’m looking forward to meeting up with more of them over the next three days. Looking at the all-male group of IBM executives speaking at the keynote, however, reminded me why I’m not there any more.

Disclosure: In addition to providing me with a free pass to the conference, IBM paid my travel expenses to be here this week. I flew Air Canada coach and am staying at the somewhat tired Luxor, so that’s really not a big perq.

The Rules For Process

I’ve been pretty busy here at the Building Business Capability conference the past two days with little time for blogging, and with two presentations to do today, I don’t have much time, but wanted to attend Roger Burlton’s “The Rules For Process” keynote, which he refers to as his business process manifesto. After some terms and meta-rules (e.g., short, jargon-free and methodology-neutral), he got into his eight main principles:

  1. A business process is a unique capability of an organization.
  2. A business process exists within a clearly defined business context.
  3. The name of a business process must be consistently structured, unambiguous and commonly used.
  4. A model of a business process holds knowledge about a business process.
  5. A model of a business process associates a business process with other capabilities of the organization.
  6. A business process is guided by the business’ strategy and its policies.
  7. The performance of a business process is measured and assessed in business terms.
  8. A business process is an essential asset of the organization.

He spent quite a bit of time delving into each of these principles in detail, such as describing a business process as an action, not a policy, business rule or technology application.

I’m not sure if Roger is considering publishing a paper on this; definitely lots of good information about what business processes are and are not, which could help many people with their business process capture efforts. There’s apparently a discussion about this on the BPTrends LinkedIn group where you can find out more and join in the conversation, although I haven’t found it yet.

Building Business Capability Conference: Rules and Process and Analysis, Oh My!

After a welcome by Gladys Lam, the conference kicked off with a keynote by Ron Ross, Kathleen Barret and Roger Burlton, chairs of the three parts of the conference: Business Rules Forum, Business Analysis Forum and Business Process Forum. This is the first time that these three conferences have come together, although last year BPF emerged as more than just a special interest track at BRF, and it makes a lot of sense to see them together when you consider the title of the conference: Building Business Capability.

The keynote was done as a bit of a panel, with each of the three providing a view on the challenges facing organizations today, the capabilities required to tackle these challenges, and how this conference can help you to take these on. Some themes:

  • Lack of agility is a major challenge facing organizations today. To become more agile, design for change, including techniques like externalizing rules from processes and applications for late binding (see the sketch after this list).
  • Consider effectiveness before efficiency, i.e., make sure that you’re doing the right thing before seeking to optimize it. In the vein of “doing the right thing”, we need to change corporate culture to focus on customer service.
  • Structured business vocabularies are important for effectiveness, including things like rules vocabularies and BPMN. Roger pointed out that we need to keep things simple within the usage of these vocabularies, and jokingly challenged us to create a valid process model containing all 100+ BPMN elements.
  • The business analyst’s role will transform in the next five years as process, rules and decision tools converge and business architecture gains more significance. BAs need to step up to the challenge of using these tools and related methodologies, not just write requirements, and need to be able to assess the return on investment of previous business decisions to assist with future directions.
  • There is no conflict between the rules and process domains, they’re complementary. I often joke that business process people want to embed all rules into their process maps and just turn them into big decision trees, whereas business rules people want the business process to have a single step that calls a rules system, but the truth is somewhere in between. I’ve written and presented a lot about how rules and process should work together, and know that using them together can significantly increase business process agility.
  • It’s not about aligning business and IT, it’s about aligning business strategy with IT capability. Don’t focus on delivering IT systems, focus on delivering business solutions.
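
The first of those techniques, externalizing rules from the process for late binding, is easy to illustrate. The sketch below is my own example, not something shown in the keynote: the approval thresholds live in an external rules definition that is read at decision time, so the business can change them without touching or redeploying the process logic.

```python
import json

# Externalized rules: in practice these would live in a rules engine or repository,
# so the business can change them without redeploying the process.
RULES_JSON = """
{
  "auto_approve_limit": 1000,
  "requires_senior_review_over": 10000
}
"""

def load_rules() -> dict:
    """Late binding: rule values are read at decision time, not compiled into the process."""
    return json.loads(RULES_JSON)

def route_claim(amount: float) -> str:
    """A process step that delegates the decision to the externalized rules."""
    rules = load_rules()
    if amount <= rules["auto_approve_limit"]:
        return "auto_approve"
    if amount > rules["requires_senior_review_over"]:
        return "senior_review"
    return "standard_review"

if __name__ == "__main__":
    for amt in (250, 5000, 25000):
        print(amt, "->", route_claim(amt))
```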

Julian Sammy of IIBA tweeted that he was recording the keynote and will put some of it up on YouTube and the IIBA site, so watch for that if you want to see the highlights on video. You can also follow the conference Twitter stream at #bbc2010.

Thriving In A Process-Driven World

Clay Richardson and Dave West (definitely the two snappiest dressers at Forrester) opened the second day of the Forrester Business Process and Application Delivery Forum with a keynote on thriving in a process-driven world by shifting both your business and IT culture. These shifts are hard work and fraught with risk, but necessary in order to achieve transformation. It’s critical to lead change, not just manage it, by creating change agents inside your organization.

They discussed some tools for doing this: identifying value streams that can point everyone in the same direction; using process as a common language for transformation, although not necessarily a common process representation; extending agile thinking to the entire enterprise; and lean governance that starts at the top but pushes down responsibility to empower teams to make decisions.

To achieve agility, it’s necessary to align business and IT into integrated process teams and adopt agile processes for those integrated teams, as well as selecting tools and architectures that support change.

Good governance is less about telling people what to do (and what not to do), and more about educating people on why they need to do certain things and empowering them to make the right choices. Many successful organizations adopt not just centers of excellence, but build communities of practice around those CoEs.

Since Richardson focuses on business process and West on agile software development, this was an interesting hybrid of ideas that spanned both business and IT.

Dynamic Case Management In the Public Sector

There was nothing that could have enticed me to attend the Lean Six Sigma presentation this late in the afternoon, so instead I opted for the public sector track (which is not really my area of interest) for a discussion on dynamic case management (which is my area of interest) by Craig LeClair.

Government workers have low levels of empowerment and resourcefulness, for both cultural reasons and a lack of technology to support such activities. So why is dynamic case management important in government? He listed several reasons, including the increased need to manage the costs and risks of servicing higher numbers of citizen requests, the less structured nature of government jobs, demographic trends that will see many experienced government workers retiring soon, and new regulations that impact their work.

Forrester defines dynamic case management as “a semistructured but also collaborative, dynamic, human and information-intensive process that is driven by outside events and requires incremental and progressive responses from the business domain handling the case”. A case folder at the center of the case is surrounded by people, content, collaboration, reporting, events, policies, process and everything else that might impact that case or be consumed during the working of the case. It’s a combination of BPM, ECM and analytics, plus new collaborative user interface paradigms. Forrester is just wrapping up a wave report on dynamic case management (or adaptive case management, as it is also known), and we’re seeing a bit of the research that’s going into it.

LeClair discussed the three case management categories – service requests, incident management and investigative – and showed several government process examples that fit into each type, as well as some case studies. He moved on to more generic Forrester BPM advice that I had heard earlier in sessions today, such as leveraging centers of excellence, but included some advice specific to case management, such as using it as a Lean approach for automating processes.

39 minutes into a 45-minute presentation, he turned it over to his co-presenter, Christine Johnson from Iron Data, although he assured her that she still had 15-20 minutes. 🙂 She walked through a case lifecycle for a government agency dealing with unemployment and disability claims through retraining and other activities: this includes processes for referral/intake, needs assessment and service plan, appeals, and so on. Some portions, such as intake, had more of a structured workflow, whereas others were less structured cases where the case worker determined the order and inclusion of activities. There were some shockingly detailed diagrams, not even considering the time of day, but she redeemed herself with a good list of best practices for case management implementations (including, ironically, “clutter-free screens”), covering technology, design and process improvement practices.

Interestingly, Johnson’s key case study on a federal agency handling disability claims started as an electronic case file project – how many of those have we all seen? – and grew into a case management project as the client figured out that it was possible to actually do stuff with that case file in addition to just pushing documents into it. The results: paperless cases, and reduced case processing times and backlogs.

Fidelity Investments’ Evolution To Product-Focused Software Delivery

Darrell Fernandes, SVP of advisory solutions technology at Fidelity Investments, finished up the morning at Forrester’s BP&AD Forum with a discussion of their IT transformation: how they changed their software delivery process to become more like a software product company. They created “fences” around their projects in terms of centers of excellence and project management offices, with the idea that this would drive excellence on their projects; what they found is that the communication overhead started to bog them down, and that the silos of technology expertise became obsolete as technologies became more integrated. This is a really interesting counterpoint to Medco’s experience, where they leveraged the centers of excellence to create a more agile enterprise.

For Fidelity, the answer was to structure their software delivery to look more like that of a software product company, rather than focusing specifically on projects. They looked at and introduced best practices not just from other organizations like themselves, but also from software companies such as Microsoft. Taking a broader product portfolio view, they were able to look for synergies across projects and products, as well as take a longer-term, more disciplined view of the product portfolio development. A product vision maps to the product roadmap, then to the release plans, then ties into the high-level project plans. They’ve created an IT product maturity model, moving through initiation, emerging, defined, managed and optimizing; Fernandes admitted that they don’t have any products in the optimizing category, but described how they’ve moved up the maturity scale significantly in the past few years. They also started this as an IT-led initiative before coming around to a business focus, and he recommends involving the business from the start, since their biggest challenges came when they started the business engagement so far along in their process.

They’ve had some cultural shifts in moving to the concept of IT products, rather than IT providing services via projects to the business, and disengaged the project/product cycle from annual IT budgets. Also, they drove the view of business capabilities that span multiple IT products, rather than a siloed view of applications that tended to happen with a project and application-oriented view. Next up for them is to align the process owners and product owners; he didn’t have any answers yet about how to do that, since they’re just starting on the initiative. They’re a long way from being done, but are starting to shift from the mode of IT process transformation to that of it just being business as usual.

Interesting view of how to shift the paradigm for software development and delivery within large organizations.