TUCON: Process Plans using iProcess Conductor

The last session of the day — and likely the last one of the conference for me, since I think that the engineering roundtables tomorrow morning are targeted at customers — was Enrique Goizueta of TIBCO discussing a "Lego approach" to creating business processes: dynamic BPM using the iProcess Conductor. Bruce Silver raved about the Conductor after seeing it at the solutions showcase on Tuesday, and it seems to have been a well-kept secret from those of us who watch the BPM space.

Goizueta started by discussing complex processes such as the cross-selling bundling processes seen in telecommunications and financial services, or claims management that may include both property damage and bodily injury exposures. In many cases, there are too many alternatives to realistically model all process possibilities explicitly, or the process is dynamic and specific instances may change during execution. The key is to identify reusable parts of the process and publish them as discrete processes in a process library, then mix them together at runtime as required for the specific situation. Each of these is a fully-functional, self-contained process, but the Conductor packages up a set of these at runtime and manages them as a "plan", presenting this package as a Gantt chart similar to a project plan. As with tasks in a project plan, you can set dependencies within a plan in Conductor, e.g., not starting one process until another is completed, or starting one process two weeks after another process starts. The iProcess engine still executes the processes, but Conductor is a layer above that to allow you to manage and monitor all the processes together in order to manage dependencies and identify the critical path across the set of processes.
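To make the plan idea concrete, here’s a rough sketch (in Python, with entirely invented names — this is not TIBCO’s API) of how a Conductor-style plan might compute a schedule from its dependencies, including both finish-to-start dependencies and lagged starts:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a Conductor-style "plan": each entry is a
# self-contained process; dependencies control when it may start.
@dataclass
class PlanItem:
    name: str
    duration: int                                     # working days
    after_finish: list = field(default_factory=list)  # start when these finish
    after_start: dict = field(default_factory=dict)   # {name: lag_days}: start N days after these start

def schedule(plan):
    """Earliest start/finish for each item, as in the Gantt-chart view."""
    start, finish = {}, {}
    remaining = {p.name: p for p in plan}
    while remaining:
        for name, p in list(remaining.items()):
            if all(d in finish for d in p.after_finish) and \
               all(d in start for d in p.after_start):
                s = max([0] +
                        [finish[d] for d in p.after_finish] +
                        [start[d] + lag for d, lag in p.after_start.items()])
                start[name], finish[name] = s, s + p.duration
                del remaining[name]
    return start, finish

# Mirrors the auto insurance claims demo: the liability claim can't
# start until both of the other claims processes are complete.
plan = [
    PlanItem("vehicle damage claim", duration=10),
    PlanItem("personal injury claim", duration=15),
    PlanItem("liability claim", duration=5,
             after_finish=["vehicle damage claim", "personal injury claim"]),
]
start, finish = schedule(plan)
print(start["liability claim"])   # 15: gated by the slower of the two claims
```

Changing the plan at runtime then amounts to adding or removing `PlanItem`s and recomputing the schedule, which is essentially what the dynamic add/cancel capability looks like from the outside.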

TIBCO iProcess Conductor

This is very cool just as it is, but the Conductor also allows you to change a plan while it’s executing, adding and canceling processes on the fly.

He gave us a demo of Conductor for auto insurance claims management, where both vehicle damage and personal injury claims have been made, and these must be completed before processing of the liability claim can start.

For processes that always run together as single instances, such as a loss adjustment report followed by a vehicle repair claim, I’m not sure why you would represent these as separate processes chained end-to-end in the plan rather than as subprocesses called by a single process. There are other parts of this, however, where the benefit of using Conductor is much clearer, such as the ability to dynamically add a second liability claim a week into the process.

As Bruce pointed out, this is really case management, but it’s pretty nice case management. SLAs and critical paths can now be managed across the entire plan as well as for each individual process within it, and there are lots of examples of complex processes that could benefit from this type of dynamic BPM.

Tonight we’re all off to the Exploratorium, where TIBCO is hosting a private party for us to check out the fun and interactive science exhibits. I’m flying back to Toronto tomorrow, which might give me a few hours on the flight to finish up some other blog posts that I’ve been working on, and watch for my coverage of SAPPHIRE next week from Orlando.

TUCON: BPM Health Insurance Case Study

Both Patrick Stamm (CTO) and Kevin Maloney (CIO) of Golden Rule Insurance were on hand to discuss their experiences in building a BPM infrastructure. They started out looking at BPM because of the multiple redundant systems and applications that they have, which is endemic in insurance: multiple ratings engines, multiple policy systems and multiple claims systems due to acquisitions and leapfrogging technologies. They needed to be more responsive and agile to changing business requirements, and increase end-to-end process visibility and management.

As they started looking at enterprise-wide BPM, they had a number of objectives:

  • Improving scalability
  • Improving cycle time and process quality
  • Facilitating self-service on the web
  • Harvesting rules from custom legacy systems
  • Reducing reliance on paper

This presentation focused on their new business process, from application submission through underwriting to issuance of the policy. Not surprisingly, adding BPM to underwriting was one of their significant challenges here; underwriting is often perceived as being as much of an art as a science, and I’ve seen a lot of resistance to introducing BPM into underwriting in many organizations that I’ve worked with.

They wanted to be strategic about how they implemented BPM, and established governance for the entire BPM program early on. This allowed them to take a big-picture approach, and led them to change how they do development by incorporating offshore development for the first time. The architecture of the TIBCO toolset allows them to get a lot of reusability across the different business silos (which still stay separate above the common platform), and the scalability helped them with both business continuity and business growth.

They have a 5-layer logical architecture:

  • UI layer, including General Interface, VB and other UI platforms
  • Services layer, strangely shown above the BPM layer, although it is called directly from the UI layer in some cases as well as from the BPM layer
  • BPM layer, which seems to actually show their queues rather than their business processes, which makes me wonder what the processes actually look like beyond a simple one-step queue
  • EAI layer, including all the adapters
  • Data access layer

Some of the highlights of their New Business process in BPM:

  • Mainframe integration to eliminate redundant data entry, triggering multiple mainframe transactions from a single BPM interface
  • Integration of business rules to eliminate errors from incorrect riders, saving underwriters’ time researching which riders are applicable in which state
  • Integration with third parties, such as MIB (Medical Information Bureau) to automatically retrieve data from these sources rather than having users look it up manually on those parties’ web pages

The results that they’ve seen in less than a year since they’ve deployed:

  • New business volume is up over 50% with essentially the same number of staff
  • Applications processed per FTE is up over 30%
  • Cycle time is significantly reduced, as much as 30% in some cases
  • Better quality and consistency, with several error types eliminated
  • Improved visibility into business processes through better and more timely metrics and reporting

Their lessons learned:

  • Implementation partner selection is key: they’ve been happy with TIBCO as a product partner, but they had a rocky time with their first TIBCO integration partner and started over after four months. They still did the implementation in 11 months total, so really seven months from the point of restart.
  • You need to develop internal expertise in the tool and technology.
  • The first project should not be mission critical, and there must be a contingency plan. Funny, they didn’t consider New Business to be mission critical, but in reality, reverting to paper is an easy fallback in that case.
  • Don’t underestimate the impact that BPM will have on operational management and work culture.

This sounds like a fairly standard insurance implementation (I’ve done a few of these), but I like how they’re moving into the use of rules, and see the introduction of rules as having a significant impact on their process efficiency and cycle time.

TUCON: BPM with Spotfire Analytics

Lars Bauerle and Brendan Gibson of TIBCO showed us how Spotfire analytics are being integrated with data from iProcess to identify process improvements. I hadn’t seen Spotfire in any detail before the demo that I saw on Tuesday, and it’s a very impressive visualization and analysis tool; today, they showed iProcess runtime data copied and pasted from Excel into Spotfire, but it’s not clear that they’ve done a real integration between the iProcess process statistics and Spotfire. Regardless, once you get the data in there, it’s very easy to do aggregations on the fly and then drill into the results, compare portions of the data set, and filter by any attributes. You can also define KPIs and create dashboard-style interfaces. Authoring and heavy-duty analysis are done using an installed desktop application with (I believe) a local in-memory engine, but lightweight analysis can be done using a zero-install web client, with all analysis done on the server.

In addition to local data, it’s possible to link directly from enterprise databases into the Spotfire client, which effectively gives the Spotfire user the ability to do queries to bring data into the in-memory engine for visualization and analysis — in other words, there doesn’t appear to be any technical barriers to establishing a link to the statistics in an iProcess engine. They showed a model of data flowing from the iProcess server to a data mart, which would then be connected to Spotfire; realistically, you’re not going to let your analytics hit your production process engine directly, so this makes sense, although there can be latency issues with this model. It’s not clear if they provide any templates for doing this and for some standard process analytics.

They did a demo of some preconfigured analytics pages with process data, such as cases in progress and missed SLAs, showing what this could look like for a business manager or knowledge worker. Gibson did refer to "when you refresh the data from the database" which indicates that this is not real-time data, although it could be reasonably low latency depending on the link between iProcess and the data mart, and client refresh frequency.

Then, the demo gods reared their heads and Spotfire froze, and hosed IE with it. Obviously, someone forgot to do the animal sacrifice this morning…

They went to questions while rebooting, and we found out that it’s not possible to stream data in real-time to Spotfire (as I suspected from the earlier comments); it needs to load data from a data source into its own in-memory engine on a periodic basis. In other words, you’re not going to use this as a real-time monitoring dashboard, but as an advanced visualization and analytics tool.

Since this uses an in-memory engine for analytics, there are limitations based on the physical memory of the machine doing the processing, but Spotfire does some smart things in terms of caching to disk, and swapping levels of aggregation in and out as required. However, at some point you’re going to have to consider loading a subset of your process history data via a database view.
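That last point — loading a pre-aggregated subset through a database view rather than raw case history — is easy to illustrate. Here’s a minimal sketch using SQLite; the table and column names are invented for illustration and don’t reflect the actual iProcess schema:

```python
import sqlite3

# Hypothetical: pre-aggregate process history in the database so only a
# manageable subset is loaded into an in-memory analytics engine.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE case_history (case_id INTEGER, queue TEXT, elapsed_hours REAL);
INSERT INTO case_history VALUES
  (1, 'underwriting', 12.0), (2, 'underwriting', 20.0), (3, 'claims', 6.5);
-- A view that collapses per-case rows into per-queue aggregates:
CREATE VIEW queue_stats AS
  SELECT queue, COUNT(*) AS cases, AVG(elapsed_hours) AS avg_hours
  FROM case_history GROUP BY queue;
""")
# The analytics tool imports from the view, not the raw history table.
rows = conn.execute(
    "SELECT queue, cases, avg_hours FROM queue_stats ORDER BY queue").fetchall()
print(rows)  # [('claims', 1, 6.5), ('underwriting', 2, 16.0)]
```

The trade-off is the usual one: you lose the ability to drill below the grain of the view, but you keep the in-memory footprint proportional to the number of queues rather than the number of cases.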

There was a question about data security, for example, if a person should only be able to drill down on their own region’s data; this is done in Spotfire by setting permissions on the queries underlying the analysis, including row-level security.
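The underlying idea — constrain the query itself, so no drill-down can escape the permitted rows — might look something like this sketch (tables, users and regions all invented; this is not Spotfire’s actual mechanism, just the general pattern):

```python
import sqlite3

# Hypothetical sketch of row-level security: the same analysis query is
# filtered by the user's region before any data reaches the client.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (case_id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO cases VALUES (?, ?, ?)",
                 [(1, "west", 100.0), (2, "east", 250.0), (3, "west", 75.0)])

USER_REGION = {"alice": "west", "bob": "east"}  # invented user->region mapping

def cases_for(user):
    # The filter lives in the underlying query, not the UI layer, so
    # drill-downs and aggregations only ever see the permitted rows.
    return conn.execute("SELECT case_id, amount FROM cases WHERE region = ?",
                        (USER_REGION[user],)).fetchall()

print(cases_for("alice"))  # [(1, 100.0), (3, 75.0)]
```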

iProcess Analytics is being positioned as being for preconfigured reporting on your process data, whereas Spotfire is positioned for ad hoc analysis and integration with other data sets.

Spotfire could add huge value to iProcess data, but it appears that they don’t quite have the whole story put together yet; I’m looking forward to seeing how this progresses, some real world case studies when customers start to use it, and the reality of what you need to do to preprocess the ocean of process data before loading it into Spotfire for analysis.

TUCON: Keynote Day 2

Tom Laffey was back hosting the keynote, dressed in a cycling shirt from Team TIBCO, one of the best US women’s pro cycling teams. He was joined briefly by a member of the team who also happens to hold a Ph.D. in biology; like any geeky engineer, Laffey giggled nervously in the presence of an attractive, brainy woman in form-fitting cycling gear, although I suspect that some of the nervousness was due to the pair of cycling shorts that she was handing him to try on. 🙂

Having covered the product announcements yesterday, this morning’s keynote moved to a customer focus, starting with Simon Post, CTO of Carphone Warehouse, discussing how they improved the processes within their IT department. He made an excellent point: there is no "ERP for IT", that is, no packaged software for running an IT business; this requires large IT groups to roll their own process improvement efforts instead. They have the capability to do it, but that’s not the point: IT departments are there to provide services to the business, not to spend time building systems for themselves unless no packaged software exists or they need custom capability for a competitive advantage. Carphone Warehouse uses TIBCO products extensively for their IT processes and systems: iProcess and BusinessEvents for the process layer, BusinessWorks for system orchestration, and EMS for messaging. They haven’t stopped at IT processes, however; they’re building out their service-oriented architecture and rolling out services across the enterprise, facilitating reuse and reducing costs as they set up new locations in several countries.

I ducked out after that to review notes for my presentation, coming up at 11:30, since I want to take the time to see the Spotfire+BPM session that’s on just before mine.

TUCON: Centralized BPM Platform at HBOS

The last session of the day was a bit of a tough choice: I was thinking about heading over to see the session on in-process analytics through the integration of Spotfire and BusinessEvents, but decided in favor of hearing Richard Frost of HBOS (a UK-based financial services organization) discuss their centralized BPM platform and center of excellence strategy. Since they were created from a merger of Halifax and Bank of Scotland, and are made up of a number of brands, there’s quite a bit of vertical IT within the individual organizations. They’ve been moving some of this into shared services (what they call Group IT), including a business process layer based on TIBCO’s iProcess.

They had some significant drivers for BPM: allowing for growth while containing costs, and codifying processes and knowledge to reduce the impact of employee turnover. They also had a variety of process types to manage, from straight-through processing with integration to their existing systems to high-touch human-centric and collaborative processes, so they needed a product that could handle both well. They deployed BPM in a number of stages:

  • Digitizing, with human workflow and case management based on scanned documents
  • Automation
  • Optimization, through automation and separation of process logic from operational systems

As they roll this out, the benefits from automation have been the most apparent and are used in future business cases, and implementation costs are expected to decrease through reusability.

Instead of each division deploying their own BPM, they are moving to a centralized platform for a number of reasons:

  • Shared processes, such as complaints handling
  • Shared platform for cost savings
  • Shared resources
  • Best practices and governance
  • Architecture simplification

On this common software and hardware platform, each division has their own unique services, processes, rules and parameters; they’re now building a common services layer that will be reusable across divisions as well as consolidating onto the same physical hardware and software platform. They’ve had to determine ownership of each layer — which is owned by the divisions, shared services application development, and shared services technology — as well as governance of these layers by a business-led user group, an IT-led process certification board and a joint business-IT change approvals board.

They see the business opportunity for BPM as removing the IT problems from what the business has to consider by providing a common platform, allowing the business to focus on business and process improvement. Frost showed a chart that mapped process types (simple, regular, complex) against solutions (manual work distribution and handling, imaging and workflow with minimal integration, BPM with application integration) in order to identify the key processes to consider for BPM: although the conventional wisdom is to go for the simple processes that can be fully automated with BPM and application integration, he also feels that there are huge benefits in looking at the complex processes that require a lot of human knowledge work. They also use this as a guideline for both simplifying processes and pushing for a greater degree of automation.

In an example of one of their insurance arrears processes, they’ve removed 60% of the human effort by automating most of the steps involved, while improving both service times and consistency.

His recommendations:

  • Understand your organizational model, recognizing where you are in your process efforts and aligning your BPM and SOA strategies
  • Don’t obsess on software selection, or the divisions will just do their own thing instead of waiting for the common platform
  • It will be hard work and will take a significant piece of time — HBOS has spent two years from when they did their first TIBCO pilot to where they are today with a shared platform
  • Reviewing and optimizing processes is crucial so that you’re automating the right processes
  • Needs a combined effort of a business push and an IT pull

An interesting message here is that although we all want 3-month delivery cycles for BPM projects, creating a shared BPM platform across multiple divisions takes a lot longer. A roadmap that allows divisional installations of the enterprise-standard platform in the interim, to be converged on the shared platform at a later date, is essential to allow progress on BPM applications within divisions while the shared platform is being developed.

TUCON: BPM Product Update

Roger King and Justin Brunt of TIBCO gave us an update of what’s happened lately with their BPM product, and what’s coming up.

In the past year, Business Studio has added a lot of new features:

  • Support for BPMN 1.0 and XPDL 2.0
  • In-line service binding and mapping, through direct connections to Business Works, web services, Java, databases, email and scripts
  • Direct deployment to the iProcess engine
  • Management of models using any source control system that supports Eclipse, or using their packaged Subversion option
  • Visual Forms Editor for creating forms directly in Business Studio using platform-independent models at design time and platform-specific models for run time: General Interface now, and other platforms to follow. Forms can be created from a step definition with a default layout based on the exposed parameters, then the forms editor can be used to add other UI widgets.
  • In-line subprocesses and a number of other modeling niceties.

The iProcess Workspace (end-user browser client) has been simplified and updated using an Outlook visual paradigm, based on General Interface. This is supported on IE 6 and 7 (no mention of Firefox). It’s also possible to use GI Builder to create your own BPM client, since the components are provided for easy inclusion, allowing iProcess functionality to be embedded into web pages or as portlets, with no knowledge of the iProcess APIs.

The iProcess Suite has a number of other improvements, including generic data plugins and direct deployment from Business Studio, plus support for 64-bit Windows and SUSE Linux. There’s also been repackaging and installation improvements. As we heard this morning, there’s also event-driven real-time worklist management, where a user can be alerted when something in a queue changes rather than having to poll it manually. There’s also updated LDAP authentication.
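The shift from polling to event-driven worklists is basically the observer pattern applied to a queue. A minimal sketch (invented names, not the iProcess API) of the push-instead-of-poll idea:

```python
# Hypothetical sketch of event-driven worklist management: instead of each
# client polling its queue on a timer, the queue notifies subscribers on change.
class Worklist:
    def __init__(self):
        self.items, self.listeners = [], []

    def subscribe(self, callback):
        self.listeners.append(callback)

    def add(self, work_item):
        self.items.append(work_item)
        for notify in self.listeners:   # push the change out, no polling
            notify(work_item)

alerts = []
queue = Worklist()
queue.subscribe(lambda item: alerts.append(f"new work: {item}"))
queue.add("claim #123")
print(alerts)  # ['new work: claim #123']
```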

iProcess also has a new version of its web services plugin providing improved inbound and outbound web services security (at the transport layer with SSL and digital signatures and at the message layer through signatures, encryption and tokens), plus enhanced authentication.

The big thing in my mind is that Business Studio 3.0 now contains all key iProcess Modeler features so that it’s no longer necessary to use iProcess Modeler as an intermediate step in moving processes from Business Studio to the iProcess execution engine: Business Studio is the new BPM IDE. At TUCON last year, I said that this definitely needed to happen, and I’m very happy to see that it has since it represents a significant advance into full model-driven development for TIBCO’s BPM.

Their vision for BPM going forward is that the complexity of process models can be pushed down into the infrastructure, freeing the business process modeling/design tools from the technical details that have turned process modeling into a technical rather than a business role over the past few years. This will allow business people to do what the BPM vendors have always told us that they could do: design executable process models without having to be a technical expert. King feels that the key to this is service and data virtualization, since data is BPM’s "dirty secret": synchronization of data within a business process with other systems is one of the key drivers for having a technical person do the process models instead of a business person. Virtualizing the location, ownership, form and transport of the data means that you don’t need to worry about a business analyst doing something inappropriate with data in the course of process modeling.

The idea is that BPM suites will become model-driven composite application development and deployment platforms (wait! isn’t that what they’re supposed to be already?), with more latitude for business sandboxes and mashups for prototyping and building situational applications.

They’re working on breaking off the front end of the process engine to allow the creation of a single enterprise "work cloud" that can be used for any source of information or work coming at someone: sort of like event processing, but at a higher semantic level.

In addition to all the event-driven goodies, they’re also focused on covering the entire domain of process patterns (as in the full academic set of process patterns), so that any process could be modeled and executed using TIBCO’s BPM. We’ll also see some enhanced resource and organizational modeling, plus scheduling, capability requirements, SLAs and more models corresponding to real-world work.

TUCON: Design for People, Build for Constant Change

Connie Moore kicked off the Process Improvement track with the Forrester message "design for people, build for change" and dynamic business applications to a packed room. Check out my coverage of her keynote from the Forrester technology leadership conference last year for some background to this theme.

She discussed how methods of working are changing to put the worker at the center, with access to their information, processes, functions and other components as required: the modern information worker decides what he needs to complete any given task. In order to accommodate this, workers need dynamic applications that provide a highly-contextual dashboard/portal interface that might include client information, a calendar of events related to that client’s data, what-if tools for financial analysis, tools such as online enrolment for selling additional products to the client, and other information that’s related to what’s happening right now, not static information.

She sees BPM as going mainstream, and dragged out the hockey-stick growth predictions that all the big analysts love; I’m still seeing a lot of niche and departmental applications of BPM and think that these growth projections may only be met if the analysts continue to change the boundaries of what is considered to be BPM.

She covered several of the reasons for deploying BPM, and walked through some best practices for getting started:

  • Start with a major process that is causing pain: there will be less resistance to change, and easier support and funding. Typically, these are customer-facing, high-volume processes with lots of steps and handoffs. I’m also a big fan of this approach, since no one ever justified enterprise-wide deployment of BPM by doing a proof of concept with managing expense reports.
  • Look for quick hits, using an incremental approach and targeting 3-month release phases. I’m also completely behind this idea, and always recommend getting something simpler into production sooner, then adding on functionality and fine-tuning processes incrementally. I’ve found that BPM implementations lend themselves particularly well to Agile methodologies.
  • Design for real-world processes by doing effective process discovery: avoid interviewing the managers and reading the out-of-date procedures documentation in favor of talking with the people who really know how the process currently works and where the pain points are that need fixing. You don’t want to get too granular here, but use some process modeling tools to sketch things out and identify subprocesses and services. I’m going to expand on this topic tomorrow in my breakout session, Using BPM to Prioritize Service Creation.
  • Link BPM and SOA. 71% of large companies surveyed by Forrester said that SOA was very important to their BPM efforts: the availability of services is what makes it possible to create and modify processes quickly and easily.
  • Keep the financials in mind. Link projects to the line of business rather than infrastructure, and don’t burden the first project with the infrastructure cost. Measure the results and ROI to use for future project justifications. For ROI calculations, she listed conservative estimates of saving 30-50% of clerical workers’ time, and 20-35% for knowledge workers, with transaction-focused processes seeing even greater benefit.
  • Develop a competency center from the start, including a cross-functional and collocated team of developers and business analysts, strong involvement from the vendor, and judicious use of systems integrators for specific targeted parts of the project. Forrester has seen a strong correlation between the existence of a competency center and measurable benefits in BPM projects.
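Those time-savings percentages convert directly into a back-of-the-envelope ROI number. In this sketch the headcounts and loaded annual costs are invented for illustration; only the percentage ranges come from Forrester:

```python
# Back-of-the-envelope annual savings using the conservative end of
# Forrester's ranges: 30% for clerical workers, 20% for knowledge workers.
# Headcounts and loaded annual costs below are invented for illustration.
clerical = {"staff": 40, "cost": 50_000, "time_saved": 0.30}
knowledge = {"staff": 15, "cost": 90_000, "time_saved": 0.20}

annual_savings = sum(g["staff"] * g["cost"] * g["time_saved"]
                     for g in (clerical, knowledge))
print(annual_savings)  # 870000.0: 600,000 clerical + 270,000 knowledge worker
```

Numbers like this are what make it possible to link the project to the line of business rather than to infrastructure, per the point above.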

She recently interviewed a financial services client of TIBCO’s, and they shared a few of their lessons learned:

  • Reengineer the process first, then pick the tool
  • Set the tools aside and focus on the process
  • Be prepared for staffing challenges
  • A competency center is critical

This was really a whirlwind tour of Forrester’s view of BPM, much too much information for a 50-minute presentation but lots of good stuff in here.

TUCON: Product Announcements, including a Messaging Appliance

I decided to break this out into a separate post although it’s all the same keynote, since this is getting a bit long and this post has all the product goodies in it, including TIBCO’s first-ever hardware release in the form of a messaging appliance.

Matt Quinn, TIBCO’s SVP of engineering and technology strategies, discussed product directions and their focus areas for 2008/2009:

  • Event-driven computing everywhere
  • Invest in neutrality and enterprise breadth
  • Continue to simplify and streamline user experience

A bit of this was covered in the GTM strategy yesterday, but he provided much more information on specific products:

  • TIBCO ONE, covering all user interfaces from design time to run time, is now under the auspices of a single user experience team. They’ll be adopting Microsoft’s Silverlight as an alternative to their own General Interface, and some functionality will soon be available on Silverlight as well as GI.
  • He announced Spotfire version 2.1 for creating interactive business mashups, and discussed the Spotfire Operations Analytics that Ahlberg showed us yesterday.
  • BusinessEvents version 3 is in the final testing stages, providing a fully distributed and fully clustered rules, CEP and streaming engine, apparently a first in the market.
  • Over the next year, a new product called ActiveSpaces will be introduced for distributed data and state management caching, and is designed to handle massive volumes in real-time.
  • iProcess version 11 is now in final testing, with real-time worklist management based on events, and some improvements to installation and LDAP support.
  • Business Studio version 3 is also being released, now fully based on Eclipse as part of the TIBCO ONE initiative, and now (wait for it) with the full functionality of the older process modeler, including Eclipse-based forms design.
  • ActiveMatrix version 2 and BusinessWorks 5.6 have been shipping since December, with improved capabilities such as multiple projects per BW engine, and AMX support for BW. In the future, the BW user interface will become part of TIBCO ONE, new testing, development and performance tuning tools will be added, and new deployment options such as clustering will be supported.

There were other announcements about enterprise messaging, managed file transfer, service performance management and CIM; it all went by pretty fast but there will be more information over the next two days.

The big finale was the announcement of a messaging appliance, which they showed on stage to spontaneous applause: a dedicated piece of hardware with Rendezvous embedded within it for incredible performance characteristics, allowing multiple services to be replaced by this single appliance. The appliance doesn’t replace existing Rendezvous installations, but is intended to work with them. They’ll be shipping the first units in September; press release here.

TUCON: Keynote

Before I start, I have to make a comment about the analyst dinner last night. I usually have a hard-and-fast rule about not blogging anything that happens when I have a drink in my hand, but I want to shout out to Heidi Bartlett for organizing the analyst summit yesterday and arranging for an amazing dinner last night. Not only were there excellent food and wines, but I sat with Christopher Ahlberg (Spotfire) and Bruce Silver so had excellent conversation as well about analytics and BPM.

The keynote session was hosted by Tom Laffey, EVP of products and technology — an engineer in the sea of sales and marketing people that’s typical for these events. After a brief intro, we heard from Vivek Ranadivé reprising his message from yesterday of Enterprise 1.0 being the mainframe era (1960-1980), Enterprise 2.0 being the database era (1980-2000) and Enterprise 3.0 being the event-driven era (2000-2020). Someone really needs to get the message to him, or whoever in marketing writes his stuff, that Enterprise 2.0 has a specific meaning in the current vernacular, and this isn’t it.

He does have a good message, which is that we’ve moved from having some of the information available in some places some of the time, to having all information available in real-time, on demand, wherever we want it. In reality, we’re not there yet — my bank, one of the largest in Canada, still can’t post my banking transactions to the web in less than 24 hours — but infrastructure like TIBCO’s is going to help make it happen by providing the mechanisms to tie systems together and perform complex event processing. He sees the transition from the last 10 years to the next 10 years as being from static to dynamic, from database to SOA, from ERP to BPM, and from BI to predictive business. A modern-day event-driven architecture has an event-driven service bus as the backplane, kicking events up to the "event cloud" where they can be consumed and combined by analytics for visualization and analysis, rules to determine what to do with specific combinations of events, and BPM to take action on those decisions.

Ranadivé was followed by Kris Gopalakrishnan, CEO of Infosys (a major TIBCO integration partner), who talked about the changing markets, economies and demographics, and how enterprises need to change in order to respond to and anticipate the new requirements. A rapid consumerization of enterprise IT is happening, with greater demand for richer digital experiences from both internal enterprise workers and external customers. Processes cut across systems and organizational boundaries, and need to be managed explicitly. IT systems need to be exposed as services that are aligned to the business operating model in order to allow IT to respond quickly to business needs. Analytics need to be provided to more people within an enterprise to aid decision making, and there needs to be a convergence of BPM and BI to drive business optimization. He sees that the fundamental problem of information silos still exists: a point of view that I agree with, since I see it in client organizations all the time.

We then heard from Bob Beauchamp, CEO of BMC Software, who gave the customer viewpoint on IT process management using TIBCO’s products. The theme of the conference is "building bridges", with lots of pictures of the Golden Gate bridge and other famous bridges as slide backdrops, and analogies about building bridges between systems, and he used the Golden Gate bridge in another analogy about software: the bridge cost $24 million to build, and $54 million per year to maintain. This analogy is especially apt for custom integration software, where in many cases you either effectively rewrite it constantly to keep up with other changes in your environment, or allow it to fall into disrepair.

In particular, however, he’s talking about how IT processes are the last to benefit from new technologies, since they’re too focused on providing these to (or testing them on 🙂 ) other departments within an enterprise. BMC is using TIBCO ActiveMatrix and some of their own technology to bring functions together in order to enable more efficient IT processes, including service support such as asset configuration, service automation such as auditing and compliance, and service assurance such as predictive analytics and scheduling. He sees this as transformational in how IT is managed, and believes that it will have a huge impact in the years to come.

Next up was Anthony Abbattista, VP of technology solutions at Allstate insurance. They’re a huge company — 70,000 employees and 17 million customers — and always felt that they were unique enough that they had to build their own systems for everything, a mindset that they’re actively working to change now. With a 7×24 operation that allows customers direct access to their back office systems, they had some unique challenges in replacing their custom legacy systems and point-to-point integration with standardized reusable components that give them greater agility. They’ve completely rearchitected around data hubs, a service bus and a range of new technologies, and are taking advantage of standards to help them move into a new generation of systems and business processes.

TIBCO Analyst Summit: Partner and Channel Strategy

Dean Hidalgo, Director of Industry and Partner Marketing, discussed the partner network and how it ties into their overall strategy. As we heard in the sales strategy sessions, partnering is extremely important in certain regions and will be increasingly so as TIBCO pushes into new geographies that they can’t cover with their own people directly. The usual big system integration companies are here: HP, EDS, Infosys, Wipro, Accenture, Deloitte and CGEY to name a few.

He also discussed their vertical marketing activities, providing tools that allow the sales teams to present vertical value propositions (VVP) — matching the functionality of TIBCO’s products to the vertical business requirements — without having vertical products. Salesware, not software. TIBCO doesn’t back down from the idea that they sell infrastructure, not vertical packaged applications, although these VVPs include "frameworks" (really unsupported templates) of pre-configured rules, policies, KPIs, dashboards and processes that they throw in for a customer to use as a starting point. These VVPs are based on successful real-world implementations, like their predictive customer interaction management that is based on what was actually implemented at one of their large retail banking customers, advanced order fulfillment for telco, dynamic claims management for insurance, predictive STP for securities, point-of-sale monitoring for retail, supply chain optimization for manufacturing and retail, and disruption management for airlines and logistics.

There was a lot of discussion in the room about the value of the VVPs: some analysts felt that this didn’t go far enough, and that TIBCO needs to put out some vertical applications in order to compete, but several of us (including me) feel that this type of vertical marketing is extremely valuable by allowing an infrastructure company to sell to business people without moving out of their sweet spot. If a customer doesn’t look closely at this, however, they might think that these are supported products rather than unsupported templates. This is likely to be exacerbated by their marketing videos that refer to (for example) "TIBCO’s airline disruption management system" — if seen in isolation, or presented by a salesperson who didn’t make that distinction clear, it would be pretty easy to make that mistake.

Abhishek, AVP and head of BPM-EAI practice at Infosys, briefed us on Infosys, their primary verticals and their horizontal technology specializations. They have a significant TIBCO practice, which has allowed them to build reusable frameworks and tools that accelerate their TIBCO-based projects with clients. I’m all for reusability, but I’ve seen some pretty disastrous frameworks built on other products by other systems integrators, and I’m wary about the maintainability and weight of any third-party framework: if you’re looking at something like this, be sure to check out issues like whether it’s productized or considered custom code, the process for upgrading the underlying platform, e.g., TIBCO, and the ability to use the underlying platform features directly for design and administration. In addition to frameworks and systems integration, Infosys is also an engineering partner of TIBCO, developing and supporting various application and technology adapters.

We were supposed to finish at 4:30, but the only thing that ended at 4:30 sharp was our internet connectivity. To be fair, TIBCO provided hard-wired connectivity and power to each table in the analyst briefing room throughout the day, and they did get the internet access turned back on about 10 minutes later so that I could publish this last post before heading to the solutions showcase. The stories that I heard about the hotel’s extortionate cost for wifi don’t bode well for intraday posting the rest of the week, in spite of the BPM product marketing manager’s promise to have someone follow me around with a wireless router. 🙂