IBM Connect (Lotusphere) 2013 Highlights: Product Updates, Smarter Workforce and Smarter Commerce

A couple of weeks ago, IBM had two analyst calls about the announcements this week at IBM Connect 2013; since I’m not at the conference, I wrote most of this at that time but only published today due to embargo restrictions. It’s the 20th anniversary of Lotusphere, although the conference is no longer branded as Lotusphere since the “smarter workforce” and “smarter commerce” streams are beyond just products with a Lotus heritage or brand.

The first briefing featured Jeff Schick, who heads up social software at IBM. He discussed new software and cloud services to put social business capabilities in the hands of C-level executives in HR and marketing, covering the dual goals of managing corporate intranets and talent, and managing external marketing campaigns. The catchphrases are “Activate the Workforce” and “Delight Customers”, enabled by IBM social business solutions for Smarter Workforce and Smarter Commerce, built on the social integration capabilities of IBM WebSphere Portal.

Specific product releases coming up in the next several weeks:

  • IBM Connections v4.5, with FileNet ECM now available as a native service: documents and their processes (processes within FileNet, I assume, not within IBM BPM) can be integrated into a Connections community, exposing FileNet functionality such as metadata and foldering through Connections, and providing fully integrated social capabilities such as tagging, commenting and liking, making content a first-class social citizen. This is hot. It will not include records management or Case Manager: it appears that these functions would be available on the FileNet side, but not exposed (at this time) through Connections. Quickr customers are being offered a migration path to IBM Connections Content Manager, which is a bundled FileNet repository that can be upgraded to the full ECM suite if you wanted to use it outside the Connections context. Connections can also integrate with SharePoint and Outlook, so is an option even if you’re a Microsoft customer in those areas.
  • IBM Notes 9 Social Edition, competing against Outlook 2013 with social-enabled email, activity streams and other social capabilities.
  • IBM Docs for web-based collaboration, now available on-premise as well as in the cloud. This competes against Office 365 and Google Docs, but offers better collaboration than O365 (which requires passing control of a document between collaborators) and better rendering/conversion of Office documents than GDocs. IBM Docs is integrated with Connections for social features and sharing, in the same sort of way as Content Manager.
  • IBM Sametime replaces their existing meeting service in the cloud, including iOS and Android support. It uses the Polycom framework for video and audio support.
  • Deployment of all of this can be public cloud, private cloud, on-premise (not really sure of the distinction there) or a hybrid of these. Their SmartCloud for Social Business provides for the cloud deployment and adds wiki, blogs and other social authoring functionality. SmartCloud has Safe Harbor certification, making it a bit more immune to government snooping, and can be private-labeled, with two telecom companies already using this to provide these capabilities to their customers.

Everything is focused on mobile: mobile meetings, chat, Connections including Content Manager access, Docs and more.

Jonathan Ferrar, who heads up strategy for the Smarter Workforce business area, gave us an update on what they’re providing to support attracting, empowering and motivating employees. They have just completed their acquisition of Kenexa, and offer a portfolio of HR and workforce management products that includes behavioral sciences plus the entire platform for social business that Schick talked about, including analytics, collaboration and content management.

There are three main functional areas related to workforce management: attract (including recruitment, hiring, onboarding), empower (including learning and intranet content such as benefits and procedures), and motivate (including surveys, assessments and talent management). An integrated employee and HR portal uses existing IBM portal technology to expose Kenexa functionality and social features. There are also workforce analytics to monitor, provide insight and predict based on demographic, qualitative and social data, using both Cognos for dashboards and SPSS for analysis. Some outsourcing-related capabilities were also mentioned, but without a lot of detail; I was left with the impression that this was a strong capability of Kenexa prior to the acquisition.

I don’t know a lot about HR systems, but I see huge potential to integrate this with operational systems such as BPM: driving analytics from the operational systems to the HR systems (e.g., employee performance measures), and even some from HR to the operational systems (e.g., learning management pushing training to people at the point in their work where they need it).
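To make that concrete, here’s a minimal sketch of the BPM-to-HR direction, assuming a BPMS that exposes completed work items and an HR analytics system with some kind of ingestion API; all of the field names and the target are hypothetical, not any specific IBM or Kenexa interface.

```python
# Hypothetical sketch: roll up task-level metrics from a BPM engine's completed-work
# history into per-employee performance measures that an HR analytics system could
# ingest. Field names and the target feed are illustrative, not a real product API.
from collections import defaultdict
from statistics import mean

completed_tasks = [  # would normally come from the BPMS audit/history API
    {"user": "achan", "task": "claim review", "minutes": 14},
    {"user": "achan", "task": "claim review", "minutes": 9},
    {"user": "bsmith", "task": "claim review", "minutes": 22},
]

by_user = defaultdict(list)
for t in completed_tasks:
    by_user[t["user"]].append(t["minutes"])

# Payload an HR/workforce analytics system might accept as a performance measure feed
payload = [
    {"employee_id": user, "measure": "avg_task_minutes", "value": round(mean(times), 1)}
    for user, times in by_user.items()
]
print(payload)  # in practice this would be pushed to the HR system's ingestion API
```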

In the second briefing, we heard from Larry Bowden, VP of Web Experience software at IBM, covering the website building and user experience sides of Smarter Commerce and Social Business. He started out with the same “smarter workforce/exceptional customer experience” catchphrases as we heard on the earlier call, then went on to highlight some of their customers recognized for exceptional web experience awards in 2012. Web experience includes the smarter workforce (employee engagement, workplace social portal) and smarter commerce (web presence and brand marketing, buy, sell, market) areas, but can also include direct business uses (e.g., online banking, claims), engaging a broad variety of constituents (e.g., e-government), and customer self-service. The core of the IBM customer experience suite, however, is on the buy, market, sell and service capabilities under their Smarter Commerce umbrella. They are working at putting the web marketing/commerce capabilities directly into the hands of business users (although if this is anything similar to how most vendors put BPMS capabilities directly into the hands of business users, I wouldn’t be too worried if I were a web developer), including both web content management/analytics and campaign management.

The Smarter Workforce and Smarter Commerce solutions are built on the IBM Social Business Platform, as we heard from Schick earlier, which includes WebSphere Portal, Web Content Manager, Connections, Notes & Domino Social Edition, Sametime, Social Analytics Suite, ECM, Web Experience Factory and Forms. That’s nine products just in the platform, then the Customer Experience Suite and Employee Experience Suite solutions built on top of that. Whew. There are other products that come in at the higher level, such as Worklight for mobile enablement.

There’s been a refresh on all of their web experience capabilities, resulting in a new IBM Web Experience “Next”, providing for faster content creation, social content rendering and multi-channel publishing. This is not so much a single product as a label for everything across their product base that is being updated, along with a more consistent user interface.

There’s a new digital asset management system for rich media management (part of a WCM Rich Media Edition?), although that’s currently in tech preview rather than released.

They’ve also done some PureSystems updates that make it faster to deploy and optimize complex configurations of the multiple IBM products required to support these capabilities – arguably, they should have spent some time on refactoring and reducing the number of products, rather than working out how to make bigger and better hardware to support these patterns.

As always after an IBM briefing, I’m left with a sense of almost overwhelming complexity in the number — and possible combinations — of products that make up these integrated solutions. Powerful: yes. But expect some rough edges in the integration.

IBM On Business Analytics In Banking With @LaurenceTrigwel

I was on an IBM analyst call this morning on analytics in banking, featuring Laurence Trigwell, who is responsible for IBM’s business analytics strategy for the Financial Services industry.

In response to a number of challenges that banks (and other financial institutions) are facing, he identified three key initiatives for these companies:

  • Creating a customer-focused enterprise
  • Optimizing risk and compliance
  • Increasing flexibility and streamlining operations

Obviously, analytics are pretty important to all of this, and IBM continues to focus on building their analytics portfolio with acquisitions such as Algorithmics and capabilities such as Watson. As with all of their portfolios, the IBM analytics portfolio contains a number of different products, making it a bit difficult for customers (and probably IBM sales reps) to wade through the potential solutions; however, for the purposes of this presentation, Trigwell has organized them around the three key initiatives listed above.

  • For creating a customer-focused enterprise, you need performance management, business intelligence and predictive analytics in order to gain insights from customer data to “provide the most profitable offer to the right customer at the right time”. Although this sort of “next best action” view of analytics for customer focus is part of the picture (a minimal sketch of that kind of offer scoring follows this list), I think that it misses the point: the stated goal (in quotes above) is really focused on the enterprise, not the customer, since it has enterprise profitability at the core. Not that profitability isn’t important, but if you’re going to be truly customer-focused, you need to have some goals stated as actual direct customer benefits. These may amount to the same thing, since providing an offer that is best suited to a customer’s needs means that the customer is more likely to accept it, which in turn results in greater profitability in a probabilistic sense, but using language that is actually customer-focused when you’re talking about becoming a customer-focused enterprise is important to shift corporate culture.
  • For optimizing risk and compliance, you will need risk management, governance, performance management, compliance and business intelligence capabilities. I don’t disagree with this, and analytics (as well as other things, such as BPM) can definitely turn risk management and compliance from just overhead to a competitive advantage, if you do it well.
  • For increasing flexibility and streamlining operations, you will need performance management, predictive analytics and business intelligence capabilities for identifying process bottlenecks and operational inefficiencies. This is really about displaying and reporting on what’s happening, not fixing the problems, so not a sense-and-respond view of analytics or how it fits into process management in general.
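As promised above, here’s a minimal sketch of the expected-value style of scoring that typically sits behind “next best action”. The acceptance probabilities would normally come from a predictive model (SPSS or similar); everything here is illustrative, not any actual IBM implementation.

```python
# Minimal sketch of "next best offer" selection: pick the offer with the highest
# expected value. The probabilities and margins are invented for illustration; in
# practice they would come from predictive models scored against customer data.
offers = [
    {"name": "premium card upgrade", "p_accept": 0.05, "margin": 120.0},
    {"name": "mortgage pre-approval", "p_accept": 0.02, "margin": 900.0},
    {"name": "fee waiver retention",  "p_accept": 0.40, "margin": 35.0},
]

def expected_value(offer):
    # Expected margin = probability the customer accepts * margin if accepted.
    # A genuinely customer-focused variant would also score expected customer benefit.
    return offer["p_accept"] * offer["margin"]

best = max(offers, key=expected_value)
print(best["name"], round(expected_value(best), 2))
```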

In each of these three scenarios, the capabilities that he is suggesting you need are actually IBM products: from three to five products for each scenario if you want to go all out. They have bundled some of these into industry solutions, but it’s not clear what level of integration exists between the products to create a seamless solution, or if they’re only integrated on the PO. Clearly, IBM needs to do some rationalization of their analytics portfolio to reduce the number of products (as they need to do in some of their other portfolios), although their usual strategy is to allow acquired companies to run pretty much untouched for quite a long time before starting to merge products. That strategy is good for the sales teams and the executives of the acquired company, but not necessarily good for customers who have to deal with an increasingly large and bewildering array of products that overlap in functionality. The case studies that he discussed typically used only one product, either Algorithmics or Cognos, although one used both Cognos and SPSS, so I’m not sure that there’s much of an appetite for multiple analytics products applied to the same initiative.

In all cases, he talks about analytics as identifying and reporting on issues and situations, but says little about how analytics need to fit together with other systems to make it all happen. He touches on a bit of this with some of his case studies, but it would be great to see the analytics for banking shown in the context of other IBM products that can really make the three initiatives real, such as CRM, BPM and ECM.

Since banks and other financial institutions are my main customer base for consulting, and they’re all IBM customers, it will be interesting to see how they roll out some of the newer IBM products and solutions, especially in combination, in the years to come.

TIBCO Corporate and Technology Analyst Briefing at TUCON2012

Murray Rode, COO of TIBCO, started the analyst briefings with an overview of technology trends (as we heard this morning, mobile, cloud, social, events) and business trends (loyalty and cross-selling, cost reduction and efficiency gains, risk management and compliance, metrics and analytics) to create the four themes that they’re discussing at this conference: digital customer experience, big data, social collaboration, and consumerization of IT. TIBCO provides a platform of integrated products and functionality in five main areas:

  • Automation, including messaging, SOA, BPM, MDM, and other middleware
  • Event processing, including events/CEP, rules, in-memory data grid and log management
  • Analytics, including visual analysis, data discovery, and statistics
  • Cloud, including private/hybrid model, cloud platform apps, and deployment options
  • Social, including enterprise social media, and collaboration

A bit disappointing to see BPM relegated to being just a piece of the automation middleware, but important to remember that TIBCO is an integration technology company at heart, and that’s ultimately what BPM is to them.

Taking a look at their corporate performance, they have almost $1B in revenue for FY2011, showing growth of 44% over the past two years, with 4,000 customers and 3,500 employees. They continue to invest 14% of revenue into R&D with a 20% increase in headcount, and significant increases in investment in sales and marketing, which is pushing this growth. Their top verticals are financial services and telecom, and while they still do 50% of their business in the Americas, EMEA is at 40%, and APJ makes up the other 10% while showing the largest growth. They have a broad core sales force, but have dedicated sales forces for a few specialized products, including Spotfire, tibbr and Nimbus, as well as for vertical industries.

They continue to extend their technology platform through acquisitions and organic growth across all five areas of the platform functionality. They see the automation components as being “large and stable”, meaning we can’t expect to see a lot of new investment here, while the other four areas are all “increasing”. Not too surprising considering that AMX BPM was a fairly recent and major overhaul of their BPM platform and (hopefully) won’t need major rework for a while, and the other areas all include components that would integrate as part of a BPM deployment.

Matt Quinn then reviewed the technology strategy: extending the number of components in the platform as well as deepening the functionality. We heard about some of this earlier, such as the new messaging appliances and Spotfire 5 release, some recent releases of existing platforms such as ActiveSpaces, ActiveMatrix and Business Events, plus some cloud, mobile and social enhancements that will be announced tomorrow so I can’t tell you about them yet.

We also heard a bit more on the rules modeling that I saw before the sessions this morning: their new BPMN-based modeling for rules. This uses BPMN 1.2 notation to chain together decision tables and other rule components into decision services, which can then be called directly as tasks within a BPMN process model, or exposed as web services (SOAP only for now, but since ActiveMatrix now supports REST/JSON, I’m hopeful that decision services will follow). Sounds a bit weird, but it actually makes sense when you think about how rules are formed into composite decision services.
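To illustrate the idea (not TIBCO’s actual rule syntax or API), here’s a minimal sketch of two decision tables chained into a single decision service that a process task or web service wrapper could invoke:

```python
# Generic sketch of chaining decision tables into a decision service: each table maps
# inputs to an output, and the service wires them together so it can be invoked as a
# single callable (or wrapped as a SOAP/REST endpoint). Thresholds are illustrative.
def risk_table(credit_score: int) -> str:
    # decision table 1: credit score band -> risk rating
    if credit_score >= 720:
        return "low"
    if credit_score >= 620:
        return "medium"
    return "high"

def approval_table(risk: str, amount: float) -> str:
    # decision table 2: risk rating + requested amount -> decision
    if risk == "low":
        return "approve"
    if risk == "medium" and amount <= 10_000:
        return "approve"
    return "refer"

def loan_decision_service(credit_score: int, amount: float) -> str:
    # the "decision service" chains the tables; a BPMN task or web service call
    # would invoke this with process data and route on the result
    return approval_table(risk_table(credit_score), amount)

print(loan_decision_service(680, 8_000))   # approve
print(loan_decision_service(550, 8_000))   # refer
```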

There was a lot more information about a lot more products, and then my head exploded.

Like others in the audience, I started getting product fatigue, and just picked out details of products that are relevant to me. This really drove home that the TIBCO product portfolio is big and complex; these briefings might benefit from being split into a few separate analyst sessions with some sort of product grouping, although there is so much overlap and integration between product areas that I’m not sure how they would sensibly split it up. Even for my area of coverage, there was just too much information to capture, much less absorb.

We finished up with a panel of the top-level TIBCO execs, the first question of which was about how the sales force can even start to comprehend the entire breadth of the product portfolio in order to be successful selling it. This isn’t a problem unique to TIBCO: any broad-based platform vendor, such as IBM or Oracle, has the same issue. TIBCO’s answer: specialized sales force overlays for specific products and industry verticals, and selling solutions rather than individual products. Both of those work to a certain extent, but often solutions end up being no more than glorified templates developed as sales tools rather than actual solutions, and can lead to more rather than less legacy code.

Because of the broad portfolio, there’s also confusion in the customer base, many of whom see one TIBCO product and have no idea of everything else that TIBCO does. Since TIBCO is not quite a household name like IBM or Oracle, companies don’t necessarily know that TIBCO has other things to offer. One of my banking clients, on hearing that I am at the TIBCO conference this week, emailed “Heard of them as a player in the Cloud Computing space.  What’s different or unique about them vs others?” Yes, they play in the cloud. But that’s hardly what you would expect a bank (that uses very little cloud infrastructure, and likely does have some TIBCO products installed somewhere) to think of first when you mention TIBCO.

TIBCO TUCON2012 Day 1 Keynotes, Part 2: Big Honking Data

Back from the mid-morning break, CMO Raj Verma shifted gears from customer experience management to look at one of the other factors introduced in the first part of the session: big data.

Matt Quinn was back to talk about big data: in some ways, this isn’t new, since there has been a lot of data within enterprises for many years. What’s changed is that we now have the tools to deal with it, both in place and in motion, to find the patterns hiding within it through cleansing and transformation. He makes a sports analogy, saying that a game is not just about the final score, but about all of the events that happen to make up the entire game; similarly, it is not sufficient any more to just measure outcomes in business transactions, you have to monitor patterns in the event streams and combine that with historical data to make the best possible decisions about what is happening right now. He referred to this combination of event processing and analytics as closing the loop between data in motion and data at rest. TIBCO provides a number of products that combine to handle big data: not just CEP, but ActiveSpaces (the in-memory data grid) to enable realtime processing, Spotfire for visual analytics and integration with Hadoop.
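A minimal sketch of that closed loop, stripped of any product specifics: compute a baseline from historical data at rest, then score each live event against it as it arrives. A CEP engine backed by an in-memory grid does this at much larger scale and with far richer patterns; the numbers and threshold below are invented for illustration.

```python
# Sketch of "closing the loop" between data in motion and data at rest: flag live
# events that deviate from a baseline computed over historical data. Illustrative
# values only; not any TIBCO product API.
from statistics import mean, stdev

historical_order_values = [120, 135, 128, 140, 119, 131, 125, 138]  # data at rest
baseline_mean = mean(historical_order_values)
baseline_sd = stdev(historical_order_values)

def score_event(order_value: float, threshold_sds: float = 3.0) -> bool:
    """Return True if the live event deviates enough from history to act on now."""
    return abs(order_value - baseline_mean) > threshold_sds * baseline_sd

for event in [129, 133, 410]:          # data in motion (live order events)
    if score_event(event):
        print(f"act now: order value {event} is far outside the historical pattern")
```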

We saw a demo of LogLogic, recently acquired by TIBCO, which provides analytics and event detection on server logs. This might sound like a bit of a boring topic, but I’m totally on board with this: too many companies just turn off logging on their servers because it generates too many events that they just can’t do anything with, and it impacts performance since logging is done on the operational server. LogLogic’s appliance can collect enormous amounts of log data, detect unusual events based on various rules, and integrate with Spotfire for visualization of potential security threats.

Mark Lorion, CMO for TIBCO Spotfire, came up to announce Spotfire 5, with a complete overhaul to the analytics engine, and including the industry’s first enterprise runtime for the R statistical language, providing 10 times the performance of the open source R project for predictive analytics. Self-service predictive analytics, ftw. They are also going beyond in-memory, integrating with Teradata, Oracle and Microsoft SQL Server for in-database analysis. With Teradata horsepower behind it – today’s announcement of Spotfire being optimized for in-database computation on Teradata – you can now do near-realtime exploration and visualization of some shocking amounts of data. Brad Hopper gave us a great Spotfire demo, not something that most TUCON attendees are used to seeing on the main stage.

Rob Friel, CEO of PerkinElmer, took the stage to talk about how they are using big data and analytics in their scientific innovations in life sciences: screening patient data, environmental samples, human genomes, and drug trials to detect patterns that can improve quality of life in some way. They screened 31 million babies born last year (one in four around the globe) through the standard heel-prick blood test, and detected 18,000 with otherwise undiagnosed disorders that could be cured or treated. Their instrumentation is key in acquiring all the data, but once it’s there, tools such as Spotfire empower their scientists to discover and act on what they find in the data. Just as MGM Grand is delivering unique experiences to each customer, PerkinElmer is trying to enable personalized health monitoring and care for each patient.

To wrap up the big data section, Denny Page, TIBCO’s VP of Engineering, came on stage with his new hardware babies: an FTL Message switch and an EMS appliance, both to be available by the end of November 2012.

For the final part of the day 1 keynotes, we heard from an innovators’ panel of Scott McNealy (founder of Sun Microsystems, now chairman of Wayin), Tom Siebel (founder of Siebel Systems, now at C3 Energy where they are using TIBCO for energy usage analytics), Vivek Ranadivé, and KR Sridhar (CEO of Bloom Energy), chaired by David Kirkpatrick. Interesting and wide-ranging discussion about big data, analytics, sentiment analysis, enterprise social media, making data actionable, the internet of things and how a low barrier to platform exit drives innovation. The panel thinks that the best things in tech are yet to come, and I’m in agreement, although those who are paranoid about the impact of big data on their privacy should be very, very afraid.

I’ll be blogging from the analyst event for the rest of the day: we have corporate and technology briefings from the TIBCO execs plus some 1:1 sessions. No pool time for me today!

TIBCO TUCON2012 Day 1 Keynotes, Part 1

The keynotes started with TIBCO’s CEO, Vivek Ranadivé, talking about the forces driving change: a massive explosion of data (big data), the emergence of mobility, the emergence of platforms, the rise of Asia (he referenced the Gangnam Style video, although did not actually do the dance), and how math is trumping science (e.g., the detection and exploitation of patterns). The ability to harness these forces and produce extreme value is a competitive differentiator, and is working for companies like Apple and Amazon.

Raj Verma, TIBCO’s CMO, was up next, continuing the message of how fast things are changing: more iPhones were sold over the past few days than babies were born worldwide, and Amazon added more computing capacity last night than they had in total in 2001. He (re)introduced their concept of the two-second advantage – the right information a little bit before an event is worth infinitely more than any amount of information after the event – enabled by an event-enabled enterprise (or E3, supported by, of course, TIBCO infrastructure). Regardless of whether or not you use TIBCO products, this is a key point: if you’re going to exploit the massive amounts of data being generated today in order to produce extreme value, you’re going to need to be an event-enabled enterprise, responding to events rather than just measuring outcomes after the fact.

He discussed the intersection of four forces: cloud, big data, social collaboration and mobility. This is not a unique message – every vendor, analyst and consultant is talking about this – but he dug into some of these in detail: mobile, for example, is no longer discretionary, even (or maybe especially) in countries where food and resources are scarce. The four of these together all overlap in the consumerization of IT, and are reshaping enterprise IT. A key corporate change driven by these is customer experience management: becoming the brand that customers think of first when the product class is mentioned, and turning customers into fans. Digital marketing, properly done, turns your business into a social network, and turns customer management into fan management.

Matt Quinn, CTO, continued the idea of turning customers into fans, and solidifying customer loyalty. To do this, he introduced TIBCO’s “billion dollar backend” with its platform components of automation, event processing, analytics, cloud and social, and hosted a series of speakers on the subject of customer experience management.

We then heard from a customer: Chris Nordling, EVP of Operations and CIO of MGM Resorts and CityCenter, on how they use TIBCO for their MLife customer experience management/loyalty program. Their vision is to track everything about you from your gambling wins/losses to your preferences in restaurants and entertainment, and use that to build personalized experiences on the fly. By capturing the flow of big data and responding to events in realtime, the technology provides their marketing team with the ability to provide a zero-friction offer to each customer individually before they even know that they want something: offering reduced entertainment tickets just as you’re finishing a big losing streak at the blackjack tables, for example. It’s a bit creepy, but at the same time, has the potential to provide a better customer experience. Just a bit of insight into what they’re spending that outrageous $25/day resort fee on.
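For illustration only (the field names and the rule are hypothetical, not MGM’s or TIBCO’s actual implementation), this is the shape of the event rule being described: watch a customer’s event stream for a losing-streak pattern and trigger an offer the moment it completes.

```python
# Illustrative event rule: detect a losing streak in a customer's gaming event feed
# and return an offer as soon as the pattern completes. In practice a CEP engine
# would run rules like this continuously against the live stream.
def offer_on_losing_streak(events, streak_length=5):
    streak = 0
    for e in events:                      # events arrive in time order
        if e["type"] == "table_game" and e["result"] == "loss":
            streak += 1
            if streak >= streak_length:
                return {"customer": e["customer"], "offer": "discounted show tickets"}
        else:
            streak = 0                    # any non-loss event resets the pattern
    return None

feed = [{"customer": "c42", "type": "table_game", "result": "loss"}] * 5
print(offer_on_losing_streak(feed))
```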

Quinn came back to have a discussion with one of their “loyalty scientists” (really??) about Loyalty Lab, TIBCO’s platform/service for loyalty management, which is all about analyzing events and data in realtime, and providing “audience of one” service and offerings. Traditional loyalty programs were transaction-based, but today’s loyalty programs are much more about providing a more holistic view of the customer. This can include not just events that happen in a company’s own systems, but include external social media information, such as the customer’s tweets. I know all about that.

Another customer, Rick Welts of the Golden State Warriors (who, ironically, play at Oracle Arena), talked about not just customer loyalty management, but the Moneyball-style analytics that they apply to players on a very granular scale: each play of each game is captured and analyzed to maximize performance. They’re also using their mobile app for a variety of customer service initiatives, from on-premise seat upgrades to ordering food directly from your seat in the stadium.

Mid-morning break, and I’ll continue afterwards.

As an aside, I’m not usually wide awake enough to get much out of the breakfast-in-the-showcase walkabout, but this morning prior to the opening sessions, I did have a chance to see the new TIBCO decision services integrated into BPM, also available as standalone services. Looked cool, more on that later.

AWD Monitoring Technical Deep-Dive

Great keynote at AWD Advance this morning by Captain Michael Abrashoff, author of It’s Our Ship, a book on leadership; I confess to tearing up a bit when he described how he supported and encouraged the young people who worked for him, and hope that I did a fraction as well when I ran a company.

Back to business, however, I’m in the technical session on AWD monitoring and business intelligence, following on from Kim Smyly’s introduction to the new monitoring yesterday, where Dirk Luttrell and Bob Kuesterteffan are giving us a peek under the covers of their new monitoring offering. They are implementing dimensional data modeling in their new offering – which, as I pointed out yesterday, is based on Oracle’s BI – in order to provide better business-based metrics and analytics. We got a brief tutorial on dimensional data models (star schemas in relational databases, or cubes in multidimensional databases), making me wish I had paid a bit more attention when my other half was talking about how he was implementing one of these in his data warehouse. In short, relational data models are organized around transactions, whereas dimensional data models are organized around business entities and measurements. Facts (the measurements and events) are represented in fact tables, and dimensions (the business entities such as users, queues and dates) are used for selecting, sorting, filtering and summarizing the data contained in the fact tables.

The core AWD data is based on relational models, since it is a transactional system, but both the process and line-of-business data in AWD can be published to the dimensional (star) model for easier reporting and monitoring. If you’ve ever written a report or dashboard based directly on the process transactional data in a BPMS (which I have), you know that it’s not pretty: BI tools are optimized for dimensional data models, not relational transaction models. In the past AWD has allowed for reporting directly against relational models, but it was (is) not very flexible and could be prone to performance and scalability problems, requiring either extremely complex (and compute-intensive) queries, or denormalization and data duplication. Furthermore, it requires that report writers know and understand the underlying relational data model since they’re writing directly against that physical schema, which further locks in the core AWD product to that schema rather than being able to mask it behind a logical data schema.

In the new dimensional data model, they represent business entities directly: work items, queues, users and various other attributes of work including time dimensions. They also include a single line-of-business data dimension for all LOB fields (this seems like they are relational-izing their dimensional model, but I can understand the administrative and design complexity if they didn’t do it this way), so that fields such as account number can be used to cross-reference to other systems or for filtering, searching and sorting within the BI context.

They are creating the following fact tables:

  • Assigned work fact, with dimensions regarding when and to whom a work item was assigned and unassigned, and the current state regarding assignment and work selection. This is used, for example, to report on assigned work by user.
  • Completed work fact, which tracks work steps as they are completed, including duration, user experience level and other information about how the work was completed. This is used for reporting on work that was completed.
  • Locked work fact, tracking items when they are locked by users: who, when and how. As with assigned work fact, this is used for reporting on work locked by a particular user.
  • Login status fact, tracking when users log in and out, and whether they are currently logged in.
  • Queue fact, tracking work as it moves from queue to queue, and the status that each work item is in.
  • Suspended work fact, including when items are suspended and unsuspended, and who did it.
  • Work fact, which includes historical information on work, with a “current” flag to filter for just the work that is in flight.
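To make the star schema idea concrete, here’s a toy sketch along the lines described above; the table and column names are illustrative, not AWD’s actual model, but it shows the one-join-per-dimension query shape that BI tools generate against a fact table like completed work.

```python
# Toy star schema: a completed-work fact table keyed to user and date dimensions,
# plus the kind of filter/group/aggregate query a BI tool would issue against it.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_user (user_key INTEGER PRIMARY KEY, user_name TEXT, experience TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE fact_completed_work (
    user_key INTEGER, date_key INTEGER, duration_minutes REAL);

INSERT INTO dim_user VALUES (1, 'achan', 'senior'), (2, 'bsmith', 'junior');
INSERT INTO dim_date VALUES (20130101, '2013-01-01'), (20130102, '2013-01-02');
INSERT INTO fact_completed_work VALUES
    (1, 20130101, 12), (1, 20130102, 9), (2, 20130101, 30), (2, 20130102, 25);
""")

# Summarize completed work by user: filter and group on dimensions, aggregate the fact.
for row in db.execute("""
    SELECT u.user_name, COUNT(*) AS items, AVG(f.duration_minutes) AS avg_minutes
    FROM fact_completed_work f
    JOIN dim_user u ON u.user_key = f.user_key
    JOIN dim_date d ON d.date_key = f.date_key
    WHERE d.calendar_date >= '2013-01-01'
    GROUP BY u.user_name
"""):
    print(row)
```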

[This is probably way more detail about their dimensional data than you’re interested in, but I blog because I have no memory, and this is my only record of what I see here. That’s right, it’s really all about me.]

Given that the same underlying relational model will still be there in AWD, customers can continue to use the existing AWD BI (which hits against those tables), but I’m guessing that a lot are going to want to move over in order to take advantage of the ease of use, performance and scalability of the new BI environment. They’re also planning some future features such as scheduled report delivery; I’m not sure which of the new and upcoming features are based purely on the underlying Oracle technology and how much they’re building themselves, but if they’re smart, they’ll leverage as much of the Oracle BI package as possible. They also need to figure out how to integrate/publish to enterprise data warehouses, and build full replacement functionality for the current BI product so that it can be retired.

Monitoring Dashboards And Reports In AWD

Kim Smyly presented on some of the new monitoring and analytics capabilities in AWD. They now have interactive dashboards, charts and reports that link directly to the underlying transactions, and can include line of business data in the reports. The custom AWD BI reporting engine has been replaced with an OEM version of Oracle Business Intelligence Enterprise Edition, allowing for more flexible representation and visualization of the information, with actionable links to the processes. With interactive filtering capabilities, this also provides a search interface, such as searching for all active transactions for a specific account number.

This is pretty standard BI in terms of report and dashboard definition: quite a bit of flexibility for visualization and computation in a drag-and-drop interface, no more difficult to use than Excel tables and charts. It includes pivot tables (which you may know from Excel), which are great interactive analytical tools. I’m not sure what the legacy AWD BI looks like, but if it’s like the BI in most older systems (usually some ancient version of Crystal Reports), this is bound to be a huge improvement.
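The pivot tables mentioned here behave much like this pandas analogue: take flat work-item rows and cross-tabulate them along two dimensions. The column names are made up for illustration, not the AWD schema.

```python
# Pivot-table analogue: flat work-item rows cross-tabulated by queue and status.
import pandas as pd

work_items = pd.DataFrame([
    {"queue": "claims",  "status": "active",    "count": 1},
    {"queue": "claims",  "status": "suspended", "count": 1},
    {"queue": "account", "status": "active",    "count": 1},
    {"queue": "claims",  "status": "active",    "count": 1},
])

pivot = work_items.pivot_table(index="queue", columns="status",
                               values="count", aggfunc="sum", fill_value=0)
print(pivot)   # rows = queues, columns = statuses, cells = item counts
```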

Line of business data can be included directly as fields in the dashboards and reports; I’m not sure of the underlying data architecture, but it appears that LOB data dimensions are defined in AWD and somehow replicated from other systems (or they are a view on those other systems); because they’re in the AWD data schema, they’re exposed for monitoring and reporting.

A number of questions from the audience on this; DST is porting over from the old to a new schema, and although they will support both for the foreseeable future, I expect that this will eventually force a migration from the old report mechanisms. It seems like the first implementation of this is not as powerful as the old custom BI (although probably significantly easier to use), so they will need to bring the functionality up to match before they can expect a mass migration.

Process Intelligence White Paper

And here’s the white paper from this afternoon’s webinar (registration required) on Enabling Process Intelligence Through Process Mining & Analytics. Sponsored by Fujitsu, written by me, reviewed with many helpful comments by Keith Swenson and the rest of the Fujitsu team. When the on-demand webinar is available, I’ll post a link.

Enabling Process Intelligence Through Process Mining & Analytics

A bit short notice, but I’m doing a webinar this afternoon at 2pm (Eastern) on process intelligence through process mining and analytics, along with Keith Swenson of Fujitsu, and you can sign up here. Fujitsu is sponsoring the webinar, as well as the white paper that I have written to go into the topic in a bit more detail; if you sign up for the webinar, I think that they will send you a copy of the white paper.

Fujitsu has recently repackaged their Automated Process Discovery (process mining, which I have reviewed here previously) together with their analytics to create a sort of “intelligent process” suite: you use the mining part to find out what your processes are across multiple systems, then add the necessary instrumentation to those processes in order to feed into a consolidated analytics visualization. Whereas most process analytics are based just on the processes automated or orchestrated by a BPMS, Fujitsu is trying to expand that level of visibility into systems that aren’t connected to the BPMS. With my background in pattern recognition, I have a bit of interest in process mining, and have written about their process discovery tool previously, as well as worked my way through Wil van der Aalst’s recent book, Process Mining: Discovery, Conformance and Enhancement of Business Processes.
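For anyone curious what the discovery side of process mining boils down to, here’s a minimal sketch: take an event log of (case, activity, timestamp) records and count which activity directly follows which. Real tools like Fujitsu’s Automated Process Discovery do far more (filtering, frequency and duration overlays, model synthesis); the log below is invented for illustration.

```python
# Minimal process discovery sketch: build directly-follows counts from an event log.
from collections import Counter

event_log = [
    ("case1", "receive claim", 1), ("case1", "assess", 2), ("case1", "pay", 3),
    ("case2", "receive claim", 1), ("case2", "assess", 2),
    ("case2", "request info", 3), ("case2", "assess", 4), ("case2", "pay", 5),
]

directly_follows = Counter()
last_activity = {}                                  # last activity seen per case
for case, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
    if case in last_activity:
        directly_follows[(last_activity[case], activity)] += 1
    last_activity[case] = activity

for (a, b), n in directly_follows.most_common():
    print(f"{a} -> {b}: {n}")
```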

Hope that you can join us for the webinar today.

TIBCO Spotfire 4.0

I had a briefing with TIBCO on their Spotfire 4.0 release, announced today and due to be released by the end of November. Spotfire is the analytics platform that TIBCO acquired a few years back, and provides end-user tools for dimensional analysis of data. This includes both visualization and mashups with other data sources, such as ERP systems.

In 4.0, they have focused on two main areas:

  • Analytic dashboards for monitoring and interactive drilldowns. This seems more like the traditional BI dashboards market, whereas Spotfire is known for its multidimensional visualizations, but I expect that business customers find that just a bit too unstructured at times.
  • Social collaboration around data analysis, both in terms of finding contributors and publishing results, by allowing Spotfire analysis to be embedded in SharePoint or shared with tibbr, and by including tibbr context in an analysis.

I did get a brief demo, starting with the dashboards. This starts out like a pretty standard dashboard, but does show some nice tools for the user to change the views, apparently including custom controls that can be created without development. The dynamic visualization is good, as you would expect if you have ever seen Spotfire in full flight: highlighting parts of one visualization object (graph or table) causes the corresponding bits in the other visualizations on the screen to be highlighted, for example.

Spotfire 4.0 - tibbr in sidebar of dashboard

There’s also some built-in collaboration: a chart on the Spotfire dashboard can be shared on tibbr, which shows a static snapshot of the chart in a discussion thread but links back directly to the live visualization, whereas sharing in SharePoint embeds the live visualization rather than a static shot. [Insert obligatory iPad demo here] Embedding a tibbr discussion as context within an analysis is really less of an integration than a side-by-side complementary viewing: you can have a tibbr discussion thread viewed on the same page as part of the analysis, although the tibbr thread is not itself being analyzed.

I found that the integration/collaboration was a bit lightweight, some of it no more than screen sharing (like the difference between a portal and a composite application). However, the push into the realm of more traditional dashboards will allow Spotfire to take on the more traditional BI vendors, particularly for data related to other TIBCO products, such as ActiveMatrix BPM.

[Update: All screenshots from briefing; for some reason, Flickr doesn’t want to show them as an embedded slideshow]