Business Objects Summit: Sanjay Poonen

Moving from business intelligence to the applications that rely on business intelligence, we heard from Sanjay Poonen, SVP and GM of Performance Optimization Applications. He started off with the cycle that we saw earlier in the day, with insight being linked to execution and process optimization, and focused on the governance, risk and compliance aspects of this cycle.

He sees business intelligence as a central contributor to EPM, GRC and ERP, and with SAP having a leadership position in all of these, they can provide a total application suite for the CFO. Their product portfolio includes both SAP and Business Objects offerings:

  • SAP Strategy Management
  • Business Objects Financial Consolidation
  • SAP Business Planning and Consolidation
  • SAP GRC Process Controls
  • Business Objects Profitability and Cost Management
  • SAP Spend Analytics

These tools are becoming more collaborative, since almost all situations involving EPM and GRC require a degree of collaboration between different people within an enterprise, and they pull in information from both internal and external sources to create a more complete view of a business situation.

Business Objects Summit: Marge Breya on product portfolio

Marge Breya, EVP and GM of the Business Intelligence Platform group, gave us a whirlwind briefing on the product portfolio, which is made up of Business Objects assets, some SAP assets, and new products being built jointly:

  • Governance, Risk, Compliance (GRC), where they hold the overall #1 position worldwide with a 21% market share
  • Enterprise Performance Management (EPM), made up of financial performance management (#2 with a 20% share) and operational performance management (#1 with a 23% share), the latter of which includes both standalone and embedded components
  • Information Discovery and Delivery, made up of query, reporting and analysis (#1 with a 24% share) and advanced analytics (just introduced)
  • Enterprise Information Management (#4)

Not surprisingly, they are pursuing a complete integrated stack with other SAP products, but they also integrate with products from Oracle, IBM, Microsoft and independent application and database vendors.

She then introduced John Mayer, director of consulting and testing services at Apotex Group (a pharmaceutical firm), to discuss their use of the products: they’re a big SAP user, and are also using Business Objects in several areas. They’ve been giving end users tools for ad hoc queries into the corporate data, and having seen the value of that, usage is spreading across the organization and helping to drive their data warehouse initiative. IT keeps overall design control over the universes and databases — you don’t really want users doing this since they may not understand the implications of, say, searching a multi-million-record table on an unindexed field — but the users create their own queries and reports.

Breya continued with the message of making this easier for the not-so-technically-minded to create their own queries, reports and dashboards. Their intelligence platform puts a semantic layer over the messy technical stuff (metadata management, master data management, etc.), and creates a common services infrastructure for finding and using those information components as services from the analysis and reporting layer.

They have a large suite of information consumption tools in that query, analysis and reporting layer:

  • Crystal Reports (production reports with drillable visualization)
  • BI Widgets
  • BI Mobile
  • Polestar
  • Web Intelligence (ad hoc reporting and analysis)
  • Text Analytics
  • Voyager (OLAP advanced analysis)
  • Predictive Workbench (advanced statistical analysis)

Today, they’re announcing a new product in that suite: Xcelsius Present, a data visualization tool. From today’s press release:

Xcelsius Present is a data-visualization tool that transforms ordinary, static Microsoft Office Excel spreadsheets into captivating visuals and allows business users to share them via Microsoft PowerPoint or Adobe™ PDF files. Through interactive data visualizations and a simple, point-and-click interface, Xcelsius Present enables business users to create professional-looking visuals in just minutes, resulting in engaging experiences for presenters and audiences alike. Using interactive graphics – including dials, charts and gauges that clearly convey business cases and demonstrate “what if” scenarios – business professionals can involve, inform and persuade their audiences in meaningful ways with stunning visualizations.

This is a sub-$200 product, aimed at a broad range of business users who want to add some nice visualizations to their spreadsheet data, but there’s probably a consumer market for this as well; in fact, one of their online demos is a college cost calculator.

They’re also announcing Crystal Reports Basic for SAP Business One, allowing for easily customizable drillable reporting on SAP Business One data that can be shared with others.

She quickly covered their data services portfolio, which provides all the usual data management functions but also data federation and management of unstructured data such as RSS feeds.

They provide on-premise solutions, but also have more than 125,000 subscribers for their SaaS BI offerings.

Business Objects Summit: Partner Panel

Narina Sippy, SVP and GM of GRC at Business Objects, hosted a panel of three major partners: Lee Dittmar of Deloitte, Glenn Gutwillig of Accenture, and Dan Miller of IBM GS. Inevitably, this started with the “mine’s bigger than yours” comparisons, because apparently when it comes to Business Objects professional services practices, size does matter.

The most interesting part of the discussion was in response to an audience question about whether it’s possible to reach the nirvana of enterprise-wide information access and sharing, or if we’re stuck with unintegrated silos of information within enterprises. The panel felt that the leaders in moving to enterprise-wide integrated information management will gain such a competitive advantage — compliance, internal collaboration and other benefits — that it will force the rest of the market along quickly behind it.

Business Objects Summit: Day 1 Keynote

I’m here in rainy Boston at the Business Objects Influencer Summit, which was kicked off with Jonathan Becher, SVP of Marketing for Business Objects. It’s a very process-oriented message (which explains why I’m here): using business intelligence to drive process efficiency, improve insight to close the gap between strategy and execution, and add flexibility to create new business processes that align operations to strategy.

He was joined by Doug Merritt, EVP and GM of Business User Global Sales (moving from a product role), who continued with the message of how total insight allows organizations to optimize business performance. He discussed a number of customer case studies, focusing on how their easy-to-use end-user tools are being used to solve real business problems.

He also showed the strong tie-in between business intelligence and core SAP systems: insight, strategy and decisions feeding into monitoring, process refinement, process execution and events.

It’s only been just over six months since Business Objects’ acquisition by SAP, a period when most acquired companies take a bit of a dip in sales, but they’ve managed to keep their numbers on an upward growth path.

Becher then introduced Dr. Robert Kaplan from Harvard Business School and Palladium Group, inventor of such business strategy and measurement concepts as the balanced scorecard and activity-based costing. We’ve also been given a copy of his book, The Execution Premium — Linking Strategy to Operations for Competitive Advantage, which I look forward to reading. He walked us through the main concept in the book: a closed-loop cycle that links strategy and operations:

  1. Develop the strategy
  2. Translate the strategy
  3. Align the organization
  4. Plan operations
  5. Execution
  6. Monitor and learn
  7. Test and adapt

In the middle of this cycle are the strategic plan (e.g., balanced scorecard) and the operating plan (e.g., forecasts, budgets, dashboards), with links to the steps in the cycle that either create the artifacts of the plans or are informed by those artifacts; the two plans also interact with each other.

Steps 2 and 3 represent the creation of the balanced scorecard, and translating that into operational improvement programs (step 4) is a new focus in Kaplan’s book. And here we are again, talking about process — since that’s what step 5 is all about — and how the balanced scorecard helps to determine which processes have the most impact on a business’ performance, and are therefore the ones that should be the focus of process improvement efforts.

Becher took us back around the cycle, showing how Business Objects is applied at each of those steps (except execution), which provided an interesting perspective on the different roles of Business Objects within the cycle that we in the BPM world know as design-execute-monitor-optimize.

TUCON: BPM with Spotfire Analytics

Lars Bauerle and Brendan Gibson of TIBCO showed us how Spotfire analytics are being integrated with data from iProcess to identify process improvement opportunities. I hadn’t seen Spotfire in any detail before the demo that I saw on Tuesday, and it’s a very impressive visualization and analysis tool; today, they showed iProcess runtime data copied and pasted from Excel into Spotfire, but it’s not clear that they’ve done a real integration between the iProcess process statistics and Spotfire. Regardless, once you get the data in there, it’s very easy to do aggregations on the fly and drill into the results, compare portions of the data set, and filter by any attribute. You can also define KPIs and create dashboard-style interfaces. Authoring and heavy-duty analysis are done using an installed desktop application with (I believe) a local in-memory engine, but lightweight analysis can be done using a zero-install web client, with all analysis performed on the server.

In addition to local data, it’s possible to link directly from enterprise databases into the Spotfire client, which effectively gives the Spotfire user the ability to run queries that bring data into the in-memory engine for visualization and analysis — in other words, there don’t appear to be any technical barriers to establishing a link to the statistics in an iProcess engine. They showed a model of data flowing from the iProcess server to a data mart, which would then be connected to Spotfire; realistically, you’re not going to let your analytics hit your production process engine directly, so this makes sense, although there can be latency issues with this model. It’s not clear if they provide any templates for doing this or for standard process analytics.
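
For illustration, here’s a minimal sketch of that engine-to-data-mart refresh, written against SQLite with invented table and column names; a real iProcess schema (and the Spotfire information link that would read the mart) would of course look different.

```python
# Minimal sketch of the "process engine -> data mart -> analytics" flow described
# above. All table and column names are hypothetical; the point is only that a
# periodic job aggregates engine statistics into a mart table, and the analytics
# tool reads the mart rather than the production engine.
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")  # stand-in for the engine/mart databases

# Hypothetical process-statistics table as it might exist on the engine side.
conn.execute("""CREATE TABLE proc_stats (
    case_id INTEGER, step_name TEXT, started TEXT, completed TEXT, sla_missed INTEGER)""")
conn.executemany(
    "INSERT INTO proc_stats VALUES (?, ?, ?, ?, ?)",
    [(1, "Review", "2008-05-01 09:00", "2008-05-01 10:30", 0),
     (2, "Review", "2008-05-01 09:15", None, 1),
     (3, "Approve", "2008-05-01 11:00", "2008-05-01 11:05", 0)])

# Periodic refresh: aggregate into a mart table that the analytics tool queries.
conn.execute("""CREATE TABLE mart_step_summary AS
    SELECT step_name,
           COUNT(*) AS cases,
           SUM(CASE WHEN completed IS NULL THEN 1 ELSE 0 END) AS in_progress,
           SUM(sla_missed) AS missed_slas
    FROM proc_stats GROUP BY step_name""")

print(f"mart refreshed at {datetime.now():%Y-%m-%d %H:%M}")
for row in conn.execute("SELECT * FROM mart_step_summary ORDER BY step_name"):
    print(row)
```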

They did a demo of some preconfigured analytics pages with process data, such as cases in progress and missed SLAs, showing what this could look like for a business manager or knowledge worker. Gibson did refer to "when you refresh the data from the database" which indicates that this is not real-time data, although it could be reasonably low latency depending on the link between iProcess and the data mart, and client refresh frequency.

Then, the demo gods reared their heads and Spotfire froze, and hosed IE with it. Obviously, someone forgot to do the animal sacrifice this morning…

They went to questions while rebooting, and we found out that it’s not possible to stream data in real-time to Spotfire (as I suspected from the earlier comments); it needs to load data from a data source into its own in-memory engine on a periodic basis. In other words, you’re not going to use this as a real-time monitoring dashboard, but as an advanced visualization and analytics tool.

Since this uses an in-memory engine for analytics, there are limitations based on the physical memory of the machine doing the processing, but Spotfire does some smart things in terms of caching to disk, and swapping levels of aggregation in and out as required. However, at some point you’re going to have to consider loading a subset of your process history data via a database view.

There was a question about data security, for example, if a person should only be able to drill down on their own region’s data; this is done in Spotfire by setting permissions on the queries underlying the analysis, including row-level security.

iProcess Analytics is positioned for preconfigured reporting on your process data, whereas Spotfire is positioned for ad hoc analysis and integration with other data sets.

Spotfire could add huge value to iProcess data, but it appears that they don’t quite have the whole story put together yet; I’m looking forward to seeing how this progresses, some real world case studies when customers start to use it, and the reality of what you need to do to preprocess the ocean of process data before loading it into Spotfire for analysis.

TIBCO Analyst Summit: Spotfire

Christopher Ahlberg, founder of Spotfire and now president of TIBCO’s Spotfire division, discussed Spotfire’s capabilities and what’s been done with integrating Spotfire into other TIBCO products.

Timely insights — the right information at the right time — are a competitive differentiator for most businesses, and classic business intelligence just doesn’t cut it in many cases. Consumer applications like Google Finance are raising the bar for dynamic visualization techniques, although most of them are fairly inflexible when it comes to viewing or comparing the specific data in which you’re interested. In other words, we want the data selection and aggregation capabilities of our enterprise systems, and the visualization capabilities of consumer web applications. Ahlberg sees a number of disruptive BI technologies transforming the platform — in-memory processing, interactive visualization, participatory architecture, mashups — and starting to link to the event-driven world of classic TIBCO.

He did a demo of copying and pasting the contents of a spreadsheet directly into Spotfire, which automatically used the column headers as metadata and created a scatterplot. He filtered and colored the chart dynamically, set thresholds and played around with the data to show what could be extracted from it, then showed a pre-built dashboard of charts that still allowed quite a bit of interactivity in terms of filtering and other view parameters. He also showed a mashup between Spotfire and Microsoft Virtual Earth that allowed a subset of the data to be selected in Spotfire, causing the shortest route between the geographic locations corresponding to the data points to be plotted on Virtual Earth.
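
As a rough illustration of that copy-and-paste demo (this is not Spotfire’s API, just the same idea in pandas and matplotlib), tabular data with column headers becomes a scatterplot, and applying a filter produces a new view; the columns and values are invented.

```python
# Generic sketch, not Spotfire: column headers become the plot dimensions, and
# filtering the data frame and re-plotting mimics the interactive filter step.
import pandas as pd

data = pd.DataFrame({
    "region":  ["East", "West", "East", "North", "West"],
    "revenue": [120, 340, 210, 95, 400],
    "margin":  [0.21, 0.35, 0.18, 0.09, 0.41],
})

def scatter(subset: pd.DataFrame, title: str) -> None:
    """Plot margin vs. revenue for the given subset and save it to a PNG."""
    ax = subset.plot.scatter(x="revenue", y="margin", title=title)
    ax.figure.savefig(title.replace(" ", "_") + ".png")

scatter(data, "all regions")                          # the initial paste-and-plot
scatter(data[data["region"] == "West"], "West only")  # the dynamic filter step
```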

This puts a much more configurable face on standard analytics, not just in display parameters but also in areas like selecting the dimensions to be compared on the fly rather than having them pre-defined in OLAP cubes. Since TIBCO is focused on real-time event processing, the logical step is to see how those events can be visualized in Spotfire: instead of just raising an alert to someone, give them a view of the analytical context behind the alert that makes it easier to close the loop on problem resolution. They’ve packaged this as Spotfire Operations Analytics, which fits most closely into a Lean Six Sigma manufacturing environment.

There’s a session on Thursday about BPM with analytics which I’ll likely attend to see what they’re doing in that area.

ProcessWorld 2008: Maureen Fleming, IDC

Maureen Fleming of IDC spoke in the Process Intelligence and Performance Management track on process measurement, and how it’s used to support decisions about a process as well as having an application context. She defines strategic measurement as guiding decisions about where to focus across processes, providing information on where to improve a process, and supporting fact-based dispute arbitration.

She showed a chart of timeliness of measurement versus complexity:

  • Simple and timely: measure and spot-check performance within a process
  • Simple and time critical: need for continuous measurement and problem identification within homogeneous processes
  • Complex and timely: regular reporting to check performance across heterogeneous process islands
  • Complex and time-critical: need for continuous measurement and problem identification across heterogeneous process islands

Leading enterprises are moving towards more complex measurement. I’m not sure I agree with her definition of “timely”, which seems to be used to mean “historical” in this context.

She breaks down measurement tools by the intention of the measurement system: what happened (process intelligence and reporting), what will happen (analytics, complex event processing), what is happening (BAM), why it is happening (root cause analysis), and how we should respond (intelligent process automation).

She went through IDC’s categorization of BPMS — decision-centric automation (human-centric), sensing automation (integration-centric and complex event processing), and transaction-centric automation (integration-centric) — and discussed the problem of each BPMS vendor’s individual BAM creating islands of process measurement. Process metrics from all process automation systems need to feed into a consolidated process measurement infrastructure: likely an enterprise process warehouse with analytics/BAM tied to that more comprehensive view, such as ARIS PPM.

She discussed KPIs and how the goals for those KPIs need to consider both business objectives and past performance: you can’t understand performance variations that might occur in the present without looking at when and why they occurred in the past.

Although her presentation mostly focused on process measurement, the Q&A was much more about sense and respond: how to have specific measurements or events trigger something back on the process side in order to respond to an event.

Webinar: The New Paradigm for Business Intelligence – Collaborative, User Centric, Process Embedded

I’m watching this webinar featuring Don Tapscott of New Paradigm and Katrina Coyle of Molson Canada, sponsored by SAP.

Tapscott spoke first, and started with a reworked version of the same presentation that I saw last June at the Enterprise 2.0 conference, covering the four basic drivers for change: web 2.0, the net generation, the social revolution, and the economic revolution. He went on, however, to talk about the changing face of business intelligence: moving from cost-cutting to a focus on growth and creating relationships with customers and partners. There are a number of factors at play:

  • Simplifying BI from a tool for tech-savvy power users to a visual, interactive tool for business decision makers
  • Making it easier to filter out the relevant data for making decisions rather than being confronted with a sea of data (a foundation of automated decisioning and complex event processing)
  • Providing interactive and iterative tools rather than creating standard reports through batch processes
  • Integrating with business processes for automated decisioning rather than just one-way periodic reporting

He sees more of this in the future: simpler interfaces to allow more people to participate in BI, new visualization techniques, better integration with other technologies, and support for harnessing the collective intelligence of participants.

I love that Tapscott’s using the term “BI 2.0”, which I first used in early 2006 to refer to the entire field of analytics in the face of a new batch of terms that seemed determined to relegate BI to refer only to periodic, one-way reporting.

We were then treated to a 24-slide presentation by Lothar Schubert, Director of Solution Marketing for SAP NetWeaver. Although he provided coverage of the landscape and history of BI, this could have been a bit shorter since we were left with only about 10 minutes for the customer case study.

Next up was Katrina Coyle, BI Team Manager for Molson, discussing their complex business environment — partnerships, acquisitions, multiple geographic locations with different go-to-market strategies, changes to consumer preferences — and how a single version of the truth through BI is absolutely necessary in order for them to continue to build their brand successfully.

Molson has been pushing innovation in their products and through social networking, but also through information using BI. This greatly improves information quality and timeliness throughout their supply chain, which in turn changes their physical loading and shipping practices. Problems in the supply chain are identified as they occur, and less time is spent managing the information and reporting.

You can see a replay of the webinar at the first link above.

Agent Logic’s RulePoint and RTAM

This post has been a long time coming: I missed talking to Agent Logic at the Gartner BPM event in Orlando in September since I didn’t stick around for the CEP part of the week, but they persisted and we had both an intro phone call and a longer demo session in the weeks following. Then I had a crazy period of travel, came home to a backlog of client work and a major laptop upgrade, and seemed to lose my blogging mojo for a month.

If you’re not yet familiar with the relatively new field of CEP (complex event processing), there are many references online, including a recent ebizQ white paper based on their event processing survey which determined that a majority of the survey respondents believe that event-driven architecture comprises all three of the following:

  • Real-time event notification – A business event occurs and those individuals or systems who are interested in that event are notified, and potentially act on the event.
  • Event stream processing – Many instances of an event occur, such as a stock trade, and a process filters the event stream and notifies individuals or systems only about the occurrences of interest, such as a stock price reaching a certain level.
  • Complex event processing – Different types of events, from unrelated transactions, correlated together to identify opportunities, trends, anomalies or threats.

And although the survey shows that the CEP market is dominated by IBM, BEA and TIBCO, there are a number of other significant smaller players, including Agent Logic.

In my discussions with Agent Logic, I had the chance to speak with Mike Appelbaum (CEO), Chris Bradley (EVP of Marketing) and Chris Carlson (Director of Product Management). My initial interest was to gain a better understanding of how BPM and CEP come together as well as how their product worked; I was more than a bit amused when they referred to BPM as an “event generator”. I was somewhat mollified when they also pointed out that business rules engines are event generators: both types of systems (and many others) generate thousands of events to their history logs as they operate, most of which are of no importance whatsoever; CEP helps to find the few unique combinations of events from multiple data feeds that are actually meaningful to the business, such as detecting credit card fraud based on geographic data, spending patterns, and historical account information.
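
To make that concrete, here is a toy sketch of this kind of cross-feed correlation (not Agent Logic’s engine; the events and field names are invented): flag a card as suspicious if it is used in two different countries within an hour.

```python
# Toy correlation over a merged event stream: most events are unremarkable, but
# the combination "same card, different countries, within an hour" raises an alert.
from datetime import datetime, timedelta

events = [  # hypothetical merged feed of card transactions, ordered by time
    {"card": "4111", "country": "CA", "ts": datetime(2008, 11, 1, 9, 0)},
    {"card": "4111", "country": "CA", "ts": datetime(2008, 11, 1, 9, 20)},
    {"card": "4111", "country": "RU", "ts": datetime(2008, 11, 1, 9, 45)},
    {"card": "5500", "country": "US", "ts": datetime(2008, 11, 1, 10, 0)},
]

WINDOW = timedelta(hours=1)
last_seen = {}  # card -> (country, timestamp) of the previous transaction

for e in events:
    prev = last_seen.get(e["card"])
    if prev and prev[0] != e["country"] and e["ts"] - prev[1] <= WINDOW:
        print(f"ALERT: card {e['card']} used in {prev[0]} and {e['country']} "
              f"within an hour -- possible fraud")
    last_seen[e["card"]] = (e["country"], e["ts"])
```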


Agent Logic has been around since 1999, and employs about 50 people. Although they initially targeted defence and intelligence industries, they’re now working with financial services and manufacturing as well. Their focus is on providing an end-user-driven CEP tool for non-technical users to write rules, rather than developers — something that distinguishes them from the big three players in the market. After taking a look at the product, I think that they got their definition of “non-technical user” from the same place as the BPM vendors: the prime target audience for their product would be a technically-minded business analyst. This definitely pushes control and enforcement of policies and procedures down closer to the business user.

They also seem to be more focused on allowing people to respond to events in real-time (rather than, for example, spawning automated processes to react to events, although the product is certainly capable of that). As with other CEP tools, they allow multiple data feeds to be combined and analyzed, and rules set for alerts and actions to fire based on specific business events corresponding to combinations of events in the data feeds.

Agent Logic has two separate user environments (both browser-based): RulePoint, where the rules are built that will trigger alerts, and RTAM, where the alerts are monitored.

RulePoint is structured to allow more technical users to work together with less technical users. Not only can users share rules, but a more technical user can create “topics”, which are aggregated, filtered data sources, then expose these to less technical users as input for their rules. Rules can be further combined to create higher-level rules.

RulePoint has three modes for creating rules: templates, wizards and advanced. In all cases, you’re applying conditions to a data source (topic) and creating a response, but they vary widely in terms of ease of use and flexibility.

  • Templates can be used by non-technical users, who can only set parameter values for controlling filtering and responses, and save their newly-created rule for immediate use.
  • The wizard creation tool allows for much more complex conditions and responses to be created. As I mentioned previously, this is not really end-user friendly — more like business analyst friendly — but not bad.
  • The advanced creation mode allows you to write DRQL (detect and response query language) directly, for example: when 1 "Stock Quote" s with s.symbol = "MSFT" and s.price > 90 then "Instant Message" with to="[email protected]", body="MSFT is at ${s.price}". Not for everyone, but the interesting thing is that by using template variables within the DRQL statements, you can convert rules created in advanced mode into templates for use by non-technical users: another example of how different levels of users can work together (see the sketch after this list).
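
Here’s a rough illustration of that template-variable idea using Python’s string.Template, with the DRQL syntax paraphrased and the parameter names invented: the advanced user writes the rule once with placeholders, and a template user only supplies values. The ${s.price} reference is presumably resolved by the rule engine at runtime, so it’s deliberately left untouched.

```python
# Illustrative only -- not Agent Logic's template mechanism. An advanced-mode rule
# with placeholders becomes a fill-in-the-blanks template for non-technical users.
from string import Template

rule_template = Template(
    'when 1 "Stock Quote" s with s.symbol = "$symbol" and s.price > $threshold '
    'then "Instant Message" with to="$recipient", body="$symbol is at ${s.price}"'
)

# Values a template user might pick from a form; all hypothetical.
rule = rule_template.safe_substitute(symbol="MSFT", threshold=90, recipient="analyst@example.com")
print(rule)
# safe_substitute fills $symbol, $threshold and $recipient, but leaves ${s.price}
# alone (it isn't a valid Python identifier), which is what we want here: that
# placeholder belongs to the rule engine, not to the template.
```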

Watchlists are lists that can be used as parameter sets, such as a list of approved airlines for rules related to travel expenses, which then become drop-down selection lists when used in templates. Watchlists can be dynamically updated by rules, such as adding a company to a list of high-risk companies if a SWIFT message is received that references both that company and a high-risk country.

RulePoint includes a large number of predefined services that can be used as data sources or responders, including SQL, web services and RSS feeds. You can also create your own services. Providing access to web services both as a data source and as a method of responding to an alert allows Agent Logic to do things like kick off a new fraud review process in a BPMS when a set of events occurs across a range of systems that indicates a potential for fraud.
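
As a sketch of what a custom responder might do (this is not Agent Logic’s API; the endpoint, process name and payload fields are all hypothetical), a firing rule could POST the triggering event to a BPMS endpoint to start a review process:

```python
# Hypothetical responder: when a rule fires, start a fraud-review process in a
# BPMS by POSTing the triggering event. URL and payload shape are invented.
import json
import urllib.request

def start_fraud_review(event: dict,
                       bpms_url: str = "http://bpms.example.com/processes/fraud-review") -> None:
    body = json.dumps({"process": "fraud-review", "trigger": event}).encode("utf-8")
    req = urllib.request.Request(bpms_url, data=body,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            print("process started, HTTP status", resp.status)
    except OSError as exc:  # the example endpoint doesn't exist, so expect this path
        print("could not reach BPMS:", exc)

start_fraud_review({"card": "4111", "countries": ["CA", "RU"], "rule": "geo-mismatch"})
```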

Lastly, in terms of rule creation, there are both standard and custom responses that can be attached to a rule, ranging from sending an alert to a specific user in RTAM, to sending an email message, to writing a database record.

Although most of the power of Agent Logic shows up in RulePoint, we spent a bit of time looking at RTAM, the browser-based real-time alert manager. Some Agent Logic customers don’t use RTAM at all, or only for high-priority alerts, preferring to use RulePoint to send responses to other systems. However, compared to a typical BAM environment, RTAM provides pretty rich functionality: it can link to underlying data sources (for example, linking to an external web site with criminal record data on receiving an alert that a job candidate has a record), and allows for mashups with external services such as Google Maps.

It’s also more of an alert management system than just a monitoring tool: you can filter alerts by the various rules that trigger them, and perform other actions such as acknowledging an alert or forwarding it to another user.

Admittedly, I haven’t seen a lot of other CEP products to this depth to provide any fair comparison, but there were a couple of things that I really liked about Agent Logic. First of all, RulePoint provides a high degree of functionality with three different levels of interfaces for three different skill levels, allowing more technical users to create aggregated, easier-to-use data sources and services for less technical users to include in their rules. Rule creation ranges from dead simple (but inflexible) with templates to roll-your-own in advanced mode.

Secondly, the separation of RulePoint and RTAM allows the use of any BI/BAM tool instead of RTAM, or just feeding the alerts out as RSS feeds or to a portal such as Google Gadgets or Pageflakes. I saw a case study of how Bank of America is using RSS for company-wide alerts at the Enterprise 2.0 conference earlier this year, and see a natural fit between CEP and this sort of RSS usage.
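
As a minimal sketch of that alerts-as-RSS idea (the alert records and feed URL are invented), a handful of alerts can be serialized as a bare-bones RSS 2.0 feed that any reader or portal start page can poll:

```python
# Serialize a few hypothetical alert records as a minimal RSS 2.0 feed.
from email.utils import formatdate
from xml.sax.saxutils import escape

alerts = [
    {"title": "Missed SLA on case 1042", "detail": "Review step exceeded its 4-hour target"},
    {"title": "High-risk counterparty added", "detail": "SWIFT message referenced a watchlisted country"},
]

items = "\n".join(
    f"    <item><title>{escape(a['title'])}</title>"
    f"<description>{escape(a['detail'])}</description>"
    f"<pubDate>{formatdate()}</pubDate></item>"
    for a in alerts
)

feed = ('<?xml version="1.0"?>\n<rss version="2.0"><channel>\n'
        "    <title>CEP alerts</title>\n"
        "    <description>Rule-triggered alerts</description>\n"
        "    <link>http://alerts.example.com/feed</link>\n"
        f"{items}\n</channel></rss>")
print(feed)
```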

Update: Agent Logic contacted me and requested that I remove a few of the screenshots that they don’t want published. Given that I always ask vendors during a demo if there is anything that I can’t blog about, I’m not sure how that misunderstanding occurred, but I’ve complied with their request.