TIBCONOW ActiveMatrix BPM Roadmap

On Monday, we heard an update on the current state of AMX BPM from Roger King; today, he gave us more on the new release and future plans in his “BPM for Tomorrow” breakout session. He started out introducing ActiveMatrix BPM 3.1, including the following key themes:

  • Case management
  • Data
  • Usability and productivity

As we saw in the previous breakout, the addition of ad hoc activities to process models enables case management capabilities. Ad hoc (disconnected) activities are fully supported in BPMN; TIBCO provides tooling to add preconditions and to choose manual or automatic invocation, so that an activity can be started manually or start itself once its preconditions are met. If there are no preconditions, the activity will start (or be available to start) as soon as the process is instantiated. Manually startable activities are surfaced for the user in the case UI, in the task list and in the process list. Case states and actions are defined in the case model, specifying the states, actions, and which actions are valid for each state. Support for CMIS has been extended to allow the addition of content (in an external ECM system) to a case object via a case folder paradigm; this includes some new document operations such as linking/unlinking to a case object.
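Conceptually, the precondition model works something like the following minimal Python sketch (the class and method names are invented for illustration, not TIBCO's actual API):

```python
# Minimal sketch of ad hoc activities with preconditions (hypothetical names):
# automatic activities start themselves once their preconditions hold, while
# manual ones merely become available in the case UI for the user to start.

class AdHocActivity:
    def __init__(self, name, precondition=None, manual=True):
        self.name = name
        self.precondition = precondition  # callable on case data, or None
        self.manual = manual
        self.state = "pending"

    def evaluate(self, case_data):
        ready = self.precondition is None or self.precondition(case_data)
        if not ready:
            self.state = "pending"
        elif self.manual:
            self.state = "available"   # surfaced in the task list; user starts it
        else:
            self.state = "started"     # starts itself automatically
        return self.state


# A review that starts automatically once the claim exceeds a threshold,
# and an escalation the user may start at any time (no precondition).
review = AdHocActivity("review", precondition=lambda d: d["amount"] > 1000,
                       manual=False)
escalate = AdHocActivity("escalate")

case = {"amount": 500}
print(review.evaluate(case))    # pending: precondition not yet met
print(escalate.evaluate(case))  # available: no precondition, manual start

case["amount"] = 5000
print(review.evaluate(case))    # started: precondition met, automatic
```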

Data and self-service reporting are now enabled with the inclusion of the full capabilities of Jaspersoft — acquired by TIBCO in April 2014 — in AMX BPM (limited in use to BPM) and a number of out of the box reports and dashboards. This works with case data as well as process data. The messaging and capabilities of Spotfire for BPM analytics have been a bit lacking in the past, and obviously Jaspersoft is being positioned as the “right” way to do BPM analytics (which is probably not happy news for the customers that sweated through the BPM-Spotfire implementations).

On the usability side, they have improved some BPM developer tools such as developer server configuration, and added “live development” capability for iterative development of UI forms without needing to rebuild and redeploy: just edit, save and test directly.

He then talked about their future product direction, which is predicated on their role in managing the “crown jewel” core business processes, necessitating a lot of non-functional capability such as high availability and scalability. As for market trends, they are seeing the cloud being used to drive innovation through experimentation because of the low cost of failure, and the rise of disposable enterprise apps. As enterprise processes become more and more digital, organizations are starting to respond with more automated business processes as well as case management for more dynamic processes. Not surprisingly, they are seeing BPMS with HTML5 as an enterprise rapid application development platform: I have been seeing a merging of the high end of the BPMS market with the application development platform market for some time.

Every organization has a lot of non-differentiating applications with standardized experiences, such as those that support procurement and HR; TIBCO’s target is the differentiating apps within an enterprise, which may not be the systems of record but likely are the systems of engagement. The key to this is enterprise case management and process-centric apps, which include data, process, organizational and social aspects, but also UI composition capabilities, since out-of-the-box UI is rarely differentiating. They are moving toward having some large part of their development environment on the web rather than Eclipse, which will roll out around the time that Microsoft finally forces companies onto Internet Explorer 11 where HTML5 is properly supported. Through this, they will support more of the composable situational apps that can be built, rolled out, used and discarded in less time than it used to take you to write the requirements for an app.

Declarative (data and rules-driven) versus imperative (predefined flow) process models are on their roadmap, and they will start to roll out declarative models in the context of case management: not to the exclusion of imperative models, but to augment them where they provide a better fit. Tied into this, at least in my mind, they are providing stronger support for rules integrated into BPM development.
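The distinction between the two styles can be made concrete with a small sketch (illustrative only; the rules and step names are invented): an imperative model hard-codes the sequence, while a declarative model states the conditions under which each step becomes eligible, so the order emerges from the case data.

```python
# Sketch of imperative vs. declarative process styles (invented example).

# Imperative: a predefined flow in which each step has a fixed position.
imperative_flow = ["capture", "verify", "approve", "pay"]

def next_step_imperative(completed):
    remaining = [s for s in imperative_flow if s not in completed]
    return remaining[0] if remaining else None

# Declarative: rules state when a step is eligible, given the case data;
# small claims skip approval entirely without a separate flow diagram.
declarative_rules = {
    "capture": lambda d: True,
    "verify":  lambda d: d.get("captured"),
    "approve": lambda d: d.get("verified") and d.get("amount", 0) > 100,
    "pay":     lambda d: (d.get("verified") and d.get("amount", 0) <= 100)
               or d.get("approved"),
}

def eligible_steps(case_data, completed):
    return [s for s, rule in declarative_rules.items()
            if s not in completed and rule(case_data)]

print(next_step_imperative({"capture"}))                          # verify
print(eligible_steps({"captured": True, "amount": 50}, {"capture"}))  # ['verify']
```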

He restated the official TIBCO party line that BPMN is not for business users, but that they need something more like Nimbus UPN instead; however, those are currently offered by two separate and non-integrated products that can’t exchange models, making Nimbus less useful for process discovery that will lead to automation. In the future, they will address this with enterprise BPM in the cloud, providing a “Nimbus-style” experience for business users and business-IT collaboration to start, then more analyst-style BPMN modeling, design and implementation. Not clear how they are going to reconcile UPN and BPMN, however.

King then announced TIBCO Cloud BPM — not yet available, but soon — which will be a BPM service powered by AMX BPM. They deprecated their Silver Fabric BPM support, which allowed you to run AMX BPM in the Amazon cloud; it wasn’t a particularly flexible or supportable cloud BPM offering, and a true SaaS offering will be a good addition when it comes along.

Case Management at TIBCONOW 2014

Yesterday, I attended the analyst sessions (which were mostly Q&A with Matt Quinn on the topics that he covered in the keynote), then was on the “Clash of the BPM Titans” panel, so not a lot of writing. No keynotes today, on this last day of TIBCO NOW 2014, but some BPM breakouts on the calendar — stay tuned.

I started the day with Jeremy Smith and Nam Ton That presenting on case management. They discussed customer journeys, and how their Fast Data platform allows you to detect and respond to that journey: this often includes semi-structured, dynamic processes that need to change based on external events and the process to date. It’s more than just process, of course; there needs to be context, actionable analytics, internal and external collaboration, and recommended actions, all working adaptively towards the customer-centric goal.

TIBCO addresses case management with additions to AMX BPM, not with a separate product; I believe that this is the best way to go for a lot of case management use cases that might need to combine more traditional structured processes with adaptive cases. The new capabilities added to support case management are:

  • Case data, providing context for performing actions. The case data model is created independently of a process model; the modeling uses UML to create relational-style ERDs, but also scripting and other functions beyond simple data modeling. This appears to be where the power — and the complexity — of the case management capabilities lie.
  • Case folders, integrating a variety of document sources, including from multiple ECM systems using CMIS, to act as the repository for case-related artifacts.
  • Case state and actions, allowing a user (or agent) to view and set the state of a case — e.g., received, in process, closed — and take any one of a number of actions allowed for the case when it is in that state. This is modeled graphically with a state/action model, which also can apply user/role permissions, in a very similar fashion to their existing page flows capability. Actions can include social interactions, such as requesting information from an expert, accessing a Nimbus-based operations manual related to the current action, applying/viewing analytics to provide context for the action at that state, or providing recommendations such as next best action. Rules can be integrated through pre-conditions that prevent, require or invoke actions.
  • Ad hoc tasks, allowing the case user to instantiate a user task or subprocess; it appears they are doing this by pre-defining these in the process model (as ad hoc, or disconnected, tasks) so although they can be invoked on an ad hoc basis, they can’t be created from scratch by the user during execution. Given that multiple process models can be invoked from a case, there is still a lot of flexibility here.
  • Case UI, providing some out of the box user interfaces, but also providing a framework for building custom UIs or embedding these capabilities within another UI or portal.

Related cases can be linked via an association field created in the case data model; since this is, at heart, an integration application development environment, you can do pretty much anything although it looks like some of it might result in a fairly complex and technical case data model.
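The state/action model described above, with per-state actions, role permissions and rule pre-conditions, can be sketched roughly as follows (the states, roles and structure here are hypothetical, not TIBCO's actual model format):

```python
# Sketch of a case state/action model (invented schema): each state lists the
# actions valid in it; each action carries role permissions and an optional
# precondition rule evaluated against the case data.

STATE_MODEL = {
    "received":   {"assign": {"roles": {"supervisor"}},
                   "close":  {"roles": {"supervisor"},
                              "precondition": lambda d: d.get("resolved")}},
    "in_process": {"escalate": {"roles": {"agent", "supervisor"}},
                   "close":    {"roles": {"supervisor"},
                                "precondition": lambda d: d.get("resolved")}},
    "closed":     {},
}

def allowed_actions(state, role, case_data):
    """Actions the given role may take on a case in the given state."""
    actions = []
    for name, spec in STATE_MODEL[state].items():
        if role not in spec["roles"]:
            continue
        pre = spec.get("precondition")
        if pre is None or pre(case_data):
            actions.append(name)
    return sorted(actions)

print(allowed_actions("received", "supervisor", {"resolved": False}))  # ['assign']
print(allowed_actions("in_process", "agent", {}))                      # ['escalate']
print(allowed_actions("closed", "supervisor", {}))                     # []
```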

They didn’t do an actual demo during the presentation; I’ll drop by the showcase and take a peek later today.

TIBCONOW 2014 Day 2 Keynote: Product Direction

Yesterday’s keynote was less about TIBCO products and customers, and more about discussions with industry thought leaders about disruptive innovation. This morning’s keynote continued that theme with a pre-recorded interview with Vivek Ranadive and Microsoft CEO Satya Nadella talking about cloud, mobile, big data and the transformational effects on individual and business productivity. Nadella took this as an opportunity to plug Microsoft products such as Office 365, Cortana and Azure; eventually he moved on to talk about the role of leadership in providing a meaningful environment for people to work and thrive. Through the use of Microsoft products, of course.

Thankfully, we then moved on to actual TIBCO products.

We had a live demo of TIBCO Engage, their real-time customer engagement marketing product, showing how a store can recognize a customer and create a context-sensitive offer that can be immediately consumed via their mobile app. From the marketer’s side, they can define and monitor engagement flows — almost like mini-campaigns, such as social sharing in exchange for points, or enrolling in their VIP program — that are defined by their target, trigger and response. The target audience can be filtered by past interests or demographics; triggers can be a combination of geolocation (via their app), social media interactions, shopping cart contents and time of day; and responses may be an award such as loyalty points or a discount coupon, a message or both, with a follow link customized to the customer. A date range can then be set for each engagement flow, and set to be live/scheduled to start, or in a draft or review mode. Analytics are gathered as the flows execute, and the effectiveness can be measured in real time.
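The target/trigger/response structure of an engagement flow can be sketched like this (an invented schema loosely mirroring what the demo showed, not the actual TIBCO Engage data model):

```python
# Sketch of an engagement flow as target/trigger/response with a date window
# and a live/draft status (hypothetical structure, for illustration only).

from datetime import date

flow = {
    "name": "vip-enrolment",
    "status": "live",
    "window": (date(2014, 11, 1), date(2014, 11, 30)),
    "target": lambda c: "electronics" in c["interests"],     # audience filter
    "trigger": lambda e: e["type"] == "geolocation" and e["store"] == "SF-01",
    "response": {"award": "100 loyalty points",
                 "message": "Welcome back! Tap to enrol in our VIP program."},
}

def evaluate(flow, customer, event, today):
    """Return the response if the flow is live, in its window, and matches."""
    start, end = flow["window"]
    if flow["status"] != "live" or not (start <= today <= end):
        return None
    if flow["target"](customer) and flow["trigger"](event):
        return flow["response"]
    return None

customer = {"interests": ["electronics", "music"]}
event = {"type": "geolocation", "store": "SF-01"}
print(evaluate(flow, customer, event, date(2014, 11, 12)))  # the response dict
print(evaluate(flow, customer, event, date(2014, 12, 5)))   # None: outside window
```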

Matt Quinn, TIBCO’s CTO, spoke about the challenges of fast data: volume, speed and complexity. We saw the three blocks of the TIBCO Fast Data platform — analytics, event processing, and integration — in a bit more detail, with him describing how these three layers work together. Their strategy for the past 12 months, and going forward, has three prongs: evolution of the Fast Data platform; improved ease of use; and delivery of the Fast Data platform including cloud and mobile support. The Fast Data platform appears to be a rebranding of their large portfolio of products as if it were a single integrated product; that’s a bit of marketing-speak, although they do appear to be doing a better job of providing integrations and use cases of how the different products within the platform can be combined.


In the first part of the strategy, evolution of the platform (that is, product enhancements and new releases), they continue to make improvements to their messaging infrastructure. Fast, secure message transactions are where they started, and they continue to do this really well, in software and on their FTL appliances. Their ActiveSpaces in-memory data grid has improved monitoring and management, as well as multi-site replication, and is now more easily consumed via Node.js and other lighter-weight development protocols. BusinessWorks 6, their integration IDE, now provides more integrated development tooling with greatly improved user interfaces to more easily create and deploy integration applications. They’ve provided plug-ins for SaaS integrations such as Salesforce, and made it easier to create your own plug-ins for integration sources that they don’t yet support directly. On the event processing side, they’ve brought together some related products to more easily combine stream processing, rules and live data marts for real-time aggregation and visualization. And to serve the internet of things (IoT), they are providing connectivity to devices and sensors.


User experience is a big challenge with any enterprise software company, especially one that grows through acquisition: in general, user interfaces end up as a hodge-podge of inconsistent interfaces. TIBCO is certainly making some headway at refactoring these into a more consistent and easier to use suite of interfaces. They’ve improved the tooling in the BusinessWorks IDE, but also in the administration and management of integrations during development, deployment and runtime. They’ve provided a graphical UI designer for master data management (MDM). As part of the ease of use initiative, he discussed the case management functions added to AMX BPM, including manual and automatic ad hoc tasks, case folder and documents with CMIS/ECMS access, and support for elastic organization structures (branch model). BPM reporting has also been improved through the integration of Jaspersoft (acquired by TIBCO earlier this year) with out of the box and customizable reports, and Jaspersoft also has been enhanced to more easily embed analytics in any application. They still need to do some work on interoperability between Jaspersoft and Spotfire: having two analytics platforms is not good for the customers who can’t figure out when to use which, and how to move between them.

The third prong of the strategy, delivery of the platform, is being addressed by offering on-premise, cloud, Silver Fabric platform-as-a-service, TIBCO Cloud Bus for hybrid cloud/on premise configurations, consumable apps and more; it’s not clear that you can get everything on every delivery platform, and I suspect that customers will have challenges here as TIBCO continues to build out their capabilities. In the near future, they will launch Simplr for non-technical integration (similar to IFTTT), and Expresso for consuming APIs. They are also releasing TIBCO Clarity for cleansing cloud data, providing cleaner input for these situational consumable apps. For TIBCO Engage, which we saw demonstrated earlier, they will be adding next best engagement optimization and support for third-party mobile wallets, which should improve the hit rate on their customer engagement flows.

He discussed some of the trends that they are seeing impacting business, and which they have on the drawing board for TIBCO products: socialization and gamification of everything; cloud requirements becoming hybrid to combine public cloud, private cloud and on premise; the rise of micro-services from a wide variety of sources that can be combined into apps; and HTML5/web-based developer tooling rather than the heavier Eclipse environments. They are working on Project Athena, a triplestore database that includes context to allow for faster decisioning; this will start to show up in some of the future product development.

Good review of the last year of product development and what to expect in the next year.

The keynote finished with Raj Verma, EVP of sales, presenting “trailblazer” awards to their customers that are using TIBCO technologies as part of their transformative innovation: Softrek for their ClearView CRM that embeds Jaspersoft; General Mills for their internal use of Spotfire for product and brand management; jetBlue for their use of TIBCO integration and eventing for operations and customer-facing services; and Three (UK telecom) for their use of TIBCO integration and eventing for customer engagement.

Thankfully shorter than yesterday’s 3-hour marathon keynote, and lots of good product updates.

Spotfire Content Analytics At TIBCONOW

(This session was from late yesterday afternoon, but I didn’t remember to post until this morning. Oops.)

Update: the speakers were Thomas Blomberg from TIBCO and Rik Tamm-Daniels from Attivio. Thanks, guys!

I went to the last breakout on Monday to look at the new Spotfire Content Analytics, which combines Spotfire in-memory analytics and visualization with Attivio content analysis and extraction. This is something that the ECM vendors (e.g., IBM FileNet) have been offering for a while, and I was interested to see the Spotfire take on it.

Basically, content analytics is about analyzing documents, emails, blogs, press releases, website content and other human-created textual data (also known as unstructured content) in order to find insights; these days, a primary use case is to determine sentiment in social media and other public data, in order for a company to get ahead of any potential PR disasters.

Spotfire Content Analytics — or rather, the Attivio engine that powers the extraction — uses four techniques to find relevant information in unstructured content:

  • Text extraction, including metadata
  • Key phrase analysis, using linguistics to find “interesting” phrases
  • Entity extraction, identifying people, companies, places, products, etc.
  • Sentiment analysis, to determine degree of negative/positive sentiment and confidence in that score

Once the piece of content has been analyzed to extract this relevant information, more traditional analytics can be applied to detect patterns, tie these back to revenue, and allow for handling of potential high-value or high-risk situations.
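To make the pipeline concrete, here is a toy illustration of entity extraction and sentiment scoring feeding a structured result that downstream analytics could consume. This is a naive lexicon-based scorer of my own invention, nothing like Attivio's actual linguistic engine:

```python
# Toy content-analytics pass (invented lexicons, not Attivio's engine):
# extract known entities, score sentiment, and report a confidence value
# based on how much of the text actually carried sentiment signal.

import re

POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"terrible", "hate", "broken"}
KNOWN_ENTITIES = {"TIBCO", "Spotfire"}

def analyze(text):
    words = re.findall(r"[A-Za-z]+", text)
    pos = sum(w.lower() in POSITIVE for w in words)
    neg = sum(w.lower() in NEGATIVE for w in words)
    scored = pos + neg
    return {
        "entities": sorted({w for w in words if w in KNOWN_ENTITIES}),
        "sentiment": (pos - neg) / scored if scored else 0.0,   # -1 .. +1
        "confidence": scored / len(words) if words else 0.0,
    }

result = analyze("I love Spotfire but the TIBCO installer is broken")
print(result["entities"])   # ['Spotfire', 'TIBCO']
print(result["sentiment"])  # 0.0: one positive term, one negative term
```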

Spotfire Content Analytics uses machine learning that allows you to train the system using sample data, since the information that is considered relevant is highly dependent on the specific content type (e.g., a tweet versus a product review). They provide rich text analytics, seamless visualization via Spotfire, agility through combining sources and transformations, and support for diverse content sources. They showed a demo based on a news feed by country from the CIA factbook site (I think), analyzing and showing aggregate sentiment about countries: as you can imagine, countries experiencing war and plague right now aren’t viewed very positively. Visualization using Spotfire allows for some nice geographic map-based searching, as well as text searching. The product will be available later this month (November 2014).

Great visualizations, as you would expect from Spotfire; it will be interesting to see how this measures up to IBM’s and other content analytics offerings once it’s released.

BPM For Today At TIBCONOW

Roger King, who heads up TIBCO’s BPM product strategy, gave us an update on ActiveMatrix BPM, and some of the iProcess to AMX BPM tooling (there is a separate session on this tomorrow that I may attend, so possibly more on that then). It’s been four years since they launched AMX BPM; that forms the model-driven implementation side of their BPM offering, augmented by Nimbus for business stakeholders for procedure documentation and business-IT collaboration. AMX BPM provides a number of process patterns (e.g., maker-checker) built in, intelligent work and resource management, actionable analytic insights and more. This is built on an enterprise-strength platform — as you would expect from TIBCO — to support 24×7 real-time operations.

In May of this year, they released AMX BPM 3.0 with a number of new features:

  • Support all styles of processes in a single solution: human workflow, case management, rules-based processes, automation, etc.
  • To support case management, they enable global data to allow the creation of a case data model in a central repository separate from processes, allowing cases to exist independent of processes, although they can be acted upon by processes. Work items representing actions on cases can retrieve and update case data on demand, since it references the case data rather than having it copied to local instance data.
  • On the work management side, they added support for elastic organizations (branches, such as you see in retail banking). This allows defining a model for a branch — you could have different models for different sizes of branches, for example — then link to those from branch nodes in the static organization model. Work can then be managed relative to the features of those underlying models, e.g., “send to manager”.
  • Also in work management, they have added dynamic performers to allow for distribution based on business data in a running instance rather than pre-determined role assignments. This is supported by dynamic RQL (resource query language), a query language specifically for manipulating resource assignments.
  • Some new LDAP functions.
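The dynamic performer idea above, resolving work assignment from business data in the running instance rather than a fixed role, can be sketched as follows (loosely in the spirit of RQL; the organization model and query logic here are invented for illustration):

```python
# Sketch of dynamic performer resolution (hypothetical org model): the work
# distribution criteria are computed from instance data at runtime rather
# than from a role assignment fixed at design time.

ORG = [
    {"user": "ana",   "role": "claims_adjuster", "branch": "SF", "grade": 2},
    {"user": "bob",   "role": "claims_adjuster", "branch": "SF", "grade": 1},
    {"user": "carol", "role": "claims_adjuster", "branch": "NY", "grade": 2},
]

def resolve_performers(instance_data):
    """High-value claims go to senior adjusters in the claim's branch."""
    min_grade = 2 if instance_data["amount"] > 10000 else 1
    return [p["user"] for p in ORG
            if p["role"] == "claims_adjuster"
            and p["branch"] == instance_data["branch"]
            and p["grade"] >= min_grade]

print(resolve_performers({"amount": 50000, "branch": "SF"}))  # ['ana']
print(resolve_performers({"amount": 500, "branch": "SF"}))    # ['ana', 'bob']
```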

There will be another session on Wednesday that covers the new features that are new since May, including a lot about case management; I’ll report more from that.

He also gave us some of the details of the iProcess to AMX BPM “conversion” tools, which migrate the process models (although not the applications that use those models): I assume that the conversion rate of their iProcess customers to AMX BPM has been lower than they expected, and they are hoping that this will move things along.

We then heard a Nimbus update from Dan Egan: version 9.5 will be released this month. It is positioned as a “how to” guide for the enterprise, showing process models in a more consumable format than a full technical BPMN model. They have added collaboration capabilities so that users can review and provide feedback on the business processes, and the ability to model multiple process variants as multiple drill-downs from a single object. The idea is that you use Nimbus both as a place to document manual procedures that people need to perform, and as a process discovery tool for eventual automation, although the former is what Nimbus was originally designed for and seems to still be the main use case. They’ve spiffed up the UI, and will soon be offering their authoring, admin and governance functions on the web, allowing them to offer a fully web-based solution.

Nimbus uses their universal process notation (UPN) rather than BPMN for process models; King responded to a question about Nimbus supporting BPMN by stating that they do not believe that BPMN is a user-consumable format. They don’t have tooling — or at least haven’t talked about it — to convert UPN to BPMN; they’re going to need that if they want to position UPN as being for business-led process discovery as well as procedural documentation.

If you want to see the replay of this morning’s keynote, or watch tomorrow’s keynotes live or on demand, you can see them here.

BPM COE at TIBCONOW 2014

Raisa Mahomed of TIBCO presented a breakout session on best practices for building a BPM center of excellence. She started with a description of different types of COEs based on Forrester’s divisions (I’m too lazy to hack the HTML to add a table in WordPress for Android, so imagine a 2×2 quadrant with one axis being centralized versus decentralized, the other tactical, i.e., focused on cost and efficiency, versus strategic, i.e., focused on revenue and growth):

  • Center of Expertise (decentralized, strategic) – empowers business stakeholders with expert assistance, provides best practice, governance, technology that is configurable and consumable by business
  • Center of Excellence (centralized, strategic) – governs all processes in organization, enforces strict guidelines and process methodology governance, owns the BPMS, engagement models foster trust and collaboration including internal evangelists
  • Community of Practice (decentralized, tactical) – small teams, departmental priorities and scope, basic workflow capabilities, little or no governance
  • Process Factory (centralized, tactical) – optimized for process automation projects, processes as application development, frameworks

Center of Expertise and Process Factory work well together and are often seen in combination.


Best practices (these went by pretty quickly with a lot of detail on the slides, so I’ve just tried to capture some of the high points):

  • Find executive sponsorship for the COE: they must be influential across the organization, and be in the right place for the COE within your organization (e.g., COO, CIO, separate architecture group)
  • Create a governance framework – style will be based on the type(s) of COEs in use
  • Establish a methodology, which may have to accommodate different levels of BPM maturity within organization; be sure to address reusability and common components
  • Start with a core process, but relatively low complexity – this is exactly what I recommend, and I’m always frustrated by the “experts” that recommend starting with a non-core process even if the core processes are the target for implementation.
  • Encourage innovation and introduce disruptive technology.
  • Collaboration is key, via co-location and online collaboration spaces.
  • Don’t skip the metrics: remember that measuring project success is essential for future funding, as well as day-to-day operations and feeding the continuous improvement cycle.
  • Don’t let the program go stale, or become an ivory tower; rotate SMEs from the COE back into the business.
  • There’s not a single BPM skillset: you need a variety of skills spread across multiple people and roles.
  • Make a business case to provide justification for BPM projects.
  • Empower and educate through training and change management.
  • Avoid the “build it and they will come” mentality: just because you create some cool technology, that doesn’t mean that business people will stop doing the things that they’re doing to take it up.
  • Institute formal reviews of process models and solutions.

Nothing revolutionary here, but a good introduction and review of the best practices.

TIBCONOW 2014 Opening Keynote: @Gladwell and More. Much More.

San Francisco! Finally, a large vendor figured out that they really can do a 2,500-person conference here rather than Las Vegas, it just means that attendees are spread out in a number of local hotels rather than in one monster location. Feels like home.

It seems impossible that I haven’t blogged about TIBCO in so long: I know that I was at last year’s conference but was a speaker (as I am this year) so may have been distracted by that. Also, they somehow missed giving me a briefing about the upcoming ActiveMatrix BPM release, which was supposed to be relatively minor but ended up a bit bigger — I’ll be at the breakout session on that later today.

We started the first day with a marathon keynote, with TIBCO CEO Vivek Ranadive welcoming San Francisco’s mayor, Ed Lee, for a brief address about how technology is fueling San Francisco’s growth and employment, as well as helping the city government to run more effectively. The city actually has a chief data officer responsible for their open data initiatives.

Ranadive addressed the private equity buy-out of TIBCO head-on: 15 years ago, they took the company public, and by the end of this year, they will be a private company again. I think that this is a good thing, since it removes them from the pressures of quarterly public filings, which artificially impacts product announcements and sales. It allows them to make any necessary organization restructuring or divestiture without being punished on the stock market. Also, way better than being absorbed by one of the bigger tech companies, where the product lines would have to be realigned with incumbent technologies. He talked about key changes in the past years: the explosion of data; the rise of mobility; the emergence of social platforms; Asian economies; and how math is trumping science by making the “how” more important than the “why”. Wicked problems, but some wicked solutions, too. He claims that every industry will have an “Uberization”: controversies aside, companies such as Uber and AirBnB are letting service businesses flourish on a small scale using technology and social networks.

We then heard from Malcolm Gladwell — he featured Ranadive in one of his books — on technology-driven transformation, and the kinds of attitudes that make this possible. He told the story of Malcolm McLean, who created the first feasible intermodal containerized shipping in the 1950s because of his frustration with how long it took to unload his truck fleet at seaports, and how that innovation transformed the physical goods economy. In order to do this, McLean had to overcome the popular opinion that containerized shipping would fail (based on earlier failed attempts by others): as Gladwell put it, he had the three necessary characteristics of successful entrepreneurs: he was open/imaginative with creative ideas; he was conscientious and had the discipline to bring ideas to fruition including a transformation of the supply chain and sales model; and he was “disagreeable”, that is, had the resolve to pursue an idea in the face of his peers’ disapproval and ridicule. Every transformative innovation must be driven by someone with these three traits, who has the imagination to reframe the incumbent business to address unmet needs, and kill the sacred cows. Great talk.

Ranadive then invited Marc Andreessen on stage for a conversation (Andreessen thanked him for letting him “follow Malcolm freaking Gladwell on the stage”) about innovation, which Andreessen says is currently driven by mobile devices: businesses now must assume that every customer is connected 24×7 with a mobile device. This provides incredible opportunities — allowing customers to order products/services on the go — but also threats for businesses behind the curve, who will see customers comparing them to their competitors in real-time before making a purchasing decision. They discussed the future of work; Andreessen sees this as leveraging small teams, but that things need to change to make that successful, including incentives (a particular interest of mine, since I’ve been looking at incentives for collaboration amongst knowledge workers). Diversity is becoming a competitive advantage since it draws talent from a larger pool. He talked about the success rates of typical venture-funded companies, such as those that they fund: of 4,000 companies, about 15 will make it to being big companies, that is, with a revenue of $100M or more that would position them to go public; most of their profits as a VC come from those 15 companies. They fund good ideas that look like terrible ideas, because if everyone thought that these were great ideas, the big companies would already be doing them; the trick is filtering out all of the ideas that look terrible because they actually are. More important is the team: a bad team can ruin a good idea, but a great team with a bad idea can find their way to a good idea.

Next up was TIBCO’s CTO Matt Quinn talking with Box CEO Aaron Levie: Box has been innovating in the enterprise by taking the consumer cloud storage that we were accustomed to, and bringing it into the enterprise space. This not only enables internal innovation because of the drastically lower cost and simpler user experience than enterprise content solutions such as SharePoint, but also has the ability to transform the interface between businesses and their customers. Removing storage constraints is critical to supporting that explosion of data that Ranadive talked about earlier, enabling the internet of everything.

We saw a pre-recorded interview that Ranadive did with PepsiCo CEO Indra Nooyi: she discussed the requirement to perform while transforming, and the increase in transparency (and loss of privacy) as companies seek to engage with customers. She characterized a leader’s role as that of not just envisioning the future, but making that vision visible and attainable.

Mitch Barns, CEO of Nielsen (the company that measures and analyzes what people watch on TV), talked about how their business of measurement has changed as people went from watching broadcast TV at times determined by the broadcasters, to time-shifting with DVRs and consuming TV content on mobile devices on demand. They have had to shift their methods and business to accommodate this change in viewing models, and deal with a flood of data about how, when and where that consumption is occurring.

I have to confess, by this point, 2.5 hours into the keynote without a break, my attention span was not what it could have been. Or maybe these later speakers just didn’t inspire me as much as Gladwell and Andreessen.

Martin Taylor from Vista Equity Partners, the soon-to-be owners of TIBCO, spoke next about what they do and their vision for TIBCO. Taylor was at Microsoft for 14 years before joining Vista, and helps to support their focus on applying their best practices and operating platform to technology companies that they acquire. Since their start in 2000, they have spent over $14B on 140 transactions in enterprise software. He showed some of their companies; since most of these are vertical industry solutions, TIBCO is the only name on that slide that I recognized. They attempt to foster collaboration between their portfolio companies: not just sharing best practices, but doing business together where possible; I assume that this could be very good for TIBCO as a horizontal platform provider that could be leveraged by their sibling companies. The technology best practices that they apply to their companies include improved product management roadmaps that address the needs of their customers, and improved R&D practices to speed product release cycles and improve quality. They’re still working through the paperwork and regulatory issues, but are starting to work with the TIBCO internal teams to ensure a smooth transition. It doesn’t sound as if there will be any big technology leadership changes, but a continued drive into new technologies including cloud, IoT, big data and more.

Murray Rode, TIBCO’s COO, finished up the keynote talking about their Fast Data positioning: organizations are collecting a massive volume of data, but that data has a definite shelf life and degrades in value over time. In order to take advantage of short-lived opportunities where timing is everything, you have to be able to analyze and take actions on that data quickly. As he put it, big data lets you understand what’s already happened, but fast data lets you influence what’s about to happen. To do this, you need to combine analytics to define situations of interest and decisions; event processing to understand and act on real-time information; and integration (including BPM) to unify your transactional and big data sources. Rode outlined the four themes of their positioning: expanded reach, ease of consumption, compelling user journey, and faster time to value; I expect that we will see more along these themes throughout the conference.
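The “act on data before it goes stale” idea can be sketched as a sliding-window event monitor that discards aged-out events and decides at ingest time whether a situation of interest has formed. This is purely illustrative Python, not TIBCO product code; all names and thresholds are invented:

```python
from collections import deque
import time

class FastDataMonitor:
    """Illustrative sliding-window event monitor: act while the data is fresh."""

    def __init__(self, window_seconds=60, threshold=3):
        self.window = window_seconds   # data older than this has lost its value
        self.threshold = threshold     # events-per-window defining a "situation of interest"
        self.events = deque()          # (timestamp, payload) pairs

    def ingest(self, payload, now=None):
        now = time.time() if now is None else now
        self.events.append((now, payload))
        # discard stale events -- the "shelf life" of the data
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()
        # decide *now*, while the opportunity still exists
        return len(self.events) >= self.threshold

monitor = FastDataMonitor(window_seconds=60, threshold=3)
alerts = [monitor.ingest({"sku": "A1"}, now=t) for t in (0, 10, 20, 90, 95, 100)]
print(alerts)  # only the third event inside a 60-second window trips the alert
```

The key design point matches the “shelf life” framing: the window prune and the decision happen in the same ingest path, rather than in a batch job that runs after the opportunity has passed.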

All in all, a great keynote, even though it stretched to an ass-numbing three hours.

Disclosure: TIBCO is paying my expenses to be at TIBCO NOW and a speaking fee for me to be on a panel tomorrow. What I write here is my own opinion, and I am not compensated in any way for blogging.

SAP’s Bigger Picture: The AppDev Play

Although I attended some sessions related to BPM and operational process intelligence, last week’s trip to SAP TechEd && d-code 2014 gave me a bit more breathing room to look at the bigger picture — and aspirations — of SAP and their business technology offerings.

I started coming to SAPPHIRE and TechEd when SAP released a BPM product, which means that my area of interest was a tiny part of their primary focus on ERP, financials and related software solutions; most of the attendees (including the analysts and bloggers) at that time were more concerned with licensing models for their Business Suite software than new technology platforms. Fast forwarding, SAP is retooling their core software applications using HANA as an in-memory platform (cloud or on-premise) and SAP UI5/Fiori for user experience, but there’s something much bigger than that afoot: SAP is making a significant development platform play using those same technologies that are working so well for their own application refactoring. In other words, you can consider SAP’s software applications groups to be software developers who use SAP platforms and tools, but those tools are also available to external developers who are building applications completely unrelated to SAP applications.

They have some strong components: in-memory database, analytics, cloud, UI frameworks; they are also starting to push down more functionality into HANA such as some rudimentary rules and process functionality that can be leveraged by a development team that doesn’t want to add a full-fledged BRM or BPM system.
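As a rough illustration of what “pushing rules down into the database” means in practice: the rule is evaluated as a set-based query next to the data, instead of fetching rows into the application tier and looping over them. SQLite stands in here for an in-memory store such as HANA; the table, rule, and syntax are invented for illustration, not HANA APIs:

```python
import sqlite3

# SQLite as a stand-in for an in-memory data store; the point is evaluating
# the rule where the data lives rather than in application code.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, country TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 120.0, "NL"), (2, 15000.0, "DE"), (3, 9800.0, "NL")])

# The "rule" -- flag orders needing manual review -- runs as a single
# set-based query inside the database engine.
RULE = """
SELECT id FROM orders
WHERE amount > 5000 OR (country = 'NL' AND amount > 1000)
"""
flagged = [row[0] for row in conn.execute(RULE)]
print(flagged)  # [2, 3]
```

For a development team that doesn’t want a full-fledged BRM, this pattern covers a surprising number of simple decisioning cases, and it avoids shipping the data across a network boundary just to apply a predicate.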

This is definitely a shift for SAP over the past few years, and one of which most of their customers are likely unaware; the question becomes whether their application development tools are sufficiently compelling for independent software development shops to take a look.

Disclaimer: SAP paid my travel expenses to be at TechEd last week. I was not compensated for my time in any way, including writing, and the opinions here are my own.

What’s New With SAP Operational Process Intelligence

Just finishing up some notes from my trip to SAP TechEd && d-code last week with the latest on their Operational Process Intelligence product, which can pull events and data from multiple systems – including SAP’s ERP and other core enterprise systems as well as SAP BPM – and provides real-time analytics via their HANA in-memory database. I attended a session on this, then had an individual briefing later to round things out.

Big processes are becoming a thing, and if you have big processes (that is, processes that span multiple systems, and consume and emit high volumes of big data from a variety of sources), you need to have operational intelligence integrated into those processes. SAP is addressing this with their SAP Operational Process Intelligence, or what they see as a GPS for your business: a holistic view of where you are relative to your goals, the obstacles in your path, and the best way to reach your goals. It’s not just about what has happened already (traditional business intelligence), but what is happening right now (real-time analytics), what is going to happen (predictive analytics) and the ability to adjust the business process to accommodate the changing environment (sense and respond). Furthermore, it includes data and events from multiple systems, hence needs to provide scope beyond any one system’s analytics; narrow scope has been a major shortcoming of BPMS-based analytics in the past.

In a breakout session, Thomas Volmering and Harsh Jegadeesan gave an update and demo on the latest in their OPInt product. There are some new visualization features since I last saw it, plus the ability to do more with guided tasks including kicking off other processes, and trigger alerts based on KPIs. Their demo is based on a real logistics hub operation, which combines a wide variety of people, processes and systems, with the added complexity of physical goods movement.

Although rules have always been a part of their product suite, BRM is being highlighted as a more active participant in detecting conditions, then making predictions and recommendations, leveraging the ability to run rules directly in HANA: putting real-time guardrails around a business process or scenario. They also use rules to instantiate processes in BPM, such as for exception handling. This closer integration of rules is new since I last saw OPInt back at SAPPHIRE, and clearly elevates this from an analytics application to an operational intelligence platform that can sense and respond to events. Since SAP BPM has been able to use HANA as a database platform for at least a year, I assume that we will eventually see some BPM functionality (besides simple queuing) pushed down into HANA, as they have done with BRM, allowing for more predictive behavior and analytics-dependent functions such as work management to be built into BPM processes. As it is, hosting BPM on HANA allows the real-time data to be integrated directly into any other analytics, including OPInt.
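The sense-and-respond pattern described here, in which a rule watches a KPI and instantiates an exception-handling process when its condition trips, might be sketched like this. Every name below is hypothetical, invented for illustration, and bears no relation to actual SAP BRM or BPM APIs:

```python
# Hypothetical sketch of "sense and respond": a rule evaluates incoming KPI
# readings and, when a condition trips, kicks off an exception-handling
# process. All names are invented for illustration.

started_processes = []

def start_process(name, context):
    """Stand-in for instantiating a BPM process."""
    started_processes.append((name, context))

def late_shipment_rule(kpi):
    # rule condition, evaluated close to the real-time data
    if kpi["avg_delay_minutes"] > 30:
        start_process("HandleLateShipments", {"hub": kpi["hub"]})

for kpi in [{"hub": "FRA", "avg_delay_minutes": 12},
            {"hub": "AMS", "avg_delay_minutes": 47}]:
    late_shipment_rule(kpi)

print(started_processes)  # only the AMS hub trips the rule
```

The structural point, consistent with the logistics-hub demo above, is that the rule is the active component: the process instance exists only because a condition in the data demanded a response.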

OPInt provides ad hoc task management using a modern collaborative UI to define actions, tasks and participants; this is providing the primary “case management” capability now, although it’s really a somewhat simpler collaborative task management. With HANA behind the scenes, however, there is the opportunity for SAP to take this further down the road towards full case management, although the separation of this from their BPM platform may not prove to be a good thing for all of the hybrid structured/unstructured processes out there.

The creation of the underlying models looks similar to what I’ve been seeing from them for a while: the business scenario is defined as a graphical flow model (or imported from a process in Business Suite), organized into phases and milestones that will frame the visualization, and connected to the data sources; but now the rules can be identified directly on the process elements. The dashboard is automatically created, although it can be customized. In a new view (currently still in the lab), you will also be able to see the underlying process model with overlaid analytics, e.g., cost data; this seems like a perfect opportunity for a process mining/discovery visualization, although that’s more of a tool for an analyst than whoever might be monitoring a process in real-time.
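A business-scenario model of the kind described above, with phases and milestones framing the visualization and rules attached to individual process elements, could be represented as simple structured data from which a default dashboard is generated. This is a hypothetical, much-simplified representation; all field names are invented:

```python
# Hypothetical, simplified scenario model: phases and milestones frame the
# dashboard, and rules attach to individual process steps. Invented names.
scenario = {
    "name": "Logistics Hub Outbound",
    "phases": [
        {"name": "Receive", "milestone": "GoodsRegistered",
         "steps": [{"id": "scan", "rule": None}]},
        {"name": "Sort",    "milestone": "RoutedToDock",
         "steps": [{"id": "route", "rule": "dock_capacity_check"}]},
        {"name": "Ship",    "milestone": "LeftHub",
         "steps": [{"id": "load", "rule": "late_departure_alert"}]},
    ],
}

# a default dashboard can be generated directly from the model: one lane
# per phase, labelled with the milestone that closes it
lanes = [(p["name"], p["milestone"]) for p in scenario["phases"]]
print(lanes)
```

This also hints at why the dashboard can be auto-created and then customized: the phases and milestones carry enough structure to derive a sensible default layout.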

SAP TechEd Keynote with @_bgoerke

I spent yesterday getting to Las Vegas for SAP TechEd && d-code and missed last night’s keynote with Steve Lucas, but was up this morning to watch Björn Goerke — head of SAP Product & Innovation Technology — give the morning keynote on putting new technology into action. With the increasing rate of digital disruption, it’s imperative to embrace new ways of doing business, or risk becoming obsolete; this requires taking advantage of big data and real-time analytics as well as modern platforms. SAP’s current catch phrase is “Run Simple”, based in part on the idea of “one truth”, that is, one place for all your data so that you have a real-time view of your business rather than relying on separate sources for operations and analytics. You can’t run — and respond — at the speed that business requires if your analytics are based on yesterday’s transactions.

SAP HANA — their in-memory data store — allows for real-time analytics directly on operational transaction data, events, IoT machine data, social media data and more, all in a single data store. With the release of SAP HANA SPS09, they are adding support for dynamic tiering, streaming, enterprise information management, graphing, Hadoop user-defined functions, and multi-tenancy; these improve the management capabilities as well as the functionality. SAP deploys all of their business software solutions on HANA (although some more traditional databases are still supported in some products) with the goal of providing the basis for the “one truth” within business data.

Goerke was joined on stage by a representative from Alliander, an energy distribution company based in the Netherlands, who demonstrated a HANA-based analytical dashboard based on geographic data that reduces the time required for geospatial queries — such as filtering by pipelines that are within a certain distance from buildings — from hours using more traditional database technology, to seconds with HANA. Geospatial data is one of the areas where in-memory data and analytics can really make a difference in terms of performance; I did a lot of my early-career software development on geospatial data, and there are some tough problems here that are not easily addressed by more traditional tools.
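To see why such queries are expensive, consider the naive form of the “pipelines within a given distance of buildings” filter: a pairwise scan over every pipeline-building combination, each requiring a great-circle distance calculation. The sketch below uses invented toy coordinates and a brute-force scan; real systems use spatial indexes, but the pairwise cost is what in-memory execution attacks:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# toy data: pipeline segment midpoints and buildings as (lat, lon); coordinates invented
pipelines = {"P1": (52.37, 4.89), "P2": (53.20, 6.57)}
buildings = [(52.36, 4.90), (51.92, 4.48)]

# the naive query is a pairwise scan -- O(pipelines x buildings) distance
# computations -- which is why it crawls on disk-based stores at utility scale
near = [pid for pid, p in pipelines.items()
        if any(haversine_km(*p, *b) < 2.0 for b in buildings)]
print(near)  # ['P1']
```

At the scale of a national distribution network (millions of segments and buildings), that multiplication is exactly the workload that drops from hours to seconds when both the data and the computation live in memory.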

Another part of the simplicity message is “one experience” via the SAPUI5-based Fiori, providing for a more unified experience between desktop and mobile, including management and distribution of mobile apps. They’ve added offline capabilities for their mobile apps – a capability widely ignored or dismissed as “unimportant” by developers who live and work only in areas blanketed in 4G and WiFi coverage, but critical in many real-world applications. Goerke demonstrated using some of the application development services — with some “help” from Ian Kimbell — to define an API, use it to create a mobile app, deploy it to a company app store, then install and run it: not something that most executives do live on stage at a keynote.

SAP now has a number of partnerships with hardware and infrastructure vendors to optimize their gear for SAP and especially for HANA: last week we saw an announcement about SAP running on the IBM cloud, and today we heard about how sgi is taking their well-known computational hardware capabilities and applying them to running transactional platforms such as SAP. SAP has also partnered with small software development shops to deliver the innovations in HANA-based applications needed to drive this forward. Applications developed on HANA can run on premise or in SAP’s managed cloud (and now IBM’s managed cloud), where they manage HANA and the SAP applications including Business Suite and Business Warehouse. Through a number of strategic acquisitions, SAP has much more than just your ERP and financials, however: they offer solutions for HR management, procurement, e-commerce, customer engagement and more. They also offer a rich set of development tools and application services for software development unrelated to SAP applications, allowing for applications built and deployed on HANA with modern mobile user interfaces and collaboration. In keeping with Goerke’s Star Trek theme in the keynote, very Borg-like. :-)

Lots more here than I could possibly capture; you can watch the keynotes and other presentations online at SAP TechEd online.