Sal Vella on Technologies for a Smarter Planet at CASCON2011

I attended the keynote at IBM’s CASCON conference in Toronto today, where Judy Huber, who directs the IBM Canada software lab, kicked off the session by reminding us that IBM software development has been happening in Canada since 1967 and continues to grow, and of the importance of collaboration between the research and industry communities. She introduced Joanna Ng, who is the head of research at the lab, to congratulate the winners of the most influential paper from CASCON 2001 (that date is not a typo, it’s a 10-year thing): Svetlana Kiritchenko and Stan Matwin for “Classification with Co-Training” (on email classification).

The main speaker was Sal Vella, VP of architecture and technology within the IBM software group, talking about technologies to build solutions for a smarter planet. Fresh from the IOD conference two weeks ago, I was all primed for this; there was a great booth at IOD that highlighted “smarter technology” with some interesting case studies. IBM’s smarter planet initiative is about technologies that allow us to do things that we were never able to do before, much of which is based on the immeasurable volume of data constantly produced by people, devices and systems. Consider electricity meters, like the one that you have in your home: it used to be that these were read once per month (if you were lucky) by a human, and the results entered into a billing system. Now, smart meters are read every 15 minutes to allow for time of use billing that rewards people for shifting their electricity usage away from peak periods. Analytics are being used in ways that they were never used before, and he discussed the popular Moneyball case of building a sports team based on player statistics. He also spoke about an even better use of analytics to create “solutions for a partying planet”: a drinks supplier predicting sports games outcomes to ensure that the pubs frequented by the fans of the teams most likely to win had enough alcohol on hand to cover the ensuing parties. Now that’s technology used for the greater good. ;)
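As a rough illustration of why the 15-minute granularity matters: with interval readings you can price each reading by when the power was consumed, which a single monthly total can’t support. The sketch below is mine, not anything Sal presented; the rates and time-of-use bands are invented for illustration, not any utility’s actual tariff.

```python
from datetime import datetime

# Hypothetical time-of-use rates (per kWh); real tariffs vary by utility.
RATES = {"peak": 0.18, "mid": 0.13, "off_peak": 0.09}

def rate_period(ts: datetime) -> str:
    """Classify a reading's timestamp into a billing period (illustrative bands)."""
    if ts.weekday() >= 5:                      # weekends are off-peak
        return "off_peak"
    if 11 <= ts.hour < 17:                     # weekday midday: peak
        return "peak"
    if 7 <= ts.hour < 11 or 17 <= ts.hour < 19:
        return "mid"
    return "off_peak"

def tou_bill(readings):
    """readings: iterable of (timestamp, kwh) pairs, one per 15-minute interval."""
    return sum(kwh * RATES[rate_period(ts)] for ts, kwh in readings)

# Example: one weekday of flat 0.25 kWh per 15-minute interval
sample = [(datetime(2011, 11, 7, h, m), 0.25) for h in range(24) for m in (0, 15, 30, 45)]
print(f"${tou_bill(sample):.2f}")
```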

There are a lot of big data and analytics problems that were previously unmanageable but are now becoming reasonable targets, most of which could be considered event-based: device instrumentation, weather data, social media, credit card transactions, crime statistics, traffic data and more. There are also some interesting problems in determining identity and relationships: figuring out who people really are even when they use different versions of their name, and who they are connected to in a variety of different ways that might indicate potential for fraud or other misrepresentation. Scary and big-brotherish to some, but undeniably providing organizations (including governments) with deeper insights into their customers and constituents. If those who complain about governments using this sort of technology “against” them would learn how to use it themselves, the tables might be turned as we gain insight into how well government is providing services to us.

We heard briefly from Charles Gauthier, acting director at the institute for information technology at National Research Council (NRC) Canada. NRC helped to create the CASCON conference 21 years ago and continues to sponsor it; it supports research in a number of areas that overlap with CAS and the other researchers and exhibitors presenting here.

The program chairs, Marin Litoiu of York University and Eleni Stroulia of University of Alberta, presented awards for the two outstanding papers from the 22 papers at the conference:

  • “Enhancing Applications Robustness in Cloud Data Centres” by Madalin Mihailescu, Andres Rodriguez and Cristiana Amza of University of Toronto, and Dmitrijs Palcikovs, Gabriel Iszlai, Andrew Trossman and Joanna Ng of IBM Canada
  • “Parallel Data Cubes on Multi-Core Processors with Multiple Disks” for best student paper, by Hamidreza Zaboli and Frank Dehne of Carleton University

We finished with a presentation by Stan Matwin of University of Ottawa, co-author of the most influential paper on email classification from the CASCON of 10 years past (his co-author is due to give birth on Wednesday, so she decided not to attend). It was an interesting look at how the issue of email classification has continued to grow in the past 10 years; systems have become smarter since then, and we have automated spam filtering as well as systems for suggesting actions to take (or even taking actions without human input) on a specific message. The email classification that they discussed in their paper was based on classification systems where multiple training sets were used in concert to provide an overall classification for email messages. For example, two messages might both use the word “meeting” and a specific time in the subject line, but one might include a conference room reference in the body while the other references the local pub. Now, I often have business meetings in the pub, but I understand that many people do not, so I can see the value of such a co-training method. In 2001, they came to the conclusion that co-training can be useful, but is quite sensitive to its parameters and the learning algorithms used. Email classification has progressed since then: Bayesian (and other) classifiers have improved drastically; data representation is richer (through the use of meta formats and domain-specific enrichment) to allow for easier classification; social network and other information can be correlated; and there are specific tailored solutions for some email classification applications such as legal discovery. Interesting to see this sort of perspective on a landmark paper in the field of email classification.
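For anyone who hasn’t run into co-training before, here’s a minimal sketch of the idea: two classifiers, one looking only at subject lines and one only at message bodies, bootstrap each other from a small labeled set by promoting the unlabeled messages they are most confident about. This is my own toy reconstruction of the technique, not the authors’ code; the data layout, parameters and scikit-learn choices are all assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

def co_train(subjects, bodies, labels, n_labeled=20, rounds=10, per_round=5):
    """labels: numpy array of 0/1; only the first n_labeled are treated as known."""
    Xs = CountVectorizer().fit_transform(subjects)   # view 1: subject line
    Xb = CountVectorizer().fit_transform(bodies)     # view 2: message body
    labeled = list(range(n_labeled))
    unlabeled = list(range(n_labeled, len(labels)))
    y = labels.copy()

    for _ in range(rounds):
        if not unlabeled:
            break
        clf_s = MultinomialNB().fit(Xs[labeled], y[labeled])
        clf_b = MultinomialNB().fit(Xb[labeled], y[labeled])
        # Each view's classifier promotes the unlabeled messages it is most sure
        # about, labeling them for use by *both* classifiers on the next round.
        for clf, X in ((clf_s, Xs), (clf_b, Xb)):
            if not unlabeled:
                break
            probs = clf.predict_proba(X[unlabeled]).max(axis=1)
            best = np.argsort(probs)[-per_round:]
            for i in sorted(best, reverse=True):
                idx = unlabeled.pop(i)
                y[idx] = clf.predict(X[idx])[0]
                labeled.append(idx)
    return clf_s, clf_b    # in practice, retrain both on all labeled data at the end
```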

I’m not sticking around for any of the paper presentations, since the ones later today are a bit out of my area of interest, and I’m booked the rest of the week on other work. However, I have the proceedings so will have a chance to look over the papers.

What Analysts Need to Understand About Business Events

Paul Vincent, CTO of Business Rules and CEP at TIBCO (and possibly the only person at Building Business Capability sporting a bow tie), presented a less technical view of events than you would normally see in one of his presentations, intended to help the business analysts here at Building Business Capability understand what events are, how they impact business processes, and how to model them. He started with a basic definition of events – an observation, a change in state, or a message – and why we should care about them. I cover events in the context of processes in many of the presentations that I give (including the BPM in EA tutorial that I did here on Monday), and his message is the same: life is event-driven, and our business processes need to learn to deal with that fact. Events are one of the fundamentals of business and business systems, but many systems do not handle external events well. Furthermore, many process analysts don’t understand events or how to model them, and can end up creating massive spaghetti process models to try and capture the result of events since they don’t understand how to model events explicitly.

He went through several different model types that allow for events to be captured and modeled explicitly, and compared the pros and cons of each: state models, event process chain models, resources events agents (REA) models, and BPMN models. The BPMN model is the only one that really models events in the context of business processes, and relates events as drivers of process tasks, but is really only appropriate for fairly structured processes. It does, however, allow for modeling 63 different types of events, meaning that there’s probably nothing that can happen that can’t be modeled by a BPMN event. The heavy use of events in BPMN models can make sense for heavily automated processes, and can make the process models much more succinct. Once the event notation is understood, it’s fairly easy to trace through them, but events are the one thing in BPMN that probably won’t be immediately obvious to the novice process analyst.

In many cases, individual events are not the interesting part, but rather a correlation between many events; for example, fraud events may be detected only after many small related transactions have occurred. This is the heart of complex event processing (CEP), which can be applied to a wide variety of business situations that rely on large volumes of events, and is distinguished from simple process patterns and business rules that can be applied to individual transactions.
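As a toy illustration of that kind of correlation (and not how TIBCO or any particular CEP engine expresses it), here’s a sliding-window check that stays quiet for any single transaction and only fires when several small ones pile up on the same card within a short period; all of the thresholds are invented.

```python
from collections import defaultdict, deque

# Hypothetical rule: flag a card if it sees 5+ transactions under $20
# within a 10-minute window. Thresholds are made up for illustration.
WINDOW_SECS, MIN_COUNT, SMALL_AMOUNT = 600, 5, 20.00

windows = defaultdict(deque)  # card_id -> timestamps of recent small transactions

def on_transaction(card_id, amount, ts):
    """Process one event; return True if the correlated 'fraud' pattern fires."""
    if amount >= SMALL_AMOUNT:
        return False
    w = windows[card_id]
    w.append(ts)
    while w and ts - w[0] > WINDOW_SECS:   # drop events outside the window
        w.popleft()
    return len(w) >= MIN_COUNT

# Example stream: six $5 charges on one card, one per minute
for i in range(6):
    if on_transaction("card-123", 5.00, i * 60):
        print(f"possible fraud after transaction {i + 1}")
```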

Looking at events from an analyst’s view, it’s necessary to identify actors and roles, just as in most use cases, then identify what they do and (more importantly) when they do it in order to drive out the events, their sources and destinations. Events can be classified as positive (e.g., something that you are expecting to happen actually happened), negative (e.g., something that you are expecting to happen didn’t happen within a specific time interval) or sets (e.g., the percentage of a particular type of event is exceeding an SLA). In many cases, the more complex events that we start to see in sets are the ones that you’re really interested in from a business standpoint: fraud, missed SLAs, gradual equipment failure, or customer churn.

He presented the EPTS event reference architecture for complex events, then discussed how the different components are developed during analysis (a toy walkthrough of the full pipeline follows the list):

  • Event production and consumption, namely, where events come from and where they go
  • Event preparation, or what selection operations need to be performed to extract the events, such as monitoring, identification and filtering
  • Event analysis, or the computations that need to be performed on the individual events
  • Complex event detection, that is, the event correlations and patterns that need to be performed in order to determine if the complex event of interest has occurred
  • Event reaction, or what event actions need to be performed in reaction to the detected complex event; this can overlap to some degree with predictive analytics in order to predict and learn the appropriate reactions
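To make those components a bit more concrete, here’s a toy pass through the same five stages using temperature readings as the raw events; the stage names follow the reference architecture, but the event shapes, thresholds and the “sustained overheat” pattern are my own invention.

```python
# Toy end-to-end pass through the five components above; everything
# other than the stage names is invented for illustration.

def produce():                      # event production: raw readings from a source
    return [{"sensor": "pump-7", "temp": t} for t in (61, 72, 78, 81, 84)]

def prepare(events):                # event preparation: select/filter what matters
    return [e for e in events if e["sensor"] == "pump-7"]

def analyze(events):                # event analysis: per-event computation
    return [dict(e, over_limit=e["temp"] > 75) for e in events]

def detect(events, run=3):          # complex event detection: a pattern across events
    streak = 0
    for e in events:
        streak = streak + 1 if e["over_limit"] else 0
        if streak >= run:
            return {"complex_event": "sustained_overheat", "sensor": e["sensor"]}
    return None

def react(complex_event):           # event reaction: act on the detected situation
    if complex_event:
        print(f"dispatch maintenance: {complex_event}")

react(detect(analyze(prepare(produce()))))
```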

He discussed event dependencies models, which show event orderings, and relate events together as meaningful facts that can then be used in rules. Although not a common practice, this model type does show relationships between events as well as linking to business rules.

He finished with some customer case studies that include CEP and event decision-making: FedEx achieving zero latency in determining where a package is right now; and Allstate using CEP to adjust their rules on a daily basis, resulting in a 15% increase in closing rates.

A final thought that he left us with: we want agile processes and agile decisions; process changes and rule changes are just events. Analyzing business events is good, but exploiting business events is even better.

Accepting The Business Architecture Challenge with @logicalleap

Forrester analyst Jeff Scott presented at Building Business Capability on what business architecture is and what business architects do. According to their current research, interest in business architecture is very high – more than half of organizations consider it “very important”, and all organizations surveyed showed some interest – and more than half also have an active business architecture initiative. This hasn’t changed all that much since their last survey on this in 2008, although the numbers have crept up slightly. Surprisingly, less than half of the business architecture activities are on an enterprise-wide level, although if you combine that with those that have business architecture spanning multiple lines of business, it hits about 85%. When you look at where these organizations plan to take their business architecture programs, over 80% are planning for them to be enterprise-wide, but that hasn’t changed in 3 years, meaning that although the intention is there, it may not actually be happening with any speed.

He moved on to a definition of business architecture, and how it has changed in the past 15 years. In the past, it used to be more like business analysis and requirements, but now it’s considered an initiative (either by business, EA or IT) to improve business performance and business/IT alignment. The problem is, in my opinion, that the term “business/IT alignment” has become a bit meaningless in the past few years as every vendor uses it in their marketing literature. Process models are considered a part of business architecture by a large majority of organizations with a business architecture initiative, as are business capability models and business strategy, application portfolio assessments, organizational models and even IT strategy.

Business architecture has become the hot new professional area to get into, whether you’re a business analyst or an enterprise architect, which means that it’s necessary to have a better common understanding of what business architecture actually is and who the business architects are. I’m moderating a panel on this topic with three business/IT/enterprise architects today at 4:30pm, and plan to explore this further with them. Scott showed their research on what people did before they became (or labeled themselves as) business architects: most held a business analyst role, although many also were enterprise architects, application architects and other positions. Less than half of the business architects are doing it full time, so they may still be fulfilling some of those other roles in addition. Many of them are resident in the EA group, and more than half of organizations consider EA to be responsible for the outcomes of business architecture.

Figuring out what business architects do involves a complex set of factors: some of them are working on enterprise-wide business transformation, while others are looking at efficiency within a business unit or project. The background of the business architect – that is, what they did before they became a business architect – can hugely impact (obviously) the scope and focus of their work as a business architect. In fact, is business architecture a function performed by many players, or is it a distinct role? Who is really involved in business architecture besides those who hold the title, and where do they fit in the organization? As Scott pointed out, these are unanswered questions that will plague business architecture for a few years still to come.

He presented several shifts to make in thinking:

  • Give up your old paradigms (think different; act different to get different results)
  • Start with “why” before you settle on the how and what
  • “Should” shouldn’t matter when mapping from “what is” to “what can be”
  • Exploration, not standardization, since enterprise architecture is still undergoing innovation on its way to maturity
  • Business architecture, not technology architecture, is what provides insight, risk management and leadership (rather than engineering, knowledge and management)
  • Stress on “business” in business architecture, not “architecture”, which may not fit into the EA frameworks that are more focused on knowledge
  • Focus on opportunity rather than efficiency, which is aligned with the shift in focus for BPM benefits that I’ve been seeing in the past few years
  • Complex problems need different solutions, including looking at the problems in context rather than just functional decomposition
  • Solve the hard “soft” problems of building business architect skills and credibility, leveraging local successes to gain support and sponsorship, and overcoming resistance to change
  • Think like the business before applying architectural thinking to the business problems and processes

He finished up with encouragement to become more business savvy: not just the details of business, but innovation and strategy. This can be done via some good reading resources, finding a business mentor and building relationships, while keeping in mind that business architecture should be an approach to clarify and illuminate the organization’s business model.

He wrote a blog post on some of the challenges facing business architects back in July; it’s definitely worth a read as well.

Building Business Capability Keynote with @Ronald_G_Ross, @KathleenBarret and @RogerBurlton

After a short (and entertaining) introduction by Gladys Lam, we heard the opening keynote with conference chairs Ron Ross, Kathleen Barret and Roger Burlton. These three come from the three primary areas covered by this conference – business rules, business analysis and business process – and we heard about what attendees can expect to learn about and take away from the conference:

  • The challenge of business agility, which can be improved through the use of explicit and external business rules, instead of hard-coding rules into applications and documents. Making rules explicit also allows the knowledge within those rules to be more explicitly viewed and managed.
  • The need to think differently and use new solutions to solve today’s problems, and development of a new vocabulary to describe these problems and solutions.
  • You need to rewire the house while the lights are on, that is, you can’t stop your business while you take the time to improve it, but need to ensure that current operations are maintained in the interim.
  • Business rules need to be managed in a business sense, including traceability, in order to become a key business capability. They also need to be defined declaratively, independent from the business processes in which they might be involved.
  • Process and rules are the two key tools that should be in every business analyst’s toolkit: it’s not enough just to analyze the business, but you must be looking at how the identification and management of process and rules can improve the business.

The key message from all three of the chairs is that the cross-pollination between process, rules, analysis and architecture is essential in order to identify, manage and take advantage of the capabilities of your business. There is a lot of synergy between all of these areas, so don’t just stick with your area of expertise, but check out sessions in other tracks as well. We were encouraged to step up to a more business-oriented view of solving business problems, rather than just thinking about software and systems.

I’m adding the sessions that I attend to the Lanyrd site that I created for the conference, and linking my blog posts, presentations, etc. in the “coverage” area for each session. If you’re attending or presenting at a session, add it on Lanyrd so that others can socialize around it.

I’m moderating two panels during the remainder of the conference: today at 4:30pm is a BPM vendor panel on challenges in BPM adoption, then tomorrow at 4:30pm is a panel on business architecture versus IT architecture.

Tracking Your Conference Social Buzz

I’m pretty active on social media: primarily, I blog and tweet, but I also participate in Foursquare, Facebook and, recently, the social conference site Lanyrd. When I was preparing for this week’s Building Business Capability conference in Fort Lauderdale, I added the sessions that I’ll be giving and a few others to the Lanyrd site that I created for the conference, and encouraged others to do the same. Just to explain, this isn’t the official BBC site, but a shadow crowd-sourced site that allows people to socialize their participation in the conference: think of it as a wiki for the conference, including some structured data that makes it more than just plain text. Logging in via your Twitter account, you can create a session (or a whole conference), add speakers to it, indicate that you’re attending or speaking at the conference, and add links to any coverage (blog posts, slides, video, etc.).

For someone tracking the conference remotely, or attending but unable to attend all of the sessions, this is a great way to find information about the sessions that is just too fast-moving to expect the conference organizers to add to the official site. If you’re at BBC, or tracking it from your desk at home, I recommend that you check out the Lanyrd site for BBC, add any sessions that you’re attending or presenting that are missing (I only added a dozen or so, so feel free to go wild there), and link in any coverage of the conference or sessions that you read about on blogs or other sites.

I’ve been using Lanyrd for about a year, sometimes just to add conferences that I know are happening, but also to add myself to ones created by others, as a speaker, participant or just a tracker. There’s also a Lanyrd iPhone app that downloads all of this to your phone. Although BBC has a mobile site, it’s slow to load and doesn’t have a lot of the social features that you’ll find in the Lanyrd app, or the ability to save details offline.

I also had an interesting social interaction about my hotel room here at the Westin Diplomat, where the conference is being held. I checked in just after noon yesterday and arrived at the room to find it was nestled right beside a very noisy mechanical room, and looked out directly at several large air conditioning units about 10 feet away on the roof of the adjacent structure. It sounded like I was in the engine room of a ship. Unable to raise the front desk by phone, I went back down, and spent 20 minutes waiting for service. Fuming slightly, I tweeted, and ended up in a conversation with the Starwood hotels Twitter presence, StarwoodBuzz, which responded almost immediately to my mention of a Westin property. The second room was beside the elevator shaft so still a bit noisy, but tolerable; however, when I returned from dinner around 10pm, the carpet was flooded from a leaking windowpane due to the torrential rain that we had all evening, and another room change was required.

The hotel responded appropriately, for the most part (the service for the first room change could have been a bit better, and I expected a really quiet room after complaining about noise in the first room), but the real surprise was the near-immediate feedback and constant care provided by the nameless person/people at StarwoodBuzz, which you can see in the Bettween widget below:

[ Update: Unfortunately, Bettween went offline, and I didn’t capture a screen shot of the conversation. 🙁 I went back and faked it by favoriting all of the tweets in the conversation, then taking a screen snap.]

This is an excellent example of how some companies monitor the social conversation about their brands, and respond in a timely and helpful manner. Kudos to Starwood for putting this service in place. This is also a good example of why you should tweet using your real name (assuming that you’re not in a situation where that would be harmful to your person): StarwoodBuzz was able to notify the hotel management of my predicament. It’s possible that by showing that I’m a real person, rather than a whiner complaining about their hotel while hiding behind a pseudonym, they were able to better address the problem.

The really funny thing is that everyone who I’ve run into at the conference so far said that they saw my original tweet, and wanted to know what happened with my room. Now they can watch it live on Twitter.

Ramping up for BBC2011

I’m getting ready to head for Fort Lauderdale for my last conference (and, I hope, flight) of the year: Building Business Capability. This conference grew out of the Business Rules Forum when it added tracks for business process, business analysis and business architecture, so technically it includes Business Rules Forum, Business Analysis Forum, Business Architecture Summit and Business Process Forum, but there’s so much overlap in interest that it’s fair to say that few people stick just to one track at this event.

I have a couple of spots in the conference this week, starting on Monday morning when I am giving a tutorial on aligning BPM and enterprise architecture, similar to that which I gave at the IRM BPM conference in London in June. It’s October 31st so Halloween costumes are optional, but I will give a prize for the best one worn by a tutorial attendee.

On Tuesday afternoon, I’m moderating a BPM vendor panel focused on BPM adoption issues from the vendors’ point of view. I’m a bit late with my plug for this since there was some confusion about who was actually picking the panelists (as I found out a few days ago, it was me), but I’ve assembled a stellar lineup:

  • Jesse Shiah, Founder and CEO at AgilePoint. I first met Jesse back at the BPM Think Tank in 2007, when his company was still called Ascentn; since then, they’ve changed it to something that we can all pronounce while they work at turning Microsoft’s Visio and Visual Studio into real BPM tools.
  • Mihnea Galeteanu, Chief Storyteller for Blueworks Live at IBM. Besides having the coolest job title, Mihnea and I both live in Toronto, so have the advantage of being able to really put “social” into BPM by meeting for coffee to discuss how IBM is making BPM social with Blueworks Live. Yes, I make him pay for the coffee.
  • Jeremy Westerman, Senior Product Marketing Manager for BPM at TIBCO. Part of TIBCO’s “British invasion”, Jeremy and I have a long history of me asking him about what’s coming up in their product releases (such as “how’s that AMX/BPM to tibbr integration coming along?”), and him trying to say things that won’t get him in trouble with TIBCO’s legal department. Obviously, he’s a big fan of my “everything is off the record after the bar opens” rule.
  • Thomas Olbrich, Cofounder and Managing Director at taraneon. Unlike the other vendors on the panel which provide implementation tools, taraneon provides a process test facility for process quality, meaning that they have the best process horror stories of all. Thomas is the only one of the panelists who I haven’t met face-to-face before now, although I feel like I know him because of our lengthy Twitter exchanges, only some of which are about shoes.

There’s also a rumor that I’m moderating a panel on Wednesday afternoon on business architecture versus technology architecture, although I have yet to hear any details about it.

If you’re interested in trying out a social conference site, you can find BBC on Lanyrd, where you can indicate that you’re attending, speaking at, or just tracking the conference, as well as adding any sessions that you’re interested in. Note that this is an independent crowdsourced social conference site, not an official site of the BBC conference. You can also follow the conference on Twitter at #bbccon11.

DemoCamp Toronto 30 Demos

On October 12th, I attended the 30th edition of Toronto DemoCamp, and saw four demos from local startups.

Upverter is an online electronic design tool, using HTML5, Javascript and Google libraries to provide a drawing canvas for electrical engineers. With about 40,000 lines of code, it provides pretty complex functionality, and they are hoping to displace $100K enterprise tools. They are seeing some enterprise adoption, but are pushing in the university and college space to provide free tools for EE students doing circuit design, who presumably will then take that knowledge into their future places of employment. They have designed realtime collaboration into it, which will be released in the next few weeks, and already allow for some collaboration and reuse of common components. They also integrate with manufacturers and distributors, providing both component catalogues as input to the design, and “print to order/make” on the completion of the design.

Vidyard is a video player for corporate websites, intended to avoid the problems of native YouTube embedding, including that of corporate networks that block YouTube content. They provide customization of the video player, SEO and analytics, including analytics from the cross-posted video on YouTube. For me, the most interesting part was that they built this in 16 weeks, and fully embraced the idea that if you’re a startup, you can do it faster.

Blu Trumpet is an advertising platform based on application discovery, providing an SDK for an app explorer to be embedded in a publisher’s app to display a list of “related” or partner apps, and redirect to the App Store.

Maide Control was the most exciting demo for me that evening, mostly because it turned my preconceived notion of how a gadget is supposed to be used on its head: they allow you to use your iPad as an input controller for 3D navigation, rather than for consumption of information. In other words, you don’t see the model on your iPad, you see it on the native application on your computer, while your iPad is the touch-based input device that does gesture recognition and translates it to the application.

That’s not to say that you’ll give up using your iPad for consumption, but that you’ll extend your use of it by providing a completely new mode of functionality during an activity (navigating a 3D space such as a building model) when your iPad is probably currently languishing in a drawer. They gave a demo of using an iPad to navigate a 3D city model on SketchUp, taking full advantage of multi-touch capabilities to zoom and reorient the model. When I saw this, I immediately thought of Ross Brown and his 3D process models (BPMVE); even for 2D models, the idea of a handheld touchpad for navigating a model while it’s being displayed during a group presentation is definitely compelling. Add the ability for multiple iPads to interface simultaneously, and you have a recipe for in-person group model collaboration that could be awesome.

They also showed the ability to use the iPad and a mouse simultaneously, one controlling the view and the other drawing; for impatient, ambidextrous people like me, that’s a dream come true. They have to build interfaces to each specific application, such as what they have already done with SketchUp, but I can imagine a huge market for this with Autodesk’s products, and a somewhat smaller market for 2D Visio model manipulation.

Disappointingly, Kobo didn’t show in spite of being on the schedule; it was probably just a week too early to give us a sneak peek at their new gadget.

IBM IOD Keynote Day 3: New Possibilities (When They’re Not Blacked Out)

So there I was, in my hotel room – where the wifi actually works – watching the IOD keynote online, when the video went offline during the Michael Lewis/Billy Beane talk.

I understand (now) that there are copyright issues around broadcasting Michael Lewis and Billy Beane talking about how analytics are used in baseball, but it would have been great to know that in advance: I might have made the long walk down to the crowded, noisy, wifi-challenged events center to watch it in person. Instead, I’m hanging out, hoping for a speedy return of the video feed, and really not knowing if it’s coming back at all. Kind of like a scheduled system outage that your sys admin forgot to tell everyone about.

I’m headed for the airport shortly, so this was my last (and somewhat unsatisfying) session from IOD 2011. Regardless, there is definitely good content at IOD, a great conference for customers, partners and industry watchers alike. I also had the chance to meet up with many of my old FileNet colleagues (where I worked in 2000-2001 as the eProcess evangelist, in what I usually refer to as the longest 16 months of my life), some of whom are still at IBM following the 2006 acquisition, and some of whom are now at IBM business partners.

My major disappointment, this morning’s keynote blackout aside, was the cancellation of the 1:1 interviews with ECM executives that were scheduled. I think that being here under the blogger program (which designates me as “press”) rather than the analyst program (which is how I attend the IBM Impact conference, and most other vendor conferences) somehow has me seen as being less influential, although obviously my output and take-aways for my clients are identical either way.

IBM Case Manager Product Update

The nice thing about IBM Case Manager (shortened to ICM in some of their material, and ACM in others) being so new is that you can show up late to the technical product briefing and not miss anything, since the product managers spend the first 10-15 minutes re-explaining what case management and ICM are to the crowd of legacy FileNet customers. (Yes, it’s been a long day.)

This session with Dave Yockelson and Jake Lavirne discussed some of the customers that they have gained since last year’s initial product release, including banking, insurance, government and energy industry examples. They listed the integrated/bundled products that make up ICM (CM, BPM, ILOG, etc.) plus those things created specifically for ICM (case object model, task object model, case analytics) and the ease with which it is used as a framework for solution construction.

The upcoming release, v5.1, will be available within the next month or so, and includes a number of new features based on feedback from the early customers:

  • Enhanced case design, including improved data integration, enhanced widget customization, solution templates, and separate solution project areas. Specifically, the data integration framework allows data from a third-party system of record to be used directly in the ICM UI or as case metadata.
  • Direct IBM CM8 integration, with the CM8 documents staying in CM8 without requiring repository federation. This means that CM8 content can initiate cases and launch tasks, as well as being used natively in tasks, completely transparently to the case worker.
  • Improved case worker user experience, including integration of IBM Forms (in addition to the existing support for FileNet eForms) in the ICM UI for adding cases, adding tasks, or viewing task details. This provides a relatively easy way to replace the standard UI with a richer forms-based interface for the case worker. There will also be a simplified UI layout, resizing and custom theming, and the ability to email and share direct links to a case. A case can also be split to multiple cases.
  • Improved support for IBM BPM, including tighter design-time integration, universal inbox, and support for Business Space.

The session wrapped up with a review of some of the vertical applications built on ICM by partners or GBS. There are a number of IBM partners working on ICM applications; I’m sure that a lot of partners weren’t thrilled to find out that IBM had essentially made much of their custom work obsolete, but this does provide an opportunity for partners to build vertical solutions much more quickly based on the ICM framework.

IBM IOD Day 2 Opening Keynote: Transformation in the Era of Big Data and Analytics

Today’s morning keynote kicked off with Steve Mills talking about big data – “as if data weren’t big before”, he joked – and highlighted that the real challenge is not necessarily the volume of data, but what we need to do in order to make use of that data. A huge application for this is customer service and sentiment analysis: figuring out what your customers are saying to you (and about you), and using that to figure out how to deliver better service. Another significant application area is that of the smarter planet: sensing and responding to events triggered by instrumentation and physical devices. He discussed a number of customer examples, pointing out that no two situations are the same and that a variety of technologies are required, but there are reusable patterns across industries.

Doug Hunt was up next to talk about content analytics – another type of big data – and the impact on transforming business processes. He introduced Randy Sumrall, CIO of Education Service Center Region 10 (State of Texas), to talk about the impact of technology on education and the “no child left behind” policy. New technology can be overwhelming for teachers, who are often required to select what technologies are to be used without sufficient information or skills to do so; there need to be better ways to empower the educator directly rather than just having information available at the administrative level. For example, they’ve developed an “early dropout warning” tool to be used by teachers, analyzing a variety of factors in order to alert the teachers about students who are at risk of dropping out of school. The idea is to create tools for completely customized learning for each student, covering assessment, design and delivery; this is more classical BI than big data. Some interesting solutions, but as some people pointed out on the Twitter stream, there’s a whole political and cultural element to education as well. Just as some doctors will resist diagnostic assistance from analytics, so too will some teachers resist student assessments based on analytics rather than their own judgment.

Next was Frank Kern to talk about organizations’ urgency to transform their businesses, for competitive differentiation but also for basic survival in today’s fast-moving, social, data-driven world. According to a recent MIT Sloan study, 60% of organizations are differentiating based on analytics, and outperform their competitors by 220%. It’s all about speed, risk and customers; much of the success is based on making decisions and taking actions in an automated fashion, based on the right analysis of the right data.

Part of IBM’s future in big data analytics is Watson, and Manoj Saxena presented on how Watson is being applied to healthcare – being demonstrated at IOD – as well as future applications in financial services and other industries. In healthcare, consider that medical information is doubling every five years, and about 20% of diagnoses in the US have some sort of preventable error. Using Watson as a diagnostic tool puts all healthcare information into the mix, not just what your doctor has learned (and remembers). Watson understands human speech, including puns, metaphors and other colloquial speech; it generates hypotheses based on the information that it absorbs; then it understands and learns from how the system is used. A medical diagnosis, then, can include information about symptoms and diseases, patient healthcare and treatment history, family healthcare history, and even patient lifestyle and travel choices to detect those nasty tropical bugs that your North American doctor is unlikely to know about. Watson’s not going to replace your doctor, but will provide decision support during diagnosis and treatment.

Dr. Carolyn McGregor of UOIT was interviewed about big data for capturing health informatics, particularly the flood of information generated by the instrumentation hooked up to premature babies in NICU: some medical devices generating several thousand readings per second. Most of these devices may have a couple of days of memory to store the measurements; after that, the data is lost if not captured into some external system. Being able to analyze patterns over several days’ data can detect problems as they are forming, allowing for early preventative measures to be taken: saving lives and reducing costs by reducing the time that the baby spends in NICU. A pilot is being done at Toronto’s world-class Hospital for Sick Children, providing analysis of 90 million data points each day. This isn’t just for premature babies, but is easily applicable to any ICU instrumentation where the patients require careful monitoring for changing conditions. This can even be extended to any sort of medical monitoring, such as home monitoring of blood glucose levels. Once this level of monitoring is commonplace, the potential for detecting early warning signals for a wide variety of conditions becomes available.
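The underlying pattern-over-time idea can be sketched very simply: keep a long baseline window and a short recent window of readings, and raise a flag when the recent behaviour drifts away from the baseline. This is only a conceptual sketch of that kind of monitoring, not the actual system described above; the window sizes, the drift threshold and the use of heart rate are all invented and have no clinical basis.

```python
from collections import deque
from statistics import mean

# Toy drift detector: compare a short recent window of readings against a
# long baseline window. All numbers here are invented for illustration only.
BASELINE_N, RECENT_N, DRIFT = 5000, 200, 0.15

baseline = deque(maxlen=BASELINE_N)
recent = deque(maxlen=RECENT_N)

def on_reading(bpm):
    """Process one heart-rate reading; return an alert string if drift is detected."""
    baseline.append(bpm)
    recent.append(bpm)
    if len(baseline) == BASELINE_N:
        b, r = mean(baseline), mean(recent)
        if abs(r - b) / b > DRIFT:
            return f"alert: recent mean {r:.0f} bpm vs baseline {b:.0f} bpm"
    return None
```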

Interesting themes for day 2 of IOD. However, as much as they are pushing that this is about big data and analytics, it’s also about the decision management and process management required to take action based on that analysis.