Reading And Writing Resolutions For 2014

The past year was a pretty busy one for me, with quite a bit of enterprise client work that I couldn’t discuss here, so my blogging was mostly limited to conferences that I attended and webinars/white papers that I created. Okay for information dissemination, but not so much for starting conversations, which is why I started blogging 9 years, 2,400 posts and 850k words ago. I’m also way behind on my reading of other blogs, so much so that the older unread ones are starting to drop out of my newsreader.

Catching up on the reading will likely involve committing a drastic act in my newsreader (clearing all unread – yikes!), trimming down the blogs that I follow, and making time regularly to browse for interesting developments in blogs and Twitter.

Getting back to some more interesting writing will follow from the reading: other people's interesting ideas always help me to generate some of my own; then it's just a matter of putting hands to keyboard on a regular basis and letting the ideas out into the wild.

Here’s to 2014!

Technicity2013 Cybersecurity Panel: How Prepared Is Business?

Our afternoon panel was moderated by Pete Deacon of Blackiron Data (another conference sponsor), and featured panelists from private industry: Kevvie Fowler, forensic advisory services at KPMG; Daniel Tobok, digital forensics at TELUS; Jeff Curtis, chief privacy officer at Sunnybrook Hospital; and Greg Thompson, enterprise security services at Scotiabank.

Security breaches happen. And as Deacon reminded us, over 60% of those take months (or years) to detect, and are usually detected by someone outside the organization. What are the real cybersecurity risks, what are companies’ perceptions of the risk, and what are the challenges that we face? Fowler believes that since security is often a low-level IT issue, the security message isn’t making its way up the ladder to the C-suite unless a high-profile breach occurs that requires some sort of executive damage control. Curtis agreed, adding that hospitals are used to dealing with clinical risks right up through the executive levels, but that IT security risks are a new topic for their executive risk management participants. Both noted that it’s important to have the right people to carry that message: it has to be technically correct, but integrated with the business context and goals. Thompson added that the message doesn’t need to be dumbed down for the C-suite: their board is very used to assessing complex financial risk, and is capable of assessing other types of complex risk, although it may need to become versed in some of the cybersecurity language and technology.

The next topic was BYOD (bring your own device), and Thompson pushed the conversation beyond this to BYON(etwork), where people bring their own network, even if just through a smartphone hotspot. Companies are losing control of where people do their work, both devices and network, and solutions should be designed to assume that all endpoints and networks are potentially hostile. Business and productivity have to be balanced with risk in these cases: people will do what they need to do in order to get their job done, and if you think that you’ve avoided security breaches by locking down someone’s access on their corporate device, you can be sure that they’re finding a way around that, possibly on their own device. Curtis agreed, and pointed out that they have a lot of students and interns who come in and out of the hospital environment with their own devices: the key is to enable workers to get their work done and protect the data, not to hamstring their work environment, so they have a device registration policy for BYOD that is working well. Tobok works with a lot of law firms, and notes a recent trend of new lawyers using technology capabilities (including openness to BYOD) as a competitive criterion when selecting a firm to work for.

Moving on to security analytics, Fowler said that there are few organizations actually getting value from predictive security analytics, versus more straightforward data mining: it’s important to query the vendors providing predictive analytics on the models that they’re actually using and the success rates. Thompson agreed that predictive analytics is a bit of black magic right now, but sees a lot of value in historical data analysis as a guide to improving the security environment. In my opinion, in the next two years, predictive analytical models are going to start to become mainstream and useful, moving out of a more purely research phase; we’re seeing this in predictive process analytics as well, which I still talk about in the context of “emerging technologies”. This is all tied up with reporting and compliance, of course: business intelligence and analytics have played, and will continue to play, a key role in detecting breaches and auditing cybersecurity. Both Curtis and Thompson spoke about the regulatory pressures in their respective industries and the growth of analytics and other GRC-related tools; healthcare is obviously a highly-regulated industry, and Scotiabank does business in 55 countries and has to deal with the regulations in all of them. Auditors and regulatory bodies are also having to step up their knowledge about cybersecurity.

There was a question from the audience on investigations of security breaches in cloud environments: Tobok is involved in cybersecurity forensic investigations including cloud, and discussed the changes that have happened in the industry in the four years that he’s been involved in cloud security forensics in order to provide better traceability and auditing. Fowler added that forensic science is adapting for these types of investigations, and half of the work is just figuring out which systems the data has been resident on, since the typical cloud contract only allows a client to access their data, not the actual servers on which it resides. These investigations can involve a number of other complicating factors, such as hackers who use compromised credit cards to lease space in a data centre in order to hack into another organization’s data in that same centre; obviously, these complexities don’t exist in breaches of a company’s own data centre.

There was a final panel with five of the vendors who are sponsoring the conference, but my brain was pretty full of security information by then (and I thought that this might be a bit more about their products than I care about) so I decided to duck out before the end.

Another great Technicity conference, and I look forward to next year.

Technicity2013 Cybersecurity Keynote: Microsoft’s Angela McKay

This morning at Technicity 2013, we mostly heard from academics and the public sector; this afternoon, it’s almost all private sector presentations and panels, starting with a keynote from Angela McKay, director of cybersecurity and strategy at Microsoft, on managing cyber risks through different approaches to addressing uncertainty. Risk tolerance, and therefore the answer to the question “am I secure enough?”, is an individual choice: different people and different companies (and cultures) have different risk thresholds, and therefore may have different cybersecurity strategies.

By 2020, we will have 4B internet users, 50B connected devices, and data volumes 50x those of 2010. As users have evolved, so have cyber threats: from early web defacement hacks, to worms, to the present-day botnets and targeted attacks. There is a spectrum of cybersecurity threats: crime, disruptions (e.g., DDoS attacks), espionage, conflict, war; there is a lot of technological development going on around these, but there are also cultural and policy issues, namely the expectations of consumers, companies and governments. McKay discussed the EU network and information security directive and the US executive order and presidential policy directive on cybersecurity, and the levels of new regulation that are coming.

Reducing the impact of cyber threats involves risk management, information exchange, and effective partnership (both public-private and between private organizations). You can’t do risk management without information, and this means that cybersecurity is a CIO-level issue, not just some technical plumbing. Information sharing, however, can’t be indiscriminate; it has to be focused on specific outcomes. [As an aside, I’m not sure that I agree with this in some situations: open data initiatives work because the “owners” of the data can’t conceive of what anyone would do with their data, yet emergent uses happen with interesting results.] Private-public partnerships bring together the public sector’s policies and public safety goals with the private sector’s technical know-how.

She spoke about the shared responsibilities for managing cyber risks: awareness and education, partnering effectively, driving and incentivizing cybersecurity, adopting best practices, building and advancing capabilities, and developing a secure workforce. Furthermore, academia has to step up and start teaching security concepts and remedies at the college and university level, since most developers don’t have much of an idea about cyber risks unless they specialized in security post-graduation.

Microsoft is the premier sponsor of Technicity 2013, although to be fair, McKay’s talk covered very little about their products and services except for some generic discussion about automated cyberdefense at a machine level. Her slides used that ubiquitous font that we see on the Microsoft Windows 8 website, however, so probably some subliminal messaging going on. 🙂

Technicity2013 Cybersecurity Panel: Is Canada Ready?

Andy Papadopulous of Navantis moderated a panel on the Canadian context of cybersecurity, with panelists Rob Meikle, CIO of City of Toronto; Ritesh Kotak, Operation Reboot (cybercrime initiative) at Toronto Police Service; Wesley Wark, professor at University of Ottawa’s graduate school of public and international affairs, and a specialist in national security policy; and Stephen McCammon, legal counsel at the Ontario Information and Privacy Commissioner.

They each spoke about their specific take on privacy and security in Canada:

Meikle: The interconnection and importance of data and technology, and how these are no longer just on computers inside our offices: in addition to cloud computing, we consume information on mobile devices, but also collect and process information from remote devices such as transit vehicles. He addressed the Toronto open data initiative, and how it is critical to look at data from a public citizen perspective rather than an organizational perspective: similar views would not go amiss in private sector organizations and their data.

Kotak: How TPS is having to redefine crime in the era of cybercrime, and how the police force is having to adapt in order to track down online crimes in the same way that they do “real world” crimes in order to protect public safety. His experience in researching how police services are addressing cybercrime is that many of them equated it only with child exploitation (driven, likely, by the federal government’s tendency to do the same in order to justify the over-reaching anti-privacy legislation that we heard about from Michael Geist earlier), but there are obviously many other forms of cybercrime, from financial fraud to hacking pacemakers. They identified a number of areas that they needed to address with respect to cybercrime: overt communication (e.g., social media), investigations, covert operations, and policies and procedures.

Wark: Cyberaggression and its impact on us, with five possible outlets: cyberwar, cyberterrorism, cyber covert operations, cyberespionage and cybercrime. He feels that the first two do not actually exist, that covert operations is an emerging area, while espionage and crime are well-established cyber activities. He maintains that the government’s focus on terrorism in general is a bit ridiculous, considering the lack of any evidence that this is occurring or even imminent (a recent US study showed that Americans are more likely to be killed by their own furniture than by terrorism); and that the government has a difficult time establishing their role and responsibilities in cybersecurity beyond throwing out some simplistic barriers around classified government data. We need to do more with private-public partnerships and education — starting with some simple sharing of best practices — in order to appropriately address all forms of cyberaggression. We need to decide what we really mean by privacy, then define the legal framework for protecting that.

McCammon: How to achieve the balance between privacy and openness. Usability is critical: it’s not enough just to have good authentication, encryption and other services to protect people’s privacy; those tools need to be easy enough for everyone to use (or completely and transparently embedded in other platforms), although Wark countered that this was unlikely to happen. More information is being gathered, and will continue to be gathered, and analytics allow that to be integrated in new ways; there is no putting the toothpaste back in that particular tube, so we need to learn to deal with it in ways that protect us without requiring us to pull the plug and move to the woods. Trust is essential for privacy (although I would add that enforcement of that trust is pretty critical, too).

Good discussion.

Technicity2013 – Focus On Cybersecurity: Michael Geist Keynote @mgeist

I can’t believe that it’s been a year since the last Technicity conference: a free conference hosted by IT World Canada, and sponsored this year by McAfee and Microsoft. Last year, the focus was on crowdfunding including some lessons from crowdfunding in the UK and a panel on legalizing equity crowdfunding; this year, it’s about cybersecurity. There’s a strong presence from the city of Toronto here, including an opening address from Councillor Gary Crawford, and the participation of the city’s CIO Rob Meikle on a panel; plus provincial government participation with Blair Poetschke, director of the international trade branch for the Ontario Ministry of Economic Development, and Stephen McCammon, legal counsel at the Office of the Ontario Information and Privacy Commissioner.

Ontario is a hotbed for technology development in Canada, with a large software development community in and around Toronto. Toronto has also been a relatively early provider of open government data, publishing a catalogue of online data sets, which in turn fosters innovation. The G8 countries have now signed on to a full open data initiative, and this is a good thing: we, as taxpayers, pay to have this information collected, and as long as it doesn’t violate anyone’s privacy, it should be freely available to us. Although this conference isn’t about open data, an environment of freely-available government data is a good place to start talking about security and privacy.

It wouldn’t be a Canadian event about cybersecurity without a keynote by Michael Geist, and he delivered on the topic of “The Internet: Friend or Foe?” (a question that many of us ask daily). Although he started with the answer “friend”, he also immediately addressed the privacy and security concerns that arise from the recent news that the NSA has hacked pretty much everyone on the planet, and the ramifications of Edward Snowden’s revelations: it’s not just metadata (as if that weren’t bad enough), and there are a lot of governments and companies complicit in this, including ours. You can read more about this from a Canadian security perspective on Geist’s excellent blog; as a law professor and the Canada Research Chair on internet and e-commerce law, he has a pretty good perspective on this. Geist and others think that what has come out from Snowden’s information is just the tip of the iceberg, and that we have many more horror stories to come.

A big challenge in this environment is with cloud computing, specifically any cloud storage that is resident in the US or owned by a US company: many companies are now calling for local (and locally-owned, therefore out of the grasp of the US Patriot Act) storage from their cloud providers. It’s a small consolation that I’ve been asking about locally-hosted — or at least, non-US hosted — BPM cloud providers for a number of years now; finally, the general business public has woken up to the potential surveillance dangers.

Encryption is becoming a much more visible issue, whereas previously it was a purely technical concern: cloud providers (Google, Microsoft and Twitter, to name three) are ramping up encryption of their traffic in what is rapidly becoming a technology arms race against our own governments. Similarly, businesses and individuals are demanding greater transparency from cloud providers with respect to the disclosures that they are making to government intelligence agencies. Many international bodies are calling for control of internet domains and standards to be wrested away from US-based organizations, since these have been shown to include a variety of government intelligence and corporate sock puppets.

In Canada, our conservative government is busy sucking up to the US government, so we have seen a number of privacy-busting attempts at online surveillance bills that position “lawful access” (i.e., the government can access all of your information without explicit permission) as “protecting our children” by tossing in a bit about cyberbullying. Geist discussed some of the dangers of this bill (Bill C-13, just introduced last week) in a post yesterday, specifically that companies have immunity from prosecution for violating our privacy and information security if they hand that information over to the government under the definitions of this bill.

He finished up with a look at Canada’s anti-spam law that is coming into effect shortly; this includes making communication from businesses opt-in rather than opt-out, and also requiring consent before installing computer programs in the course of a commercial activity.

It was great to see Geist in person: he’s a terrific speaker, full of passion and knowledge about his subject. As always, he inspires me to help make Canada a better place for our online activities.

Q&A With Vishal Sikka @vsikka

Summary of this morning’s keynote (replay available online within 24 hours):

  • Have seen “HANA effect” over past 2.5 years, and see HANA as not just a commercial success for SAP but a change in the landscape for enterprise customers. A key technology to help people do more.
  • Partnerships with SAS, Amazon, Intel, Cisco, Cloud Foundry.
  • Enterprise cloud and cloud applications.
  • SuccessFactors Learning products.
  • 1000 startup companies developing products on HANA.
  • Team of 3 teenagers using HANA Cloud and Lego to build shelf-stacking robots.

Vishal Sikka keynote

Q&A with audience (in person and online):

  • SAP has always had an excellent technology platform for building applications, used to build their core enterprise applications. HANA is the latest incarnation of that platform, and one that they are now choosing to monetize directly as an application development platform rather than only through the applications. HANA Enterprise Cloud and HANA Cloud Platform are enterprise-strength managed cloud versions of HANA, and HANA One uses AWS for a lower entry point; they’re the same platform as on-premise HANA for cloud or hybrid delivery models. I had a briefing yesterday with Steve Lucas and others from the Platform Solution Group, which covers all of the software tools that can be used to build applications, but not the applications themselves: mobile, analytics, database and technology (middleware), big data, and partners and customers. PSG now generates about half of SAP revenue through a specialist sales force that augments the standard sales force; although obviously selling platforms is more of an IT sell, they are pushing to talk more about the business benefits and verticals that can be built on the platform. In some cases, HANA is being used purely as an application development platform, with little or no data storage.
  • Clarification on HANA Cloud: HANA Enterprise Cloud is the cloud deployment of their business applications, whereas HANA Cloud Platform is the cloud version of HANA for developing applications.
  • SAP is all about innovation and looking forward, not just consolidating their acquisitions.
  • Examples of how SAP is helping their partners to move into their newer innovation solutions: Accenture has a large SuccessFactors practice, for example. I think that the many midrange SIs who have SAP ERP customization as their bread and butter may find it a bit more of a challenge.
  • Mobile has become a de facto part of their work, hence has a lower profile in the keynotes: it is just assumed to be there. I, for one, welcome this: mobile is a platform that needs to be supported, but let’s just get to the point where we don’t need to talk about it any more. Fiori provides mobile and desktop support for the new UI paradigms.

As with the keynote, too much information to capture live. This session was recorded, and will be available online.

SAP TechEd Day 1 Keynote With @vsikka

Vishal Sikka – head technology geek at SAP – started us off at TechEd with a keynote on the theme of how great technology always serves to augment and amplify us. He discussed examples such as the printing press, Nordic skis and the Rosetta Stone, and ended up with HANA (of course) and how a massively parallel, in-memory columnar database with built-in application services provides a platform for empowering people. All of SAP’s business applications – ERP, CRM, procurement, HR and others – are available on or moving to HANA, stripping out the complexity of the underlying databases and infrastructure without changing the business system functionality. The “HANA effect” also allows new applications to be built on the platform with much less infrastructure work through the use of the application services built into HANA.
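To make the “in-memory columnar” part a bit more concrete, here’s a toy sketch in Python – my own illustration, not SAP code or anything resembling the actual HANA engine – of why a column-oriented layout favours the aggregate-style analytical queries that this kind of platform is built for: summing one field only scans that field’s contiguous array instead of dragging every field of every row through memory, which is also what makes the work easy to split across cores in a massively parallel engine.

    # Toy illustration only (not SAP code): why an in-memory columnar layout
    # favours analytical queries such as SUM(amount) GROUP BY region.
    from array import array
    from collections import defaultdict

    # Row-oriented storage: every record carries all of its fields.
    rows = [
        {"order_id": 1, "region": "EMEA", "amount": 120.0},
        {"order_id": 2, "region": "APJ",  "amount":  75.5},
        {"order_id": 3, "region": "EMEA", "amount": 210.0},
    ]

    # Column-oriented storage: one contiguous array per field.
    columns = {
        "order_id": array("i", [1, 2, 3]),
        "region":   ["EMEA", "APJ", "EMEA"],
        "amount":   array("d", [120.0, 75.5, 210.0]),
    }

    # Row layout: the scan touches every field of every record.
    totals_from_rows = defaultdict(float)
    for r in rows:
        totals_from_rows[r["region"]] += r["amount"]

    # Columnar layout: only the two relevant columns are scanned, as dense
    # arrays -- the access pattern that in-memory, parallel engines exploit.
    totals_from_columns = defaultdict(float)
    for region, amount in zip(columns["region"], columns["amount"]):
        totals_from_columns[region] += amount

    assert totals_from_rows == totals_from_columns
    print(dict(totals_from_columns))  # {'EMEA': 330.0, 'APJ': 75.5}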

He also discussed their Fiori user interface paradigm and platform which can be used to create better UX on top of the existing ERP, CRM, procurement, HR and other business applications that have formed the core of their business. Sikka drew the architecture as he went along, which was a bit of fun:

SAP architecture with HANA and Fiori

He was joined live from Germany by Franz Faerber, who heads up HANA development and discussed some of the advances in HANA and what is coming next month in SP7; then Sam Yen joined on stage to demonstrate the HANA developer experience, the Operational Intelligence dashboard that was shown at SAPPHIRE earlier this year (in use at DHL for tracking KPIs in real time), and the HANA Cloud Platform developer tools for SuccessFactors. We heard about SAS running on HANA for serious data scientists, HANA on AWS, HANA and Hadoop, and much more.

There’s a lot of information being pushed out in the keynote: even if you’re not here, you can watch the keynotes live (and probably watch them recorded after the fact), and there will be some new information coming out at TechEd in Bangalore in six weeks. The Twitter stream is going by too fast to read, with lots of good insights in there, too.

Bernd Leukert came to the stage to highlight how SAP is running their own systems on HANA, and to talk more about building applications, focusing on Fiori for mobile and desktop user interfaces: not just a beautification of the existing screens, but new UX paradigms. Some of the examples that we saw are very tile-based (think Windows 8), but also things like fact sheets for business objects within SAP enterprise systems. He summed up by stating that HANA is for all types of businesses due to a range of platform offerings; my comments on Hasso Plattner’s keynote from SAPPHIRE earlier this year called it the new mainframe (in a good way). We also heard from Dmitri Krakovsky from the SuccessFactors team, and from Nayaki Nayyar about iFlows for connecting cloud solutions.

TechEd is inherently less sales and more education than their SAPPHIRE conference, but there’s a strong sense of selling the concepts of the new technologies to their existing customer and partner base here. At the heart of it, HANA (including HANA cloud) and Fiori are major technology platform refreshes, and the big question is how difficult – and expensive – it will be for an existing SAP customer to migrate to the new platforms. Many SAP implementations, especially the core business suite ERP, are highly customized; this is not a simple matter of upgrading a product and retraining users on new features: it’s a serious refactoring effort. However, it’s more than just a platform upgrade: having vastly faster business systems can radically change how businesses work, since “reporting” is replaced by near-realtime analytics that provide transparency and responsiveness; it also simplifies life for IT due to footprint reduction, new development paradigms and cloud support.

We finished up 30 minutes late and with my brain exploding from all the information. It will definitely take the next two days to absorb all of this and drill down into my points of interest.

Disclosure: SAP is a customer, and they paid my travel expenses to be at this conference. However, what I write here is my own opinion and I have not been financially compensated for it.

The Rise Of The Machines: @BillRuh_GE On The Industrial Internet

Last day of Software AG’s Innovation World, and the morning keynote is Bill Ruh, VP of GE’s Global Software and Analytics Center, on how GE is becoming a digital business. He points out that part of that is what you do internally, but part is also your products: GE is transforming both their products and their operations along that path. For example, their previous aircraft jet engines provided only aggregate measurements about takeoff, cruise and landing; now they have the potential to collect 1TB of measurement data per day from a two-engine aircraft. That’s really big data. Unfortunately, most data is dark: only 0.5% of the world’s data is being analyzed. We don’t need to analyze and act upon all of it, but there’s a lot of missed potential here.

His second point was about the “industrial internet”, where 50 billion machines are interconnected. We saw a revolution in entertainment, social marketing, communications, IT architecture and retail when a billion people were connected, but the much larger number of interconnected machines has the potential to virtualize operational technology, and to enable predictive analytics, automated and self-healing machines, mobilized monitoring and maintenance, and even increased employee productivity. Industrial businesses are starting to change how they get things done, in the same way as retail and other consumer businesses have been transformed over the past decade.

This flood of data is pushing big changes to IT architecture: industrial software now needs real-time predictive analytics, big data, mobile, cloud, end-to-end security, distributed computation, and a consistent and meaningful experience. Analytics is key to all of this, and he pointed out that data scientist is becoming the hardest position to fill in many companies. Behavioral changes around using the analytics are also important: if the analytics are being used to advise, rather than control, then the people being advised have to accept that advice.

Bill Ruh (GE) presentation at Innovation World - architecture for digital industry

The digital enterprise needs to focus on their customers’ outcomes – in the jet engine case, reducing fuel consumption and downtime, while improving efficiency of the operations around that machine – because at this scale, a tiny percentage improvement can have a huge impact: by GE’s estimates, a 1% savings translates to huge numbers in different industries, from $27B saved by increasing rail freight utilization to $63B saved by improving process efficiency in predictive maintenance in healthcare.

He had some great examples (speaking as a member of a two-engineer household, you can be sure that many of these will be talked about at my dinner table in the future), such as how wind turbines are not just generating data for remote monitoring, but are self-optimizing as well as actually talking to each other in order to optimize within and between wind farms. Smart machines and big data are disrupting manufacturing and related industries, and require a change in mindset from analog to digital thinking. If you think that it can’t happen because we’re talking about physical things, you’re wrong: think of how Amazon changed how physical books are sold. As Ruh pointed out, software coupled with new processing architectures are the enablers for digital industry.

Bill Ruh (GE) presentation at Innovation World - smart wind turbines

It’s early days for digital industry, and there needs to be a focus on changing processes to take advantage of the big data and connectivity of machines. His advice is to get started and try things out, or you’ll be left far behind leaders like GE.

Conference Within A Conference, Part Two: Big Fast Data World

John Bates, who I know from his days at Progress Software, actually holds the title SVP of Big Fast Data at Software AG. Love it. He led off the Big Fast Data World sub-conference at Innovation World to talk about real-time decisioning based on events, whether that is financial data such as trades, or device events from an oil rig. This isn’t just simple “if this event occurs, then trigger this action” sort of decisions, but real-time adaptive intelligence that might include social media, internal business systems, market information and more. It’s where events, data, analytics and process all come together.

The goal is to use all of the data and events possible to appear to be reading your customer’s mind and offering them the most likely thing that they want right then (without being too creepy about it), using historical patterns, current context and location information. For example, a customer is in the process of buying something, and their credit card company or retail partner uses that opportunity to upsell them on a related product or a payment plan, directly to their mobile phone and before they have finished making their purchase. Or, a customer is entering a mall, they are subscribed to a sports information service, there are available tables at a sports bar in the mall, so they are pushed a coupon to have lunch at the sports bar right then. Even recommendation engines, such as we see every time that we visit Amazon or Netflix, are examples of this. Completely context sensitive, and completely personalized.
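As a deliberately simplified illustration of that mall scenario, here’s a toy sketch in Python; it’s my own example, not Software AG’s Apama/Terracotta/JackBe APIs, and the lookups are hypothetical stand-ins for the historical-pattern, subscription and availability context that a real event-processing platform would hold in memory.

    # Toy sketch of a context-sensitive, event-driven offer rule (my own
    # illustration, not Software AG product code). A location event only
    # becomes an offer when the customer's interests and current availability
    # line up -- otherwise nothing is pushed.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LocationEvent:
        customer_id: str
        place: str  # e.g. a mall the customer has just entered

    # Hypothetical in-memory context; in practice this would be fed by
    # historical patterns, subscriptions and operational systems.
    SUBSCRIPTIONS = {"cust_42": {"sports_news"}}
    OPEN_TABLES = {("mall_123", "sports_bar"): 4}

    def on_location_event(evt: LocationEvent) -> Optional[str]:
        """Decide, in real time, whether this event should trigger an offer."""
        likes_sports = "sports_news" in SUBSCRIPTIONS.get(evt.customer_id, set())
        tables_free = OPEN_TABLES.get((evt.place, "sports_bar"), 0) > 0
        if likes_sports and tables_free:
            return f"Push to {evt.customer_id}: lunch coupon for the sports bar in {evt.place}"
        return None  # no relevant offer -- don't be creepy

    print(on_location_event(LocationEvent("cust_42", "mall_123")))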

On the flip side, companies have to use continuous monitoring of social media channels for proactive customer care: real-time event and data analysis for responding to unhappy customers before situations blow up on them. People like Dave Carroll and Heather Armstrong (and sometimes even me, on a much smaller scale) can strike fear in the hearts of customer service organizations that are unable to respond appropriately and quickly, but can deliver big wins for those companies when they do the right things, promptly, to fix the problem for their customers.

What do you need to do to make this happen? Not much, just low-latency universal messaging, in-memory unstructured data, real-time predictive analytics, intelligent actions via real-time integration to operational systems, and real-time visual analytics. If you’re a Software AG customer, they’re bringing together Terracotta, Apama and JackBe into a unified platform for this sort of adaptive intelligence, producing intelligent actions from big data, in real time, to/from anywhere.

Software AG Big Fast Data

We then got a bit of a lesson on big data from Nathaniel Rowe, a research analyst at Aberdeen Group: how big is big, what’s the nature of that data, and some of the problems with it. The upshot: the fact that there’s a lot of data is important, but it’s the unstructured nature of it that presents many of the difficult analytical problems. It’s about volume, but also variety and velocity: the data could be coming from anywhere, and you don’t have control over a lot of it, such as the social media data or that from business partners. You have to have a clear picture of what you want out of big data, such as better customer insights or operational visibility; Rowe had a number of use cases from e-commerce to healthcare to counterterrorism. The ability to effectively use unstructured data is key: the best-in-class companies do this better than average, and it translates directly to measures such as sales, customer satisfaction and net promoter score. He covered some of the tools required – automatic data capture, data compression, and data cleansing – and how those translate directly to employees’ ability to find data, particularly from multiple sources at once. Real-time analytics and in-memory analytics are the two high-speed technologies that result in the largest measurable benefits when working with big data, making the difference between seconds (or even sub-second) to see a result or take an action, versus minutes or hours. He finished up with the correlation between investing in big data and various customer experience measures (15-18% increases) as well as revenue measures (12-17% increases). Great presentation, although I’m pretty sure that I missed 75% of it since he is a serious speed-talker and zipped through slides at the speed of light.

And we’re done for the day: back tomorrow for another full day of Innovation World. I’m off to the drinks reception and then a customer party event; as always, everything is off the record as soon as the bar opens. 🙂

Yammer For Enterprise Social At The Canadian Cancer Society With @lvanderlip

My blogging has been pretty sparse lately, in part because I’ve been busier than usual for the summer and in part because I have an intimidating backlog of product reviews to get through. Tonight, however, I’m at a meetup of Knowledge Workers Toronto to hear Lisa Vanderlip of the Canadian Cancer Society (a charitable fund-raising organization) talk about how they are using Yammer in their workplace. Interestingly, her first words were that she is joining Microsoft in September to work with Yammer from the inside, although tonight she was here to talk about its use at the Cancer Society, and enterprise social in general.

I’ve been starting some independent research lately looking at worker incentives for enterprise social, so I was interested to hear about how they encouraged adoption within their organization. As a 75-year-old organization, they had a lot of unidirectional internal communication, and different offices across the country had their own local intranets using SharePoint servers and the like. This led to a lot of confusion about where to find information or locate internal skills and resources, and a lot of inefficiencies in getting work done. They started implementing Yammer as a social tool in January 2012, with a full production rollout in February 2013, and have seen significant improvements in their internal communications since then. Yammer was selected based on recommendations from other organizations, but also because of their strong internal use of Microsoft products, especially SharePoint, which integrates well with Yammer.

Their driver for internal enterprise social (as opposed to customer/outward-facing social) was to achieve business goals through engaging staff, sharing goals, increasing productivity, improving collaboration, creating a positive culture and recognizing performance. They wanted to consolidate their intranet into a single cohesive resource available across the organization, and see Yammer as providing a combination of LinkedIn, Facebook and Twitter for internal users (authenticated with Active Directory).

A key part of the roll-out was getting buy-in from management (to get the budget and approvals) as well as from the teams who would be using it. They’ve gone through three (!) CEOs during the course of the project, which added some challenges due to the shifting political landscape and the views of the executive team towards enterprise social at any given time. They worked at buy-in by having an internal advocate team (Yambassadors) that communicated and educated about enterprise social and Yammer, so that by the time they rolled out, everyone knew what Yammer was and what the organization would be doing with it: namely, solving business problems, not sharing what people had for lunch.

They established multiple levels of goals: at the national office level, there were the wider-ranging goals of engaging staff in the organizational mission and vision, increasing communication and collaboration, and increasing efficiency; at the departmental/regional level, they had a template for establishing project-specific goals. For training, there were some basics about social media and Yammer, but also some examples of Yammer success stories and guidance on adding social aspects to current processes and methods.

I asked about incentives that help to motivate users to use enterprise social, and although they’re just starting to look at some of those issues (and are going through some HR restructuring), one key part is non-financial recognition as an incentive: using Yammer to give someone a “thumbs up” for a job well done, or to recognize someone as an expert in a particular area. Indirectly, of course, this can translate to financial incentives, since peer recognition will (or at least, should) feed into performance reviews, and is a good indicator of employee satisfaction and therefore reduced turnover. Since they rolled out in production in February of this year, they’ve had over 100 “thumbs up” given on their national network, and have seen 80% of their staff engaged (that is, taking specific, deliberate actions) on the system; all departments have been using Yammer to achieve their goals.

They are measuring staff engagement and the effectiveness of Yammer, allowing each department and team to set metrics to determine if they are achieving their goals. They are actively trying to reduce (internal) email and replace it with Yammer and other more appropriate communication channels: this has improved efficiencies in several of their teams that collaborate on content creation, as would be expected. In the next fiscal year, as they move forward with their approved projects on Yammer, they will be implementing guidelines for limiting staff emails, which will also drive adoption. I think (and please feel free to chime in if you know more about this) that Yammer has some gaps in terms of records management from a regulatory/compliance standpoint, but there are no real technical barriers to managing enterprise social content in the same way that we manage email, documents and other content required for specific industry governance. In fact, without this level of governance, enterprise social systems will falter as they attempt to push into line-of-business applications.

The challenge for those of us in the BPM world is that enterprise social is something that’s (currently) done in the context of a BPMS, where the organizational goals, user motivators and methods of engagement can be quite different. However, some good lessons here on rolling out social capabilities within an organization, regardless of the platform.

Great presentation and discussion, especially hearing the views and questions of those who work in enterprise knowledge management but appear to have little exposure to social media, both consumer and enterprise: these are probably representative of the views of many people within organizations who are struggling with a justification for enterprise social. The presentation slides will be added to the Meetup group; you probably need to be a group member to see them.