Social Media Meets Social Collaboration. Or Not.

My fellow Enterprise Irregular Susan Scrupski posted last month on the split between enterprise initiatives in social media (external-facing marketing) and social collaboration (mostly internal work production and knowledge sharing) – apparently the number of organizations actually integrating these efforts is near-zero. I don’t find this particularly surprising, since the people involved and the purposes of the initiatives are quite different, but it doesn’t bode well for efforts to directly connect internal business processes to customers via social media. A couple of years ago, I started incorporating themes of linking external social presence into core business processes into my presentations and writing (recorded screencast here), based on my own experiences as well as those of my clients. However, when I talk about that Zipcar/Twitter example today, I still get a lot of “wow” reactions in the audience: for most organizations, the idea that social media can be directly integrated as a near-real-time customer interaction channel seems like science fiction. And even for those that do see social media as a customer engagement channel, it often has serious limitations: as soon as you actually need to do a “transaction”, the social media team has to hand off to an operations team, usually requiring that the customer restart their interaction through a different channel.

Many organizations are still struggling with the idea of internal social collaboration. Although social enterprise software functionality is robust, and has become integrated with line-of-business functionality such as BPM and ERP systems, I’m still working with many traditional industries, where managers still want to know exactly how long people spend on break, and certainly don’t trust workers enough to enable on-demand collaboration features in their systems. Of course, the workers do collaborate anyway: they just do it outside the systems, creating hidden business processes that provide the collaborative and dynamic aspects using (primarily) email.

This is more than just an outside-in realignment, although that’s a necessary starting point: there’s a combination of technology and corporate culture that needs to allow for the direct connection of external social media and internal social collaboration.

Reading And Writing Resolutions For 2014

The past year was a pretty busy one for me, with quite a bit of enterprise client work that I couldn’t discuss here, so my blogging was mostly limited to conferences that I attended and webinars/white papers that I created. Okay for information dissemination, but not so much for starting conversations, which is why I started blogging 9 years, 2,400 posts and 850k words ago. I’m also way behind on my reading of other blogs, so much so that the older unread ones are starting to drop out of my newsreader.

Catching up on the reading will likely involve committing a drastic act in my newsreader (clearing all unread – yikes!), trimming down the blogs that I follow, and making time regularly to browse for interesting developments in blogs and Twitter.

Getting back to some more interesting writing will follow from the reading: reading other people’s interesting ideas always helps me to generate some of my own, then it’s just a matter of putting hands to keyboard on a regular basis, and letting the ideas out into the wild.

Here’s to 2014!

camunda BPM 7.0 Refactors Engine And Process Administration

On August 31, camunda released camunda BPM platform 7.0 (community open source and enterprise editions), the first major release of the software since it was forked from the Activiti project in March, although there were nine community releases between the fork and the release of 7.0. I had a couple of briefings with Daniel Meyer, the project lead, in the period that followed, and promised that I’d actually get this post written in time for Christmas. 🙂

The 7.0 release contains a significant amount of new code, but their focus remains the same: a developer-friendly BPM platform rather than a tool positioned for use by end users or non-technical analysts. As I discussed in a recent webinar, BPM suites have become model-driven application development environments, so a BPMS positioned explicitly for developers serves a large market segment, especially for complex core processes.

The basic tasklist and other modeler and runtime features are mostly unchanged in this version, but there are big changes to the engine and to Cockpit, the technical process monitoring/administration module. Here’s what’s new:

Cockpit:

  • Inspect/repair process instances, including retrying failed service calls.
  • Create instance variables at runtime, and update variable values.
  • Reassign human activities.
  • Send a link directly to a specific instance or view.
  • Create a business key corresponding to a line-of-business system variable, providing a fast and easy way to search on LOB data (see the sketch after this list).
  • Extensible via third-party plug-ins. The aim with Cockpit is to solve 80% of use cases, then allow plug-ins from consulting partners and customers to handle the remainder; they provide full instructions on how to develop a Cockpit plug-in.
  • Add tabs to the detailed view of a process instance, e.g., a link to LOB or other external data.
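
The business key that Cockpit searches on is assigned when an instance is started through the engine API. Here’s a minimal Java sketch of how that might look; the process definition key, order number and variable are made up for illustration, not taken from camunda’s documentation.

```java
import java.util.HashMap;
import java.util.Map;

import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.runtime.ProcessInstance;

public class StartWithBusinessKey {

  public static void main(String[] args) {
    // Obtain the default process engine (assumes a configured camunda BPM installation)
    ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();

    // Use an LOB identifier (here, a hypothetical order number) as the business key
    String orderNumber = "ORD-2013-0042";

    Map<String, Object> variables = new HashMap<String, Object>();
    variables.put("customerName", "ACME Corp");

    // The second argument is the business key; Cockpit and the query API can then
    // locate the instance directly by that LOB value instead of searching variables
    ProcessInstance instance = engine.getRuntimeService()
        .startProcessInstanceByKey("orderFulfillment", orderNumber, variables);

    System.out.println("Started instance " + instance.getId()
        + " with business key " + instance.getBusinessKey());
  }
}
```

On the query side, runtimeService.createProcessInstanceQuery().processInstanceBusinessKey(orderNumber) would find that instance again directly from the LOB identifier.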

Engine:

  • A new authorization framework (also manifesting in admin capabilities for users/groups/authorizations): this is a preview feature in 7.0, supporting only application, group and group membership authorization. In the future, this will be expanded to include process definition and instance authorization. Users can be maintained in an internal camunda database or through a direct link to LDAP.
  • A complete rewrite of the history/audit log, splitting the history and runtime databases, which is potentially a huge performance booster. Updates to the history are triggered by events on running instances, whereas previously, writing history records required querying and updating existing records for that instance. The history log can be redirected to a database shared by multiple process engines; since much of the Cockpit monitoring runs against the history database, this makes it easier to consolidate monitoring of multiple engines. The logs can also be written directly to an external database via the new history event stream API (see the sketch after this list). Writes to the history log are asynchronous, which further improves performance. At the time of release, they were seeing preliminary benchmarks of a 10-20% performance improvement in the process engine, and a significant reduction in the runtime database index size.
  • There is some increase in the coverage of the BPMN 2.0 standard; their reference document shows supported elements in orange, with each element linking through to its description and usage, including code snippets where appropriate. Data objects/stores are still not supported, nor are about half of the event types, but their track record is similar to most vendors in this regard.
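
To illustrate the history event stream, here’s a rough sketch of a custom handler that receives history events and forwards them somewhere other than the default history tables (in this case, just a logger; a real implementation would write to an external reporting or audit database and be registered on the process engine configuration as the history event handler). The HistoryEventHandler interface lives in the engine’s internal packages, so treat the class and package names as my best recollection rather than a definitive recipe.

```java
import java.util.List;
import java.util.logging.Logger;

import org.camunda.bpm.engine.impl.history.event.HistoryEvent;
import org.camunda.bpm.engine.impl.history.handler.HistoryEventHandler;

// A sketch of a handler that forwards history events somewhere other than the
// default history database tables: here just to a logger, but a real
// implementation would write to an external reporting/audit database.
public class ExternalStoreHistoryEventHandler implements HistoryEventHandler {

  private static final Logger LOG =
      Logger.getLogger(ExternalStoreHistoryEventHandler.class.getName());

  @Override
  public void handleEvent(HistoryEvent historyEvent) {
    // Each event carries the process instance id, event type, timestamps, etc.
    LOG.info("history event for instance " + historyEvent.getProcessInstanceId()
        + ": " + historyEvent.getClass().getSimpleName());
  }

  @Override
  public void handleEvents(List<HistoryEvent> historyEvents) {
    for (HistoryEvent event : historyEvents) {
      handleEvent(event);
    }
  }
}
```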

Version 7.0 is all open source, but a consolidation release (7.1) is already in alpha and will contain some proprietary administration features in Cockpit not available in the open source version: bulk edit/restart of instances, complex search/filter across instances from different process definitions, and a process-level authorizations UI (although the authorization structure will be built into the common engine). camunda is pretty open about their development, as you might expect from an open source company; you can even read some quite technical discussions about design decisions, such as how a new Activity Instance Execution Model was introduced in the process engine in order to improve performance.

In September, camunda released a cloud application for collaborating on process models, camunda share. This is not a full collaborative authoring environment, but a place to upload, view and discuss process models. The camunda team created it during their “ShipIt-Day”, where they are tasked with creating something awesome within 24 hours. There’s no real security: your uploaded model generates a unique URL that you can send to others, but there is an option to anonymize the process model by removing labels if it contains proprietary information. A cool little side project that could let you avoid sending around PDFs of your process models for review.

camunda’s business model lies in providing and supporting the enterprise edition of the software, which includes some proprietary functions in Cockpit but is otherwise identical to the community open source edition, plus consulting and training services to help you get started with camunda BPM. They provide a great deal of the effort behind the community edition, while encouraging and supporting platform extensions such as fluent testing, PHP developer support and enterprise integration via Apache Camel.

camunda BPM 7.0

Webinar On Business-IT Alignment In Process Applications

This afternoon, I’m giving a webinar (hosted by Software AG) on business-IT alignment when developing process-centric applications: you can sign up for it or see the replay here.

Some interesting stuff on model-driven development and also why we usually need to use separate modeling tools when we’re building applications for complex core processes.

We’re also developing a white paper on this topic, to be released in the next few weeks; I’ll post a link to that when it’s out.

Technicity2013 Cybersecurity Panel: How Prepared Is Business?

Our afternoon panel was moderated by Pete Deacon of Blackiron Data (another conference sponsor), and featured panelists from private industry: Kevvie Fowler, forensic advisory services at KPMG; Daniel Tobok, digital forensics at TELUS; Jeff Curtis, chief privacy officer at Sunnybrook Hospital; and Greg Thompson, enterprise security services at Scotiabank.

Security breaches happen. And as Deacon reminded us, over 60% of those take months (or years) to detect, and are usually detected by someone outside the organization. What are the real cybersecurity risks, what are companies’ perceptions of the risk, and what are the challenges that we face? Fowler believes that since security is often treated as a low-level IT issue, the security message isn’t making its way up the ladder to the C-suite unless a high-profile breach occurs that requires some sort of executive damage control. Curtis agreed, adding that hospitals are used to dealing with clinical risks right up through the executive levels but that IT security risks are a new topic for their executive risk management participants. Both noted that it’s important to have the right people to carry that message: it has to be technically correct, but integrated with the business context and goals. Thompson added that the message doesn’t need to be dumbed down for the C-suite: their board is very used to assessing complex financial risk, and is capable of assessing other types of complex risk, although it may need to become versed in some of the cybersecurity language and technology.

The next topic was BYOD (bring your own device), and Thompson pushed the conversation beyond this to BYON(etwork), where people bring their own network, even if just through a smartphone hotspot. Companies are losing control of where people do their work, both devices and network, and solutions should be designed to assume that all endpoints and networks are potentially hostile. Business and productivity have to be balanced with risk in these cases: people will do what they need to do in order to get their job done, and if you think that you’ve avoided security breaches by locking down someone’s access on their corporate device, you can be sure that they’re finding a way around that, possibly on their own device. Curtis agreed, and pointed out that they have a lot of students and interns who come in and out of the hospital environment with their own devices: the key is to enable workers to get their work done and protect the data, not to hamstring their work environment, so they have a device registration policy for BYOD that is working well. Tobok works with a lot of law firms, and notes a recent trend of new lawyers using technology capabilities (including openness to BYOD) as a competitive criterion when selecting a firm to work for.

Moving on to security analytics, Fowler said that there are few organizations actually getting value from predictive security analytics, versus more straightforward data mining: it’s important to query the vendors providing predictive analytics on the models that they’re actually using and the success rates. Thompson agreed that predictive analytics is a bit of black magic right now, but sees a lot of value in historical data analysis as a guide to improving the security environment. In my opinion, in the next two years, predictive analytical models are going to start to become mainstream and useful, moving out of a more purely research phase; we’re seeing this in predictive process analytics as well, which I still talk about in the context of “emerging technologies”. This is all tied up with reporting and compliance, of course: business intelligence and analytics have played, and will continue to play, a key role in detecting breaches and auditing cybersecurity. Both Curtis and Thompson spoke about the regulatory pressures in their respective industries and the growth of analytics and other GRC-related tools; healthcare is obviously a highly-regulated industry, and Scotiabank does business in 55 countries and has to deal with the regulations in all of them. Auditors and regulatory bodies are also having to step up their knowledge about cybersecurity.

There was a question from the audience on investigations of security breaches in cloud environments: Tobok is involved in cybersecurity forensic investigations, including cloud, and discussed the changes that have happened in the industry in the four years that he’s been involved in cloud security forensics in order to provide better traceability and auditing. Fowler added that forensic science is adapting for these types of investigations, and half of the work is just figuring out what systems the data has been resident on, since the typical cloud contract only allows a client to access their data, not the actual servers on which it resides. These investigations can also involve a number of other factors, such as hackers who use compromised credit cards to lease space in a data centre in order to hack into another organization’s data in that same centre; obviously, these complexities don’t exist in breaches of a company’s own data centre.

There was a final panel with five of the vendors who are sponsoring the conference, but my brain was pretty full of security information by then (and I thought that this might be a bit more about their products than I care about) so I decided to duck out before the end.

Another great Technicity conference, and I look forward to next year.

Technicity2013 Cybersecurity Keynote: Microsoft’s Angela McKay

This morning at Technicity 2013, we mostly heard from academics and public sector; this afternoon, it’s almost all private sector presentations and panels, starting with a keynote from Angela McKay, director of cybersecurity and strategy at Microsoft, on managing cyber risks through different approaches to addressing uncertainty. Risk tolerance, and therefore the answer to the question “am I secure enough?”, is quite an individual choice: different people and different companies (and cultures) have different risk thresholds, and therefore may have different cybersecurity strategies.

By 2020, we will have 4B internet users, 50B connected devices, and data volumes 50x those of 2010. As users have evolved, so have cyber threats: from early web defacement hacks, to worms, to the present-day botnets and targeted attacks. There is a spectrum of cybersecurity threats: crime, disruptions (e.g., DDoS attacks), espionage, conflict and war; there is a lot of technological development going on around these, but there are also cultural and policy issues, namely the expectations of consumers, companies and governments. McKay discussed the EU network and information security directive and the US executive order and presidential policy directive on cybersecurity, and the levels of new regulation that are coming.

Reducing the impact of cyber threats involves risk management, information exchange, and effective partnership (both public-private and between private organizations). You can’t do risk management without information, and this means that cybersecurity is a CIO-level issue, not just some technical plumbing. Information sharing, however, can’t be indiscriminate; it has to be focused on specific outcomes. [As an aside, I’m not sure that I agree with this in some situations: open data initiatives work because the “owners” of the data can’t conceive of what anyone would do with their data, yet emergent uses happen with interesting results.] Private-public partnerships bring together the policies and goals related to public safety of the public sector, and the technical know-how of the private sector.

She spoke about the shared responsibilities for managing cyber risks: awareness and education, partnering effectively, driving and incentivizing cyber security, adopting best practices, building and advancing capabilities, and developing a secure workforce. Furthermore, academia has to step up and start teaching security concepts and remedies at the college and university level, since most developers don’t have much of an idea about cyber risks unless they specialized in security post-graduation.

Microsoft is the premier sponsor of Technicity 2013, although to be fair, McKay’s talk covered very little about their products and services except for some generic discussion about automated cyberdefense at a machine level. Her slides used that ubiquitous font that we see on the Microsoft Windows 8 website, however, so probably some subliminal messaging going on. 🙂

Technicity2013 Cybersecurity Panel: Is Canada Ready?

Andy Papadopulous of Navantis moderated a panel on the Canadian context of cybersecurity, with panelists Rob Meikle, CIO of City of Toronto; Ritesh Kotak, Operation Reboot (cybercrime initiative) at Toronto Police Service; Wesley Wark, professor at University of Ottawa’s graduate school of public and international affairs, and a specialist in national security policy; and Stephen McCammon, legal counsel at the Ontario Information and Privacy Commissioner.

They each spoke about their specific take on privacy and security in Canada:

Meikle: The interconnection and importance of data and technology, and how these are no longer just on computers inside our offices: in addition to cloud computing, we consume information on mobile devices, and also collect and process information from remote devices such as transit vehicles. He addressed the Toronto open data initiative, and how it is critical to look at data from a public citizen perspective rather than an organizational perspective: similar views would not go amiss in private sector organizations and their data.

Kotak: How TPS is having to redefine crime in the era of cybercrime, and how the police force is having to adapt in order to track down online crimes in the same way that they do “real world” crimes to protect public safety. His experience in researching how police services are addressing cybercrime is that many of them equated it only with child exploitation (driven, likely, by the federal government’s tendency to do the same in order to justify the over-reaching anti-privacy legislation that we heard about from Michael Geist earlier), but there are obviously many other forms of cybercrime, from financial crime to hacking pacemakers. They identified a number of areas that they needed to address with respect to cybercrime: overt communication (e.g., social media), investigations, covert operations, and policies and procedures.

Wark: Cyberaggression and its impact on us, with five possible outlets: cyberwar, cyberterrorism, cyber covert operations, cyberespionage and cybercrime. He feels that the first two do not actually exist, that covert operations is an emerging area, while espionage and crime are well-established cyber activities. He maintains that the government’s focus on terrorism in general is a bit ridiculous, considering the lack of any evidence that this is occurring or even imminent (a recent US study showed that Americans are more likely to be killed by their own furniture than by terrorism); and that the government has a difficult time establishing their role and responsibilities in cybersecurity beyond throwing out some simplistic barriers around classified government data. We need to do more with private-public partnerships and education — starting with some simple sharing of best practices — in order to appropriately address all forms of cyberaggression. We need to decide what we really mean by privacy, then define the legal framework for protecting that.

McCammon: How to achieve the balance between privacy and openness. Usability is critical: it’s not enough just to have good authentication, encryption and other services to protect people’s privacy; those tools need to be easy enough for everyone to use (or completely and transparently embedded in other platforms), although Wark countered that this was unlikely to happen. More information is being gathered, and will continue to be gathered, and analytics allow it to be integrated in new ways; there is no putting the toothpaste back in that particular tube, so we need to learn to deal with it in ways that protect us without requiring us to pull the plug and move to the woods. Trust is essential for privacy (although I would add that enforcement of that trust is pretty critical, too).

Good discussion.

Technicity2013 – Focus On Cybersecurity: Michael Geist Keynote @mgeist

I can’t believe that it’s been a year since the last Technicity conference: a free conference hosted by IT World Canada, and sponsored this year by McAfee and Microsoft. Last year, the focus was on crowdfunding including some lessons from crowdfunding in the UK and a panel on legalizing equity crowdfunding; this year, it’s about cybersecurity. There’s a strong presence from the city of Toronto here, including an opening address from Councillor Gary Crawford, and the participation of the city’s CIO Rob Meikle on a panel; plus provincial government participation with Blair Poetschke, director of the international trade branch for the Ontario Ministry of Economic Development, and Stephen McCammon, legal counsel at the Office of the Ontario Information and Privacy Commissioner.

Ontario is a hotbed for technology development in Canada, with a large software development community in and around Toronto. Toronto has also been a relatively early provider of open government data and publishes a catalogue of online data, which in turn fosters innovation. The G8 countries have now signed on to a full open data initiative, and this is a good thing: we, as taxpayers, pay to have this information collected, and as long as it doesn’t violate anyone’s privacy, it should be freely available to us. Although this conference isn’t about open data, an environment of freely-available government data is a good place to start talking about security and privacy.

It wouldn’t be a Canadian event about cybersecurity without a keynote by Michael Geist, and he delivered on the topic of “The Internet: Friend or Foe?” (a question that many of us ask daily). Although he started with the answer “friend”, he also immediately addressed the privacy and security concerns that arise from the recent news that the NSA has hacked pretty much everyone on the planet, and the ramifications of Edward Snowden’s revelations: it’s not just metadata (as if that weren’t bad enough), and there are a lot of governments and companies complicit in this, including ours. You can read more about this from a Canadian security perspective on Geist’s excellent blog; as a law professor and the Canada Research Chair on internet and e-commerce law, he has a pretty good perspective on this. Geist and others think that what has come out from Snowden’s information is just the tip of the iceberg, and that we have many more horror stories to come.

A big challenge in this environment is with cloud computing, specifically any cloud storage that is resident in the US or owned by a US company: many companies are now calling for local (and locally-owned, therefore out of the grasp of the US Patriot Act) storage from their cloud providers. It’s a small consolation that I’ve been asking about locally-hosted — or at least, non-US hosted — BPM cloud providers for a number of years now; finally, the general business public has woken up to the potential surveillance dangers.

Encryption is becoming a much more visible issue, whereas previously it was a purely technical concern: cloud providers (Google, Microsoft and Twitter, to name three) are ramping up encryption of their traffic in what is rapidly becoming a technology arms race against our own governments. Similarly, businesses and individuals are demanding greater transparency from cloud providers with respect to the disclosures that they are making to government intelligence agencies. Many international bodies are calling for control of internet domains and standards to be wrested away from US-based organizations, since these have been shown to include a variety of government intelligence and corporate sock puppets.

In Canada, our conservative government is busy sucking up to the US government, so we have seen a number of privacy-busting attempts at an online surveillance bill that position “lawful access” (i.e., the government can access all of your information without explicit permission) as “protecting our children” by tossing in a bit about cyberbullying. Geist discussed some of the dangers of the latest bill (Bill C-13, just introduced last week) in a post yesterday, specifically that companies have immunity against prosecution for violating our privacy and information security if they hand that information over to the government under the definitions of this bill.

He finished up with a look at Canada’s anti-spam law that is coming into effect shortly; this includes making communication from businesses opt-in rather than opt-out, and also requiring consent before installing computer programs in the course of a commercial activity.

It was great to see Geist in person: he’s a compelling speaker, full of passion and knowledge about his subject. As always, he inspires me to help make Canada a better place for our online activities.

Breakfast Seminar On Intelligent Business Processes (Toronto) – December 3

I recently wrote a white paper and gave a webinar on intelligent business processes, sponsored by Software AG (although not about their products), and I’m now giving a breakfast seminar for them on the same topic in Toronto on December 3rd. If you’re Toronto-based, or are going to be there that day, you can see more information on the free seminar here and sign up for it here.

From Brazil To Vegas: BPM Global Trends and Building Business Capability

My frenzy of seven conferences in six weeks (eight, if you count the two different cities in Brazil as different conferences) is drawing to a close, but the past two weeks have been a bit brutal. Last week, I was in São Paulo and Brasília for the BPM Global Trends seminar series, where I presented in both cities along with Jan vom Brocke from University of Liechtenstein. It was arranged by ELO Group with the strong support of ABPMP Brazil, and was most interesting because I was presenting from a Portuguese version of my slides (with an English version visible to me) while United Nations-style simultaneous translators worked their magic from a booth at the back of the room.

I did a longer presentation in São Paulo earlier in the week with a workshop in the afternoon, then split it into two presentations with some added material for the public sector seminar in Brasília:

Many thanks to my hosts, and to those voices in my head: Leonardo and Daniel, the wonderful translators who brought my material alive for the Portuguese audience, and translated the questions and comments into English for me.

Unfortunately, I didn’t get to see a lot of Brazil except for hotels, airports and conference rooms, although I did get a short tour (thanks, Jones!) of the weird and wonderful modernist architecture of Brasília on the day that I flew out.

I arrived home in Toronto on Sunday morning, then 24 hours later was on a flight to Las Vegas for the Building Business Capability conference – my third trip to Vegas in a month. I presented a half-day seminar yesterday on emerging BPM technology, an ever-changing topic that continues to fascinate me:

I finished up today with a breakout session on the interplay of rules, process and content in case management, which combines a number of different themes that I’ve been playing with over the past few years, although this is the first time I’ve presented them in this form:

I’m off to the evening reception to meet up with my peeps here, then tomorrow I get to take it easy and listen to someone else present for a change. Or maybe sit by the pool and let my brain relax for a day before I fly home to get back to my regular client work, and start to work through that backlog of product briefings that I have piled up in my drafts folder.

That’s the last of my conference travel for the year, but not the last of my conferences: I’ll be attending at least one day of CASCON next week for a workshop on Real Time Patient Flow Management using Business Process Management, Location Tags, and Complex Events Processing and to hear some of the research papers, then the Technicity Focus on Cyber Security event on November 26th. I’m also speaking at a Toronto breakfast seminar on intelligent business processes on December 3rd for Software AG.

Whew!