The Evolution of Privacy Regulations – an AIIM Toronto seminar

AIIM Toronto runs some great morning seminars every month or so, and today the guest is Else Khoury of Seshat Information Consulting to talk about privacy regulations. In the face of recent privacy gaffes, from the Facebook fiasco (the breach that wasn’t a breach) to Sidewalk Labs not thinking about where the public data that they collect in Toronto will be stored (hello, data sovereignty), and with the upcoming GDPR regulation, privacy is hot right now. Khoury, who brightened our day by telling us that her company is named after the Egyptian goddess of recordkeeping, covered both Canadian and EU privacy frameworks.

In Canada, we’ve had the Privacy Act since 1983, which governs federal government offices and how they handle data about employees and citizens, including Freedom of Information. PIPEDA (the Personal Information Protection and Electronic Documents Act) came in 2000, setting rules for how private organizations handle personal information. As technology evolved, the Digital Privacy Act of 2015 made major amendments to PIPEDA, adding mandatory breach reporting and recordkeeping. Khoury briefly covered FIPPA (the Freedom of Information and Protection of Privacy Act) and MFIPPA, which apply the same sort of regulations as the Privacy Act, but to provincial and municipal governments. PHIPA (the Personal Health Information Protection Act) protects our health-related information across all types of health care providers; it was updated quite recently to state that “use” includes viewing information, after a few cases of nosy health care workers looking up records of people they shouldn’t have. There is (or soon will be) mandatory reporting of PHIPA breaches in most provinces, including reporting to the regulatory colleges for the different types of health care workers. There is also a privacy framework for electronic health records (EHR) under new revisions to PHIPA.

There are analogous privacy regulations in many other countries; for example, the US HIPAA serves the same purpose as our PHIPA, while GDPR is a broader regulation that covers data across all organizations rather than dividing coverage between the private and public sectors as ours do.

There was a good discussion on security versus privacy: security is often focused on keeping external parties out, whereas privacy has to do with how people handle data inside an organization, although these are often intertwined issues. Of course, it’s possible to have a privacy breach (e.g., inappropriate internal access) without a security breach and vice versa. Khoury pointed out that a lot of privacy regulations have to do with processes; in my experience, compliance regulations in general are very process-driven, and the best way to both avoid privacy breaches as well as prove that you have safeguards in place is to implement and audit processes around how data is handled.

She moved on to GDPR, which comes into effect in the EU in May of this year; GDPR covers all personal data of EU residents, since often the combination of data from multiple sources can be used to identify individuals even when a specific identifier (such as name) is not present. As with the 10 privacy principles in Canadian privacy regulations, GDPR has a set of key principles, and uses the concept of Privacy by Design that was co-developed by Ontario’s privacy commissioner. GDPR has specific rules around data retention, specifically not keeping data longer than is required, then securely destroying it. This led to a really interesting discussion of how companies that provide recommendations handle retention of historical data about your interactions with them, such as Netflix or Amazon: will we need to explicitly give them permission to keep information about our past purchases/consumption in order for them to give us better recommendations? GDPR will forever shift data permissions from opt-out to opt-in for Europeans, although that has been creeping up on us for a while.
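The retention principle boils down to a mechanical rule: keep personal data only as long as its purpose requires, then securely destroy it. As a toy sketch of that rule (the record categories and retention periods here are invented for illustration, and are not taken from GDPR or from the talk):

```python
from datetime import date, timedelta

# Hypothetical retention periods, one per category of personal data.
# Real periods would come from your retention schedule, not from code.
RETENTION_DAYS = {"purchase_history": 365, "support_ticket": 730}

def due_for_destruction(category: str, collected_on: date, today: date) -> bool:
    """Return True once a record has exceeded the retention period
    for its category and should be securely destroyed."""
    return today > collected_on + timedelta(days=RETENTION_DAYS[category])

print(due_for_destruction("purchase_history", date(2017, 1, 1), date(2018, 5, 25)))  # True
```

The interesting question from the discussion is what "purpose" means for recommendation engines: under an opt-in model, keeping purchase history past its billing purpose would itself need your consent.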

One of the most talked-about GDPR principles is the right to be forgotten — Google has already received millions of take-down requests under that part of the regulation — although it doesn’t apply to most health care data, since that data is required to provide proper medical care to an individual. GDPR also has breach reporting regulations similar to Canada’s PIPEDA requirements, and pretty significant penalties if a breach occurs or an organization can be shown to be non-compliant.

She finished up with a discussion of how privacy regulation changes are likely to impact organizations, and how to operationalize privacy regulations, which depends on the type of data you handle (PI versus PHI), how you interact with it (processing versus controlling), and whether you have a privacy management program in place. You’ll need to assess your holdings — what data you collect, how it’s used, who has access, how long it is retained, how it is secured and destroyed — and develop a privacy management team that involves senior management and every department, not just a data privacy officer (DPO). You’ll need to develop a privacy management program that includes a breach response process, ensure that everyone is trained in privacy management, then audit and adapt the program over time. If you’re subject to GDPR, then you’ll also need processes for expunging data from your systems in a timely fashion in response to “right to be forgotten” requests.
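A holdings assessment like this is often captured as an inventory, one entry per data holding. A minimal sketch of what one entry might record, with entirely hypothetical field names and sample values (none of this structure comes from the talk):

```python
from dataclasses import dataclass

# One inventory entry per data holding; the fields mirror the assessment
# dimensions: what is collected, why, who has access, how long it is kept,
# how it is secured, and how it is destroyed.
@dataclass
class DataHolding:
    name: str
    data_elements: list   # what personal data is collected
    purpose: str          # how it's used
    access_roles: list    # who has access
    retention: str        # how long it's retained
    safeguards: str       # how it's secured
    destruction: str      # how it's destroyed

holding = DataHolding(
    name="customer billing records",
    data_elements=["name", "address", "card token"],
    purpose="invoicing and payment processing",
    access_roles=["billing staff", "DPO"],
    retention="7 years after account closure",
    safeguards="encrypted at rest, role-based access",
    destruction="crypto-shredding of keys",
)
print(holding.name)  # prints "customer billing records"
```

The point is less the data structure than the discipline: an inventory in this shape gives the privacy management team something concrete to audit and gives a DPIA its starting map of information flows.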

You’ll also need to develop a framework for data protection impact assessments (DPIAs, aka privacy impact assessments or PIAs), which are proactive risk assessments for new programs or systems that use personal data; interestingly, the first step is often mapping the information flow processes that cover collection, storage and access. Performing DPIAs/PIAs is part of what Khoury’s company does for organizations, and she had a good checklist of the steps involved; she also pointed out that they should be a regular part of your privacy management program, not something that’s just done at the end as an audit step.

As always, great content at the AIIM Toronto morning seminars, and I look forward to the next one.

Technicity 2013 Cybersecurity Keynote: Microsoft’s Angela McKay

This morning at Technicity 2013, we mostly heard from academics and the public sector; this afternoon, it’s almost all private sector presentations and panels, starting with a keynote from Angela McKay, director of cybersecurity and strategy at Microsoft, on managing cyber risks through different approaches to addressing uncertainty. Risk, and therefore the answer to the question “am I secure enough?”, is quite individual: different people and different companies (and cultures) have different risk thresholds, and therefore may have different cybersecurity strategies.

By 2020, we will have 4B internet users, 50B connected devices, and data volumes 50x those of 2010. As usage has evolved, so have cyber threats: from early web defacement hacks, to worms, to the present-day botnets and targeted attacks. There is a spectrum of cybersecurity threats: crime, disruptions (e.g., DDoS attacks), espionage, conflict and war; there is a lot of technological development going on around these, but there are also cultural and policy issues, namely the expectations of consumers, companies and governments. McKay discussed the EU network and information security directive and the US executive order and presidential policy directive on cybersecurity, and the levels of new regulation that are coming.

Reducing the impact of cyber threats involves risk management, information exchange, and effective partnership (both public-private and between private organizations). You can’t do risk management without information, and this means that cybersecurity is a CIO-level issue, not just some technical plumbing. Information sharing, however, can’t be indiscriminate; it has to be focused on specific outcomes. [As an aside, I’m not sure that I agree with this in some situations: open data initiatives work because the “owners” of the data can’t conceive of what anyone would do with their data, yet emergent uses happen with interesting results.] Private-public partnerships bring together the policies and goals related to public safety of the public sector, and the technical know-how of the private sector.

She spoke about the shared responsibilities for managing cyber risks: awareness and education, partnering effectively, driving and incentivizing cybersecurity, adopting best practices, building advanced capabilities, and developing a secure workforce. Furthermore, academia has to step up and start teaching security concepts and remedies at the college and university level, since most developers don’t have much of an idea about cyber risks unless they specialized in security post-graduation.

Microsoft is the premier sponsor of Technicity 2013, although to be fair, McKay’s talk covered very little about their products and services except for some generic discussion about automated cyberdefense at a machine level. Her slides used that ubiquitous font that we see on the Microsoft Windows 8 website, however, so probably some subliminal messaging going on. 🙂

Technicity 2013 Cybersecurity Panel: Is Canada Ready?

Andy Papadopulous of Navantis moderated a panel on the Canadian context of cybersecurity, with panelists Rob Meikle, CIO of City of Toronto; Ritesh Kotak, Operation Reboot (cybercrime initiative) at Toronto Police Service; Wesley Wark, professor at University of Ottawa’s graduate school of public and international affairs, and a specialist in national security policy; and Stephen McCammon, legal counsel at the Ontario Information and Privacy Commissioner.

They each spoke about their specific take on privacy and security in Canada:

Meikle: The interconnection and importance of data and technology, and how these are no longer just on computers inside our offices: in addition to cloud computing, we consume information on mobile devices, and also collect and process information from remote devices such as transit vehicles. He addressed the Toronto open data initiative, and how it is critical to look at data from a citizen’s perspective rather than an organizational perspective; similar views would not go amiss in private sector organizations and their data.

Kotak: How TPS is having to redefine crime in the era of cybercrime, and how the police force is having to adapt in order to track down online crimes in the same way that they do with “real world” crimes in order to protect public safety. His experience in researching how police services are addressing cybercrime is that many of them equated it only with child exploitation (driven, likely, by the federal government tendency to do the same in order to justify their over-reaching anti-privacy legislation that we heard about from Michael Geist earlier), but there are obviously many other forms of cybercrime, from financial to hacking pacemakers. They identified a number of areas that they needed to address with respect to cybercrime: overt communication (e.g., social media), investigations, covert operations, and policies and procedures.

Wark: Cyberaggression and its impact on us, with five possible outlets: cyberwar, cyberterrorism, cyber covert operations, cyberespionage and cybercrime. He feels that the first two do not actually exist, that covert operations is an emerging area, while espionage and crime are well-established cyber activities. He maintains that the government’s focus on terrorism in general is a bit ridiculous, considering the lack of any evidence that this is occurring or even imminent (a recent US study showed that Americans are more likely to be killed by their own furniture than by terrorism); and that the government has a difficult time establishing their role and responsibilities in cybersecurity beyond throwing out some simplistic barriers around classified government data. We need to do more with private-public partnerships and education — starting with some simple sharing of best practices — in order to appropriately address all forms of cyberaggression. We need to decide what we really mean by privacy, then define the legal framework for protecting that.

McCammon: How to achieve the balance between privacy and openness. Usability is critical: it’s not enough just to have good authentication, encryption and other services to protect people’s privacy; those tools need to be easy enough for everyone to use (or completely and transparently embedded in other platforms), although Wark challenged that this was unlikely to happen. More information is being gathered, and will continue to be gathered, and analytics allow it to be integrated in new ways; there is no putting the toothpaste back in that particular tube, so we need to learn to deal with it in ways that protect us without requiring us to pull the plug and move to the woods. Trust is essential for privacy (although I would add that enforcement of that trust is pretty critical, too).

Good discussion.