Transforming Digital Transformation: A Panel on Women in Tech

I’ve been asked to participate in a panel on March 15 in honor of International Women’s Day. Sponsored by Camunda and Infosys (and technically by me, since I’m giving my time and effort to this without compensation), this panel brings together actual technical women: nothing against women in tech marketing or other non-technical roles, but this is the first “women in tech” panel that I’ve been on where every single one of the non-sponsor participants has a degree in engineering or computer science, and has worked in a technical role at some point in her career.

I’m honored to be invited to join this group of trailblazers in the tech world to discuss the challenges and experiences for women in technology.

Regardless of your gender, the topics that we will discuss have an impact on you. Technology and automation are huge drivers of innovation, and companies are starving for good technical talent, regardless of gender. In fact, women in technology and leadership roles foster diversity, collaboration and innovation in ways that result in higher revenues for companies. Yet in an environment that would seem a natural fit for encouraging more technical women, many companies still throw up barriers, from hiring biases to an unfriendly “bro” culture.

Register at the link above, and tune in on March 15 at 12:30pm (Eastern) for our live discussion with Q&A to follow.

The (old) new software industry

Facebook is a hot mess most of the time, but I usually enjoy the “memories” that remind me what I posted on this date in past years. A couple of days ago, on April 30, I was reminded that in 2007 I attended the New Software Industry conference at the Microsoft campus in Mountain View. These were the days when SaaS and other cloud platforms were emerging as a significant packaging concept, and companies were rethinking their delivery models as well as their split between product and services.

In reviewing those old posts, I found a lot of points that are still valid today, on topics ranging from development practices to software company organization to venture capital. The discussion about the spectrum of software development practices was especially on point: there are some things that lend themselves to a heavily-specified waterfall-like model (e.g., infrastructure development), while others benefit from an agile approach (e.g., most applications). I also liked this bit that I picked up from one of the sessions about software industry qualifications:

In World of Warcraft you can tell if someone has a Master’s in Dragon Slaying, and how good they are at it, whereas the software industry in general, and the open source community in particular, has no equivalent (but should).

I finished my coverage by pointing out that this was a very Valley-centric view of the software industry, and that new software industry conferences in the future would need to be much more inclusive of the global software industry.

I was already live-blogging at conferences by this point in time, and you can read all my posts for the conference here.

On the folly of becoming the “product expert”

This post by Charity Majors of Honeycomb popped up in my feed today, and really resonated relative to our somewhat in-bred world of process automation. She is talking about the need to move between software development teams in order to keep building skills, even if it means that you move from a “comfortable” position as the project expert to a newbie role:

There is a world of distance between being expert in this system and being an actual expert in your chosen craft. The second is seniority; the first is merely .. familiarity

I see this a lot with people becoming technical experts at a particular vendor product, when it’s really a matter of familiarity with the product rather than a superior skill at application development or even process automation technology. Being dedicated to a single product means that you think about solving problems in the context of that product, not about how process automation problems in general could be solved with a wider variety of technology. Dedication to a single product may make you a better technician but does not make you a senior engineer/architect.

Majors uses a great analogy of escalators: becoming an expert on one project (or product) is like riding one long escalator. When you get to the top, you either plateau out, or move laterally and start another escalator ride from its bottom up to the next level. Considering this with vendor products in our area, this would be like building expertise in IBM BPM for a couple of years, then moving to building Bizagi expertise for a couple of years, then moving to Camunda for a couple of years. At the end of this, you would have an incredibly broad knowledge of how to solve process automation projects on a variety of different platforms, which makes you much more capable of making the types of decisions required at the senior architecture and design level.

This broader knowledge base also reduces risk: if one vendor product falls out of favor in the market, you can shift to others that are already in your portfolio. More importantly, because you already understand how a number of different products work, it’s easier to take on a completely new product. Even if that means starting at the bottom of another escalator.

Disruption in 2020: now down on the farm

I’ve been writing and presenting a lot over the past several months about the disruption that the pandemic has brought to many aspects of business, and how successful businesses are harnessing technology to respond to that disruption. In short, the ones that use the technology to become more flexible are much more likely to be coming out of this as a success.

I usually work with financial services clients on their technology strategy and execution, but this story caught my eye on how farmers are embracing Zoom calls and much more to make their operations work better. To quote the article, “the pandemic has sped up the adoption of technology in the agricultural industry as farmers spend more time with digital tools and programs and less time having face-to-face meetings”, which is exactly what’s happening in many other industries. If you thought small family farms were low-tech, think again: the farmer interviewed here uses his iPhone to monitor conditions in his fields, market his products, and track weather predictions from wherever he is. And starting this year, due to social distancing protocols, he orders his seed and supplies online, and uses Zoom to talk to experts about problems that arise during the growing season.

He thinks it’s working out well, which probably means that he’ll continue to work this way in the future. This is a theme that I’m hearing in many other types of businesses: once they’ve had to use technology and reorganize their business to accommodate the current disruption, they’re probably not going back to the old way of doing things.

There is definitely a big lesson here for businesses of any size: failure to innovate is going to cause failure, period.

Hype Cycle: better as a rear-view mirror than a look ahead

Interesting analysis and visualization of 25 years of Gartner hype cycles by Mark Mine, Director of the Technology Innovation Group at the Walt Disney Studios.

As Cory Doctorow pointed out in his post (where I first saw this), “His key insight is that while the Gartner Hype Cycle isn’t much of a predictive tool, it’s a fantastic historical record: looking back on it sheds a lot of insight on how we felt about technology and what those feelings meant for each technology’s future.”

Keep this in mind when you’re looking at the next hype cycle: although Gartner may be intending to predict (and even drive) the future of technology, they’re not all that accurate. However, the history of the data is a fascinating look into technological culture.

TechnicityTO 2018: Taming Transportation Troubles with Technology

Every year, IT World Canada organizes the Technicity conference in Toronto, providing a technology showcase for the city and an opportunity to hear about some of the things that are happening both in the city government and in organizations that operate here. Fawn Annan, president of ITWC, opened the conference and introduced the city manager, Chris Murray, for a backgrounder on the city as an economic engine and how technology enables that.

The sessions started with a panel on transportation technology, moderated by Jaime Leverton, GM of Cogeco Peer 1, and featuring three people from the City of Toronto: Barb Gray, General Manager of Transportation Services; Ryan Landon, Autonomous Vehicle Lead; and Jesse Coleman, Transportation Big Data Team Leader. Erik Mok, Chief Enterprise Architect for the Toronto Transit Commission, was also supposed to be on the panel but had not yet arrived: hopefully not delayed on the TTC. 🙂

They spoke about the need for data collection in order to determine how to improve transportation in the city, whether related to personal vehicles, public transit, cycling or walking. In the past, this required manual data collection on the street; these days, the proliferation of traffic cameras, embedded sensors and smartphones means that a lot of data is being collected about how people are moving around the streets. This creates a need to understand how to work with the resulting big data, and huge opportunities for gaining better insights into making the streets more efficient and safer for everyone.

Since the city is a big proponent of open data, the data that it collects is available (in an anonymized format) to anyone who wants to analyze it. The city is trying to do some of this analysis itself (without the benefit of a data scientist job classification at the city), but the open data initiative means that a lot of commercial organizations — from big companies to startups — are incorporating it into apps and services.

For the King Street Pilot, a year-old project that restricts the travel of private cars on our busiest streetcar route in order to prioritize public transit, the city deployed new types of sensors to measure the impact: Bluetooth sensors that track devices, traffic cameras with embedded AI, and more. This allows for unbiased measurement of the actual impact of the pilot (and other initiatives) that can be communicated to constituents.

There are privacy safeguards in place for ensuring that Bluetooth devices that are tracked can’t be traced to an individual on an ongoing basis, but video is a larger issue: in general, intelligence related to transportation issues is extracted from the video, then the video is discarded. They mentioned the need for privacy by design, that is, building in privacy considerations from the start of any data collection project, not trying to add them on later.
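To make the Bluetooth safeguard concrete, here’s a minimal sketch of one common approach (my own illustration, not the city’s actual implementation): each detected device address is replaced with a one-way pseudonym using a salt that rotates daily and is never stored, so detections can be matched between sensors within a day to compute travel times, but a device can’t be traced over the long term or linked back to an individual.

```python
import hashlib
import secrets
from datetime import date

# Hypothetical roadside-sensor pseudonymization sketch. The daily salt is
# generated in memory and never persisted, so yesterday's pseudonyms cannot
# be regenerated or linked to today's detections.
_daily_salt = secrets.token_bytes(16)
_salt_day = date.today()

def pseudonymize(mac_address: str) -> str:
    """Return a one-way pseudonym for a detected Bluetooth MAC address."""
    global _daily_salt, _salt_day
    if date.today() != _salt_day:            # rotate the salt at midnight
        _daily_salt = secrets.token_bytes(16)
        _salt_day = date.today()
    digest = hashlib.sha256(_daily_salt + mac_address.encode())
    return digest.hexdigest()

# Two sensors on King Street that see the same device a few minutes apart get
# matching pseudonyms, yielding a travel time without retaining the real address.
print(pseudonymize("AA:BB:CC:DD:EE:FF"))
```

The design choice here is that the pseudonyms are useful only within the salt’s lifetime, which is exactly the property described on the panel: measurement without ongoing traceability.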

They also discussed some of the smart sensors being used to control traffic signals, where the length of the queue of waiting vehicles can influence when the signals change. This isn’t just about vehicles, however: there’s an impact on pedestrians who use the same intersections, and on public health in terms of people with mobility challenges.
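As a rough illustration of the queue-length idea (my own sketch, with assumed thresholds rather than anything the panel described), an adaptive signal could simply extend its green phase in proportion to the number of queued vehicles, within fixed bounds:

```python
# Illustrative queue-actuated signal timing. The minimum, maximum and
# per-vehicle values below are assumptions, not the city's actual parameters.
MIN_GREEN_S = 15.0          # shortest allowed green phase, in seconds
MAX_GREEN_S = 60.0          # longest allowed green phase, in seconds
SECONDS_PER_VEHICLE = 2.0   # extra green granted per queued vehicle

def green_duration(queued_vehicles: int) -> float:
    """Return the green-phase length for the detected queue size."""
    requested = MIN_GREEN_S + queued_vehicles * SECONDS_PER_VEHICLE
    return min(requested, MAX_GREEN_S)

# A queue of 10 vehicles gets 35 s of green; a queue of 40 is capped at 60 s,
# which is where the trade-off with pedestrian wait times comes in.
print(green_duration(10), green_duration(40))
```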

Cities like Seattle, San Francisco and New York, which started with transportation data collection much earlier than Toronto, are doing some innovative things, but the panel feels that we’re catching up: there’s an autonomous shuttle project in the works now to fill some of the gaps in our transit system, for example. There’s also some work being done with drones to monitor traffic congestion around special events (presumably both vehicle and pedestrian) in order to understand dispersal patterns.

Interesting audience questions on data storage (Amazon AWS) and standardization of data formats, especially related to IoT.

As a Toronto resident who uses public transit, walks a lot and sometimes even drives, I found this a great look at how big data is feeding into improved mobility for everyone.

Citrix Productivity Panel – the future of work

I had a random request from Citrix to come out to a panel event that they were holding in downtown Toronto — not sure what media lists I’m on, but it’s fun to check out events I wouldn’t normally attend.

The premise was a panel of one millennial, one gen-Xer and one boomer discussing their attitudes towards work and technology:

  • Anna Fitzpatrick, Senior Millennial Correspondent
  • Dr. Mary Donohue, Founder & CEO Donohue Learning Technologies
  • Charles Harnick, Mediator & Arbitrator

This was moderated by Michael Murphy, VP & Country Manager, Citrix Canada.

Donohue is a social scientist, and she started with the definitions of these age divisions (I fall on the cusp of boomer and gen X, although I act like a millennial).

The focus was really on how technology enables different business models (Citrix is, after all, a technology company), and we heard from the panelists about how Twitter, mobile phones and other technologies have levelled the playing field for small businesses regardless of age, gender, race, location and other factors: it’s now possible for a small business’s services to compete with those of bigger companies.

Harnick (a former Ontario Attorney General and politician) commented on how it’s necessary to invest in the education of the workforce: training people to be able to do the work required, mentoring, and taking advantage of the different skills that a non-traditional degree can bring to a traditional workplace.

Donohue talked about the necessity for developing soft skills, since “everything else can be replaced by AI”, which might be a somewhat naive view of AI capabilities. Harnick agreed that although STEM education is critical, there has to be more than just the technological skills for success — entry-level jobs are disappearing in the face of automation in many different industries.

Fitzpatrick talked about the gig lifestyle for work, and admitted that although it’s more easily enabled by technology, a lot of millennials don’t really want to be working several precarious gigs, but would prefer to have full-time regular employment.

Donohue had some really interesting comments on how you have to understand someone’s generation and their abilities with different communication media in order to really understand how to communicate with them and what they take away from what you communicate. I don’t agree with her characterization purely along generational lines, since in my experience, older generations who are disrupted from their status quo will adapt quickly to newer communication modes.

There was a discussion on internet security and privacy, and the different views that different generations have about the issues and practices. Again, I don’t think that security practices are necessarily generational, but much more about an individual’s experience; privacy is a different story, with a bigger intentional digital footprint being left (in general) by younger generations, although the unintentional footprint of everyone who uses social media is fast catching up. An audience member commented on how security and privacy education needs to start in high school or earlier; I suspect that while the younger generation needs some lessons on privacy, the older ones need to hone their security skills.

This was pretty lightweight on both the social science and technology, although Donohue had some points worth considering and I’ll be checking out her TEDx talk.

Technicity2013 Cybersecurity Panel: How Prepared Is Business?

Our afternoon panel was moderated by Pete Deacon of Blackiron Data (another conference sponsor), and featured panelists from private industry: Kevvie Fowler, forensic advisory services at KPMG; Daniel Tobok, digital forensics at TELUS; Jeff Curtis, chief privacy officer at Sunnybrook Hospital; and Greg Thompson, enterprise security services at Scotiabank.

Security breaches happen. And as Deacon reminded us, over 60% of those take months (or years) to detect, and are usually detected by someone outside the organization. What are the real cybersecurity risks, what are companies’ perceptions of the risk, and what are the challenges that we face? Fowler believes that since security is often a low-level IT issue, the security message isn’t making its way up the ladder to the C-suite unless a high-profile breach occurs that requires some sort of executive damage control. Curtis agreed, adding that hospitals are used to dealing with clinical risks right up through the executive levels, but that IT security risks are a new topic for their executive risk management participants. Both noted that it’s important to have the right people to carry that message: it has to be technically correct, but integrated with the business context and goals. Thompson added that the message doesn’t need to be dumbed down for the C-suite: their board is very used to assessing complex financial risk, and is capable of assessing other types of complex risk, although it may need to become versed in some of the cybersecurity language and technology.

The next topic was BYOD (bring your own device), and Thompson pushed the conversation beyond this to BYON(etwork), where people bring their own network, even if just through a smartphone hotspot. Companies are losing control of where people do their work, both devices and network, and solutions should be designed to assume that all endpoints and networks are potentially hostile. Business and productivity have to be balanced with risk in these cases: people will do what they need to do in order to get their job done, and if you think that you’ve avoided security breaches by locking down someone’s access on their corporate device, you can be sure that they’re finding a way around that, possibly on their own device. Curtis agreed, and pointed out that they have a lot of students and interns who come in and out of the hospital environment with their own devices: the key is to enable workers to get their work done and protect the data, not to hamstring their work environment, so they have a device registration policy for BYOD that is working well. Tobok works with a lot of law firms, and notes a recent trend of new lawyers using technology capabilities (including openness to BYOD) as a competitive criterion when selecting a firm to work for.

Moving on to security analytics, Fowler said that there are few organizations actually getting value from predictive security analytics, versus more straightforward data mining: it’s important to query the vendors providing predictive analytics on the models that they’re actually using and the success rates. Thompson agreed that predictive analytics is a bit of black magic right now, but sees a lot of value in historical data analysis as a guide to improving the security environment. In my opinion, in the next two years, predictive analytical models are going to start to become mainstream and useful, moving out of a more purely research phase; we’re seeing this in predictive process analytics as well, which I still talk about in the context of “emerging technologies”. This is all tied up with reporting and compliance, of course: business intelligence and analytics have played, and will continue to play, a key role in detecting breaches and auditing cybersecurity. Both Curtis and Thompson spoke about the regulatory pressures in their respective industries and the growth of analytics and other GRC-related tools; healthcare is obviously a highly-regulated industry, and Scotiabank does business in 55 countries and has to deal with the regulations in all of them. Auditors and regulatory bodies are also having to step up their knowledge about cybersecurity.

There was a question from the audience on investigations of security breaches in cloud environments: Tobok is involved in cybersecurity forensic investigations including cloud, and discussed the changes that have happened in the industry in the four years that he’s been involved in cloud security forensics in order to provide better traceability and auditing. Fowler added that forensic science is adapting for these types of investigations, and half of the work is just figuring out which systems the data has been resident on, since the typical cloud contract only allows a client to access their data, not the actual servers on which it resides. These investigations can also involve a number of other factors, such as hackers who use compromised credit cards to lease space in a data centre in order to hack into another organization’s data in that same centre; obviously, these complexities don’t exist in breaches of a company’s own data centre.

There was a final panel with five of the vendors who are sponsoring the conference, but my brain was pretty full of security information by then (and I thought that this might be a bit more about their products than I care about) so I decided to duck out before the end.

Another great Technicity conference, and I look forward to next year.

Technicity2013 Cybersecurity Keynote: Microsoft’s Angela McKay

This morning at Technicity 2013, we mostly heard from academics and the public sector; this afternoon, it’s almost all private sector presentations and panels, starting with a keynote from Angela McKay, director of cybersecurity and strategy at Microsoft, on managing cyber risks through different approaches to addressing uncertainty. Risk, and therefore the answer to the question “am I secure enough?”, is quite an individual choice: different people and different companies (and cultures) have different risk thresholds, and therefore may have different cybersecurity strategies.

By 2020, we will have 4B internet users, 50B connected devices, and data volumes 50x those of 2010. As users have evolved, so have cyber threats: from early web defacement hacks, to worms, to the present-day botnets and targeted attacks. There is a spectrum of cybersecurity threats: crime, disruptions (e.g., DDoS attacks), espionage, conflict, war; there is a lot of technological development going on around these, but there are also cultural and policy issues, namely the expectations of consumers, companies and governments. McKay discussed the EU network and information security directive and the US executive order and presidential policy directive on cybersecurity, and the levels of new regulation that are coming.

Reducing the impact of cyber threats involves risk management, information exchange, and effective partnership (both public-private and between private organizations). You can’t do risk management without information, and this means that cybersecurity is a CIO-level issue, not just some technical plumbing. Information sharing, however, can’t be indiscriminate; it has to be focused on specific outcomes. [As an aside, I’m not sure that I agree with this in some situations: open data initiatives work because the “owners” of the data can’t conceive of what anyone would do with their data, yet emergent uses happen with interesting results.] Public-private partnerships bring together the public sector’s policies and goals related to public safety, and the private sector’s technical know-how.

She spoke about the shared responsibilities for managing cyber risks: awareness and education, partnering effectively, driving and incentivizing cyber security, adopting best practices, building and advancing capabilities, and developing a secure workforce. Furthermore, academia has to step up and start teaching security concepts and remedies at the college and university level, since most developers don’t have much of an idea about cyber risks unless they specialized in security post-graduation.

Microsoft is the premier sponsor of Technicity 2013, although to be fair, McKay’s talk covered very little about their products and services except for some generic discussion about automated cyberdefense at a machine level. Her slides used that ubiquitous font that we see on the Microsoft Windows 8 website, however, so probably some subliminal messaging going on. 🙂

Technicity2013 Cybersecurity Panel: Is Canada Ready?

Andy Papadopulous of Navantis moderated a panel on the Canadian context of cybersecurity, with panelists Rob Meikle, CIO of City of Toronto; Ritesh Kotak, Operation Reboot (cybercrime initiative) at Toronto Police Service; Wesley Wark, professor at University of Ottawa’s graduate school of public and international affairs, and a specialist in national security policy; and Stephen McCammon, legal counsel at the Ontario Information and Privacy Commissioner.

They each spoke about their specific take on privacy and security in Canada:

Meikle: The interconnection and importance of data and technology, and how these are no longer just on computers inside our offices: in addition to cloud computing, we consume information on mobile devices, but also collect and process information from remote devices such as transit vehicles. He addressed the Toronto open data initiative, and how it is critical to look at data from a public citizen perspective rather than an organizational perspective: similar views would not go amiss in private sector organizations and their data.

Kotak: How TPS is having to redefine crime in the era of cybercrime, and how the police force is having to adapt in order to track down online crimes in the same way that they do “real world” crimes, so as to protect public safety. His experience in researching how police services are addressing cybercrime is that many of them equated it only with child exploitation (driven, likely, by the federal government’s tendency to do the same in order to justify the over-reaching anti-privacy legislation that we heard about from Michael Geist earlier), but there are obviously many other forms of cybercrime, from financial fraud to hacking pacemakers. They identified a number of areas that they needed to address with respect to cybercrime: overt communication (e.g., social media), investigations, covert operations, and policies and procedures.

Wark: Cyberaggression and its impact on us, with five possible outlets: cyberwar, cyberterrorism, cyber covert operations, cyberespionage and cybercrime. He feels that the first two do not actually exist, that covert operations is an emerging area, while espionage and crime are well-established cyber activities. He maintains that the government’s focus on terrorism in general is a bit ridiculous, considering the lack of any evidence that this is occurring or even imminent (a recent US study showed that Americans are more likely to be killed by their own furniture than by terrorism); and that the government has a difficult time establishing their role and responsibilities in cybersecurity beyond throwing out some simplistic barriers around classified government data. We need to do more with private-public partnerships and education — starting with some simple sharing of best practices — in order to appropriately address all forms of cyberaggression. We need to decide what we really mean by privacy, then define the legal framework for protecting that.

McCammon: How to achieve the balance between privacy and openness. Usability is critical: it’s not enough just to have good authentication, encryption and other services to protect people’s privacy; those tools need to be easy enough for everyone to use (or completely and transparently embedded in other platforms), although Wark countered that this was unlikely to happen. More information is being gathered, and will continue to be gathered, and analytics allow that to be integrated in new ways; there is no putting the toothpaste back in that particular tube, so we need to learn to deal with it in ways that protect us without requiring us to pull the plug and move to the woods. Trust is essential for privacy (although I would add that enforcement of that trust is pretty critical, too).

Good discussion.