In the run-up to any holiday, I’m used to not feeling quite the same excitement as many other employees.
When other folk logged off and went home, the IT department and those who worked alongside it were often obscenely busy. My first year in IT coincided with the millennium. I was very new, so I missed most of the Y2K prep, but it did capture the imagination.
So here I am, 20 years later, having clawed back some of that holiday cheer. Mainly because I’ve pivoted towards all things data protection. But someone decided to get Brexit done, then Max Schrems landed a knockout in round II. Not to mention squillions of other data protection and privacy regulations brewing.
Most recently I’ve been talking to very small firms about some of those things and their intersection with security. Tech startups who mainly have business propositions because of the commoditisation of cloud computing in the hands of a few giant US players. That, and the massive promise of being able to play with data at scale. Things we’re kind of being told are off limits to those of us dealing with EU residents. That’s not strictly true, but unless someone blinks on the geopolitical surveillance stage, it is going to be ‘interesting’ to work around. Encryption doesn’t solve everything, mainly because you have to decrypt stuff to use it, but it does help a fair bit. What helps more is the fact that these small firms can also be agile. They can immediately lay their hands on technical detail and rapidly get things done. Having that opportunity to weave in key security and data protection principles, but also meaningful pauses, has been priceless. Mainly because I’m lucky to work with people who keep respect at the heart of plans.
Most of it is about having a place to park things that do not pass the sniff test, and embedding criteria so the ones that really matter, the ones that can bake in insoluble issues, get escalated to people who can change minds. What kind of sniff test? This kind:
GDPR sniff test
Privacy notice vs. Data handling reality
- If it was your data, would the privacy notice feel easy to read, and transparent about the planned use?
- If it was your data, would planned use feel fair and proportionate to achieve stated aims?
Data security risks vs. Control reality
- If it was your data, would the security controls in place feel appropriate, and adequate to mitigate unacceptable risk to you and your contacts?
Justifications vs. Reality of potential impact
- If it was your data, would the stated organisational interests feel like they outweigh all the things that could go personally and collectively wrong?
Some things go back into the mix to be smoothed out as development progresses; some are fine to fix through beta release support calls; others are categorically not. Most risk mitigation can’t be backfilled, yet too much is handed off to a client to accept on behalf of the rest of us. Too often that lasts until a head of steam builds in the shape of poor publicity, something then fended off by injecting an ever-widening gap between potentially impacted end users and the scarce remaining bodies allowed to pick up the phone.
I’m lucky to work with people who see the difference. They build for a carefully considered run environment and educate people about exceptions, because they know interactions between people with different agendas may well change how their products and services are used. They know that people with myriad needs and fundamentally different relationships with the world will be impacted by what they create in very different ways.
It is a fabulous antidote to some of the shenanigans being seen elsewhere. Attempts to de-humanise us so we better fit a policy mandate, or a definition of undesirable. Worse still, a suggestion of different worth, a suggestion that one life is inherently less valuable than another.
In the same way we saw younger and more diverse portions of the population ring in political change, younger and more diverse firms are rejecting rabid data monetisation without proper transparency and consideration of more than just immediate consequences. That is why, in a job that is too often a perpetual repetition of “it depends” I am absolutely certain it is the right thing for me to do.
If I could translate all that into one holiday wish, it would be for folk to steal my sniff test. And one day soon, when I’ve finished waxing lyrical about holiday cheer, I’ll be happy to put some meat on those bones.
In the meantime, here’s to 2021, fresh perspectives, fresh starts, and good health for everyone.
About Sarah Clarke
Sarah is a security and data protection governance specialist on a tireless mission to cut through the murk of new buzzwords to make all of our lives a bit easier. After a business degree at Edinburgh University, she toyed with customer relations, then stumbled into IT. In the intervening 20 years, including nearly a decade working in the financial services sector
she gained some really valuable insight into cybersecurity and data protection challenges.
She translates lessons learned into better ways to tackle governance and risk, in depth and at both start-up and corporate scale. She is also an award-winning blog writer, a frequent contributor to trade publications, a speaker, and a guest lecturer on the Manchester University IT Governance master’s course.
Her overarching aim is to make it simpler to do things in a secure and privacy-respecting way, born from the conviction that no one should be accountable for something they can’t influence or don’t understand.
When not doing, speaking, or writing about GRC, she fights for better treatment for pancreatic cancer, mainly through Pancreatic Cancer UK. It is a deadly and ignored disease and deserves all the publicity and political attention we can muster.