Making data protection a reality rather than just a tick-box exercise when looking after vulnerable people
Charities occupy a unique and often complex space when it comes to regulatory compliance, especially when working with vulnerable individuals. Their work is not only compassionate and service-driven; it also demands careful, ethical handling of personal data.
Whether supporting someone in crisis, processing donations or managing staff and volunteers, charities often hold sensitive information tied to real lives and real vulnerabilities. Data protection here isn’t just about regulation; it’s about trust. Trust that the information shared will be treated with care and discretion.
Yet that trust has been worn down by high-profile data breaches and the perception that some organisations view data protection as an afterthought. In a charity setting, this kind of oversight can cause real harm to people already facing serious challenges. The duty of care is greater, and so must be the care we apply to data.
Handling personal data with care
The UK and EU GDPR classify some information as special category data, including details about ethnic origin, religion, health and biometric data. These aren’t always obvious. A note like “low blood sugar due to fasting” dated during Ramadan might suggest religious observance, potentially identifying someone as Muslim. A combination of such details can form a bigger picture, often unintentionally.
This data requires stronger protection for good reason. History has shown us what happens when information about ethnicity, beliefs or medical conditions falls into the wrong hands. Sadly, discrimination and persecution based on such traits continue around the world today.
GDPR special category data, then, isn’t just a legal term. It signals a responsibility to protect something deeply personal because mishandling it can have devastating consequences.
Adopting a culture of caution
IT managers are increasingly promoting a “zero trust” mindset inside organisations: question that email, double-check that link, lock down access to Zoom calls. This is good practice. But if we encourage staff not to automatically trust, can we blame service users for doing the same?
Why should someone assume a charity has strong security simply because it says so? For vulnerable individuals handing over sensitive information, the question “can I trust you with this?” is not just fair; it’s fundamental.
Legal scholar Ryan Calo’s work on privacy and vulnerability explores how privacy often diminishes as vulnerability increases. Imagine someone collapses at work: paramedics are called, and records are shared quickly under lawful bases such as vital interests or legitimate interests. In moments like that, privacy can seem to vanish. It may be necessary, but it also highlights how privacy and vulnerability interact.
Charities, more than most, must understand this. Their service users may already have fragile relationships with public services. Data protection principles - lawfulness, fairness, transparency and minimisation - are not just legal niceties. They are tools for building safe, respectful relationships.
Storing sensitive data
A person experiencing homelessness might be asked intimate questions about sexuality, substance use or mental health. We gather this data to support them but what happens next?
Often, that information is kept indefinitely or accessed more widely than needed. Notes like “visit in pairs” may reflect safety concerns but could also stigmatise. While the intention might be protection or service planning, we need to ask: are we holding on to this data for the right reasons?
Data protection principles go back decades, set out in the 1972 Younger Report, and they still resonate today. One of the most overlooked is storage limitation. GDPR doesn’t just require retention schedules; it asks us to keep data that renders an individual identifiable only for as long as needed. Perhaps we should call it “identification limitation”. Storage limitation is not really about data retention; it’s about identification limitation.
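To make that idea concrete, here is a minimal sketch of what an “identification limitation” sweep could look like. The record structure, field names and two-year retention period are my own illustrative assumptions, not requirements drawn from the GDPR:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Assumed two-year retention period - set this from your own
# retention schedule, not from this example.
RETENTION = timedelta(days=2 * 365)

@dataclass
class CaseRecord:
    name: str            # directly identifying
    date_of_birth: str   # directly identifying
    support_notes: str   # free text - may still identify; needs its own review
    closed_on: date

def apply_identification_limit(record: CaseRecord, today: date) -> CaseRecord:
    """Strip directly identifying fields once retention has lapsed,
    keeping only what anonymous service reporting still needs."""
    if today - record.closed_on > RETENTION:
        record.name = "REDACTED"
        record.date_of_birth = "REDACTED"
    return record
```

The point is that the record survives for statistics and planning, but its power to identify a person expires on a schedule we have consciously chosen.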
So why not anonymise or pseudonymise data at the point of collection? Often, the answer is cost. But many organisations fund senior salaries while claiming they can’t afford stronger data security. The sixth GDPR principle, integrity and confidentiality, allows a balance between risk and cost, but that balance must reflect the real stakes, especially for vulnerable people.
We may not need to anonymise everything immediately, but we should strongly consider it when dealing with our most sensitive data. If we value the trust people place in us, our data practices must show it.
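As a sketch of what pseudonymisation at the point of collection could look like, here is one common approach using a keyed hash. The key name, identifier and record fields are hypothetical, and a real deployment would need proper key management:

```python
import hmac
import hashlib

# Hypothetical key for illustration only - in practice this secret would
# live in a key vault, separate from the case data, so the pseudonym
# cannot be reversed by anyone holding the data alone.
PSEUDONYM_KEY = b"replace-with-a-secret-from-your-key-vault"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash at the point of collection."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The case file stores only the pseudonym; re-identification is possible
# only where the key is held, which is the whole point of pseudonymisation.
case = {"client_id": pseudonymise("QQ123456C"), "need": "housing support"}
```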
Rethinking “data sharing”
“Data sharing” is a misleading phrase. It suggests mutual exchange, like sharing sweets or stories. But in practice, it’s often one-way disclosure.
When staff are asked in audits whether data is shared outside the organisation, most say no, because they picture sharing as collaborative. But once this is reframed as disclosure, the answers change. Who else can access your data? Who deletes it at the end of its lifecycle? These are essential questions.
Bedrock of good practice
This brings us right back to the first and second data protection principles: lawfulness, fairness, transparency and purpose limitation. If we’re disclosing data, we need to be clear about why, with whom, and on what legal basis. These aren’t abstract requirements; they go to the heart of ethical and safe data handling, especially when dealing with sensitive or vulnerable individuals.
The principle of minimisation also deserves serious attention. It’s there in nearly every credible data protection framework. For instance, it appears as the third principle in Jamaica’s Data Protection Act of 2020. And yet, oddly enough, the statutory subject access request (SAR) form includes a pair of tick boxes asking for the requestor’s sex. Why? What purpose does it serve? How does it align with the principle of collecting only what is strictly necessary?
This isn’t a dig at the Jamaican Parliament or the Office of the Information Commissioner. It’s simply a reflection of how difficult it can be to implement data protection in practice. Culture, habits and long-held assumptions often stand in the way of meaningful change.
We keep collecting data because we always have and because we’re unsure what will happen if we stop. But we need to be braver, and we need to design our data collection around the idea of limiting identification wherever possible.
It’s worth emphasising that minimisation doesn’t stop at data collection. It should shape every interaction we have with personal data. That means reducing what we collect, of course, but also limiting access, reducing identifiability, and thinking critically about who needs to see what, and when.
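One way to picture minimisation beyond collection is to filter what each role can see. The roles and field lists below are invented for illustration, but the pattern, returning only what a role genuinely needs, is the point:

```python
# Illustrative sketch only: these roles and field lists are assumptions,
# showing minimisation applied to access rather than to collection.
VISIBLE_FIELDS = {
    "frontline": {"client_id", "support_plan", "safety_notes"},
    "fundraising": {"client_id"},          # no health or lifestyle detail
    "reporting": {"region", "need_type"},  # aggregate-level fields only
}

def minimise_for(role: str, record: dict) -> dict:
    """Return only the fields a given role genuinely needs to see."""
    allowed = VISIBLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}
```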
Ransomware attack
A few years ago, a charity suffered a ransomware attack. More precisely, its IT services supplier was hit. The fallout knocked the charity’s systems offline for weeks, some even for months. The charity only survived because it had its own small-scale backup arrangements. Still, it was a close call and a powerful reminder of how vulnerable we are when we rely on third-party processors without proper checks in place.
Despite the increasing regulatory pressure on processors, many data controllers continue to operate under a misplaced sense of security. They assume that once data is with a supplier, it’s safe. But where’s the “zero trust” mindset?
On a separate project, I visited two data centres. One in Manchester was secure and well managed, processing travel data. The other, in London, handled vast amounts of sports fan data, millions of records each night, drawn from ticketing, websites, and social media to build detailed customer profiles.
I insisted on visiting it in person. What I found was shocking. The site was poorly secured and completely unfit for purpose. Yet big-name sports clubs were using the service simply because others were.
The lesson? Never assume. Do proper due diligence on your suppliers. You might be sitting on a ticking time bomb.
Real impact of a breach
Credit to the ICO for its 2024 “Ripple Effect” campaign; if you haven’t seen it, Google it. The grainy black-and-white images and stark messages were hard-hitting and, frankly, spot on for charities handling sensitive data. One quote stuck with me: “The data breach was a nightmare. We had to work round the clock for two weeks.”
But that’s not the real story. The real nightmare was for those whose personal data was exposed. Some had to change their names, relocate and move their children to new schools.
Think of the situation at personal genomics company 23andMe, or the Police Service of Northern Ireland (PSNI) incident where the personal information of its entire workforce was exposed, leaving many fearing for their safety. A single error - one mistyped entry or unauthorised disclosure - can wreak havoc on someone’s life. This is why what we do with data matters so much.
Beyond tick-box training
All of this brings us to training and awareness. Ask yourself: what are you doing tomorrow to change behaviour in your charity? Is your annual data protection training just a tick-box exercise? Is it even any good? Is it tailored to your charity, or just some off-the-shelf e-learning module that people click through half-asleep?
Anyone with basic digital skills can pull together a bespoke, narrated presentation. Anyone can design a simple poster reminding staff how to handle data securely. So why don’t we? Why do we default to generic training that fails to reflect the real challenges and risks we face, or the expectations and rules for information handling in our own organisations?
Data protection isn’t common sense. It’s a discipline. And if we want people in charities to do it well, we need to give them the tools, the context and the support to do so. Let’s stop building a “defensible position” just to protect ourselves in the event of a problem. Let’s build a posture that actually protects the people whose data we hold.