Privacy In Focus | August

August 28, 2025

by Olena Nechyporuk

We bring you a round-up of articles and updates from the data sphere.


📢 A Shift in UK GDPR Compensation?

Generally, data breaches that caused minimal harm under the GDPR (such as sending a misdirected email) rarely qualified for compensation. However, this may change following the recent Court of Appeal decision in Farley v Equiniti, which confirms that individuals can claim non-material damages arising from a data breach.

Takeaways from the case:

- A GDPR breach alone doesn't entitle someone to compensation, but there is no threshold of "seriousness" for bringing a claim.

- Anyone can claim damages as a result of a data breach.

- Claimants must show a causal link between the breach and the harm suffered. However, the fear of misuse of personal data arising from a data breach can itself qualify as sufficient harm.

This is a clear signal for organisations to be mindful of possible claims by victims of data breaches. The outcome of this case points to a greater willingness of courts to engage with claims based on psychological harm.

Read more

---

Breaking: Polish DPA Fines Bank for Unlawful ID Copying

The Polish Data Protection Authority (UODO) has fined ING Bank Śląski approximately 4.1 million EUR.  

Between April 2019 and September 2020, the bank unlawfully copied the IDs of both clients and potential clients. This was done even in situations unrelated to AML (Anti-Money Laundering) requirements, for example when clients wanted to submit a complaint. The bank had up to 4.7 million customers in 2020, and processing on that scale demands a high degree of diligence and risk assessment.

What went wrong?

- There was no proper risk assessment in place before launching this practice

- ID copies were collected without a valid legal basis under the GDPR

Key reminder:

Organisations handling personal data at such scale must implement a risk-based approach and regularly review the legal grounds for processing. Otherwise, they pose a serious privacy threat to a large number of people.

Read more

---

Dutch Court on GDPR's Right to Erasure

The Dutch Council of State delivered a significant ruling at the intersection of privacy and healthcare law. The case involved GGNet, a mental health institution that deleted a patient's medical file upon request but retained a note documenting the deletion, which contained some personal and health data.

The patient complained to the Dutch DPA, seeking enforcement of the GDPR's right to erasure, but the authority declined to act on the basis of an applicable exception under Article 9(2)(h) of the GDPR: the note was deemed necessary for ensuring and demonstrating accountability. The Council of State upheld this decision when the patient appealed.

This ruling clarifies an important point: the right to erasure cannot override healthcare providers' accountability obligations, provided retention practices comply with the data minimisation principle and are legally justified.

Read more

---

ICO Releases Children's Data Lives Report

The ICO has recently published a report on the data lives that children lead online in the UK. This comes after the passage of the Data (Use and Access) Act (DUAA).

The report explores how children feel and interact within digital spaces.

A few key takeaways:

- Bypassing age verification has so far been easy for kids

- Algorithmic recommendation feeds shape kids' views both online and offline

- Kids' awareness of data agreements is limited: cookies and privacy are rarely well understood

Read more

---

UK's Online Safety Act: Censorship vs Safety

A BBC study reveals the dangerous tension between child protection and democratic freedoms. As a result of the Online Safety Act, platforms have been censoring content such as parliamentary debates on grooming gangs, war coverage from Gaza and Ukraine, and classical art imagery to avoid fines for noncompliance.

The BBC found that up to 59% of users browse without age verification, meaning that a majority of adults are effectively experiencing a child-filtered internet. Oxford's Prof Wachter warns that these measures are not about protecting children but about "suppressing facts of public interest."

According to the BBC, efforts to protect children online need better implementation from tech companies.

Read more

---

Italy Launches Meta Competition Investigation

Italy's competition authority has launched an investigation into Meta for integrating its Meta AI feature into WhatsApp without user consent, thereby allegedly abusing its dominant position. The AI feature, added to WhatsApp in March 2025, forces users to accept AI services they didn't request, potentially harming competitors and violating EU competition rules.

There was no explicit opt-in consent, and users were not given a choice about whether to accept the AI integration.

If the ruling goes in favour of the Italian competition authority, Meta may face fines of up to 10% of its global revenue for breaching competition laws.

Read more

---

ChatGPT Exposed Thousands of Private Conversations

OpenAI has quietly removed ChatGPT's 'share' feature, which allowed 4,500+ private conversations to become publicly searchable on Google. Users wanting to share their conversations with friends or family were not aware that these conversations became searchable on Google as soon as they were shared via a link.

Users unknowingly exposed sensitive discussions about mental health, relationships, and personal matters, revealing a high level of risk to personal data. Even "deleted" ChatGPT conversations are permanently stored due to ongoing litigation, so users should be careful about entering any personal information into AI systems.

Takeaway: never share sensitive information with AI chatbots, and review privacy policies carefully before using a service.

Read more