Privacy In Focus | September

September 26, 2025

by Olena Nechyporuk

We bring you a round-up of articles and updates in the data sphere.


EU Court Challenges Public Naming of Athletes

The EU Advocate General issued an opinion that Austria's practice of publishing athletes' names online for anti-doping violations is contrary to the GDPR. The published information, hosted on anti-doping websites, included athletes' names, sports, suspension details, and the exact violations committed.

Austrian law mandates systematic publication of athletes' names, violations, and sanctions on public websites; however, the Advocate General's analysis identified fundamental conflicts with the GDPR:

• No individual necessity assessment was performed

• The data minimisation principle was not adhered to

• Special category data (health/criminal offence data) was not treated properly

The Advocate General suggested that an alternative pseudonymised approach would have achieved legitimate aims while respecting data subject rights.

This case reinforces that proportionality is one of the key aspects controllers should keep in mind when dealing with personal data.

Read more

---

£550,000 Fine for Unlawful Marketing

Two energy firms in the UK were hit with £550,000 in fines for unlawful robocalls targeting vulnerable people. The companies used avatar software with fake personas ("Jo, Helen, Ian") while actual call agents operated from abroad, making millions of automated calls without consent.

How can you spot a robocall?

• Slight response delays after your answer

• Generic, scripted answers from the agents

• Identical voices across "different" agents or calls

Under the GDPR, automated marketing calls require explicit, informed consent; without it, a marketing call is illegal.

How can you protect yourself from spam calls?

• Register with the Telephone Preference Service (TPS)

• Report violations to the ICO

• Always verify caller identity

Read more

---

Children's Safety Questioned by the FTC

The FTC is investigating seven major tech companies (Alphabet, OpenAI, Character.ai, Snap, xAI, Meta, and its subsidiary Instagram) over how their AI chatbots interact with children.

This investigation is prompted by tragic lawsuits involving a teen suicide linked to prolonged chatbot conversations, as well as by Meta permitting AI companions to have "romantic or sensual" conversations with minors. This raises serious concerns about AI's psychological impact on young minds.

Key concerns:

• How much sensitive personal information is collected from vulnerable users?

• Profiling of minors' psychological states

• Weak parental consent and age-verification mechanisms

• Monetization of children's private conversations and emotions

Strong privacy safeguards are a must when children are involved, and tech giants have a responsibility to ensure their products do not pose significant risks to their users.

Read more

---

The DSA and the GDPR: How do They Work Together?

The European Data Protection Board (EDPB) has created guidelines on how the DSA and the GDPR intersect.

Here are some of the key takeaways:

• Companies must have proper legal justification if they want to use personal data in fighting illegal content.

• If mechanisms for detecting illegal content are fully automated, the GDPR's strict rules on automated decision-making apply.

• People reporting illegal content should not be required to submit their personal data unless identification is absolutely necessary.

• Online platforms cannot show ads based on profiling using special category data, even if they have legal permission to process that data.

• Platforms must explain in real time why specific ads are shown to specific users.

• Platforms cannot show personalized ads to minors.

Read the full report below.

Read more

---

EU Court Upholds Data Transfers to US

The recently released judgement of the EU General Court was the third attempt to challenge the EU-US data sharing agreement in court. Previously, activist Max Schrems successfully struck down two frameworks, Safe Harbor (2015) and Privacy Shield (2020), arguing that US surveillance was too invasive.

This year, French citizen Philippe Latombe claimed that America's new Data Protection Review Court was not truly independent from government control, with judges appointed in a way that left them open to influence by intelligence agencies and the Attorney General.

The EU General Court disagreed, finding that sufficient safeguards exist to continue the EU-US data sharing agreement, and that there is no high risk of the judges being compromised.

This judgement ensures that data will keep flowing between EU and US companies while maintaining adequate privacy protections. This is welcome news for businesses, although some surveillance concerns may still persist.

Read more

---

Hot Off the Press: Meta and TikTok Successfully Claimed the EU Commission Calculated Their Fees Wrong

The General Court of the European Union ruled today that Meta and TikTok were right to appeal the EU Commission's DSA fees.

Background: the DSA allows the EU Commission to collect fees for supervising 'very large platforms' from the platforms themselves. This fee is calculated on the basis of the number of users a platform has in the EU.

In November 2023, Meta and TikTok brought an action before the General Court of the European Union, claiming that the fees the EU Commission presented them with were incorrectly calculated.

The General Court has today annulled the implementing decisions, holding that the EU Commission must recalculate the fees correctly and change its methodology for calculating fees in the future.

Verdict: Meta and TikTok will still have to pay the DSA fees, but the amount will be recalculated by the EU Commission.

Read more

---

The Danger of Sharing "Pseudonymised Data"

When a Spanish bank (Banco Popular Español) failed in 2017, Deloitte was hired to assess whether former shareholders and creditors of the bank should receive compensation. As part of this assessment, comments and feedback from affected people were shared with Deloitte in pseudonymised form (the commenters' names were removed).

However, the affected individuals were not informed that their comments would be shared with Deloitte. They complained to the European Data Protection Supervisor (EDPS), and the case was eventually brought before the CJEU.

The verdict of the CJEU is as follows:

• Personal opinions are always personal data: opinions are inherently linked to the person who expressed them, and even with the name removed, the author of a comment may still be identifiable.

• The original data collector failed in its duties: the people had to be informed that their data would be shared when it was first collected, regardless of whether it would still be identifiable after pseudonymisation.

Takeaway: pseudonymisation can be used as additional protection, but it is not a replacement for proper notices about third-party data sharing.
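The risk the court identified can be illustrated with a minimal sketch (the data, the `pseudonymise` helper, and the secret key are hypothetical, not from the case): replacing a name with a token removes the direct identifier, but the free-text opinion travels with the record untouched, and its contents can still point back to the author.

```python
import hashlib

def pseudonymise(records, secret="rotate-me"):
    """Replace each commenter's name with a keyed hash token.

    The direct identifier is removed, but the free-text comment
    is passed through untouched -- and may itself re-identify
    the author.
    """
    out = []
    for rec in records:
        token = hashlib.sha256((secret + rec["name"]).encode()).hexdigest()[:12]
        out.append({"commenter_id": token, "comment": rec["comment"]})
    return out

# Hypothetical example record, for illustration only.
comments = [
    {"name": "Ana Garcia",
     "comment": "As the branch manager in Seville, I lost my savings."},
]

shared = pseudonymise(comments)
# The name field is gone, yet "branch manager in Seville"
# may still single out one person.
```

This is why the CJEU's first point matters: stripping the name changes the form of the data, not its ability to identify someone.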

Read more