A pattern of weak privacy protections is emerging across EU tech: the new EU age-verification app was shown to store PINs and biometric images insecurely, with bypassable checks and writable configs that could enable large-scale data exposure and GDPR breaches. At the same time, institutions are pushing staff off consumer platforms like WhatsApp to regain data control, favoring self-hosted or open protocols to meet compliance and archival needs. Meanwhile, ineffective “reject cookies” interfaces leave tracking intact, undermining consent regimes. Together these stories highlight gaps between policy goals and implementation, stressing the need for stronger technical standards, audits, and enforcement.
Weak or inconsistent privacy controls in EU tech undermine GDPR goals and expose organizations to legal and reputational risk. Tech professionals must reconcile policy requirements with secure design, procurement, and operational controls to prevent data leakage and ensure compliance.
Dossier last updated: 2026-05-13 13:37:17
Dutch suicide prevention hotline Stichting 113 shared sensitive website visitor metadata — including location, device, referrer, and screen recordings — with third parties such as Google (and, with consent, Microsoft), according to research by ethical hacker Mick Beer of Hackedemia.nl reported by BNR. After being confronted with the findings, the foundation temporarily disabled all measurement and analytics tools and is investigating possible GDPR breaches, since contact with a suicide prevention service is treated as medical personal data requiring heightened protection. 113 says it did not share substantive chat or conversation content, characterizes the leaked items as technical metadata, and acknowledges trust and privacy concerns while assessing next steps.
Security consultant Paul Moore published a two-minute exploit walkthrough showing critical flaws in the EU Age Verification app: PIN storage and protection are broken, rate-limiting and biometric checks can be bypassed via editable config files, and identity images (NFC DG2 and selfie PNGs) are written to disk unencrypted or not deleted. Moore says the PIN is stored in shared_prefs without proper cryptographic binding to the credential vault, letting an attacker reset or replace the PIN and present old credentials. Unprotected persistent biometric images and config booleans increase the risk of large-scale privacy breaches and potential GDPR violations. The issues implicate privacy, app design, and deployment of a high-profile EU digital ID tool.
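The core design flaw Moore describes — a PIN stored in writable preferences with no cryptographic tie to the credential vault — can be illustrated in a minimal Python sketch. This is not the app's actual Android code; the function names, the dict standing in for shared_prefs, and the `device_key` (standing in for a key held in a hardware keystore, outside attacker-writable storage) are all assumptions for illustration. The point is that a keyed MAC computed over both the PIN digest and the vault contents makes a swapped PIN or swapped vault fail verification, whereas a bare stored PIN does not.

```python
import hmac
import hashlib
import os

def store_pin_insecure(prefs: dict, pin: str) -> None:
    # Mirrors the reported flaw: the PIN lives in app preferences
    # with no cryptographic tie to the credential vault, so anyone
    # who can edit the preferences file can simply replace it.
    prefs["pin"] = pin

def store_pin_bound(prefs: dict, pin: str, vault_blob: bytes, device_key: bytes) -> None:
    # Bind the PIN to the vault: derive a tag over both using a key
    # held outside the writable prefs (e.g., a hardware keystore).
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    tag = hmac.new(device_key, digest + vault_blob, hashlib.sha256).hexdigest()
    prefs["pin_salt"] = salt.hex()
    prefs["pin_tag"] = tag

def verify_pin_bound(prefs: dict, pin: str, vault_blob: bytes, device_key: bytes) -> bool:
    # Recompute the tag; a wrong PIN, an edited prefs file, or a
    # replaced credential vault all cause verification to fail.
    salt = bytes.fromhex(prefs["pin_salt"])
    digest = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    expected = hmac.new(device_key, digest + vault_blob, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, prefs["pin_tag"])
```

In the insecure variant, an attacker who can write the preferences file resets the PIN and presents old credentials; in the bound variant, the same edit breaks the MAC and the app can refuse to unlock.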
European institutions are ordering civil servants to stop using WhatsApp for official communication amid data protection and control concerns. Politico reports EU officials are moving away from third-party messaging platforms that are outside government oversight; Dutch ministers explicitly called for alternatives under their control. The shift aims to reduce data exposure to private companies and foreign jurisdictions, improve compliance with privacy laws like GDPR, and ensure archival and security requirements are met. The move could push adoption of self-hosted or open protocols (Matrix, XMPP, ActivityPub) and enterprise messaging solutions, forcing procurement and integration work across member states. It matters for vendors, interoperability efforts, and digital governance policy across the EU.
Researchers and privacy advocates warn that many websites’ “reject cookies” buttons are ineffective: clicking them often does not stop tracking because sites continue to set or allow third-party trackers, use fingerprinting, or bury consent via opaque interfaces. Regulators and browser makers have pushed transparency and stricter consent rules, but inconsistent implementation and complexity in consent management platforms mean users may believe they opted out when tracking persists. The issue matters because it undermines user privacy, regulatory enforcement (e.g., GDPR), and trust in web consent mechanisms, prompting calls for stronger standards, better browser-level protections, and audits of consent tools.
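One way auditors check whether a "reject" click actually took effect is to capture the network requests a page makes after rejection and match them against a tracker blocklist. The sketch below assumes that capture has already happened (e.g., from a browser's network log); the hardcoded domain set and function names are illustrative only — real audits use maintained lists such as EasyPrivacy-style rules, not three sample domains.

```python
from urllib.parse import urlparse

# Illustrative blocklist; real audits rely on maintained rule sets,
# not this hardcoded sample.
TRACKER_DOMAINS = {"doubleclick.net", "google-analytics.com", "facebook.net"}

def is_tracker(hostname: str) -> bool:
    # Match the hostname or any parent domain against the blocklist,
    # so "stats.google-analytics.com" matches "google-analytics.com".
    parts = hostname.lower().split(".")
    return any(".".join(parts[i:]) in TRACKER_DOMAINS for i in range(len(parts)))

def audit_post_rejection(request_urls: list[str]) -> list[str]:
    # Return tracker requests observed after the user clicked "reject".
    # A compliant site should yield an empty list here.
    return [u for u in request_urls if is_tracker(urlparse(u).hostname or "")]
```

An empty result after rejection is consistent with the consent choice being honored for listed trackers; note that this check says nothing about fingerprinting or first-party tracking, which the research above flags as separate gaps.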