Apple has started enforcing device-level age verification in the UK via a new update that requires users to confirm they are 18+—using a credit card or ID scan—to access unrestricted features. Unverified or underage devices will have Web Content Filter and Communication Safety enabled, limiting apps/sites and scanning messages, shared photos, AirDrop, and FaceTime for nudity. Apple tied the move to likely compliance with the UK Online Safety Act’s restrictions on minors accessing online pornography; the UK doesn’t mandate OS-level checks but is concerned about VPN and loophole circumvention. Industry players including Meta and Pornhub operator Aylo have pushed for phone-level verification as a simpler, broader solution, raising questions about whether similar measures will reach the US.
India, the world’s most populous country, has restricted children’s access to major social media platforms, barring most services for users under 18 unless parental consent and age verification are in place. The move targets platforms widely used by minors, including TikTok and Instagram, and is driven by concerns over mental health, privacy, and harmful content. Regulators expect platforms to adopt stricter age checks, parental controls, and content moderation or face blocks. This affects global tech companies, which must adapt compliance processes, verification technology, and product features for a massive market. The policy could reshape platform design, identity-verification methods, and data practices, with implications for child safety, regulation, and business models.
Indonesia is set to become the first Asian country to ban social media accounts for users under 16, with platforms required to verify ages to enforce the rule. The government’s new regulation aims to protect minors from online harms, misinformation, and privacy risks, and will affect global platforms operating in Indonesia, which must implement age checks and content controls. Tech companies face implementation challenges in balancing verification methods, privacy, and the potential loss of young users, while civil society warns of surveillance risks and the exclusion of those without IDs. The policy matters because it could set a regional precedent, reshape platform compliance costs, and influence debates on age verification, data protection, and online safety frameworks.
Apple will require UK iPhone users to verify they are 18+ to access certain services or features after a forthcoming software update, using either a credit card (not debit) on file or a scanned driving licence/national ID. Ofcom hailed the move as a milestone for child safety and part of efforts to keep young people from harmful content, while noting app stores weren’t covered by the Online Safety Act. Apple did not specify which services or actions will be restricted without age confirmation. Some users raised privacy and choice concerns about uploading sensitive ID or payment details to prove age. Ofcom will evaluate app store age assurance effectiveness next year.