iOS 26 Beta Brings FaceTime Nudity Detection That Pauses Calls Automatically

Written by Somatirtha

Apple’s next major update, iOS 26, unveiled at WWDC 2025, introduces several high-profile features, including a new Liquid Glass UI and updates to Wallet, CarPlay, and Messages. But a privacy feature buried in the developer beta, never mentioned on stage, is now commanding serious attention: FaceTime nudity detection.

What Didn’t Apple Announce At WWDC?

This unannounced feature automatically pauses both video and audio on a FaceTime call if it detects nudity on screen. A message then warns users:

“Audio and video are paused because you may be showing something sensitive. If you feel uncomfortable, you should end the call.”

Users are then given the option to resume the call or hang up. Although Apple did not mention the feature during its WWDC keynote, developers digging through the iOS 26 beta unearthed it, and it was first reported by 9to5Mac.

How Does This Expand Apple’s Communication Safety Tools?

Apple confirmed in a blog post that it is expanding its Communication Safety features, which were initially designed to shield children, to FaceTime. The tools already scan and blur explicit material in Messages and Shared Albums, and now cover real-time video calls as well.

“Communication Safety expands to intervene when nudity is detected in FaceTime video calls, and to blur out nudity in Shared Albums in Photos,” Apple said.

Although the feature appears intended for child accounts, some developers have found it active on adult profiles too. This has sparked debate over whether its presence there is a mistake in the beta, or a sign that Apple is preparing to roll the tool out to all user accounts.

How Is User Privacy Being Protected?

The FaceTime feature sits behind a settings toggle labeled ‘Sensitive Content Warning.’ According to Apple’s documentation, detection is performed entirely on-device using machine learning. No content is ever sent to Apple’s servers, and the company receives no indication of what was detected.

“Detect nude photos and videos before they are viewed on your device, and receive guidance to help make a safe choice. Apple doesn’t have access to the photos or videos.”

This approach is consistent with Apple’s long-standing emphasis on user privacy, even as it adds more proactive safety measures.

Will This Feature Be Included in the Final iOS 26 Release?

It is unclear whether the nudity detection feature will make it into the final public release of iOS 26. Apple often experiments with features like this in developer builds before deciding whether they are ready for full deployment, should be made optional, or should be limited to child accounts.

The public iOS 26 beta arrives in July, ahead of a full release later this year, likely alongside the iPhone 17 series. Real-time safety features are steadily making their way onto more platforms, and Apple’s FaceTime nudity screening may hint at how devices will handle sensitive material in the future, and how much control users will have over it.
