Apple’s iOS 26 update brought new features such as the Liquid Glass design and major improvements to Messages, Wallet, and CarPlay. But a new Communication Safety tool, originally designed to protect children, has sparked heated debate: it is reportedly affecting all FaceTime users, including adults, and many are not happy about it.
The feature works by detecting nudity during FaceTime video calls. If the system detects anything it considers sensitive, it automatically freezes the video and audio. A warning then appears on the user’s screen saying, “Audio and video are paused because you may be showing something sensitive. If you feel uncomfortable, you should end the call.”
Users are given two options: they can either resume the call or end it. While Apple designed this as a family safety feature for minors, reports from beta testers suggest it may be active for everyone. This has caused concern among users who feel Apple is stepping into their private space.
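The behavior described above amounts to a simple state machine: an active call, a paused state triggered by the detector, and a user choice that either resumes or ends the call. A minimal sketch of that flow, purely illustrative (Apple has not published this logic, and the function names, states, and the boolean classifier result are all assumptions):

```python
from enum import Enum, auto

class CallState(Enum):
    ACTIVE = auto()
    PAUSED = auto()
    ENDED = auto()

WARNING = ("Audio and video are paused because you may be showing "
           "something sensitive. If you feel uncomfortable, you should "
           "end the call.")

def handle_frame(state, frame_is_sensitive):
    """Pause audio and video when the (hypothetical) on-device
    classifier flags a frame; otherwise leave the call untouched."""
    if state is CallState.ACTIVE and frame_is_sensitive:
        return CallState.PAUSED, WARNING
    return state, None

def handle_user_choice(state, choice):
    """After the warning, the user picks one of two options:
    resume the call or end it."""
    if state is not CallState.PAUSED:
        return state
    return CallState.ACTIVE if choice == "resume" else CallState.ENDED
```

Note that in this sketch a flagged frame pauses the call but never ends it on its own; the decision to continue always stays with the user, which matches the two-option prompt users report seeing.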
The feature was first spotted by iDeviceHelp, who shared screenshots showing the FaceTime warning message. It quickly went viral as people debated whether Apple’s intentions were protective or invasive.
Apple has said that the nudity detection is powered by on-device machine learning. This means all analysis happens locally on the user’s iPhone or iPad. The company insists no data leaves the device and no one at Apple has access to your photos, videos, or encrypted FaceTime calls.
“Because the photos and videos are analyzed on your device, Apple doesn’t receive any indication that nudity was detected, nor does it gain access to the content,” the company explained on its support page.
Even so, many users remain skeptical. While on-device processing protects user privacy in theory, some argue that monitoring private calls at all, even by a machine, feels intrusive.
Some beta testers report they were able to toggle the feature off in settings, but others claim it remains active even when disabled. It’s possible that the system is meant to be on by default only for accounts registered to minors and is accidentally affecting other users in the beta.
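The hypothesis above can be expressed as a simple rule for resolving the effective setting. This is entirely speculative; the function and its parameters are hypothetical, not Apple’s actual implementation:

```python
def warning_effectively_enabled(is_minor_account: bool, user_toggle_on: bool) -> bool:
    """Hypothesized intent: the check is forced on for accounts registered
    to minors (a parental control the child cannot disable), while adults
    follow their own toggle. A bug that ignored either input would explain
    the conflicting beta-tester reports."""
    if is_minor_account:
        return True  # forced on for minors
    return user_toggle_on  # adults keep control, in theory
```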
Apple unveiled iOS 26 in June with promises of new family tools to keep children safe online. Alongside the FaceTime filter, there are also updates that blur explicit images in Shared Albums. But so far, Apple has not clarified whether the nudity detection in FaceTime is meant to apply to adult users or if this is an unintended glitch in the beta.
This isn’t the first time Apple has faced criticism over safety tools. In 2021, the company proposed scanning iCloud photos for child sexual abuse material. That plan was paused after public outcry over privacy concerns. Now, some fear this new FaceTime feature could start a similar controversy.
The iOS 26 public beta is expected to launch in mid-July, with the final release planned for September alongside the iPhone 17. Until then, users will be watching closely to see if Apple makes adjustments to the controversial FaceTime feature.
