A new iMessage safety feature prompts kids to report explicit images to Apple


Apple is adding a new child safety feature that lets kids send a report to Apple when they are sent photos or videos with nudity, according to The Guardian. After reviewing anything it receives, the company can report messages to law enforcement.

The new feature expands on Apple’s Communication Safety feature, which uses on-device scanning to detect nudity in photos or videos received via Messages, AirDrop, or Contact Posters and blurs them out. In addition to blurring the photo or video, Apple also shows a pop-up with options to message an adult, get resources for help, or block the contact.
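For developers, Apple exposes similar on-device nudity detection through its public SensitiveContentAnalysis framework (iOS 17 and later). A minimal sketch of gating a received image on that check might look like the following; the `shouldBlur` helper is our own invention, and note that the framework requires the `com.apple.developer.sensitivecontentanalysis.client` entitlement and only performs analysis when the user has enabled the relevant safety setting.

```swift
import Foundation
import SensitiveContentAnalysis

// Sketch: decide whether a received image should be blurred, using Apple's
// public on-device sensitivity check. Helper name and error handling are
// illustrative, not Apple's implementation of Communication Safety.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    // If the user hasn't opted in to a safety setting, the policy is
    // .disabled and no analysis takes place.
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive // true -> blur and show intervention UI
    } catch {
        return false // a real app would surface or log the error
    }
}
```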

As part of this new feature, which is in testing now in Australia with iOS 18.2, users will also be able to send a report to Apple about any images or videos with nudity.

“The device will prepare a report containing the images or videos, as well as messages sent immediately before and after the image or video,” The Guardian says. “It will include the contact information from both accounts, and users can fill out a form describing what happened.” From there, Apple will look at the report, and it can choose to take actions such as stopping a user from sending iMessages or reporting to law enforcement.
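Concretely, the report payload The Guardian describes could be modeled roughly like this; the type and field names below are hypothetical illustrations based on the article, not Apple’s actual schema.

```swift
import Foundation

// Hypothetical model of the described report contents (names are ours).
struct NuditySafetyReport {
    let flaggedMedia: [URL]           // the reported photos or videos
    let surroundingMessages: [String] // messages sent immediately before and after
    let senderContact: String         // contact information from both accounts
    let recipientContact: String
    let userDescription: String       // the form the user fills out describing what happened
}
```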

Earlier this week, Google announced an expansion of on-device scanning of text messages in its Android app that will include an optional Sensitive Content Warning that blurs images with nudity, as well as offering “help-finding resources and options.” Once it rolls out, the feature will be enabled by default for users under 18.

The Guardian says that Apple plans to make the new feature available globally but didn’t specify when that might happen. Apple didn’t immediately respond to a request for comment.

In 2021, Apple announced a set of child safety features that included scanning a user’s iCloud Photos library for child sexual abuse material and would alert parents when their kids sent or received sexually explicit photos. After privacy advocates spoke out against the plan, Apple delayed the launch of those features to go back to the drawing board, and it dropped its plans to scan for child sexual abuse imagery in December 2022.
