SAN FRANCISCO: Apple has announced that iPhones and iPads will soon begin detecting images of child sexual abuse uploaded to iCloud and reporting them.
According to the Silicon Valley-based tech giant, a new software change to Apple’s operating systems will scan photos, enabling Apple to report cases to the National Center for Missing and Exploited Children.
The company states: “We want to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of child sexual abuse material (CSAM).”
Apple explains that the new tool will allow the phones’ operating systems to match pictures on a user’s phone against a database of known CSAM images provided by child safety organizations. The system will then flag matching images when they are uploaded to iCloud.
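The matching process described above can be sketched in miniature. Apple’s actual system uses an on-device perceptual hash (so near-duplicate images still match) and cryptographic safeguards; the snippet below is only a toy illustration of the database-lookup idea, with SHA-256 standing in for the real hash and made-up placeholder image bytes.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real one tolerates small edits,
    # which an exact cryptographic hash like SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

# Hash database supplied by child-safety organizations (illustrative values only).
KNOWN_HASHES = {image_hash(b"known-flagged-image")}

def flag_on_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches the known-hash database at upload time."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(flag_on_upload(b"known-flagged-image"))  # matching image is flagged
print(flag_on_upload(b"family-photo"))         # non-matching image passes
```

In the real system, a match alone does not trigger a report; Apple has said a threshold of matches must be reached before an account is flagged for human review.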
The company adds that this feature is one of a range of new child-safety tools coming to Apple’s mobile devices.
Apple’s iPhone messaging application will also use machine learning to detect sexually explicit photos and warn children and their parents when such photos are received.
Even Siri, Apple’s personal assistant, will gain the ability to “intervene” when users attempt to search for topics linked to child sexual abuse.