
Archived

This thread has been closed to further replies because it was not updated for 12 months. If you wish to have this thread reinstated, please contact an administrator.

Snow

Apple will start scanning all your photos, texts and searches with iOS 15

Quote

Apple announced last week that it will begin scanning all photos uploaded to iCloud for potential child sexual abuse material (CSAM). It’s come under a great deal of scrutiny and generated some outrage, so here’s what you need to know about the new technology before it rolls out later this year.

Apple is introducing a Messages feature that’s meant to protect children from inappropriate images. If parents opt in, devices with users under 18 will scan incoming and outgoing pictures with an image classifier trained on pornography, looking for “sexually explicit” content. (Apple says it’s not technically limited to nudity but that a nudity filter is a fair description.) If the classifier detects this content, it obscures the picture in question and asks the user whether they really want to view or send it.

The new anti-CSAM features will be rolled out in three areas: Messages, iCloud Photos, and Siri and Search. Here’s how each of them will be implemented, according to Apple.

  • Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.
  • iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
  • Siri and Search: Siri and Search will provide additional resources to help children and parents stay safe online and get help with unsafe situations.
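The iCloud Photos step above can be sketched in simplified form. Note this is a hypothetical illustration only: Apple's actual system uses its NeuralHash perceptual hash together with private set intersection and threshold secret sharing, so the device never learns whether an image matched. Here a plain SHA-256 digest, a placeholder hash set, and an assumed threshold stand in for those pieces.

```python
import hashlib

# Simplified stand-in for Apple's NeuralHash: an exact SHA-256 digest.
# (The real system uses a perceptual hash combined with cryptographic
# private set intersection, so matching happens blindly on-device.)
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder "known hash" database; in Apple's system the real
# entries are derived from NCMEC's database of known CSAM.
KNOWN_HASHES = {
    image_hash(b"known-bad-sample-1"),
    image_hash(b"known-bad-sample-2"),
}

# Apple described escalating to human review only after an account
# crosses a threshold of matches; the exact number here is assumed.
REVIEW_THRESHOLD = 2

def count_matches(images):
    """On-device check run before images are stored in iCloud Photos."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

def needs_human_review(images):
    """True once the account's match count reaches the review threshold."""
    return count_matches(images) >= REVIEW_THRESHOLD
```

The key design point the article describes is that nothing is reported on a single match; only an account that accumulates enough matches is surfaced to human reviewers.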

The system would “proactively alert a team of human reviewers if it believes illegal imagery is detected” and human reviewers would alert law enforcement if the images were verified. The neuralMatch system, which was trained using a database from the National Center for Missing and Exploited Children, will be limited to iPhones in the United States to start, the report says.

The update — coming to accounts set up as families in iCloud on iOS 15, iPadOS 15, and macOS Monterey — also includes an additional option. If a user taps through that warning and they’re under 13, Messages will be able to notify a parent that they’ve done it. Children will see a caption warning that their parents will receive the notification, and the parents won’t see the actual message. The system doesn’t report anything to Apple moderators or other parties.

Critics like Harvard Cyberlaw Clinic instructor Kendra Albert have raised concerns about the notifications — saying they could end up outing queer or transgender kids, for instance, by encouraging their parents to snoop on them.


This is an invasion of privacy. For Apple to claim to be a champion of protecting people's privacy, only to turn around and do this, is quite hypocritical. Everyone should care about this, whether you have something to hide or not.

At what point will they stop if we let them get away with this? At what point will this become a malicious tool for governments to use to spy on their citizens?
