Windows 10 News and info | Forum
Author Topic: Apple scans photos to check for child abuse
javajolt
Administrator
« on: January 10, 2020, 01:11:41 PM »

Apple scans photos to check for child sexual abuse images, an executive has said, as tech companies come under pressure to do more to tackle the crime.

Jane Horvath, Apple’s chief privacy officer, said at a tech conference that the company uses screening technology to look for illegal images. Apple says it disables accounts when it finds evidence of child exploitation material, although it does not specify how such material is detected.

Apple has often clashed with security forces and authorities, refusing to break into criminals’ phones and applying encryption to its messaging app in the name of protecting its users’ privacy.

Speaking at the Consumer Electronics Show in Las Vegas, Ms. Horvath said removing encryption was “not the way we’re solving these issues” but added: “We have started, we are, utilizing some technologies to help screen for child sexual abuse material.”

An Apple spokesman pointed to a disclaimer on the company’s website, saying: “Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space.

“As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation.

“Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”

The company did not elaborate on how it checks for child abuse images, but many tech companies use a filtering system called PhotoDNA, in which images are checked against a database of previously identified images using a technology known as “hashing”. The technology is also used by Facebook, Twitter, and Google.

Apple made a change to its privacy policy last year that said it may scan images for child abuse material.

Ms. Horvath also defended Apple’s decision to encrypt iPhones in a way that makes them difficult for security services to unlock. The FBI recently raised the prospect of another clash with the company by asking it to unlock an iPhone allegedly owned by a dead gunman who killed three people at a naval base in Florida last month.

“End-to-end encryption is critically important to the services we come to rely on… health data, payment data. Phones are relatively small; they get lost and stolen. We need to make sure that if you misplace that device you’re not [exposing that data],” she said.

Quote
Update January 9, 2020

This story originally said Apple screens photos when they are uploaded to iCloud, Apple's cloud storage service. Ms. Horvath and Apple's disclaimer did not mention iCloud, and the company has not specified how it screens material, saying this information could help criminals.


source