
Apple pauses plans for new child-safety tools in the face of public outcry over privacy concerns




Apple made headlines after announcing a new tool to combat child exploitation earlier this month, and not in a positive way. With critics immediately raising concerns about the feature's potential privacy implications, the company is taking a long, deep breath before moving forward with its plans.


In a follow-up announcement, the company said it would suspend testing of the tool in order to gather additional feedback and make improvements.


The plan centers on a new system that, once implemented, would scan iOS devices and iCloud photos for evidence of child exploitation. It also includes a new opt-in feature that would alert minors and their parents when sexually explicit image attachments are sent or received via iMessage, blurring such images.


An Outraged Reaction


At a time when technology companies are placing greater emphasis on child protection, Apple's announcement that it would begin testing the tool was met with outrage on social media, critical headlines, and demands for more information.


As a result, Apple (AAPL) announced on Friday that it would put the rollout of these features on hold.


Last month, the company announced plans for features aimed at protecting children from predators who use communication tools to recruit and exploit them, as well as at limiting the spread of Child Sexual Abuse Material. In a statement, Apple said: "Following feedback from customers, advocacy groups, researchers, and other interested parties, we've decided to take additional time in the coming months to collect feedback and make improvements before releasing these critical child safety features."


During a series of press briefings last month aimed at explaining the planned tool, Apple stressed that consumers' privacy would be protected because the tool would convert photos on iPhones and iPads into unreadable hashes, complex numbers stored on the user's device. Once the images were uploaded to Apple's iCloud storage service, those hashes would be compared against a database of hashes maintained by the National Center for Missing and Exploited Children (NCMEC), one of the organizations involved in the project.


Apple would only be alerted once a certain number of hashes matched images in the NCMEC database, at which point it could decrypt the matching images, disable the user's account, and notify NCMEC, which could then alert law enforcement to the presence of potentially abusive material.
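
To make the threshold mechanism described above more concrete, here is a minimal Python sketch. It is illustrative only: a plain SHA-256 file digest stands in for the unreadable image hashes, the cryptographic protections of Apple's actual design are omitted, and KNOWN_HASHES and MATCH_THRESHOLD are hypothetical names for the hash database and the unspecified match count.

    import hashlib
    from pathlib import Path

    # Hypothetical placeholders: the article does not specify the database
    # contents or the exact threshold, and Apple's real system uses image
    # hashing plus cryptographic matching, not a plain file digest.
    KNOWN_HASHES: set[str] = set()   # would be loaded from the NCMEC-derived database
    MATCH_THRESHOLD = 30             # illustrative value only

    def image_hash(path: Path) -> str:
        """Reduce a photo to an opaque, unreadable identifier (stand-in for an image hash)."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def count_matches(photos: list[Path]) -> int:
        """Compare each photo's hash against the known-hash database."""
        return sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)

    def should_flag_account(photos: list[Path]) -> bool:
        """Nothing is reported until the number of matches crosses the threshold."""
        return count_matches(photos) >= MATCH_THRESHOLD

The essential point the article describes is the gate: no individual match triggers anything on its own; only the accumulated count does.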


A number of child safety and security experts applauded the plan's intent, stressing the ethical responsibilities and obligations companies have for the products and services they develop. They cautioned, however, that the efforts raised the possibility of privacy violations.


Finally


"In the wake of the revelation that Apple is 'searching' for child sexual abuse materials (CSAM) on end-user phones, images of Big Brother and '1984' immediately come to mind," Ryan O'Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. "This is a highly nuanced issue that, on the surface, may appear to be quite frightening or intrusive," he said.


Opponents of the plan applauded Apple's decision to put the test on hold until further notice.


The tool, according to Fight for the Future, a digital rights organization, is a threat to "privacy, security, democracy, and freedom," and the organization has called on Apple to permanently abandon it.


Fight for the Future director Evan Greer said in a statement that Apple's proposal to conduct on-device scanning of photos and messages is "one of the most dangerous proposals ever made by a technology company." "Technologically speaking, this is the equivalent of installing malware on the computers and mobile devices of millions of people, malware that can be easily abused to cause enormous harm."
