Privacy Plight: Apple’s Proposed Changes & Consumer Pushback

Photo by Jimmy Jin (Unsplash)

Natalie Bravo is an IPilogue Writer and a 2L JD Candidate at Osgoode Hall Law School.

 

In August, Apple made headlines by introducing new privacy features in their upcoming software updates. These new features are purported to expand protections for children through modified communication tools, on-device machine learning within Messages, cryptography, and Siri and Search interventions. Although protecting children as a vulnerable group should be of utmost importance to all, many security experts find some of these proposed changes troubling, as they may undermine the company’s longstanding reputation for privacy preservation and enable future security backdoors.

Over the years, Apple has cultivated a strong reputation as a protector of consumer privacy. One of their core values and most popular marketing points is that “privacy is a fundamental human right.” After all, their security and privacy designs are so robust that Apple allegedly cannot access encrypted user data, even if a government asks for it. In 2015, Apple CEO Tim Cook stated that while issues such as national security are important, Apple would not implement any technology that malicious actors could misuse as a backdoor to encrypted user data. Now, in 2021, Apple’s ironclad encrypted system has one exception.

As one of the changes, Apple intends to introduce photo-scanning technology for all users to identify known Child Sexual Abuse Material (CSAM). This kind of well-intentioned technology is already widely used online to identify known harmful material, including terrorist propaganda and other violent content. Some consumers worry that all of their private images will be scanned in search of illegal content; however, Apple is not proposing that. The technology computes a “hash” of each file and matches it against a database of hashes of known CSAM. If the number of matches exceeds a certain threshold, and barring false positives, law enforcement is contacted. Strangely enough, Apple has noted that users can opt out by disabling photo uploads to iCloud, since CSAM is identified only among photos uploaded to their servers and not those kept solely on users’ devices. Some experts interpret this as Apple favouring brand protection over child safety.
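
To make the mechanism concrete, the sketch below shows the general idea of threshold-based hash matching. It is a minimal illustration only: the database, the threshold value, and the function names are all hypothetical, and Apple’s actual system relies on perceptual hashing and cryptographic protocols rather than anything this simple.

```python
# Illustrative sketch only: NOT Apple's actual design. Every name and
# threshold below is a hypothetical stand-in.
import hashlib

# Hypothetical database of hashes of known CSAM, as would be supplied by a
# child-safety organization. A single fake entry stands in for real data.
KNOWN_HASHES = {hashlib.sha256(b"known-illegal-image-bytes").hexdigest()}

# Hypothetical number of matches required before any human review occurs,
# intended to limit the impact of occasional false positives.
MATCH_THRESHOLD = 30


def file_hash(data: bytes) -> str:
    # Plain SHA-256 for illustration; real systems use perceptual hashes
    # that survive resizing and re-encoding.
    return hashlib.sha256(data).hexdigest()


def count_matches(photos: list[bytes]) -> int:
    # Count how many uploaded photos match the known-hash database.
    return sum(1 for photo in photos if file_hash(photo) in KNOWN_HASHES)


def should_escalate(photos: list[bytes]) -> bool:
    # Escalate for human review only once matches cross the threshold.
    return count_matches(photos) >= MATCH_THRESHOLD
```

The key point the sketch captures is that only photos matching an existing database are counted; the content of non-matching photos plays no role in the decision.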

Some security experts have also expressed strong concerns over the modified communication tools for children. Apple says that on-device software will detect explicit content (through machine learning, not hash matching) within a minor’s Messages conversations, a feature that a guardian can turn on or off. When enabled, it will alert a parent if their minor receives an image flagged as explicit. This seems appropriate to allow some supervision to protect vulnerable children from online predators; however, the algorithms currently used to detect explicit images are notoriously prone to classification mistakes. It is widely known that benign, non-sexual content, particularly LGBTQ+ content, is consistently flagged as sexually explicit by machine learning algorithms. To add to this, child advocates worry about the possibility of minors in abusive households being monitored through such a faulty and dangerous algorithm.
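
As a rough illustration of how such an opt-in, on-device gate might work, here is a short sketch based only on the article’s description. The classifier, the confidence threshold, the age cut-off, and the notification hook are all hypothetical, not Apple’s implementation.

```python
# Illustrative sketch only: not Apple's actual Messages safety feature.
# The classifier, threshold, age cut-off, and notification hook are all
# hypothetical stand-ins based on the article's description.
from dataclasses import dataclass
from typing import Callable

EXPLICIT_THRESHOLD = 0.9  # hypothetical classifier confidence cut-off


@dataclass
class ChildAccount:
    age: int
    parental_alerts_enabled: bool  # guardians can toggle the feature


def handle_incoming_image(
    account: ChildAccount,
    image: bytes,
    classify_explicit: Callable[[bytes], float],  # returns confidence in [0, 1]
    notify_guardian: Callable[[str], None],
) -> bool:
    """Return True if the image should be blurred pending the child's choice.

    Everything here runs on-device; only a notification (never the image
    itself) would leave the phone, and only when the guardian has opted in
    and the account falls under a hypothetical age cut-off.
    """
    if classify_explicit(image) < EXPLICIT_THRESHOLD:
        return False
    if account.parental_alerts_enabled and account.age < 13:
        notify_guardian("A possibly explicit image was received.")
    return True
```

The critics’ worry maps directly onto the first line of the decision: if `classify_explicit` misfires on benign content, everything downstream, including the parental alert, misfires with it.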

Though disciplinary technology is not a new concept, these changes will suddenly affect billions of consumers. It has been reported that when a child, like any other user, experiences negative behaviour online, they seek to flag or report it themselves. However, there is currently no way to report messages within Apple’s Messages application. Critics have characterized Apple’s approach as surveillance that treats children as “victims” of technology rather than as users. After causing a tremendous stir in both the privacy and child advocacy communities, Apple later specified that Messages scanning would apply only to children under 13, not teenagers, and has since offered only limited clarity on the new technologies.

Despite the changes, Apple appears to remain committed to privacy as a fundamental human right. Children need to be protected and prioritized in their experiences with technology, but their privacy matters too. It will be interesting to see the roll-out of Apple’s polarizing changes, particularly how they will affect Apple’s reputation and ecosystem security, and whether Apple will introduce further changes as it responds to community concerns.