A teenager uses a cell phone to stay connected. (iStock)


Published — September 24, 2021

Proposed iPhone protections could put LGBTQ youth at risk


Digital privacy groups worry that the protections could create a backdoor to widespread surveillance and enable abuse.


Virtual communities have long provided a space for LGBTQ youth to explore their identities, allowing queer children to safely come out of the closet without fear of abuse from unsupportive parents.

But as technology companies ratchet up surveillance in the name of content moderation, the digital privacy of LGBTQ youth and other vulnerable people may be at risk.

Apple’s new child protection features, announced last month, would use machine learning algorithms to flag “sexually explicit” photos sent or received in the Messages app by minor users enrolled in a Family Sharing plan. To prevent the spread of child sexual abuse images, the families of children under 13 can choose to be notified and receive a copy of the flagged content. Children ages 13 to 17 are warned before opening flagged content, and no one else is notified. Another proposed update to Apple devices would allow the company to detect known sexually explicit images of children uploaded to iCloud Photos and report the content to the National Center for Missing and Exploited Children.

Thorn, an anti-human trafficking organization that uses an automated tool to identify missing children in sex ads, commended Apple’s commitment to limiting the spread of child sexual abuse material. Tech companies must create platforms that prioritize the issue “for every child victim and every survivor whose most traumatic moments have been disseminated across the internet,” Thorn CEO Julie Cordua wrote in an Aug. 5 blog post.

But the updates could jeopardize encryption and messaging security, said Emma Llansó, director of the Free Expression Project at the Center for Democracy & Technology. “It’s not about Apple potentially violating laws,” Llansó said, “but about them [Apple] making their users more vulnerable in ways that governments may then use as justification for passing laws that prohibit other companies from having strong encryption and truly private messaging systems.”

Digital civil liberties advocates and concerned Apple users have organized protests, sent letters, and signed petitions in response to the proposed updates to iPhones, iPads, Apple Watches, and macOS Monterey, which they say violate users’ privacy and put LGBTQ youth at risk. The features could reveal a queer child’s gender identity or sexual orientation to an abusive family member without their consent, or incorrectly flag content that’s not sexually explicit, said Evan Greer, director of the advocacy nonprofit Fight for the Future. “In the end, it’s really important to recognize that young people have a right to communicate securely,” she said.

Fight for the Future, along with the digital privacy nonprofit the Electronic Frontier Foundation (EFF) and other civil liberties advocates, organized protests at Apple stores around the nation on Sept. 13 to shed light on the issue.

Advocates say the unprecedented proposal could spark an industry trend that opens a backdoor to widespread surveillance. “No amount of privacy software, changing your settings or taking precautions is going to protect you if the device itself is being weaponized to monitor your communications and activities,” Greer said.

Moreover, machine learning systems have bias baked into them, and sexual content moderation is often inaccurate, said Jillian York, EFF’s director for international freedom of expression. A 2019 University of Colorado Boulder study found that facial analysis systems from IBM, Amazon, Microsoft and Clarifai always misgendered nonbinary people and misidentified transgender people more often than cisgender people. When Tumblr banned adult content in 2018, its automated system classified troll socks and pillows as sexually explicit, EFF reported.

These systems’ inability to detect nuance could impact marginalized communities the most, York said.

Algorithms used to automate photo classification may read certain bodies as more sexually explicit than others. In 2019, Instagram and Facebook rejected advertisements featuring fully clothed transgender and nonbinary people because they didn’t “allow ads that promote escort services.”

York offered one example of content that could be wrongly flagged: a child exploring their gender identity for the first time might bind their chest to make it flatter and send a photo to a friend. Flagging that image could expose someone’s sexual or gender identity before they’re ready.

Such revelations could put children in danger of being abused or kicked out of their homes by unaccepting parents. LGBTQ youth are 120% more likely to experience homelessness than their cisgender, heterosexual peers, according to a study by Chapin Hall at the University of Chicago.

“Oftentimes these companies are very shortsighted in who they consult,” York said.

As a result of the pushback, Apple postponed the release of the new protections. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple posted Sept. 3 on its website.

Apple’s proposed updates, while created with good intentions, are the latest child-protection effort with the potential to do more harm than good, Llansó said. FOSTA-SESTA, the anti-sex trafficking bill Congress passed in 2018 to curb trafficking by making online platforms liable for users’ posts, led to a widespread crackdown on sexual content online and made sex workers more vulnerable to violence.

A new report from the Center for Democracy & Technology that examined content moderation in end-to-end encrypted services found two approaches most effective at maintaining user security and privacy: analysis of metadata (data about data) and tools that let users report disinformation, harassment, spam, and child sexual abuse material. Ideally, Llansó said, users would have the freedom to access such services through separate apps, without the moderation being embedded in the core messaging system.

In the eyes of advocates, encrypted communication empowers the most vulnerable users.

“For LGBTQ communities, encryption and digital security can be a matter of life and death,” Greer said. “If Apple cares about protecting our community, they should be expanding and strengthening the encryption and security of their devices rather than undercutting it.”

Melissa Hellmann is a senior reporter at the Center for Public Integrity. She can be reached at mhellmann@publicintegrity.org. Follow her on Twitter at @m_Hellmann.

