Apple won the battle for the most heartwarming holiday ad last Christmas season. Advertising the new hearing aid feature in its AirPods Pro 2, the company left melted hearts and watery eyes across the country with a story about a hearing-impaired father being able to be more fully present in his daughter’s life.
It’s a profoundly pro-life and pro-family message – and as vice president of the American Family Association, I can get behind both. Just as importantly, it’s a key sign that Apple is willing to resist the activist urge to opine on non-core business issues and to focus on the company’s primary business responsibility: making the world a better place through its products and the human flourishing those products create.
It’s because of this, and because of my organization's position as an Apple shareholder, that I don’t enjoy writing a piece with this title. Apple is more than willing to express support for pro-family values – but that doesn’t mean the company’s record is clean when it comes to defending those values via its corporate policies. No shareholder wants to go on the Internet and see their company facing a billion-dollar class-action lawsuit over its alleged failure to protect victims of child pornography. And that’s exactly what happened with Apple.
Survivors of child sex abuse allege that the company’s decision to abandon software it announced in 2021 – software designed to curb the spread of child sex abuse material (CSAM) – resulted in videos of their abuse being shared via Apple platforms for more than a decade after that unconscionable abuse occurred. As an Apple shareholder, I’m familiar with many of the concerns raised by deploying such technology – concerns about jeopardized privacy and the resulting question of who truly owns the iPhone in their pocket. And yet, as a father, I also understand the pain and suffering of the victims in these horrific cases of abuse.
The facts do not lie: Apple’s decision to ditch CSAM-scanning software is not only a growing area of reputational risk for the company but also an area of moral concern for shareholders and customers alike. I say “growing” because concerns about Apple’s ability to combat online abuse predate this class-action lawsuit.
The American Family Association was engaging with Apple on curbing online child abuse before we ever learned of this lawsuit – and its existence only deepens our concerns.
Just as no shareholder wants to see their company accused of glossing over the abuse of innocent children, no shareholder wants to see a Wall Street Journal article describing how their company’s lobbyists fought against portions of a child safety bill – and yet that happened.
No shareholder wants to see their company placed on the “Dirty Dozen” list of the National Center on Sexual Exploitation (NCOSE) for two straight years and called a “mainstream contributor to sexual exploitation” on account of its wavering on policies that protect children from sexually exploitative content on the Internet – and yet that happened too.
This new wave of questions and bad publicity surrounding the way Apple handles child abuse online doesn’t mean the company must throw out all its commitments to online privacy. As shareholders, we believe that one of the most innovative tech companies in the world doesn’t have to make an “either-or” choice between protecting children and maintaining user privacy. But this bad press is a moment for greater transparency and answers, which is why we’ve placed a proposal on the ballot for Apple’s annual meeting on February 25, urging the company to issue a report on the costs and benefits of its decisions regarding the use of CSAM-identifying software.
Apple may have a rational reason for scrapping planned software aimed at protecting children from sexual abuse online – the kind of software that might have prevented the victimization described in the lawsuit the company now faces. But if it does, shareholders deserve to know what that reason is.
Why does the company still fail to block sexually explicit content from being viewed or sent by users under the age of twelve?
Why does Apple not filter explicit content by default for teenage users on its messaging services?
This isn’t activism – it’s asking basic questions about common-sense protections for children.
In Apple’s Human Rights Policy, the company asserts that it “believe[s] in the power of technology to empower and connect people around the world – and that business can and should be a force for good.” It’s time for the company to live that out. We know Apple is capable of doing better – and of creating a world where Steve Jobs’ belief in “think[ing] different” means being the industry gold standard when it comes to protecting the most innocent among us.
Editor's Note: The American Family Association is the parent organization of the American Family News Network, which operates AFN.net.
Notice: This column is printed with permission. Opinion pieces published by AFN.net are the sole responsibility of the article's author(s), or of the person(s) or organization(s) quoted therein, and do not necessarily represent those of the staff or management of, or advertisers who support the American Family News Network, AFN.net, our parent organization or its other affiliates.