Meta (FACEBOOK, INC.) | Report on risks of deepfakes in online child exploitation at Meta (FACEBOOK, INC.)

Status
AGM passed
AGM date
Previous AGM date
Proposal number
10
Resolution details
Company ticker
META
Lead filer
Resolution ask
Report on or disclose
ESG theme
  • Social
ESG sub-theme
  • Digital rights
Type of vote
Shareholder proposal
Filer type
Shareholder
Company sector
Technology
Company HQ country
United States
Resolved clause
Shareholders request that Meta prepare a transparency report on the company’s use of deepfake identifying software to combat the risks of online child exploitation. This report shall be made publicly available to the company’s shareholders on the company’s website, be prepared at a reasonable cost, and omit proprietary information, litigation strategy and legal compliance information.
Supporting statement
Meta is one of the largest social media companies in the world, with more than 3 billion active users across its various platforms. As such, Meta has both an interest in and a responsibility to address the growing risks created by its ongoing innovation in artificial intelligence. While artificial intelligence has many benefits and advantages, it also presents unique challenges that the company ought to address to avoid complicity in the online victimization of children.
Earlier this year, Meta made the decision to add “Made with AI” labels to political content that involved the use of AI. This decision is laudable. Yet, if Meta is willing to take tangible steps to curb broad social harms in the political arena, it should also be willing to take similar steps to curb the even more widespread social harm of online child abuse — harm for which Meta’s platforms, unfortunately, have become a breeding ground. As per a report1 released in June 2024 by Thorn and the NCMEC, Instagram and Facebook were the top two sites named in threats of sextortion, with Instagram being the top site where sextortion materials were actually distributed. With the meteoric rise in the use of deepfake materials comes an increasing likelihood of a corresponding spike in child exploitation.
These are serious risks — and Meta has, in other areas, taken the risk of child exploitation seriously. As per NCOSE2, Meta has made recent updates to its policies that make children’s accounts more private and limit their exposure to explicit content. It is time to extend that same scrutiny, if not more, to the potential for child abuse that deepfake media creates.
The new age of artificial intelligence carries with it new opportunities to deter predation, limit the spread of child sex abuse material (CSAM), and protect the most innocent among us. Failure to seize those opportunities, as evidenced by massive recent class-action lawsuits3 against tech companies like Apple, carries not only moral risks but tremendous legal and reputational price tags. Responding to that new age means taking on the challenge of combating child sex abuse with the kind of serious strategies and innovative thinking that Meta is known for worldwide — and helping to create a world4 where “enabling the best of what people can do together” involves protecting innocents from the worst that the online world has to offer.

DISCLAIMER: By including a shareholder resolution or management proposal in this database, neither the PRI nor the sponsor of the resolution or proposal is seeking authority to act as proxy for any shareholder; shareholders should vote their proxies in accordance with their own policies and requirements.

Any voting recommendations set forth in the descriptions of the resolutions and management proposals included in this database are made by the sponsors of those resolutions and proposals, and do not represent the views of the PRI.

Information on the shareholder resolutions, management proposals and votes in this database has been obtained from sources that are believed to be reliable, but the PRI does not represent that it is accurate, complete, or up-to-date, including information relating to resolutions and management proposals, other signatories’ vote pre-declarations (including voting rationales), or the current status of a resolution or proposal. You should consult companies’ proxy statements for complete information on all matters to be voted on at a meeting.