
Shareholders to Demand Action from Mark Zuckerberg and Meta on Child Safety

Investors will vote on child safety resolution at Meta’s Annual General Meeting

Tomorrow, Meta shareholders will vote on a resolution asking Meta to assess its child safety impacts and whether harm to children on its platforms has been reduced. The vote follows reports that the company’s Instagram Teens feature “fails spectacularly on some key dimensions,” including promoting sexual, racist, and drug- and alcohol-related content. The resolution, filed by Proxy Impact on behalf of Dr. Lisette Cooper and co-filed by 18 institutional investors from North America and Europe, will be presented by child safety advocate Sarah Gardner.

“Two weeks ago, I stood outside of Meta’s office in NYC with bereaved parents whose children died as a result of sextortion, cyberbullying, and drug purchases on Meta’s platforms and demanded stronger protections for kids,” said Sarah Gardner, CEO of the Heat Initiative. “Meta’s most recent ‘solution’ is a band-aid. They promised parents that Instagram Teens would protect their kids from harm. In reality, it still recommends sexual, racist, and violent content on their feeds. We are asking shareholders to hold Mark Zuckerberg and Meta accountable and demand greater transparency about why child safety is still lagging.”

“Meta algorithms designed to maximize user engagement have helped build online abuser networks, normalize cyberbullying, enable the exponential growth of child sexual abuse materials, and flood young users with addictive content that damages their mental health,” said Michael Passoff, CEO of Proxy Impact. “And now, a major child safety concern is Meta’s doubling down on AI despite the unique threats it poses to young users. Just this year, the National Center for Missing and Exploited Children saw 67,000 reports of suspected child sexual exploitation involving Generative AI, a 1,325% increase from 2023. Meta’s continued failure to address these issues poses significant regulatory, legal, and reputational risk in addition to innumerable young lives.”

The resolution asks the Meta Board of Directors to publish “a report that includes targets and quantitative metrics appropriate to assessing whether and how Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms.” Additional information for shareholders was filed with the SEC.

Meta has been under pressure for years over online child safety risks on its platforms.

Since 2019, Proxy Impact and Dr. Cooper have worked with members of the Interfaith Center on Corporate Responsibility, pension funds, foundations, and asset managers to empower investors to utilize their leverage to encourage Meta and other tech companies to strengthen child safety measures on social media.

Proxy Impact provides shareholder engagement and proxy voting services that promote sustainable and responsible business practices. For more information, visit www.proxyimpact.com.

Heat Initiative works to hold the world’s most valuable and powerful tech companies accountable for failing to protect kids from online child sexual exploitation. Heat Initiative sees a future where children’s safety is at the forefront of any existing and future technological developments.
