Shareholders to Demand Action from Mark Zuckerberg and Meta on Child Safety
Meta shareholders will vote on a resolution asking Meta to assess its child safety impacts and whether harm to children on its platforms has been reduced. The vote follows reports that the company’s Instagram Teens feature “fails spectacularly on some key dimensions”, including promoting sexual, racist, and drug- and alcohol-related content. The resolution – filed by Proxy Impact on behalf of Dr. Lisette Cooper and co-filed by 18 institutional investors from North America and Europe – will be presented by child safety advocate Sarah Gardner, CEO of the Heat Initiative.
“Two weeks ago, I stood outside of Meta’s office in NYC with bereaved parents whose children died as a result of sextortion, cyberbullying, and drug purchases on Meta’s platforms, and demanded stronger protections for kids,” said Sarah Gardner, CEO of the Heat Initiative. “Meta’s most recent ‘solution’ is a band-aid. They promised parents that Instagram Teens would protect their kids from harm. In reality, it still recommends sexual, racist, and violent content on their feeds. We are asking shareholders to hold Mark Zuckerberg and Meta accountable and demand greater transparency about why child safety is still lagging.”
“Meta algorithms designed to maximize user engagement have helped build online abuser networks, normalize cyberbullying, enable the exponential growth of child sexual abuse material, and flood young users with addictive content that damages their mental health,” said Michael Passoff, CEO of Proxy Impact. “And now, a major child safety concern is Meta’s doubling down on AI despite the unique threats it poses to young users. Just this year, the National Center for Missing and Exploited Children received 67,000 reports of suspected child sexual exploitation involving generative AI, a 1,325% increase from 2023. Meta’s continued failure to address these issues poses significant regulatory, legal, and reputational risks, in addition to costing innumerable young lives.”
The resolution asks the Meta Board of Directors to publish “a report that includes targets and quantitative metrics appropriate to assessing whether and how Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms.” Additional information for shareholders has been filed with the SEC.
Meta has faced pressure for years over online child safety risks, including:
- Attorneys general from 41 states and the District of Columbia have filed lawsuits alleging that Meta Platforms intentionally built products with addictive features that harm young users.
- 1 in 8 kids under 16 reported experiencing unwanted sexual advances on Instagram within the previous 7 days, according to Meta’s internal research.
- A leading psychologist resigned from Meta’s SSI expert panel on suicide prevention and self-harm, alleging that Meta willfully neglects harmful content, disregards expert recommendations, and prioritizes financial gain.
- As many as 100,000 children were sexually harassed daily on Meta platforms in 2021; Meta took no action until its executives were called to testify before the Senate three years later.
- Internal research leaked by Meta whistleblower Frances Haugen showed that the company is aware of many harms, including Instagram’s toxic effects on teenage girls’ mental health, such as suicidal thoughts and eating disorders.
Since 2019, Proxy Impact and Dr. Cooper have worked with members of the Interfaith Center on Corporate Responsibility, pension funds, foundations, and asset managers to help investors use their leverage to press Meta and other tech companies to strengthen child safety measures on social media.
