Meta (Facebook, Inc.) | Child Safety Online at Meta

Status
Filed
AGM date
Previous AGM date
Resolution details
Company ticker
FB
Lead filer
Resolution ask
Adopt or amend a policy
ESG theme
  • Social
ESG sub-theme
  • Decent work
Type of vote
Shareholder proposal
Filer type
Shareholder
Company sector
Technology
Company HQ country
United States
Resolved clause
Resolved: Shareholders request that, within one year, the Board of Directors adopt targets and publish annually a report (prepared at reasonable expense, excluding proprietary information) that includes quantitative metrics appropriate to assessing whether Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms.
Supporting statement
The internet was not developed with children in mind. Social media impacts children’s brains differently than adult brains.1 It also poses physical and psychological risks that many children and teens are unprepared for, including sextortion and grooming, hate group recruitment, human trafficking, cyberbullying and harassment, exposure to sexual or violent content, invasion of privacy, self-harm content, and financial scams, among others.
Meta is the world’s largest social media company, with billions of child and teen users. Meta’s platforms, including Facebook, Instagram, Messenger and WhatsApp, have been linked to numerous child safety impacts including:
Mental Health:
Meta’s own research shows Instagram’s negative impacts on teens’ self-image, increased rates of depression and anxiety, and a link to increased suicidal thoughts.2 Forty-two states have sued Meta claiming that Facebook and Instagram algorithms are intentionally addictive and harm kids’ mental health.3
Sexual Exploitation:
In 2022, nearly 32 million cases of online child sexual abuse material were reported; over 27 million of those (85 percent) stemmed from Meta platforms.4 Meta has started encrypting Facebook Messenger despite urgent warnings from law enforcement and child protection organizations that encryption will hide millions of reports, cloak the actions of child predators, and make children more vulnerable.5 A Wall Street Journal investigation describes how Instagram’s algorithms “connect and promote” a vast pedophile network by guiding pedophiles to sellers of child sexual abuse materials.6
Cyberbullying:
Time Magazine reported that “By one estimate, nearly 80% of teens are on Instagram and more than half of those users have been bullied on the platform.”7 A United Kingdom study ranked Instagram first in youth cyberbullying, with 42 percent reporting bullying, followed by Facebook (39 percent), and WhatsApp (17 percent).8
Data Privacy:
In 2022, Meta was fined over $400 million for failing to safeguard children's information on Instagram.9
Legislation:
The European Union’s new Digital Services Act will make identifying, reporting and removing child sexual abuse material mandatory.10 The United Kingdom’s Online Safety Bill aims to keep internet users, particularly children, safe from fraudulent and harmful content. The United States’ proposed Kids Online Safety Act enjoys public and bipartisan Congressional support and would require companies to prevent or mitigate risks to children, including suicide, eating disorders and substance abuse.11 12
Meta is facing significant regulatory, reputational, and legal risks due to these unabated issues.
Meta’s website lists some steps taken to improve child safety, but it has no publicly available, company-wide child safety or harm reduction performance targets for investors and stakeholders to judge the effectiveness of Meta’s announced tools, policies and actions.

1 https://www.apa.org/news/apa/2022/social-media-children-teens
2 https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739
3 https://www.usatoday.com/story/opinion/2023/11/03/meta-lawsuit-attorney-generals-pursue-social-media-accountability/71410913007/
4 https://www.missingkids.org/content/dam/missingkids/pdfs/2022-reports-by-esp.pdf
5 https://www.nytimes.com/2023/12/06/technology/meta-messenger-encryption.html
6 https://www.wsj.com/articles/instagram-vast-pedophile-network-4ab7189
7 https://time.com/5619999/instagram-mosseri-bullying-artificial-intelligence
8 https://techjury.net/blog/cyberbullying-statistics
9 https://www.cnet.com/news/privacy/meta-fined-400m-for-failing-to-protect-childrens-privacy-on-instagram
10 https://www.nytimes.com/2022/04/28/opinion/social-media-facebook-transparency.html?smid=em-share
11 https://abcnews.go.com/Politics/protecting-kids-online-bipartisan-cause-senators/story?id=97195752
12 https://www.nbcnews.com/tech/social-media/kosa-kids-online-safety-act-speech-censor-rcna12824

How other organisations have declared their voting intentions

Organisation name | Declared voting intention | Rationale
Comgest | For | https://www.comgest.com/-/media/comgest/esg-library/esg-en/2024-proxy-voting-pre-declaration.pdf

DISCLAIMER: By including a shareholder resolution or management proposal in this database, neither the PRI nor the sponsor of the resolution or proposal is seeking authority to act as proxy for any shareholder; shareholders should vote their proxies in accordance with their own policies and requirements.

Any voting recommendations set forth in the descriptions of the resolutions and management proposals included in this database are made by the sponsors of those resolutions and proposals, and do not represent the views of the PRI.

Information on the shareholder resolutions, management proposals and votes in this database have been obtained from sources that are believed to be reliable, but the PRI does not represent that it is accurate, complete, or up-to-date, including information relating to resolutions and management proposals, other signatories’ vote pre-declarations (including voting rationales), or the current status of a resolution or proposal. You should consult companies’ proxy statements for complete information on all matters to be voted on at a meeting.