
    Unredacted Meta documents reveal ‘historical reluctance’ to protect children

    By Admin | January 18, 2024


    Internal Meta documents about child safety have been unsealed as part of a lawsuit filed by the New Mexico Department of Justice against both Meta and its CEO, Mark Zuckerberg. The documents reveal that Meta not only intentionally marketed its messaging platforms to children, but also knew about the massive volume of inappropriate and sexually explicit content being shared between adults and minors. 

    The documents, unsealed on Wednesday as part of an amended complaint, highlight multiple instances of Meta employees internally raising concerns over the exploitation of children and teenagers on the company’s private messaging platforms. Meta recognized the risks that Messenger and Instagram DMs posed to underage users, but failed to prioritize safeguards, or outright blocked child safety features, because they weren’t profitable. 

    In a statement to TechCrunch, New Mexico Attorney General Raúl Torrez said that Meta and Zuckerberg enabled child predators to sexually exploit children. He recently raised concerns over Meta enabling end-to-end encryption protection for Messenger, which began rolling out last month. In a separate filing, Torrez pointed out that Meta failed to address child exploitation on its platform, and that encryption without proper safeguards would further endanger minors. 

    “For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and child exploitation,” Torrez continued. “Meta executives, including Mr. Zuckerberg, consistently made decisions that put growth ahead of children’s safety. While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta’s internal data and presentations show the problem is severe and pervasive.” 

    Originally filed in December, the lawsuit alleges that Meta platforms like Instagram and Facebook have become “a marketplace for predators in search of children upon whom to prey,” and that Meta failed to remove many instances of child sexual abuse material (CSAM) after they were reported on Instagram and Facebook. Upon creating decoy accounts purporting to be 14-year-olds or younger, the New Mexico DOJ said Meta’s algorithms turned up CSAM, as well as accounts facilitating the buying and selling of CSAM. According to a press release about the lawsuit, “certain child exploitative content is over ten times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans.”

    The unsealed documents show that Meta intentionally tried to recruit children and teenagers to Messenger, limiting safety features in the process. A 2016 presentation, for example, raised concerns over the company’s waning popularity among teenagers, who were spending more time on Snapchat and YouTube than on Facebook, and outlined a plan to “win over” new teenage users. An internal email from 2017 notes that a Facebook executive opposed scanning Messenger for “harmful content,” because it would be a “competitive disadvantage vs other apps who might offer more privacy.” 

    The fact that Meta knew that its services were so popular with children makes its failure to protect young users against sexual exploitation “all the more egregious,” the documents state. A 2020 presentation notes that the company’s “End Game” was to “become the primary kid messaging app in the U.S. by 2022.” It also noted Messenger’s popularity among 6 to 10-year-olds. 

    Meta’s acknowledgement of the child safety issues on its platform is particularly damning. An internal presentation from 2021, for example, estimated that 100,000 children per day were sexually harassed on Meta’s messaging platforms, and received sexually explicit content like photos of adult genitalia. In 2020, Meta employees fretted over the platform’s potential removal from the App Store after an Apple executive complained that their 12-year-old was solicited on Instagram. 

    “This is the kind of thing that pisses Apple off,” an internal document stated. Employees also questioned whether Meta had a timeline for stopping “adults from messaging minors on IG Direct.” 

    Another internal document from 2020 revealed that the safeguards implemented on Facebook, such as preventing “unconnected” adults from messaging minors, did not exist on Instagram. Implementing the same safeguards on Instagram was “not prioritized.” A Meta employee criticized the lack of safeguards in the comments of the document, writing that allowing adult relatives to reach out to children on Instagram Direct was a “big growth bet.” The employee also noted that grooming occurred twice as much on Instagram as it did on Facebook. 

    Meta addressed grooming in another presentation on child safety in March 2021, which stated that its “measurement, detection and safeguards” were “more mature” on Facebook and Messenger than on Instagram. The presentation noted that Meta was “underinvested in minor sexualization on IG,” particularly in sexual comments left on minor creators’ posts, and described the problem as a “terrible experience for creators and bystanders.” 

    Meta has long faced scrutiny for its failures to adequately moderate CSAM. Large U.S.-based social media platforms are legally required to report instances of CSAM to the National Center for Missing & Exploited Children (NCMEC)’s CyberTipline. According to NCMEC’s most recently published data from 2022, Facebook submitted about 21 million reports of CSAM, making up about 66% of all reports sent to the CyberTipline that year. When including reports from Instagram (5 million) and WhatsApp (1 million), Meta platforms are responsible for about 85% of all reports made to NCMEC. 
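
    As a rough back-of-the-envelope check (the exact 2022 NCMEC totals aren’t restated here, so these figures are approximations derived from the numbers cited above), the two percentages are consistent with each other:

        Total CyberTipline reports (2022): roughly 21 million / 0.66 ≈ 32 million
        Meta platforms combined: 21 million (Facebook) + 5 million (Instagram) + 1 million (WhatsApp) = 27 million
        Meta’s share: 27 million / 32 million ≈ 85%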

    This disproportionate figure could be explained by Meta’s overwhelmingly large user base of more than 3 billion daily active users, but international leaders, pointing to extensive research, have argued that Meta isn’t doing enough to mitigate these millions of reports. In June, Meta told the Wall Street Journal that it had taken down 27 networks of pedophiles over the preceding two years, yet researchers were still able to uncover numerous interconnected accounts that buy, sell and distribute CSAM. In the five months after its report, the Journal found that Meta’s recommendation algorithms continued to serve CSAM; though Meta removed certain hashtags, other pedophilic hashtags popped up in their place.

    Meanwhile, Meta is facing another lawsuit from 42 U.S. state attorneys general over the platforms’ impact on children’s mental health. 

    “We see that Meta knows that its social media platforms are used by millions of kids under 13, and they unlawfully collect their personal info,” California Attorney General Rob Bonta told TechCrunch in November. “It shows that common practice where Meta says one thing in its public-facing comments to Congress and other regulators, while internally it says something else.”


