Meta staff found Instagram subscription tool facilitated child exploitation, yet company continued implementation


TBS Report
24 February, 2024, 08:45 pm
Last modified: 24 February, 2024, 08:51 pm
Employees proposed measures to mitigate the risk of misuse of subscription accounts involving minors, but Meta pursued a different strategy, citing robust monitoring.

Meta Platforms' safety staff raised concerns last year about the misuse of new paid subscription tools on Facebook and Instagram. 

They discovered that hundreds of "parent-managed minor accounts" were using the subscription feature to sell exclusive content, often featuring young girls in bikinis and leotards, to a mostly male audience that expressed overt sexual interest in the children, reports The Wall Street Journal. 

Although the content contained no nudity and was not illegal, some parents were aware that it could be used for sexual gratification, engaging in sexual banter about their children or allowing their daughters to interact with subscribers' sexual messages.

Meta had launched the payments feature without basic child-safety protections, allowing adults to run or co-manage accounts in a child's name. These accounts drew attention from users with pedophilic interests, and Meta confirmed that its recommendation systems were promoting underage modeling accounts to users suspected of inappropriate behavior online. 

While not all parent-run accounts intentionally appealed to pedophilic users, Meta's algorithms were found to promote child-modeling subscriptions to likely pedophiles.

To address these issues, Meta could have banned subscriptions to accounts featuring child models, as other platforms do, or required accounts selling subscriptions to child-focused content to register for monitoring. 

Instead, Meta chose to build an automated system to prevent suspected pedophiles from subscribing to parent-run accounts, though this system was not always effective and could be bypassed by creating new accounts.

During the development of this automated system, Meta expanded the subscriptions program and the tipping feature, leading to instances of misuse with the "gifts" tool as well, as reported by The Wall Street Journal.

Meta stated that its creator monetization programs are closely monitored and defended its decision to expand subscriptions before implementing planned safety features. 

The company emphasized that it does not collect commissions or fees on payments to subscription accounts, removing any financial incentive to encourage user subscriptions. However, Meta does collect a commission on gifts to creators.

Spokesman Andy Stone said, "We launched creator monetization tools with a robust set of safety measures and multiple checks on both creators and their content," referring to the company's efforts to limit likely pedophiles from subscribing to children as part of their ongoing safety work.

Despite these assurances, a review by The Wall Street Journal of popular parent-run modeling accounts on Instagram and Facebook revealed significant enforcement failures. Some previously banned parent-run accounts for child exploitation had returned, gained official Meta verification, and amassed hundreds of thousands of followers. Others, banned on Instagram for exploitative behavior, continued selling child-modeling content via Facebook.

In one alarming discovery, parent-run accounts were found cross-promoting pinup-style photos of children to a 200,000-follower Facebook page dedicated to adult-sex content creators and pregnancy fetishization. Meta took down these accounts and acknowledged enforcement errors after being alerted by The Wall Street Journal. However, the company often failed to remove "backup" Instagram and Facebook profiles promoted in bios, allowing the banned accounts to continue promoting or selling material until The Wall Street Journal inquired about them.

Some of the parent-run child accounts on Instagram reviewed by The Wall Street Journal have garnered significant attention on non-Meta platforms. Screenshots from these platforms showed men routinely reposting Instagram images of the girls and discussing the willingness of specific parents to sell more risqué content privately. In some instances, users shared tips on how to track down where specific girls live, with one user describing a sex fantasy involving a 14-year-old.

Meta has faced challenges in detecting and preventing child-safety hazards on its platforms. Following The Wall Street Journal's report last year revealing the company's algorithms connecting and promoting a network of accounts openly devoted to underage-sex content, Meta established a task force in June to address child sexualization on its platforms. However, the effectiveness of this work has been limited, with tests showing platforms continuing to recommend search terms such as "child lingerie" and large groups discussing trading links of child-sex material.

These failures have led federal legislators and state attorneys general to take notice.

A recent investigation by The Wall Street Journal found that many accounts subscribing to child-modeling content on Meta's platforms had explicitly sexual usernames. These accounts, with publicly viewable profiles, appeared to mostly follow sexual cartoons and other adult content along with children. Shockingly, one subscriber even posted a video on his account claiming to show himself ejaculating.

The Wall Street Journal also discovered unsavory activity in Meta's gifts program, available to accounts with at least a thousand followers. Some popular accounts sought donations via the program in return for curating suggestive videos of young girls stretching or dancing, compiled from elsewhere.

These videos exposed the children to a mass audience that flooded the posts with suggestive emojis and sexual comments. Furthermore, the "send gift" button was featured on posts promoting links to what was indicated as child sexual-abuse videos.

In addition to these troubling parent-run minor accounts, The Wall Street Journal found numerous examples of others offering subscriptions and gifts for content that violated Meta's rules. For example, one account titled "sex porn video xvideos xnxx xxx girl sexy" purported to livestream adult videos. Another example was a 500,000-user Spanish-language group called "School Girls" offering subscription content.

Meta removed both examples after being flagged by The Wall Street Journal and stated that it had recently started screening accounts that sell subscriptions for indications of suspicious activity involving children using a tool called Macsa, short for "Malicious Child Safety Actor." The company also mentioned plans to screen buyers of such content for pedophilic behavior, although this work is not yet completed.

Other violative accounts allowed to seek payments posted exploitative images of pedestrians being crushed by vehicles and a sex scene implying rape, with some videos garnering millions of views.

Meta's Andy Stone stated that the presence of the "send gift" button did not necessarily mean that the company had actually paid money to the accounts soliciting tips. He explained that videos that successfully elicited cash donations were subject to an additional layer of review.

To encourage content production, Meta and other social-media companies have urged creators to solicit funding from their followers, either via cash "tips" and "gifts" or recurring subscription payments, with the platforms taking a cut of those payments. Meta has announced that it would not take a cut on subscriptions through at least the end of 2024 but does collect commissions on gift payments to creators.

These creator payments represent a significant shift from the core advertising business of major social-media networks, putting them in competition with more specialized platforms like Patreon and OnlyFans, which have stricter policies regarding subscriptions and tipping for content involving children. 

Patreon, for example, requires minors to have an adult guardian's permission to open an account and bans underage modeling accounts. OnlyFans, which caters to both adult-content creators and those focused on nonsexual material, completely bans minors from its platform for safety reasons, using both artificial intelligence and manual review to remove even wholesome images featuring children.

TikTok stated that it bans the sale of underage modeling content on both TikTok marketplace and via its creator-monetization services. A spokesperson for YouTube noted that while it does not inherently ban subscriptions to child-modeling content, it does not allow communication between accounts and their followers, which Meta promotes as a core feature of its subscription product.
