Parents are enraged after discovering that Meta, the parent company of Instagram, used back-to-school photographs of schoolgirls to advertise its new social media platform Threads to a 37-year-old man. The incident highlights troubling issues of privacy and consent, especially the use of identifiable images of minors without notifying their parents.
The original promotional post featured a large “Get Threads” button alongside a portrait of a 15-year-old schoolgirl taken from her Instagram account. The picture was used in an advertisement targeted directly at adult men, in particular men aged 44 and older. Parents are understandably outraged that Meta would use their children’s images for profit without first requesting consent, calling the company’s actions egregious and perverse.
According to reports, Meta obtained the images from public posts uploaded by parents, and the company argues that this was compliant with its policies since the photos were publicly shared. Yet the tactic has raised the ire of families who say their children’s safety and privacy were sacrificed for company profits.
One mother, whose daughter’s photo was used without consent, expressed her anger:
“For me it was a picture of my daughter going to school. I had no idea Instagram had picked it up and are using it as a promotion. It’s absolutely disgusting. She is a minor.”
Another parent voiced similar sentiments, emphasizing her disapproval of how the company leveraged her daughter’s image:
“Not for any money in the world would I let them use a girl dressed in a school uniform to get people on to [its platform].”
Thirty-seven-year-old London resident Abadesi Osunsade started to see advertising posts crowding his Instagram feed. He said he was hit by an avalanche of targeted ads featuring photos of girls in school uniforms, usually with their names included. He described feeling uncomfortable with Meta’s tactics.
“Over several days I was repeatedly served Meta adverts for Threads that exclusively featured parents’ images of their daughters in school uniform, some revealing their names. As a father, I find it deeply inappropriate for Meta to repurpose these posts in targeted promotion to adults.”
The case shines a harsh light on the way that social media platforms manage user-generated content, particularly when that content involves minors. Critics argue that Meta’s reliance on public posts to generate advertising material indicates a troubling disregard for parental consent and children’s online safety.
Beeban Kidron, a prominent advocate for children’s rights online, condemned Meta’s actions:
“Offering up school-age girls as bait to advertise a commercial service is a new low even for Meta.”
The way Meta repurposes images across its platforms is also at issue. Threads automatically cross-posts content from Instagram, regardless of account settings, allowing potentially sensitive material to reach unintended audiences. This has further fuelled demands for greater protections over how platforms use children’s images.
Kidron added:

“At every opportunity Meta privileges profit over safety, and company growth over children’s right to privacy. It is the only reason that they could think it appropriate to send pictures of schoolgirls to a 37-year-old man – as bait – Meta is a wilfully careless company.”

One father of a 13-year-old described his feelings upon discovering that his daughter’s image had been used in what he perceived as an exploitative manner:

“When I found out an image of her has been exploited in what felt like a sexualised way by a massive company like that to market their product it left me feeling quite disgusted.”

Backlash against Meta’s practices has reportedly sparked a class-action lawsuit. Meta, for its part, defended the campaign in a statement:

“The images shared do not violate our policies and are back-to-school photos posted publicly by parents. We have systems in place to help make sure we don’t recommend Threads shared by teens, or that go against our recommendation guidelines, and users can control whether Meta suggests their public posts on Instagram.”
Parents are left questioning not just Meta’s practices but also the broader implications for online safety and privacy for children. The incident has reignited debates about how social media companies should handle user-generated content and protect the rights of minors.
As outrage continues to build across various platforms, many are advocating for stricter regulations regarding the use of children’s images in advertising. Kidron emphasized this need, stating:
“Companies cannot offer sexualised images of children as bait to unknown men.”