Buyers and sellers of underage-sex content have developed a thriving network thanks to the discovery algorithms in Instagram, the photo-sharing platform owned by the Silicon Valley giant Meta, the Wall Street Journal reported on Wednesday.
"Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests," the Journal reported, based on investigations it conducted jointly with Stanford University and the University of Massachusetts Amherst.
Sexualized accounts on the platform are "brazen" about their interests but don't post illegal material openly, instead offering "menus" of content, according to the researchers. They also use certain emojis as code: a map emoji stands for "minor-attracted person" – a euphemism for pedophile – and a cheese-pizza emoji is shorthand for "child pornography," said Brian Levine, director of the UMass Rescue Lab at Amherst. Many such users describe themselves as "lovers of the little things in life," he said.