Pump.fun has been ground zero for the meme coin market in 2024, with the platform used to create over 2.6 million Solana tokens to date—including some truly odd projects paired with headline-grabbing stunts. But a new video feature launched Wednesday was almost immediately exploited in a detestable manner.

The platform's developers confirmed to Decrypt on Thursday that a user tapped the new feature to upload child sexual abuse material to the site, which Pump.fun said was quickly removed.

On Wednesday, the token launchpad enabled users to create meme coins with tokenized videos, adding fresh functionality where users could previously only include images and GIFs. This resulted in a slew of edgy, quirky, and otherwise amusing projects being launched. But at least one user took the opportunity to spread horrific child abuse material.


After a user reported the existence of a video on the platform that apparently depicted the sexual abuse and torture of a minor, Pump.fun confirmed to Decrypt that such content had been uploaded by a token creator and that it was removed "shortly" thereafter. The platform's developers emphasized that such content is not welcome on the platform.

“We have built systems and have a team that handles the moderation of coins and messages in the coins' comment sections, and have been able to remove, ban, and report many cases of similar conduct in the past, all without affecting the end user,” pseudonymous Pump.fun co-founder Alon told Decrypt. “Hence why this is probably the first time you're hearing about a case like this.”

Pump.fun isn’t alone in this fight against the spread of child sexual abuse materials, which is reportedly at an all-time high.

Last year, Twitter suspended 12.4 million accounts for violating its child sexual exploitation policies. A report by the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) found that Twitter only accounted for 3% of child sexual abuse material reported to the police, with Snapchat accounting for 44% of all flagged content.


“The sharing of child sexual abuse material online is at record levels and has a devastating impact on young victims, putting them at risk of further abuse and trauma each time an image is shared,” a spokesperson for children’s charity NSPCC told Decrypt.

Under the Communications Decency Act (CDA), online platforms are generally not liable for content posted by their users—but a platform can be found liable if it does not promptly remove such content once notified.

“This is one of the biggest challenges and legal issues in today’s digital culture,” Andrew Rossow, a digital media attorney and the CEO of AR Media Consulting, told Decrypt.

Rossow recommends that platforms like Pump.fun be proactive in content moderation, provide accessible reporting mechanisms, and maintain clear removal policies.

Pump.fun has a token report feature, as well as a support Telegram chat to report such instances. The Twitter user who originally reported the child abuse material later confirmed that the platform dealt with the scenario professionally.

“Any forms of illegal content, especially those affecting children, have never been and will never be welcome on Pump.fun,” Alon told Decrypt. “And we will continue to take steps to make it more difficult for malicious actors to spread such content.”

As for the individuals who created the token(s), if identified, they could face serious criminal and civil charges, said Rossow.

Edited by Andrew Hayward
