The messaging platform Discord recklessly exposes children to graphic violent content, sexual abuse and exploitation, New Jersey’s attorney general said in a lawsuit filed on April 17. New Jersey is the first state in the country to file suit against Discord, whose 200 million users can post in chat rooms and exchange direct messages with one another. Founded in 2015 as a chat tool for gamers, it has exploded in popularity in recent years among children, a trend that accelerated at the height of the pandemic.
The app’s popularity and lax safety controls have made its users easy targets for predators, prosecutors said in the suit, which was filed in Superior Court in Essex County. “We’ve seen child exploitation, child sexual abuse material, grooming, kidnapping – all kinds of awful things that happened on this platform, and the company has just simply not done enough,” the attorney general, Matt Platkin, said in an interview. “They put their profits ahead of the safety of our kids.”
Discord’s users must be 13 or older, according to the platform’s policies. But the suit says that because Discord accounts are so easy to create, and because users can use pseudonyms, younger children can evade the age restrictions with little difficulty and adults can readily pose as children. The complaint cites several criminal cases against adults in New Jersey who were accused of using the app to engage in explicit communication with children, solicit and send nude pictures and take part in sexual acts on video chat.
Jillian Susi, a spokesperson for Discord, disputed the lawsuit’s claims in a statement. “Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer,” Susi said. “Given our engagement with the attorney general’s office, we are surprised by the announcement that New Jersey has filed an action against Discord today.”
The company emphasized that it had worked to improve safety for younger users. In 2023, it introduced a series of features for parents and teenagers, including safety alerts and content filters on direct messages. The company has developed technology to detect child sexual abuse material and says it promptly takes action against those who violate its policies, including by banning accounts and providing information to the National Center for Missing & Exploited Children.
New Jersey is not the only state where people have been accused of using Discord to target children. On Sunday, a California man was arrested and charged with kidnapping and engaging in unlawful sexual conduct with a minor after a 10-year-old girl was reported missing and the police found that the two had been communicating on Discord and Roblox, a gaming site popular with children. In February, Discord and Roblox were named in a lawsuit filed in California on behalf of a 13-year-old New Jersey boy who, according to the suit, was sexually exploited by an adult stranger on the apps.
In one especially gruesome case, a 47-year-old man in Michigan used Discord to contact children and advertise “livestreams of children engaging in self-mutilation” and sexually explicit activity, prosecutors said. He was sentenced to 30 years in prison. A 2023 investigation by NBC News found 35 cases over a six-year period of grooming, sexual assault or kidnapping that involved communication on Discord.
The day after the report was published, Discord’s CEO, Jason Citron, said that he and the company took “this stuff very seriously”. He added, “As a parent, it’s horrifying.” During a Senate hearing in January 2024, lawmakers grilled Citron and the executives of other social media companies, including Meta, TikTok and X, about what they were doing to protect children from harmful content on their sites.
Lawmakers told the executives that they had “blood on their hands” and had created “a crisis in America”. At the hearing, Citron said Discord was working with a tech company founded by actor Ashton Kutcher to detect predatory conversations. Discord was aware that its young users were vulnerable, the New Jersey suit argues.
But it marketed its platform to parents as safe anyway, highlighting a feature that it said would automatically identify and delete direct messages containing explicit images or videos. From 2017 to 2023, the app’s default setting applied the feature only to messages between users who were not friends, prosecutors said. Haley Hinkle, policy counsel at Fairplay, a nonprofit children’s advocacy group, said that apps’ default settings matter because few users go to the trouble of changing them, and that parents can find it difficult to navigate the settings of every platform their children use.
Hinkle said the suit signaled that states were willing to take on platforms such as Discord. “This is one step and one attempt at accountability, and we’re really trying to get stronger rules of the road so that we don’t have to depend on individual actions like this,” she said.
©2025 The New York Times Company. This article originally appeared in The New York Times.
Discord app exposes children to abuse and graphic content, lawsuit says

The messaging platform misled parents about its safety settings and turned a blind eye to explicit content, New Jersey prosecutors said.