Two chat and gaming companies are being sued for allegedly facilitating child sexual exploitation and abuse.
The suit claims Roblox and Discord allowed online predators to find victims.
"Both Roblox and Discord, we allege, were negligent in the services they provided to our client, a 13-year-old boy," said Anna Marie Murphy, an attorney with Cotchett, Pitre and McCarthy.
Murphy's firm, together with the firm Anapol Weiss, claims Roblox enables predators to hunt, find and exploit children online.
"The predator used a function on Roblox called a whisper function that allowed him, as a complete stranger, to be in the game with our client and send a direct message," Murphy said. "That was the first contact, and there was a request for a naked picture."
Lawyers also said their client's family had to move because of the harassment.
Security experts said sites like Roblox have become a hotspot for predators.
"They can create any sort of persona that they wish, and kids are trusting," said Cindi Carter, chief security officer at Check Point Software.
Roblox said it cannot comment on pending litigation but that it "takes the safety of its community very seriously. We are constantly innovating and launching new safety features, including more than 40 safety features and policies in 2024."
Advocates said keeping kids safe online isn't just the companies' responsibility; it's up to everyone.
"Parents and guardians do have to educate themselves on online safety," Carter said. "We can't expect that the gaming platforms are going to do all that for us."