Florida's historic sting rescues dozens of kids and arrests alleged predators in nation's 'largest' child rescue sweep



On Monday, the U.S. Marshals Service for the Middle District of Florida stated that its two-week initiative, Operation Dragon Eye, had three key objectives: saving missing children, providing them with services, and deterring bad actors.

The USMS announced that the Tampa Bay-area mission, conducted alongside 20 federal, state, and local government agencies, recovered 60 "critically missing" children: "those at risk of crimes of violence or those with other elevated risk factors such as substance abuse, sexual exploitation, crime exposure, or domestic violence."


The operation also resulted in the arrest of eight individuals who are facing charges including human trafficking, child endangerment, narcotics possession, and custodial interference. Their bonds ranged from no bond to $250 million.

During a Monday press conference, Attorney General James Uthmeier noted that the initiative was the "largest child rescue operation not just in Florida's history, but in the United States' history."

He explained that some of the children recovered were the victims of trafficking.

"Many of these kids have been through painful, disastrous situations, but at least today we've rescued them, and we now can work towards recovery," Uthmeier said.


The minors, ranging from 9 to 17 years old, were provided with medical and psychological care, nourishment, and appropriate placement.

U.S. Marshal William Berger stated, "I have to curtail my enthusiasm because of the sensitivity of the victims involved in this operation, but the successful recovery of 60 missing children, complemented with the arrest of eight individuals, including child predators, signifies the most successful missing child recovery effort in the history of the United States Marshals Service; or to my knowledge, any other similar operation held in the United States."

Callahan Walsh, the executive director of the National Center for Missing & Exploited Children, said that the operation's success was "a testament to what's possible when agencies unite with a shared mission to protect children."

"We're proud to have supported the U.S. Marshals Service and our partners in Florida to recover these missing children and provide critical support to those who need it most. NCMEC is honored to stand alongside these teams and will continue working tirelessly to help make sure that every child has a safe childhood," Walsh added.


Your kids' iPhones may be the most dangerous things they own



What’s an acceptable level of online child sexual abuse, blackmail, and sextortion? How many teen suicides must happen before someone acts? Most parents would say the answer is obvious: zero.

Apple doesn’t seem to agree. Despite serving as the constant digital companion for millions of American kids, the company has done nothing to rein in the iMessage app — a tool that now functions as an unregulated playground for child predators. Apple has shrugged off the problem while iMessage becomes the wild west of child exploitation: unchecked, unreported, and deadly.

It’s long past time for Apple to confront the truth: Its inaction empowers predators. And that makes the company complicit and accountable.

You wouldn’t leave a toddler alone by the pool. You wouldn’t hand your 9-year-old the keys to a pickup. And when you drive that truck, you don’t let your kid ride on the hood. But every day, parents hand their children a device that could be just as dangerous: the iPhone.

That device follows them everywhere — to school, to bed, into the darkest corners of the internet. The threat doesn’t just come from YouTube or TikTok. It’s baked into iMessage itself — the default communication tool on every iPhone, the one parents use to text their kids.

Unlike social media platforms or games, iMessage gives parents almost no tools to limit its use or increase safety. No meaningful restrictions. No guardrails. No accountability.

Criminals understand this — and they take full advantage. They generate fake nude images of boys and send them via iMessage. Then, they threaten to release the images to the victims’ classmates and followers unless they pay up. It’s extortion. It’s emotional torture. And it often ends in tragedy.

This isn’t rare. It’s growing. Online child sexual abuse and sextortion are spreading fast — and Apple refuses to act.

Why do predators prefer iMessage over apps like WhatsApp or Snapchat? According to law enforcement and online safety experts, iMessage offers “an appealing venue” for grooming — a place where predators can build trust with your child. They identify victims on public platforms, then move the conversation to iMessage, where no safety guardrails exist.


And children trust it. That familiar blue bubble? Apple teaches them it means the message came from a “trusted source.” Not just another text — another iPhone.

Apple claims to offer a “communication safety” feature that blurs nude images sent to kids through iMessage. But here’s the catch: The alert lets the child view the image anyway. That’s not a safety feature. That’s a fig leaf.

Apple knows exactly what iMessage enables — a criminal playground for sextortion, child sexual abuse, and worse. But Apple doesn’t act. Why? Because it doesn’t have to. The company sees no urgent economic risk. Today, 88% of American teens own iPhones. This fall, 25% are expected to upgrade to iPhone 17 — up from 22% last year.

The numbers tell the rest of the story.

In 2024, the National Center for Missing and Exploited Children identified more than 20 million cases of suspected online child sexual exploitation — much of it sextortion. Instagram reported 3.3 million. WhatsApp logged more than 1.8 million. Snapchat topped 1.1 million.

Apple reported 250.

No level of child sexual exploitation is acceptable. Not one instance. Content providers and app developers across the industry have taken steps to protect children. Apple, by contrast, has shrugged. Its silence is willful. Its inaction is a choice.

It’s long past time for Apple to confront the truth: Its inaction empowers predators. And that makes the company complicit and accountable — economically, legally, and morally.


Over 20 million online child sexual abuse material incidents reported on Facebook, by far the most of all platforms



A new report from the National Center for Missing and Exploited Children said that a vast majority of reported online child sexual abuse material was from Facebook.

The NCMEC's annual report for 2020 claimed the organization's CyberTipline received more than 21.7 million reports of online child exploitation; 21.4 million of those reports came from electronic service providers. There were 20,307,216 reported incidents related to child pornography or trafficking on Facebook, including Instagram and WhatsApp, which the social media behemoth owns.

For comparison, Google reported 546,704 incidents of CSAM, Snapchat found 144,095, Microsoft had 96,776, Twitter cited 65,062, TikTok had 22,692, and Reddit reported 2,233 instances of apparent child sexual abuse material.

MindGeek, the Canada-based parent company of several adult content websites, reported far fewer incidents. MindGeek, which owns Pornhub, YouPorn, RedTube, and Brazzers, reported 13,229 instances of child sexual abuse material last year.

The Internet Watch Foundation, which helps "victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse," claimed it found 118 incidents of videos containing child sexual abuse or rape on Pornhub between 2017 and 2019.

In December, Pornhub faced scrutiny after the New York Times published multiple allegations of sexual exploitation on the adult content website.

The streaming behemoth, which netted 3.5 billion visits per month in 2019, introduced new guidelines in December to protect against underage porn being uploaded on the site.

"Going forward, we will only allow properly identified users to upload content," Pornhub said in a statement. "We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit organizations."

Pornhub also noted that it voluntarily registered as an electronic service provider for the National Center for Missing and Exploited Children's data collection.

Regarding Facebook's overwhelming majority of the alleged CSAM incidents, the National Center for Missing and Exploited Children stated:

Higher numbers of reports can be indicative of a variety of things including larger numbers of users on a platform or how robust an ESP's efforts are to identify and remove abusive content. NCMEC applauds ESPs that make identifying and reporting this content a priority and encourages all companies to increase their reporting to NCMEC. These reports are critical to helping remove children from harmful situations and to stopping further victimization.

As of April 2020, Facebook was the most popular social media platform with nearly 2.5 billion active users.

The NCMEC said reports to the CyberTipline increased by 28% from 2019.

"The 21.7 million reports of child sexual exploitation made to the CyberTipline in 2020 included 65.4 million images, videos and other files," the NCMEC said. "These materials contained suspected child sexual abuse material (CSAM) and other incident related content."

Reports to the CyberTipline by the public more than doubled in 2020.

The numbers from NCMEC are reported instances and are not confirmed cases of abuse.

The NCMEC's CyberTipline is a "centralized reporting system for the online exploitation of children," where the public and ESPs "can make reports of suspected online enticement of children for sexual acts, extra-familial child sexual molestation, child pornography, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet."

Ahead of the NCMEC's report, Facebook announced on Tuesday that it was introducing new measures to prevent "people from sharing content that victimizes children," as well as improvements to detecting and reporting inappropriate content.

"To understand how and why people share child exploitative content on Facebook and Instagram, we conducted an in-depth analysis of the illegal child exploitative content we reported to the National Center for Missing and Exploited Children (NCMEC) in October and November of 2020," Facebook said in a statement.

"We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period," the social media network stated. "While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many."

Facebook reported 150 accounts to the NCMEC for "uploading child exploitative content in July and August of 2020 and January 2021," and found that over 75% of these users "did not exhibit malicious intent." "Instead, they appeared to share for other reasons, such as outrage or in poor humor."

Facebook will now show a "pop-up" whenever a user searches for terms associated with child exploitation. There will also be a "safety alert that informs people who have shared viral, meme child exploitative content about the harm it can cause and warns that it is against our policies and there are legal consequences for sharing this material."

Facebook said accounts that share and promote CSAM would be removed.

"Using our apps to harm children is abhorrent and unacceptable," Facebook's news release read. "Our industry-leading efforts to combat child exploitation focus on preventing abuse, detecting and reporting content that violates our policies, and working with experts and authorities to keep children safe."