Georgia couple arrested for allegedly using adopted children to make child porn



A Georgia couple have been arrested for allegedly using their two adopted children to make child porn, according to police.

Around 4 p.m. on July 27, detectives with the Criminal Investigation Division of the Walton County Sheriff's Office raided a home in Loganville – a town about 35 miles northeast of Atlanta. The officers had received tips that a man at the residence was downloading child sexual abuse material (CSAM) and obtained a search warrant.

"During the subsequent interview with the suspect, it was discovered that he was, in fact, receiving/collecting CSAM and that a secondary suspect lived in Walton County," the Walton County Sheriff's Office said in a statement. "Detectives were told that this secondary suspect was producing homemade child sexual abuse material with at least one child who lived in the home with the perpetrator."

At approximately 11:30 p.m. the next day, police officers executed a search warrant at a home in the nearby town of Oxford.

During the search of the home, police officers discovered that the two adult males who lived at the residence were "engaging in sexually abusive acts and video documenting this abuse," according to the Walton County Sheriff's Office. The two men allegedly used their two adopted children to make child porn.

William Dale Zulock, 32, and Zachary Jacoby Zulock, 35, were arrested. The couple were booked at the Walton County Jail and charged with aggravated child molestation, sexual exploitation of children, and enticing a child for indecent purposes, according to WSB-TV. One of the men was reportedly also charged with child molestation.

The Walton County Sheriff's Office was able to identify the young victims with assistance from the Walton County school district.

The Georgia Bureau of Investigation provided operational support and analytics for this case, officials said.

Authorities said the victims are "now safe" after the Walton County Division of Family & Children Services (DFCS) took the young boys into care.

Deputies say their investigation is ongoing.

Commentary: Alleged child sex abuser operated for four years on Twitter. He had 290,000 followers.



It’s not rare for people who distribute child sexual abuse material on Twitter to be arrested. Evil exists in the world, but it’s great when it’s stopped and not allowed to continue.

A female former child care worker was arrested for trying to sell child sexual abuse material on Twitter. Savannah Noel Hawthorne, 22, was arrested for distribution and possession of child sexual abuse material, according to the Stillwater Police Department in Oklahoma. “On Friday June 3, 2022 SPD detectives, along side with Enid and OSBI, served a warrant in Enid in the Hawthorn residence where Hawthorn was arrested,” said Lt. T.J. Low.

Unfortunately, every single piece of child sexual abuse material depicts a minor being sexually abused. It is something stolen from each victim that they can never get back. To make matters worse, it has been documented and is often shared. In some cases, for the child being sexually abused in the imagery, time is of the essence. It can be a matter of life or death to locate them and get them out of an abusive situation as quickly as possible. In 2021, Twitter made 86,666 reports of child sexual abuse material to the National Center for Missing and Exploited Children.

This next case stood out to me right away because it’s particularly egregious on multiple levels. On June 9, 2022, Nitikorn Manop, 29, was arrested in the Cha-am district of Phetchaburi, Thailand. The suspect was charged with human trafficking, child exploitation, and production and distribution of child pornography, Pol. Maj. Gen. Wiwat Khamchamnarn, head of the Anti-Trafficking in Persons Division, said on June 12. What stood out about this story is that this predator operated on Twitter for four years and had 290,000 followers. He allegedly used Twitter in some cases to lure minors. He would sexually abuse minors, film them without their consent, and often post the abuse to Twitter’s main feed. He also reportedly sold links and subscription access so that others could view this abuse. Why did it take so long to catch this predator? How much advertising revenue did this profile, built on child sexual abuse content, generate for Twitter?

In four years of operating openly on Twitter, were there really no reports made on this specific account by concerned Twitter users or survivors? If reports were made on this account, were they ignored or, worse, reviewed and left up, as we’ve seen in the case of John Doe #1 and John Doe #2 against Twitter?

In a situation where there are no reports, why didn’t Twitter’s proprietary tools catch this predator? The technology that Twitter uses can most certainly scan for keywords, account behaviors, and suspicious activity. We’ve seen very real examples of how advanced this technology is during COVID-19, the Hunter Biden story, and anything to do with election fraud, so why would Twitter fall short in this case?

Those issues would have been classified as “harmful” by Twitter, but isn’t child sexual abuse imagery harmful? It’s harmful to the victims and harmful to people who inadvertently stumble on this content. It’s not as if they didn’t have a heads-up that this type of behavior was a problem. In May 2020, a similar case occurred with Twitter. “The man, Thai media reports as ‘Kittiphong’, uploaded and sold the child pornography on Twitter and the messaging app Line. He had a membership fee of 350 baht, or around $11 USD. Authorities gained information through the online reporting system CyberTip, the Chiang Rai Times reports. Police from immigration, anti-trafficking division and the Thailand Internet Crimes Against Children Task Force raided Kittiphong’s room in Bangkok and found 8 cell phones with photos of children. The background of the photos matched the man’s room. He was arrested and charged with child pornography for financial exploitation. He is also facing charges of sexually abusing children 13 years old and younger.”

I suggest that Twitter stop trying desperately to control the political narrative around the issues of the moment and instead innovate around the issues that matter. Invest in detecting, removing, and reporting child sexual abuse material, because there are real-world consequences when platforms fall short in response time. I spoke about these issues and possible solutions in a recent article for TheBlaze.

The children abused often have no voice, so we must speak for them until they have the opportunity to speak for themselves. Speaking about these issues applies pressure on tech companies like Twitter to make changes. Hopefully we can start to prevent these crimes before they happen, and if they do happen, tech companies must act in a timely fashion.

Twitter moves to dismiss child porn lawsuit citing Section 230 immunity



Twitter has filed a motion to dismiss a lawsuit from a minor who claims that the social media platform refused to remove child porn that featured him and another 13-year-old, citing its immunity under Section 230 of the Communications Decency Act.

"Congress recognized the inherent challenges of large-scale, global content moderation for platforms, including the potential for liability based on a platform's alleged 'knowledge' of offensive content if it chose to try to screen out that material but was unable to root out all of it," Twitter's legal team states in a motion to dismiss filed Wednesday with the U.S. District Court for the Northern District of California. "Hoping to encourage platforms to engage in moderation of offensive content without risking incurring potentially ruinous legal costs, in 1996 Congress enacted Section 230 of the Communications Decency Act ('CDA § 230'), granting platforms like Twitter broad immunity from legal claims arising out of failure to remove content."

"Given that Twitter's alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone," Twitter argues.

The 17-year-old plaintiff, a victim of sex traffickers identified as "John Doe" because he is a minor, alleges in a lawsuit filed in January that Twitter refused to take down pornographic images and videos of him and another teen that surfaced in 2019 because Twitter "didn't find a violation of our policies."

The plaintiff's mother, Jane Doe, allegedly had to contact a U.S. Department of Homeland Security agent who reached out to Twitter and convinced the company to take down the images and video.

The lawsuit accuses Twitter of benefitting from child sex trafficking, failing to report known child sex abuse material, knowingly distributing child pornography, intentionally distributing non-consensually shared pornography, and possessing child pornography, among other complaints.

In the motion to dismiss, Twitter said the plaintiff "appears to have suffered appallingly" at the hands of his sex traffickers, but denies culpability for the images and videos shared on its platform:

This case ultimately does not seek to hold those Perpetrators accountable for the suffering they inflicted on Plaintiff. Rather, this case seeks to hold Twitter liable because a compilation of that explicit video content (the "Videos") was — years later — posted by others on Twitter's platform and although Twitter did remove the content, it allegedly did not act quickly enough. Twitter recognizes that, regrettably, Plaintiff is not alone in suffering this kind of exploitation by such perpetrators on the Internet. For this reason, Twitter is deeply committed to combating child sexual exploitation ("CSE") content on its platform. And while Twitter strives to prevent the proliferation of CSE, it is not infallible.

But, mistakes or delays do not make Twitter a knowing participant in a sex trafficking venture as Plaintiff here has alleged. Plaintiff does not (and cannot) allege, as he must, that Twitter ever had any actual connection to these Perpetrators or took any part in their crimes. Thus, even accepting all of Plaintiff's allegations as true, there is no legal basis for holding Twitter liable for the Perpetrators' despicable acts.

Over 20 million online child sexual abuse material incidents reported on Facebook, by far the most of all platforms



A new report from the National Center for Missing and Exploited Children said that the vast majority of reported online child sexual abuse material came from Facebook.

The NCMEC's annual report for 2020 claimed the organization's CyberTipline received more than 21.7 million reports of online child exploitation; 21.4 million of these reports came from electronic service providers. There were 20,307,216 reported incidents related to child pornography or trafficking on Facebook, including Instagram and WhatsApp, which the social media behemoth owns.

For comparison, Google reported 546,704 incidents of CSAM, Snapchat found 144,095, Microsoft had 96,776, Twitter cited 65,062, TikTok had 22,692, and Reddit reported 2,233 instances of apparent child sexual abuse material.

MindGeek, the Canada-based parent company of several adult content websites, reported far fewer incidents. MindGeek, which owns Pornhub, YouPorn, RedTube, and Brazzers, reported 13,229 instances of child sexual abuse material last year.

The Internet Watch Foundation, which helps "victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse," claimed it found 118 incidents of videos containing child sexual abuse or rape on Pornhub between 2017 and 2019.

In December, Pornhub faced scrutiny after the New York Times published multiple allegations of sexual exploitation on the adult content website.

The streaming behemoth, which drew 3.5 billion visits per month in 2019, introduced new guidelines in December to prevent underage pornography from being uploaded to the site.

"Going forward, we will only allow properly identified users to upload content," Pornhub said in a statement. "We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit organizations."

Pornhub also noted that it voluntarily registered as an electronic service provider for the National Center for Missing and Exploited Children's data collection.

Regarding Facebook accounting for the overwhelming majority of the alleged CSAM incidents, the National Center for Missing and Exploited Children stated:

Higher numbers of reports can be indicative of a variety of things including larger numbers of users on a platform or how robust an ESP's efforts are to identify and remove abusive content. NCMEC applauds ESPs that make identifying and reporting this content a priority and encourages all companies to increase their reporting to NCMEC. These reports are critical to helping remove children from harmful situations and to stopping further victimization.

As of April 2020, Facebook was the most popular social media platform, with nearly 2.5 billion active users.

The NCMEC said reports to the CyberTipline increased by 28% from 2019.

"The 21.7 million reports of child sexual exploitation made to the CyberTipline in 2020 included 65.4 million images, videos and other files," the NCMEC said. "These materials contained suspected child sexual abuse material (CSAM) and other incident related content."

Reports to the CyberTipline by the public more than doubled in 2020.

The numbers from NCMEC are reported instances and are not confirmed cases of abuse.

The NCMEC's CyberTipline is a "centralized reporting system for the online exploitation of children," where the public and ESPs "can make reports of suspected online enticement of children for sexual acts, extra-familial child sexual molestation, child pornography, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet."

Ahead of the NCMEC's report, Facebook announced on Tuesday that it was introducing new measures to prevent "people from sharing content that victimizes children," as well as improvements to detecting and reporting inappropriate content.

"To understand how and why people share child exploitative content on Facebook and Instagram, we conducted an in-depth analysis of the illegal child exploitative content we reported to the National Center for Missing and Exploited Children (NCMEC) in October and November of 2020," Facebook said in a statement.

"We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period," the social media network stated. "While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many."

Facebook reported 150 accounts to the NCMEC for "uploading child exploitative content in July and August of 2020 and January 2021," and found that over 75% of these users "did not exhibit malicious intent." Instead, they "appeared to share for other reasons, such as outrage or in poor humor."

Facebook will now have a "pop-up" that appears whenever a user searches for terms associated with child exploitation. There will also be a "safety alert that informs people who have shared viral, meme child exploitative content about the harm it can cause and warns that it is against our policies and there are legal consequences for sharing this material."

Facebook said accounts that share and promote CSAM would be removed.

"Using our apps to harm children is abhorrent and unacceptable," Facebook's news release read. "Our industry-leading efforts to combat child exploitation focus on preventing abuse, detecting and reporting content that violates our policies, and working with experts and authorities to keep children safe."

Texas man agrees to 10-year prison sentence over thousands of child porn images found on his cellphone



A man in Tyler, Texas, agreed in a plea deal on Friday to serve 10 years in prison after he was caught with thousands of images of child pornography on his cellphone.

According to the Tyler Morning Telegraph, police said they received a tip from the National Center for Missing and Exploited Children, based on a report from Google, about someone being in possession of images depicting the sexual abuse of children.

Police were able to locate James Edward Clement, 38, at his girlfriend's house by investigating his Internet Protocol address. Police said they found more than 3,500 images of child pornography on his phone.

According to a probable cause affidavit, Clement became emotional when he told police that he had been abused himself as a child. He told police that he would never hurt a child, but said that his drug abuse also contributed to his seeking out images of child abuse.

Clement also told police that no one knew about the images on his phone until they contacted him about them. He said he would take responsibility for his actions.

He had been in county jail since July 9, when he was first arrested, and will receive 198 days of credit against his 10-year prison sentence.

Detectives with the Major Crimes Unit and Homeland Security Special Agents assisted the Tyler Police Department with serving the warrant against Clement.

Here's a local news report about the guilty plea:

Tyler man sentenced 10 years for having 3,500 photos of child pornography (www.youtube.com)

Twitter sued by minor after platform refused to remove child porn because it 'didn't find a violation,' lawsuit claims



Twitter is facing a lawsuit from a minor who claims that the social media platform refused to remove child porn that featured him and another 13-year-old. The social media platform reviewed the graphic content, and "didn't find a violation of our policies," according to the lawsuit.

The lawsuit, which was filed on Wednesday in U.S. District Court for the Northern District of California, states, "This lawsuit seeks to shine a light on how Twitter has enabled and profited from CSAM (child sexual abuse material) on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity."

The minor, identified only as "John Doe," alleges that he started communicating on Snapchat with someone who was posing as a 16-year-old female classmate when he was 13.

The minor, who is now 17 years old, allegedly exchanged nude photos with the Snapchat user, whom the lawsuit describes as a sex trafficker. The trafficker then allegedly blackmailed the boy, threatening to share the explicit photos with his "parents, coach, pastor" and others unless he provided more sexual photos and videos.

The minor complied and sent videos of himself performing sex acts, according to the lawsuit. The blackmailers demanded that Doe make child porn with another minor, which he reportedly did.

Doe eventually blocked the child porn traffickers, but the sex videos of him surfaced on Twitter in 2019. "Twitter was alerted by a concerned citizen" that a user account was disseminating CSAM, the suit claims.

The minor became aware of the distribution of his videos online in mid-January after classmates saw the explicit video, The New York Post reported.

"Due to the circulation of these videos, he faced teasing, harassment, vicious bullying, and became suicidal," the lawsuit reads. "John Doe spoke to his parents about what was happening and sought their help."

"His mother, Jane Doe, took immediate action to have the CSAM removed," the suit says. "She contacted school officials, local law enforcement, and reached out directly to Twitter. John Doe attempted to contact the Twitter users who had posted the CSAM video depicting him, informed them that the video was of him, that he was a minor in the video, and asked them to remove the posts. One of these Twitter users removed the post, however, the other ignored John Doe's request and kept the video live."

Twitter replied to John Doe with an email on Jan. 28, 2020, that reportedly said, "Thanks for reaching out. We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time."

According to the lawsuit, Doe responded to Twitter, "What do you mean you don't see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down."

Doe alleges that he included the case number from a local law enforcement agency, but Twitter didn't remove the content.

The video was viewed more than 167,000 times and had 2,223 retweets, the lawsuit claims.

Two days later, Jane Doe contacted a U.S. Department of Homeland Security agent, who was able to convince the big tech company to delete the child porn, the lawsuit alleges.

"Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children ('NCMEC')," the lawsuit states. "This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children."

The lawsuit accuses Twitter of benefiting from a sex trafficking venture, failing to report known child sexual abuse material, knowingly distributing child pornography, intentionally distributing non-consensually shared pornography, and possessing child pornography, among other complaints.

"Twitter's conduct was malicious, oppressive, or in reckless disregard of John Doe's rights and he is entitled to injunctive relief, compensatory and punitive damages, and the costs of maintaining this action," the lawsuit reads.

Twitter declined to comment when contacted by The New York Post.