Georgia couple arrested for allegedly using adopted children to make child porn



A Georgia couple has been arrested for allegedly using their two adopted children to make child porn, according to police.

Around 4 p.m. on July 27, detectives with the Criminal Investigation Division of the Walton County Sheriff's Office raided a home in Loganville – a town about 35 miles northeast of Atlanta. The officers had received tips that a man at the residence was downloading child sexual abuse material (CSAM) and obtained a search warrant.

"During the subsequent interview with the suspect, it was discovered that he was, in fact, receiving/collecting CSAM and that a secondary suspect lived in Walton County," the Walton County Sheriff's Office said in a statement. "Detectives were told that this secondary suspect was producing homemade child sexual abuse material with at least one child who lived in the home with the perpetrator."

At approximately 11:30 p.m. the next day, police officers executed a search warrant at a home in the nearby town of Oxford.

During the search of the home, police officers discovered that the two adult males who lived at the residence were "engaging in sexually abusive acts and video documenting this abuse," according to the Walton County Sheriff's Office. The two men allegedly used their two adopted children to make child porn.

William Dale Zulock, 32, and Zachary Jacoby Zulock, 35, were arrested. The couple were booked at the Walton County Jail and charged with aggravated child molestation, sexual exploitation of children, and enticing a child for indecent purposes, according to WSB-TV. One of the men was reportedly also charged with child molestation.

The Walton County Sheriff's Office was able to identify the young victims with assistance from the Walton County school district.

The Georgia Bureau of Investigation provided operational support and analytics for this case, officials said.

Authorities said the victims are "now safe" after the Walton County Division of Family & Children Services (DFCS) took the young boys into care.

Deputies say their investigation is ongoing.

Twitter sued by minor after platform refused to remove child porn because it 'didn't find a violation,' lawsuit claims



Twitter is facing a lawsuit from a minor who claims that the social media platform refused to remove child porn featuring him and another 13-year-old. Twitter reviewed the graphic content and "didn't find a violation of our policies," according to the lawsuit.

The lawsuit, which was filed on Wednesday in U.S. District Court for the Northern District of California, states, "This lawsuit seeks to shine a light on how Twitter has enabled and profited from CSAM (child sexual abuse material) on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity."

The minor, identified only as "John Doe," alleges that he started communicating on Snapchat with someone who was posing as a 16-year-old female classmate when he was 13.

The minor, who is now 17 years old, allegedly exchanged nude photos with the Snapchat user, whom the lawsuit identifies as a sex trafficker. The trafficker then allegedly blackmailed the boy, threatening to share the explicit photos with his "parents, coach, pastor" and others unless he provided them with more sexual photos and videos.

The minor complied and sent videos of himself performing sex acts, according to the lawsuit. The blackmailers demanded that Doe make child porn with another minor, which he reportedly did.

Doe eventually blocked the child porn traffickers, but the sex videos of him surfaced on Twitter in 2019. "Twitter was alerted by a concerned citizen" that a user account was disseminating CSAM, the suit claims.

The minor became aware of the distribution of his videos online in mid-January 2020 after classmates saw the explicit video, The New York Post reported.

"Due to the circulation of these videos, he faced teasing, harassment, vicious bullying, and became suicidal," the lawsuit reads. "John Doe spoke to his parents about what was happening and sought their help."

"His mother, Jane Doe, took immediate action to have the CSAM removed," the suit says. "She contacted school officials, local law enforcement, and reached out directly to Twitter. John Doe attempted to contact the Twitter users who had posted the CSAM video depicting him, informed them that the video was of him, that he was a minor in the video, and asked them to remove the posts. One of these Twitter users removed the post, however, the other ignored John Doe's request and kept the video live."

Twitter replied to John Doe with an email on Jan. 28, 2020, that reportedly said, "Thanks for reaching out. We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time."

According to the lawsuit, Doe responded to Twitter, "What do you mean you don't see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down."

Doe alleges that he included the case number from a local law enforcement agency, but Twitter didn't remove the content.

The video was viewed more than 167,000 times and had 2,223 retweets, the lawsuit claims.

Two days later, Jane Doe contacted a U.S. Department of Homeland Security agent, who was able to convince the big tech company to delete the child porn, the lawsuit alleges.

"Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children ('NCMEC')," the lawsuit states. "This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children."

The lawsuit accuses Twitter of benefiting from a sex trafficking venture, failing to report known child sexual abuse material, knowingly distributing child pornography, intentionally distributing non-consensually shared pornography, and possessing child pornography, among other claims.

"Twitter's conduct was malicious, oppressive, or in reckless disregard of John Doe's rights and he is entitled to injunctive relief, compensatory and punitive damages, and the costs of maintaining this action," the lawsuit reads.

Twitter declined to comment when contacted by The New York Post.