USA: Alaska man caught with over 10,000 images of child sexual abuse despite using numerous encrypted apps


The rise of child sexual abuse material (CSAM) is one of the darkest Internet trends. Yet after years of covering CSAM cases, I've noticed that few of those arrested are highly tech-savvy. (Perhaps this is simply because tech-savvy people are more likely to evade arrest.)

Most know that what they are doing is illegal and that password protection is required, both for their devices and for online communities. Some may also use tools such as TOR (The Onion Router). And increasingly, encrypted (or at least encryption-capable) chat apps may be in play.

But I had never seen anyone arrested with three Samsung Galaxy phones filled with “tens of thousands of videos and images” of CSAM, all hidden behind a secrecy-focused, password-protected app called “Calculator Photo Vault.” Nor had I ever seen anyone arrested for CSAM after using all of the following:

  • Potato Chat (“Use the most advanced encryption technology to ensure information security.”)
  • Enigma (“The server only stores the encrypted message and only the user’s client can decrypt it.”)
  • Nandbox [presumably the Messenger app] (“Free secure calls and messages”)
  • Telegram (“To date, we have disclosed 0 bytes of user data to third parties, including governments.”)
  • TOR (“Browse privately. Explore freely.”)
  • Mega NZ (“We use zero-knowledge encryption.”)
  • Web-based generative AI tools/chatbots

Yet that is exactly what happened this week, when charges were announced against the driver of a US military heavy truck in Alaska.

According to the government, Seth Herrera not only used all of these tools to store and download CSAM, but he also created his own—in two disturbing variations. First, he took photos of naked underage children himself and later “enhanced and manipulated those images using AI technology.”

Second, he took the images he created and “then used AI chatbots to ensure that these underage victims were portrayed as having had the type of sexual contact he wanted to see.” In other words, he used AI to create fake CSAM – but based on images of real children.

The material was allegedly stored behind password protection on his phone(s), but also on Mega and Telegram, where Herrera allegedly “created his own public Telegram group to store his CSAM.” He also joined “several CSAM-related Enigma groups” and frequently visited dark-web sites with slogans like “The only child porn site you need!”

Despite all these precautions, Herrera's home was searched and his phones were seized by Homeland Security, and he was arrested on August 23. In a court document filed the same day, a prosecutor noted that Herrera was “arrested this morning with another smartphone – the same make and model as one of his previously seized devices.”

Still caught

The government is tight-lipped about how exactly this criminal activity came to light, noting only that Herrera “attempted to access a link that apparently contained CSAM.” Presumably, this “apparent” CSAM was a government honeypot file or a web-based redirect that logged the IP address and other relevant information of anyone who clicked on it.

In the face of that fatal click, all the technical sophistication of “I'll hide it behind an encrypted app that looks like a calculator!” ultimately didn't accomplish much. Forensic examinations of Herrera's three phones now form the main basis of the charges against him, and Herrera himself is said to have “admitted to viewing CSAM online for the past year and a half” in an interview with authorities.

Because Herrera himself has a young daughter, and because “six children live in his four-family building alone” on Joint Base Elmendorf-Richardson, the government has asked a judge not to release Herrera on bail before his trial.
