The 404 Media Podcast

404 Media

  • 44 minutes 50 seconds
    Privacy Under Pressure (With Harlo Holmes)

    In this week’s interview, Sam is joined by Harlo Holmes. Harlo is the Chief Security Programs Officer at Freedom of the Press Foundation. She’s a media scholar, software programmer, and activist.

    Harlo and Sam discuss the important work she does every day, and why it’s only becoming more crucial. They also get into how to fight back against privacy nihilism, digital security practices everyone can implement regardless of their threat model, and the recent arrests and raids of journalists in the U.S.

    Listen to the weekly podcast on Apple Podcasts, Spotify, or YouTube. Become a paid subscriber for access to this episode's bonus content and to power our journalism. If you’re a paid subscriber, check your inbox for an email from our podcast host Transistor for a link to the subscribers-only version! You can also add that subscribers feed to your podcast app of choice and never miss an episode that way. The email should also contain the subscribers-only unlisted YouTube link for the extended video version. It will also be in the show notes in your podcast player.


    Learn more about your ad choices. Visit megaphone.fm/adchoices

    23 February 2026, 11:00 am
  • 1 hour 10 seconds
    Inside an AI-Powered School

    This week we start with Emanuel’s wild story about Alpha School, a very hyped AI-powered school. Emanuel got leaked documents and spoke to former employees. After the break, Sam tells us what happens when someone decides to make an AI nudify OnlyFans with your likeness. In the subscribers-only section, Joseph tells us about the agencies buying GeoSpy, an AI that can geolocate photos in seconds.


    2:49 - Understood: Deepfake Porn Empire

    5:47 - 'Students Are Being Treated Like Guinea Pigs:' Inside an AI-Powered Private School

    40:01 - 'The Most Dejected I’ve Ever Felt:' Harassers Made Nude AI Images of Her, Then Started an OnlyFans


    YouTube version: https://youtu.be/fy-38hIhykQ



    18 February 2026, 11:00 am
  • 49 minutes 41 seconds
    What It’s Like to Be a Data Labeler Training AI

    I recently traveled to Kenya for a journalism and AI conference. While I was there, I really wanted to meet Michael Geoffrey Asia, the secretary general of the Data Labelers Association. Data labeling is a huge job in Kenya. Data labelers are the people who train AI, and who also work on ensuring its outputs are accurate. In some cases, data labelers are themselves pretending to be AI in order to train AI. Often, data labelers don’t know exactly what they’re working on, because the work usually goes through a platform, a subcontractor, or a combination of both. They can be presented with a backend where they’re asked to perform tasks or answer questions; in some cases their answers may be presented in real time as AI.

    Data labeling is notoriously brutal and underpaid work. Workers sometimes earn as little as a few dollars a day, work under algorithmic management, and, because they’re sometimes trying to teach AI what not to do or show, they are often shown graphic, violent, or sexual content for hours at a time. It’s similar to content moderation work, and lots of people do both data labeling and content moderation, or switch back and forth between the two industries. Data labeling is such a big thing in Kenya that when I mentioned it to the driver who took me to meet Michael for this interview, she told me that she too was a data labeler, as are many of her friends.

    Michael has since become a critical figure at the Data Labelers Association, a group that is fighting to organize people who do data labeling work and that advocates for better working conditions, higher pay, and more protections for data labelers. I met Michael in a very tiny room at a coworking space in Nairobi, so I’m not on camera after this, but here’s my conversation with Michael.

    The Emotional Labor Behind AI Intimacy by Michael Geoffrey Asia


    YouTube Version: https://youtu.be/QH654YPxvEE


    16 February 2026, 8:00 am
  • 50 minutes 2 seconds
    Ring Is Back and Scarier Than Ever

    We start this week with exciting news: we bought a Super Bowl ad! For… $2,550. We explain how. After the break, Jason tells us about Ring’s recently launched Search Party feature, and gives us a very timely reminder of what Ring really is and how we got here. In the subscribers-only section, Joseph breaks down Lockdown Mode and how it kept the FBI out of a Washington Post reporter’s phone.


    Timestamps:


    0:00 - Intro

    2:49 - Watch 404 Media’s Super Bowl Ad

    27:29 - With Ring, American Consumers Built a Surveillance Dragnet


    Subscriber's Story: FBI Couldn’t Get into WaPo Reporter’s iPhone Because It Had Lockdown Mode Enabled

    YouTube version: https://youtu.be/0JK-VSrtlWw




    11 February 2026, 11:00 am
  • 1 hour 25 minutes
    The Screen Time Panic Sets Parents Up to Fail

    Patrick Klepek on the reality of parenting in the age of Roblox and YouTube.


    I listened to hours of podcasts about how screen time affects kids of all ages and how parents should manage screen time but I still felt completely unprepared for this challenge when I had a kid. 


    I think the reason for that is that there’s a lot of reporting about how screens are impacting kids, and a lot of reporting about the research into this subject, but rarely did I encounter a conversation between parents about how any of that information can realistically be applied in the real world.


    This week on the podcast we’re joined by Patrick Klepek to have the kind of conversation I wish I’d heard before I became a parent, but I think there’s something here for everyone. Patrick is the cofounder of Remap, a website and one of my favorite podcasts about video games, and the writer behind Crossplay, a newsletter about the intersection of parenting and games. Patrick is also my former colleague at Vice, back when I worked at Motherboard and he at Waypoint. Patrick has been reporting on video games for most of his life, is a wonderful writer, and is a parent. I find his perspective on many of these issues, including screen time, parental controls, YouTube, and Roblox, extremely useful and interesting, and I hope you do as well.


    YouTube Version: https://youtu.be/LjK1Swsm1m4


    Listen to the weekly podcast on Apple Podcasts, Spotify, or YouTube.


    Become a paid subscriber for early access to these interview episodes and to power our journalism. If you become a paid subscriber, check your inbox for an email from our podcast host Transistor for a link to the subscribers-only version! You can also add that subscribers feed to your podcast app of choice and never miss an episode that way. The email should also contain the subscribers-only unlisted YouTube link for the extended video version. It will also be in the show notes in your podcast player.


    9 February 2026, 11:00 am
  • 55 minutes
    The Latest Epstein Dump is a Disaster

    We start this week with Sam and Emanuel’s article about the latest Epstein dump, and how it’s really a disaster in a lot of ways. After the break, Matthew runs us through Moltbot and its terrible security. Then, Emanuel breaks down his two recent stories about a fundamental issue exposing a bunch of very sensitive data.


    0:00 - Intro

    2:19 - DOJ Released Unredacted Nude Images in Epstein Files

    25:08 - Silicon Valley’s Favorite New AI Agent Has Serious Security Flaws

    34:55 - Exposed Moltbook Database Let Anyone Take Control of Any AI Agent on the Site



    YouTube version: https://youtu.be/gDcOOP_Y9cU





    4 February 2026, 11:00 am
  • 50 minutes 24 seconds
    How Identity Literally Changes What You See (with Samuel Bagg)

    This week Joseph talks to Samuel Bagg, assistant professor of political science at the University of South Carolina. Bagg recently wrote a fascinating essay, linked below, about how the problem with lots of things might be knowledge-based (people believing stuff that’s wrong or dangerous) but the solution is not more knowledge. It’s all about social identity. This is an incredibly interesting discussion, and definitely check out more of Bagg’s writing.

    YouTube version: https://youtu.be/lNKOqp-rZL8



    2 February 2026, 11:00 am
  • 49 minutes 42 seconds
    Creators Worry Porn Platform Is Falling Into ‘AI Psychosis’

    We start this week with Sam’s piece about ManyVids, and how some creators believe its CEO, the person who controls their livelihood, may be experiencing ‘AI psychosis’. After the break, Jason gives us an update on some mysteriously disappearing ICE footage. In the subscribers-only section, we talk about Flock and how police are being told not to describe what they’re using the AI cameras for.


    Timestamps:

    0:00 - Intro

    2:41 - Aliens and Angel Numbers: Creators Worry Porn Platform ManyVids Is Falling Into ‘AI Psychosis’; Amid Backlash, Massive Porn Platform ManyVids Doubles Down on Bizarre, AI-Generated Posts

    32:12 - DHS Says Critical ICE Surveillance Footage From Abuse Case Was Actually Never Recorded, Doesn't Matter


    YouTube version: https://youtu.be/EFv0rD9F9es


    Subscribe at 404media.co for bonus content.


    28 January 2026, 11:00 am
  • 52 minutes 25 seconds
    Exposing the People Behind Deepfake Porn Sites with Bellingcat Investigator Kolina Koltai

    This week, Sam is in conversation with Kolina Koltai. Kolina is an investigator, senior researcher and trainer at Bellingcat. Her investigations focus on the people and systems behind AI companies and platforms that peddle non-consensual deepfake explicit imagery. They discuss how she found herself in this field, her recent investigation uncovering the man behind two deepfake porn sites, and how it feels to watch these sites go down after exposing the people running them.


    YouTube Version: https://youtu.be/CbmUwwVGaf4




    26 January 2026, 11:00 am
  • 44 minutes 46 seconds
    Here’s What Palantir Is Really Building

    We start this week with Joseph’s article about ELITE, a tool Palantir is working on for ICE. After the break, Emanuel tells us how AI influencers are making fake sex tape-style photos with celebrities, who can’t be best pleased about it. In the subscribers-only section, Matthew breaks down Comic-Con’s ban of AI art.


    0:00 - Intro

    2:16 - ‘ELITE’: The Palantir App ICE Uses to Find Neighborhoods to Raid

    22:45 - Instagram AI Influencers Are Defaming Celebrities With Sex Scandals


    Subscriber's Story: Comic-Con Bans AI Art After Artist Pushback


    YouTube version: https://youtu.be/b-QHWpqjD-E


    21 January 2026, 11:00 am
  • 54 minutes 37 seconds
    How Wikipedia Will Survive in the Age of AI (With Wikipedia’s CTO Selena Deckelmann)

    The Wikimedia Foundation’s chief technology and product officer explains how she helps manage one of the most visited sites in the world in the age of generative AI. 


    Wikipedia is turning 25 this month, and it’s never been more important. 


    The online, collectively created encyclopedia has been a cornerstone of the internet for decades, but as generative AI started flooding every platform with AI-generated slop over the last couple of years, Wikipedia’s governance model, editing process, and dedication to citing reliable sources have emerged as one of the most reliable and resilient models we have.


    And yet, as successful as the model is, it’s almost never replicated. 


    This week on the podcast we’re joined by Selena Deckelmann, the Chief Product and Technology Officer at the Wikimedia Foundation, the nonprofit organization that operates Wikipedia. That means Selena oversees the technical infrastructure and product strategy for one of the most visited sites in the world, and one of the most comprehensive repositories of human knowledge ever assembled. Wikipedia is turning 25 this month, so I wanted to talk to Selena about how Wikipedia works and how it plans to continue to work in the age of generative AI.


    YouTube Version: https://youtu.be/39LR9ouJR3c


    Subscribe at 404media.co for bonus content.


    Listen to the weekly podcast on Apple Podcasts, Spotify, or YouTube.



    19 January 2026, 11:00 am