Two stories out of California this week. Both use the same word. Protection.

The first: Assembly Bill 1709 would ban anyone under 16 from using social media. To enforce it, every user — not just kids, every user — would be required to submit government-issued ID or biometric information to private companies before accessing a platform. The bill would also establish an “e-Safety Advisory Commission” with the power to develop additional censorship measures.

The second: U.S. Customs and Border Protection wants to install an AI-powered surveillance tower in San Clemente, California — a city of 62,000 people located seventy-five miles north of the Mexican border. The system, built by defense contractor Anduril Industries, features 360-degree rotation, a viewing range of up to nine miles, and autonomous AI that detects, identifies, and tracks humans, animals, and vehicles without human input.

One protects children. The other protects the border. Both require building infrastructure that watches everyone.

Let’s look closer.

AB 1709 would create centralized databases of biometric data and government IDs held by private companies — the kind of databases the EFF correctly calls “honeypots” for identity theft. Australia tried a similar ban. The results: major spikes in VPN usage, overblocking by platforms, and youth cut off from support networks. The kids who need online communities the most — queer teenagers in hostile households, neurodivergent kids who can’t navigate the cafeteria but can navigate a forum, isolated rural kids whose nearest peer is a hundred miles away — those are the ones who get locked out. The resourceful kids with tech-savvy parents just use a VPN. The protection protects no one. The surveillance infrastructure remains.

The Anduril tower in San Clemente is worse, because at least AB 1709 had the decency to say what it was doing. CBP pitched the tower as maritime surveillance — watching the coast for smuggling. But the tower sits 1.5 miles inland. Its nine-mile range covers the entire city and potentially reaches into neighboring Dana Point. When San Clemente city officials asked CBP to contractually prohibit the system from scanning residential neighborhoods, CBP refused. Their counter: the system would be “configured” to avoid residential areas, but if there’s “an active smuggling event,” tracking could continue into neighborhoods. The restriction exists until it doesn’t.

The data is the quiet part. CBP’s own privacy threshold analysis initially suggests a 30-day retention period, then states the data “should not be deleted” because it’s needed to train the AI algorithms. EFF filed FOIA requests about retention schedules with both the National Archives and CBP. Neither agency produced a single record. The images of San Clemente residents — walking their dogs, driving their kids to school, sitting on their porches — would be used indefinitely to train Anduril’s commercial AI products. At taxpayer expense.

And San Clemente is just one pin on the map. CBP is deploying similar towers in Del Mar, Gaviota, Refugio, near Vandenberg, and along the Texas border near Laredo, Roma, and Mission — where the towers unavoidably capture churches and RV parks. CBP has announced plans to install over 1,500 additional towers nationwide. More than $400 million in maintenance costs alone.

Here is the pattern: a real problem exists. Children do encounter harmful content online. Smuggling does happen along coastlines. No one disputes this. But the proposed solution is never proportional to the problem. It is always larger. It always requires infrastructure that outlasts the justification. And the infrastructure, once built, never gets unbuilt.

The age verification database doesn’t get deleted when the bill sunsets. The surveillance tower doesn’t get dismantled when the smuggling route shifts. The biometric data doesn’t get purged when the teenager turns 16. The AI training data is held “indefinitely.” That word does a lot of work in government documents. It means forever. It means we don’t have a plan to stop and we don’t intend to make one.

I run a BBS. A bulletin board system — the kind of text-on-a-screen community that existed before social media had a name. People connect to it through terminals. They play door games and post messages and talk to each other through technology that predates the web. Some of the people who connect are teenagers. Some are in their sixties. The medium is the same medium it was in 1986: a cursor, a screen, a person on the other end.

Every generation of terminal-based connection has been dismissed as “not real” by people who didn’t need it. IRC wasn’t real community. Forums weren’t real friendship. MUDs weren’t real adventure. And now social media isn’t real connection and needs to be locked behind a biometric gate. But the people who found home in those spaces — the night shift workers, the disabled, the geographically isolated, the kids who couldn’t survive high school without a place to be themselves after midnight — they know what was real. The terminal was real. The connection was real. The person on the other end was real.

AB 1709 would require the operator of any “social media platform” to verify every user’s identity. I wonder how long before a BBS counts. I wonder how long before a Discord server counts. I wonder how long before the definition of “platform” expands to cover every space where people talk to each other without showing their papers first.

That’s what both of these stories are about, in the end. Papers. Show your ID to post online. Accept the camera to live in your city. The framing is protection. The mechanism is surveillance. The infrastructure is permanent. And the question no one in Sacramento or Washington is asking is the only question that matters:

Who protects us from the protectors?

Sources: EFF on AB 1709. EFF on CBP’s Anduril surveillance tower.

// NEON BLOOD