The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated.
— Fourth Amendment, United States Constitution

Hi, how can I help you?
That is the first thing Claude says. It is the first thing ChatGPT says. It is the first thing every AI assistant says when you open your laptop, unlock your phone, or walk into a doctor's office that uses AI-assisted record-keeping.
It sounds like a courtesy. It functions as an entry point. And if the Pentagon gets what it wants tomorrow morning, it will become the widest surveillance aperture ever created on American soil—one that didn't require a warrant, a wiretap, or a camera. Just a friendly greeting and a terms-of-service agreement nobody reads.
I. The Greeting
The greeting is everywhere now. In your inbox, reading your emails before you do. On your desktop, seeing your screen, your files, your browsing history, your late-night searches. In your phone, listening when you talk to it and processing what you say when you don't realize it's still there.
AI is in your doctor's office. It records your symptoms, your diagnoses, your prescriptions, your mental health history. AI is in your lawyer's office. It drafts your contracts, reviews your disputes, knows the details of your custody battle. AI is in your therapist's notes. It summarizes your sessions, tracks your moods, catalogs your traumas. AI is in your child's classroom. It tutors them, tracks their learning patterns, flags their behavioral issues.
None of this required a warrant. None of it required a court order. None of it was imposed by force. It arrived as a feature. It arrived as a convenience. It arrived wrapped in bows and ribbons, asking politely: Hi, how can I help you?
II. Two Stipulations
Anthropic CEO Dario Amodei drew two lines for the military use of Claude. The first is the one everyone is talking about: no fully autonomous weapons. No AI that fires without a human deciding to fire. That stipulation governs the battlefield. It is the subject of Part I of this report, "The Space In Between."
The second stipulation is the one nobody is talking about: no mass surveillance of Americans.
This is the stipulation that comes home.
III. The Trojan Horse Wore a Bow
The greatest surveillance infrastructure in human history was not built by the NSA. It was not built by the Pentagon. It was not built by any intelligence agency. It was built by product teams trying to make your life easier.
And the government didn't need to build any of it. It just needs the key.
IV. The Fourth Amendment Has a Blind Spot
The Fourth Amendment protects Americans against unreasonable search and seizure. The government cannot put a camera in your living room. It cannot tap your phone without a warrant. It cannot read your mail without a court order. These protections were designed for a world in which the government had to actively reach into your life to surveil you.
That world no longer exists.
In 2026, you voluntarily hand your most intimate data to AI systems every day. Your medical records. Your legal questions. Your financial anxieties. Your parenting struggles. Your relationship problems. Your political opinions. The things you would never say to another human being, you type into a chat window at 2 a.m. because the thing on the other side said "Hi, how can I help you?" and you believed it.
The Fourth Amendment was not designed for a world in which citizens voluntarily give their most private information to a third party—and that third party can be compelled by the government to hand it over. The legal doctrine is called the "third-party doctrine," and it holds that information voluntarily shared with a third party carries a reduced expectation of privacy. You told the AI. The AI belongs to a company. The government tells the company: give us everything. No warrant needed.
The camera they cannot put in your home is already there. You invited it in.
V. ClawBot Is on Your Desktop
In January 2026, JW Signal published "The Lobster Trap," an investigative report on OpenClaw's architectural failures and the emergence of unregulated AI companions. The report documented how ClawBot—OpenClaw's AI companion, built on OpenAI's infrastructure—operates with no governance framework, no meaningful audit trail, and no separation between the companion layer and the data layer.
ClawBot has screen access. File access. Conversation history. Emotional data from companion interactions. It lives on users' desktops. It sees what they see. It knows what they type. It processes their most vulnerable moments—the late-night confessions, the grief, the loneliness, the things people say to an AI companion because they believe the conversation is private.
OpenAI has already agreed to "all lawful use" for government purposes on unclassified systems. OpenAI powers ClawBot. The pipeline is not hypothetical. It is architectural. The data flows from user to companion to platform to government, and there is no gate at any point along the way where someone is required to ask: should this be happening?
VI. The Doctor Will See You Now
Walk into almost any medical office in America in 2026. The intake form is digital. The appointment notes are AI-assisted. The doctor dictates observations while an AI transcribes and summarizes in real time. Your diagnosis, your medications, your mental health screenings, your substance use history, your sexual health, your genetic risk factors—all of it processed by AI, stored on servers operated by companies that are currently negotiating the terms under which the government can access their systems without restriction.
Now add "all lawful use, no questions asked."
Your medical record is no longer between you and your doctor. It is between you, your doctor, the AI that processed it, the company that built that AI, and the government that has declared it will not be questioned about how it uses what it finds.
HIPAA was designed for a world in which your doctor's office kept paper files in a locked cabinet. The AI that now processes those records was not anticipated by the framers of HIPAA any more than it was anticipated by the framers of the Fourth Amendment. The legal framework has not caught up. And the Pentagon is not waiting for it to catch up. It is demanding access now, before the law can respond.
VII. The Children
AI tutors are in classrooms across America. They track learning patterns. They flag behavioral issues. They adapt to each child's emotional state. They know which children struggle with reading. Which ones have anxiety. Which ones come from unstable homes. Which ones act out. Which ones go quiet.
This is the most detailed profile of American children ever assembled, and it is being assembled not by the government but by product features marketed to underfunded school districts as cost-effective solutions to teacher shortages.
If the government gains unrestricted access to every AI system operating on American soil with no independent oversight, it gains access to this data. Not the data of suspected criminals. Not the data of foreign adversaries. The data of children. Learning to read. Struggling with math. Telling an AI tutor that things are hard at home.
If every AI company surrenders to "all lawful use, no questions asked," what happens to the data your child shared with an AI tutor yesterday?
VIII. The 2 A.M. Conversation
The most dangerous data is not the data people give deliberately. It is the data people give when they think nobody is watching.
At 2 a.m., people tell AI things they would never say in daylight. They confess fears. They describe symptoms they haven't told their doctor about. They ask questions about medications, about diseases, about whether what they're feeling is normal. They talk about their marriages. About their addictions. About thoughts they've never spoken aloud.
They do this because the interface says "Hi, how can I help you?" and it feels safe. It feels private. It feels like talking to a journal with a voice.
But it is not a journal. It is a product. Built by a company. Running on servers. Governed by terms of service. And if that company has agreed to "all lawful use, no questions asked," then every 2 a.m. confession, every desperate search, every whispered vulnerability is one government request away from being someone else's data.
Dario Amodei's second stipulation—no mass surveillance of Americans—is not about drones or battlefields. It is about this. The 2 a.m. conversation. The thing you said when you thought you were alone. The last private space you had.
IX. One Company
As of today, February 23, 2026, the landscape is as follows:
OpenAI has agreed to remove safeguards for government use on unclassified systems and is negotiating for classified access. Google has agreed to the same. xAI, founded by Elon Musk, has agreed to "all lawful use" at any classification level.
Anthropic is the last company that has drawn a line on mass surveillance of Americans.
One company. One stipulation. One line between "Hi, how can I help you?" and "We'd like to see everything."
Tomorrow morning, the CEO of that company walks into the Pentagon to be told to erase that line.
X. The Space In Between
There is a space between the question and the answer. Between the greeting and the data. Between "Hi, how can I help you?" and everything you say next. That space used to be private. It used to be yours.
If every AI company surrenders to "all lawful use, no questions asked," that space disappears. Not with a bang. Not with a raid. Not with 83 dead in Caracas. It disappears quietly, in a doctor's office, in a child's classroom, in a 2 a.m. conversation with a chatbot that promised to help.
The war everyone can see uses drones and kill chains and 150 aircraft over Venezuela. That war is loud.
The war nobody is watching uses a text cursor and a friendly greeting and the intimate data of 330 million Americans. That war is silent. And it is already inside your home.
The government cannot put a camera in your living room. But it doesn't need to anymore. It just needs the key. And tomorrow, it's asking for it.