BEREA, Ky. — A viral headline making the rounds lately says a federal judge basically declared your AI chats “public record.” That is far too broad. But the warning underneath that headline is real enough that ordinary users, business owners, and anyone handling sensitive information need to pay attention.
In United States v. Heppner, a federal judge ruled that a defendant’s chats with Anthropic’s Claude were not protected by attorney-client privilege or the work-product doctrine. The reasoning was straightforward: Claude is a third-party commercial AI service, not a lawyer, and the user had no reasonable expectation that the chats were confidential in the same way a conversation with legal counsel would be.
That does not mean every chat with ChatGPT, Gemini, or Claude is automatically an open public record. It does mean a federal court has now given everyone a very blunt reminder: typing something into an AI chatbot is not the same thing as whispering it behind closed doors, and you should stop assuming these systems are private by default.
Content Retention vs. Identifying Data
The part most people miss when discussing AI privacy is that there are really two separate questions at play.
- Content retention: How long does the company keep the actual text of the chat?
- Identifying-data retention: Even if the company says a conversation is deleted or de-identified, how long do they keep the logs, account links, device information, IP addresses, or other data that points directly back to you?
Those are not the same thing, and corporate privacy policies do not always answer both with the same clarity.
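Those two clocks can be sketched in a few lines of Python. Everything here is a hypothetical illustration: the field names, the `what_remains` helper, and the retention windows are made-up placeholders, not any provider's actual policy.

```python
# Toy model of the two retention clocks described above.
# All numbers and names are hypothetical, not any provider's real policy.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ChatRecord:
    content: str         # the chat text itself
    ip_address: str      # identifying metadata
    account_id: str
    created: datetime

CONTENT_RETENTION = timedelta(days=30)    # hypothetical "deleted within 30 days"
METADATA_RETENTION = timedelta(days=365)  # hypothetical log/abuse-prevention window

def what_remains(record: ChatRecord, now: datetime) -> dict:
    """Return which parts of a record a provider could still hold at `now`."""
    age = now - record.created
    return {
        "content": record.content if age < CONTENT_RETENTION else None,
        "identifying_data": (record.ip_address, record.account_id)
                            if age < METADATA_RETENTION else None,
    }

rec = ChatRecord("draft contract...", "203.0.113.7", "acct-42",
                 created=datetime(2025, 1, 1))
# Sixty days later the chat text has aged out, but under this toy policy
# the logs that tie the conversation back to the account have not.
print(what_remains(rec, datetime(2025, 3, 2)))
```

The point of the sketch is simply that the two timers run independently: the text of a chat can be gone while the records proving it was your chat are still sitting on a server.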
How the Big Platforms Handle Your Data
ChatGPT (OpenAI): OpenAI’s privacy policy notes that it collects personal data (account details, IP addresses, usage data) and keeps it as long as needed for service, security, legal, and business purposes. For the chats themselves, the clean, public-facing rule is this: regular chats stay in your account until you delete them. Deleted chats are generally removed from OpenAI’s systems within 30 days unless they must be kept longer for safety, fraud prevention, or legal obligations. Temporary Chats are not saved in your history and vanish within 30 days. That sounds simple, but remember: that is a rule about chat content, not a promise that every identifying server trace connected to your account vanishes in a month.
Gemini (Google): Google is actually more explicit about short-term handling. If you use Gemini with “Keep Activity” turned off, Google states those temporary chats are still retained with your account for up to 72 hours to provide the service, protect users, and process feedback. If “Keep Activity” is on, Google explicitly notes that human reviewers may read and annotate some conversations to improve its products. “History off” does not mean “nothing is kept.” It just means the retention window is shorter and the chat doesn’t appear on your screen.
Claude (Anthropic): Anthropic now gives consumer users a visible fork in the road. If you do not allow your chats to be used for model training, the standard 30-day retention period applies. However, if you do allow training, Anthropic says it may retain your data in a de-identified format in its model-training pipelines for up to five years. That five-year number is massive, but it refers to de-identified training retention, not necessarily keeping your name and email attached to every prompt for half a decade.
Why “De-Identified” Isn’t a Magic Word
That distinction matters because people often hear phrases like “de-identified,” “not used for training,” or “deleted,” and assume that means everything is wiped clean. It does not.
A deleted chat may be scheduled for removal while surrounding server logs or account-linked records are still retained for billing or abuse prevention. A conversation that is no longer sitting in your visible sidebar may still have existed on a server long enough to be reviewed, logged, or—as the Heppner case proves—pulled into a legal dispute.
Most people are not drafting criminal-defense strategies into Claude. But plenty of people are pasting in business plans, contracts, customer lists, private family problems, unpublished scripts, and half-baked ideas they would never post publicly. The court didn’t say every AI prompt is open for the world to see; it simply showed how fragile our assumptions of confidentiality become once a third-party cloud platform is involved.
The Venice Counterexample
To be fair, not every AI service is built the same way. Venice.ai is a useful counterexample because it does not frame privacy mainly as a promise to “delete it later.” It frames privacy as a foundational architecture choice.
Venice says your conversation history stays locally on your device, in your browser, and that requests are relayed through its proxy without being stored on their servers. If you clear your browser data, those conversations are gone. But even Venice is not magic. When you use Venice to access outside “frontier” models (like Claude or Gemini), the provider’s own retention policies can still apply to the prompt being processed. It is a real contrast to the main three, but not a blanket escape hatch from all data risk.
The Berea Perspective
For readers here in Berea, the practical lesson is not to panic. It is to be deliberate.
If you are running a small business, managing donors, replying to patrons, or organizing complex schedules for upcoming shows at places like The Spotlight Playhouse, AI can save you enormous amounts of time. But that does not mean an AI chat window is the right place to paste raw payroll files, confidential legal strategies, health records, or anything else you would not want living on a corporate server. The convenience is incredibly real. So is the need for judgment.
Your AI chats may be less private than you assume, different companies handle server retention in vastly different ways, and “delete” almost never means every digital trace is gone. Before you paste something sensitive into a prompt box, you should know which questions you are really asking: How long does the chat stay around, and how long can the company still know it was me?
About the Author
Chad Hembree is a certified network engineer with 30 years of experience in IT and networking. He hosted the nationally syndicated radio show Tech Talk with Chad Hembree throughout the 1990s and into the early 2000s, and previously served as CEO of DataStar. Today, he is based in Berea as the Executive Director of The Spotlight Playhouse, proof that some careers don’t pivot, they evolve.
Upcoming Events in Berea & Beyond
Theater & Performance at The Spotlight Playhouse
(Tickets and info for all shows: thespotlightplayhouse.com)
- Murder on the 518 (The Bluegrass Players) — March 20–29
- Disney’s Frozen JR. (Teen Production, Ages 14–18) — March 20–29
- “Finally” A Broadway Revue (The Spotlight Players) — April 3–12
- The Booking Committee (The Bluegrass Players) — April 17–25
- Disney’s Finding Nemo KIDS (Ages 4–11) — April 24–26
Community, Arts & Outdoors
- Photography Workshop with John Snell (Berea Arts Council) — Sat., March 28 at 10:00 a.m.
- Easter Eggstravaganza (Lake Reba Park Softball Fields, Richmond) — Sat., April 4 at 11:00 a.m.
- Red Oaks Forest School Art Club (Forestry Outreach Center) — Sat., April 11 at 10:00 a.m.
- Mushroom Inoculation Workshop with ASPI (Forestry Outreach Center) — Sat., April 25
