
OpenAI Ordered to Retain Deleted ChatGPT Logs in Legal Clash With The New York Times

by Prime Time Press Team

SAN FRANCISCO, June 18, 2025 — A federal court has ordered OpenAI to preserve user conversations from its ChatGPT platform well beyond its usual 30-day deletion window, issuing a significant ruling in the ongoing copyright and defamation lawsuit filed by The New York Times. The order, handed down by U.S. Magistrate Judge Ona T. Wang in the Southern District of New York, compels OpenAI to retain output logs that would otherwise be deleted and that could be material to the case, despite the company’s warnings that such a move could threaten user privacy.

This pivotal ruling arises amid intensifying legal battles over the intersection of artificial intelligence, intellectual property, and digital privacy. At the heart of the case is the allegation that ChatGPT and similar large language models trained on news articles — including paywalled or copyrighted content — may produce responses that replicate or closely mimic original material without permission.

In her ruling, Judge Wang underscored the necessity of maintaining potentially relevant output logs, stating that deletion under OpenAI’s existing data policies could result in the permanent loss of critical evidence. The court’s directive extends to both user-submitted inputs and the AI-generated responses that may be implicated in alleged copyright infringement or defamatory reproduction.

OpenAI, whose platform has been at the forefront of the AI revolution, previously maintained a 30-day retention limit on standard user and API interactions, with exceptions only for enterprise clients or where regulations required longer storage. The company has voiced strong opposition to the court’s decision, asserting that indefinitely retaining logs users believed were deleted would breach established trust and could conflict with global privacy regulations such as the European Union’s GDPR.

“This sweeping demand conflicts with the expectations of our users, many of whom share sensitive data with the understanding that their interactions will be short-lived and secure,” OpenAI said in a statement. The company emphasized that data retained under the court order will be stored securely in a segregated system, accessible only to a vetted internal legal team, and will not be shared directly with the plaintiffs unless further authorized.

The New York Times lawsuit, filed in December 2023, is part of a broader effort by news organizations to challenge the use of their copyrighted material in the training and functioning of generative AI systems. The lawsuit argues that OpenAI’s models replicate news content in ways that undermine the economic value of original reporting. A similar case involving other AI developers is also pending, signaling a trend toward stricter legal scrutiny of how AI companies handle third-party content.

The court’s mandate may set a legal precedent, compelling other AI firms to reconsider their own data-handling practices. Industry analysts suggest that this case could have a chilling effect on AI innovation if companies are required to maintain vast data logs indefinitely, especially when those logs contain potentially sensitive user information.

At the same time, privacy advocates warn that prolonged data retention opens new risks. “Once user data is stored longer than originally promised, it becomes more vulnerable to breaches, misuse, and scope creep,” said Sarah Martin, director of the Digital Privacy Initiative. She added that companies operating across jurisdictions will face increased compliance challenges if U.S. court orders contradict international privacy laws.

OpenAI has signaled its intention to challenge the ruling further. The company has filed a motion for reconsideration and has asked that the preservation directive be vacated or narrowed on review. As part of that challenge, OpenAI argues that the plaintiffs should rely on sampling of existing data rather than require wholesale retention of every deleted log.

In the interim, OpenAI has suspended deletion of certain user data and is developing a legal compliance protocol to satisfy the court’s order. The outcome of this dispute is likely to influence how generative AI companies handle litigation holds and user data management in future legal proceedings.
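To make the mechanics concrete, the sketch below shows how a litigation hold typically interacts with an automated retention policy. It is a minimal, purely hypothetical Python example; the record fields, the legal_hold flag, and the 30-day window are assumptions for illustration and do not reflect OpenAI's actual systems or the compliance protocol described in the case.

```python
# Illustrative sketch only: how a deletion job might honor a litigation hold.
# All names and fields here are hypothetical, not OpenAI's real infrastructure.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=30)  # the standard deletion window described above (assumed)


@dataclass
class ConversationRecord:
    conversation_id: str
    deleted_at: datetime      # when the user deleted the conversation
    legal_hold: bool = False  # set while a preservation order is in force


def purge_expired(records: list[ConversationRecord], now: datetime) -> list[ConversationRecord]:
    """Return the records that survive this purge cycle.

    A record is normally purged once it has been deleted for longer than the
    retention window, but anything under a legal hold is kept (and would be
    segregated for review) regardless of age.
    """
    survivors = []
    for record in records:
        expired = now - record.deleted_at > RETENTION_WINDOW
        if expired and not record.legal_hold:
            continue  # eligible for permanent deletion under the normal policy
        survivors.append(record)  # kept: still inside the window, or under hold
    return survivors


if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    records = [
        ConversationRecord("a", now - timedelta(days=45)),                   # past window, no hold: purged
        ConversationRecord("b", now - timedelta(days=45), legal_hold=True),  # past window, on hold: kept
        ConversationRecord("c", now - timedelta(days=5)),                    # inside window: kept
    ]
    print([r.conversation_id for r in purge_expired(records, now)])  # ['b', 'c']
```

In practice, the protocol OpenAI describes would involve far more than a single flag (segregated storage, access limited to a vetted legal team, audit logging), but the underlying principle is the same: the hold overrides the normal deletion schedule.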

With generative AI rapidly reshaping industries, the case underscores the mounting tension between innovation, user privacy, and legal accountability. As courts wrestle with applying traditional evidence rules to AI systems, the policies companies set today may define the future landscape of digital rights and responsibilities.
