The New York Times allegedly paid someone to “hack” OpenAI’s models via prompting

Summary

OpenAI is accusing the New York Times of “hacking” its products in the ongoing copyright dispute.

In a legal filing, OpenAI claims that an individual paid by the Times used “deceptive prompts” to create copies of NYT articles. These prompts would violate OpenAI’s terms of service.

When it filed its copyright infringement lawsuit against OpenAI, the NYT demonstrated that OpenAI's GPT models could generate verbatim copies of NYT articles.

The allegations in the Times’s complaint do not meet its famously rigorous journalistic standards. The truth, which will come out in the course of this case, is that the Times paid someone to hack OpenAI’s products. It took them tens of thousands of attempts to generate the highly anomalous results that make up Exhibit J to the Complaint. They were able to do so only by targeting and exploiting a bug (which OpenAI has committed to addressing) by using deceptive prompts that blatantly violate OpenAI’s terms of use. And even then, they had to feed the tool portions of the very articles they sought to elicit verbatim passages of, virtually all of which already appear on multiple public websites.

From OpenAI's legal filing

OpenAI had been expected to make this argument in court: earlier this year, the company had already accused The New York Times of using manipulative prompts to deliberately provoke copyright infringement by its AI models.


These prompts, which OpenAI claims violate its terms of service, were used to generate verbatim copies of New York Times content.

This deliberately provoked "regurgitation" of content was, OpenAI claimed, a rare bug stemming from the training process that could be fixed.

At the time, OpenAI accused the New York Times of "not telling the full story." The excerpt from the filing quoted above reiterates that claim.

The New York Times' lawyer counters that the alleged "hacking" was merely an effort to find evidence of copyrighted content inside OpenAI's AI models.
