Adobe wants to create a robots.txt-styled indicator for images used in AI training
Published on: 2025-08-10 12:00:00
For years, websites have used a robots.txt file to tell crawlers which parts of a site they are not allowed to access. Adobe, which wants to create a similar standard for images, has added a tool to content credentials intended to give creators more control over whether their work is used to train AI models.
Convincing AI companies to actually adhere to Adobe’s standard may be the primary challenge, especially since AI crawlers are already known to ignore the requests in robots.txt files.
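For context, the existing robots.txt mechanism is simply a plain-text file served at a site's root; a site that wants to opt out of AI crawling lists the crawler's user-agent and disallows it, as in the illustrative sketch below (the bot name is an example, and compliance is entirely voluntary).

    # robots.txt at https://example.com/robots.txt
    # Ask a specific AI crawler (example: OpenAI's GPTBot) to stay out entirely
    User-agent: GPTBot
    Disallow: /

    # All other crawlers may access the site
    User-agent: *
    Allow: /

Adobe's proposal applies the same opt-out idea to individual image files rather than to a website as a whole.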
Content credentials are metadata embedded in a media file to identify its authenticity and ownership. They are an implementation of the standard developed by the Coalition for Content Provenance and Authenticity (C2PA).
Adobe is releasing a new web tool that lets creators attach content credentials to any image file, even ones not created or edited with its own tools. It also gives creators a way to signal to AI companies that they do not want their work used for AI training.