Internet publishers back strategy to get compensation for AI-scraped content
Top content platforms including Reddit, Medium, and Yahoo have adopted a new way to get fair compensation for AI scraping of their content.
According to a press release from the RSL Web Standard and Collective Rights Organization, the three platforms, among many others, have expressed support for the new Really Simple Licensing (RSL) standard, which aims to ensure fair compensation for millions of publishers and creators worldwide.
Ensuring fair play for content
The world is turning to AI in many ways, and one of the most popular is the search for information. Many people now prefer asking AI tools instead of traditional search engines.
AI search tools routinely scrape platforms such as Reddit and Medium for the content they serve to users, but there has been no standardized way to compensate those whose content is being used.
In response, the platforms have backed the launch of the RSL licensing protocol, which aims to ensure fair, standardized compensation for publishers and creators while providing simple, automated licensing for AI and technology companies.
An open, decentralized protocol based on the widely adopted RSS (Really Simple Syndication) standard that scales to millions of websites, RSL can be applied to any digital content, including web pages, books, videos, and datasets.
Tim O’Reilly, CEO of O’Reilly Media, said RSS was critical in giving early online publishers a simple, open way to reach readers all over the world, but that AI has changed the landscape.
“But today, as AI systems absorb and repurpose that same content without permission or compensation, the rules need to evolve. RSL builds directly on the legacy of RSS, providing the missing licensing layer for the AI-first Internet. It ensures that the creators and publishers who fuel AI innovation are not just part of the conversation but fairly compensated for the value they create.”
How it works
The RSL standard goes beyond the simple crawler-blocking of the robots.txt protocol. It introduces a licensing infrastructure that lets publishers add machine-readable licensing and royalty terms to their robots.txt files.
These terms specify how AI applications and agents must compensate publishers for using their content.
Any website can adopt the standard for free today to define licensing, usage, and compensation terms for AI crawlers and agents.
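To illustrate the mechanism described above: a publisher would extend its robots.txt to point crawlers at a machine-readable license document. This is a sketch only; the directive name, file path, and XML fields below are illustrative assumptions, not taken from the RSL specification itself.

```
# robots.txt — hypothetical example of RSL-style licensing terms
# (directive and URL are assumptions for illustration)
User-agent: *
Allow: /
License: https://example.com/.well-known/license.xml
```

The referenced license file might then declare terms along these lines (element names are assumed, not confirmed by the article):

```xml
<!-- Hypothetical RSL license document: grants AI training rights
     in exchange for royalty-based compensation -->
<rsl xmlns="https://rslstandard.org/rsl">
  <content url="/">
    <license>
      <permits type="usage">ai-training</permits>
      <payment type="royalty"/>
    </license>
  </content>
</rsl>
```

In this model, a compliant AI crawler would fetch the license document before ingesting content and either agree to the stated terms or skip the site, much as well-behaved crawlers already honor robots.txt today.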