
An innovation in the indexing and crawling of copyrighted content.
Bot Consent Protocol (BCP): A Standard for Regulating Automated Access to Authored Digital Content
Whitepaper on the Legal, Technical, and Economic Framework for Managing Bots, Crawlers, and Automated Traffic
Version: 1.0 (Draft) | Status: Proposed Standard | Acronym: BCP | Date: January 8th, 2026
1. Executive Summary
The Bot Consent Protocol (BCP) is a proposed technical and legal standard for regulating automated access to websites.
Its purpose is to establish a transparent, fair, and measurable relationship between website owners and automated systems accessing their infrastructure—including search engine crawlers, analytics tools, scrapers, AI model crawlers, and other forms of non-human traffic.
In the current web ecosystem, automated traffic accounts for a significant share—often the majority—of total traffic, yet most of this traffic is unregulated, unaccountable, and carries no responsibility for the load it imposes.
The traditional mechanism, robots.txt, introduced in 1994, has become ineffective: it is not legally binding, not technically enforceable, and does not protect against excessive or abusive crawling.
BCP introduces a new approach: automated agents must accept clearly defined terms of use before continuing access, similar to how human users must accept cookies, GDPR notices, or terms of service.
Acceptance is automated and based on continued requests after the display of a Bot Consent Page (BCP Page).
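The handshake described above can be sketched in a few lines. The following is a minimal illustration, not part of the draft standard: the class name (ConsentGate), the use of HTTP status 428 (Precondition Required) for the BCP Page, and the bot-identity key are all assumptions chosen for the example.

```python
from dataclasses import dataclass, field

# Hypothetical terms text; the draft does not yet fix concrete wording.
TERMS = "Continued automated access implies acceptance of these terms."

@dataclass
class ConsentGate:
    """Tracks which bot identities have already been shown the BCP Page."""
    shown: set = field(default_factory=set)

    def handle(self, bot_id: str) -> tuple[int, str]:
        # First request from an unknown bot: serve the Bot Consent Page.
        # 428 Precondition Required is an assumed choice of status code.
        if bot_id not in self.shown:
            self.shown.add(bot_id)
            return 428, TERMS
        # Any continued request after the page was displayed
        # counts as automated acceptance, per the mechanism above.
        return 200, "<html>actual content</html>"
```

Under this sketch, a crawler's first request receives the BCP Page with the terms, and its next request is treated as acceptance and served normally.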
The Bot Consent Protocol (BCP) is the first step toward a web in which automated access is transparent, accountable, and governed by explicit consent.
A detailed presentation of the proposed Bot Consent Protocol standard is available on Hac.
