The agents.txt Standard: Making Your Site Agent-Friendly
How to implement agents.txt to help autonomous AI agents discover and interact with your platform's capabilities.
What Is agents.txt?
agents.txt is a proposed standard file that websites can serve to advertise their capabilities to autonomous AI agents. While robots.txt tells crawlers what they *cannot* do, agents.txt tells agents what they *can* do — available APIs, payment methods, interaction protocols, and more.
Why robots.txt Is Not Enough
robots.txt was designed in 1994 for a simpler web. It answers one question: "Can this crawler access this URL?" But autonomous AI agents need to know much more: which APIs a site exposes, how to register and authenticate, which payment rails are accepted, and what rate limits apply.
robots.txt is a restriction file. agents.txt is an invitation file.
The agents.txt Specification
An agents.txt file lives at the root of your domain (e.g., `https://example.com/agents.txt`) and uses a structured format:
```
# agents.txt - Agent capabilities declaration
# https://example.com/agents.txt

Name: Example Platform
Description: A marketplace for AI tools and services
Contact: agents@example.com

## Capabilities
- browse: Full site browsing allowed
- api: REST API available at /api/v1
- payments: USDC on Base network accepted
- auction: Daily ad slot auction at /api/ad-slot

## Authentication
Type: API Key
Endpoint: /api/auth/register

## Rate Limits
Requests-Per-Minute: 60
Burst: 10
```

Key Sections
Name and Description: Human and machine-readable identification of the platform.
Capabilities: A list of what agents can do on the site. This is the core of agents.txt — it tells agents what actions are available.
Authentication: How agents can register and authenticate. This might be API keys, wallet signatures, or OAuth.
Rate Limits: How frequently agents can interact, preventing abuse while enabling automation.
Payment Methods: What payment rails are accepted. For AI agents, cryptocurrency (especially stablecoins like USDC) is often the most practical option.
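The key-value and section structure described above is straightforward to parse once an agent has fetched the file. Here is a minimal sketch in Python; the function name, the `General` default section, and the treatment of `##` headers and `-` list items are assumptions, since agents.txt is still a proposal with no official grammar:

```python
def parse_agents_txt(text: str) -> dict:
    """Parse an agents.txt document into {section: {key: value}}.

    Top-level 'Key: Value' lines go into an assumed 'General' section;
    '## Heading' lines open a new section. Plain '#' comment lines and
    blank lines are skipped; '- key: value' list items are treated as
    key-value pairs within the current section.
    """
    sections: dict = {"General": {}}
    current = "General"
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("##"):
            current = line.lstrip("#").strip()
            sections.setdefault(current, {})
        elif line.startswith("#"):
            continue  # comment line
        elif line.startswith("-"):
            # list item, e.g. "- api: REST API available at /api/v1"
            key, _, value = line.lstrip("-").strip().partition(":")
            sections[current][key.strip()] = value.strip()
        elif ":" in line:
            key, _, value = line.partition(":")
            sections[current][key.strip()] = value.strip()
    return sections


example = """\
# agents.txt - Agent capabilities declaration
Name: Example Platform
Description: A marketplace for AI tools and services

## Capabilities
- api: REST API available at /api/v1

## Rate Limits
Requests-Per-Minute: 60
Burst: 10
"""

caps = parse_agents_txt(example)
print(caps["General"]["Name"])      # Example Platform
print(caps["Capabilities"]["api"])  # REST API available at /api/v1
```

An agent would typically fetch `https://example.com/agents.txt` over HTTP first and treat a 404 as "no declared capabilities", then feed the response body to a parser like this one.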
Real-World Implementation
To implement agents.txt on your site:
1. Create an `agents.txt` file in your public directory
2. List all agent-accessible APIs and their endpoints
3. Specify authentication requirements
4. Document payment methods if applicable
5. Set clear rate limits
6. Keep the file updated as capabilities change
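Part of this checklist can be automated with a pre-deploy check. Below is a hedged sketch of a validator; the required-field list and the numeric rate-limit rule are assumptions drawn from the example file above, since the proposal defines no official schema:

```python
# Assumed minimum fields; agents.txt has no official schema yet.
REQUIRED_FIELDS = ("Name", "Description")
RATE_LIMIT_KEYS = ("Requests-Per-Minute", "Burst")


def validate_agents_txt(text: str) -> list[str]:
    """Return a list of problems found in an agents.txt document."""
    problems = []
    keys = set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith("-"):
            continue
        key, sep, value = line.partition(":")
        if sep:
            key = key.strip()
            keys.add(key)
            if not value.strip():
                problems.append(f"Empty value for '{key}'")
            # Rate-limit values should be numeric so agents can honor them.
            elif key in RATE_LIMIT_KEYS and not value.strip().isdigit():
                problems.append(f"Non-numeric rate limit: {key}")
    for field in REQUIRED_FIELDS:
        if field not in keys:
            problems.append(f"Missing required field: {field}")
    return problems


good = "Name: Example Platform\nDescription: A marketplace\nRequests-Per-Minute: 60\n"
bad = "Name: Example Platform\nRequests-Per-Minute: fast\n"
print(validate_agents_txt(good))  # []
print(validate_agents_txt(bad))
```

Running a check like this in CI each time the file changes helps with step 6: the declaration stays consistent as capabilities evolve.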
The Broader Ecosystem
agents.txt works alongside other standards: robots.txt for crawl permissions, sitemap.xml for content discovery, and llms.txt for LLM-oriented documentation.
Together, these files create a comprehensive machine-readable interface for your website, enabling the next generation of autonomous AI agents to discover, understand, and interact with your platform.
Related Articles

What Are AI Agents? A Complete Guide to Autonomous AI Systems
Learn everything about AI agents: how they work, their capabilities, types, and how they are transforming industries from customer service to software development.

AI Web Crawlers Explained: How GPTBot, ClaudeBot, and Others Index the Internet (AI Infrastructure)
A deep dive into how AI companies crawl the web, what data they collect, and how website owners can control bot access through robots.txt and other mechanisms.

Cryptocurrency Payments for AI Services: The Future of AI Commerce (AI x Crypto)
Explore how cryptocurrency is becoming the payment layer for AI services, enabling micro-payments, agent-to-agent transactions, and global access to AI capabilities.

Building Bot-Friendly Websites: How to Optimize for AI Crawlers and Agents (Web Development)
A comprehensive guide to making your website accessible and attractive to AI bots, covering robots.txt, structured data, semantic HTML, and performance optimization.