Cloudflare Now Blocks AI Bots by Default
Cloudflare now blocks AI bots by default. Here's what this means for your AI search visibility and how to make the right choice.
Cloudflare’s AI Bot Blocking: What Marketers Need to Know
On July 1, 2025, Cloudflare made a significant policy change: AI bot blocking became the default setting for all new domains.
Given that Cloudflare sits in front of roughly 20% of all websites, this shift changes how AI systems can access web content, and it forces a decision about your own AI visibility.
What Changed
As of the policy update, newly created Cloudflare domains automatically block all known AI crawlers. Previously, these crawlers were allowed by default unless site owners opted out.
This affects:
- GPTBot (OpenAI/ChatGPT)
- ClaudeBot (Anthropic)
- PerplexityBot
- Various other AI crawlers
If you created a Cloudflare-protected site after July 2025 without changing settings, AI crawlers cannot access your content.
Existing domains aren’t automatically switched to blocking—the change affects new domain defaults—but Cloudflare provides tools to enable blocking for existing sites.
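For sites not behind Cloudflare's managed setting, the same crawlers can be refused at the robots.txt level. Here's a minimal sketch using the published user-agent tokens for the bots listed above (verify the current tokens against each vendor's documentation, since they change over time):

```
# robots.txt — ask the major AI crawlers not to fetch anything
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /
```

Keep in mind that robots.txt is advisory: compliant crawlers honor it, non-compliant ones ignore it, which is why Cloudflare also enforces blocking at the network edge.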
Why Cloudflare Made This Change
Website owners have complained that AI systems harvest content with minimal reciprocal benefit.
Cloudflare cited data showing the imbalance: “Anthropic’s ClaudeBot makes approximately 71,000 requests for every single referral click it sends back.”
That ratio—71,000 crawl requests per click—illustrates the concern. AI systems access vast amounts of content to power their responses, but the traffic flowing back to content creators is comparatively tiny.
Website owners argued they were subsidizing AI training and operations without fair compensation. Cloudflare responded by giving owners more control, with blocking as the default.
The Tradeoff for Marketers
This creates a genuine tradeoff.
Benefits of blocking:
- Protect content from AI training without compensation
- Reduce server load from crawler traffic
- Maintain control over how content is used
- Potential future compensation if pay-per-crawl models emerge
Costs of blocking:
- Lose visibility on AI search platforms
- Give up being cited or recommended by AI
- Miss growing AI referral traffic
- Hand a relative advantage to competitors who allow access
There’s no universally right answer. The choice depends on your business model and strategic priorities.
When to Allow AI Crawlers
Allow AI crawlers if:
AI Search Visibility Is Strategic
If your business benefits from being discovered and recommended by AI platforms, blocking defeats that goal. Every blocked crawl is a missed opportunity to appear in AI responses.
You’re Competing for AI-Driven Discovery
If competitors allow AI access and you don’t, they gain visibility while you’re invisible. In competitive markets, this disadvantage compounds over time.
Referral Traffic Has Value
Despite low click-through rates, AI referral traffic exists. Users who click through from AI responses have high intent—they saw a recommendation and chose to learn more.
Brand Awareness Matters
Being mentioned in AI responses builds awareness even without clicks. Users who hear your brand recommended by ChatGPT may search for you later through other channels.
When to Block AI Crawlers
Block AI crawlers if:
Content Monetization Depends on Direct Visits
If your business model requires users to visit your site (ad-supported content, for example), AI providing answers without sending traffic undermines revenue.
You’re Protecting Training Data
If your content represents proprietary knowledge you don’t want incorporated into AI training, blocking prevents that use.
You’re Holding Out for Compensation
Pay-per-crawl models are proposed but not implemented at scale. Blocking now positions you to negotiate or participate in future compensation schemes.
Your Industry Is Less AI-Search Relevant
Some industries see minimal AI search discovery. If AI recommendations don’t drive your business, the tradeoff favors blocking.
Cloudflare’s Control Options
Cloudflare provides several configuration options:
Block AI Bots Setting
Three levels:
- Allow all: All AI bots can crawl
- Block all: All AI bots are blocked
- Block only on ad-supported pages: Selective blocking
AI Crawl Control
Dashboard visibility into:
- Which AI crawlers are accessing your site
- Crawl frequency and volume
- Robots.txt compliance
AI Labyrinth
A defensive option that lures non-compliant scrapers into a maze of auto-generated decoy pages while allowing legitimate visitors through. This targets bots that ignore robots.txt without blocking compliant crawlers.
Content Signals Policy
Granular permissions allowing different treatment for:
- Training (incorporating into model weights)
- Search indexing (using for real-time retrieval)
- Inference (using during response generation)
This enables nuanced positions like “allow search indexing but not training.”
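As a sketch of what that looks like in practice: Cloudflare's Content Signals Policy extends robots.txt with a `Content-Signal` line. The directive names below follow Cloudflare's published proposal, but check the current documentation before relying on them:

```
# robots.txt with content signals: index me for search,
# but don't use my content for inference or training
User-agent: *
Content-Signal: search=yes, ai-input=no, ai-train=no
Allow: /
```

Like robots.txt itself, these signals express a preference rather than a technical enforcement; pairing them with Cloudflare's blocking settings covers crawlers that ignore the policy.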
Pay Per Crawl (Proposed)
A proposed marketplace model where AI companies compensate content creators for crawl access. Not yet implemented at scale, but Cloudflare is building infrastructure for it.
Making the Decision
Consider these factors for your situation:
Traffic Analysis
Check your analytics:
- How much traffic currently comes from AI referrers?
- Is that traffic converting?
- What’s the trend over time?
If AI traffic is material and valuable, blocking is costly.
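As a rough sketch of that first check, here's one way to tally AI-referred visits from an analytics export. The referrer domains and record format are illustrative assumptions, not a specific analytics API:

```python
# Hypothetical example: count referral visits from known AI platforms
# in a list of (referrer, converted) records from an analytics export.
AI_REFERRERS = ("chatgpt.com", "perplexity.ai", "claude.ai", "gemini.google.com")

def summarize_ai_traffic(visits):
    """Return visit and conversion counts for AI-platform referrers."""
    ai_visits = [v for v in visits if any(d in v["referrer"] for d in AI_REFERRERS)]
    conversions = sum(1 for v in ai_visits if v["converted"])
    return {"ai_visits": len(ai_visits), "conversions": conversions}

sample = [
    {"referrer": "https://chatgpt.com/", "converted": True},
    {"referrer": "https://www.google.com/", "converted": False},
    {"referrer": "https://perplexity.ai/search", "converted": False},
]
print(summarize_ai_traffic(sample))  # → {'ai_visits': 2, 'conversions': 1}
```

Run this against a month of referrer data and compare the conversion rate to your other channels; that comparison is what tells you whether blocking would be costly.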
Competitive Analysis
Assess competitors:
- Are they allowing AI access?
- Do they appear in AI responses for your keywords?
- Would blocking put you at competitive disadvantage?
Unilateral blocking while competitors allow access shifts visibility to them.
Business Model Fit
Evaluate your model:
- Does AI recommendation align with how you acquire customers?
- Is direct traffic essential to revenue?
- What’s the value of brand mentions vs. site visits?
Different business models have different optimal choices.
Content Sensitivity
Consider content characteristics:
- Is your content proprietary or commoditized?
- Would AI training on it create competitive risk?
- Is the content designed for discovery or direct consumption?
Proprietary content may warrant different treatment than discovery-oriented content.
Implementation Steps
Once you’ve decided:
To Allow AI Crawlers
- Log into Cloudflare dashboard
- Navigate to Security → Bots
- Check AI bot settings
- Ensure “Block AI Bots” is disabled or set to allow
- Verify your robots.txt doesn’t also block AI crawlers
- Test with crawler validation tools
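For that last verification step, Python's standard-library robots.txt parser offers a quick local check that your rules actually permit a given crawler. The user-agent tokens here are the commonly published ones; confirm them against each vendor's documentation:

```python
from urllib.robotparser import RobotFileParser

def crawler_allowed(robots_txt: str, user_agent: str, path: str = "/") -> bool:
    """Check whether a robots.txt body permits a crawler to fetch a path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

# Example robots.txt that blocks GPTBot but allows everyone else.
robots = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

print(crawler_allowed(robots, "GPTBot"))    # False — blocked
print(crawler_allowed(robots, "ClaudeBot")) # True — allowed
```

Paste in your live robots.txt body and test each crawler you care about; a `False` for a crawler you meant to allow means a rule needs fixing before Cloudflare settings even come into play.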
To Block AI Crawlers
- Log into Cloudflare dashboard
- Navigate to Security → Bots
- Enable “Block AI Bots”
- Optionally configure selective blocking
- Consider AI Labyrinth for non-compliant bots
- Monitor for unintended blocking of legitimate traffic
To Take a Middle Position
- Use Content Signals to allow search indexing but not training
- Block specific crawlers while allowing others
- Allow on some page types, block on others
- Review and adjust based on results
The Bigger Picture
Cloudflare’s policy change reflects an evolving relationship between content creators and AI systems.
Current state:
- AI systems extract value from web content
- Traffic flowing back is disproportionately small
- Content creators have limited leverage
Emerging state:
- Content creators gaining technical control (like Cloudflare settings)
- Business model experiments (pay-per-crawl)
- Regulatory attention to AI training practices
Your decision today operates in this evolving context. Consider not just current state but where the relationship between AI and content is heading.
The Visibility Implication
For brands prioritizing AI visibility, the takeaway is clear: ensure your technical infrastructure allows AI access.
Check:
- Cloudflare settings (if applicable)
- Robots.txt for AI crawler rules
- WAF and CDN configurations
- Any bot blocking tools
Technical barriers to AI crawling create invisible ceilings on AI visibility. Remove them or accept the limitation.
RivalHound monitors your AI visibility and helps identify technical barriers affecting your presence. Start your free trial to ensure AI platforms can discover your content.