AI Readiness Checker

Is your website ready for ChatGPT, Claude, Gemini, and Perplexity? Check AI crawler access, llms.txt, structured data, and content optimization.

9 AI crawlers · llms.txt check · 18 checks · ~5 seconds
This tool analyzes the single URL you enter, not your entire website.

What We Check

18 checks across crawler access, AI-specific files, and content optimization.

AI Crawlers Allowed

Checks if any AI crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) are blocked in robots.txt.
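For example, a robots.txt that blocks OpenAI's crawler and Google's AI-training token while leaving other bots unrestricted might look like this (the policy shown is illustrative, not a recommendation):

```txt
# Block specific AI crawlers; allow everything else.
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
```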

GPTBot Allowed

GPTBot is OpenAI's crawler that indexes content for ChatGPT.

ClaudeBot Allowed

ClaudeBot is Anthropic's crawler that indexes content for Claude.

Gemini Allowed

Google-Extended controls whether your content is used for Gemini/Bard AI training.

llms.txt File

llms.txt is a new standard file that provides AI models with a structured summary of your website.
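A minimal llms.txt is plain markdown: an H1 with the site name, a blockquote summary, and sections linking to key pages. The site name and URLs below are placeholders:

```markdown
# Example Co
> Example Co builds widgets. This site documents our products and API.

## Docs
- [Getting Started](https://example.com/docs/start): Installation and first steps
- [API Reference](https://example.com/docs/api): Endpoints and authentication

## Optional
- [Blog](https://example.com/blog): Product announcements
```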

llms-full.txt File

llms-full.txt is an extended version with complete documentation for AI comprehension.

ai.txt File

ai.txt declares your site's permissions for how AI systems may use its content.

AI Plugin

An OpenAI plugin manifest (ai-plugin.json) enables ChatGPT plugin integration for your website.

Structured Data

JSON-LD schema markup provides explicit context about content type for AI understanding.
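As a sketch, an Article page might embed a JSON-LD block like the following (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How AI Crawlers Read Your Site",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2025-01-15",
  "description": "A short summary AI models can use when citing this page."
}
</script>
```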

Meta Description

AI models use meta descriptions as a primary source for content summaries.

Heading Structure

Clear H1/H2 hierarchy helps AI models understand the structure and key topics of your page.

Content Depth

Pages with substantial content (300+ words) are more likely to be cited by AI models.

Language Attribute

The lang attribute on the html element helps AI models determine content language for proper processing.

OG Tags

Open Graph tags serve as secondary metadata for AI content understanding.

robots.txt File

A robots.txt file is essential for controlling which bots can access your content.

Sitemap Accessible

AI crawlers use sitemaps to discover and index your content efficiently.

Canonical URL

Canonical URLs prevent AI from processing duplicate versions of your pages.

HTTPS

HTTPS is required for trust. AI crawlers may deprioritize insecure sites.

Why It Matters

AI is changing how people find information online.

40%

AI Search Growth

AI-powered search usage grew 40% in 2025

9

AI Crawlers

Major AI bots scanning the web daily

llms.txt

New Standard

Adopted by thousands of sites in 2025

Free

No Signup

Instant results, unlimited scans

Frequently Asked Questions

Everything you need to know about AI readiness.

What is llms.txt?
llms.txt is a new standard file (like robots.txt) that provides AI language models with a markdown summary of your website. It helps ChatGPT, Claude, and other AI models understand your site content and structure.
Should I block or allow AI crawlers?
It depends on your goals. If you want your content to appear in AI-generated answers (ChatGPT, Perplexity, etc.), allow them. If you want to protect proprietary content from AI training, block them via robots.txt.
What is GPTBot and ClaudeBot?
GPTBot is OpenAI's crawler that indexes content for ChatGPT. ClaudeBot is Anthropic's crawler for Claude. Both respect robots.txt directives and can be individually allowed or blocked.
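Because both bots respect robots.txt, you can test a policy locally before deploying it. A minimal sketch using Python's standard urllib.robotparser (the robots.txt content and bot list here are assumed examples, not the tool's actual implementation):

```python
# Check whether specific AI crawlers may fetch a URL under a given
# robots.txt policy, using Python's built-in robots.txt parser.
from urllib.robotparser import RobotFileParser

# Illustrative policy: GPTBot blocked, everything else allowed.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def allowed_bots(robots_txt: str, url: str = "https://example.com/") -> dict:
    """Return {bot_name: True/False} for whether each bot may fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, url) for bot in AI_BOTS}

results = allowed_bots(ROBOTS_TXT)
# GPTBot hits its explicit Disallow; the others fall through to `User-agent: *`.
```

The same function could be pointed at a live site by fetching its /robots.txt first and passing the body in.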
How does structured data help AI?
JSON-LD schema markup gives AI explicit context about your content type (Article, Product, FAQ, etc.), making it easier for AI models to accurately represent your content in their responses.

Check your AI readiness now

Enter your URL above and see how AI models interact with your website.

Start Audit