SEO & GSC Audit
Check any website against 48 Google Search Console rules. Each check is explained, with concrete fix instructions. Free, no signup, results in ~20 seconds.
What We Check — All 48 Rules
Built from real production experience running 13+ live websites. Each check addresses a specific Google Search Console issue.
- Homepage HTTP 200 — your site responds correctly
- HTTP → HTTPS redirect — secure connection enforced
- www → non-www (or vice versa) — single canonical version
- HSTS header — forces HTTPS on every visit
- /index → / — no duplicate homepage URLs
- Redirect is 301 not 302 — passes full SEO authority
- Trailing slash consistency — /page or /page/, only one canonical
- TTFB < 500ms — Core Web Vitals threshold for server response time. Affects LCP and ranking signals.
- No noindex meta tag — common staging leftover
- No X-Robots-Tag header — hidden in server config
- 404 page returns HTTP 404 status — not a soft 404
- 404 page has a noindex meta tag — errors won't pollute the index
- 404 page has no canonical to the homepage — avoids soft-404 misclassification
- sitemap.xml exists & valid XML
- Clean URLs (no .php, no /index)
- No priority/changefreq tags (Google ignores them)
- Sample URLs return HTTP 200
- Homepage canonical in sitemap
- XML content-type
- UTF-8 encoding declared
- Absolute URLs (https://...)
- Same-domain only
- HTTPS only (not http://)
- No tracking params (utm_, fbclid)
- No # fragments
- Sample URLs without noindex
- Sample URLs self-canonical
- <lastmod> tags present
- Canonical tag present
- Canonical matches site URL
- No duplicate canonical tags
- Title tag (20-70 chars)
- Meta description (50-170 chars)
- Viewport meta tag (mobile)
- JSON-LD structured data
- robots.txt exists
- Sitemap directive in robots.txt
- favicon.ico exists
- No "Disallow: /" on User-agent: *
- CSS/JS not blocked (Google needs them)
- Valid Disallow syntax (no empty lines)
- Content-Type: text/plain
- Service account auth
- URL Inspection accessible
- Coverage state (Submitted & indexed?)
- Verdict (PASS/NEUTRAL/FAIL)
- robots.txt state (real Google view)
- Page fetch state (last crawl result)
- Submitted sitemaps status
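The redirect rules above (301 vs 302, HTTPS enforcement) can be illustrated with a small check over an observed redirect chain. This is a simplified sketch under our own naming, not the tool's actual implementation; `audit_redirect_chain` is a hypothetical helper:

```python
def audit_redirect_chain(hops):
    """Flag SEO issues in a redirect chain, given (status_code, url) hops
    ending at the final resolved URL. Hypothetical helper for illustration."""
    issues = []
    # Every intermediate hop should be a permanent (301) redirect.
    for status, url in hops[:-1]:
        if status == 302:
            issues.append(f"302 (temporary) redirect at {url}; use 301 to pass full authority")
    final_status, final_url = hops[-1]
    if final_status != 200:
        issues.append(f"final URL returned HTTP {final_status}, expected 200")
    if not final_url.startswith("https://"):
        issues.append("final URL is not served over HTTPS")
    return issues
```

For a healthy site, `audit_redirect_chain([(301, "http://example.com/"), (200, "https://example.com/")])` returns an empty list.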
Check Categories
Every check comes with a pass/fail result and specific fix instructions.
Sitemap & Robots
Validates sitemap.xml schema, URL sampling, robots.txt directives, and crawl configuration.
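The URL-level sitemap rules (HTTPS only, same domain, no tracking params, no fragments) boil down to a walk over every `<loc>` entry. A minimal sketch using only the standard library — not the tool's actual code:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse, parse_qs

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def validate_sitemap(xml_text, site_host):
    """Check each <loc> in a sitemap against the URL hygiene rules.
    Illustrative sketch; assumes a valid urlset document."""
    root = ET.fromstring(xml_text)
    issues = []
    for loc in root.findall(".//sm:loc", SITEMAP_NS):
        url = (loc.text or "").strip()
        parts = urlparse(url)
        if parts.scheme != "https":
            issues.append(f"not HTTPS: {url}")
        if parts.netloc != site_host:
            issues.append(f"foreign host: {url}")
        if parts.fragment:
            issues.append(f"# fragment: {url}")
        if any(k.startswith("utm_") or k == "fbclid" for k in parse_qs(parts.query)):
            issues.append(f"tracking params: {url}")
    return issues
```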
Meta & Canonical
Title length, meta description, canonical consistency, noindex detection, duplicate tags.
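These meta checks can be reproduced with Python's built-in HTML parser. The class and thresholds below are an illustrative sketch (the 20-70 and 50-170 character ranges come from the rule list above), not the audit's real implementation:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the tags the meta checks inspect (hypothetical sketch)."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None
        self.canonicals = []
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and a.get("name") == "description":
            self.description = a.get("content", "")
        elif tag == "meta" and a.get("name") == "robots":
            self.noindex = "noindex" in (a.get("content") or "")
        elif tag == "link" and a.get("rel") == "canonical":
            self.canonicals.append(a.get("href"))

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit_meta(html):
    parser = MetaAudit()
    parser.feed(html)
    issues = []
    if not 20 <= len(parser.title.strip()) <= 70:
        issues.append(f"title is {len(parser.title.strip())} chars; want 20-70")
    if parser.description is None or not 50 <= len(parser.description) <= 170:
        issues.append("meta description missing or outside 50-170 chars")
    if len(parser.canonicals) > 1:
        issues.append("duplicate canonical tags")
    if parser.noindex:
        issues.append("noindex meta tag present")
    return issues
```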
Core Web Vitals
Real TTFB measurement from our servers. Identifies slow server response times.
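Measuring TTFB needs nothing beyond the standard library: time how long the server takes to return the status line and headers after the request is sent. A rough sketch (note `measure_ttfb_ms` makes a live network request, and is not how the audit's servers necessarily do it):

```python
import http.client
import time

TTFB_THRESHOLD_MS = 500  # server-response budget used by the TTFB check

def measure_ttfb_ms(host, path="/", timeout=10):
    """Time from sending the request until getresponse() returns,
    i.e. until the first response bytes (status line + headers) arrive.
    Makes a live HTTPS request; illustrative only."""
    conn = http.client.HTTPSConnection(host, timeout=timeout)
    try:
        start = time.perf_counter()
        conn.request("GET", path, headers={"User-Agent": "seo-audit-sketch"})
        conn.getresponse()
        return (time.perf_counter() - start) * 1000
    finally:
        conn.close()

def ttfb_verdict(ms, threshold=TTFB_THRESHOLD_MS):
    return "PASS" if ms < threshold else "FAIL"
```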
Security Headers
HSTS, X-Frame-Options, Content-Type-Options, Referrer-Policy verification.
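Verifying these four headers is a simple case-insensitive presence check over the response headers. A minimal sketch, assuming `headers` is a dict of response header names to values:

```python
# Header names the security check looks for (HTTP headers are case-insensitive).
REQUIRED_HEADERS = {
    "strict-transport-security": "HSTS",
    "x-frame-options": "X-Frame-Options",
    "x-content-type-options": "X-Content-Type-Options",
    "referrer-policy": "Referrer-Policy",
}

def missing_security_headers(headers):
    """Return the human-readable names of required headers that are absent."""
    present = {name.lower() for name in headers}
    return [label for key, label in REQUIRED_HEADERS.items() if key not in present]
```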
Structured Data
Detects JSON-LD presence and validates it against the schema.org spec.
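Detecting JSON-LD amounts to finding `<script type="application/ld+json">` blocks, parsing them as JSON, and checking for a schema.org `@context`. A regex-based sketch (full validation against the schema.org vocabulary is a much bigger job and is not shown):

```python
import json
import re

JSONLD_RE = re.compile(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def find_jsonld(html):
    """Return parsed JSON-LD blocks that declare a schema.org @context."""
    blocks = []
    for match in JSONLD_RE.finditer(html):
        try:
            data = json.loads(match.group(1))
        except json.JSONDecodeError:
            continue  # malformed JSON-LD fails the check
        if isinstance(data, dict) and "schema.org" in str(data.get("@context", "")):
            blocks.append(data)
    return blocks
```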
GSC API Integration
Connects to Google's URL Inspection API for real indexing status and crawl data.
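The verdict, coverage, robots.txt, and page-fetch fields listed above all come from one `urlInspection.index.inspect` response. The service-account auth and API call are omitted here; this sketch only shows extracting the reported fields, with key names following the documented `inspectionResult.indexStatusResult` shape:

```python
def summarize_inspection(response):
    """Pull the fields this audit reports from a URL Inspection API
    response dict. Illustrative helper, not the tool's actual code."""
    idx = response.get("inspectionResult", {}).get("indexStatusResult", {})
    return {
        "verdict": idx.get("verdict", "VERDICT_UNSPECIFIED"),  # PASS/NEUTRAL/FAIL
        "coverage": idx.get("coverageState"),       # e.g. "Submitted and indexed"
        "robots_txt": idx.get("robotsTxtState"),    # real Google view of robots.txt
        "page_fetch": idx.get("pageFetchState"),    # last crawl result
        "last_crawl": idx.get("lastCrawlTime"),
    }
```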
Why It Matters
Numbers that make a difference for your website.
Checks
Every rule Google cares about
Audit Time
Full report in seconds
Free
No signup, no limits
GSC Data
Real API data, not simulated
Frequently Asked Questions
Common questions about this tool and how to use the results.
Ready to audit your site?
Enter your URL above and get results in seconds. Completely free.
Start Audit