Your AI discoverability score is a composite metric that quantifies how well your website is positioned to be found, understood, and cited by AI search platforms. The score ranges from 0 to 100, with higher scores indicating greater likelihood of appearing in AI-generated responses across ChatGPT, Gemini, Perplexity, and Claude.
How the Score Is Calculated
The discoverability score is a weighted composite of five category scores. Each category reflects a distinct dimension of AI readiness:
Structured Data (25% of total score)
Measures the completeness and accuracy of your schema markup implementation. This category evaluates:
- Coverage: What percentage of your pages have valid structured data.
- Type diversity: Whether you implement the high-priority schema types (Organization, Product, Service, FAQPage, Article, BreadcrumbList).
- Property completeness: Whether each schema instance includes all recommended properties, not just the required minimum.
- Validation: Whether your markup passes Schema.org validation and Google's Rich Results Test without errors.
A site with validated JSON-LD on every page template, covering all relevant schema types with complete properties, scores 100 in this category.
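To make the coverage and type-diversity checks concrete, here is a minimal Python sketch (not Point11's implementation) that pulls the JSON-LD blocks from a page and reports which high-priority schema types are declared. It assumes the third-party requests and beautifulsoup4 packages, and example.com is a placeholder URL:

```python
# Rough sketch: list the schema types a page declares via JSON-LD.
# Not Point11's scorer; error handling is minimal by design.
import json
import requests
from bs4 import BeautifulSoup

HIGH_PRIORITY_TYPES = {
    "Organization", "Product", "Service",
    "FAQPage", "Article", "BreadcrumbList",
}

def jsonld_types(url: str) -> set[str]:
    """Return the set of @type values found in JSON-LD blocks on a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    types: set[str] = set()
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(tag.string or "")
        except json.JSONDecodeError:
            continue  # malformed JSON-LD would also fail validation
        nodes = data if isinstance(data, list) else [data]
        for node in nodes:
            t = node.get("@type")
            if isinstance(t, str):
                types.add(t)
            elif isinstance(t, list):
                types.update(t)
    return types

found = jsonld_types("https://example.com/")
print("Declared types:", found)
print("Missing high-priority types:", HIGH_PRIORITY_TYPES - found)
```

A type census like this only approximates the coverage and diversity checks; property completeness and error-free validation still require a real validator such as Google's Rich Results Test.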
Content Quality (25% of total score)
Evaluates whether your content is structured for AI extraction and citation, based on the Princeton GEO research findings:
- Statistical content: Presence of data points, percentages, and quantified claims with sources.
- Authoritative citations: Inline references to credible sources, research, and industry standards.
- Structural formatting: Use of headings, bullet points, Q&A formats, and TL;DR summaries that AI systems can easily parse.
- Content depth: Length and comprehensiveness relative to the topic and competitive benchmarks. Articles over 2,900 words average 5.1 citations from AI systems, compared with 3.2 for articles under 800 words. A rough heuristic for these signals is sketched after this list.
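Here is a crude Python illustration of those signals, assuming the page has already been extracted to plain text or Markdown; the depth thresholds mirror the word counts cited above, and none of this is Point11's actual scoring logic:

```python
# Heuristic sketch: count extraction-friendly signals in page text.
import re

def content_quality_signals(text: str) -> dict:
    words = text.split()
    return {
        "word_count": len(words),
        # Numbers and percentages as a proxy for quantified claims
        "statistics": len(re.findall(r"\b\d+(?:\.\d+)?%?", text)),
        # Markdown-style headings and bullets as structure proxies
        "headings": len(re.findall(r"^#{1,6}\s", text, flags=re.MULTILINE)),
        "bullets": len(re.findall(r"^\s*[-*]\s", text, flags=re.MULTILINE)),
        # Coarse depth tier based on the citation data cited above
        "depth_tier": "long" if len(words) > 2900 else
                      "short" if len(words) < 800 else "medium",
    }
```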
Crawler Access (20% of total score)
Assesses whether AI platforms can technically reach and index your content:
- robots.txt configuration for all major AI crawlers (GPTBot, OAI-SearchBot, Google-Extended, PerplexityBot, ClaudeBot, Claude-SearchBot, CCBot).
- XML sitemap availability and accuracy.
- Server response times and availability for crawler requests.
- Absence of rate limiting or blocking that would prevent consistent indexing.
Allowing all major AI crawlers without restrictions scores highest. Blocking any crawler reduces the score in proportion to that platform's market share.
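You can spot-check your own configuration with Python's standard-library robotparser; the user-agent names mirror the list above, and example.com is a placeholder:

```python
# Minimal sketch: check whether each major AI crawler may fetch a page.
from urllib import robotparser

AI_CRAWLERS = [
    "GPTBot", "OAI-SearchBot", "Google-Extended",
    "PerplexityBot", "ClaudeBot", "Claude-SearchBot", "CCBot",
]

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for bot in AI_CRAWLERS:
    allowed = rp.can_fetch(bot, "https://example.com/")
    print(f"{bot}: {'allowed' if allowed else 'blocked'}")
```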
Rendering Compatibility (15% of total score)
Measures whether your content is accessible to crawlers that cannot execute JavaScript:
- Percentage of critical content available in the initial HTML response (without JavaScript rendering).
- Server-side rendering or static generation coverage.
- Identification of JavaScript-dependent content sections.
This category is particularly important because OpenAI's crawlers cannot render JavaScript. If significant content is invisible to these crawlers, your score reflects the gap.
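One way to approximate what a non-rendering crawler sees is to fetch the raw HTML and check whether critical content appears in it. This sketch assumes you know a few phrases that should be present; both the URL and the phrases are placeholders:

```python
# Hedged sketch: fetch raw HTML (no JavaScript execution, as a
# non-rendering crawler would) and check for critical content strings.
import requests
from bs4 import BeautifulSoup

CRITICAL_PHRASES = [
    "Pricing",                    # e.g., a plans table rendered client-side
    "AI discoverability score",   # e.g., key product copy
]

html = requests.get("https://example.com/", timeout=10).text
visible_text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

for phrase in CRITICAL_PHRASES:
    status = "present" if phrase in visible_text else "MISSING without JS"
    print(f"{phrase!r}: {status}")
```

Any phrase that only appears after JavaScript runs will show as missing here, which is exactly the gap this category measures.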
Content Freshness (15% of total score)
Tracks how recently your content was updated, weighted by page importance:
- Pages updated within the last 30 days receive full freshness credit.
- Pages updated 30-60 days ago receive partial credit.
- Pages updated 60-90 days ago receive minimal credit.
- Pages not updated in over 90 days receive no freshness credit.
High-traffic and high-priority pages (product pages, service pages, homepage) are weighted more heavily than low-traffic blog posts or utility pages.
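As an illustration, the tiers and page weighting might combine as in the sketch below. Point11 does not publish its exact credit values, so the 1.0/0.5/0.25/0.0 steps and the page-weight scheme are assumptions:

```python
# Illustrative sketch of the freshness tiers above. The credit values
# and the 0-1 page weights are assumptions, not Point11's internals.
from datetime import date

def freshness_credit(last_updated: date) -> float:
    days = (date.today() - last_updated).days
    if days <= 30:
        return 1.0    # full credit
    if days <= 60:
        return 0.5    # partial credit
    if days <= 90:
        return 0.25   # minimal credit
    return 0.0        # no freshness credit

def weighted_freshness(pages: list[tuple[date, float]]) -> float:
    """pages: (last_updated, importance weight) per page."""
    total_weight = sum(w for _, w in pages)
    return sum(freshness_credit(d) * w for d, w in pages) / total_weight
```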
Score Interpretation
Point11 groups scores into four tiers (a sketch combining the category weights with these tiers follows the list):
- 80-100 — Strong: Your site is well-positioned for AI discoverability. Focus on maintaining content freshness and expanding structured data coverage to new content.
- 60-79 — Moderate: Core foundations are in place but gaps exist. Review the category breakdown to identify the weakest areas and prioritize remediation.
- 40-59 — Weak: Significant issues are limiting your AI visibility. Likely causes include missing structured data, blocked crawlers, or JavaScript-dependent content. Address critical issues immediately.
- 0-39 — Critical: Your site is largely invisible to AI platforms. Common at this level: AI crawlers are blocked in robots.txt, no structured data is implemented, or the site is entirely client-side rendered.
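Putting the pieces together, here is a minimal sketch of the weighted composite from the category breakdown above, mapped onto these tiers; the example category scores are made up:

```python
# Weighted composite and tier mapping as described in this section.
# Category scores are 0-100; weights match the listed percentages.
WEIGHTS = {
    "structured_data": 0.25,
    "content_quality": 0.25,
    "crawler_access": 0.20,
    "rendering": 0.15,
    "freshness": 0.15,
}

def discoverability_score(categories: dict[str, float]) -> float:
    return sum(categories[name] * w for name, w in WEIGHTS.items())

def tier(score: float) -> str:
    if score >= 80:
        return "Strong"
    if score >= 60:
        return "Moderate"
    if score >= 40:
        return "Weak"
    return "Critical"

example = {"structured_data": 90, "content_quality": 70,
           "crawler_access": 100, "rendering": 40, "freshness": 55}
s = discoverability_score(example)
print(f"{s:.1f} -> {tier(s)}")  # 74.2 -> Moderate
```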
Tracking Score Changes
Point11 recalculates your discoverability score after every crawl cycle (daily, weekly, or manual, depending on your configuration). The score history is available in the Analytics > Discoverability Trend view, which shows how your score has changed over time so you can correlate movements with specific actions taken.
Score changes are also reported in weekly summary emails sent to all organization members with notification preferences enabled.
Sources
- GEO: Generative Engine Optimization: https://arxiv.org/abs/2311.09735
- Schema.org Official Documentation: https://schema.org/docs/gs.html
- Google Search Central — Structured Data Introduction: https://developers.google.com/search/docs/appearance/structured-data/intro-structured-data
- OpenAI Crawlers Overview: https://platform.openai.com/docs/bots