Point11

Connecting your first website

Add your website to Point11, verify domain ownership, and configure crawl settings for AI discoverability analysis.

Connecting a website to Point11 enables the platform to analyze your pages for AI discoverability, monitor your structured data implementation, and track how AI search platforms cite your content. This guide covers the full connection process from domain verification through initial crawl configuration.

Adding a Website

From your Point11 dashboard, navigate to Websites > Add Website. Enter the root URL of the site you want to connect (e.g., https://www.example.com). Point11 will automatically detect whether the site uses www or non-www as its canonical prefix and whether it enforces HTTPS.

You can connect multiple websites under a single organization. Each website operates as an independent workspace with its own audit history, discoverability scores, and monitoring configuration.

Domain Verification

Before Point11 can crawl and analyze your site, you must verify that your organization controls the domain. Three verification methods are available:

DNS TXT Record (Recommended)

Add a TXT record to your domain's DNS configuration with the verification token provided by Point11. This method is preferred because it does not require changes to your website's code or hosting configuration.

  • Record type: TXT
  • Host: @ (or your subdomain)
  • Value: point11-verification=your-unique-token
  • TTL: 3600 (or your DNS provider's default)

DNS propagation typically completes within minutes but can take up to 48 hours depending on your DNS provider.
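You can confirm propagation from the command line before clicking "Verify." A quick check with dig (example.com stands in for your domain, and the token value is a placeholder):

```shell
# Query the domain's TXT records and filter for the Point11 token.
# Replace example.com with your domain.
dig +short TXT example.com | grep point11-verification
```

Once the record has propagated, this prints a line like "point11-verification=your-unique-token". No output means the record is not yet visible to resolvers.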

HTML Meta Tag

Add a meta tag to the <head> section of your homepage:

<meta name="point11-verification" content="your-unique-token" />

Point11 will check for this tag when you click "Verify." The tag must remain on the page as long as the site is connected.
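You can check that the tag is live with a quick fetch (the URL is a placeholder for your homepage):

```shell
# Fetch the homepage and look for the verification meta tag in the raw HTML.
curl -s https://www.example.com/ | grep -o 'name="point11-verification"'
```

If there is no output, the tag is missing from the served HTML; for example, if your site injects the tag client-side with JavaScript, it may not appear in the raw response.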

File Upload

Upload a verification file (point11-verify.html) to your web root so it is accessible at https://yourdomain.com/point11-verify.html. The file contents are provided during the verification flow.
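A quick reachability check before clicking "Verify" (the domain is a placeholder); curl's -f flag turns HTTP errors such as 404 into a non-zero exit status:

```shell
# -s silences progress output; -f makes curl fail on HTTP error codes,
# so the first branch only runs when the file is publicly reachable.
curl -sf https://yourdomain.com/point11-verify.html > /dev/null \
  && echo "verification file reachable" \
  || echo "verification file missing or blocked"
```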

Configuring Crawl Settings

After verification, Point11 begins crawling your site using its default settings, which work well for most websites. You can customize the following:

Crawl Scope

  • Full site: Point11 crawls all pages discoverable from your sitemap and internal links. Recommended for most websites.
  • Specific paths: Restrict crawling to particular directories (e.g., /products/, /docs/). Useful for large sites where only certain sections are relevant to AI discoverability.
  • Sitemap-only: Crawl only pages listed in your XML sitemap. Use this when your site has sections you do not want analyzed.

Crawl Frequency

  • Daily: Recommended for sites with frequently updated content (e-commerce, news, documentation).
  • Weekly: Suitable for most corporate and B2B websites.
  • Manual: Crawl only when triggered by a team member.

Respect robots.txt

Point11's crawler respects your robots.txt directives by default. If specific paths are disallowed, they will not be analyzed. You can override this in settings if you want Point11 to audit pages that are blocked from public crawlers, but note that this only affects Point11's internal analysis and does not change how AI platforms access your site.
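As an illustration, a robots.txt along these lines (the paths and policy are examples, not recommendations) shows how the directives interact. Note that a named user-agent group overrides the wildcard group entirely, rather than adding to it:

```
# Named groups override the wildcard group, so GPTBot follows only
# its own rules here and may crawl everything, including /internal/.
User-agent: GPTBot
Allow: /

# All other crawlers, including unlisted AI bots, skip /internal/.
User-agent: *
Disallow: /internal/
```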

What Happens During the Initial Crawl

Once connected and configured, Point11 performs an initial crawl that:

  • Discovers all accessible pages within your defined scope.
  • Analyzes each page for structured data (JSON-LD, Microdata, RDFa).
  • Evaluates content structure, heading hierarchy, and semantic HTML.
  • Checks robots.txt rules for AI crawler access (GPTBot, Google-Extended, PerplexityBot, ClaudeBot).
  • Identifies JavaScript-rendered content that may be invisible to AI crawlers.
  • Generates your initial AI discoverability score.

The initial crawl duration depends on site size. Most sites under 10,000 pages complete within an hour. Larger sites may take several hours.
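You can preview what the structured data and JavaScript checks see for a single page by fetching it the way an AI crawler does: raw HTML, no JavaScript execution. The URL and simplified user-agent string below are placeholders:

```shell
# Fetch a page identifying as GPTBot and count JSON-LD blocks in the
# raw HTML; content rendered only by JavaScript will not appear here.
curl -s -A "GPTBot/1.0" https://www.example.com/ \
  | grep -o 'application/ld+json' | wc -l
```

A count of zero on a page you know carries structured data is a strong hint that it is injected client-side and invisible to crawlers that do not render JavaScript.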

Troubleshooting Connection Issues

  • Verification fails: Ensure the DNS record, meta tag, or file is accessible from the public internet. Check for typos in the verification token.
  • Crawl returns few pages: Verify your XML sitemap is accessible and up to date. Check that internal links are not blocked by robots.txt or nofollow directives.
  • Timeout errors: Large pages or slow server response times can cause timeouts. Ensure your server can handle crawler requests without rate limiting Point11's user agent.
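For the "few pages" case, a quick way to see how many URLs your sitemap actually exposes (the URL is a placeholder for your sitemap location):

```shell
# Fetch the sitemap and count <loc> entries; a number far below your
# known page count suggests the sitemap is stale or incomplete.
curl -s https://www.example.com/sitemap.xml | grep -o '<loc>' | wc -l
```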

Need help implementing this?

Our team can walk you through the setup.