# Tiers, pages, and data retention
Plan limits determine how many sites you can track, how many URLs we crawl per site, how many Google accounts or properties you can connect, and how long Search Console history is labeled for retention in the product. The commercial plan names should match what you see at checkout; the table below reflects the technical limits enforced by the application.
## Limits by tier
| Tier | GSC accounts | Websites | Pages per website (approx.) | Data retention (label) |
|---|---|---|---|---|
| Free | 1 | 1 | 100 | 30 days |
| Noob | 1 | 1 | 1,000 | 90 days |
| Pro | 1 | 5 | 2,000 | 90 days |
| Business | 2 | 25 | 3,000 | 90 days |
| Enterprise | 5 | 100 | 5,000 | 90 days |
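The table above can be sketched as a per-tier lookup structure. This is a hypothetical illustration of how such limits might be represented and checked; the tier names and numbers mirror the table, but the application's actual internal representation is not documented here.

```typescript
// Hypothetical sketch of the per-tier limits table as a lookup map.
// Values mirror the documentation table above; the real app's
// internals may differ.
type Tier = "free" | "noob" | "pro" | "business" | "enterprise";

interface TierLimits {
  gscAccounts: number;   // connected Google Search Console accounts
  websites: number;      // tracked websites
  pagesPerSite: number;  // approximate pages crawled per website
  retentionDays: number; // data-retention label
}

const LIMITS: Record<Tier, TierLimits> = {
  free:       { gscAccounts: 1, websites: 1,   pagesPerSite: 100,  retentionDays: 30 },
  noob:       { gscAccounts: 1, websites: 1,   pagesPerSite: 1000, retentionDays: 90 },
  pro:        { gscAccounts: 1, websites: 5,   pagesPerSite: 2000, retentionDays: 90 },
  business:   { gscAccounts: 2, websites: 25,  pagesPerSite: 3000, retentionDays: 90 },
  enterprise: { gscAccounts: 5, websites: 100, pagesPerSite: 5000, retentionDays: 90 },
};

// Example check: can a workspace on this tier add another website?
function canAddWebsite(tier: Tier, currentCount: number): boolean {
  return currentCount < LIMITS[tier].websites;
}
```

For example, a Pro workspace tracking 4 sites can add a fifth, but not a sixth.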
A small internal margin keeps pages close to the limit from failing unpredictably; in practice, your effective cap is essentially the stated number.
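The margin behavior can be sketched as follows. Both the margin size and the function name here are assumptions for illustration only; the actual slack is internal and not documented.

```typescript
// Hypothetical: a small slack above the nominal cap so pages right at
// the limit don't fail unpredictably. The 5% figure is an assumed
// illustration, not the product's actual margin.
const CRAWL_MARGIN = 0.05;

function withinEffectiveCap(pageCount: number, nominalCap: number): boolean {
  return pageCount <= Math.floor(nominalCap * (1 + CRAWL_MARGIN));
}
```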
## Crawl throughput
Your tier sets how many pages we will index per site. How quickly a large site is fully refreshed also depends on server-side job settings, such as a maximum number of page attempts per scheduler run. These knobs are operational rather than something you configure in the app, but they explain why very large sites may take several passes to catch up.
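The multi-pass behavior can be estimated with simple arithmetic. The per-run attempt limit used below is a hypothetical number; the real server-side setting is operational and not exposed.

```typescript
// Hypothetical estimate: if each scheduler run attempts at most
// `attemptsPerRun` pages, a site capped at `pageCap` pages needs
// roughly this many runs for a full refresh.
function runsToFullRefresh(pageCap: number, attemptsPerRun: number): number {
  return Math.ceil(pageCap / attemptsPerRun);
}
```

For example, with an assumed limit of 500 attempts per run, a 5,000-page Enterprise site would catch up over about 10 scheduler passes.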