Official Analyzer Identity

Sende SEO Analyzer

This is the official identity page for Sende SEO Analyzer. It explains how the analyzer works, its privacy policy, and its terms of use for site owners, hosting providers, and protection platforms.

The purpose of this page is to support legitimate auditing without bypassing security controls.

Official User-Agent
`SEOChecksBot/1.0; +https://seo-checks.com/bot`
The analyzer does not bypass challenge pages, CAPTCHA, or protected areas. If access is blocked, the report states that the protection layer prevented access instead of attempting to bypass it.
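
As an illustration only, an identified fetch with this User-Agent looks like the following from the client side. The target URL is a placeholder and `fetch_identified` is our name for the sketch, not an analyzer internal:

```python
import urllib.request

# The official User-Agent string from this page.
UA = "SEOChecksBot/1.0; +https://seo-checks.com/bot"

def fetch_identified(url: str) -> int:
    """Fetch a page with the official User-Agent and return the HTTP status.

    A 403 or a challenge page here is reported as a block, not bypassed.
    """
    req = urllib.request.Request(url, headers={"User-Agent": UA})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status
```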

What it does

It fetches the submitted URL, reads robots.txt and the sitemap, and then fully audits only the URLs declared in the sitemap, in limited batches.

What it does not do

It does not bypass challenge pages, solve CAPTCHA, or access private or protected areas.

Why it appears in logs

It fetches sitemap URLs as an identified crawler to generate a clear diagnostic report for the site owner.

How Sende SEO Analyzer works

  • The analyzer starts with the URL explicitly submitted by the user.
  • When `AUDIT_DATA_PROVIDER=dataforseo`, the root page fetch is performed through DataForSEO (JavaScript execution and browser rendering as configured); otherwise the analyzer uses direct fetches from our infrastructure as SEOChecksBot.
  • It reads robots.txt and sitemap.xml when available.
  • The full audit is limited to sitemap URLs only; it does not crawl other internal links.
  • If no valid sitemap is found, it reports that issue and does not start alternative internal crawling.
  • It analyzes sitemap URLs in limited batches to reduce server load.
  • It records technical results such as status code, title, description, H1, and canonical.
  • If a protection page such as "Just a moment" or a 403 response appears, it reports the block and does not attempt to bypass it.
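
The batching behavior above can be sketched as follows. The batch size and result fields here are illustrative assumptions, not the analyzer's actual configuration:

```python
from typing import Iterator

BATCH_SIZE = 5  # illustrative; the real per-batch limit is an internal setting

def batches(urls: list[str], size: int = BATCH_SIZE) -> Iterator[list[str]]:
    """Yield sitemap URLs in small groups to reduce server load."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def audit_sitemap(sitemap_urls: list[str]) -> list[dict]:
    """Audit only URLs declared in the sitemap, batch by batch."""
    results = []
    for batch in batches(sitemap_urls):
        for url in batch:
            # A real audit would fetch the page here and record status code,
            # title, description, H1, and canonical; a block (403 or a
            # challenge page) is recorded as-is, never bypassed.
            results.append({"url": url, "status": None})
    return results
```

No crawling beyond the sitemap happens: `audit_sitemap` only ever sees URLs that were declared there.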

Privacy Policy

  • The analyzer checks only public pages accessible on the internet.
  • It does not request passwords and does not access user accounts or admin panels.
  • It does not collect payment data or sensitive personal visitor data.
  • When the audit data provider is set to DataForSEO (`AUDIT_DATA_PROVIDER=dataforseo`), the URLs you submit for auditing are sent to the third-party processor **DataForSEO** to fetch and analyze pages (including JavaScript execution and browser rendering as configured). DataForSEO may apply its own caching and processing consistent with its service. Further details appear in DataForSEO’s documentation and policies.
  • To limit abuse, we apply a daily (UTC calendar day) cap on **new** audit submissions per client IP, tracked using a one-way hash of the IP (the raw IP address is never stored). The cap is configured per deployment via `AUDIT_RATE_LIMIT_PER_DAY_PUBLIC` (default **3** per IP per day; **0** disables the daily cap for non-bypass IPs on self-hosted installs).
  • If a completed audit already exists for the same submitted URL, the service may return that prior audit within **24 hours** via a Redis-backed cache instead of enqueueing a new job.
  • It stores the submitted URL, extracted domain, audit status, created time, start time, and finish time to generate and display the report by audit ID.
  • It stores the audited page URLs, HTTP status for each page, and technical SEO fields required for the report, such as canonical and noindex detection.
  • It stores detected issues such as issue type, affected URL, evidence, impact, suggested fix, and priority score.
  • It stores limited operational logs including audit ID, checked URL, duration, page count, and server resource usage such as load, memory, and swap.
  • It does not archive full page content, passwords, payment data, or private account content.

Terms of Use

  • Use the analyzer only for sites you own or have explicit permission to audit.
  • Do not use the analyzer to disrupt websites, test security controls, or bypass access restrictions.
  • Analyzer results are diagnostic and assistive; they do not guarantee search ranking.
  • Audits are automatically delayed or stopped during high load to protect the service and audited sites.
  • If a site blocks access through robots.txt or a protection layer, the analyzer treats that as a block signal and does not bypass it.
  • When `AUDIT_DATA_PROVIDER=dataforseo`, page retrieval relies on DataForSEO’s infrastructure. Availability, consistency, and any blocks (for example CDN/WAF or origin responses) depend on both the target site and the vendor; we do not guarantee identical outcomes to direct fetches.

Allowing the analyzer through Cloudflare or WAF

If you own the site and want to allow the audit, add a safe exception in Cloudflare or your WAF. Prefer allowing verified crawlers instead of disabling protection for all visitors.

Allow by IP + User-Agent:
  If IP equals YOUR_TOOL_SERVER_IP
  AND User-Agent contains "SEOChecksBot"
  Then Skip Managed Challenge

Allow verified crawlers:
  If cf.client.bot is true
  Then Skip Managed Challenge
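
In Cloudflare's rule expression language, the two rules above can be written roughly as follows. Replace the placeholder IP with your tool server's address; the Skip action (skipping the Managed Challenge) is selected in the dashboard, not in the expression itself:

```
(ip.src eq YOUR_TOOL_SERVER_IP and http.user_agent contains "SEOChecksBot")

(cf.client.bot)
```

Note that `cf.client.bot` matches only crawlers Cloudflare itself has verified, so the first rule is the more reliable exception for this analyzer.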

Blocking the analyzer

Site owners can block the analyzer through robots.txt or security rules. When blocked, the analyzer reports the block instead of bypassing it.

robots.txt
User-agent: SEOChecksBot
Disallow: /
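
You can verify how this directive is interpreted with Python's standard robots.txt parser. This is an illustration of the block signal, not the analyzer's internal code:

```python
from urllib.robotparser import RobotFileParser

# Parse the blocking directive shown above.
rp = RobotFileParser()
rp.parse(["User-agent: SEOChecksBot", "Disallow: /"])

# The analyzer treats this as a block signal and does not fetch the page.
print(rp.can_fetch("SEOChecksBot/1.0", "https://example.com/page"))  # False
```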

Contact

If analyzer visits appear in your logs and you need clarification or want to request blocking assistance, contact us by email.