Unclear if sitemap.xml and robots.txt are set up correctly.

Sitemap and robots.txt are the first files search engines fetch. A typo there can wipe your site from the index.

Try this first

  1. Open yourdomain.tld/robots.txt. Make sure there is no site-wide Disallow: / (that blocks everything) and that the sitemap is referenced at the bottom: Sitemap: https://yourdomain.tld/sitemap.xml.
  2. Open yourdomain.tld/sitemap.xml. It should be valid XML listing the URLs you want indexed. In WordPress, Yoast and RankMath generate it automatically.
  3. Only block what truly should not be crawled: /wp-admin, /cart, /checkout, internal search results. Everything else stays crawlable.
  4. Submit sitemap.xml in Google Search Console. It shows which URLs get indexed and which errors Google sees.
  5. Webflow: sitemap.xml is auto-generated under Project Settings > SEO, and robots.txt is configurable there too.
  6. Re-check quarterly. A new plugin or an accidentally ticked 'Discourage search engines' setting can ruin everything.
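
Putting steps 1 and 3 together, a robots.txt that follows this advice might look like the sketch below. The domain and the blocked paths are illustrative; adapt them to your own site:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /checkout/
Disallow: /?s=

Sitemap: https://yourdomain.tld/sitemap.xml
```

On WordPress sites that depend on AJAX for the front end, it is common to add Allow: /wp-admin/admin-ajax.php below the Disallow lines so that endpoint stays reachable.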
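For step 2, a minimal valid sitemap has this shape (URLs are placeholders; SEO plugins typically generate a sitemap index that links to several files built like this):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.tld/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.tld/about/</loc>
  </url>
</urlset>
```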
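If you want to script the quarterly re-check from step 6, a minimal sketch like the one below (our own helper, not an official tool) can flag the two robots.txt mistakes discussed above: a blanket Disallow and a missing Sitemap reference. Feed it the body of your live robots.txt.

```python
def audit_robots(text):
    """Return a list of problems found in a robots.txt body."""
    problems = []
    lines = [line.strip() for line in text.splitlines()]
    # A blanket "Disallow: /" blocks the entire site for the matching user-agent.
    if any(line.lower().replace(" ", "") == "disallow:/" for line in lines):
        problems.append("blanket Disallow: / found")
    # The sitemap should be referenced so crawlers can find it.
    if not any(line.lower().startswith("sitemap:") for line in lines):
        problems.append("no Sitemap: line")
    return problems

broken = "User-agent: *\nDisallow: /\n"
healthy = "User-agent: *\nDisallow: /wp-admin/\nSitemap: https://yourdomain.tld/sitemap.xml\n"
print(audit_robots(broken))   # → ['blanket Disallow: / found', 'no Sitemap: line']
print(audit_robots(healthy))  # → []
```

This only catches the two errors named in the steps; it is not a substitute for checking Search Console.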

When to bring us in

Suspect the site dropped out of Google due to a robots or sitemap mistake? Run a technical SEO audit or have an SEO specialist review Search Console with you.

None of the above fits?

Describe your situation below. We pass your input plus the steps you already saw to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.

Who are you?

For the AI question we need your email and company, so we can follow up if the AI gets stuck, and to prevent abuse.

Limited to 2 questions per hour and 5 per day; we keep it lean so the AI stays useful. For anything more, contacting us directly works better for you and for us.

Or skip the DIY entirely

Our Managed IT clients do not look these things up. One point of contact, a fixed monthly price, resolved within working hours.