Site dropped from Google, robots.txt turns out to have Disallow: /.
A blanket Disallow: / blocks every search engine from the entire site. It is often set accidentally via WordPress's 'Discourage Search Engines' option during a rebuild and never reset afterwards.
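What that one line does to a crawler can be demonstrated with Python's standard-library robots.txt parser. The rules string below is a hypothetical example, not fetched from any real site:

```python
# Sketch: how a blanket "Disallow: /" reads to a crawler, using the
# standard-library robots.txt parser. The rules are a made-up example.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every URL is off-limits to every crawler, including Googlebot.
print(parser.can_fetch("Googlebot", "https://example.com/"))          # False
print(parser.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```

The `*` user-agent line applies the rule to all crawlers, which is why the whole site disappears from every search engine, not just Google.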
Try this first
1. Open yourdomain.tld/robots.txt in the browser. Do you see 'Disallow: /' under a 'User-agent: *' line? That's the cause.
2. WordPress: go to Settings > Reading and untick 'Discourage search engines from indexing this site'. WordPress regenerates robots.txt automatically.
3. Re-check robots.txt after a minute. You should see 'User-agent: *' with specific Disallow rules (e.g. /wp-admin/), not a blanket block of the whole site.
4. Use the URL Inspection tool in Search Console to request indexing for the homepage and a few key pages. This speeds up re-indexing.
5. Check your sitemap submission in Search Console. If it has disappeared, resubmit it.
6. Document the incident and add this check to your launch checklist. It usually happens after a staging-to-live switch.
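The robots.txt checks in steps 1 and 3 can be scripted for that launch checklist. A minimal sketch, again using the standard-library parser; how you fetch the file (curl, urllib, a monitoring tool) is up to you, and the sample rule strings are illustrative:

```python
# Minimal sketch of the checks in steps 1 and 3: given the contents of
# /robots.txt, flag a blanket "Disallow: /" that blocks the whole site.
from urllib.robotparser import RobotFileParser

def blocks_everything(robots_txt: str, agent: str = "Googlebot") -> bool:
    """True if this robots.txt denies the agent access to the site root."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, "/")

# The broken state from step 1: the whole site is blocked.
broken = "User-agent: *\nDisallow: /\n"
print(blocks_everything(broken))   # True

# The healthy state from step 3: only specific paths are blocked.
healthy = "User-agent: *\nDisallow: /wp-admin/\n"
print(blocks_everything(healthy))  # False
```

Run against the live file after every staging-to-live switch and the accidental block is caught in seconds instead of weeks.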
When to bring us in
Site blocked for weeks? Ranking recovery can take time. An SEO specialist can help with re-indexing, a link audit and a content refresh to recover faster.
See also
- WordPress, plugins and theme have gone 6+ months without updates: out-of-date WordPress is the number-one entry point for malware. Don't just hit 'update all'; back up first.
- Theme update broke the layout or threw a fatal error: themes overwrite custom CSS on update unless you use a child theme.
- WordPress shows a blank screen after a plugin install or update: the WSOD (white screen of death) is usually one crashing plugin. You isolate it.
None of the above fits?
Describe your situation below. We pass your input, along with the steps you've already seen, to our AI and return tailored next-step advice. If it's too risky to DIY, we'll say so.
Or skip the DIY entirely
Our Managed IT clients don't have to look these things up. One point of contact, a fixed monthly price, resolved within working hours.