If your pages aren’t appearing in Google search results, work through these common causes:

1. Sitemap Missing or Inaccessible

Google relies on your sitemap to discover your pages. If it can't access the sitemap, pages may never be indexed. How to check:
  1. Visit https://yourdomain.com/sitemap.xml in your browser
  2. You should see XML with your page URLs
  3. Check Google Search Console > Sitemaps for errors
Common sitemap issues:
  • Sitemap returns 404 (not created)
  • Sitemap is empty or malformed XML
  • URLs in sitemap use wrong domain (e.g., lovable.app instead of your custom domain)
Use our SEO Bot Crawler Test to verify what Google sees when visiting your sitemap.
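The sitemap checks above can also be scripted. A minimal sketch in Python (standard library only; the audit logic is illustrative, not part of our tooling):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace; page URLs live in <loc> elements
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(xml_bytes, expected_domain):
    """Return a list of problems found in sitemap XML (empty list = looks healthy)."""
    problems = []
    try:
        root = ET.fromstring(xml_bytes)
    except ET.ParseError as e:
        return [f"malformed XML: {e}"]
    urls = [loc.text or "" for loc in root.findall(".//sm:loc", NS)]
    if not urls:
        problems.append("sitemap is empty (no <loc> entries)")
    # e.g. URLs still pointing at lovable.app instead of your custom domain
    wrong = [u for u in urls if expected_domain not in u]
    if wrong:
        problems.append(f"{len(wrong)} URL(s) on an unexpected domain, e.g. {wrong[0]}")
    return problems
```

To run it against a live site, fetch the sitemap bytes with `urllib.request.urlopen("https://yourdomain.com/sitemap.xml")` and pass them in; an HTTPError with code 404 corresponds to the "not created" case above.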

2. robots.txt Blocking Crawlers

A misconfigured robots.txt can prevent Google from crawling your site. How to check:
  1. Visit https://yourdomain.com/robots.txt
  2. Ensure it doesn’t contain Disallow: / for all user agents
Good robots.txt:
User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
Bad robots.txt (blocks everything):
User-agent: *
Disallow: /
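You can test a robots.txt against Googlebot programmatically with Python's built-in urllib.robotparser. A small sketch (the URL is a placeholder):

```python
from urllib import robotparser

def googlebot_can_fetch(robots_txt: str, url: str) -> bool:
    """Check whether Googlebot is allowed to crawl `url` under this robots.txt."""
    rp = robotparser.RobotFileParser()
    # parse() takes the file's lines, so no network fetch is needed
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", url)
```

With the "bad" file above, `googlebot_can_fetch("User-agent: *\nDisallow: /", "https://yourdomain.com/")` returns False; with the "good" file it returns True.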

3. Orphan Pages (No Internal Links)

Google discovers pages by following links. Orphan pages, with no links pointing to them, are hard for crawlers to find. Solutions:
  • Add navigation links to all important pages
  • Include pages in your sitemap
  • Get external sites to link to your content
  • Submit URLs directly in Google Search Console
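One way to spot orphan pages is to compare the URLs in your sitemap against the set of internally linked URLs. A hypothetical sketch (in practice you would gather internal_links by crawling your own site):

```python
def find_orphans(sitemap_urls, internal_links, root):
    """Return URLs listed in the sitemap that no page links to.

    sitemap_urls:   set of URLs taken from your sitemap
    internal_links: dict mapping each page URL to the set of URLs it links out to
    root:           your homepage, which is reachable directly and never counts as orphaned
    """
    linked = set()
    for targets in internal_links.values():
        linked |= set(targets)
    return {u for u in sitemap_urls if u not in linked and u != root}
```

Any URL this returns should either get a navigation link or be submitted directly in Search Console.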

4. Site Is Too New

Google doesn’t index sites instantly. New sites can take days to weeks to appear. Speed up indexing:
  1. Submit your sitemap in Google Search Console
  2. Use “Request Indexing” for important pages
  3. Share your site on social media to generate discovery signals

5. JavaScript Content Not Being Prerendered

If Hado SEO isn’t active, Google sees your raw JavaScript instead of rendered content. How to verify prerendering is working:
  1. Use our SEO Bot Crawler Test
  2. Enter your URL
  3. Check that the rendered HTML contains your actual content, not just <div id="root"></div>
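You can approximate that check locally by fetching the page with a Googlebot User-Agent and looking for an empty SPA shell. A sketch (the empty-shell heuristic and helper names are assumptions, not how the SEO Bot Crawler Test works internally):

```python
import re
import urllib.request

# An unrendered single-page app often serves only an empty mount point
EMPTY_SHELL = re.compile(r'<div id="root">\s*</div>')

def looks_prerendered(html: str, expected_text: str) -> bool:
    """True if the HTML carries real page content rather than an empty SPA shell."""
    if EMPTY_SHELL.search(html):
        return False
    return expected_text.lower() in html.lower()

def fetch_as_googlebot(url: str) -> str:
    """Fetch a page the way a crawler identifying itself as Googlebot would."""
    req = urllib.request.Request(url, headers={"User-Agent": "Googlebot"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

For example, `looks_prerendered(fetch_as_googlebot("https://yourdomain.com/"), "Your Headline")` should be True when prerendering is active.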