Accessing Domain Settings

  1. Go to your Dashboard
  2. Select the domain you want to configure
  3. Click Domain Settings in the header

Custom robots.txt

The robots.txt file tells search engines which pages to crawl and which to ignore.

Why Configure robots.txt?

Some platforms (like Base44) don’t serve a robots.txt file by default. Without one:
  • Search engines may not find your sitemap
  • You can’t control crawling behavior
  • Some pages might get indexed unintentionally

Configuration

In Domain Settings, you can set a custom robots.txt:
User-agent: *
Allow: /

# Block admin pages
Disallow: /admin/
Disallow: /api/

# Reference your sitemap
Sitemap: https://yourdomain.com/sitemap.xml

Common Directives

Directive         Meaning
User-agent: *     Rules apply to all bots
Allow: /          Allow crawling of all pages
Disallow: /path/  Block crawling of a specific path
Sitemap: URL      Location of your sitemap

Example: Basic robots.txt

User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml

Example: With Restrictions

User-agent: *
Allow: /
Disallow: /dashboard/
Disallow: /api/
Disallow: /admin/
Disallow: /private/

Sitemap: https://yourdomain.com/sitemap.xml
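
Example: Targeting a Specific Bot

Rules can also be scoped to a single crawler by naming it in the User-agent line instead of using *. The sketch below blocks one crawler from a staging path; the bot name and path are illustrative placeholders:

User-agent: *
Allow: /

# This group applies only to the named crawler.
# A named crawler follows its own group rather than the * group.
User-agent: ExampleBot
Disallow: /staging/

Sitemap: https://yourdomain.com/sitemap.xml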

Custom sitemap.xml

A sitemap helps search engines discover all your pages.

Why Configure a Sitemap?

  • Discovery - Ensures all pages are found
  • Priority - Indicates which pages are most important
  • Freshness - Shows when pages were last updated
  • Faster indexing - Helps search engines crawl more efficiently

Configuration

In Domain Settings, you can set a custom sitemap.xml:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://yourdomain.com/about</loc>
    <lastmod>2024-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://yourdomain.com/pricing</loc>
    <lastmod>2024-01-12</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>https://yourdomain.com/blog</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.7</priority>
  </url>
</urlset>

Sitemap Fields

Field       Required  Description
loc         Yes       Full URL of the page
lastmod     No        Last modification date (YYYY-MM-DD)
changefreq  No        How often the page changes (e.g., daily, weekly, monthly)
priority    No        Importance relative to other pages (0.0 to 1.0)
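
Since loc is the only required field, a minimal valid sitemap can omit the rest. A stripped-down sketch using the same placeholder domain as above:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
  </url>
  <url>
    <loc>https://yourdomain.com/about</loc>
  </url>
</urlset>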

Priority Guidelines

Priority  Use For
1.0       Homepage
0.8-0.9   Key landing pages, pricing
0.6-0.7   Blog, about, contact
0.4-0.5   Individual blog posts
0.1-0.3   Legal pages, archives

When Custom Files Are Required

Base44 apps require custom robots.txt and sitemap.xml configuration because Base44 doesn’t serve these files natively. A warning banner will appear in your dashboard if configuration is needed.

Saving Changes

  1. Edit the robots.txt or sitemap.xml content
  2. Click Save Changes
  3. Changes take effect immediately
  4. Trigger a Recrawl to update cached versions

Verification

After configuring:
  1. Visit https://yourdomain.com/robots.txt to verify
  2. Visit https://yourdomain.com/sitemap.xml to verify
  3. Submit sitemap to Google Search Console
  4. Check for crawl errors in Search Console
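
If you prefer to verify from the command line rather than a browser, a small script can confirm both files are being served. This is a generic sketch, not part of Hado SEO; yourdomain.com is a placeholder for your configured domain:

import requests  # third-party: pip install requests

DOMAIN = "https://yourdomain.com"  # placeholder; use your actual domain

for path in ("/robots.txt", "/sitemap.xml"):
    url = DOMAIN + path
    response = requests.get(url, timeout=10)
    # A 200 status means the file is being served at the expected path
    print(url, "->", response.status_code)
    # Show the first few lines so you can eyeball the content
    print("\n".join(response.text.splitlines()[:5]))
    print("---")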

Domain Deletion

To remove a domain from Hado SEO:
  1. Go to Domain Settings
  2. Scroll to the Danger Zone
  3. Click Delete Domain
  4. Confirm deletion

Deleting a domain will:
  • Stop prerendering immediately
  • Remove all analytics data
  • Require reconfiguration to restore

Make sure to update your DNS records after deletion.