Accessing Domain Settings
- Go to your Dashboard
- Select the domain you want to configure
- Click Domain Settings in the header
Custom robots.txt
The robots.txt file tells search engines which pages to crawl and which to ignore.
Some platforms (like Base44) don’t serve a robots.txt file by default. Without one:
- Search engines may not find your sitemap
- You can’t control crawling behavior
- Some pages might get indexed unintentionally
Configuration
In Domain Settings, you can set a custom robots.txt:
```
User-agent: *
Allow: /

# Block admin pages
Disallow: /admin/
Disallow: /api/

# Reference your sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```
Common Directives
| Directive | Meaning |
|---|---|
| `User-agent: *` | Rules apply to all bots |
| `Allow: /` | Allow crawling of all pages |
| `Disallow: /path/` | Block crawling of a specific path |
| `Sitemap: URL` | Location of your sitemap |
Example: Basic robots.txt
```
User-agent: *
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
```
Example: With Restrictions
```
User-agent: *
Allow: /
Disallow: /dashboard/
Disallow: /api/
Disallow: /admin/
Disallow: /private/
Sitemap: https://yourdomain.com/sitemap.xml
```
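Before saving, you can sanity-check your rules with `urllib.robotparser` from Python's standard library. One caveat in this sketch: Python's parser honors the first matching rule, so the blanket `Allow: /` from the example is omitted here (paths are allowed by default when nothing matches); major search engines instead apply the most specific rule. The domain and test paths are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Rules from the restricted example above, minus the blanket "Allow: /"
# (urllib.robotparser applies the first matching rule, and paths are
# allowed by default when no rule matches)
rules = """\
User-agent: *
Disallow: /dashboard/
Disallow: /api/
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check which paths a generic crawler may fetch
for path in ["/", "/pricing", "/admin/users", "/api/v1/data"]:
    url = f"https://yourdomain.com{path}"
    print(path, "->", "allowed" if parser.can_fetch("*", url) else "blocked")
```

Here `/` and `/pricing` print allowed, while `/admin/users` and `/api/v1/data` print blocked.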
Custom sitemap.xml
A sitemap helps search engines discover all your pages.
- Discovery - ensures all pages are found
- Priority - indicates which pages are most important
- Freshness - shows when pages were last updated
- Faster indexing - helps search engines crawl more efficiently
Configuration
In Domain Settings, you can set a custom sitemap.xml:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://yourdomain.com/about</loc>
    <lastmod>2024-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://yourdomain.com/pricing</loc>
    <lastmod>2024-01-12</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.9</priority>
  </url>
  <url>
    <loc>https://yourdomain.com/blog</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.7</priority>
  </url>
</urlset>
```
Sitemap Fields
| Field | Required | Description |
|---|---|---|
| `loc` | Yes | Full URL of the page |
| `lastmod` | No | Last modification date (YYYY-MM-DD) |
| `changefreq` | No | How often the page changes (daily, weekly, monthly) |
| `priority` | No | Importance relative to other pages (0.0 to 1.0) |
Priority Guidelines
| Priority | Use For |
|---|---|
| 1.0 | Homepage |
| 0.8-0.9 | Key landing pages, pricing |
| 0.6-0.7 | Blog, about, contact |
| 0.4-0.5 | Individual blog posts |
| 0.1-0.3 | Legal pages, archives |
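For more than a handful of pages, it is usually easier to generate the sitemap than to hand-edit it. Below is a minimal sketch using Python's standard library; the page list and domain are illustrative placeholders, and the output matches the example sitemap above.

```python
import xml.etree.ElementTree as ET

# Illustrative page data: (path, lastmod, changefreq, priority)
pages = [
    ("/", "2024-01-15", "weekly", "1.0"),
    ("/pricing", "2024-01-12", "monthly", "0.9"),
    ("/about", "2024-01-10", "monthly", "0.8"),
    ("/blog", "2024-01-15", "daily", "0.7"),
]

urlset = ET.Element(
    "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
)

for path, lastmod, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = f"https://yourdomain.com{path}"
    ET.SubElement(url, "lastmod").text = lastmod        # optional
    ET.SubElement(url, "changefreq").text = changefreq  # optional
    ET.SubElement(url, "priority").text = priority      # optional

tree = ET.ElementTree(urlset)
ET.indent(tree)  # pretty-print (Python 3.9+)
tree.write("sitemap.xml", encoding="UTF-8", xml_declaration=True)
```

Paste the contents of the generated file into the sitemap.xml field in Domain Settings.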
When Custom Files Are Required
Base44 apps require custom robots.txt and sitemap.xml configuration because Base44 doesn’t serve these files natively. A warning banner will appear in your dashboard if configuration is needed.
Saving Changes
- Edit the robots.txt or sitemap.xml content
- Click Save Changes
- Changes take effect immediately
- Trigger a Recrawl to update cached versions
Verification
After configuring:
- Visit https://yourdomain.com/robots.txt to verify it is served
- Visit https://yourdomain.com/sitemap.xml to verify it is served (see the check script below)
- Submit sitemap to Google Search Console
- Check for crawl errors in Search Console
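The first two checks can be scripted. Here is a minimal sketch using only Python's standard library; replace yourdomain.com with your actual domain. The namespace URL is the standard sitemap schema.

```python
import urllib.request
import xml.etree.ElementTree as ET

DOMAIN = "https://yourdomain.com"  # replace with your domain

# robots.txt should return 200 and reference your sitemap
with urllib.request.urlopen(f"{DOMAIN}/robots.txt") as resp:
    body = resp.read().decode()
    print("robots.txt status:", resp.status)
    print("sitemap referenced:", "Sitemap:" in body)

# sitemap.xml should return 200 and parse as well-formed XML
with urllib.request.urlopen(f"{DOMAIN}/sitemap.xml") as resp:
    root = ET.fromstring(resp.read())
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    locs = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
    print(f"sitemap.xml lists {len(locs)} URLs")
```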
Domain Deletion
To remove a domain from Hado SEO:
- Go to Domain Settings
- Scroll to the Danger Zone
- Click Delete Domain
- Confirm deletion
Deleting a domain will:
- Stop prerendering immediately
- Remove all analytics data
- Require reconfiguration to restore
Make sure to update your DNS records after deletion.