Sitemap Glossary

What is the Sitemap Directive in robots.txt?

The `Sitemap:` line in a robots.txt file tells crawlers where to find a site's XML sitemaps.

The `Sitemap:` directive in robots.txt is a standardized way to tell search engine crawlers where to find a website's XML sitemaps. It can appear anywhere in the file (commonly at the top or bottom), and it can be listed multiple times to declare multiple sitemaps.

The directive is not tied to any `User-agent` block — it applies globally to all crawlers. The URL must be absolute (including protocol and domain).

This is one of the most reliable methods of sitemap discovery: crawlers fetch robots.txt before crawling a site, so sitemap URLs listed there are found automatically, without submitting them to each search engine individually.

Example

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-news.xml
```
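As a rough illustration of how a crawler discovers these URLs, here is a minimal Python sketch (a simplified example, not any particular crawler's implementation) that extracts `Sitemap:` directives from robots.txt content. It matches the directive name case-insensitively and ignores `User-agent` grouping, since the directive applies globally:

```python
def extract_sitemaps(robots_txt: str) -> list[str]:
    """Return the URLs declared by Sitemap: directives in robots.txt text.

    The directive name is case-insensitive and is not tied to any
    User-agent block, so every line of the file is scanned.
    """
    sitemaps = []
    for line in robots_txt.splitlines():
        # Drop trailing comments and surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("sitemap:"):
            # Split on the first colon only; the URL itself contains colons.
            url = line.split(":", 1)[1].strip()
            if url:
                sitemaps.append(url)
    return sitemaps


robots = """User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-news.xml
"""
print(extract_sitemaps(robots))
# → ['https://example.com/sitemap.xml', 'https://example.com/sitemap-news.xml']
```

In a real crawler you would first fetch `https://<domain>/robots.txt` and then pass the response body to a function like this; the absolute-URL requirement means each extracted value can be fetched directly.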

Work with sitemaps programmatically

SitemapKit's API lets you discover, extract, and parse XML sitemaps from any domain. Get structured JSON data for all sitemap elements, including sitemap URLs discovered via the `Sitemap:` directive in robots.txt.
