
Sitemap API for Web Scraping

Replace full-site crawls with sitemap extraction

Instead of crawling an entire website page by page, use sitemap extraction to get all target URLs in a single API call. This is faster, cheaper, and more polite to the target server.

Why use SitemapKit?

  • Get all URLs in one API call instead of crawling page by page
  • 10-100x faster than a full site crawl
  • Respectful to target servers — no crawl pressure
  • Includes lastmod to skip unchanged pages

Example

import requests

# Get all product URLs from an e-commerce site
resp = requests.post(
    "https://sitemapkit.com/api/v1/sitemap/full",
    headers={"x-api-key": "YOUR_API_KEY"},
    json={"url": "shop.example.com"},
    timeout=60,
)
resp.raise_for_status()  # fail fast on auth or server errors

urls = resp.json()["urls"]
product_urls = [u["loc"] for u in urls if "/products/" in u["loc"]]
print(f"Found {len(product_urls)} product pages to scrape")
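The lastmod field makes incremental scraping straightforward: compare each entry's timestamp against a cutoff and only re-scrape what changed. A minimal sketch, using a hypothetical sample of the urls array and assuming lastmod is an ISO 8601 timestamp (the standard sitemap format):

```python
from datetime import datetime

# Hypothetical sample of entries from the "urls" array,
# assuming each carries an ISO 8601 "lastmod" timestamp.
urls = [
    {"loc": "https://shop.example.com/products/a", "lastmod": "2024-06-01T10:00:00+00:00"},
    {"loc": "https://shop.example.com/products/b", "lastmod": "2023-01-15T08:30:00+00:00"},
]

# Only re-scrape pages modified since the last run.
last_run = datetime.fromisoformat("2024-01-01T00:00:00+00:00")
changed = [
    u["loc"]
    for u in urls
    if datetime.fromisoformat(u["lastmod"]) > last_run
]
print(f"{len(changed)} pages changed since last run")
```

On a repeat crawl this typically cuts the scrape down to a small fraction of the site, since most pages don't change between runs.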

Start using SitemapKit for web scraping

Free tier includes 100 API calls/month. No credit card required.
