How to Add a Sitemap in Astro Starlight
I set up a sitemap for a project built with Starlight, Astro's documentation theme, so here's a note for future reference.
Sitemap Configuration in Starlight
Starlight has a built-in sitemap generation feature.
To enable the sitemap, you just need to set the URL in the `site` option of the `astro.config.mjs` file.
```js
// astro.config.mjs
import { defineConfig } from 'astro/config';
import starlight from '@astrojs/starlight';

export default defineConfig({
  site: 'https://example.com',
  integrations: [
    starlight({ title: 'Site with Sitemap Configured' }),
  ],
});
```
With this configuration, a sitemap will be automatically generated during the build process.
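For reference, the generated index is a small XML file that points to the actual sitemap(s). A sketch of its typical shape is below; the exact child filenames (such as `sitemap-0.xml`) depend on your build output:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-0.xml</loc>
  </sitemap>
</sitemapindex>
```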
Setting up robots.txt
After configuring the sitemap, the next step is to add a link to the sitemap in the robots.txt file.
This makes it easier for crawlers to find the sitemap.
Creating a Static robots.txt File
Use this method if you want to manage robots.txt as a plain static file.
Create a `public/robots.txt` file and add the following content:
```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap-index.xml
```
Generating a Dynamic robots.txt File
You can also generate robots.txt dynamically by reusing the `site` value from `astro.config.mjs`.
Instead of placing a static file in the `public/` directory, create a `src/pages/robots.txt.ts` file and add the following code:
```ts
// src/pages/robots.txt.ts
import type { APIRoute } from 'astro';

const getRobotsTxt = (sitemapURL: URL) => `User-agent: *
Allow: /

Sitemap: ${sitemapURL.href}
`;

export const GET: APIRoute = ({ site }) => {
  const sitemapURL = new URL('sitemap-index.xml', site);
  return new Response(getRobotsTxt(sitemapURL));
};
```
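The key step here is `new URL('sitemap-index.xml', site)`, which resolves the sitemap path against the configured site origin using standard WHATWG URL resolution. A quick sketch of how that resolution behaves, runnable in plain Node.js with no Astro involved (the URLs are just examples):

```javascript
// The second argument to the URL constructor is the base URL.
const sitemapURL = new URL('sitemap-index.xml', 'https://example.com');
console.log(sitemapURL.href); // https://example.com/sitemap-index.xml

// Note: if the site is served from a subpath, the trailing slash matters —
// a base ending in '/docs/' keeps that segment, while '/docs' would drop it.
const subpathURL = new URL('sitemap-index.xml', 'https://example.com/docs/');
console.log(subpathURL.href); // https://example.com/docs/sitemap-index.xml
```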
This method automatically uses the `site` URL set in `astro.config.mjs`, so if the URL ever changes, you only need to modify it in one place.
Verifying the Sitemap
After completing the configuration, run the build to check if the sitemap is generated correctly.
```sh
# use whichever package manager your project uses
pnpm build
npm run build
yarn build
```
After building, you can open https://example.com/sitemap-index.xml in your browser to check the contents of the sitemap.
Once you've added the sitemap, submit it with tools like Google Search Console to notify crawlers.