Set up "sitemap.xml" and "robots.txt" in Nuxt3


Hi, I’m Lovefield.

"sitemap.xml" and "robots.txt" are all but essential when you build a web site, because in the online world customers arrive through search. Creating robots.txt in Nuxt3 is simple: just place a 'robots.txt' file in the 'public' folder. But we won't do it that way. When you develop a site, there is usually a mode that depends on the server environment, typically divided into "prod", "stage", and "dev". In "prod", the robots information should allow bots to crawl the site freely, while in "stage" bots should not collect site information at all. A single static file cannot cover both cases. In this article, I will describe how to set up "sitemap.xml" and "robots.txt" in Nuxt3.

The feature you need to know about is Server Routes. Nuxt3 provides a variety of features through the 'server' folder, and here we will use the Server Routes feature. If you create a file such as ~/server/routes/hello.ts, it becomes accessible at "https://userdomain.com/hello". The contents of the file look like this:

export default defineEventHandler(() => 'Hello World!');

1. Set robots.txt

Create a server/routes/robots.txt.ts file.

export default defineEventHandler((event) => {
    const config = useRuntimeConfig();

    setResponseHeader(event, "content-type", "text/plain");

    if (config.public.mode === "prod") {
        return `User-agent: *\nAllow: /\nSitemap: ${config.public.domain}/sitemap.xml`;
    } else {
        return "User-agent: *\nDisallow: /";
    }
});

First, use the setResponseHeader function to specify the content-type of the response. robots.txt uses the "text/plain" type. I put the site's mode information in RuntimeConfig. The handler returns "User-agent: *\nAllow: /\nSitemap: ${config.public.domain}/sitemap.xml" only when the site is in the "prod" state, and "User-agent: *\nDisallow: /" otherwise. Easier than you thought, right?
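For reference, here is a minimal sketch of how the `mode` and `domain` values used above could be defined in `nuxt.config.ts`. The key names `mode` and `domain` are conventions from this article, not built-in Nuxt options, so name them however you like:

```typescript
// nuxt.config.ts — minimal sketch; "mode" and "domain" are
// arbitrary keys used by this article, not Nuxt defaults.
export default defineNuxtConfig({
    runtimeConfig: {
        public: {
            // Values under "public" are exposed to both server and client.
            // These can be overridden at runtime via environment variables
            // such as NUXT_PUBLIC_MODE and NUXT_PUBLIC_DOMAIN.
            mode: process.env.NUXT_PUBLIC_MODE || "dev",
            domain: "https://userdomain.com",
        },
    },
});
```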

2. Set sitemap.xml

Create a server/routes/sitemap.xml.ts file.

export default defineEventHandler((event) => {
    const config = useRuntimeConfig();
    let document: string = "";

    setResponseHeader(event, "content-type", "application/xml");

    document += `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">`;
    document += `...`;
    document += `</urlset>`;


    return document;
});

The basic idea is not much different from creating robots.txt. However, a sitemap must also include dynamic URLs. To get those, you need to communicate with the server, but unfortunately the composables provided by Nuxt3 are not available inside defineEventHandler. That means useFetch and useAsyncData cannot be used. You must use the fetch function here instead.

export default defineEventHandler(async (event) => {
    const config = useRuntimeConfig();
    const data = await fetch("url").then((res) => res.json()); // Request API
    let document: string = "";

    setResponseHeader(event, "content-type", "application/xml");

    document += `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">`;
    document += `...`;
    document += `</urlset>`;


    return document;
});

Now you can fill in the document body by following the sitemap format described at https://www.sitemaps.org/protocol.html.
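As a sketch, the dynamic part of the document could be assembled with a small helper like the one below. The `Post` shape and the `/posts/` URL pattern are assumptions for illustration; adapt them to whatever your API actually returns:

```typescript
// Hypothetical shape of the API response — adapt to your own data.
interface Post {
    slug: string;
    updatedAt: string; // ISO date, e.g. "2024-05-01"
}

// Builds one <url> entry per dynamic page for the <urlset> body.
function buildUrlEntries(domain: string, posts: Post[]): string {
    return posts
        .map(
            (post) =>
                `<url><loc>${domain}/posts/${post.slug}</loc>` +
                `<lastmod>${post.updatedAt}</lastmod></url>`,
        )
        .join("");
}
```

Inside the event handler, you would then concatenate the result of `buildUrlEntries(config.public.domain, data)` between the opening and closing `<urlset>` tags.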

Bonus. Set healthcheck

Create a server/routes/healthcheck.ts file.

export default defineEventHandler(() => "OK");

Lovefield

Web Front-End developer

A quirky developer who wants to do many things and build services of my own.