How to serve sitemap.xml with Next.js
For this blog to not just add entropy but actually be useful, it should be indexed by search engines. And to make indexing easier, search engines rely on two files: robots.txt and a sitemap.
sitemap.xml - a list of the pages available for crawling. Optionally, you can specify how often each page changes and when it was last modified, but as far as I know, these optional fields are ignored. More details can be found here.
robots.txt - instructions for robots. In our case, it simply points to the path where the sitemap can be fetched. More details can be found here.
For our site to serve sitemap.xml, create a sitemap.xml.tsx file in the pages folder (change the site URL in the code to your own):
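A minimal sketch of what such a file can look like, assuming the pages router and a hard-coded page list (`SITE_URL` and the paths in `pages` are placeholders; a real blog would likely generate the list from its posts):

```typescript
// pages/sitemap.xml.tsx
// Next.js serves this file at /sitemap.xml; getServerSideProps writes
// the XML directly to the response, so the component never renders.

const SITE_URL = "https://example.com"; // change this to your site URL

// Placeholder page list; generate this from your content in practice.
const pages = ["/", "/about", "/posts/hello-world"];

export function buildSitemap(paths: string[]): string {
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${paths.map((p) => `  <url><loc>${SITE_URL}${p}</loc></url>`).join("\n")}
</urlset>`;
}

// Typed loosely to keep the sketch self-contained; in a real project
// you would import GetServerSideProps from "next".
export async function getServerSideProps({ res }: { res: any }) {
  res.setHeader("Content-Type", "application/xml");
  res.write(buildSitemap(pages));
  res.end();
  return { props: {} };
}

// Never rendered: getServerSideProps already ended the response.
export default function Sitemap() {
  return null;
}
```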
In the public folder, create the robots.txt file:
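It only needs to allow crawling and point to the sitemap; a minimal version could look like this (the domain is a placeholder for your own):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Everything in the public folder is served from the site root, so this file will be available at /robots.txt.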
As usual, all the code is on GitHub. If you find a mistake, or know a better way to do this, please open an issue.