Free Online XML Sitemap Generator


XML Sitemap Generator



About XML Sitemap Generator


Use our free tool to easily produce an XML sitemap that alerts search engines such as Google, Bing, and Yandex to all of your web pages and any changes to them, helping ensure that your entire site is properly indexed.

In the form, provide your full website URL, along with any optional parameters.
Click the 'Generate Sitemap' button and wait until the site has been crawled completely.
You will be taken to the generated sitemap's details page, which shows the number of pages, a list of broken links, the contents of the XML file, and a link to the sitemap file. Download the sitemap file from that page and save it to your site's domain root folder.
Submit your sitemap URL in your Google Search Console account.
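The file you download in the steps above follows the standard sitemap protocol. A minimal example (the URLs and dates here are illustrative placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Only `<loc>` is required for each URL; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints corresponding to the parameters in the form above.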



XML stands for Extensible Markup Language. It is a machine-readable file format that can be parsed by search engines and other data-consuming applications such as feed readers.

An XML sitemap is a document that helps Google and other major search engines better understand your website while crawling it.

It essentially lists a site's URLs (pages) in an organised manner, allowing you (the webmaster) to include additional information about each URL.

This includes details such as:

  • When the page was last updated
  • How frequently it changes
  • How it relates to other URLs on the site
  • How important it is to the site as a whole, and so on.
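Because those details follow a fixed structure, building a sitemap programmatically is straightforward. A minimal sketch using Python's standard library (the page list and field names here are illustrative, not part of this tool):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemap protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build sitemap XML from a list of dicts.

    Each dict needs a 'loc' key; 'lastmod', 'changefreq', and
    'priority' are optional, matching the protocol.
    """
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        for field in ("lastmod", "changefreq", "priority"):
            if field in page:
                ET.SubElement(url, field).text = page[field]
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    {"loc": "https://example.com/", "lastmod": "2024-01-15",
     "changefreq": "weekly", "priority": "1.0"},
])
print(sitemap_xml)
```

The resulting string can be written to `sitemap.xml` in the site's root folder, as described in the steps above.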



You want your website to appear near the top of search engine results pages (SERPs), right?

Everyone does, of course!

However, for your site to be indexed and subsequently ranked, search engines such as Google must "crawl" it frequently.

They do this in order to provide the most recent content in search results.

Search bots may crawl a site numerous times each day, especially if new items are posted throughout the day, as is the case with news sites.

The crawl process is primarily algorithmic, which means that computer algorithms decide how frequently search bots should crawl each site.
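A sitemap generator works much like those search bots: it fetches a page, extracts its same-domain links, and repeats until the whole site is covered. A minimal link-extraction sketch using only the Python standard library (the HTML snippet and domain are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects same-domain link targets from <a> tags on one page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                absolute = urljoin(self.base_url, value)
                # A sitemap lists one site's pages, so drop
                # external links and fragment anchors.
                if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
                    self.links.add(absolute.split("#")[0])

# One crawl step over a fetched page's HTML:
html = '<a href="/about">About</a> <a href="https://other.example/x">Ext</a>'
extractor = LinkExtractor("https://example.com/")
extractor.feed(html)
print(sorted(extractor.links))  # ['https://example.com/about']
```

A full generator would fetch each newly found URL in turn, keeping a visited set to avoid loops, and then emit the collected URLs as sitemap entries.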