How Do I Edit 'robots.txt'?

A "robots.txt" file is generated automatically and can be accessed by adding "/robots.txt" to the end of your website address, for example "https://example.com/robots.txt".

This file contains the following directives:

  • User-Agent: *. Indicates that the directives below apply to all web crawlers.
  • Allow and Disallow. By default, all pages of your website are indexed by search engines; however, you can hide specific pages from indexing with Disallow.
  • Sitemap. Tells search engines where to find the sitemap. A "sitemap.xml" file is generated automatically.
  • Host. This directive appears when the website is connected to one or more domains and tells web crawlers which domain is the primary one.
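Put together, a generated "robots.txt" might look like the sketch below. The "/admin/" path and the "example.com" domain are hypothetical placeholders, not values your site will necessarily use:

```
User-agent: *                              # the rules below apply to all crawlers
Disallow: /admin/                          # hypothetical section hidden from indexing
Allow: /                                   # everything else may be indexed
Sitemap: https://example.com/sitemap.xml   # where crawlers can find the sitemap
Host: example.com                          # primary domain, if several are connected
```

Each directive sits on its own line, and lines starting with "#" are comments that crawlers ignore.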