Do I need the robots.txt?
Author: Simon C.
Visited 1902,
Followers 1,
Shared 56
Google Robot. How do I prevent Google from indexing specific pages on my website? Google Webmaster says that I should mark the pages concerned with this metatag ' but I do not know how to do this, as I am new to website building. Any suggestions would be most welcome. Simon Cowan
Posted on
robots.txt is used to block crawlers from certain folders on your server. It has to be maintained manually, taking into account the folders actually present on your server.
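For reference, a minimal robots.txt sketch, placed in the site root; the folder name /private/ here is just a placeholder, not something from your site:

```text
# Example robots.txt - goes in the root of the website
# "/private/" is a placeholder folder name
User-agent: *
Disallow: /private/
```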
To index (or not index) a page on your website, simply select the page you want in Step 2 (Map) and click to access the page properties. There you can uncheck the Sitemap option (Expert tab).
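If you would rather use the metatag approach Google Webmaster suggests, the standard robots noindex tag goes inside the page's head section, for example:

```html
<!-- Asks search engines not to index this page -->
<head>
  <meta name="robots" content="noindex">
</head>
```

Note that for the tag to be seen, the page must not also be blocked in robots.txt, since a blocked page is never crawled.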