indexing - Should the sitemap be disallowed in robots.txt? And robots.txt itself? -


This is a basic question, but I can't find a direct answer anywhere online. When I search for my website on Google, sitemap.xml and robots.txt are returned as search results (amongst more useful results). To prevent this, should I add the following lines to robots.txt?

Disallow: /sitemap.xml
Disallow: /robots.txt

Won't this stop search engines from accessing the sitemap or robots file?
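The proposed rules would indeed block a standards-compliant crawler from fetching both files. This can be checked with Python's standard-library robots.txt parser (the site URL here is just a placeholder):

```python
# Sketch: how a compliant crawler interprets the proposed robots.txt
# rules, using Python's stdlib parser. www.mysite.com is hypothetical.
import urllib.robotparser

rules = """User-agent: *
Disallow: /sitemap.xml
Disallow: /robots.txt
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The sitemap would be blocked for all user agents...
print(rp.can_fetch("*", "http://www.mysite.com/sitemap.xml"))  # False
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("*", "http://www.mysite.com/index.html"))   # True
```

Note that crawlers must fetch robots.txt itself before they can read any Disallow rules, so disallowing /robots.txt has no practical effect on whether it is retrieved.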

Also (or instead), should I use Google's URL removal tool?

You won't stop crawlers from indexing robots.txt, because it's a chicken-and-egg situation; however, if you aren't pointing Google and other search engines directly at your sitemap, you lose indexing weight by denying sitemap.xml. Is there a particular reason why you don't want users to be able to see the sitemap? Specific to the Google crawler:

User-agent: Googlebot
Allow: /

# Sitemap
Sitemap: http://www.mysite.com/sitemap.xml
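If the goal is only to keep sitemap.xml out of search results while still letting crawlers read it, one approach (a suggestion beyond the original answer, not part of it) is to serve an X-Robots-Tag: noindex HTTP header for that file instead of disallowing it. For example, in an Apache .htaccess file with mod_headers enabled:

```apache
# Let crawlers fetch sitemap.xml, but ask them not to list it in results.
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```

This works because a Disallow rule only blocks crawling; a blocked URL can still appear in results, whereas a noindex header is honored precisely because the crawler is still allowed to fetch the file and see it.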
