Hi,
I got a question from a client about the security information our robots.txt file exposes: it currently lists a number of directories. Wouldn't it be better if these directories weren't exposed at all?
My client (based on some pen-test advice) is now considering a whitelist approach in robots.txt. I don't think that makes sense.
I'd like to hear your opinions.
Currently it contains:
# Begin robots.txt file
#/-----------------------------------------------\
#| In single portal/domain situations, uncomment the sitemap line and enter the domain name
#\-----------------------------------------------/
#Sitemap: http://www.DomainNamehere.com/sitemap.aspx

User-agent: *
Disallow: /*/ctl/                 # Googlebot permits *
Disallow: /admin/
Disallow: /App_Browsers/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /App_GlobalResources/
Disallow: /bin/
Disallow: /Components/
Disallow: /Config/
Disallow: /contest/
Disallow: /controls/
Disallow: /Documentation/
Disallow: /HttpModules/
Disallow: /Install/
Disallow: /Providers/
Disallow: /Activity-Feed/userId/  # Do not index user profiles
# End of robots.txt file
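For comparison, a whitelist-style robots.txt (the approach the pen testers apparently suggested) would allow only the intended public paths and then disallow everything else. This is only a sketch; the allowed paths below are placeholders, not our actual site structure:

```
# Hypothetical whitelist-style robots.txt (example paths only)
User-agent: *
Allow: /sitemap.aspx
Allow: /Public-Pages/
# Everything not explicitly allowed above is blocked:
Disallow: /
```

Note that Allow is not part of the original robots.txt standard, although major crawlers such as Googlebot and Bingbot support it. Also keep in mind that robots.txt is advisory only: it does not restrict access, so listing (or hiding) a directory there is not an access control.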