
DNN Summit Session: Multi-Site DNN Instances and Robots.txt

Written By Will Strohl
2021-03-01

Upendo Ventures: DNN SEO and Robots.txt

Our first session at DNN Summit 2021 was a short introduction to search engine optimization (SEO), followed by a deeper dive into managing the Robots.txt file in your DNN website. Management of this critical SEO file is easier than ever on DNN 9.8 and newer. This blog post is a follow-up to provide resources for the event attendees.

In short, you can now manage your robots.txt file(s) directly within DNN itself while logged in as a superuser. You should review this file regularly, and especially before you launch a new website.
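For reference, a robots.txt file is just a plain-text file served from the root of your site that tells crawlers what they may and may not index. Here is a minimal sketch of what one might contain; the disallowed path and sitemap URL are placeholders only, not DNN defaults:

    # Minimal example only -- adjust the paths and URL for your own site
    User-agent: *
    Disallow: /admin/
    Sitemap: https://example.com/sitemap.aspx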

Slide Deck


Multi-Site Support in DNN

To support multiple sites that each have their own robots.txt file, leave the default file as-is for your primary website. Then add a new file for each additional site you wish to support. Usually, this will start as a copy of the original robots.txt, but with the file name prefixed by that site's domain. For example, if the additional site is example.com, then its robots.txt file will simply be named example.com.robots.txt instead of robots.txt, as shown in the sketch below.
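For illustration, the root of the website folder might then look something like this (the localhost.demo* names match the rewrite example later in this post; example.com is just a placeholder):

    robots.txt                     (primary website)
    example.com.robots.txt         (additional site)
    localhost.demo1.robots.txt
    localhost.demo2.robots.txt
    localhost.demo3.robots.txt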

Now, when you're logged in as a superuser, you can edit any of the robots.txt files that are found in the root of the website folder, using the existing Config Manager feature.

Code Sample

In order to get this fully working when requested by search engines and other bots, you'll first need to be sure the IIS URL Rewrite module is installed, then add a rewrite rule for each site, like you see in the example below. You'll want to put the rules inside of the <system.webServer> section of your web.config.


    <rewrite>
      <rules>
        <!-- When the host matches a given site and the request is for robots.txt,
             serve that site's own copy (e.g., localhost.demo1.robots.txt). -->
        <rule name="Robots.txt: Demo Site 1" stopProcessing="true">
          <match url=".+" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="localhost.demo1" />
            <add input="{REQUEST_FILENAME}" pattern="robots.txt" />
          </conditions>
          <action type="Rewrite" url="localhost.demo1.{C:0}" logRewrittenUrl="true" />
        </rule>
        <rule name="Robots.txt: Demo Site 2" stopProcessing="true">
          <match url=".+" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="localhost.demo2" />
            <add input="{REQUEST_FILENAME}" pattern="robots.txt" />
          </conditions>
          <action type="Rewrite" url="localhost.demo2.{C:0}" logRewrittenUrl="true" />
        </rule>
        <rule name="Robots.txt: Demo Site 3" stopProcessing="true">
          <match url=".+" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="localhost.demo3" />
            <add input="{REQUEST_FILENAME}" pattern="robots.txt" />
          </conditions>
          <action type="Rewrite" url="localhost.demo3.{C:0}" logRewrittenUrl="true" />
        </rule>
      </rules>
    </rewrite>

Keeping the example code above in mind, when you are on the localhost.demo1 website (or either of the other demo sites) and request the robots.txt file, you'll see the specific file that's maintained only for that website.
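As a quick sanity check, assuming the demo host names above resolve to your IIS site (for example, via your hosts file), you can confirm that each host serves its own file:

    curl -s http://localhost.demo1/robots.txt
    curl -s http://localhost.demo2/robots.txt
    curl -s http://localhost.demo3/robots.txt

Each request should return the contents of the matching localhost.demoN.robots.txt file instead of the default robots.txt.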

This blog article is cross-posted from our business website.

