Details
- Type: New Feature
- Status: Closed
- Priority: Normal
- Resolution: Fixed
- 7.9.14
- 0.1
- Quasar
- Puma Sprint 233
Description
The problem is that while we do support multiple hosts, we can serve only one global robots.txt. This is a problem when using multiple subsites.
After a brief discussion with JH, we came up with the following ideas:
- Make it context-aware / integrate it with the hst:hosts configuration (see the sketch after this list)
- Have a default configuration, and allow robots.txt documents/compounds to define which domain they apply to (independent of the HST-configured hosts)
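As an illustration of the first idea, below is a minimal sketch of a context-aware robots.txt lookup. It is written as a plain servlet rather than against the plugin's or HST's actual APIs; the class name PerHostRobotsServlet and the in-memory map are hypothetical and only show the intended behavior: pick the robots.txt body for the matched host, falling back to one global default.

```java
import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical sketch, not the plugin's implementation: serve a robots.txt
// per host, with one global default when no host-specific entry exists.
public class PerHostRobotsServlet extends HttpServlet {

    // host name -> robots.txt body; in the plugin this mapping would come
    // from robots.txt documents/compounds in the repository, not a map.
    private final Map<String, String> robotsByHost = new ConcurrentHashMap<>();

    // The global default served today for every host.
    private volatile String defaultRobots = "User-agent: *\nDisallow:\n";

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        // getServerName() stands in for the host matched via hst:hosts.
        String host = request.getServerName();
        String body = robotsByHost.getOrDefault(host, defaultRobots);

        response.setContentType("text/plain");
        response.setCharacterEncoding("UTF-8");
        response.getWriter().write(body);
    }
}
```

The second idea would differ only in where the mapping comes from: each robots.txt document would declare the domain(s) it applies to, and the lookup would consult those declarations instead of the hst:hosts tree.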
Issue Links
- relates to:
  - HIPPLUG-546 robotstxt should allow for multiple (context-aware?) robots.txt (Closed)
  - HIPPLUG-1357 Remove not functioning SitemapFeedBasedOnHstSitemap and SiteMapBasedOnHstSiteMapResource (Closed)
  - CMS-12938 Add "disallow" document for robots.txt plugin (Closed)