The problem is that while we do support multiple hosts, we can only serve a single global robots.txt. This becomes a problem when running multiple subsites.
After a brief discussion with JH, we came up with the following ideas:
- Make it context-aware by integrating it with the hst:hosts configuration
- Have a default configuration, and allow robotstxt documents/compounds to define which domain they apply to (independent of the hosts configured in the HST)
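The second idea could be sketched roughly as follows. This is a minimal illustration only, assuming a resolver keyed on the request's Host header with a global fallback; the class and method names (RobotsTxtResolver, register, resolve) are hypothetical and not part of the HST API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: resolve robots.txt content per requested host,
// falling back to the global default when no host-specific document exists.
public class RobotsTxtResolver {
    private final Map<String, String> perHost = new HashMap<>();
    private final String defaultRobotsTxt;

    public RobotsTxtResolver(String defaultRobotsTxt) {
        this.defaultRobotsTxt = defaultRobotsTxt;
    }

    // Associate a robots.txt document with a domain
    // (independent of the HST-configured hosts).
    public void register(String host, String content) {
        perHost.put(host.toLowerCase(), content);
    }

    // Look up by the Host header of the incoming request;
    // fall back to the default configuration.
    public String resolve(String host) {
        return perHost.getOrDefault(host.toLowerCase(), defaultRobotsTxt);
    }

    public static void main(String[] args) {
        RobotsTxtResolver resolver =
                new RobotsTxtResolver("User-agent: *\nDisallow:");
        resolver.register("sub.example.org",
                "User-agent: *\nDisallow: /private/");

        System.out.println(resolver.resolve("sub.example.org"));
        System.out.println(resolver.resolve("other.example.org"));
    }
}
```

A serving component would then write the resolved content to the response for requests to /robots.txt, keyed on the incoming Host header.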