We have robots.txt files for production and a robots.txt file for staging that disallows everything. We deploy everything with git, so robots.txt is excluded from the repo and we add the files to each server by hand.
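(The staging file is just the standard deny-all:)

```
# robots.txt on the staging servers: block all crawlers
User-agent: *
Disallow: /
```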
That setup works great until we spin up a new server and completely forget about robots.txt. Then Google indexes the staging server, plus some other stuff we'd rather not have indexed in production. GRRR.
We have a config file set up for each subdomain. Is there a way we can piggyback on these config files to serve different robots.txt data from different servers? A sketch of what I mean is below.
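To make the question concrete, here is roughly what I'm imagining, sketched for nginx (our setup may differ, and the hostnames and file paths here are invented): each per-subdomain server block points /robots.txt at an environment-specific file that lives in the repo, so a freshly provisioned server can't forget it.

```
# Hypothetical sketch, assuming nginx. Both robots files are committed
# to git; each server block picks the right one for its environment.

server {
    server_name staging.example.com;

    # staging: always serve the deny-all file from the repo
    location = /robots.txt {
        alias /var/www/site/robots/robots-staging.txt;
    }

    # ... rest of the staging config ...
}

server {
    server_name www.example.com;

    # production: serve the real robots.txt from the repo
    location = /robots.txt {
        alias /var/www/site/robots/robots-production.txt;
    }

    # ... rest of the production config ...
}
```

Is this a sane approach (or the Apache equivalent, if that matters), or is there a better way to tie robots.txt to the per-subdomain config so it can't be missed on a new server?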