Description
Is there a way to either ignore the robots.txt file in specific environments, or include different robots.txt files for each environment?
Here's the scenario: I have a production branch, a dev branch, and a staging branch. For production I obviously want to allow crawlers via `Allow: /`. However, in the dev and staging environments I want to block crawlers.
Due to the deployment strategy I'm forced to adhere to, I can't simply edit the file after deployment and then `.gitignore` it as I normally would. Can I somehow configure this file per environment in create-react-app, or with environment variables?