Gary Illyes from Google mentioned on LinkedIn that, if you want to, you can use a single robots.txt file for all of your international sites. He added that he's not saying it is a good idea, but rather that, technically, you can implement your robots.txt this way.
Gary wrote, "Technically you could have a single central robots.txt for all your international sites (de.example, ch.example, at.example, etc.) that is hosted on your CDN (cdn.example/robots.txt)."
Although, Gary later added, "Not saying it's a good idea, or bad for that matter, just that it's possible."
Now, Ohgm wrote about this a few years ago over here if you want a more detailed technical walkthrough.
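For a rough idea of how this can look in practice, here is a minimal sketch (an illustration only, not Gary's or Ohgm's exact setup): each country site answers requests for /robots.txt with a redirect to the single CDN-hosted copy. Google follows redirects for robots.txt, so the rules in the central file on cdn.example would end up governing every site that points at it. The handler name and port below are hypothetical; cdn.example comes from Gary's quote.

```python
# Minimal sketch: a country site (e.g. de.example) redirecting its
# /robots.txt to the single central copy hosted on the CDN.
from http.server import BaseHTTPRequestHandler, HTTPServer

CENTRAL_ROBOTS_URL = "https://cdn.example/robots.txt"  # the one shared file

class RobotsRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            # Send crawlers to the central robots.txt on the CDN
            self.send_response(301)
            self.send_header("Location", CENTRAL_ROBOTS_URL)
            self.end_headers()
        else:
            # Everything else is out of scope for this sketch
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RobotsRedirectHandler).serve_forever()
```

In a real deployment you would do this at the web server or CDN edge rather than in application code, but the idea is the same: one file, many sites redirecting to it.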
But this is just another SEO PSA from Gary.
Forum discussion at LinkedIn.