apache: serve a different robots.txt to search engines based on the domain name

If your website is reachable under multiple domains (for whatever reason) and you want Google or other major search engines to fetch a different robots.txt for each domain, how can you do that?

First, prepare one robots.txt file per domain and put them all in the root folder of your website, named robots1.txt, robots2.txt, robots3.txt, and so on.

Second, create an .htaccess file in the same folder (if one does not already exist).

Third, add the following Apache rewrite directives (they require mod_rewrite) to the top of the file:

[codesyntax lang="apache"]

RewriteEngine On

RewriteCond %{HTTP_HOST} abcdomain1\.com$ [NC]
RewriteRule ^robots\.txt$ robots1.txt [L]

RewriteCond %{HTTP_HOST} abcdomain2\.com$ [NC]
RewriteRule ^robots\.txt$ robots2.txt [L]

RewriteCond %{HTTP_HOST} abcdomain3\.com$ [NC]
RewriteRule ^robots\.txt$ robots3.txt [L]

[/codesyntax]
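If you control the main server configuration rather than just .htaccess, a sketch of an alternative is to give each virtual host its own Alias, so no mod_rewrite is needed at all (the domain names here match the examples above, and the DocumentRoot path is a placeholder you would adjust for your site):

```apache
# Virtual host for abcdomain1.com: map /robots.txt to that domain's file.
# Assumes mod_alias is enabled; DocumentRoot is a placeholder path.
<VirtualHost *:80>
    ServerName abcdomain1.com
    DocumentRoot /var/www/site
    Alias /robots.txt /var/www/site/robots1.txt
</VirtualHost>

# Repeat with robots2.txt for abcdomain2.com, and so on.
```

Either way, you can check which file a given domain serves with something like `curl -H "Host: abcdomain1.com" http://your-server/robots.txt`.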