You can disallow multiple folders in a robots.txt file by adding a separate Disallow directive for each folder, each on its own line. For example:

    User-agent: *
    Disallow: /temp/
    Disallow: /private/
    # ...
If you have multiple User-agent groups set in your robots.txt file, then you can set multiple Disallow rules for each one in a similar way. For example:

    User-agent: foobot
    Disallow: /mobile/
    Disallow: /plugins/
    # ...

    User-agent: barbot
    Disallow: /temp/
    Disallow: /private/
    # ...

    User-agent: bazbot
    Disallow: /
Please note that when you have multiple User-agent groups, the Disallow (or Allow) rules in each group apply only to the User-agent(s) named in that group.
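To see this per-group matching in action, here's a minimal sketch using Python's standard-library urllib.robotparser, checking a robots.txt like the example above (the bot names and paths are the hypothetical ones from the example):

```python
import urllib.robotparser

# Hypothetical robots.txt mirroring the multi-group example above.
robots_txt = """\
User-agent: foobot
Disallow: /mobile/
Disallow: /plugins/

User-agent: barbot
Disallow: /temp/
Disallow: /private/

User-agent: bazbot
Disallow: /

User-agent: *
Disallow: /temp/
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Each crawler is only bound by the rules in its own group:
print(rp.can_fetch("foobot", "/mobile/page.html"))    # False: blocked by foobot's group
print(rp.can_fetch("foobot", "/temp/page.html"))      # True: /temp/ isn't in foobot's group
print(rp.can_fetch("bazbot", "/anything.html"))       # False: bazbot is disallowed everywhere
print(rp.can_fetch("otherbot", "/private/page.html")) # False: falls back to the * group
```

Note how foobot can still fetch /temp/ even though the * group disallows it; once a crawler matches a specific User-agent group, the wildcard group no longer applies to it.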
This post was published by Daniyal Hamid.