How to Disallow Multiple Folders in Robots.txt?

Adding multiple folders to disallow rules in Robots.txt

You can disallow multiple folders in a robots.txt file by adding a separate Disallow directive for each folder, one per line. For example:

User-agent: *
Disallow: /temp/
Disallow: /private/
# ...
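You can verify rules like these with Python's standard-library robots.txt parser. The sketch below (the rules string and the example.com URLs are just illustrative) feeds the directives above to urllib.robotparser and checks which paths a crawler may fetch:

```python
from urllib.robotparser import RobotFileParser

# The rules from the example above, as they would appear in robots.txt
rules = """\
User-agent: *
Disallow: /temp/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Each Disallow line blocks its own folder for all user agents (*)
print(parser.can_fetch("mybot", "https://example.com/temp/page.html"))     # False
print(parser.can_fetch("mybot", "https://example.com/private/data.html"))  # False
print(parser.can_fetch("mybot", "https://example.com/public/page.html"))   # True
```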

If you have multiple User-agent groups in your robots.txt file, you can set multiple Disallow rules for each group in the same way. For example:

User-agent: foobot
Disallow: /mobile/
Disallow: /plugins/
# ...

User-agent: barbot
Disallow: /temp/
Disallow: /private/
# ...

User-agent: bazbot
Disallow: /

Please note that when you have multiple User-agent groups, the Disallow (or Allow) rules in each group apply only to the user agents named in that group.
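This per-group behavior can also be checked with urllib.robotparser. In this sketch (bot names and URLs are the hypothetical ones from the example above), foobot is blocked from /mobile/ but not from /temp/, because /temp/ is only disallowed in barbot's group, while bazbot is blocked from everything:

```python
from urllib.robotparser import RobotFileParser

# The multi-group example from above
rules = """\
User-agent: foobot
Disallow: /mobile/

User-agent: barbot
Disallow: /temp/

User-agent: bazbot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# foobot is only bound by the rules in its own group
print(parser.can_fetch("foobot", "https://example.com/mobile/app.html"))  # False
print(parser.can_fetch("foobot", "https://example.com/temp/file.html"))   # True

# bazbot's group disallows the entire site
print(parser.can_fetch("bazbot", "https://example.com/anything.html"))    # False
```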
