
How to Add Bots to Your Robots.txt File (Ahrefs Example)

Written by Chan Nier
Updated on February 19, 2025

 Link: https://support.brilliantdirectories.com/support/solutions/articles/12000099791

A robots.txt file is a text file placed on a website's server that tells web crawlers and other automated processes which pages or sections of the site may or may not be accessed. It is a standard way of communicating with web robots and is commonly used to manage how search engines such as Google and Bing, SEO tools, and other automated services crawl and index the site.
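For context, a typical robots.txt file groups rules by user agent. The sketch below is only an illustration: the /admin/ and /search/ paths and the sitemap URL are placeholder examples, not paths from an actual Brilliant Directories site.

# Rules that apply to all crawlers (placeholder paths for illustration)
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml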

Adding a Web Robot to the Site

Allowing a web robot to crawl a site means granting it permission to access and index the site's content.

To allow third-party tools to crawl a website, they must be included in the allowed list within the website's robots.txt file, as illustrated in the sketch below.
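For comparison, the two entries below show a crawler that is blocked from the entire site next to one that is allowed everywhere. They are alternatives, not one file, and "ExampleBot" is a placeholder user-agent name rather than a real tool.

# Blocked from the entire site
User-agent: ExampleBot
Disallow: /

# Allowed to crawl everything
User-agent: ExampleBot
Allow: /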

To access the robots.txt file:

  • Navigate to the Developer Hub sidebar menu link.
  • Click on the Robots File option.

To add https://ahrefs.com/ to the list, add the following lines at the beginning of the file:

User-agent: AhrefsSiteAudit
Allow: /

User-agent: AhrefsBot
Allow: /

Lastly, click "Save Changes". Ahrefs now has permission to crawl the website without being blocked.
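To double-check the result from outside the admin area, one option is a quick test with Python's built-in urllib.robotparser. This is only a sketch; "yourdomain.com" is a placeholder for the actual website address.

# Minimal check using Python's standard library; replace
# "yourdomain.com" with the actual domain of the website.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://yourdomain.com/robots.txt")
robots.read()  # download and parse the live robots.txt

# True means the named crawler is allowed to fetch the given URL
print(robots.can_fetch("AhrefsSiteAudit", "https://yourdomain.com/"))
print(robots.can_fetch("AhrefsBot", "https://yourdomain.com/"))

Both calls should print True once the new entries are live on the site.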

