How to optimize a robots.txt file – First, what is a robots.txt file? It is a plain-text file that tells web crawlers which URLs they may and may not crawl. It can also ask crawlers to wait between requests (a crawl delay). Most people use this file to block search engines from crawling specific pages or directories.
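For instance, a minimal robots.txt file might look like this (a sketch; the /private/ directory is a hypothetical example, and note that Crawl-delay is honored by some crawlers such as Bing but ignored by Google):

```
User-agent: *
Disallow: /private/
Crawl-delay: 10
```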

What Happens if You Block Lots of Pages in Your robots.txt File?

Blocking search engines from lots of pages discourages them. They may visit your website less often, crawl fewer pages per visit, or spend less time crawling your pages. If there are many pages you do not want search engines to crawl, index, and rank, there are better ways of communicating this. Weigh your options before blocking search engines from pages using the robots.txt file.

Block Search Engines from Crawling Pages Without Using the robots.txt File

Every file and directory is different, so the way you block search engines should be too. Depending on what you are blocking, an alternate method may be a better fit. That way, search engines are not discouraged from crawling, indexing, and ranking your website for keywords. Here are some other ways to keep content out of search results:

  • .htaccess file
  • Add a robots meta tag with nofollow to a specific page
  • Add a robots meta tag with noindex to a specific page
  • Put pages or directories behind a login
  • Set server file permissions (read, write and execute) for owners, groups and other users
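As an example of the meta tag approach, a single line in a page's head section keeps that page out of search results and asks crawlers not to follow its links (a sketch; noindex and nofollow are standard directives, but the page using them is up to you):

```html
<!-- Keep this page out of the index and do not follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Unlike a robots.txt block, a noindex tag only works if crawlers are allowed to fetch the page and see the tag.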

So What Should Your robots.txt File Look Like?

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Every website is different, but an optimized robots.txt file should include a few things. Let all friendly search bots crawl your site. Keep your Ajax, JavaScript and CSS files accessible. Add your sitemap so that search engines can find all of your pages easily. Be sure to remove any references to sensitive files; these can get you into trouble. It is OK to keep a default login URL in there, but you may wish to change that URL. This overlaps with website security: we offer WordPress hardening services, and we also perform full website security audits and website backups.
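You can sanity-check rules like the ones above with Python's standard-library robots.txt parser, as a quick sketch (the bot name and URLs are hypothetical; note that urllib.robotparser applies rules in order, first match wins, so the Allow line is listed before the Disallow line here, whereas Google uses longest-path matching):

```python
# Sanity-check robots.txt rules with the standard-library parser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /wp-admin/admin-ajax.php",
    "Disallow: /wp-admin/",
])

# The admin area is blocked, but admin-ajax.php and normal pages are not.
print(rp.can_fetch("MyBot", "https://example.com/wp-admin/"))                # False
print(rp.can_fetch("MyBot", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("MyBot", "https://example.com/blog/post/"))               # True
```

Running a check like this before deploying a robots.txt change is a cheap way to catch a rule that blocks more than you intended.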

A Real-Life Example of a Seriously Flawed and Dangerous robots.txt File

Recently, we worked on a website build for a client. We include many things with our web designs, like SEO compatibility, and for good reason: we understand the web, SEO, web privacy laws and web security. We found a couple of interesting directories in our client's robots.txt file and decided to check them out. This wasn't part of the website build; we are curious by nature and very detail-oriented and analytical. The results were shocking. Had this sensitive information been found by the wrong person, our client could have faced serious consequences:

  1. Complete de-indexing by Google.
  2. Complete de-indexing by Bing.
  3. A destroyed domain rating (requiring a new domain and possibly a re-brand).
  4. Having their company shut down.
  5. Facing jail or prison time.

Pay for Peace of Mind and Hire a Professional Company Who Understands the Web

If you are a website owner looking for a reliable company to build a new website or redesign your existing one, please contact us for a quote. We also want to hear from you if you are a web developer launching a website without assistance from an experienced SEO. We can make a really big difference, no matter what your role is. We are available Monday to Friday from 10 am to 6 pm. If you have an urgent matter, please write URGENT in the subject line. Assistance is only a click or a call away: 604-778-8438. You can also reach us through our online chat.

About the Author

SEO Candyland

