Datafeedr Support Forums

Datafeedr Support Forums (/index.php)
-   Questions (/forumdisplay.php?f=67)
-   -   blocking robots from crawling buy links? (/showthread.php?t=10087)

proneone January 13th, 2014 03:40 PM

blocking robots from crawling buy links?
 
i was told that it was relatively easy to block search engines from crawling my affiliate links with datafeedr, given the semi-static link structure it uses.

does anyone have an example robots.txt file i can implement to accomplish this?

our buy now link structure looks like:

http://domain.com/store/product/buy/...24671/product/

thanks!

Eric January 13th, 2014 03:42 PM

Hi,

Adding this to your robots.txt file should do it:

Code:

User-agent: *
Disallow: /store/product/buy

Eric

Eric January 13th, 2014 03:43 PM

But make sure to validate it:
https://www.google.ca/search?q=valid...sm=91&ie=UTF-8
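If you'd rather check it locally, here's a quick sketch using Python's standard-library `urllib.robotparser` (the `domain.com` URL and the `12345` product ID are just placeholders, not your real link):

```python
# Sanity-check the robots.txt rule with Python's stdlib robot parser.
# domain.com and the product ID 12345 are placeholders.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /store/product/buy
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Buy links under /store/product/buy/ should be blocked for all bots...
print(rp.can_fetch("*", "http://domain.com/store/product/buy/12345/product/"))  # False

# ...while ordinary store pages stay crawlable.
print(rp.can_fetch("*", "http://domain.com/store/product/widget/"))  # True
```

Note this only tells you whether well-behaved crawlers are *asked* not to fetch those URLs; robots.txt is advisory, not enforcement.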

proneone January 14th, 2014 08:48 AM

thanks eric!

i'll get this implemented today and confirm.

proneone January 17th, 2014 10:10 AM

ok here are the results i get when validating the robots.txt

User-agent   Access?   Reason
*            yes       no Disallow directive found

This is the default access policy for any robot without a matching User-agent field.

Rule records:
1  User-agent: *
2  Disallow: /store/product/buy/


that doesn't look right.. correct?

thanks!

proneone January 17th, 2014 10:13 AM

never mind - fixed it.. i needed a blank line after the Disallow line

thanks!

Eric January 17th, 2014 10:13 AM

It looks fine to me...

hosfield1010 January 24th, 2014 07:05 PM

Hi, may I ask why one would do this? Is this something necessary for SEO?

Thx.

gpsugar1 January 25th, 2014 02:32 PM

So by doing all the above, web crawlers are blocked from discovering ALL affiliate links, right? Are there other measures to take as well?


All times are GMT -5.

Powered by vBulletin® Version 3.6.8
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.