AdsBot, the end of Arbitrage as we know it?

So Danny has caught this one. Aaron seconds it.

We have a new Google bot, named AdsBot, that’s different from all the others. It’s an AdWords landing-page “quality bot” that assigns a quality score to your website. Based on that score, your minimum keyword bids, and other factors, your ad’s AdRank is adjusted.

AdRank determines the position where your ad is placed within the SERP’s “Sponsored Links” section.
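As a rough illustration, ad rank was commonly described at the time as bid multiplied by quality score. The exact formula and its weights are Google-internal; the function and all numbers below are invented for illustration only.

```python
def ad_rank(max_cpc_bid: float, quality_score: float) -> float:
    """Toy ad-rank score (hypothetical): higher wins a better position.

    Sketch of the commonly cited 'bid x quality' relationship; the real
    AdWords calculation is not public and involves more factors.
    """
    return max_cpc_bid * quality_score

# A lower bid can still outrank a higher one on quality alone:
print(ad_rank(2.00, 7))  # 14.0
print(ad_rank(3.00, 4))  # 12.0
```

This is why a poor landing-page quality score hurts: the higher-bidding advertiser with the weaker page can still lose the position.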

Anyone can block AdsBot, and Google provides very detailed instructions on how, but keep in mind that your assigned quality score will suffer:

We believe that a non-participating advertiser does detract from the user’s search experience, and from the overall quality of the AdWords program. While you can exclude your site from review, this will provide us with little information about your landing page’s quality and relevance. Therefore, if you restrict AdWords from visiting your landing pages, you will experience a drop in Quality Scores for your related keywords. (This will cause higher minimum bid requirements for any landing page for which you’ve restricted access.)

I think this is a GREAT improvement to advertising relevancy and overall search quality, while also making arbitrage a lot harder to pull off. Arbitragers usually employ “made for arbitrage” websites.

Arbitrage is a technique for exploiting differences in price: for example, buying AdWords advertising on the keyword “cars” at $2 per click, and serving a landing page for that ad which itself sells ads earning $3 per click. More on arbitrage from Graywolf.
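The economics above can be sketched in a few lines. All numbers are hypothetical, and the click-out rate parameter is my own addition: in practice only a fraction of bought visitors click an ad on the arbitrage page, which is what makes the spread risky.

```python
def arbitrage_profit(cpc_paid: float, cpc_earned: float,
                     clicks_bought: int, clickout_rate: float) -> float:
    """Profit from buying traffic and reselling clicks on a landing page.

    clickout_rate is the (hypothetical) fraction of bought visitors
    who go on to click an ad on the arbitrage page.
    """
    revenue = clicks_bought * clickout_rate * cpc_earned
    cost = clicks_bought * cpc_paid
    return revenue - cost

# The $2-in / $3-out example from the text, if every visitor clicked out:
print(arbitrage_profit(2.00, 3.00, 1000, 1.0))  # 1000.0
# With a 50% click-out rate, the same spread loses money:
print(arbitrage_profit(2.00, 3.00, 1000, 0.5))  # -500.0
```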

More on the technical part:

While we strongly recommend against restricting our system’s automatic review of your landing page, you can edit your site’s robots.txt file to avoid a review. The file must explicitly exclude your page from our system visits as follows:

To prevent AdsBot-Google from accessing your site, add the following to your robots.txt file:

    User-agent: AdsBot-Google
    Disallow: /

To prevent AdsBot-Google from accessing parts of your site, add the following to your robots.txt file:

    User-agent: AdsBot-Google
    Disallow: /exclude/

Here /exclude/ stands for whichever directory (or directories) you don’t want the AdWords system to visit.

Note: In order to avoid increasing CPCs for advertisers who don’t intend to restrict AdWords visits to their pages, the system will ignore blanket exclusions (User-agent: *) in robots.txt files.
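You can sanity-check rules like these with Python’s standard `urllib.robotparser`. One caveat: a standard parser applies the records literally, so it cannot model AdsBot’s special behavior of ignoring blanket `User-agent: *` exclusions; the sketch below only verifies the explicit AdsBot-Google record. The file content and URLs are made up for the example.

```python
from urllib import robotparser

# Hypothetical robots.txt with an explicit AdsBot-Google record.
ROBOTS_TXT = """\
User-agent: AdsBot-Google
Disallow: /exclude/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Only /exclude/ is off-limits to AdsBot-Google; other pages are allowed.
print(rp.can_fetch("AdsBot-Google", "http://example.com/exclude/page.html"))  # False
print(rp.can_fetch("AdsBot-Google", "http://example.com/landing.html"))       # True
```

Remember that allowing the crawl (the `True` case) is exactly what Google wants here: blocking it is what triggers the quality-score penalty quoted above.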

Published by

Cristian Mezei

