Ad Serving Blog Insights into the complex world of digital advertising technology

For comments, questions or topics you would like to read about on this blog, simply reach out to me at

mark@adservingblog.com

Good bot / bad bot: The battle for eyeballs

To convert internet users into customers, advertisers and brands need human eyeballs to land on their ad units. Spiders, crawlers and bots do little to promote this cause.

That’s why it’s important for publishers’ internal or third-party ad serving systems to make use of the spider and bot list that the IAB publishes monthly, as well as its valid browser list.

This simple measure enables visits by spiders, crawlers and bots to be filtered out so that reports on ad impressions, views, ad visibility and clicks will be as accurate as possible.
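As a rough sketch, this kind of filtering might look like the following. The patterns and function names here are purely illustrative: the actual IAB Spiders & Bots List is licensed and uses a richer matching format than simple substrings.

```python
# Illustrative only: the real IAB Spiders & Bots List is licensed,
# updated monthly, and uses a richer match format than substrings.
BOT_PATTERNS = ["googlebot", "bingbot", "crawler", "spider"]

def is_known_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known bot pattern."""
    ua = user_agent.lower()
    return any(pattern in ua for pattern in BOT_PATTERNS)

def record_impression(user_agent: str, counters: dict) -> None:
    """Count an ad impression only for non-bot traffic."""
    key = "filtered" if is_known_bot(user_agent) else "impressions"
    counters[key] = counters.get(key, 0) + 1
```

A real implementation would also consult the valid browser list and honour the case and exception rules that the published list defines.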

What can publishers and ad serving systems do about the problem of data that originates from identified bots?

There are two approaches.

First, a system can be designed to deny the bot access in the first place: to use a metaphor, no interaction is recorded because the visitor is turned away at the door. Second, a system can be configured to record everything and filter the bot data out after the fact.
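The two approaches can be sketched as follows. All names here (`is_known_bot`, `handle_request`, `filtered_impressions`) are hypothetical and stand in for whatever a real ad server would use; the bot patterns are illustrative, not the IAB list.

```python
# Hypothetical sketch of the two approaches; not a real ad server API.
BOT_PATTERNS = ["crawler", "spider", "bot"]

def is_known_bot(user_agent: str) -> bool:
    return any(p in user_agent.lower() for p in BOT_PATTERNS)

# Approach 1: turn the visitor away at the door. No interaction is
# ever recorded for a recognised bot.
def handle_request(user_agent: str, event_log: list) -> str:
    if is_known_bot(user_agent):
        return "403 Forbidden"            # nothing is logged at all
    event_log.append({"ua": user_agent})  # log the impression, serve the ad
    return "200 OK"

# Approach 2: record everything, then filter the data after the fact.
def filtered_impressions(event_log: list) -> list:
    return [e for e in event_log if not is_known_bot(e["ua"])]
```

Note the trade-off the post describes: approach 1 leaves no trace of the bot visit at all, while approach 2 keeps the raw data, so a “good bot” can still be served and its traffic simply excluded from reporting.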

Both are valid approaches. The first one, however, not only blocks “bad bots” but gives “good bots” a hard time as well. These “good bots” analyse site structure and verify ad delivery, so they do bring value to the ecosystem. When the information they need is blocked, they cannot do their jobs properly, or at all.

And if beneficial bots such as these are not handled in the right way, it has a detrimental impact on ad campaign delivery and performance goals. Impressions or, even worse, clicks that have been artificially created by crawlers can end up being counted as valid interactions. This skews reporting, which in turn hampers future data-driven decision-making.

Today there’s no official listing of these “good bot” systems that gives the name of the company running a particular crawler service, a description of its purpose, the technical contact person, and so on. A listing of this kind would save significant time for the publisher, the agency and the ad serving system alike.

The most recent meeting of the IAB Spiders & Bots committee saw this idea brought to the table. As a result, I’m hopeful that a solution may be on the horizon, and that our methods for counting those human eyeballs will improve even further.


Tags: IAB, Data, Bots

Who am I?

Mark Thielen
CTO, ADTECH

With eight years as a technology leader in the ad serving industry, I have focused on leading local and remote teams in Europe and North America to drive product and technology innovation. Covering the engineering, architectural and operational aspects of scalable, high-performance system environments, I've been fortunate enough to serve as a business and technology executive since 2007.