Why are my queries being used up faster than expected?

Search engine spiders may be crawling your site and depleting your GeoIP2 Precision service queries. The robots exclusion standard (robots.txt) may help prevent search engine spiders from doing so.
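For example, a robots.txt file like the following asks compliant crawlers not to request the pages that trigger lookups. The /store-locator/ path here is a hypothetical example of a page that makes a GeoIP2 Precision query:

```
User-agent: *
Disallow: /store-locator/
```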

You can also write code that checks the client's User-Agent header to see whether the request comes from a spider.
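Here is a minimal sketch of such a check, assuming a Flask application; the keyword list is illustrative, not exhaustive:

```python
from flask import Flask, request

app = Flask(__name__)

# Common substrings found in crawler User-Agent strings (illustrative list).
BOT_KEYWORDS = ("bot", "crawler", "spider", "slurp")

def is_spider(user_agent):
    """Return True if the User-Agent string looks like a search engine spider."""
    if not user_agent:
        return False
    ua = user_agent.lower()
    return any(keyword in ua for keyword in BOT_KEYWORDS)

@app.route("/")
def index():
    if is_spider(request.headers.get("User-Agent")):
        # Skip the GeoIP2 Precision lookup for spiders to conserve queries.
        return "Hello, crawler"
    # ... perform the GeoIP2 Precision lookup for real visitors ...
    return "Hello, visitor"
```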

If you have implemented the robots exclusion standard and are still seeing queries used up faster than expected, we recommend logging each request to the GeoIP2 Precision service to better understand what is causing your code to make a web service request.
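One way to do this is to route every lookup through a small wrapper that logs before calling the service. Below is a minimal sketch using the official geoip2 Python client; the account ID and license key are placeholders, and the log file name is an assumption:

```python
import logging

import geoip2.webservice

# Write one line per query, with a stack trace showing where the call came from.
logging.basicConfig(
    filename="geoip2_queries.log",
    level=logging.INFO,
    format="%(asctime)s %(message)s",
)

client = geoip2.webservice.Client(42, "license_key")  # placeholder credentials

def lookup_city(ip_address):
    """Log every lookup before it is sent, so query usage can be audited."""
    logging.info("GeoIP2 Precision city query for %s", ip_address, stack_info=True)
    return client.city(ip_address)
```

Reviewing the resulting log should reveal which IP addresses are being looked up, how often, and which code paths are responsible.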