How can I prevent my service credit from being used up faster than expected?

Search engine spiders may crawl your site and deplete your GeoIP2 Precision service queries unexpectedly. The robots exclusion standard (a robots.txt file) can help prevent search engine spiders from doing so.
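For example, a minimal robots.txt might disallow crawling of the pages that trigger lookups. The /geolocate/ path below is a placeholder; substitute whichever paths on your site make GeoIP2 Precision queries.

```
# robots.txt — a minimal sketch; /geolocate/ is a placeholder path
User-agent: *
Disallow: /geolocate/
```

Note that well-behaved crawlers honor robots.txt, but it is advisory only; misbehaving bots will ignore it.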

You can also write code to check the client's User-Agent header to see whether the request comes from a spider.
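A minimal sketch of such a check, assuming you can read the request's User-Agent header as a string. The substrings below cover a few common crawlers and are not exhaustive; adjust the list based on the bots you see in your own logs.

```python
# Substrings that commonly appear in search engine spider User-Agents.
# This list is illustrative, not exhaustive.
KNOWN_SPIDER_SUBSTRINGS = (
    "googlebot",
    "bingbot",
    "yandexbot",
    "duckduckbot",
    "baiduspider",
)

def is_spider(user_agent):
    """Return True if the User-Agent string looks like a search engine spider."""
    if not user_agent:
        # Many bots send no User-Agent at all; treat that as suspect.
        return True
    ua = user_agent.lower()
    return any(s in ua for s in KNOWN_SPIDER_SUBSTRINGS)

# Only spend a GeoIP2 Precision query on requests that pass the check:
# if not is_spider(request.headers.get("User-Agent")):
#     ...perform the web service lookup...
```

The trailing comment assumes a framework-style `request` object; adapt it to however your application exposes request headers.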

If you have implemented the robots exclusion standard and are still seeing queries used up faster than expected, we recommend logging each request to the GeoIP2 Precision service to better understand what is causing your code to make web service requests.
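One way to do this is to emit a log line immediately before every web service call, recording the client IP, User-Agent, and page that triggered it. The sketch below uses Python's standard logging module; `log_query` is a hypothetical helper, not part of the GeoIP2 client API.

```python
import logging

# Dedicated logger so GeoIP2 usage can be filtered from other log output.
logger = logging.getLogger("geoip2.usage")

def log_query(ip, user_agent, path):
    """Build and emit one log line per GeoIP2 Precision request."""
    line = "GeoIP2 Precision query ip=%s ua=%r path=%s" % (ip, user_agent, path)
    logger.info(line)
    return line

# Call log_query(...) right before each web service request, then
# aggregate the log by ip, ua, or path to see what drives your usage.
```

Reviewing the aggregated log should quickly reveal whether a particular crawler, client, or code path is responsible for the unexpected query volume.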