Chris B.
A feature I find myself really wishing existed in GP is the ability to block or rate-limit search engine crawlers. Right now I'm having to find and set up a separate solution to deal with heavy bot traffic to my GP instances, and I think a lot of users would find this feature really helpful. It might even be worth adding a feature that calculates how much crawler traffic shows up in the access logs, to give an idea of how bad the problem actually is on a particular instance.
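In the meantime, the rough share of crawler traffic can be estimated from the access logs directly. Here's a minimal sketch that assumes logs in Combined Log Format (User-Agent as the last quoted field) and a small illustrative list of bot markers; the marker list and log path are my own assumptions, not anything GP ships with:

```python
import re
from typing import Iterable, Tuple

# Substrings that commonly appear in crawler User-Agent headers.
# Illustrative only, not an exhaustive list.
BOT_MARKERS = ("googlebot", "bingbot", "yandexbot", "ahrefsbot",
               "semrushbot", "duckduckbot", "baiduspider")

# Combined Log Format: the User-Agent is the final quoted field.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def crawler_share(lines: Iterable[str]) -> Tuple[int, int]:
    """Return (bot_hits, total_hits) for an iterable of access-log lines."""
    total = bots = 0
    for line in lines:
        match = UA_RE.search(line)
        if not match:
            continue  # line doesn't look like Combined Log Format
        total += 1
        ua = match.group(1).lower()
        if any(marker in ua for marker in BOT_MARKERS):
            bots += 1
    return bots, total

# Usage (hypothetical log path, adjust for your setup):
#   with open("/var/log/gp/access.log") as f:
#       bots, total = crawler_share(f)
#       print(f"{bots}/{total} requests look like crawlers")
```

User-Agent matching undercounts badly behaved bots that spoof browser strings, so treat the result as a lower bound.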