Throttling Control - robots
Posted: Wed Apr 25, 2018 10:06 am
Hi Support,
I have implemented the rules according to the fair use policy and normally it works quite well.
However, in some situations I still get the error CLIENT.RobotDetected, although to my understanding
there should have been no reason to wait before the next call:
####### Ex 1 #### Change from yellow to black after 4 sec.
20180316 09:01:24 0 UA get: QUERY1
20180316 09:01:25 0 busy (images=green:100, inpadoc=green:45, other=green:1000, retrieval=green:100, search=yellow:15)
20180316 09:01:25 0 QuotaWeekUsed: 705572105; QuotaHourUsed: 164828218
20180316 09:01:29 0 UA get: QUERY2
20180316 09:01:29 0 overloaded (images=green:50, inpadoc=green:30, other=green:1000, retrieval=green:50, search=black:0)
20180316 09:01:29 0 QuotaWeekUsed: 705572385; QuotaHourUsed: 164828498 ThrottlingControlQuota
20180316 09:01:29 0 UA-status: 403 Forbidden CLIENT.RobotDetected
####### Ex 2 #### Change from green to black
20180424 03:11:49 0 UA get: QUERY1
20180424 03:11:50 0 idle (images=green:200, inpadoc=green:60, other=green:1000, retrieval=green:200, search=green:30)
20180424 03:11:50 0 QuotaWeekUsed: 536305934; QuotaHourUsed: 7117918
20180424 03:11:50 0 UA get: QUERY2
20180424 03:11:50 0 busy (images=green:100, inpadoc=green:45, other=green:1000, retrieval=green:100, search=black:0)
20180424 03:11:50 0 QuotaWeekUsed: 536306214; QuotaHourUsed: 7118198 ThrottlingControlQuota
20180424 03:11:50 0 UA-status: 403 Forbidden CLIENT.RobotDetected
In both cases, I would have expected a search=red warning first. Having a green status and
then, uncontrollably, getting locked out for 900 seconds is not much fun.
Do you have any advice on how to prevent these situations?
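For reference, my client paces its calls from the throttling status line shown in the logs above. A minimal sketch of that logic (parse_throttling and delay_for are illustrative names of my own, and the one-minute back-off for a black state is my own choice, not something from the documentation):

```python
import re

def parse_throttling(header):
    """Parse a status line such as
    'busy (images=green:100, inpadoc=green:45, other=green:1000,
           retrieval=green:100, search=yellow:15)'
    into (system_state, {service: (colour, limit_per_minute)}).
    """
    state, _, services = header.partition(" ")
    parsed = {}
    for name, colour, limit in re.findall(r"(\w+)=(\w+):(\d+)", services):
        parsed[name] = (colour, int(limit))
    return state, parsed

def delay_for(header, service):
    """Seconds to wait before the next call to `service`.

    Treats the per-minute limit as a rate (60/limit seconds between
    calls) and falls back to a full minute when the colour is black.
    """
    _, services = parse_throttling(header)
    colour, limit = services[service]
    if colour == "black" or limit == 0:
        return 60.0           # no quota left: pause and re-check
    return 60.0 / limit       # e.g. search=yellow:15 -> one call every 4 s

header = ("busy (images=green:100, inpadoc=green:45, other=green:1000, "
          "retrieval=green:100, search=yellow:15)")
```

With the yellow:15 status from Ex 1 this yields a 4-second gap between search calls, which matches the spacing in my log, yet the very next response still came back black.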
Regards, Martien