Can requests wait a few seconds before each retry?

Here is my code:

import requests
import time

proxies = {'http': '36.33.1.177:21219'}
url = 'http://218.94.78.61:8080/newPub/service/json/call?serviceName=sysBasicManage&methodName=queryOutputOtherPollutionList&paramsJson=%7B%22ticket%22:%22451a9846-058b-4944-86c6-fccafdb7d8d0%22,%22parameter%22:%7B%22monitorSiteType%22:%2202%22,%22enterpriseCode%22:%22320100000151%22,%22monitoringType%22:%222%22%7D%7D'

i = 0  # counts failed requests
a = requests.adapters.HTTPAdapter(max_retries=10)
s = requests.Session()
s.mount(url, a)  # use the retrying adapter for this URL
for x in range(1, 1000):
    time.sleep(1)
    print(x)
    try:
        r = s.get(url, proxies=proxies)
        print(r)
    except Exception as ee:
        i = i + 1
        print(ee)
        print('i=%s' % i)

      

The proxy is a bit unstable, so I set max_retries, but I still get exceptions from time to time. Is there a way to wait a few seconds before each retry?





1 answer


This is not possible with just the requests library. However, you can use an external library such as backoff.

backoff provides a decorator that you wrap around your function. Sample code:



import backoff
import requests

@backoff.on_exception(backoff.constant,
                      requests.exceptions.RequestException,
                      max_tries=10, interval=10)
def get_url(url):
    # Retry on any RequestException: up to 10 attempts, 10 seconds apart.
    return requests.get(url)

      

The code above waits 10 seconds before the next retry whenever a requests.exceptions.RequestException is raised, and it makes up to 10 attempts, as specified by max_tries.
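
As a rough sketch (not part of the original answer), the decorated function could replace the direct s.get() call in the loop from the question. It assumes url is the full service URL from the question, and it forwards the proxy to requests.get, which the sample above does not:

import time

import backoff
import requests

proxies = {'http': '36.33.1.177:21219'}
# url is assumed to be the full service URL from the question.

@backoff.on_exception(backoff.constant,
                      requests.exceptions.RequestException,
                      max_tries=10, interval=10)
def get_url(url):
    # Up to 10 attempts per call, waiting between them as described above.
    return requests.get(url, proxies=proxies)

for x in range(1, 1000):
    time.sleep(1)
    print(x)
    try:
        r = get_url(url)
        print(r)
    except requests.exceptions.RequestException as ee:
        # Reached only after all 10 attempts for this request have failed.
        print(ee)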









