Reusing Django connections with Python requests

What's the correct way to reuse connections with the Python requests library in Django across multiple HTTP requests? This is what I am currently doing:

import requests

def do_request(data):
    return requests.get('http://foo.bar/', data=data, timeout=4)

def my_view1(request):
    req = do_request({'x': 1})
    ...

def my_view2(request):
    req = do_request({'y': 2})
    ...


So, I have one function that makes a request. This function is called from different Django views, and the views are called in separate HTTP requests by users. My question is: does requests automatically reuse the same connections (via urllib3's connection pool)?

Or do I need to create a requests Session object first and work with that?

s = requests.Session()  

def do_request(data):
    return s.get('http://foo.bar/', data=data, auth=('user', 'pass'), timeout=4).text


And if so, should the session object be created in the global scope or inside a function?

def do_request(data):
    s = requests.Session()  
    return s.get('http://foo.bar/', data=data, auth=('user', 'pass'), timeout=4).text


I may have multiple HTTP requests happening at the same time, so the solution needs to be safe for concurrent use. I'm new to connection pooling, so I'm really not sure, and the Requests docs aren't that extensive on this point.



1 answer


Create a session and keep it around: either pass the session through your functions and return it, or create the session object at the global (module) or class level, so that its state, and the underlying connection pool, is maintained whenever it is referenced. It will work like a charm.
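For example, a module-level session shared by all views might look like this. This is a minimal sketch based on the code in the question; the adapter and its pool sizes are optional, illustrative settings, not required values:

import requests
from requests.adapters import HTTPAdapter

# One session for the whole process; its urllib3 pool keeps
# keep-alive connections open between view calls.
session = requests.Session()

# Optional: size the pool for concurrent Django requests
# (numbers here are illustrative).
adapter = HTTPAdapter(pool_connections=10, pool_maxsize=10)
session.mount('http://', adapter)
session.mount('https://', adapter)

def do_request(data):
    # Every call reuses a pooled connection instead of opening a new one.
    return session.get('http://foo.bar/', data=data, timeout=4)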


