How does Django filter a queryset that has already been evaluated?

I have been caching a general queryset that I would like to filter on different fields as appropriate. I am wondering whether, by filtering the evaluated queryset, I lose the advantage of caching it in the first place. Does Django just create another query from scratch, so that I end up paying for the query behind the cached queryset plus a new query for each filter I apply afterwards?

1 answer


Yes, the results will be thrown away.

You can see this from the source: filter() calls _filter_or_exclude(), which calls _clone() and then adds to the clone's query. _clone, as you can see, does not copy the _result_cache attribute.
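The mechanics can be sketched with a toy class. This is not Django's actual implementation, just a minimal stand-in that mirrors the attribute names (_result_cache, _clone) to illustrate why the clone re-queries:

```python
class ToyQuerySet:
    """Minimal stand-in for Django's QuerySet caching behaviour (illustrative only)."""

    def __init__(self, data):
        self._data = data            # stands in for the underlying SQL query
        self._result_cache = None    # filled in on first evaluation

    def __iter__(self):
        if self._result_cache is None:
            # In real Django, this is where the database gets hit.
            self._result_cache = list(self._data)
        return iter(self._result_cache)

    def _clone(self):
        # Like Django's QuerySet._clone(): the clone starts with an
        # empty _result_cache, so the cached results are not carried over.
        return ToyQuerySet(self._data)

    def filter(self, pred):
        clone = self._clone()
        clone._data = [x for x in self._data if pred(x)]
        return clone


qs = ToyQuerySet([1, 2, 3, 4])
list(qs)                            # evaluates and fills _result_cache
filtered = qs.filter(lambda x: x > 2)
# filtered._result_cache is None here: iterating it "queries" afresh.
```

The same pattern explains why iterating the original queryset twice is cheap, while any filter() on it starts from scratch.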

In general, it is not clear what Django could do to reuse the cached results. If the original is a complicated query with a small result set, it could be replaced with a simpler SQL query that just checks whether the primary key is one of the results already found, but that will not always be more efficient, and in some situations it would change the semantics (if the database changes in a way that affects the query results in the interval between caching and the additional filter).

If you want to force this behaviour by storing the ids manually, you can do it like this (note the list() call, which evaluates the pks immediately; without it, values_list() stays lazy and pk__in turns into a nested subquery):

pks = list(SomeObject.objects.filter(...).values_list('pk', flat=True))
some_of_them = SomeObject.objects.filter(pk__in=pks).filter(...)
others = SomeObject.objects.filter(pk__in=pks).filter(...)

      



You can also, of course, just filter in Python, for example:

common = SomeObject.objects.filter(...)
some_of_them = [m for m in common if m.attribute == 'foo']
others = [m for m in common if m.other_attribute == 'bar']

      

(You could also use filter(lambda m: m.attribute == 'foo', common) if you prefer, or make common explicitly a list by wrapping it in list().)
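The Python-side variants work on any iterable, so they can be tried standalone. A sketch with a hypothetical namedtuple standing in for model instances (Item and its fields are made up for illustration):

```python
from collections import namedtuple

# Hypothetical stand-in for model instances.
Item = namedtuple('Item', ['attribute', 'other_attribute'])

common = [
    Item('foo', 'baz'),
    Item('qux', 'bar'),
    Item('foo', 'bar'),
]

# List comprehension, as above:
some_of_them = [m for m in common if m.attribute == 'foo']

# Equivalent built-in filter() form. Note that in Python 3 filter()
# returns a lazy iterator, so wrap it in list() if you need to
# iterate over the result more than once:
others = list(filter(lambda m: m.other_attribute == 'bar', common))
```

Both forms iterate the already-fetched objects, so no further database work happens once common has been evaluated.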

Whether one of these options beats simply re-issuing the query depends on the size of the sets involved, the complexity of the filters, and which indexes are present.
