Async vs Threads for low latency HTTP bulk requests

I need to do bulk HTTP requests (about 800 per second) for a financial application (the host does not offer a better API) and process the responses (JSON, usually no more than 1 KB) with low latency; the processing is only deserializing and comparing some values. Finally, I make another request based on each response, and the time between receiving a response and sending the next request should not exceed 1-2 ms.

I am currently using traditional threads with synchronous requests, where about 50% of the threads only process responses that arrive after 40-60 seconds and the other 50% send requests continuously. Although this approach worked fine with about 50-100 requests per second, I found that with 800 requests per second the time between a thread's response and its next request is too high (often 50-200 ms).

To fix this, I would like to ask:

1. Are asynchronous operations better for this? I have read a lot about the scalability and responsiveness of async, but I am not sure whether it is good for low latency (context switching, task creation, etc.).

2. Can I tweak the threading approach (regardless of question 1)? I was thinking of giving higher priority (via ThreadPriority) to the threads that are currently processing responses, but that doesn't really work. It might also be possible to completely stop the other threads from executing while a response is being processed.

3. Which class/library should I use for the HTTP requests? Currently HttpWebRequest is used, which was slightly faster than HttpClient in my tests. Or should I be using something else?

Any help would be greatly appreciated.



2 answers


I went through and solved this exact issue (downloading a lot of XML files from a server in parallel), and in my experience using async was about 20-50% faster, depending on the file size.

Unfortunately this was a few months ago, but I was using a WebClient for every request and just calling WebClient.DownloadString, so if this matches your use case it might work for you.
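If WebClient fits your use case, the async variant is a small change. A minimal sketch, assuming a hypothetical list of URLs and one WebClient per request (a single WebClient only allows one concurrent operation):

```csharp
using System;
using System.Linq;
using System.Net;
using System.Threading.Tasks;

class BulkDownload
{
    static async Task Main()
    {
        // Hypothetical endpoints; substitute your own.
        var urls = Enumerable.Range(0, 10)
            .Select(i => $"https://example.com/data/{i}.json");

        // DownloadStringTaskAsync is the awaitable counterpart of
        // WebClient.DownloadString. One WebClient per request avoids
        // its one-concurrent-operation-per-instance limitation.
        var tasks = urls.Select(async url =>
        {
            using (var client = new WebClient())
                return await client.DownloadStringTaskAsync(url);
        });

        string[] bodies = await Task.WhenAll(tasks);
        Console.WriteLine($"Downloaded {bodies.Length} responses.");
    }
}
```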



Real answer: try both and profile it. It shouldn't be hard to switch between the two!
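For the profiling itself, a Stopwatch around each batch is enough to compare the two approaches. A trivial sketch (the workload below is just a stand-in; plug in your threaded batch and your async batch):

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

class Profiling
{
    // Time an arbitrary batch of work; run it once with the threaded
    // implementation and once with the async one, then compare.
    static TimeSpan Measure(Action batch)
    {
        var sw = Stopwatch.StartNew();
        batch();
        sw.Stop();
        return sw.Elapsed;
    }

    static void Main()
    {
        // Stand-in workload; replace with your real request batch.
        TimeSpan elapsed = Measure(() => Task.Delay(50).Wait());
        Console.WriteLine($"Batch took {elapsed.TotalMilliseconds:F1} ms");
    }
}
```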



System.Net.Http.HttpClient works well enough for your needs in my experience. One of the problems is setting the number of parallel connections: you cannot use the HttpWebRequest.ConnectionLimit property directly, so you have to go through ServicePointManager:

ServicePoint servicePoint = ServicePointManager.FindServicePoint(uri);
servicePoint.ConnectionLimit = connectionLimit;

Then you can use tasks for your parallel operations and wait for the results with Task.WhenAll.
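Putting the pieces together, a minimal sketch (the endpoint and the connection limit of 100 are assumptions for illustration, not values from the question; ServicePointManager applies to the .NET Framework HTTP stack):

```csharp
using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class ParallelRequests
{
    static async Task Main()
    {
        // Hypothetical endpoint; substitute your own.
        var uri = new Uri("https://example.com/api/quotes");

        // Raise the per-host connection limit before issuing requests;
        // the default of 2 would serialize most of the work.
        ServicePoint servicePoint = ServicePointManager.FindServicePoint(uri);
        servicePoint.ConnectionLimit = 100;

        using (var client = new HttpClient())
        {
            // Fire the requests concurrently and await them all at once.
            var tasks = Enumerable.Range(0, 100)
                .Select(_ => client.GetStringAsync(uri));

            string[] responses = await Task.WhenAll(tasks);
            Console.WriteLine($"Received {responses.Length} responses.");
        }
    }
}
```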

0

