Increased Kinesis latency results in low throughput and high latency in Lambda

We use Kinesis as a buffer in front of Lambda, which then loads the data into Redshift. The Lambda function writes a file to S3 and runs a COPY in Redshift to insert the data. We are seeing very long delays before Kinesis data is processed, and we are worried that records older than 24 hours (the stream's retention period) are being dropped. We currently have 3 shards and are nowhere near the maximum throughput available.
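For context, a simplified sketch of this kind of handler (bucket, cluster, role, and table names below are placeholders):

```python
import uuid
import base64
import boto3

s3 = boto3.client("s3")
redshift_data = boto3.client("redshift-data")

# Placeholder names -- adjust to your environment.
BUCKET = "my-ingest-bucket"
CLUSTER = "my-redshift-cluster"
DATABASE = "analytics"
DB_USER = "loader"
TABLE = "events"
IAM_ROLE = "arn:aws:iam::123456789012:role/RedshiftCopyRole"

def handler(event, context):
    # Decode the Kinesis records in this batch into newline-delimited JSON.
    lines = [
        base64.b64decode(r["kinesis"]["data"]).decode("utf-8")
        for r in event["Records"]
    ]
    key = f"staging/{uuid.uuid4()}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body="\n".join(lines).encode("utf-8"))

    # Ask Redshift to COPY the staged file; the Data API runs it asynchronously.
    redshift_data.execute_statement(
        ClusterIdentifier=CLUSTER,
        Database=DATABASE,
        DbUser=DB_USER,
        Sql=(
            f"COPY {TABLE} FROM 's3://{BUCKET}/{key}' "
            f"IAM_ROLE '{IAM_ROLE}' FORMAT AS JSON 'auto';"
        ),
    )
    return {"records": len(lines), "s3_key": key}
```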

At the same time, we have also seen an increase in the amount of data flowing into Kinesis. However, since we are only using about a third of our write throughput, we should not be throttled. There is no sign of throttling in any of the Lambda or Redshift metrics.

The attached charts show metrics from our Kinesis stream. What could be causing this, and how can I fix it?

[Chart: Kinesis get requests]

[Chart: Kinesis get latency]



1 answer


What is most likely happening is that your Lambda function is not keeping up with the rate of data coming into Kinesis. The way Lambda works with Kinesis event streams, only one (non-concurrent) Lambda invocation is attached to each shard, so with 3 shards you only get 3 functions running at a time.

You can see whether the function is falling behind by looking at the GetRecords.IteratorAgeMilliseconds metric on Kinesis. Combining that with the average execution time of your Lambda function and the batch size of the Lambda event source should give you an idea of how much data your Lambda function is actually processing per second:

(event source batch size) * (average size of each record) / (average Lambda invocation duration) * (number of shards) = total bytes/second processed

You can use this to determine how many Kinesis shards you need to keep up with the load.
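As a rough illustration of that calculation (all numbers below are hypothetical; substitute your own CloudWatch figures):

```python
import math

# Hypothetical numbers -- replace with your own CloudWatch values.
batch_size = 100           # Lambda event source batch size (records per invocation)
avg_record_bytes = 5_000   # average size of each Kinesis record
avg_duration_s = 2.0       # average Lambda invocation duration in seconds
shards = 3                 # current number of shards

# Bytes/second the current setup can drain from the stream.
processed_bps = batch_size * avg_record_bytes / avg_duration_s * shards
print(f"processing capacity: {processed_bps / 1e6:.2f} MB/s")   # 0.75 MB/s here

# Compare with the observed incoming rate to size the stream.
incoming_bps = 1_200_000   # observed IncomingBytes per second (hypothetical)
needed_shards = math.ceil(incoming_bps / (processed_bps / shards))
print(f"shards needed to keep up: {needed_shards}")             # 5 here
```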



Also, you might want to look into the "fan out" pattern, where one Lambda function reads off the stream and then invokes another Lambda function directly with the events. That frees you from being limited to one function per shard.
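A minimal sketch of that dispatcher pattern, assuming a separate worker function (the function name and chunk size are made up):

```python
import json
import boto3

lambda_client = boto3.client("lambda")

# Hypothetical name of the downstream worker function.
WORKER_FUNCTION = "kinesis-batch-worker"
CHUNK_SIZE = 25

def dispatcher_handler(event, context):
    records = event["Records"]
    # Split the batch and hand each chunk to a worker invoked asynchronously,
    # so processing is no longer capped at one concurrent function per shard.
    for i in range(0, len(records), CHUNK_SIZE):
        chunk = records[i:i + CHUNK_SIZE]
        lambda_client.invoke(
            FunctionName=WORKER_FUNCTION,
            InvocationType="Event",  # asynchronous fire-and-forget invocation
            Payload=json.dumps({"Records": chunk}).encode("utf-8"),
        )
    return {"dispatched": len(records)}
```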
