How do I use fluentd to parse Docker's multi-line logs?

I am trying to set up centralized logging for my Docker environment using the fluentd logging driver, fluentd, Elasticsearch, and Kibana.

The logs are forwarded correctly, but multi-line JSON objects get split up, so each line is indexed as a separate event (see image).

(screenshot: a JSON log object split across several separate events in Kibana)

Is the correct way to handle this to structure the data with a custom regex in fluentd?

My fluentd.conf contains the following:
# receive events forwarded by the docker fluentd log driver
<source>
  type forward
  port 24224
  bind 0.0.0.0
</source>

# send everything tagged docker.** to Elasticsearch in Logstash-style indices
<match docker.**>
  type elasticsearch
  logstash_format true
  logstash_prefix logstash
  host elasticsearch
  port 9200
  flush_interval 5s
</match>

      

1 answer


Tkwon123 - am I correct that you want to write JSON logs from your docker containers and have each JSON object show up as one log entry in Kibana?

If I understand you correctly, it seems like you just need to make sure each JSON log is printed on a single line, terminated only by a newline. Looking at the screenshot, it looks like you are pretty-printing the JSON object, which splits it across multiple lines, so fluentd sees each line as a separate event.
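For example (these records are illustrative, not taken from the question), a compact, single-line entry like

{"level": "info", "msg": "payment processed", "order_id": 1234}

reaches fluentd as one record, whereas the pretty-printed form

{
  "level": "info",
  "msg": "payment processed",
  "order_id": 1234
}

is read by the docker log driver line by line and shows up in Kibana as five separate events.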



Hope I understood your question.
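If changing how the application writes its logs is not an option, one alternative (not part of the original answer; a minimal sketch assuming the docker fluentd log driver puts each line in the log field and that every JSON document starts with { at the beginning of a line) is to reassemble the lines inside fluentd with the fluent-plugin-concat filter and then parse the result as JSON:

# requires the fluent-plugin-concat plugin (and fluent-plugin-parser on older fluentd versions)
<filter docker.**>
  @type concat
  key log
  # a new record starts whenever a line begins with "{" (assumption about the log format)
  multiline_start_regexp /^\{/
  separator ""
</filter>

<filter docker.**>
  @type parser
  key_name log
  format json
  # keep the extra fields the log driver adds (container_id, container_name, source)
  reserve_data true
</filter>

These filters would go in fluentd.conf before the elasticsearch <match> block; exact parameter names can vary between plugin and fluentd versions, so check the plugin documentation for the version in use.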
