Logstasher + Kibana: message is double-quoted and difficult to parse

I am using this stack:

  • On every front-end server:
    • Rails
    • logstasher gem (writes the Rails logs in JSON format)
    • logstash-forwarder (just forwards the logs to logstash on the central server)
  • On the log server:
    • logstash (for centralizing and indexing logs)
    • Kibana (to display the logs)

Kibana works well with the JSON format, but the "message" data arrives as an escaped string rather than as parsed JSON (see the snippet below). Is there a way to fix this? As it is, accessing a field like status is trickier than it should be.

Here's an example document:

{
  _index: logstash-2014.09.18
  _type: rails
  _id: RHJgU2L_SoOKS79pBzU_mA
  _version: 1
  _score: null
  _source: {
    message: "{"@source":"unknown","@tags":["request"],"@fields":{"method":"GET","path":"/foo/bar","format":"html","controller":"items","action":"show","status":200,"duration":377.52,"view":355.67,"db":7.47,"ip":"123.456.789.123","route":"items#show","request_id":"021ad750600ab99758062de60102da8f"},"@timestamp":"2014-09-18T09:07:31.822782+00:00"}"
    @version: 1
    @timestamp: 2014-09-18T09:08:21.990Z
    type: rails
    file: /home/user/path/logstash_production.log
    host: webserver.example.com
    offset: 23200721
    format: json_event
  }
  sort: [
    rails
  ]
}
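
What I'd like is for the contents of message to be indexed as real fields. Sketched by hand (this is not actual output), the document would then look more like:

  _source: {
    @source: unknown
    @tags: ["request"]
    @fields: {
      method: GET
      path: /foo/bar
      status: 200
      duration: 377.52
    }
    @timestamp: 2014-09-18T09:07:31.822782+00:00
  }

so that status could be queried directly as @fields.status.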

Thank you for your help ;)

EDIT 1: Added the logstash config files:

/etc/logstash/conf.d/01-lumberjack-input.conf

input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    codec => "json"
  }
}


/etc/logstash/conf.d/10-syslog.conf

filter {
 if [type] == "syslog" {
   grok {
     match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
     add_field => [ "received_at", "%{@timestamp}" ]
     add_field => [ "received_from", "%{host}" ]
   }
   syslog_pri { }
   date {
     match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
   }
 }
}
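
For reference, this grok pattern splits a standard syslog line into its parts. A made-up sample line:

Sep 18 09:07:31 webserver sshd[1234]: Failed password for invalid user admin

would come out as syslog_timestamp = "Sep 18 09:07:31", syslog_hostname = "webserver", syslog_program = "sshd", syslog_pid = "1234" and syslog_message = "Failed password for invalid user admin".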


/etc/logstash/conf.d/30-lumberjack-output.conf

output {
  elasticsearch { host => localhost }
#  stdout { codec => rubydebug }
}


In case it's useful, here is the logstash-forwarder config (/etc/logstash-forwarder) on the web servers:

{
  "network": {
    "servers": [ "123.465.789.123:5000" ],
    "timeout": 45,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [
        "/var/log/messages",
        "/var/log/secure"
       ],
      "fields": { "type": "syslog" }
    },
    {
      "paths": [
        "/home/xnxx/gportal/shared/log/logstash_production.log"
      ],
      "fields": { "type": "rails", "format": "json_event" }
    }
  ]
}
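
As far as I understand, the entries under "fields" are just extra key/value pairs that logstash-forwarder attaches to each event; the log line itself is shipped as plain text. That matches the document above, where the raw JSON line lands in message and type/format arrive as ordinary fields, roughly like this hand-written sketch:

{
  "message": "<the raw JSON line>",
  "type": "rails",
  "format": "json_event",
  "file": "/home/xnxx/gportal/shared/log/logstash_production.log",
  "host": "webserver.example.com"
}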


My config files are mostly based on this tutorial: https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-and-visualize-logs-on-ubuntu-14-04



2 answers


What finally worked was to stop and then start logstash; with a plain restart, the configuration didn't seem to be reloaded.

So instead of:

sudo service logstash restart

I did:

sudo service logstash stop

wait ~ 1 minute, then

sudo service logstash start

I'm not sure why (the init script does the same stop/start itself, just without the one-minute wait), but it worked for me.
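
Equivalently, as a single command (the sleep is just the manual wait described above; I haven't tested whether a shorter pause is enough):

sudo service logstash stop && sleep 60 && sudo service logstash start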



I've never personally used the lumberjack input, but it looks like it should support codec => json, so I'm not sure why it doesn't work here. You can try using this filter instead, in place of the codec => "json" in /etc/logstash/conf.d/01-lumberjack-input.conf:

filter {
  json {
    source => 'message'
    remove_field => [ 'message' ]
  }
}
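
If the same logstash instance also receives the syslog events, you may want to scope the filter to the Rails events only. A minimal variant, assuming the type field set by logstash-forwarder above:

filter {
  # Parse the embedded JSON only for events tagged type=rails by the forwarder
  if [type] == "rails" {
    json {
      source => "message"
      remove_field => [ "message" ]
    }
  }
}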

