How to parse data from S3 using Logstash, push it to Elasticsearch, and visualize it in Kibana

I have a log file written to an S3 bucket every minute. The fields are delimited by "\x01". One of the columns is a timestamp.

I want to load this data into Elasticsearch.

I tried the following Logstash config, but it doesn't seem to work and I'm not sure what I'm missing. I based it on http://brewhouse.io/blog/2014/11/04/big-data-with-elk-stack.html

The Logstash configuration file looks like this:

input {
  s3 {
    bucket => "mybucketname"
    credentials => [ "accesskey", "secretkey" ]
  }
}
filter {
  csv {
    columns => [ "col1", "col2", "@timestamp" ]
    separator => "\x01"
  }
}
output {
  stdout { } 
}


How do I modify this config so that it picks up a new file every minute? A rough idea of what I am aiming for is sketched below.
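Something along these lines is what I have in mind, though I have not verified it; the interval value, the renamed log_timestamp column, the ISO8601 date pattern, and the Elasticsearch host are guesses on my part:

input {
  s3 {
    bucket => "mybucketname"
    credentials => [ "accesskey", "secretkey" ]
    # poll the bucket for new objects every 60 seconds
    interval => 60
  }
}
filter {
  csv {
    # read the timestamp into its own column instead of @timestamp
    columns => [ "col1", "col2", "log_timestamp" ]
    separator => "\x01"
  }
  # set @timestamp from the parsed column (the format here is a guess)
  date {
    match => [ "log_timestamp", "ISO8601" ]
  }
}
output {
  elasticsearch {
    host => "127.0.0.1"
  }
  stdout { codec => rubydebug }
}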

Then I will eventually want to connect Kibana to ES to visualize the changes.

1 answer


Just use logstash-forwarder to send the files from S3; you will need to create certificates for authorization.

There is a really good tutorial: https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-logs-on-centos-7
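To give a rough idea of the shape of that setup (the hostname, paths, and certificate locations below are placeholders, not taken from the tutorial), a minimal logstash-forwarder config looks something like this:

{
  "network": {
    "servers": [ "logstash.example.com:5043" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/var/log/myapp/*.log" ],
      "fields": { "type": "myapp" }
    }
  ]
}

On the Logstash side you would pair it with a lumberjack input listening on the same port, again with placeholder certificate paths:

input {
  lumberjack {
    # must match the port and certificate the forwarder points at
    port => 5043
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}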

If you are getting I/O errors, you may be able to solve them by setting an explicit cluster name:

inside logstash.conf:

output {
    elasticsearch {
        host => "127.0.0.1"
        cluster => "CLUSTER_NAME"
    }
}


inside elasticsearch.yml:

cluster.name: CLUSTER_NAME


If you have problems generating the certificates, you can use this tool instead: https://raw.githubusercontent.com/driskell/log-courier/develop/src/lc-tlscert/lc-tlscert.go
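Roughly, and assuming you have Go installed and the file builds standalone (I have not checked its dependencies), you would fetch and run it like this; it then prompts interactively for the certificate details and writes out the key and certificate files:

# download the single-file generator and run it with Go
curl -O https://raw.githubusercontent.com/driskell/log-courier/develop/src/lc-tlscert/lc-tlscert.go
go run lc-tlscert.go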

I also found a better init.d script for logstash-forwarder on CentOS: http://smuth.me/posts/centos-6-logstash-forwarder-init-script.html
