How to set a field in Logstash to "not_analyzed" using the Logstash config file

I have an elasticsearch index that I use to index a collection of documents.

These documents are originally in CSV format and I am ingesting them with Logstash.

My Logstash config file:

    input {
        file {
            path => "/csv_files_for_logstash/app1lg.csv"
            type => "core2"
            start_position => "beginning"
        }
    }

    filter {
        csv {
            separator => ","
            columns => ["Date","Package Name","App Version Code","Current Device Installs","Daily Device Installs","Daily Device Uninstalls","Daily Device Upgrades","Current User Installs","Total User Installs","Daily User Installs","Daily User Uninstalls"]
        }
        mutate { convert => ["App Version Code", "string"] }
        mutate { convert => ["Current Device Installs", "float"] }
        mutate { convert => ["Daily Device Installs", "float"] }
        mutate { convert => ["Daily Device Uninstalls", "float"] }
        mutate { convert => ["Current User Installs", "float"] }
        mutate { convert => ["Total User Installs", "float"] }
        mutate { convert => ["Daily User Installs", "float"] }
        mutate { convert => ["Daily User Uninstalls", "float"] }
        ruby {
            # Rewrite e.g. "123456789" as "4.56(789)" by walking the digits.
            code => '
                b = event["App Version Code"]
                string2 = ""
                for counter in (3..(b.size - 1))
                    if counter == 4
                        string2 += "." + b[counter]
                    elsif counter == 6
                        string2 += "(" + b[counter]
                    elsif counter == 8
                        string2 += b[counter] + ")"
                    else
                        string2 += b[counter]
                    end
                end
                event["App Version Code"] = string2
            '
        }
    }

    output {
        elasticsearch {
            embedded => true
            action => "index"
            host => "es"
            index => "fivetry"
            workers => 1
        }
        stdout {
            codec => rubydebug { }
        }
    }

      

Now my field (App Version Code) arrives as "123456789" in the CSV, and I reformat it to "4.56 (789)" using the Ruby code above.

Because this field is analyzed, the term gets broken into separate tokens instead of being kept as a single exact value.

I know the other way is to create a mapping and set the field to not_analyzed, but I don't know how. So,

Is there a way to set this field to not_analyzed using just my Logstash config file?

Also, the .raw sub-field, which would let me match the exact string, does not appear in Kibana.

Thanks and regards,



2 answers


You cannot set the mapping through the Logstash config itself; the mapping belongs to Elasticsearch, not to Logstash.



You will need to map these fields in Elasticsearch before inserting the documents. You can either create the index and set the mapping with the mapping API, or use an index template, which lets you define the mapping without creating the index first.
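For example, an index template along the lines of the sketch below would keep App Version Code as a single, unanalyzed term. This is only an illustration under a few assumptions: it targets an Elasticsearch 1.x/2.x-style cluster (where string fields accept "index": "not_analyzed", matching the era of the config above), it assumes Elasticsearch answers on es:9200, and the template name fivetry_template is an arbitrary choice.

    # Hypothetical template name; the pattern matches the "fivetry" index used above.
    curl -XPUT 'http://es:9200/_template/fivetry_template' -d '
    {
      "template": "fivetry*",
      "mappings": {
        "core2": {
          "properties": {
            "App Version Code": {
              "type": "string",
              "index": "not_analyzed"
            }
          }
        }
      }
    }'

Any fivetry* index created after the template is registered will pick up this mapping; existing indexes are not changed, so documents already written would need to be reindexed.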



Logstash provides a default template to use for new indexes. You can edit this file, but it's not a good idea (it will be overwritten on update, etc.).



The elasticsearch {} output lets you specify your own template to use instead of the default one.
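In the config above, that would mean pointing the output at a template file of your own, roughly as in this sketch. The file path is a placeholder, the template body could be the JSON shown in the previous answer, and template_overwrite tells Logstash to replace an already-installed template with the same name.

    output {
        elasticsearch {
            host => "es"
            index => "fivetry"
            # Placeholder path; point it at your own template JSON file.
            template => "/path/to/fivetry_template.json"
            template_name => "fivetry"
            template_overwrite => true
        }
    }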







