Elasticsearch limits token filter

I am trying to use the delimited_payload_filter on a text field, but with no luck.

My request:

PUT /myIndex
{
  "settings": {
    "analysis": {
      "analyzer": {
        "kuku": {
          "tokenizer": "standard",
          "filter": ["delimited_payload_filter"]
        }
      }
    }
  },
  "mappings": {
    "calls": {
      "properties": {
        "text": {
          "type": "text",
          "analyzer": "kuku"
        }
      }
    }
  }
}

Then I add the following document:

PUT /myIndex/calls/1
{
    "text" : "the|1 quick|2 fox|3"
}

I expect the following search to return a hit, but it doesn't:

GET /myIndex/calls/_search
{
    "query": {
        "match_phrase": {
             "text": "quick fox"
         }
    }
}



1 answer


Change the tokenizer to something other than "standard", for example "whitespace". The "standard" tokenizer splits the text on the "|" characters and strips them out, so by the time the delimited_payload_filter runs there are no delimiters left for it to work with (the payload numbers become separate tokens, which also breaks the phrase match).
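As a sketch, the index settings would then look like this (same mapping as in the question, only the tokenizer changed to "whitespace" so tokens like "quick|2" reach the filter intact):

```json
PUT /myIndex
{
  "settings": {
    "analysis": {
      "analyzer": {
        "kuku": {
          "tokenizer": "whitespace",
          "filter": ["delimited_payload_filter"]
        }
      }
    }
  },
  "mappings": {
    "calls": {
      "properties": {
        "text": {
          "type": "text",
          "analyzer": "kuku"
        }
      }
    }
  }
}
```

With this analyzer, "the|1 quick|2 fox|3" is indexed as the terms "the", "quick", "fox" (with 1, 2, 3 stored as payloads), so the match_phrase query for "quick fox" should return the document. You can verify the token output with the _analyze API before reindexing.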
