During indexing and query processing, text content is analyzed by the analysis module. It consists of analyzers, tokenizers, token filters and character filters.
POST _analyze
{
  "analyzer": "standard",
  "text": "Today's weather is beautiful"
}
The above will produce tokens based on the standard analyzer.
Tokens produced: today's, weather, is, beautiful
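The same _analyze endpoint also accepts an explicit tokenizer, token filters and character filters instead of a named analyzer, which makes the composition described at the top visible. The request below is only a sketch; the html_strip character filter and lowercase token filter are example choices, not part of the original post.
POST _analyze
{
  "char_filter": ["html_strip"],
  "tokenizer": "standard",
  "filter": ["lowercase"],
  "text": "<b>Today's weather is beautiful</b>"
}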
PUT index_mapping_analysis
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_grammar_analyzer": {
          "type": "standard",
          "max_token_length": 5,
          "stopwords": "_english_"
        }
      }
    }
  }
}
POST index_mapping_analysis/_analyze
{
  "analyzer": "my_grammar_analyzer",
  "text": "Today's weather is beautiful"
}
Tokens produced: today, s, weath, er, beaut, iful. The stop word "is" is dropped by the _english_ stop word list, and tokens longer than five characters are split because of max_token_length.
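To have the custom analyzer applied at index time, it has to be assigned to a text field in the index mapping. A minimal sketch, assuming a hypothetical description field on the same index:
PUT index_mapping_analysis/_mapping
{
  "properties": {
    "description": {
      "type": "text",
      "analyzer": "my_grammar_analyzer"
    }
  }
}
Documents indexed into the description field will then be tokenized with my_grammar_analyzer instead of the default standard analyzer.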