Uses of Class
com.azure.search.documents.indexes.models.LexicalTokenizer
Packages that use LexicalTokenizer
Package: com.azure.search.documents.indexes.models
Description: Package containing the data models for SearchServiceClient.
Uses of LexicalTokenizer in com.azure.search.documents.indexes.models
Subclasses of LexicalTokenizer in com.azure.search.documents.indexes.models

All of the following subclasses are declared as final classes:
- Grammar-based tokenizer that is suitable for processing most European-language documents.
- Tokenizes the input from an edge into n-grams of the given size(s).
- Emits the entire input as a single token.
- Breaks text following the Unicode Text Segmentation rules.
- Divides text using language-specific rules and reduces words to their base forms.
- Divides text using language-specific rules.
- Tokenizes the input into n-grams of the given size(s).
- Tokenizer for path-like hierarchies.
- Tokenizer that uses regex pattern matching to construct distinct tokens.
- Tokenizes URLs and emails as one token.

Methods in com.azure.search.documents.indexes.models that return LexicalTokenizer

static LexicalTokenizer LexicalTokenizer.fromJson(com.azure.json.JsonReader jsonReader)
    Reads an instance of LexicalTokenizer from the JsonReader.

Methods in com.azure.search.documents.indexes.models that return types with arguments of type LexicalTokenizer

List<LexicalTokenizer> SearchIndex.getTokenizers()
    Get the tokenizers property: The tokenizers for the index.

Methods in com.azure.search.documents.indexes.models with parameters of type LexicalTokenizer

SearchIndex.setTokenizers(LexicalTokenizer... tokenizers)
    Set the tokenizers property: The tokenizers for the index.

Method parameters in com.azure.search.documents.indexes.models with type arguments of type LexicalTokenizer

SearchIndex.setTokenizers(List<LexicalTokenizer> tokenizers)
    Set the tokenizers property: The tokenizers for the index.
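As a rough illustration of how these members fit together, the following minimal sketch builds a SearchIndex with a custom tokenizer and round-trips a tokenizer through fromJson. It assumes the ClassicTokenizer subclass with a single name-argument constructor, the getName() accessor on LexicalTokenizer, the #Microsoft.Azure.Search.ClassicTokenizer discriminator value in the JSON, and JsonProviders.createReader from the azure-json library; treat those pieces as assumptions rather than members documented on this page.

import com.azure.json.JsonProviders;
import com.azure.json.JsonReader;
import com.azure.search.documents.indexes.models.ClassicTokenizer;
import com.azure.search.documents.indexes.models.LexicalTokenizer;
import com.azure.search.documents.indexes.models.SearchIndex;

import java.io.IOException;
import java.util.List;

public class LexicalTokenizerUsageSketch {
    public static void main(String[] args) throws IOException {
        // Define a tokenizer. ClassicTokenizer (and its name-only constructor) is an
        // assumed concrete subclass; any of the final classes listed above could be
        // used the same way.
        LexicalTokenizer classic = new ClassicTokenizer("my-classic-tokenizer");

        // Attach it to an index definition via the varargs setTokenizers overload.
        SearchIndex index = new SearchIndex("hotels-index");
        index.setTokenizers(classic);

        // Read the tokenizers property back as a List<LexicalTokenizer>.
        List<LexicalTokenizer> tokenizers = index.getTokenizers();
        tokenizers.forEach(t -> System.out.println(t.getName()));

        // Deserialize a tokenizer from JSON with the static fromJson method.
        // The @odata.type discriminator value below is an assumption.
        String json = "{\"@odata.type\":\"#Microsoft.Azure.Search.ClassicTokenizer\","
            + "\"name\":\"my-classic-tokenizer\"}";
        try (JsonReader reader = JsonProviders.createReader(json)) {
            LexicalTokenizer fromJson = LexicalTokenizer.fromJson(reader);
            System.out.println(fromJson.getName());
        }
    }
}

The same tokenizers property can also be set through the List-based overload, SearchIndex.setTokenizers(List<LexicalTokenizer> tokenizers), listed last above.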