Class ClassicTokenizer
java.lang.Object
com.azure.search.documents.indexes.models.LexicalTokenizer
com.azure.search.documents.indexes.models.ClassicTokenizer
- All Implemented Interfaces:
com.azure.json.JsonSerializable<LexicalTokenizer>
Grammar-based tokenizer that is suitable for processing most European-language documents. This tokenizer is
implemented using Apache Lucene.
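
A minimal sketch of wiring this tokenizer into an index definition; the index name, tokenizer name, and the exact shape of the SearchIndex.setTokenizers call are assumptions for illustration:

import java.util.Arrays;

import com.azure.search.documents.indexes.models.ClassicTokenizer;
import com.azure.search.documents.indexes.models.SearchIndex;

public class DefineIndexWithClassicTokenizer {
    public static void main(String[] args) {
        // "hotels" and "my-classic-tokenizer" are hypothetical names.
        SearchIndex index = new SearchIndex("hotels")
            .setTokenizers(Arrays.asList(new ClassicTokenizer("my-classic-tokenizer")));

        System.out.println(index.getTokenizers().get(0).getName());
    }
}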
Constructor Summary
Constructor: ClassicTokenizer(String name)
Creates an instance of the ClassicTokenizer class.
Method Summary
Modifier and Type | Method | Description
static ClassicTokenizer | fromJson(com.azure.json.JsonReader jsonReader) | Reads an instance of ClassicTokenizer from the JsonReader.
Integer | getMaxTokenLength() | Get the maxTokenLength property: The maximum token length.
String | getOdataType() | Get the odataType property: A URI fragment specifying the type of tokenizer.
ClassicTokenizer | setMaxTokenLength(Integer maxTokenLength) | Set the maxTokenLength property: The maximum token length.
com.azure.json.JsonWriter | toJson(com.azure.json.JsonWriter jsonWriter) |

Methods inherited from class com.azure.search.documents.indexes.models.LexicalTokenizer:
getName

Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface com.azure.json.JsonSerializable:
toJson, toJson, toJsonBytes, toJsonString
Constructor Details

ClassicTokenizer
public ClassicTokenizer(String name)
Creates an instance of the ClassicTokenizer class.
Parameters:
name - the name value to set.
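
Example (a minimal sketch; the tokenizer name below is hypothetical):

import com.azure.search.documents.indexes.models.ClassicTokenizer;

public class CreateClassicTokenizer {
    public static void main(String[] args) {
        // The name identifies this tokenizer within an index's analyzer configuration.
        ClassicTokenizer tokenizer = new ClassicTokenizer("my-classic-tokenizer");

        System.out.println(tokenizer.getName());      // my-classic-tokenizer (inherited from LexicalTokenizer)
        System.out.println(tokenizer.getOdataType()); // the @odata.type discriminator for this tokenizer type
    }
}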
Method Details
getOdataType
Get the odataType property: A URI fragment specifying the type of tokenizer.
Overrides:
getOdataType in class LexicalTokenizer
Returns:
the odataType value.
getMaxTokenLength
Get the maxTokenLength property: The maximum token length. Default is 255. Tokens longer than the maximum length are split. The maximum token length that can be used is 300 characters.
Returns:
the maxTokenLength value.
setMaxTokenLength
Set the maxTokenLength property: The maximum token length. Default is 255. Tokens longer than the maximum length are split. The maximum token length that can be used is 300 characters.
Parameters:
maxTokenLength - the maxTokenLength value to set.
Returns:
the ClassicTokenizer object itself.
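
Because setMaxTokenLength returns the ClassicTokenizer itself, calls can be chained fluently. A minimal sketch (the tokenizer name is hypothetical):

import com.azure.search.documents.indexes.models.ClassicTokenizer;

public class ConfigureMaxTokenLength {
    public static void main(String[] args) {
        // 300 is the documented upper bound; leaving the Integer unset keeps the default of 255.
        ClassicTokenizer tokenizer = new ClassicTokenizer("my-classic-tokenizer")
            .setMaxTokenLength(300);

        System.out.println(tokenizer.getMaxTokenLength()); // 300
    }
}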
toJson
Specified by:
toJson in interface com.azure.json.JsonSerializable<LexicalTokenizer>
Overrides:
toJson in class LexicalTokenizer
Throws:
IOException
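
Serialization example. A minimal sketch using toJsonString(), one of the convenience methods inherited from com.azure.json.JsonSerializable (see the Method Summary above); the tokenizer name is hypothetical:

import java.io.IOException;

import com.azure.search.documents.indexes.models.ClassicTokenizer;

public class SerializeClassicTokenizer {
    public static void main(String[] args) throws IOException {
        ClassicTokenizer tokenizer = new ClassicTokenizer("my-classic-tokenizer")
            .setMaxTokenLength(120);

        // toJsonString() wraps toJson(JsonWriter) and returns the serialized JSON.
        String json = tokenizer.toJsonString();
        System.out.println(json);
    }
}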
fromJson
Reads an instance of ClassicTokenizer from the JsonReader.
Parameters:
jsonReader - The JsonReader being read.
Returns:
An instance of ClassicTokenizer if the JsonReader was pointing to an instance of it, or null if it was pointing to JSON null.
Throws:
IllegalStateException - If the deserialized JSON object was missing any required properties.
IOException - If an error occurs while reading the ClassicTokenizer.
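
Deserialization example. A minimal sketch using com.azure.json.JsonProviders to create the JsonReader; the JSON payload, including the @odata.type discriminator value, is an assumption for illustration:

import java.io.IOException;

import com.azure.json.JsonProviders;
import com.azure.json.JsonReader;
import com.azure.search.documents.indexes.models.ClassicTokenizer;

public class DeserializeClassicTokenizer {
    public static void main(String[] args) throws IOException {
        // Hypothetical payload; "name" is required, "maxTokenLength" is optional.
        String json = "{\"@odata.type\":\"#Microsoft.Azure.Search.ClassicTokenizer\","
            + "\"name\":\"my-classic-tokenizer\",\"maxTokenLength\":120}";

        try (JsonReader reader = JsonProviders.createReader(json)) {
            ClassicTokenizer tokenizer = ClassicTokenizer.fromJson(reader);
            System.out.println(tokenizer.getName());           // my-classic-tokenizer
            System.out.println(tokenizer.getMaxTokenLength()); // 120
        }
    }
}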