
EdgeNGramTokenizer Class

Definition

Tokenizes the input from an edge into n-grams of the given size(s). This tokenizer is implemented using Apache Lucene.

C#
public class EdgeNGramTokenizer : Azure.Search.Documents.Indexes.Models.LexicalTokenizer

F#
type EdgeNGramTokenizer = class
    inherit LexicalTokenizer

VB.NET
Public Class EdgeNGramTokenizer
Inherits LexicalTokenizer
Inheritance
LexicalTokenizer → EdgeNGramTokenizer
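
Example

A minimal, hypothetical C# sketch of constructing and configuring this tokenizer. The tokenizer name and the gram sizes below are illustrative assumptions, not values taken from this page. With MinGram = 2 and MaxGram = 10, the input "search" would be split into the edge n-grams "se", "sea", "sear", "searc", and "search".

using Azure.Search.Documents.Indexes.Models;

// Hypothetical tokenizer definition: emit prefixes of 2 to 10 characters,
// keeping only letter and digit characters in the resulting tokens.
var tokenizer = new EdgeNGramTokenizer("my-edge-ngram-tokenizer")
{
    MinGram = 2,
    MaxGram = 10
};
tokenizer.TokenChars.Add(TokenCharacterKind.Letter);
tokenizer.TokenChars.Add(TokenCharacterKind.Digit);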

Constructors

EdgeNGramTokenizer(String)

Initializes a new instance of EdgeNGramTokenizer.

Properties

MaxGram

The maximum n-gram length. Default is 2. Maximum is 300.

MinGram

The minimum n-gram length. Default is 1. Maximum is 300. Must be less than the value of MaxGram.

Name

The name of the tokenizer. It must only contain letters, digits, spaces, dashes or underscores, can only start and end with alphanumeric characters, and is limited to 128 characters.

(Inherited from LexicalTokenizer)
TokenChars

Character classes to keep in the tokens.
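
The properties above are typically set when the tokenizer is registered on a search index and referenced by name from a custom analyzer. The following sketch shows one plausible way to do that with the Azure.Search.Documents SDK; the index name, field names, analyzer name, tokenizer name, endpoint, and key are assumptions made for illustration, not values from this page.

using System;
using Azure;
using Azure.Search.Documents.Indexes;
using Azure.Search.Documents.Indexes.Models;

// Hypothetical index definition wiring the tokenizer into a custom analyzer.
var index = new SearchIndex("products")
{
    Fields =
    {
        new SimpleField("id", SearchFieldDataType.String) { IsKey = true },
        new SearchableField("name") { AnalyzerName = "prefix-analyzer" }
    },
    // Register the edge n-gram tokenizer on the index.
    Tokenizers =
    {
        new EdgeNGramTokenizer("edge-tokenizer") { MinGram = 1, MaxGram = 2 }
    },
    // Reference the tokenizer by name from the custom analyzer used by "name".
    Analyzers =
    {
        new CustomAnalyzer("prefix-analyzer", "edge-tokenizer")
    }
};

// Placeholder endpoint and key; replace with real service values.
var client = new SearchIndexClient(
    new Uri("https://<service-name>.search.windows.net"),
    new AzureKeyCredential("<admin-api-key>"));

client.CreateOrUpdateIndex(index);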

Applies to