package
0.0.0-20220907150529-4ecbd2543f9e
Repository: https://github.com/stackrox/bleve.git
Documentation: pkg.go.dev

# Packages

Package token_map implements a generic TokenMap, often used in conjunction with filters to remove or process specific tokens.
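
The typical use of a token map described above — feeding a set of terms to a filter that removes them — can be sketched as follows. This is a minimal illustration with simplified, hypothetical names (`tokenMap`, `stopFilter`), not the library's actual API, which operates on richer token-stream types.

```go
package main

import "fmt"

// tokenMap is a set of terms, e.g. stop words loaded from a word list.
type tokenMap map[string]bool

// stopFilter drops any term found in the map; everything else passes through.
func stopFilter(m tokenMap, terms []string) []string {
	var out []string
	for _, t := range terms {
		if !m[t] {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	stops := tokenMap{"the": true, "a": true}
	fmt.Println(stopFilter(stops, []string{"the", "quick", "a", "fox"}))
	// prints [quick fox]
}
```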

# Functions

BuildTermFromRunesOptimistic builds a term from the provided runes and optimistically attempts to encode it into the provided buffer. If at any point the buffer appears to be too small, a new buffer is allocated and used instead. This should be used in cases where the new term is frequently the same length as, or shorter than, the original term (in number of bytes).
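The optimistic-buffer technique that BuildTermFromRunesOptimistic describes can be sketched roughly like this — a simplified reimplementation for illustration, not the library's exact code:

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// buildTermOptimistic encodes runes into buf, assuming they will fit.
// Only when the buffer turns out to be too small does it allocate a
// worst-case-sized replacement and copy over the work done so far.
func buildTermOptimistic(buf []byte, runes []rune) []byte {
	used := 0
	for _, r := range runes {
		if used+utf8.RuneLen(r) > len(buf) {
			// Pessimistic fallback: size for the worst case
			// (every rune at maximum UTF-8 width).
			bigger := make([]byte, len(runes)*utf8.UTFMax)
			copy(bigger, buf[:used])
			buf = bigger
		}
		used += utf8.EncodeRune(buf[used:], r)
	}
	return buf[:used]
}

func main() {
	// "héllo" needs 6 bytes; a 4-byte buffer forces one reallocation.
	fmt.Println(string(buildTermOptimistic(make([]byte, 4), []rune("héllo"))))
	// prints héllo
}
```

When the caller's buffer is already large enough — the common case this function is optimized for — no allocation happens at all.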

# Constants


# Variables


# Structs

Token represents one occurrence of a term at a particular location in a field.
TokenFreq represents all the occurrences of a term in all fields of a document.
TokenLocation represents one occurrence of a term at a particular location in a field.
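The relationship between these three structs — a per-field occurrence, its recorded location, and the document-wide aggregate — can be sketched as below. Field names here are an approximation chosen for illustration and may not match the library's declarations exactly.

```go
package main

import "fmt"

// Token is one occurrence of a term produced during analysis of a field.
type Token struct {
	Term       []byte
	Start, End int // byte offsets within the field
	Position   int // token position within the field
}

// TokenLocation records where one occurrence of a term appeared.
type TokenLocation struct {
	Field      string
	Start, End int
	Position   int
}

// TokenFreq aggregates every occurrence of one term across a document.
type TokenFreq struct {
	Term      []byte
	Locations []*TokenLocation
}

// Frequency is the number of recorded occurrences.
func (tf *TokenFreq) Frequency() int { return len(tf.Locations) }

func main() {
	tf := TokenFreq{
		Term: []byte("fox"),
		Locations: []*TokenLocation{
			{Field: "title", Start: 10, End: 13, Position: 3},
			{Field: "body", Start: 42, End: 45, Position: 8},
		},
	}
	fmt.Println(tf.Frequency()) // prints 2
}
```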

# Interfaces

A TokenFilter adds, transforms or removes tokens from a token stream.
A Tokenizer splits an input string into tokens, the usual behaviour being to map words to tokens.
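How a Tokenizer and a TokenFilter compose into an analysis pipeline can be sketched as follows. The types are simplified for illustration; the library's real interfaces operate on richer tokens and streams.

```go
package main

import (
	"bytes"
	"fmt"
)

type Token struct{ Term []byte }
type TokenStream []*Token

// Tokenizer splits raw input into a stream of tokens.
type Tokenizer interface {
	Tokenize(input []byte) TokenStream
}

// TokenFilter adds, transforms or removes tokens from a token stream.
type TokenFilter interface {
	Filter(input TokenStream) TokenStream
}

// whitespaceTokenizer maps whitespace-separated words to tokens.
type whitespaceTokenizer struct{}

func (whitespaceTokenizer) Tokenize(input []byte) TokenStream {
	var ts TokenStream
	for _, w := range bytes.Fields(input) {
		ts = append(ts, &Token{Term: w})
	}
	return ts
}

// lowercaseFilter transforms each token's term in place.
type lowercaseFilter struct{}

func (lowercaseFilter) Filter(input TokenStream) TokenStream {
	for _, t := range input {
		t.Term = bytes.ToLower(t.Term)
	}
	return input
}

func main() {
	var tz Tokenizer = whitespaceTokenizer{}
	var f TokenFilter = lowercaseFilter{}
	for _, t := range f.Filter(tz.Tokenize([]byte("The Quick Fox"))) {
		fmt.Println(string(t.Term)) // prints the, quick, fox
	}
}
```

An analyzer is then just a tokenizer followed by an ordered chain of filters, each consuming the previous one's output stream.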

# Type aliases

TokenFrequencies maps document terms to their combined frequencies from all fields.
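The TokenFrequencies alias above can be pictured as a map from term to its aggregated occurrences. The sketch below uses simplified, illustrative types (a flat list of terms standing in for an analyzed token stream) rather than the library's actual signatures.

```go
package main

import "fmt"

type TokenLocation struct{ Position int }

type TokenFreq struct {
	Term      []byte
	Locations []*TokenLocation
}

func (tf *TokenFreq) Frequency() int { return len(tf.Locations) }

// TokenFrequencies maps a term to all of its occurrences.
type TokenFrequencies map[string]*TokenFreq

// tokenFrequency aggregates terms by counting one location per occurrence.
func tokenFrequency(terms []string) TokenFrequencies {
	rv := make(TokenFrequencies, len(terms))
	for pos, term := range terms {
		tf, ok := rv[term]
		if !ok {
			tf = &TokenFreq{Term: []byte(term)}
			rv[term] = tf
		}
		tf.Locations = append(tf.Locations, &TokenLocation{Position: pos})
	}
	return rv
}

func main() {
	freqs := tokenFrequency([]string{"to", "be", "or", "not", "to", "be"})
	fmt.Println(freqs["to"].Frequency(), freqs["or"].Frequency()) // prints 2 1
}
```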