package analysis
Version: 2.4.4
Repository: https://github.com/blevesearch/bleve.git
Documentation: pkg.go.dev

# Packages

Package token_map implements a generic TokenMap, often used in conjunction with filters to remove or process specific tokens.
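
As a rough illustration, the sketch below builds such a token map by hand. It assumes the TokenMap type exposed by this analysis package (a map of term to bool) along with its NewTokenMap, AddToken, and LoadBytes helpers, and the module path github.com/blevesearch/bleve/v2; treat the exact helpers as assumptions, not a definitive API listing.

```go
package main

import (
	"fmt"

	"github.com/blevesearch/bleve/v2/analysis"
)

func main() {
	// Build a token map of terms that downstream filters should act on.
	stopWords := analysis.NewTokenMap()
	stopWords.AddToken("the")
	stopWords.AddToken("and")

	// LoadBytes accepts newline-separated terms, e.g. an embedded word list.
	if err := stopWords.LoadBytes([]byte("a\nan\nof\n")); err != nil {
		panic(err)
	}

	// A filter would consult the map for each token's term.
	fmt.Println(stopWords["the"], stopWords["cat"]) // true false
}
```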

# Functions

BuildTermFromRunesOptimistic builds a term from the provided runes and optimistically attempts to encode it into the provided buffer. If at any point the buffer appears too small, a new buffer is allocated and used instead. This should be used in cases where the new term is frequently the same length as, or shorter than, the original term (in number of bytes).
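
A minimal usage sketch, assuming the signature BuildTermFromRunesOptimistic(buf []byte, runes []rune) []byte and the import path github.com/blevesearch/bleve/v2/analysis; the lowercasing step is purely illustrative of the kind of rune transformation a filter might apply before re-encoding.

```go
package main

import (
	"fmt"
	"unicode"

	"github.com/blevesearch/bleve/v2/analysis"
)

func main() {
	// Original term bytes; filters typically try to reuse this buffer.
	term := []byte("ÉCOLE")

	// Transform the runes (here: lowercasing, purely for illustration).
	runes := []rune(string(term))
	for i, r := range runes {
		runes[i] = unicode.ToLower(r)
	}

	// Encode the transformed runes, reusing term's storage when it is large
	// enough; otherwise a bigger buffer is allocated and returned instead.
	out := analysis.BuildTermFromRunesOptimistic(term, runes)
	fmt.Println(string(out)) // école
}
```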

# Constants

# Variables

# Structs

Token represents one occurrence of a term at a particular location in a field.
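
A hedged sketch of constructing a Token directly; the field names (Term, Start, End, Position, Type) and the AlphaNumeric token type constant are assumed from this package and may differ slightly between versions.

```go
package main

import (
	"fmt"

	"github.com/blevesearch/bleve/v2/analysis"
)

func main() {
	// One occurrence of the term "bleve", covering bytes [10, 15) of the field
	// and appearing as the second token in the stream.
	tok := analysis.Token{
		Term:     []byte("bleve"),
		Start:    10,
		End:      15,
		Position: 2,
		Type:     analysis.AlphaNumeric,
	}
	fmt.Printf("%s at [%d,%d) pos %d\n", tok.Term, tok.Start, tok.End, tok.Position)
}
```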

# Interfaces

A TokenFilter adds, transforms or removes tokens from a token stream.
A Tokenizer splits an input string into tokens, the usual behaviour being to map words to tokens.
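
A minimal sketch of a custom TokenFilter, assuming TokenStream is a slice of *Token and the interface method is Filter(TokenStream) TokenStream; the dropTokensFilter type is hypothetical and only illustrates the contract of removing tokens from a stream.

```go
package main

import (
	"fmt"

	"github.com/blevesearch/bleve/v2/analysis"
)

// dropTokensFilter removes every token whose term appears in the drop map.
// The name and type are hypothetical; only the Filter method signature matters.
type dropTokensFilter struct {
	drop analysis.TokenMap
}

func (f *dropTokensFilter) Filter(input analysis.TokenStream) analysis.TokenStream {
	rv := input[:0] // filter in place, reusing the original backing array
	for _, tok := range input {
		if !f.drop[string(tok.Term)] {
			rv = append(rv, tok)
		}
	}
	return rv
}

// Compile-time check that the sketch satisfies the TokenFilter interface.
var _ analysis.TokenFilter = (*dropTokensFilter)(nil)

func main() {
	stop := analysis.NewTokenMap()
	stop.AddToken("the")

	filter := &dropTokensFilter{drop: stop}
	in := analysis.TokenStream{
		{Term: []byte("the"), Start: 0, End: 3, Position: 1},
		{Term: []byte("quick"), Start: 4, End: 9, Position: 2},
	}
	for _, tok := range filter.Filter(in) {
		fmt.Println(string(tok.Term)) // prints: quick
	}
}
```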

# Type aliases