package
0.0.186
Repository: https://github.com/alexamies/chinesenotes-go.git
Documentation: pkg.go.dev

# Functions

Segment a text document into segments of Chinese separated by either punctuation or non-Chinese text.
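The segmentation behavior described above can be sketched as follows. This is a minimal illustration, not the package's actual API: the function name `segment` and the shape of the `TextSegment` struct here are assumptions, and the real implementation may classify characters differently.

```go
package main

import (
	"fmt"
	"unicode"
)

// TextSegment holds a run of text that is either entirely Chinese
// or entirely non-Chinese. Illustrative shape only.
type TextSegment struct {
	Text    string
	Chinese bool
}

// segment splits text into alternating Chinese and non-Chinese runs.
// A rune counts as Chinese if it belongs to the Han script; punctuation
// and all other characters start or extend a non-Chinese segment.
func segment(text string) []TextSegment {
	var segs []TextSegment
	for _, r := range text {
		han := unicode.Is(unicode.Han, r)
		if len(segs) == 0 || segs[len(segs)-1].Chinese != han {
			segs = append(segs, TextSegment{Chinese: han})
		}
		segs[len(segs)-1].Text += string(r)
	}
	return segs
}

func main() {
	// The fullwidth comma and "world" merge into one non-Chinese segment.
	for _, s := range segment("你好，world！大家好") {
		fmt.Printf("%q chinese=%v\n", s.Text, s.Chinese)
	}
}
```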

# Structs

Tokenizes Chinese text using a dictionary.
A text segment that contains either Chinese or non-Chinese text.
A text token contains the results of tokenizing a string.
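Dictionary-based tokenization, as the struct summaries above describe it, can be sketched with greedy forward longest matching. The names `DictTokenizer`, `Tokenize`, and `TextToken` mirror the descriptions, but the signatures and fields below are assumptions for illustration, not the package's real types.

```go
package main

import "fmt"

// TextToken holds the result of tokenizing one word. Illustrative only;
// the real struct likely carries dictionary entries as well.
type TextToken struct {
	Token string
}

// DictTokenizer tokenizes Chinese text using a dictionary of known
// words, taking the longest dictionary match at each position
// (greedy forward maximum matching).
type DictTokenizer struct {
	Dict map[string]bool // set of known words (assumed shape)
}

// Tokenize splits chunk into dictionary words, falling back to a
// single rune when no dictionary entry matches.
func (t DictTokenizer) Tokenize(chunk string) []TextToken {
	var tokens []TextToken
	runes := []rune(chunk)
	for i := 0; i < len(runes); {
		n := 1 // fallback: emit one rune
		for j := len(runes); j > i; j-- {
			if t.Dict[string(runes[i:j])] {
				n = j - i // longest dictionary match wins
				break
			}
		}
		tokens = append(tokens, TextToken{Token: string(runes[i : i+n])})
		i += n
	}
	return tokens
}

func main() {
	tk := DictTokenizer{Dict: map[string]bool{
		"你好": true, "大家": true, "好": true,
	}}
	for _, tok := range tk.Tokenize("你好大家好") {
		fmt.Println(tok.Token)
	}
}
```

Greedy longest match prefers "你好" over the single-character entry "好" at the start of the string, which is the usual trade-off of forward maximum matching: fast and simple, but it can mis-segment when a shorter split would be correct.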

# Interfaces

Tokenizes Chinese text.