# Functions
- `NewGitLexer` creates a new `Lexer` reading from an `io.Reader`.
- `NewGitLexerString` creates a new `Lexer` reading from a string.
- `NewLexer` creates a new `Lexer` reading from an `io.Reader`.
- `NewLexerString` creates a new `Lexer` reading from a string.
- `Split` splits a string according to POSIX or non-POSIX rules.
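
For orientation, here is a minimal usage sketch of `Split`. The import path and the exact signature (in particular, whether `Split` takes a POSIX flag as a second argument) are assumptions for illustration, not confirmed by this package's documentation, so check the source before relying on them.

```go
package main

import (
	"fmt"
	"log"

	shlex "example.com/user/shlex" // hypothetical import path for this package
)

func main() {
	// Assumed signature: Split(input string, posix bool) ([]string, error).
	// With posix=true, quoting and backslash escaping follow POSIX shell rules;
	// with posix=false, a simpler non-POSIX splitting mode is used.
	words, err := shlex.Split(`cp "my file.txt" /tmp`, true)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(words) // expected: [cp my file.txt /tmp]
}
```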
# Structs
- `DefaultTokenizer` implements a simple tokenizer that behaves like a Unix shell.
- `Lexer` represents a lexical analyzer.
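
A `Lexer` is built over any `io.Reader` and, unless replaced, classifies characters with the `DefaultTokenizer`. The sketch below assumes `NewLexer` takes an `io.Reader` plus a POSIX flag and that the `Lexer` exposes a `Split`-style method; both are assumptions for illustration rather than the verified API.

```go
package main

import (
	"fmt"
	"log"
	"strings"

	shlex "example.com/user/shlex" // hypothetical import path for this package
)

func main() {
	// Assumed constructor: NewLexer(r io.Reader, posix bool) *Lexer.
	// Any io.Reader works as input; a strings.Reader is used here for brevity.
	lexer := shlex.NewLexer(strings.NewReader(`tar -czf "backup 2024.tar.gz" ./src`), true)

	// Assumed method: (*Lexer).Split() ([]string, error), returning all tokens.
	words, err := lexer.Split()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(words) // expected: [tar -czf backup 2024.tar.gz ./src]
}
```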
# Interfaces
- `Tokenizer` is the interface that classifies tokens into words, whitespace, quotations, escapes, and escaped quotations.
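
Because `Tokenizer` is an interface, the character classes can be customized. The sketch below assumes a rune-classifying method set (`IsWord`, `IsWhitespace`, `IsQuote`, `IsEscape`, `IsEscapedQuote`) inferred from the description above; the real interface may differ, so treat this as an illustration of the idea rather than the actual contract.

```go
package example

// commaTokenizer is a hypothetical Tokenizer that also treats commas as
// whitespace, so input splits on "," in addition to spaces and tabs.
// The rune-classifying method set below is an assumption inferred from the
// interface description, not the package's verified contract.
type commaTokenizer struct{}

func (t commaTokenizer) IsWhitespace(r rune) bool   { return r == ' ' || r == '\t' || r == ',' }
func (t commaTokenizer) IsQuote(r rune) bool        { return r == '"' || r == '\'' }
func (t commaTokenizer) IsEscape(r rune) bool       { return r == '\\' }
func (t commaTokenizer) IsEscapedQuote(r rune) bool { return r == '"' }
func (t commaTokenizer) IsWord(r rune) bool {
	// Anything that is not whitespace, a quote, or an escape is part of a word.
	return !t.IsWhitespace(r) && !t.IsQuote(r) && !t.IsEscape(r)
}
```

How a custom tokenizer is plugged into a `Lexer` depends on the Lexer's constructor or fields, which this summary does not cover.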