# README
## tokenizer package

This package parses SQL statements to do two specific things:

- return tokens for a SQL statement, where those tokens identify the keyword, type, name and value text within the statement;
- determine whether a statement is "complete", that is, ends with a trailing semicolon.
This package is part of a wider project, github.com/mutablelogic/go-sqlite. Please see the module documentation for more information.
## Using the tokenizer
Here's an example of using the tokenizer:
```go
import (
	"github.com/mutablelogic/go-sqlite/pkg/tokenizer"
)

// Tokenize returns all tokens for a SQL statement, or an error.
func Tokenize(q string) ([]interface{}, error) {
	t := tokenizer.NewTokenizer(q)
	tokens := []interface{}{}
	for {
		token, err := t.Next()
		// A nil token indicates the end of the input
		if token == nil {
			return tokens, nil
		}
		if err != nil {
			return nil, err
		}
		tokens = append(tokens, token)
	}
}
```
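A minimal sketch of calling this helper from the same package (the query text is arbitrary, and the printed output depends on the tokenizer's concrete token types):

```go
package main

import (
	"fmt"
	"log"
)

func main() {
	// Tokenize is the helper defined above
	tokens, err := Tokenize("SELECT id, name FROM users WHERE id = 1;")
	if err != nil {
		log.Fatal(err)
	}
	// Print the dynamic type and value of each token
	for _, token := range tokens {
		fmt.Printf("%T %v\n", token, token)
	}
}
```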
Tokens returned can be one of the following types:

- `KeywordToken`: a keyword, such as `SELECT`, `FROM`, `WHERE`, etc.
- `TypeToken`: a type such as `INTEGER`, `TEXT`, etc.
- `NameToken`: a table or column name
- `ValueToken`: a numeric, boolean or text value
- `WhitespaceToken`: spaces, tabs and newlines
- `PuncuationToken`: anything not included above
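To act on the different kinds of tokens, a type switch can be used. The following is a sketch only: it assumes the types listed above are exported by the tokenizer package under those names, and the `classify` helper is hypothetical.

```go
import (
	"github.com/mutablelogic/go-sqlite/pkg/tokenizer"
)

// classify is a hypothetical helper that labels a token returned by Next()
func classify(token interface{}) string {
	switch token.(type) {
	case tokenizer.KeywordToken:
		return "keyword"
	case tokenizer.TypeToken:
		return "type"
	case tokenizer.NameToken:
		return "name"
	case tokenizer.ValueToken:
		return "value"
	case tokenizer.WhitespaceToken:
		return "whitespace"
	case tokenizer.PuncuationToken:
		return "punctuation"
	default:
		return "other"
	}
}
```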
## Establishing if a statement is complete
Call the `IsComplete(string) bool` function to determine whether a statement is complete. As per the SQLite documentation, this is "useful during command-line input to determine if the currently entered text seems to form a complete SQL statement or if additional input is needed before sending the text into SQLite for parsing". Note, however, that these routines "...do not parse the SQL statements thus will not detect syntactically incorrect SQL."
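For example, a short sketch using `IsComplete` (the statements shown are illustrative):

```go
import (
	"fmt"

	"github.com/mutablelogic/go-sqlite/pkg/tokenizer"
)

func Example() {
	fmt.Println(tokenizer.IsComplete("SELECT * FROM t"))  // false: no trailing semicolon
	fmt.Println(tokenizer.IsComplete("SELECT * FROM t;")) // true: appears to be a complete statement
}
```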
## Functions

- `IsComplete` returns true if the input string appears to be a complete SQL statement.
- `NewTokenizer` returns a new `Tokenizer` that scans the input SQL statement.
## Type aliases

The package also exports type aliases for the token types described above; see the package documentation for details.