package
Version: 0.11.2
Repository: https://github.com/tekwizely/run.git
Documentation: pkg.go.dev

# Functions

Lex initiates the lexer against a byte array.
LexAssert assumes 'ASSERT' has already been matched.
LexAssertMessage parses an (optional) assertion error message.
LexAssignmentValue delegates to other rValue lexers.
LexCmdConfigOpt matches: name [ ! | ? | ?= VALUE ].
LexCmdConfigOptTail matches: [-l] [--long] [<label>] ["desc"].
LexCmdConfigShell lexes a doc block SHELL line.
LexCmdConfigUsage lexes a doc block USAGE line.
LexCmdScriptAfterLBrace finishes a script after the trailing LBrace.
LexCmdScriptMaybeLBrace lexes a script with an optional leading LBrace.
LexCmdScriptMaybeRBrace tries to match an RBrace.
LexCmdShellName lexes a command's shell.
LexDocBlockAttr lexes a doc block attribute line.
LexDocBlockDesc lexes a single doc block description line.
LexDocBlockNQString lexes a doc block comment line.
LexDQString lexes a Double-Quoted String.
LexExpectCommandName matches a dash-id or throws an error.
LexExpectNewline matches whitespace + newline or throws an error.
LexExport lexes a global OR doc block EXPORT line.
LexIgnoreNewline matches + ignores whitespace + newline.
LexMain is the primary lexer entry point.
LexMaybeBangOrQMark eats current whitespace, then emits either TokenBang, TokenQMark, or TokenUnknownRune.
LexMaybeNewline eats current whitespace, then emits either TokenNewline or TokenNotNewline.
LexSQString lexes a Single-Quoted String. No escape sequences are recognized in SQuotes, not even '\''.
LexSubCmd matches: [ '$' '(' [::print::] ')' ].
LexVarRef matches: [ '$' '{' [A-Za-z0-9_.]* '}' ].

# Constants

We define our lexer tokens starting from the pre-defined START token.

The individual token constants represent, in declaration order: '!', ' ]', '[ ', ':', ',', ' ]]', '[[ ', '$', ' ))', '(( ', '"', '=' | ':=', '{', '[', '(', a meta token, ' )', '( ', '?', '?=', '}', ']', ')', and "'".

# Structs

LexContext allows us to track additional state of the lexer.

# Type aliases

LexFn is a lexer function that takes a context.