Package documentation for github.com/iflytek/spark-ai-go
Version: 0.0.0-20240509090842-11decd0816f6 (pseudo-version)
Repository: https://github.com/iflytek/spark-ai-go.git
Documentation: pkg.go.dev
# Functions
CalculateMaxTokens calculates the maximum number of tokens that can still be added to a text without exceeding the model's context size.
CountTokens gets the number of tokens the text contains.
GenerateFromSinglePrompt is a convenience function for calling an LLM with a single string prompt, expecting a single string response.
ModelContextSize gets the max number of tokens for a language model.
WithFrequencyPenalty will add an option to set the frequency penalty for sampling.
WithFunctionCallBehavior will add an option to set the behavior to use when calling functions.
WithFunctions will add an option to set the functions to include in the request.
WithMaxLength will add an option to set the maximum length of the generated text.
WithMaxTokens will add an option to set the maximum number of tokens to generate.
WithMinLength will add an option to set the minimum length of the generated text.
WithModel will add an option to set the model to use for the request.
WithN will add an option to set how many chat completion choices to generate for each input message.
WithOptions will add an option to set all of the call options at once.
WithPresencePenalty will add an option to set the presence penalty for sampling.
WithRepetitionPenalty will add an option to set the repetition penalty for sampling.
WithSeed will add an option to use deterministic sampling.
WithStopWords will add an option to set the words at which generation stops.
WithStreamingFunc is an option for LLM.Call that allows streaming responses.
WithTemperature will add an option to set the sampling temperature.
WithTopK will add an option to use top-k sampling.
WithTopP will add an option to use top-p sampling.
# Structs
CallOptions is a set of options for LLM.Call.
Generation is a single generation from an LLM.
LLMResult is the struct that contains all relevant information for an LLM result.
# Type aliases
CallOption is a function that configures a CallOptions value.