Package version: 1.19.7
Repository: https://github.com/yomorun/yomo.git
Documentation: pkg.go.dev

# Packages

Package provider defines the ai.Provider interface and provides a mock provider for unit tests.
Package register provides a registry for registering and unregistering functions.
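
To make the register package's role concrete, here is a minimal sketch of the idea it describes: a name-keyed registry of functions that can be added and removed at runtime. The types and method names below are hypothetical stand-ins for illustration, not the package's actual API.

```go
package main

import (
	"fmt"
	"sync"
)

// registry is a hypothetical stand-in for the kind of structure package
// register maintains: callable functions keyed by name, registered and
// unregistered at runtime.
type registry struct {
	mu    sync.RWMutex
	funcs map[string]func(input string) string
}

func newRegistry() *registry {
	return &registry{funcs: make(map[string]func(string) string)}
}

func (r *registry) register(name string, fn func(string) string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.funcs[name] = fn
}

func (r *registry) unregister(name string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	delete(r.funcs, name)
}

func (r *registry) lookup(name string) (func(string) string, bool) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	fn, ok := r.funcs[name]
	return fn, ok
}

func main() {
	reg := newRegistry()
	reg.register("echo", func(s string) string { return s })
	if fn, ok := reg.lookup("echo"); ok {
		fmt.Println(fn("hello"))
	}
	reg.unregister("echo")
}
```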

# Functions

DecodeRequest decodes the request body into the given type.
DecorateHandler decorates the http.Handler.
FromCallerContext returns the caller from the request context.
FromTracerContext returns the tracer from the request context.
FromTransIDContext returns the transID from the request context.
NewBasicAPIServer creates a new RESTful service.
NewCaller returns a new caller.
NewCallSyncer creates a new CallSyncer.
NewHandler returns a handler that handles chat completions requests.
NewReducer creates a new instance of memory StreamFunction.
NewResponseWriter returns a new ResponseWriter.
NewServeMux creates a new http.ServeMux for the LLM bridge server.
NewService creates a new service for handling the logic from the handler layer.
NewStreamRecorder returns a new StreamRecorder.
ParseConfig parses the AI config from conf.
RespondWithError writes an error to response according to the OpenAI API spec.
Serve starts the Basic API Server (see the bootstrap sketch after this list).
WithCallerContext adds the caller to the request context.
WithTracerContext adds the tracer to the request context.
WithTransIDContext adds the transID to the request context.
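
Read together, these functions suggest the usual bootstrap path: parse the AI bridge configuration, then start the Basic API Server. The sketch below only illustrates that flow; the import path and the argument lists of ParseConfig and Serve are assumptions and may differ from the real signatures.

```go
package main

import (
	"log"

	// Import path assumed for illustration; check the repository for the
	// actual location of this package.
	ai "github.com/yomorun/yomo/pkg/bridge/ai"
)

func main() {
	// ParseConfig is assumed here to take a raw configuration map and
	// return (*ai.Config, error); the real signature may differ.
	conf, err := ai.ParseConfig(map[string]any{
		"server": map[string]any{"addr": ":8000"},
	})
	if err != nil {
		log.Fatalf("parse ai config: %v", err)
	}

	// Serve starts the Basic API Server. The real function likely takes
	// additional arguments (zipper address, credential, provider, ...).
	if err := ai.Serve(conf); err != nil {
		log.Fatal(err)
	}
}
```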

# Constants

DefaultZipperAddr is the default endpoint of the zipper.

# Variables

ErrConfigFormatError is the error when the AI config format is incorrect.
ErrConfigNotFound is the error when the AI config was not found.
RequestTimeout is the timeout for a request; the default is 90 seconds.
RunFunctionTimeout is the timeout for awaiting the function response; the default is 60 seconds.
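
Because both timeouts are exported package variables, a caller can adjust them before serving requests, for example when the upstream LLM provider or the tool functions are slow. A minimal sketch, assuming the variables are plain time.Duration values and the import path shown below:

```go
package main

import (
	"time"

	// Import path assumed for illustration.
	ai "github.com/yomorun/yomo/pkg/bridge/ai"
)

func init() {
	// Allow slower upstream LLM providers before a request is abandoned
	// (the default is 90 seconds).
	ai.RequestTimeout = 2 * time.Minute

	// Give long-running tool functions more headroom (the default is 60 seconds).
	ai.RunFunctionTimeout = 90 * time.Second
}

func main() {
	// ... construct and start the server here ...
}
```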

# Structs

BasicAPIServer provides a RESTful service for end users.
Caller calls the invoke function and keeps the metadata and system prompt.
Config is the configuration of AI bridge.
Handler handles the http request.
ReduceMessage is the message from the reducer.
Server is the configuration of the BasicAPIServer, which is the endpoint for end-user access.
Service is the service layer for the LLM bridge server.
ServiceOptions is the option for creating service.
ToolCallResult is the result of a CallSyncer.Call().

# Interfaces

CallSyncer fires a batch of function calls and waits for their results (see the sketch after this list).
EventResponseWriter is the interface for writing events to the underlying ResponseWriter.
StreamRecorder records the stream status of the ResponseWriter.
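
CallSyncer captures a fan-out/fan-in pattern: dispatch a batch of tool calls at once and block until every result (or a timeout) arrives. The sketch below illustrates that pattern with hypothetical types and is not the package's implementation; in particular, the fields of the real ToolCallResult will differ.

```go
package main

import (
	"context"
	"fmt"
	"sync"
	"time"
)

// toolCallResult is a hypothetical stand-in for the result of one tool call.
type toolCallResult struct {
	FunctionName string
	Content      string
}

// syncCalls fans out one goroutine per call and waits for all of them,
// illustrating the fire-and-wait behavior a CallSyncer provides.
func syncCalls(ctx context.Context, calls []string) []toolCallResult {
	results := make([]toolCallResult, len(calls))
	var wg sync.WaitGroup
	for i, name := range calls {
		wg.Add(1)
		go func(i int, name string) {
			defer wg.Done()
			select {
			case <-ctx.Done():
				results[i] = toolCallResult{FunctionName: name, Content: "timed out"}
			case <-time.After(10 * time.Millisecond): // pretend the function ran
				results[i] = toolCallResult{FunctionName: name, Content: "ok"}
			}
		}(i, name)
	}
	wg.Wait()
	return results
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()
	for _, r := range syncCalls(ctx, []string{"get_weather", "get_time"}) {
		fmt.Printf("%s -> %s\n", r.FunctionName, r.Content)
	}
}
```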

# Type aliases

Provider is the configuration of the LLM provider.
SystemPromptOp defines the operation of system prompt.