Package documentation, version 0.0.0-20250207170155-8b3aac1c94b0
Repository: https://github.com/soyuz43/prbuddy.git

# Functions

ContinuePRConversation reuses HandleQuickAssist for continuing a normal (persistent) PR conversation.
GenerateDraftPR uses the LLM's chat endpoint to generate a PR draft (stateless).
GeneratePreDraftPR obtains the latest commit message and diff, then returns them for usage in PR creation.
GenerateWhatSummary generates a summary of git diffs using the LLM (stateless).
HandleDCERequest handles ephemeral (DCE-driven) requests, running the DCE logic and returning the final text from a fresh ephemeral conversation.
HandleQuickAssist returns the final LLM response for a persistent conversation, accumulating the streaming output behind the scenes into one string.
JSONHandler creates a handler for JSON requests/responses with unified error handling.
LoadDraftContext retrieves saved conversation context for a specific branch/commit.
SaveDraftContext saves conversation messages to disk for a specific branch/commit.
SetLLMClient allows injecting a different LLMClient (useful for testing or future extensions).
StartPRConversation initiates a new PR conversation with a commit message and diffs.
StartServer initializes and runs the HTTP server with full lifecycle management.

# Variables

ServeCmd is the Cobra command to start the API server.

# Structs

Request/Response types.
DefaultLLMClient implements the LLMClient interface using Ollama’s /api/chat.
LLMResponse represents the top-level structure from Ollama (non-streaming).
OllamaStreamChunk is used during streaming (partial response).

# Interfaces

LLMClient defines the interface for interacting with the LLM (Ollama).