# Simple LLM Client Example
This example demonstrates how to use the LLM client library with both OpenAI and Anthropic providers. It shows:
- Regular chat completion
- Streaming chat completion
## Prerequisites

- Create a `.env` file in the project root with your API keys:

  ```
  OPENAI_API_KEY=your_openai_key_here
  ANTHROPIC_API_KEY=your_anthropic_key_here
  ```

- Make sure you have the required dependencies:

  ```sh
  go mod tidy
  ```
## Running the Example

From the project root:

```sh
go run examples/simple/main.go
```
This will:
- Test regular chat with both providers
- Test streaming chat with both providers
Each test will show the provider name and response.