Module: github.com/mr-joshcrane/chatproxy
Version: 0.0.0-20230710045121-85c3cbde2ba8
Repository: https://github.com/mr-joshcrane/chatproxy.git
Documentation: pkg.go.dev

# README

License: GPL-2.0

Chatproxy

This README was ghostwritten by the Chat CLI tool, backed by ChatGPT4

Chatproxy is a powerful Golang library that simplifies interactions with OpenAI's GPT-4 model, allowing developers to seamlessly integrate GPT-4 into their Go applications for various tasks. It comes with a collection of ready-to-use command-line tools that serve as examples of how to leverage the chatproxy library. Users can customize the API client, output formats, and authentication methods, making it an indispensable tool for Golang enthusiasts working with AI-powered document generation and processing.

Key Features

  • Effortless integration with GPT-4 for your Go applications
  • Simple API client with customizable settings
  • Common functions for handling messages, conversation history, and errors
  • Collection of handy command-line tools: Ask, Cards, Commit, Chat, and TLDR

Unlock the power of GPT-4 in your Go projects with Chatproxy and take your applications to the next level.

Default Transcript Logging

By default, Chatproxy records transcripts of your interactions with ChatGPT4, providing an invaluable resource for tracking and understanding the data being sent to OpenAI's powerful AI model. This ensures transparency and keeps you aware of exactly what information is being exchanged.

What is recorded?

Chatproxy logs various types of messages and interactions, including:

  1. User input
  2. Assistant responses (generated by ChatGPT4)
  3. System prompts, instructions, and status updates

Where is the transcript stored?

By default, the logged data is recorded in the transcript field of the ChatGPTClient struct. This holds a comprehensive log of interactions in the form of user inputs, bot responses, and system messages, offering an accessible, all-in-one record.
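
For example, the sketch below shows one way to direct that log to a writer of your choosing. It assumes WithTranscript accepts an io.Writer and that NewChatGPTClient returns a client and an error; the actual signatures may differ, so treat this as illustrative rather than canonical.

package main

import (
	"fmt"
	"os"

	"github.com/mr-joshcrane/chatproxy"
)

func main() {
	// Assumption: WithTranscript takes an io.Writer for the transcript log.
	client, err := chatproxy.NewChatGPTClient(
		chatproxy.WithTranscript(os.Stdout),
	)
	if err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
	// Every user input, assistant response, and system message exchanged
	// through this client should now also appear in the transcript.
	_ = client
}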

Benefits of Default Transcript Logging

Default transcript logging in Chatproxy serves multiple purposes:

  • Ensures visibility and transparency when working with OpenAI services
  • Provides insights into the data exchanged with ChatGPT4 to help you understand your usage better
  • Assists in detecting issues, debugging, and optimizing interactions with ChatGPT4 for the desired output
  • Helps you maintain compliance with any data retention or privacy policies, enabling precise control over the information sent to OpenAI

Embrace the convenience and peace of mind offered by Chatproxy's default transcript logging, taking full advantage of data awareness and transparency for your Golang applications using OpenAI and ChatGPT4.

Chatproxy Library

Installation and Usage

go get -u github.com/mr-joshcrane/chatproxy

package main

import (
	"fmt"
	"github.com/mr-joshcrane/chatproxy"
)

func main() {
	answer, err := chatproxy.Ask("What is the capital of France?")
	if err != nil {
		fmt.Println("Error:", err)
		return
	}
	fmt.Println("Answer:", answer)
}
Answer: Paris
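
For more control than the package-level Ask function offers, you can build a client yourself and configure it with functional options. The snippet below is a hedged sketch: the argument types passed to WithToken, WithOutput, and WithStreaming are assumptions based on their descriptions, not confirmed signatures.

package main

import (
	"fmt"
	"os"

	"github.com/mr-joshcrane/chatproxy"
)

func main() {
	// Illustrative only: option arguments are assumed; check pkg.go.dev for exact signatures.
	client, err := chatproxy.NewChatGPTClient(
		chatproxy.WithToken(os.Getenv("OPENAI_API_KEY")), // explicit authentication
		chatproxy.WithOutput(os.Stdout, os.Stderr),       // custom output and error writers
		chatproxy.WithStreaming(true),                    // stream responses as they arrive
	)
	if err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
	fmt.Printf("client configured: %T\n", client)
}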

Ask CLI Tool

Installation and Usage

go install github.com/mr-joshcrane/chatproxy/cmd/ask@latest
ask "What is the capital of France?"
The capital of France is Paris.

Cards CLI Tool

Installation and Usage

go install github.com/mr-joshcrane/chatproxy/cmd/cards@latest
cards path/to/your/file/README.md

Question: What are special commands in the Chat CLI Tool?
Answer: Three special commands: 
  1. ">filename.txt" to load a file,
  2. "<filename.txt" to write response to a file,
  3. "?" to generate comprehension questions.

Commit CLI Tool

Installation and Usage

go install github.com/mr-joshcrane/chatproxy/cmd/commit@latest
commit
Accept Generated Message? (Y)es/(N)o
Add installation and usage instructions for Chatproxy library and CLI tools

Chat CLI Tool

Installation and Usage

go install github.com/mr-joshcrane/chatproxy/cmd/chat@latest
chat
    SYSTEM) Please describe the purpose of this assistant.
    USER) You help me write Golang documentation.
    USER) >chatproxy
    USER) What special commands are available in the Chat CLI tool and what do they do?
    ASSISTANT) In the chat CLI tool, there are THREE special commands:
        1. `>filename.txt`: This command is used to load the contents of a local file into the chat session.
        The text in the file will be used as input for the assistant, and it will base its response on that content.
        To use this command, simply type `>` followed by the name of the file you want to load. For example:

   >file.txt

        2. `<filename.txt`: This command is used to write the assistant's response to a local file.
        You can use this command to save the generated content for future reference.
        To use this command, type `<` followed by the name of the file you want to save the response to,
        and then enter your input prompt. For example:

   <output.txt What is the capital of France?

        3.  The `?` command in the chat CLI tool is used to generate comprehension questions based on the provided text.
        It uses Bloom's Taxonomy (2001) to create questions that help assess the understanding of the given content.
        To use this command, simply type `?` at the beginning of the chat input
        and questions will be generated from the content of the current
        conversation. To make sure you were really paying attention!

These special commands help users extend the interactivity between the chat CLI tool and external files, making it more convenient to use different sources of information or store assistant responses for later use.

TLDR CLI Tool

Installation and Usage

go install github.com/mr-joshcrane/chatproxy/cmd/tldr@latest
tldr path/to/your/file.txt
A brief summary of your file.

tldr https://example.site.com
A brief summary of your website.
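
TLDR can also be called directly from Go. A minimal sketch, assuming TLDR takes a file path or URL and returns the summary as a string:

package main

import (
	"fmt"

	"github.com/mr-joshcrane/chatproxy"
)

func main() {
	// Assumption: TLDR returns (string, error); check the package docs for the exact signature.
	summary, err := chatproxy.TLDR("https://example.site.com")
	if err != nil {
		fmt.Println("Error:", err)
		return
	}
	fmt.Println(summary)
}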

OPENAI_API_KEY Environment Variable

Purpose: The OPENAI_API_KEY is used to authenticate and authorize API access to OpenAI's GPT-4 services.

Usage: Store the token as an environment variable (OPENAI_API_KEY="YOUR_TOKEN") in your system or application, so that the library can access it automatically.
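
A quick way to fail fast when the variable is missing is to check it before calling the library, as in the sketch below; the check itself is plain Go and not part of the chatproxy API.

package main

import (
	"fmt"
	"os"

	"github.com/mr-joshcrane/chatproxy"
)

func main() {
	// The library reads OPENAI_API_KEY automatically; this check just gives a clearer error.
	if os.Getenv("OPENAI_API_KEY") == "" {
		fmt.Fprintln(os.Stderr, "OPENAI_API_KEY is not set")
		os.Exit(1)
	}
	answer, err := chatproxy.Ask("What is the capital of France?")
	if err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
	fmt.Println(answer)
}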

Obtaining a token: You can get an API key by creating an account on OpenAI's platform at https://beta.openai.com/signup/. After signing up, visit the API Keys section in your account to obtain a token.

User responsibilities: It is crucial to keep the token secret and secure, as it allows access to your OpenAI account and its services. Make sure not to share the token in public repositories or with unauthorized individuals. Additionally, be aware of usage limits and costs associated with OpenAI API services, as you will be billed according to your account's pricing plan.

Always follow OpenAI's guidelines, terms, and conditions when using its services.

# Packages

No description provided by the author

# Functions

Ask sends a question to the GPT-4 API, aiming to receive a relevant and informed answer.
No description provided by the author
Card generates a set of flashcards from a given file or URL, aiming to enhance learning by summarizing important concepts.
Chat function initiates the chat with the user and enables interaction between user and the chat proxy.
Commit analyzes staged Git files, parsing the diff, and generates a meaningful commit message.
CreateAuditLog creates a new file for recording the conversation's audit log with a timestamped filename.
NewChatGPTClient initializes the ChatGPTClient with the desired options, allowing customization through functional options so the client can be tailored to specific needs or requirements.
MessageFromFile reads the contents of a file, and returns a formatted message with the file name and content, as well as an estimation of the token count.
MessageToFile writes the given content string to a file with the specified path.
TLDR generates a concise summary of content from a file or URL, aiming to condense important information.
WithFixedResponse configures the ChatGPTClient to return a predetermined response, offering quicker or consistent replies, or simulating specific behavior for test cases.
WithFixedResponseAPIValidate still makes an API call (ensuring request and token length validation) but enforces a specific response from the chatbot, ensuring a known output and avoiding unpredictable or unnecessary responses during validation.
WithInput assigns a custom input reader for ChatGPTClient, allowing the client to read input from any source, offering improved flexibility and adaptability.
WithOutput allows customizing the output/error handling in the ChatGPTClient, making the client more adaptable to different environments or reporting workflows.
WithStreaming controls the streaming mode of the ChatGPTClient, giving the user the choice between streamed responses for real-time interactions or buffered responses for complete replies.
WithToken uses the provided token for authentication when creating a new ChatGPTClient.
WithTranscript enables keeping a log of all conversation messages, ensuring a persistent record that can be useful for auditing, debugging, or further analysis.
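
Taken together, these options make the client easy to stub out in tests. The sketch below is illustrative only: it assumes WithFixedResponse takes the canned reply as a string, which is not a confirmed signature.

package main

import (
	"fmt"

	"github.com/mr-joshcrane/chatproxy"
)

func main() {
	// Assumption: WithFixedResponse accepts the canned reply as a string.
	client, err := chatproxy.NewChatGPTClient(
		chatproxy.WithFixedResponse("a canned answer for tests"),
	)
	if err != nil {
		fmt.Println("Error:", err)
		return
	}
	// Completions requested from this client should now return the fixed
	// response, keeping tests fast and deterministic.
	_ = client
}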

# Constants

No description provided by the author
Role constants that represent the role of the message sender.
Role constants that represent the role of the message sender.
Role constants that represent the role of the message sender.

# Variables

No description provided by the author

# Structs

ChatGPTClient manages interactions with a GPT-based chatbot, providing a way to organize the conversation, handle input/output, and maintain an audit trail.
ChatMessage represents a message in the chat, providing context and a way to model conversation between different participant roles (e.g., user, bot, system).
No description provided by the author
No description provided by the author
No description provided by the author
No description provided by the author
No description provided by the author
No description provided by the author
No description provided by the author

# Interfaces

No description provided by the author

# Type aliases

ClientOption is used to flexibly configure the ChatGPTClient to meet various requirements and use cases, such as custom input/output handling or error reporting.
CompletionOption is used to customize the behavior of the openai.ChatCompletionRequest to suit different use cases, such as setting stop words or modifying token limits.