# paraphraser-api
Backend for a paraphrasing tool leveraging LLM APIs. It is implemented as an AWS Lambda application and uses the Serverless Framework for deployment.
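
As a rough orientation only, below is a minimal sketch of what a Go Lambda handler behind API Gateway can look like, assuming the standard `aws-lambda-go` runtime; the struct and handler names are hypothetical and not the repository's actual code.

```go
package main

import (
	"context"
	"encoding/json"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// paraphraseRequest mirrors the documented request body; the name is hypothetical.
type paraphraseRequest struct {
	Provider string `json:"provider"`
	Tone     string `json:"tone"`
	Text     string `json:"text"`
}

// handler decodes the request body and would dispatch to the selected LLM provider.
func handler(ctx context.Context, req events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
	var in paraphraseRequest
	if err := json.Unmarshal([]byte(req.Body), &in); err != nil {
		return events.APIGatewayProxyResponse{StatusCode: 400, Body: `{"error":"invalid request body"}`}, nil
	}

	// Placeholder: the real implementation calls the chosen provider ("chatgpt" or "gemini").
	out, _ := json.Marshal(map[string]string{"result": in.Text})
	return events.APIGatewayProxyResponse{StatusCode: 200, Body: string(out)}, nil
}

func main() {
	lambda.Start(handler)
}
```
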
## Prerequisites
## Endpoints
- `POST /paraphrase` (see the client sketch after this list)
  - available providers: `"chatgpt"`, `"gemini"`
  - available tones: `"formal"`, `"amicable"`, `"fun"`, `"casual"`, `"sympathetic"`, `"persuasive"`
  - sample request: `{ "provider": "chatgpt", "tone": "formal", "text": "I'm hungry. What's for dinner?" }`
  - sample response: `{ "result": "I am currently experiencing hunger. May I inquire about the menu for this evening's meal?" }`
- `GET /providers`
  - sample response: `{ "providers": [ "chatgpt", "gemini" ] }`
- `GET /tones`
  - sample response: `{ "tones": [ "formal", "amicable", "fun", "casual", "sympathetic", "persuasive" ] }`
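
A minimal Go client sketch for `POST /paraphrase`; `API_BASE_URL` is a hypothetical placeholder for the deployed API Gateway base URL.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// API_BASE_URL is a hypothetical placeholder for the deployed endpoint,
	// e.g. https://<api-id>.execute-api.<region>.amazonaws.com
	base := os.Getenv("API_BASE_URL")

	body, _ := json.Marshal(map[string]string{
		"provider": "chatgpt",
		"tone":     "formal",
		"text":     "I'm hungry. What's for dinner?",
	})

	resp, err := http.Post(base+"/paraphrase", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode the documented response shape and print the paraphrased text.
	var out struct {
		Result string `json:"result"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Result)
}
```
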
## Usage
- configure: `$ make .env`
  - see the generated `.env` file for configuration
- tidy dependencies: `$ make deps`
- run unit tests: `$ make test`
- run all tests (unit + integration tests): `$ make testIncludeInt` (see the sketch after this list)
- build serverless functions: `$ make build`
  - this generates a `bin` directory to be used in deployment
- deploy serverless application: `$ make deploy`
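
The repository's mechanism for separating unit and integration tests isn't shown here; one common Go convention is a build tag, sketched below under that assumption (package and test names are hypothetical, and the project may use a different approach).

```go
//go:build integration

// Compiled only when the "integration" build tag is set,
// e.g. `go test -tags=integration ./...`
package paraphrase_test

import "testing"

func TestParaphraseAgainstRealProvider(t *testing.T) {
	// A real integration test would call a live LLM provider here and
	// assert on the paraphrased output; omitted in this sketch.
	t.Skip("sketch only")
}
```
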
Helpers during development:

- format all `.go` files in the project (using `go fmt`): `$ make fmt`
- generate test mocks (to be used with stretchr/testify) for all interfaces in the project: `$ make mocks` (see the example after this list)
  - can be configured in `.mockery.yaml`
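
For orientation, this is roughly how a mockery-generated mock is used with `stretchr/testify`. The `Paraphraser` interface below is a hypothetical stand-in, not one of the project's real interfaces; a generated mock has the same shape (a struct embedding `mock.Mock` with one method per interface method).

```go
package paraphrase_test

import (
	"testing"

	"github.com/stretchr/testify/mock"
	"github.com/stretchr/testify/require"
)

// Paraphraser is a hypothetical interface; the project's real interfaces differ.
type Paraphraser interface {
	Paraphrase(tone, text string) (string, error)
}

// MockParaphraser mirrors the shape of a mockery-generated mock.
type MockParaphraser struct{ mock.Mock }

func (m *MockParaphraser) Paraphrase(tone, text string) (string, error) {
	args := m.Called(tone, text)
	return args.String(0), args.Error(1)
}

func TestUsesParaphraser(t *testing.T) {
	p := new(MockParaphraser)
	p.On("Paraphrase", "formal", "hi").Return("greetings", nil)

	got, err := p.Paraphrase("formal", "hi")
	require.NoError(t, err)
	require.Equal(t, "greetings", got)
	p.AssertExpectations(t)
}
```
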