# Simeru Scraper
Simeru Scraper collects schedules from the official website and caches the data in Redis. The API is built with Go and Fiber.
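The core flow is: check Redis first, scrape on a cache miss, then store the result with a TTL. The sketch below only illustrates that pattern; it is not the repository's code, and the route, cache key, TTL, and client library (go-redis is assumed) are illustrative.

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/gofiber/fiber/v2"
	"github.com/redis/go-redis/v9"
)

// scrapeSchedules stands in for the real scraping logic.
func scrapeSchedules() (string, error) {
	return `[{"course":"...","day":"...","time":"..."}]`, nil
}

func main() {
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})
	app := fiber.New()

	app.Get("/schedules", func(c *fiber.Ctx) error {
		ctx := context.Background()

		// Serve straight from Redis when the data is already cached.
		if cached, err := rdb.Get(ctx, "schedules").Result(); err == nil {
			c.Set(fiber.HeaderContentType, fiber.MIMEApplicationJSON)
			return c.SendString(cached)
		}

		// Cache miss: scrape, store with a TTL, then respond.
		data, err := scrapeSchedules()
		if err != nil {
			return fiber.ErrInternalServerError
		}
		rdb.Set(ctx, "schedules", data, time.Hour)
		c.Set(fiber.HeaderContentType, fiber.MIMEApplicationJSON)
		return c.SendString(data)
	})

	log.Fatal(app.Listen(":3000"))
}
```

Keeping the scraped payload in Redis means response latency does not depend on the upstream site once the cache is warm.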
## Table of Contents
- Features
- Deployment
- Requirements
- Installation
- Usage
- Project Structure
- Contributing
- License
- Acknowledgements
## Features
- Scrapes schedules from the official website
- Caches data in Redis
- Cron job to update data (see the refresh sketch after this list)
- API Documentation with Swagger
- Docker support
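The cron-based refresh mentioned above amounts to periodically re-scraping and overwriting the cached value. A minimal standard-library sketch is shown here, reusing the hypothetical `scrapeSchedules` helper and Redis client from the introduction's sketch; the actual project may use a dedicated cron package and a different interval or key.

```go
// startRefresher re-scrapes on a fixed interval and overwrites the cache.
// Interval, key, and helper names are illustrative, not the project's.
func startRefresher(rdb *redis.Client, interval time.Duration) {
	go func() {
		ticker := time.NewTicker(interval)
		defer ticker.Stop()
		for range ticker.C {
			data, err := scrapeSchedules()
			if err != nil {
				log.Printf("refresh failed: %v", err)
				continue
			}
			// No TTL here: the periodic refresh keeps the entry current.
			rdb.Set(context.Background(), "schedules", data, 0)
		}
	}()
}
```

Called once at startup, e.g. `startRefresher(rdb, 6*time.Hour)`, this keeps the cache warm without blocking request handling.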
## Deployment
- Koyeb
## Requirements
- Go 1.23+
- Docker
- Redis
- Make
## Installation
- Clone the repository:

  ```bash
  git clone https://github.com/savioruz/simeru-scraper.git
  cd simeru-scraper
  ```

- Environment variables:

  Create a `.env` file in the root directory by copying the example:

  ```bash
  cp .env.example .env
  ```
## Usage
### Running the API
You can run the API using Docker or directly with Make.
#### Docker (Recommended)
- Run Redis:

  ```bash
  make docker.redis
  ```

- Run the application:

  ```bash
  make docker.run
  ```

For production, secure the Redis instance defined in the Makefile with a password.
#### Make
- Run the application:

  ```bash
  make run
  ```

You need to have Redis running on your machine.
### API Documentation
Swagger documentation is available at: http://localhost:3000/swagger.
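How the Swagger endpoint is wired up is specific to this repository; purely as an illustration, Fiber services often generate the spec into `docs/` with swaggo's `swag` tool and mount the UI through a Swagger middleware. The package choice and import below are assumptions, not taken from this project.

```go
package main

import (
	"log"

	"github.com/gofiber/fiber/v2"
	"github.com/gofiber/swagger" // assumed middleware; the repo may use a different one
	// _ "github.com/savioruz/simeru-scraper/docs" // blank-import of the generated docs package (hypothetical path)
)

func main() {
	app := fiber.New()

	// Serve the Swagger UI and the generated spec under /swagger/*.
	app.Get("/swagger/*", swagger.HandlerDefault)

	log.Fatal(app.Listen(":3000"))
}
```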
## Project Structure
```
.
├── config/                 # Configuration files
├── docs/                   # Project documentation
├── internal/
│   ├── adapters/           # Adapters for external services
│   │   ├── cache/          # Cache layer
│   │   ├── handlers/       # Handlers layer
│   │   └── repositories/   # Storage layer integration
│   └── cores/              # Core business layer
│       ├── entities/       # Business entities
│       ├── ports/          # Adapter implementations
│       │   └── ports.go
│       └── services/       # Use cases layer
├── pkg/
│   ├── constant/
│   ├── middleware/
│   ├── routes/
│   ├── server/             # Server configuration
│   └── utils/              # Utility functions
├── main.go
├── .env
└── Dockerfile
```
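The layout follows a ports-and-adapters (hexagonal) shape: `internal/cores/ports` declares the interfaces the core services depend on, and the packages under `internal/adapters` supply the concrete implementations (cache, handlers, repositories). The sketch below, condensed into one file with made-up names, only illustrates that dependency direction:

```go
package example

import "context"

// Schedule stands in for the project's schedule entity (cores/entities).
type Schedule struct {
	Course string
	Day    string
	Time   string
}

// ScheduleStore plays the role of a port (cores/ports): the core layer
// depends on this interface, and an adapter such as the Redis cache or a
// repository provides the implementation.
type ScheduleStore interface {
	Get(ctx context.Context) ([]Schedule, error)
	Save(ctx context.Context, schedules []Schedule) error
}

// ScheduleService is a core use case (cores/services); it never imports
// Redis, Fiber, or the scraper directly, only the port above.
type ScheduleService struct {
	store ScheduleStore
}

func NewScheduleService(store ScheduleStore) *ScheduleService {
	return &ScheduleService{store: store}
}

func (s *ScheduleService) Schedules(ctx context.Context) ([]Schedule, error) {
	return s.store.Get(ctx)
}
```

Handlers call the service, while the cache and repository packages satisfy the port interface, which keeps the scraping and storage details swappable.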
## Contributing
Feel free to open issues or submit pull requests with improvements.
## License
This project is licensed under the MIT License. See the LICENSE file for details.