# NERDb

Back at it. More soon.
## Data & Server

Built on data provided by Zkillboard, EVE Ref, and ESI.

Killmail, player, and other data is stored in Postgres; the schema and migrations are managed in the `postgres` directory.

Data loading and updating is handled by the `killfeed` package, written in Go.
## Web app

Built with Next.js.
## Running the project locally

### Requirements

Note: The following instructions use `go run` commands, which is fine for now; future revisions will provide a compiled binary. For each command, more info about its usage can be found in its containing directory.
### Start the database

From the project root, run `docker compose up -d`.
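The repository's own compose file defines the actual setup; as a rough illustration of the shape such a file takes, a minimal sketch might look like this (service name, image tag, credentials, port, and volume name are all assumptions, not the project's real configuration):

```yaml
# Hypothetical compose sketch for a local Postgres instance.
# All names and credentials here are illustrative assumptions.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: nerdb
      POSTGRES_PASSWORD: nerdb
      POSTGRES_DB: nerdb
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```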
### Create the database schema

Run `go run postgres/migrate.go`.
### Start loading current Zkillboard data

Run `go run killfeed/main.go zkill`.
### Load historical killmail data from EVE Ref

Run `go run killfeed/main.go everef`.

Expect this step to take hours. Days are intentionally loaded sequentially rather than concurrently, so that there is not a heavy load on the network or database.
### Load character, corporation, and alliance names

Killmail data contains only IDs, and CCP's ESI API does not provide bulk operations to fetch multiple entities' info. Loading names is therefore a two-step process: load bulk data first, then update more current data.
1. Load bulk data from EVE Ref:
   - Download the latest bulk dataset
   - Unzip the file
   - Run `go run killfeed/main.go updater --src everef --dir $PATH_TO_UNZIPPED_FILE`
2. Run the ESI updater for the remaining values:
   - Run `go run killfeed/main.go updater --src esi`
## TODO

- These processes will be updated to also update corporations and alliances.
- The web app will update values on demand as users view entities in the app, respecting ESI's caching.