package
0.9.0-beta3
Repository: https://github.com/lovoo/goka.git
Documentation: pkg.go.dev

# README

Goka Example (Redis)

Using Redis as the storage option.

This example consists of an Emitter and one Processor: the emitter generates events with random keys (user_id), and the processor consumes them from Kafka while using Redis as its storage/cache.
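For orientation, here is a minimal sketch of how such an emitter and processor can be wired up with goka. The group and stream names, the callback, and the redisStorageBuilder placeholder are illustrative assumptions, not the actual code of this example:

package main

import (
	"context"
	"fmt"
	"log"
	"math/rand"

	"github.com/lovoo/goka"
	"github.com/lovoo/goka/codec"
	"github.com/lovoo/goka/storage"
)

var (
	brokers             = []string{"127.0.0.1:9092"}
	stream  goka.Stream = "events"
	group   goka.Group  = "examples"
)

// redisStorageBuilder is a placeholder: the real example supplies a
// storage.Builder whose returned storage.Storage reads and writes
// through Redis instead of the default local storage.
func redisStorageBuilder(topic string, partition int32) (storage.Storage, error) {
	return nil, fmt.Errorf("placeholder: plug in the Redis-backed storage here")
}

// emit produces events with random user_id keys.
func emit() {
	emitter, err := goka.NewEmitter(brokers, stream, new(codec.String))
	if err != nil {
		log.Fatal(err)
	}
	defer emitter.Finish()
	for i := 0; i < 100; i++ {
		key := fmt.Sprintf("user_%d", rand.Intn(10))
		if err := emitter.EmitSync(key, "some event"); err != nil {
			log.Fatal(err)
		}
	}
}

// consume runs a processor whose group table is kept in the storage
// returned by the Redis-backed builder.
func consume() {
	g := goka.DefineGroup(group,
		goka.Input(stream, new(codec.String), func(ctx goka.Context, msg interface{}) {
			ctx.SetValue(msg) // keep the latest event per user_id
		}),
		goka.Persist(new(codec.String)),
	)
	p, err := goka.NewProcessor(brokers, g, goka.WithStorageBuilder(redisStorageBuilder))
	if err != nil {
		log.Fatal(err)
	}
	if err := p.Run(context.Background()); err != nil {
		log.Fatal(err)
	}
}

func main() {
	go emit()
	consume()
}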

Usage

It's easy to configure this example.

  1. Start Kafka, Zookeeper, and Redis from the docker-compose file with the command below:
docker-compose -f kafka.yml -p ci up -d
  2. Check the status of the Docker containers:
docker ps

The output should look like this:

CONTAINER ID        IMAGE                 COMMAND                  CREATED             STATUS              PORTS                                            NAMES
38a6efb145ba        wurstmeister/kafka    "start-kafka.sh"         9 seconds ago       Up 4 seconds        0.0.0.0:9092->9092/tcp, 0.0.0.0:9997->9997/tcp   kafka
48df80931a1f        confluent/zookeeper   "/usr/local/bin/zk-d…"   10 seconds ago      Up 5 seconds        2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp       zookeeper
040fbe9dfc13        redis:latest          "docker-entrypoint.s…"   10 seconds ago      Up 5 seconds        0.0.0.0:6379->6379/tcp                           redis

The same configuration should be present in the config.yaml file, with the Kafka and Redis settings like:

kafka:
  brokers:
    - "127.0.0.1:9092"
  group: "examples"
  stream: "events"

redis: "127.0.0.1:6379"
namespace: "producer"

Where:

  * brokers: list of Kafka broker addresses.
  * group: name of the consumer group this example belongs to.
  * stream: name of the stream (topic) this example consumes.
  * redis: address of the Redis server (e.g. localhost:6379).
  * namespace: namespace to distinguish applications that write the same keys to Redis.
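The sketch below shows one way this file could be mapped onto Go structs using gopkg.in/yaml.v2; the struct layout and the choice of YAML package are assumptions for illustration, not necessarily what the example itself uses:

package main

import (
	"log"
	"os"

	yaml "gopkg.in/yaml.v2"
)

// Config mirrors config.yaml as shown above; the layout is
// illustrative and may differ from the example's own types.
type Config struct {
	Kafka struct {
		Brokers []string `yaml:"brokers"`
		Group   string   `yaml:"group"`
		Stream  string   `yaml:"stream"`
	} `yaml:"kafka"`
	Redis     string `yaml:"redis"`
	Namespace string `yaml:"namespace"`
}

func loadConfig(path string) (*Config, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	cfg := new(Config)
	if err := yaml.Unmarshal(data, cfg); err != nil {
		return nil, err
	}
	return cfg, nil
}

func main() {
	cfg, err := loadConfig("config.yaml")
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("brokers=%v group=%s stream=%s redis=%s namespace=%s",
		cfg.Kafka.Brokers, cfg.Kafka.Group, cfg.Kafka.Stream, cfg.Redis, cfg.Namespace)
}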

  3. Fetch the Go Redis client package:
go get -u gopkg.in/redis.v5
  4. Build and run the example:
go build .
./redis -config config.yaml

Events with random keys are produced to and consumed from Kafka. You can run several instances of the same binary and watch Kafka rebalance, assigning and removing partitions from each instance, without anything breaking.

# Functions

Consume starts the goka events consumer.
NewProducer returns a new kafka producer.

# Structs

No description provided by the author
No description provided by the author
No description provided by the author

# Interfaces

Producer defines an interface for producing events on Kafka.
Publisher defines an interface for publishing an event somewhere.
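As a rough orientation only, those two interfaces can be thought of as shapes like the following; the actual method sets are defined in the package source and may differ from this sketch:

package example

import "context"

// Publisher publishes an event somewhere (Redis, in this example) and
// can be closed once no more events will be published. Method names
// and signatures here are illustrative assumptions.
type Publisher interface {
	Publish(ctx context.Context, key string, event interface{}) error
	Close() error
}

// Producer produces events on Kafka. Again, the exact signature is an
// assumption for illustration.
type Producer interface {
	Emit(key string, event interface{}) error
	Close() error
}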