go-kafka-avro

A wrapper for Confluent's libraries for Apache Kafka and Schema Registry.

Installation

First install dependencies:

go get github.com/confluentinc/confluent-kafka-go github.com/landoop/schema-registry

To install use go get:

go get github.com/mycujoo/go-kafka-avro

Usage

First, you need to create a cached Schema Registry client:

srClient, err := kafkaavro.NewCachedSchemaRegistryClient(baseurl)

For more options, see the Landoop Schema Registry client README.
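Putting the setup together, a minimal sketch of client creation with error handling (the registry URL `http://localhost:8081` is an illustrative value, the default port for a local Confluent Schema Registry):

```go
package main

import (
	"log"

	kafkaavro "github.com/mycujoo/go-kafka-avro"
)

func main() {
	// Create a cached Schema Registry client; the URL below is an
	// assumed local-development value, not something this README specifies.
	srClient, err := kafkaavro.NewCachedSchemaRegistryClient("http://localhost:8081")
	if err != nil {
		log.Fatalf("failed to create schema registry client: %v", err)
	}
	_ = srClient // pass to kafkaavro.NewProducer below
}
```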

Producer

Create kafka producer:

kafkaProducer, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost"})

Then you can construct one or more kafkaavro producers:

producer, err := kafkaavro.NewProducer(kafkaavro.ProducerConfig{
	TopicName:            "topic",
	KeySchema:            `"string"`,
	ValueSchema:          `{"type": "record", "name": "test", "fields" : [{"name": "val", "type": "int", "default": 0}]}`,
	Producer:             kafkaProducer,
	SchemaRegistryClient: srClient,
})

Publish a message using the Produce method:

err = producer.Produce("key", "value", nil)

If you provide a deliverChan, the call will not block until delivery; the delivery report is sent to the channel instead.
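As a sketch, asynchronous delivery might look like the following. The channel type and the delivery-report handling follow confluent-kafka-go's `kafka.Event` convention, which this wrapper builds on; treat the details as an assumption rather than a documented contract of this library:

```go
// Buffered channel for the delivery report; passing it to Produce
// makes the call non-blocking (assumed behavior per this README).
deliverChan := make(chan kafka.Event, 1)

if err := producer.Produce("key", "value", deliverChan); err != nil {
	log.Fatalf("produce failed: %v", err)
}

// Wait for the delivery report (confluent-kafka-go convention).
e := <-deliverChan
if m, ok := e.(*kafka.Message); ok {
	if m.TopicPartition.Error != nil {
		log.Printf("delivery failed: %v", m.TopicPartition.Error)
	} else {
		log.Printf("delivered to %v", m.TopicPartition)
	}
}
```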

Supported go versions

Go versions >= 1.12 are supported.

Related

Some of the cached schema registry client code is based on the https://github.com/dangkaka/go-kafka-avro implementation.
