A Go application that feeds off data from Apache Kafka to send SMS or EMAIL notifications, or to connect via webhook.
It can be used as-is as a notification service, simply by reacting to events pushed to Apache Kafka.
type Event struct {
    EventId      string            `json:"eventId"`
    Subject      string            `json:"subject"`
    Channel      map[string]bool   `json:"channel"`
    Recipient    []string          `json:"recipient"`
    UnmappedData map[string]string `json:"unmappedData"`
    EventType    string            `json:"eventType"`
    Description  string            `json:"description"`
    DateCreated  time.Time         `json:"dateCreated"`
}
An event pushed to Apache Kafka would be unmarshalled into the Event struct, so structure your JSON string accordingly. The unmappedData field can be used for miscellaneous data.
Sample:
{
  "eventId": "12345",
  "subject": "Signup For Service",
  "channel": {"EMAIL": true},
  "recipient": ["[email protected]"],
  "eventType": "SUBSCRIPTION",
  "description": "Signup Notification",
  "unmappedData": {"Name": "Malike St", "ItemName": "Sample Subscription"}
}
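As an illustration, here is a minimal sketch of decoding such a payload into the Event struct with the standard encoding/json package; the payload literal and the recipient address below are placeholders, not part of go-kafka-alert:
package main

import (
    "encoding/json"
    "fmt"
    "time"
)

// Event mirrors the struct shown above.
type Event struct {
    EventId      string            `json:"eventId"`
    Subject      string            `json:"subject"`
    Channel      map[string]bool   `json:"channel"`
    Recipient    []string          `json:"recipient"`
    UnmappedData map[string]string `json:"unmappedData"`
    EventType    string            `json:"eventType"`
    Description  string            `json:"description"`
    DateCreated  time.Time         `json:"dateCreated"`
}

func main() {
    // A Kafka message value shaped like the sample above; dateCreated is omitted,
    // so it keeps its zero value after decoding.
    payload := []byte(`{
        "eventId": "12345",
        "subject": "Signup For Service",
        "channel": {"EMAIL": true},
        "recipient": ["user@example.com"],
        "eventType": "SUBSCRIPTION",
        "description": "Signup Notification",
        "unmappedData": {"Name": "Malike St", "ItemName": "Sample Subscription"}
    }`)

    var event Event
    if err := json.Unmarshal(payload, &event); err != nil {
        fmt.Println("could not decode event:", err)
        return
    }
    fmt.Println(event.EventType, event.Channel["EMAIL"], event.UnmappedData["Name"])
    // Prints: SUBSCRIPTION true Malike St
}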
It can also work with elasticsearch kafka watch as a custom watcher, sending a notification once there is a hit in Elasticsearch.
Uses the elasticsearch report engine to send scheduled reports as PDF, HTML or CSV by email:
i. Embedded Reports
ii. CSV/PDF Attached Reports
Connects via Twilio to send SMS messages. To make sure your event is processed by the SMS delivery gateway when using this as a Notification Service, the channel field in your event written to Apache Kafka should be something like this (a producer sketch follows the table below):
"channel": {
"SMS": true
}
But when using this as a Custom Watcher, you don't need to worry about the format, since the event would be formatted for you by elasticsearch kafka watch.
The SMS channel is not supported when using this for Scheduled Reports.
Use as | Supported |
---|---|
Notification Service | Yes |
Custom Watcher | Yes |
Scheduled Reports | No |
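As an illustration, the sketch below shows how an upstream application might publish such an SMS event, assuming the confluent-kafka-go client and the go-kafka-event-stream topic from the sample configuration further down; the producer code, IDs and recipient number are examples only, not part of go-kafka-alert:
package main

import (
    "encoding/json"
    "log"
    "time"

    "github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {
    // Broker address and topic taken from the sample configuration below; adjust to your setup.
    producer, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost:9092"})
    if err != nil {
        log.Fatal(err)
    }
    defer producer.Close()

    topic := "go-kafka-event-stream"
    event := map[string]interface{}{
        "eventId":     "12346",
        "subject":     "Service Failure Alert",
        "channel":     map[string]bool{"SMS": true},
        "recipient":   []string{"+15005550006"},
        "eventType":   "APPFLAG",
        "description": "App flag notification",
        "unmappedData": map[string]string{
            "UserName": "malike", "ServiceName": "auth",
            "FailureCount": "3", "FailureDuration": "10",
        },
        "dateCreated": time.Now(),
    }
    payload, err := json.Marshal(event)
    if err != nil {
        log.Fatal(err)
    }

    // Produce is asynchronous; Flush waits for outstanding delivery reports.
    err = producer.Produce(&kafka.Message{
        TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
        Value:          payload,
    }, nil)
    if err != nil {
        log.Fatal(err)
    }
    producer.Flush(5000)
}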
Connects via SMTP to send emails. To make sure your event is processed by the Email delivery channel when using this as a Notification Service, the channel field in your event written to Apache Kafka should be something like this:
"channel": {
"EMAIL": true
}
Elasticsearch Kafka Watch would help use this as a Custom Elasticsearch Watcher; it generates the right event for Apache Kafka.
For scheduled reports, the same plugin would help generate the event, which would cause go-kafka-alert to react by emailing the report.
Use as | Supported |
---|---|
Notification Service | Yes |
Custom Watcher | Yes |
Scheduled Reports | Yes |
Connects via webhook using the appURL and appKey configured under webhookConfig; set the API channel in your event (see the multi-channel example below).
Use as | Supported |
---|---|
Notification Service | Yes |
Custom Watcher | Yes |
Scheduled Reports | No |
NB: To use multiple channels for the same event, use this:
"channel": {
"SMS": true,
"EMAIL": true,
"API": true
}
There are two ways to load the configuration file:
1. Spring Cloud Config
This is a sample Spring Cloud Config Server with configurations loaded from here. See the whys and the hows if you want to read more about loading configuration files from config servers.
Sample command to start the app with the UAT configuration:
go-kafka-alert -loglevel=trace -profile=uat
This would load the uat configuration profile, http://localhost:8888/go-kafka-alert-uat.json, from the config server.
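Loading a profile amounts to fetching the plain JSON served at that URL, following the {configServer}/go-kafka-alert-{profile}.json pattern shown above. Below is a minimal, illustrative sketch of that lookup; the fetchProfile helper is hypothetical and not part of go-kafka-alert:
package main

import (
    "encoding/json"
    "fmt"
    "net/http"
)

// fetchProfile retrieves {configServer}/{appName}-{profile}.json and decodes it.
func fetchProfile(configServer, appName, profile string) (map[string]interface{}, error) {
    url := fmt.Sprintf("%s/%s-%s.json", configServer, appName, profile)
    resp, err := http.Get(url)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()

    var cfg map[string]interface{}
    if err := json.NewDecoder(resp.Body).Decode(&cfg); err != nil {
        return nil, err
    }
    return cfg, nil
}

func main() {
    cfg, err := fetchProfile("http://localhost:8888", "go-kafka-alert", "uat")
    if err != nil {
        fmt.Println("could not load configuration:", err)
        return
    }
    fmt.Println("workers:", cfg["workers"])
}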
2. File System
The app is meant to be a lightweight application. Find a sample configuration file below:
{
  "workers": 4,
  "logFileLocation": "/var/log/go_kafka_alert.log",
  "log": true,
  "kafkaConfig": {
    "bootstrapServers": "localhost:9092",
    "kafkaTopic": "go-kafka-event-stream",
    "kafkaTopicConfig": "latest",
    "kafkaGroupId": "consumerGroupOne",
    "kafkaTimeout": 5000
  },
  "webhookConfig": {
    "appURL": "http://url.",
    "appKey": "Malike"
  },
  "smsConfig": {
    "twilioAccountId": "Malike",
    "twilioAuthToken": "Malike",
    "smsSender": "+15005550006"
  },
  "emailConfig": {
    "smtpServerHost": "smtp.gmail.com",
    "tls": true,
    "smtpServerPort": 465,
    "emailSender": "Sender",
    "emailFrom": "[email protected]",
    "emailAuthUserName": "[email protected]",
    "emailAuthPassword": "xxxxxx"
  },
  "dbConfig": {
    "mongoHost": "localhost",
    "mongoPort": 27017,
    "mongoDBUsername": "",
    "mongoDBPassword": "",
    "mongoDB": "go_kafka_alert",
    "collection": "message"
  },
  "templates": {
    "APPFLAG_API": "User {{.UnmappedData.UserName}} has failed to execute service {{.UnmappedData.ServiceName}} {{.UnmappedData.FailureCount}} times in the past {{.UnmappedData.FailureDuration}} minutes",
    "SERVICEHEALTH_API": "Service {{.UnmappedData.ServiceName}} has failed execution {{.UnmappedData.FailureCount}} in the past {{.UnmappedData.FailureDuration}} minutes",
    "SUBSCRIPTION_API": "Hello {{.UnmappedData.Name}}, Thanks for subscribing to {{.UnmappedData.ItemName}}",
    "APPFLAG_SMS": "User {{.UnmappedData.UserName}} has failed to execute service {{.UnmappedData.ServiceName}} {{.UnmappedData.FailureCount}} times in the past {{.UnmappedData.FailureDuration}} minutes",
    "SERVICEHEALTH_SMS": "Service {{.UnmappedData.ServiceName}} has failed execution {{.UnmappedData.FailureCount}} in the past {{.UnmappedData.FailureDuration}} minutes",
    "SUBSCRIPTION_SMS": "Hello {{.UnmappedData.Name}}, Thanks for subscribing to {{.UnmappedData.ItemName}}",
    "SUBSCRIPTION_EMAIL": "<html><head></head><body> Hello {{.UnmappedData.Name}}, Thanks for subscribing to {{.UnmappedData.ItemName}} </body></html>",
    "REPORTATTACHED_EMAIL": "<html><head></head><body> Hello {{.UnmappedData.Name}}, Find attached report for {{.UnmappedData.ItemName}} </body></html>",
    "REPORTEMBEDED_EMAIL": "{{.UnmappedData.Content}}"
  }
}
i. kafkaConfig
Apache Kafka configuration. Note that you can comma-separate the value of bootstrapServers if you have multiple nodes, e.g. 127.0.0.1:9092,127.0.0.2:9092.
For the other Apache Kafka configurations I'm assuming you already know what they mean; read the Apache Kafka docs if you want to know more. The project uses the go kafka library by Confluent.
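For orientation, here is an illustrative sketch of how these fields could map onto confluent-kafka-go consumer settings; this is not necessarily go-kafka-alert's exact wiring, and treating kafkaTimeout as a poll timeout is an assumption:
package main

import (
    "fmt"
    "log"
    "time"

    "github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {
    consumer, err := kafka.NewConsumer(&kafka.ConfigMap{
        "bootstrap.servers": "localhost:9092",   // kafkaConfig.bootstrapServers
        "group.id":          "consumerGroupOne", // kafkaConfig.kafkaGroupId
        "auto.offset.reset": "latest",           // kafkaConfig.kafkaTopicConfig
    })
    if err != nil {
        log.Fatal(err)
    }
    defer consumer.Close()

    // kafkaConfig.kafkaTopic
    if err := consumer.SubscribeTopics([]string{"go-kafka-event-stream"}, nil); err != nil {
        log.Fatal(err)
    }

    for {
        // kafkaConfig.kafkaTimeout, interpreted here as a poll timeout.
        msg, err := consumer.ReadMessage(5000 * time.Millisecond)
        if err != nil {
            continue // usually just a poll timeout; keep polling
        }
        fmt.Printf("event received: %s\n", string(msg.Value))
    }
}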
ii. webhookConfig
Configuration for the webhook (API) delivery channel: the appURL the app connects to and its appKey.
iii. smsConfig
This is where the configuration for your Twilio account goes. This enables sending SMS notifications. The project uses the Twilio SMS API.
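For reference, a minimal sketch of an SMS send through Twilio's Messages REST endpoint using the three values above; the sendSMS helper and the placeholder credentials are illustrative only and not go-kafka-alert's internal code:
package main

import (
    "fmt"
    "net/http"
    "net/url"
    "strings"
)

// sendSMS posts to Twilio's Messages endpoint with HTTP basic auth.
// accountSid, authToken and from correspond to twilioAccountId,
// twilioAuthToken and smsSender in smsConfig.
func sendSMS(accountSid, authToken, from, to, body string) error {
    endpoint := "https://api.twilio.com/2010-04-01/Accounts/" + accountSid + "/Messages.json"

    form := url.Values{}
    form.Set("From", from)
    form.Set("To", to)
    form.Set("Body", body)

    req, err := http.NewRequest("POST", endpoint, strings.NewReader(form.Encode()))
    if err != nil {
        return err
    }
    req.SetBasicAuth(accountSid, authToken)
    req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return err
    }
    defer resp.Body.Close()

    if resp.StatusCode >= 300 {
        return fmt.Errorf("twilio returned %s", resp.Status)
    }
    return nil
}

func main() {
    // Placeholder credentials and numbers; +15005550006 matches the sample smsSender.
    if err := sendSMS("ACxxxxxxxx", "auth-token", "+15005550006", "+15005550010", "Hello from go-kafka-alert"); err != nil {
        fmt.Println("sms failed:", err)
    }
}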
iv. emailConfig
This is where the configuration for your SMTP server goes. This enables sending EMAIL notifications. It uses http://gopkg.in/gomail.v2
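As an illustration, a minimal sketch of sending an HTML notification with gomail.v2 using values shaped like emailConfig; the addresses and password below are placeholders:
package main

import (
    "log"

    gomail "gopkg.in/gomail.v2"
)

func main() {
    m := gomail.NewMessage()
    // emailFrom / emailSender
    m.SetHeader("From", m.FormatAddress("sender@example.com", "Sender"))
    m.SetHeader("To", "recipient@example.com")
    m.SetHeader("Subject", "Signup For Service")
    m.SetBody("text/html", "<html><body>Hello Malike St, Thanks for subscribing to Sample Subscription</body></html>")

    // smtpServerHost, smtpServerPort, emailAuthUserName, emailAuthPassword.
    // gomail enables implicit SSL automatically when the port is 465,
    // matching the "tls": true setting above.
    d := gomail.NewDialer("smtp.gmail.com", 465, "sender@example.com", "xxxxxx")
    if err := d.DialAndSend(m); err != nil {
        log.Fatal(err)
    }
}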
v. dbConfig
Messages sent out are stored for auditing purposes, together with the response from Twilio or your SMTP gateway. This configuration stores them in MongoDB. Uses this mongodb library for Go.
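The MongoDB driver the project uses is linked above; purely as an illustration, and assuming the classic mgo driver (gopkg.in/mgo.v2), storing and reading back an audit record against the configured database and collection could look like this, with a hypothetical auditRecord shape:
package main

import (
    "log"
    "time"

    mgo "gopkg.in/mgo.v2"
    "gopkg.in/mgo.v2/bson"
)

// auditRecord is a hypothetical shape for a delivered message plus the gateway response.
type auditRecord struct {
    EventId   string    `bson:"eventId"`
    Channel   string    `bson:"channel"`
    Response  string    `bson:"response"`
    CreatedAt time.Time `bson:"createdAt"`
}

func main() {
    // mongoHost and mongoPort from dbConfig.
    session, err := mgo.Dial("localhost:27017")
    if err != nil {
        log.Fatal(err)
    }
    defer session.Close()

    // mongoDB and collection from dbConfig.
    coll := session.DB("go_kafka_alert").C("message")
    if err := coll.Insert(auditRecord{
        EventId:   "12345",
        Channel:   "EMAIL",
        Response:  "250 OK",
        CreatedAt: time.Now(),
    }); err != nil {
        log.Fatal(err)
    }

    var rec auditRecord
    if err := coll.Find(bson.M{"eventId": "12345"}).One(&rec); err != nil {
        log.Fatal(err)
    }
    log.Println("stored response:", rec.Response)
}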
vi. templates
These are the messaging templates configured for all the alert types. Follow this to learn how to create your templates. The templates are stored as a map to give an O(1) lookup when finding a template. The map key follows the convention "EventType" + "_" + "DeliveryChannel". This means the SMS template for the EventType SUBSCRIPTION would be SUBSCRIPTION_SMS, as sketched below.
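A minimal sketch of that convention with Go's text/template package, rendering the SUBSCRIPTION_SMS template against an event's UnmappedData; the lookup code is illustrative, not go-kafka-alert's actual implementation:
package main

import (
    "bytes"
    "fmt"
    "text/template"
)

// Event is trimmed to the fields the templates reference.
type Event struct {
    EventType    string
    UnmappedData map[string]string
}

func main() {
    // Templates keyed by "EventType" + "_" + "DeliveryChannel", as in the config above.
    templates := map[string]string{
        "SUBSCRIPTION_SMS": "Hello {{.UnmappedData.Name}}, Thanks for subscribing to {{.UnmappedData.ItemName}}",
    }

    event := Event{
        EventType:    "SUBSCRIPTION",
        UnmappedData: map[string]string{"Name": "Malike St", "ItemName": "Sample Subscription"},
    }

    key := event.EventType + "_" + "SMS" // O(1) map lookup
    tmpl, err := template.New(key).Parse(templates[key])
    if err != nil {
        fmt.Println("bad template:", err)
        return
    }

    var out bytes.Buffer
    if err := tmpl.Execute(&out, event); err != nil {
        fmt.Println("render failed:", err)
        return
    }
    fmt.Println(out.String())
    // Prints: Hello Malike St, Thanks for subscribing to Sample Subscription
}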
To compile from source, clone the repository and run dep ensure. The project uses dep as a dependency management tool.
Because of the go kafka library by Confluent, you'll also need to have librdkafka installed.
For Debian systems follow this link.
For macOS use brew install librdkafka.
You can use [ldflags](https://blog.cloudflare.com/setting-go-variables-at-compile-time/) to package the go build with the default parameters.
Required parameters are:
profile : Configuration profile, used if configuration is to be loaded from a Spring Cloud Config Server.
configServer : Base URL of the config server. Eg http://localhost:8888.
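For example, a hedged sketch of baking these defaults in at build time with the -X flag; the variable names main.profile and main.configServer are hypothetical and must match string variables actually declared in the binary:
go build -ldflags "-X main.profile=uat -X main.configServer=http://localhost:8888" -o go-kafka-alert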
Version |
---|
0.1-Prelease Tag |
Contributions are always welcome! Please read the contribution guidelines first.
Please read this.