docs: ssl example added #196

Merged
merged 1 commit on Jul 8, 2024
3 changes: 3 additions & 0 deletions .gitignore
@@ -229,3 +229,6 @@ tags
pyvenv.cfg
pip-selfcheck.json
poetry.toml

# local SSL cluster certificates
kafka-certs/
485 changes: 0 additions & 485 deletions examples/fastapi-sse/poetry.lock

This file was deleted.

88 changes: 88 additions & 0 deletions examples/ssl-example/README.md
@@ -0,0 +1,88 @@
# Kstreams SSL example

This example shows how to set up an SSL connection with `kstreams`. For this purpose we have to set up a local Kafka SSL cluster.

`ENV` variables are exported using the script `/scripts/ssl/export-env-variables` and loaded in Python using [pydantic-settings](https://docs.pydantic.dev/latest/concepts/pydantic_settings/).

Check [resources.py](https://github.com/kpn/kstreams/blob/master/examples/ssl-example/ssl_example/resources.py) to see how the backend with `SSL` is created.
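
As a rough illustration only (the settings field names and certificate paths below are hypothetical, not the ones used by the example; `resources.py` remains the reference), a backend with SSL could be assembled like this, reading the paths from the environment with `pydantic-settings` and building an `ssl.SSLContext` with the standard library:

```python
import ssl

from pydantic_settings import BaseSettings

from kstreams import create_engine
from kstreams.backends.kafka import Kafka


class Settings(BaseSettings):
    # Hypothetical field names: pydantic-settings fills them from the
    # environment variables of the same (upper-cased) name.
    bootstrap_servers: str = "localhost:9092"
    ssl_cafile: str = "kafka-certs/client/ca.crt"
    ssl_certfile: str = "kafka-certs/client/client.crt"
    ssl_keyfile: str = "kafka-certs/client/client.key"


def build_backend() -> Kafka:
    settings = Settings()

    # Trust the local CA and present the client certificate to the broker.
    ssl_context = ssl.create_default_context(cafile=settings.ssl_cafile)
    ssl_context.load_cert_chain(
        certfile=settings.ssl_certfile,
        keyfile=settings.ssl_keyfile,
    )

    return Kafka(
        bootstrap_servers=[settings.bootstrap_servers],
        security_protocol="SSL",
        ssl_context=ssl_context,
    )


stream_engine = create_engine(title="ssl-example", backend=build_backend())
```

The advantage of `pydantic-settings` here is that the same code runs unchanged once the variables from `export-env-variables` are in the environment.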

## Requirements

`python 3.8+`, `poetry`, `docker-compose`, `openssl`

## Usage

First, create the server and client certificates:

```bash
./scripts/ssl/ssl-setup
```

After executing the previous script, you will see a new folder called `kafka-certs`. It contains the `server` certificates (inside the `server` folder), the `admin` certificates (inside the `admin` folder) and the `client` certificates. Do not worry about their contents: they are only meant for this local example and can be deleted, shared and recreated at any time.

Now you can run the local SSL cluster:

```bash
./scripts/cluster/start
```

Next, install the project dependencies. In a different terminal, execute:

```bash
poetry install
```

Export the env variables:

```bash
. ./scripts/ssl/export-env-variables
```

Then we can run the project:

```bash
poetry run app
```

You should see something similar to the following logs:

```bash
kstreams/examples/ssl-example via 🐳 colima is 📦 v0.1.0 via 🐍 v3.12.4
❯ poetry run app

INFO:ssl_example.app:Starting application...
INFO:aiokafka.consumer.subscription_state:Updating subscribed topics to: frozenset({'local--kstreams'})
INFO:aiokafka.consumer.consumer:Subscribed to topic(s): {'local--kstreams'}
INFO:kstreams.prometheus.monitor:Starting Prometheus Monitoring started...
INFO:ssl_example.app:Producing event 0
INFO:ssl_example.app:Producing event 1
INFO:ssl_example.app:Producing event 2
INFO:ssl_example.app:Producing event 3
INFO:ssl_example.app:Producing event 4
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
ERROR:aiokafka.consumer.group_coordinator:Group Coordinator Request failed: [Error 15] GroupCoordinatorNotAvailableError
INFO:aiokafka.consumer.group_coordinator:Discovered coordinator 1 for group example-group
INFO:aiokafka.consumer.group_coordinator:Revoking previously assigned partitions set() for group example-group
INFO:aiokafka.consumer.group_coordinator:(Re-)joining group example-group
INFO:aiokafka.consumer.group_coordinator:Joined group 'example-group' (generation 1) with member_id aiokafka-0.11.0-5fb10c73-64b2-42a8-ae8a-23f59d4a3b6b
INFO:aiokafka.consumer.group_coordinator:Elected group leader -- performing partition assignments using roundrobin
INFO:aiokafka.consumer.group_coordinator:Successfully synced group example-group with generation 1
INFO:aiokafka.consumer.group_coordinator:Setting newly assigned partitions {TopicPartition(topic='local--kstreams', partition=0)} for group example-group
```
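
The repeated `GroupCoordinatorNotAvailableError` lines are transient: they show up while the freshly started broker is still electing a group coordinator and stop once it is discovered. The producing and consuming itself happens in the `ssl_example.app` module visible in the logs above; a minimal, hypothetical sketch of such an entrypoint (ignoring the SSL backend wiring, which lives in `resources.py`) could look like this:

```python
import asyncio
import logging

from kstreams import ConsumerRecord, create_engine

logger = logging.getLogger(__name__)

# In the real example the engine is created with the SSL backend from
# resources.py; defaults are used here for brevity.
stream_engine = create_engine(title="ssl-example")


@stream_engine.stream("local--kstreams", group_id="example-group")
async def consume(cr: ConsumerRecord) -> None:
    logger.info("Event consumed: %s", cr.value)


async def produce() -> None:
    for i in range(5):
        logger.info("Producing event %s", i)
        await stream_engine.send("local--kstreams", value=f"event {i}".encode())


async def main() -> None:
    logger.info("Starting application...")
    await stream_engine.start()
    await produce()
    await asyncio.sleep(10)  # give the consumer time to process the events
    await stream_engine.stop()


def start() -> None:
    # `poetry run app` points at an entrypoint similar to this one.
    asyncio.run(main())
```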

## Note

If you plan to reuse this example outside of this repository, pay attention to the `pyproject.toml` dependencies: `kstreams` points to the parent folder, so you will have to replace that path dependency with the latest released version.
45 changes: 45 additions & 0 deletions examples/ssl-example/docker-compose.yaml
@@ -0,0 +1,45 @@
version: '3'
services:

  zookeeper:
    image: "confluentinc/cp-zookeeper:7.3.0"
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - 32181:32181
    environment:
      - ZOOKEEPER_CLIENT_PORT=32181

  kafka:
    volumes:
      - ./kafka-certs/server/:/etc/kafka/secrets
      - ./kafka-certs/admin/client.properties:/etc/kafka/config/client.properties
    image: "confluentinc/cp-kafka:7.3.0"
    hostname: kafka
    container_name: kafka
    environment:
      KAFKA_BROKER_ID: "1"
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: "1"
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:32181"
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      CONFLUENT_METRICS_ENABLE: 'true'
      CONFLUENT_SUPPORT_CUSTOMER_ID: anonymous
      # Kafka security
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: SSL:SSL,SSL2:SSL
      KAFKA_SECURITY_INTER_BROKER_PROTOCOL: SSL
      KAFKA_ADVERTISED_LISTENERS: SSL://localhost:9092,SSL2://kafka:9093
      KAFKA_SSL_KEYSTORE_FILENAME: keystore.jks
      KAFKA_SSL_TRUSTSTORE_FILENAME: truststore.jks
      KAFKA_SSL_KEY_CREDENTIALS: ssl-key-credentials
      KAFKA_SSL_KEYSTORE_CREDENTIALS: key-store-credentials
      KAFKA_SSL_TRUSTSTORE_CREDENTIALS: trust-store-credentials
      KAFKA_SSL_CLIENT_AUTH: required
      KAFKA_AUTHORIZER_CLASS_NAME: kafka.security.authorizer.AclAuthorizer
      KAFKA_SSL_PRINCIPAL_MAPPING_RULES: RULE:^.*[Cc][Nn]=([a-zA-Z0-9._-]*).*$$/CN=$$1/,DEFAULT
      KAFKA_SUPER_USERS: User:CN=localhost;User:CN=admin.client.company.org;
    ports:
      - "9092:9092"
      - "29092:29092"
    depends_on:
      - zookeeper
