Commit 2111390
Upload existing integrations as-is
AlexRuiz7 committed Jan 17, 2024
1 parent 8b4546e commit 2111390
Showing 36 changed files with 27,839 additions and 0 deletions.
41 changes: 41 additions & 0 deletions integrations/.env
@@ -0,0 +1,41 @@
WAZUH_VERSION=4.3.10
ELASTIC_PASSWORD=changeme


## ELASTIC STACK
# Password for the 'kibana_system' user (at least 6 characters)
KIBANA_PASSWORD=kibana_system

# Version of Elastic products
STACK_VERSION=8.6.2

# Set the cluster name
CLUSTER_NAME=docker-cluster

# Set to 'basic' or 'trial' to automatically start the 30-day trial
LICENSE=basic
#LICENSE=trial

# Port to expose Elasticsearch HTTP API to the host
ES_PORT=9201
#ES_PORT=127.0.0.1:9200

# Port to expose Kibana to the host
KIBANA_PORT=5602
#KIBANA_PORT=80

# Increase or decrease based on the available host memory (in bytes)
MEM_LIMIT=1073741824

## OPENSEARCH STACK
#Stack version
OS_VERSION=2.6.0

#Opensearch port
OS_PORT=9202

#Opensearch dashboard port
OSD_PORT=5603

SPLUNK_FORWARDER_URL=https://download.splunk.com/products/universalforwarder/releases/9.0.4/linux/splunkforwarder-9.0.4-de405f4a7979-linux-2.6-amd64.deb
LOGSTASH_URL=https://artifacts.elastic.co/downloads/logstash/logstash-8.6.2-linux-x86_64.tar.gz
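The two URLs at the bottom of the `.env` file are presumably what the generator container downloads. Fetching them by hand would look like the sketch below (the actual `curl` calls are commented out to avoid large downloads):

```shell
# Package URLs taken verbatim from the .env file above
SPLUNK_FORWARDER_URL="https://download.splunk.com/products/universalforwarder/releases/9.0.4/linux/splunkforwarder-9.0.4-de405f4a7979-linux-2.6-amd64.deb"
LOGSTASH_URL="https://artifacts.elastic.co/downloads/logstash/logstash-8.6.2-linux-x86_64.tar.gz"

# Uncomment to actually download the packages:
# curl -fLO "$SPLUNK_FORWARDER_URL"
# curl -fLO "$LOGSTASH_URL"

echo "Splunk Forwarder package: $(basename "$SPLUNK_FORWARDER_URL")"
```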
74 changes: 74 additions & 0 deletions integrations/README.md
@@ -0,0 +1,74 @@
# Wazuh integrations

This folder contains a Docker environment with everything necessary to test integrations with Splunk and Elasticsearch, both from the Wazuh Indexer and from the Wazuh manager.

## Docker compose content:

- 1 Splunk Indexer 9.0.4
- 1 Wazuh stack (indexer, dashboard and manager). The manager container also ships a Splunk Forwarder and a Logstash installation under `/opt`
- 1 Elastic stack (Elasticsearch, Kibana and the setup container)
- 1 Opensearch stack (Opensearch and Opensearch dashboards)
- 1 Logstash 8.6.2
- 1 generator container that automatically creates all the required certificates and downloads the required packages

## Additional content:

- Dashboards for Splunk, Kibana and Opensearch
- Sample alerts covering the last 7 days, generated when the environment starts. They are stored inside the wazuh-manager container at `/var/ossec/logs/alerts/sample_alerts.json`, and are also merged with the non-sample data in the `alerts.json` file.

## Requirements:

- Internet connection
- Docker
- Docker compose

## Usage

The desired version of the Wazuh stack can be configured in the `.env` file. Only already-released versions will work.

After that, running `docker compose up -d` will start all the containers. Once the startup process finishes, the integrations are configured automatically, with one exception: the Splunk integration must be started manually from the manager by running `/opt/splunkforwarder/bin/splunk start --accept-license` inside the Wazuh manager container. To stop the environment and clean it up, use `docker compose down`.
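The whole lifecycle can be sketched as the following commands (the container name `wazuh-manager` is an assumption; verify it with `docker compose ps`):

```shell
# Assumed name of the manager container; check `docker compose ps`
MANAGER=wazuh-manager

# Bring the whole environment up in the background
docker compose up -d

# Manually start the Splunk Forwarder inside the Wazuh manager container
docker exec -it "$MANAGER" /opt/splunkforwarder/bin/splunk start --accept-license

# Stop and clean the environment when finished
docker compose down
```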

The Splunk Indexer instance is accessible at https://localhost:8000 with credentials `admin:password`. In this instance, the logs imported from the Wazuh Indexer are in the `main` index, and the logs imported from the manager are in the `wazuh-alerts` index.

The Wazuh Dashboard instance is accessible at https://localhost:5601 with credentials `admin:SecretPassword`.

The Kibana instance is accessible at http://localhost:5602 with credentials `elastic:changeme`. In this instance, the logs imported from the Wazuh Indexer are in the `indexer-wazuh-alerts-4.x-<date>` index, and the logs imported from the manager are in the `wazuh-alerts-4.x-<date>` index.

The Opensearch dashboards instance is accessible at http://localhost:5603 with credentials `admin:admin`. In this instance, the logs imported from the Wazuh Indexer are in the `indexer-wazuh-alerts-4.x-<date>` index, and the logs imported from the manager are in the `wazuh-alerts-4.x-<date>` index.

The integration from the manager contains the sample data as well as the alerts generated in real time. The integration from the indexer does not contain any sample data. Additionally, the dashboards for all the platforms only work with the `wazuh-alerts...` index, meaning that they will not reflect the data generated by the Indexer integration.
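As a quick check, the alert indices on each backend can be listed from the host. Ports and credentials below come from the `.env` file; that OpenSearch answers over HTTPS on port 9202 with self-signed certificates is an assumption based on its default security setup:

```shell
ES_URL="http://localhost:9201"
OS_URL="https://localhost:9202"

# Elasticsearch: list the wazuh-alerts indices created by both integrations
curl -s -u elastic:changeme "$ES_URL/_cat/indices/*wazuh-alerts*?v" \
  || echo "Elasticsearch is not reachable at $ES_URL"

# OpenSearch: same index naming; -k skips verification of self-signed certs
curl -sk -u admin:admin "$OS_URL/_cat/indices/*wazuh-alerts*?v" \
  || echo "OpenSearch is not reachable at $OS_URL"
```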

## Import dashboards

### Splunk

The Splunk dashboards are located in `extra/dashboards/Splunk`. The steps to import them into the Splunk Indexer are the following:

- Open a dashboard file and copy all its content
- In the Splunk Indexer, navigate to `Search & Reporting` > `Dashboards`, click `Create New Dashboard`, enter a title, select `Dashboard Studio`, select `Grid` and click `Create`
- On the top menu there is a `Source` icon. Click it and replace all the content with the content copied from the dashboard file. Then click `Back` and `Save`.
- Repeat the steps for all the desired dashboards.

### Elastic

The Elastic dashboards are located in `docker/integrations/extra/dashboards/elastic`. The steps to import them into Kibana are the following:

- Open the Elastic web interface
- Expand the left bar, and go to `Stack management`
- Click on `Saved Objects`, select `Import`, click on the `Import` icon and select the dashboard file. It is possible to import a single dashboard, or the file `all-dashboards.ndjson`, which contains all the dashboards.
- Click on Import.
- Repeat the steps for all the desired dashboards.

After that, the dashboard should be imported. It can be seen opening the left bar and selecting `Dashboard`.
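The same import can alternatively be scripted against Kibana's saved objects API. The endpoint and the `kbn-xsrf` header are standard Kibana; the port and credentials are the ones this environment uses:

```shell
KIBANA_URL="http://localhost:5602"
DASHBOARDS="docker/integrations/extra/dashboards/elastic"

# Import every dashboard in one request; Kibana rejects requests
# that lack the kbn-xsrf anti-CSRF header.
curl -s -u elastic:changeme \
  -X POST "$KIBANA_URL/api/saved_objects/_import?overwrite=true" \
  -H "kbn-xsrf: true" \
  --form file=@"$DASHBOARDS/all-dashboards.ndjson" \
  || echo "Kibana is not reachable at $KIBANA_URL"
```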

### Opensearch

The Opensearch dashboards are located in `docker/integrations/extra/dashboards/opensearch`. The steps to import them into Opensearch dashboards are the following:

- Open the Opensearch web interface
- Expand the left bar, and go to `Stack management`
- Click on `Saved Objects`, select `Import`, click on the `Import` icon and select the dashboard file. It is possible to import a single dashboard, or the file `all-dashboards.ndjson`, which contains all the dashboards.
- Click on Import.
- Repeat the steps for all the desired dashboards.

After that, the dashboard should be imported. It can be seen opening the left bar and selecting `Dashboard`.
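OpenSearch Dashboards exposes the same saved objects API as Kibana, with `osd-xsrf` as the anti-CSRF header, so the import can also be scripted (port and credentials from this environment):

```shell
OSD_URL="http://localhost:5603"
DASHBOARDS="docker/integrations/extra/dashboards/opensearch"

# Same saved objects API as Kibana, but the header is osd-xsrf
curl -s -u admin:admin \
  -X POST "$OSD_URL/api/saved_objects/_import?overwrite=true" \
  -H "osd-xsrf: true" \
  --form file=@"$DASHBOARDS/all-dashboards.ndjson" \
  || echo "OpenSearch Dashboards is not reachable at $OSD_URL"
```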
15 changes: 15 additions & 0 deletions integrations/config/certs/ca.json
@@ -0,0 +1,15 @@
{
"CN": "Wazuh",
"key": {
"algo": "rsa",
"size": 2048
},
"names": [
{
"C": "US",
"L": "San Francisco",
"O": "Wazuh",
"OU": "Wazuh Root CA"
}
]
}
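A CSR file like `ca.json` is typically consumed by cfssl to create the root CA. A minimal sketch, assuming `cfssl` and `cfssljson` are installed and run from the `integrations/config/certs` directory:

```shell
CA_CONFIG="ca.json"

if command -v cfssl >/dev/null 2>&1; then
  # Emits ca.pem (certificate), ca-key.pem (private key) and ca.csr
  cfssl gencert -initca "$CA_CONFIG" | cfssljson -bare ca
else
  echo "cfssl/cfssljson are not installed"
fi
```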
58 changes: 58 additions & 0 deletions integrations/config/certs/cfssl.json
@@ -0,0 +1,58 @@
{
"signing": {
"default": {
"expiry": "8760h"
},
"profiles": {
"intermediate_ca": {
"usages": [
"signing",
"digital signature",
"key encipherment",
"cert sign",
"crl sign",
"server auth",
"client auth"
],
"expiry": "8760h",
"ca_constraint": {
"is_ca": true,
"max_path_len": 0,
"max_path_len_zero": true
}
},
"peer": {
"usages": [
"signing",
"digital signature",
"key encipherment",
"data encipherment",
"client auth",
"server auth"
],
"expiry": "8760h"
},
"server": {
"usages": [
"signing",
        "digital signature",
"key encipherment",
"data encipherment",
"server auth"
],
"expiry": "8760h"
},
"client": {
"usages": [
"signing",
"digital signature",
"key encipherment",
"data encipherment",
"client auth"
],
"expiry": "8760h"
}
}
}
}

19 changes: 19 additions & 0 deletions integrations/config/certs/host.json
@@ -0,0 +1,19 @@
{
"CN": "HOST",
"key": {
"algo": "rsa",
"size": 2048
},
"names": [
{
"C": "US",
"L": "California",
"O": "Wazuh",
"OU": "Wazuh"
}
],
"hosts": [
"HOST",
"localhost"
]
}
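The `HOST` placeholders in `host.json` suggest the generator substitutes each node's name before signing. Signing a host certificate against the CA using one of the profiles from `cfssl.json` would look roughly like this (file names are assumptions based on the configs above):

```shell
PROFILE="server"   # any profile name defined in cfssl.json

if command -v cfssl >/dev/null 2>&1; then
  # ca.pem/ca-key.pem come from the CA step; emits host.pem and host-key.pem
  cfssl gencert -ca ca.pem -ca-key ca-key.pem \
    -config cfssl.json -profile "$PROFILE" host.json | cfssljson -bare host
else
  echo "cfssl/cfssljson are not installed"
fi
```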