Commit 244eb38 (parent: 40fb5a4)
Valeriya Popova committed Jun 27, 2023
Showing 15 changed files with 829 additions and 218 deletions.

@@ -0,0 +1,55 @@
name: SLO

on:
  pull_request:
    branches: [main]
  workflow_dispatch:

jobs:
  test-slo:
    concurrency:
      group: slo-${{ github.ref }}
    if: (!contains(github.event.pull_request.labels.*.name, 'no slo'))

    runs-on: ubuntu-latest
    name: SLO test
    permissions:
      checks: write
      pull-requests: write
      contents: read
      issues: write

    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      - name: Run SLO
        uses: ydb-platform/slo-tests@js-version
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          KUBECONFIG_B64: ${{ secrets.SLO_KUBE_CONFIG }}
          AWS_CREDENTIALS_B64: ${{ secrets.SLO_AWS_CREDENTIALS }}
          AWS_CONFIG_B64: ${{ secrets.SLO_AWS_CONFIG }}
          DOCKER_USERNAME: ${{ secrets.SLO_DOCKER_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.SLO_DOCKER_PASSWORD }}
          DOCKER_REPO: ${{ secrets.SLO_DOCKER_REPO }}
          DOCKER_FOLDER: ${{ secrets.SLO_DOCKER_FOLDER }}
          s3_endpoint: ${{ secrets.SLO_S3_ENDPOINT }}
          s3_images_folder: ${{ vars.SLO_S3_IMAGES_FOLDER }}
          grafana_domain: ${{ vars.SLO_GRAFANA_DOMAIN }}
          grafana_dashboard: ${{ vars.SLO_GRAFANA_DASHBOARD }}
          ydb_version: 'newest'
          timeBetweenPhases: 30
          shutdownTime: 30

          language_id0: sync
          language0: python-sync
          workload_path0: tests/slo
          workload_build_context0: ../..
          workload_build_options0: -f Dockerfile

      - uses: actions/upload-artifact@v3
        if: always()
        with:
          name: slo-logs
          path: logs/

@@ -0,0 +1,7 @@
FROM python:3.8
COPY . /src
WORKDIR /src
RUN python -m pip install --upgrade pip && python -m pip install -e . && python -m pip install -r tests/slo/requirements.txt
WORKDIR tests/slo

ENTRYPOINT ["python", "src"]

@@ -0,0 +1,133 @@
# SLO workload

SLO is a type of test in which an application based on ydb-sdk is run against a YDB cluster whose nodes, tablets, and network are being taken down (situations that are possible for distributed databases with hundreds of nodes).

### Implementations:

There are two implementations:

- `sync`
- `async` (not implemented yet)

### Usage:

It has 3 commands:

- `create` - creates the table in the database
- `cleanup` - drops the table in the database
- `run` - runs the workload (reads from and writes to the table at the configured RPS)

### Run examples with all arguments:

create:
`python tests/slo/src/ create localhost:2136 /local -t tableName
--min-partitions-count 6 --max-partitions-count 1000 --partition-size 1 -c 1000
--write-timeout 10000`

cleanup:
`python tests/slo/src/ cleanup localhost:2136 /local -t tableName`

run:
`python tests/slo/src/ run localhost:2136 /local -t tableName
--prom-pgw http://prometheus-pushgateway:9091 --report-period 250
--read-rps 1000 --read-timeout 10000
--write-rps 100 --write-timeout 10000
--time 600 --shutdown-time 30`

## Arguments for commands:

### create
`python tests/slo/src/ create <endpoint> <db> [options]`

```
Arguments:
  endpoint                          YDB endpoint to connect to
  db                                YDB database to connect to

Options:
  -t      --table-name            <string>  table name to create
  -p-min  --min-partitions-count  <int>     minimum amount of partitions in table
  -p-max  --max-partitions-count  <int>     maximum amount of partitions in table
  -p-size --partition-size        <int>     partition size in MB
  -c      --initial-data-count    <int>     amount of initially created rows
  --write-timeout                 <int>     write timeout in milliseconds
  --batch-size                    <int>     amount of new records in each create request
  --threads                       <int>     number of threads to use
```

### cleanup
`python tests/slo/src/ cleanup <endpoint> <db> [options]`

```
Arguments:
  endpoint                YDB endpoint to connect to
  db                      YDB database to connect to

Options:
  -t --table-name         <string>  table name to drop
```

### run
`python tests/slo/src/ run <endpoint> <db> [options]`

```
Arguments:
  endpoint                YDB endpoint to connect to
  db                      YDB database to connect to

Options:
  -t --table-name         <string>  table name to use
  --prom-pgw              <string>  Prometheus push gateway
  --report-period         <int>     Prometheus push period in milliseconds
  --read-rps              <int>     read RPS
  --read-timeout          <int>     read timeout in milliseconds
  --write-rps             <int>     write RPS
  --write-timeout         <int>     write timeout in milliseconds
  --time                  <int>     run time in seconds
  --shutdown-time         <int>     graceful shutdown time in seconds
  --read-threads          <int>     number of threads to use for read requests
  --write-threads         <int>     number of threads to use for write requests
```

## Authentication

The workload uses [auth-env](https://ydb.yandex-team.ru/docs/reference/ydb-sdk/recipes/auth-env) for authentication.
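
For illustration only, a minimal sketch of what environment-based authentication looks like with the ydb Python SDK (endpoint and database values are placeholders, not taken from this commit):

```python
# Sketch: building a driver with credentials picked up from the environment,
# as described in the auth-env recipe. Endpoint/database are placeholders.
import ydb

driver = ydb.Driver(
    endpoint="grpc://localhost:2136",
    database="/local",
    # Chooses the credentials source from YDB_* environment variables
    # (anonymous, access token, static credentials, or service account key).
    credentials=ydb.credentials_from_env_variables(),
)
driver.wait(timeout=5)
```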

## What's inside
When running the `run` command, the program creates three jobs: `readJob`, `writeJob`, `metricsJob`.

- `readJob` reads rows from the table one by one, using random identifiers generated by `writeJob`
- `writeJob` generates and inserts rows
- `metricsJob` periodically sends metrics to Prometheus
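
A rough sketch (hypothetical names, not the code in `tests/slo/src`) of how these three jobs could be wired together with threads and the `ratelimiter` package from `requirements.txt`:

```python
# Sketch only: the real workload's row generation, YDB calls, and metric
# pushes are reduced to comments; write_job/read_job/metrics_job and
# known_ids are hypothetical names.
import random
import threading
import time

from ratelimiter import RateLimiter

stop = threading.Event()
known_ids = []                  # ids produced by writeJob, sampled by readJob
ids_lock = threading.Lock()


def write_job(write_rps: int) -> None:
    limiter = RateLimiter(max_calls=write_rps, period=1)
    next_id = 0
    while not stop.is_set():
        with limiter:
            next_id += 1
            # ... UPSERT a generated row with object_id = next_id here ...
            with ids_lock:
                known_ids.append(next_id)


def read_job(read_rps: int) -> None:
    limiter = RateLimiter(max_calls=read_rps, period=1)
    while not stop.is_set():
        with limiter:
            with ids_lock:
                object_id = random.choice(known_ids) if known_ids else None
            if object_id is not None:
                pass  # ... SELECT the row with this object_id here ...


def metrics_job(report_period_ms: int) -> None:
    while not stop.is_set():
        # ... push collected metrics to the Prometheus push gateway here ...
        time.sleep(report_period_ms / 1000)


if __name__ == "__main__":
    jobs = [
        threading.Thread(target=write_job, args=(100,)),    # --write-rps
        threading.Thread(target=read_job, args=(1000,)),    # --read-rps
        threading.Thread(target=metrics_job, args=(250,)),  # --report-period
    ]
    for job in jobs:
        job.start()
    time.sleep(600)  # --time
    stop.set()
    for job in jobs:
        job.join()
```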

The table has these fields:
- `object_id Uint64`
- `object_hash Uint64 Digest::NumericHash(id)`
- `payload_str UTF8`
- `payload_double Double`
- `payload_timestamp Timestamp`

Primary key: `("object_hash", "object_id")`
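
For illustration only, this is one way the table above could be described with the ydb Python SDK (a sketch, not necessarily how the `create` command in `tests/slo/src` does it):

```python
# Sketch: describing the SLO table with the ydb Python SDK. object_hash is a
# plain Uint64 column here; the workload is expected to fill it with
# Digest::NumericHash(object_id) in YQL at insert time.
import ydb


def create_slo_table(pool: ydb.SessionPool, path: str) -> None:
    def callee(session):
        session.create_table(
            path,
            ydb.TableDescription()
            .with_column(ydb.Column("object_id", ydb.OptionalType(ydb.PrimitiveType.Uint64)))
            .with_column(ydb.Column("object_hash", ydb.OptionalType(ydb.PrimitiveType.Uint64)))
            .with_column(ydb.Column("payload_str", ydb.OptionalType(ydb.PrimitiveType.Utf8)))
            .with_column(ydb.Column("payload_double", ydb.OptionalType(ydb.PrimitiveType.Double)))
            .with_column(ydb.Column("payload_timestamp", ydb.OptionalType(ydb.PrimitiveType.Timestamp)))
            .with_primary_keys("object_hash", "object_id"),
        )

    pool.retry_operation_sync(callee)
```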

## Collected metrics
- `oks` - amount of OK requests
- `not_oks` - amount of not-OK requests
- `inflight` - amount of requests in flight
- `latency` - summary of latencies in ms
- `attempts` - summary of retry attempts per request

> You must reset the metrics to keep them at `0` in Prometheus and Grafana before the jobs begin and after they end.
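
A rough sketch of how metrics like these could be registered and pushed with `prometheus-client` (listed in `requirements.txt`); the label and job names here are illustrative, not taken from the workload:

```python
# Sketch: SLO-style metrics pushed to a Prometheus push gateway.
from prometheus_client import CollectorRegistry, Counter, Gauge, Summary, push_to_gateway

registry = CollectorRegistry()
oks = Counter("oks", "amount of OK requests", ["job_name"], registry=registry)
not_oks = Counter("not_oks", "amount of not-OK requests", ["job_name"], registry=registry)
inflight = Gauge("inflight", "amount of requests in flight", ["job_name"], registry=registry)
latency = Summary("latency", "summary of latencies in ms", ["job_name"], registry=registry)
attempts = Summary("attempts", "summary of retry attempts per request", ["job_name"], registry=registry)


def report(push_gateway: str) -> None:
    # Called periodically by the metrics job (see --report-period).
    push_to_gateway(push_gateway, job="slo-workload-sync", registry=registry)


# Recording one successful read request:
inflight.labels("read").inc()
latency.labels("read").observe(12.5)
attempts.labels("read").observe(1)
oks.labels("read").inc()
inflight.labels("read").dec()
```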

## Look at metrics in Grafana
You can get the dashboard used in this test [here](https://github.com/ydb-platform/slo-tests/blob/main/k8s/helms/grafana.yaml#L69); you will need to import the JSON into Grafana.
This file was deleted.
This file was deleted.

@@ -0,0 +1,4 @@
requests==2.28.2
ratelimiter==1.2.0.post0
prometheus-client==0.17.0
quantile-estimator==0.1.2