Data Prepper 2.7 documentation (#6763) #6797

Merged
merged 1 commit on Mar 27, 2024
@@ -128,6 +128,7 @@ extensions:
        region: <YOUR_REGION_1>
        sts_role_arn: <YOUR_STS_ROLE_ARN_1>
        refresh_interval: <YOUR_REFRESH_INTERVAL>
        disable_refresh: false
      <YOUR_SECRET_CONFIG_ID_2>:
        ...
```
@@ -148,7 +149,8 @@ Option | Required | Type | Description
secret_id | Yes | String | The AWS secret name or ARN. |
region | No | String | The AWS region of the secret. Defaults to `us-east-1`.
sts_role_arn | No | String | The AWS Security Token Service (AWS STS) role to assume for requests to the AWS Secrets Manager. Defaults to `null`, which will use the [standard SDK behavior for credentials](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/credentials.html).
refresh_interval | No | Duration | The refreshment interval for AWS secrets extension plugin to poll new secret values. Defaults to `PT1H`. See [Automatically refreshing secrets](#automatically-refreshing-secrets) for details.
refresh_interval | No | Duration | The refreshment interval for the AWS Secrets extension plugin to poll new secret values. Defaults to `PT1H`. For more information, see [Automatically refreshing secrets](#automatically-refreshing-secrets).
disable_refresh | No | Boolean | Disables regular polling for the latest secret values inside the AWS secrets extension plugin. Defaults to `false`. When set to `true`, `refresh_interval` is not used.
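
The following minimal sketch shows a secret entry that disables refreshing. The nesting under `aws`/`secrets` and the IDs and values are assumptions based on the truncated example above:

```yaml
extensions:
  aws:
    secrets:
      pipeline-credentials:            # hypothetical secret configuration ID
        secret_id: my-pipeline-secret  # hypothetical AWS secret name
        region: us-east-1
        disable_refresh: true          # refresh_interval is ignored when refreshing is disabled
```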

#### Reference secrets
15 changes: 15 additions & 0 deletions _data-prepper/managing-data-prepper/extensions/extensions.md
@@ -0,0 +1,15 @@
---
layout: default
title: Extensions
parent: Managing Data Prepper
has_children: true
nav_order: 18
---

# Extensions

Data Prepper extensions provide Data Prepper functionality outside of core Data Prepper pipeline components.
Many extensions provide configuration options that give Data Prepper administrators greater flexibility over Data Prepper's functionality.

Extensions are configured in the `data-prepper-config.yaml` file under the `extensions:` YAML block.
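
For example, the following minimal sketch (using the `geoip_service` extension documented in this section) shows the general shape of the block:

```yaml
extensions:
  geoip_service:
    maxmind:
      database_refresh_interval: PT1H
```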

67 changes: 67 additions & 0 deletions _data-prepper/managing-data-prepper/extensions/geoip_service.md
@@ -0,0 +1,67 @@
---
layout: default
title: geoip_service
nav_order: 5
parent: Extensions
grand_parent: Managing Data Prepper
---

# geoip_service

The `geoip_service` extension configures all [`geoip`]({{site.url}}{{site.baseurl}}/data-prepper/pipelines/configuration/processors/geoip) processors in Data Prepper.

## Usage

You can configure the GeoIP service that Data Prepper uses for the `geoip` processor.
By default, the GeoIP service comes with the [`maxmind`](#maxmind) option configured.

The following example shows how to configure the `geoip_service` in the `data-prepper-config.yaml` file:

```yaml
extensions:
  geoip_service:
    maxmind:
      database_refresh_interval: PT1H
      cache_count: 16_384
```

## maxmind

The GeoIP service supports the MaxMind [GeoIP and GeoLite](https://dev.maxmind.com/geoip) databases.
By default, Data Prepper will use all three of the following [MaxMind GeoLite2](https://dev.maxmind.com/geoip/geolite2-free-geolocation-data) databases:

* City
* Country
* ASN

The service also downloads databases automatically to keep Data Prepper up to date with changes from MaxMind.

You can use the following options to configure the `maxmind` extension.

Option | Required | Type | Description
:--- | :--- | :--- | :---
`databases` | No | [database](#database) | The database configuration.
`database_refresh_interval` | No | Duration | How frequently to check for updates from MaxMind. This can be any duration in the range of 15 minutes to 30 days. Default is `PT7D`.
`cache_count` | No | Integer | The maximum number of items in the cache, with a range of 100--100,000. Default is `4096`.
`database_destination` | No | String | The name of the directory in which to store downloaded databases. Default is `{data-prepper.dir}/data/geoip`.
`aws` | No | [aws](#aws) | Configures the AWS credentials for downloading the database from Amazon Simple Storage Service (Amazon S3).
`insecure` | No | Boolean | When `true`, this option allows you to download database files over HTTP. Default is `false`.

## database

Option | Required | Type | Description
:--- | :--- | :--- | :---
`city` | No | String | The URL at which the city database resides. Can be an HTTP URL for a manifest file, an MMDB file, or an S3 URL.
`country` | No | String | The URL at which the country database resides. Can be an HTTP URL for a manifest file, an MMDB file, or an S3 URL.
`asn` | No | String | The URL at which the Autonomous System Number (ASN) database resides. Can be an HTTP URL for a manifest file, an MMDB file, or an S3 URL.
`enterprise` | No | String | The URL at which the enterprise database resides. Can be an HTTP URL for a manifest file, an MMDB file, or an S3 URL.
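
A minimal sketch of the `databases` option follows; the URLs are hypothetical placeholders:

```yaml
extensions:
  geoip_service:
    maxmind:
      databases:
        city: https://example.com/geoip/city/manifest.json        # hypothetical manifest URL
        country: s3://example-bucket/geoip/GeoLite2-Country.mmdb  # hypothetical S3 URL
```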


## aws

Option | Required | Type | Description
:--- | :--- | :--- | :---
`region` | No | String | The AWS Region to use for the credentials. Default is the [standard SDK behavior for determining the Region](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/region-selection.html).
`sts_role_arn` | No | String | The AWS Security Token Service (AWS STS) role to assume for requests to Amazon S3. Default is `null`, which will use the [standard SDK behavior for credentials](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/credentials.html).
`aws_sts_header_overrides` | No | Map | A map of header overrides that the AWS Identity and Access Management (IAM) role assumes when downloading from Amazon S3.
`sts_external_id` | No | String | An STS external ID used when Data Prepper assumes the STS role. For more information, see the `ExternalID` documentation in the [STS AssumeRole](https://docs.aws.amazon.com/STS/latest/APIReference/API_AssumeRole.html) API reference.
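
A minimal sketch of the `aws` block used for downloading databases from Amazon S3 follows; the role ARN is a hypothetical placeholder:

```yaml
extensions:
  geoip_service:
    maxmind:
      aws:
        region: us-east-1
        sts_role_arn: arn:aws:iam::123456789012:role/example-geoip-s3-role  # hypothetical role
```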
54 changes: 43 additions & 11 deletions _data-prepper/pipelines/configuration/processors/date.md
@@ -9,24 +9,32 @@
# date


The `date` processor adds a default timestamp to an event, parses timestamp fields, and converts timestamp information to the International Organization for Standardization (ISO) 8601 format. This timestamp information can be used as an event timestamp.

## Configuration

The following table describes the options you can use to configure the `date` processor.

<!-- vale off -->
Option | Required | Type | Description
:--- | :--- | :--- | :---
match | Conditionally | List | List of `key` and `patterns` where patterns is a list. The list of match can have exactly one `key` and `patterns`. There is no default value. This option cannot be defined at the same time as `from_time_received`. Include multiple date processors in your pipeline if both options should be used.
from_time_received | Conditionally | Boolean | A boolean that is used for adding default timestamp to event data from event metadata which is the time when source receives the event. Default value is `false`. This option cannot be defined at the same time as `match`. Include multiple date processors in your pipeline if both options should be used.
destination | No | String | Field to store the timestamp parsed by date processor. It can be used with both `match` and `from_time_received`. Default value is `@timestamp`.
source_timezone | No | String | Time zone used to parse dates. It is used in case the zone or offset cannot be extracted from the value. If the zone or offset are part of the value, then timezone is ignored. Find all the available timezones [the list of database time zones](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#List) in the **TZ database name** column.
destination_timezone | No | String | Timezone used for storing timestamp in `destination` field. The available timezone values are the same as `source_timestamp`.
locale | No | String | Locale is used for parsing dates. It's commonly used for parsing month names(`MMM`). It can have language, country and variant fields using IETF BCP 47 or String representation of [Locale](https://docs.oracle.com/javase/8/docs/api/java/util/Locale.html) object. For example `en-US` for IETF BCP 47 and `en_US` for string representation of Locale. Full list of locale fields which includes language, country and variant can be found [the language subtag registry](https://www.iana.org/assignments/language-subtag-registry/language-subtag-registry). Default value is `Locale.ROOT`.
`match` | Conditionally | [Match](#match) | The date match configuration. This option cannot be defined at the same time as `from_time_received`. There is no default value.
`from_time_received` | Conditionally | Boolean | When `true`, the timestamp from the event metadata, which is the time at which the source receives the event, is added to the event data. This option cannot be defined at the same time as `match`. Default is `false`.
`date_when` | No | String | Specifies under what condition the `date` processor should perform matching. Default is no condition.
`to_origination_metadata` | No | Boolean | When `true`, the matched time is also added to the event's metadata as an instance of `Instant`. Default is `false`.
`destination` | No | String | The field used to store the timestamp parsed by the date processor. Can be used with both `match` and `from_time_received`. Default is `@timestamp`.
`output_format` | No | String | Determines the format of the timestamp added to an event. Default is `yyyy-MM-dd'T'HH:mm:ss.SSSXXX`.
`source_timezone` | No | String | The time zone used to parse dates, including when the zone or offset cannot be extracted from the value. If the zone or offset are part of the value, then the time zone is ignored. A list of all the available time zones is contained in the **TZ database name** column of [the list of database time zones](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#List).
`destination_timezone` | No | String | The time zone used for storing the timestamp in the `destination` field. A list of all the available time zones is contained in the **TZ database name** column of [the list of database time zones](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#List).
`locale` | No | String | The location used for parsing dates. Commonly used for parsing month names (`MMM`). The value can contain language, country, or variant fields in IETF BCP 47, such as `en-US`, or a string representation of the [locale](https://docs.oracle.com/javase/8/docs/api/java/util/Locale.html) object, such as `en_US`. A full list of locale fields, including language, country, and variant, can be found in [the language subtag registry](https://www.iana.org/assignments/language-subtag-registry/language-subtag-registry). Default is `Locale.ROOT`.
<!-- vale on -->

<!---## Configuration

Content will be added to this section.--->

### Match

Option | Required | Type | Description
:--- | :--- | :--- | :---
`key` | Yes | String | Represents the event key against which to match patterns. Required if `match` is configured.
`patterns` | Yes | List | A list of possible patterns that the timestamp value of the key can have. The patterns are based on a sequence of letters and symbols. The `patterns` support all the patterns listed in the Java [DatetimeFormatter](https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html) reference. The timestamp value also supports `epoch_second`, `epoch_milli`, and `epoch_nano` values, which represent the timestamp as the number of seconds, milliseconds, and nanoseconds since the epoch. Epoch values always use the UTC time zone.
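
As an illustrative sketch (the key name and condition are hypothetical), a `match` entry can list several patterns, including an epoch pattern, and can be gated with `date_when`:

```yaml
- date:
    date_when: '/status == "success"'  # hypothetical condition
    match:
      - key: event_time                # hypothetical event key
        patterns: ["dd/MMM/yyyy:HH:mm:ss", "epoch_milli"]
    destination: "@timestamp"
```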

## Metrics

@@ -40,5 +48,29 @@

The `date` processor includes the following custom metrics.

* `dateProcessingMatchSuccessCounter`: Returns the number of records that match with at least one pattern specified by the `match configuration` option.
* `dateProcessingMatchFailureCounter`: Returns the number of records that did not match any of the patterns specified by the `patterns match` configuration option.
* `dateProcessingMatchSuccessCounter`: Returns the number of records that match at least one pattern specified by the `match` configuration option.
* `dateProcessingMatchFailureCounter`: Returns the number of records that did not match any of the patterns specified by the `match` configuration option's `patterns`.

## Example: Add the default timestamp to an event

The following `date` processor configuration adds a default timestamp in the `@timestamp` field to all events:

```yaml
- date:
    from_time_received: true
    destination: "@timestamp"
```

## Example: Parse a timestamp to convert its format and time zone

The following `date` processor configuration parses the value of the `timestamp` key using the `dd/MMM/yyyy:HH:mm:ss` pattern and writes it to `@timestamp` in the `yyyy-MM-dd'T'HH:mm:ss.SSSXXX` format, converting the time zone in the process:

```yaml
- date:
    match:
      - key: timestamp
        patterns: ["dd/MMM/yyyy:HH:mm:ss"]
    destination: "@timestamp"
    output_format: "yyyy-MM-dd'T'HH:mm:ss.SSSXXX"
    source_timezone: "America/Los_Angeles"
    destination_timezone: "America/Chicago"
    locale: "en_US"
```
49 changes: 49 additions & 0 deletions _data-prepper/pipelines/configuration/processors/decompress.md
@@ -0,0 +1,49 @@
---
layout: default
title: decompress
parent: Processors
grand_parent: Pipelines
nav_order: 40
---

# decompress

The `decompress` processor decompresses any Base64-encoded compressed fields inside an event.

## Configuration

Option | Required | Type | Description
:--- | :--- | :--- | :---
`keys` | Yes | List<String> | The fields in the event that will be decompressed.
`type` | Yes | Enum | The type of decompression to use for the `keys` in the event. Only `gzip` is supported.
`decompress_when` | No | String | A [Data Prepper conditional expression](https://opensearch.org/docs/latest/data-prepper/pipelines/expression-syntax/) that determines when the `decompress` processor will run on certain events.
`tags_on_failure` | No | List<String> | A list of strings with which to tag events when the processor fails to decompress the `keys` inside an event. Defaults to `_decompression_failure`.

## Usage

The following example shows the `decompress` processor used in `pipelines.yaml`:

```yaml
processor:
  - decompress:
      decompress_when: '/some_key == null'
      keys: [ "base_64_gzip_key" ]
      type: gzip
```

## Metrics

The following table describes common [abstract processor](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/processor/AbstractProcessor.java) metrics.

| Metric name | Type | Description |
| ------------- | ---- | -----------|
| `recordsIn` | Counter | The ingress of records to a pipeline component. |
| `recordsOut` | Counter | The egress of records from a pipeline component. |
| `timeElapsed` | Timer | The time elapsed during execution of a pipeline component. |

### Counter

The `decompress` processor includes the following custom metrics:

* `processingErrors`: The number of processing errors that have occurred in the `decompress` processor.

@@ -3,7 +3,7 @@ layout: default
title: delete_entries
parent: Processors
grand_parent: Pipelines
nav_order: 51
nav_order: 41
---

# delete_entries
@@ -3,7 +3,7 @@ layout: default
title: dissect
parent: Processors
grand_parent: Pipelines
nav_order: 52
nav_order: 45
---

# dissect
@@ -3,7 +3,7 @@ layout: default
title: drop_events
parent: Processors
grand_parent: Pipelines
nav_order: 53
nav_order: 46
---

# drop_events