

# CloudWatch pipelines processors

CloudWatch pipelines processors transform, parse, and enrich log data as it flows through the pipeline. A pipeline can have up to 20 processors that are applied sequentially in the order they are defined.
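As a rough illustration (not the service implementation), sequential processing means each processor receives the output of the previous one. The toy processors below are hypothetical examples:

```python
# Illustrative sketch: processors are applied to each log event
# in the order they are defined.
def run_pipeline(event, processors):
    for process in processors:
        event = process(event)
    return event

# Two toy processors: add a static field, then uppercase the "level" field.
add_env = lambda e: {**e, "environment": "production"}
upper_level = lambda e: {**e, "level": e["level"].upper()}

result = run_pipeline({"level": "error", "msg": "disk full"},
                      [add_env, upper_level])
# result == {"level": "ERROR", "msg": "disk full", "environment": "production"}
```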

**Transformation metadata**  
When a pipeline processes log events, CloudWatch pipelines automatically adds transformation metadata to each processed log entry. This metadata indicates that the log has been transformed, making it easy to distinguish between original and processed data. If you enable the **Keep original log** option during pipeline creation, you can compare the original log with the transformed version at any time.


**Processor categories**  

| Category | Description | 
| --- | --- | 
| Parsers | Convert raw log data into structured formats, such as Open Cybersecurity Schema Framework (OCSF), CSV, JSON, and so on | 
| Transformers | Modify log data structure; add, copy, move, or delete fields | 
| String Processors | Manipulate string values; case conversion, trimming, substitution | 

**Topics**
+ [Parser processors](parser-processors.md)
+ [Transformation processors](transformation-processors.md)
+ [String manipulation processors](string-processors.md)
+ [Filter processors](filter-processors.md)
+ [Common processor use cases](processor-examples.md)
+ [Processor compatibility and restrictions](processor-compatibility.md)

# Parser processors


Parser processors convert raw or semi-structured log data into structured formats. Each pipeline can have at most one parser processor, which must be the first processor in the pipeline.

**Conditional processing not supported**  
Parser processors (except Grok) do not support conditional processing with the `when` parameter. This includes OCSF, CSV, JSON, KeyValue, VPC, Route53, WAF, Postgres, and CloudFront parsers. For more information, see [Expression syntax for conditional processing](conditional-processing.md).

## OCSF processor


Parses and transforms log data according to Open Cybersecurity Schema Framework (OCSF) standards.

**Configuration**  
Configure the OCSF processor with the following parameters:

```
processor:
  - ocsf:
      version: "1.5"
      mapping_version: 1.5.0
      schema:
          microsoft_office365_management_activity:
```

**Parameters**

`version` (required)  
The OCSF schema version to use for transformation. Must be 1.5.

`mapping_version` (required)  
The OCSF mapping version for transformation. Must be 1.5.0.

`schema` (required)  
Schema object specifying the data source type. The supported schemas depend on the pipeline source type; each source type has its own set of compatible OCSF schemas. You must use a schema that matches your pipeline's source type.

This table lists the supported schema combinations.


| Pipeline Source Type | Supported Schemas | Version | Mapping Version | 
| --- | --- | --- | --- | 
| cloudwatch_logs | cloud_trail | 1.5 | Not required | 
| cloudwatch_logs | route53_resolver | 1.5 | Not required | 
| cloudwatch_logs | vpc_flow | 1.5 | Not required | 
| cloudwatch_logs | eks_audit | 1.5 | Not required | 
| cloudwatch_logs | aws_waf | 1.5 | Not required | 
| s3 | Any OCSF schema | Any | Any | 
| microsoft_office365 | microsoft_office365 | 1.5 | 1.5.0 | 
| microsoft_entraid | microsoft_entraid | 1.5 | 1.5.0 | 
| microsoft_windows_event | microsoft_windows_event | 1.5 | 1.5.0 | 
| paloaltonetworks_nextgenerationfirewall | paloaltonetworks_nextgenerationfirewall | 1.5 | 1.5.0 | 
| okta_auth0 | okta_auth0 | 1.5 | 1.5.0 | 
| okta_sso | okta_sso | 1.5 | 1.5.0 | 
| crowdstrike_falcon | crowdstrike_falcon | 1.5 | 1.5.0 | 
| github_auditlogs | github_auditlogs | 1.5 | 1.5.0 | 
| sentinelone_endpointsecurity | sentinelone_endpointsecurity | 1.5 | 1.5.0 | 
| servicenow_cmdb | servicenow_cmdb | 1.5 | 1.5.0 | 
| wiz_cnapp | wiz_cnapp | 1.5 | 1.5.0 | 
| zscaler_internetaccess | zscaler_internetaccess | 1.5 | 1.5.0 | 

## CSV processor


Parses CSV formatted data into structured fields.

**Configuration**  
Configure the CSV processor with the following parameters:

```
processor:
  - csv:      
      column_names: ["col1", "col2", "col3"]
      delimiter: ","
      quote_character: '"'
```

**Parameters**

`column_names` (optional)  
Array of column names for parsed fields. Maximum 100 columns, each name up to 128 characters. If not provided, defaults to `column_1`, `column_2`, and so on.

`delimiter` (optional)  
Character used to separate CSV fields. Must be a single character. Defaults to comma (,).

`quote_character` (optional)  
Character used to quote CSV fields containing delimiters. Must be a single character. Defaults to double quote (").

To use the processor without specifying additional parameters, use the following configuration:

```
processor:
  - csv: {}
```
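As an illustrative sketch only (using Python's standard `csv` module, not the service implementation), the documented parameters map onto CSV parsing like this:

```python
import csv
import io

def parse_csv_event(message, column_names=None, delimiter=",",
                    quote_character='"'):
    # Split one CSV-formatted log line into named fields.
    row = next(csv.reader(io.StringIO(message), delimiter=delimiter,
                          quotechar=quote_character))
    if column_names is None:
        # Mirrors the documented default of column_1, column_2, and so on.
        column_names = [f"column_{i + 1}" for i in range(len(row))]
    return dict(zip(column_names, row))

parsed = parse_csv_event('alice,"GET /index.html",200',
                         ["user", "request", "status"])
# parsed == {"user": "alice", "request": "GET /index.html", "status": "200"}
```

Note how the quote character lets a field contain the delimiter without being split.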

## Grok processor


Parses unstructured data using Grok patterns. At most one Grok processor is supported per pipeline. For details on the Grok transformer in CloudWatch Logs, see [Processors that you can use](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html) in the *CloudWatch Logs User Guide*.

**Configuration**  
Configure the Grok processor with the following parameters:

The same configuration applies whether the data source is a dictionary or CloudWatch Logs:

```
processor:
  - grok:
      match:
        source_key: ["%{WORD:level} %{GREEDYDATA:msg}"]
```

**Parameters**

`match` (required)  
Field mapping with Grok patterns. Only one field mapping allowed.

`match.<field>` (required)  
Array with single Grok pattern. Maximum 512 characters per pattern.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

**Important**  
If the Grok processor is used as the parser (first processor) in a pipeline and its `when` condition evaluates to false, the entire pipeline does not execute for that log event. Parsers must run for downstream processors to receive structured data.
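As a rough illustration (not the service implementation), a Grok pattern such as `%{WORD:level} %{GREEDYDATA:msg}` behaves like a regular expression with named capture groups:

```python
import re

# %{WORD:level} %{GREEDYDATA:msg} is roughly equivalent to this
# named-group regex (%{WORD} -> \w+, %{GREEDYDATA} -> .*).
GROK_EQUIVALENT = re.compile(r"(?P<level>\w+) (?P<msg>.*)")

def grok_match(event, source_key="message"):
    m = GROK_EQUIVALENT.match(event.get(source_key, ""))
    if m:
        event.update(m.groupdict())  # extracted fields join the event
    return event

event = grok_match({"message": "ERROR connection refused"})
# event["level"] == "ERROR", event["msg"] == "connection refused"
```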

## VPC processor


Parses VPC Flow Log data into structured fields.

**Configuration**  
Configure the VPC processor with the following parameters:

```
processor:
  - parse_vpc: {}
```

## JSON processor


Parses JSON data into structured fields.

**Configuration**  
Configure the JSON processor with the following parameters:

```
processor:
  - parse_json:
      source: "message"
      destination: "parsed_json"
```

**Parameters**

`source` (optional)  
The field containing the JSON data to parse. If omitted, the entire log message is processed.

`destination` (optional)  
The field where the parsed JSON is stored. If omitted, parsed fields are added to the root level.
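The `source`/`destination` behavior can be sketched as follows (an illustration only, not the service implementation):

```python
import json

def parse_json_processor(event, source=None, destination=None):
    # Read JSON from `source` (or the whole message) and store the result.
    raw = event[source] if source else event.get("message", "")
    parsed = json.loads(raw)
    if destination:
        event[destination] = parsed   # store under a single field
    else:
        event.update(parsed)          # merge at the root level
    return event

event = parse_json_processor(
    {"message": '{"user": "alice", "status": 200}'}, source="message")
# event now also contains "user" and "status" at the root level
```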

## Route 53 processor


Parses Route 53 resolver log data into structured fields.

**Configuration**  
Configure the Route 53 processor with the following parameters:

```
processor:
  - parse_route53: {}
```

## Key-value processor


Parses key-value pair formatted data into structured fields.

**Configuration**  
Configure the key-value processor with the following parameters:

```
processor:
  - key_value:
      source: "message"
      destination: "parsed_kv"
      field_delimiter: "&"
      key_value_delimiter: "="
```

**Parameters**

`source` (optional)  
Field containing key-value data. Maximum 128 characters.

`destination` (optional)  
Target field for parsed key-value pairs. Maximum 128 characters.

`field_delimiter` (optional)  
Pattern to split key-value pairs. Maximum 10 characters.

`key_value_delimiter` (optional)  
Pattern to split keys from values. Maximum 10 characters.

`overwrite_if_destination_exists` (optional)  
Whether to overwrite existing destination field.

`prefix` (optional)  
Prefix to add to extracted keys. Maximum 128 characters.

`non_match_value` (optional)  
Value for keys without matches. Maximum 128 characters.

To use the processor without specifying additional parameters, use the following configuration:

```
processor:
  - key_value: {}
```
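A minimal sketch of how the delimiters, `prefix`, and `non_match_value` parameters interact (illustrative only, not the service implementation):

```python
def key_value_processor(message, field_delimiter="&", key_value_delimiter="=",
                        prefix="", non_match_value=None):
    # Split "k=v&k2=v2"-style data into a field dictionary.
    result = {}
    for pair in message.split(field_delimiter):
        key, sep, value = pair.partition(key_value_delimiter)
        # Keys with no key-value delimiter get non_match_value.
        result[prefix + key] = value if sep else non_match_value
    return result

parsed = key_value_processor("user=alice&role=admin&flag",
                             non_match_value="N/A")
# parsed == {"user": "alice", "role": "admin", "flag": "N/A"}
```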

# Transformation processors


Transformation processors modify the structure of log events by adding, copying, moving, or removing fields.

## add_entries processor


Adds static key-value pairs to log events. At most one `add_entries` processor can be added to a pipeline.

**Configuration**  
Configure the add_entries processor with the following parameters:

```
processor:
  - add_entries:
      entries:
        - key: "environment"
          value: "production"
          overwrite_if_key_exists: false
```

**Parameters**

`entries` (required)  
Array of key-value pairs to add to each log event.

`entries[].key` (required)  
The field name to add to the log event. Supports nested fields using dot notation.

`entries[].value` (required)  
The static value to assign to the key.

`entries[].overwrite_if_key_exists` (optional)  
Boolean flag that determines behavior when the key already exists. Defaults to false.

`when` (optional)  
Processor-level conditional expression. When specified, the entire processor is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when` (optional)  
Entry-level conditional expression. When specified, only this entry is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when_else` (optional)  
Fallback entry that executes only when none of the other `when` conditions in the same processor matched. The expression value identifies which `when` conditions to consider. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## copy_values processor


Copies values from one field to another. At most one `copy_values` processor can be added to a pipeline.

**Configuration**  
Configure the copy_values processor with the following parameters:

```
processor:
  - copy_values:
      entries:
        - from_key: "user_id"
          to_key: "backup_user"
          overwrite_if_to_key_exists: false
```

**Parameters**

`entries` (required)  
Array of copy operations to perform on each log event.

`entries[].from_key` (required)  
The field name to copy the value from. Uses dot notation for nested fields.

`entries[].to_key` (required)  
The field name to copy the value to. Will create nested structures if using dot notation.

`entries[].overwrite_if_to_key_exists` (optional)  
Boolean flag controlling behavior when target field already exists. Defaults to false.

`when` (optional)  
Processor-level conditional expression. When specified, the entire processor is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when` (optional)  
Entry-level conditional expression. When specified, only this entry is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when_else` (optional)  
Fallback entry that executes only when none of the other `when` conditions in the same processor matched. The expression value identifies which `when` conditions to consider. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## delete_entries processor


Removes specified fields from log events.

**Configuration**  
Configure the delete_entries processor with the following parameters:

```
processor:
  - delete_entries:
      with_keys: ["temp_field", "debug_info"]
```

**Parameters**

`with_keys` (required)  
Array of field names to remove from each log event. Supports nested field deletion using dot notation.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## move_keys processor


Moves fields from one location to another.

**Configuration**  
Configure the move_keys processor with the following parameters:

```
processor:
  - move_keys:
      entries:
        - from_key: "old_field"
          to_key: "new_field"
          overwrite_if_to_key_exists: true
```

**Parameters**

`entries` (required)  
Array of move operations. Maximum 5 entries.

`entries[].from_key` (required)  
Source field name. Maximum 128 characters.

`entries[].to_key` (required)  
Target field name. Maximum 128 characters.

`entries[].overwrite_if_to_key_exists` (optional)  
Whether to overwrite existing target field.

`when` (optional)  
Processor-level conditional expression. When specified, the entire processor is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when` (optional)  
Entry-level conditional expression. When specified, only this entry is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when_else` (optional)  
Fallback entry that executes only when none of the other `when` conditions in the same processor matched. The expression value identifies which `when` conditions to consider. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## flatten processor


Flattens nested object structures.

**Configuration**  
Configure the flatten processor with the following parameters:

```
processor:
  - flatten:
      source: "metadata"
      target: "flattened"
      remove_processed_fields: true
      exclude_keys: ["sensitive_data"]
```

**Parameters**

`source` (required)  
Field containing nested object to flatten.

`target` (required)  
Target field prefix for flattened keys.

`remove_processed_fields` (optional)  
Whether to remove the original nested field after flattening.

`exclude_keys` (optional)  
Array of keys to exclude from flattening. Maximum 20 keys, each up to 128 characters.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
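The flattening behavior can be sketched as a recursive walk that turns nested keys into dotted paths (an illustration only, not the service implementation):

```python
def flatten(obj, parent="", exclude_keys=()):
    # Nested keys become dotted paths, e.g. {"a": {"b": 1}} -> {"a.b": 1}.
    out = {}
    for key, value in obj.items():
        path = f"{parent}.{key}" if parent else key
        if isinstance(value, dict) and key not in exclude_keys:
            out.update(flatten(value, path, exclude_keys))
        else:
            out[path] = value  # leaves (and excluded keys) pass through
    return out

flat = flatten({"metadata": {"region": "us-east-1", "tags": {"team": "ops"}}})
# flat == {"metadata.region": "us-east-1", "metadata.tags.team": "ops"}
```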

# String manipulation processors


String processors modify text values within log events through operations like case conversion, trimming, and pattern matching.

## lowercase_string processor


Converts specified fields to lowercase.

**Configuration**  
Configure the lowercase_string processor with the following parameters:

```
processor:
  - lowercase_string:
      with_keys: ["status", "method"]
```

**Parameters**

`with_keys` (required)  
Array of field names to convert to lowercase. Only processes string values.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## uppercase_string processor


Converts specified fields to uppercase.

**Configuration**  
Configure the uppercase_string processor with the following parameters:

```
processor:
  - uppercase_string:
      with_keys: ["status_code", "method"]
```

**Parameters**

`with_keys` (required)  
Array of field names to convert to uppercase. Only processes string values.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## trim_string processor


Removes leading and trailing whitespace from specified fields.

**Configuration**  
Configure the trim_string processor with the following parameters:

```
processor:
  - trim_string:
      with_keys: ["message", "user_input"]
```

**Parameters**

`with_keys` (required)  
Array of field names to trim whitespace from. Only processes string values.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## substitute_string processor


Performs string substitution using regular expressions.

**Configuration**  
Configure the substitute_string processor with the following parameters:

```
processor:
  - substitute_string:
      entries:
        - source: "message"
          from: "ERROR"
          to: "WARN"
```

**Parameters**

`entries` (required)  
Array of substitution operations to perform on each log event.

`entries[].source` (required)  
The field to perform string substitution on.

`entries[].from` (required)  
The regular expression pattern to match and replace.

`entries[].to` (required)  
The replacement string for matched patterns.

`when` (optional)  
Processor-level conditional expression. When specified, the entire processor is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when` (optional)  
Entry-level conditional expression. When specified, only this entry is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when_else` (optional)  
Fallback entry that executes only when none of the other `when` conditions in the same processor matched. The expression value identifies which `when` conditions to consider. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
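The substitution entries map naturally onto regex replacement, sketched here for illustration (not the service implementation):

```python
import re

def substitute_string(event, entries):
    # Each entry applies a regex substitution to its source field in place.
    for entry in entries:
        source = entry["source"]
        if source in event:
            event[source] = re.sub(entry["from"], entry["to"], event[source])
    return event

event = substitute_string(
    {"message": "ERROR: retry ERROR"},
    [{"source": "message", "from": "ERROR", "to": "WARN"}])
# event["message"] == "WARN: retry WARN"
```

All occurrences of the pattern are replaced, not just the first.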

## truncate processor


Truncates field values to a specified length.

**Configuration**  
Configure the truncate processor with the following parameters:

```
processor:
  - truncate:
      source_keys: ["message", "description"]
      length: 100
      start_at: 0
```

**Parameters**

`source_keys` (required)  
Array of field names to truncate. Each field name maximum 128 characters.

`length` (optional)  
Maximum length after truncation. Range: 1-8192.

`start_at` (optional)  
Starting position for truncation. Range: 0-8192. Defaults to 0.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
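A hedged sketch of one plausible reading of `length` and `start_at` (keep up to `length` characters beginning at `start_at`); this is an illustration, not the service implementation:

```python
def truncate(value, length=None, start_at=0):
    # Keep up to `length` characters starting at offset `start_at`.
    end = None if length is None else start_at + length
    return value[start_at:end]

truncated = truncate("abcdefghij", length=4, start_at=2)
# truncated == "cdef"
```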

## extract_value processor


Extracts values using regular expressions.

**Configuration**  
Configure the extract_value processor with the following parameters:

```
processor:
  - extract_value:
      entries:
        - source: "message"
          target: "extracted_data"
          from: "user=(?<user>\\w+)"
          to: "${user}"
          target_type: "string"
```

**Parameters**

`entries` (required)  
Array of extraction operations. Maximum 20 entries.

`entries[].source` (required)  
Field to extract from. Maximum 128 characters.

`entries[].target` (required)  
Target field for extracted value. Maximum 128 characters.

`entries[].from` (required)  
Regular expression pattern. Maximum 128 characters.

`entries[].to` (required)  
Replacement pattern with capture groups. Maximum 128 characters.

`entries[].target_type` (optional)  
Target data type ("integer", "double", "string", "boolean").

`when` (optional)  
Processor-level conditional expression. When specified, the entire processor is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when` (optional)  
Entry-level conditional expression. When specified, only this entry is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when_else` (optional)  
Fallback entry that executes only when none of the other `when` conditions in the same processor matched. The expression value identifies which `when` conditions to consider. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
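The interplay of `from`, `target`, and `target_type` can be sketched with a regex capture plus type conversion (illustrative only; the simplification here assumes the replacement pattern refers to the first capture group):

```python
import re

CASTERS = {"integer": int, "double": float, "string": str,
           "boolean": lambda v: v.lower() == "true"}

def extract_value(event, source, target, pattern, target_type="string"):
    # Pull a capture group out of `source` and store it, type-converted,
    # under `target`.
    m = re.search(pattern, event.get(source, ""))
    if m:
        event[target] = CASTERS[target_type](m.group(1))
    return event

event = extract_value({"message": "user=alice retries=3"},
                      "message", "retries", r"retries=(\d+)", "integer")
# event["retries"] == 3
```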

## convert_entry_type processor


Converts field values between different data types.

**Configuration**  
Configure the convert_entry_type processor with the following parameters:

```
processor:
  - convert_entry_type:
      key: "count"
      type: "integer"
```

**Parameters**

`key` (required)  
Single field name to convert.

`type` (required)  
Target data type. Options: "integer", "double", "string", "boolean".

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## date processor


Parses and formats date/time fields.

**Configuration**  
Configure the date processor with the following parameters:

```
processor:
  - date:
      match:
        - key: "timestamp"
          patterns: ["yyyy-MM-dd'T'HH:mm:ss.SSSSSS'Z'"]
      destination: "@timestamp"
      source_timezone: "UTC"
      destination_timezone: "America/New_York"
```

**Parameters**

`match` (required)  
Array of date matching configurations. Maximum 10 entries.

`match[].key` (required)  
Field containing the date string. Maximum 128 characters.

`match[].patterns` (required)  
Array of date format patterns to try. Maximum 5 patterns, each up to 256 characters.

`destination` (optional)  
Single target field for all parsed dates. Maximum 128 characters.

`source_timezone` (optional)  
Source timezone for parsing.

`destination_timezone` (optional)  
Target timezone for output.

`output_format` (optional)  
Output date format. Maximum 64 characters.

`destination_type` (optional)  
Output type - "timestampz", "long", or "string".

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
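The parse-then-shift behavior can be sketched with Python's `datetime` and `zoneinfo` (note the pattern below uses Python `strptime` syntax, not the Java-style patterns in the configuration; this is an illustration, not the service implementation):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def date_processor(event, key, pattern, source_tz="UTC", dest_tz="UTC",
                   destination="@timestamp"):
    # Parse the date string, attach the source timezone, then convert
    # to the destination timezone.
    parsed = datetime.strptime(event[key], pattern).replace(
        tzinfo=ZoneInfo(source_tz))
    event[destination] = parsed.astimezone(ZoneInfo(dest_tz)).isoformat()
    return event

event = date_processor({"timestamp": "2025-01-15T12:00:00"},
                       "timestamp", "%Y-%m-%dT%H:%M:%S",
                       source_tz="UTC", dest_tz="America/New_York")
# event["@timestamp"] == "2025-01-15T07:00:00-05:00"
```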

## dissect processor


Extracts structured data using pattern matching.

**Configuration**  
Configure the dissect processor with the following parameters:

```
processor:
  - dissect:
      map:
        message: "%{timestamp} %{level}"
```

**Parameters**

`map` (required)  
Field mapping with dissect patterns.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
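A hedged sketch of dissect-style matching: a pattern like `%{timestamp} %{level}` captures runs of text separated by the literal text between the `%{...}` markers (simplified here to non-space runs; illustrative only, not the service implementation):

```python
import re

def dissect(message, pattern):
    # Alternates literal text and field names: "", "timestamp", " ", "level", ""
    parts = re.split(r"%\{(\w+)\}", pattern)
    regex = "".join(re.escape(p) if i % 2 == 0 else f"(?P<{p}>\\S+)"
                    for i, p in enumerate(parts))
    m = re.match(regex, message)
    return m.groupdict() if m else {}

fields = dissect("2025-01-15T12:00:00Z ERROR", "%{timestamp} %{level}")
# fields == {"timestamp": "2025-01-15T12:00:00Z", "level": "ERROR"}
```

Unlike Grok, the pattern itself contains no regular expressions; only the literal separators drive the split.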

## list_to_map processor


Converts array fields to map structures.

**Configuration**  
Configure the list_to_map processor with the following parameters:

```
processor:
  - list_to_map:
      source: "tags"
      key: "name"
      value_key: "value"
      target: "tag_map"
```

**Parameters**

`source` (required)  
Field containing array data. Maximum 128 characters.

`key` (required)  
Field name to use as map key. Maximum 128 characters.

`value_key` (optional)  
Field name to use as map value. Maximum 128 characters.

`target` (optional)  
Target field for map structure. Maximum 128 characters.

`flatten` (optional)  
Whether to flatten the resulting map.

`flattened_element` (optional)  
Which element to use when flattening ("first" or "last").

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
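The list-to-map conversion can be sketched as keying each array element by its `key` field (illustrative only, not the service implementation):

```python
def list_to_map(event, source, key, value_key=None, target=None):
    # Turn [{"name": ..., "value": ...}, ...] into a map keyed by `key`.
    mapped = {item[key]: (item[value_key] if value_key else item)
              for item in event[source]}
    if target:
        event[target] = mapped
    else:
        event.update(mapped)
    return event

event = list_to_map(
    {"tags": [{"name": "env", "value": "prod"},
              {"name": "team", "value": "ops"}]},
    source="tags", key="name", value_key="value", target="tag_map")
# event["tag_map"] == {"env": "prod", "team": "ops"}
```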

## rename_keys processor


Renames fields in log events.

**Configuration**  
Configure the rename_keys processor with the following parameters:

```
processor:
  - rename_keys:
      entries:
        - from_key: "old_name"
          to_key: "new_name"
          overwrite_if_to_key_exists: true
```

**Parameters**

`entries` (required)  
Array of rename operations. Maximum 5 entries.

`entries[].from_key` (required)  
Current field name. Maximum 128 characters.

`entries[].to_key` (required)  
New field name. Maximum 128 characters.

`entries[].overwrite_if_to_key_exists` (optional)  
Whether to overwrite existing target field.

`when` (optional)  
Processor-level conditional expression. When specified, the entire processor is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when` (optional)  
Entry-level conditional expression. When specified, only this entry is skipped if the expression evaluates to false. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

`entries[].when_else` (optional)  
Fallback entry that executes only when none of the other `when` conditions in the same processor matched. The expression value identifies which `when` conditions to consider. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## select_entries processor


Selects only specified fields from events.

**Configuration**  
Configure the select_entries processor with the following parameters:

```
processor:
  - select_entries:
      include_keys: ["timestamp", "level", "message"]
```

**Parameters**

`include_keys` (required)  
Array of field names to keep. Maximum 50 keys, each up to 128 characters.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).

## translate processor


Translates field values using lookup tables.

**Configuration**  
Configure the translate processor with the following parameters:

```
processor:
  - translate:
      mappings:
        - source: "status_code"
          targets:
            - target: "status_text"
              map:
                "200": "OK"
                "404": "Not Found"
```

**Parameters**

`mappings` (required)  
Array of translation configurations. Maximum 10 mappings.

`mappings[].source` (required)  
Field to translate. Maximum 128 characters.

`mappings[].targets` (required)  
Array of target configurations. Maximum 10 targets.

`mappings[].targets[].target` (required)  
Target field name. Maximum 128 characters.

`mappings[].targets[].map` (required)  
Translation mapping. Maximum 100 entries, each value up to 512 characters.

`when` (optional)  
Conditional expression that determines whether this processor executes. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md).
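At its core, translation is a static table lookup, sketched here for illustration (not the service implementation):

```python
def translate(event, source, target, mapping):
    # Look the source value up in a static table and write the
    # translation to the target field; unmatched values are left alone.
    value = str(event.get(source))
    if value in mapping:
        event[target] = mapping[value]
    return event

event = translate({"status_code": "404"}, "status_code", "status_text",
                  {"200": "OK", "404": "Not Found"})
# event["status_text"] == "Not Found"
```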

# Filter processors


Filter processors let you selectively remove log entries from the pipeline based on conditions you define.

## drop_events processor


Filters out unwanted log entries based on conditional expressions. Use this processor to reduce noise from third-party pipeline connectors and lower storage costs by removing log events that match specified conditions.

**Configuration**  
Configure the drop_events processor with the following parameters:

```
processor:
  - drop_events:
      when: "log.level == 'DEBUG' or log.level == 'TRACE'"
```

**Parameters**

`when` (required)  
Conditional expression that determines which log entries to drop. Log entries matching this expression are removed from the pipeline. Maximum length is 256 characters. See [Expression syntax for conditional processing](conditional-processing.md) for expression syntax.

`handle_expression_failure` (optional)  
Behavior when the `when` expression evaluation fails. Allowed values: `"skip"` (default) keeps the event, or `"apply"` drops the event regardless of the failure.

**Example: Drop low-severity log entries**  
The following configuration drops all DEBUG and TRACE log entries, keeping only higher-severity events:  

```
processor:
  - drop_events:
      when: "log.level in {'DEBUG', 'TRACE'}"
      handle_expression_failure: "skip"
```
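The documented semantics of `when` and `handle_expression_failure` can be sketched as follows (illustrative only, not the service implementation):

```python
def drop_events(events, condition, handle_expression_failure="skip"):
    # Keep only events whose condition is False; if evaluating the
    # condition fails, "skip" keeps the event and "apply" drops it.
    kept = []
    for event in events:
        try:
            if not condition(event):
                kept.append(event)
        except Exception:
            if handle_expression_failure == "skip":
                kept.append(event)
    return kept

events = [{"level": "DEBUG"}, {"level": "ERROR"}, {}]
kept = drop_events(events, lambda e: e["level"] in {"DEBUG", "TRACE"})
# kept == [{"level": "ERROR"}, {}]  (the empty event errored; "skip" kept it)
```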

# Common processor use cases


Here are common scenarios and example configurations for combining processors:

**Example: Standardize log formats and add metadata**  
Parse JSON logs, standardize field names, and add environment information:  

```
processor:
  - parse_json: {}
  - rename_keys:
      entries:
        - from_key: "timestamp"
          to_key: "@timestamp"
        - from_key: "log_level"
          to_key: "level"
  - add_entries:
      entries:
        - key: "environment"
          value: "production"
        - key: "application"
          value: "payment-service"
```

**Example: Clean and normalize field values**  
Standardize status codes and remove sensitive data:  

```
processor:
  - uppercase_string:
      with_keys: ["status", "method"]
  - delete_entries:
      with_keys: ["credit_card", "password"]
  - substitute_string:
      entries:
        - source: "status"
          from: "SUCCESS"
          to: "OK"
```

**Example: Extract and transform specific fields**  
Extract user information and format for analysis:  

```
processor:
  - extract_value:
      entries:
        - source: "user_agent"
          target: "browser"
          from: "(?<browser>Chrome|Firefox|Safari)"
          to: "${browser}"
  - lowercase_string:
      with_keys: ["browser"]
  - move_keys:
      entries:
        - from_key: "browser"
          to_key: "user_data.browser"
```

**Example: Conditional processing with entry-level conditions**  
Add different metadata based on log severity using entry-level `when` conditions:  

```
processor:
  - add_entries:
      entries:
        - key: "alert_level"
          value: "critical"
          when: "log.level == 'ERROR'"
        - key: "alert_level"
          value: "info"
          when_else: "log.level == 'ERROR'"
```

**Example: Drop unwanted log entries**  
Filter out debug and trace log entries from a third-party source to reduce noise and storage costs:  

```
processor:
  - drop_events:
      when: "log.level in {'DEBUG', 'TRACE'}"
      handle_expression_failure: "skip"
```

**Example: Processor-level conditional with delete_entries**  
Remove sensitive fields only when the environment is production:  

```
processor:
  - delete_entries:
      with_keys: ["password", "api_key", "ssn"]
      when: "environment in {'prod', 'staging'}"
```

# Processor compatibility and restrictions


**General processor rules**  

Maximum count  
A pipeline can have at most 20 processors.

Parser placement  
Parser processors (OCSF, CSV, Grok, and so on), if used, must be the first processor in a pipeline.

Unique processors  
The following processors can appear only once per pipeline:  
+ `add_entries`
+ `copy_values`


| Processor Type | CloudWatch Logs Source | S3 Source | API-based Sources | 
| --- | --- | --- | --- | 
| OCSF | Must be first processor | Must be first processor | Must be first processor | 
| parse_vpc | Must be first processor | Not applicable | Not applicable | 
| parse_route53 | Must be first processor | Not applicable | Not applicable | 
| parse_json | Must be first processor | Must be first processor | Must be first processor | 
| grok | Must be first processor | Must be first processor | Must be first processor | 
| csv | Must be first processor | Not compatible | Not compatible | 
| key_value | Must be first processor | Must be first processor | Must be first processor | 
| add_entries | Compatible | Compatible | Compatible | 
| copy_values | Compatible | Compatible | Compatible | 
| String processors (lowercase, uppercase, trim) | Compatible | Compatible | Compatible | 
| Field processors (move_keys, rename_keys) | Compatible | Compatible | Compatible | 
| Data transformation (date, flatten) | Compatible | Compatible | Compatible | 

**Compatibility definitions**  

Must be first processor  
When used, must be the first processor in the pipeline configuration.

Compatible  
Can be used at any position in the pipeline.

Not compatible  
Cannot be used with this source type.

Not applicable  
Processor is not relevant for this source type.

## Processor-specific restrictions



**Processor restrictions by source type**  

| Processor | Source Type | Restrictions | 
| --- | --- | --- | 
| OCSF | CloudWatch Logs with CloudTrail |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/processor-compatibility.html)  | 
| OCSF | API-based Sources |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/processor-compatibility.html)  | 
| parse_vpc | CloudWatch Logs |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/processor-compatibility.html)  | 
| parse_route53 | CloudWatch Logs |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/processor-compatibility.html)  | 
| add_entries | All Sources |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/processor-compatibility.html)  | 
| copy_values | All Sources |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/processor-compatibility.html)  | 

**Important**  
When using processors with restrictions:  
+ Always validate your pipeline configuration using the `ValidateTelemetryPipelineConfiguration` API before deployment.
+ Test the pipeline with sample data using the `TestTelemetryPipeline` API to ensure proper processing.
+ Monitor pipeline metrics after deployment to ensure events are being processed as expected.