

# Logging data events
<a name="logging-data-events-with-cloudtrail"></a>

This section describes how to log data events using the [CloudTrail console](#logging-data-events-console) and [AWS CLI](#creating-data-event-selectors-with-the-AWS-CLI).

By default, trails and event data stores do not log data events. Additional charges apply for data events. For more information, see [AWS CloudTrail Pricing](https://aws.amazon.com/cloudtrail/pricing/).

Data events provide information about the resource operations performed on or in a resource. These are also known as *data plane operations*. Data events are often high-volume activities.

Example data events include:
+ [Amazon S3 object-level API activity](https://docs.aws.amazon.com/AmazonS3/latest/userguide/cloudtrail-logging-s3-info.html#cloudtrail-data-events) (for example, `GetObject`, `DeleteObject`, and `PutObject` API operations) on objects in S3 buckets.
+ AWS Lambda function execution activity (the `Invoke` API).
+ CloudTrail [PutAuditEvents](https://docs.aws.amazon.com/awscloudtraildata/latest/APIReference/API_PutAuditEvents.html) activity on a [CloudTrail Lake channel](query-event-data-store-integration.md) that is used to log events from outside AWS.
+ Amazon SNS [Publish](https://docs.aws.amazon.com/sns/latest/api/API_Publish.html) and [PublishBatch](https://docs.aws.amazon.com/sns/latest/api/API_PublishBatch.html) API operations on topics.

You can use advanced event selectors to create fine-grained selectors, which help you control costs by only logging the specific events of interest for your use cases. For example, you can use advanced event selectors to log specific API calls by adding a filter on the `eventName` field. For more information, see [Filtering data events by using advanced event selectors](filtering-data-events.md).
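As a sketch of how such a selector is applied, the following AWS CLI call attaches an advanced event selector to a trail that logs only `GetObject` calls on objects in a single bucket. The trail name and bucket are placeholder values; substitute your own.

```shell
# Placeholder trail name and bucket ARN; replace with your own values.
aws cloudtrail put-event-selectors \
  --trail-name my-trail \
  --advanced-event-selectors '[
    {
      "Name": "Log GetObject on one bucket",
      "FieldSelectors": [
        { "Field": "eventCategory", "Equals": ["Data"] },
        { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
        { "Field": "eventName", "Equals": ["GetObject"] },
        { "Field": "resources.ARN", "StartsWith": ["arn:aws:s3:::amzn-s3-demo-bucket/"] }
      ]
    }
  ]'
```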

**Note**  
The events that are logged by your trails are available in Amazon EventBridge. For example, if you choose to log data events for S3 objects but not management events, your trail processes and logs only data events for the specified S3 objects. The data events for these S3 objects are available in Amazon EventBridge. For more information, see [AWS service events delivered via CloudTrail](https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-service-event-cloudtrail.html) in the *Amazon EventBridge User Guide* and the [AWS Events Reference](https://docs.aws.amazon.com//eventbridge/latest/ref/welcome.html). 

**Contents**
+ [Data events](#logging-data-events)
  + [Data events supported by AWS CloudTrail](#w2aac21c31c19c11)
  + [Examples: Logging data events for Amazon S3 objects](#logging-data-events-examples)
  + [Logging data events for S3 objects in other AWS accounts](#logging-data-events-for-s3-resources-in-other-accounts)
+ [Read-only and write-only events](#read-write-events-data)
+ [Logging data events with the AWS Management Console](#logging-data-events-console)
+ [Logging data events with the AWS Command Line Interface](#creating-data-event-selectors-with-the-AWS-CLI)
  + [Logging data events for trails with the AWS CLI](#logging-data-events-CLI-trail-examples)
    + [Log data events for trails by using advanced event selectors](#creating-data-event-selectors-advanced)
    + [Log all Amazon S3 events for an Amazon S3 bucket by using advanced event selectors](#creating-data-adv-event-selectors-CLI-s3)
    + [Log Amazon S3 on AWS Outposts events by using advanced event selectors](#creating-data-event-selectors-CLI-outposts)
    + [Log events by using basic event selectors](#creating-data-event-selectors-basic)
  + [Logging data events for event data stores with the AWS CLI](#logging-data-events-CLI-eds-examples)
    + [Include all Amazon S3 events for a specific bucket](#creating-data-adv-event-selectors-CLI-s3-eds)
    + [Include Amazon S3 on AWS Outposts events](#creating-data-event-selectors-CLI-outposts-eds)
+ [Filtering data events by using advanced event selectors](filtering-data-events.md)
  + [How CloudTrail evaluates multiple conditions for a field](filtering-data-events.md#filtering-data-events-conditions)
    + [Example showing multiple conditions for the `resources.ARN` field](filtering-data-events.md#filtering-data-events-conditions-ex)
  + [AWS CLI examples for filtering data events](filtering-data-events.md#filtering-data-events-examples)
    + [Example 1: Filtering on the `eventName` field](filtering-data-events.md#filtering-data-events-eventname)
    + [Example 2: Filtering on the `resources.ARN` and `userIdentity.arn` fields](filtering-data-events.md#filtering-data-events-useridentityarn)
    + [Example 3: Filtering on the `resources.type` and `eventName` fields to exclude individual objects deleted by an Amazon S3 DeleteObjects event](filtering-data-events.md#filtering-data-events-deleteobjects)
+ [Aggregating data events](aggregating-data-events.md)
  + [Enabling aggregations for data events using the console](aggregating-data-events.md#aggregating-data-events-console)
  + [Enabling aggregations for data events using the AWS CLI](aggregating-data-events.md#aggregating-data-events-cli)
    + [Example: API_ACTIVITY aggregated event](aggregating-data-events.md#aggregating-data-events-api-activity-example)
    + [Example: RESOURCE_ACCESS aggregated event](aggregating-data-events.md#aggregating-data-events-resource-access-example)
+ [Logging data events for AWS Config compliance](#config-data-events-best-practices)
+ [Logging data events with the AWS SDKs](#logging-data-events-with-the-AWS-SDKs)

## Data events
<a name="logging-data-events"></a>

The following table shows the resource types available for trails and event data stores. The **Resource type (console)** column shows the appropriate selection in the console. The **resources.type value** column shows the `resources.type` value that you would specify to include data events of that type in your trail or event data store using the AWS CLI or CloudTrail APIs.

For trails, you can use basic or advanced event selectors to log data events for Amazon S3 objects in general purpose buckets, Lambda functions, and DynamoDB tables. You must use advanced event selectors to log all other resource types shown in the table.

For event data stores, you can use only advanced event selectors to include data events.
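For example, advanced event selectors are supplied at creation time for an event data store. The following sketch creates a store that includes only Lambda `Invoke` data events; the store name is a placeholder value.

```shell
# Placeholder event data store name; replace with your own value.
aws cloudtrail create-event-data-store \
  --name my-event-data-store \
  --advanced-event-selectors '[
    {
      "Name": "Log Lambda data events",
      "FieldSelectors": [
        { "Field": "eventCategory", "Equals": ["Data"] },
        { "Field": "resources.type", "Equals": ["AWS::Lambda::Function"] }
      ]
    }
  ]'
```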

### Data events supported by AWS CloudTrail
<a name="w2aac21c31c19c11"></a>


| AWS service | Description | Resource type (console) | resources.type value | 
| --- | --- | --- | --- | 
| Amazon RDS | [Amazon RDS API activity](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/logging-using-cloudtrail-data-api.html#logging-using-cloudtrail-data-api.including-excluding-cloudtrail-events) on a DB Cluster. | RDS Data API - DB Cluster | AWS::RDS::DBCluster | 
| Amazon S3 | [Amazon S3 object-level API activity](https://docs.aws.amazon.com/AmazonS3/latest/userguide/cloudtrail-logging-s3-info.html#cloudtrail-data-events) (for example, `GetObject`, `DeleteObject`, and `PutObject` API operations) on objects in general purpose buckets. | S3 | AWS::S3::Object | 
| Amazon S3 | [Amazon S3 API activity](https://docs.aws.amazon.com/AmazonS3/latest/userguide/cloudtrail-logging-s3-info.html#cloudtrail-data-events) on access points. | S3 Access Point | AWS::S3::AccessPoint | 
| Amazon S3 | [Amazon S3 object-level API activity](https://docs.aws.amazon.com/AmazonS3/latest/userguide/cloudtrail-logging-s3-info.html#cloudtrail-data-events) (for example, `GetObject`, `DeleteObject`, and `PutObject` API operations) on objects in directory buckets. | S3 Express | AWS::S3Express::Object | 
| Amazon S3 | [Amazon S3 Object Lambda access points API activity](https://docs.aws.amazon.com/AmazonS3/latest/userguide/cloudtrail-logging-s3-info.html#cloudtrail-data-events), such as calls to `CompleteMultipartUpload` and `GetObject`. | S3 Object Lambda | AWS::S3ObjectLambda::AccessPoint | 
| Amazon S3 | Amazon FSx API activity on volumes.  | FSx Volume | AWS::FSx::Volume | 
| Amazon S3 Tables | Amazon S3 API activity on [tables](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-tables-create.html). | S3 table | AWS::S3Tables::Table | 
| Amazon S3 Tables | Amazon S3 API activity on [table buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-tables-buckets.html). | S3 table bucket | AWS::S3Tables::TableBucket | 
| Amazon S3 Vectors | Amazon S3 API activity on [vector buckets](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-vectors-buckets.html). | S3 vector bucket | AWS::S3Vectors::VectorBucket | 
| Amazon S3 Vectors | Amazon S3 API activity on [vector indexes](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-vectors-indexes.html). | S3 vector index | AWS::S3Vectors::Index | 
| Amazon S3 on Outposts |  [Amazon S3 on Outposts object-level API activity](https://docs.aws.amazon.com/AmazonS3/latest/userguide/cloudtrail-logging-s3-info.html#cloudtrail-data-events). | S3 Outposts | AWS::S3Outposts::Object | 
| Amazon SNS | Amazon SNS [Publish](https://docs.aws.amazon.com/sns/latest/api/API_Publish.html) API operations on platform endpoints. | SNS platform endpoint | AWS::SNS::PlatformEndpoint | 
| Amazon SNS | Amazon SNS [Publish](https://docs.aws.amazon.com/sns/latest/api/API_Publish.html) and [PublishBatch](https://docs.aws.amazon.com/sns/latest/api/API_PublishBatch.html) API operations on topics. | SNS topic | AWS::SNS::Topic | 
| Amazon SQS | [Amazon SQS API activity](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-logging-using-cloudtrail.html#sqs-data-events-in-cloud-trail) on messages.  | SQS | AWS::SQS::Queue | 
| AWS Supply Chain | AWS Supply Chain API activity on an instance.  | Supply Chain | AWS::SCN::Instance | 
| Amazon SWF | [Amazon SWF API activity](https://docs.aws.amazon.com/amazonswf/latest/developerguide/ct-logging.html#cloudtrail-data-events) on [domains](https://docs.aws.amazon.com/amazonswf/latest/developerguide/swf-dev-domains.html).  | SWF domain | AWS::SWF::Domain | 
| AWS AppConfig | [AWS AppConfig API activity](https://docs.aws.amazon.com/appconfig/latest/userguide/logging-using-cloudtrail.html#appconfig-data-events-cloudtrail) for configuration operations such as calls to `StartConfigurationSession` and `GetLatestConfiguration`. | AWS AppConfig | AWS::AppConfig::Configuration | 
| AWS AppSync | [AWS AppSync API activity](https://docs.aws.amazon.com/appsync/latest/devguide/cloudtrail-logging.html#cloudtrail-data-events) on AppSync GraphQL APIs. | AppSync GraphQL | AWS::AppSync::GraphQLApi | 
| Amazon Aurora DSQL | Amazon Aurora DSQL API activity on cluster resources.  | Amazon Aurora DSQL | AWS::DSQL::Cluster | 
| AWS B2B Data Interchange | B2B Data Interchange API activity for Transformer operations such as calls to `GetTransformerJob` and `StartTransformerJob`. | B2B Data Interchange | AWS::B2BI::Transformer | 
| AWS Backup | AWS Backup Search Data API activity on search jobs. | AWS Backup Search Data APIs | AWS::Backup::SearchJob | 
| Amazon Bedrock | [Amazon Bedrock API activity](https://docs.aws.amazon.com/bedrock/latest/userguide/logging-using-cloudtrail.html#service-name-data-events-cloudtrail) on an agent alias. | Bedrock agent alias | AWS::Bedrock::AgentAlias | 
| Amazon Bedrock | Amazon Bedrock API activity on async invocations. | Bedrock async invoke | AWS::Bedrock::AsyncInvoke | 
| Amazon Bedrock | Amazon Bedrock API activity on a flow alias. | Bedrock flow alias | AWS::Bedrock::FlowAlias | 
| Amazon Bedrock | Amazon Bedrock API activity on guardrails. | Bedrock guardrail | AWS::Bedrock::Guardrail | 
| Amazon Bedrock | Amazon Bedrock API activity on inline agents. | Bedrock Invoke Inline-Agent | AWS::Bedrock::InlineAgent | 
| Amazon Bedrock | [Amazon Bedrock API activity](https://docs.aws.amazon.com/bedrock/latest/userguide/logging-using-cloudtrail.html#service-name-data-events-cloudtrail) on a knowledge base. | Bedrock knowledge base | AWS::Bedrock::KnowledgeBase | 
| Amazon Bedrock | Amazon Bedrock API activity on models. | Bedrock model | AWS::Bedrock::Model | 
| Amazon Bedrock | Amazon Bedrock API activity on prompts. | Bedrock prompt | AWS::Bedrock::PromptVersion | 
| Amazon Bedrock | Amazon Bedrock API activity on sessions. | Bedrock session | AWS::Bedrock::Session | 
| Amazon Bedrock | Amazon Bedrock API activity on flow executions.  | Bedrock flow execution | AWS::Bedrock::FlowExecution | 
| Amazon Bedrock | Amazon Bedrock API activity on an automated reasoning policy.  | Bedrock automated reasoning policy | AWS::Bedrock::AutomatedReasoningPolicy | 
| Amazon Bedrock | Amazon Bedrock API activity on an automated reasoning policy version.  | Bedrock automated reasoning policy version | AWS::Bedrock::AutomatedReasoningPolicyVersion | 
| Amazon Bedrock | Amazon Bedrock data automation project API activity. | Bedrock Data Automation project | AWS::Bedrock::DataAutomationProject | 
| Amazon Bedrock | Amazon Bedrock data automation invocation API activity. | Bedrock Data Automation invocation | AWS::Bedrock::DataAutomationInvocation | 
| Amazon Bedrock | Amazon Bedrock data automation profile API activity. | Bedrock Data Automation profile | AWS::Bedrock::DataAutomationProfile | 
| Amazon Bedrock | Amazon Bedrock blueprint API activity. | Bedrock blueprint | AWS::Bedrock::Blueprint | 
| Amazon Bedrock | Amazon Bedrock Code-Interpreter API activity. | Bedrock-AgentCore Code-Interpreter | AWS::BedrockAgentCore::CodeInterpreter | 
| Amazon Bedrock | Amazon Bedrock Browser API activity. | Bedrock-AgentCore Browser | AWS::BedrockAgentCore::Browser | 
| Amazon Bedrock | Amazon Bedrock Workload Identity API activity. | Bedrock-AgentCore Workload Identity | AWS::BedrockAgentCore::WorkloadIdentity | 
| Amazon Bedrock | Amazon Bedrock Workload Identity Directory API activity. | Bedrock-AgentCore Workload Identity Directory | AWS::BedrockAgentCore::WorkloadIdentityDirectory | 
| Amazon Bedrock | Amazon Bedrock Token Vault API activity. | Bedrock-AgentCore Token Vault | AWS::BedrockAgentCore::TokenVault | 
| Amazon Bedrock | Amazon Bedrock APIKey CredentialProvider API activity. | Bedrock-AgentCore APIKey CredentialProvider | AWS::BedrockAgentCore::APIKeyCredentialProvider | 
| Amazon Bedrock | Amazon Bedrock Runtime API activity. | Bedrock-AgentCore Runtime | AWS::BedrockAgentCore::Runtime | 
| Amazon Bedrock | Amazon Bedrock Runtime-Endpoint API activity. | Bedrock-AgentCore Runtime-Endpoint | AWS::BedrockAgentCore::RuntimeEndpoint | 
| Amazon Bedrock | Amazon Bedrock Gateway API activity. | Bedrock-AgentCore Gateway | AWS::BedrockAgentCore::Gateway | 
| Amazon Bedrock | Amazon Bedrock Memory API activity. | Bedrock-AgentCore Memory | AWS::BedrockAgentCore::Memory | 
| Amazon Bedrock | Amazon Bedrock OAuth2 CredentialProvider API activity. | Bedrock-AgentCore OAuth2 CredentialProvider | AWS::BedrockAgentCore::OAuth2CredentialProvider | 
| Amazon Bedrock | Amazon Bedrock Browser-Custom API activity. | Bedrock-AgentCore Browser-Custom | AWS::BedrockAgentCore::BrowserCustom | 
| Amazon Bedrock | Amazon Bedrock Code-Interpreter-Custom API activity. | Bedrock-AgentCore Code-Interpreter-Custom | AWS::BedrockAgentCore::CodeInterpreterCustom | 
| Amazon Bedrock | Amazon Bedrock Tool API activity. | Bedrock Tool | AWS::Bedrock::Tool | 
| AWS Cloud Map | [AWS Cloud Map API activity](https://docs.aws.amazon.com/cloud-map/latest/dg/cloudtrail-data-events.html) on a [namespace](https://docs.aws.amazon.com/cloud-map/latest/api/API_Namespace.html). | AWS Cloud Map namespace | AWS::ServiceDiscovery::Namespace | 
| AWS Cloud Map | [AWS Cloud Map API activity](https://docs.aws.amazon.com/cloud-map/latest/dg/cloudtrail-data-events.html) on a [service](https://docs.aws.amazon.com/cloud-map/latest/api/API_Service.html). | AWS Cloud Map service | AWS::ServiceDiscovery::Service | 
| Amazon CloudFront | CloudFront API activity on a [KeyValueStore](https://docs.aws.amazon.com/cloudfront/latest/APIReference/API_KeyValueStore.html). | CloudFront KeyValueStore | AWS::CloudFront::KeyValueStore | 
| AWS CloudTrail | CloudTrail [PutAuditEvents](https://docs.aws.amazon.com/awscloudtraildata/latest/APIReference/API_PutAuditEvents.html) activity on a [CloudTrail Lake channel](query-event-data-store-integration.md) that is used to log events from outside AWS. | CloudTrail channel | AWS::CloudTrail::Channel | 
| Amazon CloudWatch | [Amazon CloudWatch API activity](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/logging_cw_api_calls.html#CloudWatch-data-plane-events) on metrics. | CloudWatch metric | AWS::CloudWatch::Metric | 
| Amazon CloudWatch Network Flow Monitor | Amazon CloudWatch Network Flow Monitor API activity on monitors. | Network Flow Monitor monitor | AWS::NetworkFlowMonitor::Monitor | 
| Amazon CloudWatch Network Flow Monitor | Amazon CloudWatch Network Flow Monitor API activity on scopes. | Network Flow Monitor scope | AWS::NetworkFlowMonitor::Scope | 
| Amazon CloudWatch RUM | Amazon CloudWatch RUM API activity on app monitors. | RUM app monitor | AWS::RUM::AppMonitor | 
| Amazon CodeGuru Profiler | CodeGuru Profiler API activity on profiling groups. | CodeGuru Profiler profiling group | AWS::CodeGuruProfiler::ProfilingGroup | 
| Amazon CodeWhisperer | Amazon CodeWhisperer API activity on a customization. | CodeWhisperer customization | AWS::CodeWhisperer::Customization | 
| Amazon CodeWhisperer | Amazon CodeWhisperer API activity on a profile. | CodeWhisperer | AWS::CodeWhisperer::Profile | 
| Amazon Cognito | Amazon Cognito API activity on Amazon Cognito [identity pools](https://docs.aws.amazon.com/cognito/latest/developerguide/amazon-cognito-info-in-cloudtrail.html#identity-pools-cloudtrail-events). | Cognito Identity Pools | AWS::Cognito::IdentityPool | 
| AWS Data Exchange | AWS Data Exchange API activity on assets. | Data Exchange asset | AWS::DataExchange::Asset | 
| Amazon Data Firehose | Amazon Data Firehose delivery stream API activity. | Amazon Data Firehose | AWS::KinesisFirehose::DeliveryStream | 
| AWS Deadline Cloud | [Deadline Cloud](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/logging-using-cloudtrail.html#cloudtrail-data-events) API activity on fleets. | Deadline Cloud fleet | AWS::Deadline::Fleet | 
| AWS Deadline Cloud | [Deadline Cloud](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/logging-using-cloudtrail.html#cloudtrail-data-events) API activity on jobs. | Deadline Cloud job | AWS::Deadline::Job | 
| AWS Deadline Cloud | [Deadline Cloud](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/logging-using-cloudtrail.html#cloudtrail-data-events) API activity on queues. | Deadline Cloud queue | AWS::Deadline::Queue | 
| AWS Deadline Cloud | [Deadline Cloud](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/logging-using-cloudtrail.html#cloudtrail-data-events) API activity on workers. | Deadline Cloud worker | AWS::Deadline::Worker | 
| Amazon DynamoDB | [Amazon DynamoDB item-level API activity](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/logging-using-cloudtrail.html#ddb-data-plane-events-in-cloudtrail) on tables (for example, `PutItem`, `DeleteItem`, and `UpdateItem` API operations). For tables with streams enabled, the `resources` field in the data event contains both `AWS::DynamoDB::Stream` and `AWS::DynamoDB::Table`. If you specify `AWS::DynamoDB::Table` for `resources.type`, CloudTrail logs both DynamoDB table and DynamoDB Streams events by default. To exclude [streams events](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/logging-using-cloudtrail.html#ddb-data-plane-events-in-cloudtrail), add a filter on the `eventName` field. | DynamoDB | AWS::DynamoDB::Table | 
| Amazon DynamoDB | [Amazon DynamoDB](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/logging-using-cloudtrail.html#ddb-data-plane-events-in-cloudtrail) API activity on streams. | DynamoDB Streams | AWS::DynamoDB::Stream | 
| Amazon Elastic Block Store | [Amazon Elastic Block Store (EBS)](https://docs.aws.amazon.com/ebs/latest/userguide/logging-ebs-apis-using-cloudtrail.html) direct APIs, such as `PutSnapshotBlock`, `GetSnapshotBlock`, and `ListChangedBlocks` on Amazon EBS snapshots. | Amazon EBS direct APIs | AWS::EC2::Snapshot | 
| Amazon Elastic Compute Cloud | Amazon EC2 instance connect endpoint API activity. | EC2 instance connect endpoint | AWS::EC2::InstanceConnectEndpoint | 
| Amazon Elastic Container Service | Amazon Elastic Container Service API activity on a container instance. | ECS container instance | AWS::ECS::ContainerInstance | 
| Amazon Elastic Kubernetes Service | Amazon Elastic Kubernetes Service API activity on dashboards.  | Amazon Elastic Kubernetes Service dashboard | AWS::EKS::Dashboard | 
| Amazon EMR | [Amazon EMR API activity](https://docs.aws.amazon.com/emr/latest/ManagementGuide/logging-using-cloudtrail.html#cloudtrail-data-events) on a write-ahead log workspace. | EMR write-ahead log workspace | AWS::EMRWAL::Workspace | 
| AWS End User Messaging SMS | [AWS End User Messaging SMS](https://docs.aws.amazon.com/sms-voice/latest/userguide/logging-using-cloudtrail.html#cloudtrail-data-events) API activity on origination identities. | SMS Voice origination identity | AWS::SMSVoice::OriginationIdentity | 
| AWS End User Messaging SMS | [AWS End User Messaging SMS](https://docs.aws.amazon.com/sms-voice/latest/userguide/logging-using-cloudtrail.html#cloudtrail-data-events) API activity on messages. | SMS Voice message | AWS::SMSVoice::Message | 
| AWS End User Messaging Social | [AWS End User Messaging Social](https://docs.aws.amazon.com/social-messaging/latest/userguide/logging-using-cloudtrail.html#cloudtrail-data-events) API activity on phone number IDs. | Social-Messaging Phone Number Id | AWS::SocialMessaging::PhoneNumberId | 
| AWS End User Messaging Social | AWS End User Messaging Social API activity on Waba IDs. | Social-Messaging Waba ID | AWS::SocialMessaging::WabaId | 
| Amazon FinSpace | [Amazon FinSpace](https://docs.aws.amazon.com/finspace/latest/userguide/logging-cloudtrail-events.html#finspace-dataplane-events) API activity on environments. | FinSpace | AWS::FinSpace::Environment | 
| Amazon GameLift Streams | Amazon GameLift Streams [streaming API activity](https://docs.aws.amazon.com/gameliftstreams/latest/developerguide/logging-using-cloudtrail.html#cloudtrail-data-events) on applications. | GameLift Streams application | AWS::GameLiftStreams::Application | 
| Amazon GameLift Streams | Amazon GameLift Streams [streaming API activity](https://docs.aws.amazon.com/gameliftstreams/latest/developerguide/logging-using-cloudtrail.html#cloudtrail-data-events) on stream groups. | GameLift Streams stream group | AWS::GameLiftStreams::StreamGroup | 
| AWS Glue | AWS Glue API activity on tables that were created by Lake Formation. | Lake Formation | AWS::Glue::Table | 
| Amazon GuardDuty | Amazon GuardDuty API activity for a [detector](https://docs.aws.amazon.com/guardduty/latest/ug/logging-using-cloudtrail.html#guardduty-data-events-in-cloudtrail). | GuardDuty detector | AWS::GuardDuty::Detector | 
| AWS HealthImaging | AWS HealthImaging API activity on data stores. | MedicalImaging data store | AWS::MedicalImaging::Datastore | 
| AWS HealthImaging | AWS HealthImaging image set API activity. | MedicalImaging image set | AWS::MedicalImaging::Imageset | 
| AWS IoT | [AWS IoT API activity](https://docs.aws.amazon.com/greengrass/v2/developerguide/logging-using-cloudtrail.html#greengrass-data-events-cloudtrail) on [certificates](https://docs.aws.amazon.com/iot/latest/developerguide/x509-client-certs.html). | IoT certificate | AWS::IoT::Certificate | 
| AWS IoT | [AWS IoT API activity](https://docs.aws.amazon.com/greengrass/v2/developerguide/logging-using-cloudtrail.html#greengrass-data-events-cloudtrail) on [things](https://docs.aws.amazon.com/iot/latest/developerguide/thing-registry.html). | IoT thing | AWS::IoT::Thing | 
| AWS IoT Greengrass Version 2 | [Greengrass API activity](https://docs.aws.amazon.com/greengrass/v2/developerguide/logging-using-cloudtrail.html#greengrass-data-events-cloudtrail) from a Greengrass core device on a component version. Greengrass doesn't log access denied events. | IoT Greengrass component version | AWS::GreengrassV2::ComponentVersion | 
| AWS IoT Greengrass Version 2 | [Greengrass API activity](https://docs.aws.amazon.com/greengrass/v2/developerguide/logging-using-cloudtrail.html#greengrass-data-events-cloudtrail) from a Greengrass core device on a deployment. Greengrass doesn't log access denied events. | IoT Greengrass deployment | AWS::GreengrassV2::Deployment | 
| AWS IoT SiteWise | [IoT SiteWise API activity](https://docs.aws.amazon.com/iot-sitewise/latest/userguide/logging-using-cloudtrail.html#service-name-data-events-cloudtrail) on [assets](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_CreateAsset.html). | IoT SiteWise asset | AWS::IoTSiteWise::Asset | 
| AWS IoT SiteWise | [IoT SiteWise API activity](https://docs.aws.amazon.com/iot-sitewise/latest/userguide/logging-using-cloudtrail.html#service-name-data-events-cloudtrail) on [time series](https://docs.aws.amazon.com/iot-sitewise/latest/APIReference/API_DescribeTimeSeries.html). | IoT SiteWise time series | AWS::IoTSiteWise::TimeSeries | 
| AWS IoT SiteWise Assistant | Sitewise Assistant API activity on conversations. | Sitewise Assistant conversation | AWS::SitewiseAssistant::Conversation | 
| AWS IoT TwinMaker | IoT TwinMaker API activity on an [entity](https://docs.aws.amazon.com/iot-twinmaker/latest/apireference/API_CreateEntity.html). | IoT TwinMaker entity | AWS::IoTTwinMaker::Entity | 
| AWS IoT TwinMaker | IoT TwinMaker API activity on a [workspace](https://docs.aws.amazon.com/iot-twinmaker/latest/apireference/API_CreateWorkspace.html). | IoT TwinMaker workspace | AWS::IoTTwinMaker::Workspace | 
| Amazon Kendra Intelligent Ranking | Amazon Kendra Intelligent Ranking API activity on [rescore execution plans](https://docs.aws.amazon.com/kendra/latest/dg/cloudtrail-intelligent-ranking.html#cloud-trail-intelligent-ranking-log-entry). | Kendra Ranking | AWS::KendraRanking::ExecutionPlan | 
| Amazon Keyspaces (for Apache Cassandra) | [Amazon Keyspaces API activity](https://docs.aws.amazon.com/keyspaces/latest/devguide/logging-using-cloudtrail.html#keyspaces-in-cloudtrail-dml) on a table. | Cassandra table | AWS::Cassandra::Table | 
| Amazon Keyspaces (for Apache Cassandra) | Amazon Keyspaces (for Apache Cassandra) API activity on Cassandra CDC streams.  | Cassandra CDC streams | AWS::Cassandra::Stream | 
| Amazon Kinesis Data Streams | Kinesis Data Streams API activity on [streams](https://docs.aws.amazon.com/streams/latest/dev/working-with-streams.html). | Kinesis stream | AWS::Kinesis::Stream | 
| Amazon Kinesis Data Streams | Kinesis Data Streams API activity on [stream consumers](https://docs.aws.amazon.com/streams/latest/dev/building-consumers.html). | Kinesis stream consumer | AWS::Kinesis::StreamConsumer | 
| Amazon Kinesis Video Streams | Kinesis Video Streams API activity on video streams, such as calls to `GetMedia` and `PutMedia`. | Kinesis video stream | AWS::KinesisVideo::Stream | 
| Amazon Kinesis Video Streams | Kinesis Video Streams video signaling channel API activity. | Kinesis video signaling channel | AWS::KinesisVideo::SignalingChannel | 
| AWS Lambda | AWS Lambda function execution activity (the `Invoke` API). | Lambda | AWS::Lambda::Function | 
| Amazon Location Maps | Amazon Location Maps API activity. | Geo Maps | AWS::GeoMaps::Provider | 
| Amazon Location Places | Amazon Location Places API activity. | Geo Places | AWS::GeoPlaces::Provider | 
| Amazon Location Routes | Amazon Location Routes API activity. | Geo Routes | AWS::GeoRoutes::Provider | 
| Amazon Machine Learning | Machine Learning API activity on ML models. | Machine Learning MlModel | AWS::MachineLearning::MlModel | 
| Amazon Managed Blockchain | Amazon Managed Blockchain API activity on a network. | Managed Blockchain network | AWS::ManagedBlockchain::Network | 
| Amazon Managed Blockchain | [Amazon Managed Blockchain](https://docs.aws.amazon.com/managed-blockchain/latest/ethereum-dev/logging-using-cloudtrail.html#ethereum-jsonrpc-logging) JSON-RPC calls on Ethereum nodes, such as `eth_getBalance` or `eth_getBlockByNumber`. | Managed Blockchain | AWS::ManagedBlockchain::Node | 
| Amazon Managed Blockchain Query | Amazon Managed Blockchain Query API activity. | Managed Blockchain Query | AWS::ManagedBlockchainQuery::QueryAPI | 
| Amazon Managed Workflows for Apache Airflow | Amazon MWAA API activity on environments.  | Managed Apache Airflow | AWS::MWAA::Environment | 
| Amazon Neptune Graph | Data API activities, for example queries, algorithms, or vector search, on a Neptune Graph. | Neptune Graph | AWS::NeptuneGraph::Graph | 
| Amazon One Enterprise | Amazon One Enterprise API activity on a UKey. | Amazon One UKey | AWS::One::UKey | 
| Amazon One Enterprise | Amazon One Enterprise API activity on users. | Amazon One User | AWS::One::User | 
| AWS Payment Cryptography | AWS Payment Cryptography API activity on aliases. | Payment Cryptography Alias | AWS::PaymentCryptography::Alias | 
| AWS Payment Cryptography | AWS Payment Cryptography API activity on keys. | Payment Cryptography Key | AWS::PaymentCryptography::Key | 
| Amazon Pinpoint | Amazon Pinpoint API activity on mobile targeting applications. | Mobile Targeting Application | AWS::Pinpoint::App | 
| AWS Private CA | AWS Private CA Connector for Active Directory API activity. | AWS Private CA Connector for Active Directory | AWS::PCAConnectorAD::Connector | 
| AWS Private CA | AWS Private CA Connector for SCEP API activity. | AWS Private CA Connector for SCEP | AWS::PCAConnectorSCEP::Connector | 
| Amazon Q Apps | Data API activity on [Amazon Q Apps](https://docs.aws.amazon.com/amazonq/latest/qbusiness-ug/purpose-built-qapps.html). | Amazon Q Apps | AWS::QApps::QApp | 
| Amazon Q Apps | Data API activity on Amazon Q App sessions. | Amazon Q App Session | AWS::QApps::QAppSession | 
| Amazon Q Business | [Amazon Q Business API activity](https://docs.aws.amazon.com/amazonq/latest/business-use-dg/logging-using-cloudtrail.html#service-name-data-plane-events-cloudtrail) on an application. | Amazon Q Business application | AWS::QBusiness::Application | 
| Amazon Q Business | [Amazon Q Business API activity](https://docs.aws.amazon.com/amazonq/latest/business-use-dg/logging-using-cloudtrail.html#service-name-data-plane-events-cloudtrail) on a data source. | Amazon Q Business data source | AWS::QBusiness::DataSource | 
| Amazon Q Business | [Amazon Q Business API activity](https://docs.aws.amazon.com/amazonq/latest/business-use-dg/logging-using-cloudtrail.html#service-name-data-plane-events-cloudtrail) on an index. | Amazon Q Business index | AWS::QBusiness::Index | 
| Amazon Q Business | [Amazon Q Business API activity](https://docs.aws.amazon.com/amazonq/latest/business-use-dg/logging-using-cloudtrail.html#service-name-data-plane-events-cloudtrail) on a web experience. | Amazon Q Business web experience | AWS::QBusiness::WebExperience | 
| Amazon Q Business | Amazon Q Business integration API activity. | Amazon Q Business integration | AWS::QBusiness::Integration | 
| Amazon Q Developer | Amazon Q Developer API activity on an integration. | Q Developer integration | AWS::QDeveloper::Integration | 
| Amazon Q Developer | [Amazon Q Developer API activity](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/logging_cw_api_calls.html#Q-Developer-Investigations-Cloudtrail) on operational investigations. | AIOps Investigation Group | AWS::AIOps::InvestigationGroup | 
| Amazon Quick | Amazon Quick API activity on an action connector. | AWSQuickSuite Actions | AWS::Quicksight::ActionConnector | 
| Amazon Quick | Amazon Quick Flow API activity. | QuickSight flow | AWS::QuickSight::Flow | 
| Amazon Quick | Amazon Quick FlowSession API activity. | QuickSight flow session | AWS::QuickSight::FlowSession | 
| Amazon SageMaker AI |  Amazon SageMaker AI [https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_runtime_InvokeEndpointWithResponseStream.html](https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_runtime_InvokeEndpointWithResponseStream.html) activity on endpoints. | SageMaker AI endpoint | AWS::SageMaker::Endpoint | 
| Amazon SageMaker AI | Amazon SageMaker AI API activity on feature stores. | SageMaker AI feature store | AWS::SageMaker::FeatureGroup | 
| Amazon SageMaker AI | Amazon SageMaker AI API activity on [experiment trial components](https://docs.aws.amazon.com/sagemaker/latest/dg/experiments-monitoring.html). | SageMaker AI metrics experiment trial component | AWS::SageMaker::ExperimentTrialComponent | 
| Amazon SageMaker AI | Amazon SageMaker AI MLflow API activity. | SageMaker MLflow | AWS::SageMaker::MlflowTrackingServer | 
| AWS Signer | Signer API activity on signing jobs. | Signer signing job | AWS::Signer::SigningJob | 
| AWS Signer | Signer API activity on signing profiles. | Signer signing profile | AWS::Signer::SigningProfile | 
| Amazon Simple Email Service | Amazon Simple Email Service (Amazon SES) API activity on configuration sets. | SES configuration set | AWS::SES::ConfigurationSet | 
| Amazon Simple Email Service | Amazon Simple Email Service (Amazon SES) API activity on email identities. | SES identity | AWS::SES::EmailIdentity | 
| Amazon Simple Email Service | Amazon Simple Email Service (Amazon SES) API activity on templates. | SES template | AWS::SES::Template | 
| Amazon SimpleDB | Amazon SimpleDB API activity on domains. | SimpleDB domain | AWS::SDB::Domain | 
| AWS Step Functions | [Step Functions API activity](https://docs.aws.amazon.com/step-functions/latest/dg/procedure-cloud-trail.html#cloudtrail-data-events) on activities.  | Step Functions | AWS::StepFunctions::Activity | 
| AWS Step Functions | [Step Functions API activity](https://docs.aws.amazon.com/step-functions/latest/dg/procedure-cloud-trail.html#cloudtrail-data-events) on state machines.  | Step Functions state machine | AWS::StepFunctions::StateMachine | 
| AWS Systems Manager | [Systems Manager API activity](https://docs.aws.amazon.com/systems-manager/latest/userguide/monitoring-cloudtrail-logs.html#cloudtrail-data-events) on control channels. | Systems Manager | AWS::SSMMessages::ControlChannel | 
| AWS Systems Manager | Systems Manager API activity on impact assessments. | SSM Impact Assessment  | AWS::SSM::ExecutionPreview | 
| AWS Systems Manager | [Systems Manager API activity](https://docs.aws.amazon.com/systems-manager/latest/userguide/monitoring-cloudtrail-logs.html#cloudtrail-data-events) on managed nodes. | Systems Manager managed node | AWS::SSM::ManagedNode | 
| Amazon Timestream | Amazon Timestream [https://docs.aws.amazon.com/timestream/latest/developerguide/API_query_Query.html](https://docs.aws.amazon.com/timestream/latest/developerguide/API_query_Query.html) API activity on databases. | Timestream database | AWS::Timestream::Database | 
| Amazon Timestream | Amazon Timestream API activity on regional endpoints. | Timestream regional endpoint | AWS::Timestream::RegionalEndpoint | 
| Amazon Timestream | Amazon Timestream [https://docs.aws.amazon.com/timestream/latest/developerguide/API_query_Query.html](https://docs.aws.amazon.com/timestream/latest/developerguide/API_query_Query.html) API activity on tables. | Timestream table | AWS::Timestream::Table | 
| Amazon Verified Permissions | Amazon Verified Permissions API activity on a policy store. | Amazon Verified Permissions | AWS::VerifiedPermissions::PolicyStore | 
| Amazon WorkSpaces Thin Client | WorkSpaces Thin Client API activity on a Device. | Thin Client Device | AWS::ThinClient::Device | 
| Amazon WorkSpaces Thin Client | WorkSpaces Thin Client API activity on an Environment. | Thin Client Environment | AWS::ThinClient::Environment | 
| AWS X-Ray | [X-Ray API activity](https://docs.aws.amazon.com/xray/latest/devguide/xray-api-cloudtrail.html#cloudtrail-data-events) on [traces](https://docs.aws.amazon.com/xray/latest/devguide/xray-concepts.html#xray-concepts-traces). | X-Ray trace | AWS::XRay::Trace | 
| Amazon AIDevOps | AIDevOps API activity on agent spaces. | Agent Space | AWS::AIDevOps::AgentSpace | 
| Amazon AIDevOps | AIDevOps API activity on associations. | AIDevOps association | AWS::AIDevOps::Association | 
| Amazon AIDevOps | AIDevOps API activity on operator app teams. | AIDevOps operator app team | AWS::AIDevOps::OperatorAppTeam | 
| Amazon AIDevOps | AIDevOps API activity on pipeline metadata. | AIDevOps Pipelines Metadata | AWS::AIDevOps::PipelineMetadata | 
| Amazon AIDevOps | AIDevOps API activity on services. | AIDevOps service | AWS::AIDevOps::Service | 
| Amazon Bedrock | Bedrock API activity on advanced optimize prompt jobs. | AdvancedOptimizePromptJob | AWS::Bedrock::AdvancedOptimizePromptJob | 
| Amazon Bedrock AgentCore | Bedrock AgentCore API activity on evaluators. | Bedrock-AgentCore Evaluator | AWS::BedrockAgentCore::Evaluator | 
| Amazon Cost Optimization | CloudOptimization API activity on profiles. | CloudOptimization Profile | AWS::CloudOptimization::Profile | 
| Amazon Cost Optimization | CloudOptimization API activity on recommendations. | CloudOptimization Recommendation | AWS::CloudOptimization::Recommendation | 
| Amazon GuardDuty | GuardDuty API activity on malware scans. | GuardDuty malware scan | AWS::GuardDuty::MalwareScan | 
| Amazon NovaAct | Amazon NovaAct API activity on workflow definitions. | Workflow definition | AWS::NovaAct::WorkflowDefinition | 
| Amazon NovaAct | Amazon NovaAct API activity on workflow runs. | Workflow run | AWS::NovaAct::WorkflowRun | 
| Amazon Redshift | Redshift API activity on clusters. | Amazon Redshift Cluster | AWS::Redshift::Cluster | 
| Amazon Support | SupportAccess API activity on tenants. | SupportAccess tenant | AWS::SupportAccess::Tenant | 
| Amazon Support | SupportAccess API activity on trusting accounts. | SupportAccess trusting account | AWS::SupportAccess::TrustingAccount | 
| Amazon Support | SupportAccess API activity on trusting roles. | SupportAccess trusting role | AWS::SupportAccess::TrustingRole | 
| Amazon Transform | Transform API activity on agent instances. | Transform agent instance | AWS::Transform::AgentInstance | 
| Amazon Transform Custom | Transform Custom API activity on campaigns. | Transform-Custom campaign | AWS::TransformCustom::Campaign | 
| Amazon Transform Custom | Transform Custom API activity on conversations. | Transform-Custom conversation | AWS::TransformCustom::Conversation | 
| Amazon Transform Custom | Transform Custom API activity on knowledge items. | Transform-Custom knowledge item | AWS::TransformCustom::KnowledgeItem | 
| Amazon Transform Custom | Transform Custom API activity on packages. | Transform-Custom package | AWS::TransformCustom::Package | 

To record CloudTrail data events, you must explicitly add each resource type for which you want to collect activity. For more information, see [Creating a trail with the CloudTrail console](cloudtrail-create-a-trail-using-the-console-first-time.md) and [Create an event data store for CloudTrail events with the console](query-event-data-store-cloudtrail.md).

On a single-Region trail or event data store, you can log data events only for resources that you can access in that Region. Though S3 buckets are global, AWS Lambda functions and DynamoDB tables are regional.

Additional charges apply for logging data events. For CloudTrail pricing, see [AWS CloudTrail Pricing](https://aws.amazon.com/cloudtrail/pricing/).

### Examples: Logging data events for Amazon S3 objects
<a name="logging-data-events-examples"></a>

**Logging data events for all S3 objects in an S3 bucket**

The following example demonstrates how logging works when you configure logging of all data events for an S3 bucket named `amzn-s3-demo-bucket`. In this example, the CloudTrail user specified an empty prefix, and the option to log both **Read** and **Write** data events.

1. A user uploads an object to `amzn-s3-demo-bucket`. 

1. The `PutObject` API operation is an Amazon S3 object-level API. It is recorded as a data event in CloudTrail. Because the CloudTrail user specified an S3 bucket with an empty prefix, events that occur on any object in that bucket are logged. The trail or event data store processes and logs the event.

1. Another user uploads an object to `amzn-s3-demo-bucket2`. 

1. The `PutObject` API operation occurred on an object in an S3 bucket that wasn't specified for the trail or event data store. The trail or event data store doesn't log the event. 
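The matching behavior in the steps above can be sketched as a simple selector check. This is an illustrative simplification, not CloudTrail's actual implementation: an empty prefix means every object in the specified bucket matches, and objects in any other bucket do not.

```python
# Illustrative sketch only (not CloudTrail's implementation): a trail
# configured with a bucket ARN and an empty prefix matches every object
# in that bucket, and nothing in other buckets.

def is_logged(event_object_arn: str, selector_bucket_arn: str, prefix: str = "") -> bool:
    """Return True if a data event on event_object_arn matches the selector."""
    return event_object_arn.startswith(f"{selector_bucket_arn}/{prefix}")

selector = "arn:aws:s3:::amzn-s3-demo-bucket"  # empty prefix: all objects in this bucket

print(is_logged("arn:aws:s3:::amzn-s3-demo-bucket/photo.jpg", selector))   # True: logged
print(is_logged("arn:aws:s3:::amzn-s3-demo-bucket2/photo.jpg", selector))  # False: other bucket
```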

**Logging data events for specific S3 objects**

The following example demonstrates how logging works when you configure a trail or event data store to log events for specific S3 objects. In this example, the CloudTrail user specified an S3 bucket named `amzn-s3-demo-bucket3`, with the prefix *my-images*, and the option to log only **Write** data events.

1. A user deletes an object that begins with the `my-images` prefix in the bucket, such as `arn:aws:s3:::amzn-s3-demo-bucket3/my-images/example.jpg`.

1. The `DeleteObject` API operation is an Amazon S3 object-level API. It is recorded as a **Write** data event in CloudTrail. The event occurred on an object that matches the S3 bucket and prefix specified in the trail or event data store. The trail or event data store processes and logs the event.

1. Another user deletes an object with a different prefix in the S3 bucket, such as `arn:aws:s3:::amzn-s3-demo-bucket3/my-videos/example.avi`.

1. The event occurred on an object that doesn't match the prefix specified in your trail or event data store. The trail or event data store doesn't log the event.

1. A user calls the `GetObject` API operation for the object, `arn:aws:s3:::amzn-s3-demo-bucket3/my-images/example.jpg`.

1. The event occurred on a bucket and prefix that are specified in the trail or event data store, but `GetObject` is a read-type Amazon S3 object-level API. It is recorded as a **Read** data event in CloudTrail, and the trail or event data store is not configured to log **Read** events. The trail or event data store doesn't log the event.
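The four outcomes above combine two filters: the object must match the configured bucket and prefix, and the event's read/write type must match the configured event types. The sketch below models that logic; the tiny `WRITE_EVENTS` set is an assumption for illustration (CloudTrail classifies events by their recorded `readOnly` attribute, not a hardcoded list).

```python
# Illustrative sketch only: models the prefix and read/write filtering
# described above, not CloudTrail's actual matching code.

WRITE_EVENTS = {"PutObject", "DeleteObject"}  # assumption: a tiny sample set for the sketch

def is_logged(event_name: str, object_arn: str,
              bucket_arn: str, prefix: str, log_reads: bool, log_writes: bool) -> bool:
    if not object_arn.startswith(f"{bucket_arn}/{prefix}"):
        return False  # wrong bucket or wrong prefix: never logged
    is_write = event_name in WRITE_EVENTS
    return log_writes if is_write else log_reads

# Write-only logging on amzn-s3-demo-bucket3 with the my-images prefix, as in the example.
cfg = dict(bucket_arn="arn:aws:s3:::amzn-s3-demo-bucket3", prefix="my-images",
           log_reads=False, log_writes=True)

print(is_logged("DeleteObject", "arn:aws:s3:::amzn-s3-demo-bucket3/my-images/example.jpg", **cfg))  # True
print(is_logged("DeleteObject", "arn:aws:s3:::amzn-s3-demo-bucket3/my-videos/example.avi", **cfg))  # False: prefix
print(is_logged("GetObject",    "arn:aws:s3:::amzn-s3-demo-bucket3/my-images/example.jpg", **cfg))  # False: read
```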

**Note**  
For trails, if you are logging data events for specific Amazon S3 buckets, we recommend you do not use an Amazon S3 bucket for which you are logging data events to receive log files that you have specified in the data events section for your trail. Using the same Amazon S3 bucket causes your trail to log a data event each time log files are delivered to your Amazon S3 bucket. Log files are aggregated events delivered at intervals, so this is not a 1:1 ratio of event to log file; the event is logged in the next log file. For example, when CloudTrail delivers logs, the `PutObject` event occurs on the S3 bucket. If the S3 bucket is also specified in the data events section, the trail processes and logs the `PutObject` event as a data event. That action is another `PutObject` event, and the trail processes and logs the event again.  
To avoid logging data events for the Amazon S3 bucket where you receive log files if you configure a trail to log all Amazon S3 data events in your AWS account, consider configuring delivery of log files to an Amazon S3 bucket that belongs to another AWS account. For more information, see [Receiving CloudTrail log files from multiple accounts](cloudtrail-receive-logs-from-multiple-accounts.md).

### Logging data events for S3 objects in other AWS accounts
<a name="logging-data-events-for-s3-resources-in-other-accounts"></a>

When you configure your trail to log data events, you can also specify S3 objects that belong to other AWS accounts. When an event occurs on a specified object, CloudTrail evaluates whether the event matches any trails in each account. If the event matches the settings for a trail, the trail processes and logs the event for that account. Generally, both API callers and resource owners can receive events.

If you own an S3 object and you specify it in your trail, your trail logs events that occur on the object in your account. Because you own the object, your trail also logs events when other accounts call the object.

If you specify an S3 object in your trail, and another account owns the object, your trail only logs events that occur on that object in your account. Your trail doesn't log events that occur in other accounts.
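Put together, the two rules above mean that at most two accounts can log a given cross-account data event: the API caller's account and the resource owner's account. The sketch below is an assumption-labelled model of that rule, not AWS code; the account IDs are placeholders.

```python
# Illustrative sketch (not AWS code): which accounts' trails log a
# cross-account S3 data event. Per the rules above, only the API caller's
# account and the resource owner's account are eligible, and each logs the
# event only if it has a trail whose settings match the event.

def accounts_that_log(caller_account: str, owner_account: str,
                      accounts_with_matching_trails: set[str]) -> set[str]:
    eligible = {caller_account, owner_account}  # caller and resource owner only
    return eligible & accounts_with_matching_trails

# Bob (2222...) calls PutObject on a bucket you (1111...) own; both accounts
# have trails that match the event, so both log a copy.
print(accounts_that_log("222222222222", "111111111111",
                        {"111111111111", "222222222222"}))
```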

**Example: Logging data events for an Amazon S3 object for two AWS accounts**

The following example shows how two AWS accounts configure CloudTrail to log events for the same S3 object.

1. In your account, you want your trail to log data events for all objects in your S3 bucket named `amzn-s3-demo-bucket`. You configure the trail by specifying the S3 bucket with an empty object prefix.

1. Bob has a separate account that has been granted access to the S3 bucket. Bob also wants to log data events for all objects in the same S3 bucket. For his trail, he configures his trail and specifies the same S3 bucket with an empty object prefix.

1. Bob uploads an object to the S3 bucket with the `PutObject` API operation.

1. This event occurred in his account and it matches the settings for his trail. Bob's trail processes and logs the event.

1. Because you own the S3 bucket and the event matches the settings for your trail, your trail also processes and logs the same event. Because there are now two copies of the event (one logged in Bob's trail, and one logged in yours), CloudTrail charges for two copies of the data event.

1. You upload an object to the S3 bucket.

1. This event occurs in your account and it matches the settings for your trail. Your trail processes and logs the event.

1. Because the event didn't occur in Bob's account, and he doesn't own the S3 bucket, Bob's trail doesn't log the event. CloudTrail charges for only one copy of this data event.

**Example: Logging data events for all buckets, including an S3 bucket used by two AWS accounts**

The following example shows the logging behavior when **Select all S3 buckets in your account** is enabled for trails that collect data events in an AWS account.

1. In your account, you want your trail to log data events for all S3 buckets. You configure the trail by choosing **Read** events, **Write** events, or both for **All current and future S3 buckets** in **Data events**.

1. Bob has a separate account that has been granted access to an S3 bucket in your account. He wants to log data events for the bucket to which he has access. He configures his trail to get data events for all S3 buckets.

1. Bob uploads an object to the S3 bucket with the `PutObject` API operation.

1. This event occurred in his account and it matches the settings for his trail. Bob's trail processes and logs the event.

1. Because you own the S3 bucket and the event matches the settings for your trail, your trail also processes and logs the event. Because there are now two copies of the event (one logged in Bob's trail, and one logged in yours), CloudTrail charges each account for a copy of the data event.

1. You upload an object to the S3 bucket.

1. This event occurs in your account and it matches the settings for your trail. Your trail processes and logs the event.

1. Because the event didn't occur in Bob's account, and he doesn't own the S3 bucket, Bob's trail doesn't log the event. CloudTrail charges for only one copy of this data event in your account.

1. A third user, Mary, has access to the S3 bucket, and runs a `GetObject` operation on the bucket. She has a trail configured to log data events on all S3 buckets in her account. Because she is the API caller, CloudTrail logs a data event in her trail. Though Bob has access to the bucket, he is not the resource owner, so no event is logged in his trail this time. As the resource owner, you receive an event in your trail about the `GetObject` operation that Mary called. CloudTrail charges your account and Mary's account for each copy of the data event: one in Mary's trail, and one in yours.

## Read-only and write-only events
<a name="read-write-events-data"></a>

When you configure your trail or event data store to log data and management events, you can specify whether you want read-only events, write-only events, or both.
+ **Read**

  **Read** events include API operations that read your resources, but don't make changes. For example, read-only events include the Amazon EC2 `DescribeSecurityGroups` and `DescribeSubnets` API operations. These operations return only information about your Amazon EC2 resources and don't change your configurations. 
+ **Write**

  **Write** events include API operations that modify (or might modify) your resources. For example, the Amazon EC2 `RunInstances` and `TerminateInstances` API operations modify your instances.
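As a rough illustration of this split, the heuristic below classifies an API operation name by its verb prefix. This is only an approximation for the sketch: CloudTrail actually records a `readOnly` attribute on each event, and some operation names do not follow these prefix conventions.

```python
# Heuristic sketch only: CloudTrail records a readOnly flag on each event;
# this prefix check merely approximates the Read/Write split described above.

READ_PREFIXES = ("Get", "Describe", "List", "Head")

def looks_read_only(event_name: str) -> bool:
    """Approximate classification of an API operation name as read-only."""
    return event_name.startswith(READ_PREFIXES)

for name in ("DescribeSecurityGroups", "DescribeSubnets",
             "RunInstances", "TerminateInstances"):
    print(f"{name}: {'Read' if looks_read_only(name) else 'Write'}")
```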

**Example: Logging read and write events for separate trails**

The following example shows how you can configure trails to split log activity for an account into separate S3 buckets: one bucket, named `amzn-s3-demo-bucket1`, receives read-only events, and a second bucket, named `amzn-s3-demo-bucket2`, receives write-only events.

1. You create a trail and choose the S3 bucket named `amzn-s3-demo-bucket1` to receive log files. You then update the trail to specify that you want **Read** management events and data events.

1. You create a second trail and choose the S3 bucket named `amzn-s3-demo-bucket2` to receive log files. You then update the trail to specify that you want **Write** management events and data events.

1. The Amazon EC2 `DescribeInstances` and `TerminateInstances` API operations occur in your account.

1. The `DescribeInstances` API operation is a read-only event and it matches the settings for the first trail. The trail logs and delivers the event to `amzn-s3-demo-bucket1`.

1. The `TerminateInstances` API operation is a write-only event and it matches the settings for the second trail. The trail logs and delivers the event to `amzn-s3-demo-bucket2`.
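The two trails above differ only in their `ReadWriteType`. The sketch below builds the event selector payloads you could pass to the CloudTrail `PutEventSelectors` API; the shape follows that API, but treat the exact values as a sketch for this example rather than a complete configuration.

```python
import json

# Sketch of the basic event selectors the two example trails would carry.
# The payload shape follows the CloudTrail PutEventSelectors API; the
# DataResources value "arn:aws:s3" scopes data event logging to all S3 objects.

def selectors(read_write_type: str) -> list:
    return [{
        "ReadWriteType": read_write_type,          # "ReadOnly" or "WriteOnly"
        "IncludeManagementEvents": True,
        "DataResources": [{"Type": "AWS::S3::Object", "Values": ["arn:aws:s3"]}],
    }]

read_trail = selectors("ReadOnly")    # trail delivering to amzn-s3-demo-bucket1
write_trail = selectors("WriteOnly")  # trail delivering to amzn-s3-demo-bucket2

print(json.dumps(read_trail, indent=2))
```

You could then apply each payload with `aws cloudtrail put-event-selectors --trail-name <trail> --event-selectors '<payload>'`, substituting your own trail names.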

## Logging data events with the AWS Management Console
<a name="logging-data-events-console"></a>

The following procedures describe how to update an existing event data store or trail to log data events by using the AWS Management Console. For information about how to create an event data store to log data events, see [Create an event data store for CloudTrail events with the console](query-event-data-store-cloudtrail.md). For information about how to create a trail to log data events, see [Creating a trail with the console](cloudtrail-create-a-trail-using-the-console-first-time.md#creating-a-trail-in-the-console). 

For trails, the steps for logging data events differ based on whether you're using advanced event selectors or basic event selectors. You can log data events for all resource types using advanced event selectors, but if you use basic event selectors you're limited to logging data events for Amazon S3 buckets and bucket objects, AWS Lambda functions, and Amazon DynamoDB tables.

### Updating an existing event data store to log data events using the console
<a name="logging-data-events-with-the-cloudtrail-console-eds"></a>

Use the following procedure to update an existing event data store to log data events. For more information about using advanced event selectors, see [Filtering data events by using advanced event selectors](filtering-data-events.md) in this topic.

1. Sign in to the AWS Management Console and open the CloudTrail console at [https://console.aws.amazon.com/cloudtrail/](https://console.aws.amazon.com/cloudtrail/).

1.  From the navigation pane, under **Lake**, choose **Event data stores**. 

1. On the **Event data stores** page, choose the event data store you want to update.
**Note**  
You can only enable data events on event data stores that contain CloudTrail events. You cannot enable data events on CloudTrail event data stores for AWS Config configuration items, CloudTrail Insights events, or non-AWS events.

1. On the details page, in **Data events**, choose **Edit**.

1. If you are not already logging data events, choose the **Data events** check box.

1. For **Resource type**, choose the resource type on which you want to log data events.

1. Choose a log selector template. You can choose a predefined template, or choose **Custom** to define your own event collection conditions.

   You can choose from the following predefined templates:
   + **Log all events** – Choose this template to log all events.
   + **Log only read events** – Choose this template to log only read events. Read-only events are events that do not change the state of a resource, such as `Get*` or `Describe*` events.
   + **Log only write events** – Choose this template to log only write events. Write events add, change, or delete resources, attributes, or artifacts, such as `Put*`, `Delete*`, or `Write*` events.
   + **Log only AWS Management Console events** – Choose this template to log only events originating from the AWS Management Console.
   + **Exclude AWS service initiated events** – Choose this template to exclude AWS service events, which have an `eventType` of `AwsServiceEvent`, and events initiated with AWS service-linked roles (SLRs).

1. (Optional) In **Selector name**, enter a name to identify your selector. The selector name is a descriptive name for an advanced event selector, such as "Log data events for only two S3 buckets". The selector name is listed as `Name` in the advanced event selector and is viewable if you expand the **JSON view**.

1. If you selected **Custom**, in **Advanced event selectors** build an expression based on the values of advanced event selector fields.
**Note**  
Selectors don't support the use of wildcards like `*`. To match multiple values with a single condition, you may use `StartsWith`, `EndsWith`, `NotStartsWith`, or `NotEndsWith` to explicitly match the beginning or end of the event field.

   1. Choose from the following fields.
      + **`readOnly`** - `readOnly` can be set to **equals** a value of `true` or `false`. Read-only data events are events that do not change the state of a resource, such as `Get*` or `Describe*` events. Write events add, change, or delete resources, attributes, or artifacts, such as `Put*`, `Delete*`, or `Write*` events. To log both `read` and `write` events, don't add a `readOnly` selector.
      + **`eventName`** - `eventName` can use any operator. You can use it to include or exclude any data event logged to CloudTrail, such as `PutBucket`, `GetItem`, or `GetSnapshotBlock`.
      + **`eventSource`** – The event source to include or exclude. This field can use any operator.
      + **eventType** – The event type to include or exclude. For example, you can set this field to **not equals** `AwsServiceEvent` to exclude [AWS service events](non-api-aws-service-events.md). For a list of event types, see [`eventType`](cloudtrail-event-reference-record-contents.md#ct-event-type) in [CloudTrail record contents for management, data, and network activity events](cloudtrail-event-reference-record-contents.md).
      + **sessionCredentialFromConsole** – Include or exclude events originating from an AWS Management Console session. This field can be set to **equals** or **not equals** with a value of `true`.
      + **userIdentity.arn** – Include or exclude events for actions taken by specific IAM identities. For more information, see [CloudTrail userIdentity element](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-event-reference-user-identity.html).
      + **`resources.ARN`** - You can use any operator with `resources.ARN`, but if you use **equals** or **does not equal**, the value must exactly match the ARN of a valid resource of the type you've specified in the template as the value of `resources.type`.
**Note**  
You can't use the `resources.ARN` field to filter resource types that do not have ARNs.

        For more information about the ARN formats of data event resources, see [Actions, resources, and condition keys for AWS services](https://docs.aws.amazon.com/service-authorization/latest/reference/reference_policies_actions-resources-contextkeys.html) in the *Service Authorization Reference*.

   1. For each field, choose **+ Condition** to add as many conditions as you need, up to a maximum of 500 specified values for all conditions. For example, to exclude data events for two S3 buckets from the data events that are logged on your event data store, you can set the field to **resources.ARN**, set the operator to **does not start with**, and then paste in the ARN of an S3 bucket for which you do not want to log events.

      To add the second S3 bucket, choose **+ Condition** again, and then paste in or browse for the ARN of a different bucket.

      For information about how CloudTrail evaluates multiple conditions, see [How CloudTrail evaluates multiple conditions for a field](filtering-data-events.md#filtering-data-events-conditions).
**Note**  
You can have a maximum of 500 values for all selectors on an event data store. This includes arrays of multiple values for a selector such as `eventName`. If you have single values for all selectors, you can have a maximum of 500 conditions added to a selector.

   1. Choose **+ Field** to add additional fields as required. To avoid errors, do not set conflicting or duplicate values for fields. For example, do not specify an ARN in one selector to be equal to a value, and then specify that the ARN not equal the same value in another selector.

1. To add another resource type on which to log data events, choose **Add data event type**, and then repeat the preceding steps, starting with choosing a resource type, to configure advanced event selectors for that resource type.

1. After you've reviewed and verified your choices, choose **Save changes**.
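The procedure above produces an advanced event selector like the one sketched below, which you can inspect in the console's **JSON view**. The field names and operators (`eventCategory`, `resources.type`, `resources.ARN`, `NotStartsWith`) follow the CloudTrail advanced event selector schema; the selector name and bucket ARNs are placeholders for this example.

```python
import json

# Sketch of the advanced event selector built by the console steps above:
# log all S3 object data events except those on two excluded buckets.
# The bucket ARNs are placeholders for your own buckets.

advanced_event_selectors = [{
    "Name": "Log data events for all but two S3 buckets",
    "FieldSelectors": [
        {"Field": "eventCategory", "Equals": ["Data"]},
        {"Field": "resources.type", "Equals": ["AWS::S3::Object"]},
        {"Field": "resources.ARN", "NotStartsWith": [
            "arn:aws:s3:::amzn-s3-demo-bucket1/",
            "arn:aws:s3:::amzn-s3-demo-bucket2/",
        ]},
    ],
}]

print(json.dumps(advanced_event_selectors, indent=2))
```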

### Updating an existing trail to log data events with advanced event selectors using the console
<a name="logging-data-events-with-the-cloudtrail-console-adv"></a>

In the AWS Management Console, if your trail is using advanced event selectors, you can choose from predefined templates that log all data events on a selected resource. After you choose a log selector template, you can customize the template to include only the data events you most want to see. For more information about using advanced event selectors, see [Filtering data events by using advanced event selectors](filtering-data-events.md) in this topic.

1. On the **Dashboard** or **Trails** pages of the CloudTrail console, choose the trail you want to update.

1. On the details page, in **Data events**, choose **Edit**.

1. If you are not already logging data events, choose the **Data events** check box.

1. For **Resource type**, choose the resource type on which you want to log data events.

1. Choose a log selector template. You can choose a predefined template, or choose **Custom** to define your own event collection conditions.

   You can choose from the following predefined templates:
   + **Log all events** – Choose this template to log all events.
   + **Log only read events** – Choose this template to log only read events. Read-only events are events that do not change the state of a resource, such as `Get*` or `Describe*` events.
   + **Log only write events** – Choose this template to log only write events. Write events add, change, or delete resources, attributes, or artifacts, such as `Put*`, `Delete*`, or `Write*` events.
   + **Log only AWS Management Console events** – Choose this template to log only events originating from the AWS Management Console.
   + **Exclude AWS service initiated events** – Choose this template to exclude AWS service events, which have an `eventType` of `AwsServiceEvent`, and events initiated with AWS service-linked roles (SLRs).
**Note**  
Choosing a predefined template for S3 buckets enables data event logging for all buckets currently in your AWS account and any buckets you create after you finish creating the trail. It also enables logging of data event activity performed by any user or role in your AWS account, even if that activity is performed on a bucket that belongs to another AWS account.  
If the trail applies only to one Region, choosing a predefined template that logs all S3 buckets enables data event logging for all buckets in the same Region as your trail and any buckets you create later in that Region. It will not log data events for Amazon S3 buckets in other Regions in your AWS account.  
If you are creating a trail for all Regions, choosing a predefined template for Lambda functions enables data event logging for all functions currently in your AWS account, and any Lambda functions you might create in any Region after you finish creating the trail. If you are creating a trail for a single Region (for trails, this only can be done by using the AWS CLI), this selection enables data event logging for all functions currently in that Region in your AWS account, and any Lambda functions you might create in that Region after you finish creating the trail. It does not enable data event logging for Lambda functions created in other Regions.  
Logging data events for all functions also enables logging of data event activity performed by any user or role in your AWS account, even if that activity is performed on a function that belongs to another AWS account.

1. (Optional) In **Selector name**, enter a name to identify your selector. The selector name is a descriptive name for an advanced event selector, such as "Log data events for only two S3 buckets". The selector name is listed as `Name` in the advanced event selector and is viewable if you expand the **JSON view**.

1. If you selected **Custom**, in **Advanced event selectors**, build an expression based on the values of advanced event selector fields.
**Note**  
Selectors don't support the use of wildcards like `*`. To match multiple values with a single condition, use `StartsWith`, `EndsWith`, `NotStartsWith`, or `NotEndsWith` to explicitly match the beginning or end of the event field.

   1. Choose from the following fields.
      + **`readOnly`** - `readOnly` can be set to **equals** a value of `true` or `false`. Read-only data events are events that do not change the state of a resource, such as `Get*` or `Describe*` events. Write events add, change, or delete resources, attributes, or artifacts, such as `Put*`, `Delete*`, or `Write*` events. To log both `read` and `write` events, don't add a `readOnly` selector.
      + **`eventName`** - `eventName` can use any operator. You can use it to include or exclude any data event logged to CloudTrail, such as `PutBucket`, `GetItem`, or `GetSnapshotBlock`.
      + **`eventSource`** – The event source to include or exclude. This field can use any operator.
      + **`eventType`** – The event type to include or exclude. For example, you can set this field to **not equals** `AwsServiceEvent` to exclude [AWS service events](non-api-aws-service-events.md). For a list of event types, see [`eventType`](cloudtrail-event-reference-record-contents.md#ct-event-type) in [CloudTrail record contents for management, data, and network activity events](cloudtrail-event-reference-record-contents.md).
      + **`sessionCredentialFromConsole`** – Include or exclude events originating from an AWS Management Console session. This field can be set to **equals** or **not equals** with a value of `true`.
      + **`userIdentity.arn`** – Include or exclude events for actions taken by specific IAM identities. For more information, see [CloudTrail userIdentity element](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-event-reference-user-identity.html).
      + **`resources.ARN`** - You can use any operator with `resources.ARN`, but if you use **equals** or **does not equal**, the value must exactly match the ARN of a valid resource of the type you've specified in the template as the value of `resources.type`.
**Note**  
You can't use the `resources.ARN` field to filter resource types that do not have ARNs.

        For more information about the ARN formats of data event resources, see [Actions, resources, and condition keys for AWS services](https://docs.aws.amazon.com/service-authorization/latest/reference/reference_policies_actions-resources-contextkeys.html) in the *Service Authorization Reference*.

   1. For each field, choose **+ Condition** to add as many conditions as you need, up to a maximum of 500 specified values for all conditions. For example, to exclude data events for two S3 buckets from the data events logged on your event data store, set the field to **resources.ARN**, set the operator to **does not start with**, and then paste in the ARN of an S3 bucket for which you do not want to log events.

      To add the second S3 bucket, choose **+ Condition**, and then repeat the preceding step, pasting in the ARN of, or browsing for, a different bucket.

      For information about how CloudTrail evaluates multiple conditions, see [How CloudTrail evaluates multiple conditions for a field](filtering-data-events.md#filtering-data-events-conditions).
**Note**  
You can have a maximum of 500 values for all selectors on an event data store. This includes arrays of multiple values for a selector such as `eventName`. If you have single values for all selectors, you can have a maximum of 500 conditions added to a selector.

   1. Choose **+ Field** to add additional fields as required. To avoid errors, do not set conflicting or duplicate values for fields. For example, do not specify an ARN in one selector to be equal to a value, and then specify that the ARN not equal the same value in another selector.

1. To add another resource type on which to log data events, choose **Add data event type**, and then repeat the preceding steps to configure advanced event selectors for that resource type.

1. After you've reviewed and verified your choices, choose **Save changes**.
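The selectors built in the console can also be maintained as JSON and applied with the AWS CLI. The following sketch is illustrative only: the file path and bucket ARNs are placeholders, and the `put-event-selectors` call is shown as a comment. It validates a selector document locally before you would apply it.

```shell
# Hypothetical advanced event selectors: exclude data events for two S3
# buckets using NotStartsWith. Bucket ARNs are placeholders.
cat > /tmp/advanced-selectors.json <<'EOF'
[
  {
    "Name": "Log data events for all but two S3 buckets",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
      { "Field": "resources.ARN", "NotStartsWith": [
          "arn:aws:s3:::amzn-s3-demo-bucket1/",
          "arn:aws:s3:::amzn-s3-demo-bucket2/"
      ] }
    ]
  }
]
EOF

# Catch JSON mistakes locally before calling the API.
python3 -m json.tool /tmp/advanced-selectors.json > /dev/null && echo "selectors OK"

# The validated file could then be applied to a trail, for example:
# aws cloudtrail put-event-selectors --trail-name TrailName \
#   --advanced-event-selectors file:///tmp/advanced-selectors.json
```

Keeping selectors in a file makes them easy to review and reuse across trails.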

### Update an existing trail to log data events with basic event selectors using the console
<a name="logging-data-events-with-the-cloudtrail-console"></a>

Use the following procedure to update an existing trail to log data events using basic event selectors.

1. Sign in to the AWS Management Console and open the CloudTrail console at [https://console.aws.amazon.com/cloudtrail/](https://console.aws.amazon.com/cloudtrail/).

1. Open the **Trails** page of the CloudTrail console and choose the trail name.
**Note**  
While you can edit an existing trail to log data events, as a best practice, consider creating a separate trail specifically for logging data events.

1. For **Data events**, choose **Edit**.

1. For Amazon S3 buckets:

   1. For **Data event source**, choose **S3**.

   1. You can choose to log **All current and future S3 buckets**, or you can specify individual buckets. By default, data events are logged for all current and future S3 buckets.
**Note**  
Keeping the default **All current and future S3 buckets** option enables data event logging for all buckets currently in your AWS account and any buckets you create after you finish creating the trail. It also enables logging of data event activity performed by any user or role in your AWS account, even if that activity is performed on a bucket that belongs to another AWS account.  
If your trail applies to a single Region (for trails, this can be done only by using the AWS CLI), choosing **All current and future S3 buckets** enables data event logging for all buckets in the same Region as your trail and any buckets you create later in that Region. It will not log data events for Amazon S3 buckets in other Regions in your AWS account.

   1. If you leave the default, **All current and future S3 buckets**, choose to log **Read** events, **Write** events, or both.

   1. To select individual buckets, clear the **Read** and **Write** check boxes for **All current and future S3 buckets**. In **Individual bucket selection**, browse for a bucket on which to log data events. To find specific buckets, enter a bucket prefix for the bucket you want. You can select multiple buckets in this window. Choose **Add bucket** to log data events for more buckets. Choose to log **Read** events, such as `GetObject`, **Write** events, such as `PutObject`, or both.

      The **All current and future S3 buckets** setting takes precedence over settings you configure for individual buckets. For example, if you specify logging **Read** events for all S3 buckets, and then choose to add a specific bucket for data event logging, **Read** is already selected for the bucket you added. You cannot clear the selection. You can only configure the option for **Write**.

      To remove a bucket from logging, choose **X**.

1. To add another resource type on which to log data events, choose **Add data event type**.

1. For Lambda functions:

   1. For **Data event source**, choose **Lambda**.

   1. In **Lambda function**, choose **All regions** to log all Lambda functions, or **Input function as ARN** to log data events on a specific function. 

      To log data events for all Lambda functions in your AWS account, select **Log all current and future functions**. This setting takes precedence over settings you configure for individual functions. All functions are logged, even if not all functions are displayed.
**Note**  
If you are creating a trail for all Regions, this selection enables data event logging for all functions currently in your AWS account, and any Lambda functions you might create in any Region after you finish creating the trail. If you are creating a trail for a single Region (done by using the AWS CLI), this selection enables data event logging for all functions currently in that Region in your AWS account, and any Lambda functions you might create in that Region after you finish creating the trail. It does not enable data event logging for Lambda functions created in other Regions.  
Logging data events for all functions also enables logging of data event activity performed by any user or role in your AWS account, even if that activity is performed on a function that belongs to another AWS account.

   1. If you choose **Input function as ARN**, enter the ARN of a Lambda function.
**Note**  
If you have more than 15,000 Lambda functions in your account, you cannot view or select all functions in the CloudTrail console when creating a trail. You can still select the option to log all functions, even if they are not displayed. If you want to log data events for specific functions, you can manually add a function if you know its ARN. You can also finish creating the trail in the console, and then use the AWS CLI and the **put-event-selectors** command to configure data event logging for specific Lambda functions. For more information, see [Managing trails with the AWS CLI](cloudtrail-additional-cli-commands.md).

1. To add another resource type on which to log data events, choose **Add data event type**.

1. For DynamoDB tables:

   1. For **Data event source**, choose **DynamoDB**.

   1. In **DynamoDB table selection**, choose **Browse** to select a table, or paste in the ARN of a DynamoDB table to which you have access. A DynamoDB table ARN uses the following format:

      ```
      arn:partition:dynamodb:region:account_ID:table/table_name
      ```

      To add another table, choose **Add row**, and browse for a table or paste in the ARN of a table to which you have access.

1. Choose **Save changes**.

## Logging data events with the AWS Command Line Interface
<a name="creating-data-event-selectors-with-the-AWS-CLI"></a>

You can configure your trails or event data stores to log data events using the AWS CLI.

**Topics**
+ [Logging data events for trails with the AWS CLI](#logging-data-events-CLI-trail-examples)
+ [Logging data events for event data stores with the AWS CLI](#logging-data-events-CLI-eds-examples)

### Logging data events for trails with the AWS CLI
<a name="logging-data-events-CLI-trail-examples"></a>

You can configure your trails to log management and data events using the AWS CLI.

**Note**  
Be aware that if your account is logging more than one copy of management events, you incur charges. There is always a charge for logging data events. For more information, see [AWS CloudTrail Pricing](https://aws.amazon.com/cloudtrail/pricing/).
You can use either advanced event selectors or basic event selectors, but not both. If you apply advanced event selectors to a trail, any existing basic event selectors are overwritten.
If your trail uses basic event selectors, you can only log the following resource types:  
`AWS::DynamoDB::Table`
`AWS::Lambda::Function`
`AWS::S3::Object`
To log additional resource types, you'll need to use advanced event selectors. To convert a trail to advanced event selectors, run the **get-event-selectors** command to confirm the current event selectors. Then configure advanced event selectors to match the coverage of the previous selectors, and add selectors for any resource types for which you want to log data events.
You can use advanced event selectors to filter based on the value of the [supported advanced event selector fields](filtering-data-events.md), giving you the ability to log only the data events of interest. For more information about configuring these fields, see [https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_AdvancedFieldSelector.html](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_AdvancedFieldSelector.html) in the *AWS CloudTrail API Reference* and [Filtering data events by using advanced event selectors](filtering-data-events.md) in this guide.

To see whether your trail is logging management and data events, run the [https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/get-event-selectors.html](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/get-event-selectors.html) command.

```
aws cloudtrail get-event-selectors --trail-name TrailName
```

The command returns the event selectors for the trail.
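A trail uses either `EventSelectors` (basic) or `AdvancedEventSelectors`, never both, so the shape of the response tells you which style is in place. As a sketch, assuming you have saved the `get-event-selectors` output to a local file (the file path and response contents below are hypothetical), you can check it without another API call:

```shell
# Hypothetical saved output of:
#   aws cloudtrail get-event-selectors --trail-name TrailName > /tmp/selectors.json
cat > /tmp/selectors.json <<'EOF'
{
  "TrailARN": "arn:aws:cloudtrail:us-east-2:123456789012:trail/TrailName",
  "AdvancedEventSelectors": [
    { "Name": "Management events",
      "FieldSelectors": [ { "Field": "eventCategory", "Equals": ["Management"] } ] }
  ]
}
EOF

# Report which selector style the trail is using.
python3 - <<'EOF'
import json
response = json.load(open("/tmp/selectors.json"))
kind = "advanced" if "AdvancedEventSelectors" in response else "basic"
print(f"trail uses {kind} event selectors")
EOF
```

Knowing the selector style matters because applying advanced event selectors overwrites any basic event selectors on the trail.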

**Topics**
+ [Log data events for trails by using advanced event selectors](#creating-data-event-selectors-advanced)
+ [Log all Amazon S3 events for an Amazon S3 bucket by using advanced event selectors](#creating-data-adv-event-selectors-CLI-s3)
+ [Log Amazon S3 on AWS Outposts events by using advanced event selectors](#creating-data-event-selectors-CLI-outposts)
+ [Log events by using basic event selectors](#creating-data-event-selectors-basic)

#### Log data events for trails by using advanced event selectors
<a name="creating-data-event-selectors-advanced"></a>

**Note**  
If you apply advanced event selectors to a trail, any existing basic event selectors are overwritten. Before configuring advanced event selectors, run the **get-event-selectors** command to confirm the current event selectors. Then configure advanced event selectors to match the coverage of the previous selectors, and add selectors for any additional data events you want to log.

The following example creates custom advanced event selectors for a trail named *TrailName* to include read and write management events (by omitting the `readOnly` selector), `PutObject` and `DeleteObject` data events for all Amazon S3 bucket/prefix combinations except for a bucket named `amzn-s3-demo-bucket` and data events for an AWS Lambda function named `MyLambdaFunction`. Because these are custom advanced event selectors, each set of selectors has a descriptive name. Note that a trailing slash is part of the ARN value for S3 buckets.

```
aws cloudtrail put-event-selectors --trail-name TrailName --advanced-event-selectors
'[
  {
    "Name": "Log readOnly and writeOnly management events",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Management"] }
    ]
  },
  {
    "Name": "Log PutObject and DeleteObject events for all but one bucket",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
      { "Field": "eventName", "Equals": ["PutObject","DeleteObject"] },
      { "Field": "resources.ARN", "NotStartsWith": ["arn:aws:s3:::amzn-s3-demo-bucket/"] }
    ]
  },
  {
    "Name": "Log data plane actions on MyLambdaFunction",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::Lambda::Function"] },
      { "Field": "resources.ARN", "Equals": ["arn:aws:lambda:us-east-2:111122223333:function/MyLambdaFunction"] }
    ]
  }
]'
```

The example returns the advanced event selectors that are configured for the trail.

```
{
  "AdvancedEventSelectors": [
    {
      "Name": "Log readOnly and writeOnly management events",
      "FieldSelectors": [
        {
          "Field": "eventCategory", 
          "Equals": [ "Management" ]
        }
      ]
    },
    {
      "Name": "Log PutObject and DeleteObject events for all but one bucket",
      "FieldSelectors": [
        {
          "Field": "eventCategory",
          "Equals": [ "Data" ]
        },
        {
          "Field": "resources.type",
          "Equals": [ "AWS::S3::Object" ]
        },
        {
          "Field": "eventName",
          "Equals": [ "PutObject", "DeleteObject" ]
        },
        {
          "Field": "resources.ARN",
          "NotStartsWith": [ "arn:aws:s3:::amzn-s3-demo-bucket/" ]
        }
      ]
    },
    {
      "Name": "Log data plane actions on MyLambdaFunction",
      "FieldSelectors": [
        {
          "Field": "eventCategory",
          "Equals": [ "Data" ]
        },
        {
          "Field": "resources.type",
          "Equals": [ "AWS::Lambda::Function" ]
        },
        {
          "Field": "resources.ARN",
          "Equals": [ "arn:aws:lambda:us-east-2:111122223333:function/MyLambdaFunction" ]
        }
      ]
    }
  ],
  "TrailARN": "arn:aws:cloudtrail:us-east-2:123456789012:trail/TrailName"
}
```

#### Log all Amazon S3 events for an Amazon S3 bucket by using advanced event selectors
<a name="creating-data-adv-event-selectors-CLI-s3"></a>

**Note**  
If you apply advanced event selectors to a trail, any existing basic event selectors are overwritten.

The following example shows how to configure your trail to include all data events for all Amazon S3 objects in a specific S3 bucket. The value for S3 events for the `resources.type` field is `AWS::S3::Object`. Because the ARN values for S3 objects and S3 buckets are slightly different, you must add the `StartsWith` operator for `resources.ARN` to capture all events.

```
aws cloudtrail put-event-selectors --trail-name TrailName --region region \
--advanced-event-selectors \
'[
    {
            "Name": "S3EventSelector",
            "FieldSelectors": [
                { "Field": "eventCategory", "Equals": ["Data"] },
                { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
                { "Field": "resources.ARN", "StartsWith": ["arn:partition:s3:::amzn-s3-demo-bucket/"] }
            ]
        }
]'
```

The command returns the following example output.

```
{
    "TrailARN": "arn:aws:cloudtrail:region:account_ID:trail/TrailName",
    "AdvancedEventSelectors": [
        {
            "Name": "S3EventSelector",
            "FieldSelectors": [
                {
                    "Field": "eventCategory",
                    "Equals": [
                        "Data"
                    ]
                },
                {
                    "Field": "resources.type",
                    "Equals": [
                        "AWS::S3::Object"
                    ]
                },
                {
                    "Field": "resources.ARN",
                    "StartsWith": [
                        "arn:partition:s3:::amzn-s3-demo-bucket/"
                    ]
                }
            ]
        }
    ]
}
```

#### Log Amazon S3 on AWS Outposts events by using advanced event selectors
<a name="creating-data-event-selectors-CLI-outposts"></a>

**Note**  
If you apply advanced event selectors to a trail, any existing basic event selectors are overwritten.

The following example shows how to configure your trail to include all data events for all Amazon S3 on Outposts objects in your outpost.

```
aws cloudtrail put-event-selectors --trail-name TrailName --region region \
--advanced-event-selectors \
'[
    {
            "Name": "OutpostsEventSelector",
            "FieldSelectors": [
                { "Field": "eventCategory", "Equals": ["Data"] },
                { "Field": "resources.type", "Equals": ["AWS::S3Outposts::Object"] }
            ]
    }
]'
```

The command returns the following example output.

```
{
    "TrailARN": "arn:aws:cloudtrail:region:account_ID:trail/TrailName",
    "AdvancedEventSelectors": [
        {
            "Name": "OutpostsEventSelector",
            "FieldSelectors": [
                {
                    "Field": "eventCategory",
                    "Equals": [
                        "Data"
                    ]
                },
                {
                    "Field": "resources.type",
                    "Equals": [
                        "AWS::S3Outposts::Object"
                    ]
                }
            ]
        }
    ]
}
```

#### Log events by using basic event selectors
<a name="creating-data-event-selectors-basic"></a>

The following is an example result of the **get-event-selectors** command showing basic event selectors. By default, when you create a trail by using the AWS CLI, a trail logs all management events. By default, trails do not log data events.

```
{
    "TrailARN": "arn:aws:cloudtrail:us-east-2:123456789012:trail/TrailName",
    "EventSelectors": [
        {
            "IncludeManagementEvents": true,
            "DataResources": [],
            "ReadWriteType": "All"
        }
    ]
}
```

To configure your trail to log management and data events, run the [https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/put-event-selectors.html](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/put-event-selectors.html) command.

The following example shows how to use basic event selectors to configure your trail to include all management and data events for the S3 objects in two S3 bucket prefixes. You can specify from 1 to 5 event selectors for a trail. You can specify from 1 to 250 data resources for a trail.

**Note**  
The maximum number of S3 data resources is 250, if you choose to limit data events by using basic event selectors.

```
aws cloudtrail put-event-selectors --trail-name TrailName --event-selectors '[{ "ReadWriteType": "All", "IncludeManagementEvents":true, "DataResources": [{ "Type": "AWS::S3::Object", "Values": ["arn:aws:s3:::amzn-s3-demo-bucket1/prefix", "arn:aws:s3:::amzn-s3-demo-bucket2/prefix2"] }] }]'
```

The command returns the event selectors that are configured for the trail.

```
{
    "TrailARN": "arn:aws:cloudtrail:us-east-2:123456789012:trail/TrailName",
    "EventSelectors": [
        {
            "IncludeManagementEvents": true,
            "DataResources": [
                {
                    "Values": [
                        "arn:aws:s3:::amzn-s3-demo-bucket1/prefix",
                        "arn:aws:s3:::amzn-s3-demo-bucket2/prefix2"
                    ],
                    "Type": "AWS::S3::Object"
                }
            ],
            "ReadWriteType": "All"
        }
    ]
}
```
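When converting a trail from basic to advanced event selectors, each `DataResources` entry maps to an advanced selector with `eventCategory` equal to `Data`, a `resources.type` condition, and a `resources.ARN` `StartsWith` condition. The following is a minimal local sketch of that mapping; the file paths and ARNs are hypothetical, and it assumes `ReadWriteType` is `All` (a `ReadOnly` or `WriteOnly` trail would also need a `readOnly` field selector).

```shell
# Hypothetical basic selectors, as returned by get-event-selectors.
cat > /tmp/basic-selectors.json <<'EOF'
{
  "EventSelectors": [
    {
      "ReadWriteType": "All",
      "IncludeManagementEvents": true,
      "DataResources": [
        { "Type": "AWS::S3::Object",
          "Values": ["arn:aws:s3:::amzn-s3-demo-bucket1/prefix"] }
      ]
    }
  ]
}
EOF

# Sketch: build advanced selectors with equivalent coverage.
python3 - <<'EOF' > /tmp/advanced-equivalent.json
import json
basic = json.load(open("/tmp/basic-selectors.json"))["EventSelectors"][0]
advanced = []
if basic.get("IncludeManagementEvents"):
    # ReadWriteType "All" means no readOnly selector is needed.
    advanced.append({"Name": "Management events",
                     "FieldSelectors": [{"Field": "eventCategory", "Equals": ["Management"]}]})
for resource in basic["DataResources"]:
    advanced.append({"Name": f"Data events for {resource['Type']}",
                     "FieldSelectors": [
                         {"Field": "eventCategory", "Equals": ["Data"]},
                         {"Field": "resources.type", "Equals": [resource["Type"]]},
                         {"Field": "resources.ARN", "StartsWith": resource["Values"]}]})
print(json.dumps(advanced, indent=2))
EOF

python3 -c 'import json; print("selectors:", len(json.load(open("/tmp/advanced-equivalent.json"))))'
```

The generated file could then be reviewed and applied with `put-event-selectors --advanced-event-selectors`, which replaces the basic selectors on the trail.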

### Logging data events for event data stores with the AWS CLI
<a name="logging-data-events-CLI-eds-examples"></a>

You can configure your event data stores to include data events using the AWS CLI. Use the [https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/create-event-data-store.html](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/create-event-data-store.html) command to create a new event data store to log data events. Use the [https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/update-event-data-store.html](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/update-event-data-store.html) command to update the advanced event selectors for an existing event data store.

You configure advanced event selectors to log data events on an event data store. For a list of supported fields, see [Filtering data events by using advanced event selectors](filtering-data-events.md).

To see whether your event data store includes data events, run the [https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/get-event-data-store.html](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/cloudtrail/get-event-data-store.html) command. 

```
aws cloudtrail get-event-data-store --event-data-store EventDataStoreARN
```

The command returns the settings for the event data store.

```
{
    "EventDataStoreArn": "arn:aws:cloudtrail:us-east-1:111122223333:eventdatastore/EXAMPLE492-301f-4053-ac5e-EXAMPLE6441aa",
    "Name": "ebs-data-events",
    "Status": "ENABLED",
    "AdvancedEventSelectors": [
        {
            "Name": "Log all EBS direct APIs on EBS snapshots",
            "FieldSelectors": [
                {
                    "Field": "eventCategory",
                    "Equals": [
                        "Data"
                    ]
                },
                {
                    "Field": "resources.type",
                    "Equals": [
                        "AWS::EC2::Snapshot"
                    ]
                }
            ]
        }
    ],
    "MultiRegionEnabled": true,
    "OrganizationEnabled": false,
    "BillingMode": "EXTENDABLE_RETENTION_PRICING",
    "RetentionPeriod": 366,
    "TerminationProtectionEnabled": true,
    "CreatedTimestamp": "2023-11-04T15:57:33.701000+00:00",
    "UpdatedTimestamp": "2023-11-20T20:37:34.228000+00:00"
}
```
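To see at a glance which resource types an event data store logs, you can pull the `resources.type` values out of a saved response. A sketch, assuming output like the above has been saved to a hypothetical local file:

```shell
# Hypothetical saved output of:
#   aws cloudtrail get-event-data-store --event-data-store EventDataStoreARN > /tmp/eds.json
cat > /tmp/eds.json <<'EOF'
{
  "Name": "ebs-data-events",
  "AdvancedEventSelectors": [
    { "Name": "Log all EBS direct APIs on EBS snapshots",
      "FieldSelectors": [
        { "Field": "eventCategory", "Equals": ["Data"] },
        { "Field": "resources.type", "Equals": ["AWS::EC2::Snapshot"] }
      ] }
  ]
}
EOF

# List every resources.type value across the store's selectors.
python3 - <<'EOF'
import json
store = json.load(open("/tmp/eds.json"))
types = [value
         for selector in store["AdvancedEventSelectors"]
         for field in selector["FieldSelectors"] if field["Field"] == "resources.type"
         for value in field["Equals"]]
print("resource types:", ", ".join(types))
EOF
```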

**Topics**
+ [Include all Amazon S3 events for a specific bucket](#creating-data-adv-event-selectors-CLI-s3-eds)
+ [Include Amazon S3 on AWS Outposts events](#creating-data-event-selectors-CLI-outposts-eds)

#### Include all Amazon S3 events for a specific bucket
<a name="creating-data-adv-event-selectors-CLI-s3-eds"></a>

The following example shows how to create an event data store that includes all data events for all Amazon S3 objects in a specific general purpose S3 bucket, excludes AWS service events, and excludes events generated by the assumed role `bucket-scanner-role` (filtered on the `userIdentity.arn` field). The value for S3 events for the `resources.type` field is `AWS::S3::Object`. Because the ARN values for S3 objects and S3 buckets are slightly different, you must add the `StartsWith` operator for `resources.ARN` to capture all events.

```
aws cloudtrail create-event-data-store --name "EventDataStoreName" --multi-region-enabled \
--advanced-event-selectors \
'[
    {
        "Name": "S3EventSelector",
        "FieldSelectors": [
            { "Field": "eventCategory", "Equals": ["Data"] },
            { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
            { "Field": "resources.ARN", "StartsWith": ["arn:partition:s3:::amzn-s3-demo-bucket/"] },
            { "Field": "userIdentity.arn", "NotStartsWith": ["arn:aws:sts::123456789012:assumed-role/bucket-scanner-role"]},
            { "Field": "eventType","NotEquals": ["AwsServiceEvent"]}
        ]
    }
]'
```

The command returns the following example output.

```
{
    "EventDataStoreArn": "arn:aws:cloudtrail:us-east-1:111122223333:eventdatastore/EXAMPLE492-301f-4053-ac5e-EXAMPLE441aa",
    "Name": "EventDataStoreName",
    "Status": "ENABLED",
    "AdvancedEventSelectors": [
        {
            "Name": "S3EventSelector",
            "FieldSelectors": [
                {
                    "Field": "eventCategory",
                    "Equals": [
                        "Data"
                    ]
                },
                {
                    "Field": "resources.ARN",
                    "StartsWith": [
                        "arn:partition:s3:::amzn-s3-demo-bucket/"
                    ]
                },
                {
                    "Field": "resources.type",
                    "Equals": [
                        "AWS::S3::Object"
                    ]
                },
                { 
                    "Field": "userIdentity.arn", 
                    "NotStartsWith": [
                        "arn:aws:sts::123456789012:assumed-role/bucket-scanner-role"
                     ]
                },
                { 
                    "Field": "eventType",
                    "NotEquals": [
                        "AwsServiceEvent"
                    ]
                }
            ]
        }
    ],
    "MultiRegionEnabled": true,
    "OrganizationEnabled": false,
    "BillingMode": "EXTENDABLE_RETENTION_PRICING",
    "RetentionPeriod": 366,
    "TerminationProtectionEnabled": true,
    "CreatedTimestamp": "2024-11-04T15:57:33.701000+00:00",
    "UpdatedTimestamp": "2024-11-20T20:49:21.766000+00:00"
}
```

#### Include Amazon S3 on AWS Outposts events
<a name="creating-data-event-selectors-CLI-outposts-eds"></a>

The following example shows how to create an event data store that includes all data events for all Amazon S3 on Outposts objects in your outpost.

```
aws cloudtrail create-event-data-store --name EventDataStoreName \
--advanced-event-selectors \
'[
    {
            "Name": "OutpostsEventSelector",
            "FieldSelectors": [
                { "Field": "eventCategory", "Equals": ["Data"] },
                { "Field": "resources.type", "Equals": ["AWS::S3Outposts::Object"] }
            ]
        }
]'
```

The command returns the following example output.

```
{
    "EventDataStoreArn": "arn:aws:cloudtrail:us-east-1:111122223333:eventdatastore/EXAMPLEb4a8-99b1-4ec2-9258-EXAMPLEc890",
    "Name": "EventDataStoreName",
    "Status": "CREATED",
    "AdvancedEventSelectors": [
        {
            "Name": "OutpostsEventSelector",
            "FieldSelectors": [
                {
                    "Field": "eventCategory",
                    "Equals": [
                        "Data"
                    ]
                },
                {
                    "Field": "resources.type",
                    "Equals": [
                        "AWS::S3Outposts::Object"
                    ]
                }
            ]
        }
    ],
    "MultiRegionEnabled": true,
    "OrganizationEnabled": false,
    "BillingMode": "EXTENDABLE_RETENTION_PRICING",
    "RetentionPeriod": 366,
    "TerminationProtectionEnabled": true,
    "CreatedTimestamp": "2023-02-20T21:00:17.673000+00:00",
    "UpdatedTimestamp": "2023-02-20T21:00:17.820000+00:00"
}
```

# Filtering data events by using advanced event selectors
<a name="filtering-data-events"></a>

This section describes how you can use advanced event selectors to create fine-grained selectors for logging data events, which can help you control costs by only logging the specific data events of interest.

For example:
+ You can include or exclude specific API calls by adding a filter on the `eventName` field.
+ You can include or exclude logging for specific resources by adding a filter on the `resources.ARN` field. For example, if you were logging S3 data events, you could exclude logging for the S3 bucket for your trail.
+ You can choose to log only write-only events or read-only events by adding a filter on the `readOnly` field.
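The filters above can also be combined in a single selector. The following sketch (the file path and bucket ARN are placeholders) logs only write `PutObject` events while excluding the trail's own bucket; note that in advanced event selectors the `readOnly` values are the strings `"true"` and `"false"`.

```shell
# Hypothetical combined filter: write-only PutObject data events for all S3
# objects except those in the trail's own bucket (placeholder ARN).
cat > /tmp/filtered-selectors.json <<'EOF'
[
  {
    "Name": "Write-only PutObject events, excluding the trail bucket",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
      { "Field": "readOnly", "Equals": ["false"] },
      { "Field": "eventName", "Equals": ["PutObject"] },
      { "Field": "resources.ARN", "NotStartsWith": ["arn:aws:s3:::amzn-s3-demo-trail-bucket/"] }
    ]
  }
]
EOF

# Validate locally before applying with put-event-selectors or
# update-event-data-store.
python3 -m json.tool /tmp/filtered-selectors.json > /dev/null && echo "selectors OK"
```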

The following table describes the supported fields for filtering data events. For a list of supported fields for each CloudTrail event type, see [AdvancedEventSelector](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_AdvancedEventSelector.html) in the *AWS CloudTrail API Reference*.


| Field | Required | Valid operators | Description | 
| --- | --- | --- | --- | 
|  **`eventCategory` **  |  Yes  |  `Equals`  |  This field is set to `Data` to log data events.  | 
|  **`resources.type`**  |  Yes  |  `Equals`  |  This field is used to select the resource type for which you want to log data events. The [Data events](logging-data-events-with-cloudtrail.md#logging-data-events) table shows the possible values.  | 
|  **`readOnly`**  |  No  |  `Equals`  |  This is an optional field used to include or exclude data events based on the `readOnly` value. A value of `true` logs only read events. A value of `false` logs only write events. If you do not add this field, CloudTrail logs both read and write events.   | 
|  **`eventName`**  |  No  |  `EndsWith` `Equals` `NotEndsWith` `NotEquals` `NotStartsWith` `StartsWith`  |  This is an optional filed used to ﬁlter in or ﬁlter out any data event logged to CloudTrail, such as `PutBucket` or `GetSnapshotBlock`. If you're using the AWS CLI, you can specify multiple values by separating each value with a comma. If you're using the console, you can specify multiple values by creating a condition for each `eventName` you want to filter on.  | 
|  **`resources.ARN`**  |  No  |  `EndsWith` `Equals` `NotEndsWith` `NotEquals` `NotStartsWith` `StartsWith`  |  This is an optional field used to exclude or include data events for a specific resource by providing the `resources.ARN`. You can use any operator with `resources.ARN`, but if you use `Equals` or `NotEquals`, the value must exactly match the ARN of a valid resource for the `resources.type` you've specified. To log all data events for all objects in a specific S3 bucket, use the `StartsWith` operator, and include only the bucket ARN as the matching value. If you're using the AWS CLI, you can specify multiple values by separating each value with a comma. If you're using the console, you can specify multiple values by creating a condition for each `resources.ARN` you want to filter on.  | 
|  **`eventSource`**  |  No  |  `EndsWith` `Equals` `NotEndsWith` `NotEquals` `NotStartsWith` `StartsWith`  |  This is an optional field used to include or exclude specific event sources. The `eventSource` is typically a short form of the service name without spaces plus `.amazonaws.com`. For example, you could set `eventSource` `Equals` to `ec2.amazonaws.com` to log only Amazon EC2 data events.  | 
|  **`eventType`**  |  No  |  `EndsWith` `Equals` `NotEndsWith` `NotEquals` `NotStartsWith` `StartsWith`  |  The [eventType](cloudtrail-event-reference-record-contents.md#ct-event-type) to include or exclude. For example, you can set this field to `NotEquals` `AwsServiceEvent` to exclude [AWS service events](non-api-aws-service-events.md).  | 
|  **`sessionCredentialFromConsole`**  |  No  |  `Equals` `NotEquals`  |  Include or exclude events originating from an AWS Management Console session. This field can be set to `Equals` or `NotEquals` with a value of `true`.  | 
|  **`userIdentity.arn`**  |  No  |  `EndsWith` `Equals` `NotEndsWith` `NotEquals` `NotStartsWith` `StartsWith`  |  Include or exclude events for actions taken by specific IAM identities. For more information, see [CloudTrail userIdentity element](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-event-reference-user-identity.html).  | 
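
The required and optional fields in the preceding table can also be assembled programmatically before you pass them to the AWS CLI. The following Python sketch shows one way to do this; the `build_selectors` helper and its interface are illustrative assumptions, not part of any AWS SDK.

```python
import json

def build_selectors(name, resource_type, conditions=None):
    """Assemble one advanced event selector for data events.

    `conditions` maps an optional field name (for example "eventName"
    or "resources.ARN") to a dict of {operator: [values]}. The two
    required fields, eventCategory and resources.type, are always set.
    """
    field_selectors = [
        {"Field": "eventCategory", "Equals": ["Data"]},
        {"Field": "resources.type", "Equals": [resource_type]},
    ]
    for field, operators in (conditions or {}).items():
        field_selectors.append({"Field": field, **operators})
    return json.dumps([{"Name": name, "FieldSelectors": field_selectors}])

# Example: log only PutObject calls on S3 objects.
selectors = build_selectors(
    "Log S3 PutObject data events",
    "AWS::S3::Object",
    {"eventName": {"Equals": ["PutObject"]}},
)
```

You could then pass the resulting string as the value of the `--advanced-event-selectors` parameter.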

To log data events using the CloudTrail console, you choose the **Data events** option and then select the **Resource type** of interest when you are creating or updating a trail or event data store. The [Data events](logging-data-events-with-cloudtrail.md#logging-data-events) table shows the possible resource types you can choose on the CloudTrail console.

![\[Selection of the SNS topic resource type on the console.\]](http://docs.aws.amazon.com/awscloudtrail/latest/userguide/images/cloudtrail-data-event-type.png)


To log data events with the AWS CLI, configure the `--advanced-event-selector` parameter to set the `eventCategory` equal to `Data` and the `resources.type` value equal to the resource type value for which you want to log data events. The [Data events](logging-data-events-with-cloudtrail.md#logging-data-events) table lists the available resource types.

For example, if you wanted to log data events for all Amazon Cognito identity pools, you’d configure the `--advanced-event-selectors` parameter to look like this:

```
--advanced-event-selectors '[
    {
       "Name": "Log Cognito data events on Identity pools",
       "FieldSelectors": [
         { "Field": "eventCategory", "Equals": ["Data"] },
         { "Field": "resources.type", "Equals": ["AWS::Cognito::IdentityPool"] }
       ]
     }
]'
```

The preceding example logs all Cognito data events on Identity pools. You can further refine the advanced event selectors to filter on the `eventName`, `readOnly`, and `resources.ARN` fields to log specific events of interest or exclude events that aren’t of interest.

You can configure advanced event selectors to filter data events based on multiple fields. For example, you can configure advanced event selectors to log all Amazon S3 `PutObject` and `DeleteObject` API calls but exclude event logging for a specific S3 bucket as shown in the following example. Replace *amzn-s3-demo-bucket* with the name of your bucket.

```
--advanced-event-selectors
'[
  {
    "Name": "Log PutObject and DeleteObject events for all but one bucket",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
      { "Field": "eventName", "Equals": ["PutObject","DeleteObject"] },
      { "Field": "resources.ARN", "NotStartsWith": ["arn:aws:s3:::amzn-s3-demo-bucket/"] }
    ]
  }
]'
```

You can also include multiple conditions for a field. For information on how multiple conditions are evaluated, see [How CloudTrail evaluates multiple conditions for a field](#filtering-data-events-conditions).

You can use advanced event selectors to log both management and data events. To log data events for multiple resource types, add a field selector statement for each resource type that you want to log data events for.
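
For example, a single `--advanced-event-selectors` value might combine a management event selector with data event selectors for two resource types. The following is a sketch; the selector names are illustrative, and you would adjust the resource types to match your use case.

```
--advanced-event-selectors '[
  {
    "Name": "Log management events",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Management"] }
    ]
  },
  {
    "Name": "Log S3 object data events",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::S3::Object"] }
    ]
  },
  {
    "Name": "Log Lambda data events",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::Lambda::Function"] }
    ]
  }
]'
```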

**Note**  
Trails can use either basic event selectors or advanced event selectors, but not both. If you apply advanced event selectors to a trail, any existing basic event selectors are overwritten.  
Selectors don't support the use of wildcards like `*`. To match multiple values with a single condition, use `StartsWith`, `EndsWith`, `NotStartsWith`, or `NotEndsWith` to explicitly match the beginning or end of the event field.

**Topics**
+ [How CloudTrail evaluates multiple conditions for a field](#filtering-data-events-conditions)
+ [AWS CLI examples for filtering data events](#filtering-data-events-examples)

## How CloudTrail evaluates multiple conditions for a field
<a name="filtering-data-events-conditions"></a>

For advanced event selectors, CloudTrail evaluates multiple conditions for a field as follows:
+ DESELECT operators are AND'd together. If any of the DESELECT operator conditions are met, the event is not delivered. These are the valid DESELECT operators for advanced event selectors:
  + `NotEndsWith`
  + `NotEquals`
  + `NotStartsWith`
+ SELECT operators are OR'd together. These are the valid SELECT operators for advanced event selectors:
  + `EndsWith`
  + `Equals`
  + `StartsWith`
+ Combinations of SELECT and DESELECT operators follow the above rules and both groups are AND'd together.

### Example showing multiple conditions for the `resources.ARN` field
<a name="filtering-data-events-conditions-ex"></a>

The following example event selector statement collects data events for the `AWS::S3::Object` resource type and applies multiple conditions on the `resources.ARN` field.

```
{
    "Name": "S3Select",
    "FieldSelectors": [
      {
        "Field": "eventCategory",
        "Equals": [
          "Data"
        ]
      },
      {
        "Field": "resources.type",
        "Equals": [
          "AWS::S3::Object"
        ]
      },
      {
        "Field": "resources.ARN",
        "Equals": [
          "arn:aws:s3:::amzn-s3-demo-bucket/object1"
        ],
        "StartsWith": [
          "arn:aws:s3:::amzn-s3-demo-bucket/"
        ],
        "EndsWith": [
          "object3"
        ],
        "NotStartsWith": [
          "arn:aws:s3:::amzn-s3-demo-bucket/deselect"
        ],
        "NotEndsWith": [
          "object5"
        ],
        "NotEquals": [
          "arn:aws:s3:::amzn-s3-demo-bucket/object6"
        ]
      }
    ]
  }
```

In the preceding example, Amazon S3 data events for the `AWS::S3::Object` resource will be delivered if: 

1. None of these DESELECT operator conditions are met:
   + the `resources.ARN` field `NotStartsWith` the value `arn:aws:s3:::amzn-s3-demo-bucket/deselect`
   + the `resources.ARN` field `NotEndsWith` the value `object5`
   + the `resources.ARN` field `NotEquals` the value `arn:aws:s3:::amzn-s3-demo-bucket/object6`

1. At least one of these SELECT operator conditions is met: 
   + the `resources.ARN` field `Equals` the value `arn:aws:s3:::amzn-s3-demo-bucket/object1`
   + the `resources.ARN` field `StartsWith` the value `arn:aws:s3:::amzn-s3-demo-bucket/`
   + the `resources.ARN` field `EndsWith` the value `object3`

Based on the evaluation logic:

1. Data events for `amzn-s3-demo-bucket/object1` will be delivered because it matches the value for the `Equals` operator and doesn’t match any of the values for the `NotStartsWith`, `NotEndsWith`, and `NotEquals` operators.

1. Data events for `amzn-s3-demo-bucket/object2` will be delivered because it matches the value for the `StartsWith` operator and doesn’t match any of the values for the `NotStartsWith`, `NotEndsWith`, and `NotEquals` operators.

1. Data events for `amzn-s3-demo-bucket1/object3` will be delivered because it matches the value for the `EndsWith` operator and doesn’t match any of the values for the `NotStartsWith`, `NotEndsWith`, and `NotEquals` operators.

1. Data events for `arn:aws:s3:::amzn-s3-demo-bucket/deselectObject4` will not be delivered because it matches the condition for the `NotStartsWith` operator even though it matches the condition for the `StartsWith` operator.

1. Data events for `arn:aws:s3:::amzn-s3-demo-bucket/object5` will not be delivered because it matches the condition for the `NotEndsWith` operator even though it matches the condition for the `StartsWith` operator.

1. Data events for `arn:aws:s3:::amzn-s3-demo-bucket/object6` will not be delivered because it matches the condition for the `NotEquals` operator even though it matches the condition for the `StartsWith` operator.
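
The outcomes above can be reproduced with a short sketch. The following Python is an illustration of the documented evaluation rules, not CloudTrail's actual implementation: DESELECT operators are AND'd, SELECT operators are OR'd, and the two groups are AND'd together.

```python
def field_matches(value, conditions):
    """Evaluate multiple conditions on one field per the documented rules."""
    # DESELECT: if any NotEquals/NotStartsWith/NotEndsWith condition is
    # met, the event is not delivered.
    if any(value == v for v in conditions.get("NotEquals", [])):
        return False
    if any(value.startswith(v) for v in conditions.get("NotStartsWith", [])):
        return False
    if any(value.endswith(v) for v in conditions.get("NotEndsWith", [])):
        return False
    # SELECT: at least one Equals/StartsWith/EndsWith condition must match.
    selects = (
        [value == v for v in conditions.get("Equals", [])]
        + [value.startswith(v) for v in conditions.get("StartsWith", [])]
        + [value.endswith(v) for v in conditions.get("EndsWith", [])]
    )
    return any(selects) if selects else True

# The resources.ARN conditions from the example selector above.
arn_conditions = {
    "Equals": ["arn:aws:s3:::amzn-s3-demo-bucket/object1"],
    "StartsWith": ["arn:aws:s3:::amzn-s3-demo-bucket/"],
    "EndsWith": ["object3"],
    "NotStartsWith": ["arn:aws:s3:::amzn-s3-demo-bucket/deselect"],
    "NotEndsWith": ["object5"],
    "NotEquals": ["arn:aws:s3:::amzn-s3-demo-bucket/object6"],
}
```

Running `field_matches` over the six ARNs discussed above yields the same delivered/not-delivered results as the numbered list.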

## AWS CLI examples for filtering data events
<a name="filtering-data-events-examples"></a>

This section provides AWS CLI examples showing how to filter data events on different fields. For additional AWS CLI examples, see [Log data events for trails by using advanced event selectors](logging-data-events-with-cloudtrail.md#creating-data-event-selectors-advanced) and [Logging data events for event data stores with the AWS CLI](logging-data-events-with-cloudtrail.md#logging-data-events-CLI-eds-examples).

For information about how to log data events using the console, see [Logging data events with the AWS Management Console](logging-data-events-with-cloudtrail.md#logging-data-events-console).

**Topics**
+ [Example 1: Filtering on the `eventName` field](#filtering-data-events-eventname)
+ [Example 2: Filtering on the `resources.ARN` and `userIdentity.arn` fields](#filtering-data-events-useridentityarn)
+ [Example 3: Filtering on the `resources.type` and `eventName` fields to exclude individual objects deleted by an Amazon S3 DeleteObjects event](#filtering-data-events-deleteobjects)

### Example 1: Filtering on the `eventName` field
<a name="filtering-data-events-eventname"></a>

In the first example, the `--advanced-event-selectors` for a trail are configured to log only the `GetObject`, `PutObject`, and `DeleteObject` API calls for Amazon S3 objects in general purpose buckets.

```
aws cloudtrail put-event-selectors \
--trail-name trailName \
--advanced-event-selectors '[
  {
    "Name": "Log GetObject, PutObject and DeleteObject S3 data events",
    "FieldSelectors": [
      { "Field": "eventCategory", "Equals": ["Data"] },
      { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
      { "Field": "eventName", "Equals": ["GetObject","PutObject","DeleteObject"] }
    ]
  }
]'
```

The next example creates a new event data store that logs data events for EBS Direct APIs but excludes `ListChangedBlocks` API calls. You can use the [https://docs.aws.amazon.com/cli/latest/reference/cloudtrail/update-event-data-store.html](https://docs.aws.amazon.com/cli/latest/reference/cloudtrail/update-event-data-store.html) command to update an existing event data store.

```
aws cloudtrail create-event-data-store \
--name "eventDataStoreName" \
--advanced-event-selectors '[
    {
        "Name": "Log all EBS Direct API data events except ListChangedBlocks",
        "FieldSelectors": [
            { "Field": "eventCategory", "Equals": ["Data"] },
            { "Field": "resources.type", "Equals": ["AWS::EC2::Snapshot"] },
            { "Field": "eventName", "NotEquals": ["ListChangedBlocks"] }
         ]
    }
]'
```

### Example 2: Filtering on the `resources.ARN` and `userIdentity.arn` fields
<a name="filtering-data-events-useridentityarn"></a>

The following example shows how to include all data events for all Amazon S3 objects in a specific general purpose S3 bucket but exclude events generated by the `bucket-scanner-role` `userIdentity`. The value for S3 events for the `resources.type` field is `AWS::S3::Object`. Because the ARN values for S3 objects and S3 buckets are slightly different, you must add the `StartsWith` operator for `resources.ARN`.

```
aws cloudtrail put-event-selectors \
--trail-name trailName \
--advanced-event-selectors \
'[
    {
        "Name": "S3EventSelector",
        "FieldSelectors": [
            { "Field": "eventCategory", "Equals": ["Data"] },
            { "Field": "resources.type", "Equals": ["AWS::S3::Object"] },
            { "Field": "resources.ARN", "StartsWith": ["arn:partition:s3:::amzn-s3-demo-bucket/"] },
            { "Field": "userIdentity.arn", "NotStartsWith": ["arn:aws:sts::123456789012:assumed-role/bucket-scanner-role"]}
        ]
    }
]'
```

### Example 3: Filtering on the `resources.type` and `eventName` fields to exclude individual objects deleted by an Amazon S3 DeleteObjects event
<a name="filtering-data-events-deleteobjects"></a>

The following example shows how to include all data events for all Amazon S3 objects in a specific general purpose Amazon S3 bucket but exclude the individual objects deleted by the `DeleteObject` operation. The value for S3 events for the `resources.type` field is `AWS::S3::Object`. The value for the event name is `DeleteObject`. 

```
aws cloudtrail put-event-selectors \
--trail-name trailName \
--advanced-event-selectors \
'[
  {
    "Name": "Exclude Events for DeleteObject operation",
    "FieldSelectors": [
      {
        "Field": "eventCategory",
        "Equals": [
          "Data"
        ]
      },
      {
        "Field": "resources.type",
        "Equals": [
          "AWS::S3::Object"
        ]
      },
      {
        "Field": "eventName",
        "NotEquals": [
          "DeleteObject"
        ]
      }
    ]
  },
  {
    "Name": "Exclude DeleteObject Events for individual objects deleted by DeleteObjects Operation",
    "FieldSelectors": [
      {
        "Field": "eventCategory",
        "Equals": [
          "Data"
        ]
      },
      {
        "Field": "resources.type",
        "Equals": [
          "AWS::S3::Object"
        ]
      },
      {
        "Field": "eventName",
        "Equals": [
          "DeleteObject"
        ]
      },
      {
        "Field": "eventType",
        "NotEquals": [
          "AwsServiceEvent"
        ]
      }
    ]
  }
]'
```

# Aggregating data events
<a name="aggregating-data-events"></a>

Data events provide information about the resource operations performed on or in a resource. These are also known as data plane operations. Data events are often high-volume activities.

By enabling aggregation on your data events, you can efficiently monitor high-volume data access patterns without processing massive amounts of individual events. This feature automatically consolidates data events into 5-minute summaries, showing key trends like access frequency, error rates, and most-used actions. For example, instead of processing thousands of individual S3 bucket access events to understand usage patterns, you receive consolidated summaries showing top users and actions.

You can enable aggregation on data events when creating a new trail or updating an existing trail that collects data events. You can select one or more of the three out-of-the-box templates for aggregating your data events:
+ **API Activity** to get a 5-minute summary of your data events based on the API calls made. Use this to understand your API usage patterns, including frequency, callers, and source.
+ **Resource Access** to get the activity patterns on your AWS resources. Use this to understand how your AWS resources are being accessed, how many times they are being accessed in the 5-minute window, who is accessing the resource, and what actions are being performed.
+ **User Actions** to get activity patterns based on the IAM principal making API calls in your account.

## Enabling aggregations for data events using the console
<a name="aggregating-data-events-console"></a>

To enable aggregations on trails, first choose data event logging when you are creating or updating a trail, and configure the data events to log. Then, in the configure event aggregation step, select templates such as **API Activity** and **Resource Access** from the **Aggregation templates** dropdown, as shown in the following screenshot.

![\[Screenshot of the CloudTrail console showing the Aggregation templates dropdown with API Activity and Resource Access options selected\]](http://docs.aws.amazon.com/awscloudtrail/latest/userguide/images/Enable-Aggregation-console.png)


## Enabling aggregations for data events using the AWS CLI
<a name="aggregating-data-events-cli"></a>

You can configure your trails to aggregate events using the AWS CLI.

To see whether your trail is aggregating data events, run the `get-event-configuration` command.

```
aws cloudtrail get-event-configuration --region us-east-1 --trail-name TrailName
```

The command returns the aggregation configuration for the trail.

Before you enable event aggregation, you must create a trail and configure data events in it.

To enable event aggregation on a trail, run the following command. In this example, the trail aggregates events based on the `API_ACTIVITY` and `RESOURCE_ACCESS` aggregation templates.

```
aws cloudtrail put-event-configuration --region us-east-1 --trail TrailName \
--aggregation-configurations \
'[
    {
        "EventCategory": "Data",
        "Templates":
        [
            "API_ACTIVITY",
            "RESOURCE_ACCESS"
        ]
    }
]'
```

### Example: `API_ACTIVITY` aggregated event
<a name="aggregating-data-events-api-activity-example"></a>

The following shows an example of an aggregated event for the `API_ACTIVITY` template:

```
{
    "eventVersion": "1.0",
    "accountId": "111122223333",
    "eventId": "62759c1a-6248-48e1-a6b3-d5fb7e6c4bc0",
    "eventCategory": "Aggregated",
    "eventType": "AwsAggregatedEvent",
    "awsRegion": "us-west-2",
    "eventSource": "s3.amazonaws.com",
    "timeWindow":
    {
        "windowStart": "2025-11-17T19:20:00Z",
        "windowEnd": "2025-11-17T19:25:00Z",
        "windowSize": "PT5M"
    },
    "summary":
    {
        "primaryDimension":
        {
            "dimension": "eventName",
            "statistics":
            [
                {
                    "name": "PutObject",
                    "value": 1000
                }
            ],
            "aggregationType": "Count"
        },
        "details":
        [
            {
                "dimension": "resourceARN",
                "statistics":
                [
                    {
                        "name": "arn:aws:s3:::bucket-1",
                        "value": 800
                    },
                    {
                        "name": "arn:aws:s3:::bucket-2",
                        "value": 150
                    },
                    {
                        "name": "arn:aws:s3:::bucket-3",
                        "value": 50
                    }
                ],
                "aggregationType": "Count"
            }
        ]
    }
}
```
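
A consumer of aggregated events can walk the `summary` structure shown above instead of processing individual events. The following Python sketch assumes only the field layout in the example and flattens the primary dimension and each detail dimension into rows.

```python
def summarize(aggregated_event):
    """Flatten an aggregated event's summary into (dimension, name, count) rows."""
    summary = aggregated_event["summary"]
    dimensions = [summary["primaryDimension"]] + summary.get("details", [])
    return [
        (dim["dimension"], stat["name"], stat["value"])
        for dim in dimensions
        for stat in dim["statistics"]
    ]

# Trimmed version of the API_ACTIVITY example above.
event = {
    "summary": {
        "primaryDimension": {
            "dimension": "eventName",
            "statistics": [{"name": "PutObject", "value": 1000}],
            "aggregationType": "Count",
        },
        "details": [
            {
                "dimension": "resourceARN",
                "statistics": [
                    {"name": "arn:aws:s3:::bucket-1", "value": 800},
                    {"name": "arn:aws:s3:::bucket-2", "value": 150},
                ],
                "aggregationType": "Count",
            }
        ],
    }
}
```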

### Example: `RESOURCE_ACCESS` aggregated event
<a name="aggregating-data-events-resource-access-example"></a>

The following shows an example of an aggregated event for the `RESOURCE_ACCESS` template:

```
{
    "eventVersion": "1.0",
    "accountId": "111122223333",
    "eventId": "2ed87efa-45c1-412d-bc38-7e0879faa6df",
    "eventCategory": "Aggregated",
    "eventType": "AwsAggregatedEvent",
    "awsRegion": "us-west-2",
    "eventSource": "s3.amazonaws.com",
    "timeWindow":
    {
        "windowStart": "2025-11-17T19:20:00Z",
        "windowEnd": "2025-11-17T19:25:00Z",
        "windowSize": "PT5M"
    },
    "summary":
    {
        "primaryDimension":
        {
            "dimension": "resourceARN",
            "statistics":
            [
                {
                    "name": "arn:aws:s3:::bucket-1",
                    "value": 800
                }
            ],
            "aggregationType": "Count"
        },
        "details":
        [
            {
                "dimension": "eventName",
                "statistics":
                [
                    {
                        "name": "PutObject",
                        "value": 800
                    }
                ],
                "aggregationType": "Count"
            }
        ]
    }
}
```

## Logging data events for AWS Config compliance
<a name="config-data-events-best-practices"></a>

If you are using AWS Config conformance packs to help your enterprise maintain compliance with formalized standards such as those required by Federal Risk and Authorization Management Program (FedRAMP) or National Institute of Standards and Technology (NIST), conformance packs for compliance frameworks generally require you to log data events for Amazon S3 buckets, at minimum. Conformance packs for compliance frameworks include a [managed rule](https://docs.aws.amazon.com/config/latest/developerguide/evaluate-config_use-managed-rules.html) called [https://docs.aws.amazon.com/config/latest/developerguide/cloudtrail-s3-dataevents-enabled.html](https://docs.aws.amazon.com/config/latest/developerguide/cloudtrail-s3-dataevents-enabled.html) that checks for S3 data event logging in your account. Many conformance packs that are not associated with compliance frameworks also require S3 data event logging. The following are examples of conformance packs that include this rule.
+ [Operational Best Practices for AWS Well-Architected Framework Security Pillar](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-wa-Security-Pillar.html)
+ [Operational Best Practices for FDA Title 21 CFR Part 11](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-FDA-21CFR-Part-11.html)
+ [Operational Best Practices for FFIEC](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-ffiec.html)
+ [Operational Best Practices for FedRAMP(Moderate)](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-fedramp-moderate.html)
+ [Operational Best Practices for HIPAA Security](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-hipaa_security.html)
+ [Operational Best Practices for K-ISMS](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-k-isms.html)
+ [Operational Best Practices for Logging](https://docs.aws.amazon.com/config/latest/developerguide/operational-best-practices-for-logging.html)

For a full list of sample conformance packs available in AWS Config, see [Conformance pack sample templates](https://docs.aws.amazon.com/config/latest/developerguide/conformancepack-sample-templates.html) in the *AWS Config Developer Guide.*

## Logging data events with the AWS SDKs
<a name="logging-data-events-with-the-AWS-SDKs"></a>

Run the [GetEventSelectors](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_GetEventSelectors.html) operation to see whether your trail is logging data events. You can configure your trails to log data events by running the [PutEventSelectors](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_PutEventSelectors.html) operation. For more information, see the [AWS CloudTrail API Reference](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/).

Run the [GetEventDataStore](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_GetEventDataStore.html) operation to see whether your event data store is logging data events. You can configure your event data stores to include data events by running the [CreateEventDataStore](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_CreateEventDataStore.html) or [UpdateEventDataStore](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/API_UpdateEventDataStore.html) operations and specifying advanced event selectors. For more information, see [Create, update, and manage event data stores with the AWS CLI](lake-eds-cli.md) and the [AWS CloudTrail API Reference](https://docs.aws.amazon.com/awscloudtrail/latest/APIReference/).