

# Data protection
<a name="a-data-protection"></a>

**Topics**
+ [SEC 7. How do you classify your data?](sec-07.md)
+ [SEC 8. How do you protect your data at rest?](sec-08.md)
+ [SEC 9. How do you protect your data in transit?](sec-09.md)

# SEC 7. How do you classify your data?
<a name="sec-07"></a>

Classification provides a way to categorize data based on criticality and sensitivity, which helps you determine appropriate protection and retention controls.

**Topics**
+ [SEC07-BP01 Identify the data within your workload](sec_data_classification_identify_data.md)
+ [SEC07-BP02 Define data protection controls](sec_data_classification_define_protection.md)
+ [SEC07-BP03 Automate identification and classification](sec_data_classification_auto_classification.md)
+ [SEC07-BP04 Define data lifecycle management](sec_data_classification_lifecycle_management.md)

# SEC07-BP01 Identify the data within your workload
<a name="sec_data_classification_identify_data"></a>

It’s critical to understand the type and classification of data your workload processes, the associated business processes, where the data is stored, and who the data owner is. You should also understand the applicable legal and compliance requirements of your workload, and which data controls need to be enforced. Identifying data is the first step in the data classification journey. 

**Benefits of establishing this best practice:**

 Data classification allows workload owners to identify locations that store sensitive data and determine how that data should be accessed and shared. 

 Data classification aims to answer the following questions: 
+ **What type of data do you have?**

  This could be data such as: 
  +  Intellectual property (IP), such as trade secrets, patents, or contract agreements. 
  +  Protected health information (PHI), such as medical records that contain medical history information connected to an individual. 
  +  Personally identifiable information (PII), such as name, address, date of birth, and national ID or registration number. 
  +  Credit card data, such as the primary account number (PAN), cardholder name, expiration date, and service code number. 
+ **Where is the sensitive data stored?**
+ **Who can access, modify, and delete data?**
  +  Understanding user permissions is essential in guarding against potential data mishandling. 
+ **Who can perform create, read, update, and delete (CRUD) operations?**
  +  Account for potential escalation of privileges by understanding who can manage permissions to the data. 
+ **What business impact might occur if the data is disclosed unintentionally, altered, or deleted?**
  +  Understand the risk consequence if data is modified, deleted, or inadvertently disclosed. 

By knowing the answers to these questions, you can take the following actions: 
+  Decrease sensitive data scope (such as the number of sensitive data locations) and limit access to sensitive data to only approved users. 
+  Gain an understanding of different data types so that you can implement appropriate data protection mechanisms and techniques, such as encryption, data loss prevention, and identity and access management. 
+  Optimize costs by delivering the right control objectives for the data. 
+  Confidently answer questions from regulators and auditors regarding the types and amount of data, and how data of different sensitivity levels is isolated from other data. 

 **Level of risk exposed if this best practice is not established**: High 

## Implementation guidance
<a name="implementation-guidance"></a>

 Data classification is the act of identifying the sensitivity of data. It might involve tagging to make the data easily searchable and trackable. Data classification also reduces the duplication of data, which can help reduce storage and backup costs while speeding up the search process. 

 Use services such as Amazon Macie to automate at scale both the discovery and classification of sensitive data. Other services, such as Amazon EventBridge and AWS Config, can be used to automate remediation for data security issues such as unencrypted Amazon Simple Storage Service (Amazon S3) buckets and Amazon Elastic Block Store (Amazon EBS) volumes, or untagged data resources. For a complete list of AWS service integrations, see the [EventBridge documentation](https://docs.aws.amazon.com/eventbridge/latest/userguide/event-types.html). 

 [Detecting PII](https://docs.aws.amazon.com/comprehend/latest/dg/how-pii.html) in unstructured data such as customer emails, support tickets, product reviews, and social media, is possible by [using Amazon Comprehend](https://aws.amazon.com/blogs/machine-learning/detecting-and-redacting-pii-using-amazon-comprehend/), which is a natural language processing (NLP) service that uses machine learning (ML) to find insights and relationships like people, places, sentiments, and topics in unstructured text. For a list of AWS services that can assist with data identification, see [Common techniques to detect PHI and PII data using AWS services](https://aws.amazon.com/blogs/industries/common-techniques-to-detect-phi-and-pii-data-using-aws-services/). 

 Another method that supports data classification and protection is [AWS resource tagging](https://docs.aws.amazon.com/general/latest/gr/aws_tagging.html). Tagging allows you to assign metadata to your AWS resources that you can use to manage, identify, organize, search for, and filter resources. 

 In some cases, you might choose to tag entire resources (such as an S3 bucket), especially when a specific workload or service is expected to store, process, or transmit data of an already known classification. 

 Where appropriate, you can tag an S3 bucket instead of individual objects for ease of administration and security maintenance. 
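As a sketch of this tagging approach, the following builds a classification tag set for an S3 bucket. The `DataClassification` and `DataOwner` tag keys and the classification values are illustrative conventions, not AWS-defined names; adapt them to your organization's schema.

```python
# Build the tagging payload for an S3 bucket that stores data of a known
# classification. The TagSet shape follows the S3 PutBucketTagging API;
# the tag keys and allowed values here are illustrative conventions.
def classification_tagging(classification, owner):
    allowed = {"Public", "Internal", "Confidential", "Restricted"}
    if classification not in allowed:
        raise ValueError(f"Unknown classification: {classification}")
    return {
        "TagSet": [
            {"Key": "DataClassification", "Value": classification},
            {"Key": "DataOwner", "Value": owner},
        ]
    }

tagging = classification_tagging("Confidential", "payments-team")
# The payload would then be applied with boto3, for example:
#   s3 = boto3.client("s3")
#   s3.put_bucket_tagging(Bucket="example-bucket", Tagging=tagging)
```

Validating the classification value before tagging keeps the tag vocabulary consistent, which matters later if tags drive authorization or lifecycle decisions.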

### Implementation steps
<a name="implementation-steps"></a>

**Detect sensitive data within Amazon S3:**

1.  Before starting, make sure you have the appropriate permissions to access the Amazon Macie console and API operations. For additional details, see [Getting started with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/getting-started.html). 

1.  Use Amazon Macie to perform automated data discovery when your sensitive data resides in [Amazon S3](https://aws.amazon.com/s3/). 
   +  Use the [Getting Started with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/getting-started.html) guide to configure a repository for sensitive data discovery results and create a discovery job for sensitive data. 
   +  [How to use Amazon Macie to preview sensitive data in S3 buckets](https://aws.amazon.com/blogs/security/how-to-use-amazon-macie-to-preview-sensitive-data-in-s3-buckets/) 

      By default, Macie analyzes objects by using the set of managed data identifiers that we recommend for automated sensitive data discovery. You can tailor the analysis by configuring Macie to use specific managed data identifiers, custom data identifiers, and allow lists when it performs automated sensitive data discovery for your account or organization. You can adjust the scope of the analysis by excluding specific buckets (for example, S3 buckets that typically store AWS logging data). 

1.  To configure and use automated sensitive data discovery, see [Performing automated sensitive data discovery with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/discovery-asdd-account-manage.html). 

1.  You might also consider [Automated Data Discovery for Amazon Macie](https://aws.amazon.com/blogs/aws/automated-data-discovery-for-amazon-macie/). 
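The steps above can be sketched as a classification job request. The parameter names follow the Amazon Macie `CreateClassificationJob` API; the account ID, bucket names, and the choice of buckets to exclude (such as buckets holding logging data) are placeholders.

```python
# Sketch of the parameters for an Amazon Macie classification job that
# scans selected buckets while excluding buckets that typically store
# logging data. Account ID and bucket names are placeholders.
def macie_job_params(account_id, buckets, exclude_buckets):
    scan = [b for b in buckets if b not in set(exclude_buckets)]
    return {
        "name": "sensitive-data-discovery",
        "jobType": "ONE_TIME",
        # Use the managed data identifiers recommended for automated
        # sensitive data discovery.
        "managedDataIdentifierSelector": "RECOMMENDED",
        "s3JobDefinition": {
            "bucketDefinitions": [{"accountId": account_id, "buckets": scan}]
        },
    }

params = macie_job_params(
    "111122223333",
    ["app-data", "customer-uploads", "cloudtrail-logs"],
    ["cloudtrail-logs"],
)
# The payload would then be passed to boto3, for example:
#   boto3.client("macie2").create_classification_job(**params)
```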

**Detect sensitive data within Amazon RDS:**

 For more information on data discovery in [Amazon Relational Database Service (Amazon RDS)](https://aws.amazon.com/rds/) databases, see [Enabling data classification for Amazon RDS database with Macie](https://aws.amazon.com/blogs/security/enabling-data-classification-for-amazon-rds-database-with-amazon-macie/). 

**Detect sensitive data within DynamoDB:**
+  [Detecting sensitive data in DynamoDB with Macie](https://aws.amazon.com/blogs/security/detecting-sensitive-data-in-dynamodb-with-macie/) explains how to use Amazon Macie to detect sensitive data in [Amazon DynamoDB](https://aws.amazon.com/dynamodb/) tables by exporting the data to Amazon S3 for scanning. 

**AWS Partner solutions:**
+  Consider using the AWS Partner Network. AWS Partners offer tools and compliance frameworks that directly integrate with AWS services, and can provide a tailored governance and compliance solution to help you meet your organizational needs. 
+  For customized solutions in data classification, see [Data governance in the age of regulation and compliance requirements](https://aws.amazon.com/big-data/featured-partner-solutions-data-governance-compliance/). 

+  You can automatically enforce the tagging standards that your organization adopts by creating and deploying policies using AWS Organizations. Tag policies let you specify rules that define valid key names and what values are valid for each key. You can choose to monitor only, which gives you an opportunity to evaluate and clean up your existing tags. After your tags are in compliance with your chosen standards, you can turn on enforcement in the tag policies to prevent non-compliant tags from being created. For more details, see [Securing resource tags used for authorization using a service control policy in AWS Organizations](https://aws.amazon.com/blogs/security/securing-resource-tags-used-for-authorization-using-service-control-policy-in-aws-organizations/) and the example policy on [preventing tags from being modified except by authorized principals](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps_examples_tagging.html#example-require-restrict-tag-mods-to-admin). 
+  To begin using tag policies in [AWS Organizations](https://aws.amazon.com/organizations/), it’s strongly recommended that you follow the workflow in [Getting started with tag policies](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_tag-policies-getting-started.html) before moving on to more advanced tag policies. Understanding the effects of attaching a simple tag policy to a single account before expanding to an entire organizational unit (OU) or organization allows you to see a tag policy’s effects before you enforce compliance with the tag policy. [Getting started with tag policies](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_tag-policies-getting-started.html) provides links to instructions for more advanced policy-related tasks. 
+  Consider evaluating other [AWS services and features](https://docs.aws.amazon.com/whitepapers/latest/data-classification/using-aws-cloud-to-support-data-classification.html#aws-services-and-features) that support data classification, which are listed in the [Data Classification](https://docs.aws.amazon.com/whitepapers/latest/data-classification/data-classification.html) whitepaper. 
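A minimal tag policy along these lines is sketched below as a Python dict for illustration. The `@@assign` operators are Organizations tag policy syntax; the `DataClassification` tag key, the value list, and the choice to enforce only on S3 buckets are assumptions to adapt.

```python
import json

# Sketch of an AWS Organizations tag policy that standardizes a
# DataClassification tag key and its allowed values. The tag key and
# value list are illustrative; @@assign is tag policy syntax.
TAG_POLICY = {
    "tags": {
        "DataClassification": {
            "tag_key": {"@@assign": "DataClassification"},
            "tag_value": {"@@assign": ["Public", "Internal", "Confidential"]},
            # Turn on enforcement for these resource types only after
            # monitor-only mode shows your existing tags are compliant.
            "enforced_for": {"@@assign": ["s3:bucket"]},
        }
    }
}

policy_document = json.dumps(TAG_POLICY)
# The JSON document would then be attached through AWS Organizations,
# for example with organizations.create_policy(Type="TAG_POLICY", ...).
```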

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [Getting Started with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/getting-started.html) 
+  [Automated data discovery with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/discovery-asdd.html) 
+  [Getting started with tag policies](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_tag-policies-getting-started.html) 
+  [Detecting PII entities](https://docs.aws.amazon.com/comprehend/latest/dg/how-pii.html) 

 **Related blogs:** 
+  [How to use Amazon Macie to preview sensitive data in S3 buckets](https://aws.amazon.com/blogs/security/how-to-use-amazon-macie-to-preview-sensitive-data-in-s3-buckets/) 
+  [Automated Data Discovery for Amazon Macie](https://aws.amazon.com/blogs/aws/automated-data-discovery-for-amazon-macie/) 
+  [Common techniques to detect PHI and PII data using AWS Services](https://aws.amazon.com/blogs/industries/common-techniques-to-detect-phi-and-pii-data-using-aws-services/) 
+  [Detecting and redacting PII using Amazon Comprehend](https://aws.amazon.com/blogs/machine-learning/detecting-and-redacting-pii-using-amazon-comprehend/) 
+  [Securing resource tags used for authorization using a service control policy in AWS Organizations](https://aws.amazon.com/blogs/security/securing-resource-tags-used-for-authorization-using-service-control-policy-in-aws-organizations/) 
+  [Enabling data classification for Amazon RDS database with Macie](https://aws.amazon.com/blogs/security/enabling-data-classification-for-amazon-rds-database-with-amazon-macie/) 
+  [Detecting sensitive data in DynamoDB with Macie](https://aws.amazon.com/blogs/security/detecting-sensitive-data-in-dynamodb-with-macie/) 

 **Related videos:** 
+  [Event-driven data security using Amazon Macie](https://www.youtube.com/watch?v=onqA7MJssoU) 
+  [Amazon Macie for data protection and governance](https://www.youtube.com/watch?v=SmMSt0n6a4k) 
+  [Fine-tune sensitive data findings with allow lists](https://www.youtube.com/watch?v=JmQ_Hybh2KI) 

# SEC07-BP02 Define data protection controls
<a name="sec_data_classification_define_protection"></a>

 Protect data according to its classification level. For example, secure data classified as public by using relevant recommendations, while protecting sensitive data with additional controls. 

By using resource tags, separate AWS accounts per sensitivity (and potentially also for each caveat, enclave, or community of interest), IAM policies, AWS Organizations SCPs, AWS Key Management Service (AWS KMS), and AWS CloudHSM, you can define and implement your policies for data classification and protection with encryption. For example, if you have a project with S3 buckets that contain highly critical data, or Amazon Elastic Compute Cloud (Amazon EC2) instances that process confidential data, they can be tagged with a `Project=ABC` tag. Only your immediate team knows what the project code means, and the tag provides a way to use attribute-based access control. You can define levels of access to the AWS KMS encryption keys through key policies and grants to ensure that only appropriate services have access to the sensitive content through a secure mechanism. If you make authorization decisions based on tags, make sure that the permissions on the tags are defined appropriately using tag policies in AWS Organizations.
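As a sketch of this tag-based pattern, a KMS key policy statement could allow use of the key only to principals carrying the matching project tag. The account ID, tag key, and project code below are placeholders, and the statement is a simplified illustration rather than a complete key policy.

```python
# Sketch of a KMS key policy statement that allows encrypt/decrypt only
# to principals tagged with the same project code as the data they
# access. Account ID, tag key, and project code are placeholders; a real
# key policy also needs an administrative statement for key management.
def project_key_policy_statement(account_id, project_code):
    return {
        "Sid": "AllowUseByProjectMembers",
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{account_id}:root"},
        "Action": ["kms:Encrypt", "kms:Decrypt", "kms:GenerateDataKey*"],
        "Resource": "*",
        "Condition": {
            "StringEquals": {"aws:PrincipalTag/Project": project_code}
        },
    }

statement = project_key_policy_statement("111122223333", "ABC")
```

Because the condition keys off a principal tag, controlling who may set or modify the `Project` tag (for example, with tag policies and SCPs) becomes part of the access control boundary.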

 **Level of risk exposed if this best practice is not established:** High 

## Implementation guidance
<a name="implementation-guidance"></a>
+  Define your data identification and classification schema: Identify and classify your data to assess the potential impact and type of data you store, and who can access it. 
  +  [AWS Documentation](https://docs.aws.amazon.com/) 
+  Discover available AWS controls: For the AWS services you are or plan to use, discover the security controls. Many services have a security section in their documentation. 
  +  [AWS Documentation](https://docs.aws.amazon.com/) 
+  Identify AWS compliance resources: Identify resources that AWS has available to assist. 
  +  [https://aws.amazon.com/compliance/](https://aws.amazon.com/compliance/?ref=wellarchitected) 

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [AWS Documentation](https://docs.aws.amazon.com/) 
+  [Data Classification whitepaper](https://docs.aws.amazon.com/whitepapers/latest/data-classification/data-classification.html) 
+  [Getting started with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/getting-started.html) 
+  [AWS Compliance](https://aws.amazon.com/compliance/) 

 **Related videos:** 
+  [Introducing the New Amazon Macie](https://youtu.be/I-ewoQekdXE) 

# SEC07-BP03 Automate identification and classification
<a name="sec_data_classification_auto_classification"></a>

 Automating the identification and classification of data can help you implement the correct controls. Using automation for this instead of direct access from a person reduces the risk of human error and exposure. You should evaluate using a tool, such as [Amazon Macie](https://aws.amazon.com/macie/), that uses machine learning to automatically discover, classify, and protect sensitive data in AWS. Amazon Macie recognizes sensitive data, such as personally identifiable information (PII) or intellectual property, and provides you with dashboards and alerts that give visibility into how this data is being accessed or moved. 

 **Level of risk exposed if this best practice is not established:** Medium 

## Implementation guidance
<a name="implementation-guidance"></a>
+  Use Amazon Simple Storage Service (Amazon S3) Inventory: Amazon S3 inventory is one of the tools you can use to audit and report on the replication and encryption status of your objects. 
  +  [Amazon S3 Inventory](https://docs.aws.amazon.com/AmazonS3/latest/dev/storage-inventory.html) 
+  Consider Amazon Macie: Amazon Macie uses machine learning to automatically discover and classify data stored in Amazon S3.
  +  [Amazon Macie](https://aws.amazon.com/macie/) 

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [Amazon Macie](https://aws.amazon.com/macie/) 
+  [Amazon S3 Inventory](https://docs.aws.amazon.com/AmazonS3/latest/dev/storage-inventory.html) 
+  [Data Classification Whitepaper](https://docs.aws.amazon.com/whitepapers/latest/data-classification/data-classification.html) 
+  [Getting started with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/getting-started.html) 

 **Related videos:** 
+  [Introducing the New Amazon Macie](https://youtu.be/I-ewoQekdXE) 

# SEC07-BP04 Define data lifecycle management
<a name="sec_data_classification_lifecycle_management"></a>

 Your defined lifecycle strategy should be based on sensitivity level as well as legal and organization requirements. Aspects including the duration for which you retain data, data destruction processes, data access management, data transformation, and data sharing should be considered. When choosing a data classification methodology, balance usability versus access. You should also accommodate the multiple levels of access and nuances for implementing a secure, but still usable, approach for each level. Always use a defense in depth approach and reduce human access to data and mechanisms for transforming, deleting, or copying data. For example, require users to strongly authenticate to an application, and give the application, rather than the users, the requisite access permission to perform action at a distance. In addition, ensure that users come from a trusted network path and require access to the decryption keys. Use tools, such as dashboards and automated reporting, to give users information from the data rather than giving them direct access to the data. 

 **Level of risk exposed if this best practice is not established:** Low 

## Implementation guidance
<a name="implementation-guidance"></a>
+  Identify data types: Identify the types of data that you are storing or processing in your workload. That data could be text, images, binary databases, and so forth. 
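One way to tie retention to classification, sketched below, is an S3 lifecycle configuration that archives and eventually expires objects carrying a given classification tag. The tag key, tag value, storage class, and day counts are illustrative assumptions to adapt to your legal and organizational retention requirements.

```python
# Sketch of an S3 lifecycle configuration that transitions objects with
# a given classification tag to archival storage and later deletes them.
# Tag key/value and day counts are placeholders.
def lifecycle_for_classification(tag_value, archive_days, expire_days):
    if archive_days >= expire_days:
        raise ValueError("objects must be archived before they expire")
    return {
        "Rules": [{
            "ID": f"retention-{tag_value.lower()}",
            "Filter": {"Tag": {"Key": "DataClassification", "Value": tag_value}},
            "Status": "Enabled",
            "Transitions": [{"Days": archive_days, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": expire_days},
        }]
    }

config = lifecycle_for_classification("Internal", 90, 365)
# The configuration would then be applied with boto3, for example:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="example-bucket", LifecycleConfiguration=config)
```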

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [Data Classification Whitepaper](https://docs.aws.amazon.com/whitepapers/latest/data-classification/data-classification.html) 
+  [Getting started with Amazon Macie](https://docs.aws.amazon.com/macie/latest/user/getting-started.html) 

 **Related videos:** 
+  [Introducing the New Amazon Macie](https://youtu.be/I-ewoQekdXE) 

# SEC 8. How do you protect your data at rest?
<a name="sec-08"></a>

Protect your data at rest by implementing multiple controls to reduce the risk of unauthorized access or mishandling.

**Topics**
+ [SEC08-BP01 Implement secure key management](sec_protect_data_rest_key_mgmt.md)
+ [SEC08-BP02 Enforce encryption at rest](sec_protect_data_rest_encrypt.md)
+ [SEC08-BP03 Automate data at rest protection](sec_protect_data_rest_automate_protection.md)
+ [SEC08-BP04 Enforce access control](sec_protect_data_rest_access_control.md)
+ [SEC08-BP05 Use mechanisms to keep people away from data](sec_protect_data_rest_use_people_away.md)

# SEC08-BP01 Implement secure key management
<a name="sec_protect_data_rest_key_mgmt"></a>

 Secure key management includes the storage, rotation, access control, and monitoring of key material required to secure data at rest for your workload. 

 **Desired outcome:** A scalable, repeatable, and automated key management mechanism. The mechanism should enforce least privilege access to key material and provide the correct balance among key availability, confidentiality, and integrity. Access to keys should be monitored, and key material rotated through an automated process. Key material should never be accessible to human identities. 

**Common anti-patterns:** 
+  Human access to unencrypted key material. 
+  Creating custom cryptographic algorithms. 
+  Overly broad permissions to access key material. 

 **Benefits of establishing this best practice:** By establishing a secure key management mechanism for your workload, you can help provide protection for your content against unauthorized access. Additionally, you may be subject to regulatory requirements to encrypt your data. An effective key management solution can provide technical mechanisms aligned to those regulations to protect key material. 

 **Level of risk exposed if this best practice is not established:** High 

## Implementation guidance
<a name="implementation-guidance"></a>

 Many regulatory requirements and best practices include encryption of data at rest as a fundamental security control. In order to comply with this control, your workload needs a mechanism to securely store and manage the key material used to encrypt your data at rest. 

 AWS offers AWS Key Management Service (AWS KMS) to provide durable, secure, and redundant storage for AWS KMS keys. [Many AWS services integrate with AWS KMS](https://aws.amazon.com/kms/features/#integration) to support encryption of your data. AWS KMS uses FIPS 140-2 Level 3 validated hardware security modules to protect your keys. There is no mechanism to export AWS KMS keys in plain text. 

 When deploying workloads using a multi-account strategy, it is considered [best practice](https://docs.aws.amazon.com/prescriptive-guidance/latest/security-reference-architecture/application.html#app-kms) to keep AWS KMS keys in the same account as the workload that uses them. In this distributed model, responsibility for managing the AWS KMS keys resides with the application team. In other use cases, organizations may choose to store AWS KMS keys in a centralized account. This centralized structure requires additional policies to enable the cross-account access required for the workload account to access keys stored in the centralized account, but may be more applicable in use cases where a single key is shared across multiple AWS accounts. 

 Regardless of where the key material is stored, access to the key should be tightly controlled through the use of [key policies](https://docs.aws.amazon.com/kms/latest/developerguide/key-policies.html) and IAM policies. Key policies are the primary way to control access to an AWS KMS key. In addition, AWS KMS key grants can provide access to AWS services to encrypt and decrypt data on your behalf. Take time to review the [best practices for access control to your AWS KMS keys](https://docs.aws.amazon.com/kms/latest/developerguide/iam-policies-best-practices.html). 

 It is best practice to monitor the use of encryption keys to detect unusual access patterns. Operations performed using AWS managed keys and customer managed keys stored in AWS KMS can be logged in AWS CloudTrail and should be reviewed periodically. Special attention should be placed on monitoring key destruction events. To mitigate accidental or malicious destruction of key material, key destruction events do not delete the key material immediately. Attempts to delete keys in AWS KMS are subject to a [waiting period](https://docs.aws.amazon.com/kms/latest/developerguide/deleting-keys.html#deleting-keys-how-it-works), which defaults to 30 days, providing administrators time to review these actions and roll back the request if necessary. 
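Monitoring for key destruction events, as described above, can be sketched as an EventBridge rule pattern that matches KMS management events delivered through CloudTrail. The rule name is a placeholder, and the exact set of event names to watch is a judgment call for your organization.

```python
import json

# Sketch of an EventBridge event pattern that matches attempts to
# schedule deletion of (or disable) a KMS key, so administrators can be
# alerted and review the request during the waiting period.
KEY_DESTRUCTION_PATTERN = {
    "source": ["aws.kms"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["kms.amazonaws.com"],
        "eventName": ["ScheduleKeyDeletion", "DisableKey"],
    },
}

pattern_json = json.dumps(KEY_DESTRUCTION_PATTERN)
# The pattern would then be attached to a rule, for example:
#   events.put_rule(Name="kms-key-deletion-alert",
#                   EventPattern=pattern_json)
# with a target such as an SNS topic that notifies administrators.
```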

 Most AWS services use AWS KMS in a way that is transparent to you: your only requirement is to decide whether to use an AWS managed or customer managed key. If your workload requires the direct use of AWS KMS to encrypt or decrypt data, the best practice is to use [envelope encryption](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#enveloping) to protect your data. The [AWS Encryption SDK](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/introduction.html) can provide your applications client-side encryption primitives to implement envelope encryption and integrate with AWS KMS. 
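The envelope encryption pattern can be illustrated with a toy sketch: a random per-message data key encrypts the payload, and the data key itself is encrypted (wrapped) under a master key and stored alongside the ciphertext. The HMAC-based keystream below is a stand-in cipher used only to keep the example self-contained and runnable; it is not for production use. In practice, the AWS Encryption SDK and AWS KMS perform these steps with authenticated ciphers and the master key never leaves the HSM.

```python
import hashlib
import hmac
import secrets

def _keystream_xor(key, data):
    # Toy stream cipher: XOR with an HMAC-SHA256 counter keystream.
    # Stand-in for AES; for illustration only, not production use.
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        ks = hmac.new(key, block.to_bytes(8, "big"), hashlib.sha256).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out += bytes(a ^ b for a, b in zip(chunk, ks))
    return bytes(out)

def envelope_encrypt(master_key, plaintext):
    data_key = secrets.token_bytes(32)               # per-message data key
    ciphertext = _keystream_xor(data_key, plaintext)  # encrypt the payload
    wrapped = _keystream_xor(master_key, data_key)    # wrap the data key
    return {"wrapped_key": wrapped, "ciphertext": ciphertext}

def envelope_decrypt(master_key, envelope):
    data_key = _keystream_xor(master_key, envelope["wrapped_key"])
    return _keystream_xor(data_key, envelope["ciphertext"])

master = secrets.token_bytes(32)
env = envelope_encrypt(master, b"sensitive record")
assert envelope_decrypt(master, env) == b"sensitive record"
```

The key point the sketch shows is that only the small wrapped data key needs the master key; bulk data is encrypted locally, which is why envelope encryption scales to large payloads without sending them to AWS KMS.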

### Implementation steps
<a name="implementation-steps"></a>

1.  Determine the appropriate [key management options](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#key-mgmt) (AWS managed or customer managed) for the key. 
   +  For ease of use, AWS offers AWS owned and AWS managed keys for most services, which provide encryption-at-rest capability without the need to manage key material or key policies. 
   +  When using customer managed keys, consider the default key store to provide the best balance between agility, security, data sovereignty, and availability. Other use cases may require the use of custom key stores with [AWS CloudHSM](https://aws.amazon.com/cloudhsm/) or the [external key store](https://docs.aws.amazon.com/kms/latest/developerguide/keystore-external.html). 

1.  Review the list of services that you are using for your workload to understand how AWS KMS integrates with the service. For example, EC2 instances can use encrypted EBS volumes; verify that Amazon EBS snapshots created from those volumes are also encrypted using a customer managed key, which mitigates accidental disclosure of unencrypted snapshot data. 
   +  [How AWS services use AWS KMS](https://docs.aws.amazon.com/kms/latest/developerguide/service-integration.html) 
   +  For detailed information about the encryption options that an AWS service offers, see the Encryption at Rest topic in the user guide or the developer guide for the service. 

1.  Implement AWS KMS: AWS KMS makes it simple for you to create and manage keys and control the use of encryption across a wide range of AWS services and in your applications. 
   +  [Getting started: AWS Key Management Service (AWS KMS)](https://docs.aws.amazon.com/kms/latest/developerguide/getting-started.html) 
   +  Review the [best practices for access control to your AWS KMS keys](https://docs.aws.amazon.com/kms/latest/developerguide/iam-policies-best-practices.html). 

1.  Consider AWS Encryption SDK: Use the AWS Encryption SDK with AWS KMS integration when your application needs to encrypt data client-side. 
   +  [AWS Encryption SDK](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/introduction.html) 

1.  Enable [IAM Access Analyzer](https://docs.aws.amazon.com/IAM/latest/UserGuide/what-is-access-analyzer.html) to automatically review and notify if there are overly broad AWS KMS key policies. 

1.  Enable [Security Hub CSPM](https://docs.aws.amazon.com/securityhub/latest/userguide/kms-controls.html) to receive notifications if there are misconfigured key policies, keys scheduled for deletion, or keys without automated rotation enabled. 

1.  Determine the logging level appropriate for your AWS KMS keys. Since calls to AWS KMS, including read-only events, are logged, the CloudTrail logs associated with AWS KMS can become voluminous. 
   +  Some organizations prefer to segregate the AWS KMS logging activity into a separate trail. For more detail, see the [Logging AWS KMS API calls with CloudTrail](https://docs.aws.amazon.com/kms/latest/developerguide/logging-using-cloudtrail.html) section of the AWS KMS Developer Guide. 

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [AWS Key Management Service](https://docs.aws.amazon.com/kms/latest/developerguide/overview.html) 
+  [AWS cryptographic services and tools](https://docs.aws.amazon.com/crypto/latest/userguide/awscryp-overview.html) 
+  [Protecting Amazon S3 Data Using Encryption](https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingEncryption.html) 
+  [Envelope encryption](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#enveloping) 
+  [Digital sovereignty pledge](https://aws.amazon.com/blogs/security/aws-digital-sovereignty-pledge-control-without-compromise/) 
+  [Demystifying AWS KMS key operations, bring your own key, custom key store, and ciphertext portability](https://aws.amazon.com/blogs/security/demystifying-kms-keys-operations-bring-your-own-key-byok-custom-key-store-and-ciphertext-portability/) 
+  [AWS Key Management Service cryptographic details](https://docs.aws.amazon.com/kms/latest/cryptographic-details/intro.html) 

 **Related videos:** 
+  [How Encryption Works in AWS](https://youtu.be/plv7PQZICCM) 
+  [Securing Your Block Storage on AWS](https://youtu.be/Y1hE1Nkcxs8) 
+  [AWS data protection: Using locks, keys, signatures, and certificates](https://www.youtube.com/watch?v=lD34wbc7KNA) 

 **Related examples:** 
+  [Implement advanced access control mechanisms using AWS KMS](https://catalog.workshops.aws/advkmsaccess/en-US/introduction) 

# SEC08-BP02 Enforce encryption at rest
<a name="sec_protect_data_rest_encrypt"></a>

 You should enforce the use of encryption for data at rest. Encryption maintains the confidentiality of sensitive data in the event of unauthorized access or accidental disclosure. 

 **Desired outcome:** Private data should be encrypted by default when at rest. Encryption helps maintain the confidentiality of the data and provides an additional layer of protection against intentional or inadvertent data disclosure or exfiltration. Data that is encrypted cannot be read or accessed without first being decrypted. Any data stored unencrypted should be inventoried and controlled. 

 **Common anti-patterns:** 
+  Not using encrypt-by-default configurations. 
+  Providing overly permissive access to decryption keys. 
+  Not monitoring the use of encryption and decryption keys. 
+  Storing data unencrypted. 
+  Using the same encryption key for all data regardless of data usage, types, and classification. 

 **Level of risk exposed if this best practice is not established:** High 

## Implementation guidance
<a name="implementation-guidance"></a>

 Map encryption keys to data classifications within your workloads. This approach helps protect against overly permissive access when using either a single, or very small number of encryption keys for your data (see [SEC07-BP01 Identify the data within your workload](sec_data_classification_identify_data.md)). 

 AWS Key Management Service (AWS KMS) integrates with many AWS services to make it easier to encrypt your data at rest. For example, in Amazon Simple Storage Service (Amazon S3), you can set [default encryption](https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-encryption.html) on a bucket so that new objects are automatically encrypted. When using AWS KMS, consider how tightly the data needs to be restricted. Default and service-controlled AWS KMS keys are managed and used on your behalf by AWS. For sensitive data that requires fine-grained access to the underlying encryption key, consider customer managed keys (CMKs). You have full control over CMKs, including rotation and access management through the use of key policies. 
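As a concrete illustration, a bucket's default-encryption rule can be expressed as the configuration document that the S3 `PutBucketEncryption` API accepts (`put_bucket_encryption` in boto3). This is a minimal sketch; the key ARN below is a placeholder:

```python
import json

def default_encryption_config(kms_key_arn: str) -> dict:
    """Build a default-encryption rule that applies SSE-KMS to new objects."""
    return {
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": kms_key_arn,
                },
                # Reuse data keys within S3 to reduce AWS KMS request costs.
                "BucketKeyEnabled": True,
            }
        ]
    }

config = default_encryption_config(
    "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"
)
print(json.dumps(config, indent=2))
```

With boto3, this document would be applied as `s3.put_bucket_encryption(Bucket="example-bucket", ServerSideEncryptionConfiguration=config)`.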

 Additionally, [Amazon Elastic Compute Cloud (Amazon EC2)](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSEncryption.html#encryption-by-default) and [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/default-bucket-encryption.html) support the enforcement of encryption by setting default encryption. You can use [AWS Config Rules](https://docs.aws.amazon.com/config/latest/developerguide/managed-rules-by-aws-config.html) to check automatically that you are using encryption, for example, for [Amazon Elastic Block Store (Amazon EBS) volumes](https://docs.aws.amazon.com/config/latest/developerguide/encrypted-volumes.html), [Amazon Relational Database Service (Amazon RDS) instances](https://docs.aws.amazon.com/config/latest/developerguide/rds-storage-encrypted.html), and [Amazon S3 buckets](https://docs.aws.amazon.com/config/latest/developerguide/s3-default-encryption-kms.html). 

 AWS also provides options for client-side encryption, allowing you to encrypt data prior to uploading it to the cloud. The AWS Encryption SDK provides a way to encrypt your data using [envelope encryption](https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#enveloping). You provide the wrapping key, and the AWS Encryption SDK generates a unique data key for each data object it encrypts. Consider AWS CloudHSM if you need a managed single-tenant hardware security module (HSM). AWS CloudHSM allows you to generate, import, and manage cryptographic keys on a FIPS 140-2 Level 3 validated HSM. Some use cases for AWS CloudHSM include protecting the private keys of an issuing certificate authority (CA) and turning on transparent data encryption (TDE) for Oracle databases. The AWS CloudHSM Client SDK provides software that allows you to encrypt data client-side, using keys stored inside AWS CloudHSM, prior to uploading your data into AWS. The Amazon DynamoDB Encryption Client also allows you to encrypt and sign items prior to upload into a DynamoDB table. 
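The envelope pattern described above can be sketched as follows. This illustrates the key flow only: the SHA-256 counter-mode stream cipher here is a toy stand-in for the authenticated AES-GCM encryption the AWS Encryption SDK actually performs, and must not be used for real data:

```python
import hashlib
import os

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Stands in for AES-GCM purely to show the envelope structure."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def envelope_encrypt(wrapping_key: bytes, plaintext: bytes):
    # 1. Generate a fresh data key for this object.
    data_key = os.urandom(32)
    # 2. Encrypt the payload under the data key.
    ciphertext = _keystream_xor(data_key, plaintext)
    # 3. Wrap the data key under the wrapping key; the wrapped key
    #    is stored alongside the ciphertext.
    wrapped_key = _keystream_xor(wrapping_key, data_key)
    return wrapped_key, ciphertext

def envelope_decrypt(wrapping_key: bytes, wrapped_key: bytes,
                     ciphertext: bytes) -> bytes:
    # Unwrap the data key first, then decrypt the payload with it.
    data_key = _keystream_xor(wrapping_key, wrapped_key)
    return _keystream_xor(data_key, ciphertext)
```

Because only the small data key is encrypted under the wrapping key, the wrapping key (for example, an AWS KMS key) never touches the bulk data, and each object gets its own data key.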

 **Implementation steps** 
+  **Enforce encryption at rest for Amazon S3:** Implement [Amazon S3 bucket default encryption](https://docs.aws.amazon.com/AmazonS3/latest/userguide/default-bucket-encryption.html). 
+  **Configure [default encryption for new Amazon EBS volumes](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSEncryption.html):** Specify that you want all newly created Amazon EBS volumes to be created in encrypted form, with the option of using the default key provided by AWS or a key that you create. 
+  **Configure encrypted Amazon Machine Images (AMIs):** Copying an existing AMI with encryption configured will automatically encrypt root volumes and snapshots. 
+  **Configure [Amazon RDS encryption](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/Overview.Encryption.html):** Configure encryption for your Amazon RDS database clusters and snapshots at rest by using the encryption option. 
+  **Create and configure AWS KMS keys with policies that limit access to the appropriate principals for each classification of data:** For example, create one AWS KMS key for encrypting production data and a different key for encrypting development or test data. You can also provide key access to other AWS accounts. Consider having different accounts for your development and production environments. If your production environment needs to decrypt artifacts in the development account, you can edit the CMK policy used to encrypt the development artifacts to give the production account the ability to decrypt those artifacts. The production environment can then ingest the decrypted data for use in production. 
+  **Configure encryption in additional AWS services:** For other AWS services you use, review the [security documentation](https://docs.aws.amazon.com/security/) for that service to determine the service’s encryption options. 
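The cross-account scenario in the steps above hinges on the key policy. Below is a hedged sketch of the statement a development account might add to its CMK's key policy so that a production account can decrypt artifacts; the account ID is a placeholder:

```python
import json

def cross_account_decrypt_statement(prod_account_id: str) -> dict:
    """Key policy statement letting a production account decrypt
    artifacts encrypted under a development account's KMS key."""
    return {
        "Sid": "AllowProdAccountDecrypt",
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{prod_account_id}:root"},
        "Action": ["kms:Decrypt", "kms:DescribeKey"],
        "Resource": "*",  # in a key policy, "*" refers to this key itself
    }

print(json.dumps(cross_account_decrypt_statement("111122223333"), indent=2))
```

Note that the key policy statement grants the account, not its users: principals in the production account still need matching IAM permissions for `kms:Decrypt` on this key.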

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [AWS Crypto Tools](https://docs.aws.amazon.com/aws-crypto-tools) 
+  [AWS Encryption SDK](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/introduction.html) 
+  [AWS KMS Cryptographic Details Whitepaper](https://docs.aws.amazon.com/kms/latest/cryptographic-details/intro.html) 
+  [AWS Key Management Service](https://aws.amazon.com/kms) 
+  [AWS cryptographic services and tools](https://docs.aws.amazon.com/crypto/latest/userguide/awscryp-overview.html) 
+  [Amazon EBS Encryption](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSEncryption.html) 
+  [Default encryption for Amazon EBS volumes](https://aws.amazon.com/blogs/aws/new-opt-in-to-default-encryption-for-new-ebs-volumes/) 
+  [Encrypting Amazon RDS Resources](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.Encryption.html) 
+  [How do I enable default encryption for an Amazon S3 bucket?](https://docs.aws.amazon.com/AmazonS3/latest/user-guide/default-bucket-encryption.html) 
+  [Protecting Amazon S3 Data Using Encryption](https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingEncryption.html) 

 **Related videos:** 
+  [How Encryption Works in AWS](https://youtu.be/plv7PQZICCM) 
+  [Securing Your Block Storage on AWS](https://youtu.be/Y1hE1Nkcxs8) 

# SEC08-BP03 Automate data at rest protection
<a name="sec_protect_data_rest_automate_protection"></a>

 Use automated tools to validate and enforce data at rest controls continuously, for example, verify that all of your storage resources are encrypted. You can [automate validation that all EBS volumes are encrypted](https://docs.aws.amazon.com/config/latest/developerguide/encrypted-volumes.html) using [AWS Config Rules](https://docs.aws.amazon.com/config/latest/developerguide/evaluate-config.html). [AWS Security Hub CSPM](http://aws.amazon.com/security-hub/) can also verify several different controls through automated checks against security standards. Additionally, your AWS Config Rules can automatically [remediate noncompliant resources](https://docs.aws.amazon.com/config/latest/developerguide/remediation.html#setup-autoremediation). 

 **Level of risk exposed if this best practice is not established:** Medium 

## Implementation guidance
<a name="implementation_guidance"></a>

 *Data at rest* represents any data that you persist in non-volatile storage for any duration in your workload. This includes block storage, object storage, databases, archives, IoT devices, and any other storage medium on which data is persisted. Protecting your data at rest reduces the risk of unauthorized access, when encryption and appropriate access controls are implemented. 

 Enforce encryption at rest: You should ensure that the only way to store data is by using encryption. AWS KMS integrates seamlessly with many AWS services to make it easier for you to encrypt all your data at rest. For example, in Amazon Simple Storage Service (Amazon S3) you can set [default encryption](https://docs.aws.amazon.com/AmazonS3/latest/dev/bucket-encryption.html) on a bucket so that all new objects are automatically encrypted. Additionally, [Amazon EC2](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSEncryption.html#encryption-by-default) and [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/default-bucket-encryption.html) support the enforcement of encryption by setting default encryption. You can use [AWS Managed Config Rules](https://docs.aws.amazon.com/config/latest/developerguide/managed-rules-by-aws-config.html) to check automatically that you are using encryption, for example, for [EBS volumes](https://docs.aws.amazon.com/config/latest/developerguide/encrypted-volumes.html), [Amazon Relational Database Service (Amazon RDS) instances](https://docs.aws.amazon.com/config/latest/developerguide/rds-storage-encrypted.html), and [Amazon S3 buckets](https://docs.aws.amazon.com/config/latest/developerguide/s3-default-encryption-kms.html). 
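To make the automation concrete, the evaluation logic of a custom AWS Config rule for EBS encryption might look like the following sketch. The Lambda handler wiring and the `put_evaluations` call are omitted; the volume dictionaries follow the shape returned by `ec2 describe-volumes`:

```python
import datetime

def evaluate_volumes(volumes: list) -> list:
    """Build AWS Config-style evaluation results for EBS volumes,
    in the shape expected by config.put_evaluations()."""
    now = datetime.datetime.now(datetime.timezone.utc).isoformat()
    return [
        {
            "ComplianceResourceType": "AWS::EC2::Volume",
            "ComplianceResourceId": v["VolumeId"],
            # A volume without Encrypted=True is flagged noncompliant.
            "ComplianceType": "COMPLIANT" if v.get("Encrypted") else "NON_COMPLIANT",
            "OrderingTimestamp": now,
        }
        for v in volumes
    ]
```

In practice you would use the AWS managed rule `encrypted-volumes` for exactly this check; the sketch shows the compliance logic such a rule applies.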

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [AWS Crypto Tools](https://docs.aws.amazon.com/aws-crypto-tools) 
+  [AWS Encryption SDK](https://docs.aws.amazon.com/encryption-sdk/latest/developer-guide/introduction.html) 

 **Related videos:** 
+  [How Encryption Works in AWS](https://youtu.be/plv7PQZICCM) 
+  [Securing Your Block Storage on AWS](https://youtu.be/Y1hE1Nkcxs8) 

# SEC08-BP04 Enforce access control
<a name="sec_protect_data_rest_access_control"></a>

 To help protect your data at rest, enforce access control using mechanisms such as isolation and versioning, and apply the principle of least privilege. Prevent the granting of public access to your data. 

**Desired outcome:** Verify that only authorized users can access data on a need-to-know basis. Protect your data with regular backups and versioning to protect against intentional or inadvertent modification or deletion. Isolate critical data from other data to protect its confidentiality and integrity. 

**Common anti-patterns:**
+  Storing data with different sensitivity requirements or classification together. 
+  Using overly permissive permissions on decryption keys. 
+  Improperly classifying data. 
+  Not retaining detailed backups of important data. 
+  Providing persistent access to production data. 
+  Not auditing data access or regularly reviewing permissions.

**Level of risk exposed if this best practice is not established:** Low 

## Implementation guidance
<a name="implementation-guidance"></a>

 Multiple controls can help protect your data at rest, including access (using least privilege), isolation, and versioning. Access to your data should be audited using detective mechanisms, such as AWS CloudTrail, and service level logs, such as Amazon Simple Storage Service (Amazon S3) access logs. You should inventory what data is publicly accessible, and create a plan to reduce the amount of publicly available data over time. 

 Amazon S3 Glacier Vault Lock and Amazon S3 Object Lock provide mandatory access control for archives and objects: once a vault policy is locked with the compliance option, not even the root user can change it until the lock expires. 
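For illustration, a default Object Lock retention rule in compliance mode can be written as the document that `s3.put_object_lock_configuration` accepts; the retention period below is an example value:

```python
import json

def compliance_lock_config(days: int) -> dict:
    """Default Object Lock retention for a bucket. In COMPLIANCE mode
    the retention period cannot be shortened or removed by any user,
    including the root user, until it elapses."""
    return {
        "ObjectLockEnabled": "Enabled",
        "Rule": {
            "DefaultRetention": {
                "Mode": "COMPLIANCE",
                "Days": days,
            }
        },
    }

print(json.dumps(compliance_lock_config(365), indent=2))
```

Object Lock must be enabled when the bucket is created; GOVERNANCE mode is the less strict alternative, where privileged principals can still override the retention.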

### Implementation steps
<a name="implementation-steps"></a>
+  **Enforce access control**: Enforce access control with least privileges, including access to encryption keys. 
+  **Separate data based on different classification levels**: Use different AWS accounts for data classification levels, and manage those accounts using [AWS Organizations](https://docs.aws.amazon.com/organizations/latest/userguide/orgs_introduction.html). 
+  **Review AWS Key Management Service (AWS KMS) policies**: [Review the level of access](https://docs.aws.amazon.com/kms/latest/developerguide/overview.html) granted in AWS KMS policies. 
+  **Review Amazon S3 bucket and object permissions**: Regularly review the level of access granted in S3 bucket policies. Best practice is to avoid using publicly readable or writeable buckets. Consider using [AWS Config](https://docs.aws.amazon.com/config/latest/developerguide/managed-rules-by-aws-config.html) to detect buckets that are publicly available, and Amazon CloudFront to serve content from Amazon S3. Verify that buckets that should not allow public access are properly configured to prevent public access. By default, all S3 buckets are private, and can only be accessed by users that have been explicitly granted access. 
+  **Use [AWS IAM Access Analyzer](https://docs.aws.amazon.com/IAM/latest/UserGuide/what-is-access-analyzer.html):** IAM Access Analyzer analyzes Amazon S3 buckets and generates a finding when [an S3 policy grants access to an external entity.](https://docs.aws.amazon.com/IAM/latest/UserGuide/access-analyzer-resources.html#access-analyzer-s3) 
+  **Use [Amazon S3 versioning](https://docs.aws.amazon.com/AmazonS3/latest/userguide/Versioning.html) and [object lock](https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lock.html) when appropriate**. 
+  **Use [Amazon S3 Inventory](https://docs.aws.amazon.com/AmazonS3/latest/dev/storage-inventory.html)**: Amazon S3 Inventory can be used to audit and report on the replication and encryption status of your S3 objects. 
+  **Review [Amazon EBS](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-modifying-snapshot-permissions.html) and [AMI sharing](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/sharing-amis.html) permissions**: Sharing permissions can allow images and volumes to be shared with AWS accounts that are external to your workload. 
+  **Review [AWS Resource Access Manager](https://docs.aws.amazon.com/ram/latest/userguide/what-is.html) shares periodically to determine whether resources should continue to be shared.** Resource Access Manager allows you to share resources, such as AWS Network Firewall policies, Amazon Route 53 Resolver rules, and subnets within your Amazon VPCs. Audit shared resources regularly, and stop sharing resources that no longer need to be shared. 
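Several of the steps above reduce to turning on S3 Block Public Access for the account or bucket. A minimal sketch of the settings structure that `s3.put_public_access_block` accepts, with all four protections enabled:

```python
def public_access_block() -> dict:
    """All four S3 Block Public Access settings, in the shape accepted
    by s3.put_public_access_block(PublicAccessBlockConfiguration=...)."""
    return {
        "BlockPublicAcls": True,        # reject new public ACLs on PUT
        "IgnorePublicAcls": True,       # ignore any existing public ACLs
        "BlockPublicPolicy": True,      # reject bucket policies granting public access
        "RestrictPublicBuckets": True,  # restrict access to public buckets
    }
```

Applying this at the account level covers every bucket, including ones created later, which is simpler to audit than per-bucket settings.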

## Resources
<a name="resources"></a>

 **Related best practices:** 
+ [SEC03-BP01 Define access requirements](sec_permissions_define.md) 
+  [SEC03-BP02 Grant least privilege access](sec_permissions_least_privileges.md) 

 **Related documents:** 
+  [AWS KMS Cryptographic Details Whitepaper](https://docs.aws.amazon.com/kms/latest/cryptographic-details/intro.html) 
+  [Introduction to Managing Access Permissions to Your Amazon S3 Resources](https://docs.aws.amazon.com/AmazonS3/latest/dev/intro-managing-access-s3-resources.html) 
+  [Overview of managing access to your AWS KMS resources](https://docs.aws.amazon.com/kms/latest/developerguide/control-access-overview.html) 
+  [AWS Config Rules](https://docs.aws.amazon.com/config/latest/developerguide/managed-rules-by-aws-config.html) 
+  [Amazon S3 + Amazon CloudFront: A Match Made in the Cloud](https://aws.amazon.com/blogs/networking-and-content-delivery/amazon-s3-amazon-cloudfront-a-match-made-in-the-cloud/) 
+  [Using versioning](https://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html) 
+  [Locking Objects Using Amazon S3 Object Lock](https://docs.aws.amazon.com/AmazonS3/latest/dev/object-lock.html) 
+  [Sharing an Amazon EBS Snapshot](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-modifying-snapshot-permissions.html) 
+  [Shared AMIs](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/sharing-amis.html) 
+  [Hosting a single-page application on Amazon S3](https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/deploy-a-react-based-single-page-application-to-amazon-s3-and-cloudfront.html) 

 **Related videos:** 
+  [Securing Your Block Storage on AWS](https://youtu.be/Y1hE1Nkcxs8) 

# SEC08-BP05 Use mechanisms to keep people away from data
<a name="sec_protect_data_rest_use_people_away"></a>

 Keep all users away from directly accessing sensitive data and systems under normal operational circumstances. For example, use a change management workflow to manage Amazon Elastic Compute Cloud (Amazon EC2) instances using tools instead of allowing direct access or a bastion host. This can be achieved using [AWS Systems Manager Automation](https://docs.aws.amazon.com/systems-manager/latest/userguide/systems-manager-automation.html), which uses [automation documents](https://docs.aws.amazon.com/systems-manager/latest/userguide/automation-documents.html) that contain steps you use to perform tasks. These documents can be stored in source control, be peer reviewed before running, and tested thoroughly to minimize risk compared to shell access. Business users could have a dashboard instead of direct access to a data store to run queries. Where CI/CD pipelines are not used, determine which controls and processes are required to adequately provide a normally deactivated break-glass access mechanism. 

 **Level of risk exposed if this best practice is not established:** Low 

## Implementation guidance
<a name="implementation-guidance"></a>
+  Implement mechanisms to keep people away from data: Mechanisms include using dashboards, such as Amazon QuickSight, to display data to users instead of having them query the data store directly. 
  +  [Amazon QuickSight](https://aws.amazon.com/quicksight/) 
+  Automate configuration management: Perform actions at a distance, enforce and validate secure configurations automatically by using a configuration management service or tool. Avoid use of bastion hosts or directly accessing EC2 instances. 
  +  [AWS Systems Manager](https://aws.amazon.com/systems-manager/) 
  +  [AWS CloudFormation](https://aws.amazon.com/cloudformation/) 
  +  [CI/CD Pipeline for AWS CloudFormation templates on AWS](https://aws.amazon.com/quickstart/architecture/cicd-taskcat/) 

## Resources
<a name="resources"></a>

 **Related documents:** 
+  [AWS KMS Cryptographic Details Whitepaper](https://docs.aws.amazon.com/kms/latest/cryptographic-details/intro.html) 

 **Related videos:** 
+  [How Encryption Works in AWS](https://youtu.be/plv7PQZICCM) 
+  [Securing Your Block Storage on AWS](https://youtu.be/Y1hE1Nkcxs8) 

# SEC 9. How do you protect your data in transit?
<a name="sec-09"></a>

Protect your data in transit by implementing multiple controls to reduce the risk of unauthorized access or loss.

**Topics**
+ [

# SEC09-BP01 Implement secure key and certificate management
](sec_protect_data_transit_key_cert_mgmt.md)
+ [

# SEC09-BP02 Enforce encryption in transit
](sec_protect_data_transit_encrypt.md)
+ [

# SEC09-BP03 Automate detection of unintended data access
](sec_protect_data_transit_auto_unintended_access.md)
+ [

# SEC09-BP04 Authenticate network communications
](sec_protect_data_transit_authentication.md)

# SEC09-BP01 Implement secure key and certificate management
<a name="sec_protect_data_transit_key_cert_mgmt"></a>

 Transport Layer Security (TLS) certificates are used to secure network communications and establish the identity of websites, resources, and workloads over the internet, as well as private networks. 

 **Desired outcome:** A secure certificate management system that can provision, deploy, store, and renew certificates in a public key infrastructure (PKI). A secure key and certificate management mechanism prevents disclosure of certificate private key material and renews the certificate automatically on a periodic basis. It also integrates with other services to provide secure network communications and identity for machine resources inside of your workload. Key material should never be accessible to human identities. 

 **Common anti-patterns:** 
+  Performing manual steps during the certificate deployment or renewal processes. 
+  Paying insufficient attention to certificate authority (CA) hierarchy when designing a private CA. 
+  Using self-signed certificates for public resources. 

 **Benefits of establishing this best practice:** 
+  Simplified certificate management through automated deployment and renewal 
+  Encouraged encryption of data in transit using TLS certificates 
+  Increased security and auditability of certificate actions taken by the certificate authority 
+  Organized management duties at different layers of the CA hierarchy 

 **Level of risk exposed if this best practice is not established:** High

## Implementation guidance
<a name="implementation-guidance"></a>

 Modern workloads make extensive use of encrypted network communications using PKI protocols such as TLS. PKI certificate management can be complex, but automated certificate provisioning, deployment, and renewal can reduce the friction associated with certificate management. 

 AWS provides two services to manage general-purpose PKI certificates: [AWS Certificate Manager](https://docs.aws.amazon.com/acm/latest/userguide/acm-overview.html) and [AWS Private Certificate Authority (AWS Private CA)](https://docs.aws.amazon.com/privateca/latest/userguide/PcaWelcome.html). ACM is the primary service that customers use to provision, manage, and deploy certificates for use in both public-facing as well as private AWS workloads. ACM issues certificates using AWS Private CA and [integrates](https://docs.aws.amazon.com/acm/latest/userguide/acm-services.html) with many other AWS managed services to provide secure TLS certificates for workloads. 

 AWS Private CA allows you to establish your own root or subordinate certificate authority and issue TLS certificates through an API. You can use these kinds of certificates in scenarios where you control and manage the trust chain on the client side of the TLS connection. In addition to TLS use cases, AWS Private CA can be used to issue certificates to Kubernetes pods, Matter device product attestations, code signing, and other use cases with a [custom template](https://docs.aws.amazon.com/privateca/latest/userguide/UsingTemplates.html). You can also use [IAM Roles Anywhere](https://docs.aws.amazon.com/rolesanywhere/latest/userguide/introduction.html) to provide temporary IAM credentials to on-premises workloads that have been issued X.509 certificates signed by your Private CA. 

 In addition to ACM and AWS Private CA, [AWS IoT Core](https://docs.aws.amazon.com/iot/latest/developerguide/what-is-aws-iot.html) provides specialized support for provisioning, managing, and deploying PKI certificates to IoT devices. AWS IoT Core provides mechanisms for [onboarding IoT devices](https://docs.aws.amazon.com/whitepapers/latest/device-manufacturing-provisioning/device-manufacturing-provisioning.html) into your public key infrastructure at scale. 

**Considerations for establishing a private CA hierarchy**

 When you need to establish a private CA, it's important to take special care to properly design the CA hierarchy upfront. It's a best practice to deploy each level of your CA hierarchy into separate AWS accounts when creating a private CA hierarchy. This intentional step reduces the surface area for each level in the CA hierarchy, making it simpler to discover anomalies in CloudTrail log data and reducing the scope of access or impact if there is unauthorized access to one of the accounts. The root CA should reside in its own separate account and should only be used to issue one or more intermediate CA certificates. 

 Then, create one or more intermediate CAs in accounts separate from the root CA's account to issue certificates for end users, devices, or other workloads. Finally, issue certificates from your root CA to the intermediate CAs, which will in turn issue certificates to your end users or devices. For more information on planning your CA deployment and designing your CA hierarchy, including planning for resiliency, cross-region replication, sharing CAs across your organization, and more, see [Planning your AWS Private CA deployment](https://docs.aws.amazon.com/privateca/latest/userguide/PcaPlanning.html). 

### Implementation steps
<a name="implementation-steps"></a>

1.  Determine the relevant AWS services required for your use case: 
   +  Many use cases can leverage the existing AWS public key infrastructure using [AWS Certificate Manager](https://docs.aws.amazon.com/acm/latest/userguide/acm-overview.html). ACM can be used to deploy TLS certificates for web servers, load balancers, or other uses for publicly trusted certificates. 
   +  Consider [AWS Private CA](https://docs.aws.amazon.com/privateca/latest/userguide/PcaWelcome.html) when you need to establish your own private certificate authority hierarchy or need access to exportable certificates. ACM can then be used to issue [many types of end-entity certificates](https://docs.aws.amazon.com/privateca/latest/userguide/PcaIssueCert.html) using the AWS Private CA. 
   +  For use cases where certificates must be provisioned at scale for embedded Internet of things (IoT) devices, consider [AWS IoT Core](https://docs.aws.amazon.com/iot/latest/developerguide/x509-client-certs.html). 

1.  Implement automated certificate renewal whenever possible: 
   +  Use [ACM managed renewal](https://docs.aws.amazon.com/acm/latest/userguide/managed-renewal.html) for certificates issued by ACM along with integrated AWS managed services. 

1.  Establish logging and audit trails: 
   +  Enable [CloudTrail logs](https://docs.aws.amazon.com/privateca/latest/userguide/PcaCtIntro.html) to track access to the accounts holding certificate authorities. Consider configuring log file integrity validation in CloudTrail to verify the authenticity of the log data. 
   +  Periodically generate and review [audit reports](https://docs.aws.amazon.com/privateca/latest/userguide/PcaAuditReport.html) that list the certificates that your private CA has issued or revoked. These reports can be exported to an S3 bucket. 
   +  When deploying a private CA, you will also need to establish an S3 bucket to store the Certificate Revocation List (CRL). For guidance on configuring this S3 bucket based on your workload's requirements, see [Planning a certificate revocation list (CRL)](https://docs.aws.amazon.com/privateca/latest/userguide/crl-planning.html). 

## Resources
<a name="resources"></a>

 **Related best practices:** 
+  [SEC02-BP02 Use temporary credentials](sec_identities_unique.md) 
+ [SEC08-BP01 Implement secure key management](sec_protect_data_rest_key_mgmt.md)
+  [SEC09-BP04 Authenticate network communications](sec_protect_data_transit_authentication.md) 

 **Related documents:** 
+  [How to host and manage an entire private certificate infrastructure in AWS](https://aws.amazon.com/blogs/security/how-to-host-and-manage-an-entire-private-certificate-infrastructure-in-aws/) 
+  [How to secure an enterprise scale ACM Private CA hierarchy for automotive and manufacturing](https://aws.amazon.com/blogs/security/how-to-secure-an-enterprise-scale-acm-private-ca-hierarchy-for-automotive-and-manufacturing/) 
+  [Private CA best practices](https://docs.aws.amazon.com/privateca/latest/userguide/ca-best-practices.html) 
+  [How to use AWS RAM to share your ACM Private CA cross-account](https://aws.amazon.com/blogs/security/how-to-use-aws-ram-to-share-your-acm-private-ca-cross-account/) 

 **Related videos:** 
+  [Activating AWS Certificate Manager Private CA (workshop)](https://www.youtube.com/watch?v=XrrdyplT3PE) 

 **Related examples:** 
+  [Private CA workshop](https://catalog.workshops.aws/certificatemanager/en-US/introduction) 
+  [IOT Device Management Workshop](https://iot-device-management.workshop.aws/en/) (including device provisioning) 

 **Related tools:** 
+  [Plugin to Kubernetes cert-manager to use AWS Private CA](https://github.com/cert-manager/aws-privateca-issuer) 

# SEC09-BP02 Enforce encryption in transit
<a name="sec_protect_data_transit_encrypt"></a>

Enforce your defined encryption requirements based on your organization’s policies, regulatory obligations, and standards to help meet organizational, legal, and compliance requirements. Only use protocols with encryption when transmitting sensitive data outside of your virtual private cloud (VPC). Encryption helps maintain data confidentiality even when the data transits untrusted networks.

 **Desired outcome:** All data should be encrypted in transit using secure TLS protocols and cipher suites. Network traffic between your resources and the internet must be encrypted to mitigate unauthorized access to the data. Network traffic solely within your internal AWS environment should be encrypted using TLS wherever possible. The AWS internal network is encrypted by default, and network traffic within a VPC cannot be spoofed or sniffed unless an unauthorized party has gained access to whatever resource is generating traffic (such as Amazon EC2 instances and Amazon ECS containers). Consider protecting network-to-network traffic with an IPsec virtual private network (VPN). 

 **Common anti-patterns:** 
+  Using deprecated versions of SSL, TLS, and cipher suite components (for example, SSL v3.0, 1024-bit RSA keys, and RC4 cipher). 
+  Allowing unencrypted (HTTP) traffic to or from public-facing resources. 
+  Not monitoring and replacing X.509 certificates prior to expiration. 
+  Using self-signed X.509 certificates for TLS. 

 **Level of risk exposed if this best practice is not established:** High 

## Implementation guidance
<a name="implementation-guidance"></a>

 AWS services provide HTTPS endpoints using TLS for communication, providing encryption in transit when communicating with the AWS APIs. Insecure protocols like HTTP can be audited and blocked in a VPC through the use of security groups. HTTP requests can also be [automatically redirected to HTTPS](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/using-https-viewers-to-cloudfront.html) in Amazon CloudFront or on an [Application Load Balancer](https://docs.aws.amazon.com/elasticloadbalancing/latest/application/load-balancer-listeners.html#redirect-actions). You have full control over your computing resources to implement encryption in transit across your services. Additionally, you can use VPN connectivity into your VPC from an external network or [AWS Direct Connect](https://aws.amazon.com/directconnect/) to facilitate encryption of traffic. Verify that your clients are making calls to AWS APIs using at least TLS 1.2, as [AWS is deprecating the use of earlier versions of TLS in June 2023](https://aws.amazon.com/blogs/security/tls-1-2-required-for-aws-endpoints/). AWS recommends using TLS 1.3. Third-party solutions are available in the AWS Marketplace if you have special requirements. 
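On the client side, enforcing a TLS version floor is usually a one-line configuration. A sketch using Python's standard `ssl` module to refuse anything below TLS 1.2:

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Client-side context that refuses to negotiate anything below
    TLS 1.2, the minimum AWS endpoints accept; TLS 1.3 is still
    preferred when the server supports it."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

Recent AWS SDK versions apply an equivalent floor by default, so this matters mostly for hand-rolled HTTP clients or older runtimes.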

 **Implementation steps** 
+  **Enforce encryption in transit:** Your defined encryption requirements should be based on the latest standards and best practices and only allow secure protocols. For example, configure a security group to only allow the HTTPS protocol to an Application Load Balancer or Amazon EC2 instance. 
+  **Configure secure protocols in edge services:** [Configure HTTPS with Amazon CloudFront](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/using-https.html) and use a [security profile appropriate for your security posture and use case](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/secure-connections-supported-viewer-protocols-ciphers.html#secure-connections-supported-ciphers). 
+  **Use a [VPN for external connectivity](https://docs.aws.amazon.com/vpc/latest/userguide/vpn-connections.html):** Consider using an IPsec VPN for securing point-to-point or network-to-network connections to help provide both data privacy and integrity. 
+  **Configure secure protocols in load balancers:** Select a security policy that provides the strongest cipher suites supported by the clients that will be connecting to the listener. [Create an HTTPS listener for your Application Load Balancer](https://docs.aws.amazon.com/elasticloadbalancing/latest/application/create-https-listener.html). 
+  **Configure secure protocols in Amazon Redshift:** Configure your cluster to require a [Secure Sockets Layer (SSL) or Transport Layer Security (TLS) connection](https://docs.aws.amazon.com/redshift/latest/mgmt/connecting-ssl-support.html). 
+  **Configure secure protocols:** Review AWS service documentation to determine encryption-in-transit capabilities. 
+  **Configure secure access when uploading to Amazon S3 buckets:** Use Amazon S3 bucket policy controls to [enforce secure access](https://docs.aws.amazon.com/AmazonS3/latest/userguide/security-best-practices.html) to data. 
+  **Consider using [AWS Certificate Manager](https://aws.amazon.com/certificate-manager/):** ACM allows you to provision, manage, and deploy public TLS certificates for use with AWS services. 
+  **Consider using [AWS Private Certificate Authority](https://aws.amazon.com/private-ca/) for private PKI needs:** AWS Private CA allows you to create private certificate authority (CA) hierarchies to issue end-entity X.509 certificates that can be used to create encrypted TLS channels. 
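As a concrete example of the Amazon S3 step above, the `aws:SecureTransport` condition key can deny any request not made over TLS. The sketch below builds such a policy and optionally applies it with boto3; the bucket name is a placeholder, and applying the policy requires credentials with `s3:PutBucketPolicy` permission:

```python
import json


def deny_insecure_transport_policy(bucket: str) -> dict:
    """Bucket policy that denies all S3 actions made over plain HTTP."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
            # Requests made without TLS carry aws:SecureTransport = "false"
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }],
    }


def apply_policy(bucket: str) -> None:
    import boto3  # imported lazily; needs valid AWS credentials at call time
    boto3.client("s3").put_bucket_policy(
        Bucket=bucket, Policy=json.dumps(deny_insecure_transport_policy(bucket))
    )
```

Because the statement denies rather than allows, it takes effect regardless of any other access that the bucket's policies grant.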

## Resources
<a name="resources"></a>

 **Related documents:** 
+ [ Using HTTPS with CloudFront ](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/using-https.html)
+ [ Connect your VPC to remote networks using AWS Virtual Private Network](https://docs.aws.amazon.com/vpc/latest/userguide/vpn-connections.html)
+ [ Create an HTTPS listener for your Application Load Balancer ](https://docs.aws.amazon.com/elasticloadbalancing/latest/application/create-https-listener.html)
+ [ Tutorial: Configure SSL/TLS on Amazon Linux 2 ](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/SSL-on-amazon-linux-2.html)
+ [ Using SSL/TLS to encrypt a connection to a DB instance ](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.SSL.html)
+ [ Configuring security options for connections ](https://docs.aws.amazon.com/redshift/latest/mgmt/connecting-ssl-support.html)

# SEC09-BP03 Automate detection of unintended data access
<a name="sec_protect_data_transit_auto_unintended_access"></a>

 Use tools such as Amazon GuardDuty to automatically detect suspicious activity or attempts to move data outside of defined boundaries. For example, GuardDuty can detect unusual Amazon Simple Storage Service (Amazon S3) read activity with the [Exfiltration:S3/AnomalousBehavior finding](https://docs.aws.amazon.com/guardduty/latest/ug/guardduty_finding-types-s3.html#exfiltration-s3-objectreadunusual). In addition to GuardDuty, [Amazon VPC Flow Logs](https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs.html), which capture network traffic information, can be used with Amazon EventBridge to detect connections, both successful and denied. [Access Analyzer for S3](http://aws.amazon.com/blogs/storage/protect-amazon-s3-buckets-using-access-analyzer-for-s3) can help you assess what data in your Amazon S3 buckets is accessible to whom. 
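As a hedged sketch of querying GuardDuty for the finding type mentioned above, the helper below builds the `ListFindings` criteria and fetches matching finding IDs with boto3. The detector ID and Region are placeholders, and the call requires GuardDuty to be enabled with valid credentials:

```python
def exfiltration_criteria(finding_types: list) -> dict:
    """Build a GuardDuty ListFindings criteria dict for the given finding types."""
    return {"Criterion": {"type": {"Eq": list(finding_types)}}}


def list_exfiltration_findings(detector_id: str, region: str = "us-east-1") -> list:
    import boto3  # imported lazily; needs valid AWS credentials at call time
    gd = boto3.client("guardduty", region_name=region)
    resp = gd.list_findings(
        DetectorId=detector_id,
        FindingCriteria=exfiltration_criteria(["Exfiltration:S3/AnomalousBehavior"]),
    )
    return resp["FindingIds"]
```

The returned IDs could then be passed to `get_findings` for detail, or the same finding type matched in an EventBridge rule to alert in near real time.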

 **Level of risk exposed if this best practice is not established:** Medium 

## Implementation guidance
<a name="implementation-guidance"></a>
+  Automate detection of unintended data access: Use a tool or detection mechanism to automatically detect attempts to move data outside of defined boundaries, for example, to detect a database system that is copying data to an unrecognized host. 
  + [ VPC Flow Logs](https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs.html) 
+  Consider Amazon Macie: Amazon Macie is a fully managed data security and data privacy service that uses machine learning and pattern matching to discover and protect your sensitive data in AWS. 
  + [ Amazon Macie ](https://aws.amazon.com/macie/)

## Resources
<a name="resources"></a>

 **Related documents:** 
+ [ VPC Flow Logs](https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs.html) 
+ [ Amazon Macie ](https://aws.amazon.com/macie/)

# SEC09-BP04 Authenticate network communications
<a name="sec_protect_data_transit_authentication"></a>


**Note**
This best practice was updated with new guidance on December 6, 2023.

 Verify the identity of communicating parties by using protocols that support authentication, such as Transport Layer Security (TLS) or IPsec. 

 Design your workload to use secure, authenticated network protocols whenever communicating between services, between applications, or with users. Using network protocols that support authentication and authorization provides stronger control over network flows and reduces the impact of unauthorized access. 

 **Desired outcome:** A workload with well-defined data plane and control plane traffic flows between services. The traffic flows use authenticated and encrypted network protocols where technically feasible. 

 **Common anti-patterns:** 
+  Unencrypted or unauthenticated traffic flows within your workload. 
+  Reusing authentication credentials across multiple users or entities. 
+  Relying solely on network controls as an access control mechanism. 
+  Creating a custom authentication mechanism rather than relying on industry-standard authentication mechanisms. 
+  Overly permissive traffic flows between service components or other resources in the VPC. 

 **Benefits of establishing this best practice:** 
+  Limits the scope of impact for unauthorized access to one part of the workload. 
+  Provides a higher level of assurance that actions are only performed by authenticated entities. 
+  Improves decoupling of services by clearly defining and enforcing intended data transfer interfaces. 
+  Enhances monitoring, logging, and incident response through request attribution and well-defined communication interfaces. 
+  Provides defense-in-depth for your workloads by combining network controls with authentication and authorization controls. 

 **Level of risk exposed if this best practice is not established:** Low 

## Implementation guidance
<a name="implementation-guidance"></a>

 Your workload’s network traffic patterns can be characterized into two categories: 
+  *East-west traffic* represents traffic flows between services that make up a workload. 
+  *North-south traffic* represents traffic flows between your workload and consumers. 

 While it is common practice to encrypt north-south traffic, securing east-west traffic with authenticated protocols is less common. Modern security practice holds that network design alone should not establish a trusted relationship between two entities. Even when two services reside within a common network boundary, it is still best practice to encrypt, authenticate, and authorize communications between those services. 

 As an example, AWS service APIs use the [AWS Signature Version 4 (SigV4)](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_aws-signing.html) signature protocol to authenticate the caller, no matter what network the request originates from. This authentication ensures that AWS APIs can verify the identity that requested the action, and that identity can then be combined with policies to make an authorization decision to determine whether the action should be allowed or not. 
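The SigV4 signing key itself is derived from the secret access key through a documented chain of HMAC-SHA256 operations over the date, Region, and service. The standard-library sketch below reproduces that derivation (in practice, the AWS SDKs sign requests for you; the inputs shown in the test are placeholders):

```python
import hashlib
import hmac


def _hmac_sha256(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()


def sigv4_signing_key(secret_key: str, date: str, region: str, service: str) -> bytes:
    """Derive the SigV4 signing key: kSecret -> kDate -> kRegion -> kService -> kSigning."""
    k_date = _hmac_sha256(("AWS4" + secret_key).encode("utf-8"), date)
    k_region = _hmac_sha256(k_date, region)
    k_service = _hmac_sha256(k_region, service)
    return _hmac_sha256(k_service, "aws4_request")
```

Because the derivation is scoped to a date, Region, and service, a leaked signing key is far less useful to an attacker than the long-term secret key itself.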

 Services such as [Amazon VPC Lattice](https://docs.aws.amazon.com/vpc-lattice/latest/ug/access-management-overview.html) and [Amazon API Gateway](https://docs.aws.amazon.com/apigateway/latest/developerguide/permissions.html) allow you to use the same SigV4 signature protocol to add authentication and authorization to east-west traffic in your own workloads. If resources outside of your AWS environment need to communicate with services that require SigV4-based authentication and authorization, you can use [AWS Identity and Access Management (IAM) Roles Anywhere](https://docs.aws.amazon.com/rolesanywhere/latest/userguide/introduction.html) on the non-AWS resource to acquire temporary AWS credentials. These credentials can be used to sign requests to services using SigV4 to authorize access. 

 Another common mechanism for authenticating east-west traffic is TLS mutual authentication (mTLS). Many Internet of Things (IoT) applications, business-to-business applications, and microservices use mTLS to validate the identity of both sides of a TLS communication through the use of client-side and server-side X.509 certificates. These certificates can be issued by AWS Private Certificate Authority (AWS Private CA). You can use services such as [Amazon API Gateway](https://docs.aws.amazon.com/apigateway/latest/developerguide/rest-api-mutual-tls.html) and [AWS App Mesh](https://docs.aws.amazon.com/app-mesh/latest/userguide/mutual-tls.html) to provide mTLS authentication for inter- or intra-workload communication. While mTLS provides authentication information for both sides of a TLS communication, it does not provide a mechanism for authorization. 

 Finally, OAuth 2.0 and OpenID Connect (OIDC) are two protocols typically used for controlling access to services by users, but are now becoming popular for service-to-service traffic as well. API Gateway provides a [JSON Web Token (JWT) authorizer](https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-jwt-authorizer.html), allowing workloads to restrict access to API routes using JWTs issued from OIDC or OAuth 2.0 identity providers. OAuth 2.0 scopes can be used as a source for basic authorization decisions, but the authorization checks still need to be implemented in the application layer, and OAuth 2.0 scopes alone cannot support more complex authorization needs. 
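Scope-based checks of this kind reduce to inspecting the token's claims. The sketch below decodes a JWT payload without verifying its signature; signature verification is assumed to be delegated to a component in front of the service, such as the API Gateway JWT authorizer, and the claim names used in the test are illustrative:

```python
import base64
import json


def jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT. Does NOT verify the signature;
    that is left to the gateway or authorizer in front of the service."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))


def has_scope(token: str, required: str) -> bool:
    """Basic authorization check: is the required scope present in the token?"""
    return required in jwt_claims(token).get("scope", "").split()
```

As the text notes, scopes support only coarse decisions; finer-grained authorization (per-resource, per-tenant) still belongs in the application layer.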

### Implementation steps
<a name="implementation-steps"></a>
+  **Define and document your workload network flows:** The first step in implementing a defense-in-depth strategy is defining your workload’s traffic flows. 
  +  Create a data flow diagram that clearly defines how data is transmitted between different services that comprise your workload. This diagram is the first step to enforcing those flows through authenticated network channels. 
  +  Instrument your workload in development and testing phases to validate that the data flow diagram accurately reflects the workload’s behavior at runtime. 
  +  A data flow diagram can also be useful when performing a threat modeling exercise, as described in [SEC01-BP07 Identify threats and prioritize mitigations using a threat model](https://docs.aws.amazon.com/wellarchitected/latest/security-pillar/sec_securely_operate_threat_model.html). 
+  **Establish network controls:** Consider AWS capabilities to establish network controls aligned to your data flows. While network boundaries should not be the only security control, they provide a layer in the defense-in-depth strategy to protect your workload. 
  +  Use [security groups](https://docs.aws.amazon.com/vpc/latest/userguide/security-groups.html) to define and restrict data flows between resources. 
  +  Consider using [AWS PrivateLink](https://docs.aws.amazon.com/vpc/latest/privatelink/what-is-privatelink.html) to communicate with both AWS and third-party services that support AWS PrivateLink. Data sent through an AWS PrivateLink interface endpoint stays within the AWS network backbone and does not traverse the public internet. 
+  **Implement authentication and authorization across services in your workload:** Choose the set of AWS services most appropriate to provide authenticated, encrypted traffic flows in your workload. 
  +  Consider [Amazon VPC Lattice](https://docs.aws.amazon.com/vpc-lattice/latest/ug/what-is-vpc-lattice.html) to secure service-to-service communication. VPC Lattice can use [SigV4 authentication combined with auth policies](https://docs.aws.amazon.com/vpc-lattice/latest/ug/auth-policies.html) to control service-to-service access. 
  +  For service-to-service communication using mTLS, consider [API Gateway](https://docs.aws.amazon.com/apigateway/latest/developerguide/rest-api-mutual-tls.html) or [App Mesh](https://docs.aws.amazon.com/app-mesh/latest/userguide/mutual-tls.html). [AWS Private CA](https://docs.aws.amazon.com/privateca/latest/userguide/PcaWelcome.html) can be used to establish a private CA hierarchy capable of issuing certificates for use with mTLS. 
  +  When integrating with services using OAuth 2.0 or OIDC, consider [API Gateway using the JWT authorizer](https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-jwt-authorizer.html). 
  +  For communication between your workload and IoT devices, consider [AWS IoT Core](https://docs.aws.amazon.com/iot/latest/developerguide/client-authentication.html), which provides several options for network traffic encryption and authentication. 
+  **Monitor for unauthorized access:** Continually monitor for unintended communication channels, unauthorized principals attempting to access protected resources, and other improper access patterns. 
  +  If using VPC Lattice to manage access to your services, consider enabling and monitoring [VPC Lattice access logs](https://docs.aws.amazon.com/vpc-lattice/latest/ug/monitoring-access-logs.html). These access logs include information on the requesting entity, network information including source and destination VPC, and request metadata. 
  +  Consider enabling [VPC flow logs](https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs.html) to capture metadata on network flows and periodically review for anomalies. 
  +  Refer to the [AWS Security Incident Response Guide](https://docs.aws.amazon.com/whitepapers/latest/aws-security-incident-response-guide/aws-security-incident-response-guide.html) and the [Incident Response section](https://docs.aws.amazon.com/wellarchitected/latest/security-pillar/incident-response.html) of the AWS Well-Architected Framework security pillar for more guidance on planning, simulating, and responding to security incidents. 
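Managed services such as API Gateway and App Mesh handle mTLS termination for you. If you terminate TLS in your own service, the Python standard-library sketch below shows the server-side configuration that makes a client certificate mandatory (the CA bundle path passed at call time is a placeholder, for example a chain exported from AWS Private CA):

```python
import ssl
from typing import Optional


def require_client_certs(ctx: ssl.SSLContext,
                         client_ca: Optional[str] = None) -> ssl.SSLContext:
    """Harden a server-side TLS context so that every client must present a
    certificate signed by a trusted CA (mTLS)."""
    if client_ca:
        ctx.load_verify_locations(cafile=client_ca)  # trust clients of this CA
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.verify_mode = ssl.CERT_REQUIRED  # handshake fails without a client cert
    return ctx
```

The context would then be completed with the server's own certificate via `ctx.load_cert_chain(...)` before wrapping listening sockets.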

## Resources
<a name="resources"></a>

 **Related best practices:** 
+ [ SEC03-BP07 Analyze public and cross-account access ](https://docs.aws.amazon.com/wellarchitected/latest/security-pillar/sec_permissions_analyze_cross_account.html)
+ [ SEC02-BP02 Use temporary credentials ](https://docs.aws.amazon.com/wellarchitected/latest/security-pillar/sec_identities_unique.html)
+ [ SEC01-BP07 Identify threats and prioritize mitigations using a threat model ](https://docs.aws.amazon.com/wellarchitected/latest/security-pillar/sec_securely_operate_threat_model.html)

 **Related documents:** 
+ [ Evaluating access control methods to secure Amazon API Gateway APIs ](https://aws.amazon.com/blogs/compute/evaluating-access-control-methods-to-secure-amazon-api-gateway-apis/)
+ [ Configuring mutual TLS authentication for a REST API ](https://docs.aws.amazon.com/apigateway/latest/developerguide/rest-api-mutual-tls.html)
+ [ How to secure API Gateway HTTP endpoints with JWT authorizer ](https://aws.amazon.com/blogs/security/how-to-secure-api-gateway-http-endpoints-with-jwt-authorizer/)
+ [ Authorizing direct calls to AWS services using AWS IoT Core credential provider ](https://docs.aws.amazon.com/iot/latest/developerguide/authorizing-direct-aws.html)
+ [AWS Security Incident Response Guide ](https://docs.aws.amazon.com/whitepapers/latest/aws-security-incident-response-guide/aws-security-incident-response-guide.html)

 **Related videos:** 
+ [AWS re:Invent 2022: Introducing VPC Lattice ](https://www.youtube.com/watch?v=fRjD1JI0H5w)
+ [AWS re:Invent 2020: Serverless API authentication for HTTP APIs on AWS](https://www.youtube.com/watch?v=AW4kvUkUKZ0)

 **Related examples:** 
+ [ Amazon VPC Lattice Workshop ](https://catalog.us-east-1.prod.workshops.aws/workshops/9e543f60-e409-43d4-b37f-78ff3e1a07f5/en-US)
+ [ Zero-Trust Episode 1 – The Phantom Service Perimeter workshop ](https://catalog.us-east-1.prod.workshops.aws/workshops/dc413216-deab-4371-9e4a-879a4f14233d/en-US)