
Automate the deployment of AWS Supply Chain data lakes in a multi-repository setup by using GitHub Actions, Artifactory, and Terraform

Keshav Ganesh, Amazon Web Services

Summary

This pattern provides an automated approach for deploying and managing AWS Supply Chain data lakes using a multi-repository continuous integration and continuous deployment (CI/CD) pipeline. It demonstrates two deployment methods: automated deployment using GitHub Actions workflows, or manual deployment using Terraform directly. Both approaches use Terraform for infrastructure as code (IaC), with the automated method adding GitHub Actions and JFrog Artifactory for enhanced CI/CD capabilities.

The solution uses AWS Supply Chain, AWS Lambda, and Amazon Simple Storage Service (Amazon S3) to establish the data lake infrastructure, and either deployment method automates configuration and resource creation. This automation eliminates manual configuration steps and ensures consistent deployments across environments. In addition, AWS Supply Chain eliminates the need for deep expertise in extract, transform, and load (ETL) processes and can provide insights and analytics powered by Amazon Quick Sight.

By implementing this pattern, organizations can reduce deployment time, maintain infrastructure as code, and manage supply chain data lakes through a version-controlled, automated process. The multi-repository approach provides fine-grained access control and supports independent deployment of different components. Teams can choose the deployment method that best fits their existing tools and processes.

Prerequisites and limitations

Prerequisites

Ensure that the following tools are installed on your local machine: Git, Terraform, the AWS Command Line Interface (AWS CLI), jq, and Python.

Ensure that the following are in place before deployment:

  • An active AWS account.

  • A virtual private cloud (VPC) with two private subnets in your AWS account in the AWS Region of your choice.

  • Sufficient permissions for the AWS Identity and Access Management (IAM) role used for deployment to the following services:

    • AWS Supply Chain – Full access is preferred for deploying its components, such as datasets and integration flows, and for accessing the service from the AWS Management Console.

    • Amazon CloudWatch Logs – For creating and managing CloudWatch log groups.

    • Amazon Elastic Compute Cloud (Amazon EC2) – For Amazon EC2 security groups and Amazon Virtual Private Cloud (Amazon VPC) endpoints.

    • Amazon EventBridge – For use by AWS Supply Chain.

    • IAM – For creating AWS Lambda service roles.

    • AWS Key Management Service (AWS KMS) – For access to the AWS KMS keys used for the Amazon S3 artifacts bucket and the Amazon S3 AWS Supply Chain staging bucket.

    • AWS Lambda – For creating the Lambda functions that deploy the AWS Supply Chain components.

    • Amazon S3 – For access to the Amazon S3 artifacts bucket, server access logging bucket, and AWS Supply Chain staging bucket. If you’re using manual deployment, permissions for the Amazon S3 Terraform artifacts bucket are also required.

    • Amazon VPC – For creating and managing a VPC.

If you prefer to use GitHub Actions workflows for deployment, do the following:

If you prefer to do a manual deployment, do the following:


Limitations

  • The AWS Supply Chain instance doesn’t support complex data transformation techniques.

  • AWS Supply Chain is best suited for supply chain domains because it provides built-in analytics and insights. For any other domain, AWS Supply Chain can be used as a data store as part of the data lake architecture.

  • Lambda functions used in this solution might need to be enhanced to handle API retries and memory management in a production-scale deployment (see the retry configuration sketch at the end of this list).

  • Some AWS services aren’t available in all AWS Regions. For Region availability, see AWS Services by Region. For specific endpoints, see Service endpoints and quotas, and choose the link for the service.
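For the Lambda limitation above, one common mitigation is to increase the AWS SDK's built-in retry behavior. The following minimal Python sketch is an illustration only (not part of this pattern's code) and assumes the Lambda functions use the boto3 supplychain client; it shows how adaptive retries could be configured with botocore:

import boto3
from botocore.config import Config

# Adaptive retry mode backs off automatically on throttling and transient errors.
retry_config = Config(retries={"max_attempts": 10, "mode": "adaptive"})

supplychain = boto3.client("supplychain", config=retry_config)

def lambda_handler(event, context):
    # Calls made with this client are retried by the SDK before the function fails.
    return supplychain.list_instances()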

Architecture

You can deploy this solution either by using automated GitHub Actions workflows or manually using Terraform.

Automated deployment with GitHub Actions

The following diagram shows the automated deployment option that uses GitHub Actions workflows. JFrog Artifactory is used for artifacts management. It stores resource information and outputs for use in a multi-repository deployment.

Automated deployment option that uses GitHub Actions workflows and JFrog.

Manual deployment with Terraform

The following diagram shows the manual deployment option through Terraform. Instead of JFrog Artifactory, Amazon S3 is used for artifacts management.

Manual deployment option using Terraform and Amazon S3.

Deployment workflow

The diagrams show the following workflow:

  1. Deploy AWS Supply Chain service datasets infrastructure and databases using one of the following deployment methods:

    • Automated deployment – Uses GitHub Actions workflows to orchestrate all deployment steps and uses JFrog Artifactory for artifacts management.

    • Manual deployment – Executes Terraform commands directly for each deployment step and uses Amazon S3 for artifacts management.

  2. Create the supporting AWS resources that are required for AWS Supply Chain service operation:

    • Amazon VPC endpoints and security groups

    • AWS KMS keys

    • CloudWatch Logs log groups

  3. Create and deploy the following infrastructure resources:

    • Lambda functions that manage (create, update, and delete) the AWS Supply Chain service instance, namespaces, and datasets (see the API sketch after this workflow)

    • AWS Supply Chain staging Amazon S3 bucket for data ingestion

  4. Deploy the Lambda function that manages integration flows between the staging bucket and AWS Supply Chain datasets. After deployment is complete, the remaining workflow steps manage data ingestion and analysis.

  5. Configure source data ingestion to the AWS Supply Chain staging Amazon S3 bucket.

  6. After data is added to the AWS Supply Chain staging Amazon S3 bucket, the service automatically triggers the integration flow to the AWS Supply Chain datasets.

  7. AWS Supply Chain integrates with Amazon Quick Sight through AWS Supply Chain Analytics to produce dashboards based on the ingested data.
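To make steps 3 and 4 more concrete, the following minimal Python sketch shows the kind of AWS SDK (boto3) calls that such a Lambda function can make. It is illustrative only and is not the pattern's actual Lambda code; the instance ID and dataset name are placeholders, and request parameters should be verified against the current boto3 documentation for the supplychain client.

import boto3

supplychain = boto3.client("supplychain")

INSTANCE_ID = "<your-asc-instance-id>"  # placeholder; the pattern's Lambda functions receive this from Terraform

def lambda_handler(event, context):
    # Create a dataset in the built-in "asc" namespace, which uses a
    # predefined schema, so no schema definition is passed here.
    response = supplychain.create_data_lake_dataset(
        instanceId=INSTANCE_ID,
        namespace="asc",
        name="outbound_order_line",
    )
    # A separate Lambda function (step 4) calls create_data_integration_flow
    # to connect a staging bucket prefix to this dataset.
    return response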

Tools

AWS services

  • Amazon CloudWatch Logs helps you centralize the logs from all your systems, applications, and AWS services so you can monitor them and archive them securely.

  • AWS Command Line Interface (AWS CLI) is an open source tool that helps you interact with AWS services through commands in your command-line shell.

  • Amazon Elastic Compute Cloud (Amazon EC2) provides scalable computing capacity in the AWS Cloud. You can launch as many virtual servers as you need and quickly scale them up or down.

  • Amazon EventBridge is a serverless event bus service that helps you connect your applications with real-time data from a variety of sources, such as AWS Lambda functions, HTTP invocation endpoints using API destinations, or event buses in other AWS accounts.

  • AWS Identity and Access Management (IAM) helps you securely manage access to your AWS resources by controlling who is authenticated and authorized to use them.

  • AWS IAM Identity Center helps you centrally manage single sign-on (SSO) access to all of your AWS accounts and cloud applications.

  • AWS Key Management Service (AWS KMS) helps you create and control cryptographic keys to help protect your data.

  • AWS Lambda is a compute service that helps you run code without needing to provision or manage servers. It runs your code only when needed and scales automatically, so you pay only for the compute time that you use.

  • Amazon Q in AWS Supply Chain is an interactive generative AI assistant that helps you operate your supply chain more efficiently by analyzing the data in your AWS Supply Chain data lake.

  • Amazon Quick Sight is a cloud-scale business intelligence (BI) service that helps you visualize, analyze, and report your data in a single dashboard.

  • Amazon Simple Storage Service (Amazon S3) is a cloud-based object storage service that helps you store, protect, and retrieve any amount of data.

  • AWS Supply Chain is a cloud-based managed application that organizations can use as a data store for supply chain domains and to generate insights and perform analysis on the ingested data.

  • Amazon Virtual Private Cloud (Amazon VPC) helps you launch AWS resources into a virtual network that you’ve defined. This virtual network resembles a traditional network that you’d operate in your own data center, with the benefits of using the scalable infrastructure of AWS. An Amazon VPC endpoint is a virtual device that helps you privately connect your VPC to supported AWS services without requiring an internet gateway, NAT device, VPN connection, or AWS Direct Connect connection.

Other tools

  • GitHub Actions is a continuous integration and continuous delivery (CI/CD) platform that’s tightly integrated with GitHub repositories. You can use GitHub Actions to automate your build, test, and deployment pipeline.

  • HashiCorp Terraform is an infrastructure as code (IaC) tool that helps you create and manage cloud and on-premises resources.

  • JFrog Artifactory provides end-to-end automation and management of binaries and artifacts through the application delivery process.

  • Python is a general-purpose programming language. This pattern uses Python for the Lambda function code that interacts with AWS Supply Chain.

Best practices

Epics

Task | Description | Skills required

Clone the repository.

To clone this pattern’s repository, run the following commands on your local workstation:

git clone https://github.com/aws-samples/sample-automate-aws-supply-chain-deployment.git
cd ASC-Deployment
AWS DevOps

(Automated option) Verify prerequisites for deployment.

Make sure that the Prerequisites are complete for the automated deployment.

App owner

(Manual option) Prepare for deployment of AWS Supply Chain datasets.

To go to the terraform-deployment directory of ASC-Datasets, run the following command:

cd ASC-Datasets/terraform-deployment

To assume the role ARN that was created in the Prerequisites, run the following command. (A helper sketch for exporting the returned temporary credentials follows this epic.)

aws sts assume-role --role-arn <enter AWS user role ARN> --role-session-name <your-session-name>

To configure and export the environment variables, run the following commands:

# Export Environment variables
export REGION=<Enter deployment region>
export REPO_NAME=<Enter Current ASC Datasets dir name>
export PROJECT_NAME="asc-deployment-poc"
export ACCOUNT_ID=<Enter deployment Account ID>
export ENVIRONMENT="dev"
export LAMBDA_LAYER_TEMP_DIR_TERRAFORM="layerOutput"
export LAMBDA_FUNCTION_TEMP_DIR_TERRAFORM="lambdaOutput"
export AWS_USER_ROLE=<Enter user role ARN for AWS Console access and deployment>
export S3_TERRAFORM_ARTIFACTS_BUCKET_NAME="$PROJECT_NAME-$ACCOUNT_ID-$REGION-terraform-artifacts-$ENVIRONMENT"
AWS DevOps

(Manual option) Prepare for managing AWS Supply Chain integration flows in deployment.

To go to the terraform-deployment directory of ASC-Integration-Flows, run the following command:

cd ASC-Integration-Flows/terraform-deployment

To assume the role ARN that was created earlier, run the following command:

aws sts assume-role --role-arn <enter AWS user role ARN> --role-session-name <your-session-name>

To configure and export the environment variables, run the following commands:

# Export Environment variables
export REGION=<Enter deployment region>
export REPO_NAME=<Enter Current ASC Integration Flows dir name>
export ASC_DATASET_VARS_REPO=<Enter Current ASC Datasets dir name> # Must be the same directory name used for ASC Datasets deployment
export PROJECT_NAME="asc-deployment-poc"
export ACCOUNT_ID=<Enter deployment Account ID>
export ENVIRONMENT="dev"
export LAMBDA_LAYER_TEMP_DIR_TERRAFORM="layerOutput"
export LAMBDA_FUNCTION_TEMP_DIR_TERRAFORM="lambdaOutput"
export S3_TERRAFORM_ARTIFACTS_BUCKET_NAME="$PROJECT_NAME-$ACCOUNT_ID-$REGION-terraform-artifacts-$ENVIRONMENT"
App owner
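The aws sts assume-role commands in this epic return temporary credentials that still have to be exported before Terraform and the AWS CLI run as the assumed role. The following small Python helper is a hypothetical convenience (not part of the pattern’s repository) that prints the corresponding export statements:

import boto3

ROLE_ARN = "<enter AWS user role ARN>"      # same role ARN used in the steps above
SESSION_NAME = "<your-session-name>"

sts = boto3.client("sts")
credentials = sts.assume_role(RoleArn=ROLE_ARN, RoleSessionName=SESSION_NAME)["Credentials"]

# Evaluate this output in your shell (for example, eval "$(python assume_role.py)")
# so that subsequent terraform and aws commands run as the assumed role.
print(f'export AWS_ACCESS_KEY_ID={credentials["AccessKeyId"]}')
print(f'export AWS_SECRET_ACCESS_KEY={credentials["SecretAccessKey"]}')
print(f'export AWS_SESSION_TOKEN={credentials["SessionToken"]}')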
Task | Description | Skills required

Copy the ASC-Datasets directory.

To copy the ASC-Datasets directory to a new location, use the following steps:

  1. To go to the ASC-Datasets directory, run the following command:

    cd ASC-Datasets
  2. To copy the ASC-Datasets directory to a new location, run the following commands:

    cp -r ASC-Datasets ../ASC-Datasets-standalone
    cd ../ASC-Datasets-standalone
AWS DevOps

Set up the ASC-Datasets directory.

To set up ASC-Datasets as a standalone repository in your organization, run the following commands:

git init
git add .
git commit -m "Initial commit: ASC-Datasets standalone repository"
git remote add origin <INSERT_ASC_DATASETS_GITHUB_URL>
git branch -M dev
AWS DevOps

Configure the branch name in the .github workflow file.

Set up the branch name in the deployment workflow file as shown in the following example:

on:
  workflow_dispatch:
  push:
    branches:
      - dev # Change to any other branch preferred for deployment
App owner

Set up GitHub environments and configure environment values.

To set up GitHub environments in your GitHub organization, use the instructions in Setup GitHub environments in this pattern’s repository.

To configure environment values in the workflow files, use the instructions in Setup environment values in the workflow files in this pattern’s repository.

App owner

Trigger the workflow.

To push your changes to your GitHub organization and trigger the deployment workflow, run the following command:

git push -u origin dev
AWS DevOps
Task | Description | Skills required

Copy the ASC-Integration-Flows directory.

To copy the ASC-Integration-Flows directory to a new location, use the following steps:

  1. To go to the ASC-Integration-Flows directory, run the following command:

    cd ASC-Integration-Flows
  2. To copy the ASC-Integration-Flows directory to a new location, run the following commands:

    cp -r ASC-Integration-Flows ../ASC-Integration-Flows-standalone
    cd ../ASC-Integration-Flows-standalone
AWS DevOps

Set up the ASC-Integration-Flows directory.

To set up the ASC-Integration-Flows directory as a standalone repository in your organization, run the following commands:

git init
git add .
git commit -m "Initial commit: ASC-Integration-Flows standalone repository"
git remote add origin <INSERT_ASC_Integration_Flows_GITHUB_URL>
git branch -M dev
AWS DevOps

Configure the branch name in the .github workflow file.

Set up the branch name in the deployment workflow file as shown in the following example:

on:
  workflow_dispatch:
  push:
    branches:
      - dev # Change to any other branch preferred for deployment
App owner

Set up GitHub environments and configure environment values.

To set up GitHub environments in your GitHub organization, use the instructions in Setup GitHub environments in this pattern’s repository.

To configure environment values in the workflow files, use the instructions in Setup environment values in the workflow files in this pattern’s repository.

App owner

Trigger the workflow.

To push your changes to your GitHub organization and trigger the deployment workflow, run the following command:

git push -u origin dev
AWS DevOps
Task | Description | Skills required

Navigate to the terraform-deployment directory.

To go to the terraform-deployment directory of ASC-Datasets, run the following command:

cd ASC-Datasets/terraform-deployment
AWS DevOps

Set up the Terraform state Amazon S3 bucket.

To set up the Terraform state Amazon S3 bucket, use the following script:

# Setup terraform bucket
chmod +x ../scripts/setup-terraform.sh
../scripts/setup-terraform.sh
AWS DevOps

Set up the Terraform artifacts Amazon S3 bucket.

To set up the Terraform artifacts Amazon S3 bucket, use the following script:

# Setup terraform artifacts bucket
chmod +x ../scripts/setup-terraform-artifacts-bucket.sh
../scripts/setup-terraform-artifacts-bucket.sh
AWS DevOps

Set up the Terraform backend and providers configuration.

To set up the Terraform backend and providers configuration, use the following script:

# Setup terraform backend and providers config if they don't exist
chmod +x ../scripts/generate-terraform-config.sh
../scripts/generate-terraform-config.sh
AWS DevOps

Generate a deployment plan.

To generate a deployment plan, run the following commands:

# Run terraform init and validate
terraform init
terraform validate

# Run terraform plan
terraform plan \
  -var-file="tfInputs/$ENVIRONMENT.tfvars" \
  -var="project_name=$PROJECT_NAME" \
  -var="environment=$ENVIRONMENT" \
  -var="user_role=$AWS_USER_ROLE" \
  -var="lambda_temp_dir=$LAMBDA_FUNCTION_TEMP_DIR_TERRAFORM" \
  -var="layer_temp_dir=$LAMBDA_LAYER_TEMP_DIR_TERRAFORM" \
  -parallelism=40 \
  -out='tfplan.out'
AWS DevOps

Deploy the configurations.

To deploy the configurations, run the following command:

# Run terraform apply
terraform apply tfplan.out
AWS DevOps

Update other configurations and store outputs.

To update AWS KMS key policies and store the applied configuration outputs in the Terraform artifacts Amazon S3 bucket, run the following commands. (An optional check of the uploaded outputs file follows this epic.)

# Update AWS Supply Chain KMS Key policy with the service's requirements
chmod +x ../scripts/update-asc-kms-policy.sh
../scripts/update-asc-kms-policy.sh

# Update AWS KMS Keys' policy with IAM roles
chmod +x ../scripts/update-kms-policy.sh
../scripts/update-kms-policy.sh

# Create terraform outputs file to be used as input variables
terraform output -json > raw_output.json
jq -r 'to_entries | map( if .value.type == "string" then "\(.key) = \"\(.value.value)\"" else "\(.key) = \(.value.value | tojson)" end ) | .[]' raw_output.json > $REPO_NAME-outputs.tfvars

# Upload reformed outputs file to Amazon S3 terraform artifacts bucket (For retrieval from other repositories)
aws s3 cp $REPO_NAME-outputs.tfvars s3://$S3_TERRAFORM_ARTIFACTS_BUCKET_NAME/$REPO_NAME-outputs.tfvars
rm -f raw_output.json
rm -f $REPO_NAME-outputs.tfvars
AWS DevOps
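As an optional check (not part of the pattern), the following Python snippet verifies that the outputs file from the previous task is available in the Terraform artifacts bucket for the ASC-Integration-Flows deployment to consume. It assumes the same environment variables that were exported earlier.

import os
import boto3

bucket = os.environ["S3_TERRAFORM_ARTIFACTS_BUCKET_NAME"]
key = f"{os.environ['REPO_NAME']}-outputs.tfvars"

s3 = boto3.client("s3")
# head_object raises a ClientError (404) if the outputs file was not uploaded.
s3.head_object(Bucket=bucket, Key=key)
print(f"Found s3://{bucket}/{key}")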
Task | Description | Skills required

Navigate to the terraform-deployment directory.

To go to the terraform-deployment directory of ASC-Integration-Flows, run the following command:

cd ASC-Integration-Flows/terraform-deployment
AWS DevOps

Set up the Terraform backend and providers configuration.

To set up the Terraform backend and provider configurations, use the following script:

# Setup terraform backend and providers config if they don't exist
chmod +x ../scripts/generate-terraform-config.sh
../scripts/generate-terraform-config.sh
AWS DevOps

Generate a deployment plan.

To generate a deployment plan, run the following commands. These commands initialize your Terraform environment, merge configuration variables from ASC-Datasets with your existing Terraform configurations, and generate a deployment plan.

# Run terraform init and validate
terraform init
terraform validate

# Download and merge ASC DATASET tfvars
chmod +x ../scripts/download-vars-through-s3.sh
../scripts/download-vars-through-s3.sh $ASC_DATASET_VARS_REPO

# Run terraform plan
terraform plan \
  -var-file="tfInputs/$ENVIRONMENT.tfvars" \
  -var="project_name=$PROJECT_NAME" \
  -var="environment=$ENVIRONMENT" \
  -var="lambda_temp_dir=$LAMBDA_FUNCTION_TEMP_DIR_TERRAFORM" \
  -var="layer_temp_dir=$LAMBDA_LAYER_TEMP_DIR_TERRAFORM" \
  -parallelism=40 \
  -out='tfplan.out'
AWS DevOps

Deploy the configurations.

To deploy the configurations, run the following command:

# Run terraform apply
terraform apply tfplan.out
AWS DevOps

Update other configurations.

To update AWS KMS key policies and store the applied configuration outputs in the Terraform artifacts Amazon S3 bucket, run the following commands. (An optional verification sketch for the deployed integration flows follows this epic.)

# Update AWS KMS Keys' policy with IAM roles
chmod +x ../scripts/update-kms-policy-through-s3.sh
../scripts/update-kms-policy-through-s3.sh $ASC_DATASET_VARS_REPO

# Create terraform outputs file to be used as input variables
terraform output -json > raw_output.json
jq -r 'to_entries | map( if .value.type == "string" then "\(.key) = \"\(.value.value)\"" else "\(.key) = \(.value.value | tojson)" end ) | .[]' raw_output.json > $REPO_NAME-outputs.tfvars

# Upload reformed outputs file to Amazon S3 terraform artifacts bucket (For retrieval from other repositories)
aws s3 cp $REPO_NAME-outputs.tfvars s3://$S3_TERRAFORM_ARTIFACTS_BUCKET_NAME/$REPO_NAME-outputs.tfvars
rm -f raw_output.json
rm -f $REPO_NAME-outputs.tfvars
AWS DevOps
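To confirm that the integration flows were created on the instance, you can list them with the AWS Supply Chain API. The following Python sketch is an optional, illustrative check; the instance ID placeholder comes from the ASC-Datasets Terraform outputs (asc_instance_id), and the response shape should be verified against the boto3 documentation for list_data_integration_flows.

import boto3

INSTANCE_ID = "<your-asc-instance-id>"  # asc_instance_id from the ASC-Datasets Terraform outputs

supplychain = boto3.client("supplychain")
response = supplychain.list_data_integration_flows(instanceId=INSTANCE_ID)

# Print the name of each deployed integration flow.
for flow in response.get("flows", []):
    print(flow.get("name"))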
Task | Description | Skills required

Upload sample CSV files.

To upload sample CSV files for the datasets, use the following steps:

  1. Create sample CSV files with varied data for the Calendar and Outbound Order Line datasets that were created in deployment. (A file-generation sketch follows this procedure.)

  2. Fetch the AWS Supply Chain instance ID (asc_instance_id) from the Terraform outputs.

  3. Note the Amazon S3 bucket name for AWS Supply Chain that was created in deployment: aws-supply-chain-data-<Instance_ID>

  4. To upload the files by using the AWS CLI, run the following commands:

    # Upload Calendar CSV file
    aws s3 cp calendar_sample.csv s3://aws-supply-chain-data-<Instance_ID>/calendar-data/

    # Upload Outbound Order Line CSV file
    aws s3 cp outbound_order_line_sample.csv s3://aws-supply-chain-data-<Instance_ID>/outbound-order-line-data/
Data engineer
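If you need a starting point for step 1, the following Python sketch writes a small sample CSV file. The column names are placeholders only; replace them with the field names required by the Calendar dataset schema in your AWS Supply Chain instance, and create a similar file for the Outbound Order Line dataset.

import csv

# Placeholder columns; substitute the field names defined by your Calendar dataset schema.
rows = [
    {"id": "cal-001", "start_date": "2024-01-01", "end_date": "2024-12-31"},
    {"id": "cal-002", "start_date": "2025-01-01", "end_date": "2025-12-31"},
]

with open("calendar_sample.csv", "w", newline="") as csv_file:
    writer = csv.DictWriter(csv_file, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)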
Task | Description | Skills required

Set up AWS Supply Chain access.

To set up AWS Supply Chain access from the AWS Management Console, use the following steps:

  1. Sign in to the AWS Management Console and search for the AWS Supply Chain service.

  2. Go to the instance asc-deployment-poc-dev-asc-instance.

  3. This pattern uses IAM Identity Center to manage user access to the AWS Supply Chain instance. To ensure complete access to this solution, sign in as the Administrator of the application.

App owner
Task | Description | Skills required

Trigger the destroy workflow for integration flows resources.

Trigger the destroy workflow of ASC-Integration-Flows from your deployment branch in your GitHub organization.

AWS DevOps

Trigger the destroy workflow for datasets resources.

Trigger the destroy workflow of ASC-Datasets from your deployment branch in your GitHub organization.

AWS DevOps
Task | Description | Skills required

Navigate to the terraform-deployment directory.

To go to the terraform-deployment directory of ASC-Integration-Flows, run the following command:

cd ASC-Integration-Flows/terraform-deployment
AWS DevOps

Set up the Terraform backend and providers configuration.

To set up the Terraform backend and providers configuration, use the following script:

# Setup terraform backend and providers config if they don't exist
chmod +x ../scripts/generate-terraform-config.sh
../scripts/generate-terraform-config.sh
AWS DevOps

Generate an infrastructure destruction plan.

To generate a detailed teardown plan for the controlled destruction of your AWS infrastructure, run the following commands. The process initializes Terraform, incorporates the AWS Supply Chain dataset configurations, and creates a destruction plan that you can review before you apply it.

# Run terraform init and validate
terraform init
terraform validate

# Download and merge ASC DATASET tfvars
chmod +x ../scripts/download-vars-through-s3.sh
../scripts/download-vars-through-s3.sh $ASC_DATASET_VARS_REPO

# Run terraform plan
terraform plan -destroy \
  -var-file="tfInputs/$ENVIRONMENT.tfvars" \
  -var="project_name=$PROJECT_NAME" \
  -var="environment=$ENVIRONMENT" \
  -var="lambda_temp_dir=$LAMBDA_FUNCTION_TEMP_DIR_TERRAFORM" \
  -var="layer_temp_dir=$LAMBDA_LAYER_TEMP_DIR_TERRAFORM" \
  -parallelism=40 \
  -out='tfplan.out'
AWS DevOps

Execute the infrastructure destruction plan.

To execute the planned destruction of your infrastructure, run the following command:

# Run terraform apply
terraform apply tfplan.out
AWS DevOps

Remove Terraform outputs from the Amazon S3 bucket.

To remove the outputs file that was uploaded during the deployment of ASC-Integration-Flows, run the following command:

# Delete the outputs file
aws s3 rm s3://$S3_TERRAFORM_ARTIFACTS_BUCKET_NAME/$REPO_NAME-outputs.tfvars
AWS DevOps
Task | Description | Skills required

Navigate to the terraform-deployment directory.

To go to the terraform-deployment directory of ASC-Datasets, run the following command:

cd ASC-Datasets/terraform-deployment
AWS DevOps

Set up the Terraform backend and providers configuration.

To set up the Terraform backend and providers configuration, use the following script:

# Setup terraform backend and providers config if they don't exist
chmod +x ../scripts/generate-terraform-config.sh
../scripts/generate-terraform-config.sh
AWS DevOps

Generate an infrastructure destruction plan.

To create a plan for destroying AWS Supply Chain dataset resources, run the following commands:

# Run terraform init and validate
terraform init
terraform validate

# Run terraform plan
terraform plan -destroy \
  -var-file="tfInputs/$ENVIRONMENT.tfvars" \
  -var="project_name=$PROJECT_NAME" \
  -var="environment=$ENVIRONMENT" \
  -var="user_role=$AWS_USER_ROLE" \
  -var="lambda_temp_dir=$LAMBDA_FUNCTION_TEMP_DIR_TERRAFORM" \
  -var="layer_temp_dir=$LAMBDA_LAYER_TEMP_DIR_TERRAFORM" \
  -parallelism=40 \
  -out='tfplan.out'
AWS DevOps

Empty Amazon S3 buckets.

To empty all Amazon S3 buckets (except the server access logging bucket, which is configured for force-destroy), use the following script:

# Delete S3 buckets excluding server access logging bucket
chmod +x ../scripts/empty-s3-buckets.sh
../scripts/empty-s3-buckets.sh tfplan.out
AWS DevOps

Execute the infrastructure destruction plan.

To execute the planned destruction of your AWS Supply Chain dataset infrastructure using the generated plan, run the following command:

# Run terraform apply
terraform apply tfplan.out
AWS DevOps

Remove Terraform outputs from the Amazon S3 Terraform artifacts bucket.

To complete the cleanup process, remove the outputs file that was uploaded during the deployment of ASC-Datasets by running the following command:

# Delete the outputs file
aws s3 rm s3://$S3_TERRAFORM_ARTIFACTS_BUCKET_NAME/$REPO_NAME-outputs.tfvars
AWS DevOps

Troubleshooting

Issue | Solution

An AWS Supply Chain dataset or integration flow did not deploy correctly because of AWS Supply Chain internal errors or insufficient IAM permissions for the service role.

First, clean up all resources. Then, redeploy the AWS Supply Chain dataset resources and then redeploy the AWS Supply Chain integration flow resources.

The AWS Supply Chain integration flow doesn’t fetch the new data files uploaded for the AWS Supply Chain datasets.

  1. Check that the prefix in the AWS Supply Chain integration flow configuration matches the prefix that was used when uploading the sample data files. (See the inspection sketch after this table.)

  2. If the resources for the AWS Supply Chain datasets were re-created, their associated Amazon Resource Names (ARNs) change internally. Therefore, redeploy the AWS Supply Chain integration flow resources.
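For the prefix mismatch described in the first troubleshooting item, you can inspect a flow's configured source prefix directly. The following Python sketch is illustrative only; the instance ID and flow name are placeholders, and the response shape should be checked against the boto3 documentation for get_data_integration_flow.

import boto3

supplychain = boto3.client("supplychain")

# Returns the flow definition, including its S3 source configuration;
# compare the configured prefix with the prefix used for the sample uploads.
flow = supplychain.get_data_integration_flow(
    instanceId="<your-asc-instance-id>",
    name="<integration-flow-name>",
)
print(flow)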

Related resources

AWS documentation

Other resources

Additional information

This solution can be replicated for additional datasets, and the ingested data can be queried for further analysis through prebuilt dashboards provided with AWS Supply Chain or through custom integration with Amazon Quick Sight. In addition, you can use Amazon Q to ask questions related to your AWS Supply Chain instance.

Analyze data with AWS Supply Chain Analytics

For instructions to set up AWS Supply Chain Analytics, see Setting up AWS Supply Chain Analytics in the AWS Supply Chain documentation.

This pattern demonstrates the creation of Calendar and Outbound_Order_Line datasets. To create an analysis that uses these datasets, use the following steps:

  1. To analyze the datasets, use the Seasonality Analysis dashboard. To add the dashboard, follow the steps in Prebuilt dashboards in the AWS Supply Chain documentation.

  2. Choose the dashboard to see its analysis that is based on sample CSV files for Calendar data and Outbound Order Line data.

The dashboard provides insights on demand over the years based on the ingested data for the datasets. You can further specify the ProductID, CustomerID, years, and other parameters for analysis.

Use Amazon Q to ask questions related to your AWS Supply Chain instance

Amazon Q in AWS Supply Chain is an interactive generative AI assistant that helps you operate your supply chain more efficiently. Amazon Q can do the following:

  • Analyze the data in your AWS Supply Chain data lake.

  • Provide operational and financial insights.

  • Answer your immediate supply chain questions.

For more information about using Amazon Q, see Enabling Amazon Q in AWS Supply Chain and Using Amazon Q in AWS Supply Chain in the AWS Supply Chain documentation.