

# Tutorial: Transfer data between applications with Amazon AppFlow

This tutorial explains how to use Amazon AppFlow with [Amazon Simple Storage Service](https://aws.amazon.com/s3) (Amazon S3) and Salesforce through the AWS Management Console. Optionally, if you want to use a different supported software as a service (SaaS) application, the tutorial provides general instructions for how to create a flow. A flow uses a connection to transfer data between a source and a destination. When you run a flow, Amazon AppFlow verifies that the data is available in the source, processes the data according to the flow configuration, and transfers the processed data to the destination.

**Objective**  
In this tutorial, you learn to transfer data between applications. Specifically, you transfer data both from Amazon S3 to Salesforce, and from Salesforce to Amazon S3. First, you synchronize additional account records with the customer relationship management (CRM) data already stored in Salesforce (Flow 1). You can optionally add validations to this flow to transfer only valid data. Then, you transfer the account data in Salesforce to Amazon S3 in an event-triggered flow (Flow 2). When Amazon AppFlow detects a change to the target data in the CRM storage service, an event-triggered flow runs. This way, you have access to up-to-date information in Amazon S3, where you can use it to hydrate a data lake and generate business value.

In this tutorial, you accomplish the following: 
+ Store a sample data set of accounts in [Amazon Simple Storage Service](https://aws.amazon.com/s3) (Amazon S3).
+ **Flow 1** — Use [Amazon AppFlow](https://aws.amazon.com/appflow) to transfer data from Amazon S3 to Salesforce.
+ **Flow 2** — Use Amazon AppFlow to transfer data from Salesforce to Amazon S3.

The following diagram shows the two workflows.

![Amazon AppFlow tutorial diagram.](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow-tutorial.png)


**Estimated cost:** Some of the actions in this tutorial may incur minor charges on your AWS account. The provided sample data is 1 KB. Should you choose to use your own data, you might incur greater charges. Reduce charges by completing the tutorial through [Step 5: Clean up your resources](flow-tutorial-clean-up.md). For information about pricing, see [Amazon S3 pricing](https://aws.amazon.com/s3/pricing/) and [Amazon AppFlow pricing](https://aws.amazon.com/appflow/pricing/).

**Topics**
+ [Prerequisites](#flow-tutorial-prerequisites)
+ [Step 1: Upload data to Amazon S3](flow-tutorial-set-up-source.md)
+ [Step 2: Connect to an application](flow-tutorial-connection.md)
+ [Step 3: Transfer data from Amazon S3 to a SaaS destination](flow-tutorial-s3-salesforce.md)
+ [Step 4: Transfer data from a SaaS source to Amazon S3](flow-tutorial-salesforce-s3.md)
+ [Step 5: Clean up](flow-tutorial-clean-up.md)

## Prerequisites


Before you begin, you need access to an AWS account and an account for a supported application. This tutorial uses Salesforce, but you can follow the steps to create flows with a different application. Before you can access the AWS services in this tutorial, your administrator must grant the required permissions to your user, group, or role.
+ **Amazon AppFlow setup** — If you haven't already done so, complete the [Getting started prerequisites](getting-started.md#prerequisites).
+ **AWS Identity and Access Management (IAM) setup** — You or your administrator must attach the AWS managed policy `AmazonAppFlowFullAccess` to your user, group, or role. For information on how to attach an IAM policy, see [Adding and removing IAM identity permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage-attach-detach.html) in the *IAM User Guide*. Also, you must create and attach the following policy to your user, group, or role.


  ```
  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "VisualEditor0",
        "Effect": "Allow",
        "Action": [
          "s3:GetBucketTagging",
          "s3:ListBucketVersions",
          "s3:CreateBucket",
          "s3:ListBucket",
          "s3:GetBucketPolicy",
          "s3:PutEncryptionConfiguration",
          "s3:GetEncryptionConfiguration",
          "s3:PutBucketTagging",
          "s3:GetObjectTagging",
          "s3:GetBucketOwnershipControls",
          "s3:PutObjectTagging",
          "s3:DeleteObject",
          "s3:DeleteBucket",
          "s3:DeleteObjectTagging",
          "s3:GetBucketPublicAccessBlock",
          "s3:GetBucketPolicyStatus",
          "s3:PutBucketPublicAccessBlock",
          "s3:PutAccountPublicAccessBlock",
          "s3:ListAccessPoints",
          "s3:PutBucketOwnershipControls",
          "s3:PutObjectVersionTagging",
          "s3:DeleteObjectVersionTagging",
          "s3:GetBucketVersioning",
          "s3:GetBucketAcl",
          "s3:PutObject",
          "s3:GetObject",
          "s3:GetAccountPublicAccessBlock",
          "s3:ListAllMyBuckets",
          "s3:GetAnalyticsConfiguration",
          "s3:GetBucketLocation"
        ],
        "Resource": "*"
      }
    ]
  }
  ```


  For information on how to create IAM policies, see [Creating IAM policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create.html) in the *IAM User Guide*. These two policies grant you all the permissions that you need to complete this tutorial. For more information on the different types of policies, see [Managed policies and inline policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_managed-vs-inline.html) in the *IAM User Guide*.
+ **Salesforce setup (Optional)** — If you already have a Salesforce account, or if you plan to complete this tutorial with a different SaaS application, you can skip this step. Otherwise, sign up for a free [Salesforce developer account](https://developer.salesforce.com/signup).
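Before you paste the custom policy above into the IAM console, you can check that the JSON document is well formed with a few lines of Python. This is a convenience sketch, not part of the tutorial; the policy text here is abbreviated to a few of the actions listed above:

```python
import json

# Abbreviated copy of the tutorial's S3 policy; paste the full document here.
policy_text = """
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": ["s3:CreateBucket", "s3:PutObject", "s3:GetObject"],
      "Resource": "*"
    }
  ]
}
"""

policy = json.loads(policy_text)  # raises an error if the JSON is malformed

# Basic shape checks before you attach the policy in the IAM console.
assert policy["Version"] == "2012-10-17"
assert all(s["Effect"] in ("Allow", "Deny") for s in policy["Statement"])
print("policy OK:", len(policy["Statement"]), "statement(s)")
```

If `json.loads` raises an error, fix the document before attaching it; the IAM console rejects malformed policy JSON.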

# Step 1: Upload data to Amazon S3


Suppose you have data, acquired from a web form, that you want to turn into Salesforce account records. You can upload this list of additional accounts to Amazon Simple Storage Service (Amazon S3). Amazon AppFlow can transfer the data from Amazon S3 to Salesforce to synchronize your customer relationship management (CRM) data.

To use Amazon S3 as your source for the flow, create a storage container, called a bucket, and populate it with data. Amazon AppFlow can transfer the data within an S3 bucket to any of the supported destinations. In this step, you create an S3 bucket, create a source folder within the S3 bucket, and upload sample data to the source folder.

**Topics**
+ [(Optional) Download sample data](#tutorial-download-data)
+ [Create an S3 bucket](#tutorial-create-bucket)
+ [Create a folder](#tutorial-create-folder)
+ [Upload data](#tutorial-upload-data)
+ [Additional resources](#tutorial-s3-additional-resources)

## (Optional) Download sample data


If you have your own data that you want to use for this tutorial, you can skip this step. Also, if you use a SaaS application other than Salesforce, this sample data may not be useful.

The sample data includes nine account records. Download this sample data set.

**To get the sample data**

1. Download the zip file [tutorial-account-data.zip](samples/tutorial-account-data.zip).

1. Extract the zip file. The unzipped file called `tutorial-account-data.csv` contains the sample data set.
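The sample file is a plain CSV with a header row. The following sketch shows how you can inspect a file of this shape with Python's `csv` module. The three rows here are made-up stand-ins (the real file contains nine records); the column names match the field names used later in the tutorial:

```python
import csv
import io

# Illustrative rows in the same shape as tutorial-account-data.csv.
# The values are invented; only the header names come from the tutorial.
sample = io.StringIO(
    "Account Name,Account Type,Billing State/Province,Account Rating,Industry\n"
    "Example1,Customer,WA,Hot,Technology\n"
    "Example2,Prospect,OR,Warm,Retail\n"
    "Example4,Prospect,CA,,Energy\n"
)

records = list(csv.DictReader(sample))
print(len(records), "records")
print("missing ratings:",
      [r["Account Name"] for r in records if not r["Account Rating"]])
```

Replace `sample` with `open("tutorial-account-data.csv", newline="")` to inspect the downloaded file itself.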

## Create an S3 bucket

After you extract your sample data, use the AWS Management Console to create an S3 bucket to store your data. Your S3 bucket must be in the same AWS Region as the one where you want to use Amazon AppFlow.

**To create an S3 bucket**

1. Open the Amazon S3 console at [https://console.aws.amazon.com/s3/](https://console.aws.amazon.com/s3/).

1. In the **Buckets** section, choose **Create bucket**.

1. For **Bucket name**, enter a descriptive name. The name must be globally unique. For example, enter ***username*-appflow-tutorial**.

1. For **AWS Region**, choose the same Region as your Amazon AppFlow console.
**Warning**  
If your S3 bucket isn't in the same AWS Region as your console, your flow can't access it.

1. Keep the other settings at their default values. Choose **Create bucket**.

## Create a folder in an S3 bucket

Now that you have an S3 bucket, use the console to create a folder in the bucket where you want to store the sample data. While a folder isn't essential, it's useful for keeping your files organized.

**To create a folder in Amazon S3**

1. Open the Amazon S3 console at [https://console.aws.amazon.com/s3/](https://console.aws.amazon.com/s3/).

1. In the **Buckets** section, choose your S3 bucket from the list.

1. Under the **Objects** tab, choose **Create folder**.

1. For the folder name, enter **source**.

1. Choose **Create folder**.

## Upload data to Amazon S3

Now that you have set up your S3 bucket, upload the data.

**To populate the S3 bucket with data**

1. Open the Amazon S3 console at [https://console.aws.amazon.com/s3/](https://console.aws.amazon.com/s3/).

1. In the **Buckets** section, choose your S3 bucket from the list.

1. Choose the `source` folder. Then, under the **Objects** tab, choose **Upload**.

1. Choose **Add files**, and choose your data set. If you downloaded the sample data set, choose the `tutorial-account-data.csv` file.

1. Choose **Upload**.

You now have an S3 bucket with sample data in the `source` folder.
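If you prefer to script these console steps, the AWS SDK for Python (boto3) can create the bucket and upload the file. This is a sketch under assumptions: the bucket name is a placeholder, the AWS calls require boto3 and credentials, and only the key-building helper runs without them:

```python
def source_key(filename: str) -> str:
    """Object key that places a file inside the tutorial's source/ folder."""
    return f"source/{filename}"

def upload_sample_data(bucket: str, path: str = "tutorial-account-data.csv") -> None:
    """Create the bucket and upload the sample file (needs boto3 + credentials)."""
    import boto3  # imported here so source_key() works without the SDK installed

    s3 = boto3.client("s3")
    # In Regions other than us-east-1, also pass
    # CreateBucketConfiguration={"LocationConstraint": region}.
    s3.create_bucket(Bucket=bucket)
    s3.upload_file(path, bucket, source_key(path))

if __name__ == "__main__":
    print(source_key("tutorial-account-data.csv"))
```

Remember that the bucket must be in the same AWS Region where you use Amazon AppFlow, just as in the console procedure above.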

## Additional resources


For more information on Amazon S3, see the following resources:
+ [Amazon S3](https://docs.aws.amazon.com/appflow/latest/userguide/s3.html) in the *Amazon AppFlow User Guide*.
+ [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/Welcome.html) in the *Amazon S3 User Guide*.

# Step 2: Connect Amazon AppFlow to an application

You can securely move your data between supported source and destination applications with a connection in Amazon AppFlow. Connections store the configuration details and credentials necessary to run flows without the need to repeatedly enter information. After you have an established connection with an application, you can use that connection in new or existing flows.

**Topics**
+ [Prerequisites](#flow-tutorial-connection-prerequisites)
+ [Create a connection](#flow-tutorial-make-connection)
+ [Additional resources](#tutorial-connection-additional-resources)

## Prerequisites


Before you begin, complete the [tutorial prerequisites](flow-tutorial.md#flow-tutorial-prerequisites).

## Create a connection between Amazon AppFlow and a SaaS application

To create and run a flow, you must first establish a connection with the software as a service (SaaS) application. You can create this connection while you create the flow, or you can create it separately. Here, you create the connection in Amazon AppFlow before you create the flow.

**To create a connection with Salesforce**

1. Open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Expand the navigation pane on the left-hand side of the console page and choose **Connections**.

1. For **Connectors**, select **Salesforce**.

1. Choose **Create connection**.

1. Leave the default selections and enter a **Connection name**. For example, enter **my-salesforce-connection**.

1. Choose **Continue**.

1. If you're not already logged into Salesforce, Amazon AppFlow prompts you to log in.

1. Choose **Allow** to give Amazon AppFlow access to your Salesforce account.

**To create a connection with other applications**
+ Go to the [Supported applications](app-specific.md) page and select the application that you want to connect with. Follow the instructions for your selected application.

You now have a connection in the Amazon AppFlow console to your SaaS account. If you use the same third-party application in both flows, you only need one connection.
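You can also confirm from code that the connection exists by using the AppFlow `DescribeConnectorProfiles` API. In this sketch the client is passed in as a parameter, so the helper can be exercised without credentials; with credentials configured, pass `boto3.client("appflow")`:

```python
def connection_names(appflow_client, connector_type: str = "Salesforce"):
    """Return the names of connector profiles for one connector type."""
    resp = appflow_client.describe_connector_profiles(connectorType=connector_type)
    return [p["connectorProfileName"] for p in resp["connectorProfileDetails"]]

# With real credentials:
#   import boto3
#   print(connection_names(boto3.client("appflow")))
```

If `my-salesforce-connection` appears in the returned list, the connection from the procedure above is ready to use in a flow.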

## Additional resources


For more information on connections, see the following resources:
+ [Managing connections](https://docs.aws.amazon.com/appflow/latest/userguide/connections.html) in the *Amazon AppFlow User Guide*.
+ [Salesforce](https://docs.aws.amazon.com/appflow/latest/userguide/salesforce.html) in the *Amazon AppFlow User Guide*.

# Step 3: Transfer data from Amazon S3 to a SaaS destination


Amazon S3 now hosts your data, but you still need to synchronize all your records in the destination. To transfer data to a supported destination, you must create and run a flow with Amazon AppFlow. In this step, you use the AWS Management Console to send data from Amazon S3 to either Salesforce or another software as a service (SaaS) application.

**Topics**
+ [Prerequisites](#flow-tutorial-s3-salesforce-prerequisites)
+ [Create a flow](#flow-tutorial-create-s3-salesforce-flow)
+ [Run a flow](#flow-tutorial-run-s3-salesforce-flow)
+ [View transferred data](#view-transferred-data)
+ [(Optional) Edit flow to add validations](#edit-add-validation)

## Prerequisites


Before you begin, complete [Step 1: Upload data to Amazon S3](flow-tutorial-set-up-source.md).

## Create a flow


The following procedures detail how to create a flow from Amazon S3 to Salesforce, but you can follow the steps with any destination.

**To complete Step 1: Specify flow details**

1. Open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/). Ensure that your Amazon AppFlow console is set to the same AWS Region as your S3 bucket.

1. Choose **Create flow**.

1. For **Flow name**, enter **s3-to-*SaaS***. For example, if your destination is Salesforce, enter **s3-to-salesforce**.

1. Under **Data encryption**, you have the option to activate custom encryption settings. By default, Amazon AppFlow encrypts your data with a key in AWS Key Management Service (AWS KMS). AWS creates, uses, and manages this key for you. Amazon AppFlow always encrypts your data during transit and at rest. The default encryption is adequate for this tutorial, so don't select custom encryption settings. For more information, see [Data protection](https://docs.aws.amazon.com/appflow/latest/userguide/data-protection.html) in the *Amazon AppFlow User Guide*.

1. Under **Tags**, you have the option to add tags to your flow. Tags are key-value pairs that assign metadata to resources that you create. Tags aren't necessary for this tutorial. For more information, see [Tagging AWS resources](https://docs.aws.amazon.com/general/latest/gr/aws_tagging.html) in the *AWS General Reference*.

1. To continue to Step 2: Configure flow, choose **Next**.

**To complete Step 2: Configure flow**

1. For **Source name**, choose **Amazon S3**.

1. In **Bucket details**, for *Choose an S3 bucket*, select your S3 bucket.

1. For *Enter bucket prefix*, enter **source**. Bucket prefixes are folders.

1. Ensure **Data format preference** is **CSV format**.

1. Configure the **Destination details**. These details vary based on the destination that you want to transfer data to.
   + If you want to transfer data to Salesforce, do the following:

     1. For **Destination name**, select **Salesforce**.

     1. For **Choose Salesforce connection**, select your connection. For example, select `my-salesforce-connection`, the connection that you created in the previous step.
**Tip**  
If you don't have a connection, you can choose **Connect** to create one now.

     1. If you want to use the sample data that you downloaded, for **Choose Salesforce object**, select **Account**.
   + If you want to transfer data to another supported application besides Salesforce, do the following:

     1. For **Destination name**, select the destination that you want for your data.

     1. For **Choose connection**, select the connection that you created, or create one.

     1. For the **object**, select the object type that matches your data.

     1. If there are any other destination details, configure the required fields.

1. In the **Error handling** section, you can specify how you want the flow to handle errors and where to put the data that causes errors. For this tutorial, you can leave the settings in this section at their default values.

1. For **Flow trigger**, leave the default selection **Run on demand**. When you select this value, you use a single button in the console to run the flow.
**Tip**  
You can also run flows on a schedule. Amazon AppFlow bases the time zone for this schedule on your web browser. For more information, see [Schedule-triggered flows](https://docs.aws.amazon.com/appflow/latest/userguide/flow-triggers.html) in the *Amazon AppFlow User Guide*.

1. To continue to Step 3: Map data fields, choose **Next**.

**To complete Step 3: Map data fields**

1. Map your data fields. These vary based on the destination for your data transfer.
   + If you're transferring to Salesforce, do the following:

     1. Under **Mapping method**, leave the default selection **Manually map fields**.

     1. Under **Destination record preference**, leave the default selection **Insert new records**.

     1. In the **Source to destination field mapping** section, select the *Choose source fields* dropdown and select **Map all fields directly**.
**Important**  
If you use the sample data, ensure that each source field maps to the destination field of the same name: Account Name, Account Type, Billing State/Province, Account Rating, and Industry.

     1. Choose **Map selected fields**.
   + If you want to transfer data to another supported application besides Salesforce, do the following:

     1. Select **Mapping method** and specify how you want to map your data. You can map the source fields to the destination fields manually, or upload a .csv file that contains the mappings.

     1. Map your fields from the source field name to the destination field name.

1. Under **Validations**, specify what happens to invalid data within the flow. For this step, you don't need any validations.

1. To continue to Step 4: Add filters, choose **Next**.

**To complete Step 4: Add filters**

1. Under **Filters**, specify what data the flow transfers. With this setting, you can ensure the flow transfers data only when it meets certain criteria. For this tutorial, you don't need any filters.

1. To continue to Step 5: Review and create, choose **Next**.

**To complete Step 5: Review and create**
+ Review the flow settings, and then choose **Create flow**.
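The console choices in this section can also be expressed as a single CreateFlow API request. The sketch below builds the request body as the author understands the boto3 `create_flow` parameters; treat the exact field shapes as assumptions and verify them against the current API reference before relying on them. The bucket and connection names are placeholders:

```python
def s3_to_salesforce_flow_request(bucket: str, connection: str) -> dict:
    """Request body for AppFlow CreateFlow mirroring the console choices above.

    Field shapes follow the boto3 appflow create_flow API as the author
    understands it; check the current API reference before use.
    """
    return {
        "flowName": "s3-to-salesforce",
        "triggerConfig": {"triggerType": "OnDemand"},  # "Run on demand"
        "sourceFlowConfig": {
            "connectorType": "S3",
            "sourceConnectorProperties": {
                "S3": {
                    "bucketName": bucket,
                    "bucketPrefix": "source",  # the folder from Step 1
                    "s3InputFormatConfig": {"s3InputFileType": "CSV"},
                }
            },
        },
        "destinationFlowConfigList": [
            {
                "connectorType": "Salesforce",
                "connectorProfileName": connection,
                "destinationConnectorProperties": {
                    "Salesforce": {"object": "Account"}
                },
            }
        ],
        # "Map all fields directly" corresponds to a Map_all task.
        "tasks": [
            {
                "taskType": "Map_all",
                "sourceFields": [],
                "connectorOperator": {"S3": "NO_OP"},
            }
        ],
    }

req = s3_to_salesforce_flow_request("username-appflow-tutorial",
                                    "my-salesforce-connection")
print(req["flowName"], req["triggerConfig"]["triggerType"])
```

With credentials configured, you would pass this dictionary to `boto3.client("appflow").create_flow(**req)`.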

## Run a flow


You now have a run-on-demand flow. When you choose the **Run flow** button in the console, this flow transfers your data.

**To run a flow**

1. In **Flows**, select your flow from the list.

1. Choose **Run flow**.

When the flow successfully runs, a banner appears. If you use the provided data, the banner shows nine processed records.

![Success message for flow run.](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow-success.png)
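If you prefer to run the flow programmatically, the same action is available through the AppFlow StartFlow API. This hedged sketch takes the client as a parameter so it can be exercised without credentials; with credentials configured, pass `boto3.client("appflow")` and your flow name:

```python
def run_flow(appflow_client, flow_name: str) -> str:
    """Start an on-demand flow and return its execution ID (StartFlow API)."""
    return appflow_client.start_flow(flowName=flow_name)["executionId"]

# With real credentials:
#   import boto3
#   print(run_flow(boto3.client("appflow"), "s3-to-salesforce"))
```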


## View transferred data


After your flow runs, you can view the data in the destination.

**To view transferred data**
+ If you use the sample Salesforce account data, navigate to your Salesforce **Account** tab to view the imported account records. For more information on Salesforce accounts, see [Salesforce accounts](https://help.salesforce.com/s/articleView?id=sf.accounts).

You have now transferred data from Amazon S3 to Salesforce or the SaaS application that you chose. If you used Salesforce and the sample data, you have synchronized and expanded your Salesforce account data.

## (Optional) Edit flow to add validations


The flow that you ran transferred all the records in the data set. You can add validations to a flow so that you transfer only valid records. In this procedure, if you use the sample data, you edit your Amazon S3 to Salesforce flow to transfer only account records with ratings.

Before you edit and run the flow again, delete the records that you transferred from the original flow.

**To delete account records in Salesforce**
+ Follow the directions in [Mass Delete Records](https://help.salesforce.com/s/articleView?id=sf.admin_massdelete.htm).

For the sample data set, suppose you consider account records valid only if they have an account rating. Two of the account records don't have associated account ratings. You don't want to transfer these records from Amazon S3, so that only valid data reaches Salesforce.

**To edit a flow and add validations**

1. Open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In **Flows**, choose your flow.

1. Choose **Actions**, then choose **Edit flow**.

1. Choose **Next** until you reach **Step 3: Edit data fields**.

1. In **Validations**, choose **Add validation**.

1. If you use the sample data, for **Field name**, select **Account rating**. For **Condition**, choose **Values missing or null**. For **Action**, choose **Ignore record**. This configuration omits account records with missing rating values from the transfer.  
![Example4 and Example8 are missing account rating values.](http://docs.aws.amazon.com/appflow/latest/userguide/images/validate-data.png)

1. Choose **Save**.

**To run the edited flow and view transferred data**

1. In **Flows**, select your flow from the list.

1. Choose **Run flow**. When the flow successfully runs, a banner appears.  
![Success message for flow run.](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow-success.png)

1. If you use the sample Salesforce account data, navigate to your Salesforce **Account** tab to view the imported account records. For more information on Salesforce accounts, see [Salesforce Accounts](https://help.salesforce.com/s/articleView?id=sf.accounts).

If you used the sample data, only seven of the nine records transferred. `Example4` and `Example8` do not appear because they have no account ratings associated with them.
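The validation you configured ("Values missing or null" with the "Ignore record" action) behaves like the filter below. The records here are illustrative stand-ins for the sample data, with only the named examples taken from the tutorial:

```python
# Stand-ins for the sample records; values other than the names are invented.
records = [
    {"Account Name": "Example3", "Account Rating": "Hot"},
    {"Account Name": "Example4", "Account Rating": ""},    # no rating: ignored
    {"Account Name": "Example5", "Account Rating": "Warm"},
    {"Account Name": "Example8", "Account Rating": None},  # no rating: ignored
]

# Keep only records whose rating is neither missing nor null.
valid = [r for r in records if r["Account Rating"]]
print([r["Account Name"] for r in valid])  # ['Example3', 'Example5']
```

Applied to the full sample data set, this rule keeps seven of the nine records, matching the flow-run result described above.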

# Step 4: Transfer data from a SaaS source to Amazon S3


Suppose you now want to transfer your data from Salesforce to Amazon S3. With Amazon S3, you can synchronize and replicate customer relationship management (CRM) data into data lakes to analyze or use to drive machine learning. To keep this information up to date, you can create an event-triggered flow from Salesforce to Amazon S3. An event-triggered flow runs when Amazon AppFlow detects a change to the target data in the CRM storage service.

After you create an S3 bucket, you can set up and run a flow with Amazon AppFlow to transfer data from a supported source to the S3 bucket. You can use one S3 bucket as both a source and destination, so you don't need to create a new S3 bucket if you already created one for this tutorial. In this step, you use the AWS Management Console to create and run a flow from Salesforce or another software as a service (SaaS) application to Amazon S3.

**Topics**
+ [Prerequisites](#flow-tutorial-salesforce-s3-prerequisites)
+ [Change data capture in Salesforce](#change-data-capture-salesforce)
+ [Create a flow](#flow-tutorial-create-salesforce-s3-flow)
+ [Run a flow (event-triggered or on-demand)](#flow-tutorial-run-salesforce-s3-flow)
+ [View transferred data](#get-transferred-data)

## Prerequisites


Before you begin, you need an S3 bucket to receive the data if you don't already have one. You can use the same S3 bucket as both a source and destination for different flows. This tutorial uses Salesforce for a SaaS account, but you can use another supported source application if you want. Some flow options that this tutorial uses don't work for a SaaS application other than Salesforce.
+ **Amazon S3 setup** — If you don't already have an S3 bucket, [Create an S3 bucket](flow-tutorial-set-up-source.md#tutorial-create-bucket) to prepare Amazon S3 to receive your data.
+ **Salesforce setup (Optional)** — If you already have a Salesforce account, or if you plan to complete this tutorial with a different SaaS application, you can skip this step. Otherwise, sign up for a free [Salesforce developer account](https://developer.salesforce.com/signup).
+ **Transfer data to Salesforce (Optional)** — If you use Salesforce for this tutorial, we recommend that you complete [Step 3: Transfer data from Amazon S3](flow-tutorial-s3-salesforce.md) before you continue.

## Change data capture in Salesforce


To run an event-triggered flow, Amazon AppFlow needs to receive a notification when a record changes. When you use the change data capture feature in Salesforce, you can generate change event notifications for selected entities. If you don't have administrator-level credentials, you might not be able to select entities to generate change notifications. However, the free developer account has administrator privileges.

**To enable change data capture**

1. Open Salesforce at [www.salesforce.com](https://www.salesforce.com/) and log in to your account.

1. Navigate to the **Change Data Capture** page.

1. If you use the sample data, select **Account (Account)** to generate change event notifications. Otherwise, select the appropriate entity for your data.

For more information about Salesforce change data capture, see [Change Data Capture](https://developer.salesforce.com/docs/atlas.en-us.change_data_capture.meta/change_data_capture/cdc_intro.htm).

## Create a flow


The following procedures detail how to create a flow from Salesforce to Amazon S3, but you can follow the steps with any supported source. Some flow options that this tutorial uses don't work for a SaaS application other than Salesforce, but alternate steps appear.

**To complete Step 1: Specify flow details**

1. Open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow name**, enter ***SaaS*-to-s3**. For example, if your source is Salesforce, enter **salesforce-to-s3**.

1. Under **Data encryption**, you have the option to activate custom encryption settings. By default, Amazon AppFlow encrypts your data with a key in AWS Key Management Service (AWS KMS). AWS creates, uses, and manages this key for you. Amazon AppFlow always encrypts your data during transit and at rest. The default encryption is adequate for this tutorial, so don't select custom encryption settings. For more information, see [Data protection](https://docs.aws.amazon.com/appflow/latest/userguide/data-protection.html) in the *Amazon AppFlow User Guide*.

1. Under **Tags**, you have the option to add tags to your flow. Tags are key-value pairs that assign metadata to resources that you create. Tags aren't necessary for this tutorial. For more information, see [Tagging AWS resources](https://docs.aws.amazon.com/general/latest/gr/aws_tagging.html) in the *AWS General Reference*.

1. To continue to Step 2: Configure flow, choose **Next**.

**To complete Step 2: Configure flow**

1. Configure the **Source details**. These details vary based on the source that you want to transfer data from.
   + If you want to transfer data from Salesforce, do the following:

     1. For **Source name**, choose **Salesforce**.

     1. For **Choose Salesforce connection**, select your connection. For example, select `my-salesforce-connection`, the connection that you created in a previous step.
**Tip**  
If you don't have a connection, you can choose **Connect** to create one now.

     1. Select **Salesforce events**.

     1. If you use the sample data, for **Choose Salesforce event**, select **Account Change Event**. Otherwise, select the event that matches your data.
   + If you want to transfer data from another supported application besides Salesforce, do the following:

     1. For **Source name**, select the source that you want for your data.

     1. For **Choose connection**, select the connection that you created, or create one.

     1. For the **object**, select the object type that matches your data.

     1. If there are any other source details, configure the required fields.

1. For **Destination name**, choose **Amazon S3**.

1. In **Bucket details**, for *Choose an S3 bucket*, select your S3 bucket. Use the same S3 bucket that contains the `source` folder from the previous step.

1. For *Enter bucket prefix*, enter **destination**. Bucket prefixes are folders.
**Tip**  
If you don't have a folder that matches the name that you entered, the flow automatically creates one when it runs.

1. Configure the **Flow trigger**. This varies based on the source where you want to transfer data from.
   + If you want to transfer data from Salesforce, leave the default selection **Run flow on event**.
   + If you want to transfer data from another supported application besides Salesforce, leave the default selection **Run on demand**. With this option, you run the flow by choosing a single button in the console.
**Tip**  
You can also run flows on a schedule. Amazon AppFlow bases the time zone for this schedule on your web browser. For more information, see [Schedule-triggered flows](https://docs.aws.amazon.com/appflow/latest/userguide/flow-triggers.html) in the *Amazon AppFlow User Guide*.

1. To continue to Step 3: Map data fields, choose **Next**.

**To complete Step 3: Map data fields**

1. Under **Mapping method**, leave the default selection **Manually map fields**.

1. In the **Source to destination field mapping** section, select the *Choose source fields* dropdown and select **Map all fields directly**.

1. Under **Validations**, specify what happens to invalid data within the flow. For this step, you don't need any validations.

1. To continue to Step 4: Add filters, choose **Next**.

**To complete Step 4: Add filters**

1. Under **Filters**, specify what data the flow transfers. With this setting, you can ensure the flow transfers data only when it meets certain criteria. For this tutorial, you don't need any filters.

1. To continue to Step 5: Review and create, choose **Next**.

**To complete Step 5: Review and create**
+ Review the flow settings, then choose **Create flow**.

## Run a flow (event-triggered or on-demand)

You now have a flow. The source that you use determines how you run this flow.

### Run an event-triggered flow with Salesforce


Your event-triggered flow runs when a change occurs to a record that you've set up to generate change event notifications. Here, you change a record within your Salesforce account to activate a flow run.

**To run an event-triggered flow with Salesforce**

1. Open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In **Flows**, select the `salesforce-to-s3` flow.

1. Choose **Activate flow**.

1. Open Salesforce at [www.salesforce.com](https://www.salesforce.com/) and log in to your account.

1. Navigate to the page where Salesforce stores your records. For the sample data, this is the **Accounts** page.

1. Edit one of the records. For example, in the sample data, change the **Rating** of `Example3` from **Cold** to **Hot**.

After about a minute, refresh your flow page in Amazon AppFlow. When the flow successfully runs, a timestamp from the last flow run appears.

![\[Timestamp showing last event-triggered flow run.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow-timestamp.png)
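You can also activate a flow programmatically. The AppFlow `StartFlow` action activates schedule- and event-triggered flows, and runs an on-demand flow immediately. The helper below is a hypothetical sketch: the client is passed in so the same function works with a real `boto3.client("appflow")`, and the response handling assumes the API returns a `flowStatus` field.

```python
def activate_flow(appflow_client, flow_name):
    """Activate an event-triggered flow (or run an on-demand flow).

    appflow_client is expected to behave like boto3.client("appflow").
    StartFlow activates Scheduled and Event flows; for OnDemand flows
    it runs the flow immediately.
    """
    response = appflow_client.start_flow(flowName=flow_name)
    return response.get("flowStatus")
```

With real credentials, this would be called as, for example, `activate_flow(boto3.client("appflow"), "salesforce-to-s3")`.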


### Run an on-demand flow with a supported SaaS source


Your on-demand flow runs when you choose the **Run flow** button in the console.

**To run an on-demand flow**

1. In **Flows**, select your flow from the list.

1. Choose **Run flow**.

When the flow successfully runs, a banner appears.

![\[Success message for flow run.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow-success2.png)


## View transferred data


The data from your source now resides in your S3 bucket. From there, you can, for example, analyze the data with other AWS services. In this step, you download and view the data on your computer.

**To retrieve the transferred data**

1. Open the Amazon S3 console at [https://console.aws.amazon.com/s3/](https://console.aws.amazon.com/s3/).

1. In **Buckets**, choose your S3 bucket from the list.

1. In your S3 bucket, choose the `destination` folder. Then choose the flow folder, for example, `salesforce-to-s3`.

1. The folder contains one file. Select this file and choose **Download**.

1. Navigate to the file in your `Downloads` folder and rename it with a descriptive name.

1. Open the file to view the updated record.
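The download steps above can also be scripted. The helper below is a hypothetical sketch that lists the objects under the flow's output prefix and downloads the newest one; it assumes the `<flow-name>/` folder layout described above, and the client is passed in so it works the same with a real `boto3.client("s3")`.

```python
def download_latest_output(s3_client, bucket, flow_name, local_path):
    """Download the newest file that a flow wrote to its output prefix.

    s3_client is expected to behave like boto3.client("s3"). The prefix
    assumes the flow writes into a folder named after the flow; adjust
    it if you chose a different destination prefix.
    """
    listing = s3_client.list_objects_v2(Bucket=bucket, Prefix=flow_name + "/")
    objects = listing.get("Contents", [])
    if not objects:
        raise RuntimeError("No output files found for flow " + flow_name)
    # Pick the most recently written object.
    newest = max(objects, key=lambda obj: obj["LastModified"])
    s3_client.download_file(bucket, newest["Key"], local_path)
    return newest["Key"]
```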

You've now transferred data from Salesforce (or the SaaS application that you chose) to Amazon S3. If you used Salesforce, you also set up an event-triggered flow that keeps the data in Amazon S3 up to date.

# Step 5: Clean up your resources

After you've completed the tutorial, it's good practice to clean up any resources that you no longer want to use. This way, your account doesn't incur any further charges.

**Topics**
+ [Clean up in Amazon S3](#flow-tutorial-clean-s3)
+ [Clean up in Amazon AppFlow](#flow-tutorial-clean-appflow)
+ [Clean up in Salesforce](#flow-tutorial-clean-salesforce)

## Clean up in Amazon S3


Because you used an S3 bucket as both a source and a destination throughout this tutorial, the bucket now holds multiple files. Unless you delete these files, their storage continues to incur charges on your AWS account. Before you delete an S3 bucket, make sure that you've saved any important files to another location.

**To clean up your S3 bucket**

1. Open the Amazon S3 console at [https://console.aws.amazon.com/s3/](https://console.aws.amazon.com/s3/).

1. In the **Buckets** section, select your S3 bucket and choose **Empty**. Follow the prompts to delete the contents of the bucket.

1. In the **Buckets** section, select your S3 bucket and choose **Delete**. Follow the prompts to delete the S3 bucket.
**Warning**  
Because S3 bucket names are globally unique, when you delete your S3 bucket, someone else can use its name. If you want to reserve an S3 bucket name, don't delete the bucket.

Now you have deleted all of the Amazon S3 resources that you created for the tutorial.
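If you prefer to script this cleanup, the console steps map to the S3 API calls sketched below. This is a hypothetical helper with the client passed in, so it works the same with a real `boto3.client("s3")`; note that a versioned bucket also needs its object versions and delete markers removed before the bucket can be deleted, which this sketch doesn't handle.

```python
def empty_and_delete_bucket(s3_client, bucket):
    """Delete every object in the bucket, then the bucket itself.

    s3_client is expected to behave like boto3.client("s3").
    list_objects_v2 returns at most 1,000 keys per call, so we loop
    until the bucket is empty before calling delete_bucket.
    """
    while True:
        listing = s3_client.list_objects_v2(Bucket=bucket)
        objects = listing.get("Contents", [])
        if not objects:
            break
        s3_client.delete_objects(
            Bucket=bucket,
            Delete={"Objects": [{"Key": obj["Key"]} for obj in objects]},
        )
    s3_client.delete_bucket(Bucket=bucket)
```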

For more information on how to empty and delete S3 buckets, see the following resources:
+ [Emptying a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/empty-bucket.html) in the *Amazon S3 User Guide*.
+ [Deleting a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/delete-bucket.html) in the *Amazon S3 User Guide*.

## Clean up in Amazon AppFlow


Amazon AppFlow stores both your connection and flows. To clean up all resources that you created in this tutorial, delete the two flows and your connection to the SaaS application.

**To clean up your flows**

1. Open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the **Flows** section, select a flow and choose **Delete**. Follow the prompts to delete your flow.

1. Repeat the previous step for any remaining flows.

**To clean up your connection**

1. Open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the **Connections** section, under **Connectors**, open the **Choose a connector** dropdown list. Select the connector that you used in the tutorial.

1. Select the connection and choose **Delete**.

1. If you used more than one connector, repeat steps 2 and 3 for all connectors.

Now you have deleted all of the resources that you created within Amazon AppFlow for the tutorial.
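The AppFlow cleanup can likewise be scripted with the `DeleteFlow` and `DeleteConnectorProfile` API actions. The helper below is a hypothetical sketch with the client passed in, so it works the same with a real `boto3.client("appflow")`; verify the parameter names against the current SDK documentation.

```python
def clean_up_appflow(appflow_client, flow_names, connection_name):
    """Delete the tutorial's flows, then the SaaS connection.

    appflow_client is expected to behave like boto3.client("appflow").
    forceDelete removes a flow or connection even if it is still in use.
    """
    for name in flow_names:
        appflow_client.delete_flow(flowName=name, forceDelete=True)
    appflow_client.delete_connector_profile(
        connectorProfileName=connection_name, forceDelete=True
    )
```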

## Clean up in Salesforce


If you used Salesforce for this tutorial and uploaded the sample data from an S3 bucket to Salesforce, you might want to delete the sample account records.

**To delete imported records in Salesforce**
+ Follow the directions in [Mass delete records](https://help.salesforce.com/s/articleView?id=sf.admin_massdelete.htm).

After you complete these steps, you have cleaned up all of the resources that you created in this tutorial. Deleted resources no longer incur charges on your AWS account.