

# Supported source and destination applications
<a name="app-specific"></a>

Choose an application in the following list to learn more about its setup requirements.

**Topics**
+ [Adobe Analytics](connectors-adobeanalytics.md)
+ [AfterShip](connectors-aftership.md)
+ [Amazon Connect](connectors-amazon-connect.md)
+ [Amazon EventBridge](EventBridge.md)
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon S3](s3.md)
+ [Amplitude](amplitude.md)
+ [Asana](connectors-asana.md)
+ [BambooHR](connectors-bamboohr.md)
+ [Blackbaud Raiser's Edge NXT](connectors-blackbaudraisersedgenxt.md)
+ [Braintree](connectors-braintree.md)
+ [CircleCI](connectors-circleci.md)
+ [Coupa](connectors-coupa.md)
+ [Datadog](datadog.md)
+ [Delighted](connectors-delighted.md)
+ [DocuSign Monitor](connectors-docusign-monitor.md)
+ [Domo](connectors-domo.md)
+ [Dynatrace](dynatrace.md)
+ [Facebook Ads](connectors-facebook-ads.md)
+ [Facebook Page Insights](connectors-facebook-page-insights.md)
+ [Freshdesk](connectors-freshdesk.md)
+ [Freshsales](connectors-freshsales.md)
+ [GitHub](connectors-github.md)
+ [GitLab](connectors-gitlab.md)
+ [Google Ads](connectors-google-ads.md)
+ [Google Analytics](google-analytics.md)
+ [Google Analytics 4](connectors-google-analytics-4.md)
+ [Google BigQuery](connectors-googlebigquery.md)
+ [Google Calendar](connectors-google-calendar.md)
+ [Google Search Console](connectors-google-search-console.md)
+ [Google Sheets](connectors-google-sheets.md)
+ [HubSpot](connectors-hubspot.md)
+ [Infor Nexus](infor-nexus.md)
+ [Instagram Ads](connectors-instagram-ads.md)
+ [Intercom](connectors-intercom.md)
+ [JDBC](connectors-jdbc.md)
+ [Jira Cloud](connectors-jira-cloud.md)
+ [Kustomer](connectors-kustomer.md)
+ [LinkedIn Ads](connectors-linkedin-ads.md)
+ [LinkedIn Pages](connectors-linkedin-pages.md)
+ [Mailchimp](connectors-mailchimp.md)
+ [Marketo](marketo.md)
+ [Microsoft Dynamics 365](connectors-microsoft-dynamics-365.md)
+ [Microsoft SharePoint Online](connectors-microsoft-sharepoint-online.md)
+ [Microsoft Teams](connectors-microsoft-teams.md)
+ [Mixpanel](connectors-mixpanel.md)
+ [Okta](connectors-okta.md)
+ [Oracle HCM](connectors-oracle-hcm.md)
+ [PayPal](connectors-paypal.md)
+ [Pendo](connectors-pendo.md)
+ [Pipedrive](connectors-pipedrive.md)
+ [Productboard](connectors-productboard.md)
+ [QuickBooks Online](connectors-quickbooks-online.md)
+ [Recharge](connectors-recharge.md)
+ [Salesforce](salesforce.md)
+ [Salesforce Marketing Cloud](connectors-salesforce-marketing-cloud.md)
+ [Salesforce Pardot](pardot.md)
+ [SAP OData](sapodata.md)
+ [SendGrid](connectors-sendgrid.md)
+ [ServiceNow](servicenow.md)
+ [Singular](singular.md)
+ [Slack](slack.md)
+ [Smartsheet](connectors-smartsheet.md)
+ [Snapchat Ads](connectors-snapchat-ads.md)
+ [Snowflake](snowflake.md)
+ [Stripe](connectors-stripe.md)
+ [Trend Micro](trend-micro.md)
+ [Typeform](connectors-typeform.md)
+ [Upsolver](upsolver.md)
+ [Veeva](veeva.md)
+ [WooCommerce](connectors-woocommerce.md)
+ [Zendesk](zendesk.md)
+ [Zendesk Chat](connectors-zendesk-chat.md)
+ [Zendesk Sell](connectors-zendesk-sell.md)
+ [Zendesk Sunshine](connectors-zendesk-sunshine.md)
+ [Zoho CRM](connectors-zoho-crm.md)
+ [Zoom](connectors-zoom.md)

# Adobe Analytics connector for Amazon AppFlow
<a name="connectors-adobeanalytics"></a>

Adobe Analytics is a business analysis software as a service (SaaS) solution. If you’re an Adobe Analytics user, your account contains business data, analytics, and more. You can use Amazon AppFlow to transfer data from Adobe Analytics to certain AWS services or other supported applications.

## Amazon AppFlow support for Adobe Analytics
<a name="adobeanalytics-support"></a>

Amazon AppFlow supports Adobe Analytics as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Adobe Analytics.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Adobe Analytics.

## Before you begin
<a name="adobeanalytics-prereqs"></a>

To use Amazon AppFlow to transfer data from Adobe Analytics to supported destinations, you must meet these requirements:
+ You have an account with Adobe Analytics that contains the data that you want to transfer. For more information about the Adobe Analytics data objects that Amazon AppFlow supports, see [Supported objects](#adobeanalytics-objects).
+ In your Adobe Analytics account, you've created an app for Amazon AppFlow. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For information about how to create an app, see [Add a new app](https://experienceleague.adobe.com/docs/mobile-services/using/manage-apps-ug/t-new-app.html?lang=en) in the Adobe Analytics documentation.
+ You've configured the app with a redirect URL for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Adobe Analytics. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
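The redirect URL can also be generated programmatically. The following is a minimal sketch that applies the format above; the Region codes in the loop are illustrative examples, not an exhaustive list of supported Regions:

```python
# Build the Amazon AppFlow OAuth redirect URL for a given AWS Region code,
# following the URL format described above. The sample Region codes are
# illustrative only.

def appflow_redirect_url(region: str) -> str:
    """Return the Amazon AppFlow OAuth redirect URL for an AWS Region code."""
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

for region in ("us-east-1", "eu-west-1", "ap-southeast-2"):
    print(appflow_redirect_url(region))
```

You register the printed URL for your Region in the redirect URL settings of your Adobe Analytics app.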

Note the client ID and client secret from your app settings. You provide these values to Amazon AppFlow when you create your connection.

### Connecting Amazon AppFlow to your Adobe Analytics account
<a name="adobeanalytics-connecting"></a>

To connect Amazon AppFlow to your Adobe Analytics account, provide the client credentials from your Adobe Analytics app so that Amazon AppFlow can access your data. If you haven't yet configured your Adobe Analytics account for Amazon AppFlow integration, see [Before you begin](#adobeanalytics-prereqs).

**To connect to Adobe Analytics**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Adobe Analytics**.

1. Choose **Create connection**.

1. In the **Connect to Adobe Analytics** window, enter the following information:
   + **Connection name** — A name for the connection.
   + **Client ID** — The client ID in your Adobe Analytics app.
   + **Client secret** — The client secret in your Adobe Analytics app. 
   + **X-API-KEY** — Re-enter the client ID in this field.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. Choose **Connect**.

1. In the window that appears, sign in to your Adobe Analytics account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Adobe Analytics as the data source, you can select this connection.

### Transferring data from Adobe Analytics with a flow
<a name="adobeanalytics-transfer-data"></a>

To transfer data from Adobe Analytics, create an Amazon AppFlow flow, and choose Adobe Analytics as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Adobe Analytics, see [Supported objects](#adobeanalytics-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#adobeanalytics-destinations).

### Supported destinations
<a name="adobeanalytics-destinations"></a>

When you create a flow that uses Adobe Analytics as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

### Supported objects
<a name="adobeanalytics-objects"></a>

When you create a flow that uses Adobe Analytics as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-adobeanalytics.html)

# AfterShip connector for Amazon AppFlow
<a name="connectors-aftership"></a>

AfterShip is a shipment tracking software as a service (SaaS) solution for e-commerce companies. AfterShip user accounts manage tracking data across more than 600 shipping services worldwide. You can use Amazon AppFlow to transfer data from AfterShip to certain AWS services or other supported applications.

## Amazon AppFlow support for AfterShip
<a name="aftership-support"></a>

Amazon AppFlow supports AfterShip as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from AfterShip.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to AfterShip.

## Before you begin
<a name="aftership-prereqs"></a>

To use Amazon AppFlow to transfer data from AfterShip to supported destinations, you must meet these requirements:
+ You have an account with AfterShip that contains the data that you want to transfer. For more information about the AfterShip data objects that Amazon AppFlow supports, see [Supported objects](#aftership-objects).
+ In the settings for your account, you've created an API key for Amazon AppFlow. Amazon AppFlow uses the API key to make authenticated calls to your account and securely access your data. For more information, see [Get the API key](https://www.aftership.com/docs/shipping/quickstart/api-quick-start#2-get-the-api-key) in the *AfterShip API Quick Start*.

Note the value of your API key. When you connect to your AfterShip account, you provide this value to Amazon AppFlow.

## Connecting Amazon AppFlow to your AfterShip account
<a name="aftership-connecting"></a>

To connect Amazon AppFlow to your AfterShip account, provide details from your AfterShip account so that Amazon AppFlow can access your data. If you haven't yet configured your AfterShip account for Amazon AppFlow integration, see [Before you begin](#aftership-prereqs).

Users who run the AfterShip connector for Amazon AppFlow can use one of two API key versions:
+ If you created your API key after July 7, 2022, select `as-api-key`. This is the latest version of the key, and it includes additional security features, such as Advanced Encryption Standard (AES) and Rivest-Shamir-Adleman (RSA) signatures.
+ If you created your API key before July 7, 2022, you must select `aftership-api-key`. This is a legacy version of the key and doesn't include the additional security features. To use AES or RSA signatures, replace your existing legacy key with a new API key. For more information, see [Authentication](https://www.aftership.com/docs/tracking/quickstart/authentication#4-legacy-api-keys) in the *AfterShip API Quick Start*. 
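If you call the AfterShip API directly, the two key versions are supplied through different request headers named after the versions above. The following is a hedged sketch; the key value is a placeholder:

```python
# Build the authentication header for an AfterShip API request.
# Which header name to use depends on when the API key was created
# (see the two key versions described above). The key is a placeholder.

def aftership_auth_header(api_key: str, legacy: bool = False) -> dict:
    """Return the auth header for an AfterShip API call."""
    header_name = "aftership-api-key" if legacy else "as-api-key"
    return {header_name: api_key}

print(aftership_auth_header("YOUR_API_KEY"))        # current key version
print(aftership_auth_header("YOUR_API_KEY", True))  # legacy key version
```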

**To connect to AfterShip**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **AfterShip**.

1. Choose **Create connection**.

1. In the **Connect to AfterShip** window, enter the following information:
   + **API key** – Enter your API key.
   + **API secret key** – Enter your secret key.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your AfterShip account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses AfterShip as the data source, you can select this connection.

## Transferring data from AfterShip with a flow
<a name="aftership-transfer-data"></a>

To transfer data from AfterShip, create an Amazon AppFlow flow, and choose AfterShip as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for AfterShip, see [Supported objects](#aftership-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#aftership-destinations).

## Supported destinations
<a name="aftership-destinations"></a>

When you create a flow that uses AfterShip as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="aftership-objects"></a>

When you create a flow that uses AfterShip as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-aftership.html)

# Amazon Connect connector for Amazon AppFlow
<a name="connectors-amazon-connect"></a>

Amazon Connect is an AWS service that you can use to set up an omnichannel, cloud-based contact center for your customers. Amazon Connect provides the Customer Profiles feature, which helps you create unified customer profiles. These profiles combine customer information from external applications with contact history from Amazon Connect. For example, you can combine contact information, order history, and interaction history from software as a service (SaaS) applications such as Salesforce and Zendesk, and from other Amazon AppFlow connectors. The contact center agents for your organization can use this consolidated information during customer support interactions.

If you use Amazon Connect, you can also use Amazon AppFlow to transfer data from supported data sources to Customer Profiles.

For more information about Customer Profiles, see [Use Amazon Connect Customer Profiles](https://docs.aws.amazon.com/connect/latest/adminguide/customer-profiles.html) in the *Amazon Connect Administrator Guide*.

## Amazon AppFlow support for Amazon Connect
<a name="amazon-connect-support"></a>

Amazon AppFlow supports Amazon Connect as follows.

**Supported as a data source?**  
No. You can't use Amazon AppFlow to transfer data from Amazon Connect.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to Amazon Connect.

**Supported Amazon Connect features**  
Amazon AppFlow integrates only with the Customer Profiles feature.

## Transferring data to Amazon Connect with a flow
<a name="amazon-connect-transfer-data"></a>

To transfer data to Amazon Connect Customer Profiles, you create an Amazon AppFlow flow, and you choose Amazon Connect as the data destination. Then, you use Amazon Connect to set up data mappings in Customer Profiles. These mappings define how data from the data source is mapped to the customer profile.

Before you can use Amazon AppFlow to transfer data to Customer Profiles, you must meet these requirements:
+ You have an Amazon Connect instance.
+ You have enabled the Customer Profiles feature for your Amazon Connect instance. When you enable Customer Profiles, you create a customer profiles domain, which is the container for your customer data in Amazon Connect.
+ You have configured Customer Profiles to encrypt your data under a KMS key.

For more information about creating a flow in Amazon AppFlow and setting up data mappings in Amazon Connect, see [Set up integration for external applications using Amazon AppFlow](https://docs.aws.amazon.com/connect/latest/adminguide/integrate-external-applications-appflow.html) in the *Amazon Connect Administrator Guide*.

# Amazon EventBridge
<a name="EventBridge"></a>

The following are the requirements and connection instructions for using Amazon EventBridge with Amazon AppFlow.

**Note**  
You can use Amazon EventBridge as a destination only.

**Topics**
+ [Requirements](#EventBridge-requirements)
+ [Connection instructions](#EventBridge-setup)
+ [Notes](#EventBridge-notes)
+ [Related resources](#EventBridge-resources)

## Requirements
<a name="EventBridge-requirements"></a>

Amazon AppFlow integrates with Amazon EventBridge to receive events from Salesforce. When you configure a flow that responds to Salesforce events, you can choose Amazon EventBridge as a destination. This enables Salesforce events received by Amazon AppFlow to be routed directly to a [partner event bus](https://docs.aws.amazon.com/eventbridge/latest/userguide/create-partner-event-bus.html).
+ To configure Amazon EventBridge integration in Amazon AppFlow, you must first create a flow with Amazon EventBridge as the destination and then specify the partner event source.
+ Before you can activate the flow, you must go to Amazon EventBridge to associate the partner event source with the event bus. After you complete this association and activate the flow, Salesforce events start flowing to the Amazon EventBridge event bus.

## Connection instructions
<a name="EventBridge-setup"></a>

**To create a flow with Amazon EventBridge as the destination**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow** and enter a name for your flow.

1. For **Source details**, choose **Salesforce** as the source and select **Salesforce Events** with the specific event name.

1. For **Destination details**, choose Amazon EventBridge as the destination and one of the following partner event sources:
   + **Existing partner event source** - Amazon AppFlow displays a list of existing partner event sources that are available to you.
   + **New partner event source** - Amazon AppFlow creates a new partner event source on your behalf. If you choose this option, the partner event source name generated by Amazon AppFlow appears in a dialog box. (Optional) You can modify this name if needed.
**Note**  
The actual call to Amazon EventBridge API operations for creating this partner event source happens only when you choose **Create flow** in step 11 of this procedure.

1. For **Large event handling**, specify the S3 bucket where you want Amazon AppFlow to send large event information.

1. Ensure that **Run flow on event** is selected in the **Flow trigger** section. This setting ensures that the flow is executed when a new Salesforce event occurs.

1. For field mapping, choose **Map all fields directly**. Alternatively, you can choose the fields that you're interested in using from the **Source field name** list.

1. Choose **Next**.

1. (Optional) Configure filters for data fields in Amazon AppFlow.

1. Choose **Next**.

1. Review the settings and then choose **Create flow**.

**To associate the partner event source with the event bus in Amazon EventBridge**

1. Open the **Partner event sources** view in the Amazon EventBridge console at [https://console.aws.amazon.com/events/home?#/partners/](https://console.aws.amazon.com/events/home?#/partners/).

1. Choose the partner event source that you created.

1. Choose **Associate with event bus**.

1. Validate the name of the partner event bus.

1. Choose **Associate**.

1. Return to Amazon AppFlow and choose **Activate flow** to activate the flow. 

## Notes
<a name="EventBridge-notes"></a>
+ Events are limited to 256 KB. For events larger than 256 KB, Amazon AppFlow doesn't send the full event to Amazon EventBridge. Instead, the event payload contains a pointer to an S3 bucket, where you can get the full event.
+ Events should be enabled in Salesforce and also in Amazon AppFlow for the destination to receive them. The destination service receives all such events configured for your account. If you need to filter the kinds of events that you want to process, or send different events to different targets, you can use [content-based filtering with event patterns](https://docs.aws.amazon.com/eventbridge/latest/userguide/content-filtering-with-event-patterns.html).
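As a sketch of content-based filtering, the following builds an event pattern that matches only one kind of change event. The `detail` field names here are illustrative assumptions; consult your actual Salesforce event payload for the real field names:

```python
import json

# Sketch of a content-based event pattern for an EventBridge rule.
# The field names under "detail" are hypothetical; check the payload of
# your actual Salesforce events before using a pattern like this.
event_pattern = {
    "detail": {
        "ChangeEventHeader": {
            "entityName": ["Account"],  # match only Account change events
        }
    }
}

# The JSON string is what you would supply as the rule's event pattern.
print(json.dumps(event_pattern))
```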

## Related resources
<a name="EventBridge-resources"></a>
+  [Receiving events from a SaaS partner](https://docs.aws.amazon.com/eventbridge/latest/userguide/create-partner-event-bus.html) in the *Amazon EventBridge* documentation 
+  [Amazon AppFlow now supports Amazon EventBridge as a destination](https://aws.amazon.com/about-aws/whats-new/2020/08/amazon-appflow-now-supports-amazon-eventbridge-as-a-destination) in the AWS *What's new* blog
+  [Building Salesforce integrations with Amazon EventBridge and Amazon AppFlow](https://aws.amazon.com/blogs/compute/building-salesforce-integrations-with-amazon-eventbridge/) in the AWS *Compute* blog

# Amazon Lookout for Metrics
<a name="lookout"></a>

The following are the requirements and connection instructions for using Amazon Lookout for Metrics with Amazon AppFlow.

**Note**  
You can use Amazon Lookout for Metrics as a destination only.

**Topics**
+ [Requirements](#lookout-requirements)
+ [Setup instructions](#lookout-setup)
+ [Notes](#lookout-notes)
+ [Related resources](#lookout-resources)

## Requirements
<a name="lookout-requirements"></a>
+ To get access to Amazon Lookout for Metrics, you must first be added to the allow list. To request access, see [Amazon Lookout for Metrics Preview](https://pages.awscloud.com/AmazonLookout-for-MetricsPreview.html). For more information about the service, see [Amazon Lookout for Metrics](https://aws.amazon.com/lookout-for-metrics/).

## Setup instructions
<a name="lookout-setup"></a>

**To create a flow with Amazon Lookout for Metrics as the destination**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow** and enter a name for your flow.

1. Under **Data encryption**, choose **Customize encryption settings (advanced)**, and then select an existing customer managed key (CMK) or create a new one. The default AWS managed CMK is not supported when using Amazon Lookout for Metrics as a destination.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. For **Source details**, choose a supported source and provide the requested information.

1. For **Destination details**, choose Amazon Lookout for Metrics as the destination for your time-series data.

1. When using Amazon Lookout for Metrics as a destination, only the **Run flow on schedule** option is available. Specify the appropriate schedule settings, such as the frequency, start date, and start time. You can also enter an end date (optional).

   Amazon Lookout for Metrics currently supports the following scheduling options:
   + If the source supports minutes: you can run the flow every 5 or 10 minutes by selecting **5** or **10** from the **Every** dropdown list.
   + If the source supports hours: you can run the flow once an hour by selecting **1** from the **Every** dropdown list.
   + If the source supports days: you can run the flow once a day by selecting **1** from the **Every** dropdown list.

1. Choose **Next**.

1. Under **Source to destination field mapping**, go to the **Source field name** dropdown list and choose **Map all fields directly**. Alternatively, you can manually select the fields that you want to use from the list.
**Note**  
A timestamp field is not required in your data. However, to use the anomaly detection feature of Amazon Lookout for Metrics, you need at least one measure (a numeric column whose values change over time).

1. (Optional) Under **Validations - optional**, add validations to check whether a field has bad data. For each field, choose the condition that indicates bad data and what action Amazon AppFlow should take when a field in a record is bad.

1. Choose **Next**.

1. (Optional) Specify a filter to determine which records to transfer. To add a filter, choose **Add filter**, select the field name, select a condition, and then specify the criteria.

1. Choose **Next**.

1. Review the settings and then choose **Create flow**.
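If you automate flow creation instead of using the console, the schedule settings from the procedure above map to a trigger configuration in the AppFlow `CreateFlow` API. The following is a hedged sketch of the shape such a configuration might take with the boto3 `create_flow` parameters; the schedule expression format is an assumption, so check the API reference before using it:

```python
from datetime import datetime, timezone

# Sketch: a scheduled trigger configuration matching the "Run flow on
# schedule" settings described above. The schedule expression format is
# an assumption; verify it against the CreateFlow API reference.
trigger_config = {
    "triggerType": "Scheduled",
    "triggerProperties": {
        "Scheduled": {
            "scheduleExpression": "rate(5minutes)",  # every 5 minutes (assumed format)
            "scheduleStartTime": datetime(2024, 1, 1, tzinfo=timezone.utc),
        }
    },
}

print(trigger_config["triggerType"])
```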

## Notes
<a name="lookout-notes"></a>
+ The default AWS managed CMK is not supported when using Amazon Lookout for Metrics as a destination.
+ The following sources are supported when using Amazon Lookout for Metrics as a destination:
  + Amplitude
  + Dynatrace
  + Google Analytics
  + Infor Nexus
  + Marketo
  + Salesforce
  + ServiceNow
  + Singular
  + Trend Micro
  + Veeva
  + Zendesk
+ Amazon Lookout for Metrics currently supports the following scheduling options:
  + If the source supports minutes: you can run the flow every 5 or 10 minutes 
  + If the source supports hours: you can run the flow once an hour
  + If the source supports days: you can run the flow once a day 

## Related resources
<a name="lookout-resources"></a>
+ [Amazon Lookout for Metrics](https://aws.amazon.com/lookout-for-metrics/) service page
+ [Amazon Lookout for Metrics Preview](https://pages.awscloud.com/AmazonLookout-for-MetricsPreview.html) 

# Amazon RDS for PostgreSQL connector for Amazon AppFlow
<a name="connectors-amazon-rds-postgres-sql"></a>

Amazon Relational Database Service (Amazon RDS) helps you set up and manage relational databases in the AWS Cloud. With Amazon RDS for PostgreSQL, you can set up Amazon RDS databases that run the PostgreSQL open source database system. If you use Amazon RDS for PostgreSQL, you can also use Amazon AppFlow to populate your databases with data that you transfer from certain AWS services or other supported applications.

## Amazon AppFlow support for Amazon RDS for PostgreSQL
<a name="amazon-rds-postgres-sql-support"></a>

Amazon AppFlow supports Amazon RDS for PostgreSQL as follows.

**Supported as a data source?**  
No. You can't use Amazon AppFlow to transfer data from Amazon RDS for PostgreSQL.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to Amazon RDS for PostgreSQL.

## Before you begin
<a name="amazon-rds-postgres-sql-prereqs"></a>

Before you can use Amazon AppFlow to transfer data to Amazon RDS for PostgreSQL, you must have one or more Amazon RDS databases where you've set the engine to PostgreSQL. For the steps to create such a database, see [Creating a PostgreSQL DB instance](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_GettingStarted.CreatingConnecting.PostgreSQL.html#CHAP_GettingStarted.Creating.PostgreSQL) in the *Amazon RDS User Guide*.

From your database settings, note the endpoint name and port. You provide these values, along with your database user name and password, to Amazon AppFlow when you connect to your database.
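Before creating the connection, you can verify these values yourself by assembling them into a standard libpq-style connection URI and testing it with a PostgreSQL client. A minimal sketch (all values below are placeholders):

```python
from urllib.parse import quote

def postgres_uri(host: str, port: int, user: str, password: str, database: str) -> str:
    """Assemble a standard PostgreSQL connection URI from the connection details."""
    # Percent-encode the credentials in case they contain reserved characters.
    return (
        f"postgresql://{quote(user, safe='')}:{quote(password, safe='')}"
        f"@{host}:{port}/{database}"
    )

# All values below are placeholders; substitute your own DB instance details.
print(postgres_uri(
    host="mydb.abc123xyz.us-east-1.rds.amazonaws.com",  # endpoint name
    port=5432,                                          # DB instance port
    user="postgres",
    password="example-password",
    database="mydb",
))
```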

## Connecting Amazon AppFlow to your Amazon RDS for PostgreSQL database
<a name="amazon-rds-postgres-sql-connecting"></a>

To connect Amazon AppFlow to your Amazon RDS for PostgreSQL database, provide details from your database settings.

**To connect to Amazon RDS for PostgreSQL**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Amazon RDS for PostgreSQL**.

1. Choose **Create connection**.

1. In the **Connect to Amazon RDS for PostgreSQL** window, enter the following information:
   + **driver** – Choose **postgresql**.
   + **hostname** – The endpoint name of the destination DB instance.
   + **port** – The DB instance port number.
   + **username** – The name of the DB instance master user.
   + **password** – The DB instance password.
   + **database** – The DB instance name.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Amazon RDS for PostgreSQL as the data destination, you can select this connection.

## Transferring data to Amazon RDS for PostgreSQL with a flow
<a name="amazon-rds-postgres-sql-transfer-data"></a>

To transfer data to Amazon RDS for PostgreSQL, create an Amazon AppFlow flow, and choose Amazon RDS for PostgreSQL as the data destination. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

# Amazon Redshift connector for Amazon AppFlow
<a name="redshift"></a>

Amazon Redshift is a data warehouse service in AWS. If you use Amazon Redshift, you can also use Amazon AppFlow to transfer data from supported sources into your Amazon Redshift databases. When you connect Amazon AppFlow to Amazon Redshift with the recommended settings, Amazon AppFlow transfers your data with the Amazon Redshift Data API.

For more information about Amazon Redshift, see the [Amazon Redshift Management Guide](https://docs.aws.amazon.com/redshift/latest/mgmt/welcome.html).

## Amazon AppFlow support for Amazon Redshift
<a name="redshift-support"></a>

Amazon AppFlow supports Amazon Redshift as follows.

**Supported as a data source?**  
No. You can't use Amazon AppFlow to transfer data from Amazon Redshift.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to Amazon Redshift.

## Before you begin
<a name="redshift-prereqs"></a>

Before you can use Amazon AppFlow to transfer data to Amazon Redshift, you must meet these requirements:
+ You have an Amazon Redshift database. If you are new to Amazon Redshift, see the [Amazon Redshift Getting Started Guide](https://docs.aws.amazon.com/redshift/latest/gsg/) to learn about basic concepts and tasks. You specify your database in the Amazon Redshift connection settings in Amazon AppFlow.
+ **Recommended**: You have an AWS Identity and Access Management (IAM) role that authorizes Amazon AppFlow to access your database through the Amazon Redshift Data API. You need this role to configure an Amazon Redshift connection with the recommended settings. For more information, and for the policies that you attach to this role, see [Allow Amazon AppFlow to access Amazon Redshift databases with the Data API](security_iam_service-role-policies.md#access-redshift).
+ You have an Amazon S3 bucket that Amazon AppFlow can use as an intermediate destination when it transfers data to Amazon Redshift. You specify this bucket in the connection settings. For the steps to create a bucket, see [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html) in the *Amazon S3 User Guide*.
+ You have an IAM role that grants Amazon Redshift read-only access to Amazon S3. You specify this role in the connection settings, and you associate it with your Amazon Redshift cluster. For more information, and for the policies that you attach to this role, see [Allow Amazon Redshift to access your Amazon AppFlow data in Amazon S3](security_iam_service-role-policies.md#redshift-access-s3).
+ In IAM, you're granted the required pass role permissions that are described in the following section.

### Required pass role permissions
<a name="redshift-passrole"></a>

Before you can create an Amazon Redshift connection, you must have certain IAM permissions assigned to you as an AWS user. These permissions must allow you to pass IAM roles to Amazon AppFlow and Amazon Redshift, as shown by the following example IAM policy:

Before you use this example policy, replace the variable elements with the required values:
+ `account-id` – Your AWS account ID.
+ `appflow-redshift-access-role-name` – The name of the role that authorizes Amazon AppFlow to access your Amazon Redshift database.
+ `region` – The code of the AWS Region where you use Amazon AppFlow. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
+ `redshift-s3-access-role-name` – The name of the role that grants Amazon Redshift read-only access to Amazon S3.
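As a sketch under the placeholder values above, such a policy might look like the following. The `Sid` values and condition keys are illustrative; confirm the exact statements against the linked IAM guidance before you use them:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PassAppFlowRedshiftAccessRole",
      "Effect": "Allow",
      "Action": "iam:PassRole",
      "Resource": "arn:aws:iam::account-id:role/appflow-redshift-access-role-name",
      "Condition": {
        "StringEquals": { "iam:PassedToService": "appflow.amazonaws.com" },
        "StringLike": { "iam:AssociatedResourceARN": "arn:aws:appflow:region:account-id:*" }
      }
    },
    {
      "Sid": "PassRedshiftS3AccessRole",
      "Effect": "Allow",
      "Action": "iam:PassRole",
      "Resource": "arn:aws:iam::account-id:role/redshift-s3-access-role-name",
      "Condition": {
        "StringEquals": { "iam:PassedToService": "redshift.amazonaws.com" }
      }
    }
  ]
}
```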

## Connecting Amazon AppFlow to your Amazon Redshift database
<a name="redshift-connecting"></a>

To connect Amazon AppFlow to your Amazon Redshift database, provide the required database details, S3 bucket, and IAM roles. If you haven't yet created the required resources, see the preceding section, [Before you begin](#redshift-prereqs).

**To create an Amazon Redshift connection**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Amazon Redshift**.

1. Choose **Create connection**.

1. For **Data warehouse type**, choose whether to connect to **Amazon Redshift Serverless** or an **Amazon Redshift cluster**.

1. If you chose to connect to Amazon Redshift Serverless, enter the following information:
   + **Workgroup name** – The name of your Amazon Redshift workgroup.
   + **Database name** – The name of the Amazon Redshift database that stores the data that you transfer with Amazon AppFlow.
   + **Bucket details** – The Amazon S3 bucket where Amazon AppFlow writes your data as an intermediate destination. Amazon Redshift gets your data from this bucket.
   + **IAM role for Amazon S3 access** – The IAM role that authorizes Amazon Redshift to get and decrypt the data from the S3 bucket.
   + **IAM role for Amazon Redshift Data API access** — The IAM role that authorizes Amazon AppFlow to access your database through the Amazon Redshift Data API.
**Note**  
After you create a connection to Amazon Redshift Serverless, you must also grant the required access privileges to your database user. For more information, see [Granting access privileges to the database user (required for Amazon Redshift Serverless)](#grant-access).

1. If you chose to connect to an Amazon Redshift cluster, do one of the following:
   + **Recommended:** Choose **Data API** to connect through the Amazon Redshift Data API. This option is recommended because Amazon AppFlow can use the Data API to connect to public and private Amazon Redshift clusters. Enter the following information:
     + **Cluster identifier** – The unique identifier of your Amazon Redshift cluster.
     + **Database name** – The name of the Amazon Redshift database that stores the data that you transfer with Amazon AppFlow.
     + **Bucket details** – The Amazon S3 bucket where Amazon AppFlow writes your data as an intermediate destination. Amazon Redshift gets your data from this bucket.
     + **IAM role for Amazon S3 access** – The IAM role that authorizes Amazon Redshift to get and decrypt the data from the S3 bucket.
     + **IAM role for Amazon Redshift Data API access** – The IAM role that authorizes Amazon AppFlow to access your database through the Amazon Redshift Data API.
     + **Amazon Redshift database user name** – The user name that you use to authenticate with your Amazon Redshift database.
   + **Not recommended:** Choose **JDBC URL** to connect through a Java Database Connectivity (JDBC) URL. For information about the settings for this option, see the [Guidance for connections that use JDBC URLs](#jdbc-guidance) section that follows.
**Warning**  
We don't recommend that you choose the **JDBC URL** option because Amazon AppFlow can't use JDBC URLs to connect to private Amazon Redshift clusters. Amazon AppFlow will discontinue support for JDBC URLs in the near future. We strongly recommend that you configure your connection with the Data API instead.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Amazon Redshift as the data destination, you can select this connection.
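If you manage connections programmatically, the console steps above correspond to a `CreateConnectorProfile` API call. The following sketch builds such a request for a cluster connection with the recommended Data API settings. All names and ARNs are placeholders, and the property names follow the AWS SDK for Python (Boto3) `appflow` client; verify them against the current Amazon AppFlow API reference before use:

```python
# Sketch: create the same Amazon Redshift connection programmatically. The
# request shape follows the Boto3 "appflow" client's create_connector_profile
# operation; all names and ARNs below are placeholders.

def build_redshift_profile_request(name, cluster_id, database, bucket,
                                   s3_role_arn, data_api_role_arn, db_user):
    """Build a CreateConnectorProfile request that uses the recommended
    Amazon Redshift Data API settings for a cluster connection."""
    return {
        "connectorProfileName": name,
        "connectorType": "Redshift",
        "connectionMode": "Private",
        "connectorProfileConfig": {
            "connectorProfileProperties": {
                "Redshift": {
                    "clusterIdentifier": cluster_id,
                    "databaseName": database,
                    "bucketName": bucket,
                    # IAM role for Amazon S3 access
                    "roleArn": s3_role_arn,
                    # IAM role for Amazon Redshift Data API access
                    "dataApiRoleArn": data_api_role_arn,
                    "isRedshiftServerless": False,
                },
            },
            "connectorProfileCredentials": {
                "Redshift": {"username": db_user},
            },
        },
    }

request = build_redshift_profile_request(
    name="my-redshift-connection",
    cluster_id="my-cluster",
    database="dev",
    bucket="my-appflow-staging-bucket",
    s3_role_arn="arn:aws:iam::111122223333:role/redshift-s3-access-role",
    data_api_role_arn="arn:aws:iam::111122223333:role/appflow-redshift-access-role",
    db_user="awsuser",
)
# import boto3
# boto3.client("appflow").create_connector_profile(**request)
```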

## Granting access privileges to the database user (required for Amazon Redshift Serverless)
<a name="grant-access"></a>

After you connect Amazon AppFlow to Amazon Redshift Serverless, you must also grant access privileges to a database user account. Amazon AppFlow uses this account to access your database. Until you grant the access privileges, Amazon AppFlow can't access your database, and it can't run flows that transfer data to the database.

**Note**  
This action is necessary only if you created a connection to Amazon Redshift Serverless. It isn't necessary if you chose to connect to an Amazon Redshift cluster.

You grant the access privileges to a database user that Amazon Redshift creates for you when you create the connection in Amazon AppFlow. Amazon Redshift names this user `IAMR:data-api-access-role`. In that name, `data-api-access-role` is the name of the IAM role that authorizes access to your database through the Amazon Redshift Data API. If you already created the connection in the Amazon AppFlow console, you provided that role for the **IAM role for Amazon Redshift Data API access** field.

Amazon Redshift maps this role to the database user. After you grant the access privileges, Amazon Redshift allows the database user to access your data with the permissions that you assigned to the role.

**To grant the access privileges**
+ Use your SQL client to run the Amazon Redshift SQL command `GRANT`.

  For example, you can run this command to permit the user to access all of the tables in a specific schema:

  ```
  GRANT ALL ON ALL TABLES IN SCHEMA schema-name TO "IAMR:data-api-access-role"
  ```

  To apply the privileges more restrictively, you can run this command to permit the user to access a specific table in a specific schema:

  ```
  GRANT ALL ON TABLE schema-name.table-name TO "IAMR:data-api-access-role"
  ```

These examples grant `ALL` privileges because the user must be able to read the schema and write data to the cluster.

For more information about the `GRANT` SQL command, see [GRANT](https://docs.aws.amazon.com/redshift/latest/dg/r_GRANT.html) in the *Amazon Redshift Database Developer Guide*.
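If you prefer to script this step instead of using a SQL client, you can send the same statement through the Amazon Redshift Data API. In this sketch, the schema, role, workgroup, and database names are placeholders:

```python
# Sketch: run the GRANT statement through the Amazon Redshift Data API with
# Boto3 rather than a SQL client. The schema and role names are placeholders.

def build_grant_sql(schema_name, data_api_access_role):
    """Build the GRANT statement for the mapped IAMR: database user."""
    return (
        f'GRANT ALL ON ALL TABLES IN SCHEMA {schema_name} '
        f'TO "IAMR:{data_api_access_role}"'
    )

sql = build_grant_sql("public", "appflow-redshift-access-role")
# import boto3
# rsd = boto3.client("redshift-data")
# rsd.execute_statement(
#     WorkgroupName="my-workgroup",  # serverless; for a cluster, use
#     Database="dev",                # ClusterIdentifier and DbUser instead
#     Sql=sql,
# )
print(sql)
```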

## Guidance for connections that use JDBC URLs
<a name="jdbc-guidance"></a>

The following information applies only to Amazon Redshift connections that are configured with JDBC URLs. We don't recommend these types of connections because Amazon AppFlow will discontinue support for JDBC URLs in the near future. You can refer to this section to manage existing connections that use JDBC URLs. However, for any new Amazon Redshift connections that you create, you should configure them with the Data API instead.

### JDBC requirements
<a name="jdbc-requirements"></a>

You must provide Amazon AppFlow with the following:
+ The user name and password of your Amazon Redshift user account.
+ The JDBC URL of your Amazon Redshift cluster. For more information, see [Finding your cluster connection string](https://docs.aws.amazon.com/redshift/latest/mgmt/configuring-connections.html#connecting-connection-string) in the *Amazon Redshift Management Guide*. 

You must also do the following:
+ Ensure that you enter a correct JDBC URL and password when you configure your Amazon Redshift connections. An incorrect JDBC URL or password can return an `[Amazon](500310)` error.
+ Ensure that your cluster is publicly accessible. In the AWS Management Console, open the Amazon Redshift console and choose **Clusters**. Select the cluster that you want to modify, and then choose **Actions**, **Modify publicly accessible setting**, **Enable**. Save your changes.

  If you still can't connect to the cluster from the internet or a different network, go to the Amazon Redshift console and select the cluster that you want to modify. Under **Properties**, choose **Network and security settings**. Choose the link next to VPC security group to open the Amazon Elastic Compute Cloud (Amazon EC2) console. On the Inbound Rules tab, make sure that your IP address and the port of your Amazon Redshift cluster are allowed. The default port for Amazon Redshift is 5439, but your port might be different.
+ Ensure that your Amazon Redshift cluster is accessible from Amazon AppFlow IP address ranges in your Region.

### JDBC settings
<a name="jdbc-settings"></a>
+ **JDBC URL** — The JDBC URL of the Amazon Redshift cluster where you want to connect.
+ **Bucket details** — The Amazon S3 bucket where Amazon AppFlow writes your data as an intermediate destination. Amazon Redshift gets your data from this bucket.
+ **IAM role for Amazon S3 access** — The IAM role that authorizes Amazon Redshift to get and decrypt the data from the S3 bucket.
+ **Amazon Redshift database user name** — The user name that you use to authenticate with your Amazon Redshift database.
+ **Amazon Redshift database password** — The password you use to authenticate with your Amazon Redshift database.

### Notes
<a name="redshift-notes"></a>
+ The default port for Amazon Redshift is 5439, but your port might be different. To find the Amazon AppFlow IP CIDR block for your region, see [AWS IP address ranges](https://docs.aws.amazon.com/general/latest/gr/aws-ip-ranges.html) in the *Amazon Web Services General Reference*.
+ Amazon AppFlow currently supports the insert action when transferring data into Amazon Redshift, but not the update or upsert action.
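To find the Amazon AppFlow CIDR blocks to allow, you can filter the published `ip-ranges.json` document. This sketch assumes that Amazon AppFlow prefixes are published under the service name `AMAZON_APPFLOW`; verify that name against the current file:

```python
# Sketch: list the Amazon AppFlow IPv4 CIDR blocks for a Region by filtering
# the published ip-ranges.json document.

def appflow_cidrs(ip_ranges, region):
    """Return the IPv4 prefixes for Amazon AppFlow in the given Region."""
    return [
        p["ip_prefix"]
        for p in ip_ranges["prefixes"]
        if p["service"] == "AMAZON_APPFLOW" and p["region"] == region
    ]

# import json
# from urllib.request import urlopen
# with urlopen("https://ip-ranges.amazonaws.com/ip-ranges.json") as f:
#     data = json.load(f)
# print(appflow_cidrs(data, "us-east-1"))
```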

### Related resources
<a name="redshift-resources"></a>
+ [Finding your cluster connection string](https://docs.aws.amazon.com/redshift/latest/mgmt/configuring-connections.html#connecting-connection-string) in the *Amazon Redshift Management Guide*
+  [How to make a private Redshift cluster publicly accessible](https://aws.amazon.com/premiumsupport/knowledge-center/redshift-cluster-private-public/) in the AWS Knowledge Center
+ [Workaround to extract Salesforce data using Amazon AppFlow and upsert it to Amazon Redshift tables hosted on private subnet using data APIs](https://github.com/aws-samples/amazon-appflow/blob/master/sf-appflow-upsert-redshift-lambda/README.md) in the Amazon AppFlow GitHub Page

## Transferring data to Amazon Redshift with a flow
<a name="redshift-transfer-data"></a>

To transfer data to Amazon Redshift, create an Amazon AppFlow flow, and choose Amazon Redshift as the data destination. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

# Amazon S3
<a name="s3"></a>

The following are the requirements and connection instructions for using Amazon Simple Storage Service (Amazon S3) with Amazon AppFlow.

**Note**  
You can use Amazon S3 as a source or a destination.

**Topics**
+ [Requirements](#s3-requirements)
+ [Connection instructions](#s3-setup)
+ [Notes](#s3-notes)
+ [Supported destinations](#s3-destinations)
+ [Related resources](#s3-resources)

## Requirements
<a name="s3-requirements"></a>
+ Your S3 buckets must be in the same AWS Region as your console and flow.
+ If you use Amazon S3 as the data source, you must place your source files inside a folder in your S3 bucket.
+ If your source files are in CSV format, each file must have a header row. The header row is a series of field names separated by commas.
+  Each source file should not exceed 125 MB in size. However, you can upload multiple CSV/JSONL files in the source location, and Amazon AppFlow will read from all of them to transfer data over a single flow run. You can check for any applicable destination data transfer limits in [Quotas for Amazon AppFlow](service-quotas.md).
+ Amazon AppFlow does not support cross-account access to S3 buckets in order to prevent unauthorized access and potential security concerns.
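As a quick pre-upload check, the file requirements above can be verified locally. This is a hypothetical helper, not part of Amazon AppFlow, and the file path is a placeholder:

```python
# Hypothetical helper: check a local CSV source file against the requirements
# above before uploading it to the source folder in your S3 bucket.
import csv
import os

MAX_SOURCE_FILE_BYTES = 125 * 1024 * 1024  # each source file: at most 125 MB

def check_csv_source(path):
    """Raise ValueError if the file is too large or lacks a header row."""
    if os.path.getsize(path) > MAX_SOURCE_FILE_BYTES:
        raise ValueError(f"{path} exceeds 125 MB; split it into smaller files")
    with open(path, newline="") as f:
        header = next(csv.reader(f), None)
    if not header or any(not name.strip() for name in header):
        raise ValueError(f"{path} must start with a comma-separated header row")
    return header
```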

## Connection instructions
<a name="s3-setup"></a>

**To use Amazon S3 as a source or destination while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Amazon S3** from the **Source name** or **Destination name** dropdown list.

1. Under **Bucket details**, select the S3 bucket that you're retrieving from or adding to. You can specify a prefix, which is equivalent to specifying a folder within the S3 bucket where your source files are located or records are to be written to the destination.

![\[Bucket details form with fields for choosing S3 bucket and entering bucket prefix.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-s3-console.png)


Now that you are connected to your S3 bucket, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#s3-requirements) section above.

## Notes
<a name="s3-notes"></a>
+ When you use Amazon S3 as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.
+ When you use Amazon S3 as a destination, the following additional settings are available.


| Setting name | Description | 
| --- | --- | 
|  **AWS Glue Data Catalog settings**  |  Catalog the data that you transfer in the AWS Glue Data Catalog. When you catalog your data, you make it easier to discover and access with AWS analytics and machine learning services. For more information, see [Cataloging the data output from an Amazon AppFlow flow](flows-catalog.md).  | 
|  **Data format preference**  |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/s3.html)  If you choose Parquet as the format for your destination file in Amazon S3, the option to aggregate all records into one file per flow run isn't available. When you choose Parquet, Amazon AppFlow writes the output as strings rather than declaring the data types that the source defines.   | 
|  **Filename preference**  |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/s3.html)  | 
|  **Partition and aggregation settings**  |  Organize the data that you transfer into partitions and files of a specified size. These settings can help you optimize query performance for applications that access the data. For more information, see [Partitioning and aggregating data output from an Amazon AppFlow flow](flows-partition.md).  | 

## Supported destinations
<a name="s3-destinations"></a>

When you create a flow that uses Amazon S3 as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ SAP OData
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="s3-resources"></a>
+ [Amazon Simple Storage Service User Guide](https://docs.aws.amazon.com/AmazonS3/latest/user-guide/what-is-s3.html)
+ [Amazon AppFlow now supports new data formats for ingesting files into Amazon S3](https://aws.amazon.com/about-aws/whats-new/2020/09/amazon-appflow-supports-new-data-formats-ingesting-flies-amazon-s3/) in the *AWS What's new* blog
+ How to insert new Salesforce records with data in Amazon S3 using Amazon AppFlow
+ How to transfer data from Slack to Amazon S3 using Amazon AppFlow
+ How to transfer data from Google Analytics to Amazon S3 using Amazon AppFlow
+ How to transfer data from Zendesk Support to Amazon S3 using Amazon AppFlow

# Amplitude
<a name="amplitude"></a>

The following are the requirements and connection instructions for using Amplitude with Amazon AppFlow.

**Note**  
You can use Amplitude as a source only.

**Topics**
+ [Requirements](#amplitude-requirements)
+ [Connection instructions](#amplitude-setup)
+ [Notes](#amplitude-notes)
+ [Supported destinations](#amplitude-destinations)
+ [Related resources](#amplitude-resources)

## Requirements
<a name="amplitude-requirements"></a>

You must provide Amazon AppFlow with the API key and secret key for the project with the data that you want to transfer. Your API key can be found on the Settings page of the Amplitude dashboard. For more information about how to retrieve this information from Amplitude, see [Settings](https://help.amplitude.com/hc/en-us/articles/235649848#project-general-settings) in the Amplitude documentation.
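Before you enter the keys in the console, you can confirm that the pair is valid. This sketch builds the HTTP basic authentication header that Amplitude's Export API accepts; the endpoint and time-range parameters shown in the comments are assumptions to verify against the Amplitude documentation:

```python
# Sketch: build the HTTP basic auth header for an Amplitude API key and
# secret key pair so that you can test the pair against Amplitude's API.
import base64

def basic_auth_header(api_key, secret_key):
    """Return the Authorization header value for Amplitude basic auth."""
    token = base64.b64encode(f"{api_key}:{secret_key}".encode()).decode()
    return f"Basic {token}"

# from urllib.request import Request, urlopen
# req = Request(
#     "https://amplitude.com/api/2/export?start=20240101T00&end=20240101T01",
#     headers={"Authorization": basic_auth_header("my-api-key", "my-secret-key")},
# )
# urlopen(req)  # an HTTP 200 response indicates the key pair is valid
print(basic_auth_header("my-api-key", "my-secret-key"))
```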

## Connection instructions
<a name="amplitude-setup"></a>

**To connect to Amplitude while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Amplitude** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Amplitude** dialog box.

   1. Under **API key**, enter your API key.

   1. Under **Secret key**, enter your secret key.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Connection form for Amplitude with fields for API key, secret key, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-amplitude-console.png)

1. You will be redirected to the Amplitude login page. When prompted, grant Amazon AppFlow permissions to access your Amplitude account.

Now that you are connected to your Amplitude account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#amplitude-requirements) section above.

## Notes
<a name="amplitude-notes"></a>
+ When you use Amplitude as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per day.
+ Amplitude can process 25 MB of data as part of a single flow run.

## Supported destinations
<a name="amplitude-destinations"></a>

When you create a flow that uses Amplitude as the data source, you can set the destination to any of the following connectors: 
+ Lookout for Metrics
+ Amazon S3

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="amplitude-resources"></a>
+  [Settings](https://help.amplitude.com/hc/en-us/articles/235649848#project-general-settings) in the Amplitude documentation
+  [Breaking Data Silos with Amazon AppFlow and Amplitude](https://amplitude.com/blog/aws-appflow-amplitude-announcement) from *Inside Amplitude* 

# Asana connector for Amazon AppFlow
<a name="connectors-asana"></a>

Asana is a cloud-based team collaboration solution that helps teams organize, plan, and complete tasks and projects. If you're an Asana user, your account contains data about your workspaces, projects, tasks, teams, and more. You can use Amazon AppFlow to transfer data from Asana to certain AWS services or other supported applications.

## Amazon AppFlow support for Asana
<a name="asana-support"></a>

Amazon AppFlow supports Asana as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Asana.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Asana.

## Before you begin
<a name="asana-prereqs"></a>

To use Amazon AppFlow to transfer data from Asana to supported destinations, you must meet these requirements:
+ You have an account with Asana that contains the data that you want to transfer. For more information about the Asana data objects that Amazon AppFlow supports, see [Supported objects](#asana-objects).
+ In your Asana account settings, you've created either of the following resources for Amazon AppFlow. These resources provide credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account.
  + A Developer App, which supports OAuth 2.0 authentication. For information about how to create a Developer App, see [OAuth](https://developers.asana.com/docs/oauth) in the Asana Developers documentation.
  + A personal access token. For more information, see [Personal access token](https://developers.asana.com/docs/personal-access-token) in the Asana Developers documentation.
+ If you created an OAuth app, you've configured it with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Asana. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

If you created a Developer App, note the client ID and client secret. If you created a personal access token, note the token value. You provide these values to Amazon AppFlow when you connect to your Asana account.
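If you use Amazon AppFlow in several Regions, you need one redirect URL per Region. A small helper can generate them from the URL format described above (hypothetical, for illustration):

```python
# Hypothetical helper: substitute an AWS Region code into the Amazon AppFlow
# redirect URL format for registration in an Asana Developer App.

def appflow_redirect_url(region):
    """Return the Amazon AppFlow redirect URL for the given Region code."""
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

for region in ("us-east-1", "eu-west-1"):
    print(appflow_redirect_url(region))
```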

## Connecting Amazon AppFlow to your Asana account
<a name="asana-connecting"></a>

To connect Amazon AppFlow to your Asana account, provide the client credentials from your Developer App, or provide a personal access token. If you haven't yet configured your Asana account for Amazon AppFlow integration, see [Before you begin](#asana-prereqs).

**To connect to Asana**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Asana**.

1. Choose **Create connection**.

1. In the **Connect to Asana** window, for **Select authentication type**, choose how to authenticate Amazon AppFlow with your Asana account when it requests to access your data:
   + Choose **OAuth2** to authenticate Amazon AppFlow with the client ID and client secret from an Asana Developer App. Then enter values for **Client ID** and **Client secret**.
   + Choose **PAT** to authenticate Amazon AppFlow with a personal access token. Then enter the token value for **Personal access token**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Asana account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Asana as the data source, you can select this connection.

## Transferring data from Asana with a flow
<a name="asana-transfer-data"></a>

To transfer data from Asana, create an Amazon AppFlow flow, and choose Asana as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Asana, see [Supported objects](#asana-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#asana-destinations).

## Supported destinations
<a name="asana-destinations"></a>

When you create a flow that uses Asana as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="asana-objects"></a>

When you create a flow that uses Asana as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-asana.html)

# BambooHR connector for Amazon AppFlow
<a name="connectors-bamboohr"></a>

BambooHR is a human resources software as a service (SaaS) solution. If you’re a BambooHR user, your account contains data on employees and applicants, such as employee information, benefits, vacation time, openings, reports, files, and more. You can use Amazon AppFlow to transfer data from BambooHR to certain AWS services or other supported applications. 



## Amazon AppFlow support for BambooHR
<a name="bamboohr-support"></a>

Amazon AppFlow supports BambooHR as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from BambooHR.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to BambooHR.

## Before you begin
<a name="bamboohr-prereqs"></a>

To use Amazon AppFlow to transfer data from BambooHR to supported destinations, you must meet these requirements:
+ You have an account with BambooHR that contains the data that you want to transfer. For more information about the BambooHR data objects that Amazon AppFlow supports, see [Supported objects](#bamboohr-objects).
+ In the API keys settings for your account, you've created an API key for Amazon AppFlow. Amazon AppFlow uses the API key to make authenticated calls to your account and securely access your data. For more information, see [Authentication](https://documentation.bamboohr.com/docs#authentication) in the BambooHR documentation.

Note the value of your API key. When you connect to your BambooHR account, you provide this value to Amazon AppFlow.
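
As the linked authentication topic describes, BambooHR accepts the API key as the username in HTTP Basic authentication, with any placeholder string as the password. The following is a minimal sketch of how a client such as Amazon AppFlow builds that header; the key value is hypothetical:

```python
import base64

def bamboohr_auth_header(api_key: str) -> dict:
    """Build the HTTP Basic authentication header that the BambooHR API
    expects: the API key is the username, and the password is ignored
    (any placeholder string such as 'x' works)."""
    token = base64.b64encode(f"{api_key}:x".encode()).decode()
    return {"Authorization": f"Basic {token}", "Accept": "application/json"}

# Hypothetical key, for illustration only:
headers = bamboohr_auth_header("abc123")
```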

## Connecting Amazon AppFlow to your BambooHR account
<a name="bamboohr-connecting"></a>

To connect Amazon AppFlow to your BambooHR account, provide details from your BambooHR project so that Amazon AppFlow can access your data. If you haven't yet configured your BambooHR project for Amazon AppFlow integration, see [Before you begin](#bamboohr-prereqs).

**To connect to BambooHR**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **BambooHR**.

1. Choose **Create connection**.

1. In the **Connect to BambooHR** window, enter the following information:
   + **API key** – Enter your API key.
   + **Instance URL** – The URL of the instance where you want to run the operation, for example, https://api.bamboohr.com/api/gateway.php/amazonawstest.
   + **Zone (Optional)** – The time zone that you access Amazon AppFlow from.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your BambooHR account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses BambooHR as the data source, you can select this connection.

## Transferring data from BambooHR with a flow
<a name="bamboohr-transfer-data"></a>

To transfer data from BambooHR, create an Amazon AppFlow flow, and choose BambooHR as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for BambooHR, see [Supported objects](#bamboohr-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#bamboohr-destinations).
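
If you prefer to create the flow programmatically rather than in the console, the AWS SDKs expose the same `CreateFlow` operation. The following sketch builds the request for a hypothetical on-demand flow from a BambooHR connection to an Amazon S3 bucket. The connection name, object name, and bucket are placeholders, and the `CustomConnector` connector type and task shape are assumptions; confirm them against the Amazon AppFlow API reference before using this:

```python
# A sketch of a CreateFlow request for the AWS SDK for Python (Boto3).
# All names below are placeholders; "CustomConnector" as the connector
# type and the task shape are assumptions.

flow_request = {
    "flowName": "bamboohr-to-s3",
    "triggerConfig": {"triggerType": "OnDemand"},
    "sourceFlowConfig": {
        "connectorType": "CustomConnector",
        "connectorProfileName": "my-bamboohr-connection",  # your connection name
        "sourceConnectorProperties": {
            "CustomConnector": {"entityName": "employees"}  # hypothetical object
        },
    },
    "destinationFlowConfigList": [
        {
            "connectorType": "S3",
            "destinationConnectorProperties": {
                "S3": {"bucketName": "my-appflow-bucket"}  # your bucket
            },
        }
    ],
    "tasks": [
        # Pass every source field through unchanged.
        {"sourceFields": [], "taskType": "Map_all", "taskProperties": {}}
    ],
}

# import boto3
# boto3.client("appflow").create_flow(**flow_request)
```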

## Supported destinations
<a name="bamboohr-destinations"></a>

When you create a flow that uses BambooHR as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="bamboohr-objects"></a>

When you create a flow that uses BambooHR as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-bamboohr.html)

# Blackbaud Raiser's Edge NXT connector for Amazon AppFlow
<a name="connectors-blackbaudraisersedgenxt"></a>

Blackbaud Raiser's Edge NXT is a customer relationship management (CRM) software as a service (SaaS) solution for nonprofit organizations. If you’re a Blackbaud Raiser's Edge NXT user, your account contains data on prospects, analytics, gift management, and more. You can use Amazon AppFlow to transfer data from Blackbaud Raiser's Edge NXT to certain AWS services or other supported applications.

## Amazon AppFlow support for Blackbaud Raiser's Edge NXT
<a name="blackbaudraisersedgenxt-support"></a>

Amazon AppFlow supports Blackbaud Raiser's Edge NXT as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Blackbaud Raiser's Edge NXT.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Blackbaud Raiser's Edge NXT.

## Before you begin
<a name="blackbaudraisersedgenxt-prereqs"></a>

To use Amazon AppFlow to transfer data from Blackbaud Raiser's Edge NXT to supported destinations, you must meet these requirements:
+ You have an account with Blackbaud Raiser's Edge NXT that contains the data that you want to transfer. For more information about the Blackbaud Raiser's Edge NXT data objects that Amazon AppFlow supports, see [Supported objects](#blackbaudraisersedgenxt-objects).
+ In your Blackbaud SKY Developer account, you've created a SKY developer app for Amazon AppFlow. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. You can use default settings for the Grant type, the authorization tokens URL, and the authorization code URL, or use your own. For information about how to create a developer app, see [Applications](https://developer.blackbaud.com/skyapi/docs/applications) in the SKY API documentation. 
+ In the setting for Scopes, you've defined access to Blackbaud data with the option **Full data access**.
+ You've configured the app with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Blackbaud Raiser's Edge NXT. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
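
The rule above can be sketched as a one-line helper that assembles the redirect URL from a Region code:

```python
def appflow_redirect_url(region: str) -> str:
    """Return the Amazon AppFlow OAuth redirect URL for an AWS Region code."""
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

# For US East (N. Virginia):
url = appflow_redirect_url("us-east-1")
```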

Note the client ID, client secret, and subscription key from the settings for your app. You provide these values to Amazon AppFlow when you create your connection.

## Connecting Amazon AppFlow to your Blackbaud Raiser's Edge NXT account
<a name="blackbaudraisersedgenxt-connecting"></a>

To connect Amazon AppFlow to your Blackbaud Raiser's Edge NXT account, provide details from your SKY developer app so that Amazon AppFlow can access your data. If you haven't yet configured your Blackbaud Raiser's Edge NXT account for Amazon AppFlow integration, see [Before you begin](#blackbaudraisersedgenxt-prereqs).

**To connect to Blackbaud Raiser's Edge NXT**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Blackbaud Raiser's Edge NXT**.

1. Choose **Create connection**.

1. In the **Connect to Blackbaud Raiser's Edge NXT** window, enter the following information:
   + **Connection name** — Enter a name for your connection.
   + **Client ID** — The client ID in your Blackbaud Raiser's Edge NXT project.
   + **Client secret** — The client secret in your Blackbaud Raiser's Edge NXT project.
   + **Subscription key** — The subscription key in your Blackbaud Raiser's Edge NXT project.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. Choose **Connect**.

1. In the window that appears, sign in to your Blackbaud Raiser's Edge NXT account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Blackbaud Raiser's Edge NXT as the data source, you can select this connection.

## Transferring data from Blackbaud Raiser's Edge NXT with a flow
<a name="blackbaudraisersedgenxt-transfer-data"></a>

To transfer data from Blackbaud Raiser's Edge NXT, create an Amazon AppFlow flow, and choose Blackbaud Raiser's Edge NXT as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Blackbaud Raiser's Edge NXT, see [Supported objects](#blackbaudraisersedgenxt-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#blackbaudraisersedgenxt-destinations).

## Supported destinations
<a name="blackbaudraisersedgenxt-destinations"></a>

When you create a flow that uses Blackbaud Raiser's Edge NXT as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="blackbaudraisersedgenxt-objects"></a>

When you create a flow that uses Blackbaud Raiser's Edge NXT as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-blackbaudraisersedgenxt.html)

# Braintree connector for Amazon AppFlow
<a name="connectors-braintree"></a>

Braintree is an online payment processing solution. If you're a Braintree user, your account contains data about your transactions. You can use Amazon AppFlow to transfer data from Braintree to certain AWS services or other supported applications.

## Amazon AppFlow support for Braintree
<a name="braintree-support"></a>

Amazon AppFlow supports Braintree as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Braintree.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Braintree.

## Before you begin
<a name="braintree-prereqs"></a>

To use Amazon AppFlow to transfer data from Braintree to supported destinations, you must meet these requirements:
+ You have an account with Braintree that contains the data that you want to transfer.
+ In the API settings for your account, you've created an API key for Amazon AppFlow. The API key provides the credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For more information, see [Important Gateway Credentials](https://developer.paypal.com/braintree/articles/control-panel/important-gateway-credentials) in the Braintree documentation.

From your API key settings, note the values for public key and private key. You provide these values to Amazon AppFlow when you connect to your Braintree account.
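
Braintree clients authenticate GraphQL requests with HTTP Basic credentials built from the public and private key. The following is a minimal sketch; the `Braintree-Version` header date is an assumption, so substitute a current value from the Braintree docs:

```python
import base64

def braintree_headers(public_key: str, private_key: str) -> dict:
    """Build request headers for the Braintree GraphQL API: Basic auth
    from public_key:private_key, plus the API version header."""
    token = base64.b64encode(f"{public_key}:{private_key}".encode()).decode()
    return {
        "Authorization": f"Basic {token}",
        "Braintree-Version": "2019-01-01",  # assumption; use a current date
        "Content-Type": "application/json",
    }

# POST a GraphQL body with these headers to the Braintree production or
# sandbox instance URL.
```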

## Connecting Amazon AppFlow to your Braintree account
<a name="braintree-connecting"></a>

To connect Amazon AppFlow to your Braintree account, provide the credentials from your Braintree API key so that Amazon AppFlow can access your data. If you haven't yet configured your Braintree account for Amazon AppFlow integration, see [Before you begin](#braintree-prereqs).

**To connect to Braintree**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Braintree**.

1. Choose **Create connection**.

1. In the **Connect to Braintree** window, enter the following information:
   + **Public Key** – The public key value from the API key in your Braintree account.
   + **Private Key** – The private key value from the API key in your Braintree account.
   + **Braintree Instance Url** – Choose one of the following:
     + **https://payments.braintree-api.com/graphql** – Connects to the Braintree production environment.
     + **https://payments.sandbox.braintree-api.com/graphql** – Connects to the Braintree sandbox environment.

     For more information about these environments, see [Try It Out](https://developer.paypal.com/braintree/articles/get-started/try-it-out) in the Braintree documentation.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Braintree as the data source, you can select this connection.

## Transferring data from Braintree with a flow
<a name="braintree-transfer-data"></a>

To transfer data from Braintree, create an Amazon AppFlow flow, and choose Braintree as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

## Supported destinations
<a name="braintree-destinations"></a>

When you create a flow that uses Braintree as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# CircleCI connector for Amazon AppFlow
<a name="connectors-circleci"></a>

CircleCI is a continuous integration and continuous delivery platform. If you're a CircleCI user, your account contains data about your projects, pipelines, workflows, and more. You can use Amazon AppFlow to transfer data from CircleCI to certain AWS services or other supported applications.

## Amazon AppFlow support for CircleCI
<a name="circleci-support"></a>

Amazon AppFlow supports CircleCI as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from CircleCI.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to CircleCI.

## Before you begin
<a name="circleci-prereqs"></a>

To use Amazon AppFlow to transfer data from CircleCI to supported destinations, you must meet these requirements:
+ You have an account with CircleCI that contains the data that you want to transfer. For more information about the CircleCI data objects that Amazon AppFlow supports, see [Supported objects](#circleci-objects).
+ In the user settings for your account, you've created a personal API token. For the steps to do this, see [Creating a personal API token](https://circleci.com/docs/managing-api-tokens/#creating-a-personal-api-token) on the CircleCI Docs site.

You provide the personal API token to Amazon AppFlow in the settings for your CircleCI connection.
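
CircleCI's v2 API accepts the personal token in a `Circle-Token` request header, which is what a client such as Amazon AppFlow sends on your behalf. A minimal sketch, with a hypothetical token value:

```python
def circleci_request(token: str, path: str) -> dict:
    """Describe a CircleCI v2 API request: the personal API token
    travels in the Circle-Token header."""
    return {
        "url": f"https://circleci.com/api/v2/{path}",
        "headers": {"Circle-Token": token, "Accept": "application/json"},
    }

# Hypothetical token, calling the identity endpoint:
req = circleci_request("my-token", "me")
```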

## Connecting Amazon AppFlow to your CircleCI account
<a name="circleci-connecting"></a>

To connect Amazon AppFlow to your CircleCI account, provide your personal API token so that Amazon AppFlow can access your data. If you haven't yet configured your CircleCI account for Amazon AppFlow integration, see [Before you begin](#circleci-prereqs).

**To connect to CircleCI**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **CircleCI**.

1. Choose **Create connection**.

1. In the **Connect to CircleCI** window, for **CircleCI Token**, enter the personal API token from the user settings of your CircleCI account.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses CircleCI as the data source, you can select this connection.

## Transferring data from CircleCI with a flow
<a name="circleci-transfer-data"></a>



To transfer data from CircleCI, create an Amazon AppFlow flow, and choose CircleCI as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for CircleCI, see [Supported objects](#circleci-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#circleci-destinations).

## Supported destinations
<a name="circleci-destinations"></a>

When you create a flow that uses CircleCI as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="circleci-objects"></a>

When you create a flow that uses CircleCI as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-circleci.html)

# Coupa connector for Amazon AppFlow
<a name="connectors-coupa"></a>

Coupa is a business spend software as a service (SaaS) solution. If you’re a Coupa user, your account contains data on procurements, invoicing, expenses, payments, and more. You can use Amazon AppFlow to transfer data between Coupa and certain AWS services or other supported applications.

## Amazon AppFlow support for Coupa
<a name="coupa-support"></a>

Amazon AppFlow supports Coupa as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Coupa.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Coupa.

## Before you begin
<a name="coupa-prereqs"></a>

To use Amazon AppFlow to transfer data from Coupa to supported destinations, you must meet these requirements:
+ You have an account with Coupa that contains the data that you want to transfer. For more information about the Coupa data objects that Amazon AppFlow supports, see [Supported objects](#coupa-objects).
+ In your Coupa account, you've created an OAuth2/OIDC client app for Amazon AppFlow. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account.

  For information about how to create an OAuth2 client app, see [OAuth 2.0 Getting Started with Coupa API](https://compass.coupa.com/en-us/products/core-platform/integration-playbooks-and-resources/integration-knowledge-articles/oauth-2.0-getting-started-with-coupa-api) on the Coupa Compass site.
+ You've given your app a Grant type of Client Credentials. 
+ You've chosen the following scopes to be included in the API: 
  + `core.accounting.read`
  + `core.approval.read`
  + `core.common.read`
  + `core.easyform_response.read`
  + `core.expense.read`
  + `core.integration.read`
  + `core.inventory.adjustment.read`
  + `core.inventory.asn.read`
  + `core.inventory.balance.read`
  + `core.inventory.consumption.read`
  + `core.inventory.cycle_counts.read`
  + `core.inventory.receiving.read`
  + `core.inventory.return_to_supplier.read`
  + `core.inventory.transfer.read`
  + `core.invoice.read`
  + `core.item.read`
  + `core.legal_entity.read`
  + `core.pay.charges.read`
  + `core.pay.payment_accounts.read`
  + `core.pay.payments.read`
  + `core.pay.virtual_cards.read`
  + `core.payables.allocations.read`
  + `core.payables.expense.read`
  + `core.payables.external.read`
  + `core.payables.invoice.read`
  + `core.payables.order.read`
  + `core.project.read`
  + `core.purchase_order.read`
  + `core.requisition.read`
  + `core.sourcing.pending_supplier.read`
  + `core.sourcing.read`
  + `core.sourcing.response.read`
  + `core.supplier.read`
  + `core.supplier_information_sites.read`
  + `core.supplier_information_tax_registrations.read`
  + `core.supplier_sharing_settings.read`
  + `core.supplier_sites.read`
  + `core.uom.read`
  + `core.user.read`
  + `core.user_group.read`
  + `email login offline_access openid profile`
  + `travel_booking.common.read`
  + `travel_booking.team.read`
  + `travel_booking.trip.read`
  + `travel_booking.user.read`
  + `treasury.cash_management.read`

Note the client ID, client secret, and instance URL for your Coupa project.
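
With a grant type of Client Credentials, a client obtains access tokens by POSTing the client ID, client secret, and scopes to the authorization tokens URL as a standard OAuth 2.0 form body. The following sketch builds such a request; the company name and credential values are placeholders:

```python
from urllib.parse import urlencode

def coupa_token_request(company: str, client_id: str, client_secret: str,
                        scopes: list) -> dict:
    """Build a client-credentials token request for a Coupa customer
    instance (partner and demo instances use .coupacloud.com instead)."""
    return {
        "url": f"https://{company}.coupahost.com/oauth2/token",
        "body": urlencode({
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": " ".join(scopes),  # scopes are space-delimited
        }),
    }

# Placeholder values, for illustration only:
req = coupa_token_request("example", "my-id", "my-secret",
                          ["core.invoice.read", "core.supplier.read"])
```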

## Connecting Amazon AppFlow to your Coupa account
<a name="coupa-connecting"></a>

To connect Amazon AppFlow to your Coupa account, provide details from your Coupa project so that Amazon AppFlow can access your data. If you haven't yet configured your Coupa project for Amazon AppFlow integration, see [Before you begin](#coupa-prereqs).

**To connect to Coupa**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Coupa**.

1. Choose **Create connection**.

1. In the **Connect to Coupa** window, enter the following information:
   + **Authorization tokens URL** — From the dropdown menu, choose one of the following options. For partner and demo instances, choose https://*company\_name*.coupacloud.com/oauth2/token. For customer instances, choose https://*company\_name*.coupahost.com/oauth2/token.
   + **Custom authorization tokens URL** — The company name that appears in the authorization tokens URL.
   + **Client ID** — The client ID in your Coupa project.
   + **Client secret** — The client secret in your Coupa project.
   + **Instance URL** — The instance URL of your Coupa project, for example, https://*company\_name*.coupacloud.com (for partner and demo instances) or https://*company\_name*.coupahost.com (for customer instances).

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Coupa account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Coupa as the data source, you can select this connection.

## Transferring data from Coupa with a flow
<a name="coupa-transfer-data"></a>

To transfer data from Coupa, create an Amazon AppFlow flow, and choose Coupa as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Coupa, see [Supported objects](#coupa-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#coupa-destinations).

## Supported destinations
<a name="coupa-destinations"></a>

When you create a flow that uses Coupa as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="coupa-objects"></a>

When you create a flow that uses Coupa as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-coupa.html)

# Datadog
<a name="datadog"></a>

The following are the requirements and connection instructions for using Datadog with Amazon AppFlow.

**Note**  
You can use Datadog as a source only.

**Topics**
+ [Requirements](#datadog-requirements)
+ [Connection instructions](#datadog-setup)
+ [Notes](#datadog-notes)
+ [Supported destinations](#datadog-destinations)
+ [Related resources](#datadog-resources)

## Requirements
<a name="datadog-requirements"></a>
+ You must provide Amazon AppFlow with an API key and an application key. For more information about how to retrieve your API key and application key, see the [API and Application Keys](https://docs.datadoghq.com/account_management/api-app-keys/) information in the Datadog documentation.
+ You must configure your flow with a date range and query filter.
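
Per the Datadog documentation linked above, the two keys travel in separate request headers (`DD-API-KEY` for the API key and `DD-APPLICATION-KEY` for the application key), which is what Amazon AppFlow supplies on your behalf once you save the connection. A minimal sketch:

```python
def datadog_headers(api_key: str, app_key: str) -> dict:
    """Build the authentication headers that the Datadog API expects:
    the API key and the application key go in separate headers."""
    return {
        "DD-API-KEY": api_key,
        "DD-APPLICATION-KEY": app_key,
        "Content-Type": "application/json",
    }

# Hypothetical key values, for illustration only:
headers = datadog_headers("my-api-key", "my-app-key")
```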

## Connection instructions
<a name="datadog-setup"></a>

**To connect to Datadog while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed key instead of the default AWS managed key, choose **Data encryption**, **Customize encryption settings**, and then choose an existing KMS key or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Datadog** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Datadog** dialog box.

   1. Under **API key**, enter your API key.

   1. Under **Application key**, enter your application key.

   1. Under **Select region**, select the region for your instance of Datadog.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![Datadog connection form with API key, Application key, region selection, and encryption options.](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-datadog-console.png)

1. You will be redirected to the Datadog login page. When prompted, grant Amazon AppFlow permissions to access your Datadog account.

Now that you are connected to your Datadog account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#datadog-requirements) section.

## Notes
<a name="datadog-notes"></a>
+ When you use Datadog as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.

## Supported destinations
<a name="datadog-destinations"></a>

When you create a flow that uses Datadog as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="datadog-resources"></a>
+ [API and Application Keys](https://docs.datadoghq.com/account_management/api-app-keys/) information in the *Datadog* documentation

# Delighted connector for Amazon AppFlow
<a name="connectors-delighted"></a>

Delighted is a cloud-based survey tool that helps its users distribute surveys and then collect and analyze the feedback. If you're a Delighted user, then your account contains data about your survey responses. You can use Amazon AppFlow to transfer data from Delighted to certain AWS services or other supported applications.

## Amazon AppFlow support for Delighted
<a name="delighted-support"></a>

Amazon AppFlow supports Delighted as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Delighted.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Delighted.

## Before you begin
<a name="delighted-prereqs"></a>

To use Amazon AppFlow to transfer data from Delighted to supported destinations, you must have an account with Delighted that contains the data that you want to transfer. For more information about the Delighted data objects that Amazon AppFlow supports, see [Supported objects](#delighted-objects).

From your account settings, note the API key. You provide this value to Amazon AppFlow when you create a connection to your Delighted account. For more information about Delighted API keys, see [Authentication](https://app.delighted.com/docs/api#authentication) in the Delighted API documentation.
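
Delighted authenticates API requests with HTTP Basic authentication, using the API key as the username and an empty password. A minimal sketch of building the resulting `Authorization` header (the key value is a placeholder):

```python
import base64

def delighted_auth_header(api_key: str) -> str:
    """HTTP Basic auth: the API key is the username, the password is empty."""
    token = base64.b64encode(f"{api_key}:".encode("ascii")).decode("ascii")
    return f"Basic {token}"

# Placeholder key; use a test or live key from your Delighted account settings.
header = delighted_auth_header("YOUR_API_KEY")
```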

## Connecting Amazon AppFlow to your Delighted account
<a name="delighted-connecting"></a>

To connect Amazon AppFlow to your Delighted account, provide the API key from your Delighted account settings so that Amazon AppFlow can access your data.

**To connect to Delighted**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Delighted**.

1. Choose **Create connection**.

1. In the **Connect to Delighted** window, for **API Key**, enter a test or live API key from your Delighted account.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Delighted as the data source, you can select this connection.
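
When you reference a key from a different AWS account during connection setup, you supply a KMS key ARN. KMS key ARNs follow the standard `arn:partition:kms:region:account-id:key/key-id` shape; the following is a hedged sanity-check sketch (the example ARN uses documentation placeholder IDs):

```python
import re

# Standard KMS key ARN shape: arn:partition:kms:region:account-id:key/key-id
KMS_KEY_ARN = re.compile(r"^arn:aws[\w-]*:kms:[a-z0-9-]+:\d{12}:key/[0-9a-f-]+$")

def looks_like_kms_key_arn(arn: str) -> bool:
    return bool(KMS_KEY_ARN.match(arn))

# Placeholder account ID and key ID for illustration only:
example = "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"
```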

## Transferring data from Delighted with a flow
<a name="delighted-transfer-data"></a>

To transfer data from Delighted, create an Amazon AppFlow flow, and choose Delighted as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Delighted, see [Supported objects](#delighted-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#delighted-destinations).

## Supported destinations
<a name="delighted-destinations"></a>

When you create a flow that uses Delighted as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="delighted-objects"></a>

When you create a flow that uses Delighted as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-delighted.html)

# DocuSign Monitor connector for Amazon AppFlow
<a name="connectors-docusign-monitor"></a>

DocuSign Monitor provides data about digital agreements that are processed through DocuSign. If you're a DocuSign user, your account contains monitoring data about your DocuSign activity. You can use Amazon AppFlow to transfer your monitoring data to certain AWS services or other supported applications.

## Amazon AppFlow support for DocuSign Monitor
<a name="docusign-monitor-support"></a>

Amazon AppFlow supports DocuSign Monitor as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from DocuSign Monitor.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to DocuSign Monitor.

## Before you begin
<a name="docusign-monitor-prereqs"></a>

To use Amazon AppFlow to transfer data from DocuSign Monitor to supported destinations, you must meet these requirements:
+ You have an account with DocuSign that contains the data that you want to transfer. For more information about the DocuSign Monitor data objects that Amazon AppFlow supports, see [Supported objects](#docusign-monitor-objects).
+ In the settings for your Docusign account, you've created an app and integration key for Amazon AppFlow. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For more information, see [Configure your app](https://developers.docusign.com/platform/configure-app/) in the DocuSign Developer documentation.
+ In the settings for your app, you've done the following:
  + Created a secret key.
  + Added a redirect URL for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from DocuSign Monitor. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

From your app settings, note your integration key and secret key because you specify these values in the connection settings in Amazon AppFlow.

## Connecting Amazon AppFlow to your DocuSign account
<a name="docusign-monitor-connecting"></a>

To connect Amazon AppFlow to your DocuSign account, provide the integration key and secret key from your app so that Amazon AppFlow can access your data. If you haven't yet configured your DocuSign account for Amazon AppFlow integration, see [Before you begin](#docusign-monitor-prereqs).

**To connect to DocuSign Monitor**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **DocuSign Monitor**.

1. Choose **Create connection**.

1. In the **Connect to DocuSign Monitor** window, enter the following information:
   + **Client ID** – The integration key from your app settings.
   + **Client secret** – The secret key from your app settings.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your DocuSign account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses DocuSign Monitor as the data source, you can select this connection.

## Transferring data from DocuSign Monitor with a flow
<a name="docusign-monitor-transfer-data"></a>

To transfer data from DocuSign Monitor, create an Amazon AppFlow flow, and choose DocuSign Monitor as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for DocuSign Monitor, see [Supported objects](#docusign-monitor-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#docusign-monitor-destinations).

## Supported destinations
<a name="docusign-monitor-destinations"></a>

When you create a flow that uses DocuSign Monitor as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="docusign-monitor-objects"></a>

When you create a flow that uses Docusign Monitor as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-docusign-monitor.html)

# Domo connector for Amazon AppFlow
<a name="connectors-domo"></a>

Domo is a business intelligence solution. If you're a Domo user, your account contains data about your Domo resources, such as your datasets and data permissions policies. You can use Amazon AppFlow to transfer data from Domo to certain AWS services or other supported applications.

## Amazon AppFlow support for Domo
<a name="domo-support"></a>

Amazon AppFlow supports Domo as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Domo.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Domo.

## Before you begin
<a name="domo-prereqs"></a>

To use Amazon AppFlow to transfer data from Domo to supported destinations, you must meet these requirements:
+ You have an account with Domo that contains the data that you want to transfer. For more information about the Domo data objects that Amazon AppFlow supports, see [Supported objects](#domo-objects).
+ On the Domo for Developers site, you've created a client for Amazon AppFlow. The client provides the credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create a client, see [API Authentication Quickstart](https://developer.domo.com/docs/authentication/quickstart-5) in the Domo for Developers documentation.

From the client settings, note the client ID and secret because you provide these values in the connection settings in Amazon AppFlow.

## Connecting Amazon AppFlow to your Domo account
<a name="domo-connecting"></a>

To connect Amazon AppFlow to your Domo account, provide the client ID and secret from your client so that Amazon AppFlow can access your data. If you haven't yet configured your Domo account for Amazon AppFlow integration, see [Before you begin](#domo-prereqs).

**To connect to Domo**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Domo**.

1. Choose **Create connection**.

1. In the **Connect to Domo** window, enter the following information:
   + **Client ID** – The client ID from your Domo client.
   + **Client secret** – The secret from your Domo client.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Domo as the data source, you can select this connection.

## Transferring data from Domo with a flow
<a name="domo-transfer-data"></a>

To transfer data from Domo, create an Amazon AppFlow flow, and choose Domo as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Domo, see [Supported objects](#domo-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#domo-destinations).

When transferring dataset data objects, Amazon AppFlow retrieves a maximum of 1,000 records per page. Pagination is not supported for data permission policy data objects; for those objects, the underlying AWS Lambda response limit of 5.5 MB applies.

## Supported destinations
<a name="domo-destinations"></a>

When you create a flow that uses Domo as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="domo-objects"></a>

When you create a flow that uses Domo as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-domo.html)

# Dynatrace
<a name="dynatrace"></a>

The following are the requirements and connection instructions for using Dynatrace with Amazon AppFlow.

**Note**  
You can use Dynatrace as a source only.

**Topics**
+ [Requirements](#dynatrace-requirements)
+ [Connection instructions](#dynatrace-setup)
+ [Notes](#dynatrace-notes)
+ [Supported destinations](#dynatrace-destinations)
+ [Related resources](#dynatrace-resources)

## Requirements
<a name="dynatrace-requirements"></a>
+ You must provide Amazon AppFlow with an API token. For more information about how to retrieve or generate an API token to use with Amazon AppFlow, see the [Access tokens](https://www.dynatrace.com/support/help/reference/dynatrace-concepts/access-tokens/) instructions in the Dynatrace documentation.
+ You must configure your flow with a date filter with a date range that does not exceed 30 days.
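
Before configuring the flow's date filter, you can confirm that a planned window respects the 30-day limit. This validation sketch is illustrative only, not part of Amazon AppFlow:

```python
from datetime import date, timedelta

# Dynatrace flows require a date filter spanning at most 30 days.
MAX_RANGE = timedelta(days=30)

def valid_dynatrace_range(start: date, end: date) -> bool:
    return start <= end and (end - start) <= MAX_RANGE

# A 29-day window is valid; a 31-day window is not.
assert valid_dynatrace_range(date(2023, 6, 1), date(2023, 6, 30))
assert not valid_dynatrace_range(date(2023, 5, 30), date(2023, 6, 30))
```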

## Connection instructions
<a name="dynatrace-setup"></a>

**To connect to Dynatrace while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Dynatrace** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Dynatrace** dialog box.

   1. Under **API token**, enter your API token.

   1. Under **Subdomain**, enter the subdomain for your instance of Dynatrace.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Dynatrace connection form with fields for API token, subdomain, data encryption, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-dynatrace-console.png)

1. You will be redirected to the Dynatrace login page. When prompted, grant Amazon AppFlow permissions to access your Dynatrace account.

Now that you are connected to your Dynatrace account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#dynatrace-requirements) section.

## Notes
<a name="dynatrace-notes"></a>
+ When you use Dynatrace as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.

## Supported destinations
<a name="dynatrace-destinations"></a>

When you create a flow that uses Dynatrace as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Amazon Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="dynatrace-resources"></a>
+ [Access tokens](https://www.dynatrace.com/support/help/reference/dynatrace-concepts/access-tokens/) instructions in the Dynatrace documentation
+ [Dynatrace API documentation](https://www.dynatrace.com/support/help/dynatrace-api/) for more information about the types of data you can extract from Dynatrace
+ [Dynatrace is launch partner of Amazon AppFlow – a service for easy and secure data transfer](https://www.dynatrace.com/news/blog/dynatrace-integrates-with-amazon-appflow/) from *Dynatrace Resources*

# Facebook Ads connector for Amazon AppFlow
<a name="connectors-facebook-ads"></a>

You can use the Facebook Ads connector in Amazon AppFlow to transfer data about the ads that you run with the Facebook Marketing API. The Marketing API is a series of Graph API endpoints that create and manage ads on Facebook and Instagram. After you connect Amazon AppFlow to your Facebook developer account, you can transfer data about your ads, campaigns, budgets, and more.

**Topics**
+ [Facebook Ads support](#facebook-ads-support)
+ [Before you begin](#facebook-ads-prereqs)
+ [Connecting Amazon AppFlow to the Facebook Marketing API](#facebook-ads-connecting)
+ [Transferring data from the Facebook Marketing API with a flow](#facebook-ads-import-data)
+ [Supported objects](#facebook-ads-reference-objects)
+ [Supported destinations](#facebook-ads-reference-destinations)

## Facebook Ads support
<a name="facebook-ads-support"></a>

The following list summarizes how Amazon AppFlow supports the Facebook Marketing API through the Facebook Ads connector.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data about your Facebook ads from the Marketing API.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to the Marketing API or your Facebook developer account.

**Supported versions**  
Amazon AppFlow supports the following versions of the Marketing API:  
+ v23.0
+ v22.0
+ v21.0
+ v20.0
+ v19.0
+ v18.0
+ v17.0
For more information about Marketing API versions, see [Changelog](https://developers.facebook.com/docs/graph-api/changelog) in the Meta for Developers documentation.

## Before you begin
<a name="facebook-ads-prereqs"></a>

To use Amazon AppFlow to transfer data from the Marketing API to supported destinations, you must meet these requirements:
+ You have a Meta for Developers account.
+ Your account contains an app with its type set to *Business*. For information about creating an app, see [Create an App](https://developers.facebook.com/docs/development/create-an-app) in the Meta for Developers App Development documentation.
+ Your Meta for Developers app includes the *Facebook Login* product, which you've configured to meet the following additional requirements:
  + Client OAuth login is enabled
  + Web OAuth login is enabled
  + One or more OAuth redirect URIs are present for Amazon AppFlow. Each of these URIs has the following form:

    `https://region.console.aws.amazon.com/appflow/oauth`

    In this URI, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from the Marketing API. For example, if you use Amazon AppFlow in the US East (N. Virginia) Region, the URI is `https://us-east-1.console.aws.amazon.com/appflow/oauth`.

    For the AWS Regions that Amazon AppFlow supports, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

  For more information about Facebook Login, see [Facebook Login](https://developers.facebook.com/docs/facebook-login) in the Meta For Developers documentation.
+ Your Meta for Developers app includes the *Marketing API* product, which you use to manage the ads that Amazon AppFlow transfers data about.

## Connecting Amazon AppFlow to the Facebook Marketing API
<a name="facebook-ads-connecting"></a>

To connect Amazon AppFlow to data about your Facebook ads, create an Amazon AppFlow connection where you provide details about your Meta for Developers app. If you haven't yet configured your app for Amazon AppFlow integration, see [Before you begin](#facebook-ads-prereqs).

**To connect to Facebook Ads**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Facebook Ads**.

1. Choose **Create connection**.

1. In the **Connect to Facebook Ads** window, enter the following information:
   + **Custom authorization code URL** – Specify the Marketing API version that you use in your Facebook developer app to complete the URL shown in the console: 

     `https://www.facebook.com/version/dialog/oauth`

     For example, if you use v17.0, the URL is `https://www.facebook.com/v17.0/dialog/oauth`.

     For the Marketing API versions that Amazon AppFlow supports, see [Facebook Ads support](#facebook-ads-support).
   + **Client ID** – The App ID that's assigned to your Meta for Developers app.
   + **Client secret** – The App secret that's assigned to your Meta for Developers app.
   + **Facebook Instance URL** – Choose **https://graph.facebook.com**.
   + **Facebook API version** – Choose the Marketing API version that you use. This version must match the one that you specified for **Custom authorization code URL**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Facebook Ads as the data source, you can select this connection.
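
The custom authorization code URL in step 5 is the OAuth dialog URL with the API version inserted; a minimal sketch:

```python
def facebook_auth_url(api_version: str) -> str:
    """Insert the Marketing API version into the OAuth dialog URL."""
    return f"https://www.facebook.com/{api_version}/dialog/oauth"

# The v17.0 example from the text:
assert facebook_auth_url("v17.0") == "https://www.facebook.com/v17.0/dialog/oauth"
```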

## Transferring data from the Facebook Marketing API with a flow
<a name="facebook-ads-import-data"></a>

To transfer data about your Facebook ads from the Marketing API, create an Amazon AppFlow flow, and choose Facebook Ads as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose which data object you want to transfer. For most Facebook Ads objects, you must choose two values: one for **Choose Facebook Ads object**, and another for **Choose Facebook Ads subobject**. The subobject is an individual instance of the object. For example, if the object that you choose is **Campaigns**, then the subobject is the specific campaign to transfer data from. For the objects that Amazon AppFlow supports for Facebook Ads, see [Supported objects](#facebook-ads-reference-objects).

Also choose the destination where you want to transfer the data object that you selected. For information on how to configure your destination, see [Supported destinations](#facebook-ads-reference-destinations).

## Supported objects
<a name="facebook-ads-reference-objects"></a>

When you create a flow that uses Facebook Ads as the data source, you can transfer any of the following data objects:
+ Account
+ Campaigns
+ Ad Sets
+ Campaign Budget
+ Ads
+ Ad Creatives

For more information about these objects and the data that they contain, see [Ad Campaign Structure](https://developers.facebook.com/docs/marketing-api/campaign-structure) in the Meta for Developers Marketing API documentation.

## Supported destinations
<a name="facebook-ads-reference-destinations"></a>

When you create a flow that uses Facebook Ads as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Facebook Page Insights connector for Amazon AppFlow
<a name="connectors-facebook-page-insights"></a>

Facebook Page Insights provides Facebook Page owners with information about the performance and visitor demographics of their Pages. You can use Amazon AppFlow to transfer data from Facebook Page Insights to certain AWS services or other supported applications.

## Amazon AppFlow support for Facebook Page Insights
<a name="facebook-page-insights-support"></a>

Amazon AppFlow supports Facebook Page Insights as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Facebook Page Insights.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Facebook Page Insights.

**Supported API version**  
Amazon AppFlow retrieves your Facebook Page Insights data by sending requests to Graph API v15.0. 

## Before you begin
<a name="facebook-page-insights-prereqs"></a>

To use Amazon AppFlow to transfer data from Facebook Page Insights to supported destinations, you must meet these requirements:
+ You have a Facebook account and one or more Facebook Pages that contain the data that you want to transfer. For more information about the Facebook Page Insights data objects that Amazon AppFlow supports, see [Supported objects](#facebook-page-insights-objects).
+ You have a Meta for Developers account.
+ Your account contains an app with its type set to *Business*. For information about how to create an app, see [Create an App](https://developers.facebook.com/docs/development/create-an-app) in the Meta for Developers App Development documentation.
+ Your Meta for Developers app includes the *Facebook Login* product, and you've configured this product to meet the following additional requirements:
  + Client OAuth login is enabled.
  + Web OAuth login is enabled.
  + One or more OAuth redirect URIs are present for Amazon AppFlow. Each of these URIs has the following form:

    `https://region.console.aws.amazon.com/appflow/oauth`

    In this URI, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Facebook Page Insights. For example, if you use Amazon AppFlow in the US East (N. Virginia) region, the URI is `https://us-east-1.console.aws.amazon.com/appflow/oauth`.

    For the AWS Regions that Amazon AppFlow supports, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

  For more information about Facebook Login, see [Facebook Login](https://developers.facebook.com/docs/facebook-login) in the Meta for Developers documentation.
+ In the Data Use Checkup settings for your app, you've activated the `public_profile` and `email` permissions. For the steps to configure Data Use Checkup settings, see [Data Use Checkup](https://developers.facebook.com/docs/development/maintaining-data-access/data-use-checkup/) in the Meta for Developers App Development documentation.
+ You've configured your app with the following permissions:
  + `ads_management`
  + `ads_read`
  + `page_events`
  + `pages_manage_ads`
  + `pages_manage_cta`
  + `pages_manage_engagement`
  + `pages_manage_instant_articles`
  + `pages_manage_metadata`
  + `pages_manage_posts`
  + `pages_read_engagement`
  + `pages_read_user_content`
  + `pages_show_list`
  + `public_profile`
  + `read_insights`

  For more information about these permissions, see [Permissions Reference](https://developers.facebook.com/docs/permissions/reference) in the Meta for Developers Graph API documentation.

From the settings for your app, note the app ID and app secret. You provide these values to Amazon AppFlow in the connection settings.

## Connecting Amazon AppFlow to your Facebook Page Insights account
<a name="facebook-page-insights-connecting"></a>

To connect Amazon AppFlow to your Facebook account, provide the app credentials from your Meta for Developers app so that Amazon AppFlow can access your data. If you haven't yet configured an app for Amazon AppFlow integration, see [Before you begin](#facebook-page-insights-prereqs).

**To connect to Facebook Page Insights**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Facebook Page Insights**.

1. Choose **Create connection**.

1. In the **Connect to Facebook Page Insights** window, enter the following information:
   + **Client ID** – The app ID from your Meta for Developers app.
   + **Client secret** – The app secret from your Meta for Developers app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your Facebook account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Facebook Page Insights as the data source, you can select this connection.

## Transferring data from Facebook Page Insights with a flow
<a name="facebook-page-insights-transfer-data"></a>

To transfer data from Facebook Page Insights, create an Amazon AppFlow flow, and choose Facebook Page Insights as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Facebook Page Insights, see [Supported objects](#facebook-page-insights-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#facebook-page-insights-destinations).

## Supported destinations
<a name="facebook-page-insights-destinations"></a>

When you create a flow that uses Facebook Page Insights as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="facebook-page-insights-objects"></a>

When you create a flow that uses Facebook Page Insights as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-facebook-page-insights.html)

# Freshdesk connector for Amazon AppFlow
<a name="connectors-freshdesk"></a>

Freshdesk is an online customer service solution. If you're a Freshdesk user, your account contains data about your customer engagements, including agents, conversations, and satisfaction ratings. You can use Amazon AppFlow to transfer data from Freshdesk to certain AWS services or other supported applications.

## Amazon AppFlow support for Freshdesk
<a name="freshdesk-support"></a>

Amazon AppFlow supports Freshdesk as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Freshdesk.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Freshdesk.

## Before you begin
<a name="freshdesk-prereqs"></a>

To use Amazon AppFlow to transfer data from Freshdesk to supported destinations, you must meet these requirements:
+ You have an account with Freshdesk that contains the data that you want to transfer. For more information about the Freshdesk data objects that Amazon AppFlow supports, see [Supported objects](#freshdesk-objects).

Note the following values because you specify them in the connection settings in Amazon AppFlow.
+ The API key from the profile settings of your Freshdesk account. The API key authenticates third-party services like Amazon AppFlow to access your account. For the steps to find the key, see [How to find your API key](https://support.freshdesk.com/en/support/solutions/articles/215517-how-to-find-your-api-key) at the Freshdesk support site.
+ Your Freshdesk address.
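
If you want to confirm that your API key works before creating the connection, you can call the Freshdesk REST API directly. Freshdesk uses HTTP Basic authentication with the API key as the username and a placeholder password. The key and domain in this sketch are hypothetical.

```python
import base64

# Freshdesk's REST API uses HTTP Basic auth: the API key is the username and
# the password is a placeholder such as "X". Key and domain are hypothetical.
def freshdesk_auth_header(api_key: str) -> str:
    token = base64.b64encode(f"{api_key}:X".encode()).decode()
    return f"Basic {token}"

url = "https://my-company-name.freshdesk.com/api/v2/tickets"
headers = {"Authorization": freshdesk_auth_header("your-api-key")}
```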

## Connecting Amazon AppFlow to your Freshdesk account
<a name="freshdesk-connecting"></a>

To connect Amazon AppFlow to your Freshdesk account, provide your API key and Freshdesk address.

**To connect to Freshdesk**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Freshdesk**.

1. Choose **Create connection**.

1. In the **Connect to Freshdesk** window, enter the following information:
   + **API key** – The API key from your Freshdesk profile settings.
   + **Instance URL** – Your Freshdesk address, such as `https://my-company-name.freshdesk.com`.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Freshdesk as the data source, you can select this connection.

## Transferring data from Freshdesk with a flow
<a name="freshdesk-transfer-data"></a>

To transfer data from Freshdesk, create an Amazon AppFlow flow, and choose Freshdesk as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Freshdesk, see [Supported objects](#freshdesk-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#freshdesk-destinations).

## Supported destinations
<a name="freshdesk-destinations"></a>

When you create a flow that uses Freshdesk as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="freshdesk-objects"></a>

When you create a flow that uses Freshdesk as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-freshdesk.html)

# Freshsales connector for Amazon AppFlow
<a name="connectors-freshsales"></a>

Freshsales is a Customer Relationship Management (CRM) service that helps companies manage customer data and interactions. If you’re a Freshsales user, your account contains information about communication, timelines, meetings, chats, workflows, and more. You can use Amazon AppFlow to transfer data from Freshsales to certain AWS services or other supported applications.

## Amazon AppFlow support for Freshsales
<a name="freshsales-support"></a>

Amazon AppFlow supports Freshsales as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Freshsales.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Freshsales.

## Before you begin
<a name="freshsales-prereqs"></a>

To use Amazon AppFlow to transfer data from Freshsales to supported destinations, you must have an account with Freshsales that contains the data that you want to transfer. For more information about the Freshsales data objects that Amazon AppFlow supports, see [Supported objects](#freshsales-objects).

From the API settings of your Freshsales account, note the value of your API key. When you connect to your Freshsales account, you provide this value to Amazon AppFlow. For more information, see [How to find my API key?](https://support.freshsales.io/en/support/solutions/articles/220099-how-to-find-my-api-key-) on the Freshsales support site.
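
To verify the API key before connecting, you can call the Freshsales API directly. Freshsales expects the key in an `Authorization` header of the form `Token token=<key>`. The key and instance URL in this sketch are hypothetical.

```python
# Freshsales (Freshworks CRM) API requests carry the API key in an
# Authorization header of the form "Token token=<key>".
# The key and instance URL below are hypothetical.
def freshsales_headers(api_key: str) -> dict:
    return {
        "Authorization": f"Token token={api_key}",
        "Content-Type": "application/json",
    }

base_url = "https://my-freshsales-instance.myfreshworks.com/crm/sales"
headers = freshsales_headers("sfg999666t673t7t82")
```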

## Connecting Amazon AppFlow to your Freshsales account
<a name="freshsales-connecting"></a>

To connect Amazon AppFlow to your Freshsales account, provide details from your Freshsales project so that Amazon AppFlow can access your data. If you haven't yet configured your Freshsales project for Amazon AppFlow integration, see [Before you begin](#freshsales-prereqs).

**To connect to Freshsales**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Freshsales**.

1. Choose **Create connection**.

1. In the **Connect to Freshsales** window, enter the following information:
   + **API key** – Enter the word **token** in this field.
   + **API secret key** – Enter your secret key. This is named “Your API Key” in the Freshsales console, for example, **sfg999666t673t7t82**. 
   + **Instance URL** – Enter the URL for your Freshsales instance, for example, `https://my-freshsales-instance.myfreshworks.com/crm1/sales`.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Freshsales account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Freshsales as the data source, you can select this connection.

## Transferring data from Freshsales with a flow
<a name="freshsales-transfer-data"></a>

To transfer data from Freshsales, create an Amazon AppFlow flow, and choose Freshsales as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Freshsales, see [Supported objects](#freshsales-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#freshsales-destinations).

## Supported destinations
<a name="freshsales-destinations"></a>

When you create a flow that uses Freshsales as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="freshsales-objects"></a>

When you create a flow that uses Freshsales as the data source, you can transfer any of the following data objects to supported destinations:


| **Object** | **Field** | **Data type** | **Supported filters** | 
| --- | --- | --- | --- | 
|  accounts |  |  |  | 
|  contacts |  |  |  | 

# GitHub connector for Amazon AppFlow
<a name="connectors-github"></a>

GitHub is a service that hosts code repositories for software developers, and it provides version control with Git. If you're a GitHub user, your account contains data about your repositories, such as branches, commits, and pull requests. You can use Amazon AppFlow to transfer data from GitHub to certain AWS services or other supported applications.

## Amazon AppFlow support for GitHub
<a name="github-support"></a>

Amazon AppFlow supports GitHub as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from GitHub.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to GitHub.

## Before you begin
<a name="github-prereqs"></a>

To use Amazon AppFlow to transfer data from GitHub to supported destinations, you must meet these requirements:
+ You have an account with GitHub that contains the data that you want to transfer. For more information about the GitHub data objects that Amazon AppFlow supports, see [Supported objects](#github-objects).
+ In the developer settings of your account, you've created either of the following resources for Amazon AppFlow. These resources provide credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account.
  + An OAuth app. For the steps to create one, see [Creating an OAuth App](https://docs.github.com/en/developers/apps/building-oauth-apps/creating-an-oauth-app) in the GitHub Docs.
  + A personal access token. For the steps to create one, see [Creating a personal access token](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token) in the GitHub Docs.
+ If you created an OAuth app, you've configured it with the following settings:
  + You've set the homepage URL to `https://console.aws.amazon.com/appflow/home`.
  + You've specified a callback URL for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from GitHub. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
  + You've generated a client secret.
+ If you created a personal access token, it permits the following recommended scopes. If you want to allow fewer scopes, you can omit any that apply to objects that you don't want to transfer.
  + `repo:status`
  + `repo_deployment`
  + `public_repo`
  + `security_events`
  + `admin:repo_hook`
  + `read:repo_hook`
  + `read:org`
  + `read:public_key`
  + `notifications`
  + `read:user`
  + `user:email`
  + `read:discussion`

  For more information about these scopes, see [Available scopes](https://docs.github.com/en/developers/apps/building-oauth-apps/scopes-for-oauth-apps#available-scopes) in the GitHub Docs.

If you created an OAuth app, note the client ID and client secret. If you created a personal access token, note the token value. You provide these values to Amazon AppFlow when you connect to your GitHub account.
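
If you use a personal access token, you can check the scopes it grants against the recommended list above before connecting. GitHub reports a token's granted scopes in the `X-OAuth-Scopes` header of any authenticated API response; this sketch only compares two scope lists and makes no network calls.

```python
# The recommended token scopes listed above, as a set.
RECOMMENDED_SCOPES = {
    "repo:status", "repo_deployment", "public_repo", "security_events",
    "admin:repo_hook", "read:repo_hook", "read:org", "read:public_key",
    "notifications", "read:user", "user:email", "read:discussion",
}

# Compare the scopes a token actually grants (from the X-OAuth-Scopes
# response header) against the recommended set.
def missing_scopes(granted: set) -> set:
    return RECOMMENDED_SCOPES - granted

print(sorted(missing_scopes({"read:org", "read:user", "user:email"})))
```

Any scopes reported as missing that apply only to objects you don't plan to transfer can be safely omitted.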

## Connecting Amazon AppFlow to your GitHub account
<a name="github-connecting"></a>

To connect Amazon AppFlow to your GitHub account, provide the client credentials from your OAuth app, or provide a personal access token. If you haven't yet configured your GitHub account for Amazon AppFlow integration, see [Before you begin](#github-prereqs).

**To connect to GitHub**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **GitHub**.

1. Choose **Create connection**.

1. In the **Connect to GitHub** window, for **Select authentication type**, choose how to authenticate Amazon AppFlow with your GitHub account when it requests to access your data:
   + Choose **OAuth2** to authenticate Amazon AppFlow with the client ID and client secret from an OAuth app. Then, enter values for **Client ID** and **Client secret**.
   + Choose **BasicAuthPersonalAccessToken** to authenticate Amazon AppFlow with a personal access token. Then, enter values for **User name** and **Personal Access Token**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**. A window appears that asks if you want to allow Amazon AppFlow to access your GitHub account.

1. Choose **Authorize**.

1. Confirm the access request with GitHub. You can choose **Send SMS** to use a two-factor authentication code, or you can choose **Use your password** to enter your password.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses GitHub as the data source, you can select this connection.

## Transferring data from GitHub with a flow
<a name="github-transfer-data"></a>

To transfer data from GitHub, create an Amazon AppFlow flow, and choose GitHub as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for GitHub, see [Supported objects](#github-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#github-destinations).

## Supported destinations
<a name="github-destinations"></a>

When you create a flow that uses GitHub as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="github-objects"></a>

When you create a flow that uses GitHub as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-github.html)

# GitLab connector for Amazon AppFlow
<a name="connectors-gitlab"></a>

GitLab is an open source code repository and software development platform. If you're a GitLab user, your account contains data about your projects and repositories. You can use Amazon AppFlow to transfer data from GitLab to certain AWS services or other supported applications.

## Amazon AppFlow support for GitLab
<a name="gitlab-support"></a>

Amazon AppFlow supports GitLab as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from GitLab.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to GitLab.

**Supported API version**  
Amazon AppFlow retrieves your data by sending requests to the GitLab v4 REST API.

## Before you begin
<a name="gitlab-prereqs"></a>

To use Amazon AppFlow to transfer data from GitLab to supported destinations, you must meet these requirements:
+ You have a GitLab account and one or more projects that contain the data that you want to transfer. For more information about the GitLab data objects that Amazon AppFlow supports, see [Supported objects](#gitlab-objects).
+ In the settings of your account, you've created either of the following resources for Amazon AppFlow. These resources provide credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account.
  + An application, which provides OAuth 2.0 authentication. For the steps to create an application, see [User owned applications](https://docs.gitlab.com/ee/integration/oauth_provider.html#user-owned-applications) in the GitLab Docs.
  + A personal access token. For the steps to create one, see [Create a personal access token](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html#create-a-personal-access-token) in the GitLab Docs.

    Your personal access token must permit the `api` scope.
+ If you created an application, you've configured it with the following settings:
  + You've specified a redirect URL for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from GitLab. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
  + You've permitted the scopes that provide access to the data objects that you want to transfer. For information about GitLab OAuth 2.0 scopes, see [Authorized applications](https://docs.gitlab.com/ee/integration/oauth_provider.html#authorized-applications) in the GitLab Docs.

If you created an application, note the application ID and secret. If you created a personal access token, note the token value. You provide these values to Amazon AppFlow when you connect to your GitLab account.
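
If you use a personal access token, you can verify it against the GitLab v4 REST API before connecting. GitLab accepts the token in a `PRIVATE-TOKEN` request header. The token value in this sketch is hypothetical.

```python
# GitLab's v4 REST API accepts a personal access token in the PRIVATE-TOKEN
# request header. The token value is hypothetical; the token must permit
# the api scope.
def gitlab_headers(token: str) -> dict:
    return {"PRIVATE-TOKEN": token}

url = "https://gitlab.com/api/v4/projects"
headers = gitlab_headers("glpat-example-token")
```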

## Connecting Amazon AppFlow to your GitLab account
<a name="gitlab-connecting"></a>

To connect Amazon AppFlow to your GitLab account, provide the credentials from your application, or provide a personal access token. If you haven't yet configured your GitLab account for Amazon AppFlow integration, see [Before you begin](#gitlab-prereqs).

**To connect to GitLab**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **GitLab**.

1. Choose **Create connection**.

1. In the **Connect to GitLab** window, for **Select authentication type**, choose how to authenticate Amazon AppFlow with your GitLab account when it requests to access your data:
   + Choose **OAuth2** to authenticate Amazon AppFlow with the credentials from an application. Then, enter the following values:
     + **Client ID** – The application ID.
     + **Client secret** – The secret.
   + Choose **PersonalAccessToken** to authenticate Amazon AppFlow with a personal access token. Then, enter the token value for **Personal access token**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Depending on the authentication type that you chose, do one of the following:
   + If you chose **OAuth2**, choose **Continue**. Then, in the window that appears, sign in to your GitLab account, and grant access to Amazon AppFlow.
   + If you chose **PersonalAccessToken**, choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses GitLab as the data source, you can select this connection.

## Transferring data from GitLab with a flow
<a name="gitlab-transfer-data"></a>



To transfer data from GitLab, create an Amazon AppFlow flow, and choose GitLab as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for GitLab, see [Supported objects](#gitlab-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#gitlab-destinations).

## Supported destinations
<a name="gitlab-destinations"></a>

When you create a flow that uses GitLab as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="gitlab-objects"></a>

When you create a flow that uses GitLab as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-gitlab.html)

# Google Ads connector for Amazon AppFlow
<a name="connectors-google-ads"></a>

Google Ads is a platform that advertisers use to display ads on the web, such as in Google search results, YouTube videos, mobile apps, and on websites. If you are a Google Ads user, you can use Amazon AppFlow to transfer data about your account, ad campaigns, and ad groups to certain AWS services or other supported applications.

**Topics**
+ [Google Ads support](#google-ads-support)
+ [Before you begin](#google-ads-prereqs)
+ [Connecting Amazon AppFlow to your Google Ads account](#google-ads-connecting)
+ [Transferring data from Google Ads with a flow](#google-ads-import-data)
+ [Supported objects](#google-ads-reference-objects)
+ [Supported destinations](#google-ads-reference-destinations)

## Google Ads support
<a name="google-ads-support"></a>

Amazon AppFlow supports Google Ads as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from your Google Ads account.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to your Google Ads account.

**Supported versions**  
Amazon AppFlow supports the following versions of the Google Ads API:  
+ v22
+ v21
+ v20

## Before you begin
<a name="google-ads-prereqs"></a>

To use Amazon AppFlow to transfer data from Google Ads to AWS services, you'll need to meet these requirements:
+ You have a Google Cloud Platform account and a Google Cloud project.
+ In your Google Cloud project, you've enabled the Google Ads API. For information on how to enable APIs, see [Enable and disable APIs](https://support.google.com/googleapi/answer/6158841) in the API Console Help for Google Cloud Platform.
+ You have a Google Ads developer token. For information on how to retrieve or create a developer token, see [Obtain Your Developer Token](https://developers.google.com/google-ads/api/docs/first-call/dev-token) in the Google Ads API documentation.
+ In your Google Cloud project, you've configured an OAuth consent screen for external users that meets the following requirements:
  + You've set *amazon.com* as an authorized domain.
  + You've set *Google Ads API* as an authorized scope.

  For information about the OAuth consent screen, see [Setting up your OAuth consent screen](https://support.google.com/cloud/answer/10311615#) in the Google Cloud Platform Console Help.
+ In your Google Cloud project, you've configured an OAuth 2.0 client ID. For information on how to create one, see [Setting up OAuth 2.0](https://support.google.com/cloud/answer/6158849?hl=en#zippy=) in the Google Cloud Platform Console Help.

  The OAuth 2.0 client ID must have one or more authorized redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Google Ads. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

From your Google Ads settings, note your developer token. From the settings for your OAuth 2.0 client ID in your Google Cloud project, note your client ID and client secret. You will provide these values to Amazon AppFlow when you connect to your Google Cloud project.
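
The redirect URL pattern above is mechanical, so you can generate it for any Region with a few lines of code. A minimal sketch; `appflow_redirect_url` is an illustrative helper name, not part of any AWS SDK:

```python
# Build the Amazon AppFlow OAuth redirect URL for an AWS Region code.
# The URL pattern follows the format documented above.
def appflow_redirect_url(region: str) -> str:
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

print(appflow_redirect_url("us-east-1"))
```

Register the resulting URL as an authorized redirect URL on your OAuth 2.0 client ID for each Region where you use Amazon AppFlow.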

## Connecting Amazon AppFlow to your Google Ads account
<a name="google-ads-connecting"></a>

To connect Amazon AppFlow to your Google Ads account, provide details from the Google Cloud project so that Amazon AppFlow can access your Google Ads data. If you haven't yet configured your Google Cloud project for Amazon AppFlow integration, see [Before you begin](#google-ads-prereqs).

**To connect to Google Ads**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Google Ads**.

1. Choose **Create connection**.

1. In the **Connect to Google Ads** window, enter the following information:
   + **Access type** – Choose **offline**.
   + **Client ID** – The client ID of the OAuth 2.0 client ID in your Google Cloud project.
   + **Client secret** – The client secret of the OAuth 2.0 client ID in your Google Cloud project.
   + **Google Ads developer token** – The developer token from your Google Ads account.
   + **Google Ads instance URL** – Choose **https://googleads.googleapis.com**.
   + **Google Ads API version** – Choose the Google Ads API version that you use. For the versions that Amazon AppFlow supports, see [Google Ads support](#google-ads-support).
   + **Manager account ID** – Optionally, the account ID of a Google Ads manager account that you want to connect with Amazon AppFlow.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**. A **Sign in with Google** window opens.

1. Choose your Google account, and sign in.

1. On the page titled **amazon.com wants to access your Google Account**, choose **Continue**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Google Ads as the data source, you can select this connection.

## Transferring data from Google Ads with a flow
<a name="google-ads-import-data"></a>

To transfer data from Google Ads, create an Amazon AppFlow flow, and choose Google Ads as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose which data object you want to transfer. For the objects that Amazon AppFlow supports for Google Ads, see [Supported objects](#google-ads-reference-objects).

Also choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#google-ads-reference-destinations).

## Supported objects
<a name="google-ads-reference-objects"></a>

When you create a flow that uses Google Ads as the data source, you can transfer any of the following data objects to supported destinations:
+ Account
+ Account Budget
+ Campaign
+ Campaign Budget
+ Ad Group
+ Ad Group Ad
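
If you create the flow programmatically instead of in the console, one of the object names above becomes the source entity. The following is a minimal sketch of the source configuration, assuming a hypothetical connection name; the `CustomConnector` shape is an assumption based on the boto3 `appflow` CreateFlow request structure, since Google Ads is delivered through Amazon AppFlow's custom connector framework:

```python
# Sketch: source configuration for a flow that reads the "Campaign" object
# from Google Ads. "my-google-ads-connection" is a hypothetical connection
# name; in real code this dict becomes the sourceFlowConfig argument of
# boto3.client("appflow").create_flow(...).
source_flow_config = {
    "connectorType": "CustomConnector",
    "connectorProfileName": "my-google-ads-connection",
    "sourceConnectorProperties": {"CustomConnector": {"entityName": "Campaign"}},
}
print(source_flow_config["sourceConnectorProperties"]["CustomConnector"]["entityName"])
```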

## Supported destinations
<a name="google-ads-reference-destinations"></a>

When you create a flow that uses Google Ads as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Google Analytics
<a name="google-analytics"></a>

The following are the requirements and connection instructions for using Google Analytics with Amazon AppFlow.

**Notes**  
The Google Analytics connector transfers data only from Universal Analytics properties. If you want to transfer data from Google Analytics 4 properties instead, use the [Google Analytics 4 connector](connectors-google-analytics-4.md).  
In time, Google Analytics will end support for Universal Analytics properties, and that platform will fully support only Google Analytics 4 properties. For more information, see [Introducing the next generation of Analytics, Google Analytics 4 (GA4)](https://support.google.com/analytics/answer/10089681?hl=en).  
You can use Google Analytics as a source only.

## Requirements
<a name="googleanalytics-requirements"></a>

You must log in to the Google API Console at [https://console.developers.google.com](https://console.developers.google.com) and do the following:
+ Activate the Analytics API.
+ Create a new app named **AppFlow**. Set the user type as **Internal**. Add the scope for read-only access and add `amazon.com` as an authorized domain.
+ Create a new OAuth 2.0 client. Set the application type as **Web application**.
+ Set the authorized JavaScript origins URL to `https://console.aws.amazon.com`.
+ Set the authorized redirect URL to `https://region.console.aws.amazon.com/appflow/oauth`. For example, if you use Amazon AppFlow in the US East (N. Virginia) Region, set the URL to `https://us-east-1.console.aws.amazon.com/appflow/oauth`.
+ Provide Amazon AppFlow with your client ID and client secret. After you provide them, you are redirected to the Google login page. When prompted, grant Amazon AppFlow permissions to access your Google Analytics account. Note that your Google Analytics user account must also be a Google Workspace user account.

For more information, see [Management API - Authorization](https://developers.google.com/analytics/devguides/config/mgmt/v3/authorization) in the Google Analytics documentation.

## Connection instructions
<a name="googleanalaytics-setup"></a>

**To connect to Google Analytics while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed key in the AWS Key Management Service (AWS KMS) instead of the default AWS managed KMS key, choose **Data encryption**, **Customize encryption settings** and then choose an existing KMS key or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Google Analytics** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Google Analytics** dialog box.

   1. Under **Client ID**, enter your client ID.

   1. Under **Client secret**, enter your client secret.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Continue**.  
![\[Google Analytics connection form with fields for client ID, secret, data encryption, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-googleanalytics-console.png)

1. You will be redirected to the Google Analytics login page. When prompted, grant Amazon AppFlow permissions to access your Google Analytics account.

Now that you are connected to your Google Analytics account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#googleanalytics-requirements) section.
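
Once the connection exists, the same flow can also be defined programmatically through the Amazon AppFlow `CreateFlow` API. The following is a minimal sketch of the request shape, assuming a hypothetical connection name, source object, and S3 bucket; in real code, you would pass it to `boto3.client("appflow").create_flow(**request)`:

```python
# Sketch of a CreateFlow request that copies a Google Analytics object to
# Amazon S3. "my-ga-connection", "Users", and "my-flow-bucket" are
# hypothetical names; replace them with your own.
request = {
    "flowName": "google-analytics-to-s3",
    "triggerConfig": {"triggerType": "OnDemand"},
    "sourceFlowConfig": {
        "connectorType": "Googleanalytics",
        "connectorProfileName": "my-ga-connection",
        "sourceConnectorProperties": {"GoogleAnalytics": {"object": "Users"}},
    },
    "destinationFlowConfigList": [
        {
            "connectorType": "S3",
            "destinationConnectorProperties": {
                "S3": {
                    "bucketName": "my-flow-bucket",
                    # Google Analytics sources support only JSON output to S3.
                    "s3OutputFormatConfig": {"fileType": "JSON"},
                }
            },
        }
    ],
    "tasks": [
        {
            # Map_all copies every source field to the destination unchanged.
            "taskType": "Map_all",
            "sourceFields": [],
            "taskProperties": {},
        }
    ],
}
print(request["flowName"])
```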

## Notes
<a name="googleanalytics-notes"></a>
+ When you use Google Analytics as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per day.
+ Google Analytics can process 9 dimensions and 10 metrics (including custom ones) as part of a single flow run.
+ If you choose Google Analytics, you can only specify JSON as the data format for the Amazon S3 destination file.
+ You can import custom dimensions and metrics from Google Analytics into Amazon S3. To specify custom dimensions or metrics, choose the **upload a .csv file with mapped fields** option in the **Map data fields** step of the flow configuration. In the source field name in the CSV file, specify the custom dimension or metric as `ga:dimensionXX` or `ga:metricXX`, where *XX* is the index (numerical value) that you assigned in Google Analytics.

  The following is an example row in the CSV file: 

  `ga:dimension24|DIMENSION, PriceDimension`

  This imports the custom dimension in Google Analytics to a field named `PriceDimension` in the destination Amazon S3 file.
**Note**  
The option to specify custom dimensions and metrics is available only when you upload a CSV file with mapped fields, and not when you manually map fields using the console.
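
Because the mapped-fields file is plain CSV, you can generate it with a short script. A sketch; the index values (24, 25) and destination field names are placeholder examples, and the `ga:dimensionXX|DIMENSION` source field follows the format shown above:

```python
import csv
import io

# Build a mapped-fields CSV for custom Google Analytics dimensions.
# Each row pairs a source field (custom dimension) with a destination
# field name in the Amazon S3 output.
mappings = [
    ("ga:dimension24|DIMENSION", "PriceDimension"),
    ("ga:dimension25|DIMENSION", "ColorDimension"),
]
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerows(mappings)
csv_text = buffer.getvalue()
print(csv_text)
```

You would save this output as a `.csv` file and upload it in the **Map data fields** step.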

## Supported destinations
<a name="google-analytics-destinations"></a>

When you create a flow that uses Google Analytics as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon S3](s3.md)
+ [Upsolver](upsolver.md)

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="googleanalytics-resources"></a>
+ [Management API - Authorization](https://developers.google.com/analytics/devguides/config/mgmt/v3/authorization) in the Google Analytics documentation
+ [Create a Property](https://support.google.com/analytics/answer/10269537#property) in the Google Analytics documentation
+ [Analyzing Google Analytics data with Amazon AppFlow and Athena](https://aws.amazon.com/blogs/big-data/analyzing-google-analytics-data-with-amazon-appflow-and-amazon-athena) in the *AWS Big Data Blog*
+ How to transfer data from Google Analytics to Amazon S3 using Amazon AppFlow


# Google Analytics 4 connector for Amazon AppFlow
<a name="connectors-google-analytics-4"></a>

Google Analytics 4 is an analytics service that tracks and reports metrics about visitor interactions with your apps and websites. These metrics include page views, active users, and events. You can use Amazon AppFlow to transfer data from Google Analytics 4 to certain AWS services or other supported applications.

**Note**  
The Google Analytics 4 connector transfers data only from Google Analytics 4 properties. If you want to transfer data from Universal Analytics properties instead, use the [Google Analytics connector](google-analytics.md).  
In time, Google Analytics will end support for Universal Analytics properties, and that platform will fully support only Google Analytics 4 properties. For more information, see [Introducing the next generation of Analytics, Google Analytics 4 (GA4)](https://support.google.com/analytics/answer/10089681?hl=en).

## Amazon AppFlow support for Google Analytics 4
<a name="google-analytics-4-support"></a>

Amazon AppFlow supports Google Analytics 4 as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Google Analytics 4.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Google Analytics 4.

## Before you begin
<a name="google-analytics-4-prereqs"></a>

To use Amazon AppFlow to transfer data from Google Analytics 4 to supported destinations, you must meet these requirements:
+ You have a Google Analytics account with one or more data streams that collect the data that you want to transfer. For more information about the Google Analytics 4 data objects that Amazon AppFlow supports, see [Supported objects](#google-analytics-4-objects).
+ You have a Google Cloud Platform account and a Google Cloud project.
+ In your Google Cloud project, you've enabled the following APIs:
  + Google Analytics API
  + Google Analytics Admin API
  + Google Analytics Data API

  For the steps to enable these APIs, see [Enable and disable APIs](https://support.google.com/googleapi/answer/6158841) in the API Console Help for Google Cloud Platform.
+ In your Google Cloud project, you've configured an OAuth consent screen for external users. For information about the OAuth consent screen, see [Setting up your OAuth consent screen](https://support.google.com/cloud/answer/10311615#) in the Google Cloud Platform Console Help.
+ In your Google Cloud project, you've configured an OAuth 2.0 client ID. For the steps to create an OAuth 2.0 client ID, see [Setting up OAuth 2.0](https://support.google.com/cloud/answer/6158849?hl=en#zippy=) in the Google Cloud Platform Console Help.

  The OAuth 2.0 client ID must have one or more authorized redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Google Analytics 4. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

Note the client ID and client secret from the settings for your OAuth 2.0 client ID. When you connect to your Google Cloud project, you provide these values to Amazon AppFlow.

## Connecting Amazon AppFlow to Google Analytics 4
<a name="google-analytics-4-connecting"></a>

To connect Amazon AppFlow to Google Analytics 4, provide the client credentials from the OAuth 2.0 client ID from your Google Cloud project. Amazon AppFlow uses these credentials to access your data. If you haven't yet configured your Google Cloud project for Amazon AppFlow integration, see [Before you begin](#google-analytics-4-prereqs).

**To connect to Google Analytics 4**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Google Analytics 4**.

1. Choose **Create connection**.

1. In the **Connect to Google Analytics 4** window, enter the following information:
   + **Access type** – Choose **offline**.
   + **Client ID** – The client ID of the OAuth 2.0 client ID in your Google Cloud project.
   + **Client secret** – The client secret of the OAuth 2.0 client ID in your Google Cloud project.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your Google account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Google Analytics 4 as the data source, you can select this connection.

## Transferring data from Google Analytics 4 with a flow
<a name="google-analytics-4-transfer-data"></a>

To transfer data from Google Analytics 4, create an Amazon AppFlow flow, and choose Google Analytics 4 as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Google Analytics 4, see [Supported objects](#google-analytics-4-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#google-analytics-4-destinations).

## Supported destinations
<a name="google-analytics-4-destinations"></a>

When you create a flow that uses Google Analytics 4 as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="google-analytics-4-objects"></a>

When you create a flow that uses Google Analytics 4 as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-google-analytics-4.html)

# Google BigQuery connector for Amazon AppFlow
<a name="connectors-googlebigquery"></a>

Google BigQuery is a query and analysis solution. If you’re a Google BigQuery user, your account contains data, analytics, and more. You can use Amazon AppFlow to transfer data between Google BigQuery and certain AWS services or other supported applications.

## Amazon AppFlow support for Google BigQuery
<a name="googlebigquery-support"></a>

Amazon AppFlow supports Google BigQuery as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Google BigQuery.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to Google BigQuery.

## Before you begin
<a name="googlebigquery-prereqs"></a>

To use Amazon AppFlow to transfer data from Google BigQuery to supported destinations, you must meet these requirements:
+ You have an account with Google BigQuery that contains the data that you want to transfer. For more information about the Google BigQuery data objects that Amazon AppFlow supports, see [Supported objects](#googlebigquery-objects).
+ In your Google BigQuery account, you've created an external OAuth 2.0 Google Cloud web app for Amazon AppFlow, and you've added the appropriate scopes. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For information about how to create an app, see [Building a Node.js app on App Engine](https://cloud.google.com/appengine/docs/standard/nodejs/building-app) in the Google Cloud documentation.
+ You've activated the access scopes that provide access to the data that you want to transfer. For more information about Google BigQuery scopes, see [Comply with OAuth 2.0 policies](https://developers.google.com/identity/protocols/oauth2/production-readiness/policy-compliance) in the *Google Identity Documentation*.
+ You've configured the app with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Google BigQuery. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
+ Note the client ID and client secret from the settings for your OAuth 2.0 client ID. You provide these values to Amazon AppFlow when you connect to your Google BigQuery project. 

## Connecting Amazon AppFlow to your Google BigQuery account
<a name="googlebigquery-connecting"></a>

To connect Amazon AppFlow to your Google BigQuery account, provide the client credentials from your Google Cloud web app so that Amazon AppFlow can access your data. If you haven't yet configured your Google BigQuery project for Amazon AppFlow integration, see [Before you begin](#googlebigquery-prereqs).

**To connect to Google BigQuery**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Google BigQuery**.

1. Choose **Create connection**.

1. In the **Connect to Google BigQuery** window, enter the following information:
   + **Connection name** — A name for your connection.
   + **Access type** — Specify an access type to generate a refresh token.
   + **Client ID** — The client ID in your Google Cloud web app. 
   + **Client secret** — The client secret in your Google Cloud web app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. Choose **Connect**.

1. In the window that appears, sign in to your Google BigQuery account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Google BigQuery as the data source, you can select this connection.

## API preference
<a name="googlebigquery-api-preference"></a>

When you use Google BigQuery as either the source or destination, you can configure the **Google BigQuery API preference** setting. Use this setting to specify whether Amazon AppFlow uses synchronous data transfer (for smaller transfers) or asynchronous data transfer (for larger transfers) when you run your flow.

The Amazon AppFlow console provides this setting on the **Configure flow** page under **Source details** or **Destination details**. To view it, expand the **Additional settings** section.

You can choose one of these options:
+ **Automatic (default)** — For each flow run, Amazon AppFlow selects the type of data transfer to use.
+ **Standard** — Amazon AppFlow uses only Google BigQuery synchronous data transfer. This option optimizes your flow for small to medium-sized data transfers. 
+ **Bulk** — Amazon AppFlow uses only Google BigQuery asynchronous data transfer. This option optimizes your flow for large datasets.

## Transferring data to or from Google BigQuery with a flow
<a name="googlebigquery-transfer-data"></a>

To transfer data to or from Google BigQuery, create an Amazon AppFlow flow, and choose Google BigQuery as the data source or destination. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Google BigQuery, see [Supported objects](#googlebigquery-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#googlebigquery-destinations).

## Supported destinations
<a name="googlebigquery-destinations"></a>

When you create a flow that uses Google BigQuery as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="googlebigquery-objects"></a>

When you create a flow that uses Google BigQuery as the data source, you can transfer data from any table that you've defined. Unlike other connectors, the Google BigQuery connector has no predefined entities. Instead, it discovers entities dynamically, based on the current column headers in your Google BigQuery tables.
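
Because entities are discovered dynamically, you can enumerate the available tables with the Amazon AppFlow `ListConnectorEntities` API. The following is a minimal sketch of the request parameters, assuming a hypothetical connection name; `CustomConnector` is used as the connector type on the assumption that Google BigQuery is delivered through Amazon AppFlow's custom connector framework, which you should verify for your setup. In real code, you would pass these to `boto3.client("appflow").list_connector_entities(**params)`:

```python
# Sketch: parameters for the ListConnectorEntities API call that returns the
# tables the Google BigQuery connector discovers dynamically.
# "my-bigquery-connection" is a hypothetical connection name.
params = {
    "connectorProfileName": "my-bigquery-connection",
    "connectorType": "CustomConnector",  # assumption: BigQuery uses the custom connector framework
}
print(sorted(params))
```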

# Google Calendar connector for Amazon AppFlow
<a name="connectors-google-calendar"></a>

Google Calendar is an online calendar service that helps users schedule meetings, set up events, set reminders, and share their schedules. If you're a Google Calendar user, your account contains data about your calendar, events, access controls list rules, and more. You can use Amazon AppFlow to transfer data from Google Calendar to certain AWS services or other supported applications.

## Amazon AppFlow support for Google Calendar
<a name="google-calendar-support"></a>

Amazon AppFlow supports Google Calendar as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Google Calendar.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Google Calendar.

## Before you begin
<a name="google-calendar-prereqs"></a>

To use Amazon AppFlow to transfer data from Google Calendar to supported destinations, you must meet these requirements:
+ You have a Google account that you use to sign in and use the Google Calendar app. In your Google account, Google Calendar contains the data that you want to transfer.
+ You have a Google Cloud Platform account and a Google Cloud project.
+ In your Google Cloud project, you've enabled the Google Calendar API. For the steps to enable it, see [Enable and disable APIs](https://support.google.com/googleapi/answer/6158841) in the API Console Help for Google Cloud Platform.
+ In your Google Cloud project, you've configured an OAuth consent screen for external users. For information about the OAuth consent screen, see [Setting up your OAuth consent screen](https://support.google.com/cloud/answer/10311615#) in the Google Cloud Platform Console Help.
+ In your Google Cloud project, you've configured an OAuth 2.0 client ID that meets the following requirements:
  + You've set the application type to **Web application**.
  + You've added one or more authorized redirect URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Google Calendar. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

  For the steps to create an OAuth 2.0 client ID, see [Setting up OAuth 2.0](https://support.google.com/cloud/answer/6158849?hl=en#zippy=) in the Google Cloud Platform Console Help.

Note the client ID and client secret from the settings for your OAuth 2.0 client ID. You provide these values to Amazon AppFlow when you connect to your Google Cloud project.
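The redirect URL varies only by Region code, so you can derive it mechanically. A minimal sketch in Python, using the US East (N. Virginia) Region as an example:

```python
def appflow_redirect_url(region: str) -> str:
    """Build the Amazon AppFlow OAuth redirect URL for an AWS Region code."""
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

# For the US East (N. Virginia) Region:
print(appflow_redirect_url("us-east-1"))
# https://us-east-1.console.aws.amazon.com/appflow/oauth
```

Add one such URL to your OAuth 2.0 client ID for each Region where you run Amazon AppFlow.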

## Connecting Amazon AppFlow to your Google Calendar account
<a name="google-calendar-connecting"></a>

To connect Amazon AppFlow to Google Calendar, provide the client credentials from the OAuth 2.0 client ID from your Google Cloud project. Amazon AppFlow uses these credentials to access your data. If you haven't yet configured your Google Cloud project for Amazon AppFlow integration, see [Before you begin](#google-calendar-prereqs).

**To connect to Google Calendar**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Google Calendar**.

1. Choose **Create connection**.

1. In the **Connect to Google Calendar** window, enter the following information:
   + **Access type** – Choose **offline**.
   + **Client ID** – The client ID of the OAuth 2.0 client ID in your Google Cloud project.
   + **Client secret** – The client secret of the OAuth 2.0 client ID in your Google Cloud project.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Google account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Google Calendar as the data source, you can select this connection.

## Transferring data from Google Calendar with a flow
<a name="google-calendar-transfer-data"></a>

To transfer data from Google Calendar, create an Amazon AppFlow flow, and choose Google Calendar as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Google Calendar, see [Supported objects](#google-calendar-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#google-calendar-destinations).

## Supported destinations
<a name="google-calendar-destinations"></a>

When you create a flow that uses Google Calendar as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="google-calendar-objects"></a>

When you create a flow that uses Google Calendar as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-google-calendar.html)

# Google Search Console connector for Amazon AppFlow
<a name="connectors-google-search-console"></a>

Google Search Console is a service from Google that allows website owners to optimize and manage their sites’ presence in Google Search results. If you're a Google Search Console user, your account contains data about your sites and their search traffic. You can use Amazon AppFlow to transfer data from Google Search Console to certain AWS services or other supported applications.

## Amazon AppFlow support for Google Search Console
<a name="google-search-console-support"></a>

Amazon AppFlow supports Google Search Console as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Google Search Console.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Google Search Console.

## Before you begin
<a name="google-search-console-prereqs"></a>

To use Amazon AppFlow to transfer data from Google Search Console to supported destinations, you must meet these requirements:
+ You have a Google Cloud Platform account and a Google Cloud project.
+ In Google Search Console, you have one or more verified website properties that have the data that you want to transfer. For the steps to add a property, see [Add a website property to Search Console](https://support.google.com/webmasters/answer/34592?hl=en) in the Search Console Help. For more information about the Google Search Console data objects that Amazon AppFlow supports, see [Supported objects](#google-search-console-objects).
+ In your Google Cloud project, you've enabled the Google Search Console API. For the steps to enable it, see [Enable and disable APIs](https://support.google.com/googleapi/answer/6158841) in the API Console Help for Google Cloud Platform.
+ In your Google Cloud project, you've configured an OAuth consent screen for external users that meets the following requirements:
  + You've set *amazon.com* as an authorized domain.
  + You've set the *Google Search Console API* as an authorized scope.

  For information about the OAuth consent screen, see [Setting up your OAuth consent screen](https://support.google.com/cloud/answer/10311615#) in the Google Cloud Platform Console Help.
+ In your Google Cloud project, you've configured an OAuth 2.0 client ID. For the steps to create one, see [Setting up OAuth 2.0](https://support.google.com/cloud/answer/6158849?hl=en#zippy=) in the Google Cloud Platform Console Help.

  The OAuth 2.0 client ID must have one or more authorized redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Google Search Console. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

Note the client ID and client secret from the settings for your OAuth 2.0 client ID. You provide these values to Amazon AppFlow when you connect to your Google Cloud project.

## Connecting Amazon AppFlow to your Google Search Console account
<a name="google-search-console-connecting"></a>

To connect Amazon AppFlow to your Google Search Console account, provide the client credentials from the OAuth 2.0 client ID from your Google Cloud project. Amazon AppFlow uses these credentials to access your data. If you haven't yet configured your Google Cloud project for Amazon AppFlow integration, see [Before you begin](#google-search-console-prereqs).

**To connect to Google Search Console**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Google Search Console**.

1. Choose **Create connection**.

1. In the **Connect to Google Search Console** window, enter the following information:
   + **Access type** – Choose **offline**.
   + **Client ID** – The client ID of the OAuth 2.0 client ID in your Google Cloud project.
   + **Client secret** – The client secret of the OAuth 2.0 client ID in your Google Cloud project.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**. A **Sign in with Google** window opens.

1. Choose your Google account, and sign in.

1. On the page titled **amazon.com wants to access your Google Account**, choose **Continue**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Google Search Console as the data source, you can select this connection.

## Transferring data from Google Search Console with a flow
<a name="google-search-console-transfer-data"></a>

To transfer data from Google Search Console, create an Amazon AppFlow flow, and choose Google Search Console as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Google Search Console, see [Supported objects](#google-search-console-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#google-search-console-destinations).

## Supported destinations
<a name="google-search-console-destinations"></a>

When you create a flow that uses Google Search Console as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="google-search-console-objects"></a>

When you create a flow that uses Google Search Console as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-google-search-console.html)

# Google Sheets connector for Amazon AppFlow
<a name="connectors-google-sheets"></a>

Google Sheets is a spreadsheet-based collaboration service that helps teams share data in real time across multiple devices. If you’re a Google Sheets user, your account contains data about spreadsheets, documents, slides, meetings, security, and more. You can use Amazon AppFlow to transfer data from Google Sheets to certain AWS services or other supported applications.

## Amazon AppFlow support for Google Sheets
<a name="google-sheets-support"></a>

Amazon AppFlow supports Google Sheets as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Google Sheets.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Google Sheets.

Amazon AppFlow currently supports Google Sheets API v4 and Google Drive API v3. 

## Before you begin
<a name="google-sheets-prereqs"></a>

To use Amazon AppFlow to transfer data from Google Sheets to supported destinations, you must meet these requirements:
+ You have a Google account where you sign in to use the Google Sheets app. In your Google account, Google Sheets contains the data that you want to transfer.
+ You have a Google Cloud Platform account and a Google Cloud project.
+ In your Google Cloud project, you've enabled the Google Sheets and Google Drive APIs. For the steps to enable them, see [Enable and disable APIs](https://support.google.com/googleapi/answer/6158841) in the API Console Help for Google Cloud Platform.
+ In your Google Cloud project, you've configured an OAuth consent screen for external users. For more information about the OAuth consent screen, see [Setting up your OAuth consent screen](https://support.google.com/cloud/answer/10311615#) in the Google Cloud Platform Console Help.
+ In the OAuth consent screen, you've added the following scopes:
  + The Google Sheets API read-only scope, `https://www.googleapis.com/auth/spreadsheets.readonly`.
  + The Google Drive API read-only scope, `https://www.googleapis.com/auth/drive.readonly`.

  For more information about these scopes, see [OAuth 2.0 Scopes for Google APIs](https://developers.google.com/identity/protocols/oauth2/scopes) in the Google Identity documentation.
+ In your Google Cloud project, you've configured an OAuth 2.0 client ID. For the steps to create this client ID, see [Setting up OAuth 2.0](https://support.google.com/cloud/answer/6158849?hl=en#zippy=) in the Google Cloud Platform Console Help.

  The OAuth 2.0 client ID must have one or more authorized redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Google Sheets. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
+ In addition, you've set the authorized JavaScript origins URL to the following:

  `https://region.console.aws.amazon.com`

  Like *region* in the redirect URLs, the *region* in the JavaScript origins URL is the code for the AWS Region where you use Amazon AppFlow to transfer data from Google Sheets. For example, in the US East (N. Virginia) Region, the URL is the following:

  `https://us-east-1.console.aws.amazon.com`

Note the client ID and client secret from the settings for your OAuth 2.0 client ID. You provide these values to Amazon AppFlow when you connect to your Google Cloud project.
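To see how the **offline** access type and the two read-only scopes fit together, here is a sketch of the Google OAuth 2.0 authorization URL that the consent flow uses. The client ID and Region below are hypothetical placeholders:

```python
from urllib.parse import urlencode

# Hypothetical values; substitute your own client ID and Region.
params = {
    "client_id": "YOUR_CLIENT_ID.apps.googleusercontent.com",
    "redirect_uri": "https://us-east-1.console.aws.amazon.com/appflow/oauth",
    "response_type": "code",
    "access_type": "offline",  # corresponds to the "offline" access type in the console
    "scope": " ".join([
        "https://www.googleapis.com/auth/spreadsheets.readonly",
        "https://www.googleapis.com/auth/drive.readonly",
    ]),
}

auth_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
print(auth_url)
```

The `access_type=offline` parameter is what lets Google issue a refresh token, so that Amazon AppFlow can run scheduled flows without you signing in each time.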

## Connecting Amazon AppFlow to your Google Sheets account
<a name="google-sheets-connecting"></a>

To connect Amazon AppFlow to your Google Sheets account, provide details from your Google Sheets project so that Amazon AppFlow can access your data. If you haven't yet configured your Google Sheets project for Amazon AppFlow integration, see [Before you begin](#google-sheets-prereqs).

**To connect to Google Sheets**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Google Sheets**.

1. Choose **Create connection**.

1. In the **Connect to Google Sheets** window, enter the following information:
   + **Access type** – Choose **offline**.
   + **Client ID** – The client ID of the OAuth 2.0 client ID in your Google Sheets project.
   + **Client secret** – The client secret of the OAuth 2.0 client ID in your Google Sheets project.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Google Sheets account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Google Sheets as the data source, you can select this connection.

## Transferring data from Google Sheets with a flow
<a name="google-sheets-transfer-data"></a>

To transfer data from Google Sheets, create an Amazon AppFlow flow, and choose Google Sheets as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Google Sheets, see [Supported objects](#google-sheets-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#google-sheets-destinations).

If a flow is left idle for too long, it can time out. To increase the default session time, see [Set session length for Google Cloud services](https://support.google.com/a/answer/9368756) in the Google Workspace Admin Help.

Note also that the Google Sheets API is a shared service. To keep the overall environment functioning smoothly, Google places limits on the number of read requests you’re allowed per minute. If you exceed the limit, Google Sheets will generate an error. To learn more about limits, and about how to request an increase in your limit, see [Usage limits](https://developers.google.com/sheets/api/limits) in the Google Sheets Reference.
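Clients that hit such per-minute limits typically retry with exponential backoff. The following is a generic sketch of that pattern, not AppFlow internals (AppFlow manages retries for you when it runs your flows); `call` stands in for any rate-limited read, such as a Google Sheets API request:

```python
import time

def with_backoff(call, max_attempts=5, base_delay=1.0):
    """Retry `call` with exponential backoff when it raises an exception."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error
            time.sleep(base_delay * (2 ** attempt))  # wait 1s, 2s, 4s, ...
```

Doubling the delay between attempts gives the shared service time to recover instead of compounding the overload.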

## Supported destinations
<a name="google-sheets-destinations"></a>

When you create a flow that uses Google Sheets as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="google-sheets-objects"></a>

When you create a flow that uses Google Sheets as the data source, you can transfer any of the supported data objects to supported destinations. Unlike other connectors, which support specific predefined objects, the Google Sheets connector has no predefined entities. Instead, it discovers entities dynamically, based on the current column headers in the Google Sheets spreadsheet itself.

Note that if you change or update the column headers after creating a flow, you must either update the flow by using the Amazon AppFlow update flow page or create a new flow. For information about updating a flow, see [Managing Amazon AppFlow flows](https://docs.aws.amazon.com/appflow/latest/userguide/flows-manage.html). For information about creating a new flow, see [Creating flows in Amazon AppFlow](https://docs.aws.amazon.com/appflow/latest/userguide/create-flow.html).

# HubSpot connector for Amazon AppFlow
<a name="connectors-hubspot"></a>

HubSpot is a customer relationship management (CRM) solution that supports marketing, sales, customer service, and content management. After you connect Amazon AppFlow to your HubSpot account, you can use HubSpot as a data source or destination in your flows. Run these flows to transfer data between HubSpot and AWS services or other supported applications.

## Amazon AppFlow support for HubSpot
<a name="hubspot-support"></a>

Amazon AppFlow supports HubSpot as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from HubSpot.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to HubSpot.

**Supported API versions**  
Amazon AppFlow can retrieve your data by sending requests to the following versions of the HubSpot API:  
+ v3
+ v2
+ v1

## Before you begin
<a name="hubspot-prereqs"></a>

To use Amazon AppFlow to transfer data from HubSpot to supported destinations, you must meet these requirements:
+ You have an account with HubSpot that contains the data that you want to transfer. For more information about the HubSpot data objects that Amazon AppFlow supports, see [Supported objects](#hubspot-objects).
+ You have an App Developers account with HubSpot Developers.
+ In HubSpot Developers, you've created an app for Amazon AppFlow. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create an app, see [Creating and installing apps](https://developers.hubspot.com/docs/api/creating-an-app) in the HubSpot Developers documentation.
+ You've configured your app as follows:
  + You've specified a redirect URL for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from HubSpot. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
  + You've permitted the following scopes:
    + `automation`
    + `content`
    + `crm.lists.read`
    + `crm.lists.write`
    + `crm.objects.companies.read`
    + `crm.objects.companies.write`
    + `crm.objects.contacts.read`
    + `crm.objects.contacts.write`
    + `crm.objects.custom.read`
    + `crm.objects.custom.write`
    + `crm.objects.deals.read`
    + `crm.objects.deals.write`
    + `crm.objects.owners.read`
    + `crm.schemas.custom.read`
    + `e-commerce`
    + `forms`
    + `oauth`
    + `sales-email-read`
    + `tickets`

    For more information about these scopes, see [Scopes](https://developers.hubspot.com/docs/api/working-with-oauth#scopes) in the HubSpot Developers documentation.

From your app settings, note your client ID and client secret because you specify these values in the connection settings in Amazon AppFlow.
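As a sketch of how those scopes are passed along during authorization: HubSpot's OAuth install URL takes the scopes as a space-delimited, URL-encoded list. The client ID and redirect URL below are hypothetical placeholders, and only a subset of the scopes listed above is shown for brevity:

```python
from urllib.parse import urlencode

# Subset of the required scopes, shown for brevity.
scopes = [
    "crm.objects.contacts.read",
    "crm.objects.deals.read",
    "oauth",
]

# Hypothetical values; substitute your app's client ID and your Region's redirect URL.
params = {
    "client_id": "YOUR_CLIENT_ID",
    "redirect_uri": "https://us-east-1.console.aws.amazon.com/appflow/oauth",
    "scope": " ".join(scopes),
}

install_url = "https://app.hubspot.com/oauth/authorize?" + urlencode(params)
print(install_url)
```

If a scope that Amazon AppFlow needs is missing from your app, the authorization step fails, so permit the full list above in your app settings.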

## Connecting Amazon AppFlow to your HubSpot account
<a name="hubspot-connecting"></a>

To connect Amazon AppFlow to your HubSpot account, provide details from your HubSpot Developers app so that Amazon AppFlow can access your data. If you haven't yet configured your HubSpot account for Amazon AppFlow integration, see [Before you begin](#hubspot-prereqs).

**To connect to HubSpot**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **HubSpot**.

1. Choose **Create connection**.

1. In the **Connect to HubSpot** window, provide the client credentials from your app for **Client ID** and **Client secret**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your HubSpot account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses HubSpot as the data source, you can select this connection.

## Transferring data to or from HubSpot with a flow
<a name="hubspot-transfer-data"></a>

To transfer data to or from HubSpot, create an Amazon AppFlow flow, and choose HubSpot as the data source or destination. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure a flow that uses HubSpot as the data source, you choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for HubSpot, see [Supported objects](#hubspot-objects). You also choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#hubspot-destinations).

## Supported destinations
<a name="hubspot-destinations"></a>

When you create a flow that uses HubSpot as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](#connectors-hubspot)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="hubspot-objects"></a>

When you create a flow that uses HubSpot as the data source, you can transfer any of the following data objects to supported destinations:


**HubSpot API v3**  
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-hubspot.html)


**HubSpot API v2**  
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-hubspot.html)


**HubSpot API v1**  
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-hubspot.html)

# Infor Nexus
<a name="infor-nexus"></a>

The following are the requirements and connection instructions for using Infor Nexus with Amazon AppFlow.

**Note**  
You can use Infor Nexus as a source only.

**Topics**
+ [Requirements](#infornexus-requirements)
+ [Connection instructions](#infornexus-setup)
+ [Supported destinations](#infor-nexus-destinations)
+ [Notes](#infornexus-notes)

## Requirements
<a name="infornexus-requirements"></a>
+ Amazon AppFlow uses a hash-based message authentication code (HMAC) to connect to Infor Nexus.
+ You must provide Amazon AppFlow with your access key ID, user ID, secret access key, and data key. To retrieve this information, contact your Infor Nexus administrator.
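With HMAC authentication, each request is signed with a shared secret so the server can verify the sender. The exact signing scheme that Infor Nexus expects is defined by Infor (contact your administrator); the following is only a generic HMAC-SHA256 sketch with hypothetical values:

```python
import hashlib
import hmac

# Hypothetical credentials; the real values come from your Infor Nexus administrator.
secret_access_key = b"example-secret"
message = b"GET /example/resource"

# Sign the message; the hex digest accompanies the request so the server
# can recompute it with the same secret and verify the sender.
signature = hmac.new(secret_access_key, message, hashlib.sha256).hexdigest()
print(signature)
```

Because only the two parties hold the secret, a matching digest proves both the sender's identity and that the message wasn't altered in transit.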

## Connection instructions
<a name="infornexus-setup"></a>

**To connect to Infor Nexus while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Infor Nexus** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Infor Nexus** dialog box.

   1. Under **Access Key ID**, enter your access key ID.

   1. Under **User ID**, enter your Infor Nexus user ID.

   1. Under **Secret access key**, enter your secret access key.

   1. Under **Datakey**, enter your data key.

   1. Under **Subdomain**, enter the subdomain for your instance of Infor Nexus.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Connect to Infor Nexus form with fields for access credentials and subdomain.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-infornexus-console.png)

1. You will be redirected to the Infor Nexus login page. When prompted, grant Amazon AppFlow permissions to access your Infor Nexus account.

Now that you are connected to your Infor Nexus account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#infornexus-requirements) section.
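If you prefer to script the connection setup, the same details can be supplied through the Amazon AppFlow `CreateConnectorProfile` API. The following sketch builds the request payload as the boto3 `appflow` client would accept it; the credential values and the instance URL are placeholder assumptions, so substitute the values from your Infor Nexus administrator.

```python
# Sketch: request payload for CreateConnectorProfile with an Infor Nexus source.
# Key names follow the AWS AppFlow API; all values below are placeholders.
def infor_nexus_profile_request(name: str, instance_url: str,
                                access_key_id: str, user_id: str,
                                secret_access_key: str, datakey: str) -> dict:
    return {
        "connectorProfileName": name,
        "connectorType": "Infornexus",
        "connectionMode": "Public",
        "connectorProfileConfig": {
            "connectorProfileProperties": {
                "InforNexus": {"instanceUrl": instance_url},
            },
            "connectorProfileCredentials": {
                "InforNexus": {
                    "accessKeyId": access_key_id,
                    "userId": user_id,
                    "secretAccessKey": secret_access_key,
                    "datakey": datakey,
                }
            },
        },
    }

# A boto3 client would consume this as:
#   boto3.client("appflow").create_connector_profile(**request)
request = infor_nexus_profile_request(
    "my-infor-nexus-connection",       # placeholder connection name
    "https://example.infornexus.com",  # placeholder instance URL
    "ACCESS_KEY_ID", "USER_ID", "SECRET_ACCESS_KEY", "DATAKEY",
)
```

The console steps above collect exactly these fields, so either path produces an equivalent connection.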

## Supported destinations
<a name="infor-nexus-destinations"></a>

When you create a flow that uses Infor Nexus as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Notes
<a name="infornexus-notes"></a>
+ When you use Infor Nexus as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.
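For a schedule-triggered flow, the frequency is expressed in the flow's trigger configuration. The following is a minimal sketch of that configuration as the AppFlow `CreateFlow` API would accept it; the key names follow the boto3 `appflow` client, and the exact rate-expression syntax shown is an assumption to verify against the AppFlow documentation.

```python
# Sketch: trigger configuration for a schedule-triggered flow that runs once
# per minute (the maximum frequency for an Infor Nexus source).
# The "rate(1minutes)" syntax is an assumption; confirm it in the AppFlow docs.
def minute_schedule_trigger() -> dict:
    return {
        "triggerType": "Scheduled",
        "triggerProperties": {
            "Scheduled": {
                "scheduleExpression": "rate(1minutes)",
            }
        },
    }

trigger_config = minute_schedule_trigger()
```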

# Instagram Ads connector for Amazon AppFlow
<a name="connectors-instagram-ads"></a>

Instagram Ads is an advertising solution for Instagram. If you run ads on Instagram, your account contains data about your ads, campaigns, ad images, and more. You can use Amazon AppFlow to transfer data from Instagram Ads to certain AWS services or other supported applications.

## Amazon AppFlow support for Instagram Ads
<a name="instagram-ads-support"></a>

Amazon AppFlow supports Instagram Ads as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Instagram Ads.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Instagram Ads.

## Before you begin
<a name="instagram-ads-prereqs"></a>

To use Amazon AppFlow to transfer data from Instagram Ads to supported destinations, you must meet these requirements:
+ You have an Instagram business account that you use to run your ads. For more information about the Instagram Ads data objects that Amazon AppFlow supports, see [Supported objects](#instagram-ads-objects).
+ You've connected your Instagram business account to a Facebook Page. This connection makes it possible for third-party applications like Amazon AppFlow to access your Instagram data. For the steps to connect, see [Add or Remove an Instagram Account From Your Facebook Page](https://www.facebook.com/business/help/connect-instagram-to-page) in the Meta Business Help Center.
+ You have a Meta for Developers account.
+ Your Meta for Developers account contains an app with its type set to *Business*. For information about how to create an app, see [Create an App](https://developers.facebook.com/docs/development/create-an-app) in the Meta for Developers App Development documentation.
+ Your Meta for Developers app includes the *Facebook Login* product, and you've configured the product to meet the following additional requirements:
  + Client OAuth login is enabled.
  + Web OAuth login is enabled.
  + One or more OAuth redirect URIs are present for Amazon AppFlow. Each of these URIs has the following form:

    `https://region.console.aws.amazon.com/appflow/oauth`

    In this URI, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Instagram Ads. For example, if you use Amazon AppFlow in the US East (N. Virginia) Region, the URI is `https://us-east-1.console.aws.amazon.com/appflow/oauth`.

    For the AWS Regions that Amazon AppFlow supports, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

  For more information about Facebook Login, see [Facebook Login](https://developers.facebook.com/docs/facebook-login) in the Meta For Developers documentation.
+ Your app includes the *Marketing API* product, and you use this product to manage the ads that Amazon AppFlow transfers data about.
+ You've configured your app with the following permissions:
  + `ads_management`
  + `ads_read`
  + `business_management`
  + `read_insights`

  For more information about these permissions, see [Permissions Reference](https://developers.facebook.com/docs/permissions/reference) in the Meta for Developers Graph API documentation.

  Each of these permissions must be approved for *Advanced Access* through the *App Review* process. For the steps to create an App Review submission, see [Submitting For Review]() in the Meta for Developers App Review documentation.

From the settings for your app, note the app ID and app secret. You provide these values to Amazon AppFlow when you connect to your account.

## Connecting Amazon AppFlow to Instagram Ads
<a name="instagram-ads-connecting"></a>

To connect Amazon AppFlow to Instagram Ads, provide the app credentials from your Meta for Developers app so that Amazon AppFlow can access your data. If you haven't yet configured an app for Amazon AppFlow integration, see [Before you begin](#instagram-ads-prereqs).

**To connect to Instagram Ads**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Instagram Ads**.

1. Choose **Create connection**.

1. In the **Connect to Instagram Ads** window, enter the following information:
   + **Client ID** – The app ID from your Meta for Developers app.
   + **Client secret** – The app secret from your Meta for Developers app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Instagram Ads as the data source, you can select this connection.

## Transferring data from Instagram Ads with a flow
<a name="instagram-ads-transfer-data"></a>

To transfer data from Instagram Ads, create an Amazon AppFlow flow, and choose Instagram Ads as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Instagram Ads, see [Supported objects](#instagram-ads-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#instagram-ads-destinations).

## Supported destinations
<a name="instagram-ads-destinations"></a>

When you create a flow that uses Instagram Ads as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="instagram-ads-objects"></a>

When you create a flow that uses Instagram Ads as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-instagram-ads.html)

# Intercom connector for Amazon AppFlow
<a name="connectors-intercom"></a>

Intercom is a customer engagement solution. It helps organizations learn who is using a website or product so that the organization can engage those users with targeted messages and support. If you're an Intercom user, then your account contains data about your contacts, conversations, customer segments, and more. You can use Amazon AppFlow to transfer data from Intercom to certain AWS services or other supported applications.

## Amazon AppFlow support for Intercom
<a name="intercom-support"></a>

Amazon AppFlow supports Intercom as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Intercom.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Intercom.

## Before you begin
<a name="intercom-prereqs"></a>

To use Amazon AppFlow to transfer data from Intercom to supported destinations, you must meet these requirements:
+ You have an account with Intercom that contains the data that you want to transfer. For more information about the Intercom data objects that Amazon AppFlow supports, see [Supported objects](#intercom-objects).
+ In your Intercom account, you've created an app for Amazon AppFlow. The app provides the credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create an app, see [How do I create an app?](https://www.intercom.com/help/en/articles/1827298-how-do-i-create-an-app) in the Intercom Help Center.
+ You've configured the app with a redirect URL for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Intercom. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

**Note**  
You must add the redirect URL for each AWS Region where you create connections to the list of redirect URLs in your Intercom app. If you don’t make this addition, the app defaults to the first redirect URL in the list, and your connection fails. For more information, see [Redirect URLs](https://developers.intercom.com/docs/build-an-integration/learn-more/authentication/setting-up-oauth/#redirect-urls) in the Intercom Developer Platform Help Center.

From the settings for your app, note the client ID and client secret. You provide these values to Amazon AppFlow when you connect to your Intercom account.
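When you create the connection, the console asks you to choose URLs based on the data hosting region where you use Intercom. As a hypothetical helper, the mapping can be captured in a lookup table; the host names below follow Intercom's documented regional API hosts (`api.intercom.io`, `api.eu.intercom.io`, `api.au.intercom.io`), but confirm them against the values offered in the console dropdown.

```python
# Map an Intercom data hosting region to its API base host.
# Host names are assumptions based on Intercom's regional hosting
# documentation; verify against your Intercom workspace settings.
INTERCOM_API_HOSTS = {
    "us": "https://api.intercom.io",
    "eu": "https://api.eu.intercom.io",
    "au": "https://api.au.intercom.io",
}

def intercom_api_host(region: str) -> str:
    """Return the Intercom API base host for a hosting region code."""
    return INTERCOM_API_HOSTS[region.lower()]
```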

## Connecting Amazon AppFlow to your Intercom account
<a name="intercom-connecting"></a>

To connect Amazon AppFlow to your Intercom account, provide the client credentials from your Intercom app so that Amazon AppFlow can access your data. If you haven't yet configured your Intercom account for Amazon AppFlow integration, see [Before you begin](#intercom-prereqs).

**To connect to Intercom**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Intercom**.

1. Choose **Create connection**.

1. In the **Connect to Intercom** window, enter the following information:
   + **Authorization tokens URL** – Choose the URL for the data hosting region where you use Intercom (Europe, US, or Australia).
   + **Authorization code URL** – Choose the URL for the data hosting region where you use Intercom (Europe, US, or Australia).
   + **Client ID** – The client ID from your Intercom app.
   + **Client secret** – The client secret from your Intercom app.
   + **Instance URL** – Choose the URL for the data hosting region where you use Intercom (Europe, US, or Australia).

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Intercom account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Intercom as the data source, you can select this connection.

## Transferring data from Intercom with a flow
<a name="intercom-transfer-data"></a>

To transfer data from Intercom, create an Amazon AppFlow flow, and choose Intercom as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Intercom, see [Supported objects](#intercom-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#intercom-destinations).

## Supported destinations
<a name="intercom-destinations"></a>

When you create a flow that uses Intercom as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="intercom-objects"></a>

When you create a flow that uses Intercom as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-intercom.html)

# JDBC connector for Amazon AppFlow
<a name="connectors-jdbc"></a>

Java Database Connectivity (JDBC) is a Java API that developers use to connect their applications to relational databases. JDBC is included in the Java Standard Edition from Oracle. You can use Amazon AppFlow to transfer data from a database by creating a JDBC connection. Then you can transfer the data to other databases, AWS services, or other supported applications.

## Amazon AppFlow support for JDBC
<a name="jdbc-support"></a>

Amazon AppFlow supports JDBC as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from databases through the JDBC API.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to databases through the JDBC API.

## Before you begin
<a name="jdbc-prereqs"></a>

Before you can use Amazon AppFlow to transfer data to or from a database with the JDBC connector, you must have one or more databases that support JDBC and are enabled for JDBC API access. For more information about installing the JDBC driver, see the JDBC documentation for your version of Java, such as the [JDBC Getting Started](https://docs.oracle.com/javase/tutorial/jdbc/basics/gettingstarted.html) documentation in the Oracle Java SE 8 Documentation.

From your database settings, note the endpoint name and port. You provide these values, along with your database user name and password, to Amazon AppFlow when you connect to your database.
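Before you create the connection, it can help to confirm that the database endpoint accepts TCP connections on the JDBC port from your network. A small preflight sketch using only the standard library; the endpoint and port values in the usage comment are placeholders.

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example with placeholder values: PostgreSQL usually listens on port 5432.
# is_reachable("mydb.example.com", 5432)
```

A `False` result usually points to a security group, firewall, or routing issue rather than a credentials problem.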

## Connecting Amazon AppFlow to a database through JDBC
<a name="jdbc-connecting"></a>

To connect Amazon AppFlow to your database through the JDBC API, provide details from your database settings so that Amazon AppFlow can access your data.

**To connect through JDBC**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **JDBC**.

1. Choose **Create connection**.

1. In the **Connect to JDBC** window, enter the following information:
   + **driver** — Choose **mysql** or **postgresql**, depending on the type of database that you want to connect to.
   + **hostname** — The hostname associated with the database that you're connecting to.
   + **port** — The port that is activated for JDBC access to the database.
   + **username** — The user name for a user that has access to the database.
   + **password** — The password associated with the user name.
   + **database** — The name of the database where you want to connect.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses JDBC as the data source, you can select this connection.

## Transferring data to or from a database through JDBC
<a name="jdbc-transfer-data"></a>

To transfer data to or from a database through the JDBC API, create an Amazon AppFlow flow, and choose JDBC as the data source or the data destination. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure a flow that uses the JDBC connector as a source or destination, you set the following options:
+ **connection** – The Amazon AppFlow JDBC connection that you created.
+ **API Version** – The supported JDBC API version.
+ **object** – Typically, the database schema.
+ **subobject** – Typically, the name of the database table that you want to transfer data to or from.

## Supported destinations
<a name="jdbc-destinations"></a>

When you create a flow that uses JDBC as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ JDBC
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)

# Jira Cloud connector for Amazon AppFlow
<a name="connectors-jira-cloud"></a>

Jira Cloud is a platform developed by Atlassian. The platform includes issue tracking products that help teams plan and track their agile projects. If you're a Jira Cloud user, your account contains data about your projects, such as issues, workflows, and events. You can use Amazon AppFlow to transfer your Jira Cloud data to certain AWS services or other supported applications.

## Amazon AppFlow support for Jira Cloud
<a name="jira-cloud-support"></a>

Amazon AppFlow supports Jira Cloud as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Jira Cloud.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Jira Cloud.

**Supported Jira Cloud products**  
Amazon AppFlow uses the Jira REST API to transfer data objects from the Jira Software product. It doesn't transfer objects that are unique to the other products in Jira Cloud: Jira Work Management and Jira Service Management.  
Amazon AppFlow connects only to Jira Software on Jira Cloud. Amazon AppFlow doesn't connect to the on-premises Jira Software Data Center product.

**Supported Jira API version**  
Version 2

## Before you begin
<a name="jira-cloud-prereqs"></a>

To use Amazon AppFlow to transfer data from Jira Cloud to supported destinations, you must meet these requirements:
+ You have an Atlassian account where you use the Jira Software product in Jira Cloud.
+ In the developer console for your Atlassian account, you've created an OAuth 2.0 integration app for Amazon AppFlow. This app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For more information, see [Enabling OAuth 2.0 (3LO)](https://developer.atlassian.com/cloud/jira/platform/oauth-2-3lo-apps/#enabling-oauth-2-0--3lo-) in the Atlassian Developer documentation.

  You must configure your app as follows:
  + In the authorization settings, you've specified a callback URL for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Jira Cloud. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
  + In the distribution settings, you've set the distribution status to **Sharing**.
  + In the permissions settings, you've added the Jira API, and you've enabled the recommended scopes below.

In the settings for your app, note the client ID and client secret because you need them to create a connection in Amazon AppFlow.

### Recommended scopes
<a name="jira-cloud-prereqs-scopes"></a>

Before Amazon AppFlow can securely access your data in Jira Cloud, the permissions settings for your OAuth 2.0 integration app must allow the necessary scopes for the Jira API. We recommend that you enable the scopes below so that Amazon AppFlow can access all supported data objects.

If you want to allow fewer scopes, you can omit any scopes that apply to objects that you don't want to transfer.

You can add scopes to your app by managing permissions in the Atlassian Developer console.
+ Under **Jira platform REST API** scopes, we recommend that you add all scopes.
+ Under **Granular scopes**, we recommend that you add the following scopes:
  + `read:application-role:jira`
  + `read:audit-log:jira`
  + `read:avatar:jira`
  + `read:field:jira`
  + `read:group:jira`
  + `read:instance-configuration:jira`
  + `read:issue-details:jira`
  + `read:issue-event:jira`
  + `read:issue-link-type:jira`
  + `read:issue-meta:jira`
  + `read:issue-security-level:jira`
  + `read:issue-security-scheme:jira`
  + `read:issue-type-scheme:jira`
  + `read:issue-type-screen-scheme:jira`
  + `read:issue-type:jira`
  + `read:issue.time-tracking:jira`
  + `read:label:jira`
  + `read:notification-scheme:jira`
  + `read:permission:jira`
  + `read:priority:jira`
  + `read:project:jira`
  + `read:project-category:jira`
  + `read:project-role:jira`
  + `read:project-type:jira`
  + `read:project-version:jira`
  + `read:project.component:jira`
  + `read:project.property:jira`
  + `read:resolution:jira`
  + `read:screen:jira`
  + `read:status:jira`
  + `read:user:jira`
  + `read:workflow-scheme:jira`
  + `read:workflow:jira`
  + `read:field-configuration:jira`
  + `read:issue-type-hierarchy:jira`
  + `read:webhook:jira`

## Connecting Amazon AppFlow to your Jira Cloud account
<a name="jira-cloud-connecting"></a>

To connect Amazon AppFlow to your Jira Cloud account, provide details from your OAuth 2.0 integration app so that Amazon AppFlow can access your data. If you haven't yet configured your Jira Cloud account for Amazon AppFlow integration, see [Before you begin](#jira-cloud-prereqs).

**To connect to Jira Cloud**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Jira Cloud**.

1. Choose **Create connection**.

1. In the **Connect to Jira Cloud** window, enter the following information:
   + **Client ID** – The client ID from the OAuth 2.0 integration app.
   + **Client secret** – The client secret from the OAuth 2.0 integration app.
   + **Jira Cloud Domain URL** – The URL where you sign in to your Jira Cloud account, for example, `https://your-account.atlassian.net`.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**. A window appears that asks if you want to allow Amazon AppFlow to access your Atlassian account.

1. Choose **Accept**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Jira Cloud as the data source, you can select this connection.

## Transferring data from Jira Cloud with a flow
<a name="jira-cloud-transfer-data"></a>

To transfer data from Jira Cloud, create an Amazon AppFlow flow, and choose Jira Cloud as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Jira Cloud, see [Supported objects](#jira-cloud-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#jira-cloud-destinations).

## Supported objects
<a name="jira-cloud-objects"></a>

When you create a flow that uses Jira Cloud as the data source, you can transfer any of the following data objects to supported destinations:


To retrieve your data, Amazon AppFlow queries the following endpoints in the Jira REST API. Each of the following paths is appended to the base URI `https://your-account.atlassian.net/rest/api/2`.

| Object | Jira API endpoint |
| --- | --- |
|  Audit Record  | /auditing/record | 
|  Groups  | /group/bulk | 
|  Issue  | /search | 
|  Issue Events  | /events | 
|  Issue Fields  | /field | 
|  Issue Field Configurations  | /fieldconfiguration | 
|  Issue Link Type  | /issueLinkType | 
|  Issue Notification Schemes  | /notificationscheme | 
|  Issue Priority  | /priority | 
|  Issue Resolution  | /resolution | 
|  Issue Security Scheme  | /issuesecurityschemes | 
|  Issue Type  | /issuetype | 
|  Issue Type Scheme  | /issuetypescheme | 
|  Issue Type Screen Scheme  | /issuetypescreenscheme | 
|  Jira Settings  | /application-properties | 
|  Jira Settings Advanced  | /application-properties/advanced-settings | 
|  Jira Settings Global  | /configuration | 
|  Label  | /label | 
|  Myself  | /myself | 
|  Permission  | /mypermissions | 
|  Project  | /project/search | 
|  Project Category  | /projectCategory | 
|  Project Type  | /project/type | 
|  Server Info  | /serverInfo | 
|  User  | /users | 
|  Workflow  | /workflow | 
|  Workflow Scheme  | /workflowscheme | 
|  Workflow Scheme Project Association  | /workflowscheme/project | 
|  Workflow Status  | /status | 
|  Workflow Status Category  | /statuscategory | 

For more information about these objects, see the [Jira REST API v2](https://developer.atlassian.com/cloud/jira/platform/rest/v2/intro/) documentation.
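Because every endpoint in the table is relative to the same base URI, the full request URL is a simple join of your account subdomain and the path. A minimal sketch, with `your-account` as a placeholder subdomain:

```python
JIRA_API_BASE = "https://{account}.atlassian.net/rest/api/2"

def jira_endpoint_url(account: str, path: str) -> str:
    """Join a Jira Cloud account subdomain with an API path from the table above."""
    return JIRA_API_BASE.format(account=account) + path

# For example, the Server Info object maps to:
url = jira_endpoint_url("your-account", "/serverInfo")
# -> "https://your-account.atlassian.net/rest/api/2/serverInfo"
```

This can be useful for spot-checking an endpoint's response in a REST client before you configure a flow.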

## Supported destinations
<a name="jira-cloud-destinations"></a>

When you create a flow that uses Jira Cloud as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Kustomer connector for Amazon AppFlow
<a name="connectors-kustomer"></a>

Kustomer is a Customer Relationship Management (CRM) service that helps companies create and maintain operational solutions with customers. If you’re a Kustomer user, your account contains customer data across a number of digital channels. You can use Amazon AppFlow to transfer data from Kustomer to certain AWS services or other supported applications.

## Amazon AppFlow support for Kustomer
<a name="kustomer-support"></a>

Amazon AppFlow supports Kustomer as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Kustomer.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Kustomer.

## Before you begin
<a name="kustomer-prereqs"></a>

To use Amazon AppFlow to transfer data from Kustomer to supported destinations, you must meet these requirements:
+ You have an account with Kustomer that contains the data that you want to transfer. For more information about the Kustomer data objects that Amazon AppFlow supports, see [Supported objects](#kustomer-objects).
+ In the API keys settings for your account, you've created an API key for Amazon AppFlow, and you have the token value. Amazon AppFlow uses the API key token to make authenticated calls to your account and securely access your data. For the steps to create a key, see [API keys](https://help.kustomer.com/api-keys-SJs5YTIWX) in the Kustomer Help Center.

To connect Amazon AppFlow to your Kustomer account, you provide the token of your API key. You can view and copy this token only when you create the API key. If you don't have the token value, create a new API key.
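To illustrate how a Bearer-token call against your account looks, here is a minimal sketch of building an authenticated Kustomer REST request. The Bearer scheme and the endpoint path are assumptions based on common REST conventions, not details from this guide:

```python
import urllib.request

def kustomer_request(instance_url: str, api_token: str, path: str) -> urllib.request.Request:
    # Builds an authenticated request the way a Kustomer API client would.
    # Assumption: standard Bearer-token scheme; the path below is illustrative.
    return urllib.request.Request(
        instance_url.rstrip("/") + path,
        headers={"Authorization": f"Bearer {api_token}"},
    )

req = kustomer_request("https://example.api.kustomerapp.com", "MY_TOKEN", "/v1/customers")
print(req.full_url)                     # https://example.api.kustomerapp.com/v1/customers
print(req.get_header("Authorization"))  # Bearer MY_TOKEN
```

This is the same kind of authenticated call that Amazon AppFlow makes on your behalf with the token you provide.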

## Connecting Amazon AppFlow to your Kustomer account
<a name="kustomer-connecting"></a>

To connect Amazon AppFlow to your Kustomer account, provide details from your Kustomer project so that Amazon AppFlow can access your data. If you haven't yet configured your Kustomer project for Amazon AppFlow integration, see [Before you begin](#kustomer-prereqs).

**To connect to Kustomer**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Kustomer**.

1. Choose **Create connection**.

1. In the **Connect to Kustomer** window, enter the following information:
   + **Access token** – The access token that you created earlier.
   + **Instance URL** – The URL of the instance where you want to run the operation, for example, `https://domain.api.kustomerapp.com`.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Kustomer as the data source, you can select this connection.

## Transferring data from Kustomer with a flow
<a name="kustomer-transfer-data"></a>

To transfer data from Kustomer, create an Amazon AppFlow flow, and choose Kustomer as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Kustomer, see [Supported objects](#kustomer-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#kustomer-destinations).

## Supported destinations
<a name="kustomer-destinations"></a>

When you create a flow that uses Kustomer as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="kustomer-objects"></a>

When you create a flow that uses Kustomer as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-kustomer.html)

# LinkedIn Ads connector for Amazon AppFlow
<a name="connectors-linkedin-ads"></a>

LinkedIn Ads is an ad platform that helps organizations and brands to reach audiences throughout the user community of professionals on LinkedIn. If you use LinkedIn Ads, your account contains data about your ads and campaigns. You can use Amazon AppFlow to transfer data from LinkedIn Ads to certain AWS services or other supported applications.

## Amazon AppFlow support for LinkedIn Ads
<a name="linkedin-ads-support"></a>

Amazon AppFlow supports LinkedIn Ads as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from LinkedIn Ads.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to LinkedIn Ads.

**Supported API version**  
Amazon AppFlow retrieves your LinkedIn Ads data by sending requests to version 202509 of the LinkedIn API.

## Before you begin
<a name="linkedin-ads-prereqs"></a>

To use Amazon AppFlow to transfer data from LinkedIn Ads to supported destinations, you must meet these requirements:
+ You have a LinkedIn account and a LinkedIn Page. For the steps to create a page, see [Create a LinkedIn Page](https://www.linkedin.com/help/linkedin/answer/a543852/create-a-linkedin-page?lang=en) on LinkedIn Help.
+ In LinkedIn Developers, you've created an app, and you've configured it with the following settings:
  + The app is associated with your LinkedIn Page.
  + The app includes the Marketing Developer Platform product.
  + The app Auth settings have one or more redirect URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from LinkedIn Ads. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
+ From your LinkedIn account, you've created a LinkedIn Campaign Manager account, which you use to manage your ads on LinkedIn. For the steps to create an account, see [Create an ad account in Campaign Manager as a new advertiser](https://www.linkedin.com/help/linkedin/answer/a426102/create-an-ad-account-in-campaign-manager-as-a-new-advertiser?lang=en) on LinkedIn Help.

From the Auth settings for your app, note the client ID and client secret. You provide these values to Amazon AppFlow when you connect to LinkedIn Ads.
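If you use Amazon AppFlow in several AWS Regions, you must register one redirect URL per Region. The pattern is regular enough to generate rather than type by hand; a minimal sketch:

```python
def appflow_oauth_redirect_url(region: str) -> str:
    # The redirect URL registered in your LinkedIn Developers app embeds
    # the code of the AWS Region where you use the Amazon AppFlow console.
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

for region in ["us-east-1", "eu-west-1"]:
    print(appflow_oauth_redirect_url(region))
```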

## Connecting Amazon AppFlow to LinkedIn Ads
<a name="linkedin-ads-connecting"></a>

To connect Amazon AppFlow to LinkedIn Ads, provide the client credentials from your LinkedIn Developers app so that Amazon AppFlow can access your data. If you haven't yet configured your LinkedIn account for Amazon AppFlow integration, see [Before you begin](#linkedin-ads-prereqs).

**To connect to LinkedIn Ads**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **LinkedIn Ads**.

1. Choose **Create connection**.

1. In the **Connect to LinkedIn Ads** window, enter the following information:
   + **Client ID** – The client ID from the Auth settings of your LinkedIn Developers app.
   + **Client secret** – The client secret from the Auth settings of your LinkedIn Developers app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your LinkedIn account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses LinkedIn Ads as the data source, you can select this connection.

## Transferring data from LinkedIn Ads with a flow
<a name="linkedin-ads-transfer-data"></a>

To transfer data from LinkedIn Ads, create an Amazon AppFlow flow, and choose LinkedIn Ads as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for LinkedIn Ads, see [Supported objects](#linkedin-ads-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#linkedin-ads-destinations).

## Supported destinations
<a name="linkedin-ads-destinations"></a>

When you create a flow that uses LinkedIn Ads as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="linkedin-ads-objects"></a>

When you create a flow that uses LinkedIn Ads as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-linkedin-ads.html)

# LinkedIn Pages connector for Amazon AppFlow
<a name="connectors-linkedin-pages"></a>

LinkedIn Pages is a solution for organizations to post industry updates, job opportunities, and information. If you're a LinkedIn Pages user, your account contains data about your pages, followers, and engagement. You can use Amazon AppFlow to transfer data from LinkedIn Pages to certain AWS services or other supported applications.

## Amazon AppFlow support for LinkedIn Pages
<a name="linkedin-pages-support"></a>

Amazon AppFlow supports LinkedIn Pages as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from LinkedIn Pages.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to LinkedIn Pages.

**Supported API version**  
Amazon AppFlow retrieves your LinkedIn Pages data by sending requests to version 202212 of the LinkedIn API.

## Before you begin
<a name="linkedin-pages-prereqs"></a>

To use Amazon AppFlow to transfer data from LinkedIn Pages to supported destinations, you must meet these requirements:
+ You have a LinkedIn account and a LinkedIn Page. For the steps to create a page, see [Create a LinkedIn Page](https://www.linkedin.com/help/linkedin/answer/a543852/create-a-linkedin-page?lang=en) on LinkedIn Help.
+ In LinkedIn Developers, you've created an app, and you've configured it as follows:
  + The app is associated with your LinkedIn Page.
  + The app includes the Marketing Developer Platform product.
  + The app Auth settings include one or more redirect URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from LinkedIn Pages. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

From the Auth settings for your app, note the client ID and client secret. You provide these values to Amazon AppFlow when you connect to LinkedIn Pages.

## Connecting Amazon AppFlow to LinkedIn Pages
<a name="linkedin-pages-connecting"></a>

To connect Amazon AppFlow to LinkedIn Pages, provide the client credentials from your LinkedIn Developers app so that Amazon AppFlow can access your data. If you haven't yet configured your LinkedIn account for Amazon AppFlow integration, see [Before you begin](#linkedin-pages-prereqs).

**To connect to LinkedIn Pages**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **LinkedIn Pages**.

1. Choose **Create connection**.

1. In the **Connect to LinkedIn Pages** window, enter the following information:
   + **Client ID** – The client ID from the Auth settings of your LinkedIn Developers app.
   + **Client secret** – The client secret from the Auth settings of your LinkedIn Developers app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your LinkedIn account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses LinkedIn Pages as the data source, you can select this connection.

## Transferring data from LinkedIn Pages with a flow
<a name="linkedin-pages-transfer-data"></a>

To transfer data from LinkedIn Pages, create an Amazon AppFlow flow, and choose LinkedIn Pages as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for LinkedIn Pages, see [Supported objects](#linkedin-pages-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#linkedin-pages-destinations).

## Supported destinations
<a name="linkedin-pages-destinations"></a>

When you create a flow that uses LinkedIn Pages as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="linkedin-pages-objects"></a>

When you create a flow that uses LinkedIn Pages as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-linkedin-pages.html)

# Mailchimp connector for Amazon AppFlow
<a name="connectors-mailchimp"></a>

Mailchimp is a marketing automation platform and email marketing service. If you're a Mailchimp user, your account contains data about your email campaigns, such as open and click details, segments, and automations. You can use Amazon AppFlow to transfer data from Mailchimp to certain AWS services or other supported applications.

## Amazon AppFlow support for Mailchimp
<a name="mailchimp-support"></a>

Amazon AppFlow supports Mailchimp as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Mailchimp.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Mailchimp.



## Before you begin
<a name="mailchimp-prereqs"></a>

To use Amazon AppFlow to transfer data from Mailchimp to supported destinations, you must meet these requirements:
+ You have an account with Mailchimp that contains the data that you want to transfer. For more information about the Mailchimp data objects that Amazon AppFlow supports, see [Supported objects](#mailchimp-objects).
+ In your account, you've created an API key. For the steps to create one, see [About API Keys](https://mailchimp.com/help/about-api-keys/) in the Mailchimp Help Center.

Note the API key from your account settings. You provide it to Amazon AppFlow when you connect to your Mailchimp account.

## Connecting Amazon AppFlow to your Mailchimp account
<a name="mailchimp-connecting"></a>

To connect Amazon AppFlow to your Mailchimp account, provide your API key so that Amazon AppFlow can access your data. If you haven't yet configured your Mailchimp account for Amazon AppFlow integration, see [Before you begin](#mailchimp-prereqs).

**To connect to Mailchimp**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Mailchimp**.

1. Choose **Create connection**.

1. In the **Connect to Mailchimp** window, enter the following information:
   + **API Key** – The API key from your Mailchimp account settings.
   + **Instance URL** – The Mailchimp Marketing API URL that provides access to your Mailchimp data. These URLs have the form `https://data-center.api.mailchimp.com`, where *data-center* is the data center for your account. For more information, see *API structure* in the Mailchimp Marketing API documentation.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Mailchimp as the data source, you can select this connection.
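Mailchimp API keys end with a data-center suffix (for example, `-us19`), and that data center is the host prefix of the instance URL you enter in step 5. A minimal sketch of deriving the URL from the key, assuming the standard `key-dcN` format:

```python
def mailchimp_instance_url(api_key: str) -> str:
    # The data center is the suffix after the last "-" in the API key
    # (e.g. "us19"); it becomes the host prefix of the Marketing API URL.
    data_center = api_key.rsplit("-", 1)[-1]
    return f"https://{data_center}.api.mailchimp.com"

print(mailchimp_instance_url("0123456789abcdef-us19"))
# https://us19.api.mailchimp.com
```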

## Transferring data from Mailchimp with a flow
<a name="mailchimp-transfer-data"></a>



To transfer data from Mailchimp, create an Amazon AppFlow flow, and choose Mailchimp as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Mailchimp, see [Supported objects](#mailchimp-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#mailchimp-destinations).

## Supported destinations
<a name="mailchimp-destinations"></a>

When you create a flow that uses Mailchimp as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="mailchimp-objects"></a>

When you create a flow that uses Mailchimp as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mailchimp.html)

# Marketo
<a name="marketo"></a>

The following are the requirements and connection instructions for using Marketo with Amazon AppFlow.

**Note**  
You can use Marketo as a source or destination.

**Topics**
+ [Requirements](#marketo-requirements)
+ [Connection instructions](#marketo-setup)
+ [Notes](#marketo-notes)
+ [Supported destinations](#marketo-destinations)
+ [Related resources](#marketo-resources)

## Requirements
<a name="marketo-requirements"></a>

You must provide Amazon AppFlow with your client ID and client secret. For more information about how to retrieve your client ID and client secret, see [Credentials for API Access](https://docs.marketo.com/display/public/DOCS/Create+a+Custom+Service+for+Use+with+ReST+API#CreateaCustomServiceforUsewithReSTAPI-CredentialsforAPIAccess) in the Marketo documentation.

## Connection instructions
<a name="marketo-setup"></a>

**To connect to Marketo while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings**. Then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag**, and then enter the key name and value.

1. Choose **Next**.

1. Choose **Marketo** from the **Source name** or **Destination name** dropdown list.

1. Choose **Connect** to open the **Connect to Marketo** dialog box.

   1. Under **Client ID**, enter your Marketo client ID.

   1. Under **Client secret**, enter your client secret.

   1. Under **Account/Munchkin ID**, specify the unique part of the base URL or endpoint assigned to your Marketo account.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Marketo connection form with fields for client ID, secret, account ID, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-marketo-console.png)

1. You will be redirected to the Marketo login page. When prompted, grant Amazon AppFlow permissions to access your Marketo account.

Now that you are connected to your Marketo account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in [Requirements](#marketo-requirements).
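Behind the scenes, Marketo's REST API exchanges the client ID and client secret for a short-lived access token at an identity endpoint derived from the Munchkin ID. A hedged sketch of how such a token-request URL is formed (the endpoint shape follows Marketo's public REST documentation; the values are placeholders):

```python
from urllib.parse import urlencode

def marketo_token_url(munchkin_id: str, client_id: str, client_secret: str) -> str:
    # Marketo hosts its identity service at <munchkin-id>.mktorest.com and
    # issues access tokens for the client-credentials OAuth grant.
    query = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return f"https://{munchkin_id}.mktorest.com/identity/oauth/token?{query}"

print(marketo_token_url("123-ABC-456", "my-client-id", "my-secret"))
```

Amazon AppFlow performs this exchange for you; the sketch only shows why the connector needs all three values.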

## Notes
<a name="marketo-notes"></a>
+ When you use Marketo as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per hour.
+ Depending on your instance, Marketo might queue requests for data extraction. This can result in longer flow run times. If you want to avoid queueing, contact your Marketo administrator for assistance. We recommend that you avoid running concurrent flows with Marketo unless your use case requires it.
+ Depending on your Marketo instance, you can submit more than one bulk import request (with limitations). Each request is added as a job to be processed in a First-In-First-Out (FIFO) queue. A maximum of two jobs are processed at the same time. A maximum of ten jobs are allowed in the queue at any given time, including the two currently being processed. If you exceed the ten job maximum, a `1016: Too many imports` error is returned. If you want to avoid queueing, contact your Marketo administrator for assistance.
+ There is a soft quota of 1 GB per flow when extracting data from Marketo. If you need to process more records in a single flow, you can submit a request to Amazon AppFlow through the Amazon AppFlow support channel. For more information, see [Creating a support case](https://docs.aws.amazon.com/awssupport/latest/user/case-management.html#creating-a-support-case) in the *AWS Support User Guide*.
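The bulk-import queueing behavior described above can be modeled in a few lines (an illustrative model only, not Marketo code):

```python
from collections import deque

class BulkImportQueue:
    # Models Marketo's bulk import queue as described in the docs: FIFO,
    # at most 10 jobs in the queue (including the 2 being processed);
    # extra submissions fail with error 1016.
    MAX_JOBS = 10
    MAX_CONCURRENT = 2

    def __init__(self):
        self.jobs = deque()

    def submit(self, job):
        if len(self.jobs) >= self.MAX_JOBS:
            raise RuntimeError("1016: Too many imports")
        self.jobs.append(job)

    def processing(self):
        # The two oldest jobs run at the same time.
        return list(self.jobs)[: self.MAX_CONCURRENT]

q = BulkImportQueue()
for i in range(10):
    q.submit(f"job-{i}")
print(q.processing())  # ['job-0', 'job-1']
```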

## Supported destinations
<a name="marketo-destinations"></a>

When you create a flow that uses Marketo as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="marketo-resources"></a>
+ [Credentials for API Access](https://docs.marketo.com/display/public/DOCS/Create+a+Custom+Service+for+Use+with+ReST+API#CreateaCustomServiceforUsewithReSTAPI-CredentialsforAPIAccess) in the Marketo documentation
+ [API Limits with Marketo](https://developers.marketo.com/rest-api/marketo-integration-best-practices/) in the Marketo documentation
+ [Error Codes with Marketo](https://developers.marketo.com/rest-api/error-codes/) in the Marketo documentation
+ Introduction to the Marketo Connector in Amazon AppFlow  


# Microsoft Dynamics 365 connector for Amazon AppFlow
<a name="connectors-microsoft-dynamics-365"></a>

Microsoft Dynamics 365 is a portfolio of business applications for enterprise resource planning (ERP) and customer relationship management (CRM). If you're a Microsoft Dynamics 365 user, your account contains data about your business, such as your products, customers, business units, and more. You can use Amazon AppFlow to transfer data from Microsoft Dynamics 365 to certain AWS services or other supported applications.

## Amazon AppFlow support for Microsoft Dynamics 365
<a name="microsoft-dynamics-365-support"></a>

Amazon AppFlow supports Microsoft Dynamics 365 as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Microsoft Dynamics 365.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Microsoft Dynamics 365.

## Before you begin
<a name="microsoft-dynamics-365-prereqs"></a>

To use Amazon AppFlow to transfer data from Microsoft Dynamics 365 to supported destinations, you must meet these requirements:
+ You have a Microsoft account, and you've used it to sign up for Microsoft Dynamics 365. Your Microsoft Dynamics 365 account contains the data that you want to transfer.
+ In the Microsoft Azure portal, you've created an app registration for Amazon AppFlow. The registered app provides the client credentials that authenticate Amazon AppFlow when it accesses the data in your account. For the steps to register an app, see [Register an application with the Microsoft identity platform](https://learn.microsoft.com/en-us/graph/auth-register-app-v2) in the Microsoft Graph documentation.
+ You've configured your registered app with the following settings:
  + In the authentication settings, you've added a platform, and you've set the platform application type to *web*. You've configured the platform with a redirect URL for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Microsoft Dynamics 365. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

    For the steps to add a platform and set the redirect URL, see [Add a redirect URI](https://learn.microsoft.com/en-us/graph/auth-register-app-v2#add-a-redirect-uri) in the Microsoft Graph documentation.
  + You've created a client secret. For the steps to create one, see [Add a client secret](https://learn.microsoft.com/en-us/graph/auth-register-app-v2#add-a-client-secret) in the Microsoft Graph documentation.
**Notes**  
When you connect Amazon AppFlow to your Microsoft Dynamics 365 account, you provide the client secret *value*. You don't provide the client secret *ID*.
At the time that you create the client secret, you must store its value somewhere that you can access later. After you leave the page where you create the client secret, Microsoft Azure never shows the value again.
  + In the app manifest, you've edited the following attributes to have a value of `true`:
    + `"allowPublicClient": true,`
    + `"oauth2AllowIdTokenImplicitFlow": true,`
    + `"oauth2AllowImplicitFlow": true,`

    For more information about these attributes, and for the steps to configure the app manifest, see [Azure Active Directory app manifest](https://learn.microsoft.com/en-us/azure/active-directory/develop/reference-app-manifest) in the Microsoft identity platform documentation.
  + In the API permissions settings, you've set the following configurations:
    + The app permits the `user_impersonation` permission for the Dynamics CRM API.
    + The app permits the `User.Read` permission for the Microsoft Graph API. For information about this permission, see the [Microsoft Graph permissions reference](https://learn.microsoft.com/en-us/graph/permissions-reference) in the Microsoft Graph documentation.
    + You've turned on the option to grant admin consent. For more information, see [Admin consent](https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-permissions-and-consent#admin-consent) in the Microsoft identity platform documentation.

Note the following values because you'll need them when you connect Amazon AppFlow to your Microsoft Dynamics 365 account:
+ The application (client) ID of your registered app.
+ The directory (tenant) ID of your registered app.
+ The client secret value (not the client secret ID) of your registered app.
+ The service root URL of your Dynamics 365 instance. You can find this value in the **Developer Resources** page in the Dynamics 365 web application. For information on how to access this page, see [Developer resources page](https://learn.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/developer-resources-page?view=op-9-1) in the Dynamics 365 documentation.

  The service root URL has the following format:

  ```
  https://instance-id.api.crm.dynamics.com/api/data/v9.2/
  ```

  You don't provide this URL to Amazon AppFlow directly. Instead, you provide segments of it for the fields **Custom authorization code URL** and **Instance URL**. 
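As an illustration, both field values can be derived from the service root URL with a few lines of Python. This is a hypothetical helper (the function name and the `instance-id` host are placeholders, not part of Amazon AppFlow):

```python
from urllib.parse import urlparse

def dynamics_connection_fields(service_root_url: str) -> dict:
    """Split a Dynamics 365 service root URL into the two values
    that the Amazon AppFlow console asks for."""
    host = urlparse(service_root_url).netloc
    return {
        # Custom authorization code URL: the host segment only
        "custom_authorization_code_url": host,
        # Instance URL: the scheme plus the host segment
        "instance_url": f"https://{host}",
    }

fields = dynamics_connection_fields(
    "https://instance-id.api.crm.dynamics.com/api/data/v9.2/"
)
print(fields["custom_authorization_code_url"])  # instance-id.api.crm.dynamics.com
print(fields["instance_url"])                   # https://instance-id.api.crm.dynamics.com
```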

## Connecting Amazon AppFlow to your Microsoft Dynamics 365 account
<a name="microsoft-dynamics-365-connecting"></a>

To connect Amazon AppFlow to Microsoft Dynamics 365, provide details from your registered app in Microsoft Azure so that Amazon AppFlow can access your data. If you haven't yet configured your Microsoft account for Amazon AppFlow integration, see [Before you begin](#microsoft-dynamics-365-prereqs).

**To connect to Microsoft Dynamics 365**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Microsoft Dynamics 365**.

1. Choose **Create connection**.

1. In the **Connect to Microsoft Dynamics 365** window, enter the following information:
   + **Custom authorization code URL** — From your service root URL, the segment `instance-id.api.crm.dynamics.com`.
   + **Client ID** — The application (client) ID of your registered app.
   + **Client secret** — The client secret value (not the client secret ID) of your registered app.
   + **Instance URL** — From your service root URL, the segment `https://instance-id.api.crm.dynamics.com`.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Microsoft Dynamics 365 account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Microsoft Dynamics 365 as the data source, you can select this connection.

## Transferring data from Microsoft Dynamics 365 with a flow
<a name="microsoft-dynamics-365-transfer-data"></a>

To transfer data from Microsoft Dynamics 365, create an Amazon AppFlow flow, and choose Microsoft Dynamics 365 as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

## Supported destinations
<a name="microsoft-dynamics-365-destinations"></a>

When you create a flow that uses Microsoft Dynamics 365 as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Microsoft SharePoint Online connector for Amazon AppFlow
<a name="connectors-microsoft-sharepoint-online"></a>

Microsoft SharePoint Online is a collaboration solution that teams use to share files, data, and other resources throughout their organization. If you're a SharePoint user, you have sites with document libraries that contain various types of documents, like PDFs, Microsoft Word documents, Microsoft Excel files, and more. You can use Amazon AppFlow to transfer these documents to Amazon S3. When you run a transfer, Amazon AppFlow also provides a file with descriptive metadata for each document.

## Amazon AppFlow support for Microsoft SharePoint Online
<a name="microsoft-sharepoint-online-support"></a>

Amazon AppFlow supports Microsoft SharePoint Online as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer documents and metadata from Microsoft SharePoint Online.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Microsoft SharePoint Online.

**Supported destination for SharePoint Online data**  
You can transfer only to Amazon S3.

**Supported SharePoint products**  
Amazon AppFlow connects only to Microsoft SharePoint Online. It doesn't connect to the on-premises SharePoint Server product.

## Before you begin
<a name="microsoft-sharepoint-online-prereqs"></a>

To use Amazon AppFlow to transfer data from Microsoft SharePoint Online to supported destinations, you must meet the following requirements:
+ You have a Microsoft account where you've signed up for Microsoft SharePoint Online. Your SharePoint account must have at least one site with a document library. The document library must have the documents that you want to transfer.
+ You have your Azure AD tenant ID. You provide this ID to Amazon AppFlow when you connect to your Microsoft SharePoint Online account. For the steps to look up the ID, see [Find your Azure AD tenant](https://learn.microsoft.com/en-us/azure/azure-portal/get-subscription-tenant-id#find-your-azure-ad-tenant) in the Azure portal documentation.

If you meet those requirements, you're ready to create a connection between Amazon AppFlow and your SharePoint account. No additional steps are necessary in your Microsoft account because Amazon AppFlow fulfills the remaining requirements with an *AWS managed client app*.

### The AWS managed client app for SharePoint
<a name="microsoft-sharepoint-online-managed-client"></a>

The AWS managed client app for SharePoint simplifies the connection setup. If you use it, you don't have to provide the OAuth 2.0 credentials of a client ID and client secret. To get those credentials, you would have to create an app registration in Microsoft Azure. Instead, the only information that you must get from your Microsoft account is your Azure tenant ID. To create the connection, you provide the tenant ID and, when Amazon AppFlow prompts you, you sign in to your Microsoft account and authorize Amazon AppFlow to access your SharePoint data.

Alternatively, you can choose to create a connection that uses OAuth 2.0 credentials from your own app registration instead of the AWS managed client app. This option is more complicated, but it gives you more control over the credentials. For example, you could use Microsoft Azure to change the credentials, revoke them, or manage who can access them.

### Requirements for using your own app registration (optional)
<a name="optional-app-registration"></a>

If you want to authorize Amazon AppFlow with the OAuth 2.0 credentials from your own app registration, you must meet these requirements:
+ In the Microsoft Azure portal, you've created an app registration for Amazon AppFlow. For the steps to register an app, see [Register an application with the Microsoft identity platform](https://learn.microsoft.com/en-us/graph/auth-register-app-v2) in the Microsoft Graph documentation.
+ You've configured your registered app as follows:
  + You've added one or more redirect URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Microsoft SharePoint Online. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
  + You've added the recommended permissions.
  + You've created a client secret.

Note the following values from your registered app because you provide them to Amazon AppFlow when you connect to your SharePoint account:
+ Application (client) ID
+ Client secret

#### Recommended permissions for the app registration
<a name="microsoft-sharepoint-online-permissions"></a>

Before Amazon AppFlow can securely access your data in Microsoft SharePoint Online, your registered app must allow the necessary permissions for the Microsoft Graph API. We recommend that you allow the following permissions so that Amazon AppFlow can access all supported resources.

You can add permissions to your registered app by using the API permissions page in the Microsoft Azure portal. Configure your permissions as follows:
+ Under **Microsoft APIs**, choose **Microsoft Graph**.
+ For the permissions type, choose **delegated**. For information about delegated permissions, see [Permission types](https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-permissions-and-consent#using-the-admin-consent-endpoint) in the Microsoft identity platform documentation.
+ Add the following recommended permissions:
  + `offline_access`
  + `Sites.Read.All`
  + `User.Read`

  For information about these permissions, see the [Microsoft Graph permissions reference](https://learn.microsoft.com/en-us/graph/permissions-reference) in the Microsoft Graph documentation.

## Connecting Amazon AppFlow to your Microsoft SharePoint Online account
<a name="microsoft-sharepoint-online-connecting"></a>

To connect Amazon AppFlow to Microsoft SharePoint Online, provide details from your registered app in Microsoft Azure so that Amazon AppFlow can access the documents in your SharePoint document libraries. If you haven't yet configured your Microsoft account for Amazon AppFlow integration, see [Before you begin](#microsoft-sharepoint-online-prereqs).

**To connect to Microsoft SharePoint Online**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Microsoft SharePoint Online**.

1. Choose **Create connection**.

1. In the **Connect to Microsoft SharePoint Online** window, enter the following information about your registered app:
   + **Custom authorization tokens URL** – Your Azure AD tenant ID.
   + **Custom authorization code URL** – Your Azure AD tenant ID.

1. By default, the **Use AWS managed client app** checkbox is activated. You can do either of the following:
   + If you want to use the AWS managed client app, keep the checkbox activated.
   + If you want to use your own client app (called an app registration in Microsoft Azure), choose the checkbox to deactivate it. Then, provide values for **Client ID** and **Client secret**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**. A window appears that asks if you want to allow Amazon AppFlow to access your Microsoft SharePoint Online account.

1. Choose **Authorize**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Microsoft SharePoint Online as the data source, you can select this connection.

## Transferring data from Microsoft SharePoint Online with a flow
<a name="microsoft-sharepoint-online-transfer-data"></a>

To transfer documents and metadata from Microsoft SharePoint Online to Amazon S3, create an Amazon AppFlow flow. In the flow configuration, you set the data source by choosing a Microsoft SharePoint Online connection. Specifically for flows that transfer from SharePoint, you also choose a SharePoint site that's hosted in your account, and one or more SharePoint document libraries that belong to the site. You also set the data destination by choosing an Amazon S3 bucket in your AWS account.

**To configure a flow with Microsoft SharePoint Online as the data source**

For the standard steps to create a flow, see [Create a flow using the AWS console](create-flow-console.md). Use the following steps only to configure the data source and data destination details for a flow that transfers from SharePoint. You configure these settings when you reach the **Configure flow** page in the flow creation process.

1. Under **Source details**, for **Source name**, choose **Microsoft SharePoint Online**.

1. For **Choose Microsoft SharePoint Online connection**, choose the connection that you created. If you haven't created a connection yet, see [Connecting Amazon AppFlow to your Microsoft SharePoint Online account](#microsoft-sharepoint-online-connecting).

1. For **Choose API version**, choose **v1.0**.

1. For **Choose Microsoft SharePoint Online site**, choose the SharePoint site in your account that contains the documents that you want to transfer.

1. Under **Selected resources**, the console shows the document libraries that belong to the SharePoint site. Each document library is represented as a folder. If a folder contains subfolders or documents, you can expand the folder to show its contents.

   Select the check box for one or more folders to pick the documents that your flow transfers to Amazon S3. When you run the flow, Amazon AppFlow transfers the documents that are in the folder, in addition to the documents that are in all of its subfolders.

   For the limits that apply to how many folders and documents you can transfer, see [Quotas and limitations for the Microsoft SharePoint Online connector](#microsoft-sharepoint-online-quotas).

1. Under **Destination details**, for **Destination name**, choose **Amazon S3**. Then, for **Bucket details**, choose the S3 bucket that stores the output from your flow. To organize your output, you can specify an optional prefix, which becomes a folder in your S3 bucket.

After you configure your flow with a SharePoint document library and a destination S3 bucket, you can work through the remaining flow configuration steps in the console by using the standard steps.

### Microsoft SharePoint Online output in Amazon S3
<a name="microsoft-sharepoint-online-output"></a>

When you run a flow that transfers from SharePoint, Amazon AppFlow creates the following items in the destination S3 bucket:
+ A JSON file that contains metadata about every document that Amazon AppFlow transfers from your document libraries. For the metadata fields, see [Supported metadata fields for Microsoft SharePoint Online documents](#microsoft-sharepoint-online-objects). The name of the file is the execution ID of the flow run. To learn what flow run the execution ID is associated with, you can view a list of IDs under the **Run history** tab in the details page for a flow.
+ A folder that contains the folders and documents that you transferred from the document libraries of your site. The name of this folder is also the execution ID of the flow run.
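The resulting layout in the bucket can be sketched in Python. This is an illustrative sketch only (the helper name and execution ID are made up, and the optional S3 prefix from the flow configuration is omitted):

```python
def sharepoint_output_keys(execution_id, document_paths):
    """Sketch of the objects a SharePoint flow run writes to the
    destination S3 bucket: a metadata file named after the execution
    ID, plus the transferred documents under a folder with that ID."""
    keys = [execution_id]  # the metadata JSON file
    keys += [f"{execution_id}/{path}" for path in document_paths]
    return keys

print(sharepoint_output_keys("exec-123", ["LibraryA/report.pdf"]))
```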

The scope of the output depends on whether you configured the flow to run on a schedule or run on demand:
+ If the flow runs on a schedule, Amazon AppFlow performs incremental data transfers. When the flow runs for the first time, Amazon AppFlow transfers every document in the document libraries that you chose in the data source configuration. Then, for all subsequent flow runs, Amazon AppFlow transfers only those files that you created or changed in SharePoint since the prior flow run.

  To configure a flow to run on a schedule, you can use the console to set the schedule settings under **Flow trigger** in the flow creation process.
+ If the flow runs on demand, Amazon AppFlow performs full data transfers. For every flow run, Amazon AppFlow transfers every document in the document libraries that you chose in the data source configuration.

  To configure a flow to run on demand, you can use the console to set this option under **Flow trigger** in the flow creation process. After you create an on-demand flow, you run the flow by choosing **Run flow** on the flow details page.
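The incremental-versus-full behavior described above can be pictured as a filter on each document's last-modified time. This is an illustrative sketch, not Amazon AppFlow's implementation; the function and field names are assumptions:

```python
from datetime import datetime, timezone

def select_documents(documents, last_run_time=None):
    """Pick the documents a flow run would transfer.

    With no prior run time (a first scheduled run, or an on-demand
    run), every document is selected. Otherwise, only documents
    created or changed since the prior run are selected."""
    if last_run_time is None:
        return list(documents)
    return [d for d in documents if d["lastModified"] > last_run_time]

docs = [
    {"name": "a.pdf", "lastModified": datetime(2023, 5, 1, tzinfo=timezone.utc)},
    {"name": "b.docx", "lastModified": datetime(2023, 6, 1, tzinfo=timezone.utc)},
]
prior = datetime(2023, 5, 15, tzinfo=timezone.utc)
print([d["name"] for d in select_documents(docs, prior)])  # ['b.docx']
```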

## Supported metadata fields for Microsoft SharePoint Online documents
<a name="microsoft-sharepoint-online-objects"></a>

When you run a flow that transfers documents from Microsoft SharePoint Online, Amazon AppFlow creates a metadata file in the destination S3 bucket. The metadata describes each document that Amazon AppFlow transferred for the flow run.

The following table lists the metadata fields that Amazon AppFlow supports. For each transferred document, Amazon AppFlow writes only those fields that apply to the document type.


|  **Metadata field**  |  **Data type**  |  **Supported filters**  | 
| --- | --- | --- | 
|  Audio  |  Struct  |    | 
|  Bundle  |  Struct  |    | 
|  Created DateTime  |  DateTime  |    | 
|  CreatedBy  |  Struct  |    | 
|  Deleted  |  Struct  |    | 
|  Description  |  String  |    | 
|  Entity Content Tag  |  String  |    | 
|  Entity Tag  |  String  |    | 
|  File  |  Struct  |    | 
|  File System Info  |  Struct  |    | 
|  File Type  |  String  |  EQUAL\_TO  | 
|  Id  |  String  |    | 
|  Image  |  Struct  |    | 
|  Last Modified By  |  Struct  |    | 
|  Last Modified DateTime  |  DateTime  |  GREATER\_THAN  | 
|  Location  |  Struct  |    | 
|  Malware  |  Struct  |    | 
|  Name  |  String  |    | 
|  Package  |  Struct  |    | 
|  Parent Reference  |  Struct  |    | 
|  Pending Operations  |  Struct  |    | 
|  Photo  |  Struct  |    | 
|  Publication  |  Struct  |    | 
|  Remote Item  |  Struct  |    | 
|  Root  |  Struct  |    | 
|  Search Result  |  Struct  |    | 
|  SharePoint Ids  |  Struct  |    | 
|  Shared  |  Struct  |    | 
|  Size  |  Integer  |    | 
|  Special Folder  |  Struct  |    | 
|  Video  |  Struct  |    | 
|  Web Dav Url  |  String  |    | 
|  Web Url  |  String  |    | 
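The two filterable fields in the table suggest how source filters behave. The following is a hedged sketch of applying an EQUAL\_TO filter on File Type and a GREATER\_THAN filter on Last Modified DateTime to metadata records; the helper name and record shape are assumptions, not the metadata file's exact schema:

```python
def apply_filters(records, file_type=None, modified_after=None):
    """Filter metadata records the way the two supported filters allow:
    EQUAL_TO on File Type, GREATER_THAN on Last Modified DateTime.
    ISO 8601 timestamps compare correctly as strings."""
    result = records
    if file_type is not None:
        result = [r for r in result if r.get("File Type") == file_type]
    if modified_after is not None:
        result = [r for r in result
                  if r.get("Last Modified DateTime", "") > modified_after]
    return result

records = [
    {"Name": "report.pdf", "File Type": "pdf",
     "Last Modified DateTime": "2023-06-01T00:00:00Z"},
    {"Name": "notes.docx", "File Type": "docx",
     "Last Modified DateTime": "2023-04-01T00:00:00Z"},
]
print([r["Name"] for r in apply_filters(records, file_type="pdf")])  # ['report.pdf']
```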

## Quotas and limitations for the Microsoft SharePoint Online connector
<a name="microsoft-sharepoint-online-quotas"></a>

The following table lists the quotas that apply to flows that transfer from SharePoint.


| Resource | Quota | 
| --- | --- | 
|  The maximum number of SharePoint document library folders transferred by a flow  | 17 | 
|  The maximum size of any file transferred by a flow  | 250 GB | 
|  The maximum number of files transferred by a flow run  | 10,000 | 
|  The maximum total data size transferred by a flow run  | 250 GB | 
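A pre-flight check against these quotas might look like the following. This is illustrative only; the constants mirror the table above, and the helper name is made up:

```python
MAX_FOLDERS = 17                  # document library folders per flow
MAX_FILE_BYTES = 250 * 1024**3    # 250 GB per file
MAX_FILES_PER_RUN = 10_000        # files per flow run
MAX_TOTAL_BYTES = 250 * 1024**3   # 250 GB per flow run

def within_quotas(folder_count, file_sizes):
    """Return True if a planned SharePoint transfer fits the quotas."""
    return (
        folder_count <= MAX_FOLDERS
        and len(file_sizes) <= MAX_FILES_PER_RUN
        and all(size <= MAX_FILE_BYTES for size in file_sizes)
        and sum(file_sizes) <= MAX_TOTAL_BYTES
    )

print(within_quotas(5, [10 * 1024**2] * 100))  # True
```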

The following limitations also apply to flows that transfer from SharePoint:
+ For scheduled flows, if a flow remains running when the next flow run is scheduled to start, then Amazon AppFlow skips the next flow run. Amazon AppFlow does this to allow the first flow run enough time to complete.
+ Amazon AppFlow doesn't provide the option to catalog your output in the AWS Glue Data Catalog. Amazon AppFlow typically provides that option for flows that transfer to Amazon S3, but the option is available only for structured source data. The documents that you transfer from your SharePoint document libraries are unstructured data.
+ Amazon AppFlow doesn't provide the data partitioning options that it typically provides for flows that transfer to Amazon S3. Amazon AppFlow partitions all SharePoint output only into folders that are named after the execution ID of the flow run.

# Microsoft Teams connector for Amazon AppFlow
<a name="connectors-microsoft-teams"></a>

Microsoft Teams is a platform developed by Microsoft that helps teams collaborate through chat, online meetings, and more. If you're a Microsoft Teams user, your account contains data about your resources, including teams, groups, and channels. You can use Amazon AppFlow to transfer data from Microsoft Teams to certain AWS services or other supported applications.

## Amazon AppFlow support for Microsoft Teams
<a name="microsoft-teams-support"></a>

Amazon AppFlow supports Microsoft Teams as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Microsoft Teams.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Microsoft Teams.

## Before you begin
<a name="microsoft-teams-prereqs"></a>

To use Amazon AppFlow to transfer data from Microsoft Teams to supported destinations, you must meet these requirements:
+ You have a Microsoft account with which you've signed up for the following services:
  + Microsoft Teams. For more information about the Microsoft Teams data objects that Amazon AppFlow supports, see [Supported objects](#microsoft-teams-objects).
  + Microsoft 365.
  + The Microsoft 365 Developer Program.
+ In the Microsoft Azure portal, you've created an app registration for Amazon AppFlow. The registered app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to register an app, see [Register an application with the Microsoft identity platform](https://learn.microsoft.com/en-us/graph/auth-register-app-v2) in the Microsoft Graph documentation.
+ You've configured your registered app with the following settings:
  + You've added one or more redirect URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Microsoft Teams. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
  + You've added the recommended permissions below.
  + You've created a client secret.

Note the following values from your registered app because you provide them to Amazon AppFlow when you connect to your Microsoft Teams account:
+ Application (client) ID
+ Directory (tenant) ID
+ Client secret

### Recommended permissions
<a name="microsoft-teams-permissions"></a>

Before Amazon AppFlow can securely access your data in Microsoft Teams, your registered app must allow the necessary permissions for the Microsoft Graph API. We recommend that you enable the permissions below so that Amazon AppFlow can access all supported data objects.

If you want to grant fewer permissions, you can omit any permissions that apply to objects that you don't want to transfer.

You can add permissions to your registered app by using the API permissions page in the Microsoft Azure portal. Configure your permissions as follows:
+ Under **Microsoft APIs**, choose **Microsoft Graph**.
+ For the permissions type, choose **delegated**. For information about delegated permissions, see [Permission types](https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-permissions-and-consent#using-the-admin-consent-endpoint) in the Microsoft identity platform documentation.
+ Add the following recommended permissions:
  + `User.Read`
  + `offline_access`
  + `User.Read.All`
  + `User.ReadWrite.All`
  + `TeamsTab.ReadWriteForTeam`
  + `TeamsTab.ReadWriteForChat`
  + `TeamsTab.ReadWrite.All`
  + `TeamsTab.Read.All`
  + `TeamSettings.ReadWrite.All`
  + `TeamSettings.Read.All`
  + `TeamMember.ReadWrite.All`
  + `TeamMember.Read.All`
  + `Team.ReadBasic.All`
  + `GroupMember.ReadWrite.All`
  + `GroupMember.Read.All`
  + `Group.ReadWrite.All`
  + `Group.Read.All`
  + `Directory.ReadWrite.All`
  + `Directory.Read.All`
  + `Directory.AccessAsUser.All`
  + `Chat.ReadWrite`
  + `Chat.ReadBasic`
  + `Chat.Read`
  + `ChannelSettings.ReadWrite.All`
  + `ChannelSettings.Read.All`
  + `ChannelMessage.Read.All`
  + `Channel.ReadBasic.All`

  For information about these permissions, see the [Microsoft Graph permissions reference](https://learn.microsoft.com/en-us/graph/permissions-reference) in the Microsoft Graph documentation.
+ Enable the option to grant admin consent to your app. For more information, see [Admin consent](https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-permissions-and-consent#admin-consent) in the Microsoft identity platform documentation.

## Connecting Amazon AppFlow to your Microsoft Teams account
<a name="microsoft-teams-connecting"></a>

To connect Amazon AppFlow to Microsoft Teams, provide details from your registered app in Microsoft Azure so that Amazon AppFlow can access your data. If you haven't yet configured your Microsoft account for Amazon AppFlow integration, see [Before you begin](#microsoft-teams-prereqs).

**To connect to Microsoft Teams**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Microsoft Teams**.

1. Choose **Create connection**.

1. In the **Connect to Microsoft Teams** window, enter the following information about your registered app:
   + **Custom authorization tokens URL** – The directory (tenant) ID of your registered app.
   + **Custom authorization code URL** – The directory (tenant) ID of your registered app.
   + **Client ID** – The application (client) ID of your registered app.
   + **Client secret** – The client secret value of your registered app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**. A **Sign in** window opens.

1. Enter your user name and password to sign in to your Microsoft account.

1. On the **Permissions requested** page, choose **Accept**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Microsoft Teams as the data source, you can select this connection.

## Transferring data from Microsoft Teams with a flow
<a name="microsoft-teams-transfer-data"></a>

To transfer data from Microsoft Teams, create an Amazon AppFlow flow, and choose Microsoft Teams as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Microsoft Teams, see [Supported objects](#microsoft-teams-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#microsoft-teams-destinations).

## Supported destinations
<a name="microsoft-teams-destinations"></a>

When you create a flow that uses Microsoft Teams as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="microsoft-teams-objects"></a>

When you create a flow that uses Microsoft Teams as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-microsoft-teams.html)

# Mixpanel connector for Amazon AppFlow
<a name="connectors-mixpanel"></a>

Mixpanel is a service that provides analytics about user engagement in web and mobile applications. If you use Mixpanel, you can also use Amazon AppFlow to transfer your data to certain AWS services or other supported applications.

**Topics**
+ [Mixpanel support](#mixpanel-support)
+ [Before you begin](#mixpanel-prereqs)
+ [Connecting Amazon AppFlow to your Mixpanel account](#mixpanel-connecting)
+ [Transferring data from Mixpanel with a flow](#mixpanel-import-data)
+ [Supported objects](#mixpanel-reference-objects)
+ [Supported destinations](#mixpanel-reference-destinations)

## Mixpanel support
<a name="mixpanel-support"></a>

Amazon AppFlow supports Mixpanel as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from your Mixpanel account.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to your Mixpanel account.

## Before you begin
<a name="mixpanel-prereqs"></a>

Before you can use Amazon AppFlow to transfer data from Mixpanel, you need the following:
+ A Mixpanel project that contains the data that you want to transfer.
+ A *service account* for your Mixpanel project. In Mixpanel, a service account is a type of user that you authorize to access a project programmatically with the Mixpanel API. Amazon AppFlow needs this account to access your data. For more information, see [Service Accounts](https://developer.mixpanel.com/reference/service-accounts) in the Mixpanel documentation.

  When you create a Mixpanel connection in Amazon AppFlow, you provide the following properties from your service account:
  + Username
  + Secret
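
Outside of the console, you can check that a service account's username and secret are valid by calling the Mixpanel Query API directly. The following sketch only builds the HTTP Basic authentication header that Mixpanel's service account documentation describes; the credential values shown are placeholders, not a real account.

```python
import base64

def mixpanel_auth_header(username: str, secret: str) -> str:
    """Build the HTTP Basic auth header that Mixpanel service accounts use."""
    token = base64.b64encode(f"{username}:{secret}".encode("ascii")).decode("ascii")
    return f"Basic {token}"

# Placeholder credentials -- substitute your own service account values.
header = mixpanel_auth_header("my-service-account", "example-secret")
```

You would pass this header with requests to `https://mixpanel.com/api/2.0` endpoints to confirm the account has access before creating the connection.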

## Connecting Amazon AppFlow to your Mixpanel account
<a name="mixpanel-connecting"></a>

To connect Amazon AppFlow to your Mixpanel project, provide details about the service account that enables Amazon AppFlow to access your data. To create a service account, see [Before you begin](#mixpanel-prereqs).

**To connect to Mixpanel**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Mixpanel**.

1. Choose **Create connection**.

1. In the **Connect to Mixpanel** window, enter the following:
   + **User name** – The user name of the Mixpanel service account that provides access to your project.
   + **Password** – The service account secret.
   + **MixPanel Instance URL** – Choose **https://mixpanel.com/api/app/me**.
   + **MixPanel API version** – Choose **2.0**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Mixpanel as the data source, you can select this connection.

## Transferring data from Mixpanel with a flow
<a name="mixpanel-import-data"></a>

To transfer data from Mixpanel, create an Amazon AppFlow flow, and choose Mixpanel as the data source. To learn how to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose which data object that you want to transfer. For more information about the objects that Amazon AppFlow supports for Mixpanel, see [Supported objects](#mixpanel-reference-objects).

**Required filters for Mixpanel data objects**  
When you create a flow and use Mixpanel as the data source, most data objects require you to specify one or more *filters*. Filters are typically optional criteria that you use to transfer data objects selectively. Specifically for flows that transfer from Mixpanel, you must specify filters to provide Amazon AppFlow with parameter values that it needs to query your data.  
For the filters that are required for each Mixpanel data object, see [Supported objects](#mixpanel-reference-objects).

Also choose the destination where you want to transfer the data object that you selected. For more information on how to configure your destination, see [Supported destinations](#mixpanel-reference-destinations).

## Supported objects
<a name="mixpanel-reference-objects"></a>

When you create a flow that uses Mixpanel as the data source, you can transfer any of the data objects shown in the following table. To retrieve each object, Amazon AppFlow sends a query to the URI in the *Mixpanel endpoint* column. Most data objects support one or more filters that appear under *Supported filters*. Flows that transfer from Mixpanel require certain filters.


| Object | Mixpanel endpoint (path appended to the base URI `https://mixpanel.com/api/2.0`) | Supported filters | 
| --- | --- | --- | 
| Annotations | /annotations |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Cohorts | /cohorts/list | None | 
| Engage | /engage | None | 
| Events | /events |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Events Names | /events/names |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Events Properties | /events/properties |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Events Properties Top | /events/properties/top |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Events Properties Values | /events/properties/values |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Events Top | /events/top |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Funnels | /funnels |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Profile Event Activity | /stream/query |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Retention | /retention/addiction |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Segmentation | /segmentation |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Segmentation Average | /segmentation/average |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Segmentation Numeric | /segmentation/numeric |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 
| Segmentation Sum | /segmentation/sum |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-mixpanel.html)  | 

Some of these filters are required. You must specify a required filter in your flow definition before Amazon AppFlow can successfully retrieve your data.
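As an illustration of how these filters become query parameters, the following sketch builds a request URL for the `/events` endpoint. The parameter names (`event`, `from_date`, `to_date`, `unit`, `type`) come from the Mixpanel Query API documentation; confirm the exact required set for your object before relying on this sketch.

```python
from urllib.parse import urlencode

BASE = "https://mixpanel.com/api/2.0"

def events_query_url(event: str, from_date: str, to_date: str,
                     unit: str = "day", type_: str = "general") -> str:
    """Build a query URL for the Mixpanel /events endpoint."""
    params = {
        "event": f'["{event}"]',   # Mixpanel expects a JSON-encoded list
        "from_date": from_date,
        "to_date": to_date,
        "unit": unit,
        "type": type_,
    }
    return f"{BASE}/events?{urlencode(params)}"

url = events_query_url("Sign Up", "2024-01-01", "2024-01-31")
```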

## Supported destinations
<a name="mixpanel-reference-destinations"></a>

When you create a flow that uses Mixpanel as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Okta connector for Amazon AppFlow
<a name="connectors-okta"></a>

Okta is an identity and access management solution. If you're an Okta user, your account contains data about your Okta objects, such as your users, groups, devices, and applications. You can use Amazon AppFlow to transfer data from Okta to certain AWS services or other supported applications.

## Amazon AppFlow support for Okta
<a name="okta-support"></a>

Amazon AppFlow supports Okta as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Okta.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Okta.

## Before you begin
<a name="okta-prereqs"></a>

To use Amazon AppFlow to transfer data from Okta to supported destinations, you must meet these requirements:
+ You have an account with Okta that contains the data that you want to transfer. For more information about the Okta data objects that Amazon AppFlow supports, see [Supported objects](#okta-objects).
+ In your account, you've created either of the following resources for Amazon AppFlow. These resources provide credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account.
  + An OIDC app integration to support OAuth 2.0 authentication. For the steps to create an app integration, see [Create OIDC app integrations](https://help.okta.com/en-us/Content/Topics/Apps/Apps_App_Integration_Wizard_OIDC.htm) in the Okta Help Center.
  + An API token. For the steps to create one, see [Create an API token](https://developer.okta.com/docs/guides/create-an-api-token/main/) in the Okta Help Center.
+ If you created an OIDC app integration, you've configured it with the following settings:
  + The application type is *Web Application*.
  + The activated grant types include *Authorization Code* and *Refresh Token*.
  + The sign-in redirect URIs include one or more URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Okta. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
  + The following scopes are permitted:
    + `okta.apps.read`
    + `okta.devices.read`
    + `okta.groups.read`
    + `okta.users.read`
    + `okta.userTypes.read`
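
If you want to confirm these values outside of Amazon AppFlow, the following sketch shows two pieces involved in the setup: the `SSWS` authorization header that Okta API tokens use for direct API calls, and the AppFlow redirect URL for a given Region code. The token value is a placeholder.

```python
def okta_request_headers(api_token: str) -> dict:
    """Headers for a direct Okta API call. Okta API tokens use the
    SSWS authorization scheme."""
    return {
        "Authorization": f"SSWS {api_token}",
        "Accept": "application/json",
    }

def appflow_redirect_url(region: str) -> str:
    """Build the Amazon AppFlow sign-in redirect URL for a Region code."""
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

headers = okta_request_headers("example-api-token")
redirect = appflow_redirect_url("us-east-1")
```

For example, a `GET` to `https://your-domain.okta.com/api/v1/users?limit=1` with these headers should succeed if the token is valid.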

If you created an OIDC app integration, note the client ID and client secret. If you created an API token, note the token value. You provide these values to Amazon AppFlow when you connect to your Okta account.

## Connecting Amazon AppFlow to your Okta account
<a name="okta-connecting"></a>

To connect Amazon AppFlow to your Okta account, provide the client credentials from your app integration, or provide an API token. If you haven't yet configured your Okta account for Amazon AppFlow integration, see [Before you begin](#okta-prereqs).

**To connect to Okta**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Okta**.

1. Choose **Create connection**.

1. In the **Connect to Okta** window, for **Select authentication type**, choose how to authenticate Amazon AppFlow with your Okta account when it requests to access your data:
   + Choose **OAuth2** to authenticate Amazon AppFlow with the client credentials from an OIDC app integration. Then, specify the following:
     + **Authorization tokens URL** and **Authorization code URL** – For each of these fields, do the following: 

       1. Choose the format of your Okta Org URL. For more information, see [Org URLs](https://developer.okta.com/docs/concepts/okta-organizations/#org-urls) in the Okta Developer documentation.

       1. Enter your Okta subdomain. For the steps to look up your subdomain, see [Find your Okta domain](https://developer.okta.com/docs/guides/find-your-domain/main/) in the Okta Developer documentation.
     + **Client ID** – The client ID from your app integration.
     + **Client secret** – The client secret from your app integration.
   + Choose **Okta API Token** to authenticate Amazon AppFlow with an API token. Then, enter the token value for **Okta API Token**.

1. For **Your Okta Domain URL**, enter your domain URL, such as ***my-domain*.okta.com**. For the steps to find your domain, see [Find your Okta domain](https://developer.okta.com/docs/guides/find-your-domain/main/) in the Okta Developer documentation.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your Okta account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Okta as the data source, you can select this connection.

## Transferring data from Okta with a flow
<a name="okta-transfer-data"></a>

To transfer data from Okta, create an Amazon AppFlow flow, and choose Okta as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Okta, see [Supported objects](#okta-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#okta-destinations).

## Supported destinations
<a name="okta-destinations"></a>

When you create a flow that uses Okta as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="okta-objects"></a>

When you create a flow that uses Okta as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-okta.html)

# Oracle HCM connector for Amazon AppFlow
<a name="connectors-oracle-hcm"></a>

Oracle Human Capital Management (HCM) is a cloud-based application for human resources (HR) processes. You can use Amazon AppFlow to transfer data from Oracle HCM to certain AWS services or other supported applications.

## Amazon AppFlow support for Oracle HCM
<a name="oracle-hcm-support"></a>

Amazon AppFlow supports Oracle HCM as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Oracle HCM.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Oracle HCM.

## Before you begin
<a name="oracle-hcm-prereqs"></a>

To use Amazon AppFlow to transfer data from Oracle HCM to supported destinations, you must have an account with Oracle HCM that contains the data that you want to transfer. 

## Connecting Amazon AppFlow to your Oracle HCM account
<a name="oracle-hcm-connecting"></a>

To connect Amazon AppFlow to your Oracle HCM account, provide your account credentials and instance URL.

**To connect to Oracle HCM**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Oracle HCM**.

1. Choose **Create connection**.

1. In the **Connect to Oracle HCM** window, enter the following information:
   + **User name** – The user name for your Oracle HCM account.
   + **Password** – The password for your Oracle HCM account.
   + **Oraclehcm Instance URL** – The URL of your Oracle HCM instance.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Oracle HCM as the data source, you can select this connection.

## Transferring data from Oracle HCM with a flow
<a name="oracle-hcm-transfer-data"></a>

To transfer data from Oracle HCM, create an Amazon AppFlow flow, and choose Oracle HCM as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

## Supported destinations
<a name="oracle-hcm-destinations"></a>

When you create a flow that uses Oracle HCM as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# PayPal connector for Amazon AppFlow
<a name="connectors-paypal"></a>

PayPal is a payments system that facilitates online money transfers between parties, such as transfers between customers and online vendors. If you're a PayPal user, your account contains data about your transactions, such as their payers, dates, and statuses. You can use Amazon AppFlow to transfer data from PayPal to certain AWS services or other supported applications.

## Amazon AppFlow support for PayPal
<a name="paypal-support"></a>

Amazon AppFlow supports PayPal as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from PayPal.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to PayPal.

## Before you begin
<a name="paypal-prereqs"></a>

To use Amazon AppFlow to transfer data from PayPal to supported destinations, you must meet these requirements:
+ You have an account with PayPal that contains the data that you want to transfer. For more information about the PayPal data objects that Amazon AppFlow supports, see [Supported objects](#paypal-objects).
+ In PayPal Developer, you've created a REST API app for Amazon AppFlow. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create an app, see [How do I create REST API credentials?](https://www.paypal.com/us/cshelp/article/how-do-i-create-rest-api-credentials-ts1949) in the PayPal Help Center.
+ You have configured the app with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from PayPal. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

Note the client ID and secret from the settings for your REST API app. You provide these values to Amazon AppFlow when you connect to your PayPal account.
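
To verify the client ID and secret outside of Amazon AppFlow, you can exchange them for an access token yourself. The following sketch only assembles the HTTP request that PayPal's OAuth token endpoint expects (Basic authentication with your credentials and a `client_credentials` grant); it doesn't send anything, and the credential values are placeholders.

```python
import base64

def paypal_token_request(client_id: str, client_secret: str,
                         sandbox: bool = True) -> dict:
    """Describe the request that exchanges REST API app credentials for
    an OAuth access token. Nothing is sent; this only builds the parts."""
    host = "api-m.sandbox.paypal.com" if sandbox else "api-m.paypal.com"
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "method": "POST",
        "url": f"https://{host}/v1/oauth2/token",
        "headers": {
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        "body": "grant_type=client_credentials",
    }

req = paypal_token_request("example-client-id", "example-secret")
```

If sending this request returns an access token, the same credentials should work when you create the connection in the console.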

## Connecting Amazon AppFlow to your PayPal account
<a name="paypal-connecting"></a>

To connect Amazon AppFlow to your PayPal account, provide the client credentials from your REST API app so that Amazon AppFlow can access your data. If you haven't yet configured your PayPal account for Amazon AppFlow integration, see [Before you begin](#paypal-prereqs).

**To connect to PayPal**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **PayPal**.

1. Choose **Create connection**.

1. In the **Connect to PayPal** window, enter the following information:
   + **Authorization tokens URL** – Do one of the following:
     + To connect to a REST API app in the PayPal Live environment, choose **https://api-m.paypal.com/v1/oauth2/token**.
     + To connect to a REST API app in the PayPal Sandbox environment, choose **https://api-m.sandbox.paypal.com/v1/oauth2/token**.
   + **Client ID** – The client ID of your REST API app in PayPal Developer.
   + **Client secret** – The secret of your REST API app in PayPal Developer.
   + **Instance URL** – Do one of the following:
     + To connect to a REST API app in the PayPal Live environment, choose **https://api-m.paypal.com**.
     + To connect to a REST API app in the PayPal Sandbox environment, choose **https://api-m.sandbox.paypal.com**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your PayPal account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses PayPal as the data source, you can select this connection.

## Transferring data from PayPal with a flow
<a name="paypal-transfer-data"></a>

To transfer data from PayPal, create an Amazon AppFlow flow, and choose PayPal as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for PayPal, see [Supported objects](#paypal-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#paypal-destinations).

## Supported destinations
<a name="paypal-destinations"></a>

When you create a flow that uses PayPal as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="paypal-objects"></a>

When you create a flow that uses PayPal as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-paypal.html)

# Pendo connector for Amazon AppFlow
<a name="connectors-pendo"></a>

Pendo is a product analytics solution that helps teams record, monitor, and analyze data about the user experience in their apps. If you're a Pendo user, your account contains data about your users and their behavior in your product. You can use Amazon AppFlow to transfer data from Pendo to certain AWS services or other supported applications.

## Amazon AppFlow support for Pendo
<a name="pendo-support"></a>

Amazon AppFlow supports Pendo as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Pendo.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Pendo.

## Before you begin
<a name="pendo-prereqs"></a>

To use Amazon AppFlow to transfer data from Pendo to supported destinations, you must meet these requirements:
+ You have an account with Pendo that contains the data that you want to transfer. For more information about the Pendo data objects that Amazon AppFlow supports, see [Supported objects](#pendo-objects).
+ In your Pendo account, you've created an integration key for Amazon AppFlow, and you've configured the key to allow write access. For the steps to create a key, see [Authentication](https://developers.pendo.io/docs/?bash#authentication) in the Pendo Developers documentation.

Note the value of the integration key. You provide this value to Amazon AppFlow when you connect to your Pendo account.
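
Outside of Amazon AppFlow, a direct call to the Pendo v1 API authenticates with the `x-pendo-integration-key` header, which offers a quick way to confirm that the key works. The following minimal sketch builds those headers; the key value is a placeholder.

```python
def pendo_headers(integration_key: str) -> dict:
    """Headers for a direct call to the Pendo v1 API (host app.pendo.io).
    Pendo authenticates with the x-pendo-integration-key header."""
    return {
        "x-pendo-integration-key": integration_key,
        "Content-Type": "application/json",
    }

headers = pendo_headers("example-integration-key")
```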

## Connecting Amazon AppFlow to your Pendo account
<a name="pendo-connecting"></a>

To connect Amazon AppFlow to your Pendo account, provide the value of your integration key so that Amazon AppFlow can access your data. If you haven't yet configured your Pendo account for Amazon AppFlow integration, see [Before you begin](#pendo-prereqs).

**To connect to Pendo**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Pendo**.

1. Choose **Create connection**.

1. In the **Connect to Pendo** window, for **API key**, enter the value of the integration key from your Pendo account.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Pendo as the data source, you can select this connection.

## Transferring data from Pendo with a flow
<a name="pendo-transfer-data"></a>

To transfer data from Pendo, create an Amazon AppFlow flow, and choose Pendo as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Pendo, see [Supported objects](#pendo-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#pendo-destinations).

## Supported destinations
<a name="pendo-destinations"></a>

When you create a flow that uses Pendo as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="pendo-objects"></a>

When you create a flow that uses Pendo as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-pendo.html)

# Pipedrive connector for Amazon AppFlow
<a name="connectors-pipedrive"></a>

Pipedrive is a Customer Relationship Management (CRM) service that helps companies track and carry out projects. If you’re a Pipedrive user, your account contains data about connections with your customers and within your organization. This can include deals, contacts, demos, proposals, and more. You can use Amazon AppFlow to transfer data from Pipedrive to certain AWS services or other supported applications.

## Amazon AppFlow support for Pipedrive
<a name="pipedrive-support"></a>

Amazon AppFlow supports Pipedrive as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Pipedrive.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Pipedrive.

## Before you begin
<a name="pipedrive-prereqs"></a>

To use Amazon AppFlow to transfer data from Pipedrive to supported destinations, you must meet these requirements:
+ You have an account with Pipedrive that contains the data that you want to transfer. For more information about the Pipedrive data objects that Amazon AppFlow supports, see [Supported objects](#pipedrive-objects).
+ In your Pipedrive account, you've created an unlisted app in Marketplace Manager. This app provides the credentials that Amazon AppFlow uses to make authenticated calls to your account and securely access your data. For the steps to create an app, see [Creating an app](https://pipedrive.readme.io/docs/marketplace-creating-a-proper-app) in the *Pipedrive Developer Documentation*.

  You've configured your app as follows:
  + You've specified a redirect URL (also referred to as a *callback URL*) for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Pipedrive. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
  + You've activated the access scopes that provide access to the data that you want to transfer. For more information about Pipedrive scopes, see [Scopes and permission explanations](https://pipedrive.readme.io/docs/marketplace-scopes-and-permissions-explanations) in the *Pipedrive Developer Documentation*.

From the settings for your app, note the client ID and client secret. When you connect to your Pipedrive account, you provide these values to Amazon AppFlow.
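The redirect URL format described above is mechanical, so you can derive it for any Region; a minimal sketch (the function name is ours, not part of Amazon AppFlow):

```python
def appflow_redirect_url(region: str) -> str:
    """Build the Amazon AppFlow OAuth redirect URL for an AWS Region code."""
    return f"https://{region}---console.aws.amazon.com/appflow/oauth"

print(appflow_redirect_url("us-east-1"))
# https://us-east-1.console.aws.amazon.com/appflow/oauth
```

You can paste the resulting URL directly into the redirect URL field of your Pipedrive app.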

## Connecting Amazon AppFlow to your Pipedrive account
<a name="pipedrive-connecting"></a>

To connect Amazon AppFlow to your Pipedrive account, provide details from your Pipedrive app so that Amazon AppFlow can access your data. If you haven't yet configured your Pipedrive app for Amazon AppFlow integration, see [Before you begin](#pipedrive-prereqs).

**To connect to Pipedrive**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Pipedrive**.

1. Choose **Create connection**.

1. In the **Connect to Pipedrive** window, enter the following information:
   + **Client ID** – The client ID from your Pipedrive app settings.
   + **Client secret** – The client secret from your Pipedrive app settings.
   + **Instance URL** – The URL of the instance where you want to run the operation, for example, https://awsappflow-domain.pipedrive.com.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Pipedrive account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Pipedrive as the data source, you can select this connection.

## Transferring data from Pipedrive with a flow
<a name="pipedrive-transfer-data"></a>

To transfer data from Pipedrive, create an Amazon AppFlow flow, and choose Pipedrive as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Pipedrive, see [Supported objects](#pipedrive-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#pipedrive-destinations).

## Supported destinations
<a name="pipedrive-destinations"></a>

When you create a flow that uses Pipedrive as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="pipedrive-objects"></a>

When you create a flow that uses Pipedrive as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-pipedrive.html)

# Productboard connector for Amazon AppFlow
<a name="connectors-productboard"></a>

Productboard is a product management solution. If you're a Productboard user, your account contains data about the projects in your roadmap, such as products, features, and status. You can use Amazon AppFlow to transfer data from Productboard to certain AWS services or other supported applications.

## Amazon AppFlow support for Productboard
<a name="productboard-support"></a>

Amazon AppFlow supports Productboard as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Productboard.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Productboard.

## Before you begin
<a name="productboard-prereqs"></a>

To use Amazon AppFlow to transfer data from Productboard to supported destinations, you must have an account with Productboard that contains the data that you want to transfer.

From the Public API settings in your account, note the access token because you provide this value to Amazon AppFlow when you connect to Productboard. For the steps to get the token, see [Public API Access Token](https://developer.productboard.com/#section/Authentication/Public-API-Access-Token) in the Productboard API Reference.
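Before you connect, you can check the token outside Amazon AppFlow by calling the Productboard Public API yourself. A hedged sketch of the request headers such a call carries (the helper name and the `X-Version` value are assumptions based on the linked API reference; the token is a placeholder):

```python
def productboard_headers(access_token: str) -> dict:
    """Headers for a Productboard Public API request (sketch)."""
    return {
        "Authorization": f"Bearer {access_token}",
        "X-Version": "1",  # assumption: API version header per the Productboard API reference
    }

headers = productboard_headers("pb_token_example")
```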

## Connecting Amazon AppFlow to your Productboard account
<a name="productboard-connecting"></a>

To connect Amazon AppFlow to your Productboard account, provide the access token from your account settings so that Amazon AppFlow can access your data.

**To connect to Productboard**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Productboard**.

1. Choose **Create connection**.

1. In the **Connect to Productboard** window, for **Access Token**, enter your access token.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Productboard as the data source, you can select this connection.

## Transferring data from Productboard with a flow
<a name="productboard-transfer-data"></a>

To transfer data from Productboard, create an Amazon AppFlow flow, and choose Productboard as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Productboard, see [Supported objects](#productboard-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#productboard-destinations).

## Supported destinations
<a name="productboard-destinations"></a>

When you create a flow that uses Productboard as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="productboard-objects"></a>

When you create a flow that uses Productboard as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-productboard.html)

# QuickBooks Online connector for Amazon AppFlow
<a name="connectors-quickbooks-online"></a>

QuickBooks Online is a cloud-based accounting solution for businesses. If you're a QuickBooks Online user, your account contains data about your accounts, customers, invoices, and more. You can use Amazon AppFlow to transfer data from QuickBooks Online to certain AWS services or other supported applications.

## Amazon AppFlow support for QuickBooks Online
<a name="quickbooks-online-support"></a>

Amazon AppFlow supports QuickBooks Online as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from QuickBooks Online.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to QuickBooks Online.

## Before you begin
<a name="quickbooks-online-prereqs"></a>

To use Amazon AppFlow to transfer data from QuickBooks Online to supported destinations, you must meet these requirements:
+ You have an account with QuickBooks Online that contains the data that you want to transfer. For more information about the QuickBooks Online data objects that Amazon AppFlow supports, see [Supported objects](#quickbooks-online-objects).
+ In your Intuit developer account, you've created an app for Amazon AppFlow. This app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create an app, see [Create and start developing your app](https://developer.intuit.com/app/developer/qbo/docs/get-started/start-developing-your-app) in the Intuit Developer documentation.
+ You've configured your app to permit the `com.intuit.quickbooks.accounting` scope.

Note the following values because you specify them in the connection settings in Amazon AppFlow.
+ The client ID and client secret from your app settings.
+ The company ID from your QuickBooks Online account settings.

## Connecting Amazon AppFlow to your QuickBooks Online account
<a name="quickbooks-online-connecting"></a>

To connect Amazon AppFlow to your QuickBooks Online account, provide details from your app so that Amazon AppFlow can access your data. If you haven't yet configured your QuickBooks Online account for Amazon AppFlow integration, see [Before you begin](#quickbooks-online-prereqs).

**To connect to QuickBooks Online**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **QuickBooks Online**.

1. Choose **Create connection**.

1. In the **Connect to QuickBooks Online** window, enter the following information:
   + **Client ID** – The client ID from your app settings.
   + **Client secret** – The client secret from your app settings.
   + **Instance URL** – The endpoint where Amazon AppFlow sends requests to access your data. Choose one of the following:
     + **https://sandbox-quickbooks.api.intuit.com** – The base URL for the QuickBooks Online development environment. For more information about this environment and the data that it contains, see [Create and test with a sandbox company](https://developer.intuit.com/app/developer/qbo/docs/develop/sandboxes/manage-your-sandboxes) in the Intuit Developer documentation.
     + **https://quickbooks.api.intuit.com** – The base URL for the QuickBooks Online production environment.
   + **QuickBooks CompanyId** – The company ID from your QuickBooks Online account settings.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your Intuit account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses QuickBooks Online as the data source, you can select this connection.
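The two instance URLs above map one-to-one to QuickBooks Online environments; a minimal helper that makes the choice explicit (a sketch; the names are ours):

```python
# Base URLs from the QuickBooks Online connection settings described above.
QUICKBOOKS_BASE_URLS = {
    "sandbox": "https://sandbox-quickbooks.api.intuit.com",
    "production": "https://quickbooks.api.intuit.com",
}

def quickbooks_instance_url(environment: str) -> str:
    """Return the instance URL for a QuickBooks Online environment."""
    try:
        return QUICKBOOKS_BASE_URLS[environment]
    except KeyError:
        raise ValueError(f"unknown environment: {environment!r}")

print(quickbooks_instance_url("sandbox"))
# https://sandbox-quickbooks.api.intuit.com
```

Use the sandbox URL while you develop and test, and the production URL for live company data.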

## Transferring data from QuickBooks Online with a flow
<a name="quickbooks-online-transfer-data"></a>

To transfer data from QuickBooks Online, create an Amazon AppFlow flow, and choose QuickBooks Online as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for QuickBooks Online, see [Supported objects](#quickbooks-online-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#quickbooks-online-destinations).

## Supported destinations
<a name="quickbooks-online-destinations"></a>

When you create a flow that uses QuickBooks Online as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="quickbooks-online-objects"></a>

When you create a flow that uses QuickBooks Online as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-quickbooks-online.html)

# Recharge connector for Amazon AppFlow
<a name="connectors-recharge"></a>

Recharge is a subscription payment solution designed for merchants to set up and manage dynamic, recurring billing across web and mobile applications. If you're a Recharge user, your account contains data about your customers, transactions, subscriptions, and more. You can use Amazon AppFlow to transfer data from Recharge to certain AWS services or other supported applications.

## Amazon AppFlow support for Recharge
<a name="recharge-support"></a>

Amazon AppFlow supports Recharge as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Recharge.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Recharge.

## Before you begin
<a name="recharge-prereqs"></a>

To use Amazon AppFlow to transfer data from Recharge to supported destinations, you must meet these requirements:
+ You have an account with Recharge that contains the data that you want to transfer. For more information about the Recharge data objects that Amazon AppFlow supports, see [Supported objects](#recharge-objects).
+ In your Recharge account, you've created an API token. For the steps to create this token, see [Recharge API key](https://docs.rechargepayments.com/docs/recharge-api-key) in the Recharge documentation.
+ You've configured the API token with read permissions that allow Amazon AppFlow to access the data that you want to transfer.

From your account settings, note your API token key because you provide this value to Amazon AppFlow when you connect to your Recharge account.
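To confirm that the token has the read permissions you configured, you can call the Recharge API directly before connecting. A hedged sketch of the request headers (the header names follow the Recharge documentation; the version string is an example, and the token is a placeholder):

```python
def recharge_headers(api_token: str, api_version: str = "2021-11") -> dict:
    """Headers for a Recharge API request (sketch; version is an example)."""
    return {
        "X-Recharge-Access-Token": api_token,
        "X-Recharge-Version": api_version,
    }

headers = recharge_headers("recharge_token_example")
```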

## Connecting Amazon AppFlow to your Recharge account
<a name="recharge-connecting"></a>

To connect Amazon AppFlow to your Recharge account, provide the API token from your account settings.

**To connect to Recharge**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Recharge**.

1. Choose **Create connection**.

1. In the **Connect to Recharge** window, for **API Token**, enter your API token key.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Recharge as the data source, you can select this connection.

## Transferring data from Recharge with a flow
<a name="recharge-transfer-data"></a>

To transfer data from Recharge, create an Amazon AppFlow flow, and choose Recharge as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Recharge, see [Supported objects](#recharge-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#recharge-destinations).

## Supported destinations
<a name="recharge-destinations"></a>

When you create a flow that uses Recharge as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="recharge-objects"></a>

When you create a flow that uses Recharge as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-recharge.html)

# Salesforce connector for Amazon AppFlow
<a name="salesforce"></a>

Salesforce provides customer relationship management (CRM) software that helps you with sales, customer service, e-commerce, and more. If you're a Salesforce user, you can connect Amazon AppFlow to your Salesforce account. Then, you can use Salesforce as a data source or destination in your flows. Run these flows to transfer data between Salesforce and AWS services or other supported applications.

## Amazon AppFlow support for Salesforce
<a name="salesforce-support"></a>

Amazon AppFlow supports Salesforce as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Salesforce.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to Salesforce.

**Supported API version**  
Amazon AppFlow transfers data with version 58.0 of the Salesforce Platform API.  
Amazon AppFlow began supporting this version on June 30, 2023. If you have a Salesforce connection that you created before this date, the connection uses a prior API version. The version depends on when you created the connection. For more information, see [History of supported Salesforce Platform API versions](#salesforce-api-version-history).

## Before you begin
<a name="salesforce-requirements"></a>

Before you can use Amazon AppFlow to transfer data to or from Salesforce, you must meet these requirements.

**Minimum requirements**
+ You have a Salesforce account.
+ Your Salesforce account is enabled for API access. API access is enabled by default for the Enterprise, Unlimited, Developer, and Performance editions.
+ Your Salesforce account allows you to install connected apps. If you lack access to this functionality, contact your Salesforce administrator. For more information, see [Connected Apps](https://help.salesforce.com/articleView?id=connected_app_overview.htm) in the Salesforce help.

**Optional requirements**
+ If you want to use event-driven flow triggers, you must enable change data capture in Salesforce. For more information on how to enable this, see [Select Objects for Change Notifications in the User Interface](https://developer.salesforce.com/docs/atlas.en-us.change_data_capture.meta/change_data_capture/cdc_select_objects.htm) in the Salesforce documentation.
+ If you want to create private connections using AWS PrivateLink, you must enable both `Manage Metadata` and `Manage External Connections` user permissions in your Salesforce account. Private connections are currently available in the us-east-1, us-west-2, ap-northeast-1, ap-south-1, ap-southeast-2, ca-central-1, and eu-central-1 AWS Regions.

If you meet those requirements, you're ready to connect Amazon AppFlow to your Salesforce account. For typical connections, you don't need to do anything else in Salesforce. Amazon AppFlow handles the remaining requirements with the AWS managed connected app.

### The AWS managed connected app for Salesforce
<a name="salesforce-managed-client"></a>

The AWS managed connected app helps you create Salesforce connections in fewer steps. Amazon AppFlow creates this connected app for you in your Salesforce account. In Salesforce, a connected app is a framework that authorizes external applications, like Amazon AppFlow, to access your Salesforce data. Amazon AppFlow configures the connected app with the required settings and names it *Amazon AppFlow Embedded Login App*.

Amazon AppFlow creates the connected app only when you do both of the following:
+ Create a Salesforce connection by using the Amazon AppFlow console.
+ Set **OAuth grant type** to **Authorization code** when you configure the connection.

### Requirements for the OAuth grant types for Salesforce
<a name="salesforce-grant-types"></a>

When you use the Amazon AppFlow console to configure a Salesforce connection, you choose the *OAuth grant type*. The grant type determines how Amazon AppFlow communicates with Salesforce to request access to your data. Your choice affects the requirements that you must meet before you create the connection. You can choose either of these types:

**Authorization code**  
 If you choose this grant type, the Amazon AppFlow console shows a window that prompts you for authorization. In the window, you sign in to your Salesforce account if you haven't signed in already. Then, you choose **Allow** to allow Amazon AppFlow to access your data. After you authorize Amazon AppFlow, it creates the AWS managed connected app in your Salesforce account.  
If you want to use this grant type, you don't need to meet any additional requirements in your Salesforce account.

**JSON Web Token (JWT)**  
If you choose this grant type, you provide a JWT that authorizes Amazon AppFlow to access your Salesforce data. Then, when Amazon AppFlow attempts to access your data, it passes the JWT to Salesforce, and Salesforce grants the access.  
If you want to use this grant type, you must create a JWT ahead of time, but you won't need to sign in to Salesforce when Amazon AppFlow connects to your account.  
For more information about the JWT authorization flow, and for the steps to create a JWT, see [OAuth 2.0 JWT Bearer Flow for Server-to-Server Integration](https://help.salesforce.com/s/articleView?id=sf.remoteaccess_oauth_jwt_flow.htm&type=5) in the Salesforce help.  
Before you can create a JWT, you must create your own connected app in your Salesforce account. Also, you must configure this connected app to meet the requirements for Amazon AppFlow integration.
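The claim set that the JWT bearer flow uses is small; a hedged sketch of building it (field meanings per the linked Salesforce help page; signing the claims with your connected app's RS256 private key is omitted, and all values are placeholders):

```python
import time

def salesforce_jwt_claims(client_id: str, username: str,
                          login_url: str = "https://login.salesforce.com",
                          lifetime_seconds: int = 180) -> dict:
    """Build the claim set for the Salesforce OAuth 2.0 JWT bearer flow (sketch)."""
    return {
        "iss": client_id,   # consumer key of your connected app
        "sub": username,    # Salesforce username being authorized
        "aud": login_url,   # use test.salesforce.com for a sandbox
        "exp": int(time.time()) + lifetime_seconds,
    }

claims = salesforce_jwt_claims("placeholder-consumer-key", "user@example.com")
```

In a real flow you would serialize and sign these claims as a JWT, then exchange the result for an access token as described in the Salesforce help.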

### Requirements for using your own connected app
<a name="salesforce-global-connected-app-instructions"></a>

Unless you use the AWS managed connected app that Amazon AppFlow creates for you, you must meet these requirements:
+ In your Salesforce account, you've created a connected app for Amazon AppFlow. For more information about connected apps, and for the steps to create one, see [Create a Connected App](https://help.salesforce.com/s/articleView?id=sf.connected_app_create.htm&type=5) in the Salesforce help.
+ You've configured the connected app as follows:
  + You've activated the **Enable OAuth Settings** check box.
  + In the **Callback URL** text field, you've entered one or more redirect URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Salesforce. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.
  + You've activated the **Require Secret for Web Server Flow** check box.
  + In the **Available OAuth Scopes** list, you've added the following scopes:
    + Manage user data via APIs (api)
    + Access custom permissions (custom\_permissions)
    + Access the identity URL service (id, profile, email, address, phone)
    + Access unique user identifiers (openid)
    + Perform requests at any time (refresh\_token, offline\_access)
  + You've set the refresh token policy for the connected app to **Refresh token is valid until revoked**. Otherwise, your flows will fail when your refresh token expires. For more information on how to check and edit the refresh token policy, see [Manage OAuth Access Policies for a Connected App](https://help.salesforce.com/articleView?id=connected_app_manage_oauth.htm) in the Salesforce documentation.
  + If you configured your connected app to enforce IP address restrictions, you must grant access to the addresses used by Amazon AppFlow. For more information, see [AWS IP address ranges](https://docs.aws.amazon.com/general/latest/gr/aws-ip-ranges.html) in the *AWS General Reference*.

## Connecting Amazon AppFlow to your Salesforce account
<a name="salesforce-setup"></a>

To grant Amazon AppFlow access to your Salesforce data, create a Salesforce connection in the Amazon AppFlow console. If you haven't yet configured your Salesforce account for Amazon AppFlow integration, see [Before you begin](#salesforce-requirements).

**To connect to Salesforce**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Salesforce**.

1. Choose **Create connection**. The console shows the **Connect to Salesforce** window.  
![\[\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-salesforce-console.png)

1. For **Connection name**, enter a custom name that will help you recognize the connection later.

1. For **OAuth grant type**, choose how to authorize Amazon AppFlow to access your Salesforce data:
   + **Authorization code** — Authorize Amazon AppFlow in a window that the console shows after you finish configuring the connection.
   + **JSON Web Token (JWT)** — Authorize Amazon AppFlow by providing a JWT.

1. For **Salesforce environment**, choose one of the following:
   + **Production** — Connects Amazon AppFlow to your Salesforce production org.
   + **Sandbox** — Connects Amazon AppFlow to a Salesforce sandbox.

1. For **PrivateLink**, choose **Enabled** if you want to connect to your Salesforce account privately through an AWS PrivateLink connection. Otherwise, leave this option set to **Disabled**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. Choose **Connect**.

1. If you chose **Authorization code** for **OAuth grant type**, the console shows a window. In the window, sign in to your Salesforce account if needed. Then, choose **Allow** to allow Amazon AppFlow to access your Salesforce data.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Salesforce as the data source, you can select this connection.

**AWS PrivateLink connections**  
If you enabled the option to connect to Salesforce through AWS PrivateLink, wait for Amazon AppFlow to set up the private connection before you create a flow. To set up the connection, Amazon AppFlow provisions an interface VPC endpoint and attempts to connect to your VPC endpoint service. This can take several minutes. Until the process completes, you can't transfer your Salesforce objects with a flow.  
For more information about AWS PrivateLink, see the [AWS PrivateLink Guide](https://docs.aws.amazon.com/vpc/latest/privatelink/).
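
If you prefer to automate this setup, the console procedure above corresponds to a `CreateConnectorProfile` call in the Amazon AppFlow API. The following Python sketch shows the request parameters as a dict; all values are placeholders, and the exact credential fields depend on the OAuth grant type you chose, so verify the field names against the current API reference.

```python
# Sketch: the console steps above expressed as CreateConnectorProfile
# request parameters. All values are placeholders.
request = {
    "connectorProfileName": "my-salesforce-connection",  # connection name (step 5)
    "connectorType": "Salesforce",                       # connector choice (step 3)
    "connectionMode": "Public",                          # "Private" if you enabled PrivateLink (step 8)
    "connectorProfileConfig": {
        "connectorProfileProperties": {
            "Salesforce": {
                "instanceUrl": "https://example.my.salesforce.com",
                "isSandboxEnvironment": False,           # Salesforce environment (step 7)
            }
        },
        "connectorProfileCredentials": {
            "Salesforce": {
                # Placeholder tokens; in practice these come from your
                # OAuth flow or connected app configuration.
                "accessToken": "EXAMPLE_ACCESS_TOKEN",
                "refreshToken": "EXAMPLE_REFRESH_TOKEN",
            }
        },
    },
}

print(request["connectorProfileName"])
```

With boto3, you would pass these parameters to `boto3.client('appflow').create_connector_profile(**request)`.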

## Additional flow settings for Salesforce
<a name="salesforce-additional-settings"></a>

When you configure a flow that uses a Salesforce connection, the Amazon AppFlow console shows some unique settings that aren't available for other types of flows.

### Salesforce API preference
<a name="salesforce-api-preference"></a>

When you use Salesforce as the source or destination, you can configure the **Salesforce API preference** setting. Use this setting to specify which Salesforce API Amazon AppFlow uses when your flow transfers data to or from Salesforce. Your choice optimizes your flow for small to medium-sized data transfers, large data transfers, or both.

The Amazon AppFlow console provides this setting on the **Configure flow** page under **Source details** or **Destination details**. To view it, expand the **Additional settings** section.

![\[\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow-salesforce-api-preference.png)


You can choose one of these options:
+ **Automatic (default)** — For each flow run, Amazon AppFlow selects the API to use based on the number of records that the run transfers. The threshold of records that determines the API varies based on whether Salesforce is the source or the destination, as shown in the following table:    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/salesforce.html)
**Notes**  
If you choose this option, be aware that each of the potential Salesforce APIs structures data differently. For recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and how null values are represented.
Flow runs that use Bulk API 2.0 can't transfer Salesforce compound fields.

  If you choose this option, you optimize flow performance for all data transfer sizes, but the tradeoff is inconsistent formatting in the output.
+ **Standard** — Amazon AppFlow uses only Salesforce REST API. This option optimizes your flow for small to medium-sized data transfers. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0.
**Note**  
If you choose this option and your flow attempts to transfer a very large dataset, it might fail with a timeout error.
+ **Bulk** — Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large datasets. If you choose this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
**Note**  
If you choose this option, your flow can't transfer Salesforce compound fields because Bulk API 2.0 doesn't support them.
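
The **Automatic** selection behavior can be illustrated with a small sketch. The 1,000-record threshold below is an assumption inferred from the 900/1,100-record example in the note above; the actual thresholds that Amazon AppFlow uses vary by transfer direction.

```python
def choose_salesforce_api(record_count, threshold=1000):
    """Illustrative sketch of the Automatic API preference: runs at or
    below the threshold use REST API, larger runs use Bulk API 2.0.
    The threshold value here is an assumption, not a documented figure."""
    return "REST API" if record_count <= threshold else "Bulk API 2.0"

print(choose_salesforce_api(900))    # REST API
print(choose_salesforce_api(1100))   # Bulk API 2.0
```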

### Salesforce destination record preference
<a name="salesforce-destination-record-preference"></a>

When you use Salesforce as a destination, the Amazon AppFlow console shows additional settings on the **Map data fields** page under **Destination record preference**.

![\[\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow-salesforce-destination-record-preference.png)


You can choose one of these options:

**Insert new records**  
This is the default data transfer option. When you choose this setting, Amazon AppFlow inserts your source data into the chosen Salesforce object as a new record.

**Update existing records**  
When you choose this setting, Amazon AppFlow uses your source data to update existing records in Salesforce. For every source record, Amazon AppFlow looks for a matching record in Salesforce based on your criteria. You can specify matching criteria on the **Map data fields** page. To do so, select a field in the source application and map it to a Salesforce record ID field with the dropdown list.  
When a matching record is found, Amazon AppFlow updates the record in Salesforce. If no matching record is found, Amazon AppFlow ignores the record or fails the flow per your chosen error handling option. You can specify your error handling preferences on the **Configure flow** page.  
Note that to update existing records by using an external ID field, you must use the upsert operation. The standard update operation doesn't support external ID fields.

**Upsert records**  
When you choose this setting, Amazon AppFlow performs an upsert operation in Salesforce. For every source record, Amazon AppFlow looks for a matching record in Salesforce based on your criteria. You can specify matching criteria on the **Map data fields** page. To do so, select a field in the source application and map it to a Salesforce external ID field using the dropdown list.  
When Amazon AppFlow finds a matching record, it updates the record in Salesforce. If Amazon AppFlow finds no matching record, it inserts the data as a new record. Any errors in performing the operation are handled according to your chosen error handling option. You can specify your error handling preferences on the **Configure flow** page.

**Delete existing records**  
When you choose this setting, Amazon AppFlow deletes Salesforce records that you specify. To specify the records, create a file that contains the IDs that Salesforce assigned to them. Provide that file as the source data for your flow.  
For example, the following CSV file lists the IDs of two Salesforce records to delete.  

```
salesforce_id
A1B2C3D4E5F6G7H8I9
J1K2L3M4N5O6P7Q9R0
```
In this example, the IDs appear under the one source field in the file, `salesforce_id`.  
In your flow definition, you must specify the source field that contains the IDs of the objects to delete. You do this when you map data fields. At that point, you map the source field to the corresponding destination field in Salesforce. For example, if you assigned the Salesforce object **Opportunity** to your flow, then the destination field name is **Opportunity ID**.  
You can provide a source data file that has other fields besides the one with the IDs, but Amazon AppFlow ignores them.  
Each flow can delete only one type of object, which is the Salesforce object that you choose when you configure the destination details.  
After your flow runs, you can view the records that it deleted in your Salesforce recycle bin. You can recover the records from the recycle bin if needed. However, you must do so before the retention period elapses or before the records are manually purged.  
If any errors occur when you run the flow, Amazon AppFlow handles them according to the error handling option that you chose when you configured the flow.
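
A source file like the one above can be generated with a short script. The following is a sketch using Python's `csv` module; the record IDs and file name are placeholders.

```python
import csv

# Placeholder Salesforce record IDs to delete.
record_ids = ["A1B2C3D4E5F6G7H8I9", "J1K2L3M4N5O6P7Q9R0"]

# Write the deletion source file: one header row with the single
# source field, then one row per record ID.
with open("records_to_delete.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["salesforce_id"])
    writer.writerows([rid] for rid in record_ids)
```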

## Notes
<a name="salesforce-notes"></a>
+ Amazon AppFlow automatically imports newly created Salesforce fields into Amazon S3 without requiring you to update your flow configuration. This automatic import of new fields is supported only for the Amazon S3 destination.
+ When you use Salesforce as a source, you can import 15 GB of data as part of a single flow run. To transfer over 15 GB of data, you can split your workload into multiple flows by applying the appropriate filters to each flow. Salesforce records are typically 2 KB in size, but can be up to 4 KB. Therefore, 15 GB would be approximately 7.5 million Salesforce records.
+ When you use Salesforce as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.
+ Amazon AppFlow supports Change Data Capture events and platform events from Salesforce.
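
The sizing guidance in these notes translates into a rough estimate of how many filtered flows a large workload needs. The following sketch uses the approximate figures from the notes (2 KB per record, 15 GB per flow run, decimal units).

```python
import math

def flows_needed(total_records, record_size_kb=2, per_flow_limit_gb=15):
    """Rough estimate of how many filtered flows are needed to move a
    Salesforce dataset, given the 15 GB per-run limit noted above."""
    # 15 GB / 2 KB ~= 7.5 million records per flow run (decimal units).
    records_per_flow = (per_flow_limit_gb * 1_000_000) // record_size_kb
    return math.ceil(total_records / records_per_flow)

print(flows_needed(7_500_000))   # 1 — fits in a single 15 GB run
print(flows_needed(20_000_000))  # 3
```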

## Supported destinations
<a name="salesforce-destinations"></a>

When you create a flow that uses Salesforce as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon EventBridge
+ Amazon Honeycode
+ Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="salesforce-resources"></a>
+ [Building Salesforce integrations with EventBridge and Amazon AppFlow](https://aws.amazon.com/blogs/compute/building-salesforce-integrations-with-amazon-eventbridge) in the *AWS Compute* blog 
+  [Building Secure and Private Data Flows Between AWS and Salesforce Using Amazon AppFlow](https://aws.amazon.com/blogs/apn/building-secure-and-private-data-flows-between-aws-and-salesforce-using-amazon-appflow) in the *AWS Partner Network (APN)* blog 
+ [Using Amazon AppFlow to Achieve Bi-Directional Sync Between Salesforce and Amazon RDS for PostgreSQL](https://aws.amazon.com/blogs/apn/using-amazon-appflow-to-achieve-bi-directional-sync-between-salesforce-and-amazon-rds-for-postgresql/) in the *AWS Partner Network (APN)* blog 
+ [Salesforce Private Connect Demo](https://www.salesforce.com/form/conf/aws/private-connect-demo) in the Salesforce documentation
+ [Manage OAuth Access Policies for a Connected App](https://help.salesforce.com/articleView?id=connected_app_manage_oauth.htm) in the Salesforce documentation
+ [Select Objects for Change Notifications in the User Interface](https://developer.salesforce.com/docs/atlas.en-us.change_data_capture.meta/change_data_capture/cdc_select_objects.htm) in the Salesforce documentation
+ How to insert new Salesforce records with data in Amazon S3 using Amazon AppFlow  


## Using a connected app with the Amazon AppFlow API
<a name="salesforce-global-connected-app"></a>

You can use your own connected app for Salesforce with the Amazon AppFlow API.

To use your own connected app, pass the `clientId`, the `clientSecret`, and the Secrets Manager secret ARN to Amazon AppFlow.

You must attach a resource policy to the Secrets Manager secret and to the KMS key that is used to encrypt the secret. This resource policy allows Amazon AppFlow to read and use the secret.

The following is the policy to attach to the KMS key. Replace the *placeholder* values with your own information.

Additionally, you can add confused deputy protection to this KMS key policy. To learn about the confused deputy problem and its mitigations, see [Cross-service confused deputy prevention](https://docs.aws.amazon.com/appflow/latest/userguide/s3-policies-management.html#cross-service-confused-deputy-prevention). The following example shows how you can use the `aws:SourceArn` and `aws:SourceAccount` global condition context keys in your AWS KMS key policy to prevent the confused deputy problem. Replace *Account ID* with your AWS account ID, and replace *Resource ARNs* with a list of ARNs for any connector profiles created with the client credentials secret. You can also use wildcards in the `aws:SourceArn` key. For example, you can replace *Resource ARNs* with `arn:aws:appflow:region:accountId:*` to give access to all Amazon AppFlow resources created on your behalf.
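
As a minimal sketch only (the statement ID, the `kms:` action list, and all placeholder values are assumptions to verify against the Amazon AppFlow documentation), such a KMS key policy statement might look like the following:

```json
{
  "Sid": "AllowAmazonAppFlowToUseTheKey",
  "Effect": "Allow",
  "Principal": { "Service": "appflow.amazonaws.com" },
  "Action": [ "kms:Decrypt", "kms:DescribeKey" ],
  "Resource": "*",
  "Condition": {
    "StringEquals": { "aws:SourceAccount": "123456789012" },
    "ArnLike": { "aws:SourceArn": "arn:aws:appflow:us-east-1:123456789012:*" }
  }
}
```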

The following is the policy to attach to the secret. Replace the *placeholder* values with your own information.
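
As a minimal sketch (the `secretsmanager:` action list and placeholder values are assumptions to verify against the Amazon AppFlow documentation), a resource policy for the secret might look like the following:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "appflow.amazonaws.com" },
      "Action": [
        "secretsmanager:GetSecretValue",
        "secretsmanager:DescribeSecret"
      ],
      "Resource": "*"
    }
  ]
}
```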

## History of supported Salesforce Platform API versions
<a name="salesforce-api-version-history"></a>

When you run a flow with Salesforce as the source or destination, Amazon AppFlow transfers data by using a version of the Salesforce Platform API. The version depends on when you created the Salesforce connection that you assigned to the flow.


| Date when connection created | API version used | 
| --- | --- | 
| June 30, 2023 to present | 58.0 | 
| August 30, 2022 to June 29, 2023 | 55.0 | 
| January 19, 2021 to August 29, 2022 | 50.0 | 
| Before January 19, 2021 | 47.0 | 
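
The table above can be expressed as a small lookup; the following is a sketch.

```python
from datetime import date

# The version table above as a lookup: each entry maps the start date
# of a range to the Salesforce Platform API version used.
VERSION_SCHEDULE = [
    (date(2023, 6, 30), "58.0"),
    (date(2022, 8, 30), "55.0"),
    (date(2021, 1, 19), "50.0"),
    (date.min, "47.0"),
]

def salesforce_api_version(connection_created):
    """Return the API version for a connection created on the given date."""
    for start, version in VERSION_SCHEDULE:
        if connection_created >= start:
            return version

print(salesforce_api_version(date(2024, 1, 1)))   # 58.0
print(salesforce_api_version(date(2020, 12, 1)))  # 47.0
```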

# Salesforce Marketing Cloud connector for Amazon AppFlow
<a name="connectors-salesforce-marketing-cloud"></a>

Marketing Cloud is a Salesforce platform for digital marketing that helps its customers manage campaigns across multiple channels, including email, mobile, and social. If you use Marketing Cloud, you can also use Amazon AppFlow to transfer your data to certain AWS services or other supported applications.

**Topics**
+ [Salesforce Marketing Cloud support](#salesforce-marketing-cloud-support)
+ [Before you begin](#salesforce-marketing-cloud-prereqs)
+ [Connecting Amazon AppFlow to your Salesforce Marketing Cloud account](#salesforce-marketing-cloud-connecting)
+ [Transferring data from Salesforce Marketing Cloud with a flow](#salesforce-marketing-cloud-transfer-data)
+ [Supported objects](#salesforce-marketing-cloud-reference-objects)
+ [Supported destinations](#salesforce-marketing-cloud-reference-destinations)

## Salesforce Marketing Cloud support
<a name="salesforce-marketing-cloud-support"></a>

Amazon AppFlow supports Salesforce Marketing Cloud as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from your Marketing Cloud account.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to your Marketing Cloud account.

## Before you begin
<a name="salesforce-marketing-cloud-prereqs"></a>

Before you can use Amazon AppFlow to transfer data from Marketing Cloud, you need the following:
+ A Salesforce Marketing Cloud account that contains the data that you want to transfer. For more information about the Marketing Cloud data objects that Amazon AppFlow supports, see [Supported objects](#salesforce-marketing-cloud-reference-objects).
+ A Marketing Cloud *package* so that Amazon AppFlow can access your data. In Marketing Cloud, you create packages to add custom functionality to your account. For the steps to create a package, see [Create and Install Packages](https://developer.salesforce.com/docs/marketing/marketing-cloud/guide/install-packages.html) in the Marketing Cloud documentation.

When you create a package for Amazon AppFlow integration, do the following:

1. Add an API integration component to the package.

1. Set the integration type of the component to server-to-server.

1. Grant read access to every data object that you want to transfer with Amazon AppFlow.

1. The Salesforce Marketing Cloud connector supports fetching records from data extensions. If you want to fetch data extension records, add the read and write scopes to your package.

1. After you create the package, note the following properties. You need them to create a connection in Amazon AppFlow:
   + Client ID
   + Client secret
   + Authentication base URI
   + REST base URI or SOAP base URI (you can provide either one)

## Connecting Amazon AppFlow to your Salesforce Marketing Cloud account
<a name="salesforce-marketing-cloud-connecting"></a>

To connect Amazon AppFlow to your Marketing Cloud account, provide details about the package so that Amazon AppFlow can access your data. To learn how to create a package, see [Before you begin](#salesforce-marketing-cloud-prereqs).

**To connect to Salesforce Marketing Cloud**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Salesforce Marketing Cloud**.

1. Choose **Create connection**.

1. In the **Connect to Salesforce Marketing Cloud** window, provide the following details: 
   + **Custom authorization tokens URL** – The authentication base URI that's assigned to your Marketing Cloud package. Provide the subdomain to complete the URI shown in the console: `https://subdomain.auth.marketingcloudapis.com/v2/token`. 
   + **Client ID** – The client ID that is assigned to your Marketing Cloud package.
   + **Client secret** – The client secret that is assigned to your Marketing Cloud package.
   + **Salesforce Marketing Cloud Subdomain Endpoint** – The REST base URI or SOAP base URI that is assigned to your Marketing Cloud package. These URIs look similar to the following examples:
     + `https://subdomain.rest.marketingcloudapis.com/`
     + `https://subdomain.soap.marketingcloudapis.com/`

     In these examples, *subdomain* is the same value that you provide for the custom authorization tokens URL.

     You can provide either the REST URI or the SOAP URI. With either one, Amazon AppFlow connects to your Marketing Cloud package and transfers data by using the REST or SOAP endpoint as needed.

   For more information about the authentication, REST, and SOAP URIs for Marketing Cloud packages, see [Your Subdomain and Your Tenant's Endpoints](https://developer.salesforce.com/docs/marketing/marketing-cloud/guide/your-subdomain-tenant-specific-endpoints.html) in the Marketing Cloud documentation.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Salesforce Marketing Cloud as the data source, you can select this connection.
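
The subdomain-based URIs that you provide in this procedure can also be derived programmatically; the following is a sketch in which the subdomain value is a placeholder.

```python
def marketing_cloud_endpoints(subdomain):
    """Sketch: derive the package URIs described in the procedure above
    from your Marketing Cloud subdomain."""
    return {
        "auth": f"https://{subdomain}.auth.marketingcloudapis.com/v2/token",
        "rest": f"https://{subdomain}.rest.marketingcloudapis.com/",
        "soap": f"https://{subdomain}.soap.marketingcloudapis.com/",
    }

endpoints = marketing_cloud_endpoints("mc1234567890")  # placeholder subdomain
print(endpoints["auth"])
```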

## Transferring data from Salesforce Marketing Cloud with a flow
<a name="salesforce-marketing-cloud-transfer-data"></a>

To transfer data from Marketing Cloud, create an Amazon AppFlow flow, and choose Salesforce Marketing Cloud as the data source. To learn how to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For more information about the objects that Amazon AppFlow supports for Marketing Cloud, see [Supported objects](#salesforce-marketing-cloud-reference-objects).

Also choose the destination where you want to transfer the data object that you selected. For more information on how to configure your destination, see [Supported destinations](#salesforce-marketing-cloud-reference-destinations).

## Supported objects
<a name="salesforce-marketing-cloud-reference-objects"></a>

When you create a flow that uses Salesforce Marketing Cloud as the data source, you can transfer the following data objects from your Marketing Cloud account: 
+ Activity
+ Bounce Event
+ Click Event
+ Content Area
+ Data Extension
+ Email
+ Forwarded Email Event
+ Forwarded Email OptInEvent
+ Link
+ Link Send
+ List
+ List Subscriber
+ Not Sent Event
+ Open Event
+ Send
+ Sent Event
+ Subscriber
+ Survey Event
+ Unsub Event
+ Audit Events
+ Campaigns
+ Interactions
+ Content Assets

## Supported destinations
<a name="salesforce-marketing-cloud-reference-destinations"></a>

When you create a flow that uses Salesforce Marketing Cloud as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Salesforce Pardot
<a name="pardot"></a>

The following are the requirements and connection instructions for using Pardot with Amazon AppFlow.

**Note**  
You can use Pardot as a source only.

**Topics**
+ [Requirements](#pardot-requirements)
+ [Setup instructions](#pardot-setup)
+ [Notes](#pardot-notes)
+ [Supported destinations](#salesforce-pardot-destinations)
+ [Related resources](#pardot-resources)

## Requirements
<a name="pardot-requirements"></a>
+ Your Salesforce account must be enabled for API access. API access is enabled by default for Enterprise, Unlimited, Developer, and Performance editions.
+ Your Salesforce account must allow you to install connected apps. If this option is disabled, contact your Salesforce administrator.
+ After you create a Pardot connection in Amazon AppFlow, verify that the connected app named *Amazon AppFlow Pardot Embedded Login App* is installed in your Salesforce account. For instructions on how to create a connected app in Salesforce, see [Requirements for using your own connected app](salesforce.md#salesforce-global-connected-app-instructions). For more information about connected apps in Salesforce, see [Connected Apps](https://help.salesforce.com/articleView?id=connected_app_overview.htm) in the Salesforce documentation.
+ The refresh token policy for the **Amazon AppFlow Pardot Embedded Login App** must be set to **Refresh token is valid until revoked**. Otherwise, your flows will fail when your refresh token expires.
+ If your Pardot app enforces IP address restrictions, you must grant access to the addresses used by Amazon AppFlow. For more information, see [AWS IP address ranges](https://docs.aws.amazon.com/general/latest/gr/aws-ip-ranges.html) in the *Amazon Web Services General Reference*.

**Pardot version support**  
Amazon AppFlow supports Pardot version 4 only. If you are still using version 3, you must upgrade to version 4 to use Amazon AppFlow. For more information, see [Transitioning from version 3 to version 4](https://developer.pardot.com/kb/api-version-4/#transitioning-from-version-3-to-version-4) in the Pardot documentation.

**Authentication and Pardot business ID**
+ Amazon AppFlow supports authentication via OAuth2 with Pardot. For more information, see [Authentication Via Salesforce OAuth](https://developer.pardot.com/kb/authentication/#via-salesforce-oauth) in the Pardot documentation.
+ You must have the Pardot Business Unit ID that you are trying to authenticate with. To find the Pardot Business Unit ID in Salesforce, go to **Setup** and enter **Pardot Account Setup** in the **Quick Find** box. Your Pardot Business Unit ID begins with *0Uv* and is 18 characters long. If you cannot access the Pardot account setup information, ask your Salesforce administrator to provide you with the Pardot Business Unit ID.
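
The ID format described above can be sanity-checked before you create a connection; the following is a sketch (the example ID is a placeholder, not a real Business Unit ID).

```python
def looks_like_pardot_business_unit_id(value):
    """Sketch: check a candidate Pardot Business Unit ID against the
    format described above (begins with 0Uv, 18 characters long)."""
    return value.startswith("0Uv") and len(value) == 18

print(looks_like_pardot_business_unit_id("0Uv000000000001EAA"))  # True
print(looks_like_pardot_business_unit_id("ABC123"))              # False
```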

## Setup instructions
<a name="pardot-setup"></a>

**To connect to Pardot while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Pardot** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Pardot** dialog box. If you are connecting to Pardot for the first time, follow the instructions to complete the OAuth workflow and create a connection profile.

1. You will be redirected to the Pardot login page. When prompted, grant Amazon AppFlow permissions to access your Pardot account.

Now that you are connected to your Pardot account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#pardot-requirements) section.

## Notes
<a name="pardot-notes"></a>
+ When you use Pardot as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.
+ You can connect Amazon AppFlow to your Pardot [sandbox account](https://pi.demo.pardot.com/) in addition to your Pardot [production account](http://pi.pardot.com/).
+ Amazon AppFlow inherits quotas from Pardot. Quotas are enforced on daily requests and concurrent requests at the customer level. *Pardot Pro* customers are allocated 25,000 API requests a day. *Pardot Ultimate* customers can make up to 100,000 API requests a day. These limits reset at the beginning of the day based on your account time zone settings. Any request that exceeds these quotas results in an [error code 122](http://developer.pardot.com/kb/error-codes-messages/#error-code-122). Amazon AppFlow handles these error codes transparently.

## Supported destinations
<a name="salesforce-pardot-destinations"></a>

When you create a flow that uses Salesforce Pardot as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon EventBridge
+ Amazon Honeycode
+ Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="pardot-resources"></a>
+ [Transitioning from version 3 to version 4](https://developer.pardot.com/kb/api-version-4/#transitioning-from-version-3-to-version-4) in the Pardot documentation
+ [Connected Apps](https://help.salesforce.com/articleView?id=connected_app_overview.htm) in the Salesforce documentation
+ [Authentication Via Salesforce OAuth](https://developer.pardot.com/kb/authentication/#via-salesforce-oauth) in the Pardot documentation

# SAP OData connector for Amazon AppFlow
<a name="sapodata"></a>

The Amazon AppFlow SAP OData connector provides the ability to fetch, create, and update records exposed by SAP S/4HANA and SAP on-premises systems through OData APIs.

With this connector, you can connect Amazon AppFlow to your OData services, including those that extract data from SAP applications that use the Operational Data Provisioning (ODP) framework. These applications are called ODP providers. For more information about how OData services can extract ODP data in SAP, see [ODP-Based Data Extraction via OData](https://help.sap.com/docs/SAP_BPC_VERSION_BW4HANA/dd104a87ab9249968e6279e61378ff66/11853413cf124dde91925284133c007d.html?version=11.0) in the SAP BW/4HANA documentation.

When you connect Amazon AppFlow to ODP providers, you can create flows that run full data transfers or incremental updates. Incremental updates for ODP data are efficient because they transfer only those records that changed since the prior flow run.

## Amazon AppFlow support for SAP OData
<a name="sap-odata-support"></a>

With the SAP OData connector, Amazon AppFlow supports SAP as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from SAP.

**Supported as a data destination?**  
Amazon AppFlow supports SAP OData as a destination, but not for ODP data. You can use Amazon AppFlow to transfer data to an OData service, but you can't transfer data to an ODP provider.

## Before you begin
<a name="sapodata-requirements"></a>

To use Amazon AppFlow to transfer data from SAP OData to supported destinations, you must meet these requirements:
+ Your SAP NetWeaver stack version must be 7.40 SP02 or above.
+ You must enable catalog service for service discovery.
  + **OData V2.0:** The OData V2.0 catalog services can be enabled in your SAP Gateway via transaction **/IWFND/MAINT_SERVICE**.  
![\[Service catalog interface showing two catalog services with version and description details.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/sapodata-odatav2-catalog-service-enablement.png)
  + **OData V4.0:** The OData V4.0 catalog services can be enabled in your SAP Gateway environment by publishing the service groups **/IWFND/CONFIG** or as described in the SAP documentation relevant to your gateway version.  
![\[Service group configuration interface showing available system aliases and services.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/sapodata-odatav4-catalog-service-enablement.png)
+ You must enable OData V2.0/V4.0 services in your SAP Gateway. The OData V2.0 services can be enabled via transaction **/IWFND/MAINT_SERVICE**, and V4.0 services can be published via transaction **/IWFND/V4_ADMIN**.
+ Your SAP OData service must support client-side pagination query options such as **$top** and **$skip**. It must also support the system query option **$count**.
+ Amazon AppFlow supports the following authentication mechanisms:
  + **Basic** - Supported for OData V2.0 and OData V4.0
  + **OAuth 2.0** - Supported for OData V2.0 only. You must enable OAuth 2.0 for the OData service, register the OAuth client as described in the SAP documentation, and set the authorized redirect URL as follows:
    + https://console.aws.amazon.com/appflow/oauth for the us-east-1 Region
    + https://region.console.aws.amazon.com/appflow/oauth for all other Regions
+ You must enable secure setup for connecting over HTTPS.
+ You must provide the required authorization for the user in SAP to discover the services and extract data by using SAP OData services. For more information, see the security documentation provided by SAP.

### ODP requirements
<a name="sapodata-odp-requirements"></a>

Before you can transfer data from an ODP provider, you need to meet the following requirements:
+ You have an SAP NetWeaver AS ABAP instance.
+ Your SAP NetWeaver instance contains an ODP provider that you want to transfer data from. ODP providers include:
  + SAP DataSources (Transaction code RSO2)
  + SAP Core Data Services ABAP CDS Views
  + SAP BW or SAP BW/4HANA systems (InfoObject, DataStore Object)
  + Real-time replication of tables and DB views from an SAP source system via SAP Landscape Transformation Replication Server (SAP SLT)
  + SAP HANA Information Views in SAP ABAP based Sources
+ Your SAP NetWeaver instance has the SAP Gateway Foundation component.
+ You have created an OData service that extracts data from your ODP provider. To create the OData service, you use the SAP Gateway Service Builder. To access your ODP data, Amazon AppFlow calls this service by using the OData API. For more information, see [Generating a Service for Extracting ODP Data via OData](https://help.sap.com/docs/SAP_BPC_VERSION_BW4HANA/dd104a87ab9249968e6279e61378ff66/69b481859ef34bab9cc7d449e6fff7b6.html?version=11.0) in the SAP BW/4HANA documentation.
+ To generate an OData service based on ODP data sources, SAP Gateway Foundation must be installed locally in your ERP/BW stack or in a hub configuration.
  + For your ERP/BW applications, the SAP NetWeaver AS ABAP stack must be at 7.50 SP02 or above.
  + For the hub system (SAP Gateway), the SAP NetWeaver AS ABAP of the hub system must be 7.50 SP01 or above for remote hub setup.

### Private Connection Requirements
<a name="sapodata-private-connection-requirements"></a>

Before you can create a private connection to SAP, you need to meet the following requirements:
+ You must create a VPC endpoint service for your SAP OData instance running in a VPC. This VPC endpoint service must have the Amazon AppFlow service principal **appflow.amazonaws.com** as an allowed principal, and it must be available in more than 50 percent of the Availability Zones in the Region.
+ When you create a connection by using OAuth, your **Authorization Code URL** must be reachable from the network where you set up the connection. This is because the OAuth flow involves browser interaction with the SAP login page, which can't happen over AWS PrivateLink. The network where you set up the connection must be connected to the SAP OData instance running in a VPC so that the host name of the authorization code URL can be resolved. Alternatively, you can make your authorization code URL available over the public internet so that console users can interact with it from any network.
+ For OAuth, in addition to the **Application Host URL**, your **Authorization Tokens URL** must also be available behind the VPC endpoint service so that Amazon AppFlow can fetch access and refresh tokens over the private network.
+ For OAuth, you must set your authorization code expiry to at least 5 minutes.
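As a quick sanity check for the Availability Zone requirement, the following sketch verifies that a VPC endpoint service covers more than 50 percent of a Region's Availability Zones. The zone names are hypothetical examples.

```python
# Sketch: check that a VPC endpoint service is available in more than
# 50 percent of a Region's Availability Zones. Zone names are examples.
def covers_majority(service_azs, region_azs):
    covered = set(service_azs) & set(region_azs)
    return len(covered) > len(region_azs) / 2

region_azs = ["us-east-1a", "us-east-1b", "us-east-1c", "us-east-1d"]
ok = covers_majority(["us-east-1a", "us-east-1b", "us-east-1c"], region_azs)
bad = covers_majority(["us-east-1a", "us-east-1b"], region_azs)
```

With four Availability Zones in the Region, the service must be present in at least three of them to pass the check.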

## Connecting Amazon AppFlow to your SAP account
<a name="sapodata-setup"></a>

To connect Amazon AppFlow to your SAP account, provide details about your SAP OData service so that Amazon AppFlow can access your data. If you haven't yet configured your SAP OData service for Amazon AppFlow integration, see [Before you begin](#sapodata-requirements).

**To create an SAP OData connection**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **SAP OData**.

1. Choose **Create connection**.

1. In the **Connect to SAP OData** window, enter the following information:

   1. Under **Application Host URL**, enter your application host URL. For non-PrivateLink connections, this URL must be accessible over the public internet.

   1. Under **Application Service Path**, enter your catalog service path, for example **/sap/opu/odata/iwfnd/catalogservice;v=2**. Amazon AppFlow doesn't accept a path to a specific object.

   1. Under **Port Number**, enter your port number.

   1. Under **Client Number**, enter your three-digit client number, for example **010**. Acceptable values range from **001** to **999**.

   1. Under **Logon Language**, enter your two-character logon language, for example **EN**.

   1. (Optional) To use a private connection for data transfer, under **AWS PrivateLink service name**, enter your VPC endpoint (AWS PrivateLink) service name, for example **com.amazonaws.vpce.us-east-1.vpce-svc-xxxxxxxxxxxxxx**.

   1. Select your preferred authentication mode.
      + If you choose **Basic**:

        1. Under **User name**, enter your user name.

        1. Under **Password**, enter your password.
      + If you choose **OAuth2**:

        1. Under **Authorization Code URL**, enter your authorization code URL.

        1. Under **Authorization Tokens URL**, enter your authorization token URL.

        1. Under **OAuth Scopes**, enter your OAuth scopes separated by spaces, for example **/IWFND/SG\_MED\_CATALOG\_0002 ZAPI\_SALES\_ORDER\_SRV\_0001**.

        1. Under **Client ID**, enter your client ID.

        1. Under **Client Secret**, enter your client secret.

   1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

      By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

      Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

      If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Continue**.

   1. If you use OAuth, you're redirected to the SAP login page. When prompted, grant Amazon AppFlow permissions to access your SAP account.  
![\[Form to connect SAP OData with AWS PrivateLink, showing fields for configuration and OAuth2 authentication.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-sapodata-console.png)

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses SAP OData as the data source, you can select this connection.

If you chose to enable PrivateLink, note the following:
+ Before Amazon AppFlow can make metadata or data transfer calls to your SAP OData instance over the private network, it creates an AWS PrivateLink endpoint connection to your VPC endpoint service (if one isn't already present). Creating the AWS PrivateLink endpoint can take 3-5 minutes, and until it's created, the connection status is PENDING. While the connection status is PENDING, you can't transfer SAP OData objects with a flow.
+ If your VPC endpoint service has the **Acceptance Required** setting set to true, you must accept the connection in the AWS account that owns the VPC endpoint service before AWS PrivateLink endpoint provisioning can start.
+ After the AWS PrivateLink endpoint connection is established, Amazon AppFlow fetches access and refresh tokens by using the authorization code (OAuth only), makes a test connection call over the private network, and changes the connection status from PENDING to CREATED.
+ If private connection creation fails for any reason, the connection status changes to FAILED.

## Transferring data from SAP OData with a flow
<a name="sap-odata-transfer-data"></a>

To transfer data from SAP OData, create an Amazon AppFlow flow, and choose SAP OData as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose which data object you want to transfer. If the data object originates from an ODP provider, you can configure the flow so that it runs efficient incremental updates that transfer changed records only.

### Transferring ODP data
<a name="sap-odata-transfer-data-odp"></a>

When you create a flow that transfers an ODP data object, you can configure the flow to run *incremental* or *full* data transfers.

#### Incremental ODP data transfers
<a name="odp-incremental"></a>

When you create a flow that transfers ODP data incrementally, it does the following:
+ It subscribes to the *operational delta queue* of your ODP provider. This queue provides Amazon AppFlow with delta tokens, which indicate changes made to the provider's records in SAP.
+ For the initial flow run, it performs a full data transfer. It obtains all available records from your ODP provider, except for any that you omit by adding filters to your flow configuration.
+ For subsequent flow runs, it performs incremental data transfers. By using the information provided by the delta tokens, it transfers only those records that changed after the last flow run.
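Conceptually, the delta-token mechanism works like the following sketch. This is an illustration of the idea only; the class and method names are hypothetical and don't reflect Amazon AppFlow's actual implementation or the SAP ODP API.

```python
# Conceptual sketch of delta-token-based incremental transfer. Names and
# structures here are hypothetical illustrations, not a real ODP client.
class DeltaQueue:
    """Stands in for an operational delta queue on the ODP provider."""
    def __init__(self, records):
        self.records = records  # list of (change_id, record) tuples

    def fetch(self, token=None):
        """Return records changed after the token, plus a new token."""
        start = 0 if token is None else token
        changed = [rec for change_id, rec in self.records if change_id > start]
        new_token = max((cid for cid, _ in self.records), default=start)
        return changed, new_token

queue = DeltaQueue([(1, "order-A"), (2, "order-B")])
batch1, token = queue.fetch()            # initial run: full transfer
queue.records.append((3, "order-C"))
batch2, token = queue.fetch(token)       # next run: only changed records
```

The first fetch returns every record, and each later fetch returns only the records that changed after the token from the previous run.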

When you create an SAP OData flow in the Amazon AppFlow console, you can configure it to transfer data incrementally in the **Flow trigger** section, where you do the following:

1. Choose **Run flow on schedule**.

1. Use the scheduling fields to specify when the flow begins, how often it repeats, and when it ends.

1. Under **Transfer mode**, choose **Incremental transfer**.

   For ODP data objects specifically, the console requires no additional input. This behavior differs from SAP data objects that don't come from an ODP provider. For those objects, you must specify a source timestamp field that Amazon AppFlow uses to identify new or changed records. For ODP data, no such timestamp is necessary because Amazon AppFlow uses the information that's provided by the delta token that it receives from the operational delta queue.

**Important**  
When you create an incremental flow for an ODP data object, the flow creates a subscription to the operational delta queue for that object. Although Amazon AppFlow creates these subscriptions, it doesn't administer them on your behalf. Keep the following subscription behaviors in mind to prevent unwanted effects:  
When a flow subscribes to a queue, it also removes all prior subscriptions to that queue. If you previously created any scheduled flows that transfer the same object, delete those flows. They no longer receive delta tokens, and they stop performing incremental data transfers. For any individual ODP object, maintain only one scheduled flow at a time.
When you delete a flow that subscribes to an operational delta queue, that operation does not delete the subscription itself. You can only delete the subscription by using the SAP system to do so.

#### Full ODP data transfers
<a name="odp-full"></a>

You can create flows that run full data transfers of your ODP data. For these flows, Amazon AppFlow does not create subscriptions to operational delta queues like it does for incremental flows.

When you create an SAP OData flow in the Amazon AppFlow console, you can configure it to run full data transfers in the **Flow trigger** section, where you do the following:
+ Under **Choose how to trigger the flow**, do either of the following:
  + Choose **Run on demand**. After you create an on-demand flow, you run it manually by choosing **Run flow** on its details page in the Amazon AppFlow console.
  + Choose **Run flow on schedule**, and then define your schedule:

    1. Use the scheduling fields to specify when the flow begins, how often it repeats, and when it ends.

    1. For **Transfer mode**, choose **Full transfer**.
**Note**  
To create a flow that runs full data transfers, the schedule frequency that you choose must be no more frequent than **Daily**. If it's more frequent, you can't choose **Full transfer**.

## Advanced capabilities for the SAP OData connector
<a name="sapodata-destination"></a>

For the SAP OData connector, Amazon AppFlow supports a couple of unique capabilities that are unavailable with other destination-enabled connectors. With this connector, you can:
+ Capture the SAP success response when you create a new record.
+ Create deep entities with the SAP OData deep insert feature. For more information about this feature, see [Deep Insert](https://help.sap.com/viewer/68bf513362174d54b58cddec28794093/7.40.22/en-US/11a426519eff236ee10000000a445394.html) in the SAP Gateway Foundation documentation.

You can use these capabilities individually or in combination. For example, you can capture SAP's success response when you insert a deep entity.

To enable these capabilities, complete the following steps.

**To capture the SAP success response for new records**

1. Create an Amazon S3 bucket. The bucket must be in the same AWS Region as the flow that you create for your SAP OData connector. For the steps to create a bucket, see [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html) in the *Amazon S3 User Guide*.

1. Configure the flow by following the steps in [Creating flows in Amazon AppFlow](create-flow.md), but do one additional step: 

   1. On the **Configure flow** page, under **Response handling**, select the bucket that you created. After you finish creating your flow, the SAP success response payload is delivered to this bucket.

**To create SAP deep entities**

1. Generate a JSON Lines input file that defines one deep entity per line, as shown by the following example.

------
#### [ JSON Lines (required format) ]

   The following input file defines two deep entities in JSON Lines format (also called newline-delimited JSON). In this format, each line is a complete JSON object that defines an individual deep entity.

   Each deep entity can include multiple levels of hierarchical data. This example creates two Sales Orders, and each contains two associated Sales Order Items.

   ```
   {"SalesOrderType": "OR","SalesOrganization": "1710","DistributionChannel": "10","OrganizationDivision": "00","SoldToParty": "USCU_S13","TransactionCurrency": "USD","PurchaseOrderByCustomer": "TEST-PO2021","to_Item": [{"Material": "MZ-FG-C990","RequestedQuantity": "10","RequestedQuantityUnit": "PC"},{"Material": "MZ-FG-M500","RequestedQuantity": "10","RequestedQuantityUnit": "PC"}]}
   {"SalesOrderType": "OR","SalesOrganization": "1710","DistributionChannel": "10","OrganizationDivision": "00","SoldToParty": "USCU_S13","TransactionCurrency": "USD","PurchaseOrderByCustomer": "TEST-PO2021","to_Item": [{"Material": "MZ-FG-C990","RequestedQuantity": "10","RequestedQuantityUnit": "PC"},{"Material": "MZ-FG-M500","RequestedQuantity": "10","RequestedQuantityUnit": "PC"}]}
   ```

------
#### [ Formatted JSON (for readability) ]

   The following example shows one of the deep entities from the JSON Lines input file. This example is formatted for readability so that you can more easily see the nested JSON values.

   ```
   {
     "SalesOrderType": "OR",
     "SalesOrganization": "1710",
     "DistributionChannel": "10",
     "OrganizationDivision": "00",
     "SoldToParty": "USCU_S13",
     "TransactionCurrency": "USD",
     "PurchaseOrderByCustomer": "TEST-PO2021",
     "to_Item":
     [
       {
         "Material": "MZ-FG-C990",
         "RequestedQuantity": "10",
         "RequestedQuantityUnit": "PC"
       },
       {
         "Material": "MZ-FG-M500",
         "RequestedQuantity": "10",
         "RequestedQuantityUnit": "PC"
       }
     ]
   }
   ```

   Remember that Amazon AppFlow requires JSON Lines format, so this example would be an invalid input file.

------
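If you produce the input file programmatically, the key point is to serialize each deep entity onto its own line. The following sketch writes a file in the required JSON Lines format; the entity shown is a shortened version of the example above.

```python
# Sketch: write deep entities to a file in JSON Lines format (one complete
# JSON object per line), as Amazon AppFlow requires for deep inserts.
import json

orders = [
    {
        "SalesOrderType": "OR",
        "SoldToParty": "USCU_S13",
        "to_Item": [
            {"Material": "MZ-FG-C990", "RequestedQuantity": "10"},
            {"Material": "MZ-FG-M500", "RequestedQuantity": "10"},
        ],
    },
]

with open("deep-entities.jsonl", "w") as f:
    for order in orders:
        # json.dumps without indentation keeps each entity on one line
        f.write(json.dumps(order) + "\n")
```

Avoid pretty-printed (indented) JSON in the file, because a multi-line entity is not valid JSON Lines input.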

1. Create an Amazon S3 bucket. The bucket must be in the same AWS Region as the flow that you create for your SAP OData connector. For the steps to create a bucket, see [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html) in the *Amazon S3 User Guide*.

1. Upload your deep entities input file to the bucket that you created. For the steps to upload a file, see [Uploading objects](https://docs.aws.amazon.com/AmazonS3/latest/userguide/upload-objects.html) in the *Amazon S3 User Guide*.

1. Configure the flow by following the steps in [Creating flows in Amazon AppFlow](create-flow.md), but do one alternate step: 

   1. On the **Map data fields** page, under **Mapping method**, choose **Passthrough fields without modification**.
**Note**  
When you choose this option, the console disables the options under **Source to destination field mapping**. With this option, you don't define mappings in the console. Instead, the fields in your input file must match the fields that you use in SAP.

### Transferring data with concurrent processes
<a name="concurrent-processes"></a>

When you configure a flow that transfers OData records from an SAP instance, you can speed up the transfer by setting multiple *concurrent processes*. Each concurrent process is a query that retrieves a batch of records from your SAP instance. When the flow transfers your data, it runs these processes at the same time. As a result, the flow uses multiple parallel threads that can transfer large datasets more quickly.

**Note**  
Amazon AppFlow supports concurrent processes only for flows that do the following:   
Transfer OData records.
Transfer from SAP as the data source.
Amazon AppFlow doesn’t support this feature for ODP records or for flows that transfer to SAP as the data destination.

**To transfer your data with concurrent processes**

Configure the flow by following the steps in [Creating flows in Amazon AppFlow](create-flow.md), and do these additional steps: 

1. On the **Configure flow** page, choose your SAP OData connector under **Source details**.

1. In the **Source details** section, under **Additional settings**, set the following options:  
**Batch size**  
The maximum number of records that Amazon AppFlow receives in each page of the response from your SAP application. For transfers of OData records, the maximum page size is 3,000. For transfers of data that comes from an ODP provider, the maximum page size is 10,000.  
**Maximum number of concurrent processes**  
The maximum number of processes that Amazon AppFlow runs at the same time when it retrieves your data. The default value is one. You can specify up to 10.

![\[Additional settings panel with batch size and concurrent processes options for SAP data transfer.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/sapodata-concurrent-processes.png)


When the flow runs, Amazon AppFlow calculates how many processes it needs by dividing the number of records in your instance by the batch size. If that number is less than the maximum, the flow runs each process only once, and it runs only as many processes as it needs. If the number exceeds the maximum, the flow runs the processes in multiple rounds, and it doesn't exceed the maximum at any one time.
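The calculation can be sketched as follows; the record counts and settings are hypothetical examples.

```python
# Sketch: estimate how many batches a transfer needs and how many processes
# run at one time, given the batch size and the concurrency maximum.
import math

def plan(total_records, batch_size, max_concurrent):
    batches = math.ceil(total_records / batch_size)
    # Never run more than max_concurrent processes at once; any remaining
    # batches run in later rounds.
    return batches, min(batches, max_concurrent)

batches, concurrent = plan(total_records=25_000, batch_size=3_000, max_concurrent=10)
more_batches, capped = plan(total_records=60_000, batch_size=3_000, max_concurrent=10)
```

In the first case, 25,000 records at a batch size of 3,000 need 9 batches, all of which fit in one round. In the second, 60,000 records need 20 batches, which run 10 at a time across two rounds.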

## Notes
<a name="sapodata-notes"></a>
+ When you use SAP OData as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.
+ If you have a private ConnectorProfile for a VPC endpoint service and you create another private ConnectorProfile for the same VPC endpoint service, Amazon AppFlow reuses the existing private connection. In that case, you don't need to wait for private connection provisioning to complete before you list and choose SAP OData objects.
+ Amazon AppFlow allows at most 1,000 flow executions at a time per AWS account. If you run multiple flows against the same SAP OData instance, scale your instance accordingly.

## Supported destinations
<a name="sapodata-destinations"></a>

When you create a flow that uses SAP OData as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Redshift
+ Amazon S3
+ SAP OData

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="sapodata-resources"></a>
+ [Setting up SAP Gateway](https://help.sap.com/viewer/product/SAP_GATEWAY) in *SAP* documentation.

# SendGrid connector for Amazon AppFlow
<a name="connectors-sendgrid"></a>

SendGrid is a marketing automation platform and email marketing service. If you're a SendGrid user, your account contains data about your SendGrid activity, such as your lists, segments, and campaigns. You can use Amazon AppFlow to transfer data from SendGrid to certain AWS services or other supported applications.

## Amazon AppFlow support for SendGrid
<a name="sendgrid-support"></a>

Amazon AppFlow supports SendGrid as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from SendGrid.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to SendGrid.



## Before you begin
<a name="sendgrid-prereqs"></a>

To use Amazon AppFlow to transfer data from SendGrid to supported destinations, you must meet these requirements:
+ You have an account with SendGrid that contains the data that you want to transfer. For more information about the SendGrid data objects that Amazon AppFlow supports, see [Supported objects](#sendgrid-objects).
+ You've configured your account with the following settings:
  + You've enabled two-factor authentication. For the steps to enable it, see [Two-Factor Authentication](https://docs.sendgrid.com/ui/account-and-settings/two-factor-authentication) in the SendGrid documentation.
  + You've created an API key that grants full access to your account. For the steps to create one, see [API Keys](https://docs.sendgrid.com/ui/account-and-settings/api-keys) in the SendGrid documentation.

Note the API key from your account settings. You provide it to Amazon AppFlow when you connect to your SendGrid account.

## Connecting Amazon AppFlow to your SendGrid account
<a name="sendgrid-connecting"></a>

To connect Amazon AppFlow to your SendGrid account, provide your API key so that Amazon AppFlow can access your data. If you haven't yet configured your SendGrid account for Amazon AppFlow integration, see [Before you begin](#sendgrid-prereqs).

**To connect to SendGrid**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **SendGrid**.

1. Choose **Create connection**.

1. In the **Connect to SendGrid** window, for **API Key**, enter the API key from your SendGrid account settings.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses SendGrid as the data source, you can select this connection.

## Transferring data from SendGrid with a flow
<a name="sendgrid-transfer-data"></a>



To transfer data from SendGrid, create an Amazon AppFlow flow, and choose SendGrid as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for SendGrid, see [Supported objects](#sendgrid-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#sendgrid-destinations).

## Supported destinations
<a name="sendgrid-destinations"></a>

When you create a flow that uses SendGrid as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="sendgrid-objects"></a>

When you create a flow that uses SendGrid as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-sendgrid.html)

# ServiceNow
<a name="servicenow"></a>

The following are the requirements and connection instructions for using ServiceNow with Amazon AppFlow.

**Note**  
You can use ServiceNow as a source only.

**Topics**
+ [

## Requirements
](#servicenow-requirements)
+ [

## Connection instructions
](#servicenow-setup)
+ [

## Considerations and limitations
](#servicenow-considerations-and-limitations)
+ [

## Supported destinations
](#servicenow-destinations)
+ [

## Related resources
](#servicenow-resources)

## Requirements
<a name="servicenow-requirements"></a>

Before you can use Amazon AppFlow to import data from ServiceNow, you need the following: 
+ A ServiceNow account so that you can provide Amazon AppFlow with your user name, password, and instance name.
+ Access to your ServiceNow instance through a role. This can be an admin role or one that allows the read operation for the following:
  + `sys_db_object`
  + `sys_db_object.*`
  + `sys_dictionary`
  + `sys_dictionary.*`
  + `sys_glide_object`
  + Any table that you want to access with Amazon AppFlow. For example, if you want to import data from a table named `incidents`, you need read access to `incidents` and `incidents.*`.

  For more information about ServiceNow roles, see [Roles](https://docs.servicenow.com/bundle/sandiego-platform-administration/page/administer/roles/reference/r_SecurityJumpStartACLRules.html) in the ServiceNow documentation.

## Connection instructions
<a name="servicenow-setup"></a>

**To connect to ServiceNow while creating a flow**

The ServiceNow connector for Amazon AppFlow supports connections that use either Basic Auth or OAuth2 authentication. You make this choice in the console when you create your connection. If you choose Basic Auth, you provide your user name, password, and instance URL. If you choose OAuth2, you provide your client ID, client secret, and instance URL.

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **ServiceNow** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to ServiceNow** dialog box.

1. 

   1. Under **Connection name**, enter a name for your connection.

   1. In the **Select authentication mode** dropdown menu, select either **Basic Auth** or **OAuth2**.

   1. (For Basic Auth only) Under **User name**, enter your ServiceNow user name.

   1. (For Basic Auth only) Under **Password**, enter the password for that account.

   1. (For OAuth2 only) Under **Client ID**, enter the Client ID from your app.

   1. (For OAuth2 only) Under **Client secret**, specify the Client secret from your app.

   1. Under **Instance URL**, specify the URL of the ServiceNow instance that you want to connect to.

   1. Choose **Connect**.

1. After you're connected, you can choose the ServiceNow object to transfer.

Now that you are connected to your ServiceNow account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#servicenow-requirements) section.

## Considerations and limitations
<a name="servicenow-considerations-and-limitations"></a>
+ After you connect to your ServiceNow instance, you can select the relevant ServiceNow objects from the dropdown list. Given the amount of data available through ServiceNow, the dropdown list can take some time to fully populate. Amazon AppFlow lists all available tables (including custom ones), and you can map the source fields to the destination fields during flow setup.
+ You can run your flows either on demand or on a schedule, which enables you to integrate your ServiceNow data with AWS services.
+ When you use ServiceNow as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.
+ When you use ServiceNow as a source for incremental flows that run on a schedule, Amazon AppFlow uses the `sys_updated_on` field to identify the updated record set.
+ ServiceNow can process up to 100,000 records as part of a single flow run.
+ The Truncate and Mask transformations are not supported for ServiceNow reference type fields. If you apply them, the following behavior occurs:
  + Truncate: Reference type fields become an empty string.
  + Mask: Reference type fields become `null`.
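The incremental behavior that relies on `sys_updated_on` can be illustrated with a sketch. The records and timestamps are hypothetical examples, not real ServiceNow data.

```python
# Sketch: select only records updated since the last flow run by comparing
# the sys_updated_on field, as an incremental ServiceNow flow conceptually
# does. The records below are hypothetical examples.
from datetime import datetime

records = [
    {"number": "INC0001", "sys_updated_on": datetime(2024, 1, 1, 9, 0)},
    {"number": "INC0002", "sys_updated_on": datetime(2024, 1, 2, 9, 0)},
    {"number": "INC0003", "sys_updated_on": datetime(2024, 1, 3, 9, 0)},
]

last_run = datetime(2024, 1, 1, 12, 0)
changed = [r["number"] for r in records if r["sys_updated_on"] > last_run]
```

Only the records updated after the last run's timestamp are selected for transfer.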

## Supported destinations
<a name="servicenow-destinations"></a>

When you create a flow that uses ServiceNow as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="servicenow-resources"></a>
+  [Roles](https://docs.servicenow.com/bundle/paris-platform-administration/page/administer/roles/concept/c_Roles.html) in the *ServiceNow* documentation

# Singular
<a name="singular"></a>

The following are the requirements and connection instructions for using Singular with Amazon AppFlow.

**Note**  
You can use Singular as a source only.

**Topics**
+ [

## Requirements
](#singular-requirements)
+ [

## Connection instructions
](#singular-setup)
+ [

## Notes
](#singular-notes)
+ [

## Supported destinations
](#singular-destinations)
+ [

## Related resources
](#singular-resources)

## Requirements
<a name="singular-requirements"></a>
+ You must provide Amazon AppFlow with an API key. For more information about retrieving your API key, see [Authentication](https://support.singular.net/hc/en-us/articles/360045245692-Reporting-API-Reference) in the Singular documentation.
+ The date range for the flow cannot exceed 30 days.
+ The flow cannot return more than 100,000 records.
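You can validate the date-range constraint before you run the flow. The following sketch checks the 30-day limit; the dates are hypothetical examples.

```python
# Sketch: check that a flow's report date range stays within Singular's
# 30-day limit. The dates are hypothetical examples.
from datetime import date

MAX_RANGE_DAYS = 30

def range_is_valid(start, end):
    return start <= end and (end - start).days <= MAX_RANGE_DAYS

ok = range_is_valid(date(2024, 1, 1), date(2024, 1, 31))        # 30-day span
too_long = range_is_valid(date(2024, 1, 1), date(2024, 2, 15))  # 45-day span
```

A range that exceeds 30 days (or that has the end date before the start date) fails the check.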

## Connection instructions
<a name="singular-setup"></a>

**To connect to Singular while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Singular** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Singular** dialog box.

   1. Under **API key**, enter your API key.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Connect to Singular dialog with fields for API key, data encryption, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-singular-console.png)

Now that you are connected to your Singular account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#singular-requirements) section.

## Notes
<a name="singular-notes"></a>
+ When you use Singular as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per hour.

## Supported destinations
<a name="singular-destinations"></a>

When you create a flow that uses Singular as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Amazon Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="singular-resources"></a>
+ [Authentication](https://support.singular.net/hc/en-us/articles/360045245692-Reporting-API-Reference) in the Singular documentation
+ [Load all your paid marketing with Amazon AppFlow. No code required.](https://www.singular.net/amazon-appflow/) from Singular

# Slack
<a name="slack"></a>

The following are the requirements and connection instructions for using Slack with Amazon AppFlow.

**Note**  
You can use Slack as a source only.

**Topics**
+ [Requirements](#slack-requirements)
+ [Connection instructions](#slack-setup)
+ [Notes](#slack-notes)
+ [Supported destinations](#slack-destinations)
+ [Related resources](#slack-resources)

## Requirements
<a name="slack-requirements"></a>
+ To create a Slack connection in Amazon AppFlow, you must note your client ID, client secret, and Slack instance name. To retrieve your client ID and secret from Slack, you must first create a Slack app if you haven't already. For more information about how to create an app and then retrieve your client ID and secret, see the [Slack documentation](https://api.slack.com/docs/sign-in-with-slack#sign-in-with-slack__details__create-your-slack-app-if-you-havent-already).
+ Set the redirect URL as follows:
  + https://console.aws.amazon.com/appflow/oauth for the us-east-1 Region
  + https://*region*.console.aws.amazon.com/appflow/oauth for all other Regions
+ Set the following user token scopes:
  + `channels:history`
  + `channels:read`
  + `groups:history`
  + `groups:read`
  + `im:history`
  + `im:read`
  + `mpim:history`
  + `mpim:read`
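
If you want to verify your app configuration before connecting, the redirect URL and user token scopes above combine as query parameters on Slack's standard OAuth v2 authorize endpoint. A sketch under that assumption — the function name is illustrative:

```python
from urllib.parse import urlencode

# User token scopes required by Amazon AppFlow, per the list above.
USER_SCOPES = [
    "channels:history", "channels:read",
    "groups:history", "groups:read",
    "im:history", "im:read",
    "mpim:history", "mpim:read",
]

def slack_authorize_url(client_id: str, region: str) -> str:
    """Build a Slack OAuth v2 authorization URL for an AppFlow connection."""
    # us-east-1 uses the unprefixed console host; other Regions are prefixed.
    if region == "us-east-1":
        redirect = "https://console.aws.amazon.com/appflow/oauth"
    else:
        redirect = f"https://{region}.console.aws.amazon.com/appflow/oauth"
    query = urlencode({
        "client_id": client_id,
        "user_scope": ",".join(USER_SCOPES),
        "redirect_uri": redirect,
    })
    return f"https://slack.com/oauth/v2/authorize?{query}"
```

Opening the generated URL in a browser should show Slack's consent screen for exactly these scopes; a scope or redirect mismatch surfaces there rather than mid-flow.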

## Connection instructions
<a name="slack-setup"></a>

**To connect to Slack while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Slack** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Slack** dialog box.

   1. Under **Client ID**, enter your Slack client ID.

   1. Under **Client secret**, enter your Slack client secret.

   1. Under **Workspace**, enter the name of your Slack instance.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Continue**.  
![\[Slack connection form with fields for client ID, secret, workspace URL, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-slack-console.png)

1. You will be redirected to the Slack login page. When prompted, grant Amazon AppFlow permissions to access your Slack account.

Now that you are connected to your Slack account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#slack-requirements) section.

## Notes
<a name="slack-notes"></a>
+ When you use Slack as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.

## Supported destinations
<a name="slack-destinations"></a>

When you create a flow that uses Slack as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="slack-resources"></a>
+ [Retrieve your client ID and secret](https://api.slack.com/docs/sign-in-with-slack#sign-in-with-slack__details__create-your-slack-app-if-you-havent-already) in the Slack documentation
+ [New – Announcing Amazon AppFlow (dataflow: Slack, S3, Athena, QuickSight)](https://aws.amazon.com/blogs/aws/new-announcing-amazon-appflow) in the *AWS News* blog
+ How to transfer data from Slack to Amazon S3 using Amazon AppFlow  


# Smartsheet connector for Amazon AppFlow
<a name="connectors-smartsheet"></a>

Smartsheet is a spreadsheet-based online collaboration service that helps teams plan and track their projects and initiatives. If you're a Smartsheet user, your account contains data about your sheets, such as their creation dates, modification dates, owners, access levels, and more. You can use Amazon AppFlow to transfer data from Smartsheet to certain AWS services or other supported applications.

## Amazon AppFlow support for Smartsheet
<a name="smartsheet-support"></a>

Amazon AppFlow supports Smartsheet as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Smartsheet.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Smartsheet.

## Before you begin
<a name="smartsheet-prereqs"></a>

To use Amazon AppFlow to transfer data from Smartsheet to supported destinations, you must meet these requirements:
+ You have an account with Smartsheet that contains the data that you want to transfer. For more information about the Smartsheet data objects that Amazon AppFlow supports, see [Supported objects](#smartsheet-objects).
+ In your Smartsheet account, you've created an app for Amazon AppFlow. The app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create an app, see [OAuth Walkthrough](https://smartsheet.redoc.ly/#section/OAuth-Walkthrough) in the *Smartsheet API Reference (2.0.0)*.
+ You've configured the app with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Smartsheet. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

Note the client ID and secret from the settings for your app. You provide these values to Amazon AppFlow when you connect to your Smartsheet account.
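
If you register redirect URLs for several Regions, the pattern above can be generated rather than typed. A trivial sketch (the function name is illustrative):

```python
def appflow_redirect_url(region: str) -> str:
    """Build the AppFlow OAuth redirect URL for a given AWS Region code."""
    return f"https://{region}.console.aws.amazon.com/appflow/oauth"

# For example, generate the URLs to register for a set of Regions:
# [appflow_redirect_url(r) for r in ("us-east-1", "eu-west-1")]
```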

## Connecting Amazon AppFlow to your Smartsheet account
<a name="smartsheet-connecting"></a>

To connect Amazon AppFlow to your Smartsheet account, provide the client credentials from your app so that Amazon AppFlow can access your data. If you haven't yet configured your Smartsheet account for Amazon AppFlow integration, see [Before you begin](#smartsheet-prereqs).

**To connect to Smartsheet**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Smartsheet**.

1. Choose **Create connection**.

1. In the **Connect to Smartsheet** window, enter the following information:
   + **Authorization tokens URL** – Do one of the following:
     + To connect to the Smartsheet US region, choose **https://api.smartsheet.com/2.0/token**.
     + To connect to the Smartsheet EU region, choose **https://api.smartsheet.eu/2.0/token**.
   + **Authorization code URL** – Do one of the following:
     + To connect to the Smartsheet US region, choose **https://api.smartsheet.com/b/authorize**.
     + To connect to the Smartsheet EU region, choose **https://api.smartsheet.eu/b/authorize**.
   + **Client ID** – The client ID from the app in your Smartsheet account.
   + **Client secret** – The client secret from the app in your Smartsheet account.
   + **Instance URL** – Do one of the following:
     + To connect to the Smartsheet US region, choose **https://api.smartsheet.com**.
     + To connect to the Smartsheet EU region, choose **https://api.smartsheet.eu**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Smartsheet account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Smartsheet as the data source, you can select this connection.

## Transferring data from Smartsheet with a flow
<a name="smartsheet-transfer-data"></a>

To transfer data from Smartsheet, create an Amazon AppFlow flow, and choose Smartsheet as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Smartsheet, see [Supported objects](#smartsheet-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#smartsheet-destinations).

## Supported destinations
<a name="smartsheet-destinations"></a>

When you create a flow that uses Smartsheet as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="smartsheet-objects"></a>

When you create a flow that uses Smartsheet as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-smartsheet.html)

# Snapchat Ads connector for Amazon AppFlow
<a name="connectors-snapchat-ads"></a>

You can use the Snapchat Ads connector in Amazon AppFlow to transfer data about the ads that you run on Snapchat. After you connect Amazon AppFlow to your ad account with Snapchat business, you can transfer data about your ads, campaigns, customer segments, and more. You can transfer this data to certain AWS services or other supported applications.

## Amazon AppFlow support for Snapchat Ads
<a name="snapchat-ads-support"></a>

Amazon AppFlow supports Snapchat Ads as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Snapchat Ads.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Snapchat Ads.

## Before you begin
<a name="snapchat-ads-prereqs"></a>

To use Amazon AppFlow to transfer data from Snapchat Ads to supported destinations, you must meet these requirements:
+ You have a Snapchat business account, and you've used it to create an ad account. The ad account contains the data that you want to transfer with Amazon AppFlow. For more information about ad accounts, see [Create an Ad Account](https://businesshelp.snapchat.com/s/article/create-ad-account?language=en_US) in the Snapchat Business Help Center.
+ In your Snapchat account, you've created an OAuth app for Amazon AppFlow. The app provides the credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create an app, see [Activate Access to the Snapchat Marketing API](https://businesshelp.snapchat.com/s/article/api-apply?language=en_US) in the Snapchat Business Help Center.
+ You've configured the OAuth app with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Snapchat Ads. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference*.

From the OAuth app settings, note the values for Snap client ID and Snap client secret key. You provide these values to Amazon AppFlow when you connect to your Snapchat account.

## Connecting Amazon AppFlow to your Snapchat Ads account
<a name="snapchat-ads-connecting"></a>

To connect Amazon AppFlow to Snapchat Ads, provide the credentials from the OAuth app in your Snapchat account so that Amazon AppFlow can access your data. If you haven't yet configured your Snapchat account for Amazon AppFlow integration, see [Before you begin](#snapchat-ads-prereqs).

**To connect to Snapchat Ads**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Snapchat Ads**.

1. Choose **Create connection**.

1. In the **Connect to Snapchat Ads** window, enter the following information:
   + **Client ID** — The Snap client ID from your OAuth app.
   + **Client secret** — The Snap client secret key from your OAuth app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your Snapchat account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Snapchat Ads as the data source, you can select this connection.

## Transferring data from Snapchat Ads with a flow
<a name="snapchat-ads-transfer-data"></a>

To transfer data from Snapchat Ads, create an Amazon AppFlow flow, and choose Snapchat Ads as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Snapchat Ads, see [Supported objects](#snapchat-ads-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#snapchat-ads-destinations).

## Supported destinations
<a name="snapchat-ads-destinations"></a>

When you create a flow that uses Snapchat Ads as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="snapchat-ads-objects"></a>

When you create a flow that uses Snapchat Ads as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-snapchat-ads.html)

# Snowflake
<a name="snowflake"></a>

The following are the requirements and connection instructions for using Snowflake with Amazon AppFlow.

**Note**  
You can use Snowflake as a destination only.

**Topics**
+ [Requirements](#snowflake-requirements)
+ [Connection instructions](#snowflake-setup)
+ [Related resources](#snowflake-resources)

## Requirements
<a name="snowflake-requirements"></a>
+ Amazon AppFlow uses the Snowflake COPY command to move data using an S3 bucket. To configure the integration, see [Configuring Secure Access to Amazon S3](https://docs.snowflake.net/manuals/user-guide/data-load-s3-config.html) in the Snowflake documentation.
+ You must also add access to the `kms:Decrypt` action so that Snowflake can access the encrypted data that Amazon AppFlow stores in the Amazon S3 bucket.

  ```
  {
      "Effect": "Allow",
      "Action": "kms:Decrypt",
      "Resource": "*"
  }
  ```
+ You must provide Amazon AppFlow with the following information:
  + The name of the stage and the S3 bucket for the stage
  + The user name and password for your Snowflake account
  + The S3 bucket prefix
  + The warehouse that you want to use to move the data
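
If you create the connection programmatically rather than in the console, the same values map onto the AppFlow `CreateConnectorProfile` API. A sketch using boto3 — all values below are placeholders, and only the request dict is built here; the commented-out call shows where it would be used:

```python
from urllib.parse import urlparse

def account_name_from_url(snowflake_url: str) -> str:
    """Extract the account name from a Snowflake instance URL.

    For example, https://vna33034.snowflakecomputing.com yields vna33034.
    """
    host = urlparse(snowflake_url).hostname or snowflake_url
    return host.split(".")[0]

def snowflake_profile_config(warehouse, stage, bucket, account,
                             username, password, bucket_prefix=""):
    """Assemble the connectorProfileConfig argument for CreateConnectorProfile."""
    return {
        "connectorProfileProperties": {
            "Snowflake": {
                "warehouse": warehouse,
                "stage": stage,            # fully qualified stage name
                "bucketName": bucket,
                "bucketPrefix": bucket_prefix,
                "accountName": account,
            }
        },
        "connectorProfileCredentials": {
            "Snowflake": {"username": username, "password": password}
        },
    }

# The profile could then be created with boto3 (not run here):
# import boto3
# boto3.client("appflow").create_connector_profile(
#     connectorProfileName="my-snowflake",
#     connectorType="Snowflake",
#     connectionMode="Public",
#     connectorProfileConfig=snowflake_profile_config(
#         "MY_WH", "MYDB.PUBLIC.MY_STAGE", "my-appflow-bucket",
#         account_name_from_url("https://vna33034.snowflakecomputing.com"),
#         "appflow_user", "secret",
#     ),
# )
```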

## Connection instructions
<a name="snowflake-setup"></a>

**To connect to Snowflake while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Snowflake** from the **Destination name** dropdown list.

1. Choose **Connect** or **Connect with PrivateLink** to open the **Connect to Snowflake** dialog box.

   1. Under **Warehouse**, enter the Snowflake warehouse that you want to use to move the data.

   1. Under **Stage name**, enter the Amazon S3 stage name in the following format: `<Database>.<Schema>.<Stage name>`

   1. Under **Bucket details**, select the S3 bucket where Amazon AppFlow will write data prior to copying it.

   1. Under **Account name**, enter your Snowflake account name. You can find your account name in the URL of your Snowflake instance. For example, if your Snowflake URL is `https://vna33034.snowflakecomputing.com`, your account name is `vna33034`.

   1. Under **User name**, enter the user name that you use to log in to Snowflake.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Snowflake connection form with fields for warehouse, stage, bucket, account, and other details.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-snowflake-console.png)

Now that you are connected to your Snowflake account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#snowflake-requirements) section.

## Related resources
<a name="snowflake-resources"></a>
+ [Configuring Secure Access to Amazon S3](https://docs.snowflake.net/manuals/user-guide/data-load-s3-config.html) in the Snowflake documentation

# Stripe connector for Amazon AppFlow
<a name="connectors-stripe"></a>

Stripe powers ecommerce with payment processing and other commerce solutions for businesses. If you're a Stripe user, your account contains data about your transactions, such as your balance, charges, and payouts. You can use Amazon AppFlow to transfer data from Stripe to certain AWS services or other supported applications.

## Amazon AppFlow support for Stripe
<a name="stripe-support"></a>

Amazon AppFlow supports Stripe as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Stripe.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Stripe.



## Before you begin
<a name="stripe-prereqs"></a>

Before you can use Amazon AppFlow to transfer data from Stripe, you must have a Stripe account that contains the data to transfer. For more information about the Stripe data objects that Amazon AppFlow supports, see [Supported objects](#stripe-objects).

From your Stripe account, you must obtain a test or live API key. You provide this key to Amazon AppFlow when you connect to your Stripe account. For the steps to obtain these keys, see [Manage API keys](https://stripe.com/docs/development/dashboard/manage-api-keys) in the Stripe Docs.
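
Stripe's conventional key prefixes make it easy to sanity-check which kind of key you're about to paste into the connection dialog. A sketch based on that documented prefix convention — the function name is illustrative, and note that a publishable (`pk_`) key is a client-side key, not the server-side key this connection expects:

```python
def classify_stripe_key(api_key: str) -> str:
    """Classify a Stripe API key by its conventional prefix."""
    if api_key.startswith("sk_test_"):
        return "secret, test mode"
    if api_key.startswith("sk_live_"):
        return "secret, live mode"
    if api_key.startswith(("pk_test_", "pk_live_")):
        return "publishable"
    return "unknown"
```

Checking the prefix before you connect avoids the common mistake of supplying a publishable key and then debugging an authentication failure in the flow.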

## Connecting Amazon AppFlow to your Stripe account
<a name="stripe-connecting"></a>

To connect Amazon AppFlow to your Stripe account, provide your API key so that Amazon AppFlow can access your data. If you haven't yet configured your Stripe account for Amazon AppFlow integration, see [Before you begin](#stripe-prereqs).

**To connect to Stripe**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Stripe**.

1. Choose **Create connection**.

1. In the **Connect to Stripe** window, for **API Key**, enter a test or live API key from your Stripe account settings.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Stripe as the data source, you can select this connection.

## Transferring data from Stripe with a flow
<a name="stripe-transfer-data"></a>

To transfer data from Stripe, create an Amazon AppFlow flow, and choose Stripe as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Stripe, see [Supported objects](#stripe-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#stripe-destinations).

## Supported destinations
<a name="stripe-destinations"></a>

When you create a flow that uses Stripe as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="stripe-objects"></a>

When you create a flow that uses Stripe as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-stripe.html)

# Trend Micro
<a name="trend-micro"></a>

The following are the requirements and connection instructions for using Trend Micro with Amazon AppFlow.

**Note**  
You can use Trend Micro as a source only.

**Topics**
+ [Requirements](#trendmicro-requirements)
+ [Connection instructions](#trendmicro-setup)
+ [Notes](#trendmicro-notes)
+ [Supported destinations](#trend-micro-destinations)
+ [Related resources](#trendmicro-resources)

## Requirements
<a name="trendmicro-requirements"></a>

You must provide Amazon AppFlow with an API secret. For more information about how to generate or retrieve an API secret from Trend Micro, see [Create and Manage API Keys](https://automation.deepsecurity.trendmicro.com/article/12_0/create-and-manage-api-keys/) in the *Trend Micro* documentation.

## Connection instructions
<a name="trendmicro-setup"></a>

**To connect to Trend Micro while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Trend Micro** from the **Source name** dropdown list.

1. Choose **Connect** or **Connect with PrivateLink** to open the **Connect to Trend Micro** dialog box.

   1. Under **API secret key**, enter your API secret key.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Connect to Trend Micro dialog with fields for API secret key, data encryption, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-trendmicro-console.png)

Now that you are connected to your Trend Micro account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#trendmicro-requirements) section.

## Notes
<a name="trendmicro-notes"></a>
+ When you use Trend Micro as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per hour.

## Supported destinations
<a name="trend-micro-destinations"></a>

When you create a flow that uses Trend Micro as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Amazon Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="trendmicro-resources"></a>
+  [Create and Manage API Keys](https://automation.deepsecurity.trendmicro.com/article/12_0/create-and-manage-api-keys/) in the Trend Micro documentation 

# Typeform connector for Amazon AppFlow
<a name="connectors-typeform"></a>

Typeform is an online survey tool. If you're a Typeform user, your account contains data about your survey forms and responses. You can use Amazon AppFlow to transfer data from Typeform to certain AWS services or other supported applications.

## Amazon AppFlow support for Typeform
<a name="typeform-support"></a>

Amazon AppFlow supports Typeform as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Typeform.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Typeform.

## Before you begin
<a name="typeform-prereqs"></a>

To use Amazon AppFlow to transfer data from Typeform to supported destinations, you must meet these requirements:
+ You have an account with Typeform that contains the data that you want to transfer. For more information about the Typeform data objects that Amazon AppFlow supports, see [Supported objects](#typeform-objects).
+ In the settings of your account, you've created either of the following resources for Amazon AppFlow. These resources provide credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account.
  + A developer app to provide OAuth 2.0 authentication. For the steps to create a developer app, see [Create an application in the Typeform admin panel](https://developer.typeform.com/get-started/applications/#1-create-an-application-in-the-typeform-admin-panel) in the documentation for Typeform Developers Platform.
  + A personal token. For the steps to create one, see [Personal access token for Typeform's APIs](https://developer.typeform.com/get-started/personal-access-token) in the documentation for Typeform Developers Platform.
+ If you created a developer app, you've configured it with a redirect URL for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Typeform. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
+ If you created a personal token, you've included the scopes that provide access to the data objects that you want to transfer. For information about Typeform scopes, see [OAuth scopes for your applications](https://developer.typeform.com/get-started/scopes/) in the documentation for Typeform Developers Platform.

If you created a developer app, note the client ID and client secret. If you created a personal token, note the token value. You provide these values to Amazon AppFlow when you connect to your Typeform account.

## Connecting Amazon AppFlow to your Typeform account
<a name="typeform-connecting"></a>

To connect Amazon AppFlow to your Typeform account, provide details from your Typeform project so that Amazon AppFlow can access your data. If you haven't yet configured your Typeform project for Amazon AppFlow integration, see [Before you begin](#typeform-prereqs).

**To connect to Typeform**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Typeform**.

1. Choose **Create connection**.

1. In the **Connect to Typeform** window, for **Select authentication type**, choose how to authenticate Amazon AppFlow with your Typeform account when it requests to access your data:
   + Choose **OAuth2** to authenticate Amazon AppFlow with the credentials from a developer app. Then, enter values for **Client ID** and **Client secret**.
   + Choose **PAT** to authenticate Amazon AppFlow with a personal access token. Then, enter the token value for **Personal access token**.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Depending on the authentication type that you chose, do one of the following:
   + If you chose **OAuth2**, choose **Continue**. Then, in the window that appears, sign in to your Typeform account, and grant access to Amazon AppFlow.
   + If you chose **PAT**, choose **Connect**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Typeform as the data source, you can select this connection.

## Transferring data from Typeform with a flow
<a name="typeform-transfer-data"></a>

To transfer data from Typeform, create an Amazon AppFlow flow, and choose Typeform as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Typeform, see [Supported objects](#typeform-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#typeform-destinations).

## Supported destinations
<a name="typeform-destinations"></a>

When you create a flow that uses Typeform as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="typeform-objects"></a>

When you create a flow that uses Typeform as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-typeform.html)

# Upsolver
<a name="upsolver"></a>

The following are the requirements and connection instructions for using Upsolver with Amazon AppFlow.

**Note**  
You can use Upsolver as a destination only.

**Topics**
+ [Requirements](#upsolver-requirements)
+ [Setup instructions](#upsolver-setup)
+ [Notes](#upsolver-notes)
+ [Related resources](#upsolver-resources)

## Requirements
<a name="upsolver-requirements"></a>
+ You must create an Amazon AppFlow data source in the Upsolver user interface. This will create an S3 bucket in your AWS account where Amazon AppFlow will send data.
+ Alternatively, you can create an Amazon S3 bucket through the Amazon S3 console. The bucket name must begin with `upsolver-appflow`.
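If you create the bucket yourself, the naming requirement above can be checked with a small helper before you create the bucket; the function name and example bucket names are illustrative.

```python
# Check that a bucket name meets the Upsolver naming requirement:
# it must begin with "upsolver-appflow".
def is_valid_upsolver_bucket(name: str) -> bool:
    return name.startswith("upsolver-appflow")

print(is_valid_upsolver_bucket("upsolver-appflow-sales-data"))  # True
print(is_valid_upsolver_bucket("appflow-sales-data"))           # False
```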

## Setup instructions
<a name="upsolver-setup"></a>

**To connect to Upsolver while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Upsolver** from the **Destination name** dropdown list.

1. Under **Bucket details**, select the S3 bucket in which to place your data. You can specify a prefix, which is equivalent to specifying a folder within the S3 bucket where your records are written.  
![\[Destination name field showing Upsolver selected, with bucket details section below.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-upsolver-console.png)

Now that you are connected to your Amazon S3 bucket, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#upsolver-requirements) section.

## Notes
<a name="upsolver-notes"></a>
+ You can configure Amazon AppFlow flows with Upsolver as the destination, and send data from any supported source to the integrated Upsolver Amazon S3 bucket. The data is then available for downstream processing in Upsolver.

## Related resources
<a name="upsolver-resources"></a>
+  [Amazon AppFlow data source](https://docs.upsolver.com/upsolver-1/connecting-data-sources/amazon-aws-data-sources/amazon-appflow-data-source) from the Upsolver documentation

# Veeva
<a name="veeva"></a>

The following are the requirements and connection instructions for using Veeva with Amazon AppFlow.

**Note**  
You can use Veeva as a source only.

**Topics**
+ [Requirements](#veeva-requirements)
+ [Connection instructions](#veeva-setup)
+ [Extract Veeva VAULT documents with Amazon AppFlow](#veeva-document-extraction-feature)
+ [Notes](#veeva-notes)
+ [Supported destinations](#veeva-destinations)
+ [Related resources](#veeva-resources)

## Requirements
<a name="veeva-requirements"></a>
+ You must provide Amazon AppFlow with your user name, password, and Veeva instance name.
+  Your user account must have API access. For more information, see [API access permissions](https://docs-vdm.veevanetwork.com/doc/vndocad/Content/Network_topics/Whats_new/20R2.0/Users.htm#api) in the Veeva documentation.

## Connection instructions
<a name="veeva-setup"></a>

**To connect to Veeva while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Veeva** from the **Source name** dropdown list.

1. Choose **Connect** to open the **Connect to Veeva** dialog box.

   1. Under **User name**, enter the user name you use to log into Veeva.

   1. Under **Password**, enter your password.

   1. Under **Instance name**, enter the name of your Veeva instance.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Connect**.  
![\[Veeva connection form with fields for user name, password, instance name, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-veeva-console.png)

Now that you are connected to your Veeva account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).
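As with other connectors, the same connection can be scripted as a `CreateConnectorProfile` request with the AWS SDK. The following is a minimal sketch: the profile name, instance URL, and credentials are placeholders, and the request shape follows the Amazon AppFlow API reference.

```python
# Sketch of a Veeva connector profile request. Pass this payload to
# boto3.client("appflow").create_connector_profile(**payload).
payload = {
    "connectorProfileName": "my-veeva-connection",  # placeholder name
    "connectorType": "Veeva",
    "connectionMode": "Public",
    "connectorProfileConfig": {
        "connectorProfileProperties": {
            "Veeva": {"instanceUrl": "https://myvault.veevavault.com"}  # placeholder
        },
        "connectorProfileCredentials": {
            "Veeva": {"username": "appflow-user", "password": "YOUR-PASSWORD"}  # placeholders
        },
    },
}
```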

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#veeva-requirements) section above.

## Extract Veeva VAULT documents with Amazon AppFlow
<a name="veeva-document-extraction-feature"></a>

You can use Amazon AppFlow to extract documents from Veeva VAULT. Follow the steps below to configure a flow to extract documents.

**Step 1: Create a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

**Step 2: Configure the flow**

1. Choose **Veeva VAULT** from the **Source name** dropdown list.

1. Choose an existing Veeva VAULT connection, or create a new one.

1. For the source type, choose the **Veeva VAULT documents** option.

1. Choose a **Veeva VAULT document type** from the dropdown.

1. Choose the **Document metadata and source files** option to extract source files along with their associated metadata, or choose the **Metadata only** option to download only the metadata. By default, **Metadata only** is selected.

1. If you select **Document metadata and source files**, do the following:

   1. Choose which **versions** of the document to extract. By default, only the latest version of the document is extracted, but you can choose to extract all versions.

   1. Choose **Renditions** options if required. By default, renditions are not included.  
![\[Configure flow interface for Veeva connection with source details and download options.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/flow_setup_veeva-document_extraction.png)

1. Choose a destination from the dropdown menu.
**Note**  
Currently, Amazon AppFlow supports only Amazon S3 as a destination for document extraction.

1. Choose a **Bucket Name** and **Bucket Prefix**.

1. Select a trigger to run the flow. You can select **Run on demand** or **Run on schedule**. If you choose a scheduled trigger, you can run flows at a maximum frequency of one flow run **per hour**.

1. Choose **Next**.

**Step 3: Map data fields**

1. Choose a mapping method to map fields from the source to the destination: either **Manually map the fields** or **Upload .csv file with mapped fields**.

1. If you choose **Manually map the fields**, choose the fields from the dropdown list.

1. The **Add formula**, **Modify Values**, and **Validations** options are not supported for Veeva VAULT document extraction.

1. Choose **Next**.

**Step 4 (Optional): Add filters**

Specify a filter to determine which records to transfer. Amazon AppFlow enables you to filter data fields by adding multiple filters and by adding criteria to a filter. If you want to filter the documents by **Document subtype** or **Document Classification**, you can add the appropriate filters here.

1. Based on the selected field names, choose the appropriate filter condition.

1. Choose **Next**.

**Step 5: Review and create**
+ Review the information for your flow. To change the information for a step, choose **Edit**. When you are finished, choose **Create flow**. 

## Notes
<a name="veeva-notes"></a>
+ When you use Veeva as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.

## Supported destinations
<a name="veeva-destinations"></a>

When you create a flow that uses Veeva as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="veeva-resources"></a>
+  [API access permissions](https://docs-vdm.veevanetwork.com/doc/vndocad/Content/Network_topics/Whats_new/20R2.0/Users.htm#api) in the Veeva Product Support Portal 

# WooCommerce connector for Amazon AppFlow
<a name="connectors-woocommerce"></a>

WooCommerce helps online merchants build commercial websites with a plugin for WordPress. If you're a WooCommerce user, then your account contains data about your site and your transactions, such as your orders, products, reviews, shipments, and more. You can use Amazon AppFlow to transfer data from WooCommerce to certain AWS services or other supported applications.

## Amazon AppFlow support for WooCommerce
<a name="woocommerce-support"></a>

Amazon AppFlow supports WooCommerce as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from WooCommerce.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to WooCommerce.

## Before you begin
<a name="woocommerce-prereqs"></a>

To use Amazon AppFlow to transfer data from WooCommerce to supported destinations, you must meet these requirements:
+ You have an account with WooCommerce that contains the data that you want to transfer. For more information about the WooCommerce data objects that Amazon AppFlow supports, see [Supported objects](#woocommerce-objects).
+ In your WooCommerce account, you've created a REST API key for Amazon AppFlow. For information about how create a key, see [Authentication](https://woocommerce.github.io/woocommerce-rest-api-docs/?shell#authentication) in the WooCommerce documentation.

From the REST API key details, note the consumer key and consumer secret. You provide these values to Amazon AppFlow when you connect to your WooCommerce account.

## Connecting Amazon AppFlow to your WooCommerce account
<a name="woocommerce-connecting"></a>

To connect Amazon AppFlow to your WooCommerce account, provide the credentials from the REST API key in your WooCommerce account so that Amazon AppFlow can access your data. If you haven't yet configured your WooCommerce account for Amazon AppFlow integration, see [Before you begin](#woocommerce-prereqs).

**To connect to WooCommerce**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **WooCommerce**.

1. Choose **Create connection**.

1. In the **Connect to WooCommerce** window, enter the following information:
   + **Consumer Key** — The consumer key from your REST API key.
   + **Consumer Secret** — The consumer secret from your REST API key.
   + **Instance URL** — The site name that you assigned when you created your site in WooCommerce.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Connect**.

1. In the window that appears, sign in to your WooCommerce account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses WooCommerce as the data source, you can select this connection.

## Transferring data from WooCommerce with a flow
<a name="woocommerce-transfer-data"></a>

To transfer data from WooCommerce, create an Amazon AppFlow flow, and choose WooCommerce as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for WooCommerce, see [Supported objects](#woocommerce-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#woocommerce-destinations).

## Supported destinations
<a name="woocommerce-destinations"></a>

When you create a flow that uses WooCommerce as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="woocommerce-objects"></a>

When you create a flow that uses WooCommerce as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-woocommerce.html)

# Zendesk
<a name="zendesk"></a>

The following are the requirements and connection instructions for using Zendesk with Amazon AppFlow.

**Note**  
You can use Zendesk as a source or a destination.

**Topics**
+ [Requirements](#zendesk-requirements)
+ [Connection instructions](#zendesk-setup)
+ [Notes](#zendesk-notes)
+ [Supported destinations](#zendesk-destinations)
+ [Related resources](#zendesk-resources)

## Requirements
<a name="zendesk-requirements"></a>
+ To use Amazon AppFlow, you must register an application in Zendesk Support to generate the OAuth credentials that Amazon AppFlow uses to authenticate API calls to Zendesk.
+ In Zendesk, you must create an OAuth client with the following settings:
  + Unique identifier: `aws_integration_to_Zendesk`
  + Redirect URL: https://console.aws.amazon.com/appflow/oauth (us-east-1) or https://*region*.console.aws.amazon.com/appflow/oauth (all other Regions)

For more information, see [Setting up the Amazon AppFlow integration with Zendesk](https://support.zendesk.com/hc/en-us/articles/360047196173-Setting-up-the-Amazon-AppFlow-integration-with-Zendesk#topic_lk1_xxn_4lb) in the Zendesk documentation.

## Connection instructions
<a name="zendesk-setup"></a>

**To connect to Zendesk while creating a flow**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow.

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then choose an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value.

1. Choose **Next**.

1. Choose **Zendesk** from the **Source name** or **Destination name** dropdown list.

1. Choose **Connect** to open the **Connect to Zendesk** dialog box.

   1. Under **Client ID**, enter your Zendesk client ID.

   1. Under **Client secret**, enter your Zendesk client secret.

   1. Under **Account**, enter the name of your instance of Zendesk.

   1. Under **Data encryption**, enter your AWS KMS key.

   1. Under **Connection name**, specify a name for your connection.

   1. Choose **Continue**.  
![\[Zendesk connection form with fields for client ID, secret, account URL, and connection name.\]](http://docs.aws.amazon.com/appflow/latest/userguide/images/connection_setup-zendesk-console.png)

Now that you are connected to your Zendesk account, you can continue with the flow creation steps as described in [Creating flows in Amazon AppFlow](create-flow.md).

**Tip**  
If you aren’t connected successfully, ensure that you have followed the instructions in the [Requirements](#zendesk-requirements) section.

## Notes
<a name="zendesk-notes"></a>
+ When you use Zendesk as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.
+ When you use Zendesk as a destination, the following additional settings are available:


| Setting name | Description | 
| --- | --- | 
|  **Insert new records**  |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/zendesk.html)  | 
|  **Update existing records**  |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/zendesk.html)  | 
|  **Upsert records **  |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/zendesk.html)  | 

## Supported destinations
<a name="zendesk-destinations"></a>

When you create a flow that uses Zendesk as the data source, you can set the destination to any of the following connectors: 
+ Amazon Connect
+ Amazon Honeycode
+ Lookout for Metrics
+ Amazon Redshift
+ Amazon S3
+ Marketo
+ Salesforce
+ Snowflake
+ Upsolver
+ Zendesk

You can also set the destination to any custom connectors that you create with the Amazon AppFlow Custom Connector SDKs for [Python](https://github.com/awslabs/aws-appflow-custom-connector-python) or [Java](https://github.com/awslabs/aws-appflow-custom-connector-java). You can download these SDKs from GitHub.

## Related resources
<a name="zendesk-resources"></a>
+  [Setting up the Amazon AppFlow integration with Zendesk](https://support.zendesk.com/hc/en-us/articles/360047196173-Setting-up-the-Amazon-AppFlow-integration-with-Zendesk#topic_lk1_xxn_4lb) in the Zendesk documentation 
+  [Building great customer experiences with Zendesk and AWS](https://www.zendesk.com/blog/building-great-customer-experiences-zendesk-aws/) from Zendesk 
+ How to transfer data from Zendesk Support to Amazon S3 using Amazon AppFlow  


# Zendesk Chat connector for Amazon AppFlow
<a name="connectors-zendesk-chat"></a>

Zendesk Chat is a live chat service that Zendesk offers as part of its platform. Zendesk Chat helps businesses automate and enhance customer support interactions across web, mobile, and social channels. In a Zendesk Chat account, you store data related to customer conversations. If you use Zendesk Chat, you can also use Amazon AppFlow to transfer this data to certain AWS services or other supported applications.

## Amazon AppFlow support for Zendesk Chat
<a name="zendesk-chat-support"></a>

Amazon AppFlow supports Zendesk Chat as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Zendesk Chat.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Zendesk Chat.

## Before you begin
<a name="zendesk-chat-prereqs"></a>

To use Amazon AppFlow to transfer data from Zendesk Chat to supported destinations, you must meet these requirements:
+ You have a Zendesk Chat account.
+ In the Zendesk Chat account settings, you've registered Amazon AppFlow with an *API client*. The API client provides the client credentials that Amazon AppFlow uses to access your data securely with authenticated calls to your account.
+ You've configured your API client with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Zendesk Chat. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

In the settings for your API client, note the client ID and client secret because you will need them to create a connection in Amazon AppFlow.

## Connecting Amazon AppFlow to your Zendesk Chat account
<a name="zendesk-chat-connecting"></a>

To connect Amazon AppFlow to your Zendesk Chat account, provide your Zendesk subdomain and the client credentials that authorize Amazon AppFlow to access your data. If you haven't yet configured your Zendesk Chat account to integrate with Amazon AppFlow, see [Before you begin](#zendesk-chat-prereqs).
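The subdomain you provide is the first label of your Zendesk account host name, and can be extracted from the account URL with a small helper; the function name and example URL are illustrative.

```python
from urllib.parse import urlparse

# Extract the Zendesk subdomain from an account URL, e.g. the
# "my-account" part of https://my-account.zendesk.com.
def zendesk_subdomain(account_url: str) -> str:
    host = urlparse(account_url).hostname  # e.g. "my-account.zendesk.com"
    return host.split(".")[0]

print(zendesk_subdomain("https://my-account.zendesk.com"))  # my-account
```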

**To connect to Zendesk Chat**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Zendesk Chat**.

1. Choose **Create connection**.

1. In the **Connect to Zendesk Chat** window, enter the following information:
   + **Custom authorization code URL** – Your Zendesk subdomain. You can find this value in the URL that you visit when you sign in to Zendesk Chat. For example, in the account URL `https://my-account.zendesk.com`, the subdomain is `my-account`.
   + **Client ID** and **Client secret** – The client credentials that Zendesk assigned to your API client.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**. A window appears that asks if you want to allow Amazon AppFlow to access your Zendesk Chat account.

1. Choose **Allow**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Zendesk Chat as the data source, you can select this connection.

## Transferring data from Zendesk Chat with a flow
<a name="zendesk-chat-transfer-data"></a>

To transfer data from Zendesk Chat, create an Amazon AppFlow flow, and choose Zendesk Chat as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Zendesk Chat, see [Supported objects](#zendesk-chat-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#zendesk-chat-destinations).
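If you prefer to script flow creation with the AWS SDK instead of the console, the source and destination choices above map onto a `CreateFlow` request. The following is a minimal sketch, not a complete recipe: the connection name, S3 bucket name, and entity name are illustrative assumptions, and the exact entity names available to your account come from the Amazon AppFlow console or the `ListConnectorEntities` API.

```python
def build_flow_params(flow_name, connection_name, entity_name, bucket_name):
    """Assemble request parameters for the Amazon AppFlow CreateFlow API.

    This only builds the request. To create the flow, pass the result to
    boto3.client("appflow").create_flow(**params).
    """
    return {
        "flowName": flow_name,
        "triggerConfig": {"triggerType": "OnDemand"},
        "sourceFlowConfig": {
            "connectorType": "CustomConnector",       # Zendesk Chat is a custom connector
            "connectorProfileName": connection_name,  # the connection you created
            "sourceConnectorProperties": {
                "CustomConnector": {"entityName": entity_name}
            },
        },
        "destinationFlowConfigList": [
            {
                "connectorType": "S3",
                "destinationConnectorProperties": {
                    "S3": {
                        "bucketName": bucket_name,
                        "s3OutputFormatConfig": {"fileType": "JSON"},
                    }
                },
            }
        ],
        # Map every source field straight through to the destination.
        "tasks": [
            {"sourceFields": [], "taskType": "Map_all", "taskProperties": {}}
        ],
    }


params = build_flow_params(
    flow_name="zendesk-chat-to-s3",
    connection_name="my-zendesk-chat-connection",  # assumed connection name
    entity_name="account",                         # assumed object; see Supported objects
    bucket_name="my-appflow-bucket",               # assumed S3 bucket
)
# boto3.client("appflow").create_flow(**params)
```

The same request shape applies to the other source connectors in this chapter; only the connection profile and entity name change.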

## Supported objects
<a name="zendesk-chat-objects"></a>

When you create a flow that uses Zendesk Chat as the data source, you can transfer any of the following data objects to supported destinations:
+ Chat Offline Message
+ Chat Support Chat
+ Agent
+ Agent Event
+ Account
+ Department
+ Trigger
+ Shortcut
+ Ban
+ Goal
+ Skill
+ Role
+ Route Setting Account
+ Route Setting Agent

## Supported destinations
<a name="zendesk-chat-destinations"></a>

When you create a flow that uses Zendesk Chat as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Zendesk Sell connector for Amazon AppFlow
<a name="connectors-zendesk-sell"></a>

Zendesk Sell is a customer relationship management (CRM) service that Zendesk offers as part of its platform. Zendesk Sell automates sales workflows to help its users engage leads and close deals. In a Zendesk Sell account, you store data related to sales opportunities, such as contacts, deals, and leads. If you use Zendesk Sell, you can also use Amazon AppFlow to transfer this data to certain AWS services or other supported applications.

## Amazon AppFlow support for Zendesk Sell
<a name="zendesk-sell-support"></a>

Amazon AppFlow supports Zendesk Sell as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Zendesk Sell.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Zendesk Sell.

## Before you begin
<a name="zendesk-sell-prereqs"></a>

To use Amazon AppFlow to transfer data from Zendesk Sell to supported destinations, you must meet these requirements:
+ You have a Zendesk Sell account.
+ In the OAuth settings for your Zendesk Sell account, you've registered Amazon AppFlow with a *developer app*. The developer app provides the client credentials that Amazon AppFlow uses to access your data securely with authenticated calls to the Zendesk Sell API.
+ You've configured the developer app with a redirect URL for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Zendesk Sell. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

In the settings for your developer app, note the client ID and client secret because you will need them to create a connection in Amazon AppFlow.
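Because the Region substitution described above is mechanical, you can generate the redirect URL for any supported Region with a small helper (the function name is illustrative; the URL template is the one shown above):

```python
def appflow_redirect_url(region):
    """Return the Amazon AppFlow OAuth redirect URL for an AWS Region code."""
    return f"https://{region}---console.aws.amazon.com/appflow/oauth"


print(appflow_redirect_url("us-east-1"))
# https://us-east-1.console.aws.amazon.com/appflow/oauth
```

If you use Amazon AppFlow in several Regions, register one redirect URL per Region code in your developer app.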

## Connecting Amazon AppFlow to your Zendesk Sell account
<a name="zendesk-sell-connecting"></a>

To connect Amazon AppFlow to your Zendesk Sell account, provide the client credentials from the developer app that you registered. These credentials authorize Amazon AppFlow to access your data. If you haven't yet configured your Zendesk Sell account to integrate with Amazon AppFlow, see [Before you begin](#zendesk-sell-prereqs).

**To connect to Zendesk Sell**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Zendesk Sell**.

1. Choose **Create connection**.

1. In the **Connect to Zendesk Sell** window, enter values for **Client ID** and **Client secret**. Zendesk assigns these client credentials to the developer app in your Zendesk Sell account.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**. An **Authorize Application** window opens. The window prompts you to give Amazon AppFlow read-only access to your data.

1. Choose **Authorize**.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Zendesk Sell as the data source, you can select this connection.

## Transferring data from Zendesk Sell with a flow
<a name="zendesk-sell-transfer-data"></a>

To transfer data from Zendesk Sell, create an Amazon AppFlow flow, and choose Zendesk Sell as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Zendesk Sell, see [Supported objects](#zendesk-sell-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#zendesk-sell-destinations).

## Supported objects
<a name="zendesk-sell-objects"></a>

When you create a flow that uses Zendesk Sell as the data source, you can transfer any of the following data objects to supported destinations:
+ Contact
+ Deal
+ Lead
+ Note
+ Task

## Supported destinations
<a name="zendesk-sell-destinations"></a>

When you create a flow that uses Zendesk Sell as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

# Zendesk Sunshine connector for Amazon AppFlow
<a name="connectors-zendesk-sunshine"></a>

Zendesk Sunshine is an application that helps builders create custom experiences on the Zendesk platform for ticketing and customer service. If you're a Zendesk Sunshine user, your account contains data about your Zendesk objects and their relationships. You can use Amazon AppFlow to transfer data from Zendesk Sunshine to certain AWS services or other supported applications.

## Amazon AppFlow support for Zendesk Sunshine
<a name="zendesk-sunshine-support"></a>

Amazon AppFlow supports Zendesk Sunshine as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Zendesk Sunshine.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Zendesk Sunshine.

## Before you begin
<a name="zendesk-sunshine-prereqs"></a>

To use Amazon AppFlow to transfer data from Zendesk Sunshine to supported destinations, you must meet these requirements:
+ You have an account with Zendesk that contains the data that you want to transfer. For more information about the Zendesk Sunshine data objects that Amazon AppFlow supports, see [Supported objects](#zendesk-sunshine-objects).
+ In your account, you've activated custom objects. For the steps to activate, see [Enabling custom objects](https://developer.zendesk.com/documentation/custom-data/custom-objects/getting-started-with-custom-objects/#enabling-custom-objects) in the Zendesk Developers documentation.
+ In your account settings, you've created an OAuth client for Amazon AppFlow. The OAuth client provides the client credentials that Amazon AppFlow uses to access your data securely with authenticated calls to your account.
+ You've configured your OAuth client with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Zendesk Sunshine. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

In the settings for your OAuth client, note the client ID and client secret. You provide these values to Amazon AppFlow when you connect to your Zendesk account.

## Connecting Amazon AppFlow to Zendesk Sunshine
<a name="zendesk-sunshine-connecting"></a>

To connect Amazon AppFlow to Zendesk Sunshine, provide the client credentials from your OAuth client so that Amazon AppFlow can access your data. If you haven't yet configured your Zendesk Sunshine account for Amazon AppFlow integration, see [Before you begin](#zendesk-sunshine-prereqs).

**To connect to Zendesk Sunshine**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Zendesk Sunshine**.

1. Choose **Create connection**.

1. In the **Connect to Zendesk Sunshine** window, enter the following information:
   + **Custom authorization tokens URL** and **Custom authorization code URL** – For each of these fields, enter your Zendesk subdomain. You can find the subdomain in the URL that you visit when you sign in to Zendesk. For example, in the account URL `https://my-account.zendesk.com`, the subdomain is `my-account`.
   + **Client ID** and **Client secret** – The client credentials that Zendesk assigned to your OAuth client.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your Zendesk account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Zendesk Sunshine as the data source, you can select this connection.

## Transferring data from Zendesk Sunshine with a flow
<a name="zendesk-sunshine-transfer-data"></a>

To transfer data from Zendesk Sunshine, create an Amazon AppFlow flow, and choose Zendesk Sunshine as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Zendesk Sunshine, see [Supported objects](#zendesk-sunshine-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#zendesk-sunshine-destinations).

## Supported destinations
<a name="zendesk-sunshine-destinations"></a>

When you create a flow that uses Zendesk Sunshine as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="zendesk-sunshine-objects"></a>

When you create a flow that uses Zendesk Sunshine as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-zendesk-sunshine.html)

# Zoho CRM connector for Amazon AppFlow
<a name="connectors-zoho-crm"></a>

Zoho CRM is a customer relationship management (CRM) system that helps its users conduct sales, marketing, and customer support. If you're a Zoho CRM user, your account contains data about your campaigns, deals, leads, and more. After you connect Amazon AppFlow to your Zoho CRM account, you can use Zoho CRM as a data source or destination in your flows. Run these flows to transfer data between Zoho CRM and AWS services or other supported applications.

## Amazon AppFlow support for Zoho CRM
<a name="zoho-crm-support"></a>

Amazon AppFlow supports Zoho CRM as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Zoho CRM.

**Supported as a data destination?**  
Yes. You can use Amazon AppFlow to transfer data to Zoho CRM.

**Supported API version**  
Amazon AppFlow transfers your data by sending requests to version 2.1 of the Zoho CRM API.

## Before you begin
<a name="zoho-crm-prereqs"></a>

To use Amazon AppFlow to transfer data to or from Zoho CRM, you must meet these requirements:
+ You have a Zoho account, which you use to sign in to Zoho CRM. Your Zoho CRM account contains the data that you want to transfer. 
+ In the Zoho Developer Console, you've created a server-based application for Amazon AppFlow. This application provides the credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For the steps to create an application, see [Register your Application](https://www.zoho.com/crm/developer/docs/api/v2.1/register-client.html) in the Zoho CRM documentation.
+ You've configured the application with one or more redirect URLs for Amazon AppFlow.

  Redirect URLs have the following format:

  ```
  https://region.console.aws.amazon.com/appflow/oauth
  ```

  In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Zoho CRM. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

  ```
  https://us-east-1.console.aws.amazon.com/appflow/oauth
  ```

  For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*
+ (Optional) If you want to use your application credentials for all Zoho CRM data centers, you've activated Multi-DC in the application settings, and you've activated all applicable domains.
+ If you want to transfer data to Zoho CRM as the destination, you've stored the data in an Amazon S3 bucket. If you're new to Amazon S3, see [Getting started with Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/GetStartedWithS3.html) in the *Amazon Simple Storage Service User Guide*.

From your application settings, note the values for client ID and client secret. You provide these values to Amazon AppFlow when you connect to your Zoho CRM account.

## Connecting Amazon AppFlow to your Zoho CRM account
<a name="zoho-crm-connecting"></a>

To connect Amazon AppFlow to your Zoho CRM account, provide details from your Zoho CRM application so that Amazon AppFlow can access your data. If you haven't yet configured your Zoho CRM account for Amazon AppFlow integration, see [Before you begin](#zoho-crm-prereqs).

**To connect to Zoho CRM**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Zoho CRM**.

1. Choose **Create connection**.

1. In the **Connect to Zoho CRM** window, enter the following information:
   + **Authorization tokens URL** – The token URL for your data hosting region (Europe, US, Australia, India, or Japan).
   + **Authorization code URL** – The authorization code URL for your data hosting region.
   + **Client ID** – The client ID of the application in your Zoho CRM account.
   + **Client secret** – The client secret of the application in your Zoho CRM account.
   + **Instance URL** – The instance URL for your data hosting region.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**.

1. In the window that appears, sign in to your Zoho CRM account, and grant access to Amazon AppFlow.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Zoho CRM as the data source, you can select this connection.

## Transferring data to or from Zoho CRM with a flow
<a name="zoho-crm-transfer-data"></a>

To transfer data to or from Zoho CRM, create an Amazon AppFlow flow, and choose Zoho CRM as the data source or destination. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).
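When Zoho CRM is the destination, the destination side of the flow configuration also carries a write operation. The following sketch builds only the destination entry of a `CreateFlow` request, assuming the custom connector destination properties; the connection name, entity name, and ID field are illustrative, and your actual entity names come from the Amazon AppFlow console.

```python
def build_zoho_destination(connection_name, entity_name, id_fields):
    """Assemble one destination entry for a flow that writes to Zoho CRM.

    Sketch only: include the result in the destinationFlowConfigList of an
    Amazon AppFlow CreateFlow request.
    """
    return {
        "connectorType": "CustomConnector",
        "connectorProfileName": connection_name,
        "destinationConnectorProperties": {
            "CustomConnector": {
                "entityName": entity_name,
                "writeOperationType": "UPSERT",  # insert new records, update matching ones
                "idFieldNames": id_fields,       # fields that identify existing records
            }
        },
    }


dest = build_zoho_destination(
    connection_name="my-zoho-crm-connection",  # assumed connection name
    entity_name="Leads",                       # assumed Zoho CRM module
    id_fields=["id"],
)
```

`UPSERT` is one of the supported write operation types; use `INSERT` or `UPDATE` instead if you want only one of those behaviors.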

## Supported destinations
<a name="zoho-crm-destinations"></a>

When you create a flow that uses Zoho CRM as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](#connectors-zoho-crm)

# Zoom connector for Amazon AppFlow
<a name="connectors-zoom"></a>

Zoom is an online video conferencing solution for individuals and teams. If you're a Zoom user, your account contains data about your resources, such as users, groups, and rooms. You can use Amazon AppFlow to transfer data from Zoom to certain AWS services or other supported applications.

## Amazon AppFlow support for Zoom
<a name="zoom-support"></a>

Amazon AppFlow supports Zoom as follows.

**Supported as a data source?**  
Yes. You can use Amazon AppFlow to transfer data from Zoom.

**Supported as a data destination?**  
No. You can't use Amazon AppFlow to transfer data to Zoom.

**Supported Zoom plans**  
Amazon AppFlow supports only paid plans for Zoom, such as Pro, Business, or Enterprise. You can't use Amazon AppFlow to transfer data from a Zoom account that subscribes to the free Basic plan. For more information about Zoom plans, see [Plans & Pricing](https://zoom.us/pricing) on the Zoom website.

## Before you begin
<a name="zoom-prereqs"></a>

To use Amazon AppFlow to transfer data from Zoom to supported destinations, you must meet these requirements:
+ You have an account with Zoom that contains the data that you want to transfer. For more information about the Zoom data objects that Amazon AppFlow supports, see [Supported objects](#zoom-objects).
+ In the Zoom App Marketplace, you've created an OAuth app for Amazon AppFlow. This app provides the client credentials that Amazon AppFlow uses to access your data securely when it makes authenticated calls to your account. For more information, see [Build an App](https://marketplace.zoom.us/docs/guides/build/) in the Zoom Developers Docs.
+ You've configured the app with the following settings:
  + You've disabled the option to publish to the Zoom App Marketplace.
  + You've added the scopes listed in [Recommended scopes](#zoom-scopes).
  + You've added one or more redirect URLs for Amazon AppFlow.

    Redirect URLs have the following format:

    ```
    https://region.console.aws.amazon.com/appflow/oauth
    ```

    In this URL, *region* is the code for the AWS Region where you use Amazon AppFlow to transfer data from Zoom. For example, the code for the US East (N. Virginia) Region is `us-east-1`. For that Region, the URL is the following:

    ```
    https://us-east-1.console.aws.amazon.com/appflow/oauth
    ```

    For the AWS Regions that Amazon AppFlow supports, and their codes, see [Amazon AppFlow endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/appflow.html) in the *AWS General Reference.*

Note the values for client ID and client secret from your OAuth app settings. You provide these values to Amazon AppFlow when you connect to your Zoom account.

### Recommended scopes
<a name="zoom-scopes"></a>

Your OAuth app must allow the necessary scopes for the Zoom APIs. These scopes permit Amazon AppFlow to securely access your data in Zoom. We recommend that you enable the scopes below so that Amazon AppFlow can access all supported data objects.

If you want to allow fewer scopes, you can omit any scopes that apply to objects that you don't want to transfer.

You can add scopes by managing your app in the Zoom App Marketplace.
+ `group:master`
+ `group:read:admin`
+ `group:write:admin`
+ `report:master`
+ `report:read:admin`
+ `report_chat:read:admin`
+ `role:master`
+ `role:read:admin`
+ `role:write:admin`
+ `room:master`
+ `room:read:admin`
+ `room:write:admin`
+ `user:master`
+ `user:read:admin`
+ `user:write:admin`

For more information about these scopes, see [OAuth Scopes](https://marketplace.zoom.us/docs/guides/auth/oauth/oauth-scopes/) in the Zoom Developers Docs.
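For example, if you don't plan to transfer the Role objects, you can drop the role scopes from the recommended set before configuring your app. A small sketch of that filtering, using the scope strings from the list above:

```python
# The recommended scopes from the list above.
RECOMMENDED_SCOPES = {
    "group:master", "group:read:admin", "group:write:admin",
    "report:master", "report:read:admin", "report_chat:read:admin",
    "role:master", "role:read:admin", "role:write:admin",
    "room:master", "room:read:admin", "room:write:admin",
    "user:master", "user:read:admin", "user:write:admin",
}

# Omit the scopes for objects that you don't want to transfer, e.g. roles.
scopes = {s for s in RECOMMENDED_SCOPES if not s.startswith("role:")}
```

You then add the resulting scopes to your OAuth app in the Zoom App Marketplace.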

## Connecting Amazon AppFlow to your Zoom account
<a name="zoom-connecting"></a>

To connect Amazon AppFlow to your Zoom account, provide the client credentials from your OAuth app. Amazon AppFlow uses these credentials to access your data. If you haven't yet configured your Zoom account for Amazon AppFlow integration, see [Before you begin](#zoom-prereqs).

**To connect to Zoom**

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. In the navigation pane on the left, choose **Connections**.

1. On the **Manage connections** page, for **Connectors**, choose **Zoom**.

1. Choose **Create connection**.

1. In the **Connect to Zoom** window, for **Client ID** and **Client secret**, enter the client credentials from your OAuth app.

1. Optionally, under **Data encryption**, choose **Customize encryption settings (advanced)** if you want to encrypt your data with a customer managed key in the AWS Key Management Service (AWS KMS).

   By default, Amazon AppFlow encrypts your data with a KMS key that AWS creates, uses, and manages for you. Choose this option if you want to encrypt your data with your own KMS key instead.

   Amazon AppFlow always encrypts your data during transit and at rest. For more information, see [Data protection in Amazon AppFlow](data-protection.md).

   If you want to use a KMS key from the current AWS account, select this key under **Choose an AWS KMS key**. If you want to use a KMS key from a different AWS account, enter the Amazon Resource Name (ARN) for that key.

1. For **Connection name**, enter a name for your connection.

1. Choose **Continue**. A **Sign in** window opens.

1. Enter your user name and password to sign in to your Zoom account.

1. When prompted, verify your sign-in attempt with a one-time passcode.

1. Authorize Amazon AppFlow to access your Zoom account.

On the **Manage connections** page, your new connection appears in the **Connections** table. When you create a flow that uses Zoom as the data source, you can select this connection.

## Transferring data from Zoom with a flow
<a name="zoom-transfer-data"></a>

To transfer data from Zoom, create an Amazon AppFlow flow, and choose Zoom as the data source. For the steps to create a flow, see [Creating flows in Amazon AppFlow](create-flow.md).

When you configure the flow, choose the data object that you want to transfer. For the objects that Amazon AppFlow supports for Zoom, see [Supported objects](#zoom-objects).

Also, choose the destination where you want to transfer the data object that you selected. For more information about how to configure your destination, see [Supported destinations](#zoom-destinations).

## Supported destinations
<a name="zoom-destinations"></a>

When you create a flow that uses Zoom as the data source, you can set the destination to any of the following connectors: 
+ [Amazon Lookout for Metrics](lookout.md)
+ [Amazon Redshift](redshift.md)
+ [Amazon RDS for PostgreSQL](connectors-amazon-rds-postgres-sql.md)
+ [Amazon S3](s3.md)
+ [HubSpot](connectors-hubspot.md)
+ [Marketo](marketo.md)
+ [Salesforce](salesforce.md)
+ [SAP OData](sapodata.md)
+ [Snowflake](snowflake.md)
+ [Upsolver](upsolver.md)
+ [Zendesk](zendesk.md)
+ [Zoho CRM](connectors-zoho-crm.md)

## Supported objects
<a name="zoom-objects"></a>

When you create a flow that uses Zoom as the data source, you can transfer any of the following data objects to supported destinations:

[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/appflow/latest/userguide/connectors-zoom.html)