

# Working with AWS services from the AWS Toolkit Explorer
<a name="working-with-aws"></a>

Your AWS services and resources are available through the AWS Toolkit for JetBrains Explorer.

For more information on how to navigate the AWS Toolkit for JetBrains and the AWS Explorer, see the [Navigation](https://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/navigation.html) topic in this User Guide.

To learn more about working with a specific AWS service from the AWS Toolkit for JetBrains, choose from the following list of topics.

**Topics**
+ [Experimental features](experimental-features.md)
+ [AWS App Runner](using-apprunner.md)
+ [Amazon CodeCatalyst](codecatalyst-service.md)
+ [AWS CloudFormation](cloudformation.md)
+ [Amazon CloudWatch Logs](building-cloudwatch.md)
+ [Amazon DynamoDB](dynamodb.md)
+ [Amazon ECS](ecs.md)
+ [Amazon EventBridge](eventbridge.md)
+ [AWS Lambda](building-lambda.md)
+ [Amazon RDS](accessing-rds.md)
+ [Amazon Redshift](accessing-redshift.md)
+ [Amazon S3](building-S3.md)
+ [AWS Serverless](sam.md)
+ [Amazon SQS](sqs.md)
+ [Resources](more-resources.md)

# Working with experimental features
<a name="experimental-features"></a>

Experimental features offer early access to features in the AWS Toolkit for JetBrains before they're officially released. 

**Warning**  
Because experimental features continue to be tested and updated, they may have usability issues. Experimental features may also be removed from the AWS Toolkit for JetBrains without notice.

You can enable experimental features for specific AWS services in the **AWS** section of the **Settings** pane in your JetBrains IDE.

1. To edit AWS settings in JetBrains, choose **File**, **Settings** (or press **Ctrl+Alt+S**).

1. In the **Settings** pane, expand **Tools** and choose **AWS**, **Experimental Features**.

1. Select the checkboxes for the experimental features you want to access prior to release. If you want to switch off an experimental feature, clear the relevant checkbox.

1. After enabling experimental features, you can confirm by opening the **AWS Explorer** and choosing **Options** (the gear icon), **Experimental Features**. A checkmark beside the name of the feature indicates that it's available for use.

# Using AWS Toolkit for JetBrains with AWS App Runner
<a name="using-apprunner"></a>

[AWS App Runner](https://docs.aws.amazon.com/apprunner/latest/dg/what-is-apprunner.html) provides a fast, simple, and cost-effective way to deploy from source code or a container image directly to a scalable and secure web application in the AWS Cloud. Using it, you don't need to learn new technologies, decide which compute service to use, or know how to provision and configure AWS resources.

You can use AWS App Runner to create and manage services based on a *source image* or *source code*. If you use a source image, you can choose a public or private container image that's stored in an image repository. App Runner supports the following image repository providers:
+ Amazon Elastic Container Registry (Amazon ECR): Stores private images in your AWS account.
+ Amazon Elastic Container Registry Public (Amazon ECR Public): Stores publicly readable images.

If you choose the source code option, you can deploy from a source code repository that's maintained by a supported repository provider. Currently, App Runner supports [GitHub](https://github.com/) as a source code repository provider.

## Prerequisites
<a name="apprunner-prereqs"></a>

This section assumes that you already have an AWS account and the latest version of the AWS Toolkit for JetBrains that includes AWS App Runner support. In addition to those core requirements, make sure that all relevant IAM users have permissions to interact with the App Runner service. You also need specific information about your service source, such as the container image URI or the connection to the GitHub repository. You need this information when creating your App Runner service.

### Configuring IAM permissions for App Runner
<a name="app-runner-permissions"></a>

The easiest way to grant the permissions that are required for App Runner is to attach an existing AWS managed policy to the relevant IAM entity, specifically a user or group. App Runner provides two managed policies that you can attach to your IAM users:
+ `AWSAppRunnerFullAccess`: Allows users to perform all App Runner actions.
+ `AWSAppRunnerReadOnlyAccess`: Allows users to list and view details about App Runner resources.

In addition, if you choose a private repository from the Amazon Elastic Container Registry (Amazon ECR) as the service source, you must create the following access role for your App Runner service:
+ `AWSAppRunnerServicePolicyForECRAccess`: Allows App Runner to access Amazon Elastic Container Registry (Amazon ECR) images in your account.

You can use the **Create App Runner Service** dialog box to create this IAM role.
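If you prefer to set up permissions from the command line, you can attach a managed policy with the AWS CLI. The following is a sketch; the user name `app-runner-dev` is a hypothetical example that you'd replace with your own IAM user:

```shell
# Attach the App Runner full-access managed policy to an IAM user
# (the user name here is a hypothetical example).
aws iam attach-user-policy \
    --user-name app-runner-dev \
    --policy-arn arn:aws:iam::aws:policy/AWSAppRunnerFullAccess

# Verify that the policy is attached.
aws iam list-attached-user-policies --user-name app-runner-dev
```

For read-only access, attach `arn:aws:iam::aws:policy/AWSAppRunnerReadOnlyAccess` instead.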

**Note**  
The **AWSServiceRoleForAppRunner** service-linked role allows AWS App Runner to complete the following tasks:  
+ Push logs to Amazon CloudWatch Logs log groups.
+ Create Amazon CloudWatch Events rules to subscribe to Amazon Elastic Container Registry (Amazon ECR) image push events.

You don't need to create the service-linked role manually. When you create an AWS App Runner service in the AWS Management Console or by using API operations that are called by the AWS Toolkit for JetBrains, AWS App Runner creates this service-linked role for you.

For more information, see [Identity and access management for App Runner](https://docs.aws.amazon.com/apprunner/latest/dg/security-iam.html) in the *AWS App Runner Developer Guide*.

### Obtaining service sources for App Runner
<a name="app-runner-sources"></a>

You can use AWS App Runner to deploy services from a source image or source code. 

------
#### [ Source image ]

If you're deploying from a source image, you can obtain a link to the repository for that image from a private or public AWS image registry. 
+ Amazon ECR private registry: Copy the URI for a private repository by using the Amazon ECR console at [https://console.aws.amazon.com/ecr/repositories](https://console.aws.amazon.com/ecr/repositories).
+ Amazon ECR public registry: Copy the URI for a public repository by using the Amazon ECR Public Gallery at [https://gallery.ecr.aws/](https://gallery.ecr.aws/).

You specify the URI for the image repository when entering details for your source in the **Create App Runner Service** dialog box.

For more information, see [App Runner service based on a source image](https://docs.aws.amazon.com/apprunner/latest/dg/service-source-image.html) in the *AWS App Runner Developer Guide*.
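You can also look up a private repository URI from the command line. This is a sketch; the repository name `my-web-app` is a hypothetical example:

```shell
# Print the URI of a private Amazon ECR repository
# (the repository name is a hypothetical example).
aws ecr describe-repositories \
    --repository-names my-web-app \
    --query "repositories[0].repositoryUri" \
    --output text
```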

------
#### [ Source code ]

For your source code to be deployed to an AWS App Runner service, that code must be stored in a Git repository that's maintained by a supported repository provider. App Runner supports one source code repository provider: [GitHub](https://github.com/).

For information about setting up a GitHub repository, see the [Getting started documentation](https://docs.github.com/en/github/getting-started-with-github) on GitHub.

To deploy your source code to an App Runner service from a GitHub repository, App Runner establishes a connection to GitHub. If your repository is private (that is, it isn't publicly accessible on GitHub), you must provide App Runner with connection details. 

**Important**  
You must use the App Runner console ([https://console.aws.amazon.com/apprunner](https://console.aws.amazon.com/apprunner)) to create a connection that links GitHub to AWS. You can then select from the available connections on the **GitHub connections** page when using the **Create App Runner Service** dialog box to specify details about your source code repository.   
For more information, see [Managing App Runner connections](https://docs.aws.amazon.com/apprunner/latest/dg/manage-connections.html) in the *AWS App Runner Developer Guide*.

The App Runner service instance provides a managed runtime that allows your code to build and run. AWS App Runner currently supports the following runtimes:
+ Python managed runtime 
+ Node.js managed runtime

Using the **Create App Runner Service** dialog box that's available in the AWS Toolkit for JetBrains, you provide information about how the App Runner service builds and starts your service. You can enter the information directly in the interface or specify a YAML-formatted [App Runner configuration file](https://docs.aws.amazon.com/apprunner/latest/dg/config-file.html). Values in this file tell App Runner how to build and start your service, and provide runtime context such as relevant network settings and environment variables. The configuration file is named `apprunner.yaml` and is located in the root directory of your application's repository.
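As a sketch of what such a file can contain, the following minimal `apprunner.yaml` is for a hypothetical Python application; the build command, start command, port, and environment variable are illustrative examples, not required values:

```yaml
version: 1.0
runtime: python3
build:
  commands:
    build:
      # Install the application's dependencies (hypothetical example).
      - pip install -r requirements.txt
run:
  # Command that starts the application (hypothetical example).
  command: python app.py
  network:
    # Port that the application listens on.
    port: 8080
  env:
    # Optional environment variables for the service instance.
    - name: APP_ENV
      value: "production"
```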

 

------

## Pricing
<a name="app-runner-pricing"></a>

You're charged for the compute and memory resources that your application uses. In addition, if you automate your deployments, you also pay a set monthly fee for each application that covers all automated deployments for that month. If you opt to deploy from source code, you additionally pay a build fee for the amount of time that it takes App Runner to build a container from your source code.

For more information, see [AWS App Runner Pricing](https://aws.amazon.com/apprunner/pricing/).

**Topics**
+ [Prerequisites](#apprunner-prereqs)
+ [Pricing](#app-runner-pricing)
+ [Creating App Runner services](creating-service-apprunner.md)
+ [Managing App Runner services](managing-service-apprunner.md)

# Creating App Runner services
<a name="creating-service-apprunner"></a>

You can create an App Runner service in AWS Toolkit for JetBrains by using the **Create App Runner Service** dialog box. You can use its interface to select a source repository and configure the service instance where your application runs. 

Before creating an App Runner service, make sure that you completed all the [prerequisites](using-apprunner.md#apprunner-prereqs). These include setting up the relevant IAM permissions and noting the specific information about the source repository that you want to deploy.

## To create an App Runner service
<a name="create-service"></a>

1. Open AWS Explorer, if it isn't already open.

1. Right-click the **App Runner** node and choose **Create Service**.

   The **Create App Runner Service** dialog box is displayed.

1. Enter your unique **Service name**.

1. Choose your source type (**ECR**, **ECR public**, or **Source code repository**) and configure the relevant settings:

------
#### [ ECR/ECR public ]

   If you're using a private registry, choose the **Deployment type**:
   + **Manual**: Use manual deployment if you want to explicitly initiate each deployment to your service. 
   + **Automatic**: Use automatic deployment if you want to implement continuous integration and deployment (CI/CD) behavior for your service. With this option, whenever you push a new image version to your image repository or a new commit to your code repository, App Runner automatically deploys it to your service without further action from you.

   For **Container image URI**, enter the URI of the image repository that you copied from your Amazon ECR private registry or Amazon ECR Public Gallery.

   For **Start Command**, enter the command to start the service process.

   For **Port**, enter the IP port that's used by the service.

   If you're using an Amazon ECR private registry, select the required **ECR access role** and choose **Create**.
   + The **Create IAM Role** dialog box displays the **Name**, **Managed Policies**, and **Trust Relationships** for the IAM role. Choose **Create**.

------
#### [ Source code repository ]

   Choose the **Deployment type**:
   + **Manual**: Use manual deployment if you want to explicitly initiate each deployment to your service. 
   + **Automatic**: Use automatic deployment if you want to implement continuous integration and deployment (CI/CD) behavior for your service. With this option, whenever you push a new image version to your image repository or a new commit to your code repository, App Runner automatically deploys it to your service without further action from you.

   For **Connections**, select a connection from the list on the **GitHub connections** page.

   For **Repository URL**, enter the link to the remote repository that's hosted on GitHub.

   For **Branch**, indicate which Git branch of your source code you want to deploy.

   For **Configuration**, choose how you want to provide your runtime configuration:
   + **Configure all settings here**: Choose this option if you want to specify the following settings for the runtime environment of your application: 
     + **Runtime**: Choose **Python 3** or **Nodejs 12**.
     + **Port**: Enter the IP port that your service uses.
     + **Build command**: Enter the command to build your application in the runtime environment of your service instance.
     + **Start command**: Enter the command to start your application in the runtime environment of your service instance.
   + **Provide a configuration file settings here**: Choose this option to use the settings that are defined by the `apprunner.yaml` configuration file. This file is in the root directory of your application’s repository.

------

1. Specify values to define the runtime configuration of the App Runner service instance: 
   + **CPU**: The number of CPU units that are reserved for each instance of your App Runner service (Default: `1 vCPU`).
   + **Memory**: The amount of memory that's reserved for each instance of your App Runner service (Default: `2 GB`).
   + **Environment variables**: Optional variables that you use to customize behavior in your service instance. Create environment variables by defining a key and a value.

1. Choose **Create**.

   While your service is being created, its status changes from **Operation in progress** to **Running**.

1. After your service starts running, right-click it and choose **Copy Service URL**.

1. To access your deployed application, paste the copied URL into the address bar of your web browser.

# Managing App Runner services
<a name="managing-service-apprunner"></a>

After creating an App Runner service, you can manage it by using the AWS Explorer pane to carry out the following activities:
+ [Pausing and resuming App Runner services](#pause-resume-apprunner)
+ [Deploying App Runner services](#deploying-apprunner)
+ [Viewing log streams for App Runner](#viewing-logs-apprunner)
+ [Deleting App Runner services](#deleting-apprunner)

## Pausing and resuming App Runner services
<a name="pause-resume-apprunner"></a>

If you need to disable your web application temporarily and stop the code from running, you can pause your AWS App Runner service. App Runner reduces the compute capacity for the service to zero. When you're ready to run your application again, resume your App Runner service. App Runner provisions new compute capacity, deploys your application to it, and runs the application.

**Important**  
You're billed for App Runner only when it's running. Therefore, you can pause and resume your application as needed to manage costs. This is particularly helpful in development and testing scenarios.

## To pause your App Runner service
<a name="pause-app-runner"></a>

1. Open AWS Explorer, if it isn't already open.

1. Expand **App Runner** to view the list of services.

1. Right-click your service and choose **Pause**.

1. In the dialog box that displays, choose **Pause**.

   While the service is pausing, the service status changes from **Running** to **Operation in progress** and then to **Paused**.

## To resume your App Runner service
<a name="resume-app-runner"></a>

1. Open AWS Explorer, if it isn't already open.

1. Expand **App Runner** to view the list of services.

1. Right-click your service and choose **Resume**.

1. In the dialog box that displays, choose **Resume**.

   While the service is resuming, the service status changes from **Paused** to **Operation in progress** and then to **Running**.
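The same pause and resume operations are also available from the AWS CLI, which can be useful for scripting cost controls. This is a sketch; the service ARN below is a placeholder that you'd replace with your own service's ARN:

```shell
# Pause a running service (the ARN is a placeholder example).
aws apprunner pause-service \
    --service-arn arn:aws:apprunner:us-east-1:111122223333:service/my-service/abc123

# Resume it when you're ready to run the application again.
aws apprunner resume-service \
    --service-arn arn:aws:apprunner:us-east-1:111122223333:service/my-service/abc123
```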

## Deploying App Runner services
<a name="deploying-apprunner"></a>

If you choose the manual deployment option for your service, you must explicitly initiate each deployment to your service.

1. Open AWS Explorer, if it isn't already open.

1. Expand **App Runner** to view the list of services.

1. Right-click your service and choose **Deploy**.

1. While your application is being deployed, the service status changes from **Operation in progress** to **Running**.

1. To confirm that your application is successfully deployed, right-click the same service and choose **Copy Service URL**.

1. To access your deployed web application, paste the copied URL into the address bar of your web browser.
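A manual deployment can also be started from the AWS CLI. This is a sketch; the service ARN below is a placeholder that you'd replace with your own service's ARN:

```shell
# Start a manual deployment (the ARN is a placeholder example).
aws apprunner start-deployment \
    --service-arn arn:aws:apprunner:us-east-1:111122223333:service/my-service/abc123

# Check the service status until it returns to RUNNING.
aws apprunner describe-service \
    --service-arn arn:aws:apprunner:us-east-1:111122223333:service/my-service/abc123 \
    --query "Service.Status"
```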

## Viewing log streams for App Runner
<a name="viewing-logs-apprunner"></a>

Use CloudWatch Logs to monitor, store, and access the log files for services such as App Runner. CloudWatch Logs records two distinct types of data: log events and log streams. Log events are records of activity recorded by the application or resource that you monitor with CloudWatch Logs. A log stream is a sequence of log events that share the same source.

You can access the two following types of log streams for App Runner:
+ **Service log streams**: Contains the logging output that's generated by App Runner. For this type of log stream, the log events are records of how App Runner manages your service and acts on it.
+ **Application log streams**: Contains the output of your running application code.

1. Expand **App Runner** to view the list of services.

1. Right-click a service and choose one of the following options:
   + **View Service Log Streams**
   + **View Application Log Streams**

   The **Log Streams** pane displays the log events that make up the log stream. 

1. To view more information about a specific event, right-click it and choose **Export Log Stream**, **Open in Editor**, or **Export Log Stream**, **Save to a File**.
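These log streams can also be followed from the command line with the AWS CLI. This is a sketch; the log group name is a hypothetical example, assuming App Runner's usual naming pattern of `/aws/apprunner/<service-name>/<service-id>/service` and `/application`:

```shell
# Follow the application log stream for a service
# (the log group name is a hypothetical example).
aws logs tail /aws/apprunner/my-service/abc123/application --follow
```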

## Deleting App Runner services
<a name="deleting-apprunner"></a>

**Important**  
If you delete your App Runner service, it's permanently removed and your stored data is deleted. If you need to recreate the service, App Runner must fetch your source again and, if your source is a code repository, build it again. Your web application also gets a new App Runner domain.

1. Open AWS Explorer, if it isn't already open.

1. Expand **App Runner** to view the list of services.

1. Right-click a service and choose **Delete Service**.

1. In the confirmation dialog box, enter *delete me* and then choose **OK**.

   The deleted service displays the **Operation in progress** status, and then the service disappears from the list.
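You can also delete a service from the AWS CLI. This is a sketch; the service ARN below is a placeholder that you'd replace with an ARN returned by `list-services`:

```shell
# Find the ARN of the service that you want to delete.
aws apprunner list-services

# Permanently delete the service (the ARN is a placeholder example).
aws apprunner delete-service \
    --service-arn arn:aws:apprunner:us-east-1:111122223333:service/my-service/abc123
```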

# Amazon CodeCatalyst for JetBrains
<a name="codecatalyst-service"></a>

## What is Amazon CodeCatalyst?
<a name="codecatalyst-intro"></a>

Amazon CodeCatalyst is a cloud-based collaboration space for software development teams. Using the AWS Toolkit for JetBrains Gateway, you can view and manage your CodeCatalyst resources directly from JetBrains Gateway. You can also use the Toolkit to launch, manage, and edit Dev Environments, which are virtual computing environments. For more information about CodeCatalyst, see the [Amazon CodeCatalyst](https://docs.aws.amazon.com/codecatalyst/latest/userguide/welcome.html) User Guide.

The following topics describe how to connect the AWS Toolkit for JetBrains Gateway with CodeCatalyst and how to work with CodeCatalyst through JetBrains Gateway.

**Topics**
+ [What is Amazon CodeCatalyst?](#codecatalyst-intro)
+ [Getting Started with CodeCatalyst](codecatalyst-setup.md)
+ [Working with CodeCatalyst](codecatalyst-overview.md)

# Getting Started with CodeCatalyst and the AWS Toolkit for JetBrains
<a name="codecatalyst-setup"></a>

To get started working with CodeCatalyst from JetBrains Gateway, complete the following tasks.

## Downloading JetBrains Gateway
<a name="codecatalyst-setup-jbgateway"></a>

Before you integrate the AWS Toolkit with your CodeCatalyst accounts, make sure that you're using the current version of JetBrains Gateway. To download the latest version of JetBrains Gateway, choose the JetBrains Gateway distribution you want from the following links:
+ [JetBrains Gateway for Linux](https://download.jetbrains.com/product?code=GW&latest&distribution=linux)
+ [JetBrains Gateway for Windows](https://download.jetbrains.com/product?code=GW&latest&distribution=windows)
+ [JetBrains Gateway for macOS](https://download.jetbrains.com/product?code=GW&latest&distribution=mac)
+ [JetBrains Gateway for macOS Apple Silicon](https://download.jetbrains.com/product?code=GW&latest&distribution=macM1)

## Installing the AWS Toolkit for JetBrains Gateway
<a name="codecatalyst-setup-toolkit"></a>

To connect JetBrains with your CodeCatalyst account, you must install the latest version of the toolkit extension. You can find the latest version and complete the installation of the toolkit directly from the JetBrains **Plugins Marketplace**. 

To install the AWS Toolkit plugin from the JetBrains **Plugins Marketplace**, complete the following steps:

1. From the JetBrains Gateway main screen, choose the **Settings/Preferences** icon in the bottom-left corner of the application.

1. Choose **Settings/Preferences** to open the **Settings/Preferences** view.

1. In the **Settings/Preferences** view, choose **Plugins** to open the **Plugins** view.
**Note**  
The **Plugins** view can open in either the **Marketplace** view or the **Installed** view.   
If this is your first time installing the AWS Toolkit for JetBrains Gateway, select the **Plugins Marketplace** view to continue.
If you have a previous version of the AWS Toolkit for JetBrains Gateway, update it from the **Installed** view.

1. From the **Marketplace** view, enter the text `AWS Toolkit` and choose the **AWS Toolkit** plugin entry when it appears.

1. Choose **Install** to download and install the **AWS Toolkit for JetBrains Gateway**.
**Note**  
JetBrains Gateway displays the status of your download and installation progress. After the toolkit is successfully installed, the JetBrains Gateway **Connections** explorer updates with the **Amazon CodeCatalyst** plugin icon.

## Creating a CodeCatalyst account and setting up authentication
<a name="codecatalyst-setup-id"></a>

To work with CodeCatalyst from the AWS Toolkit for JetBrains, you must have an active CodeCatalyst account and set up either an AWS Builder ID or IAM Identity Center authentication to connect with JetBrains Gateway. If you don't have an active CodeCatalyst account, AWS Builder ID, or IAM Identity Center authentication set up, see the [Setting up with CodeCatalyst](https://docs.aws.amazon.com/codecatalyst/latest/userguide/setting-up-topnode.html) section in the *CodeCatalyst* User Guide.

## Authenticate and connect JetBrains Gateway with CodeCatalyst
<a name="codecatalyst-setup-connect"></a>

To authenticate with AWS and connect JetBrains Gateway with your CodeCatalyst account, complete the following steps.

**Note**  
To authenticate with AWS using IAM Identity Center, you must be running AWS Toolkit for JetBrains version 2.6 or later.

1. From the JetBrains Gateway **Connections** explorer, choose the **Amazon CodeCatalyst** plugin to open the **Amazon CodeCatalyst** plugin view.

1. From the **CodeCatalyst** plugin view, choose **Sign in** to open the **AWS Toolkit: Setup Authentication** dialog.

1. From the **AWS Toolkit: Setup Authentication** dialog, choose the tab for your preferred authentication method: **IAM Identity Center** or **AWS Builder ID**.

1. From the tab of your preferred authentication method, complete any required fields, then choose the **Connect** button to open the **AWS authentication portal** in your default web browser.

1. From your web browser, confirm that the security code matches the code displayed in your IDE, and then proceed.

1. When prompted, choose **Allow** to confirm the connection between JetBrains and your AWS account. When the connection process is complete, your web browser displays a confirmation indicating that it's safe to close the window and return to your IDE.

   When authentication is complete, the JetBrains Gateway **CodeCatalyst** plugin view updates to the **Dev Environments** view.

# Working with Amazon CodeCatalyst in JetBrains Gateway
<a name="codecatalyst-overview"></a>

You can launch a virtual computing environment, known as a Dev Environment, from JetBrains. Dev Environments are customizable cloud-development environments that you can copy and share among different team members in your Space. For more information about Dev Environments and how you can access them from CodeCatalyst, see the [Dev Environments](https://docs.aws.amazon.com/codecatalyst/latest/userguide/devenvironment.html) section in the *Amazon CodeCatalyst* User Guide. 

The following sections describe how to create, open, and work with Dev Environments from JetBrains Gateway.

## Opening a Dev Environment
<a name="codecatalyst-overview-open"></a>

To open an existing Dev Environment from JetBrains Gateway, complete the following steps.

1. From the **Connections** explorer, choose the **Amazon CodeCatalyst** plugin.

1. From the **Remote Development** wizard body, navigate to the parent Space and project for the Dev Environment that you want to open.

1. Choose the Dev Environment that you want to open.

1. Confirm the opening process for your Dev Environment to continue.
**Note**  
JetBrains displays the progress in a new status window. When the opening process is complete, your Dev Environment opens in a new window.

## Creating a Dev Environment
<a name="codecatalyst-overview-creating"></a>

To create a new Dev Environment:

1. From the **Connections** explorer, choose the **CodeCatalyst** plugin.

1. From the **Remote Development** wizard header section, choose the **Create a Dev Environment** link to open the **New CodeCatalyst Dev Environment** view.

1. From the **New CodeCatalyst Dev Environment** view, use the following fields to configure your Dev Environment preferences.
   + **IDE**: Select your preferred JetBrains IDE to launch in your Dev Environment.
   + **CodeCatalyst Project**: Choose a CodeCatalyst Space and project for your Dev Environment. 
   + **Dev Environment Alias**: Enter an alternate name for your Dev Environment. 
   + **Compute**: Choose the virtual hardware configuration for your Dev Environment. 
   + **Persistent storage**: Choose the amount of persistent storage for your Dev Environment. 
   + **Inactivity timeout**: Choose the system idle time that passes before your Dev Environment enters into standby. 

1. To create your Dev Environment, choose **Create Dev Environment**.
**Note**  
When you choose **Create Dev Environment**, the **New Dev Environment** view closes and the process to create your Dev Environment starts. The process can take several minutes and you can't use other JetBrains Gateway features until your Dev Environment is created.  
JetBrains displays the progress in a new status window, and, when the process is complete, your Dev Environment opens in a new window.

## Creating a Dev Environment from a third-party repository
<a name="codecatalyst-overview-creating-source-repo"></a>

You can create Dev Environments from a third-party repository by linking to the repository as a source.

Linking to a third-party repository as a source is handled at the project level in CodeCatalyst. For instructions and additional details on how to connect a third-party repository to your Dev Environment, see the [Linking a source repository](https://docs.aws.amazon.com/codecatalyst/latest/userguide/source-repositories-link.html) topic in the *Amazon CodeCatalyst User Guide*.

## Configuring Dev Environment settings
<a name="codecatalyst-overview-configure"></a>

To change the settings for an existing Dev Environment from JetBrains Gateway, complete the following steps.

**Note**  
You can't modify the amount of storage assigned to your Dev Environment after it has been created.

1. From the **Connections** explorer, choose the **Amazon CodeCatalyst** plugin.

1. From the **Remote Development** wizard body, navigate to the parent Space and project for the Dev Environment that you want to configure.

1. Choose the **Settings** icon, next to the Dev Environment that you want to configure, to open the **Configure Dev Environment:** settings.

1. From the **Configure Dev Environment:** settings menu, configure your Dev Environment by changing the following options:
   + **Dev Environment Alias**: Optional field to specify an alternate name for your Dev Environment.
   + **IDE**: Choose the JetBrains IDE that you want to launch inside of your Dev Environment.
   + **Compute**: Choose the virtual hardware configuration for your Dev Environment.
   + **Inactivity timeout**: Choose the system idle time that passes before your Dev Environment enters into standby.

## Pausing a Dev Environment
<a name="codecatalyst-overview-pause"></a>

The activity in your Dev Environment is stored persistently. This means that you can pause and resume your Dev Environment without losing your work.

To pause your Dev Environment, complete the following steps.

1. From the **Connections** explorer, choose the **Amazon CodeCatalyst** plugin.

1. From the **Remote Development** wizard body, navigate to the parent Space and project for the Dev Environment that you want to pause.

1. Choose the **Pause** icon next to your active Dev Environment to open the **Confirm Pause** dialog.

1. Choose **Yes** to close the **Confirm Pause** dialog and initialize the pause process.
**Note**  
JetBrains displays the progress of the pause process in a new status window. When the Dev Environment has stopped, the **Pause** icon is removed from the user interface.

## Resuming a Dev Environment
<a name="codecatalyst-overview-resume"></a>

The activity in your Dev Environment is stored persistently. This means that you can resume a paused Dev Environment without losing your previous work.

To resume a paused Dev Environment, complete the following steps.

1. From the **Connections** explorer, choose the **Amazon CodeCatalyst** plugin.

1. From the **Remote Development** wizard body, navigate to the parent Space and project for the Dev Environment that you want to resume.

1. Choose the Dev Environment you want to resume.
**Note**  
JetBrains displays the progress of the resume process in a new status window. When the Dev Environment has resumed, a **Pause** icon is added next to the Dev Environment **Settings** icon.

## Deleting a Dev Environment
<a name="codecatalyst-overview-delete"></a>

To delete your Dev Environment, complete the following steps:

1. From the **Connections** explorer, choose the **Amazon CodeCatalyst** plugin.

1. From the **Remote Development** wizard body, navigate to the parent Space and project for the Dev Environment that you want to delete.

1. Choose the **X** icon next to your Dev Environment to open the **Confirm Deletion** dialog.

1. Choose **Yes** to close the dialog and delete your Dev Environment.
**Important**  
After you choose **Yes**, your Dev Environment is deleted and can't be retrieved. Before deleting a Dev Environment, make sure to commit and push your code changes to the original source repository. Otherwise, your unsaved changes will be permanently lost.  
After a Dev Environment is deleted, the **Remote Development** wizard updates and the Dev Environment is no longer listed in your resources.

## Configuring Dev Environments defaults
<a name="codecatalyst-overview-default"></a>

You can configure your Dev Environment default settings in the `devfile` of your Dev Environment. The `devfile` specification is an open standard, which you can update in a YAML document.

For more information about how to define and configure your `devfile`, see [devfile.io](https://devfile.io/).
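As an illustration only, a minimal `devfile` for a Dev Environment might look like the following sketch. The metadata name and container image are hypothetical; see [devfile.io](https://devfile.io/) for the authoritative schema.

```
schemaVersion: 2.0.0
metadata:
  name: sample-dev-environment
components:
  # A single container component defining the Dev Environment image
  - name: dev
    container:
      image: public.ecr.aws/amazonlinux/amazonlinux:2023
```

After you edit and save the `devfile`, the Dev Environment must be rebuilt for the changes to take effect, as described in the steps that follow.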

To open and edit your `devfile` from your JetBrains Gateway Dev Environment instance, complete the following steps.

1. From the **Navigation bar** in your active JetBrains Dev Environment, expand the **Amazon CodeCatalyst Dev Environment** node to open the **Backend Status Details** menu.

1. Choose the **Configure Dev Environment** tab, then choose **Open Devfile** to open your `devfile` in the JetBrains **Editor**.

1. From the **Editor**, make changes to your `devfile` and save your work.

1. Upon saving your changes, the **Amazon CodeCatalyst Dev Environment** node displays an alert indicating that your Dev Environment requires a rebuild.

1. Expand the **Amazon CodeCatalyst Dev Environment** node and choose the **Rebuild Dev Environment** node from the **Configure Dev Environment** tab.

# Working with AWS CloudFormation by using the AWS Toolkit for JetBrains
<a name="cloudformation"></a>

The following topics describe how to use the AWS Toolkit for JetBrains to work with AWS CloudFormation stacks in an AWS account.

**Topics**
+ [Viewing event logs for a stack](cloudformation-logs.md)
+ [Deleting a stack](cloudformation-delete.md)

# Viewing event logs for an AWS CloudFormation stack by using the AWS Toolkit for JetBrains
<a name="cloudformation-logs"></a>

1. Open AWS Explorer, if it isn't already open. If the stack is in a different AWS Region, switch to the AWS Region that contains it.

1. Expand **CloudFormation**.

1. To view event logs for the stack, right-click the stack's name. The AWS Toolkit for JetBrains displays the event logs in the **CloudFormation** tool window.

   To hide or show the **CloudFormation** tool window, on the main menu, choose **View**, **Tool Windows**, **CloudFormation**.  
![\[Choosing to view event logs for an AWS CloudFormation stack starting from AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/cloudformation-logs.png)

# Deleting an AWS CloudFormation stack by using the AWS Toolkit for JetBrains
<a name="cloudformation-delete"></a>

1. Open AWS Explorer, if it isn't already open. If you need to switch to a different AWS Region that contains the stack, do that now.

1. Expand **CloudFormation**.

1. Right-click the name of the stack to delete, and then choose **Delete CloudFormation Stack**.  
![\[Choosing to delete a AWS CloudFormation stack starting from AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/sam-delete.png)

1. Enter the stack's name to confirm the deletion, and then choose **OK**. If the stack deletion succeeds, the AWS Toolkit for JetBrains removes the stack name from the **CloudFormation** list in **AWS Explorer**. If the stack deletion fails, you can troubleshoot by viewing the event logs for the stack.

# Working with CloudWatch Logs by using the AWS Toolkit for JetBrains
<a name="building-cloudwatch"></a>

Amazon CloudWatch Logs enables you to centralize the logs from all of your systems, applications, and AWS services that you use, in a single, highly scalable service. You can then easily view them, search them for specific error codes or patterns, filter them based on specific fields, or archive them securely for future analysis. For more information, see [What Is Amazon CloudWatch Logs?](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/WhatIsCloudWatchLogs.html) in the *Amazon CloudWatch User Guide*.

The following topics describe how to use the AWS Toolkit for JetBrains to work with CloudWatch Logs in an AWS account.

**Topics**
+ [Viewing CloudWatch log groups and log streams](viewing-CloudWatch-logs.md)
+ [Working with CloudWatch log events](working-CloudWatch-log-events.md)
+ [Working with CloudWatch Logs Insights](cloudwatch-log-insights.md)

# Viewing CloudWatch log groups and log streams by using the AWS Toolkit for JetBrains
<a name="viewing-CloudWatch-logs"></a>

A *log stream* is a sequence of log events that share the same source. Each separate source of logs into CloudWatch Logs makes up a separate log stream.

 A *log group* is a group of log streams that share the same retention, monitoring, and access control settings. You can define log groups and specify which streams to put into each group. There is no limit on the number of log streams that can belong to one log group. 

For more information, see [Working with Log Groups and Log Streams ](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Working-with-log-groups-and-streams.html) in the *Amazon CloudWatch User Guide*.
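To make the hierarchy concrete, a Lambda function writes to a log group named after the function, with one log stream per execution environment. The function name below is hypothetical; the stream-name format shown is the one Lambda uses:

```
/aws/lambda/my-function                      <- log group (one per function)
  2024/01/15/[$LATEST]a1b2c3d4e5f6...        <- log stream (one per execution environment)
```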

**Topics**
+ [Viewing log groups and log streams with the **CloudWatch Logs** node](#viewing-log-groups)
+ [Viewing log streams with the **Lambda** node](#viewing-lamba-log-groups)
+ [Viewing log streams with the **Amazon ECS** node](#viewing-ecs-log-groups)

## Viewing log groups and log streams with the **CloudWatch Logs** node
<a name="viewing-log-groups"></a>

1. Open AWS Explorer, if it isn't already open.

1. Click the **CloudWatch Logs** node to expand the list of log groups.

   The log groups for the [current AWS Region](setup-region.md#setup-region-current-region) are displayed under the **CloudWatch Logs** node.

1. To view the log streams in a log group, do one of the following:
   + Double-click the name of the log group.
   + Right-click the name of the log group, and then choose **View Log Streams**.

   The log group's contents are displayed in the **Log Streams** pane. For information about interacting with the log events in each stream, see [Working with CloudWatch log events](working-CloudWatch-log-events.md).  
![\[Viewing log streams in a CloudWatch log group in AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/cloudwatch-view-log-streams.png)

## Viewing log streams with the **Lambda** node
<a name="viewing-lamba-log-groups"></a>

You can view CloudWatch Logs for AWS Lambda functions by using the **Lambda** node in AWS Explorer. 

**Note**  
You can also view log streams for all AWS services, including Lambda functions, using the **CloudWatch Logs** node in AWS Explorer. We recommend using the **Lambda** node, however, for an overview of log data specific to Lambda functions.

1. Open AWS Explorer, if it isn't already open.

1. Click the **Lambda** node to expand the list of Lambda functions.

   The Lambda functions for the [current AWS Region](setup-region.md#setup-region-current-region) are displayed beneath the **Lambda** node.

1. Right-click a Lambda function, and then choose **View Log Streams**.

   The log streams for the function are displayed in the **Log Streams** pane. For information about interacting with the log events in each stream, see [Working with CloudWatch log events](working-CloudWatch-log-events.md).

## Viewing log streams with the **Amazon ECS** node
<a name="viewing-ecs-log-groups"></a>

You can view CloudWatch Logs for clusters and containers that are run and maintained in Amazon Elastic Container Service by using the **Amazon ECS** node in AWS Explorer.

**Note**  
You can also view log groups for all AWS services, including Amazon ECS, using the **CloudWatch Logs** node in AWS Explorer. We recommend using the **Amazon ECS** node, however, for an overview of log data specific to Amazon ECS clusters and containers.

1. Open AWS Explorer, if it isn't already open.

1. Click the **Amazon ECS** node to expand the list of Amazon ECS clusters.

   The Amazon ECS clusters for the [current AWS Region](setup-region.md#setup-region-current-region) are displayed beneath the **Amazon ECS** node.

1. Right-click a cluster, and then choose **View Log Streams**.

   The log streams for the cluster are displayed in the **Log Streams** pane.

1. To view log streams for a specific container, click a cluster to expand its list of registered containers.

   The containers registered for the cluster are displayed beneath.

1. Right-click a container, and then choose **View Container Log Stream**.

   The log streams for the container are displayed in the **Log Streams** pane. For information about interacting with the log events for clusters and containers, see [Working with CloudWatch log events](working-CloudWatch-log-events.md).

# Working with CloudWatch log events in log streams by using the AWS Toolkit for JetBrains
<a name="working-CloudWatch-log-events"></a>

After you've opened the **Log Streams** pane, you can access the log events in each stream. Log events are records of activity recorded by the application or resource being monitored.

**Topics**
+ [Viewing and filtering log events in a stream](#viewing-log-events)
+ [Working with log actions](#working-with-log-actions)
+ [Exporting CloudWatch log events to a file or an editor](#exporting-CW-logs)

## Viewing and filtering log events in a stream
<a name="viewing-log-events"></a>

When you open a log stream, the **Log Events** pane displays that stream's sequence of log events.

1. To find a log stream to view, open the **Log Streams** pane (see [Viewing CloudWatch log groups and log streams](viewing-CloudWatch-logs.md)).
**Note**  
You can use pattern matching to locate a stream in a list. Click the **Log Streams** pane and start entering text. The first log stream name with text that matches yours is highlighted. You can also reorder the list by clicking the top of the **Last Event Time** column.

1. Double-click a log stream to view its sequence of log events.

   The **Log Events** pane displays the log events that make up the log stream. 

1. To filter the log events according to content, enter text in the **Filter logstream** field and press **Return**. 

    The results are log events containing text that's a case-sensitive match with your filter text. The filter searches the complete log stream, including events not displayed on the screen.
**Note**  
You can also use pattern matching to locate a log event in the pane. Click the **Log Events** pane and start entering text. The first log event with text that matches yours is highlighted. Unlike with **Filter logstream** search, only on-screen events are checked.

1. To filter log events according to time, right-click a log event, and then choose **Show Logs Around**.

    You can select **One Minute**, **Five Minutes**, or **Ten Minutes**. For example, if you select **Five Minutes**, the filtered list shows only log events that occurred five minutes before and after the selected entry.  
![\[Viewing and filtering log actions on the Log Events pane.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/cloudwatch-filter-log-events.png)

On the left of the **Log Events** pane, the [log actions](#working-with-log-actions) offer more ways to interact with log events.

## Working with log actions
<a name="working-with-log-actions"></a>

On the left of the **Log Events** pane, four log actions allow you to refresh, edit, tail, and wrap CloudWatch log events.

![\[Viewing log actions on the Log Events pane.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/cloudwatch-log-actions.png)


1. To find log events to interact with, [open the **Log Streams**](#viewing-log-events) pane.

1. Choose one of the following log actions:
   + **Refresh** – Updates the list with log events that occurred after the **Log Events** pane was opened.
   + **Open in Editor** – Opens the on-screen log events in the IDE's default editor. 
**Note**  
This action exports only on-screen log events to the IDE editor. To view all the stream's events in the editor, choose the [**Export Log Stream**](#exporting-CW-logs) option. 
   + **Tail logs** – Streams new log events to the **Log Events** pane. This is a useful feature for continuous updates on longer-running services such as Amazon EC2 instances and AWS CodeBuild builds. 
   + **Wrap logs** – Displays log event text on multiple lines if the size of the pane hides longer entries.

## Exporting CloudWatch log events to a file or an editor
<a name="exporting-CW-logs"></a>

Exporting a CloudWatch log stream enables you to open its log events in the IDE's default editor or download them to a local folder.

1. To find a log stream to access, [open the **Log Streams**](#viewing-log-events) pane.

1. Right-click a log stream, and then choose **Export Log Stream**, **Open in Editor** or **Export Log Stream**, **Save to a File**.
   + **Open in Editor** – Opens the log events that make up the selected stream in the IDE's default editor.
**Note**  
This option exports all events in the log stream to the IDE editor.
   + **Save to a File** – Opens the **Download Log Stream** dialog box. This enables you to select a download folder and rename the file containing the log events.

# Working with CloudWatch Logs Insights by using the AWS Toolkit for JetBrains
<a name="cloudwatch-log-insights"></a>

You can use the AWS Toolkit for JetBrains to work with CloudWatch Logs Insights. CloudWatch Logs Insights enables you to interactively search and analyze your log data in Amazon CloudWatch Logs. For more information, see [Analyzing Log Data with CloudWatch Logs Insights](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/AnalyzingLogData.html) in the *Amazon CloudWatch Logs User Guide*.

## IAM permissions for CloudWatch Logs Insights
<a name="iam-permissions-for-cwlog-insights"></a>

 You need the following permissions to run and view CloudWatch Logs Insights query results: 

------
#### [ JSON ]

****  

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:StartQuery",
        "logs:GetQueryResults",
        "logs:GetLogRecord",
        "logs:DescribeLogGroups",
        "logs:DescribeLogStreams"
      ],
      "Resource": "*"
    }
  ]
}
```

------

The following permission is not required but will allow the AWS Toolkit for JetBrains to automatically stop any currently running queries when you close the associated results pane or IDE. 

------
#### [ JSON ]

****  

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:StopQuery"
      ],
      "Resource": "*"
    }
  ]
}
```

------

## Working with CloudWatch Logs Insights
<a name="working-with-cwlog-insights"></a>

**To open the CloudWatch Logs Insights query editor**

1. Open AWS Explorer.

1.  Double-click the **CloudWatch Logs** node to expand the list of log groups. 

1.  Right-click on the log group you want to open, and then choose **Open Query Editor**. 

**To start a CloudWatch Logs Insights query**

1. In the **Query Log Groups** window, change the query parameters as desired.

   You can choose a time range by date or relative time.

   The **Query Log Groups** field accepts the CloudWatch Logs Insights Query Syntax. For more information, see [CloudWatch Logs Insights Query Syntax](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_QuerySyntax.html) in the *Amazon CloudWatch Logs User Guide*.

1.  Choose **Execute** to begin the query. 
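As an illustration of the query syntax, a query like the following returns the 20 most recent events whose message contains the string `ERROR`. This is a sketch using standard CloudWatch Logs Insights syntax; adjust the filter pattern for your own log data.

```
fields @timestamp, @message
| filter @message like /ERROR/
| sort @timestamp desc
| limit 20
```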

**To save a CloudWatch Logs Insights query**

1. Type a query name. 

1.  Choose **Save Query**. 

    The selected log groups and query are saved to your AWS account. Time ranges are not saved. 

   You can retrieve and reuse saved queries from the CloudWatch Logs Insights AWS Management Console page.

**To retrieve a saved CloudWatch Logs Insights query**

1.  In the **Query Log Groups** window, choose **Retrieve Saved Queries**. 

1.  Choose the desired query and choose **OK**. 

   The selected log groups and query replace anything in the existing dialog.

**To navigate through query results**
+  In the CloudWatch Logs Insights **Query Results** window, in the top right corner, choose **Open Query Editor**. 

**To view an individual log record**
+  In the query results pane, double-click a row to open a new tab with details about that log record. 

   You can also navigate to the log record's associated log stream by choosing **View Log Stream** in the top right corner. 

# Amazon DynamoDB in the AWS Toolkit for JetBrains
<a name="dynamodb"></a>

Amazon DynamoDB is a fully managed NoSQL database service that provides predictable performance with seamless scalability. For detailed information about the DynamoDB service, see the [Amazon DynamoDB Developer Guide](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Introduction.html).

The following topics describe how to work with the DynamoDB service from the AWS Toolkit for JetBrains.

**Topics**
+ [Working with Amazon DynamoDB](dynamodb-overview.md)
+ [Working with DynamoDB tables](dynamodb-tables.md)

# Working with Amazon DynamoDB from the AWS Toolkit for JetBrains
<a name="dynamodb-overview"></a>

The AWS Toolkit for JetBrains allows you to view your Amazon DynamoDB resources, copy their Amazon Resource Names (ARNs), and delete them, directly from your JetBrains IDE.

The following sections describe how to work with these service features from the AWS Toolkit for JetBrains.

## Viewing DynamoDB resources
<a name="dynamodb-overview-view-resources"></a>

At this time, DynamoDB resources can't be created directly from the toolkit, but you can view your existing resources. To view your DynamoDB resources, complete the following steps:

1. Navigate to the **Explorer** tab in the AWS Toolkit for JetBrains.

1. Expand the **DynamoDB** node.

1. Your DynamoDB resources are displayed under the DynamoDB node.

## Copying DynamoDB resources ARNs
<a name="dynamodb-overview-copy-arn"></a>

An Amazon Resource Name (ARN) is a unique identifier assigned to every AWS resource, including DynamoDB tables. To copy the ARN for a DynamoDB resource, complete the following steps:

1. Navigate to the **Explorer** tab in the AWS Toolkit for JetBrains.

1. Expand the **DynamoDB** node.

1. Open the context menu for (right-click) the DynamoDB resource whose ARN you want to copy.

1. Choose **Copy ARN** to copy the resource ARN to your operating system's clipboard.
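For reference, DynamoDB table ARNs follow the standard ARN format. In the following example, the Region, account ID, and table name are placeholders:

```
arn:aws:dynamodb:us-east-1:111122223333:table/ExampleTable
```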

## Deleting DynamoDB resources
<a name="dynamodb-overview-delete"></a>

To delete a DynamoDB resource, complete the following steps:

1. Navigate to the **Explorer** tab in the AWS Toolkit for JetBrains.

1. Expand the **DynamoDB** node.

1. Open the context menu for (right-click) the DynamoDB resource you want to delete.

1. Choose **Delete table...** to open the **Delete table...** confirmation dialog.

1. Complete the confirmation instructions to delete your DynamoDB table.

# Working with Amazon DynamoDB tables from the AWS Toolkit for JetBrains
<a name="dynamodb-tables"></a>

The primary resource of Amazon DynamoDB is a database table. The following sections describe how to work with DynamoDB tables from the AWS Toolkit for JetBrains.

## Viewing a DynamoDB table
<a name="dynamodb-overview-view-table"></a>

To view a DynamoDB table, complete the following steps:

1. Navigate to the **Explorer** tab in the AWS Toolkit for JetBrains.

1. Expand the **DynamoDB** node.

1. From your list of DynamoDB resources, double-click a table to view it in the **Editor** window.

**Note**  
The first time you view table data, an initial scan retrieves a maximum of 50 items.

## Setting the max-result limit
<a name="dynamodb-table-results"></a>

To change the default limit of retrieved table entries, complete the following steps:

1. From the AWS Explorer, double-click a table to view it in the JetBrains **Editor** window.

1. From the table view, choose the **Settings** icon in the upper-right corner of your **Editor** window.

1. Hover over the **Max Results** option to view a list of available max-result values.

## Scanning a DynamoDB table
<a name="dynamodb-table-scan"></a>

To scan a DynamoDB table, complete the following steps:

**Note**  
This scan generates a **PartiQL query** and requires that you have the correct AWS Identity and Access Management (AWS IAM) policies in place. To learn more about the PartiQL security policy requirements, see the [IAM security policies with PartiQL for DynamoDB](https://docs.aws.amazon.com/) topic in the *Amazon DynamoDB Developer Guide*.

1. From the AWS Explorer, double-click a table to view it in the JetBrains **Editor** window.

1. From the table view, expand the **Scan** header.

1. Select the **Table/Index** you want to scan from the drop-down menu.

1. Choose **Run** to proceed with the scan. The scan is complete when the table data is returned in the **Editor** window.
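The generated scan corresponds to a PartiQL `SELECT` statement. For illustration, a scan of a table with an optional filter might look like the following (the table and attribute names are hypothetical):

```
SELECT * FROM "ExampleTable" WHERE "Status" = 'ACTIVE'
```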

# Working with Amazon Elastic Container Service by using the AWS Toolkit for JetBrains
<a name="ecs"></a>

The following topics describe how to use the AWS Toolkit for JetBrains to work with Amazon ECS resources in an AWS account.

**Topics**
+ [Amazon ECS Exec](ecs-exec.md)

# Amazon Elastic Container Service (Amazon ECS) Exec in AWS Toolkit
<a name="ecs-exec"></a>

You can use the Amazon ECS Exec feature to issue single commands or run a shell in an Amazon Elastic Container Service (Amazon ECS) container, directly from the AWS Toolkit. 

**Important**  
Enabling and disabling Amazon ECS Exec changes the state of resources in your AWS account. This includes stopping and restarting the service. Altering the state of resources while Amazon ECS Exec is enabled can lead to unpredictable results. For more information about Amazon ECS Exec, see the developer guide [Using Amazon ECS Exec for Debugging](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ecs-exec.html#ecs-exec-considerations).

## Amazon ECS Exec prerequisites
<a name="ecs-exec-prereq"></a>

Before you can use the Amazon ECS Exec feature, the following prerequisites must be met.

**Important**  
To enable Amazon ECS Exec for a particular service, Amazon ECS Cloud Debugging must be disabled for that service.

### Amazon ECS requirements
<a name="w2aac17c25b8c11b7"></a>

Depending on whether your tasks are hosted on Amazon EC2 or AWS Fargate, Amazon ECS Exec has different version requirements.
+ If you're using Amazon EC2, you must use an Amazon ECS optimized AMI that was released after January 20th, 2021, with an agent version of 1.50.2 or greater. Additional information is available for you in the developer guide [Amazon ECS optimized AMIs](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ecs-optimized_AMI.html).
+ If you're using AWS Fargate, you must use platform version 1.4.0 or higher. Additional information about Fargate requirements is available to you in the developer guide [AWS Fargate platform versions](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/platform_versions.html).

### AWS account configuration and IAM permissions
<a name="w2aac17c25b8c11b9"></a>

To use the Amazon ECS Exec feature, you need an existing Amazon ECS cluster associated with your AWS account. Amazon ECS Exec uses AWS Systems Manager to establish a connection with the containers on your cluster and requires specific task IAM role permissions to communicate with the SSM service.

You can find IAM role and policy information, specific to Amazon ECS Exec, in the [IAM permissions required for ECS Exec](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ecs-exec.html#ecs-exec-enabling-and-using) developer guide.
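As a sketch of what the linked guide describes, the task IAM role needs at minimum the SSM messages permissions shown in the following policy fragment. Treat the linked developer guide as the authoritative list.

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ssmmessages:CreateControlChannel",
        "ssmmessages:CreateDataChannel",
        "ssmmessages:OpenControlChannel",
        "ssmmessages:OpenDataChannel"
      ],
      "Resource": "*"
    }
  ]
}
```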

## Working with Amazon ECS Exec
<a name="w2aac17c25b8c15"></a>

You can enable or disable Amazon ECS Exec directly from the AWS Explorer in the AWS Toolkit for JetBrains. When Amazon ECS Exec is enabled, you can choose containers from the Amazon ECS menu and then run commands against them.

### Enabling Amazon ECS Exec
<a name="w2aac17c25b8c15b5"></a>

1. From the AWS Explorer, expand the Amazon ECS menu.

1. Expand the **Clusters** section, and choose the cluster you want to modify.

1. Open the context menu for (right-click) the service you want to modify and choose **Enable Command Execution**.
**Note**  
If Amazon ECS Cloud Debugging is enabled for this service, the **Enable Command Execution** option will not be available. Disabling Cloud Debugging will restore the option, but it will stop and restart your service.

**Important**  
This will start a new deployment of your service and may take a few minutes. For more information, see the note at the beginning of this section.

### Disabling Amazon ECS Exec
<a name="w2aac17c25b8c15b7"></a>

1. From the AWS Explorer, expand the Amazon ECS menu.

1. Expand the **Clusters** section, and choose the cluster you want to modify.

1. Open the context menu for (right-click) the service you want to modify and choose **Disable Command Execution**.

**Important**  
This will start a new deployment of your service and may take a few minutes. For more information, see the note at the beginning of this section.

### Running commands against a Container
<a name="w2aac17c25b8c15b9"></a>

To run commands against a container using the AWS Explorer, Amazon ECS Exec must be enabled. If it's not enabled, see the **Enabling Amazon ECS Exec** procedure in this section.

1. From the AWS Explorer, expand the Amazon ECS menu.

1. Expand the **Clusters** section, and choose the cluster you want to modify.

1. Expand a service to list its containers.

1. Open the context menu for (right-click) the container you want to modify and choose **Run Command in Container**.

1. In the **Run Command in Container** dialog box, choose the **Task ARN** that you want.

1. You can type the command you want to run or select it from a list of commands that were run during the same session.

1. Choose **Execute**.

### Running commands from within a shell
<a name="w2aac17c25b8c15c11"></a>

To run commands against a container from within a shell by using the AWS Explorer, Amazon ECS Exec must be enabled. If it's not enabled, see the **Enabling Amazon ECS Exec** procedure in this section.

1. From the AWS Explorer, expand the Amazon ECS menu.

1. Expand the **Clusters** section, and choose the cluster you want to modify.

1. Expand the service to list its containers.

1. Open the context menu for (right-click) the container you want to modify and choose **Open Interactive Shell**.

1. In the **Interactive Shell** dialog box, choose the **Task ARN** that you want.

1. Choose a shell from the corresponding drop-down list, or enter the name of the shell you want to interact with.

1. When you are satisfied with your settings, choose **Execute**.

1. When the shell opens in a terminal, you can enter commands to interact with the container.
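Outside the IDE, an equivalent interactive session can be opened with the AWS CLI's `ecs execute-command`. The cluster, task, and container values in this sketch are placeholders for your own resources:

```
aws ecs execute-command \
    --cluster my-cluster \
    --task <task-id> \
    --container my-container \
    --interactive \
    --command "/bin/sh"
```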

# Working with Amazon EventBridge by using the AWS Toolkit for JetBrains
<a name="eventbridge"></a>

The following topic describes how to use the AWS Toolkit for JetBrains to work with Amazon EventBridge schemas in an AWS account.

**Topics**
+ [Working with Amazon EventBridge schemas](eventbridge-schemas.md)

# Working with Amazon EventBridge schemas
<a name="eventbridge-schemas"></a>

You can use the AWS Toolkit for JetBrains to work with Amazon EventBridge Schemas as follows.

**Note**  
Working with EventBridge Schemas is currently supported only by the AWS Toolkit for IntelliJ and the AWS Toolkit for PyCharm.

The following information assumes you have already [set up the AWS Toolkit for JetBrains](getting-started.md).

**Contents**
+ [View a schema](#eventbridge-schemas-view)
+ [Find a schema](#eventbridge-schemas-find)
+ [Generate code for a schema](#eventbridge-schemas-generate-code)
+ [Create an AWS SAM application that uses a schema](#eventbridge-schemas-serverless-app)

## View an available schema
<a name="eventbridge-schemas-view"></a>

1. With the [**AWS Explorer**](aws-explorer.md) tool window displayed, expand **Schemas**.

1. Expand the name of the registry that contains the schema you want to view. For example, many of the schemas that AWS supplies are in the **aws.events** registry.

1. To view the schema in the editor, right-click the title of the schema, and on the context menu, choose **View Schema**. 

## Find an available schema
<a name="eventbridge-schemas-find"></a>

With the [**AWS Explorer**](aws-explorer.md) tool window displayed, do one of the following:
+ Begin typing the title of the schema you want to find. The **AWS Explorer** highlights the titles of schemas that contain a match.
+ Right-click **Schemas**, and on the context menu, choose **Search Schemas**. In the **Search EventBridge Schemas** dialog box, begin typing the title of the schema you want to find. The dialog box displays the titles of schemas that contain a match.
+ Expand **Schemas**. Right-click the name of the registry that contains the schema you want to find, and then choose **Search Schemas in Registry**. In the **Search EventBridge Schemas** dialog box, begin typing the title of the schema you want to find. The dialog box displays the titles of schemas that contain a match.

To view a schema in the list of matches, do one of the following:
+ To display the schema in the editor, in **AWS Explorer**, right-click the title of the schema, and then choose **View Schema**. 
+ In the **Search EventBridge Schemas** dialog box, choose the title of the schema to display the schema. 

## Generate code for an available schema
<a name="eventbridge-schemas-generate-code"></a>

1. With the [**AWS Explorer**](aws-explorer.md) tool window displayed, expand **Schemas**.

1. Expand the name of the registry that contains the schema you want to generate code for.

1. Right-click the title of the schema, and then choose **Download code bindings**.

1. In the **Download code bindings** dialog box, choose the following:
   + The **Version** of the schema to generate code for.
   + The supported programming **Language** and language version to generate code for.
   + The **File location** where you want to store the generated code on the local development machine.

1. Choose **Download**.

## Create an AWS Serverless Application Model application that uses an available schema
<a name="eventbridge-schemas-serverless-app"></a>

1. On the **File** menu, choose **New**, **Project**. 

1. In the **New Project** dialog box, choose **AWS**.

1. Choose **AWS Serverless Application**, and then choose **Next**.

1. Specify the following:
   + A **Project name** for the project.
   + A **Project location** on your local development machine for the project.
   + A supported AWS Lambda **Runtime** for the project.
   + An AWS Serverless Application Model (AWS SAM) **SAM Template** for the project. The choices currently include the following:
     + **AWS SAM EventBridge Hello World (EC2 Instance State Change)** – When deployed, creates an AWS Lambda function and an associated Amazon API Gateway endpoint in your AWS account. By default, this function and endpoint respond only to an Amazon EC2 instance status change.
     + **AWS SAM EventBridge App from Scratch (for any Event trigger from a Schema Registry)** – When deployed, creates an AWS Lambda function and an associated Amazon API Gateway endpoint in your AWS account. This function and endpoint can respond to events that are available in the schema you specify.

       If you choose this template, you must also specify the following:
       + The named profile, **Credentials**, to use.
       + The AWS **Region** to use.
       + The EventBridge **Event Schema** to use.
   + The version of the SDK to use for the project (**Project SDK**).
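The toolkit-generated project includes an AWS SAM template that wires the Lambda function to its event source. The following sketch shows roughly how a function triggered by the EC2 instance state-change event is declared; the logical IDs, handler, and code path are illustrative placeholders, not the exact template the toolkit generates.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  HelloWorldFunction:                    # placeholder logical ID
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/              # placeholder code path
      Handler: app.lambda_handler
      Runtime: python3.11
      Events:
        EC2StateChange:
          Type: EventBridgeRule          # SAM event source for EventBridge
          Properties:
            Pattern:
              source:
                - aws.ec2
              detail-type:
                - EC2 Instance State-change Notification
```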

After you create an AWS serverless application project, you can do the following:
+ [Deploy the application](sam-sync.md)
+ [Change (update) the application's settings](sam-update.md)
+ [Delete the deployed application](sam-delete.md)

You can also do the following with Lambda functions that are part of the application:
+ [Run (invoke) or debug the local version of a function](invoke-lambda.md)
+ [Run (invoke) the remote version of a function](lambda-remote.md)
+ [Change a function's settings](lambda-update.md)
+ [Delete a function](lambda-delete.md)

# Working with AWS Lambda from the AWS Toolkit for JetBrains
<a name="building-lambda"></a>

The following topics describe how to work with AWS Lambda functions from the AWS Toolkit for JetBrains.

**Topics**
+ [Lambda Runtimes](lambda-runtimes.md)
+ [Creating a function](create-new-lambda.md)
+ [Running (invoking) or debugging a local function](invoke-lambda.md)
+ [Running (invoking) a remote function](lambda-remote.md)
+ [Changing (updating) function settings](lambda-update.md)
+ [Deleting a function](lambda-delete.md)

# AWS Lambda runtimes and support in the AWS Toolkit for JetBrains
<a name="lambda-runtimes"></a>

AWS Lambda supports multiple languages through the use of runtimes. A runtime provides a language-specific environment that relays invocation events, context information, and responses between Lambda and the function. For detailed information about the Lambda service and supported runtimes, see the [Lambda runtimes](https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html#runtime-support-policy) topic in the *AWS Lambda User Guide*.
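As a concrete illustration of that contract, here is a minimal Python handler of the kind a runtime invokes. The function name, event fields, and response shape are illustrative only; Lambda requires just a callable that accepts an event and a context object.

```python
import json


def lambda_handler(event, context):
    """Receive a deserialized invocation event (a dict) and return a
    JSON-serializable response. The runtime supplies both arguments."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The runtime deserializes the invocation event, calls the handler with it and a context object, and serializes the returned value as the function's response.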

The following describes runtime environments currently supported for use with the AWS Toolkit for JetBrains.


| Name | Identifier | Operating System | Architecture | 
| --- | --- | --- | --- | 
|  Node.js 18  |  nodejs18.x  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Node.js 16  |  nodejs16.x  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Node.js 14  |  nodejs14.x  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Python 3.11  |  python3.11  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Python 3.10  |  python3.10  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Python 3.9  |  python3.9  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Python 3.8  |  python3.8  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Python 3.7  |  python3.7  |  Amazon Linux 2  |  x86\_64  | 
|  Java 17  |  java17  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Java 11  |  java11  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Java 8  |  java8.al2  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Java 8  |  java8  |  Amazon Linux 2  |  x86\_64  | 
|  .NET 6  |  dotnet6  |  Amazon Linux 2  |  x86\_64, arm64  | 
|  Go 1.x  |  go1.x  |  Amazon Linux 2  |  x86\_64  | 

# Creating an AWS Lambda function by using the AWS Toolkit for JetBrains
<a name="create-new-lambda"></a>

You can use the AWS Toolkit for JetBrains to create an AWS Lambda function that is part of an AWS serverless application. Or you can create a standalone Lambda function.

To create a Lambda function that is part of an AWS serverless application, skip the rest of this topic and see [Creating an application](deploy-serverless-app.md) instead.

To create a standalone Lambda function, you must first install the AWS Toolkit for JetBrains and, if you haven't yet, connect to an AWS account for the first time. Then, with IntelliJ IDEA, PyCharm, WebStorm, or JetBrains Rider already running, do one of the following:
+ Open AWS Explorer, if it isn't already open. If you need to switch to a different AWS Region to create the function in, do that now. Then right-click **Lambda**, and choose **Create new AWS Lambda**.  
![\[Creating an AWS Lambda function by starting from AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/lambda-create-aws-explorer.png)

  Complete the [Create Function](create-function-dialog.md) dialog box, and then choose **Create Function**. The AWS Toolkit for JetBrains creates a corresponding AWS CloudFormation stack for the deployment, and adds the function name to the **Lambda** list in **AWS Explorer**. If the deployment fails, you can try to determine why by viewing event logs for the stack.
+ Create a code file that implements a function handler for [Java](https://docs.aws.amazon.com/lambda/latest/dg/java-programming-model-handler-types.html), [Python](https://docs.aws.amazon.com/lambda/latest/dg/python-programming-model-handler-types.html), [Node.js](https://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html), or [C\#](https://docs.aws.amazon.com/lambda/latest/dg/dotnet-programming-model-handler-types.html). 

  If you need to switch to a different AWS Region to create the remote function to be run (invoked), do that now. Then in the code file, choose the **Lambda** icon in the gutter next to the function handler, and then choose **Create new AWS Lambda**. Complete the [Create Function](create-function-dialog.md) dialog box, and then choose **Create Function**.  
![\[Creating an AWS Lambda function by starting from an existing function handler in a code file\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/lambda-create-code-file.png)
**Note**  
If the **Lambda** icon isn't displayed in the gutter next to the function handler, try displaying it for the current project by selecting the following box in **Settings**/**Preferences**: **Tools**, **AWS**, **Project settings**, **Show gutter icons for all potential AWS Lambda handlers**. Also, if the function handler is already defined in the corresponding AWS SAM template, the **Create new AWS Lambda** command won't appear.

  After you choose **Create Function**, the AWS Toolkit for JetBrains creates a corresponding function in the Lambda service for the connected AWS account. If the operation succeeds, after you refresh **AWS Explorer**, the **Lambda** list displays the name of the new function.
+ If you already have a project that contains an AWS Lambda function, and if you need to first switch to a different AWS Region to create the function in, do that now. Then in the code file that contains the function handler for [Java](https://docs.aws.amazon.com/lambda/latest/dg/java-programming-model-handler-types.html), [Python](https://docs.aws.amazon.com/lambda/latest/dg/python-programming-model-handler-types.html), [Node.js](https://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html), or [C\#](https://docs.aws.amazon.com/lambda/latest/dg/dotnet-programming-model-handler-types.html), choose the **Lambda** icon in the gutter next to the function handler. Choose **Create new AWS Lambda**, complete the [Create Function](create-function-dialog.md) dialog box, and then choose **Create Function**.  
![\[Creating an AWS Lambda function by starting from an existing function handler in a code file\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/lambda-create-code-file.png)
**Note**  
If the **Lambda** icon isn't displayed in the gutter next to the function handler, try displaying it for the current project by selecting the following box in **Settings**/**Preferences**: **Tools**, **AWS**, **Project settings**, **Show gutter icons for all potential AWS Lambda handlers**. Also, the **Create new AWS Lambda** command won't be displayed if the function handler is already defined in the corresponding AWS SAM template.

  After you choose **Create Function**, the AWS Toolkit for JetBrains creates a corresponding function in the Lambda service for the connected AWS account. If the operation succeeds, after you refresh **AWS Explorer**, the new function's name appears in the **Lambda** list.

After you create the function, you can run (invoke) or debug the local version of the function or run (invoke) the remote version.

# Running (invoking) or debugging the local version of an AWS Lambda function by using the AWS Toolkit for JetBrains
<a name="invoke-lambda"></a>

To complete this procedure, you must create the AWS Lambda function that you want to run (invoke) or debug, if you have not created it already.
**Note**  
To run (invoke) or debug the local version of a Lambda function, and run (invoke) or debug that function locally with any nondefault or optional properties, you must first set those properties in the function's corresponding AWS SAM template file (for example, in a file named `template.yaml` within the project). For a list of available properties, see [AWS::Serverless::Function](https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction) in the [awslabs/serverless-application-model](https://github.com/awslabs/serverless-application-model/) repository on GitHub.
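For example, optional properties such as the timeout, memory size, and environment variables are set on the function's resource in `template.yaml` before you run it locally. The logical ID and values below are illustrative.

```yaml
Resources:
  MyFunction:                  # placeholder logical ID
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: my_function/    # placeholder code path
      Handler: app.lambda_handler
      Runtime: python3.11
      Timeout: 30              # seconds; the default is 3
      MemorySize: 256          # MB; the default is 128
      Environment:
        Variables:
          TABLE_NAME: example-table
```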

1. Do one of the following:
   + In the code file that contains the function handler for [Java](https://docs.aws.amazon.com/lambda/latest/dg/java-programming-model-handler-types.html), [Python](https://docs.aws.amazon.com/lambda/latest/dg/python-programming-model-handler-types.html), [Node.js](https://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html), or [C\#](https://docs.aws.amazon.com/lambda/latest/dg/dotnet-programming-model-handler-types.html), choose the Lambda icon in the gutter next to the function handler. Choose **Run '[Local]'** or **Debug '[Local]'**.   
![\[Running or debugging the local version of a Lambda function by starting from the function handler in the code file\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/lambda-local-code.png)
   + With the **Project** tool window already open and displaying the project that contains the function, open the project's `template.yaml` file. Choose the **Run** icon in the gutter next to the function's resource definition, and then choose **Run '[Local]'** or **Debug '[Local]'**.  
![\[Running or debugging the local version of a Lambda function by starting from the function definition in the AWS SAM template file\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/lambda-local-template.png)

1. Complete the [Edit configuration (local function settings)](run-debug-configurations-dialog-local.md) dialog box if it's displayed, and then choose **Run** or **Debug**. Results are displayed in the **Run** or **Debug** tool window.
   + If the **Edit configuration** dialog box doesn't appear and you want to change the existing configuration, change it first, and then repeat this procedure from the beginning. 
   + If the configuration details are missing, expand **Templates**, **AWS Lambda**, and then choose **Local**. Choose **OK**, and then repeat this procedure from the beginning. 
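If the run configuration asks for an input event, supply a JSON payload shaped like the event your function expects. For a function that responds to EC2 instance state changes, a test event might look like the following; the account, instance, and event IDs are placeholders.

```json
{
  "version": "0",
  "id": "00000000-0000-0000-0000-000000000000",
  "detail-type": "EC2 Instance State-change Notification",
  "source": "aws.ec2",
  "account": "123456789012",
  "time": "2023-01-01T00:00:00Z",
  "region": "us-east-1",
  "resources": ["arn:aws:ec2:us-east-1:123456789012:instance/i-0123456789abcdef0"],
  "detail": {
    "instance-id": "i-0123456789abcdef0",
    "state": "running"
  }
}
```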

# Running (invoking) the remote version of an AWS Lambda function by using the AWS Toolkit for JetBrains
<a name="lambda-remote"></a>

A *remote* version of an AWS Lambda function is a function whose source code already exists in the Lambda service for an AWS account.

To complete this procedure, you must first install the AWS Toolkit for JetBrains and, if you haven't yet, connect to an AWS account for the first time. Then with IntelliJ IDEA, PyCharm, WebStorm, or JetBrains Rider running, do the following.

1. Open AWS Explorer, if it isn't already open. If you need to switch to a different AWS Region that contains the function, do that now.

1. Expand **Lambda**, and confirm that the name of the function is listed. If it is, skip ahead to step 3 in this procedure.

   If the name of the function isn't listed, create the Lambda function that you want to run (invoke). 

   If you created the function as part of an AWS serverless application, you must also deploy that application.

   If you created the function by creating a code file that implements a function handler for [Java](https://docs.aws.amazon.com/lambda/latest/dg/java-programming-model-handler-types.html), [Python](https://docs.aws.amazon.com/lambda/latest/dg/python-programming-model-handler-types.html), [Node.js](https://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html), or [C\#](https://docs.aws.amazon.com/lambda/latest/dg/dotnet-programming-model-handler-types.html), then in the code file, choose the Lambda icon next to the function handler. Then choose **Create new AWS Lambda**. Complete the [Create Function](create-function-dialog.md) dialog box, and then choose **Create Function**.

1. With **Lambda** open in **AWS Explorer**, right-click the name of the function, and then choose **Run '[Remote]'**.  
![\[Running the remote version of a Lambda function by starting from AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/lambda-remote.png)

1. Complete the [Edit configuration (remote function settings)](run-debug-configurations-dialog-remote.md) dialog box if it's displayed, and then choose **Run** or **Debug**. Results are displayed in the **Run** or **Debug** tool window.
   + If the **Edit configuration** dialog box doesn't appear and you want to change the existing configuration, change it first, and then repeat this procedure from the beginning. 
   + If the configuration details are missing, expand **Templates**, **AWS Lambda**, and then choose **Remote**. Choose **OK**, and then repeat this procedure from the beginning. 

# Changing (updating) AWS Lambda function settings by using the AWS Toolkit for JetBrains
<a name="lambda-update"></a>

To use the AWS Toolkit for JetBrains to change (update) the settings for an AWS Lambda function, do one of the following.
+ With the code file open that contains the function handler for [Java](https://docs.aws.amazon.com/lambda/latest/dg/java-programming-model-handler-types.html), [Python](https://docs.aws.amazon.com/lambda/latest/dg/python-programming-model-handler-types.html), [Node.js](https://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html), or [C\#](https://docs.aws.amazon.com/lambda/latest/dg/dotnet-programming-model-handler-types.html), on the main menu, choose **Run**, **Edit Configurations**. Complete the [Run/Debug Configurations](run-debug-configurations-dialog.md) dialog box, and then choose **OK**.
+ Open AWS Explorer, if it isn't already open. If you need to switch to a different AWS Region that contains the function, do that now. Expand **Lambda**, choose the name of the function to change the configuration for, and then do one of the following:
  + **Change settings such as the timeout, memory, environment variables, and execution role –** Right-click the name of the function, and then choose **Update Function Configuration**.  
![\[Choosing the Update Function Configuration command\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/update-function-configuration.png)

    Complete the [Update Configuration](update-configuration-dialog.md) dialog box, and then choose **Update**. 
  + **Change settings such as the input payload** – On the main menu, choose **Run**, **Edit Configurations**. Complete the [Run/Debug Configurations](run-debug-configurations-dialog.md) dialog box, and then choose **OK**.  
![\[Choosing the Edit Configurations command\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/edit-configurations.png)

    If the configuration details are missing, first expand **Templates**, **AWS Lambda**, and then choose **Local** (for the local version of the function) or **Remote** (for the remote version of that same function). Choose **OK**, and then repeat this procedure from the beginning.
  + **Change settings such as the function handler name or Amazon Simple Storage Service (Amazon S3) source bucket** – Right-click the function name, and then choose **Update Function Code**.  
![\[Choosing the Update Function Code command\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/update-function-code.png)

    Complete the [Update Code](update-code-dialog.md) dialog box, and then choose **Update**.
  + **Change other available property settings that aren't listed in the preceding bullets** – Change those settings in the function's corresponding AWS SAM template file (for example, in a file named `template.yaml` within the project). 

    For a list of available property settings, see [AWS::Serverless::Function](https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction) in the [awslabs/serverless-application-model](https://github.com/awslabs/serverless-application-model/) repository on GitHub. 

# Deleting an AWS Lambda function by using the AWS Toolkit for JetBrains
<a name="lambda-delete"></a>

You can use the AWS Toolkit to delete an AWS Lambda function that is part of an AWS serverless application, or you can delete a standalone Lambda function.

**Note**  
A deleted Lambda function can't be recovered. Before confirming the deletion, verify that you've selected the exact function and version that you intend to delete.

To delete a Lambda function that is part of an AWS serverless application, skip the rest of this topic and see [Deleting an application](sam-delete.md) instead.

To delete a standalone Lambda function, do the following.

1. Open AWS Explorer, if it isn't already open. If you need to switch to a different AWS Region that contains the function, do that now.

1. Expand **Lambda**.

1. Right-click the name of the function to delete, and then choose **Delete Function**.  
![\[Choosing the Delete Function command\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/lambda-delete.png)

1. Enter the function's name to confirm the deletion, and then choose **OK**. If the function deletion succeeds, the AWS Toolkit for JetBrains removes the function name from the **Lambda** list.

# Accessing Amazon RDS by using the AWS Toolkit for JetBrains
<a name="accessing-rds"></a>

Using Amazon Relational Database Service (Amazon RDS), you can provision and manage SQL relational database systems in the cloud. Using AWS Toolkit for JetBrains, you can connect to and interact with the following Amazon RDS database engines:
+ Aurora – A MySQL- and PostgreSQL-compatible relational database built for the cloud. For more information, see the [Amazon Aurora overview](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/CHAP_AuroraOverview.html) in the *Amazon Aurora User Guide*.
+ MySQL – Amazon RDS supports several major versions of the open-source relational database. For more information, see [MySQL on Amazon RDS](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_MySQL.html) in the *Amazon RDS User Guide*.
+ PostgreSQL – Amazon RDS supports several major versions of the open-source object-relational database. For more information, see [PostgreSQL on Amazon RDS](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_PostgreSQL.html) in the *Amazon RDS User Guide*.

The following topics describe the prerequisites for accessing RDS databases and how to use AWS Toolkit for JetBrains to connect to a database instance.

**Topics**
+ [Prerequisites for accessing Amazon RDS databases](rds-access-prerequisities.md)
+ [Connecting to an Amazon RDS database](rds-connection.md)

# Prerequisites for accessing Amazon RDS databases
<a name="rds-access-prerequisities"></a>

Before you can connect to an Amazon RDS database using AWS Toolkit for JetBrains, you need to complete the following tasks: 
+ [Create a DB instance and set up its authentication method](#db-authentication)
+ [Download and install DataGrip](#datagrip-info)

## Creating an Amazon RDS DB instance and configuring an authentication method
<a name="db-authentication"></a>

AWS Toolkit for JetBrains enables you to connect to an Amazon RDS DB instance that's already been created and configured in AWS. A DB instance is an isolated database environment running in the cloud that can contain multiple user-created databases. For information about creating DB instances for the supported database engines, see [Getting started with Amazon RDS](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_GettingStarted.html) in the *Amazon RDS User Guide*. 

When connecting to a database using AWS Toolkit for JetBrains, users can choose to authenticate using IAM credentials or Secrets Manager. The following table describes key features and information resources for both options: 



| Authentication methods | How it works | More information | 
| --- | --- | --- | 
|  Connect with IAM credentials  |  With IAM database authentication, you don't need to store user credentials in the database because authentication is managed externally using AWS Identity and Access Management (IAM) credentials. By default, IAM database authentication is disabled on DB instances. You can enable IAM database authentication (or disable it again) using the AWS Management Console, AWS CLI, or the API.   |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/rds-access-prerequisities.html)  | 
|  Connect with AWS Secrets Manager  |  A database administrator can store credentials for a database as a secret in Secrets Manager. Secrets Manager encrypts and stores the credentials within the secret as the *protected secret text*. When an application with permissions accesses the database, Secrets Manager decrypts the protected secret text and returns it over a secured channel. The client parses the returned credentials, connection string, and any other required information and then uses that information to access the database.  |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/rds-access-prerequisities.html)  | 
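As a sketch of the Secrets Manager flow described above, the following shows how a client might parse the returned secret text into connection parameters. The field names follow the JSON layout commonly used for RDS database secrets, but your secret's structure is an assumption here, and no AWS call is made in this example.

```python
import json


def connection_params(secret_string):
    """Parse a Secrets Manager secret string (assumed JSON) into the
    pieces a database client needs to open a connection."""
    secret = json.loads(secret_string)
    return {
        "host": secret["host"],
        "port": int(secret["port"]),
        "user": secret["username"],
        "password": secret["password"],
        "database": secret.get("dbname", ""),
    }


# Illustrative secret text; a real value comes from Secrets Manager.
example_secret = json.dumps({
    "username": "admin",
    "password": "example-password",
    "host": "mydb.abc123.us-east-1.rds.amazonaws.com",
    "port": "3306",
    "dbname": "sales",
})
params = connection_params(example_secret)
```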

## Working with Amazon RDS databases using DataGrip
<a name="datagrip-info"></a>

After you've connected to an Amazon RDS data source, you can start interacting with it. By using DataGrip from JetBrains, you can carry out database tasks such as writing SQL, running queries, and importing/exporting data. Features provided by DataGrip are also available in the database plugin for a range of JetBrains IDEs. For information about DataGrip, see [https://www.jetbrains.com/datagrip/](https://www.jetbrains.com/datagrip/).

# Connecting to an Amazon RDS database
<a name="rds-connection"></a>

With **AWS Explorer**, you can select an Amazon RDS database, choose an authentication method, and then configure the connection settings. After you've successfully tested the connection, you can start interacting with the data source using JetBrains DataGrip. 

**Important**  
Ensure that you've completed the [prerequisites](rds-access-prerequisities.md) to enable users to access and interact with Amazon RDS databases.

Select a tab for instructions on connecting to a database instance using your preferred authentication method.

------
#### [ Connect with IAM credentials ]

1. Open AWS Explorer, if it isn't already open.

1. Click the **Amazon RDS** node to expand the list of supported database engines.

1. Click a supported database engine (Aurora, MySQL, or PostgreSQL) node to expand the list of available database instances.
**Note**  
If you select Aurora, you can choose between expanding a MySQL cluster and a PostgreSQL cluster. 

1. Right-click a database and choose **Connect with IAM credentials**.
**Note**  
You can also choose **Copy Arn** to add the database's Amazon Resource Name (ARN) to your clipboard.

1. In the **Data Sources and Drivers** dialog box, do the following to ensure a database connection can be opened:
   + In the **Imported Data Sources** pane, confirm that the correct data source is selected.
   + If a message indicates that you need to **Download missing driver files**, choose **Go to Driver** (the wrench icon) to download the required files.

1. In the **General** tab of the **Settings** pane, confirm that the following fields display the correct values: 
   + **Host/Port** – The endpoint and port used for connections to the database. For Amazon RDS databases hosted in the AWS Cloud, endpoints always end with `rds.amazonaws.com`. If you're connecting to a DB instance through a proxy, use these fields to specify the proxy's connection details.
   + **Authentication** – **AWS IAM** (authentication using IAM credentials).
   + **User** – The name of your database user account.
   + **Credentials** – The credentials used to access your AWS account. 
   + **Region** – The AWS Region where the database is hosted. 
   + **RDS Host/Port** – The endpoint and port for the database as listed in the AWS Management Console. If you're using a different endpoint to connect to a DB instance, specify the proxy's connection details in the **Host/Port** fields (described previously).
   + **Database** – The name of the database. 
   + **URL** – The URL that the JetBrains IDE will use to connect to the database.  
![\[Connection settings for an Amazon RDS database with IAM credentials used for authentication.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/rds-auth-iam.png)
**Note**  
For a full description of the connection settings that you can configure using the **Data sources and drivers** dialog box, see the [documentation for the JetBrains IDE](https://www.jetbrains.com/help/) that you're using. 

1. To verify the connection settings are correct, choose **Test Connection**.

   A green check mark indicates a successful test.

1. Choose **Apply** to apply your settings, and then choose **OK** to start working with the data source.

   The **Database** tool window opens. This displays the available data sources as a tree with nodes representing database elements such as schemas, tables, and keys. 
**Important**  
To use the **Database** tool window, you must first download and install DataGrip from JetBrains. For more information, see [https://www.jetbrains.com/datagrip/](https://www.jetbrains.com/datagrip/). 

------
#### [ Connect with Secrets Manager ]

1. Open AWS Explorer, if it isn't already open.

1. Click the **Amazon RDS** node to expand the list of supported database engines.

1. Click a supported database engine (Aurora, MySQL, or PostgreSQL) node to expand the list of available database instances.
**Note**  
If you select Aurora, you can choose between expanding a MySQL cluster and a PostgreSQL cluster. 

1. Right-click a database and choose **Connect with Secrets Manager**.
**Note**  
You can also choose **Copy Arn** to add the database's Amazon Resource Name (ARN) to your clipboard.

1. In the **Select a Database Secret** dialog box, use the drop-down field to pick credentials for the database, and then choose **Create**.

1. In the **Data Sources and Drivers** dialog box, do the following to ensure a database connection can be opened:
   + In the **Imported Data Sources** pane, confirm that the correct data source is selected.
   + If a message indicates that you need to **Download missing driver files**, choose **Go to Driver** (the wrench icon) to download the required files.

1. In the **General** tab of the **Settings** pane, confirm that the following fields display the correct values: 
   + **Host/Port** – The endpoint and port used for connections to the database. For Amazon RDS databases hosted in the AWS Cloud, endpoints always end with `rds.amazonaws.com`. If you're connecting to a database through a proxy, use these fields to specify the proxy's connection details.
   + **Authentication** – **SecretsManager Auth** (authentication using AWS Secrets Manager).
   + **Credentials** – The credentials used to access your AWS account.
   + **Region** – The AWS Region where the database is hosted. 
   + **Secret Name/ARN** – The name and ARN of the secret containing authentication credentials. To override the connection settings in the **Host/Port** fields, select the **Use the url and port from the secret** check box. 
   + **Database** – The name of the database instance you selected in **AWS Explorer**. 
   + **URL** – The URL that the JetBrains IDE will use to connect to the database.
**Note**  
If you're using Secrets Manager for authentication, there are no fields for a user name and password for the database. This information is contained in the encrypted secret data portion of a secret.  
![\[Connection settings with IAM credentials used for authentication.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/rds-auth-asm.png)
**Note**  
For a full description of the connection settings that you can configure using the **Data sources and drivers** dialog box, see the [documentation for the JetBrains IDE](https://www.jetbrains.com/help/) that you're using. 

1. To verify the connection settings are correct, choose **Test Connection**.

   A green check mark indicates a successful test.

1. Choose **Apply** to apply your settings, and then choose **OK** to start working with the data source.

   The **Database** tool window opens. This displays the available data sources as a tree with nodes representing database elements such as schemas, tables, and keys. 
**Important**  
To use the **Database** tool window, you must first download and install DataGrip from JetBrains. For more information, see [https://www.jetbrains.com/datagrip/](https://www.jetbrains.com/datagrip/). 

------

# Accessing Amazon Redshift by using the AWS Toolkit for JetBrains
<a name="accessing-redshift"></a>

An Amazon Redshift data warehouse is an enterprise-class relational database query and management system. With AWS Toolkit for JetBrains, you can connect to and interact with Amazon Redshift clusters. An Amazon Redshift cluster consists of a collection of nodes that enables clients to query databases hosted on that cluster. 

The following topics describe the prerequisites for accessing Amazon Redshift clusters and how to use AWS Toolkit for JetBrains to connect to a database in a cluster.

**Topics**
+ [Prerequisites for accessing Amazon Redshift clusters](redshift-access-prerequisities.md)
+ [Connecting to an Amazon Redshift cluster](redshift-connection.md)

# Prerequisites for accessing Amazon Redshift clusters
<a name="redshift-access-prerequisities"></a>

Before you can start interacting with an Amazon Redshift cluster using AWS Toolkit for JetBrains, you need to complete the following tasks: 
+ [Create an Amazon Redshift cluster and set up its authentication method](#cluster-authentication)
+ [Download and install DataGrip](#datagrip-info-rs)

## Creating an Amazon Redshift cluster and configuring an authentication method
<a name="cluster-authentication"></a>

 AWS Toolkit for JetBrains enables you to connect to an Amazon Redshift cluster that's already created and configured in AWS. Each cluster contains one or more databases. For information about creating and configuring Amazon Redshift clusters, see [Getting started with Amazon Redshift](https://docs.aws.amazon.com/redshift/latest/gsg/getting-started.html) in the *Amazon Redshift Getting Started Guide*.

When connecting to a cluster using AWS Toolkit for JetBrains, users can choose to authenticate using IAM credentials or AWS Secrets Manager. The following table describes key features and information resources for both options: 



| Authentication methods | How it works | More information | 
| --- | --- | --- | 
|  Connect with IAM credentials  |  With IAM database authentication, you don't need to store user credentials in the database because authentication is managed externally using AWS Identity and Access Management (IAM) credentials. By default, IAM database authentication is disabled on database instances. You can enable IAM database authentication (or disable it again) using the AWS Management Console, AWS CLI, or the API.   |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/redshift-access-prerequisities.html)  | 
|  Connect with AWS Secrets Manager  |  A database administrator can store credentials for a database as a secret in Secrets Manager. Secrets Manager encrypts and stores the credentials within the secret as the *protected secret text*. When an application with permissions accesses the database, Secrets Manager decrypts the protected secret text and returns it over a secured channel. The client parses the returned credentials, connection string, and any other required information and then uses that information to access the database.  |  [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/redshift-access-prerequisities.html)  | 
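As a rough sketch of what the two authentication paths in the table do under the hood, the following uses the AWS SDK for Python (Boto3). The cluster ID, database user, and secret name you would pass in are placeholders, and the toolkit performs equivalent calls for you; this is only an illustration, not the toolkit's implementation.

```python
import json

def jdbc_url(host, port, database):
    """Build the JDBC URL shown in the connection dialog's URL field."""
    return f"jdbc:redshift://{host}:{port}/{database}"

def iam_db_credentials(cluster_id, db_user, db_name):
    """IAM path: ask Redshift for short-lived database credentials,
    so no password needs to be stored in the database."""
    import boto3
    resp = boto3.client("redshift").get_cluster_credentials(
        ClusterIdentifier=cluster_id, DbUser=db_user,
        DbName=db_name, DurationSeconds=900, AutoCreate=False)
    return resp["DbUser"], resp["DbPassword"]

def secret_db_credentials(secret_id):
    """Secrets Manager path: decrypt the protected secret text and
    parse the JSON credentials it holds."""
    import boto3
    raw = boto3.client("secretsmanager").get_secret_value(SecretId=secret_id)
    secret = json.loads(raw["SecretString"])
    return secret["username"], secret["password"]
```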

## Working with Amazon Redshift databases using DataGrip
<a name="datagrip-info-rs"></a>

After you've connected to a database in an Amazon Redshift cluster, you can start interacting with it. Using DataGrip from JetBrains, you can carry out database tasks such as writing SQL, running queries, and importing or exporting data. The features provided by DataGrip are also available in the database plugin for a range of JetBrains IDEs. For information about DataGrip, see [https://www.jetbrains.com/datagrip/](https://www.jetbrains.com/datagrip/).

# Connecting to an Amazon Redshift cluster
<a name="redshift-connection"></a>

With **AWS Explorer**, you can select an Amazon Redshift cluster, choose an authentication method, and then configure the connection settings. After you've successfully tested the connection, you can start interacting with the data source using JetBrains DataGrip. 

**Important**  
Ensure that you've completed the [prerequisites](redshift-access-prerequisities.md) to enable users to access and interact with Amazon Redshift clusters and databases.

Select a tab for instructions on connecting to a cluster using your preferred authentication method.

------
#### [ Connect with IAM credentials ]

1. Open AWS Explorer, if it isn't already open.

1. Click the **Amazon Redshift** node to expand the list of available clusters.

1. Right-click a cluster and choose **Connect with IAM credentials**.
**Note**  
You can also choose **Copy Arn** to add the cluster's Amazon Resource Name (ARN) to your clipboard.

1. In the **Data Sources and Drivers** dialog box, do the following to ensure a database connection can be opened:
   + In the **Imported Data Sources** pane, confirm that the correct data source is selected.
   + If a message indicates that you need to **Download missing driver files**, choose **Go to Driver** (the wrench icon) to download the required files.

1. On the **General** tab of the **Settings** pane, confirm that the following fields display the correct values: 
   + **Host/Port** – The endpoint and port used for connections to the cluster. For Amazon Redshift clusters hosted in the AWS Cloud, endpoints always end with `redshift.amazonaws.com`.
   + **Authentication** – **AWS IAM** (authentication using IAM credentials). 
   + **User** – The name of your database user account.
   + **Credentials** – The credentials used to access your AWS account. 
   + **Region** – The AWS Region where the database is hosted. 
   + **Cluster ID** – The ID of the cluster you selected in **AWS Explorer**. 
   + **Database** – The name of the database in the cluster you'll connect to. 
   + **URL** – The URL that the JetBrains IDE will use to connect to the cluster's database.  
![\[Connection settings for an Amazon Redshift cluster with IAM credentials used for authentication.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/redshift-auth-iam.png)
**Note**  
For a full description of the connection settings that you can configure using the **Data sources and drivers** dialog box, see the [documentation for the JetBrains IDE](https://www.jetbrains.com/help/) that you're using. 

1. To verify the connection settings are correct, choose **Test Connection**.

   A green check mark indicates a successful test.

1. Choose **Apply** to apply your settings, and then choose **OK** to start working with the data source.

   The **Database** tool window opens. This displays the available data sources as a tree with nodes representing database elements such as schemas, tables, and keys. 
**Important**  
To use the **Database** tool window, you must first download and install DataGrip from JetBrains. For more information, see [https://www.jetbrains.com/datagrip/](https://www.jetbrains.com/datagrip/). 

------
#### [ Connect with Secrets Manager ]

1. Open AWS Explorer, if it isn't already open.

1. Click the **Amazon Redshift** node to expand the list of available clusters.

1. Right-click a cluster and choose **Connect with Secrets Manager**.
**Note**  
You can also choose **Copy Arn** to add the cluster's Amazon Resource Name (ARN) to your clipboard.

1. In the **Select a Database Secret** dialog box, use the drop-down field to pick credentials for the database, and then choose **Create**.

1. In the **Data Sources and Drivers** dialog box, do the following to ensure a database connection can be opened:
   + In the **Imported Data Sources** pane, confirm that the correct data source is selected.
   + If a message appears in the dialog box to **Download missing driver files**, choose **Go to Driver** (the wrench icon) to download the required files.

1. On the **General** tab of the **Settings** pane, confirm that the following fields display the correct values: 
   + **Host/Port** – The endpoint and port used for connections to the cluster. For Amazon Redshift clusters hosted in the AWS Cloud, endpoints always end with `redshift.amazonaws.com`.
   + **Authentication** – **SecretsManager Auth** (authentication using AWS Secrets Manager). 
   + **Credentials** – The credentials used to connect to the AWS account. 
   + **Region** – The AWS Region where the cluster is hosted. 
   + **Secret Name/ARN** – The name and ARN of the secret containing authentication credentials. If you want to override the connection settings in the **Host/Port** fields, select the **Use the url and port from the secret** check box.
   + **Database** – The name of the database in the cluster you'll connect to. 
   + **URL** – The URL that the JetBrains IDE will use to connect to the database.
**Note**  
If you're using AWS Secrets Manager for authentication, there are no fields for specifying a user name and password for the cluster. This information is contained in the encrypted secret data portion of a secret.  
![\[Connection settings for an Amazon Redshift cluster with Secrets Manager used for authentication.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/redshift-auth-asm.png)
**Note**  
For a full description of the connection settings that you can configure using the **Data sources and drivers** dialog box, see the [documentation for the JetBrains IDE](https://www.jetbrains.com/help/) that you're using. 

1. To verify the connection settings are correct, choose **Test Connection**.

   A green check mark indicates a successful test.

1. Choose **Apply** to apply your settings, and then choose **OK** to start working with the data source.

   The **Database** tool window opens. This displays the available data sources as a tree with nodes representing database elements such as schemas, tables, and keys. 
**Important**  
To use the **Database** tool window, you must first download and install DataGrip from JetBrains. For more information, see [https://www.jetbrains.com/datagrip/](https://www.jetbrains.com/datagrip/). 

------

# Working with Amazon S3 by using the AWS Toolkit for JetBrains
<a name="building-S3"></a>

The following topics describe how to use the AWS Toolkit for JetBrains to work with Amazon S3 buckets and objects in an AWS account.

**Topics**
+ [Working with Amazon S3 buckets](work-with-S3-buckets.md)
+ [Working with Amazon S3 objects](work-with-S3-objects.md)

# Working with Amazon S3 buckets by using the AWS Toolkit for JetBrains
<a name="work-with-S3-buckets"></a>

Every object you store in Amazon S3 resides in a bucket. You can use buckets to group related objects in the same way that you use a directory to group files in a file system.

**Topics**
+ [Creating an Amazon S3 bucket](#creating-s3-bucket)
+ [Viewing Amazon S3 buckets](#viewing-s3-bucket)
+ [Deleting an Amazon S3 bucket](#deleting-s3-buckets)

## Creating an Amazon S3 bucket
<a name="creating-s3-bucket"></a>

1. Open AWS Explorer, if it isn't already open.

1. Right-click the **Amazon S3** node and choose **Create S3 Bucket**.  
![\[Creating an Amazon S3 bucket in AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/s3-bucket-create.png)

1. In the **Create S3 Bucket** dialog box, enter a name for the bucket.
**Note**  
Because Amazon S3 allows your bucket to be used as a URL that can be accessed publicly, the bucket name that you choose must be globally unique. If some other account has already created a bucket with the name that you chose, you must use another name. For more information, see [Bucket Restrictions and Limitations](https://docs.aws.amazon.com/AmazonS3/latest/userguide/BucketRestrictions.html) in the *Amazon Simple Storage Service User Guide*.

1. Choose **Create**.
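The create step can be sketched with the AWS SDK for Python (Boto3). The Region, the wrapper function, and the simplified client-side name check below are illustrative assumptions; the full naming rules are in the linked Amazon S3 guide.

```python
import re

_BUCKET_NAME = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def valid_bucket_name(name):
    """Rough client-side check of the globally-unique naming rules:
    3-63 characters of lowercase letters, digits, hyphens, and dots."""
    return bool(_BUCKET_NAME.match(name))

def create_bucket(name, region="us-west-2"):
    """Create an S3 bucket in the given Region (a sketch, not the
    toolkit's own implementation)."""
    import boto3
    if not valid_bucket_name(name):
        raise ValueError(f"invalid bucket name: {name}")
    # Outside us-east-1, the Region is passed as a location constraint.
    boto3.client("s3", region_name=region).create_bucket(
        Bucket=name,
        CreateBucketConfiguration={"LocationConstraint": region})
```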

## Viewing Amazon S3 buckets
<a name="viewing-s3-bucket"></a>

1. Open AWS Explorer, if it isn't already open.

1. Click the **Amazon S3** node to expand the list of buckets.
   + The S3 buckets for the [current AWS Region](setup-region.md#setup-region-current-region) are displayed beneath the **Amazon S3** node.

## Deleting an Amazon S3 bucket
<a name="deleting-s3-buckets"></a>

1. Open AWS Explorer, if it isn't already open.

1. Click the **Amazon S3** node to expand the list of buckets.

1. Right-click the bucket to delete, and then choose **Delete S3 Bucket**.  
![\[Deleting an Amazon S3 bucket in AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/s3-bucket-delete.png)

1. Enter the bucket's name to confirm the deletion, and then choose **OK**.
   + If the bucket contains objects, the bucket is emptied before deletion. A notification is displayed after the deletion is complete.
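The empty-then-delete behavior described above can be approximated with Boto3; the `chunk` helper and function names are assumptions for illustration.

```python
def chunk(keys, size=1000):
    """S3 DeleteObjects accepts at most 1,000 keys per request."""
    return [keys[i:i + size] for i in range(0, len(keys), size)]

def delete_bucket(name):
    """Empty the bucket first (S3 refuses to delete a non-empty
    bucket), then delete the bucket itself."""
    import boto3
    s3 = boto3.client("s3")
    keys = [obj["Key"]
            for page in s3.get_paginator("list_objects_v2").paginate(Bucket=name)
            for obj in page.get("Contents", [])]
    for batch in chunk(keys):
        s3.delete_objects(Bucket=name,
                          Delete={"Objects": [{"Key": k} for k in batch]})
    s3.delete_bucket(Bucket=name)
```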

# Working with Amazon S3 objects by using the AWS Toolkit for JetBrains
<a name="work-with-S3-objects"></a>

Objects are the fundamental entities stored in Amazon S3. Objects consist of object data and metadata.

**Topics**
+ [Viewing an object in an Amazon S3 bucket](#viewing-s3-object-in-bucket)
+ [Opening an object in the IDE](#opening-s3-object-in-IDE)
+ [Uploading an object](#uploading-s3-object)
+ [Downloading an object](#downloading-s3-object)
+ [Deleting an object](#deleting-s3-object)

## Viewing an object in an Amazon S3 bucket
<a name="viewing-s3-object-in-bucket"></a>

This procedure opens the **S3 Bucket Viewer**. You can use it to view, upload, download, and delete objects grouped by folders in an Amazon S3 bucket.

1. Open AWS Explorer, if it isn't already open.

1. To view a bucket's objects, do one of the following:
   + Double-click the name of the bucket.
   + Right-click the name of the bucket, and then choose **View Bucket**.

The **S3 Bucket Viewer** displays information about the bucket's name, [Amazon Resource Name (ARN)](https://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html), and creation date. The objects and folders in the bucket are available in the pane beneath.

## Opening an object in the IDE
<a name="opening-s3-object-in-IDE"></a>

If the object in an Amazon S3 bucket is a file type recognized by the IDE, you can download a read-only copy and open it in the IDE.

1. To find an object to download, open the **S3 Bucket Viewer** (see [Viewing an object in an Amazon S3 bucket](#viewing-s3-object-in-bucket)).

1. Double-click the name of the object.

The file opens in the default IDE window for that file type.

## Uploading an object
<a name="uploading-s3-object"></a>

1. To find the folder you want to upload objects to, open the **S3 Bucket Viewer** (see [Viewing an object in an Amazon S3 bucket](#viewing-s3-object-in-bucket)).

1. Right-click the folder, and then choose **Upload**.

1. In the dialog box, select the files to upload.
**Note**  
You can upload multiple files at once. You can't upload directories.

1. Choose **OK**.
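A minimal Boto3 sketch of the upload step; the `object_key` helper and parameter names are illustrative assumptions (the toolkit performs the upload for you). Folders in S3 are key prefixes, and directories themselves can't be uploaded.

```python
import os

def object_key(folder, filename):
    """Place the object under the chosen folder (a key prefix)."""
    return f"{folder.strip('/')}/{filename}" if folder else filename

def upload_files(bucket, folder, paths):
    """Upload several local files into one bucket folder."""
    import boto3
    s3 = boto3.client("s3")
    for path in paths:
        if os.path.isdir(path):
            raise ValueError(f"{path} is a directory; only files upload")
        s3.upload_file(path, bucket, object_key(folder, os.path.basename(path)))
```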

## Downloading an object
<a name="downloading-s3-object"></a>

1. To find a folder to download objects from, open the **S3 Bucket Viewer** (see [Viewing an object in an Amazon S3 bucket](#viewing-s3-object-in-bucket)).

1. Choose a folder to display its objects.

1. Right-click an object, and then choose **Download**.

1. In the dialog box, select the download location.
**Note**  
If you're downloading multiple files, ensure you select the path name instead of the folder. You can't download directories.

1. Choose **OK**.
**Note**  
If a file already exists in the download location, you can overwrite it or leave it in place by skipping the download.
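The overwrite-or-skip choice noted above can be mirrored in a small Boto3 sketch; the function and parameter names are assumptions.

```python
import os

def download_object(bucket, key, dest_dir, overwrite=False):
    """Download one object, skipping the transfer when a file with
    the same name already exists (mirrors the toolkit's prompt)."""
    target = os.path.join(dest_dir, os.path.basename(key))
    if os.path.exists(target) and not overwrite:
        return None          # skipped
    import boto3
    boto3.client("s3").download_file(bucket, key, target)
    return target
```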

## Deleting an object
<a name="deleting-s3-object"></a>

1. To find the object to delete, open the **S3 Bucket Viewer** (see [Viewing an object in an Amazon S3 bucket](#viewing-s3-object-in-bucket)).

1. After you select the object, delete it by doing one of the following:
   + Press **Delete**.
   + Right-click, and then choose **Delete**.
**Note**  
You can select and delete multiple objects at once.

1. To confirm the deletion, choose **Delete**.

# Working with AWS serverless applications by using the AWS Toolkit for JetBrains
<a name="sam"></a>

The following topics describe how to use the AWS Toolkit for JetBrains to work with AWS serverless applications in an AWS account.

**Topics**
+ [Creating an application](deploy-serverless-app.md)
+ [Syncing an application](sam-sync.md)
+ [Changing (updating) application settings](sam-update.md)
+ [Deleting an application](sam-delete.md)

# Creating an AWS serverless application by using the AWS Toolkit for JetBrains
<a name="deploy-serverless-app"></a>

To complete this procedure, you must first install the AWS Toolkit and, if you haven't yet, connect to an AWS account for the first time.

1. With IntelliJ IDEA, PyCharm, WebStorm, or JetBrains Rider already running, do one of the following:
   + For IntelliJ IDEA or WebStorm, choose **File**, **New**, **Project**.
   + For PyCharm, choose **File**, **New Project**.
   + For JetBrains Rider, choose **File**, **New** for a new solution. Or right-click an existing solution in the **Explorer** tool window, and then choose **Add**, **New Project**.

1. For IntelliJ IDEA, choose **AWS**, **AWS Serverless Application**, and then choose **Next**.  
![\[Choosing to create an AWS serverless application in IntelliJ IDEA\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/sam-create-intellij.png)

   For PyCharm, choose **AWS Serverless Application**.  
![\[Choosing to create an AWS serverless application in PyCharm\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/sam-create-pycharm.png)

   For WebStorm, choose **AWS Serverless Application**.  
![\[Choosing to create an AWS serverless application in WebStorm\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/sam-create-webstorm.png)

   For JetBrains Rider, choose **AWS Serverless Application**.  
![\[Choosing to create an AWS serverless application in JetBrains Rider\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/sam-create-rider.png)

1. Complete the [New Project dialog box (or the New Solution dialog box for JetBrains Rider)](new-project-dialog.md), and then choose **Finish** (for IntelliJ IDEA) or **Create** (for PyCharm, WebStorm, or JetBrains Rider). The AWS Toolkit for JetBrains creates the project and adds the serverless application's code files to the new project.

1. If you're using IntelliJ IDEA, with the **Project** tool window already open and displaying the project that contains the serverless application's files, do one of the following:
   + For Maven-based projects, right-click the project's `pom.xml` file, and then choose **Add as Maven Project**.  
![\[Choosing to add the POM file as a Maven project\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/add-as-maven-project.png)
   + For Gradle-based projects, right-click the project's `build.gradle` file, and then choose **Import Gradle project**.  
![\[Choosing to import the Gradle project\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/import-gradle-project.png)

     Complete the **Import Module from Gradle** dialog box, and then choose **OK**.

After you create the serverless application, you can run (invoke) or debug the local version of an AWS Lambda function that is contained in that application.

You can also deploy the serverless application. After you deploy it, you can run (invoke) the remote version of a Lambda function that is part of that deployed application.

# Syncing AWS SAM applications from the AWS Toolkit for JetBrains
<a name="sam-sync"></a>

AWS Serverless Application Model (AWS SAM) `sam sync` is an AWS SAM CLI deployment command that automatically identifies changes made to your serverless applications, then chooses the best way to build and deploy those changes to the AWS Cloud. If you've only changed your application code without changing the infrastructure, AWS SAM Sync updates your application without redeploying your AWS CloudFormation stack.

For additional information about `sam sync` and AWS SAM CLI commands, see the [AWS SAM CLI command reference](https://docs.aws.amazon.com//serverless-application-model/latest/developerguide/serverless-sam-cli-command-reference.html) topic in the *AWS Serverless Application Model User Guide*.

The following sections describe how to get started working with AWS SAM Sync.

## Prerequisites
<a name="w2aac17c37c11b9"></a>

Prior to working with AWS SAM Sync, the following prerequisites must be met:
+ You have a working AWS SAM application. For more information on creating an AWS SAM application, see the [Working with AWS SAM](https://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/key-tasks.html#key-tasks-sam-create) topic in this User Guide.
+ You've installed version 1.78.0 (or later) of the AWS SAM CLI. For more information on installing the AWS SAM CLI, see the [Installing the AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/install-sam-cli.html) topic in the *AWS Serverless Application Model User Guide*.
+ Your application is running in a development environment.

**Note**  
To sync and deploy a serverless application that contains an AWS Lambda function with any non-default properties, the optional properties must be set in the AWS SAM template file associated with the AWS Lambda function, prior to deployment.  
To learn more about AWS Lambda properties, see the [AWS::Serverless::Function](https://github.com/aws/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction) section in the *AWS Serverless Application Model User Guide* on GitHub.

## Getting Started
<a name="w2aac17c37c11c11"></a>

To get started working with AWS SAM Sync, complete the following procedure.

**Note**  
Make sure that your AWS Region is set to the location associated with your serverless application.  
To learn more about changing your AWS region from the AWS Toolkit for JetBrains, see the [Switch between AWS Regions](https://docs.aws.amazon.com//toolkit-for-jetbrains/latest/userguide/key-tasks.html#key-tasks-switch-region) topic in this User Guide.

1. From your serverless application project in the **Project** tool window, open the context menu for (right-click) your `template.yaml` file.

1. From the `template.yaml` context menu, choose **Sync Serverless Application (formerly Deploy)** to open the **Confirm development stack** dialog.

1. Confirm that you are working from a development stack to open the **Sync Serverless Application** dialog.  
![\[Confirm development stack dialog\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/sam-sync-dev-stack.png)

1. Complete the steps in the **Sync Serverless Application** dialog, then choose **Sync** to begin the AWS SAM Sync process. To learn more about the dialog, see the [Sync Serverless Application Dialog](#sam-sync-serverless-app-dialog) section below.

1. During the sync process, the AWS Toolkit for JetBrains **Run Window** is updated with the deployment status.

1. Following a successful sync, the name of your CloudFormation stack is added to the **AWS Explorer**. 

   If the sync fails, troubleshooting details can be found in the JetBrains **Run Window** or the CloudFormation **event logs**. To learn more about viewing CloudFormation event logs, see the [Viewing event logs for a stack](https://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/key-tasks.html#key-tasks-cloudformation-logs) topic in this User Guide.

## Sync Serverless Application Dialog
<a name="sam-sync-serverless-app-dialog"></a>

The **Sync Serverless Application** dialog assists you with the AWS SAM sync process. The following sections describe each of the dialog's components.

### Create Stack or Update Stack
<a name="w2aac17c37c11c13b5"></a>

**Required:** To create a new deployment stack, enter a name in the provided field to create and set the CloudFormation stack for your serverless application deployment. 

Alternatively, to deploy to an existing CloudFormation stack, select the stack name from the auto-populated list of stacks associated with your AWS account.

### Template Parameters
<a name="w2aac17c37c11c13b7"></a>

**Optional:** Populates with a list of parameters detected in your project's `template.yaml` file. To specify parameter values, enter each value into the text field in the **Value** column.

### S3 Bucket
<a name="w2aac17c37c11c13b9"></a>

**Required:** To choose an existing Amazon Simple Storage Service (Amazon S3) bucket for storing your CloudFormation template, select it from the list.

To create and use a new Amazon S3 bucket for storage, choose **Create** and follow the prompts.

### ECR Repository
<a name="w2aac17c37c11c13c11"></a>

**Required, only visible when working with an Image package type:** Choose an existing Amazon Elastic Container Registry (Amazon ECR) repository URI for deployment of your serverless application.

For information about AWS Lambda package types, see the [Lambda deployment packages](https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-package.html) section in the *AWS Lambda Developer Guide.*

### CloudFormation Capabilities
<a name="w2aac17c37c11c13c13"></a>

**Required:** Choose the capabilities that CloudFormation is allowed to use when creating stacks.

### Tags
<a name="w2aac17c37c11c13c15"></a>

**Optional:** Enter your preferred tags in the provided text fields to add tags to your deployment.

### Build Function Inside a Container
<a name="w2aac17c37c11c13c17"></a>

**Optional, Docker required:** Selecting this option builds your serverless application's functions inside a local Docker container prior to deployment. This option is useful if a function depends on packages with natively compiled dependencies or programs.

For more information, see the [Building applications](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-using-build.html) topic in the *AWS Serverless Application Model Developer Guide*.

# Changing (updating) AWS Serverless application settings by using the AWS Toolkit for JetBrains
<a name="sam-update"></a>

You must first deploy the AWS serverless application that you want to change, if you haven't deployed it already.
**Note**  
To deploy a serverless application that contains an AWS Lambda function, and deploy that function with any nondefault or optional properties, you must first set those properties in the function's corresponding AWS SAM template file (for example, in a file named `template.yaml` within the project). For a list of available properties, see [AWS::Serverless::Function](https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction) in the [awslabs/serverless-application-model](https://github.com/awslabs/serverless-application-model/) repository on GitHub.

1. With the **Project** tool window already open and displaying the project that contains the serverless application's files, open the project's `template.yaml` file. Change the file's contents to reflect the new settings, and then save and close the file.

1. If you need to switch to a different AWS Region to deploy the serverless application to, do that now.

1. Right-click the project's `template.yaml` file, and then choose **Deploy Serverless Application**.  
![\[Choosing the Deploy Serverless Application command\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/deploy-serverless-application.png)

1. Complete the [Deploy Serverless Application](deploy-serverless-application-dialog.md) dialog box, and then choose **Deploy**. The AWS Toolkit for JetBrains updates the corresponding AWS CloudFormation stack for the deployment. 

   If the deployment fails, you can try to determine why by viewing event logs for the stack.

# Deleting an AWS serverless application by using the AWS Toolkit for JetBrains
<a name="sam-delete"></a>

You can only delete an AWS serverless application that has already been deployed.

1. Open AWS Explorer, if it isn't already open. If you need to switch to a different AWS Region that contains the serverless application, do that now.

1. Expand **CloudFormation**.

1. Right-click the name of the AWS CloudFormation stack that contains the serverless application you want to delete, and then choose **Delete CloudFormation Stack**.  
![\[Choosing to delete the AWS CloudFormation stack for an AWS serverless application starting from AWS Explorer\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/sam-delete.png)

1. Enter the stack's name to confirm the deletion, and then choose **OK**. If the stack deletion succeeds, the AWS Toolkit for JetBrains removes the stack name from the **CloudFormation** list in **AWS Explorer**. If the stack deletion fails, you can try to determine why by viewing event logs for the stack.
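Under the hood, deleting the application amounts to deleting its AWS CloudFormation stack. A hedged Boto3 sketch, with an illustrative client-side name check (function names are assumptions):

```python
import re

def valid_stack_name(name):
    """Sanity check: stack names start with a letter and use letters,
    digits, and hyphens (up to 128 characters)."""
    return bool(re.match(r"^[A-Za-z][A-Za-z0-9-]{0,127}$", name))

def delete_stack(stack_name):
    """Delete the application's CloudFormation stack, then block until
    CloudFormation reports the deletion as complete."""
    import boto3
    cfn = boto3.client("cloudformation")
    cfn.delete_stack(StackName=stack_name)
    cfn.get_waiter("stack_delete_complete").wait(StackName=stack_name)
```

If the waiter raises an error, the stack's event log is the place to look for the cause, just as with the toolkit's deletion flow.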

# Working with Amazon Simple Queue Service from the AWS Toolkit for JetBrains
<a name="sqs"></a>

The following topics describe how to work with Amazon Simple Queue Service (Amazon SQS) from the AWS Toolkit for JetBrains.

**Topics**
+ [Working with Amazon Simple Queue Service queues](sqs-queues.md)
+ [Using Amazon SQS with AWS Lambda in the AWS Toolkit for JetBrains](sqs-lambda.md)
+ [Using Amazon SQS with Amazon SNS in the AWS Toolkit for JetBrains](sqs-sns.md)

# Working with Amazon Simple Queue Service queues
<a name="sqs-queues"></a>

The following topics describe how to use the AWS Toolkit for JetBrains to work with Amazon Simple Queue Service queues and messages.

Standard and FIFO (First-In-First-Out) are the two kinds of messages that you can send using Amazon SQS in the AWS Toolkit for JetBrains. 

**To create an Amazon SQS queue**

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, open the context menu for (right-click) the **Amazon SQS** service, and choose **Create Queue...**.

1. Provide a queue name and choose the queue type. 
**Note**  
For more information on queue types, see the [Amazon SQS standard queues](https://docs.aws.amazon.com//AWSSimpleQueueService/latest/SQSDeveloperGuide/standard-queues.html) and [Amazon SQS FIFO (First-In-First-Out) queues](https://docs.aws.amazon.com//AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html) topics in the *Amazon Simple Queue Service Developer Guide*. 

1. Choose **Create**.
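Queue creation maps to a single SQS API call, sketched here with Boto3; note that FIFO queue names require the `.fifo` suffix. The helper names are assumptions.

```python
def fifo_name(name):
    """FIFO queue names must end with the '.fifo' suffix."""
    return name if name.endswith(".fifo") else name + ".fifo"

def create_queue(name, fifo=False):
    """Create a Standard or FIFO queue and return its URL."""
    import boto3
    sqs = boto3.client("sqs")
    if fifo:
        return sqs.create_queue(QueueName=fifo_name(name),
                                Attributes={"FifoQueue": "true"})["QueueUrl"]
    return sqs.create_queue(QueueName=name)["QueueUrl"]
```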

**To view Amazon SQS messages**

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, expand the **Amazon SQS** service to view a list of your existing queues.

1. Right-click the queue that you want to view, and then choose **View Messages** to display the messages in that queue.

**To edit Amazon SQS queue properties**

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, expand the **Amazon SQS** service to view a list of your existing queues.

1. Right-click the queue that you want to edit and choose **Edit Queue Properties...**.

1. In the **Edit Queue Properties** dialog box that opens, review and modify your queue properties. For more information on Amazon SQS properties, see [Configuring queue parameters (console)](https://docs.aws.amazon.com//AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-configure-queue-parameters.html) in the *Amazon Simple Queue Service Developer Guide*.

**To send Standard messages**

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, expand the **Amazon SQS** service to view a list of your existing queues.

1. Right-click the queue for sending your message and choose **Send a message**.

1. Populate the message and choose **Send**. After you send the message, you see a confirmation that includes the message ID.

**To send FIFO messages**

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, expand the **Amazon SQS** service to view a list of your existing queues.

1. Right-click the queue for sending your message and choose **Send a message**.

1. Populate the message, the group ID, and an optional deduplication ID.
**Note**  
If no deduplication ID is provided, one is generated for you.

1. Choose **Send**. After you send the message, you see a confirmation that includes the message ID.
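The FIFO send step, including a generated deduplication ID when none is supplied, can be sketched with Boto3. Deriving the ID from a SHA-256 hash of the body is an illustrative choice (similar in spirit to SQS content-based deduplication), not necessarily what the toolkit does.

```python
import hashlib

def content_dedup_id(body):
    """Derive a deduplication ID from the message body."""
    return hashlib.sha256(body.encode()).hexdigest()

def send_fifo_message(queue_url, body, group_id, dedup_id=None):
    """Send one message to a FIFO queue and return its message ID."""
    import boto3
    return boto3.client("sqs").send_message(
        QueueUrl=queue_url,
        MessageBody=body,
        MessageGroupId=group_id,
        MessageDeduplicationId=dedup_id or content_dedup_id(body))["MessageId"]
```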

**To delete an Amazon SQS queue**

1. Verify that a queue is empty before you delete it. For more information see [Confirming that a queue is empty](https://docs.aws.amazon.com//AWSSimpleQueueService/latest/SQSDeveloperGuide/confirm-queue-is-empty.html) in the *Amazon Simple Queue Service Developer Guide*.

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, expand the **Amazon SQS** service to view a list of your existing queues.

1. Right-click the queue that you want to delete, and choose **Delete Queue...**.

1. Confirm that you want to delete the queue, and choose **OK** in the deletion dialog box.
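The verify-empty-then-delete flow can be approximated with Boto3 by reading the queue's approximate message counters before deleting; helper and function names are assumptions.

```python
def queue_is_empty(attrs):
    """All approximate message counters must read zero."""
    return all(int(v) == 0 for v in attrs.values())

def delete_queue_if_empty(queue_url):
    """Delete the queue only when its approximate counts are zero."""
    import boto3
    sqs = boto3.client("sqs")
    counters = ["ApproximateNumberOfMessages",
                "ApproximateNumberOfMessagesNotVisible",
                "ApproximateNumberOfMessagesDelayed"]
    attrs = sqs.get_queue_attributes(QueueUrl=queue_url,
                                     AttributeNames=counters)["Attributes"]
    if not queue_is_empty(attrs):
        raise RuntimeError(f"{queue_url} still has messages")
    sqs.delete_queue(QueueUrl=queue_url)
```

Because the counters are approximate, a queue that was very recently drained may still report nonzero counts for a short time.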

# Using Amazon SQS with AWS Lambda in the AWS Toolkit for JetBrains
<a name="sqs-lambda"></a>

The following procedure details how to configure Amazon SQS queues as Lambda triggers in the AWS Toolkit for JetBrains. 

**To configure an Amazon SQS queue as a Lambda trigger**

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, expand the **Amazon SQS** service to view a list of your existing queues.

1. Right-click the queue you want to work with and choose **Configure Lambda Trigger**.

1. In the dialog box, from the drop-down menu, choose the Lambda function that you want to trigger.

1. Choose **Configure**.

1. If the Lambda function's execution role lacks the IAM permissions that it needs to poll the Amazon SQS queue, the toolkit generates a minimal policy that you can add to that role. 

   Choose **Add Policy**.

After you configure your queue, you get a status message about the applied changes, including any applicable error messages.
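The minimal policy for an SQS trigger grants the actions that Lambda event source mappings use to poll a queue. The sketch below builds such a policy document, assuming the standard SQS polling actions; the exact policy that the toolkit emits may differ:

```python
import json

def minimal_sqs_policy(queue_arn: str) -> str:
    """Return a JSON IAM policy granting the permissions a Lambda
    function needs to poll an SQS queue as an event source."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": [
                "sqs:ReceiveMessage",
                "sqs:DeleteMessage",
                "sqs:GetQueueAttributes",
            ],
            "Resource": queue_arn,
        }],
    }
    return json.dumps(policy, indent=2)
```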

# Using Amazon SQS with Amazon SNS in the AWS Toolkit for JetBrains
<a name="sqs-sns"></a>

The following procedure details how to subscribe Standard Amazon SQS queues to Amazon SNS topics using the AWS Toolkit for JetBrains. 

**Note**  
You can't subscribe FIFO Amazon SQS queues to Amazon SNS topics.

**To subscribe a Standard Amazon SQS queue to an Amazon SNS topic**

1. From the AWS Toolkit for JetBrains, expand the AWS Explorer to view your AWS services.

1. From the AWS Explorer, expand the **Amazon SQS** service to view a list of your existing queues.

1. Right-click the queue you want to work with and choose **Subscribe to SNS topic...**.

1. In the dialog box, from the drop-down menu, choose an Amazon SNS topic, and then choose **Subscribe**.
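A subscription like this involves not only the SNS subscription itself but also a queue access policy that lets the topic send messages to the queue. A hedged boto3-oriented sketch; the helper name and ARNs are illustrative:

```python
import json

def sns_to_sqs_policy(queue_arn: str, topic_arn: str) -> str:
    """Return a JSON queue policy allowing one SNS topic to send
    messages to the queue."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    }
    return json.dumps(policy, indent=2)

# Usage with boto3 (ARNs and URL are placeholders):
#   sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)
#   sqs.set_queue_attributes(
#       QueueUrl=queue_url,
#       Attributes={"Policy": sns_to_sqs_policy(queue_arn, topic_arn)})
```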

# Working with resources
<a name="more-resources"></a>

In addition to accessing AWS services that are listed by default in the AWS Explorer, you can also go to **Resources** and choose from hundreds of resources to add to the interface. In AWS, a **resource** is an entity you can work with. Some of the resources that can be added include Amazon AppFlow, Amazon Kinesis Data Streams, AWS IAM roles, Amazon VPC, and Amazon CloudFront distributions.

After making your selection, you can go to **Resources** and expand the resource type to list the available resources for that type. For example, if you select the `AWS::Lambda::Function` resource type, you can access the resources that define different functions, their properties, and their attributes.

After adding a resource type to **Resources**, you can interact with it and its resources in the following ways:
+ View a list of existing resources that are available in the current AWS Region for this resource type.
+ View a read-only version of the JSON file that describes a resource.
+ Copy the resource identifier for the resource.
+ View the AWS documentation that explains the purpose of the resource type and the schema (in JSON and YAML formats) for modeling a resource. 
+ Create a new resource by editing and saving a JSON-formatted template that conforms to a schema.
+ Update or delete an existing resource.

**Important**  
<a name="experimental-feature-warning"></a>In the current release of the AWS Toolkit for JetBrains, the option to create, update, and delete resources is an *experimental feature*. Because experimental features continue to be tested and updated, they may have usability issues. Experimental features may also be removed from the AWS Toolkit for JetBrains without notice.  
To enable experimental features for resources, open the **Settings** pane in your JetBrains IDE, expand **Tools**, and then choose **AWS**, **Experimental Features**. Select **JSON Resource Modification** to enable creating, updating, and deleting resources.  
  
For more information, see [Working with experimental features](experimental-features.md).

## IAM permissions for accessing resources
<a name="cloud-api-permissions"></a>

You require specific AWS Identity and Access Management (IAM) permissions to access the resources associated with AWS services. For example, an IAM entity, such as a user or a role, requires Lambda permissions to access `AWS::Lambda::Function` resources. 

In addition to permissions for service resources, an IAM entity requires permissions to permit the AWS Toolkit for JetBrains to call AWS Cloud Control API operations on its behalf. Cloud Control API operations allow the IAM user or role to access and update the remote resources.

The easiest way to grant permissions is to attach the AWS managed policy, **PowerUserAccess**, to the IAM entity that's calling these API operations using the Toolkit interface. This [managed policy](https://docs.aws.amazon.com//IAM/latest/UserGuide/access_policies_job-functions.html#jf_developer-power-user) grants a range of permissions for performing application development tasks, including calling API operations. 

For specific permissions that define allowable API operations on remote resources, see the [AWS Cloud Control API User Guide](https://docs.aws.amazon.com//cloudcontrolapi/latest/userguide/security.html).

## Adding and interacting with existing resources
<a name="configure-resources"></a>

1. In the **AWS Explorer**, right-click **Resources** and choose **Add or remove resources**.

   **Additional Explorer Resources** in the **Settings** pane displays a list of resource types that are available for selection.
**Note**  
You can also display the list of resource types by double-clicking the **Add or remove resources** node, which is under **Resources**.   
![\[Selecting resources to configure.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/add-resources-renamed.png)

1. In **Additional Explorer Resources**, select the resource types to add to the **AWS Explorer**, and then press **Return** or choose **OK** to confirm.

   The resource types that you selected are listed under **Resources**.
**Note**  
If you've already added a resource type to the **AWS Explorer** and then clear the checkbox for that type, it's no longer listed under **Resources** after you choose **OK**. Only those resource types that are currently selected are visible in the **AWS Explorer**.

1. To view the resources that already exist for a resource type, expand the entry for that type.

   A list of available resources is displayed under their resource type.

1. To interact with a specific resource, right-click its name and choose one of the following options:
   + **View resource**: View a read-only version of the JSON-formatted template that describes the resource.

     After the template is displayed, you can change it by choosing **Edit** if you have the required [experimental feature](#experimental-feature-warning) enabled.
**Note**  
You can also view the resource by double-clicking it.
   + **Copy identifier**: Copy the identifier for the specific resource to the clipboard. (For example, the `AWS::DynamoDB::Table` resource can be identified using the `TableName` property.) 
   + **Update resource**: Edit the JSON-formatted template for the resource in a JetBrains editor. For more information, see [Creating and updating resources](#create-resources). 
   + **Delete resource**: Delete the resource by confirming the deletion in a dialog box that is displayed. (Deleting resources is currently an [experimental feature](#experimental-feature-warning) in this version of the AWS Toolkit for JetBrains.)
**Warning**  
If you delete a resource, any AWS CloudFormation stack that uses that resource will fail to update. To fix this update failure, you need to either recreate the resource or remove the reference to it in the stack's CloudFormation template. For more information, see this [Knowledge Center article](https://aws.amazon.com/premiumsupport/knowledge-center/failing-stack-updates-deleted/).  
![\[Menu options for a selected resource.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/resource-menu-options-renamed.png)

## Creating and updating resources
<a name="create-resources"></a>

**Important**  
The creation and updating of resources is currently an [experimental feature](#experimental-feature-warning) in this version of the AWS Toolkit for JetBrains.

Creating a new resource involves adding a resource type to the **Resources** list and then editing a JSON-formatted template that defines the resource, its properties, and its attributes.

For example, a resource that belongs to the `AWS::SageMaker::UserProfile` resource type is defined with a template that creates a user profile for Amazon SageMaker AI Studio. The template that defines this user profile resource must conform to the resource type schema for `AWS::SageMaker::UserProfile`. If the template doesn't comply with the schema because of missing or incorrect properties, for example, the resource can't be created or updated. 
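Under the hood, resource creation and updates of this kind go through the AWS Cloud Control API, whose `create_resource` operation accepts the resource's properties as a JSON `DesiredState` string. A rough boto3 sketch, with hypothetical property values:

```python
import json

def desired_state(properties: dict) -> str:
    """Serialize resource properties into the JSON DesiredState string
    that cloudcontrol.create_resource expects."""
    return json.dumps(properties)

# Usage with boto3 (the domain ID and profile name are placeholders):
#   cc = boto3.client("cloudcontrol")
#   cc.create_resource(
#       TypeName="AWS::SageMaker::UserProfile",
#       DesiredState=desired_state({
#           "DomainId": "d-example123",
#           "UserProfileName": "example-user",
#       }))
```

If the properties don't satisfy the resource type's schema, the Cloud Control API rejects the request, which is the same validation failure the toolkit surfaces in its editor.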

1. Add the resource type for the resource you want to create by right-clicking **Resources** and choosing **Add or remove resources**.

1. After the resource type is added under **Resources**, right-click its name and choose **Create resource**. You can also access information about how to model the resource by choosing **View documentation**.  
![\[Menu options for a selected resource type.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/resource-new.png)

1. In the editor, start to define the properties that make up the resource template. The autocomplete feature suggests property names that conform to your template's schema. When your template fully conforms to JSON syntax, the error count is replaced by a green checkmark. For detailed information about the schema, choose **View documentation**.
**Note**  
As well as conforming to basic JSON syntax, your template must conform to the schema that models the resource type. Your template is validated against the schema model when you try to create or update the remote resource.  
![\[Editor displaying the template that describes a resource type.\]](http://docs.aws.amazon.com/toolkit-for-jetbrains/latest/userguide/images/resource-template.png)

1. After you finish declaring your resource, choose **Create** to validate your template and save the resource to the remote AWS Cloud. (Choose **Update** if you're modifying an existing resource.)

   If your template defines the resource in accordance with its schema, a message displays to confirm that the resource was created. (If the resource already exists, the message confirms that the resource was updated.)

   After the resource is created, it's added to the list under the resource type heading.

1. If your file contains errors, a message displays to explain that the resource couldn't be created or updated. Open the **Event Log** to identify the template elements that you need to fix.