

# CodePipeline pipeline structure reference

You can use CodePipeline to structure a CI/CD pipeline of automated steps that build, test, and deploy your application source code. This reference section provides details about the JSON structure and parameters in your pipeline. For a high-level list of concepts that describe how pipelines are used, see [CodePipeline concepts](concepts.md).

 
+ When you create a pipeline, you choose an available source action and provider, such as an S3 bucket or a CodeCommit, Bitbucket, or GitHub repository, that contains your source code and starts your pipeline when you commit a source code change. This section provides reference information about the available sources for your pipeline. For more information about how to work with source actions, see [Start a pipeline in CodePipeline](pipelines-about-starting.md).
+ You can choose the test, build, and deploy actions and providers that you want to include automatically when your pipeline runs. This section provides reference information about the available actions and how they fit into your pipeline JSON.
+ Your finished pipeline will consist of a source stage along with additional stages where you configure actions to deploy and test your application. For a conceptual example of a DevOps pipeline that deploys your application, see [DevOps pipeline example](concepts-devops-example.md).

By default, any pipeline you successfully create in AWS CodePipeline has a valid structure. However, if you manually create or edit a JSON file to create a pipeline or update a pipeline from the AWS CLI, you might inadvertently create a structure that is not valid. The following reference can help you better understand the requirements for your pipeline structure and how to troubleshoot issues. See the constraints in [Quotas in AWS CodePipeline](limits.md), which apply to all pipelines.
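
When you edit a pipeline by round-tripping it through the AWS CLI, note that `get-pipeline` returns the editable structure together with a read-only `metadata` block, while `update-pipeline` accepts only the structure. The following sketch shows that cleanup step (the sample output is hypothetical and trimmed to a few fields):

```python
import json

def pipeline_for_update(get_pipeline_output):
    """Drop the read-only metadata block returned by get-pipeline so the
    remaining document can be passed back to update-pipeline."""
    doc = dict(get_pipeline_output)
    doc.pop("metadata", None)  # pipelineArn, created, updated are not editable
    return doc

# Hypothetical, heavily trimmed get-pipeline output.
fetched = {
    "pipeline": {"name": "MyPipeline", "version": 6},
    "metadata": {"pipelineArn": "arn:aws:codepipeline:us-west-2:111111111111:MyPipeline"},
}
print(json.dumps(pipeline_for_update(fetched)))
```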

The following sections list the high-level parameters and their position in the pipeline structure. Pipeline structure requirements are detailed in each section for the following pipeline component types:
+ Field reference for the [Pipeline declaration](pipeline-requirements.md)
+ Field reference for the [Stage declaration](stage-requirements.md)
+ Field reference for the [Action declaration](action-requirements.md)
+ List of [Valid action providers in CodePipeline](actions-valid-providers.md) by action type
+ Reference for [Valid settings for the `PollForSourceChanges` parameter](PollForSourceChanges-defaults.md)
+ Reference for [Valid input and output artifacts for each action type](reference-action-artifacts.md)
+ List of links for [Valid configuration parameters for each provider type](structure-configuration-examples.md)

For more information, see the [PipelineDeclaration](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PipelineDeclaration.html) object in the *CodePipeline API Guide*.

The following example pipeline console view shows the pipeline named `new-github`, stages named `Source`, `manual`, and `Build`, and actions from the GitHub (via GitHub App), manual approval, and CodeBuild action providers.

![\[An example of the pipeline view in the CodePipeline console.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/pipeline-console-view.png)


The pipeline editing mode, when viewed in the console diagram, allows you to edit source overrides, triggers, and actions as shown in the following example.

![\[An example of the pipeline editing mode in the CodePipeline console.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/pipeline-console-view-edit.png)


**Topics**
+ [Pipeline declaration](pipeline-requirements.md)
+ [Stage declaration](stage-requirements.md)
+ [Action declaration](action-requirements.md)
+ [Valid action providers in CodePipeline](actions-valid-providers.md)
+ [Valid settings for the `PollForSourceChanges` parameter](PollForSourceChanges-defaults.md)
+ [Valid input and output artifacts for each action type](reference-action-artifacts.md)
+ [Valid configuration parameters for each provider type](structure-configuration-examples.md)

# Pipeline declaration


The pipeline and metadata levels of a pipeline have a basic structure that includes the following parameters and syntax. The pipeline parameter represents the structure of actions and stages to be performed in the pipeline.

For more information, see the [PipelineDeclaration](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PipelineDeclaration.html) object in the *CodePipeline API Guide*.

The following example shows the pipeline and metadata level of the pipeline structure in both JSON and YAML for a V2 type pipeline.

------
#### [ YAML ]

```
pipeline:
  name: MyPipeline
  roleArn: >-
    arn:aws:iam::ACCOUNT_ID:role/service-role/AWSCodePipelineServiceRole-us-west-2-MyPipeline
  artifactStore:
    type: S3
    location: amzn-s3-demo-bucket
  stages:
    ...
  version: 6
  executionMode: SUPERSEDED
  pipelineType: V2
  variables:
  - name: MyVariable
    defaultValue: '1'
  triggers:
  - providerType: CodeStarSourceConnection
    gitConfiguration:
      sourceActionName: Source
      push:
      - branches:
          includes:
          - main
          excludes:
          - feature-branch
      pullRequest:
      - events:
        - CLOSED
        branches:
          includes:
          - main*
metadata:
  pipelineArn: 'arn:aws:codepipeline:us-west-2:ACCOUNT_ID:MyPipeline'
  created: '2019-12-12T06:49:02.733000+00:00'
  updated: '2020-09-10T06:34:07.447000+00:00'
  pollingDisabledAt: '2020-09-10T06:34:07.447000+00:00'
```

------
#### [ JSON ]

```
{
    "pipeline": {
        "name": "MyPipeline",
        "roleArn": "arn:aws:iam::ACCOUNT_ID:role/service-role/AWSCodePipelineServiceRole-us-west-2-MyPipeline",
        "artifactStore": {
            "type": "S3",
            "location": "amzn-s3-demo-bucket"
        },
        "stages": [
            ...
        ],
        "version": 6,
        "executionMode": "SUPERSEDED",
        "pipelineType": "V2",
        "variables": [
            {
                "name": "MyVariable",
                "defaultValue": "1"
            }
        ],
        "triggers": [
            {
                "providerType": "CodeStarSourceConnection",
                "gitConfiguration": {
                    "sourceActionName": "Source",
                    "push": [
                        {
                            "branches": {
                                "includes": [
                                    "main"
                                ],
                                "excludes": [
                                    "feature-branch"
                                ]
                            }
                        }
                    ],
                    "pullRequest": [
                        {
                            "events": [
                                "CLOSED"
                            ],
                            "branches": {
                                "includes": [
                                    "main*"
                                ]
                            }
                        }
                    ]
                }
            }
        ]
    },
    "metadata": {
        "pipelineArn": "arn:aws:codepipeline:us-west-2:ACCOUNT_ID:MyPipeline",
        "created": "2019-12-12T06:49:02.733000+00:00",
        "updated": "2020-09-10T06:34:07.447000+00:00",
        "pollingDisabledAt": "2020-09-10T06:34:07.447000+00:00"
    }
}
```

------

## `name`


The name of the pipeline. When you edit or update a pipeline, the pipeline name cannot be changed.

**Note**  
If you want to rename an existing pipeline, you can use the CLI `get-pipeline` command to build a JSON file that contains your pipeline's structure. You can then use the CLI `create-pipeline` command to create a pipeline with that structure and give it a new name.
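
The rename flow can be sketched in Python (the names are placeholders; the actual fetch and create steps would use the `get-pipeline` and `create-pipeline` CLI commands or their SDK equivalents):

```python
import copy

def clone_structure_with_new_name(get_pipeline_doc, new_name):
    """Build a create-pipeline input from a get-pipeline result: remove the
    read-only metadata block and give the pipeline structure a new name."""
    doc = copy.deepcopy(get_pipeline_doc)
    doc.pop("metadata", None)  # create-pipeline does not accept metadata
    doc["pipeline"]["name"] = new_name
    return doc

# Hypothetical get-pipeline result, trimmed for illustration.
old = {
    "pipeline": {"name": "MyPipeline", "roleArn": "arn:aws:iam::111111111111:role/service-role/MyRole"},
    "metadata": {"created": "2019-12-12T06:49:02.733000+00:00"},
}
renamed = clone_structure_with_new_name(old, "MyRenamedPipeline")
```

The original pipeline is left untouched; you would delete it separately after the new pipeline is created.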

## `roleArn`


The IAM ARN for the CodePipeline service role, such as arn:aws:iam::80398EXAMPLE:role/CodePipeline_Service_Role.

To use the console to view the pipeline service role ARN instead of the JSON structure, choose your pipeline in the console, and then choose **Settings**. Under the **General** tab, the **Service role ARN** field displays.

## `artifactStore` OR `artifactStores`


The `artifactStore` field contains the artifact bucket type and location for a pipeline with all actions in the same AWS Region. If you add actions in a Region different from your pipeline, the `artifactStores` mapping is used to list the artifact bucket for each AWS Region where actions are executed. When you create or edit a pipeline, you must have an artifact bucket in the pipeline Region and then you must have one artifact bucket per Region where you plan to execute an action. 

**Note**  
In the pipeline structure, you must include either `artifactStore` or `artifactStores` in your pipeline, but you cannot use both. If you create a cross-region action in your pipeline, you must use `artifactStores`.

The following example shows the basic structure for a pipeline with cross-Region actions that uses the `artifactStores` parameter: 

```
    "pipeline": {
        "name": "YourPipelineName",
        "roleArn": "CodePipeline_Service_Role",
        "artifactStores": {
            "us-east-1": {
                "type": "S3",
                "location": "S3 artifact bucket name, such as amzn-s3-demo-bucket"
            },
            "us-west-2": {
                "type": "S3",
                "location": "S3 artifact bucket name, such as amzn-s3-demo-bucket"
            }
        },
        "stages": [
            {

...
```

### `type`


The location type for the artifact bucket, specified as Amazon S3.

### `location`


The name of the Amazon S3 bucket automatically generated for you the first time you create a pipeline using the console, such as codepipeline-us-east-2-1234567890, or any Amazon S3 bucket you provision for this purpose.

## `stages`


This parameter contains the name of each stage in the pipeline. For more information about the parameters and syntax at the stage level of the pipeline structure, see the [StageDeclaration](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_StageDeclaration.html) object in the *CodePipeline API Guide*.

The pipeline structure for stages has the following requirements:
+ A pipeline must contain at least two stages.
+ The first stage of a pipeline must contain at least one source action. It can contain source actions only.
+ Only the first stage of a pipeline can contain source actions.
+ At least one stage in each pipeline must contain an action that is not a source action.
+ All stage names in a pipeline must be unique.
+ Stage names cannot be edited in the CodePipeline console. If you edit a stage name by using the AWS CLI, and the stage contains an action with one or more secret parameters (such as an OAuth token), the value of those secret parameters is not preserved. You must manually enter the value of the parameters (which are masked by four asterisks in the JSON returned by the AWS CLI) and include them in the JSON structure.
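
The structural requirements above can be expressed as a small validation sketch. This is a rough approximation for illustration only; the service's actual validation is more extensive:

```python
def validate_stages(stages):
    """Check the stage-level rules listed above against a stages list."""
    errors = []
    if len(stages) < 2:
        errors.append("a pipeline must contain at least two stages")
    names = [s["name"] for s in stages]
    if len(names) != len(set(names)):
        errors.append("all stage names in a pipeline must be unique")
    for i, stage in enumerate(stages):
        categories = {a["actionTypeId"]["category"] for a in stage.get("actions", [])}
        if i == 0 and categories != {"Source"}:
            errors.append("the first stage must contain at least one source action, and source actions only")
        if i > 0 and "Source" in categories:
            errors.append("only the first stage of a pipeline can contain source actions")
    if not any(
        a["actionTypeId"]["category"] != "Source"
        for s in stages
        for a in s.get("actions", [])
    ):
        errors.append("at least one stage must contain a non-source action")
    return errors

stages = [
    {"name": "Source", "actions": [{"actionTypeId": {"category": "Source"}}]},
    {"name": "Build", "actions": [{"actionTypeId": {"category": "Build"}}]},
]
print(validate_stages(stages))  # an empty list: this structure satisfies the rules
```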

**Important**  
Pipelines that are inactive for longer than 30 days will have polling disabled for the pipeline. For more information, see [pollingDisabledAt](#metadata.pollingDisabledAt) in the pipeline structure reference. For the steps to migrate your pipeline from polling to event-based change detection, see [Change Detection Methods](change-detection-methods.md).

## `version`


The version number of a pipeline is automatically generated and updated every time you update the pipeline.

## `executionMode`


You can set the pipeline execution mode so that you can specify the pipeline behavior for consecutive runs, such as queueing, superseding, or running in parallel mode. For more information, see [Set or change the pipeline execution mode](execution-modes.md).
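
For example, the field sits at the pipeline level of the structure; `QUEUED` below is one of the valid values (`SUPERSEDED`, `QUEUED`, or `PARALLEL`):

```
{
    "pipeline": {
        "name": "MyPipeline",
        "pipelineType": "V2",
        "executionMode": "QUEUED",
        ...
    }
}
```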

**Important**  
For pipelines in PARALLEL mode, stage rollback is not available. Similarly, failure conditions with a rollback result type cannot be added to a PARALLEL mode pipeline.

## `pipelineType`


The pipeline type specifies the available structure and features in the pipeline, such as for a V2 type pipeline. For more information, see [Pipeline types](pipeline-types.md).

## `variables`


Variables at the pipeline level are defined when the pipeline is created and resolved at pipeline run time. For more information, see [Variables reference](reference-variables.md). For a tutorial with a pipeline-level variable that is passed at the time of the pipeline execution, see [Tutorial: Use pipeline-level variables](tutorials-pipeline-variables.md).
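
Pipeline-level variables are declared in the `variables` list shown earlier and referenced elsewhere in the structure through the `variables` namespace. The following fragment is a sketch; the CodeBuild action and the `Release` environment variable name are hypothetical:

```
"variables": [
    {
        "name": "MyVariable",
        "defaultValue": "1"
    }
],
...
"configuration": {
    "ProjectName": "my-build-project",
    "EnvironmentVariables": "[{\"name\":\"Release\",\"value\":\"#{variables.MyVariable}\",\"type\":\"PLAINTEXT\"}]"
}
```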

## `triggers`


Triggers allow you to configure your pipeline to start on a particular event type or filtered event type, such as when a change on a particular branch or pull request is detected. Triggers are configurable for source actions with connections that use the `CodeStarSourceConnection` action in CodePipeline, such as GitHub, Bitbucket, and GitLab. For more information about source actions that use connections, see [Add third-party source providers to pipelines using CodeConnections](pipelines-connections.md).

For more information and more detailed examples, see [Automate starting pipelines using triggers and filtering](pipelines-triggers.md).

For filtering, regular expression patterns in glob format are supported as detailed in [Working with glob patterns in syntax](syntax-glob.md).

**Note**  
The CodeCommit and S3 source actions require either a configured change detection resource (an EventBridge rule) or use the option to poll the repository for source changes. For pipelines with a Bitbucket, GitHub, or GitHub Enterprise Server source action, you do not have to set up a webhook or default to polling. The connections action manages change detection for you. 

**Important**  
Pipelines that are inactive for longer than 30 days will have polling disabled for the pipeline. For more information, see [pollingDisabledAt](#metadata.pollingDisabledAt) in the pipeline structure reference. For the steps to migrate your pipeline from polling to event-based change detection, see [Change Detection Methods](change-detection-methods.md).

### `gitConfiguration` fields


The Git configuration for the trigger, including the event types and any parameters for filtering by branches, file paths, tags, or pull request events. 

The fields in the JSON structure are defined as follows:
+ `sourceActionName`: The name of the pipeline source action with the Git configuration.
+ `push`: Push events with filtering. These events use an OR operation between different push filters and an AND operation inside filters.
  + `branches`: The branches to filter on. Branches use an AND operation between includes and excludes. 
    + `includes`: Patterns to filter on for branches that will be included. Includes use an OR operation.
    + `excludes`: Patterns to filter on for branches that will be excluded. Excludes use an OR operation.
  + `filePaths`: The file path names to filter on. 
    + `includes`: Patterns to filter on for file paths that will be included. Includes use an OR operation.
    + `excludes`: Patterns to filter on for file paths that will be excluded. Excludes use an OR operation.
  + `tags`: The tag names to filter on.
    + `includes`: Patterns to filter on for tags that will be included. Includes use an OR operation.
    + `excludes`: Patterns to filter on for tags that will be excluded. Excludes use an OR operation.
+ `pullRequest`: Pull request events with filtering on pull request events and pull request filters.
  + `events`: Filters on open, updated, or closed pull request events as specified.
  + `branches`: The branches to filter on. Branches use an AND operation between includes and excludes. 
    + `includes`: Patterns to filter on for branches that will be included. Includes use an OR operation.
    + `excludes`: Patterns to filter on for branches that will be excluded. Excludes use an OR operation.
  + `filePaths`: The file path names to filter on. 
    + `includes`: Patterns to filter on for file paths that will be included. Includes use an OR operation.
    + `excludes`: Patterns to filter on for file paths that will be excluded. Excludes use an OR operation.

The following is an example of the trigger configuration for push and pull request event types.

```
"triggers": [
            {
                "providerType": "CodeStarSourceConnection",
                "gitConfiguration": {
                    "sourceActionName": "ApplicationSource",
                    "push": [
                        {
                            "filePaths": {
                                "includes": [
                                    "projectA/**",
                                    "common/**/*.js"
                                ],
                                "excludes": [
                                    "**/README.md",
                                    "**/LICENSE",
                                    "**/CONTRIBUTING.md"
                                ]
                            },
                            "branches": {
                                "includes": [
                                    "feature/**",
                                    "release/**"
                                ],
                                "excludes": [
                                    "mainline"
                                ]
                            },
                            "tags": {
                                "includes": [
                                    "release-v0", "release-v1"
                                ],
                                "excludes": [
                                    "release-v2"
                                ]
                            }
                        }
                    ],
                    "pullRequest": [
                        {
                            "events": [
                                "CLOSED"
                            ],
                            "branches": {
                                "includes": [
                                    "feature/**",
                                    "release/**"
                                ],
                                "excludes": [
                                    "mainline"
                                ]
                            },
                            "filePaths": {
                                "includes": [
                                    "projectA/**",
                                    "common/**/*.js"
                                ],
                                "excludes": [
                                    "**/README.md",
                                    "**/LICENSE",
                                    "**/CONTRIBUTING.md"
                                ]
                            }
                        }
                    ]
                }
            }
        ],
```

### Event type `push` fields for include and exclude


Include and exclude behavior for levels of Git configuration fields for **push** event types are shown in the following list:

```
push (OR operation is used between push and pullRequest or multiples)
    filePaths (AND operation is used between filePaths, branches, and tags)
        includes (AND operation is used between includes and excludes)
            **/FILE.md, **/FILE2 (OR operation is used between file path names)
        excludes (AND operation is used between includes and excludes)
            **/FILE.md, **/FILE2 (OR operation is used between file path names)
    branches (AND operation is used between filePaths, branches, and tags)
        includes (AND operation is used between includes and excludes)
            BRANCH/**, BRANCH2/** (OR operation is used between branch names)
        excludes (AND operation is used between includes and excludes)
            BRANCH/**, BRANCH2/** (OR operation is used between branch names)
    tags (AND operation is used between filePaths, branches, and tags)
        includes (AND operation is used between includes and excludes)
            TAG/**, TAG2/** (OR operation is used between tag names)
        excludes (AND operation is used between includes and excludes)
            TAG/**, TAG2/** (OR operation is used between tag names)
```

### Event type `pull request` fields for include and exclude


Include and exclude behavior for levels of Git configuration fields for **pull request** event types are shown in the following list:

```
pullRequest (OR operation is used between push and pullRequest or multiples)
    events (AND operation is used between events, filePaths, and branches; includes/excludes are not applicable to pull request events)
    filePaths (AND operation is used between events, filePaths, and branches)
        includes (AND operation is used between includes and excludes)
            **/FILE.md, **/FILE2 (OR operation is used between file path names)
        excludes (AND operation is used between includes and excludes)
            **/FILE.md, **/FILE2 (OR operation is used between file path names)
    branches (AND operation is used between events, filePaths, and branches)
        includes (AND operation is used between includes and excludes)
            BRANCH/**, BRANCH2/** (OR operation is used between branch names)
        excludes (AND operation is used between includes and excludes)
            BRANCH/**, BRANCH2/** (OR operation is used between branch names)
```
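
The include and exclude semantics above can be sketched in Python. This is an approximation for illustration only: it assumes `**` matches across `/` separators while `*` does not, and it evaluates one value per field, whereas the service evaluates every changed file in a push.

```python
import re

def glob_to_regex(pattern):
    """Translate a filter pattern to a regex (assumes ** crosses '/', * does not)."""
    parts, i = [], 0
    while i < len(pattern):
        if pattern.startswith("**", i):
            parts.append(".*")
            i += 2
        elif pattern[i] == "*":
            parts.append("[^/]*")
            i += 1
        else:
            parts.append(re.escape(pattern[i]))
            i += 1
    return re.compile("^" + "".join(parts) + "$")

def field_matches(value, field):
    """A field passes if any include matches (OR) and no exclude matches (OR)."""
    included = any(glob_to_regex(p).match(value) for p in field.get("includes", ["**"]))
    excluded = any(glob_to_regex(p).match(value) for p in field.get("excludes", []))
    return included and not excluded

def filter_matches(event, flt):
    """All fields configured on the filter must pass (AND between fields)."""
    return all(field_matches(event[name], flt[name]) for name in flt)

push_filter = {
    "branches": {"includes": ["main", "release/**"], "excludes": ["release/beta/**"]},
    "filePaths": {"includes": ["projectA/**"]},
}
print(filter_matches({"branches": "release/v1", "filePaths": "projectA/app.js"}, push_filter))       # True
print(filter_matches({"branches": "release/beta/v2", "filePaths": "projectA/app.js"}, push_filter))  # False
```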

## `metadata`


The pipeline metadata fields are distinct from the pipeline structure and cannot be edited. When you update a pipeline, the date in the `updated` metadata field changes automatically.

### `pipelineArn`


The Amazon Resource Name (ARN) of the pipeline.

To use the console to view the pipeline ARN instead of the JSON structure, choose your pipeline in the console, and then choose **Settings**. Under the **General** tab, the **Pipeline ARN** field displays.

### `created`


The date and time when the pipeline was created.

### `updated`


The date and time when the pipeline was last updated.

### `pollingDisabledAt`


For a pipeline that is configured to poll for change detection, the date and time when polling was disabled.

Pipelines that are inactive for longer than 30 days will have polling disabled.
+ Inactive pipelines will have polling disabled after 30 days of no executions.
+ Pipelines using EventBridge, CodeStar Connections, or webhooks will not be affected.
+ Active pipelines will not be affected.

For more information, see the `pollingDisabledAt` parameter of the [PipelineMetadata](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_PipelineMetadata.html) object in the *CodePipeline API Guide*. For the steps to migrate your pipeline from polling to event-based change detection, see [Change Detection Methods](change-detection-methods.md).

# Stage declaration


The stage level of a pipeline has a basic structure that includes the following parameters and syntax. For more information, see the [StageDeclaration](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_StageDeclaration.html) object in the *CodePipeline API Guide*.

The following example shows the stage level of the pipeline structure in both JSON and YAML. The example shows two stages named `Source` and `Build`. The example contains two conditions, one for `onSuccess` and one for `beforeEntry`.

------
#### [ YAML ]

```
pipeline:
  name: MyPipeline
  roleArn: >-
    arn:aws:iam::ACCOUNT_ID:role/service-role/AWSCodePipelineServiceRole-us-west-2-MyPipeline
  artifactStore:
    type: S3
    location: amzn-s3-demo-bucket
  stages:
    - name: Source
      actions:
        - name: Source
          ...
    - name: Build
      actions:
        - name: Build
          ...
      onSuccess:
        conditions:
        - result: ROLLBACK
          rules:
          - name: DeploymentWindowRule
            ...
      beforeEntry:
        conditions:
        - result: FAIL
          rules:
          - name: MyLambdaRule
            ...
  version: 6
metadata:
  pipelineArn: 'arn:aws:codepipeline:us-west-2:ACCOUNT_ID:MyPipeline'
  created: '2019-12-12T06:49:02.733000+00:00'
  updated: '2020-09-10T06:34:07.447000+00:00'
```

------
#### [ JSON ]

```
{
    "pipeline": {
        "name": "MyPipeline",
        "roleArn": "arn:aws:iam::ACCOUNT_ID:role/service-role/AWSCodePipelineServiceRole-us-west-2-MyPipeline",
        "artifactStore": {
            "type": "S3",
            "location": "amzn-s3-demo-bucket"
        },
        "stages": [
            {
                "name": "Source",
                "actions": [
                    {
                        "name": "Source",
                        ...
                    }
                ]
            },
            {
                "name": "Build",
                "actions": [
                    {
                        "name": "Build",
                        ...
                    }
                ],
                "onSuccess": {
                    "conditions": [
                        {
                            "result": "ROLLBACK",
                            "rules": [
                                {
                                    "name": "DeploymentWindowRule",
                                    ...
                                }
                            ]
                        }
                    ]
                },
                "beforeEntry": {
                    "conditions": [
                        {
                            "result": "FAIL",
                            "rules": [
                                {
                                    "name": "MyLambdaRule",
                                     ...
                                }
                            ]
                        }
                    ]
                }
            }
        ],
        "version": 6
    },
    "metadata": {
        "pipelineArn": "arn:aws:codepipeline:us-west-2:ACCOUNT_ID:MyPipeline",
        "created": "2019-12-12T06:49:02.733000+00:00",
        "updated": "2020-09-10T06:34:07.447000+00:00"
    }
}
```

------

## `name`


The name of the stage.

## `actions`


The action level of a pipeline has a basic structure that includes the following parameters and syntax. To view parameters and examples, see [Action declaration](action-requirements.md).

## `conditions`


Conditions contain one or more rules that are available in a list of rules in CodePipeline. If all rules in a condition succeed, the condition is met. You can configure conditions so that when the criteria are not met, the specified result, such as failing or rolling back the stage, takes effect.

You can configure the following types of conditions:
+ `beforeEntry`
+ `onFailure`
+ `onSuccess`

For more information and examples, see [Configure conditions for a stage](stage-conditions.md).

## `rules`


Each condition has a rule set, which is an ordered set of rules that are evaluated together; if any rule in the set fails, the condition fails. You can override rule conditions at pipeline runtime.

The available rules are listed in the [Rule structure reference](rule-reference.md).

# Action declaration


The action level of a pipeline has a basic structure that includes the following parameters and syntax. For more information, see the [ActionDeclaration](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ActionDeclaration.html) object in the *CodePipeline API Guide*.

The following example shows the action level of the pipeline structure in both JSON and YAML.

------
#### [ YAML ]

```
 
. . . 

  stages:
    - name: Source
      actions:
        - name: Source
          actionTypeId:
            category: Source
            owner: AWS
            provider: S3
            version: '1'
          runOrder: 1
          configuration:
            PollForSourceChanges: 'false'
            S3Bucket: amzn-s3-demo-bucket
            S3ObjectKey: aws-codepipeline-s3-aws-codedeploy_linux.zip
          outputArtifacts:
            - name: SourceArtifact
          inputArtifacts: []
          region: us-west-2
          namespace: SourceVariables
    - name: Build
      actions:
        - name: Build
          actionTypeId:
            category: Build
            owner: AWS
            provider: CodeBuild
            version: '1'
          runOrder: 1
          configuration:
            EnvironmentVariables: >-
              [{"name":"ETag","value":"#{SourceVariables.ETag}","type":"PLAINTEXT"}]
            ProjectName: my-build-project
          outputArtifacts:
            - name: BuildArtifact
          inputArtifacts:
            - name: SourceArtifact
          region: us-west-2
          namespace: BuildVariables
    - name: Approval
      actions:
        - name: Approval
          actionTypeId:
            category: Approval
            owner: AWS
            provider: Manual
            version: '1'
          runOrder: 1
          configuration:
            CustomData: >-
              Here are the exported variables from the build action: S3 ETAG:
              #{BuildVariables.ETag}
          outputArtifacts: []
          inputArtifacts: []
          region: us-west-2
```

------
#### [ JSON ]

```
 
. . . 

        "stages": [
            {
                "name": "Source",
                "actions": [
                    {
                        "name": "Source",
                        "actionTypeId": {
                            "category": "Source",
                            "owner": "AWS",
                            "provider": "S3",
                            "version": "1"
                        },
                        "runOrder": 1,
                        "configuration": {
                            "PollForSourceChanges": "false",
                            "S3Bucket": "amzn-s3-demo-bucket",
                            "S3ObjectKey": "aws-codepipeline-s3-aws-codedeploy_linux.zip"
                        },
                        "outputArtifacts": [
                            {
                                "name": "SourceArtifact"
                            }
                        ],
                        "inputArtifacts": [],
                        "region": "us-west-2",
                        "namespace": "SourceVariables"
                    }
                ]
            },
            {
                "name": "Build",
                "actions": [
                    {
                        "name": "Build",
                        "actionTypeId": {
                            "category": "Build",
                            "owner": "AWS",
                            "provider": "CodeBuild",
                            "version": "1"
                        },
                        "runOrder": 1,
                        "configuration": {
                            "EnvironmentVariables": "[{\"name\":\"ETag\",\"value\":\"#{SourceVariables.ETag}\",\"type\":\"PLAINTEXT\"}]",
                            "ProjectName": "my-build-project"
                        },
                        "outputArtifacts": [
                            {
                                "name": "BuildArtifact"
                            }
                        ],
                        "inputArtifacts": [
                            {
                                "name": "SourceArtifact"
                            }
                        ],
                        "region": "us-west-2",
                        "namespace": "BuildVariables"
                    }
                ]
      
. . .
```

------

For a list of example `configuration` details appropriate to the provider type, see [Valid configuration parameters for each provider type](structure-configuration-examples.md).

The action structure has the following requirements:
+ All action names within a stage must be unique.
+ A source action is required for each pipeline.
+ Source actions that do not use a connection can be configured for change detection or to turn off change detection. See [Change Detection Methods](change-detection-methods.md).
+ The input artifact of an action must exactly match the output artifact declared in a preceding action. This is true for all actions, whether they are in the same stage or in following stages, but the action that consumes the input artifact does not have to be the next action in strict sequence after the action that provides the output artifact. Actions running in parallel can declare different output artifact bundles, which are, in turn, consumed by different following actions.
+ When you use an Amazon S3 bucket as a deployment location, you also specify an object key. An object key can be a file name (object) or a combination of a prefix (folder path) and file name. You can use variables to specify the location name you want the pipeline to use. Amazon S3 deployment actions support the use of the following variables in Amazon S3 object keys.  
**Using variables in Amazon S3**    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/action-requirements.html)
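As a sketch of the variable support described above, the following S3 deploy action `configuration` uses the pipeline execution ID variable in the object key. The bucket name and key prefix here are placeholder values:

```
"configuration": {
    "BucketName": "amzn-s3-demo-bucket",
    "Extract": "false",
    "ObjectKey": "builds/#{codepipeline.PipelineExecutionId}.zip"
},
```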

## `name`


The name of the action.

## `region`


For actions where the provider is an AWS service, the AWS Region of the resource.

Cross-Region actions use the `region` field to designate the AWS Region where the actions are to be created. The AWS resources created for this action must be created in the same Region provided in the `region` field. You cannot create cross-Region actions for the following action types:
+ Source actions
+ Actions by third-party providers
+ Actions by custom providers
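For example, a CloudFormation deploy action can target a Region different from the pipeline's Region by setting the `region` field. The following is a sketch with placeholder values and most fields elided:

```
{
    "name": "Deploy-Europe",
    "actionTypeId": {
        "category": "Deploy",
        "owner": "AWS",
        "provider": "CloudFormation",
        "version": "1"
    },
    "region": "eu-west-1",
    . . .
}
```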

## `roleArn`


The ARN of the IAM service role that performs the declared action. This role is assumed through the `roleArn` specified at the pipeline level.
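A minimal sketch of an action-level role declaration, assuming a placeholder account ID and role name, with other action fields elided:

```
{
    "name": "Deploy",
    "roleArn": "arn:aws:iam::111122223333:role/my-action-role",
    . . .
}
```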

## `namespace`


Actions can be configured with variables. You use the `namespace` field to set the namespace and variable information for execution variables. For reference information about execution variables and action output variables, see [Variables reference](reference-variables.md).

**Note**  
For Amazon ECR, Amazon S3, or CodeCommit sources, you can also create a source override using input transform entry to use the `revisionValue` in EventBridge for your pipeline event, where the `revisionValue` is derived from the source event variable for your object key, commit, or image ID. For more information, see the optional step for input transform entry included in the procedures under [Amazon ECR source actions and EventBridge resources](create-cwe-ecr-source.md), [Connecting to Amazon S3 source actions with a source enabled for events](create-S3-source-events.md), or [CodeCommit source actions and EventBridge](triggering.md).
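The example at the beginning of this topic shows this pattern: the source action sets `"namespace": "SourceVariables"`, and the build action then references the `ETag` output variable with the `#{SourceVariables.ETag}` syntax. Sketched minimally:

```
"namespace": "SourceVariables"
. . .
"EnvironmentVariables": "[{\"name\":\"ETag\",\"value\":\"#{SourceVariables.ETag}\",\"type\":\"PLAINTEXT\"}]"
```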

## `actionTypeId`


The action type ID is identified as a combination of the following four fields.

### `category`


The type of action, or step, in the pipeline, such as a source action. Each action type has a specific set of valid action providers. For a list of valid providers by action type, see the [Action structure reference](action-reference.md).

These are the valid `actionTypeId` categories (action types) for CodePipeline:
+ `Source`
+ `Build`
+ `Approval`
+ `Deploy`
+ `Test`
+ `Invoke`
+ `Compute`

### `owner`


For all currently supported action types, the valid owner strings are `AWS`, `ThirdParty`, and `Custom`. For the valid owner string for a specific action, see the [Action structure reference](action-reference.md).

For more information, see the [CodePipeline API Reference](http://docs.aws.amazon.com/codepipeline/latest/APIReference).

### `version`


The version of the action.

### `provider`


The action provider, such as CodeBuild.
+ Valid provider types for an action category depend on the category. For example, for a source action category, a valid provider type is `S3`, `CodeStarSourceConnection`, `CodeCommit`, or `ECR`. This example shows the structure for a source action with an `S3` provider:

  ```
  "actionTypeId": {
    "category": "Source",
    "owner": "AWS",
    "version": "1",
    "provider": "S3"
  },
  ```

## `inputArtifacts`


This field contains the input artifact structure, if supported for the action category. The input artifact of an action must exactly match the output artifact declared in a preceding action. For example, if a preceding action includes the following declaration:

```
"outputArtifacts": [
    {
        "name": "MyApp"
    }
],
```

 and there are no other output artifacts, then the input artifact of a following action must be: 

```
"inputArtifacts": [
    {
        "name": "MyApp"
    }
],
```

For example, a source action cannot have input artifacts because it is the first action in the pipeline. However, a source action will always have output artifacts that are processed by the following action. The output artifacts for a source action are the application files from the source repository, zipped and provided through the artifact bucket, that are processed by the following action, such as a CodeBuild action that acts on the application files with build commands.

As an example of actions that cannot have output artifacts, deploy actions do not have output artifacts because these actions are generally the last action in a pipeline.

### `name`


The artifact name for the action's input artifacts.

## `outputArtifacts`


Output artifact names must be unique in a pipeline. For example, a pipeline can include one action that has an output artifact named `"MyApp"` and another action that has an output artifact named `"MyBuiltApp"`. However, a pipeline cannot include two actions that both have an output artifact named `"MyApp"`.

This field contains the output artifact structure, if supported for the action category. The output artifact name of an action must exactly match the input artifact name declared in the following action that consumes it. For example, if an action includes the following declaration:

```
"outputArtifacts": [
    {
        "name": "MyApp"
    }
],
```

 and there are no other output artifacts, then the input artifact of a following action must be: 

```
"inputArtifacts": [
    {
        "name": "MyApp"
    }
],
```


### `name`


The artifact name for the action's output artifacts.

## `configuration` (by action provider)


The action configuration contains details and parameters appropriate to the provider type. In the section below, the example action configuration parameters are specific to the S3 source action.

The action configuration and input/output artifact limits can vary by action provider. For a list of action configuration examples by action provider, see [Action structure reference](action-reference.md) and the table in [Valid configuration parameters for each provider type](structure-configuration-examples.md). The table provides a link to the action reference for each provider type, which lists the configuration parameters for each action in detail. For a table with the input and output artifact limits for each action provider, see [Valid input and output artifacts for each action type](reference-action-artifacts.md).

The following considerations apply to working with actions:
+ Source actions do not have input artifacts, and deploy actions do not have output artifacts.
+ For source action providers that do not use a connection, such as S3, you must use the `PollForSourceChanges` parameter to specify whether you want your pipeline to start automatically when a change is detected. See [Valid settings for the `PollForSourceChanges` parameter](PollForSourceChanges-defaults.md).
+ To configure automated change detection to start your pipeline, or to disable change detection, see [Source actions and change detection methods](change-detection-methods.md).
+ To configure triggers with filtering, use the source action for connections, and then see [Automate starting pipelines using triggers and filtering](pipelines-triggers.md).
+ For the output variables for each action, see [Variables reference](reference-variables.md).
**Important**  
Pipelines that are inactive for longer than 30 days will have polling disabled for the pipeline. For more information, see [pollingDisabledAt](pipeline-requirements.md#metadata.pollingDisabledAt) in the pipeline structure reference. For the steps to migrate your pipeline from polling to event-based change detection, see [Change Detection Methods](change-detection-methods.md).

**Note**  
The CodeCommit and S3 source actions require either a configured change detection resource (an EventBridge rule) or the use of the option to poll the repository for source changes. For pipelines with a Bitbucket, GitHub, or GitHub Enterprise Server source action, you do not have to set up a webhook or default to polling. The connections action manages change detection for you. 

## `runOrder`


A positive integer that indicates the run order of the action within the stage. Parallel actions in the stage are shown as having the same integer. For example, two actions with a `runOrder` value of 2 will run in parallel after the first action in the stage runs.

The default `runOrder` value for an action is 1. The value must be a positive integer (natural number). You cannot use fractions, decimals, negative numbers, or zero. To specify a serial sequence of actions, use the smallest number for the first action and larger numbers for each of the rest of the actions in sequence. To specify parallel actions, use the same integer for each action you want to run in parallel. In the console, you can specify a serial sequence for an action by choosing **Add action group** at the level in the stage where you want it to run, or you can specify a parallel sequence by choosing **Add action**. *Action group* refers to a run order of one or more actions at the same level.

For example, if you want three actions to run in sequence in a stage, you would give the first action the `runOrder` value of 1, the second action the `runOrder` value of 2, and the third the `runOrder` value of 3. However, if you want the second and third actions to run in parallel, you would give the first action the `runOrder` value of 1 and both the second and third actions the `runOrder` value of 2.
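The parallel example above can be sketched as follows, with the action names as placeholders and other action fields elided; the two test actions share a `runOrder` value of 2 and run in parallel after the build action:

```
"actions": [
    { "name": "BuildAction", "runOrder": 1, . . . },
    { "name": "TestAction1", "runOrder": 2, . . . },
    { "name": "TestAction2", "runOrder": 2, . . . }
]
```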

**Note**  
The numbering of serial actions does not have to be in strict sequence. For example, if you have three actions in a sequence and decide to remove the second action, you do not need to renumber the `runOrder` value of the third action. Because the `runOrder` value of that action (3) is higher than the `runOrder` value of the first action (1), it runs serially after the first action in the stage.

# Valid action providers in CodePipeline


The pipeline structure format is used to build actions and stages in a pipeline. An action type consists of an action category and provider type. 

Each action category has a valid list of action providers. To reference the valid action providers for each action category, see the [Action structure reference](action-reference.md). 

Each action category has a designated set of providers. Each action provider, such as Amazon S3, has a provider name, such as `S3`, that must be used in the `provider` field in the action category in your pipeline structure. 

There are three valid values for the `owner` field in the action category section in your pipeline structure: `AWS`, `ThirdParty`, and `Custom`.

To find the provider name and owner information for your action provider, see the [Action structure reference](action-reference.md) or [Valid input and output artifacts for each action type](reference-action-artifacts.md).

This table lists valid providers by action type.

**Note**  
For Bitbucket, GitHub, or GitHub Enterprise Server actions, refer to the [CodeStarSourceConnection for Bitbucket Cloud, GitHub, GitHub Enterprise Server, GitLab.com, and GitLab self-managed actions](action-reference-CodestarConnectionSource.md) action reference topic.


**Valid action providers by action type**  
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/actions-valid-providers.html)

Some action types in CodePipeline are available in select AWS Regions only. It is possible that an action type is available in an AWS Region, but an AWS provider for that action type is not available.

For more information about each action provider, see [Integrations with CodePipeline action types](integrations-action-type.md). 

# Valid settings for the `PollForSourceChanges` parameter


The `PollForSourceChanges` parameter default is determined by the method used to create the pipeline, as described in the following table. In many cases, the `PollForSourceChanges` parameter defaults to true and must be set to false. 

When the `PollForSourceChanges` parameter defaults to true, you should do the following:
+ Add the `PollForSourceChanges` parameter to the JSON file or CloudFormation template.
+ Create change detection resources (an EventBridge rule, as applicable).
+ Set the `PollForSourceChanges` parameter to false.

**Note**  
If you create an EventBridge rule or webhook, you must set the parameter to false to avoid triggering the pipeline more than once. The `PollForSourceChanges` parameter is not used for Amazon ECR source actions.

**`PollForSourceChanges` parameter defaults**    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/PollForSourceChanges-defaults.html)

# Valid input and output artifacts for each action type


Depending on the action type and provider, you can have the following number of input and output artifacts.


**Action type constraints for artifacts**  

| Owner | Type of action | Provider | Valid number of input artifacts | Valid number of output artifacts | 
| --- | --- | --- | --- | --- | 
| AWS | Source | S3 | 0 | 1 | 
| AWS | Source | CodeCommit | 0 | 1 | 
| AWS | Source | ECR | 0 | 1 | 
| ThirdParty | Source | CodeStarSourceConnection | 0 | 1 | 
| AWS | Build | CodeBuild | 1 to 5 | 0 to 5 | 
| AWS | Test | CodeBuild | 1 to 5 | 0 to 5 | 
| AWS | Test | DeviceFarm | 1 | 0 | 
| AWS | Approval | ThirdParty | 0 | 0 | 
| AWS | Deploy | S3 | 1 | 0 | 
| AWS | Deploy | CloudFormation | 0 to 10 | 0 to 1 | 
| AWS | Deploy | CodeDeploy | 1 | 0 | 
| AWS | Deploy | ElasticBeanstalk | 1 | 0 | 
| AWS | Deploy | OpsWorks | 1 | 0 | 
| AWS | Deploy | ECS | 1 | 0 | 
| AWS | Deploy | ServiceCatalog | 1 | 0 | 
| AWS | Invoke | Lambda | 0 to 5 | 0 to 5 | 
| ThirdParty | Deploy | AlexaSkillsKit | 1 to 2 | 0 | 
| Custom | Build | Jenkins | 0 to 5 | 0 to 5 | 
| Custom | Test | Jenkins | 0 to 5 | 0 to 5 | 
| Custom | Any supported category | As specified in the custom action | 0 to 5 | 0 to 5 | 

# Valid configuration parameters for each provider type


This section lists valid `configuration` parameters for each action provider.

Every action must have a valid action configuration, which depends on the provider type for that action. The following table lists the required action configuration elements for each valid provider type:


**Action configuration properties for provider types**  
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/structure-configuration-examples.html)

The following example shows a valid configuration for a deploy action that uses Alexa Skills Kit:

```
"configuration": {
  "ClientId": "amzn1.application-oa2-client.aadEXAMPLE",
  "ClientSecret": "****",
  "RefreshToken": "****",
  "SkillId": "amzn1.ask.skill.22649d8f-0451-4b4b-9ed9-bfb6cEXAMPLE"
}
```

The following example shows a valid configuration for a manual approval:

```
"configuration": {
  "CustomData": "Comments on the manual approval",
  "ExternalEntityLink": "http://my-url.com",
  "NotificationArn": "arn:aws:sns:us-west-2:12345EXAMPLE:Notification"
}
```