

# Analyze data in a collaboration

In AWS Clean Rooms, you can analyze data by running queries or jobs.

A *query* is a method to access and analyze configured tables in a collaboration, using a supported set of functions, classes, and variables. The currently supported query language in AWS Clean Rooms is SQL. There are three ways to run a query in AWS Clean Rooms: write SQL code, use an approved SQL analysis template, or use the Analysis builder UI.

A *job* is a method to access and analyze configured tables in a collaboration using a supported set of functions, classes, and variables. The currently supported type of job in AWS Clean Rooms is PySpark. There is one way to run a job in AWS Clean Rooms: by using an approved PySpark analysis template.

The following topics describe how to analyze data in AWS Clean Rooms by running SQL queries or PySpark jobs.

**Topics**
+ [Running SQL queries](running-sql-queries.md)
+ [Running PySpark jobs](run-jobs.md)

# Running SQL queries


**Note**  
You can only run queries if the member who is responsible for paying query compute costs has joined the collaboration as an active member.

As the [member who can query](glossary.md#glossary-member-who-can-query), you can run a SQL query by:
+ Building a SQL query manually using the SQL code editor.
+ Using an approved SQL [analysis template](create-analysis-template.md).
+ Using the **Analysis builder UI** to build a query without having to write SQL code.

When the member who can query runs a SQL query on the tables in the collaboration, AWS Clean Rooms assumes the relevant roles to access the tables on their behalf. AWS Clean Rooms applies the analysis rules as necessary to the input query and its output.

The analysis rules and output constraints are enforced automatically. AWS Clean Rooms only returns the results that comply with the defined analysis rules.

AWS Clean Rooms supports a SQL dialect that can differ from other query engines. For specifications, see the [AWS Clean Rooms SQL Reference](https://docs.aws.amazon.com/clean-rooms/latest/sql-reference/sql-reference.html). If you want to run queries on data tables protected with differential privacy, ensure that your queries are compatible with the [general-purpose query structure](analysis-rules-custom.md#dp-query-structure-syntax) of AWS Clean Rooms Differential Privacy.
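As an illustration of that general-purpose structure, the following sketch shows an aggregation query with the aggregate function in the outermost SELECT. The table and column names are hypothetical placeholders, not names from your collaboration.

```python
# Hypothetical example of a query shaped for the differential privacy
# general-purpose query structure: aggregations appear in the outermost
# SELECT. "my_configured_table", "city", and "user_id" are placeholders.
dp_query = """
SELECT city, COUNT(DISTINCT user_id) AS unique_users
FROM my_configured_table
GROUP BY city
"""
```

Whether a given query is accepted also depends on the custom analysis rule and privacy budget configured for the table.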

**Note**  
When using [Cryptographic Computing for Clean Rooms](crypto-computing.md), not all SQL operations generate valid results. For example, you can run a COUNT on an encrypted column, but running a SUM on encrypted numbers leads to errors. Queries can also yield incorrect results without producing an error: a GROUP BY query over sealed columns appears to succeed but produces different groups than the same GROUP BY query over the cleartext.

The [member paying for query compute costs](glossary.md#glossary-member-paying-for-query-compute) is charged for the queries run in the collaboration.

The member who can query can select multiple [members who can receive results](glossary.md#glossary-member-who-can-receive-results) to receive the results from a single query. For more information, see [Querying configured tables using the SQL code editor](use-sql-editor.md). For general information about receiving query results, see [Receiving and using analysis results](receive-query-results.md).

## Prerequisites


Before you run a SQL query, make sure that you have the following:
+ An active membership in an AWS Clean Rooms collaboration
+ Access to at least one configured table in the collaboration
+ Confirmation that the member responsible for paying query compute costs is an active collaboration member

For information about how to query data or view queries by calling the AWS Clean Rooms [StartProtectedQuery API](https://docs.aws.amazon.com/clean-rooms/latest/apireference/API_StartProtectedQuery.html) operation directly or by using the AWS SDKs, see the [AWS Clean Rooms API Reference](https://docs.aws.amazon.com/clean-rooms/latest/apireference/Welcome.html).
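As a hedged sketch of calling that operation with the AWS SDK for Python (Boto3), the following function assembles the request parameters. The membership ID, query text, and bucket name are placeholders, and the field names follow the `StartProtectedQuery` API reference linked above; verify them against the current reference before use.

```python
def build_query_request(membership_id: str, query: str, bucket: str) -> dict:
    """Assemble request parameters for cleanrooms:StartProtectedQuery.
    All caller-supplied values are placeholders in this sketch."""
    return {
        "type": "SQL",
        "membershipIdentifier": membership_id,
        "sqlParameters": {"queryString": query},
        "resultConfiguration": {
            "outputConfiguration": {
                "s3": {
                    "resultFormat": "CSV",
                    "bucket": bucket,
                    "keyPrefix": "results/",
                }
            }
        },
        # CR.1X with 4 workers is the smallest supported configuration.
        "computeConfiguration": {"worker": {"type": "CR.1X", "number": 4}},
    }

# Usage (requires boto3 and valid AWS credentials; not run here):
# client = boto3.client("cleanrooms")
# response = client.start_protected_query(**build_query_request(...))
```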

For information about query logging, see [Analysis logging in AWS Clean Rooms](query-logs.md).

**Note**  
If you run a query on [encrypted](glossary.md#glossary-encryption) data tables, the results from the encrypted columns are encrypted.

## Spark properties configuration for SQL queries


AWS Clean Rooms enables you to optionally customize Spark runtime behavior by configuring supported Spark properties for SQL queries. These properties let you fine-tune performance, memory usage, and query execution parameters. With this feature, you have greater control over how your Spark-based queries are processed, allowing for optimization based on your specific workload requirements.

You can adjust settings such as shuffle partitions, broadcast join thresholds, and adaptive query execution parameters directly from the AWS Clean Rooms console. This is particularly useful for complex queries or large datasets where the default configuration may not be optimal. By tuning these Spark properties, you can improve query performance, reduce resource consumption, and better manage memory usage for your Spark-based collaboration analyses.

To use this feature, expand the **Spark properties** section in the query interface, select from the list of supported properties, and specify custom values. You can also configure Spark properties programmatically using the [StartProtectedQuery API](https://docs.aws.amazon.com/clean-rooms/latest/apireference/API_StartProtectedQuery.html).

For more information about Spark properties, including default values, see [Spark Properties](https://spark.apache.org/docs/latest/configuration.html#spark-properties) in the Apache Spark documentation. 
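For illustration, the following shows the kind of property/value pairs you might set. These are standard Apache Spark property names with assumed example values; confirm which properties appear in the console's supported list for your collaboration before relying on them.

```python
# Illustrative Spark property values (assumed examples, not defaults
# recommended by AWS Clean Rooms).
spark_properties = {
    "spark.sql.shuffle.partitions": "200",        # shuffle partition count
    "spark.sql.autoBroadcastJoinThreshold": "10485760",  # broadcast cutoff, bytes
    "spark.sql.adaptive.enabled": "true",         # adaptive query execution
}
```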

The following topics explain how to query data in a collaboration using the AWS Clean Rooms console.

**Topics**
+ [Prerequisites](#sql-queries-prereqs)
+ [Spark properties configuration for SQL queries](#spark-properties-config)
+ [Querying configured tables using the SQL code editor](use-sql-editor.md)
+ [Querying ID mapping tables using the SQL code editor](query-id-mapping-tables.md)
+ [Querying configured tables using a SQL analysis template](use-analysis-template.md)
+ [Querying with the analysis builder](query-data-analysis-builder.md)
+ [Viewing the impact of differential privacy](query-data-with-diff-privacy.md)
+ [Viewing recent queries](view-queries-console.md)
+ [Viewing query details](view-query-details.md)

# Querying configured tables using the SQL code editor

As a member who can query, you can build a query manually by writing SQL code in the SQL code editor. The SQL code editor is located in the **Analysis** section of the **Analysis** tab in the AWS Clean Rooms console. 

The SQL code editor is displayed by default. If you want to use the analysis builder to build queries, see [Querying with the analysis builder](query-data-analysis-builder.md). 

**Important**  
If you start writing a SQL query in the code editor and then turn on the **Analysis builder UI**, your query isn't saved.

AWS Clean Rooms supports many SQL commands, functions, and conditions. For more information, see the [AWS Clean Rooms SQL Reference](https://docs.aws.amazon.com/clean-rooms/latest/sql-reference/sql-reference.html). 

**Tip**  
If a scheduled maintenance occurs while a query is running, the query is terminated and rolled back. You must restart the query. 

**To query configured tables using the SQL code editor**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose the collaboration that has **Your member abilities** status of **Run queries**.

1. On the **Analysis** tab, under **Tables**, view the list of tables and their associated analysis rule type (**Aggregation analysis rule**, **List analysis rule**, or **Custom analysis rule**).
**Note**  
If you don’t see the tables that you expect in the list, it might be for the following reasons:  
The tables haven't been [associated](associate-configured-table.md).
The tables don't have an [analysis rule configured](add-analysis-rule.md).

1. (Optional) To view the table's schema and analysis rule controls, expand the table by choosing the plus sign icon.

1. Under the **Analysis** section, for **Analysis mode**, select **Write SQL code**.
**Note**  
The **Analysis** section only displays if the member who can receive results and the member responsible for paying query compute costs have joined the collaboration as active members.

1. Build the query by typing the query into the SQL code editor.

   For more information about supported SQL commands and functions, see the [AWS Clean Rooms SQL Reference](https://docs.aws.amazon.com/clean-rooms/latest/sql-reference/sql-reference.html).

   You can also use the following options to build your query.

------
#### [ Use an example query ]

   To use an example query

   1. Select the three vertical dots next to the table.

   1. Under **Insert in editor**, choose **Example query**.
**Note**  
Inserting an **Example query** appends it to the query already in the editor.

      The query example appears. All of the tables listed under **Tables** are included in the query. 

   1. Edit the placeholder values in the query.

------
#### [ Insert column names or functions ]

   To insert a column name or function

   1. Select the three vertical dots next to a column.

   1. Under **Insert in editor**, choose **Column name**.

   1. To manually insert a function that is permitted on a column, 

      1. Select the three vertical dots next to a column.

      1. Select **Insert in editor**.

      1. Select the name of the permitted function (such as INNER JOIN, SUM, SUM DISTINCT, or COUNT).

   1. Press **Ctrl+Space** to view the table schemas in the code editor.
**Note**  
Members who can query can view and use the partition columns in each configured table association. Ensure the partition column is labeled as a partition column in the AWS Glue table underlying the configured table.

   1. Edit the placeholder values in the query.

------

1. Specify the supported **Worker type** and the **Number of workers**. 

   You can choose the instance type and number of instances (workers) to run your SQL queries. 

   For CR.1X, you can select up to 128 workers or a minimum of 4 workers. 

   For CR.4X, you can select up to 32 workers or a minimum of 4 workers. 

   Use the following table to determine the type and number of workers you need for your use case.
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/use-sql-editor.html)
**Note**  
Different worker types and number of workers have associated costs. To learn more about the pricing, see [AWS Clean Rooms pricing](https://aws.amazon.com/clean-rooms/pricing/).

1. For **Send results to**, specify who can receive results. 
**Note**  
To receive results, the collaboration member must be configured as a result receiver and must be an active participant in the collaboration (**Status: Active**).

1. (Member who can query only) The **Use your default result settings** checkbox is selected by default. Keep this selected if you want to keep your default result settings.

   If you want to specify different results settings for this query, clear the **Use your default result settings** checkbox, and then choose the following. 

   1. **Result format** (**CSV** or **PARQUET**)

   1. **Result files** (**Single** or **Multiple**)

   1. **Results destination in Amazon S3**

   Each member who can receive results can specify a different **Result format**, **Result files**, and **Results destination in Amazon S3**.

1. To specify **Spark properties**:

   1. Expand **Spark properties**.

   1. Choose **Add Spark properties**.

   1. On the **Spark properties** dialog box, choose a **Property name** from the dropdown list and enter a **Value**.

   The following table provides a definition for each property.

   For more information about Spark properties, see [Spark Properties](https://spark.apache.org/docs/latest/configuration.html#spark-properties) in the Apache Spark documentation.     
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/use-sql-editor.html)

1. Choose **Run**.
**Note**  
You can't run the query if the member who can receive results hasn’t configured the query results settings.

1. View the **Results**. 

   For more information, see [Receiving and using analysis results](receive-query-results.md).

1. Continue to adjust parameters and run your query again, or choose the **+** button to start a new query in a new tab.

**Note**  
AWS Clean Rooms aims to provide clear error messaging. If an error message doesn't have enough details to help you troubleshoot, contact the account team. Provide them with a description of how the error occurred and the error message (including any identifiers). For more information, see [Troubleshooting AWS Clean Rooms](troubleshooting.md).
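The worker limits described in the procedure above (a minimum of 4 workers for both types, up to 128 for CR.1X and up to 32 for CR.4X) can be sketched as a small validation helper:

```python
# Worker-count limits per worker type, as stated in the procedure above.
WORKER_LIMITS = {"CR.1X": (4, 128), "CR.4X": (4, 32)}

def validate_workers(worker_type: str, count: int) -> bool:
    """Return True if the requested worker count is within the supported
    range for the given worker type."""
    lo, hi = WORKER_LIMITS[worker_type]
    return lo <= count <= hi
```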

# Querying ID mapping tables using the SQL code editor

The following procedure describes how to run a multi-table join query on the ID mapping table to join the `sourceId` with the `targetId`.

Before you query the ID mapping table, the ID mapping table must be successfully populated.
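As a sketch of such a multi-table join, the following query joins two hypothetical configured tables through the ID mapping table's `sourceId` and `targetId` columns. All names other than `sourceId` and `targetId` are placeholders.

```python
# Hypothetical multi-table join through an ID mapping table. The table
# names and the "user_id"/"segment" columns are placeholders; only
# sourceId and targetId come from the ID mapping table itself.
query = """
SELECT a.segment, COUNT(DISTINCT b.user_id) AS matched_users
FROM advertiser_table a
JOIN id_mapping_table m ON a.user_id = m.sourceId
JOIN publisher_table b ON m.targetId = b.user_id
GROUP BY a.segment
"""
```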

**To query ID mapping tables using the SQL code editor**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose the collaboration that has **Your member abilities** status of **Run queries**.

1. On the **Analysis** tab, go to the **Analysis** section.
**Note**  
The **Analysis** section only displays if the member who can receive results and the member responsible for paying query compute costs have joined the collaboration as active members.

1. On the **Analysis** tab, under **Tables**, view the list of ID mapping tables (under **Managed by AWS Clean Rooms**) and their associated analysis rule type (**ID mapping table analysis rule**).
**Note**  
If you don’t see the ID mapping tables that you expect in the list, it might be because the ID mapping tables haven't been successfully populated. For more information, see [Populating an existing ID mapping table](populate-id-mapping-table.md).

1. Build the query by typing the query into the SQL code editor.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/query-id-mapping-tables.html)

1. Specify the supported **Worker type** and the **Number of workers**. 

   Use the following table to determine the type and number of workers you need for your use case.
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/query-id-mapping-tables.html)
**Note**  
Different worker types and number of workers have associated costs. To learn more about the pricing, see [AWS Clean Rooms pricing](https://aws.amazon.com/clean-rooms/pricing/).

1. Choose **Run**.
**Note**  
You can't run the query if the member who can receive results hasn’t configured the query results settings.

1. View the **Results**.

   For more information, see [Receiving and using analysis results](receive-query-results.md).

1. Continue to adjust parameters and run your query again, or choose the **+** button to start a new query in a new tab.

**Note**  
AWS Clean Rooms aims to provide clear error messaging. If an error message doesn't have enough details to help you troubleshoot, contact the account team. Provide them with a description of how the error occurred and the error message (including any identifiers). For more information, see [Troubleshooting AWS Clean Rooms](troubleshooting.md).

# Querying configured tables using a SQL analysis template

This procedure demonstrates how to use an analysis template in the AWS Clean Rooms console to query configured tables with the **Custom** analysis rule. 

**To use a SQL analysis template to query configured tables with the Custom analysis rule**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose the collaboration that has **Your member abilities** status of **Run queries**.

1. On the **Analysis** tab, under the **Tables** section, view the tables and their associated analysis rule type (**Custom analysis rule**).
**Note**  
If you don’t see the tables that you expect in the list, it might be for the following reasons:  
The tables haven't been [associated](associate-configured-table.md).
The tables don't have an [analysis rule configured](add-analysis-rule.md).

1. Under the **Analysis** section, for **Analysis mode**, select **Run analysis templates** and then choose the analysis template from the dropdown list.

1. The parameters from the SQL analysis template automatically populate the **Definition** section.

1. Specify the supported **Worker type** and the **Number of workers**. 

   Use the following table to determine the type and number of workers you need for your use case.
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/use-analysis-template.html)
**Note**  
Different worker types and number of workers have associated costs. To learn more about the pricing, see [AWS Clean Rooms pricing](https://aws.amazon.com/clean-rooms/pricing/).

1. Specify the supported **Spark properties**.

   1. Select **Add Spark properties**.

   1. On the **Spark properties** dialog box, choose a **Property name** from the dropdown list and enter a **Value**.

   The following table provides a definition for each property.

   For more information about Spark properties, see [Spark Properties](https://spark.apache.org/docs/latest/configuration.html#spark-properties) in the Apache Spark documentation.     
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/use-analysis-template.html)

1. Choose **Run**.
**Note**  
You can't run the query if the member who can receive results hasn’t configured the query results settings.

1. Continue to adjust parameters and run your query again, or choose the **+** button to start a new query in a new tab.

# Querying with the analysis builder


You can use the analysis builder to build queries without having to write SQL code. With the analysis builder, you can build a query for a collaboration that has:
+ A single table that uses the [aggregation analysis rule](analysis-rules-aggregation.md) with no JOIN required
+ Two tables (one from each member) that both use the [aggregation analysis rule](glossary.md#glossary-agg-analysis-rule)
+ Two tables (one from each member) that both use the [list analysis rule](glossary.md#glossary-list-analysis-rule)
+ Two tables (one from each member) that both use the aggregation analysis rule and two tables (one from each member) that both use the list analysis rule

If you want to manually write SQL queries, see [Querying configured tables using the SQL code editor](use-sql-editor.md).

The analysis builder appears as the **Analysis builder UI** option in the **Analysis** section of the **Analysis** tab in the AWS Clean Rooms console. 

**Important**  
If you turn on the **Analysis builder UI**, start building a query in the analysis builder, and then turn off the **Analysis builder UI**, your query isn't saved.

**Tip**  
If a scheduled maintenance occurs while a query is running, the query is terminated and rolled back. You must restart the query. 

The following topics explain how to use the analysis builder.

**Topics**
+ [Use the analysis builder to query a single table (aggregation)](#use-analysis-builder-one-table)
+ [Use the analysis builder to query two tables (aggregation or list)](#use-analysis-builder-two-tables)

## Use the analysis builder to query a single table (aggregation)


This procedure demonstrates how to use the **Analysis builder UI** in the AWS Clean Rooms console to build a query. The query is for a collaboration that has a single table that uses the [aggregation analysis rule](analysis-rules-aggregation.md) with no JOIN required.

**To use the analysis builder to query a single table**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose the collaboration that has **Your member abilities** status of **Run queries**.

1. On the **Analysis** tab, under **Tables**, view the table and its associated analysis rule type. (The analysis rule type should be the **Aggregation analysis rule**.)
**Note**  
If you don’t see the table you expect, it might be for the following reasons:  
The table hasn't been [associated](associate-configured-table.md).
The table doesn't have an [analysis rule configured](add-analysis-rule.md).

1. Under the **Analysis** section, turn on **Analysis builder UI**.

1. Build a query.

   If you want to see all of the aggregation metrics, skip to step 9.

   1. For **Choose metrics**, review the aggregate metrics that have been preselected by default and remove any metric if needed.

   1. (Optional) For **Add segments – optional**, choose one or more parameters.
**Note**  
**Add segments – optional** is only displayed if dimensions are specified for the table.

   1. (Optional) For **Add filters – optional**, choose **Add filter**, and then choose a **Parameter**, operator, and **Value**. 

      To add more filters, choose **Add another filter**. 

      To remove a filter, choose **Remove**.
**Note**  
ORDER BY isn't supported for aggregation queries.  
Only the AND operator is supported in filters.

   1. (Optional) For **Add description – optional**, enter a description to help identify the query in the list of queries.

1. Expand **Preview SQL code**.

   1. View the SQL code that's generated from the analysis builder.

   1. To copy the SQL code, choose **Copy**.

   1. To edit the SQL code, choose **Edit in SQL code editor**.

1. Choose **Run**.
**Note**  
You can't run the query if the member who can receive results hasn’t configured the query results settings.

1. Continue to adjust parameters and run your query again, or choose the **+** button to start a new query in a new tab.

**Note**  
AWS Clean Rooms aims to provide clear error messaging. If an error message doesn't have enough details to help you troubleshoot, contact the account team. Provide them with a description of how the error occurred and the error message (including any identifiers). For more information, see [Troubleshooting AWS Clean Rooms](troubleshooting.md).

## Use the analysis builder to query two tables (aggregation or list)


This procedure describes how to use the analysis builder in the AWS Clean Rooms console to build a query for a collaboration that has:
+ Two tables (one from each member) that both use the [aggregation analysis rule](analysis-rules-aggregation.md)
+ Two tables (one from each member) that both use the [list analysis rule](analysis-rules-list.md)
+ Two tables (one from each member) that both use the aggregation analysis rule and two tables (one from each member) that both use the list analysis rule

**To use the analysis builder to query two tables**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose the collaboration that has **Your member abilities** status of **Run queries**.

1. On the **Analysis** tab, under **Tables**, view the two tables and their associated analysis rule type (**Aggregation analysis rule** or **List analysis rule**).
**Note**  
If you don’t see the tables you expect in the list, it might be for the following reasons:  
The tables haven't been [associated](associate-configured-table.md).
The tables don't have an [analysis rule configured](add-analysis-rule.md).

1. Under the **Analysis** section, turn on **Analysis builder UI**.

1. Build a query.

   If the collaboration contains two tables that use the **Aggregation analysis rule** and two tables that use the **List analysis rule**, first choose **Aggregation** or **List**, and then follow the prompts based on the selected analysis rule.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/query-data-analysis-builder.html)

1. Expand **Preview SQL code**.

   1. View the SQL code that's generated from the analysis builder.

   1. To copy the SQL code, choose **Copy**.

   1. To edit the SQL code, choose **Edit in SQL code editor**.

1. Choose **Run**.
**Note**  
You can't run the query if the member who can receive results hasn’t configured the query results settings.

1. Continue to adjust parameters and run your query again, or choose the **+** button to start a new query in a new tab.

**Note**  
AWS Clean Rooms aims to provide clear error messaging. If an error message doesn't have enough details to help you troubleshoot, contact the account team. Provide them with a description of how the error occurred and the error message (including any identifiers). For more information, see [Troubleshooting AWS Clean Rooms](troubleshooting.md).

# Viewing the impact of differential privacy


In general, writing and running queries doesn't change when differential privacy is turned on. However, you can't run a query if there isn't enough privacy budget remaining. As you run queries and consume the privacy budget, you can see approximately how many aggregations you can run and how that might impact future queries.

**To view the impact of differential privacy in a collaboration**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose the collaboration that has **Your member abilities** status of **Run queries**.

1. On the **Analysis** tab, under **Tables**, view the remaining privacy budget. This is displayed as the estimated number of **aggregation functions remaining** and the **Utility used** (rendered as a percentage).
**Note**  
The estimated number of **aggregation functions remaining** and the percentage of the **Utility used** only display for the member who can query.

1. Choose **View impact** to view how much noise is injected into the results and approximately how many aggregation functions you can run.

# Viewing recent queries


You can view the queries that ran in the last 90 days on the **Analysis** tab.

**Note**  
If your only member ability is **Contribute data**, and you aren't the [member paying for query compute costs](glossary.md#glossary-member-paying-for-query-compute), the **Analysis** tab doesn't appear on the console.

**To view recent queries**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose a collaboration.

1. On the **Analysis** tab, under **Analyses**, select **All queries** from the dropdown, and view the queries that have been run in the last 90 days. 

1. To sort recent queries by **Status**, select a status from the **All statuses** dropdown list.

   The statuses are: **Submitted**, **Started**, **Cancelled**, **Success**, **Failed**, and **Timed out**.

# Viewing query details


You can view the query details as the member who can run queries or as a member who can receive results.

**To view the details of the query**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose a collaboration.

1. On the **Analysis** tab, do one of the following: 
   + Choose the option button for the specific query you want to view, and then choose **View details**.
   + Choose the **Protected query ID**. 

1. On the **Query details** page, 
   + If you are the member who can run queries, view the **Query details**, **SQL text** and **Results**.

     You see a message confirming that the query results were delivered to the member who can receive results.
   + If you are the member who can receive results, view the **Query details** and **Results**.

# Running PySpark jobs


As the [member who can query](glossary.md#glossary-member-who-can-query), you can run a PySpark job on a configured table by using an approved PySpark [analysis template](create-analysis-template.md).

**Prerequisites**

Before you run a PySpark job, you must have:
+ An active membership in an AWS Clean Rooms collaboration
+ Access to at least one analysis template in the collaboration
+ Access to at least one configured table in the collaboration
+ Permissions to write the results of a PySpark job to a specified S3 bucket

  For information about creating the required service role, see [Create a service role to write results of a PySpark job](setting-up-roles.md#create-role-pyspark-job).
+ Confirmation that the member responsible for paying compute costs has joined the collaboration as an active member

For information about how to run jobs or view job details by calling the AWS Clean Rooms `StartProtectedJob` API operation directly or by using the AWS SDKs, see the [AWS Clean Rooms API Reference](https://docs.aws.amazon.com/clean-rooms/latest/apireference/Welcome.html).
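The following is a hedged sketch of assembling `StartProtectedJob` request parameters with the AWS SDK for Python (Boto3). All identifiers are placeholders, and the field names are assumptions based on the API reference cited above; verify them there before use.

```python
def build_job_request(membership_id: str, template_arn: str) -> dict:
    """Assemble request parameters for cleanrooms:StartProtectedJob.
    Field names are assumed from the API reference; values here are
    placeholders."""
    return {
        "type": "PYSPARK",
        "membershipIdentifier": membership_id,
        "jobParameters": {"analysisTemplateArn": template_arn},
    }

# Usage (requires boto3 and valid AWS credentials; not run here):
# client = boto3.client("cleanrooms")
# response = client.start_protected_job(**build_job_request(...))
```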

For information about job logging, see [Analysis logging in AWS Clean Rooms](query-logs.md).

For information about receiving job results, see [Receiving and using analysis results](receive-query-results.md).

The following topics explain how to run a PySpark job on a configured table in a collaboration using the AWS Clean Rooms console.

**Topics**
+ [Running a PySpark job on a configured table using a PySpark analysis template](run-jobs-with-analysis-template.md)
+ [Viewing recent jobs](view-recent-jobs.md)
+ [Viewing job details](view-job-details.md)

# Running a PySpark job on a configured table using a PySpark analysis template

This procedure demonstrates how to use a PySpark analysis template in the AWS Clean Rooms console to analyze configured tables with the **Custom** analysis rule. 

**To run a PySpark job on a configured table using a PySpark analysis template**

Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose the collaboration that has **Your member abilities** status of **Run jobs**.

1. On the **Analysis** tab, under the **Tables** section, view the tables and their associated analysis rule type (**Custom analysis rule**).
**Note**  
If you don’t see the tables that you expect in the list, it might be for the following reasons:  
The tables haven't been [associated](associate-configured-table.md).
The tables don't have an [analysis rule configured](add-analysis-rule.md).

1. Under the **Analysis** section, for **Analysis mode**, select **Run analysis templates**.

1. Choose the PySpark analysis template from the **Analysis template** dropdown list.

   The parameters from the PySpark analysis template will automatically populate in the **Definition**.

1. If the analysis template has parameters defined, under **Parameters**, provide values for the parameters:

   1. For each parameter, view the **Parameter name** and **Default value** (if configured).

   1. Enter a **Value** for each parameter you want to override.
**Note**  
If you don't provide a value but a default value exists, the default value will be used.
**Important**  
Parameter values can be up to 1,000 characters and support UTF-8 encoding. All parameter values are treated as strings and passed to your user script through the context object.  
Ensure that your user script validates and handles parameter values safely. For more information about secure parameter handling, see [Working with parameters in PySpark analysis templates](pyspark-parameter-handling.md).
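
Because every parameter value arrives in the user script as a string of up to 1,000 characters, the script should validate before parsing or interpolating. The helpers below are a hypothetical sketch; the parameter names and validation rules are assumptions, not part of the Clean Rooms API.

```python
import re

MAX_PARAM_LEN = 1000  # documented upper bound for a parameter value


def validate_identifier(value: str) -> str:
    """Accept only a known-safe character set rather than raw input.

    Hypothetical helper: rejects overlong values and anything outside
    letters, digits, underscores, and hyphens.
    """
    if len(value) > MAX_PARAM_LEN:
        raise ValueError("parameter value exceeds the 1,000-character limit")
    if not re.fullmatch(r"[A-Za-z0-9_-]+", value):
        raise ValueError(f"unexpected characters in parameter: {value!r}")
    return value


def parse_min_age(value: str) -> int:
    """Convert a string parameter to int, rejecting non-numeric input."""
    if not value.isdigit():
        raise ValueError(f"expected a non-negative integer, got {value!r}")
    return int(value)
```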

1. Specify the supported **Worker type** and the **Number of workers**. 

   Use the following table to determine the type and number of workers you need for your use case.    
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/clean-rooms/latest/userguide/run-jobs-with-analysis-template.html)
**Note**  
Different worker types and number of workers have associated costs. To learn more about the pricing, see [AWS Clean Rooms pricing](https://aws.amazon.com/clean-rooms/pricing/).

1. Choose **Run**.
**Note**  
You can't run the job if the member who can receive results hasn’t configured the job results settings.

1. Continue to adjust parameters and run your job again, or choose the **\$1** button to start a new job in a new tab.

# Viewing recent jobs


You can view the jobs that ran in the last 90 days on the **Analysis** tab.

**Note**  
If your only member ability is **Contribute data**, and you aren't the [member paying for job compute costs](glossary.md#glossary-member-paying-for-query-compute), the **Analysis** tab doesn't appear on the console.

**To view recent jobs**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose a collaboration.

1. On the **Analysis** tab, under **Analyses**, select **All jobs** from the dropdown, and view the jobs that have been run in the last 90 days. 

1. To filter recent jobs by status, select a status from the **All statuses** dropdown list.

   The statuses are: **Submitted**, **Started**, **Cancelled**, **Success**, **Failed**, and **Timed out**.
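
The same status filter can be applied programmatically to a list of jobs, for example one returned by the `ListProtectedJobs` API operation. The field names and status values in this sketch are illustrative assumptions; confirm the actual response shape in the AWS Clean Rooms API Reference.

```python
# Sample data standing in for a ListProtectedJobs response (shape assumed).
SAMPLE_JOBS = [
    {"id": "job-1", "status": "SUCCESS"},
    {"id": "job-2", "status": "FAILED"},
    {"id": "job-3", "status": "SUCCESS"},
]


def jobs_with_status(jobs: list[dict], status: str) -> list[dict]:
    """Return the jobs whose status matches, mirroring the console's
    All statuses dropdown filter."""
    return [job for job in jobs if job["status"] == status]


succeeded = jobs_with_status(SAMPLE_JOBS, "SUCCESS")
```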

# Viewing job details


You can view the job details as the member who can run jobs or as a member who can receive results.

**To view the details of the job**

1. Sign in to the AWS Management Console and open the AWS Clean Rooms console at [https://console.aws.amazon.com/cleanrooms](https://console.aws.amazon.com/cleanrooms/home).

1. In the left navigation pane, choose **Collaborations**.

1. Choose a collaboration.

1. On the **Analysis** tab, under **Analyses**, select **All jobs** from the dropdown, and then do one of the following: 
   + Choose the option button for the specific job you want to view, and then choose **View details**.
   + Choose the **Protected job ID**. 

1. On the **Job details** page, 
   + If you are the member who can run jobs, view the **Job details**, **Job**, and **Results**.

     You see a message confirming that the job results were delivered to the member who can receive results.
   + If you are the member who can receive results, view the **Job details** and **Results**.