/AWS1/CL_CRMWORKERCOMPCONFPRPS

The configuration properties that define the compute environment settings for workers in Clean Rooms. These properties enable customization of the underlying compute environment to optimize performance for your specific workloads.

CONSTRUCTOR

IMPORTING

Optional arguments:

it_spark TYPE /AWS1/CL_CRMSPARKPROPERTIES_W=>TT_SPARKPROPERTIES

The Spark configuration properties for SQL workloads. This map contains key-value pairs that configure Apache Spark settings to optimize performance for your data processing jobs. You can specify up to 50 Spark properties, with each key being 1-200 characters and each value being 0-500 characters. These properties allow you to adjust compute capacity for large datasets and complex workloads.
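A minimal sketch of constructing this object with a Spark property map. It follows the common AWS SDK for SAP ABAP map pattern (a table of key rows whose values are wrapper-class references); the exact row structure and the wrapper constructor parameter name IV_VALUE are assumptions, and the property key/value shown are illustrative only.

```abap
" Hedged sketch: row shape and wrapper constructor follow the usual
" SDK map convention and are assumptions, not confirmed signatures.
DATA lt_spark TYPE /aws1/cl_crmsparkproperties_w=>tt_sparkproperties.

" Example Spark setting (illustrative key and value).
INSERT VALUE #(
    key   = 'spark.executor.memory'
    value = NEW /aws1/cl_crmsparkproperties_w( iv_value = '8g' )
  ) INTO TABLE lt_spark.

" Pass the map to the optional IT_SPARK constructor argument.
DATA(lo_props) = NEW /aws1/cl_crmworkercompconfprps( it_spark = lt_spark ).
```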


Queryable Attributes

spark

The Spark configuration properties for SQL workloads. This map contains key-value pairs that configure Apache Spark settings to optimize performance for your data processing jobs. You can specify up to 50 Spark properties, with each key being 1-200 characters and each value being 0-500 characters. These properties allow you to adjust compute capacity for large datasets and complex workloads.

Accessible with the following methods

Method Description
GET_SPARK() Getter for SPARK, with configurable default
ASK_SPARK() Getter for SPARK w/ exceptions if field has no value
HAS_SPARK() Determine if SPARK has a value
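The three accessors above can be combined defensively: check presence with HAS_SPARK before reading, or use ASK_SPARK when an absent value should be treated as an error. A brief sketch, assuming LO_PROPS is an instance of this class and that ASK_SPARK raises an SDK exception whose exact type is not shown here:

```abap
" Safe read: only fetch the map when the field was populated.
IF lo_props->has_spark( ) = abap_true.
  DATA(lt_spark) = lo_props->get_spark( ).
  LOOP AT lt_spark ASSIGNING FIELD-SYMBOL(<ls_prop>).
    " <ls_prop>-key holds the Spark property name;
    " <ls_prop>-value is a wrapper object reference (assumed shape).
  ENDLOOP.
ENDIF.
```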