

# Configure jobs using queue environments
<a name="configure-jobs"></a>

AWS Deadline Cloud uses *queue environments* to configure the software on your workers. An environment enables you to perform time-consuming tasks, such as setup and teardown, once for all the tasks in a session. It defines the actions to run on a worker when starting or stopping a session. You can configure an environment for a queue, for the jobs that run in the queue, and for the individual steps of a job.

You define environments as queue environments or job environments. Create queue environments with the Deadline Cloud console or with the [deadline:CreateQueueEnvironment](https://docs.aws.amazon.com/deadline-cloud/latest/APIReference/API_CreateQueueEnvironment.html) operation, and define job environments in the job templates of the jobs that you submit. Both kinds of environments follow the Open Job Description (OpenJD) specification for environments. For details, see [<Environment>](https://github.com/OpenJobDescription/openjd-specifications/wiki/2023-09-Template-Schemas#4-environment) in the OpenJD specification on GitHub.

In addition to a `name` and `description`, each environment contains two fields that define the environment on the host. They are:
+ `script` – The action taken when this environment is run on a worker.
+ `variables` – A set of environment variable name/value pairs that are set when entering the environment.

You must set at least one of `script` or `variables`.

You can define more than one environment in your job template. Environments are applied in the order that they are listed in the template. You can use this ordering to help manage the complexity of your environments.
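
For example, a job template can list two job environments; the first is entered before the second, and environments are exited in reverse order when the session ends. The environment names and variable values below are illustrative:

```
jobEnvironments:
- name: BaseTools
  variables:
    TOOLS_ROOT: /opt/studio/tools
- name: ProjectOverrides
  variables:
    TOOLS_ROOT: /opt/studio/tools-experimental
```

Because `ProjectOverrides` is applied after `BaseTools`, tasks see its value of `TOOLS_ROOT`.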

The default queue environment for Deadline Cloud uses the conda package manager to load software into the environment, but you can use other package managers. The default environment defines two parameters that specify the software to load. These parameters are set by the submitters that Deadline Cloud provides, though you can also set them in your own scripts and applications that use the default environment. They are:
+ `CondaPackages` – A space-separated list of conda package match specifications to install for the job. For example, the Blender submitter would add `blender=3.6` to render frames in Blender 3.6.
+ `CondaChannels` – A space-separated list of conda channels to install packages from. For service-managed fleets, packages are installed from the `deadline-cloud` channel. You can add other channels.
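
For example, when you submit a job bundle to a queue that uses the default queue environment, you can supply these parameters in the bundle's parameter values file. The following is a hedged sketch; the `parameter_values.yaml` file name and `parameterValues` key follow the job bundle convention used in the Deadline Cloud samples, and the package list is illustrative:

```
parameterValues:
- name: CondaPackages
  value: blender=3.6
- name: CondaChannels
  value: deadline-cloud
```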

# Control the job environment with OpenJD queue environments
<a name="control-the-job-environment"></a>

You can define customized environments for your rendering jobs using *queue environments*. A queue environment is a template that controls the environment variables, file mappings, and other settings for jobs running in a specific queue. It enables you to tailor the execution environment for the jobs submitted to a queue to the requirements of your workloads. AWS Deadline Cloud provides three nested levels where you can apply [Open Job Description (OpenJD) environments](https://github.com/OpenJobDescription/openjd-specifications/wiki/2023-09-Template-Schemas#4-environment): queue, job, and step. By defining queue environments, you can ensure consistent and optimized performance for different types of jobs, streamline resource allocation, and simplify queue management.

The queue environment is a template that you attach to a queue in your AWS account from the AWS Management Console or by using the AWS CLI. You can create one environment for a queue, or you can create multiple queue environments that are applied in order to create the execution environment. This approach enables you to create and test an environment in steps to help ensure that it works correctly for your jobs.

Job and step environments are defined in the job template that you use to create a job in your queue. The OpenJD syntax is the same for all of these environment types. This section shows them inside job templates.

**Topics**
+ [Set environment variables in a queue environment](set-environment-variables.md)
+ [Set the path in a queue environment](set-the-path.md)
+ [Run a background daemon process from the queue environment](run-a-background-daemon-process.md)

# Set environment variables in a queue environment
<a name="set-environment-variables"></a>

Many applications and frameworks use environment variables to control feature settings, logging levels, and display configuration. You can use [Open Job Description (OpenJD) environments](https://github.com/OpenJobDescription/openjd-specifications/wiki/2023-09-Template-Schemas#4-environment) to set environment variables that every task command within their scope inherits.

## Environment variable scope
<a name="set-env-vars-scope"></a>

AWS Deadline Cloud applies environment variables from queue environments that you attach to a queue. Within a job template, you can also define environment variables at the job and step levels using [OpenJD environments](https://github.com/OpenJobDescription/openjd-specifications/wiki/2023-09-Template-Schemas#4-environment). Variables defined at a narrower scope override variables with the same name from a broader scope.
+ **Queue environment** – A template that you attach to a queue in Deadline Cloud. Variables apply to all jobs submitted to the queue. You can set variables with a `variables` map for fixed values, or use scripts for dynamic values.
+ **Job environment** – Defined under `jobEnvironments` in a job template. Variables apply to all steps and tasks in the job. A job-level variable overrides a queue-level variable with the same name.
+ **Step environment** – Defined under `stepEnvironments` in a job template. Variables apply only to the tasks in that step. A step-level variable overrides a job-level or queue-level variable with the same name.

## Setting variables in a queue environment
<a name="set-env-vars-queue-env"></a>

You can set environment variables in a queue environment using a `variables` map for fixed values, or using a `script` with an `onEnter` action for dynamic values.

The following queue environment template uses a `variables` map to set the `QT_QPA_PLATFORM` variable to `offscreen`, which allows applications that use the [Qt Framework](https://www.qt.io/product/framework) to run on worker hosts without an interactive display.

```
specificationVersion: 'environment-2023-09'
environment:
  name: QtOffscreen
  variables:
    QT_QPA_PLATFORM: offscreen
```

For dynamic values, such as modifying `PATH` or activating virtual environments, use a script that prints lines in the format `openjd_env: VAR=value` to stdout. The `openjd_env:` prefix is required. Using `echo`, `export`, or other shell mechanisms without the prefix does not propagate variables to jobs and tasks.

The following queue environment template sets the `QT_QPA_PLATFORM` variable using a script.

```
specificationVersion: 'environment-2023-09'
environment:
  name: QtOffscreen
  script:
    actions:
      onEnter:
        command: bash
        args:
        - "{{Env.File.Enter}}"
    embeddedFiles:
    - name: Enter
      type: TEXT
      data: |
        #!/bin/env bash
        set -euo pipefail
        echo "openjd_env: QT_QPA_PLATFORM=offscreen"
```

To attach a queue environment to your queue, use the Deadline Cloud console or the AWS CLI. For more information, see [Create a queue environment](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/create-queue-environment.html) in the AWS Deadline Cloud User Guide. The following AWS CLI command creates a queue environment from a template file.

```
aws deadline create-queue-environment \
    --farm-id FARM_ID \
    --queue-id QUEUE_ID \
    --priority 1 \
    --template-type YAML \
    --template file://my-queue-env.yaml
```

For more complex examples, such as creating and activating conda virtual environments, see the [Deadline Cloud queue environment samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/queue_environments) on GitHub.

## Setting variables in a job template
<a name="set-env-vars-job-template"></a>

In a job template, add a `variables` map to a `jobEnvironments` or `stepEnvironments` entry. Each entry is a key-value pair where the key is the variable name and the value is the variable value.

The following job template sets the `QT_QPA_PLATFORM` environment variable to `offscreen`, which allows applications that use the [Qt Framework](https://www.qt.io/product/framework) to run on worker hosts without an interactive display.

```
specificationVersion: 'jobtemplate-2023-09'
name: MyJob
jobEnvironments:
- name: JobEnv
  variables:
    QT_QPA_PLATFORM: offscreen
```

You can set multiple variables in a single environment definition.

```
jobEnvironments:
- name: JobEnv
  variables:
    JOB_VERBOSITY: MEDIUM
    JOB_PROJECT_ID: my-project-id
    JOB_ENDPOINT_URL: https://my-host-name/my/path
    QT_QPA_PLATFORM: offscreen
```

You can reference job parameters in variable values by using the `{{Param.ParameterName}}` syntax.

```
jobEnvironments:
- name: JobEnv
  variables:
    JOB_EXAMPLE_PARAM: "{{Param.ExampleParam}}"
```

To override a job-level variable for a specific step, define a `stepEnvironments` entry with the same variable name. The following example defines `JOB_PROJECT_ID` at the job level with the value `project-12`, and then overrides the value at the step level with `step-project-12`. Tasks in the step use the step-level value.

```
specificationVersion: 'jobtemplate-2023-09'
name: MyJob
jobEnvironments:
- name: JobEnv
  variables:
    JOB_PROJECT_ID: project-12
steps:
- name: MyStep
  stepEnvironments:
  - name: StepEnv
    variables:
      JOB_PROJECT_ID: step-project-12
```

## Try it: Running the environment variable sample
<a name="set-env-vars-example"></a>

The Deadline Cloud samples repository includes a [job bundle that demonstrates setting and viewing environment variables](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/job_env_vars/template.yaml). The sample job template defines variables at both the job and step levels, then runs a task that prints the merged result. Use the following procedure to run the sample and inspect the results.

### Prerequisites
<a name="set-prerequisites"></a>

1. If you do not have a Deadline Cloud farm with a queue and associated Linux fleet, follow the guided onboarding experience in the [Deadline Cloud console](https://console.aws.amazon.com/deadlinecloud/home) to create one with default settings.

1. If you do not have the Deadline Cloud CLI and AWS Deadline Cloud monitor on your workstation, follow the steps in [Set up Deadline Cloud submitters](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/submitter.html).

1. Use `git` to clone the [Deadline Cloud samples GitHub repository](https://github.com/aws-deadline/deadline-cloud-samples).

   ```
   git clone https://github.com/aws-deadline/deadline-cloud-samples.git
   cd deadline-cloud-samples/job_bundles
   ```

### Running the sample
<a name="set-run-example"></a>

1. Use the Deadline Cloud CLI to submit the `job_env_vars` sample.

   ```
   deadline bundle submit job_env_vars
   ```

1. In the Deadline Cloud monitor, select the new job to monitor its progress. After the Linux fleet associated with the queue has a worker available, the job completes in a few seconds. Select the task, then choose **View logs** in the top right menu of the tasks panel.

### Comparing session actions with their definitions
<a name="set-compare-actions"></a>

The log view shows three session actions. Open the file [job_env_vars/template.yaml](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/job_env_vars/template.yaml) in a text editor to compare each action with its definition in the job template.

1. Select the **Launch JobEnv** session action. The log output shows the job-level environment variables being set.

   ```
   Setting: JOB_VERBOSITY=MEDIUM
   Setting: JOB_EXAMPLE_PARAM=An example parameter value
   Setting: JOB_PROJECT_ID=project-12
   Setting: JOB_ENDPOINT_URL=https://internal-host-name/some/path
   Setting: QT_QPA_PLATFORM=offscreen
   ```

   The following lines from the job template define this environment.

   ```
   jobEnvironments:
   - name: JobEnv
     variables:
       JOB_VERBOSITY: MEDIUM
       JOB_EXAMPLE_PARAM: "{{Param.ExampleParam}}"
       JOB_PROJECT_ID: project-12
       JOB_ENDPOINT_URL: https://internal-host-name/some/path
       QT_QPA_PLATFORM: offscreen
   ```

1. Select the **Launch StepEnv** session action. The log output shows the step-level variables, including the overridden `JOB_PROJECT_ID`.

   ```
   Setting: STEP_VERBOSITY=HIGH
   Setting: JOB_PROJECT_ID=step-project-12
   ```

   The following lines from the job template define this environment.

   ```
   stepEnvironments:
   - name: StepEnv
     variables:
       STEP_VERBOSITY: HIGH
       JOB_PROJECT_ID: step-project-12
   ```

1. Select the **Task run** session action. The log output shows the merged environment variables available to the task. Notice that `JOB_PROJECT_ID` uses the step-level value `step-project-12`.

   ```
   Environment variables starting with JOB_*:
   JOB_ENDPOINT_URL=https://internal-host-name/some/path
   JOB_EXAMPLE_PARAM='An example parameter value'
   JOB_PROJECT_ID=step-project-12
   JOB_VERBOSITY=MEDIUM
   
   Environment variables starting with STEP_*:
   STEP_VERBOSITY=HIGH
   ```

# Set the path in a queue environment
<a name="set-the-path"></a>

Use OpenJD environments to provide new commands to the tasks in a job. First you create a directory containing script files, and then you add that directory to the `PATH` environment variable so that tasks can run the scripts without specifying the directory path each time. The `variables` map in an environment definition can only set fixed values, so to modify an existing variable such as `PATH` you run a script instead. After the script sets things up and modifies `PATH`, it exports the new value to the OpenJD runtime with the command `echo "openjd_env: PATH=$PATH"`.
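
The pattern can be condensed into a small environment. The following is a hypothetical sketch (the environment name is illustrative, and copying script files into the new directory is omitted):

```
specificationVersion: 'environment-2023-09'
environment:
  name: AddCommandsToPath
  script:
    actions:
      onEnter:
        command: bash
        args:
        - "{{Env.File.Enter}}"
    embeddedFiles:
    - name: Enter
      type: TEXT
      data: |
        #!/bin/env bash
        set -euo pipefail

        # Create a bin directory in the session working directory,
        # prepend it to PATH, and export the new PATH to OpenJD.
        mkdir -p '{{Session.WorkingDirectory}}/bin'
        export "PATH={{Session.WorkingDirectory}}/bin:$PATH"
        echo "openjd_env: PATH=$PATH"
```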

## Prerequisites
<a name="path-prerequisites"></a>

Perform the following steps to run the [sample job bundle that adds a command to the `PATH`](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/job_env_with_new_command/template.yaml) from the Deadline Cloud samples GitHub repository.

1.  If you do not have a Deadline Cloud farm with a queue and associated Linux fleet, follow the guided onboarding experience in the [Deadline Cloud console](https://console.aws.amazon.com/deadlinecloud/home) to create one with default settings. 

1.  If you do not have the Deadline Cloud CLI and Deadline Cloud monitor on your workstation, follow the steps in [Set up Deadline Cloud submitters](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/submitter.html) from the user guide. 

1.  Use `git` to clone the [Deadline Cloud samples GitHub repository](https://github.com/aws-deadline/deadline-cloud-samples). 

   ```
   git clone https://github.com/aws-deadline/deadline-cloud-samples.git
   Cloning into 'deadline-cloud-samples'...
   ...
   cd deadline-cloud-samples/job_bundles
   ```

## Run the path sample
<a name="path-run-sample"></a>

1.  Use the Deadline Cloud CLI to submit the `job_env_with_new_command` sample.

   ```
   $ deadline bundle submit job_env_with_new_command
   Submitting to Queue: MySampleQueue
   ...
   ```

1.  In the Deadline Cloud monitor, you will see the new job and can monitor its progress. Once the Linux fleet associated with the queue has a worker available to run the job’s task, the job completes in a few seconds. Select the task, then choose the **View logs** option in the top right menu of the tasks panel. 

    On the right are two session actions, **Launch RandomSleepCommand** and **Task run**. The log viewer in the center of the window corresponds to the selected session action on the right. 

## Compare session actions with their definitions
<a name="path-view-logs"></a>

In this section you use the Deadline Cloud monitor to compare each session action with its definition in the job template. This procedure continues from the previous section.

Open the file [job_env_with_new_command/template.yaml](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/job_env_with_new_command/template.yaml) in a text editor to compare each session action with its definition in the job template.

1.  Select the **Launch RandomSleepCommand** session action in the Deadline Cloud monitor. You will see log output as follows.

   ```
    2024/07/16 17:25:32-07:00
    2024/07/16 17:25:32-07:00 ==============================================
    2024/07/16 17:25:32-07:00 --------- Entering Environment: RandomSleepCommand
    2024/07/16 17:25:32-07:00 ==============================================
    2024/07/16 17:25:32-07:00 ----------------------------------------------
    2024/07/16 17:25:32-07:00 Phase: Setup
    2024/07/16 17:25:32-07:00 ----------------------------------------------
    2024/07/16 17:25:32-07:00 Writing embedded files for Environment to disk.
    2024/07/16 17:25:32-07:00 Mapping: Env.File.Enter -> /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/embedded_filesf3tq_1os/tmpbt8j_c3f
    2024/07/16 17:25:32-07:00 Mapping: Env.File.SleepScript -> /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/embedded_filesf3tq_1os/tmperastlp4
    2024/07/16 17:25:32-07:00 Wrote: Enter -> /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/embedded_filesf3tq_1os/tmpbt8j_c3f
    2024/07/16 17:25:32-07:00 Wrote: SleepScript -> /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/embedded_filesf3tq_1os/tmperastlp4
    2024/07/16 17:25:32-07:00 ----------------------------------------------
    2024/07/16 17:25:32-07:00 Phase: Running action
    2024/07/16 17:25:32-07:00 ----------------------------------------------
    2024/07/16 17:25:32-07:00 Running command sudo -u job-user -i setsid -w /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/tmpbwrquq5u.sh
    2024/07/16 17:25:32-07:00 Command started as pid: 2205
    2024/07/16 17:25:32-07:00 Output:
    2024/07/16 17:25:33-07:00 openjd_env: PATH=/sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/bin:/opt/conda/condabin:/home/job-user/.local/bin:/home/job-user/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
    No newer logs at this moment.
   ```

    The following lines from the job template specified this action.

   ```
    jobEnvironments:
    - name: RandomSleepCommand
      description: Adds a command 'random-sleep' to the environment.
      script:
        actions:
          onEnter:
            command: bash
            args:
            - "{{Env.File.Enter}}"
        embeddedFiles:
        - name: Enter
          type: TEXT
          data: |
            #!/bin/env bash
            set -euo pipefail
   
            # Make a bin directory inside the session's working directory for providing new commands
            mkdir -p '{{Session.WorkingDirectory}}/bin'
   
            # If this bin directory is not already in the PATH, then add it
            if ! [[ ":$PATH:" == *':{{Session.WorkingDirectory}}/bin:'* ]]; then
              export "PATH={{Session.WorkingDirectory}}/bin:$PATH"
   
              # This message to Open Job Description exports the new PATH value to the environment
              echo "openjd_env: PATH=$PATH"
            fi
   
            # Copy the SleepScript embedded file into the bin directory
            cp '{{Env.File.SleepScript}}' '{{Session.WorkingDirectory}}/bin/random-sleep'
            chmod u+x '{{Session.WorkingDirectory}}/bin/random-sleep'
        - name: SleepScript
          type: TEXT
          runnable: true
          data: |
            ...
   ```

1.  Select the **Task run** session action in the Deadline Cloud monitor. You will see log output as follows.

   ```
    2024/07/16 17:25:33-07:00
    2024/07/16 17:25:33-07:00 ==============================================
    2024/07/16 17:25:33-07:00 --------- Running Task
    2024/07/16 17:25:33-07:00 ==============================================
    2024/07/16 17:25:33-07:00 ----------------------------------------------
    2024/07/16 17:25:33-07:00 Phase: Setup
    2024/07/16 17:25:33-07:00 ----------------------------------------------
    2024/07/16 17:25:33-07:00 Writing embedded files for Task to disk.
    2024/07/16 17:25:33-07:00 Mapping: Task.File.Run -> /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/embedded_filesf3tq_1os/tmpdrwuehjf
    2024/07/16 17:25:33-07:00 Wrote: Run -> /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/embedded_filesf3tq_1os/tmpdrwuehjf
    2024/07/16 17:25:33-07:00 ----------------------------------------------
    2024/07/16 17:25:33-07:00 Phase: Running action
    2024/07/16 17:25:33-07:00 ----------------------------------------------
    2024/07/16 17:25:33-07:00 Running command sudo -u job-user -i setsid -w /sessions/session-ab132a51b9b54d5da22cbe839dd946baaw1c8hk5/tmpz81iaqfw.sh
    2024/07/16 17:25:33-07:00 Command started as pid: 2256
    2024/07/16 17:25:33-07:00 Output:
    2024/07/16 17:25:34-07:00 + random-sleep 12.5 27.5
    2024/07/16 17:26:00-07:00 Sleeping for duration 26.90
    2024/07/16 17:26:00-07:00 ----------------------------------------------
    2024/07/16 17:26:00-07:00 Uploading output files to Job Attachments
    2024/07/16 17:26:00-07:00 ----------------------------------------------
   ```

    The following lines from the job template specified this action.

   ```
    steps:
    - name: EnvWithCommand
      script:
        actions:
          onRun:
            command: bash
            args:
            - '{{Task.File.Run}}'
        embeddedFiles:
        - name: Run
          type: TEXT
          data: |
            set -xeuo pipefail
   
            # Run the script installed into PATH by the job environment
            random-sleep 12.5 27.5
      hostRequirements:
        attributes:
        - name: attr.worker.os.family
          anyOf:
          - linux
   ```

# Run a background daemon process from the queue environment
<a name="run-a-background-daemon-process"></a>

 In many rendering use cases, loading the application and scene data can take a significant amount of time. If a job reloads them for every frame, it will spend most of its time on overhead. It’s often possible to load the application once as a background daemon process, have it load the scene data, and then send it commands via inter-process communication (IPC) to perform the renders. 

 Many of the open source Deadline Cloud integrations use this pattern. The Open Job Description project provides an [adaptor runtime library](https://github.com/OpenJobDescription/openjd-adaptor-runtime-for-python) with robust IPC patterns on all supported operating systems. 

 To demonstrate this pattern, there is a [self-contained sample job bundle](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/job_env_daemon_process/template.yaml) that uses Python and bash code to implement a background daemon and the IPC for tasks to communicate with it. The daemon is implemented in Python and listens for a POSIX SIGUSR1 signal that tells it when to process a task. The task details are passed to the daemon in a JSON file, and the results of running the task are returned in another JSON file.
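
The core of the pattern is a step environment whose `onEnter` action launches the daemon in the background and exports its process ID, and whose `onExit` action stops it. The following is a condensed, hypothetical sketch (the sample's actual scripts are more involved, and the daemon code is omitted):

```
stepEnvironments:
- name: DaemonProcess
  script:
    actions:
      onEnter:
        command: bash
        args:
        - "{{Env.File.Enter}}"
      onExit:
        command: bash
        args:
        - "{{Env.File.Exit}}"
    embeddedFiles:
    - name: Enter
      type: TEXT
      data: |
        #!/bin/env bash
        set -euo pipefail
        # Start the daemon in the background, then export its PID
        # so that tasks and the exit action can signal it.
        nohup python {{Env.File.DaemonScript}} > daemon.log 2>&1 &
        echo "openjd_env: DAEMON_PID=$!"
    - name: Exit
      type: TEXT
      data: |
        #!/bin/env bash
        # Stop the daemon when the environment is exited.
        kill "$DAEMON_PID"
    - name: DaemonScript
      type: TEXT
      data: |
        # Python daemon that waits for SIGUSR1 and processes
        # task details from a JSON file (omitted).
```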

## Prerequisites
<a name="daemon-prerequisites"></a>

 Perform the following steps to run the [sample job bundle with a daemon process](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/job_env_daemon_process/template.yaml) from the Deadline Cloud samples GitHub repository.

1.  If you do not have a Deadline Cloud farm with a queue and associated Linux fleet, follow the guided onboarding experience in the [Deadline Cloud console](https://console.aws.amazon.com/deadlinecloud/home) to create one with default settings. 

1.  If you do not have the Deadline Cloud CLI and Deadline Cloud monitor on your workstation, follow the steps in [Set up Deadline Cloud submitters](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/submitter.html) from the user guide. 

1.  Use `git` to clone the [Deadline Cloud samples GitHub repository](https://github.com/aws-deadline/deadline-cloud-samples). 

   ```
   git clone https://github.com/aws-deadline/deadline-cloud-samples.git
   Cloning into 'deadline-cloud-samples'...
   ...
   cd deadline-cloud-samples/job_bundles
   ```

## Run the daemon sample
<a name="daemon-run-sample"></a>

1.  Use the Deadline Cloud CLI to submit the `job_env_daemon_process` sample.

   ```
   $ deadline bundle submit job_env_daemon_process
   Submitting to Queue: MySampleQueue
   ...
   ```

1.  In the Deadline Cloud monitor application, you will see the new job and can monitor its progress. Once the Linux fleet associated with the queue has a worker available to run the job’s task, it completes in about a minute. With one of the tasks selected, choose the **View logs** option in the top right menu of the tasks panel. 

    On the right there are two session actions, **Launch DaemonProcess** and **Task run**. The log viewer in the center of the window corresponds to the selected session action on the right. 

    Select the option **View logs for all tasks**. The timeline shows the rest of the tasks that ran as part of the session, and the **Shut down DaemonProcess** action that exited the environment.

## View the daemon logs
<a name="daemon-view-logs"></a>

In this section you use the Deadline Cloud monitor to compare each session action with its definition in the job template. This procedure continues from the previous section.

Open the file [job_env_daemon_process/template.yaml](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/job_env_daemon_process/template.yaml) in a text editor to compare each session action with its definition in the job template.

1.  Select the **Launch DaemonProcess** session action in the Deadline Cloud monitor. You will see log output as follows.

   ```
    2024/07/17 16:27:20-07:00
    2024/07/17 16:27:20-07:00 ==============================================
    2024/07/17 16:27:20-07:00 --------- Entering Environment: DaemonProcess
    2024/07/17 16:27:20-07:00 ==============================================
    2024/07/17 16:27:20-07:00 ----------------------------------------------
    2024/07/17 16:27:20-07:00 Phase: Setup
    2024/07/17 16:27:20-07:00 ----------------------------------------------
    2024/07/17 16:27:20-07:00 Writing embedded files for Environment to disk.
    2024/07/17 16:27:20-07:00 Mapping: Env.File.Enter -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/enter-daemon-process-env.sh
    2024/07/17 16:27:20-07:00 Mapping: Env.File.Exit -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/exit-daemon-process-env.sh
    2024/07/17 16:27:20-07:00 Mapping: Env.File.DaemonScript -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/daemon-script.py
    2024/07/17 16:27:20-07:00 Mapping: Env.File.DaemonHelperFunctions -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/daemon-helper-functions.sh
    2024/07/17 16:27:20-07:00 Wrote: Enter -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/enter-daemon-process-env.sh
    2024/07/17 16:27:20-07:00 Wrote: Exit -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/exit-daemon-process-env.sh
    2024/07/17 16:27:20-07:00 Wrote: DaemonScript -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/daemon-script.py
    2024/07/17 16:27:20-07:00 Wrote: DaemonHelperFunctions -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/daemon-helper-functions.sh
    2024/07/17 16:27:20-07:00 ----------------------------------------------
    2024/07/17 16:27:20-07:00 Phase: Running action
    2024/07/17 16:27:20-07:00 ----------------------------------------------
    2024/07/17 16:27:20-07:00 Running command sudo -u job-user -i setsid -w /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/tmp_u8slys3.sh
    2024/07/17 16:27:20-07:00 Command started as pid: 2187
    2024/07/17 16:27:20-07:00 Output:
    2024/07/17 16:27:21-07:00 openjd_env: DAEMON_LOG=/sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/daemon.log
    2024/07/17 16:27:21-07:00 openjd_env: DAEMON_PID=2223
    2024/07/17 16:27:21-07:00 openjd_env: DAEMON_BASH_HELPER_SCRIPT=/sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/daemon-helper-functions.sh
   ```

    The following lines from the job template specified this action.

   ```
      stepEnvironments:
      - name: DaemonProcess
        description: Runs a daemon process for the step's tasks to share.
        script:
          actions:
            onEnter:
              command: bash
              args:
              - "{{Env.File.Enter}}"
            onExit:
              command: bash
              args:
              - "{{Env.File.Exit}}"
          embeddedFiles:
          - name: Enter
            filename: enter-daemon-process-env.sh
            type: TEXT
            data: |
              #!/bin/env bash
              set -euo pipefail
   
              DAEMON_LOG='{{Session.WorkingDirectory}}/daemon.log'
              echo "openjd_env: DAEMON_LOG=$DAEMON_LOG"
              nohup python {{Env.File.DaemonScript}} > $DAEMON_LOG 2>&1 &
              echo "openjd_env: DAEMON_PID=$!"
              echo "openjd_env: DAEMON_BASH_HELPER_SCRIPT={{Env.File.DaemonHelperFunctions}}"
   
              echo 0 > 'daemon_log_cursor.txt'
        ...
   ```

1.  Select one of the **Task run: N** session actions in the Deadline Cloud monitor. You will see log output as follows.

   ```
    2024/07/17 16:27:22-07:00
    2024/07/17 16:27:22-07:00 ==============================================
    2024/07/17 16:27:22-07:00 --------- Running Task
    2024/07/17 16:27:22-07:00 ==============================================
    2024/07/17 16:27:22-07:00 Parameter values:
    2024/07/17 16:27:22-07:00 Frame(INT) = 2
    2024/07/17 16:27:22-07:00 ----------------------------------------------
    2024/07/17 16:27:22-07:00 Phase: Setup
    2024/07/17 16:27:22-07:00 ----------------------------------------------
    2024/07/17 16:27:22-07:00 Writing embedded files for Task to disk.
    2024/07/17 16:27:22-07:00 Mapping: Task.File.Run -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/run-task.sh
    2024/07/17 16:27:22-07:00 Wrote: Run -> /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/embedded_fileswy00x5ra/run-task.sh
    2024/07/17 16:27:22-07:00 ----------------------------------------------
    2024/07/17 16:27:22-07:00 Phase: Running action
    2024/07/17 16:27:22-07:00 ----------------------------------------------
    2024/07/17 16:27:22-07:00 Running command sudo -u job-user -i setsid -w /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/tmpv4obfkhn.sh
    2024/07/17 16:27:22-07:00 Command started as pid: 2301
    2024/07/17 16:27:22-07:00 Output:
    2024/07/17 16:27:23-07:00 Daemon PID is 2223
    2024/07/17 16:27:23-07:00 Daemon log file is /sessions/session-972e21d98dde45e59c7153bd9258a64dohwg4yg1/daemon.log
    2024/07/17 16:27:23-07:00
    2024/07/17 16:27:23-07:00 === Previous output from daemon
    2024/07/17 16:27:23-07:00 ===
    2024/07/17 16:27:23-07:00
    2024/07/17 16:27:23-07:00 Sending command to daemon
    2024/07/17 16:27:23-07:00 Received task result:
    2024/07/17 16:27:23-07:00 {
    2024/07/17 16:27:23-07:00   "result": "SUCCESS",
    2024/07/17 16:27:23-07:00   "processedTaskCount": 1,
    2024/07/17 16:27:23-07:00   "randomValue": 0.2578537967668988,
    2024/07/17 16:27:23-07:00   "failureRate": 0.1
    2024/07/17 16:27:23-07:00 }
    2024/07/17 16:27:23-07:00
    2024/07/17 16:27:23-07:00 === Daemon log from running the task
    2024/07/17 16:27:23-07:00 Loading the task details file
    2024/07/17 16:27:23-07:00 Received task details:
    2024/07/17 16:27:23-07:00 {
    2024/07/17 16:27:23-07:00  "pid": 2329,
    2024/07/17 16:27:23-07:00  "frame": 2
    2024/07/17 16:27:23-07:00 }
    2024/07/17 16:27:23-07:00 Processing frame number 2
    2024/07/17 16:27:23-07:00 Writing result
    2024/07/17 16:27:23-07:00 Waiting until a USR1 signal is sent...
    2024/07/17 16:27:23-07:00 ===
    2024/07/17 16:27:23-07:00
    2024/07/17 16:27:23-07:00 ----------------------------------------------
    2024/07/17 16:27:23-07:00 Uploading output files to Job Attachments
    2024/07/17 16:27:23-07:00 ----------------------------------------------
   ```

    The following lines from the job template specify this action:

   ```
    steps:
    - name: EnvWithDaemonProcess
      parameterSpace:
        taskParameterDefinitions:
        - name: Frame
          type: INT
          range: "{{Param.Frames}}"
   
      stepEnvironments:
        ...
   
      script:
        actions:
          onRun:
            timeout: 60
            command: bash
            args:
            - '{{Task.File.Run}}'
        embeddedFiles:
        - name: Run
          filename: run-task.sh
          type: TEXT
          data: |
            # This bash script sends a task to the background daemon process,
            # then waits for it to respond with the output result.
   
            set -euo pipefail
   
            source "$DAEMON_BASH_HELPER_SCRIPT"
   
            echo "Daemon PID is $DAEMON_PID"
            echo "Daemon log file is $DAEMON_LOG"
   
            print_daemon_log "Previous output from daemon"
   
            send_task_to_daemon "{\"pid\": $$, \"frame\": {{Task.Param.Frame}} }"
            wait_for_daemon_task_result
   
            echo Received task result:
            echo "$TASK_RESULT" | jq .
   
            print_daemon_log "Daemon log from running the task"
   
      hostRequirements:
        attributes:
        - name: attr.worker.os.family
          anyOf:
          - linux
   ```
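The `send_task_to_daemon` and `wait_for_daemon_task_result` helper functions coordinate the foreground task action with the background daemon process. As a rough illustration of that pattern, the following self-contained bash sketch uses named pipes in place of the sample's actual helper implementation. All names and message formats here are illustrative, not the sample's real protocol.

```shell
#!/usr/bin/env bash
# Illustrative sketch only: a background "daemon" and a foreground "task
# action" exchange messages through named pipes. The helper script in the
# sample is more robust than this.
set -euo pipefail

WORKDIR=$(mktemp -d)
mkfifo "$WORKDIR/task_details" "$WORKDIR/task_result"

# Background daemon: wait for one task request, process it, report a result.
(
  read -r frame < "$WORKDIR/task_details"
  printf '{"result": "SUCCESS", "frame": %s}\n' "$frame" > "$WORKDIR/task_result"
) &

# Foreground task action: send the frame number, then block for the result.
echo "2" > "$WORKDIR/task_details"
read -r TASK_RESULT < "$WORKDIR/task_result"
echo "Received task result: $TASK_RESULT"

wait
rm -rf "$WORKDIR"
```

Because each open of a named pipe blocks until the other side connects, the foreground script naturally waits for the daemon without polling, which is the same property the sample's helper functions rely on.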

# Provide applications for your jobs
<a name="provide-applications"></a>

You can use a queue environment to load applications to process your jobs. When you create a service-managed fleet using the Deadline Cloud console, you have the option of creating a queue environment that uses the conda package manager to load applications. 

If you want to use a different package manager, you can create a queue environment for that manager. For an example using Rez, see [Use a different package manager](#provide-applications-other-package).

Deadline Cloud provides a conda channel that loads a selection of rendering applications into your environment. These applications support the Deadline Cloud submitters for digital content creation applications. 

You can also load software from the `conda-forge` channel to use in your jobs. The following examples show job templates that use the queue environment provided by Deadline Cloud to load applications before running the job.

**Topics**
+ [Getting an application from a conda channel](#provide-applications-get-application)
+ [Use a different package manager](#provide-applications-other-package)

## Getting an application from a conda channel
<a name="provide-applications-get-application"></a>

You can create a custom queue environment for your Deadline Cloud workers that installs the software of your choice. This example queue environment has the same behavior as the environment used by the console for service-managed fleets. It runs conda directly to create the environment.

The environment creates a new conda virtual environment for every Deadline Cloud session that runs on a worker, and then deletes the environment when it is done. 

Conda caches the downloaded packages so that they don't need to be downloaded again, but each session must link all of the packages into the environment.

The environment defines three scripts that run when Deadline Cloud starts a session on a worker. The first script runs when the `onEnter` action is called. It calls the other two to set up environment variables. When the script finishes running, the conda environment is available with all of the specified environment variables set.
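For illustration, a queue environment following this pattern might be structured like the abbreviated sketch below. The `CondaPackages` and `CondaChannels` parameter names match the default queue environment, but the script bodies are simplified placeholders; refer to the sample repository for the complete definition.

```yaml
specificationVersion: environment-2023-09
parameterDefinitions:
- name: CondaPackages
  type: STRING
  default: ""
- name: CondaChannels
  type: STRING
  default: ""
environment:
  name: Conda
  script:
    actions:
      onEnter:
        command: bash
        args: ['{{Env.File.Enter}}']
      onExit:
        command: bash
        args: ['{{Env.File.Exit}}']
    embeddedFiles:
    - name: Enter
      filename: conda-env-enter.sh
      type: TEXT
      data: |
        # Simplified placeholder: create a session-scoped conda
        # environment from the requested channels and packages.
        ...
    - name: Exit
      filename: conda-env-exit.sh
      type: TEXT
      data: |
        # Simplified placeholder: remove the session's conda environment.
        ...
```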

For the latest version of the example, see [conda\_queue\_env\_console\_equivalent.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/queue_environments/conda_queue_env_console_equivalent.yaml) in the [deadline-cloud-samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline) repository on GitHub.

If you want to use an application that is not available in the conda channel, you can create a conda channel in Amazon S3 and then build your own packages for that application. See [Create a conda channel using S3](configure-jobs-s3-channel.md) to learn more.

### Get open source libraries from conda-forge
<a name="get-application-csv-example"></a>

This section describes how to use open source libraries from the `conda-forge` channel. The following example is a job template that uses the `polars` Python package.

The job sets the `CondaPackages` and `CondaChannels` parameters defined in the queue environment that tell Deadline Cloud where to get the package.

The section of the job template that sets the parameters is:

```
- name: CondaPackages
  description: A list of conda packages to install. The job expects a Queue Environment to handle this.
  type: STRING
  default: polars
- name: CondaChannels
  description: A list of conda channels to get packages from. The job expects a Queue Environment to handle this.
  type: STRING
  default: conda-forge
```

For the latest version of the complete example job template, see [stage\_1\_self\_contained\_template/template.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/job_bundles/job_dev_progression/stage_1_self_contained_template/template.yaml). For the latest version of the queue environment that loads the conda packages, see [conda\_queue\_env\_console\_equivalent.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/queue_environments/conda_queue_env_console_equivalent.yaml) in the [deadline-cloud-samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline) repository on GitHub. 

### Get Blender from the deadline-cloud channel
<a name="get-application-blender"></a>

The following example shows a job template that gets Blender from the `deadline-cloud` conda channel. This channel supports the submitters that Deadline Cloud provides for digital content creation software, though you can use the same channel to load software for your own use.

For a list of the software provided by the `deadline-cloud` channel, see [Default queue environment](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/create-queue-environment.html#conda-queue-environment) in the *AWS Deadline Cloud User Guide*.

This job sets the `CondaPackages` parameter defined in the queue environment to tell Deadline Cloud to load Blender into the environment. 

The section of the job template that sets the parameter is:

```
- name: CondaPackages
  type: STRING
  userInterface:
    control: LINE_EDIT
    label: Conda Packages
    groupLabel: Software Environment
  default: blender
  description: >
    Tells the queue environment to install Blender from the deadline-cloud conda channel.
```

For the latest version of the complete example job template, see [blender\_render/template.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/job_bundles/blender_render/template.yaml). For the latest version of the queue environment that loads the conda packages, see [conda\_queue\_env\_console\_equivalent.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/queue_environments/conda_queue_env_console_equivalent.yaml) in the [deadline-cloud-samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline) repository on GitHub. 

## Use a different package manager
<a name="provide-applications-other-package"></a>

The default package manager for Deadline Cloud is conda. If you need to use a different package manager, such as Rez, you can create a custom queue environment that contains scripts that use your package manager instead. 

This example queue environment provides the same behavior as the environment used by the console for service-managed fleets. It replaces the conda package manager with Rez. 

The environment defines three scripts that run when Deadline Cloud starts a session on a worker. The first script runs when the `onEnter` action is called. It calls the other two to set up environment variables. When the script finishes running, the Rez environment is available with all of the specified environment variables set.

The example assumes that you have a customer-managed fleet that uses a shared file system for the Rez packages.
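On such a fleet, the environment's setup script might resolve the requested packages with the `rez env` command, along these lines. The package name, script, and output handling here are illustrative only, not taken from the sample.

```
# Illustrative only: resolve a Rez environment containing the requested
# packages, then run a command inside the resolved environment.
rez env blender -- blender --background --python render.py
```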

For the latest version of the example, see [rez\_queue\_env.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/queue_environments/rez_queue_env.yaml) in the [deadline-cloud-samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline) repository on GitHub.