

# Making AWS service requests using the AWS SDK for PHP Version 3
<a name="making-service-requests"></a>

## SDK request workflow overview
<a name="usage-summary"></a>

Working with the AWS SDK for PHP Version 3 follows a consistent pattern across all AWS services. The basic workflow involves three main steps:

1. [**Create a service client**](#creating-a-client)—Instantiate a **Client** object for the AWS service you want to use.

1. [**Execute operations**](#executing-service-operations)—Call methods on the client that correspond to operations in the service's API.

1. [**Process results**](#result-objects)—Work with the array-like **Result** object returned on success, or handle any **Exception** thrown on failure.

The following sections explain each of these steps in detail, starting with how to create and configure service clients.

## Creating a basic service client
<a name="creating-a-client"></a>

You can create a client by passing an associative array of options to a client’s constructor.

 **Imports** 

```
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
```

 **Sample Code** 

```
// Create an S3Client. Since version 3.277.10 of the SDK,
// the 'version' parameter defaults to 'latest'.
$s3 = new S3Client([
    'region' => 'us-east-2'
]);
```

Information about the optional "version" parameter is available in the [configuration options](guide_configuration.md#cfg-version) topic.

Notice that we did **not** explicitly provide credentials to the client. That’s because the SDK uses the [default credential provider chain](guide_credentials_default_chain.md) to look for credential information.
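If you do need to supply credentials explicitly (for example, in a short-lived test script), you can pass a `credentials` option to the constructor. The following is a minimal sketch; the key and secret values are placeholders, and hard-coding real credentials is discouraged in favor of the default provider chain.

```
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Explicit credentials bypass the default provider chain.
// The values below are placeholders, not real credentials.
$s3 = new S3Client([
    'region'      => 'us-east-2',
    'credentials' => [
        'key'    => '<your-access-key-id>',
        'secret' => '<your-secret-access-key>'
    ]
]);
```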

All of the general client configuration options are described in detail in [Client constructor options for the AWS SDK for PHP Version 3](guide_configuration.md). The array of options provided to a client can vary based on which client you’re creating. These custom client configuration options are described in the [API documentation](https://docs.aws.amazon.com/aws-sdk-php/latest/) for each client.

While the example above shows basic client creation, you can customize your service clients to meet specific requirements. For more detailed information about configuring service clients through code, see [Configuring service clients in code for the AWS SDK for PHP Version 3](configuring-service-clients-code.md). If you need to configure service clients using external configuration files or environment variables, refer to [Configuring service clients for the AWS SDK for PHP Version 3 externally](configuring-service-clients-ext.md).

## Making requests
<a name="executing-service-operations"></a>

You can make service requests by calling the method of the same name on a client object. For example, to perform the Amazon S3 [PutObject operation](https://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html), you call the `Aws\S3\S3Client::putObject()` method.

 **Imports** 

```
require 'vendor/autoload.php';

use Aws\S3\S3Client;
```

 **Sample Code** 

```
// Use the us-east-2 region and latest version of each client.
$sharedConfig = [
    'profile' => 'default',
    'region' => 'us-east-2'
];

// Create an SDK class used to share configuration across clients.
$sdk = new Aws\Sdk($sharedConfig);

// Use an Aws\Sdk class to create the S3Client object.
$s3Client = $sdk->createS3();

// Send a PutObject request and get the result object.
$result = $s3Client->putObject([
    'Bucket' => 'amzn-s3-demo-bucket',
    'Key' => 'my-key',
    'Body' => 'this is the body!'
]);

// Download the contents of the object.
$result = $s3Client->getObject([
    'Bucket' => 'amzn-s3-demo-bucket',
    'Key' => 'my-key'
]);

// Print the body of the result by indexing into the result object.
echo $result['Body'];
```

Operations available to a client and the structure of the input and output are defined at runtime based on a service description file. When creating a client, if you don't provide a `version` parameter (for example, `2006-03-01` or `latest`) for the service model, the client defaults to the latest version. The SDK finds the corresponding configuration file based on the provided version.

Operation methods like `putObject()` all accept a single argument, an associative array that represents the parameters of the operation. The structure of this array (and the structure of the result object) is defined for each operation in the SDK’s API Documentation (e.g., see the API docs for [putObject operation](https://docs.aws.amazon.com/aws-sdk-php/v3/api/api-s3-2006-03-01.html#putobject)).
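On failure, operation methods throw a subclass of `Aws\Exception\AwsException`, so calls are typically wrapped in a `try`/`catch` block. The following is a minimal sketch using the placeholder bucket and key from the example above.

```
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$s3Client = new S3Client(['region' => 'us-east-2']);

try {
    $result = $s3Client->putObject([
        'Bucket' => 'amzn-s3-demo-bucket',
        'Key'    => 'my-key',
        'Body'   => 'this is the body!'
    ]);
} catch (AwsException $e) {
    // getAwsErrorCode() returns the service-level error code,
    // such as "NoSuchBucket" or "AccessDenied".
    echo $e->getAwsErrorCode() . ': ' . $e->getMessage() . "\n";
}
```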

### HTTP handler options
<a name="http-handler-options"></a>

You can also fine-tune how the underlying HTTP handler executes the request by using the special `@http` parameter. The options you can include in the `@http` parameter are the same as the ones you can set when you instantiate the client with the [“http” client option](guide_configuration.md#config-http).

```
// Send the request through a proxy
$result = $s3Client->putObject([
    'Bucket' => 'amzn-s3-demo-bucket',
    'Key'    => 'my-key',
    'Body'   => 'this is the body!',
    '@http'  => [
        'proxy' => 'http://192.168.16.1:10'
    ]
]);
```

## Working with result objects
<a name="result-objects"></a>

Executing a successful operation returns an `Aws\Result` object. Instead of returning the raw XML or JSON data of a service, the SDK coerces the response data into an associative array structure. It normalizes some aspects of the data based on its knowledge of the specific service and the underlying response structure.

You can access data from the `Aws\Result` object as you would an associative PHP array.

 **Imports** 

```
require 'vendor/autoload.php';
use Aws\S3\S3Client;
use Aws\Exception\AwsException;
```

 **Sample Code** 

```
// Use the us-east-2 region and latest version of each client.
$sharedConfig = [
    'profile' => 'default',
    'region' => 'us-east-2',
];

// Create an SDK class used to share configuration across clients.
$sdk = new Aws\Sdk($sharedConfig);

// Use an Aws\Sdk class to create the S3Client object.
$s3 = $sdk->createS3();
$result = $s3->listBuckets();
foreach ($result['Buckets'] as $bucket) {
    echo $bucket['Name'] . "\n";
}

// Convert the result object to a PHP array
$array = $result->toArray();
```

The contents of the result object depend on the operation that was executed and the version of the service. The result structure of each API operation is documented in the API docs for each operation.

The SDK is integrated with [JMESPath](http://jmespath.org/), a [DSL](http://en.wikipedia.org/wiki/Domain-specific_language) used to search and manipulate JSON data or, in our case, PHP arrays. The result object contains a `search()` method you can use to more declaratively extract data from the result.

 **Sample Code** 

```
$s3 = $sdk->createS3();
$result = $s3->listBuckets();
$names = $result->search('Buckets[].Name');
```
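JMESPath expressions can also filter and reshape data. The following sketch builds an `Aws\Result` by hand, standing in for a real `listBuckets` response, so the expression can be tried without making a request; the bucket names are made up for illustration.

```
require 'vendor/autoload.php';

use Aws\Result;

// A hand-built Result standing in for a real listBuckets response.
$result = new Result([
    'Buckets' => [
        ['Name' => 'my-logs'],
        ['Name' => 'archive'],
        ['Name' => 'my-data'],
    ]
]);

// Keep only the names that start with "my-"
$names = $result->search("Buckets[?starts_with(Name, 'my-')].Name");
// $names is now ['my-logs', 'my-data']
```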

# Command objects in the AWS SDK for PHP Version 3
<a name="guide_commands"></a>

The AWS SDK for PHP uses the [command pattern](http://en.wikipedia.org/wiki/Command_pattern) to encapsulate the parameters and handler that will be used to transfer an HTTP request at a later point in time.

## Implicit use of commands
<a name="implicit-use-of-commands"></a>

If you examine any client class, you can see that the methods corresponding to API operations don’t actually exist. They are implemented using the `__call()` magic method. These pseudo-methods are actually shortcuts that encapsulate the SDK’s use of command objects.

You don’t typically need to interact with command objects directly. When you call methods like `Aws\S3\S3Client::putObject()`, the SDK actually creates an `Aws\CommandInterface` object based on the provided parameters, executes the command, and returns a populated `Aws\ResultInterface` object (or throws an exception on error). A similar flow occurs when calling any of the `Async` methods of a client (e.g., `Aws\S3\S3Client::putObjectAsync()`): the client creates a command based on the provided parameters, serializes an HTTP request, initiates the request, and returns a promise.
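For example, an asynchronous upload might look like the following sketch, which reuses the client configuration and placeholder bucket from the earlier examples.

```
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3Client = new S3Client(['region' => 'us-east-2']);

// putObjectAsync() returns a promise immediately instead of blocking.
$promise = $s3Client->putObjectAsync([
    'Bucket' => 'amzn-s3-demo-bucket',
    'Key'    => 'my-key',
    'Body'   => 'this is the body!'
]);

// Register callbacks for fulfillment and rejection.
$promise->then(
    function ($result) { echo "Upload complete\n"; },
    function ($reason) { echo "Upload failed\n"; }
);

// Block until the promise settles. Note that wait() re-throws
// the exception if the promise was rejected.
$promise->wait();
```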

The following examples are functionally equivalent.

```
$s3Client = new Aws\S3\S3Client([
    'version' => '2006-03-01',
    'region'  => 'us-east-1'
]);

$params = [
    'Bucket' => 'amzn-s3-demo-bucket',
    'Key'    => 'baz',
    'Body'   => 'bar'
];

// Using operation methods creates a command implicitly
$result = $s3Client->putObject($params);

// Using commands explicitly
$command = $s3Client->getCommand('PutObject', $params);
$result = $s3Client->execute($command);
```

## Command parameters
<a name="command-parameters"></a>

All commands support a few special parameters that are not part of a service’s API but instead control the SDK’s behavior.

### `@http`
<a name="http"></a>

Using this parameter, it’s possible to fine-tune how the underlying HTTP handler executes the request. The options you can include in the `@http` parameter are the same as the ones you can set when you instantiate the client with the [“http” client option](guide_configuration.md#config-http).

```
// Configures the command to be delayed by 500 milliseconds
$command['@http'] = [
    'delay' => 500,
];
```

### `@retries`
<a name="retries"></a>

Like the [“retries” client option](guide_configuration.md#config-retries), `@retries` controls how many times a command can be retried before it is considered to have failed. Set it to `0` to disable retries.

```
// Disable retries
$command['@retries'] = 0;
```

**Note**  
If you have disabled retries on a client, you cannot selectively enable them on individual commands passed to that client.

## Creating command objects
<a name="creating-command-objects"></a>

You can create a command using a client’s `getCommand()` method. Calling `getCommand()` doesn’t immediately execute the command or transfer an HTTP request; the command is executed only when it is passed to the `execute()` method of the client. This gives you the opportunity to modify the command object before executing it.

```
$command = $s3Client->getCommand('ListObjects');
$command['MaxKeys'] = 50;
$command['Prefix'] = 'foo/baz/';
$result = $s3Client->execute($command);

// You can also supply parameters when creating the command and modify them later
$command = $s3Client->getCommand('ListObjects', [
    'MaxKeys' => 50,
    'Prefix'  => 'foo/baz/',
]);
$command['MaxKeys'] = 100;
$result = $s3Client->execute($command);
```

## Command `HandlerList`
<a name="command-handlerlist"></a>

When a command is created from a client, it is given a **clone** of the client’s `Aws\HandlerList` object. Cloning the handler list allows a command to use custom middleware and handlers without affecting other commands that the client executes.

This means that you can use a different HTTP client per command (e.g., `Aws\MockHandler`) and add custom behavior per command through middleware. The following example uses a `MockHandler` to create mock results instead of sending actual HTTP requests.

```
use Aws\Result;
use Aws\MockHandler;

// Create a mock handler
$mock = new MockHandler();
// Enqueue a mock result to the handler
$mock->append(new Result(['foo' => 'bar']));
// Create a "ListObjects" command
$command = $s3Client->getCommand('ListObjects');
// Associate the mock handler with the command
$command->getHandlerList()->setHandler($mock);
// Executing the command will use the mock handler, which returns the
// mocked result object
$result = $s3Client->execute($command);

echo $result['foo']; // Outputs 'bar'
```

In addition to changing the handler that the command uses, you can also inject custom middleware to the command. The following example uses the `tap` middleware, which functions as an observer in the handler list.

```
use Aws\CommandInterface;
use Aws\Middleware;
use Psr\Http\Message\RequestInterface;

$command = $s3Client->getCommand('ListObjects');
$list = $command->getHandlerList();

// Create a middleware that just dumps the command and request that is
// about to be sent
$middleware = Middleware::tap(
    function (CommandInterface $command, RequestInterface $request) {
        var_dump($command->toArray());
        var_dump($request);
    }
);

// Append the middleware to the "sign" step of the handler list. The sign
// step is the last step before transferring an HTTP request.
$list->append('sign', $middleware);

// Now transfer the command and see the var_dump data
$s3Client->execute($command);
```

## `CommandPool`
<a name="command-pool"></a>

The `Aws\CommandPool` enables you to execute commands concurrently using an iterator that yields `Aws\CommandInterface` objects. The `CommandPool` ensures that a constant number of commands are executed concurrently while iterating over the commands in the pool (as commands complete, more are executed to ensure a constant pool size).

Here’s a very simple example of just sending a few commands using a `CommandPool`.

```
use Aws\S3\S3Client;
use Aws\CommandPool;

// Create the client
$client = new S3Client([
    'region'  => 'us-east-1',
    'version' => '2006-03-01'
]);

$bucket = 'amzn-s3-demo-bucket';
$commands = [
    $client->getCommand('HeadObject', ['Bucket' => $bucket, 'Key' => 'a']),
    $client->getCommand('HeadObject', ['Bucket' => $bucket, 'Key' => 'b']),
    $client->getCommand('HeadObject', ['Bucket' => $bucket, 'Key' => 'c'])
];

$pool = new CommandPool($client, $commands);

// Initiate the pool transfers
$promise = $pool->promise();

// Force the pool to complete synchronously
$promise->wait();
```

That example is pretty underpowered for the `CommandPool`. Let’s try a more complex example. Let’s say you want to upload files on disk to an Amazon S3 bucket. To get a list of files from disk, we can use PHP’s `DirectoryIterator`. This iterator yields `SplFileInfo` objects. The `CommandPool` accepts an iterator that yields `Aws\CommandInterface` objects, so we map over the `SplFileInfo` objects to return `Aws\CommandInterface` objects.

```
<?php
require 'vendor/autoload.php';

use Aws\Exception\AwsException;
use Aws\S3\S3Client;
use Aws\CommandPool;
use Aws\CommandInterface;
use Aws\ResultInterface;
use GuzzleHttp\Promise\PromiseInterface;

// Create the client
$client = new S3Client([
    'region'  => 'us-east-1',
    'version' => '2006-03-01'
]);

$fromDir = '/path/to/dir';
$toBucket = 'amzn-s3-demo-bucket';

// Create an iterator that yields files from a directory
$files = new DirectoryIterator($fromDir);

// Create a generator that converts the SplFileInfo objects into
// Aws\CommandInterface objects. This generator accepts the iterator that
// yields files and the name of the bucket to upload the files to.
$commandGenerator = function (\Iterator $files, $bucket) use ($client) {
    foreach ($files as $file) {
        // Skip "." and ".." files
        if ($file->isDot()) {
            continue;
        }
        $filename = $file->getPath() . '/' . $file->getFilename();
        // Yield a command that is executed by the pool
        yield $client->getCommand('PutObject', [
            'Bucket' => $bucket,
            'Key'    => $file->getBaseName(),
            'Body'   => fopen($filename, 'r')
        ]);
    }
};

// Now create the generator using the files iterator
$commands = $commandGenerator($files, $toBucket);

// Create a pool and provide an optional array of configuration
$pool = new CommandPool($client, $commands, [
    // Only send 5 files at a time (this is set to 25 by default)
    'concurrency' => 5,
    // Invoke this function before executing each command
    'before' => function (CommandInterface $cmd, $iterKey) {
        echo "About to send {$iterKey}: "
            . print_r($cmd->toArray(), true) . "\n";
    },
    // Invoke this function for each successful transfer
    'fulfilled' => function (
        ResultInterface $result,
        $iterKey,
        PromiseInterface $aggregatePromise
    ) {
        echo "Completed {$iterKey}: {$result}\n";
    },
    // Invoke this function for each failed transfer
    'rejected' => function (
        AwsException $reason,
        $iterKey,
        PromiseInterface $aggregatePromise
    ) {
        echo "Failed {$iterKey}: {$reason}\n";
    },
]);

// Initiate the pool transfers
$promise = $pool->promise();

// Force the pool to complete synchronously
$promise->wait();

// Or you can chain the calls off of the pool
$promise->then(function() { echo "Done\n"; });
```

### `CommandPool` configuration
<a name="commandpool-configuration"></a>

The `Aws\CommandPool` constructor accepts various configuration options.

**concurrency (callable|int)**  
Maximum number of commands to execute concurrently. Provide a function to resize the pool dynamically. The function is provided the current number of pending requests and is expected to return an integer representing the new pool size limit.

**before (callable)**  
Function to invoke before sending each command. The `before` function accepts the command and the iterator key of the command. You can mutate the command as needed in the `before` function before it is sent.

**fulfilled (callable)**  
Function to invoke when a promise is fulfilled. The function is provided the result object, the ID of the iterator that the result came from, and the aggregate promise that can be resolved or rejected if you need to short-circuit the pool.

**rejected (callable)**  
Function to invoke when a promise is rejected. The function is provided an `Aws\Exception\AwsException` object, the ID of the iterator that the exception came from, and the aggregate promise that can be resolved or rejected if you need to short-circuit the pool.
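Because `concurrency` accepts a callable, you can resize the pool dynamically. The following sketch (reusing the `$client` and `$commands` from the earlier example) shrinks the pool as the queue drains; the thresholds are arbitrary and only illustrative.

```
$pool = new CommandPool($client, $commands, [
    // Use a larger pool while many commands are pending,
    // then drop to a smaller one (thresholds are illustrative).
    'concurrency' => function ($pendingCount) {
        return $pendingCount > 100 ? 25 : 5;
    },
]);

$pool->promise()->wait();
```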

### Manual garbage collection between commands
<a name="manual-garbage-collection-between-commands"></a>

If you are hitting the memory limit with large command pools, this may be due to cyclic references generated by the SDK not yet having been collected by the [PHP garbage collector](https://www.php.net/manual/en/features.gc.php) when your memory limit was hit. Manually invoking the collection algorithm between commands may allow the cycles to be collected before hitting that limit. The following example creates a `CommandPool` that invokes the collection algorithm using a callback before sending each command. Note that invoking the garbage collector does come with a performance cost, and optimal usage will depend on your use case and environment.

```
$pool = new CommandPool($client, $commands, [
    'concurrency' => 25,
    'before' => function (CommandInterface $cmd, $iterKey) {
        gc_collect_cycles();
    }
]);
```