

# Choosing a DLAMI

We offer a range of DLAMI options as mentioned in the [GPU DLAMI release notes](https://docs.aws.amazon.com/dlami/latest/devguide/appendix-ami-release-notes.html#appendix-ami-release-notes-gpu). To help you select the correct DLAMI for your use case, we group images by the hardware type or functionality for which they were developed. Our top level groupings are:
+ **DLAMI Type:** Base, Single-Framework, Multi-Framework (Conda DLAMI)
+ **Compute Architecture:** x86-based, Arm64-based [AWS Graviton](https://aws.amazon.com/ec2/graviton/)
+ **Processor Type:** [GPU](https://docs.aws.amazon.com/dlami/latest/devguide/gpu), [CPU](https://docs.aws.amazon.com/dlami/latest/devguide/cpu), [Inferentia](https://docs.aws.amazon.com/dlami/latest/devguide/inferentia), [Trainium](https://docs.aws.amazon.com/dlami/latest/devguide/trainium)
+ **SDK:** [CUDA](https://developer.nvidia.com/cuda-toolkit), [AWS Neuron](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/neuron-intro/get-started.html)
+ **OS:** Amazon Linux, Ubuntu
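Once you have narrowed down these groupings, you can look up matching images programmatically. The sketch below, which assumes the AWS SDK for Python (`boto3`) is available and uses an illustrative name pattern and Region (not the definitive way AWS catalogs DLAMIs), lists the most recent Amazon-owned AMIs whose names match a DLAMI-style filter:

```python
# Hedged sketch: list recent Amazon-owned AMIs matching a DLAMI-style name
# pattern. The pattern and Region below are illustrative assumptions.
def find_dlamis(name_pattern="Deep Learning*", region="us-east-1", limit=3):
    try:
        import boto3  # AWS SDK for Python; may not be installed locally
    except ImportError:
        return []
    ec2 = boto3.client("ec2", region_name=region)
    try:
        resp = ec2.describe_images(
            Owners=["amazon"],
            Filters=[
                {"Name": "name", "Values": [name_pattern]},
                {"Name": "state", "Values": ["available"]},
            ],
        )
    except Exception:
        return []  # no credentials or no network access; nothing to report
    # Sort newest first so the most recently published images come back first
    newest_first = sorted(resp["Images"], key=lambda i: i["CreationDate"], reverse=True)
    return [(img["Name"], img["ImageId"]) for img in newest_first[:limit]]

print(find_dlamis())
```

Adjusting `name_pattern` (for example, to include an OS or framework name) narrows the results to one of the groupings above.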

The remaining topics in this guide describe each of these options in more detail. 

**Topics**
+ [CUDA Installations and Framework Bindings](overview-cuda.md)
+ [Deep Learning Base AMI](overview-base.md)
+ [Deep Learning AMI with Conda](overview-conda.md)
+ [DLAMI Architecture Options](overview-architecture.md)
+ [DLAMI Operating System Options](overview-os.md)

**Next Up**  
[Deep Learning AMI with Conda](overview-conda.md)

# CUDA Installations and Framework Bindings


Although deep learning moves quickly, each framework offers "stable" releases. These stable releases may not work with the latest CUDA or cuDNN implementations and features. Your use case and the features you require can help you choose a framework. If you are not sure, use the latest Deep Learning AMI with Conda. It has official `pip` binaries for all frameworks with CUDA, using the most recent version that each framework supports. If you want the latest versions and the ability to customize your deep learning environment, use the Deep Learning Base AMI.

Look at our guide on [Stable Versus Release Candidates](overview-conda.md#overview-conda-stability) for further guidance.
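To see which CUDA build, if any, your installed frameworks were compiled against, you can inspect them from Python. This is a hedged sketch, not an official DLAMI tool; frameworks that are not installed are simply skipped:

```python
import importlib.util

# Hedged sketch: report which CUDA build, if any, each installed framework
# was compiled against. Missing frameworks are skipped rather than failing.
def framework_cuda_report():
    report = {}
    if importlib.util.find_spec("torch"):
        import torch
        report["torch"] = {
            "version": torch.__version__,
            "built_for_cuda": torch.version.cuda,   # None on CPU-only builds
            "gpu_visible": torch.cuda.is_available(),
        }
    if importlib.util.find_spec("tensorflow"):
        import tensorflow as tf
        report["tensorflow"] = {
            "version": tf.__version__,
            "built_with_cuda": tf.test.is_built_with_cuda(),
        }
    return report

print(framework_cuda_report())
```

A mismatch between the CUDA version a framework was built for and the toolkit on the instance is a common reason a "stable" release fails to use the GPU.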

## Choose a DLAMI with CUDA


The [Deep Learning Base AMI](overview-base.md) and the [Deep Learning AMI with Conda](overview-conda.md) both include all available CUDA version series.
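To see which CUDA toolkits an image actually ships, you can look for them on disk. This sketch assumes the conventional NVIDIA layout of `/usr/local/cuda-<version>` directories, which is how the DLAMIs are typically organized but is not guaranteed:

```python
from pathlib import Path

# Hedged sketch: NVIDIA toolkits are conventionally installed under
# /usr/local/cuda-<version>; this layout is an assumption about the image.
def installed_cuda_versions(root="/usr/local"):
    versions = []
    for p in Path(root).glob("cuda-*"):
        if p.is_dir():
            versions.append(p.name.removeprefix("cuda-"))
    return sorted(versions)

print(installed_cuda_versions())
```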

**Note**  
We no longer include the MXNet, CNTK, Caffe, Caffe2, Theano, Chainer, or Keras Conda environments in the AWS Deep Learning AMIs.

For specific framework version numbers, see the [Deep Learning AMIs Release Notes](appendix-ami-release-notes.md).

Choose one of the CUDA versions and review the full list of DLAMIs that have that version in the **Appendix**, or learn more about the different DLAMIs with the **Next Up** option.

**Next Up**  
[Deep Learning Base AMI](overview-base.md)

## Related Topics

+ For instructions on switching between CUDA versions, refer to the [Using the Deep Learning Base AMI](tutorial-base.md) tutorial.

# Deep Learning Base AMI

The Deep Learning Base AMI is like an empty canvas for deep learning. It comes with everything you need up until the point of the installation of a particular framework, and has your choice of CUDA versions. 

## Why Choose the Base DLAMI


This AMI group is useful for project contributors who want to fork a deep learning project and build it from the latest source. It's for someone who wants to roll their own environment with the confidence that the latest NVIDIA software is installed and working, so they can focus on picking which frameworks and versions to install. 

Choose this DLAMI type or learn more about the different DLAMIs with the **Next Up** option.

**Next Up**  
[DLAMI with Conda](https://docs.aws.amazon.com/dlami/latest/devguide/overview-conda.html)

## Related Topics

+ [Using the Deep Learning Base AMI](https://docs.aws.amazon.com/dlami/latest/devguide/tutorial-base.html)

# Deep Learning AMI with Conda

The Conda DLAMI uses `conda` virtual environments and is available in both multi-framework and single-framework variants. These environments are configured to keep the different framework installations separate and to streamline switching between frameworks. This is great for learning and experimenting with all of the frameworks the DLAMI has to offer. Most users find that the new Deep Learning AMI with Conda is perfect for them. 

They are updated often with the latest versions of the frameworks, and have the latest GPU drivers and software. They are generally referred to as *the* AWS Deep Learning AMIs in most documents. These DLAMIs support the Ubuntu 20.04, Ubuntu 22.04, Amazon Linux 2, and Amazon Linux 2023 operating systems. Operating system support depends on upstream OS support.
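You can discover which environments an image provides by scanning the Conda installation on disk. This is a hedged sketch: the install locations below are common conventions, and the per-framework environment names on a given DLAMI (for example, names like `pytorch_p310`) vary by release:

```python
from pathlib import Path

# Hedged sketch: enumerate Conda environments by scanning common install
# locations. The paths (and any environment names found) are assumptions
# about how a given image lays out its environments.
def list_conda_envs():
    roots = [Path.home() / d / "envs" for d in ("anaconda3", "miniconda3", "miniforge3")]
    envs = set()
    for root in roots:
        if root.is_dir():
            envs.update(p.name for p in root.iterdir() if p.is_dir())
    return sorted(envs)

print(list_conda_envs())
```

On the DLAMI itself, `conda env list` gives the same information, and `conda activate <name>` switches into an environment.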

## Stable Versus Release Candidates


The Conda AMIs use optimized binaries of the most recent formal release from each framework. Do not expect release candidates or experimental features. The optimizations depend on the framework's support for acceleration technologies such as Intel's MKL-DNN, which speeds up training and inference on C5 and C4 CPU instance types. The binaries are also compiled to support advanced Intel instruction sets including, but not limited to, AVX, AVX-2, SSE4.1, and SSE4.2. These accelerate vector and floating point operations on Intel CPU architectures. Additionally, for GPU instance types, CUDA and cuDNN are updated to whichever version the latest official release supports. 
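Whether those instruction-set optimizations can take effect depends on the CPU underneath. The following sketch (Linux x86 only, reading `/proc/cpuinfo`; it returns an empty set elsewhere) checks which of the extensions mentioned above the CPU advertises:

```python
from pathlib import Path

# Hedged sketch (Linux x86 only): check which of the instruction sets
# mentioned above the CPU advertises, by reading /proc/cpuinfo.
def cpu_vector_extensions(cpuinfo="/proc/cpuinfo"):
    wanted = {"avx", "avx2", "sse4_1", "sse4_2"}
    try:
        text = Path(cpuinfo).read_text()
    except OSError:
        return set()  # not Linux, or file unavailable
    for line in text.splitlines():
        if line.startswith("flags"):
            return wanted & set(line.split(":", 1)[1].split())
    return set()  # no x86-style flags line (e.g. on Arm)

print(sorted(cpu_vector_extensions()))
```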

The Deep Learning AMI with Conda automatically installs the most optimized version of the framework for your Amazon EC2 instance upon the framework's first activation. For more information, refer to [Using the Deep Learning AMI with Conda](tutorial-conda.md). 

If you want to install from source using custom or optimized build options, the [Deep Learning Base AMI](overview-base.md) might be a better option for you.

## Python 2 Deprecation


The Python open source community officially ended support for Python 2 on January 1, 2020. The TensorFlow and PyTorch communities have announced that TensorFlow 2.1 and PyTorch 1.4 are the last releases supporting Python 2. Previous releases of the DLAMI (v26, v25, etc.) that contain Python 2 Conda environments continue to be available. However, we provide updates to the Python 2 Conda environments on previously published DLAMI versions only if there are security fixes published by the open source community for those versions. DLAMI releases with the latest versions of the TensorFlow and PyTorch frameworks do not contain the Python 2 Conda environments.

## CUDA Support


Specific CUDA version numbers can be found in the [GPU DLAMI release notes](https://docs.aws.amazon.com/dlami/latest/devguide/appendix-ami-release-notes.html#appendix-ami-release-notes-gpu).

**Next Up**  
[DLAMI Architecture Options](overview-architecture.md)

## Related Topics

+ For a tutorial on using a Deep Learning AMI with Conda, see the [Using the Deep Learning AMI with Conda](tutorial-conda.md) tutorial.

# DLAMI Architecture Options

AWS Deep Learning AMIs are offered with either x86-based or Arm64-based [AWS Graviton2](https://aws.amazon.com/ec2/graviton/) architectures.

For information about getting started with the ARM64 GPU DLAMI, see [The ARM64 DLAMI](tutorial-arm64.md). For more details on available instance types, see [Choosing a DLAMI instance type](instance-select.md).
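When you connect to an instance and are unsure which family it belongs to, the machine architecture tells you. This sketch simply maps the value reported by the standard library to the two DLAMI architecture families described above:

```python
import platform

# Hedged sketch: map the local machine architecture string to the two
# DLAMI architecture families (x86 and Arm64/Graviton).
def dlami_architecture():
    machine = platform.machine().lower()
    if machine in ("x86_64", "amd64"):
        return "x86"
    if machine in ("aarch64", "arm64"):
        return "arm64 (Graviton)"
    return machine  # unrecognized; report the raw value

print(dlami_architecture())
```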

**Next Up**  
[DLAMI Operating System Options](overview-os.md)

# DLAMI Operating System Options

DLAMIs are offered with the following operating systems:
+ Amazon Linux 2
+ Amazon Linux 2023
+ Ubuntu 20.04
+ Ubuntu 22.04
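All of these distributions identify themselves through the standard `/etc/os-release` file, so you can confirm which OS an image runs from a script. A minimal sketch (it returns an empty dict on hosts without that file):

```python
from pathlib import Path

# Hedged sketch: identify the running OS by parsing /etc/os-release, the
# standard identification file on the distributions listed above.
def os_release(path="/etc/os-release"):
    info = {}
    try:
        lines = Path(path).read_text().splitlines()
    except OSError:
        return info  # file absent (e.g. non-Linux host)
    for line in lines:
        if "=" in line and not line.startswith("#"):
            key, _, value = line.partition("=")
            info[key] = value.strip().strip('"')
    return info

print(os_release().get("PRETTY_NAME", "unknown"))
```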

Older versions of operating systems are available on deprecated DLAMIs. For more information on DLAMI deprecation, see [Deprecations for DLAMI](https://docs.aws.amazon.com/dlami/latest/devguide/deprecations.html).

Before choosing a DLAMI, assess what instance type you need and identify your AWS Region.

**Next Up**  
[Choosing a DLAMI instance type](instance-select.md)