
Supported LLM providers - Generative AI Application Builder on AWS

The solution can integrate with the following LLM providers:

  1. Amazon Bedrock

    • Documentation: https://aws.amazon.com/bedrock/

    • Supported models:

      • Amazon

        • Nova Lite

        • Nova Micro

        • Nova Pro

      • AI21 Labs

        • Jamba 1.5 Mini

        • Jamba 1.5 Large

      • Anthropic

        • Claude v3 Haiku

        • Claude v3.5 Sonnet

        • Claude v3.7 Sonnet (through the use of inference profiles)

      • Cohere

        • Command R

        • Command R+

      • DeepSeek

        • DeepSeek-R1 (through the use of inference profiles)

      • Meta

        • Llama 3

        • Llama 3.2 (through the use of inference profiles)

      • Mistral AI

        • Mistral 7B Instruct

        • Mixtral 8x7B Instruct

      • Cross-region inference

        • Supports inference profiles defined in the same AWS Region as the Deployment dashboard
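
    • As an illustration of how a deployment talks to the Bedrock models listed above, the sketch below builds a request for the bedrock-runtime Converse API. It assumes boto3 is installed and AWS credentials with Bedrock model access are configured; the model and inference-profile IDs (the "us." prefix marks a cross-region inference profile) are typical values, not taken from this page, and should be verified in the Bedrock console for your Region.

```python
# Model IDs for a few of the models listed above. All IDs here are
# assumptions; check them against the Bedrock console in your Region.
MODEL_IDS = {
    "nova-lite": "amazon.nova-lite-v1:0",
    "claude-3-haiku": "anthropic.claude-3-haiku-20240307-v1:0",
    "claude-3-7-sonnet": "us.anthropic.claude-3-7-sonnet-20250219-v1:0",
    "deepseek-r1": "us.deepseek.r1-v1:0",
}

def build_converse_request(model_key: str, prompt: str) -> dict:
    """Build keyword arguments for the bedrock-runtime Converse API."""
    return {
        "modelId": MODEL_IDS[model_key],
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.5},
    }

def converse(model_key: str, prompt: str, region: str = "us-east-1") -> str:
    """Call Amazon Bedrock; needs boto3, AWS credentials, and model access."""
    import boto3  # imported lazily so the helper above stays testable offline
    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(**build_converse_request(model_key, prompt))
    return response["output"]["message"]["content"][0]["text"]
```

    Because build_converse_request is a plain helper, the request shape can be inspected or unit-tested without calling AWS.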

  2. Amazon SageMaker AI

    • Documentation: https://aws.amazon.com/sagemaker/

    • Supported models: text-to-text models
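
    • Calling a text-to-text model hosted on SageMaker AI means invoking its endpoint directly. The sketch below assumes boto3 and AWS credentials; the "inputs"/"parameters" request shape and the "generated_text" response field follow the common Hugging Face TGI convention and are assumptions — the actual contract depends on the model you deployed.

```python
import json

def build_payload(prompt: str, max_new_tokens: int = 256) -> dict:
    # TGI-style request body; an assumption — match your endpoint's schema.
    return {"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}}

def invoke_text_endpoint(endpoint_name: str, prompt: str,
                         region: str = "us-east-1") -> str:
    """Invoke a deployed SageMaker AI text-to-text endpoint."""
    import boto3  # imported lazily; needs boto3 and AWS credentials
    client = boto3.client("sagemaker-runtime", region_name=region)
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=json.dumps(build_payload(prompt)),
    )
    body = json.loads(response["Body"].read())
    # Response shape is also model-specific; TGI-style containers return
    # a list of {"generated_text": ...} objects.
    return body[0]["generated_text"]
```

    The payload builder is separated out so the request body can be adapted (or tested) without touching the network call.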

For the latest model parameters, best practices, and recommended uses, refer to the documentation from the model providers.