Examples | Available Images | AWS Doc
- [2025/11/20] We released the v0.11.2 vLLM DLC, available for EC2/EKS/ECS at `public.ecr.aws/deep-learning-containers/vllm:0.11.2-gpu-py312-ec2` and for SageMaker at `public.ecr.aws/deep-learning-containers/vllm:0.11.2-gpu-py312` (see the pull-and-run sketch after this list)
- [2025/11/17] We released the first SGLang DLC, available for SageMaker at `public.ecr.aws/deep-learning-containers/sglang:0.5.5-gpu-py312`
- Learn to set up and validate a distributed training environment on Amazon EKS using AWS Deep Learning Containers for scalable ML model training across multiple nodes. Check out Master Distributed Training on EKS for details 🌐
- Seamlessly integrate AWS Deep Learning Containers with Amazon SageMaker's managed MLflow service to streamline your ML experiment tracking, model management, and deployment workflow. Check out Level Up with SageMaker AI & MLflow for details 🔄
- Deploy and serve Large Language Models efficiently on Amazon EKS using vLLM Deep Learning Containers for optimized inference performance and scalability. Check out Deploy LLMs Like a Pro on EKS for details 🚀
- Learn to fine-tune and deploy Meta's Llama 3.2 Vision model for AI-powered web automation by combining AWS DLCs, Amazon EKS, and Bedrock to enable visual understanding in your applications. Check out Web Automation with Meta Llama 3.2 Vision for details 🎯
- Discover how to simplify and accelerate your deep learning workflow by integrating AWS Deep Learning Containers with Amazon Q Developer and the Model Context Protocol (MCP) for streamlined environment setup and management. Check out Supercharge Your DL Environment for details ⚡
- Learn how to deploy and optimize Large Language Models (LLMs) on Amazon EKS using vLLM Deep Learning Containers for high-performance inference at scale. Check out the Workshop Guide and Sample Code for details 🚀
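The vLLM images announced above can be pulled and run directly on a GPU instance. The snippet below is a minimal sketch using the Docker SDK for Python (`pip install docker`); it assumes an NVIDIA GPU host with the NVIDIA Container Toolkit installed, and the serve command, model ID, and port are illustrative placeholders rather than part of the DLC itself, so consult the image documentation for the supported entrypoint and arguments.

```python
import docker

# EC2/EKS/ECS image from the announcement above.
IMAGE = "public.ecr.aws/deep-learning-containers/vllm:0.11.2-gpu-py312-ec2"

client = docker.from_env()

# Pull the published image from the public ECR gallery.
client.images.pull("public.ecr.aws/deep-learning-containers/vllm",
                   tag="0.11.2-gpu-py312-ec2")

# Start a vLLM server in the background; the command, model ID, and port
# below are assumptions for illustration, not part of the DLC contract.
container = client.containers.run(
    IMAGE,
    command=["vllm", "serve", "Qwen/Qwen2.5-0.5B-Instruct", "--port", "8000"],
    ports={"8000/tcp": 8000},
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
    detach=True,
)
print("vLLM container started:", container.short_id)
```

If the container serves vLLM's OpenAI-compatible API on that port, any OpenAI-style client can then send requests to `http://localhost:8000/v1`.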
AWS Deep Learning Containers (DLCs) are a suite of Docker images that streamline the deployment of AI/ML workloads on Amazon SageMaker, Amazon EKS, and Amazon EC2.
- Pre-optimized Environments: Production-ready containers with optimized deep learning frameworks
- Latest AI/ML Tools: Quick access to cutting-edge frameworks like vLLM, SGLang, and PyTorch
- Multi-Platform Support: Run seamlessly on SageMaker, EKS, or EC2
- Enterprise-Ready: Built with security, performance, and scalability in mind
- Rapid Deployment: Get started in minutes with pre-configured environments
- Framework Flexibility: Support for popular frameworks like PyTorch, TensorFlow, and more
- Performance Optimized: Containers tuned for AWS infrastructure
- Regular Updates: Quick access to latest framework releases and security patches
- AWS Integration: Seamless compatibility with AWS AI/ML services
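As a concrete example of that AWS integration, the SageMaker Python SDK can resolve the region-specific ECR URI of a framework DLC for you. The sketch below assumes the `sagemaker` package is installed; the framework, version, and Python-version strings are illustrative, so check the available-images documentation for currently published combinations.

```python
from sagemaker import image_uris

# Look up the ECR URI of a PyTorch training DLC for a given region and
# instance type; the version strings here are illustrative.
uri = image_uris.retrieve(
    framework="pytorch",
    region="us-east-1",
    version="2.4",
    py_version="py311",
    instance_type="ml.g5.2xlarge",
    image_scope="training",
)
print(uri)  # region-specific ECR URI for the requested DLC
```

The returned URI can then be passed as `image_uri` to SageMaker estimators, models, or processing jobs.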
- Data Scientists building and training models
- ML Engineers deploying production workloads
- DevOps teams managing ML infrastructure
- Researchers exploring cutting-edge AI capabilities
Our containers undergo rigorous security scanning and are regularly updated to address vulnerabilities, ensuring your ML workloads run on a secure foundation.
This project is licensed under the Apache-2.0 License.
