Issue Description
The documentation for running vLLM on MI300X GPUs references a Docker image tag that does not exist on Docker Hub.
Documentation URL: https://github.com/ROCm/amdgpu-docs/blob/develop/docs/gpu-partitioning/mi300x/run-vllm.rst
Issue: The documentation instructs users to pull the image rocm/vllm:instinct_main:

```shell
sudo docker pull rocm/vllm:instinct_main
```
Problem: This tag does not exist in the rocm/vllm Docker Hub repository.
Available Tags
The repository currently has 20 tags total, including:
- latest
- v0.14.0_amd_dev
- rocm7.0.0_vllm_0.11.2_20251210
- rocm6.4.1_vllm_0.10.1_20250909
- rocm6.3.1_instinct_vllm0.8.3_20250415
- etc.
Suggested Fix
The documentation should be updated to reference a valid tag. Options include:
- Use latest for the most recent version
- Use a specific versioned tag like rocm7.0.0_vllm_0.11.2_20251210
- Use one of the instinct-specific tags like rocm6.3.1_instinct_vllm0.8.3_20250415
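As a sketch of the fix, the pull command in the documentation could be parameterized so the tag lives in one place (the tag below is taken from the list above; it should still be verified against Docker Hub before the docs are updated):

```shell
# Compose the image reference from repository and tag so the docs
# (or any helper script) only need to update the tag in one place.
REPO=rocm/vllm
# A tag that currently exists in the repository, per the tag list above.
TAG=rocm6.3.1_instinct_vllm0.8.3_20250415
IMAGE="${REPO}:${TAG}"

echo "Pulling ${IMAGE}"
# Uncomment to actually pull the image (requires Docker and network access):
# sudo docker pull "${IMAGE}"
```

Whichever tag is chosen, pinning a dated, versioned tag (rather than latest) keeps the documented workflow reproducible even after new images are pushed.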
Impact
Users following the documentation will encounter an error when attempting to pull the image, preventing them from running the vLLM workload as documented.