vLLM + Qwen3.6-27B (BF16) OpenAI-compatible inference server on AMD Strix Halo (Ryzen AI Max+ 395, gfx1151). Vision input, 256K context, /v1/responses with separated reasoning, via TheRock ROCm.
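A minimal sketch of querying such a server's /v1/responses endpoint with plain `requests`, assuming it listens on localhost:8000; the model ID is a placeholder, and the reasoning/message split follows the OpenAI Responses API schema that vLLM mirrors.

```python
import requests

# Assumed endpoint and model ID -- substitute your own deployment's values.
BASE_URL = "http://localhost:8000/v1"
MODEL_ID = "Qwen3.6-27B"  # placeholder; use the ID your server actually registers

resp = requests.post(
    f"{BASE_URL}/responses",
    json={"model": MODEL_ID, "input": "Why is the sky blue?"},
    timeout=300,
)
resp.raise_for_status()

# The Responses API returns a list of output items; reasoning arrives as
# separate "reasoning" items alongside the final "message" item.
for item in resp.json().get("output", []):
    print(item.get("type"), "->", item)
```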
AMD ISP4 camera driver module for Ryzen AI laptops
llama.cpp + Qwen3.6-27B (Q8_0 GGUF) OpenAI-compatible inference server on AMD Strix Halo (Ryzen AI Max+ 395, gfx1151). 256K context, ~7.5 t/s decode via TheRock ROCm Docker.
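As a quick sanity check against the quoted decode rate, a hedged sketch that times a non-streaming /v1/chat/completions call and divides completion tokens by wall-clock time; it assumes the llama.cpp server's default port 8080 and OpenAI-style usage reporting.

```python
import time
import requests

URL = "http://localhost:8080/v1/chat/completions"  # assumed llama-server default

start = time.monotonic()
r = requests.post(
    URL,
    json={
        "model": "qwen3.6-27b-q8_0",  # placeholder; a single-model server ignores this
        "messages": [{"role": "user", "content": "Summarize ROCm in three sentences."}],
        "max_tokens": 256,
    },
    timeout=600,
)
r.raise_for_status()
elapsed = time.monotonic() - start

usage = r.json()["usage"]
print(f"{usage['completion_tokens']} tokens in {elapsed:.1f}s "
      f"~ {usage['completion_tokens'] / elapsed:.1f} t/s (includes prefill)")
```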
ComfyUI on AMD Strix Halo (RDNA 3.5 / gfx1151) via Docker. Ubuntu Rolling + UV-managed Python 3.12 + ROCm preview wheels. Solves the silent CPU fallback that the Debian/Python 3.13 images hit on gfx1151.
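That fallback is easy to detect from inside the container: on a ROCm build of PyTorch, the HIP backend reports itself through the `torch.cuda` namespace. A small check like the following (assuming the ROCm preview wheel mentioned above) fails fast instead of silently running on CPU.

```python
import torch

# ROCm builds of PyTorch expose HIP through the torch.cuda API;
# torch.version.hip is None on CPU-only or CUDA builds.
if not torch.cuda.is_available() or torch.version.hip is None:
    raise SystemExit("No ROCm device visible -- ComfyUI would silently fall back to CPU")

print(f"HIP {torch.version.hip}, device: {torch.cuda.get_device_name(0)}")
```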
Talos-O (Omni): A sovereign, embodied agentic organism forged on AMD Strix Halo. Integrating the Chimera Kernel (Linux 7.0), Zero-Copy Introspection, and the Phronesis Engine. Built from First Principles.
Experiments, notes, and benchmarks for AMD Ryzen AI (Krackan Point NPU + Radeon iGPU) on Linux
Docker infrastructure for AMD Strix Halo (RDNA 3.5 / gfx1151): PyTorch + ROCm base container and a separate Ollama LLM service. Two folders, two Compose files, one Strix Halo box.
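Once the Ollama service is up, a hedged smoke test from the host, assuming Ollama's default port 11434 is published and that the named model (a placeholder here) has already been pulled:

```python
import requests

# Assumes the Ollama container publishes its default port and the model is pulled.
r = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "qwen2.5:7b", "prompt": "Say hello.", "stream": False},
    timeout=300,
)
r.raise_for_status()
print(r.json()["response"])
```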
Experimental local LLM API for AMD Strix Halo (gfx1151) on ROCm 7.10 (TheRock). Two-service split: vLLM inference engine + FastAPI gateway with OpenAI protocol normalization, auth, management. Docker Compose.
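A minimal sketch of that gateway pattern, not the repo's actual code: a FastAPI route that checks a bearer token and forwards chat completions to the vLLM backend. The service name, port, and token are all assumptions.

```python
import httpx
from fastapi import FastAPI, HTTPException, Request

app = FastAPI()
VLLM_URL = "http://vllm:8000/v1/chat/completions"  # assumed Compose service name/port
API_TOKEN = "change-me"                            # placeholder credential

@app.post("/v1/chat/completions")
async def chat(request: Request):
    # Simple bearer-token auth in front of the inference engine.
    if request.headers.get("authorization") != f"Bearer {API_TOKEN}":
        raise HTTPException(status_code=401, detail="invalid token")
    payload = await request.json()
    async with httpx.AsyncClient(timeout=300) as client:
        upstream = await client.post(VLLM_URL, json=payload)
    return upstream.json()
```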
A fully offline, air-gapped desktop AI workspace for legal, medical, and financial professionals. All inference runs locally on the AMD Ryzen NPU with zero cloud calls and zero telemetry, so no data ever leaves the device.
A lightweight TUI monitor for AMD Ryzen AI NPUs
Stable Diffusion image generation on AMD Ryzen AI NPUs for Linux