
Conversation


@abhinavclemson abhinavclemson commented Jan 22, 2026

Description

Add Deepseek and GPT OSS vLLM standalone weight mapping.

This change introduces deepseek3.py and gpt_oss.py, which define the DEEPSEEK_VLLM_MAPPING and GPT_OSS_VLLM_MAPPING dataclasses. These dataclasses describe how to map MaxText model weights into the vLLM-compatible format used on the tpu-inference side, including parameter name mappings and sharding specifications.
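To illustrate the general shape of such a standalone mapping, here is a minimal sketch of a weight-mapping dataclass. All names and fields below (the class, its attributes, and the example parameter paths) are assumptions for illustration only, not the actual MaxText or tpu-inference API:

```python
# Hypothetical sketch of a vLLM weight-mapping dataclass; names and fields
# are illustrative assumptions, not the real MaxText/tpu-inference types.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class VllmWeightMapping:
  """Maps MaxText parameter names to vLLM-side names plus sharding specs."""

  # MaxText parameter path -> vLLM parameter path.
  name_map: dict = field(default_factory=dict)
  # vLLM parameter path -> logical sharding axes (None = replicated axis).
  sharding: dict = field(default_factory=dict)

  def to_vllm(self, maxtext_name: str) -> str:
    """Translate a single MaxText parameter name, raising on unknown names."""
    if maxtext_name not in self.name_map:
      raise KeyError(f"No vLLM mapping for {maxtext_name!r}")
    return self.name_map[maxtext_name]


# Illustrative entry only; a real mapping covers every layer of the model.
GPT_OSS_EXAMPLE_MAPPING = VllmWeightMapping(
    name_map={
        "decoder.layers.self_attention.query.kernel":
            "model.layers.self_attn.q_proj.weight",
    },
    sharding={
        "model.layers.self_attn.q_proj.weight": ("model", None),
    },
)
```

Keeping the name map and the sharding spec in one immutable dataclass means a checkpoint loader can translate names and place shards in a single pass over the parameter tree.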

Command:

```shell
xpk workload create-pathways --workload "abhinav-grpo-new" \
  --docker-image gcr.io/cloud-tpu-multipod-dev/abhinavsing_vllm_runner:latest \
  --cluster mazumdera-v5p-64-pw-2 --tpu-type=v5p-64 --num-slices=1 --priority=high \
  --custom-pathways-worker-args="--xprof_max_trace_buffers=16384" \
  --command "MODEL_IMPL_TYPE=flax_nnx NEW_MODEL_DESIGN=1 TF_CPP_MIN_LOG_LEVEL=0 JAX_PLATFORMS=proxy JAX_BACKEND_TARGET=grpc://127.0.0.1:29000 ENABLE_PATHWAYS_PERSISTENCE='1' python3 -m src.MaxText.rl.train_rl src/MaxText/configs/rl.yml model_name=gpt-oss-20b tokenizer_path=unsloth/gpt-oss-20b-BF16 load_parameters_path=gs://shuningjin-multipod-dev/gpt-oss-20b/scan-flags-false-2025-11-11-01-42-40/0/items run_name=test-tunix-maxtext-gpt-oss-20b-BF16 base_output_directory=gs://abhinavsing_bucket/ checkpoint_storage_use_ocdbt=False checkpoint_storage_use_zarr3=False rollout_data_parallelism=2 rollout_tensor_parallelism=8 hbm_utilization_vllm=0.6 batch_size=8 micro_batch_size=8 profiler=xplane profiler_steps=2"
```

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.


codecov bot commented Jan 22, 2026

Codecov Report

❌ Patch coverage is 0% with 58 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| ...axText/integration/tunix/weight_mapping/gpt_oss.py | 0.00% | 32 Missing ⚠️ |
| ...Text/integration/tunix/weight_mapping/deepseek3.py | 0.00% | 20 Missing ⚠️ |
| ...xText/integration/tunix/weight_mapping/__init__.py | 0.00% | 6 Missing ⚠️ |


@abhinavclemson abhinavclemson force-pushed the deepseek-moe branch 4 times, most recently from 71d1fb9 to 5b2d042 on January 22, 2026 16:31