A Python toolbox for color space conversion, 3D LUT generation, and LUT-based color matching / training. It supports GPU acceleration (CuPy) with a CPU fallback (NumPy), and provides a lightweight operator-based pipeline that lets you chain decode / matrix / tone mapping / roll-off operators.
Note: this repo contains both (1) “generate a LUT from JSON color space definitions” utilities and (2) “fit/optimize a LUT from paired images” training code. Which path you use depends on the entry script you run.
- Generate 3D LUTs (`.cube`) from JSON color space descriptions
- Color conversion + operator pipeline (decode/encode, matrices, white adaptation, roll-off, tone mapping, etc.)
- LUT training/optimization from paired images (example entry script included)
- Simple GUI (PyQt5 / PyOpenGL)
Python 3.10+ is recommended.
```bash
pip install -r requirements.txt
```

`requirements.txt` includes `cupy>=12.0.0`. If you don't have CUDA properly installed, CuPy installation may fail or the package may be unusable.
- CPU-only workflow: comment out/remove `cupy` from `requirements.txt`; the toolbox falls back to NumPy on the CPU (a minimal sketch of the usual import-fallback pattern follows this list).
- GPU workflow: install a CuPy build that matches your CUDA environment, and make sure your CUDA driver/toolkit matches the selected CuPy version.
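The fallback is typically wired up with a guarded import; the sketch below is illustrative only and may not match this repo's actual module layout:

```python
# Illustrative CuPy/NumPy fallback (not necessarily how this repo wires it up).
try:
    import cupy as xp          # GPU path: CuPy mirrors most of the NumPy API
    GPU_AVAILABLE = True
except ImportError:            # no CuPy (or no usable CUDA build) -> plain NumPy
    import numpy as xp
    GPU_AVAILABLE = False

def to_cpu(arr):
    """Return a host-side NumPy array regardless of which backend produced `arr`."""
    return xp.asnumpy(arr) if GPU_AVAILABLE else arr
```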
The repo provides a training entry script, `run_train.py`:

```bash
python run_train.py
```

Most training knobs live in `config.py`. You can visualize the latest loss curve with `plot_latest_loss.py`.
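The actual training logic lives in `run_train.py`, `LUTObj/`, and `LossObj/`; purely as a conceptual illustration of what "fit a LUT from paired images" involves (none of the names below come from this repo), the core operation is applying a candidate 3D LUT to the source image via trilinear interpolation and scoring the result against the target:

```python
import numpy as np

def apply_lut_trilinear(img, lut):
    """Apply a 3D LUT (shape [N, N, N, 3], indexed lut[r, g, b]) to an image of
    RGB values in [0, 1] using trilinear interpolation."""
    n = lut.shape[0]
    x = np.clip(img, 0.0, 1.0) * (n - 1)            # continuous lattice coordinates
    lo = np.floor(x).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = x - lo                                       # fractional offset per channel

    r0, g0, b0 = lo[..., 0], lo[..., 1], lo[..., 2]
    r1, g1, b1 = hi[..., 0], hi[..., 1], hi[..., 2]
    fr, fg, fb = f[..., 0:1], f[..., 1:2], f[..., 2:3]

    # Interpolate along red, then green, then blue.
    c00 = lut[r0, g0, b0] * (1 - fr) + lut[r1, g0, b0] * fr
    c01 = lut[r0, g0, b1] * (1 - fr) + lut[r1, g0, b1] * fr
    c10 = lut[r0, g1, b0] * (1 - fr) + lut[r1, g1, b0] * fr
    c11 = lut[r0, g1, b1] * (1 - fr) + lut[r1, g1, b1] * fr
    c0 = c00 * (1 - fg) + c10 * fg
    c1 = c01 * (1 - fg) + c11 * fg
    return c0 * (1 - fb) + c1 * fb

def mse_loss(pred, target):
    """Mean squared error between the LUT-graded source and the reference image."""
    return float(np.mean((pred - target) ** 2))
```

An optimizer then adjusts the LUT entries to drive the loss down; in this repo the loss definitions sit in `LossObj/` and the training knobs in `config.py`.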
There are multiple demos you can use as references for "how to build a pipeline" and "how to export a .cube" (a generic .cube-writing sketch follows the list):
- `demo_aces_ap1_to_709.py`: generate an ACES AP1 (ACEScg linear) → Rec.709 LUT
- `demo_slog3_to_709_minimal.py` / `demo_slog3_to_709_midgray_rgb_rolloff.py`: S-Log3 → 709 variants
- `gen_logc3_to_709_simple.py`: a simplified LogC3 → 709 LUT generator
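The demos are the reference for how this repo exports LUTs; as a refresher on the file format itself (this is the standard `.cube` layout, not this repo's writer), a minimal writer looks like:

```python
import numpy as np

def write_cube(path, lut, title="Generated LUT"):
    """Write a LUT of shape [N, N, N, 3] (indexed lut[r, g, b]) as a .cube file.
    .cube data lines list RGB triples with red varying fastest, then green, then blue."""
    n = lut.shape[0]
    with open(path, "w") as f:
        f.write(f'TITLE "{title}"\n')
        f.write(f"LUT_3D_SIZE {n}\n")
        f.write("DOMAIN_MIN 0.0 0.0 0.0\n")
        f.write("DOMAIN_MAX 1.0 1.0 1.0\n")
        for b in range(n):
            for g in range(n):
                for r in range(n):
                    rr, gg, bb = lut[r, g, b]
                    f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")
```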
`ColorSpaceUtils/` provides a JSON-driven LUT generation workflow:

- `ColorSpaceUtils/Bin/`: color space JSON definitions
- `ColorSpaceUtils/CSTLUTGen.py`: LUT generation core (create an identity grid and run a pipeline over it; a sketch of the identity-grid step follows this list)
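As a rough sketch of the identity-grid step (illustrative only, not the repo's implementation): build an N³ lattice of input RGB values, push it through the operator pipeline, and the transformed values become the LUT entries.

```python
import numpy as np

def identity_grid(n=33):
    """N x N x N lattice of RGB values in [0, 1], flattened to shape (N**3, 3)
    with red varying fastest -- the same ordering a .cube file expects."""
    ax = np.linspace(0.0, 1.0, n)
    b, g, r = np.meshgrid(ax, ax, ax, indexing="ij")   # blue slowest, red fastest
    return np.stack([r, g, b], axis=-1).reshape(-1, 3)

grid = identity_grid(33)
# lut_entries = pipeline(grid)   # hypothetical: run the operator chain over the grid,
#                                # then write the (N**3, 3) result out as .cube data lines
```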
```bash
python -m ColorSpaceUtils.CSTLUTGen \
  --src ColorSpaceUtils/Bin/bt709_display.json \
  --dst ColorSpaceUtils/Bin/slog3sgmt3c_camera.json \
  --precision 33 \
  --output Rec709_to_SLog3.cube
```

If both JSON files include a `mid_gray` field, the converter may apply an automatic scaling to align mid-gray luminance (e.g. 18% reflectance). To disable it, there is typically an option like `auto_mid_gray_align=False` (see the code for the exact API).
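Assuming the alignment amounts to a single gain in linear light (an assumption; the exact behaviour is defined by the code, not here), it would look roughly like this:

```python
import numpy as np

# Hypothetical illustration of mid-gray alignment; the names and values are not from this repo.
src_mid_gray = 0.18                        # mid_gray declared by the source JSON
dst_mid_gray = 0.18                        # mid_gray declared by the destination JSON
gain = dst_mid_gray / src_mid_gray         # single gain that maps one mid-gray onto the other

linear_rgb = np.array([0.05, 0.18, 0.40])  # some scene-linear pixel
linear_rgb = linear_rgb * gain             # applied in linear light, before re-encoding
```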
GUI entry script: `gui_main.py` (requires PyQt5, PyOpenGL, pyqtgraph).

```bash
python gui_main.py
```

If OpenGL is not available (driver issues / remote desktop), validate the workflow using the scripts first.
- `run_train.py`: training entry
- `config.py`: training/experiment config
- `requirements.txt`: dependencies
- `plot_latest_loss.py`: plot the latest loss curve
- `ImgObj/`: image objects, I/O, dataset containers (e.g. `image.py`, `pictpool.py`)
- `LUTObj/`: LUT data structures and processing
- `LossObj/`: loss functions and related logic
- `ColorSpaceUtils/`: color space definitions, pipelines, and LUT generation
- `CSTLutGen_Demo/`: small LUT-generation demos
- `checkpoint/`: training checkpoint images
The ACES AP1 → Rec.709 workflow is documented in `ACES_AP1_TO_709_README.md`. The corresponding script is `demo_aces_ap1_to_709.py`, and an example output LUT is `aces_ap1_to_709.cube`.
- Transfer expressions in the JSON definitions are evaluated by a restricted-expression style evaluator. Don't run untrusted JSON definitions.
- When building pipelines, keep units consistent (encoded vs linear vs absolute nits); see the sketch below.
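For instance, the usual discipline is: decode to linear first, do matrix work in linear light, and only re-encode at the end. The snippet below uses toy stand-ins (a gamma-2.4 transfer and an identity matrix) purely to illustrate the ordering; it is not this repo's API.

```python
import numpy as np

def decode_gamma24(encoded):
    """Toy transfer: display-encoded code values -> linear light."""
    return np.power(np.clip(encoded, 0.0, 1.0), 2.4)

def encode_gamma24(linear):
    """Toy transfer: linear light -> display-encoded code values."""
    return np.power(np.clip(linear, 0.0, 1.0), 1.0 / 2.4)

src_to_dst = np.eye(3)                    # stand-in for a real primaries/white-point matrix

encoded_rgb = np.array([0.50, 0.25, 0.75])
linear_rgb = decode_gamma24(encoded_rgb)  # 1) decode: encoded -> linear
linear_rgb = linear_rgb @ src_to_dst.T    # 2) 3x3 matrices belong in linear light
out = encode_gamma24(linear_rgb)          # 3) re-encode only at the very end
```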
All trademarks and product names mentioned in this repository are the property of their respective owners.
Unless otherwise noted, the contents of this repository are licensed under Creative Commons Attribution-NonCommercial 3.0 China (CC BY-NC 3.0 CN).
- You may share and adapt the material for non-commercial purposes, as long as you provide appropriate attribution.
- For commercial licensing, please contact the author.
See LICENSE for the full license text.
Third-party dependencies and assets (including but not limited to Python packages, sample images, and datasets) remain under their respective licenses/terms.