CODA: Repurposing Continuous VAEs for Discrete Tokenization
This repository contains the CODA tokenizer, as introduced in CODA: Repurposing Continuous VAEs for Discrete Tokenization.
Project Page: https://lzy-tony.github.io/coda
Code: https://github.com/LeapLabTHU/CODA
Highlights
CODA addresses the challenges of training conventional VQ tokenizers by decoupling compression and discretization. Instead of training from scratch, CODA adapts off-the-shelf continuous VAEs into discrete tokenizers, leading to stable and efficient training with strong visual fidelity.
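The decoupling idea can be illustrated with a minimal sketch: a frozen continuous VAE supplies latent vectors, and discretization is handled separately, here shown as a nearest-neighbor codebook lookup. This is a conceptual illustration only, not CODA's actual quantization scheme; the function and variable names are hypothetical.

```python
import numpy as np

def quantize_latents(latents, codebook):
    """Map each continuous latent vector to the index of its nearest
    codebook entry (Euclidean distance) and return the quantized latents.

    latents:  (N, D) continuous latents from a frozen VAE encoder
    codebook: (K, D) learnable discrete codes
    """
    # Pairwise squared distances between latents and codes: (N, K)
    d2 = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    indices = d2.argmin(axis=1)       # discrete token ids, shape (N,)
    quantized = codebook[indices]     # quantized latents, shape (N, D)
    return indices, quantized

rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 4))   # 16 codes, 4-dim latents
latents = rng.normal(size=(8, 4))     # 8 latents from a (frozen) encoder
ids, quantized = quantize_latents(latents, codebook)
```

Because the continuous VAE already handles compression, only the discretization step (and any lightweight adaptation around it) needs to be trained, which is the source of the stability and efficiency gains described above.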