CODA: Repurposing Continuous VAEs for Discrete Tokenization

This repository contains the CODA tokenizer, introduced in the paper "CODA: Repurposing Continuous VAEs for Discrete Tokenization".

Project Page: https://lzy-tony.github.io/coda

Code: https://github.com/LeapLabTHU/CODA

Highlights

CODA addresses the challenges of training conventional VQ tokenizers by decoupling compression from discretization. Instead of training a discrete tokenizer from scratch, CODA adapts off-the-shelf continuous VAEs into discrete tokenizers, leading to stable and efficient training with strong visual fidelity.
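
The decoupling idea can be illustrated with a minimal sketch: the pretrained continuous VAE is kept frozen and handles compression, while only a lightweight quantization module is trained for discretization. Note that this is a hedged illustration, not CODA's actual quantization scheme (see the paper and repository for that). The `vae.encode`/`vae.decode` interface, the class names, and the plain nearest-neighbor codebook lookup below are hypothetical stand-ins.

```python
import torch
import torch.nn as nn


class CodebookQuantizer(nn.Module):
    """Nearest-neighbor codebook lookup with a straight-through estimator.

    A standard VQ-style discretizer, used here only as a placeholder for
    the quantization scheme described in the CODA paper.
    """

    def __init__(self, num_codes: int = 1024, dim: int = 16):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)

    def forward(self, z: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        # z: (B, C, H, W) continuous latents -> flatten to (B*H*W, C) vectors
        b, c, h, w = z.shape
        flat = z.permute(0, 2, 3, 1).reshape(-1, c)
        # Euclidean distance to every codebook entry, then pick the nearest
        dist = torch.cdist(flat, self.codebook.weight)
        indices = dist.argmin(dim=1)  # discrete token ids
        z_q = self.codebook(indices).reshape(b, h, w, c).permute(0, 3, 1, 2)
        # Straight-through: gradients pass through as if quantization were identity
        z_q = z + (z_q - z).detach()
        return z_q, indices.reshape(b, h, w)


class DiscretizedVAE(nn.Module):
    """Wrap a frozen pretrained continuous VAE with a trainable quantizer.

    Compression (the VAE) stays fixed; only discretization (the quantizer)
    is learned. The `encode`/`decode` methods on `vae` are an assumed interface.
    """

    def __init__(self, vae: nn.Module, quantizer: CodebookQuantizer):
        super().__init__()
        self.vae = vae.eval()
        for p in self.vae.parameters():  # freeze the off-the-shelf VAE
            p.requires_grad_(False)
        self.quantizer = quantizer  # the only trainable component

    def forward(self, x: torch.Tensor):
        z = self.vae.encode(x)       # continuous latents from the frozen encoder
        z_q, tokens = self.quantizer(z)  # discretize into token ids
        recon = self.vae.decode(z_q)     # reconstruct through the frozen decoder
        return recon, tokens
```

Because the VAE weights are frozen, only the quantizer's parameters receive gradients, which is one way to realize the stable, efficient training that decoupling enables.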
