---
library_name: transformers
tags:
- Danish
- Mixed Tokenization
- CerebrasGPT
---

### DA-MIXED-CEREBRAS

This is an experimental Danish language model fine-tuned with a mixed tokenization strategy that combines morphological and BPE approaches. Built on the CerebrasGPT-111M architecture, it explores how mixed tokenization affects Danish text generation.
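
Below is a minimal usage sketch for loading the model and sampling Danish text with the `transformers` library. The hub repository ID is a placeholder assumption, not the published one; replace it with the actual model ID.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repository ID -- substitute the real Hub path for this model.
model_id = "your-username/da-mixed-cerebras"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short Danish continuation from a prompt.
prompt = "Danmark er et land i"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```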