Hugging Face dataset: occiglot/tokenizer-wiki-bench (maintained by Occiglot)
Modalities: Text
Formats: Parquet
Languages: Afrikaans, Arabic, Bulgarian, and 42 more
Size: 10M - 100M
ArXiv: arxiv:2012.15613
Libraries: Datasets, Dask, Croissant, and 1 more
License: MIT
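Since the Datasets library is listed as supported, a minimal loading sketch is shown below. The config name "pt" is an assumption inferred from the directory shown in the file listing that follows; check the dataset card for the exact list of available configs.

```python
# Minimal sketch (assumption: the per-language directory "pt" is also the
# dataset config name; verify against the dataset card before relying on it).
from datasets import load_dataset

ds = load_dataset("occiglot/tokenizer-wiki-bench", "pt", split="train")
print(ds)  # prints the number of rows and the column names of the train split
```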
Files and versions: tokenizer-wiki-bench/pt (revision f4c6647)
1 contributor, 1 commit: "Upload dataset" (631a380, verified) by mbrack, about 1 year ago
All 14 shards are LFS-tracked Parquet files marked "Safe", uploaded in the "Upload dataset" commit about 1 year ago (roughly 2.8 GB in total):

train-00000-of-00014.parquet  463 MB
train-00001-of-00014.parquet  227 MB
train-00002-of-00014.parquet  238 MB
train-00003-of-00014.parquet  132 MB
train-00004-of-00014.parquet  159 MB
train-00005-of-00014.parquet  177 MB
train-00006-of-00014.parquet  119 MB
train-00007-of-00014.parquet  179 MB
train-00008-of-00014.parquet  155 MB
train-00009-of-00014.parquet  165 MB
train-00010-of-00014.parquet  157 MB
train-00011-of-00014.parquet  218 MB
train-00012-of-00014.parquet  199 MB
train-00013-of-00014.parquet  229 MB
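The shards can also be read directly as Parquet, for example with pandas over the Hugging Face filesystem. This is a sketch only, assuming the shard path above is current and that huggingface_hub and fsspec are installed alongside pandas.

```python
# Minimal sketch: read a single shard directly as Parquet via the hf:// protocol.
# Assumes `pandas`, `huggingface_hub`, and `fsspec` are installed; the shard path
# is copied from the listing above and may change if the dataset is reorganized.
import pandas as pd

shard = "hf://datasets/occiglot/tokenizer-wiki-bench/pt/train-00000-of-00014.parquet"
df = pd.read_parquet(shard)
print(df.head())
```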