Dataset Viewer issue: StreamingRowsError
#21 opened by egrace479
The dataset viewer is not working.
Error details:
```
Error code:   StreamingRowsError
Exception:    ValueError
Message:      image at A00000001831.jpg doesn't have metadata in hf://datasets/imageomics/2018-NEON-beetles@58d24dcb04d4c57ed9decb57baefe147761320cf/group_images/metadata.csv.
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/src/worker/utils.py", line 99, in get_rows_or_raise
                  return get_rows(
                File "/src/libs/libcommon/src/libcommon/utils.py", line 197, in decorator
                  return func(*args, **kwargs)
                File "/src/services/worker/src/worker/utils.py", line 77, in get_rows
                  rows_plus_one = list(itertools.islice(ds, rows_max_number + 1))
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 2226, in __iter__
                  for key, example in ex_iterable:
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 219, in __iter__
                  for key_example in islice(self.generate_examples_fn(**gen_kwargs), shard_example_idx_start, None):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/folder_based_builder/folder_based_builder.py", line 311, in _generate_examples
                  raise ValueError(
              ValueError: image at A00000001831.jpg doesn't have metadata in hf://datasets/imageomics/2018-NEON-beetles@58d24dcb04d4c57ed9decb57baefe147761320cf/group_images/metadata.csv.
```
`group_images_masks` has the same issue for A00000001831_mask.png. Both of those images are listed in the `file_name` column for both metadata files.
Additionally, the `separate segmented splits` subset isn't registering anything. Is that because I need to have it in the configs as follows?
```yaml
configs:
  - config_name: separate segmented splits
    data_files:
      - split: train
        path: Separate_segmented_train_test_splits_80_20/train/*/*.jpg
      - split: test
        path:
          - Separate_segmented_train_test_splits_80_20/test/*/*.jpg
          - Separate_segmented_train_test_splits_80_20/metadata.csv
```
The metadata file does have both `split` and `subset` columns.
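Once the config is picked up, this is roughly how I plan to sanity-check it from the Hub (again a sketch; it assumes the config keeps the name `separate segmented splits` and uses streaming so the images aren't downloaded):

```python
# Sketch: confirm the config is listed and that its train/test splits register.
from datasets import get_dataset_config_names, load_dataset

print(get_dataset_config_names("imageomics/2018-NEON-beetles"))

ds = load_dataset(
    "imageomics/2018-NEON-beetles",
    "separate segmented splits",
    streaming=True,
)
print(ds)                        # should show train and test splits
print(next(iter(ds["train"])))   # first example, including the metadata columns
```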
Thanks!