# Dataset Card for Parametric Optimization Problems

Curated by: Andrew Rosemberg & Contributors
This dataset is a collection of parametrized optimization problems stored in MathOptFormat (`.mof.json`) files. Each file encodes a mathematical optimization problem (its objective, constraints, and parameters) using a standardized data structure for portability and ease of parsing.
## Dataset Details

### Dataset Description
Parametric optimization problems arise in scenarios where certain elements (e.g., coefficients, constraints) may vary according to problem parameters. This collection gathers different problem instances across various domains (e.g., power systems, control, resource allocation) in a uniform JSON-based format. Users can load, modify, and solve these problems with specialized libraries—particularly with the LearningToOptimize.jl package in Julia.
A general form of a parameterized convex optimization problem is

$$
\begin{aligned}
\min_{x} \quad & f_0(x; \theta) \\
\text{subject to} \quad & f_i(x; \theta) \le 0, \quad i = 1, \dots, m,
\end{aligned}
$$

where $\theta$ is the parameter.
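As a concrete instance, consider minimizing $f(x; \theta) = (x - \theta)^2$ over $x$: the optimal solution $x^\*(\theta) = \theta$ is a function of the parameter. A minimal, solver-free sketch of this idea (the function and its closed-form solution are illustrative, not part of the dataset):

```python
def solve_parametric(theta):
    """Minimize f(x; theta) = (x - theta)**2 over x.

    The unconstrained minimizer is x*(theta) = theta, so the
    optimal value is 0 for every parameter value.
    """
    x_star = theta  # closed-form solution for this toy problem
    value = (x_star - theta) ** 2
    return x_star, value

# The optimal solution tracks the parameter:
for theta in (0.0, 1.5, -2.0):
    x_star, value = solve_parametric(theta)
    print(theta, x_star, value)
```

Each `.mof.json` file in this dataset encodes one such parameterized problem, with the parameter values stored alongside the problem data.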
## Usage
Using the LearningToOptimize.jl package in Julia, users can generate problem variants by sampling parameter values following defined rules:

```julia
using LearningToOptimize

general_sampler(
    "PGLib/Load/ACPPowerModel/pglib_opf_case3_lmbd.m_ACPPowerModel_load.mof.json";
    samplers=[
        (original_parameters) -> scaled_distribution_sampler(original_parameters, 10000),
        (original_parameters) -> line_sampler(original_parameters, 1.01:0.01:1.25),
        (original_parameters) -> box_sampler(original_parameters, 300),
    ],
)
```

where `scaled_distribution_sampler`, `line_sampler`, and `box_sampler` are examples of the built-in samplers.
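To make the sampling idea concrete, here is a language-agnostic sketch of what a box-style sampler does: perturb each nominal parameter independently within an interval around its value. This illustrates the concept only; the exact semantics of LearningToOptimize.jl's `box_sampler` may differ.

```python
import random

def box_sample(nominal, num_samples, width=0.1, seed=0):
    """Draw parameter vectors uniformly from a box around the nominal values.

    Each component is perturbed independently within
    [p * (1 - width), p * (1 + width)] -- a conceptual stand-in for a
    box-style sampler, not the package's actual implementation.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(num_samples):
        samples.append([p * (1 + rng.uniform(-width, width)) for p in nominal])
    return samples

# 300 perturbed parameter vectors around three nominal load values:
variants = box_sample([100.0, 250.0, 75.0], num_samples=300)
print(len(variants))
```

Each sampled vector can then be substituted into the `.mof.json` problem's `Parameter` sets to produce a new problem instance.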
## Outside Dataset Sources

- PGLib: power-grid-lib
- JuMP: JuMP Tutorials
## Uses

### Direct Use
These problems can be directly used to:
- Test solver performance on a variety of instances.
- Benchmark machine learning models that learn optimization proxies.
- Generate synthetic scenarios by applying parametric samplers for stress-testing or research.
### Out-of-Scope Use
- The dataset is not intended for training general-purpose NLP or computer vision models.
- The dataset contains no personal or sensitive information, so privacy-infringing uses do not apply.
## Dataset Structure
TBD
### File Structure
In a typical `.mof.json` file, you will find:

- Objectives: specifies the optimization sense (e.g., `Min`, `Max`) and the functions to be optimized.
- Variables: a list of decision variables, potentially including parameters as special variable entries.
- Constraints: each constraint references a function (made up of one or more variables) and a set specifying bounds, including `Parameter` sets for parametric variables.
An example snippet for a parameter:

```json
{
  "function": {
    "name": "name_of_parameter",
    "type": "Variable"
  },
  "set": {
    "type": "Parameter",
    "value": 1.0
  }
}
```
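Since MathOptFormat files are plain JSON, the parameters of an instance can be inspected with any JSON library. A minimal Python sketch, assuming the field layout shown in the snippet above (the helper name and the inline example data are illustrative):

```python
import json

def extract_parameters(mof):
    """Return (name, value) pairs for constraints whose set is a Parameter.

    `mof` is a parsed MathOptFormat document (a dict), e.g. the result of
    json.load() on a .mof.json file.
    """
    params = []
    for constraint in mof.get("constraints", []):
        if constraint.get("set", {}).get("type") == "Parameter":
            params.append(
                (constraint["function"]["name"], constraint["set"]["value"])
            )
    return params

# Inline example mirroring the snippet above (illustrative data only):
doc = {
    "constraints": [
        {
            "function": {"name": "name_of_parameter", "type": "Variable"},
            "set": {"type": "Parameter", "value": 1.0},
        }
    ]
}
print(extract_parameters(doc))  # [('name_of_parameter', 1.0)]
```

The same traversal works on any file in the dataset after `json.load(open(path))`.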