ppo-LunarLander-v2 / results.json
first model (commit 7fb48b7)
{"mean_reward": 249.1637533062048, "std_reward": 27.890147313996174, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-21T02:35:30.124004"}