
Errors with EvolutionStrategy in MOO benchmarks #1237

Open
theoajc opened this issue Sep 10, 2021 · 0 comments
theoajc commented Sep 10, 2021

Steps to reproduce

  1. Run the benchmarks with the command python -m nevergrad.benchmark multiobjective_example_many --num_workers=76 --plot

Observed Results

When running the benchmarks to check the performance of the optimizers in a MOO context, I noticed these errors coming from the EvolutionStrategy optimizers in particular. The benchmarks fail for some of the optimizers with the errors shown below. I ran the same command on an older version of master to confirm that the no_hypervolume changes did not introduce these errors, and the same errors cropped up even before those changes.

Starting 282 (4/16 of worker): Experiment: Experiment: EvolutionStrategy(offsprings=200, recombination_ratio=0.5)<budget=800, num_workers=1, batch_mode=True> (dim=7, param=:[0. 0. 0. 0. 0. 0. 0.]) on Instance of MultiExperiment(function_class='MultiExperiment', name='sphere,sphere,sphere,sphere') with seed None
Error when applying Experiment: Experiment: EvolutionStrategy(offsprings=200, recombination_ratio=0.5)<budget=800, num_workers=1, batch_mode=True> (dim=7, param=:[0. 0. 0. 0. 0. 0. 0.]) on Instance of MultiExperiment(function_class='MultiExperiment', name='sphere,sphere,sphere,sphere') with seed None:
Traceback (most recent call last):
  File "/private/home/theoajclarke/nevergrad/nevergrad/benchmark/xpbase.py", line 196, in run
    self._run_with_error()
  File "/private/home/theoajclarke/nevergrad/nevergrad/benchmark/xpbase.py", line 274, in _run_with_error
    raise e
  File "/private/home/theoajclarke/nevergrad/nevergrad/benchmark/xpbase.py", line 266, in _run_with_error
    obase.Optimizer.minimize(
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/base.py", line 641, in minimize
    args = self.ask()
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/base.py", line 462, in ask
    candidate = self._internal_ask_candidate()
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/es.py", line 53, in _internal_ask_candidate
    uid = self._uid_queue.ask()
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/utils.py", line 329, in ask
    raise RuntimeError("Both asked and told queues are empty.")
RuntimeError: Both asked and told queues are empty.
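For context on this first failure: EvolutionStrategy's _internal_ask_candidate draws a parent uid from a queue object that tracks which candidates have been asked and which have been told. The sketch below is an illustrative stand-in (not nevergrad's actual UidQueue code) showing the behavior behind the traceback: ask serves uids from the "told" queue first, falls back to recycling in-flight "asked" uids, and raises once both are exhausted, which suggests the whole population was discarded before this ask.

```python
from collections import deque


class UidQueue:
    """Illustrative sketch (not nevergrad's actual implementation) of the
    queue behaviour behind the "Both asked and told queues are empty" error."""

    def __init__(self) -> None:
        self.told = deque()   # uids whose losses have been reported
        self.asked = deque()  # uids currently out for evaluation

    def tell(self, uid: str) -> None:
        # A uid whose loss was reported becomes available to ask again.
        self.told.append(uid)

    def ask(self) -> str:
        if self.told:
            uid = self.told.popleft()
        elif self.asked:
            uid = self.asked[0]  # recycle an in-flight uid
        else:
            # This is the branch hit in the traceback above.
            raise RuntimeError("Both asked and told queues are empty.")
        self.asked.append(uid)
        return uid
```

If selection removes every population uid from both queues (for example after a ranking step), the next ask necessarily lands in the error branch.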
Starting 319 (5/16 of worker): Experiment: Experiment: RecMixES<budget=800, num_workers=100, batch_mode=True> (dim=6, param=:[0. 0. 0. 0. 0. 0.]) on Instance of MultiExperiment(function_class='MultiExperiment', name='sphere,sphere,sphere,sphere,sphere,sphere') with seed None
Error when applying Experiment: Experiment: RecMixES<budget=800, num_workers=100, batch_mode=True> (dim=6, param=:[0. 0. 0. 0. 0. 0.]) on Instance of MultiExperiment(function_class='MultiExperiment', name='sphere,sphere,sphere,sphere,sphere,sphere') with seed None:
Traceback (most recent call last):
  File "/private/home/theoajclarke/nevergrad/nevergrad/benchmark/xpbase.py", line 196, in run
    self._run_with_error()
  File "/private/home/theoajclarke/nevergrad/nevergrad/benchmark/xpbase.py", line 274, in _run_with_error
    raise e
  File "/private/home/theoajclarke/nevergrad/nevergrad/benchmark/xpbase.py", line 266, in _run_with_error
    obase.Optimizer.minimize(
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/base.py", line 624, in minimize
    self.tell(x, result)
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/base.py", line 376, in tell
    self._internal_tell_candidate(candidate, loss)
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/es.py", line 77, in _internal_tell_candidate
    self._select()
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/es.py", line 82, in _select
    choices_rank = self._rank_method(choices, n_selected=self._config.popsize)
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/multiobjective/nsga2.py", line 212, in rank
    frontiers = frontier_ranker.compute_ranking(population)
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/multiobjective/nsga2.py", line 162, in compute_ranking
    dominance_test_result = self.compare(uid2candidate[uid1], uid2candidate[uid2])
  File "/private/home/theoajclarke/nevergrad/nevergrad/optimization/multiobjective/nsga2.py", line 129, in compare
    one_wins = np.sum(candidate1.losses < candidate2.losses)
  File "/private/home/theoajclarke/nevergrad/nevergrad/parametrization/core.py", line 82, in losses
    raise RuntimeError("No loss was provided")
RuntimeError: No loss was provided
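The second failure happens inside the NSGA-II ranking, where the dominance test reads candidate.losses, and that property raises if the candidate was never told a loss. The snippet below is a simplified sketch (class and function names are ours, not nevergrad's) of the elementwise dominance comparison from nsga2.py's compare, reproducing how an un-told candidate triggers this error.

```python
import numpy as np


class Candidate:
    """Minimal stand-in for a nevergrad parameter: accessing `losses`
    before any loss was told raises, as in the traceback above."""

    def __init__(self) -> None:
        self._losses = None

    def tell(self, losses) -> None:
        self._losses = np.asarray(losses, dtype=float)

    @property
    def losses(self) -> np.ndarray:
        if self._losses is None:
            raise RuntimeError("No loss was provided")
        return self._losses


def dominance(c1: Candidate, c2: Candidate) -> int:
    """Pareto-dominance test in the style of nsga2.compare:
    -1 if c1 dominates, 1 if c2 dominates, 0 if neither does."""
    one_wins = np.sum(c1.losses < c2.losses)
    two_wins = np.sum(c2.losses < c1.losses)
    if one_wins and not two_wins:
        return -1
    if two_wins and not one_wins:
        return 1
    return 0
```

So the likely trigger is that a candidate reaches the ranking population before its multiobjective loss has been told, which points at the interplay between the ES selection step and batch-mode evaluation.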

Expected Results

I would expect the benchmarks to run normally and not produce errors.

Relevant Code

@bottler bottler changed the title Errors with some algorithms in MOO benchmarks Errors with EvolutionStrategy in MOO benchmarks Sep 10, 2021