
Using decorators to enforce stochastic fitness

paddymccrudden edited this page Sep 6, 2017 · 3 revisions

The DEAP package generally assumes deterministic fitness functions. But what if your fitness function is stochastic? Do you have to rewrite the underlying algorithms? The answer is no!

The problem I am working on is as follows: a portfolio is represented by a gene, which in turn is simply an np.array of fixed dimension. The fitness function is stochastic in that the evaluation of a portfolio computes a portfolio metric (like return, or return/risk) over a number of samples of potential future asset returns. When you use the key functions like eaMuPlusLambda to evolve such a portfolio, a funny thing happens: since the fitness function is assumed to be deterministic, portfolios that happened to draw a good sample of returns keep getting selected.
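To make the problem concrete, here is a minimal sketch of such a stochastic fitness function (the `sample_returns` helper and its distribution parameters are hypothetical, not from the original code): each call draws a fresh set of simulated asset returns, so the same portfolio scores differently on every evaluation.

```python
import random

def sample_returns(n_assets, n_samples, rng):
    # hypothetical sampler: one simulated future return per asset per sample
    return [[rng.gauss(0.05, 0.2) for _ in range(n_assets)]
            for _ in range(n_samples)]

def evaluate(portfolio, n_samples=50, seed=None):
    # stochastic fitness: mean portfolio return over freshly drawn samples
    rng = random.Random(seed)
    samples = sample_returns(len(portfolio), n_samples, rng)
    total = sum(sum(w * r for w, r in zip(portfolio, sample))
                for sample in samples)
    return (total / n_samples,)  # DEAP-style fitness tuple

portfolio = [0.25, 0.25, 0.25, 0.25]
# two evaluations of the same portfolio disagree: the "lucky sample" problem
print(evaluate(portfolio), evaluate(portfolio))
```

An individual that happens to land a high draw here looks permanently fit to a selection operator that never re-evaluates it.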

The solution I found was to use a decorator. In the DEAP documentation, the authors note that "Tool decoration is a very powerful feature that helps to control very precise things during an evolution without changing anything in the algorithm or operators."

In my case, I added a decorator to the selection function that forces a recalculation of fitness before selection:

```python
toolbox.register("select", tools.selBest)
toolbox.decorate("select", self.recalculate_fitness())
```

The recalculation function is pretty simple too:

```python
def recalculate_fitness(self):

    def decorator(func):
        def wrapper(*args, **kwargs):

            # first delete all the stale fitness values
            pop = args[0]
            for ind in pop:
                del ind.fitness.values

            # then recalculate fitness with a fresh set of samples
            fitnesses = toolbox.map(toolbox.evaluate, pop)
            for ind, fit in zip(pop, fitnesses):
                ind.fitness.values = fit

            # then apply the selection function and return
            return func(*args, **kwargs)
        return wrapper
    return decorator
```
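DEAP aside, the wrapper's effect can be demonstrated with nothing but the standard library. The `Individual` class and `sel_best` function below are toy stand-ins (not DEAP's API) that show the same mechanics: the decorator wipes every individual's stale fitness and redraws it before selection runs.

```python
import random

class Individual(list):
    """Toy stand-in for a DEAP individual with a single fitness slot."""
    def __init__(self, genes):
        super().__init__(genes)
        self.fitness_values = None  # stand-in for ind.fitness.values

rng = random.Random(0)

def evaluate(ind):
    # noisy fitness: true score plus fresh sampling noise on every call
    return (sum(ind) + rng.gauss(0, 0.5),)

def sel_best(pop, k):
    # toy stand-in for tools.selBest: top-k by fitness
    return sorted(pop, key=lambda i: i.fitness_values, reverse=True)[:k]

def recalculate_fitness(func):
    def wrapper(*args, **kwargs):
        pop = args[0]
        for ind in pop:
            ind.fitness_values = None        # wipe stale (possibly lucky) scores
        for ind in pop:
            ind.fitness_values = evaluate(ind)  # fresh draw before selecting
        return func(*args, **kwargs)
    return wrapper

# decorate selection, just as toolbox.decorate does in DEAP
select = recalculate_fitness(sel_best)

pop = [Individual([rng.random() for _ in range(4)]) for _ in range(10)]
chosen = select(pop, 3)
print(len(chosen))  # 3
```

Because every call to `select` redraws fitness, an individual must score well repeatedly, across independent samples, to survive selection round after round.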

I don't know if this is optimal, but it works very well!
