mloptimizer.core#
Submodules#
Package Contents#
Classes#
Optimizer | Base class for the optimization of a classifier |
KerasClassifierOptimizer | Class for the optimization of a classifier from keras.wrappers.scikit_learn.KerasClassifier. |
- class mloptimizer.core.Optimizer(estimator_class, features: numpy.array, labels: numpy.array, folder=os.curdir, log_file='mloptimizer.log', hyperparam_space: mloptimizer.hyperparams.HyperparameterSpace = None, eval_function=train_score, fitness_score='accuracy', metrics=None, seed=random.randint(0, 1000000), use_parallel=False, use_mlflow=False)[source]#
Base class for the optimization of a classifier
- estimator_class#
class of the classifier
- Type:
class
- features#
np.array with the features
- Type:
np.array
- labels#
np.array with the labels
- Type:
np.array
- hyperparam_space#
object with the hyperparameter space: fixed and evolvable hyperparams
- Type:
mloptimizer.hyperparams.HyperparameterSpace
- eval_dict#
dictionary with the evaluation of the individuals
- Type:
dict
- populations#
list of populations
- Type:
list
- logbook#
list of logbooks with the statistics of the optimization
- Type:
list
- seed#
seed for the random functions
- Type:
int
- use_parallel#
flag to use parallel processing
- Type:
bool
- use_mlflow#
flag to use mlflow
- Type:
bool
- set_mlopt_seed(seed)[source]#
Method to set the seed for the random functions
- Parameters:
seed (int) – seed for the random functions
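A seed setter like this typically seeds every random source the optimizer depends on, so that two runs with the same seed produce the same evolution. The sketch below is a hypothetical illustration of that idea (the function name and body are assumptions, not mloptimizer's actual implementation):

```python
import random

import numpy as np


def set_seed_sketch(seed: int) -> None:
    """Hypothetical sketch of a seed setter like set_mlopt_seed:
    seed each source of randomness the optimizer relies on."""
    random.seed(seed)     # stdlib RNG (used e.g. by evolutionary operators)
    np.random.seed(seed)  # NumPy RNG (used e.g. for array-based operations)


# Same seed, same random draws: the two runs below are identical.
set_seed_sketch(42)
first = [random.random() for _ in range(3)]
set_seed_sketch(42)
second = [random.random() for _ in range(3)]
print(first == second)  # True
```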
- static get_subclasses(my_class)[source]#
Method to get all the subclasses of a class (in this case used to get all the classifiers that can be optimized).
- Parameters:
my_class (class) – class to get the subclasses
- Returns:
list of subclasses
- Return type:
list
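Conceptually, a recursive subclass lookup like this walks `__subclasses__()` down the class hierarchy so that grandchildren are found too. A minimal sketch of that behavior, with a toy class hierarchy standing in for the real optimizer classes (all names here are hypothetical):

```python
def get_subclasses_sketch(my_class):
    """Sketch (assumed behavior) of a recursive subclass lookup like
    Optimizer.get_subclasses: collect direct subclasses, then recurse."""
    subclasses = []
    for sub in my_class.__subclasses__():
        subclasses.append(sub)
        subclasses.extend(get_subclasses_sketch(sub))  # include grandchildren
    return subclasses


# Toy hierarchy standing in for Optimizer and its concrete optimizers.
class BaseOptimizerDemo: ...
class TreeOptimizerDemo(BaseOptimizerDemo): ...
class KerasOptimizerDemo(BaseOptimizerDemo): ...
class TunedTreeOptimizerDemo(TreeOptimizerDemo): ...

names = [c.__name__ for c in get_subclasses_sketch(BaseOptimizerDemo)]
print(names)
# ['TreeOptimizerDemo', 'TunedTreeOptimizerDemo', 'KerasOptimizerDemo']
```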
- optimize_clf(population_size: int = 10, generations: int = 3, cxpb=0.5, mutpb=0.5, tournsize=4, indpb=0.5, n_elites=10, checkpoint: str = None, opt_run_folder_name: str = None) object [source]#
Method to optimize the classifier using the custom_ea_simple evolutionary algorithm.
- Parameters:
population_size (int, optional (default=10)) – number of individuals in each generation
generations (int, optional (default=3)) – number of generations
cxpb (float, optional (default=0.5)) – crossover probability
mutpb (float, optional (default=0.5)) – mutation probability
tournsize (int, optional (default=4)) – number of individuals to select in the tournament
indpb (float, optional (default=0.5)) – independent probability for each attribute to be mutated
n_elites (int, optional (default=10)) – number of elites to keep in the next generation
checkpoint (str, optional (default=None)) – path to the checkpoint file
opt_run_folder_name (str, optional (default=None)) – name of the folder where the execution will be saved
- Returns:
clf – classifier with the best hyperparams
- Return type:
classifier
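To show what these parameters control, here is a pure-Python sketch of a generational evolutionary loop with tournament selection, one-point crossover, per-gene mutation, and elitism. It is not mloptimizer's custom_ea_simple implementation; the toy fitness function (count of 1-bits) and all names are assumptions for illustration only:

```python
import random

random.seed(0)

GENES = 10  # length of each toy individual (a bitstring)


def fitness(ind):
    """Toy objective: maximize the number of 1-bits."""
    return sum(ind)


def tournament(pop, tournsize):
    """Pick the fittest of `tournsize` randomly sampled individuals."""
    return max(random.sample(pop, tournsize), key=fitness)


def evolve(population_size=10, generations=3, cxpb=0.5, mutpb=0.5,
           tournsize=4, indpb=0.5, n_elites=2):
    pop = [[random.randint(0, 1) for _ in range(GENES)]
           for _ in range(population_size)]
    for _ in range(generations):
        # Elitism: carry the n_elites best individuals over unchanged.
        elites = sorted(pop, key=fitness, reverse=True)[:n_elites]
        offspring = []
        while len(offspring) < population_size - n_elites:
            a, b = tournament(pop, tournsize), tournament(pop, tournsize)
            child = list(a)
            if random.random() < cxpb:           # crossover probability
                cut = random.randrange(1, GENES)
                child = a[:cut] + b[cut:]        # one-point crossover
            if random.random() < mutpb:          # mutation probability
                child = [1 - g if random.random() < indpb else g
                         for g in child]         # indpb: per-gene flip chance
            offspring.append(child)
        pop = elites + offspring
    return max(pop, key=fitness)


best = evolve()
print(fitness(best))
```

The sketch uses `n_elites=2` rather than the documented default of 10 so that elitism and offspring generation are both visible with a population of 10.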
- class mloptimizer.core.KerasClassifierOptimizer(estimator_class, features: numpy.array, labels: numpy.array, folder=os.curdir, log_file='mloptimizer.log', hyperparam_space: mloptimizer.hyperparams.HyperparameterSpace = None, eval_function=train_score, fitness_score='accuracy', metrics=None, seed=random.randint(0, 1000000), use_parallel=False, use_mlflow=False)[source]#
Bases:
mloptimizer.core.Optimizer
Class for the optimization of a classifier from keras.wrappers.scikit_learn.KerasClassifier. It inherits from Optimizer.