📝Tips and Tricks in Keras
Warning: There is no magical formula or Holy Grail here, though it might open the door to a new world for you.
📈Python for finance series
- Identifying Outliers
- Identifying Outliers — Part Two
- Identifying Outliers — Part Three
- Stylized Facts
- Feature Engineering & Feature Selection
- Data Transformation
- Fractionally Differentiated Features
- Data Labelling
Tuning hyperparameters such as the batch size or the number of training epochs can be done by subclassing the `Tuner` class you are using and overriding `run_trial`. (Note that `Hyperband` sets the number of epochs to train for via its own logic, so if you're using `Hyperband` you shouldn't tune the epochs.) Here's an example with `kt.tuners.BayesianOptimization`:
import kerastuner

class MyTuner(kerastuner.tuners.BayesianOptimization):
    def run_trial(self, trial, *args, **kwargs):
        # You can add additional HyperParameters for preprocessing and custom
        # training loops by overriding `run_trial`.
        kwargs['batch_size'] = trial.hyperparameters.Int('batch_size', 32, 256, step=32)
        kwargs['epochs'] = trial.hyperparameters.Int('epochs', 10, 30)
        super(MyTuner, self).run_trial(trial, *args, **kwargs)
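To see how this fits together, here is a minimal sketch of how such a tuner might be used. The `build_model` function, the layer sizes, the data variables (`x_train`, `y_train`, `x_val`, `y_val`) and the tuner arguments (`max_trials`, `directory`, `project_name`) are illustrative assumptions, not part of the original example. The key point is that `batch_size` and `epochs` are not passed to `search`, because the overridden `run_trial` injects them for each trial.

from tensorflow import keras

def build_model(hp):
    # Hypothetical model: the layer width is itself a tuned hyperparameter.
    model = keras.Sequential([
        keras.layers.Dense(hp.Int('units', 32, 128, step=32), activation='relu'),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer='adam', loss='mse')
    return model

# Takes the same arguments as the BayesianOptimization tuner it subclasses.
tuner = MyTuner(
    build_model,
    objective='val_loss',
    max_trials=10,
    directory='tuning',
    project_name='batch_and_epochs',
)

# x_train, y_train, x_val, y_val are assumed to be NumPy arrays prepared elsewhere.
# Don't pass `batch_size` or `epochs` here; `run_trial` sets them per trial.
tuner.search(x_train, y_train, validation_data=(x_val, y_val))

best_model = tuner.get_best_models(num_models=1)[0]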