📝Tips and Tricks in Keras

How to tune the number of epochs and batch_size in Keras Tuner?

Ke Gui
2 min read · Jun 6, 2020
Photo by Dave Gandy under the Public Domain Dedication License

Warning: there is no magic formula or Holy Grail here, though it might open the door to a new world for you.

📈Python for finance series

  1. Identifying Outliers
  2. Identifying Outliers — Part Two
  3. Identifying Outliers — Part Three
  4. Stylized Facts
  5. Feature Engineering & Feature Selection
  6. Data Transformation
  7. Fractionally Differentiated Features
  8. Data Labelling

This can be done by subclassing the Tuner class you are using and overriding run_trial. (Note that Hyperband sets the number of epochs to train for via its own logic, so if you're using Hyperband you shouldn't tune the epochs.) Here's an example with kerastuner.tuners.BayesianOptimization:

import kerastuner

class MyTuner(kerastuner.tuners.BayesianOptimization):
    def run_trial(self, trial, *args, **kwargs):
        # You can add additional HyperParameters for preprocessing and custom
        # training loops by overriding `run_trial`.
        kwargs['batch_size'] = trial.hyperparameters.Int('batch_size', 32, 256, step=32)
        kwargs['epochs'] = trial.hyperparameters.Int('epochs', 10, 30)
        # Delegate to the parent class, which runs the trial with these values.
        super(MyTuner, self).run_trial(trial, *args, **kwargs)
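
For context, here is a minimal sketch of how the subclass might be used. Note that build_model, x_train, and y_train are placeholder names assumed for illustration, not part of the original snippet:

# Hypothetical usage: build_model(hp) returns a compiled Keras model,
# and x_train / y_train are your training data.
tuner = MyTuner(
    build_model,
    objective='val_accuracy',
    max_trials=20,
    directory='my_dir',
    project_name='tune_batch_and_epochs')

tuner.search(x_train, y_train, validation_split=0.2)

Because batch_size and epochs are now injected into the fit arguments by run_trial, they are tuned alongside the model's own hyperparameters, so you don't pass them to search() yourself.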


Written by Ke Gui

An ordinary guy who wants to be the reason someone believes in the goodness of people. He lives in Brisbane, Australia, with a lovely backyard.
