```@raw html
---
authors:
  - name: Bernhard Ahrens
    avatar: https://raw.githubusercontent.com/EarthyScience/EasyHybrid.jl/72c2fa9df829d46d25df15352a4b728d2dbe94ed/docs/src/assets/Bernhard_Ahrens.png
    link: https://www.bgc-jena.mpg.de/en/bgi/miss
  - name: Lazaro Alonso
    avatar: https://avatars.githubusercontent.com/u/19525261?v=4
    platform: github
    link: https://lazarusa.github.io
---

<Authors />
```

# Getting Started

### 1. Setup and Data Loading

Load the required packages and the synthetic dataset:

```@example hyperparameter_tuning
using EasyHybrid
using CairoMakie
using Hyperopt
```

```@example hyperparameter_tuning
ds = load_timeseries_netcdf("https://github.com/bask0/q10hybrid/raw/master/data/Synthetic4BookChap.nc")
ds = ds[1:20000, :] # Use a subset for faster execution
first(ds, 5)
```

### 2. Define the Process-based Model

The RbQ10 model describes respiration as a basal rate `rb` scaled by a Q10 temperature sensitivity:

```@example hyperparameter_tuning
function RbQ10(; ta, Q10, rb, tref = 15.0f0)
    reco = rb .* Q10 .^ (0.1f0 .* (ta .- tref))
    return (; reco, Q10, rb)
end
```
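
As a quick sanity check of the formulation (with illustrative numbers, not values from the dataset), respiration should grow by exactly a factor of `Q10` for every 10 K above the reference temperature:

```julia
# Spot-check the Q10 formula with illustrative values.
Q10, rb, tref = 2.0f0, 3.0f0, 15.0f0
reco(ta) = rb * Q10^(0.1f0 * (ta - tref))

reco(15.0f0) # at tref: the basal rate rb = 3.0
reco(25.0f0) # 10 K warmer: rb * Q10 = 6.0
reco(35.0f0) # 20 K warmer: rb * Q10^2 = 12.0
```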

### 3. Configure Model Parameters

Each parameter is specified as a tuple of `(default, lower_bound, upper_bound)`:

```@example hyperparameter_tuning
parameters = (
    rb  = (3.0f0, 0.0f0, 13.0f0), # Basal respiration [μmol/m²/s]
    Q10 = (2.0f0, 1.0f0, 4.0f0),  # Temperature sensitivity: factor by which respiration increases for a 10 K rise in temperature [-]
)
```
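
The bounds matter because a neural network's raw outputs are unbounded. A common way to enforce such bounds, sketched here with a hypothetical `scale_to_bounds` helper (EasyHybrid's internal scaling may well differ), is to squash a raw value into `(lower, upper)` with a sigmoid:

```julia
# Map an unbounded raw value into (lower, upper) via a sigmoid.
# Generic illustration only; not necessarily EasyHybrid's implementation.
sigmoid(x) = 1 / (1 + exp(-x))
scale_to_bounds(x, lower, upper) = lower + (upper - lower) * sigmoid(x)

scale_to_bounds(0.0, 1.0, 4.0)  # a raw 0 maps to the Q10 midpoint, 2.5
scale_to_bounds(-1e3, 1.0, 4.0) # very negative raw values approach the lower bound
```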

### 4. Construct the Hybrid Model

Define the input variables:

```@example hyperparameter_tuning
forcing = [:ta]                  # Forcing variables (temperature)
predictors = [:sw_pot, :dsw_pot] # Predictor variables (solar radiation)
target = [:reco]                 # Target variable (respiration)
```

Classify each parameter as global (a single value shared across all samples), neural (predicted per sample by the neural network), or fixed:

```@example hyperparameter_tuning
global_param_names = [:Q10] # Global parameters (same for all samples)
neural_param_names = [:rb]  # Neural-network-predicted parameters
```
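
To make the distinction concrete (illustrative shapes only): a neural parameter such as `rb` is predicted per sample, so it behaves like a vector, while a global parameter such as `Q10` is a single trainable scalar shared by all samples. Broadcasting combines the two inside the process model:

```julia
# Illustrative shapes for neural vs. global parameters.
rb   = [2.8f0, 3.1f0, 3.5f0]    # neural: one value per sample (would come from the NN)
Q10  = 2.0f0                    # global: one scalar shared by all samples
ta   = [15.0f0, 25.0f0, 35.0f0] # forcing (air temperature)
tref = 15.0f0

reco = rb .* Q10 .^ (0.1f0 .* (ta .- tref)) # the scalar Q10 broadcasts over samples
```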

Construct the hybrid model:

```@example hyperparameter_tuning
hybrid_model = constructHybridModel(
    predictors,               # Input features
    forcing,                  # Forcing variables
    target,                   # Target variables
    RbQ10,                    # Process-based model function
    parameters,               # Parameter definitions
    neural_param_names,       # NN-predicted parameters
    global_param_names,       # Global parameters
    hidden_layers = [16, 16], # Neural network architecture
    activation = relu,        # Activation function
    scale_nn_outputs = true,  # Scale neural network outputs
    input_batchnorm = false   # Apply batch normalization to inputs
)
```

### 5. Train the Model

```@example hyperparameter_tuning
out = train(
    hybrid_model,
    ds,
    ();
    nepochs = 100,               # Number of training epochs
    batchsize = 512,             # Batch size for training
    opt = AdamW(0.001),          # Optimizer and learning rate
    monitor_names = [:rb, :Q10], # Parameters to monitor during training
    yscale = identity,           # Scaling for outputs
    patience = 30,               # Early stopping patience
    show_progress = false,
    hybrid_name = "before"
)
```

```@raw html
<video src="../training_history_before.mp4" controls="controls" autoplay="autoplay"></video>
```

### 6. Check Results

Plot the evolution of the training and validation loss:

```@example hyperparameter_tuning
EasyHybrid.plot_loss(out, yscale = identity)
```

Inspect the fitted temperature sensitivity. What do you think: is this the true Q10 that was used to generate the synthetic dataset?

```@example hyperparameter_tuning
out.train_diffs.Q10
```

A quick scatter plot of the results; `poplot` dispatches on the output of `train`:

```@example hyperparameter_tuning
EasyHybrid.poplot(out)
```

## Hyperparameter Tuning

EasyHybrid provides built-in hyperparameter tuning capabilities to optimize your model configuration. This is especially useful for finding the best neural network architecture, optimizer settings, and other hyperparameters.

### Basic Hyperparameter Tuning

You can use the `tune` function together with Hyperopt's `@thyperopt` macro to search for optimal hyperparameters:

```@example hyperparameter_tuning
# Create an empty model specification for tuning
mspempty = ModelSpec()

# Define the hyperparameter search space and sample nhyper configurations
nhyper = 4
ho = @thyperopt for i = nhyper,
    opt = [AdamW(0.01), AdamW(0.1), RMSProp(0.001), RMSProp(0.01)],
    input_batchnorm = [true, false]

    hyper_parameters = (; opt, input_batchnorm)
    println("Hyperparameter run: ", i, " of ", nhyper, " with hyperparameters: ", hyper_parameters)

    # Run tuning with the current hyperparameters
    out = EasyHybrid.tune(
        hybrid_model,
        ds,
        mspempty;
        hyper_parameters...,
        nepochs = 10,
        plotting = false,
        show_progress = false,
        file_name = "test$i.jld2"
    )

    out.best_loss
end

# Report the best hyperparameters
ho.minimizer
printmin(ho)

# Extract the best hyperparameters for retraining
best_hyperp = best_hyperparams(ho)
```
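
Under the hood, `@thyperopt` evaluates candidate configurations and keeps the one with the smallest returned objective. The same idea, reduced to a dependency-free sketch with a hypothetical toy objective standing in for the validation loss returned by `tune`:

```julia
# Minimal grid search: evaluate every combination, keep the lowest objective.
function grid_search(lrs, batchnorms, objective)
    best = (loss = Inf, lr = NaN, batchnorm = false)
    for lr in lrs, bn in batchnorms
        l = objective(lr, bn)
        if l < best.loss
            best = (loss = l, lr = lr, batchnorm = bn)
        end
    end
    return best
end

# Hypothetical toy objective; in the real workflow this is out.best_loss.
toy_loss(lr, bn) = abs(lr - 0.01) + (bn ? 0.0 : 0.05)

best = grid_search([0.01, 0.1], [true, false], toy_loss)
```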

### Train the Model with the Best Hyperparameters

```@example hyperparameter_tuning
# Retrain with the best hyperparameters found above
out_tuned = EasyHybrid.tune(
    hybrid_model,
    ds,
    mspempty;
    best_hyperp...,
    nepochs = 100,
    monitor_names = [:rb, :Q10],
    hybrid_name = "after"
)

# Check the tuned model performance
out_tuned.best_loss
```

```@raw html
<video src="../training_history_after.mp4" controls="controls" autoplay="autoplay"></video>
```

### Key Hyperparameters to Tune

When tuning your hybrid model, consider these important hyperparameters:

- **Optimizer and Learning Rate**: Try different optimizers (AdamW, RMSProp, Adam) with various learning rates
- **Neural Network Architecture**: Experiment with different `hidden_layers` configurations
- **Activation Functions**: Test different activation functions (relu, sigmoid, tanh)
- **Batch Normalization**: Enable or disable `input_batchnorm` and other normalization options
- **Batch Size**: Adjust `batchsize` for optimal training performance

### Tips for Hyperparameter Tuning

- **Start with a small search space** to get a baseline understanding
- **Monitor for overfitting** by tracking the validation loss
- **Consider computational cost**: more hyperparameters and epochs increase training time

## More Examples

Check out the `projects/` directory for additional examples and use cases. Each project demonstrates a different aspect of hybrid modeling with EasyHybrid.