Picking the appropriate scale when tuning hyperparameters can significantly affect the efficiency of the optimization process. SigOpt now allows users to specify a parameter transformation to indicate the change in scale.
In many machine learning applications, the learning rate hyperparameter is searched in log space. In other words, it is more efficient to sample values across the different orders of magnitude spanned by the bounds. SigOpt assumes base 10 when applying the logarithmic transformation. Below is a code snippet showing how to indicate the log transformation for the learning rate parameter inside a SigOpt experiment create call.
```python
dict(
    name="learning_rate",
    bounds=dict(min=1e-4, max=1),
    type="double",
    transformation="log",
)
```
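To see why sampling in log space helps, consider how much of the search range falls below 1e-2. The sketch below (plain Python, independent of the SigOpt API; the helper `fraction_below` is a name of our choosing) compares uniform and base-10 log-uniform sampling over the bounds above:

```python
import math

def fraction_below(threshold, lo, hi, log_scale):
    # Fraction of the search range lying below `threshold` under
    # uniform vs. base-10 log-uniform sampling over [lo, hi].
    if log_scale:
        return (math.log10(threshold) - math.log10(lo)) / (math.log10(hi) - math.log10(lo))
    return (threshold - lo) / (hi - lo)

lo, hi = 1e-4, 1.0
# Uniform sampling puts ~1% of its mass below 1e-2:
print(fraction_below(1e-2, lo, hi, log_scale=False))  # ~0.0099
# Log-uniform sampling puts half of its mass there:
print(fraction_below(1e-2, lo, hi, log_scale=True))   # 0.5
```

With plain uniform sampling, the two smallest orders of magnitude (1e-4 to 1e-2) receive almost no samples, which is why the log transformation explores such ranges far more efficiently.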
This is functionally equivalent to setting the parameter bounds in log space, i.e.,
```python
dict(
    name="log10_learning_rate",
    bounds=dict(min=-4, max=0),
    type="double",
)
```
And then manually exponentiating the assignments, e.g.,
```python
learning_rate = 10 ** suggestion.assignments['log10_learning_rate']
```
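The manual workaround can be sketched end to end without a live SigOpt connection. Here a uniform random draw stands in for the `log10_learning_rate` value a SigOpt suggestion would assign (in practice SigOpt chooses it adaptively, not at random):

```python
import random

random.seed(0)

# Stand-in for suggestion.assignments['log10_learning_rate']:
# a value SigOpt would pick from the bounds [-4, 0].
log10_lr = random.uniform(-4, 0)

# Manually exponentiate to recover the actual learning rate.
learning_rate = 10 ** log10_lr

# The result always lands inside the original bounds [1e-4, 1].
assert 1e-4 <= learning_rate <= 1.0
```

The built-in `transformation="log"` option removes this boilerplate and lets SigOpt's web plots label the axis with the true learning rate values rather than their logarithms.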
Changes in Web Visualization
The Analysis page now indicates when a parameter is set to the log transformation, and the Experiment History plots display such parameters on a log scale.