Metric Constraints
The Metric Constraints feature combines the ideas of Metric Strategy and Metric Thresholds to give you more control over where SigOpt searches for desirable outcomes. For example, some stored metrics might be guardrail metrics, where you want these metrics to satisfy some baseline values. In other cases, a Multimetric experiment can be rephrased as a constrained single-metric experiment, where you want to optimize a primary metric subject to a secondary metric exceeding a certain threshold.
Example: Optimizing Accuracy with Constraint on Inference Time
We modify the neural network example from the Metric Thresholds documentation. Instead of being interested in the tradeoff between accuracy and inference time, we only want to optimize for the accuracy of the network. We now set the inference time as a Constraint Metric, with a threshold of 100 ms. The Constraint Metric feature allows you to state limitations at experiment creation to help inform SigOpt of the practical restrictions in your problem.
Defining the Metric Constraints
To assign one of your metrics as a constraint metric, set the strategy field to constraint in the desired Metric object when creating your experiment. You must also specify a threshold for the Constraint Metric. When the objective is set to minimize, observations with constraint metric values less than or equal to the threshold are considered feasible. When the objective is set to maximize, observations with constraint metric values greater than or equal to the threshold are considered feasible.
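To make the feasibility rule concrete, the illustrative helper below (a sketch for explanation only, not part of the sigopt client) expresses the check applied to each reported constraint metric value.
def is_feasible(value, threshold, objective):
  # For a minimized constraint metric, values at or below the threshold are feasible.
  if objective == "minimize":
    return value <= threshold
  # For a maximized constraint metric, values at or above the threshold are feasible.
  return value >= threshold

# Example: an observation with an inference time of 87 ms satisfies a 100 ms threshold.
print(is_feasible(87, 100, "minimize"))  # True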
Creating the Experiment
Below, we create a new experiment, using the above example of optimizing the accuracy of the network subject to an inference time constraint.
from sigopt import Connection
conn = Connection(client_token="SIGOPT_API_TOKEN")
conn.set_api_url("https://api.sigopt.com")
experiment = conn.experiments().create(
  name="Neural network with inference time constraint",
  parameters=[
    dict(
      name="learning_rate",
      bounds=dict(
        min=0.0001,
        max=1
      ),
      transformation="log",
      type="double"
    ),
    dict(
      name="nodes_per_layer",
      bounds=dict(
        min=5,
        max=20
      ),
      type="int"
    )
  ],
  metrics=[
    dict(
      name="inference_time_milliseconds",
      objective="minimize",
      strategy="constraint",
      threshold=100
    ),
    dict(
      name="validation_accuracy",
      objective="maximize",
      strategy="optimize"
    )
  ],
  observation_budget=65,
  parallel_bandwidth=2,
  project="sigopt-examples",
  type="offline"
)
print("Created experiment: https://app.sigopt.com/experiment/" + experiment.id)
Updating Thresholds
Like Metric Thresholds, Metric Constraints allow you to update the threshold on the Properties page of an experiment at any time while the experiment is in progress. Thresholds can also be updated directly through our API; an example is given below.
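The sketch below updates the inference time threshold of the experiment created above to a stricter 90 ms; it assumes the experiment update endpoint accepts a metrics list containing the new threshold values.
conn.experiments(experiment.id).update(
  metrics=[
    dict(
      name="inference_time_milliseconds",
      threshold=90
    )
  ]
)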
Feasibility
If the threshold defined for a Constraint Metric cannot be satisfied anywhere in the domain, this feature will perform unpredictably. For example, if the inference time threshold is set to 0, SigOpt will assume an inference time of 0 ms is actually achievable and will explore the domain erratically trying to find an observation that satisfies the threshold. It is best to set thresholds at the start of an experiment that are well understood from previous experiments or prior knowledge.
Best Assignments
Best Assignments List will only consider observations that satisfy all constraint metric thresholds. In the early stages of an experiment, it is possible that no observation satisfies all thresholds yet. In this situation, Best Assignments List returns an empty Pagination, and we recommend waiting until the experiment is further along to retrieve best assignments.
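For example, the sketch below fetches the best assignments for the experiment created above and checks whether any feasible observation has been found yet; it assumes the returned Pagination exposes its results through the data field.
best_assignments = conn.experiments(experiment.id).best_assignments().fetch()
if not best_assignments.data:
  # No observation satisfies all constraint metric thresholds yet.
  print("No best assignments yet; let the experiment run longer.")
else:
  for best in best_assignments.data:
    print(best.assignments)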
Recommendations
- Set the observation_budget higher. SigOpt experiments with Metric Constraints require additional Observations to understand the associated feasible parameter space, so we recommend adding 25% to your original observation_budget for each additional constraint metric (a worked example follows this list).
- Parallelism is another powerful tool for accelerating tuning by testing multiple Suggestions simultaneously. It can be used in conjunction with Metric Constraints, but each constraint metric requires extra time for SigOpt to compute the associated feasible parameter space. We recommend limiting parallelism to no more than 5 simultaneously open Suggestions.
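As a hypothetical worked example of the budget recommendation: if an unconstrained version of this experiment would need about 52 Observations, adding one constraint metric suggests roughly 52 × 1.25 = 65 Observations, which matches the observation_budget used in the experiment above.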
Limitations
During development of this feature, we have established the following requirements and guidance for using Metric Constraints successfully.
- There can be up to 4 constraint metrics for an experiment.
- Constraint metrics must define threshold values.
- All constraint metrics must be reported for successful Observations.
- No constraint metrics can be reported for Failed Observations.
- observation_budget must be set for an experiment with Metric Constraints.
- Multisolution experiments are not permitted.
- Multitask experiments are not permitted.