Hi!

I recently discovered Ax and am still trying to figure out what is and isn't possible with it.
We have existing data from previous observations of the parameters and results of a black-box function. So far, using the Service API, I've been able to generate suggestions for what to try next and to visualize the parameter space in interesting ways.
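For context, this is roughly how I load the existing observations today. The parameter names, the objective name, and the `historical_observations` variable are illustrative placeholders, and the exact `create_experiment` keywords may differ between Ax versions:

```python
from ax.service.ax_client import AxClient

# Search space spanning the full range of the historical data.
ax_client = AxClient()
ax_client.create_experiment(
    name="black_box_opt",
    parameters=[
        {"name": "A", "type": "range", "bounds": [0.0, 2.0]},
        {"name": "B", "type": "range", "bounds": [0.0, 2.0]},
        {"name": "C", "type": "range", "bounds": [0.0, 2.0]},
        {"name": "D", "type": "range", "bounds": [0.0, 2.0]},
    ],
    objective_name="result",
    minimize=False,
)

# Attach previously observed points and their outcomes.
for params, result in historical_observations:  # e.g. ({"A": 0.7, "B": 1.1, ...}, 1.3)
    _, trial_index = ax_client.attach_trial(parameters=params)
    ax_client.complete_trial(trial_index=trial_index, raw_data=result)

# Ask Ax what to try next over the full search space.
parameters, trial_index = ax_client.get_next_trial()
```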
What I would like to do next is generate optimal samples for which some parameters are fixed, while still leveraging all of the existing data in which those parameters take varying values. In other words, fit the GP over the wider bounds, but restrict the optimization phase to a stricter set of bounds.
For example, the existing data has variables with value ranges A: [0, 2], B: [0, 2], C: [0, 2], D: [0, 2], and I would like to generate optimal samples and visualize the conditional parameter space given that A is fixed to 1.25, while the system is free to optimize B, C, and D. If this is possible, I would like to go on to fix more parameters, e.g. A = 1.25 and B = 0.25, and optimize C and D.
Given that the variables are continuous, it is likely that not a single existing data point has exactly A = 1.25 and B = 0.25.
Is this doable, and if so, how?
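To make the goal concrete, here is roughly what I have in mind, sketched with the Developer API rather than the Service API I've been using so far. The `fixed_features`/`ObservationFeatures` combination and the `slice_values` argument to `plot_contour` are my guesses at a possible mechanism, not something I have verified works:

```python
from ax.core.observation import ObservationFeatures
from ax.modelbridge.registry import Models
from ax.plot.contour import plot_contour

# Fit the GP on *all* existing data, over the full [0, 2] bounds
# (continuing from the ax_client set up above).
experiment = ax_client.experiment
model = Models.GPEI(experiment=experiment, data=experiment.fetch_data())

# Generate a candidate with A pinned to 1.25 while B, C, and D remain free.
generator_run = model.gen(
    n=1,
    fixed_features=ObservationFeatures(parameters={"A": 1.25}),
)
print(generator_run.arms[0].parameters)

# Visualize the conditional response surface over (B, C) with A held at
# 1.25; D would be held at plot_contour's default slice value.
plot = plot_contour(
    model=model,
    param_x="B",
    param_y="C",
    metric_name="result",
    slice_values={"A": 1.25},
)
```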
So far I've tried the following (a condensed repro of both attempts is sketched after this list):

- Updating the parameter bounds after completing trials, with `ax_client.experiment.search_space.parameters['A'].update_range(0.125, 0.126)`; but `ax_client.get_next_trial()` then throws `ValueError: StandardizeY transform requires non-empty observation data.`
- Initializing the whole experiment with a search space that covers the existing data, but adding `ParameterConstraint`s meant to restrict `.get_next_trial()` to the fixed values; however, now I can't even call `complete_trial()`, since it raises `ValueError: Parameter constraint ParameterConstraint(..) is violated`.
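For reference, here is a condensed version of what I ran; the setup mirrors the sketch above, and the constraint strings in the second attempt are my reconstruction from memory rather than verified syntax:

```python
from ax.service.ax_client import AxClient

# Attempt 1: narrow the bounds of A on the existing experiment, then ask for
# a new candidate. get_next_trial() raises
#   ValueError: StandardizeY transform requires non-empty observation data.
# presumably because the historical points now fall outside the narrowed
# bounds and are dropped before the model is fit.
ax_client.experiment.search_space.parameters["A"].update_range(lower=0.125, upper=0.126)
parameters, trial_index = ax_client.get_next_trial()

# Attempt 2: a fresh experiment whose bounds still cover the existing data,
# plus constraints meant to pin A for future candidates. Attaching and
# completing the historical trials then fails with
#   ValueError: Parameter constraint ParameterConstraint(..) is violated
# because the old observations do not satisfy the constraint.
ax_client_2 = AxClient()
ax_client_2.create_experiment(
    name="black_box_opt_pinned",
    parameters=[
        {"name": name, "type": "range", "bounds": [0.0, 2.0]} for name in "ABCD"
    ],
    parameter_constraints=["A >= 1.24", "A <= 1.26"],  # my attempt at pinning A
    objective_name="result",
    minimize=False,
)
for params, result in historical_observations:
    _, trial_index = ax_client_2.attach_trial(parameters=params)
    ax_client_2.complete_trial(trial_index=trial_index, raw_data=result)
```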
Thanks in advance; Ax seems like a great library!
Agreed -- @tonyhammainen would you mind chiming in on #383 with your reason for wanting to fix parameters midway through the optimization? As @Balandat mentions here, the optimization should handle the narrowing of the search space for you.