
applying complex constraints #2299

Open
StanleyYoo opened this issue Mar 23, 2024 · 4 comments

Comments

@StanleyYoo

Hi,
I am trying to apply complex constraints to the SearchSpace. The SearchSpace comprises six RangeParameters: x00, x01, x02, x03, x04, and x05. The constraint I need to apply is that the discriminant computed by the following procedure must be non-negative, i.e. discriminant >= 0:

from math import radians, sin, cos

# vector_from_polar is a user-defined helper converting a (magnitude, angle)
# pair into Cartesian components.
x05_rad = radians(x05)
x = x04 * sin(x05_rad)
y = x04 * cos(x05_rad)
x01_rad = radians(90 - x01)  # renamed to avoid overwriting the parameter x01
own_x_vel, own_y_vel = vector_from_polar(x00, x01_rad)
y_over_x = -(x / y)
a = 1**2 + y_over_x**2
b = (2 * own_x_vel) + (2 * y_over_x * own_y_vel)
c = own_x_vel**2 + own_y_vel**2 - x02**2
discriminant = b**2 - (4 * a * c)

However, ParameterConstraint only supports very simple (linear) constraints. How could this be resolved?

search_space = SearchSpace(
    parameters=parameters,
    parameter_constraints=parameter_constraints,
)

Thank you in advance.
Stanley
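For reference, the procedure above can be wrapped in a plain function. This is a minimal sketch: vector_from_polar is not shown in the issue, so it is assumed here to be the standard polar-to-Cartesian conversion, and x03 is omitted since it does not enter the computation.

```python
from math import radians, sin, cos


def vector_from_polar(magnitude, angle_rad):
    # Assumed helper: convert polar (magnitude, angle in radians) to
    # Cartesian (x, y) components.
    return magnitude * cos(angle_rad), magnitude * sin(angle_rad)


def discriminant(x00, x01, x02, x04, x05):
    # Reproduces the procedure from the issue text.
    x05_rad = radians(x05)
    x = x04 * sin(x05_rad)
    y = x04 * cos(x05_rad)
    x01_rad = radians(90 - x01)
    own_x_vel, own_y_vel = vector_from_polar(x00, x01_rad)
    y_over_x = -(x / y)
    a = 1**2 + y_over_x**2
    b = (2 * own_x_vel) + (2 * y_over_x * own_y_vel)
    c = own_x_vel**2 + own_y_vel**2 - x02**2
    return b**2 - (4 * a * c)
```

A candidate point is feasible exactly when `discriminant(...) >= 0`, which is the check the search space would need to encode.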

@Balandat
Contributor

As you said, the standard parameter constraints do not support complex nonlinear constraints. This is for a few reasons, not least that this makes the acquisition function optimization a lot more challenging. It also makes it a lot harder to deal with the parameter transformations that Ax applies under the hood. See https://github.com/facebook/Ax/discussions/1797#discussioncomment-6827496 for a more detailed discussion.

What is the mathematical problem you're trying to solve? Is there a way to express the constraints in other coordinates?

@StanleyYoo
Author

StanleyYoo commented Mar 25, 2024

Hi Balandat,
Thanks for your response; the discussion at #1797 is well understood. Do you have any suggestions for addressing my issue? The mathematical problem I need to solve is to confine the SearchSpace to points satisfying discriminant = b**2 - (4 * a * c) >= 0. Otherwise, the samples drawn from the SearchSpace are passed to a next step where they fail, and the entire BO process stops because that step yields an empty result. The next step only works when a real solution exists, i.e. discriminant = b**2 - (4 * a * c) >= 0. I have thought about including 'discriminant' in the SearchSpace alongside the other parameters, x00, x01, x02, x03, x04, and x05, but that hasn't worked since there is no way to express the dependence of 'discriminant' on the others. If you have any idea how to resolve this, or even how to discard/abandon samples that do not satisfy the criterion discriminant = b**2 - (4 * a * c) >= 0, please let me know!

@mgarrard mgarrard self-assigned this Apr 22, 2024
@mgarrard mgarrard removed their assignment Jun 4, 2024
@bernardbeckerman
Contributor

We've recently deprecated discussions, which unfortunately killed @Balandat's link above. For posterity I'm copying the question and answer this refers to here.

Question from Stefan2016 on Aug 25, 2023

Dear all,

I have read in the Botorch repo that nonlinear constraints are now possible. From my understanding this should now also be possible with Ax, is this correct?

E.g., could I pass x^2 + y^2 ≤ 25 as a parameter constraint in the following code?

ax_client.create_experiment(
    name="hartmann_test_experiment",
    parameters=[
        {
            "name": "x1",
            "type": "range",
            "bounds": [0.0, 1.0],
            "value_type": "float",  # Optional, defaults to inference from type of "bounds".
            "log_scale": False,  # Optional, defaults to False.
        },
        {
            "name": "x2",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x3",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x4",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x5",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x6",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
    ],
    objectives={"hartmann6": ObjectiveProperties(minimize=True)},
    parameter_constraints=["x1 + x2 <= 2.0"],  # Optional.
    outcome_constraints=["l2norm <= 1.25"],  # Optional.
)

Answer from Balandat on Aug 25, 2023

Hi @Stefan2016, unfortunately this is currently not easily possible. The string representations passed to the parameter_constraints arg are parsed internally into affine constraint objects, whose coefficients and right-hand sides are passed down to the BoTorch models. This currently does not support nonlinear expressions. The main difficulties here are:

  1. To use this in the optimization, we need to construct a callable that we can pass to the optimizer (with the default settings this would be the optimizers in scipy.optimize, which are expected to map numpy arrays to numpy floats). It's hard to do that without allowing something like Python's eval() (which we want to avoid) or doing some reasonably bespoke string parsing. Another option would be to directly pass that callable, but that has issues as well (see below).
  2. Serialization: Ax allows serializing the experiment to JSON / a database. While this works with a string representation, it wouldn't with an arbitrary Python callable defining the nonlinear constraints.
  3. Transforms: Ax does some reasonably complicated transformations of the parameters and outcomes in its modeling layer (in order to use consistent priors on model hyperparameters and avoid numerical issues). While applying these transformations to a linear constraint mapping is reasonably straightforward (and we can use the transformed constraint in the transformed space), it is not for arbitrary nonlinear constraints.

In short, there are a bunch of challenges here. We'd like to enable this feature but it's not straightforward and we don't have any concrete plans to work on it in the near future.
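To illustrate point 1 above: the kind of callable a scipy.optimize-style optimizer expects is just a function mapping a parameter vector to a float, with feasibility conventionally expressed as a non-negative return value. A minimal sketch for the x^2 + y^2 ≤ 25 constraint from the question (the name nonlinear_constraint is illustrative, not an Ax API):

```python
def nonlinear_constraint(x):
    # Feasible when the return value is >= 0, i.e. x[0]**2 + x[1]**2 <= 25.
    return 25.0 - (x[0] ** 2 + x[1] ** 2)
```

The difficulty described above is that Ax would have to construct (or accept, serialize, and transform) such a callable from the user-supplied constraint string.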

@Balandat
Contributor

Balandat commented Jun 5, 2024

@StanleyYoo since these are highly nonlinear constraints, it's not straightforward to support them easily via the AxClient API. You essentially have two options:

  1. Use our low level API and our EXPERIMENTAL support for nonlinear inequality constraints on the parameters there. See [FEATURE REQUEST] modify Ax API to allow for callable that evaluates a constraint and is passed to the optimizer #769 for a long discussion on the topic and some examples. Note that there are lots of gotchas here (e.g. that this really only works if your search space is already the unit cube and potentially others that are mentioned in the discussion).
  2. Simply define a new metric: for each parameterization, compute the discriminant and return it as the metric value, then impose an outcome constraint requiring that metric to be greater than zero. This means the surrogate model will have to learn the discriminant as a function of the parameters; it won't be super efficient, but it is the easiest to hook up.
