Problem with Fixed parameters if nonlinear_inequality_constraint is imposed #2562

Open
Abrikosoff opened this issue Jul 5, 2024 · 1 comment
Labels: question (Further information is requested)

@Abrikosoff:
Hi Ax Team,

I've been trying to implement a 'local' NChooseK constraint, where the constraint can be imposed on any contiguous segment of the parameters (e.g., with parameters x1-x6, I want 2-choose-1 on x1, x2). The code below is a runnable repro:

```python
from typing import List

import torch
from torch.quasirandom import SobolEngine

from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.service.ax_client import AxClient, ObjectiveProperties
from botorch.test_functions.multi_objective import BraninCurrin, DTLZ2

dim_of_problem = 6
q = 1

# Load our sample 2-objective problem

branin_currin = BraninCurrin(negate=True).to(
    dtype=torch.double,
    device=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
)

test_dtlz2 = DTLZ2(dim=dim_of_problem, negate=True)


def narrow_gaussian(x, ell):
    # Smooth surrogate for the indicator 1{x == 0}; ell controls the width.
    return torch.exp(-0.5 * (x / ell) ** 2)

def create_ineq_constraint(var_idx: List[int], non_zero: int):
    def ineq_constraint_on_vars(x: torch.Tensor, ell: float = 1e-3):
        """
        Each callable is expected to take a `(num_restarts) x q x d`-dim tensor as an
        input and return a `(num_restarts) x q`-dim tensor with the constraint values.
        """
        # Feasible when the value is >= 0, i.e. when at least
        # (subset_size - non_zero) entries of the slice are approximately zero.
        x_slice = x[..., var_idx[0] : var_idx[1] + 1]
        return narrow_gaussian(x_slice, ell).sum(dim=-1) - (x_slice.shape[-1] - non_zero)
    
    return ineq_constraint_on_vars
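# For intuition (illustrative numbers): with var_idx=[0, 1] and non_zero=1,
# a point like [0.0, 0.7] gives a narrow_gaussian sum of ~1, so the constraint
# value is ~1 - (2 - 1) = 0 (on the feasible boundary), while [0.5, 0.7]
# gives ~0 - 1 = -1 < 0 (infeasible); the optimizer requires value >= 0.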

def setup_ineqs(var_idxs: List[List[int]], non_zeros: List[int]):
    """
    Setup the inequality constraints for the optimization problem.
    """
    return [create_ineq_constraint(var_idx, non_zero) 
            for var_idx, non_zero in zip(var_idxs, non_zeros)]

def setup_nonlinear_constraints_list(idx_list: List[int]):
    # Each index idx constrains the pair of variables (idx, idx + 1).
    nonlinear_constraints_list = [[idx, idx + 1] for idx in idx_list]
    # One non-zero entry per pair; `[1] * len(idx_list)` rather than `[1]`,
    # which would silently truncate the zip when idx_list has several entries.
    return setup_ineqs(nonlinear_constraints_list, [1] * len(idx_list))

def get_feasible_sobol_points_multisubspace(n, 
                                            list_of_nck: List[List[int]],
                                            list_of_non_zeros: List[int],                             
                                            dim_of_problem=dim_of_problem):
    """
    Sobol sequence where we apply N-choose-K constraints to any subset of dimensions.
    
    Args:
    n: number of samples to draw
    list_of_nck: list of form [[start1, end1], [start2, end2], ...] which specifies 
                 the start and end indices (inclusive) of the variables to be constrained
    list_of_non_zeros: list of form [k1, k2, ...] which specifies the number of non-zero 
                       elements in each constrained subset
    dim_of_problem: dimension of the problem
    
    Returns:
    X: n x dim_of_problem tensor with applied constraints
    """
    # Generate Sobol sequence
    X = SobolEngine(dimension=dim_of_problem, scramble=True).draw(n).to(torch.double)
    
    # Create a mask initialized with ones
    mask = torch.ones(n, dim_of_problem)
    
    for (start, end), k in zip(list_of_nck, list_of_non_zeros):
        subset_size = end - start + 1
        
        # Generate random choices for each subset
        choices = torch.zeros(n, subset_size)
        choices[:, :k] = 1
        
        # Shuffle the choices for each sample
        for i in range(n):
            choices[i] = choices[i][torch.randperm(subset_size)]
        
        # Apply the choices to the mask
        mask[:, start:end+1] = choices
    
    # Apply mask to X
    X = X * mask

    return X
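# Example (illustrative values): with list_of_nck=[[0, 2]] and
# list_of_non_zeros=[1], each row of X has at most one non-zero entry among
# columns 0-2, e.g. [0.0000, 0.8312, 0.0000, 0.4170, 0.0905, 0.6613].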

def get_batch_initial_conditions_multisubspace(num_restarts,
                                               raw_samples,
                                               q,
                                               list_of_nck,
                                               list_of_non_zeros):
    # Draw feasible Sobol points and reshape to `raw_samples x q x d`
    X = get_feasible_sobol_points_multisubspace(
        n=raw_samples * q,
        list_of_nck=list_of_nck,
        list_of_non_zeros=list_of_non_zeros,
        dim_of_problem=dim_of_problem,
    ).reshape(raw_samples, q, dim_of_problem)
    # Randomly keep num_restarts of the raw_samples starting points
    random_indices = torch.randperm(X.shape[0])[:num_restarts]
    return X[random_indices]

# Example usage

list_of_nck = [[0, 2]]  # constrain x1..x3 (indices 0-2)
list_of_non_zeros = [1]  # at most one non-zero entry, i.e. 3-choose-1

var_idxs = list_of_nck
non_zeros = list_of_non_zeros

ineq_constraints = setup_ineqs(var_idxs, non_zeros)
batch_initial_conditions_subset = get_batch_initial_conditions_multisubspace(5, 100, q, list_of_nck, list_of_non_zeros)

generation_strategy = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=1,  # https://github.com/facebook/Ax/issues/922
            min_trials_observed=1,
            max_parallelism=6,
            model_kwargs={"seed": 9999},
        ),
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_gen_kwargs={
                "model_gen_options": {
                    "optimizer_kwargs": {
                        "nonlinear_inequality_constraints": ineq_constraints,
                        "batch_initial_conditions": batch_initial_conditions_subset,
                        "options": {
                            "batch_limit": 1,
                            "maxiter": 200,
                        },
                    },
                },
            },
        ),
    ]
)

ax_client = AxClient(generation_strategy=generation_strategy)

ax_client.create_experiment(
    name="hartmann_test_experiment",
    parameters=[
        {
            "name": "x1",
            "type": "range",
            "bounds": [0.0, 1.0],
            "value_type": "float",  # Optional, defaults to inference from type of "bounds".
            "log_scale": False,  # Optional, defaults to False.
        },
        {
            "name": "x2",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x3",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x4",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x5",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        # {
        #     "name": "x5",
        #     "type": "fixed",
        #     "value": 0.0,
        # },
        {
            "name": "x6",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
    ],
    objectives={
        # `threshold` arguments are optional
        "a": ObjectiveProperties(minimize=False, threshold=test_dtlz2.ref_point[0]),
        "b": ObjectiveProperties(minimize=False, threshold=test_dtlz2.ref_point[1]),
    },
    parameter_constraints=["x1 + x2 <= 2.0"],  # Optional.
    # outcome_constraints=["l2norm <= 1.25"],  # Optional.
)

def evaluate(parameters):
    evaluation = test_dtlz2(
        torch.tensor([parameters.get("x1"), 
                      parameters.get("x2"),
                      parameters.get("x3"),
                      parameters.get("x4"),
                      parameters.get("x5"),
                      parameters.get("x6"),])
    )
    # In our case, standard error is 0, since we are computing a synthetic function.
    # Set standard error to None if the noise level is unknown.
    return {"a": (evaluation[0].item(), 0.0), "b": (evaluation[1].item(), 0.0)}

for i in range(25):
    parameterization, trial_index = ax_client.get_next_trial()
    # Local evaluation here can be replaced with deployment to external system.
    ax_client.complete_trial(trial_index=trial_index, raw_data=evaluate(parameterization))

```

Now I have noticed that if I have a fixed parameter in my search space (as commented out above), I get an error:

```
ValueError: batch_initial_conditions.shape[-1] must be 5. The shape is torch.Size([5, 1, 6]).
```

I looked under the hood, and it seems that the last dimension of the bounds tensor must match the last dimension of batch_initial_conditions, but a fixed parameter apparently does not contribute to the bounds. Is this meant to be like this, and if yes, is there a workaround?
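To make the mismatch concrete, here is a minimal sketch of the shapes as I understand them (illustrative values only):

```python
import torch

# With x5 fixed, the RemoveFixed transform strips it, so the optimizer's
# bounds cover only the 5 remaining range parameters ...
bounds = torch.zeros(2, 5)
# ... while my hand-built initial conditions still have 6 columns.
batch_initial_conditions = torch.rand(5, 1, 6)
print(bounds.shape[-1], batch_initial_conditions.shape[-1])  # 5 vs. 6 -> ValueError
```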

@sdaulton (Contributor):
Hi @Abrikosoff, fixed parameters are typically removed via the RemoveFixed transform, since they aren't needed for modeling or generating candidates. The easiest thing to do would be to add the fixed parameter to the input of the non-linear constraint callable when you define it (knowing that it won't be passed in).
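A minimal sketch of that approach, assuming x5 is the fixed parameter (index 4 in the full 6-dim vector, value 0.0); the `with_fixed_param` helper is hypothetical:

```python
import torch

def with_fixed_param(constraint, fixed_idx: int, fixed_value: float):
    # The optimizer passes a `(num_restarts) x q x (d - 1)`-dim tensor without
    # the fixed parameter; re-insert the fixed value at `fixed_idx` before
    # evaluating the original full-dimension constraint.
    def wrapped(x: torch.Tensor) -> torch.Tensor:
        fixed_col = torch.full(
            (*x.shape[:-1], 1), fixed_value, dtype=x.dtype, device=x.device
        )
        x_full = torch.cat([x[..., :fixed_idx], fixed_col, x[..., fixed_idx:]], dim=-1)
        return constraint(x_full)

    return wrapped

ineq_constraints = [
    with_fixed_param(c, fixed_idx=4, fixed_value=0.0)
    for c in setup_ineqs(var_idxs, non_zeros)
]
```

Note that with this route, `batch_initial_conditions` would also need to be built without the fixed column (shape `num_restarts x q x 5`), since the optimizer works in the reduced space.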

Alternatively, you could remove the RemoveFixed transform and include the fixed parameter in the model's search space. The default transforms are Cont_X_trans + Y_trans, and you can set the transforms by passing a list to model_kwargs on the GenerationStep.
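A sketch of this second route, assuming Cont_X_trans, Y_trans, and RemoveFixed are importable from Ax's registry and transforms modules as shown:

```python
from ax.modelbridge.generation_strategy import GenerationStep
from ax.modelbridge.registry import Cont_X_trans, Models, Y_trans
from ax.modelbridge.transforms.remove_fixed import RemoveFixed

# Default transforms minus RemoveFixed: the fixed parameter then stays in the
# model's search space, so bounds and batch_initial_conditions both keep all
# 6 columns.
transforms = [t for t in Cont_X_trans if t is not RemoveFixed] + Y_trans

step = GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_kwargs={"transforms": transforms},
    # model_gen_kwargs with the optimizer_kwargs exactly as in the repro above
)
```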
