I am trying to use nonlinear constraints with the `NLOptNoGrad` optimizer as the acquisition function optimizer. I switched to using `GN_ISRES`, since it supports both equality and inequality constraints. However, I found that even though the `NLOptNoGrad` optimizer obeys the nonlinear constraints, the best observation being reported actually violates them, because it comes from one of the initial random samples.
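To illustrate what I mean: the best observation should arguably be selected only among the points that satisfy the constraints. A minimal NumPy sketch of that filtering (the constraint function and `best_feasible` helper here are hypothetical examples of mine, not this library's API):

```python
import numpy as np

def constraint(x):
    # Example inequality constraint; feasible when the value is <= 0
    return x[0] + x[1] - 1.0

def best_feasible(X, y, tol=1e-8):
    """Return the lowest observation whose point satisfies the constraint."""
    feasible = np.array([constraint(x) <= tol for x in X])
    if not feasible.any():
        return None
    # Mask infeasible observations out before taking the minimum
    idx = np.argmin(np.where(feasible, y, np.inf))
    return X[idx], y[idx]

X = np.array([[0.9, 0.9], [0.2, 0.3], [0.4, 0.5]])
y = np.array([0.1, 0.5, 0.3])  # best raw value (0.1) comes from an infeasible point
x_best, y_best = best_feasible(X, y)
print(x_best, y_best)  # picks the feasible point with the lowest y instead
```

Something along these lines applied to the initial random samples as well would avoid reporting an infeasible incumbent.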
Are there any plans for adding more comprehensive support for constraints such that they are respected by all aspects of the optimization?
I am also wondering if there is any support for using NLOPT's "Augmented Lagrangian" to add constraints to algorithms that otherwise don't support them.