
sequentially register indexed nonlinear constraint-functions #3253

Closed
HerAxp opened this issue Feb 27, 2023 · 10 comments · Fixed by #3258

Comments

@HerAxp

HerAxp commented Feb 27, 2023

I have a (MI)NLP problem with many nonlinear constraints that have the same functional shape but differ in the subset of variables to which each constraint applies. I would like to ask for your help with the following: is there an efficient, known way to sequentially register indexed nonlinear constraint functions with a model and then use them to add the constraints to the model?
For illustration, assume we have an NLP with some objective and n variables.
Assume in addition that we have a collection of index sets A_1, ..., A_n of variables (e.g., A_k = [1, 2, 5, n] is the index set of the variables x_1, x_2, x_5, and x_n), and that we have a general constraint function f defined, for example, as
f(x, A) = sum_{i in A} exp(x_i^2).
I would then like to define the n constraints f(x, A_1) < 1, ..., f(x, A_n) < 1 (or any other right-hand side instead of 1) in a loop, given that A_1, ..., A_n are known and listed.
How can I efficiently register each of these constraint functions in sequence and add the corresponding constraint to the model in a loop?

Many thanks in advance for any help.

@odow
Member

odow commented Feb 27, 2023

Do something like this:

julia> using JuMP

julia> n = 5
5

julia> A = [[1, 2, 3], [1, 4, 5]]
2-element Vector{Vector{Int64}}:
 [1, 2, 3]
 [1, 4, 5]

julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> @variable(model, x[1:n])
5-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]
 x[4]
 x[5]

julia> @NLconstraint(model, [a in A], sum(exp(x[i]^2) for i in a) <= 1)
1-dimensional DenseAxisArray{NonlinearConstraintRef{ScalarShape},1,...} with index sets:
    Dimension 1, [[1, 2, 3], [1, 4, 5]]
And data, a 2-element Vector{NonlinearConstraintRef{ScalarShape}}:
 (exp(x[1] ^ 2.0) + exp(x[2] ^ 2.0) + exp(x[3] ^ 2.0)) - 1.0 ≤ 0
 (exp(x[1] ^ 2.0) + exp(x[4] ^ 2.0) + exp(x[5] ^ 2.0)) - 1.0 ≤ 0

p.s. In future, please post usage questions like this on the community forum: https://discourse.julialang.org/c/domain/opt/13. We keep the GitHub issues for bug reports and feature requests.

@HerAxp
Author

HerAxp commented Feb 27, 2023

Many thanks for your reply. Sorry, maybe I was not explicit enough: the exponential function was just an illustration. I actually have much more complicated nonlinear functions, so as far as I understand, I would need to register the function for each A_k.

@odow
Member

odow commented Feb 27, 2023

If all the A have the same length, do:

julia> using JuMP

julia> n = 5
5

julia> A = [[1, 2, 3], [1, 4, 5]]
2-element Vector{Vector{Int64}}:
 [1, 2, 3]
 [1, 4, 5]

julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> @variable(model, x[1:n])
5-element Vector{VariableRef}:
 x[1]
 x[2]
 x[3]
 x[4]
 x[5]

julia> f(x...) = sum(exp(x[i]^2) for i in 1:length(x))
f (generic function with 1 method)

julia> register(model, :f, 3, f; autodiff = true)

julia> @NLconstraint(model, [a in A], f(x[a]...) <= 1)
1-dimensional DenseAxisArray{NonlinearConstraintRef{ScalarShape},1,...} with index sets:
    Dimension 1, [[1, 2, 3], [1, 4, 5]]
And data, a 2-element Vector{NonlinearConstraintRef{ScalarShape}}:
 f(x[1], x[2], x[3]) - 1.0 ≤ 0
 f(x[1], x[4], x[5]) - 1.0 ≤ 0

Otherwise, post on the forum with a reproducible example, and we can discuss.

@HerAxp
Author

HerAxp commented Feb 27, 2023

I think this is good, many thanks for your help!!

@HerAxp
Author

HerAxp commented Feb 27, 2023

Just out of curiosity, how would you replace the register call if not all the A have the same length?

@odow
Member

odow commented Feb 28, 2023

You would need to register a differently named function for each number of input arguments.
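For instance, a minimal sketch along those lines (reusing the model, x, and f from the example above; f2 and f3 are illustrative names, one per arity that appears in your index sets):

# One registered name per number of arguments.
f(x...) = sum(exp(x[i]^2) for i in 1:length(x))
register(model, :f2, 2, f; autodiff = true)
register(model, :f3, 3, f; autodiff = true)
# Use the name whose arity matches the index set.
@NLconstraint(model, f2(x[1], x[2]) <= 1)
@NLconstraint(model, f3(x[2], x[3], x[4]) <= 1)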

@HerAxp
Author

HerAxp commented Feb 28, 2023

Thanks! Maybe this comes back to the original question: is there any way to register them and add the constraints in a loop, once for each number of input arguments?

@odow
Member

odow commented Feb 28, 2023

See https://jump.dev/JuMP.jl/stable/manual/nlp/#Raw-expression-input. You could do something like:

julia> using JuMP

julia> n = 5;

julia> A = [[1], [1, 2], [2, 3, 4], [1, 3, 4, 5]];

julia> model = Model();

julia> @variable(model, x[1:n]);

julia> funcs = Set{Symbol}();

julia> f(x...) = sum(exp(x[i]^2) for i in 1:length(x));

julia> for a in A
           key = Symbol("f$(length(a))")
           if !(key in funcs)
               push!(funcs, key)
               register(model, key, length(a), f; autodiff = true)
           end
           expr = Expr(:call, key, x[a]...)
           add_nonlinear_constraint(model, :($expr <= 1))
       end

julia> print(model)
Feasibility
Subject to
 f1(x[1]) - 1.0 ≤ 0
 f2(x[1], x[2]) - 1.0 ≤ 0
 f3(x[2], x[3], x[4]) - 1.0 ≤ 0
 f4(x[1], x[3], x[4], x[5]) - 1.0 ≤ 0

If you still have questions, can you make a post on the discourse forum with a reproducible example? That way the question and answer can be searched for by other people in the future.

@odow
Member

odow commented Mar 1, 2023

I'm adding this as an example to the documentation: #3258

@HerAxp
Author

HerAxp commented Mar 1, 2023

thanks!

@odow closed this as completed in #3258 on Mar 1, 2023