Invalid variable ordering with old nonlinear interface #146

Closed · andrewrosemberg opened this issue Feb 27, 2024 · 10 comments · Fixed by #147

@andrewrosemberg commented Feb 27, 2024

I have found an error when modifying one of the test cases: if a parameter is created before the other decision variables, the model errors when optimizing.

using JuMP, Ipopt
import MathOptInterface as MOI
import ParametricOptInterface

ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
cached =
    () -> MOI.Bridges.full_bridge_optimizer(
        MOI.Utilities.CachingOptimizer(
            MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
            ipopt,
        ),
        Float64,
    )
POI_cached_optimizer() = ParametricOptInterface.Optimizer(cached())
model = Model(() -> POI_cached_optimizer())
@variable(model, z in MOI.Parameter(10.0)) # Parameter creation that causes the error
@variable(model, x)
@variable(model, y)
@constraint(model, x >= z)
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
optimize!(model)

This errors, while the following doesn't:

ipopt = Ipopt.Optimizer()
MOI.set(ipopt, MOI.RawOptimizerAttribute("print_level"), 0)
cached =
    () -> MOI.Bridges.full_bridge_optimizer(
        MOI.Utilities.CachingOptimizer(
            MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
            ipopt,
        ),
        Float64,
    )
POI_cached_optimizer() = ParametricOptInterface.Optimizer(cached())
model = Model(() -> POI_cached_optimizer())
@variable(model, x)
@variable(model, y)
@variable(model, z in MOI.Parameter(10.0)) # No more error
@constraint(model, x >= z)
@constraint(model, x + y >= z)
@NLobjective(model, Min, x^2 + y^2)
optimize!(model)
@odow (Member) commented Feb 28, 2024

A smaller example is:

julia> using JuMP, Ipopt

julia> import ParametricOptInterface as POI

julia> function main()
           model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
           @variable(model, z in Parameter(0))
           @variable(model, x)
           @NLobjective(model, Min, x)
           optimize!(model)
       end
main (generic function with 1 method)

julia> main()
This is Ipopt version 3.14.14, running with linear solver MUMPS 5.6.2.

Number of nonzeros in equality constraint Jacobian...:        0
Number of nonzeros in inequality constraint Jacobian.:        0
Number of nonzeros in Lagrangian Hessian.............:        0

ERROR: BoundsError: attempt to access 1-element Vector{Float64} at index [2]
Stacktrace:
  [1] getindex
    @ ./essentials.jl:13 [inlined]
  [2] _forward_eval(f::MathOptInterface.Nonlinear.ReverseAD._FunctionStorage, d::MathOptInterface.Nonlinear.ReverseAD.NLPEvaluator, x::Vector{…})

This looks like a mismatch caused by shuffling the variable indices with @NL. The examples in the tests worked because they added the parameters after the decision variables (accidentally? intentionally?).

I don't think there is a simple fix for this. The NLPBlock interface is pretty much hard-wired to assume a given ordering of the variables.
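
To illustrate the suspected index shift (this is a reading of the BoundsError above, not POI's internals): the parameter takes the first MOI variable index and the decision variable the second, but only the decision-variable values appear to reach the legacy evaluator.

using JuMP, Ipopt
import ParametricOptInterface as POI

model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
@variable(model, z in Parameter(0))  # takes the first variable index
@variable(model, x)                  # takes the second variable index
@show index(x)  # MOI.VariableIndex(2)
# The legacy NLPBlock expression refers to x by this second index, yet the
# evaluator is handed a 1-element vector of decision-variable values, which
# matches the "access 1-element Vector{Float64} at index [2]" error above.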

But you can switch to the new nonlinear interface:

julia> function main_expr()
           model = Model(() -> POI.Optimizer(Ipopt.Optimizer()))
           set_silent(model)
           @variable(model, z in Parameter(10.0))
           @variable(model, x)
           @variable(model, y)
           @constraint(model, x >= z)
           @constraint(model, x + y >= z)
           @objective(model, Min, x^2 + y^2)
           optimize!(model)
           @show objective_value(model)
           set_parameter_value(z, 2.0)
           optimize!(model)
           @show objective_value(model)
           return
       end
main_expr (generic function with 1 method)

julia> main_expr()
objective_value(model) = 99.99999800270126
objective_value(model) = 3.999999926847757

But if you're using Ipopt, it supports Parameter directly:

julia> function main_ipopt()
           model = Model(Ipopt.Optimizer)
           set_silent(model)
           @variable(model, z in Parameter(10.0))
           @variable(model, x)
           @variable(model, y)
           @constraint(model, x >= z)
           @constraint(model, x + y >= z)
           @objective(model, Min, x^2 + y^2)
           optimize!(model)
           @show objective_value(model)
           set_parameter_value(z, 2.0)
           optimize!(model)
           @show objective_value(model)
           return
       end
main_ipopt (generic function with 1 method)

julia> main_ipopt()
objective_value(model) = 99.99999980286509
objective_value(model) = 3.999999962453093

@odow changed the title from "POI ERROR with NLP" to "Invalid variable ordering with old nonlinear interface" on Feb 28, 2024
@odow (Member) commented Feb 28, 2024

Perhaps POI should error if it detects an NLPBlock.
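
A minimal sketch of what such a guard could look like (hypothetical; this is not the actual fix merged in #147):

import MathOptInterface as MOI
import ParametricOptInterface as POI

# Hypothetical guard, not POI's real code: reject the legacy NLPBlock
# attribute up front so users get a clear error instead of wrong results.
function MOI.set(model::POI.Optimizer, ::MOI.NLPBlock, ::MOI.NLPBlockData)
    return error(
        "The legacy nonlinear interface (@NLobjective/@NLconstraint) is " *
        "not supported together with parameters; use @objective and " *
        "@constraint with nonlinear expressions instead.",
    )
end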

@andrewrosemberg (Author)

That makes sense.

Without using POI, I also found another error when combining NLPBlock, MOI.Parameter, and Ipopt (which could come from the same cause). Using parameters on the rhs gives a drastically different solution even when each parameter's value is 1.0, so the rhs is numerically unchanged. With the new API this error also disappears.

The minimal working example is a bit convoluted but simple to run. We just need PowerModels 0.19 (which still used the old NLP API):

using JuMP
using PowerModels # v0.19.0 for old NLP API
using PGLib
using Ipopt


optimizer = Ipopt.Optimizer

matpower_case_name = "6468_rte" # alternatives: "pglib_opf_case5_pjm", "6468_rte"
network_data = make_basic_network(pglib(matpower_case_name))

function jump(network_data)
    # The problem to iterate over
    model = JuMP.Model(optimizer)

    pm = instantiate_model(
        network_data,
        ACPPowerModel,
        PowerModels.build_opf;
        setting=Dict("output" => Dict("branch_flows" => true, "duals" => true)),
        jump_model=model,
    )

    JuMP.optimize!(model)

    @info "Status" JuMP.termination_status(model)
    return JuMP.objective_value(model)
end

function jump_with_parameter(network_data)
    # The problem to iterate over
    model = JuMP.Model(optimizer)

    # Scale each load by a parameter (initial value 1.0 leaves the data unchanged)
    num_loads = length(network_data["load"])

    @variable(model, load_scaler[i=1:num_loads] in MOI.Parameter.(1.0))

    for (str_i, l) in network_data["load"]
        i = parse(Int, str_i)
        l["pd"] = load_scaler[i] * l["pd"]
        l["qd"] = load_scaler[i] * l["qd"]
    end

    pm = instantiate_model(
        network_data,
        ACPPowerModel,
        PowerModels.build_opf;
        setting=Dict("output" => Dict("branch_flows" => true, "duals" => true)),
        jump_model=model,
    )

    JuMP.optimize!(model)

    @info "Status" JuMP.termination_status(model)
    return JuMP.objective_value(model)
end

objective_normal_jump = jump(network_data) # 2.05e6

objective_parameter_jump = jump_with_parameter(network_data) # 5.83e5

@odow (Member) commented Feb 28, 2024

Ah. We probably need to throw an error if parameters are combined with the old nonlinear API. I don't know if that is tested.

@odow (Member) commented Feb 28, 2024

See jump-dev/Ipopt.jl#406. You just can't mix the two.

@andrewrosemberg (Author) commented May 8, 2024

I am trying to get the dual of the parameter when using Ipopt but failing miserably.

Modifying @odow's example a little bit:

function main_ipopt()
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    @variable(model, z in Parameter(10.0))
    @variable(model, x)
    @variable(model, y)
    @constraint(model, x >= z)
    @constraint(model, sin(x) + y >= z) # modified constraint
    @objective(model, Min, x^2 + y^2)
    optimize!(model)
    @show objective_value(model)
    set_parameter_value(z, 2.0)
    optimize!(model)
    @show objective_value(model)
    @show dual(z)
    return
end

I get (as expected) the error:

ERROR: To query the dual variables associated with a variable bound, first obtain a constraint reference using one of `UpperBoundRef`, `LowerBoundRef`, or `FixRef`, and then call `dual` on the returned constraint reference.

But none of these work (e.g., dual(FixRef(z))).

In POI I normally run: MOI.get(JuMP.owner_model(z), POI.ParameterDual(), z)

But Ipopt is not currently working with POI on a nonlinear problem. If I set the solver like:

ipopt = Ipopt.Optimizer()
cached =
    () -> MOI.Bridges.full_bridge_optimizer(
        MOI.Utilities.CachingOptimizer(
            MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
            ipopt,
        ),
        Float64,
    )
POI_cached_optimizer() = POI.Optimizer(cached())
set_optimizer(model, () -> POI_cached_optimizer())

I get:

ERROR: MathOptInterface.UnsupportedConstraint{MathOptInterface.ScalarNonlinearFunction, MathOptInterface.EqualTo{Float64}}: `MathOptInterface.ScalarNonlinearFunction`-in-`MathOptInterface.EqualTo{Float64}` constraint is not supported by the model.

But I am a bit lost on where to start to solve this.

@andrewrosemberg (Author)

Could (and should) I try implementing something like POI's ParameterDual in MOI?

ParameterDual <: MOI.AbstractVariableAttribute

@odow (Member) commented May 8, 2024

MathOptInterface does not support computing the dual of a Parameter. If you need this, use a fixed variable instead.

@andrewrosemberg (Author)

I changed the parameters whose duals I need into constraints instead, but I think this breaks POI:

using JuMP
using Gurobi
import ParametricOptInterface as POI

model = JuMP.Model(() -> POI.Optimizer(Gurobi.Optimizer()))

@variable(model, x <= 1)
@variable(model, y)
@variable(model, z in MOI.Parameter(1.0))

@objective(model, Min, 10 * x + 1000 * y)

c = @constraint(model, x + y == 1 + z) # the 1 represents the parameter whose dual I want; z is just another parameter.

JuMP.optimize!(model)

@info println(model) termination_status(model) value(x) value(y) value(z)

set_normalized_rhs(c, 0.0)

JuMP.optimize!(model)

@info println(model) termination_status(model) value(x) value(y) value(z)

set_parameter_value(z, 2)

JuMP.optimize!(model)

@info println(model) termination_status(model) value(x) value(y) value(z)

I am avoiding JuMP.fix and instead using just the RHS, to simplify my API.

@odow (Member) commented May 10, 2024

I am avoiding the JuMP.fix

To get the dual of a particular parameter, you really need to use @variable(model, z == 1) and reduced_cost(z).
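
A minimal sketch of that workaround, adapting the earlier nonlinear example (the function name main_fixed is mine):

using JuMP, Ipopt

function main_fixed()
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    @variable(model, z == 10.0)  # fixed variable instead of a Parameter
    @variable(model, x)
    @variable(model, y)
    @constraint(model, x >= z)
    @constraint(model, sin(x) + y >= z)
    @objective(model, Min, x^2 + y^2)
    optimize!(model)
    @show objective_value(model)
    @show reduced_cost(z)  # for a fixed variable, this is dual(FixRef(z))
    fix(z, 2.0)            # update the value, like set_parameter_value
    optimize!(model)
    @show reduced_cost(z)
    return
end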
