[docs] remove mention of the time-to-first-solve issue (#3929)
odow authored Jan 30, 2025
1 parent 6fc43de commit 324c3a7
Showing 3 changed files with 1 addition and 125 deletions.
38 changes: 0 additions & 38 deletions docs/src/manual/models.md
@@ -73,44 +73,6 @@ deleting different constraint types, you may need to use
[`set_optimizer`](@ref). See [Switching optimizer for the relaxed problem](@ref)
for an example of when this is useful.
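For reference, a minimal sketch of attaching or swapping the optimizer after a model is built; HiGHS stands in as an example solver and the variable is illustrative only:

```julia
using JuMP, HiGHS

model = Model()                        # build the model with no optimizer attached
@variable(model, x >= 0)
@objective(model, Min, x)
set_optimizer(model, HiGHS.Optimizer)  # attach (or swap) the solver later
optimize!(model)
```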

### Reducing time-to-first-solve latency

By default, JuMP uses [bridges](@ref LazyBridgeOptimizer) to reformulate the
model you are building into an equivalent model supported by the solver.

However, if your model is already supported by the solver, bridges add latency
(read [The "time-to-first-solve" issue](@ref)). This is particularly noticeable
for small models.

To reduce the "time-to-first-solve," try passing `add_bridges = false`.
```jldoctest
julia> model = Model(HiGHS.Optimizer; add_bridges = false);
```
or
```jldoctest
julia> model = Model();

julia> set_optimizer(model, HiGHS.Optimizer; add_bridges = false)
```

However, be wary. If your model and solver combination needs bridges, an error
will be thrown:
```jldoctest
julia> model = Model(SCS.Optimizer; add_bridges = false);

julia> @variable(model, x)
x

julia> @constraint(model, 2x <= 1)
ERROR: Constraints of type MathOptInterface.ScalarAffineFunction{Float64}-in-MathOptInterface.LessThan{Float64} are not supported by the solver.
If you expected the solver to support your problem, you may have an error in your formulation. Otherwise, consider using a different solver.
The list of available solvers, along with the problem types they support, is available at https://jump.dev/JuMP.jl/stable/installation/#Supported-solvers.
[...]
```

### Solvers which expect environments

Some solvers accept (or require) positional arguments such as a license
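A sketch of the usual pattern for such solvers, assuming Gurobi.jl (a package not touched by this commit) and a valid license; the environment is created once and reused by every optimizer built from it:

```julia
using JuMP
import Gurobi

env = Gurobi.Env()  # one licensed environment, created a single time

# Pass a zero-argument function so the optimizer is constructed from `env`.
model = Model(() -> Gurobi.Optimizer(env))
```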
@@ -62,8 +62,7 @@
# ```
# $ julia path/to/file.jl
# ```
# Use the REPL or a notebook instead, and read [The "time-to-first-solve" issue](@ref)
# for more information.
# Use the REPL or a notebook instead.

# ### Code blocks in this documentation

85 changes: 0 additions & 85 deletions docs/src/tutorials/getting_started/performance_tips.jl
@@ -34,91 +34,6 @@ import HiGHS
# variables. This is particularly important if you're learning JuMP after using
# a language like MATLAB.

# ## The "time-to-first-solve" issue

# Similar to the infamous [time-to-first-plot](https://discourse.julialang.org/t/roadmap-for-a-faster-time-to-first-plot/22956)
# plotting problem, JuMP suffers from time-to-first-solve latency. This latency
# occurs because the first time you call JuMP code in each session, Julia needs
# to compile a lot of code specific to your problem. This issue is actively being
# worked on, but there are a few things you can do to mitigate it.

# ### Suggestion 1: don't call JuMP from the command line

# In other languages, you might be used to a workflow like:
# ```
# $ julia my_script.jl
# ```
# This doesn't work for JuMP, because you have to pay the compilation latency
# every time you run the script. Instead, use one of the [suggested workflows](https://docs.julialang.org/en/v1/manual/workflow-tips/)
# from the Julia documentation.
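As an illustration of such a workflow (the file name and the `main` function below are hypothetical, not part of the original text), keep one Julia session open and pay the compilation cost only once:

```julia
julia> include("my_script.jl")  # first call in the session compiles the code

julia> main()                   # assumes my_script.jl defines a main() entry point

julia> main()                   # later calls reuse the compiled code
```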

# ### Suggestion 2: disable bridges if none are being used

# At present, the majority of the latency problems are caused by JuMP's bridging
# mechanism. If you only use constraints that are natively supported by the
# solver, you can disable bridges by passing `add_bridges = false` to
# [`Model`](@ref).

model = Model(HiGHS.Optimizer; add_bridges = false)

# ### Suggestion 3: use PackageCompiler

# As an example of compilation latency, consider the following linear program
# with two variables and two constraints:

# ```julia
# using JuMP, HiGHS
# model = Model(HiGHS.Optimizer)
# set_silent(model)
# @variable(model, x >= 0)
# @variable(model, 0 <= y <= 3)
# @objective(model, Min, 12x + 20y)
# @constraint(model, c1, 6x + 8y >= 100)
# @constraint(model, c2, 7x + 12y >= 120)
# optimize!(model)
# open("model.log", "w") do io
# print(io, solution_summary(model; verbose = true))
# return
# end
# ```

# Saving the problem in `model.jl` and calling from the command line results in:
# ```
# $ time julia model.jl
# 15.78s user 0.48s system 100% cpu 16.173 total
# ```
# Clearly, 16 seconds is a large overhead to pay for solving this trivial model.
# However, the compilation latency is independent of the problem size, and so 16
# seconds of additional overhead may be tolerable for larger models that take
# minutes or hours to solve.

# In cases where the compilation latency is intolerable, JuMP is compatible with
# the [PackageCompiler.jl](https://julialang.github.io/PackageCompiler.jl/dev/)
# package, which makes it easy to generate a custom _sysimage_ (a binary
# extension to Julia that caches compiled code) that dramatically reduces the
# compilation latency. A custom image for our problem can be created as follows:
# ```julia
# using PackageCompiler, Libdl
# PackageCompiler.create_sysimage(
#     ["JuMP", "HiGHS"],
#     sysimage_path = "customimage." * Libdl.dlext,
#     precompile_execution_file = "model.jl",
# )
# ```
# When Julia is run with the custom image, the run time is now 0.7 seconds
# instead of 16:
# ```
# $ time julia --sysimage customimage model.jl
# 0.68s user 0.22s system 153% cpu 0.587 total
# ```
# Other performance tweaks, such as disabling bridges or using direct mode, can
# reduce this time further.
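As a sketch of the direct-mode tweak mentioned above (HiGHS is used only as an example solver; direct mode skips JuMP's caching layer, so features such as bridges are unavailable):

```julia
using JuMP, HiGHS

# direct_model works on the solver's in-memory model instead of a cached copy.
model = direct_model(HiGHS.Optimizer())
set_silent(model)
@variable(model, x >= 0)
@objective(model, Min, x)
optimize!(model)
```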

# !!! note
#     `create_sysimage` only needs to be run once, and the same sysimage can be
#     used--to a slight detriment of performance--even if we modify
#     `model.jl` or run a different file.

# ## Use macros to build expressions

# Use JuMP's macros (or [`add_to_expression!`](@ref)) to build expressions.
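A small sketch of the contrast (the expression and its size are illustrative): the macro and `add_to_expression!` build or mutate a single expression, whereas repeated `+` in a loop allocates a new expression at every iteration.

```julia
using JuMP

model = Model()
@variable(model, x[1:1_000])

# Preferred: one macro call builds the whole expression efficiently.
fast = @expression(model, sum(i * x[i] for i in 1:1_000))

# Also fine: mutate a single expression in place with add_to_expression!.
function build_in_place(x)
    expr = zero(AffExpr)
    for i in eachindex(x)
        add_to_expression!(expr, i, x[i])  # adds i * x[i] without reallocating
    end
    return expr
end

# Avoid: writing `expr = expr + i * x[i]` inside a loop, which reallocates each time.
build_in_place(x)
```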
