More updates for printing in JuMP v1.23
pulsipher committed Aug 13, 2024
1 parent bb2081c commit 93798d1
Showing 12 changed files with 175 additions and 149 deletions.
2 changes: 1 addition & 1 deletion docs/Project.toml
Original file line number Diff line number Diff line change
@@ -19,7 +19,7 @@ InfiniteOpt = "0.5"
Ipopt = "1.6"
HiGHS = "1"
julia = "1.6"
-JuMP = "1.22"
+JuMP = "1.23"
Literate = "2.18"
Plots = "1"
SpecialFunctions = "2"
36 changes: 18 additions & 18 deletions docs/src/guide/measure.md
@@ -342,12 +342,12 @@ tmodel = transformation_model(model);
# output
A JuMP Model
-Minimization problem with:
-Variables: 3
-Objective function type: QuadExpr
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: MIN_SENSE
+│ └ objective_function_type: QuadExpr
+├ num_variables: 3
+├ num_constraints: 0
+└ Names registered in the model: none
```
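The tree-style listing above is JuMP v1.23's new model summary format, which this commit updates the doctests to match. As a hedged sketch (assuming JuMP ≥ 1.23 is installed; the small model below is an arbitrary illustration, not one from these docs), the same style of summary can be reproduced on a plain JuMP model:

```julia
using JuMP

# A small quadratic minimization with no optimizer attached.
model = Model()
@variable(model, x[1:3] >= 0)
@objective(model, Min, sum(x[i]^2 for i in 1:3))

# In JuMP >= 1.23, `show` prints the tree-style summary seen above,
# with branches such as `├ solver: none` and `├ num_variables: 3`.
show(stdout, model)
```

The exact counts in the printed tree depend on the variables and constraints of the model being displayed.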

@@ -384,12 +384,12 @@ trans_m = transformation_model(model);
# output
A JuMP Model
-Minimization problem with:
-Variables: 5
-Objective function type: QuadExpr
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: MIN_SENSE
+│ └ objective_function_type: QuadExpr
+├ num_variables: 5
+├ num_constraints: 0
+└ Names registered in the model: none
```
Now let's look again at the number of supports, the transcription of `u`, and the
@@ -442,12 +442,12 @@ tmodel = transformation_model(model);
# output
A JuMP Model
-Minimization problem with:
-Variables: 2
-Objective function type: QuadExpr
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: MIN_SENSE
+│ └ objective_function_type: QuadExpr
+├ num_variables: 2
+├ num_constraints: 0
+└ Names registered in the model: none
```
Then we get the supports are consistent for `u` and the integral:
50 changes: 25 additions & 25 deletions docs/src/guide/model.md
@@ -32,7 +32,7 @@ Feasibility problem with:
Measures: 0
Transformation backend information:
Backend type: TranscriptionBackend
-Solver name: No optimizer attached.
+Solver: none
Transformation built and up-to-date: false
```
Ultimately, `model` will be solved via a transformation backend. By default,
@@ -52,7 +52,7 @@ Feasibility problem with:
Measures: 0
Transformation backend information:
Backend type: TranscriptionBackend
-Solver name: No optimizer attached.
+Solver: none
Transformation built and up-to-date: false
```

@@ -72,7 +72,7 @@ Feasibility problem with:
Measures: 0
Transformation backend information:
Backend type: TranscriptionBackend
-Solver name: Ipopt
+Solver: Ipopt
Transformation built and up-to-date: false
```
For completeness, the table of currently supported JuMP compatible optimizers
@@ -96,7 +96,7 @@ Feasibility problem with:
Measures: 0
Transformation backend information:
Backend type: TranscriptionBackend
-Solver name: Ipopt
+Solver: Ipopt
Transformation built and up-to-date: false
```

@@ -120,11 +120,11 @@ julia> using InfiniteOpt, Ipopt
julia> backend = TranscriptionBackend(Ipopt.Optimizer)
A TranscriptionBackend that uses a
A JuMP Model
-Feasibility problem with:
-Variables: 0
-Model mode: AUTOMATIC
-CachingOptimizer state: EMPTY_OPTIMIZER
-Solver name: Ipopt
+├ solver: Ipopt
+├ objective_sense: FEASIBILITY_SENSE
+├ num_variables: 0
+├ num_constraints: 0
+└ Names registered in the model: none
```

We query the underlying transformation backend, transformation model, and transformation
@@ -137,19 +137,19 @@ julia> using InfiniteOpt; model = InfiniteModel();
julia> tbackend = transformation_backend(model)
A TranscriptionBackend that uses a
A JuMP Model
-Feasibility problem with:
-Variables: 0
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: FEASIBILITY_SENSE
+├ num_variables: 0
+├ num_constraints: 0
+└ Names registered in the model: none
julia> tmodel = transformation_model(model)
A JuMP Model
-Feasibility problem with:
-Variables: 0
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: FEASIBILITY_SENSE
+├ num_variables: 0
+├ num_constraints: 0
+└ Names registered in the model: none
julia> data = transformation_data(model);
```
@@ -163,11 +163,11 @@ julia> set_transformation_backend(model, TranscriptionBackend(Ipopt.Optimizer))
julia> tbackend = transformation_backend(model)
A TranscriptionBackend that uses a
A JuMP Model
-Feasibility problem with:
-Variables: 0
-Model mode: AUTOMATIC
-CachingOptimizer state: EMPTY_OPTIMIZER
-Solver name: Ipopt
+├ solver: Ipopt
+├ objective_sense: FEASIBILITY_SENSE
+├ num_variables: 0
+├ num_constraints: 0
+└ Names registered in the model: none
```
Again, since `TranscriptionBackend` is the default, the following models are equivalent:
```jldoctest
@@ -187,7 +187,7 @@ Feasibility problem with:
Measures: 0
Transformation backend information:
Backend type: TranscriptionBackend
-Solver name: Ipopt
+Solver: Ipopt
Transformation built and up-to-date: false
```

36 changes: 18 additions & 18 deletions docs/src/guide/optimize.md
@@ -97,27 +97,27 @@ let's extract the transformation backend from the example above in the basic usa
julia> backend = transformation_backend(model)
A TranscriptionBackend that uses a
A JuMP Model
-Minimization problem with:
-Variables: 11
-Objective function type: AffExpr
-`AffExpr`-in-`MathOptInterface.EqualTo{Float64}`: 1 constraint
-`AffExpr`-in-`MathOptInterface.GreaterThan{Float64}`: 10 constraints
-`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 11 constraints
-Model mode: AUTOMATIC
-CachingOptimizer state: ATTACHED_OPTIMIZER
-Solver name: Ipopt
+├ solver: Ipopt
+├ objective_sense: MIN_SENSE
+│ └ objective_function_type: AffExpr
+├ num_variables: 11
+├ num_constraints: 22
+│ ├ AffExpr in MOI.EqualTo{Float64}: 1
+│ ├ AffExpr in MOI.GreaterThan{Float64}: 10
+│ └ VariableRef in MOI.GreaterThan{Float64}: 11
+└ Names registered in the model: none
julia> tmodel = transformation_model(model)
A JuMP Model
-Minimization problem with:
-Variables: 11
-Objective function type: AffExpr
-`AffExpr`-in-`MathOptInterface.EqualTo{Float64}`: 1 constraint
-`AffExpr`-in-`MathOptInterface.GreaterThan{Float64}`: 10 constraints
-`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 11 constraints
-Model mode: AUTOMATIC
-CachingOptimizer state: ATTACHED_OPTIMIZER
-Solver name: Ipopt
+├ solver: Ipopt
+├ objective_sense: MIN_SENSE
+│ └ objective_function_type: AffExpr
+├ num_variables: 11
+├ num_constraints: 22
+│ ├ AffExpr in MOI.EqualTo{Float64}: 1
+│ ├ AffExpr in MOI.GreaterThan{Float64}: 10
+│ └ VariableRef in MOI.GreaterThan{Float64}: 11
+└ Names registered in the model: none
```

The `JuMP` variable(s) stored in the transformation backend that correspond to a
60 changes: 30 additions & 30 deletions docs/src/guide/transcribe.md
@@ -68,16 +68,16 @@ julia> build_transformation_backend!(inf_model)
julia> trans_model = transformation_model(inf_model)
A JuMP Model
-Minimization problem with:
-Variables: 4
-Objective function type: AffExpr
-`AffExpr`-in-`MathOptInterface.EqualTo{Float64}`: 1 constraint
-`QuadExpr`-in-`MathOptInterface.LessThan{Float64}`: 3 constraints
-`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 3 constraints
-`VariableRef`-in-`MathOptInterface.ZeroOne`: 1 constraint
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: MIN_SENSE
+│ └ objective_function_type: AffExpr
+├ num_variables: 4
+├ num_constraints: 8
+│ ├ AffExpr in MOI.EqualTo{Float64}: 1
+│ ├ QuadExpr in MOI.LessThan{Float64}: 3
+│ ├ VariableRef in MOI.GreaterThan{Float64}: 3
+│ └ VariableRef in MOI.ZeroOne: 1
+└ Names registered in the model: none
julia> print(trans_model)
Min 2 z + y(0.0) + y(5.0) + y(10.0)
@@ -357,27 +357,27 @@ which wraps [`build_transcription_backend!`](@ref InfiniteOpt.TranscriptionOpt.b
julia> backend1 = TranscriptionBackend() # make an empty backend
A TranscriptionBackend that uses a
A JuMP Model
-Feasibility problem with:
-Variables: 0
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: FEASIBILITY_SENSE
+├ num_variables: 0
+├ num_constraints: 0
+└ Names registered in the model: none
julia> build_transformation_backend!(inf_model);
julia> backend2 = transformation_backend(inf_model) # generate from an InfiniteModel
A TranscriptionBackend that uses a
A JuMP Model
-Minimization problem with:
-Variables: 4
-Objective function type: AffExpr
-`AffExpr`-in-`MathOptInterface.EqualTo{Float64}`: 1 constraint
-`QuadExpr`-in-`MathOptInterface.LessThan{Float64}`: 3 constraints
-`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 3 constraints
-`VariableRef`-in-`MathOptInterface.ZeroOne`: 1 constraint
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: MIN_SENSE
+│ └ objective_function_type: AffExpr
+├ num_variables: 4
+├ num_constraints: 8
+│ ├ AffExpr in MOI.EqualTo{Float64}: 1
+│ ├ QuadExpr in MOI.LessThan{Float64}: 3
+│ ├ VariableRef in MOI.GreaterThan{Float64}: 3
+│ └ VariableRef in MOI.ZeroOne: 1
+└ Names registered in the model: none
```
The call to `build_transformation_backend!` is the backbone
behind infinite model transformation and is what encapsulates all the methods to
@@ -391,11 +391,11 @@ via [`transformation_model`](@ref):
```jldoctest transcribe; setup = :(empty!(inf_model.backend))
julia> transformation_model(inf_model)
A JuMP Model
-Feasibility problem with:
-Variables: 0
-Model mode: AUTOMATIC
-CachingOptimizer state: NO_OPTIMIZER
-Solver name: No optimizer attached.
+├ solver: none
+├ objective_sense: FEASIBILITY_SENSE
+├ num_variables: 0
+├ num_constraints: 0
+└ Names registered in the model: none
```
Here we observe that such a model is currently empty and hasn't been populated
yet.
2 changes: 1 addition & 1 deletion docs/src/tutorials/quick_start.md
@@ -72,7 +72,7 @@ Feasibility problem with:
Measures: 0
Transformation backend information:
Backend type: TranscriptionBackend
-Solver name: Ipopt
+Solver: Ipopt
Transformation built and up-to-date: false
```
Learn more about `InfiniteModel`s and optimizers on our
18 changes: 15 additions & 3 deletions src/TranscriptionOpt/model.jl
@@ -130,6 +130,20 @@ function TranscriptionBackend(optimizer_constructor; kwargs...)
return InfiniteOpt.JuMPBackend{Transcription}(model, TranscriptionData())
end

+# Get the solver name from MOI
+# Inspired by https://github.com/jump-dev/JuMP.jl/blob/ce946b7092c45bdac916c9b531a13a5b929d45f0/src/print.jl#L281-L291
+function _try_solver_name(model)
+    if mode(model) != JuMP.DIRECT &&
+       MOI.Utilities.state(backend(model)) == MOI.Utilities.NO_OPTIMIZER
+        return "none"
+    end
+    try
+        return MOI.get(backend(model), MOI.SolverName())
+    catch
+        return "unknown"
+    end
+end

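The helper added above falls back in three steps: "none" when no optimizer is attached, the optimizer's reported name when available, and "unknown" when the backend does not support `MOI.SolverName`. A minimal sketch of the same logic on a plain JuMP model (the name `solver_name_or_fallback` is hypothetical, introduced only for illustration; `_try_solver_name` itself is internal to `TranscriptionOpt`):

```julia
using JuMP
import MathOptInterface as MOI

# Hypothetical helper mirroring the three-way fallback of
# `_try_solver_name`: "none" without an optimizer, the reported
# name when available, "unknown" if `MOI.SolverName` errors.
function solver_name_or_fallback(model)
    if mode(model) != JuMP.DIRECT &&
       MOI.Utilities.state(backend(model)) == MOI.Utilities.NO_OPTIMIZER
        return "none"
    end
    try
        return MOI.get(backend(model), MOI.SolverName())
    catch
        return "unknown"
    end
end

model = Model()  # no optimizer attached
solver_name_or_fallback(model)  # → "none"
```

Guarding the `MOI.SolverName` query with `try`/`catch` avoids the hard failure that motivated this change: not every MOI backend implements the attribute.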
# Printing
function JuMP.show_backend_summary(
io::IO,
@@ -148,9 +162,7 @@ function JuMP.show_backend_summary(
# TODO add approximation method info (requires InfiniteOpt refactoring)
end
# solver name
-    moi_summary = sprint(JuMP.show_backend_summary, backend.model)
-    solver_str = filter(startswith("Solver"), split(moi_summary, "\n"))[1]
-    println(io, " ", solver_str)
+    println(io, " Solver: ", _try_solver_name(backend.model))
return
end

