
Linearize bilinear terms of lower dual*primal in upper objective #157

Open
wants to merge 134 commits into base: master

Conversation

Contributor

@NLaws NLaws commented Dec 28, 2021

Summary

When the upper-level objective contains bilinear products of lower-level dual and primal variables, those products can be linearized under certain conditions. This PR checks those conditions and linearizes the bilinear terms when possible. The mathematical details can be found in:
http://www.optimization-online.org/DB_HTML/2021/08/8561.html

Changes

  • add an optional boolean flag to `BilevelModel` called `linearize_bilinear_upper_terms`, which defaults to `false`.
  • add an automated test for the new capability using the existing `jump_conejo2016` test.
  • address vectorized DualOf variables not supported #158 by adding support for constructing `DualOf` vectors and `DenseAxisArray`s of constraint references
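Based on the summary above, turning the feature on might look like the following sketch. The keyword name `linearize_bilinear_upper_terms` comes from this PR's description; the solver choice and the rest of the constructor call are assumptions, since the exact signature is not shown in this thread:

```julia
using BilevelJuMP, SCIP

# Hedged sketch: linearize_bilinear_upper_terms is the new flag added in
# this PR (defaults to false). Passing SCIP.Optimizer here is an assumption,
# not something specified by the PR.
model = BilevelModel(
    SCIP.Optimizer;
    linearize_bilinear_upper_terms = true,
)
```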

Additional information

There are some additional, manual tests for the linearization process in a new file that I am keeping in the code for now. They will be useful to adapt and add to the automated tests as this new capability is expanded.
For example, the linearization process currently requires that the user provide the lower-level model in its standard form. However, I believe that we could do the conversion of the lower level to standard form for users (and I have a start on it in the new functions). I will need help with that, because I am not sure how to handle the dualization and all the maps if we convert the lower level to standard form before building the single-level model. I think we will need maps from/to: the single-level model, the dualized standard-form lower model, and the general-form primal lower model.

  • start handling the case when set AB is empty (no shared lower primal variables in the bilinear upper and lower objectives); checking with `jump_conejo2016`, but the linearized results do not match the bilinear problem results yet
  • move more methods to `src/bilinear_linearization.jl`
  • still not quite working for Conejo; seem to be getting wrong dual variable indices, and will likely need to dualize the standard-form problem for the linearization to work
  • fill out some doc strings
  • try to get SCIP working on Actions
@NLaws NLaws marked this pull request as draft December 28, 2021 23:56
@NLaws NLaws marked this pull request as ready for review December 28, 2021 23:56
NLaws commented Dec 28, 2021

trying to close and reopen to kick off Actions tests

@NLaws NLaws closed this Dec 28, 2021
@NLaws NLaws reopened this Dec 28, 2021
NLaws commented Dec 29, 2021

@joaquimg I tried to get the tests running on Actions by updating the testing framework to use `test/Project.toml`, but it is still expecting the old format:

```
ERROR: Compat `Cbc` not listed in `deps` or `extras` section.
```

I can't find any info online on how to upgrade a package to the new Pkg testing framework. At first, Actions was failing because it could not add SCIP. Some guidance on how to get the test suite running would be greatly appreciated.
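For reference, this error usually means the root `Project.toml` still has a `[compat]` entry for a package that is no longer listed in its `[deps]` or `[extras]`. A hedged sketch of the two layouts Pkg accepts (the UUID is a placeholder, not Cbc's real one):

```toml
# Old layout: test-only deps declared in the package's root Project.toml
[extras]
Cbc = "..."   # uuid placeholder

[targets]
test = ["Cbc"]

# New layout: test/Project.toml carries its own [deps] and [compat];
# any Cbc entry left in the root [compat] must then be removed, or Pkg
# raises the error quoted above.
[deps]
Cbc = "..."   # uuid placeholder

[compat]
Cbc = "1"
```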

For example, the new vectorized `DualOf` support:
```julia
m = BilevelModel();
T = 100;
nodes = [:a, :b];
@variable(Lower(m), y1[nodes, 1:T] >= 0);
@variable(Lower(m), y2[nodes, 1:T] >= 0);
@constraint(Lower(m), b[n in nodes, t in 1:T], y1[n,t] + y2[n,t] == 100);

@variable(Upper(m), lambda, DualOf(b))

2-dimensional DenseAxisArray{BilevelJuMP.BilevelVariableRef,2,...} with index sets:
    Dimension 1, [:a, :b]
    Dimension 2, Base.OneTo(100)
And data, a 2×100 Matrix{BilevelJuMP.BilevelVariableRef}:
 lambda[1]  lambda[3]  lambda[5]  lambda[7]  lambda[9]   …  lambda[193]  lambda[195]  lambda[197]  lambda[199]
 lambda[2]  lambda[4]  lambda[6]  lambda[8]  lambda[10]     lambda[194]  lambda[196]  lambda[198]  lambda[200]
```
NLaws and others added 23 commits July 25, 2022 18:13
  • `@time` with memoized and threaded `find_connected_rows_cols`: 21.606794 seconds (2.95 M allocations: 166.152 MiB, 1.61% compilation time). `@time` with generated `recursive_col_search`: 1.217451 seconds (1.41 M allocations: 74.890 MiB, 99.92% compilation time). (Tests done with the first 8 values in `upper_var_to_lower_ctr`.) This commit is kept in case we need to go back to the other methods and compare times; the next commit will (likely) switch to the generated method of `recursive_col_search` in `find_connected_rows_cols` (and perhaps use the global `V` in more places).
  • hours of run time turned into minutes (e.g. `BilevelJuMP.get_all_connected_rows_cols` went from 8.5 hours to 5.5 minutes)
  • was not returning an expression and was returning too soon
  • the `repeat` function was repeating the same instance of the vector to be filled, resulting in `nthreads` repeats in each vector (which caused slowdowns in downstream code that loops over the outputs of this function)
  • much faster using `I, J, V` from `findnz` in `find_connected_rows_cols` than prior speed-up attempts; also lower memory impact to use a custom dict for the memo cache, because we do not need to save the array input for each call of `find_connected_rows_cols`
  • sped up some methods in the linearization of bilinear terms by: memoizing `find_connected_rows_cols`, multithreading several for loops, and working with the lower-level sparse coefficient matrix in its rows/cols/vals form (from `findnz`)
  • can save time for large models if the checks have already been done: new boolean argument to `main_linearization` for checking conditions
  • only need one pair of (j, n) from set A for each block in the lower-level problem; see eq. 23 or 24 in https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9729553
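The `findnz`-based approach mentioned in the commit notes above can be illustrated with a self-contained sketch. This is not the PR's actual implementation; the function name is reused but the logic is a simplified reconstruction: starting from one row, it collects every row and column reachable through shared nonzeros, i.e. one "block" of the lower-level coefficient matrix, while working only with the `I, J` index vectors instead of slicing the `SparseMatrixCSC`:

```julia
using SparseArrays

# Simplified reconstruction (not the PR's code): find all rows and columns
# of sparse matrix A connected to `row` through shared nonzero entries.
function find_connected_rows_cols(A::SparseMatrixCSC, row::Int)
    I, J, _ = findnz(A)              # row indices, col indices, values
    rows = Set([row])
    cols = Set{Int}()
    frontier = Set([row])            # rows whose columns we have not expanded yet
    while !isempty(frontier)
        # columns with a nonzero in any frontier row
        new_cols = Set(J[k] for k in eachindex(I) if I[k] in frontier)
        setdiff!(new_cols, cols)
        union!(cols, new_cols)
        # rows with a nonzero in any newly reached column
        new_rows = Set(I[k] for k in eachindex(J) if J[k] in new_cols)
        setdiff!(new_rows, rows)
        union!(rows, new_rows)
        frontier = new_rows
    end
    return sort(collect(rows)), sort(collect(cols))
end

# Two independent blocks: rows 1-2 / cols 1-2, and row 3 / col 3.
A = sparse([1, 1, 2, 3], [1, 2, 2, 3], ones(4), 3, 3)
rows, cols = find_connected_rows_cols(A, 1)   # ([1, 2], [1, 2])
```

Memoizing this function on the starting row, as the commit notes describe, pays off because every row in the same block yields the same answer.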
Owner

joaquimg commented Oct 3, 2022

This is getting very nice.
Let me know whenever it is ready for another pass!


NLaws commented Oct 19, 2022

@joaquimg ready!


NLaws commented Nov 6, 2022

@joaquimg ^^


NLaws commented Dec 10, 2022

@joaquimg ^^
