
Multistart optimization #57

Open
amostof opened this issue Nov 11, 2022 · 3 comments
amostof commented Nov 11, 2022

Hello,

I have a multistart-optimization routine that generates initial parameter guesses, initialps, as shown below; each row is later used as the starting point for a new optimization problem.

using Distributions, LatinHypercubeSampling, Statistics

p = convert(Array{Float64,1}, 1:3)
bounds = [Vector{Float64}(undef, 2) for _ in 1:length(p)]
searchgrid = [1E-9, 1E-1]
for (ipara, para) in enumerate(p)
    bounds[ipara][1] = para * searchgrid[1]
    bounds[ipara][2] = para * searchgrid[2]
end

function latinCube(bounds, dims, nguess = 100)
    initialps = []
    # Optimised Latin hypercube plan: an nguess × dims matrix of ranks 1:nguess.
    plan, _ = LHCoptim(nguess, dims, 1000)
    plan = plan ./ nguess  # map ranks to probabilities in (0, 1]

    for i in 1:dims
        # Transform each column through the log-uniform quantile function.
        # quantile must be broadcast over the probability vector; Ref keeps
        # the distribution as a scalar under broadcasting.
        push!(initialps, quantile.(Ref(LogUniform(bounds[i][1], bounds[i][2])), plan[:, i]))
    end
    return permutedims(hcat(initialps...))
end

nguess = 100
initialps = latinCube(bounds, length(p), nguess)

My problem is that I do not know how to implement this when p is a NamedTuple of the form

p = (p₁ = 1., p₂ = fixed(2.), p₃ = bounded(3., 0, 100))

So my question is: how can I do multistart optimization using ParameterHandling?

@willtebbutt
Member

Hi @amostof. Thanks for opening the issue. Unfortunately I'm struggling to follow the intent of your code, and I can't run it because of various undefined functions (LHCoptim, LogUniform, quantile, etc.). Would you be able to provide an MWE so that I can better understand what you're trying to achieve?

@amostof
Author

amostof commented Nov 12, 2022

My bad, I forgot to add the required packages; I have edited the first message. The code creates a list of initial guesses for the parameter values, each of which I later use with the Adam optimizer to find the true parameters of the system: just multistart optimization to circumvent local minima. I can do this with plain arrays but do not know how to implement it with NamedTuples.

@willtebbutt
Member

Sorry for the slow response! It's been a busy couple of months and this dropped off my radar.

Unfortunately this isn't completely straightforward with ParameterHandling as it currently stands, because the package assumes that parameters in "flat" vector form (as in the columns of initialps) have already been mapped to R^D, the unconstrained space.

One option would instead be to construct the Latin hypercube in a bounded hyper-rectangle in R^D and use those values as the unconstrained values of your initialisation. Then you could just call value_flatten on an arbitrary p to produce the unflatten function, which will map each column of initialps to the NamedTuple objects that you need.
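A minimal sketch of that suggestion, using ParameterHandling's exported value_flatten, fixed, and bounded; the width-2 random sampling box around the flattened vector is an arbitrary illustrative stand-in for the Latin hypercube:

```julia
using ParameterHandling

# The parameter NamedTuple from the question.
p = (p₁ = 1.0, p₂ = fixed(2.0), p₃ = bounded(3.0, 0.0, 100.0))

# v lives in R^D; unflatten maps any length-D vector back to a NamedTuple of
# plain values. Note D == 2 here: fixed(2.0) contributes no free entries.
v, unflatten = value_flatten(p)

# Sample unconstrained starting points in a hyper-rectangle around v
# (a Latin hypercube over the same box would slot in here instead of rand).
starts = [v .+ 2 .* (rand(length(v)) .- 0.5) for _ in 1:5]

# Each start becomes a NamedTuple with p₂ fixed at 2.0 and p₃ inside (0, 100).
initial_values = map(unflatten, starts)
```

Each element of initial_values can then be handed to the optimizer as usual; the bounded transform guarantees p₃ stays in (0, 100) no matter what unconstrained value the hypercube produces.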

Sorry again for the slow response -- let me know if this is of any use.
