Fixed an issue where applying DirectSolver to StateIndependentsComp was breaking when used with other linear solvers under MPI. #1020
Conversation
… transcriptions. Phase now has default_linear_solver and default_nonlinear_solver options like trajectory.
… other solvers potentially unknown to us
…e of NonlinearBlockJac/PetscKrylov. The solvers are needed in the presence of MPI regardless of number of procs.
…kJac if there are direct connections.
Added auto_solvers options to Trajectory and Phases, allowing users to disable the behavior if desired. Added parallel_phases option to Trajectory, which toggles whether the top level phase container is a parallel group.
```python
@@ -684,7 +684,7 @@ def configure_solvers(self, phase):
        phase : dymos.Phase
            The phase object to which this transcription instance applies.
```
just noting the docstring here doesn't include the 'requires_solvers' arg
dymos/trajectory/trajectory.py
Outdated
```python
warn = False
if self._has_connected_phases and isinstance(self.phases, om.ParallelGroup) and MPI and self.comm.size > 1:
```
Since `self.comm.size` will never be > 1 if MPI is not active, you don't really need the `and MPI` in the `if` check.
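The reviewer's simplification can be sketched as a minimal, self-contained condition check. Here `FakeComm` is a hypothetical stand-in for an MPI communicator (a real run would use `self.comm`), chosen only so the logic is testable without MPI:

```python
class FakeComm:
    """Hypothetical stand-in for an MPI communicator; in a serial run, size is 1."""
    def __init__(self, size):
        self.size = size


def needs_coordination(has_connected_phases, is_parallel_group, comm):
    # The reviewer's point: comm.size > 1 can only hold when MPI is active,
    # so an explicit `and MPI` term in the original check is redundant.
    return has_connected_phases and is_parallel_group and comm.size > 1


print(needs_coordination(True, True, FakeComm(1)))  # False: serial run
print(needs_coordination(True, True, FakeComm(4)))  # True: parallel run with connected phases
```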
```python
if requires_solvers is not None:
    req_solvers.update(requires_solvers)

reasons_txt = '\n'.join(f'  {key}: {val}' for key, val in req_solvers.items())
```
This could move inside the `if any(req_solvers.values())` block.
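The suggested refactor can be sketched as follows. `warn_if_solvers_required` is a hypothetical helper (not the actual dymos source) showing the shape of the change: the string join only runs when a warning will actually be issued.

```python
import warnings


def warn_if_solvers_required(req_solvers):
    """Hypothetical helper illustrating the review suggestion: build
    reasons_txt only inside the `if any(req_solvers.values())` block,
    so the join is skipped entirely when no solver is required."""
    if any(req_solvers.values()):
        reasons_txt = '\n'.join(f'  {key}: {val}' for key, val in req_solvers.items())
        warnings.warn(f'Default solvers were assigned:\n{reasons_txt}')


# All reasons empty: the join never runs and no warning is issued.
warn_if_solvers_required({'solve_segments': ''})
```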
```python
if requires_solvers is not None:
    req_solvers.update(requires_solvers)

reasons_txt = '\n'.join(f'  {key}: {val}' for key, val in req_solvers.items())
```
Interesting that the user can put whatever they want in this sentence. Looks like it also prints out for every phase in the model?
The intention was to try to include a "reason" in the warning that gets printed out, so allowing the string in here seemed like the quickest way to do it.
And yes, while a "user" could do this, in this context "user" really means "transcription developer".
This would print out for every phase where this behavior is applicable. I agree that that's somewhat noisy, but users are capable of squelching it by explicitly setting the solver themselves.
```python
warn = True
phase.nonlinear_solver = om.NewtonSolver(iprint=0)
phase.nonlinear_solver.options['solve_subsystems'] = True
phase.nonlinear_solver.options['maxiter'] = 1000
```
Why is `maxiter` being raised from 100 to 1000?
I had seen one case where it was taking somewhat more than 100 iterations so I just bumped it up.
Summary
Previously dymos automatically assigned a DirectSolver to StateIndependentsComp when necessary, but this was problematic depending on the layout of the systems under MPI (mostly due to the presence of distributed systems within the ODE).
Dymos needs solvers on phases in the following conditions:

- `solve_segments` is True for one or more states in pseudospectral and Birkhoff transcriptions.
- `input_initial` is set to `True` for one or more states.
- `duration_balance` is used and therefore a residual is associated with the time duration of a phase.

By default, Dymos will automatically assign `NewtonSolver` and `DirectSolver` to the phase if one of these conditions is met.

Furthermore, if a Trajectory has its phases under a ParallelGroup and there are explicit connections between these phases, some form of solver is required to ensure that the inputs/outputs between the phases are consistent. By default, Dymos will add NonlinearBlockJac and PETScKrylov as the default solvers in `trajectory.phases` when this is the case.

The user can bypass this behavior by assigning some other solver to the phase or `Trajectory.phases`. If the user truly wants to use a RunOnce solver in these situations, then the new Trajectory and Phase option `auto_solvers` should be set to `False`.

Adds new options to Trajectory:

- `parallel_phases` dictates whether the top-level phase group is a ParallelGroup or a standard Group. Setting this to `False` allows the user to force phases to run in series when operating in parallel.
- `auto_solvers` is `True` by default, which causes dymos to attempt to place solvers where it thinks they are necessary. Setting this to `False` will disable this behavior.

New options for Phase:

- `auto_solvers` is `True` by default, which causes dymos to attempt to place solvers where it thinks they are necessary. Setting this to `False` will disable this behavior.

Related Issues

- `indep_states` #1009

Backwards incompatibilities

None

New Dependencies

None
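The solver-assignment rule described in the summary can be sketched as a small decision function. `choose_phase_solvers` is a hypothetical helper, not the dymos source; the condition names and the `auto_solvers` opt-out come from the summary above:

```python
def choose_phase_solvers(solve_segments=False, input_initial=False,
                         duration_balance=False, auto_solvers=True):
    """Sketch of the default behavior described in this PR (hypothetical
    helper): NewtonSolver/DirectSolver are assigned when any of the listed
    conditions puts a residual on the phase, unless the user has disabled
    auto_solvers."""
    if auto_solvers and (solve_segments or input_initial or duration_balance):
        return ('NewtonSolver', 'DirectSolver')
    # Otherwise leave the default (RunOnce) solvers in place.
    return (None, None)


print(choose_phase_solvers(solve_segments=True))    # ('NewtonSolver', 'DirectSolver')
print(choose_phase_solvers())                       # (None, None)
```

With `auto_solvers=False`, the function returns `(None, None)` even when a condition holds, mirroring the opt-out the PR adds.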