Credit to @jthorton for bringing this up and offering this solution!
The QCArchive queue is FIFO, so when failed optimizations are error-cycled, those older submissions jump back to the front of the line and can outcompete newer submissions for execution.
As a workaround, we can decrement the priority of each failure from its current value on every error cycle, bottoming out at 'low' priority. This should preserve forward progress for well-behaved datasets and for computations that haven't yet had a chance to execute.
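A minimal sketch of the decrement rule, assuming string priority levels ordered high → normal → low; the error-cycling loop and `client.modify_tasks` call in the comments are hypothetical placeholders, not the actual QCFractal API:

```python
# Priority levels in descending order; 'low' is the floor.
PRIORITY_ORDER = ["high", "normal", "low"]

def decrement_priority(current: str) -> str:
    """Step a task's priority down one level, bottoming out at 'low'."""
    idx = PRIORITY_ORDER.index(current)
    return PRIORITY_ORDER[min(idx + 1, len(PRIORITY_ORDER) - 1)]

# Hypothetical use during an error-cycling pass:
# for task in errored_tasks:
#     new_priority = decrement_priority(task.priority)
#     if new_priority != task.priority:
#         client.modify_tasks(id=[task.id], new_priority=new_priority)

assert decrement_priority("high") == "normal"
assert decrement_priority("normal") == "low"
assert decrement_priority("low") == "low"  # already at the floor; stays there
```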