
Checks for variables with None value in NLP #1564

Open · wants to merge 6 commits into base: main

Conversation

@dallan-keylogic (Contributor)

Fixes

  • Adds checks to diagnostic methods to catch uninitialized variables before running numerical diagnostics

Summary/Motivation:

This PR addresses #1522 by adding checks for uninitialized variables in the NLP object, along with new functions in model_statistics and a new DiagnosticsToolbox method, display_variables_with_none_value_in_activated_constraints, which lists the offending variables in your model.
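As an illustration of the kind of check the new method performs, here is a minimal, self-contained sketch. This is plain Python, not the actual IDAES implementation; the function name and data structures are hypothetical stand-ins for the NLP/constraint objects:

```python
def variables_with_none_value_in_activated_constraints(var_values, constraints):
    """Hypothetical re-implementation of the check this PR adds:
    collect variables that appear in an activated constraint but
    whose value is still None (i.e. uninitialized)."""
    offending = set()
    for con in constraints:
        if not con.get("active", True):
            continue  # deactivated constraints are skipped
        for name in con["vars"]:
            if var_values.get(name) is None:
                offending.add(name)
    return sorted(offending)

# Toy model data: x is initialized, y is not; only the first constraint is active.
var_values = {"x": 1.0, "y": None, "z": None}
constraints = [
    {"vars": ["x", "y"], "active": True},   # references uninitialized y
    {"vars": ["z"], "active": False},       # deactivated, so z is ignored
]
print(variables_with_none_value_in_activated_constraints(var_values, constraints))
# ['y']
```

The point of running this before numerical diagnostics is that a None value would otherwise surface later as an opaque error (or NaN) deep inside the numerical routines.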

Legal Acknowledgement

By contributing to this software project, I agree to the following terms and conditions for my contribution:

  1. I agree my contributions are submitted under the license terms described in the LICENSE.txt file at the top level of this directory.
  2. I represent I am authorized to make the contributions and grant the license. If my employer has rights to intellectual property that includes these contributions, I represent that I have received permission to make contributions and grant the required license on behalf of that employer.

@dallan-keylogic dallan-keylogic changed the title Report numerical issues uninitialized Checks for variables with None value in NLP Jan 31, 2025
@codecov-commenter
codecov-commenter commented Jan 31, 2025

Codecov Report

Attention: Patch coverage is 92.30769% with 2 lines in your changes missing coverage. Please review.

Project coverage is 77.06%. Comparing base (d43c5dd) to head (2cd1949).

Files with missing lines Patch % Lines
idaes/core/util/model_diagnostics.py 88.88% 1 Missing and 1 partial ⚠️
Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1564   +/-   ##
=======================================
  Coverage   77.06%   77.06%           
=======================================
  Files         389      389           
  Lines       62680    62706   +26     
  Branches    10276    10280    +4     
=======================================
+ Hits        48303    48324   +21     
- Misses      11939    11944    +5     
  Partials     2438     2438           


@agarciadiego (Contributor) left a comment

A typo and a change in text; otherwise, this looks good to me.

with pytest.raises(
    RuntimeError,
    match=self.err_msg(3),
):
    diag_tbx.display_problematic_constraint_terms(model.con_y[1])
@adam-a-a (Contributor) commented Feb 6, 2025

@dallan-keylogic Just curious--and we could easily loop through constraints ourselves--but is there another method that displays problematic constraint terms for all active constraints? Unrelated to this PR, but I figured I'd ask before checking.


Or alternatively, do we have a method that simply displays a list of constraints that have been identified as having problematic terms?

A Member replied:

@adam-a-a There are separate methods for finding all constraints with either mismatched terms or potential cancellations. This method is here to provide more detail on one specific constraint, as displaying all information for all constraints upfront would be overwhelming.
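The looping approach mentioned above can be sketched as follows. The toolbox class here is a toy stand-in for illustration only (the real DiagnosticsToolbox method prints a detailed term-by-term report rather than recording names), and the constraint names are hypothetical:

```python
class ToyDiagnosticsToolbox:
    """Stand-in for the real toolbox: records which constraints were
    inspected instead of printing a term-by-term report."""

    def __init__(self):
        self.inspected = []

    def display_problematic_constraint_terms(self, con):
        self.inspected.append(con)

# Hypothetical list of activated constraints (names only, for illustration);
# with Pyomo one would iterate the model's active Constraint objects instead.
active_constraints = ["con_x[1]", "con_y[1]", "con_y[2]"]

tbx = ToyDiagnosticsToolbox()
for con in active_constraints:
    tbx.display_problematic_constraint_terms(con)

print(tbx.inspected)
# ['con_x[1]', 'con_y[1]', 'con_y[2]']
```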

@ksbeattie ksbeattie added Priority:Normal Normal Priority Issue or PR bug Something isn't working labels Feb 6, 2025
@adam-a-a (Contributor) left a comment

LGTM! The only thing that seems to be missing is documentation for the new DiagnosticsToolbox / model_diagnostics methods and functions. Is this already handled by the automated documentation generated from docstrings? If that is all we need, that's fine, but if anything more should be described explicitly in the documentation, we can open an issue to remind us.


adam-a-a commented Feb 6, 2025

Just double-checked readthedocs and see that the automated documentation is sufficient!

@ksbeattie ksbeattie added the CI:run-integration triggers_workflow: Integration label Feb 6, 2025
@idaes-build idaes-build removed the CI:run-integration triggers_workflow: Integration label Feb 6, 2025