
Identifier for evaluation threshold #439

Closed
brandtkeller opened this issue May 24, 2024 · 0 comments · Fixed by #457
Assignees: brandtkeller
Labels: enhancement (New feature or request)

Comments

@brandtkeller (Member)

Is your feature request related to a problem? Please describe.

`lula evaluate` is designed to compare two assessments (a threshold assessment and a new assessment) and produce a pass/fail output and corresponding exit code: pass when the new assessment is more or equally compliant, fail when it is less compliant.

Currently this operates on the assumption that either two separate assessment files are being compared OR the two latest results within a single assessment file are being compared.

As Lula looks to make consolidating artifacts into a single file a first-class process, we need the ability to store many assessment results in a single object AND establish a way to identify which result is the threshold.

Example:
I may have an assessment result with a single finding (currently the threshold). I then perform another assessment (validate) which is less compliant, so I now have two results. If I validate again, the current logic would use the 2nd result (which was failing) as my new threshold. Instead, I want to keep using the original threshold so that a specific level of compliance is maintained until the assessment becomes more compliant and the threshold needs to be updated.

Describe the solution you'd like

  • Given assessment results are being written to a single file
  • When evaluate is executed
  • Then the threshold should be a specific result with some identifier

Additional context

There might be room here for props to be introduced.
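
One possible shape, purely as a sketch and not a settled design: an OSCAL prop on a result that `lula evaluate` would treat as the threshold marker. The prop name, values, UUIDs, and trimmed structure below are assumptions for illustration only.

```yaml
# Hypothetical sketch only: a prop marking which result is the threshold.
# The "threshold" prop name/value and the trimmed structure are assumptions,
# not an agreed design.
assessment-results:
  uuid: 11111111-1111-4111-8111-111111111111
  results:
    - uuid: 22222222-2222-4222-8222-222222222222
      title: Initial assessment (current threshold)
      props:
        - name: threshold
          value: "true"
    - uuid: 33333333-3333-4333-8333-333333333333
      title: Latest assessment (less compliant, not the threshold)
```

With something like this, evaluate could select the result carrying the threshold marker rather than defaulting to the most recent result.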

@brandtkeller added the enhancement label on May 24, 2024
@github-actions bot added the triage label on May 24, 2024
@brandtkeller removed the triage label on May 29, 2024
@brandtkeller self-assigned this on May 31, 2024