
Bug/992 precision loss #993

Open · neosunhan wants to merge 14 commits into main

Conversation

neosunhan (Collaborator)

Description

Fixed precision loss in several functions when using the float64 data type.

Issue/s resolved: #992

Changes proposed:

  • Passed the dtype argument through to the backend torch functions, instead of computing in the default torch dtype and converting to the desired data type afterwards (see the sketch below).
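
For illustration, a minimal sketch of the failure mode this PR addresses (illustrative only, not Heat's actual code): when torch computes in its default dtype (typically float32) and the result is cast to float64 afterwards, the float32 rounding error survives the cast, whereas passing dtype through avoids it.

```python
import torch

# Cast-after-the-fact: values are first rounded to float32,
# so the rounding error is baked into the float64 result.
a = torch.arange(0, 1, 1e-4).to(torch.float64)

# dtype passed through: values are computed in float64 directly.
b = torch.arange(0, 1, 1e-4, dtype=torch.float64)

# The two results differ by the float32 rounding error.
print((a - b).abs().max())  # > 0
```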

Type of change

  • Bug fix (non-breaking change which fixes an issue)

Due Diligence

  • All split configurations tested
  • Multiple dtypes tested in relevant functions
  • Documentation updated (if needed)
  • Updated changelog.md under the title "Pending Additions"

Does this change modify the behaviour of other functions? If so, which?

No.

skip ci


codecov bot commented Jul 15, 2022

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 91.81%. Comparing base (3302a44) to head (c815a29).
Report is 432 commits behind head on main.

Current head c815a29 differs from the pull request's most recent head e978b57.

Please upload reports for commit e978b57 to get more accurate results.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #993      +/-   ##
==========================================
- Coverage   91.83%   91.81%   -0.03%     
==========================================
  Files          77       72       -5     
  Lines       11193    10528     -665     
==========================================
- Hits        10279     9666     -613     
+ Misses        914      862      -52     
Flag | Coverage Δ
unit | 91.81% <100.00%> (-0.03%) ⬇️

Flags with carried forward coverage won't be shown.


neosunhan and others added 4 commits August 15, 2022 13:32
`fabs` and `matrix_norm` were also modified to explicitly cast to
float, in accordance with pre-established behaviour.
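
For context, the explicit cast preserves the pre-established rule that `fabs` returns a floating-point result even for integer input. A minimal usage sketch, assuming Heat's public `ht.fabs` and `ht.arange` APIs:

```python
import heat as ht

x = ht.arange(5)   # integer DNDarray
y = ht.fabs(x)     # fabs casts to a float dtype, matching prior behaviour
print(y.dtype)     # a heat floating-point type (e.g. ht.float32)
```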
@ClaudiaComito ClaudiaComito marked this pull request as ready for review February 9, 2023 11:13
@ClaudiaComito ClaudiaComito added this to the Repo Clean-Up milestone Jul 31, 2023
github-actions bot (Contributor)
This pull request is stale because it has been open for 60 days with no activity.

@github-actions github-actions bot added the stale label May 27, 2024
@ClaudiaComito ClaudiaComito added the bug Something isn't working label May 27, 2024
@ClaudiaComito ClaudiaComito modified the milestones: Repo Clean-Up, 1.5.0 May 27, 2024
@github-actions github-actions bot removed the stale label Jun 3, 2024

github-actions bot commented Aug 5, 2024

This pull request is stale because it has been open for 60 days with no activity.

@github-actions github-actions bot added the stale label Aug 5, 2024
@ClaudiaComito ClaudiaComito removed this from the 1.5.0 milestone Aug 26, 2024
@ClaudiaComito ClaudiaComito added this to the 1.5.1 milestone Aug 26, 2024
@github-actions github-actions bot removed the stale label Sep 2, 2024
Labels: bug (Something isn't working), GSoC
Projects: None yet
Development: Successfully merging this pull request may close issue #992, "[Bug]: Precision loss when using float64 data type".
3 participants