Bug/992 precision loss #993
base: main
Conversation
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@            Coverage Diff             @@
##             main     #993      +/-   ##
==========================================
- Coverage   91.83%   91.81%   -0.03%
==========================================
  Files          77       72       -5
  Lines       11193    10528     -665
==========================================
- Hits        10279     9666     -613
+ Misses        914      862      -52
==========================================

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
This pull request is stale because it has been open for 60 days with no activity.
Description
Fixed precision loss in several functions when using the `float64` data type.

Issue/s resolved: #992

Changes proposed:
- Pass the `dtype` argument to the backend torch functions directly, instead of converting to the desired data type after the operation has run in the default torch data type.

Type of change
Due Diligence
Does this change modify the behaviour of other functions? If so, which?
No.
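The root cause described above can be sketched in a few lines. The snippet below uses NumPy purely as an illustration (the PR itself concerns the torch backend, and names no specific function): computing in a 32-bit default type and casting to `float64` afterwards bakes the 32-bit rounding error into the result, whereas requesting the target `dtype` up front avoids the lossy round-trip.

```python
import numpy as np

# Constructing the value in a 32-bit type first, then converting,
# preserves the float32 rounding error:
lossy = np.float64(np.float32(0.1))   # 0.10000000149011612

# Passing the desired dtype from the start keeps full precision:
exact = np.float64(0.1)

print(lossy == exact)        # False: the cast cannot recover the lost bits
print(abs(lossy - exact))    # ~1.49e-09
```

This is why the fix forwards `dtype` into the backend call rather than converting its output: once the intermediate result has been rounded to the default precision, a later conversion to `float64` cannot restore the discarded bits.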
skip ci