
Dev #634

Merged
merged 16 commits into from
Jan 10, 2024

Conversation

patrick-kidger
Owner

No description provided.

packquickly and others added 16 commits December 28, 2023 03:05
* Add weight normalization

* vmap over axes instead of for-loop to norm over multiple axes, docs for weight_norm

* Removed defaulted arguments, simplified `_norm_except_axis`, silenced `beartype` warning.

* Remove `self.v` from `WeightNorm`, use `self.layer.weight` directly

---------

Co-authored-by: boris <[email protected]>
Co-authored-by: Patrick Kidger <[email protected]>
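Weight normalization reparameterizes a weight as `w = g * v / ||v||`, optionally norming over multiple axes. A minimal NumPy sketch of the idea (the `weight_norm` helper and its signature are illustrative only, not Equinox's actual `WeightNorm` API):

```python
import numpy as np

def weight_norm(v, g, axis=None):
    # Reparameterize: w = g * v / ||v||, where the norm is taken over
    # `axis` (all axes if None). `g` sets the resulting norm directly,
    # decoupling the weight's magnitude from its direction.
    norm = np.sqrt(np.sum(v * v, axis=axis, keepdims=True))
    return g * v / norm

# The direction comes from v; the magnitude is exactly g.
v = np.array([3.0, 4.0])
w = weight_norm(v, g=2.0)  # norm of w is 2.0 regardless of ||v||
```

Norming "except an axis" (as in `_norm_except_axis`) then just means passing the complementary axes to `axis`.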
This is to fix a crash on TPUs, see #628.
This is to be more in line with the abstract/final design pattern, that we generally try to subscribe to:

https://docs.kidger.site/equinox/pattern/

This advocates against using `super()`.

In practice we don't use it absolutely everywhere, for reasons of backward compatibility; there are two cases where we can't use it:
- `Conv`, `ConvTranspose`, `Pool`, `AdaptivePool` are concrete classes which are subclassed. As such their subclasses (e.g. `Conv1d`) aren't strict.
- The name of `StatefulLayer` does not start with `"Abstract"`.  This means that we can't make any downstream layers (`BatchNorm`, `SpectralNorm`) strict either.
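As a rough illustration of the abstract/final pattern (class names here are hypothetical, not Equinox's own): abstract base classes only declare the interface, and final concrete classes implement everything themselves, so there is never a `super().__init__` chain to reason about.

```python
import abc
import dataclasses

class AbstractLayer(abc.ABC):
    # Abstract: declares the interface, carries no initialization logic,
    # so subclasses never need to call super().
    @abc.abstractmethod
    def __call__(self, x: float) -> float:
        ...

@dataclasses.dataclass
class Scale(AbstractLayer):
    # Final: fully concrete and never itself subclassed. All of its
    # fields and behavior are defined right here, with no super() calls.
    factor: float

    def __call__(self, x: float) -> float:
        return self.factor * x
```

Concrete-but-subclassed classes like `Conv` sit in neither category, which is why their subclasses (e.g. `Conv1d`) can't be strict.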
@patrick-kidger patrick-kidger merged commit 1a01c3c into main Jan 10, 2024
2 checks passed
@patrick-kidger patrick-kidger deleted the dev branch January 10, 2024 21:17