diff --git a/README.md b/README.md
index c8dea370..5eceb7a9 100644
--- a/README.md
+++ b/README.md
@@ -61,24 +61,32 @@ If you found this library useful in academic research, please cite: [(arXiv link
 ## See also: other libraries in the JAX ecosystem
 
-[jaxtyping](https://github.com/google/jaxtyping): type annotations for shape/dtype of arrays.
+#### Always useful
 
-[Equinox](https://github.com/patrick-kidger/equinox): neural networks.
+[Equinox](https://github.com/patrick-kidger/equinox): neural networks and everything not already in core JAX!
+
+[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+
+#### Deep learning
 
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 
-[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
+[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 
-[Lineax](https://github.com/google/lineax): linear solvers.
+[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
 
-[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
+#### Scientific computing
 
-[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
+[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
 
-[sympy2jax](https://github.com/google/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
+[Lineax](https://github.com/patrick-kidger/lineax): linear solvers.
 
-[Eqxvision](https://github.com/paganpasta/eqxvision): computer vision models.
+[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
 
-[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[sympy2jax](https://github.com/patrick-kidger/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
 
 [PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)
+
+#### Awesome JAX
+
+[Awesome JAX](https://github.com/n2cholas/awesome-jax): a longer list of other JAX projects.
 
diff --git a/docs/index.md b/docs/index.md
index 3db6fd1d..836c327d 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -49,16 +49,32 @@ Have a look at the [Getting Started](./usage/getting-started.md) page.
 ## See also: other libraries in the JAX ecosystem
 
-[Equinox](https://github.com/patrick-kidger/equinox): neural networks.
+#### Always useful
 
-[Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
+[Equinox](https://github.com/patrick-kidger/equinox): neural networks and everything not already in core JAX!
 
-[Lineax](https://github.com/google/lineax): linear solvers and linear least squares.
+[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
 
-[jaxtyping](https://github.com/google/jaxtyping): type annotations for shape/dtype of arrays.
+#### Deep learning
 
-[Eqxvision](https://github.com/paganpasta/eqxvision): computer vision models.
+[Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 
-[sympy2jax](https://github.com/google/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
+[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 
 [Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+
+#### Scientific computing
+
+[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
+
+[Lineax](https://github.com/patrick-kidger/lineax): linear solvers.
+
+[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
+
+[sympy2jax](https://github.com/patrick-kidger/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
+
+[PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)
+
+#### Awesome JAX
+
+[Awesome JAX](https://github.com/n2cholas/awesome-jax): a longer list of other JAX projects.
 