Documentation | Install | Usage | Examples | Contributing
A library to experiment with new optimization algorithms in MLX.
- Diverse Exploration: includes proven and experimental optimizers like DiffGrad, QHAdam, and Muon (docs).
- Easy Integration: fully compatible with MLX for straightforward experimentation and downstream adoption.
- Benchmark Examples: enables quick testing on classic optimization and machine learning tasks.
The design of mlx-optimizers is largely inspired by pytorch-optimizer.
The recommended way to install mlx-optimizers is through the latest stable release on PyPI:
pip install mlx-optimizers
To install mlx-optimizers from source, first clone the repository:
git clone https://github.com/stockeh/mlx-optimizers.git
cd mlx-optimizers
Then run:
pip install -e .
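After installing, a quick sanity check is to import the package from Python and look up one of the optimizers listed above (DiffGrad here; any other listed optimizer would work the same way):
python -c "import mlx_optimizers as optim; print(optim.DiffGrad)"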
There are a variety of optimizers to choose from (see docs). Each of these inherits from MLX's mlx.optimizers.Optimizer class, so the core functionality remains the same. We can simply use an optimizer as follows:
import mlx_optimizers as optim
# ... model defined and grads computed elsewhere
optimizer = optim.DiffGrad(learning_rate=0.001)
optimizer.update(model, grads)  # apply one optimization step
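For context, the sketch below shows how this fits into a full MLX training step. Only DiffGrad and optimizer.update(model, grads) come from this library; the rest is standard MLX (mlx.core, mlx.nn). The two-layer classifier, the synthetic batch, and the hyperparameters are illustrative assumptions, not part of the library.
import mlx.core as mx
import mlx.nn as nn
import mlx_optimizers as optim

# Illustrative two-layer classifier; layer sizes are placeholders.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

def loss_fn(model, X, y):
    return nn.losses.cross_entropy(model(X), y, reduction="mean")

optimizer = optim.DiffGrad(learning_rate=0.001)
loss_and_grad_fn = nn.value_and_grad(model, loss_fn)

X = mx.random.normal((32, 784))      # placeholder batch of inputs
y = mx.random.randint(0, 10, (32,))  # placeholder integer labels

loss, grads = loss_and_grad_fn(model, X, y)
optimizer.update(model, grads)                # one DiffGrad step
mx.eval(model.parameters(), optimizer.state)  # force lazy evaluation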
The examples folder offers a non-exhaustive set of demonstrative use cases for mlx-optimizers. This includes classic optimization benchmarks on the Rosenbrock function and training a simple neural net classifier on MNIST.
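As a minimal sketch of the Rosenbrock benchmark idea (assuming DiffGrad and MLX's standard Optimizer.apply_gradients interface; the starting point, learning rate, and step count are arbitrary choices for illustration, and the examples folder remains the reference):
import mlx.core as mx
import mlx_optimizers as optim

# Rosenbrock function, with its minimum at (1, 1).
def rosenbrock(params):
    x, y = params["xy"][0], params["xy"][1]
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

params = {"xy": mx.array([-1.5, 2.0])}  # arbitrary starting point
optimizer = optim.DiffGrad(learning_rate=0.01)
grad_fn = mx.grad(rosenbrock)

for _ in range(1000):
    grads = grad_fn(params)                            # gradient of the objective
    params = optimizer.apply_gradients(grads, params)  # returns updated parameters
    mx.eval(params, optimizer.state)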
Interested in adding a new optimizer? Start by verifying it is not already implemented or in development, then open a feature request! If you spot a bug, please open a bug report.
Developer? See our contributing guide.