stockeh/mlx-optimizers

Documentation | Install | Usage | Examples | Contributing

A library to experiment with new optimization algorithms in MLX.

  • Diverse Exploration: includes proven and experimental optimizers like DiffGrad, QHAdam, and Muon (docs).
  • Easy Integration: fully compatible with MLX for straightforward experimentation and downstream adoption.
  • Benchmark Examples: enables quick testing on classic optimization and machine learning tasks.

The design of mlx-optimizers is largely inspired by pytorch-optimizer.

Install

The recommended way to install mlx-optimizers is through the latest stable release on PyPI:

pip install mlx-optimizers

To install mlx-optimizers from source, first clone the repository:

git clone https://github.com/stockeh/mlx-optimizers.git
cd mlx-optimizers

Then run

pip install -e .

Usage

There are a variety of optimizers to choose from (see docs). Each inherits from the MLX optimizer base class, so the core functionality remains the same. We can use an optimizer as follows:

import mlx_optimizers as optim

#... model, grads, etc.
optimizer = optim.DiffGrad(learning_rate=0.001)
optimizer.update(model, grads)
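
For a fuller picture, here is a minimal sketch of where optimizer.update fits in an MLX training step. The toy linear model, random batch, and loss function below are illustrative assumptions rather than code from this repository; the value_and_grad / update / eval pattern is the standard MLX workflow:

import mlx.core as mx
import mlx.nn as nn
import mlx_optimizers as optim

# Hypothetical toy setup: a one-layer classifier and a random batch,
# used only to show where optimizer.update fits in a training step.
model = nn.Linear(784, 10)
optimizer = optim.DiffGrad(learning_rate=0.001)

def loss_fn(model, X, y):
    return nn.losses.cross_entropy(model(X), y, reduction="mean")

X = mx.random.normal((32, 784))
y = mx.random.randint(0, 10, (32,))

# compute the loss and gradients, then take one optimizer step
loss_and_grad_fn = nn.value_and_grad(model, loss_fn)
loss, grads = loss_and_grad_fn(model, X, y)
optimizer.update(model, grads)
mx.eval(model.parameters(), optimizer.state)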

Examples

The examples folder offers a non-exhaustive set of demonstrative use cases for mlx-optimizers. This includes classic optimization benchmarks on the Rosenbrock function and training a simple neural net classifier on MNIST.

(MNIST example results figure)
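
As a sketch of the Rosenbrock benchmark idea (not the code in the examples folder), an optimizer can also be applied to a plain parameter tree via MLX's standard apply_gradients API. The choice of DiffGrad, the learning rate, and the starting point here are illustrative assumptions:

import mlx.core as mx
import mlx_optimizers as optim

# 2-D Rosenbrock function; global minimum at (1, 1)
def rosenbrock(params):
    x, y = params["xy"][0], params["xy"][1]
    return (1 - x) ** 2 + 100 * (y - x**2) ** 2

params = {"xy": mx.array([-1.5, 2.0])}          # illustrative starting point
optimizer = optim.DiffGrad(learning_rate=0.01)  # illustrative hyperparameter
grad_fn = mx.grad(rosenbrock)

for _ in range(1000):
    grads = grad_fn(params)
    params = optimizer.apply_gradients(grads, params)
    mx.eval(params)

print(params["xy"])  # should move toward the minimum at (1, 1)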

Contributing

Interested in adding a new optimizer? Start by verifying it is not already implemented or in development, then open a feature request! If you spot a bug, please open a bug report.

Developer? See our contributing guide.