Commit b21d415

Merge pull request #36 from AlexImmer/pip-install

Resolve #28: enable pip install

runame authored Jul 24, 2021
2 parents 540a330 + afcefaf

Showing 9 changed files with 113 additions and 94 deletions.
5 changes: 2 additions & 3 deletions .travis.yml
@@ -2,9 +2,8 @@ language: python
python:
- '3.8'
install:
- - pip install -r requirements.txt
- - pip install -r tests/requirements.txt
- - python setup.py install
+ - pip install .
+ - pip install .[tests]
script:
- pytest -vx --cov=laplace/ tests/
after_success:
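The new CI commands rely on the package metadata declaring a `tests` extra; that metadata is not part of this diff. A minimal sketch of what such a declaration could look like in a setup.py, with the dependency lists as assumptions rather than the repository's actual values:

```python
# Hypothetical setup.py sketch; the real file is not shown in this commit.
# It illustrates the extras_require entry that lets `pip install .[tests]`
# pull in test-only dependencies on top of the runtime ones.
from setuptools import setup, find_packages

setup(
    name='laplace-torch',                   # distribution name used on PyPI
    packages=find_packages(),
    install_requires=['torch'],             # assumed runtime dependency
    extras_require={
        'tests': ['pytest', 'pytest-cov'],  # installed via `pip install .[tests]`
    },
)
```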
46 changes: 22 additions & 24 deletions README.md
@@ -12,7 +12,7 @@ There is also a corresponding paper, [*Laplace Redux — Effortless Bayesian Dee
```bibtex
@article{daxberger2021laplace,
title={Laplace Redux--Effortless Bayesian Deep Learning},
- author={Daxberger, Erik and Kristiadi, Agustinus and Immer, Alexander 
+ author={Daxberger, Erik and Kristiadi, Agustinus and Immer, Alexander
and Eschenhagen, Runa and Bauer, Matthias and Hennig, Philipp},
journal={arXiv preprint arXiv:2106.14806},
year={2021}
@@ -22,54 +22,51 @@ There is also a corresponding paper, [*Laplace Redux — Effortless Bayesian Dee
## Setup

We assume `python3.8` since the package was developed with that version.
- To install `laplace` with `pip`, run the following:
+ To install laplace with `pip`, run the following:
```bash
- # directly install from git
- pip install laplace@git+https://github.com/AlexImmer/Laplace.git
+ pip install laplace-torch
```

+ For development purposes, clone the repository and then install:
+ ```bash
- # or after cloning the repository for development
- pip install -r requirements.txt
# for development
pip install -e .
# run tests
- pip install -r tests/requirements.txt
+ pip install -e .[tests]
pytest tests/
```

- ## Structure 
- The laplace package consists of two main components: 
+ ## Structure
+ The laplace package consists of two main components:

- 1. The subclasses of [`laplace.BaseLaplace`](laplace/baselaplace.py) that implement different sparsity structures: different subsets of weights (`'all'` and `'last_layer'`) and different structures of the Hessian approximation (`'full'`, `'kron'`, and `'diag'`). This results in six currently available options: `laplace.FullLaplace`, `laplace.KronLaplace`, `laplace.DiagLaplace`, and the corresponding last-layer variations `laplace.FullLLLaplace`, `laplace.KronLLLaplace`, and `laplace.DiagLLLaplace`, which are all subclasses of [`laplace.LLLaplace`](laplace/lllaplace.py). All of these can be conveniently accessed via the [`laplace.Laplace`](laplace/laplace.py) function.
- 2. The backends in [`laplace.curvature`](laplace/curvature/) which provide access to Hessian approximations of
+ 1. The subclasses of [`laplace.BaseLaplace`](https://github.com/AlexImmer/Laplace/blob/main/laplace/baselaplace.py) that implement different sparsity structures: different subsets of weights (`'all'` and `'last_layer'`) and different structures of the Hessian approximation (`'full'`, `'kron'`, and `'diag'`). This results in six currently available options: `laplace.FullLaplace`, `laplace.KronLaplace`, `laplace.DiagLaplace`, and the corresponding last-layer variations `laplace.FullLLLaplace`, `laplace.KronLLLaplace`, and `laplace.DiagLLLaplace`, which are all subclasses of [`laplace.LLLaplace`](https://github.com/AlexImmer/Laplace/blob/main/laplace/lllaplace.py). All of these can be conveniently accessed via the [`laplace.Laplace`](https://github.com/AlexImmer/Laplace/blob/main/laplace/laplace.py) function.
+ 2. The backends in [`laplace.curvature`](https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/) which provide access to Hessian approximations of
the corresponding sparsity structures, for example, the diagonal GGN.

Additionally, the package provides utilities for
- decomposing a neural network into feature extractor and last layer for `LLLaplace` subclasses ([`laplace.feature_extractor`](laplace/feature_extractor.py))
+ decomposing a neural network into feature extractor and last layer for `LLLaplace` subclasses ([`laplace.feature_extractor`](https://github.com/AlexImmer/Laplace/blob/main/laplace/feature_extractor.py))
and
- effectively dealing with Kronecker factors ([`laplace.matrix`](laplace/matrix.py)).
+ effectively dealing with Kronecker factors ([`laplace.matrix`](https://github.com/AlexImmer/Laplace/blob/main/laplace/matrix.py)).
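As a concrete illustration of the factory described in item 1, the following sketch constructs the same last-layer Kronecker-factored LA twice, once via `laplace.Laplace` and once via the subclass directly (`model` is assumed to be a trained PyTorch classifier):

```python
from laplace import Laplace, KronLLLaplace

# Equivalent constructions: the Laplace factory dispatches on
# (subset_of_weights, hessian_structure) to one of the six subclasses.
la = Laplace(model, 'classification',
             subset_of_weights='last_layer',
             hessian_structure='kron')
la = KronLLLaplace(model, 'classification')
```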

## Extendability
To extend the laplace package, new `BaseLaplace` subclasses can be designed, for example,
a block-diagonal structure or subset-of-weights Laplace.
- Alternatively, extending or integrating backends (subclasses of [`curvature.curvature`](laplace/curvature/curvature.py)) allows to provide different Hessian
+ Alternatively, extending or integrating backends (subclasses of [`curvature.curvature`](https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/curvature.py)) allows to provide different Hessian
approximations to the Laplace approximations.
- For example, currently the [`curvature.BackPackInterface`](laplace/curvature/backpack.py) based on [BackPACK](https://github.com/f-dangel/backpack/) and [`curvature.AsdlInterface`](laplace/curvature/asdl.py) based on [ASDL](https://github.com/kazukiosawa/asdfghjkl) are available.
+ For example, currently the [`curvature.BackPackInterface`](https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/backpack.py) based on [BackPACK](https://github.com/f-dangel/backpack/) and [`curvature.AsdlInterface`](https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/asdl.py) based on [ASDL](https://github.com/kazukiosawa/asdfghjkl) are available.
The `curvature.AsdlInterface` provides a Kronecker factored empirical Fisher while the `curvature.BackPackInterface`
does not, and only the `curvature.BackPackInterface` provides access to Hessian approximations
for a regression (MSELoss) loss function.
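A skeleton of such a backend extension might look as follows; the method name and return convention here are assumptions for illustration, not a verbatim copy of the interface in `curvature.py`:

```python
from laplace.curvature import CurvatureInterface


class MyBackend(CurvatureInterface):
    """Hypothetical backend supplying a diagonal Hessian approximation."""

    def diag(self, X, y, **kwargs):
        # Should return the batch loss and a per-parameter diagonal
        # curvature approximation for inputs X and targets y.
        raise NotImplementedError
```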

## Example usage

- ### *Post-hoc* prior precision tuning of last-layer LA 
+ ### *Post-hoc* prior precision tuning of last-layer LA

In the following example, a pre-trained model is loaded,
then the Laplace approximation is fit to the training data,
and the prior precision is optimized with cross-validation `'CV'`.
- After that, the resulting LA is used for prediction with 
- the `'probit'` predictive for classification. 
+ After that, the resulting LA is used for prediction with
+ the `'probit'` predictive for classification.

```python
from laplace import Laplace
@@ -79,7 +76,7 @@ model = load_map_model()

# User-specified LA flavor
la = Laplace(model, 'classification',
-              subset_of_weights='all', 
+              subset_of_weights='all',
hessian_structure='diag')
la.fit(train_loader)
la.optimize_prior_precision(method='CV', val_loader=val_loader)
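# The prediction step described in the text above falls outside the lines
# shown in this hunk; a minimal sketch of it (assuming `x` is a batch of
# test inputs). The fitted LA object is called like a model; 'probit'
# selects the probit approximation to the classification predictive.
pred = la(x, link_approx='probit')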
@@ -97,14 +94,14 @@ the log marginal likelihood.

```python
from laplace import Laplace

# Un- or pre-trained model
model = load_model()

# Default to recommended last-layer KFAC LA:
la = Laplace(model, likelihood='regression')
la.fit(train_loader)

# ML w.r.t. prior precision and observation noise
ml = la.log_marginal_likelihood(prior_prec, obs_noise)
ml.backward()
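# Because ml is differentiable w.r.t. the hyperparameters, a plain PyTorch
# loop can optimize them; this is a sketch with arbitrary initial values,
# step size, and iteration count (log-parametrization keeps both positive):
import torch

log_prior, log_sigma = torch.ones(1, requires_grad=True), torch.ones(1, requires_grad=True)
hyper_optimizer = torch.optim.Adam([log_prior, log_sigma], lr=1e-1)
for _ in range(100):
    hyper_optimizer.zero_grad()
    neg_ml = -la.log_marginal_likelihood(log_prior.exp(), log_sigma.exp())
    neg_ml.backward()
    hyper_optimizer.step()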
@@ -115,7 +112,8 @@ ml.backward()
The documentation is available [here](https://aleximmer.github.io/Laplace) or can be generated and/or viewed locally:

```bash
- pip install pdoc3 matplotlib
+ # assuming the repository was cloned
+ pip install -e .[docs]
# create docs and write to html
bash update_docs.sh
# .. or serve the docs directly
@@ -127,7 +125,7 @@ pdoc --http 0.0.0.0:8080 laplace --template-dir template
This package relies on various improvements to the Laplace approximation for neural networks, which was originally due to MacKay [1].

- [1] MacKay, DJC. [*A Practical Bayesian Framework for Backpropagation Networks*](https://authors.library.caltech.edu/13793/). Neural Computation 1992.
- - [2] Gibbs, M. N. [*Bayesian Gaussian Processes for Regression and Classification*](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.147.1130&rep=rep1&type=pdf). PhD Thesis 1997. 
+ - [2] Gibbs, M. N. [*Bayesian Gaussian Processes for Regression and Classification*](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.147.1130&rep=rep1&type=pdf). PhD Thesis 1997.
- [3] Snoek, J., Rippel, O., Swersky, K., Kiros, R., Satish, N., Sundaram, N., Patwary, M., Prabhat, M., Adams, R. [*Scalable Bayesian Optimization Using Deep Neural Networks*](https://arxiv.org/abs/1502.05700). ICML 2015.
- [4] Ritter, H., Botev, A., Barber, D. [*A Scalable Laplace Approximation for Neural Networks*](https://openreview.net/forum?id=Skdvd2xAZ). ICLR 2018.
- [5] Foong, A. Y., Li, Y., Hernández-Lobato, J. M., Turner, R. E. [*'In-Between' Uncertainty in Bayesian Neural Networks*](https://arxiv.org/abs/1906.11537). ICML UDL Workshop 2019.
2 changes: 1 addition & 1 deletion docs/feature_extractor.html
@@ -35,7 +35,7 @@ <h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="laplace.feature_extractor.FeatureExtractor"><code class="flex name class">
<span>class <span class="ident">FeatureExtractor</span></span>
- <span>(</span><span>model: torch.nn.modules.module.Module, last_layer_name: Union[str, NoneType] = None)</span>
+ <span>(</span><span>model: torch.nn.modules.module.Module, last_layer_name: Optional[str] = None)</span>
</code></dt>
<dd>
<div class="desc"><p>Feature extractor for a PyTorch neural network.
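For context, the signature above belongs to the helper that splits a network into a feature extractor and its last layer; a usage sketch (assuming `model` is a `torch.nn.Module`; per the signature, `last_layer_name` can be omitted):

```python
from laplace.feature_extractor import FeatureExtractor

# With the default last_layer_name=None (see the signature above), the
# extractor has to identify the last layer itself; a module name such as
# 'fc' (an assumed example) can also be passed explicitly.
fe = FeatureExtractor(model)
fe_named = FeatureExtractor(model, last_layer_name='fc')
```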
36 changes: 17 additions & 19 deletions docs/index.html
@@ -35,45 +35,42 @@ <h1 class="title">Package <code>laplace</code></h1>
<p>There is also a corresponding paper, <a href="https://arxiv.org/abs/2106.14806"><em>Laplace Redux — Effortless Bayesian Deep Learning</em></a>, which introduces the library, provides an introduction to the Laplace approximation, reviews its use in deep learning, and empirically demonstrates its versatility and competitiveness. Please consider referring to the paper when using our library:</p>
<pre><code class="language-bibtex">@article{daxberger2021laplace,
title={Laplace Redux--Effortless Bayesian Deep Learning},
- author={Daxberger, Erik and Kristiadi, Agustinus and Immer, Alexander 
+ author={Daxberger, Erik and Kristiadi, Agustinus and Immer, Alexander
and Eschenhagen, Runa and Bauer, Matthias and Hennig, Philipp},
journal={arXiv preprint arXiv:2106.14806},
year={2021}
}
</code></pre>
<h2 id="setup">Setup</h2>
<p>We assume <code>python3.8</code> since the package was developed with that version.
- To install <code><a title="laplace.laplace" href="laplace.html">laplace.laplace</a></code> with <code>pip</code>, run the following:</p>
- <pre><code class="language-bash"># directly install from git
- pip install laplace@git+https://github.com/AlexImmer/Laplace.git
+ To install laplace with <code>pip</code>, run the following:</p>
+ <pre><code class="language-bash">pip install laplace-torch
</code></pre>
+ <p>For development purposes, clone the repository and then install:</p>
- <pre><code class="language-bash"># or after cloning the repository for development
- pip install -r requirements.txt
# for development
pip install -e .
# run tests
- pip install -r tests/requirements.txt
+ pip install -e .[tests]
pytest tests/
</code></pre>
<h2 id="structure">Structure</h2>
- <p>The laplace package consists of two main components: </p>
+ <p>The laplace package consists of two main components:</p>
<ol>
- <li>The subclasses of <a href="laplace/baselaplace.py"><code>laplace.BaseLaplace</code></a> that implement different sparsity structures: different subsets of weights (<code>'all'</code> and <code>'last_layer'</code>) and different structures of the Hessian approximation (<code>'full'</code>, <code>'kron'</code>, and <code>'diag'</code>). This results in six currently available options: <code><a title="laplace.FullLaplace" href="#laplace.FullLaplace">FullLaplace</a></code>, <code><a title="laplace.KronLaplace" href="#laplace.KronLaplace">KronLaplace</a></code>, <code><a title="laplace.DiagLaplace" href="#laplace.DiagLaplace">DiagLaplace</a></code>, and the corresponding last-layer variations <code><a title="laplace.FullLLLaplace" href="#laplace.FullLLLaplace">FullLLLaplace</a></code>, <code><a title="laplace.KronLLLaplace" href="#laplace.KronLLLaplace">KronLLLaplace</a></code>,
- and <code><a title="laplace.DiagLLLaplace" href="#laplace.DiagLLLaplace">DiagLLLaplace</a></code>, which are all subclasses of <a href="laplace/lllaplace.py"><code>laplace.LLLaplace</code></a>. All of these can be conveniently accessed via the <a href="laplace/laplace.py"><code>laplace.Laplace</code></a> function.</li>
- <li>The backends in <a href="laplace/curvature/"><code>laplace.curvature</code></a> which provide access to Hessian approximations of
+ <li>The subclasses of <a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/baselaplace.py"><code>laplace.BaseLaplace</code></a> that implement different sparsity structures: different subsets of weights (<code>'all'</code> and <code>'last_layer'</code>) and different structures of the Hessian approximation (<code>'full'</code>, <code>'kron'</code>, and <code>'diag'</code>). This results in six currently available options: <code><a title="laplace.FullLaplace" href="#laplace.FullLaplace">FullLaplace</a></code>, <code><a title="laplace.KronLaplace" href="#laplace.KronLaplace">KronLaplace</a></code>, <code><a title="laplace.DiagLaplace" href="#laplace.DiagLaplace">DiagLaplace</a></code>, and the corresponding last-layer variations <code><a title="laplace.FullLLLaplace" href="#laplace.FullLLLaplace">FullLLLaplace</a></code>, <code><a title="laplace.KronLLLaplace" href="#laplace.KronLLLaplace">KronLLLaplace</a></code>,
+ and <code><a title="laplace.DiagLLLaplace" href="#laplace.DiagLLLaplace">DiagLLLaplace</a></code>, which are all subclasses of <a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/lllaplace.py"><code>laplace.LLLaplace</code></a>. All of these can be conveniently accessed via the <a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/laplace.py"><code>laplace.Laplace</code></a> function.</li>
+ <li>The backends in <a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/"><code>laplace.curvature</code></a> which provide access to Hessian approximations of
the corresponding sparsity structures, for example, the diagonal GGN.</li>
</ol>
<p>Additionally, the package provides utilities for
- decomposing a neural network into feature extractor and last layer for <code><a title="laplace.LLLaplace" href="#laplace.LLLaplace">LLLaplace</a></code> subclasses (<a href="laplace/feature_extractor.py"><code>laplace.feature_extractor</code></a>)
+ decomposing a neural network into feature extractor and last layer for <code><a title="laplace.LLLaplace" href="#laplace.LLLaplace">LLLaplace</a></code> subclasses (<a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/feature_extractor.py"><code>laplace.feature_extractor</code></a>)
and
- effectively dealing with Kronecker factors (<a href="laplace/matrix.py"><code>laplace.matrix</code></a>).</p>
+ effectively dealing with Kronecker factors (<a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/matrix.py"><code>laplace.matrix</code></a>).</p>
<h2 id="extendability">Extendability</h2>
<p>To extend the laplace package, new <code><a title="laplace.BaseLaplace" href="#laplace.BaseLaplace">BaseLaplace</a></code> subclasses can be designed, for example,
a block-diagonal structure or subset-of-weights Laplace.
- Alternatively, extending or integrating backends (subclasses of <a href="laplace/curvature/curvature.py"><code>curvature.curvature</code></a>) allows to provide different Hessian
+ Alternatively, extending or integrating backends (subclasses of <a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/curvature.py"><code>curvature.curvature</code></a>) allows to provide different Hessian
approximations to the Laplace approximations.
- For example, currently the <a href="laplace/curvature/backpack.py"><code>curvature.BackPackInterface</code></a> based on <a href="https://github.com/f-dangel/backpack/">BackPACK</a> and <a href="laplace/curvature/asdl.py"><code>curvature.AsdlInterface</code></a> based on <a href="https://github.com/kazukiosawa/asdfghjkl">ASDL</a> are available.
+ For example, currently the <a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/backpack.py"><code>curvature.BackPackInterface</code></a> based on <a href="https://github.com/f-dangel/backpack/">BackPACK</a> and <a href="https://github.com/AlexImmer/Laplace/blob/main/laplace/curvature/asdl.py"><code>curvature.AsdlInterface</code></a> based on <a href="https://github.com/kazukiosawa/asdfghjkl">ASDL</a> are available.
The <code><a title="laplace.curvature.AsdlInterface" href="curvature/index.html#laplace.curvature.AsdlInterface">AsdlInterface</a></code> provides a Kronecker factored empirical Fisher while the <code><a title="laplace.curvature.BackPackInterface" href="curvature/index.html#laplace.curvature.BackPackInterface">BackPackInterface</a></code>
does not, and only the <code><a title="laplace.curvature.BackPackInterface" href="curvature/index.html#laplace.curvature.BackPackInterface">BackPackInterface</a></code> provides access to Hessian approximations
for a regression (MSELoss) loss function.</p>
@@ -83,15 +80,15 @@ <h3 id="post-hoc-prior-precision-tuning-of-last-layer-la"><em>Post-hoc</em> prio
then the Laplace approximation is fit to the training data,
and the prior precision is optimized with cross-validation <code>'CV'</code>.
After that, the resulting LA is used for prediction with
- the <code>'probit'</code> predictive for classification. </p>
+ the <code>'probit'</code> predictive for classification.</p>
<pre><code class="language-python">from laplace import Laplace

# pre-trained model
model = load_map_model()

# User-specified LA flavor
la = Laplace(model, 'classification',
-              subset_of_weights='all', 
+              subset_of_weights='all',
hessian_structure='diag')
la.fit(train_loader)
la.optimize_prior_precision(method='CV', val_loader=val_loader)
@@ -119,7 +116,8 @@ <h3 id="differentiating-the-log-marginal-likelihood-wrt-hyperparameters">Differe
</code></pre>
<h2 id="documentation">Documentation</h2>
<p>The documentation is available <a href="https://aleximmer.github.io/Laplace">here</a> or can be generated and/or viewed locally:</p>
<pre><code class="language-bash">pip install pdoc3 matplotlib
<pre><code class="language-bash"># assuming the repository was cloned
pip install -e .[docs]
# create docs and write to html
bash update_docs.sh
# .. or serve the docs directly
@@ -129,7 +127,7 @@ <h2 id="references">References</h2>
<p>This package relies on various improvements to the Laplace approximation for neural networks, which was originally due to MacKay [1].</p>
<ul>
<li>[1] MacKay, DJC. <a href="https://authors.library.caltech.edu/13793/"><em>A Practical Bayesian Framework for Backpropagation Networks</em></a>. Neural Computation 1992.</li>
- <li>[2] Gibbs, M. N. <a href="https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.147.1130&amp;rep=rep1&amp;type=pdf"><em>Bayesian Gaussian Processes for Regression and Classification</em></a>. PhD Thesis 1997. </li>
+ <li>[2] Gibbs, M. N. <a href="https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.147.1130&amp;rep=rep1&amp;type=pdf"><em>Bayesian Gaussian Processes for Regression and Classification</em></a>. PhD Thesis 1997.</li>
<li>[3] Snoek, J., Rippel, O., Swersky, K., Kiros, R., Satish, N., Sundaram, N., Patwary, M., Prabhat, M., Adams, R. <a href="https://arxiv.org/abs/1502.05700"><em>Scalable Bayesian Optimization Using Deep Neural Networks</em></a>. ICML 2015.</li>
<li>[4] Ritter, H., Botev, A., Barber, D. <a href="https://openreview.net/forum?id=Skdvd2xAZ"><em>A Scalable Laplace Approximation for Neural Networks</em></a>. ICLR 2018.</li>
<li>[5] Foong, A. Y., Li, Y., Hernández-Lobato, J. M., Turner, R. E. <a href="https://arxiv.org/abs/1906.11537"><em>'In-Between' Uncertainty in Bayesian Neural Networks</em></a>. ICML UDL Workshop 2019.</li>
Binary file modified docs/regression_example.png
6 changes: 6 additions & 0 deletions pyproject.toml
@@ -0,0 +1,6 @@
+ [build-system]
+ requires = [
+     "setuptools>=42",
+     "wheel"
+ ]
+ build-backend = "setuptools.build_meta"
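This new file opts the project into PEP 517/518 builds: pip reads `build-system.requires`, installs those packages into an isolated environment, and invokes the declared setuptools backend, which is what lets `pip install .` replace `python setup.py install` in the CI config above. A sketch of driving the same build programmatically (assuming the separate `build` frontend is installed via `pip install build`):

```python
# Produce a wheel through the PEP 517 interface declared in pyproject.toml.
# This mirrors, in simplified form, what pip does on `pip install .`.
import subprocess

subprocess.run(["python", "-m", "build", "--wheel", "."], check=True)
```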
5 changes: 0 additions & 5 deletions requirements.txt

This file was deleted.
