Building a multilayer perceptron from scratch

The mathematics and computation that drive neural networks are frequently seen as erudite and impenetrable. MLP.ipynb presents a clearly illustrated example of building a neural network for handwriting recognition from scratch. The tutorial gives a step-by-step overview of the mathematics and code behind many modern machine learning algorithms, as sketched below.
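For a sense of what the notebook covers, here is a minimal sketch (not the notebook's exact code) of the kind of network it builds: one hidden layer with a sigmoid activation, a softmax output, and training by backpropagation with gradient descent. The layer sizes, learning rate, and toy data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # subtract row max for numerical stability
    return e / e.sum(axis=1, keepdims=True)

# Network dimensions: 784 inputs (28x28 pixels), 64 hidden units, 10 digit classes.
n_in, n_hidden, n_out = 784, 64, 10
W1 = rng.normal(0, 0.01, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.01, (n_hidden, n_out))
b2 = np.zeros(n_out)

# Toy batch standing in for MNIST images and one-hot labels.
X = rng.random((32, n_in))
Y = np.eye(n_out)[rng.integers(0, n_out, 32)]

lr = 0.5
for step in range(200):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)
    P = softmax(H @ W2 + b2)
    loss = -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))  # cross-entropy

    # Backward pass: gradients of the loss with respect to each parameter.
    dZ2 = (P - Y) / len(X)                # softmax + cross-entropy gradient
    dW2, db2 = H.T @ dZ2, dZ2.sum(axis=0)
    dH = dZ2 @ W2.T
    dZ1 = dH * H * (1 - H)                # derivative of the sigmoid
    dW1, db1 = X.T @ dZ1, dZ1.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final cross-entropy on toy batch: {loss:.3f}")
```

The notebook walks through these same pieces (forward pass, loss, backpropagation, and parameter updates) in detail, applied to real handwriting data.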

Installation

To view this notebook in your browser, simply click the MLP.ipynb file above.

To run this notebook locally, make sure you have Git, Python, and Jupyter installed.

Then in a terminal window:

```
$ git clone https://github.com/KirillShmilovich/MLP-Neural-Network-From-Scrath
$ cd MLP-Neural-Network-From-Scrath
$ jupyter-notebook MLP.ipynb
```