This repository contains the code and experiments for evaluating the tuning, scalability, generalization, and reliability of three deep learning model architectures (CNN, gMLP, and ViT) on the CIFAR10 and CIFAR100 datasets.
Model performance and tuning analysis is conducted on the CIFAR10 and CIFAR100 datasets using Convolutional Neural Network (CNN), Gated Multilayer Perceptron (gMLP), and Vision Transformer (ViT) architectures. The study is built with PyTorch, PyTorch Lightning, and Optuna; MLflow, DVC, and the Hydra framework with YAML configuration files support experiment tracking, data versioning, and configuration management.
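As a rough illustration of how this stack fits together (not the repository's actual code), the sketch below shows Optuna driving a short PyTorch Lightning training run on CIFAR10. The `SmallCNN` module, the hyperparameter search ranges, the one-epoch trial budget, and the `data` download path are all illustrative assumptions.

```python
# Minimal sketch: Optuna hyperparameter search over a PyTorch Lightning CIFAR10 model.
# SmallCNN, the search ranges, and the 1-epoch trials are assumptions for illustration.
import optuna
import pytorch_lightning as pl
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms


class SmallCNN(pl.LightningModule):
    def __init__(self, lr: float, dropout: float):
        super().__init__()
        self.save_hyperparameters()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
            nn.Flatten(),
            nn.Dropout(dropout),
            nn.Linear(32 * 16 * 16, 10),          # CIFAR10 has 10 classes
        )

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self(x), y)
        self.log("train_loss", loss)
        return loss

    def validation_step(self, batch, batch_idx):
        x, y = batch
        acc = (self(x).argmax(dim=1) == y).float().mean()
        self.log("val_acc", acc, prog_bar=True)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)


def objective(trial: optuna.Trial) -> float:
    # Hyperparameters sampled by Optuna; the ranges are illustrative.
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)
    dropout = trial.suggest_float("dropout", 0.1, 0.5)

    tfm = transforms.ToTensor()
    train = datasets.CIFAR10("data", train=True, download=True, transform=tfm)
    val = datasets.CIFAR10("data", train=False, download=True, transform=tfm)

    model = SmallCNN(lr=lr, dropout=dropout)
    trainer = pl.Trainer(max_epochs=1, enable_checkpointing=False, logger=False)
    trainer.fit(model, DataLoader(train, batch_size=128, shuffle=True),
                DataLoader(val, batch_size=256))
    # Optuna maximizes the validation accuracy logged above.
    return trainer.callback_metrics["val_acc"].item()


if __name__ == "__main__":
    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=10)
    print(study.best_params)
```

In the actual study, the hyperparameter spaces and model definitions would typically come from Hydra-managed YAML configuration files rather than being hard-coded, and MLflow would record each trial's parameters and metrics.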