Sandbox for training deep learning networks
Proper implementation of ResNets for CIFAR-10/100 in PyTorch that matches the description in the original paper.
The official implementation of [CVPR 2022] "Decoupled Knowledge Distillation" (https://arxiv.org/abs/2203.08679) and [ICCV 2023] "DOT: A Distillation-Oriented Trainer" (https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf)
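For context, DKD starts from the classic temperature-scaled distillation loss of Hinton et al. and decouples it into a target-class and a non-target-class term. A minimal sketch of that base KD loss in pure Python (function names and the temperature default are illustrative, not taken from the repository above):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; subtracting the max is for numerical stability
    # and cancels in the normalization.
    m = max(logits)
    exps = [math.exp((z - m) / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Classic KD loss: KL(teacher || student) on temperature-softened
    distributions, scaled by T^2 to keep gradient magnitudes comparable."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

When student and teacher logits agree, the loss is zero; DKD's contribution is to rewrite this single KL term as two separately weighted components.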
PyTorch implementation of CNNs for the CIFAR benchmark.
Unofficial PyTorch Reimplementation of RandAugment.
Implementation of the mixup training method
Multi-Scale Dense Networks for Resource Efficient Image Classification (ICLR 2018 Oral)
Quickly compare your image classification models against state-of-the-art models (such as DenseNet, ResNet, ...).
TensorFlow implementation of GoogLeNet and Inception for image classification.
Pretrained GANs + VAEs + classifiers for MNIST/CIFAR in PyTorch.
Open Set Recognition
Implementation of our Pattern Recognition paper "DMT: Dynamic Mutual Training for Semi-Supervised Learning"
The official implementation of paper: "Inter-Instance Similarity Modeling for Contrastive Learning"
[TIP 2022] Towards Better Accuracy-efficiency Trade-offs: Divide and Co-training. Also includes an image classification toolbox covering ResNet, Wide-ResNet, ResNeXt, ResNeSt, ResNeXSt, SENet, Shake-Shake, DenseNet, PyramidNet, and EfficientNet.
Training ImageNet / CIFAR models with state-of-the-art strategies and techniques such as ViT, KD, Rep, etc.
Training Low-bits DNNs with Stochastic Quantization
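A common building block in low-bit training is stochastic rounding: a value is rounded up with probability equal to its fractional part, so the quantizer is unbiased in expectation and small gradient updates are not systematically lost. A minimal sketch of this general ingredient (not necessarily the exact scheme of the repository above; the function name and scalar interface are illustrative):

```python
import math
import random

def stochastic_round(x, scale):
    """Quantize x to the grid {k * scale}: round up with probability equal
    to the fractional part of x / scale, so E[output] == x."""
    v = x / scale
    lo = math.floor(v)
    frac = v - lo  # probability of rounding up
    q = lo + (1 if random.random() < frac else 0)
    return q * scale
```

Averaged over many draws, the quantized values recover the original value, which is what makes the scheme usable inside gradient-based training.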