CIFAR10 LeNet5 Variation 1: GELU
This repository contains a variation of the original LeNet5 architecture adapted for CIFAR-10. The model consists of two convolutional layers followed by three fully connected layers, uses GELU (Gaussian Error Linear Unit) activations and Kaiming uniform weight initialization, and is trained with a batch size of 32 using the Adam optimizer (learning rate 0.001) and CrossEntropyLoss. In our experiments, this model achieved a test loss of 0.0623 and a top-1 accuracy of 59.51% on CIFAR-10.
Model Details
- Architecture: 2 Convolutional Layers, 3 Fully Connected Layers (see the sketch after this list).
- Activations: GELU.
- Weight Initialization: Kaiming Uniform.
- Optimizer: Adam (lr=0.001).
- Loss Function: CrossEntropyLoss.
- Dataset: CIFAR-10.
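A minimal sketch of what this architecture might look like in PyTorch. The exact channel and unit counts are not specified in this card, so the classic LeNet5 sizes (6 and 16 conv channels; 120, 84, and 10 fully connected units) are assumed here:

```python
import torch
import torch.nn as nn

class LeNet5GELU(nn.Module):
    """LeNet5 variant for CIFAR-10: 2 conv layers + 3 FC layers with GELU."""
    def __init__(self, num_classes=10):
        super().__init__()
        # Assumed layer sizes following the classic LeNet5 layout.
        self.conv1 = nn.Conv2d(3, 6, kernel_size=5)   # 3x32x32 -> 6x28x28
        self.conv2 = nn.Conv2d(6, 16, kernel_size=5)  # 6x14x14 -> 16x10x10
        self.pool = nn.MaxPool2d(2, 2)
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, num_classes)
        self.act = nn.GELU()
        self._init_weights()

    def _init_weights(self):
        # Kaiming uniform initialization, as stated in the model card.
        # GELU is not among calculate_gain's supported nonlinearities,
        # so 'relu' is used here as a stand-in gain.
        for m in self.modules():
            if isinstance(m, (nn.Conv2d, nn.Linear)):
                nn.init.kaiming_uniform_(m.weight, nonlinearity='relu')
                if m.bias is not None:
                    nn.init.zeros_(m.bias)

    def forward(self, x):
        x = self.pool(self.act(self.conv1(x)))
        x = self.pool(self.act(self.conv2(x)))
        x = torch.flatten(x, 1)
        x = self.act(self.fc1(x))
        x = self.act(self.fc2(x))
        return self.fc3(x)
```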
Usage
Load this model in PyTorch to fine-tune or evaluate it on CIFAR-10 with your own training and evaluation scripts.
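As a starting point, here is a hedged evaluation sketch. The checkpoint filename `lenet5_gelu_cifar10.pth` and the bare `ToTensor` preprocessing are assumptions, not details confirmed by this card; adjust both to match the actual weights file and the preprocessing used during training:

```python
import torch
import torchvision
import torchvision.transforms as transforms

# Assumed checkpoint filename; replace with the weights file in this repo.
model = LeNet5GELU()
model.load_state_dict(torch.load("lenet5_gelu_cifar10.pth", map_location="cpu"))
model.eval()

# Minimal preprocessing; training may have used additional normalization.
transform = transforms.Compose([transforms.ToTensor()])
test_set = torchvision.datasets.CIFAR10(root="./data", train=False,
                                        download=True, transform=transform)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=32, shuffle=False)

correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)
print(f"Top-1 accuracy: {100 * correct / total:.2f}%")
```

For fine-tuning, pairing the model with `torch.optim.Adam(model.parameters(), lr=0.001)` and `nn.CrossEntropyLoss()` would match the training setup described above.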
Feel free to update this model card with further training details, benchmarks, or usage examples.