Effect of Optimizer Selection and Hyperparameter Tuning on Training Efficiency and LLM Performance
A flexible and extensible implementation of a multithreaded feedforward neural network in Java, including popular optimizers, wrapped in a console user interface
CNeuron is a simple single-neuron neural network implementation in C, designed for easy integration into C projects.
Implementation and comparison of the SGD, SGD with momentum, RMSProp, and AMSGrad optimizers on an image classification task using the MNIST dataset
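The defining update of this topic, SGD with momentum, can be sketched in a few lines of NumPy. This is a minimal illustration, not code from any repository listed here; the function name and hyperparameter defaults are my own choices.

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
    """One SGD-with-momentum update: `velocity` accumulates an
    exponentially decaying sum of past gradients, which damps
    oscillations and speeds up progress along consistent directions."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# Minimize f(w) = w^2 (gradient 2w), starting from w = 5.0
w, v = 5.0, 0.0
for _ in range(100):
    w, v = sgd_momentum_step(w, 2 * w, v)
```

Setting `beta=0` recovers plain SGD, which is one simple way such comparison studies isolate the effect of the momentum term.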
A comparison study of different optimizers, using visualization to find the root of their differences and identify the best choice for a specific task
Week 1 assignment from Coursera's "Advanced Machine Learning - Introduction to Deep Learning"
Deep Learning Optimizers
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
Implementations of different variants of gradient descent in Python using NumPy
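The basic variants such repositories typically cover differ mainly in how much data feeds each gradient step. A minimal sketch on a toy linear-regression problem (all names and the dataset below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + noise
X = rng.normal(size=(200, 1))
y = 3 * X[:, 0] + 0.1 * rng.normal(size=200)

def grad(w, Xb, yb):
    # Gradient of mean squared error for a linear model
    return 2 * Xb.T @ (Xb @ w - yb) / len(yb)

# Batch gradient descent: the full dataset every step
w_batch = np.zeros(1)
for _ in range(100):
    w_batch -= 0.1 * grad(w_batch, X, y)

# Mini-batch SGD: a random subset every step (noisier, cheaper per step)
w_sgd = np.zeros(1)
for _ in range(100):
    idx = rng.choice(len(X), size=32, replace=False)
    w_sgd -= 0.1 * grad(w_sgd, X[idx], y[idx])
```

Both estimates converge near the true slope of 3; the mini-batch version trades a little gradient noise for a much lower per-step cost on large datasets.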
Simple MATLAB toolbox for deep learning network: Version 1.0.3
Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
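Padam's core idea is a partially adaptive update: AMSGrad-style moment estimates, but with the denominator raised to a power p in (0, 1/2], where p = 1/2 recovers AMSGrad and p near 0 approaches SGD with momentum. A minimal sketch of that update (function name, state layout, and defaults are my own; p = 1/8 follows the paper's suggested setting):

```python
import numpy as np

def padam_step(w, g, state, lr=0.1, beta1=0.9, beta2=0.999, p=0.125, eps=1e-8):
    """One Padam step: first and second moments as in Adam, the running
    max of the second moment as in AMSGrad, and a partial power p
    on the adaptive denominator."""
    m, v, v_max = state
    m = beta1 * m + (1 - beta1) * g          # first moment (momentum)
    v = beta2 * v + (1 - beta2) * g**2       # second moment
    v_max = np.maximum(v_max, v)             # AMSGrad max correction
    w = w - lr * m / (v_max**p + eps)        # partially adaptive step
    return w, (m, v, v_max)

# Minimize f(w) = w^2 from w = 5.0
w = np.array(5.0)
state = (0.0, 0.0, 0.0)
for _ in range(300):
    w, state = padam_step(w, 2 * w, state)
```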
Advanced Machine Learning (CSL 712) Course Lab Assignments
Implementation of a 3-layer neural net in NumPy, trained and tested on the MNIST dataset