This repository collects fundamental deep learning components and techniques. It is part of a larger project in which each directory covers an important aspect of deep learning, with a particular focus on NLP (transformers, attention mechanisms, ...).
Each directory has been refactored for improved readability, organization, and functionality.
- `AMAL-student_tp4.2021`: Contains project work from a specific course assignment.
- `Dropout-BN`: Implements Dropout and Batch Normalization, essential regularization techniques in deep learning.
- `RNN`: Implements a Recurrent Neural Network, a type of artificial neural network commonly used for sequence data.
- `Transformer`: Contains the implementation of a Transformer model, an architecture based on self-attention mechanisms.
- `attention-mecanism`: Demonstrates the implementation and usage of attention mechanisms in deep learning models.
- `autograd`: Provides a simple implementation of automatic differentiation, a key component in training neural networks.
- `dataloader`: Demonstrates how to load and preprocess data for a deep learning model.
- `gradient-descent`: Showcases the gradient descent optimization algorithm, a fundamental method used to train machine learning models.
- `lstm-gru`: Contains implementations of Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models, two common variants of RNNs.
- `seq2seq`: Implements a sequence-to-sequence model, commonly used in tasks like machine translation and chatbot development.
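As a quick illustration of the kind of attention mechanism covered above, here is a minimal scaled dot-product attention sketch in NumPy. The function names and shapes are illustrative assumptions, not taken from this repository's code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row is a distribution over keys
    return weights @ V, weights

# Example: 2 queries attending over 3 key/value pairs.
Q = np.random.rand(2, 4)
K = np.random.rand(3, 4)
V = np.random.rand(3, 5)
out, weights = scaled_dot_product_attention(Q, K, V)
```

Each output row is a weighted average of the value vectors, with weights given by the query-key similarities.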
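The `autograd` directory's topic, reverse-mode automatic differentiation, can be sketched with a tiny scalar `Value` class. This is a hypothetical minimal example of the technique, not the repository's actual implementation:

```python
class Value:
    """Minimal scalar reverse-mode autodiff node (illustrative sketch)."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # accumulates grads into parents

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# z = x*y + x, so dz/dx = y + 1 and dz/dy = x.
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
```

After `z.backward()`, `x.grad` holds dz/dx = 4.0 and `y.grad` holds dz/dy = 2.0.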
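Finally, the core idea behind the `gradient-descent` directory fits in a few lines: repeatedly step against the gradient. A minimal sketch on a one-dimensional quadratic (the function and learning rate here are assumptions chosen for illustration):

```python
def gradient_descent(lr=0.1, steps=100):
    """Minimize f(w) = (w - 3)^2 by the update w <- w - lr * f'(w)."""
    w = 0.0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)  # analytic derivative of f
        w -= lr * grad          # descent step
    return w

w_opt = gradient_descent()  # converges toward the minimum at w = 3
```

The same update rule, with gradients supplied by backpropagation, is what trains the neural networks in the other directories.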
Note: `.DS_Store` is a system file created by macOS and does not contain any project-related content.
Contributions are always welcome. If you have any suggestions or improvements, feel free to create a pull request or open an issue.