This repo is home to the code that accompanies the *Data Structures, Algorithms, and Machine Learning Optimization* curriculum, which provides a comprehensive overview of the subjects that underlie contemporary machine learning approaches, including deep learning and other artificial intelligence techniques:
- A Brief History of Data
- A Brief History of Algorithms
- “Big O” Notation for Time and Space Complexity
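
To make those growth rates concrete, here is a minimal, hypothetical Python sketch (not taken from the curriculum notebooks; the function names are mine) contrasting O(n) and O(n²) time, plus an O(n)-time variant that spends O(n) space:

```python
# Hypothetical illustration of "Big O" growth rates (not from the curriculum notebooks).

def contains_linear(items, target):
    """O(n) time: in the worst case, every element is inspected once."""
    for item in items:                  # up to n iterations
        if item == target:
            return True
    return False

def has_duplicate_quadratic(items):
    """O(n^2) time: every pair of elements is compared."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):       # roughly n^2 / 2 comparisons in the worst case
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """O(n) time and O(n) space: trades memory (a set) for speed."""
    seen = set()
    for item in items:
        if item in seen:                # average O(1) membership test
            return True
        seen.add(item)
    return False
```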
- List-Based Data Structures: Arrays, Linked Lists, Stacks, Queues, and Deques
- Searching and Sorting: Binary Search; Bubble, Merge, and Quick Sort
- Set-Based Data Structures: Maps and Dictionaries
- Hashing: Hash Tables, Load Factors, and Hash Maps
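
As a rough illustration of the hashing topics, here is a hypothetical chained hash table whose load factor triggers a resize; the class name, default capacity, and 0.75 threshold are illustrative assumptions rather than the curriculum's own implementation:

```python
# Hypothetical chained hash table with a load-factor-triggered resize (illustrative only).

class HashTable:
    def __init__(self, capacity=8, max_load_factor=0.75):
        self.buckets = [[] for _ in range(capacity)]
        self.size = 0
        self.max_load_factor = max_load_factor

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]  # hash maps a key to a bucket

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                                     # overwrite an existing key
                bucket[i] = (key, value)
                return
        bucket.append((key, value))
        self.size += 1
        if self.size / len(self.buckets) > self.max_load_factor:
            self._resize()                                   # keep chains short as the table fills

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

    def _resize(self):
        old_items = [pair for bucket in self.buckets for pair in bucket]
        self.buckets = [[] for _ in range(2 * len(self.buckets))]
        self.size = 0
        for k, v in old_items:                               # re-hash into the larger table
            self.put(k, v)

table = HashTable()
table.put("learning_rate", 0.01)
print(table.get("learning_rate"))  # 0.01
```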
- Trees: Decision Trees, Random Forests, and Gradient-Boosting (XGBoost)
- Graphs: Terminology and Directed Acyclic Graphs (DAGs), with a DAG traversal sketched in code below
- Resources for Further Study of Data Structures & Algorithms
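
The DAG traversal mentioned above, as a hedged sketch: an adjacency-list graph plus a Kahn's-algorithm topological sort (the pipeline step names are made up for illustration):

```python
# Hypothetical DAG as an adjacency list, topologically sorted with Kahn's algorithm.
from collections import deque

# Each edge points from a prerequisite step to a step that depends on it.
dag = {
    "load data":   ["clean data"],
    "clean data":  ["train model"],
    "train model": ["evaluate"],
    "evaluate":    [],
}

def topological_sort(graph):
    in_degree = {node: 0 for node in graph}
    for successors in graph.values():
        for node in successors:
            in_degree[node] += 1

    queue = deque(node for node, degree in in_degree.items() if degree == 0)
    ordering = []
    while queue:
        node = queue.popleft()
        ordering.append(node)
        for successor in graph[node]:
            in_degree[successor] -= 1
            if in_degree[successor] == 0:   # all prerequisites have been emitted
                queue.append(successor)

    if len(ordering) != len(graph):
        raise ValueError("graph contains a cycle, so it is not a DAG")
    return ordering

print(topological_sort(dag))  # ['load data', 'clean data', 'train model', 'evaluate']
```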
- The Statistical Approach to Regression: Ordinary Least Squares (sketched in code below)
- When Statistical Approaches to Optimization Break Down
- The Machine Learning Solution
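
For the ordinary least squares item above, a hedged sketch of the closed-form solution via the normal equations, w = (XᵀX)⁻¹Xᵀy, on synthetic data (the slope of 2 and intercept of 1 are illustrative choices, not the curriculum's):

```python
# Hypothetical closed-form OLS fit via the normal equations, using PyTorch tensors.
import torch

torch.manual_seed(42)
n = 100
x = torch.linspace(0, 10, n)
y = 2.0 * x + 1.0 + torch.randn(n)          # noisy line with slope 2 and intercept 1

X = torch.stack([torch.ones(n), x], dim=1)  # design matrix with a column of ones for the bias
w = torch.linalg.solve(X.T @ X, X.T @ y)    # solves (X^T X) w = X^T y
print(w)                                    # approximately tensor([1., 2.]): [intercept, slope]
```

In practice a dedicated least-squares routine (for example `torch.linalg.lstsq`) is numerically preferable to forming XᵀX explicitly, but the normal-equation form matches the statistical derivation.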
- Objective Functions
- Cost / Loss / Error Functions
- Minimizing Cost with Gradient Descent
- Learning Rate
- Critical Points, Including Saddle Points
- Gradient Descent from Scratch with PyTorch (a minimal loop is sketched below)
- The Global Minimum and Local Minima
- Mini-Batches and Stochastic Gradient Descent (SGD), combined with learning rate scheduling in the second sketch below
- Learning Rate Scheduling
- Maximizing Reward with Gradient Ascent
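
The from-scratch gradient descent loop referenced above, written with PyTorch autograd on a synthetic line-fitting problem (the learning rate, epoch count, and zero initialization are illustrative guesses, not the curriculum's settings):

```python
# Hypothetical minimal gradient descent loop with PyTorch autograd (illustrative only).
import torch

torch.manual_seed(42)
x = torch.linspace(0, 10, 100)
y = 2.0 * x + 1.0 + torch.randn(100)         # noisy line with slope 2 and intercept 1

m = torch.zeros(1, requires_grad=True)       # slope parameter
b = torch.zeros(1, requires_grad=True)       # intercept parameter
lr = 0.01                                    # learning rate (step size)

for epoch in range(2000):
    y_hat = m * x + b                        # forward pass
    cost = torch.mean((y_hat - y) ** 2)      # mean squared error cost
    cost.backward()                          # autograd computes dC/dm and dC/db
    with torch.no_grad():                    # step down the gradient
        m -= lr * m.grad
        b -= lr * b.grad
    m.grad.zero_()                           # reset gradients for the next epoch
    b.grad.zero_()

print(m.item(), b.item())                    # should approach the true slope (2) and intercept (1)
```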
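
And the mini-batch SGD variant referenced above, this time using `torch.optim` and a `StepLR` learning rate scheduler rather than hand-written updates (batch size, learning rate, and scheduler settings are illustrative):

```python
# Hypothetical mini-batch SGD with learning rate scheduling (illustrative only).
import torch
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(42)
x = torch.linspace(0, 10, 100).unsqueeze(1)
y = 2.0 * x + 1.0 + torch.randn(100, 1)

loader = DataLoader(TensorDataset(x, y), batch_size=10, shuffle=True)  # mini-batches

model = torch.nn.Linear(1, 1)                       # learns a slope and an intercept
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.5)
cost_fxn = torch.nn.MSELoss()

for epoch in range(90):
    for x_batch, y_batch in loader:                 # one stochastic step per mini-batch
        optimizer.zero_grad()
        cost = cost_fxn(model(x_batch), y_batch)
        cost.backward()
        optimizer.step()
    scheduler.step()                                # halve the learning rate every 30 epochs

print(model.weight.item(), model.bias.item())       # roughly 2 and 1, with SGD noise
```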
- A Layer of Artificial Neurons in PyTorch (sketched after this list)
- Jacobian Matrices
- Hessian Matrices and Second-Order Optimization
- Momentum
- Nesterov Momentum
- AdaGrad
- AdaDelta
- RMSProp
- Adam
- Nadam
- Training a Deep Neural Net (a compact Adam-based loop is sketched below)
- Resources for Further Study
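
The dense-layer sketch referenced above: a hypothetical single layer of artificial neurons in PyTorch (the 784-in / 64-out shape and ReLU activation are assumptions of mine, not necessarily what the curriculum uses):

```python
# Hypothetical single dense layer of artificial neurons in PyTorch (illustrative only).
import torch

torch.manual_seed(42)
x = torch.randn(128, 784)                 # a batch of 128 flattened 28x28 "images"

dense_layer = torch.nn.Linear(784, 64)    # 64 neurons, each with 784 weights plus a bias
activation = torch.nn.ReLU()              # element-wise nonlinearity

z = dense_layer(x)                        # weighted sums: z = x @ W.T + b
a = activation(z)                         # activations passed on to the next layer

print(z.shape, a.shape)                   # torch.Size([128, 64]) for both
```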
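
And the compact training loop referenced above: a small deep neural net fit with the Adam optimizer on synthetic data (the architecture, learning rate, epoch count, and synthetic labels are illustrative only, not the curriculum's setup):

```python
# Hypothetical training loop for a small deep neural net with the Adam optimizer.
import torch

torch.manual_seed(42)
X = torch.randn(1024, 20)                          # synthetic features
y = (X.sum(dim=1, keepdim=True) > 0).float()       # synthetic binary labels

model = torch.nn.Sequential(                       # two hidden layers make it "deep"
    torch.nn.Linear(20, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1),                        # raw logit output
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # adaptive per-parameter step sizes
cost_fxn = torch.nn.BCEWithLogitsLoss()

for epoch in range(300):
    optimizer.zero_grad()
    cost = cost_fxn(model(X), y)                   # forward pass and cost
    cost.backward()                                # backpropagation
    optimizer.step()                               # Adam update

accuracy = ((torch.sigmoid(model(X)) > 0.5).float() == y).float().mean()
print(f"training accuracy: {accuracy.item():.2f}")  # should end well above chance (0.5)
```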