N.B.: Please don't copy the assignment and quiz solutions. Try to solve the problems yourself first.
Become an expert in neural networks, and learn to implement them using the deep learning framework PyTorch. Build convolutional networks for image recognition, recurrent networks for sequence generation, generative adversarial networks for image generation, and learn how to deploy models accessible from a website. - Source
- Grokking Deep Learning: https://www.manning.com/books/grokking-deep-learning
- Grokking Deep Learning Book's Exercises: https://github.com/iamtrask/Grokking-Deep-Learning
- Neural Networks And Deep Learning by Michael Nielsen
- The Deep Learning Textbook from Ian Goodfellow, Yoshua Bengio, and Aaron Courville
- A Beginner's Guide to LSTMs and Recurrent Neural Networks
- Understanding LSTM Networks - Colah's Blog
- iGAN - GitHub
- Attacking Machine Learning with Adversarial Examples
- Image-to-Image Demo
- Pix2Pix and CycleGAN Github by Jun-Yan
- CycleGAN
- Implementation of StarGAN
- AWS - Machine Learning Workflow
- GCP - Machine Learning Workflow
- Azure - Machine Learning Workflow
- Open Neural Network Exchange
- Generate your own Bach-style music using DeepBach
- Predict seizures in intracranial EEG recordings on Kaggle
Get your first taste of deep learning by applying style transfer to your own images, and gain experience using development tools such as Anaconda and Jupyter notebooks.
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Welcome to the Deep Learning Nanodegree Program | Application of Deep Learning | Source/GitHub |
2 | Meet Your Instructors | Instructors: Mat, Luis, and Cezanne | Source/GitHub |
3 | Program Structure | Course outline, Chapter Introduction, Project Guidelines | GitHub |
4 | Community Guidelines | Details about the community rules | GitHub |
5 | Prerequisites | Required programming and math skills | GitHub |
6 | Getting Set Up | Required tools: Anaconda, Jupyter Notebook | Source |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Access the Career Portal | Career portal guidelines | GitHub |
2 | Prepare for the Udacity Talent Program | Requirements for the Udacity profile; completing the Udacity profile | GitHub |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | What It Takes | Description of Udacity Nanodegree | Source/GitHub |
2 | Project Reviews | Project review and feedback system | Source/GitHub |
3 | Knowledge | Overview of the Knowledge question-and-answer site | Source |
4 | Mentors and Student Hub | Course support resources: mentors and the Student Hub | Source |
5 | Community Initiatives | Community introduction, project milestones, and daily challenges | Source |
6 | Meet the Careers Team | Career guidance from the Careers team | Source/GitHub |
7 | Introduction to the Career Portal | Creating your career profile | Source/GitHub |
8 | Access Your Career Portal | How to improve your career portal | Source/GitHub |
9 | Your Udacity Professional Profile | The Udacity Professional Profile features important professional information | Source/GitHub |
10 | Prepare for the Udacity Talent Program | Udacity Talent Program; updating your Udacity profile | Source/GitHub |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Frequently Asked Questions (FAQ) | Frequently asked questions and forum | Source |
2 | Support | Overview of the help center | Source |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Instructor | Instructor Mat Leonard; welcome to Anaconda | Source/GitHub |
2 | Introduction | Conda and Anaconda installation steps | Source |
3 | What is Anaconda? | Anaconda's description, managing packages and environments | Source/GitHub |
4 | Installing Anaconda | Installing Anaconda | Source |
5 | Managing packages | Managing packages system | Source/GitHub |
6 | Managing environments | Managing and using environments | Source/GitHub |
7 | More environment actions | Saving, loading, listing and removing environments | Source/GitHub |
8 | Best practices | Using and sharing environments | Source/GitHub |
9 | On Python versions at Udacity | Why Python 3 is used in this course | Source/GitHub |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Introduction | Overview of the lesson | Source/GitHub |
2 | Style Transfer | Overview of style transfer | Source/GitHub |
3 | DeepTraffic | Introduction to DeepTraffic (a deep learning application) | Source/GitHub |
4 | Flappy Bird | DL application in Flappy Bird | Source/GitHub |
5 | Books to Read | Some suggested books for DL | Source/GitHub |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Instructor | Overview of Jupyter Notebook and introduction to the instructor, Mat Leonard | Source/GitHub |
2 | What are Jupyter notebooks? | Introduction to Jupyter notebooks, literate programming, and how notebooks work | GitHub |
3 | Installing Jupyter Notebook | Installing process of Jupyter Notebook | Source/GitHub |
4 | Launching the notebook server | Launching and shutting down the notebook server | Source/GitHub |
5 | Notebook interface | Notebook interface: toolbar, command palette, etc. | Source/GitHub |
6 | Code cells | What code cells are and how to use them | Source/GitHub |
7 | Markdown cells | Writing Markdown cells: headers, emphasis, code, math expressions | Source/GitHub |
8 | Keyboard shortcuts | Keyboard shortcuts | Source/GitHub |
9 | Magic keywords | Timing code, Embedding visualizations, Debugging | Source/GitHub |
10 | Converting notebooks | Converting notebooks to different file formats | Source/GitHub |
11 | Creating a slideshow | Creating and Running a slideshow | Source/GitHub |
12 | Finishing up | Summary and finish message | Source/GitHub |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Introduction | Matrices and their importance in deep learning | Source/GitHub |
2 | Data Dimensions | Data dimensions: scalars, vectors, matrices, tensors | Source/GitHub |
3 | Data in NumPy | NumPy introduction | Source/GitHub |
4 | Element-wise Matrix Operations | Element-wise matrix operations | Source/GitHub |
5 | Element-wise Operations in NumPy | Element-wise matrix operations using NumPy | Source/GitHub |
6 | Matrix Multiplication: Part 1 | Matrix (dot) product versus element-wise multiplication | Source/GitHub |
7 | Matrix Multiplication: Part 2 | Important notes about matrix multiplication | Source/GitHub |
8 | NumPy Matrix Multiplication | NumPy matrix multiplication (see the sketch after this table) | Source/GitHub |
9 | Matrix Transposes | Matrix transposes and when to use them | Source/GitHub |
10 | Transposes in NumPy | Transposing matrices in NumPy | Source/GitHub |
11 | NumPy Quiz | Short programming quiz using a few NumPy features | GitHub |
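To make the NumPy lessons above concrete, here is a minimal sketch (with made-up example values) of element-wise operations, matrix multiplication, and transposes:

```python
import numpy as np

# Two small matrices to illustrate the operations covered above
a = np.array([[1, 2],
              [3, 4]])
b = np.array([[5, 6],
              [7, 8]])

elementwise = a * b        # element-wise product, same shape as a and b
matmul = np.matmul(a, b)   # matrix product (equivalently a @ b)
transposed = a.T           # transpose: rows become columns

print(elementwise)  # [[ 5 12] [21 32]]
print(matmul)       # [[19 22] [43 50]]
print(transposed)   # [[1 3] [2 4]]
```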
Learn neural networks basics, and build your first network with Python and NumPy. Use the modern deep learning framework PyTorch to build multi-layer neural networks, and analyze real data.
Learn how to build convolutional networks and use them to classify images (faces, melanomas, etc.) based on patterns and objects that appear in them. Use these networks to learn data compression and image denoising.
Build your own recurrent networks and long short-term memory networks with PyTorch; perform sentiment analysis and use recurrent networks to generate new text from TV scripts.
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Sentiment RNN, Introduction | LSTM example, Sentiment analysis | Source/GitHub |
2 | Pre-Notebook: Sentiment RNN | Implementing a complete RNN that can classify the sentiment of movie reviews | Source |
3 | Notebook: Sentiment RNN | Implementing a complete RNN that can classify the sentiment of movie reviews | GitHub |
4 | Data Pre-Processing | Importing data, removing punctuation, splitting data into lists | GitHub |
5 | Encoding Words, Solution | Encoding words and labels by converting words to integers | GitHub |
6 | Getting Rid of Zero-Length | Standardizing the input size: removing zero-length reviews and truncating overly long ones | GitHub |
7 | Cleaning & Padding Data | Removing zero-length data | GitHub |
8 | Padded Features, Solution | Padding or truncating all data to a specific length | GitHub |
9 | TensorDataset & Batching Data | Splitting train/validation/test data; loading NumPy data into tensors; batching with a DataLoader | GitHub |
10 | Defining the Model | Introduction to the network model | GitHub |
11 | Complete Sentiment RNN | Initializing the model parameters, feedforward pass, backpropagation, and hidden-state initialization (see the model sketch after this table) | GitHub |
12 | Training the Model | Hyperparameters, Loss function and Optimization | GitHub |
13 | Testing | Testing the model | GitHub |
14 | Inference, Solution | Inference | GitHub |
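As a rough companion to lessons 10-12 above, here is a minimal, hedged sketch of an LSTM-based sentiment classifier in PyTorch; the layer sizes and vocabulary size are illustrative placeholders, not the exact values used in the course notebook:

```python
import torch
import torch.nn as nn

class SentimentRNN(nn.Module):
    """Minimal LSTM sentiment classifier: embedding -> LSTM -> linear -> sigmoid."""
    def __init__(self, vocab_size, embed_dim=400, hidden_dim=256, n_layers=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, n_layers,
                            batch_first=True, dropout=0.5)
        self.fc = nn.Linear(hidden_dim, 1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        embeds = self.embedding(x)                       # (batch, seq_len, embed_dim)
        lstm_out, _ = self.lstm(embeds)                  # (batch, seq_len, hidden_dim)
        out = self.sigmoid(self.fc(lstm_out[:, -1, :]))  # use the last time step
        return out.squeeze(1)                            # (batch,) probabilities

# Illustrative usage: a batch of 8 padded reviews, 200 tokens each (random ids)
model = SentimentRNN(vocab_size=20000)
dummy_batch = torch.randint(0, 20000, (8, 200))
print(model(dummy_batch).shape)  # torch.Size([8])
```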
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Introduction to Attention | Introduction and definition of attention; applications and where attention is used | Source/GitHub |
2 | Encoders and Decoders | Sequence-to-sequence models, encoders, decoders | Source/GitHub |
3 | Sequence to Sequence Recap | Review of sequence-to-sequence models, encoders, decoders | Source/GitHub |
4 | Encoding -- Attention Overview | Overview of encoding in Attention | Source/GitHub |
5 | Decoding -- Attention Overview | Overview of decoding in Attention | Source/GitHub |
6 | Attention Overview | Examples and questions about attention | Source/GitHub |
7 | Attention Encoder | Behind the scenes of the encoder in attention | Source/GitHub |
8 | Attention Decoder | The decoder backbone and the attention decoding phase | Source/GitHub |
9 | Attention Encoder & Decoder | Quiz about encoder and decoder | Source/GitHub |
10 | Bahdanau and Luong Attention | Introduction to the Bahdanau (additive) and Luong (multiplicative) attention models | Source/GitHub |
11 | Multiplicative Attention | Detailed architecture of multiplicative attention | Source/GitHub |
12 | Additive Attention | Concat attention and the detailed architecture of additive attention | Source/GitHub |
13 | Additive and Multiplicative Attention | Quiz: Additive and Multiplicative Attention | Source/GitHub |
14 | Computer Vision Applications | Example of computer vision applications with attention | Source/GitHub |
15 | Other Attention Methods | The Transformer model and a look inside its encoder and decoder | Source/GitHub |
16 | The Transformer and Self-Attention | Full details of the Transformer and Self-Attention architecture | Source/GitHub |
17 | Notebook: Attention Basics | Attention basics: scoring, annotations matrix, softmax, attention context vector (see the sketch after this table) | GitHub |
18 | [SOLUTION]: Attention Basics | Attention basics: scoring, annotations matrix, softmax, attention context vector | GitHub |
19 | Outro | Closing message and a reminder of the most important points | Source/GitHub |
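For lessons 17-18 above, here is a minimal NumPy sketch (with made-up annotation values) of dot-product attention scoring, the softmax over the scores, and the resulting context vector:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Annotations: one row per encoder time step (4 steps, hidden size 3)
annotations = np.array([[1.0, 0.0, 2.0],
                        [0.5, 1.5, 0.0],
                        [2.0, 2.0, 1.0],
                        [0.0, 1.0, 1.0]])
dec_hidden = np.array([1.0, 2.0, 0.5])   # current decoder hidden state

scores = annotations @ dec_hidden        # dot-product (Luong-style) scoring
weights = softmax(scores)                # attention weights, sum to 1
context = weights @ annotations          # weighted sum of the annotations

print(weights)
print(context)
```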
Learn to understand and implement a Deep Convolutional GAN (generative adversarial network) to generate realistic images, with Ian Goodfellow, the inventor of GANs, and Jun-Yan Zhu, the creator of CycleGANs.
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Introducing Ian Goodfellow | Introduction to Ian Goodfellow and his experience | Source/GitHub |
2 | Applications of GANs | What you can do with GANs: text to image, art to realistic image, face to cartoon, day-to-night translation, unsupervised image-to-image translation, imitation learning | Source/GitHub |
3 | How GANs work | Autoregressive models, how GANs work, generator and discriminator models | Source/GitHub |
4 | Games and Equilibria | Game theory, the rock-paper-scissors game, equilibrium | Source/GitHub |
5 | Tips for Training GANs | GAN layer architecture, activation and loss functions for the generator & discriminator, batch normalization | Source/GitHub |
6 | Generating Fake Images | Exercise dataset introduction; MNIST dataset: fake or real images | Source/GitHub |
7 | MNIST GAN | Build a GAN to generate new images of handwritten digits | Source/GitHub |
8 | GAN Notebook & Data | Introduction to the exercise and datasets | Source/GitHub |
9 | Pre-Notebook: MNIST GAN | All about generating new images of handwritten digits | Solution |
10 | Notebook: MNIST GAN | GAN exercise | GitHub |
11 | The Complete Model | Hints of the complete model | GitHub |
12 | Generator & Discriminator | Generator and Discriminator Model implementation | GitHub |
13 | Hyperparameters | Hyperparameters set for the exercise | GitHub |
14 | Fake and Real Losses | Loss calculation for real and fake images (see the sketch after this table) | GitHub |
15 | Optimization Strategy, Solution | Adam optimizers used for the D & G models | GitHub |
16 | Training Two Networks | Complete the training of the networks | GitHub |
17 | Training Solution | Solution of the training of the networks | GitHub |
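As a companion to lesson 14 above, here is a minimal sketch (my own illustration, not the notebook's exact code) of how real and fake discriminator losses can be computed in PyTorch, with optional label smoothing:

```python
import torch
import torch.nn as nn

criterion = nn.BCEWithLogitsLoss()

def real_loss(d_logits, smooth=False):
    # Labels are 1 for real images (0.9 with label smoothing)
    labels = torch.ones(d_logits.size(0)) * (0.9 if smooth else 1.0)
    return criterion(d_logits.squeeze(), labels)

def fake_loss(d_logits):
    # Labels are 0 for fake images
    labels = torch.zeros(d_logits.size(0))
    return criterion(d_logits.squeeze(), labels)

# Illustrative usage with dummy discriminator outputs (raw logits) for a batch of 16
logits = torch.randn(16, 1)
print(real_loss(logits, smooth=True).item(), fake_loss(logits).item())
```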
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Deep Convolutional GANs | Introduction to deep convolutional GANs | Source/GitHub |
2 | DCGAN, Discriminator | DCGAN architecture basics | Source/GitHub |
3 | DCGAN Generator | DCGAN generator, transpose convolutional layers (see the sketch after this table) | Source/GitHub |
4 | What is Batch Normalization? | Batch normalization definition and the math behind it | Source/GitHub |
5 | Pre-Notebook: Batch Norm | Batch norm exercise | Source/GitHub |
6 | Notebook: Batch Norm | Batch norm exercise | Source/GitHub |
7 | Benefits of Batch Normalization | Describes the benefits of batch normalization | Source/GitHub |
8 | DCGAN Notebook & Data | DCGAN exercise introduction | Source/GitHub |
9 | Pre-Notebook: DCGAN, SVHN | DCGAN exercise on the SVHN dataset | Source |
10 | Notebook: DCGAN, SVHN | DCGAN exercise on the SVHN dataset | GitHub |
11 | Scaling, Solution | Scaling calculation | GitHub |
12 | Discriminator | Discriminator architecture for this network | GitHub |
13 | Discriminator, Solution | Discriminator architecture solution for this network | GitHub |
14 | Generator | Describe the general structure of the generator | GitHub |
15 | Generator, Solution | The solution of the generator model | GitHub |
16 | Optimization Strategy | Optimization parameters set for the model | GitHub |
17 | Optimization Solution & Samples | The solution of Optimization parameters | GitHub |
18 | Other Applications of GANs | More about GANs: semi-supervised learning, domain invariance, ethical and artistic applications, further reading | Source/GitHub |
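To illustrate the DCGAN generator idea from the lessons above, here is a minimal, hedged sketch in PyTorch of a generator built from transpose convolutions with batch normalization; the layer sizes are illustrative, not the course notebook's exact architecture:

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Toy DCGAN-style generator: latent vector -> 32x32 single-channel image."""
    def __init__(self, z_size=100, conv_dim=32):
        super().__init__()
        self.conv_dim = conv_dim
        self.fc = nn.Linear(z_size, conv_dim * 4 * 4 * 4)
        self.deconv = nn.Sequential(
            # Each transpose conv doubles the spatial size; batch norm stabilizes training
            nn.ConvTranspose2d(conv_dim * 4, conv_dim * 2, 4, stride=2, padding=1),
            nn.BatchNorm2d(conv_dim * 2),
            nn.ReLU(),
            nn.ConvTranspose2d(conv_dim * 2, conv_dim, 4, stride=2, padding=1),
            nn.BatchNorm2d(conv_dim),
            nn.ReLU(),
            nn.ConvTranspose2d(conv_dim, 1, 4, stride=2, padding=1),
            nn.Tanh(),  # outputs in [-1, 1]
        )

    def forward(self, z):
        x = self.fc(z).view(-1, self.conv_dim * 4, 4, 4)
        return self.deconv(x)

g = Generator()
print(g(torch.randn(8, 100)).shape)  # torch.Size([8, 1, 32, 32])
```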
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Introducing Jun-Yan Zhu | Introduction to Jun-Yan Zhu and his background; introduction to CycleGAN and Pix2Pix | Source/GitHub |
2 | Image to Image Translation | Image to image translation with example | Source/GitHub |
3 | Designing Loss Functions | Losses calculated using Euclidean distance | Source/GitHub |
4 | GANs, a Recap | Full review of the generator and discriminator of a GAN | GitHub |
5 | Pix2Pix Generator | What changes in the Pix2Pix generator | Source/GitHub |
6 | Pix2Pix Discriminator | What changes in the Pix2Pix discriminator | Source/GitHub |
7 | CycleGANs & Unpaired Data | Unpaired data, Mappings, Inverse mappings | Source/GitHub |
8 | Cycle Consistency Loss | Calculate the Cycle Consistency Loss | Source/GitHub |
9 | Why Does This Work? | Weaknesses of CycleGAN | Source/GitHub |
10 | Beyond CycleGANs | Augmented CycleGAN, Paired CycleGAN, Cross-domain models, StarGAN | Source/GitHub |
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | CycleGAN Notebook & Data | Introduction to the exercise, its datasets, and its objective | Source/GitHub |
2 | Pre-Notebook: CycleGAN | CycleGAN exercise | Source |
3 | Notebook: CycleGAN | CycleGAN exercise | GitHub |
4 | DC Discriminator | Implement the Discriminator Function | GitHub |
5 | DC Discriminator, Solution | Solution of the Discriminator Function | GitHub |
6 | Generator & Residual Blocks | Residual blocks and how they are used in the exercise | GitHub |
7 | CycleGAN Generator | Implementing the residual blocks and the generator function | GitHub |
8 | Blocks & Generator, Solution | Solution of Residual Blocks and Generator Function | GitHub |
9 | Adversarial & Cycle Consistency Losses | Description of the adversarial and cycle consistency losses (see the sketch after this table) | GitHub |
10 | Loss & Optimization, Solution | Solution of Loss and optimization | GitHub |
11 | Training Exercise | Implement the training loop for the discriminator and generator | GitHub |
12 | Training Solution & Generated Samples | Solution of the training loop and a look at the generated samples | GitHub |
Train and deploy your own PyTorch sentiment analysis model. Deployment gives you the ability to use a trained model to analyze new, user input. Build a model, deploy it, and create a gateway for accessing it from a website.
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Welcome! | Instructor introduction, deployment lesson introduction | Source/GitHub |
2 | What's Ahead | Behind the scenes of deployment: why it is used and its characteristics | Source/GitHub |
3 | Problem Introduction | How to approach a real life problem using machine learning | Source/GitHub |
4 | Machine Learning Workflow | Structure of machine learning workflow with example | Source/GitHub |
5 | Machine Learning Workflow | Machine learning structure quiz | Source/GitHub |
6 | What is Cloud Computing & Why | Cloud computing definition, benefits, risks, and why it is now so widely used | Source/GitHub |
7 | Why Cloud Computing? | Cloud computing quiz | Source/GitHub |
8 | Machine Learning Applications | Machine Learning Applications, Example of ML in the workplace | Source/GitHub |
9 | Machine Learning Applications | Machine Learning Applications quiz | Source/GitHub |
10 | Paths to Deployment | Path of deployment, DevOps in Machine Learning | Source/GitHub |
11 | Paths to Deployment | Paths to Deployment quiz | Source/GitHub |
12 | Production Environments | Production environments of ML and how it works | Source/GitHub |
13 | Production Environments | Production Environments quiz | Source/GitHub |
14 | Endpoints & REST APIs | Endpoint and REST API description, HTTP communication and Method | Source/GitHub |
15 | Endpoints & REST APIs | Endpoints & REST APIs quiz | Source/GitHub |
16 | Containers | Container definition and structure | Source/GitHub |
17 | Containers | Container Quiz | Source/GitHub |
18 | Containers - Straight From the Experts | Container details from an expert: Jesse Swidler, a senior software engineer at Udacity | Source/GitHub |
19 | Characteristics of Modeling & Deployment | Description of characteristics of Modeling & Deployment | Source/GitHub |
20 | Characteristics of Modeling & Deployment | Characteristics of Modeling & Deployment quiz | Source/GitHub |
21 | Comparing Cloud Providers | Comparison of cloud providers | Source/GitHub |
22 | Comparing Cloud Providers | Comparing Cloud Providers quiz | Source/GitHub |
23 | Closing Statements | Learn about deployment | Source/GitHub |
24 | Summary | Summary of the lesson | Source/GitHub |
25 | [Optional] Cloud Computing Defined | Details about how cloud computing is defined | Source/GitHub |
26 | [Optional] Cloud Computing Explained | Details about how cloud computing is explained | Source/GitHub |
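To make the endpoint idea from lessons 14-15 concrete, here is a minimal, hedged sketch of how an application might send user input to a deployed model over HTTP; the URL and the JSON fields are hypothetical placeholders, not part of the course material:

```python
import requests  # third-party HTTP client

# Hypothetical endpoint URL for a deployed sentiment model; replace with your own.
ENDPOINT_URL = "https://example.com/sentiment"

def get_prediction(review_text):
    """Send user input to the model endpoint and return the parsed JSON response."""
    response = requests.post(ENDPOINT_URL, json={"review": review_text})
    response.raise_for_status()   # fail loudly on HTTP errors
    return response.json()        # e.g. {"sentiment": "positive"} (hypothetical shape)

# print(get_prediction("This movie was surprisingly good!"))
```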
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Introduction to Amazon SageMaker | Basic understanding of the SageMaker service | Source/GitHub |
2 | Create an AWS Account | Procedure to open an AWS account | Source/GitHub |
3 | Checking GPU Access | Checking GPU Access | Source/GitHub |
4 | Setting up a Notebook Instance | Creating an instance in Amazon SageMaker service | Source/GitHub |
5 | Cloning the Deployment Notebooks | Starting the instance and cloning the project files | Source |
6 | Is Everything Set Up? | Setting up the AWS account and SageMaker instance | Source/GitHub |
7 | Boston Housing Example - Getting the Data Ready | Working with Boston Housing Example and Getting the Data Ready | Source/GitHub |
8 | Boston Housing Example - Training the Model | Training the XGBoost model | Source/GitHub |
9 | Boston Housing Example - Testing the Model | Test the model and clean up the data directory | Source/GitHub |
10 | Mini-Project: Building Your First Model | Introduction to the IMDB sentiment analysis mini-project | Source |
11 | Mini-Project: Solution | Solution of the mini project | Source/GitHub |
Throughout this Nanodegree program, you'll have the opportunity to prove your skills by building the following projects:
Build and train neural networks from scratch to predict the number of bikeshare users on a given day.
In this project, you get to build a neural network from scratch to carry out a prediction problem on a real dataset.
The data comes from the UCI Machine Learning Repository.
Design and train a convolutional neural network to analyze images of dogs and correctly identify their breeds. Use transfer learning and well-known architectures to improve this model - this is excellent preparation for more advanced applications.
GitHub Profiles are a key piece of "evidence" to an employer that you'd be a good job candidate, because they can see the details of your work. Recruiters use GitHub as a way to find job candidates, and many Nanodegree alumni have received work opportunities from their activity on GitHub. In addition, using GitHub is a way for you to collaborate on projects with other programmers - this will show that you are able to work well with others on an engineering team on the job.
Build a recurrent neural network on TensorFlow to process text. Use it to generate new episodes of your favorite TV show, based on old scripts.
Build a pair of multi-layer neural networks and make them compete against each other in order to generate new, realistic faces. Try training them on a set of celebrity faces, and see what new faces the computer comes out with!
Train and deploy your own PyTorch sentiment analysis model. You'll build a model and create a gateway for accessing it from a website.
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Intro | Introduction to evaluation metrics | Source/GitHub |
2 | Confusion Matrix | Description of the confusion matrix and how it is used | Source/GitHub |
3 | Confusion Matrix 2 | Quiz solution about confusion matrix | Source/GitHub |
4 | Accuracy | Importance of accuracy in deep learning model | Source/GitHub |
5 | Accuracy 2 | Quiz solution about accuracy | Source/GitHub |
6 | When accuracy won't work | Cases where accuracy is not a useful metric and can be misleading | Source/GitHub |
7 | False Negatives and Positives | Discussion of when false negatives and false positives matter | Source/GitHub |
8 | Precision and Recall | Introduction to precision and recall (see the sketch after this table) | Source/GitHub |
9 | Precision | The precision formula, with examples | Source/GitHub |
10 | Recall | The recall formula, with examples | Source/GitHub |
11 | ROC Curve | How to calculate the ROC curve and its graph | Source/GitHub |
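As a quick companion to the metrics above, here is a minimal scikit-learn sketch (with made-up labels) computing a confusion matrix, accuracy, precision, and recall:

```python
from sklearn.metrics import confusion_matrix, accuracy_score, precision_score, recall_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]   # actual labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]   # model predictions

print(confusion_matrix(y_true, y_pred))                 # rows: actual, columns: predicted
print("accuracy:", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))    # TP / (TP + FP)
print("recall:", recall_score(y_true, y_pred))          # TP / (TP + FN)
```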
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Intro | Introduction to linear regression with examples | Source/GitHub |
2 | Quiz: Housing Prices | Regression example with housing prices | Source/GitHub |
3 | Solution: Housing Prices | Solution of regression problem of housing prices | Source/GitHub |
4 | Fitting a Line Through Data | How to fit a line through data | Source/GitHub |
5 | Moving a Line | How moving a line changes its slope and intercept, explained graphically and mathematically | Source/GitHub |
6 | Absolute Trick | How to apply the absolute trick | Source/GitHub |
7 | Square Trick | Another trick for moving the line closer to the points | Source/GitHub |
8 | Gradient Descent | Minimizing the error with gradient descent | Source/GitHub |
9 | Mean Absolute Error | The mean absolute error formula | Source/GitHub |
10 | Mean Squared Error | The mean squared error formula | Source/GitHub |
11 | Minimizing Error Functions | The relation between the tricks and the error functions, and how to minimize an error function | Source/GitHub |
12 | Mean vs Total Error | The difference between mean and total error | Source/GitHub |
13 | Mini-batch Gradient Descent | Definition of mini-batch gradient descent | Source/GitHub |
14 | Absolute Error vs Squared Error | Differences between Absolute Error and Squared Error | Source/GitHub |
15 | Linear Regression in scikit-learn | Basic scikit-learn; predicting data using sklearn.linear_model (see the sketch after this table) | GitHub |
16 | Higher Dimensions | Error calculation in higher dimensions | Source/GitHub |
17 | Multiple Linear Regression | Multiple linear regression with an exercise | GitHub |
18 | Closed Form Solution | The closed-form solution for linear regression | Source/GitHub |
19 | (Optional) Closed form Solution Math | Derivation of the closed-form solution in n dimensions | GitHub |
20 | Linear Regression Warnings | Where linear regression doesn't work well | GitHub |
21 | Polynomial Regression | Polynomial regression definition with an example | Source/GitHub |
22 | Regularization | L1 & L2 Regularization, Simple & Complex Model | Source/GitHub |
23 | Neural Network Regression | Basic neural network for regression | Source/GitHub |
24 | Neural Networks Playground | A Visual and Interactive Guide to the Basics of Neural Networks | Source/GitHub |
25 | Outro | Closing summary of the lesson | Source/GitHub |
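For lesson 15 above, here is a minimal sketch (with toy data) of fitting and using sklearn.linear_model.LinearRegression:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y is roughly 2*x + 1 with a little noise
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([3.1, 4.9, 7.2, 9.0, 10.8])

model = LinearRegression()
model.fit(X, y)

print("slope:", model.coef_[0])
print("intercept:", model.intercept_)
print("prediction for x=6:", model.predict([[6.0]])[0])
```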
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Welcome to MiniFlow | Limitations of NumPy; introduction to TensorFlow, MiniFlow, and differentiable graphs | Source/GitHub |
2 | Graphs | What a neural network is, neural network graphs, forward propagation | Source/GitHub |
3 | MiniFlow Architecture | Implementing the MiniFlow architecture | Source/GitHub |
4 | Forward Propagation | Implementing forward propagation; changing the Add() function | GitHub |
5 | Forward Propagation Solution | Forward propagation solution with a full explanation | GitHub |
6 | Learning and Loss | Implementing a linear neural network | GitHub |
7 | Linear Transform | Implementing the linear transform functions | GitHub |
8 | Sigmoid Function | Where and how the sigmoid function is implemented (see the sketch after this table) | GitHub |
9 | Cost | Loss calculation using MSE | GitHub |
10 | Cost Solution | Solution of cost function | Source/GitHub |
11 | Gradient Descent | Gradient Descent, Convergence, Divergence | GitHub |
12 | Backpropagation | How to calculate backpropagation in the neural network | GitHub |
13 | Stochastic Gradient Descent | Stochastic gradient descent implementation | GitHub |
14 | SGD Solution | Stochastic Gradient Descent Solution | GitHub |
15 | Outro | Ending Message | Source/GitHub |
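To make lessons 7-9 above concrete, here is a minimal NumPy sketch (my own illustration, not the MiniFlow API itself) of a linear transform followed by a sigmoid activation and an MSE loss:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mse(y, y_hat):
    return np.mean((y - y_hat) ** 2)

# One forward pass of a tiny linear -> sigmoid layer (made-up values)
X = np.array([[0.5, -1.0], [1.5, 2.0]])   # 2 samples, 2 features
W = np.array([[0.1], [0.4]])              # 2 inputs -> 1 output
b = np.array([0.05])

linear_out = X @ W + b            # linear transform
activated = sigmoid(linear_out)   # activation

y = np.array([[0.0], [1.0]])
print("output:", activated.ravel())
print("MSE loss:", mse(y, activated))
```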
No | Lesson | Topic | Link/Source |
---|---|---|---|
1 | Intro | Introduction and lesson structure for Keras | Source/GitHub |
2 | Keras | Useful Keras methods, with an exercise (see the sketch after this table) | GitHub |
3 | Pre-Lab: Student Admissions in Keras | Exercise implementation details | Source/GitHub |
4 | Lab: Student Admissions in Keras | Exercise implementation details | GitHub |
5 | Optimizers in Keras | Details of the different Keras optimizers | Source/GitHub |
6 | Mini Project Intro | Hints for the upcoming mini-project | Source/GitHub |
7 | Pre-Lab: IMDB Data in Keras | Solution Tips | GitHub |
8 | Lab: IMDB Data in Keras | Exercise implementation details | GitHub |
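For the Keras lessons above, here is a minimal, hedged sketch of a small Sequential network for binary classification (shown with tensorflow.keras imports and random toy data; the course labs use their own datasets):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

# Toy binary-classification data: 100 samples, 10 features
X = np.random.rand(100, 10)
y = np.random.randint(0, 2, size=(100,))

model = Sequential([
    Dense(32, activation="relu", input_shape=(10,)),
    Dropout(0.2),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
print("training accuracy:", model.evaluate(X, y, verbose=0)[1])
```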
The premise of this challenge is to build a habit of practicing new skills by making a public commitment to practice the topic of your program every day for 30 days.