My journey through the #100DaysOfMLCode Challenge
- Started working on Entity Extraction.
- I will be using a bi-LSTM + CRF with character embeddings for NER and POS tagging.
- Reading: https://guillaumegenthial.github.io/sequence-tagging-with-tensorflow.html
- Repo: https://github.com/partoftheorigin/EntityExtraction
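The CRF layer in that architecture scores a whole tag sequence (per-token emission scores from the bi-LSTM plus learned tag-transition scores) and decodes the best sequence with Viterbi. A minimal numpy sketch of just that scoring and decoding step, with toy scores standing in for real model outputs:

```python
import numpy as np

def sequence_score(emissions, transitions, tags):
    # emissions: (T, K) per-token tag scores (from the bi-LSTM in the real model)
    # transitions: (K, K) score of moving from tag i to tag j
    score = emissions[np.arange(len(tags)), tags].sum()
    score += transitions[tags[:-1], tags[1:]].sum()
    return score

def viterbi_decode(emissions, transitions):
    # Dynamic program over tags: dp[j] = best score of any path ending in tag j.
    T, K = emissions.shape
    dp = emissions[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = dp[:, None] + transitions + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        dp = cand.max(axis=0)
    # Follow backpointers from the best final tag.
    best = [int(dp.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(back[t][best[-1]]))
    return best[::-1]
```

In the full model these functions run on learned parameters; training maximizes the score of the gold tag sequence against the log-sum over all sequences.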
- Almost done with Sequence Tagging.
- Repo: https://github.com/partoftheorigin/EntityExtraction/blob/master/LSTM_CRF_Twitter_NER.ipynb
- Working on Automated Question Generation.
- Research papers:
- Working on Automated Question Generation.
- Attended a MeetUp on Getting to the Core of Deep Learning.
- Still working through the underlying concepts and approaches in the research papers so that we can build our own model from scratch.
- Working on Automated Question Generation.
- Experimenting with SQuAD 2.0, the Stanford Question Answering Dataset.
- Working on Automated Question Generation.
- Started implementing a context-based question-answering model for the Stanford SQuAD dataset, based on the NeuralQA paper.
- Preparing data now.
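The raw SQuAD JSON nests answers inside `qas` inside `paragraphs` inside articles, so a typical first preparation step is flattening it into training rows. A sketch (field names follow the published SQuAD 2.0 format; the exact row layout here is just illustrative):

```python
import json

def squad_to_examples(squad_dict):
    """Flatten SQuAD JSON into (context, question, answer_text, answer_start) rows."""
    examples = []
    for article in squad_dict["data"]:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                if qa.get("is_impossible"):  # SQuAD 2.0 unanswerable questions
                    examples.append((context, qa["question"], "", -1))
                    continue
                ans = qa["answers"][0]  # take the first annotated answer
                examples.append((context, qa["question"],
                                 ans["text"], ans["answer_start"]))
    return examples
```

Usage would be something like `squad_to_examples(json.load(open("train-v2.0.json")))`, with the file name matching the downloaded release.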
- Continuing work on Neural Question Answer Generation.
- Processing the SQuAD dataset to get it ready for training.
- Once the data is ready, I will create word embeddings using GloVe.
- Working on Neural QA.
- Preprocessed and saved data to be used afterwards for training.
- Creating word embeddings using GloVe.
- Repo: https://github.com/partoftheorigin/NeuralQuestionAnswer
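For the embedding step, a common approach is to parse the pretrained GloVe text file (e.g. the standard `glove.6B.100d.txt` release) into a matrix aligned with the model's vocabulary. A sketch, with the vocabulary mapping assumed:

```python
import numpy as np

def build_embedding_matrix(glove_lines, vocab, dim=100):
    # glove_lines: iterable of "word v1 v2 ... v_dim" lines from the GloVe file
    # vocab: token -> row index; tokens missing from GloVe keep zero vectors
    vectors = {}
    for line in glove_lines:
        parts = line.rstrip().split(" ")
        vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    emb = np.zeros((len(vocab), dim), dtype=np.float32)
    for token, idx in vocab.items():
        if token in vectors:
            emb[idx] = vectors[token]
    return emb
```

In practice this would be called as `build_embedding_matrix(open("glove.6B.100d.txt", encoding="utf-8"), vocab)` and the resulting matrix used to initialize the model's embedding layer.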
- Read this text classification guide by Google Developers: https://developers.google.com/machine-learning/guides/text-classification/
- Helped a friend with her first ML project: classification on the Iris dataset.
- Will continue work on Automated Question Answer Generation using SQuAD Dataset.
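Her project code isn't linked here, so as a sketch of the kind of first Iris classifier this usually means, assuming a scikit-learn stack:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load the classic 150-sample, 3-class Iris dataset.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# A simple linear baseline is usually near-perfect on Iris.
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```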
- Browsed various GitHub repositories.
- Worked on Google's Text Classification guide and Neural QA.
- Completed work on Automated Question Generation.
- Read research on Fast and Easy Short Answer Grading with High Accuracy - http://www.aclweb.org/anthology/N16-1123
- Getting hands-on with statistical packages in Python.
- Read research on Neural Arithmetic Logic Units - https://arxiv.org/pdf/1808.00508.pdf
- Neural networks enhanced with Neural Arithmetic Logic Units (NALU) can learn to track time, perform arithmetic over images of numbers, translate numerical language into real-valued scalars, execute computer code, and count objects in images.
- Will be implementing this paper in PyTorch.
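Before the PyTorch version, the NALU forward pass from the paper (a learned gate mixing an additive NAC cell with a log-space multiplicative cell) can be sketched in plain numpy; the weight shapes below are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nalu_forward(x, W_hat, M_hat, G, eps=1e-7):
    # NAC: tanh * sigmoid pushes effective weights toward {-1, 0, 1},
    # biasing the cell toward exact addition/subtraction of inputs.
    W = np.tanh(W_hat) * sigmoid(M_hat)
    a = x @ W                                # additive path
    m = np.exp(np.log(np.abs(x) + eps) @ W)  # multiplicative path, via log space
    g = sigmoid(x @ G)                       # input-dependent gate
    return g * a + (1.0 - g) * m
```

With weights saturated toward 1, the additive path sums the inputs and the multiplicative path multiplies them, which is what lets the unit extrapolate arithmetic beyond its training range.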