ChatFlow uses natural language processing (NLP) to build a chatbot that can understand and respond to user queries. The chatbot is trained on a dataset of intents: predefined categories of user input, each consisting of example patterns and a corresponding tag. The project first preprocesses this data, tokenizing the patterns into individual words and stemming them with the Lancaster stemming algorithm. The stemmed words are then transformed into numerical vectors using a bag-of-words approach, and the tags are one-hot encoded to represent the different intents. A neural network model is constructed with the TensorFlow and TFLearn libraries and trained on the preprocessed data using a softmax output activation and categorical cross-entropy loss. The trained model is saved for future use.
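The preprocessing steps above can be sketched in plain Python. The real project uses nltk's tokenizer and `LancasterStemmer`; here, purely for illustration, a lowercase split stands in for tokenization and a crude prefix truncation stands in for stemming, and the two example patterns are hypothetical:

```python
# Dependency-free sketch of preprocessing: tokenize, stem, bag-of-words,
# one-hot labels. The project itself uses nltk.word_tokenize and
# nltk.stem.LancasterStemmer; these stand-ins only illustrate the idea.

def tokenize(sentence):
    return sentence.lower().replace("?", "").replace("!", "").split()

def stem(word):
    return word[:4]  # crude stand-in for the Lancaster stemmer

# Hypothetical (pattern, tag) pairs mirroring the intents.json structure.
patterns = [("Hi there", "greeting"), ("Goodbye friend", "goodbye")]
tags = sorted({tag for _, tag in patterns})

# Vocabulary of stemmed words across all patterns.
vocab = sorted({stem(w) for sent, _ in patterns for w in tokenize(sent)})

def bag_of_words(sentence):
    # 1 if the stemmed word appears in the sentence, else 0.
    stems = {stem(w) for w in tokenize(sentence)}
    return [1 if w in stems else 0 for w in vocab]

def one_hot(tag):
    # 1 at the position of this tag, 0 elsewhere.
    return [1 if t == tag else 0 for t in tags]

X = [bag_of_words(s) for s, _ in patterns]  # numerical input vectors
y = [one_hot(t) for _, t in patterns]       # one-hot intent labels
```

Each pattern becomes a fixed-length vector over the stemmed vocabulary, which is what the network consumes.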
- Utilizes natural language processing (NLP) to understand user queries.
- Trains chatbot on predefined intent dataset.
- Data preprocessing includes tokenization, stemming, numerical representation, and one-hot encoding.
- Constructs neural network model using TensorFlow and TFLearn libraries.
- Uses a softmax activation function and categorical cross-entropy loss.
- Trains model on preprocessed data to associate patterns with specific intents.
- Saves trained model for future use.
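The softmax activation and categorical cross-entropy loss mentioned above can be illustrated in plain NumPy. This is a mathematical sketch, not the project's TFLearn code; the logits are arbitrary example values:

```python
import numpy as np

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

def categorical_cross_entropy(probs, one_hot_label):
    # -sum(y * log(p)); only the true class's probability contributes.
    return -float(np.sum(one_hot_label * np.log(probs)))

logits = np.array([2.0, 0.5, 0.1])        # raw network outputs (example values)
probs = softmax(logits)                   # probabilities summing to 1
loss = categorical_cross_entropy(probs, np.array([1, 0, 0]))
```

Training drives the loss down by pushing the softmax probability of the correct intent toward 1.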
To install ChatFlow, follow these steps:
- Clone the repository using HTTPS:

  ```bash
  git clone https://github.com/{User}/ChatFlow.git
  ```

- Alternatively, clone using SSH:

  ```bash
  git clone git@github.com:{User}/ChatFlow.git
  ```

- Navigate to the project directory:

  ```bash
  cd ChatFlow
  ```
- Utilizes natural language processing (NLP) and machine learning techniques.
- Imports libraries such as nltk (including LancasterStemmer), tensorflow, tflearn, numpy, random, json, and pickle for data handling and serialization.
- Loads training data from an intents.json file, tokenizing and stemming text patterns and normalizing words.
- Converts data into numerical arrays and output labels.
- Saves preprocessed data into a data.pickle file to avoid repeated preprocessing.
- Defines neural network architecture using tflearn, consisting of input layers, fully connected layers, and an output layer.
- Trains the model on the preprocessed training data and saves it to a file (model.tflearn).
- On subsequent runs, loads the saved model if the file exists, avoiding retraining and ensuring efficient reuse.
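Once trained, responding to a query reduces to picking the intent with the highest predicted probability and choosing one of its canned replies. A minimal sketch, with hypothetical labels, intents, and probability values (in the project, the probabilities would come from the TFLearn model's prediction on a bag-of-words vector):

```python
import random

# Hypothetical tags and responses, mirroring the intents.json structure.
labels = ["greeting", "goodbye", "thanks"]
intents = {
    "greeting": ["Hello!", "Hi there!"],
    "goodbye": ["See you!", "Bye!"],
    "thanks": ["Happy to help!"],
}

def respond(probabilities):
    # Pick the tag with the highest predicted probability,
    # then return a random canned reply for that intent.
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    return random.choice(intents[labels[best]])

print(respond([0.1, 0.7, 0.2]))  # a reply from the "goodbye" intent
```

Selecting a random response per intent keeps the chatbot from sounding repetitive across turns.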
Contributions are welcome! To contribute to ChatFlow, follow these steps:
- Fork the repository.
- Create a new branch:

  ```bash
  git checkout -b feature/YourFeature
  ```

- Make your changes and commit them:

  ```bash
  git commit -m "Add new feature"
  ```

- Push to the branch:

  ```bash
  git push origin feature/YourFeature
  ```

- Create a new Pull Request.
This project is licensed under the MIT License. See the LICENSE file for details.