A simple Python script to find and compare the entropy of WhatsApp chats.
Here I use the Shannon entropy equation to compute the word-based entropy of WhatsApp messages (chats with 3 of my friends).
- Download this notebook.
- Export your WhatsApp chats (without media) as a .txt file. You can find the export option in WhatsApp (top-left corner --> more options).
- Save the .txt files in the same directory as the notebook.
- Rename the files 'friend A.txt', 'friend B.txt', 'friend C.txt'.
- Put your username in the right place in the code (please see the notebook).
- Now just see the results and share them with your friends!
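The steps above boil down to parsing each exported .txt file into per-sender messages. Here is a minimal sketch of that parsing step; the timestamp layout in WhatsApp exports varies by locale, so the regex below assumes a common "M/D/YY, HH:MM - Sender: message" format and may need adjusting for your export:

```python
import re

# Assumed export line format: "1/2/23, 10:00 - Sender: message text".
# Adjust this regex if your locale's export uses a different layout.
LINE_RE = re.compile(r"^\d{1,2}/\d{1,2}/\d{2,4}, \d{1,2}:\d{2} - ([^:]+): (.+)$")

def messages_by_sender(lines):
    """Group message texts by sender. Lines that don't match the pattern
    (system notices, multi-line message continuations) are skipped in
    this simplified sketch."""
    chats = {}
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            sender, text = m.groups()
            chats.setdefault(sender, []).append(text)
    return chats

# Example with an in-memory sample instead of reading 'friend A.txt':
sample = [
    "1/2/23, 10:00 - friend A: hello there",
    "1/2/23, 10:01 - Me: hi! how are you",
]
print(messages_by_sender(sample))
```

In the notebook you would read each 'friend X.txt' with `open(...).readlines()` and feed the lines to `messages_by_sender`, then pick out your own username's messages for comparison.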
In the context of information theory, entropy is a measure of the information contained in a message: higher entropy indicates more information. The results of this project can give you an idea of how much information you or your friends provide while chatting. However, this is an oversimplified model, so it may not produce correct results in complicated situations (e.g. use of different languages, heavy use of different words with the same meaning, etc.).
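Concretely, the word-based Shannon entropy is H = -Σ p(w) log₂ p(w), where p(w) is the relative frequency of each word w in a person's messages. A small illustrative sketch (the texts are made-up examples):

```python
import math
from collections import Counter

def word_entropy(text):
    """Shannon entropy in bits per word: H = -sum(p * log2(p)),
    where p is each word's relative frequency."""
    words = text.lower().split()
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A repetitive message carries no information per word ...
print(word_entropy("ok ok ok ok"))  # 0.0
# ... while a more varied one scores higher.
print(round(word_entropy("see you at the cafe at noon"), 2))
```

This also shows the model's limits: the score only reflects word variety, not meaning, which is why synonyms or mixed languages can distort it.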
If you have any new ideas about this, please contribute via a pull request or contact me. If you liked this repo, star it!