GS@HackTheDeep: Map the Collections
Full-stack, end-to-end user experience - from importing the file, through cleaning and enriching the data, to finally visualising it with full search capability. A sketch of how these pieces fit together follows the tech stack list below.
Tech stack:
- Flask (Python web framework) for the backend
- WoRMS (World Register of Marine Species) API for taxonomy lookups
- Google Maps API for visualisation and geolocation lookups
- Python autocorrect library for spell checking
- Elasticsearch to store the newly transformed data
- React.js for a modular UI
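
A minimal sketch of the pipeline, under some assumptions: a local Elasticsearch instance, the public WoRMS REST API, and an illustrative `specimens` index with made-up record fields (not the project's real schema or configuration).

```python
import requests
from autocorrect import Speller
from elasticsearch import Elasticsearch
from flask import Flask, jsonify

app = Flask(__name__)
spell = Speller()
es = Elasticsearch("http://localhost:9200")  # assumed local instance

WORMS_BASE = "https://www.marinespecies.org/rest"

def lookup_taxonomy(name):
    """Fetch the closest matching WoRMS Aphia record for a name."""
    resp = requests.get(
        f"{WORMS_BASE}/AphiaRecordsByName/{name}",
        params={"like": "true", "marine_only": "true"},
    )
    if resp.status_code == 200 and resp.json():
        return resp.json()[0]  # take the best match
    return None

@app.route("/transform/<name>")
def transform(name):
    cleaned = spell(name)  # spell-check the raw value first
    record = {
        "raw": name,
        "cleaned": cleaned,
        "taxonomy": lookup_taxonomy(cleaned),
    }
    # elasticsearch-py 8.x; older clients take body= instead of document=
    es.index(index="specimens", document=record)
    return jsonify(record)
```

With something like this in place, the React UI only needs to call the Flask endpoint for transformation and can query Elasticsearch directly for search.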

Challenges:
- Taxonomy lookup: special characters are currently ignored, which flags more errors in the data than are actually there
- API calls all needed to be batched (see the batching sketch after this list)
- Run time for a full transformation is currently quite lengthy
- Learning React.js and other new libraries from scratch
- Date formats that are purely text, e.g. "Summer of 1992"
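
One way the lookups could be batched - a sketch assuming the WoRMS `/AphiaRecordsByNames` endpoint, which accepts several scientific names per request; the chunk size of 50 is an arbitrary assumption, not a documented limit.

```python
import requests

WORMS_BASE = "https://www.marinespecies.org/rest"

def batched_lookup(names, chunk_size=50):
    """Yield (name, candidate_records) pairs, one request per chunk."""
    for i in range(0, len(names), chunk_size):
        chunk = names[i:i + chunk_size]
        resp = requests.get(
            f"{WORMS_BASE}/AphiaRecordsByNames",
            # repeated scientificnames[] keys, one per name in the chunk
            params=[("scientificnames[]", n) for n in chunk],
        )
        resp.raise_for_status()
        # The response is one list of candidate records per input name;
        # an entry can be null when a name has no match.
        payload = resp.json() if resp.content else []
        for name, records in zip(chunk, payload):
            yield name, records or []
```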

Next steps:
- Create and extend the cleaning process to cover more columns
- Populate sparse location attributes by reverse-geocoding the enriched coordinates (see the sketch after this list)
- Create a REST API to receive the "dirty" data file instead of uploading it manually
- Add additional visualisation libraries on top of Elasticsearch, now that the data is stored in an easily queryable state
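
A minimal sketch of that back-filling step, assuming the Google Maps reverse-geocoding endpoint and an API key in the `GOOGLE_API_KEY` environment variable; the country/locality field names are illustrative, not the actual collection columns.

```python
import os
import requests

def reverse_geocode(lat, lng):
    """Return location fields parsed from a reverse-geocode result."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"latlng": f"{lat},{lng}", "key": os.environ["GOOGLE_API_KEY"]},
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    fields = {}
    if results:
        # Walk the address components of the best match and keep the
        # pieces that could fill the sparse location attributes.
        for comp in results[0]["address_components"]:
            if "country" in comp["types"]:
                fields["country"] = comp["long_name"]
            if "locality" in comp["types"]:
                fields["locality"] = comp["long_name"]
    return fields
```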