Wrapper for Keras that simplifies loading and handling multimodal data and models.
The library documentation is available at marcbs.github.io/multimodal_keras_wrapper/
Some code examples are available in demo.ipynb and test.py. Additionally, the Projects section lists practical examples of projects built with this library.
The following dependencies are required for using this library:
- Anaconda
- Keras - custom fork or original version
- cloud >= 2.8.5
- scipy
- coco-caption
The following is required only when using NMS (non-maximum suppression) for certain localization utilities:
- cython >= 0.23.4
To install the library, follow these steps:
- Clone this repository:
git clone https://github.com/MarcBS/multimodal_keras_wrapper.git
- Add the repository path to your PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/path/to/multimodal_keras_wrapper
- Optionally, install the dependencies (this will also install our custom Keras fork):
pip install -r requirements.txt
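After the steps above, you can verify that Python can locate the wrapper. The sketch below mimics the PYTHONPATH export at runtime via `sys.path`; the clone location is a placeholder you must adjust, and the top-level package name `keras_wrapper` is an assumption about this repository's layout.

```python
import importlib.util
import sys

# Hypothetical clone location -- replace with where you actually cloned the repository.
repo_path = "/path/to/multimodal_keras_wrapper"

# Appending to sys.path at runtime has the same effect as the PYTHONPATH export above.
sys.path.append(repo_path)

# find_spec returns None when the package cannot be located on the current path.
spec = importlib.util.find_spec("keras_wrapper")
print("found" if spec else "not found -- check the path above")
```

If the package is reported as not found, double-check that the path points at the directory containing the cloned repository.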
More practical examples can be found in projects that use this library:
VIBIKNet for Visual Question Answering
ABiViRNet for Video Description
Sentence-SelectioNN for Domain Adaptation in SMT
For additional information on the underlying deep learning library, visit the official Keras web page at www.keras.io or the GitHub repository at https://github.com/fchollet/keras.
You can also use our custom Keras version, which provides several additional layers for multimodal learning.
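Since both the original Keras and the custom fork install under the same `keras` package name, it can be useful to check which installation Python will actually import. A minimal sketch (this helper is an illustration, not part of the library's API):

```python
import importlib.util

def package_location(name="keras"):
    """Return the file Python would import the given package from, or None if absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# Prints e.g. /path/to/site-packages/keras/__init__.py, or None if Keras is not installed.
print(package_location("keras"))
```

If the printed path points inside the cloned fork's directory, the custom version is the one in use.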