Home
A first approach to structuring the parameter and tool space of semantic entity annotation systems was published by Cornolti et al. [0]. However, the BAT-framework is hard to set up and does not allow for an easy comparison of tools and datasets.

GERBIL [3] offers an easy-to-use web-based platform for the agile comparison of annotators using multiple datasets and uniform measuring approaches. To add a tool to GERBIL, all the end user has to do is provide a URL to a REST interface for the tool that abides by a given specification. The integration and benchmarking of the tool against user-specified datasets is then carried out automatically by the GERBIL platform. Currently, our platform provides results for 7 annotators (more coming up soon). Internally, GERBIL is based on the NLP Interchange Format (NIF) and provides Java classes for implementing NIF-based APIs for datasets and annotators. Furthermore, all datasets and annotators are described in RDF. GERBIL will provide persistent URIs for your experiments that are publishable and allow for a new level of reproducibility.
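The sketch below illustrates what such a NIF-based REST wrapper around an annotator could look like in plain Java, using only the JDK's built-in `HttpServer`. It is a minimal sketch under stated assumptions: the endpoint path `/annotate`, the port, the Content-Type, and the `annotate` helper are illustrative placeholders and not part of GERBIL's actual specification. A real wrapper would parse the incoming NIF document, run the wrapped annotation tool, and serialize the entity markings back to NIF (for example with GERBIL's provided Java classes).

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Minimal sketch of an annotator wrapper exposing a REST endpoint that GERBIL could call.
// Path, port, and Content-Type below are assumptions for illustration only.
public class NifAnnotatorService {

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/annotate", exchange -> {
            // The benchmark sends the document as NIF (RDF, typically serialized as Turtle)
            // in the request body.
            String nifRequest;
            try (InputStream in = exchange.getRequestBody()) {
                nifRequest = new String(in.readAllBytes(), StandardCharsets.UTF_8);
            }

            // Hypothetical annotation step standing in for the wrapped tool.
            String nifResponse = annotate(nifRequest);

            byte[] body = nifResponse.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/x-turtle");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
    }

    // Placeholder: a real implementation would add entity annotations
    // (e.g. marked phrases linked to knowledge-base URIs) to the NIF document.
    private static String annotate(String nifDocument) {
        return nifDocument; // echo only; no annotations are added in this sketch
    }
}
```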
GERBIL will not return annotated text to the user. Users who want their texts annotated have to use the annotation tool of their choice directly.
[0] Cornolti et al.: http://dl.acm.org/citation.cfm?id=2488411
[3] GERBIL project page: http://aksw.org/Projects/GERBIL.html