
ImplementedMeasures

JIDT implements a range of information-theoretic measures for both discrete- and continuous-valued variables.

The measures (and estimation techniques) implemented by the toolkit are as follows:

| Measure | Discrete | Continuous (estimation technique specified) |
| --- | --- | --- |
| Entropy | yes | (box) kernel estimation*, Kozachenko*, Gaussian* |
| Entropy rate | yes | use two multivariate entropy calculators |
| Mutual information | yes | Kraskov*+, (box) kernel estimation*+, Gaussian*+, Symbolic*+ |
| Conditional mutual information | yes | Kraskov*+, Gaussian*, Symbolic*+ |
| Multi-information / Integration | yes | Kraskov, (box) kernel estimation |
| Transfer entropy | yes | Kraskov*, (box) kernel estimation*, Gaussian*, Symbolic |
| Conditional/complete transfer entropy | yes | Kraskov*, Gaussian* |
| Active information storage | yes | Kraskov, (box) kernel estimation, Gaussian |
| Predictive information / Excess entropy | yes | Kraskov, (box) kernel estimation, Gaussian |
| Separable information | yes | |

Key to table:

* \* indicates a multivariate (i.e. joint state) implementation is provided.
* \+ indicates the calculator also implements MI from multivariate continuous variables to a discrete variable.
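
As a rough illustration of how the continuous estimators in the table are driven, here is a minimal Java sketch computing transfer entropy with the Kraskov (KSG) estimator, modelled on the toolkit's simple Java demos. The class and method names (`TransferEntropyCalculatorKraskov`, `setProperty`, `initialise`, `setObservations`, `computeAverageLocalOfObservations`) are those used in JIDT v1.x; the toy time series is invented purely for illustration.

```java
import infodynamics.measures.continuous.kraskov.TransferEntropyCalculatorKraskov;
import java.util.Random;

public class TeKraskovSketch {
    public static void main(String[] args) throws Exception {
        // Toy coupled data (illustrative only): dest copies the previous source value plus noise
        Random rng = new Random(0);
        int n = 1000;
        double[] source = new double[n];
        double[] dest = new double[n];
        for (int t = 0; t < n; t++) {
            source[t] = rng.nextGaussian();
            dest[t] = (t > 0 ? source[t - 1] : 0.0) + 0.1 * rng.nextGaussian();
        }

        // Transfer entropy via the Kraskov (KSG) estimator
        TransferEntropyCalculatorKraskov teCalc = new TransferEntropyCalculatorKraskov();
        teCalc.setProperty("k", "4");   // number of nearest neighbours for the KSG estimator
        teCalc.initialise(1);           // destination history (embedding) length k = 1
        teCalc.setObservations(source, dest);
        double teNats = teCalc.computeAverageLocalOfObservations();
        System.out.printf("TE_Kraskov(source -> dest) = %.3f nats%n", teNats);
    }
}
```

The other continuous calculators listed above follow the same construct / set properties / initialise / supply observations / compute pattern, differing mainly in the estimator-specific properties they accept.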