Docstring/export improvements (#76)
* use reexport for entropies

* improve docstring of mutualinfo

* use missing name
Datseris authored Aug 26, 2021
1 parent 5ef6d18 commit a4176d6
Showing 3 changed files with 22 additions and 58 deletions.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "TransferEntropy"
uuid = "ea221983-52f3-5440-99c7-13ea201cd633"
repo = "https://github.com/kahaaga/TransferEntropy.jl.git"
version = "1.2.1"
version = "1.3.0"

[deps]
DSP = "717857b8-e6f2-59f4-9121-6e50c889abd2"
12 changes: 3 additions & 9 deletions src/core.jl
@@ -1,12 +1,6 @@
import Entropies:
genentropy, ProbabilitiesEstimator, EntropyEstimator,
VisitationFrequency, RectangularBinning,
NaiveKernel, DirectDistance, TreeDistance,
NearestNeighborEntropyEstimator, KozachenkoLeonenko, Kraskov

export VisitationFrequency, RectangularBinning,
NaiveKernel, DirectDistance, TreeDistance,
KozachenkoLeonenko, Kraskov
using Reexport
@reexport using Entropies
using Entropies: NearestNeighborEntropyEstimator

import DelayEmbeddings: AbstractDataset, Dataset
export Dataset
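Since this hunk replaces the hand-written import/export list with `@reexport using Entropies`, the estimator names users previously obtained from TransferEntropy should remain in scope after `using TransferEntropy`. A minimal sketch of what that enables, assuming Entropies.jl keeps exporting these names with these constructors:

```julia
using TransferEntropy  # the @reexport above brings the Entropies.jl names along

# Names that were previously re-exported by hand, now provided via Reexport.jl
est_binning = VisitationFrequency(RectangularBinning(4))
est_kernel  = NaiveKernel(0.2)
est_knn     = Kraskov(k = 3)
```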
66 changes: 18 additions & 48 deletions src/mutualinfo/interface.jl
@@ -3,7 +3,24 @@ export mutualinfo, Kraskov1, Kraskov2
abstract type MutualInformationEstimator <: EntropyEstimator end

"""
## Mutual information
mutualinfo(x, y, est; base = 2, q = 1)

Estimate mutual information between `x` and `y`, ``I^{q}(x; y)``, using the provided
entropy/probability estimator `est` from Entropies.jl, and Rényi entropy of order `q`
(defaults to `q = 1`, which is the Shannon entropy), with logarithms to the given `base`.
Both `x` and `y` can be vectors or (potentially multivariate) [`Dataset`](@ref)s.

Worth highlighting here are the estimators that compute entropies _directly_, e.g.
nearest-neighbor based methods. The choice is between naive estimation using the
[`KozachenkoLeonenko`](@ref) or [`Kraskov`](@ref) entropy estimators, or the improved
[`Kraskov1`](@ref) and [`Kraskov2`](@ref) dedicated ``I`` estimators. The latter
estimators reduce bias compared to the naive estimators.

**Note**: only Shannon entropy can be used with nearest neighbor estimators, so the
keyword `q` cannot be provided; it is hardcoded as `q = 1`.

## Description

Mutual information ``I`` between (potentially collections of) variables ``X`` and ``Y``
is defined as
@@ -20,53 +37,6 @@ I^{q}(X; Y) = H^{q}(X) + H^{q}(Y) - H^{q}(X, Y),
```
where ``H^{q}(\\cdot)`` is the generalized Rényi entropy of order ``q``.
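For context, the decomposition above can be sketched directly with `genentropy` (re-exported from Entropies.jl via core.jl) and a binning estimator; the data, the binning, and the keyword signature `genentropy(x, est; base, q)` are assumptions here, not part of this commit:

```julia
using TransferEntropy

x, y = rand(2000), rand(2000)                      # made-up scalar series
est  = VisitationFrequency(RectangularBinning(5))  # example binning estimator

Hx  = genentropy(Dataset(x), est; base = 2, q = 1)     # H^q(X)
Hy  = genentropy(Dataset(y), est; base = 2, q = 1)     # H^q(Y)
Hxy = genentropy(Dataset(x, y), est; base = 2, q = 1)  # H^q(X, Y)

I = Hx + Hy - Hxy   # should roughly agree with mutualinfo(x, y, est; base = 2, q = 1)
```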
## General interface
mutualinfo(x, y, est; base = 2, q = 1)
Estimate mutual information between `x` and `y`, ``I^{q}(x; y)``, using the provided
entropy/probability estimator `est` and Rényi entropy of order `q` (defaults to `q = 1`,
which is the Shannon entropy), with logarithms to the given `base`.
Both `x` and `y` can be vectors or (potentially multivariate) [`Dataset`](@ref)s.
## Binning based
mutualinfo(x, y, est::VisitationFrequency{RectangularBinning}; base = 2, q = 1)
Estimate ``I^{q}(x; y)`` using a visitation frequency estimator.
See also [`VisitationFrequency`](@ref), [`RectangularBinning`](@ref).
## Kernel density based
mutualinfo(x, y, est::NaiveKernel{Union{DirectDistance, TreeDistance}}; base = 2, q = 1)
Estimate ``I^{q}(x; y)`` using a naive kernel density estimator.
It is possible to use both direct evaluation of distances, and a tree-based approach.
Which approach is faster depends on the application.
See also [`NaiveKernel`](@ref), [`DirectDistance`](@ref), [`TreeDistance`](@ref).
## Nearest neighbor based
mutualinfo(x, y, est::KozachenkoLeonenko; base = 2)
mutualinfo(x, y, est::Kraskov; base = 2)
mutualinfo(x, y, est::Kraskov1; base = 2)
mutualinfo(x, y, est::Kraskov2; base = 2)
Estimate ``I^{1}(x; y)`` using a nearest neighbor based estimator. Choose between naive
estimation using the [`KozachenkoLeonenko`](@ref) or [`Kraskov`](@ref) entropy estimators,
or the improved [`Kraskov1`](@ref) and [`Kraskov2`](@ref) dedicated ``I`` estimators. The
latter estimators reduce bias compared to the naive estimators.
*Note: only Shannon entropy can be used with nearest neighbor estimators, so the
keyword `q` cannot be provided; it is hardcoded as `q = 1`*.
See also [`KozachenkoLeonenko`](@ref), [`Kraskov`](@ref), [`Kraskov1`](@ref),
[`Kraskov2`](@ref).
"""
function mutualinfo end
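Below is a hedged usage sketch of the interface documented above; the data and parameter values are made up, and the estimator constructors (in particular the default `Kraskov1()`) are assumed to exist with these signatures:

```julia
using TransferEntropy

x, y = rand(1000), rand(1000)   # hypothetical scalar time series

# Binning-based estimate of I^q(x; y): Shannon (q = 1), base-2 logarithms
mi_vf = mutualinfo(x, y, VisitationFrequency(RectangularBinning(4)); base = 2, q = 1)

# Nearest-neighbor based estimates; q is fixed to 1 for these estimators
mi_kl = mutualinfo(x, y, KozachenkoLeonenko(); base = 2)
mi_k1 = mutualinfo(x, y, Kraskov1(); base = 2)   # dedicated, bias-reduced MI estimator
```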

