Overview
This package provides routines for computing information-theoretic measures. Its primary aim is to validate, prove, and analyze information inequalities, and it can be used both for rigorous computation and for the analysis of information measures and expressions.
A first overview of the package was given at BLA; the slides are available here.
Features
The package is still under active development, and things evolve quickly (or at least should):
- enclosure of the solution of interval linear systems
- exact characterization of the entropic space
- verified proofs with $\LaTeX$ rendering in display
- enclosure of the singular values of the entropic-space generator matrix
- further work
Installation
Open a Julia session and enter
using Pkg; Pkg.add("InformationInequalities")
this will download the package and all the necessary dependencies for you. Next you can import the package with
using InformationInequalities
and you are ready to go.
Quickstart
using InformationInequalities
using Plots
E="3I(X;Y|Z)+2H(X|Y,Z)"
A=LinearInformationExpressionToCanonical(E)
$-H(X,Y,Z) + 3 H(X,Z) + H(Y,Z) - 3 H(Z)$.
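The canonical form above follows from the standard identities that expand conditional measures into joint entropies:

```latex
% Identities: I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
%             H(X|Y,Z) = H(X,Y,Z) - H(Y,Z)
\begin{aligned}
3I(X;Y|Z) + 2H(X|Y,Z)
  &= 3\bigl[H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)\bigr]
     + 2\bigl[H(X,Y,Z) - H(Y,Z)\bigr] \\
  &= -H(X,Y,Z) + 3H(X,Z) + H(Y,Z) - 3H(Z).
\end{aligned}
```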
To plot an Information expression as a tree graph in Entropy co-ordinates,
using InformationInequalities
using Plots
E="3I(X;Y|Z)+2H(X|Y,Z)"
A=plotIE(E)
Another example TBD
Citation
If you use this package in your work, please cite it as
@software{nrethnakar2022,
  author = {Nivedita Rethnakar and
            Raymond W. Yeung and
            Suhas Diggavi},
  title = {InformationInequalities.jl: Exploring Information Theoretic Inequalities},
  month = {1},
  year = {2022},
  doi = {10.5282/zenodo.5363564},
  url = {https://github.com/nivupai/InformationInequalities.jl}
}
InformationInequalities
Documentation for InformationInequalities.
InformationInequalities.ConditionalEntropyList
InformationInequalities.ConditionalMutualInformationList
InformationInequalities.Elemental2Canonical
InformationInequalities.Elemental2Canonical_H
InformationInequalities.Elemental2Canonical_MI
InformationInequalities.ElementalMeasures
InformationInequalities.GeometryConeGamma2
InformationInequalities.LinearInformationExpressionToCanonical
InformationInequalities.elementsGamma2
InformationInequalities.entropic_matrix
InformationInequalities.entropic_terms
InformationInequalities.entropy_vector
InformationInequalities.find_entropic_vector
InformationInequalities.find_matrixG
InformationInequalities.find_subset
InformationInequalities.minimalE
InformationInequalities.minimal_EIM_list_canonical
InformationInequalities.numEIM
InformationInequalities.order_entropic
InformationInequalities.order_entropic1
InformationInequalities.order_entropic_expression
InformationInequalities.order_string
InformationInequalities.plotEntropyTree
InformationInequalities.plotIE
InformationInequalities.plotInformationExpression
InformationInequalities.simplify
InformationInequalities.simplifyH
InformationInequalities.unique_entropy_vector
InformationInequalities.ConditionalEntropyList
— Function
List all conditional entropy expressions for a given number n of random variables. Conditional entropies are of the form H(X,Y|Z).
julia> ConditionalEntropyList(2,"🎲")
["H(🎲1)" "H(🎲1|🎲2)" "H(🎲2)" "H(🎲2|🎲1)"]
julia> ConditionalEntropyList(2)
["H(X1)" "H(X1|X2)" "H(X2)" "H(X2|X1)"]
julia> ConditionalEntropyList(3,"dice🎲")
["H(dice🎲1)"
"H(dice🎲1|dice🎲2)"
"H(dice🎲1|dice🎲2,dice🎲3)"
"H(dice🎲1|dice🎲3)"
"H(dice🎲2)"
"H(dice🎲2|dice🎲1)"
"H(dice🎲2|dice🎲1,dice🎲3)"
"H(dice🎲2|dice🎲3)"
"H(dice🎲3)"
"H(dice🎲3|dice🎲1)"
"H(dice🎲3|dice🎲1,dice🎲2)"
"H(dice🎲3|dice🎲2)"]
julia> ConditionalEntropyList(3,"Z")
["H(Z1)"
"H(Z1|Z2)"
"H(Z1|Z2,Z3)"
"H(Z1|Z3)"
"H(Z2)"
"H(Z2|Z1)"
"H(Z2|Z1,Z3)"
"H(Z2|Z3)"
"H(Z3)"
"H(Z3|Z1)"
"H(Z3|Z1,Z2)"
"H(Z3|Z2)"]
InformationInequalities.ConditionalMutualInformationList
— Function
List all conditional mutual information expressions for a given number n of random variables. These are of the form I(X;Y|Z), i.e., the mutual information between X and Y given Z.
julia> ConditionalMutualInformationList(2,"🎲")
["I(🎲1;🎲2)" "I(🎲2;🎲1)"]
julia> ConditionalMutualInformationList(3)
["I(X1;X2)","I(X1;X2|X3)","I(X1;X3)","I(X1;X3|X2)","I(X2;X1)","I(X2;X1|X3)","I(X2;X3)","I(X2;X3|X1)","I(X3;X1)","I(X3;X1|X2)","I(X3;X2)","I(X3;X2|X1)"]
InformationInequalities.Elemental2Canonical
— Function
Elemental2Canonical(s::String="I(Xi;Xj|π)")
Convert an elemental information measure to canonical form. If s is unspecified, a default expression is used.
Examples
julia> Elemental2Canonical("I(Xi;Xj|π)")
"H(Xi,π)+H(Xj,π)-H(Xi,Xj,π)-H(π)"
julia> Elemental2Canonical("H(Xi,X2|π,β)")
"H(Xi,X2,π,β) - H(π,β)"
julia> Elemental2Canonical("I(Xi;X2,π,🎲|Zπ,β,𝒩)")
"H(Xi,Zπ,β,𝒩)+H(X2,π,🎲,Zπ,β,𝒩)-H(Xi,X2,π,🎲,Zπ,β,𝒩)-H(Zπ,β,𝒩)"
InformationInequalities.Elemental2Canonical_H
— Function
Elemental2Canonical_H(s::String="I(Xi;Xj|π)")
Convert an entropy or conditional-entropy expression s to canonical form. If s is unspecified, a default expression is used.
Examples
julia> Elemental2Canonical_H("H(Xi,Xj|π,γ)")
"H(Xi,Xj,π,γ)-H(π,γ)"
InformationInequalities.Elemental2Canonical_MI
— Function
Elemental2Canonical_MI(s::String="I(Xi;Xj|π)")
Convert a mutual information expression s to canonical form. If s is unspecified, a default expression is used.
Examples
julia> Elemental2Canonical_MI("I(Xi;Xj|π)")
"H(Xi,π)+H(Xj,π)-H(Xi,Xj,π)-H(π)"
InformationInequalities.ElementalMeasures
— Function
List the elemental information measures (EIM) for a given number n of random variables. The EIM comprise conditional entropies $H(X_1,...,X_n|Y_1,...,Y_m)$ and conditional mutual informations $I(X_1,...,X_k;Y_1,...,Y_l|Z_1,...,Z_n)$.
julia> ElementalMeasures(2)
["H(X1)","H(X1|X2)","H(X2)","H(X2|X1)","I(X1;X2)","I(X2;X1)"]
julia> ElementalMeasures(3,"🎲")
["H(🎲1)"
"H(🎲1|🎲2)"
"H(🎲1|🎲2,🎲3)"
"H(🎲1|🎲3)"
"H(🎲2)"
"H(🎲2|🎲1)"
"H(🎲2|🎲1,🎲3)"
"H(🎲2|🎲3)"
"H(🎲3)"
"H(🎲3|🎲1)"
"H(🎲3|🎲1,🎲2)"
"H(🎲3|🎲2)"
"I(🎲1;🎲2)"
"I(🎲1;🎲2|🎲3)"
"I(🎲1;🎲3)"
"I(🎲1;🎲3|🎲2)"
"I(🎲2;🎲1)"
"I(🎲2;🎲1|🎲3)"
"I(🎲2;🎲3)"
"I(🎲2;🎲3|🎲1)"
"I(🎲3;🎲1)"
"I(🎲3;🎲1|🎲2)"
"I(🎲3;🎲2)"
"I(🎲3;🎲2|🎲1)"]
InformationInequalities.GeometryConeGamma2
— Method
Γ₂ geometry. This is the simplest case, with two random variables (say X and Y) forming a geometry in three dimensions. The geometric space is spanned by the entropy vectors H(X), H(Y), and H(X,Y). Γ₂ is a 3D cone in the positive orthant. This function is used for visualizing the entropic space in 3D.
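In these coordinates the cone is cut out by the elemental inequalities (monotonicity and submodularity) for two variables:

```latex
% Gamma_2 in the coordinates (h_X, h_Y, h_{XY}):
h_{XY} \ge h_X \ge 0, \qquad
h_{XY} \ge h_Y \ge 0, \qquad
h_X + h_Y \ge h_{XY}.
```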
InformationInequalities.LinearInformationExpressionToCanonical
— Method
LinearInformationExpressionToCanonical(A)
julia> LinearInformationExpressionToCanonical("I(X;Y|Z)-2.3H(U,V)-2H(u)")
"1H(X,Z)+1H(Y,Z)-1H(X,Y,Z)-1H(Z)-2.3H(U,V)-2H(u)"
InformationInequalities.elementsGamma2
— Method
Set of discrete points in Γ₂ confined within a hypercube.
InformationInequalities.entropic_matrix
— Function
Find the entropic matrix G for a given n.
julia> entropic_matrix(3)
InformationInequalities.entropic_terms
— Method
E,UE,V,λ = entropic_terms(S::AbstractString)
Find the additive entropic terms in a canonical expression. This function is used internally in simplifyH(S::AbstractString).
Arguments
- S = Any linear information expression. These are linear combinations of $I(X_1,...,X_k;Y_1,...,Y_l|Z_1,...,Z_n)$ and $H(X_1,X_2,...,X_m|Z_1,...,Z_n)$ terms.
Output
- E = Constituent information measures in S
- UE = The distinct (unique) elements of E
- V = The individual signed elements of S
- λ = The scaling coefficients of the elements of E (i.e., each $V_i = λ_i E_i$)
Examples
julia> S="I(X;Y)-H(X,Y|Z)-3H(X,Y)+2I(X;Y)"
julia> E,UE,V,λ = entropic_terms(S)
["I(X;Y)" "H(X,Y|Z)" "H(X,Y)" "I(X;Y)"],
["I(X;Y)" "H(X,Y|Z)" "H(X,Y)"],
["1I(X;Y)" "-1H(X,Y|Z)" "-3H(X,Y)" "2I(X;Y)"],
[1.0 -1.0 -3.0 2.0]
InformationInequalities.entropy_vector
— Function
For a given number n of random variables, list all the elemental information measures and their corresponding entropic decompositions. The entropic vectors are identified with the prefix h and follow a lexicographic mapping, e.g., H(X1,X3,X7) = h137. Note that, for now, this lexicographic mapping works only for n < 10.
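The lexicographic mapping can be illustrated with a hypothetical helper (not part of the package API): sort the single-digit indices and prepend h.

```julia
# Hypothetical illustration of the h-prefix labels described above:
# H over a set of single-digit indices maps to "h" plus the sorted digits,
# so H(X1,X3,X7) -> "h137". Only valid for n < 10, as noted.
h_label(indices) = "h" * join(sort(collect(indices)))

h_label([1, 3, 7])  # "h137"
h_label([3, 1])     # "h13"
```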
InformationInequalities.find_entropic_vector
— Function
Given a linear expression in canonical form, find its coordinates in the entropic space Γ.
InformationInequalities.find_matrixG
— Function
Find the entropic matrix G for a given n.
julia> find_matrixG(3)
InformationInequalities.find_subset
— Function
find_subset(n::Int64,p,q,RV::AbstractString="X")
Given i and j, compute all subsets of 𝒩 \ {i,j}, i.e., all non-exclusive subsets. i and j can also be empty (i.e., []), in which case the non-empty subsets are listed. An optional prefix can be added (the default is X).
Examples
julia>find_subset(4,1,3,"X")
["" "X1" "X1,X2" "X1,X2,X5" "X1,X5" "X2" "X2,X5" "X5"]
julia>find_subset(4,1,3,"")
["" "1" "1,2" "1,2,5" "1,5" "2" "2,5" "5"]
julia> find_subset(5,4,3,"🎲")
["" "🎲3" "🎲3,🎲4" "🎲4"]
julia> find_subset(5,[],[],"🎲")
["🎲1" "🎲1,🎲2" "🎲1,🎲2,🎲3" "🎲1,🎲3" "🎲2" "🎲2,🎲3" "🎲3"]
InformationInequalities.minimalE
— Method
C = minimalE(V,U,λ)
This function is used internally in simplifyH(S::AbstractString).
Arguments
- $E$ = Constituent information measures in $S$
- $U$ = The distinct (unique) elements of $E$
- $V$ = The individual signed elements of $S$
- $λ$ = The scaling coefficients of the elements of $E$ (i.e., each $V_i = λ_i E_i$)
Output
- C = Simplified Information Expression as a linear combination of $U$
Examples
julia> S="7H(X,Y|Z1,Z2)+2H(X,Y)-4H(X,Y)+H(Z)-3H(X,Y|Z1,Z2)"
julia> E,U,V,λ = entropic_terms(S)
julia> Z = minimalE(V,U,λ)
"+4.0H(X,Y|Z1,Z2)-2.0H(X,Y)+H(Z)"
InformationInequalities.minimal_EIM_list_canonical
— Function
For a given number n of random variables, list all the elemental information measures in minimal canonical form.
julia> u,v,m,n=minimal_EIM_list_canonical(2)
InformationInequalities.numEIM
— Function
For a given number n of random variables, return the maximum number of elemental information measures in minimal canonical form.
julia> u,v,m,n=numEIM(2)
InformationInequalities.order_entropic
— Function
Each entropy word in an entropic vector is sorted.
Examples
julia> order_entropic("h24-h32-h132-h2")
"h24 - h23 - h123 - h2"
InformationInequalities.order_entropic1
— Function
Each entropy word in an entropic vector is sorted.
Examples
julia> order_entropic1("h24-h32-h132-h2")
"h24 - h23 - h123 - h2"
julia> order_entropic1("7h32 - h243 - h13701 - h92252")
"7h23-h234-h01137-h22259"
InformationInequalities.order_entropic_expression
— Function
Express an entropic expression in the lexicographic order of entropic vectors.
julia> order_entropic_expression("h12-h32-123")
InformationInequalities.order_string
— Function
Each word in a sentence (string) is sorted alphabetically.
Examples
julia> order_string("This is a sorted sentence; Who is 1 and two ")
" This is a deorst ;ceeennst Who is 1 adn otw "
InformationInequalities.plotEntropyTree
— Method
plotEntropyTree(S,...)
julia> plotEntropyTree("H(X,Y)+1.2 H(X)+7H(X1,X2,X3)")
julia> plotEntropyTree("H(X,Y)+1H(X1,X2)+3H(X1,X2,X3)",curves=false,nodecolor=:gold,edgecolor=:gray,nodeshape=:rect,nodesize=0.15)
InformationInequalities.plotIE
— Method
plotIE(S,...)
julia> plotIE("H(X,Y)+7H(X1,X2,X3)")
julia> plotIE("I(X;Y)+2I(X;Y|Z)",curves=false)
InformationInequalities.plotInformationExpression
— Method
plotInformationExpression(S,...)
julia> plotInformationExpression("H(X,Y)+7H(X1,X2,X3)")
julia> plotInformationExpression("I(X;Y)+2I(X;Y|Z)",curves=false)
InformationInequalities.simplify
— Method
Express an information expression in its simplest form (algebraically).
Examples
julia> simplify("I(X;Y)-H(X,Y|Z)-3H(X,Y)+2I(X;Y)")
"+3.0I(X;Y)-H(X,Y|Z)-3.0H(X,Y)"
julia> simplify("3I(X;Y|Z)-H(X,Y|Z)-3I(X;Y|Z)-3H(X,Y|Z)+H(X1,X2)")
"-4.0H(X,Y|Z)+H(X1,X2)"
InformationInequalities.simplifyH
— Method
Express an information expression in its simplest form (algebraically). This is used in conjunction with the canonical expression.
Examples
julia>
InformationInequalities.unique_entropy_vector
— Function
For a given number n of random variables, list all the unique elemental information measures and their corresponding entropic decompositions. The entropic vectors are identified with the prefix h and follow a lexicographic mapping, e.g., H(X1,X3,X7) = h137. Note that, for now, this lexicographic mapping works only for n < 10.