jhplot.stat
Class Entropy
java.lang.Object
    jhplot.stat.Entropy
public abstract class Entropy extends java.lang.Object
Implements common discrete Shannon entropy functions: the univariate entropy H(X), the conditional entropy H(X|Y), and the joint entropy H(X,Y). The logarithm defaults to base 2, so entropies are reported in bits.
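A minimal usage sketch (the wrapper class name and the sample data are illustrative, not part of the library): since the default base is 2, a sample spread evenly over four integer states should yield an estimate close to 2 bits, assuming the histogram estimator reproduces the empirical frequencies of small integer inputs.

    import jhplot.stat.Entropy;

    public class EntropyBitsExample {
        public static void main(String[] args) {
            // Four equally frequent integer states {0, 1, 2, 3}.
            // With the default base-2 logarithm the estimate should be close to 2 bits.
            double[] x = {0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3};
            System.out.println("H(X) = " + Entropy.calculateEntropy(x) + " bits");
        }
    }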
Field Summary

Modifier and Type    Field and Description
static double        LOG_BASE
Method Summary

Modifier and Type    Method and Description
static double        calculateConditionalEntropy(double[] dataVector, double[] conditionVector)
                     Calculates the conditional entropy H(X|Y) from two vectors.
static double        calculateEntropy(double[] dataVector)
                     Calculates the univariate entropy H(X) from a vector.
static double        calculateJointEntropy(double[] firstVector, double[] secondVector)
                     Calculates the joint entropy H(X,Y) from two vectors.
Method Detail
calculateEntropy
public static double calculateEntropy(double[] dataVector)
Calculates the univariate entropy H(X) from a vector. Uses histograms to estimate the probability distribution, and thus the entropy. The entropy is bounded by 0 ≤ H(X) ≤ log |X|, where log |X| is the log of the number of states of the random variable X.

Parameters:
    dataVector - Input vector (X). It is discretised to the floor of each value before calculation.
Returns:
    The entropy H(X).
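A short sketch of the discretisation behaviour described above (the data values and the wrapper class are illustrative): non-integer inputs are floored before the histogram is built, so 1.2 and 1.9 count as the same state.

    import jhplot.stat.Entropy;

    public class UnivariateEntropyExample {
        public static void main(String[] args) {
            // After flooring, {1.2, 1.9, 1.5} collapse to state 1 and
            // {2.5, 2.01, 2.7} collapse to state 2: two states in total.
            double[] x = {1.2, 1.9, 2.5, 2.01, 1.5, 2.7};
            double h = Entropy.calculateEntropy(x);
            // Upper bound for two observed states: log2(2) = 1 bit.
            System.out.println("H(X) = " + h + " (at most 1 bit here)");
        }
    }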
calculateConditionalEntropy
public static double calculateConditionalEntropy(double[] dataVector, double[] conditionVector)
Calculates the conditional entropy H(X|Y) from two vectors, where X = dataVector and Y = conditionVector. Uses histograms to estimate the probability distributions, and thus the entropy. The conditional entropy is bounded by 0 ≤ H(X|Y) ≤ H(X).

Parameters:
    dataVector - Input vector (X). It is discretised to the floor of each value before calculation.
    conditionVector - Input vector (Y). It is discretised to the floor of each value before calculation.
Returns:
    The conditional entropy H(X|Y).
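A sketch of the bound stated above (data and class name are illustrative): conditioning on a second variable can only reduce or preserve the entropy, and conditioning X on itself should drive the estimate to zero.

    import jhplot.stat.Entropy;

    public class ConditionalEntropyExample {
        public static void main(String[] args) {
            double[] x = {0, 1, 0, 1, 2, 2, 0, 1};   // data vector X
            double[] y = {0, 0, 0, 0, 1, 1, 0, 0};   // condition vector Y

            double hX  = Entropy.calculateEntropy(x);
            double hXY = Entropy.calculateConditionalEntropy(x, y);

            // 0 <= H(X|Y) <= H(X)
            System.out.println("H(X)   = " + hX);
            System.out.println("H(X|Y) = " + hXY);

            // Conditioning a variable on itself leaves no remaining uncertainty.
            System.out.println("H(X|X) = " + Entropy.calculateConditionalEntropy(x, x));
        }
    }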
calculateJointEntropy
public static double calculateJointEntropy(double[] firstVector, double[] secondVector)
Calculates the joint entropy H(X,Y) from two vectors. The order of the input vectors is irrelevant. Uses histograms to estimate the probability distributions, and thus the entropy. The joint entropy is bounded by 0 ≤ H(X,Y) ≤ log |XY|, where log |XY| is the log of the number of states of the joint random variable XY.

Parameters:
    firstVector - Input vector. It is discretised to the floor of each value before calculation.
    secondVector - Input vector. It is discretised to the floor of each value before calculation.
Returns:
    The joint entropy H(X,Y).
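A sketch that cross-checks the joint entropy against the other two methods via the chain rule H(X,Y) = H(Y) + H(X|Y). The identity holds for the true quantities; with a shared histogram discretisation the estimates are expected to agree as well. Data and class name are illustrative.

    import jhplot.stat.Entropy;

    public class JointEntropyExample {
        public static void main(String[] args) {
            double[] x = {0, 1, 1, 2, 0, 2, 1, 0};
            double[] y = {1, 1, 0, 0, 1, 0, 1, 1};

            double hXY = Entropy.calculateJointEntropy(x, y);
            // The argument order is irrelevant: H(X,Y) = H(Y,X).
            double hYX = Entropy.calculateJointEntropy(y, x);
            // Chain rule: H(X,Y) = H(Y) + H(X|Y).
            double chain = Entropy.calculateEntropy(y)
                         + Entropy.calculateConditionalEntropy(x, y);

            System.out.println("H(X,Y)        = " + hXY);
            System.out.println("H(Y,X)        = " + hYX);
            System.out.println("H(Y) + H(X|Y) = " + chain);
        }
    }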
DMelt 3.0 © DataMelt by jWork.ORG