• LeZi-update

  • Referenced in 18 articles [sw01541]
  • information-theoretic framework. Shannon’s entropy measure is identified as a basis for comparing user...
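
Several of the packages in this list build on the Shannon entropy of an empirical symbol distribution. A minimal sketch of that base quantity (the function name is illustrative, not part of LeZi-update or any listed package):

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy, in bits, of the empirical distribution of a symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    # H = -sum p(s) * log2 p(s) over observed symbols
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

A balanced two-symbol sequence yields 1 bit; a constant sequence yields 0.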
  • JIDT

  • Referenced in 9 articles [sw23550]
  • JIDT includes implementations: principally for the measures transfer entropy, mutual information, and their conditional variants...
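
Transfer entropy, the central measure in JIDT (and in RTransferEntropy and IDTxl below), is the conditional mutual information I(X_{t+1}; Y_t | X_t). A plug-in sketch for discrete series with history length 1 — JIDT itself is a Java toolkit with many more estimators; this is only to fix ideas:

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in transfer entropy from source to target, in bits, with
    history length 1: I(X_{t+1}; Y_t | X_t) under the empirical distribution."""
    x1_x0_y0 = Counter(zip(target[1:], target[:-1], source[:-1]))
    x0_y0 = Counter(zip(target[:-1], source[:-1]))
    x1_x0 = Counter(zip(target[1:], target[:-1]))
    x0 = Counter(target[:-1])
    n = len(target) - 1
    te = 0.0
    for (x1, past, y0), c in x1_x0_y0.items():
        p_joint = c / n
        p_next_given_both = c / x0_y0[(past, y0)]
        p_next_given_past = x1_x0[(x1, past)] / x0[past]
        te += p_joint * log2(p_next_given_both / p_next_given_past)
    return te
```

When the target simply copies the source with a one-step lag, the estimate is large; a constant source contributes exactly zero.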
  • RTransferEntropy

  • Referenced in 3 articles [sw29816]
  • Series with Shannon and Renyi Transfer Entropy. Measuring information flow between time series with Shannon...
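
RTransferEntropy generalizes from Shannon to Rényi transfer entropy. The underlying Rényi entropy of order α is a one-liner on a probability vector (a sketch, not the package's R API):

```python
from math import log2

def renyi_entropy(probs, alpha):
    """Renyi entropy of order alpha, in bits; alpha -> 1 recovers Shannon entropy."""
    if alpha == 1:
        return -sum(p * log2(p) for p in probs if p > 0)
    return log2(sum(p ** alpha for p in probs)) / (1 - alpha)
```

It is non-increasing in α, and all orders coincide on a uniform distribution.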
  • Rnaz

  • Referenced in 3 articles [sw17118]
  • data and the usage of an entropy measure to represent sequence similarities. RNAz 2.0 shows...
  • SITS

  • Referenced in 2 articles [sw18739]
  • flow is also quantified through the entropy measure...
  • kappalab

  • Referenced in 49 articles [sw06086]
  • capacity (or non-additive measure, fuzzy measure) and integral manipulation on a finite setting ... method based on linear programming, a maximum-entropy-like method based on variance minimization...
  • tseriesEntropy

  • Referenced in 1 article [sw26028]
  • Tests for Time Series. Implements an Entropy measure of dependence based on the Bhattacharya-Hellinger...
  • acss

  • Referenced in 6 articles [sw10997]
  • traditional (but problematic) measures of complexity are also provided: entropy and change complexity...
  • ITE

  • Referenced in 5 articles [sw12811]
  • many different variants of entropy, mutual information, divergence, association measures, cross quantities, and kernels...
  • Inform

  • Referenced in 1 article [sw35834]
  • data. This includes classical information-theoretic measures (e.g. entropy, mutual information) and measures of information ... dynamics (e.g. active information storage, transfer entropy), but also several less common, yet powerful information ... effective information, information flow and integration measures. However, what makes Inform unique is that...
  • CDNA

  • Referenced in 9 articles [sw37103]
  • Significantly lower entropy estimates for natural DNA sequences. Its purpose is to measure the “predictability/compressibility ... Expectation Maximization). Our focus in Significantly lower entropy estimates for natural DNA sequences...
  • infotheory

  • Referenced in 2 articles [sw29391]
  • theory. It implements widely used measures such as entropy and mutual information, as well...
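
Mutual information, the other staple measure named for infotheory, can be sketched as a plug-in estimate over paired discrete samples (illustrative Python, not the package's API):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in mutual information I(X;Y), in bits, from paired discrete samples."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    # I = sum p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())
```

Identical balanced binary samples give 1 bit; independent samples give 0.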
  • TSEntropies

  • Referenced in 2 articles [sw34014]
  • Pincus in “Approximate entropy as a measure of system complexity”, Proceedings of the National Academy ... America, 88, 2297-2301 (March 1991). Sample entropy was proposed by J. S. Richman...
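
Approximate entropy (Pincus) and sample entropy (Richman and Moorman), which TSEntropies implements, both compare template matches at lengths m and m+1; sample entropy excludes self-matches. A direct quadratic-time sketch (parameter defaults are illustrative):

```python
from math import log

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B and A count template-pair matches of
    length m and m+1 within Chebyshev tolerance r, excluding self-matches."""
    def count_matches(length):
        templates = [series[i:i + length] for i in range(len(series) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b = count_matches(m)
    a = count_matches(m + 1)
    return -log(a / b) if a > 0 and b > 0 else float("inf")
```

A perfectly regular (constant) series scores low; a series with no repeating templates at tolerance r gets infinity.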
  • entropart

  • Referenced in 1 article [sw24140]
  • package entropart: Entropy Partitioning to Measure Diversity. Measurement and partitioning of diversity, based on Tsallis...
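
entropart's diversity partitioning rests on Tsallis (q-)entropy, which interpolates between richness-type (q=0), Shannon (q→1), and Simpson-type (q=2) measures. The quantity itself, sketched on a probability vector of positive species frequencies:

```python
from math import log

def tsallis_entropy(probs, q):
    """Tsallis entropy of order q on a probability vector.

    q=0 gives richness minus 1, q -> 1 recovers Shannon entropy (in nats),
    and q=2 gives the Gini-Simpson index."""
    if q == 1:
        return -sum(p * log(p) for p in probs if p > 0)
    return (1 - sum(p ** q for p in probs)) / (q - 1)
```

For four equally frequent species: 3 at q=0, ln 4 at q=1, and 0.75 at q=2.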
  • IDTxl

  • Referenced in 4 articles [sw25603]
  • estimate the following measures: 1) For network inference: multivariate transfer entropy (TE)/Granger causality...
  • FSMRDE

  • Referenced in 7 articles [sw22250]
  • relative decision entropy-based feature selection approach. Rough set theory has been proven ... algorithm (called FSMRDE) in rough sets. To measure the significance of features in FSMRDE ... propose a new model of relative decision entropy, which is an extension of Shannon...
  • MedianOfNinthers

  • Referenced in 2 articles [sw32520]
  • numbers, typical low-entropy artificial datasets, and real-world data. Measurements are open-sourced alongside...
  • copent

  • Referenced in 1 article [sw33918]
  • method. Copula Entropy is a mathematical concept for multivariate statistical independence measuring and testing...
  • ForeCA

  • Referenced in 1 article [sw25728]
  • forecastable” signal. The measure of forecastability is based on the Shannon entropy of the spectral...
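
ForeCA's forecastability measure is one minus the normalized Shannon entropy of the spectral density: a flat (white-noise) spectrum scores near 0, a concentrated (sinusoidal) spectrum near 1. A discrete sketch using a naive DFT periodogram — illustrative only; ForeCA itself is an R package with more careful spectral estimation:

```python
import cmath
from math import log

def forecastability(series):
    """Spectral-entropy forecastability in [0, 1]: 1 minus the Shannon entropy
    of the normalized periodogram, scaled by its maximum. Naive O(n^2) DFT
    (use numpy.fft in practice); DC bin excluded."""
    n = len(series)
    power = []
    for k in range(1, n // 2 + 1):
        coeff = sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(series))
        power.append(abs(coeff) ** 2)
    total = sum(power)
    probs = [p / total for p in power]
    h = -sum(p * log(p) for p in probs if p > 0)
    return 1 - h / log(len(probs))
```

A pure sinusoid concentrates all power in one frequency bin and scores near 1.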
  • smcUtils

  • Referenced in 1 article [sw14629]
  • branching), measures of weight uniformity (coefficient of variation, effective sample size, and entropy...
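
The weight-uniformity diagnostics named in the smcUtils entry are standard in sequential Monte Carlo and easy to state directly. A sketch of the three quantities on a vector of particle weights (definitions as commonly used in the SMC literature; not smcUtils's R API):

```python
from math import log2

def weight_diagnostics(weights):
    """Weight-uniformity diagnostics for SMC: coefficient of variation,
    effective sample size, and entropy (bits) of the normalized weights."""
    n = len(weights)
    total = sum(weights)
    w = [x / total for x in weights]                       # normalize to sum 1
    mean = 1.0 / n
    cv = (sum((x - mean) ** 2 for x in w) / n) ** 0.5 / mean
    ess = 1.0 / sum(x * x for x in w)                      # in [1, n]
    ent = -sum(x * log2(x) for x in w if x > 0)            # in [0, log2 n]
    return cv, ess, ent
```

Uniform weights give cv = 0, ess = n, and maximal entropy; a single nonzero weight gives ess = 1 and entropy 0, signaling degeneracy.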