Journal of Machine Learning Research, ISSN 1532-4435, 2019, Volume 20

We consider the problem of estimating a probability distribution that maximizes the entropy while satisfying a finite number of moment constraints, possibly...

Entropy maximization | approximate dynamic programming | MINIMIZATION | relative entropy minimization | convex optimization | fast gradient method | MOMENT-CLOSURE | EFFICIENT | AUTOMATION & CONTROL SYSTEMS | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE

Journal Article

Advances in Mathematics, ISSN 0001-8708, 10/2018, Volume 337, pp. 139 - 170

Inspired by Edward Witten's questions, we compute the mutual information associated with free fermions, and we deduce many results about entropies for chiral...

CFT | Relative entropy | MATHEMATICS | STATES | NETS | CONFORMAL FIELD-THEORY | REPRESENTATIONS | SUBFACTORS | OPERATOR-ALGEBRAS

Journal Article

Journal of Statistical Physics, ISSN 0022-4715, 4/2014, Volume 155, Issue 1, pp. 93 - 105

The maximum entropy formalism developed by Jaynes determines the relevant ensemble in nonequilibrium statistical mechanics by maximising the entropy functional...

Kullback–Leibler divergence | Relative entropy | Gibbs–Jaynes entropy | Physical Chemistry | Nonequilibrium statistical mechanics | Theoretical, Mathematical and Computational Physics | Quantum Physics | Maximum entropy formalism | Statistical Physics, Dynamical Systems and Complexity | Physics | Kullback-Leibler divergence | Gibbs-Jaynes entropy | PHYSICS, MATHEMATICAL

Journal Article

Asian-European Journal of Mathematics, ISSN 1793-5571, 06/2018

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 03/2015, Volume 61, Issue 3, pp. 1458 - 1473

We prove a lower bound on the relative entropy between two finite-dimensional states in terms of their entropy difference and the dimension of the underlying...

Thermodynamics | Reactive power | Upper bound | surprisal | Heating | relative entropy | channel capacity | heat capacity | Entropy | Probability distribution | entropy inequalities | Information theory | thermodynamics | QUANTUM | COMPUTER SCIENCE, INFORMATION SYSTEMS | ENGINEERING, ELECTRICAL & ELECTRONIC | Relative entropy | IRREVERSIBILITY | SYSTEMS | WORK | ASYMPTOTICS | Finite element method | Usage | Distribution (Probability theory) | Innovations

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 08/2010, Volume 56, Issue 8, pp. 3712 - 3720

A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the...

minimum mean-square error (MMSE) estimation | Estimation error | Divergence | Estimation theory | Probability | Entropy | Shannon theory | free probability | Gaussian noise | relative entropy | Random variables | Mutual information | Network address translation | Signal to noise ratio | Information theory | statistics | mutual information | GAUSSIAN CHANNELS | MEAN-SQUARE ERROR | ANALOGS | PERTURBATION | FREE PROBABILITY-THEORY | COMPUTER SCIENCE, INFORMATION SYSTEMS | POWER INEQUALITY | SIMPLE PROOF | ENGINEERING, ELECTRICAL & ELECTRONIC | FISHER INFORMATION MEASURE | Measurement | Entropy (Information theory) | Integrals | Noise | Gaussian | Representations | Estimators

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 01/2011, Volume 57, Issue 1, pp. 33 - 55

While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon's entropy...

Fisher information inequality (FII) | Symmetric matrices | Estimation | de Bruijn's identity | Data processing | Entropy | Covariance matrix | minimum mean-square error (MMSE) | Fisher information | entropy power inequality (EPI) | divergence | differential entropy | Data processing inequality | relative entropy | Markov processes | Random variables | mutual information | BINARY SEQUENCES | BLIND SOURCE SEPARATION | COMPUTER SCIENCE, INFORMATION SYSTEMS | BROADCAST CHANNELS | ENGINEERING, ELECTRICAL & ELECTRONIC | VECTOR GAUSSIAN CHANNELS | WIRE-TAP CHANNEL | CONVOLUTION INEQUALITY | FADING CHANNELS | CENTRAL-LIMIT-THEOREM | Measurement | Entropy (Information theory) | Dependent variables | Ingredients | Inequalities | Proving | Linear transformations | Gaussian | Concavity | Mathematics | Information Theory | Computer Science

Journal Article

Entropy, ISSN 1099-4300, 06/2017, Volume 19, Issue 6, p. 269

Information entropy and its extension, which are important generalizations of entropy, are currently applied to many research domains. In this paper, a novel...

Generalized relative entropy | Upper bound | Relative entropy | Distance metric | Adjusted distance | generalized relative entropy | distance metric | MAXIMUM-ENTROPY | ENERGY | CROSS-ENTROPY | PHYSICS, MULTIDISCIPLINARY | upper bound | MUTUAL-INFORMATION | DISTRIBUTIONS | adjusted distance | TUPLE NUCLEOTIDE COMPOSITION | GENOMES | relative entropy | COMPLEXITY | REGISTRATION

Journal Article

Statistical Science, ISSN 0883-4237, 2/2015, Volume 30, Issue 1, pp. 40 - 58

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon...

Differential and relative entropy/extropy | Bregman divergence | Duality | Gini index of heterogeneity | Kullback-Leibler divergence | Repeat rate | Proper scoring rules | proper scoring rules | PROBABILITY | INFORMATION | STATISTICS & PROBABILITY | repeat rate | duality | Kullback–Leibler divergence

Journal Article

Computer Methods in Applied Mechanics and Engineering, ISSN 0045-7825, 02/2015, Volume 284, pp. 712 - 731

In this paper, we devise cell-based maximum-entropy (max-ent) basis functions that are used in a Galerkin method for the solution of partial differential...

Smooth and nonnegative basis functions | Approximate distance function | R-functions | Relative entropy | Compact-support | Delaunay mesh | ARBITRARY PLANAR POLYGONS | MESHFREE METHOD | PART I | FINITE-ELEMENTS | ISOGEOMETRIC ANALYSIS | FORMULATION | CONVOLUTION SURFACES | MATHEMATICS, INTERDISCIPLINARY APPLICATIONS | MECHANICS | DISTANCE FIELDS | ENGINEERING, MULTIDISCIPLINARY | MOVING LEAST-SQUARES | CONSTRUCTION | Construction | Approximation | Basis functions | Mathematical analysis | Approximants | Poisson equation | Mathematical models | Boolean algebra | Galerkin methods | Galerkin, Mètodes de | Mètodes numèrics | Classificació AMS | 65 Numerical analysis | 65N Partial differential equations, boundary value problems | Anàlisi numèrica | Matemàtiques i estadística | Àrees temàtiques de la UPC

Journal Article

Journal of Mathematical Physics, ISSN 0022-2488, 12/2013, Volume 54, Issue 12, p. 122203

The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties....

LIEB | INEQUALITIES | RELATIVE ENTROPIES | PHYSICS, MATHEMATICAL | Data processing | Properties (attributes) | Entropy | Entropy (Information theory) | Information theory | Cases (containers) | DATA PROCESSING | INFORMATION THEORY | CLASSICAL AND QUANTUM MECHANICS, GENERAL PHYSICS | ENTROPY

Journal Article

Information Sciences, ISSN 0020-0255, 03/2014, Volume 260, pp. 74 - 97

Pattern recognition is a collection of computer techniques to classify various observations into different clusters of similar attributes in either supervised...

Relative entropy fuzzy c-means clustering | Fuzzy clustering | Relative entropy | Fuzzy c-means | BANKRUPTCY | ALGORITHM | COMPUTER SCIENCE, INFORMATION SYSTEMS | NOISE | VALIDITY | LAMBERT-W-FUNCTION

Journal Article

Journal of Physics A: Mathematical and Theoretical, ISSN 1751-8113, 11/2018, Volume 51, Issue 48, p. 484001

The existence of a positive log-Sobolev constant implies a bound on the mixing time of a quantum dissipative evolution under the Markov approximation. For...

log-Sobolev inequality | conditional relative entropy | quasi-factorization of the relative entropy | quantum relative entropy | mixing time | quantum dissipative evolution | STATES | INEQUALITIES | MARKOV-CHAINS | PHYSICS, MULTIDISCIPLINARY | MONOTONICITY | STRONG SUBADDITIVITY | PHYSICS, MATHEMATICAL | MUTUAL INFORMATION | RECOVERY MAPS

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 09/2015, Volume 61, Issue 9, pp. 5063 - 5080

Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative...

Redundancy | Tsallis entropy | Probability | power-law family | Minimization | Extraterrestrial measurements | Entropy | Kullback-Leibler divergence | linear family | Covariance matrices | Q measurement | information geometry | relative entropy | Best approximant | Pythagorean property | projection | exponential family | Renyi entropy | MAXIMUM-ENTROPY | COMPUTER SCIENCE, INFORMATION SYSTEMS | ENGINEERING, ELECTRICAL & ELECTRONIC | CONVERGENCE | RENYI ENTROPY | DIVERGENCE

Journal Article

Journal of Physics A: Mathematical and Theoretical, ISSN 1751-8113, 03/2018, Volume 51, Issue 15, p. 154003

Many quantum information measures can be written as an optimization of the quantum relative entropy between sets of states. For example, the relative entropy...

von Neumann entropy | convex optimization | entanglement measures | quantum capacity | quantum conditional mutual information | quantum relative entropy | CAPACITY | CHANNEL | PHYSICS, MULTIDISCIPLINARY | CONDITIONAL MUTUAL INFORMATION | PHYSICS, MATHEMATICAL | Quantum Physics | Physics

Journal Article

Reviews in Mathematical Physics, ISSN 0129-055X, 08/2019, Volume 31, Issue 7, p. 1950022

We consider a quantum quasi-relative entropy S_f^K for an operator K and an operator convex function f. We show how to obtain the error bounds for the...

quasi-entropy | STATES | INEQUALITIES | CONVEXITY | INFORMATION | Cauchy-Schwartz inequality | PROOF | Entropy | Wigner-Yanase-Dyson information | PHYSICS, MATHEMATICAL | TRACE FUNCTIONS | relative entropy | data processing inequality | strong subadditivity

Journal Article

Mathematical Programming, ISSN 0025-5610, 1/2017, Volume 161, Issue 1, pp. 1 - 32

In this expository article, we study optimization problems specified via linear and relative entropy inequalities. Such relative entropy programs (REPs) are...

Golden–Thompson inequality | Theoretical, Mathematical and Computational Physics | Mathematics | Dynamical systems | Optimization over non-commuting variables | Von-Neumann entropy | 94A15 | Mathematical Methods in Physics | Araki–Umegaki relative entropy | Robust optimization | 81P45 | Calculus of Variations and Optimal Control; Optimization | Mathematics of Computing | 90C25 | Numerical Analysis | Quantum channel capacity | 94A17 | Shannon entropy | Quantum information | Combinatorics | Matrix permanent | Araki-Umegaki relative entropy | Golden-Thompson inequality | STATISTICAL-MECHANICS | MATRIX | MATHEMATICS, APPLIED | INFORMATION | ALGORITHM | CONVEX-OPTIMIZATION | MAXIMIZATION | COMPUTER SCIENCE, SOFTWARE ENGINEERING | OPERATIONS RESEARCH & MANAGEMENT SCIENCE | QUANTUM CHANNELS | PERMANENTS | 2ND-ORDER CONE | MIXED VOLUMES | Electrical engineering | Atoms | Studies | Entropy | Quantum physics | Analysis | Optimization | Mathematical programming | Functions (mathematics) | Maximization | Mathematical analysis | Inequalities | Convexity

Journal Article

Pattern Recognition Letters, ISSN 0167-8655, 07/2019, Volume 125, pp. 677 - 683

By virtue of their simplicity and efficiency, hashing algorithms have achieved significant success on large-scale approximate nearest neighbor search....

Hashing | Image retrieval | Symmetric relative entropy | IMAGE | CODES | QUANTIZATION | RETRIEVAL | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Computer science | International marketing | Algorithms | Neural networks

Journal Article

Acta Astronautica, ISSN 0094-5765, 03/2018, Volume 144, pp. 271 - 282

Gaussian approximation filters have increasingly been developed to enhance the accuracy of attitude estimation in space missions. The effective employment of...

Cubature Kalman filter | Attitude estimation | Relative entropy | Confidence level | Innovation-based adaptive estimation | SIZE | ALGORITHMS | ENGINEERING, AEROSPACE | KALMAN FILTER

Journal Article