1995, ISBN 047109756X, xii, 230

Book

1998, Water science and technology library, ISBN 9780792352242, Volume 30, xv, 365

Book

Journal of the Royal Statistical Society. Series B (Statistical Methodology), ISSN 1369-7412, 6/2013, Volume 75, Issue 3, pp. 427 - 450

Estimation of high dimensional covariance matrices is known to be a difficult problem, has many applications and is of current interest to the larger...
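
The conditioning problem this entry alludes to can be seen with a toy linear-shrinkage estimator. This is a generic regularizer, not the paper's condition-number-constrained method:

```python
import numpy as np

def shrink_covariance(X, alpha=0.2):
    """Linear shrinkage of the sample covariance toward a scaled identity.

    Pulling eigenvalues toward their mean keeps the estimate well
    conditioned when the dimension p exceeds the sample size n.
    """
    S = np.cov(X, rowvar=False)          # p x p sample covariance
    mu = np.trace(S) / S.shape[0]        # mean eigenvalue
    return (1 - alpha) * S + alpha * mu * np.eye(S.shape[0])

# With n = 20 samples in p = 50 dimensions the sample covariance is
# singular, while the shrunk estimate is positive definite and invertible.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))
Sigma_hat = shrink_covariance(X)
```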

Maximum likelihood estimation | Investment risk | Covariance | Spectral theory | Eigenvalues | Entropy | Covariance matrices | Standard and Poors 500 Index | Estimators | Truncation | Portfolio optimization | Convex optimization | Condition number | Eigenvalue | Cross‐validation | Shrinkage | Regularization | Risk comparisons | Covariance estimation | Cross-validation | STATISTICS & PROBABILITY | GRAPHS | WISHART DISTRIBUTIONS | MODELS | MATRICES | LASSO | SELECTION | LIKELIHOOD | Investment analysis | Analysis | Studies | Regularization methods | Estimating techniques | Mathematical analysis | Statistics | Decision theory | Samples | Statistical methods | Conditioning | Covariance matrix | covariance estimation | risk comparisons | regularization | portfolio optimization | convex optimization | cross-validation | shrinkage | eigenvalue | condition number


Journal Article

Journal of the American Statistical Association, ISSN 0162-1459, 03/2007, Volume 102, Issue 477, pp. 359 - 378

Scoring rules assess the quality of probabilistic forecasts, by assigning a numerical score based on the predictive distribution and on the event or value that...
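
Two standard strictly proper scoring rules from this literature, the Brier score and the logarithmic score, fit in a few lines; a minimal sketch for binary forecasts:

```python
import math

def brier_score(p, outcome):
    """Brier score for a binary probability forecast p of an event
    (outcome is 1 if it occurred, 0 otherwise); lower is better."""
    return (p - outcome) ** 2

def log_score(p, outcome):
    """Logarithmic score, negatively oriented: minus the log probability
    the forecast assigned to the realized outcome."""
    return -math.log(p if outcome == 1 else 1.0 - p)

# Both rules are strictly proper: expected score is minimized only by
# reporting the true probability. If it rains, a 90% forecast beats a
# hedged 60% forecast under either rule.
assert brier_score(0.9, 1) < brier_score(0.6, 1)
assert log_score(0.9, 1) < log_score(0.6, 1)
```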

Continuous ranked probability score | Strictly proper | Predictive distribution | Utility function | Bayes factor | Entropy | Skill score | Brier score | Prediction interval | Minimum contrast estimation | Loss function | Scoring rule | Bregman divergence | Cross-validation | Quantile forecast | Kernel score | Negative definite function | Coherent | Weather | Maximum likelihood estimation | Forecasting standards | Review Article | Weather forecasting | Statistical weather forecasting | Probability forecasts | Interval estimators | Probabilities | Forecasting models | kernel score | coherent | PRECIPITATION FORECASTS | loss function | entropy | predictive distribution | MODEL SELECTION | quantile forecast | INTERVAL ESTIMATION | minimum contrast estimation | REGRESSION | INFORMATION | ENSEMBLES | STATISTICS & PROBABILITY | cross-validation | utility function | GENERAL-METHOD | PROBABILITY-DISTRIBUTIONS | prediction interval | brier score | negative definite function | strictly proper | scoring rule | SYSTEMS | continuous ranked probability score | skill score | VERIFICATION | Bayesian statistical decision theory | Prediction theory | Usage | Analysis | Probability | Statistical analysis | Estimating techniques | Theory | Predictions

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 05/2015, Volume 61, Issue 5, pp. 2835 - 2885

We propose a general methodology for the construction and analysis of essentially minimax estimators for a wide class of functionals of finite dimensional...
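
For context, the plug-in (maximum likelihood) entropy estimator that such minimax constructions improve upon, together with the classical Miller-Madow bias correction, can be stated in a few lines. This is background only, not the paper's polynomial-approximation estimator:

```python
import math
from collections import Counter

def entropy_mle(samples):
    """Plug-in (maximum likelihood) entropy estimate, in nats."""
    n = len(samples)
    return -sum(c / n * math.log(c / n) for c in Counter(samples).values())

def entropy_miller_madow(samples):
    """Miller-Madow bias-corrected entropy: add (S - 1) / (2n), where S is
    the number of observed symbols. A classical baseline; minimax-optimal
    estimators in the large-alphabet regime instead use polynomial
    approximation of x log x to reduce the bias further."""
    S = len(Counter(samples))
    return entropy_mle(samples) + (S - 1) / (2 * len(samples))
```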

Maximum likelihood estimation | approximation theory | entropy estimation | minimax-optimality | Entropy | Complexity theory | Approximation methods | high dimensional statistics | minimax lower bound | maximum likelihood estimator | nonsmooth functional estimation | polynomial approximation | Chow-Liu algorithm | Polynomials | Mutual information | Mean squared error | Renyi entropy | Rényi entropy | MAXIMUM-LIKELIHOOD | COMPUTER SCIENCE, INFORMATION SYSTEMS | UNSEEN | COMPRESSION | INFERENCE | ADAPTIVE ESTIMATION | ENGINEERING, ELECTRICAL & ELECTRONIC | GEOMETRIZING RATES | PROBABILITY-DISTRIBUTIONS | NONPARAMETRIC-ESTIMATION | LINEAR FUNCTIONALS

Journal Article

1998, CWI tract, ISBN 9061964830, Volume 125, 120

Book

IEEE Transactions on Information Theory, ISSN 0018-9448, 10/2013, Volume 59, Issue 10, pp. 6220 - 6242

Four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes are proposed, based on universal...

Context | Causal influence | directed information | Q measurement | context-tree weighting (CTW) | universal probability assignment | Estimation | rate of convergence | Educational institutions | Entropy | Information rates | Convergence | CAPACITY | DATA-COMPRESSION | THEOREM | COMPUTER SCIENCE, INFORMATION SYSTEMS | PREDICTION | ENGINEERING, ELECTRICAL & ELECTRONIC | TREE WEIGHTING METHOD | CHANNELS | FEEDBACK | ENTROPY | Convergence (Mathematics) | Research | Analysis | Entropy (Information theory) | Estimation theory | Probability distribution | Algorithms | Estimating techniques | Information theory

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 09/2019, Volume 65, Issue 9, pp. 5323 - 5338

We study a distributed estimation problem in which two remotely located parties, Alice and Bob, observe an unlimited number of i.i.d. samples corresponding to...

multiterminal estimation | Reactive power | Correlation | Estimation | decentralized estimation | Encoding | Entropy | Distributed estimation | correlation estimation | Indexes | Testing | PART I | COMPUTER SCIENCE, INFORMATION SYSTEMS | COMPRESSION | WIRELESS SENSOR NETWORKS | ENGINEERING, ELECTRICAL & ELECTRONIC

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 03/2019, Volume 65, Issue 3, pp. 1512 - 1534

We investigate the problem of estimating a random variable Y under a privacy...

Data privacy | Correlation | Estimation | privacy-utility tradeoff | Gaussian additive privacy mechanism | guessing probability | Electronic mail | minimum mean-squared error | Optimization | Privacy | Perturbation methods | Rényi’s entropy | Random variables | maximal correlation | Rényi's entropy | CONNECTION | CORRELATION-COEFFICIENT | INFORMATION | Renyi's entropy | COMPUTER SCIENCE, INFORMATION SYSTEMS | ENGINEERING, ELECTRICAL & ELECTRONIC | Lower bounds | Constraints | Tradeoffs | Mathematical functions | Continuity (mathematics)

Journal Article

IEEE Transactions on Image Processing, ISSN 1057-7149, 07/2017, Volume 26, Issue 7, pp. 3087 - 3097

Age estimation based on the human face remains a significant problem in computer vision and pattern recognition. In order to estimate an accurate age or age...

Neural networks | age estimation | Estimation | Machine learning | Age difference | K-L divergence distance | Aging | Feature extraction | Entropy | convolutional neural networks | Face | Age estimation | Convolutional neural networks | APPEARANCE | age difference | FACE | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | ENGINEERING, ELECTRICAL & ELECTRONIC | Usage | Research | Entropy (Information theory) | Index Medicus

Journal Article

1996, Series in financial economics and quantitative analysis, ISBN 9780471953111, xvi, 307

Book

Neural Computation, ISSN 1530-888X, 12/2016, Volume 28, Issue 12, pp. 2687 - 2725

This study considers the common situation in data analysis when there are few observations of the distribution of interest or the target distribution, while...

Letters | MODELS | DIVERGENCE ESTIMATION | NEUROSCIENCES | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | ALGORITHM | GEOMETRY | Data analysis | Parameter estimation | Probability distribution | Electroencephalography | Algorithms | Approximations | Learning | Approximation | Data sets | Data processing | Mathematical models | Subspaces | Maximum entropy

Journal Article

1984, Series on econometrics and management sciences, ISBN 9780088109857, Volume 1, xv, 246

Book

The Annals of Statistics, ISSN 0090-5364, 4/2011, Volume 39, Issue 2, pp. 887 - 930

Suppose that we observe entries or, more generally, linear combinations of entries of an unknown m × T -matrix A corrupted by noise. We are particularly...
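
A standard computational building block in this penalized least-squares setting is singular-value soft-thresholding, the proximal operator of the nuclear (Schatten-1) norm. A toy sketch, not the paper's full estimator:

```python
import numpy as np

def svd_soft_threshold(A, lam):
    """Soft-threshold the singular values of A: the closed-form solution of
    min_X 0.5 * ||X - A||_F^2 + lam * ||X||_*, i.e. the proximal operator
    of the nuclear norm used in low-rank matrix estimation."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

# Rank-one signal observed in noise: thresholding above the noise level
# of the spurious singular values recovers a rank-one estimate.
rng = np.random.default_rng(0)
A = np.ones((20, 20)) + 0.1 * rng.standard_normal((20, 20))
X_hat = svd_soft_threshold(A, lam=2.0)
```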

Integers | Minimax | Sample size | Analytical estimating | Matrices | Entropy | Random variables | Regression analysis | Covariance matrices | Estimators | Sparse recovery | Empirical process | Quasi-convex Schatten class embeddings | Schatten norm | Penalized least-squares estimator | High-dimensional low-rank matrices | REGRESSION | penalized least-squares estimator | empirical process | sparse recovery | STATISTICS & PROBABILITY | TRACE-NORM | NORM MINIMIZATION | CONSISTENCY | LASSO | quasi-convex Schatten class embeddings | SELECTION | AGGREGATION | ENTROPY | Studies | Estimates | Matrix | Probability | Mathematics | 62G05 | 62F10

Journal Article

2009, Foundations and trends in communications and information theory, ISBN 9781601982308, Volume 5, Issue 3 (2008), [x], 93

Book

IEEE Transactions on Information Theory, ISSN 0018-9448, 08/2010, Volume 56, Issue 8, pp. 3712 - 3720

A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the...
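
The mismatch penalty is easy to see numerically: estimate a binary input with the MMSE rule for the true prior versus the linear rule an (incorrect) Gaussian prior would give. A Monte Carlo sketch under these assumed distributions:

```python
import numpy as np

# True prior P: X = +/-1 equiprobable, observed as Y = X + N with
# N ~ N(0, 1/snr). Matched MMSE estimator: E[X | Y = y] = tanh(snr * y).
# Mismatched estimator wrongly assuming X ~ N(0, 1): the linear rule
# E_Q[X | Y = y] = snr / (snr + 1) * y.
rng = np.random.default_rng(1)
n, snr = 100_000, 1.0
x = rng.choice([-1.0, 1.0], size=n)
y = x + rng.standard_normal(n) / np.sqrt(snr)

mse_matched = np.mean((x - np.tanh(snr * y)) ** 2)
mse_mismatched = np.mean((x - snr / (snr + 1) * y) ** 2)
# The mismatched estimator pays a strictly positive excess MSE whenever
# the assumed and true input laws differ.
```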

minimum mean-square error (MMSE) estimation | Estimation error | Divergence | Estimation theory | Probability | Entropy | Shannon theory | free probability | Gaussian noise | relative entropy | Random variables | Mutual information | Network address translation | Signal to noise ratio | Information theory | statistics | mutual information | GAUSSIAN CHANNELS | MEAN-SQUARE ERROR | ANALOGS | PERTURBATION | FREE PROBABILITY-THEORY | COMPUTER SCIENCE, INFORMATION SYSTEMS | POWER INEQUALITY | SIMPLE PROOF | ENGINEERING, ELECTRICAL & ELECTRONIC | FISHER INFORMATION MEASURE | Measurement | Entropy (Information theory) | Mean square errors | Normal distribution | Maximum entropy method | Integrals | Noise | Gaussian | Representations | Estimators

Journal Article

IEEE Transactions on Image Processing, ISSN 1057-7149, 06/2014, Volume 23, Issue 6, pp. 2487 - 2500

In the Part 1 of this two-part study, we present a method of imaging and velocity estimation of ground moving targets using passive synthetic aperture radar....

Transmitters | Imaging | Estimation | Receivers | Radar imaging | Apertures | Image reconstruction | passive radar | entropy | passive imaging | Synthetic aperture radar | velocity estimation | filtered backprojection | SIGNAL | ALGORITHM | SENSOR | OBJECTS | LOCATION RADAR SYSTEMS | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | ENGINEERING, ELECTRICAL & ELECTRONIC | COMPUTATION | Usage | Numerical analysis | Image processing | Innovations | Imaging systems | Waveforms | Fourier transformations | Design and construction | Moving targets | Images | Grounds | Mathematical models | Position (location)

Journal Article

Journal of Computational and Graphical Statistics, ISSN 1061-8600, 04/2017, Volume 26, Issue 2, pp. 355 - 366

We consider a new method for sparse covariance matrix estimation which is motivated by previous results for the so-called Stein-type estimators. Stein proposed...
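
One generic route to a sparse covariance estimate is element-wise soft-thresholding of the off-diagonal sample covariances. This is illustrative only; the paper's estimator combines Stein-type shrinkage with sparsity rather than this simple rule:

```python
import numpy as np

def soft_threshold_cov(X, lam):
    """Soft-threshold the off-diagonal entries of the sample covariance,
    setting small covariances exactly to zero while keeping variances."""
    S = np.cov(X, rowvar=False)
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)  # shrink toward zero
    np.fill_diagonal(T, np.diag(S))                    # variances untouched
    return T

# Independent coordinates: most sample covariances are small noise and
# get thresholded away, leaving a (near-)diagonal estimate.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
S_hat = soft_threshold_cov(X, lam=0.3)
```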

Covariance graph | Numerical optimization | Multivariate analysis | Mathematical statistics | WISHART DISTRIBUTION | REGRESSION | ADAPTIVE LASSO | STATISTICS & PROBABILITY | PROLIFERATION | MATRIX ESTIMATION | Computer simulation | Gaussian distribution | Entropy | Covariance matrix | Shrinkage | Patients | Optimization | Studies | Simulation | Normal distribution | Risk assessment | Eigenvalues | Breast | Maximum likelihood estimators | Mathematical models | Estimating techniques | Descent | Estimators | Cancer

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 05/2009, Volume 55, Issue 5, pp. 2392 - 2405

A new universal estimator of divergence is presented for multidimensional continuous densities based on k -nearest-neighbor ( k -NN) distances. Assuming...
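
The k-NN construction compares, for each sample of P, the distance to its k-th nearest neighbor among the P samples with the distance to its k-th nearest neighbor among the Q samples. A minimal one-dimensional sketch (the paper treats multidimensional densities and establishes convergence guarantees):

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of KL(P || Q) from 1-D samples x ~ P and y ~ Q:
    (1/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)), where
    rho_k is the k-th NN distance within x and nu_k the k-th NN
    distance from x_i to the y sample."""
    n, m = len(x), len(y)
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    total = 0.0
    for xi in x:
        rho = np.partition(np.abs(x - xi), k)[k]      # index 0 is xi itself
        nu = np.partition(np.abs(y - xi), k - 1)[k - 1]
        total += np.log(nu / rho)
    return total / n + np.log(m / (n - 1))

rng = np.random.default_rng(0)
p_samples = rng.standard_normal(2000)
q_same = rng.standard_normal(2000)
q_shift = rng.standard_normal(2000) + 1.0   # true KL(P || Q) = 0.5 here
d_same = knn_kl_divergence(p_samples, q_same, k=5)
d_shift = knn_kl_divergence(p_samples, q_shift, k=5)
```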

nearest-neighbor | Multidimensional systems | Density measurement | Divergence | Laboratories | Kullback-Leibler | Frequency estimation | Probability distribution | Partitioning algorithms | random vector | information measure | Convergence | Neuroscience | partition | universal estimation | Mutual information | Information theory | CONSISTENCY | MUTUAL INFORMATION | COMPUTER SCIENCE, INFORMATION SYSTEMS | ENTROPY | ENGINEERING, ELECTRICAL & ELECTRONIC | Analysis | Algorithms | Estimates | Experiments | Methods | Asymptotic properties | Density | Estimators

Journal Article