Proceedings of the IEEE, ISSN 0018-9219, 06/2010, Volume 98, Issue 6, pp. 925 - 936

On the heels of compressed sensing, a new field has very recently emerged. This field addresses a broad range of problems of significant practical interest,...

Computer vision | low-rank matrices | Filtering | matrix completion | Linear matrix inequalities | Remote sensing | Noise level | semidefinite programming | oracle inequalities | Collaboration | nuclear-norm minimization | Machine learning | Motion pictures | Frequency | duality in optimization | Compressed sensing | Semidefinite programming | Duality in optimization | Nuclear-norm minimization | Oracle inequalities | Low-rank matrices | Matrix completion | INFORMATION | ENGINEERING, ELECTRICAL & ELECTRONIC | Studies | Noise | Minimization | Matrices | Detection | Compressed | Optimization | Quantitative analysis

Journal Article

Journal of Machine Learning Research, ISSN 1532-4435, 07/2010, Volume 11, pp. 2057 - 2078

Given a matrix M of low-rank, we consider the problem of reconstructing it from noisy observations of a small, random subset of its entries. The problem arises...

Low-rank matrices | Spectral methods | Manifold optimization | Matrix completion | low-rank matrices | spectral methods | matrix completion | manifold optimization | APPROXIMATIONS | ALGORITHMS | AUTOMATION & CONTROL SYSTEMS | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE

Journal Article

The Annals of Statistics, ISSN 0090-5364, 4/2011, Volume 39, Issue 2, pp. 887 - 930

Suppose that we observe entries or, more generally, linear combinations of entries of an unknown m × T matrix A corrupted by noise. We are particularly...

Integers | Minimax | Sample size | Analytical estimating | Matrices | Entropy | Random variables | Regression analysis | Covariance matrices | Estimators | Sparse recovery | Empirical process | Quasi-convex Schatten class embeddings | Schatten norm | Penalized least-squares estimator | High-dimensional low-rank matrices | REGRESSION | penalized least-squares estimator | empirical process | sparse recovery | STATISTICS & PROBABILITY | TRACE-NORM | NORM MINIMIZATION | CONSISTENCY | LASSO | quasi-convex Schatten class embeddings | SELECTION | AGGREGATION | ENTROPY | Studies | Estimates | Matrix | Probability | Mathematics | 62G05 | 62F10

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 05/2010, Volume 56, Issue 5, pp. 2053 - 2080

This paper is concerned with the problem of recovering an unknown matrix from a small fraction of its entries. This is known as the matrix completion problem,...

random matrices and techniques from random matrix theory | Minimization methods | low-rank matrices | matrix completion | Mathematics | Information filtering | nuclear norm minimization | free probability | semidefinite programming | Collaboration | Duality in optimization | Signal processing | Associate members | Information filters | Motion pictures | Nuclear norm minimization | Semidefinite programming | Free probability | Random matrices and techniques from random matrix theory | Low-rank matrices | Matrix completion | INEQUALITIES | INFORMATION | COMPUTER SCIENCE, INFORMATION SYSTEMS | ENGINEERING, ELECTRICAL & ELECTRONIC | Computer programming | Matrices | Research | Mathematical optimization | Methods | Information science | Matrix | Algebra | Theoretical mathematics | Filtration | Filtering | Mathematical analysis | Norms | Incoherence | Vectors (mathematics) | Recovery | Optimization

Journal Article

Neural Networks, ISSN 0893-6080, 02/2018, Volume 98, pp. 34 - 41

Conventional matrix completion methods are linear and therefore not effective at handling data with nonlinear structure. Recently a few researchers...

Deep learning | Matrix factorization | Image inpainting | Collaborative filtering | Matrix completion | NEUROSCIENCES | LOW-RANK | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Machine Learning | Pattern Recognition, Automated - methods | Neural Networks (Computer) | Index Medicus

Journal Article

IEEE Transactions on Pattern Analysis and Machine Intelligence, ISSN 0162-8828, 04/2017, Volume 39, Issue 4, pp. 818 - 832

Low-rank recovery models have shown potential for salient object detection, where a matrix is decomposed into a low-rank matrix representing image background...

subspace learning | Image segmentation | Laplace equations | Image color analysis | low rank | Computational modeling | Object detection | Matrix decomposition | Sparse matrices | Salient object detection | structured sparsity | matrix decomposition | VISUAL-ATTENTION | REGION DETECTION | MODEL | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | ENGINEERING, ELECTRICAL & ELECTRONIC | Harmonic functions | Usage | Computer-generated environments | Computer simulation | State of the art | Sparsity | Salience | Decomposition | Performance measurement | Object recognition | Regularization | Vision systems | Image detection

Journal Article

IEEE Transactions on Information Theory, ISSN 0018-9448, 06/2010, Volume 56, Issue 6, pp. 2980 - 2998

Let M be an nα × n matrix of rank r, and assume that a uniformly random subset E of its entries is observed. We describe an efficient algorithm, which we call...

spectral methods | low rank | manifold optimization | matrix completion | Optimization methods | Reconstruction algorithms | Watches | Information filtering | Sparse matrices | Root mean square | Gradient descent | Collaboration | Motion pictures | Information filters | phase transition | Mathematical model | Low rank | Spectral methods | Manifold optimization | Matrix completion | Phase transition | APPROXIMATIONS | INFORMATION | COMPUTER SCIENCE, INFORMATION SYSTEMS | LOW-RANK MATRIX | ALGORITHMS | ENGINEERING, ELECTRICAL & ELECTRONIC | Matrices | Research | Mathematical models | Matrix | Algorithms | Theory | Information systems | Reconstruction | Mean square values | Mathematical analysis | Data sets | Roots | Information theory

Journal Article
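The entry above describes a spectral algorithm (the OptSpace line of work) that starts from an SVD of the zero-filled observed matrix. A minimal NumPy sketch of that spectral first step, under the simplifying assumptions of uniform sampling and a known target rank (the function name and the rescaling-by-1/p convention here are illustrative, not the authors' code):

```python
import numpy as np

def spectral_estimate(M_obs, mask, rank):
    """Spectral starting point for matrix completion: zero-fill the
    unobserved entries, rescale by 1/p so the expectation matches the
    full matrix, then project onto the best rank-`rank` approximation."""
    p = mask.mean()                                    # observed fraction
    U, s, Vt = np.linalg.svd((mask * M_obs) / p, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

# Toy example: a random rank-2 matrix with 70% of entries observed.
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 100))
mask = rng.random((100, 100)) < 0.7
M_hat = spectral_estimate(mask * M, mask, rank=2)
```

The papers above follow this step with trimming of over-represented rows/columns and local refinement (gradient descent on the Grassmann manifold); the raw spectral estimate alone already captures most of the signal at this sampling rate.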

Foundations of Computational Mathematics, ISSN 1615-3375, 12/2009, Volume 9, Issue 6, pp. 717 - 772

We consider a problem of considerable practical interest: the recovery of a data matrix from a sampling of its entries. Suppose that we observe m entries...

Nuclear norm minimization | Random matrices | Decoupling | Noncommutative Khintchine inequality | Convex optimization | Duality in optimization | Compressed sensing | Low-rank matrices | Matrix completion | MATHEMATICS, APPLIED | INEQUALITIES | U-STATISTICS | MATHEMATICS | COMPUTER SCIENCE, THEORY & METHODS

Journal Article
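The recovery problem in the entry above is typically attacked by nuclear-norm minimization. One standard solver for the regularized form is proximal gradient descent with singular-value soft-thresholding; the sketch below is a generic instance of that idea, not the paper's algorithm, and the parameters (`tau`, step size 1, iteration count) are illustrative:

```python
import numpy as np

def nuclear_norm_complete(M_obs, mask, tau=0.5, n_iters=500):
    """Proximal gradient descent for
        minimize  0.5 * ||P_Omega(X - M)||_F^2  +  tau * ||X||_*
    where P_Omega keeps only the observed entries (mask == True).
    Each iteration soft-thresholds the singular values, which shrinks rank."""
    X = np.zeros_like(M_obs)
    for _ in range(n_iters):
        G = X - mask * (X - M_obs)             # gradient step (step size 1)
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt   # prox of tau * ||.||_*
    return X

rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random((30, 30)) < 0.7
X = nuclear_norm_complete(mask * M, mask)
```

Since the mask operator is a projection, step size 1 is safe here; smaller `tau` fits the observed entries more tightly at the cost of a higher-rank iterate.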

Pattern Recognition, ISSN 0031-3203, 05/2018, Volume 77, pp. 378 - 394

Conventional matrix completion methods are generally linear because they assume that the given data are from linear transformations of lower-dimensional latent...

Image inpainting | Non-linear denoising | Schatten p-norm | Low-rank | Single-/multi-label classification | Kernel | Matrix completion | RECOGNITION | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | ENGINEERING, ELECTRICAL & ELECTRONIC | PRE-IMAGE | NORM | REGULARIZATION | Information management

Journal Article

Journal of Machine Learning Research, ISSN 1532-4435, 12/2011, Volume 12, pp. 3413 - 3430

This paper provides the best bounds to date on the number of randomly sampled entries required to reconstruct an unknown low-rank matrix. These results improve...

Operator Chernoff bound | Nuclear norm minimization | Random matrices | Convex optimization | Compressed sensing | Low-rank matrices | Matrix completion | low-rank matrices | compressed sensing | matrix completion | random matrices | MINIMIZATION | BOUNDS | INEQUALITY | convex optimization | operator Chernoff bound | nuclear norm minimization | AUTOMATION & CONTROL SYSTEMS | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE

Journal Article

Pattern Recognition, ISSN 0031-3203, 04/2018, Volume 76, pp. 715 - 726

Modern technologies have been producing data with complex intrinsic structures, which can be naturally represented as two-dimensional matrices, such as gray...

Matrix analysis | Sparse | Support vector machine | Low rank | Classification | REGRESSION | ALGORITHM | REPRESENTATION | COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | ENGINEERING, ELECTRICAL & ELECTRONIC

Journal Article

SIAM Journal on Numerical Analysis, ISSN 0036-1429, 1/2011, Volume 49, Issue 3/4, pp. 1417 - 1435

In this paper, we investigate how the Gauss-Newton Hessian matrix affects the basin of convergence in Newton-type methods. Although the Newton algorithm is...

Missing data | Approximation | Algorithms | Objective functions | Least squares | Quadratic approximation | Mathematical independent variables | Newton's method | Hessian matrices | Estimation methods | Gauss-Newton Hessian matrix | Least squares problem | Hessian matrix | MATHEMATICS, APPLIED | SHAPE | MOTION | MODELS | FACTORIZATION METHOD | ALGORITHM | least squares problem | MISSING DATA | LOW-RANK MATRIX | COMPLETION | Gauss-Newton Hessian matrix

Journal Article

Journal of Computational Physics, ISSN 0021-9991, 2011, Volume 230, Issue 10, pp. 4071 - 4087

We develop a hierarchical matrix construction algorithm using matrix-vector multiplications, based on the randomized singular value decomposition of low-rank...

Green's function | Elliptic operator | Randomized singular value decomposition | Fast algorithm | Hierarchical matrix construction | Matrix-vector multiplication | MONTE-CARLO ALGORITHMS | LOW-RANK APPROXIMATION | DECOMPOSITION | SOLVER | PHYSICS, MATHEMATICAL | COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS | SEMISEPARABLE REPRESENTATIONS | Algorithms | Multiplication | Matrix representation | Mathematical analysis | Images | Computational efficiency | Vectors (mathematics) | Two dimensional | Construction costs

Journal Article
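The construction in the entry above accesses the matrix only through matrix-vector products and compresses low-rank blocks with a randomized SVD. A self-contained sketch of that primitive, following the standard range-finder recipe (the oversampling amount and the names are illustrative choices):

```python
import numpy as np

def randomized_svd(matvec, rmatvec, n_cols, rank, n_oversample=10, seed=0):
    """Approximate truncated SVD of an implicit matrix A, given only the
    products A @ X (matvec) and A.T @ Y (rmatvec)."""
    rng = np.random.default_rng(seed)
    k = rank + n_oversample
    Omega = rng.standard_normal((n_cols, k))   # random Gaussian probes
    Q, _ = np.linalg.qr(matvec(Omega))         # orthonormal basis for range(A)
    B = rmatvec(Q).T                           # small k x n_cols matrix Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]
```

For a matrix whose numerical rank is below `rank`, the factorization is accurate to near machine precision; the oversampling buffer controls the failure probability for matrices with slowly decaying spectra.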

Numerical Linear Algebra with Applications, ISSN 1070-5325, 12/2010, Volume 17, Issue 6, pp. 953 - 976

Semiseparable matrices and many other rank-structured matrices have been widely used in developing new fast matrix algorithms. In this paper, we generalize the...

postordering HSS tree | low-rank property | fast HSS algorithms | HSS matrix | generalized HSS Cholesky factorization | Fast HSS algorithms | Low-rank property | Generalized HSS Cholesky factorization | Postordering HSS tree | MATHEMATICS, APPLIED | REPRESENTATIONS | STRUCTURED MATRICES | SPARSE | INTEGRAL-EQUATIONS | MATHEMATICS | PARTICLE SIMULATIONS | ELIMINATION TREES | FAST DIRECT SOLVER | MULTIFRONTAL METHOD | 2 DIMENSIONS | Algorithms | Matrix representation | Mathematical analysis | Matrices | Factorization | Matrix methods | Complexity

Journal Article

Probability Theory and Related Fields, ISSN 0178-8051, 10/2017, Volume 169, Issue 1, pp. 523 - 564

This paper considers the problem of estimation of a low-rank matrix when most of its entries are not observed and some of the observed entries are corrupted....

62G05 | 62J02 | Mathematical and Computational Biology | Theoretical, Mathematical and Computational Physics | Operations Research/Decision Theory | Probability Theory and Stochastic Processes | Mathematics | Quantitative Finance | 62G35 | DECOMPOSITION | STATISTICS & PROBABILITY | OPTIMAL RATES | LOW-RANK MATRICES | Sparsity | Minimax technique | Convexity | Statistics

Journal Article

IEEE Transactions on Geoscience and Remote Sensing, ISSN 0196-2892, 08/2014, Volume 52, Issue 8, pp. 4729 - 4743

Hyperspectral images (HSIs) are often degraded by a mixture of various kinds of noise in the acquisition process, which can include Gaussian noise, impulse...

low rank | Gaussian noise | Noise reduction | Go Decomposition (GoDec) | hyperspectral image (HSIs) | Image restoration | Sparse matrices | Matrix decomposition | restoration | Hyperspectral imaging | Restoration | GEOCHEMISTRY & GEOPHYSICS | REMOTE SENSING | NOISE-REDUCTION | ALGORITHM | JOINT-SPARSE | IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY | DIFFUSION | ENGINEERING, ELECTRICAL & ELECTRONIC | Random noise theory | Usage | Image processing | Mathematical optimization | Innovations | Digital broadcasting | Normal distribution | Noise | Impulses | Algorithms | Computer simulation | Gaussian | Cleaning | Recovery | Marketing

Journal Article

Journal of the Royal Statistical Society. Series B (Statistical Methodology), ISSN 1369-7412, 9/2013, Volume 75, Issue 4, pp. 603 - 680

The paper deals with the estimation of a high dimensional covariance with a conditional sparsity structure and fast diverging eigenvalues. By assuming a sparse...

Covariance | Threshing | Eigenvalues | Principal components analysis | Poetry | Covariance matrices | Financial portfolios | Estimators | Consistent estimators | Estimation methods | Cross-sectional correlation | Diverging eigenvalues | Thresholding | Sparse matrix | Approximate factor model | Low rank matrix | High dimensionality | Unknown factors | Principal components | LARGEST EIGENVALUE | COMPONENTS-ANALYSIS | STATISTICS & PROBABILITY | HIGH-DIMENSION | PORTFOLIO SELECTION | FALSE DISCOVERY | CONSISTENCY | OPTIMAL RATES | DYNAMIC-FACTOR MODEL | MATRIX DECOMPOSITION | LARGE NUMBER | Studies | Mathematical models | Statistical analysis | Matrix | Statistics | Approximation | Asymptotic properties | Complement | Covariance matrix | Convergence | thresholding | diverging eigenvalues | sparse matrix | principal components | approximate factor model | unknown factors | High-dimensionality | cross-sectional correlation | low-rank matrix

Journal Article
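The estimator in the entry above (the POET approach) combines a principal-components factor part with thresholding of the residual covariance. A hedged sketch, assuming a known number of factors and simple hard thresholding (both illustrative simplifications of the paper's adaptive scheme):

```python
import numpy as np

def poet_covariance(X, n_factors, threshold):
    """POET-style estimate (sketch): keep the top `n_factors` principal
    components of the sample covariance as the low-rank factor part,
    then hard-threshold the off-diagonal residual ("sparse") part."""
    S = np.cov(X, rowvar=False)
    w, V = np.linalg.eigh(S)                     # ascending eigenvalues
    top = np.argsort(w)[::-1][:n_factors]
    low_rank = (V[:, top] * w[top]) @ V[:, top].T
    resid = S - low_rank
    sparse = np.where(np.abs(resid) >= threshold, resid, 0.0)
    np.fill_diagonal(sparse, np.diag(resid))     # never threshold variances
    return low_rank + sparse
```

With `threshold=0` this reduces to the plain sample covariance; a very large threshold leaves only the factor part plus residual variances, matching the conditional-sparsity structure the abstract describes.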