References of "Mishra, Bamdev"

A Riemannian approach to large-scale constrained least-squares with symmetries
Mishra, Bamdev (ULg)

Doctoral thesis (2014)

This thesis deals with least-squares optimization on a manifold of equivalence relations, i.e., optimization in the presence of symmetries, which arise frequently in applications. While least-squares cost functions remain a popular way to model large-scale problems, the additional symmetry constraint should be interpreted as a way to make the modeling robust. Two fundamental examples are the matrix completion problem, a least-squares problem with rank constraints, and the generalized eigenvalue problem, a least-squares problem with orthogonality constraints. The potentially large scale of these problems demands that the problem structure be exploited as much as possible in the design of numerically efficient algorithms. These constrained least-squares problems are tackled in the framework of Riemannian optimization, which has gained popularity in recent years because orthogonality and rank constraints possess particular symmetries. Previous work on Riemannian optimization has mostly focused on the search space, exploiting the differential geometry of the constraint but disregarding the role of the cost function. We, on the other hand, take both the cost and the constraints into account and propose a tailored Riemannian geometry, achieved through novel Riemannian metrics. To this end, we show a basic connection between sequential quadratic programming and Riemannian gradient optimization, and address the general question of selecting a metric in Riemannian optimization. We revisit quadratic optimization problems with orthogonality and rank constraints by generalizing various existing methods, such as power, inverse, and Rayleigh quotient iterations, and we propose novel methods that empirically compete with state-of-the-art algorithms. Overall, this thesis deals with exploiting two fundamental structures, least-squares and symmetry, in nonlinear optimization.
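
To make the Riemannian gradient machinery the thesis builds on concrete (this is a generic textbook sketch, not the thesis's tailored metrics), the snippet below minimizes the Rayleigh quotient x^T A x on the unit sphere, the simplest orthogonality-constrained least-squares instance of the generalized eigenvalue problem: project the Euclidean gradient onto the tangent space, take a step, and retract. The function name and the step-size rule are illustrative assumptions.

```python
import numpy as np

def rayleigh_riemannian_gd(A, x0, iters=2000):
    """Riemannian gradient descent for min x^T A x over the unit sphere."""
    x = x0 / np.linalg.norm(x0)
    step = 1.0 / (2.0 * np.linalg.norm(A, 2))  # conservative: ||egrad|| <= 2||A||
    for _ in range(iters):
        egrad = 2.0 * A @ x                    # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x        # project onto the tangent space at x
        x = x - step * rgrad                   # step along the negative gradient
        x = x / np.linalg.norm(x)              # retract back onto the sphere
    return x

# Usage: approximately recover the smallest eigenvector of a symmetric matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M + M.T
x = rayleigh_riemannian_gd(A, rng.standard_normal(50))
print(x @ A @ x, np.linalg.eigvalsh(A)[0])     # the two values should be close
```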

Low-rank optimization with trace norm penalty
Mishra, Bamdev (ULg); Meyer, Gilles (ULg); Bach, Francis; et al.

E-print/Working paper (2012)

The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the search space and makes the computation of the duality gap numerically tractable. The search space is nonlinear but is equipped with a particular Riemannian structure that leads to efficient computations. We present a second-order trust-region algorithm with a guaranteed quadratic rate of convergence. Overall, the proposed optimization scheme converges superlinearly to the global solution while maintaining complexity that is linear in the number of rows of the matrix. To compute a set of solutions efficiently for a grid of regularization parameters, we propose a predictor-corrector approach on the quotient manifold that outperforms the naive warm-restart approach. The performance of the proposed algorithm is illustrated on problems of low-rank matrix completion and multivariate linear regression.
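
The outer loop described above can be pictured without the paper's trust-region machinery. The sketch below is a deliberately naive stand-in: it assumes the variational form ||X||_* = min over X = G H^T of (||G||_F^2 + ||H||_F^2)/2 for the fixed-rank phase, uses the dual condition sigma_max(-grad f(X)) <= lambda as the stopping and rank-increase test, and appends the dominant singular pair of -grad f(X) as the rank-one update. Function names, step sizes, and tolerances are assumptions, and the plain gradient inner loop replaces the paper's second-order method.

```python
import numpy as np

def grad_f(X, X_obs, mask):
    # Euclidean gradient of the data term f(X) = 0.5 * ||mask * (X - X_obs)||_F^2
    return mask * (X - X_obs)

def trace_norm_completion(X_obs, mask, lam, max_rank=30, inner_iters=300, lr=1e-2):
    m, n = X_obs.shape
    # Rank-one initialization from the dominant singular pair of -grad f(0).
    u, s, vt = np.linalg.svd(-grad_f(np.zeros((m, n)), X_obs, mask),
                             full_matrices=False)
    G = np.sqrt(s[0]) * u[:, :1]
    H = np.sqrt(s[0]) * vt[:1, :].T
    while True:
        # Fixed-rank phase: gradient steps on f(G H^T) + lam/2 (||G||^2 + ||H||^2).
        for _ in range(inner_iters):
            R = grad_f(G @ H.T, X_obs, mask)
            G, H = G - lr * (R @ H + lam * G), H - lr * (R.T @ G + lam * H)
        # Optimality test: at a trace-norm minimizer, sigma_max(-grad f) <= lam.
        u, s, vt = np.linalg.svd(-grad_f(G @ H.T, X_obs, mask), full_matrices=False)
        if s[0] <= lam * (1.0 + 1e-3) or G.shape[1] >= max_rank:
            return G @ H.T
        # Rank-one update: append a small step along the dominant singular pair.
        G = np.hstack([G, lr * np.sqrt(s[0]) * u[:, :1]])
        H = np.hstack([H, lr * np.sqrt(s[0]) * vt[:1, :].T])
```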

Fixed-rank matrix factorizations and Riemannian low-rank optimization
Mishra, Bamdev (ULg); Meyer, Gilles (ULg); Bonnabel, Silvere; et al.

E-print/Working paper (2012)

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian matrix manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian geometry of the search space in the design of a class of gradient descent and trust-region algorithms. The proposed algorithms generalize our previous results on fixed-rank symmetric positive semidefinite matrices, apply to a broad range of applications, scale to high-dimensional problems, and give a geometric basis to recent contributions on the learning of fixed-rank non-symmetric matrices. We make connections with existing algorithms in the context of low-rank matrix completion and discuss the relative usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with the state of the art and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix.
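
One concrete payoff of working with the quotient geometry of a factorization X = G H^T is that the gradient can be scaled so that the iterates do not depend on which factorization of X is used. The step below is a minimal sketch of such a scaled update for the matrix completion cost, a common choice in this literature and not necessarily the exact metric of the paper; the invariance is under G -> G M, H -> H M^{-T} for any invertible M.

```python
import numpy as np

def scaled_gradient_step(G, H, X_obs, mask, step=0.2):
    """One step on f(G, H) = 0.5 * ||mask * (G H^T - X_obs)||_F^2 with the
    partial gradients right-scaled by (H^T H)^{-1} and (G^T G)^{-1}; the
    scaling makes the update invariant to the non-uniqueness of X = G H^T."""
    R = mask * (G @ H.T - X_obs)                 # residual on observed entries
    dG = R @ H @ np.linalg.inv(H.T @ H)          # scaled partial gradient w.r.t. G
    dH = R.T @ G @ np.linalg.inv(G.T @ G)        # scaled partial gradient w.r.t. H
    return G - step * dG, H - step * dH

# Usage on a small synthetic completion instance (rank 3, 50% observed).
rng = np.random.default_rng(0)
m, n, r = 100, 80, 3
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = (rng.random((m, n)) < 0.5).astype(float)
G, H = rng.standard_normal((m, r)), rng.standard_normal((n, r))
for _ in range(500):
    G, H = scaled_gradient_step(G, H, mask * X_true, mask)
print(np.linalg.norm(mask * (G @ H.T - X_true)) / np.linalg.norm(mask * X_true))
```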

A Riemannian geometry for low-rank matrix completion
Mishra, Bamdev (ULg); Karavadi, Adithya Apuroop; Sepulchre, Rodolphe (ULg)

E-print/Working paper (2012)

We propose a new Riemannian geometry for fixed-rank matrices that is specifically tailored to the low-rank matrix completion problem. Exploiting the degrees of freedom of a quotient space, we tune the metric on the search space to the particular least-squares cost function. At one level, this illustrates in a novel way how to exploit the versatile framework of optimization on quotient manifolds. At another level, the resulting algorithm can be considered an improved version of LMaFit, the state-of-the-art Gauss-Seidel algorithm. We develop the tools needed to perform both first-order and second-order optimization. In particular, we propose line-search schemes (steepest descent and conjugate gradients) and trust-region algorithms. We also show that, thanks to the simplicity of the cost function, it is numerically cheap to perform an exact line search along a given search direction, which makes our algorithms competitive with the state of the art on standard low-rank matrix completion instances.
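
The exact line-search claim follows from the structure of the cost: along a direction (dG, dH) on the factors of X = G H^T, the completion cost phi(t) = 0.5 * ||mask * ((G + t dG)(H + t dH)^T - X_obs)||_F^2 is a degree-4 polynomial in the step size t, so its exact minimizer is a root of a cubic. A minimal sketch, assuming this factored parametrization (the paper's actual search space may differ):

```python
import numpy as np

def exact_linesearch(G, H, dG, dH, X_obs, mask):
    """Exact line search for phi(t), which is quartic in t: recover its
    coefficients by interpolation at five points, differentiate, and return
    the real critical point with the lowest cost."""
    def phi(t):
        E = mask * ((G + t * dG) @ (H + t * dH).T - X_obs)
        return 0.5 * np.sum(E * E)
    ts = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    coeffs = np.polyfit(ts, [phi(t) for t in ts], 4)  # quartic fit is exact here
    crit = np.roots(np.polyder(coeffs))               # roots of the cubic derivative
    crit = crit[np.abs(crit.imag) < 1e-8].real        # a cubic has a real root
    return min(crit, key=phi)
```

Five evaluations of phi plus one 3-by-3 root-finding problem is all it takes, which is why the exact search is cheap relative to the cost of forming the residual.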

Low-rank optimization for distance matrix completion
Mishra, Bamdev (ULg); Meyer, Gilles (ULg); Sepulchre, Rodolphe (ULg)

In Proceedings of the 50th IEEE Conference on Decision and Control (December 2011)

This paper addresses the problem of low-rank distance matrix completion. The problem amounts to recovering the missing entries of a distance matrix when the dimension of the data embedding space is possibly unknown but small compared to the number of data points considered. The focus is on high-dimensional problems. We recast it as an optimization problem over the set of low-rank positive semidefinite matrices and propose two efficient algorithms for low-rank distance matrix completion. In addition, we propose a strategy to determine the dimension of the embedding space. The resulting algorithms scale to high-dimensional problems and converge monotonically to a global solution of the problem. Finally, numerical experiments illustrate the good performance of the proposed algorithms on benchmarks.
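
To make the low-rank reformulation concrete: with the n points stacked as rows of X in R^{n x p}, the squared distances are D_ij = ||x_i - x_j||^2, and the Gram matrix X X^T is positive semidefinite with rank at most p. The sketch below fits X to the observed entries by plain gradient descent; it is an illustrative stand-in under these assumptions, not the paper's two algorithms, and it takes the embedding dimension p as given rather than estimating it.

```python
import numpy as np

def edm_completion(D_obs, mask, p, iters=3000, lr=1e-3, seed=0):
    """Fit points X in R^{n x p} so that squared distances between rows match
    the observed entries of D_obs (mask is the symmetric 0/1 pattern)."""
    rng = np.random.default_rng(seed)
    n = D_obs.shape[0]
    X = rng.standard_normal((n, p))
    for _ in range(iters):
        sq = np.sum(X * X, axis=1)
        D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # all squared distances
        W = mask * (D - D_obs)                         # misfit on observed pairs
        W = W + W.T                                    # each pair contributes via row and column
        # Gradient of 0.5 * ||mask * (D - D_obs)||_F^2 with respect to X.
        X = X - lr * 2.0 * (W.sum(axis=1)[:, None] * X - W @ X)
    return X

# Usage: points in the plane, roughly 60% of the pairwise distances observed.
rng = np.random.default_rng(1)
P = rng.standard_normal((30, 2))
D = np.sum((P[:, None, :] - P[None, :, :]) ** 2, axis=-1)
mask = rng.random((30, 30)) < 0.6
mask = (mask & mask.T).astype(float)
X = edm_completion(D * mask, mask, p=2)
```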
