Browse ORBi by ORBi project

Low-rank plus sparse decomposition for exoplanet detection in direct-imaging ADI sequences. The LLSG algorithm
Gómez González, Carlos; Absil, Olivier; et al., in Astronomy and Astrophysics (2016), 589.
Context: Data processing constitutes a critical component of high-contrast exoplanet imaging. Its role is almost as important as the choice of a coronagraph or a wavefront control system, and it is intertwined with the chosen observing strategy. Among the data processing techniques for angular differential imaging (ADI), the most recent is the family of algorithms based on principal component analysis (PCA), a widely used statistical tool developed during the first half of the past century. PCA serves, in this case, as a subspace projection technique for constructing a reference point spread function (PSF) that can be subtracted from the science data to boost the detectability of potential companions present in the data. Unfortunately, when this reference PSF is built from the science data itself, PCA comes with certain limitations, such as the sensitivity of the lower-dimensional orthogonal subspace to non-Gaussian noise.
Aims: Inspired by recent advances in machine learning algorithms such as robust PCA, we aim to propose a localized subspace projection technique that surpasses current PCA-based post-processing algorithms in terms of the detectability of companions at near real-time speed, a quality that will be useful for future direct imaging surveys.
Methods: We used randomized low-rank approximation methods recently proposed in the machine learning literature, coupled with entry-wise thresholding, to decompose an ADI image sequence locally into low-rank, sparse, and Gaussian noise components (LLSG).
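The local low-rank + sparse + noise split described in the Methods above can be illustrated with a deliberately simplified numpy sketch. This is a generic illustration of the idea, not the authors' LLSG implementation: the matrix sizes, threshold, and alternating scheme below are assumptions chosen for a toy example.

```python
import numpy as np

def three_term_decomposition(M, rank=2, thresh=1.0, n_iter=5):
    """Split M into low-rank L, sparse S, and residual noise G.

    Simplified sketch: L is a rank-`rank` truncated SVD of (M - S),
    and S keeps only the entries of (M - L) exceeding `thresh`.
    """
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank term: truncated SVD of the sparse-subtracted data.
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Sparse term: entry-wise hard thresholding of the residual.
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)
    G = M - L - S  # what remains is modeled as noise
    return L, S, G

# Toy ADI-like stack: a rank-1 "speckle" background plus one bright pixel.
rng = np.random.default_rng(0)
background = np.outer(rng.normal(size=20), rng.normal(size=100))
M = background + 0.1 * rng.normal(size=(20, 100))
M[:, 37] += 5.0  # a localized "companion" signal
L, S, G = three_term_decomposition(M, rank=1, thresh=1.0)
```

With this toy data, the rank-1 term absorbs the speckle background while the bright column survives thresholding and lands in the sparse term, which is the separation the abstract describes.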
This local three-term decomposition separates the starlight and the associated speckle noise from the planetary signal, which mostly remains in the sparse term. We tested the performance of the new algorithm on a long ADI sequence obtained on β Pictoris with VLT/NACO.
Results: Compared to a standard PCA approach, the LLSG decomposition reaches a higher signal-to-noise ratio and has an overall better performance in the receiver operating characteristic space. This three-term decomposition brings a detectability boost compared to the full-frame standard PCA approach, especially in the small inner working angle region, where complex speckle noise prevents PCA from discerning true companions from noise.

Low-rank optimization on the cone of positive semidefinite matrices
Journee, Michel; et al., in SIAM Journal on Optimization (2010), 20(5).

Refining Sparse Principal Components
In Diehl, M. (Ed.), Recent Advances in Optimization and its Applications in Engineering (2010).
In this paper, we discuss methods to refine locally optimal solutions of sparse PCA. Starting from a local solution obtained by existing algorithms, these methods take advantage of convex relaxations of the sparse PCA problem to propose a refined solution that is still locally optimal but has a higher objective value.

Optimization on manifolds: methods and applications
Sepulchre, Rodolphe; et al., in Recent Advances in Optimization and its Applications in Engineering (2010).
This paper provides an introduction to the topic of optimization on manifolds. The approach taken uses the language of differential geometry; however, we choose to emphasise the intuition behind the concepts and the structures that are important in generating practical numerical algorithms, rather than the technical details of the formulation. A number of algorithms can be applied to solve such problems; we discuss steepest descent and Newton's method in some detail, and reference the more important of the other approaches. We are aware of a wide range of potential applications, which we discuss briefly, explaining one or two in more detail.

Elucidating the altered transcriptional programs in breast cancer using independent component analysis
Journee, Michel; et al., in PLoS Computational Biology (2007), 3(8), 1539-1554.
The quantity of mRNA transcripts in a cell is determined by a complex interplay of cooperative and counteracting biological processes. Independent component analysis (ICA) is one of a small number of unsupervised algorithms that have been applied to microarray gene expression data in an attempt to understand phenotype differences in terms of changes in the activation/inhibition patterns of biological pathways. While the ICA model has been shown to outperform other linear representations of the data, such as principal component analysis (PCA), a validation using explicit pathway and regulatory-element information has not yet been performed. We apply a range of popular ICA algorithms to six of the largest microarray cancer datasets and use pathway-knowledge and regulatory-element databases for validation. We show that ICA outperforms PCA and clustering-based methods in that ICA components map more closely to known cancer-related pathways, regulatory modules, and cancer phenotypes. Furthermore, we identify cancer signalling and oncogenic pathways and regulatory modules that play a prominent role in breast cancer, and relate their differential activation patterns to breast cancer phenotypes. Importantly, we find novel associations linking immune response and epithelial-mesenchymal transition pathways with estrogen receptor status and histological grade, respectively. In addition, we find associations linking the activity levels of biological pathways and transcription factors (NF1 and NFAT) with clinical outcome in breast cancer. ICA provides a framework for a more biologically relevant interpretation of genome-wide transcriptomic data. Adopting ICA as the analysis tool of choice will help in understanding the phenotype-pathway relationship, and thus help elucidate the molecular taxonomy of heterogeneous cancers and of other complex genetic diseases.
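As a hedged illustration of the ICA model the abstract relies on (observations X = A S, with the rows of S statistically independent and non-Gaussian), the following numpy-only sketch recovers two synthetic sources with a basic symmetric FastICA. The contrast function, iteration count, and toy data are assumptions for illustration, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# ICA model: X = A @ S, where the rows of S are independent,
# non-Gaussian "component activity" signals.
n, t = 2, 2000
S_true = np.vstack([np.sign(rng.normal(size=t)) * rng.uniform(0.5, 1.5, t),
                    rng.uniform(-1, 1, t)])
A_true = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A_true @ S_true

# Whiten: linearly transform X so its covariance is the identity.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E / np.sqrt(d)) @ E.T @ Xc

# Symmetric FastICA with the tanh contrast.
W = rng.normal(size=(n, n))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = (G @ Z.T) / t - np.diag((1 - G**2).mean(axis=1)) @ W
    # Symmetric decorrelation: W <- (W W^T)^(-1/2) W
    d2, E2 = np.linalg.eigh(W_new @ W_new.T)
    W = (E2 / np.sqrt(d2)) @ E2.T @ W_new
S_est = W @ Z  # estimated sources, up to permutation and sign
```

The estimated rows of `S_est` match the true sources up to permutation, sign, and scale, which is the usual ICA indeterminacy.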
Gradient-optimization on the orthogonal group for Independent Component Analysis
Journee, Michel; Sepulchre, Rodolphe; et al., in 7th International Conference on Independent Component Analysis and Blind Signal Separation (ICA 2007) (2007).

Geometric optimization methods for independent component analysis applied on gene expression data
Journee, Michel; et al., in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007) (2007).

Geometric optimization methods for the analysis of gene expression data
Journee, Michel; et al., part of book (2007).

Continuous dynamical systems that realize discrete optimization on the hypercube
Sepulchre, Rodolphe; et al., in Systems & Control Letters (2004), 52(3-4), 297-304.
We study the problem of finding a local minimum of a multilinear function E over the discrete set {0, 1}^n. The search is achieved by a gradient-like system in [0, 1]^n with cost function E. Under mild restrictions on the metric, the stable attractors of the gradient-like system are shown to produce solutions of the problem, even when they are not in the vicinity of the discrete set {0, 1}^n. Moreover, the gradient-like system connects with interior-point methods for linear programming and with the analog neural network studied by Vidyasagar (IEEE Trans. Automat. Control 40 (8) (1995) 1359) in the same context.
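The hypercube-relaxation idea in the last abstract can be illustrated with a minimal Euclidean gradient flow. The paper allows more general metrics; the specific multilinear cost, step size, and explicit Euler discretization here are assumptions chosen for a small worked example.

```python
import numpy as np

# A multilinear cost on {0,1}^3, extended to the cube [0,1]^3:
# E(x) = x0*x1 - 2*x0 - x2 + x1*x2
def E(x):
    return x[0]*x[1] - 2*x[0] - x[2] + x[1]*x[2]

def grad_E(x):
    # Partial derivatives of the multilinear extension.
    return np.array([x[1] - 2.0,
                     x[0] + x[2],
                     x[1] - 1.0])

# Gradient descent projected onto the cube; the stable attractors
# land on vertices that are local minima of E over {0,1}^3.
x = np.array([0.5, 0.5, 0.5])
for _ in range(1000):
    x = np.clip(x - 0.05 * grad_E(x), 0.0, 1.0)
rounded = np.round(x).astype(int)
# rounded -> [1, 0, 1], with E(1, 0, 1) = -3
```

Starting from the cube's center, the flow is pushed to the vertex (1, 0, 1), which is indeed the minimizer of E over all eight vertices: the continuous dynamics realize the discrete optimization.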
Riemannian geometry of Grassmann manifolds with a view on algorithmic computation
Sepulchre, Rodolphe; et al., in Acta Applicandae Mathematicae (2004), 80(2), 199-220.
We give simple formulas for the canonical metric, gradient, Lie derivative, Riemannian connection, parallel translation, geodesics, and distance on the Grassmann manifold of p-planes in R^n. In these formulas, p-planes are represented as the column space of n x p matrices. The Newton method on abstract Riemannian manifolds proposed by Smith is made explicit on the Grassmann manifold. Two applications are worked out: computing an invariant subspace of a matrix, and computing the mean of subspaces.

Cubically convergent iterations for invariant subspace computation
Sepulchre, Rodolphe; et al., in SIAM Journal on Matrix Analysis and Applications (2004), 26(1), 70-96.
We propose a Newton-like iteration that evolves on the set of fixed-dimensional subspaces of R^n and converges locally cubically to the invariant subspaces of a symmetric matrix. This iteration is compared, in terms of numerical cost and global behavior, with three other methods that display the same property of cubic convergence. Moreover, we consider heuristics that greatly improve the global behavior of the iterations.
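The cubically convergent schemes above refine much slower classical methods. As a point of reference, the classical, linearly convergent subspace iteration for a dominant invariant subspace can be sketched as follows (the test matrix and iteration count are arbitrary choices for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Symmetric test matrix with a well-separated dominant eigenspace.
Q, _ = np.linalg.qr(rng.normal(size=(6, 6)))
A = Q @ np.diag([10.0, 9.0, 1.0, 0.8, 0.5, 0.1]) @ Q.T

# Subspace iteration for the dominant p-dimensional invariant
# subspace: repeatedly multiply by A and re-orthonormalize.
p = 2
X = np.linalg.qr(rng.normal(size=(6, p)))[0]
for _ in range(100):
    X, _ = np.linalg.qr(A @ X)

# The residual ||A X - X (X^T A X)|| vanishes when span(X) is invariant.
residual = np.linalg.norm(A @ X - X @ (X.T @ A @ X))
```

The per-iteration error here contracts only by the eigenvalue ratio (0.1 ... 1.0 versus 9, 10 in this example), which is the linear rate the cubically convergent Newton-like iterations improve upon.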
A Grassmann-Rayleigh quotient iteration for computing invariant subspaces
Sepulchre, Rodolphe; et al., in SIAM Review (2002), 44(1), 57-73.
The classical Rayleigh quotient iteration (RQI) allows one to compute a one-dimensional invariant subspace of a symmetric matrix A. Here we propose a generalization of the RQI which computes a p-dimensional invariant subspace of A. Cubic convergence is preserved, and the cost per iteration is low compared to other methods proposed in the literature.

Nonlinear analysis of cardiac rhythm fluctuations using the DFA method
Sepulchre, Rodolphe; et al., in Physica A: Statistical Mechanics and its Applications (1999), 272.
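A block Rayleigh quotient step of the kind the Grassmann-RQI abstract describes can be sketched as: solve a small Sylvester equation A Z - Z R = X with the block shift R = X^T A X, then re-orthonormalize Z. The sketch below is a schematic reading of that idea, not the authors' exact formulation; the test matrix, starting subspace, and iteration count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
Qm, _ = np.linalg.qr(rng.normal(size=(8, 8)))
A = Qm @ np.diag(np.arange(1.0, 9.0)) @ Qm.T  # symmetric test matrix

def block_rqi_step(A, X):
    """One block Rayleigh quotient step.

    Solves A Z - Z R = X with R = X^T A X by diagonalizing the small
    symmetric block R, then re-orthonormalizes Z.
    """
    R = X.T @ A @ X
    lam, V = np.linalg.eigh(R)          # R = V diag(lam) V^T
    B = X @ V                           # rotate the right-hand side
    W = np.column_stack([np.linalg.solve(A - l * np.eye(A.shape[0]), b)
                         for l, b in zip(lam, B.T)])
    Z = W @ V.T                         # rotate back
    return np.linalg.qr(Z)[0]

# Start near the invariant subspace for eigenvalues {1, 2} and iterate.
p = 2
X = np.linalg.qr(Qm[:, :p] + 0.1 * rng.normal(size=(8, p)))[0]
for _ in range(3):
    X = block_rqi_step(A, X)

residual = np.linalg.norm(A @ X - X @ (X.T @ A @ X))
```

As with the scalar RQI, each shifted solve becomes increasingly ill-conditioned near convergence, but the computed direction remains accurate and the QR step restores orthonormality; the residual collapses in just a few iterations, consistent with the cubic rate claimed in the abstract.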