"Discussant" at the colloquium “Exigences et limites du droit à la différence dans une société multiculturelle”
Scientific conference (1997, March 22)
Discussant, Séminaire Doctoral Annuel de l’Ecole Doctorale thématique en Sciences Sociales
Scientific conference (2014, May 09)
Discussing the validation of high-dimensional probability distribution learning with mixtures of graphical models for inference
Poster (2010, October 06)
Exact inference on probabilistic graphical models quickly becomes intractable as the dimension of the problem increases. A weighted average (or mixture) of simple graphical models can be used instead of a more complicated model to learn a distribution, making probabilistic inference much more efficient. I hope to discuss issues related to the validation of algorithms for learning such mixtures of models, and to high-dimensional learning of probabilistic graphical models in general, and to gather valuable feedback and comments on my approach. The main problems are the difficulty of assessing the accuracy of the algorithms and of choosing a representative set of target distributions. The accuracy of algorithms for learning probabilistic graphical models is often evaluated by comparing the structure of the resulting model to the target (e.g., number of similar/dissimilar edges, the BDe score, etc.). This approach however falls short when studying methods based on a mixture of simple models: individually, these lack the representational power to model the true distribution, and only their combination allows them to compete with more sophisticated models. The Kullback-Leibler divergence measures the difference between two probability densities, and can be used to compare any model learned from a dataset to the data-generating distribution. For computational reasons, however, I had to resort to a Monte Carlo estimation of this quantity for large problems (starting at around 200 variables). Since the ultimate motivation for building these models is probabilistic inference rather than probability modelling, a more meaningful measure of accuracy could be obtained by comparing mixtures against a combination of state-of-the-art model learning and approximate inference algorithms.
However, the exact inference result cannot easily be obtained for interesting target distributions, since mixtures are considered precisely because exact inference is infeasible on those targets, and approximate inference would introduce a bias. Selecting a target distribution used to generate the data sets on which the algorithms are evaluated also proved challenging. The easiest solution was to generate them at random (although different approaches can be designed), but such models are likely to be rather different from real problems, and thus a poor choice for assessing the practical interest of mixtures of models. Methods (e.g., linking multiple copies of a given network) have been developed to increase the size of models known to the community (e.g., the alarm network), and the resulting graphical models have been made available; these could however still be far from the kind of interactions present in a real setting. A better way to proceed could be to generate samples from the equations describing a physical problem, learn a probabilistic model as well as possible from this high-dimensional dataset, and use it as the target distribution.
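The Monte Carlo estimate of the Kullback-Leibler divergence mentioned in the abstract can be sketched as follows. This is an illustrative toy example, not the author's implementation: the distributions here are univariate Gaussians (with a known closed-form KL) standing in for a target distribution and a learned mixture, and all names (`gauss_logpdf`, `mc_kl`) are hypothetical. The idea is simply that KL(p || q) = E_p[log p(x) − log q(x)], so it can be approximated by averaging the log-ratio over samples drawn from p.

```python
import math
import random

def gauss_logpdf(x, mu, sigma):
    """Log-density of a univariate Gaussian N(mu, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def mc_kl(logp, logq, sampler, n=100_000, seed=0):
    """Monte Carlo estimate of KL(p || q) = E_p[log p(x) - log q(x)],
    using n samples x drawn from p via `sampler`."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sampler(rng)
        total += logp(x) - logq(x)
    return total / n

# Toy check: p = N(0, 1) as "target", q = N(1, 1) as "learned model".
# The closed-form KL between these two Gaussians is 0.5.
logp = lambda x: gauss_logpdf(x, 0.0, 1.0)
logq = lambda x: gauss_logpdf(x, 1.0, 1.0)
est = mc_kl(logp, logq, lambda rng: rng.gauss(0.0, 1.0))
```

In the setting described by the abstract, `logq` would be the log-density of the learned mixture (a log-sum-exp over the weighted component densities) and `sampler` would draw from the data-generating distribution, which is exactly what makes the estimate expensive beyond a couple hundred variables.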
Discussion : Pratiques évaluatives en classe et épreuves externes
in Congrès international. Actualité de la recherche en éducation et en formation. AREF 2010 (2010)
La discussion à visée démocratique et philosophique au prisme de la critique deleuzienne de la discussion. Analyse d'un verbatim.
Herla, Anne ;
in Diotime. Revue internationale de didactique de la philosophie. (2014, April), 60
Discussion concerning the possible direction of evolution of the EU dual-use export control framework: introductory remarks
Conference (2011, September 20)
Discussion de cas cliniques : un Lewis Sumner qui n'en est pas un (JNLF 2009)
Conference (2009)
Discussion de l'intervention de Georges Charbonneau, "Phénoménologie du délire"
Conference (2015, March 23)
Discussion de la communication faite par M. Demoor, membre titulaire, sous le titre: "Le mécanisme du rythme cardiaque".
in Bulletin de l'Académie Royale de Médecine de Belgique (1880)
Discussion des rapports de P. Fraisse et R. Chauvin
in Canestrelli, I. (Ed.) Le comportement : symposium de l'Association de psychologie de langue française, Rome, 1967 (1968)
Discussion des travaux du GRAFELITT sur l'enseignement de la littérature (Chr. Ronveaux & B. Schneuwly; Chloé Gabathuler)
De Croix, Séverine
Conference (2012)
Discussion du symposium intitulé : "L’étude des pratiques d’évaluation sommative des enseignants : une urgence pour comprendre les compétences professionnelles en jeu ? "
Conference (2016, January)
Symposium organized by Lucie Mottier Lopez and Raphaël Pasquini.
Discussion of "Sensitivity analysis of non-equilibrium adaptation parameters for modeling mining-pit migration"
Gouverneur, Ludovic ; Dewals, Benjamin ; Archambeau, Pierre et al
in Journal of Hydraulic Engineering (2013), 139(7), 799-801
Discussion of "Two-dimensional depth-averaged finite volume model for unsteady turbulent flows"
Erpicum, Sébastien ; Pirotton, Michel ; Archambeau, Pierre et al
in Journal of Hydraulic Research (2014), 52(1), 148-150