Full Text
Peer Reviewed
Architectural and Environmental Housing Typology Analysis in Huamachuco, Peru
Meliani, Houmam ULg; Teller, Jacques ULg; Attia, Shady ULg

in La Roche, Pablo; Schiler, Marc (Eds.) Passive and Low Energy Architecture Conference (2016, July 12)

This work focuses on the city of Huamachuco, a town 3200 meters above sea level in Northern Peru. The main aim of this study is to share and disseminate technological knowledge on the architecture, building technology and lifestyle of Huamachuco's inhabitants. The paper's objectives are to (1) highlight the concept of architectural quality in Huamachuco, (2) compare the living environment of existing traditional dwellings with that of newly constructed concrete dwellings, and (3) identify the reasons for neglecting ecological construction technologies and materials in the new built environment. The research methodology went through different phases, ranging from qualitative data collection and quantitative measurement to data analysis and the reporting of findings. Firstly, a typological study was conducted by visiting 110 houses. The typological study enabled us to describe and understand the housing typologies and to classify them under four major typologies according to their construction techniques: 1) adobe, 2) rammed earth, 3) concrete and 4) mixed technique. Secondly, detailed field studies were conducted on representative houses from the four categories. Finally, 10 houses were thoroughly audited using measuring equipment to collect data on temperature, relative humidity, carbon dioxide concentration, lighting intensity, and envelope thermography.

Peer Reviewed
The Methods of Management: An Answer to the Crisis
Robert, Jocelyne ULg

Conference (2016, July 12)

Full Text
Failure multiscale simulations of composite laminates based on a non-local mean-field damage-enhanced homogenization
Wu, Ling ULg; Adam, Laurent; Doghri, Issam et al

Conference (2016, July 12)

A multiscale method is developed to study the failure of carbon-fiber-reinforced composites. In order to capture intra-laminar failure, a non-local mean-field homogenization (MFH) method accounting for the damage evolution of the matrix phase of the composite material [1] is considered. In that formulation, an incremental-secant MFH approach is used to account for the elastic unloading of the fibers during the strain softening of the matrix. In order to avoid the strain/damage localization caused by the matrix material softening, an implicit non-local method [2] was reformulated to account for the composite material anisotropy. As a result, accurate predictions of the composite softening behavior and of the response of the different phases are possible, even for inclusion volume ratios around 60%. In particular, it is shown that the damage propagation direction in each ply follows the fiber orientation, in agreement with experimental data. The delamination process is modeled using a hybrid discontinuous Galerkin (DG)/extrinsic cohesive law approach. Because the extrinsic cohesive law (ECL) represents the fracturing response only, with cohesive elements inserted at failure onset, the method does not suffer from mesh dependency. Moreover, because of the underlying discontinuous Galerkin method, interface elements are present from the very beginning of the simulation, avoiding the need to propagate topological changes in the mesh as the delamination propagates. The pre-failure response is also accurately captured by the material law through the DG implementation, in contrast to usual intrinsic cohesive laws. As a demonstration of the efficiency and accuracy of the method, a composite laminate with a quasi-isotropic sequence ([90/45/-45/90/0]S) and an open-hole geometry is studied using the multiscale method [3], and the results are compared to experimental data.
The numerical model is found to predict the damage bands along the fiber directions as observed in the experimental samples inspected by X-ray computed tomography (XCT). Moreover, the predicted delamination pattern is found to match the experimental observations.
REFERENCES
[1] L. Wu, L. Noels, L. Adam, I. Doghri, An implicit-gradient-enhanced incremental-secant mean-field homogenization scheme for elasto-plastic composites with damage, International Journal of Solids and Structures, 50, 3843-3860, 2013.
[2] R. Peerlings, R. de Borst, W. Brekelmans, S. Ayyapureddi, Gradient-enhanced damage for quasi-brittle materials, International Journal for Numerical Methods in Engineering, 39, 3391-3403, 1996.
[3] L. Wu, F. Sket, J.M. Molina-Aldareguia, A. Makradi, L. Adam, I. Doghri, L. Noels, A study of composite laminates failure using an anisotropic gradient-enhanced damage mean-field homogenization model, Composite Structures, 126, 246-264, 2015.
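The implicit non-local regularization referenced above ([2], Peerlings et al.) amounts to solving a Helmholtz-type equation for the non-local strain measure. As a minimal illustrative sketch (a 1D isotropic finite-difference version, not the paper's anisotropic composite formulation):

```python
import numpy as np

def implicit_nonlocal(e_local, dx, length_scale):
    """Implicit gradient regularization: solve (1 - l^2 d2/dx2) e_nl = e_local
    on a uniform 1D grid with homogeneous Neumann boundary conditions.
    The result is a smoothed, mesh-objective non-local strain field."""
    n = len(e_local)
    c = (length_scale / dx) ** 2
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 1.0 + 2.0 * c
        if i > 0:
            A[i, i - 1] = -c
        if i < n - 1:
            A[i, i + 1] = -c
    # Neumann boundaries: fold the ghost node back into the diagonal
    A[0, 0] = 1.0 + c
    A[-1, -1] = 1.0 + c
    return np.linalg.solve(A, e_local)
```

The operator preserves a uniform field and the total of the local field while spreading localized peaks over the length scale `l`, which is what prevents damage from localizing into a single element.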

Full Text
Mean-Field-Homogenization-based stochastic multiscale methods for composite materials
Wu, Ling ULg; Lucas, Vincent ULg; Adam, Laurent et al

Conference (2016, July 12)

When considering a homogenization-based multiscale approach, at each integration point of the macro-structure the material properties are obtained from the resolution of a micro-scale boundary value problem. At the micro-level, the macro-point is viewed as the center of a Representative Volume Element (RVE). To be representative, however, the micro-volume element should have a size much bigger than the micro-structure size. For composite materials, which suffer from large dispersion in properties and geometry, either this requires RVE sizes that usually cannot be handled numerically, or the structural properties simply exhibit scatter at the macro-scale. In both cases, the representativity of the micro-scale volume element is lost, and Statistical Volume Elements (SVEs) [1] should be considered in order to account for the micro-structural uncertainties, which should in turn be propagated to the macro-scale in order to predict the structural properties in a probabilistic way. In this work we propose a non-deterministic multiscale approach for composite materials following the methodology set out in [2]. Uncertainties in the meso-scale properties and their spatial correlations are first evaluated through the homogenization of SVEs. This homogenization combines a mean-field method, for efficiency, with computational homogenization to evaluate the spatial correlation. A generator of the meso-scale material tensor is then implemented using the spectral method [3]. As a result, a meso-scale random field can be generated, paving the way to the use of stochastic finite elements to study the probabilistic behavior of macro-scale structures.
[1] M. Ostoja-Starzewski, X. Wang, Stochastic finite elements as a bridge between random material microstructure and global response, Computer Methods in Applied Mechanics and Engineering, 168, 35-49, 1999.
[2] V. Lucas, J.-C. Golinval, S. Paquay, V.-D. Nguyen, L. Noels, L. Wu, A stochastic computational multiscale approach; application to MEMS resonators, Computer Methods in Applied Mechanics and Engineering, 294, 141-167, 2015.
[3] M. Shinozuka, G. Deodatis, Simulation of stochastic processes by spectral representation, Applied Mechanics Reviews, 44(4), 191-204, 1991.
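The spectral-representation generator cited above ([3], Shinozuka-Deodatis) can be sketched for a 1D homogeneous Gaussian field; the squared-exponential power spectrum below is an illustrative assumption, not the meso-scale material spectrum used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_field(x, n_modes=512, k_max=20.0, corr_len=1.0, sigma=1.0):
    """1D zero-mean homogeneous Gaussian random field by spectral representation:
    f(x) = sum_n sqrt(2 G(k_n) dk) cos(k_n x + phi_n), with phi_n ~ U(0, 2*pi)
    and G the one-sided power spectral density (here of a squared-exponential
    covariance, chosen for illustration).  Each call draws a new realization."""
    dk = k_max / n_modes
    k = (np.arange(n_modes) + 0.5) * dk
    G = sigma**2 * corr_len / np.sqrt(np.pi) * np.exp(-(k * corr_len / 2.0) ** 2)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_modes)   # independent random phases
    amp = np.sqrt(2.0 * G * dk)
    return np.cos(np.outer(np.asarray(x), k) + phi) @ amp
```

By construction the field's variance approaches sigma**2 as the number of modes grows, so an ensemble of realizations reproduces the target first- and second-order statistics.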

Full Text
Peer Reviewed
A Novel Methodology for Hybrid Fire Testing
Sauca, Ana ULg; Gernay, Thomas ULg; Robert, Fabienne et al

in Proceedings of the 6th European Conference on Structural Control (2016, July 11)

This paper describes a novel methodology for conducting stable hybrid fire testing (HFT). During hybrid fire testing, only a part of the structure is tested in a furnace, while the remainder of the structure is computed separately, here by means of a predetermined matrix. Equilibrium and compatibility at the interface between the tested “physical substructure” and the “numerical substructure” are maintained throughout the test using a dedicated algorithm. The procedures developed so far are sensitive to the stiffness ratio between the physical and numerical substructures, and can therefore be applied only in some cases. In fire applications, the stiffness of the heated physical substructure may change dramatically, and the resulting change in stiffness ratio can lead to instability during the test. To overcome this drawback, a methodology independent of the stiffness ratio has been developed, inspired by the Finite Element Tearing and Interconnecting (FETI) method, originally developed for substructuring in numerical analyses. The novel methodology has been successfully applied to a hybrid fire test in a purely numerical environment, i.e. the physical substructure was also modelled numerically. It is shown that stability does not depend on the stiffness ratio and that equilibrium and compatibility can be consistently maintained at the interface during the fire. Finally, the ongoing experimental programme aimed at employing and experimentally validating this methodology is described.
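The key idea borrowed from FETI is to glue the physical and numerical substructures through interface forces (Lagrange multipliers) rather than through a stiffness-dependent displacement update. A minimal two-substructure sketch with a single shared interface degree of freedom (purely illustrative, not the paper's algorithm):

```python
def interface_solve(k_phys, k_num, f_phys, f_num):
    """Dual (FETI-like) coupling of two substructures sharing one interface
    DOF.  Each substructure solves k_i * u_i = f_i +/- lam independently; the
    interface force lam is chosen so that compatibility u_phys == u_num holds."""
    flexibility = 1.0 / k_phys + 1.0 / k_num          # interface operator
    lam = (f_num / k_num - f_phys / k_phys) / flexibility
    u = (f_phys + lam) / k_phys                       # common interface displacement
    return u, lam

# e.g. a heated, softening physical part coupled to a much stiffer numerical part
u, lam = interface_solve(k_phys=2.0, k_num=2000.0, f_phys=1.0, f_num=5.0)
```

A short calculation shows the coupled displacement equals the monolithic solution (f_phys + f_num) / (k_phys + k_num) for any stiffness ratio, which is the stability property the abstract highlights.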

Hieroglyphic encoding: The Thot Sign-List (TSL) data model and the hieroglyphic signs in Unicode
Polis, Stéphane ULg; Rosmorduc, Serge

Conference (2016, July 11)

In this paper, we review issues related to the existing hieroglyphic sign-lists, focusing especially on the problematic aspects of the 'Manuel de Codage' (1988) and of the so-called 'Hieroglyphica' (2000) for font designers and users alike. We propose and discuss a data model for the Thot Sign-List (TSL, in prep.). This data model will lead to the implementation of a structured and systematically referenced hieroglyphic repertoire, which should ultimately allow a sound extension of the Egyptian Hieroglyphs in Unicode.
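A structured, systematically referenced sign record of the kind described could look as follows; this is a purely hypothetical sketch, and every field name below is illustrative, not the actual TSL schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SignEntry:
    """One record of a structured hieroglyphic sign list (hypothetical schema)."""
    sign_id: str                  # stable internal identifier
    gardiner_code: str            # legacy 'Manuel de Codage' code, e.g. "A1"
    unicode_char: Optional[str]   # char in the Egyptian Hieroglyphs block, if encoded
    functions: List[str] = field(default_factory=list)  # attested values/uses
    variants: List[str] = field(default_factory=list)   # graphic variants, by sign_id

# U+13000 EGYPTIAN HIEROGLYPH A001, the first sign of the Unicode block
a1 = SignEntry("TSL_A1", "A1", "\U00013000", functions=["logogram: man"])
```

Keeping the legacy code, the Unicode codepoint, and the functional description as separate fields is what allows a repertoire like this to feed a well-grounded extension proposal for the Unicode block.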

Peer Reviewed
Computerized adaptive testing and multistage testing with R
Magis, David ULg; Yan, Duanli; von Davier, Alina

Conference (2016, July 11)

The goal of this workshop is to provide a practical (and brief) overview of the theory of computerized adaptive testing (CAT) and multistage testing (MST), and to illustrate the methodologies and applications using the open-source R language and several data examples. The implementations rely on the R packages catR and mstR, which have already been, or are currently being, developed, and include some of the newest research algorithms developed by the authors. The workshop will cover several topics: the basics of R, a theoretical overview of CAT and MST, CAT and MST designs, assembly methodologies, the catR and mstR packages, simulations and applications. The intended audience is undergraduate/graduate students, faculty, researchers, practitioners at testing institutions, and anyone in psychometrics, measurement, education, psychology or other fields who is interested in computerized adaptive and multistage testing, especially in practical implementations of simulations using R.
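catR and mstR are R packages; as a language-neutral illustration of the adaptive logic a CAT simulation involves (maximum-information item selection with an EAP ability update under a Rasch model), here is a minimal Python sketch, not the catR API:

```python
import numpy as np

def rasch_prob(theta, b):
    """P(correct response) under the Rasch (1PL) model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def eap(grid, items, responses, b):
    """Expected-a-posteriori ability estimate under a standard-normal prior."""
    post = np.exp(-0.5 * grid**2)
    for i, x in zip(items, responses):
        p = rasch_prob(grid, b[i])
        post *= p**x * (1.0 - p) ** (1 - x)
    return float(np.sum(grid * post) / np.sum(post))

def run_cat(true_theta, b, n_items=20, seed=0):
    """Administer n_items adaptively: pick the most informative unused item
    at the current ability estimate, score it, and re-estimate ability."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(-4.0, 4.0, 161)
    items, responses, theta = [], [], 0.0
    for _ in range(n_items):
        p = rasch_prob(theta, b)
        info = p * (1.0 - p)          # Rasch item information at theta
        info[items] = -np.inf         # never reuse an administered item
        nxt = int(np.argmax(info))
        x = int(rng.random() < rasch_prob(true_theta, b[nxt]))  # simulated answer
        items.append(nxt)
        responses.append(x)
        theta = eap(grid, items, responses, b)
    return theta, items

bank = np.linspace(-3.0, 3.0, 61)     # hypothetical bank of 61 item difficulties
est, used = run_cat(true_theta=1.0, b=bank)
```

In catR the same simulation is a call to `randomCAT`-style routines with a choice of selection and estimation rules; the loop above just makes the select-score-update cycle explicit.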

Peer Reviewed
Gestion des flux patients et surpopulation des urgences : Heurs et malheurs de la fonction de « Bed Manager »
GILLET, Aline ULg; Minder, Anaïs ULg; Nyssen, Anne-Sophie ULg et al

Conference (2016, July 11)

For many years, emergency department (ED) overcrowding has been a major public health issue. Many studies have demonstrated the efficiency of coordinated flow management for this recurrent problem, by offering an interface between the ED, the hospital and out-of-hospital structures, and by coordinating patients' movements towards hospital care units. This was the basis for the implementation of a "bed management" coordination programme in the ED of the University Hospital of Liège in January 2014. The present study evaluates the adequacy of the Bed Manager (BM) activity with the actual ED and hospital workload. Our results describe the rate of intra-hospital patient transfers according to the adequacy of the destination unit, and the time delays for these transfers. Head nurses from specific care units were interviewed about their perceptions of the BM activity. We are now convinced of the importance of a participative approach in the development of ED bed management and working procedures, as well as of the usefulness of further studies to explore this complex activity.

Peer Reviewed
Optimal returnable transport items management
Limbourg, Sabine ULg; Martin, Adeline; Paquay, Célia ULg

Conference (2016, July 11)

Reducing environmental impact, related regulations and the potential for operational benefits are the main reasons why companies share their returnable transport items (RTIs) among the different partners of a closed-loop supply chain. Face-to-face interviews with senior executives from seven companies involved with RTIs were conducted to gain a thorough understanding of how they manage the flow of RTIs and how they determine the number of RTIs needed in a fleet. Results indicate that RTI management practices are quite diverse, that some common beliefs about RTIs do not apply to all RTI types, and that research efforts are needed in the areas of RTI acquisition, warehouse layout, the inventory routing problem, production planning and control, and tracking and scheduling.

Full Text
Peer Reviewed
Counting time measurement and statistics in gamma spectrometry: the balance
Guembou Shouop, Cébastien Joel ULg; Ndontchueng Moyo, Maurice; Chene, Grégoire ULg et al

Poster (2016, July 11)

Nuclear counting statistics at high count rates are assessed on a γ-ray spectrometer set-up. Our typical gamma spectrometry system consists of a High Purity Germanium (HPGe) detector, a liquid-nitrogen cooling system, a preamplifier, a detector bias supply, a linear amplifier, an analog-to-digital converter (ADC), multichannel storage of the spectrum, and data readout devices. Although the system is powerful enough for background measurements, obtaining good statistics in a short measurement time remains a challenge. The purpose of this study was to determine the average time required for a gamma spectrometry measurement. To detect uranium, thorium and their respective daughters, as well as potassium, with a relative error of less than 1%, it was found necessary to count for a minimum of 24 hours (86,400 s). This result is in accordance with the literature for a planar-geometry detector. These results lead us to three guidelines for selecting the detector best suited to an application: 1. The more detector material available (germanium semiconductor), the higher the full-energy peak efficiency. 2. The smaller the distance between the detector and the source material, the higher the full-energy peak efficiency. 3. While better resolution gives a better minimum detectable activity (MDA), the resolution contributes only as a square root to the MDA value, whereas the MDA is proportional to the full-energy peak efficiency. This idea arose from comparing 12-hour daytime spectra of a sample with night-time spectra of the same sample, which do not fully coincide. After several investigations, the conclusion became clearer: to remove all effects of external radiation (earth, sun and cosmic rays) on the system, the background must be measured for 24, 48 or 72 hours. In the same way, the samples have to be measured for 24, 48 or 72 hours so that the measurement averages over equal day and night periods. A background measured in winter should likewise not be used in summer. Whatever the energy of the radionuclide sought, it is clear that the most important steps of a gamma spectrometry measurement are the preparation of the sample and the calibration of the detector.
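The ~24-hour figure follows from Poisson counting statistics: the relative standard deviation of N counts is 1/sqrt(N), so 1% precision requires N = 10,000 counts in the peak. A small sketch (the 0.12 counts/s net peak rate below is an illustrative assumption, not a value measured in the poster):

```python
def counting_time(net_rate_cps, rel_err):
    """Time needed for a net peak acquired at net_rate_cps counts/s to reach
    a Poisson relative uncertainty of rel_err, using sigma_N / N = 1/sqrt(N)."""
    n_needed = 1.0 / rel_err**2        # N = (1 / rel_err)^2 counts
    return n_needed / net_rate_cps     # seconds

# a weak natural-series line at ~0.12 counts/s needs roughly a day for 1 %
t_seconds = counting_time(0.12, 0.01)
t_hours = t_seconds / 3600.0
```

With that assumed rate the required time comes out near 23 hours, consistent with the ~24-hour minimum quoted in the abstract.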

Full Text
Ballkani Perëndimor: terren veprimi i Bashkimit europian dhe fuqive emergjente?
Lika, Liridon ULg

Scientific conference (2016, July 11)

Full Text
Real Estate Architecture: revisiting the apartment building
Joachim, Guillaume ULg; Burquel, Benoît ULg; Dumont, Martin

Learning material (2016)

Liège, early 1960s. As in other European cities, the real estate boom has just begun. As good modernists, the mayor and a group of private developers sought to renew the housing stock by replacing the old urban fabric with a new housing type: the apartment building. Within 15 years, most of the 19th-century houses along the quays and the boulevards were substituted by 12-storey modernist constructions. This transformation radically changed the skyline and the image of the city. After 50 years, these buildings have come to the end of a life cycle. With the rising cost of energy, changes in society and the evolution of norms and standards, these modern 'icons' are becoming outdated. There is a growing need to reconsider this typology and to think about its possible future. The summer school will explore how architectural interventions could revive this important housing stock.

Full Text
PegOpera Version 6.1 PRO
MAGERMANS, POL ULg; Everbecq, Etienne ULg; Grard, Aline ULg et al

Software (2016)

The PegOpera software is a scientific and operational tool used, among others, by water managers in the fields of Integrated Water Resources Management. Developed at the Aquapôle of the University of Liège, it is composed of:
- the Pegase model (Planification Et Gestion de l'ASsainissement des Eaux), an integrated river-basin/river model that computes river water quality in a deterministic and predictive way, as a function of pollution discharges and inputs (pressure-impact relationship);
- a user-friendly interface allowing its easy use.
Developed since the late 1980s at the University of Liège, the Pegase model helps guide the choices of public and private operators in surface-water management, at the scale of both small and large catchments. Version 6.1 of the software integrates the developments carried out during the second year of the Pegase Opera II research programme, supported by four French water agencies. The main new developments integrated in version 6.1 concern:
- new global validation tools;
- the organization of the object-oriented Pegase 6 model into modules;
- new data-import tools;
- tools to use "reconstituted discharges";
- the migration of some data files to XML format, allowing the simulation of new processes and new substances.

Full Text
PEGASE OPERA II PRÉSENTATION DES TRAVAUX EFFECTUÉS L’ANNÉE 2
Everbecq, Etienne ULg; MAGERMANS, POL ULg; Grard, Aline ULg et al

Report (2016)

The PegOpera software is a scientific and operational tool used, among others, by water managers in the fields of Integrated Water Resources Management. Developed at the Aquapôle of the University of Liège, it is composed of:
- the Pegase model (Planification Et Gestion de l'ASsainissement des Eaux), an integrated river-basin/river model that computes river water quality in a deterministic and predictive way, as a function of pollution discharges and inputs (pressure-impact relationship);
- a user-friendly interface allowing its easy use.
Developed since the late 1980s at the University of Liège, the model helps guide the choices of public and private operators in surface-water management, at the scale of both small and large catchments. The model and the software are under continuous development. In July 2014, four French water agencies decided to continue supporting their development, with two main objectives: bringing scientific improvements to deepen knowledge of the ecological functioning of watercourses, and keeping the tool efficient and technologically up to date, in line with operators' needs. This three-year programme (2014-2016), called PEGASE OPERA 2, is organized around three main axes:
- Axis 1: scientific developments of the PEGASE model (process modelling);
- Axis 2: modularity and interoperability of the PEGOPERA software;
- Axis 3: improvement of the PEGOPERA software's functionalities.
This report describes the tasks carried out during the second year.
