Analysis of high frequency geostationary ocean colour data using DINEOF
Alvera Azcarate, Aïda; et al. In: Estuarine, Coastal and Shelf Science (2015), 159.

DINEOF (Data Interpolating Empirical Orthogonal Functions), a technique to reconstruct missing data, is applied to turbidity data obtained through the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board Meteosat Second Generation 2. The aim of this work is to assess whether the tidal variability of the southern North Sea in 2008 can be accurately reproduced in the reconstructed dataset. Such high frequency data have not previously been analysed with DINEOF and present new challenges, such as a strong tidal signal and long night-time gaps. An outlier detection approach that exploits the high temporal resolution (15 min) of the SEVIRI dataset is developed. After removal of outliers, the turbidity dataset is reconstructed with DINEOF. In situ SmartBuoy data are used to assess the accuracy of the reconstruction. A series of tidal cycles is then examined at various positions over the southern North Sea. These examples demonstrate the capability of DINEOF to reproduce tidal variability in the reconstructed dataset, and show the high temporal and spatial variability of turbidity in the southern North Sea. An analysis of the main harmonic constituents (annual cycle, daily cycle, M2 and S2 tidal components) is performed to assess the contribution of each of these modes to the total variability of turbidity. The variability not explained by the harmonic fit, arising from natural processes and from satellite processing errors that appear as noise, is also assessed.
EOF analysis of long-term reconstructed AVHRR Pathfinder SST in the South China Sea
Huynh, Thi Hong Ngu; Alvera Azcarate, Aïda; Barth, Alexander; et al. Poster (2014, May 02).

Sea surface temperature (SST) is one of the key variables used to investigate ocean dynamics, ocean-atmosphere interaction, and climate change. In recent decades the AVHRR Pathfinder SST, measured by infrared sensors, has been widely used because of its high resolution and long time series. Its disadvantage is a high percentage of missing data due to cloud coverage, which is all the more serious in the South China Sea (SCS) because the basin lies in the tropics and is frequently covered by clouds. In this study, we used the Data INterpolating Empirical Orthogonal Functions (DINEOF) method to reconstruct daily night-time 4 km AVHRR Pathfinder SST spanning 1989 to 2009 for the whole SCS. To better understand the spatial and temporal variability of the SCS SST, an EOF analysis of the reconstructed field is performed in association with surface wind. The first SST mode, accounting for 69% of the variance, presents the cooling (warming) of the basin due to the solar inclination through the seasons, water exchange, topography, and monsoon-induced cyclonic circulation. The second SST mode, explaining 24.8% of the variance, shows the advection of cold and warm water from two opposite directions along the southwest-northeast diagonal of the basin; it is affected by the atmospheric anticyclone (cyclone) located over the Philippine Sea. Comparing both SST modes with the Niño 3.0 index shows that the interannual variability of the SCS SST is influenced by moderate and strong ENSO events with a lag of 5-6 months. Moreover, the analysis of the high-resolution reconstructed dataset reveals oceanic features that could not be captured in previous EOF analyses.

Bias correction using data assimilation: application to the Lorenz '95 and NEMO-LIM models
Canter, Martin; Barth, Alexander. Poster (2014, May 01).

Data assimilation has been used for decades in fields like engineering and signal processing to improve forecast models. Ensemble Kalman filters and other sequential data assimilation methods are examples of developments which reduce the uncertainty of the model by taking observations into account. Widespread interest in addressing systematic forecast model errors only arose when advances in modelling, data assimilation and computational power had reduced random errors to the point of commensurability with systematic errors, also known as bias. We present here a new method of bias correction using data assimilation, based on stochastic forcing of the model. First, through a preliminary run, we estimate the bias of the model and its possible sources. Then, we establish a forcing term which is added directly inside the model's equations. We create an ensemble of runs and treat the forcing term as a control variable during the assimilation of observations. We then use this analysed forcing term to correct the bias of the model. Since the forcing is added inside the model, it acts as a source term, unlike external forcings such as wind. This procedure has been developed and successfully tested with a twin experiment on a Lorenz '95 model: we were able to estimate and recover an artificial bias that had been added to the model. This bias had a spatial structure and was constant in time, and the mean and behaviour of the corrected model corresponded to those of the reference model. The method is currently being applied and tested on the sea-ice-ocean model NEMO-LIM, which is used in the PredAntar project. NEMO-LIM is a global, low resolution (2 degrees) coupled model (hydrodynamic model and sea ice model) with long time steps allowing simulations over several decades. Due to its low resolution, the model is subject to bias in areas where strong currents are present. We aim at correcting this bias by using perturbed current fields from higher resolution models and randomly generated perturbations.

Assimilation of ARGO temperature profile, sea surface temperature and altimetric satellite data into an eddy permitting primitive equation model of the North Atlantic Ocean
Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; et al. Poster (2014, May 01).

Data-Interpolating Variational Analysis (DIVA) software: recent development and application
Watelet, Sylvain; Barth, Alexander; Troupin, Charles; et al. Poster (2014, April).

Reconstruction of the Gulf Stream since 1900 and correlation with the North Atlantic Oscillation
Watelet, Sylvain; Beckers, Jean-Marie; Barth, Alexander; et al. Conference (2014, April).

Local ensemble assimilation scheme with global constraints and conservation
Barth, Alexander; Yan, Yajing; Canter, Martin; et al. Poster (2014, April).

Ensemble assimilation schemes applied in their original, global formulation have no problem respecting linear conservation properties if the ensemble perturbations are set up accordingly. For realistic ocean systems, however, only a relatively small number of ensemble members can be calculated, so a localization of the ensemble increment is necessary to filter out spurious long-range correlations. The conservation of a global property is then lost if the assimilation is performed locally, since conservation requires a coupling between model grid points which is filtered out by the localization. In the ocean, the distribution of observations is highly inhomogeneous. Systematic errors in the observed part of the ocean state can lead to spurious systematic adjustments of the non-observed part through data assimilation; as a result, global properties which should be conserved increase or decrease in long-term simulations. We propose an assimilation scheme (with stochastic or deterministic analysis steps) which is formulated globally (i.e. for the whole state vector) but in which spurious long-range correlations can be filtered out. The scheme can thus be used to enforce global conservation properties and non-local observation operators. Both aspects are linked, since global conservation can be introduced as a weak constraint by using a global observation operator: the conserved property becomes an observed value. The proposed scheme is tested with the Kuramoto-Sivashinsky model, which is conservative, and its benefit over the traditional covariance localization scheme (with an ad hoc step enforcing conservation), in which observations are assimilated sequentially, is shown. The assimilation scheme is well suited to parallel computers on which the number of available computing cores is a multiple of the ensemble size.
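The idea of treating a conserved global quantity as an extra "observation" with a non-local operator can be sketched with a stochastic (perturbed-observation) ensemble Kalman analysis step. This is a schematic stand-in, not the authors' scheme: the state, ensemble, and the choice of the global sum as the conserved quantity are all synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 40, 20                     # state size, ensemble size

# Hypothetical ensemble: a smooth truth plus random spread.
truth = np.sin(np.linspace(0, 2 * np.pi, n))
ens = truth[:, None] + rng.normal(0, 0.5, (n, N))

# One local observation (state element 10) plus the global sum,
# introduced as a pseudo-observation with tiny error variance,
# i.e. conservation enforced as a (nearly hard) weak constraint.
H = np.zeros((2, n))
H[0, 10] = 1.0                    # local measurement operator
H[1, :] = 1.0                     # global sum: the conserved quantity
R = np.diag([0.1**2, 1e-6])
y = np.array([truth[10], truth.sum()])

# Stochastic EnKF analysis step with perturbed observations.
A = ens - ens.mean(axis=1, keepdims=True)
HA = H @ A
K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (N - 1) * R)
yp = y[:, None] + rng.multivariate_normal(np.zeros(2), R, N).T
ens_a = ens + K @ (yp - H @ ens)

print("member totals before:", ens.sum(axis=0).std())
print("member totals after :", ens_a.sum(axis=0).std())
```

After the analysis, every member's total is pinned near the "observed" conserved value, while the local observation is assimilated in the usual way.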
Web-based application for Data INterpolating Empirical Orthogonal Functions (DINEOF) analysis
Tomazic, Igor; Alvera Azcarate, Aïda; Barth, Alexander; et al. Poster (2014, April).

DINEOF (Data INterpolating Empirical Orthogonal Functions) is a powerful tool based on EOF decomposition, developed at the University of Liège/GHER for the reconstruction of missing data in satellite datasets, as well as for the reduction of noise and the detection of outliers. DINEOF is openly available as a series of Fortran routines to be compiled by the user, and as binaries (which can be run directly without any compilation) for both Windows and Linux platforms. In order to facilitate the use of DINEOF and increase the number of interested users, we developed a web-based interface for DINEOF exposing the parameters needed to run a high-quality DINEOF analysis. This includes choosing a variable within the selected dataset, defining a domain and time range, setting filtering criteria based on the variables available in the dataset (e.g. quality flag, satellite zenith angle, ...) and defining the necessary DINEOF parameters. Results, including the reconstructed data and the calculated EOF modes, will be disseminated in NetCDF format through OPeNDAP and a WMS server, allowing easy visualisation and analysis. Initially we will include several satellite datasets of sea surface temperature and chlorophyll concentration obtained from the MyOcean data centre and already remapped to a regular grid (L3C). Later, based on user requests, we plan to extend the number of datasets available for reconstruction.
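The core DINEOF idea recurring in these abstracts — filling gaps by iterating a truncated SVD until the reconstruction converges — can be sketched in a few lines. This is a toy illustration only: the GHER Fortran implementation selects the optimal number of modes by cross-validation, whereas here the rank and the synthetic field are fixed assumptions.

```python
import numpy as np

def dineof_sketch(X, n_modes=2, n_iter=50):
    """Fill missing values (NaN) by iterating a truncated SVD.

    Minimal illustration of the DINEOF principle; the real tool chooses
    n_modes by cross-validation and removes the mean first.
    """
    mask = np.isnan(X)
    Xf = np.where(mask, 0.0, X)          # first guess: zero in the gaps
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Xf, full_matrices=False)
        recon = U[:, :n_modes] * s[:n_modes] @ Vt[:n_modes]
        Xf[mask] = recon[mask]           # update only the missing points
    return Xf

# Synthetic rank-2 "satellite" field with 30% of the pixels missing.
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 100)
X = np.outer(np.sin(t), np.cos(t)) + 0.5 * np.outer(np.cos(2 * t), np.sin(3 * t))
X_obs = X.copy()
X_obs[rng.random(X.shape) < 0.3] = np.nan
X_rec = dineof_sketch(X_obs)
err = np.sqrt(np.nanmean((X_rec - X) ** 2))
print(f"reconstruction RMSE: {err:.4f}")
```

Observed pixels are left untouched; only the gaps are repeatedly re-estimated from the leading EOF modes.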
Approximate and Efficient Methods to Assess Error Fields in Spatial Gridding with Data Interpolating Variational Analysis (DIVA)
Beckers, Jean-Marie; Barth, Alexander; Troupin, Charles; et al. In: Journal of Atmospheric and Oceanic Technology (2014), 31(2), 515-530.

We present new approximate methods to provide error fields for the spatial analysis tool DIVA. It is first shown how to replace the costly analysis of a large number of covariance functions by a single analysis for quick error computations. Another method is then presented in which the error is calculated at only a small number of locations, and the spatial error field itself is interpolated from these by the analysis tool. The efficiency of the methods is illustrated on simple schematic test cases and on a real application in the Mediterranean Sea. These examples show that the methods allow quick masking of regions void of sufficient data and the production of "exact" error fields at reasonable cost. The error-calculation methods can also be generalized for use with other analysis methods such as 3D-Var and are therefore potentially interesting for other implementations.
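The second idea above — evaluate the analysis error at a few locations only and spatially interpolate the error field itself — can be illustrated with plain optimal interpolation in 1-D. Note the hedges: DIVA never forms its covariance explicitly, so the Gaussian covariance, the correlation length and the coarse-to-fine interpolation via `np.interp` below are stand-in assumptions, not the paper's method.

```python
import numpy as np

# Stand-in background covariance: Gaussian with correlation length L.
L, sig2, noise2 = 0.2, 1.0, 0.1

def cov(a, b):
    return sig2 * np.exp(-((a[:, None] - b[None, :]) / L) ** 2)

xobs = np.array([0.1, 0.3, 0.35, 0.8])      # observation locations
xg = np.linspace(0, 1, 201)                 # analysis grid

# Exact OI analysis-error variance on the full grid:
#   e(x) = sig2 - c(x)^T (C + noise2 I)^{-1} c(x)
Kinv = np.linalg.inv(cov(xobs, xobs) + noise2 * np.eye(xobs.size))
Cgo = cov(xg, xobs)
err_exact = sig2 - np.einsum("ij,jk,ik->i", Cgo, Kinv, Cgo)

# Cheap variant: evaluate the error at every 20th grid point only,
# then interpolate the error field itself.
xs = xg[::20]
Cso = cov(xs, xobs)
err_sparse = sig2 - np.einsum("ij,jk,ik->i", Cso, Kinv, Cso)
err_approx = np.interp(xg, xs, err_sparse)
print("max approximation error:", np.abs(err_approx - err_exact).max())
```

The error field varies on the scale of the correlation length, which is why a coarse sampling of it interpolates well; the same reasoning underlies the paper's cost saving.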
Comparison of different assimilation schemes in a sequential Kalman filter assimilation system
Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie. In: Ocean Modelling (2014), 73.

Assimilation of HF radar surface currents to optimize forcing in the northwestern Mediterranean Sea
In: Nonlinear Processes in Geophysics (2014), 21.

HF radar measurements are used to optimize surface wind forcing and baroclinic open boundary condition forcing in order to constrain modelled coastal surface currents. This method is applied to a northwestern Mediterranean (NWM) regional primitive equation model configuration. A new radar data set, provided by two radars deployed in the Toulon area (France), is used; to our knowledge, this is the first time that radar measurements of the NWM Sea are assimilated into a circulation model. Special attention has been paid to the improvement of the modelled coastal current in terms of speed and position. The data assimilation method uses an ensemble Kalman smoother to optimize the forcing so as to improve the model trajectory. Twin experiments are first performed to evaluate the skill of the method. Real measurements are then fed into the circulation model, and significant improvements to the modelled surface currents, when compared to observations, are obtained.

divand-1.0: n-dimensional variational data analysis for ocean observations
Barth, Alexander; Beckers, Jean-Marie; Troupin, Charles; et al. In: Geoscientific Model Development (2014), 7.

A tool for multidimensional variational analysis (divand) is presented. It allows the interpolation and analysis of observations on curvilinear orthogonal grids in an arbitrarily high dimensional space by minimizing a cost function. This cost function penalizes the deviation from the observations, the deviation from a first guess, and abruptly varying fields, based on a given correlation length (potentially varying in space and time). Additional constraints can be added to this cost function, such as an advection constraint which forces the analysed field to align with the ocean current. The method naturally decouples disconnected areas based on topography and topology, which is useful in oceanography where disconnected water masses often have different physical properties. Individual elements of the a priori and a posteriori error covariance matrices can also be computed, in particular the expected error variances of the analysis. A multidimensional approach (as opposed to stacking 2-dimensional analyses) has the benefit of providing a smooth analysis in all dimensions, although the computational cost is increased. Primal (problem solved in the grid space) and dual (problem solved in the observational space) formulations are implemented, using either direct solvers (based on Cholesky factorization) or iterative solvers (conjugate gradient method). In most applications the primal formulation with the direct solver is the fastest, especially if an a posteriori error estimate is needed; for correlated observation errors, however, the dual formulation with an iterative solver is more efficient. The method is tested using pseudo-observations from a global model, with the distribution of the observations based on the positions of the ARGO floats. The benefit of a 3-dimensional analysis (longitude, latitude and time) compared to a 2-dimensional analysis (longitude and latitude) and the role of the advection constraint are highlighted. The tool divand is free software, distributed under the terms of the GPL license (http://modb.oce.ulg.ac.be/mediawiki/index.php/divand).

Multi-scale optimal interpolation: application to DINEOF analysis spiced with a local optimal interpolation
Beckers, Jean-Marie; Barth, Alexander; Tomazic, Igor; et al. In: Ocean Science Discussions (2014), 11.

We present a method in which the optimal interpolation of multi-scale processes can be untangled into a succession of simpler interpolations. First, we prove how the optimal analysis of a superposition of two processes can be obtained by different mathematical formulations involving iterations and analyses focusing on a single process. From the different mathematically equivalent formulations, we then select the most efficient ones by analysing their behaviour in a simple and well-controlled test case. The clear guidelines deduced from this experiment are then applied in a real situation in which we combine a large-scale analysis of hourly SEVIRI satellite images using DINEOF with a local optimal interpolation using a Gaussian covariance. It is shown that the optimal combination indeed provides the best reconstruction, and it can therefore be exploited to extract the maximum amount of useful information from the original data.

Interpolation of SLA Using the Data-Interpolating Variational Analysis in the Coastal Area of the NW Mediterranean Sea
Troupin, Charles; Barth, Alexander; Beckers, Jean-Marie; et al. Poster (2013, October 07).

The spatial interpolation of along-track Sea-Level Anomaly (SLA) data to produce gridded maps has numerous applications in oceanography (model validation, data assimilation, eddy tracking, ...). Optimal Interpolation (OI) is often the preferred method for this task, as it leads to the lowest expected error and provides an error field associated with the analysed field. However, the method suffers from limitations such as its numerical cost (due to the inversion of covariance matrices) and the isotropic covariance function generally employed in altimetry. The Data-Interpolating Variational Analysis (DIVA) is a gridding method based on the minimization of a cost function using a finite-element technique. The cost function penalizes the departure from observations, the smoothness of the gridded field and physical constraints (advection, diffusion, ...). It has been shown that DIVA and OI are equivalent (provided some assumptions on the covariances are made); the main difference is that in DIVA the covariance function is not explicitly formulated. The technique has previously been applied to the creation of regional hydrographic climatologies, which required the processing of a large number of data points. In this work we present the application and adaptation of DIVA to the analysis of SLA in the Mediterranean Sea and the production of weekly SLA maps for this region. The peculiarities of SLA along-track data are addressed:

- Number of observations: the finite-element technique, coupled with improvements in the matrix inversion (parallel or iterative solvers), decreases the computational time, so that sub-sampling of the initial data set is not required.
- Quality of the different missions: the weight attributed to each data point can easily be set according to the satellite that provided the observation, so that different measurement noise variances are taken into account.
- Spatial correlation scale: it varies spatially over the domain according to the value of the Rossby radius of deformation.
- Long-wavelength errors: each data point is assigned a class, and a detrending technique determines the trend for each class, reducing the inconsistencies between missions.
- Anisotropy of physical coastal features: a pseudo-velocity field derived from the regional bathymetry enhances the correlations along the main currents. Particular attention will be paid to the influence of this constraint in the coastal area.

The analysis and error fields obtained over the Mediterranean Sea are compared with the available gridded products from AVISO, and different ways to compute the error field are compared. The impact of using multiple missions to prepare the gridded fields is also examined. In situ measurements from an intensive multi-sensor experiment carried out north of the Balearic Islands in May 2009 serve to assess the quality of the gridded fields in the coastal area.
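The variational cost function at the heart of DIVA and divand — data misfit plus a penalty on roughness, minimized by a linear solve — has a simple discrete analogue. The sketch below is a toy 1-D version under stated assumptions: a first-difference roughness penalty on a regular grid, an illustrative weight `mu`, and synthetic observations; the real tools penalize higher derivatives on a finite-element mesh and support advection constraints.

```python
import numpy as np

# Toy 1-D analogue of the variational cost function:
#   J(f) = mu * |S f - y|^2 + |D f|^2
# where S samples the field at observation points and D is a
# first-difference (roughness) operator.
n = 200
x = np.linspace(0, 1, n)
obs_idx = np.array([20, 60, 90, 150, 180])      # illustrative positions
y = np.sin(2 * np.pi * x[obs_idx])              # synthetic observations

mu = 50.0                        # observation weight (signal-to-noise-like)
D = np.diff(np.eye(n), axis=0)   # first-difference operator, (n-1, n)
S = np.zeros((obs_idx.size, n))  # sampling (observation) operator
S[np.arange(obs_idx.size), obs_idx] = 1.0

# Minimizing J leads to the normal equations (mu S'S + D'D) f = mu S'y.
A = mu * S.T @ S + D.T @ D
f = np.linalg.solve(A, mu * S.T @ y)
print("max misfit at observations:", np.abs(f[obs_idx] - y).max())
```

Larger `mu` pulls the analysis closer to the data; smaller `mu` favours smoothness, which is the signal-to-noise trade-off the abstracts discuss.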
WP8 and WP9 developments: Data-Interpolating Variational Analysis (Diva) developments
Troupin, Charles; Barth, Alexander; et al. Conference (2013, September 27).

Variational data analysis for generating ocean climatologies (DIVA) and web-based distribution of data products (OceanBrowser)
Barth, Alexander; Troupin, Charles; Alvera Azcarate, Aïda; et al. Conference (2013, September 25).

Application of the Data-Interpolating Variational Analysis (DIVA) to sea-level anomaly measurements in the Mediterranean Sea
Troupin, Charles; Barth, Alexander; Beckers, Jean-Marie; et al. Poster (2013, September 23).

In ocean sciences, numerous techniques are available for the spatial interpolation of in situ data; they differ mainly in their mathematical formulation and numerical efficiency. Among them is DIVA, which is based on the minimization of a cost function using a finite-element technique (figure 1). The cost function penalizes the departure from observations and the roughness of the gridded field, and can also include physical constraints. The technique is particularly well adapted to the creation of climatologies and has been applied to several regional seas and parts of the ocean to generate hydrographic climatologies. Sea-level anomalies (SLA) can be deduced from satellite-borne altimeters. The measurements are characterized by a high spatial resolution along the satellite tracks but often a large distance between neighbouring tracks, which makes the use of simultaneous altimetry missions necessary for the construction of gridded maps. An along-track long-wavelength error (correlated noise, e.g. due to orbit, residual tidal correction or inverse barometer errors) also affects the measurements and has to be taken into account in the interpolation. In this work we present the application and adaptation of DIVA to the analysis of SLA in the Mediterranean Sea and the production of weekly SLA maps for this region.
Determination of the parameters. The two main parameters that determine an analysis with DIVA are the correlation length (L) and the signal-to-noise ratio (SNR). Because of the particular spatial distribution of the measurements, the tools implemented in DIVA for determining the analysis parameters tend to underestimate L and overestimate the SNR, leading to noisy analyses (the observation constraint dominates the regularity constraint); some adaptations of the tools are necessary to solve this issue.
Numerical cost. Because of the large number of observations to be processed (in comparison with in situ measurements over a similar period), the interpolation method must be numerically efficient. Improvements in the implementation of DIVA further increased the numerical performance of the method, especially through the use of a parallel solver for the matrix inversion. The performance of the finite-element mesh generator was also enhanced, so that the interpolation of a data set of more than 1 million data points on a 100-by-100 grid can be performed in a few minutes on a personal laptop.
Analysis and error field. The analysis and error fields obtained over the Mediterranean Sea are compared with the available gridded products from AVISO. Different ways to compute the error field are compared, and the impact of using multiple missions to prepare the gridded fields is also examined.

Derivation of high resolution TSM data by merging geostationary and polar-orbiting satellite data in the North Sea
Alvera Azcarate, Aïda; Barth, Alexander; et al. Conference (2013, September 09).

There is a need for ocean colour data at high resolution, both in space and time, for a better assessment of the variability of these data and of their influence on the environment, especially in shallow areas where factors such as tides and wind play a role in their dynamics. High spatial resolution is achieved by polar-orbiting satellites, but at a low temporal resolution; the opposite is true for geostationary satellites. In order to exploit the complementary nature of geostationary and polar-orbiting data, a merging methodology has been developed to obtain a unique estimate of the North Sea Total Suspended Matter (TSM). The largest difficulty in developing a merging methodology is the correct estimation of the error covariance matrix, which can be especially complex for variables like TSM. In this work, the error covariance is not parametrized a priori using an analytical expression, but is expressed using a truncated spatial EOF basis calculated by analysing MODIS data with DINEOF (Data INterpolating Empirical Orthogonal Functions). This EOF basis represents the complex variability of the TSM data sets more realistically than the parametric covariances used in most optimal interpolation applications, and is subsequently used to merge MODIS and SEVIRI TSM data using an optimal interpolation approach. Results for North Sea TSM in 2009 will be shown, demonstrating the possibilities of this technique. The influence of including variables such as winds or tides in the analysis, through multivariate approaches, will be assessed.
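Using a truncated EOF basis as the background error covariance in an optimal interpolation merge can be sketched as follows. Everything here is schematic: the EOF basis is a random orthonormal matrix standing in for DINEOF-derived MODIS modes, and the two "sensors" with different sampling densities and noise levels only mimic the polar-orbiting versus geostationary contrast.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 100, 3                     # grid points, number of retained EOF modes

# Stand-in truncated EOF basis: orthonormal columns, with mode variances.
U, _ = np.linalg.qr(rng.normal(size=(n, k)))
lam = np.array([10.0, 4.0, 1.0])
P = (U * lam) @ U.T               # low-rank background error covariance

# A "true" anomaly field living in the EOF subspace.
truth = U @ (np.sqrt(lam) * rng.normal(size=k))

# Two sensors: sparse/accurate vs dense/noisy observations.
idx = np.concatenate([np.arange(0, n, 5), np.arange(0, n, 2)])
r = np.concatenate([np.full(20, 0.05**2), np.full(50, 0.2**2)])
y = truth[idx] + rng.normal(0, np.sqrt(r))

# Optimal interpolation merge: x_a = P H' (H P H' + R)^{-1} y.
H = np.zeros((idx.size, n))
H[np.arange(idx.size), idx] = 1.0
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + np.diag(r))
merged = K @ y
print("RMSE of merged field:", np.sqrt(np.mean((merged - truth) ** 2)))
```

Because the covariance is rank-k, the analysis only needs to resolve k mode amplitudes, which is what makes the EOF-based covariance both realistic and cheap.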
Assimilation of simulated satellite altimetric data and ARGO temperature data into a double-gyre NEMO ocean model
Yan, Yajing; Barth, Alexander; et al. Poster (2013, April 09).

Estimating Inter-Sensor Sea Surface Temperature Biases using DINEOF analysis
Tomazic, Igor; Alvera Azcarate, Aïda; Troupin, Charles; et al. Poster (2013).

Climate studies need long-term data sets of homogeneous quality in order to discern trends from the other physical signals present in the data and to minimise the contamination of these trends by errors in the source data. Sea surface temperature (SST), defined as one of the essential climate variables, is increasingly used in both oceanographic and meteorological operational contexts, where there is a constant need for more accurate measurements. Satellite-derived SST provides an indispensable dataset with high spatial and temporal resolution. However, these data have errors of 0.5 K on a global scale and present inter-sensor and inter-regional differences due to technical characteristics, algorithm limitations and the changing physical properties of the measured environments. These inter-sensor differences should be taken into account in any research involving more than one sensor (SST analysis, long-term climate research, ...). The error correction for each SST sensor is usually calculated as the difference between the SST derived from a reference sensor (e.g. ENVISAT/AATSR) and from the other sensors (SEVIRI, AVHRR, MODIS). However, these empirical difference (bias) fields show gaps due to satellite characteristics (e.g. the narrow swath of AATSR) and to the presence of clouds or other atmospheric contamination. We present a methodology based on DINEOF (Data INterpolating Empirical Orthogonal Functions) to reconstruct and analyse SST biases, with the aim of studying the temporal and spatial variability of the SST bias fields both at large scale (European seas) and at regional scale (Mediterranean Sea) and of applying the necessary corrections to the original SST fields. Two different approaches were taken: analysing biases obtained by reconstructing the SST difference fields, and biases obtained as differences of independently reconstructed SST fields. The corrected SST fields from both approaches were validated against independent in situ buoy SST data, or against ENVISAT/AATSR SST data for areas without in situ data (e.g. the eastern Mediterranean).