Review article: Comparison of local particle filters and new implementations

2018, Vol 25 (4), pp. 765-807
Author(s): Alban Farchi, Marc Bocquet

Abstract. Particle filtering is a generic weighted ensemble data assimilation method based on sequential importance sampling, suited for nonlinear and non-Gaussian filtering problems. Unless the number of ensemble members scales exponentially with the problem size, particle filter (PF) algorithms experience weight degeneracy. This phenomenon is a manifestation of the curse of dimensionality that prevents the use of PF methods for high-dimensional data assimilation. The use of local analyses to counteract the curse of dimensionality was suggested early in the development of PF algorithms. However, implementing localisation in the PF is a challenge, because there is no simple and yet consistent way of gluing together locally updated particles across domains. In this article, we review the ideas related to localisation and the PF in the geosciences. We introduce a generic and theoretical classification of local particle filter (LPF) algorithms, with an emphasis on the advantages and drawbacks of each category. Alongside the classification, we suggest practical solutions to the difficulties of local particle filtering, which lead to new implementations and improvements in the design of LPF algorithms. The LPF algorithms are systematically tested and compared using twin experiments with the one-dimensional 40-variable Lorenz model and with a two-dimensional barotropic vorticity model. The results illustrate the advantages of using optimal transport theory to design the local analysis. With reasonable ensemble sizes, the best LPF algorithms yield data assimilation scores comparable to those of typical ensemble Kalman filter algorithms, even for a mildly nonlinear system.
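
To illustrate the weight degeneracy discussed above, here is a minimal Python sketch of a sequential importance sampling update together with the effective ensemble size diagnostic. The Gaussian likelihood, the direct observation of every state variable, and all names are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def sis_weight_update(weights, particles, obs, obs_err=1.0):
    """One sequential importance sampling step: reweight each particle by its
    Gaussian observation likelihood, then renormalise (illustrative sketch)."""
    innov = obs[None, :] - particles              # (N, d) innovations
    loglik = -0.5 * np.sum((innov / obs_err) ** 2, axis=1)
    logw = np.log(weights) + loglik
    logw -= logw.max()                            # guard against underflow
    w = np.exp(logw)
    return w / w.sum()

def effective_ensemble_size(weights):
    """N_eff = 1 / sum(w_i^2); values close to 1 signal weight degeneracy."""
    return 1.0 / np.sum(weights ** 2)

# Usage: 20 particles for a 40-variable state observed directly.
rng = np.random.default_rng(0)
N, d = 20, 40
particles = rng.standard_normal((N, d))
obs = rng.standard_normal(d)
w = sis_weight_update(np.full(N, 1.0 / N), particles, obs)
print(effective_ensemble_size(w))   # typically close to 1: the filter degenerates
```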


2008, Vol 136 (12), pp. 4629-4640
Author(s): Chris Snyder, Thomas Bengtsson, Peter Bickel, Jeff Anderson

Abstract. Particle filters are ensemble-based assimilation schemes that, unlike the ensemble Kalman filter, employ a fully nonlinear and non-Gaussian analysis step to compute the probability distribution function (pdf) of a system’s state conditioned on a set of observations. Evidence is provided that the ensemble size required for a successful particle filter scales exponentially with the problem size. For the simple example in which each component of the state vector is independent, Gaussian, and of unit variance and the observations are of each state component separately with independent, Gaussian errors, simulations indicate that the required ensemble size scales exponentially with the state dimension. In this example, the particle filter requires at least 10^11 members when applied to a 200-dimensional state. Asymptotic results, following the work of Bengtsson, Bickel, and collaborators, are provided for two cases: one in which each prior state component is independent and identically distributed, and one in which both the prior pdf and the observation errors are Gaussian. The asymptotic theory reveals that, in both cases, the required ensemble size scales exponentially with the variance of the observation log likelihood rather than with the state dimension per se.
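
The collapse described above can be reproduced with a small Monte Carlo experiment in the idealised setting of the abstract (independent, unit-variance Gaussian state components, each observed directly with unit-variance Gaussian errors). This sketch only tracks the largest posterior weight; it is an illustration, not the paper's asymptotic analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
N, trials = 1000, 200          # ensemble size and number of repeated experiments

# Idealised setting: i.i.d. standard Gaussian state components, each observed
# directly with independent unit-variance Gaussian errors.
for d in (10, 50, 100, 200):
    max_w = []
    for _ in range(trials):
        particles = rng.standard_normal((N, d))                 # prior ensemble
        obs = rng.standard_normal(d) + rng.standard_normal(d)   # truth + obs error
        logw = -0.5 * np.sum((obs - particles) ** 2, axis=1)
        logw -= logw.max()
        w = np.exp(logw)
        w /= w.sum()
        max_w.append(w.max())
    # As d (and hence the variance of the observation log likelihood) grows,
    # the largest weight approaches 1, i.e. the ensemble has collapsed.
    print(f"d = {d:3d}   mean largest weight = {np.mean(max_w):.3f}")
```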


2011, Vol 130-134, pp. 3311-3315
Author(s): Nai Gao Jin, Fei Mo Li, Zhao Xing Li

A CUDA-accelerated quasi-Monte Carlo Gaussian particle filter (QMC-GPF) is proposed to deal with real-time non-linear, non-Gaussian problems. The GPF is especially suitable for parallel implementation because it eliminates the resampling step. QMC-GPF is an efficient counterpart of the GPF that uses QMC sampling instead of MC sampling. Since particles generated by the QMC method provide the best-possible distribution in the sampling space, QMC-GPF can produce more accurate estimates than the traditional particle filter with the same number of particles. Experimental results show that our GPU implementation of QMC-GPF achieves a maximum speedup of 95 on an NVIDIA GeForce GTX 460.
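
As a rough illustration of a Gaussian particle filter analysis step with quasi-Monte Carlo sampling, the sketch below fits a Gaussian posterior from Sobol-sampled particles. The observation operator H, error covariance R, and function names are assumptions of this sketch, and it is a serial NumPy/SciPy version rather than the CUDA implementation described above.

```python
import numpy as np
from scipy.stats import norm, qmc

def qmc_gaussian_pf_update(mean, cov, obs, H, R, n=256):
    """One Gaussian particle filter analysis step with quasi-Monte Carlo (Sobol)
    sampling (sketch). The posterior is refitted as a Gaussian, so no resampling
    step is needed -- the property that makes the GPF easy to parallelise."""
    d = mean.size
    # Map Sobol points to N(mean, cov) via the inverse normal CDF and a
    # Cholesky factor; n should be a power of two for the Sobol sequence.
    u = qmc.Sobol(d=d, scramble=True).random(n)
    particles = mean + norm.ppf(u) @ np.linalg.cholesky(cov).T   # (n, d)
    # Importance weights from the Gaussian observation likelihood.
    innov = obs - particles @ H.T                                # (n, m)
    Rinv = np.linalg.inv(R)
    logw = -0.5 * np.einsum('ij,jk,ik->i', innov, Rinv, innov)
    logw -= logw.max()
    w = np.exp(logw)
    w /= w.sum()
    # Fit the Gaussian posterior with the weighted mean and covariance.
    post_mean = w @ particles
    centred = particles - post_mean
    post_cov = (centred * w[:, None]).T @ centred
    return post_mean, post_cov

# Usage: a 4-variable state with two directly observed components (illustrative).
H = np.eye(4)[:2]
m0, C0, R = np.zeros(4), np.eye(4), 0.25 * np.eye(2)
print(qmc_gaussian_pf_update(m0, C0, np.array([1.0, -0.5]), H, R)[0])
```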


Author(s): Yuyang Guo, Xiangbo Xu, Miaoxin Ji

To address the low precision of the Kalman filter for non-linear, non-Gaussian models and the severe particle degradation of the standard particle filter, a zero-velocity correction algorithm based on an adaptive particle filter is proposed in this paper. To improve the efficiency of resampling, an adaptive threshold is combined with the particle filter. In the resampling process, a degradation coefficient is introduced to judge the degree of particle degradation, and the particles are resampled so as to preserve their diversity. To verify the effectiveness and feasibility of the proposed algorithm, a hardware platform based on an inertial measurement unit (IMU) is built, the state-space model of the system is established using the data collected by the IMU, and experiments are carried out. The experimental results show that, compared with the Kalman filter and the classical particle filter, the positioning accuracy of the adaptive particle filter in the zero-velocity range is improved by 40.6% and 19.4%, respectively. The adaptive particle filter (APF) corrects navigation errors better and improves the accuracy of the pedestrian trajectory.
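
A hedged sketch of the adaptive resampling idea: a degeneracy diagnostic (here the normalised effective sample size, which may differ from the paper's degradation coefficient) triggers resampling only when it falls below a threshold. Function names and the systematic resampler are illustrative choices.

```python
import numpy as np

def degradation_coefficient(weights):
    """A common degeneracy diagnostic: effective sample size divided by the
    ensemble size (the paper's exact definition may differ)."""
    return 1.0 / (np.sum(weights ** 2) * weights.size)

def systematic_resample(rng, weights):
    """Systematic resampling: a low-variance selection of particle indices."""
    n = weights.size
    positions = (rng.random() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

def adaptive_resample(rng, particles, weights, threshold=0.5):
    """Resample only when the degeneracy diagnostic drops below a threshold,
    so that particle diversity is preserved between resampling events."""
    if degradation_coefficient(weights) < threshold:
        idx = systematic_resample(rng, weights)
        particles = particles[idx]
        weights = np.full(weights.size, 1.0 / weights.size)
    return particles, weights
```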


2016, Vol 23 (6), pp. 391-405
Author(s): Stephen G. Penny, Takemasa Miyoshi

Abstract. A local particle filter (LPF) is introduced that outperforms traditional ensemble Kalman filters in highly nonlinear/non-Gaussian scenarios, both in accuracy and computational cost. The standard sampling importance resampling (SIR) particle filter is augmented with an observation-space localization approach, for which an independent analysis is computed locally at each grid point. The deterministic resampling approach of Kitagawa is adapted for application locally and combined with interpolation of the analysis weights to smooth the transition between neighboring points. Gaussian noise is applied with magnitude equal to the local analysis spread to prevent particle degeneracy while maintaining the estimate of the growing dynamical instabilities. The approach is validated against the local ensemble transform Kalman filter (LETKF) using the 40-variable Lorenz-96 (L96) model. The results show that (1) the accuracy of LPF surpasses LETKF as the forecast length increases (thus increasing the degree of nonlinearity), (2) the cost of LPF is significantly lower than LETKF as the ensemble size increases, and (3) LPF prevents filter divergence experienced by LETKF in cases with non-Gaussian observation error distributions.
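
The sketch below illustrates, under simplifying assumptions (direct observations of selected grid points, a Gaussian taper, systematic rather than Kitagawa's deterministic resampling, and no interpolation of analysis weights), the kind of local analysis described above; it is not the authors' LPF.

```python
import numpy as np

def lpf_analysis(rng, particles, obs, obs_idx, grid, loc_radius,
                 obs_err=1.0, noise_scale=1.0):
    """Sketch of a local particle filter analysis: independent weights at each
    grid point via observation-space localisation, local resampling, and
    additive Gaussian noise scaled by the local analysis spread."""
    n, d = particles.shape
    # Per-observation log-likelihoods, assuming direct observations of the grid
    # points listed in obs_idx (an assumption of this sketch).
    loglik = -0.5 * ((obs[None, :] - particles[:, obs_idx]) / obs_err) ** 2  # (n, m)
    # Gaussian taper of each observation's influence with distance to the point.
    dist = np.abs(grid[:, None] - grid[obs_idx][None, :])                    # (d, m)
    taper = np.exp(-0.5 * (dist / loc_radius) ** 2)
    analysis = np.empty_like(particles)
    for j in range(d):
        logw = loglik @ taper[j]
        logw -= logw.max()
        w = np.exp(logw)
        w /= w.sum()
        # Local resampling at grid point j (systematic here, for brevity).
        positions = (rng.random() + np.arange(n)) / n
        idx = np.searchsorted(np.cumsum(w), positions)
        analysis[:, j] = particles[idx, j]
        # Additive noise with magnitude tied to the local analysis spread.
        analysis[:, j] += noise_scale * analysis[:, j].std() * rng.standard_normal(n)
    return analysis
```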


2014, Vol 543-547, pp. 1278-1281
Author(s): Zhun Jiao, Rong Zhang

As a method for dealing with nonlinear and non-Gaussian distributions, based on Monte Carlo methods and Bayesian filtering, particle filters (PF) are favored by researchers and widely applied in many fields. Building on the standard particle filter, an improved particle filter (IPF) with a modified proposal distribution is presented. Evaluation of the weights is simplified, and further techniques, including a residual resampling step and a Markov chain Monte Carlo move step, are introduced for the SINS/GPS integrated navigation system. The simulation results confirm that the improved particle filter outperforms the others.
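
For reference, a short sketch of the residual resampling step mentioned above; the interface is illustrative.

```python
import numpy as np

def residual_resample(rng, weights):
    """Residual resampling (sketch): each particle is copied floor(N * w_i)
    times deterministically, and the remaining slots are filled by multinomial
    sampling from the residual weights. Returns the selected particle indices."""
    n = weights.size
    counts = np.floor(n * weights).astype(int)
    n_left = n - counts.sum()
    if n_left > 0:
        residual = n * weights - counts
        residual /= residual.sum()
        counts += rng.multinomial(n_left, residual)
    return np.repeat(np.arange(n), counts)
```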


2013, Vol 9 (1), pp. 43-74
Author(s): S. Dubinkina, H. Goosse

Abstract. In an idealized framework, we assess reconstructions of the climate state of the Southern Hemisphere during the past 150 yr using the climate model of intermediate complexity LOVECLIM and three data-assimilation methods: nudging, a particle filter with sequential importance resampling, and an extremely efficient particle filter. The methods constrain the model by pseudo-observations of surface air temperature anomalies obtained from a twin experiment using the same model but different initial conditions. The network of pseudo-observations is chosen to be either dense (pseudo-observations given at every grid cell of the model) or sparse (pseudo-observations given at the same locations as the instrumental surface temperature dataset HADCRUT3). All three data-assimilation methods provide good estimates of surface air temperature and sea-ice concentration, with the extremely efficient particle filter having the best performance. When reconstructing variables that are not directly linked to the pseudo-observations of surface air temperature, such as atmospheric circulation and sea surface salinity, the performance of the particle filters is weaker but still satisfactory for many applications. Sea surface salinity reconstructed by the nudging, however, exhibits a pattern opposite to the pseudo-observations, which is due to a spurious impact of the nudging on the ocean mixing.


2016, Vol 113 (51), pp. 14609-14614
Author(s): Yoonsang Lee, Andrew J. Majda

Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filter are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighboring state variables through clustering, and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors.
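
A minimal sketch of the coarse-grained localisation idea, in which each cluster of state variables is weighted only by the observations falling inside it. The cluster layout and the Gaussian likelihood are assumptions of this sketch, and the particle adjustment step is omitted.

```python
import numpy as np

def clustered_pf_weights(particles, obs, obs_idx, clusters, obs_err=1.0):
    """Coarse-grained localisation through clustering (sketch): each cluster of
    state variables receives its own weight vector, computed only from the
    observations whose locations fall inside that cluster."""
    n, _ = particles.shape
    weights = {}
    for c, members in clusters.items():
        local = [k for k, j in enumerate(obs_idx) if j in members]
        if not local:
            weights[c] = np.full(n, 1.0 / n)       # unobserved cluster: uniform
            continue
        innov = obs[local][None, :] - particles[:, np.asarray(obs_idx)[local]]
        logw = -0.5 * np.sum((innov / obs_err) ** 2, axis=1)
        logw -= logw.max()
        w = np.exp(logw)
        weights[c] = w / w.sum()
    return weights

# Usage: 8 state variables split into two clusters, observations at points 1 and 5.
rng = np.random.default_rng(2)
clusters = {0: {0, 1, 2, 3}, 1: {4, 5, 6, 7}}
print(clustered_pf_weights(rng.standard_normal((10, 8)),
                           np.array([0.3, -1.2]), [1, 5], clusters))
```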


2018, Vol 25 (1), pp. 55-66
Author(s): Nelson Feyeux, Arthur Vidard, Maëlle Nodet

Abstract. Data assimilation methods usually evaluate observation-model misfits using weighted L2 distances. However, such distances are not well suited when observed features are present in the model with position error. In this context, the Wasserstein distance stemming from optimal transport theory is more relevant. This paper proposes an adaptation of variational data assimilation for the use of such a measure. It provides a short introduction to optimal transport theory and discusses the importance of a proper choice of scalar product to compute the cost function gradient. It also extends the discussion to the way the descent is performed within the minimization process. These algorithmic changes are tested on a nonlinear shallow-water model, leading to the conclusion that optimal-transport-based data assimilation seems promising for capturing position errors in the model trajectory.
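
As a small worked example of why a Wasserstein misfit helps with position errors, the sketch below compares an L2 misfit with the one-dimensional Wasserstein-1 distance (computed from cumulative distribution functions on a uniform grid) for two identical features shifted in space. It is illustrative only and not the authors' variational formulation.

```python
import numpy as np

def wasserstein1_on_grid(p, q, dx):
    """Wasserstein-1 distance between two densities on a common uniform 1-D
    grid, computed as the integral of |CDF_p - CDF_q| (standard 1-D formula)."""
    cdf_p = np.cumsum(p) * dx
    cdf_q = np.cumsum(q) * dx
    return np.sum(np.abs(cdf_p - cdf_q)) * dx

# Two identical Gaussian-shaped features shifted in position by 2 units: the
# L2 misfit saturates once the features no longer overlap, while the
# Wasserstein distance keeps growing with the shift.
x = np.linspace(0.0, 10.0, 1001)
dx = x[1] - x[0]
def feature(centre):
    f = np.exp(-0.5 * ((x - centre) / 0.3) ** 2)
    return f / (f.sum() * dx)                      # normalise to unit mass
p, q = feature(4.0), feature(6.0)
print("L2 misfit     :", np.sqrt(np.sum((p - q) ** 2) * dx))
print("Wasserstein-1 :", wasserstein1_on_grid(p, q, dx))   # close to the shift (2)
```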


2011, Vol 15 (10), pp. 3237-3251
Author(s): S. J. Noh, Y. Tachikawa, M. Shiiba, S. Kim

Abstract. Data assimilation techniques have received growing attention due to their capability to improve prediction. Among various data assimilation techniques, sequential Monte Carlo (SMC) methods, known as "particle filters", are a Bayesian learning process with the capability to handle non-linear and non-Gaussian state-space models. In this paper, we propose an improved particle filtering approach that considers the different response times of internal state variables in a hydrologic model. The proposed method adopts a lagged filtering approach to aggregate the model response until the uncertainty of each hydrologic process has propagated. Regularization with an additional move step based on Markov chain Monte Carlo (MCMC) methods is also implemented to preserve sample diversity under the lagged filtering approach. A distributed hydrologic model, water and energy transfer processes (WEP), is implemented for sequential data assimilation through the updating of state variables. The lagged regularized particle filter (LRPF) and the sequential importance resampling (SIR) particle filter are implemented for hindcasting of streamflow at the Katsura catchment, Japan. The control state variables for filtering are soil moisture content and overland flow, and streamflow measurements are used for data assimilation. LRPF shows consistent forecasts regardless of the process noise assumption, while SIR has different values of optimal process noise and shows sensitive variation of its confidence intervals depending on the process noise. The improvement of LRPF forecasts over SIR is particularly evident for rapidly varying high flows, due to the preservation of sample diversity by the kernel, even when particle impoverishment takes place.
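
A brief sketch of the regularisation step referred to above: after resampling, particles are jittered with a kernel scaled by the ensemble spread to preserve diversity. The lagged aggregation of model responses and the MCMC move step are omitted, and all names are illustrative.

```python
import numpy as np

def regularized_resample(rng, particles, weights, bandwidth=0.1):
    """Regularised resampling (sketch): resample particle indices, then perturb
    the duplicates with a Gaussian kernel scaled by the per-variable ensemble
    spread so that sample diversity is preserved."""
    n, d = particles.shape
    idx = rng.choice(n, size=n, p=weights)          # multinomial resampling
    spread = particles.std(axis=0)                  # per-variable spread
    return particles[idx] + bandwidth * spread * rng.standard_normal((n, d))
```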

