Cloud and Precipitation Parameterizations in Modeling and Variational Data Assimilation: A Review

2007 ◽  
Vol 64 (11) ◽  
pp. 3766-3784 ◽  
Author(s):  
Philippe Lopez

Abstract This paper first reviews the current status, issues, and limitations of the parameterizations of atmospheric large-scale and convective moist processes that are used in numerical weather prediction and climate general circulation models. Both large-scale (resolved) and convective (subgrid-scale) moist processes are dealt with. Then, the general question of the inclusion of diabatic processes in variational data assimilation systems is addressed. The focus is on linearity and resolution issues, the specification of model and observation error statistics, the formulation of the control vector, and the problems specific to the assimilation of observations directly affected by clouds and precipitation.
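For orientation, variational assimilation of the kind reviewed here is built around a cost function of the standard form (notation: background state x_b, background-error covariance B, observations y, observation-error covariance R, observation operator H):

```latex
J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
\;+\; \tfrac{1}{2}\,\bigl(H(\mathbf{x})-\mathbf{y}\bigr)^{\mathrm T}\mathbf{R}^{-1}\bigl(H(\mathbf{x})-\mathbf{y}\bigr)
```

The linearity and error-statistics issues highlighted in the abstract arise because, for cloud- and precipitation-affected observations, H (and, in 4D-Var, the forecast model itself) involves moist-physics parameterizations whose linearized and adjoint versions must remain meaningful.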

2015 ◽  
Vol 72 (1) ◽  
pp. 55-74 ◽  
Author(s):  
Qiang Deng ◽  
Boualem Khouider ◽  
Andrew J. Majda

Abstract The representation of the Madden–Julian oscillation (MJO) is still a challenge for numerical weather prediction and general circulation models (GCMs) because of the inadequate treatment of convection and the associated interactions across scales by the underlying cumulus parameterizations. One promising new direction is the use of the stochastic multicloud model (SMCM), which has been designed specifically to capture the missing variability due to unresolved processes of convection and their impact on the large-scale flow. The SMCM specifically models the area fractions of the three cloud types (congestus, deep, and stratiform) that characterize organized convective systems on all scales. The SMCM captures the stochastic behavior of these three cloud types via a judiciously constructed Markov birth–death process using a particle interacting lattice model. The SMCM has been successfully applied to convectively coupled waves in a simplified primitive equation model and validated against radar data of tropical precipitation. In this work, the authors use the SMCM in a GCM for the first time, building on previous work that coupled the High-Order Methods Modeling Environment (HOMME) NCAR GCM to a simple multicloud model. The authors tested the new SMCM-HOMME model in the parameter regime considered previously and found that the stochastic model drastically improves the results of the deterministic model. Clear MJO-like structures with many realistic features are reproduced by SMCM-HOMME in the physically relevant parameter regime, including wave trains of MJOs that organize intermittently in time. Moreover, the deterministic simulation's caveat of requiring a doubled moisture background is no longer necessary.
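As a rough illustration of the birth–death lattice idea described above, the sketch below evolves a small lattice of cloud-type states and diagnoses the cloud area fractions. The transition rates are placeholders, not the published SMCM rates, which depend on large-scale predictors such as CAPE and mid-tropospheric dryness.

```python
import numpy as np

# Illustrative lattice birth-death sketch in the spirit of the SMCM: each site
# is clear (0), congestus (1), deep (2) or stratiform (3) and switches state
# stochastically. Constant rates are placeholders for the predictor-dependent
# rates of the real model.

rng = np.random.default_rng(0)
N = 40                                  # N x N microscopic lattice sites
state = np.zeros((N, N), dtype=int)

# hypothetical rates: from_state -> list of (to_state, rate)
transitions = {
    0: [(1, 0.10)],                     # clear -> congestus
    1: [(2, 0.20), (0, 0.05)],          # congestus -> deep, or decays
    2: [(3, 0.25), (0, 0.05)],          # deep -> stratiform, or decays
    3: [(0, 0.15)],                     # stratiform -> clear
}

def step(state, dt=0.1):
    """One explicit time step of the Markov birth-death process."""
    new = state.copy()
    u = rng.random(state.shape)
    for src, outs in transitions.items():
        mask = state == src
        lo = 0.0
        for dst, rate in outs:
            p = rate * dt               # small-dt switching probability
            new[mask & (u >= lo) & (u < lo + p)] = dst
            lo += p
        # sites with u >= lo keep their current state
    return new

for _ in range(200):
    state = step(state)

# Area fractions of congestus, deep and stratiform clouds -- the quantities
# the SMCM feeds back to the large-scale dynamics.
print([(state == k).mean() for k in (1, 2, 3)])
```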


2013 ◽  
Vol 9 (6) ◽  
pp. 2741-2757 ◽  
Author(s):  
A. Mairesse ◽  
H. Goosse ◽  
P. Mathiot ◽  
H. Wanner ◽  
S. Dubinkina

Abstract. The mid-Holocene (6 kyr BP; thousand years before present) is a key period for studying the consistency between model results and proxy-based reconstructions, as it corresponds to a standard test for models and a reasonable number of proxy-based records is available. Taking advantage of this relatively large amount of information, we have compared a compilation of 50 air and sea surface temperature reconstructions with the results of three simulations performed with general circulation models and one carried out with LOVECLIM, a model of intermediate complexity. The conclusions derived from this analysis confirm that models and data agree on the large-scale spatial pattern, but the models underestimate the magnitude of some observed changes, and large discrepancies are observed at the local scale. To further investigate the origin of those inconsistencies, we have constrained LOVECLIM to follow the signal recorded by the proxies selected in the compilation using a data-assimilation method based on a particle filter. In one simulation all 50 proxy-based records are used, while in the other two only the continental or oceanic proxy-based records constrain the model results. As expected, data assimilation improves the consistency between model results and the reconstructions. In particular, this is achieved in a robust way in all the experiments through a strengthening of the midlatitude westerlies that warms northern Europe. Furthermore, the comparison of the LOVECLIM simulations with and without data assimilation has also objectively identified 16 proxy-based paleoclimate records whose reconstructed signal is incompatible either with the signal recorded by some other proxy-based records or with the model physics.
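The particle-filter constraint described here can be illustrated schematically: each particle is one model trajectory, weighted by its fit to the proxy reconstructions and then resampled. The ensemble size, error values, and data below are invented for illustration and are not those used with LOVECLIM.

```python
import numpy as np

# Schematic particle-filter update for constraining a climate model with
# proxy records: weight each particle by its misfit to the proxies, then
# resample so that well-fitting particles are duplicated.

rng = np.random.default_rng(1)
n_particles, n_proxies = 96, 50

# simulated temperature anomalies at the proxy sites, one row per particle
ensemble = rng.normal(0.0, 1.0, size=(n_particles, n_proxies))
# proxy-based reconstructions and an assumed observation-error std. dev.
proxy_obs = rng.normal(0.5, 1.0, size=n_proxies)
proxy_err = 0.8

# Gaussian log-likelihood of each particle given the proxies
logw = -((ensemble - proxy_obs) ** 2).sum(axis=1) / (2.0 * proxy_err ** 2)
weights = np.exp(logw - logw.max())
weights /= weights.sum()

# resample: particles matching the proxies well survive, poor ones are dropped
idx = rng.choice(n_particles, size=n_particles, p=weights)
resampled = ensemble[idx]

print("effective ensemble size:", 1.0 / np.sum(weights ** 2))
```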


2013 ◽  
Vol 9 (4) ◽  
pp. 3953-3991 ◽  
Author(s):  
A. Mairesse ◽  
H. Goosse ◽  
P. Mathiot ◽  
H. Wanner ◽  
S. Dubinkina

Abstract. The mid-Holocene (6 thousand years before present) is a key period for studying the consistency between model results and proxy data, as it corresponds to a standard test for models and a reasonable number of proxy records are available. Taking advantage of this relatively large amount of information, we have first compared a compilation of 50 air and sea surface temperature reconstructions with the results of three simulations performed with general circulation models and one carried out with LOVECLIM, a model of intermediate complexity. The conclusions derived from this analysis confirm that models and data agree on the large-scale spatial pattern but underestimate the magnitude of some observed changes, and that large discrepancies are observed at the local scale. To further investigate the origin of those inconsistencies, we have constrained LOVECLIM to follow the signal recorded by the proxies selected in the compilation using a data assimilation method based on a particle filter. In one simulation all 50 proxies are used, while in the other two only the continental or oceanic proxies constrain the model results. This assimilation improves the consistency between model results and the reconstructions. In particular, this is achieved in a robust way in all the experiments through a strengthening of the midlatitude westerlies that warms northern Europe. Furthermore, the comparison of the LOVECLIM simulations with and without data assimilation has also objectively identified 16 proxies whose reconstructed signal is incompatible either with the one recorded by some other proxies or with the model physics.


2013 ◽  
Vol 141 (3) ◽  
pp. 1099-1117 ◽  
Author(s):  
Andrew Charles ◽  
Bertrand Timbal ◽  
Elodie Fernandez ◽  
Harry Hendon

Abstract Seasonal predictions based on coupled atmosphere–ocean general circulation models (GCMs) provide useful predictions of large-scale circulation but lack the conditioning on topography required for locally relevant prediction. In this study a statistical downscaling model based on meteorological analogs was applied to continental-scale GCM-based seasonal forecasts and high-quality historical site observations to generate a set of downscaled precipitation hindcasts at 160 sites in the South Murray Darling Basin region of Australia. Large-scale fields from the Predictive Ocean–Atmosphere Model for Australia (POAMA) 1.5b GCM-based seasonal prediction system are used for analog selection. Correlation analysis indicates modest levels of predictability in the target region for the selected predictor fields. A single best-match analog was found using model sea level pressure, meridional wind, and rainfall fields, with the procedure applied to 3-month-long reforecasts, initialized on the first day of each month from 1980 to 2006, for each model day of 10 ensemble members. Assessment of the total accumulated rainfall and number of rainy days in the 3-month reforecasts shows that the downscaling procedure corrects the local climate variability with no mean effect on predictive skill, resulting in a smaller-magnitude error. The total rainfall and number of rain days in the downscaled output are significantly improved over the direct GCM output, as measured by the difference in median and tercile thresholds between station observations and downscaled rainfall. Confidence in the downscaled output is enhanced by strong consistency between the large-scale mean of the downscaled and direct GCM precipitation.
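A minimal sketch of the analog selection step follows, assuming standardized large-scale predictor fields and an archive of station rainfall. The array shapes, distance metric, and data are illustrative rather than the exact configuration of the POAMA-based scheme.

```python
import numpy as np

# Bare-bones analog downscaling: for a forecast day, find the historical day
# whose standardized large-scale predictors are closest, then reuse that day's
# observed station rainfall as the downscaled forecast.

rng = np.random.default_rng(2)
n_hist, n_grid, n_sites = 5000, 300, 160

hist_predictors = rng.normal(size=(n_hist, n_grid))   # archive of analysed fields
hist_station_rain = rng.gamma(0.5, 2.0, size=(n_hist, n_sites))
fcst_predictors = rng.normal(size=n_grid)             # one GCM forecast day

# standardize predictors so each grid point contributes comparably
mean, std = hist_predictors.mean(axis=0), hist_predictors.std(axis=0)
zh = (hist_predictors - mean) / std
zf = (fcst_predictors - mean) / std

# single best-match analog by Euclidean distance in predictor space
best = np.argmin(((zh - zf) ** 2).sum(axis=1))
downscaled_rain = hist_station_rain[best]
print("analog day index:", best, "mean site rainfall:", downscaled_rain.mean())
```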


2018 ◽  
Vol 146 (2) ◽  
pp. 447-465 ◽  
Author(s):  
Mark Buehner ◽  
Ping Du ◽  
Joël Bédard

Abstract Two types of approaches are commonly used for estimating the impact of arbitrary subsets of observations on short-range forecast error. The first was developed for variational data assimilation systems and requires the adjoint of the forecast model. Comparable approaches were developed for use with the ensemble Kalman filter and rely on ensembles of forecasts. In this study, a new approach for computing observation impact is proposed for ensemble–variational data assimilation (EnVar). Like standard adjoint approaches, the adjoint of the data assimilation procedure is implemented through the iterative minimization of a modified cost function. However, like ensemble approaches, the adjoint of the forecast step is obtained by using an ensemble of forecasts. Numerical experiments were performed to compare the new approach with the standard adjoint approach in the context of operational deterministic NWP. Generally similar results are obtained with both approaches, especially when the new approach uses covariance localization that is horizontally advected between analysis and forecast times. However, large differences in estimated impacts are obtained for some surface observations. Vertical propagation of the observation impact is noticeably restricted with the new approach because of vertical covariance localization. The new approach is used to evaluate changes in observation impact as a result of the use of interchannel observation error correlations for radiance observations. The estimated observation impact in similarly configured global and regional prediction systems is also compared. Overall, the new approach should provide useful estimates of observation impact for data assimilation systems based on EnVar when an adjoint model is not available.
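For context, a widely used adjoint-based estimate of the forecast-error change attributable to a set of observations takes roughly the following form (a schematic of the approach, with innovation d, gain K, tangent-linear/adjoint model M, energy-norm matrix C, and forecast errors e_a^f, e_b^f from the analysis and the background):

```latex
\delta e_f \;\approx\; (\mathbf{K}\mathbf{d})^{\mathrm T}\,\mathbf{M}^{\mathrm T}\mathbf{C}\,
\bigl(\mathbf{e}^f_a+\mathbf{e}^f_b\bigr),
\qquad \mathbf{d}=\mathbf{y}-H(\mathbf{x}_b).
```

The EnVar-based approach described above keeps the adjoint of the assimilation step (through iterative minimization of a modified cost function) but replaces the forecast-model adjoint with an estimate from an ensemble of forecasts, which is where the (possibly horizontally advected) covariance localization enters.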


2002 ◽  
Vol 32 (9) ◽  
pp. 2509-2519 ◽  
Author(s):  
Gerrit Burgers ◽  
Magdalena A. Balmaseda ◽  
Femke C. Vossepoel ◽  
Geert Jan van Oldenborgh ◽  
Peter Jan van Leeuwen

Abstract The question is addressed whether using unbalanced updates in ocean data assimilation schemes for seasonal forecasting systems can result in a relatively poor simulation of zonal currents. An assimilation scheme in which temperature observations are used to update only the density field is compared to a scheme in which updates of the density field and zonal velocities are related by geostrophic balance. This is done for an equatorial linear shallow-water model. It is found that equatorial zonal velocities can be degraded if velocity is not updated in the assimilation procedure. Adding balanced updates to the zonal velocity is shown to be a simple remedy for the shallow-water model. Next, optimal interpolation (OI) schemes with balanced updates of the zonal velocity are implemented in two ocean general circulation models. First tests indicate a beneficial impact on equatorial upper-ocean zonal currents.
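The balanced velocity update discussed here follows from geostrophy on an equatorial β-plane; schematically, with f = βy,

```latex
f\,u \;=\; -\frac{1}{\rho_0}\frac{\partial p}{\partial y}
\quad\Longrightarrow\quad
\beta\,u\Big|_{y=0} \;=\; -\frac{1}{\rho_0}\frac{\partial^{2} p}{\partial y^{2}}\Big|_{y=0},
```

so a density (pressure) increment produced by assimilating temperature observations implies a corresponding zonal-velocity increment even on the equator, which is the kind of balance relation used to link the two updates.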


2015 ◽  
Vol 2 (2) ◽  
pp. 513-536 ◽  
Author(s):  
I. Grooms ◽  
Y. Lee

Abstract. Superparameterization (SP) is a multiscale computational approach wherein a large-scale atmosphere or ocean model is coupled to an array of simulations of small-scale dynamics on periodic domains embedded into the computational grid of the large-scale model. SP has been successfully developed in global atmosphere and climate models, and is a promising approach for new applications. The authors develop a 3D-Var variational data assimilation framework for use with SP; the relatively low cost and simplicity of 3D-Var in comparison with ensemble approaches makes it a natural fit for relatively expensive multiscale SP models. To demonstrate the assimilation framework in a simple model, the authors develop a new system of ordinary differential equations similar to the two-scale Lorenz-'96 model. The system has one set of variables, denoted {Yi}, with large- and small-scale parts, and the SP approximation to the system is straightforward. With the new assimilation framework the SP model accurately approximates the large-scale dynamics of the true system.
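For readers unfamiliar with the test bed, the classical two-scale Lorenz-'96 system that the authors' new ODE system is said to resemble can be written down and integrated in a few lines. The sketch below uses the standard formulation and common parameter values, not the authors' modified system.

```python
import numpy as np

# Standard two-scale Lorenz-'96: K slow variables X_k, each coupled to J fast
# variables Y_{j,k} stored as one cyclic array of length K*J.

K, J = 8, 32
F, h, c, b = 20.0, 1.0, 10.0, 10.0

def tendencies(X, Y):
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * Y.reshape(K, J).sum(axis=1))
    dY = (c * b * np.roll(Y, -1) * (np.roll(Y, 1) - np.roll(Y, -2))
          - c * Y + (h * c / b) * np.repeat(X, J))
    return dX, dY

def rk4_step(X, Y, dt=0.001):
    """Classical fourth-order Runge-Kutta step for the coupled system."""
    k1 = tendencies(X, Y)
    k2 = tendencies(X + 0.5 * dt * k1[0], Y + 0.5 * dt * k1[1])
    k3 = tendencies(X + 0.5 * dt * k2[0], Y + 0.5 * dt * k2[1])
    k4 = tendencies(X + dt * k3[0], Y + dt * k3[1])
    Xn = X + dt / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
    Yn = Y + dt / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return Xn, Yn

rng = np.random.default_rng(3)
X, Y = F * rng.random(K), 0.1 * rng.random(K * J)
for _ in range(1000):
    X, Y = rk4_step(X, Y)
print("slow-variable mean and variance:", X.mean(), X.var())
```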


2019 ◽  
Vol 147 (4) ◽  
pp. 1107-1126 ◽  
Author(s):  
Jonathan Poterjoy ◽  
Louis Wicker ◽  
Mark Buehner

Abstract A series of papers published recently by the first author introduce a nonlinear filter that operates effectively as a data assimilation method for large-scale geophysical applications. The method uses sequential Monte Carlo techniques adopted by particle filters, which make no parametric assumptions for the underlying prior and posterior error distributions. The filter also treats the underlying dynamical system as a set of loosely coupled systems to effectively localize the effect observations have on posterior state estimates. This property greatly reduces the number of particles—or ensemble members—required for its implementation. For these reasons, the method is called the local particle filter. The current manuscript summarizes algorithmic advances made to the local particle filter following recent tests performed over a hierarchy of dynamical systems. The revised filter uses modified vector weight calculations and probability mapping techniques from earlier studies, and new strategies for improving filter stability in situations where state variables are observed infrequently with very accurate measurements. Numerical experiments performed on low-dimensional data assimilation problems provide evidence that supports the theoretical benefits of the new improvements. As a proof of concept, the revised particle filter is also tested on a high-dimensional application from a real-time weather forecasting system at the NOAA/National Severe Storms Laboratory (NSSL). The proposed changes have large implications for researchers applying the local particle filter to real applications, such as data assimilation in numerical weather prediction models.
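The localization idea at the heart of the local particle filter can be illustrated in a much simplified form: each state variable carries its own vector of particle weights, and an observation's influence on those weights is tapered with distance. The snippet below is a schematic of that idea only, with made-up numbers; it is not the actual vector-weight and probability-mapping algorithm of the published filter.

```python
import numpy as np

# Simplified localized particle weighting: every grid point gets its own
# particle weights, and the observation's pull on those weights decays with
# distance via a localization function.

rng = np.random.default_rng(4)
n_particles, n_grid = 40, 100
obs_loc, obs_val, obs_err = 50, 1.2, 0.5

particles = rng.normal(size=(n_particles, n_grid))          # prior ensemble

# Gaussian likelihood of each particle at the observation location
like = np.exp(-0.5 * ((particles[:, obs_loc] - obs_val) / obs_err) ** 2)

# distance-based taper (a simple Gaussian standing in for Gaspari-Cohn)
dist = np.abs(np.arange(n_grid) - obs_loc)
loc = np.exp(-0.5 * (dist / 10.0) ** 2)                      # localization radius ~10

# localized weights: far from the observation the weights relax toward uniform
w = 1.0 + loc[None, :] * (like[:, None] - 1.0)
w /= w.sum(axis=0, keepdims=True)

print("weight spread at obs location:", w[:, obs_loc].std(),
      "far away:", w[:, 0].std())
```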


2020 ◽  
Vol 50 (4) ◽  
pp. 1045-1064 ◽  
Author(s):  
Steven L. Morey ◽  
Ganesh Gopalakrishnan ◽  
Enric Pallás Sanz ◽  
Joao Marcos Azevedo Correia De Souza ◽  
Kathleen Donohue ◽  
...  

Abstract Three simulations of the circulation in the Gulf of Mexico (the “Gulf”) using different numerical general circulation models are compared with results of recent large-scale observational campaigns conducted throughout the deep (>1500 m) Gulf. Analyses of these observations have provided new understanding of large-scale mean circulation features and variability throughout the deep Gulf. Important features include cyclonic flow along the continental slope, deep cyclonic circulation in the western Gulf, a counterrotating pair of cells under the Loop Current region, and a cyclonic cell to the south of this pair. These dominant circulation features are represented in each of the ocean model simulations, although with some obvious differences. A striking difference between all the models and the observations is that the simulated deep eddy kinetic energy under the Loop Current region is generally less than one-half of that computed from observations. A multidecadal integration of one of these numerical simulations is used to evaluate the uncertainty of estimates of velocity statistics in the deep Gulf computed from limited-length (4 years) observational or model records. This analysis shows that the main deep circulation features identified from the observational studies appear to be robust and are not substantially impacted by variability on time scales longer than the observational records. Differences in strengths and structures of the circulation features are identified, however, and quantified through standard error analysis of the statistical estimates using the model solutions.
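For reference, the deep eddy kinetic energy compared here is conventionally computed from velocity fluctuations about the record-mean flow,

```latex
\mathrm{EKE} \;=\; \tfrac{1}{2}\,\overline{u'^{\,2} + v'^{\,2}},
\qquad u' = u - \overline{u},\;\; v' = v - \overline{v},
```

so the under-representation noted above concerns the variance of the deep currents rather than their mean.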

