3.2 Evaluate Fidelity of National Climate Models, with Actionable Feedback to National Model Development Centers
To effectively reap the benefits of improved understanding of days-to-decades predictability, we must apply our knowledge to the models used to produce guidance for operational forecasts. Building on the National Multi-Model Ensemble (NMME), which has been brought to the level of experimental real-time forecast production over the past two years, COLA will diagnose the mechanisms of intra-seasonal, seasonal, interannual, and decadal predictability in the individual NMME models as well as in the multi-model ensemble as a whole. One example is a more complete understanding of the origins of predictability rebound, such as has been discerned in late spring and summer predictions of surface temperature and rainfall, and of multi-year ENSO predictability, such as has been found in decadal prediction experiments.
A major component will be a rigorous evaluation of the fidelity of national climate models in terms of their representation of climate variability and skill in standard hindcast experiments. To further consider mechanisms in a probabilistic framework, COLA will expand its research to compare the relative merits of deterministically parameterized multi-model ensemble prediction, stochastically parameterized probabilistic prediction, and the super-parameterization method for representing clouds and convection in prediction mode. Furthermore, given the mounting evidence that climate models gain considerably by resolving mesoscale phenomena in the atmosphere and ocean, the experiments described herein will be undertaken with global models having mesoscale resolution, to the extent possible. Conventional-resolution models will be used to repeat the experiments at trivial relative computational expense.
One benefit of using the models developed at the U.S. national laboratories (NCAR – host of CCSM and CESM; NOAA NCEP – developer of CFSv2; NOAA GFDL – developer of CM2.1 and CM2.5; and NASA GMAO – developer of GEOS5), in addition to testing the model dependence of research findings, is that the strengths and weaknesses of the individual models can be assessed as uniformly as possible. The findings of such analysis can provide valuable feedback to the development centers; however, the findings must be “actionable”, i.e., formulated in a way that provides the development teams with concrete recommendations for focusing further development. COLA will assess the national models in such a way as to provide actionable feedback, to the extent possible, both in the context of the analysis of experiments described above and in several specific ways described below.
3.2.1 Multivariate Skill Assessment
We propose to extend the univariate framework of Goddard et al. (2012) to a multivariate framework by defining a generalized skill score that summarizes skill over many grid points, and then decomposing forecasts into an ordered set of components such that the first maximizes the generalized skill, the second maximizes generalized skill subject to being uncorrelated with the first, and so on. The decomposition procedure can be viewed as a modification of predictable component analysis (Schneider and Griffies, 1999; DelSole and Tippett, 2007), except that the procedure is modified to measure skill rather than predictability. The procedure can be generalized further to determine components that maximize the difference in skill between forecasts. The resulting procedure will be applied to forecasts from national models to provide an objective and comprehensive comparison of forecast skill. In particular, it will be used to evaluate CFSv2 hindcasts relative to other hindcasts, especially COLA's own variants of those hindcasts, as well as the CMIP5 hindcasts and NMME Phase II hindcasts. We believe that the ability to decompose forecast skill, and differences in skill, into spatial structures and associated time series will provide useful insights into the space-time structure of the forecast errors, with the potential to yield actionable feedback to national model development centers. We also propose to investigate measures of probabilistic forecast performance based on the minimum relative entropy method of Abramov and Majda (2003) and Abramov (2007), insofar as this method fits naturally within the predictability and information theory framework developed at COLA.
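The decomposition step can be illustrated with the closely related predictable component analysis cited above: components maximizing the ratio of signal (ensemble-mean) variance to total variance solve a generalized eigenvalue problem. A minimal sketch with synthetic data follows; the function and variable names are illustrative, not COLA's actual diagnostics code, and the skill-based variant would substitute hindcast-versus-observation statistics for the signal and total covariances.

```python
import numpy as np

def predictable_components(ens_mean, total, k=3):
    """Components maximizing signal-to-total variance ratio.

    ens_mean, total: arrays of shape (time, grid).
    Solves the generalized eigenproblem S w = lam T w by whitening
    with the Cholesky factor of T (T must be full rank).
    Returns the k leading variance ratios and their spatial weights.
    """
    S = np.cov(ens_mean, rowvar=False)            # "signal" covariance
    T = np.cov(total, rowvar=False)               # total covariance
    L = np.linalg.cholesky(T)                     # T = L L^T
    Linv = np.linalg.inv(L)
    vals, u = np.linalg.eigh(Linv @ S @ Linv.T)   # symmetric whitened problem
    order = np.argsort(vals)[::-1]                # ratios in descending order
    weights = Linv.T @ u                          # map back: w = L^{-T} u
    return vals[order][:k], weights[:, order[:k]]

# Synthetic demo: one highly predictable direction embedded in noise
rng = np.random.default_rng(0)
ntime, ngrid = 400, 6
pattern = np.ones(ngrid) / np.sqrt(ngrid)
signal = rng.standard_normal(ntime)[:, None] * pattern[None, :]
noise = 0.3 * rng.standard_normal((ntime, ngrid))
ratios, weights = predictable_components(signal, signal + noise)
```

In this construction the leading ratio is near 1 (the pattern direction is almost fully predictable) while the remaining ratios are near zero.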
3.2.2 Investigation of Alternative Physics Parameterizations
It has recently been shown that using a "super-parameterization" (SP), in which a cloud-resolving model is used in place of conventional parameterizations of convection, can produce more realistic variability in standard atmospheric models (Khairoutdinov 2005), and can produce major improvements in coupled atmosphere-ocean models (Stan et al. 2010; DeMott et al. 2011; DeMott et al. 2012; Stan et al. 2012). COLA has developed an SP version of the Community Climate System Model version 4 (CCSM4). We propose to generate a set of ensemble seasonal hindcasts from realistic initial conditions using the SP version of CCSM4. These hindcasts will parallel the ensemble seasonal hindcasts using the standard CCSM4 that COLA has generated as part of the NMME. The resulting hindcasts from the two coupled models, which differ only in the treatment of cloud-related processes, will be compared to quantify the impact of super-parameterization on predictability and skill at seasonal to interannual time scales.
To improve understanding of super-parameterization, we will compare heating profiles generated by the super-parameterization and by the default deterministic parameterization of CCSM4. This comparison will be facilitated by the fact that the SP-CCSM automatically calculates both profiles at every time step, but couples only one of them to the atmospheric model. Because the outputs of the two parameterizations are available simultaneously for the same large-scale atmospheric state at every time step, we can address questions that could not be addressed by running the two versions of CCSM4 separately, including: 1) can the SP be interpreted as a "stochastic" version of the deterministic parameterization? 2) in what ways do the parameterized heating profiles differ? 3) to what extent can the heating profile from one parameterization be predicted from the other? In addressing these questions, it is relevant to note that Lin and Neelin (2003) proposed a stochastic version of the Zhang-McFarlane (1995) convective parameterization in which either CAPE or the vertical heating profile is perturbed randomly. We will investigate whether the SP can constrain the statistics of the stochastic perturbations.
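Because both heating profiles are available at every time step, the statistics of an implied multiplicative perturbation can be estimated directly by projecting the SP profile onto the deterministic one. A minimal sketch of that estimate, using synthetic paired profiles (all names and the multiplicative-noise assumption are illustrative):

```python
import numpy as np

def perturbation_stats(q_sp, q_det):
    """Estimate multiplicative-perturbation statistics linking two sets
    of heating profiles, each of shape (time, level).

    For each time step, a_t is the least-squares scale factor such that
    q_sp ~ a_t * q_det; the mean and standard deviation of a_t summarize
    the implied stochastic perturbation.
    """
    a = np.sum(q_sp * q_det, axis=1) / np.sum(q_det * q_det, axis=1)
    return a.mean(), a.std()

# Synthetic check: SP profile = deterministic profile times (1 + eps)
rng = np.random.default_rng(1)
ntime, nlev = 2000, 30
q_det = rng.standard_normal((ntime, nlev)) + 2.0
eps = rng.normal(0.0, 0.2, size=ntime)
q_sp = (1.0 + eps)[:, None] * q_det
mean_a, std_a = perturbation_stats(q_sp, q_det)
```

With this construction the estimator recovers a mean amplitude near 1 and a perturbation standard deviation near the prescribed 0.2; applied to actual SP-CCSM output, departures from this simple multiplicative model would themselves be informative.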
It has also been shown recently that introducing partly stochastic parameterizations in coupled atmosphere-ocean models can enhance forecast skill (Palmer et al. 2005; Shutts 2005; Berner et al. 2008, 2009; Weisheimer et al. 2011). Evaluating the merit of stochastic parameterizations is challenging because it requires examining not only the impact on time-mean quantities, but also the impact on the probability distribution and time variability. COLA anticipates acquiring a set of high-resolution forecasts from the ECMWF IFS model with and without stochastic parameterizations. We propose to compare these forecasts to address basic questions about the impact of stochastic parameterizations on fidelity and skill. We further anticipate generating a large suite of hindcasts with stochastic parameterizations that can be compared to multi-model forecasts.
3.2.3 Understanding and Validation of Land-Atmosphere Processes
3.2.4 Multi-model Interactive Ensemble
We propose to continue to foster discussions within the climate modeling software engineering community to advance toward the vision of the recent NRC report, A National Strategy for Advancing Climate Modeling (NRC 2012). We also propose to build a prototype multi-model interactive ensemble (MMIE) within the flux coupler framework of CESM in order to establish proof of concept. For example, the effect of applying the IE method to different models can be tested by forming an interactive ensemble of multiple instantiations of the AGCM implemented with stochastic physics (see Sec. 3.2.4.2).
3.3 Toward the Next Generation Seamless System for Operational Climate Forecasting
3.3.1 Systematic Evaluation of the National Models
The systematic evaluation will be built on our ongoing evaluation of CFSv2 hindcasts made by NCEP and our own variants of those hindcasts, as well as CFSv2 long simulations. Inter-comparisons with other national and international (e.g., ECMWF) models, as well as the CMIP5 simulations/hindcasts and the NMME Phase II hindcast data set, will be the focus of this evaluation. The discussion below is organized by time scale.
3.3.1.1 Mean Climate and Annual Cycle
We will compare and contrast a large set of metrics of simulation fidelity, including deterministic measures (e.g., Taylor diagrams, RMSE) and probabilistic measures (RPSS, relative entropy) of the mean and of the annual cycle phase and amplitude, for the members of the NMME and selected CMIP5 models in comparison to each of several reanalysis data sets. The use of multiple reanalysis products allows for an assessment of the impact of observational uncertainty on the diagnosed fidelity. We will also take advantage of COLA's unique relationship as a research institution with ECMWF to assess CFSv2 in the context of another world-class seasonal forecast model: the ECMWF IFS/EPS. As a result of COLA's successful NCAR Advanced Scientific Discovery proposal, a large suite of seasonal-to-interannual hindcasts is being generated using EPS, and these runs will be available for comparison.
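The deterministic part of such a comparison reduces, for each field, to a few summary statistics: the pattern correlation, the ratio of standard deviations, and the centered RMSE that together locate a model on a Taylor diagram. A minimal sketch (illustrative only, not COLA's actual diagnostics code):

```python
import numpy as np

def taylor_stats(model, ref):
    """Statistics locating one model field on a Taylor diagram.

    model, ref: 1-D arrays (a field flattened over grid points).
    Returns (pattern correlation, std ratio, centered RMSE).
    """
    m = model - model.mean()
    r = ref - ref.mean()
    corr = np.sum(m * r) / np.sqrt(np.sum(m**2) * np.sum(r**2))
    std_ratio = model.std() / ref.std()
    crmse = np.sqrt(np.mean((m - r) ** 2))   # bias removed by centering
    return corr, std_ratio, crmse

# Toy fields: a "model" with damped amplitude and a mean offset
obs = np.sin(np.linspace(0, 4 * np.pi, 200))
model = 0.8 * obs + 0.1
corr, ratio, crmse = taylor_stats(model, obs)
```

These three numbers satisfy the Taylor-diagram identity crmse² = σm² + σr² − 2 σm σr ρ, which is why a single point on the diagram encodes all of them.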
3.3.1.2 Understanding the sources of climate model bias
3.3.1.2.1 Enhancing coastal upwelling and eddy divergence
An effective way to enhance coastal upwelling in coupled models is to simulate the direction and strength of the alongshore winds near the coast realistically, which requires accurate matching between the coastlines in the atmospheric and oceanic components. This can be accomplished by configuring coupled systems with relatively high and matching resolutions in the oceanic and atmospheric components. Resolving ocean eddies is also required to simulate the fluctuating mesoscale flow of the cold coherent eddies formed near the coast. Colbo and Weller (2006) argued that eddy flux divergence is the main process influencing the offshore temperature structure beyond the main upwelling zone, which lies within a few tens of kilometers of the coast, and that its effect may be stronger than that of the direct Ekman transport. To accurately simulate and examine the combined effects of coastal upwelling and mesoscale eddy divergence, we will configure a coupled system with both components at a matching resolution of 10 kilometers.
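The Ekman transport referred to here follows from the standard balance between alongshore wind stress and the Coriolis force, M = τ/(ρ f). A quick sketch of that scaling (the numerical values are illustrative):

```python
import numpy as np

OMEGA = 7.292e-5        # Earth's rotation rate (s^-1)
RHO_SEAWATER = 1025.0   # reference seawater density (kg m^-3)

def ekman_transport(tau, lat_deg):
    """Depth-integrated Ekman volume transport per unit width (m^2 s^-1)
    driven by wind stress tau (N m^-2) at latitude lat_deg."""
    f = 2.0 * OMEGA * np.sin(np.radians(lat_deg))   # Coriolis parameter
    return tau / (RHO_SEAWATER * f)

# Typical upwelling-favorable stress of 0.1 N m^-2 at 30 degrees latitude
transport = ekman_transport(0.1, 30.0)   # about 1.3 m^2 s^-1 per unit width
```

Because f grows toward the poles, the same stress drives a weaker transport at higher latitudes; the eddy flux divergence discussed above is what this simple balance omits.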
3.3.1.2.2 Reducing cloud error
We propose to further examine the effects of modifying the shallow convection and background diffusion parameters in the fully coupled CFSv2 and analyze (1) whether these parameters can help ameliorate biases in CFSv2, and (2) whether other parameters/processes should be adjusted accordingly. We will also examine the effects of other new parameterizations developed by the Climate Process Team (CPT) on the Stratocumulus to Cumulus Transition (REFERENCE) in the CFSv2 with readjustment of the other processes. We will also run some tests of the deep convection parameterization to determine if it is over-tuned to produce the observed easterly winds along the equator.
3.3.1.2.3 Reducing Equatorial Bias
3.3.1.2.4 Terrestrial water cycle drift
There are substantial biases in the representation of processes that control the terrestrial water cycle in CFSv2. In particular, the mean drifts toward greater precipitation over the tropical oceans, with shifts in the location of the ITCZ, and less precipitation practically everywhere else, especially over the continents. The ensemble spread of CFSv2 forecasts largely follows the mean drift with increases in spread where the mean precipitation increases and decreases elsewhere. The interannual variability of precipitation decreases markedly with lead-time, the bulk of the reduction occurring within the first month or two of the forecast.
NEED MORE --- PROPOSED WORK?
3.3.1.2.5 Artificial Damping by Numerics
Many choices made in the numerical schemes of climate system component models can have profound, unintended consequences. For example, in many AGCMs the time step would ideally be chosen to minimize computational expense without violating the CFL condition; in practice, to further reduce expense, an even larger time step is used, and strong damping is applied whenever the CFL condition is close to being violated. This happens preferentially in the upper atmosphere, where the wind speed can become very large. The resulting artificial damping in the high atmosphere can, because of the momentum balance in the atmosphere, also affect the surface and thereby the coupling to the ocean, land surface, and sea ice. It can also cause parameterizations to behave outside the range for which they are tuned. Reducing the atmospheric time step to avoid all CFL violations can reduce biases or expose biases that over-tuning has masked. COLA has initiated an investigation of this issue in CESM. We propose to conduct a systematic evaluation of the impact of time step choices in the national models.
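The trade-off described above can be made concrete with the one-dimensional advective CFL condition, u Δt/Δx ≤ C. A sketch with illustrative numbers (a 100 km grid and a strong stratospheric jet):

```python
def cfl_number(u, dt, dx):
    """Advective Courant number for wind speed u (m/s), time step dt (s),
    and grid spacing dx (m)."""
    return abs(u) * dt / dx

def max_stable_dt(u_max, dx, courant_limit=1.0):
    """Largest time step satisfying the CFL condition for the given
    maximum wind speed and grid spacing."""
    return courant_limit * dx / abs(u_max)

# A 30-minute step on a 100 km grid violates CFL once winds reach 150 m/s,
# which is when schemes of the kind described above apply extra damping.
dt = 1800.0
violates = cfl_number(150.0, dt, 100e3) > 1.0
```

The Courant number here is 2.7, so this configuration would trigger the damping; the CFL-safe step for these numbers is roughly 667 s, illustrating why avoiding all violations is computationally costly.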
3.3.1.3 Intraseasonal Variability and Predictability
The structure and statistics of significant intraseasonal fluctuations will be evaluated (vis-à-vis reanalyses) in seasonal re-forecasts from CFSv2 and other national models as well as the available seasonal re-forecasts from ECMWF. A multivariate three-dimensional analysis of storm tracks will include the seasonal mean configuration, the annual cycle (including boreal mid-winter suppression of the Pacific track), and the structure and amplitude of the leading modes of low-frequency storm-track variability (using the technique of envelope functions, as in Straus, 2010). The realism of the low frequency variability in mid-latitudes will be assessed via circulation regime analysis (Straus et al. 2007). Relationships between storm track variability and regime occurrence will be assessed, as well as the connection of both regimes and storm track anomalies with the occurrence of blocking.
Diagnosing the role of tropical heating in causing extratropical variability requires analysis of the downstream changes in blocking, teleconnection patterns, and storm tracks following tropical heating events, particularly those associated with the MJO. This statistical analysis will be supplemented by experiments that add realistic MJO tropical heating in short-range forecasts. A statistical analysis of the mid-latitude variability preceding strong tropical heating events in models and analyses will indicate potential predictability that may not be realized currently.
3.3.1.4 Seasonal Variability and Predictability: Tropical Heating and ENSO
The structure of tropical diabatic heating will be evaluated in ensembles of boreal winter seasonal forecasts for ENSO events and non-ENSO years, by comparison to heating computed from the thermodynamic budget of ERA-Interim reanalysis, as well as to the more limited heating obtained from TRMM and YOTC. The Rossby wave source and mid-latitude response (mean flow, storm track shifts, blocking changes) of ENSO events will also be evaluated, using the ensemble to generate pdfs.
3.3.1.5 Interannual variability – ENSO, IOD, and Atlantic Ocean
3.3.1.5.1 Evaluation of ENSO Prediction
Recent COLA studies (Zhu et al. 2012a) indicate that SST prediction skill varies substantially depending on which ocean analysis is used for initialization, and that ensemble ocean initialization has the potential to enhance current prediction skill. We propose to carefully examine predictive skill and predictability for upper-ocean heat content anomalies (HCA), a more critical measure of the capability of climate forecast systems. We will extend our analysis to include HCA as well as other relevant variables, such as precipitation, SLP, and winds at 850 and 200 hPa.
Further studies are proposed to examine the effect of substantially perturbing ocean initial conditions on probabilistic forecasting (e.g., prediction reliability) by comparing our hindcasts initialized with multiple ocean analyses against the NCEP CFSv2 and ECMWF S4 hindcasts. We will further compare the CFSv2 hindcasts initialized with multiple ocean analyses with the NMME hindcasts to identify the contribution of the spread in ocean initial conditions to the NMME spread.
To evaluate the prediction skill of current systems (such as CFSv2, ECMWF ORA-S4, and the NMME), we will compute traditional metrics (anomaly correlation and RMSE), and we will carefully examine predictions of each major ENSO event through synoptic-style case studies. The emphasis is on identifying useful precursors of successful predictions in different hindcasts and on analyzing the crucial factors that lead to a successful or unsuccessful hindcast for a given event. We expect to determine whether prediction skill differs among events and whether any difference is model-dependent. In addition, recent studies have recognized two different types of ENSO: the warm pool type and the cold tongue type (Kim and Yu, 2012; and references therein). We will also examine whether prediction skill differs between the two types of ENSO, and whether that conclusion holds across the three prediction systems.
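The traditional metrics mentioned here are usually summarized per lead time. A minimal sketch of anomaly correlation and RMSE as a function of lead for an index such as Niño-3.4, applied to synthetic hindcasts (array names and the noise model are illustrative):

```python
import numpy as np

def skill_by_lead(hindcasts, obs):
    """Anomaly correlation and RMSE per forecast lead.

    hindcasts: (n_starts, n_leads) predicted index values
    obs:       (n_starts, n_leads) verifying observed values
    """
    f = hindcasts - hindcasts.mean(axis=0)   # remove lead-dependent model drift
    o = obs - obs.mean(axis=0)
    acc = np.sum(f * o, axis=0) / np.sqrt(
        np.sum(f**2, axis=0) * np.sum(o**2, axis=0))
    rmse = np.sqrt(np.mean((hindcasts - obs) ** 2, axis=0))
    return acc, rmse

# Synthetic demo: forecast noise grows with lead, so skill degrades
rng = np.random.default_rng(2)
n_starts, n_leads = 300, 9
truth = rng.standard_normal((n_starts, 1)) * np.ones((1, n_leads))
noise = rng.standard_normal((n_starts, n_leads)) * np.linspace(0.2, 2.0, n_leads)
acc, rmse = skill_by_lead(truth + noise, truth)
```

Removing the lead-dependent mean before correlating is the step that makes this an anomaly correlation; skipping it would conflate drift with skill.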
3.3.1.5.2 Exploring Indian Ocean predictability and its implication for the Asian Monsoon
Our focus will be to broaden SST prediction in the Indian Ocean from the IOD index to this more dynamically meaningful pattern. Using the hindcast data, we propose to examine the predictive skill for the SWIO HCA and SSTA, in addition to the equatorial SSTA near the Indonesian coast. The delayed relationship between the eastern and southwestern Indian Ocean SSTA should be taken into account. We will explore how precursors induced by ENSO and northwestern Pacific fluctuations contribute to potential predictability in the Indian Ocean. We will also examine the implications of the high predictability of SWIO HCA and SSTA for regional climate, such as precipitation in East Africa, and for Asian monsoon prediction, including the tropical western North Pacific region.
3.3.1.5.3 Potential Predictability of Tropical Atlantic Variability
It is important to measure the predictive skill for TAV in current operational prediction systems, such as CFSv2, ECMWF ORA-S4, and the NMME Phase II hindcasts, in more sophisticated ways. In particular, following a previous study (Hu and Huang 2007), we will compute the most predictable pattern in the tropical Atlantic using an EOF technique that maximizes the signal-to-noise ratio. Second, we will examine the difference in predictability between the two distinct modes that co-exist in the tropical Atlantic: the equatorial zonal mode and the meridional gradient mode. In addition to SSTA, we will also focus on HCA variability, extending our previous and current studies (e.g., Huang and Shukla 1997; Zhu et al. 2012b,c) using hindcast datasets, as well as their combinations.
In addition to statistical analysis, we will also examine the models' predictive skill for the major events of the past few decades, conducting case-by-case synoptic and diagnostic analyses of how well those events are predicted by current prediction systems. We will also examine whether predictability depends on time scale, since the tropical Atlantic exhibits variability on multiple time scales (Huang and Shukla 1997; Zhu et al. 2012c).
3.3.1.6 COLA Monsoon Forecast
(a) We will analyze long coupled model runs to identify the SST-Monsoon relation, and test whether this relation changes over time. NEED MORE DETAIL!
(b) DelSole and Shukla (2012) showed that while the statistical relation between SST and the Indian monsoon during the past half century is too weak to justify empirical predictions based on observations, the relation between model-predicted SST and the Indian monsoon is sufficiently strong to justify empirical predictions based on model output. We propose to develop a real-time forecast of the Indian monsoon based on model-predicted NINO3.4 indices. To this end, we will acquire a set of model-predicted NINO3.4 indices, such as those compiled routinely by the International Research Institute for Climate and Society (IRI) or produced by the National Multi-Model Ensemble (NMME), and use these as the basis for Indian monsoon forecasts. The real-time forecast of the Indian monsoon will be available shortly after the IRI or NMME NINO3.4 forecasts are published.
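In its simplest form, the empirical forecast described by DelSole and Shukla (2012) amounts to a linear regression of seasonal monsoon rainfall anomalies on a model-predicted NINO3.4 index. A hedged sketch with synthetic training data (the coefficients, data, and function names are illustrative, not the actual operational scheme):

```python
import numpy as np

def fit_monsoon_regression(nino34, rain):
    """Least-squares fit of rain ~ a * nino34 + b over training years."""
    a, b = np.polyfit(nino34, rain, 1)
    return a, b

def predict_monsoon(nino34_forecast, a, b):
    """Empirical monsoon rainfall anomaly forecast from a predicted index."""
    return a * nino34_forecast + b

# Synthetic training record: warm ENSO (positive NINO3.4) -> weak monsoon
rng = np.random.default_rng(3)
nino = rng.standard_normal(60)
rain = -0.6 * nino + 0.3 * rng.standard_normal(60)
a, b = fit_monsoon_regression(nino, rain)
forecast = predict_monsoon(1.5, a, b)   # a moderate El Nino forecast
```

The key point of the cited result is that the predictor here is the *model-predicted* index, not the observed one, which is what makes the regression relationship strong enough to be useful.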
3.3.1.7 New methodologies to compare models
TIM WILL PROVIDE TEXT.
3.3.1.8 Predictability of Regional Climate: Scientific Basis for Adaptation Strategies
We will use a coupled climate model whose atmospheric and land surface resolution is sufficient to accurately represent mesoscale features and circulations (nominally 0.25° grid spacing), and whose ocean and sea ice resolution is sufficient to accurately represent mesoscale eddies and sea ice features (nominally 0.10° grid spacing), to make experimental decadal climate predictions. Because such models include the full depth of the ocean but are very computationally expensive to integrate to equilibrium, we will use a long run of the same model at lower resolution, which was brought into equilibrium with pre-industrial external forcing and then integrated forward to the present with the observed time history of external forcing, as a source of initial conditions for the mesoscale-resolving coupled model. This approach was employed with some success by Kirtman et al. (2012). The predictions will be repeated with the same lower-resolution version of the model in order to assess the impact of resolving the mesoscales.
3.3.2 Practical Methods of Initializing High-Resolution Coupled Models
One potential drawback of using a currently available lower-resolution ocean data assimilation (ODA) analysis to initialize an eddy-resolving, high-resolution ocean model for climate prediction is that such an ODA product, though reasonably realistic in describing the large-scale oceanic circulation and basin-wide variability, does not resolve the tight gradients across the western boundary currents and near major upwelling zones, nor the current meanderings and mesoscale eddies, which may be the key features responsible for the added value of high-resolution prediction systems. Therefore, simple interpolation may not be the optimal way to initialize the high-resolution ocean model and might produce spurious, misleading results. Preliminary attempts to assimilate ocean observations using (uncoupled) eddy-resolving ocean general circulation models have met with some success (e.g., Metzer et al. 2012), but the process is computationally intensive and time consuming, and it may have limited effectiveness because the limited subsurface measurements do not adequately constrain the initial conditions.
Since satellite observations are generally quite uniform in spatial-temporal coverage, both SST and SSH are available as gridded fields at ¼-degree resolution at regular time intervals, and they may be used to enhance a low-resolution ODA analysis through a more economical "re-assimilation" approach within the coupled framework. Specifically, we will employ the nudging method described in Pohlmann et al. (2009), originally designed to produce more balanced initial conditions for decadal prediction, to systematically assimilate both the available ODA and the satellite fields into the high-resolution coupled system. For example, for each forecast initial time, we could start the coupled model from a simply-interpolated 1-degree ODA (or ¼-degree ODA, such as SODA) and run it for, say, 2 years (an upper bound on the lifetime of most ocean eddies) with additional nudging terms: large-scale ODA-based 3D temperature and salinity, high-resolution satellite-based SST analyses, synthetic upper-ocean temperature profiles based on altimeter-based estimates of SSH (e.g., Pinardi 1995; Fischer and Latif 1995), and observation-based high-resolution surface fluxes (e.g., from ERS-1 and -2, QuikSCAT, and ASCAT).
In addition to introducing mesoscale information, the coupled assimilation run should also produce more balanced initial states that are consistent with the eddy-resolving ocean model physics. More sophisticated assimilation approaches, such as an ensemble Kalman filter, may also be tested in place of the simple nudging. At the end of the nudging period, we would start each forecast with no additional input of observational data. We would make two sets of hindcasts, one with simply-interpolated initial conditions and one with the nudged initial conditions, to measure the sensitivity of the forecasts to the additional information provided by the nudging procedure. New land surface initialization methods described in Sec. 3.1.2.4 will also be explored in this context.
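Each of the nudging ("Newtonian relaxation") terms described above adds a tendency that relaxes a model field toward the corresponding analysis with a chosen e-folding time scale τ. A minimal sketch of one such term in isolation (the names, time step, and 10-day time scale are illustrative; in the coupled system this tendency would be added on top of the model physics):

```python
def nudging_step(state, analysis, dt, tau):
    """One forward-Euler step of the relaxation
    d(state)/dt = -(state - analysis) / tau."""
    return state + dt * (analysis - state) / tau

# Relax an interpolated field toward a higher-resolution analysis value
dt = 3600.0            # model time step: 1 hour
tau = 10 * 86400.0     # 10-day relaxation time scale
state, analysis = 2.0, 0.0
for _ in range(int(30 * 86400 / dt)):   # integrate for 30 days
    state = nudging_step(state, analysis, dt, tau)
```

After 30 days the initial 2.0 departure has decayed by roughly e⁻³, i.e., to about 0.1; choosing τ sets how strongly the run is constrained by the analyses versus free to develop its own eddy field.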
3.3.3 Define R&D Pathway and Address Operational R2O and O2R Issues
INTRODUCTORY TEXT
Land: collab with Ek, Lawrence, Koster, Milly.
3.4 BROADER IMPACTS AND SERVICE TO THE NATION – 2 pp
By virtue of COLA’s unique multi-agency support framework, COLA has the capacity to contribute substantially beyond the basic and applied research described above. In particular, we propose to continue to reach out to the climate research and climate stakeholder communities in several ways.
3.4.1 National Multi-Model Ensemble
As described above, COLA is deeply involved in the National Multi-Model Ensemble (NMME), providing real-time experimental forecasts to the NMME, analyzing the results of NMME re-forecasts, and running hypothesis-testing experiments with several of the NMME models. We propose to help lead and coordinate NMME activities, in collaboration with the Climate Prediction Testbed (CTB) at NOAA CPC and the Modeling, Analysis, Predictions and Projections (MAPP) program at the NOAA Climate Program Office (CPO). By maintaining an open line of communication with each of the national modeling centers, COLA can facilitate coordination in a more sustained and effective way than has been possible in the past. (MORE?)
We also propose to establish a National Climate Models Information (NCMI) web page that enhances the information available to both sophisticated and casual users of the NMME forecast information. This web page will include links to existing NMME resources such as the current forecast products provided by the NOAA CPC, and the data dissemination site currently being developed at NCAR. The web page will also provide analysis and commentary, both by COLA scientists and by contributors via social media, on the accuracy, reliability, and utility of NMME forecasts. The work will include evaluation of the effectiveness of communications (see Sec. 3.4.2).
3.4.2 The Institute of Global Environment and Society at George Mason University
The Board of Visitors of George Mason University has established a new Institute of Global Environment and Society (INSERT BLURB FROM IGES PROPOSAL TO BOV). The Institute will integrate across several climate-related educational and research activities at Mason, including the Center for Climate Change Communication, the Center for Biodiversity and Climate, and (OTHERS). For example, COLA will work closely with the C4 center to evaluate the effectiveness of the NCMI web page. (MORE?)
3.4.3 Educating the Next Generation of Climate Scientists
COLA helped establish a very successful Ph.D. program in Climate Dynamics at Mason. To date, the program has graduated 25 doctoral students, all of whom have gone on to careers in major research labs, agencies, or universities. There are currently 18 students enrolled in the program. This project will support 5 full-time Ph.D. students directly, and the Climate Dynamics program will leverage COLA research assets (computers, data sets, staff expertise, etc.) in order to more effectively train the next generation of climate modelers and analysts.
3.4.4 Further development of and support for GrADS
The Grid Analysis and Display System (GrADS) (INSERT BOILERPLATE FROM ELSEWHERE). We propose to design and implement several new data analysis capabilities, such as linear algebra routines (leveraging the R programming language) and sorting, as part of a new defop command that will provide significantly more powerful and flexible function-definition capability to users. We will also design and implement a new data model for quasi-regular grids (e.g., swaths, icosahedral grids, etc.) that will substantially expand the reach of GrADS analysis and display functionality to data sets not currently supported. We propose to design and implement a multi-threaded version of GrADS for higher performance on high-resolution grids. Finally, we will explore possibilities for creating GrADS-in-the-Cloud. Our vision is to enable mobile computing platforms, which are becoming ubiquitous and are transforming the way people communicate and interact with information resources, for geoscience data analysis. COLA already runs a cloud, namely a data cluster accessed via the internet; interactive analysis and display is accomplished using GrADS via the X Window System protocol, which is the most general mechanism currently available. However, X Windows is becoming strictly an open-source effort, and many new platforms do not support it. The best alternative is the HTTP protocol, which is in very wide use and has a core of portable standards at the application level (JavaScript, the Document Object Model, HTML5). We propose to develop a GrADS implementation that works well via the HTTP protocol, including scripting (likely using an existing HTTP-based text editor).
Much of the proposed work will intentionally leverage the GrADS open source development model to entrain the broader development community. We will explore possibilities of adopting a plug-in architecture to enable these developments.