3. CCM3.6 Internals


3.2 Model Code Flow

This section provides a narrative of the code flow through the initialization phase and the driving computational loops.  Refer to the calling tree in "Appendix B: CCM3.6 Calling Tree". The user may also wish to review "Overview of CCM3.6" before reading this section. Details of the physical parameterizations are not presented here -- see Kiehl et al. (1996).

3.2.1 Initialization

The first tasks performed are defining the logical units used for I/O (subroutine lunits), presetting namelist variables (subroutines preset, lsmctli), reading in the Fortran namelist data (subroutines data, lsmctli), and setting pointers and lengths associated with the main model buffer (subroutine points).
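The preset-then-read namelist pattern referred to above works roughly as in the following sketch. The group name (ccmexp) and the two variables shown are only an illustrative fragment, not the model's full namelist:

    program namelist_sketch
      implicit none
      integer :: iradlw, iradae
      namelist /ccmexp/ iradlw, iradae   ! fragment of a namelist group

      ! "preset" step: give each namelist variable a default before the read
      iradlw = 1
      iradae = 12

      ! "read" step: pick up any user overrides from standard input (unit 5)
      read (5, nml=ccmexp)

      write (*,*) 'iradlw =', iradlw, '  iradae =', iradae
    end program namelist_sketch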

From this point on, initialization takes one of two paths. Initial runs are set up in subroutine inital, while continuation runs are initialized in subroutine resume. These two paths are roughly parallel in nature and call many of the same routines. The primary difference is that an initial run reads the initial dataset and spectrally truncates the prognostic fields (see inidat, below), whereas a continuation run obtains these fields from the appropriate restart files.

The primary job of subroutine inidat is to read the required fields from the initial dataset and spectrally transform the surface pressure (PS), wind (U,V), temperature (T), and surface geopotential fields (PHIS). Other required fields are the surface type flag (ORO), the sea ice layer temperatures (TS1 through TS4), and the moisture field (Q3). If the slab ocean model is enabled, snow depth (SNOWH) and sea ice thickness (SICTHK) are also read in from the initial dataset.

The amount of local (stack-based) data used in inidat is quite large. Three-dimensional arrays exist for vorticity, divergence, and the work array required by the FFT (Fast Fourier Transform) package. The need for three-dimensional arrays arises partly because the model can start from an initial dataset randomly ordered in latitude. Because local memory is allocated on the stack (see "Shared-Memory Management"), the large local database does not increase total memory utilization: the local workspace in inidat disappears before the time integration begins.
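The stack behavior described above can be illustrated with a minimal sketch; the routine and array names below are placeholders, not the actual inidat workspace:

    program stack_sketch
      implicit none
      call setup (64, 128, 18)
      ! ... time integration would begin here; "work" no longer exists ...
    contains
      subroutine setup (nlon, nlat, nlev)
        integer, intent(in) :: nlon, nlat, nlev
        real :: work(nlon, nlat, nlev)   ! automatic array, allocated on the stack
        work = 0.0                       ! stands in for the large local workspace
      end subroutine setup
    end program stack_sketch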

3.2.2 Time Integration

Subroutine stepon controls time integration. After the initialization phase is complete for either an initial run or a continuation run, stepon loops through time until the simulation is complete, incrementing variable nstep after each timestep. If the run terminates normally, subroutine scan1bc is the last routine called by stepon, as well as the first one called by stepon upon restart. scan1bc is called at the end of a run to write the fully time-filtered data to the history file, completing a time accumulation period.
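Schematically, the control flow of stepon looks like the sketch below. The stopping test, the stub routines, and the end-of-run value are placeholders for the actual logic:

    program stepon_sketch
      implicit none
      integer :: nstep, nestep
      nstep  = 0
      nestep = 10                 ! illustrative end-of-run step count
      do while (nstep < nestep)
         call scan1bc_stub ()     ! "before coupling" latitude scan
         call scan1ac_stub ()     ! "after coupling" latitude scan
         nstep = nstep + 1        ! advance the timestep counter
      end do
      call scan1bc_stub ()        ! final call writes fully time-filtered data
    contains
      subroutine scan1bc_stub ()
      end subroutine scan1bc_stub
      subroutine scan1ac_stub ()
      end subroutine scan1ac_stub
    end program stepon_sketch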

Subroutine advnce, called from scan1bc each timestep, updates current time information (calendr) and time-variant boundary dataset information (subroutines sstint or somint, and oznint).

Current simulation time information is computed in calendr. A "year" is defined as 365 days; there are no leap years.
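The 365-day calendar arithmetic can be pictured as follows; the variable names, the timestep length, and the omission of a base date are simplifications, not the calendr interface:

    program calendar_sketch
      implicit none
      integer, parameter :: days_per_year = 365   ! no leap years
      integer :: nstep
      real    :: dtime, elapsed_days, calday
      nstep = 5000                 ! illustrative timestep count
      dtime = 1200.0               ! illustrative timestep length (seconds)
      elapsed_days = nstep * dtime / 86400.0
      calday = mod (elapsed_days, real (days_per_year)) + 1.0
      write (*,'(a,f8.3)') ' calendar day of year: ', calday
    end program calendar_sketch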

3.2.2.1 Latitude Scans

There are two multitasked Gaussian latitude scan loops: one in routine scan1bc and another in scan1ac. These were combined in one routine, scan1, in earlier versions of the Community Climate Model. The routine was split into two pieces in CCM3 to accommodate coupling to other geophysical models, such as ocean and sea ice models, through the Climate System Model (CSM) flux coupler (Bryan et al. 1996). The "bc" in scan1bc refers to "before coupling", and the "ac" in scan1ac to "after coupling". Adjustment physics, cloud calculations, and radiative transfer routines are called on the bc side; other physical parameterizations are called on the ac side. If coupled to other models, surface fluxes are exchanged between the calls to scan1bc and scan1ac.

Subroutine lsmdrv drives the Land Surface Model calculations (Bonan, 1996). These calculations are not multitasked by latitude band; instead, the land points are treated as one long vector, which is divided into several subsections of nearly equal length. Computations associated with each of these subsections are then distributed among the available processors. Refer to the LSM User's Guide for more details.
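The division of one long vector into subsections of nearly equal length can be done as in the sketch below; the point count and number of pieces are arbitrary examples, and the actual LSM decomposition may differ in detail:

    program chunk_sketch
      implicit none
      integer, parameter :: npts = 1003, npieces = 4
      integer :: i, base, extra, ibeg, iend
      base  = npts / npieces            ! minimum points per subsection
      extra = mod (npts, npieces)       ! leftover points, one extra per early chunk
      iend  = 0
      do i = 1, npieces
         ibeg = iend + 1
         iend = ibeg + base - 1
         if (i <= extra) iend = iend + 1
         write (*,'(a,i2,a,i5,a,i5)') ' piece ', i, ': ', ibeg, ' to ', iend
      end do
    end program chunk_sketch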

Subroutine somoce drives the slab ocean model calculations if the SOM option is enabled. The cpp token COUP_SOM must be set with a #define in misc.h.
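Enabling the option therefore amounts to adding one line to misc.h:

    #define COUP_SOM

Code selected by the token is then compiled in via a cpp guard, schematically (arguments omitted in this sketch, which is not the actual source):

    #if ( defined COUP_SOM )
          call somoce           ! slab ocean model driver
    #endif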

Subroutine scanslt drives the semi-Lagrangian transport (SLT) calculations. This is the only latitude scan in which the full dimensionality of the "extended grid" (plond and platd) is required. In addition to the physical data dimensions (plon and plat), the extended grid includes additional array storage before the start and after the end of the physical data in both the longitudinal and latitudinal dimensions. The extended grid must be initialized at each timestep; this is done in sltini. In scanslt, the wind field at time level "n" is used to predict the evolution of the moisture and constituent fields from time level n-1 to time level n+1 using a semi-Lagrangian transport scheme (Williamson and Rasch, 1989). No I/O is done in scanslt, since the prognostic arrays u3, v3, wfld, and q3 are entirely in-core. The local database of subroutine scanslt contains numerous multi-dimensional arrays. Since many of these arrays include a constituent index, memory use can increase dramatically as the number of transported constituents is increased.
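The relationship between the physical and extended dimensions can be pictured with declarations like those below. The pad widths and the formulas relating plond and platd to plon and plat are illustrative assumptions, not the model's actual parameter definitions:

    program extgrid_sketch
      implicit none
      integer, parameter :: plon  = 128, plat   = 64   ! physical grid (example)
      integer, parameter :: nxpt  = 1,   jintmx = 2    ! illustrative pad widths
      integer, parameter :: plond = plon + 1 + 2*nxpt          ! extended longitude
      integer, parameter :: platd = plat + 2*nxpt + 2*jintmx   ! extended latitude
      real :: qext(plond, platd)      ! field with room for the extensions
      qext = 0.0
      write (*,*) 'physical:', plon, plat, '   extended:', plond, platd
    end program extgrid_sketch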

Routine sltini fills the extensions of certain arrays used by the SLT package in the longitudinal and latitudinal dimensions. Only a small amount of computational work is performed in sltini, but the routine is multitasked over latitude bands since it is called every timestep. 

Subroutine scan2 primarily converts spectral-space prognostic variables back to gridpoint space and computes global integrals of dry and moist mass. The routine multitasks over latitude pairs rather than individual latitudes. The last step in scan2 toggles the time indices n3 and n3m1 after the conversion back to gridpoint space; no copying of the actual data is necessary. Prognostic data that were referenced by the current time index (n3) become the previous time level (n3m1), while the data just computed in time index n3m1 are referenced by the "next" time index (n3) at the start of the next iteration in stepon.
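The index toggle can be illustrated with a minimal sketch in which the two time levels occupy one extra array dimension and only the integer indices are swapped:

    program toggle_sketch
      implicit none
      integer :: n3, n3m1, itmp, istep
      real    :: t(4, 2)               ! tiny stand-in for a prognostic field
      n3m1 = 1
      n3   = 2
      t    = 0.0
      do istep = 1, 3
         t(:, n3m1) = t(:, n3) + 1.0   ! "new" values written into slot n3m1
         itmp = n3m1                   ! swap the indices, not the data
         n3m1 = n3
         n3   = itmp
      end do
      write (*,*) 'current-time values: ', t(:, n3)
    end program toggle_sketch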

3.2.2.2 Spectral Space Computations

Gaussian quadrature (subroutine quad), completion of the semi-implicit timestep (subroutine tstep), and horizontal diffusion calculations (subroutine hordif) are all accomplished within multitasked loops in dyndrv. The quadrature and semi-implicit timestep computations are parallelized over total wavenumber "n" when the cpp token PVP is defined, and over Fourier wavenumber "m" otherwise. Horizontal diffusion calculations are parallelized over the vertical index "k".
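The dependence on the PVP token can be sketched as below (a file that is run through cpp before compilation). The loop bounds, the loop bodies, and the quad_piece routine are placeholders, not the model's spectral code:

    program pvp_sketch
      implicit none
      integer, parameter :: pmmax = 43, pnmax = 43   ! illustrative wavenumber limits
      integer :: m, n
    #if ( defined PVP )
      do n = 1, pnmax          ! work distributed over total wavenumber "n"
         call quad_piece (n)
      end do
    #else
      do m = 1, pmmax          ! work distributed over Fourier wavenumber "m"
         call quad_piece (m)
      end do
    #endif
    contains
      subroutine quad_piece (iwave)
        integer, intent(in) :: iwave
      end subroutine quad_piece
    end program pvp_sketch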

3.2.2.3 Computational Driving Routines for Each Latitude Band

Routines linemsbc and linemsac sit beneath the multitasked latitude loops in scan1bc and scan1ac, respectively. These routines allocate memory for the main model buffers and the history buffer, apply the SLT "fixer" and the time filter, and call the adjustment physics and tendency physics driving routines.

Routine aphys applies adjustments directly to the prognostic variables rather than computing tendencies. These adjustments include dry adiabatic adjustment, moist convection, and large-scale condensation.

Routines tphysbc and tphysac (called from linemsbc and linemsac, respectively) call the various physical parameterizations that are applied as tendencies. These include cloud computations, radiation, surface temperature and flux calculations, the planetary boundary layer scheme, vertical diffusion, and gravity wave drag. Cloud calculations are needed internally only by the radiation code. Since radiative computations are normally not performed on every timestep, cloud calculations are performed only on timesteps on which the radiative transfer code is exercised.
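The "clouds only when radiation runs" logic amounts to a timestep test of the following form; the frequency variable and the test shown are illustrative stand-ins, not the model's actual control logic:

    program radstep_sketch
      implicit none
      integer :: nstep, iradsw
      logical :: dosw
      iradsw = 3                          ! radiation every 3 timesteps (example)
      do nstep = 0, 9
         dosw = (mod (nstep, iradsw) == 0)
         if (dosw) then
            ! cloud calculations, then radiative transfer, would go here
            write (*,*) 'clouds and radiation computed at nstep =', nstep
         end if
      end do
    end program radstep_sketch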

Routine radctl controls both the longwave and shortwave radiative transfer code. While the rest of the model uses mks units, cgs units are used throughout the radiation computations. The timesteps on which the longwave absorptivity and emissivity calculations are performed are controlled by namelist variable IRADAE. Since absorptivities and emissivities can only be calculated on a longwave radiation timestep, IRADAE must be an even multiple of IRADLW. If IRADAE is greater than IRADLW, absorptivity and emissivity (a/e) values are stored in a buffer and read into memory on radiation timesteps on which they are not recalculated. It is recommended that the history file write frequency, NHTFRQ(1), be an even multiple of IRADAE so that the absorptivities and emissivities need not be written to the restart dataset.
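These constraints amount to simple divisibility checks, sketched below with illustrative values (this is only a standalone illustration, not the model's validation code):

    program radfreq_check
      implicit none
      integer :: iradlw, iradae, nhtfrq1
      iradlw  = 1      ! longwave radiation every timestep (example)
      iradae  = 12     ! absorptivity/emissivity every 12 timesteps (example)
      nhtfrq1 = 24     ! primary history file written every 24 timesteps (example)
      if (mod (iradae, iradlw) /= 0) then
         write (*,*) 'error: IRADAE is not a multiple of IRADLW'
      end if
      if (mod (nhtfrq1, iradae) /= 0) then
         write (*,*) 'note: NHTFRQ(1) is not a multiple of IRADAE; a/e values ', &
                     'will have to be written to the restart dataset'
      end if
    end program radfreq_check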

Routine sltb1 drives the SLT constituent forecast calculations. Each call to this routine fills one latitude line of the three-dimensional array qfcst. For more detail on the SLT algorithms, refer to Kiehl et al. (1996).


