Monday, November 22, 2010

Building CAM again

Trying to build CAM without reference to my notes.

Haven't looked at this for two years.

Unzipped and untarred the distribution.

Here is the directory structure out of the box:


./CAM3Bld/cam1/
./CAM3Bld/cam1/models/
./CAM3Bld/cam1/models/atm/
./CAM3Bld/cam1/models/lnd/
./CAM3Bld/cam1/models/ice/
./CAM3Bld/cam1/models/csm_share/
./CAM3Bld/cam1/models/utils/
./CAM3Bld/cam1/models/atm/cam/
./CAM3Bld/cam1/models/atm/cam/bld/
./CAM3Bld/cam1/models/atm/cam/src/
./CAM3Bld/cam1/models/atm/cam/test/
./CAM3Bld/cam1/models/atm/cam/tools/
./CAM3Bld/cam1/models/atm/cam/bld/XML/
./CAM3Bld/cam1/models/atm/cam/bld/XML/Lite/
./CAM3Bld/cam1/models/atm/cam/bld/XML/man3/
./CAM3Bld/cam1/models/atm/cam/src/dynamics/
./CAM3Bld/cam1/models/atm/cam/src/physics/
./CAM3Bld/cam1/models/atm/cam/src/control/
./CAM3Bld/cam1/models/atm/cam/src/ocnsice/
./CAM3Bld/cam1/models/atm/cam/src/advection/
./CAM3Bld/cam1/models/atm/cam/src/utils/
./CAM3Bld/cam1/models/atm/cam/src/dynamics/eul/
./CAM3Bld/cam1/models/atm/cam/src/dynamics/sld/
./CAM3Bld/cam1/models/atm/cam/src/dynamics/fv/
./CAM3Bld/cam1/models/atm/cam/src/physics/cam1/
./CAM3Bld/cam1/models/atm/cam/src/ocnsice/dom/
./CAM3Bld/cam1/models/atm/cam/src/ocnsice/som/
./CAM3Bld/cam1/models/atm/cam/src/advection/slt/
./CAM3Bld/cam1/models/atm/cam/tools/definehires/
./CAM3Bld/cam1/models/atm/cam/tools/definemld/
./CAM3Bld/cam1/models/atm/cam/tools/icesst/
./CAM3Bld/cam1/models/atm/cam/tools/interpaerosols/
./CAM3Bld/cam1/models/atm/cam/tools/mkrgridnew/
./CAM3Bld/cam1/models/atm/cam/tools/scam/
./CAM3Bld/cam1/models/atm/cam/tools/interpic/
./CAM3Bld/cam1/models/atm/cam/tools/definesurf/
./CAM3Bld/cam1/models/atm/cam/tools/mkrgrid/
./CAM3Bld/cam1/models/atm/cam/tools/mkrgridsst/
./CAM3Bld/cam1/models/atm/cam/tools/cprnc/
./CAM3Bld/cam1/models/atm/cam/tools/defineqflux/
./CAM3Bld/cam1/models/atm/cam/tools/icesst/bcgen/
./CAM3Bld/cam1/models/atm/cam/tools/icesst/regrid/
./CAM3Bld/cam1/models/atm/cam/tools/scam/data/
./CAM3Bld/cam1/models/atm/cam/tools/scam/html/
./CAM3Bld/cam1/models/atm/cam/tools/scam/mymods/
./CAM3Bld/cam1/models/atm/cam/tools/scam/obj/
./CAM3Bld/cam1/models/atm/cam/tools/scam/scm_init/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/
./CAM3Bld/cam1/models/atm/cam/tools/scam/ui/
./CAM3Bld/cam1/models/atm/cam/tools/scam/userdata/
./CAM3Bld/cam1/models/atm/cam/tools/scam/html/gif/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/ccm2iop/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/diurnal_ave/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/intercomparison-post-processing/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/ncadd/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/ncmult/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/nctrans/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/pdf/
./CAM3Bld/cam1/models/atm/cam/tools/scam/tools/sdev/
./CAM3Bld/cam1/models/atm/cam/tools/scam/ui/images/
./CAM3Bld/cam1/models/atm/cam/tools/scam/ui/ncarg/
./CAM3Bld/cam1/models/lnd/clm2/
./CAM3Bld/cam1/models/lnd/clm2/src/
./CAM3Bld/cam1/models/lnd/clm2/src/biogeochem/
./CAM3Bld/cam1/models/lnd/clm2/src/biogeophys/
./CAM3Bld/cam1/models/lnd/clm2/src/main/
./CAM3Bld/cam1/models/lnd/clm2/src/mksrfdata/
./CAM3Bld/cam1/models/lnd/clm2/src/riverroute/
./CAM3Bld/cam1/models/ice/csim4/
./CAM3Bld/cam1/models/csm_share/shr/
./CAM3Bld/cam1/models/utils/timing/
./CAM3Bld/cam1/models/utils/pilgrim/
./CAM3Bld/cam1/models/utils/esmf/
./CAM3Bld/cam1/models/utils/pilgrim/unit_testers/
./CAM3Bld/cam1/models/utils/esmf/build/
./CAM3Bld/cam1/models/utils/esmf/include/
./CAM3Bld/cam1/models/utils/esmf/scripts/
./CAM3Bld/cam1/models/utils/esmf/src/
./CAM3Bld/cam1/models/utils/esmf/build/Darwin_absoft/
./CAM3Bld/cam1/models/utils/esmf/build/Darwin_xlf/
./CAM3Bld/cam1/models/utils/esmf/build/ES/
./CAM3Bld/cam1/models/utils/esmf/build/IRIX/
./CAM3Bld/cam1/models/utils/esmf/build/IRIX64/
./CAM3Bld/cam1/models/utils/esmf/build/SX6/
./CAM3Bld/cam1/models/utils/esmf/build/alpha/
./CAM3Bld/cam1/models/utils/esmf/build/config/
./CAM3Bld/cam1/models/utils/esmf/build/cray_x1/
./CAM3Bld/cam1/models/utils/esmf/build/cray_x1_ssp/
./CAM3Bld/cam1/models/utils/esmf/build/linux_altix/
./CAM3Bld/cam1/models/utils/esmf/build/linux_gnupgf90/
./CAM3Bld/cam1/models/utils/esmf/build/linux_intel/
./CAM3Bld/cam1/models/utils/esmf/build/linux_lf95/
./CAM3Bld/cam1/models/utils/esmf/build/linux_pathscale/
./CAM3Bld/cam1/models/utils/esmf/build/linux_pgi/
./CAM3Bld/cam1/models/utils/esmf/build/rs6000_64/
./CAM3Bld/cam1/models/utils/esmf/build/rs6000_sp/
./CAM3Bld/cam1/models/utils/esmf/build/solaris/
./CAM3Bld/cam1/models/utils/esmf/build/solaris_hpc/
./CAM3Bld/cam1/models/utils/esmf/scripts/doc_templates/
./CAM3Bld/cam1/models/utils/esmf/scripts/doc_templates/templates/
./CAM3Bld/cam1/models/utils/esmf/src/Infrastructure/
./CAM3Bld/cam1/models/utils/esmf/src/include/
./CAM3Bld/cam1/models/utils/esmf/src/Infrastructure/BasicUtil/
./CAM3Bld/cam1/models/utils/esmf/src/Infrastructure/Error/
./CAM3Bld/cam1/models/utils/esmf/src/Infrastructure/TimeMgmt/


Configuration: here

The instructions say to run configure, but it's not immediately obvious where that is. There are four files called configure in the tree, and you have to guess that

cam1/models/atm/cam/bld/configure

is right.
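
To find all four without guessing, plain find works (nothing CAM-specific here; run it from the directory the tarball unpacked into):

find . -name configure -type f

The bld one above turns out to be the one the instructions mean.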

So, of the options to configure, the first one to cause any difficulty would be

-cc name
name specifies the C compiler. This allows the user to override the default setting in the Makefile (Linux only). The C compiler can also be specified by setting the environment variable USER_CC. Default: pgcc if using pgf90, otherwise cc.

similarly


-fc name
name specifies the Fortran compiler. This allows the user to override the default setting in the Makefile. The Fortran compiler can also be specified by setting the environment variable USER_FC. Default: OS dependent.

OK, let's see what f90 we have available. Hmm.


login4% man f90
No manual entry for f90
login4% which f90
f90: Command not found.
login4% which fortran
fortran: Command not found.
login4% which f77
/usr/bin/f77
login4% man f77
No manual entry for f77


Umm? Here we see that we get ifort with icc, pgf95 with pgcc, or sunf90/sunf95 with sun_cc.

I haven't heard of NCAR components running under the Sun compilers, so let's try one of the other two pairs.
Either way, we can set USER_CC and USER_FC; a sketch follows.
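
Here's a minimal sketch with the PGI pair, assuming tcsh (to match the login4% prompts above) and the bld directory found earlier. The -fc/-cc flags and the USER_FC/USER_CC variables come straight from the configure documentation quoted above, but I haven't run this yet:

# either set the compiler choices in the environment ...
setenv USER_FC pgf90
setenv USER_CC pgcc
# ... or pass them to configure explicitly:
cd CAM3Bld/cam1/models/atm/cam/bld
./configure -fc pgf90 -cc pgcc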

The next problem is the MPI version, always a head-scratcher.

-mpi_inc dir
dir is the directory that contains the MPI library include files. Only SPMD versions of CAM require MPI. The MPI include directory can also be specified by setting the environment variable INC_MPI. Default: /usr/local/include, except on IBM systems, where the Fortran compilers mpxlf90 and mpxlf90_r have the MPI include file location built in.

module avail yields:
--------------------------------------- /opt/apps/pgi7_2/modulefiles ---------------------------------------
acml/4.1.0 gotoblas2/1.05 (default) mvapich2-debug/1.2
autodock/4.0.1 hdf5/1.6.5 mvapich2-new/1.2
fftw3/3.1.2 hecura-debug/1.5rc2 mvapich2/1.2
glpk/4.40 hecura/1.5.1 nco/3.9.5
gotoblas/1.26 (default) metis/4.0 netcdf/3.6.2
gotoblas/1.30 mvapich-old/1.0.1 openmpi/1.3
gotoblas2/1.00 mvapich/1.0.1

Not helpful. But see "Compiling Parallel Programs with MPI" here; it also recommends intel or pgi, but seems inconsistent about which is installed.


login4% which mpif90
/opt/apps/pgi7_2/mvapich/1.0.1/bin/mpif90
login4% which mpicc
/opt/apps/pgi7_2/mvapich/1.0.1/bin/mpicc


So we get pgi by default. OK, good enough for me, though we've been running intel10 on Lonestar. "The compiler and MVAPICH library are selected according to the modules that have been loaded." But I haven't loaded any modules myself. Trying module load intel gives useful info:

Error: You can only have one compiler module loaded at time.
You already have pgi loaded.
To correct the situation, please enter the following command:

module swap pgi intel

This is tedious but going well so far. Still don't know if the MPI modules will be found; time will tell, I guess, but it looks likely that $MPICH_HOME will need to be passed in for this stuff.
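
If so, something like this sketch might do it. Assumptions: the TACC mvapich module sets $MPICH_HOME, INC_MPI comes from the configure documentation above, and LIB_MPI is just my guess at the matching library-path variable, so check the docs:

module swap pgi intel               # only if we switch to match the intel10 builds
module load mvapich                 # or mvapich2 / openmpi from the list above
setenv INC_MPI $MPICH_HOME/include
setenv LIB_MPI $MPICH_HOME/lib      # hypothetical variable name; verify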

Similarly with netCDF.
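
Presumably the same pattern, something like the following (hedged: netcdf/3.6.2 appears in the module list above, and INC_NETCDF/LIB_NETCDF are the variable names I remember from the CAM build docs, so verify):

module load netcdf
setenv INC_NETCDF /path/to/netcdf/include   # wherever the module puts it
setenv LIB_NETCDF /path/to/netcdf/lib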
