0: PE 0: MPICH processor detected:
0: PE 0: AMD Milan (25:1:1) (family:model:stepping)
0: MPI VERSION : CRAY MPICH version 8.1.25.17 (ANL base 3.4a2)
0: MPI BUILD INFO : Sun Feb 26 15:15 2023 (git hash aecd99f) (CH4)
0: PE 0: MPICH environment settings =====================================
0: PE 0: MPICH_ENV_DISPLAY = 1
0: PE 0: MPICH_VERSION_DISPLAY = 1
0: PE 0: MPICH_ABORT_ON_ERROR = 0
0: PE 0: MPICH_CPUMASK_DISPLAY = 0
0: PE 0: MPICH_STATS_DISPLAY = 0
0: PE 0: MPICH_RANK_REORDER_METHOD = 1
0: PE 0: MPICH_RANK_REORDER_DISPLAY = 0
0: PE 0: MPICH_MEMCPY_MEM_CHECK = 0
0: PE 0: MPICH_USE_SYSTEM_MEMCPY = 0
0: PE 0: MPICH_OPTIMIZED_MEMCPY = 1
0: PE 0: MPICH_ALLOC_MEM_PG_SZ = 4096
0: PE 0: MPICH_ALLOC_MEM_POLICY = PREFERRED
0: PE 0: MPICH_ALLOC_MEM_AFFINITY = SYS_DEFAULT
0: PE 0: MPICH_MALLOC_FALLBACK = 0
0: PE 0: MPICH_MEM_DEBUG_FNAME =
0: PE 0: MPICH_INTERNAL_MEM_AFFINITY = SYS_DEFAULT
0: PE 0: MPICH_NO_BUFFER_ALIAS_CHECK = 0
0: PE 0: MPICH_COLL_SYNC = MPI_Bcast
0: PE 0: MPICH_SINGLE_HOST_ENABLED = 1
0: PE 0: MPICH/RMA environment settings =================================
0: PE 0: MPICH_RMA_MAX_PENDING = 128
0: PE 0: MPICH_RMA_SHM_ACCUMULATE = 0
0: PE 0: MPICH/Dynamic Process Management environment settings ==========
0: PE 0: MPICH_DPM_DIR =
0: PE 0: MPICH_LOCAL_SPAWN_SERVER = 0
0: PE 0: MPICH_SPAWN_USE_RANKPOOL = 1
0: PE 0: MPICH/SMP environment settings =================================
0: PE 0: MPICH_SMP_SINGLE_COPY_MODE = XPMEM
0: PE 0: MPICH_SMP_SINGLE_COPY_SIZE = 8192
0: PE 0: MPICH_SHM_PROGRESS_MAX_BATCH_SIZE = 8
0: PE 0: MPICH/COLLECTIVE environment settings ==========================
0: PE 0: MPICH_COLL_OPT_OFF = 0
0: PE 0: MPICH_BCAST_ONLY_TREE = 1
0: PE 0: MPICH_BCAST_INTERNODE_RADIX = 4
0: PE 0: MPICH_BCAST_INTRANODE_RADIX = 4
0: PE 0: MPICH_ALLTOALL_SHORT_MSG = 64-512
0: PE 0: MPICH_ALLTOALL_SYNC_FREQ = 1-24
0: PE 0: MPICH_ALLTOALLV_THROTTLE = 8
0: PE 0: MPICH_ALLGATHER_VSHORT_MSG = 1024-4096
0: PE 0: MPICH_ALLGATHERV_VSHORT_MSG = 1024-4096
0: PE 0: MPICH_GATHERV_SHORT_MSG = 131072
0: PE 0: MPICH_GATHERV_MIN_COMM_SIZE = 64
0: PE 0: MPICH_GATHERV_MAX_TMP_SIZE = 536870912
0: PE 0: MPICH_GATHERV_SYNC_FREQ = 16
0: PE 0: MPICH_IGATHERV_MIN_COMM_SIZE = 1000
0: PE 0: MPICH_IGATHERV_SYNC_FREQ = 100
0: PE 0: MPICH_IGATHERV_RAND_COMMSIZE = 2048
0: PE 0: MPICH_IGATHERV_RAND_RECVLIST = 0
0: PE 0: MPICH_SCATTERV_SHORT_MSG = 2048-8192
0: PE 0: MPICH_SCATTERV_MIN_COMM_SIZE = 64
0: PE 0: MPICH_SCATTERV_MAX_TMP_SIZE = 536870912
0: PE 0: MPICH_SCATTERV_SYNC_FREQ = 16
0: PE 0: MPICH_SCATTERV_SYNCHRONOUS = 0
0: PE 0: MPICH_ALLREDUCE_MAX_SMP_SIZE = 262144
0: PE 0: MPICH_ALLREDUCE_BLK_SIZE = 716800
0: PE 0: MPICH_GPU_ALLGATHER_VSHORT_MSG_ALGORITHM = 1
0: PE 0: MPICH_GPU_ALLREDUCE_USE_KERNEL = 0
0: PE 0: MPICH_GPU_COLL_STAGING_BUF_SIZE = 1048576
0: PE 0: MPICH_GPU_ALLREDUCE_STAGING_THRESHOLD = 256
0: PE 0: MPICH_ALLREDUCE_NO_SMP = 0
0: PE 0: MPICH_REDUCE_NO_SMP = 0
0: PE 0: MPICH_REDUCE_SCATTER_COMMUTATIVE_LONG_MSG_SIZE = 524288
0: PE 0: MPICH_REDUCE_SCATTER_MAX_COMMSIZE = 1000
0: PE 0: MPICH_SHARED_MEM_COLL_OPT = 1
0: PE 0: MPICH_SHARED_MEM_COLL_NCELLS = 8
0: PE 0: MPICH_SHARED_MEM_COLL_CELLSZ = 256
0: PE 0: MPICH MPIIO environment settings ===============================
0: PE 0: MPICH_MPIIO_HINTS_DISPLAY = 0
0: PE 0: MPICH_MPIIO_HINTS = NULL
0: PE 0: MPICH_MPIIO_ABORT_ON_RW_ERROR = disable
0: PE 0: MPICH_MPIIO_CB_ALIGN = 2
0: PE 0: MPICH_MPIIO_DVS_MAXNODES = 24
0: PE 0: MPICH_MPIIO_AGGREGATOR_PLACEMENT_DISPLAY = 0
0: PE 0: MPICH_MPIIO_AGGREGATOR_PLACEMENT_STRIDE = -1
0: PE 0: MPICH_MPIIO_MAX_NUM_IRECV = 50
0: PE 0: MPICH_MPIIO_MAX_NUM_ISEND = 50
0: PE 0: MPICH_MPIIO_MAX_SIZE_ISEND = 10485760
0: PE 0: MPICH_MPIIO_OFI_STARTUP_CONNECT = disable
0: PE 0: MPICH_MPIIO_OFI_STARTUP_NODES_AGGREGATOR = 2
0: PE 0: MPICH MPIIO statistics environment settings ====================
0: PE 0: MPICH_MPIIO_STATS = 0
0: PE 0: MPICH_MPIIO_TIMERS = 0
0: PE 0: MPICH_MPIIO_WRITE_EXIT_BARRIER = 1
0: PE 0: MPICH Thread Safety settings ===================================
0: PE 0: MPICH_ASYNC_PROGRESS = 0
0: PE 0: MPICH_OPT_THREAD_SYNC = 1
0: PE 0: rank 0 required = single, was provided = single
0: User-specified PIO rearranger comm max pend req (comp2io), 0 (value will be reset as requested)
0: Resetting PIO rearranger comm max pend req (comp2io) to 64
0: PIO rearranger options:
0: comm type = p2p
0: comm fcd = 2denable
0: max pend req (comp2io) = 64
0: enable_hs (comp2io) = T
0: enable_isend (comp2io) = F
0: max pend req (io2comp) = 64
0: enable_hs (io2comp) = F
0: enable_isend (io2comp) = T
0: 40 pes participating in computation of coupled model
0: --------------------------------------------------------------
0: GLOBAL communicator : 1 nodes, 40 MPI tasks
0: COMMUNICATOR NODE # [NODE NAME] : (# OF MPI TASKS) TASK # LIST
0: GLOBAL NODE 0 [ nid004797 ] : ( 40 MPI TASKS ) 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39
0: --------------------------------------------------------------
0: (seq_comm_setcomm) init ID ( 1 GLOBAL ) pelist = 0 39 1 ( npes = 40) ( nthreads = 1)( suffix =)
0: (seq_comm_setcomm) init ID ( 2 CPL ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_setcomm) init ID ( 5 ATM ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_joincomm) init ID ( 6 CPLATM ) join IDs = 2 5 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 3 ALLATMID ) join multiple comp IDs ( npes = 40) ( nthreads = 2)
0: (seq_comm_joincomm) init ID ( 4 CPLALLATMID ) join IDs = 2 3 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 9 LND ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_joincomm) init ID ( 10 CPLLND ) join IDs = 2 9 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 7 ALLLNDID ) join multiple comp IDs ( npes = 40) ( nthreads = 2)
0: (seq_comm_joincomm) init ID ( 8 CPLALLLNDID ) join IDs = 2 7 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 13 ICE ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_joincomm) init ID ( 14 CPLICE ) join IDs = 2 13 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 11 ALLICEID ) join multiple comp IDs ( npes = 40) ( nthreads = 2)
0: (seq_comm_joincomm) init ID ( 12 CPLALLICEID ) join IDs = 2 11 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 17 OCN ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_joincomm) init ID ( 18 CPLOCN ) join IDs = 2 17 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 15 ALLOCNID ) join multiple comp IDs ( npes = 40) ( nthreads = 2)
0: (seq_comm_joincomm) init ID ( 16 CPLALLOCNID ) join IDs = 2 15 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 21 ROF ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_joincomm) init ID ( 22 CPLROF ) join IDs = 2 21 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 19 ALLROFID ) join multiple comp IDs ( npes = 40) ( nthreads = 2)
0: (seq_comm_joincomm) init ID ( 20 CPLALLROFID ) join IDs = 2 19 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 25 GLC ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_joincomm) init ID ( 26 CPLGLC ) join IDs = 2 25 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 23 ALLGLCID ) join multiple comp IDs ( npes = 40) ( nthreads = 2)
0: (seq_comm_joincomm) init ID ( 24 CPLALLGLCID ) join IDs = 2 23 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 29 WAV ) pelist = 0 39 1 ( npes = 40) ( nthreads = 2)( suffix =)
0: (seq_comm_joincomm) init ID ( 30 CPLWAV ) join IDs = 2 29 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 27 ALLWAVID ) join multiple comp IDs ( npes = 40) ( nthreads = 2)
0: (seq_comm_joincomm) init ID ( 28 CPLALLWAVID ) join IDs = 2 27 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 33 ESP ) pelist = 0 39 1 ( npes = 40) ( nthreads = 1)( suffix =)
0: (seq_comm_joincomm) init ID ( 34 CPLESP ) join IDs = 2 33 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 31 ALLESPID ) join multiple comp IDs ( npes = 40) ( nthreads = 1)
0: (seq_comm_joincomm) init ID ( 32 CPLALLESPID ) join IDs = 2 31 ( npes = 40) ( nthreads = 2)
0: (seq_comm_setcomm) init ID ( 37 IAC ) pelist = 0 39 1 ( npes = 40) ( nthreads = 1)( suffix =)
0: (seq_comm_joincomm) init ID ( 38 CPLIAC ) join IDs = 2 37 ( npes = 40) ( nthreads = 2)
0: (seq_comm_jcommarr) init ID ( 35 ALLIACID ) join multiple comp IDs ( npes = 40) ( nthreads = 1)
0: (seq_comm_joincomm) init ID ( 36 CPLALLIACID ) join IDs = 2 35 ( npes = 40) ( nthreads = 2)
0: (seq_comm_printcomms) 1 0 40 1 GLOBAL:
0: (seq_comm_printcomms) 2 0 40 2 CPL:
0: (seq_comm_printcomms) 3 0 40 2 ALLATMID:
0: (seq_comm_printcomms) 4 0 40 2 CPLALLATMID:
0: (seq_comm_printcomms) 5 0 40 2 ATM:
0: (seq_comm_printcomms) 6 0 40 2 CPLATM:
0: (seq_comm_printcomms) 7 0 40 2 ALLLNDID:
0: (seq_comm_printcomms) 8 0 40 2 CPLALLLNDID:
0: (seq_comm_printcomms) 9 0 40 2 LND:
0: (seq_comm_printcomms) 10 0 40 2 CPLLND:
0: (seq_comm_printcomms) 11 0 40 2 ALLICEID:
0: (seq_comm_printcomms) 12 0 40 2 CPLALLICEID:
0: (seq_comm_printcomms) 13 0 40 2 ICE:
0: (seq_comm_printcomms) 14 0 40 2 CPLICE:
0: (seq_comm_printcomms) 15 0 40 2 ALLOCNID:
0: (seq_comm_printcomms) 16 0 40 2 CPLALLOCNID:
0: (seq_comm_printcomms) 17 0 40 2 OCN:
0: (seq_comm_printcomms) 18 0 40 2 CPLOCN:
0: (seq_comm_printcomms) 19 0 40 2 ALLROFID:
0: (seq_comm_printcomms) 20 0 40 2 CPLALLROFID:
0: (seq_comm_printcomms) 21 0 40 2 ROF:
0: (seq_comm_printcomms) 22 0 40 2 CPLROF:
0: (seq_comm_printcomms) 23 0 40 2 ALLGLCID:
0: (seq_comm_printcomms) 24 0 40 2 CPLALLGLCID:
0: (seq_comm_printcomms) 25 0 40 2 GLC:
0: (seq_comm_printcomms) 26 0 40 2 CPLGLC:
0: (seq_comm_printcomms) 27 0 40 2 ALLWAVID:
0: (seq_comm_printcomms) 28 0 40 2 CPLALLWAVID:
0: (seq_comm_printcomms) 29 0 40 2 WAV:
0: (seq_comm_printcomms) 30 0 40 2 CPLWAV:
0: (seq_comm_printcomms) 31 0 40 1 ALLESPID:
0: (seq_comm_printcomms) 32 0 40 2 CPLALLESPID:
0: (seq_comm_printcomms) 33 0 40 1 ESP:
0: (seq_comm_printcomms) 34 0 40 2 CPLESP:
0: (seq_comm_printcomms) 35 0 40 1 ALLIACID:
0: (seq_comm_printcomms) 36 0 40 2 CPLALLIACID:
0: (seq_comm_printcomms) 37 0 40 1 IAC:
0: (seq_comm_printcomms) 38 0 40 2 CPLIAC:
0: (t_initf) Read in prof_inparm namelist from: drv_in
0: (t_initf) Using profile_disable= F
0: (t_initf) profile_timer= 4
0: (t_initf) profile_depth_limit= 20
0: (t_initf) profile_detail_limit= 12
0: (t_initf) profile_barrier= F
0: (t_initf) profile_outpe_num= 1
0: (t_initf) profile_outpe_stride= 0
0: (t_initf) profile_single_file= F
0: (t_initf) profile_global_stats= T
0: (t_initf) profile_ovhd_measurement= F
0: (t_initf) profile_add_detail= F
0: (t_initf) profile_papi_enable= F
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/danielle_work/explore_files/fates_params_default.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/paramdata/clm_params_c211124.nc 0
1: WARNING: Opening file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/domainFile_parflowER.nc using PIO_IOTYPE_PNETCDF iotype failed. Retrying using PIO_IOTYPE_NETCDF4P format
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/domainFile_parflowER.nc 262144
11: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
1: NetCDF: Variable not found
31: NetCDF: Variable not found
11: NetCDF: Variable not found
21: NetCDF: Variable not found
1: WARNING: Opening file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/domainFile_parflowER.nc using PIO_IOTYPE_PNETCDF iotype failed. Retrying using PIO_IOTYPE_NETCDF4P format
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/domainFile_parflowER.nc 262144
1: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
1: NetCDF: Invalid dimension ID or name
11: NetCDF: Invalid dimension ID or name
21: NetCDF: Invalid dimension ID or name
31: NetCDF: Invalid dimension ID or name
11: NetCDF: Variable not found
1: NetCDF: Variable not found
31: NetCDF: Variable not found
21: NetCDF: Variable not found
1: NetCDF: Variable not found
11: NetCDF: Variable not found
21: NetCDF: Variable not found
31: NetCDF: Variable not found
11: NetCDF: Variable not found
21: NetCDF: Variable not found
31: NetCDF: Variable not found
1: NetCDF: Variable not found
11: NetCDF: Variable not found
21: NetCDF: Variable not found
31: NetCDF: Variable not found
1: NetCDF: Variable not found
1: NetCDF: Variable not found
11: NetCDF: Variable not found
21: NetCDF: Variable not found
31: NetCDF: Variable not found
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/paramdata/clm_params_c211124.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/paramdata/CNP_parameters_c131108.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/danielle_work/explore_files/fates_params_default.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/paramdata/clm_params_c211124.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 0
1:
1: proc= 1 beg gridcell= 641 end gridcell= 1280 total gridcells per proc= 640
1: proc= 1 beg topounit= 641 end topounit= 1280 total topounits per proc= 640
1: proc= 1 beg landunit= 2561 end landunit= 5120 total landunits per proc= 2560
1: proc= 1 beg column = 10241 end column = 20480 total columns per proc = 10240
1: proc= 1 beg pft = 19201 end pft = 38400 total pfts per proc = 19200
1: proc= 1 beg coh = 384001 end coh = 768000 total coh per proc = 384000
1: proc= 1 lnd ngseg = 1600 lnd nlseg = 40
1: proc= 1 gce ngseg = 1600 gce nlseg = 40
1: proc= 1 top ngseg = 1600 top nlseg = 40
1: proc= 1 lun ngseg = 102000 lun nlseg = 2560
1: proc= 1 col ngseg = 102000 col nlseg = 2560
1: proc= 1 pft ngseg = 102000 pft nlseg = 2560
1: proc= 1 coh ngseg = 1600 coh nlseg = 40
1: proc= 1 nclumps = 2
1: proc= 1 clump no = 1 clump id= 2 beg gridcell= 641 end gridcell= 960 total gridcells per clump= 320
1: proc= 1 clump no = 1 clump id= 2 beg topounit= 641 end topounit= 960 total topounits per clump = 320
1: proc= 1 clump no = 1 clump id= 2 beg landunit= 2561 end landunit= 3840 total landunits per clump = 1280
1: proc= 1 clump no = 1 clump id= 2 beg column = 10241 end column = 15360 total columns per clump = 5120
1: proc= 1 clump no = 1 clump id= 2 beg pft = 19201 end pft = 28800 total pfts per clump = 9600
1: proc= 1 clump no = 1 clump id= 2 beg cohort = 384001 end cohort = 576000 total cohorts per clump = 192000
19:
19: proc= 19 beg gridcell= 12121 end gridcell= 12760 total gridcells per proc= 640
19: proc= 19 beg topounit= 12121 end topounit= 12760 total topounits per proc= 640
19: proc= 19 beg landunit= 48481 end landunit= 51040 total landunits per proc= 2560
19: proc= 19 beg column = 193921 end column = 204160 total columns per proc = 10240
19: proc= 19 beg pft = 363601 end pft = 382800 total pfts per proc = 19200
19: proc= 19 beg coh = 7272001 end coh = 7656000 total coh per proc = 384000
19: proc= 19 lnd ngseg = 1600 lnd nlseg = 40
19: proc= 19 gce ngseg = 1600 gce nlseg = 40
19: proc= 19 top ngseg = 1600 top nlseg = 40
19: proc= 19 lun ngseg = 102000 lun nlseg = 2560
19: proc= 19 col ngseg = 102000 col nlseg = 2560
19: proc= 19 pft ngseg = 102000 pft nlseg = 2560
19: proc= 19 coh ngseg = 1600 coh nlseg = 40
19: proc= 19 nclumps = 2
19: proc= 19 clump no = 1 clump id= 20 beg gridcell= 12121 end gridcell= 12440 total gridcells per clump= 320
19: proc= 19 clump no = 1 clump id= 20 beg topounit= 12121 end topounit= 12440 total topounits per clump = 320
19: proc= 19 clump no = 1 clump id= 20 beg landunit= 48481 end landunit= 49760 total landunits per clump = 1280
19: proc= 19 clump no = 1 clump id= 20 beg column = 193921 end column = 199040 total columns per clump = 5120
19: proc= 19 clump no = 1 clump id= 20 beg pft = 363601 end pft = 373200 total pfts per clump = 9600
19: proc= 19 clump no = 1 clump id= 20 beg cohort = 7272001 end cohort = 7464000 total cohorts per clump = 192000
38:
38: proc= 38 beg gridcell= 24241 end gridcell= 24880 total gridcells per proc= 640
38: proc= 38 beg topounit= 24241 end topounit= 24880 total topounits per proc= 640
38: proc= 38 beg landunit= 96961 end landunit= 99520 total landunits per proc= 2560
38: proc= 38 beg column = 387841 end column = 398080 total columns per proc = 10240
38: proc= 38 beg pft = 727201 end pft = 746400 total pfts per proc = 19200
38: proc= 38 beg coh = 14544001 end coh = 14928000 total coh per proc = 384000
38: proc= 38 lnd ngseg = 1600 lnd nlseg = 40
38: proc= 38 gce ngseg = 1600 gce nlseg = 40
38: proc= 38 top ngseg = 1600 top nlseg = 40
38: proc= 38 lun ngseg = 102000 lun nlseg = 2560
38: proc= 38 col ngseg = 102000 col nlseg = 2560
38: proc= 38 pft ngseg = 102000 pft nlseg = 2560
38: proc= 38 coh ngseg = 1600 coh nlseg = 40
38: proc= 38 nclumps = 2
38: proc= 38 clump no = 1 clump id= 39 beg gridcell= 24241 end gridcell= 24560 total gridcells per clump= 320
38: proc= 38 clump no = 1 clump id= 39 beg topounit= 24241 end topounit= 24560 total topounits per clump = 320
38: proc= 38 clump no = 1 clump id= 39 beg landunit= 96961 end landunit= 98240 total landunits per clump = 1280
38: proc= 38 clump no = 1 clump id= 39 beg column = 387841 end column = 392960 total columns per clump = 5120
38: proc= 38 clump no = 1 clump id= 39 beg pft = 727201 end pft = 736800 total pfts per clump = 9600
38: proc= 38 clump no = 1 clump id= 39 beg cohort = 14544001 end cohort = 14736000 total cohorts per clump = 192000
39:
39: proc= 39 beg gridcell= 24881 end gridcell= 25500 total gridcells per proc= 620
39: proc= 39 beg topounit= 24881 end topounit= 25500 total topounits per proc= 620
39: proc= 39 beg landunit= 99521 end landunit= 102000 total landunits per proc= 2480
39: proc= 39 beg column = 398081 end column = 408000 total columns per proc = 9920
39: proc= 39 beg pft = 746401 end pft = 765000 total pfts per proc = 18600
39: proc= 39 beg coh = 14928001 end coh = 15300000 total coh per proc = 372000
39: proc= 39 lnd ngseg = 1600 lnd nlseg = 40
39: proc= 39 gce ngseg = 1600 gce nlseg = 40
39: proc= 39 top ngseg = 1600 top nlseg = 40
39: proc= 39 lun ngseg = 102000 lun nlseg = 2480
39: proc= 39 col ngseg = 102000 col nlseg = 2480
39: proc= 39 pft ngseg = 102000 pft nlseg = 2480
39: proc= 39 coh ngseg = 1600 coh nlseg = 40
39: proc= 39 nclumps = 2
39: proc= 39 clump no = 1 clump id= 40 beg gridcell= 24881 end gridcell= 25200 total gridcells per clump= 320
39: proc= 39 clump no = 1 clump id= 40 beg topounit= 24881 end topounit= 25200 total topounits per clump = 320
39: proc= 39 clump no = 1 clump id= 40 beg landunit= 99521 end landunit= 100800 total landunits per clump = 1280
39: proc= 39 clump no = 1 clump id= 40 beg column = 398081 end column = 403200 total columns per clump = 5120
39: proc= 39 clump no = 1 clump id= 40 beg pft = 746401 end pft = 756000 total pfts per clump = 9600
39: proc= 39 clump no = 1 clump id= 40 beg cohort = 14928001 end cohort = 15120000 total cohorts per clump = 192000
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/paramdata/clm_params_c211124.nc 0
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 1
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 1
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 1
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 2
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 1
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 1
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 1
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 1
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/m4309/eastRiver_PF-FATES/james_runs/eastRiver_static/eastRiver_static/caseFiles/surfdata_ER_220428.nc 3
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/snicardata/snicar_optics_5bnd_mam_c160322.nc 3
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/snicardata/snicar_drdt_bst_fit_60_c070416.nc 3
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: Opened existing file /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/paramdata/clm_params_c211124.nc 3
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
1: MPIIO WARNING: DVS stripe width of 24 was requested but DVS set it to 1
1: See MPICH_MPIIO_DVS_MAXNODES in the intro_mpi man page.
0: MCT::m_Router::initp_: GSMap indices not increasing...Will correct
0: MCT::m_Router::initp_: RGSMap indices not increasing...Will correct
0: MCT::m_Router::initp_: RGSMap indices not increasing...Will correct
0: MCT::m_Router::initp_: GSMap indices not increasing...Will correct
1: Opened file ./pfNOFates_testYF_build_newPARAMfile.IELMFATES.pm-cpu.gnu.junk.2024-11-04.elm.h0.2011-01.nc to write 5
1: pio_support::pio_die:: myrank= -1 : ERROR: nf_mod.F90: 1293 : NetCDF: Invalid dimension ID or name
11: pio_support::pio_die:: myrank= -1 : ERROR: nf_mod.F90: 1293 : NetCDF: Invalid dimension ID or name
21: pio_support::pio_die:: myrank= -1 : ERROR: nf_mod.F90: 1293 : NetCDF: Invalid dimension ID or name
31: pio_support::pio_die:: myrank= -1 : ERROR: nf_mod.F90: 1293 : NetCDF: Invalid dimension ID or name
1: MPICH ERROR [Rank 1] [job id 32521341.0] [Mon Nov 4 13:21:59 2024] [nid004797] - Abort(1) (rank 1 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
1:
1: aborting job:
1: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
11: MPICH ERROR [Rank 11] [job id 32521341.0] [Mon Nov 4 13:21:59 2024] [nid004797] - Abort(1) (rank 11 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 11
11:
11: aborting job:
11: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 11
21: MPICH ERROR [Rank 21] [job id 32521341.0] [Mon Nov 4 13:21:59 2024] [nid004797] - Abort(1) (rank 21 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 21
21:
21: aborting job:
21: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 21
31: MPICH ERROR [Rank 31] [job id 32521341.0] [Mon Nov 4 13:21:59 2024] [nid004797] - Abort(1) (rank 31 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 31
31:
31: aborting job:
31: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 31
srun: error: nid004797: tasks 1,11,21,31: Exited with exit code 255
srun: Terminating StepId=32521341.0
0: slurmstepd: error: *** STEP 32521341.0 ON nid004797 CANCELLED AT 2024-11-04T21:22:04 ***
srun: error: nid004797: tasks 0,2-10,12-20,22-30,32-39: Terminated
srun: Force Terminated StepId=32521341.0