diff --git a/cime_config/testdefs/testlist_clm.xml b/cime_config/testdefs/testlist_clm.xml
index 4a217d7904..214ecd54a5 100644
--- a/cime_config/testdefs/testlist_clm.xml
+++ b/cime_config/testdefs/testlist_clm.xml
@@ -2,7 +2,7 @@
-
+
00:20:00
@@ -10,8 +10,8 @@
-
-
+
+
00:20:00
@@ -19,8 +19,8 @@
-
-
+
+
00:20:00
@@ -28,8 +28,9 @@
-
+
+
00:20:00
@@ -38,7 +39,7 @@
-
+
@@ -48,8 +49,7 @@
-
-
+
00:20:00
@@ -57,7 +57,8 @@
-
+
+
00:60:00
@@ -66,7 +67,7 @@
-
+
00:60:00
@@ -74,7 +75,7 @@
-
+
00:20:00
@@ -82,17 +83,18 @@
-
+
00:20:00
Include a test of this scientifically-supported compset at a scientifically-supported resolution
-
+
-
+
+
00:20:00
@@ -101,7 +103,7 @@
-
+
@@ -111,7 +113,7 @@
-
+
@@ -121,7 +123,7 @@
-
+
@@ -131,24 +133,14 @@
-
-
+
+
00:20:00
Include a test of prescribed soil-moisture, has to be at f09, should be 2000 and for SP
-
-
-
-
-
-
- 00:20:00
- NUOPC test covering prescribed soil moisture stream (this stream uses different code in a nuopc test than in an mct test). (Comments from mct version of this test say: Include a test of prescribed soil-moisture, has to be at f09, should be 2000 and for SP)
-
-
@@ -174,65 +166,43 @@
00:40:00
-
-
-
-
-
- 00:40:00
- Include an ERI test with NUOPC
-
-
-
+
00:20:00
-
+
-
+
00:20:00
Do a test similar to FXHIST starting at a 2005 start date, will interpolate from the 2003 IC file
-
+
-
+
00:20:00
Run a transient case with standalone settings similar to the FXHIST waccm test
-
-
-
-
-
-
- 00:20:00
- NUOPC test covering drydep, megan and Gregorian calendar. (Comments from mct version of this test say: Run a transient case with standalone settings similar to the FXHIST waccm test)
-
-
-
+
-
+
00:20:00
-
+
-
+
00:20:00
@@ -240,7 +210,7 @@
-
+
00:20:00
@@ -248,8 +218,7 @@
-
-
+
00:20:00
@@ -257,7 +226,7 @@
-
+
00:20:00
@@ -265,7 +234,7 @@
-
+
00:20:00
@@ -273,7 +242,7 @@
-
+
00:20:00
@@ -281,7 +250,7 @@
-
+
00:20:00
@@ -289,25 +258,23 @@
-
-
+
00:20:00
-
+
-
-
+
00:20:00
-
+
-
+
00:20:00
@@ -316,7 +283,7 @@
-
+
@@ -326,7 +293,7 @@
-
+
@@ -342,9 +309,9 @@
00:60:00
-
+
-
+
00:10:00
@@ -353,7 +320,7 @@
-
+
@@ -363,7 +330,7 @@
-
+
@@ -373,7 +340,7 @@
-
+
00:20:00
@@ -382,7 +349,7 @@
-
+
00:20:00
@@ -390,7 +357,7 @@
-
+
00:20:00
@@ -398,7 +365,7 @@
-
+
00:20:00
@@ -406,7 +373,7 @@
-
+
@@ -416,8 +383,8 @@
-
-
+
+
01:20:00
@@ -426,7 +393,7 @@
-
+
01:20:00
@@ -435,7 +402,7 @@
-
+
01:20:00
@@ -444,8 +411,7 @@
-
-
+
01:20:00
@@ -454,16 +420,16 @@
-
+
02:00:00
Run without DEBUG for a long period for a single-point case with the matrix solution on for spinup with all options it interacts with (requires Bgc, but also turn on Crop and Carbon isotopes)
-
+
-
+
00:40:00
@@ -472,7 +438,7 @@
-
+
00:20:00
@@ -480,8 +446,8 @@
-
-
+
+
00:20:00
@@ -489,7 +455,7 @@
-
+
00:20:00
@@ -498,8 +464,7 @@
-
-
+
00:20:00
@@ -507,7 +472,7 @@
-
+
00:20:00
@@ -515,7 +480,7 @@
-
+
00:20:00
@@ -523,8 +488,8 @@
-
-
+
+
00:20:00
@@ -532,7 +497,7 @@
-
+
00:20:00
@@ -541,7 +506,7 @@
-
+
@@ -551,7 +516,7 @@
-
+
@@ -567,24 +532,15 @@
00:20:00
-
+
-
+
00:20:00
Transient case with isotopes with a december start
-
-
-
-
-
- 00:20:00
- NUOPC test covering both carbon isotopes and decStart (the latter is not very important, but we might as well have a test of that option). (Comments from mct version of this test say: Transient case with isotopes with a december start)
-
-
@@ -603,85 +559,66 @@
-
-
-
- 00:20:00
-
-
-
-
-
-
-
-
+
00:20:00
-
+
-
+
+
00:20:00
-
+
-
+
00:20:00
-
+
-
+
00:20:00
-
+
-
+
00:20:00
-
+
-
-
-
-
- 00:20:00
-
-
-
-
-
+
00:40:00
NOTE(bja, 201509) constrain_stress_deciduous_onset is on by default for clm50, but functionality is not exercised by nine day tests, Sean Swenson verified that it is active during 30 day tests.
-
+
-
+
00:20:00
Want ERP _D test with irrigation on
-
+
-
+
00:20:00
@@ -690,24 +627,14 @@
-
-
+
+
00:20:00
Do a test with RTM and flooding on as that also impacts CLM code
-
-
-
-
-
-
- 00:20:00
- NUOPC test covering flooding. (Comments from mct version of this test say: Do a test with RTM and flooding on as that also impacts CLM code)
-
-
@@ -719,7 +646,7 @@
-
+
00:20:00
@@ -727,7 +654,7 @@
-
+
00:40:00
@@ -735,8 +662,7 @@
-
-
+
00:20:00
@@ -744,7 +670,7 @@
-
+
00:20:00
@@ -752,13 +678,13 @@
-
+
00:20:00
-
+
@@ -770,7 +696,7 @@
-
+
00:20:00
@@ -778,7 +704,7 @@
-
+
00:20:00
@@ -794,7 +720,7 @@
-
+
00:20:00
@@ -802,7 +728,7 @@
-
+
@@ -812,7 +738,7 @@
-
+
@@ -844,9 +770,25 @@
00:20:00
-
+
-
+
+
+
+ 00:20:00
+
+
+
+
+
+
+
+ 00:20:00
+
+
+
+
+
01:40:00
@@ -855,7 +797,7 @@
-
+
@@ -863,9 +805,18 @@
Science support for I1850Clm45Bgc at f09
+
+
+
+
+
+ 00:20:00
+ Exact restart changing processor count with threading
+
+
-
+
@@ -875,60 +826,59 @@
-
+
00:40:00
include a Clm45 ERI test; also, want a debug test of I1850Clm45Bgc
-
+
-
-
+
+
00:20:00
-
+
-
+
00:20:00
-
+
-
+
00:20:00
-
+
-
+
00:20:00
include a debug test of I2000Clm45Sp
-
+
-
-
+
00:20:00
Include a few debug tests of Cn
-
+
-
+
00:20:00
@@ -937,7 +887,7 @@
-
+
@@ -947,7 +897,7 @@
-
+
@@ -957,7 +907,7 @@
-
+
@@ -967,7 +917,7 @@
-
+
@@ -977,7 +927,7 @@
-
+
00:20:00
@@ -985,7 +935,7 @@
-
+
@@ -995,7 +945,7 @@
-
+
@@ -1003,72 +953,73 @@
Science support for IHistClm45Sp at f19
-
+
-
-
+
02:00:00
0.5
-
+
-
+
00:20:00
include a debug test with flooding on
-
+
+
-
-
+
+
00:20:00
-
+
-
+
00:20:00
-
+
-
+
01:40:00
threaded ERP test for crop just over 2-years
-
+
-
+
00:20:00
@@ -1076,36 +1027,36 @@
0.5
-
+
-
+
00:20:00
-
+
-
+
01:40:00
Want a multi-year global crop restart test; this was 5 years when we were doing cold start, but 3 years is probably sufficient given that we have spun-up crop initial conditions
-
+
-
-
+
+
01:40:00
Want a multi-year global crop restart test; this was 5 years when we were doing cold start, but 3 years is probably sufficient given that we have spun-up crop initial conditions
-
+
-
+
00:30:00
Want an ERP test covering some non-default irrigation options. Long enough so that we're likely to exercise the various groundwater irrigation code.
@@ -1113,27 +1064,17 @@
-
-
-
-
- 00:30:00
- NUOPC test covering irrig_alternate, because this tests some rof coupling. (Comments from mct version of this test say: Want an ERP test covering some non-default irrigation options. Long enough so that we're likely to exercise the various groundwater irrigation code.)
-
-
-
-
-
+
00:20:00
-
+
-
+
00:20:00
@@ -1142,7 +1083,7 @@
-
+
00:10:00
@@ -1151,7 +1092,16 @@
-
+
+
+
+ 00:20:00
+ test transient PFTs (via HIST) with a December start, reading 78-pft data and running with 16 pfts
+
+
+
+
+
00:20:00
@@ -1160,25 +1110,25 @@
-
+
00:20:00
test soil_layerstruct_userdefined set to the same dzsoi values as in the predefined case 4SL_2m and expect bfb same answers
-
+
-
+
00:20:00
Tests updates of BGC variables with increasing and decreasing glacier areas
-
+
-
+
00:20:00
@@ -1187,10 +1137,8 @@
-
-
-
-
+
+
00:20:00
@@ -1198,9 +1146,7 @@
-
-
-
+
00:20:00
@@ -1208,8 +1154,7 @@
-
-
+
00:20:00
@@ -1217,7 +1162,7 @@
-
+
00:20:00
@@ -1225,7 +1170,7 @@
-
+
00:20:00
@@ -1241,8 +1186,7 @@
-
-
+
00:20:00
@@ -1250,7 +1194,7 @@
-
+
00:20:00
@@ -1259,7 +1203,7 @@
-
+
00:20:00
@@ -1267,7 +1211,7 @@
-
+
@@ -1276,7 +1220,7 @@
-
+
00:20:00
@@ -1284,7 +1228,7 @@
-
+
01:20:00
@@ -1293,7 +1237,7 @@
-
+
02:00:00
@@ -1302,7 +1246,7 @@
-
+
02:00:00
@@ -1311,7 +1255,7 @@
-
+
02:00:00
@@ -1320,7 +1264,7 @@
-
+
01:40:00
@@ -1330,7 +1274,7 @@
-
+
01:40:00
@@ -1339,43 +1283,34 @@
-
+
01:40:00
restart is right before the transition from 100% nat veg to 100% crop
-
+
-
+
01:40:00
Multi-year global test of transient crops together with transient glaciers. Use glacier evolution with ERS test
-
+
-
-
-
- 01:40:00
- NUOPC test covering evolving glacier. (Comments from mct version of this test say: Multi-year global test of transient crops together with transient glaciers. Use glacier evolution with ERS test)
-
-
-
-
-
+
01:40:00
Want a multi-year global test of transient crops; also want a multi-year transient restart test. Using P60x1 and ERS rather than ERP to get faster turnaround of this long-running test
-
+
-
+
03:00:00
@@ -1384,7 +1319,7 @@
-
+
02:00:00
@@ -1394,16 +1329,16 @@
-
+
00:20:00
Basic LII test, covering the standard range of subgrid heterogeneity - particularly, including crop. Uses a year-2000 restart file so that the restart file has non-zero product pools, so that we exercise the gridcell-level code in init_interp.
-
+
-
+
00:20:00
@@ -1412,7 +1347,7 @@
-
+
00:20:00
@@ -1421,8 +1356,8 @@
-
-
+
+
01:30:00
@@ -1431,8 +1366,8 @@
-
-
+
+
00:20:00
@@ -1440,8 +1375,8 @@
-
-
+
+
00:60:00
@@ -1456,20 +1391,9 @@
00:20:00
-
-
-
-
-
- 00:20:00
- The main purpose of this test is to test threading of init_interp, exercising the OpenMP directives in initInterp. (Note that ERP tests don't compare threaded vs. non-threaded runs of init_interp, since init_interp won't run in the restart case.) Note that this test will use init_interp as long as we don't have out-of-the-box initial conditions at f10 resolution. We could probably get a similar level of confidence in the threading directives by deleting this test and instead changing the LII test to use threading; the main loss would be that that wouldn't test threading combined with interpolating from one resolution to another, as this one does.
- 0.5
-
-
-
@@ -1478,7 +1402,7 @@
-
+
00:20:00
@@ -1486,7 +1410,7 @@
-
+
00:20:00
@@ -1494,7 +1418,7 @@
-
+
00:20:00
@@ -1504,7 +1428,7 @@
-
+
00:20:00
@@ -1513,7 +1437,7 @@
-
+
00:40:00
@@ -1522,7 +1446,7 @@
-
+
00:20:00
@@ -1531,7 +1455,7 @@
-
+
@@ -1541,7 +1465,7 @@
-
+
@@ -1551,7 +1475,7 @@
-
+
00:40:00
@@ -1560,7 +1484,7 @@
-
+
00:40:00
@@ -1569,7 +1493,7 @@
-
+
00:20:00
@@ -1579,7 +1503,6 @@
-
@@ -1588,7 +1511,6 @@
-
@@ -1605,7 +1527,7 @@
-
+
00:20:00
@@ -1614,8 +1536,7 @@
-
-
+
00:20:00
@@ -1623,25 +1544,13 @@
-
-
+
-
+
-
-
-
- 00:20:00
-
-
-
-
-
-
00:20:00
- Include a NUOPC test of pts mode
@@ -1652,9 +1561,9 @@
00:20:00
-
+
-
+
00:20:00
This covers some code that isn't covered by any existing tests (such as the oldhyd test), though the amount of additional code coverage is small, so we don't necessarily need to keep this test long-term.
@@ -1670,21 +1579,12 @@
00:20:00
-
-
-
-
-
- 00:20:00
- NUOPC test covering anomaly forcing
-
-
-
-
-
-
+
+
+
+
00:20:00
@@ -1692,9 +1592,8 @@
-
-
-
+
+
00:20:00
@@ -1702,7 +1601,7 @@
-
+
00:20:00
@@ -1710,21 +1609,11 @@
-
-
-
-
- 00:20:00
-
-
-
-
-
-
+
+
00:20:00
- NUOPC test covering multi-driver, the DAE test, and Gregorian calendar
@@ -1738,29 +1627,18 @@
-
-
+
+
00:20:00
0.5
-
-
-
-
-
-
- 00:20:00
- 0.5
- Add at least one test of a NEON site
-
-
-
-
+
+
00:20:00
@@ -1768,19 +1646,9 @@
This is a test of a generic tower site under MCT
-
-
-
-
-
- 00:20:00
- 0.5
- Make sure the CLM_USRDAT universal resolution works for the NUOPC driver
-
-
-
+
00:40:00
@@ -1789,7 +1657,7 @@
-
+
02:00:00
@@ -1798,7 +1666,7 @@
-
+
0:50:00
Include a test of transient lakes
@@ -1818,7 +1686,7 @@
-
+
00:20:00
@@ -1826,7 +1694,7 @@
-
+
00:20:00
@@ -1834,7 +1702,7 @@
-
+
00:20:00
@@ -1842,16 +1710,16 @@
-
+
00:20:00
-
+
-
-
+
+
00:20:00
@@ -1859,7 +1727,7 @@
-
+
00:20:00
@@ -1867,7 +1735,7 @@
-
+
00:20:00
@@ -1875,17 +1743,16 @@
-
-
+
00:20:00
- include a production gnu test of Clm45
+ include a production test of Clm45
-
+
00:20:00
@@ -1893,7 +1760,7 @@
-
+
00:20:00
@@ -1918,8 +1785,8 @@
-
-
+
+
00:20:00
@@ -1928,25 +1795,25 @@
-
+
00:20:00
- Transient production low res future scenario SSP2-4.5 case with isotopes with a december 2050 start, use gnu to move off of intel
+ Transient production low res future scenario SSP2-4.5 case with isotopes with a december 2050 start
-
+
00:20:00
- Transient production low res future scenario SSP3-7.0 case with isotopes with a december 2050 start, use gnu to move off of intel
+ Transient production low res future scenario SSP3-7.0 case with isotopes with a december 2050 start
-
+
00:20:00
@@ -1960,21 +1827,22 @@
The main point of this test is to exercise the case where nlevgrnd is less than nlevurb. See the README file in its testmod directory for details.
+
-
+
00:40:00
include a relatively long crop test at relatively high resolution
-
+
-
-
+
+
01:30:00
@@ -1983,8 +1851,8 @@
-
-
+
+
01:40:00
@@ -1992,7 +1860,7 @@
-
+
01:40:00
@@ -2010,8 +1878,8 @@
-
-
+
+
01:00:00
@@ -2019,7 +1887,7 @@
-
+
00:20:00
@@ -2028,7 +1896,7 @@
-
+
00:20:00
@@ -2036,7 +1904,7 @@
-
+
00:20:00
@@ -2044,8 +1912,7 @@
-
-
+
00:20:00
@@ -2053,7 +1920,7 @@
-
+
@@ -2062,10 +1929,8 @@
-
+
-
-
00:20:00
@@ -2073,7 +1938,7 @@
-
+
00:20:00
@@ -2081,7 +1946,7 @@
-
+
@@ -2090,7 +1955,7 @@
-
+
00:20:00
@@ -2099,7 +1964,7 @@
-
+
@@ -2108,7 +1973,7 @@
-
+
@@ -2117,7 +1982,7 @@
-
+
00:20:00
@@ -2125,26 +1990,16 @@
-
-
-
-
-
- 00:20:00
-
-
-
-
+
00:20:00
- Include a NUOPC FATES test, and a single-point test with NUOPC: There isn't any interaction between FATES and the cap, as far as I can tell, but it seems like a good idea to have one FATES test with NUOPC, since there is so much extra code covered in a FATES test.
-
+
00:20:00
@@ -2152,9 +2007,8 @@
-
+
-
00:20:00
@@ -2162,7 +2016,7 @@
-
+
00:20:00
@@ -2170,10 +2024,9 @@
-
+
-
-
+
00:20:00
@@ -2181,7 +2034,7 @@
-
+
00:20:00
@@ -2190,7 +2043,7 @@
-
+
00:40:00
@@ -2198,16 +2051,16 @@
-
-
+
+
00:40:00
-
+
-
+
00:20:00
@@ -2215,7 +2068,7 @@
-
+
00:20:00
@@ -2223,7 +2076,7 @@
-
+
00:20:00
@@ -2231,8 +2084,7 @@
-
-
+
00:20:00
@@ -2240,7 +2092,7 @@
-
+
00:20:00
@@ -2268,10 +2120,10 @@
-
+
-
+
00:30:00
A debug ERP test of the NWP configuration with active BGC and CROP.
@@ -2282,7 +2134,7 @@
-
+
00:30:00
Ensure that turning on water tracers doesn't change answers. Cold start for now, until we can use initial conditions from a non-isotope case in an isotope case; once we can do that, this should be changed to not be cold start (e.g., 5-day decStart transient test: see also https://github.com/ESCOMP/ctsm/issues/495#issuecomment-516619853).
@@ -2290,10 +2142,10 @@
-
+
-
+
00:30:00
Include a debug ERP test of the NWP configuration.
@@ -2305,7 +2157,7 @@
-
+
00:30:00
Include a short smoke test covering the nldas2 grid and the I2000Ctsm50NwpSpNldas compset, which uses NLDAS datm forcing.
@@ -2317,7 +2169,7 @@
-
+
00:30:00
Include a short smoke test covering the nldas2 grid and the I2000Ctsm50NwpSpNldasRs compset, which uses NLDAS datm forcing.
@@ -2328,27 +2180,17 @@
-
+
00:20:00
Short ERP debug FATES test for f19_g17 grid.
-
-
-
-
-
- 00:20:00
- Short ERP debug FATES test for f19_g17 grid with modified task layout.
-
-
-
+
-
00:20:00
@@ -2357,9 +2199,8 @@
-
+
-
00:20:00
@@ -2368,9 +2209,8 @@
-
+
-
00:40:00
@@ -2379,9 +2219,7 @@
-
-
-
+
00:40:00
@@ -2390,8 +2228,7 @@
-
-
+
00:20:00
@@ -2400,9 +2237,8 @@
-
+
-
00:20:00
@@ -2411,10 +2247,9 @@
-
-
-
-
+
+
+
00:10:00
@@ -2423,7 +2258,7 @@
-
+
@@ -2433,9 +2268,40 @@
-
+
+
+
+ 00:20:00
+
+
+
+
+
+
+
+ 00:20:00
+
+
+
+
+
+
+
+ 00:20:00
+
+
+
+
+
+
+
+ 00:20:00
+
+
+
+
+
-
00:40:00
@@ -2444,8 +2310,7 @@
-
-
+
00:20:00
@@ -2454,9 +2319,8 @@
-
-
-
+
+
00:40:00
@@ -2465,8 +2329,7 @@
-
-
+
00:40:00
@@ -2475,9 +2338,8 @@
-
+
-
00:20:00
@@ -2486,18 +2348,17 @@
-
+
00:20:00
60 day exact restart test activating FATES prescribed physiology mode on an f45 grid.
-
+
-
+
-
00:40:00
@@ -2506,7 +2367,7 @@
-
+
00:40:00
@@ -2515,19 +2376,25 @@
-
-
-
+
+
00:40:00
30 day exact restart test activating FATES size and age mortality mode on an f45 grid.
-
+
-
-
+
+
+
+ 00:20:00
+
+
+
+
+
00:20:00
@@ -2536,8 +2403,7 @@
-
-
+
00:40:00
@@ -2546,7 +2412,7 @@
-
+
01:00:00
@@ -2555,9 +2421,9 @@
-
-
-
+
+
+
00:20:00
@@ -2566,8 +2432,8 @@
-
-
+
+
00:60:00
@@ -2577,7 +2443,7 @@
-
+
00:20:00
Want at least a month-long debug test covering the output_crop usermod, as well as a test covering the output_crop_highfreq usermod. (Note that we already have a year+ test of output_crop via a cmip6 test, so having this test just be a month, rather than a year, seems good enough.)
@@ -2587,7 +2453,7 @@
-
+
00:20:00
Want a year-long test covering the output_bgc and output_bgc_highfreq usermods; don't want a highfreq, year-long global test because of the output volume, so this is single-point.
@@ -2597,7 +2463,7 @@
-
+
00:10:00
Want a year-long test covering the output_sp and output_sp_highfreq usermods; don't want a highfreq, year-long global test because of the output volume, so this is single-point.
@@ -2607,7 +2473,7 @@
-
+
00:30:00
Can use this test to determine if there are significant throughput changes, at least for this common and important configuration. Note that this deliberately doesn't have any testmods in order to (1) avoid doing history output (because the timing of output can be very variable, and mixing output timing with other aspects of model time can be confusing), and (2) generally keep the test replicating a production configuration as closely as possible (so, for example, we do NOT set BFBFLAG=TRUE for this test).
@@ -2620,7 +2486,7 @@
-
+
00:20:00
Basic LILAC smoke test. Needs to use the nuopc driver. Uses stub atmosphere to avoid needing to download a bunch of unnecessary data if run on a different machine.
@@ -2642,14 +2508,4 @@
-
-
-
-
- 00:30:00
- This test runs CTSM's Fortran unit tests. We're abusing the system test infrastructure to run these, so that a run of the test suite results in the unit tests being run as well. Grid and compset are irrelevant here, except that compset must be one that includes CTSM in order for CIME to find the test definition.
-
-
-
-
diff --git a/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/shell_commands b/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/shell_commands
index 446125abf9..4fb282d511 100755
--- a/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/shell_commands
+++ b/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/shell_commands
@@ -5,8 +5,8 @@
./xmlchange DATM_CLMNCEP_YR_END=2006
# Comment this out if NINST_LND is greater than 1 (see: http://bugs.cgd.ucar.edu/show_bug.cgi?id=2521)
./xmlchange MPILIB=mpi-serial
-./xmlchange ATM_DOMAIN_PATH=/glade/p/cesm/cseg/inputdata/lnd/clm2/PTCLMmydatafiles.c171024/1x1pt_US-UMB
-./xmlchange LND_DOMAIN_PATH=/glade/p/cesm/cseg/inputdata/lnd/clm2/PTCLMmydatafiles.c171024/1x1pt_US-UMB
+./xmlchange ATM_DOMAIN_PATH='$DIN_LOC_ROOT/lnd/clm2/PTCLMmydatafiles.c171024/1x1pt_US-UMB'
+./xmlchange LND_DOMAIN_PATH='$DIN_LOC_ROOT/lnd/clm2/PTCLMmydatafiles.c171024/1x1pt_US-UMB'
./xmlchange ATM_DOMAIN_FILE=domain.lnd.1x1pt_US-UMB_navy.171024.nc
./xmlchange LND_DOMAIN_FILE=domain.lnd.1x1pt_US-UMB_navy.171024.nc
./xmlchange --append CLM_BLDNML_OPTS='-mask navy -no-crop'
@@ -15,5 +15,4 @@
./xmlchange ATM_NCPL=24
./xmlchange RUN_STARTDATE=1999-01-01
./xmlchange DATM_CLMNCEP_YR_ALIGN=1999
-./xmlchange DIN_LOC_ROOT=/glade/p/cesm/cseg/inputdata
-./xmlchange DIN_LOC_ROOT_CLMFORC=/glade/p/cesm/cseg/inputdata/lnd/clm2/PTCLMmydatafiles.c171024
+./xmlchange DIN_LOC_ROOT_CLMFORC='$DIN_LOC_ROOT/lnd/clm2/PTCLMmydatafiles.c171024'
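The single quotes in the `xmlchange` calls above matter: they keep the shell from expanding `$DIN_LOC_ROOT` on the command line, so the literal variable reference is stored in the XML and CIME resolves it later against the machine's inputdata root. A minimal sketch of the quoting difference (the directory value is a stand-in, not a real path):

```shell
# Stand-in value; on a real machine CIME supplies DIN_LOC_ROOT.
DIN_LOC_ROOT=/actual/inputdata

# Single quotes: the literal string survives for CIME to resolve later.
quoted='$DIN_LOC_ROOT/lnd/clm2'

# Double quotes: the shell expands the variable immediately.
expanded="$DIN_LOC_ROOT/lnd/clm2"

echo "$quoted"
echo "$expanded"
```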
diff --git a/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/user_nl_clm b/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/user_nl_clm
index 38ce400297..03d3a0019e 100644
--- a/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/user_nl_clm
+++ b/cime_config/testdefs/testmods_dirs/clm/USUMB_mct/user_nl_clm
@@ -1,5 +1,4 @@
-! user_nl_clm namelist options written by PTCLMmkdata:
-! ./PTCLMmkdata --cesm_root ../../../.. -s US-UMB -d /glade/p/cesm/cseg/inputdata --mydatadir=/glade/p/cesm/cseg/inputdata/lnd/clm2/PTCLMmydatafiles.c171024
- fsurdat = '/glade/p/cesm/cseg/inputdata/lnd/clm2/PTCLMmydatafiles.c171024/1x1pt_US-UMB/surfdata_1x1pt_US-UMB_16pfts_Irrig_CMIP6_simyr2000_c171024.nc'
+! user_nl_clm namelist options written by PTCLMmkdata, no longer supported
+ fsurdat = '$DIN_LOC_ROOT/lnd/clm2/PTCLMmydatafiles.c171024/1x1pt_US-UMB/surfdata_1x1pt_US-UMB_16pfts_Irrig_CMIP6_simyr2000_c171024.nc'
hist_nhtfrq = 0
hist_mfilt = 1200
diff --git a/cime_config/testdefs/testmods_dirs/clm/clm50cam6LndTuningMode_1979Start/user_nl_mosart b/cime_config/testdefs/testmods_dirs/clm/clm50cam6LndTuningMode_1979Start/user_nl_mosart
new file mode 100644
index 0000000000..e1e96bc6d4
--- /dev/null
+++ b/cime_config/testdefs/testmods_dirs/clm/clm50cam6LndTuningMode_1979Start/user_nl_mosart
@@ -0,0 +1,11 @@
+!----------------------------------------------------------------------------------
+! Users should add all user specific namelist changes below in the form of
+! namelist_var = new_namelist_value
+! NOTE: namelist variable rtm_tstep CAN ONLY be changed by modifying the value
+! of the xml variable ROF_NCPL in env_run.xml
+! NOTE: if the xml variable ROF GRID in env_build.xml is set to 'null', then
+! the RTM build-namelist will set do_rtm to .false. - and will ignore
+! any change below
+!----------------------------------------------------------------------------------
+rtmhist_nhtfrq = 1
+rtmhist_mfilt = 1
diff --git a/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/README b/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/README
index 81fb991ed0..af5d819ffc 100644
--- a/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/README
+++ b/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/README
@@ -9,10 +9,10 @@ According to the file
the following two files used in this test
are default files for the following options:
-fsurdat = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/surfdata_map/surfdata_10x15_78pfts_CMIP6_simyr1850_c170824.nc'
+fsurdat = '$DIN_LOC_ROOT/lnd/clm2/surfdata_map/surfdata_10x15_78pfts_CMIP6_simyr1850_c170824.nc'
hgrid="10x15" sim_year="1850" use_crop=".true."
-flanduse_timeseries = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/surfdata_map/landuse.timeseries_10x15_hist_78pfts_CMIP6_simyr1850-2015_c170824.nc'
+flanduse_timeseries = '$DIN_LOC_ROOT/lnd/clm2/surfdata_map/landuse.timeseries_10x15_hist_78pfts_CMIP6_simyr1850-2015_c170824.nc'
hgrid="10x15" sim_year_range="1850-2000" use_crop=".true."
hgrid="10x15" rcp="8.5" sim_year_range="1850-2100" use_crop=".true."
hgrid="10x15" rcp="6" sim_year_range="1850-2100" use_crop=".true."
diff --git a/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/user_nl_clm b/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/user_nl_clm
index ff78e0122c..8c4fed6873 100644
--- a/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/user_nl_clm
+++ b/cime_config/testdefs/testmods_dirs/clm/collapse_pfts_78_to_16_decStart_f10/user_nl_clm
@@ -1,2 +1,2 @@
-fsurdat = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/surfdata_map/surfdata_10x15_78pfts_CMIP6_simyr1850_c170824.nc'
-flanduse_timeseries = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/surfdata_map/landuse.timeseries_10x15_hist_78pfts_CMIP6_simyr1850-2015_c170824.nc'
+fsurdat = '$DIN_LOC_ROOT/lnd/clm2/surfdata_map/surfdata_10x15_78pfts_CMIP6_simyr1850_c170824.nc'
+flanduse_timeseries = '$DIN_LOC_ROOT/lnd/clm2/surfdata_map/landuse.timeseries_10x15_hist_78pfts_CMIP6_simyr1850-2015_c170824.nc'
diff --git a/cime_config/testdefs/testmods_dirs/clm/decStart/user_nl_mosart b/cime_config/testdefs/testmods_dirs/clm/decStart/user_nl_mosart
new file mode 100644
index 0000000000..7c5018f864
--- /dev/null
+++ b/cime_config/testdefs/testmods_dirs/clm/decStart/user_nl_mosart
@@ -0,0 +1,11 @@
+!----------------------------------------------------------------------------------
+! Users should add all user specific namelist changes below in the form of
+! namelist_var = new_namelist_value
+! NOTE: namelist variable rtm_tstep CAN ONLY be changed by modifying the value
+! of the xml variable ROF_NCPL in env_run.xml
+! NOTE: if the xml variable ROF GRID in env_build.xml is set to 'null', then
+! the RTM build-namelist will set do_rtm to .false. - and will ignore
+! any change below
+!----------------------------------------------------------------------------------
+rtmhist_nhtfrq =-24,-8
+rtmhist_mfilt = 1,1
diff --git a/doc/.ChangeLog_template b/doc/.ChangeLog_template
index f72764d0f4..9beedfe4ea 100644
--- a/doc/.ChangeLog_template
+++ b/doc/.ChangeLog_template
@@ -100,16 +100,7 @@ infrastructure should be run when appropriate, as described below.
build-namelist tests (if CLMBuildNamelist.pm has changed):
- cheyenne -
-
- tools-tests (test/tools) (if tools have been changed):
-
- cheyenne -
-
- PTCLM testing (tools/shared/PTCLM/test): (if cime or cime_config are changed)
- (PTCLM is being deprecated, so we only expect this to be done on occasion)
-
- cheyenne -
+ derecho -
python testing (if python code has changed; see instructions in python/README.md; document testing done):
@@ -117,13 +108,10 @@ infrastructure should be run when appropriate, as described below.
regular tests (aux_clm: https://github.com/ESCOMP/CTSM/wiki/System-Testing-Guide#pre-merge-system-testing):
- cheyenne ----
- izumi -------
-
- fates tests:
- cheyenne ----
-
- any other testing (give details below):
+ derecho_intel -----
+ izumi_intel -------
+ izumi_nag ---------
+ izumi_gnu ---------
If the tag used for baseline comparisons was NOT the previous tag, note that here:
diff --git a/doc/ChangeSum b/doc/ChangeSum
index f71295bc96..2735d49b82 100644
--- a/doc/ChangeSum
+++ b/doc/ChangeSum
@@ -1,5 +1,9 @@
Tag Who Date Summary
============================================================================================================================
+release-cesm2.2.04 erik 12/12/2023 Update PElayout and tests to work on Derecho, remove tools and add notes about that, fix some tests
+release-cesm2.2.03 erik 12/04/2023 Quick tag with history update needed for SCAM
+release-cesm2.2.02 erik 08/23/2023 Change the sorting method for history files to help with performance on lustre filesystems
+release-cesm2.2.01 erik 09/02/2020 Fix clm4_5 initial conditions
ctsm5.1.dev042 erik 05/28/2021 Small answer changes for double precision constants and soil limits
ctsm5.1.dev041 rgknox 05/27/2021 bring FATES API up to sci.1.46.0_api.16.0.0 (methane and cn hooks)
ctsm5.1.dev040 slevis 05/20/2021 mksurfdata_map: replace SRC files of various masks with SRC files w no mask
diff --git a/doc/release-cesm2.2.ChangeLog b/doc/release-cesm2.2.ChangeLog
index 1f61e49095..14144c35f2 100644
--- a/doc/release-cesm2.2.ChangeLog
+++ b/doc/release-cesm2.2.ChangeLog
@@ -1,4 +1,199 @@
===============================================================
+Tag name: release-cesm2.2.04
+Originator(s): erik (Erik Kluzek,UCAR/TSS,303-497-1326)
+Date: Tue 12 Dec 2023 01:43:47 PM MST
+One-line Summary: Update PElayout and tests to work on Derecho, remove tools and add notes about that, fix some tests
+
+Purpose of this version:
+------------------------
+
+Get working PE-layouts for Derecho. Move test list over to Derecho from Cheyenne.
+Remove all of the CTSM tools; users should use the latest tools instead.
+Some fixes to various tests to get them working.
+
+
+CTSM Master Tag This Corresponds To: ctsm1.0.dev108 (with most changes from ctsm1.0.dev111)
+
+Summary of changes:
+-------------------
+
+Issues fixed (include CTSM Issue #):
+ Resolves #2257 -- tag for CESM2.2.2 CESM release
+
+Science changes since: None
+
+Software changes since: Changes to PE-layouts and testing
+
+Changes to User Interface since:
+
+Testing:
+--------
+
+ [PASS means all tests PASS and OK means tests PASS other than expected fails.]
+
+ regular tests (aux_clm):
+
+ derecho_intel ----- OK
+ izumi_nag --------- OK
+ izumi_intel ------- OK
+
+Summary of Answer changes:
+-------------------------
+
+If the tag used for baseline comparisons was NOT the previous tag, note that here: previous
+
+Changes answers relative to baseline: No (bit-for-bit); some ER tests change answers because of mosart history files
+
+Detailed list of changes:
+------------------------
+
+Externals being used:
+ cism2_1_69_a
+ rtm1_0_72
+ mosart1_0_37_1
+ cime5.8.32.7
+ FATES tag -- sci.1.30.0_api.8.0.0
+ cdeps hash -- 45b7a85
+
+CTSM Tag versions pulled over from master development branch: None
+
+Pull Requests that document the changes (include PR ids):
+(https://github.com/ESCOMP/ctsm/pull)
+ #2281 -- PE layouts/testing to Derecho
+
+===============================================================
+===============================================================
+Tag name: release-cesm2.2.03
+Originator(s): erik (Erik Kluzek,UCAR/TSS,303-497-1326)
+Date: Mon 04 Dec 2023 10:21:10 PM MST
+One-line Summary: Quick tag with history update needed for SCAM
+
+Purpose of this version:
+------------------------
+
+Quick tag to fix a problem running PTS_MODE (SCAM) on Derecho.
+Update externals to work on Derecho.
+
+
+CTSM Master Tag This Corresponds To: ctsm1.0.dev108 (with most changes from ctsm1.0.dev111)
+
+Summary of changes:
+-------------------
+
+Science changes since: None
+
+Software changes since:
+ Backported change to ncdio_pio.F90.in from latest development CTSM by Jim Edwards
+
+Changes to User Interface since: None
+
+Testing:
+--------
+
+ [PASS means all tests PASS and OK means tests PASS other than expected fails.]
+
+ Derecho tests done that PASS:
+
+ ERP_D_Ld3.f09_g17.I2000Clm50SpGs.derecho_intel.clm-prescribed
+ SMS_D_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.derecho_intel.clm-ptsRLA
+ SMS_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.derecho_intel.clm-ptsRLB
+ SMS_Ld1_Mmpi-serial.f45_f45_mg37.I2000Clm50SpGs.derecho_intel.clm-ptsROA
+
+Summary of Answer changes:
+-------------------------
+
+If the tag used for baseline comparisons was NOT the previous tag, note that here: previous
+
+Changes answers relative to baseline: No bit-for-bit
+
+Detailed list of changes:
+------------------------
+
+Externals being used:
+ cism2_1_69_a
+ rtm1_0_72
+ mosart1_0_37_1
+ cime5.8.32.7
+ FATES tag -- sci.1.30.0_api.8.0.0
+ cdeps hash -- 45b7a85
+
+CTSM Tag versions pulled over from master development branch: None
+
+Pull Requests that document the changes (include PR ids): None, went direct to branch
+
+===============================================================
+===============================================================
+Tag name: release-cesm2.2.02
+Originator(s): erik (Erik Kluzek,UCAR/TSS,303-497-1326)
+Date: Wed Aug 23 13:18:00 MDT 2023
+One-line Summary: Change the sorting method for history files to help with performance on lustre filesystems
+
+Purpose of this version:
+------------------------
+
+Bring in a change from Jim Edwards changing the way that history file variables
+are sorted which improves performance on lustre filesystems.
+
+Also update the cime tag so it will work on izumi with the new domain name.
+
+Tried updating the mosart tag, but that resulted in failing tests.
+
+CTSM Master Tag This Corresponds To: ctsm1.0.dev108 (with most changes from ctsm1.0.dev111)
+
+Summary of changes:
+-------------------
+
+Testing:
+--------
+
+ [PASS means all tests PASS and OK means tests PASS other than expected fails.]
+
+ build-namelist tests:
+
+ cheyenne - PASS
+
+ unit-tests (components/clm/src):
+
+ cheyenne - PASS
+ izumi ---- PASS
+
+ tools-tests (components/clm/test/tools): Not tested
+
+ regular tests (aux_clm):
+
+ cheyenne_intel ---- PASS
+ cheyenne_gnu ------ PASS
+ izumi_nag --------- PASS
+ izumi_intel ------- PASS
+
+Summary of Answer changes:
+-------------------------
+
+If the tag used for baseline comparisons was NOT the previous tag, note that here: previous
+
+Changes answers relative to baseline: No bit-for-bit
+
+Detailed list of changes:
+------------------------
+
+Externals being used:
+ cism2_1_69
+ rtm1_0_72
+ mosart1_0_37
+ cime5.8.32.1
+ FATES tag -- sci.1.30.0_api.8.0.0
+ PTCLM2_20200902
+ cdeps hash -- 45b7a85
+
+CTSM Tag versions pulled over from master development branch: None
+
+Pull Requests that document the changes (include PR ids):
+(https://github.com/ESCOMP/ctsm/pull)
+
+ https://github.com/ESCOMP/CTSM/pull/2115 -- history file optimization
+
+===============================================================
+===============================================================
Tag name: release-cesm2.2.01
Originator(s): erik (Erik Kluzek,UCAR/TSS,303-497-1326)
Date: Wed Sep 2 02:44:01 MDT 2020
diff --git a/python/ctsm/machine_defaults.py b/python/ctsm/machine_defaults.py
index 637845d7eb..ab2b5e01a7 100644
--- a/python/ctsm/machine_defaults.py
+++ b/python/ctsm/machine_defaults.py
@@ -54,6 +54,22 @@
required_args=
'-l select=1:ncpus=36:mpiprocs=1 -r n -l inception=login -k oed')
}),
+ "derecho": MachineDefaults(
+ job_launcher_type=JOB_LAUNCHER_QSUB,
+ scratch_dir=os.path.join(os.path.sep, "glade", "derecho", "scratch", get_user()),
+ baseline_dir=os.path.join(os.path.sep, "glade", "campaign", "cgd", "tss", "ctsm_baselines"),
+ account_required=True,
+ create_test_retry=0,
+ job_launcher_defaults={
+ JOB_LAUNCHER_QSUB: QsubDefaults(
+ queue="main",
+ walltime="03:50:00",
+ extra_args="",
+ # The following assumes a single node, with a single mpi proc; we may want
+ # to add more flexibility in the future, making the node / proc counts
+ # individually selectable
+ required_args="-l select=1:ncpus=128:mpiprocs=1 -V -r n -k oed")
+ }),
'hobart': MachineDefaults(
job_launcher_type=JOB_LAUNCHER_QSUB,
scratch_dir=os.path.join(os.path.sep, 'scratch', 'cluster', get_user()),
diff --git a/python/ctsm/machine_utils.py b/python/ctsm/machine_utils.py
index 41459ce3de..74f24a07f6 100644
--- a/python/ctsm/machine_utils.py
+++ b/python/ctsm/machine_utils.py
@@ -34,6 +34,8 @@ def _machine_from_hostname(hostname):
"""
if re.match(r'cheyenne\d+', hostname):
machine = 'cheyenne'
+ elif re.match(r"derecho\d+", hostname):
+ machine = "derecho"
else:
machine = hostname
diff --git a/test/tools/CLM_compare.sh b/test/tools/CLM_compare.sh
deleted file mode 100755
index 38f547c3ab..0000000000
--- a/test/tools/CLM_compare.sh
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 2 ]; then
- echo "CLM_compare.sh: incorrect number of input arguments"
- exit 1
-fi
-
-echo "CLM_compare.sh: comparing $1 "
-echo " with $2"
-
-##note syntax here as stderr and stdout from cprnc command go
-##to separate places!
-${CPRNC_EXE} ${CPRNC_OPT} $1 $2 2>&1 > cprnc.out
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "CLM_compare.sh: error doing comparison, cprnc error= $rc"
- exit 2
-fi
-
-result_old=`perl -e 'while (my $ll = <>) \
- { if ($ll =~ /(\d+)[^0-9]+compared[^0-9]+(\d+)/) \
- { print "PASS" if $1>0 && $2==0 }}' cprnc.out`
-if grep -c "the two files seem to be IDENTICAL" cprnc.out > /dev/null; then
- result=PASS
-elif grep -c "the two files seem to be DIFFERENT" cprnc.out > /dev/null; then
- result=FAIL
-else
- result=$result_old
-fi
-
-if [ "$result" = "PASS" ]; then
- echo "CLM_compare.sh: files are b4b"
-else
- echo "CLM_compare.sh: files are NOT b4b"
- exit 3
-fi
-
-exit 0
diff --git a/test/tools/Makefile b/test/tools/Makefile
deleted file mode 100644
index b5031abdba..0000000000
--- a/test/tools/Makefile
+++ /dev/null
@@ -1,12 +0,0 @@
-#
-# Makefile to build clm testing documentation
-#
-
-# Get list of tests_ files
-SOURCES = $(wildcard tests_*)
-
-all: test_table.html
-
-test_table.html: $(SOURCES)
- gen_test_table.sh
-
diff --git a/test/tools/README b/test/tools/README
deleted file mode 100644
index 470db6e4b7..0000000000
--- a/test/tools/README
+++ /dev/null
@@ -1,64 +0,0 @@
-$CTSMROOT/clm/test/tools/README 06/08/2018
-
-Scripts for testing the CLM support tools with many different
-configurations and run-time options.
-
-I. MAIN SCRIPTS:
-
-test_driver.sh - Test the CLM offline tools
-
-To use...
-
-./test_driver.sh -i
-
-on cheyenne
-
-qcmd -l select=mem=109GB -l walltime=06:00:00 -- ./test_driver.sh -i >& run.out &
-
-on izumi
-
-nohup ./test_driver.sh -i >& run.out &
-
-release tests
-
-qcmd -l walltime=08:00:00 -- env CLM_INPUT_TESTS=`pwd`/tests_posttag_nompi_regression \
-./test_driver.sh -i >& run.out &
-
-Intended for use on NCAR machines cheyenne, geyser (DAV) and hobart.
-
-II. RUNNING test_driver.sh TOOLS TESTING:
-
-Basic use:
-
-./test_driver.sh -i
-./test_driver.sh -h # to get help on options
-
-Important environment variables (just used by test_driver.sh)
-
-BL_ROOT ---------------- Root directory of CLM baseline code to compare to
- (if not set BL test will not be performed)
-BL_TESTDIR ------------- Root directory of where to put baseline tests
-CLM_INPUT_TESTS -------- Filename of file with list of tests to perform
-CLM_TESTDIR ------------ Root directory of where to put most tests
-CLM_RETAIN_FILES ------- If set to TRUE -- don't cleanup files after testing
-CLM_FC ----------------- Use given compiler
-CLM_JOBID -------------- Job identification number to use (rather than process ID)
-CLM_THREADS ------------ Number of open-MP threads to use
- (by default this is set differently by machine)
-CLM_SOFF --------------- If set to TRUE -- stop on first failed test (default FALSE)
-
-Important files for test_driver tools testing:
-
-test_driver.sh ------- Main test script for tools
-nl_files ------------- Directory with various namelists to test
-config_files --------- Directory with various configurations to test
-input_tests_master --- Master list of tests
-tests_pretag_* ------- Tests for specific machines to do by default before a tag is done
-tests_posttag_* ------ Tests for specific machines to do for more extensive testing
- after a tag is done
-CLM_compare.sh ------- Compares output history files between two cases
-T*.sh ---------------- Basic test script to do a specific type of test
-gen_test_table.sh ---- Creates HTML table of tests
-Makefile ------------- Will build the HTML table of tests
-
-../../tools/README.testing - Information on how the testing works for the CLM tools
diff --git a/test/tools/README.testnames b/test/tools/README.testnames
deleted file mode 100644
index eb6d50f38c..0000000000
--- a/test/tools/README.testnames
+++ /dev/null
@@ -1,63 +0,0 @@
-Tests for test_driver are for the CLM tools only.
-
-Test naming conventions for the test_driver.sh script:
-
-Test names are:
-
-xxnmi
-
-Where: xx is the two-letter test type
- sm=smoke, br=branch, er=exact restart, bl=base-line comparision,
- cb=configure-build, rp=reproducibility, op=OpenMP threading for tools
-
-n is the configuration type:
-
-1 -- unused
-2 -- unused
-3 -- unused
-4 -- unused
-5 -- unused
-6 -- unused
-7 -- unused
-8 -- unused
-9 -- unused
-0 -- unused
-a -- unused
-b -- unused
-c -- mkprocdata_map clm5.0
-d -- mkmapgrids clm5.0
-e -- gen_domain clm5.0
-f -- PTCLM clm5.0
-g -- mksurfdata_map clm5.0
-h -- interpinic clm5.0
-i -- tools scripts clm5.0
-
-m is the resolution
-
-0 -- 0.9x1.25
-1 -- 48x96
-5 -- 10x15
-6 -- 5x5_amazon
-7 -- 1x1 brazil
-8 -- US-UMB
-9 -- 4x5
-c -- US-UMB with cycling on forcing and transient use-case
-g -- US-UMB with global forcing and grid PFT and soil
-y -- 1.9x2.5 with transient 1850-2100 for rcp=2.6 and glacier-MEC on
-T -- 1x1_numaIA
-Z -- 10x15 with crop on
-@ -- ne120np4
-# -- ne30np4
-
-i is the specific test (usually this implies...)
-
-1 -- Serial script
-2 -- Serial
-3 -- OpenMP only
-4 -- serial, DEBUG
-7 -- OpenMP only second test, DEBUG
-8 -- OpenMP only third test, DEBUG
-9 -- Serial Script
-0 -- Serial Script
-
-
diff --git a/test/tools/TBLCFGtools.sh b/test/tools/TBLCFGtools.sh
deleted file mode 100755
index 6276c885e2..0000000000
--- a/test/tools/TBLCFGtools.sh
+++ /dev/null
@@ -1,120 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 3 ]; then
- echo "TBLCFGtools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-if [ -z "$BL_ROOT" ] && [ -z "$BL_TESTDIR" ]; then
- echo "TBL.sh: no environment variables set for baseline test - will skip"
- exit 255
-fi
-
-tool=$(basename $1)
-test_name=TBLCFGtools.$tool.$2.$3
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TBLCFGtools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TBLCFGtools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TBLCFGtools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TBLCFGtools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TBLCFGtools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-echo "TBLCFGtools.sh: calling TSMCFGtools.sh to run $tool executable"
-${CLM_SCRIPTDIR}/TSMCFGtools.sh $1 $2 $3
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "TBLCFGtools.sh: error from TSMCFGtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-if [ -n "${BL_ROOT}" ]; then
- if [ -z "$BL_TESTDIR" ]; then
- BL_TESTDIR=${CLM_TESTDIR}.bl
- fi
- echo "TBLCFGtools.sh: generating baseline data from root $BL_ROOT - results in $BL_TESTDIR"
-
- echo "TBLCFGtools.sh: calling ****baseline**** TSMCFGtools.sh for smoke test"
- bl_dir=`/bin/ls -1d ${BL_ROOT}/test/tools`
- env CLM_TESTDIR=${BL_TESTDIR} \
- CLM_ROOT=${BL_ROOT} \
- CLM_SCRIPTDIR=$bl_dir \
- $bl_dir/TSMCFGtools.sh $1 $2 $3
- rc=$?
- if [ $rc -ne 0 ]; then
- echo "TBLCFGtools.sh: error from *baseline* TSMCFGtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
- fi
-fi
-
-echo "TBLCFGtools.sh: starting b4b comparisons "
-files_to_compare=`cd ${CLM_TESTDIR}/TSMCFGtools.$tool.$2.$3; ls *.nc`
-if [ -z "${files_to_compare}" ] && [ "$debug" != "YES" ]; then
- echo "TBLCFGtools.sh: error locating files to compare"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-all_comparisons_good="TRUE"
-for compare_file in ${files_to_compare}; do
-
- env CPRNC_OPT="-m" \
- ${CLM_SCRIPTDIR}/CLM_compare.sh \
- ${BL_TESTDIR}/TSMCFGtools.$tool.$2.$3/${compare_file} \
- ${CLM_TESTDIR}/TSMCFGtools.$tool.$2.$3/${compare_file}
- rc=$?
- mv cprnc.out cprnc.${compare_file}.out
- if [ $rc -eq 0 ]; then
- echo "TBLCFGtools.sh: comparison successful; output in ${rundir}/cprnc.${compare_file}.out"
- else
- echo "TBLCFGtools.sh: error from CLM_compare.sh= $rc; see ${rundir}/cprnc.${compare_file}.out for details
-"
- all_comparisons_good="FALSE"
- fi
-done
-
-if [ ${all_comparisons_good} = "TRUE" ]; then
- echo "TBLCFGtools.sh: baseline test passed"
- echo "PASS" > TestStatus
- if [ $CLM_RETAIN_FILES != "TRUE" ]; then
- echo "TBLCFGtools.sh: removing some unneeded files to save disc space"
- rm *.nc
- rm *.r*
- fi
-else
- echo "TBLCFGtools.sh: at least one file comparison did not pass"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 7
-fi
-
-exit 0
diff --git a/test/tools/TBLscript_tools.sh b/test/tools/TBLscript_tools.sh
deleted file mode 100755
index 284ac710d1..0000000000
--- a/test/tools/TBLscript_tools.sh
+++ /dev/null
@@ -1,148 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 3 ]; then
- echo "TBLscript_tools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-if [ -z "$BL_ROOT" ] && [ -z "$BL_TESTDIR" ]; then
- echo "TBLscript_tools.sh: no environment variables set for baseline test - will skip"
- exit 255
-fi
-
-test_name=TBLscript_tools.$1.$2.$3
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TBLscript_tools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TBLscript_tools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TBLscript_tools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TBLscript_tools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TBLscript_tools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-echo "TBLscript_tools.sh: calling TSMscript_tools.sh to run $1 executable"
-${CLM_SCRIPTDIR}/TSMscript_tools.sh $1 $2 $3
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "TBLscript_tools.sh: error from TSMtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-if [ -n "${BL_ROOT}" ]; then
- if [ -z "$BL_TESTDIR" ]; then
- BL_TESTDIR=${CLM_TESTDIR}.bl
- fi
- echo "TBLscript_tools.sh: generating baseline data from root $BL_ROOT - results in $BL_TESTDIR"
-
- echo "TBLscript_tools.sh: calling ****baseline**** TSMtools.sh for smoke test"
- bl_dir=`/bin/ls -1d ${BL_ROOT}/test/tools`
- env CLM_TESTDIR=${BL_TESTDIR} \
- CLM_SCRIPTDIR=$bl_dir \
- CLM_ROOT=$BL_ROOT \
- CTSM_ROOT=$BL_ROOT \
- CIME_ROOT=$BL_ROOT/cime \
- $bl_dir/TSMscript_tools.sh $1 $2 $3
- rc=$?
- if [ $rc -ne 0 ]; then
- echo "TBLscript_tools.sh: error from *baseline* TSMscript_tools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
- fi
-fi
-
-echo "TBLscript_tools.sh: starting b4b comparisons "
-files_to_compare=`cd ${CLM_TESTDIR}/TSMscript_tools.$1.$2.$3; ls *.nc`
-if [ -z "${files_to_compare}" ] && [ "$debug" != "YES" ]; then
- echo "TBLscript_tools.sh: error locating files to compare"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-all_comparisons_good="TRUE"
-for compare_file in ${files_to_compare}; do
-
- # For PTCLM, skip comparisons of mapping files, since these aren't really
- # necessary, take a lot of time, and cprnc.pl can crash if there are mapping
- # files with 0 overlaps
- if [[ "$2" == "PTCLM" ]]; then
- if [[ "$compare_file" == map* ]]; then
- echo "SKIPPING: $compare_file"
- continue
- fi
- fi
-
- env CPRNC_OPT="-m" \
- ${CLM_SCRIPTDIR}/CLM_compare.sh \
- ${BL_TESTDIR}/TSMscript_tools.$1.$2.$3/${compare_file} \
- ${CLM_TESTDIR}/TSMscript_tools.$1.$2.$3/${compare_file}
- rc=$?
- mv cprnc.out cprnc.${compare_file}.out
- if [ $rc -eq 0 ]; then
- echo "TBLscript_tools.sh: comparison successful; output in ${rundir}/cprnc.${compare_file}.out"
- else
- echo "TBLscript_tools.sh: error from CLM_compare.sh= $rc; see ${rundir}/cprnc.${compare_file}.out for details"
- all_comparisons_good="FALSE"
- fi
-done
-# Compare text files for PTCLM if they exist
-files_to_compare=`cd ${CLM_TESTDIR}/TSMscript_tools.$1.$2.$3; ls README*`
-for compare_file in ${files_to_compare}; do
-
- diff \
- ${BL_TESTDIR}/TSMscript_tools.$1.$2.$3/${compare_file} \
- ${CLM_TESTDIR}/TSMscript_tools.$1.$2.$3/${compare_file} \
- > diff.${compare_file}.out
- rc=$?
- if [ $rc -eq 0 ]; then
- echo "TBLscript_tools.sh: comparison successful; output in ${rundir}/diff.${compare_file}.out"
- else
- echo "TBLscript_tools.sh: error from CLM_compare.sh= $rc; see ${rundir}/diff.${compare_file}.out for details"
- all_comparisons_good="FALSE"
- fi
-done
-
-if [ ${all_comparisons_good} = "TRUE" ]; then
- echo "TBLscript_tools.sh: baseline test passed"
- echo "PASS" > TestStatus
- if [ $CLM_RETAIN_FILES != "TRUE" ]; then
- echo "TBLscript_tools.sh: removing some unneeded files to save disc space"
- rm *.nc
- rm *.r*
- fi
-else
- echo "TBLscript_tools.sh: at least one file comparison did not pass"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 7
-fi
-
-
-
-exit 0
diff --git a/test/tools/TBLtools.sh b/test/tools/TBLtools.sh
deleted file mode 100755
index 555ea7d1be..0000000000
--- a/test/tools/TBLtools.sh
+++ /dev/null
@@ -1,119 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 3 ]; then
- echo "TBLtools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-if [ -z "$BL_ROOT" ] && [ -z "$BL_TESTDIR" ]; then
- echo "TBL.sh: no environment variables set for baseline test - will skip"
- exit 255
-fi
-
-test_name=TBLtools.$1.$2.$3
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TBLtools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TBLtools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TBLtools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TBLtools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TBLtools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-echo "TBLtools.sh: calling TSMtools.sh to run $1 executable"
-${CLM_SCRIPTDIR}/TSMtools.sh $1 $2 $3
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "TBLtools.sh: error from TSMtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-if [ -n "${BL_ROOT}" ]; then
- if [ -z "$BL_TESTDIR" ]; then
- BL_TESTDIR=${CLM_TESTDIR}.bl
- fi
- echo "TBLtools.sh: generating baseline data from root $BL_ROOT - results in $BL_TESTDIR"
-
- echo "TBLtools.sh: calling ****baseline**** TSMtools.sh for smoke test"
- bl_dir=`/bin/ls -1d ${BL_ROOT}/test/tools`
- env CLM_TESTDIR=${BL_TESTDIR} \
- CLM_ROOT=${BL_ROOT} \
- CLM_SCRIPTDIR=$bl_dir \
- $bl_dir/TSMtools.sh $1 $2 $3
- rc=$?
- if [ $rc -ne 0 ]; then
- echo "TBLtools.sh: error from *baseline* TSMtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
- fi
-fi
-
-echo "TBLtools.sh: starting b4b comparisons "
-files_to_compare=`cd ${CLM_TESTDIR}/TSMtools.$1.$2.$3; ls *.nc`
-if [ -z "${files_to_compare}" ] && [ "$debug" != "YES" ]; then
- echo "TBLtools.sh: error locating files to compare"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-all_comparisons_good="TRUE"
-for compare_file in ${files_to_compare}; do
-
- env CPRNC_OPT="-m" \
- ${CLM_SCRIPTDIR}/CLM_compare.sh \
- ${BL_TESTDIR}/TSMtools.$1.$2.$3/${compare_file} \
- ${CLM_TESTDIR}/TSMtools.$1.$2.$3/${compare_file}
- rc=$?
- mv cprnc.out cprnc.${compare_file}.out
- if [ $rc -eq 0 ]; then
- echo "TBLtools.sh: comparison successful; output in ${rundir}/cprnc.${compare_file}.out"
- else
- echo "TBLtools.sh: error from CLM_compare.sh= $rc; see ${rundir}/cprnc.${compare_file}.out for details
-"
- all_comparisons_good="FALSE"
- fi
-done
-
-if [ ${all_comparisons_good} = "TRUE" ]; then
- echo "TBLtools.sh: baseline test passed"
- echo "PASS" > TestStatus
- if [ $CLM_RETAIN_FILES != "TRUE" ]; then
- echo "TBLtools.sh: removing some unneeded files to save disc space"
- rm *.nc
- rm *.r*
- fi
-else
- echo "TBLtools.sh: at least one file comparison did not pass"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 7
-fi
-
-exit 0
diff --git a/test/tools/TCBCFGtools.sh b/test/tools/TCBCFGtools.sh
deleted file mode 100755
index 5c0b015123..0000000000
--- a/test/tools/TCBCFGtools.sh
+++ /dev/null
@@ -1,135 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 2 ]; then
- echo "TCBCFGtools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-tool=$(basename $1)
-test_name=TCBCFGtools.$tool.$2
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TCBCFGtools.sh: build test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TCBCFGtools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TCBCFGtools.sh: build test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TCBCFGtools.sh: this build test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-cfgdir=`ls -1d ${CLM_ROOT}/tools/${1}`
-if [ $? -ne 0 ];then
- cfgdir=`ls -1d ${CIME_ROOT}/tools/mapping/${1}*`
- echo "use: $cfgdir"
-fi
-blddir=${CLM_TESTDIR}/${test_name}/src
-if [ -d ${blddir} ]; then
- rm -r ${blddir}
-fi
-mkdir -p ${blddir}
-if [ $? -ne 0 ]; then
- echo "TCBCFGtools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${blddir}
-
-echo "TCBCFGtools.sh: building $tool executable; output in ${blddir}/test.log"
-#
-# Copy build files over
-#
-cp $cfgdir/src/Makefile .
-cp $cfgdir/src/Filepath .
-#
-# Add cfgdir path to beginning of each path in Filepath
-#
-touch Filepath
-while read filepath_arg; do
- echo "${cfgdir}/src/${filepath_arg}" >> Filepath
-done < ${cfgdir}/src/Filepath
-
-#
-# Figure out configuration
-#
-if [ ! -f ${CLM_SCRIPTDIR}/config_files/$tool ]; then
- echo "TCB.sh: configure options file ${CLM_SCRIPTDIR}/config_files/$tool not found"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-##construct string of args to configure
-config_string=" "
-while read config_arg; do
- config_string="${config_string}${config_arg} "
-done < ${CLM_SCRIPTDIR}/config_files/$tool
-
-if [ "$TOOLSLIBS" != "" ]; then
- export SLIBS=$TOOLSLIBS
-fi
-echo "env CIMEROOT=$CLM_ROOT/cime COMPILER=$CESM_COMP $config_string $CLM_ROOT/cime/tools/configure --macros-format Makefile --machine $CESM_MACH $TOOLS_CONF_STRING"
-env CIMEROOT=$CLM_ROOT/cime COMPILER=$CESM_COMP $config_string $CLM_ROOT/cime/tools/configure --macros-format Makefile --machine $CESM_MACH $TOOLS_CONF_STRING >> test.log 2>&1
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "TCBCFGtools.sh: configure failed, error from configure= $rc"
- echo "TCBCFGtools.sh: see ${blddir}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
-fi
-
-. $INITMODULES
-. ./.env_mach_specific.sh
-
-attempt=1
-still_compiling="TRUE"
-while [ $still_compiling = "TRUE" ]; do
-
- echo "TCBCFGtools.sh: call to make:"
- echo " ${MAKE_CMD} USER_CPPDEFS=-DLINUX"
- if [ "$debug" != "YES" ]; then
- ${MAKE_CMD} USER_CPPDEFS=-DLINUX >> test.log 2>&1
- status="PASS"
- rc=$?
- else
- status="GEN"
- rc=0
- fi
- if [ $rc -eq 0 ]; then
- echo "TCBCFGtools.sh: make was successful"
- echo "TCBCFGtools.sh: configure and build test passed"
- echo "$status" > TestStatus
- if [ $CLM_RETAIN_FILES != "TRUE" ]; then
- echo "TCBCFGtools.sh: removing some unneeded files to save disc space"
- rm *.o
- rm *.mod
- fi
- still_compiling="FALSE"
- elif [ $attempt -lt 10 ] && \
- grep -c "LICENSE MANAGER PROBLEM" test.log > /dev/null; then
- attempt=`expr $attempt + 1`
- echo "TCBCFGtools.sh: encountered License Manager Problem; launching attempt #$attempt"
- else
- echo "TCBCFGtools.sh: clm build failed, error from make= $rc"
- echo "TCBCFGtools.sh: see ${blddir}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
- fi
-done
-if [ "$TOOLSLIBS" != "" ]; then
- export -n SLIBS
-fi
-
-exit 0
diff --git a/test/tools/TCBscripttools.sh b/test/tools/TCBscripttools.sh
deleted file mode 100755
index 2605f4b5c4..0000000000
--- a/test/tools/TCBscripttools.sh
+++ /dev/null
@@ -1,80 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 2 ]; then
- echo "TCBscripttools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-test_name=TCBscripttools.$1.$2
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TCBscripttools.sh: build test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TCBscripttools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TCBscripttools.sh: build test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TCBscripttools.sh: this build test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-cfgdir=`ls -1d ${CLM_ROOT}/tools/$1`
-blddir=${CLM_TESTDIR}/${test_name}
-if [ -d ${blddir} ]; then
- rm -r ${blddir}
-fi
-mkdir -p ${blddir}
-if [ $? -ne 0 ]; then
- echo "TCBscripttools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${blddir}
-
-echo "TCBscripttools.sh: building $1 executables; output in ${blddir}/test.log"
-#
-# Build script to exercise
-#
-if [ ! -x ${cfgdir}/$2 ]; then
- echo "TCB.sh: build run script file ${cfgdir}/$2 not found"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-echo "TCBscripttools.sh: run the build scriptmake:"
-echo " ${cfgdir}/$2"
-
-if [ "$debug" != "YES" ]; then
- export CTSM_ROOT=${CLM_ROOT}
- export CIME_ROOT=${CLM_ROOT}/cime
- ${cfgdir}/$2 >> test.log 2>&1
- rc=$(( $rc + $? ))
- status="PASS"
-else
- status="GEN"
- rc=0
-fi
-if [ $rc -eq 0 ]; then
- echo "TCBscripttools.sh: build script was successful"
- echo "TCBscripttools.sh: build script test passed"
- echo "$status" > TestStatus
-else
- echo "TCBscripttools.sh: clm build script failed, error from build script= $rc"
- echo "TCBscripttools.sh: see ${CLM_TESTDIR}/${test_name}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-exit 0
diff --git a/test/tools/TCBtools.sh b/test/tools/TCBtools.sh
deleted file mode 100755
index b0ea9e4a87..0000000000
--- a/test/tools/TCBtools.sh
+++ /dev/null
@@ -1,130 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 2 ]; then
- echo "TCBtools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-test_name=TCBtools.$1.$2
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TCBtools.sh: build test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TCBtools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TCBtools.sh: build test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TCBtools.sh: this build test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-cfgdir=`ls -1d ${CLM_ROOT}/tools/$1`
-blddir=${CLM_TESTDIR}/${test_name}/src
-if [ -d ${blddir} ]; then
- rm -r ${blddir}
-fi
-mkdir -p ${blddir}
-if [ $? -ne 0 ]; then
- echo "TCBtools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${blddir}
-
-echo "TCBtools.sh: building $1 executable; output in ${blddir}/test.log"
-#
-# Copy build files over
-#
-cp $cfgdir/src/Makefile .
-cp $cfgdir/src/Srcfiles .
-cp $cfgdir/src/Mkdepends .
-cp $cfgdir/src/Makefile.common .
-#
-# Add cfgdir path to beginning of each path in Filepath
-#
-touch Filepath
-while read filepath_arg; do
- echo "${cfgdir}/src/${filepath_arg}" >> Filepath
-done < ${cfgdir}/src/Filepath
-
-#
-# Figure out configuration
-#
-if [ ! -f ${CLM_SCRIPTDIR}/config_files/$2 ]; then
- echo "TCB.sh: configure options file ${CLM_SCRIPTDIR}/config_files/$2 not found"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-##construct string of args to configure
-config_string="$TOOLS_MAKE_STRING TOOLROOT=$cfgdir "
-while read config_arg; do
- config_string="${config_string}${config_arg} "
-done < ${CLM_SCRIPTDIR}/config_files/$2
-
-attempt=1
-still_compiling="TRUE"
-if [ "$TOOLSLIBS" != "" ]; then
- export SLIBS=$TOOLSLIBS
-fi
-while [ $still_compiling = "TRUE" ]; do
-
- if [ "$2" = "gen_domain" ]; then
- HOSTNAME=`uname -n | cut -c 1-2`
- if [ "$HOSTNAME" = "be" ]; then
- echo "TCBtools.sh: run configure for gen_domain on bluefire"
- env CIMEROOT=${CLM_ROOT}/cime ${CLM_ROOT}/cime/tools/configure -mach bluefire >> test.log 2>&1
- rc=$?
- fi
- fi
-
- ln -s Macros.make Macros
-
- echo "TCBtools.sh: call to make:"
- echo " ${MAKE_CMD} ${config_string} "
- if [ "$debug" != "YES" ]; then
- ${MAKE_CMD} ${config_string} >> test.log 2>&1
- status="PASS"
- rc=$(( $rc + $? ))
- else
- status="GEN"
- rc=0
- fi
- if [ $rc -eq 0 ]; then
- echo "TCBtools.sh: make was successful"
- echo "TCBtools.sh: configure and build test passed"
- echo "$status" > TestStatus
- if [ $CLM_RETAIN_FILES != "TRUE" ]; then
- echo "TCBtools.sh: removing some unneeded files to save disc space"
- rm *.o
- rm *.mod
- fi
- still_compiling="FALSE"
- elif [ $attempt -lt 10 ] && \
- grep -c "LICENSE MANAGER PROBLEM" test.log > /dev/null; then
- attempt=`expr $attempt + 1`
- echo "TCBtools.sh: encountered License Manager Problem; launching attempt #$attempt"
- else
- echo "TCBtools.sh: clm build failed, error from make= $rc"
- echo "TCBtools.sh: see ${CLM_TESTDIR}/${test_name}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
- fi
-done
-if [ "$TOOLSLIBS" != "" ]; then
- export -n SLIBS
-fi
-
-exit 0
diff --git a/test/tools/TOPtools.sh b/test/tools/TOPtools.sh
deleted file mode 100755
index c8efb440a2..0000000000
--- a/test/tools/TOPtools.sh
+++ /dev/null
@@ -1,120 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 3 ]; then
- echo "TOPtools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-test_name=TOPtools.$1.$2.$3
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TOPtools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TOPtools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TOPtools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TOPtools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TOPtools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-if [ ${CLM_THREADS} -lt 2 ]; then
- echo "TOPtools.sh: error not enough threads are being used to do the comparision"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
-fi
-if [ "$2" != "tools__o" ] && [ "$2" != "tools__do" ]; then
- echo "TOPtools.sh: error build needs to be done Open-MP"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
-fi
-
-echo "TOPtools.sh: calling TSMtools.sh to run $1 executable"
-${CLM_SCRIPTDIR}/TSMtools.sh $1 $2 $3
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "TOPtools.sh: error from TSMtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-mkdir $rundir/$CLM_THREADS
-cp ${CLM_TESTDIR}/TSMtools.$1.$2.$3/*.nc $rundir/$CLM_THREADS
-
-# Get a list of different threads to run for, powers of 2 from 1 up to the thread count
-threads=1
-list="1 "
-until [ "$threads" -ge "$CLM_THREADS" ]; do
- threads=`perl -e "$CLM_THREADS<$threads*2 ? print $CLM_THREADS : print $threads*2"`
- if [ "$threads" -lt "$CLM_THREADS" ]; then list="$list $threads "; fi
-done
-
-all_comparisons_good="TRUE"
-for threads in $list
-do
- echo "TOPtools.sh: calling TSMtools.sh to run $1 executable for $threads threads"
- env CLM_THREADS=$threads CLM_RERUN=yes ${CLM_SCRIPTDIR}/TSMtools.sh $1 $2 $3
- rc=$?
- if [ $rc -ne 0 ]; then
- echo "TOPtools.sh: error from TSMtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
- fi
- mkdir $rundir/$threads
- cp ${CLM_TESTDIR}/TSMtools.$1.$2.$3/*.nc $rundir/$threads
- files_to_compare=`cd $rundir/$threads; ls *.nc`
- for compare_file in ${files_to_compare}; do
-
- env CPRNC_OPT="-m" \
- ${CLM_SCRIPTDIR}/CLM_compare.sh \
- $rundir/$CLM_THREADS/${compare_file} \
- $rundir/$threads/${compare_file}
- rc=$?
- cprout="cprnc.${compare_file}.threads${threads}.out"
- mv cprnc.out $cprout
- if [ $rc -eq 0 ]; then
- echo "TOPtools.sh: comparison successful; output in $cprout"
- else
- echo "TOPtools.sh: error from CLM_compare.sh= $rc; see $cprout for details"
- all_comparisons_good="FALSE"
- fi
- done
-done
-
-if [ ${all_comparisons_good} = "TRUE" ]; then
- echo "TOPtools.sh: OpenMP comparison test passed"
- echo "PASS" > TestStatus
- if [ $CLM_RETAIN_FILES != "TRUE" ]; then
- echo "TOPtools.sh: removing some unneeded files to save disc space"
- rm */*.nc
- fi
-else
- echo "TOPtools.sh: at least one file comparison did not pass"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 7
-fi
-
-exit 0
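The deleted `TOPtools.sh` builds its list of OpenMP thread counts (powers of two below the maximum) with a perl one-liner. The same list can be built in pure POSIX sh arithmetic; a minimal sketch, with `CLM_THREADS=6` as a stand-in value (the full-thread run itself is handled separately in the original script):

```shell
#!/bin/sh
# Sketch of TOPtools.sh's thread-count list: powers of two from 1 up to,
# but not including, the maximum thread count. CLM_THREADS=6 is a stand-in.
CLM_THREADS=6

threads=1
list="1"
while [ "$threads" -lt "$CLM_THREADS" ]; do
    threads=$((threads * 2))
    # only keep counts strictly below the maximum; the max itself is the
    # baseline run the others are compared against
    if [ "$threads" -lt "$CLM_THREADS" ]; then
        list="$list $threads"
    fi
done
echo "$list"   # for CLM_THREADS=6 this prints: 1 2 4
```

This avoids the perl dependency while producing the same sub-maximum list the comparison loop iterates over.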
diff --git a/test/tools/TSMCFGtools.sh b/test/tools/TSMCFGtools.sh
deleted file mode 100755
index b667a4c6ec..0000000000
--- a/test/tools/TSMCFGtools.sh
+++ /dev/null
@@ -1,113 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 3 ]; then
- echo "TSMCFGtools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-tool=$(basename $1)
-test_name=TSMCFGtools.$tool.$2.$3
-
-
-if [ -z "$CLM_RERUN" ]; then
- CLM_RERUN="no"
-fi
-
-if [ "$CLM_RERUN" != "yes" ] && [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMCFGtools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMCFGtools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TSMCFGtools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TSMCFGtools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-cfgdir=`ls -1d ${CLM_ROOT}/tools/${1}*`
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TSMCFGtools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-echo "TSMCFGtools.sh: calling TCBCFGtools.sh to prepare $tool executable"
-${CLM_SCRIPTDIR}/TCBCFGtools.sh $1 $2
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "TSMCFGtools.sh: error from TCBCFGtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-echo "TSMCFGtools.sh: running $tool output in ${rundir}/test.log"
-
-if [ "$2" = "CFGtools__o" ] || [ "$2" = "CFGtools__do" ]; then
- toolrun="env OMP_NUM_THREADS=${CLM_THREADS} ${CLM_TESTDIR}/TCBCFGtools.$tool.$2/${tool}*"
-else
- toolrun="${CLM_TESTDIR}/TCBCFGtools.$tool.$2/${tool}*"
-fi
-
-runfile="${CLM_SCRIPTDIR}/nl_files/$tool.$3"
-if [ ! -f "${runfile}" ]; then
- echo "TSMCFGtools.sh: error ${runfile} input run file not found"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
-fi
-
-echo "Run file type = ${3#*.}"
-if [ ${3#*.} == "runoptions" ]; then
- runopts=`cat ${runfile} | sed -e "s|CSMDATA|$CSMDATA|g"`
- echo "$toolrun $runopts"
- cp $cfgdir/*.nc .
- if [ "$debug" != "YES" ] && [ "$compile_only" != "YES" ]; then
- $toolrun $runopts >> test.log 2>&1
- rc=$?
- status="PASS"
- else
- echo "Successfully created file" > test.log
- status="GEN"
- rc=0
- fi
-else
- echo "$toolrun < ${runfile}"
- if [ "$debug" != "YES" ] && [ "$compile_only" != "YES" ]; then
- $toolrun < ${runfile} >> test.log 2>&1
- rc=$?
- status="PASS"
- else
- echo "Successfully created file" > test.log
- status="GEN"
- rc=0
- fi
-fi
-
-if [ $rc -eq 0 ] && grep -ci "Successfully created " test.log > /dev/null; then
- echo "TSMCFGtools.sh: smoke test passed"
- echo "$status" > TestStatus
-else
- echo "TSMCFGtools.sh: error running $tool, error= $rc"
- echo "TSMCFGtools.sh: see ${CLM_TESTDIR}/${test_name}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-exit 0
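All of the deleted `TSM*` drivers share the same re-run guard: a failed run writes `FAIL.job<id>` to `TestStatus`, and on the next attempt the job id is recovered with the `${fail_msg#*job}` parameter expansion and compared against the current `$JOBID`. A minimal sketch of just that pattern, with hypothetical id values:

```shell
#!/bin/sh
# Sketch of the TestStatus re-run guard shared by the TSM* scripts.
# "FAIL.job123" encodes the job that failed; ${fail_msg#*job} strips
# everything up to and including the literal "job". Values are stand-ins.
fail_msg="FAIL.job123"
prev_jobid=${fail_msg#*job}
JOBID=456

if [ "$JOBID" = "$prev_jobid" ]; then
    echo "same job - will not reattempt"
else
    echo "failed under job ${prev_jobid} - archiving old results and retrying"
fi
```

Same job id means the failure just happened in this submission and is not retried; a different id means a previous submission failed, so the old results are archived and the test runs again.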
diff --git a/test/tools/TSMncl_tools.sh b/test/tools/TSMncl_tools.sh
deleted file mode 100755
index eb82142e4d..0000000000
--- a/test/tools/TSMncl_tools.sh
+++ /dev/null
@@ -1,74 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 1 ]; then
- echo "TSMncl_tools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-test_name=TSMncl_tools.$1
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMncl_tools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMncl_tools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TSMncl_tools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TSMncl_tools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-cfgdir=`ls -1d ${CLM_ROOT}/tools/$1/ncl_scripts`
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TSMncl_tools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-echo "TSMncl_tools.sh: running $1 output in ${rundir}/test.log"
-
-if [ ! -f "${cfgdir}/$1.ncl" ]; then
- echo "TSMncl_tools.sh: error ${cfgdir}/$1.ncl input script not found"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
-fi
-
-if [ "$debug" != "YES" ] && [ "$compile_only" != "YES" ]; then
- ncl ${cfgdir}/$1.ncl >> test.log 2>&1
- status="PASS"
- rc=$?
-else
- echo "success" > test.log
- status="GEN"
- rc=0
-fi
-
-if [ $rc -eq 0 ] && grep -ci "success" test.log > /dev/null; then
- echo "TSMncl_tools.sh: smoke test passed"
- echo "$status" > TestStatus
-else
- echo "TSMncl_tools.sh: error running $1, error= $rc"
- echo "TSMncl_tools.sh: see ${CLM_TESTDIR}/${test_name}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-exit 0
diff --git a/test/tools/TSMscript_tools.sh b/test/tools/TSMscript_tools.sh
deleted file mode 100755
index 360ecf86a4..0000000000
--- a/test/tools/TSMscript_tools.sh
+++ /dev/null
@@ -1,114 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 3 ]; then
- echo "TSMscript_tools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-test_name=TSMscript_tools.$1.$2.$3
-
-if [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMscript_tools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMscript_tools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TSMscript_tools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TSMscript_tools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-cfgdir=`ls -1d ${CLM_ROOT}/tools/$1`
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TSMscript_tools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-# Copy any sample files so can use them
-cp $cfgdir/sample_* $rundir
-
-optfile=${3%^*}
-cfgfile=${3#*^}
-
-if [[ "$1" == "PTCLM" ]]; then
- echo "TSMscript_tools.sh: calling TCBscripttools.sh to prepare executables for $1"
- ${CLM_SCRIPTDIR}/TCBscripttools.sh $1 $cfgfile
- rc=$?
- if [ $rc -ne 0 ]; then
- echo "TSMscript_tools.sh: error from TCBscripttools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
- fi
- # Copy map files so we can use them
- subdir=1x1pt_US-UMB
- mkdir $rundir/$subdir
- cp $CSMDATA/lnd/clm2/PTCLMmydatafiles.c171024/$subdir/map_* $rundir/$subdir
-elif [ "$optfile" != "$3" ]; then
- echo "TSMscript_tools.sh: calling TCBtools.sh to prepare $1 executable"
- ${CLM_SCRIPTDIR}/TCBtools.sh $1 $cfgfile
- rc=$?
- if [ $rc -ne 0 ]; then
- echo "TSMscript_tools.sh: error from TCBtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
- fi
- tcbtools=${CLM_TESTDIR}/TCBtools.$1.$cfgfile
-else
- tcbtools="."
-fi
-
-scopts=`cat ${CLM_SCRIPTDIR}/nl_files/$optfile | sed -e "s|CSMDATA|$CSMDATA|g" | sed -e "s|EXEDIR|$tcbtools|" | sed -e "s|CFGDIR|$cfgdir|g"`
-scopts=`echo $scopts | sed -e "s|CTSM_ROOT|$CTSM_ROOT|g" | sed -e "s|CIME_ROOT|$CIME_ROOT|g"`
-
-echo "TSMscript_tools.sh: running ${cfgdir}/$2 with $scopts; output in ${rundir}/test.log"
-
-if [ ! -f "${cfgdir}/$2" ]; then
- echo "TSMscript_tools.sh: error ${cfgdir}/$2 input script not found"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
-fi
-
-if [ "$debug" != "YES" ] && [ "$compile_only" != "YES" ]; then
- ${cfgdir}/$2 $scopts >> test.log 2>&1
- rc=$?
- status="PASS"
-else
- echo "success" > test.log
- status="GEN"
- rc=0
-fi
-
-if [ $rc -eq 0 ] && grep -ci "Successfully " test.log > /dev/null; then
- echo "TSMscript_tools.sh: smoke test passed"
- echo "$status" > TestStatus
- # Copy files from subdirectories up...
- # (use hard links rather than symbolic links because 'ln -s' does funny
- # things when there are no matching files)
- ln */*.nc */*/*.nc .
-else
- echo "TSMscript_tools.sh: error running $2, error= $rc"
- echo "TSMscript_tools.sh: see ${CLM_TESTDIR}/${test_name}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-exit 0
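`TSMscript_tools.sh` packs two values into its third argument separated by `^` (an options file and a configure file) and splits them with POSIX parameter expansion. A sketch with a sample value from `input_tests_master`:

```shell
#!/bin/sh
# Sketch of TSMscript_tools.sh's argument splitting on "^".
arg3="mksrfdt_10x15_1850^tools__ds"
optfile=${arg3%^*}   # strip shortest trailing "^...": the options-file name
cfgfile=${arg3#*^}   # strip shortest leading  "...^": the config-file name
echo "$optfile $cfgfile"
# When arg3 contains no "^", both expansions return arg3 unchanged, which is
# how the script's "$optfile" != "$3" test detects whether a build is needed.
```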
diff --git a/test/tools/TSMtools.sh b/test/tools/TSMtools.sh
deleted file mode 100755
index 33a2316973..0000000000
--- a/test/tools/TSMtools.sh
+++ /dev/null
@@ -1,117 +0,0 @@
-#!/bin/sh
-#
-
-if [ $# -ne 3 ]; then
- echo "TSMtools.sh: incorrect number of input arguments"
- exit 1
-fi
-
-test_name=TSMtools.$1.$2.$3
-
-if [ -z "$CLM_RERUN" ]; then
- CLM_RERUN="no"
-fi
-
-if [ "$CLM_RERUN" != "yes" ] && [ -f ${CLM_TESTDIR}/${test_name}/TestStatus ]; then
- if grep -c PASS ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMtools.sh: smoke test has already passed; results are in "
- echo " ${CLM_TESTDIR}/${test_name}"
- exit 0
- elif grep -c GEN ${CLM_TESTDIR}/${test_name}/TestStatus > /dev/null; then
- echo "TSMtools.sh: test already generated"
- else
- read fail_msg < ${CLM_TESTDIR}/${test_name}/TestStatus
- prev_jobid=${fail_msg#*job}
-
- if [ $JOBID = $prev_jobid ]; then
- echo "TSMtools.sh: smoke test has already failed for this job - will not reattempt; "
- echo " results are in: ${CLM_TESTDIR}/${test_name}"
- exit 2
- else
- echo "TSMtools.sh: this smoke test failed under job ${prev_jobid} - moving those results to "
- echo " ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid and trying again"
- cp -rp ${CLM_TESTDIR}/${test_name} ${CLM_TESTDIR}/${test_name}_FAIL.job$prev_jobid
- fi
- fi
-fi
-
-cfgdir=`ls -1d ${CLM_ROOT}/tools/$1`
-rundir=${CLM_TESTDIR}/${test_name}
-if [ -d ${rundir} ]; then
- rm -r ${rundir}
-fi
-mkdir -p ${rundir}
-if [ $? -ne 0 ]; then
- echo "TSMtools.sh: error, unable to create work subdirectory"
- exit 3
-fi
-cd ${rundir}
-
-echo "Copy any text files over"
-cp $cfgdir/*.txt $rundir
-
-echo "TSMtools.sh: calling TCBtools.sh to prepare $1 executable"
-${CLM_SCRIPTDIR}/TCBtools.sh $1 $2
-rc=$?
-if [ $rc -ne 0 ]; then
- echo "TSMtools.sh: error from TCBtools.sh= $rc"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 4
-fi
-
-echo "TSMtools.sh: running $1; output in ${rundir}/test.log"
-
-if [ "$3" = "tools__o" ] || [ "$3" = "tools__do" ]; then
- toolrun="env OMP_NUM_THREADS=${CLM_THREADS} ${CLM_TESTDIR}/TCBtools.$1.$2/$1"
-else
- toolrun="${CLM_TESTDIR}/TCBtools.$1.$2/$1"
-fi
-
-runfile="${cfgdir}/$1.$3"
-
-if [ ! -f "${runfile}" ]; then
- runfile="${CLM_SCRIPTDIR}/nl_files/$1.$3"
- if [ ! -f "${runfile}" ]; then
- echo "TSMtools.sh: error ${runfile} input run file not found"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 5
- fi
-fi
-
-echo "Run file type = ${3#*.}"
-if [ ${3#*.} == "runoptions" ]; then
- echo "$toolrun "`cat ${runfile}`
- cp $cfgdir/*.nc .
- if [ "$debug" != "YES" ] && [ "$compile_only" != "YES" ]; then
- $toolrun `cat ${runfile}` >> test.log 2>&1
- rc=$?
- status="PASS"
- else
- echo "Successfully created file" > test.log
- status="GEN"
- rc=0
- fi
-else
- echo "$toolrun < ${runfile}"
- if [ "$debug" != "YES" ] && [ "$compile_only" != "YES" ]; then
- $toolrun < ${runfile} >> test.log 2>&1
- rc=$?
- status="PASS"
- else
- echo "Successfully created file" > test.log
- status="GEN"
- rc=0
- fi
-fi
-
-if [ $rc -eq 0 ] && grep -ci "Successfully created " test.log > /dev/null; then
- echo "TSMtools.sh: smoke test passed"
- echo "$status" > TestStatus
-else
- echo "TSMtools.sh: error running $1, error= $rc"
- echo "TSMtools.sh: see ${CLM_TESTDIR}/${test_name}/test.log for details"
- echo "FAIL.job${JOBID}" > TestStatus
- exit 6
-fi
-
-exit 0
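`TSMtools.sh` resolves its run file in two stages: first the tool's own directory, then the test suite's `nl_files` directory, failing only if neither exists. A self-contained sketch of that fallback lookup, using temporary directories as stand-ins for `$cfgdir` and `$CLM_SCRIPTDIR` and a hypothetical tool name `mytool`:

```shell
#!/bin/sh
# Sketch of TSMtools.sh's two-stage run-file lookup with stand-in paths.
cfgdir=$(mktemp -d)
scriptdir=$(mktemp -d)
mkdir "$scriptdir/nl_files"
: > "$scriptdir/nl_files/mytool.namelist"   # only the fallback copy exists

runfile="$cfgdir/mytool.namelist"
if [ ! -f "$runfile" ]; then
    # fall back to the test scripts' nl_files directory
    runfile="$scriptdir/nl_files/mytool.namelist"
    if [ ! -f "$runfile" ]; then
        echo "error: input run file not found" >&2
        exit 5
    fi
fi
echo "using $runfile"
rm -rf "$cfgdir" "$scriptdir"
```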
diff --git a/test/tools/config_files/CFGtools__ds b/test/tools/config_files/CFGtools__ds
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/test/tools/config_files/PTCLM__s b/test/tools/config_files/PTCLM__s
deleted file mode 100644
index 8a2155ac49..0000000000
--- a/test/tools/config_files/PTCLM__s
+++ /dev/null
@@ -1 +0,0 @@
-./buildtools
diff --git a/test/tools/config_files/README b/test/tools/config_files/README
deleted file mode 100644
index bdfe5e0dd0..0000000000
--- a/test/tools/config_files/README
+++ /dev/null
@@ -1,9 +0,0 @@
-_do => debug on, omp only on
-_ds => debug on, serial mode (neither mpi nor omp)
-
-_o => debug off, omp only on
-_s => debug off, serial mode (neither mpi nor omp)
-
-tools__ds => options for tools, debug on, serial mode
-tools__do => options for tools, debug on, omp only on
-tools__o => options for tools, debug off, omp only on
diff --git a/test/tools/config_files/gen_domain b/test/tools/config_files/gen_domain
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/test/tools/config_files/tools__do b/test/tools/config_files/tools__do
deleted file mode 100644
index 7f061ed65d..0000000000
--- a/test/tools/config_files/tools__do
+++ /dev/null
@@ -1 +0,0 @@
-SMP=TRUE OPT=FALSE
diff --git a/test/tools/config_files/tools__ds b/test/tools/config_files/tools__ds
deleted file mode 100644
index cf2d414b28..0000000000
--- a/test/tools/config_files/tools__ds
+++ /dev/null
@@ -1 +0,0 @@
-OPT=FALSE
diff --git a/test/tools/config_files/tools__o b/test/tools/config_files/tools__o
deleted file mode 100644
index 8821e0bc5a..0000000000
--- a/test/tools/config_files/tools__o
+++ /dev/null
@@ -1 +0,0 @@
-SMP=TRUE OPT=TRUE
diff --git a/test/tools/config_files/tools__s b/test/tools/config_files/tools__s
deleted file mode 100644
index 507973f8be..0000000000
--- a/test/tools/config_files/tools__s
+++ /dev/null
@@ -1 +0,0 @@
-OPT=TRUE
diff --git a/test/tools/gen_test_table.sh b/test/tools/gen_test_table.sh
deleted file mode 100755
index 0791ad0447..0000000000
--- a/test/tools/gen_test_table.sh
+++ /dev/null
@@ -1,80 +0,0 @@
-#!/bin/sh
-#
-
-# this script, when executed in the directory containing the test-driver
-# scripts (~/test/system) will loop through the default test
-# lists for pre and post tag testing of clm and create an html file
-# (test_table.html) with the specifics of each test detailed
-
-outfile="./test_table.html"
-
-echo '' > $outfile
-echo '' >> $outfile
-echo '' >> $outfile
-echo '' >> $outfile
-echo 'CLM Testing Information Page ' >> $outfile
-echo '' >> $outfile
-echo '' >> $outfile
-
-#########################################################################################
-for input_file in `ls tests_*` ; do
- echo '' >> $outfile
- echo "$input_file " >> $outfile
- echo "" >> $outfile
- echo "test# " >> $outfile
- echo "testid " >> $outfile
- echo "test script " >> $outfile
- echo "arg1 " >> $outfile
- echo "arg2 " >> $outfile
- echo "arg3 " >> $outfile
- echo " " >> $outfile
-
- test_list=""
- while read input_line; do
- test_list="${test_list}${input_line} "
- done < ./${input_file}
-
- count=0
- ##loop through the tests of input file
- for test_id in ${test_list}; do
- echo "" >> $outfile
- count=`expr $count + 1`
- while [ ${#count} -lt 3 ]; do
- count="0${count}"
- done
- echo " $count " >> $outfile
-
- master_line=`grep $test_id ./input_tests_master`
- dir=""
- for arg in ${master_line}; do
- arg1=${arg%^*}
- arg2=${arg#*^}
- if [ -d ../../tools/$arg ]; then
- dir=$arg
- elif [ -f ./nl_files/$arg ]; then
- echo "$arg " >> $outfile
- elif [ -f ./config_files/$arg ]; then
- echo "$arg " >> $outfile
- elif [ -f ./nl_files/$arg1 ] && [ -f ./nl_files/$arg2 ]; then
- echo "$arg1 ^" \
- "$arg2 " >> $outfile
- elif [ -f ./nl_files/$arg1 ] && [ -f ./config_files/$arg2 ]; then
- echo "$arg1 ^" \
- "$arg2 " >> $outfile
- elif [ -f ../../tools/$dir/$dir.$arg ]; then
- echo "$arg " >> $outfile
- else
- echo "$arg " >> $outfile
- fi
- done
- echo ' ' >> $outfile
- done
- echo '' >> $outfile
- echo '' >> $outfile
- echo ' ' >> $outfile
- echo ' ' >> $outfile
-done
-echo '' >> $outfile
-echo '' >> $outfile
-
-exit 0
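`gen_test_table.sh` left-pads its test counter to three digits with a character-count loop rather than `printf '%03d'`. A sketch of that padding idiom, with a stand-in starting value:

```shell
#!/bin/sh
# Sketch of gen_test_table.sh's zero-padding loop; count=7 is a stand-in.
count=7
count=$(expr $count + 1)        # increment as the script does -> 8
while [ ${#count} -lt 3 ]; do   # ${#count} is the string length
    count="0${count}"
done
echo "$count"   # prints 008
```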
diff --git a/test/tools/get_cprnc_diffs.sh b/test/tools/get_cprnc_diffs.sh
deleted file mode 100755
index 360220cb71..0000000000
--- a/test/tools/get_cprnc_diffs.sh
+++ /dev/null
@@ -1,56 +0,0 @@
-#!/bin/bash
-
-# This script extracts lines from the output of cprnc that tell us
-# which variables differ between two files
-#
-# Usage: get_cprnc_diffs filename
-
-# ----------------------------------------------------------------------
-# SET PARAMETERS HERE
-# ----------------------------------------------------------------------
-
-# maximum number of differences to extract from the cprnc output
-maxdiffs=200
-
-# ----------------------------------------------------------------------
-# LOCAL FUNCTIONS DEFINED HERE
-# ----------------------------------------------------------------------
-
-# This function gets differences for one prefix (e.g., "RMS")
-# Usage: get_diffs prefix
-# (also uses $infile and $maxdiffs from the parent script)
-function get_diffs {
- prefix=$1
- outfile=${infile}.${prefix}.$$
- grep "$prefix" $infile > $outfile
- numlines=`wc -l $outfile | awk '{print $1}'`
- if [ $numlines -gt $maxdiffs ]; then
- echo "WARNING: Too many instances of $prefix - only printing last $maxdiffs"
- tail -$maxdiffs $outfile
- else
- cat $outfile
- fi
- rm $outfile
-}
-
-# ----------------------------------------------------------------------
-# BEGIN MAIN SCRIPT
-# ----------------------------------------------------------------------
-
-# ----------------------------------------------------------------------
-# Handle command-line arguments
-# ----------------------------------------------------------------------
-
-if [[ $# -ne 1 ]]; then
- echo "Usage: get_cprnc_diffs filename"
- exit 1
-fi
-
-infile=$1
-
-# ----------------------------------------------------------------------
-# Do the processing
-# ----------------------------------------------------------------------
-
-get_diffs RMS
-get_diffs FILLDIFF
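The `get_diffs` function above caps its output at `$maxdiffs` lines per prefix and warns when truncating (the warning is what `show_var_diffs.sh` later greps for). A self-contained sketch of that truncation rule, with `maxdiffs` lowered to 3 and a synthetic input file so the truncation path is exercised:

```shell
#!/bin/sh
# Sketch of get_cprnc_diffs.sh's per-prefix truncation, stand-in data.
maxdiffs=3
infile=$(mktemp)
printf 'RMS a\nRMS b\nRMS c\nRMS d\nOTHER x\n' > "$infile"

outfile=$(mktemp)
grep "RMS" "$infile" > "$outfile"
numlines=$(wc -l < "$outfile")
if [ "$numlines" -gt "$maxdiffs" ]; then
    echo "WARNING: Too many instances of RMS - only printing last $maxdiffs"
    tail -n "$maxdiffs" "$outfile"   # keeps the last 3 matches: b, c, d
else
    cat "$outfile"
fi
rm -f "$infile" "$outfile"
```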
diff --git a/test/tools/input_tests_master b/test/tools/input_tests_master
deleted file mode 100644
index cb3998557d..0000000000
--- a/test/tools/input_tests_master
+++ /dev/null
@@ -1,62 +0,0 @@
-
-
-smc#4 TSMscript_tools.sh mkprocdata_map mkprocdata_map_wrap mkprocdata_ne30_to_f19_I2000^tools__ds
-blc#4 TBLscript_tools.sh mkprocdata_map mkprocdata_map_wrap mkprocdata_ne30_to_f19_I2000^tools__ds
-
-sme14 TSMCFGtools.sh gen_domain CFGtools__ds T31.runoptions
-ble14 TBLCFGtools.sh gen_domain CFGtools__ds T31.runoptions
-sme@4 TSMCFGtools.sh gen_domain CFGtools__ds ne30.runoptions
-ble@4 TBLCFGtools.sh gen_domain CFGtools__ds ne30.runoptions
-
-smg54 TSMtools.sh mksurfdata_map tools__s namelist
-blg54 TBLtools.sh mksurfdata_map tools__s namelist
-
-smi24 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_T31_crpglc_2000^tools__ds
-bli24 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_T31_crpglc_2000^tools__ds
-
-smi04 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_f09_PtVg^tools__ds
-bli04 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_f09_PtVg^tools__ds
-
-smi53 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_1850^tools__o
-bli53 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_1850^tools__o
-smi54 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_1850^tools__ds
-bli54 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_1850^tools__ds
-smi57 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_1850^tools__do
-bli57 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_1850^tools__do
-smi58 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_crp_1850-2000^tools__do
-bli58 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_10x15_crp_1850-2000^tools__do
-
-smi64 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_5x5_amazon_hirespft_2005^tools__ds
-bli64 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_5x5_amazon_hirespft_2005^tools__ds
-
-smi74 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_brazil_1850-2000^tools__ds
-bli74 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_brazil_1850-2000^tools__ds
-smi78 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_brazil_1850^tools__ds
-bli78 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_brazil_1850^tools__ds
-smiT4 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_numaIA_crp_2000^tools__ds
-bliT4 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_numaIA_crp_2000^tools__ds
-smiT2 TSMscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_numaIA_crp_SSP5-8.5_1850-2100^tools__s
-bliT2 TBLscript_tools.sh mksurfdata_map mksurfdata.pl mksrfdt_1x1_numaIA_crp_SSP5-8.5_1850-2100^tools__s
-
-smi#2 TSMscript_tools.sh mkmapdata mkmapdata.sh mkmapdata_ne30np4
-bli#2 TBLscript_tools.sh mkmapdata mkmapdata.sh mkmapdata_ne30np4
-smi59 TSMscript_tools.sh mkmapdata mkmapdata.sh mkmapdata_if10
-bli59 TBLscript_tools.sh mkmapdata mkmapdata.sh mkmapdata_if10
-smi79 TSMscript_tools.sh mkmapdata mkmapdata.sh mkmapdata_i1x1_brazil
-bli79 TBLscript_tools.sh mkmapdata mkmapdata.sh mkmapdata_i1x1_brazil
-
-smf84 TSMscript_tools.sh PTCLM PTCLMmkdata PTCLM_USUMB_clm4_5^buildtools
-blf84 TBLscript_tools.sh PTCLM PTCLMmkdata PTCLM_USUMB_clm4_5^buildtools
-smfc4 TSMscript_tools.sh PTCLM PTCLMmkdata PTCLM_USUMB_Cycle_clm4_5^buildtools
-blfc4 TBLscript_tools.sh PTCLM PTCLMmkdata PTCLM_USUMB_Cycle_clm4_5^buildtools
-smfg4 TSMscript_tools.sh PTCLM PTCLMmkdata PTCLM_USUMB_Global_clm4_5^buildtools
-blfg4 TBLscript_tools.sh PTCLM PTCLMmkdata PTCLM_USUMB_Global_clm4_5^buildtools
-
-smiS4 TSMscript_tools.sh ncl_scripts getregional_datasets.pl getregional
-bliS4 TBLscript_tools.sh ncl_scripts getregional_datasets.pl getregional
-smiS8 TSMscript_tools.sh ncl_scripts getregional_datasets.pl getregional_ndep
-bliS8 TBLscript_tools.sh ncl_scripts getregional_datasets.pl getregional_ndep
-smiS9 TSMscript_tools.sh ncl_scripts getregional_datasets.pl getregional_T62
-bliS9 TBLscript_tools.sh ncl_scripts getregional_datasets.pl getregional_T62
-smiS0 TSMscript_tools.sh ncl_scripts getregional_datasets.pl getregional_05popd
-bliS0 TBLscript_tools.sh ncl_scripts getregional_datasets.pl getregional_05popd
diff --git a/test/tools/nl_files/PTCLM_USUMB_Cycle_clm4_5 b/test/tools/nl_files/PTCLM_USUMB_Cycle_clm4_5
deleted file mode 100644
index f27830a2c4..0000000000
--- a/test/tools/nl_files/PTCLM_USUMB_Cycle_clm4_5
+++ /dev/null
@@ -1 +0,0 @@
--s US-UMB -d CSMDATA --mydatadir . --map_gdate 171024 --cycle_forcing --ctsm_root CTSM_ROOT --cime_root CIME_ROOT
diff --git a/test/tools/nl_files/PTCLM_USUMB_Global_clm4_5 b/test/tools/nl_files/PTCLM_USUMB_Global_clm4_5
deleted file mode 100644
index dc4c617dc3..0000000000
--- a/test/tools/nl_files/PTCLM_USUMB_Global_clm4_5
+++ /dev/null
@@ -1 +0,0 @@
--s US-UMB -d CSMDATA --mydatadir . --map_gdate 171024 --donot_use_tower_yrs --clmnmlusecase 20thC_transient --pftgrid --soilgrid --ctsm_root CTSM_ROOT --cime_root CIME_ROOT
diff --git a/test/tools/nl_files/PTCLM_USUMB_clm4_5 b/test/tools/nl_files/PTCLM_USUMB_clm4_5
deleted file mode 100644
index 5ed93b26bd..0000000000
--- a/test/tools/nl_files/PTCLM_USUMB_clm4_5
+++ /dev/null
@@ -1 +0,0 @@
--s US-UMB -d CSMDATA --mydatadir . --map_gdate 171024 --ctsm_root CTSM_ROOT --cime_root CIME_ROOT
diff --git a/test/tools/nl_files/gen_domain.T31.runoptions b/test/tools/nl_files/gen_domain.T31.runoptions
deleted file mode 100644
index c1fcc07df1..0000000000
--- a/test/tools/nl_files/gen_domain.T31.runoptions
+++ /dev/null
@@ -1 +0,0 @@
--m CSMDATA/cpl/cpl6/map_gx3v7_to_T31_aave_da_090903.nc -o domain.ocn.gx3v7_test.nc -l domain.lnd.10x15_gx3v7.test.nc
diff --git a/test/tools/nl_files/gen_domain.ne30.runoptions b/test/tools/nl_files/gen_domain.ne30.runoptions
deleted file mode 100644
index 790969101e..0000000000
--- a/test/tools/nl_files/gen_domain.ne30.runoptions
+++ /dev/null
@@ -1 +0,0 @@
--m CSMDATA/cpl/cpl6/map_gx1v6_to_ne30np4_aave_da_091227.nc -o domain.ocn.gx1v6_test.nc -l domain.lnd.ne30np4_gx1v6.test.nc
diff --git a/test/tools/nl_files/getregional b/test/tools/nl_files/getregional
deleted file mode 100644
index 5e5d348e39..0000000000
--- a/test/tools/nl_files/getregional
+++ /dev/null
@@ -1 +0,0 @@
--SW 52,190 -NE 73,220 -i sample_inlist -o sample_outlist
diff --git a/test/tools/nl_files/getregional_05popd b/test/tools/nl_files/getregional_05popd
deleted file mode 100644
index 79747ad9cd..0000000000
--- a/test/tools/nl_files/getregional_05popd
+++ /dev/null
@@ -1 +0,0 @@
--SW 52,190 -NE 73,220 -i sample_inlist_0.5popd -o sample_outlist_0.5popd
diff --git a/test/tools/nl_files/getregional_T62 b/test/tools/nl_files/getregional_T62
deleted file mode 100644
index 8288847cf5..0000000000
--- a/test/tools/nl_files/getregional_T62
+++ /dev/null
@@ -1 +0,0 @@
--SW 52,190 -NE 73,220 -i sample_inlist_T62 -o sample_outlist_T62
diff --git a/test/tools/nl_files/getregional_ndep b/test/tools/nl_files/getregional_ndep
deleted file mode 100644
index 125285f690..0000000000
--- a/test/tools/nl_files/getregional_ndep
+++ /dev/null
@@ -1 +0,0 @@
--SW 52,190 -NE 73,220 -i sample_inlist_ndep -o sample_outlist_ndep
diff --git a/test/tools/nl_files/mkmapdata_if10 b/test/tools/nl_files/mkmapdata_if10
deleted file mode 100644
index c3218edc1c..0000000000
--- a/test/tools/nl_files/mkmapdata_if10
+++ /dev/null
@@ -1 +0,0 @@
--r 10x15
diff --git a/test/tools/nl_files/mkmapdata_ne30np4 b/test/tools/nl_files/mkmapdata_ne30np4
deleted file mode 100644
index 11066c7b41..0000000000
--- a/test/tools/nl_files/mkmapdata_ne30np4
+++ /dev/null
@@ -1 +0,0 @@
--r ne30np4
diff --git a/test/tools/nl_files/mkprocdata_ne30_to_f19_I2000 b/test/tools/nl_files/mkprocdata_ne30_to_f19_I2000
deleted file mode 100644
index af85dcf226..0000000000
--- a/test/tools/nl_files/mkprocdata_ne30_to_f19_I2000
+++ /dev/null
@@ -1 +0,0 @@
--i CSMDATA/lnd/clm2/test_mkprocdata_map/clm4054_ne30g16_I2000.clm2.h0.2000-01_c170430.nc -o ne30output_onf19grid.nc -m CSMDATA/lnd/clm2/test_mkprocdata_map/map_ne30np4_nomask_to_fv1.9x2.5_nomask_aave_da_c121107.nc -t CSMDATA/lnd/clm2/test_mkprocdata_map/clm4054_f19g16_I2000.clm2.h0.2000-01_c170430.nc -e EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_10x15_1850 b/test/tools/nl_files/mksrfdt_10x15_1850
deleted file mode 100644
index cdbb7c13dc..0000000000
--- a/test/tools/nl_files/mksrfdt_10x15_1850
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -vic -r 10x15 -no-crop -y 1850 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_10x15_crp_1850-2000 b/test/tools/nl_files/mksrfdt_10x15_crp_1850-2000
deleted file mode 100644
index b42c1deb04..0000000000
--- a/test/tools/nl_files/mksrfdt_10x15_crp_1850-2000
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 10x15 -y 1850-2000 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_1x1_brazil_1850 b/test/tools/nl_files/mksrfdt_1x1_brazil_1850
deleted file mode 100644
index 2330bd082e..0000000000
--- a/test/tools/nl_files/mksrfdt_1x1_brazil_1850
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 1x1_brazil -y 1850-2000 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_1x1_brazil_1850-2000 b/test/tools/nl_files/mksrfdt_1x1_brazil_1850-2000
deleted file mode 100644
index 2330bd082e..0000000000
--- a/test/tools/nl_files/mksrfdt_1x1_brazil_1850-2000
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 1x1_brazil -y 1850-2000 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_1x1_numaIA_crp_2000 b/test/tools/nl_files/mksrfdt_1x1_numaIA_crp_2000
deleted file mode 100644
index 03304f81eb..0000000000
--- a/test/tools/nl_files/mksrfdt_1x1_numaIA_crp_2000
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 1x1_numaIA -y 2000 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_1x1_numaIA_crp_SSP5-8.5_1850-2100 b/test/tools/nl_files/mksrfdt_1x1_numaIA_crp_SSP5-8.5_1850-2100
deleted file mode 100644
index ed83434075..0000000000
--- a/test/tools/nl_files/mksrfdt_1x1_numaIA_crp_SSP5-8.5_1850-2100
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 1x1_numaIA -y 1850-2100 -ssp_rcp SSP5-8.5 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_1x1_vancouverCAN_2000 b/test/tools/nl_files/mksrfdt_1x1_vancouverCAN_2000
deleted file mode 100644
index a446e82fcd..0000000000
--- a/test/tools/nl_files/mksrfdt_1x1_vancouverCAN_2000
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 1x1_vancouverCAN -no-crop -y 2000 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_5x5_amazon_hirespft_2005 b/test/tools/nl_files/mksrfdt_5x5_amazon_hirespft_2005
deleted file mode 100644
index 47a5391c84..0000000000
--- a/test/tools/nl_files/mksrfdt_5x5_amazon_hirespft_2005
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 5x5_amazon -y 2005 -hirespft -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_T31_crpglc_2000 b/test/tools/nl_files/mksrfdt_T31_crpglc_2000
deleted file mode 100644
index ac8ceed1a8..0000000000
--- a/test/tools/nl_files/mksrfdt_T31_crpglc_2000
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 48x96 -y 2000 -glc_nec 10 -exedir EXEDIR
diff --git a/test/tools/nl_files/mksrfdt_f09_PtVg b/test/tools/nl_files/mksrfdt_f09_PtVg
deleted file mode 100644
index 61c2d8325e..0000000000
--- a/test/tools/nl_files/mksrfdt_f09_PtVg
+++ /dev/null
@@ -1 +0,0 @@
--l CSMDATA -r 0.9x1.25 -no-crop -y PtVg -exedir EXEDIR
diff --git a/test/tools/show_var_diffs.sh b/test/tools/show_var_diffs.sh
deleted file mode 100755
index f462d4ad0c..0000000000
--- a/test/tools/show_var_diffs.sh
+++ /dev/null
@@ -1,79 +0,0 @@
-#!/bin/bash
-
-# This script processes a log file that was output by test_driver,
-# giving lists of all variables with differences in values (those with
-# RMS errors), and all variables with differences in fill patterns.
-#
-# This assumes that the log file contains output like:
-# RMS foo
-# RMS bar
-# FILLDIFF foo
-# FILLDIFF bar
-# Some characteristics of these output lines are:
-# - they begin with a leading space, followed by RMS or FILLDIFF
-# - the variable name is in the second column of the line
-#
-# Note that (as of 4-5-12) the log file only contains output from the
-# last file that didn't match, so this could potentially miss
-# something -- especially if there are both h0 and h1 files in the
-# comparison.
-
-# Usage: show_var_diffs logfile
-
-# ----------------------------------------------------------------------
-# LOCAL FUNCTIONS DEFINED HERE
-# ----------------------------------------------------------------------
-
-# This function shows the differences for one prefix (e.g., "RMS")
-# Usage: show_diffs prefix
-# (also uses $logfile from the parent script)
-#
-# Matches lines that start with the regular expression "^ ${prefix}"
-# (note that one leading space is expected before the prefix)
-#
-# Assumes that the variable name is in the second column of matching lines
-function show_diffs {
- prefix=$1
-
- # first determine if there were warnings relating to this prefix
- grep "WARNING: Too many instances of ${prefix}" $logfile > /dev/null
- if [ $? -eq 0 ]; then # found a warning
- echo "WARNING: Some output was truncated; this may not be a complete list"
- fi
-
- # now make a list of all variables matching this prefix
- grep "^ ${prefix}" $logfile > $logfile.tmp.$$
- if [ $? -eq 0 ]; then
- awk '{print $2}' $logfile.tmp.$$ | sort | uniq
- else
- echo "(no differences)"
- fi
-
- rm $logfile.tmp.$$
-}
-
-# ----------------------------------------------------------------------
-# BEGIN MAIN SCRIPT
-# ----------------------------------------------------------------------
-
-# ----------------------------------------------------------------------
-# Handle command-line arguments
-# ----------------------------------------------------------------------
-
-if [[ $# -ne 1 ]]; then
- echo "Usage: show_var_diffs logfile"
- exit 1
-fi
-
-logfile=$1
-
-# ----------------------------------------------------------------------
-# Do the processing
-# ----------------------------------------------------------------------
-
-echo "Variables with differences in values:"
-show_diffs "RMS"
-
-echo ""
-echo "Variables with differences in fill patterns:"
-show_diffs "FILLDIFF"
\ No newline at end of file
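The deleted show_var_diffs.sh boils down to one grep/awk pipeline per prefix. As a standalone sketch of that core logic (the sample log contents below are made up for illustration; variable and function names are my own):

```shell
#!/bin/sh
# Sketch of show_var_diffs.sh's core logic: list the unique variable
# names from cprnc-style " RMS <var>" / " FILLDIFF <var>" lines in a log.
# The sample log contents here are hypothetical.
logfile=$(mktemp)
cat > "$logfile" <<'EOF'
 RMS TSOI
 RMS H2OSOI
 RMS TSOI
 FILLDIFF H2OSOI
EOF

show_diffs() {
    prefix=$1
    # One leading space is expected before the prefix, as in the original.
    matches=$(grep "^ ${prefix}" "$logfile" | awk '{print $2}' | sort | uniq)
    if [ -n "$matches" ]; then
        echo "$matches"
    else
        echo "(no differences)"
    fi
}

rms_vars=$(show_diffs "RMS")
fill_vars=$(show_diffs "FILLDIFF")
rm -f "$logfile"
```

The `sort | uniq` step is what collapses repeated `RMS TSOI` lines into one entry per variable.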
diff --git a/test/tools/test_driver.sh b/test/tools/test_driver.sh
deleted file mode 100755
index 28ecb07072..0000000000
--- a/test/tools/test_driver.sh
+++ /dev/null
@@ -1,653 +0,0 @@
-#!/bin/sh
-#
-# test_driver.sh: driver script for the offline testing of CLM tools
-#
-# interactive usage on all machines:
-#
-# env ./test_driver.sh -i
-#
-# valid arguments:
-# -i interactive usage
-# -d debug usage -- display tests that will run -- but do NOT actually execute them
-# -f force batch submission (avoids user prompt)
-# -h displays this help message
-#
-#
-# **pass environment variables by preceding above commands
-# with 'env var1=setting var2=setting '
-# **more details in the CLM testing user's guide, accessible
-# from the CLM developers web page
-
-
-#will attach timestamp onto end of script name to prevent overwriting
-cur_time=`date '+%H:%M:%S'`
-
-hostname=`hostname`
-echo $hostname
-case $hostname in
-
- ##cheyenne
- cheyenne* | r*i*n*)
- submit_script="test_driver_cheyenne${cur_time}.sh"
-
-##vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv writing to batch script vvvvvvvvvvvvvvvvvvv
-cat > ./${submit_script} << EOF
-#!/bin/sh
-#
-
-interactive="YES"
-input_file="tests_pretag_cheyenne_nompi"
-c_threads=36
-
-
-export INITMODULES="/glade/u/apps/ch/opt/lmod/8.1.7/lmod/lmod/init/sh"
-. \$INITMODULES
-
-module purge
-module load ncarenv
-module load intel
-module load mkl
-module load ncarcompilers
-module load netcdf
-
-module load nco
-module load python
-module load ncl
-
-
-##omp threads
-if [ -z "\$CLM_THREADS" ]; then #threads NOT set on command line
- export CLM_THREADS=\$c_threads
-fi
-
-# Stop on first failed test
-if [ -z "\$CLM_SOFF" ]; then #CLM_SOFF NOT set
- export CLM_SOFF=FALSE
-fi
-
-export CESM_MACH="cheyenne"
-export CESM_COMP="intel"
-
-export NETCDF_DIR=\$NETCDF
-export INC_NETCDF=\$NETCDF/include
-export LIB_NETCDF=\$NETCDF/lib
-export MAKE_CMD="gmake -j "
-export CFG_STRING=""
-export TOOLS_MAKE_STRING="USER_FC=ifort USER_LINKER=ifort USER_CPPDEFS=-DLINUX"
-export MACH_WORKSPACE="/glade/scratch"
-export CPRNC_EXE="$CESMDATAROOT/tools/cime/tools/cprnc/cprnc.cheyenne"
-dataroot="$CESMDATAROOT"
-export TOOLSLIBS=""
-export REGRID_PROC=1
-export TOOLS_CONF_STRING="--mpilib mpi-serial"
-
-
-echo_arg=""
-
-EOF
-##^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ writing to batch script ^^^^^^^^^^^^^^^^^^^
- ;;
-
- ## DAV cluster
- casper* | pronghorn*)
- submit_script="test_driver_dav${cur_time}.sh"
-
-##vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv writing to batch script vvvvvvvvvvvvvvvvvvv
-cat > ./${submit_script} << EOF
-#!/bin/sh
-#
-
-interactive="YES"
-input_file="tests_posttag_dav_mpi"
-c_threads=36
-
-
-export INITMODULES="/glade/u/apps/ch/opt/lmod/8.1.7/lmod/lmod/init/sh"
-. \$INITMODULES
-
-module purge
-module load ncarenv
-module load intel
-module load mkl
-module load ncarcompilers
-module load netcdf
-module load openmpi
-
-module load nco
-module load python
-module load ncl
-
-
-##omp threads
-if [ -z "\$CLM_THREADS" ]; then #threads NOT set on command line
- export CLM_THREADS=\$c_threads
-fi
-
-# Stop on first failed test
-if [ -z "\$CLM_SOFF" ]; then #CLM_SOFF NOT set
- export CLM_SOFF=FALSE
-fi
-
-export CESM_MACH="cheyenne"
-export CESM_COMP="intel"
-
-export NETCDF_DIR=\$NETCDF
-export INC_NETCDF=\$NETCDF/include
-export LIB_NETCDF=\$NETCDF/lib
-export MAKE_CMD="gmake -j "
-export CFG_STRING=""
-export TOOLS_MAKE_STRING="USER_FC=ifort USER_LINKER=ifort USER_CPPDEFS=-DLINUX"
-export MACH_WORKSPACE="/glade/scratch"
-export CPRNC_EXE="$CESMDATAROOT/tools/cime/tools/cprnc/cprnc.cheyenne"
-dataroot="$CESMDATAROOT"
-export TOOLSLIBS=""
-export TOOLS_CONF_STRING="--mpilib mpich"
-
-
-echo_arg=""
-
-EOF
-##^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ writing to batch script ^^^^^^^^^^^^^^^^^^^
- ;;
-
- ## hobart
- hobart* | h*.cgd.ucar.edu)
- submit_script="test_driver_hobart_${cur_time}.sh"
- export PATH=/cluster/torque/bin:${PATH}
-
-##vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv writing to batch script vvvvvvvvvvvvvvvvvvv
-cat > ./${submit_script} << EOF
-#!/bin/sh
-#
-
-# Name of the queue (CHANGE THIS if needed)
-#PBS -q long
-# Number of nodes (CHANGE THIS if needed)
-#PBS -l nodes=1:ppn=24
-# output file base name
-#PBS -N test_dr
-# Put standard error and standard out in same file
-#PBS -j oe
-# Export all Environment variables
-#PBS -V
-# End of options
-
-if [ -n "\$PBS_JOBID" ]; then #batch job
- export JOBID=\`echo \${PBS_JOBID} | cut -f1 -d'.'\`
- initdir=\${PBS_O_WORKDIR}
-fi
-
-if [ "\$PBS_ENVIRONMENT" = "PBS_BATCH" ]; then
- interactive="NO"
- input_file="tests_posttag_hobart"
-else
- interactive="YES"
- input_file="tests_posttag_hobart_nompi"
-fi
-
-##omp threads
-if [ -z "\$CLM_THREADS" ]; then #threads NOT set on command line
- export CLM_THREADS=2
-fi
-export CLM_RESTART_THREADS=1
-
-##mpi tasks
-export CLM_TASKS=24
-export CLM_RESTART_TASKS=20
-
-export P4_GLOBMEMSIZE=500000000
-
-
-export CESM_MACH="hobart"
-
-ulimit -s unlimited
-ulimit -c unlimited
-
-export CESM_COMP="intel"
-export TOOLS_MAKE_STRING="USER_FC=ifort USER_CC=icc "
-export TOOLS_CONF_STRING=" -mpilib mpi-serial"
-export CFG_STRING=""
-export INITMODULES="/usr/share/Modules/init/sh"
-
-. \$INITMODULES
-module purge
-module load compiler/intel
-module load tool/nco
-module load tool/netcdf
-
-export NETCDF_DIR=\$NETCDF_PATH
-export INC_NETCDF=\${NETCDF_PATH}/include
-export LIB_NETCDF=\${NETCDF_PATH}/lib
-export MAKE_CMD="gmake -j 5" ##using hyper-threading on hobart
-export MACH_WORKSPACE="/scratch/cluster"
-export CPRNC_EXE=/fs/cgd/csm/tools/cprnc/cprnc
-export DATM_QIAN_DATA_DIR="/project/tss/atm_forcing.datm7.Qian.T62.c080727"
-dataroot="/fs/cgd/csm"
-export TOOLSLIBS=""
-echo_arg="-e"
-
-EOF
-##^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ writing to batch script ^^^^^^^^^^^^^^^^^^^
- ;;
-
- ## izumi
- izumi* | i*.unified.ucar.edu)
- submit_script="test_driver_izumi_${cur_time}.sh"
- export PATH=/cluster/torque/bin:${PATH}
-
-##vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv writing to batch script vvvvvvvvvvvvvvvvvvv
-cat > ./${submit_script} << EOF
-#!/bin/sh
-#
-
-# Name of the queue (CHANGE THIS if needed)
-#PBS -q long
-# Number of nodes (CHANGE THIS if needed)
-#PBS -l nodes=1:ppn=24
-# output file base name
-#PBS -N test_dr
-# Put standard error and standard out in same file
-#PBS -j oe
-# Export all Environment variables
-#PBS -V
-# End of options
-
-if [ -n "\$PBS_JOBID" ]; then #batch job
- export JOBID=\`echo \${PBS_JOBID} | cut -f1 -d'.'\`
- initdir=\${PBS_O_WORKDIR}
-fi
-
-if [ "\$PBS_ENVIRONMENT" = "PBS_BATCH" ]; then
- interactive="NO"
- input_file="tests_posttag_izumi"
-else
- interactive="YES"
- input_file="tests_posttag_izumi_nompi"
-fi
-
-##omp threads
-if [ -z "\$CLM_THREADS" ]; then #threads NOT set on command line
- export CLM_THREADS=2
-fi
-export CLM_RESTART_THREADS=1
-
-##mpi tasks
-export CLM_TASKS=24
-export CLM_RESTART_TASKS=20
-
-export P4_GLOBMEMSIZE=500000000
-
-
-export CESM_MACH="izumi"
-
-ulimit -s unlimited
-ulimit -c unlimited
-
-export CESM_COMP="intel"
-export TOOLS_MAKE_STRING="USER_FC=ifort USER_CC=icc "
-export TOOLS_CONF_STRING=" -mpilib mpi-serial"
-export CFG_STRING=""
-export INITMODULES="/usr/share/Modules/init/sh"
-
-. \$INITMODULES
-module purge
-module load compiler/intel
-module load tool/nco
-module load tool/netcdf
-
-export NETCDF_DIR=\$NETCDF_PATH
-export INC_NETCDF=\${NETCDF_PATH}/include
-export LIB_NETCDF=\${NETCDF_PATH}/lib
-export MAKE_CMD="gmake -j 5" ##using hyper-threading on izumi
-export MACH_WORKSPACE="/scratch/cluster"
-export CPRNC_EXE=/fs/cgd/csm/tools/cprnc/cprnc.izumi
-export DATM_QIAN_DATA_DIR="/project/tss/atm_forcing.datm7.Qian.T62.c080727"
-dataroot="/fs/cgd/csm"
-export TOOLSLIBS=""
-echo_arg="-e"
-
-EOF
-##^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ writing to batch script ^^^^^^^^^^^^^^^^^^^
- ;;
-
- * )
- echo "Only setup to work on: cheyenne, hobart and izumi"
- exit
-
-
-esac
-
-##vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv writing to batch script vvvvvvvvvvvvvvvvvvv
-cat >> ./${submit_script} << EOF
-
-export CPRNC_OPT=""
-if [ -n "\${CLM_JOBID}" ]; then
- export JOBID=\${CLM_JOBID}
-fi
-##check if interactive job
-
-if [ "\$interactive" = "YES" ]; then
-
- if [ -z "\${JOBID}" ]; then
- export JOBID=\$\$
- fi
- echo "test_driver.sh: interactive run - setting JOBID to \$JOBID"
- if [ \$0 = "test_driver.sh" ]; then
- initdir="."
- else
- initdir=\${0%/*}
- fi
-else
- echo "ERROR: you *always* need to use the interactive option (-i)"
- echo " currently doesn't work without it"
- exit 3
-fi
-
-##establish script dir and clm_root
-if [ -f \${initdir}/test_driver.sh ]; then
- export CLM_SCRIPTDIR=\`cd \${initdir}; pwd \`
- export CLM_ROOT=\`cd \${CLM_SCRIPTDIR}/../..; pwd \`
- export CTSM_ROOT=\${CLM_ROOT}
- if [ -d \${CLM_ROOT}/cime ]; then
- export CIME_ROOT=\${CLM_ROOT}/cime
- else
- export CIME_ROOT=\${CLM_ROOT}/../../cime
- fi
- if [ ! -d \${CIME_ROOT} ]; then
- echo "ERROR: trouble finding the CIME_ROOT directory: \$CIME_ROOT"
- exit 3
- fi
-else
- if [ -n "\${CLM_ROOT}" ] && [ -f \${CLM_ROOT}/test/tools/test_driver.sh ]; then
- export CLM_SCRIPTDIR=\`cd \${CLM_ROOT}/test/tools; pwd \`
- else
- echo "ERROR: unable to determine script directory "
- echo " if initiating batch job from directory other than the one containing test_driver.sh, "
- echo " you must set the environment variable CLM_ROOT to the full path of directory containing "
- echo " . "
- exit 3
- fi
-fi
-
-##output files
-clm_log=\${initdir}/td.\${JOBID}.log
-if [ -f \$clm_log ]; then
- rm \$clm_log
-fi
-clm_status=\${initdir}/td.\${JOBID}.status
-if [ -f \$clm_status ]; then
- rm \$clm_status
-fi
-
-##setup test work directory
-if [ -z "\$CLM_TESTDIR" ]; then
- export CLM_TESTDIR=\${MACH_WORKSPACE}/\$LOGNAME/clmTests/test-driver.\${JOBID}
- if [ -d \$CLM_TESTDIR ] && [ \$CLM_RETAIN_FILES != "TRUE" ]; then
- rm -r \$CLM_TESTDIR
- fi
-fi
-if [ ! -d \$CLM_TESTDIR ]; then
- mkdir -p \$CLM_TESTDIR
- if [ \$? -ne 0 ]; then
- echo "ERROR: unable to create work directory \$CLM_TESTDIR"
- exit 4
- fi
-fi
-
-## MCT and PIO build directories
-export MCT_LIBDIR=\$CLM_TESTDIR/mct
-export PIO_LIBDIR=\$CLM_TESTDIR/pio
-
-##set our own environment vars
-export CSMDATA=\${dataroot}/inputdata
-export DIN_LOC_ROOT=\${CSMDATA}
-export MPI_TYPE_MAX=100000
-
-##process other env vars possibly coming in
-if [ -z "\$CLM_RETAIN_FILES" ]; then
- export CLM_RETAIN_FILES=FALSE
-fi
-if [ -n "\${CLM_INPUT_TESTS}" ]; then
- input_file=\$CLM_INPUT_TESTS
-else
- input_file=\${CLM_SCRIPTDIR}/\${input_file}
-fi
-if [ ! -f \${input_file} ]; then
- echo "ERROR: unable to locate input file \${input_file}"
- exit 5
-fi
-
-if [ \$interactive = "YES" ]; then
- echo "reading tests from \${input_file}"
-else
- echo "reading tests from \${input_file}" >> \${clm_log}
-fi
-
-num_tests=\`wc -w < \${input_file}\`
-echo "STATUS OF CLM TESTING UNDER JOB \${JOBID}; scheduled to run \$num_tests tests from:" >> \${clm_status}
-echo "\$input_file" >> \${clm_status}
-echo "" >> \${clm_status}
-echo " on machine: $hostname" >> \${clm_status}
-if [ -n "${BL_ROOT}" ]; then
- echo "tests of baseline will use source code from:" >> \${clm_status}
- echo "\$BL_ROOT" >> \${clm_status}
-fi
-if [ \$interactive = "NO" ]; then
- echo "see \${clm_log} for more detailed output" >> \${clm_status}
-fi
-echo "" >> \${clm_status}
-
-test_list=""
-while read input_line; do
- test_list="\${test_list}\${input_line} "
-done < \${input_file}
-
-
-##initialize flags, counter
-skipped_tests="NO"
-pending_tests="NO"
-count=0
-
-##loop through the tests of input file
-for test_id in \${test_list}; do
- count=\`expr \$count + 1\`
- while [ \${#count} -lt 3 ]; do
- count="0\${count}"
- done
-
- master_line=\`grep \$test_id \${CLM_SCRIPTDIR}/input_tests_master\`
- status_out=""
- for arg in \${master_line}; do
- status_out="\${status_out}\${arg} "
- done
-
- if [ -z "\$status_out" ]; then
- echo "No test matches \$test_id in \${CLM_SCRIPTDIR}/input_tests_master"
- exit 3
- fi
-
- test_cmd=\${status_out#* }
-
- status_out="\${count} \${status_out}"
-
- if [ \$interactive = "YES" ]; then
- echo ""
- echo "***********************************************************************************"
- echo "\${status_out}"
- echo "***********************************************************************************"
- else
- echo "" >> \${clm_log}
- echo "***********************************************************************************"\
- >> \${clm_log}
- echo "\$status_out" >> \${clm_log}
- echo "***********************************************************************************"\
- >> \${clm_log}
- fi
-
- if [ \${#status_out} -gt 94 ]; then
- status_out=\`echo "\${status_out}" | cut -c1-100\`
- fi
- while [ \${#status_out} -lt 97 ]; do
- status_out="\${status_out}."
- done
-
- echo \$echo_arg "\$status_out\c" >> \${clm_status}
-
- if [ \$interactive = "YES" ]; then
- \${CLM_SCRIPTDIR}/\${test_cmd}
- rc=\$?
- else
- \${CLM_SCRIPTDIR}/\${test_cmd} >> \${clm_log} 2>&1
- rc=\$?
- fi
- if [ \$rc -eq 0 ]; then
- echo "PASS" >> \${clm_status}
- elif [ \$rc -eq 255 ]; then
- echo "SKIPPED*" >> \${clm_status}
- skipped_tests="YES"
- elif [ \$rc -eq 254 ]; then
- echo "PENDING**" >> \${clm_status}
- pending_tests="YES"
- else
- echo " rc=\$rc FAIL" >> \${clm_status}
- if [ "\$CLM_SOFF" = "TRUE" ]; then
- echo "stopping on first failure" >> \${clm_status}
- echo "stopping on first failure" >> \${clm_log}
- exit 6
- fi
- fi
-done
-
-echo "end of input" >> \${clm_status}
-if [ \$interactive = "YES" ]; then
- echo "end of input"
-else
- echo "end of input" >> \${clm_log}
-fi
-
-if [ \$skipped_tests = "YES" ]; then
- echo "* please verify that any skipped tests are not required of your clm commit" >> \${clm_status}
-fi
-if [ \$pending_tests = "YES" ]; then
- echo "** tests that are pending must be checked manually for a successful completion" >> \${clm_status}
- if [ \$interactive = "NO" ]; then
- echo " see the test's output in \${clm_log} " >> \${clm_status}
- echo " for the location of test results" >> \${clm_status}
- fi
-fi
-
-if [ "\$interactive" = "YES" ]; then
- passInt="test_driver.sh-i"
-else
- passInt="test_driver.sh"
-fi
-
-../../bld/unit_testers/xFail/wrapClmTests.pl -statusFile "\${clm_status}" -numberOfTests "\${num_tests}" -callingScript "\${passInt}"
-
-exit 0
-
-EOF
-##^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ writing to batch script ^^^^^^^^^^^^^^^^^^^
-
-
-chmod a+x $submit_script
-if [ ! -z "$CLM_RETAIN_FILES" ]; then
- export CLM_RETAIN_FILES="FALSE"
-fi
-arg1=${1##*-}
-case $arg1 in
- [iI]* )
- debug="NO"
- interactive="YES"
- compile_only="NO"
- export debug
- export interactive
- export compile_only
- ./${submit_script}
- exit 0
- ;;
-
- [cC]* )
- debug="NO"
- interactive="YES"
- compile_only="YES"
- export debug
- export CLM_RETAIN_FILES="TRUE"
- export interactive
- export compile_only
- export CLM_RETAIN_FILES="TRUE"
- ./${submit_script}
- exit 0
- ;;
-
- [dD]* )
- debug="YES"
- interactive="YES"
- compile_only="NO"
- export debug
- export interactive
- export compile_only
- ./${submit_script}
- exit 0
- ;;
-
- [fF]* )
- debug="NO"
- interactive="NO"
- compile_only="NO"
- export debug
- export interactive
- export compile_only
- ;;
-
- "" )
- echo ""
- echo "**********************"
- echo "$submit_script has been created and will be submitted to the batch queue..."
- echo "(ret) to continue, (a) to abort"
- read ans
- case $ans in
- [aA]* )
- echo "aborting...type ./test_driver.sh -h for help message"
- exit 0
- ;;
- esac
- debug="NO"
- interactive="NO"
- compile_only="NO"
- export debug
- export interactive
- export compile_only
- ;;
-
- * )
- echo ""
- echo "**********************"
- echo "usage on cheyenne, hobart, and izumi: "
- echo "./test_driver.sh -i"
- echo ""
- echo "valid arguments: "
- echo "-i interactive usage"
- echo "-c compile-only usage (run configure and compile do not run clm)"
- echo "-d debug-only usage (run configure and build-namelist do NOT compile or run clm)"
- echo "-f force batch submission (avoids user prompt)"
- echo "-h displays this help message"
- echo ""
- echo "**pass environment variables by preceding above commands "
- echo " with 'env var1=setting var2=setting '"
- echo ""
- echo "**********************"
- exit 0
- ;;
-esac
-
-echo "submitting..."
-case $hostname in
- #default
- * )
- echo "no submission capability on this machine use the interactive option: -i"
- exit 0
- ;;
-
-esac
-exit 0
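Two small conventions in the deleted driver are worth noting for any replacement harness: test counters are zero-padded to three digits, and exit codes 255 and 254 are reserved for skipped and pending tests respectively. A minimal sketch of both, with function names of my own choosing:

```shell
#!/bin/sh
# pad3: zero-pad a counter to three digits, as in the deleted driver's
# while-loop over ${#count}.
pad3() {
    count=$1
    while [ ${#count} -lt 3 ]; do
        count="0${count}"
    done
    echo "$count"
}

# status_for_rc: map a test's exit code to a status word, following the
# driver's convention (0 pass, 255 skipped, 254 pending, anything else fail).
status_for_rc() {
    rc=$1
    if [ "$rc" -eq 0 ]; then
        echo "PASS"
    elif [ "$rc" -eq 255 ]; then
        echo "SKIPPED*"
    elif [ "$rc" -eq 254 ]; then
        echo "PENDING**"
    else
        echo "rc=$rc FAIL"
    fi
}
```

Reserving high exit codes this way lets a single status file distinguish genuine failures from tests that deliberately opted out or deferred their result.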
diff --git a/test/tools/tests_posttag_dav_mpi b/test/tools/tests_posttag_dav_mpi
deleted file mode 100644
index ef42215791..0000000000
--- a/test/tools/tests_posttag_dav_mpi
+++ /dev/null
@@ -1,2 +0,0 @@
-smi#2 bli#2
-smi59 bli59
diff --git a/test/tools/tests_posttag_hobart_nompi b/test/tools/tests_posttag_hobart_nompi
deleted file mode 100644
index 4655f29853..0000000000
--- a/test/tools/tests_posttag_hobart_nompi
+++ /dev/null
@@ -1,6 +0,0 @@
-smc#4 blc#4
-smi54 bli54
-smi57 bli57
-smiT4 bliT4
-smf84 blf84
-smfc4 blfc4
diff --git a/test/tools/tests_posttag_izumi_nompi b/test/tools/tests_posttag_izumi_nompi
deleted file mode 100644
index 90be9522dc..0000000000
--- a/test/tools/tests_posttag_izumi_nompi
+++ /dev/null
@@ -1,5 +0,0 @@
-smi54 bli54
-smi57 bli57
-smiT4 bliT4
-smf84 blf84
-smfc4 blfc4
diff --git a/test/tools/tests_posttag_nompi_regression b/test/tools/tests_posttag_nompi_regression
deleted file mode 100644
index 1785b5da47..0000000000
--- a/test/tools/tests_posttag_nompi_regression
+++ /dev/null
@@ -1,20 +0,0 @@
-smc#4 blc#4
-sme14 ble14
-smg54 blg54
-smi24 bli24
-smi53 bli53
-smi54 bli54
-smi57 bli57
-smi58 bli58
-smi74 bli74
-smi78 bli78
-smiT4 bliT4
-smiT2 bliT2
-smf84 blf84
-smfc4 blfc4
-smfg4 blfg4
-smiS4 bliS4
-smiS8 bliS8
-smiS9 bliS9
-smiS0 bliS0
-smiS0 bliS0
diff --git a/test/tools/tests_pretag_cheyenne_nompi b/test/tools/tests_pretag_cheyenne_nompi
deleted file mode 100644
index 3bdeef5deb..0000000000
--- a/test/tools/tests_pretag_cheyenne_nompi
+++ /dev/null
@@ -1,19 +0,0 @@
-smi79 bli79
-smc#4 blc#4
-sme14 ble14
-sme@4 ble@4
-smg54 blg54
-smi04 bli04
-smi24 bli24
-smi53 bli53
-smi64 bli64
-smi54 bli54
-smi57 bli57
-smi58 bli58
-smiS4 bliS4
-smi74 bli74
-smiT4 bliT4
-smiT2 bliT2
-smf84 blf84
-smfc4 blfc4
-smfg4 blfg4
diff --git a/tools/README b/tools/README
index 780e9f6e91..109c96f3f3 100644
--- a/tools/README
+++ b/tools/README
@@ -1,285 +1,22 @@
-$CTSMROOT/tools/README Jun/08/2018
+$CTSMROOT/tools/README Dec/05/2023
-CLM tools for analysis of CLM history files -- or for creation or
-modification of CLM input files.
-
-I. General directory structure:
+IMPORTANT NOTE: Normally the tools for creating CLM input datasets
+(such as surface dataset creation), as well as post-processing and
+other utility tools, would be here. However, we are only supporting
+these tools in later versions of the code; they should all work to
+create or process datasets from those versions.
- $CTSMROOT/tools
- mksurfdata_map --- Create surface datasets.
-          (NOTE: interpinic is no longer included, as it is now an online capability in CLM)
+Therefore, if you need any of these tools, check out:
- mkmapgrids ------- Create regular lat/lon SCRIP grid files needed by mkmapdata
- mkmapdata -------- Create SCRIP mapping data from SCRIP grid files (uses ESMF)
- mkprocdata_map --- Convert output unstructured grids into a 2D format that
- can be plotted easily
- ncl_scripts ------ NCL post or pre processing scripts.
+ctsm5.1.dev158 or later and use it
- contrib ---------- Miscellaneous tools for pre or post processing of CTSM.
- Typically these are contributed by anyone who has something
- they think might be helpful to the community. They may not
- be as well tested or supported as other tools.
+Let us know if you run into issues here by posting to the CESM Forum
+or sending email to ctsm-software@ucar.edu
- cime-tools ($CIMEROOT/tools/) (CIMEROOT is ../cime for a CTSM checkout and ../../../cime for a CESM checkout)
- $CIMEROOT/mapping/gen_domain_files
- gen_domain ------- Create data model domain datasets from SCRIP mapping datasets.
+ https://bb.cgd.ucar.edu/cesm/forums/ctsm-clm-mosart-rtm.134/
-II. Notes on building/running for each of the above tools:
+See this file for the latest information:
- Each tool that has FORTRAN source code (mksurfdata_map and mkprocdata_map) has the following files:
-
- README ------- Specific help for using the specific tool and help on specific
- files in that directory.
- src/Filepath ----- List of directories needed to build the tool
- (some files in ../src directories are required).
- src/Makefile ----- GNU Makefile to build the tool
-                       (these are identical between tools).
- src/Macros.custom Customization of make macros for the particular tool in question
- src/Srcfiles ----- List of source files that are needed.
- src/Mkdepends ---- Dependency generator program
-
- mkmapdata, mkmapgrids and ncl_scripts only contain scripts so don't have the above build files.
-
- Most tools have copies of files from other directories -- see the README.filecopies
- file for more information on this.
-
- Tools may also have files with the directory name followed by namelist to provide sample namelists.
-
- .namelist ------ Namelist to create a global file.
-
- These files are also used by the test scripts to test the tools (see the
- README.testing) file.
-
- NOTE: Be sure to change the path of the datasets references by these namelists to
- point to where you have exported your CESM inputdata datasets.
-
- To build:
-
- cd
- setenv INC_NETCDF
- setenv LIB_NETCDF
- gmake
-
- The process will create a file called "Depends" which has the dependencies
- for the build of each file on other files.
-
- By default some codes may be compiled non-optimized
- so that you can use the debugger, and with bounds-checking, and float trapping on.
- To speed up do the following...
-
- gmake OPT=TRUE (by default already on for interpinic and mksurfdata_map)
-
- Also some of the tools allow for OpenMP shared memory parallelism
- (such as interpinic and mksurfdata) with
-
- gmake SMP=TRUE
-
- To run a program with a namelist:
-
- ./program < namelist
-
- To get help on running a program with command line options (e.g., interpinic):
-
- ./program
-
- To run a program built with SMP=TRUE:
-
- setenv OMP_NUM_THREADS=
-
- run normally as above
-
-III. Process sequence to create input datasets needed to run CLM
-
- 1.) Create SCRIP grid files (if needed)
-
- a.) For standard resolutions these files will already be created. (done)
-
- b.) To create regular lat-lon regional/single-point grids run mknoocnmap.pl
-
- This will create both SCRIP grid files and a mapping file that will
- be valid if the region includes NO ocean whatsoever (so you can skip step 2).
- You can also use this script to create SCRIP grid files for a region
- (or even a global grid) that DOES include ocean if you use step 2 to
- create mapping files for it (simply discard the non-ocean map created by
- this script).
-
- Example, for single-point over Boulder Colorado.
-
- cd mkmapdata
- ./mknoocnmap.pl -p 40,255 -n 1x1_boulderCO
-
- c.) General case
-
- You'll need to convert or create SCRIP grid files on your own (using scripts
- or other tools) for the general case where you have an unstructured grid, or
- a grid that is not regular in latitude and longitude.
-
- example format
- ==================
- netcdf fv1.9x2.5_090205 {
- dimensions:
- grid_size = 13824 ;
- grid_corners = 4 ;
- grid_rank = 2 ;
- variables:
- double grid_center_lat(grid_size) ;
- grid_center_lat:units = "degrees" ;
- double grid_center_lon(grid_size) ;
- grid_center_lon:units = "degrees" ;
- double grid_corner_lat(grid_size, grid_corners) ;
- grid_corner_lat:units = "degrees" ;
- double grid_corner_lon(grid_size, grid_corners) ;
- grid_corner_lon:units = "degrees" ;
- int grid_dims(grid_rank) ;
- int grid_imask(grid_size) ;
- grid_imask:units = "unitless" ;
-
- 2.) Create ocean to atmosphere mapping file (if needed)
-
- a.) Standard resolutions (done)
-
- If this is a standard resolution with a standard ocean resolution -- this
- step is already done, the files already exist.
-
- b.) Region without Ocean (done in step 1.b)
-
- IF YOU RAN mknoocnmap.pl FOR A REGION WITHOUT OCEAN THIS STEP IS ALREADY DONE.
-
- c.) New atmosphere or ocean resolution
-
- If the region DOES include ocean, use $CIMEROOT/tools/mapping/gen_domain_files/gen_maps.sh to create a
- mapping file for it.
-
- Example:
-
- cd $CIMEROOT/tools/mapping/gen_domain_files
- ./gen_maps.sh -focn -fatm -nocn -natm
-
-
- 3.) Add SCRIP grid file(s) created in (1) into XML database in CLM (optional)
-
- See the "Adding New Resolutions or New Files to the build-namelist Database"
- Chapter in the CLM User's Guide
-
- http://www.cesm.ucar.edu/models/cesm1.0/clm/models/lnd/clm/doc/UsersGuide/book1.html
-
- If you don't do this step, you'll need to specify the file to mkmapdata
- in step (3) using the "-f" option.
-
- 4.) Create mapping files for use by mksurfdata_map with mkmapdata
- (See mkmapdata/README for more help on doing this)
-
- - this step uses the results of (1) that were entered into the XML database
- by step (3). If you don't enter datasets in, you need to specify the
- SCRIP grid file using the "-f" option to mkmapdata.sh.
-
- Example: to generate all necessary mapping files for the ne30np4 grid
-
- cd mkmapdata
- ./mkmapdata.sh -r ne30np4
-
- 5.) Add mapping file(s) created in step (4) into XML database in CLM (optional)
-
- See notes on doing this in step (3) above.
- Edit ../bld/namelist_files/namelist_defaults_clm.xml to incorporate new
- mapping files.
-
- If you don't do this step, you'll need to specify the grid resolution name
- and file creation dates to mksurfdata_map in step (5) below.
-
- 6.) Convert map of ocean to atm for use by DATM and CLM with gen_domain
- (See $CIMEROOT/tools/mapping/README for more help on doing this)
-
- - gen_domain uses the map from step (2) (or previously created CESM maps)
-
- Example:
-
- cd $CIMEROOT/tools/mapping/gen_domain_files/src
- gmake
- cd ..
- setenv CDATE 090206
- setenv OCNGRIDNAME gx1v6
- setenv ATMGRIDNAME fv1.9x2.5
- setenv MAPFILE $CSMDATA/cpl/cpl6/map_${OCNGRIDNAME}_to_${ATMGRIDNAME}_aave_da_${CDATE}.nc
- ./gen_domain -m $MAPFILE -o $OCNGRIDNAME -l $ATMGRIDNAME
-
- Normally for I compsets running CLM only you will discard the ocean domain
- file, and only use the atmosphere domain file for datm and as the fatmlndfrc
- file for CLM. Output domain files will be named according to the input OCN/LND
- gridnames.
-
- 7.) Create surface datasets with mksurfdata_map
- (See mksurfdata_map/README for more help on doing this)
-
- - Run mksurfdata_map/mksurfdata.pl
- - This step uses the results of step (4) entered into the XML database
- in step (5).
- - If datasets were NOT entered into the XML database, set the resolution
- to "usrspec" and use the "-usr_gname", and "-usr_gdate" options.
-
- Example: for 0.9x1.25 resolution
-
- cd mksurfdata_map/src
- gmake
- cd ..
- ./mksurfdata.pl -r 0.9x1.25
-
- NOTE that surface dataset will be used by default for fatmgrid - and it will
- contain the lat,lon,edges and area values for the atm grid - ASSUMING that
- the atm and land grid are the same
-
- 8.) Interpolate initial conditions using interpinic (optional)
- (See interpinic/README for more help on doing this)
-
- 9.) Add new files to XML data or using user_nl_clm (optional)
-
- See notes on doing this in step (3) above.
-
-IV. Example of creating single-point datasets without entering into XML database.
-
- Here we apply the process described in III. for a single-point dataset
- where we don't enter the datasets into the XML database (thus skipping
- steps 3, 5 and 9), but use the needed command line options to specify where the
- files are. This also skips step (2) since step 1 creates the needed mapping file.
- We also skip step (8) and do NOT create a finidat file.
-
- 0.) Set name of grid to use and the creation date to be used later...
- setenv GRIDNAME 1x1_boulderCO
- setenv CDATE `date +%y%m%d`
- 1.) SCRIP grid and atm to ocn mapping file
- cd mkmapdata
- ./mknoocnmap.pl -p 40,255 -n $GRIDNAME
- # Set pointer to MAPFILE that will be used in step (6)
- setenv MAPFILE `pwd`/map_${GRIDNAME}_noocean_to_${GRIDNAME}_nomask_aave_da_${CDATE}.nc
- cd ../..
- 2.) skip
- 3.) skip
- 4.) Mapping files needed for mksurfdata_map
- cd mkmapdata
- setenv GRIDFILE ../mkmapgrids/SCRIPgrid_${GRIDNAME}_nomask_${CDATE}.nc
- ./mkmapdata.sh -r $GRIDNAME -f $GRIDFILE -t regional
- cd ../..
- 5.) skip
- 6.) Generate domain file for datm and CLM
- cd $CIMEROOT/tools/mapping/gen_domain_files/src
- gmake
- cd ..
- setenv OCNDOM domain.ocn_noocean.nc
-    setenv ATMDOM domain.lnd.${GRIDNAME}_noocean.nc
- ./gen_domain -m $MAPFILE -o $OCNDOM -l $ATMDOM
- 7.) Create surface dataset for CLM
- cd mksurfdata_map/src
- gmake
- cd ..
- ./mksurfdata.pl -r usrspec -usr_gname $GRIDNAME -usr_gdate $CDATE
- 8.) skip
- 9.) skip
-
-V. Notes on which input datasets are needed for CLM
-
- global or regional/single-point grids
- - need fsurdata and fatmlndfrc
-
- fsurdata ---- from mksurfdata_map in step (III.7)
- fatmlndfrc -- use the domain.lnd file from gen_domain in step (III.6)
+https://docs.google.com/document/d/1FOluB3C7RTHu8vArWGjACwFjGJfxkLcaGP00l0585_k
+ and the discussions and issues on github.com/ESCOMP/CTSM regarding this.
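The single-point workflow removed from the README (section IV) hinges on consistent file-name patterns derived from the grid name and creation date. A sketch that derives those names without invoking any CTSM tools, following the patterns in the deleted text:

```shell
#!/bin/sh
# Derive the file names used by the deleted README's single-point example
# (section IV) from the grid name and creation date. The date value below
# is hypothetical; the README sets it with `date +%y%m%d`.
GRIDNAME=1x1_boulderCO
CDATE=090206

# Map created by mknoocnmap.pl in step 1, reused by gen_domain in step 6.
MAPFILE="map_${GRIDNAME}_noocean_to_${GRIDNAME}_nomask_aave_da_${CDATE}.nc"
# SCRIP grid file passed to mkmapdata.sh -f in step 4.
GRIDFILE="SCRIPgrid_${GRIDNAME}_nomask_${CDATE}.nc"
# Land domain file from gen_domain, used as the datm domain and fatmlndfrc.
ATMDOM="domain.lnd.${GRIDNAME}_noocean.nc"
```

Keeping these patterns in one place avoids the easiest mistake in the manual workflow: a grid name or date that differs between the mapping, grid, and domain file names.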
diff --git a/tools/README.filecopies b/tools/README.filecopies
deleted file mode 100644
index 5ab2bc96d1..0000000000
--- a/tools/README.filecopies
+++ /dev/null
@@ -1,38 +0,0 @@
-tools/README.filecopies May/26/2011
-
-There are several files that are copies of the original files from
-either CTSM src/main, cime/src/share/utils,
-cime/src/share/unit_test_stubs, or copies from other tool
-directories. By having copies the tools can all be made stand-alone,
-but any changes to the originals will have to be put into the tool
-directories as well.
-
-I. Files that are IDENTICAL:
-
- 1. csm_share files copied that should be identical to cime/share/utils:
-
- shr_kind_mod.F90
- shr_const_mod.F90
- shr_log_mod.F90
- shr_timer_mod.F90
- shr_string_mod.F90
- shr_file_mod.F90
-
- 2. csm_share files copied that should be identical to cime/share/csm_share/unit_testers:
-
- test_mod.F90
-
-II. Files with differences
-
- 1. csm_share files copied with differences:
-
- shr_sys_mod.F90 - Remove mpi abort and reference to shr_mpi_mod.F90.
-
- 2. CTSM src/utils files with differences:
-
- fileutils.F90 --- Remove use of masterproc and spmdMod and endrun in abortutils.
-
- 3. Files in mksurfdata_map
-
- mkvarpar.F90
- nanMod.F90
diff --git a/tools/README.testing b/tools/README.testing
deleted file mode 100644
index 9c386a3b26..0000000000
--- a/tools/README.testing
+++ /dev/null
@@ -1,58 +0,0 @@
-tools/README.testing May/23/2011
-
-There is automated testing for all of the tools and scripts under this tools directory.
-The tests are in the test/tools directory and are any of the scripts
-that have "tools" in the name. There are several assumptions made in order for the
-testing to work.
-
-
-1.) Executable name is the same as the directory name
-
-The name of the executable program is the same as the directory name of the tool.
-
-2.) Build works the same for any Fortran tools
-
-The build for any Fortran tool should work the same way, with the same options
-and required files. The files Makefile, Mkdepends, Filepath and Srcfile
-are expected to exist in the tool's "src" sub-directory. To make these files easier
-to maintain, the Makefile and Mkdepends files should in general be kept identical
-apart from the default settings for OPT and SMP and the output executable name.
-
-Options to the Makefile:
-
- LIB_NETCDF --- Library directory location of NetCDF. (defaults to /usr/local/lib)
- INC_NETCDF --- Include directory location of NetCDF. (defaults to /usr/local/include)
- MOD_NETCDF --- Module directory location of NetCDF. (defaults to $LIB_NETCDF)
- USER_FC ------ Allow user to override the default Fortran compiler specified in Makefile.
- USER_FCTYP --- Allow user to override the default type of Fortran compiler
- (Linux and USER_FC=ftn only).
- USER_CC ------ Allow user to override the default C compiler specified in Makefile
- (Linux only).
- USER_LINKER -- Allow user to override the default linker specified in Makefile.
- SMP ---------- Shared memory Multi-processing (TRUE or FALSE) [default is FALSE]
- OPT ---------- Use optimized options. (TRUE or FALSE)
-
-3.) Successful completion of the tool ends with "Successfully ..."
-
-After the tool completes, it should print a unique string signaling successful
-completion, searchable in the log file and starting with "Successfully ".
-If this string isn't found in the log file, the test setup assumes the
-test failed.
-
-4.) NetCDF files are created or modified
-
-It is assumed that NetCDF files are created and/or modified by the process; if
-no NetCDF files are created, something went wrong. For scripts that copy files
-to other locations, this means a "-nomv" option must be provided (and invoked)
-so that the created files are left in the current directory.
-
-5.) Sample namelist or run-options files exist
-
-To specify options for the running of the tool, sample namelist files
-are provided or a sample run options file. These files can EITHER be in the
-tool directory OR the ../../test/tools/nl_files directory.
-
-6.) Specific tests for run scripts
-
-For tools that have scripts to create namelists and run the tool for you, there
-are customized tests to run these tools.
diff --git a/tools/contrib/README b/tools/contrib/README
deleted file mode 100644
index bae1b51ca9..0000000000
--- a/tools/contrib/README
+++ /dev/null
@@ -1,48 +0,0 @@
-$CTSMROOT/tools/contrib/README Jan/24/2019
-
-This directory is for users of CTSM to contribute scripts for pre- or post-processing
-or case management of CTSM that others might find useful. Each script should have some
-documentation available before it is added. These scripts may not be as well tested or
-supported as other CTSM tools. They are also ONLY assumed to work on the NCAR
-supercomputer, so paths will be hardwired for NCAR directory structures.
-
-The python scripts require the following settings before running on cheyenne:
-(Currently that's the singlept and subset_surfdata scripts)
-
-module load python/2.7.14
-ncar_pylib
-
-Brief description of scripts:
-
-run_clm_historical
-  does all the setup and submission required to do an 1850-2010 CLM
- historical simulation in three separate submissions
- v1 - Andrew Slater+Dave Lawrence, 8/2015
-
-subset_surfdata
- create regional domain, surface data, and rtm directional files by
- extracting data from global datasets
- v1 - Sean Swenson 8/2015
-
-singlept
- create single point domain, surface data, and datm forcing files by
- extracting data from global datasets
- v1 - Sean Swenson 8/2015
-
-SpinupStability.ncl
- This script assesses the equilibrium state of a spinup run
- works on either monthly or annual mean history files - Keith Oleson 7/2016
-
-run_clmtowers
- This script will run any number of flux tower sites.
- It's based on having created surface datasets with PTCLM.
- v1 - Keith Oleson, 8/2015
-
-ssp_anomaly_forcing_smooth
- This script creates anomaly forcing for CMIP6 SSP scenarios that
- can be used to run CTSM in CESM with datm.
- v0 -- Sean Swenson
- v1 - Peter Lawrence 3/2020
-
-
-
diff --git a/tools/contrib/SpinupStability.ncl b/tools/contrib/SpinupStability.ncl
deleted file mode 100644
index e8d0a61ba9..0000000000
--- a/tools/contrib/SpinupStability.ncl
+++ /dev/null
@@ -1,791 +0,0 @@
-; NCL script
-; SpinupStability_v8.ncl
-; Script to examine stability of spinup simulation.
-; This version operates on either monthly mean or multi-annual mean multi-variable history files
-; Keith Oleson, Jan 2019
-
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/shea_util.ncl"
-
-begin
-
- print ("=========================================")
- print ("Start Time: "+systemfunc("date") )
- print ("=========================================")
-
-;=======================================================================;
-; This script produces a page of plots of various variables that are evaluated
-; as to whether they are spun up or not. A summary of variables in equilibrium
-; and/or in disequilibrium is also printed to standard out. The postscript output
-; is $caseid_Spinup.ps in the pwd.
-; The variables examined are TOTECOSYSC,TOTSOMC,TOTVEGC,TLAI,GPP,TWS,H2OSNO.
-; The percentage of land area in TOTECOSYSC disequilibrium is also examined (not for single point).
-; Thresholds are defined below (i.e., global_thresh_*) and can be changed for individual
-; variables.
-; To run this script, just enter in your case name and username below.
-; AND set the annual_hist flag to True if your case has annual mean history files or set
-; annual_hist flag to False if your case has monthly mean history files.
-; AND set the region (supported options: Global, Arctic, SPT).
-; AND set the subper (subsampling period in years, number of years that atm forcing repeats).
-; The script assumes that your history files are in /glade/scratch/$username/archive/$caseid/lnd/hist
-;=======================================================================;
-
-; GLOBAL EXAMPLE
- caseid = "clm50_release-clm5.0.15_2deg_GSWP3V1_1850AD"
- username = "oleson"
- annual_hist = True
- region = "Global" ; Global, Arctic, or SPT (single point)
- subper = 20 ; Subsampling period in years
-
-; SPT (single point) EXAMPLE
-; caseid = "clm50_release-clm5.0.15_SPT_GSWP3V1_1850spin"
-; username = "oleson"
-; annual_hist = True
-; region = "SPT" ; Global, Arctic, or SPT (single point)
-; subper = 20 ; Subsampling period in years
-
- do_plot = True
-; do_plot = False
-;=======================================================================;
-
- data_dir = "/glade/scratch/"+username+"/archive/"+caseid+"/lnd/hist/"
- if ( systemfunc("test -d "+data_dir+"; echo $?" ) .ne. 0 )then
-    print( "Input directory does not exist or was not found: "+data_dir );
-    print( "Make sure username, caseid, and the base directory are set correctly" )
- status_exit( -1 )
- end if
-
-; Thresholds
- if (region .eq. "SPT") then
- glob_thresh_totecosysc = 0.02 ; global threshold for TOTECOSYSC equilibrium (delta MgC / yr)
- glob_thresh_totsomc = 0.02 ; global threshold for TOTSOMC equilibrium (delta MgC / yr)
- glob_thresh_totvegc = 0.02 ; global threshold for TOTVEGC equilibrium (delta MgC / yr)
- glob_thresh_tlai = 0.02 ; global threshold for TLAI equilibrium (delta m2/m2 / yr)
- glob_thresh_gpp = 0.02 ; global threshold for GPP equilibrium (delta MgC / yr)
- glob_thresh_tws = 0.001 ; global threshold for TWS equilibrium (delta m / yr)
- glob_thresh_h2osno = 1.0 ; global threshold for H2OSNO equilibrium (delta mm / yr)
- glob_thresh_area = 3.0 ; global threshold percent area with TOTECOSYSC disequilibrium gt 1 g C/m2/yr (not used)
- totecosysc_thresh = 1. ; disequilibrium threshold for individual gridcells (gC/m2/yr) (not used)
- else
- glob_thresh_totecosysc = 0.02 ; global threshold for TOTECOSYSC equilibrium (delta PgC / yr)
- glob_thresh_totsomc = 0.02 ; global threshold for TOTSOMC equilibrium (delta PgC / yr)
- glob_thresh_totvegc = 0.02 ; global threshold for TOTVEGC equilibrium (delta PgC / yr)
- glob_thresh_tlai = 0.02 ; global threshold for TLAI equilibrium (delta m2/m2 / yr)
- glob_thresh_gpp = 0.02 ; global threshold for GPP equilibrium (delta PgC / yr)
- glob_thresh_tws = 0.001 ; global threshold for TWS equilibrium (delta m / yr)
- glob_thresh_h2osno = 1.0 ; global threshold for H2OSNO equilibrium (delta mm / yr)
- glob_thresh_area = 3.0 ; global threshold percent area with TOTECOSYSC disequilibrium gt 1 g C/m2/yr
- totecosysc_thresh = 1. ; disequilibrium threshold for individual gridcells (gC/m2/yr)
- end if
-
- if (annual_hist) then
- fls = systemfunc("ls " + data_dir + caseid+".clm?.h0.*-*-*-*"+".nc")
- else
- fls = systemfunc("ls " + data_dir + caseid+".clm?.h0.*-*"+".nc")
- end if
- flsdims = dimsizes(fls)
-
- if (annual_hist) then
- lstfile = addfile(fls(flsdims-1),"r")
- else
- lstfile = addfile(fls(flsdims-12),"r") ;last month (DEC) of any year has mdate for next year
- ;so grab JAN of last year
- end if
-
- if (annual_hist) then
- lstyrdim = dimsizes(lstfile->mcdate)
- mcdate_lst = lstfile->mcdate(lstyrdim-1)
- else
- mcdate_lst = lstfile->mcdate
- end if
- lstyr = toint(mcdate_lst)/10000
-
- fstfile = addfile(fls(0),"r")
- if (annual_hist) then
- mcdate_fst = fstfile->mcdate(0)
- else
- mcdate_fst = fstfile->mcdate
- end if
- fstyr = toint(mcdate_fst)/10000
-
- yearplt = ispan(fstyr,lstyr,subper)
- yearpltrev = yearplt(::-1)
- year = ispan(fstyr,lstyr,subper) - fstyr
- nyrs = dimsizes(year)
- if (subper .eq. 1) then
- yearpltmid = ispan(fstyr+subper/2,yearplt(nyrs-2),subper)
- else
- yearpltmid = ispan(fstyr+subper/2,yearplt(nyrs-1),subper)
- end if
-
-; Build an array of monthly indices
- monstr = new(nyrs*12,"integer")
- do i = 0,nyrs-1
- monstr(i*12:i*12+11) = ispan(year(i)*12,year(i)*12+11,1)
- end do
-
-; Add the data files together
- if (annual_hist) then
- data = addfiles(fls, "r")
- ListSetType (data, "cat")
- else
- data = addfiles(fls(monstr), "r")
- end if
-
-; Convert to annual means if required
- if (annual_hist) then
- if (region .eq. "SPT") then
- totecosysc = data[:]->TOTECOSYSC(year,:)
- totsomc = data[:]->TOTSOMC(year,:)
- totvegc = data[:]->TOTVEGC(year,:)
- tlai = data[:]->TLAI(year,:)
- gpp = data[:]->GPP(year,:)
- tws = data[:]->TWS(year,:)
- if (isfilevar(data[0],"H2OSNO")) then
- h2osno = data[:]->H2OSNO(year,:)
- else
- h2osno = tws
- h2osno = h2osno@_FillValue
- end if
- else
- totecosysc = data[:]->TOTECOSYSC(year,:,:)
- totsomc = data[:]->TOTSOMC(year,:,:)
- totvegc = data[:]->TOTVEGC(year,:,:)
- tlai = data[:]->TLAI(year,:,:)
- gpp = data[:]->GPP(year,:,:)
- tws = data[:]->TWS(year,:,:)
- if (isfilevar(data[0],"H2OSNO")) then
- h2osno = data[:]->H2OSNO(year,:,:)
- else
- h2osno = tws
- h2osno = h2osno@_FillValue
- end if
- end if
- else
- totecosysc = month_to_annual(data[:]->TOTECOSYSC,1)
- totsomc = month_to_annual(data[:]->TOTSOMC,1)
- totvegc = month_to_annual(data[:]->TOTVEGC,1)
- tlai = month_to_annual(data[:]->TLAI,1)
- gpp = month_to_annual(data[:]->GPP,1)
- if (isfilevar(data[0],"TWS")) then
- tws = month_to_annual(data[:]->TWS,1)
- else
- tws = gpp
- tws = tws@_FillValue
- end if
- if (isfilevar(data[0],"H2OSNO")) then
- h2osno = month_to_annual(data[:]->H2OSNO,1)
- else
- h2osno = tws
- h2osno = h2osno@_FillValue
- end if
- end if
- lat = data[0]->lat
- nlat = dimsizes(lat)
- lon = data[0]->lon
- nlon = dimsizes(lon)
- landfrac = data[0]->landfrac
- area = data[0]->area
- aream = area*1.e6
- landarea = landfrac*aream
- if (region .eq. "SPT") then
- gtoXg = 1e-12 ;Tg
- units = "Tg C"
- else
- gtoXg = 1e-15 ;Pg
- units = "Pg C"
- end if
- secinyr = 60.*60.*24.*365.
-
-; TOTECOSYSC
- if (region .eq. "SPT") then
- landareaC = conform_dims(dimsizes(totecosysc),landarea,(/1/)) ; conforming dimensions of landarea to totecosysc
- else
- landareaC = conform_dims(dimsizes(totecosysc),landarea,(/1,2/)) ; conforming dimensions of landarea to totecosysc
- end if
- totecosysc_area = totecosysc*landareaC ; correcting totecosysc for total land area
- totecosysc_Xg = totecosysc_area*gtoXg ; g to Xg
- if (region .eq. "SPT") then
- totecosysc_glob = dim_sum_n(totecosysc_Xg, (/1/)) ; sums over single point
- else
- totecosysc_glob = dim_sum_n(totecosysc_Xg, (/1,2/)) ; sums over all latitudes
- end if
- totecosysc_glob!0 = "year"
- totecosysc_glob&year = yearplt
- totecosysc_glob_del = new(nyrs-1,"float")
- do i = 0,nyrs-2
- totecosysc_glob_del(i) = (totecosysc_glob(i+1) - totecosysc_glob(i))/subper
- end do
- totecosysc_glob_del!0 = "year"
- totecosysc_glob_del&year = yearpltmid
- indx = where(abs(totecosysc_glob_del) .lt. glob_thresh_totecosysc,1,0)
- if (all(indx .eq. 1)) then
- totecosysc_glob_equil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- totecosysc_glob_equil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- totecosysc_glob_equil = -999
- end if
- end if
- totecosysc_glob_equil@_FillValue = -999
- delete(indx)
-
-; Land area not in TOTECOSYSC equilibrium
- if (region .ne. "SPT") then
- landarea_noequil = new((/nyrs-1,nlat,nlon/),"float")
- do i = 0,nyrs-2
- landarea_noequil(i,:,:) = where((totecosysc(i+1,:,:) - totecosysc(i,:,:))/subper .gt. totecosysc_thresh, landarea, 0.)
- end do
- perc_landarea_noequil = 100.*(dim_sum_n(landarea_noequil,(/1,2/))/sum(landarea))
- indx = where(abs(perc_landarea_noequil) .lt. glob_thresh_area,1,0)
- if (all(indx .eq. 1)) then
- perc_landarea_glob_noequil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- perc_landarea_glob_noequil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- perc_landarea_glob_noequil = -999
- end if
- end if
- perc_landarea_glob_noequil@_FillValue = -999
- delete(indx)
- totecosysc_1_map = where(landarea_noequil(nyrs-2,:,:) .ne. 0.,(totecosysc(nyrs-1,:,:)-totecosysc(nyrs-2,:,:))/subper,totecosysc@_FillValue)
- totecosysc_1_map!0 = "lat"
- totecosysc_1_map&lat = lat
- totecosysc_1_map!1 = "lon"
- totecosysc_1_map&lon = lon
- totecosysc_2_map = where(landarea_noequil(nyrs-3,:,:) .ne. 0.,(totecosysc(nyrs-2,:,:)-totecosysc(nyrs-3,:,:))/subper,totecosysc@_FillValue)
- copy_VarCoords(totecosysc_1_map,totecosysc_2_map)
- end if
-
-; TOTSOMC
- totsomc_area = totsomc*landareaC ; correcting totsomc for total land area
- totsomc_Xg = totsomc_area*gtoXg ; g to Xg
- if (region .eq. "SPT") then
- totsomc_glob = dim_sum_n(totsomc_Xg, (/1/)) ; sums over single point
- else
- totsomc_glob = dim_sum_n(totsomc_Xg, (/1,2/)) ; sums over all latitudes
- end if
- totsomc_glob!0 = "year"
- totsomc_glob&year = yearplt
- totsomc_glob_del = new(nyrs-1,"float")
- do i = 0,nyrs-2
- totsomc_glob_del(i) = (totsomc_glob(i+1) - totsomc_glob(i))/subper
- end do
- totsomc_glob_del!0 = "year"
- totsomc_glob_del&year = yearpltmid
- indx = where(abs(totsomc_glob_del) .lt. glob_thresh_totsomc,1,0)
- if (all(indx .eq. 1)) then
- totsomc_glob_equil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- totsomc_glob_equil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- totsomc_glob_equil = -999
- end if
- end if
- totsomc_glob_equil@_FillValue = -999
- delete(indx)
-
-; TOTVEGC
- totvegc_area = totvegc*landareaC ; correcting totvegc for total land area
- totvegc_Xg = totvegc_area*gtoXg ; g to Xg
- if (region .eq. "SPT") then
- totvegc_glob = dim_sum_n(totvegc_Xg, (/1/)) ; sums over single point
- else
- totvegc_glob = dim_sum_n(totvegc_Xg, (/1,2/)) ; sums over all latitudes
- end if
- totvegc_glob!0 = "year"
- totvegc_glob&year = yearplt
- totvegc_glob_del = new(nyrs-1,"float")
- do i = 0,nyrs-2
- totvegc_glob_del(i) = (totvegc_glob(i+1) - totvegc_glob(i))/subper
- end do
- totvegc_glob_del!0 = "year"
- totvegc_glob_del&year = yearpltmid
- indx = where(abs(totvegc_glob_del) .lt. glob_thresh_totvegc,1,0)
- if (all(indx .eq. 1)) then
- totvegc_glob_equil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- totvegc_glob_equil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- totvegc_glob_equil = -999
- end if
- end if
- totvegc_glob_equil@_FillValue = -999
- delete(indx)
-
-; TLAI
- areasum = sum(area*landfrac)
- areaL = area*landfrac
- if (region .eq. "SPT") then
- landareaL = conform_dims(dimsizes(tlai),areaL,(/1/)) ; conforming dimensions of areaL to tlai
- tlai_glob = dim_sum_n(tlai*landareaL/areasum,(/1/)) ; weighted tlai
- else
- landareaL = conform_dims(dimsizes(tlai),areaL,(/1,2/)) ; conforming dimensions of areaL to tlai
- tlai_glob = dim_sum_n(tlai*landareaL/areasum,(/1,2/)) ; weighted global tlai
- end if
- tlai_glob!0 = "year"
- tlai_glob&year = yearplt
- tlai_glob_del = new(nyrs-1,"float")
- do i = 0,nyrs-2
- tlai_glob_del(i) = (tlai_glob(i+1) - tlai_glob(i))/subper
- end do
- tlai_glob_del!0 = "year"
- tlai_glob_del&year = yearpltmid
- indx = where(abs(tlai_glob_del) .lt. glob_thresh_tlai,1,0)
- if (all(indx .eq. 1)) then
- tlai_glob_equil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- tlai_glob_equil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- tlai_glob_equil = -999
- end if
- end if
- tlai_glob_equil@_FillValue = -999
- delete(indx)
-
-; GPP
- gpp_area = gpp*landareaC ; correcting gpp for total land area
- gpp_Xg = gpp_area*gtoXg*secinyr ; g to Xg and sec to yrs
- if (region .eq. "SPT") then
- gpp_glob = dim_sum_n(gpp_Xg, (/1/)) ; sums over single point
- else
- gpp_glob = dim_sum_n(gpp_Xg, (/1,2/)) ; sums over all latitudes
- end if
- gpp_glob!0 = "year"
- gpp_glob&year = yearplt
- gpp_glob_del = new(nyrs-1,"float")
- do i = 0,nyrs-2
- gpp_glob_del(i) = (gpp_glob(i+1) - gpp_glob(i))/subper
- end do
- gpp_glob_del!0 = "year"
- gpp_glob_del&year = yearpltmid
- indx = where(abs(gpp_glob_del) .lt. glob_thresh_gpp,1,0)
- if (all(indx .eq. 1)) then
- gpp_glob_equil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- gpp_glob_equil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- gpp_glob_equil = -999
- end if
- end if
- gpp_glob_equil@_FillValue = -999
- delete(indx)
-
-; TWS
- if (isfilevar(data[0],"TWS")) then
-
- if (region .eq. "SPT") then
- tws_glob = (dim_sum_n(tws*landareaL/areasum,(/1/)))/1.e3 ; weighted tws (meters)
- else
- tws_glob = (dim_sum_n(tws*landareaL/areasum,(/1,2/)))/1.e3 ; weighted global tws (meters)
- end if
- tws_glob!0 = "year"
- tws_glob&year = yearplt
- tws_glob_del = new(nyrs-1,"float")
- do i = 0,nyrs-2
- tws_glob_del(i) = (tws_glob(i+1) - tws_glob(i))/subper
- end do
- tws_glob_del!0 = "year"
- tws_glob_del&year = yearpltmid
- indx = where(abs(tws_glob_del) .lt. glob_thresh_tws,1,0)
- if (all(indx .eq. 1)) then
- tws_glob_equil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- tws_glob_equil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- tws_glob_equil = -999
- end if
- end if
- tws_glob_equil@_FillValue = -999
- delete(indx)
-
- else
- tws_glob_equil = new(1,typeof(yearplt),-999)
- tws_glob_equil = -999
- if (region .eq. "SPT") then
- tws_glob = new(dimsizes(dim_sum_n(tws,(/1/))),typeof(tws),-999)
- else
- tws_glob = new(dimsizes(dim_sum_n(tws,(/1,2/))),typeof(tws),-999)
- end if
- tws_glob = -999
- tws_glob_del = new(nyrs-1,"float",-999.)
- tws_glob_del = -999.
- end if
-
-; H2OSNO
- if (isfilevar(data[0],"H2OSNO")) then
-
- if (region .eq. "SPT") then
- h2osno_glob = dim_sum_n(h2osno*landareaL/areasum,(/1/)) ; weighted h2osno (mm)
- else
- h2osno_glob = dim_sum_n(h2osno*landareaL/areasum,(/1,2/)) ; weighted global h2osno (mm)
- end if
- h2osno_glob!0 = "year"
- h2osno_glob&year = yearplt
- h2osno_glob_del = new(nyrs-1,"float")
- do i = 0,nyrs-2
- h2osno_glob_del(i) = (h2osno_glob(i+1) - h2osno_glob(i))/subper
- end do
- h2osno_glob_del!0 = "year"
- h2osno_glob_del&year = yearpltmid
- indx = where(abs(h2osno_glob_del) .lt. glob_thresh_h2osno,1,0)
- if (all(indx .eq. 1)) then
- h2osno_glob_equil = yearplt(0)
- else
- if (.not.(all(indx .eq. 0)) .and. indx(dimsizes(indx)-1) .ne. 0) then
- indxrev = indx(::-1)
- do i = 0,dimsizes(indxrev)-1
- if (indxrev(i) .eq. 0) then
- h2osno_glob_equil = yearpltrev(i-1)
- break
- end if
- end do
- delete(indxrev)
- else
- h2osno_glob_equil = -999
- end if
- end if
- h2osno_glob_equil@_FillValue = -999
- delete(indx)
-
- else
-    h2osno_glob_equil = new(1,typeof(yearplt),-999)
-    h2osno_glob_equil = -999
- if (region .eq. "SPT") then
- h2osno_glob = new(dimsizes(dim_sum_n(h2osno,(/1/))),typeof(h2osno),-999)
- else
- h2osno_glob = new(dimsizes(dim_sum_n(h2osno,(/1,2/))),typeof(h2osno),-999)
- end if
- h2osno_glob = -999
- h2osno_glob_del = new(nyrs-1,"float",-999.)
- h2osno_glob_del = -999.
- end if
-
-;===============================Plotting====================================;
- if (do_plot) then
-
-; wks_type = "png"
-; wks_type@wkWidth = 2500
-; wks_type@wkHeight = 2500
-; wks_type = "x11"
- wks_type = "ps"
-; wks_type = "pdf"
- wks = gsn_open_wks (wks_type, caseid+"_Spinup")
- gsn_define_colormap(wks, "ViBlGrWhYeOrRe")
-
- plot = new(13,"graphic")
-
- resP = True
-; resP@gsnMaximize = True
- resP@gsnPaperOrientation = "portrait"
- resP@gsnFrame = False
- resP@gsnDraw = True
- resP@txString = caseid + " Annual Mean "
- resP@gsnPanelXWhiteSpacePercent = 2.
- resP@gsnPanelCenter = False
-; resP@gsnPanelDebug = True
-
- res = True
- res@gsnFrame = False
- res@gsnDraw = False
- res@xyLineThicknessF = 2
-
- polyres1 = True
- polyres1@gsLineDashPattern = 16
- polyres2 = True
- polyres2@gsLineDashPattern = 2
-
- res@tiXAxisString = " "
- res@tiYAxisString = units
- res@tiMainString = "TOTECOSYSC"
- plot(0) = gsn_csm_xy(wks,yearplt,totecosysc_glob,res)
-
- res@tiYAxisString = units
- res@tiMainString = "Delta TOTECOSYSC " + "EqYr: "+totecosysc_glob_equil
- res@trYMaxF = 0.2
- res@trYMinF = -0.2
- plot(1) = gsn_csm_xy(wks,yearpltmid,totecosysc_glob_del,res)
- prim1 = gsn_add_polyline(wks,plot(1),(/0.,yearpltmid(nyrs-2)/),(/0.,0./),polyres1)
- prim2 = gsn_add_polyline(wks,plot(1),(/0.,yearpltmid(nyrs-2)/),(/-glob_thresh_totecosysc,-glob_thresh_totecosysc/),polyres2)
- prim3 = gsn_add_polyline(wks,plot(1),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_totecosysc,glob_thresh_totecosysc/),polyres2)
-
- delete(res@trYMaxF)
- delete(res@trYMinF)
- res@tiYAxisString = units
- res@tiMainString = "TOTSOMC"
- plot(2) = gsn_csm_xy(wks,yearplt,totsomc_glob,res)
-
- res@tiYAxisString = units
- res@tiMainString = "Delta TOTSOMC " + "EqYr: "+totsomc_glob_equil
- res@trYMaxF = 0.2
- res@trYMinF = -0.2
- plot(3) = gsn_csm_xy(wks,yearpltmid,totsomc_glob_del,res)
- prim4 = gsn_add_polyline(wks,plot(3),(/0.,yearpltmid(nyrs-2)/),(/0.,0./),polyres1)
- prim5 = gsn_add_polyline(wks,plot(3),(/0.,yearpltmid(nyrs-2)/),(/-glob_thresh_totsomc,-glob_thresh_totsomc/),polyres2)
- prim6 = gsn_add_polyline(wks,plot(3),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_totsomc,glob_thresh_totsomc/),polyres2)
-
- delete(res@trYMaxF)
- delete(res@trYMinF)
- res@tiYAxisString = units
- res@tiMainString = "TOTVEGC"
- plot(4) = gsn_csm_xy(wks,yearplt,totvegc_glob,res)
-
- res@tiYAxisString = units
- res@tiMainString = "Delta TOTVEGC " + "EqYr: "+totvegc_glob_equil
- res@trYMaxF = 0.2
- res@trYMinF = -0.2
- plot(5) = gsn_csm_xy(wks,yearpltmid,totvegc_glob_del,res)
- prim7 = gsn_add_polyline(wks,plot(5),(/0.,yearpltmid(nyrs-2)/),(/0.,0./),polyres1)
- prim8 = gsn_add_polyline(wks,plot(5),(/0.,yearpltmid(nyrs-2)/),(/-glob_thresh_totvegc,-glob_thresh_totvegc/),polyres2)
- prim9 = gsn_add_polyline(wks,plot(5),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_totvegc,glob_thresh_totvegc/),polyres2)
-
- delete(res@trYMaxF)
- delete(res@trYMinF)
- res@tiYAxisString = "m ~S~2~N~ m~S~-2~N~"
- res@tiMainString = "TLAI"
- plot(6) = gsn_csm_xy(wks,yearplt,tlai_glob,res)
-
- res@tiYAxisString = "m ~S~2~N~ m~S~-2~N~"
- res@tiMainString = "Delta TLAI " + "EqYr: "+tlai_glob_equil
- res@trYMaxF = 0.2
- res@trYMinF = -0.2
- plot(7) = gsn_csm_xy(wks,yearpltmid,tlai_glob_del,res)
- prim10 = gsn_add_polyline(wks,plot(7),(/0.,yearpltmid(nyrs-2)/),(/0.,0./),polyres1)
- prim11 = gsn_add_polyline(wks,plot(7),(/0.,yearpltmid(nyrs-2)/),(/-glob_thresh_tlai,-glob_thresh_tlai/),polyres2)
- prim12 = gsn_add_polyline(wks,plot(7),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_tlai,glob_thresh_tlai/),polyres2)
-
- delete(res@trYMaxF)
- delete(res@trYMinF)
- res@tiYAxisString = units+" yr~S~-1~N~"
- res@tiMainString = "GPP"
- plot(8) = gsn_csm_xy(wks,yearplt,gpp_glob,res)
-
- res@tiYAxisString = units+" yr~S~-1~N~"
- res@tiXAxisString = "Spinup Year"
- res@tiMainString = "Delta GPP " + "EqYr: "+gpp_glob_equil
- res@trYMaxF = 0.2
- res@trYMinF = -0.2
- plot(9) = gsn_csm_xy(wks,yearpltmid,gpp_glob_del,res)
- prim13 = gsn_add_polyline(wks,plot(9),(/0.,yearpltmid(nyrs-2)/),(/0.,0./),polyres1)
- prim14 = gsn_add_polyline(wks,plot(9),(/0.,yearpltmid(nyrs-2)/),(/-glob_thresh_gpp,-glob_thresh_gpp/),polyres2)
- prim15 = gsn_add_polyline(wks,plot(9),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_gpp,glob_thresh_gpp/),polyres2)
-
- delete(res@trYMaxF)
- delete(res@trYMinF)
- res@tiYAxisString = "m"
- res@tiMainString = "TWS"
- plot(10) = gsn_csm_xy(wks,yearplt,tws_glob,res)
-
- res@tiYAxisString = "m"
- res@tiMainString = "Delta TWS " + "EqYr: "+tws_glob_equil
- res@trYMaxF = 0.05
- res@trYMinF = -0.05
- plot(11) = gsn_csm_xy(wks,yearpltmid,tws_glob_del,res)
- prim16 = gsn_add_polyline(wks,plot(11),(/0.,yearpltmid(nyrs-2)/),(/0.,0./),polyres1)
- prim17 = gsn_add_polyline(wks,plot(11),(/0.,yearpltmid(nyrs-2)/),(/-glob_thresh_tws,-glob_thresh_tws/),polyres2)
- prim18 = gsn_add_polyline(wks,plot(11),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_tws,glob_thresh_tws/),polyres2)
-
- if (region .ne. "SPT") then
- res@tiYAxisString = "%"
- res@tiMainString = "% Land Area in TOTECOSYSC Disequil " + "EqYr: "+perc_landarea_glob_noequil
- res@trYMaxF = 80.0
- res@trYMinF = 0.0
- plot(12) = gsn_csm_xy(wks,yearpltmid,perc_landarea_noequil,res)
- prim19 = gsn_add_polyline(wks,plot(12),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_area,glob_thresh_area/),polyres2)
- end if
-
- gsn_panel(wks,plot,(/4,4/),resP)
-
- if (region .ne. "SPT") then
- delete(plot)
- resc = True ; turn on plotting options
- resc@gsnSpreadColors = True ; spans all colors in colormap
- resc@cnFillMode = "RasterFill" ; raster mode
- resc@cnFillOn = True ; turn on color fill
- resc@cnLinesOn = False ; turn off contour lines
- resc@cnLineLabelsOn = False ; turn off contour line labels
- resc@cnLevelSelectionMode = "ExplicitLevels"
- resc@mpProjection = "robinson" ; Robinson grid projection
- if (region .eq. "Arctic") then
- resc@mpLimitMode = "LatLon"
- resc@mpMinLatF = 50.
- resc@mpMaxLatF = 85.
- resc@mpMinLonF = -180.
- resc@mpMaxLonF = -95.
- resc@gsnAddCyclic = False
- resc@mpProjection = "CylindricalEquidistant"
- end if
- resc@gsnDraw = True
- resc@gsnFrame = False
- resc@lbAutoManage = False
- resc@lbLabelFontHeightF = 0.010
- resc@txFontHeightF = 0.008
- resc@cnLevels = (/-5.0,-4.0,-3.0,-2.0,-1.0,0.0,1.0,2.0,3.0,4.0,5.0/)
- resc@gsnLeftString = "gC m~S~-2~N~"
- resc@gsnRightString = ""
-
- resc@vpXF = 0.30 ; position and sizes
- resc@vpYF = 0.28 ; for XY plot
- resc@vpWidthF = 0.30
- resc@vpHeightF = 0.30
- resc@gsnCenterString = "TOTECOSYSC Disequil Yr " + yearplt(nyrs-1) + " - " + yearplt(nyrs-2)
- plot = gsn_csm_contour_map(wks,totecosysc_1_map,resc)
-
- resc@vpXF = 0.65 ; position and sizes
- resc@vpYF = 0.28 ; for XY plot
- resc@vpWidthF = 0.30
- resc@vpHeightF = 0.30
- resc@gsnCenterString = "TOTECOSYSC Disequil Yr " + yearplt(nyrs-2) + " - " + yearplt(nyrs-3)
- plot = gsn_csm_contour_map(wks,totecosysc_2_map,resc)
-
- end if
-
- frame(wks)
- delete(plot)
- plot = new(2,"graphic")
-
- delete(res@trYMaxF)
- delete(res@trYMinF)
- res@tiYAxisString = "mm"
- res@tiMainString = "H2OSNO"
- plot(0) = gsn_csm_xy(wks,yearplt,h2osno_glob,res)
-
- res@tiYAxisString = "mm"
- res@tiMainString = "Delta H2OSNO " + "EqYr: "+h2osno_glob_equil
- res@trYMaxF = 5.0
- res@trYMinF = -5.0
- plot(1) = gsn_csm_xy(wks,yearpltmid,h2osno_glob_del,res)
- prim20 = gsn_add_polyline(wks,plot(1),(/0.,yearpltmid(nyrs-2)/),(/0.,0./),polyres1)
- prim21 = gsn_add_polyline(wks,plot(1),(/0.,yearpltmid(nyrs-2)/),(/-glob_thresh_h2osno,-glob_thresh_h2osno/),polyres2)
- prim22 = gsn_add_polyline(wks,plot(1),(/0.,yearpltmid(nyrs-2)/),(/glob_thresh_h2osno,glob_thresh_h2osno/),polyres2)
-
- gsn_panel(wks,plot,(/4,4/),resP)
- frame(wks)
-
- end if ; end do_plot
-
-; Equilibrium summary
- print((/"======================================================================="/))
- print((/"======================================================================="/))
- print((/"EQUILIBRIUM SUMMARY"/))
- print((/"======================================================================="/))
- if (.not.(ismissing(totecosysc_glob_equil))) then
- print((/"TOTECOSYSC is in equilibrium. Eq. Yr. = "+totecosysc_glob_equil/))
- else
- print((/"FATAL: TOTECOSYSC is NOT in equilibrium"/))
- end if
- if (.not.(ismissing(totsomc_glob_equil))) then
- print((/"TOTSOMC is in equilibrium. Eq. Yr. = "+totsomc_glob_equil/))
- else
- print((/"FATAL: TOTSOMC is NOT in equilibrium"/))
- end if
- if (.not.(ismissing(totvegc_glob_equil))) then
- print((/"TOTVEGC is in equilibrium. Eq. Yr. = "+totvegc_glob_equil/))
- else
- print((/"FATAL: TOTVEGC is NOT in equilibrium"/))
- end if
- if (.not.(ismissing(tlai_glob_equil))) then
- print((/"TLAI is in equilibrium. Eq. Yr. = "+tlai_glob_equil/))
- else
- print((/"FATAL: TLAI is NOT in equilibrium"/))
- end if
- if (.not.(ismissing(gpp_glob_equil))) then
- print((/"GPP is in equilibrium. Eq. Yr. = "+gpp_glob_equil/))
- else
- print((/"FATAL: GPP is NOT in equilibrium"/))
- end if
- if (.not.(ismissing(tws_glob_equil))) then
- print((/"TWS is in equilibrium. Eq. Yr. = "+tws_glob_equil/))
- else
- print((/"WARNING: TWS is NOT in equilibrium or is missing"/))
- end if
- if (.not.(ismissing(h2osno_glob_equil))) then
- print((/"H2OSNO is in equilibrium. Eq. Yr. = "+h2osno_glob_equil/))
- else
- print((/"WARNING: H2OSNO is NOT in equilibrium or is missing"/))
- end if
- if (region .ne. "SPT") then
- if (.not.(ismissing(perc_landarea_glob_noequil))) then
- print((/"At least "+(100.-glob_thresh_area)+" percent of the land surface is in TOTECOSYSC equilibrium. Eq. Yr. = "+perc_landarea_glob_noequil/))
- print((/"Percent of the land surface not in equilibrium ("+sprintf("%6.2f",perc_landarea_noequil(nyrs-2))+"% )"/))
- else
- print((/"FATAL: Not enough of the land surface is in equilibrium ("+sprintf("%6.2f",perc_landarea_noequil(nyrs-2))+"% > "+sprintf("%6.2f",glob_thresh_area)+"%)"/))
- end if
- if (.not.(ismissing(totecosysc_glob_equil)) .and. \
- .not.(ismissing(totsomc_glob_equil)) .and. \
- .not.(ismissing(totvegc_glob_equil)) .and. \
- .not.(ismissing(tlai_glob_equil)) .and. \
- .not.(ismissing(gpp_glob_equil)) .and. \
- .not.(ismissing(perc_landarea_glob_noequil))) then
- print((/"Congratulations! Your simulation is in equilibrium"/))
- else
- print((/"FATAL: Your simulation is not in equilibrium, 8 hours have been deducted from your PTO bank, try again"/))
- end if
- print((/"======================================================================="/))
- end if
-
- print ("=========================================")
- print ("Finish Time: "+systemfunc("date") )
- print ("=========================================")
- print ("Successfully ran the script")
-
-end
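The EQUILIBRIUM SUMMARY block in the deleted NCL script above applies a simple rule: the carbon and vegetation diagnostics are hard requirements (FATAL if no equilibrium year was found), while the water stores TWS and H2OSNO only warn. A minimal Python sketch of that pass/fail aggregation (the function name and dict interface are hypothetical, and the extra land-area check applied when region is not "SPT" is omitted):

```python
def equilibrium_summary(eq_year):
    """Sketch of the NCL summary logic: eq_year maps a variable name to its
    equilibrium year, or None where the NCL script has a missing value."""
    fatal_vars = ["TOTECOSYSC", "TOTSOMC", "TOTVEGC", "TLAI", "GPP"]
    warn_vars = ["TWS", "H2OSNO"]
    messages, ok = [], True
    for v in fatal_vars:
        if eq_year.get(v) is not None:
            messages.append(f"{v} is in equilibrium. Eq. Yr. = {eq_year[v]}")
        else:
            # a missing required variable fails the whole simulation
            messages.append(f"FATAL: {v} is NOT in equilibrium")
            ok = False
    for v in warn_vars:
        if eq_year.get(v) is not None:
            messages.append(f"{v} is in equilibrium. Eq. Yr. = {eq_year[v]}")
        else:
            # water stores only warn; they do not block the "Congratulations"
            messages.append(f"WARNING: {v} is NOT in equilibrium or is missing")
    return ok, messages
```

The boolean result corresponds to the script's "Congratulations! Your simulation is in equilibrium" versus FATAL branch.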
diff --git a/tools/contrib/modify_singlept_site b/tools/contrib/modify_singlept_site
deleted file mode 100755
index 4345109ca4..0000000000
--- a/tools/contrib/modify_singlept_site
+++ /dev/null
@@ -1,391 +0,0 @@
-#! /usr/bin/env python
-import sys
-import string
-import subprocess
-from getpass import getuser
-import numpy as np
-import netCDF4 as netcdf4
-import xarray as xr
-
-def mprint(mstr):
- # print() with a single string argument behaves the same in Python 2 and 3
- print(mstr)
-
-'''
-#------------------------------------------------------------------#
-#--------------------- Instructions -----------------------------#
-#------------------------------------------------------------------#
-After creating a single point surface data file from a global
-surface data file, use this script to overwrite some fields with
-site-specific data.
-'''
-
-myname = getuser()
-#-- Create single point CLM surface data file
-create_surfdata = True
-
-#-- specify site from which to extract data ----------
-sitename='US-Ha1'
-
-#-- Site level data directory ------------------------
-site_dir='../PTCLM/PTCLM_sitedata/'
-
-#-- Specify original file
-fsurf = '/glade/scratch/'+myname+'/single_point/surfdata_0.9x1.25_16pfts_CMIP6_simyr1850_287.8_42.5_c170706.nc'
-
-#-- Output directory ---------------------------------
-dir_output='/glade/scratch/'+myname+'/single_point/'
-
-#-- Specify new file name -----------------------------
-command='date "+%y%m%d"'
-x2=subprocess.Popen(command,stdout=subprocess.PIPE,shell=True)
-x=x2.communicate()
-timetag = x[0].strip().decode()
-fsurf2 = dir_output + 'surfdata_0.9x1.25_16pfts_CMIP6_simyr2000_'+sitename+'_site.c'+timetag+'.nc'
-
-#== End User Mods =====================================
-
-#site_code,pft_f1,pft_c1,pft_f2,pft_c2,pft_f3,pft_c3,pft_f4,pft_c4,pft_f5,pft_c5
-pftfile =site_dir+'PTCLMDATA_pftdata.txt'
-#site_code,name,state,lon,lat,elev,startyear,endyear,alignyear,timestep,campaign
-sitefile=site_dir+'PTCLMDATA_sitedata.txt'
-#site_code,soil_depth,n_layers,layer_depth,layer_sand%,layer_clay%
-soilfile=site_dir+'PTCLMDATA_soildata.txt'
-
-#-- Raw datafiles ------------------------
-rawdatafile = '../mksurfdata_map/mksurfdata_map.namelist'
-
-mstr='Checking for data for site: '+sitename
-mprint(mstr)
-
-#-- Read raw datafiles: parse "name = 'value'" namelist lines into a dict ----
-nl = {}
-with open(rawdatafile, 'r') as t1:
- for tmp in t1:
- x=tmp.split('=')
- if len(x) == 2:
- nl[x[0].strip()] = x[1].strip().strip("'")
-
-fsoitex = nl['mksrf_fsoitex']
-forganic = nl['mksrf_forganic']
-flakwat = nl['mksrf_flakwat']
-fwetlnd = nl['mksrf_fwetlnd']
-fmax = nl['mksrf_fmax']
-fglacier = nl['mksrf_fglacier']
-fglacierregion = nl['mksrf_fglacierregion']
-fvocef = nl['mksrf_fvocef']
-furbtopo = nl['mksrf_furbtopo']
-fgdp = nl['mksrf_fgdp']
-fpeat = nl['mksrf_fpeat']
-fsoildepth = nl['mksrf_fsoildepth']
-fabm = nl['mksrf_fabm']
-ftopostats = nl['mksrf_ftopostats']
-fvic = nl['mksrf_fvic']
-fch4 = nl['mksrf_fch4']
-furban = nl['mksrf_furban']
-fvegtyp = nl['mksrf_fvegtyp']
-fhrvtyp = nl['mksrf_fhrvtyp']
-fsoicol = nl['mksrf_fsoicol']
-flai = nl['mksrf_flai']
-
-#Open site file and extract data
-site_found = False
-with open(sitefile, 'r') as t1:
- for tmp in t1:
- x=tmp.split(',')
- if x[0] == sitename:
- site_found = True
- site_header = tmp
- name = x[1]
- state = x[2]
- plon = np.float32(x[3])
- plat = np.float32(x[4])
- elev = np.float32(x[5])
- startyear= np.int32(x[6])
- endyear = np.int32(x[7])
- alignyear= np.int32(x[8])
- timestep = np.float32(x[9])
- campaign = x[10]
- break
-
-if not site_found:
- mprint('No site matching '+sitename+' was found')
- sys.exit(1)
-else:
- mprint(site_header)
-
-# convert longitude to east if needed
-if plon < 0:
- plon+=360.0
-
-#-- Open pft file and extract data ---------------------------------
-site_found = False
-with open(pftfile, 'r') as t1:
- for tmp in t1:
- x=tmp.split(',')
- if x[0] == sitename:
- site_found = True
- pft_f1= np.int32(x[1])
- pft_c1= np.int32(x[2])
- pft_f2= np.int32(x[3])
- pft_c2= np.int32(x[4])
- pft_f3= np.int32(x[5])
- pft_c3= np.int32(x[6])
- pft_f4= np.int32(x[7])
- pft_c4= np.int32(x[8])
- pft_f5= np.int32(x[9])
- pft_c5= np.int32(x[10])
- break
-
-
-#-- Open soil file and extract data ---------------------------------
-site_found = False
-with open(soilfile, 'r') as t1:
- for tmp in t1:
- x=tmp.split(',')
- if x[0] == sitename:
- site_found = True
- soil_depth = np.float32(x[1])
- n_layers = np.int32(x[2])
- layer_depth = np.float32(x[3])
- layer_sand_pct = np.float32(x[4])
- layer_clay_pct = np.float32(x[5])
- break
-
-#-- create surface data file --------------------------------------
-##2
-if create_surfdata:
- f1 = xr.open_dataset(fsurf)
- # expand dimensions
- #f1 = f1.expand_dims(['lsmlat','lsmlon'])
-
- # create 1d variables
- lon0=np.asarray(f1['LONGXY'][0,:])
- lon=xr.DataArray(lon0,name='lon',dims='lsmlon',coords={'lsmlon':lon0})
- lat0=np.asarray(f1['LATIXY'][:,0])
- lat=xr.DataArray(lat0,name='lat',dims='lsmlat',coords={'lsmlat':lat0})
- #f2=f1.assign({'lon':lon,'lat':lat})
- # the above doesn't work now
- f2=f1.assign()
- f2['lon'] = lon
- f2['lat'] = lat
-
- # make gridcell entirely natural vegetation *or* entirely crop
- new_pct_natveg = 0.
- new_pct_crop = 0.
- if np.logical_and(
- np.any([pft_f1,pft_f2,pft_f3,pft_f4,pft_f5]),
- np.any([(pft_c1 < 15),(pft_c2 < 15),(pft_c3 < 15),(pft_c4 < 15),(pft_c5 < 15)])):
- new_pct_natveg = 100.
- if np.logical_and(
- np.any([pft_f1,pft_f2,pft_f3,pft_f4,pft_f5]),
- np.any([(pft_c1 >= 15),(pft_c2 >= 15),(pft_c3 >= 15),(pft_c4 >= 15),(pft_c5 >= 15)])):
- new_pct_crop = 100.
-
- if new_pct_natveg == new_pct_crop:
- mprint('PCT_NATVEG and PCT_CROP are equal (both 0 or both 100), exiting')
- sys.exit(1)
-
- #-- Remove non-vegetated landunits ---------------------------------
- f2['PCT_NATVEG'] = new_pct_natveg
- f2['PCT_CROP'] = new_pct_crop
- f2['PCT_LAKE'] = 0.
- f2['PCT_WETLAND'] = 0.
- f2['PCT_URBAN'] = 0.
- f2['PCT_GLACIER'] = 0.
-
- #-- Overwrite global data with raw data ----------------------------
- f2['LONGXY'] = plon
- f2['LATIXY'] = plat
-
- #-- Soil texture
- r1 = xr.open_dataset(fsoitex)
- plonc = plon
- if plonc > 180.0:
- plonc -= 360.0
-
- # set coordinates (xarray seems to require the names 'lon' and 'lat' to recognize them)
- r1=r1.rename({'LON':'lon','LAT':'lat'})
- r1.set_coords(['lon','lat'],inplace=True)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- # extract sand/clay profiles for given mapunit
- mapunit = np.int32(r2['MAPUNITS'])
-
- f2['PCT_SAND'][:,0,0] = np.asarray(r2['PCT_SAND'][:,mapunit])
- f2['PCT_CLAY'][:,0,0] = np.asarray(r2['PCT_CLAY'][:,mapunit])
-
- r1.close() ; r2.close()
-
- #-- Organic
- r1 = xr.open_dataset(forganic)
- r1=r1.rename({'LON':'lon','LAT':'lat'})
- r1.set_coords(['lon','lat'],inplace=True)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['ORGANIC'][:,0,0] = np.asarray(r2['ORGANIC'])
- r1.close() ; r2.close()
-
- #-- Fmax
- r1 = xr.open_dataset(fmax)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['FMAX'] = np.asarray(r2['FMAX'])
- r1.close() ; r2.close()
-
- #-- VOC
- r1 = xr.open_dataset(fvocef)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['EF1_BTR'] = np.asarray(r2['ef_btr'])
- f2['EF1_CRP'] = np.asarray(r2['ef_crp'])
- f2['EF1_FDT'] = np.asarray(r2['ef_fdt'])
- f2['EF1_FET'] = np.asarray(r2['ef_fet'])
- f2['EF1_GRS'] = np.asarray(r2['ef_grs'])
- f2['EF1_SHR'] = np.asarray(r2['ef_shr'])
- r1.close() ; r2.close()
-
- #-- GDP
- r1 = xr.open_dataset(fgdp)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['gdp'] = np.asarray(r2['gdp'])
- r1.close() ; r2.close()
-
- #-- Peat
- r1 = xr.open_dataset(fpeat)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['peatf'] = np.asarray(r2['peatf'])
- r1.close() ; r2.close()
-
- #-- Soil Depth
- r1 = xr.open_dataset(fsoildepth)
- # create 1d variables
- lon0=np.asarray(r1['LONGXY'][0,:])
- lon=xr.DataArray(lon0,name='lon',dims='lon',coords={'lon':lon0})
- lat0=np.asarray(r1['LATIXY'][:,0])
- lat=xr.DataArray(lat0,name='lat',dims='lat',coords={'lat':lat0})
- r1['lon'] = lon
- r1['lat'] = lat
- r1=r1.rename({'lsmlon':'lon','lsmlat':'lat'})
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['zbedrock'] = np.asarray(r2['Avg_Depth_Median'])
- #f2['zbedrock'] = np.asarray(r2['Avg_Depth_Mean'])
- r1.close() ; r2.close()
-
- #-- ABM
- r1 = xr.open_dataset(fabm)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['abm'] = np.asarray(r2['abm'])
- r1.close() ; r2.close()
-
- #-- SLOPE
- r1 = xr.open_dataset(ftopostats)
- rlon=np.asarray(r1['LONGITUDE'])
- rlat=np.asarray(r1['LATITUDE'])
- # extract gridcell closest to plon/plat (data are 1-d) (lon [0,360])
- k1 = np.argmin(np.power(rlon - plon,2)+np.power(rlat - plat,2))
- f2['SLOPE'] = np.asarray(r1['SLOPE'][k1])
- r1.close() ; r2.close()
-
- #-- VIC
- r1 = xr.open_dataset(fvic)
- r1=r1.rename({'lsmlon':'lon','lsmlat':'lat'})
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['Ws'] = np.asarray(r2['Ws'])
- f2['Dsmax'] = np.asarray(r2['Dsmax'])
- f2['Ds'] = np.asarray(r2['Ds'])
- r1.close() ; r2.close()
-
- #-- Methane
- r1 = xr.open_dataset(fch4)
- # create 1d variables
- lon0=np.asarray(r1['LONGXY'][0,:])
- lon=xr.DataArray(lon0,name='lon',dims='lon',coords={'lon':lon0})
- lat0=np.asarray(r1['LATIXY'][:,0])
- lat=xr.DataArray(lat0,name='lat',dims='lat',coords={'lat':lat0})
- r1['lon'] = lon
- r1['lat'] = lat
- r1=r1.rename({'lsmlon':'lon','lsmlat':'lat'})
- # extract gridcell closest to plon/plat (this file's longitudes are [0,360])
- r2 = r1.sel(lon=plon,lat=plat,method='nearest')
- f2['P3'] = np.asarray(r2['P3'])
- f2['ZWT0'] = np.asarray(r2['ZWT0'])
- f2['F0'] = np.asarray(r2['F0'])
- r1.close() ; r2.close()
-
- #-- Soil Color
- r1 = xr.open_dataset(fsoicol)
- r1=r1.rename({'LON':'lon','LAT':'lat'})
- r1.set_coords(['lon','lat'],inplace=True)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
- f2['SOIL_COLOR'] = np.asarray(r2['SOIL_COLOR'])
- r1.close() ; r2.close()
-
- #-- LAI / Height
- r1 = xr.open_dataset(flai)
- r1=r1.rename({'LON':'lon','LAT':'lat'})
- r1.set_coords(['lon','lat'],inplace=True)
- # extract gridcell closest to plonc/plat
- r2 = r1.sel(lon=plonc,lat=plat,method='nearest')
-
- # ignoring crop, i.e. excluding index 15
- f2['MONTHLY_HEIGHT_BOT'][:,0:15,0,0] = np.asarray(r2['MONTHLY_HEIGHT_BOT'][:,0:15])
- f2['MONTHLY_HEIGHT_TOP'][:,0:15,0,0] = np.asarray(r2['MONTHLY_HEIGHT_TOP'][:,0:15])
- f2['MONTHLY_LAI'][:,0:15,0,0] = np.asarray(r2['MONTHLY_LAI'][:,0:15])
- f2['MONTHLY_SAI'][:,0:15,0,0] = np.asarray(r2['MONTHLY_SAI'][:,0:15])
- r1.close() ; r2.close()
-
- #-- Close output file
- # mode 'w' overwrites file
- f2.to_netcdf(path=fsurf2, mode='w')
- f1.close(); f2.close()
- mprint('successfully created surface data file: '+fsurf2)
-
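The deleted script repeats one pattern for every raw dataset: convert the site longitude to the file's convention, then pick the nearest gridcell, either with `r1.sel(lon=plonc, lat=plat, method='nearest')` or, for files with 1-D coordinate arrays such as the topography statistics, with an argmin over squared distance. A dependency-free sketch of that lookup (helper names are hypothetical; the real script goes through xarray and NumPy):

```python
def to_east_longitude(plon):
    """Convert a [-180, 180) longitude to [0, 360), as the script does
    before selecting from files indexed on eastward longitude."""
    return plon + 360.0 if plon < 0 else plon

def nearest_index(lons, lats, plon, plat):
    """Index of the gridcell closest to (plon, plat) in squared degrees,
    mirroring np.argmin((rlon - plon)**2 + (rlat - plat)**2)."""
    best_i, best_d = 0, float("inf")
    for i, (x, y) in enumerate(zip(lons, lats)):
        d = (x - plon) ** 2 + (y - plat) ** 2
        if d < best_d:
            best_i, best_d = i, d
    return best_i
```

Squared degrees are a crude metric near the poles, but they match what the deleted SLOPE extraction does.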
diff --git a/tools/contrib/run_clm_historical b/tools/contrib/run_clm_historical
deleted file mode 100755
index cd293d5867..0000000000
--- a/tools/contrib/run_clm_historical
+++ /dev/null
@@ -1,365 +0,0 @@
-#!/bin/csh
-
-#########################################################################################
-#
-# - Execute this script to do a CLM historical simulation from 1850 - 2014. This
-# script will complete all the changes required at year 1901 to deal with the
-# fact that met forcing data does not go back to 1850.
-#
-# - Unmodified script will do the following.
-# Part 1: Simulation 1: 1850 - 1870 (21 years) using repeated 1901-1920 forcing
-# Part 2: Simulation 2: 1871 - 1900 (30 years) using repeated 1901-1920 forcing
-# Part 3: Simulation 3+4+5+6: 1901-1988 (four 22 year) simulations using 1901-1988 forcing
-# Part 4: Simulation 7: 1989-2004 (one 16 year) branch simulation w/daily output using 1989-2004 forcing
-# Part 5: Simulation 8: 2005-2014 (one 10 year) branch simulation w/daily & subdaily output using 2005-2014 forcing
-#
-# - Script assumes that simulation can run at least 30 years within a 12 hour block on
-# Cheyenne. To find the timing in an equivalent sample run, look in the timing
-# directory and grep as follows > grep 'simulated_years/day' cesm_timing*
-#
-# - In the env_batch.xml file, for the case.run group, set the wall-clock limit so
-# that you can use up to 12:00 hours of wall-clock computer time per run block
-#
-# - This script assumes that env_mach_pes.xml has been setup and case.setup has already been run
-#
-# - This script makes use of user_nl_datm1901-1920 and user_nl_datm1901-2014
-#
-# - Before submitting script, make a copy of your modified or unmodified user_nl_clm file
-# into "original_user_nl_clm". This should only contain namelist items that will not change throughout
-# the run.
-# Create a file called user_nl_clm_histdaily that contains the desired history output namelist items
-# for the 1989-2004 simulation
-# Create a file called user_nl_clm_histsubdaily that contains the desired history output namelist items
-# for the 2005-2014 simulation
-#
-# - The atm data files start in 1901, so with :
-# ALIGN year of 1901, (this is in units of RUN or simulation years)
-# START year of 1901, (this is in units of FORCE'ing data years)
-#
-# RUN Year : 1850 ... 1860 1861 ... 1870 ... 1880 1881 ... 1890 ... 1900 1901 ... 2014
-# FORCE Year : 1910 ... 1920 1901 ... 1910 ... 1920 1901 ... 1910 ... 1920 1901 ... 2014
-#
-# - The script could be broken up into several parts if you want to check the initial set of
-# simulations.
-#
-# - Written by Andrew Slater - Late July, 2015; aslater@kryos.colorado.edu
-# - Modified by Dave Lawrence, August, 2015
-# - Updated with better check that run has also been archived to short term - Dave Lawrence
-# October, 2015
-# - Updated to adjust for the fact that the model is now slower - Keith Oleson December, 2016
-# ./run_clm_historical.v5.csh >&! run_historical.out &
-# - Extend to 2014 and obtain daily & sub-daily output for end of run - Keith Oleson January, 2018
-# ./run_clm_historical.v6.csh >&! run_historical.out &
-# - Modify history output for CMIP6 - Keith Oleson January, 2019
-# ./run_clm_historical.v7.csh >&! run_historical.out &
-#########################################################################################
-
-#########################################################################################
-# PART 1
-#########################################################################################
-#
-# This portion does the initial setup and the initial 21 year run (1850-1870)
-#
-#########################################################################################
-
-# --- CASENAME is your case name
-set CASENAME = 'clm50_release-clm5.0.15_2deg_GSWP3V1_hist'
-
-# --- Set the user namelist file.
-cp original_user_nl_clm user_nl_clm
-
-# --- Ensure that the env_run.xml file has the correct content
-# Since this particular run won't go for 51 years in 12 hours (as in v4 script), set it up for 21 years first
-./xmlchange RUN_TYPE=startup
-./xmlchange RUN_STARTDATE=1850-01-01
-./xmlchange STOP_OPTION=nyears
-./xmlchange STOP_N=21
-./xmlchange CONTINUE_RUN=FALSE
-./xmlchange RESUBMIT=0
-./xmlchange DATM_CLMNCEP_YR_ALIGN=1901
-./xmlchange DATM_CLMNCEP_YR_START=1901
-./xmlchange DATM_CLMNCEP_YR_END=1920
-
-# need to use user_nl_datm files to get years right
-cp user_nl_datm1901-1920 user_nl_datm
-
-# --- Check that you end up using the correct env_run.xml file
-set nenvr = `ls -1 env_run*.xml | wc -l`
-if ($nenvr > 1) then
- echo 'There is more than one file of the type env_run*.xml'
- echo 'There should only be one such file'
- exit
-endif
-
-# --- If you have not already built the code, then do so now
-#./case.clean_build
-qcmd -- ./case.build
-
-# --- Now submit the job and let it run
-./case.submit
-
-#########################################################################################
-# PART 2
-#########################################################################################
-#
-# This portion checks to see if the 1850-1870 portion of the run is done (or it waits
-# 10 minutes before checking again).
-#
-# This will then start the run from 1871 (hence the CONTINUE_RUN=TRUE) and do one 30 year
-# simulation
-#
-# The new values for env_run.xml are put in place
-# Then submit the job
-#
-#########################################################################################
-
-
-set WDIR = '/glade/scratch/'$USER'/'$CASENAME'/run/'
-set DONE_RUNA = 0
-set DONE_ARCHIVE = 0
-set RESTART_FILE = $WDIR$CASENAME'.clm2.r.1871-01-01-00000.nc'
-
-# --- Check if the first set of simulations have completed and the data archived (every ten minutes)
-while ($DONE_RUNA == 0)
- if (-e $RESTART_FILE) then
- set DONE_RUNA = 1
- echo '1850-1870 run is complete'
- while ($DONE_ARCHIVE == 0)
- set nh0 = `ls -l $WDIR/*clm?.h0.* | egrep -c '^-'`
- echo $nh0
- if ($nh0 == 1) then
- set DONE_ARCHIVE = 1
- echo 'Files have been archived'
- else
- sleep 600
- date
- endif
- else
- sleep 600
- date
- endif
-end
-
-# These are the proper settings to let this script continue the run through 1900
-./xmlchange STOP_N=30
-./xmlchange CONTINUE_RUN=TRUE
-
-# --- Now submit the job and let it run
-./case.submit
-
-#########################################################################################
-# PART 3
-#########################################################################################
-#
-# This portion checks to see if the 1871-1900 portion of the run is done (or it waits
-# 10 minutes before checking again). It then removes (or rather moves and renames) the
-# datm files so that the model will use the full array of data from 1901-2014.
-# This part runs with forcing data files that actually exist for 1901-2014
-#
-# This will start the run from 1901 (hence the CONTINUE_RUN=TRUE) and do four 22 year
-# simulations: 1901 + 4*22 - 1 = 1988 (minus 1 because year 1901 itself counts as the first year)
-#
-# The new values for env_run.xml are put in place
-# Then submit the job
-#
-#########################################################################################
-
-
-set WDIR = '/glade/scratch/'$USER'/'$CASENAME'/run/'
-set DDIR = $WDIR'restart_dump/'
-set DONE_RUNA = 0
-set DONE_ARCHIVE = 0
-set RESTART_FILE = $WDIR$CASENAME'.clm2.r.1901-01-01-00000.nc'
-
-# --- Check if the first set of simulations have completed and the data archived (every ten minutes)
-while ($DONE_RUNA == 0)
- if (-e $RESTART_FILE) then
- set DONE_RUNA = 1
- echo '1850-1900 run is complete'
- while ($DONE_ARCHIVE == 0)
- set nh0 = `ls -l $WDIR/*clm?.h0.* | egrep -c '^-'`
- echo $nh0
- if ($nh0 == 1) then
- set DONE_ARCHIVE = 1
- echo 'Files have been archived'
- else
- sleep 600
- date
- endif
- else
- sleep 600
- date
- endif
-end
-
-# --- If the first two sets of simulations are done, move the datm files and compress them
-if (! -d $DDIR) then
- mkdir $DDIR
-endif
-mv -i $WDIR$CASENAME.datm.rs1*.bin $DDIR
-gzip $DDIR$CASENAME*.bin
-
-# Since this particular run won't go for 55 years in 12 hours, do this in four 22 year chunks, thus
-# we have to resubmit the job 3 times.
-./xmlchange STOP_OPTION=nyears
-./xmlchange STOP_N=22
-./xmlchange DATM_CLMNCEP_YR_ALIGN=1901
-./xmlchange DATM_CLMNCEP_YR_START=1901
-./xmlchange DATM_CLMNCEP_YR_END=2014
-./xmlchange CONTINUE_RUN=TRUE
-./xmlchange RESUBMIT=3
-
-# need to use user_nl_datm files to get years right
-cp user_nl_datm1901-2014 user_nl_datm
-
-# --- Check that you end up using the correct env_run.xml file
-set nenvr = `ls -1 env_run*.xml | wc -l`
-if ($nenvr > 1) then
- echo 'There is more than one file of the type env_run*.xml'
- echo 'There should only be one such file'
- exit
-endif
-
-# --- Now submit the job and let it run
-./case.submit
-
-#########################################################################################
-# PART 4
-#########################################################################################
-#
-# This portion checks to see if the 1901-1988 part of the run is complete
-# and then runs the model for 1989-2004 as a branch run to get daily output
-#
-#########################################################################################
-
-set DONE_RUNA = 0
-set DONE_ARCHIVE = 0
-set RESTART_FILE = $WDIR$CASENAME'.clm2.r.1989-01-01-00000.nc'
-
-# --- Check if the second set of simulations have completed and the data archived (every ten minutes)
-while ($DONE_RUNA == 0)
- if (-e $RESTART_FILE) then
- set DONE_RUNA = 1
- echo '1901-1988 run is complete'
- while ($DONE_ARCHIVE == 0)
- set nh0 = `ls -l $WDIR/*clm?.h0.* | egrep -c '^-'`
- echo $nh0
- if ($nh0 == 1) then
- set DONE_ARCHIVE = 1
- echo 'Files have been archived'
- else
- sleep 600
- date
- endif
- else
- sleep 600
- date
- endif
-end
-
-# --- Ensure that the env_run.xml file has the correct content
-./xmlchange RUN_TYPE=branch
-./xmlchange RUN_REFCASE={$CASENAME}
-./xmlchange RUN_REFDATE=1989-01-01
-./xmlchange STOP_OPTION=nyears
-./xmlchange STOP_N=16
-./xmlchange CONTINUE_RUN=FALSE
-./xmlchange RESUBMIT=0
-
-# --- Add in the daily output streams
-# --- Reset the user namelist file.
-cp original_user_nl_clm user_nl_clm
-
-# --- Add in the daily history output items
-cat user_nl_clm_histdaily >> user_nl_clm
-
-# --- Now submit the job and let it run
-./case.submit
-
-#########################################################################################
-# PART 5
-#########################################################################################
-#
-# This portion checks to see if the 1989-2004 part of the run is complete
-# and then runs the model for 2005-2014 as a branch run to get daily and subdaily output
-#
-#########################################################################################
-
-set DONE_RUNA = 0
-set DONE_ARCHIVE = 0
-set RESTART_FILE = $WDIR$CASENAME'.clm2.r.2005-01-01-00000.nc'
-
-# --- Check if the second set of simulations have completed and the data archived (every ten minutes)
-while ($DONE_RUNA == 0)
- if (-e $RESTART_FILE) then
- set DONE_RUNA = 1
- echo '1989-2004 run is complete'
- while ($DONE_ARCHIVE == 0)
- set nh0 = `ls -l $WDIR/*clm?.h0.* | egrep -c '^-'`
- echo $nh0
- if ($nh0 == 1) then
- set DONE_ARCHIVE = 1
- echo 'Files have been archived'
- else
- sleep 600
- date
- endif
- else
- sleep 600
- date
- endif
-end
-
-# --- Ensure that the env_run.xml file has the correct content
-./xmlchange RUN_TYPE=branch
-./xmlchange RUN_REFCASE={$CASENAME}
-./xmlchange RUN_REFDATE=2005-01-01
-./xmlchange STOP_OPTION=nyears
-./xmlchange STOP_N=10
-./xmlchange CONTINUE_RUN=FALSE
-./xmlchange RESUBMIT=0
-
-# --- Add in the subdaily output streams
-# --- Reset the user namelist file.
-cp original_user_nl_clm user_nl_clm
-
-# --- Add in the daily history output items
-cat user_nl_clm_histdaily >> user_nl_clm
-
-# --- Add in the subdaily history output items
-cat user_nl_clm_histsubdaily >> user_nl_clm
-
-# --- Now submit the job and let it run
-./case.submit
-
-#########################################################################################
-#
-# This portion checks to see if the 2005-2014 part of the run is complete
-# and ends the script
-#
-#########################################################################################
-
-set DONE_RUNA = 0
-set DONE_ARCHIVE = 0
-set RESTART_FILE = $WDIR$CASENAME'.clm2.r.2015-01-01-00000.nc'
-
-# --- Check if the second set of simulations have completed and the data archived (every ten minutes)
-while ($DONE_RUNA == 0)
- if (-e $RESTART_FILE) then
- set DONE_RUNA = 1
- echo '2005-2014 run is complete'
- while ($DONE_ARCHIVE == 0)
- set nh0 = `ls -l $WDIR/*clm?.h0.* | egrep -c '^-'`
- echo $nh0
- if ($nh0 == 1) then
- set DONE_ARCHIVE = 1
- echo 'Files have been archived'
- else
- sleep 600
- date
- endif
- else
- sleep 600
- date
- endif
-end
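The RUN/FORCE table in this deleted script's header follows directly from the DATM settings: with ALIGN=1901 and the 1901-1920 forcing window, run years before 1901 cycle through the 20 available forcing years. A small sketch of that mapping (an illustration of the ALIGN/START/END semantics described in the comments, not the actual DATM code):

```python
def forcing_year(run_year, align=1901, start=1901, end=1920):
    """Map a simulation (RUN) year to the DATM forcing (FORCE) year under
    DATM_CLMNCEP_YR_ALIGN/START/END. Python's % is non-negative for
    negative operands, so pre-ALIGN run years wrap correctly."""
    ncycle = end - start + 1  # length of the repeated forcing window
    return start + (run_year - align) % ncycle
```

With the defaults this reproduces the header's table (run year 1850 maps to forcing year 1910, 1861 wraps back to 1901), and once Part 3 raises END to 2014 the mapping from 1901 onward is the identity.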
diff --git a/tools/contrib/run_clmtowers b/tools/contrib/run_clmtowers
deleted file mode 100755
index 59b276260f..0000000000
--- a/tools/contrib/run_clmtowers
+++ /dev/null
@@ -1,356 +0,0 @@
-#!/bin/csh -f
-#
-# run_clmtowers.csh
-# Purpose: This script will run any number of flux tower sites. You will need to
-# run the script for each spinup and post spinup set of simulations (i.e.,
-# for BGC on, run it separately for AD, PostAD, and post spinup simulations;
-# for BGC off, run it separately for spinup and post spinup simulations)
-# You will need to do two things:
-# 1. Copy this script into $Clm_Tag_Dir/tools/shared/PTCLM
-# where $Clm_Tag_Dir is the location of your clm tag
-# 2. Set up a directory structure where you can put any sourcemods you might want.
-# These sourcemods will be copied into the appropriate case directory.
-# The structure is up to you but here is an example:
-# cd $Clm_Tag_Dir/tools/shared/PTCLM
-# mkdir SourceMods
-# mkdir SourceMods/clm4_5
-# mkdir SourceMods/clm5_0
-# mkdir SourceMods/clm4_5/BASE ; This might contain any sourcemods that you want
-# ; to use in your clm4_5 control experiment
-# mkdir SourceMods/clm5_0/BASE ; This might contain any sourcemods that you want
-# ; to use in your clm5_0 control experiment
-# mkdir SourceMods/clm4_5/EXP1 ; This might contain any sourcemods that you want
-# ; to use in your first clm4_5 experiment
-# mkdir SourceMods/clm5_0/EXP1 ; This might contain any sourcemods that you want
-# ; to use in your first clm5_0 experiment
-# Author: Keith Oleson
-# Last Revised: Jan 24 2019
-# Last CLM Tag that this worked on: release-clm5.0.12
-# Warning: This script is complicated and does not have good (any) error checking currently.
-# You might want to ask me for a quick tutorial before using this.
-#
-# ASSUMES that PTCLMmkdata has already been run for the tower sites chosen below
-# (Surface datasets and shell commands have already been created)
-# To run this script on cheyenne: qcmd -- ./run_clmtowers.csh >&! run_clmtowers.out &
-#
-
-set pwd=`pwd`
-
-# =================================Start User Mods================================
-# Pick a compset (these are the only two compsets supported, they are both SP compsets
-# but BGC will be added if requested below)
-#set compset = I1PtClm45SpGs
-set compset = I1PtClm50SpGs
-if ($compset == I1PtClm45SpGs) then
- set model = clm4_5
-else
- set model = clm5_0
-endif
-
-# Set location of your run directories
-set rundata = /glade/scratch/oleson
-# Set the location of your CLM tag
-set Clm_Tag_Dir = /glade/work/oleson/release-clm5.0.12
-# Set the location of your surface datasets and shell commands that were generated by PTCLM.
-# This will not necessarily be in the same location as the CLM tag that you are running above
-#set User_Mods_Dir = /glade/scratch/oleson/release-clm5.0.12 # This is my version for SP simulations
-set User_Mods_Dir = /glade/scratch/oleson/release-clm5.0.12.BGC # This is my version for BGC simulations
-
-# What sites to run?
-# These are the sites that can be evaluated with some combination of level 2 data and synthesis (gap-filled) data
-#set sites = ( US-Var US-Bo1 US-UMB US-Brw US-ARM US-Ho1 US-Me4 US-Me2 US-Dk3 US-NR1 DE-Tha ES-ES1 FL-Hyy CA-Man BR-Sa3 BR-Sa1 IT-Cpz US-Dk2 US-MOz US-WCr US-MMS US-Ha1 BE-Vie IT-Col CA-Let US-FPe FL-Kaa US-IB1 US-Ne3 CA-Qfo BR-Sa1LBA BR-Sa3LBA BR-Ma1LBA BR-RJA BR-Ji1 CA-Obs CA-Ojp CA-Ca1 CA-Oas US-Dk1)
-#set startyear = ( 2000 1996 1999 1998 2000 1996 1996 2002 1998 1998 1998 1999 1997 1994 2001 2002 2001 2003 2004 1998 1999 1991 1997 1996 1998 2000 2000 2005 2001 2004 2002 2001 2000 2000 1999 2000 2000 1998 1997 2001)
-#set endyear = ( 2007 2008 2006 2006 2007 2004 2000 2010 2005 2007 2003 2005 2005 2003 2003 2004 2005 2005 2007 2006 2007 2006 2005 2001 2007 2007 2005 2007 2006 2006 2004 2003 2005 2002 2001 2006 2006 2006 2006 2005)
-# Or you could just do one site
-set sites = ( US-Var )
-set startyear = ( 2000 )
-set endyear = ( 2007 )
-
-set BGC = "ON" # ON or OFF
-
-# USER MODS FOR BGC ON
-# For BGC on, the sequence of simulations is AD spinup (300 years),
-# PostAD spinup (100 years),
-# post spinup (the number of tower years with atmospheric forcing)
-# For BGC on, AD spinup is SPINUP_P1=TRUE, SPINUP_P2=FALSE
-# PostAD spinup is SPINUP_P1=TRUE, SPINUP_P2=TRUE
-# post spinup is SPINUP_P1=FALSE, SPINUP_P2=FALSE
-
-if ($BGC == ON) then
- setenv SPINUP_P1 "TRUE"
- setenv SPINUP_P2 "TRUE"
-endif
-
-# For BGC on, you could use these for either type of spinup (AD or PostAD)
-if ($BGC == ON) then
- if ($SPINUP_P1 == TRUE) then
- if ($model == clm5_0) then
- set newcase = spinclm50conr12AD
- set clonecase = spinclm50conr12pAD
- else
- if ($model == clm4_5) then
- set newcase = spinclm45conr12AD
- set clonecase = spinclm45conr12pAD
- endif
- endif
- endif
-endif
-
-# For BGC on, you could use these for post spinup (the number of tower years with atmospheric forcing)
-# You should change the "r12" for the clonecase to whatever tag you are using (e.g., r12 is used here to
-# denote release-clm5.0.12) and/or add some designation for your particular experiment with that tag (e.g.,
-# conclm50r12wspinbgc)
-if ($BGC == ON) then
- if ($SPINUP_P1 == FALSE) then
- if ($model == clm5_0) then
- set newcase = spinclm50conr12pAD
- set clonecase = conclm50r12wspinbgc
- else
- if ($model == clm4_5) then
- set newcase = spinclm45conr12pAD
- set clonecase = conclm45r12wspinbgc
- endif
- endif
- endif
-endif
-
-# USER MODS FOR BGC OFF
-# For BGC off, the sequence of simulations is normal spinup (32 years)
-# post spinup (the number of tower years with atmospheric forcing)
-# For BGC off, normal spinup is SPINUP_P1=TRUE, SPINUP_P2=FALSE
-# post spinup is SPINUP_P1=FALSE, SPINUP_P2=FALSE
-if ($BGC == OFF) then
- setenv SPINUP_P1 "FALSE"
- setenv SPINUP_P2 "FALSE"
-endif
-
-# For BGC off, use these for either normal spinup or post spinup
-# You should change the "r12" for the clonecase to whatever tag you are using (e.g., r12 is used here to
-# denote release-clm5.0.12) and/or add some designation for your particular experiment with that tag (e.g.,
-# conclm50r12wspinsp)
-if ($BGC == OFF) then
- if ($model == clm5_0) then
- set newcase = spinclm50conr12sp
- set clonecase = conclm50r12wspinsp
- else
- if ($model == clm4_5) then
- set newcase = spinclm45conr12sp
- set clonecase = conclm45r12wspinsp
- endif
- endif
-endif
-
-# These sourcemods will be copied into every case directory (you will need to setup a
-# directory structure for your sourcemods, see instructions at top of script)
-set sourcemods_dir = {$Clm_Tag_Dir}/tools/PTCLM/SourceMods/
-echo $sourcemods_dir
-set sourcemods = {$sourcemods_dir}{$model}/BASE/*.F90
-echo $sourcemods
-
-# Set some namelist options if required
-# If you set any of these you will need to also set them below (search on namelist_opts)
-#set namelist_opts1 = "paramfile='/glade/p/cgd/tss/people/oleson/modify_param/CLM5_SP_ens_dec_5D_mcalib_psi50BET3_BETKr9_Cropkrmax5e-10_calmbboptleafcn.nc'"
-#set namelist_opts2 = "baseflow_scalar= 0.001d00"
-# BGC
-#set namelist_opts3 = "pot_hmn_ign_counts_alpha= 0.012d00"
-#set namelist_opts4 = "cli_scale= 0.022d00"
-#set namelist_opts5 = "boreal_peatfire_c= 0.2d-4"
-
-# =================================End User Mods================================
-
-@ cnt = 1
-foreach mysite ( $sites )
- @ numyears = $endyear[$cnt] - $startyear[$cnt] + 1
- if ($SPINUP_P1 == TRUE) then
- @ numfour = $numyears / 4
-      # If we have three years or less (numfour = 0), just repeat the first year,
-      # unless the first year is a leap year, in which case use the next year.
-      # Since we are cycling a single year that is not a leap year, we can use
-      # an alignyear of 1, and endyear is the startyear.
- if ( $numfour == 0 ) then
- if ( $startyear[$cnt] % 4 == 0 ) then
- @ startyears = $startyear[$cnt] + 1
- @ endyears = $startyears
- else
- @ endyears = $startyear[$cnt]
- @ startyears = $endyears
- endif
- @ alignyear = 1
- endif
- if ( $numfour != 0 ) then
- @ startyears = $startyear[$cnt]
- @ endyears = $startyear[$cnt] + $numfour * 4 - 1
- @ alignyear = $startyear[$cnt]
- endif
- echo $endyear[$cnt]
- echo $endyears
- echo $startyears
- echo $alignyear
- endif
- cd {$Clm_Tag_Dir}/cime/scripts
- if ($SPINUP_P1 == FALSE) then
- set casename = ${clonecase}_${mysite}_$compset
- ./create_clone --case $casename --clone ${newcase}_${mysite}_${compset}
- else
- if ($BGC == ON && $SPINUP_P2 == TRUE) then
- set casename = ${clonecase}_${mysite}_$compset
- echo $casename
- ./create_clone --case $casename --clone ${newcase}_${mysite}_${compset}
- else
- set casename = ${newcase}_${mysite}_$compset
- ./create_newcase --user-mods-dir {$User_Mods_Dir}/tools/PTCLM/mydatafiles/1x1pt_${mysite} --case $casename --mach cheyenne --compset $compset --res CLM_USRDAT --project P93300041 --run-unsupported
- endif
- endif
- cd {$Clm_Tag_Dir}/cime/scripts/${casename}
- ./xmlchange --id PIO_TYPENAME --val netcdf
- ./xmlchange --id PIO_REARRANGER --val 1
- if ($SPINUP_P1 == FALSE) then
- rm -f cesm.stderr*
- rm -f cesm.stdout*
- rm -f STATUS.out
- ./xmlchange --id STOP_OPTION --val nyears
- ./xmlchange --id STOP_N --val $numyears
- ./xmlchange --id RUN_STARTDATE --val $startyear[$cnt]-01-01
- ./xmlchange --id DATM_CLMNCEP_YR_ALIGN --val $startyear[$cnt]
- ./xmlchange --id DATM_CLMNCEP_YR_START --val $startyear[$cnt]
- ./xmlchange --id DATM_CLMNCEP_YR_END --val $endyear[$cnt]
- ./xmlchange --id CALENDAR --val GREGORIAN
- if ($BGC == ON) then
- ./xmlchange --id CLM_BLDNML_OPTS --val "-mask navy -bgc bgc -crop"
- ./xmlchange --id CLM_ACCELERATED_SPINUP --val "off"
- endif
- else
- if ($BGC == ON && $SPINUP_P2 == TRUE) then
- rm -f cesm.stderr*
- rm -f cesm.stdout*
- rm -f STATUS.out
- ./xmlchange --id STOP_OPTION --val nyears
- ./xmlchange --id STOP_N --val 100
- ./xmlchange --id CLM_BLDNML_OPTS --val "-mask navy -bgc bgc -crop"
- ./xmlchange --id CLM_ACCELERATED_SPINUP --val "off"
- else
- ./xmlchange --id STOP_OPTION --val nyears
- if ($BGC == ON) then
- ./xmlchange --id STOP_N --val 300
- else
- ./xmlchange --id STOP_N --val 32
- endif
- if ($alignyear == 1) then
- ./xmlchange --id RUN_STARTDATE --val 000{$alignyear}-01-01
- else
- ./xmlchange --id RUN_STARTDATE --val $startyear[$cnt]-01-01
- endif
- ./xmlchange --id DATM_CLMNCEP_YR_ALIGN --val $alignyear
- ./xmlchange --id DATM_CLMNCEP_YR_START --val $startyears
- ./xmlchange --id DATM_CLMNCEP_YR_END --val $endyears
- if ($alignyear == 1) then
- ./xmlchange --id CALENDAR --val NO_LEAP
- endif
- if ($BGC == ON) then
- ./xmlchange --id CLM_BLDNML_OPTS --val "-mask navy -bgc bgc -crop"
- ./xmlchange --id CLM_ACCELERATED_SPINUP --val "on"
- endif
- endif
- endif
- if ($mysite == BR-Sa1LBA || $mysite == BR-Sa3LBA || $mysite == BR-Ma1LBA || $mysite == BR-RJA || $mysite == BR-Ji1) then
- if ($SPINUP_P1 == FALSE) then
- rm -f user_datm.streams.txt.CLM1PT.CLM_USRDAT
- endif
- endif
- ./xmlchange --id MAX_TASKS_PER_NODE --val 1
- ./xmlchange --id MAX_MPITASKS_PER_NODE --val 1
- ./case.setup
- ./preview_namelists
- # Have to force this for some reason
- if ($SPINUP_P1 == FALSE) then
- ./xmlchange --id DATM_CLMNCEP_YR_END --val $endyear[$cnt]
- ./preview_namelists
- endif
- if ( $status != 0 )then
- echo "CESM_SETUP FAIL $status" >> ./STATUS.out
- exit -1
- else
- echo "CESM_SETUP PASS" >> ./STATUS.out
- endif
- if ($SPINUP_P1 == TRUE && $BGC == ON) then
- sed "/BSUB -R/d" ./.case.run > tmp.run
- ./xmlchange --subgroup case.run --id JOB_WALLCLOCK_TIME --val 5:59
-# ./xmlchange --subgroup case.run --id JOB_QUEUE --val regular
- ./xmlchange --subgroup case.run --id JOB_QUEUE --val share
- ./xmlchange --subgroup case.run --id PROJECT --val P93300041
- else
- sed "/BSUB -R/d" ./.case.run > tmp.run
-# ./xmlchange --subgroup case.run --id JOB_QUEUE --val regular
- ./xmlchange --subgroup case.run --id JOB_WALLCLOCK_TIME --val 5:59
- ./xmlchange --subgroup case.run --id JOB_QUEUE --val share
- ./xmlchange --subgroup case.run --id PROJECT --val P93300041
- endif
- mv tmp.run ./.case.run
- chmod u+x ./.case.run
- cp $sourcemods SourceMods/src.clm
- echo $mysite
- if ($mysite == BR-Sa1LBA || $mysite == BR-Sa3LBA || $mysite == BR-Ma1LBA || $mysite == BR-RJA || $mysite == BR-Ji1) then
- cp CaseDocs/datm.streams.txt.CLM1PT.CLM_USRDAT ./user_datm.streams.txt.CLM1PT.CLM_USRDAT
- chmod u+wx ./user_datm.streams.txt.CLM1PT.CLM_USRDAT
- sed "s/RH /QBOT /g" user_datm.streams.txt.CLM1PT.CLM_USRDAT > tmp.user_datm.streams.txt.CLM1PT.CLM_USRDAT
- sed "s/ rh/ shum/g" tmp.user_datm.streams.txt.CLM1PT.CLM_USRDAT > tmp2.user_datm.streams.txt.CLM1PT.CLM_USRDAT
- rm -f tmp.user_datm.streams.txt.CLM1PT.CLM_USRDAT
- mv tmp2.user_datm.streams.txt.CLM1PT.CLM_USRDAT ./user_datm.streams.txt.CLM1PT.CLM_USRDAT
- endif
- if ($SPINUP_P1 == FALSE) then
- sed "/taxmode = 'cycle','cycle'/d" user_nl_datm > tmp.user_nl_datm
- mv tmp.user_nl_datm ./user_nl_datm
- sed "s/hist_nhtfrq = 0/hist_nhtfrq = 0,1/g" ./user_nl_clm > tmp.user_nl_clm
- sed "s/hist_mfilt = 1200/hist_mfilt = 1,350400/g" ./tmp.user_nl_clm > tmp2.user_nl_clm
- rm -f tmp.user_nl_clm
- sed "/finidat/d" ./tmp2.user_nl_clm > tmp3.user_nl_clm
- rm -f tmp2.user_nl_clm
- if ($BGC == ON) then
- echo " hist_fincl2 = 'FSDS','FLDS','FSR','FSA','FIRE','FIRA','FSH','FCTR','FCEV','FGEV','FGR','FGR12','FSM','TSOI','COSZEN','RAIN','SNOW','H2OSOI','WA','ZWT','GPP','NEE','ELAI','BTRAN','TV','RSSUN','RSSHA','FSH_G','RHAF','RH_LEAF','RH','T10','TG','SABG','SABV'" >> tmp3.user_nl_clm
- else
- if ($model == clm5_0) then
- echo " hist_fincl2 = 'FSDS','FLDS','FSR','FSA','FIRE','FIRA','FSH','FCTR','FCEV','FGEV','FGR','FGR12','FSM','TSOI','COSZEN','RAIN','SNOW','H2OSOI','WA','ZWT','ELAI','BTRAN','FPSN','TV','RSSUN','RSSHA','FSH_G','RHAF','RH_LEAF','RH','T10','TG','SABG','SABV','VEGWP'" >> tmp3.user_nl_clm
- else
- echo " hist_fincl2 = 'FSDS','FLDS','FSR','FSA','FIRE','FIRA','FSH','FCTR','FCEV','FGEV','FGR','FGR12','FSM','TSOI','COSZEN','RAIN','SNOW','H2OSOI','WA','ZWT','ELAI','BTRAN','FPSN','TV','RSSUN','RSSHA','FSH_G','RHAF','RH_LEAF','RH','T10','TG','SABG','SABV'" >> tmp3.user_nl_clm
- endif
- endif
- set finidat=`ls -1 $rundata/${newcase}_${mysite}_${compset}/run/${newcase}_${mysite}_${compset}.clm?.r.*.nc | tail -1`
- echo $finidat
- echo " finidat = '$finidat'" >> tmp3.user_nl_clm
- mv tmp3.user_nl_clm ./user_nl_clm
- else
- if ($BGC == ON && $SPINUP_P2 == TRUE) then
- set finidat=`ls -1 $rundata/${newcase}_${mysite}_${compset}/run/${newcase}_${mysite}_${compset}.clm?.r.*.nc | tail -1`
- echo $finidat
- echo " finidat = '$finidat'" >> user_nl_clm
- else
- echo "taxmode = 'cycle','cycle'" >> user_nl_datm
-# echo $namelist_opts1 >> user_nl_clm
-# echo $namelist_opts2 >> user_nl_clm
-# echo $namelist_opts3 >> user_nl_clm
-# echo $namelist_opts4 >> user_nl_clm
-# echo $namelist_opts5 >> user_nl_clm
- if ($BGC == ON) then
- sed "s/hist_mfilt = 1200/hist_mfilt = 12000/g" ./user_nl_clm > tmp.user_nl_clm
- echo " hist_fincl1 = 'TOTECOSYSC', 'TOTECOSYSN', 'TOTSOMC', 'TOTSOMN', 'TOTVEGC', 'TOTVEGN', 'TLAI', 'GPP', 'CPOOL', 'NPP', 'TWS', 'FSH', 'FCTR', 'FCEV', 'FGEV', 'QRUNOFF', 'ZWT', 'NEE', 'NEP'" >> tmp.user_nl_clm
- mv tmp.user_nl_clm ./user_nl_clm
- endif
- endif
- endif
- ./case.build
- if ( $status != 0 )then
- echo "BUILD FAIL $status" >> ./STATUS.out
- exit -1
- else
- echo "BUILD PASS" >> ./STATUS.out
- endif
- ./case.submit
- if ( $status != 0 )then
- echo "SUBMIT FAIL $status" >> ./STATUS.out
- exit -1
- else
- echo "SUBMIT PASS" >> ./STATUS.out
- endif
- cd $pwd
- @ cnt++
-end
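The forcing-year arithmetic in the spinup branch of the deleted script above (truncate the tower record to the largest multiple of four years; with fewer than four years, cycle a single non-leap year with an align year of 1) can be sketched in Python. This is an illustrative mirror of the csh logic, not part of the original tooling; the function name is made up:

```python
def spinup_forcing_years(startyear, endyear):
    """Pick DATM start/end/align years for spinup, mirroring the csh logic.

    With four or more tower years, cycle the largest multiple of four years.
    With fewer, repeat one non-leap year and use an align year of 1.
    """
    numyears = endyear - startyear + 1
    numfour = numyears // 4
    if numfour == 0:
        # Skip ahead one year if the first tower year is a leap year
        year = startyear + 1 if startyear % 4 == 0 else startyear
        return year, year, 1
    return startyear, startyear + numfour * 4 - 1, startyear
```

For the default US-Var record (2000-2007) this cycles all eight years; a short record starting on a leap year shifts to the following year.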
diff --git a/tools/contrib/ssp_anomaly_forcing_smooth b/tools/contrib/ssp_anomaly_forcing_smooth
deleted file mode 100755
index fa2350bcb9..0000000000
--- a/tools/contrib/ssp_anomaly_forcing_smooth
+++ /dev/null
@@ -1,343 +0,0 @@
-#! /usr/bin/env python
-#
-# ssp_anomaly_forcing_smooth
-#
-# Create anomaly forcing datasets for SSP scenarios that can be used by the CESM datm model
-#
-import sys
-import os
-import string
-import subprocess
-import datetime
-import numpy as np
-import matplotlib.pyplot as plt
-import netCDF4 as netcdf4
-from scipy import interpolate
-
-# load proper modules first, i.e.
-# cheyenne
-'''
-module load python/2.7.16
-ncar_pylib
-#module load numpy/1.12.0
-#module load matplotlib/2.0.0
-#module load scipy/0.18.1
-#module load intel/16.0.3
-#module load ncarcompilers/0.3.5
-#module load netcdf/4.4.1.1
-#module load netcdf4-python/1.2.7
-'''
-
-# caldera / geyser
-
-'''
-module load python/2.7.7
-module load numpy/1.11.0
-module load pyside/1.1.2
-module load matplotlib/1.5.1
-module load scipy/0.18.1
-module load netcdf4python/1.2.4
-'''
-
-#-------------------------------------------------------
-"""
-
-This script creates CLM anomaly forcing data
-
-"""
-#-------------------------------------------------------
-
-#-- end of function definitions ---------------------------------
-#0
-
-print( "Create anomaly forcing data that can be used by CTSM in CESM" )
-# Input and output directories: make sure they exist
-datapath = "/glade/p/cgd/tss/historyfiles/" # Path on cheyenne
-spath = './'
-if ( os.path.exists(datapath) ):
- print( "Input data directory:"+datapath )
-else:
- sys.exit( "Could not find input directory: "+datapath )
-if ( os.path.exists(spath) ):
- print( "Output data directory:"+spath )
-else:
- sys.exit( "Could not find output directory: "+spath )
-
-# Settings to run with
-today = datetime.date.today()
-creationdate = "_c"+today.strftime( "%Y%m%d" )
-historydate = today.strftime( "%a %b %d %Y" )
-sspnum = 4
-smoothsize = 5
-
-hist_case = 'b.e21.BHIST.f09_g17.CMIP6-historical.010'
-
-if sspnum == 1:
- # SSP1-26
- ssptag = 'SSP1-2.6'
- fut_case = 'b.e21.BSSP126cmip6.f09_g17.CMIP6-SSP1-2.6.001'
-elif sspnum == 2:
- # SSP3-70
- ssptag = 'SSP3-7.0'
- fut_case = 'b.e21.BSSP370cmip6.f09_g17.CMIP6-SSP3-7.0.001'
-elif sspnum == 3:
- # SSP5-85
- ssptag = 'SSP5-8.5'
- fut_case = 'b.e21.BSSP585cmip6.f09_g17.CMIP6-SSP5-8.5.001'
-elif sspnum == 4:
- # SSP2-45
- ssptag = 'SSP2-4.5'
- fut_case = 'b.e21.BSSP245cmip6.f09_g17.CMIP6-SSP2-4.5.001'
-else:
-    sys.exit( "sspnum is out of range: "+str(sspnum) )
-
-sspoutdir = 'anomaly_forcing/CMIP6-'+ssptag
-
-outdir = spath + sspoutdir
-if ( not os.path.exists(outdir) ):
- os.makedirs( outdir )
-
-print( "Output specific data directory :"+outdir )
-
-
-hist_yrstart = 2000
-hist_yrend = 2014
-hist_nyrs = hist_yrend - hist_yrstart + 1
-
-fut1_yrstart = 2015
-fut1_yrend = 2064
-fut1_nyrs = fut1_yrend - fut1_yrstart + 1
-
-fut2_yrstart = 2065
-fut2_yrend = 2100
-fut2_nyrs = fut2_yrend - fut2_yrstart + 1
-
-fut_yrstart = 2015
-fut_yrend = 2100
-fut_nyrs = fut_yrend - fut_yrstart + 1
-
-tot_yrstart = 2000
-tot_yrend = 2100
-tot_nyrs = tot_yrend - tot_yrstart + 1
-
-nmo = 12
-histnm = nmo*hist_nyrs
-futnm = nmo*fut_nyrs
-totnm = nmo*tot_nyrs
-outnm = nmo*fut_nyrs
-
-dpath = datapath
-dfile = '/lnd/proc/tseries/month_1/'
-hdir = dpath+hist_case+dfile
-fdir = dpath+fut_case+dfile
-
-# Check that directories exist
-if ( os.path.exists(hdir) ):
- print( "Data file directory:"+hdir )
-else:
- sys.exit( "Could not find directory: "+hdir )
-if ( os.path.exists(fdir) ):
- print( "Data file directory:"+fdir )
-else:
- sys.exit( "Could not find directory: "+fdir )
-
-print( "\n\n\n" )
-
-# need to use QBOT and U10; not using V and U (for sfcwind)
-field_in = [ 'TBOT', 'RAIN', 'SNOW', 'FSDS', 'FLDS', 'QBOT', 'PBOT']
-field_combine = [ 0, 1, 1, 0, 0, 0, 0]
-field_out = [ 'tas', 'pr', 'pr', 'rsds', 'rlds', 'huss', 'ps']
-units = [ 'K', ' ', ' ', ' ', ' ', 'kg/kg', 'Pa']
-units_disp = [ 'K', 'mm/s', 'mm/s', 'W m!U-2!N', 'W m!U-2!N', 'kg/kg', 'Pa']
-anomsf = ['anomaly','scale factor','scale factor','scale factor','scale factor','anomaly','anomaly']
-
-nfields = len(field_in)
-
-#-- Read coordinates
-landfile = hdir+hist_case+'.clm2.h0.TBOT.'+str(hist_yrstart)+'01-'+str(hist_yrend)+'12.nc'
-if ( os.path.exists(landfile) ):
- print( "Land File: "+landfile )
-else:
- sys.exit( "Could not find land file: "+landfile )
-
-f1 = netcdf4.Dataset(landfile, 'r')
-landfrac=np.asfarray(f1.variables['landfrac'][:,:],np.float64)
-landmask=np.asfarray(f1.variables['landmask'][:,:],np.float64)
-area=np.asfarray(f1.variables['area'][:,:],np.float64)
-lon = np.asfarray(f1.variables['lon'][:],np.float64)
-lat = np.asfarray(f1.variables['lat'][:],np.float64)
-nlat = lat.size
-nlon = lon.size
-f1.close()
-ind=np.where(landfrac > 1.e10)
-landfrac[ind]=0
-
-#-- Loop over forcing fields ------------------------------------
-fieldskip = 0
-for f in range(nfields):
- # read in last ten years of historical data ------------------
-
- infieldname1 = field_in[f]
- infieldcombine1 = field_combine[f]
- if ((infieldcombine1 == 1 and fieldskip == 0) or (infieldcombine1 == 0 and fieldskip == 0)):
- hvarfile1 = hdir+hist_case+'.clm2.h0.'+infieldname1+'.'+str(hist_yrstart)+'01-'+str(hist_yrend)+'12.nc'
- fvarfile1 = fdir+fut_case+'.clm2.h0.'+infieldname1+'.'+str(fut1_yrstart)+'01-'+str(fut1_yrend)+'12.nc'
- fvarfile2 = fdir+fut_case+'.clm2.h0.'+infieldname1+'.'+str(fut2_yrstart)+'01-'+str(fut2_yrend)+'12.nc'
- hf1 = netcdf4.Dataset(hvarfile1, 'r')
- ff1 = netcdf4.Dataset(fvarfile1, 'r')
- ff2 = netcdf4.Dataset(fvarfile2, 'r')
- hvarvalues1 = np.asfarray(hf1.variables[infieldname1][:],np.float64)
- htime1 = np.asfarray(hf1.variables['time'][:],np.float64)
- print( 'Reading: ' + hvarfile1 )
- fvarvalues1 = np.asfarray(ff1.variables[infieldname1][:],np.float64)
- ftime1 = np.asfarray(ff1.variables['time'][:],np.float64)
- long_name = ff1.variables[field_in[f]].long_name
- print( 'Reading: ' + fvarfile1 )
- fvarvalues2 = np.asfarray(ff2.variables[infieldname1][:],np.float64)
- ftime2 = np.asfarray(ff2.variables['time'][:],np.float64)
- print( 'Reading: ' + fvarfile2 )
- hf1.close()
- ff1.close()
- ff2.close()
- if (infieldcombine1 == 1):
- infieldname2 = field_in[f+1]
- infieldcombine2 = field_combine[f+1]
- hvarfile2 = hdir+hist_case+'.clm2.h0.'+infieldname2+'.'+str(hist_yrstart)+'01-'+str(hist_yrend)+'12.nc'
- fvarfile3 = fdir+fut_case+'.clm2.h0.'+infieldname2+'.'+str(fut1_yrstart)+'01-'+str(fut1_yrend)+'12.nc'
- fvarfile4 = fdir+fut_case+'.clm2.h0.'+infieldname2+'.'+str(fut2_yrstart)+'01-'+str(fut2_yrend)+'12.nc'
- hf2 = netcdf4.Dataset(hvarfile2, 'r')
- ff3 = netcdf4.Dataset(fvarfile3, 'r')
- ff4 = netcdf4.Dataset(fvarfile4, 'r')
- hvarvalues1 = hvarvalues1 + np.asfarray(hf2.variables[infieldname2][:],np.float64)
- print( 'Reading: ' + hvarfile2 )
- fvarvalues1 = fvarvalues1 + np.asfarray(ff3.variables[infieldname2][:],np.float64)
- print( 'Reading: ' + fvarfile3 )
- fvarvalues2 = fvarvalues2 + np.asfarray(ff4.variables[infieldname2][:],np.float64)
- print( 'Reading: ' + fvarfile4 )
- hf2.close()
- ff3.close()
- ff4.close()
- fieldskip = 1
-
- allvarvalues = np.concatenate((hvarvalues1,fvarvalues1,fvarvalues2),axis=0)
- alltime = np.concatenate((htime1,ftime1,ftime2),axis=0)
- ftime = np.concatenate((ftime1,ftime2),axis=0)
- outtime = ftime - 16
- histavgvalues = np.zeros((nmo,nlat,nlon))
- histavgcount = np.zeros((nmo))
- runningavgvalues = np.zeros((nlat,nlon))
- runningavgcount = 0.0
- outputvarvalues = np.zeros((outnm,nlat,nlon))
-
- for hmonthindex in range(histnm):
- havgmonthnum = (hmonthindex) % 12 + 1
- havgmonthindex = havgmonthnum - 1
- histavgvalues[havgmonthindex,:,:] = histavgvalues[havgmonthindex,:,:] * histavgcount[havgmonthindex]
- histavgvalues[havgmonthindex,:,:] = histavgvalues[havgmonthindex,:,:] + allvarvalues[hmonthindex,:,:]
- histavgcount[havgmonthindex] = histavgcount[havgmonthindex] + 1.0
- histavgvalues[havgmonthindex,:,:] = histavgvalues[havgmonthindex,:,:] / histavgcount[havgmonthindex]
-
- for fmonthindex in range(futnm):
- allmonthindex = fmonthindex + histnm
- allyearindex = int(allmonthindex / nmo)
- favgmonthnum = (allmonthindex) % 12 + 1
- favgmonthindex = favgmonthnum - 1
-
- firstmonthindex = allmonthindex - nmo * smoothsize
- if allyearindex <= (tot_nyrs - smoothsize):
- lastmonthindex = allmonthindex + nmo * smoothsize
- else:
- lastmonthindex = allmonthindex + nmo * (tot_nyrs - allyearindex)
-
- runningavgvalues = 0.0
- runningavgcount = 0.0
- for smonthindex in range(firstmonthindex,lastmonthindex,nmo):
- runningavgvalues = runningavgvalues * runningavgcount
- runningavgvalues = runningavgvalues + allvarvalues[smonthindex,:,:]
- runningavgcount = runningavgcount + 1.0
- runningavgvalues = runningavgvalues / runningavgcount
-
- climoavgvalues = histavgvalues[favgmonthindex,:,:]
- if anomsf[f] == 'anomaly':
- anomvalues = runningavgvalues - climoavgvalues
-
- if anomsf[f] == 'scale factor':
- anomvalues = np.ones((nlat,nlon),dtype=np.float64)
-
- nonzeroindex = np.where(climoavgvalues != 0.0)
- anomvalues[nonzeroindex] = runningavgvalues[nonzeroindex]/climoavgvalues[nonzeroindex]
-
- max_scale_factor = 5.
- if field_in[f] == 'FSDS':
- max_scale_factor = 2.
- overmaxindex=np.where(anomvalues > max_scale_factor)
- anomvalues[overmaxindex] = max_scale_factor
-
- outputvarvalues[fmonthindex,:,:] = anomvalues
-
- # create netcdf file ---------------------------------
-
- outfilename = outdir + '/'+'af.'+field_out[f]+'.cesm2.'+ssptag+'.'+str(fut_yrstart)+'-'+str(fut_yrend)+creationdate+'.nc'
- print( 'Creating: ' + outfilename )
- outfile = netcdf4.Dataset(outfilename, 'w')
-
- outfile.source_file1 = hvarfile1
- outfile.source_file2 = fvarfile1
- outfile.source_file3 = fvarfile2
- outfile.title = 'anomaly forcing data'
- outfile.note1 = 'Anomaly/scale factors calculated relative to ' \
- +str(hist_yrstart)+'-'+str(hist_yrend) \
- +' climatology from CESM2 historical simulation (case name: '+hist_case+')'
- outfile.note2 = ssptag+' '+str(fut_yrstart)+'-'+str(fut_yrend) \
-                   +' from the CESM simulation (case name: '+fut_case+')'
- outfile.smoothsize = str(smoothsize)
- outfile.history = historydate + ": created by "+sys.argv[0]
- stdout = os.popen( "git describe" )
- outfile.gitdescribe = stdout.read().rstrip()
-
- outfile.createDimension('lat', size=int(nlat))
- outfile.createDimension('lon', size=int(nlon))
- outfile.createDimension('time', size=None)
-
- wtime = outfile.createVariable('time',np.float64,('time',))
- wlat = outfile.createVariable('lat',np.float64,('lat',))
- wlon = outfile.createVariable('lon',np.float64,('lon',))
- wmask = outfile.createVariable('landmask',np.int32,('lat','lon'))
- warea = outfile.createVariable('area',np.float64,('lat','lon'))
- wfrac = outfile.createVariable('landfrac',np.float64,('lat','lon'))
- wvar = outfile.createVariable(field_out[f],np.float64,('time','lat','lon'),fill_value=np.float64(1.e36))
-
- wtime.units = 'days since ' + str(fut_yrstart) + '-01-01 00:00:00'
- wlon.units = 'degrees'
- wlat.units = 'degrees'
- wvar.units = units[f]
- warea.units = 'km2'
- wfrac.units = 'unitless'
- wmask.units = 'unitless'
-
- #wtime.long_name = 'Months since January '+str(fut_yrstart)
- wtime.long_name = 'days since ' + str(fut_yrstart) + '-01-01 00:00:00'
- wlon.long_name = 'Longitude'
- wlat.long_name = 'Latitude'
- wvar.long_name = str(long_name)+' '+anomsf[f]
- warea.long_name = 'Grid cell area'
- wfrac.long_name = 'Grid cell land fraction'
- wmask.long_name = 'Grid cell land mask'
-
- wtime.calendar = 'noleap'
-
- # write to file --------------------------------------------
- #wtime[:] = month
- wtime[:] = outtime
- wlon[:] = lon
- wlat[:] = lat
- wmask[:,:] = landmask
- wfrac[:,:] = landfrac
- warea[:,:] = area
- wvar[:,:,:] = outputvarvalues
-
- else:
- fieldskip = 0
-
-
-print( "\n\nSuccessfully made anomaly forcing datasets\n" )
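The per-field anomaly vs. scale-factor treatment in the loop above (a difference for fields flagged `anomaly`, a capped ratio for fields flagged `scale factor`, with a tighter cap of 2 for FSDS and 5 otherwise) can be sketched with NumPy. This is an illustrative standalone sketch, not part of the deleted script; the function name is made up:

```python
import numpy as np

def anomaly_or_scale(running_avg, climo, kind, max_scale_factor=5.0):
    """Return an anomaly (difference) or a capped scale factor (ratio).

    Where the climatology is zero, the scale factor defaults to 1.0,
    matching the treatment in the script above.
    """
    if kind == 'anomaly':
        return running_avg - climo
    anom = np.ones_like(climo, dtype=np.float64)
    nonzero = climo != 0.0
    anom[nonzero] = running_avg[nonzero] / climo[nonzero]
    # Cap the scale factor (the script uses 2.0 for FSDS, 5.0 otherwise)
    anom[anom > max_scale_factor] = max_scale_factor
    return anom
```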
diff --git a/tools/mkmapdata/README b/tools/mkmapdata/README
deleted file mode 100644
index 00ec339380..0000000000
--- a/tools/mkmapdata/README
+++ /dev/null
@@ -1,93 +0,0 @@
-$CTSMROOT/tools/mkmapdata/README Jun/08/2018
-
-The routines in this directory create a mapping dataset from
-SCRIP grid files to map from one grid to another. These mapping files
-are used by either CLM or mksurfdata_map to regrid from one resolution
-to another.
-
-We have generally moved to "nomask" grid and mapping files. These "nomask"
-files typically contain mask and frac equal to 1 everywhere. During remapping
-we now apply the source masks found in the raw datasets and ignore the
-masks found in the mapping files. Exception: we continue to use a masked
-grid file and mapping file to regrid the 1-km topography.
-
-The script uses ESMF and requires that ESMF be built and the path
-for ESMF binary files (using the program ESMF_RegridWeightGen)
-be given as input to the script. You need to build at least
-two versions, one with mpiuni and one with mpi. Both versions
-also need to be built with NetCDF rather than the default
-IO version.
-
-Currently uses: ESMF7.1.0r
-
-Run the following for help with the different options to the script:
-
- ./mkmapdata.sh -help
-
-The following steps provide a method to create the executable
-and generate the grid map dataset:
-
-0) Background tasks you only have to do once
-
- a.) Export the input SCRIP grid files for the resolutions you'll need
-
- Most of these files are on the Subversion inputdata server at...
-
- https://svn-ccsm-inputdata.cgd.ucar.edu/trunk/inputdata/lnd/clm2/mappingdata/grids/
-
- Supported machines also have a copy on the CESM DIN_LOC_ROOT location
- for that machine.
-
- b.) Obtain and build the versions of ESMF required for this script
-
-The version needs to support ESMF_RegridWeightGen and support the
-options passed to it in the mkmapdata.sh script. As such it needs
-to be built with NetCDF. You also need to build at least one
-version with mpiuni and one with an mpi library. You also need
-a version that supports the options: --netcdf4, --64bit_offset
-and --src_type UGRID.
-
- http://www.earthsystemmodeling.org/
-
-You may need more than one version to do everything above. On cheyenne
-we use ESMF7.1.0r.
-
-The version of NetCDF used with ESMF needs to be version 4.1 or higher
-and compiled with the NetCDF4 file format enabled (with HDF5 compression).
-That will enable the --netcdf4 and --64bit_offset options to be used.
-
-1) cd to this directory
-
-2) Create map dataset(s)
- Option A.) Use mkmapdata.sh directly
-       run the script, e.g. (see the header of mkmapdata.sh for other environment variables that can be set):
-
- Example for standard resolutions
- ./mkmapdata.sh -r 10x15
- Example for non-standard resolutions where you provide an input SCRIP grid file.
- ./mkmapdata.sh -f
-
- Option B.) Alternatively, run regridbatch.sh to run mkmapdata.sh for a bunch of
- different resolutions.
-
- Option C.) Alternatively, run mknoocnmap.pl to create a single-point/regional
- map for an area without ocean.
-
- ./mknoocnmap.pl -help # for help on this script
-
-3) move (and rename if appropriate) generated map datasets
- to $DIN_LOC_ROOT/lnd/clm/mappingdata/maps, etc.
-
-
-Important files:
-
- regridbatch.sh ------- Script to run mkmapdata.sh for many resolutions on cheyenne
- regriddav.sh --------- Script to run mkmapdata.sh for many resolutions on the DAV cluster (Casper)
- mvNimport.sh --------- Script to copy and import mapping files in for many resolutions
- mkmapdata.sh --------- Script to create mapping datasets for a given resolution
-
- mknoocnmap.pl -------- Script to create unity mapping dataset for single-point
- or regional studies over land-only (no ocean).
- mkunitymap.ncl ------- NCL script to create a unity map -- ran by above script
- rmdups.ncl ----------- NCL script to remove duplicate mapping points
-
diff --git a/tools/mkmapdata/createXMLEntries.pl b/tools/mkmapdata/createXMLEntries.pl
deleted file mode 100755
index c65e6888f7..0000000000
--- a/tools/mkmapdata/createXMLEntries.pl
+++ /dev/null
@@ -1,116 +0,0 @@
-#!/usr/bin/env perl
-
-# Creates a file giving XML entries for all the mapping files in the
-# current directory (mapping_entries.txt). Also creates another file
-# giving commands to move these files to the inputdata space
-# (mv_cmds.sh).
-#
-# Should be run with no arguments.
-#
-# See also bld/namelist_files/createMapEntry.pl, and mvNimport.sh in
-# the current directory for scripts that share some of the
-# functionality of this script.
-
-# Bill Sacks
-# March, 2013
-
-use strict;
-
-# ----------------------------------------------------------------------
-# FUNCTIONS
-# ----------------------------------------------------------------------
-
-# Given a map filename, returns a hash giving the resolutions and
-# masks implicit in that filename.
-# Inputs:
-# - filename
-# Output:
-# - hash containing:
-# - filename
-# - from_res
-# - from_mask
-# - to_res
-# - to_mask
-# Or does a bare return if the filename doesn't match the expected pattern
-sub get_resolutions_and_masks {
- my $filename = shift;
-
- # The following match assumes that the destination mask is
- # "nomask". This match will tolerate underscores in the
- # destination grid (e.g., 5x5_amazon), but be careful about
- # underscores in the source grid or source mask!
- if ($filename =~ m/^map_(.*)_(.*)_to_(.*)_nomask/) {
- my $from_res=$1;
- my $from_mask=$2;
- my $to_res=$3;
- my $to_mask="nomask";
-
- my %info = (filename => $filename,
- from_res => $from_res,
- from_mask => $from_mask,
- to_res => $to_res,
- to_mask => $to_mask);
-
- return %info;
- }
- else {
- return;
- }
-}
-
-
-# ----------------------------------------------------------------------
-# PARAMETERS DEFINED HERE
-# ----------------------------------------------------------------------
-
-my $CSMDATA = "/glade/p/cesm/cseg/inputdata";
-my $maps_dir = "lnd/clm2/mappingdata/maps"; # directory where mapping files are stored within the inputdata directory
-
-# ----------------------------------------------------------------------
-# BEGIN MAIN PROGRAM
-# ----------------------------------------------------------------------
-
-my @files = glob "map*.nc";
-
-# Make a hash containing all of the files at each destination resolution.
-# The keys of the hash are destination resolutions; the values are
-# references to arrays of hash references, where these low-level
-# hashes are the return values of get_resolutions_and_masks.
-my %dest_resols;
-foreach my $file (@files) {
- my %info = get_resolutions_and_masks($file);
- if (%info) {
- my $to_res = $info{'to_res'};
- push @{$dest_resols{$to_res}}, \%info;
- }
- else {
- warn "WARNING: $file doesn't match expected mapping filename pattern; skipping\n";
- }
-}
-
-open MAP_ENTRIES, ">", "mapping_entries.txt";
-open MV_CMDS, ">", "mv_cmds.sh";
-
-# Output xml entries (and mv commands) grouped by destination resolution
-foreach my $to_res (sort keys %dest_resols) {
- my $full_maps_dir = "$maps_dir/$to_res";
-
- foreach my $info_ref (@{$dest_resols{$to_res}}) {
- my $filename = ${$info_ref}{'filename'};
- my $from_res = ${$info_ref}{'from_res'};
- my $from_mask = ${$info_ref}{'from_mask'};
- my $to_res = ${$info_ref}{'to_res'};
- my $to_mask = ${$info_ref}{'to_mask'};
-
- print MV_CMDS "mv $filename $CSMDATA/$full_maps_dir/$filename\n";
- print MAP_ENTRIES "$full_maps_dir/$filename \n";
- }
-
- # Print blank line between destination grids
- print MAP_ENTRIES "\n";
-}
-
-system "chmod", "755", "mv_cmds.sh";
-close MAP_ENTRIES;
-close MV_CMDS;
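The filename-parsing regex in the deleted Perl script above can be mirrored in Python for illustration. The filename in the test is a made-up example, and the same caveat applies: the greedy groups tolerate underscores in the destination grid but can mis-split on underscores in the source grid or source mask:

```python
import re

# Destination mask is assumed to be "nomask", as in the Perl script above
MAP_RE = re.compile(r'^map_(.*)_(.*)_to_(.*)_nomask')

def get_resolutions_and_masks(filename):
    """Return source res/mask and destination res, or None if no match."""
    m = MAP_RE.match(filename)
    if m is None:
        return None
    return {'filename': filename,
            'from_res': m.group(1), 'from_mask': m.group(2),
            'to_res': m.group(3), 'to_mask': 'nomask'}
```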
diff --git a/tools/mkmapdata/mkmapdata.sh b/tools/mkmapdata/mkmapdata.sh
deleted file mode 100755
index f1287ab3ef..0000000000
--- a/tools/mkmapdata/mkmapdata.sh
+++ /dev/null
@@ -1,571 +0,0 @@
-#!/bin/bash
-#----------------------------------------------------------------------
-#
-# mkmapdata.sh
-#
-# Create needed mapping files for mksurfdata_map and CLM.
-#
-# Example to run for an output resolution of 4x5
-#
-# mkmapdata.sh -r 4x5
-#
-# valid arguments:
-# -f Input grid filename
-# -t Output type, supported values are [regional, global]
-# -r Output resolution
-# -b use batch mode (not default)
-# -i High resolution mode (Only used with -f)
-# -l list mapping files required (so can use check_input_data to get them)
-# -d debug usage -- display mkmapdata that will be run but don't execute them
-# -v verbose usage -- log more information on what is happening
-# -h displays this help message
-#
-# You can also set the following env variables:
-#
-# ESMFBIN_PATH - Path to ESMF binaries
-# CSMDATA ------ Path to CESM input data
-# MPIEXEC ------ Name of mpirun executable
-# REGRID_PROC -- Number of MPI processors to use
-#
-#----------------------------------------------------------------------
-echo $0
-dir=${0%/*}
-if [ "$dir" = "$0" ];then
- dir="."
-fi
-outfilelist="clm.input_data_list"
-default_res="10x15"
-
-#----------------------------------------------------------------------
-# SET SOME DEFAULTS -- if not set via env variables outside
-
-# Default hostname from the system if not set in the environment
-hostname=${hostname:-`hostname`}
-case $hostname in
-
- ##cheyenne
- cheyenne* | r* )
- if [ -z "$CSMDATA" ]; then
- CSMDATA=/glade/p/cesm/cseg/inputdata
- fi
- ;;
-
- ##hobart/izumi/thorodin
- hobart* | izumi* | thorodin* )
- if [ -z "$CSMDATA" ]; then
- CSMDATA=/fs/cgd/csm/inputdata
- fi
- ;;
-
-esac
-
-#----------------------------------------------------------------------
-# Usage subroutine
-usage() {
- echo ""
- echo "**********************"
- echo "usage:"
- echo "./mkmapdata.sh"
- echo ""
- echo "valid arguments: "
- echo "[-f|--gridfile ] "
- echo " Full pathname of model SCRIP grid file to use "
- echo " This variable should be set if this is not a supported grid"
- echo " This variable will override the automatic generation of the"
- echo " filename generated from the -res argument "
- echo " the filename is generated ASSUMING that this is a supported "
- echo " grid that has entries in the file namelist_defaults_ctsm.xml"
- echo " the -r|--res argument MUST be specified if this argument is specified"
- echo "[-r|--res ]"
- echo " Model output resolution (default is $default_res)"
- echo "[-t|--gridtype ]"
- echo " Model output grid type"
- echo " supported values are [regional,global], (default is global)"
- echo "[-b|--batch]"
- echo " Toggles batch mode usage (and run with mpi). If you want to run in batch mode"
- echo " you need to have a separate batch script for a supported machine"
- echo " that calls this script interactively - you cannot submit this"
- echo " script directly to the batch system"
- echo "[-i|--hires]"
- echo " Output maps are high resolution and large file support should be used"
- echo "[-l|--list]"
- echo " List mapping files required (use check_input_data to get them)"
- echo " also writes data to $outfilelist"
- echo "[-d|--debug]"
- echo " Toggles debug-only (don't actually run mkmapdata just echo what would happen)"
- echo "[-h|--help] "
- echo " Displays this help message"
- echo "[-v|--verbose]"
- echo " Toggle verbose usage -- log more information on what is happening "
- echo "[--fast]"
- echo " Toggle fast maps only -- only create the maps that can be done quickly "
- echo ""
- echo " You can also set the following env variables:"
- echo " ESMFBIN_PATH - Path to ESMF binaries "
- echo " (default is determined by machine running on)"
- echo " CSMDATA ------ Path to CESM input data"
- echo " (default is $CSMDATA)"
- echo " MPIEXEC ------ Name of mpirun executable"
- echo " (default is determined by machine running on)"
- echo " REGRID_PROC -- Number of MPI processors to use"
- echo " (default is $REGRID_PROC)"
- echo ""
- echo "**defaults can be determined on the machines: cheyenne or casper"
- echo ""
- echo "**pass environment variables by preceding above commands "
- echo " with 'env var1=setting var2=setting '"
- echo "**********************"
-}
-#----------------------------------------------------------------------
-# runcmd subroutine
-#----------------------------------------------------------------------
-
-runcmd() {
- cmd=$@
- if [ -z "$cmd" ]; then
- echo "No command given to the runcmd function"
- exit 3
- fi
- if [ "$verbose" = "YES" ]; then
- echo "$cmd"
- fi
- if [ "$debug" != "YES" ]; then
- ${cmd}
- rc=$?
- else
- rc=0
- fi
- if [ $rc != 0 ]; then
- echo "Error status returned from mkmapdata script"
- exit 4
- fi
- return 0
-}
-
-#----------------------------------------------------------------------
-# Process input arguments
-#----------------------------------------------------------------------
-
-interactive="YES"
-debug="no"
-res="default"
-type="global"
-phys="clm4_5"
-verbose="no"
-list="no"
-outgrid=""
-gridfile="default"
-fast="no"
-netcdfout="none"
-
-while [ $# -gt 0 ]; do
- case $1 in
- -v|-V)
- verbose="YES"
- ;;
- -b|--batch)
- interactive="NO"
- ;;
- -d|--debug)
- debug="YES"
- ;;
- --fast)
- fast="YES"
- ;;
- -i|--hires)
- netcdfout="64bit_offset"
- ;;
- -l|--list)
- debug="YES"
- list="YES"
- ;;
- -r|--res)
- res=$2
- shift
- ;;
- -f|--gridfile)
- gridfile=$2
- shift
- ;;
- -t|--gridtype)
- type=$2
- shift
- ;;
- -h|--help )
- usage
- exit 0
- ;;
- * )
- echo "ERROR: invalid argument sent in: $1"
- usage
- exit 1
- ;;
- esac
- shift
-done
-
-echo "Script to create mapping files required by mksurfdata_map"
-
-#----------------------------------------------------------------------
-# Determine output scrip grid file
-#----------------------------------------------------------------------
-
-# Set general query command used below
-QUERY="$dir/../../bld/queryDefaultNamelist.pl -silent -namelist clmexp "
-QUERY="$QUERY -justvalue -options sim_year=2000 -csmdata $CSMDATA"
-echo "query command is $QUERY"
-
-echo ""
-DST_EXTRA_ARGS=""
-if [ "$gridfile" != "default" ]; then
- GRIDFILE=$gridfile
- echo "Using user specified scrip grid file: $GRIDFILE"
- if [ "$res" = "default" ]; then
- echo "When a user specified grid file is given you MUST set the resolution (as the name of your grid)"
- exit 1
- fi
-
- # For now, make the assumption that user-specified grids
- # are in SCRIP format. In the future we may want to
- # provide a command-line option to allow the user to
- # override that default.
- DST_LRGFIL=$netcdfout
- DST_TYPE="SCRIP"
-else
- if [ "$res" = "default" ]; then
- res=$default_res
- fi
-
- QUERYARGS="-res $res -options lmask=nomask"
-
- # Find the output grid file for this resolution using the XML database
- QUERYFIL="$QUERY -var scripgriddata $QUERYARGS -onlyfiles"
- if [ "$verbose" = "YES" ]; then
- echo $QUERYFIL
- fi
- GRIDFILE=`$QUERYFIL`
- echo "Using default scrip grid file: $GRIDFILE"
-
- # Determine extra information about the destination grid file
- DST_LRGFIL=`$QUERY -var scripgriddata_lrgfile_needed $QUERYARGS`
- DST_TYPE=`$QUERY -var scripgriddata_type $QUERYARGS`
- if [ "$DST_TYPE" = "UGRID" ]; then
- # For UGRID, we need extra information: the meshname variable
- dst_meshname=`$QUERY -var scripgriddata_meshname $QUERYARGS`
- DST_EXTRA_ARGS="$DST_EXTRA_ARGS --dst_meshname $dst_meshname"
- fi
-fi
-
-if [ "$type" = "global" ] && [ `echo "$res" | grep -c -E "1x1_|5x5_"` = 1 ]; then
-   echo "This is a regional resolution and yet it is being run as global, set type with '-t' option"
-   exit 1
-fi
-echo "Output grid resolution is $res"
-if [ -z "$GRIDFILE" ]; then
- echo "Output grid file was NOT found for this resolution: $res"
- exit 1
-fi
-
-if [ "$list" = "YES" ]; then
- echo "outgrid = $GRIDFILE"
- echo "outgrid = $GRIDFILE" > $outfilelist
-elif [ ! -f "$GRIDFILE" ]; then
- echo "Input SCRIP grid file does NOT exist: $GRIDFILE"
- echo "Make sure CSMDATA environment variable is set correctly"
- exit 1
-fi
-
-#----------------------------------------------------------------------
-# Determine all input grid files and output file names
-#----------------------------------------------------------------------
-
-if [ "$phys" = "clm4_5" ]; then
- grids=( \
- "0.5x0.5_nomask" \
- "0.25x0.25_nomask" \
- "0.125x0.125_nomask" \
- "3x3min_nomask" \
- "5x5min_nomask" \
- "10x10min_nomask" \
- "0.9x1.25_nomask" \
- "1km-merge-10min_HYDRO1K-merge-nomask" \
- )
-
-else
- echo "ERROR: Unknown value for phys: $phys"
- exit 1
-fi
-
-# Set timestamp for names below
-# The flag `-d "-0 days"` can serve as a time saver as follows:
-# If the script aborted without creating all of the map_ files and
-# the user resubmits to create the remaining files on the next day,
-# the user could change -0 to -1 to prevent the script from
-# duplicating files already generated the day before.
-#
-CDATE="c"`date -d "-0 days" +%y%m%d`
-
-# Set name of each output mapping file
-# First determine the name of the input scrip grid file
-# for each of the above grids
-declare -i nfile=1
-for gridmask in ${grids[*]}
-do
- grid=${gridmask%_*}
- lmask=${gridmask#*_}
-
- QUERYARGS="-res $grid -options lmask=$lmask,glc_nec=10 "
-
- QUERYFIL="$QUERY -var scripgriddata $QUERYARGS -onlyfiles"
- if [ "$verbose" = "YES" ]; then
- echo $QUERYFIL
- fi
- INGRID[nfile]=`$QUERYFIL`
- if [ "$list" = "YES" ]; then
- echo "ingrid = ${INGRID[nfile]}"
- echo "ingrid = ${INGRID[nfile]}" >> $outfilelist
- fi
-
- OUTFILE[nfile]=map_${grid}_${lmask}_to_${res}_nomask_aave_da_$CDATE.nc
-
- # Determine extra information about the source grid file
- SRC_EXTRA_ARGS[nfile]=""
- SRC_LRGFIL[nfile]=`$QUERY -var scripgriddata_lrgfile_needed $QUERYARGS`
- SRC_TYPE[nfile]=`$QUERY -var scripgriddata_type $QUERYARGS`
- if [ "${SRC_TYPE[nfile]}" = "UGRID" ]; then
- # For UGRID, we need extra information: the meshname variable
- src_meshname=`$QUERY -var scripgriddata_meshname $QUERYARGS`
- SRC_EXTRA_ARGS[nfile]="${SRC_EXTRA_ARGS[nfile]} --src_meshname $src_meshname"
- fi
-
- nfile=nfile+1
-done
-
-#----------------------------------------------------------------------
-# Determine supported machine specific stuff
-#----------------------------------------------------------------------
-
-hostname=`hostname`
-if [ -n "$NERSC_HOST" ]; then
- hostname=$NERSC_HOST
-fi
-echo "Hostname = $hostname"
-case $hostname in
- ##cheyenne
- cheyenne* | r* )
- . /glade/u/apps/ch/opt/lmod/8.1.7/lmod/lmod/init/bash
- if [ -z "$REGRID_PROC" ]; then
- REGRID_PROC=36
- fi
- if [ "$interactive" = "YES" ]; then
- REGRID_PROC=1
- fi
- esmfvers=8.2.0.b06
- intelvers=19.1.1
- module purge
- module load intel/$intelvers
-# module load esmf_libs
-# module load esmf_libs/$esmfvers
- module load nco
-
- if [[ $REGRID_PROC > 1 ]]; then
- mpi=mpi
- module load mpt/2.22
- else
- mpi=uni
- fi
-# module load esmf-${esmfvers}-ncdfio-${mpi}-O
- module use /glade/p/cesmdata/cseg/PROGS/modulefiles/esmfpkgs/intel/$intelvers
- module load esmf-${esmfvers}-ncdfio-mpt-g
- if [ -z "$ESMFBIN_PATH" ]; then
- ESMFBIN_PATH=`grep ESMF_APPSDIR $ESMFMKFILE | awk -F= '{print $2}'`
- fi
- if [ -z "$MPIEXEC" ]; then
- MPIEXEC="mpiexec_mpt -np $REGRID_PROC"
- fi
- ;;
-
- ## DAV
- pronghorn* | casper* )
- . /glade/u/apps/ch/opt/lmod/7.2.1/lmod/lmod/init/bash
- if [ -z "$REGRID_PROC" ]; then
- REGRID_PROC=8
- fi
- if [ "$interactive" = "YES" ]; then
- REGRID_PROC=1
- fi
- echo "REGRID_PROC=$REGRID_PROC"
- esmfvers=7.1.0r
- intelvers=17.0.1
- module purge
- module load intel/$intelvers
- if [ $? != 0 ]; then
- echo "Error doing module load: intel/$intelvers"
- exit 1
- fi
- module load nco
- module load netcdf
- module load ncarcompilers
-
- module load esmflibs/$esmfvers
- if [ $? != 0 ]; then
- echo "Error doing module load: esmflibs/$esmfvers"
- exit 1
- fi
-
- if [[ $REGRID_PROC > 1 ]]; then
- mpi=mpi
- echo "MPI option is NOT currently available"
- exit 1
- else
- mpi=uni
- fi
- module load esmf-${esmfvers}-ncdfio-${mpi}-O
- if [ $? != 0 ]; then
- echo "Error doing module load: esmf-${esmfvers}-ncdfio-${mpi}-O"
- exit 1
- fi
- if [ -z "$ESMFBIN_PATH" ]; then
- ESMFBIN_PATH=`grep ESMF_APPSDIR $ESMFMKFILE | awk -F= '{print $2}'`
- fi
- echo "ESMFMKFILE: $ESMFMKFILE"
- echo "LD_LIBRARY_PATH: $LD_LIBRARY_PATH"
-
- if [ -z "$MPIEXEC" ]; then
- MPIEXEC="mpiexec -n $REGRID_PROC"
- fi
- ;;
-
- ##no other machine currently supported
- *)
- echo "Machine $hostname NOT recognized"
- ;;
-
-esac
-
-# Error checks
-if [ ! -d "$ESMFBIN_PATH" ]; then
- echo "Path to ESMF binary directory does NOT exist: $ESMFBIN_PATH"
- echo "Set the environment variable: ESMFBIN_PATH"
- exit 1
-fi
-
-#----------------------------------------------------------------------
-# Generate the mapping files needed for surface dataset generation
-#----------------------------------------------------------------------
-
-# Resolve interactive or batch mode command
-# NOTE - if you want to run in batch mode - you need to have a separate
-# batch file that calls this script interactively - you cannot submit
-# this script to the batch system
-
-if [ "$interactive" = "NO" ]; then
- echo "Running in batch mode using MPI"
- if [ -z "$MPIEXEC" ]; then
- echo "Name of MPI exec to use was NOT set"
- echo "Set the environment variable: MPIEXEC"
- exit 1
- fi
- if [ ! -x `which ${MPIEXEC%% *}` ]; then
- echo "The MPIEXEC pathname given is NOT an executable: ${MPIEXEC%% *}"
- echo "Set the environment variable: MPIEXEC or run in interactive mode without MPI"
- exit 1
- fi
- mpirun=$MPIEXEC
- echo "Running in batch mode"
-else
- mpirun=""
-fi
-
-ESMF_REGRID="$ESMFBIN_PATH/ESMF_RegridWeightGen"
-if [ ! -x "$ESMF_REGRID" ]; then
- echo "ESMF_RegridWeightGen does NOT exist in ESMF binary directory: $ESMFBIN_PATH"
- echo "Upgrade to a newer version of ESMF with this utility included"
- echo "Set the environment variable: ESMFBIN_PATH"
- exit 1
-fi
-
-# Remove previous log files
-rm -f PET*.Log
-
-#
-# Now run the mapping for each file, checking that input files exist
-# and then afterwards that the output mapping file exists
-#
-declare -i nfile=1
-until ((nfile>${#INGRID[*]})); do
- echo "Creating mapping file: ${OUTFILE[nfile]}"
- echo "From input grid: ${INGRID[nfile]}"
- echo "For output grid: $GRIDFILE"
- echo " "
- if [ -z "${INGRID[nfile]}" ] || [ -z "$GRIDFILE" ] || [ -z "${OUTFILE[nfile]}" ]; then
- echo "Either input or output grid or output mapping file is NOT set"
- exit 3
- fi
- if [ ! -f "${INGRID[nfile]}" ]; then
- echo "Input grid file does NOT exist: ${INGRID[nfile]}"
- if [ ! "$list" = "YES" ]; then
- exit 2
- fi
- fi
- if [ ! -f "$GRIDFILE" ]; then
- echo "Output grid file does NOT exist: $GRIDFILE"
- exit 3
- fi
-
- # Determine what (if any) large file support is needed. Use the
- # most extreme large file support needed by either the source file
- # or the destination file.
- if [ "$DST_LRGFIL" = "netcdf4" ] || [ "${SRC_LRGFIL[nfile]}" = "netcdf4" ]; then
- lrgfil="--netcdf4"
- elif [ "$DST_LRGFIL" = "64bit_offset" ] || [ "${SRC_LRGFIL[nfile]}" = "64bit_offset" ]; then
- lrgfil="--64bit_offset"
- elif [ "$DST_LRGFIL" = "none" ] && [ "${SRC_LRGFIL[nfile]}" = "none" ]; then
- lrgfil=""
- else
- echo "Unknown LRGFIL type:"
- echo "DST_LRGFIL = $DST_LRGFIL"
- echo "SRC_LRGFIL = ${SRC_LRGFIL[nfile]}"
- exit 4
- fi
-
- # Skip if file already exists
- if [ -f "${OUTFILE[nfile]}" ]; then
-    echo "Skipping creation of ${OUTFILE[nfile]} as it already exists"
- # Skip if large file and fast mode is on
- elif [ "$fast" = "YES" ] && [ "${SRC_LRGFIL[nfile]}" = "netcdf4" ]; then
-    echo "Skipping creation of ${OUTFILE[nfile]} because fast mode is on and this is a large NetCDF4 file"
- else
-
- cmd="$mpirun $ESMF_REGRID --ignore_unmapped -s ${INGRID[nfile]} "
- cmd="$cmd -d $GRIDFILE -m conserve -w ${OUTFILE[nfile]}"
- if [ $type = "regional" ]; then
- cmd="$cmd --dst_regional"
- fi
-
- cmd="$cmd --src_type ${SRC_TYPE[nfile]} ${SRC_EXTRA_ARGS[nfile]} --dst_type $DST_TYPE $DST_EXTRA_ARGS"
- cmd="$cmd $lrgfil"
-
- runcmd $cmd
-
- if [ "$debug" != "YES" ] && [ ! -f "${OUTFILE[nfile]}" ]; then
- echo "Output mapping file was NOT created: ${OUTFILE[nfile]}"
- exit 6
- fi
- # add some metadata to the file
- HOST=`hostname`
- history="$ESMF_REGRID"
- runcmd "ncatted -a history,global,a,c,"$history" ${OUTFILE[nfile]}"
- runcmd "ncatted -a hostname,global,a,c,$HOST -h ${OUTFILE[nfile]}"
- runcmd "ncatted -a logname,global,a,c,$LOGNAME -h ${OUTFILE[nfile]}"
- fi
-
- nfile=nfile+1
-done
-
-echo "Successfully created needed mapping files for $res"
-
-exit 0
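The core of the deleted mkmapdata.sh is the construction of an `ESMF_RegridWeightGen` command line per source grid. A minimal sketch of that pattern, with hypothetical grid filenames (the real script pulls them from the XML namelist database):

```shell
#!/bin/bash
# Sketch of the command-construction pattern the deleted mkmapdata.sh used.
# The grid filenames below are hypothetical placeholders, not real inputs.
build_regrid_cmd() {
    local src_grid=$1 dst_grid=$2 out_map=$3 grid_type=$4
    local cmd="ESMF_RegridWeightGen --ignore_unmapped -s $src_grid"
    cmd="$cmd -d $dst_grid -m conserve -w $out_map"
    # regional destination grids need an extra flag, as in the script
    if [ "$grid_type" = "regional" ]; then
        cmd="$cmd --dst_regional"
    fi
    echo "$cmd"
}

build_regrid_cmd src_grid.nc dst_grid.nc map.nc regional
# prints: ESMF_RegridWeightGen --ignore_unmapped -s src_grid.nc -d dst_grid.nc -m conserve -w map.nc --dst_regional
```

Building the command as a string first is what let the script support both a `--debug` mode (echo only) and an MPI-prefixed batch mode.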
diff --git a/tools/mkmapdata/mknoocnmap.pl b/tools/mkmapdata/mknoocnmap.pl
deleted file mode 100755
index 5438c8bcd3..0000000000
--- a/tools/mkmapdata/mknoocnmap.pl
+++ /dev/null
@@ -1,302 +0,0 @@
-#!/usr/bin/env perl
-#
-# mknoocnmap.pl Erik Kluzek
-# Dec/07/2011
-#
-# Create SCRIP grid and mapping files for a single-point or region
-# that is assumed to be land-only.
-#
-use Cwd;
-use strict;
-use English;
-use IO::File;
-use Getopt::Long;
-
-#
-# Global constants
-#
-my $degsiz = 0.1;
-
-#-----------------------------------------------------------------------------------------------
-# Set the directory that contains this script. If the command was issued using a
-# relative or absolute path, that path is in $ProgDir. Otherwise assume the
-# command was issued from the current working directory.
-
-(my $ProgName = $0) =~ s!(.*)/!!; # name of this script
-my $ProgDir = $1; # name of directory containing this script -- may be a
- # relative or absolute path, or null if the script
- # is in the user's PATH
-my $cmdline = "@ARGV"; # Command line arguments to script
-my $cwd = getcwd(); # current working directory
-my $scrdir; # absolute pathname of directory that contains this script
-my $nm = "${ProgName}::"; # name to use if script dies
-if ($ProgDir) {
- $scrdir = absolute_path($ProgDir);
-} else {
- $scrdir = $cwd;
-}
-
-
-#-----------------------------------------------------------------------------------------------
-
-sub usage {
-    die <<EOF;
-SYNOPSIS
-     $ProgName -centerpoint <lat,lon> -name <name> [options]
-
-REQUIRED OPTIONS
-     -centerpoint [or -p] <lat,lon> Center latitude,longitude of the grid to create.
-     -name [-or -n] <name>          Name to use to describe point
-
-OPTIONS
-     -dx <number>          Size of total grid in degrees in longitude direction
-                           (default is $degsiz)
-     -dy <number>          Size of total grid in degrees in latitude direction
-                           (default is $degsiz)
-     -silent [or -s]       Make output silent
-     -help [or -h]         Print usage to STDOUT.
-     -verbose [or -v]      Make output more verbose.
-     -nx <number>          Number of longitudes (default is 1)
-     -ny <number>          Number of latitudes (default is 1)
-EOF
-}
-
-#-----------------------------------------------------------------------------------------------
-
-sub get_latlon {
-#
-# Return the latitude and longitude of the input string and validate it
-#
- my $string = shift;
- my $desc = shift;
- my $dx = shift;
- my $dy = shift;
-
- my $lat = undef;
- my $lon = undef;
- my $valreal1 = "[+-]?[0-9]*\.?[0-9]*[EedDqQ]?[0-9+-]*";
-
- if ( $string =~ /^($valreal1)\s*,\s*($valreal1)$/ ) {
- $lat = $1;
- $lon = $2;
- } else {
- die <<"EOF";
-** $ProgName - Error in entering latitude/longitude for $desc **
-EOF
- }
- if ( $dx <= 0.0 || $dx > 360. ) {
- die <<"EOF";
-** $ProgName - Bad value for dx (=$dx) for $desc **
-EOF
- }
- if ( $dy <= 0.0 || $dy > 180. ) {
- die <<"EOF";
-** $ProgName - Bad value for dy (=$dy) for $desc **
-EOF
- }
- if ( ($lat < -90.+$dy/2.0) || ($lat > 90.0-$dy/2.0) ) {
- die <<"EOF";
-** $ProgName - Bad value for latitude (=$lat) for $desc **
-EOF
- }
- if ( ($lon < $dx/2.0) || ($lon > 360.0-$dx/2.0) ) {
- die <<"EOF";
-** $ProgName - Bad value for longitude (=$lon) for $desc **
-EOF
- }
- return( $lat, $lon );
-
-}
-
-#-------------------------------------------------------------------------------
-
-sub absolute_path {
-#
-# Convert a pathname into an absolute pathname, expanding any . or .. characters.
-# Assumes pathnames refer to a local filesystem.
-# Assumes the directory separator is "/".
-#
- my $path = shift;
- my $cwd = getcwd(); # current working directory
- my $abspath; # resulting absolute pathname
-
-# Strip off any leading or trailing whitespace. (This pattern won't match if
-# there's embedded whitespace.)
- $path =~ s!^\s*(\S*)\s*$!$1!;
-
-# Convert relative to absolute path.
-
- if ($path =~ m!^\.$!) { # path is "."
- return $cwd;
- } elsif ($path =~ m!^\./!) { # path starts with "./"
- $path =~ s!^\.!$cwd!;
- } elsif ($path =~ m!^\.\.$!) { # path is ".."
- $path = "$cwd/..";
- } elsif ($path =~ m!^\.\./!) { # path starts with "../"
- $path = "$cwd/$path";
- } elsif ($path =~ m!^[^/]!) { # path starts with non-slash character
- $path = "$cwd/$path";
- }
-
- my ($dir, @dirs2);
- my @dirs = split "/", $path, -1; # The -1 prevents split from stripping trailing nulls
- # This enables correct processing of the input "/".
-
- # Remove any "" that are not leading.
- for (my $i=0; $i<=$#dirs; ++$i) {
- if ($i == 0 or $dirs[$i] ne "") {
- push @dirs2, $dirs[$i];
- }
- }
- @dirs = ();
-
- # Remove any "."
- foreach $dir (@dirs2) {
- unless ($dir eq ".") {
- push @dirs, $dir;
- }
- }
- @dirs2 = ();
-
- # Remove the "subdir/.." parts.
- foreach $dir (@dirs) {
- if ( $dir !~ /^\.\.$/ ) {
- push @dirs2, $dir;
- } else {
- pop @dirs2; # remove previous dir when current dir is ..
- }
- }
- if ($#dirs2 == 0 and $dirs2[0] eq "") { return "/"; }
- $abspath = join '/', @dirs2;
- return( $abspath );
-}
-
-#-------------------------------------------------------------------------------
-
-# Process command-line options
-
-my %opts = (
- ctr => undef,
- help => undef,
- name => undef,
- nx => 1,
- ny => 1,
- dx => $degsiz,
- dy => $degsiz,
- silent => 0,
- verbose => 0,
- );
-
-GetOptions(
- "p|centerpoint=s" => \$opts{'ctr'},
- "n|name=s" => \$opts{'name'},
- "nx=i" => \$opts{'nx'},
- "ny=i" => \$opts{'ny'},
- "dx=f" => \$opts{'dx'},
- "dy=f" => \$opts{'dy'},
- "h|help" => \$opts{'help'},
- "s|silent" => \$opts{'silent'},
- "v|verbose" => \$opts{'verbose'},
-) or usage();
-
-# Check for unparsed arguments
-if (@ARGV) {
- print "ERROR: unrecognized arguments: @ARGV\n";
- usage();
-}
-
-if ( $opts{'verbose'} && $opts{'silent'} ) {
- print "ERROR: Can NOT set both silent and verbose at once!\n";
- usage();
-}
-my $printlev;
-if ( $opts{'verbose'} ) {
- $printlev = 2;
-} elsif ( $opts{'silent'} ) {
- $printlev = 0;
-} else {
- $printlev = 1;
-}
-
-if ( ! defined($opts{'ctr'}) ) {
- print "ERROR: MUST set the center point\n";
- usage();
-}
-if ( ! defined($opts{'name'}) ) {
- print "ERROR: MUST set the name of the point\n";
- usage();
-}
-my $name = $opts{'name'};
-
-my ($lat,$lon) = get_latlon( $opts{'ctr'}, $name, $opts{'dx'}, $opts{'dy'} );
-my $S_lat = $lat - $opts{'dy'}/2.0;
-my $N_lat = $lat + $opts{'dy'}/2.0;
-my $W_lon = $lon - $opts{'dx'}/2.0;
-my $E_lon = $lon + $opts{'dx'}/2.0;
-
-my $nx = $opts{'nx'};
-my $ny = $opts{'ny'};
-if ( $opts{'nx'} < 1 ) {
- print "ERROR: nx MUST be greater than or equal to 1\n";
- usage();
-}
-if ( $opts{'ny'} < 1 ) {
- print "ERROR: ny MUST be greater than or equal to 1\n";
- usage();
-}
-
-#-----------------------------------------------------------------------------------------------
-my $print;
-if ( $printlev > 1 ) {
- $print = "PRINT=TRUE";
-}
-
-# Creation date
-my $cdate = `date +%y%m%d`; chomp( $cdate );
-
-if ( $printlev > 0 ) {
- print "\n\nCreate SCRIP grid and mapping files for a single-point\n";
-}
-# land grid...
-my $GITDES = `cd $scrdir && git describe`;
-$GITDES =~ s/\n//g;
-my $grddir = absolute_path( "$scrdir/../mkmapgrids" );
-my $cwd = getcwd();
-my $datdir = $grddir;
-if ( $cwd ne $scrdir ) {
- $datdir = $cwd;
-}
-my $grid1 = "$datdir/SCRIPgrid_${name}_nomask_c${cdate}.nc";
-my $cmdenv = "env S_LAT=$S_lat W_LON=$W_lon N_LAT=$N_lat E_LON=$E_lon " .
- "NX=$nx NY=$ny PTNAME=$name GITDES=$GITDES $print ";
-
-my $cmd = "$cmdenv GRIDFILE=$grid1 ncl $scrdir/../mkmapgrids/mkscripgrid.ncl";
-if ( $printlev > 0 ) {
- print "Create land SCRIP gridfile\n";
- print "Execute: $cmd\n";
-}
-system( $cmd );
-
-# ocean grid...
-my $grid2 = "$datdir/SCRIPgrid_${name}_noocean_c${cdate}.nc";
-my $cmd = "$cmdenv GRIDFILE=$grid2 IMASK=0 ncl $scrdir/../mkmapgrids/mkscripgrid.ncl";
-if ( $printlev > 0 ) {
- print "Create ocean SCRIP gridfile\n";
- print "Execute: $cmd\n";
-}
-system( $cmd );
-
-# Now create a unity mapping between the two...
-# Note reversal of grid1 & grid2, because we want an ocean -> land
-# mapping file
-my $mapfile = "$datdir/map_${name}_noocean_to_${name}_nomask_aave_da_${cdate}.nc";
-my $cmd = "env GRIDFILE1=$grid2 GRIDFILE2=$grid1 MAPFILE=$mapfile " .
- "GITDES=$GITDES $print ncl $scrdir/mkunitymap.ncl";
-
-if ( $printlev > 0 ) {
- print "Create unity mapping file between the two gridfiles\n";
- print "Execute: $cmd\n";
-}
-system( $cmd );
-
-if ( $printlev > 0 ) {
- print "\n\nSuccessfully created grid/mapping files for single-point\n";
-}
diff --git a/tools/mkmapdata/mkunitymap.ncl b/tools/mkmapdata/mkunitymap.ncl
deleted file mode 100644
index 7570f7eedd..0000000000
--- a/tools/mkmapdata/mkunitymap.ncl
+++ /dev/null
@@ -1,166 +0,0 @@
-;
-; mkunitymap.ncl
-;
-; Create a unity map file either between two identical grids or between two
-; grids that do NOT intersect at all.
-;
-; Erik Kluzek
-; Dec/07/2011
-;
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
-begin
- ; Set a few constants needed later
- cdate = systemfunc( "date +%y%m%d" );
- ldate = systemfunc( "date" );
- ; ===========================================================================================================
- ;
- ; IMPORTANT NOTE: EDIT THE FOLLOWING TO CUSTOMIZE or use ENV VARIABLE SETTINGS
- ; Edit the following as needed to interpolate to a new resolution.
- gridfile1 = getenv("GRIDFILE1"); ; Get name of the first SCRIP grid file
- gridfile2 = getenv("GRIDFILE2"); ; Get name of the second SCRIP grid file
-
- outfilename = getenv("MAPFILE"); ; Get name of the output mapping file
-
- print_str = getenv("PRINT"); ; Do Extra printing for debugging
-
- gitdescribe = getenv("GITDES"); ; Git describe from the source clone
-
- if ( ismissing(gridfile1) )then
- print( "ERROR: GRIDFILE1 is missing!" );
- exit
- end if
- if ( ismissing(gridfile2) )then
- print( "ERROR: GRIDFILE2 is missing!" );
- exit
- end if
- if ( ismissing(outfilename) )then
- print( "ERROR: MAPFILE is missing!" );
- exit
- end if
- if ( ismissing(print_str) )then
- printn = False;
- else
- if ( print_str .eq. "TRUE" )then
- printn = True;
- else
- printn = False;
- end if
- end if
-
- if ( ismissing(gitdescribe) )then
- gitdescribe = systemfunc( "git describe" )
- end if
-
- ;
- ; Open up the input grid files
- ;
- nca = addfile( gridfile1, "r" );
- ncb = addfile( gridfile2, "r" );
-
- system( "/bin/rm -f "+outfilename );
- if ( printn )then
- print( "output mapping file to create: "+outfilename );
- end if
- nc = addfile( outfilename, "c" );
- ;
- ; Define dimensions
- ;
- n_a = dimsizes( nca->grid_center_lat );
- n_b = dimsizes( ncb->grid_center_lat );
- if ( n_a .ne. n_b )then
- print( "ERROR: dimensions of input SCRIP grid files are NOT the same!" );
- exit
- end if
- if ( any(ncb->grid_imask .ne. 1.0d00) )then
- print( "ERROR: the mask of the second file isn't identically 1!" );
- print( "(second file should be land grid file)");
- exit
- end if
- chkvars = (/ "grid_center_lat", "grid_center_lon", "grid_corner_lat", "grid_corner_lon" /);
- do i = 0, dimsizes(chkvars)-1
- if ( any(nca->$chkvars(i)$ .ne. ncb->$chkvars(i)$) )then
- print( "ERROR: the grid variables are different between the two files!: "+chkvars(i) );
- exit
- end if
- end do
- n_s = n_a;
- dimnames = (/ "n_a", "n_b", "n_s", "nv_a", "nv_b", "num_wgts", "src_grid_rank", "dst_grid_rank" /);
- dsizes = (/ n_a, n_b, n_a, 4, 4, 1, 2, 2/);
- is_unlim = (/ False, False, False, False, False, False, False, False /);
- filedimdef( nc, dimnames, dsizes, is_unlim );
-
- ;
- ; Define grid dimensions
- ;
- filevardef( nc, "src_grid_dims", "integer", (/ "src_grid_rank" /))
- nc->src_grid_dims = (/nca->grid_dims/)
- filevardef( nc, "dst_grid_dims", "integer", (/ "dst_grid_rank" /))
- nc->dst_grid_dims = (/ncb->grid_dims/)
-
- ;
- ; Define variables
- ;
- cvars = (/ "yc", "xc", "yv", "xv", "mask" /);
- gvars = (/ "grid_center_lat", "grid_center_lon", "grid_corner_lat", "grid_corner_lon", "grid_imask" /);
-
- do i = 0, dimsizes(cvars)-1
- var = cvars(i)+"_a";
- if ( cvars(i) .eq. "yv" .or. cvars(i) .eq. "xv" )then
- dnamesa = (/ "n_a", "nv_a" /);
- dnamesb = (/ "n_b", "nv_b" /);
- else
- dnamesa = (/ "n_a" /);
- dnamesb = (/ "n_b" /);
- end if
- filevardef ( nc, var, typeof(nca->$gvars(i)$), dnamesa );
- filevarattdef ( nc, var, nca->$gvars(i)$ );
- nc->$var$ = (/ nca->$gvars(i)$ /);
- var = cvars(i)+"_b";
- filevardef ( nc, var, typeof(nca->$gvars(i)$), dnamesb );
- filevarattdef ( nc, var, ncb->$gvars(i)$ );
- nc->$var$ = (/ ncb->$gvars(i)$ /);
- delete( dnamesa );
- delete( dnamesb );
- end do
- filevardef ( nc, "area_a", "double", (/ "n_a" /) );
- filevardef ( nc, "area_b", "double", (/ "n_b" /) );
- filevardef ( nc, "frac_a", "double", (/ "n_a" /) );
- filevardef ( nc, "frac_b", "double", (/ "n_b" /) );
- ;
- ; Attributes
- ;
- nc->area_a@units = "square radians";
- nc->frac_a@units = "unitless";
- nc->area_b@units = nc->area_a@units;
- nc->frac_b@units = nc->frac_a@units;
- nc@conventions = "NCAR-CESM";
- nc@domain_a = gridfile1;
- nc@domain_b = gridfile2;
- nc@grid_file_src = gridfile1;
- nc@grid_file_dst = gridfile2;
- nc@title = "SCRIP mapping file between identical grids without ocean";
- nc@history = ldate+": create using mkunitymap.ncl";
- nc@Version = gitdescribe;
-
- ;
- ; Fraction
- ;
- nc->frac_a = int2dble( (/nc->mask_a/) );
- nc->frac_b = int2dble( (/nc->mask_b/) );
- ;
- ; Area
- ;
- nc->area_a = gc_qarea( nc->yv_a(:,:), nc->xv_a(:,:) );
- nc->area_b = gc_qarea( nc->yv_b(:,:), nc->xv_b(:,:) );
- ;
- ; Weights
- ;
- filevardef ( nc, "col", "integer", (/ "n_s" /) );
- filevardef ( nc, "row", "integer", (/ "n_s" /) );
- filevardef ( nc, "S", "double", (/ "n_s" /) );
-
- nc->col = ispan( 1, n_s, 1 );
- nc->row = nc->col;
- nc->S = 1.0d00;
-
-end
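The weight triplets mkunitymap.ncl writes reduce to an identity map: `col(i) = row(i) = i` and `S(i) = 1.0` for `i = 1..n_s`. A quick shell sketch of that output shape:

```shell
#!/bin/bash
# Identity sparse-matrix weights, as written by the deleted mkunitymap.ncl:
# each destination cell i receives source cell i with weight 1.0.
unity_weights() {
    seq 1 "$1" | awk '{ printf "%d %d %.1f\n", $1, $1, 1.0 }'
}

unity_weights 3
```

Applied as a sparse matrix, these weights copy each source value straight to the matching destination cell, which is exactly what a unity map between identical grids should do.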
diff --git a/tools/mkmapdata/mvNimport.sh b/tools/mkmapdata/mvNimport.sh
deleted file mode 100755
index 184a3fac25..0000000000
--- a/tools/mkmapdata/mvNimport.sh
+++ /dev/null
@@ -1,75 +0,0 @@
-#!/bin/bash
-#
-#
-# Batch script to move and import mapping files to inputdata
-# for several resolutions.
-#
-
-#----------------------------------------------------------------------
-
-if [ -z "$CSMDATA" ]; then
- CSMDATA=/fis/cgd/cseg/csm/inputdata
-fi
-
-if [ ! -d "$CSMDATA" ]; then
- echo "Environment variable CSMDATA is not set to a valid directory!"
- exit 1
-fi
-
-mapdir="lnd/clm2/mappingdata/maps"
-if [ ! -d "$CSMDATA/$mapdir" ]; then
- echo "Environment variable CSMDATA is not set to a valid inputdata directory!"
- exit 1
-fi
-
-if [ -z "$SVN_INP_DIR" ]; then
- SVN_INP_DIR=https://svn-ccsm-inputdata.cgd.ucar.edu/trunk/inputdata
-fi
-
-if [ $# -gt 0 ]; then
- resols=""
- for arg in $@; do
- resols="$resols $arg"
- done
-else
- echo "Run for all valid resolutions"
- resols=`../bld/queryDefaultNamelist.pl -res list -silent`
-fi
-echo "Move and import mapping files for this list of resolutions: $resols"
-
-#----------------------------------------------------------------------
-
-for res in $resols; do
- echo "Move and import mapping files for: $res"
- dir=$mapdir/$res
- #----------------------------------------------------------------------
- files=(map_*${res}*_aave_da_c??????.nc)
- if [ ${#files[*]} -lt 2 ]; then
- echo "No mapping files found for $res"
- exit 2
- else
- if [ ! -d "$CSMDATA/$dir" ]; then
- echo "Create mapping directory: $CSMDATA/$dir"
- mkdir $CSMDATA/$dir
- svn mkdir $SVN_INP_DIR/$dir -m "Create mapping directory for $res"
- fi
- for file in ${files[*]}; do
- echo "Copy and import file $file"
- cp -p $file $CSMDATA/$dir
- if [ $? -ne 0 ]; then
- echo "Problem copying file: $file"
- exit 3
- fi
- chmod 0444 $CSMDATA/$dir/$file
- if [ $? -ne 0 ]; then
- echo "Problem chmod on file: $file"
- exit 4
- fi
- svn import $CSMDATA/$dir/$file $SVN_INP_DIR/$dir/$file -m "Mapping file for $res"
- if [ $? -ne 0 ]; then
- echo "Problem doing svn import on file: $file"
- exit 4
- fi
- done
- fi
-done
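The per-file step in the deleted mvNimport.sh is copy, then make read-only, then import. A sketch of the copy-and-lock part (minus the svn import; paths are hypothetical):

```shell
#!/bin/bash
# Sketch of the copy-then-lock step mvNimport.sh ran per mapping file.
# Destination directory names here are hypothetical.
install_map() {
    local file=$1 dest=$2
    # -p preserves the timestamp that encodes the creation date in the name
    cp -p "$file" "$dest" || { echo "Problem copying file: $file"; return 3; }
    # inputdata files are locked read-only so repeated runs cannot clobber them
    chmod 0444 "$dest/$(basename "$file")" || return 4
}
```

Locking the installed copy to mode 0444 mirrors the script's intent: once a mapping file is imported into the shared inputdata tree, it should never be overwritten in place.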
diff --git a/tools/mkmapdata/regridbatch.sh b/tools/mkmapdata/regridbatch.sh
deleted file mode 100755
index 8b56f2dc7d..0000000000
--- a/tools/mkmapdata/regridbatch.sh
+++ /dev/null
@@ -1,102 +0,0 @@
-#!/bin/bash
-#
-#
-# Batch script to submit to create mapping files for all standard
-# resolutions. If you provide a single resolution via "$RES", only
-# that resolution will be used. In that case: If it is a regional or
-# single point resolution, you should set '#PBS -n' to 1, and be sure
-# that '-t regional' is specified in cmdargs.
-#
-# cheyenne specific batch commands:
-#PBS -A P93300606
-#PBS -N regrid
-#PBS -q regular
-#PBS -l select=4:ncpus=2:mpiprocs=2:mem=109GB
-#PBS -l walltime=2:00:00
-#PBS -j oe
-#PBS -me
-#PBS -V
-#PBS -S /bin/bash
-
-#----------------------------------------------------------------------
-# Set parameters
-#----------------------------------------------------------------------
-
-#----------------------------------------------------------------------
-# Begin main script
-#----------------------------------------------------------------------
-
-if [ -z "$RES" ]; then
- echo "Run for all valid resolutions"
- resols=`../../bld/queryDefaultNamelist.pl -res list -silent`
- if [ ! -z "$GRIDFILE" ]; then
- echo "When GRIDFILE is set, RES also needs to be set to a single resolution"
- exit 1
- fi
-else
- resols="$RES"
-fi
-if [ -z "$GRIDFILE" ]; then
- grid=""
-else
- if [ `echo $resols | wc -w` -gt 1 ]; then
- echo "When GRIDFILE is specified only one resolution can be given"
- echo "Resolutions input is: $resols"
- exit 1
- fi
- grid="-f $GRIDFILE"
-fi
-
-if [ -z "$MKMAPDATA_OPTIONS" ]; then
- echo "Run with standard options"
- options=" "
-else
- options="$MKMAPDATA_OPTIONS"
-fi
-echo "Create mapping files for this list of resolutions: $resols"
-
-#----------------------------------------------------------------------
-
-for res in $resols; do
- echo "Create mapping files for: $res"
-#----------------------------------------------------------------------
- cmdargs="-r $res $grid $options"
-
- # For single-point and regional resolutions, tell mkmapdata that
- # output type is regional
- if [[ `echo "$res" | grep -c "1x1_"` -gt 0 || `echo "$res" | grep -c "5x5_"` -gt 0 ]]; then
- res_type="regional"
- else
- res_type="global"
- fi
- # Assume if you are providing a gridfile that the grid is regional
- if [ "$grid" != "" ]; then
- res_type="regional"
- fi
-
- cmdargs="$cmdargs -t $res_type"
-
- echo "$res_type"
- if [ "$res_type" = "regional" ]; then
- echo "regional"
- # For regional and (especially) single-point grids, we can get
- # errors when trying to use multiple processors - so just use 1.
- regrid_num_proc=1
- else
- echo "global"
- regrid_num_proc=8
- fi
-
- if [ ! -z "$LSFUSER" ]; then
- echo "batch"
- cmdargs="$cmdargs -b"
- fi
- if [ ! -z "$PBS_O_WORKDIR" ]; then
- cd $PBS_O_WORKDIR
- cmdargs="$cmdargs -b"
- fi
-
- echo "args: $cmdargs"
- echo "time env REGRID_PROC=$regrid_num_proc ./mkmapdata.sh $cmdargs"
- time env REGRID_PROC=$regrid_num_proc ./mkmapdata.sh $cmdargs
-done
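The regional-vs-global branch in the loop above keys entirely off the resolution name. A minimal sketch of that classification in Python (the function name is ours, for illustration only — the script itself uses grep):

```python
def resolution_type(res):
    """Mirror the script's grep test: resolution names containing
    "1x1_" (single point) or "5x5_" (regional) get regional output;
    everything else is treated as a global grid."""
    if "1x1_" in res or "5x5_" in res:
        return "regional"
    return "global"
```

Regional grids then also drop to a single regrid process, since multi-processor runs can fail for small domains.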
diff --git a/tools/mkmapdata/regridgeyser.sh b/tools/mkmapdata/regridgeyser.sh
deleted file mode 100755
index 82a4615dcd..0000000000
--- a/tools/mkmapdata/regridgeyser.sh
+++ /dev/null
@@ -1,87 +0,0 @@
-#!/bin/bash
-#
-#
-# Batch script to submit to create mapping files for all standard
-# resolutions. If you provide a single resolution via "$RES", only
-# that resolution will be used. In that case: If it is a regional or
-# single point resolution, you should set '#SBATCH -n' to 1, and be sure
-# that '-t regional' is specified in cmdargs.
-#
-# geyser specific batch commands:
-#SBATCH -J regrid # job name
-#SBATCH -n 8
-#SBATCH --ntasks-per-node=8
-#SBATCH --mem=450G
-#SBATCH -t 03:00:00
-#SBATCH -A P93300606
-#SBATCH -p dav
-#SBATCH -e regrid.%J.err # error filename
-#SBATCH -o regrid.%J.out # output filename
-#
-# To submit this script:
-#
-# sbatch regridgeyser.sh
-#
-## IMPORTANT NOTE:
-#
-# environment variables can NOT be passed into DAV
-# queues. Hence, this script MUST be edited to select
-# what resolution to run for.
-
-#----------------------------------------------------------------------
-# Set parameters
-#----------------------------------------------------------------------
-export RES=1x1_brazil
-
-#----------------------------------------------------------------------
-# Begin main script
-#----------------------------------------------------------------------
-
-if [ -z "$RES" ]; then
- echo "Run for all valid resolutions"
- resols=`../../bld/queryDefaultNamelist.pl -res list -silent`
-else
- resols="$RES"
-fi
-echo "Create mapping files for this list of resolutions: $resols"
-
-#----------------------------------------------------------------------
-
-for res in $resols; do
- echo "Create mapping files for: $res"
-#----------------------------------------------------------------------
- cmdargs="-r $res"
-
- # For single-point and regional resolutions, tell mkmapdata that
- # output type is regional
- if [[ `echo "$res" | grep -c "1x1_"` -gt 0 || `echo "$res" | grep -c "5x5_"` -gt 0 ]]; then
- res_type="regional"
- else
- res_type="global"
- fi
-
- cmdargs="$cmdargs -t $res_type"
-
- echo "$res_type"
- if [ "$res_type" = "regional" ]; then
- echo "regional"
- # For regional and (especially) single-point grids, we can get
- # errors when trying to use multiple processors - so just use 1.
- # We also do NOT set batch mode in this case, because some
- # machines (e.g., yellowstone) do not listen to REGRID_PROC, so to
- # get a single processor, we need to run mkmapdata.sh in
- # interactive mode.
- regrid_num_proc=1
- else
- echo "global"
- regrid_num_proc=$SLURM_NTASKS
- if [ ! -z "$SLURM_JOB_ACCOUNT" ]; then
- echo "batch"
- cmdargs="$cmdargs -b"
- fi
- fi
-
- echo "args: $cmdargs"
- echo "time env REGRID_PROC=$regrid_num_proc ./mkmapdata.sh $cmdargs"
- time env REGRID_PROC=$regrid_num_proc ./mkmapdata.sh $cmdargs
-done
diff --git a/tools/mkmapdata/rmdups.ncl b/tools/mkmapdata/rmdups.ncl
deleted file mode 100644
index d5fff40d53..0000000000
--- a/tools/mkmapdata/rmdups.ncl
+++ /dev/null
@@ -1,131 +0,0 @@
-;
-; Remove duplicate weights from a mapping file.
-;
-; Mark Taylor (converted for use by CLM mkmapdata by Erik Kluzek)
-; Sep/01/2011
-;
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
-begin
- ; ===========================================================================================================
- ;
- ; IMPORTANT NOTE: EDIT THE FOLLOWING TO CUSTOMIZE or use ENV VARIABLE SETTINGS
- ; Edit the following as needed
- ;
- ; Input mapping file to remove duplicate weights from
- ;
- mapfile = getenv("MAPFILE") ; Get the mapping file
- newmapfile = getenv("NEWMAPFILE") ; The new mapping file to create
- logname = getenv("LOGNAME") ; Logname of user running the script
-
- if ( ismissing(mapfile) )then
- print( "You did NOT enter an input mapping file to convert" )
- status_exit( -1 )
- end if
- if ( ismissing(newmapfile) )then
- sdate = systemfunc( "date +%y%m%d" );
- newmapfile = mapfile+"_c"+sdate+".nc";
- end if
- ; ===========================================================================================================
-
- if ( systemfunc("test -f "+mapfile+"; echo $?" ) .ne. 0 )then
- print( "Input file does not exist or not found: "+mapfile );
- status_exit( -1 )
- end if
- print("map file: "+mapfile)
- f = addfile(mapfile,"r") ; Open netCDF files.
-
-
- n_s = dimsizes(f->col)
- if ( n_s .eq. 0 )then
- print( "n_s is size zero, so there are no overlap points; just return" );
- exit
- end if
-
- n_b = dimsizes(f->area_b)
- n_a = dimsizes(f->area_a)
- print("n_s = "+n_s+" max(row)="+max(f->row)+" max(col)="+max(f->col))
-
-
-
- row = f->row
- col = f->col
-
-
- print("checking for dups, sorting...")
- hash = new( n_s, double )
- hash = col
- hash= hash + row*n_b
- index1d=dim_pqsort(hash,1)
- row2=row(index1d)
- col2=col(index1d)
- S=f->S
- print("zeroing out any dups...")
- ndups=0
- i0=0
- do i=1,n_s-1
- if ( (col2(i) .eq. col2(i0)) .and. (row2(i) .eq. row2(i0))) then
- iorig1 = index1d(i0)
- iorig2 = index1d(i)
- ;print("dup row: "+row2(i)+" "+row2(i0)+" "+row(iorig1)+" "+row(iorig2))
- ;print("dup col: "+col2(i)+" "+col2(i0)+" "+col(iorig1)+" "+col(iorig2))
- ;print("removing "+iorig2+" keeping "+iorig1)
- S(iorig1)=S(iorig1)+S(iorig2)
- S(iorig2)=0
- ndups=ndups+1
- ; don't increment i0
- else
- i0=i
- end if
- end do
- delete(row2)
- delete(col2)
- if ( ndups .gt. 0) then
- print("ndups = "+ndups)
- print("compacting S...")
- ns2 = n_s-ndups
- S2 = new( ns2, double)
- row2= new( ns2, integer)
- col2 = new( ns2, integer)
- ns2=0
- do i=0,n_s-1
- if (S(i) .ne. 0) then
- S2(ns2)=S(i)
- row2(ns2)=row(i)
- col2(ns2)=col(i)
- ns2=ns2+1
- end if
- end do
- print("removed "+ndups+" dups")
- delete(S)
- delete(row)
- delete(col)
- S=S2
- row=row2
- col=col2
- n_s = ns2
- print("writing new netcdf file")
- cmdout = systemfunc("ncks -O -x -v S,row,col "+mapfile+" "+newmapfile)
- nco = addfile(newmapfile,"rw") ; Open netCDF files.
- nco->S = S
- nco->row = row
- nco->col = col
- ldate = systemfunc( "date" );
- nco@history = nco@history + ":"+ldate + ": ";
- nco@history = nco@history + " Removed duplicate weights from mapping file with: rmdups.ncl "
- nco@rmdups_Logname = logname;
- nco@rmdups_mod_date = ldate;
- nco@rmdups_version = systemfunc( "git describe" );
-
- print("Successfully removed duplicate weights from mapping file" );
-
- else
-
- print("No duplicate weights to remove from mapping file" );
-
- end if
-
-
-
-end
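The dedup pass above sorts entries by a hash of (row, col) and folds duplicate weights together before compacting the arrays. The same result can be sketched with a dictionary (function and variable names are ours, not part of rmdups.ncl):

```python
def remove_duplicate_weights(row, col, S):
    """Sum weights that share the same (row, col) pair and drop the
    duplicates, matching what rmdups.ncl's sort-by-hash pass produces.
    Returns compacted (row, col, S) lists."""
    acc = {}
    order = []  # first-seen order of each distinct (row, col) pair
    for r, c, s in zip(row, col, S):
        key = (r, c)
        if key not in acc:
            acc[key] = 0.0
            order.append(key)
        acc[key] += s  # duplicate weights accumulate onto the kept entry
    row2 = [k[0] for k in order]
    col2 = [k[1] for k in order]
    S2 = [acc[k] for k in order]
    return row2, col2, S2
```

The NCL version instead sorts a `col + row*n_b` hash so duplicates become adjacent, zeroes the extras, and compacts; the dictionary approach gives the same summed weights without the sort.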
diff --git a/tools/mkmapgrids/README b/tools/mkmapgrids/README
deleted file mode 100644
index cc6d2cb644..0000000000
--- a/tools/mkmapgrids/README
+++ /dev/null
@@ -1,22 +0,0 @@
-$CTSMROOT/tools/mkmapgrids/README June08/2018
-
-NCL script to create a SCRIP grid file for a regular lat/lon grid.
-
-To use the script, set the following environment variables
-
-Required (or defaults to a single point over Boulder Colorado)
-
-PTNAME ! name of your grid
-S_LAT ! Southern latitude corner
-N_LAT ! Northern latitude corner
-E_LON ! Eastern longitude corner
-W_LON ! Western longitude corner
-
-Optional:
-
-NX ! Number of grid points along longitude (default 1)
-NY ! Number of grid points along latitude (default 1)
-IMASK ! 0 or 1, mask value: 1 if all points are active, 0 if not (default 1, active)
-PRINT ! TRUE/FALSE do extra verbose printing or not (default FALSE)
-GRIDFILE ! Output filename
-
diff --git a/tools/mkmapgrids/mkscripgrid.ncl b/tools/mkmapgrids/mkscripgrid.ncl
deleted file mode 100644
index 0cbd1a8960..0000000000
--- a/tools/mkmapgrids/mkscripgrid.ncl
+++ /dev/null
@@ -1,188 +0,0 @@
-;
-; mkscripgrid.ncl
-;
-; Create SCRIP grid and mapping file for a land-only point or region.
-; Requires NCL 6.1.0 or later for the ESMF regridding functions
-;
-; Erik Kluzek
-; Dec/07/2011
-;
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
-load "$NCARG_ROOT/lib/ncarg/nclscripts/esmf/ESMF_regridding.ncl"
-begin
- ; ===========================================================================================================
- ; Set a few constants needed later
- cdate = systemfunc( "date +%y%m%d" );
- ldate = systemfunc( "date" );
- ;
- ; IMPORTANT NOTE: EDIT THE FOLLOWING TO CUSTOMIZE or use ENV VARIABLE SETTINGS
- ; Edit the following as needed to interpolate to a new resolution.
- ;
- ; Input resolution and position
- ;
- name = getenv("PTNAME"); ; Get name of this point
-
- latS = stringtodouble( getenv("S_LAT") ); ; Get south latitude from env variable
- latN = stringtodouble( getenv("N_LAT") ); ; Get north latitude from env variable
- lonE = stringtodouble( getenv("E_LON") ); ; Get east longitude from env variable
- lonW = stringtodouble( getenv("W_LON") ); ; Get west longitude from env variable
-
- nx = stringtointeger( getenv("NX" ) ); ; Get number of grids along longitude lines
- ny = stringtointeger( getenv("NY" ) ); ; Get number of grids along latitude lines
-
- imask = stringtointeger( getenv("IMASK") ); ; Get imask to use from env variable
-
- print_str = getenv("PRINT"); ; Do Extra printing for debugging
-
- outfilename = getenv("GRIDFILE"); ; Get filename from env variable
-
- gitdescribe = getenv("GITDES"); ; Git describe from the source clone
-
- if ( ismissing(nx) )then
- nx = 1;
- end if
- if ( ismissing(ny) )then
- ny = 1;
- end if
- if ( ismissing(imask) )then
- imask = 1;
- end if
- if ( ismissing(name) )then
- name = nx+"x"+ny+"pt_US-UMB";
- end if
- if ( ismissing(latS) )then
- latS = 45.5098;
- end if
- if ( ismissing(latN) )then
- latN = 45.6098;
- end if
- if ( ismissing(lonW) )then
- lonW = 275.2362;
- end if
- if ( ismissing(lonE) )then
- lonE = 275.3362;
- end if
- if ( ismissing(print_str) )then
- printn = False;
- else
- if ( print_str .eq. "TRUE" )then
- printn = True;
- else
- printn = False;
- end if
- end if
-
- if ( ismissing(outfilename) )then
- if ( imask .eq. 1 )then
- outfilename = "SCRIPgrid_"+name+"_nomask_c"+cdate+".nc";
- else
- if ( imask .eq. 0 )then
- outfilename = "SCRIPgrid_"+name+"_noocean_c"+cdate+".nc";
- else
- outfilename = "SCRIPgrid_"+name+"_mask_c"+cdate+".nc";
- end if
- end if
- end if
-
- if ( ismissing(gitdescribe) )then
- gitdescribe = systemfunc( "git describe" )
- end if
-
- system( "/bin/rm -f "+outfilename );
- if ( printn )then
- print( "output file: "+outfilename );
- end if
-
-function fspan1up( fbegin [*]:double, fend [*]:double, number:integer )
-;
-; An "fspan" that can handle size of 1 and up.
-; Do fspan for arrays of two or more, or average of end points for array of one.
-;
-local farray;
-begin
- if ( number .eq. 1) then
- farray = (/ (fbegin + fend) / 2.0d00 /);
- else
- farray = fspan( fbegin, fend, number );
- end if
- return( farray );
-end
-
- ;
- ; Compute derived quantities
- ;
-
- delX = (lonE - lonW) / int2dble(nx);
- delY = (latN - latS) / int2dble(ny);
-
- lonCenters = fspan1up( (lonW + delX/2.d0), (lonE - delX/2.d0), nx)
- latCenters = fspan1up( (latS + delY/2.d0), (latN - delY/2.d0), ny)
- lon = new( (/ny, nx/), "double" );
- lat = new( (/ny, nx/), "double" );
- if ( (nx .eq. 1) .or. (ny .eq. 1) )then
- if ( printn )then
- print( "Calculate corners" )
- end if
- lonCorners = new( (/ny, nx, 4/), "double" );
- latCorners = new( (/ny, nx, 4/), "double" );
- else
- if ( printn )then
- print( "Have NCL calculate corners" )
- end if
- end if
- do i = 0, nx-1
- lat(:,i) = latCenters;
- if ( (nx .eq. 1) .or. (ny .eq. 1) )then
- latCorners(:,i,0) = latCenters - delY/2.d0;
- latCorners(:,i,1) = latCenters - delY/2.d0;
- latCorners(:,i,2) = latCenters + delY/2.d0;
- latCorners(:,i,3) = latCenters + delY/2.d0;
- end if
- end do
- do j = 0, ny-1
- lon(j,:) = lonCenters;
- if ( (nx .eq. 1) .or. (ny .eq. 1) )then
- lonCorners(j,:,0) = lonCenters - delX/2.d0;
- lonCorners(j,:,1) = lonCenters + delX/2.d0;
- lonCorners(j,:,2) = lonCenters + delX/2.d0;
- lonCorners(j,:,3) = lonCenters - delX/2.d0;
- end if
- end do
-
- ; for some reason, "No_FillValue" isn't working in the case where imask=1
- Mask2D = new( (/ny,nx/), "integer", "No_FillValue" )
- Mask2D(:,:) = imask
- gridSize = delX+"x"+delY
-
- ;
- ; Create SCRIP grid file
- ;
-
- Opt = True
- Opt@Mask2D = Mask2D
- if ( (nx .eq. 1) .or. (ny .eq. 1) )then
- Opt@GridCornerLat = latCorners
- Opt@GridCornerLon = lonCorners
- end if
- Opt@Title = "SCRIP grid file for "+name
- if (printn) then
- Opt@Debug = True
- end if
- curvilinear_to_SCRIP(outfilename, lat, lon, Opt)
-
- ;
- ; Add global attributes to file
- ;
-
- nc = addfile( outfilename, "w" );
- nc@history = ldate+": create using mkscripgrid.ncl";
- nc@comment = "Ocean is assumed to be non-existent in this region";
- nc@Version = gitdescribe;
- if ( printn )then
- print( "================================================================================================" );
- print( "Successfully created SCRIP grid file: "+outfilename);
- end if
-
- ; ===========================================================================================================
-
-end
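The fspan1up helper and the delX/delY arithmetic above translate directly to Python; for a single cell the "span" degenerates to the midpoint of the endpoints (function names are ours, for illustration):

```python
def fspan1up(fbegin, fend, number):
    """Like NCL's fspan, but for number == 1 return the midpoint of
    the endpoints, as mkscripgrid.ncl's fspan1up does."""
    if number == 1:
        return [(fbegin + fend) / 2.0]
    step = (fend - fbegin) / (number - 1)
    return [fbegin + i * step for i in range(number)]

def cell_centers(lo, hi, n):
    """Cell-center coordinates for n cells spanning [lo, hi],
    matching the delX/delY arithmetic in the script above."""
    d = (hi - lo) / n
    return fspan1up(lo + d / 2.0, hi - d / 2.0, n)
```

Corner coordinates then sit half a cell width on either side of each center, which is exactly what the latCorners/lonCorners blocks compute for the nx == 1 or ny == 1 case.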
diff --git a/tools/mkprocdata_map/README b/tools/mkprocdata_map/README
deleted file mode 100644
index f5ac71b1ff..0000000000
--- a/tools/mkprocdata_map/README
+++ /dev/null
@@ -1,152 +0,0 @@
-$CTSMROOT/tools/mkprocdata_map/clm/README Oct 30, 2012
-
-This directory contains scripts for regridding CLM output from an
-unstructured grid (1-d output using the lndgrid dimension) to a 2-d
-(lat/lon) grid. The regridding method is area-conservative.
-
-The following steps provide a method to create the necessary inputs to
-this script, produce an executable, and regrid output:
-
-In the following instructions, the "original" resolution is the
-resolution of the run on an unstructured grid, and the "target"
-resolution is the regular lat/lon resolution to which you will regrid
-the output.
-
-(0) Install prerequisites:
-
- (a) If you do not already have a mapping file from the original
- resolution to the target resolution, you will need the
- ESMF_RegridWeightGen tool installed on your system.
-
- (b) The wrapper scripts described below require the netCDF operators
- (NCO). These NCO tools (ncks, ncap2, etc.) must be in your path.
-
-(1) Determine the target resolution. This resolution must be a regular
- lat/lon resolution. Generally, this should be a resolution close
- to the resolution of the CLM run. For example, when running CLM at
- ne30_np4 resolution, a good target resolution is 0.9x1.25 (i.e.,
- finite volume 1 degree: f09); when running CLM at ne120_np4
- resolution, a good target resolution is 0.23x0.31 (i.e., finite
- volume 1/4 degree: f02).
-
-(2) Perform a short CLM run at the target resolution, producing at
- least one history file. After this run completes, set the
- environment variable $TEMPLATE_FILE to point to one of the history
- files created by this run.
-
-(3) Create a conservative mapping file from the original resolution to
- the target resolution using the ESMF regrid weight generator. The
- basic method for doing this is:
-
- $ESMF_PATH/bin/ESMF_RegridWeightGen -s $INGRID -d $OUTGRID -m conserve -w $MAP_FILE -i
-
- where $INGRID gives the path to a SCRIP grid file at the original
- resolution, $OUTGRID gives the path to a SCRIP grid file at the
- template resolution, and $MAP_FILE gives the name of the mapping
- file that will be generated.
-
- However, you may want to wrap this in a job script to run it on
- multiple processors (using mpirun), and you may have to set other
- machine-specific environment variables. You can follow the method
- used in tools/mkmapdata/mkmapdata.sh.
-
-(4) Build the mkprocdata_map tool. From the current directory, do the
- following:
-
- > cd src
- > gmake
- > cd ..
-
- By default the code compiles optimized so it's reasonably fast. If you
- want to use the debugger, with bounds-checking and float trapping on,
- do the following:
- gmake OPT=FALSE
- See Also: See the components/clm/tools/README file for notes about setting
- the path for NetCDF.
-
- This builds the mkprocdata_map executable. However, you generally
- will not want to run this executable directly: instead, you should
- use one of the wrapper scripts described below.
-
-(5) Do the regridding using one of the wrapper scripts in this
- directory. To determine which script is most appropriate: Do you
- need to regrid just one or a few output files, or most/all of the
- output files in a directory?
-
- (a) If you are regridding just one or a few output files, you can
- use mkprocdata_map_wrap. Its usage is:
-
- > mkprocdata_map_wrap -i input_file -o output_file -m $MAP_FILE -t $TEMPLATE_FILE
-
- where:
- - input_file is the CLM history file you want to regrid
- - output_file is the name of the regridded file that will be
- created
- - $MAP_FILE is the ESMF conservative mapping file created in
- step (3)
- - $TEMPLATE_FILE is a CLM history file at the target resolution,
- created in step (2)
-
- You may also specify the '-l' option to this script. This option
- controls whether landfrac and related variables are determined
- by regridding the input file (when you don't give the '-l'
- option), or by copying these variables from the template file
- (when you give the '-l' option). These variables are important
- for computing regional and global averages, e.g., as is done in
- the land diagnostics package. Each method may be reasonable,
- depending on the purposes of the regridding. For example, if you
- want regional/global integrals to be as true as possible to the
- original run, you should run withOUT the '-l' option; but if you
- want to compare regional/global integrals between the original
- run and a run at the target resolution, then you may want to run
- WITH the '-l' option.
-
- Run 'mkprocdata_map_wrap -h' for full usage
-
- (b) If you need to regrid most or all of the output files in a
- directory, you can use the convenience script
- mkprocdata_map_all. This script runs mkprocdata_map_wrap on all
- files matching a given pattern within a directory. Its basic
- usage is the following, done from a directory containing many
- CLM history files:
-
- > /path/to/mkprocdata_map_all -p $CASE -m $MAP_FILE -t $TEMPLATE_FILE
-
- where:
- - $CASE is the case name of the original run (this -p argument
- is actually more general: it provides the prefix of files on
- which mkprocdata_map_wrap should be run; run
- 'mkprocdata_map_all -h' for details)
- - $MAP_FILE is the ESMF conservative mapping file created in
- step (3)
- - $TEMPLATE_FILE is a CLM history file at the target resolution,
- created in step (2)
-
- There are a number of additional optional arguments to this
- script, including the '-l' option described in (a), above. Run
- 'mkprocdata_map_all -h' for full usage.
-
-
-------------------------------------------------------------------------
-Some miscellaneous notes on the scripts contained here
-------------------------------------------------------------------------
-
-- area vs. area_regridded in the output of mkprocdata_map_wrap and
- mkprocdata_map_all: The 'area' variable gives the actual grid cell
- area on the destination grid. The 'area_regridded' variable is the
- result of performing the regridding procedure on the 'area' variable
- in the original source data. This seems to be the wrong way to
- regrid areas (e.g., it leads to global totals that do not make
- sense). However, area_regridded is left in the regridded files as a
- diagnostic. BUT PLEASE USE CAUTION IF USING THIS AREA_REGRIDDED
- VALUE, UNLESS YOU KNOW WHAT IT REALLY REPRESENTS!
-
-- At least as of this writing (Oct 29, 2012), there is insufficient
- metadata on the CLM history files to regrid all variables
- perfectly. In particular, note that many CLM history variables apply
- only over a subset of the grid cell (e.g., over the non-lake portion
- of the grid cell). Thus, to regrid these variables appropriately, we
- would need to weight each grid cell's value by the portion of the
- grid cell over which the field applies. However, doing this would
- require metadata about each field that is not currently
- available.
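The regridding performed in step (5) is, at its core, a sparse matrix-vector product over the weights stored in $MAP_FILE. A minimal sketch, assuming the 1-based row/col index convention that ESMF mapping files use (function and argument names are ours, not mkprocdata_map's):

```python
def apply_map(S, row, col, field_a, n_b):
    """Apply SCRIP-style sparse weights: each map entry k adds
    S[k] * field_a[col[k]] into field_b[row[k]]. row/col are 1-based,
    as stored in ESMF/SCRIP mapping files."""
    field_b = [0.0] * n_b
    for s, r, c in zip(S, row, col):
        field_b[r - 1] += s * field_a[c - 1]
    return field_b
```

For a conservative map the weights for each destination cell sum to the fraction of that cell covered by the source grid, which is why duplicate (row, col) entries (see rmdups) must be summed rather than kept.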
diff --git a/tools/mkprocdata_map/README.filedescriptions b/tools/mkprocdata_map/README.filedescriptions
deleted file mode 100644
index e657e7c7d9..0000000000
--- a/tools/mkprocdata_map/README.filedescriptions
+++ /dev/null
@@ -1,25 +0,0 @@
-$CTSMROOT/tools/mkprocdata_map/README.filedescriptions Erik Kluzek
- 06/08/2018
-
-mkprocdata_map_all ------------ Script to run over a list of files
-mkprocdata_map_wrap ----------- Main script to actually use
-mkprocdata_map_functions.bash - Bash shell functions to use in other scripts
-README ------------------------ Description and how to run
-src --------------------------- Directory with FORTRAN source code
-
-Also there are some sample files that can be used for testing under inputdata in
-
-$DIN_LOC_ROOT/lnd/clm2/test_mkprocdata_map
-
-See how this is done by looking at the file for testing mkprocdata_map:
-
-../../test/tools/nl_files/mkprocdata_ne30_to_f19_I2000
-
-Which does something like the following:
-
-./mkprocdata_map_wrap \
--i $DIN_LOC_ROOT/lnd/clm2/test_mkprocdata_map/clm4054_ne30g16_I2000.clm2.h0.2000-01_c170430.nc \
--o ne30output_onf19grid.nc \
--m $DIN_LOC_ROOT/lnd/clm2/test_mkprocdata_map/map_ne30np4_nomask_to_fv1.9x2.5_nomask_aave_da_c121107.nc \
--t $DIN_LOC_ROOT/lnd/clm2/test_mkprocdata_map/clm4054_f19g16_I2000.clm2.h0.2000-01_c170430.nc
-
diff --git a/tools/mkprocdata_map/mkprocdata_map_all b/tools/mkprocdata_map/mkprocdata_map_all
deleted file mode 100755
index 73e8abedf1..0000000000
--- a/tools/mkprocdata_map/mkprocdata_map_all
+++ /dev/null
@@ -1,202 +0,0 @@
-#!/bin/bash
-
-# This script runs mkprocdata_map_wrap on all files matching a given
-# pattern within a directory.
-
-# Created by Bill Sacks, 5-26-11
-
-# ----------------------------------------------------------------------
-# LOCAL FUNCTIONS DEFINED HERE
-# ----------------------------------------------------------------------
-
-function Usage {
- script_name=`basename $0`
- echo "Usage: $script_name -p prefix -m map_file -t template_file [-d] [-e executable-path] [-h] [-i] [-l] [-o output_suffix] [-r diRectory] [-s suffix]"
- echo ""
- echo "This script runs mkprocdata_map_wrap on all files matching a"
- echo "given pattern within a directory."
- echo ""
- echo "'prefix' gives the prefix of the files on which"
- echo "mkprocdata_map_wrap should be run; 'prefix' should NOT contain"
- echo "wildcard characters. The prefix is also used to translate"
- echo "from input to output file names (see examples below)"
- echo ""
- echo "'map_file' gives the name (and full path if not in the current"
- echo "directory) of the mapping file"
- echo ""
- echo "'template_file' gives the name (and full path if not in the"
- echo "current directory) of the template file, from which we read"
- echo "lats, lons and some other variables"
- echo ""
- echo "The following are optional arguments:"
- echo ""
- echo "[-d]: Do a test (Dry run): do all error-checking on"
- echo " arguments and print commands that would be run, but"
- echo " don't actually run commands"
- echo ""
- echo "[-e executable-path]: Gives the path of the mkprocdata_map executable."
- echo " If not specified, the path is determined by the"
- echo " default value in mkprocdata_map_wrap."
- echo ""
- echo "[-h]: Print this help message and exit"
- echo ""
- echo "[-i]: Ignore (skip) existing output files; if this option is"
- echo " not specified, then the script dies with an error if"
- echo " any of the desired output files already exist"
- echo ""
- echo "[-l]: Option passed to mkprocdata_map_wrap: rather than computing"
- echo " landfrac and related variables by regridding the input file,"
- echo " instead copy these variables directly from the template file."
- echo ""
- echo "[-o output_suffix]: suffix to append to the end of the prefix"
- echo " on the output files"
- echo " If not specified, '_2d' is used"
- echo ""
- echo "[-r diRectory]: Do the processing in the given directory."
- echo " If not specified, processing is done in the"
- echo " current working directory."
- echo ""
- echo "[-s suffix]: Run mkprocdata_map_wrap on all files matching the"
- echo " pattern '\${prefix}\${suffix}'. The suffix can -"
- echo " and often will - contain wildcards; but"
- echo " remember to enclose 'suffix' in quotes to"
- echo " prevent shell expansion."
- echo " If not specified, run mkprocdata_map_wrap on all"
- echo " files matching '\${prefix}*'"
- echo ""
- echo ""
- echo "Example: $script_name -p Ib14_ne30np4_gx1v6 -m map_ne30np4_to_fv1.9x2.5_aave_da_091230.nc -t Ib19_1.9x2.5_gx1v6.clm2.h0.0001-01.nc"
- echo "This will run mkprocdata_map_wrap on all files whose names begin"
- echo "with 'Ib14_ne30np4_gx1v6' in the current directory, using the"
- echo "mapping file named 'map_ne30np4_to_fv1.9x2.5_aave_da_091230.nc'"
- echo "and the template file named 'Ib19_1.9x2.5_gx1v6.clm2.h0.0001-01.nc'"
- echo "For an input file named:"
- echo " Ib14_ne30np4_gx1v6.clm2.h0.0001-01-06-00000.nc"
- echo "The output file will be named:"
- echo " Ib14_ne30np4_gx1v6_2d.clm2.h0.0001-01-06-00000.nc"
- echo ""
- echo "Example: $script_name -o '_remap' -s '*.h0.0001*.nc' -p Ib14_ne30np4_gx1v6 -m map_ne30np4_to_fv1.9x2.5_aave_da_091230.nc -t Ib19_1.9x2.5_gx1v6.clm2.h0.0001-01.nc"
- echo "This will run mkprocdata_map_wrap on all files whose names match"
- echo "the pattern 'Ib14_ne30np4_gx1v6*.h0.0001*.nc', in the"
- echo "current directory, using the mapping file named"
- echo "'map_ne30np4_to_fv1.9x2.5_aave_da_091230.nc' and the"
- echo "template file named Ib19_1.9x2.5_gx1v6.clm2.h0.0001-01.nc"
- echo "For an input file named:"
- echo " Ib14_ne30np4_gx1v6.clm2.h0.0001-01-06-00000.nc"
- echo "The output file will be named:"
- echo " Ib14_ne30np4_gx1v6_remap.clm2.h0.0001-01-06-00000.nc"
- echo ""
-}
-
-# ----------------------------------------------------------------------
-# BEGIN MAIN SCRIPT
-# ----------------------------------------------------------------------
-
-script_dir=`dirname $0`
-source $script_dir/mkprocdata_map_functions.bash
-
-# ----------------------------------------------------------------------
-# Handle command-line arguments
-# ----------------------------------------------------------------------
-
-# define default values:
-# required arguments:
-prefix=""
-map_file=""
-template_file=""
-# optional arguments:
-directory="."
-ignore_existing=0
-output_suffix="_2d"
-suffix="*"
-dryrun=0
-extra_args=""
-
-while getopts de:hilm:o:p:r:s:t: opt; do
- case $opt in
- d) dryrun=1;;
- e) extra_args="$extra_args -e $OPTARG";;
- h) Usage; exit;;
- i) ignore_existing=1;;
- l) extra_args="$extra_args -l";;
- m) map_file=$OPTARG;;
- o) output_suffix=$OPTARG;;
- p) prefix=$OPTARG;;
- r) directory=$OPTARG;;
- s) suffix=$OPTARG;;
- t) template_file=$OPTARG;;
- \?) Usage; exit 1
- esac
-done
-
-# ----------------------------------------------------------------------
-# Error checking on arguments
-# ----------------------------------------------------------------------
-
-if [ -z "$prefix" ]; then
- echo "Must specify a prefix"
- Usage
- exit 1
-fi
-
-check_file_arg "$map_file" "map"
-check_file_arg "$template_file" "template"
-
-# Make sure directory is really a directory
-if [ ! -d $directory ]; then
- echo "ERROR: $directory is not a directory"
- echo ""
- Usage
- exit 1
-fi
-
-
-# ----------------------------------------------------------------------
-# Change to desired directory
-# ----------------------------------------------------------------------
-
-olddir=`pwd`
-cd $directory
-
-# ----------------------------------------------------------------------
-# Get list of files matching the given pattern; make sure there really
-# are some matching files
-# ----------------------------------------------------------------------
-
-files=`ls ${prefix}${suffix}`
-if [ $? -ne 0 ]; then
- echo "ERROR trying to find files matching: ${prefix}${suffix}"
- echo ""
- Usage
- exit 1
-fi
-
-# ----------------------------------------------------------------------
-# Loop through files matching the given pattern; run mkprocdata_map_wrap for each
-# ----------------------------------------------------------------------
-
-for infile in $files; do
- outfile=${infile/$prefix/${prefix}${output_suffix}}
- if [ -e $outfile ]; then
- if [ $ignore_existing -eq 0 ]; then
- echo ""
- echo "ERROR: output file $outfile already exists"
- exit 1
- else
- echo ""
- echo "WARNING: output file $outfile already exists: skipping"
- echo ""
- fi
-
- else # outfile does not exist
- echo ""
- do_cmd "${script_dir}/mkprocdata_map_wrap -i $infile -o $outfile -m $map_file -t $template_file $extra_args" $dryrun
- fi
-done
-
-# ----------------------------------------------------------------------
-# Clean up
-# ----------------------------------------------------------------------
-
-cd $olddir
-
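The input-to-output filename translation above relies on bash's `${infile/$prefix/${prefix}${output_suffix}}` expansion, which replaces the first occurrence of the prefix. An equivalent sketch in Python (the function name is ours, for illustration):

```python
def output_name(infile, prefix, output_suffix="_2d"):
    """Translate an input history file name to the output name the
    way the script's ${infile/$prefix/${prefix}${output_suffix}}
    expansion does: replace the first occurrence of prefix."""
    return infile.replace(prefix, prefix + output_suffix, 1)
```

This is why the prefix must not contain wildcard characters: it is used both to glob the input files and, literally, to build each output name.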
diff --git a/tools/mkprocdata_map/mkprocdata_map_functions.bash b/tools/mkprocdata_map/mkprocdata_map_functions.bash
deleted file mode 100644
index bbc359fb89..0000000000
--- a/tools/mkprocdata_map/mkprocdata_map_functions.bash
+++ /dev/null
@@ -1,57 +0,0 @@
-#!/bin/bash
-
-# This file contains functions used by other bash scripts in this directory.
-
-# This function echoes the command given by $1 (cmd), then executes it.
-# However, if $2 (dryrun) is non-zero, then it only does the echo, not the execution.
-# Usage: do_cmd cmd dryrun
-# Returns 0 on success, non-zero on failure; if there is an error, the error string is echoed.
-function do_cmd {
- if [[ $# -ne 2 ]]; then
- echo "ERROR in do_cmd: wrong number of arguments: expected 2, received $#"
- exit 1
- fi
-
- local cmd=$1
- local dryrun=$2
-
- echo $cmd
- if [ $dryrun -eq 0 ]; then
- # We use 'eval $cmd' rather than just '$cmd', because the
- # latter doesn't work right if the command contains any quoted
- # strings (e.g., svn ci -m "this is my message")
- eval $cmd
- if [ $? -ne 0 ]; then
- echo "ERROR in do_cmd: error executing command"
- exit 2
- fi
- fi
-
- return 0
-}
-
-# make sure that the given file name argument was provided, and that
-# the file exists; exit the script with a usage message if either of
-# these is not true
-#
-# Usage: check_file_arg filename_arg description
-# (description is echoed if there is an error)
-# Example: check_file_arg "$input_file" "input"
-# (note that $input_file must be in quotes)
-function check_file_arg {
- local filename=$1
- local description=$2
-
- if [ -z "$filename" ]; then
- echo "ERROR: Must specify $description file"
- Usage
- exit 1
- fi
-
- if [ ! -f $filename ]; then
- echo "ERROR: Can't find $description file: $filename"
- Usage
- exit 1
- fi
-}
-
diff --git a/tools/mkprocdata_map/mkprocdata_map_wrap b/tools/mkprocdata_map/mkprocdata_map_wrap
deleted file mode 100755
index 4744b0eacc..0000000000
--- a/tools/mkprocdata_map/mkprocdata_map_wrap
+++ /dev/null
@@ -1,250 +0,0 @@
-#!/bin/bash
-
-# This script is a wrapper around mkprocdata_map that runs that
-# program and then copies some additional variables from the template
-# file to the output file. It also does some additional pre and
-# post-processing in order to create some additional variables.
-
-# Created by Bill Sacks, 5-25-11
-
-# ----------------------------------------------------------------------
-# SET PARAMETERS HERE
-# ----------------------------------------------------------------------
-
-# comma-delimited list of extra variables to copy directly from
-# template file; note that these variables should not be written out
-# by mkprocdata_map (i.e., everything in this list should be listed in
-# the 'ignore_var' function in mkprocdata_map.F90); however, there may
-# be some variables in the 'ignore_var' function that are not listed
-# here - e.g., variables that we treat specially.
-copy_vars="lon,lat"
-
-# comma-delimited list of extra variables to copy from the template
-# file if the -l option is specified -- this option says to copy
-# landfrac and related variables. Note that some of these variables
-# may be written out by mkprocdata_map, in which case they will be
- overwritten afterwards (slightly less efficient, but that keeps
-# things simpler).
-landfrac_copy_vars="landfrac,landmask,pftmask"
-
-# name of the executable;
-# expected to be in the same directory as this script unless -e option is given
-executable="mkprocdata_map"
-
-# minimum value for regridded pftmask variable for the output variable to be 1
-pftmask_min="1.e-6"
-
-# fill value for landmask
-landmask_fill=-9999
-
-# ----------------------------------------------------------------------
-# LOCAL FUNCTIONS DEFINED HERE
-# ----------------------------------------------------------------------
-
-function Usage {
- script_name=`basename $0`
- echo "Usage: $script_name -i input_file -o output_file -m map_file -t template_file [-e executable-path] [-h] [-l]"
- echo ""
- echo "This script runs mkprocdata_map with the given arguments (-i, -o, -m and -t),"
- echo "then copies some additional variables from the template file"
- echo "to the output file. It also does some additional pre and"
- echo "post-processing in order to create some additional variables."
- echo ""
- echo "Additional optional arguments:"
- echo ""
- echo "[-e executable-path]: Gives the path of the mkprocdata_map executable."
- echo " If not specified, the executable is assumed to be"
- echo " in the same directory as this script."
- echo ""
- echo "[-h]: Print this help message and exit"
- echo ""
- echo "[-l]: Rather than computing landfrac and related variables"
- echo "by regridding the input file, instead copy these variables"
- echo "directly from the template file. The variables this pertains"
- echo "to are:"
- echo $landfrac_copy_vars
-}
-
-# This function operates on a single variable in a file, changing all
-# places where that variable is missing to some new (non-missing)
-# value. The _FillValue attribute remains unchanged.
-# Usage: change_missing_to_value varname newval infile outfile
-# - varname: the name of the variable to change
-# - newval: all instances of the missing value will be replaced with
-# this new value
-# - infile: input file name
-# - outfile: output file name (can be the same as infile)
-function change_missing_to_value {
- if [[ $# -ne 4 ]]; then
- echo "ERROR in change_missing_to_value: wrong number of arguments: expected 4, received $#"
- exit 1
- fi
-
- varname=$1
- newval=$2
- infile=$3
- outfile=$4
-
- varname_tmp=${varname}_tmp_$$
-
- do_cmd "ncap2 -O -s 'pftmask=(pftmask_float >= $pftmask_min)' $output_file $output_file" 0
- do_cmd "ncks -O -x -v pftmask_float $output_file $output_file" 0
-
- # --- Calculate landmask from landfrac ---
- echo ""
-
- cat > cmds.nco.tmp.$$ <<EOF
-landmask_float = (landfrac > 0);
-landmask_float.change_miss($landmask_fill);
-landmask=int(landmask_float);
-EOF
-
- do_cmd "ncap2 -O -S cmds.nco.tmp.$$ $output_file $output_file" 0
- rm cmds.nco.tmp.$$
-
- change_missing_to_value landmask 0 $output_file $output_file
-
- # in the following, note that we need to manually set missing_value, because it doesn't get changed through the .set_miss call in nco:
- do_cmd "ncatted -a long_name,landmask,o,c,'land/ocean mask (0.=ocean and 1.=land)' -a missing_value,landmask,o,i,$landmask_fill $output_file" 0
-fi
-
-echo "Successfully regridded data"
diff --git a/tools/mkprocdata_map/src/Filepath b/tools/mkprocdata_map/src/Filepath
deleted file mode 100644
index 9c558e357c..0000000000
--- a/tools/mkprocdata_map/src/Filepath
+++ /dev/null
@@ -1 +0,0 @@
-.
diff --git a/tools/mkprocdata_map/src/Makefile b/tools/mkprocdata_map/src/Makefile
deleted file mode 100644
index 6f07deb741..0000000000
--- a/tools/mkprocdata_map/src/Makefile
+++ /dev/null
@@ -1,10 +0,0 @@
-# Makefile for mksurfdata_map
-
-EXENAME = ../mkprocdata_map
-
-# Set optimization on by default
-ifeq ($(OPT),$(null))
- OPT := TRUE
-endif
-
-include Makefile.common
\ No newline at end of file
diff --git a/tools/mkprocdata_map/src/Makefile.common b/tools/mkprocdata_map/src/Makefile.common
deleted file mode 100644
index ab79f94144..0000000000
--- a/tools/mkprocdata_map/src/Makefile.common
+++ /dev/null
@@ -1,360 +0,0 @@
-#-----------------------------------------------------------------------
-# This Makefile is for building clm tools on AIX, Linux (with pgf90 or
-# lf95 compiler), Darwin or IRIX platforms.
-#
-# These macros can be changed by setting environment variables:
-#
-# LIB_NETCDF --- Library directory location of netcdf. (defaults to /usr/local/lib)
-# INC_NETCDF --- Include directory location of netcdf. (defaults to /usr/local/include)
-# MOD_NETCDF --- Module directory location of netcdf. (defaults to $LIB_NETCDF)
-# USER_FC ------ Allow user to override the default Fortran compiler specified in Makefile.
-# USER_FCTYP --- Allow user to override the default type of Fortran compiler (linux and USER_FC=ftn only).
-# USER_CC ------ Allow user to override the default C compiler specified in Makefile (linux only).
-# USER_LINKER -- Allow user to override the default linker specified in Makefile.
-# USER_CPPDEFS - Additional CPP defines.
-# USER_CFLAGS -- Additional C compiler flags that the user wishes to set.
-# USER_FFLAGS -- Additional Fortran compiler flags that the user wishes to set.
-# USER_LDLAGS -- Additional load flags that the user wishes to set.
-# SMP ---------- Shared memory Multi-processing (TRUE or FALSE) [default is FALSE]
-# OPT ---------- Use optimized options.
-#
-#------------------------------------------------------------------------
-
-# Set up special characters
-null :=
-
-# Newer makes set the CURDIR variable.
-CURDIR := $(shell pwd)
-
-RM = rm
-
-# Check for the netcdf library and include directories
-ifeq ($(LIB_NETCDF),$(null))
- LIB_NETCDF := /usr/local/lib
-endif
-
-ifeq ($(INC_NETCDF),$(null))
- INC_NETCDF := /usr/local/include
-endif
-
-ifeq ($(MOD_NETCDF),$(null))
- MOD_NETCDF := $(LIB_NETCDF)
-endif
-
-# Set user specified Fortran compiler
-ifneq ($(USER_FC),$(null))
- FC := $(USER_FC)
-endif
-
-# Set user specified C compiler
-ifneq ($(USER_CC),$(null))
- CC := $(USER_CC)
-endif
-
-# Set if Shared memory multi-processing will be used
-ifeq ($(SMP),$(null))
- SMP := FALSE
-endif
-
-CPPDEF := $(USER_CPPDEFS)
-
-# Set optimization on by default
-ifeq ($(OPT),$(null))
- OPT := TRUE
-endif
-
-ifeq ($(OPT),TRUE)
- CPPDEF := -DOPT
-endif
-
-# Determine platform
-UNAMES := $(shell uname -s)
-
-# Load dependency search path.
-dirs := . $(shell cat Filepath)
-
-# Set cpp search path, include netcdf
-cpp_dirs := $(dirs) $(INC_NETCDF) $(MOD_NETCDF)
-cpp_path := $(foreach dir,$(cpp_dirs),-I$(dir)) # format for command line
-
-# Expand any tildes in directory names. Change spaces to colons.
-# (the vpath itself is set elsewhere, based on this variable)
-vpath_dirs := $(foreach dir,$(cpp_dirs),$(wildcard $(dir)))
-vpath_dirs := $(subst $(space),:,$(vpath_dirs))
-
-#Primary Target: build the tool
-all: $(EXENAME)
-
-# Get list of files and build dependency file for all .o files
-# using perl scripts mkSrcfiles and mkDepends
-
-SOURCES := $(shell cat Srcfiles)
-
-OBJS := $(addsuffix .o, $(basename $(SOURCES)))
-
-# Set path to Mkdepends script; assumes that any Makefile including
-# this file is in a sibling of the src directory, in which Mkdepends
-# resides
-Mkdepends := ../src/Mkdepends
-
-$(CURDIR)/Depends: $(CURDIR)/Srcfiles $(CURDIR)/Filepath
- $(Mkdepends) Filepath Srcfiles > $@
-
-
-# Architecture-specific flags and rules
-#------------------------------------------------------------------------
-# AIX
-#------------------------------------------------------------------------
-
-ifeq ($(UNAMES),AIX)
-CPPDEF += -DAIX
-cpre = $(null)-WF,-D$(null)
-FPPFLAGS := $(patsubst -D%,$(cpre)%,$(CPPDEF))
-FFLAGS = -c -I$(INC_NETCDF) -q64 -qsuffix=f=f90 -qsuffix=f=f90:cpp=F90 \
- $(FPPFLAGS) -g -qfullpath -qarch=auto -qtune=auto -qsigtrap=xl__trcedump -qsclk=micro
-
-LDFLAGS = -L$(LIB_NETCDF) -q64 -lnetcdff -lnetcdf
-ifneq ($(OPT),TRUE)
- FFLAGS += -qinitauto=7FF7FFFF -qflttrap=ov:zero:inv:en -qspillsize=4000 -C
-else
- FFLAGS += -O2 -qmaxmem=-1 -Q
- LDFLAGS += -Q
-endif
-CFLAGS := -q64 -g $(CPPDEF) -O2
-FFLAGS += $(cpp_path)
-CFLAGS += $(cpp_path)
-
-ifeq ($(SMP),TRUE)
- FC = xlf90_r
- FFLAGS += -qsmp=omp
- LDFLAGS += -qsmp=omp
-else
- FC = xlf90
-endif
-
-endif
-
-#------------------------------------------------------------------------
-# Darwin
-#------------------------------------------------------------------------
-
-ifeq ($(UNAMES),Darwin)
-
-# Set the default Fortran compiler
-ifeq ($(USER_FC),$(null))
- FC := g95
-endif
-ifeq ($(USER_CC),$(null))
- CC := gcc
-endif
-
-CFLAGS := -g -O2
-CPPDEF += -DSYSDARWIN -DDarwin -DLINUX
-LDFLAGS :=
-
-ifeq ($(FC),g95)
-
- CPPDEF += -DG95
- FFLAGS := -c -fno-second-underscore $(CPPDEF) $(cpp_path) -I$(MOD_NETCDF)
- ifeq ($(OPT),TRUE)
- FFLAGS += -O2
- else
- FFLAGS += -g -fbounds-check
- endif
-
-endif
-
-ifeq ($(FC),gfortran)
-
- CPPDEF += -DG95
- FFLAGS := -c -fno-second-underscore $(CPPDEF) $(cpp_path) -I$(MOD_NETCDF) \
- -fno-range-check
- ifeq ($(OPT),TRUE)
- FFLAGS += -O2
- else
- FFLAGS += -g -fbounds-check
- endif
-
-endif
-
-ifeq ($(FC),ifort)
-
- CPPDEF += -DFORTRANUNDERSCORE
- FFLAGS += -c -ftz -g -fp-model precise $(CPPDEF) $(cpp_path) \
- -convert big_endian -assume byterecl -traceback -FR
- LDFLAGS += -m64
-
- ifneq ($(OPT),TRUE)
- FFLAGS += -CB -O0
- else
- FFLAGS += -O2
- endif
- ifeq ($(SMP),TRUE)
- FFLAGS += -qopenmp
- LDFLAGS += -qopenmp
- endif
-endif
-
-ifeq ($(FC),pgf90)
-
- CPPDEF += -DFORTRANUNDERSCORE
- FFLAGS += -c $(CPPDEF) $(cpp_path)
- ifneq ($(OPT),TRUE)
- FFLAGS += -g -Ktrap=fp -Mbounds -Kieee
- else
- FFLAGS += -fast -Kieee
- endif
-
- ifeq ($(SMP),TRUE)
- FFLAGS += -mp
- LDFLAGS += -mp
- endif
-
-endif
-
-ifeq ($(CC),icc)
- CFLAGS += -m64 -g
- ifeq ($(SMP),TRUE)
- CFLAGS += -qopenmp
- endif
-endif
-ifeq ($(CC),pgcc)
- CFLAGS += -g -fast
-endif
-
-CFLAGS += $(CPPDEF) $(cpp_path)
-LDFLAGS += -L$(LIB_NETCDF) -lnetcdf -lnetcdff
-
-endif
-
-#------------------------------------------------------------------------
-# Linux
-#------------------------------------------------------------------------
-
-ifeq ($(UNAMES),Linux)
- ifeq ($(USER_FC),$(null))
- FC := ifort
- FCTYP := ifort
- else
- ifeq ($(USER_FC),ftn)
- ifneq ($(USER_FCTYP),$(null))
- FCTYP := $(USER_FCTYP)
- else
- FCTYP := pgf90
- endif
- else
- FCTYP := $(USER_FC)
- endif
- endif
- CPPDEF += -DLINUX -DFORTRANUNDERSCORE
- CFLAGS := $(CPPDEF)
- LDFLAGS := $(shell $(LIB_NETCDF)/../bin/nf-config --flibs)
- FFLAGS =
-
- ifeq ($(FCTYP),pgf90)
- CC := pgcc
- ifneq ($(OPT),TRUE)
- FFLAGS += -g -Ktrap=fp -Mbounds -Kieee
- else
- FFLAGS += -fast -Kieee
- CFLAGS += -fast
- endif
-
- ifeq ($(SMP),TRUE)
- FFLAGS += -mp
- LDFLAGS += -mp
- endif
-
- endif
-
- ifeq ($(FCTYP),lf95)
- ifneq ($(OPT),TRUE)
- FFLAGS += -g --chk a,e,s,u -O0
- else
- FFLAGS += -O
- endif
- # Threading only works by putting thread memory on the heap rather than the stack
- # (--threadheap).
- # As of lf95 version 6.2 the thread stacksize limits are (still) too small to run
- # even small
- # resolution problems (FV at 10x15 res fails).
- ifeq ($(SMP),TRUE)
- FFLAGS += --openmp --threadheap 4096
- LDFLAGS += --openmp --threadheap 4096
- endif
- endif
- ifeq ($(FCTYP),pathf90)
- FFLAGS += -extend_source -ftpp -fno-second-underscore
- ifneq ($(OPT),TRUE)
- FFLAGS += -g -O0
- else
- FFLAGS += -O
- endif
- ifeq ($(SMP),TRUE)
- FFLAGS += -mp
- LDFLAGS += -mp
- endif
- endif
- ifeq ($(FCTYP),ifort)
-
- FFLAGS += -ftz -g -fp-model precise -convert big_endian -assume byterecl -traceback -FR
- CFLAGS += -m64 -g
- LDFLAGS += -m64
-
- ifneq ($(OPT),TRUE)
- FFLAGS += -CB -O0
- else
- FFLAGS += -O2
- endif
- ifeq ($(SMP),TRUE)
- FFLAGS += -qopenmp
- CFLAGS += -qopenmp
- LDFLAGS += -qopenmp
- endif
- endif
- FFLAGS += -c -I$(INC_NETCDF) $(CPPDEF) $(cpp_path)
- CFLAGS += $(cpp_path)
-endif
-
-#------------------------------------------------------------------------
-# Default rules and macros
-#------------------------------------------------------------------------
-
-.SUFFIXES:
-.SUFFIXES: .F90 .c .o
-
-# Set the vpath for all file types EXCEPT .o
-# We do this for individual file types rather than generally using
-# VPATH, because for .o files, we don't want to use files from a
-# different build (e.g., in building the unit tester, we don't want to
-# use .o files from the main build)
-vpath %.F90 $(vpath_dirs)
-vpath %.c $(vpath_dirs)
-vpath %.h $(vpath_dirs)
-
-# Append user defined compiler and load flags to Makefile defaults
-CFLAGS += $(USER_CFLAGS)
-FFLAGS += $(USER_FFLAGS)
-LDFLAGS += $(USER_LDFLAGS)
-
-# Set user specified linker
-ifneq ($(USER_LINKER),$(null))
- LINKER := $(USER_LINKER)
-else
- LINKER := $(FC)
-endif
-
-.F90.o:
- $(FC) $(FFLAGS) $<
-
-.c.o:
- $(CC) -c $(CFLAGS) $<
-
-
-$(EXENAME): $(OBJS)
- $(LINKER) -o $@ $(OBJS) $(LDFLAGS)
-
-clean:
- $(RM) -f $(OBJS) *.mod Depends
-
-include $(CURDIR)/Depends
diff --git a/tools/mkprocdata_map/src/Mkdepends b/tools/mkprocdata_map/src/Mkdepends
deleted file mode 100755
index a75e8fdde0..0000000000
--- a/tools/mkprocdata_map/src/Mkdepends
+++ /dev/null
@@ -1,327 +0,0 @@
-#!/usr/bin/env perl
-
-# Generate dependencies in a form suitable for inclusion into a Makefile.
-# The source filenames are provided in a file, one per line. Directories
-# to be searched for the source files and for their dependencies are provided
-# in another file, one per line. Output is written to STDOUT.
-#
-# For CPP type dependencies (lines beginning with #include) the dependency
-# search is recursive. Only dependencies that are found in the specified
-# directories are included. So, for example, the standard include file
-# stdio.h would not be included as a dependency unless /usr/include were
-# one of the specified directories to be searched.
-#
-# For Fortran module USE dependencies (lines beginning with a case
-# insensitive "USE", possibly preceded by whitespace) the Fortran compiler
-# must be able to access the .mod file associated with the .o file that
-# contains the module. In order to correctly generate these dependencies
-# two restrictions must be observed.
-# 1) All modules must be contained in files that have the same base name as
-# the module, in a case insensitive sense. This restriction implies that
-# there can only be one module per file.
-# 2) All modules that are to be contained in the dependency list must be
-# contained in one of the source files in the list provided on the command
-# line.
-# The reason for the second restriction is that since the makefile doesn't
-# contain rules to build .mod files the dependency takes the form of the .o
-# file that contains the module. If a module is being used for which the
-# source code is not available (e.g., a module from a library), then adding
-# a .o dependency for that module is a mistake because make will attempt to
-# build that .o file, and will fail if the source code is not available.
-#
-# Author: B. Eaton
-# Climate Modelling Section, NCAR
-# Feb 2001
-
-use Getopt::Std;
-use File::Basename;
-
-# Check for usage request.
-@ARGV >= 2 or usage();
-
-# Process command line.
-my %opt = ();
-getopts( "t:w", \%opt ) or usage();
-my $filepath_arg = shift() or usage();
-my $srcfile_arg = shift() or usage();
-@ARGV == 0 or usage(); # Check that all args were processed.
-
-my $obj_dir;
-if ( defined $opt{'t'} ) { $obj_dir = $opt{'t'}; }
-
-open(FILEPATH, $filepath_arg) or die "Can't open $filepath_arg: $!\n";
-open(SRCFILES, $srcfile_arg) or die "Can't open $srcfile_arg: $!\n";
-
-# Make list of paths to use when looking for files.
-# Prepend "." so search starts in current directory. This default is for
-# consistency with the way GNU Make searches for dependencies.
-my @file_paths = <FILEPATH>;
-close(FILEPATH);
-chomp @file_paths;
-unshift(@file_paths,'.');
-foreach $dir (@file_paths) { # (could check that directories exist here)
- $dir =~ s!/?\s*$!!; # remove / and any whitespace at end of directory name
- ($dir) = glob $dir; # Expand tildes in path names.
-}
-
-# Make list of files containing source code.
-my @src = <SRCFILES>;
-close(SRCFILES);
-chomp @src;
-
-# For each file that may contain a Fortran module (*.[fF]90 *.[fF]) convert the
-# file's basename to uppercase and use it as a hash key whose value is the file's
-# basename. This allows fast identification of the files that contain modules.
-# The only restriction is that the file's basename and the module name must match
-# in a case insensitive way.
-my %module_files = ();
-my ($f, $name, $path, $suffix, $mod);
-my @suffixes = ('\.[fF]90', '\.[fF]' );
-foreach $f (@src) {
- ($name, $path, $suffix) = fileparse($f, @suffixes);
- ($mod = $name) =~ tr/a-z/A-Z/;
- $module_files{$mod} = $name;
-}
-
-# Now make a list of .mod files in the file_paths. If a .o source dependency
-# can't be found based on the module_files list above, then maybe a .mod
-# module dependency can if the mod file is visible.
-my %trumod_files = ();
-my ($dir);
-my ($f, $name, $path, $suffix, $mod);
-my @suffixes = ('\.mod' );
-foreach $dir (@file_paths) {
- @filenames = (glob("$dir/*.mod"));
- foreach $f (@filenames) {
- ($name, $path, $suffix) = fileparse($f, @suffixes);
- ($mod = $name) =~ tr/a-z/A-Z/;
- $trumod_files{$mod} = $name;
- }
-}
-
-#print STDERR "\%module_files\n";
-#while ( ($k,$v) = each %module_files ) {
-# print STDERR "$k => $v\n";
-#}
-
-# Find module and include dependencies of the source files.
-my ($file_path, $rmods, $rincs);
-my %file_modules = ();
-my %file_includes = ();
-my @check_includes = ();
-foreach $f ( @src ) {
-
- # Find the file in the search path (@file_paths).
- unless ($file_path = find_file($f)) {
- if (defined $opt{'w'}) {print STDERR "$f not found\n";}
- next;
- }
-
- # Find the module and include dependencies.
- ($rmods, $rincs) = find_dependencies( $file_path );
-
- # Remove redundancies (a file can contain multiple procedures that have
- # the same dependencies).
- $file_modules{$f} = rm_duplicates($rmods);
- $file_includes{$f} = rm_duplicates($rincs);
-
- # Make a list of all include files.
- push @check_includes, @{$file_includes{$f}};
-}
-
-#print STDERR "\%file_modules\n";
-#while ( ($k,$v) = each %file_modules ) {
-# print STDERR "$k => @$v\n";
-#}
-#print STDERR "\%file_includes\n";
-#while ( ($k,$v) = each %file_includes ) {
-# print STDERR "$k => @$v\n";
-#}
-#print STDERR "\@check_includes\n";
-#print STDERR "@check_includes\n";
-
-# Find include file dependencies.
-my %include_depends = ();
-while (@check_includes) {
- $f = shift @check_includes;
- if (defined($include_depends{$f})) { next; }
-
- # Mark files not in path so they can be removed from the dependency list.
- unless ($file_path = find_file($f)) {
- $include_depends{$f} = -1;
- next;
- }
-
- # Find include file dependencies.
- ($rmods, $include_depends{$f}) = find_dependencies($file_path);
-
- # Add included include files to the back of the check_includes list so
- # that their dependencies can be found.
- push @check_includes, @{$include_depends{$f}};
-
- # Add included modules to the include_depends list.
- if ( @$rmods ) { push @{$include_depends{$f}}, @$rmods; }
-}
-
-#print STDERR "\%include_depends\n";
-#while ( ($k,$v) = each %include_depends ) {
-# print STDERR (ref $v ? "$k => @$v\n" : "$k => $v\n");
-#}
-
-# Remove include file dependencies that are not in the Filepath.
-my $i, $ii;
-foreach $f (keys %include_depends) {
-
- unless (ref $include_depends{$f}) { next; }
- $rincs = $include_depends{$f};
- unless (@$rincs) { next; }
- $ii = 0;
- $num_incs = @$rincs;
- for ($i = 0; $i < $num_incs; ++$i) {
- if ($include_depends{$$rincs[$ii]} == -1) {
- splice @$rincs, $ii, 1;
- next;
- }
- ++$ii;
- }
-}
-
-# Substitute the include file dependencies into the %file_includes lists.
-foreach $f (keys %file_includes) {
- my @expand_incs = ();
-
- # Initialize the expanded %file_includes list.
- my $i;
- unless (@{$file_includes{$f}}) { next; }
- foreach $i (@{$file_includes{$f}}) {
- push @expand_incs, $i unless ($include_depends{$i} == -1);
- }
- unless (@expand_incs) {
- $file_includes{$f} = [];
- next;
- }
-
- # Expand
- for ($i = 0; $i <= $#expand_incs; ++$i) {
- push @expand_incs, @{ $include_depends{$expand_incs[$i]} };
- }
-
- $file_includes{$f} = rm_duplicates(\@expand_incs);
-}
-
-#print STDERR "expanded \%file_includes\n";
-#while ( ($k,$v) = each %file_includes ) {
-# print STDERR "$k => @$v\n";
-#}
-
-# Print dependencies to STDOUT.
-foreach $f (sort keys %file_modules) {
- $f =~ /(.+)\./;
- $target = "$1.o";
- if ( defined $opt{'t'} ) { $target = "$opt{'t'}/$1.o"; }
- print "$target : $f @{$file_modules{$f}} @{$file_includes{$f}}\n";
-}
-
-#--------------------------------------------------------------------------------------
-
-sub find_dependencies {
-
- # Find dependencies of input file.
- # Use'd Fortran 90 modules are returned in \@mods.
- # Files that are "#include"d by the cpp preprocessor are returned in \@incs.
-
- my( $file ) = @_;
- my( @mods, @incs );
-
- open(FH, $file) or die "Can't open $file: $!\n";
-
- while ( <FH> ) {
- # Search for "#include" and strip filename when found.
- if ( /^#include\s+[<"](.*)[>"]/ ) {
- push @incs, $1;
- }
- # Search for Fortran include dependencies.
- elsif ( /^\s*include\s+['"](.*)['"]/ ) { #" for emacs fontlock
- push @incs, $1;
- }
- # Search for module dependencies.
- elsif ( /^\s*USE\s+(\w+)/i ) {
- ($module = $1) =~ tr/a-z/A-Z/;
- # Return dependency in the form of a .o version of the file that contains
- # the module. this is from the source list.
- if ( defined $module_files{$module} ) {
- if ( defined $obj_dir ) {
- push @mods, "$obj_dir/$module_files{$module}.o";
- } else {
- push @mods, "$module_files{$module}.o";
- }
- }
- # Return dependency in the form of a .mod version of the file that contains
- # the module. this is from the .mod list. only if .o version not found
- elsif ( defined $trumod_files{$module} ) {
- if ( defined $obj_dir ) {
- push @mods, "$obj_dir/$trumod_files{$module}.mod";
- } else {
- push @mods, "$trumod_files{$module}.mod";
- }
- }
- }
- }
- close( FH );
- return (\@mods, \@incs);
-}
-
-#--------------------------------------------------------------------------------------
-
-sub find_file {
-
-# Search for the specified file in the list of directories in the global
-# array @file_paths. Return the first occurrence found, or the null string if
-# the file is not found.
-
- my($file) = @_;
- my($dir, $fname);
-
- foreach $dir (@file_paths) {
- $fname = "$dir/$file";
- if ( -f $fname ) { return $fname; }
- }
- return ''; # file not found
-}
-
-#--------------------------------------------------------------------------------------
-
-sub rm_duplicates {
-
-# Return a list with duplicates removed.
-
- my ($in) = @_; # input array reference
- my @out = ();
- my $i;
- my %h = ();
- foreach $i (@$in) {
- $h{$i} = '';
- }
- @out = keys %h;
- return \@out;
-}
-
-#--------------------------------------------------------------------------------------
-
-sub usage {
- ($ProgName = $0) =~ s!.*/!!; # name of program
- use shr_kind_mod, only : r8 => shr_kind_r8
- implicit none
- save
-
- real(R8),parameter :: SHR_CONST_REARTH = 6.37122e6_R8 ! radius of earth ~ m
- real(r8),parameter :: re_km = SHR_CONST_REARTH*0.001 ! radius of earth (km)
-
-end module constMod
diff --git a/tools/mkprocdata_map/src/fileutils.F90 b/tools/mkprocdata_map/src/fileutils.F90
deleted file mode 100644
index e1f8e633da..0000000000
--- a/tools/mkprocdata_map/src/fileutils.F90
+++ /dev/null
@@ -1,282 +0,0 @@
-module fileutils
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: fileutils
-!
-! !DESCRIPTION:
-! Module containing file I/O utilities
-!
-! !USES:
-!
-! !PUBLIC TYPES:
- implicit none
- save
-!
-! !PUBLIC MEMBER FUNCTIONS:
- public :: get_filename !Returns filename given full pathname
- public :: opnfil !Open local unformatted or formatted file
- public :: getfil !Obtain local copy of file
- public :: relavu !Close and release Fortran unit no longer in use
- public :: getavu !Get next available Fortran unit number
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-!
-! !PRIVATE MEMBER FUNCTIONS: None
-!EOP
-!-----------------------------------------------------------------------
-
-contains
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: get_filename
-!
-! !INTERFACE:
- character(len=256) function get_filename (fulpath)
-!
-! !DESCRIPTION:
-! Returns filename given full pathname
-!
-! !ARGUMENTS:
- implicit none
- character(len=*), intent(in) :: fulpath !full pathname
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer i !loop index
- integer klen !length of fulpath character string
-!------------------------------------------------------------------------
-
- klen = len_trim(fulpath)
- do i = klen, 1, -1
- if (fulpath(i:i) == '/') go to 10
- end do
- i = 0
-10 get_filename = fulpath(i+1:klen)
-
- end function get_filename
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: set_filename
-!
-! !INTERFACE:
- character(len=256) function set_filename (rem_dir, loc_fn)
-!
-! !DESCRIPTION:
-!
-! !ARGUMENTS:
-!
- implicit none
- character(len=*), intent(in) :: rem_dir !remote directory
- character(len=*), intent(in) :: loc_fn !local full path filename
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: i !integer
-!------------------------------------------------------------------------
-
- set_filename = ' '
- do i = len_trim(loc_fn), 1, -1
- if (loc_fn(i:i)=='/') go to 10
- end do
- i = 0
-10 set_filename = trim(rem_dir) // loc_fn(i+1:len_trim(loc_fn))
-
- end function set_filename
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: getfil
-!
-! !INTERFACE:
- subroutine getfil (fulpath, locfn, iflag)
-!
-! !DESCRIPTION:
-! Obtain local copy of file
-! First check current working directory
-! Next check full pathname[fulpath] on disk
-! Finally check full pathname[fulpath] on archival system
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- character(len=*), intent(in) :: fulpath !Archival or permanent disk full pathname
- character(len=*), intent(out) :: locfn !output local file name
- integer, optional, intent(in) :: iflag !0=>abort if file not found 1=>do not abort
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer i !loop index
- integer klen !length of fulpath character string
- integer ierr !error status
- logical lexist !true if local file exists
- character(len=len(fulpath)+5) :: fulpath2 !Archival full pathname
-!------------------------------------------------------------------------
-
- ! get local file name from full name: start at end. look for first "/"
-
- klen = len_trim(fulpath)
- do i = klen, 1, -1
- if (fulpath(i:i).eq.'/') go to 100
- end do
- i = 0
-100 locfn = fulpath(i+1:klen)
- if (len_trim(locfn) == 0) then
- write(6,*)'(GETFIL): local filename has zero length'
- stop 1
- else
- write(6,*)'(GETFIL): attempting to find local file ',trim(locfn)
- endif
-
- ! first check if file is in current working directory.
-
- inquire (file=locfn,exist=lexist)
- if (lexist) then
- write(6,*) '(GETFIL): using ',trim(locfn),' in current working directory'
- RETURN
- endif
-
- ! second check for full pathname on disk
-
- inquire(file=fulpath, exist=lexist)
- if (lexist) then
- locfn = trim(fulpath)
- write(6,*) '(GETFIL): using ',trim(fulpath)
- RETURN
- else
- write(6,*) 'GETFIL: FAILED to get '//trim(fulpath)
- stop 1
- end if
-
- end subroutine getfil
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: opnfil
-!
-! !INTERFACE:
- subroutine opnfil (locfn, iun, form)
-!
-! !DESCRIPTION:
-! Open file locfn in unformatted or formatted form on unit iun
-!
-! !ARGUMENTS:
-!
- implicit none
- character(len=*), intent(in):: locfn !file name
- integer, intent(in):: iun !fortran unit number
- character(len=1), intent(in):: form !file format: u = unformatted,
- !f = formatted
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer ioe !error return from fortran open
- character(len=11) ft !format type: formatted, unformatted
-!------------------------------------------------------------------------
-
- if (len_trim(locfn) == 0) then
- write(6,*)'OPNFIL: local filename has zero length'
- stop 1
- endif
- if (form=='u' .or. form=='U') then
- ft = 'unformatted'
- else
- ft = 'formatted '
- end if
- open (unit=iun,file=locfn,status='unknown',form=ft,iostat=ioe)
- if (ioe /= 0) then
- write(6,*)'(OPNFIL): failed to open file ',trim(locfn), &
- & ' on unit ',iun,' ierr=',ioe
- stop 1
- else
- write(6,*)'(OPNFIL): Successfully opened file ',trim(locfn),' on unit= ',iun
- end if
-
- end subroutine opnfil
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: getavu
-!
-! !INTERFACE:
- integer function getavu()
-!
-! !DESCRIPTION:
-! Get next available Fortran unit number.
-!
-! !USES:
- use shr_file_mod, only : shr_file_getUnit
-!
-! !ARGUMENTS:
- implicit none
-!
-! !REVISION HISTORY:
-! Created by Gordon Bonan
-! Modified for clm2 by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
-!------------------------------------------------------------------------
-
- getavu = shr_file_getunit()
-
- end function getavu
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: relavu
-!
-! !INTERFACE:
- subroutine relavu (iunit)
-!
-! !DESCRIPTION:
-! Close and release Fortran unit no longer in use!
-!
-! !USES:
- use shr_file_mod, only : shr_file_freeUnit
-!
-! !ARGUMENTS:
- implicit none
- integer, intent(in) :: iunit !Fortran unit number
-!
-! !REVISION HISTORY:
-! Created by Gordon Bonan
-!
-!EOP
-!------------------------------------------------------------------------
-
- close(iunit)
- call shr_file_freeUnit(iunit)
-
- end subroutine relavu
-
-end module fileutils
diff --git a/tools/mkprocdata_map/src/fmain.F90 b/tools/mkprocdata_map/src/fmain.F90
deleted file mode 100644
index ba9e593c1d..0000000000
--- a/tools/mkprocdata_map/src/fmain.F90
+++ /dev/null
@@ -1,78 +0,0 @@
-program fmain
-
- use mkprocdata_map, only : mkmap
- implicit none
-
- character(len= 256) :: arg
- integer :: n !index
- integer :: nargs !number of arguments
- integer, external :: iargc !number of arguments function
- character(len=256) :: filei !input file
- character(len=256) :: fileo !output mapped file
- character(len=256) :: fmap !mapping file
- character(len=256) :: ftemplate !template file, containing lat & lon arrays desired in output file
- character(len=256) :: cmdline !input command line
- integer, parameter :: inival = -999 !initial value for command-line integers
- !----------------------------------------------------
-
- filei = ' '
- fileo = ' '
- fmap = ' '
- ftemplate = ' '
-
- cmdline = 'mkprocdata_map'
- nargs = iargc()
- n = 1
- do while (n <= nargs)
- arg = ' '
- call getarg (n, arg)
- n = n + 1
-
- select case (arg)
- case ('-i')
- call getarg (n, arg)
- n = n + 1
- filei = trim(arg)
- cmdline = trim(cmdline) // ' -i ' // trim(arg)
- case ('-o')
- call getarg (n, arg)
- n = n + 1
- fileo = trim(arg)
- cmdline = trim(cmdline) // ' -o ' // trim(arg)
- case ('-m')
- call getarg (n, arg)
- n = n + 1
- fmap = trim(arg)
- cmdline = trim(cmdline) // ' -m ' // trim(arg)
- case ('-t')
- call getarg (n, arg)
- n = n + 1
- ftemplate = trim(arg)
- cmdline = trim(cmdline) // ' -t ' // trim(arg)
- case default
- write (6,*) 'Argument ', arg,' is not known'
- call usage_exit (' ')
- cmdline = trim(cmdline) // ' ' // trim(arg)
- end select
- end do
-
- if (filei == ' ' .or. fileo == ' ' .or. fmap == ' ' &
- .or. ftemplate == ' ') then
- call usage_exit ('Must specify all the following arguments')
- end if
-
- call mkmap (filei, fileo, fmap, ftemplate)
-
-end program fmain
-
-
-subroutine usage_exit (arg)
- implicit none
- character(len=*) :: arg
- if (arg /= ' ') write (6,*) arg
-  write (6,*) 'Usage: mkprocdata_map -i <input file> -o <output file> -m <map file> -t <template file>'
- write (6,*)
- write (6,*) "The template file must contain the dimensions 'lat' and 'lon';"
- write (6,*) "these are used to determine the number of latitudes and longitudes in the output file"
- stop 1
-end subroutine
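The deleted driver consumes its arguments in flag/value pairs (`getarg` then advance the index), requiring all four flags. The same loop, sketched in Python for reference; the flag names match the removed program, the rest is illustrative:

```python
def parse_args(argv):
    """Parse '-i/-o/-m/-t value' pairs the way the removed fmain.F90 does."""
    opts = {"-i": "", "-o": "", "-m": "", "-t": ""}
    n = 0
    while n < len(argv):
        arg = argv[n]
        n += 1
        if arg in opts:
            # each flag consumes the next argument as its value
            opts[arg] = argv[n]
            n += 1
        else:
            raise SystemExit(f"Argument {arg} is not known")
    if any(v == "" for v in opts.values()):
        raise SystemExit("Must specify all the following arguments: -i -o -m -t")
    return opts
```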
diff --git a/tools/mkprocdata_map/src/gridmapMod.F90 b/tools/mkprocdata_map/src/gridmapMod.F90
deleted file mode 100644
index 376692cf0d..0000000000
--- a/tools/mkprocdata_map/src/gridmapMod.F90
+++ /dev/null
@@ -1,289 +0,0 @@
-module gridmapMod
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use fileutils, only : getfil
-
- implicit none
- private
- include 'netcdf.inc'
-
- type gridmap_type
- character(len=32) :: name
- integer :: na ! size of source domain
- integer :: nb ! size of destination domain
-     integer :: ni          ! number of rows in the matrix
-     integer :: nj          ! number of columns in the matrix
- integer :: ns ! number of non-zero elements in matrix
- real(r8), pointer :: yc_src(:) ! "degrees"
- real(r8), pointer :: yc_dst(:) ! "degrees"
- real(r8), pointer :: xc_src(:) ! "degrees"
- real(r8), pointer :: xc_dst(:) ! "degrees"
- integer , pointer :: mask_src(:) ! "unitless"
- integer , pointer :: mask_dst(:) ! "unitless"
- real(R8), pointer :: area_src(:) ! area of a grid in map (radians)
- real(R8), pointer :: area_dst(:) ! area of b grid in map (radians)
- real(r8), pointer :: frac_src(:) ! "unitless"
- real(r8), pointer :: frac_dst(:) ! "unitless"
-     integer , pointer :: src_indx(:) ! corresponding column index
-     integer , pointer :: dst_indx(:) ! corresponding row index
- real(r8), pointer :: wovr(:) ! wt of overlap input cell
- real(r8), pointer :: scalepft_i(:) ! PFT wt of overlap input cell
- end type gridmap_type
- public :: gridmap_type
-
- public :: gridmap_mapread
- public :: gridmap_areaave
- public :: gridmap_clean
-
- interface gridmap_areaave
- module procedure gridmap_areaave_default
- module procedure gridmap_areaave_mask
- end interface
-
- ! questions - how does the reverse mapping occur
- ! is mask_dst read in - and what happens if this is very different
- ! from frac_dst which is calculated by mapping frac_src?
- ! in frac - isn't grid1_frac always 1 or 0?
-
-contains
-
- subroutine gridmap_mapread(gridmap, fileName)
-
- !--- input/output parameters ---
- type(gridmap_type), intent(out) :: gridmap ! mapping data
- character(len=*) , intent(in) :: filename ! netCDF file to read
-
-
- !--- local ---
-    integer :: n          ! generic loop indices
- integer :: na ! size of source domain
- integer :: nb ! size of destination domain
- integer :: igrow ! aVect index for matrix row
- integer :: igcol ! aVect index for matrix column
- integer :: iwgt ! aVect index for matrix element
- integer :: iarea ! aVect index for area
-
-
- character,allocatable :: str(:) ! variable length char string
- character(len=256) :: attstr ! netCDF attribute name string
- integer :: rcode ! netCDF routine return code
- integer :: fid ! netCDF file ID
- integer :: vid ! netCDF variable ID
- integer :: did ! netCDF dimension ID
- integer :: ns ! size of array
- character(len=256) :: locfn
-
- !--- formats ---
- character(*),parameter :: subName = '(gridmap_map_read) '
- character(*),parameter :: F00 = '("(gridmap_map_read) ",4a)'
- character(*),parameter :: F01 = '("(gridmap_map_read) ",2(a,i7))'
-
- !-------------------------------------------------------------------------------
- !
- !-------------------------------------------------------------------------------
-
- write(6,F00) "reading mapping matrix data..."
-
- ! open & read the file
- write(6,F00) "* file name : ",trim(fileName)
-
- call getfil (trim(filename), locfn, 0)
- rcode = nf_open(locfn ,NF_NOWRITE, fid)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- !--- allocate memory & get matrix data ----------
- rcode = nf_inq_dimid (fid, 'n_s', did) ! size of sparse matrix
- rcode = nf_inq_dimlen(fid, did , gridmap%ns)
- rcode = nf_inq_dimid (fid, 'n_a', did) ! size of input vector
- rcode = nf_inq_dimlen(fid, did , gridmap%na)
- rcode = nf_inq_dimid (fid, 'n_b', did) ! size of output vector
- rcode = nf_inq_dimlen(fid, did , gridmap%nb)
-
- write(6,*) "* matrix dimensions rows x cols :",gridmap%na,' x',gridmap%nb
- write(6,*) "* number of non-zero elements: ",gridmap%ns
-
- ns = gridmap%ns
- na = gridmap%na
- nb = gridmap%nb
- allocate(gridmap%wovr(ns) , &
- gridmap%src_indx(ns), &
- gridmap%dst_indx(ns), &
- gridmap%mask_src(na), &
- gridmap%area_src(na), &
- gridmap%frac_src(na), &
- gridmap%area_dst(nb), &
- gridmap%frac_dst(nb), &
- gridmap%mask_dst(nb), &
- gridmap%xc_src(na), &
- gridmap%yc_src(na), stat=rcode)
- if (rcode /= 0) then
- write(6,*) SubName//' ERROR: allocate gridmap'
- stop 1
- endif
-
- rcode = nf_inq_varid(fid,'S' ,vid)
- rcode = nf_get_var_double(fid,vid ,gridmap%wovr)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'row',vid)
- rcode = nf_get_var_int(fid, vid ,gridmap%dst_indx)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'col',vid)
- rcode = nf_get_var_int(fid, vid, gridmap%src_indx)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'area_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%area_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'area_b',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%area_dst)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'frac_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%frac_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'frac_b',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%frac_dst)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'mask_a',vid)
- rcode = nf_get_var_int(fid, vid, gridmap%mask_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'mask_b',vid)
- rcode = nf_get_var_int(fid, vid, gridmap%mask_dst)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'xc_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%xc_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'yc_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%yc_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_close(fid)
-
- end subroutine gridmap_mapread
-
-!==========================================================================
-
- subroutine gridmap_areaave_default (gridmap, src_array, dst_array)
-
- !--- input/output parameters ---
- type(gridmap_type) , intent(in) :: gridmap ! gridmap data
- real(r8), intent(in) :: src_array(:)
- real(r8), intent(out):: dst_array(:)
-
- !--- local ---
- integer :: n,ns,ni,no
- real(r8):: wt,frac
- !----------------------------------------------------------
-
- dst_array = 0._r8
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- frac = gridmap%frac_dst(no)
- if (frac > 0.) then
- dst_array(no) = dst_array(no) + wt * src_array(ni)/frac
- end if
- end do
-
- end subroutine gridmap_areaave_default
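`gridmap_areaave_default` above is a sparse matrix-vector product in coordinate (row, col, weight) form, normalized by the destination land fraction. A minimal Python restatement of the same loop for reference (argument names mirror the Fortran fields; indices here are 0-based):

```python
def areaave_default(src, dst_size, src_indx, dst_indx, wovr, frac_dst):
    """Area-average src onto a destination grid using sparse
    (src_indx, dst_indx, wovr) triplets, dividing by the destination
    fraction, as in gridmap_areaave_default (0-based indices)."""
    dst = [0.0] * dst_size
    for ni, no, wt in zip(src_indx, dst_indx, wovr):
        frac = frac_dst[no]
        if frac > 0.0:
            # accumulate the weighted source contribution into its row
            dst[no] += wt * src[ni] / frac
    return dst
```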
-
-!==========================================================================
-
- subroutine gridmap_areaave_mask (gridmap, src_array, dst_array, src_mask, &
- spval)
-
- !--- input/output parameters ---
- type(gridmap_type) , intent(in) :: gridmap ! gridmap data
- real(r8), intent(in) :: src_array(:)
- real(r8), intent(out):: dst_array(:)
- real(r8), intent(in) :: src_mask(:)
- real(r8), intent(in), optional :: spval
-
- !--- local ---
- integer :: n,ns,ni,no
- real(r8):: wt
- real(r8), allocatable :: wtnorm(:)
- !----------------------------------------------------------
-
- ns = size(dst_array)
- allocate(wtnorm(ns))
-
- wtnorm(:) = 0._r8
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- if (src_mask(ni) > 0) then
- wtnorm(no) = wtnorm(no) + wt*src_mask(ni)
- end if
- end do
-
- do no = 1,ns
- if (wtnorm(no) > 0) then
- dst_array(no) = 0._r8
- else
- if (present(spval)) then
- dst_array(no) = spval
- else
- dst_array(no) = 1.e36
- end if
- end if
- end do
-
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- if (wtnorm(no) > 0) then
- dst_array(no) = dst_array(no) + wt*src_mask(ni)*src_array(ni)/wtnorm(no)
- end if
- end do
-
- deallocate(wtnorm)
-
- end subroutine gridmap_areaave_mask
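The masked variant first accumulates a per-destination normalization `wtnorm = sum(wt * src_mask)`, then averages only masked source cells, falling back to a fill value where nothing contributes. The same two-pass logic restated in Python (0-based indices; `spval` defaults as in the Fortran):

```python
def areaave_mask(src, dst_size, src_indx, dst_indx, wovr, src_mask, spval=1.0e36):
    """Masked area average as in gridmap_areaave_mask: effective weights are
    wt*src_mask, normalized per destination cell; destination cells with no
    valid source contribution get spval."""
    # pass 1: per-destination sum of masked weights
    wtnorm = [0.0] * dst_size
    for ni, no, wt in zip(src_indx, dst_indx, wovr):
        if src_mask[ni] > 0:
            wtnorm[no] += wt * src_mask[ni]
    # initialize: zero where valid data exists, fill value elsewhere
    dst = [0.0 if w > 0 else spval for w in wtnorm]
    # pass 2: normalized weighted accumulation
    for ni, no, wt in zip(src_indx, dst_indx, wovr):
        if wtnorm[no] > 0:
            dst[no] += wt * src_mask[ni] * src[ni] / wtnorm[no]
    return dst
```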
-
-!==========================================================================
-
- subroutine gridmap_clean(gridmap)
-
- !--- input/output parameters ---
- implicit none
- type(gridmap_type), intent(inout) :: gridmap
-
-
- !--- local ---
- character(len=*), parameter :: subName = "gridmap_clean"
- integer ier ! error flag
- !----------------------------------------------------------
-
- deallocate(gridmap%wovr , &
- gridmap%src_indx, &
- gridmap%dst_indx, &
- gridmap%mask_src, &
- gridmap%mask_dst, &
- gridmap%area_src, &
- gridmap%area_dst, &
- gridmap%frac_src, &
- gridmap%frac_dst, &
- gridmap%xc_src, &
- gridmap%yc_src, stat=ier)
- if (ier /= 0) then
- write(6,*) SubName//' ERROR: deallocate gridmap'
- stop 1
- endif
-
- end subroutine gridmap_clean
-
-end module gridmapMod
-
-
diff --git a/tools/mkprocdata_map/src/mkprocdata_map.F90 b/tools/mkprocdata_map/src/mkprocdata_map.F90
deleted file mode 100644
index 1d10671364..0000000000
--- a/tools/mkprocdata_map/src/mkprocdata_map.F90
+++ /dev/null
@@ -1,525 +0,0 @@
-module mkprocdata_map
-
-contains
-
-subroutine mkmap(filei, fileo, fmap, ftemplate)
-
- use netcdf
- use shr_kind_mod, only : r8 => shr_kind_r8, r4=>shr_kind_r4
- use fileutils
- use gridmapMod
-
- implicit none
- character(len=256), intent(in) :: filei ! input dataset
- character(len=256), intent(in) :: fileo ! output mapped dataset
- character(len=256), intent(in) :: fmap ! mapping file
- character(len=256), intent(in) :: ftemplate ! template file, containing lat & lon arrays desired in output file
-
-
- integer :: nDimensions ! number of dimensions defined for this netCDF dataset.
- integer :: nVariables ! number of variables defined for this netCDF dataset.
- integer :: output_n ! number of variables in the output file that are obtained by
- ! regridding variables from the input file
- integer :: nAttributes ! number of global attributes defined for this netCDF dataset.
- integer :: unlimitedDimID ! ID of the unlimited dimension, if there is one
- ! If no unlimited length dimension has been defined,
- ! -1 is returned.
- integer :: ier ! returned error code
- integer :: nlen
- integer :: nlon, nlat
- integer :: dimid
- integer :: dimid_lndgrid
- integer :: dimid_lon, dimid_lat
- integer :: ncidi, ncido, ncidt
- integer :: varidi, varido
- integer :: varid_area
- integer :: dimlen
- integer :: xtype
- integer :: ndimsi,ndimso
- integer :: dimidsi(4) ! dimension id array
- integer :: dimidso(4) ! dimension id array
- integer :: nAtts ! number of variable attributes
- integer :: n,nv,nt,nd,na ! indices
- integer :: attlen
-
- ! input_ids & output_ids: arrays defining mapping between input & output variables:
- integer, dimension(:), allocatable :: input_ids
- integer, dimension(:), allocatable :: output_ids
-
- character(len=256):: locfn
- character(len=128):: dimname
- character(len=128):: varname
- character(len=128):: attname
- character(len=128):: cattvalue
- real(r8) :: dattvalue
- real(r4) :: fattvalue
- integer :: iattvalue
- type(gridmap_type):: tgridmap
- logical :: mapvar
-
- !--------------------------------------------------------
- ! Read in mapping file - will have frac in it - and input and output domain
- !--------------------------------------------------------
-
- call getfil (fmap, locfn, 0)
-
- call handle_ncerr(nf90_open(locfn, NF90_NOWRITE, ncidi))
-
- call get_dimlen(ncidi, 'n_b', nlen)
-
- call handle_ncerr(nf90_close(ncidi))
- call gridmap_mapread(tgridmap, locfn)
-
- !--------------------------------------------------------
- ! Read in template file to get nlon & nlat
- !--------------------------------------------------------
-
- call getfil (ftemplate, locfn, 0)
-
- call handle_ncerr(nf90_open(locfn, NF90_NOWRITE, ncidt))
-
- call get_dimlen(ncidt, 'lon', nlon)
- call get_dimlen(ncidt, 'lat', nlat)
-
- call handle_ncerr(nf90_close(ncidt))
-
- write(6,*) 'nlon = ', nlon
- write(6,*) 'nlat = ', nlat
-
- if (nlon*nlat /= nlen) then
- write(6,*) 'must have nlon*nlat == nlen'
- write(6,*) 'nlon = ', nlon
- write(6,*) 'nlat = ', nlat
- write(6,*) 'nlen = ', nlen
- stop 1
- end if
-
- call getfil (filei, locfn, 0)
-
- !--------------------------------------------------------
- ! Create output file (put it in define mode)
- !--------------------------------------------------------
-
- call handle_ncerr(nf90_create(fileo, NF90_64BIT_OFFSET, ncido))
-
- !--------------------------------------------------------
- ! Define output dimensions - creating file puts it in define mode
- !--------------------------------------------------------
-
- call handle_ncerr(nf90_open(locfn, NF90_NOWRITE, ncidi))
- call handle_ncerr(nf90_inquire(ncidi, nDimensions, nVariables, &
- nAttributes, unlimitedDimId))
-
- do nd = 1,nDimensions
- ! Determine input dimensions
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimid=nd, name=dimname, len=dimlen))
-
- ! Define output variables
- ! Assume that input dimensions are time, lndgrid
- ! 2d lon,lat <=> 1d lndgrid
- if (dimname == 'time') then
- call handle_ncerr(nf90_def_dim(ncido, name=dimname, len=nf90_unlimited, dimid=dimid))
- else
- if (trim(dimname) == 'lndgrid') then
- dimid_lndgrid= nd
- call handle_ncerr(nf90_def_dim(ncido, name='lon', len=nlon, dimid=dimid_lon))
- call handle_ncerr(nf90_def_dim(ncido, name='lat', len=nlat, dimid=dimid_lat))
- else
- call handle_ncerr(nf90_def_dim(ncido, name=dimname, len=dimlen, dimid=dimid))
- end if
- end if
- write(6,*)'n = ',nd,' dimname= ',trim(dimname)
- end do
-
- !--------------------------------------------------------
- ! Define output variables
- !--------------------------------------------------------
-
- allocate(input_ids(nVariables), output_ids(nVariables))
-
- ! Loop over input variables
- output_n = 0
- do nv = 1,nVariables
-
- ! Determine input variable
- call handle_ncerr(nf90_Inquire_Variable(ncid=ncidi, varid=nv, natts=natts, &
- name=varname, ndims=ndimsi, dimids=dimidsi, xtype=xtype))
-
- if (ignore_var(varname)) then
- write(6,*)'skipping writing out variable ',trim(varname)
-
- else
- output_n = output_n + 1
-
- ! Determine output dimension ids
- if (dimidsi(1) == dimid_lndgrid) then
- ndimso = ndimsi + 1
- dimidso(1) = dimid_lon
- dimidso(2) = dimid_lat
- do n = 2,ndimsi
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(n), dimname))
- call handle_ncerr(nf90_inq_dimid(ncido, dimname, dimidso(n+1)))
- end do
- else
- ndimso = ndimsi
- do n = 1,ndimsi
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(n), dimname))
- call handle_ncerr(nf90_inq_dimid(ncido, dimname, dimidso(n)))
- end do
- end if
-
- ! Define output variable and attributes
- call handle_ncerr(nf90_def_var(ncido, name=varname, xtype=xtype, &
- dimids=dimidso(1:ndimso), varid=varido))
-
- input_ids(output_n) = nv
- output_ids(output_n) = varido
-
- do na = 1,natts
- call handle_ncerr(nf90_inq_attname(ncidi, varid=nv, attnum=na, name=attname))
- call handle_ncerr(nf90_inquire_attribute(ncidi, varid=nv, name=attname, &
- len=attlen, xtype=xtype))
- if (xtype == nf90_char) then
- call handle_ncerr(nf90_get_att(ncidi, varid=nv, name=attname, values=cattvalue))
- call handle_ncerr(nf90_put_att(ncido, varid=varido, name=attname, &
- values=cattvalue(1:attlen)))
- else if (xtype == nf90_double) then
- call handle_ncerr(nf90_get_att(ncidi, varid=nv, name=attname, values=dattvalue))
- call handle_ncerr(nf90_put_att(ncido, varid=varido, name=attname, values=dattvalue))
- else if (xtype == nf90_float) then
- call handle_ncerr(nf90_get_att(ncidi, varid=nv, name=attname, values=fattvalue))
- call handle_ncerr(nf90_put_att(ncido, varid=varido, name=attname, values=fattvalue))
- else if (xtype == nf90_int) then
- call handle_ncerr(nf90_get_att(ncidi, varid=nv, name=attname, values=iattvalue))
- call handle_ncerr(nf90_put_att(ncido, varid=varido, name=attname, values=iattvalue))
- end if
- end do
-
- end if ! .not. ignore_var
- end do
- ! now output_n is the number of output variables that are regridded from input variables
-
- call def_area(ncido, dimid_lon, dimid_lat, varid_area)
-
- ! End define mode
- call handle_ncerr(nf90_enddef(ncido))
-
- !--------------------------------------------------------
- ! Read in variables and write them out
- !--------------------------------------------------------
-
- do nv = 1,output_n
- call outputvar(input_ids(nv), output_ids(nv), nlon, nlat, ncidi, ncido, tgridmap)
- end do
-
- call write_area(ncido, varid_area, tgridmap, nlon, nlat)
-
- call handle_ncerr(nf90_close(ncidi))
- call handle_ncerr(nf90_close(ncido))
-
- deallocate(input_ids, output_ids)
-
-contains
-
- ! return true if we should ignore this variable, false otherwise
- ! (this allows us to ignore variables that aren't handled properly in the regridding)
- logical function ignore_var(varname)
- character(len=*), intent(in) :: varname
-
- ! We do NOT exclude 'area' because it may be useful to be able to refer to this in
- ! the regridded file. See also the 'gw' variable created through def_area /
- ! write_area for the actual grid cell areas of the output grid.
- ignore_var = (trim(varname) == 'date_written' .or. &
- trim(varname) == 'time_written' .or. &
- trim(varname) == 'lon' .or. &
- trim(varname) == 'lat' .or. &
- trim(varname) == 'landmask' .or. &
- trim(varname) == 'pftmask')
- end function ignore_var
-
- ! return true if we should weight this variable by landfrac, false otherwise
- ! note that most variables are weighted by landfrac, so this function just lists those
- ! variables that are NOT weighted by landfrac
- logical function weight_by_landfrac(varname)
- character(len=*), intent(in) :: varname
-
- weight_by_landfrac = .not. (trim(varname) == 'area' .or. &
- trim(varname) == 'landfrac')
- end function weight_by_landfrac
-
- ! Get the length of a dimension from an open netcdf file
- subroutine get_dimlen(ncid, dimname, len)
- use netcdf
-
- implicit none
-
- ! Subroutine arguments
- integer, intent(in) :: ncid ! ID of an open netcdf file
- character(len=*), intent(in) :: dimname ! name of dimension of interest
- integer, intent(out) :: len ! length of dimension
-
- ! Local variables
- integer :: dimid
- integer :: ier
-
- call handle_ncerr(nf90_inq_dimid(ncid, dimname, dimid))
- call handle_ncerr(nf90_inquire_dimension(ncid, dimid, len=len))
-
- end subroutine get_dimlen
-
-
- subroutine outputvar(varidi, varido, nlon, nlat, ncidi, ncido, tgridmap)
-
- integer, intent(in) :: varidi
- integer, intent(in) :: varido
- integer, intent(in) :: nlon
- integer, intent(in) :: nlat
- integer, intent(in) :: ncidi
- integer, intent(in) :: ncido
- type(gridmap_type), intent(inout) :: tgridmap
-
- integer :: len1,len2,len3,len4
- integer :: n1,n2,n3,n4
- integer :: dimidsi(4)
- integer :: dimidso(4)
- real(r8), allocatable :: rarrayi(:),rarrayo(:)
- character(len=128):: dimname
- character(len=128):: varname
- character(len=128):: attname
- integer :: varid_landfrac
- logical :: first_time = .true.
- real(r8), allocatable, save :: landfraci(:)
- real(r8), allocatable, save :: src_mask(:)
- integer :: vid
- real(r8):: spval = 1.e36
-
- ! Allocate landfraci and src_mask; determine landfrac on input file
- if (first_time) then
- dimname = 'lndgrid'
- call handle_ncerr(nf90_inq_dimid(ncidi, dimname, dimidsi(1)))
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimid=dimidsi(1), len=len1))
- allocate(landfraci(len1))
- allocate(src_mask(len1))
- call handle_ncerr(nf90_inq_varid(ncid=ncidi, name='landfrac', varid=varid_landfrac))
- call handle_ncerr(nf90_get_var(ncidi, varid_landfrac, landfraci))
- first_time = .false.
- end if
-
- call handle_ncerr(nf90_Inquire_Variable(ncid=ncidi, varid=varidi, &
- name=varname, ndims=ndimsi, dimids=dimidsi, xtype=xtype))
- write(6,*)'varidi = ',varidi,' varido = ', varido, ' varname= ',trim(varname),' ndimsi= ',ndimsi
-
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(1), &
- len=len1, name=dimname))
- allocate(rarrayi(len1))
- if (trim(dimname)=='lndgrid') then
- mapvar = .true.
- else
- mapvar = .false.
- end if
-
- call handle_ncerr(nf90_Inquire_Variable(ncid=ncido, varid=varido, &
- name=varname, ndims=ndimso, dimids=dimidso, xtype=xtype))
- if (mapvar) then
- call handle_ncerr(nf90_inquire_dimension(ncido, dimidso(1), &
- len=len1, name=dimname))
- call handle_ncerr(nf90_inquire_dimension(ncido, dimidso(2), &
- len=len2, name=dimname))
- allocate(rarrayo(len1*len2))
- else
- len1 = size(rarrayi)
- allocate(rarrayo(len1))
- end if
-
- ! src_mask will give the relative weight of each grid cell. i.e., if two source grid
- ! cells have the same area of overlap with a destination grid cell, then their
- ! relative weightings are given by their src_mask values.
- ! For most grid cells, this relative weighting is landfrac, but there are a few
- ! fields for which we do not want to weight by landfrac.
- ! Note that, for some fields (fields that are NA over certain landunits) we should
- ! really also be weighting by an additional factor saying the fraction of land area
- ! over which that field applies; but more metadata need to be added to the clm
- ! history files before that will be possible.
- if (mapvar .and. weight_by_landfrac(varname)) then
- src_mask(:) = landfraci(:)
- else
- ! note that, if we get here because mapvar is false, we currently don't use
- ! src_mask; but we set it to 1 anyway to be safe
- src_mask(:) = 1
- end if
-
- if (ndimsi == 1) then
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(1), len=len1))
- call handle_ncerr(nf90_get_var(ncidi, varidi, rarrayi))
- if (mapvar) then
- if (xtype == nf90_int) then
- where (rarrayi == -9999) src_mask = 0
- call gridmap_areaave(tgridmap, rarrayi, rarrayo, src_mask, &
- spval=-9999._r8)
- else
- where (rarrayi == spval) src_mask = 0
- call gridmap_areaave(tgridmap, rarrayi, rarrayo, src_mask)
- end if
- call handle_ncerr(nf90_put_var(ncido, varido, &
- reshape(rarrayo,(/nlon,nlat/))))
- else
- rarrayo(:)= rarrayi(:)
- call handle_ncerr(nf90_put_var(ncido, varido, rarrayo))
- end if
- end if
-
- if (ndimsi == 2) then
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(1), len=len1))
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(2), len=len2))
- do n2 = 1,len2
- call handle_ncerr(nf90_get_var(ncidi, varidi, rarrayi, &
- start=(/1,n2/), count=(/len1,1/)))
- if (mapvar) then
- if (xtype == nf90_int) then
- where (rarrayi == -9999) src_mask = 0
- call gridmap_areaave(tgridmap, rarrayi, rarrayo, src_mask, &
- spval=-9999._r8)
- else
- where (rarrayi == spval) src_mask = 0
- call gridmap_areaave(tgridmap, rarrayi, rarrayo, src_mask)
- end if
-
- call handle_ncerr(nf90_put_var(ncido, varido, &
- reshape(rarrayo,(/nlon,nlat/)), &
- start=(/1,1,n2/), count=(/nlon,nlat/)))
- else
- rarrayo(:)= rarrayi(:)
- call handle_ncerr(nf90_put_var(ncido, varido, rarrayo, &
- start=(/1,n2/), count=(/len1,1/)))
- end if
- end do
- end if
-
- if (ndimsi == 3) then
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(1), len=len1))
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(2), len=len2))
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(3), len=len3))
- do n2 = 1,len2
- do n3 = 1,len3
- call handle_ncerr(nf90_get_var(ncidi, varidi, rarrayi, &
- start=(/1,n2,n3/), count=(/len1,1,1/)))
- if (mapvar) then
- where (rarrayi == spval) src_mask = 0
- call gridmap_areaave(tgridmap, rarrayi, rarrayo, src_mask)
- call handle_ncerr(nf90_put_var(ncido, varido, &
- reshape(rarrayo,(/nlon,nlat/)), &
- start=(/1,1,n2,n3/), count=(/nlon,nlat/)))
- else
- rarrayo(:)= rarrayi(:)
- call handle_ncerr(nf90_put_var(ncido, varido, rarrayo, &
- start=(/1,n2,n3/), count=(/len1,1,1/)))
- end if
- end do
- end do
- end if
-
- if (ndimsi == 4) then
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(1), len=len1))
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(2), len=len2))
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(3), len=len3))
- call handle_ncerr(nf90_inquire_dimension(ncidi, dimidsi(4), len=len4))
- do n2 = 1,len2
- do n3 = 1,len3
- do n4 = 1,len4
- call handle_ncerr(nf90_get_var(ncidi, varidi, rarrayi, &
- start=(/1,n2,n3,n4/), count=(/len1,1,1,1/)))
- if (mapvar) then
- src_mask(:) = 0.
- where (rarrayi == spval) src_mask = spval
- call gridmap_areaave(tgridmap, rarrayi, rarrayo)
- call handle_ncerr(nf90_put_var(ncido, varido, &
- reshape(rarrayo,(/nlon,nlat/)), &
- start=(/ 1,1,n2,n3,n4 /), count=(/nlon,nlat,1,1,1/)))
- else
- rarrayo(:)= rarrayi(:)
- call handle_ncerr(nf90_put_var(ncido, varido, rarrayo, &
- start=(/1,n2,n3,n4/), count=(/len1,1,1,1/)))
- end if
- end do
- end do
- end do
- end if
-
- deallocate(rarrayi)
- deallocate(rarrayo)
-
- end subroutine outputvar
-
-
- ! Define the area variable 'gw' in the output file, return the variable id of this
- ! variable. This variable will give the actual grid cell areas of the output grid. The
- ! name ('gw') is consistent with the similar variable in the cam homme regridding
- ! routine.
- subroutine def_area(ncido, dimid_lon, dimid_lat, varid)
- implicit none
-
- ! Subroutine arguments
- integer, intent(in) :: ncido ! netcdf id of output netcdf file, in define mode
- integer, intent(in) :: dimid_lon ! id of lon dimension in the file given by ncido
- integer, intent(in) :: dimid_lat ! id of lat dimension in the file given by ncido
- integer, intent(out) :: varid ! variable id of the newly-created area variable
-
- call handle_ncerr(nf90_def_var(ncido, name='gw', xtype=NF90_FLOAT, &
- dimids=(/dimid_lon,dimid_lat/), varid=varid))
-
- call handle_ncerr(nf90_put_att(ncido, varid=varid, name='long_name', &
- values='grid cell areas'))
-
- call handle_ncerr(nf90_put_att(ncido, varid=varid, name='units', &
- values='km^2'))
-
- end subroutine def_area
-
- ! Write the area variable to the output file. This assumes that def_area has already
- ! been called, and varid is the variable id returned by that subroutine.
- subroutine write_area(ncido, varid, tgridmap, nlon, nlat)
- use constMod, only : re_km ! radius of earth (km)
-
- implicit none
-
- ! Subroutine arguments
- integer, intent(in) :: ncido ! netcdf id of output netcdf file, open for writing
- integer, intent(in) :: varid ! variable id of the area variable in which we'll put data
- type(gridmap_type), intent(in) :: tgridmap ! grid information; we're interested in
- ! the area_dst component, assumed to be in
- ! square radians
- integer, intent(in) :: nlon ! number of longitudes in the output map
- integer, intent(in) :: nlat ! number of latitudes in the output map
-
- ! Local variables
- real(r8), dimension(:,:), allocatable :: area_2d ! grid cell areas, on a 2-d array,
- ! in km^2
-
-
- allocate(area_2d(nlon,nlat))
- area_2d = reshape(tgridmap%area_dst,(/nlon,nlat/)) * re_km**2
-
- call handle_ncerr(nf90_put_var(ncido, varid, area_2d))
-
- deallocate(area_2d)
- end subroutine write_area
-
- subroutine handle_err(status)
- integer, intent(in) :: status
- if (status /= nf90_noerr) then
- write(6,*)trim(nf90_strerror(status))
- stop 1
- end if
- end subroutine handle_err
-
- subroutine handle_ncerr(ret)
- integer , intent(in) :: ret ! return code from netCDF library routine
- if ( ret .ne. NF90_NOERR ) then
- write(6,*) nf90_strerror( ret )
- stop 1
- endif
- end subroutine handle_ncerr
-
-end subroutine mkmap
-
-end module mkprocdata_map
diff --git a/tools/mkprocdata_map/src/nanMod.F90 b/tools/mkprocdata_map/src/nanMod.F90
deleted file mode 100644
index 0cbeeea112..0000000000
--- a/tools/mkprocdata_map/src/nanMod.F90
+++ /dev/null
@@ -1,41 +0,0 @@
-module nanMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: nanMod
-!
-! !DESCRIPTION:
-! Set parameters for the floating point flags "inf" Infinity
-! and "nan" not-a-number. As well as "bigint" the point
-! at which integers start to overflow. These values are used
-! to initialize arrays, as a way to detect whether arrays
-! are being used before being set.
-! Note that bigint is the largest possible 32-bit integer.
-!
-! !USES:
- use shr_kind_mod, only: r8 => shr_kind_r8
-!
-! !PUBLIC TYPES:
- implicit none
- save
-#ifdef __PGI
-! quiet nan for portland group compilers
- real(r8), parameter :: inf = O'0777600000000000000000'
- real(r8), parameter :: nan = O'0777700000000000000000'
- integer, parameter :: bigint = O'17777777777'
-#else
-! signaling nan otherwise
- real(r8), parameter :: inf = O'0777600000000000000000'
- real(r8), parameter :: nan = O'0777610000000000000000'
- integer, parameter :: bigint = O'17777777777'
-#endif
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein based on cam module created by
-! CCM core group
-!
-!EOP
-!-----------------------------------------------------------------------
-
-end module nanMod
diff --git a/tools/mkprocdata_map/src/shr_file_mod.F90 b/tools/mkprocdata_map/src/shr_file_mod.F90
deleted file mode 100644
index 3279f5ab08..0000000000
--- a/tools/mkprocdata_map/src/shr_file_mod.F90
+++ /dev/null
@@ -1,164 +0,0 @@
-!===============================================================================
-!BOP ===========================================================================
-!
-! !MODULE: shr_file_mod.F90 --- Module to handle various file utility functions.
-!
-! !DESCRIPTION:
-!
-! Miscellaneous methods to handle file and directory utilities as well as FORTRAN
-! unit control. Also put/get local files into/from archival location
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-MODULE shr_file_mod
-
-! !USES:
-
- use shr_kind_mod ! defines kinds
-
- IMPLICIT none
-
- PRIVATE ! By default everything is private to this module
-
-! !PUBLIC TYPES:
-
- ! no public types
-
-! !PUBLIC MEMBER FUNCTIONS:
-
- public :: shr_file_getUnit ! Get a logical unit for reading or writing
- public :: shr_file_freeUnit ! Free a logical unit
-
-! !PUBLIC DATA MEMBERS:
-
- ! Integer flags for recognized prefixes on file get/put operations
- !--- unit numbers, users can ask for unit numbers from 0 to min, but getUnit
- !--- won't give a unit below min, users cannot ask for unit number above max
-  !--- for backward compatibility.
- !--- eventually, recommend min as hard lower limit (tcraig, 9/2007)
- integer(SHR_KIND_IN),parameter :: shr_file_minUnit = 10 ! Min unit number to give
- integer(SHR_KIND_IN),parameter :: shr_file_maxUnit = 99 ! Max unit number to give
- logical, save :: UnitTag(0:shr_file_maxUnit) = .false. ! Logical units in use
-
-!===============================================================================
-CONTAINS
-!===============================================================================
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_getUnit -- Get a free FORTRAN unit number
-!
-! !DESCRIPTION: Get the next free FORTRAN unit number.
-!
-! !REVISION HISTORY:
-! 2005-Dec-14 - E. Kluzek - creation
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-INTEGER FUNCTION shr_file_getUnit ( unit )
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(in),optional :: unit ! desired unit number
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: n ! loop index
- logical :: opened ! If unit opened or not
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_file_getUnit) '
- character(*),parameter :: F00 = "('(shr_file_getUnit) ',A,I4,A)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (present (unit)) then
- inquire( unit, opened=opened )
- if (unit < 0 .or. unit > shr_file_maxUnit) then
- write(6,F00) 'invalid unit number request:', unit
- write(6,*) 'ERROR: bad input unit number'
- stop 1
- else if (opened .or. UnitTag(unit) .or. unit == 0 .or. unit == 5 &
- .or. unit == 6) then
- write(6,F00) 'unit number ', unit, ' is already in use'
- write(6,*)'ERROR: Input unit number already in use'
- stop 1
- else
- shr_file_getUnit = unit
- UnitTag (unit) = .true.
- return
- end if
-
- else
- ! --- Choose first available unit other than 0, 5, or 6 ------
- do n=shr_file_maxUnit, shr_file_minUnit, -1
- inquire( n, opened=opened )
- if (n == 5 .or. n == 6 .or. opened) then
- cycle
- end if
- if ( .not. UnitTag(n) ) then
- shr_file_getUnit = n
- UnitTag(n) = .true.
- return
- end if
- end do
- end if
-
- write(6,*) trim(subName),': Error: no available units found'
- stop 1
-
-END FUNCTION shr_file_getUnit
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_freeUnit -- Free up a FORTRAN unit number
-!
-! !DESCRIPTION: Free up the given unit number
-!
-! !REVISION HISTORY:
-! 2005-Dec-14 - E. Kluzek - creation
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_freeUnit ( unit)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(in) :: unit ! unit number to be freed
-
-!EOP
-
- !----- local -----
-
- !----- formats -----
- character(*), parameter :: subName = '(shr_file_freeUnit) '
- character(*), parameter :: F00 = "('(shr_file_freeUnit) ',A,I4,A)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (unit < 0 .or. unit > shr_file_maxUnit) then
- write(6,F00) 'invalid unit number request:', unit
- else if (unit == 0 .or. unit == 5 .or. unit == 6) then
- write(6,*) trim(subName),': Error: units 0, 5, and 6 must not be freed'
- else if (UnitTag(unit)) then
- UnitTag (unit) = .false.
- else
- write(6,F00) 'unit ', unit, ' was not in use'
- end if
-
- return
-
-END SUBROUTINE shr_file_freeUnit
-
-END MODULE shr_file_mod
diff --git a/tools/mkprocdata_map/src/shr_kind_mod.F90 b/tools/mkprocdata_map/src/shr_kind_mod.F90
deleted file mode 100644
index d1219223da..0000000000
--- a/tools/mkprocdata_map/src/shr_kind_mod.F90
+++ /dev/null
@@ -1,19 +0,0 @@
-!===============================================================================
-
-MODULE shr_kind_mod
-
- !----------------------------------------------------------------------------
- ! precision/kind constants add data public
- !----------------------------------------------------------------------------
- public
- integer,parameter :: SHR_KIND_R8 = selected_real_kind(12) ! 8 byte real
- integer,parameter :: SHR_KIND_R4 = selected_real_kind( 6) ! 4 byte real
- integer,parameter :: SHR_KIND_RN = kind(1.0) ! native real
- integer,parameter :: SHR_KIND_I8 = selected_int_kind (13) ! 8 byte integer
- integer,parameter :: SHR_KIND_I4 = selected_int_kind ( 6) ! 4 byte integer
- integer,parameter :: SHR_KIND_IN = kind(1) ! native integer
- integer,parameter :: SHR_KIND_CS = 80 ! short char
- integer,parameter :: SHR_KIND_CL = 256 ! long char
- integer,parameter :: SHR_KIND_CX = 512 ! extra-long char
-
-END MODULE shr_kind_mod
diff --git a/tools/mksurfdata_map/Makefile.data b/tools/mksurfdata_map/Makefile.data
deleted file mode 100644
index 373ff6e63d..0000000000
--- a/tools/mksurfdata_map/Makefile.data
+++ /dev/null
@@ -1,266 +0,0 @@
-# -*- mode:Makefile -*-
-#
-# To generate all surface data sets, run:
-# make -f Makefile.data all
-#
-# To generate a single dataset, run make with the name of the rule you
-# want to build. For example, to generate the crop data set for 1x1_numaIA:
-#
-# make -f Makefile.data crop-numa
-#
-# NOTE: The default behavior is to parallelize data set creation using
-# the batch system by submitting jobs to the batch queue (on cheyenne).
-# On yellowstone we submit to an interactive queue in the
-# background. Standard out and standard error are redirected to a text
-# file. To change this behavior, you can comment out the BATCHJOBS and
-# BACKGROUND variables and replace them with empty variables.
-#
-# WARNING: Do not put more than one mksurfdata call per rule. Output
-# redirection is based on the rule name, and multiple rules will
-# overwrite the previous output or incomprehensibly merge output from
-# simultaneously running jobs.
-#
-# Note that we typically use -no_surfdata in rules for transient files, having
-# separate rules to make the year-1850 and year-2000 surface datasets. This
-# results in extra processes, but ensures that the surface datasets have the
-# correct name (rather than having 'hist' or 'rcpXXX' in their file name).
-#
-
-# Set up special characters
-null :=
-
-# Set a few things needed for batch handling
-PROJECT = $(shell cat $(HOME)/.cesm_proj)
-LOGOUT = $@.stdout.txt
-PWD = $(shell pwd)
-
-# Setup batch handling for either cheyenne or yellowstone
-# Determine what to use by machine hostname
-BATCHJOBS_ys = execgy
-# Send to regular queue for 2 processors with extra memory, combine stdout/stderr output to log file, and send email on abort or exit
-BATCHJOBS_ch = qsub -A $(PROJECT) -q regular -l select=1:ncpus=2:mem=110GB -l walltime=4:00:00 -j oe -N $(LOGOUT) -m ae --
-HOST = $(shell hostname)
-FINDCH = $(findstring cheyenne,$(HOST))
-ifeq ($(FINDCH),$(null))
- ifeq ($(PROJECT),$(null))
- $(error Can NOT find PROJECT number from ~/.cesm_proj file; create it and try again)
- endif
- BATCHJOBS = $(BATCHJOBS_ys)
- BACKGROUND = &> $(LOGOUT) &
-else
- BATCHJOBS = $(BATCHJOBS_ch)
- BACKGROUND = -rundir $(PWD)
-endif
-
-MKSURFDATA = $(BATCHJOBS) $(PWD)/mksurfdata.pl
-
-# f19 and f09 are standard resolutions, f10 is used for testing, f45 is used for FATES
-# ne30np4 is standard resolution for SE dycore in CAM, C96 is standard for fv3 dycore
-# The ne30np4 series (including pg2, pg3, pg4) are standard for SE dycore
-# The variable resolution grids for ARCTIC, ARCTICGRIS and CONUS are also standard
-STANDARD_RES_NO_CROP = 0.9x1.25,1.9x2.5,10x15
-STANDARD_RES = 0.9x1.25,1.9x2.5,10x15,4x5,ne30np4,C96,ne30pg2,ne30pg3,ne30pg4,ne120np4pg3,ne0np4ARCTICGRISne30x8,ne0np4ARCTICne30x4,ne0np4CONUSne30x8
-
-# For future CMIP6 scenarios: SSP-RCP's
-FUTURE_RES = 0.9x1.25,1.9x2.5,10x15
-# For historical transient cases (TRY TO KEEP THIS LIST AS SHORT AS POSSIBLE)
-TRANS_RES = 0.9x1.25,1.9x2.5,10x15,ne30np4,ne0np4ARCTICGRISne30x8,ne0np4ARCTICne30x4,ne0np4CONUSne30x8
-
-# ne120np4 is for high resolution SE dycore, ne16 is for testing SE dycore
-# T42 is for SCAM
-# f05 is needed for running full chemistry model
-# nldas is for NWP working with WRF
-STANDARD = \
- global-present \
- global-present-nldas \
- global-present-T42
-
-TROPICS = \
- crop-tropics-present \
- crop-tropics-historical \
- crop-tropics-transient
-
-CROP = \
- crop-global-present \
- crop-global-present-ne16np4 \
- crop-global-present-ne120np4 \
- crop-numa-present \
- crop-numa-historical \
- crop-smallville \
- crop-smallville-historical \
- crop-global-historical \
- crop-global-transient \
- crop-global-future
-
-all : standard tropics crop urban landuse-timeseries
-
-DEBUG:
- @echo "HOST := $(HOST)"
- @echo "PROJECT := $(PROJECT)"
- @echo "BATCHJOBS := $(BATCHJOBS)"
- @echo "BACKGROUND := $(BACKGROUND)"
-#
-# standard
-#
-standard : $(STANDARD)
-
-global-present : FORCE
- $(MKSURFDATA) -no-crop -vic -glc_nec 10 -y 2000 -res $(STANDARD_RES_NO_CROP) $(BACKGROUND)
-
-# T42 is needed for SCAM
-global-present-T42 : FORCE
- $(MKSURFDATA) -no-crop -glc_nec 10 -y 2000 -res 64x128 $(BACKGROUND)
-
-global-present-nldas : FORCE
- $(MKSURFDATA) -no-crop -hirespft -glc_nec 10 -y 2005 -res 0.125nldas2 $(BACKGROUND)
-
-#
-# tropics
-#
-tropics : $(TROPICS)
-
-crop-tropics-present : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 2000 -res 5x5_amazon,1x1_brazil $(BACKGROUND)
-
-crop-tropics-historical : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 1850 -res 1x1_brazil $(BACKGROUND)
-
-crop-tropics-transient : FORCE
- $(MKSURFDATA) -glc_nec 10 -no_surfdata -y 1850-2000 -res 1x1_brazil $(BACKGROUND)
-
-#
-# crop
-#
-crop : $(CROP)
-
-crop-global-present : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 2000 -r $(STANDARD_RES) $(BACKGROUND)
-
-crop-global-present-0.125 : FORCE
- $(MKSURFDATA) -hirespft -glc_nec 10 -y 2000 -r 0.125x0.125 $(BACKGROUND)
-
-crop-global-present-f05 : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 1850,2000 -res 0.47x0.63 $(BACKGROUND)
-
-crop-numa-present : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 2000 -r 1x1_numaIA $(BACKGROUND)
-
-crop-numa-historical : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 1850 -r 1x1_numaIA $(BACKGROUND)
-
-crop-smallville : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 2000 -r 1x1_smallvilleIA \
- -pft_idx 17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78 \
- -pft_frc 6.5,1.5,1.6,1.7,1.8,1.9,1.5,1.6,1.7,1.8,1.9,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5 \
- $(BACKGROUND)
-
-crop-global-present-ne16np4 : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 2000 -res ne16np4 $(BACKGROUND)
-
-crop-global-present-ne120np4 : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 2000 -res ne120np4 $(BACKGROUND)
-
-# Note that the smallville 1850 dataset is entirely natural vegetation. This
-# facilitates testing a transient case that starts with no crop, and then later
-# adds crop (to make sure that it works properly to add crop in a grid cell
-# where there used to be no crop).
-crop-smallville-historical : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 1850 -r 1x1_smallvilleIA -pft_idx 13 -pft_frc 100 $(BACKGROUND)
-
-# Setup the historical case for SSP5-8.5 so that historical can be used to go into the future.
-crop-global-historical : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 1850 -ssp_rcp SSP5-8.5 -res $(STANDARD_RES) $(BACKGROUND)
-
-crop-global-historical-f05 : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 1850 -r 0.47x0.63 $(BACKGROUND)
-
-crop-global-historical-ne120np4 : FORCE
- $(MKSURFDATA) -glc_nec 10 -y 1850 -res ne120np4 $(BACKGROUND)
-
-crop-global-transient: FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2000 -res $(TRANS_RES) $(BACKGROUND)
-
-crop-global-transient-ne120np4 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2000 -res ne120np4 $(BACKGROUND)
-
-crop-global-transient-f05 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2000 -res 0.47x0.63 $(BACKGROUND)
-
-#
-# Crop with future scenarios
-#
-crop-global-future : crop-global-SSP1-2.6 crop-global-SSP3-7.0 crop-global-SSP5-3.4 crop-global-SSP2-4.5 \
- crop-global-SSP1-1.9 crop-global-SSP4-3.4 crop-global-SSP4-6.0 crop-global-SSP5-8.5
-
-crop-global-SSP1-2.6 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP1-2.6 -res $(FUTURE_RES) $(BACKGROUND)
-
-crop-global-SSP3-7.0 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP3-7.0 -res $(FUTURE_RES) $(BACKGROUND)
-
-crop-global-SSP5-3.4 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP5-3.4 -res $(FUTURE_RES) $(BACKGROUND)
-
-crop-global-SSP2-4.5 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP2-4.5 -res $(FUTURE_RES) $(BACKGROUND)
-
-crop-global-SSP1-1.9 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP1-1.9 -res $(FUTURE_RES) $(BACKGROUND)
-
-crop-global-SSP4-3.4 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP4-3.4 -res $(FUTURE_RES) $(BACKGROUND)
-
-crop-global-SSP4-6.0 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP4-6.0 -res $(FUTURE_RES) $(BACKGROUND)
-
-crop-global-SSP5-8.5 : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-2100 \
- -ssp_rcp SSP5-8.5 -res $(FUTURE_RES) $(BACKGROUND)
-
-#
-# urban
-#
-urban : urban-present urban-alpha
-
-urban-present : FORCE
- $(MKSURFDATA) -y 2000 -no-crop -glc_nec 10 -r 1x1_camdenNJ,1x1_vancouverCAN,1x1_mexicocityMEX $(BACKGROUND)
-
-# NOTE(bja, 2015-01) skip abort on invalid data necessary as of 2015-01. See
-# /glade/p/cesm/cseg/inputdata/lnd/clm2/surfdata_map/README_c141219
-urban-alpha : FORCE
- $(MKSURFDATA) -y 2000 -no-crop -glc_nec 10 -r 1x1_urbanc_alpha -urban_skip_abort_on_invalid_data_check $(BACKGROUND)
-
-
-#
-# landuse timeseries
-#
-landuse-timeseries : landuse-timeseries-smallville
-
-landuse-timeseries-smallville : FORCE
- $(MKSURFDATA) -no_surfdata -glc_nec 10 -y 1850-1855 -r 1x1_smallvilleIA \
- -pft_idx 17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78 \
- -pft_frc 6.5,1.5,1.6,1.7,1.8,1.9,1.5,1.6,1.7,1.8,1.9,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5,1.5 \
- -dynpft single_point_dynpft_files/landuse_timeseries_smallvilleIA_hist_simyr1850-1855.txt \
- $(BACKGROUND)
-
-#
-# clean up the working directory by removing generated files
-#
-clean : FORCE
- -rm *~
-
-clobber : clean
- -rm surfdata_*.nc surfdata_*.log surfdata_*.namelist
-
-#
-# generic rule to force things to happen
-#
-FORCE :
-
diff --git a/tools/mksurfdata_map/README b/tools/mksurfdata_map/README
deleted file mode 100644
index bd324580bb..0000000000
--- a/tools/mksurfdata_map/README
+++ /dev/null
@@ -1,73 +0,0 @@
-$CTSMROOT/tools/mksurfdata_map/README Jun 08, 2018
-
-The routines in this directory create a surface dataset.
-The output grid is read in from the input namelist and
-can correspond to either a global or regional grid.
-
-Supported model resolutions are those found in the repository input data directory
- $DIN_LOC_ROOT/lnd/clm2/mappingdata/maps
-
-Surface datasets can either be created for two separate cases
- a) for supported model resolutions
- b) for unsupported (user-specified) model resolutions
-
-The following steps provide a method to create the executable
-and generate the surface dataset:
-
-1) Make the mksurfdata_map executable
-
- Starting from this directory $CTSMROOT/tools/mksurfdata_map
- > cd src
- > gmake
- By default the code compiles optimized so it's reasonably fast. If you want
- to use the debugger, with bounds-checking and float trapping on, do the
- following:
- gmake OPT=FALSE
- See also the $CTSMROOT/tools/README file for notes about setting
- the path for NetCDF and running with shared-memory parallelism.
-
-2) For supported model resolutions - skip this step
-
- For unsupported model resolutions - do the following...
- determine the pathname of the model resolution SCRIP grid file
-
- Starting from this directory $CTSMROOT/tools/mksurfdata_map
- > cd ../mkmapdata
- invoke one of the following commands
- (for global resolution)
- > ./mkmapdata.sh -f <scripgridfile> -res <resolution> -type global
- (for regional resolution)
- > ./mkmapdata.sh -f <scripgridfile> -res <resolution> -type regional
- > cd ../
-
- note: the mapping files generated in ./mkmapdata will be used to
- generate the surface dataset
- note: the res argument above (<resolution>) MUST be identical to the one provided to
- mksurfdata.pl (see below)
-
-3) make surface dataset(s)
-
- Starting from this directory $CTSMROOT/tools/mksurfdata_map
- > mksurfdata.pl --help (for full usage instructions)
- For a supported model resolution (<resolution>)
- > mksurfdata.pl -res <resolution> [options]
-
- For supported model resolutions for SSP scenarios
- > mksurfdata.pl -res <resolution> -ssp_rcp <scenario> -years 1850-2100
-
- For unsupported, user specified model resolutions
- > mksurfdata.pl -res usrspec -usr_gname <user_gridname> -usr_gdate <user_griddate>
-
- Note that the argument to usr_gname MUST be the same as the -res argument value
- when invoking mkmapdata
-
- Example, for gridname=1x1_boulderCO with maps created on Jan/13/2012
-
- > mksurfdata.pl -res usrspec -usr_gname 1x1_boulderCO -usr_gdate 20120113
-
-Lists of input files for a range of dates for historical or future scenarios:
-
- landuse_timeseries_hist_16pfts_simyr1850-2015.txt --- List of historical input PFT files from 1850 to 2015
-
-(For the historical period, the 1850-2015 data files all point to the historical files.)
-
diff --git a/tools/mksurfdata_map/README.developers b/tools/mksurfdata_map/README.developers
deleted file mode 100644
index 6513aeb131..0000000000
--- a/tools/mksurfdata_map/README.developers
+++ /dev/null
@@ -1,227 +0,0 @@
-$CTSMROOT/tools/mksurfdata_map/README.developers Jun/08/2018
-
-============================================================================
-============================================================================
-Developer's guide for mksurfdata_map
-============================================================================
-============================================================================
-
-============================================================================
-Table of Contents
-============================================================================
-
-I. Adding a new raw data file
-
-II. Adding mapping files for a raw data file with a new grid / landmask
-
-III. Checks that should be done when making new surface datasets
-
-============================================================================
-I. Adding a new raw data file
-============================================================================
-
-Here is what you need to change when adding a new raw data file, with one or
-more fields that need to be remapped to the CLM resolution. Note that
-additional steps are needed (beyond what's listed here) when the field
-you're adding specifies something about the subgrid breakdown (into
-landunits, columns & pfts): for those fields, additional code is needed to
-ensure that percentages add to 100%.
-
-Note: The following assumes that the new file uses an existing set of
-mapping files, or that you have already done everything necessary to add a
-new set of mapping files. If your raw data file has a new grid, or a new
-landmask on an existing grid, see the instructions for adding mapping files
-in a separate section of this document.
-
-- Add a new module in $CTSMROOT/tools/mksurfdata_map/src that provides a routine for
- remapping your new field(s).
-
- Note that there is generally one module per input file; multiple fields
- can be regridded using a single subroutine or multiple subroutines.
-
-- Add your new file in $CTSMROOT/tools/mksurfdata_map/src/Srcfiles
-
-- Add new namelist options in $CTSMROOT/tools/mksurfdata_map/src/mkvarctl.F90; e.g., for a
- new field xxx:
- - mksrf_fxxx
- - map_fxxx
-
-- Add output calls in $CTSMROOT/tools/mksurfdata_map/src/mkfileMod.F90; you need to add
- calls in 3 places:
- - raw data file name attribute (nf_put_att_text call)
- - mapping file name attribute (nf_put_att_text call)
- - variable definition (ncd_defvar calls)
- Make sure this goes in an 'if (.not. dynlanduse)' conditional, if
- appropriate
-
-- Add code in $CTSMROOT/tools/mksurfdata_map/src/mksurfdat.F90; you need to add the
- following:
- - add a 'use' statement to use your new regridding routine(s)
- - declare array(s) to hold data on the output grid
- - add your new mksrf_fxxx and map_fxxx variables to the 'namelist
- /clmexp/' declaration
- - document your new mksrf_fxxx and map_fxxx variables in the long
- comment following the 'namelist /clmexp/' declaration
- - add your new array(s) to the allocation statement under the heading
- "Allocate and initialize dynamic memory"
- - initialize your new array(s) in the initialization section following
- the allocation
- - add output to ndiag (the log file) documenting your new mksrf_fxxx and
- map_fxxx variables
- - add call(s) to your new subroutine(s) for regridding the data
- - add calls to nf_inq_varid & nf_put_var_double (or nf_put_var_int) for
- each new output variable; make sure to put these calls in the section
- for dynlanduse = false and/or true, as appropriate
- - add a deallocation statement for each new output variable
-
-- Add your new file in $CTSMROOT/bld/namelist_files/namelist_definition_ctsm.xml;
- e.g. (replace xxx with your new field):
-
-
- XXX dataset for mksurfdata
-
-
-- Add your new mksrf_fxxx variable to the list of valid_values for
- mksrf_filename in $CTSMROOT/bld/namelist_files/namelist_definition_ctsm.xml
-
-- Add defaults in $CTSMROOT/bld/namelist_files/namelist_defaults_ctsm_tools.xml;
- note that the "type" attribute is a short code that can be used in
- mksurfdata.pl, and doesn't have to match the "xxx" that is used elsewhere
- - lmask
- - hgrid
- - mksrf_filename
- - mksrf_fxxx (including hgrid and lmask attributes)
-
-- Add hooks to your new files in $CTSMROOT/tools/mksurfdata_map/mksurfdata.pl:
- - add new string in the list following 'foreach my $typ'
- - add the new mapping file to clmexp, as in:
- map_fxxx = '$map{'xxx'}'
- - add the new raw data file to clmexp, as in:
- mksrf_fxxx = '$datfil{'xxx'}'
-
-- Add new raw data file to the inputdata repository: lnd/clm2/rawdata
- - locally
- - check in to the inputdata svn repository
-
-- Add documentation for your new mksrf_fxxx in $CTSMROOT/doc/UsersGuide/tools.xml
-
-============================================================================
-II. Adding mapping files for a raw data file with a new grid / landmask
-============================================================================
-
-If your raw data file is on a new grid, or just has a new landmask on an
-existing grid, you will need to perform a number of additional steps, as
-laid out here.
-
-- First, move your data file to the inputdata directory and give it its
- final name. (This will ensure that the appropriate metadata is put in the
- SCRIP grid file.)
-
-- Make a scrip grid file from your data file using mkmapgrids, and move it
- to the inputdata directory
-
-- Add a scripgriddata entry for the new scrip grid file in
- $CTSMROOT/bld/namelist_files/namelist_defaults_ctsm_tools.xml
-
-- If necessary, add other entries in
- $CTSMROOT/bld/namelist_files/namelist_defaults_ctsm_tools.xml giving information about your
- scrip grid file:
- - If this is a high resolution grid (e.g., 3min or higher), add a
- scripgriddata_lrgfile_needed entry, saying we need 64bit_offset
- (or netcdf4) support for mapping files made with this scrip grid
- file
- - If the grid file is in UGRID format rather than SCRIP grid
- format, add scripgriddata_type and scripgriddata_meshname
- entries. If you don't know what I'm talking about, then your
- grid file is in SCRIP format and you can ignore this.
-
-- If necessary, add new grid and/or landmask to lists of valid values for
- hgrid, res and lmask in $CTSMROOT/bld/namelist_files/namelist_definition_ctsm.xml
- - Note that a new resolution currently needs to be added to both the hgrid
- and res lists of valid values, although in the future this
- should probably be changed so that these raw data grids just
- appear in hgrid
-
-- Add the new grid-landmask combo to the 'mapgrids' list in
- $CTSMROOT/bld/namelist_files/checkmapfiles.ncl
-
-- Add the new grid-landmask combo to the 'grids' list in
- $CTSMROOT/tools/shared/mkmapdata/mkmapdata.sh (in the clm4_5 branch of the
- conditional)
-
-- Make mapping files, from $CTSMROOT/tools/shared/mkmapdata
- - Modify mkmapdata.sh:
- - edit the grids list so it only contains your new grid
- - Modify regridbatch.sh as desired, e.g.:
- - project number
- - number of processors (BSUB -n line, span, and the regrid_num_proc setting)
- - wall-clock limit
- - if ESMFBIN_PATH is in your environment, you may want to unset it;
- this can be important to allow mkmapdata.sh to choose a different
- executable for MPI vs. serial runs
- - if you renamed the mkmapdata.sh script, be sure to call the
- renamed script at the bottom of regridbatch.sh
- - Submit regridbatch.sh
-
-- When mapping files have all been created, run createXMLEntries.pl from
- $CTSMROOT/tools/shared/mkmapdata (usage: just run the script with no arguments)
-
-- Cut and paste the xml entries from mapping_entries.txt (created by
- createXMLEntries.pl) into $CTSMROOT/bld/namelist_files/namelist_defaults_ctsm.xml,
- in the correct locations
-
-- Move mapping files to correct location, either using mv_cmds.sh created by
- createXMLEntries.pl, or using $CTSMROOT/tools/shared/mkmapdata/mvNimport.sh.
- - Note that the latter also imports to the inputdata directory; if you
- don't use that, you'll need to add the files to the inputdata
- directory yourself
-
-
-============================================================================
-III. Checks that should be done when making new surface datasets
-============================================================================
-
-Remaking all surface datasets carries the risk of introducing unintended
-changes, particularly when you are expecting answer changes (so you
-don't notice unintended answer changes that are mixed with the expected
-changes).
-
-Here are some things to check after making a new set of surface
-datasets:
-
-- For at least one global dataset (probably a production resolution
- rather than a low resolution that is just used for testing): Compare
- the new dataset against the previous version:
-
- - Compare header (via ncdump -h) and/or log file: ensure that the same
- source data were used, except where you expect differences
-
- - Compare all fields with a tool like cprnc: make sure that the only
- fields that differ are those you expect to differ
-
- - Visually compare all fields that differ: make sure differences look
- reasonable and as expected
-
-And here are some things to check for when making new landuse.timeseries
-datasets (which often happens at the same time, and most of the above applies
-as well):
-
-- Compare one of the production resolution datasets to a previous version.
-
- - If part of it should be identical (for example the historical period) make
- sure it is identical as expected (using cprnc make sure the historical period
- is identical and only the future scenario changes).
-
- - If the historical period should be identical, make sure the 1850 surface dataset
- created is identical to the previous one.
-
- - Visually compare all fields/times that differ: make sure differences look
- reasonable and as expected. Go through at least the first and last time to see
- that the change in time is as expected.
-
- - Quickly going through the time differences for at least one field that changes
- can also be useful to see that there isn't a sudden jump for a particular time.
-
- - Go through the list of raw PFT files that were used to create the dataset and make
- sure it appears to be correct (ncdump -v input_pftdata_filename)
diff --git a/tools/mksurfdata_map/landuse_timeseries_hist_78pfts_simyr1850-2015.txt b/tools/mksurfdata_map/landuse_timeseries_hist_78pfts_simyr1850-2015.txt
deleted file mode 100644
index 3c622f3965..0000000000
--- a/tools/mksurfdata_map/landuse_timeseries_hist_78pfts_simyr1850-2015.txt
+++ /dev/null
@@ -1,332 +0,0 @@
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1850.c170412.nc 1850
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1850.c170412.nc 1850
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1851.c170412.nc 1851
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1851.c170412.nc 1851
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1852.c170412.nc 1852
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1852.c170412.nc 1852
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1853.c170412.nc 1853
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1853.c170412.nc 1853
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1854.c170412.nc 1854
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1854.c170412.nc 1854
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1855.c170412.nc 1855
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1855.c170412.nc 1855
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1856.c170412.nc 1856
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1856.c170412.nc 1856
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1857.c170412.nc 1857
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1857.c170412.nc 1857
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1858.c170412.nc 1858
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1858.c170412.nc 1858
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1859.c170412.nc 1859
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1859.c170412.nc 1859
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1860.c170412.nc 1860
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1860.c170412.nc 1860
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1861.c170412.nc 1861
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1861.c170412.nc 1861
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1862.c170412.nc 1862
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1862.c170412.nc 1862
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1863.c170412.nc 1863
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1863.c170412.nc 1863
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1864.c170412.nc 1864
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1864.c170412.nc 1864
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1865.c170412.nc 1865
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1865.c170412.nc 1865
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1866.c170412.nc 1866
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1866.c170412.nc 1866
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1867.c170412.nc 1867
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1867.c170412.nc 1867
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1868.c170412.nc 1868
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1868.c170412.nc 1868
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1869.c170412.nc 1869
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1869.c170412.nc 1869
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1870.c170412.nc 1870
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1870.c170412.nc 1870
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1871.c170412.nc 1871
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1871.c170412.nc 1871
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1872.c170412.nc 1872
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1872.c170412.nc 1872
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1873.c170412.nc 1873
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1873.c170412.nc 1873
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1874.c170412.nc 1874
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1874.c170412.nc 1874
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1875.c170412.nc 1875
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1875.c170412.nc 1875
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1876.c170412.nc 1876
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1876.c170412.nc 1876
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1877.c170412.nc 1877
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1877.c170412.nc 1877
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1878.c170412.nc 1878
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1878.c170412.nc 1878
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1879.c170412.nc 1879
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1879.c170412.nc 1879
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1880.c170412.nc 1880
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1880.c170412.nc 1880
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1881.c170412.nc 1881
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1881.c170412.nc 1881
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1882.c170412.nc 1882
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1882.c170412.nc 1882
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1883.c170412.nc 1883
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1883.c170412.nc 1883
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1884.c170412.nc 1884
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1884.c170412.nc 1884
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1885.c170412.nc 1885
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1885.c170412.nc 1885
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1886.c170412.nc 1886
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1886.c170412.nc 1886
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1887.c170412.nc 1887
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1887.c170412.nc 1887
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1888.c170412.nc 1888
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1888.c170412.nc 1888
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1889.c170412.nc 1889
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1889.c170412.nc 1889
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1890.c170412.nc 1890
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1890.c170412.nc 1890
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1891.c170412.nc 1891
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1891.c170412.nc 1891
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1892.c170412.nc 1892
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1892.c170412.nc 1892
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1893.c170412.nc 1893
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1893.c170412.nc 1893
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1894.c170412.nc 1894
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1894.c170412.nc 1894
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1895.c170412.nc 1895
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1895.c170412.nc 1895
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1896.c170412.nc 1896
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1896.c170412.nc 1896
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1897.c170412.nc 1897
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1897.c170412.nc 1897
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1898.c170412.nc 1898
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1898.c170412.nc 1898
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1899.c170412.nc 1899
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1899.c170412.nc 1899
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1900.c170412.nc 1900
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1900.c170412.nc 1900
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1901.c170412.nc 1901
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1901.c170412.nc 1901
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1902.c170412.nc 1902
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1902.c170412.nc 1902
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1903.c170412.nc 1903
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1903.c170412.nc 1903
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1904.c170412.nc 1904
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1904.c170412.nc 1904
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1905.c170412.nc 1905
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1905.c170412.nc 1905
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1906.c170412.nc 1906
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1906.c170412.nc 1906
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1907.c170412.nc 1907
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1907.c170412.nc 1907
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1908.c170412.nc 1908
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1908.c170412.nc 1908
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1909.c170412.nc 1909
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1909.c170412.nc 1909
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1910.c170412.nc 1910
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1910.c170412.nc 1910
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1911.c170412.nc 1911
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1911.c170412.nc 1911
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1912.c170412.nc 1912
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1912.c170412.nc 1912
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1913.c170412.nc 1913
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1913.c170412.nc 1913
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1914.c170412.nc 1914
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1914.c170412.nc 1914
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1915.c170412.nc 1915
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1915.c170412.nc 1915
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1916.c170412.nc 1916
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1916.c170412.nc 1916
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1917.c170412.nc 1917
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1917.c170412.nc 1917
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1918.c170412.nc 1918
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1918.c170412.nc 1918
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1919.c170412.nc 1919
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1919.c170412.nc 1919
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1920.c170412.nc 1920
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1920.c170412.nc 1920
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1921.c170412.nc 1921
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1921.c170412.nc 1921
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1922.c170412.nc 1922
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1922.c170412.nc 1922
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1923.c170412.nc 1923
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1923.c170412.nc 1923
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1924.c170412.nc 1924
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1924.c170412.nc 1924
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1925.c170412.nc 1925
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1925.c170412.nc 1925
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1926.c170412.nc 1926
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1926.c170412.nc 1926
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1927.c170412.nc 1927
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1927.c170412.nc 1927
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1928.c170412.nc 1928
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1928.c170412.nc 1928
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1929.c170412.nc 1929
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1929.c170412.nc 1929
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1930.c170412.nc 1930
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1930.c170412.nc 1930
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1931.c170412.nc 1931
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1931.c170412.nc 1931
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1932.c170412.nc 1932
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1932.c170412.nc 1932
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1933.c170412.nc 1933
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1933.c170412.nc 1933
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1934.c170412.nc 1934
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1934.c170412.nc 1934
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1935.c170412.nc 1935
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1935.c170412.nc 1935
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1936.c170412.nc 1936
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1936.c170412.nc 1936
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1937.c170412.nc 1937
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1937.c170412.nc 1937
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1938.c170412.nc 1938
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1938.c170412.nc 1938
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1939.c170412.nc 1939
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1939.c170412.nc 1939
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1940.c170412.nc 1940
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1940.c170412.nc 1940
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1941.c170412.nc 1941
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1941.c170412.nc 1941
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1942.c170412.nc 1942
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1942.c170412.nc 1942
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1943.c170412.nc 1943
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1943.c170412.nc 1943
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1944.c170412.nc 1944
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1944.c170412.nc 1944
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1945.c170412.nc 1945
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1945.c170412.nc 1945
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1946.c170412.nc 1946
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1946.c170412.nc 1946
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1947.c170412.nc 1947
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1947.c170412.nc 1947
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1948.c170412.nc 1948
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1948.c170412.nc 1948
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1949.c170412.nc 1949
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1949.c170412.nc 1949
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1950.c170412.nc 1950
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1950.c170412.nc 1950
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1951.c170412.nc 1951
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1951.c170412.nc 1951
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1952.c170412.nc 1952
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1952.c170412.nc 1952
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1953.c170412.nc 1953
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1953.c170412.nc 1953
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1954.c170412.nc 1954
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1954.c170412.nc 1954
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1955.c170412.nc 1955
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1955.c170412.nc 1955
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1956.c170412.nc 1956
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1956.c170412.nc 1956
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1957.c170412.nc 1957
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1957.c170412.nc 1957
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1958.c170412.nc 1958
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1958.c170412.nc 1958
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1959.c170412.nc 1959
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1959.c170412.nc 1959
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1960.c170412.nc 1960
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1960.c170412.nc 1960
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1961.c170412.nc 1961
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1961.c170412.nc 1961
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1962.c170412.nc 1962
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1962.c170412.nc 1962
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1963.c170412.nc 1963
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1963.c170412.nc 1963
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1964.c170412.nc 1964
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1964.c170412.nc 1964
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1965.c170412.nc 1965
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1965.c170412.nc 1965
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1966.c170412.nc 1966
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1966.c170412.nc 1966
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1967.c170412.nc 1967
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1967.c170412.nc 1967
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1968.c170412.nc 1968
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1968.c170412.nc 1968
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1969.c170412.nc 1969
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1969.c170412.nc 1969
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1970.c170412.nc 1970
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1970.c170412.nc 1970
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1971.c170412.nc 1971
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1971.c170412.nc 1971
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1972.c170412.nc 1972
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1972.c170412.nc 1972
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1973.c170412.nc 1973
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1973.c170412.nc 1973
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1974.c170412.nc 1974
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1974.c170412.nc 1974
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1975.c170412.nc 1975
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1975.c170412.nc 1975
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1976.c170412.nc 1976
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1976.c170412.nc 1976
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1977.c170412.nc 1977
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1977.c170412.nc 1977
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1978.c170412.nc 1978
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1978.c170412.nc 1978
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1979.c170412.nc 1979
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1979.c170412.nc 1979
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1980.c170412.nc 1980
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1980.c170412.nc 1980
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1981.c170412.nc 1981
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1981.c170412.nc 1981
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1982.c170412.nc 1982
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1982.c170412.nc 1982
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1983.c170412.nc 1983
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1983.c170412.nc 1983
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1984.c170412.nc 1984
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1984.c170412.nc 1984
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1985.c170412.nc 1985
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1985.c170412.nc 1985
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1986.c170412.nc 1986
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1986.c170412.nc 1986
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1987.c170412.nc 1987
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1987.c170412.nc 1987
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1988.c170412.nc 1988
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1988.c170412.nc 1988
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1989.c170412.nc 1989
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1989.c170412.nc 1989
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1990.c170412.nc 1990
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1990.c170412.nc 1990
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1991.c170412.nc 1991
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1991.c170412.nc 1991
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1992.c170412.nc 1992
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1992.c170412.nc 1992
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1993.c170412.nc 1993
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1993.c170412.nc 1993
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1994.c170412.nc 1994
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1994.c170412.nc 1994
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1995.c170412.nc 1995
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1995.c170412.nc 1995
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1996.c170412.nc 1996
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1996.c170412.nc 1996
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1997.c170412.nc 1997
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1997.c170412.nc 1997
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1998.c170412.nc 1998
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1998.c170412.nc 1998
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1999.c170412.nc 1999
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_1999.c170412.nc 1999
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2000.c170412.nc 2000
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2000.c170412.nc 2000
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2001.c170412.nc 2001
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2001.c170412.nc 2001
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2002.c170412.nc 2002
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2002.c170412.nc 2002
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2003.c170412.nc 2003
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2003.c170412.nc 2003
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2004.c170412.nc 2004
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2004.c170412.nc 2004
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2005.c170412.nc 2005
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2005.c170412.nc 2005
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2006.c170412.nc 2006
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2006.c170412.nc 2006
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2007.c170412.nc 2007
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2007.c170412.nc 2007
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2008.c170412.nc 2008
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2008.c170412.nc 2008
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2009.c170412.nc 2009
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2009.c170412.nc 2009
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2010.c170412.nc 2010
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2010.c170412.nc 2010
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2011.c170412.nc 2011
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2011.c170412.nc 2011
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2012.c170412.nc 2012
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2012.c170412.nc 2012
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2013.c170412.nc 2013
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2013.c170412.nc 2013
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2014.c170412.nc 2014
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2014.c170412.nc 2014
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2015.c170412.nc 2015
-/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2015.c170412.nc 2015
diff --git a/tools/mksurfdata_map/mksurfdata.pl b/tools/mksurfdata_map/mksurfdata.pl
deleted file mode 100755
index f9961c7f9e..0000000000
--- a/tools/mksurfdata_map/mksurfdata.pl
+++ /dev/null
@@ -1,881 +0,0 @@
-#!/usr/bin/env perl
-#
-# Oct/30/2008 Erik Kluzek
-#
-# mksurfdata.pl Perl script to make surface datasets for all resolutions.
-#
-#
-use Cwd;
-use strict;
-use English;
-use IO::File;
-use Getopt::Long;
-
-
-#Figure out where configure directory is and where can use the XML/Lite module from
-my $ProgName;
-($ProgName = $PROGRAM_NAME) =~ s!(.*)/!!; # name of program
-my $ProgDir = $1; # name of directory where program lives
-
-my $cwd = getcwd(); # current working directory
-my $scrdir;
-
-if ($ProgDir) { $scrdir = $ProgDir; }
-else { $scrdir = $cwd; }
-
-my $debug = 0;
-
-#-----------------------------------------------------------------------------------------------
-# Add $scrdir to the list of paths that Perl searches for modules
-my @dirs = ( "$scrdir/../../cime/utils/perl5lib",
- "$scrdir/../../../../cime/utils/perl5lib"
- );
-unshift @INC, @dirs;
-my $result = eval "require XML::Lite";
-if ( ! defined($result) ) {
- die <<"EOF";
-** Cannot find perl module \"XML/Lite.pm\" from directories: @dirs **
-EOF
-}
-my $result = eval "require Build::NamelistDefinition";
-if ( ! defined($result) ) {
- die <<"EOF";
-** Cannot find perl module \"Build/NamelistDefinition.pm\" from directories: @dirs **
-EOF
-}
-my $nldef_file = "$scrdir/../../bld/namelist_files/namelist_definition_ctsm.xml";
-
-my $definition = Build::NamelistDefinition->new( $nldef_file );
-
-my $CSMDATA = "/glade/p/cesm/cseg/inputdata";
-
-my %opts = (
- hgrid=>"all",
- vic=>0,
- glc=>0,
- ssp_rcp=>"hist",
- debug=>0,
- exedir=>undef,
- allownofile=>undef,
- crop=>1,
- fast_maps=>0,
- hirespft=>undef,
- years=>"1850,2000",
- glc_nec=>10,
- merge_gis=>undef,
- inlandwet=>undef,
- help=>0,
- no_surfdata=>0,
- pft_override=>undef,
- pft_frc=>undef,
- pft_idx=>undef,
- soil_override=>undef,
- soil_cly=>undef,
- soil_snd=>undef,
- soil_col=>undef,
- soil_fmx=>undef,
- outnc_double=>undef,
- outnc_dims=>"2",
- usrname=>"",
- rundir=>"$cwd",
- usr_mapdir=>"../mkmapdata",
- dynpft=>undef,
- csmdata=>$CSMDATA,
- urban_skip_abort_on_invalid_data_check=>undef,
- );
-
-my $numpft = 78;
-
-#-----------------------------------------------------------------------------------------------
-sub usage {
- die < [OPTIONS]
- -res [or -r] "resolution" is the supported resolution(s) to use for files (by default $opts{'hgrid'} ).
-
-
- For unsupported, user-specified resolutions:
- $ProgName -res usrspec -usr_gname -usr_gdate [OPTIONS]
- -usr_gname "user_gname" User resolution name to find grid file with
- (only used if -res is set to 'usrspec')
- -usr_gdate "user_gdate" User map date to find mapping files with
- (only used if -res is set to 'usrspec')
- NOTE: all mapping files are assumed to be in mkmapdata
- - and the user needs to have invoked mkmapdata in
- that directory first
- -usr_mapdir "mapdirectory" Directory where the user-supplied mapping files are
- Default: $opts{'usr_mapdir'}
-
-OPTIONS
- NOTE: The three critical options are (-years, -glc_nec, and -ssp_rcp) they are marked as such.
-
- -allownofile Allow the script to run even if one of the input files
- does NOT exist.
- -dinlc [or -l] Enter the directory location for inputdata
- (default $opts{'csmdata'})
- -debug [or -d] Do not actually run -- just print out what
- would happen if ran.
- -dynpft "filename" Dynamic PFT/harvesting file to use if you have a manual list you want to use
- (rather than create it on the fly, must be consistent with first year)
- (Normally NOT used)
- -fast_maps Toggle fast mode which doesn't use the large mapping files
- -glc_nec "number" Number of glacier elevation classes to use (by default $opts{'glc_nec'})
- (CRITICAL OPTION)
- -merge_gis If you want to use the glacier dataset that merges in
- the Greenland Ice Sheet data that CISM uses (typically
- used only if consistency with CISM is important)
- -hirespft If you want to use the high-resolution pft dataset rather
- than the default lower resolution dataset
- (low resolution is at half-degree, high resolution at 3minute)
- (hires only available for present-day [2000])
- -exedir "directory" Directory where mksurfdata_map program is
- (by default assume it is in the current directory)
- -inlandwet If you want to allow inland wetlands
- -no-crop Create datasets without the extensive list of prognostic crop types
- -no_surfdata Do not output a surface dataset
- This is useful if you only want a landuse_timeseries file
- -years [or -y] "years" Simulation year(s) to run over (by default $opts{'years'})
- (can also be a simulation year range: i.e. 1850-2000 or 1850-2100 for ssp_rcp future scenarios)
- (CRITICAL OPTION)
- -help [or -h] Display this help.
-
- -rundir "directory" Directory to run in
- (by default current directory $opts{'rundir'})
-
- -ssp_rcp "scenario-name" Shared Socioeconomic Pathway and Representative Concentration Pathway Scenario name(s).
- "hist" for historical, otherwise in form of SSPn-m.m where n is the SSP number
- and m.m is the radiative forcing in W/m^2 at the peak or 2100.
- (normally use thiw with -years 1850-2100)
- (CRITICAL OPTION)
-
- -usrname "clm_usrdat_name" CLM user data name to find grid file with.
-
- -vic Add the fields required for the VIC model
- -glc Add the optional 3D glacier fields for verification of the glacier model
-
- NOTE: years, res, and ssp_rcp can be comma delimited lists.
-
-
-OPTIONS to override the mapping of the input gridded data with hardcoded input
-
- -pft_frc "list of fractions" Comma delimited list of percentages for veg types
- -pft_idx "list of veg index" Comma delimited veg index for each fraction
- -soil_cly "% of clay" % of soil that is clay
- -soil_col "soil color" Soil color (1 [light] to 20 [dark])
- -soil_fmx "soil fmax" Soil maximum saturated fraction (0-1)
- -soil_snd "% of sand" % of soil that is sand
-
-OPTIONS to work around bugs?
- -urban_skip_abort_on_invalid_data_check
- do not abort on an invalid data check in urban.
- Added 2015-01 to avoid recompiling as noted in
- /glade/p/cesm/cseg/inputdata/lnd/clm2/surfdata_map/README_c141219
-
-EOF
-}
-
-sub check_soil {
-#
-# check that the soil options are set correctly
-#
- foreach my $type ( "soil_cly", "soil_snd" ) {
- if ( ! defined($opts{$type} ) ) {
- die "ERROR: Soil variables were set, but $type was NOT set\n";
- }
- }
- #if ( $opts{'soil_col'} < 0 || $opts{'soil_col'} > 20 ) {
- # die "ERROR: Soil color is out of range = ".$opts{'soil_col'}."\n";
- #}
- my $texsum = $opts{'soil_cly'} + $opts{'soil_snd'};
- my $loam = 100.0 - $texsum;
- if ( $texsum < 0.0 || $texsum > 100.0 ) {
- die "ERROR: Soil textures are out of range: clay = ".$opts{'soil_cly'}.
- " sand = ".$opts{'soil_snd'}." loam = $loam\n";
- }
-}
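The soil-texture check that the deleted check_soil subroutine performed can be restated in Python. This is a hypothetical sketch for illustration, not part of the original repository: clay and sand percentages must together stay within [0, 100], and the remainder is treated as loam.

```python
def check_soil_texture(clay_pct, sand_pct):
    # Restatement of the deleted check_soil logic: the clay + sand sum must
    # lie in [0, 100]; whatever is left over is loam.
    texsum = clay_pct + sand_pct
    loam = 100.0 - texsum
    if texsum < 0.0 or texsum > 100.0:
        raise ValueError(
            "soil textures out of range: clay=%s sand=%s loam=%s"
            % (clay_pct, sand_pct, loam)
        )
    return loam
```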
-
-sub check_soil_col_fmx {
-#
-# check that the soil color or soil fmax option is set correctly
-#
- if ( defined($opts{'soil_col'}) ) {
- if ( $opts{'soil_col'} < 0 || $opts{'soil_col'} > 20 ) {
- die "ERROR: Soil color is out of range = ".$opts{'soil_col'}."\n";
- }
- }
- if ( defined($opts{'soil_fmx'}) ) {
- if ( $opts{'soil_fmx'} < 0.0 || $opts{'soil_fmx'} > 1.0 ) {
- die "ERROR: Soil fmax is out of range = ".$opts{'soil_fmx'}."\n";
- }
- }
-}
-
-sub check_pft {
-#
-# check that the pft options are set correctly
-#
- # Eliminate starting and ending square brackets
- $opts{'pft_idx'} =~ s/^\[//;
- $opts{'pft_idx'} =~ s/\]$//;
- $opts{'pft_frc'} =~ s/^\[//;
- $opts{'pft_frc'} =~ s/\]$//;
- foreach my $type ( "pft_idx", "pft_frc" ) {
- if ( ! defined($opts{$type} ) ) {
- die "ERROR: PFT variables were set, but $type was NOT set\n";
- }
- }
- my @pft_idx = split( /,/, $opts{'pft_idx'} );
- my @pft_frc = split( /,/, $opts{'pft_frc'} );
- if ( $#pft_idx != $#pft_frc ) {
- die "ERROR: PFT arrays are different sizes: pft_idx and pft_frc\n";
- }
- my $sumfrc = 0.0;
- for( my $i = 0; $i <= $#pft_idx; $i++ ) {
- # check index in range
- if ( $pft_idx[$i] < 0 || $pft_idx[$i] > $numpft ) {
- die "ERROR: pft_idx out of range = ".$opts{'pft_idx'}."\n";
- }
- # make sure there are no duplicates
- for( my $j = 0; $j < $i; $j++ ) {
- if ( $pft_idx[$i] == $pft_idx[$j] ) {
- die "ERROR: pft_idx has duplicates = ".$opts{'pft_idx'}."\n";
- }
- }
- # check fraction in range
- if ( $pft_frc[$i] <= 0.0 || $pft_frc[$i] > 100.0 ) {
- die "ERROR: pft_frc out of range (>0.0 and <=100.0) = ".$opts{'pft_frc'}."\n";
- }
- $sumfrc = $sumfrc + $pft_frc[$i];
- }
- # check that fraction sums up to 100%
- if ( abs( $sumfrc - 100.0) > 1.e-6 ) {
- die "ERROR: pft_frc does NOT add up to 100% = ".$opts{'pft_frc'}."\n";
- }
-
-}
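The PFT validation rules enforced by the deleted check_pft subroutine can be sketched in Python (a hypothetical restatement, not code from the repository): both lists must be the same length, indices must be in [0, numpft] with no duplicates, each fraction must be in (0, 100], and the fractions must sum to 100% within a small tolerance.

```python
NUMPFT = 78  # matches $numpft in the deleted Perl script

def check_pft(pft_idx, pft_frc):
    # Lists must pair up one fraction per vegetation index.
    if len(pft_idx) != len(pft_frc):
        raise ValueError("pft_idx and pft_frc are different sizes")
    # No duplicate vegetation indices.
    if len(set(pft_idx)) != len(pft_idx):
        raise ValueError("pft_idx has duplicates")
    for idx, frc in zip(pft_idx, pft_frc):
        if not 0 <= idx <= NUMPFT:
            raise ValueError("pft_idx out of range: %s" % idx)
        if not 0.0 < frc <= 100.0:
            raise ValueError("pft_frc out of range (>0.0 and <=100.0): %s" % frc)
    # Fractions must cover exactly 100% of the vegetated area.
    if abs(sum(pft_frc) - 100.0) > 1e-6:
        raise ValueError("pft_frc does NOT add up to 100%")
```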
-
-# Perl trim function to remove whitespace from the start and end of the string
-sub trim($)
-{
- my $string = shift;
- $string =~ s/^\s+//;
- $string =~ s/\s+$//;
- return $string;
-}
-
-sub write_transient_timeseries_file {
- my ($transient, $desc, $sim_yr0, $sim_yrn, $queryfilopts, $resol, $resolhrv, $ssp_rcp, $mkcrop, $sim_yr_surfdat) = @_;
-
- my $strlen = 195;
- my $dynpft_format = "%-${strlen}.${strlen}s %4.4d\n";
- my $landuse_timeseries_text_file = "";
- if ( $transient ) {
- if ( ! defined($opts{'dynpft'}) && ! $opts{'pft_override'} ) {
- $landuse_timeseries_text_file = "landuse_timeseries_$desc.txt";
- my $fh_landuse_timeseries = IO::File->new;
- $fh_landuse_timeseries->open( ">$landuse_timeseries_text_file" ) or die "** can't open file: $landuse_timeseries_text_file\n";
- print "Writing out landuse_timeseries text file: $landuse_timeseries_text_file\n";
- for( my $yr = $sim_yr0; $yr <= $sim_yrn; $yr++ ) {
- my $vegtypyr = `$scrdir/../../bld/queryDefaultNamelist.pl $queryfilopts $resol -options sim_year='$yr',ssp_rcp=${ssp_rcp}${mkcrop} -var mksrf_fvegtyp -namelist clmexp`;
- chomp( $vegtypyr );
- printf $fh_landuse_timeseries $dynpft_format, $vegtypyr, $yr;
- my $hrvtypyr = `$scrdir/../../bld/queryDefaultNamelist.pl $queryfilopts $resolhrv -options sim_year='$yr',ssp_rcp=${ssp_rcp}${mkcrop} -var mksrf_fvegtyp -namelist clmexp`;
- chomp( $hrvtypyr );
- printf $fh_landuse_timeseries $dynpft_format, $hrvtypyr, $yr;
- if ( $yr % 100 == 0 ) {
- print "year: $yr\n";
- }
- }
- $fh_landuse_timeseries->close;
- print "Done writing file\n";
- } elsif ( $opts{'pft_override'} && defined($opts{'dynpft'}) ) {
- $landuse_timeseries_text_file = $opts{'dynpft'};
- } else {
- $landuse_timeseries_text_file = "landuse_timeseries_override_$desc.txt";
- my $fh_landuse_timeseries = IO::File->new;
- $fh_landuse_timeseries->open( ">$landuse_timeseries_text_file" ) or die "** can't open file: $landuse_timeseries_text_file\n";
- my $frstpft = "$opts{'pft_frc'} " .
- "$opts{'pft_idx'} " .
- "0,0,0,0,0 0 ";
- print "Writing out landuse_timeseries text file: $landuse_timeseries_text_file\n";
- if ( (my $len = length($frstpft)) > $strlen ) {
- die "ERROR PFT line is too long ($len): $frstpft\n";
- }
- # NOTE(wjs, 2014-12-04) Using sim_yr_surfdat here rather than
- # sim_yr0. As far as I can tell, it seems somewhat arbitrary which one
- # we use, but sim_yr_surfdat seems more like what's intended.
- printf $fh_landuse_timeseries $dynpft_format, $frstpft, $sim_yr_surfdat;
- $fh_landuse_timeseries->close;
- print "Done writing file\n";
- }
- }
- return $landuse_timeseries_text_file;
-}
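The fixed-width record format that the deleted write_transient_timeseries_file produced (and that the removed landuse_timeseries text above follows) can be sketched as follows. This is an illustrative Python restatement under the assumption that each simulation year yields two lines, the vegetation-type file and then the harvest file, each left-justified and truncated to 195 characters and followed by the 4-digit year, per the "%-195.195s %4.4d" format in the deleted script.

```python
STRLEN = 195  # matches $strlen in the deleted Perl script

def format_timeseries_line(path, year):
    # Left-justify/truncate the dataset path to 195 chars, then append the
    # 4-digit simulation year, mirroring the Perl format "%-195.195s %4.4d".
    return "%-195.195s %4.4d" % (path, year)
```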
-
-sub write_namelist_file {
- my ($namelist_fname, $logfile_fname, $fsurdat_fname, $fdyndat_fname,
- $glc_nec, $griddata, $gridtype, $map, $datfil, $double,
- $all_urb, $no_inlandwet, $vegtyp, $hrvtyp,
- $landuse_timeseries_text_file, $setnumpft) = @_;
-
-
- my $gitdescribe = `cd $scrdir; git describe; cd -`;
- chomp( $gitdescribe );
- my $fh = IO::File->new;
- $fh->open( ">$namelist_fname" ) or die "** can't open file: $namelist_fname\n";
- print $fh <<"EOF";
-&clmexp
- nglcec = $glc_nec
- mksrf_fgrid = '$griddata'
- mksrf_gridtype = '$gridtype'
- map_fpft = '$map->{'veg'}'
- map_fglacier = '$map->{'glc'}'
- map_fglacierregion = '$map->{'glcregion'}'
- map_fsoicol = '$map->{'col'}'
- map_furban = '$map->{'urb'}'
- map_fmax = '$map->{'fmx'}'
- map_forganic = '$map->{'org'}'
- map_flai = '$map->{'lai'}'
- map_fharvest = '$map->{'hrv'}'
- map_flakwat = '$map->{'lak'}'
- map_fwetlnd = '$map->{'wet'}'
- map_fvocef = '$map->{'voc'}'
- map_fsoitex = '$map->{'tex'}'
- map_furbtopo = '$map->{'utp'}'
- map_fgdp = '$map->{'gdp'}'
- map_fpeat = '$map->{'peat'}'
- map_fsoildepth = '$map->{'soildepth'}'
- map_fabm = '$map->{'abm'}'
- mksrf_fsoitex = '$datfil->{'tex'}'
- mksrf_forganic = '$datfil->{'org'}'
- mksrf_flakwat = '$datfil->{'lak'}'
- mksrf_fwetlnd = '$datfil->{'wet'}'
- mksrf_fmax = '$datfil->{'fmx'}'
- mksrf_fglacier = '$datfil->{'glc'}'
- mksrf_fglacierregion = '$datfil->{'glcregion'}'
- mksrf_fvocef = '$datfil->{'voc'}'
- mksrf_furbtopo = '$datfil->{'utp'}'
- mksrf_fgdp = '$datfil->{'gdp'}'
- mksrf_fpeat = '$datfil->{'peat'}'
- mksrf_fsoildepth = '$datfil->{'soildepth'}'
- mksrf_fabm = '$datfil->{'abm'}'
- outnc_double = $double
- all_urban = $all_urb
- no_inlandwet = $no_inlandwet
- mksrf_furban = '$datfil->{'urb'}'
- gitdescribe = '$gitdescribe'
-EOF
- if ( $opts{'vic'} ) {
- print $fh <<"EOF";
- map_fvic = '$map->{'vic'}'
- mksrf_fvic = '$datfil->{'vic'}'
- outnc_vic = .true.
-EOF
- }
- if ( $opts{'glc'} ) {
- print $fh <<"EOF";
- outnc_3dglc = .true.
-EOF
- }
- if ( ! $opts{'fast_maps'} ) {
- print $fh <<"EOF";
- map_ftopostats = '$map->{'topostats'}'
- mksrf_ftopostats = '$datfil->{'topostats'}'
-EOF
- } else {
- print $fh <<"EOF";
- std_elev = 371.0d00
-EOF
- }
- if ( defined($opts{'soil_override'}) ) {
- print $fh <<"EOF";
- soil_clay = $opts{'soil_cly'}
- soil_sand = $opts{'soil_snd'}
-EOF
- }
- if ( defined($opts{'pft_override'}) ) {
- print $fh <<"EOF";
- all_veg = .true.
- pft_frc = $opts{'pft_frc'}
- pft_idx = $opts{'pft_idx'}
-EOF
- }
-
- print $fh <<"EOF";
- mksrf_fvegtyp = '$vegtyp'
- mksrf_fhrvtyp = '$hrvtyp'
- mksrf_fsoicol = '$datfil->{'col'}'
- mksrf_flai = '$datfil->{'lai'}'
-EOF
-
- # Note that some of the file names in the following may be empty strings
- # (except for logfile_fname)
- print $fh <<"EOF";
- fsurdat = '$fsurdat_fname'
- fsurlog = '$logfile_fname'
- mksrf_fdynuse = '$landuse_timeseries_text_file'
- fdyndat = '$fdyndat_fname'
-EOF
-
- if ( $setnumpft ) {
- print $fh <<"EOF";
- $setnumpft
-EOF
- }
-
- if ( $opts{'urban_skip_abort_on_invalid_data_check'} ) {
- print $fh <<"EOF";
- urban_skip_abort_on_invalid_data_check = .true.
-EOF
- }
- # end the namelist
- print $fh <<"EOF";
-/
-EOF
-
- $fh->close;
- #
- # Print namelist file
- $fh->open( "<$namelist_fname" ) or die "** can't open file: $namelist_fname\n";
- while( $_ = <$fh> ) {
- print $_;
- }
- $fh->close;
-}
-
-#-----------------------------------------------------------------------------------------------
-
- my $cmdline = "@ARGV";
- GetOptions(
- "allownofile" => \$opts{'allownofile'},
- "r|res=s" => \$opts{'hgrid'},
- "usr_gname=s" => \$opts{'usr_gname'},
- "usr_gdate=s" => \$opts{'usr_gdate'},
- "usr_mapdir=s" => \$opts{'usr_mapdir'},
- "crop!" => \$opts{'crop'},
- "hirespft" => \$opts{'hirespft'},
- "l|dinlc=s" => \$opts{'csmdata'},
- "d|debug" => \$opts{'debug'},
- "fast_maps" => \$opts{'fast_maps'},
- "dynpft=s" => \$opts{'dynpft'},
- "y|years=s" => \$opts{'years'},
- "exedir=s" => \$opts{'exedir'},
- "h|help" => \$opts{'help'},
- "usrname=s" => \$opts{'usrname'},
- "glc_nec=i" => \$opts{'glc_nec'},
- "merge_gis" => \$opts{'merge_gis'},
- "inlandwet" => \$opts{'inlandwet'},
- "no_surfdata" => \$opts{'no_surfdata'},
- "pft_frc=s" => \$opts{'pft_frc'},
- "pft_idx=s" => \$opts{'pft_idx'},
- "ssp_rcp=s" => \$opts{'ssp_rcp'},
- "vic!" => \$opts{'vic'},
- "glc!" => \$opts{'glc'},
- "rundir=s" => \$opts{'rundir'},
- "soil_col=i" => \$opts{'soil_col'},
- "soil_fmx=f" => \$opts{'soil_fmx'},
- "soil_cly=f" => \$opts{'soil_cly'},
- "soil_snd=f" => \$opts{'soil_snd'},
- "urban_skip_abort_on_invalid_data_check" => \$opts{'urban_skip_abort_on_invalid_data_check'},
- ) or usage();
-
- # Check for unparsed arguments
- if (@ARGV) {
- print "ERROR: unrecognized arguments: @ARGV\n";
- usage();
- }
- if ( $opts{'help'} ) {
- usage();
- }
-
- chdir( $opts{'rundir'} ) or die "** can't change to directory: $opts{'rundir'}\n";
- # If csmdata was changed from the default
- if ( $CSMDATA ne $opts{'csmdata'} ) {
- $CSMDATA = $opts{'csmdata'};
- }
- my $glc_nec = $opts{'glc_nec'};
- if ( $glc_nec <= 0 ) {
- print "** glc_nec must be at least 1\n";
- usage();
- }
- my $no_inlandwet = ".true.";
- if (defined($opts{'inlandwet'})) {
- $no_inlandwet = ".false.";
- }
- #
- # Set disk location to send files to, and list resolutions to operate over,
- # set filenames, and short-date-name
- #
- my @hresols;
- my $mapdate;
- if ( $opts{'hgrid'} eq "all" ) {
- my @all_hresols = $definition->get_valid_values( "res" );
- @hresols = @all_hresols;
- } elsif ( $opts{'hgrid'} eq "usrspec" ) {
- @hresols = $opts{'usr_gname'};
- $mapdate = $opts{'usr_gdate'};
- } else {
- @hresols = split( ",", $opts{'hgrid'} );
- # Check that resolutions are valid
- foreach my $res ( @hresols ) {
- if ( ! $definition->is_valid_value( "res", "'$res'" ) ) {
- if ( $opts{'usrname'} eq "" || $res ne $opts{'usrname'} ) {
- print "** Invalid resolution: $res\n";
- usage();
- }
- }
- }
- }
- #
- # Set years to run over
- #
- my @years = split( ",", $opts{'years'} );
- # Check that simulation years are valid
- foreach my $sim_year ( @years ) {
- if ( ("-" eq substr($sim_year, 4, 1)) || ("-" eq substr($sim_year, 3, 1)) ) {
- # range of years for transient run
- if ( ! $definition->is_valid_value( "sim_year_range", "'$sim_year'" ) ) {
- print "** Invalid simulation year range: $sim_year\n";
- usage();
- }
- } else {
- # single year.
- if ( ! $definition->is_valid_value( "sim_year", "'$sim_year'" ) ) {
- print "** Invalid simulation year: $sim_year\n";
- usage();
- }
- }
- }
- #
- # Set ssp_rcp to use
- #
- my @rcpaths = split( ",", $opts{'ssp_rcp'} );
- # Check that ssp_rcp is valid
- foreach my $ssp_rcp ( @rcpaths ) {
- if ( ! $definition->is_valid_value( "ssp_rcp", "'$ssp_rcp'" ) ) {
- print "** Invalid ssp_rcp: $ssp_rcp\n";
- usage();
- }
- }
-
- # CMIP series that the input data corresponds to
- my $cmip_series = "CMIP6";
- # Check if soil set
- if ( defined($opts{'soil_cly'}) ||
- defined($opts{'soil_snd'}) ) {
- &check_soil( );
- $opts{'soil_override'} = 1;
- }
- # Check if pft set
- if ( ! $opts{'crop'} ) { $numpft = 16; } # First set numpft if crop is off
- if ( defined($opts{'pft_frc'}) || defined($opts{'pft_idx'}) ) {
- &check_pft( );
- $opts{'pft_override'} = 1;
- }
- # Check if dynpft set and is valid filename
- if ( defined($opts{'dynpft'}) ) {
- if ( ! -f $opts{'dynpft'} ) {
- print "** Dynamic PFT file does NOT exist: $opts{'dynpft'}\n";
- usage();
- }
- }
-
- my $sdate = "c" . `date +%y%m%d`;
- chomp( $sdate );
-
- my $cfile = "clm.input_data_list";
- if ( -f "$cfile" ) {
- `/bin/mv -f $cfile ${cfile}.previous`;
- }
- my $cfh = IO::File->new;
- $cfh->open( ">$cfile" ) or die "** can't open file: $cfile\n";
- system( "\rm -f $cfile" );
- system( "touch $cfile" );
- print $cfh <<"EOF";
-#! /bin/csh -f
-set CSMDATA = $CSMDATA
-EOF
- system( "chmod +x $cfile" );
- my $surfdir = "lnd/clm2/surfdata";
-
- # string to add to options for crop off or on
- my $mkcrop_off = ",crop='on'";
- my $mkcrop_on = ",crop='on'";
-
- #
- # Loop over all resolutions and sim-years listed
- #
- foreach my $res ( @hresols ) {
- #
- # Query the XML default file database to get the appropriate files
- #
- my $queryopts, my $queryfilopts;
- if ( $opts{'hgrid'} eq "usrspec" ) {
- $queryopts = "-csmdata $CSMDATA -silent -justvalue";
- } else {
- $queryopts = "-res $res -csmdata $CSMDATA -silent -justvalue";
- }
- $queryfilopts = "$queryopts -onlyfiles ";
- my $mkcrop = $mkcrop_off;
- my $setnumpft = "";
- $mkcrop = $mkcrop_on;
- $setnumpft = "numpft = $numpft";
- my $usrnam = "";
- if ( $opts{'usrname'} ne "" && $res eq $opts{'usrname'} ) {
- $usrnam = "-usrname ".$opts{'usrname'};
- }
- #
- # Mapping files
- #
- my %map; my %hgrd; my %lmsk; my %datfil; my %filnm;
- my $hirespft = "off";
- if ( defined($opts{'hirespft'}) ) {
- $hirespft = "on";
- }
- my $merge_gis = "off";
- if ( defined($opts{'merge_gis'}) ) {
- $merge_gis = "on";
- }
- my $mopts = "$queryopts -namelist default_settings $usrnam";
- my $mkopts = "-csmdata $CSMDATA -silent -justvalue -namelist clmexp $usrnam";
- my @typlist = ( "lak", "veg", "voc", "tex", "col", "hrv",
- "fmx", "lai", "urb", "org", "glc", "glcregion", "utp", "wet",
- "gdp", "peat","soildepth","abm");
- if ( $opts{'vic'} ) {
- push( @typlist, "vic" );
- }
- if ( ! $opts{'fast_maps'} ) {
- push( @typlist, "topostats" );
- }
- foreach my $typ ( @typlist ) {
- my $lmask = `$scrdir/../../bld/queryDefaultNamelist.pl $mopts -options type=$typ,mergeGIS=$merge_gis,hirespft=$hirespft -var lmask`;
- $lmask = trim($lmask);
- my $hgrid_cmd = "$scrdir/../../bld/queryDefaultNamelist.pl $mopts -options type=$typ,hirespft=$hirespft -var hgrid";
- my $hgrid = `$hgrid_cmd`;
- if ($debug) {
- print "query to determine hgrid:\n $hgrid_cmd \n\n";
- }
- $hgrid = trim($hgrid);
- my $filnm = `$scrdir/../../bld/queryDefaultNamelist.pl $mopts -options type=$typ -var mksrf_filename`;
- $filnm = trim($filnm);
- $filnm{$typ} = $filnm;
- $hgrd{$typ} = $hgrid;
- $lmsk{$typ} = $lmask;
- if ( $opts{'hgrid'} eq "usrspec" ) {
- $map{$typ} = $opts{'usr_mapdir'}."/map_${hgrid}_${lmask}_to_${res}_nomask_aave_da_c${mapdate}\.nc";
- } else {
- $map{$typ} = `$scrdir/../../bld/queryDefaultNamelist.pl $queryfilopts -namelist clmexp -options frm_hgrid=$hgrid,frm_lmask=$lmask,to_hgrid=$res,to_lmask=nomask -var map`;
- }
- $map{$typ} = trim($map{$typ});
- if ( $map{$typ} !~ /[^ ]+/ ) {
- die "ERROR: could NOT find a mapping file for this resolution: $res and type: $typ at $hgrid and $lmask.\n";
- }
- if ( ! defined($opts{'allownofile'}) && ! -f $map{$typ} ) {
- die "ERROR: mapping file for this resolution does NOT exist ($map{$typ}).\n";
- }
- }
- #
- # Take the grid file from the pft map file, or from the fatmgrid default if not found
- #
- my $griddata = trim($map{'veg'});
- if ( $griddata eq "" ) {
- $griddata = `$scrdir/../../bld/queryDefaultNamelist.pl $queryfilopts $usrnam -var fatmgrid`;
- if ( $griddata eq "" ) {
- die "ERROR: could NOT find a grid data file for this resolution: $res.\n";
- }
- }
- my $desc;
- my $desc_surfdat;
- #
- # Check if all urban single point dataset
- #
- my @all_urb = ( "1x1_camdenNJ","1x1_vancouverCAN", "1x1_mexicocityMEX", "1x1_urbanc_alpha" );
- my $all_urb = ".false.";
- my $urb_pt = 0;
- foreach my $urb_res ( @all_urb ) {
- if ( $res eq $urb_res ) {
- $all_urb = ".true.";
- if ( $res ne "1x1_camdenNJ" ) { $urb_pt = 1; }
- }
- }
- #
- # Always run at double precision for output
- #
- my $double = ".true.";
- #
- # Loop over each SSP-RCP scenario
- #
- RCP: foreach my $ssp_rcp ( @rcpaths ) {
- #
- # Loop over each sim_year
- #
- SIM_YEAR: foreach my $sim_year ( @years ) {
- #
- # Skip if urban unless sim_year=2000
- #
- if ( $urb_pt && $sim_year ne '2000' ) {
- print "For urban -- skip this simulation year = $sim_year\n";
- next SIM_YEAR;
- }
- #
- # If year is 1850-2000 actually run 1850-2015
- #
- if ( $sim_year eq "1850-2000" ) {
- my $actual = "1850-2015";
- print "For $sim_year actually run $actual\n";
- $sim_year = $actual;
- }
- my $urbdesc = "urb3den";
- my $resol = "-res $hgrd{'veg'}";
- my $resolhrv = "-res $hgrd{'hrv'}";
- my $sim_yr0 = $sim_year;
- my $sim_yrn = $sim_year;
- my $transient = 0;
- if ( $sim_year =~ /([0-9]+)-([0-9]+)/ ) {
- $sim_yr0 = $1;
- $sim_yrn = $2;
- $transient = 1;
- }
- #
- # Find the file for each of the types
- #
- foreach my $typ ( @typlist ) {
- my $hgrid = $hgrd{$typ};
- my $lmask = $lmsk{$typ};
- my $filnm = $filnm{$typ};
- my $typ_cmd = "$scrdir/../../bld/queryDefaultNamelist.pl $mkopts -options " .
- "hgrid=$hgrid,lmask=$lmask,mergeGIS=$merge_gis$mkcrop,sim_year=$sim_yr0 -var $filnm";
- $datfil{$typ} = `$typ_cmd`;
- $datfil{$typ} = trim($datfil{$typ});
- if ( $datfil{$typ} !~ /[^ ]+/ ) {
- die "ERROR: could NOT find a $filnm data file for this resolution: $hgrid and type: $typ and $lmask.\n$typ_cmd\n\n";
- }
- if ( ! defined($opts{'allownofile'}) && ! -f $datfil{$typ} ) {
- die "ERROR: data file for this resolution does NOT exist ($datfil{$typ}).\n";
- }
- }
- # determine simulation year to use for the surface dataset:
- my $sim_yr_surfdat = "$sim_yr0";
-
- my $cmd = "$scrdir/../../bld/queryDefaultNamelist.pl $queryfilopts $resol -options sim_year='${sim_yr_surfdat}'$mkcrop,ssp_rcp=${ssp_rcp}${mkcrop} -var mksrf_fvegtyp -namelist clmexp";
- my $vegtyp = `$cmd`;
- chomp( $vegtyp );
- if ( $vegtyp eq "" ) {
- die "** trouble getting vegtyp file with: $cmd\n";
- }
- my $cmd = "$scrdir/../../bld/queryDefaultNamelist.pl $queryfilopts $resolhrv -options sim_year='${sim_yr_surfdat}'$mkcrop,ssp_rcp=${ssp_rcp}${mkcrop} -var mksrf_fvegtyp -namelist clmexp";
- my $hrvtyp = `$cmd`;
- chomp( $hrvtyp );
- if ( $hrvtyp eq "" ) {
- die "** trouble getting hrvtyp file with: $cmd\n";
- }
- my $options = "";
- my $crpdes = sprintf("%2.2dpfts", $numpft);
- if ( $numpft == 16 ) {
- $crpdes .= "_Irrig";
- }
- if ( $mkcrop ne "" ) {
- $options = "-options $mkcrop";
- }
- $desc = sprintf( "%s_%s_%s_simyr%s-%4.4d", $ssp_rcp, $crpdes, $cmip_series, $sim_yr0, $sim_yrn );
- $desc_surfdat = sprintf( "%s_%s_%s_simyr%s", $ssp_rcp, $crpdes, $cmip_series, $sim_yr_surfdat );
-
- my $fsurdat_fname_base = "";
- my $fsurdat_fname = "";
- if ( ! $opts{'no_surfdata'} ) {
- $fsurdat_fname_base = "surfdata_${res}_${desc_surfdat}_${sdate}";
- $fsurdat_fname = "${fsurdat_fname_base}.nc";
- }
-
- my $fdyndat_fname_base = "";
- my $fdyndat_fname = "";
- if ($transient) {
- $fdyndat_fname_base = "landuse.timeseries_${res}_${desc}_${sdate}";
- $fdyndat_fname = "${fdyndat_fname_base}.nc";
- }
-
- if (!$fsurdat_fname && !$fdyndat_fname) {
- die("ERROR: Tried to run mksurfdata_map without creating either a surface dataset or a landuse.timeseries file")
- }
-
- my $logfile_fname;
- my $namelist_fname;
- if ($fsurdat_fname_base) {
- $logfile_fname = "${fsurdat_fname_base}.log";
- $namelist_fname = "${fsurdat_fname_base}.namelist";
- }
- else {
- $logfile_fname = "${fdyndat_fname_base}.log";
- $namelist_fname = "${fdyndat_fname_base}.namelist";
- }
-
- my ($landuse_timeseries_text_file) = write_transient_timeseries_file(
- $transient, $desc, $sim_yr0, $sim_yrn,
- $queryfilopts, $resol, $resolhrv, $ssp_rcp, $mkcrop,
- $sim_yr_surfdat);
-
- print "CSMDATA is $CSMDATA \n";
- print "resolution: $res ssp_rcp=$ssp_rcp sim_year = $sim_year\n";
- print "namelist: $namelist_fname\n";
-
- my $gridtype;
- $gridtype = "global";
- if (index($res, '1x1_') != -1) {
- $gridtype = "regional";
- }
- if (index($res, '5x5_amazon') != -1) {
- $gridtype = "regional";
- }
-
- write_namelist_file(
- $namelist_fname, $logfile_fname, $fsurdat_fname, $fdyndat_fname,
- $glc_nec, $griddata, $gridtype, \%map, \%datfil, $double,
- $all_urb, $no_inlandwet, $vegtyp, $hrvtyp,
- $landuse_timeseries_text_file, $setnumpft);
-
- #
- # Delete previous versions of files that will be created
- #
- system( "/bin/rm -f $fsurdat_fname $logfile_fname" );
- #
- # Run mksurfdata_map with the namelist file
- #
- my $exedir = $scrdir;
- if ( defined($opts{'exedir'}) ) {
- $exedir = $opts{'exedir'};
- }
- print "$exedir/mksurfdata_map < $namelist_fname\n";
- if ( ! $opts{'debug'} ) {
- system( "$exedir/mksurfdata_map < $namelist_fname" );
- if ( $? ) { die "ERROR in mksurfdata_map: $?\n"; }
- }
- print "\n===========================================\n\n";
-
- #
- # If urban point, overwrite urban variables from previous surface dataset to this one
- #
- if ( $urb_pt && ! $opts{'no_surfdata'} ) {
- my $prvsurfdata = `$scrdir/../../bld/queryDefaultNamelist.pl $queryopts -var fsurdat`;
- if ( $? != 0 ) {
- die "ERROR:: previous surface dataset file NOT found\n";
- }
- chomp( $prvsurfdata );
- my $varlist = "CANYON_HWR,EM_IMPROAD,EM_PERROAD,EM_ROOF,EM_WALL,HT_ROOF,THICK_ROOF,THICK_WALL,T_BUILDING_MIN,WIND_HGT_CANYON,WTLUNIT_ROOF,WTROAD_PERV,ALB_IMPROAD_DIR,ALB_IMPROAD_DIF,ALB_PERROAD_DIR,ALB_PERROAD_DIF,ALB_ROOF_DIR,ALB_ROOF_DIF,ALB_WALL_DIR,ALB_WALL_DIF,TK_ROOF,TK_WALL,TK_IMPROAD,CV_ROOF,CV_WALL,CV_IMPROAD,NLEV_IMPROAD,PCT_URBAN,URBAN_REGION_ID";
- print "Overwrite urban parameters with previous surface dataset values\n";
- $cmd = "ncks -A -v $varlist $prvsurfdata $fsurdat_fname";
- print "$cmd\n";
- if ( ! $opts{'debug'} ) { system( $cmd ); }
- }
-
- } # End of sim_year loop
- } # End of ssp_rcp loop
- }
- close( $cfh );
- print "Successfully created fsurdat files\n";
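The deleted mksurfdata.pl above accepts `-y` entries that are either a single year or a `YYYY-YYYY` range, with ranges marking the run as transient, and silently widens the historical `1850-2000` range to `1850-2015`. That logic can be sketched in Python (an illustrative reimplementation of the Perl, not part of the tool; the function names are invented):

```python
import re

def parse_sim_year(sim_year):
    """Return (start_year, end_year, is_transient) for one sim_year entry.

    Mirrors the Perl logic above: a range like "1850-2015" matches
    ([0-9]+)-([0-9]+) and marks the run as transient; anything else is
    treated as a single year.
    """
    m = re.fullmatch(r"([0-9]+)-([0-9]+)", sim_year)
    if m:
        return int(m.group(1)), int(m.group(2)), True
    return int(sim_year), int(sim_year), False

def widen_historical(sim_year):
    """The script remaps one specific range before parsing it."""
    return "1850-2015" if sim_year == "1850-2000" else sim_year
```

For example, `parse_sim_year(widen_historical("1850-2000"))` yields `(1850, 2015, True)`, which is why the generated landuse.timeseries filenames carry a `simyr1850-2015` tag even when `-y 1850-2000` was requested.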
diff --git a/tools/mksurfdata_map/mksurfdata_map.namelist b/tools/mksurfdata_map/mksurfdata_map.namelist
deleted file mode 100644
index 2c6180f156..0000000000
--- a/tools/mksurfdata_map/mksurfdata_map.namelist
+++ /dev/null
@@ -1,54 +0,0 @@
-&clmexp
- nglcec = 10
- mksrf_fgrid = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.25x0.25_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_fpft = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.25x0.25_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_fglacier = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_3x3min_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_fglacierregion = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_10x10min_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_fsoicol = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.25x0.25_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_furban = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_3x3min_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_fmax = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.125x0.125_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_forganic = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_5x5min_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_flai = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.25x0.25_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_fharvest = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.25x0.25_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_flakwat = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_3x3min_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_fwetlnd = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.5x0.5_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_fvocef = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.5x0.5_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_fsoitex = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_5x5min_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_furbtopo = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_10x10min_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_fgdp = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.5x0.5_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_fpeat = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.5x0.5_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_fsoildepth = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_5x5min_nomask_to_10x15_nomask_aave_da_c200309.nc'
- map_fabm = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.5x0.5_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_ftopostats = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_1km-merge-10min_HYDRO1K-merge-nomask_to_10x15_nomask_aave_da_c130411.nc'
- map_fvic = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.9x1.25_nomask_to_10x15_nomask_aave_da_c200206.nc'
- map_fch4 = '/glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/maps/10x15/map_0.5x0.5_nomask_to_10x15_nomask_aave_da_c200206.nc'
- mksrf_fsoitex = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_soitex.10level.c010119.nc'
- mksrf_forganic = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_organic_10level_5x5min_ISRIC-WISE-NCSCD_nlev7_c120830.nc'
- mksrf_flakwat = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_LakePnDepth_3x3min_simyr2004_csplk_c151015.nc'
- mksrf_fwetlnd = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_lanwat.050425.nc'
- mksrf_fmax = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_fmax_0.125x0.125_c200220.nc'
- mksrf_fglacier = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_glacier_3x3min_simyr2000.c120926.nc'
- mksrf_fglacierregion = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_GlacierRegion_10x10min_nomask_c191120.nc'
- mksrf_fvocef = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_vocef_0.5x0.5_simyr2000.c110531.nc'
- mksrf_furbtopo = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_topo.10min.c191120.nc'
- mksrf_fgdp = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_gdp_0.5x0.5_AVHRR_simyr2000.c130228.nc'
- mksrf_fpeat = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_peatf_0.5x0.5_AVHRR_simyr2000.c130228.nc'
- mksrf_fsoildepth = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksf_soilthk_5x5min_ORNL-Soil_simyr1900-2015_c170630.nc'
- mksrf_fabm = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_abm_0.5x0.5_AVHRR_simyr2000.c130201.nc'
- mksrf_ftopostats = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_topostats_1km-merge-10min_HYDRO1K-merge-nomask_simyr2000.c130402.nc'
- mksrf_fvic = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_vic_0.9x1.25_GRDC_simyr2000.c130307.nc'
- mksrf_fch4 = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_ch4inversion_0.5x0.5_cruncep_simyr2000.c191112.nc'
- outnc_double = .true.
- all_urban = .false.
- no_inlandwet = .true.
- mksrf_furban = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/mksrf_urban_0.05x0.05_simyr2000.c170724.nc'
- mksrf_fvegtyp = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2000.c170412.nc'
- mksrf_fhrvtyp = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_landuse_histclm50_LUH2_2000.c170412.nc'
- mksrf_fsoicol = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_soilcolor_simyr2005.c170413.nc'
- mksrf_flai = '/glade/p/cesm/cseg/inputdata/lnd/clm2/rawdata/pftcftlandusedynharv.0.25x0.25.MODIS.simyr1850-2015.c170412/mksrf_lai_78pfts_simyr2005.c170413.nc'
- fsurdat = 'surfdata_10x15_78pfts_simyr2000_c170531.nc'
- fsurlog = 'surfdata_10x15_78pfts_simyr2000_c170531.log'
- mksrf_fdynuse = ''
- fdyndat = ''
- numpft = 78
-/
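The deleted `&clmexp` namelist above uses standard Fortran conventions: `.true.`/`.false.` logicals, quoted file paths, and bare integers, closed by a `/` line. A minimal Python reader for just this flat subset (an illustrative sketch, not a full Fortran namelist parser; the function name is invented):

```python
def read_flat_namelist(text):
    """Parse a flat Fortran namelist group like &clmexp above.

    Handles only the subset used here: one `name = value` per line, with
    values that are quoted strings, .true./.false., or plain numbers.
    Group header (&name) and terminator (/) lines are skipped.
    """
    out = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("&") or line == "/":
            continue
        name, _, val = line.partition("=")
        name, val = name.strip(), val.strip()
        if val.startswith("'") and val.endswith("'"):
            out[name] = val[1:-1]          # quoted path, may be empty
        elif val.lower() in (".true.", ".false."):
            out[name] = val.lower() == ".true."
        else:
            out[name] = float(val) if "." in val else int(val)
    return out
```

On the namelist above this would yield, e.g., `nglcec` as the integer 10, `outnc_double` as `True`, and `fdyndat` as the empty string.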
diff --git a/tools/mksurfdata_map/single_point_dynpft_files/README b/tools/mksurfdata_map/single_point_dynpft_files/README
deleted file mode 100644
index 04334d4cb2..0000000000
--- a/tools/mksurfdata_map/single_point_dynpft_files/README
+++ /dev/null
@@ -1,4 +0,0 @@
-$CTSMROOT/tools/mksurfdata_map/single_point_dynpft_files
-
-This directory contains files that can be used with the -dynpft option to
-mksurfdata.pl when creating the respective single-point transient datasets.
diff --git a/tools/mksurfdata_map/single_point_dynpft_files/README.landuse_timeseries_smallvilleIA_hist_simyr1850-1855 b/tools/mksurfdata_map/single_point_dynpft_files/README.landuse_timeseries_smallvilleIA_hist_simyr1850-1855
deleted file mode 100644
index 9cc79f0ecd..0000000000
--- a/tools/mksurfdata_map/single_point_dynpft_files/README.landuse_timeseries_smallvilleIA_hist_simyr1850-1855
+++ /dev/null
@@ -1,23 +0,0 @@
-The file landuse_timeseries_smallvilleIA_hist_simyr1850-1855.txt is meant for
-use with the 1x1_smallvilleIA test case. It tests a number of aspects of
-transient crops:
-
-- It starts with two years of 100% natural veg (1850 & 1851)
-
-- It then transitions to 100% crop (1852)
-
-- It then shifts PCT_CFT while keeping PCT_CROP at 100% (1853)
-
-- It then increases natural veg to > 0, while also shifting PCT_CFT (1854)
-
-- It then adjusts both PCT_CROP and PCT_CFT (1855)
-
-To create a surface dataset and transient dataset that use this file:
-
-mksurfdata.pl -crop -y 1850-2000 -r 1x1_smallvilleIA -pft_idx 13 -pft_frc 100 -dynpft single_point_dynpft_files/landuse_timeseries_smallvilleIA_hist_simyr1850-1855.txt
-mv landuse.timeseries_1x1_smallvilleIA_hist_simyr1850-2005_cYYMMDD.nc landuse.timeseries_1x1_smallvilleIA_hist_simyr1850-1855_cYYMMDD.nc
-
-
-This should be run with a transient crop case that starts in 1850 and runs for
-at least 6 years.
-
diff --git a/tools/mksurfdata_map/single_point_dynpft_files/landuse_timeseries_smallvilleIA_hist_simyr1850-1855.txt b/tools/mksurfdata_map/single_point_dynpft_files/landuse_timeseries_smallvilleIA_hist_simyr1850-1855.txt
deleted file mode 100644
index f6943e957f..0000000000
--- a/tools/mksurfdata_map/single_point_dynpft_files/landuse_timeseries_smallvilleIA_hist_simyr1850-1855.txt
+++ /dev/null
@@ -1,6 +0,0 @@
-100 13 0,0,0,0,0 0 1850
-100 13 0,0,0,0,0 0 1851
-1,1,1,1,1,1,1,1,1,91 15,16,17,18,19,20,21,22,23,24 0,0,0,0,0 0 1852
-91,1,1,1,1,1,1,1,1,1 15,16,17,18,19,20,21,22,23,24 0,0,0,0,0 0 1853
-50,1,2,2,3,3,4,4,5,5,21 13,15,16,17,18,19,20,21,22,23,24 0,0,0,0,0 0 1854
-75,1,1,1,1,1,1,1,1,1,16 13,15,16,17,18,19,20,21,22,23,24 0,0,0,0,0 0 1855
diff --git a/tools/mksurfdata_map/src/CMakeLists.txt b/tools/mksurfdata_map/src/CMakeLists.txt
deleted file mode 100644
index 3179c3cdc9..0000000000
--- a/tools/mksurfdata_map/src/CMakeLists.txt
+++ /dev/null
@@ -1,43 +0,0 @@
-# This CMakeLists.txt file is currently used just for building unit tests.
-
-cmake_minimum_required(VERSION 2.8)
-list(APPEND CMAKE_MODULE_PATH ${CIME_CMAKE_MODULE_DIRECTORY})
-include(CIME_initial_setup)
-
-project(mksurfdat_tests Fortran)
-
-include(CIME_utils)
-
-# Build library containing stuff needed for the unit tests
-list(APPEND mksurfdat_sources
- shr_kind_mod.F90
- shr_log_mod.F90
- nanMod.F90
- shr_string_mod.F90
- fileutils.F90
- shr_timer_mod.F90
- shr_file_mod.F90
- mkgridmapMod.F90
- mkindexmapMod.F90
- mkpftConstantsMod.F90
- mkpctPftTypeMod.F90
- mkpftMod.F90
- mkdomainMod.F90
- mkgridmapMod.F90
- mkutilsMod.F90
- mkpftUtilsMod.F90
- mksoilUtilsMod.F90
- mkvarctl.F90
- mkvarpar.F90
- shr_const_mod.F90
- shr_sys_mod.F90
- unit_test_stubs/abort.F90
- unit_test_stubs/mkncdio.F90)
-add_library(mksurfdat ${mksurfdat_sources})
-
-# Tell cmake to look for libraries & mod files here, because this is where we built libraries
-include_directories(${CMAKE_CURRENT_BINARY_DIR})
-link_directories(${CMAKE_CURRENT_BINARY_DIR})
-
-# Add the test directory
-add_subdirectory(test)
diff --git a/tools/mksurfdata_map/src/Filepath b/tools/mksurfdata_map/src/Filepath
deleted file mode 100644
index 9c558e357c..0000000000
--- a/tools/mksurfdata_map/src/Filepath
+++ /dev/null
@@ -1 +0,0 @@
-.
diff --git a/tools/mksurfdata_map/src/Makefile b/tools/mksurfdata_map/src/Makefile
deleted file mode 100644
index 248a913565..0000000000
--- a/tools/mksurfdata_map/src/Makefile
+++ /dev/null
@@ -1,10 +0,0 @@
-# Makefile for mksurfdata_map
-
-EXENAME = ../mksurfdata_map
-
-# Set optimization on by default
-ifeq ($(OPT),$(null))
- OPT := TRUE
-endif
-
-include Makefile.common
\ No newline at end of file
diff --git a/tools/mksurfdata_map/src/Makefile.common b/tools/mksurfdata_map/src/Makefile.common
deleted file mode 100644
index ab79f94144..0000000000
--- a/tools/mksurfdata_map/src/Makefile.common
+++ /dev/null
@@ -1,360 +0,0 @@
-#-----------------------------------------------------------------------
-# This Makefile is for building clm tools on AIX, Linux (with pgf90 or
-# lf95 compiler), Darwin or IRIX platforms.
-#
-# These macros can be changed by setting environment variables:
-#
-# LIB_NETCDF --- Library directory location of netcdf. (defaults to /usr/local/lib)
-# INC_NETCDF --- Include directory location of netcdf. (defaults to /usr/local/include)
-# MOD_NETCDF --- Module directory location of netcdf. (defaults to $LIB_NETCDF)
-# USER_FC ------ Allow user to override the default Fortran compiler specified in Makefile.
-# USER_FCTYP --- Allow user to override the default type of Fortran compiler (linux and USER_FC=ftn only).
-# USER_CC ------ Allow user to override the default C compiler specified in Makefile (linux only).
-# USER_LINKER -- Allow user to override the default linker specified in Makefile.
-# USER_CPPDEFS - Additional CPP defines.
-# USER_CFLAGS -- Additional C compiler flags that the user wishes to set.
-# USER_FFLAGS -- Additional Fortran compiler flags that the user wishes to set.
- # USER_LDFLAGS - Additional load flags that the user wishes to set.
-# SMP ---------- Shared memory Multi-processing (TRUE or FALSE) [default is FALSE]
-# OPT ---------- Use optimized options.
-#
-#------------------------------------------------------------------------
-
-# Set up special characters
-null :=
-
-# Newer makes set the CURDIR variable.
-CURDIR := $(shell pwd)
-
-RM = rm
-
-# Check for the netcdf library and include directories
-ifeq ($(LIB_NETCDF),$(null))
- LIB_NETCDF := /usr/local/lib
-endif
-
-ifeq ($(INC_NETCDF),$(null))
- INC_NETCDF := /usr/local/include
-endif
-
-ifeq ($(MOD_NETCDF),$(null))
- MOD_NETCDF := $(LIB_NETCDF)
-endif
-
-# Set user specified Fortran compiler
-ifneq ($(USER_FC),$(null))
- FC := $(USER_FC)
-endif
-
-# Set user specified C compiler
-ifneq ($(USER_CC),$(null))
- CC := $(USER_CC)
-endif
-
-# Set if Shared memory multi-processing will be used
-ifeq ($(SMP),$(null))
- SMP := FALSE
-endif
-
-CPPDEF := $(USER_CPPDEFS)
-
-# Set optimization on by default
-ifeq ($(OPT),$(null))
- OPT := TRUE
-endif
-
-ifeq ($(OPT),TRUE)
- CPPDEF := -DOPT
-endif
-
-# Determine platform
-UNAMES := $(shell uname -s)
-
-# Load dependency search path.
-dirs := . $(shell cat Filepath)
-
-# Set cpp search path, include netcdf
-cpp_dirs := $(dirs) $(INC_NETCDF) $(MOD_NETCDF)
-cpp_path := $(foreach dir,$(cpp_dirs),-I$(dir)) # format for command line
-
-# Expand any tildes in directory names. Change spaces to colons.
-# (the vpath itself is set elsewhere, based on this variable)
-vpath_dirs := $(foreach dir,$(cpp_dirs),$(wildcard $(dir)))
-vpath_dirs := $(subst $(space),:,$(vpath_dirs))
-
-#Primary Target: build the tool
-all: $(EXENAME)
-
-# Get list of files and build dependency file for all .o files
-# using perl scripts mkSrcfiles and mkDepends
-
-SOURCES := $(shell cat Srcfiles)
-
-OBJS := $(addsuffix .o, $(basename $(SOURCES)))
-
-# Set path to Mkdepends script; assumes that any Makefile including
-# this file is in a sibling of the src directory, in which Mkdepends
-# resides
-Mkdepends := ../src/Mkdepends
-
-$(CURDIR)/Depends: $(CURDIR)/Srcfiles $(CURDIR)/Filepath
- $(Mkdepends) Filepath Srcfiles > $@
-
-
-# Architecture-specific flags and rules
-#------------------------------------------------------------------------
-# AIX
-#------------------------------------------------------------------------
-
-ifeq ($(UNAMES),AIX)
-CPPDEF += -DAIX
-cpre = $(null)-WF,-D$(null)
-FPPFLAGS := $(patsubst -D%,$(cpre)%,$(CPPDEF))
-FFLAGS = -c -I$(INC_NETCDF) -q64 -qsuffix=f=f90 -qsuffix=f=f90:cpp=F90 \
- $(FPPFLAGS) -g -qfullpath -qarch=auto -qtune=auto -qsigtrap=xl__trcedump -qsclk=micro
-
-LDFLAGS = -L$(LIB_NETCDF) -q64 -lnetcdff -lnetcdf
-ifneq ($(OPT),TRUE)
- FFLAGS += -qinitauto=7FF7FFFF -qflttrap=ov:zero:inv:en -qspillsize=4000 -C
-else
- FFLAGS += -O2 -qmaxmem=-1 -Q
- LDFLAGS += -Q
-endif
-CFLAGS := -q64 -g $(CPPDEF) -O2
-FFLAGS += $(cpp_path)
-CFLAGS += $(cpp_path)
-
-ifeq ($(SMP),TRUE)
- FC = xlf90_r
- FFLAGS += -qsmp=omp
- LDFLAGS += -qsmp=omp
-else
- FC = xlf90
-endif
-
-endif
-
-#------------------------------------------------------------------------
-# Darwin
-#------------------------------------------------------------------------
-
-ifeq ($(UNAMES),Darwin)
-
-# Set the default Fortran compiler
-ifeq ($(USER_FC),$(null))
- FC := g95
-endif
-ifeq ($(USER_CC),$(null))
- CC := gcc
-endif
-
-CFLAGS := -g -O2
-CPPDEF += -DSYSDARWIN -DDarwin -DLINUX
-LDFLAGS :=
-
-ifeq ($(FC),g95)
-
- CPPDEF += -DG95
- FFLAGS := -c -fno-second-underscore $(CPPDEF) $(cpp_path) -I$(MOD_NETCDF)
- ifeq ($(OPT),TRUE)
- FFLAGS += -O2
- else
- FFLAGS += -g -fbounds-check
- endif
-
-endif
-
-ifeq ($(FC),gfortran)
-
- CPPDEF += -DG95
- FFLAGS := -c -fno-second-underscore $(CPPDEF) $(cpp_path) -I$(MOD_NETCDF) \
- -fno-range-check
- ifeq ($(OPT),TRUE)
- FFLAGS += -O2
- else
- FFLAGS += -g -fbounds-check
- endif
-
-endif
-
-ifeq ($(FC),ifort)
-
- CPPDEF += -DFORTRANUNDERSCORE
- FFLAGS += -c -ftz -g -fp-model precise $(CPPDEF) $(cpp_path) \
- -convert big_endian -assume byterecl -traceback -FR
- LDFLAGS += -m64
-
- ifneq ($(OPT),TRUE)
- FFLAGS += -CB -O0
- else
- FFLAGS += -O2
- endif
- ifeq ($(SMP),TRUE)
- FFLAGS += -qopenmp
- LDFLAGS += -qopenmp
- endif
-endif
-
-ifeq ($(FC),pgf90)
-
- CPPDEF += -DFORTRANUNDERSCORE
- FFLAGS += -c $(CPPDEF) $(cpp_path)
- ifneq ($(OPT),TRUE)
- FFLAGS += -g -Ktrap=fp -Mbounds -Kieee
- else
- FFLAGS += -fast -Kieee
- endif
-
- ifeq ($(SMP),TRUE)
- FFLAGS += -mp
- LDFLAGS += -mp
- endif
-
-endif
-
-ifeq ($(CC),icc)
- CFLAGS += -m64 -g
- ifeq ($(SMP),TRUE)
- CFLAGS += -qopenmp
- endif
-endif
-ifeq ($(CC),pgcc)
- CFLAGS += -g -fast
-endif
-
-CFLAGS += $(CPPDEF) $(cpp_path)
-LDFLAGS += -L$(LIB_NETCDF) -lnetcdf -lnetcdff
-
-endif
-
-#------------------------------------------------------------------------
-# Linux
-#------------------------------------------------------------------------
-
-ifeq ($(UNAMES),Linux)
- ifeq ($(USER_FC),$(null))
- FC := ifort
- FCTYP := ifort
- else
- ifeq ($(USER_FC),ftn)
- ifneq ($(USER_FCTYP),$(null))
- FCTYP := $(USER_FCTYP)
- else
- FCTYP := pgf90
- endif
- else
- FCTYP := $(USER_FC)
- endif
- endif
- CPPDEF += -DLINUX -DFORTRANUNDERSCORE
- CFLAGS := $(CPPDEF)
- LDFLAGS := $(shell $(LIB_NETCDF)/../bin/nf-config --flibs)
- FFLAGS =
-
- ifeq ($(FCTYP),pgf90)
- CC := pgcc
- ifneq ($(OPT),TRUE)
- FFLAGS += -g -Ktrap=fp -Mbounds -Kieee
- else
- FFLAGS += -fast -Kieee
- CFLAGS += -fast
- endif
-
- ifeq ($(SMP),TRUE)
- FFLAGS += -mp
- LDFLAGS += -mp
- endif
-
- endif
-
- ifeq ($(FCTYP),lf95)
- ifneq ($(OPT),TRUE)
- FFLAGS += -g --chk a,e,s,u -O0
- else
- FFLAGS += -O
- endif
- # Threading only works by putting thread memory on the heap rather than the
- # stack (--threadheap). As of lf95 version 6.2 the thread stacksize limits
- # are (still) too small to run even small resolution problems (FV at 10x15
- # res fails).
- ifeq ($(SMP),TRUE)
- FFLAGS += --openmp --threadheap 4096
- LDFLAGS += --openmp --threadheap 4096
- endif
- endif
- ifeq ($(FCTYP),pathf90)
- FFLAGS += -extend_source -ftpp -fno-second-underscore
- ifneq ($(OPT),TRUE)
- FFLAGS += -g -O0
- else
- FFLAGS += -O
- endif
- ifeq ($(SMP),TRUE)
- FFLAGS += -mp
- LDFLAGS += -mp
- endif
- endif
- ifeq ($(FCTYP),ifort)
-
- FFLAGS += -ftz -g -fp-model precise -convert big_endian -assume byterecl -traceback -FR
- CFLAGS += -m64 -g
- LDFLAGS += -m64
-
- ifneq ($(OPT),TRUE)
- FFLAGS += -CB -O0
- else
- FFLAGS += -O2
- endif
- ifeq ($(SMP),TRUE)
- FFLAGS += -qopenmp
- CFLAGS += -qopenmp
- LDFLAGS += -qopenmp
- endif
- endif
- FFLAGS += -c -I$(INC_NETCDF) $(CPPDEF) $(cpp_path)
- CFLAGS += $(cpp_path)
-endif
-
-#------------------------------------------------------------------------
-# Default rules and macros
-#------------------------------------------------------------------------
-
-.SUFFIXES:
-.SUFFIXES: .F90 .c .o
-
-# Set the vpath for all file types EXCEPT .o
-# We do this for individual file types rather than generally using
-# VPATH, because for .o files, we don't want to use files from a
-# different build (e.g., in building the unit tester, we don't want to
-# use .o files from the main build)
-vpath %.F90 $(vpath_dirs)
-vpath %.c $(vpath_dirs)
-vpath %.h $(vpath_dirs)
-
-# Append user defined compiler and load flags to Makefile defaults
-CFLAGS += $(USER_CFLAGS)
-FFLAGS += $(USER_FFLAGS)
-LDFLAGS += $(USER_LDFLAGS)
-
-# Set user specified linker
-ifneq ($(USER_LINKER),$(null))
- LINKER := $(USER_LINKER)
-else
- LINKER := $(FC)
-endif
-
-.F90.o:
- $(FC) $(FFLAGS) $<
-
-.c.o:
- $(CC) -c $(CFLAGS) $<
-
-
-$(EXENAME): $(OBJS)
- $(LINKER) -o $@ $(OBJS) $(LDFLAGS)
-
-clean:
- $(RM) -f $(OBJS) *.mod Depends
-
-include $(CURDIR)/Depends
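The Makefile deleted above ends by including a generated `Depends` file; the `Mkdepends` script deleted next builds that file by scanning Fortran sources for `USE` statements and mapping each module to the `.o` of the file that defines it (one module per file, basename matching the module name case-insensitively). A minimal Python sketch of that scan, hypothetical and not the project's actual tool:

```python
import re

def scan_deps(sources):
    """sources: dict of filename -> file text. Returns dict mapping each
    '.o' target to the sorted list of '.o' files whose modules it uses.
    Assumes one module per file, named after the file's basename."""
    # Uppercased module name -> basename of the file that defines it.
    module_files = {name.rsplit(".", 1)[0].upper(): name.rsplit(".", 1)[0]
                    for name in sources}
    deps = {}
    for name, text in sources.items():
        mods = set()
        for line in text.splitlines():
            m = re.match(r"\s*use\s+(\w+)", line, re.IGNORECASE)
            # Only modules defined in the source list become dependencies.
            if m and m.group(1).upper() in module_files:
                mods.add(module_files[m.group(1).upper()] + ".o")
        target = name.rsplit(".", 1)[0] + ".o"
        # Drop self-dependencies (a file "using" its own module).
        deps[target] = sorted(mods - {target})
    return deps
```

As in `Mkdepends`, modules with no source file in the list are silently skipped, so library modules never produce unbuildable `.o` dependencies.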
diff --git a/tools/mksurfdata_map/src/Mkdepends b/tools/mksurfdata_map/src/Mkdepends
deleted file mode 100755
index a75e8fdde0..0000000000
--- a/tools/mksurfdata_map/src/Mkdepends
+++ /dev/null
@@ -1,327 +0,0 @@
-#!/usr/bin/env perl
-
-# Generate dependencies in a form suitable for inclusion into a Makefile.
-# The source filenames are provided in a file, one per line. Directories
-# to be searched for the source files and for their dependencies are provided
-# in another file, one per line. Output is written to STDOUT.
-#
-# For CPP type dependencies (lines beginning with #include) the dependency
-# search is recursive. Only dependencies that are found in the specified
-# directories are included. So, for example, the standard include file
-# stdio.h would not be included as a dependency unless /usr/include were
-# one of the specified directories to be searched.
-#
-# For Fortran module USE dependencies (lines beginning with a case
-# insensitive "USE", possibly preceded by whitespace) the Fortran compiler
-# must be able to access the .mod file associated with the .o file that
-# contains the module. In order to correctly generate these dependencies
-# two restrictions must be observed.
-# 1) All modules must be contained in files that have the same base name as
-# the module, in a case insensitive sense. This restriction implies that
-# there can only be one module per file.
-# 2) All modules that are to be contained in the dependency list must be
-# contained in one of the source files in the list provided on the command
-# line.
-# The reason for the second restriction is that since the makefile doesn't
-# contain rules to build .mod files the dependency takes the form of the .o
-# file that contains the module. If a module is being used for which the
-# source code is not available (e.g., a module from a library), then adding
-# a .o dependency for that module is a mistake because make will attempt to
-# build that .o file, and will fail if the source code is not available.
-#
-# Author: B. Eaton
-# Climate Modelling Section, NCAR
-# Feb 2001
-
-use Getopt::Std;
-use File::Basename;
-
-# Check for usage request.
-@ARGV >= 2 or usage();
-
-# Process command line.
-my %opt = ();
-getopts( "t:w", \%opt ) or usage();
-my $filepath_arg = shift() or usage();
-my $srcfile_arg = shift() or usage();
-@ARGV == 0 or usage(); # Check that all args were processed.
-
-my $obj_dir;
-if ( defined $opt{'t'} ) { $obj_dir = $opt{'t'}; }
-
-open(FILEPATH, $filepath_arg) or die "Can't open $filepath_arg: $!\n";
-open(SRCFILES, $srcfile_arg) or die "Can't open $srcfile_arg: $!\n";
-
-# Make list of paths to use when looking for files.
-# Prepend "." so search starts in current directory. This default is for
-# consistency with the way GNU Make searches for dependencies.
-my @file_paths = <FILEPATH>;
-close(FILEPATH);
-chomp @file_paths;
-unshift(@file_paths,'.');
-foreach $dir (@file_paths) { # (could check that directories exist here)
- $dir =~ s!/?\s*$!!; # remove / and any whitespace at end of directory name
- ($dir) = glob $dir; # Expand tildes in path names.
-}
-
-# Make list of files containing source code.
-my @src = <SRCFILES>;
-close(SRCFILES);
-chomp @src;
-
-# For each file that may contain a Fortran module (*.[fF]90 *.[fF]) convert the
-# file's basename to uppercase and use it as a hash key whose value is the file's
-# basename. This allows fast identification of the files that contain modules.
-# The only restriction is that the file's basename and the module name must match
-# in a case insensitive way.
-my %module_files = ();
-my ($f, $name, $path, $suffix, $mod);
-my @suffixes = ('\.[fF]90', '\.[fF]' );
-foreach $f (@src) {
- ($name, $path, $suffix) = fileparse($f, @suffixes);
- ($mod = $name) =~ tr/a-z/A-Z/;
- $module_files{$mod} = $name;
-}
-
-# Now make a list of .mod files in the file_paths. If a .o source dependency
-# can't be found based on the module_files list above, then maybe a .mod
-# module dependency can if the mod file is visible.
-my %trumod_files = ();
-my ($dir);
-my ($f, $name, $path, $suffix, $mod);
-my @suffixes = ('\.mod' );
-foreach $dir (@file_paths) {
- @filenames = (glob("$dir/*.mod"));
- foreach $f (@filenames) {
- ($name, $path, $suffix) = fileparse($f, @suffixes);
- ($mod = $name) =~ tr/a-z/A-Z/;
- $trumod_files{$mod} = $name;
- }
-}
-
-#print STDERR "\%module_files\n";
-#while ( ($k,$v) = each %module_files ) {
-# print STDERR "$k => $v\n";
-#}
-
-# Find module and include dependencies of the source files.
-my ($file_path, $rmods, $rincs);
-my %file_modules = ();
-my %file_includes = ();
-my @check_includes = ();
-foreach $f ( @src ) {
-
- # Find the file in the search path (@file_paths).
- unless ($file_path = find_file($f)) {
- if (defined $opt{'w'}) {print STDERR "$f not found\n";}
- next;
- }
-
- # Find the module and include dependencies.
- ($rmods, $rincs) = find_dependencies( $file_path );
-
- # Remove redundancies (a file can contain multiple procedures that have
- # the same dependencies).
- $file_modules{$f} = rm_duplicates($rmods);
- $file_includes{$f} = rm_duplicates($rincs);
-
- # Make a list of all include files.
- push @check_includes, @{$file_includes{$f}};
-}
-
-#print STDERR "\%file_modules\n";
-#while ( ($k,$v) = each %file_modules ) {
-# print STDERR "$k => @$v\n";
-#}
-#print STDERR "\%file_includes\n";
-#while ( ($k,$v) = each %file_includes ) {
-# print STDERR "$k => @$v\n";
-#}
-#print STDERR "\@check_includes\n";
-#print STDERR "@check_includes\n";
-
-# Find include file dependencies.
-my %include_depends = ();
-while (@check_includes) {
- $f = shift @check_includes;
- if (defined($include_depends{$f})) { next; }
-
- # Mark files not in path so they can be removed from the dependency list.
- unless ($file_path = find_file($f)) {
- $include_depends{$f} = -1;
- next;
- }
-
- # Find include file dependencies.
- ($rmods, $include_depends{$f}) = find_dependencies($file_path);
-
- # Add included include files to the back of the check_includes list so
- # that their dependencies can be found.
- push @check_includes, @{$include_depends{$f}};
-
- # Add included modules to the include_depends list.
- if ( @$rmods ) { push @{$include_depends{$f}}, @$rmods; }
-}
-
-#print STDERR "\%include_depends\n";
-#while ( ($k,$v) = each %include_depends ) {
-# print STDERR (ref $v ? "$k => @$v\n" : "$k => $v\n");
-#}
-
-# Remove include file dependencies that are not in the Filepath.
-my $i, $ii;
-foreach $f (keys %include_depends) {
-
- unless (ref $include_depends{$f}) { next; }
- $rincs = $include_depends{$f};
- unless (@$rincs) { next; }
- $ii = 0;
- $num_incs = @$rincs;
- for ($i = 0; $i < $num_incs; ++$i) {
- if ($include_depends{$$rincs[$ii]} == -1) {
- splice @$rincs, $ii, 1;
- next;
- }
- ++$ii;
- }
-}
-
-# Substitute the include file dependencies into the %file_includes lists.
-foreach $f (keys %file_includes) {
- my @expand_incs = ();
-
- # Initialize the expanded %file_includes list.
- my $i;
- unless (@{$file_includes{$f}}) { next; }
- foreach $i (@{$file_includes{$f}}) {
- push @expand_incs, $i unless ($include_depends{$i} == -1);
- }
- unless (@expand_incs) {
- $file_includes{$f} = [];
- next;
- }
-
- # Expand
- for ($i = 0; $i <= $#expand_incs; ++$i) {
- push @expand_incs, @{ $include_depends{$expand_incs[$i]} };
- }
-
- $file_includes{$f} = rm_duplicates(\@expand_incs);
-}
-
-#print STDERR "expanded \%file_includes\n";
-#while ( ($k,$v) = each %file_includes ) {
-# print STDERR "$k => @$v\n";
-#}
-
-# Print dependencies to STDOUT.
-foreach $f (sort keys %file_modules) {
- $f =~ /(.+)\./;
- $target = "$1.o";
- if ( defined $opt{'t'} ) { $target = "$opt{'t'}/$1.o"; }
- print "$target : $f @{$file_modules{$f}} @{$file_includes{$f}}\n";
-}
-
-#--------------------------------------------------------------------------------------
-
-sub find_dependencies {
-
- # Find dependencies of input file.
- # Use'd Fortran 90 modules are returned in \@mods.
- # Files that are "#include"d by the cpp preprocessor are returned in \@incs.
-
- my( $file ) = @_;
- my( @mods, @incs );
-
- open(FH, $file) or die "Can't open $file: $!\n";
-
- while ( <FH> ) {
- # Search for "#include" and strip filename when found.
- if ( /^#include\s+[<"](.*)[>"]/ ) {
- push @incs, $1;
- }
- # Search for Fortran include dependencies.
- elsif ( /^\s*include\s+['"](.*)['"]/ ) { #" for emacs fontlock
- push @incs, $1;
- }
- # Search for module dependencies.
- elsif ( /^\s*USE\s+(\w+)/i ) {
- ($module = $1) =~ tr/a-z/A-Z/;
- # Return dependency in the form of a .o version of the file that contains
- # the module. This is from the source list.
- if ( defined $module_files{$module} ) {
- if ( defined $obj_dir ) {
- push @mods, "$obj_dir/$module_files{$module}.o";
- } else {
- push @mods, "$module_files{$module}.o";
- }
- }
- # Return dependency in the form of a .mod version of the file that contains
- # the module. This is from the .mod list; used only if no .o version was found.
- elsif ( defined $trumod_files{$module} ) {
- if ( defined $obj_dir ) {
- push @mods, "$obj_dir/$trumod_files{$module}.mod";
- } else {
- push @mods, "$trumod_files{$module}.mod";
- }
- }
- }
- }
- close( FH );
- return (\@mods, \@incs);
-}
-
-#--------------------------------------------------------------------------------------
-
-sub find_file {
-
-# Search for the specified file in the list of directories in the global
-# array @file_paths. Return the first occurrence found, or the null string if
-# the file is not found.
-
- my($file) = @_;
- my($dir, $fname);
-
- foreach $dir (@file_paths) {
- $fname = "$dir/$file";
- if ( -f $fname ) { return $fname; }
- }
- return ''; # file not found
-}
-
-#--------------------------------------------------------------------------------------
-
-sub rm_duplicates {
-
-# Return a list with duplicates removed.
-
- my ($in) = @_; # input array reference
- my @out = ();
- my $i;
- my %h = ();
- foreach $i (@$in) {
- $h{$i} = '';
- }
- @out = keys %h;
- return \@out;
-}
-
-#--------------------------------------------------------------------------------------
-
-sub usage {
- ($ProgName = $0) =~ s!.*/!!; # name of program
-    die <<EOF
-   integer, optional, intent(in) :: iflag !0=>abort if file not found 1=>do not abort
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer i !loop index
- integer klen !length of fulpath character string
- logical lexist !true if local file exists
-!------------------------------------------------------------------------
-
- ! get local file name from full name
-
- locfn = get_filename( fulpath )
- if (len_trim(locfn) == 0) then
- write(iulog,*)'(GETFIL): local filename has zero length'
- call shr_sys_abort
- else
- write(iulog,*)'(GETFIL): attempting to find local file ', &
- trim(locfn)
- endif
-
- ! first check if file is in current working directory.
-
- inquire (file=locfn,exist=lexist)
- if (lexist) then
- write(iulog,*) '(GETFIL): using ',trim(locfn), &
- ' in current working directory'
- RETURN
- endif
-
- ! second check for full pathname on disk
- locfn = fulpath
-
- inquire (file=fulpath,exist=lexist)
- if (lexist) then
- write(iulog,*) '(GETFIL): using ',trim(fulpath)
- RETURN
- else
- write(iulog,*)'(GETFIL): failed getting file from full path: ', fulpath
- if (present(iflag) .and. iflag==0) then
- call shr_sys_abort ('GETFIL: FAILED to get '//trim(fulpath))
- else
- RETURN
- endif
- endif
-
- end subroutine getfil
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: opnfil
-!
-! !INTERFACE:
- subroutine opnfil (locfn, iun, form)
-!
-! !DESCRIPTION:
-! Open file locfn in unformatted or formatted form on unit iun
-!
-! !ARGUMENTS:
-!
- implicit none
- character(len=*), intent(in):: locfn !file name
- integer, intent(in):: iun !fortran unit number
- character(len=1), intent(in):: form !file format: u = unformatted,
- !f = formatted
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer ioe !error return from fortran open
- character(len=11) ft !format type: formatted or unformatted
-!------------------------------------------------------------------------
-
- if (len_trim(locfn) == 0) then
- write(iulog,*)'(OPNFIL): local filename has zero length'
- call shr_sys_abort
- endif
- if (form=='u' .or. form=='U') then
- ft = 'unformatted'
- else
- ft = 'formatted '
- end if
- open (unit=iun,file=locfn,status='unknown',form=ft,iostat=ioe)
- if (ioe /= 0) then
- write(iulog,*)'(OPNFIL): failed to open file ',trim(locfn), &
- & ' on unit ',iun,' ierr=',ioe
- call shr_sys_abort
- else
- write(iulog,*)'(OPNFIL): Successfully opened file ',trim(locfn), &
- & ' on unit= ',iun
- end if
-
- end subroutine opnfil
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: getavu
-!
-! !INTERFACE:
- integer function getavu()
-!
-! !DESCRIPTION:
-! Get next available Fortran unit number.
-!
-! !USES:
- use shr_file_mod, only : shr_file_getUnit
-!
-! !ARGUMENTS:
- implicit none
-!
-! !REVISION HISTORY:
-! Created by Gordon Bonan
-! Modified for clm2 by Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
-!------------------------------------------------------------------------
-
- getavu = shr_file_getunit()
-
- end function getavu
-
-!------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: relavu
-!
-! !INTERFACE:
- subroutine relavu (iunit)
-!
-! !DESCRIPTION:
-! Close and release Fortran unit no longer in use!
-!
-! !USES:
- use shr_file_mod, only : shr_file_freeUnit
-!
-! !ARGUMENTS:
- implicit none
- integer, intent(in) :: iunit !Fortran unit number
-!
-! !REVISION HISTORY:
-! Created by Gordon Bonan
-!
-!EOP
-!------------------------------------------------------------------------
-
- close(iunit)
- call shr_file_freeUnit(iunit)
-
- end subroutine relavu
-
-end module fileutils
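The `getfil` routine deleted above implements a two-step search: try the bare filename in the current working directory first, then the full path, aborting (or not, per `iflag`) if neither exists. A small Python sketch of that search order, under the assumption that "obtain local file" reduces to an existence check:

```python
import os

def getfil(fulpath, must_exist=True):
    """Sketch of GETFIL's search order: prefer the bare filename in the
    current directory, fall back to the full path, then fail or return
    None depending on must_exist (the analogue of iflag)."""
    locfn = os.path.basename(fulpath)
    if os.path.exists(locfn):
        return locfn          # found in current working directory
    if os.path.exists(fulpath):
        return fulpath        # found via full pathname on disk
    if must_exist:
        raise FileNotFoundError(f"GETFIL: FAILED to get {fulpath}")
    return None
```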
diff --git a/tools/mksurfdata_map/src/mkVICparamsMod.F90 b/tools/mksurfdata_map/src/mkVICparamsMod.F90
deleted file mode 100644
index f7cb4946c6..0000000000
--- a/tools/mksurfdata_map/src/mkVICparamsMod.F90
+++ /dev/null
@@ -1,200 +0,0 @@
-module mkVICparamsMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkVICparamsMod
-!
-! !DESCRIPTION:
-! make parameters for VIC
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!-----------------------------------------------------------------------
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public mkVICparams ! make VIC parameters
-!
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkVICparams
-!
-! !INTERFACE:
-subroutine mkVICparams(ldomain, mapfname, datfname, ndiag, &
- binfl_o, ws_o, dsmax_o, ds_o)
-!
-! !DESCRIPTION:
-! make VIC parameters
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_continuous
- use mkchecksMod, only : min_bad
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: binfl_o(:) ! output grid: VIC b parameter for the Variable Infiltration Capacity Curve (unitless)
- real(r8) , intent(out):: ws_o(:) ! output grid: VIC Ws parameter for the ARNO curve (unitless)
- real(r8) , intent(out):: dsmax_o(:) ! output grid: VIC Dsmax parameter for the ARNO curve (mm/day)
- real(r8) , intent(out):: ds_o(:) ! output grid: VIC Ds parameter for the ARNO curve (unitless)
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: data_i(:) ! data on input grid
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
-
- real(r8), parameter :: min_valid_binfl = 0._r8
- real(r8), parameter :: min_valid_ws = 0._r8
- real(r8), parameter :: min_valid_dsmax = 0._r8
- real(r8), parameter :: min_valid_ds = 0._r8
-
- character(len=32) :: subname = 'mkVICparams'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make VIC parameters.....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! -----------------------------------------------------------------
-
- call domain_read(tdomain,datfname)
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Obtain frac_dst
- allocate(frac_dst(ldomain%ns), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- allocate(mask_r8(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! -----------------------------------------------------------------
-
- write(6,*)'Open VIC parameter file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(data_i(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
-
- ! -----------------------------------------------------------------
- ! Regrid binfl
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'binfl', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, binfl_o, nodata=0.1_r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(binfl_o, min_valid_binfl, 'binfl')) then
- stop
- end if
-
- call output_diagnostics_continuous(data_i, binfl_o, tgridmap, "VIC b parameter", "unitless", ndiag, tdomain%mask, frac_dst)
-
- ! -----------------------------------------------------------------
- ! Regrid Ws
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'Ws', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, ws_o, nodata=0.75_r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(ws_o, min_valid_ws, 'Ws')) then
- stop
- end if
-
- call output_diagnostics_continuous(data_i, ws_o, tgridmap, "VIC Ws parameter", "unitless", ndiag, tdomain%mask, frac_dst)
-
- ! -----------------------------------------------------------------
- ! Regrid Dsmax
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'Dsmax', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, dsmax_o, nodata=10._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(dsmax_o, min_valid_dsmax, 'Dsmax')) then
- stop
- end if
-
- call output_diagnostics_continuous(data_i, dsmax_o, tgridmap, "VIC Dsmax parameter", "mm/day", ndiag, tdomain%mask, frac_dst)
-
- ! -----------------------------------------------------------------
- ! Regrid Ds
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'Ds', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, ds_o, nodata=0.1_r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(ds_o, min_valid_ds, 'Ds')) then
- stop
- end if
-
- call output_diagnostics_continuous(data_i, ds_o, tgridmap, "VIC Ds parameter", "unitless", ndiag, tdomain%mask, frac_dst)
-
- ! -----------------------------------------------------------------
- ! Close files and deallocate dynamic memory
- ! -----------------------------------------------------------------
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (data_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
-
- write (6,*) 'Successfully made VIC parameters'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkVICparams
-
-
-end module mkVICparamsMod
diff --git a/tools/mksurfdata_map/src/mkagfirepkmonthMod.F90 b/tools/mksurfdata_map/src/mkagfirepkmonthMod.F90
deleted file mode 100644
index af8001263f..0000000000
--- a/tools/mksurfdata_map/src/mkagfirepkmonthMod.F90
+++ /dev/null
@@ -1,273 +0,0 @@
-module mkagfirepkmonthMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkagfirepkmonthMod
-!
-! !DESCRIPTION:
-! Make agricultural fire peak month data
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!-----------------------------------------------------------------------
-!!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
- implicit none
-
- SAVE
- private ! By default make data private
-!
-! !PUBLIC MEMBER FUNCTIONS:
-!
- public mkagfirepkmon ! Set agricultural fire peak month
-!
-! !PRIVATE MEMBER FUNCTIONS:
- private define_months ! define month strings
-!
-! !PRIVATE DATA MEMBERS:
-!
- integer , parameter :: min_valid_value = 1
- integer , parameter :: max_valid_value = 12
- integer , parameter :: unsetmon = 13 ! flag to indicate agricultural fire peak month NOT set
-!
-! !PRIVATE DATA MEMBERS:
-!
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkagfirepkmon
-!
-! !INTERFACE:
-subroutine mkagfirepkmon(ldomain, mapfname, datfname, ndiag, &
- agfirepkmon_o)
-!
-! !DESCRIPTION:
-! Make agricultural fire peak month data from higher resolution data
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkindexmapMod, only : get_dominant_indices
- use mkvarpar, only : re
- use mkncdio
- use mkchecksMod, only : min_bad, max_bad
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- integer , intent(out):: agfirepkmon_o(:) ! agricultural fire peak month
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: gast_i(:) ! global area, by surface type
- real(r8), allocatable :: gast_o(:) ! global area, by surface type
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer , allocatable :: agfirepkmon_i(:) ! input grid: agricultural fire peak month
- integer :: nagfirepkmon ! number of peak months
- character(len=35), allocatable :: month(:)! name of each month
- integer :: k,ni,no,ns_i,ns_o ! indices
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
-
- integer, parameter :: miss = unsetmon ! missing data indicator
- integer, parameter :: min_valid = 1 ! minimum valid value
- integer, parameter :: max_valid = 13 ! maximum valid value
- character(len=32) :: subname = 'mkagfirepkmon'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make agricultural fire peak month data .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! -----------------------------------------------------------------
-
- call domain_read( tdomain,datfname )
-
- call gridmap_mapread( tgridmap, mapfname )
-
- ! Obtain frac_dst
- ns_o = ldomain%ns
- allocate(frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ns_i = tdomain%ns
- allocate(mask_r8(ns_i), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! -----------------------------------------------------------------
-
- write (6,*) 'Open agricultural fire peak month file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(agfirepkmon_i(ns_i), stat=ier)
- if (ier/=0) call abort()
-
- ! -----------------------------------------------------------------
- ! Regrid ag fire peak month
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'abm', varid), subname)
- call check_ret(nf_get_var_int (ncid, varid, agfirepkmon_i), subname)
- ! Note that any input point that is outside the range [min_valid_value,max_valid_value]
- ! will be ignored; this ignores input points with value of unsetmon
- call get_dominant_indices(tgridmap, agfirepkmon_i, agfirepkmon_o, &
- min_valid_value, max_valid_value, miss, mask_src=tdomain%mask)
-
- ! Check validity of output data
- if (min_bad(agfirepkmon_o, min_valid, 'agfirepkmon') .or. &
- max_bad(agfirepkmon_o, max_valid, 'agfirepkmon')) then
- stop
- end if
-
-
- ! -----------------------------------------------------------------
- ! Output diagnostics comparing global area of each peak month on input and output grids
- !
- ! WJS (3-4-13): I am trying to generally put these diagnostics in mkdiagnosticsMod, but
- ! so far there isn't a general diagnostics routine for categorical data
- !
- ! TODO(wjs, 2016-01-22) Now there is a routine for this: output_diagnostics_index.
- ! However, it currently doesn't provide the capability for named months. Either add
- ! that capability or decide it's not important, then delete the below code, instead
- ! calling output_diagnostics_index.
- ! -----------------------------------------------------------------
-
- nagfirepkmon = maxval(agfirepkmon_i)
- allocate(gast_i(1:nagfirepkmon),gast_o(1:nagfirepkmon),month(1:nagfirepkmon))
- call define_months(nagfirepkmon, month)
-
- gast_i(:) = 0.0_r8
- do ni = 1,ns_i
- k = agfirepkmon_i(ni)
- gast_i(k) = gast_i(k) + tgridmap%area_src(ni)*tdomain%mask(ni)*re**2
- end do
-
- gast_o(:) = 0.0_r8
- do no = 1,ns_o
- k = agfirepkmon_o(no)
- gast_o(k) = gast_o(k) + tgridmap%area_dst(no)*frac_dst(no)*re**2
- end do
-
- ! area comparison
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'Agricultural fire peak month Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,1001)
-1001 format (1x,'peak month',20x,' input grid area output grid area',/ &
- 1x,33x,' 10**6 km**2',' 10**6 km**2')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
-
- do k = 1, nagfirepkmon
- write (ndiag,1002) month(k),gast_i(k)*1.e-6,gast_o(k)*1.e-6
-1002 format (1x,a35,f16.3,f17.3)
- end do
-
- ! -----------------------------------------------------------------
- ! Close files and deallocate dynamic memory
- ! -----------------------------------------------------------------
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (agfirepkmon_i,gast_i,gast_o,month, frac_dst, mask_r8)
-
- write (6,*) 'Successfully made Agricultural fire peak month'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkagfirepkmon
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: define_months
-!
-! !INTERFACE:
-subroutine define_months(nagfirepkmon, month)
-!
-! !DESCRIPTION:
-! Define month strings
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: nagfirepkmon ! max input value (including the 'unset' special value)
- character(len=*), intent(out):: month(:) ! name of each month value
-!
-! !CALLED FROM:
-! subroutine mkagfirepkmon
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
-!-----------------------------------------------------------------------
-
- if (nagfirepkmon == unsetmon) then
- if (size(month) < 13) then
- write(6,*) 'month array too small: ', size(month), ' < 13'
- call abort()
- end if
- month(1) = 'January '
- month(2) = 'February '
- month(3) = 'March '
- month(4) = 'April '
- month(5) = 'May '
- month(6) = 'June '
- month(7) = 'July '
- month(8) = 'August '
- month(9) = 'September '
- month(10) = 'October '
- month(11) = 'November '
- month(12) = 'December '
- month(13) = 'no agricultural fire peak month data'
- else
- write(6,*)'nagfirepkmon value of ',nagfirepkmon,' not supported'
- call abort()
- end if
-
-end subroutine define_months
-!-----------------------------------------------------------------------
-
-
-end module mkagfirepkmonthMod
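The regridding in the module deleted above is categorical rather than an area average: `get_dominant_indices` assigns each output cell the month with the largest total overlapping source area, ignoring source values outside `[min_valid_value, max_valid_value]` (which drops the `unsetmon` flag) and marking cells with no valid overlap as missing. A Python sketch of that dominant-category selection (illustrative only; the weight-triple layout is an assumption):

```python
def dominant_index(weights, cat, n_dst, vmin, vmax, miss):
    """For each destination cell, pick the source category with the
    largest total overlap weight. Source values outside [vmin, vmax]
    are ignored; cells with no valid overlap get `miss`."""
    area = [dict() for _ in range(n_dst)]
    for src, dst, w in weights:
        c = cat[src]
        if vmin <= c <= vmax:
            area[dst][c] = area[dst].get(c, 0.0) + w
    return [max(a, key=a.get) if a else miss for a in area]
```

With the module's constants, `vmin=1`, `vmax=12`, and `miss=unsetmon=13`, a cell overlapped only by flagged (value-13) input ends up flagged on output as well.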
diff --git a/tools/mksurfdata_map/src/mkchecksMod.F90 b/tools/mksurfdata_map/src/mkchecksMod.F90
deleted file mode 100644
index 94b8fe5930..0000000000
--- a/tools/mksurfdata_map/src/mkchecksMod.F90
+++ /dev/null
@@ -1,233 +0,0 @@
-module mkchecksMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkchecksMod
-!
-! !DESCRIPTION:
-! Generic routines to check validity of output fields
-!
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
-
- implicit none
- private
-!
-! !PUBLIC MEMBER FUNCTIONS:
- public :: min_bad ! check the minimum value of a field
- public :: max_bad ! check the maximum value of a field
-
- interface min_bad
- module procedure min_bad_int
- module procedure min_bad_r8
- end interface min_bad
-
- interface max_bad
- module procedure max_bad_int
- module procedure max_bad_r8
- end interface max_bad
-!
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!EOP
-!------------------------------------------------------------------------------
-contains
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: min_bad_r8
-!
-! !INTERFACE:
-logical function min_bad_r8(data, min_allowed, varname)
-!
-! !DESCRIPTION:
-! Confirm that no value of data is less than min_allowed.
-! Returns true if errors found, false otherwise.
-! Also prints offending points
-!
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- real(r8) , intent(in) :: data(:) ! array of data to check
- real(r8) , intent(in) :: min_allowed ! minimum valid value
- character(len=*) , intent(in) :: varname ! name of field
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- logical :: errors_found ! true if any errors have been found
- integer :: n ! index
-
- character(len=*), parameter :: subname = 'min_bad_r8'
-!------------------------------------------------------------------------------
-
- errors_found = .false.
-
- do n = 1, size(data)
- if (data(n) < min_allowed) then
- write(6,*) subname//' ERROR: ', trim(varname), ' = ', data(n), ' less than ',&
- min_allowed, ' at ', n
- errors_found = .true.
- end if
- end do
-
- call shr_sys_flush(6)
- min_bad_r8 = errors_found
-end function min_bad_r8
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: min_bad_int
-!
-! !INTERFACE:
-logical function min_bad_int(data, min_allowed, varname)
-!
-! !DESCRIPTION:
-! Confirm that no value of data is less than min_allowed.
-! Returns true if errors found, false otherwise.
-! Also prints offending points
-!
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: data(:) ! array of data to check
- integer , intent(in) :: min_allowed ! minimum valid value
- character(len=*) , intent(in) :: varname ! name of field
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- logical :: errors_found ! true if any errors have been found
- integer :: n ! index
-
- character(len=*), parameter :: subname = 'min_bad_int'
-!------------------------------------------------------------------------------
-
- errors_found = .false.
-
- do n = 1, size(data)
- if (data(n) < min_allowed) then
- write(6,*) subname//' ERROR: ', trim(varname), ' = ', data(n), ' less than ',&
- min_allowed, ' at ', n
- errors_found = .true.
- end if
- end do
-
- call shr_sys_flush(6)
- min_bad_int = errors_found
-end function min_bad_int
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: max_bad_r8
-!
-! !INTERFACE:
-logical function max_bad_r8(data, max_allowed, varname)
-!
-! !DESCRIPTION:
-! Confirm that no value of data is greater than max_allowed.
-! Returns true if errors found, false otherwise.
-! Also prints offending points
-!
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- real(r8) , intent(in) :: data(:) ! array of data to check
- real(r8) , intent(in) :: max_allowed ! maximum valid value
- character(len=*) , intent(in) :: varname ! name of field
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- logical :: errors_found ! true if any errors have been found
- integer :: n ! index
-
- character(len=*), parameter :: subname = 'max_bad_r8'
-!------------------------------------------------------------------------------
-
- errors_found = .false.
-
- do n = 1, size(data)
- if (data(n) > max_allowed) then
- write(6,*) subname//' ERROR: ', trim(varname), ' = ', data(n), ' greater than ',&
- max_allowed, ' at ', n
- errors_found = .true.
- end if
- end do
-
- call shr_sys_flush(6)
- max_bad_r8 = errors_found
-end function max_bad_r8
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: max_bad_int
-!
-! !INTERFACE:
-logical function max_bad_int(data, max_allowed, varname)
-!
-! !DESCRIPTION:
-! Confirm that no value of data is greater than max_allowed.
-! Returns true if errors found, false otherwise.
-! Also prints offending points
-!
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: data(:) ! array of data to check
- integer , intent(in) :: max_allowed ! maximum valid value
- character(len=*) , intent(in) :: varname ! name of field
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- logical :: errors_found ! true if any errors have been found
- integer :: n ! index
-
- character(len=*), parameter :: subname = 'max_bad_int'
-!------------------------------------------------------------------------------
-
- errors_found = .false.
-
- do n = 1, size(data)
- if (data(n) > max_allowed) then
- write(6,*) subname//' ERROR: ', trim(varname), ' = ', data(n), ' greater than ',&
- max_allowed, ' at ', n
- errors_found = .true.
- end if
- end do
-
- call shr_sys_flush(6)
- max_bad_int = errors_found
-end function max_bad_int
-
-
-end module mkchecksMod
diff --git a/tools/mksurfdata_map/src/mkdiagnosticsMod.F90 b/tools/mksurfdata_map/src/mkdiagnosticsMod.F90
deleted file mode 100644
index a53d9ca4d2..0000000000
--- a/tools/mksurfdata_map/src/mkdiagnosticsMod.F90
+++ /dev/null
@@ -1,452 +0,0 @@
-module mkdiagnosticsMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkdiagnostics
-!
-! !DESCRIPTION:
-! Output diagnostics to log file
-!
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-!
-! !PUBLIC MEMBER FUNCTIONS:
- public :: output_diagnostics_area ! output diagnostics for field that is % of grid area
- public :: output_diagnostics_continuous ! output diagnostics for a continuous (real-valued) field
- public :: output_diagnostics_continuous_outonly ! output diagnostics for a continuous (real-valued) field, just on the output grid
- public :: output_diagnostics_index ! output diagnostics for an index field
-!
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!EOP
-!------------------------------------------------------------------------------
-contains
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: output_diagnostics_area
-!
-! !INTERFACE:
-subroutine output_diagnostics_area(data_i, data_o, gridmap, name, percent, ndiag, mask_src, frac_dst)
-!
-! !DESCRIPTION:
-! Output diagnostics for a field that gives either fraction or percent of grid cell area
-!
-! !USES:
- use mkgridmapMod, only : gridmap_type
- use mkvarpar, only : re
-!
-! !ARGUMENTS:
- implicit none
- real(r8) , intent(in) :: data_i(:) ! data on input grid
- real(r8) , intent(in) :: data_o(:) ! data on output grid
- type(gridmap_type), intent(in) :: gridmap ! mapping info
- character(len=*) , intent(in) :: name ! name of field
- logical , intent(in) :: percent ! is field specified as percent? (alternative is fraction)
- integer , intent(in) :: ndiag ! unit number for diagnostic output
- integer, intent(in) :: mask_src(:)
- real(r8), intent(in) :: frac_dst(:)
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: gdata_i ! global sum of input data
- real(r8) :: gdata_o ! global sum of output data
- real(r8) :: garea_i ! global sum of input area
- real(r8) :: garea_o ! global sum of output area
- integer :: ns_i, ns_o ! sizes of input & output grids
- integer :: ni,no,k ! indices
-
- character(len=*), parameter :: subname = "output_diagnostics_area"
-!------------------------------------------------------------------------------
-
- ! Error check for array size consistencies
-
- ns_i = gridmap%na
- ns_o = gridmap%nb
- if (size(data_i) /= ns_i .or. &
- size(data_o) /= ns_o) then
- write(6,*) subname//' ERROR: array size inconsistencies for ', trim(name)
- write(6,*) 'size(data_i) = ', size(data_i)
- write(6,*) 'ns_i = ', ns_i
- write(6,*) 'size(data_o) = ', size(data_o)
- write(6,*) 'ns_o = ', ns_o
- stop
- end if
- if (size(frac_dst) /= ns_o) then
- write(6,*) subname//' ERROR: incorrect size of frac_dst'
- write(6,*) 'size(frac_dst) = ', size(frac_dst)
- write(6,*) 'ns_o = ', ns_o
- call abort()
- end if
- if (size(mask_src) /= ns_i) then
- write(6,*) subname//' ERROR: incorrect size of mask_src'
- write(6,*) 'size(mask_src) = ', size(mask_src)
- write(6,*) 'ns_i = ', ns_i
- call abort()
- end if
-
- ! Sums on input grid
-
- gdata_i = 0.
- garea_i = 0.
- do ni = 1,ns_i
- garea_i = garea_i + gridmap%area_src(ni)*re**2
- gdata_i = gdata_i + data_i(ni) * gridmap%area_src(ni) * mask_src(ni) * re**2
- end do
-
- ! Sums on output grid
-
- gdata_o = 0.
- garea_o = 0.
- do no = 1,ns_o
- garea_o = garea_o + gridmap%area_dst(no)*re**2
- gdata_o = gdata_o + data_o(no) * gridmap%area_dst(no) * frac_dst(no) * re**2
- end do
-
- ! Correct units
-
- if (percent) then
- gdata_i = gdata_i / 100._r8
- gdata_o = gdata_o / 100._r8
- end if
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) trim(name), ' Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
-2001 format (1x,'surface type input grid area output grid area'/ &
- 1x,' 10**6 km**2 10**6 km**2 ')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2002) name, gdata_i*1.e-06, gdata_o*1.e-06
- write (ndiag,2002) 'all surface', garea_i*1.e-06, garea_o*1.e-06
-2002 format (1x,a12, f14.3,f17.3)
-
-end subroutine output_diagnostics_area
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: output_diagnostics_continuous
-!
-! !INTERFACE:
-subroutine output_diagnostics_continuous(data_i, data_o, gridmap, name, units, ndiag, mask_src, frac_dst)
-!
-! !DESCRIPTION:
-! Output diagnostics for a continuous field (but not area, for which there is a different routine)
-!
-! !USES:
- use mkgridmapMod, only : gridmap_type
- use mkvarpar, only : re
-!
-! !ARGUMENTS:
- implicit none
- real(r8) , intent(in) :: data_i(:) ! data on input grid
- real(r8) , intent(in) :: data_o(:) ! data on output grid
- type(gridmap_type), intent(in) :: gridmap ! mapping info
- character(len=*) , intent(in) :: name ! name of field
- character(len=*) , intent(in) :: units ! units of field
- integer , intent(in) :: ndiag ! unit number for diagnostic output
- integer, intent(in) :: mask_src(:)
- real(r8), intent(in) :: frac_dst(:)
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: gdata_i ! global sum of input data
- real(r8) :: gdata_o ! global sum of output data
- real(r8) :: gwt_i ! global sum of input weights (area * frac)
- real(r8) :: gwt_o ! global sum of output weights (area * frac)
- integer :: ns_i, ns_o ! sizes of input & output grids
- integer :: ni,no,k ! indices
-
- character(len=*), parameter :: subname = "output_diagnostics_continuous"
-!------------------------------------------------------------------------------
-
- ! Error check for array size consistencies
-
- ns_i = gridmap%na
- ns_o = gridmap%nb
- if (size(data_i) /= ns_i .or. &
- size(data_o) /= ns_o) then
- write(6,*) subname//' ERROR: array size inconsistencies for ', trim(name)
- write(6,*) 'size(data_i) = ', size(data_i)
- write(6,*) 'ns_i = ', ns_i
- write(6,*) 'size(data_o) = ', size(data_o)
- write(6,*) 'ns_o = ', ns_o
- stop
- end if
- if (size(frac_dst) /= ns_o) then
- write(6,*) subname//' ERROR: incorrect size of frac_dst'
- write(6,*) 'size(frac_dst) = ', size(frac_dst)
- write(6,*) 'ns_o = ', ns_o
- call abort()
- end if
- if (size(mask_src) /= ns_i) then
- write(6,*) subname//' ERROR: incorrect size of mask_src'
- write(6,*) 'size(mask_src) = ', size(mask_src)
- write(6,*) 'ns_i = ', ns_i
- call abort()
- end if
-
- ! Sums on input grid
-
- gdata_i = 0.
- gwt_i = 0.
- do ni = 1,ns_i
- gdata_i = gdata_i + data_i(ni) * gridmap%area_src(ni) * mask_src(ni)
- gwt_i = gwt_i + gridmap%area_src(ni) * mask_src(ni)
- end do
-
- ! Sums on output grid
-
- gdata_o = 0.
- gwt_o = 0.
- do no = 1,ns_o
- gdata_o = gdata_o + data_o(no) * gridmap%area_dst(no) * frac_dst(no)
- gwt_o = gwt_o + gridmap%area_dst(no) * frac_dst(no)
- end do
-
- ! Correct units
-
- gdata_i = gdata_i / gwt_i
- gdata_o = gdata_o / gwt_o
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) trim(name), ' Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
- write (ndiag,2002) units, units
-2001 format (1x,' parameter input grid output grid')
-2002 format (1x,' ', a24, a24)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2003) name, gdata_i, gdata_o
-2003 format (1x,a12, f22.3,f17.3)
-
-end subroutine output_diagnostics_continuous
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: output_diagnostics_continuous_outonly
-!
-! !INTERFACE:
-subroutine output_diagnostics_continuous_outonly(data_o, gridmap, name, units, ndiag)
-!
-! !DESCRIPTION:
-! Output diagnostics for a continuous field, just on the output grid
-! This is used when the average of the field on the input grid is not of interest (e.g.,
-! when the output quantity is the standard deviation of the input field)
-!
-! !USES:
- use mkgridmapMod, only : gridmap_type
- use mkvarpar, only : re
-!
-! !ARGUMENTS:
- implicit none
- real(r8) , intent(in) :: data_o(:) ! data on output grid
- type(gridmap_type), intent(in) :: gridmap ! mapping info
- character(len=*) , intent(in) :: name ! name of field
- character(len=*) , intent(in) :: units ! units of field
- integer , intent(in) :: ndiag ! unit number for diagnostic output
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: gdata_o ! global sum of output data
- real(r8) :: gwt_o ! global sum of output weights (area * frac)
- integer :: ns_o ! size of output grid
- integer :: no,k ! indices
-
- character(len=*), parameter :: subname = "output_diagnostics_continuous_outonly"
-!------------------------------------------------------------------------------
-
- ! Error check for array size consistencies
-
- ns_o = gridmap%nb
- if (size(data_o) /= ns_o) then
- write(6,*) subname//' ERROR: array size inconsistencies for ', trim(name)
- write(6,*) 'size(data_o) = ', size(data_o)
- write(6,*) 'ns_o = ', ns_o
- stop
- end if
-
- ! Sums on output grid
-
- gdata_o = 0.
- gwt_o = 0.
- do no = 1,ns_o
- gdata_o = gdata_o + data_o(no)*gridmap%area_dst(no)*gridmap%frac_dst(no)
- gwt_o = gwt_o + gridmap%area_dst(no)*gridmap%frac_dst(no)
- end do
-
- ! Correct units
-
- gdata_o = gdata_o / gwt_o
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) trim(name), ' Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
- write (ndiag,2002) units
-2001 format (1x,' parameter output grid')
-2002 format (1x,' ', a24)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2003) name, gdata_o
-2003 format (1x,a12, f22.3)
-
-end subroutine output_diagnostics_continuous_outonly
-!------------------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-subroutine output_diagnostics_index(data_i, data_o, gridmap, name, &
- minval, maxval, ndiag, mask_src, frac_dst)
- !
- ! !DESCRIPTION:
- ! Output diagnostics for an index field: area of each index in input and output
- !
- ! !USES:
- use mkvarpar, only : re
- use mkgridmapMod, only : gridmap_type
- !
- ! !ARGUMENTS:
- integer , intent(in) :: data_i(:) ! data on input grid
- integer , intent(in) :: data_o(:) ! data on output grid
- type(gridmap_type) , intent(in) :: gridmap ! mapping info
- character(len=*) , intent(in) :: name ! name of field
- integer , intent(in) :: minval ! minimum valid value
- integer , intent(in) :: maxval ! maximum valid value
- integer , intent(in) :: ndiag ! unit number for diagnostic output
- integer , intent(in) :: mask_src(:)
- real(r8) , intent(in) :: frac_dst(:)
- !
- ! !LOCAL VARIABLES:
- integer :: ns_i, ns_o ! sizes of input & output grids
- integer :: ni, no, k ! indices
- real(r8), allocatable :: garea_i(:) ! input grid: global area of each index
- real(r8), allocatable :: garea_o(:) ! output grid: global area of each index
- integer :: ier ! error status
-
- character(len=*), parameter :: subname = 'output_diagnostics_index'
- !-----------------------------------------------------------------------
-
- ! Error check for array size consistencies
-
- ns_i = gridmap%na
- ns_o = gridmap%nb
- if (size(data_i) /= ns_i .or. &
- size(data_o) /= ns_o) then
- write(6,*) subname//' ERROR: array size inconsistencies for ', trim(name)
- write(6,*) 'size(data_i) = ', size(data_i)
- write(6,*) 'ns_i = ', ns_i
- write(6,*) 'size(data_o) = ', size(data_o)
- write(6,*) 'ns_o = ', ns_o
- stop
- end if
- if (size(frac_dst) /= ns_o) then
- write(6,*) subname//' ERROR: incorrect size of frac_dst'
- write(6,*) 'size(frac_dst) = ', size(frac_dst)
- write(6,*) 'ns_o = ', ns_o
- call abort()
- end if
- if (size(mask_src) /= ns_i) then
- write(6,*) subname//' ERROR: incorrect size of mask_src'
- write(6,*) 'size(mask_src) = ', size(mask_src)
- write(6,*) 'ns_i = ', ns_i
- call abort()
- end if
-
- ! Sum areas on input grid
-
- allocate(garea_i(minval:maxval), stat=ier)
- if (ier/=0) call abort()
-
- garea_i(:) = 0.
- do ni = 1, ns_i
- k = data_i(ni)
- if (k >= minval .and. k <= maxval) then
- garea_i(k) = garea_i(k) + gridmap%area_src(ni) * mask_src(ni) * re**2
- end if
- end do
-
- ! Sum areas on output grid
-
- allocate(garea_o(minval:maxval), stat=ier)
- if (ier/=0) call abort()
-
- garea_o(:) = 0.
- do no = 1, ns_o
- k = data_o(no)
- if (k >= minval .and. k <= maxval) then
- garea_o(k) = garea_o(k) + gridmap%area_dst(no) * frac_dst(no) * re**2
- end if
- end do
-
- ! Write results
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) trim(name), ' Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
-2001 format (1x,'index input grid area output grid area',/ &
- 1x,' 10**6 km**2 10**6 km**2')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
-
- do k = minval, maxval
- write (ndiag,2002) k, garea_i(k)*1.e-06, garea_o(k)*1.e-06
-2002 format (1x,i9,f17.3,f18.3)
- end do
-
- ! Deallocate memory
-
- deallocate(garea_i, garea_o)
-
-end subroutine output_diagnostics_index
-
-
-
-end module mkdiagnosticsMod
diff --git a/tools/mksurfdata_map/src/mkdomainMod.F90 b/tools/mksurfdata_map/src/mkdomainMod.F90
deleted file mode 100644
index 84865458b0..0000000000
--- a/tools/mksurfdata_map/src/mkdomainMod.F90
+++ /dev/null
@@ -1,936 +0,0 @@
-module mkdomainMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkdomain
-!
-! !DESCRIPTION:
-! Module containing 2-d global surface boundary data information
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkvarpar , only : re
- use nanMod , only : nan, bigint
-!
-! !PUBLIC TYPES:
- implicit none
- private
-!
- public :: domain_type
-
- type domain_type
- character*16 :: set ! flag to check if domain is set
- integer :: ns ! global size of domain
- integer :: ni,nj ! for 2d domains only
- real(r8) :: edgen ! lsmedge north
- real(r8) :: edgee ! lsmedge east
- real(r8) :: edges ! lsmedge south
- real(r8) :: edgew ! lsmedge west
- integer ,pointer :: mask(:) ! land mask: 1 = land, 0 = ocean
- real(r8),pointer :: frac(:) ! fractional land
- real(r8),pointer :: latc(:) ! latitude of grid cell (deg)
- real(r8),pointer :: lonc(:) ! longitude of grid cell (deg)
- real(r8),pointer :: lats(:) ! grid cell latitude, S edge (deg)
- real(r8),pointer :: latn(:) ! grid cell latitude, N edge (deg)
- real(r8),pointer :: lonw(:) ! grid cell longitude, W edge (deg)
- real(r8),pointer :: lone(:) ! grid cell longitude, E edge (deg)
- real(r8),pointer :: area(:) ! grid cell area (km**2) (only used for output grid)
- logical :: is_2d ! if this is a 2-d domain
- logical :: fracset ! if frac is set
- logical :: maskset ! if mask is set
- end type domain_type
-
-!
-! !PUBLIC MEMBER FUNCTIONS:
- public domain_clean
- public domain_check
- public domain_read
- public domain_read_dims ! get dimensions from a domain file (only public for unit testing)
- public domain_read_map
- public domain_write
- public domain_checksame
- public for_test_create_domain ! For unit testing create a simple domain
-!
-!
-! !REVISION HISTORY:
-! Originally clm_varsur by Mariana Vertenstein
-! Migrated from clm_varsur to domainMod by T Craig
-!
- character*16,parameter :: set = 'domain_set '
- character*16,parameter :: unset = 'NOdomain_unsetNO'
-
- real(r8) :: flandmin = 0.001 !minimum land frac for land cell
-!
-! !PRIVATE MEMBER FUNCTIONS:
- private domain_init
-!
-!EOP
-!------------------------------------------------------------------------------
-
-contains
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_init
-!
-! !INTERFACE:
- subroutine domain_init(domain,ns)
-!
-! !DESCRIPTION:
-! This subroutine allocates the domain type and initializes its fields to NaN
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) :: domain ! domain datatype
- integer :: ns ! grid size, 2d
-!
-! !REVISION HISTORY:
-! Created by T Craig
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer ier
- integer nb,ne
-!
-!------------------------------------------------------------------------------
-
- nb = 1
- ne = ns
-
- if (domain%set == set) then
- call domain_clean(domain)
- endif
-
- allocate(domain%mask(ns), &
- domain%frac(ns), &
- domain%latc(ns), &
- domain%lonc(ns), &
- domain%lats(ns), &
- domain%latn(ns), &
- domain%lonw(ns), &
- domain%lone(ns), &
- domain%area(ns), stat=ier)
- if (ier /= 0) then
- write(6,*) 'domain_init ERROR: allocate mask, frac, lat, lon, area '
- endif
-
- domain%ns = ns
- domain%mask = -9999
- domain%frac = -1.0e36
- domain%latc = nan
- domain%lonc = nan
- domain%area = nan
- domain%set = set
- domain%fracset = .false.
- domain%maskset = .false.
-
- end subroutine domain_init
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_clean
-!
-! !INTERFACE:
- subroutine domain_clean(domain)
-!
-! !DESCRIPTION:
-! This subroutine deallocates the domain type
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) :: domain ! domain datatype
-!
-! !REVISION HISTORY:
-! Created by T Craig
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer ier
-!
-!------------------------------------------------------------------------------
-
- if (domain%set == set) then
- write(6,*) 'domain_clean: cleaning ',domain%ns
- deallocate(domain%mask, &
- domain%frac, &
- domain%latc, &
- domain%lonc, &
- domain%lats, &
- domain%latn, &
- domain%lonw, &
- domain%lone, &
- domain%area, stat=ier)
- if (ier /= 0) then
- write(6,*) 'domain_clean ERROR: deallocate mask, frac, lat, lon, area '
- call abort()
- endif
- else
- write(6,*) 'domain_clean WARN: clean domain unnecessary '
- endif
-
- domain%ns = bigint
- domain%set = unset
- domain%fracset = .false.
- domain%maskset = .false.
-
-end subroutine domain_clean
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_check
-!
-! !INTERFACE:
- subroutine domain_check(domain)
-!
-! !DESCRIPTION:
-! This subroutine writes domain info
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type),intent(in) :: domain ! domain datatype
-!
-! !REVISION HISTORY:
-! Created by T Craig
-!
-!
-! !LOCAL VARIABLES:
-!
-!EOP
-!------------------------------------------------------------------------------
-
- write(6,*) ' domain_check set = ',trim(domain%set)
- write(6,*) ' domain_check ns = ',domain%ns
- write(6,*) ' domain_check lonc = ',minval(domain%lonc),maxval(domain%lonc)
- write(6,*) ' domain_check latc = ',minval(domain%latc),maxval(domain%latc)
- write(6,*) ' domain_check mask = ',minval(domain%mask),maxval(domain%mask)
- write(6,*) ' domain_check frac = ',minval(domain%frac),maxval(domain%frac)
- write(6,*) ' '
-
-end subroutine domain_check
-
-!----------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_read_map
-!
-! !INTERFACE:
- logical function domain_read_map(domain, fname)
-!
-! !DESCRIPTION:
-! Read a grid file
-!
-! !USES:
- use mkncdio, only : convert_latlon
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type),intent(inout) :: domain
- character(len=*) ,intent(in) :: fname ! this assumes a SCRIP mapping file - look at destination grid
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- include 'netcdf.inc'
- integer :: i,j,n ! indices
- integer :: grid_rank ! rank of domain grid
- integer :: ns ! size of domain grid
- integer :: ncid ! netCDF file id
- integer :: dimid ! netCDF dimension id
- integer :: varid ! netCDF variable id
- integer :: ndims ! number of dims for variable
- integer :: ier ! error status
- real(r8), allocatable :: xv(:,:) ! local array for corner lons
- real(r8), allocatable :: yv(:,:) ! local array for corner lats
- integer :: grid_dims(2)
- character(len= 32) :: subname = 'domain_read'
-!-----------------------------------------------------------------
-
- domain_read_map = .true.
-
- ! Read domain file and compute derived quantities as needed
-
- call check_ret(nf_open(fname, 0, ncid), subname)
-
- ! Assume unstructured grid
-
- domain%ni = -9999
- domain%nj = -9999
- domain%is_2d = .false.
-
- ier = nf_inq_dimid (ncid, 'n_b', dimid)
- if ( ier /= NF_NOERR )then
- domain_read_map = .false.
- else
- call check_ret(nf_inq_dimlen (ncid, dimid, domain%ns), subname)
-
- call check_ret(nf_inq_dimid (ncid, 'dst_grid_rank', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, grid_rank), subname)
-
- if (grid_rank == 2) then
- call check_ret(nf_inq_varid (ncid, 'dst_grid_dims', varid), subname)
- call check_ret(nf_get_var_int (ncid, varid, grid_dims), subname)
- domain%ni = grid_dims(1)
- domain%nj = grid_dims(2)
- domain%is_2d = .true.
- end if
-
- call domain_init(domain, domain%ns)
- ns = domain%ns
-
- call check_ret(nf_inq_varid (ncid, 'xc_b', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%lonc), subname)
- call convert_latlon(ncid, 'xc_b', domain%lonc)
-
- call check_ret(nf_inq_varid (ncid, 'yc_b', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%latc), subname)
- call convert_latlon(ncid, 'yc_b', domain%latc)
-
- if (grid_rank == 2 ) then
- allocate(yv(4,ns), xv(4,ns))
- call check_ret(nf_inq_varid (ncid, 'yv_b', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, yv), subname)
- call check_ret(nf_inq_varid (ncid, 'xv_b', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, xv), subname)
-
- domain%lats(:) = yv(1,:)
- call convert_latlon(ncid, 'yv_b', domain%lats(:))
-
- domain%latn(:) = yv(3,:)
- call convert_latlon(ncid, 'yv_b', domain%latn(:))
-
- domain%lonw(:) = xv(1,:)
- call convert_latlon(ncid, 'xv_b', domain%lonw(:))
-
- domain%lone(:) = xv(2,:)
- call convert_latlon(ncid, 'xv_b', domain%lone(:))
-
- domain%edgen = maxval(domain%latn)
- domain%edgee = maxval(domain%lone)
- domain%edges = minval(domain%lats)
- domain%edgew = minval(domain%lonw)
- deallocate(yv,xv)
- end if
-
- call check_ret(nf_inq_varid (ncid, 'frac_b', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%frac), subname)
-
- call check_ret(nf_inq_varid (ncid, 'area_b', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%area), subname)
- domain%area = domain%area * re**2
- end if
- domain%maskset = .true.
- domain%fracset = .true.
-
- call check_ret(nf_close(ncid), subname)
-
- end function domain_read_map
-
-!----------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_read
-!
-! !INTERFACE:
- subroutine domain_read(domain, fname, readmask)
-!
-! !DESCRIPTION:
-! Read a grid file
-!
-! !USES:
- use mkncdio, only : convert_latlon
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type),intent(inout) :: domain
- character(len=*) ,intent(in) :: fname
- logical,optional, intent(in) :: readmask ! true => read mask instead of landmask for urban parameters
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- include 'netcdf.inc'
- integer :: i,j,n
- real(r8), allocatable :: lon1d(:) ! local array for 1d lon
- real(r8), allocatable :: lat1d(:) ! local array for 1d lat
- real(r8), allocatable :: xv(:,:) ! local array for corner lons
- real(r8), allocatable :: yv(:,:) ! local array for corner lats
- integer :: ncid ! netCDF file id
- integer :: varid ! netCDF variable id
- logical :: edgeNESWset ! local EDGE[NESW]
- logical :: lonlatset ! local lon(:,:), lat(:,:)
- logical :: llneswset ! local lat[ns],lon[we]
- logical :: landfracset ! local landfrac
- logical :: maskset ! local mask
- integer :: ndims ! number of dims for variable
- integer :: ier ! error status
- logical :: lreadmask ! local readmask
- character(len= 32) :: lonvar ! name of 2-d longitude variable
- character(len= 32) :: latvar ! name of 2-d latitude variable
- character(len= 32) :: subname = 'domain_read'
-!-----------------------------------------------------------------
-
- lonlatset = .false.
- edgeNESWset = .false.
- llneswset = .false.
- landfracset = .false.
- maskset = .false.
- lreadmask = .false.
-
- if (present(readmask)) then
- lreadmask = readmask
- end if
-
- call check_ret(nf_open(fname, 0, ncid), subname)
-
- call domain_read_dims(domain, ncid)
- call domain_init(domain, domain%ns)
- write(6,*) trim(subname),' initialized domain'
-
- ! ----- Set lat/lon variable ------
-
- lonvar = ' '
- latvar = ' '
-
- if (.not. lonlatset) then
- ier = nf_inq_varid (ncid, 'LONGXY', varid)
- if (ier == NF_NOERR) then
- lonvar = 'LONGXY'
- latvar = 'LATIXY'
- lonlatset = .true.
- end if
- end if
-
- if (.not. lonlatset) then
- ier = nf_inq_varid (ncid, 'lon', varid)
- if (ier == NF_NOERR) then
- lonvar = 'lon'
- latvar = 'lat'
- lonlatset = .true.
- end if
- end if
-
- if (.not. lonlatset) then
- ier = nf_inq_varid (ncid, 'LONGITUDE', varid)
- if (ier == NF_NOERR) then
- lonvar = 'LONGITUDE'
- latvar = 'LATITUDE'
- lonlatset = .true.
- end if
- end if
-
- if (.not. lonlatset) then
- write(6,*)'lon/lat values not set'
- write(6,*)'currently assume that either lon/lat, LONGXY/LATIXY,', &
- ' or LONGITUDE/LATITUDE variables are on the input dataset'
- call abort()
- end if
-
- call check_ret(nf_inq_varid (ncid, lonvar, varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%lonc), subname)
- call convert_latlon(ncid, lonvar, domain%lonc)
-
- call check_ret(nf_inq_varid (ncid, latvar, varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%latc), subname)
- call convert_latlon(ncid, latvar, domain%latc)
-
- ! ----- Set landmask/landfrac ------
-
- ier = nf_inq_varid (ncid, 'frac', varid)
- if (ier == NF_NOERR) then
- if (landfracset) write(6,*) trim(subname),' WARNING, overwriting frac'
- landfracset = .true.
- write(6,*) trim(subname),' read frac'
- call check_ret(nf_inq_varid (ncid, 'frac', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%frac), subname)
- endif
-
- ier = nf_inq_varid (ncid, 'LANDFRAC', varid)
- if (ier == NF_NOERR) then
- if (landfracset) write(6,*) trim(subname),' WARNING, overwriting frac'
- landfracset = .true.
- write(6,*) trim(subname),' read LANDFRAC'
- call check_ret(nf_inq_varid (ncid, 'LANDFRAC', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, domain%frac), subname)
- endif
-
- if (lreadmask) then
- ier = nf_inq_varid (ncid, 'mask', varid)
- if (ier == NF_NOERR) then
- if (maskset) write(6,*) trim(subname),' WARNING, overwriting mask'
- maskset = .true.
- write(6,*) trim(subname),' read mask with lreadmask set'
- call check_ret(nf_inq_varid (ncid, 'mask', varid), subname)
- call check_ret(nf_get_var_int (ncid, varid, domain%mask), subname)
- endif
- else
- ier = nf_inq_varid (ncid, 'mask', varid)
- if (ier == NF_NOERR) then
- if (maskset) write(6,*) trim(subname),' WARNING, overwriting mask'
- maskset = .true.
- write(6,*) trim(subname),' read mask'
- call check_ret(nf_inq_varid (ncid, 'mask', varid), subname)
- call check_ret(nf_get_var_int (ncid, varid, domain%mask), subname)
- endif
- ier = nf_inq_varid (ncid, 'LANDMASK', varid)
- if (ier == NF_NOERR) then
- if (maskset) write(6,*) trim(subname),' WARNING, overwriting mask'
- maskset = .true.
- write(6,*) trim(subname),' read LANDMASK'
- call check_ret(nf_inq_varid (ncid, 'LANDMASK', varid), subname)
- call check_ret(nf_get_var_int (ncid, varid, domain%mask), subname)
- endif
- end if
-
- call check_ret(nf_close(ncid), subname)
-
- ! ----- set derived variables ----
-
- if (.not.maskset.and.landfracset) then
- maskset = .true.
- where (domain%frac < flandmin)
- domain%mask = 0 !ocean
- elsewhere
- domain%mask = 1 !land
- endwhere
- endif
-
- if (.not.landfracset.and.maskset) then
- landfracset = .true.
- do n = 1,domain%ns
- if ( domain%mask(n) == 0 )then
- domain%frac(n) = 0._r8 !ocean
- else
- domain%frac(n) = 1._r8 !land
- end if
- end do
- endif
- domain%maskset = maskset
- domain%fracset = landfracset
-
- end subroutine domain_read
-
-!----------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_read_dims
-!
-! !INTERFACE:
- subroutine domain_read_dims(domain, ncid)
-!
-! !DESCRIPTION:
-! get dimension size(s) from a domain file
-! sets domain%ns, domain%is_2d; and (if 2-d) domain%ni and domain%nj
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type),intent(inout) :: domain
- integer ,intent(in) :: ncid ! ID of an open netcdf file
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- logical :: dimset ! has dimension information been set?
- character(len= 32) :: subname = 'domain_read_dims'
-!-----------------------------------------------------------------
-
- ! Assume unstructured grid
- domain%ni = -9999
- domain%nj = -9999
- domain%is_2d = .false.
-
- dimset = .false.
-
- ! Note: We use the first dimension that is found in the following list
-
- ! ----- First try to find 2-d info ------
-
- call domain_read_dims_2d(domain, dimset, ncid, 'lsmlon', 'lsmlat')
- call domain_read_dims_2d(domain, dimset, ncid, 'ni', 'nj')
- call domain_read_dims_2d(domain, dimset, ncid, 'lon', 'lat')
-
- ! ----- If we haven't found 2-d info, try to find 1-d info -----
-
- call domain_read_dims_1d(domain, dimset, ncid, 'num_pixels')
-
- ! ----- If we haven't found any info, abort -----
-
- if (.not. dimset) then
- write(6,*) trim(subname),' ERROR: dims not set'
- call abort()
- endif
-
- contains
-
-!----------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_read_dims_2d
-!
-! !INTERFACE:
- subroutine domain_read_dims_2d(domain, dimset, ncid, lon_name, lat_name)
-!
-! !DESCRIPTION:
-! Try to read 2-d dimension size information
-!
-! Checks whether the given lon_name is found in the netcdf file. If it is:
-! (a) If dimset is already true, then it issues a warning and returns
-! (b) If dimset is false, then this sets:
-! - domain%ni
-! - domain%nj
-! - domain%ns
-! - domain%is_2d
-! - dimset = true
-!
-! If the given lon_name is not found, the above variables are left unchanged
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type),intent(inout) :: domain
- logical ,intent(inout) :: dimset ! has dimension information been set?
- integer ,intent(in) :: ncid ! ID of an open netCDF file
- character(len=*) ,intent(in) :: lon_name
- character(len=*) ,intent(in) :: lat_name
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- include 'netcdf.inc'
- integer :: dimid ! netCDF dimension id
- integer :: nlon,nlat ! size
- integer :: ier ! error status
-
- character(len= 32) :: subname = 'domain_read_dims_2d'
-
-!-----------------------------------------------------------------
-
- ier = nf_inq_dimid (ncid, lon_name, dimid)
- if (ier == NF_NOERR) then
- if (dimset) then
- write(6,*) trim(subname),' WARNING: dimension sizes already set; skipping ', &
- lon_name, '/', lat_name
- else
- write(6,*) trim(subname),' read lon and lat dims from ', lon_name, '/', lat_name
- call check_ret(nf_inq_dimid (ncid, lon_name, dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, nlon), subname)
- call check_ret(nf_inq_dimid (ncid, lat_name, dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, nlat), subname)
- domain%ni = nlon
- domain%nj = nlat
- domain%ns = nlon * nlat
- domain%is_2d = .true.
- dimset = .true.
- end if
- endif
-
- end subroutine domain_read_dims_2d
-
-
-!----------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_read_dims_1d
-!
-! !INTERFACE:
- subroutine domain_read_dims_1d(domain, dimset, ncid, dim_name)
-!
-! !DESCRIPTION:
-! Try to read 1-d dimension size information
-!
-! Checks whether the given dim_name is found in the netcdf file. If it is:
-! (a) If dimset is already true, then it issues a warning and returns
-! (b) If dimset is false, then this sets:
-! - domain%ns
-! - domain%is_2d
-! - dimset = true
-!
-! If the given dim_name is not found, the above variables are left unchanged
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type),intent(inout) :: domain
- logical ,intent(inout) :: dimset ! has dimension information been set?
- integer ,intent(in) :: ncid ! ID of an open netCDF file
- character(len=*) ,intent(in) :: dim_name
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- include 'netcdf.inc'
- integer :: dimid ! netCDF dimension id
- integer :: npts ! size
- integer :: ier ! error status
-
- character(len= 32) :: subname = 'domain_read_dims_1d'
-
-!-----------------------------------------------------------------
-
- ier = nf_inq_dimid (ncid, dim_name, dimid)
- if (ier == NF_NOERR) then
- if (dimset) then
- write(6,*) trim(subname),' WARNING: dimension sizes already set; skipping ', dim_name
- else
- write(6,*) trim(subname),' read 1-d length from ', dim_name
- call check_ret(nf_inq_dimid (ncid, dim_name, dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, npts), subname)
- domain%ns = npts
- domain%is_2d = .false.
- dimset = .true.
- end if
- endif
-
- end subroutine domain_read_dims_1d
-
- end subroutine domain_read_dims
-
-
-!----------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_write
-!
-! !INTERFACE:
- subroutine domain_write(domain,fname)
-!
-! !DESCRIPTION:
-! Write a domain to netcdf
-
-! !ARGUMENTS:
- implicit none
- include 'netcdf.inc'
- type(domain_type),intent(inout) :: domain
- character(len=*) ,intent(in) :: fname
-!
-! !REVISION HISTORY:
-! Author: T Craig
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: varid !netCDF variable id
- integer :: ncid !netCDF file id
- integer :: omode !netCDF output mode
- character(len= 32) :: subname = 'domain_write'
-!-----------------------------------------------------------------
-
- call check_ret(nf_open(trim(fname), nf_write, ncid), subname)
-
- ! Set fill mode to "no fill" to optimize performance
-
- call check_ret(nf_set_fill (ncid, nf_nofill, omode), subname)
-
- ! Write domain fields
-
- call check_ret(nf_inq_varid(ncid, 'AREA', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, domain%area), subname)
-
- call check_ret(nf_inq_varid(ncid, 'LONGXY', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, domain%lonc), subname)
-
- call check_ret(nf_inq_varid(ncid, 'LATIXY', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, domain%latc), subname)
-
- ! Synchronize the disk copy of a netCDF dataset with in-memory buffers
-
- call check_ret(nf_sync(ncid), subname)
-
- ! Close the grid dataset
-
- call check_ret(nf_close(ncid), subname)
-
- end subroutine domain_write
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: check_ret
-!
-! !INTERFACE:
- subroutine check_ret(ret, calling)
-!
-! !DESCRIPTION:
-! Check return status from netcdf call
-!
-! !ARGUMENTS:
- implicit none
- include 'netcdf.inc'
- integer, intent(in) :: ret
- character(len=*) :: calling
-!
-! !REVISION HISTORY:
-!
-!EOP
-!-----------------------------------------------------------------------
-
- if (ret /= NF_NOERR) then
- write(6,*)'netcdf error from ',trim(calling), ' rcode = ', ret, &
- ' error = ', NF_STRERROR(ret)
- call abort()
- end if
-
- end subroutine check_ret
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: domain_checksame
-!
-! !INTERFACE:
- subroutine domain_checksame( srcdomain, dstdomain, tgridmap )
-!
-! !DESCRIPTION:
-! Check that the input domains agree with the input map
-!
-! USES:
- use mkgridmapMod, only : gridmap_type, gridmap_setptrs
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: srcdomain ! input domain
- type(domain_type), intent(in) :: dstdomain ! output domain
- type(gridmap_type),intent(in) :: tgridmap ! grid map
-!
-! !REVISION HISTORY:
-!
-!EOP
-!-----------------------------------------------------------------------
- integer :: na, nb, ns ! gridmap sizes
- integer :: n, ni ! indices
- real(r8), pointer :: xc_src(:) ! Source longitude
- real(r8), pointer :: yc_src(:) ! Source latitude
- integer, pointer :: src_indx(:) ! Source index
- real(r8), pointer :: xc_dst(:) ! Destination longitude
- real(r8), pointer :: yc_dst(:) ! Destination latitude
- integer, pointer :: dst_indx(:) ! Destination index
- character(len= 32) :: subname = 'domain_checksame'
-
- ! tolerance for checking equality of lat & lon
- ! We allow for single-precision rounding-level differences (approx. 1.2e-7 relative
- ! error). For a value of 360 (max value for lat / lon), this means we can allow
- ! absolute errors of about 5e-5.
- real(r8), parameter :: eps = 5.e-5_r8
-
-
- if (srcdomain%set == unset) then
- write(6,*) trim(subname)//' ERROR: source domain is unset!'
- call abort()
- end if
- if (dstdomain%set == unset) then
- write(6,*) trim(subname)//' ERROR: destination domain is unset!'
- call abort()
- end if
-
- call gridmap_setptrs( tgridmap, nsrc=na, ndst=nb, ns=ns, &
- xc_src=xc_src, yc_src=yc_src, &
- xc_dst=xc_dst, yc_dst=yc_dst, &
- src_indx=src_indx, dst_indx=dst_indx &
- )
-
- if (srcdomain%ns /= na) then
- write(6,*) trim(subname)// &
- ' ERROR: input domain size and gridmap source size do not match'
- write(6,*)' domain size = ',srcdomain%ns
- write(6,*)' map src size= ',na
- call abort()
- end if
- if (dstdomain%ns /= nb) then
- write(6,*) trim(subname)// &
- ' ERROR: output domain size and gridmap destination size do not match'
- write(6,*)' domain size = ',dstdomain%ns
- write(6,*)' map dst size= ',nb
- call abort()
- end if
- do n = 1,ns
- ni = src_indx(n)
- if (abs(srcdomain%lonc(ni) - xc_src(ni)) > eps) then
- write(6,*) trim(subname)// &
- ' ERROR: input domain lon and gridmap lon not the same at ni = ',ni
- write(6,*)' domain lon= ',srcdomain%lonc(ni)
- write(6,*)' gridmap lon= ',xc_src(ni)
- call abort()
- end if
- if (abs(srcdomain%latc(ni) - yc_src(ni)) > eps) then
- write(6,*) trim(subname)// &
- ' ERROR: input domain lat and gridmap lat not the same at ni = ',ni
- write(6,*)' domain lat= ',srcdomain%latc(ni)
- write(6,*)' gridmap lat= ',yc_src(ni)
- call abort()
- end if
- end do
- do n = 1,ns
- ni = dst_indx(n)
- if (abs(dstdomain%lonc(ni) - xc_dst(ni)) > eps) then
- write(6,*) trim(subname)// &
- ' ERROR: output domain lon and gridmap lon not the same at ni = ',ni
- write(6,*)' domain lon= ',dstdomain%lonc(ni)
- write(6,*)' gridmap lon= ',xc_dst(ni)
- call abort()
- end if
- if (abs(dstdomain%latc(ni) - yc_dst(ni)) > eps) then
- write(6,*) trim(subname)// &
- ' ERROR: output domain lat and gridmap lat not the same at ni = ',ni
- write(6,*)' domain lat= ',dstdomain%latc(ni)
- write(6,*)' gridmap lat= ',yc_dst(ni)
- call abort()
- end if
- end do
- end subroutine domain_checksame
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: for_test_create_domain
-!
-! !INTERFACE:
- subroutine for_test_create_domain( domain )
-!
-! !DESCRIPTION:
-! Create a simple domain for unit testing
-!
-! USES:
- implicit none
-! !ARGUMENTS:
- type(domain_type), intent(inout) :: domain ! input domain
-! !LOCAL VARIABLES:
- integer, parameter :: ns_o = 2
-
- call domain_init( domain, ns_o )
- domain%latc = (/ 42.0_r8, 40.0_r8 /)
- domain%lonc = (/ -105.0_r8, -100.0_r8 /)
- domain%latn = (/ 43.0_r8, 41.0_r8 /)
- domain%lats = (/ 41.0_r8, 39.0_r8 /)
- domain%lone = (/ -104.0_r8, -99.0_r8 /)
- domain%lonw = (/ -106.0_r8, -101.0_r8 /)
- domain%mask = (/ 1, 1 /)
- domain%frac = (/ 1.0_r8, 1.0_r8 /)
- domain%area = (/ 49284.0_r8, 49284.0_r8 /) ! This is NOT the correct area!
-
- domain%edgen = maxval( domain%latn )
- domain%edges = minval( domain%lats )
- domain%edgew = minval( domain%lonw )
- domain%edgee = maxval( domain%lone )
-
- domain%maskset = .true.
- domain%fracset = .true.
- domain%is_2d = .false.
-
- end subroutine for_test_create_domain
-
-end module mkdomainMod
diff --git a/tools/mksurfdata_map/src/mkfileMod.F90 b/tools/mksurfdata_map/src/mkfileMod.F90
deleted file mode 100644
index 43bdda4c12..0000000000
--- a/tools/mksurfdata_map/src/mkfileMod.F90
+++ /dev/null
@@ -1,566 +0,0 @@
-module mkfileMod
-
-contains
-
-!-----------------------------------------------------------------------
- subroutine mkfile(domain, fname, harvdata, dynlanduse)
-
- use shr_kind_mod , only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_getenv
- use fileutils , only : get_filename
- use mkvarpar , only : nlevsoi, numrad, numstdpft
- use mkvarctl
- use mkurbanparMod, only : numurbl, nlevurb
- use mkglcmecMod , only : nglcec
- use mkpftMod , only : mkpftAtt
- use mksoilMod , only : mksoilAtt
- use mkharvestMod , only : mkharvest_fieldname, mkharvest_numtypes, mkharvest_longname
- use mkharvestMod , only : mkharvest_units, harvestDataType
- use mkncdio , only : check_ret, ncd_defvar, ncd_def_spatial_var
- use mkdomainMod
-
- implicit none
- include 'netcdf.inc'
- type(domain_type) , intent(in) :: domain
- character(len=*) , intent(in) :: fname
- logical , intent(in) :: dynlanduse
- type(harvestDataType), intent(in) :: harvdata
-
- integer :: ncid
- integer :: j ! index
- integer :: dimid ! temporary
- integer :: values(8) ! temporary
- character(len=256) :: str ! global attribute string
- character(len=256) :: name ! name of attribute
- character(len=256) :: unit ! units of attribute
- character(len= 18) :: datetime ! temporary
- character(len= 8) :: date ! temporary
- character(len= 10) :: time ! temporary
- character(len= 5) :: zone ! temporary
- integer :: ier ! error status
- integer :: omode ! netCDF output mode
- integer :: xtype ! external type
- integer, allocatable :: ind1D(:)! Indices of 1D harvest variables
- integer, allocatable :: ind2D(:)! Indices of 2D harvest variables
- character(len=32) :: subname = 'mkfile' ! subroutine name
-!-----------------------------------------------------------------------
-
- call check_ret(nf_create(trim(fname), ior(nf_clobber,nf_64bit_offset), &
- ncid), subname)
-
- call check_ret(nf_set_fill (ncid, nf_nofill, omode), subname)
-
- ! Define dimensions.
-
- if (outnc_1d) then
- call check_ret(nf_def_dim (ncid, 'gridcell', domain%ns, dimid), subname)
- else
- call check_ret(nf_def_dim (ncid, 'lsmlon' , domain%ni, dimid), subname)
- call check_ret(nf_def_dim (ncid, 'lsmlat' , domain%nj, dimid), subname)
- end if
-
- if (.not. dynlanduse) then
- call check_ret(nf_def_dim (ncid, 'nglcec' , nglcec , dimid), subname)
- call check_ret(nf_def_dim (ncid, 'nglcecp1', nglcec+1 , dimid), subname)
- end if
- call check_ret(nf_def_dim (ncid, 'numurbl' , numurbl , dimid), subname)
- call check_ret(nf_def_dim (ncid, 'nlevurb' , nlevurb , dimid), subname)
- call check_ret(nf_def_dim (ncid, 'numrad' , numrad , dimid), subname)
- call check_ret(nf_def_dim (ncid, 'nchar' , 256 , dimid), subname)
-
- ! Create global attributes.
-
- str = 'NCAR-CSM'
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'Conventions', len_trim(str), trim(str)), subname)
-
- call date_and_time (date, time, zone, values)
- datetime(1:8) = date(5:6) // '-' // date(7:8) // '-' // date(3:4)
- datetime(9:) = ' ' // time(1:2) // ':' // time(3:4) // ':' // time(5:6) // ' '
- str = 'created on: ' // datetime
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'History_Log', len_trim(str), trim(str)), subname)
-
- call shr_sys_getenv ('LOGNAME', str, ier)
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'Logname', len_trim(str), trim(str)), subname)
-
- call shr_sys_getenv ('HOST', str, ier)
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'Host', len_trim(str), trim(str)), subname)
-
- str = 'Community Land Model: CLM5'
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'Source', len_trim(str), trim(str)), subname)
-
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'Version', len_trim(gitdescribe), trim(gitdescribe)), subname)
-
-#ifdef OPT
- str = 'TRUE'
-#else
- str = 'FALSE'
-#endif
-
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'Compiler_Optimized', len_trim(str), trim(str)), subname)
-
- if ( all_urban )then
- str = 'TRUE'
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'all_urban', len_trim(str), trim(str)), subname)
- end if
-
- if ( no_inlandwet )then
- str = 'TRUE'
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'no_inlandwet', len_trim(str), trim(str)), subname)
- end if
-
- call check_ret(nf_put_att_int(ncid, NF_GLOBAL, &
- 'nglcec', nf_int, 1, nglcec), subname)
-
- ! Raw data file names
-
- str = get_filename(mksrf_fgrid)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Input_grid_dataset', len_trim(str), trim(str)), subname)
-
- str = trim(mksrf_gridtype)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Input_gridtype', len_trim(str), trim(str)), subname)
-
- if (.not. dynlanduse) then
- str = get_filename(mksrf_fvocef)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'VOC_EF_raw_data_file_name', len_trim(str), trim(str)), subname)
- end if
-
- str = get_filename(mksrf_flakwat)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Inland_lake_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_fwetlnd)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Inland_wetland_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_fglacier)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Glacier_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_fglacierregion)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Glacier_region_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_furbtopo)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Urban_Topography_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_furban)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Urban_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- if (.not. dynlanduse .and. (numpft == numstdpft) ) then
- str = get_filename(mksrf_flai)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Lai_raw_data_file_name', len_trim(str), trim(str)), subname)
- end if
-
- str = get_filename(mksrf_fabm)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'agfirepkmon_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_fgdp)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'gdp_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_fpeat)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'peatland_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_fsoildepth)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'soildepth_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(mksrf_ftopostats)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'topography_stats_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- if ( outnc_vic )then
- str = get_filename(mksrf_fvic)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'vic_raw_data_file_name', len_trim(str), trim(str)), subname)
- end if
-
- ! Mapping file names
-
- str = get_filename(map_fpft)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_pft_file_name', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_flakwat)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_lakwat_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fwetlnd)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_wetlnd_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fglacier)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_glacier_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fglacierregion)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_glacier_region_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fsoitex)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_soil_texture_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fsoicol)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_soil_color_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_forganic)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_soil_organic_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_furban)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_urban_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fmax)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_fmax_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fvocef)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_VOC_EF_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fharvest)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_harvest_file', len_trim(str), trim(str)), subname)
-
- if ( numpft == numstdpft )then
- str = get_filename(map_flai)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_lai_sai_file', len_trim(str), trim(str)), subname)
- end if
-
- str = get_filename(map_furbtopo)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_urban_topography_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fabm)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_agfirepkmon_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fgdp)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_gdp_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fpeat)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_peatland_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_fsoildepth)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_soildepth_file', len_trim(str), trim(str)), subname)
-
- str = get_filename(map_ftopostats)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_topography_stats_file', len_trim(str), trim(str)), subname)
-
- if ( outnc_vic )then
- str = get_filename(map_fvic)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'map_vic_file', len_trim(str), trim(str)), subname)
- end if
-
- ! ----------------------------------------------------------------------
- ! Define variables
- ! ----------------------------------------------------------------------
-
- if ( .not. outnc_double )then
- xtype = nf_float
- else
- xtype = nf_double
- end if
-
- call mksoilAtt( ncid, dynlanduse, xtype )
-
- call mkpftAtt( ncid, dynlanduse, xtype )
-
- call ncd_def_spatial_var(ncid=ncid, varname='AREA' , xtype=nf_double, &
- long_name='area', units='km^2')
-
- call ncd_def_spatial_var(ncid=ncid, varname='LONGXY', xtype=nf_double, &
- long_name='longitude', units='degrees east')
-
- call ncd_def_spatial_var(ncid=ncid, varname='LATIXY', xtype=nf_double, &
- long_name='latitude', units='degrees north')
-
- if (.not. dynlanduse) then
- call ncd_def_spatial_var(ncid=ncid, varname='EF1_BTR', xtype=xtype, &
- long_name='EF btr (isoprene)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EF1_FET', xtype=xtype, &
- long_name='EF fet (isoprene)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EF1_FDT', xtype=xtype, &
- long_name='EF fdt (isoprene)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EF1_SHR', xtype=xtype, &
- long_name='EF shr (isoprene)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EF1_GRS', xtype=xtype, &
- long_name='EF grs (isoprene)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EF1_CRP', xtype=xtype, &
- long_name='EF crp (isoprene)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='CANYON_HWR', xtype=xtype, &
- lev1name='numurbl', &
- long_name='canyon height to width ratio', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EM_IMPROAD', xtype=xtype, &
- lev1name='numurbl', &
- long_name='emissivity of impervious road', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EM_PERROAD', xtype=xtype, &
- lev1name='numurbl', &
- long_name='emissivity of pervious road', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EM_ROOF', xtype=xtype, &
- lev1name='numurbl', &
- long_name='emissivity of roof', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='EM_WALL', xtype=xtype, &
- lev1name='numurbl', &
- long_name='emissivity of wall', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='HT_ROOF', xtype=xtype, &
- lev1name='numurbl', &
- long_name='height of roof', units='meters')
-
- call ncd_def_spatial_var(ncid=ncid, varname='THICK_ROOF', xtype=xtype, &
- lev1name='numurbl', &
- long_name='thickness of roof', units='meters')
-
- call ncd_def_spatial_var(ncid=ncid, varname='THICK_WALL', xtype=xtype, &
- lev1name='numurbl', &
- long_name='thickness of wall', units='meters')
-
- call ncd_def_spatial_var(ncid=ncid, varname='T_BUILDING_MIN', xtype=xtype, &
- lev1name='numurbl', &
- long_name='minimum interior building temperature', units='K')
-
- call ncd_def_spatial_var(ncid=ncid, varname='WIND_HGT_CANYON', xtype=xtype, &
- lev1name='numurbl', &
- long_name='height of wind in canyon', units='meters')
-
- call ncd_def_spatial_var(ncid=ncid, varname='WTLUNIT_ROOF', xtype=xtype, &
- lev1name='numurbl', &
- long_name='fraction of roof', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='WTROAD_PERV', xtype=xtype, &
- lev1name='numurbl', &
- long_name='fraction of pervious road', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_IMPROAD_DIR', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='direct albedo of impervious road', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_IMPROAD_DIF', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='diffuse albedo of impervious road', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_PERROAD_DIR', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='direct albedo of pervious road', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_PERROAD_DIF', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='diffuse albedo of pervious road', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_ROOF_DIR', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='direct albedo of roof', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_ROOF_DIF', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='diffuse albedo of roof', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_WALL_DIR', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='direct albedo of wall', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ALB_WALL_DIF', xtype=xtype, &
- lev1name='numurbl', lev2name='numrad', &
- long_name='diffuse albedo of wall', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='TK_ROOF', xtype=xtype, &
- lev1name='numurbl', lev2name='nlevurb', &
- long_name='thermal conductivity of roof', units='W/m*K')
-
- call ncd_def_spatial_var(ncid=ncid, varname='TK_WALL', xtype=xtype, &
- lev1name='numurbl', lev2name='nlevurb', &
- long_name='thermal conductivity of wall', units='W/m*K')
-
- call ncd_def_spatial_var(ncid=ncid, varname='TK_IMPROAD', xtype=xtype, &
- lev1name='numurbl', lev2name='nlevurb', &
- long_name='thermal conductivity of impervious road', units='W/m*K')
-
- call ncd_def_spatial_var(ncid=ncid, varname='CV_ROOF', xtype=xtype, &
- lev1name='numurbl', lev2name='nlevurb', &
- long_name='volumetric heat capacity of roof', units='J/m^3*K')
-
- call ncd_def_spatial_var(ncid=ncid, varname='CV_WALL', xtype=xtype, &
- lev1name='numurbl', lev2name='nlevurb', &
- long_name='volumetric heat capacity of wall', units='J/m^3*K')
-
- call ncd_def_spatial_var(ncid=ncid, varname='CV_IMPROAD', xtype=xtype, &
- lev1name='numurbl', lev2name='nlevurb', &
- long_name='volumetric heat capacity of impervious road', units='J/m^3*K')
-
- call ncd_def_spatial_var(ncid=ncid, varname='NLEV_IMPROAD', xtype=nf_int, &
- lev1name='numurbl', &
- long_name='number of impervious road layers', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='peatf', xtype=xtype, &
- long_name='peatland fraction', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='zbedrock', xtype=xtype, &
- long_name='soil depth', units='m')
-
- call ncd_def_spatial_var(ncid=ncid, varname='abm', xtype=nf_int, &
- long_name='agricultural fire peak month', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='gdp', xtype=xtype, &
- long_name='gdp', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='SLOPE', xtype=xtype, &
- long_name='mean topographic slope', units='degrees')
-
- call ncd_def_spatial_var(ncid=ncid, varname='STD_ELEV', xtype=xtype, &
- long_name='standard deviation of elevation', units='m')
-
- if ( outnc_vic )then
- call ncd_def_spatial_var(ncid=ncid, varname='binfl', xtype=xtype, &
- long_name='VIC b parameter for the Variable Infiltration Capacity Curve', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='Ws', xtype=xtype, &
- long_name='VIC Ws parameter for the ARNO curve', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='Dsmax', xtype=xtype, &
- long_name='VIC Dsmax parameter for the ARNO curve', units='mm/day')
-
- call ncd_def_spatial_var(ncid=ncid, varname='Ds', xtype=xtype, &
- long_name='VIC Ds parameter for the ARNO curve', units='unitless')
-
- end if
- call ncd_def_spatial_var(ncid=ncid, varname='LAKEDEPTH', xtype=xtype, &
- long_name='lake depth', units='m')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_WETLAND', xtype=xtype, &
- long_name='percent wetland', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_LAKE', xtype=xtype, &
- long_name='percent lake', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_GLACIER', xtype=xtype, &
- long_name='percent glacier', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='GLACIER_REGION', xtype=nf_int, &
- long_name='glacier region ID', units='unitless')
-
- call ncd_defvar(ncid=ncid, varname='GLC_MEC', xtype=xtype, &
- dim1name='nglcecp1', long_name='Glacier elevation class', units='m')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_GLC_MEC', xtype=xtype, &
- lev1name='nglcec', &
- long_name='percent glacier for each glacier elevation class (% of landunit)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='TOPO_GLC_MEC', xtype=xtype, &
- lev1name='nglcec', &
- long_name='mean elevation on glacier elevation classes', units='m')
-
- if ( outnc_3dglc ) then
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_GLC_MEC_GIC', xtype=xtype, &
- lev1name='nglcec', &
- long_name='percent smaller glaciers and ice caps for each glacier elevation class (% of landunit)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_GLC_MEC_ICESHEET', xtype=xtype, &
- lev1name='nglcec', &
- long_name='percent ice sheet for each glacier elevation class (% of landunit)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_GLC_GIC', xtype=xtype, &
- long_name='percent ice caps/glaciers (% of landunit)', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_GLC_ICESHEET', xtype=xtype, &
- long_name='percent ice sheet (% of landunit)', units='unitless')
-
- end if
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_URBAN', xtype=xtype, &
- lev1name='numurbl', &
- long_name='percent urban for each density type', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='URBAN_REGION_ID', xtype=nf_int, &
- long_name='urban region ID', units='unitless')
-
- call harvdata%getFieldsIdx( ind1D, ind2D )
- do j = 1, harvdata%num1Dfields()
- call ncd_def_spatial_var(ncid=ncid, varname=mkharvest_fieldname(ind1D(j),constant=.true.), xtype=xtype, &
- long_name=mkharvest_longname(ind1D(j)), units=mkharvest_units(ind1D(j)) )
- end do
- do j = 1, harvdata%num2Dfields()
- call ncd_def_spatial_var(ncid=ncid, varname=mkharvest_fieldname(ind2D(j),constant=.true.), xtype=xtype, &
- lev1name=harvdata%getFieldsDim(ind2D(j)), &
- long_name=mkharvest_longname(ind2D(j)), units=mkharvest_units(ind2D(j)) )
- end do
- deallocate(ind1D, ind2D)
-
- else
-
- call harvdata%getFieldsIdx( ind1D, ind2D )
- do j = 1, harvdata%num1Dfields()
- call ncd_def_spatial_var(ncid=ncid, varname=mkharvest_fieldname(ind1D(j),constant=.false.), xtype=xtype, &
- lev1name='time', &
- long_name=mkharvest_longname(ind1D(j)), units=mkharvest_units(ind1D(j)) )
- end do
- do j = 1, harvdata%num2Dfields()
- call ncd_def_spatial_var(ncid=ncid, varname=mkharvest_fieldname(ind2D(j),constant=.false.), xtype=xtype, &
- lev1name=harvdata%getFieldsDim(ind2D(j)), lev2name="time", &
- long_name=mkharvest_longname(ind2D(j)), units=mkharvest_units(ind2D(j)) )
- end do
- deallocate(ind1D, ind2D)
-
- end if ! .not. dynlanduse
-
- ! End of define mode
-
- call check_ret(nf_enddef(ncid), subname)
- call check_ret(nf_close(ncid), subname)
-
- end subroutine mkfile
-
-end module mkfileMod
diff --git a/tools/mksurfdata_map/src/mkgdpMod.F90 b/tools/mksurfdata_map/src/mkgdpMod.F90
deleted file mode 100644
index 6a560e61b5..0000000000
--- a/tools/mksurfdata_map/src/mkgdpMod.F90
+++ /dev/null
@@ -1,147 +0,0 @@
-module mkgdpMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkgdpMod
-!
-! !DESCRIPTION:
-! make GDP from input GDP data
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!-----------------------------------------------------------------------
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public mkgdp ! regrid gdp data
-!
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkgdp
-!
-! !INTERFACE:
-subroutine mkgdp(ldomain, mapfname, datfname, ndiag, gdp_o)
-!
-! !DESCRIPTION:
-! make GDP from input GDP data
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_continuous
- use mkchecksMod, only : min_bad
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: gdp_o(:) ! output grid: GDP (x1000 1995 US$ per capita)
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: data_i(:) ! data on input grid
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
-
- real(r8), parameter :: min_valid = 0._r8 ! minimum valid value
-
- character(len=32) :: subname = 'mkgdp'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make GDP.....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! -----------------------------------------------------------------
-
- call domain_read(tdomain,datfname)
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Obtain frac_dst
- allocate(frac_dst(ldomain%ns), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- allocate(mask_r8(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! -----------------------------------------------------------------
-
- write(6,*)'Open GDP file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(data_i(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
-
- ! -----------------------------------------------------------------
- ! Regrid gdp
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'gdp', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, gdp_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(gdp_o, min_valid, 'gdp')) then
- stop
- end if
-
- call output_diagnostics_continuous(data_i, gdp_o, tgridmap, "GDP", "x1000 US$ per capita", ndiag, tdomain%mask, frac_dst)
-
- ! -----------------------------------------------------------------
- ! Close files and deallocate dynamic memory
- ! -----------------------------------------------------------------
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (data_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
-
- write (6,*) 'Successfully made GDP'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkgdp
-
-end module mkgdpMod
diff --git a/tools/mksurfdata_map/src/mkglacierregionMod.F90 b/tools/mksurfdata_map/src/mkglacierregionMod.F90
deleted file mode 100644
index beae6a8d97..0000000000
--- a/tools/mksurfdata_map/src/mkglacierregionMod.F90
+++ /dev/null
@@ -1,139 +0,0 @@
-module mkglacierregionMod
-
- !-----------------------------------------------------------------------
- !BOP
- !
- ! !MODULE: mkglacierregionMod
- !
- ! !DESCRIPTION:
- ! make glacier region ID
- !
- ! !REVISION HISTORY:
- ! Author: Bill Sacks
- !
- !-----------------------------------------------------------------------
- !
- ! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- implicit none
-
- private
-
- ! !PUBLIC MEMBER FUNCTIONS:
- public mkglacierregion ! make glacier region ID
- !
- !EOP
-
-contains
-
- !-----------------------------------------------------------------------
- subroutine mkglacierregion(ldomain, mapfname, datfname, ndiag, &
- glacier_region_o)
- !
- ! !DESCRIPTION:
- ! Make glacier region ID
- !
- ! Regridding is done by finding the max index that overlaps each destination cell,
- ! without regard to the weight of overlap or dominance of each overlapping index.
- !
- ! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read, domain_checksame
- use mkgridmapMod
- use mkncdio
- use mkindexmapMod, only : get_max_indices
- use mkdiagnosticsMod, only : output_diagnostics_index
- use mkchecksMod, only : min_bad
- !
- ! !ARGUMENTS:
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- integer , intent(out) :: glacier_region_o(:) ! glacier region
- !
- ! !LOCAL VARIABLES:
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- integer, allocatable :: glacier_region_i(:) ! glacier region on input grid
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
- integer :: max_region ! max region ID
-
- character(len=*), parameter :: subname = 'mkglacierregion'
- !-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make glacier region .....'
- call shr_sys_flush(6)
-
- ! ------------------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! ------------------------------------------------------------------------
-
- call domain_read(tdomain, datfname)
-
- call gridmap_mapread(tgridmap, mapfname)
-
- ! Obtain frac_dst
- allocate(frac_dst(ldomain%ns), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- allocate(mask_r8(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check(tgridmap, mask_r8, frac_dst, subname)
-
- call domain_checksame(tdomain, ldomain, tgridmap)
-
- ! ------------------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! ------------------------------------------------------------------------
-
- write (6,*) 'Open glacier region raw data file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(glacier_region_i(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
-
- ! ------------------------------------------------------------------------
- ! Regrid glacier_region
- ! ------------------------------------------------------------------------
-
- call check_ret(nf_inq_varid(ncid, 'GLACIER_REGION', varid), subname)
- call check_ret(nf_get_var_int(ncid, varid, glacier_region_i), subname)
- if (min_bad(glacier_region_i, 0, 'GLACIER_REGION')) then
- stop
- end if
-
- call get_max_indices( &
- gridmap = tgridmap, &
- src_array = glacier_region_i, &
- dst_array = glacier_region_o, &
- nodata = 0, &
- mask_src = tdomain%mask)
-
- max_region = maxval(glacier_region_i)
- call output_diagnostics_index(glacier_region_i, glacier_region_o, tgridmap, &
- 'Glacier Region ID', 0, max_region, ndiag, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! ------------------------------------------------------------------------
- ! Deallocate dynamic memory & other clean up
- ! ------------------------------------------------------------------------
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate(glacier_region_i)
- deallocate(frac_dst)
- deallocate(mask_r8)
-
- write (6,*) 'Successfully made glacier region'
- write (6,*)
- call shr_sys_flush(6)
-
- end subroutine mkglacierregion
-
-end module mkglacierregionMod
diff --git a/tools/mksurfdata_map/src/mkglcmecMod.F90 b/tools/mksurfdata_map/src/mkglcmecMod.F90
deleted file mode 100644
index 2ac4d94e4f..0000000000
--- a/tools/mksurfdata_map/src/mkglcmecMod.F90
+++ /dev/null
@@ -1,794 +0,0 @@
-module mkglcmecMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkglcmecMod
-!
-! !DESCRIPTION:
-! Make glacier multi-elevation class data
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek, Mariana Vertenstein
-!
-!-----------------------------------------------------------------------
-!!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
- implicit none
-
- private ! By default make data private
-!
-! !PUBLIC MEMBER FUNCTIONS:
-!
- public mkglcmecInit ! Initialization
- public mkglcmec ! Set glacier multi-elevation class
- public mkglacier ! Set percent glacier
-!
-! !PUBLIC DATA MEMBERS:
-!
- integer, public :: nglcec = 10 ! number of elevation classes for glaciers
- real(r8), pointer :: elevclass(:) ! elevation classes
-!
-! !PRIVATE MEMBER FUNCTIONS:
- private get_elevclass ! get elevation class index
- private mean_elevation_vc ! get the elevation of a virtual column
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkglcmecInit
-!
-! !INTERFACE:
-subroutine mkglcmecInit( elevclass_o )
-!
-! !DESCRIPTION:
-! Initialize the glacier multi-elevation class data
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- real(r8), intent(OUT) :: elevclass_o(:) ! elevation classes
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- character(len=32) :: subname = 'mkglcmecInit:: '
-!-----------------------------------------------------------------------
- allocate( elevclass(nglcec+1) )
-
- ! -----------------------------------------------------------------
- ! Define elevation classes; each value represents the lower boundary of its class
- ! -----------------------------------------------------------------
-
- if ( nglcec == 36 )then
- elevclass(:) = (/ 0., 200., 400., 600., 800., &
- 1000., 1200., 1400., 1600., 1800., &
- 2000., 2200., 2400., 2600., 2800., &
- 3000., 3200., 3400., 3600., 3800., &
- 4000., 4200., 4400., 4600., 4800., &
- 5000., 5200., 5400., 5600., 5800., &
- 6000., 6200., 6400., 6600., 6800., &
- 7000., 10000./)
- else if ( nglcec == 10 )then
- elevclass(1) = 0.
- elevclass(2) = 200.
- elevclass(3) = 400.
- elevclass(4) = 700.
- elevclass(5) = 1000.
- elevclass(6) = 1300.
- elevclass(7) = 1600.
- elevclass(8) = 2000.
- elevclass(9) = 2500.
- elevclass(10) = 3000.
- elevclass(11) = 10000.
- else if ( nglcec == 5 )then
- elevclass(1) = 0.
- elevclass(2) = 500.
- elevclass(3) = 1000.
- elevclass(4) = 1500.
- elevclass(5) = 2000.
- elevclass(6) = 10000.
- else if ( nglcec == 3 )then
- elevclass(1) = 0.
- elevclass(2) = 1000.
- elevclass(3) = 2000.
- elevclass(4) = 10000.
- else if ( nglcec == 1 )then
- elevclass(1) = 0.
- elevclass(2) = 10000.
- else
- write(6,*) subname//"ERROR:: nglcec must be 1, 3, 5, 10 or 36",&
- " to work with CLM: "
- call abort()
- end if
-
- elevclass_o(:) = elevclass(:)
-
-end subroutine mkglcmecInit
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkglcmec
-!
-! !INTERFACE:
-subroutine mkglcmec(ldomain, mapfname, &
- datfname_fglacier, ndiag, &
- pctglcmec_o, topoglcmec_o, &
- pctglcmec_gic_o, pctglcmec_icesheet_o, &
- pctglc_gic_o, pctglc_icesheet_o)
-!
-! !DESCRIPTION:
-! make percent glacier on multiple elevation classes, mean elevation for each
-! elevation class, and associated fields
-!
-! Note that the raw glacier data are specified by level, and thus implicitly include the
-! necessary topo data for breaking pct glacier into elevation classes. Each level in the
-! input data is assigned to an elevation (given by BIN_CENTERS in the input data). Thus,
-! all of the input glacier in level 1 is treated as being at the same elevation, and
-! likewise for each other level. These elevations are then used in assigning pct_glacier
-! to the appropriate elevation class in the output data, as well as determining the mean
-! topographic height of each elevation class in the output data.
-!
-! Note that the various percentages computed here are given as % of the glc_mec landunit.
-! If the input glacier area is 0 for a given grid cell, this requires setting these %
-! variables in an arbitrary way.
-!
-! !USES:
- use shr_sys_mod, only : shr_sys_abort
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkutilsMod, only : slightly_below, slightly_above
- use mkncdio
- use mkvarctl , only : outnc_3dglc
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname_fglacier ! raw glacier data
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: pctglcmec_o (:,:) ! % for each elevation class on output glacier grid (% of landunit)
- real(r8) , intent(out):: topoglcmec_o(:,:) ! mean elevation for each elevation class on output glacier grid
- real(r8), optional, intent(out):: pctglcmec_gic_o(:,:) ! % glc gic on output grid, by elevation class (% of landunit)
- real(r8), optional, intent(out):: pctglcmec_icesheet_o(:,:) ! % glc ice sheet on output grid, by elevation class (% of landunit)
- real(r8), optional, intent(out):: pctglc_gic_o(:) ! % glc gic on output grid, summed across elevation classes (% of landunit)
- real(r8), optional, intent(out):: pctglc_icesheet_o(:) ! % glc ice sheet on output grid, summed across elevation classes (% of landunit)
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: David Lawrence
-! 7/12/11: Bill Sacks: substantial rewrite to use input topo and % glacier at same resolution
-! 9/25/12: Bill Sacks: substantial rewrite to use new format of fglacier, which provides
-! percent by elevation bin (thus the separate topo dataset is no longer needed
-! in this routine)
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(domain_type) :: tdomain ! local domain
- type(gridmap_type) :: tgridmap ! local gridmap
- real(r8), allocatable :: pctglc_gic_i(:) ! input GIC percentage for a single level
- real(r8), allocatable :: pctglc_icesheet_i(:) ! input icesheet percentage for a single level
- real(r8), allocatable :: topoglcmec_unnorm_o(:,:) ! same as topoglcmec_o, but unnormalized
- real(r8), allocatable :: pctglc_tot_o(:) ! total glacier cover for the grid cell
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8) :: topoice_i ! topographic height of this level
- real(r8) :: pctglc_i ! input total pct glacier for a single level & single point
- real(r8) :: wt, frac ! weighting factors for remapping
- integer :: ndims ! number of dimensions in input variables
- integer :: dim_lengths(nf_max_var_dims) ! lengths of dimensions in input variables
- integer, allocatable :: starts(:), counts(:) ! start indices & counts for reading variable slices
- integer :: ni,no,ns_o,nst,lev ! indices
- integer :: n,m ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: nlev ! number of levels in input file
- real(r8) :: glc_sum ! temporary
- integer :: ier ! error status
- logical :: errors ! error status
-
- real(r8), parameter :: eps = 2.e-5_r8 ! epsilon for error checks (note that we use a large-ish value
- ! because data are stored as single-precision floats in the
- ! raw dataset)
- real(r8), parameter :: eps_small = 1.e-12_r8 ! epsilon for error checks that expect close match
- character(len=32) :: subname = 'mkglcmec'
-!-----------------------------------------------------------------------
-
- ! Initialize all output fields to zero
-
- pctglcmec_o(:,:) = 0.
- topoglcmec_o(:,:) = 0.
- if ( outnc_3dglc )then
- if ( (.not. present(pctglcmec_gic_o)) .or. (.not. present(pctglcmec_icesheet_o)) .or. &
- (.not. present(pctglc_gic_o) ) .or. (.not. present(pctglc_icesheet_o) ) )then
- call shr_sys_abort( subname//' ERROR: 3D glacier fields were NOT sent in and they are required' )
- end if
- pctglcmec_gic_o(:,:) = 0.
- pctglcmec_icesheet_o(:,:) = 0.
- pctglc_gic_o(:) = 0.
- pctglc_icesheet_o(:) = 0.
- end if
-
- ! Set number of output points
-
- ns_o = ldomain%ns
-
- write (6,*) 'Attempting to make percent elevation class ',&
- 'and mean elevation for glaciers .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and dimension information from glacier raw data file
- ! -----------------------------------------------------------------
-
- call domain_read(tdomain,datfname_fglacier)
- nst = tdomain%ns
-
- ! Read z dimension size
- write (6,*) 'Open glacier file: ', trim(datfname_fglacier)
- call check_ret(nf_open(datfname_fglacier, 0, ncid), subname)
- ier = nf_inq_dimid (ncid, 'z', dimid)
- if (ier /= NF_NOERR) then
- write (6,*) trim(subname), ' ERROR: z dimension not found on glacier file:'
- write (6,*) trim(datfname_fglacier)
- write (6,*) 'Perhaps you are trying to use an old-format glacier file?'
- write (6,*) '(prior to Sept., 2012)'
- call abort()
- end if
- call check_ret(nf_inq_dimlen (ncid, dimid, nlev), subname)
-
- ! -----------------------------------------------------------------
- ! Read mapping data, check for consistency with domains
- ! -----------------------------------------------------------------
-
- ! Mapping for raw glacier -> model output grid
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Error checks for domain and map consistencies
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Determine dimension lengths and create start & count arrays
- ! for later reading one level at a time
- ! -----------------------------------------------------------------
-
- call get_dim_lengths(ncid, 'PCT_GLC_GIC', ndims, dim_lengths)
-
- allocate(starts(ndims), counts(ndims), stat=ier)
- if (ier/=0) call abort()
-
- starts(1:ndims) = 1
-
- ! We assume that the last dimension is the level dimension
- counts(1:ndims-1) = dim_lengths(1:ndims-1)
- counts(ndims) = 1
-
- ! --------------------------------------------------------------------
- ! Compute fields on the output grid
- ! --------------------------------------------------------------------
-
- allocate(pctglc_gic_i(nst), pctglc_icesheet_i(nst), stat=ier)
- if (ier/=0) call abort()
-
- allocate(topoglcmec_unnorm_o(ns_o,nglcec), stat=ier)
- if (ier/=0) call abort()
-
- allocate(frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
-
- topoglcmec_unnorm_o(:,:) = 0.
-
- write(6,'(a,i4,a)',advance='no') 'Level (out of ', nlev, '): '
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- do lev = 1, nlev
- write(6,'(i4)',advance='no') lev
- flush(6)
-
- ! Read this level's data
- ! We assume that the last dimension is the level dimension
- starts(ndims) = lev
- call check_ret(nf_inq_varid (ncid, 'BIN_CENTERS', varid), subname)
- call check_ret(nf_get_vara_double (ncid, varid, (/lev/), (/1/), topoice_i), subname)
- call check_ret(nf_inq_varid (ncid, 'PCT_GLC_GIC', varid), subname)
- call check_ret(nf_get_vara_double (ncid, varid, starts, counts, pctglc_gic_i), subname)
- call check_ret(nf_inq_varid (ncid, 'PCT_GLC_ICESHEET', varid), subname)
- call check_ret(nf_get_vara_double (ncid, varid, starts, counts, pctglc_icesheet_i), subname)
-
- ! Determine elevation class
- m = get_elevclass(topoice_i)
- if (m < 1 .or. m > nglcec) then
- call abort()
- end if
-
- do n = 1,tgridmap%ns
- ni = tgridmap%src_indx(n)
- no = tgridmap%dst_indx(n)
- wt = tgridmap%wovr(n) * tdomain%mask(ni)
-
- ! fraction of this destination cell that is covered by source cells that are within the source landmask
- frac = frac_dst(no)
-
- ! If frac == 0, then we can't do this, to avoid divide by 0. In this case, the
- ! outputs remain equal to 0 (their initialized value).
- if (frac > 0) then
- pctglc_i = pctglc_gic_i(ni) + pctglc_icesheet_i(ni)
- pctglcmec_o(no,m) = pctglcmec_o(no,m) + wt*pctglc_i / frac
- if ( outnc_3dglc )then
- pctglcmec_gic_o(no,m) = pctglcmec_gic_o(no,m) + wt*pctglc_gic_i(ni) / frac
- pctglcmec_icesheet_o(no,m) = pctglcmec_icesheet_o(no,m) + wt*pctglc_icesheet_i(ni) / frac
- end if
-
- ! note that, by weighting the following by pctglc_i, we are getting something
- ! like the average topographic height over glaciated areas - NOT the average
- ! topographic height of the entire grid cell
- topoglcmec_unnorm_o(no,m) = topoglcmec_unnorm_o(no,m) + wt*pctglc_i*topoice_i / frac
- end if
- end do
- end do
-
- ! Note: at this point, the various percentages are given as % of grid cell; below, we
- ! renormalize these to be given as % of landunit.
-
- ! advance to next line (needed because of 'advance=no' writes above)
- write(6,*) ' '
-
- ! Close glacier input file
- call check_ret(nf_close(ncid), subname)
-
- ! Normalize topoglcmec_o. To do this, note that pctglcmec_o(n,m) is equal to the sum of
- ! the weights used in doing the weighted average of topoice_i (weight =
- ! wt*pctglc_i/frac); hence pctglcmec_o(n,m) is the correct normalization factor
- do no = 1,ns_o
- do m = 1,nglcec
- if (pctglcmec_o(no,m) > 0) then
- topoglcmec_o(no,m) = topoglcmec_unnorm_o(no,m) / pctglcmec_o(no,m)
- else
- topoglcmec_o(no,m) = mean_elevation_vc(m)
- end if
-
- ! Correct for rounding errors that put topoglcmec_o(no,m) slightly outside the
- ! allowed bounds for this elevation class
- if (slightly_below(topoglcmec_o(no,m), elevclass(m))) then
- write(6,*) 'Warning: topoglcmec_o was slightly lower than lower bound; setting equal&
- & to lower bound; for: ', no, m, topoglcmec_o(no,m), elevclass(m)
- write(6,*) '(this is informational only, and probably just indicates rounding error)'
- topoglcmec_o(no,m) = elevclass(m)
- else if (slightly_above(topoglcmec_o(no,m), elevclass(m+1))) then
- write(6,*) 'Warning: topoglcmec_o was slightly higher than upper bound; setting equal&
- & to upper bound; for: ', no, m, topoglcmec_o(no,m), elevclass(m+1)
- write(6,*) '(this is informational only, and probably just indicates rounding error)'
- topoglcmec_o(no,m) = elevclass(m+1)
- end if
- end do
- end do
-
- ! Renormalize percentages to be given as % of landunit rather than % of grid cell.
-
- allocate(pctglc_tot_o(ns_o), stat=ier)
- if (ier/=0) call abort()
-
- do no = 1,ns_o
- pctglc_tot_o(no) = sum(pctglcmec_o(no,:))
-
- if (pctglc_tot_o(no) > 0._r8) then
- pctglcmec_o(no,:) = pctglcmec_o(no,:) / pctglc_tot_o(no) * 100._r8
- if ( outnc_3dglc )then
- pctglcmec_gic_o(no,:) = pctglcmec_gic_o(no,:) / pctglc_tot_o(no) * 100._r8
- pctglcmec_icesheet_o(no,:) = pctglcmec_icesheet_o(no,:) / pctglc_tot_o(no) * 100._r8
- end if
-
- else
- ! Division of the landunit is ambiguous. Apply the rule that all area is assigned to
- ! the lowest elevation class, and all of it to GIC.
- pctglcmec_o(no,1) = 100._r8
- if ( outnc_3dglc )then
- pctglcmec_gic_o(no,1) = 100._r8
- end if
- end if
- end do
-
- ! Set pctglc_gic_o to sum of pctglcmec_gic_o across elevation classes, and similarly for pctglc_icesheet_o
- if ( outnc_3dglc )then
- pctglc_gic_o = sum(pctglcmec_gic_o, dim=2)
- pctglc_icesheet_o = sum(pctglcmec_icesheet_o, dim=2)
- end if
-
- ! --------------------------------------------------------------------
- ! Perform various sanity checks
- ! --------------------------------------------------------------------
-
- errors = .false.
-
- ! Confirm that the sum over pctglcmec_o (from 1 to nglcec) is 100%
- do no = 1,ns_o
- glc_sum = sum(pctglcmec_o(no,:))
- if (abs(glc_sum - 100._r8) > eps_small) then
- write(6,*)'glc_sum differs from 100% at no,pctglc= ',no,glc_sum
- errors = .true.
- end if
- end do
-
- ! Confirm that GIC + ICESHEET = 100%
- if ( outnc_3dglc )then
- do no = 1,ns_o
- if (abs((pctglc_gic_o(no) + pctglc_icesheet_o(no)) - 100._r8) > eps) then
- write(6,*)'GIC + ICESHEET differs from 100% at no,pctglc_gic,pctglc_icesheet,lon,lat=', &
- no,pctglc_gic_o(no),pctglc_icesheet_o(no),&
- tgridmap%xc_dst(no),tgridmap%yc_dst(no)
- errors = .true.
- end if
- end do
-
- ! Check that GIC + ICESHEET = total glacier at each elevation class
- do m = 1, nglcec
- do no = 1,ns_o
- if (abs((pctglcmec_gic_o(no,m) + pctglcmec_icesheet_o(no,m)) - &
- pctglcmec_o(no,m)) > eps) then
- write(6,*)'GIC + ICESHEET differs from total GLC '
- write(6,*)'at no,m,pctglcmec,pctglcmec_gic,pctglcmec_icesheet = '
- write(6,*) no,m,pctglcmec_o(no,m),pctglcmec_gic_o(no,m),pctglcmec_icesheet_o(no,m)
- errors = .true.
- end if
- end do
- end do
- end if
-
-
- ! Error check: are all elevations within elevation class range
- do no = 1,ns_o
- do m = 1,nglcec
- if (topoglcmec_o(no,m) < elevclass(m) .or. topoglcmec_o(no,m) > elevclass(m+1)) then
- write(6,*) 'Error: mean elevation does not fall within elevation class '
- write(6,*) elevclass(m),elevclass(m+1),topoglcmec_o(no,m),m,no
- errors = .true.
- endif
- end do
- end do
-
- if (errors) then
- call abort()
- end if
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate(pctglc_gic_i, pctglc_icesheet_i)
- deallocate(topoglcmec_unnorm_o)
- deallocate(pctglc_tot_o)
- deallocate(frac_dst)
- deallocate(starts, counts)
-
- write (6,*) 'Successfully made percent elevation class and mean elevation for glaciers'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkglcmec
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkglacier
-!
-! !INTERFACE:
-subroutine mkglacier(ldomain, mapfname, datfname, ndiag, zero_out, glac_o)
-!
-! !DESCRIPTION:
-! make percent glacier
-!
-! In contrast to mkglcmec, this uses a "flat" PCT_GLACIER field (not separated by
-! elevation class, and not separated into icesheet vs GIC).
-!
-! This simpler routine is sufficient for cases when we run without multiple elevation
-! classes. This routine is also used when running with multiple elevation classes: we
-! first regrid the flat PCT_GLACIER field, then later create the multiple elevation class
-! data. This multi-step process makes it easier to do corrections on the total
-! PCT_GLACIER, and make sure these corrections apply appropriately to the multi-level
-! output. The assumption is that PCT_GLACIER is the sum of both PCT_GLC_GIC and
-! PCT_GLC_ICESHEET across all elevation bins.
-!
-! !USES:
- use mkdomainMod , only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- logical , intent(in) :: zero_out ! if should zero glacier out
- real(r8) , intent(out):: glac_o(:) ! output grid: %glacier
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: glac_i(:) ! input grid: percent glac
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- real(r8) :: gglac_i ! input grid: global glac
- real(r8) :: garea_i ! input grid: global area
- real(r8) :: gglac_o ! output grid: global glac
- real(r8) :: garea_o ! output grid: global area
- integer :: ni,no,k,n,m,ns, ns_o ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=32) :: subname = 'mkglacier'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make %glacier .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- call domain_read(tdomain,datfname)
- ns = tdomain%ns
- ns_o = ldomain%ns
- allocate(glac_i(ns), &
- frac_dst(ns_o), &
- stat=ier)
- if (ier/=0) call abort()
-
- write (6,*) 'Open glacier file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_varid (ncid, 'PCT_GLACIER', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, glac_i), subname)
- call check_ret(nf_close(ncid), subname)
-
- ! Area-average percent cover on input grid to output grid
- ! and correct according to the land mask
- ! Note that percent cover is in terms of total grid area.
-
- if ( zero_out )then
-
- do no = 1, ns_o
- glac_o(no) = 0.
- enddo
-
- else
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Error checks for domain and map consistencies
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Determine glac_o on output grid
-
- call gridmap_areaave_srcmask(tgridmap, glac_i, glac_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- do no = 1, ns_o
- if (glac_o(no) < 1.) glac_o(no) = 0.
- enddo
- end if
-
- ! Check for conservation
-
- do no = 1, ns_o
- if ((glac_o(no)) > 100.000001_r8) then
- write (6,*) 'MKGLACIER error: glacier = ',glac_o(no), &
- ' greater than 100.000001 for column, row = ',no
- call shr_sys_flush(6)
- stop
- end if
- enddo
-
- ! Some error checking and writing of global values before and after the regrid
-
- if ( .not. zero_out )then
-
- ! Global sum of output field -- must multiply by fraction of
- ! output grid that is land as determined by input grid
-
- allocate(mask_r8(ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- ! -----------------------------------------------------------------
- ! Error check2
- ! Compare global areas on input and output grids
- ! -----------------------------------------------------------------
-
- ! Input grid
-
- gglac_i = 0.
- garea_i = 0.
- do ni = 1, ns
- garea_i = garea_i + tgridmap%area_src(ni)*re**2
- gglac_i = gglac_i + glac_i(ni)*(tgridmap%area_src(ni)/100.)*&
- tdomain%mask(ni)*re**2
- end do
-
- ! Output grid
-
- gglac_o = 0.
- garea_o = 0.
- do no = 1, ns_o
- garea_o = garea_o + tgridmap%area_dst(no)*re**2
- gglac_o = gglac_o + glac_o(no)*(tgridmap%area_dst(no)/100.)*&
- frac_dst(no)*re**2
- end do
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'Glacier Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
-2001 format (1x,'surface type input grid area output grid area'/ &
- 1x,' 10**6 km**2 10**6 km**2 ')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2002) gglac_i*1.e-06,gglac_o*1.e-06
- write (ndiag,2004) garea_i*1.e-06,garea_o*1.e-06
-2002 format (1x,'glaciers ',f14.3,f17.3)
-2004 format (1x,'all surface ',f14.3,f17.3)
-
- end if
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- if ( .not. zero_out )then
- call gridmap_clean(tgridmap)
- deallocate (glac_i, frac_dst, mask_r8)
- end if
-
- write (6,*) 'Successfully made %glacier'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkglacier
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: get_elevclass
-!
-! !INTERFACE:
-integer function get_elevclass(topo, writewarn)
-!
-! !DESCRIPTION:
-! Returns elevation class index (1..nglcec) given the topographic height.
-! If topo is lower than the lowest elevation class, returns 0.
-! If topo is higher than the highest elevation class, returns (nglcec+1).
-! In either of the two latter cases, the function also writes a warning message, unless
-! writewarn is present and false.
-!
-! !ARGUMENTS:
- implicit none
- real(r8), intent(in) :: topo ! topographic height (m)
- logical, intent(in), optional :: writewarn ! should warning messages be written? (default: true)
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: m
- logical :: my_writewarn
- character(len=32) :: subname = 'get_elevclass'
-!-----------------------------------------------------------------------
-
- if (present(writewarn)) then
- my_writewarn = writewarn
- else
- my_writewarn = .true.
- end if
-
- if (topo < elevclass(1)) then
- if (my_writewarn) then
- write(6,*) 'WARNING in ', trim(subname)
- write(6,*) 'topo out of bounds'
- write(6,*) 'topo = ', topo
- write(6,*) 'elevclass(1) = ', elevclass(1)
- end if
- get_elevclass = 0
- return
- end if
-
- do m = 1, nglcec
- if (topo < elevclass(m+1)) then
- ! note that we already know that topo >= elevclass(m), otherwise we would have
- ! returned earlier
- get_elevclass = m
- return
- end if
- end do
-
- if (my_writewarn) then
- write(6,*) 'WARNING in ', trim(subname)
- write(6,*) 'topo out of bounds'
- write(6,*) 'topo = ', topo
- write(6,*) 'elevclass(nglcec+1) = ', elevclass(nglcec+1)
- end if
- get_elevclass = nglcec+1
-
-end function get_elevclass
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mean_elevation_vc
-!
-! !INTERFACE:
-real(r8) function mean_elevation_vc(class)
-!
-! !DESCRIPTION:
-! For a virtual column (thus, a column that has no true elevation data), return the
-! "mean" elevation of the given elevation class.
-!
-! !ARGUMENTS:
- implicit none
- integer, intent(in) :: class ! elevation class
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-! !LOCAL VARIABLES:
-!EOP
- character(len=32) :: subname = 'mean_elevation_vc'
-!-----------------------------------------------------------------------
-
- if (class < nglcec) then
- mean_elevation_vc = 0.5_r8 * (elevclass(class) + elevclass(class+1))
- else if (class == nglcec) then
- ! In the top elevation class; in this case, assignment of a "mean" elevation is
- ! somewhat arbitrary
-
- if (nglcec > 1) then
- mean_elevation_vc = 2.0_r8*elevclass(class) - elevclass(class-1)
- else
- ! entirely arbitrary
- mean_elevation_vc = 1000._r8
- end if
- else
- write(6,*) 'ERROR in ', trim(subname), ': class out of bounds= ', class
- call abort()
- end if
-
-end function mean_elevation_vc
-
-end module mkglcmecMod
diff --git a/tools/mksurfdata_map/src/mkgridmapMod.F90 b/tools/mksurfdata_map/src/mkgridmapMod.F90
deleted file mode 100644
index 21ca23f4d6..0000000000
--- a/tools/mksurfdata_map/src/mkgridmapMod.F90
+++ /dev/null
@@ -1,915 +0,0 @@
-module mkgridmapMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkgridmapMod
-!
-! !DESCRIPTION:
-! Module containing 2-d global surface boundary data information
-!
-! !NOTES:
-! Avoid using the frac_src and frac_dst found here, because they
-! are read from mapping files, and we have generally moved to "nomask"
-! mapping files. This means that mapping files now typically contain
-! mask and frac equal to 1 everywhere. So now during remapping we apply the
-! source masks found in the raw datasets and ignore the masks found in the
-! mapping files. Exception: we continue to use a masked mapping file to regrid
-! the 1-km topography.
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
-! !PUBLIC TYPES:
- type gridmap_type
- character(len=32) :: set ! If set or not
- integer :: na ! size of source domain
- integer :: nb ! size of destination domain
- integer :: ns ! number of non-zero elements in matrix
- real(r8), pointer :: yc_src(:) ! "degrees"
- real(r8), pointer :: yc_dst(:) ! "degrees"
- real(r8), pointer :: xc_src(:) ! "degrees"
- real(r8), pointer :: xc_dst(:) ! "degrees"
- real(R8), pointer :: area_src(:) ! area of a grid in map (square radians)
- real(R8), pointer :: area_dst(:) ! area of b grid in map (square radians)
- real(r8), pointer :: frac_src(:) ! "unitless"
- real(r8), pointer :: frac_dst(:) ! "unitless"
- integer , pointer :: src_indx(:) ! corresponding column index
- integer , pointer :: dst_indx(:) ! corresponding row index
- real(r8), pointer :: wovr(:) ! wt of overlap input cell
- end type gridmap_type
- public :: gridmap_type
-!
-! !PUBLIC MEMBER FUNCTIONS:
- public :: gridmap_setptrs ! Set pointers to gridmap data
- public :: for_test_create_gridmap ! Set a gridmap directly, for testing
- public :: gridmap_mapread ! Read in gridmap
- public :: gridmap_check ! Check validity of a gridmap
- public :: gridmap_calc_frac_dst ! Obtain frac_dst
- public :: gridmap_areaave_no_srcmask ! do area average without passing mask
- public :: gridmap_areaave_srcmask ! do area average with mask passed
- public :: gridmap_areaave_scs ! area average, but multiply by ratio of source over destination weight
- public :: gridmap_areastddev ! do area-weighted standard deviation
- public :: gridmap_clean ! Clean and deallocate a gridmap structure
-!
-!
-! !REVISION HISTORY:
-! Author Mariana Vertenstein
-
- ! questions - how does the reverse mapping occur
- ! is mask_dst read in - and what happens if this is very different
- ! from frac_dst which is calculated by mapping frac_src?
- ! in frac - isn't grid1_frac always 1 or 0?
-
- ! !PRIVATE MEMBER FUNCTIONS:
- private :: set_gridmap_var
- private :: gridmap_checkifset
-
- interface set_gridmap_var
- module procedure set_gridmap_var_r8
- module procedure set_gridmap_var_int
- end interface set_gridmap_var
-
- character(len=32), parameter :: isSet = "gridmap_IsSet"
-
-!
-!EOP
-!------------------------------------------------------------------------------
-contains
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_setptrs
-!
-! !INTERFACE:
- subroutine gridmap_setptrs(gridmap, nsrc, ndst, ns, yc_src, yc_dst, &
- xc_src, xc_dst, &
- frac_src, frac_dst, src_indx, dst_indx )
-!
-! !DESCRIPTION:
-! This subroutine assigns pointers to some of the map type data.
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type), intent(in) :: gridmap ! mapping data
- integer, optional :: nsrc ! size of source domain
- integer, optional :: ndst ! size of destination domain
- integer, optional :: ns ! number of non-zero elements in matrix
- integer, optional, pointer :: dst_indx(:) ! Destination index
- integer, optional, pointer :: src_indx(:) ! Source index
- real(r8), optional, pointer :: yc_src(:) ! "degrees"
- real(r8), optional, pointer :: yc_dst(:) ! "degrees"
- real(r8), optional, pointer :: xc_src(:) ! "degrees"
- real(r8), optional, pointer :: xc_dst(:) ! "degrees"
- real(r8), optional, pointer :: frac_src(:) ! "unitless"
- real(r8), optional, pointer :: frac_dst(:) ! "unitless"
-!
-! !REVISION HISTORY:
-! Created by Erik Kluzek
-!
-! !LOCAL VARIABLES:
-!EOP
-!------------------------------------------------------------------------------
- character(*),parameter :: subName = '(gridmap_setptrs) '
-
- call gridmap_checkifset( gridmap, subname )
- if ( present(nsrc) ) nsrc = gridmap%na
- if ( present(ndst) ) ndst = gridmap%nb
- if ( present(ns) ) ns = gridmap%ns
- if ( present(yc_src) ) yc_src => gridmap%yc_src
- if ( present(xc_src) ) xc_src => gridmap%xc_src
- if ( present(frac_src) ) frac_src => gridmap%frac_src
- if ( present(yc_dst) ) yc_dst => gridmap%yc_dst
- if ( present(xc_dst) ) xc_dst => gridmap%xc_dst
- if ( present(frac_dst) ) frac_dst => gridmap%frac_dst
- if ( present(dst_indx) ) dst_indx => gridmap%dst_indx
- if ( present(src_indx) ) src_indx => gridmap%src_indx
- end subroutine gridmap_setptrs
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_mapread
-!
-! !INTERFACE:
- subroutine gridmap_mapread(gridmap, fileName)
-!
-! !DESCRIPTION:
-! This subroutine reads in the map file
-!
-! !USES:
- use mkncdio, only : nf_open, nf_close, nf_strerror
- use mkncdio, only : nf_inq_dimid, nf_inq_dimlen
- use mkncdio, only : nf_inq_varid, nf_get_var_double, nf_get_var_int
- use mkncdio, only : NF_NOWRITE, NF_NOERR
- use mkncdio, only : convert_latlon
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type), intent(out) :: gridmap ! mapping data
- character(len=*) , intent(in) :: filename ! netCDF file to read
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-! !LOCAL VARIABLES:
- integer :: n ! generic loop indices
- integer :: na ! size of source domain
- integer :: nb ! size of destination domain
- integer :: igrow ! aVect index for matrix row
- integer :: igcol ! aVect index for matrix column
- integer :: iwgt ! aVect index for matrix element
- integer :: iarea ! aVect index for area
-
-
- character,allocatable :: str(:) ! variable length char string
- character(len=256) :: attstr ! netCDF attribute name string
- integer :: rcode ! netCDF routine return code
- integer :: fid ! netCDF file ID
- integer :: vid ! netCDF variable ID
- integer :: did ! netCDF dimension ID
- integer :: ns ! size of array
-
- real(r8), parameter :: tol = 1.0e-4_r8 ! tolerance for checking that mapping data
- ! are within expected bounds
-
- !--- formats ---
- character(*),parameter :: subName = '(gridmap_map_read) '
- character(*),parameter :: F00 = '("(gridmap_map_read) ",4a)'
- character(*),parameter :: F01 = '("(gridmap_map_read) ",2(a,i7))'
-!EOP
-!------------------------------------------------------------------------------
-
- !-------------------------------------------------------------------------------
- !
- !-------------------------------------------------------------------------------
-
- write(6,F00) "reading mapping matrix data..."
-
- ! open & read the file
- write(6,F00) "* file name : ",trim(fileName)
-
- rcode = nf_open(filename ,NF_NOWRITE, fid)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- !--- allocate memory & get matrix data ----------
- rcode = nf_inq_dimid (fid, 'n_s', did) ! size of sparse matrix
- rcode = nf_inq_dimlen(fid, did , gridmap%ns)
- rcode = nf_inq_dimid (fid, 'n_a', did) ! size of input vector
- rcode = nf_inq_dimlen(fid, did , gridmap%na)
- rcode = nf_inq_dimid (fid, 'n_b', did) ! size of output vector
- rcode = nf_inq_dimlen(fid, did , gridmap%nb)
-
- write(6,*) "* matrix dimensions rows x cols :",gridmap%na,' x',gridmap%nb
- write(6,*) "* number of non-zero elements: ",gridmap%ns
-
- ns = gridmap%ns
- na = gridmap%na
- nb = gridmap%nb
- allocate(gridmap%wovr(ns) , &
- gridmap%src_indx(ns), &
- gridmap%dst_indx(ns), &
- gridmap%area_src(na), &
- gridmap%frac_src(na), &
- gridmap%area_dst(nb), &
- gridmap%frac_dst(nb), &
- gridmap%xc_dst(nb), &
- gridmap%yc_dst(nb), &
- gridmap%xc_src(na), &
- gridmap%yc_src(na), stat=rcode)
- if (rcode /= 0) then
- write(6,*) SubName//' ERROR: allocate gridmap'
- call abort()
- endif
-
- rcode = nf_inq_varid(fid,'S' ,vid)
- rcode = nf_get_var_double(fid,vid ,gridmap%wovr)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'row',vid)
- rcode = nf_get_var_int(fid, vid ,gridmap%dst_indx)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'col',vid)
- rcode = nf_get_var_int(fid, vid, gridmap%src_indx)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'area_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%area_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'area_b',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%area_dst)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
-
- rcode = nf_inq_varid(fid,'frac_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%frac_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
- if ( any(gridmap%frac_src(:) < 0.0_r8 .or. gridmap%frac_src > (1.0_r8 + tol)) )then
- write(6,*) SubName//' ERROR: frac_src out of bounds'
- write(6,*) 'max = ', maxval(gridmap%frac_src), ' min = ', minval(gridmap%frac_src)
- call abort()
- end if
-
- rcode = nf_inq_varid(fid,'frac_b',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%frac_dst)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
- if ( any(gridmap%frac_dst(:) < 0.0_r8 .or. gridmap%frac_dst > (1.0_r8 + tol)) )then
- write(6,*) SubName//' ERROR: frac_dst out of bounds'
- write(6,*) 'max = ', maxval(gridmap%frac_dst), ' min = ', minval(gridmap%frac_dst)
- call abort()
- end if
-
- rcode = nf_inq_varid(fid,'xc_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%xc_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
- call convert_latlon(fid, 'xc_a', gridmap%xc_src)
-
- rcode = nf_inq_varid(fid,'yc_a',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%yc_src)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
- call convert_latlon(fid, 'yc_a', gridmap%yc_src)
-
- rcode = nf_inq_varid(fid,'xc_b',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%xc_dst)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
- call convert_latlon(fid, 'xc_b', gridmap%xc_dst)
-
- rcode = nf_inq_varid(fid,'yc_b',vid)
- rcode = nf_get_var_double(fid, vid, gridmap%yc_dst)
- if (rcode /= NF_NOERR) write(6,F00) nf_strerror(rcode)
- call convert_latlon(fid, 'yc_b', gridmap%yc_dst)
-
- rcode = nf_close(fid)
-
- gridmap%set = IsSet
-
- end subroutine gridmap_mapread
-
-!==========================================================================
-
- !-----------------------------------------------------------------------
- subroutine for_test_create_gridmap(gridmap, na, nb, ns, &
- src_indx, dst_indx, wovr, &
- frac_src, frac_dst, area_src, area_dst, &
- xc_src, xc_dst, yc_src, yc_dst)
- !
- ! !DESCRIPTION:
- ! Creates a gridmap object directly from inputs
- !
- ! This is meant for testing
- !
- ! !ARGUMENTS:
- type(gridmap_type), intent(out) :: gridmap
- integer, intent(in) :: na
- integer, intent(in) :: nb
- integer, intent(in) :: ns
- integer, intent(in) :: src_indx(:)
- integer, intent(in) :: dst_indx(:)
- real(r8), intent(in) :: wovr(:)
-
- ! If not provided, mask and frac values are set to 1 everywhere
- real(r8), intent(in), optional :: frac_src(:)
- real(r8), intent(in), optional :: frac_dst(:)
-
- ! If not provided, area values are set to a constant value everywhere
- real(r8), intent(in), optional :: area_src(:)
- real(r8), intent(in), optional :: area_dst(:)
-
- ! If not provided, xc and yc values are set to 0 everywhere
- real(r8), intent(in), optional :: xc_src(:)
- real(r8), intent(in), optional :: xc_dst(:)
- real(r8), intent(in), optional :: yc_src(:)
- real(r8), intent(in), optional :: yc_dst(:)
-
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'for_test_create_gridmap'
- !-----------------------------------------------------------------------
-
- ! ------------------------------------------------------------------------
- ! Error checking on sizes of arrays
- ! ------------------------------------------------------------------------
- call check_input_size('src_indx', size(src_indx), ns)
- call check_input_size('dst_indx', size(dst_indx), ns)
- call check_input_size('wovr', size(wovr), ns)
-
- if (present(frac_src)) then
- call check_input_size('frac_src', size(frac_src), na)
- end if
- if (present(area_src)) then
- call check_input_size('area_src', size(area_src), na)
- end if
- if (present(xc_src)) then
- call check_input_size('xc_src', size(xc_src), na)
- end if
- if (present(yc_src)) then
- call check_input_size('yc_src', size(yc_src), na)
- end if
-
- if (present(frac_dst)) then
- call check_input_size('frac_dst', size(frac_dst), nb)
- end if
- if (present(area_dst)) then
- call check_input_size('area_dst', size(area_dst), nb)
- end if
- if (present(xc_dst)) then
- call check_input_size('xc_dst', size(xc_dst), nb)
- end if
- if (present(yc_dst)) then
- call check_input_size('yc_dst', size(yc_dst), nb)
- end if
-
- ! ------------------------------------------------------------------------
- ! Create gridmap object
- ! ------------------------------------------------------------------------
-
- gridmap%na = na
- gridmap%nb = nb
- gridmap%ns = ns
-
- allocate(gridmap%src_indx(ns))
- gridmap%src_indx = src_indx
- allocate(gridmap%dst_indx(ns))
- gridmap%dst_indx = dst_indx
- allocate(gridmap%wovr(ns))
- gridmap%wovr = wovr
-
- allocate(gridmap%frac_src(na))
- call set_gridmap_var(gridmap%frac_src, 1._r8, frac_src)
- allocate(gridmap%frac_dst(nb))
- call set_gridmap_var(gridmap%frac_dst, 1._r8, frac_dst)
-
- allocate(gridmap%yc_src(na))
- call set_gridmap_var(gridmap%yc_src, 0._r8, yc_src)
- allocate(gridmap%yc_dst(nb))
- call set_gridmap_var(gridmap%yc_dst, 0._r8, yc_dst)
- allocate(gridmap%xc_src(na))
- call set_gridmap_var(gridmap%xc_src, 0._r8, xc_src)
- allocate(gridmap%xc_dst(nb))
- call set_gridmap_var(gridmap%xc_dst, 0._r8, xc_dst)
- allocate(gridmap%area_src(na))
- call set_gridmap_var(gridmap%area_src, 0._r8, area_src)
- allocate(gridmap%area_dst(nb))
- call set_gridmap_var(gridmap%area_dst, 0._r8, area_dst)
-
- gridmap%set = isSet
-
- contains
- subroutine check_input_size(varname, actual_size, expected_size)
- character(len=*), intent(in) :: varname
- integer, intent(in) :: actual_size
- integer, intent(in) :: expected_size
-
- if (actual_size /= expected_size) then
- write(6,*) subname, ' ERROR: ', trim(varname), ' wrong size: actual, expected = ', &
- actual_size, expected_size
- call abort()
- end if
- end subroutine check_input_size
-
- end subroutine for_test_create_gridmap
-
- subroutine set_gridmap_var_r8(var, default_val, input_val)
- ! Convenience subroutine to set a variable to an optional input or a default value
- real(r8), intent(out) :: var(:)
- real(r8), intent(in) :: default_val
- real(r8), intent(in), optional :: input_val(:)
-
- if (present(input_val)) then
- var = input_val
- else
- var = default_val
- end if
- end subroutine set_gridmap_var_r8
-
- subroutine set_gridmap_var_int(var, default_val, input_val)
- ! Convenience subroutine to set a variable to an optional input or a default value
- integer, intent(out) :: var(:)
- integer, intent(in) :: default_val
- integer, intent(in), optional :: input_val(:)
-
- if (present(input_val)) then
- var = input_val
- else
- var = default_val
- end if
- end subroutine set_gridmap_var_int
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_check
-!
-! !INTERFACE:
- subroutine gridmap_check(gridmap, mask_src, frac_dst, caller)
-!
-! !DESCRIPTION:
-! Check validity of a gridmap
-! Aborts if there are any errors
-!
-! !USES:
- use mkvarctl, only : mksrf_gridtype
- use mkvarpar, only : re
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type) , intent(in) :: gridmap ! mapping data
- real(r8), intent(in) :: mask_src(:) ! input mask; could be declared integer but for the argument passed from subr. mktopostats
- real(r8), intent(in) :: frac_dst(:) ! output fractions
- character(len=*) , intent(in) :: caller ! calling subroutine (used for error messages)
-!
-! !REVISION HISTORY:
-! Created by Bill Sacks
-!
-! !LOCAL VARIABLES:
- real(r8) :: sum_area_i ! global sum of input area
- real(r8) :: sum_area_o ! global sum of output area
- integer :: ni,no,ns_i,ns_o ! indices
-
- real(r8), parameter :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=*), parameter :: subname = 'gridmap_check'
-!EOP
-!------------------------------------------------------------------------------
-
- ns_i = gridmap%na
- ns_o = gridmap%nb
-
- ! -----------------------------------------------------------------
- ! Error check prep
- ! Global sum of output area -- must multiply by fraction of
- ! output grid that is land as determined by input grid
- ! -----------------------------------------------------------------
-
- sum_area_i = 0.0_r8
- do ni = 1,ns_i
- sum_area_i = sum_area_i + gridmap%area_src(ni)*mask_src(ni)*re**2
- enddo
-
- sum_area_o = 0.
- do no = 1,ns_o
- sum_area_o = sum_area_o + gridmap%area_dst(no)*frac_dst(no)*re**2
- end do
-
- ! -----------------------------------------------------------------
- ! Error check1
- ! Compare global sum_area_i to global sum_area_o.
- ! -----------------------------------------------------------------
-
- if ( trim(mksrf_gridtype) == 'global' ) then
- if ( abs(sum_area_o/sum_area_i-1.) > relerr ) then
- write (6,*) subname//' ERROR from '//trim(caller)//': mapping areas not conserved'
- write (6,'(a30,e20.10)') 'global sum output field = ',sum_area_o
- write (6,'(a30,e20.10)') 'global sum input field = ',sum_area_i
- stop
- end if
- end if
-
- end subroutine gridmap_check
-
-
-!==========================================================================
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_areaave_scs
-!
-! !INTERFACE:
- subroutine gridmap_areaave_scs (gridmap, src_array, dst_array, nodata, src_wt, dst_wt, frac_dst)
-!
-! !DESCRIPTION:
-! This subroutine does a simple area average, but multiplies by the ratio of the source over
-! the destination weight. Sets to zero if destination weight is zero.
-!
-! The src_wt must be multiplied by tdomain%mask to maintain consistency with the
-! incoming frac_dst.
-!
-! Called by subroutine mkpft.
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type) , intent(in) :: gridmap ! gridmap data
- real(r8), intent(in) :: src_array(:)
- real(r8), intent(out):: dst_array(:)
- real(r8), intent(in) :: nodata ! value to apply where there are no input data
- real(r8), intent(in) :: src_wt(:) ! Source weights
- real(r8), intent(in) :: dst_wt(:) ! Destination weights
- real(r8), intent(in) :: frac_dst(:) ! Output grid weights
-
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein, modified by Sean Swenson
-!
-! !LOCAL VARIABLES:
- integer :: n,ns,ni,no
- real(r8):: wt,frac,swt,dwt
- real(r8), allocatable :: sum_weights(:) ! sum of weights on the output grid
- character(*),parameter :: subName = '(gridmap_areaave_scs) '
-!EOP
-!------------------------------------------------------------------------------
-
- ! Error check inputs and initialize local variables
-
- if (size(frac_dst) /= size(dst_array)) then
- write(6,*) subname//' ERROR: incorrect size of frac_dst'
- write(6,*) 'size(frac_dst) = ', size(frac_dst)
- write(6,*) 'size(dst_array) = ', size(dst_array)
- call abort()
- end if
-
- call gridmap_checkifset( gridmap, subname )
- allocate(sum_weights(size(dst_array)))
- sum_weights = 0._r8
- dst_array = 0._r8
-
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- frac = frac_dst(no)
- swt = src_wt(ni)
- dwt = dst_wt(no)
- wt = wt * swt
- if(dwt > 0._r8) then
- wt = wt / dwt
- else
- wt = 0._r8
- endif
- if (frac > 0.) then
- dst_array(no) = dst_array(no) + wt * src_array(ni)/frac
- sum_weights(no) = sum_weights(no) + wt
- end if
- end do
-
- where (sum_weights == 0._r8)
- dst_array = nodata
- end where
-
- deallocate(sum_weights)
-
- end subroutine gridmap_areaave_scs
-
-!==========================================================================
-
-!==========================================================================
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_areaave_srcmask
-!
-! !INTERFACE:
- subroutine gridmap_areaave_srcmask (gridmap, src_array, dst_array, nodata, mask_src, frac_dst)
-!
-! !DESCRIPTION:
-! This subroutine does an area average with the source mask
-!
-! !NOTES:
-! We have generally moved to "nomask" mapping files. This means that mapping
-! files now typically contain mask and frac equal to 1 everywhere. So now during
-! remapping we apply the source masks found in the raw datasets and ignore the
-! masks found in the mapping files. Exception: we continue to use a masked
-! mapping file to regrid the 1-km topography.
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type) , intent(in) :: gridmap ! gridmap data
- real(r8), intent(in) :: src_array(:)
- real(r8), intent(out):: dst_array(:)
- real(r8), intent(in) :: nodata ! value to apply where there are no input data
- integer, intent(in) :: mask_src(:)
- real(r8), intent(in) :: frac_dst(:)
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-! !LOCAL VARIABLES:
- integer :: n,ns,ni,no
- real(r8):: wt
- character(*),parameter :: subName = '(gridmap_areaave_srcmask) '
-!EOP
-!------------------------------------------------------------------------------
- ! Error check inputs and initialize local variables
-
- ns = size(dst_array)
- if (size(frac_dst) /= ns) then
- write(6,*) subname//' ERROR: incorrect size of frac_dst'
- write(6,*) 'size(frac_dst) = ', size(frac_dst)
- write(6,*) 'size(dst_array) = ', ns
- call abort()
- end if
- if (size(mask_src) /= size(src_array)) then
- write(6,*) subname//' ERROR: incorrect size of mask_src'
- write(6,*) 'size(mask_src) = ', size(mask_src)
- write(6,*) 'size(src_array) = ', size(src_array)
- call abort()
- end if
-
- call gridmap_checkifset( gridmap, subname )
-
- dst_array = 0._r8
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- if (mask_src(ni) > 0) then
- dst_array(no) = dst_array(no) + wt*mask_src(ni)*src_array(ni)/frac_dst(no)
- end if
- end do
-
- where (frac_dst == 0._r8)
- dst_array = nodata
- end where
-
- end subroutine gridmap_areaave_srcmask
-
-!==========================================================================
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_areastddev
-!
-! !INTERFACE:
- subroutine gridmap_areastddev (gridmap, src_array, dst_array, nodata)
-!
-! !DESCRIPTION:
-! Computes area-weighted standard deviation
-!
-! We use the definition of standard deviation that applies if you measure the full
-! population (as opposed to the unbiased standard deviation that should be used when
-! sampling a subset of the full population). (This is equivalent to using 1/N rather than
-! 1/(N-1).) This makes sense if we assume that the underlying values are constant
-! throughout each source grid cell -- in that case, we know the full population as long as
-! we know the values in all source grid cells, which is generally the case.
-!
-! The formula is from
-! (accessed 3-4-13).
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type) , intent(in) :: gridmap ! gridmap data
- real(r8), intent(in) :: src_array(:)
- real(r8), intent(out):: dst_array(:)
- real(r8), intent(in) :: nodata ! value to apply where there are no input data
-!
-! !REVISION HISTORY:
-! Created by Bill Sacks
-!
-! !LOCAL VARIABLES:
- integer :: n,ni,no
- integer :: ns_o ! number of output points
- real(r8):: wt ! weight of overlap
- real(r8), allocatable :: weighted_means(:) ! weighted mean on the output grid
- real(r8), allocatable :: sum_weights(:) ! sum of weights on the output grid
- character(*),parameter :: subName = '(gridmap_areastddev) '
-!EOP
-!------------------------------------------------------------------------------
- call gridmap_checkifset( gridmap, subname )
-
- ns_o = size(dst_array)
- allocate(weighted_means(ns_o))
-
- ! Subr. gridmap_areaave_no_srcmask should NOT be used in general. We have
- ! kept it to support the rare raw data files for which we have masking on
- ! the mapping file and, therefore, we do not explicitly pass the src_mask
- ! as an argument. In general, users are advised to use subroutine
- ! gridmap_areaave_srcmask.
- call gridmap_areaave_no_srcmask(gridmap, src_array, weighted_means, nodata=0._r8)
-
- ! WJS (3-5-13): I believe that sum_weights should be the same as gridmap%frac_dst,
- ! but I'm not positive of this, so we compute it explicitly to be safe
- allocate(sum_weights(ns_o))
- sum_weights(:) = 0._r8
- dst_array(:) = 0._r8
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- ! The following accumulates the numerator of the weighted sigma-squared
- dst_array(no) = dst_array(no) + wt * (src_array(ni) - weighted_means(no))**2
- sum_weights(no) = sum_weights(no) + wt
- end do
-
- do no = 1,ns_o
- if (sum_weights(no) > 0._r8) then
- dst_array(no) = sqrt(dst_array(no)/sum_weights(no))
- else
- dst_array(no) = nodata
- end if
- end do
-
- deallocate(weighted_means, sum_weights)
-
- end subroutine gridmap_areastddev
-
-!==========================================================================
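The area-weighted standard deviation removed above (full-population, i.e. 1/N rather than 1/(N-1), with overlap weights in place of counts) can be sketched as follows. This is a hypothetical Python stand-in for `gridmap_areastddev`, not the Fortran implementation; the function name and the flat `values`/`weights` lists are illustrative assumptions.

```python
import math

def area_weighted_stddev(values, weights, nodata):
    """Population-style weighted standard deviation, as in gridmap_areastddev:
    sqrt( sum(w * (x - mean)**2) / sum(w) ).  Returns `nodata` when the
    total weight is zero (no overlapping source cells)."""
    total = sum(weights)
    if total <= 0.0:
        return nodata
    # Weighted mean plays the role of the areaave call in the Fortran code.
    mean = sum(w * x for w, x in zip(weights, values)) / total
    var = sum(w * (x - mean) ** 2 for w, x in zip(weights, values)) / total
    return math.sqrt(var)
```

Dividing by the sum of weights (rather than the count) is what makes the overlap areas act as the population sizes of each source cell.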
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_clean
-!
-! !INTERFACE:
- subroutine gridmap_clean(gridmap)
-!
-! !DESCRIPTION:
-! This subroutine deallocates the gridmap type
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type), intent(inout) :: gridmap
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subName = "gridmap_clean"
- integer ier ! error flag
-!EOP
-!------------------------------------------------------------------------------
- if ( gridmap%set .eq. IsSet )then
- deallocate(gridmap%wovr , &
- gridmap%src_indx, &
- gridmap%dst_indx, &
- gridmap%area_src, &
- gridmap%area_dst, &
- gridmap%frac_src, &
- gridmap%frac_dst, &
- gridmap%xc_src, &
- gridmap%yc_src, stat=ier)
- if (ier /= 0) then
- write(6,*) SubName//' ERROR: deallocate gridmap'
- call abort()
- endif
- else
-       write(6,*) subName//' Warning: calling gridmap_clean on an unallocated gridmap'
- end if
- gridmap%set = "NOT-set"
-
- end subroutine gridmap_clean
-
-!==========================================================================
-
- subroutine gridmap_checkifset( gridmap, subname )
-
- implicit none
- type(gridmap_type), intent(in) :: gridmap
- character(len=*), intent(in) :: subname
-
- if ( gridmap%set .ne. IsSet )then
- write(6,*) SubName//' ERROR: gridmap NOT set yet, run gridmap_mapread first'
- call abort()
- end if
- end subroutine gridmap_checkifset
-
-!==========================================================================
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_calc_frac_dst
-!
-! !INTERFACE:
- subroutine gridmap_calc_frac_dst(gridmap, mask_src, frac_dst)
-!
-! !DESCRIPTION:
-! This subroutine calculates frac_dst, the sum of masked source overlap
-! weights landing in each destination cell
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type) , intent(in) :: gridmap ! gridmap data
- integer, intent(in) :: mask_src(:)
- real(r8), intent(out) :: frac_dst(:)
-!
-! !REVISION HISTORY:
-! Created by Sam Levis
-!
-! !LOCAL VARIABLES:
- integer :: n,ns,ni,no
- real(r8):: wt
- character(*),parameter :: subName = '(gridmap_calc_frac_dst) '
-!EOP
-!------------------------------------------------------------------------------
- call gridmap_checkifset( gridmap, subname )
- frac_dst(:) = 0._r8
-
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- if (mask_src(ni) > 0) then
- frac_dst(no) = frac_dst(no) + wt*mask_src(ni)
- end if
- end do
-
- end subroutine gridmap_calc_frac_dst
-
-!==========================================================================
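The pair of loops in `gridmap_calc_frac_dst` and `gridmap_areaave_srcmask` (the latter's accumulation loop appears at the top of this section) can be sketched as below. This is a simplified Python stand-in under the assumption that the `gridmap_type` sparse-weight triplets are given as parallel lists; all names here are illustrative, not the Fortran API.

```python
def calc_frac_dst(src_indx, dst_indx, wovr, mask_src, n_dst):
    """Accumulate the masked overlap weight landing in each output cell
    (mirrors gridmap_calc_frac_dst; 0-based indices for simplicity)."""
    frac_dst = [0.0] * n_dst
    for ni, no, wt in zip(src_indx, dst_indx, wovr):
        if mask_src[ni] > 0:
            frac_dst[no] += wt * mask_src[ni]
    return frac_dst

def areaave_srcmask(src, src_indx, dst_indx, wovr, mask_src, frac_dst, nodata):
    """Masked, frac_dst-normalized area average (mirrors
    gridmap_areaave_srcmask); cells with zero frac_dst get `nodata`."""
    dst = [0.0] * len(frac_dst)
    for ni, no, wt in zip(src_indx, dst_indx, wovr):
        if mask_src[ni] > 0:
            dst[no] += wt * mask_src[ni] * src[ni] / frac_dst[no]
    return [nodata if f == 0.0 else d for d, f in zip(dst, frac_dst)]
```

Because the division guard is `mask_src(ni) > 0`, any destination cell with `frac_dst == 0` is never touched inside the loop, so the final `nodata` sweep is safe.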
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: gridmap_areaave_no_srcmask
-!
-! !INTERFACE:
- subroutine gridmap_areaave_no_srcmask (gridmap, src_array, dst_array, nodata)
-!
-! !DESCRIPTION:
-! This subroutine should NOT be used in general. We have kept it to support the
-! rare raw data files for which we have masking on the mapping file and,
-! therefore, we do not explicitly pass the src_mask as an argument. In general,
-! users are advised to use subroutine gridmap_areaave_srcmask.
-!
-! Perform simple area average without explicitly passing a src mask. The src
-! mask may be implicit in gridmap%wovr.
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type) , intent(in) :: gridmap ! gridmap data
- real(r8), intent(in) :: src_array(:)
- real(r8), intent(out):: dst_array(:)
- real(r8), intent(in) :: nodata ! value to apply where there are no input data
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein
-!
-! !LOCAL VARIABLES:
- integer :: n,ns,ni,no
- real(r8):: wt,frac
- real(r8), allocatable :: sum_weights(:) ! sum of weights on the output grid
- character(*),parameter :: subName = '(gridmap_areaave_no_srcmask) '
-!EOP
-!------------------------------------------------------------------------------
- call gridmap_checkifset( gridmap, subname )
- allocate(sum_weights(size(dst_array)))
- sum_weights = 0._r8
- dst_array = 0._r8
-
- do n = 1,gridmap%ns
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n)
- frac = gridmap%frac_dst(no)
- if (frac > 0.) then
- dst_array(no) = dst_array(no) + wt * src_array(ni)/frac
- sum_weights(no) = sum_weights(no) + wt
- end if
- end do
-
- where (sum_weights == 0._r8)
- dst_array = nodata
- end where
-
- deallocate(sum_weights)
-
- end subroutine gridmap_areaave_no_srcmask
-
-end module mkgridmapMod
-
-
diff --git a/tools/mksurfdata_map/src/mkharvestMod.F90 b/tools/mksurfdata_map/src/mkharvestMod.F90
deleted file mode 100644
index 0dc107729b..0000000000
--- a/tools/mksurfdata_map/src/mkharvestMod.F90
+++ /dev/null
@@ -1,1104 +0,0 @@
-module mkharvestMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkharvest
-!
-! !DESCRIPTION:
-! Make harvest and grazing data to add to the dynamic PFT file.
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!-----------------------------------------------------------------------
-! !USES:
- use shr_kind_mod , only : r8 => shr_kind_r8, CL => shr_kind_CL
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
-
- private
-
-! !PUBLIC DATA MEMBERS:
-
- public :: harvestDataType
- integer, private, parameter :: numharv = 9 ! number of harvest and grazing fields
-
- type :: harvestDataType
- private
- real(r8), pointer :: data1D(:,:) ! Input 1D data
- real(r8), pointer :: data2DCFT(:,:,:) ! Input 2D data with CFT's
- real(r8), pointer :: data2DPFT(:,:,:) ! Input 2D data with PFT's
- real(r8), pointer :: OutData1D(:,:) ! Output 1D data
- real(r8), pointer :: OutData2DCFT(:,:,:) ! Output 2D data with CFT's
- real(r8), pointer :: OutData2DPFT(:,:,:) ! Output 2D data with natural PFT's
- integer :: dims2nd(numharv) ! 2nd dimension size
- integer :: CFTdimsize ! Size of CFT dimension
- integer :: PFTdimsize ! Size of PFT dimension
-     integer :: indices1D(numharv)     ! Field indices for 1D variables
- integer :: indicesCFT(numharv) ! Field indices for CFT variables
- integer :: indicesPFT(numharv) ! Field indices for PFT variables
- contains
- procedure :: init ! Initialization
- procedure :: get1DFieldPtr ! Get a pointer to a 1D field
- procedure :: get2DFieldPtr ! Get a pointer to a 2D field
- procedure :: getFieldsIdx ! Get field indexes to 1D and 2D fields
- procedure :: getFieldsDim ! Get dimension names for this field
- procedure :: isField1D ! Return true if field is a 1D field
- procedure :: isField2D ! Return true if field is a 2D field
- procedure :: num1DFields ! Return the number of 1D fields
- procedure :: num2DFields ! Return the number of 2D fields
- procedure :: clean ! Clean and deallocate everything
- end type harvestDataType
-
-! !PUBLIC MEMBER FUNCTIONS:
- public mkharvest_init ! Initialization
- public mkharvest ! Calculate the harvest values on output grid
- public mkharvest_fieldname ! Field name for harvest fields on landuse.timeseries
- public mkharvest_longname ! Long name
- public mkharvest_units ! units
- public mkharvest_numtypes ! Number of harvest types
- public mkharvest_parse_oride ! Parse the over-ride string
-
-! !PRIVATE MEMBER FUNCTIONS: (but public because unit test uses them)
- public mkharvest_fieldInBounds ! Check that field index is within bounds
-
-! !PRIVATE DATA MEMBERS:
-
- integer, parameter :: harlen = 25 ! length of strings for harvest fieldnames
- character(len=harlen), parameter :: harvest_fieldnames(numharv) = (/ &
- 'HARVEST_VH1 ', &
- 'HARVEST_VH2 ', &
- 'HARVEST_SH1 ', &
- 'HARVEST_SH2 ', &
- 'HARVEST_SH3 ', &
- 'GRAZING ', &
- 'FERTNITRO_CFT ', &
- 'UNREPRESENTED_PFT_LULCC', &
- 'UNREPRESENTED_CFT_LULCC' &
- /)
- character(len=harlen), parameter :: harvest_const_fieldnames(numharv) = (/ &
- 'CONST_HARVEST_VH1 ', &
- 'CONST_HARVEST_VH2 ', &
- 'CONST_HARVEST_SH1 ', &
- 'CONST_HARVEST_SH2 ', &
- 'CONST_HARVEST_SH3 ', &
- 'CONST_GRAZING ', &
- 'CONST_FERTNITRO_CFT ', &
- 'UNREPRESENTED_PFT_LULCC', &
- 'UNREPRESENTED_CFT_LULCC' &
- /)
- character(len=CL), parameter :: string_undef = 'UNSET'
-  real(r8), parameter :: real_undef = -999.99_r8
- character(len=CL), save :: harvest_longnames(numharv) = string_undef
- character(len=CL), save :: harvest_units(numharv) = string_undef
- real(r8), pointer :: oride_harv(:) ! array that can override harvesting
- logical , save :: initialized = .false.
-
-!EOP
-!-----------------------------------------------------------------------
-contains
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: init
-!
-! !INTERFACE:
- subroutine init( this, dims2nd, ns_i, ns_o, init_value )
-!
-! !DESCRIPTION:
-! Initialization of the harvestData object
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(INOUT) :: this ! harvestData object
- integer, intent(IN) :: dims2nd(:) ! 2nd Dimension sizes
- integer, intent(IN) :: ns_i ! Input dimension size
- integer, intent(IN) :: ns_o ! Output dimension size
- real(r8), intent(IN) :: init_value ! Initial value
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::init'
- integer :: num2nd ! number of non 1D variables
- integer :: numCFT, numPFT ! number of CFT and PFT variables respectively
- integer :: num1D ! number of 1D variables
- integer :: n ! index
-!EOP
-!-----------------------------------------------------------------------
- if ( size(dims2nd) /= numharv )then
- write(*,*) subname//':ERROR:: dims2nd given to init is not the right size'
- call abort()
- end if
- this%CFTdimsize = 64
- this%PFTdimsize = 15
- this%dims2nd = dims2nd
- num2nd = 0
- numCFT = 0
- numPFT = 0
- num1D = 0
- this%indices1D = -1
- this%indicesPFT = -1
- this%indicesCFT = -1
- do n = 1, numharv
- if ( dims2nd(n) == 0 )then
- num1D = num1D + 1
- this%indices1D(n) = num1D
- else
- num2nd = num2nd + 1
- if ( dims2nd(n) == this%CFTdimsize )then
- numCFT = numCFT + 1
- this%indicesCFT(n) = numCFT
- else if ( dims2nd(n) == this%PFTdimsize )then
- numPFT = numPFT + 1
- this%indicesPFT(n) = numPFT
- else
- write(*,*) 'ERROR:: dims2nd is not the right size (should be 0, 15, or 64) = ', dims2nd(n)
- call abort()
- end if
- end if
- end do
-
- allocate( this%data1D(ns_i,num1D) )
- allocate( this%OutData1D(ns_o,num1D) )
-
- this%OutData1D(:,:) = init_value
-
- if ( num2nd > 0 ) then
- allocate( this%data2DCFT (ns_i,this%CFTdimsize,numCFT) )
- allocate( this%OutData2DCFT(ns_o,this%CFTdimsize,numCFT) )
-
- this%OutData2DCFT(:,:,:) = init_value
-
- allocate( this%data2DPFT (ns_i,this%PFTdimsize,numPFT) )
- allocate( this%OutData2DPFT(ns_o,this%PFTdimsize,numPFT) )
-
- this%OutData2DPFT(:,:,:) = init_value
- end if
-
- end subroutine init
-
-!-----------------------------------------------------------------------
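The index bookkeeping in `init` above classifies each of the `numharv` fields by its 2nd-dimension size: 0 means a plain 1D field, 64 a CFT field, 15 a natural-PFT field, and each field gets a 1-based slot within its group. A hypothetical Python sketch of that classification (names and dict-based storage are assumptions, not the Fortran layout):

```python
CFT_DIM, PFT_DIM = 64, 15  # this%CFTdimsize / this%PFTdimsize in the Fortran

def classify_fields(dims2nd):
    """Map field number -> 1-based slot in its storage group,
    mirroring the indices1D / indicesCFT / indicesPFT arrays."""
    idx1d, idxcft, idxpft = {}, {}, {}
    for n, d in enumerate(dims2nd):
        if d == 0:
            idx1d[n] = len(idx1d) + 1       # 1-based, as in Fortran
        elif d == CFT_DIM:
            idxcft[n] = len(idxcft) + 1
        elif d == PFT_DIM:
            idxpft[n] = len(idxpft) + 1
        else:
            raise ValueError("dims2nd must be 0, 15, or 64; got %d" % d)
    return idx1d, idxcft, idxpft
```

The slot index is what `get1DFieldPtr`/`get2DFieldPtr` later use to pick the right slice of `data1D`, `data2DCFT`, or `data2DPFT`.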
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: get1DFieldPtr
-!
-! !INTERFACE:
- function get1DFieldPtr( this, nfield, output ) result(ptr1D)
-!
-! !DESCRIPTION:
-! Returns 1D pointer to field data for this index
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
- integer, intent(in) :: nfield ! field index
- real(r8), pointer :: ptr1D(:) ! Pointer to 1D data
- logical, optional, intent(in) :: output ! Flag if this is the output pointer or not (input)
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::get1DFieldPtr'
- logical :: loutput ! Local output flag
-!EOP
-!-----------------------------------------------------------------------
- loutput = .false.
- if ( present(output) ) loutput = output
- if ( mkharvest_fieldInBounds( nfield ) .and. this%isField1D(nfield) )then
- if ( .not. loutput ) then
- ptr1D => this%data1D(:,this%indices1D(nfield))
- else
- ptr1D => this%OutData1D(:,this%indices1D(nfield))
- end if
- else
- call abort()
- end if
- end function get1DFieldPtr
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: get2DFieldPtr
-!
-! !INTERFACE:
- function get2DFieldPtr( this, nfield, output ) result(ptr2D)
-!
-! !DESCRIPTION:
-! Returns 2D pointer to field data for this index
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
- integer, intent(in) :: nfield ! field index
- real(r8), pointer :: ptr2D(:,:) ! Pointer to 2D data
- logical, optional, intent(in) :: output ! Flag if this is the output pointer or not (input)
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::get2DFieldPtr'
- logical :: loutput ! Local output flag
-!EOP
-!-----------------------------------------------------------------------
- loutput = .false.
- if ( present(output) ) loutput = output
- if ( mkharvest_fieldInBounds( nfield ) .and. this%isField2D(nfield) )then
- if ( .not. loutput ) then
- if ( this%dims2nd(nfield) == this%CFTdimsize )then
- ptr2D => this%data2DCFT(:,:,this%indicesCFT(nfield))
- else
- ptr2D => this%data2DPFT(:,:,this%indicesPFT(nfield))
- end if
- else
- if ( this%dims2nd(nfield) == this%CFTdimsize )then
- ptr2D => this%OutData2DCFT(:,:,this%indicesCFT(nfield))
- else
- ptr2D => this%OutData2DPFT(:,:,this%indicesPFT(nfield))
- end if
- end if
- else
- call abort()
- end if
- end function get2DFieldPtr
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: getFieldsIdx
-!
-! !INTERFACE:
- subroutine getFieldsIdx( this, indices1D, indices2D )
-!
-! !DESCRIPTION:
-! Returns list of 1D and 2D fields indices
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
- integer, allocatable :: indices1D(:) ! List of 1D indices
- integer, allocatable :: indices2D(:) ! List of 2D indices
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::getFieldsIdx'
-    integer :: ifld, n1, n2        ! field index and 1D/2D field counters
-!EOP
-!-----------------------------------------------------------------------
- allocate( indices1D(max(1,this%num1DFields()) ) )
- allocate( indices2D(max(1,this%num2DFields()) ) )
- indices1D = -1
- indices2D = -1
- n1 = 0
- n2 = 0
- do ifld = 1, mkharvest_numtypes()
- if ( this%isField1D(ifld) )then
- n1 = n1 + 1
- indices1D(n1) = ifld
- else if ( this%isField2D(ifld) )then
- n2 = n2 + 1
- indices2D(n2) = ifld
- end if
- end do
- end subroutine getFieldsIdx
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: getFieldsDim
-!
-! !INTERFACE:
- function getFieldsDim( this, nfield ) result(dimname)
-!
-! !DESCRIPTION:
-! Returns the dimension name for the given field index
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
- integer, intent(in) :: nfield ! field index
- character(len=10) :: dimname ! Dimension names
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::getFieldsDim'
-!EOP
-!-----------------------------------------------------------------------
- if ( this%dims2nd(nfield) == this%CFTdimsize )then
- dimname = "cft"
- else if ( this%dims2nd(nfield) == this%PFTdimsize )then
- dimname = "natpft"
- else
- dimname = "none"
- end if
- end function getFieldsDim
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: isField1D
-!
-! !INTERFACE:
- logical function isField1D( this, nfield )
-!
-! !DESCRIPTION:
-! Returns true if this field index is a 1D field
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
- integer, intent(in) :: nfield ! field index
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::isField1D'
-!EOP
-!-----------------------------------------------------------------------
- isField1D = .false.
- if ( mkharvest_fieldInBounds( nfield ) )then
- if ( this%dims2nd(nfield) == 0 ) isField1D = .true.
- else
- call abort()
- end if
- end function isField1D
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: isField2D
-!
-! !INTERFACE:
- logical function isField2D( this, nfield )
-!
-! !DESCRIPTION:
-! Returns true if this field index is a 2D field
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
- integer, intent(in) :: nfield ! field index
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::isField2D'
-!EOP
-!-----------------------------------------------------------------------
- isField2D = .false.
- if ( mkharvest_fieldInBounds( nfield ) )then
- if ( this%dims2nd(nfield) /= 0 ) isField2D = .true.
- else
- call abort()
- end if
- end function isField2D
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: num1DFields
-!
-! !INTERFACE:
- integer function num1DFields( this )
-!
-! !DESCRIPTION:
-! Returns the number of 1D fields
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::num1DFields'
-!EOP
-!-----------------------------------------------------------------------
- num1DFields = count( this%dims2nd == 0)
- end function num1DFields
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: num2DFields
-!
-! !INTERFACE:
- integer function num2DFields( this )
-!
-! !DESCRIPTION:
-! Returns the number of 2D fields
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(IN) :: this ! harvestData object
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::num2DFields'
-!EOP
-!-----------------------------------------------------------------------
- num2DFields = count( this%dims2nd /= 0)
- end function num2DFields
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest_init
-!
-! !INTERFACE:
- subroutine mkharvest_init( ns_o, init_val, harvdata, fharvest, constant )
-!
-! !DESCRIPTION:
-! Initialization of mkharvest module.
-!
-! !USES:
- use mkncdio
- implicit none
-!
-! !ARGUMENTS:
- integer , intent(in) :: ns_o ! clm output grid resolution
- real(r8) , intent(in) :: init_val ! initial value to set to
- type(harvestDataType), intent(INOUT) :: harvdata ! Harvest data
- character(len=*) , intent(in) :: fharvest ! input harvest dataset file name
- logical, intent(in), optional :: constant ! Flag if variables are CONST_ version for surface dataset
- ! rather than landuse.timeseries
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'mkharvest_init'
- character(len=CL) :: lunits ! local units read in
- integer :: ncid,varid ! input netCDF id's
- integer :: ifld ! indices
- integer :: ret ! return code
- logical :: lconstant ! local version of constant flag
- logical :: varexists ! If variable exists on file
- integer :: dim_lengths(3) ! Dimension lengths on file
- integer :: dims2nd(numharv) ! Dimension lengths of 3rd dimension for each variable on file
- integer :: ndims ! Number of dimensions on file
- integer :: ns_i ! clm input grid resolution (nlat*nlon)
-!EOP
-!-----------------------------------------------------------------------
- lconstant = .false.
- if ( present(constant) ) lconstant = constant
-
- initialized = .true.
- call check_ret(nf_open(fharvest, 0, ncid), subname)
- dims2nd(:) = 0
- ns_i = 0
- do ifld = 1, numharv
- call check_ret(nf_inq_varid ( ncid, mkharvest_fieldname(ifld, constant=lconstant), varid), subname, varexists=varexists)
- if ( .not. varexists )then
- write(*,*) "SKIP: "//mkharvest_fieldname(ifld, constant=lconstant)
- harvest_longnames(ifld) = trim(mkharvest_fieldname(ifld, constant=lconstant)) // " (zeroed out)"
- harvest_units(ifld) = "not_read_in"
- else
- call check_ret(nf_get_att_text( ncid, varid, 'long_name', harvest_longnames(ifld)), subname )
- ret = nf_get_att_text( ncid, varid, 'units', harvest_units(ifld))
- if ( ret == nf_enotatt )then
- harvest_units(ifld) = "unitless"
- else if ( ret == nf_noerr )then
- else
- write(*,*) 'ERROR:: bad return code from NetCDF get attribute= '// nf_strerror(ret)
- call abort()
- end if
- call get_dim_lengths(ncid, mkharvest_fieldname(ifld, constant=lconstant), ndims, dim_lengths)
- if ( ns_i == 0 )then
- ns_i = dim_lengths(1)*dim_lengths(2)
- else if ( ns_i /= dim_lengths(1)*dim_lengths(2) )then
- write(*,*) 'ERROR:: bad dimension sizes for variable = ', mkharvest_fieldname(ifld, constant=lconstant)
- call abort()
- end if
- if ( ndims == 2 )then
- dims2nd(ifld) = 0
- else if ( ndims == 3 )then
- dims2nd(ifld) = dim_lengths(3)
- else
- write(*,*) 'ERROR:: bad dimensionality for variable = ', mkharvest_fieldname(ifld, constant=lconstant)
- call abort()
- end if
-
- end if
- end do
- call harvdata%init( dims2nd, ns_i, ns_o, init_val )
-
- call check_ret(nf_close(ncid), subname)
-
- allocate( oride_harv(numharv) )
- oride_harv(:) = real_undef
-
- end subroutine mkharvest_init
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest_fieldInBounds
-!
-! !INTERFACE:
- logical function mkharvest_fieldInBounds( nfield )
-!
-! !DESCRIPTION:
-! Return true if field index is in bounds and initialization done
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- integer, intent(in) :: nfield ! field index
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'mkharvest_fieldInBounds'
-!EOP
-!-----------------------------------------------------------------------
- if ( nfield < 1 )then
- write(6,*) subname, ' ERROR nfield < 1'
- mkharvest_fieldInBounds = .false.
- else if ( nfield > numharv )then
- write(6,*) subname, ' ERROR nfield > max fields'
- mkharvest_fieldInBounds = .false.
- else if ( .not. initialized ) then
- write(6,*) subname, ' ERROR mkharvest NOT initialized yet!'
- mkharvest_fieldInBounds = .false.
- else
- mkharvest_fieldInBounds = .true.
- end if
-
- end function mkharvest_fieldInBounds
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest_fieldname
-!
-! !INTERFACE:
- character(len=harlen) function mkharvest_fieldname( nfield, constant )
-!
-! !DESCRIPTION:
-! Return harvest fieldname of input field number.
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- integer, intent(in) :: nfield
- logical, intent(in), optional :: constant
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'mkharvest_fieldname'
- logical :: lconstant ! local version of constant flag
-!EOP
-!-----------------------------------------------------------------------
- lconstant = .false.
- if ( present(constant) ) lconstant = constant
-
- if ( mkharvest_fieldInBounds( nfield ) )then
- if ( .not. lconstant )then
- mkharvest_fieldname = harvest_fieldnames(nfield)
- else
- mkharvest_fieldname = harvest_const_fieldnames(nfield)
- end if
- else
- call abort()
- end if
-
- end function mkharvest_fieldname
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest_units
-!
-! !INTERFACE:
- character(len=CL) function mkharvest_units( nfield )
-!
-! !DESCRIPTION:
-! Return units description of harvest fields
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- integer, intent(in) :: nfield
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'mkharvest_units'
-!EOP
-!-----------------------------------------------------------------------
-
- if ( mkharvest_fieldInBounds( nfield ) )then
- mkharvest_units = harvest_units(nfield)
- else
- call abort()
- end if
-
- end function mkharvest_units
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest_longname
-!
-! !INTERFACE:
- character(len=CL) function mkharvest_longname( nfield )
-!
-! !DESCRIPTION:
-! Return longname description of given input field number.
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- integer, intent(in) :: nfield
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'mkharvest_longname'
-!EOP
-!-----------------------------------------------------------------------
-
- if ( mkharvest_fieldInBounds( nfield ) )then
- mkharvest_longname = harvest_longnames(nfield)
- else
- call abort()
- end if
-
- end function mkharvest_longname
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest_numtypes
-!
-! !INTERFACE:
- integer function mkharvest_numtypes( )
-!
-! !DESCRIPTION:
-! Return number of different harvest field types.
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- character(len=*), parameter :: subname = 'mkharvest_numtypes'
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
-!-----------------------------------------------------------------------
- mkharvest_numtypes = numharv
-
- end function mkharvest_numtypes
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: clean
-!
-! !INTERFACE:
- subroutine clean( this )
-!
-! !DESCRIPTION:
-! Clean and deallocate the harvestData object
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- class(harvestDataType), intent(INOUT) :: this ! harvestData object
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-! !LOCAL VARIABLES:
- character(len=*), parameter :: subname = 'harvestData::clean'
-!EOP
-!-----------------------------------------------------------------------
- this%CFTdimsize = -1
- this%PFTdimsize = -1
-
- if ( associated(this%data1D) ) deallocate( this%data1D )
- if ( associated(this%Outdata1D) ) deallocate( this%OutData1D )
-
- if ( associated(this%data2DCFT) ) deallocate( this%data2DCFT )
- if ( associated(this%OutData2DCFT)) deallocate( this%OutData2DCFT )
- if ( associated(this%data2DPFT ) ) deallocate( this%data2DPFT )
- if ( associated(this%OutData2DPFT)) deallocate( this%OutData2DPFT )
- this%data2DCFT => null()
- this%OutData2DCFT => null()
- this%data2DPFT => null()
- this%OutData2DPFT => null()
-
- end subroutine clean
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest
-!
-! !INTERFACE:
-subroutine mkharvest(ldomain, mapfname, datfname, ndiag, harvdata)
-!
-! !DESCRIPTION:
-! Make harvest data for the dynamic PFT dataset.
-! This dataset consists of the normalized harvest or grazing fraction (0-1) of
-! the model.
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) , intent(in) :: ldomain !
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- type(harvestDataType), intent(INOUT) :: harvdata ! Harvest data
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8) :: gharv_o(numharv) ! output grid: global area harvesting
- real(r8) :: garea_o ! output grid: global area
- real(r8) :: gharv_i(numharv) ! input grid: global area harvesting
- real(r8) :: garea_i ! input grid: global area
- integer :: ifld ! indices
- integer :: k,n,m,ni,no,ns_i,ns_o ! indices
- integer :: ncid,varid ! input netCDF id's
- logical :: varexists ! If variable exists or not
- integer :: ier ! error status
- integer, allocatable :: ind1D(:) ! Index of 1D harvest fields
- integer, allocatable :: ind2D(:) ! Index of 2D harvest fields
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), pointer :: data1D_i(:) ! 1D input data
-  real(r8), pointer :: data2D_i(:,:)   ! 2D input data
- real(r8), pointer :: data1D_o(:) ! 1D output data
- real(r8), pointer :: data2D_o(:,:) ! 2D output data
-
- character(len=*), parameter :: unit = '10**6 km**2' ! Output units
- real(r8), parameter :: fac = 1.e-06_r8 ! Output factor
- real(r8), parameter :: rat = fac/100._r8 ! Output factor divided by 100%
- character(len=*), parameter :: subname = 'mkharvest'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make harvest fields .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Normally read in the harvesting file, and then regrid to output grid
- ! -----------------------------------------------------------------
- call harvdata%getFieldsIdx( ind1D, ind2D )
-
- if ( all(oride_harv == real_undef ) )then
-
- ! -----------------------------------------------------------------
- ! Read input harvesting file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read HARVEST_VH1, HARVEST_VH2, ... GRAZING etc.
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
- ns_o = ldomain%ns
- allocate(frac_dst(ns_o), stat=ier)
- if (ier /= 0) call abort()
-
- write (6,*) 'Open harvest file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- do k = 1, harvdata%num1Dfields()
- ifld = ind1D(k)
- call check_ret( nf_inq_varid(ncid, mkharvest_fieldname(ifld), varid), subname, varexists=varexists )
- data1D_i => harvdata%get1DFieldPtr( ifld )
- if ( .not. varexists )then
- write(*,*) "SKIP: "//mkharvest_fieldname(ifld)
- data1D_i(:) = 0.0_r8
- else
- call check_ret(nf_get_var_double (ncid, varid, data1D_i), subname)
- end if
- end do
- do k = 1, harvdata%num2Dfields()
- ifld = ind2D(k)
- call check_ret( nf_inq_varid(ncid, mkharvest_fieldname(ifld), varid), subname, varexists=varexists )
- data2D_i => harvdata%get2DFieldPtr( ifld )
- if ( .not. varexists )then
- write(*,*) "SKIP: "//mkharvest_fieldname(ifld)
- data2D_i(:,:) = 0.0_r8
- else
- call check_ret(nf_get_var_double (ncid, varid, data2D_i), subname)
- end if
- end do
- call check_ret(nf_close(ncid), subname)
-
- ! Area-average normalized harvest on input grid [data*_i] to output grid [data*_o]
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Determine data* on output grid
-
- do k = 1, harvdata%num1Dfields()
- ifld = ind1D(k)
- data1D_i => harvdata%get1DFieldPtr( ifld )
- data1D_o => harvdata%get1DFieldPtr( ifld, output=.true. )
- call gridmap_areaave_srcmask(tgridmap, data1D_i, data1D_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- end do
- do k = 1, harvdata%num2Dfields()
- ifld = ind2D(k)
- data2D_i => harvdata%get2DFieldPtr( ifld )
- data2D_o => harvdata%get2DFieldPtr( ifld, output=.true. )
- do m = lbound(data2D_i(:,:),dim=2), ubound(data2D_i(:,:),dim=2)
- call gridmap_areaave_srcmask(tgridmap, data2D_i(:,m), data2D_o(:,m), nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- end do
- end do
-
- ! -----------------------------------------------------------------
- ! Error check
- ! Compare global areas on input and output grids
- ! -----------------------------------------------------------------
-
- gharv_i(:) = 0.
- garea_i = 0.
- do ni = 1, ns_i
- garea_i = garea_i + tgridmap%area_src(ni)*re**2
- do k = 1, harvdata%num1Dfields()
- m = ind1D(k)
- data1D_i => harvdata%get1DFieldPtr( m )
- gharv_i(m) = gharv_i(m) + data1D_i(ni)*tgridmap%area_src(ni)* &
- tdomain%mask(ni)*re**2
- end do
- end do
-
- gharv_o(:) = 0.
- garea_o = 0.
- do no = 1,ns_o
- garea_o = garea_o + tgridmap%area_dst(no)*re**2
- do k = 1, harvdata%num1Dfields()
- m = ind1D(k)
- data1D_o => harvdata%get1DFieldPtr( m, output=.true. )
- gharv_o(m) = gharv_o(m) + data1D_o(no)*tgridmap%area_dst(no)* &
- frac_dst(no)*re**2
- end do
- end do
-
- ! Write out to diagnostic output file
- !
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'Harvesting Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,1001) unit, unit
-1001 format (1x,'harvest type ',20x,' input grid area',' output grid area',/ &
- 1x,33x,' ',A,' ',A)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- do k = 1, harvdata%num1Dfields()
- m = ind1D(k)
- write (ndiag,1002) mkharvest_fieldname(m), gharv_i(m)*rat,gharv_o(m)*rat
- end do
-1002 format (1x,a35,f16.3,f17.3)
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
-
- else
-
- ! -----------------------------------------------------------------
- ! Otherwise override the harvesting with the input harvest values
- ! -----------------------------------------------------------------
-
- if ( any(oride_harv == real_undef ) )then
- write(6,*) subname, ' error some override harvesting fields set ', &
- 'and others are not = ', oride_harv
- call abort()
- end if
- do k = 1, harvdata%num1Dfields()
- m = ind1D(k)
- if ( oride_harv(m) < 0.0_r8 .or. oride_harv(m) > 100.0_r8 )then
- write(6,*) subname, ' error override harvesting field out of range', &
- oride_harv(m), ' field = ', mkharvest_fieldname(m)
- call abort()
- end if
- end do
- do no = 1,ns_o
- do k = 1, harvdata%num1Dfields()
- m = ind1D(k)
- data1D_o => harvdata%get1DFieldPtr( m, output=.true. )
- data1D_o(no) = oride_harv(m)
- end do
- end do
-
- end if
-
- deallocate( ind1D, ind2D )
- write (6,*) 'Successfully made harvest and grazing'
- write (6,*)
-
-end subroutine mkharvest
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkharvest_parse_oride
-!
-! !INTERFACE:
-subroutine mkharvest_parse_oride( string )
-!
-! !DESCRIPTION:
-! Parse the string with harvest and grazing information on it, to override
-! the file with this information rather than reading from a file.
-!
-! !USES:
- use shr_string_mod, only: shr_string_betweenTags
-! !ARGUMENTS:
- character(len=256), intent(IN) :: string ! String to parse with harvest and grazing data
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: rc ! error return code
- character(len=256) :: substring ! substring between tags
- character(len=*), parameter :: harv_start = "<harv>"
- character(len=*), parameter :: harv_end = "</harv>"
- character(len=*), parameter :: graz_start = "<graz>"
- character(len=*), parameter :: graz_end = "</graz>"
- character(len=*), parameter :: subname = 'mkharvest_parse_oride'
-!-----------------------------------------------------------------------
- call shr_string_betweenTags( string, harv_start, harv_end, substring, rc )
- if ( rc /= 0 )then
- write(6,*) subname//' Trouble finding harvest start/end tags'
- call abort()
- end if
- read(substring,*) oride_harv(1:numharv-1)
- call shr_string_betweenTags( string, graz_start, graz_end, substring, rc )
- if ( rc /= 0 )then
- write(6,*) subname//' Trouble finding grazing start/end tags'
- call abort()
- end if
- read(substring,*) oride_harv(numharv)
- if ( harvest_fieldnames(numharv) /= 'GRAZING' )then
- write(6,*) subname, ' grazing is NOT last field as was expected'
- call abort()
- end if
-
-!-----------------------------------------------------------------------
-
-end subroutine mkharvest_parse_oride
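The parse routine above extracts the text between paired start/end tags (via `shr_string_betweenTags`), reading the harvest override values first and the single GRAZING value last. A minimal Python sketch of that flow — the `between_tags` helper and the tag names are illustrative, not the CESM share-code implementation:

```python
def between_tags(string, start_tag, end_tag):
    """Return the substring between start_tag and end_tag, or None if absent.

    Illustrative analogue of shr_string_betweenTags (not the real API).
    """
    i = string.find(start_tag)
    if i < 0:
        return None
    j = string.find(end_tag, i + len(start_tag))
    if j < 0:
        return None
    return string[i + len(start_tag):j]

# Harvest fields come first, then the lone GRAZING value, mirroring the
# ordering check against harvest_fieldnames(numharv) in the Fortran.
line = "<harv>1.0 2.0 3.0 4.0</harv><graz>5.0</graz>"
harv = [float(v) for v in between_tags(line, "<harv>", "</harv>").split()]
graz = float(between_tags(line, "<graz>", "</graz>"))
```

As in the Fortran, a missing tag pair is treated as a hard error (here, `between_tags` returns `None` instead of aborting).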
-
-!-----------------------------------------------------------------------
-
-end module mkharvestMod
diff --git a/tools/mksurfdata_map/src/mkindexmapMod.F90 b/tools/mksurfdata_map/src/mkindexmapMod.F90
deleted file mode 100644
index 5f8e74af2b..0000000000
--- a/tools/mksurfdata_map/src/mkindexmapMod.F90
+++ /dev/null
@@ -1,697 +0,0 @@
-module mkindexmapMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkindexmapMod
-!
-! !DESCRIPTION:
-! Module containing subroutines for making maps of index data.
-!
-! This includes a routine for making a map using the dominant type among the input grid
-! cells making up a given output cell, as well as routines for using an index map as
-! indices into a lookup table, to essentially paint-by-number some other field, and some
-! other related routines
-!
-! WJS (2-1-12): There is a lookup_2d subroutine, but not a lookup_1d (or any other
-! dimensionality). That is simply because I needed lookup_2d, but have not yet needed a
-! routine of other dimensionalities. In the future, it would probably be helpful to at
-! least have lookup_1d and lookup_1d_netcdf. If this is done, see my notes under the
-! lookup_2d_netcdf routine for some thoughts on avoiding duplication.
-!
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkncdio, only : nf_max_name
- use mkgridmapMod, only : gridmap_type
-
- implicit none
- private
-
-! !PUBLIC TYPES:
-!
- ! dim_slice_type: stores information about dimensions that we use for slicing a multi-
- ! dimensional variable
- type dim_slice_type
- character(len=nf_max_name) :: name ! name of this dimension
- integer :: val ! index to use for the slice
- end type dim_slice_type
- public :: dim_slice_type
-!
-! !PUBLIC MEMBER FUNCTIONS:
- public :: get_dominant_indices ! make output map based on dominant type in each grid cell
- public :: get_max_indices ! make output map based on maximum type in each grid cell
- public :: lookup_2d ! create map based on a 2-d lookup table
- public :: lookup_2d_netcdf ! wrapper to lookup_2d; first read table from netcdf file
- public :: which_max ! get index of the maximum value in an array
-!
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!EOP
-!------------------------------------------------------------------------------
-contains
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: get_dominant_indices
-!
-! !INTERFACE:
-subroutine get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata, filter, mask_src)
-!
-! !DESCRIPTION:
-! Fills an output array on the destination grid (dst_array) whose values are equal to the
-! (weighted) dominant value in the source grid cells overlapping a given destination grid
-! cell
-!
-! Ignores all values in src_array that are less than minval or greater than maxval (treats
-! those values the same as if they had wt=0). (Note: for memory-use efficiency, it is
-! best if the indices are designed such that most values between minval and maxval are
-! actually used, since an array is allocated of size (maxval - minval + 1)*gridmap%nb.)
-!
-! The filter argument can be used to exclude certain overlaps -- if provided, we only
-! consider overlaps where filter is .true. If not provided, filter is treated as being
-! .true. everywhere.
-!
-! Output grid cells with no contributing valid source points are given the nodata value
-!
-! !ARGUMENTS:
- implicit none
- type(gridmap_type), intent(in) :: gridmap ! provides mapping from src -> dst
- integer , intent(in) :: src_array(:) ! input values; length gridmap%na
- integer , intent(out):: dst_array(:) ! output values; length gridmap%nb
- integer , intent(in) :: minval ! minimum valid value in src_array
- integer , intent(in) :: maxval ! maximum valid value in src_array
- integer , intent(in) :: nodata ! value to assign to dst_array where there are no valid source points
- integer , intent(in) :: mask_src(:) ! mask at the source resolution
-
- logical, intent(in), optional :: filter(:) ! only consider overlaps where filter is .true.; length gridmap%ns
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- logical, allocatable :: lfilter(:) ! local version of filter
- logical, allocatable :: hasdata(:) ! true if an output cell has any valid data;
- real(r8), allocatable :: weights(:,:) ! summed weight of each index value for each output cell
-
- integer :: n, ni, no
- integer :: k
- integer :: maxindex
- real(r8) :: wt
- real(r8) :: maxwt
-
- character(len=*), parameter :: subname = "get_dominant_indices"
-!-----------------------------------------------------------------------
-
- ! Error-check inputs and initialize local variables
-
- if (size(src_array) /= gridmap%na .or. &
- size(dst_array) /= gridmap%nb) then
- write(6,*) subname//' ERROR: incorrect sizes of src_array or dst_array'
- write(6,*) 'size(src_array) = ', size(src_array)
- write(6,*) 'gridmap%na = ', gridmap%na
- write(6,*) 'size(dst_array) = ', size(dst_array)
- write(6,*) 'gridmap%nb = ', gridmap%nb
- call abort()
- end if
- if (size(mask_src) /= size(src_array)) then
- write(6,*) subname//' ERROR: incorrect size of mask_src'
- write(6,*) 'size(mask_src) = ', size(mask_src)
- write(6,*) 'size(src_array) = ', size(src_array)
- call abort()
- end if
-
- allocate(lfilter(gridmap%ns))
-
- if (present(filter)) then
- if (size(filter) /= gridmap%ns) then
- write(6,*) subname//' ERROR: incorrect size of filter'
- write(6,*) 'size(filter) = ', size(filter)
- write(6,*) 'gridmap%ns = ', gridmap%ns
- call abort()
- end if
-
- lfilter(:) = filter(:)
- else
- lfilter(:) = .true.
- end if
-
- allocate(hasdata(gridmap%nb))
- hasdata(:) = .false.
- allocate(weights(minval:maxval, gridmap%nb))
- weights(minval:maxval,:) = 0.
-
- ! Determine weight of each index value for each output (destination) cell
-
- do n = 1, gridmap%ns
- if (lfilter(n)) then
- ni = gridmap%src_indx(n)
- no = gridmap%dst_indx(n)
- wt = gridmap%wovr(n) * mask_src(ni)
- k = src_array(ni)
- if (k >= minval .and. k <= maxval) then
- ! Note: if we were doing something like weighted sums, I think we would
- ! want to divide wt by gridmap%frac_dst(no), as is done in
- ! gridmap_areaave_default. But since all we care about is the relative
- ! values of weights for a given destination cell, this is unnecessary
- weights(k,no) = weights(k,no) + wt
- hasdata(no) = .true.
- end if
- end if
- end do
-
- ! Determine output values
- ! Note: if a given destination cell has no contributing source points (thus
- ! hasdata(no) = false), or the max weight of any index overlapping this destination
- ! cell is <= 0, then the output value there will be nodata.
- ! (I don't think this latter condition -- weight <= 0 -- is possible, but we handle
- ! it anyway)
-
- dst_array(:) = nodata
- do no = 1, gridmap%nb
- if (hasdata(no)) then
- call which_max(weights(:,no), maxwt, maxindex, lbound=minval)
- if (maxwt > 0.) then
- dst_array(no) = maxindex
- end if
- end if
- end do
-
- deallocate(lfilter, weights, hasdata)
-
-end subroutine get_dominant_indices
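The weight-accumulation loop above can be sketched compactly in Python. The `(src_cell, dst_cell, weight)` triples stand in for `gridmap%src_indx`/`dst_indx`/`wovr`, a dict replaces the `(minval:maxval, nb)` weights array, and ties go to the smallest index value, as in `which_max`; all names here are illustrative:

```python
def get_dominant_indices(src, overlaps, nb, minval, maxval, nodata, mask_src):
    """For each destination cell, pick the source value with the largest
    summed overlap weight; cells with no valid contributors get nodata."""
    weights = [dict() for _ in range(nb)]
    for ni, no, wt in overlaps:
        k = src[ni]
        if minval <= k <= maxval and mask_src[ni]:
            weights[no][k] = weights[no].get(k, 0.0) + wt
    dst = [nodata] * nb
    for no, w in enumerate(weights):
        if w:
            # largest weight wins; smallest value wins ties (first max)
            best = min(w, key=lambda k: (-w[k], k))
            if w[best] > 0.0:
                dst[no] = best
    return dst

# Value 2 covers 60% of destination cell 0, so it dominates; cell 1 has
# no contributing overlaps and receives nodata.
src = [1, 2, 2]
overlaps = [(0, 0, 0.2), (1, 0, 0.3), (2, 0, 0.3)]
dst = get_dominant_indices(src, overlaps, nb=2, minval=1, maxval=3,
                           nodata=-1, mask_src=[1, 1, 1])
```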
-!------------------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-subroutine get_max_indices(gridmap, src_array, dst_array, nodata, mask_src)
- !
- ! !DESCRIPTION:
- ! Fills an output array on the destination grid (dst_array) whose values are equal to
- ! the maximum value in the source grid cells overlapping a given destination grid cell.
- !
- ! The frequency of occurrence of the source values is irrelevant. For example, if the
- ! value 1 appears in 99% of source cells overlapping a given destination cell and the
- ! value 2 appears in just 1%, we'll put 2 in the destination cell because it is the
- ! maximum value.
- !
- ! Output grid cells with no contributing valid source points are given the nodata value
- !
- ! !ARGUMENTS:
- type(gridmap_type) , intent(in) :: gridmap ! provides mapping from src -> dst
- integer , intent(in) :: src_array(:) ! input values; length gridmap%na
- integer , intent(out) :: dst_array(:) ! output values; length gridmap%nb
- integer , intent(in) :: nodata ! value to assign to dst_array where there are no valid source points
- integer , intent(in) :: mask_src(:) ! mask at the source resolution
- !
- ! !LOCAL VARIABLES:
- logical, allocatable :: hasdata(:) ! true if an output cell has any valid data;
- integer :: n, ni, no
- real(r8) :: wt
- integer :: src_val
-
- character(len=*), parameter :: subname = 'get_max_indices'
- !-----------------------------------------------------------------------
-
- ! Error-check inputs
-
- if (size(src_array) /= gridmap%na .or. &
- size(dst_array) /= gridmap%nb) then
- write(6,*) subname//' ERROR: incorrect sizes of src_array or dst_array'
- write(6,*) 'size(src_array) = ', size(src_array)
- write(6,*) 'gridmap%na = ', gridmap%na
- write(6,*) 'size(dst_array) = ', size(dst_array)
- write(6,*) 'gridmap%nb = ', gridmap%nb
- call abort()
- end if
- if (size(mask_src) /= size(src_array)) then
- write(6,*) subname//' ERROR: incorrect size of mask_src'
- write(6,*) 'size(mask_src) = ', size(mask_src)
- write(6,*) 'size(src_array) = ', size(src_array)
- call abort()
- end if
-
- ! Initialize local variables
- allocate(hasdata(gridmap%nb))
- hasdata(:) = .false.
-
- do n = 1, gridmap%ns
- ni = gridmap%src_indx(n)
- wt = gridmap%wovr(n) * mask_src(ni)
- if (wt > 0._r8) then
- no = gridmap%dst_indx(n)
- src_val = src_array(ni)
- if (.not. hasdata(no)) then
- hasdata(no) = .true.
- dst_array(no) = src_val
- else if (src_val > dst_array(no)) then
- dst_array(no) = src_val
- end if
- end if
- end do
-
- do no = 1, gridmap%nb
- if (.not. hasdata(no)) then
- dst_array(no) = nodata
- end if
- end do
-
-end subroutine get_max_indices
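For contrast with the dominant-type logic, the maximum-value variant ignores how often a value occurs and keeps only the largest valid source value per destination cell. A sketch under the same illustrative conventions (overlap triples in place of the gridmap arrays):

```python
def get_max_indices(src, overlaps, nb, nodata, mask_src):
    """Destination value = max source value among overlaps with positive
    weight; frequency of occurrence is irrelevant."""
    dst = [None] * nb
    for ni, no, wt in overlaps:
        if wt * mask_src[ni] > 0.0:
            v = src[ni]
            if dst[no] is None or v > dst[no]:
                dst[no] = v
    return [nodata if v is None else v for v in dst]

# Cell 0 sees values 1 and 9 and keeps 9; cell 1's only overlap has zero
# weight, so it gets nodata.
dst = get_max_indices([1, 9, 2], [(0, 0, 0.5), (1, 0, 0.5), (2, 1, 0.0)],
                      nb=2, nodata=-1, mask_src=[1, 1, 1])
```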
-
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: lookup_2d
-!
-! !INTERFACE:
-subroutine lookup_2d(index1, index2, lookup_table, fill_val, data, ierr, &
- nodata, valid_entries, invalid_okay)
-!
-! !DESCRIPTION:
-! Creates a data array using a paint-by-number approach according to a lookup table
-!
-! This routine operates on a 2-d lookup table. There are therefore two index arrays
-! (index1 and index2); these index arrays are on the same grid as the desired data array
-! (thus, index1, index2 and data must all have the same length). Each output point, n, is
-! then generally determined as:
-!
-! data(n) = lookup_table(index1(n), index2(n))
-!
-! fill_val: value to put in data array where either:
-! (a) index1 or index2 are equal to nodata (if nodata is given)
-! Note that this condition does NOT result in ierr being set
-! (b) valid_entries(index1(n), index2(n)) is false (if valid_entries is given)
-! Note that this condition also results in ierr being set, unless invalid_okay is
-! present and .true.
-! (If valid_entries is not given, it is treated as being .true. everywhere)
-! (c) index1 or index2 out of range
-! Note that this condition also results in ierr being set
-!
-! ierr: error return code (if non-0, indicates first error encountered):
-! 0: no error
-! 1: attempt to assign values from the lookup table that are invalid according
-! to valid_entries (note: this is not considered an error if invalid_okay is
-! present and .true.)
-! 2: attempt to access an out-of-range index in lookup table
-! WJS (2-2-12): My main reason for using ierr rather than aborting in case of error
-! is to facilitate unit testing
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: index1(:) ! index into dim 1 of lookup_table
- integer , intent(in) :: index2(:) ! index into dim 2 of lookup_table
- real(r8), intent(in) :: lookup_table(:,:)
- real(r8), intent(in) :: fill_val ! value to put in data where we don't have a valid value (see above for details)
- real(r8), intent(out):: data(:) ! output array
- integer , intent(out):: ierr ! error return code (0 = no error)
-
- ! nodata flag in index1 and index2 (see above for details):
- integer, intent(in), optional :: nodata
-
- ! which entries are considered valid (see above for details):
- logical, intent(in), optional :: valid_entries(:,:)
-
- ! invalid_okay: if true, then assigning fill_val because valid_entries is false does
- ! NOT raise an error flag (invalid_okay defaults to false, meaning an error is
- ! raised in this case):
- logical, intent(in), optional :: invalid_okay
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: n
- integer :: i1, i2
- integer :: data_size ! size of index1, index2 and data arrays
- integer :: table_n1 ! size of dimension 1 of lookup table
- integer :: table_n2 ! size of dimension 2 of lookup table
- logical :: linvalid_okay ! local version of invalid_okay
- logical, allocatable :: lvalid_entries(:,:) ! local version of valid_entries
-
- character(len=*), parameter :: subname = 'lookup_2d'
-!-----------------------------------------------------------------------
-
- ierr = 0
-
- ! Error-check array sizes
-
- data_size = size(data)
- if (size(index1) /= data_size .or. size(index2) /= data_size) then
- write(6,*) subname//' ERROR: data array sizes do not match'
- write(6,*) 'size(data) = ', data_size
- write(6,*) 'size(index1) = ', size(index1)
- write(6,*) 'size(index2) = ', size(index2)
- call abort()
- end if
-
- table_n1 = size(lookup_table,1)
- table_n2 = size(lookup_table,2)
- if (present(valid_entries)) then
- if (size(valid_entries,1) /= table_n1 .or. size(valid_entries,2) /= table_n2) then
- write(6,*) subname//' ERROR: size of valid_entries does not match lookup_table'
- write(6,*) 'size(lookup_table) = ', table_n1, table_n2
- write(6,*) 'size(valid_entries) = ', size(valid_entries,1), &
- size(valid_entries,2)
- call abort()
- end if
- end if
-
- ! Set local version of invalid_okay & valid_entries
-
- if (present(invalid_okay)) then
- linvalid_okay = invalid_okay
- else
- linvalid_okay = .false.
- end if
-
- allocate(lvalid_entries(table_n1, table_n2))
- if (present(valid_entries)) then
- lvalid_entries(:,:) = valid_entries(:,:)
- else
- lvalid_entries(:,:) = .true.
- end if
-
- ! Do the lookups
-
- do n = 1, data_size
- i1 = index1(n)
- i2 = index2(n)
-
- ! First handle special cases:
-
- ! index is nodata flag (this is NOT an error)
- if (present(nodata)) then
- if (i1 == nodata .or. i2 == nodata) then
- data(n) = fill_val
- cycle
- end if
- end if
-
- ! index out of range
- if (i1 <= 0 .or. i1 > table_n1 .or. &
- i2 <= 0 .or. i2 > table_n2) then
- data(n) = fill_val
- if (ierr == 0) ierr = 2
- cycle
- end if
-
- ! lookup table entry is invalid
- if (.not. lvalid_entries(i1, i2)) then
- data(n) = fill_val
- if (.not. linvalid_okay) then
- if (ierr == 0) ierr = 1
- end if
- cycle
- end if
-
- ! Finally, the "normal" case, if none of the special cases were triggered:
- data(n) = lookup_table(i1, i2)
- end do
-
- deallocate(lvalid_entries)
-
-end subroutine lookup_2d
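The paint-by-number rule and its three fill cases (nodata index, out-of-range index, invalid table entry) can be condensed into a short Python sketch. Indices are 0-based here rather than Fortran's 1-based, and `ierr` keeps the first error encountered, as documented above; the function is a sketch, not the production routine:

```python
def lookup_2d(index1, index2, table, fill_val, nodata=None, valid=None,
              invalid_okay=False):
    """data[n] = table[i1][i2], with fill_val for the special cases.

    ierr: 0 = ok, 1 = invalid table entry used (unless invalid_okay),
    2 = index out of range; first error encountered wins.
    """
    n1, n2 = len(table), len(table[0])
    data, ierr = [], 0
    for i1, i2 in zip(index1, index2):
        if nodata is not None and (i1 == nodata or i2 == nodata):
            data.append(fill_val)          # nodata index: not an error
        elif not (0 <= i1 < n1 and 0 <= i2 < n2):
            data.append(fill_val)
            ierr = ierr or 2               # out-of-range index
        elif valid is not None and not valid[i1][i2]:
            data.append(fill_val)
            if not invalid_okay:
                ierr = ierr or 1           # invalid table entry
        else:
            data.append(table[i1][i2])     # the normal case
    return data, ierr

table = [[10.0, 20.0], [30.0, 40.0]]
data, ierr = lookup_2d([0, 1, 5], [1, 0, 0], table, fill_val=-999.0)
```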
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: lookup_2d_netcdf
-!
-! !INTERFACE:
-subroutine lookup_2d_netcdf(ncid, tablename, lookup_has_invalid, &
- dimname1, dimname2, n_extra_dims, &
- index1, index2, fill_val, data, ierr, &
- extra_dims, nodata, invalid_okay)
-!
-! !DESCRIPTION:
-! Wrapper to lookup_2d that first reads the lookup table from a netcdf file
-!
-! If lookup_has_invalid is false, then we treat all lookup table entries as valid data
-! (i.e., all valid_entries are true in the call to lookup_2d). If lookup_has_invalid is
-! true, then we read the _FillValue attribute for the lookup table variable, and consider
-! any table entry with value _FillValue to be an invalid entry, thus putting fill_val in
-! these data locations (and raising an error flag unless invalid_okay is present and
-! true).
-!
-! The dimension given by dimname1 -- with the associated indices given by index1 -- is the
-! fastest-varying dimension in the lookup table. Dimension dimname2 (associated with
-! index2) is the second-fastest-varying dimension. Similarly, extra_dims should be ordered
-! from faster-varying to slowest-varying dimension. (The first dimension in extra_dims is
-! the third-fastest-varying dimension in the lookup table.)
-!
-! n_extra_dims gives the number of extra dimensions (in addition to the first two) in the
-! lookup table. We take a single 2-d slice of the lookup table, by using a single value of
-! each of these other dimensions. If n_extra_dims > 0, then extra_dims must be present,
-! with at least n_extra_dims entries. Each entry in extra_dims gives the name of a
-! dimension and the dimension index to use for the slice.
-!
-! If size(extra_dims) > n_extra_dims, then we use the first n_extra_dims entries in
-! extra_dims. If n_extra_dims = 0, then extra_dims is ignored.
-!
-! Note that we ignore any coordinate variables associated with the dimensions of the
-! lookup table; we simply treat the lookup table indices as 1,2,3,...
-!
-! See the lookup_2d documentation for documentation of some other arguments
-!
-! WJS (2-1-12): Some thoughts on avoiding duplication if we eventually want similar
-! routines, lookup_1d_netcdf, lookup_3d_netcdf, etc.:
-!
-! Much of the code in lookup_2d_netcdf could then be pulled out to a shared subroutine
-! (e.g., much of the error-checking code).
-!
-! Or, maybe better: we could try to make a single lookup_netcdf subroutine that handles
-! 1-d, 2-d and any other dimensionality. To do that, we would (1) make a generic interface
-! (of which lookup_1d and lookup_2d would be implementations); (2) change the repeated
-! arguments in lookup_2d_netcdf (*1 and *2) to arrays -- maybe using an array of a derived
-! type containing these arguments; (3) if possible, initially read the lookup table into a
-! 1-d array (if the netcdf call allows reading a n-d array into a 1-d array) (if netcdf
-! doesn't allow this, then I think we could achieve the same thing by reading 1-d slices
-! of the lookup table in a loop, building the full lookup table as a long 1-d array); (4)
-! in the call to the generic 'lookup' function, reshape the 1-d lookup table
-! appropriately. (Note: I think it would be challenging to combine lookup_1d and lookup_2d
-! (etc.) into a single routine using a similar method.)
-!
-! !USES:
- use mkncdio
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: ncid ! ID of an open netcdf file
- character(len=*), intent(in) :: tablename ! name of the lookup table variable
- logical , intent(in) :: lookup_has_invalid ! should we use _FillValue? (see above)
- character(len=*), intent(in) :: dimname1 ! name of the first (fastest-varying) dimension of the lookup table
- character(len=*), intent(in) :: dimname2 ! name of the second dimension of the lookup table
- integer , intent(in) :: n_extra_dims ! number of extra dimensions in the lookup table
- ! The following arguments are passed directly to lookup_2d:
- integer , intent(in) :: index1(:) ! index into dim 1 of lookup table
- integer , intent(in) :: index2(:) ! index into dim 2 of lookup table
- real(r8) , intent(in) :: fill_val ! value to put in data where we don't have a valid value
- real(r8) , intent(out):: data(:) ! output array
- integer , intent(out):: ierr ! error return code from the call to lookup_2d
-
- ! slice to use if lookup table variable has more than 2 dimensions:
- type(dim_slice_type), intent(in), optional :: extra_dims(:)
-
- ! nodata flag in index1 and index2, passed directly to lookup_2d:
- integer , intent(in), optional :: nodata
-
- ! flag for whether trying to use a lookup table value that is equal to the _FillValue
- ! should raise an error flag
- ! (irrelevant if lookup_has_invalid is .false.)
- ! (passed directly to lookup_2d - see the documentation there for more details)
- logical , intent(in), optional :: invalid_okay
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: varid ! netcdf variable id of the lookup table
- integer :: ndims ! total number of dimensions of lookup table
- integer :: ndims_expected ! value we expect for ndims, for error checking
- integer :: i
- real(r8) :: table_fillval ! value of the _FillValue attribute for the lookup table
- character(len=nf_max_name), allocatable :: dimnames(:) ! dimension names
- integer , allocatable :: dimids(:) ! dimension ids
- integer , allocatable :: dimlens(:) ! dimension lengths
- integer , allocatable :: starts(:) ! starting indices for reading lookup table
- integer , allocatable :: counts(:) ! dimension counts for reading lookup table
- real(r8), allocatable :: lookup_table(:,:)
- logical , allocatable :: valid_entries(:,:) ! which entries of the lookup table are considered valid
-
- character(len=*), parameter :: subname = 'lookup_2d_netcdf'
-!-----------------------------------------------------------------------
-
- ! Error-check extra_dims
-
- if (n_extra_dims > 0) then
- if (.not. present(extra_dims)) then
- write(6,*) subname//' ERROR: extra_dims must be present for n_extra_dims > 0'
- call abort()
- end if
-
- if (size(extra_dims) < n_extra_dims) then
- write(6,*) subname//' ERROR: not enough extra dimensions given'
- write(6,*) 'n_extra_dims = ', n_extra_dims
- write(6,*) 'size(extra_dims) = ', size(extra_dims)
- call abort()
- end if
- end if
-
- ! Determine number of expected dimensions in the table, and actual number of
- ! dimensions in the netcdf file
-
- ndims_expected = 2 + n_extra_dims
-
- call check_ret(nf_inq_varid (ncid, tablename, varid), subname)
- call check_ret(nf_inq_varndims (ncid, varid, ndims), subname)
- if (ndims /= ndims_expected) then
- write(6,*) subname//' ERROR: unexpected number of dimensions in ', &
- trim(tablename)
- write(6,*) 'ndims = ', ndims
- write(6,*) 'expected (based on n_extra_dims): ', ndims_expected
- call abort()
- end if
-
- ! Get dimension names & sizes, and error-check them
-
- allocate(dimids(ndims), dimlens(ndims), dimnames(ndims))
- call check_ret(nf_inq_vardimid (ncid, varid, dimids), subname)
- do i = 1, ndims
- call check_ret(nf_inq_dimname (ncid, dimids(i), dimnames(i)), subname)
- call check_ret(nf_inq_dimlen (ncid, dimids(i), dimlens(i)), subname)
- end do
-
- call check_dimname(dimnames(1), dimname1, 1)
- call check_dimname(dimnames(2), dimname2, 2)
- do i = 1, n_extra_dims
- call check_dimname(dimnames(2+i), extra_dims(i)%name, 2+i)
- call check_dimsize(dimlens(2+i), extra_dims(i)%val, 2+i)
- end do
-
- ! Read the lookup table; if the given variable has more than 2 dimensions, we read
- ! a single 2-d slice
-
- allocate(starts(ndims), counts(ndims))
- allocate(lookup_table(dimlens(1), dimlens(2)))
- starts(1:2) = 1
- counts(1:2) = dimlens(1:2)
- do i = 1, n_extra_dims
- starts(2+i) = extra_dims(i)%val
- counts(2+i) = 1
- end do
- call check_ret(nf_get_vara_double (ncid, varid, starts, counts, lookup_table), subname)
-
- ! Determine which entries are valid
-
- allocate(valid_entries(size(lookup_table, 1), size(lookup_table, 2)))
- valid_entries(:,:) = .true.
- if (lookup_has_invalid) then
- call check_ret(nf_get_att_double (ncid, varid, '_FillValue', table_fillval), subname)
- where (lookup_table == table_fillval)
- valid_entries = .false.
- end where
- end if
-
- ! Do the lookups
-
- call lookup_2d(index1, index2, lookup_table, fill_val, data, ierr, nodata=nodata, &
- valid_entries=valid_entries, invalid_okay=invalid_okay)
-
- deallocate(valid_entries)
- deallocate(lookup_table)
- deallocate(starts, counts)
- deallocate(dimids, dimlens, dimnames)
-
-contains
-!------------------------------------------------------------------------------
- subroutine check_dimname(actual, expected, i)
- ! Make sure names are equal; if not, stop with an error message
-
- character(len=*), intent(in) :: actual, expected
- integer , intent(in) :: i ! dimension number, for output purposes
-
- if (actual /= expected) then
- write(6,*) subname//' ERROR: unexpected dimension name in ', trim(tablename)
- write(6,*) 'dimension #', i
- write(6,*) 'actual: ', trim(actual)
- write(6,*) 'expected: ', trim(expected)
- call abort()
- end if
- end subroutine check_dimname
-
-!------------------------------------------------------------------------------
- subroutine check_dimsize(length, index, i)
- ! Make sure dimension length is long enough; if not, stop with an error message
-
- integer, intent(in) :: length, index
- integer, intent(in) :: i ! dimension number, for output purposes
-
- if (index > length) then
- write(6,*) subname//' ERROR: desired index exceeds dimension length in ', &
- trim(tablename)
- write(6,*) 'dimension #', i
- write(6,*) 'index: ', index
- write(6,*) 'length: ', length
- call abort()
- end if
- end subroutine check_dimsize
-
-end subroutine lookup_2d_netcdf
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: which_max
-!
-! !INTERFACE:
-subroutine which_max(arr, maxval, maxindex, lbound)
-!
-! !DESCRIPTION:
-! Returns maximum value in arr along with the index of the maximum value
-!
-! If multiple values are tied, returns index of the first maximum
-!
-! !ARGUMENTS:
- implicit none
- real(r8), intent(in) :: arr(:)
- real(r8), intent(out):: maxval ! maximum value in arr(:)
- integer , intent(out):: maxindex ! first index of maxval
-
- ! lower bound of indices of arr; if not supplied, assumed to be 1:
- integer , intent(in), optional :: lbound
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: i
-!-----------------------------------------------------------------------
-
- maxindex = 1
- maxval = arr(1)
-
- do i = 2, size(arr)
- if (arr(i) > maxval) then
- maxindex = i
- maxval = arr(i)
- end if
- end do
-
- if (present(lbound)) then
- maxindex = maxindex + (lbound - 1)
- end if
-end subroutine which_max
-!------------------------------------------------------------------------------
-
-end module mkindexmapMod
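The deleted `which_max` routine above scans the array once, recording the index of the first occurrence of the running maximum, then shifts the result by the optional lower bound. The same logic as a Python sketch (the function name mirrors the Fortran; the `lbound` default of 1 matches Fortran's usual array lower bound):

```python
def which_max(arr, lbound=1):
    """Return (max value, index of first max), with indices starting at lbound.

    Python sketch of the Fortran which_max: ties resolve to the first
    occurrence because only a strictly greater value updates the maximum.
    """
    max_val = arr[0]
    max_index = 0  # 0-based position of the first maximum
    for i, v in enumerate(arr[1:], start=1):
        if v > max_val:
            max_val, max_index = v, i
    return max_val, max_index + lbound
```

With the default `lbound=1` this reproduces the Fortran behavior; a different `lbound` shifts the reported index exactly as `maxindex + (lbound - 1)` does in the routine above.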
diff --git a/tools/mksurfdata_map/src/mklaiMod.F90 b/tools/mksurfdata_map/src/mklaiMod.F90
deleted file mode 100644
index aef33f3463..0000000000
--- a/tools/mksurfdata_map/src/mklaiMod.F90
+++ /dev/null
@@ -1,445 +0,0 @@
-module mklaiMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mklai
-!
-! !DESCRIPTION:
-! Make LAI/SAI/height data
-!
-! !REVISION HISTORY:
-! Author: Sam Levis
-!
-!EOP
-!-----------------------------------------------------------------------
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
- use mkvarctl
-
- implicit none
-
- private
-
- public :: mklai
- private :: pft_laicheck
-
-contains
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mklai
-!
-! !INTERFACE:
-subroutine mklai(ldomain, mapfname, datfname, ndiag, ncido)
-!
-! !DESCRIPTION:
-! Make LAI/SAI/height data
-! Portions of this code could be moved out of the month loop
-! for improved efficiency
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar , only : re
- use mkvarctl
- use mkncdio
- use mkpftConstantsMod, only : c3cropindex, c3irrcropindex
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- integer , intent(in) :: ncido ! output netcdf file id
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- integer :: numpft_i ! number of plant types on input
- real(r8) :: glai_o(0:numpft) ! output grid: global area pfts
- real(r8) :: gsai_o(0:numpft) ! output grid: global area pfts
- real(r8) :: ghgtt_o(0:numpft) ! output grid: global area pfts
- real(r8) :: ghgtb_o(0:numpft) ! output grid: global area pfts
- real(r8) :: glai_i(0:numpft) ! input grid: global area pfts
- real(r8) :: gsai_i(0:numpft) ! input grid: global area pfts
- real(r8) :: ghgtt_i(0:numpft) ! input grid: global area pfts
- real(r8) :: ghgtb_i(0:numpft) ! input grid: global area pfts
-
- real(r8), allocatable :: mlai_o(:,:) ! monthly lai
- real(r8), allocatable :: msai_o(:,:) ! monthly sai
- real(r8), allocatable :: mhgtt_o(:,:) ! monthly height (top)
- real(r8), allocatable :: mhgtb_o(:,:) ! monthly height (bottom)
- real(r8), allocatable :: mlai_max(:,:) ! monthly lai
- real(r8), allocatable :: msai_max(:,:) ! monthly sai
- real(r8), allocatable :: mhgtt_max(:,:) ! monthly height (top)
- real(r8), allocatable :: mhgtb_max(:,:) ! monthly height (bottom)
- real(r8), allocatable :: mlai_i(:,:) ! monthly lai in
- real(r8), allocatable :: msai_i(:,:) ! monthly sai in
- real(r8), allocatable :: mhgtt_i(:,:) ! monthly height (top) in
- real(r8), allocatable :: mhgtb_i(:,:) ! monthly height (bottom) in
-  real(r8), allocatable :: frac_dst(:)   ! output fractions
- integer, pointer :: laimask(:,:) ! lai+sai output mask for each plant function type
- real(r8) :: garea_i ! input grid: global area
- real(r8) :: garea_o ! output grid: global area
- integer :: mwts ! number of weights
- integer :: ni,no,ns_i,ns_o ! indices
- integer :: k,l,n,m ! indices
- integer :: ncidi,dimid,varid ! input netCDF id's
- integer :: ndimsi,ndimso ! netCDF dimension sizes
- integer :: dimids(4) ! netCDF dimension ids
- integer :: bego(4),leno(4) ! netCDF bounds
- integer :: begi(4),leni(4) ! netCDF bounds
- integer :: ntim ! number of input time samples
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=256) :: name ! name of attribute
- character(len=256) :: unit ! units of attribute
- character(len= 32) :: subname = 'mklai'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make LAIs/SAIs/heights .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- ns_o = ldomain%ns
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
-
- write (6,*) 'Open LAI file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncidi), subname)
- call check_ret(nf_inq_dimid(ncidi, 'pft', dimid), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimid, numpft_i), subname)
- call check_ret(nf_inq_dimid(ncidi, 'time', dimid), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimid, ntim), subname)
-
- if (numpft_i /= numpft+1) then
- write(6,*) 'WARNING: ' // trim(subname) // '(): parameter numpft+1 = ', numpft+1, &
- 'does not equal input dataset numpft = ', numpft_i
- write(6,*)'This inconsistency used to stop the program. Now we allow it '
- write(6,*)'because crop pfts 17-last are assumed to never use satellite lai data.'
-! stop
- if (numpft_i > numpft + 1) then
- ! NOTE(bja, 2015-01) If this error check is determined to be
- ! invalid, all the loop bounds over output data in this
- ! routine will need to be double checked!
- write(6, *) "ERROR:" // trim(subname) // "(): input numpft must be less than or equal to output numpft+1."
- stop
- end if
- endif
- if (ntim /= 12) then
- write(6,*)'MKLAI: must have 12 time samples on input data'
- call abort()
- endif
-
- ! NOTE - close data set at bottom of routine
-
- ! Dynamic allocation of variables
-
- allocate(mlai_i(ns_i,0:numpft_i), &
- msai_i(ns_i,0:numpft_i), &
- mhgtt_i(ns_i,0:numpft_i), &
- mhgtb_i(ns_i,0:numpft_i), &
- frac_dst(ns_o), &
- mlai_o(ns_o,0:numpft), &
- msai_o(ns_o,0:numpft), &
- mhgtt_o(ns_o,0:numpft), &
- mhgtb_o(ns_o,0:numpft), &
- laimask(ns_i,0:numpft), stat=ier )
- if (ier /= 0) then
- write(6,*)'mklai allocation error'; call abort()
- end if
-
- ! Determine mapping weights and map
-
- call gridmap_mapread(tgridmap, mapfname)
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Determine number of dimensions in input by querying MONTHLY_LAI
-
- call check_ret(nf_inq_varid(ncidi, 'MONTHLY_LAI', varid), subname)
- call check_ret(nf_inq_vardimid(ncidi, varid, dimids), subname)
- call check_ret(nf_inq_varndims(ncidi, varid, ndimsi), subname)
- if (ndimsi ==4) then
- begi(1) = 1
- begi(2) = 1
- begi(3) = 1
- leni(4) = 1
- call check_ret(nf_inq_dimlen(ncidi, dimids(1), leni(1)), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimids(2), leni(2)), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimids(3), leni(3)), subname)
- else if (ndimsi== 3) then
- begi(1) = 1
- begi(2) = 1
- leni(3) = 1
- call check_ret(nf_inq_dimlen(ncidi, dimids(1), leni(1)), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimids(2), leni(2)), subname)
- end if
-
- ! Determine number of dimensions in output by querying MONTHLY_LAI
-
- call check_ret(nf_inq_varid(ncido, 'MONTHLY_LAI', varid), subname)
- call check_ret(nf_inq_varndims(ncido, varid, ndimso), subname)
- call check_ret(nf_inq_vardimid(ncido, varid, dimids), subname)
- if (ndimso ==4) then
- bego(1) = 1
- bego(2) = 1
- bego(3) = 1
- leno(4) = 1
- call check_ret(nf_inq_dimlen(ncido, dimids(1), leno(1)), subname)
- call check_ret(nf_inq_dimlen(ncido, dimids(2), leno(2)), subname)
- call check_ret(nf_inq_dimlen(ncido, dimids(3), leno(3)), subname)
- else if (ndimso== 3) then
- bego(1) = 1
- bego(2) = 1
- leno(3) = 1
- call check_ret(nf_inq_dimlen(ncido, dimids(1), leno(1)), subname)
- call check_ret(nf_inq_dimlen(ncido, dimids(2), leno(2)), subname)
- end if
-
- ! Loop over months
-
- do m = 1, ntim
-
- if (ndimsi == 4) begi(4)=m
- if (ndimsi == 3) begi(3)=m
-
- call check_ret(nf_inq_varid (ncidi, 'MONTHLY_LAI', varid), subname)
- call check_ret(nf_get_vara_double (ncidi, varid, begi(1:ndimsi), leni(1:ndimsi), &
- mlai_i), subname)
-
- call check_ret(nf_inq_varid (ncidi, 'MONTHLY_SAI', varid), subname)
- call check_ret(nf_get_vara_double (ncidi, varid, begi(1:ndimsi), leni(1:ndimsi), &
- msai_i), subname)
-
- call check_ret(nf_inq_varid (ncidi, 'MONTHLY_HEIGHT_TOP', varid), subname)
- call check_ret(nf_get_vara_double (ncidi, varid, begi(1:ndimsi), leni(1:ndimsi), &
- mhgtt_i), subname)
-
- call check_ret(nf_inq_varid (ncidi, 'MONTHLY_HEIGHT_BOT', varid), subname)
- call check_ret(nf_get_vara_double (ncidi, varid, begi(1:ndimsi), leni(1:ndimsi), &
- mhgtb_i), subname)
-
- mlai_o(:,:) = 0.
- msai_o(:,:) = 0.
- mhgtt_o(:,:) = 0.
- mhgtb_o(:,:) = 0.
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Loop over pft types to do mapping
- do l = 0, numpft_i - 1
- call gridmap_areaave_srcmask(tgridmap, mlai_i(:,l) , mlai_o(:,l) , nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, msai_i(:,l) , msai_o(:,l) , nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, mhgtt_i(:,l), mhgtt_o(:,l), nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, mhgtb_i(:,l), mhgtb_o(:,l), nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- enddo
-
- ! Determine laimask
-
- laimask(:,:) = 0
-
- ! copy LAI, SAI, & heights from the C3 crop (pft15)
- ! to the irrigated (pft16) whether crop is on or off
- mlai_o(:,c3irrcropindex) = mlai_o(:,c3cropindex)
- msai_o(:,c3irrcropindex) = msai_o(:,c3cropindex)
- mhgtt_o(:,c3irrcropindex) = mhgtt_o(:,c3cropindex)
- mhgtb_o(:,c3irrcropindex) = mhgtb_o(:,c3cropindex)
-
- ! -----------------------------------------------------------------
- ! Output model resolution LAI/SAI/HEIGHT data
- ! -----------------------------------------------------------------
-
- ! Now write out all variables
-
- if (ndimso == 4) bego(4)=m
- if (ndimso == 3) bego(3)=m
-
- call check_ret(nf_inq_varid(ncido, 'MONTHLY_LAI', varid), subname)
- call check_ret(nf_put_vara_double(ncido, varid, bego, leno, mlai_o), subname)
-
- call check_ret(nf_inq_varid(ncido, 'MONTHLY_SAI', varid), subname)
- call check_ret(nf_put_vara_double(ncido, varid, bego, leno, msai_o), subname)
-
- call check_ret(nf_inq_varid(ncido, 'MONTHLY_HEIGHT_TOP', varid), subname)
- call check_ret(nf_put_vara_double(ncido, varid, bego, leno, mhgtt_o), subname)
-
- call check_ret(nf_inq_varid(ncido, 'MONTHLY_HEIGHT_BOT', varid), subname)
- call check_ret(nf_put_vara_double(ncido, varid, bego, leno, mhgtb_o), subname)
-
- call check_ret(nf_inq_varid(ncido, 'time', varid), subname)
- call check_ret(nf_put_vara_int(ncido, varid, bego(ndimso), leno(ndimso), m), subname)
-
- call check_ret(nf_sync(ncido), subname)
-
-
- ! -----------------------------------------------------------------
-    ! Error check 2
- ! Compare global areas on input and output grids
- ! -----------------------------------------------------------------
-
- ! Input grid global area
-
- garea_i = 0.
- do ni = 1,ns_i
- garea_i = garea_i + tgridmap%area_src(ni)
- end do
-
- glai_i(:) = 0.
- gsai_i(:) = 0.
- ghgtt_i(:) = 0.
- ghgtb_i(:) = 0.
- do l = 0, numpft_i - 1
- do ni = 1, ns_i
- glai_i(l) = glai_i(l) + mlai_i(ni,l) *tgridmap%area_src(ni)*&
- tdomain%mask(ni)*re**2
- gsai_i(l) = gsai_i(l) + msai_i(ni,l) *tgridmap%area_src(ni)*&
- tdomain%mask(ni)*re**2
- ghgtt_i(l) = ghgtt_i(l)+ mhgtt_i(ni,l)*tgridmap%area_src(ni)*&
- tdomain%mask(ni)*re**2
- ghgtb_i(l) = ghgtb_i(l)+ mhgtb_i(ni,l)*tgridmap%area_src(ni)*&
- tdomain%mask(ni)*re**2
- end do
- end do
-
- ! Output grid global area
-
- garea_o = 0.
- do no = 1,ns_o
- garea_o = garea_o + tgridmap%area_dst(no)
- end do
-
- glai_o(:) = 0.
- gsai_o(:) = 0.
- ghgtt_o(:) = 0.
- ghgtb_o(:) = 0.
- do l = 0, numpft_i - 1
- do no = 1,ns_o
- glai_o(l) = glai_o(l) + mlai_o(no,l)*tgridmap%area_dst(no)* &
- frac_dst(no)*re**2
- gsai_o(l) = gsai_o(l) + msai_o(no,l)*tgridmap%area_dst(no)* &
- frac_dst(no)*re**2
- ghgtt_o(l) = ghgtt_o(l)+ mhgtt_o(no,l)*tgridmap%area_dst(no)* &
- frac_dst(no)*re**2
- ghgtb_o(l) = ghgtb_o(l)+ mhgtb_o(no,l)*tgridmap%area_dst(no)* &
- frac_dst(no)*re**2
- end do
- end do
-
- ! Comparison
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'LAI Output for month ',m
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,1001)
-1001 format (1x,'PFT input grid area output grid area',/ &
- 1x,3x,' 10**6 km**2',' 10**6 km**2')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- do l = 0, numpft
- write (ndiag,1002) l, glai_i(l)*1.e-06*1.e-02,glai_o(l)*1.e-06*1.e-02
-1002 format (1x,i3,f16.3,f17.3)
- end do
-
- write (6,*) 'Successfully made LAIs/SAIs/heights for month ', m
- call shr_sys_flush(6)
-
- enddo
- write (6,*)
-
- ! Close input file
- call check_ret(nf_close(ncidi), subname)
-
- ! consistency check that PFT and LAI+SAI make sense
- !call pft_laicheck( ni_s, pft_i, laimask )
-
- ! Deallocate dynamic memory
- deallocate(mlai_i)
- deallocate(msai_i)
- deallocate(mhgtt_i)
- deallocate(mhgtb_i)
- deallocate(mlai_o)
- deallocate(msai_o)
- deallocate(mhgtt_o)
- deallocate(mhgtb_o)
- deallocate(laimask)
- deallocate(frac_dst)
-
- call gridmap_clean(tgridmap)
- call domain_clean(tdomain)
-
-end subroutine mklai
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !INTERFACE:
-subroutine pft_laicheck( ni_s, pctpft_i, laimask )
-
-! !USES:
-!
-! !DESCRIPTION:
-!
-! consistency check that PFT and LAI+SAI make sense
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: ni_s ! input PFT grid resolution
- real(r8), pointer :: pctpft_i(:,:) ! % plant function types
- integer, pointer :: laimask(:,:) ! mask where LAI+SAI > 0
-!EOP
-
- character(len=*), parameter :: subName="pft_laicheck"
- integer :: ni,l,n,nc ! Indices
-!-----------------------------------------------------------------------
-
- do l = 0, numpft
- n = 0
- nc = 0
- do ni = 1,ni_s
- if ( pctpft_i(ni,l) > 0.0_r8 ) nc = nc + 1
- if ( (pctpft_i(ni,l) > 0.0_r8) .and. (laimask(ni,l) /= 1) )then
- write (6,*) subName//' :: warning: pft and LAI+SAI mask not consistent!'
- write (6,*) 'ni,l = ', ni, l
- write (6,*) 'pctpft_i = ',pctpft_i(ni,l)
- write (6,*) 'laimask = ', laimask(ni,l)
- n = n + 1
- end if
- end do
- if ( n > max(4,nc/4) ) then
- write (6,*) subName//' :: pft/LAI+SAI inconsistency over more than 25% land-cover'
- write (6,*) '# inconsistent points, total PFT pts, total LAI+SAI pts = ', &
- n, nc, sum(laimask(:,l))
- stop
- end if
- end do
-
-end subroutine pft_laicheck
-
-!-----------------------------------------------------------------------
-
-end module mklaiMod
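The per-PFT mapping loop in the deleted `mklai` relies on `gridmap_areaave_srcmask`, which computes a source-masked, area-weighted average and falls back to `nodata` wherever no valid source cells overlap a destination cell. A hedged Python sketch of that behavior; the `(src, dst, weight)` triple layout is illustrative, the Fortran gridmap type stores equivalent overlap arrays:

```python
def areaave_srcmask(overlaps, src, mask_src, frac_dst, n_dst, nodata=0.0):
    """Masked area-weighted average, sketching gridmap_areaave_srcmask.

    overlaps: iterable of (src_index, dst_index, weight) triples describing
    how much of each destination cell each source cell covers (hypothetical
    sparse layout). frac_dst holds the masked weight sum per destination
    cell, as gridmap_calc_frac_dst computes it in the Fortran.
    """
    num = [0.0] * n_dst
    for ni, no, w in overlaps:
        num[no] += w * mask_src[ni] * src[ni]
    # Divide by the valid-source fraction; cells with no valid input get nodata.
    return [num[no] / frac_dst[no] if frac_dst[no] > 0.0 else nodata
            for no in range(n_dst)]
```

Masking a source cell removes it from both the numerator and, via `frac_dst`, the denominator, so the average is taken only over valid input data.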
diff --git a/tools/mksurfdata_map/src/mklanwatMod.F90 b/tools/mksurfdata_map/src/mklanwatMod.F90
deleted file mode 100644
index 49a1485fa7..0000000000
--- a/tools/mksurfdata_map/src/mklanwatMod.F90
+++ /dev/null
@@ -1,503 +0,0 @@
-module mklanwatMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mklanwatMod
-!
-! !DESCRIPTION:
-! make %lake and %wetland from input lake / wetland data
-! also make lake parameters
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!-----------------------------------------------------------------------
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public mklakwat ! make % lake
- public mkwetlnd ! make % wetland
- public mklakparams ! make lake parameters
-
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mklakwat
-!
-! !INTERFACE:
-subroutine mklakwat(ldomain, mapfname, datfname, ndiag, zero_out, lake_o)
-!
-! !DESCRIPTION:
-! make %lake
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
-  logical           , intent(in) :: zero_out        ! if true, zero out %lake
- real(r8) , intent(out):: lake_o(:) ! output grid: %lake
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: lake_i(:) ! input grid: percent lake
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- real(r8) :: glake_i ! input grid: global lake
- real(r8) :: garea_i ! input grid: global area
- real(r8) :: glake_o ! output grid: global lake
- real(r8) :: garea_o ! output grid: global area
- integer :: ni,no,k,n,m,ns_i,ns_o ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=32) :: subname = 'mklakwat'
-!-----------------------------------------------------------------------
-
-  write (6,*) 'Attempting to make %lake .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- ns_o = ldomain%ns
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
-
- if ( .not. zero_out )then
- allocate(lake_i(ns_i), stat=ier)
- if (ier/=0) call abort()
- allocate(frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
-
- write(6,*)'Open lake file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_varid (ncid, 'PCT_LAKE', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, lake_i), subname)
- call check_ret(nf_close(ncid), subname)
-
- ! Area-average percent cover on input grid to output grid
-     ! and correct according to the land mask
- ! Note that percent cover is in terms of total grid area.
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Determine lake_o on output grid
-
- call gridmap_areaave_srcmask(tgridmap, lake_i,lake_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- do no = 1,ns_o
- if (lake_o(no) < 1.) lake_o(no) = 0.
- enddo
-
- ! -----------------------------------------------------------------
- ! Error check prep
- ! Global sum of output field -- must multiply by fraction of
- ! output grid that is land as determined by input grid
- ! -----------------------------------------------------------------
-
- allocate(mask_r8(ns_i), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- ! -----------------------------------------------------------------
-     ! Error check 2
- ! Compare global areas on input and output grids
- ! -----------------------------------------------------------------
-
- ! Input grid
-
- glake_i = 0.
- garea_i = 0.
- do ni = 1,ns_i
- garea_i = garea_i + tgridmap%area_src(ni)*re**2
- glake_i = glake_i + lake_i(ni)*tgridmap%area_src(ni)/100.*re**2
- end do
-
- ! Output grid
-
- glake_o = 0.
- garea_o = 0.
- do no = 1,ns_o
- garea_o = garea_o + tgridmap%area_dst(no)*re**2
- glake_o = glake_o + lake_o(no)*tgridmap%area_dst(no)/100.*re**2
- end do
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'Inland Water Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
-2001 format (1x,'surface type input grid area output grid area'/ &
- 1x,' 10**6 km**2 10**6 km**2 ')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2002) glake_i*1.e-06,glake_o*1.e-06
- write (ndiag,2004) garea_i*1.e-06,garea_o*1.e-06
-2002 format (1x,'lakes ',f14.3,f17.3)
-2004 format (1x,'all surface ',f14.3,f17.3)
- else
- do no = 1,ns_o
- lake_o(no) = 0.
- enddo
- end if
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- if ( .not. zero_out )then
- call gridmap_clean(tgridmap)
- deallocate (lake_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
- end if
-
- write (6,*) 'Successfully made %lake'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mklakwat
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkwetlnd
-!
-! !INTERFACE:
-subroutine mkwetlnd(ldomain, mapfname, datfname, ndiag, zero_out, swmp_o)
-!
-! !DESCRIPTION:
-! make %wetland
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
-  logical           , intent(in) :: zero_out        ! if true, zero out %wetland
- real(r8) , intent(out):: swmp_o(:) ! output grid: %wetland
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: swmp_i(:) ! input grid: percent swamp
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- real(r8) :: gswmp_i ! input grid: global swamp
- real(r8) :: garea_i ! input grid: global area
- real(r8) :: gswmp_o ! output grid: global swamp
- real(r8) :: garea_o ! output grid: global area
- integer :: ni,no,k,n,m,ns_i,ns_o ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=32) :: subname = 'mkwetlnd'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make %wetland .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- ns_o = ldomain%ns
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
-
- if ( .not. zero_out )then
- allocate(swmp_i(ns_i), stat=ier)
- if (ier/=0) call abort()
- allocate(frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
-
- write(6,*)'Open wetland file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_varid (ncid, 'PCT_WETLAND', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, swmp_i), subname)
- call check_ret(nf_close(ncid), subname)
-
- ! Area-average percent cover on input grid to output grid
-     ! and correct according to the land mask
- ! Note that percent cover is in terms of total grid area.
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Determine swmp_o on output grid
-
- call gridmap_areaave_srcmask(tgridmap, swmp_i, swmp_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- do no = 1,ns_o
- if (swmp_o(no) < 1.) swmp_o(no) = 0.
- enddo
-
- ! -----------------------------------------------------------------
- ! Error check prep
- ! Global sum of output field -- must multiply by fraction of
- ! output grid that is land as determined by input grid
- ! -----------------------------------------------------------------
-
- allocate(mask_r8(ns_i), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- ! -----------------------------------------------------------------
-     ! Error check 2
- ! Compare global areas on input and output grids
- ! -----------------------------------------------------------------
-
- ! Input grid
-
- gswmp_i = 0.
- garea_i = 0.
- do ni = 1,ns_i
- garea_i = garea_i + tgridmap%area_src(ni)*re**2
- gswmp_i = gswmp_i + swmp_i(ni)*tgridmap%area_src(ni)/100.*re**2
- end do
-
- ! Output grid
-
- gswmp_o = 0.
- garea_o = 0.
- do no = 1,ns_o
- garea_o = garea_o + tgridmap%area_dst(no)*re**2
- gswmp_o = gswmp_o + swmp_o(no)*tgridmap%area_dst(no)/100.*re**2
- end do
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'Inland Water Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
-2001 format (1x,'surface type input grid area output grid area'/ &
- 1x,' 10**6 km**2 10**6 km**2 ')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2003) gswmp_i*1.e-06,gswmp_o*1.e-06
- write (ndiag,2004) garea_i*1.e-06,garea_o*1.e-06
-2003 format (1x,'wetlands ',f14.3,f17.3)
-2004 format (1x,'all surface ',f14.3,f17.3)
- else
- do no = 1,ns_o
- swmp_o(no) = 0.
- enddo
- end if
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- if ( .not. zero_out )then
- call gridmap_clean(tgridmap)
- deallocate (swmp_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
- end if
-
- write (6,*) 'Successfully made %wetland'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkwetlnd
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mklakparams
-!
-! !INTERFACE:
-subroutine mklakparams(ldomain, mapfname, datfname, ndiag, &
- lakedepth_o)
-!
-! !DESCRIPTION:
-! make lake parameters (currently just lake depth)
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_continuous
- use mkchecksMod, only : min_bad
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: lakedepth_o(:) ! output grid: lake depth (m)
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: data_i(:) ! data on input grid
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
-
- real(r8), parameter :: min_valid_lakedepth = 0._r8
-
- character(len=32) :: subname = 'mklakparams'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make lake parameters.....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! -----------------------------------------------------------------
-
- call domain_read(tdomain,datfname)
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Obtain frac_dst
- allocate(frac_dst(ldomain%ns), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- allocate(mask_r8(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! -----------------------------------------------------------------
-
- write(6,*)'Open lake parameter file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(data_i(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
-
- ! -----------------------------------------------------------------
- ! Regrid lake depth
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'LAKEDEPTH', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, lakedepth_o, nodata=10._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(lakedepth_o, min_valid_lakedepth, 'lakedepth')) then
- stop
- end if
-
- call output_diagnostics_continuous(data_i, lakedepth_o, tgridmap, "Lake Depth", "m", ndiag, tdomain%mask, frac_dst)
-
- ! -----------------------------------------------------------------
- ! Close files and deallocate dynamic memory
- ! -----------------------------------------------------------------
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (data_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
-
- write (6,*) 'Successfully made lake parameters'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mklakparams
-
-end module mklanwatMod
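Both `mklakwat` and `mkwetlnd` above end with the same two post-regrid steps: cover fractions below 1% are zeroed, and the diagnostic global area is accumulated as percent/100 times cell area times re**2. A Python sketch under those assumptions (`re_km` is illustrative; the Fortran takes `re` from mkvarpar):

```python
def finalize_pct(pct_o, area_dst, re_km=6371.22):
    """Zero sub-1% cover and accumulate the global-area diagnostic,
    mirroring the post-regrid steps in mklakwat/mkwetlnd (illustrative
    re_km; the diagnostic print further scales by 1.e-06)."""
    pct = [0.0 if p < 1.0 else p for p in pct_o]
    garea = sum(p / 100.0 * a * re_km ** 2 for p, a in zip(pct, area_dst))
    return pct, garea
```

The sub-1% threshold keeps tiny regridding residues from appearing as spurious lake or wetland cover on the output grid.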
diff --git a/tools/mksurfdata_map/src/mkncdio.F90 b/tools/mksurfdata_map/src/mkncdio.F90
deleted file mode 100644
index 555eb6ae80..0000000000
--- a/tools/mksurfdata_map/src/mkncdio.F90
+++ /dev/null
@@ -1,558 +0,0 @@
-module mkncdio
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkncdio
-!
-! !DESCRIPTION:
-! Generic interfaces to write fields to netcdf files, and other useful netcdf operations
-!
-! !USES:
- use shr_kind_mod , only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
-!
-! !PUBLIC TYPES:
- implicit none
- include 'netcdf.inc'
- save
-
- private
-
- public :: check_ret ! checks return status of netcdf calls
- public :: ncd_defvar ! define netCDF input variable
- public :: ncd_def_spatial_var ! define spatial netCDF variable (convenience wrapper to ncd_defvar)
- public :: ncd_put_time_slice ! write a single time slice of a variable
- public :: get_dim_lengths ! get dimension lengths of a netcdf variable
-
- interface ncd_def_spatial_var
- module procedure ncd_def_spatial_var_0lev
- module procedure ncd_def_spatial_var_1lev
- module procedure ncd_def_spatial_var_2lev
- end interface ncd_def_spatial_var
-
- interface ncd_put_time_slice
- module procedure ncd_put_time_slice_1d
- module procedure ncd_put_time_slice_2d
- end interface ncd_put_time_slice
-
- public :: convert_latlon ! convert a latitude or longitude variable to degrees E / N
-!
-! !REVISION HISTORY:
-!
-!
-! !PRIVATE MEMBER FUNCTIONS:
-!
- private :: get_time_slice_beg_and_len ! determine beg and len vectors for writing a time slice
-
- logical :: masterproc = .true. ! always use 1 proc
- real(r8) :: spval = 1.e36 ! special value
-
- public :: nf_open
- public :: nf_close
- public :: nf_write
- public :: nf_sync
- public :: nf_inq_attlen
- public :: nf_inq_dimlen
- public :: nf_inq_dimname
- public :: nf_inq_varid
- public :: nf_inq_varndims
- public :: nf_inq_vardimid
- public :: nf_get_att_double
- public :: nf_get_att_text
- public :: nf_get_var_double
- public :: nf_get_vara_double
- public :: nf_get_var_int
- public :: nf_get_vara_int
- public :: nf_put_var_double
- public :: nf_put_vara_double
- public :: nf_put_var_int
- public :: nf_put_vara_int
- public :: nf_inq_dimid
- public :: nf_max_name
- public :: nf_max_var_dims
- public :: nf_noerr
- public :: nf_nowrite
- public :: nf_enotatt
- public :: nf_strerror
-!EOP
-!-----------------------------------------------------------------------
-
-contains
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: check_ret
-!
-! !INTERFACE:
- subroutine check_ret(ret, calling, varexists)
-!
-! !DESCRIPTION:
-! Check return status from netcdf call
-!
-! !ARGUMENTS:
- implicit none
- integer, intent(in) :: ret
- character(len=*) :: calling
- logical, intent(out), optional :: varexists
-!
-! !REVISION HISTORY:
-!
-!EOP
-!-----------------------------------------------------------------------
-
- if ( present(varexists) ) varexists = .true.
- if ( present(varexists) .and. ret == NF_ENOTVAR )then
- varexists = .false.
- else if (ret /= NF_NOERR) then
- write(6,*)'netcdf error from ',trim(calling), ' rcode = ', ret, &
- ' error = ', NF_STRERROR(ret)
- call abort()
- end if
-
- end subroutine check_ret
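check_ret tolerates exactly one non-fatal status: "variable not found", and only when the caller asked whether the variable exists. A small Python sketch of that dispatch (the NF_ENOTVAR value is taken from the netCDF-C headers; the Python function name and boolean return are illustrative, not part of the tool):

```python
NF_NOERR = 0
NF_ENOTVAR = -49   # netCDF-C code for "variable not found"

def check_ret(ret, calling, want_varexists=False):
    """Mirror of the Fortran check_ret: NF_ENOTVAR is tolerated when the
    caller passed the optional varexists argument; any other non-zero
    status is fatal (the Fortran version calls abort)."""
    if want_varexists and ret == NF_ENOTVAR:
        return False                       # variable absent, not an error
    if ret != NF_NOERR:
        raise RuntimeError(f"netcdf error from {calling}, rcode = {ret}")
    return True

check_ret(NF_NOERR, "nf_inq_varid")                          # ok
check_ret(NF_ENOTVAR, "nf_inq_varid", want_varexists=True)   # returns False
```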
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: ncd_defvar
-!
-! !INTERFACE:
- subroutine ncd_defvar(ncid, varname, xtype, &
- dim1name, dim2name, dim3name, dim4name, dim5name, &
- long_name, units, cell_method, missing_value, fill_value, &
- imissing_value, ifill_value)
-!
-! !DESCRIPTION:
-! Define a netcdf variable
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: ncid ! input unit
- character(len=*), intent(in) :: varname ! variable name
- integer , intent(in) :: xtype ! external type
- character(len=*), intent(in), optional :: dim1name ! dimension name
- character(len=*), intent(in), optional :: dim2name ! dimension name
- character(len=*), intent(in), optional :: dim3name ! dimension name
- character(len=*), intent(in), optional :: dim4name ! dimension name
- character(len=*), intent(in), optional :: dim5name ! dimension name
- character(len=*), intent(in), optional :: long_name ! attribute
- character(len=*), intent(in), optional :: units ! attribute
- character(len=*), intent(in), optional :: cell_method ! attribute
- real(r8) , intent(in), optional :: missing_value ! attribute for real
- real(r8) , intent(in), optional :: fill_value ! attribute for real
- integer , intent(in), optional :: imissing_value ! attribute for int
- integer , intent(in), optional :: ifill_value ! attribute for int
-!
-! !REVISION HISTORY:
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: n ! indices
- integer :: ndims ! dimension counter
- integer :: dimid(5) ! dimension ids
- integer :: varid ! variable id
- integer :: itmp ! temporary
- character(len=256) :: str ! temporary
-    character(len=32) :: subname='ncd_defvar' ! subroutine name
-!-----------------------------------------------------------------------
-
- if (.not. masterproc) return
-
- ! Determine dimension ids for variable
-
- dimid(:) = 0
-
- if (present(dim1name)) then
- call check_ret(nf_inq_dimid(ncid, dim1name, dimid(1)), subname)
- end if
- if (present(dim2name)) then
- call check_ret(nf_inq_dimid(ncid, dim2name, dimid(2)), subname)
- end if
- if (present(dim3name)) then
- call check_ret(nf_inq_dimid(ncid, dim3name, dimid(3)), subname)
- end if
- if (present(dim4name)) then
- call check_ret(nf_inq_dimid(ncid, dim4name, dimid(4)), subname)
- end if
- if (present(dim5name)) then
- call check_ret(nf_inq_dimid(ncid, dim5name, dimid(5)), subname)
- end if
-
- ! Define variable
-
- if (present(dim1name)) then
- ndims = 0
- do n = 1, size(dimid)
- if (dimid(n) /= 0) ndims = ndims + 1
- end do
- call check_ret(nf_def_var(ncid, trim(varname), xtype, ndims, dimid(1:ndims), varid), subname)
- else
- call check_ret(nf_def_var(ncid, varname, xtype, 0, 0, varid), subname)
- end if
- if (present(long_name)) then
- call check_ret(nf_put_att_text(ncid, varid, 'long_name', len_trim(long_name), trim(long_name)), subname)
- end if
- if (present(units)) then
- call check_ret(nf_put_att_text(ncid, varid, 'units', len_trim(units), trim(units)), subname)
- end if
- if (present(cell_method)) then
- str = 'time: ' // trim(cell_method)
- call check_ret(nf_put_att_text(ncid, varid, 'cell_method', len_trim(str), trim(str)), subname)
- end if
- if (present(fill_value)) then
- call check_ret(nf_put_att_double(ncid, varid, '_FillValue', xtype, 1, fill_value), subname)
- end if
- if (present(missing_value)) then
- call check_ret(nf_put_att_double(ncid, varid, 'missing_value', xtype, 1, missing_value), subname)
- end if
- if (present(ifill_value)) then
- call check_ret(nf_put_att_int(ncid, varid, '_FillValue', xtype, 1, ifill_value), subname)
- end if
- if (present(imissing_value)) then
- call check_ret(nf_put_att_int(ncid, varid, 'missing_value', xtype, 1, imissing_value), subname)
- end if
-
- end subroutine ncd_defvar
-
- ! ========================================================================
- ! ncd_def_spatial_var routines: define a spatial netCDF variable (convenience wrapper to
- ! ncd_defvar)
- ! ========================================================================
-
- !-----------------------------------------------------------------------
- subroutine ncd_def_spatial_var_0lev(ncid, varname, xtype, long_name, units)
- !
- ! !DESCRIPTION:
- ! Define a spatial netCDF variable (convenience wrapper to ncd_defvar)
- !
- ! The variable in question has ONLY spatial dimensions (no level or time dimensions)
- !
- ! !USES:
- use mkvarctl, only : outnc_1d
- !
- ! !ARGUMENTS:
- integer , intent(in) :: ncid ! input unit
- character(len=*) , intent(in) :: varname ! variable name
- integer , intent(in) :: xtype ! external type
- character(len=*) , intent(in) :: long_name ! attribute
- character(len=*) , intent(in) :: units ! attribute
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'ncd_def_spatial_var_0lev'
- !-----------------------------------------------------------------------
-
- if (outnc_1d) then
- call ncd_defvar(ncid=ncid, varname=varname, xtype=xtype, &
- dim1name='gridcell', &
- long_name=long_name, units=units)
- else
- call ncd_defvar(ncid=ncid, varname=varname, xtype=xtype, &
- dim1name='lsmlon', dim2name='lsmlat', &
- long_name=long_name, units=units)
- end if
-
- end subroutine ncd_def_spatial_var_0lev
-
- !-----------------------------------------------------------------------
- subroutine ncd_def_spatial_var_1lev(ncid, varname, xtype, lev1name, long_name, units)
- !
- ! !DESCRIPTION:
- ! Define a spatial netCDF variable (convenience wrapper to ncd_defvar)
- !
- ! The variable in question has one level (or time) dimension in addition to its
- ! spatial dimensions
- !
- ! !USES:
- use mkvarctl, only : outnc_1d
- !
- ! !ARGUMENTS:
- integer , intent(in) :: ncid ! input unit
- character(len=*) , intent(in) :: varname ! variable name
- integer , intent(in) :: xtype ! external type
- character(len=*) , intent(in) :: lev1name ! name of level (or time) dimension
- character(len=*) , intent(in) :: long_name ! attribute
- character(len=*) , intent(in) :: units ! attribute
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'ncd_def_spatial_var_1lev'
- !-----------------------------------------------------------------------
-
- if (outnc_1d) then
- call ncd_defvar(ncid=ncid, varname=varname, xtype=xtype, &
- dim1name='gridcell', dim2name=lev1name, &
- long_name=long_name, units=units)
- else
- call ncd_defvar(ncid=ncid, varname=varname, xtype=xtype, &
- dim1name='lsmlon', dim2name='lsmlat',dim3name=lev1name, &
- long_name=long_name, units=units)
- end if
-
- end subroutine ncd_def_spatial_var_1lev
-
- !-----------------------------------------------------------------------
- subroutine ncd_def_spatial_var_2lev(ncid, varname, xtype, lev1name, lev2name, long_name, units)
- !
- ! !DESCRIPTION:
- ! Define a spatial netCDF variable (convenience wrapper to ncd_defvar)
- !
- ! The variable in question has two level (or time) dimensions in addition to its
- ! spatial dimensions
- !
- ! !USES:
- use mkvarctl, only : outnc_1d
- !
- ! !ARGUMENTS:
- integer , intent(in) :: ncid ! input unit
- character(len=*) , intent(in) :: varname ! variable name
- integer , intent(in) :: xtype ! external type
- character(len=*) , intent(in) :: lev1name ! name of first level (or time) dimension
- character(len=*) , intent(in) :: lev2name ! name of second level (or time) dimension
- character(len=*) , intent(in) :: long_name ! attribute
- character(len=*) , intent(in) :: units ! attribute
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'ncd_def_spatial_var_2lev'
- !-----------------------------------------------------------------------
-
- if (outnc_1d) then
- call ncd_defvar(ncid=ncid, varname=varname, xtype=xtype, &
- dim1name='gridcell', dim2name=lev1name, dim3name=lev2name, &
- long_name=long_name, units=units)
- else
- call ncd_defvar(ncid=ncid, varname=varname, xtype=xtype, &
- dim1name='lsmlon', dim2name='lsmlat', dim3name=lev1name, dim4name=lev2name, &
- long_name=long_name, units=units)
- end if
-
- end subroutine ncd_def_spatial_var_2lev
-
- ! ========================================================================
- ! ncd_put_time_slice routines: write a single time slice of a variable
- ! ========================================================================
-
- !-----------------------------------------------------------------------
- subroutine ncd_put_time_slice_1d(ncid, varid, time_index, data)
- !
- ! !DESCRIPTION:
- ! Write a single time slice of a 1-d variable
- !
- ! !USES:
- !
- ! !ARGUMENTS:
- integer , intent(in) :: ncid ! netCDF id
- integer , intent(in) :: varid ! variable id
- integer , intent(in) :: time_index ! time index in file
- real(r8), intent(in) :: data(:) ! data to write (a single time slice)
- !
- ! !LOCAL VARIABLES:
- integer, allocatable :: beg(:) ! begin indices for each dimension
- integer, allocatable :: len(:) ! length along each dimension
-
- character(len=*), parameter :: subname = 'ncd_put_time_slice_1d'
- !-----------------------------------------------------------------------
-
- call get_time_slice_beg_and_len(ncid, varid, time_index, beg, len)
- call check_ret(nf_put_vara_double(ncid, varid, beg, len, data), subname)
-
- deallocate(beg, len)
-
- end subroutine ncd_put_time_slice_1d
-
- !-----------------------------------------------------------------------
- subroutine ncd_put_time_slice_2d(ncid, varid, time_index, data)
- !
- ! !DESCRIPTION:
- ! Write a single time slice of a 2-d variable
- !
- ! !USES:
- !
- ! !ARGUMENTS:
- integer , intent(in) :: ncid ! netCDF id
- integer , intent(in) :: varid ! variable id
- integer , intent(in) :: time_index ! time index in file
- real(r8), intent(in) :: data(:,:) ! data to write (a single time slice)
- !
- ! !LOCAL VARIABLES:
- integer, allocatable :: beg(:) ! begin indices for each dimension
- integer, allocatable :: len(:) ! length along each dimension
-
- character(len=*), parameter :: subname = 'ncd_put_time_slice_2d'
- !-----------------------------------------------------------------------
-
- call get_time_slice_beg_and_len(ncid, varid, time_index, beg, len)
- call check_ret(nf_put_vara_double(ncid, varid, beg, len, data), subname)
-
- deallocate(beg, len)
-
- end subroutine ncd_put_time_slice_2d
-
-
- !-----------------------------------------------------------------------
- subroutine get_time_slice_beg_and_len(ncid, varid, time_index, beg, len)
- !
- ! !DESCRIPTION:
- ! Determine beg and len vectors for writing a time slice.
- !
- ! Assumes time is the last dimension of the given variable.
- !
- ! Allocates memory for beg & len.
- !
- ! !USES:
- !
- ! !ARGUMENTS:
- integer , intent(in) :: ncid ! netcdf ID
- integer , intent(in) :: varid ! variable ID
- integer , intent(in) :: time_index ! time index in file
- integer, allocatable, intent(out) :: beg(:) ! begin indices for each dimension
- integer, allocatable, intent(out) :: len(:) ! length along each dimension
- !
- ! !LOCAL VARIABLES:
- integer :: n ! index
- integer :: ndims ! number of dimensions
- integer, allocatable :: dimids(:) ! dimension IDs
-
- character(len=*), parameter :: subname = 'get_time_slice_beg_and_len'
- !-----------------------------------------------------------------------
-
- call check_ret(nf_inq_varndims(ncid, varid, ndims), subname)
- allocate(beg(ndims))
- allocate(len(ndims))
- allocate(dimids(ndims))
-
- call check_ret(nf_inq_vardimid(ncid, varid, dimids), subname)
- beg(1:ndims-1) = 1
- do n = 1,ndims-1
- call check_ret(nf_inq_dimlen(ncid, dimids(n), len(n)), subname)
- end do
- len(ndims) = 1
- beg(ndims) = time_index
-
- deallocate(dimids)
-
- end subroutine get_time_slice_beg_and_len
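The start/count bookkeeping above is simple but easy to get wrong: full extent on every dimension except the last (time), which gets length 1 at the requested index. A Python sketch of the same computation, assuming 1-based indices as in the netCDF Fortran-77 API (function name is illustrative):

```python
def time_slice_start_count(dim_lengths, time_index):
    """Start/count vectors for writing one time slice of a variable whose
    LAST dimension is time, matching get_time_slice_beg_and_len above.
    Indices are 1-based, as in the netCDF Fortran-77 API."""
    ndims = len(dim_lengths)
    start = [1] * (ndims - 1) + [time_index]       # beg(1:ndims-1) = 1
    count = list(dim_lengths[:-1]) + [1]           # len(ndims)     = 1
    return start, count

# e.g. a (lon=144, lat=96, time) variable, writing time level 5
start, count = time_slice_start_count([144, 96, 12], time_index=5)
# start == [1, 1, 5], count == [144, 96, 1]
```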
-
-
-
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: get_dim_lengths
-!
-! !INTERFACE:
-subroutine get_dim_lengths(ncid, varname, ndims, dim_lengths)
-!
-! !DESCRIPTION:
-! Returns the number of dimensions and an array containing the dimension lengths of a
-! variable in an open netcdf file.
-!
-! Entries 1:ndims in the returned dim_lengths array contain the dimension lengths; the
-! remaining entries in that vector are meaningless. The dim_lengths array must be large
-! enough to hold all ndims values; if not, the code aborts (this can be ensured by passing
-! in an array of length nf_max_var_dims).
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: ncid ! netcdf id of an open netcdf file
- character(len=*), intent(in) :: varname ! name of variable of interest
- integer , intent(out):: ndims ! number of dimensions of variable
- integer , intent(out):: dim_lengths(:) ! lengths of dimensions of variable
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
- integer :: varid
- integer :: dimids(size(dim_lengths))
- integer :: i
- character(len=*), parameter :: subname = 'get_dim_lengths'
-!EOP
-!------------------------------------------------------------------------------
- call check_ret(nf_inq_varid(ncid, varname, varid), subname)
- call check_ret(nf_inq_varndims(ncid, varid, ndims), subname)
-
- if (ndims > size(dim_lengths)) then
- write(6,*) trim(subname), ' ERROR: dim_lengths too small'
- call abort()
- end if
-
- call check_ret(nf_inq_vardimid(ncid, varid, dimids), subname)
-
- dim_lengths(:) = 0 ! pre-fill with 0 so we won't have garbage in elements past ndims
- do i = 1, ndims
- call check_ret(nf_inq_dimlen(ncid, dimids(i), dim_lengths(i)), subname)
- end do
- end subroutine get_dim_lengths
-
-!----------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: convert_latlon
-!
-! !INTERFACE:
- subroutine convert_latlon(ncid, varname, data)
-!
-! !DESCRIPTION:
-! Convert a latitude or longitude variable from its units in the input file to degrees E /
-! degrees N. Currently, this just handles conversions from radians to degrees.
-!
-! Assumes that the longitude / latitude variable has already been read from file, into
-! the variable given by 'data'. ncid & varname give the file ID and variable name from
-! which this variable was read (needed to obtain the variable's units).
-!
-! !USES:
- use shr_const_mod, only : SHR_CONST_PI
-!
-! !ARGUMENTS:
- implicit none
- integer , intent(in) :: ncid ! ID of open netcdf file
- character(len=*), intent(in) :: varname ! name of lat or lon variable that was read into 'data'
- real(r8) , intent(inout):: data(:) ! latitude or longitude data
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: ier ! error return code
- integer :: varid ! netCDF variable id
- integer :: units_len ! length of units attribute on file
- character(len=256) :: units ! units attribute
- character(len= 32) :: subname = 'convert_latlon'
-!-----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, varname, varid), subname)
- ier = nf_inq_attlen(ncid, varid, 'units', units_len)
-
- ! Only do the following processing if there is no error; if ier /= NF_NOERR, that
- ! probably means there isn't a units attribute -- in that case, assume units are
- ! degrees and need no conversion
- if (ier == NF_NOERR) then
- if (units_len > len(units)) then
-         write(6,*) trim(subname), ' ERROR: units variable not long enough to hold attribute'
- call abort()
- end if
-
- call check_ret(nf_get_att_text(ncid, varid, 'units', units), subname)
-
- if (units(1:7) == 'radians') then
- ! convert from radians to degrees
- data(:) = data(:) * 180._r8 / SHR_CONST_PI
- end if
- end if
-
- end subroutine convert_latlon
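The conversion rule above is deliberately narrow: only a units attribute starting with 'radians' triggers a change, and a missing units attribute is assumed to mean degrees. A Python sketch of the same rule (function name illustrative):

```python
import math

def convert_latlon(values, units):
    """Convert lat/lon values to degrees E / degrees N, mirroring the
    Fortran convert_latlon above: only a units string beginning with
    'radians' triggers conversion; None (no units attribute) or any
    other string is assumed to already be degrees."""
    if units is not None and units.startswith('radians'):
        return [v * 180.0 / math.pi for v in values]
    return list(values)

lons = convert_latlon([0.0, math.pi / 2, math.pi], 'radians')
# lons ≈ [0.0, 90.0, 180.0]
assert convert_latlon([45.0], None) == [45.0]   # no units: left alone
```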
-!------------------------------------------------------------------------------
-
-
-end module mkncdio
diff --git a/tools/mksurfdata_map/src/mkpctPftTypeMod.F90 b/tools/mksurfdata_map/src/mkpctPftTypeMod.F90
deleted file mode 100644
index 8c2c9b7c53..0000000000
--- a/tools/mksurfdata_map/src/mkpctPftTypeMod.F90
+++ /dev/null
@@ -1,626 +0,0 @@
-module mkpctPftTypeMod
-
- !-----------------------------------------------------------------------
- !BOP
- !
- ! !MODULE: mkpctPftType
- !
- ! !DESCRIPTION:
- ! Derived type and associated methods for operating on pct_pft data
- !
- ! !REVISION HISTORY:
- ! Author: Bill Sacks
- !
- !-----------------------------------------------------------------------
-
- !!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- ! !PUBLIC TYPES
- public :: pct_pft_type
-
- type :: pct_pft_type
- private
- real(r8), allocatable :: pct_p2l(:) ! pct of each pft on the landunit
- real(r8) :: pct_l2g ! pct of landunit on the grid cell
- contains
- ! Public routines:
- ! Query routines:
- procedure :: get_pct_p2l ! get an array holding % of each pft on the landunit
- procedure :: get_pct_p2g ! get an array holding % of each pft on the gridcell
- procedure :: get_pct_l2g ! get % of landunit on the grid cell
- procedure :: get_first_pft_index ! get index of the first pft (lower bound of arrays)
- procedure :: get_one_pct_p2g ! get % of gridcell for a single pft
- ! Routines that modify the data:
- procedure :: set_pct_l2g ! set % of landunit on the grid cell
- procedure :: set_one_pct_p2g ! set % pft for a single pft
- procedure :: merge_pfts ! merge all area from one PFT into another PFT
- procedure :: remove_small_cover ! set % cover to 0 for any PFT whose grid cell coverage is less than a threshold
-
- ! Private routines:
- procedure, private :: convert_from_p2g ! convert a p2g array into p2l and l2g
- procedure, private :: check_vals ! perform a sanity check after setting values
- end type pct_pft_type
-
- ! !PUBLIC MEMBER FUNCTIONS
- public :: update_max_array ! given an array of pct_pft_type variables update the max_p2l values from pct_p2l
- public :: get_pct_p2l_array ! given an array of pct_pft_type variables, return a 2-d array of pct_p2l
- public :: get_pct_l2g_array ! given an array of pct_pft_type variables, return an array of pct_l2g
-
- interface pct_pft_type
- module procedure constructor ! initialize a new pct_pft_type object
-     module procedure constructor_pong  ! initialize a new pct_pft_type object with all PFTs on the gridcell
- module procedure constructor_empty ! initialize a new pct_pft_type object for an empty landunit
- end interface pct_pft_type
-
- ! !PRIVATE TYPES:
- real(r8), parameter :: tol = 1.e-12_r8 ! tolerance for checking equality
-
- !EOP
-
-contains
-
- ! ========================================================================
- ! Constructors
- ! ========================================================================
-
- !-----------------------------------------------------------------------
- function constructor_pong(pct_p2g, first_pft_index, default_pct_p2l) result(this)
- !
- ! !DESCRIPTION:
- ! Given the % of each pft on the grid cell, create a pct_pft_type object.
- !
- ! Note that pct_p2g should just contain the pfts in this landunit.
- !
- ! If all PFTs have 0 weight on the grid cell, we arbitrarily set % of each pft on the
- ! landunit based on default_pct_p2l. Note that:
- ! (1) size of default_pct_p2l must match size of pct_p2g
- ! (2) default_pct_p2l must sum to 100%
- !
- ! !ARGUMENTS:
- type(pct_pft_type) :: this ! function result
-
- real(r8), intent(in) :: pct_p2g(:) ! % of each pft on the grid cell
- integer , intent(in) :: first_pft_index ! index of the first pft (lower bound of arrays)
- real(r8), intent(in) :: default_pct_p2l(:) ! default % of each pft on the landunit, used if total landunit area is 0%
- !
- ! !LOCAL VARIABLES:
- integer :: last_pft_index
-
- character(len=*), parameter :: subname = 'constructor_pong'
- !-----------------------------------------------------------------------
-
- if (size(default_pct_p2l) /= size(pct_p2g)) then
- write(6,*) subname//' ERROR: size of default_pct_p2l must match size of pct_p2g'
- call abort()
- end if
-
- last_pft_index = first_pft_index + size(pct_p2g) - 1
- allocate(this%pct_p2l(first_pft_index : last_pft_index))
- call this%convert_from_p2g(pct_p2g, default_pct_p2l)
-
- end function constructor_pong
-
- !-----------------------------------------------------------------------
- function constructor(pct_p2l, pct_l2g, first_pft_index) result(this)
- !
- ! !DESCRIPTION:
- ! Given the % of each pft on the land cell and % of land unit on grid cell,
- ! create a pct_pft_type object.
- !
-  ! Note that pct_p2l should just contain the pfts in this landunit.
- !
- ! !ARGUMENTS:
- type(pct_pft_type) :: this ! function result
-
- real(r8), intent(in) :: pct_p2l(:) ! % of each pft on the landunit
- real(r8), intent(in) :: pct_l2g ! % of the landunit on the grid cell
- integer , intent(in) :: first_pft_index ! index of the first pft (lower bound of arrays)
- !
- ! !LOCAL VARIABLES:
- integer :: last_pft_index
-
- character(len=*), parameter :: subname = 'constructor'
- !-----------------------------------------------------------------------
-
- last_pft_index = first_pft_index + size(pct_p2l) - 1
- allocate(this%pct_p2l(first_pft_index : last_pft_index))
- this%pct_p2l = pct_p2l
- this%pct_l2g = pct_l2g
-
- end function constructor
-
- !-----------------------------------------------------------------------
- function constructor_empty() result(this)
- !
- ! !DESCRIPTION:
- ! Initialize a new pct_pft_type object for an empty landunit - that is, one that has
- ! no PFTs on it, and never can (e.g., the crop landunit when we're running without
- ! prognostic crops, so that the landunit is always empty).
- !
- ! !ARGUMENTS:
- type(pct_pft_type) :: this ! function result
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'constructor_empty'
- !-----------------------------------------------------------------------
-
- this%pct_l2g = 0._r8
- allocate(this%pct_p2l(0))
-
- end function constructor_empty
-
-
-
- ! ========================================================================
- ! Public member functions
- ! ========================================================================
-
- !-----------------------------------------------------------------------
- function get_pct_p2l(this) result(pct_p2l)
- !
- ! !DESCRIPTION:
- ! Get an array holding % of each pft on the landunit
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(in) :: this
- real(r8) :: pct_p2l(size(this%pct_p2l)) ! function result
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'get_pct_p2l'
- !-----------------------------------------------------------------------
-
- pct_p2l = this%pct_p2l
-
- end function get_pct_p2l
-
- !-----------------------------------------------------------------------
- function get_pct_p2g(this) result(pct_p2g)
- !
- ! !DESCRIPTION:
- ! Get an array holding % of each pft on the gridcell
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(in) :: this
- real(r8) :: pct_p2g(size(this%pct_p2l)) ! function result
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'get_pct_p2g'
- !-----------------------------------------------------------------------
-
- pct_p2g(:) = this%pct_p2l(:) * this%pct_l2g / 100._r8
-
- end function get_pct_p2g
-
- !-----------------------------------------------------------------------
- function get_pct_l2g(this) result(pct_l2g)
- !
- ! !DESCRIPTION:
- ! Get % of landunit on the grid cell
- !
- ! !ARGUMENTS:
- real(r8) :: pct_l2g ! function result
- class(pct_pft_type), intent(in) :: this
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'get_pct_l2g'
- !-----------------------------------------------------------------------
-
- pct_l2g = this%pct_l2g
-
- end function get_pct_l2g
-
- !-----------------------------------------------------------------------
- function get_first_pft_index(this) result(first_pft_index)
- !
- ! !DESCRIPTION:
- ! Get index of the first pft (lower bound of arrays)
- !
- ! !ARGUMENTS:
- integer :: first_pft_index ! function result
- class(pct_pft_type), intent(in) :: this
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'get_first_pft_index'
- !-----------------------------------------------------------------------
-
- first_pft_index = lbound(this%pct_p2l, 1)
-
- end function get_first_pft_index
-
- !-----------------------------------------------------------------------
- function get_one_pct_p2g(this, pft_index) result(pct_p2g)
- !
- ! !DESCRIPTION:
- ! Get % of gridcell for a single pft
- !
- ! !ARGUMENTS:
- real(r8) :: pct_p2g ! function result
- class(pct_pft_type), intent(in) :: this
- integer :: pft_index
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'get_one_pct_p2g'
- !-----------------------------------------------------------------------
-
- pct_p2g = this%pct_p2l(pft_index) * this%pct_l2g / 100._r8
-
- end function get_one_pct_p2g
-
- !-----------------------------------------------------------------------
- subroutine set_pct_l2g(this, pct_l2g_new)
- !
- ! !DESCRIPTION:
- ! Set percent of landunit on the grid cell. Keep pct_p2l the same as before.
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(inout) :: this
- real(r8), intent(in) :: pct_l2g_new ! new percent of this landunit with respect to grid cell
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'set_pct_l2g'
- !-----------------------------------------------------------------------
-
- if (pct_l2g_new < 0._r8 .or. pct_l2g_new > (100._r8 + tol)) then
- write(6,*) subname//' ERROR: pct_l2g_new must be between 0 and 100%'
- write(6,*) 'pct_l2g_new = ', pct_l2g_new
- call abort()
- end if
-
- this%pct_l2g = pct_l2g_new
-
- end subroutine set_pct_l2g
-
- !-----------------------------------------------------------------------
- subroutine set_one_pct_p2g(this, pft_index, pct_p2g_new)
- !
- ! !DESCRIPTION:
- ! Set percent pft for a single pft, given its weight on the grid cell.
- !
- ! The landunit percent is adjusted appropriately. In addition, the coverage of other
-  ! PFTs is adjusted proportionally so that the total pct_pft adds to 100%.
- !
- ! If the resulting total weight on the grid cell is reduced to 0, then pct_p2l
- ! remains as it was before this subroutine call.
- !
- ! Note about pft_index: Note that the first element of the array has index given by
- ! the first_pft_index value given to the constructor.
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(inout) :: this
- integer , intent(in) :: pft_index ! index of the pft to change
- real(r8), intent(in) :: pct_p2g_new ! new percent of this pft, with respect to grid cell
- !
- ! !LOCAL VARIABLES:
- real(r8), allocatable :: pct_p2g(:) ! % of each pft on the grid cell
-
-    character(len=*), parameter :: subname = 'set_one_pct_p2g'
- !-----------------------------------------------------------------------
-
- if (pct_p2g_new < 0._r8 .or. pct_p2g_new > (100._r8 + tol)) then
- write(6,*) subname//' ERROR: pct_p2g_new must be between 0 and 100%'
- write(6,*) 'pct_p2g_new = ', pct_p2g_new
- call abort()
- end if
-
- allocate(pct_p2g(lbound(this%pct_p2l, 1) : ubound(this%pct_p2l, 1)))
- pct_p2g(:) = this%get_pct_p2g()
- pct_p2g(pft_index) = pct_p2g_new
-
-    ! Note that by using this%pct_p2l as the default_pct_p2l argument, we ensure that, if
- ! the new p2g value brings the total % on the grid cell to 0, then we keep the
- ! previous values for pct_p2l
- call this%convert_from_p2g(pct_p2g, this%pct_p2l)
-
- deallocate(pct_p2g)
-
- end subroutine set_one_pct_p2g
-
- !-----------------------------------------------------------------------
- subroutine merge_pfts(this, source, dest)
- !
- ! !DESCRIPTION:
- ! Merge all area from one PFT into another PFT
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(inout) :: this
- integer, intent(in) :: source ! index of source PFT
- integer, intent(in) :: dest ! index of dest PFT
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'merge_pfts'
- !-----------------------------------------------------------------------
-
- this%pct_p2l(dest) = this%pct_p2l(dest) + this%pct_p2l(source)
- this%pct_p2l(source) = 0._r8
-
- call this%check_vals(subname)
-
- end subroutine merge_pfts
-
- !-----------------------------------------------------------------------
- subroutine remove_small_cover(this, too_small, nsmall)
- !
- ! !DESCRIPTION:
- ! Remove any small PFTs, defined as those whose grid cell coverage is below some
- ! threshold. Also returns the number of small PFTs found.
- !
- ! !USES:
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(inout) :: this
- real(r8), intent(in) :: too_small ! threshold for considering a PFT too small (% of grid cell)
- integer , intent(out) :: nsmall ! number of small (but non-zero) PFTs found
- !
- ! !LOCAL VARIABLES:
- integer :: pft_lbound
- integer :: pft_ubound
- integer :: pft_index
- real(r8), allocatable :: pct_p2g(:) ! % of each pft on the grid cell
- logical , allocatable :: is_small(:) ! whether each PFT is considered too small (but not 0)
- logical , allocatable :: is_zero(:) ! whether each PFT is exactly 0
-
- character(len=*), parameter :: subname = 'remove_small_cover'
- !-----------------------------------------------------------------------
-
- pft_lbound = lbound(this%pct_p2l, 1)
- pft_ubound = ubound(this%pct_p2l, 1)
- allocate(pct_p2g (pft_lbound : pft_ubound))
- allocate(is_small(pft_lbound : pft_ubound))
- allocate(is_zero (pft_lbound : pft_ubound))
-
- pct_p2g(:) = this%get_pct_p2g()
- is_zero(:) = (pct_p2g == 0._r8)
- is_small(:) = (pct_p2g < too_small .and. .not. is_zero(:))
-
- nsmall = count(is_small(:))
-
- if (nsmall > 0) then
-
- if (all(is_zero(:) .or. is_small(:))) then
- ! If all PFTs are either 0 or small, then set pct_l2g to 0, but don't touch
- ! pct_p2l(:) (We do NOT set pct_p2l to all 0 in this case, because we need to
- ! maintain sum(pct_p2l) = 100%)
- this%pct_l2g = 0._r8
-
- else
- ! If there are some big PFTs, then we need to adjust pct_p2l as well as pct_l2g
- ! (setting pct_p2l to 0 for the small elements and renormalizing the others)
- do pft_index = pft_lbound, pft_ubound
- if (is_small(pft_index)) then
- call this%set_one_pct_p2g(pft_index, 0._r8)
- end if
- end do
- end if
-
- call this%check_vals(subname)
- end if
-
- deallocate(pct_p2g, is_small, is_zero)
- end subroutine remove_small_cover
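The small-cover removal logic deleted above can be illustrated with a short Python sketch (not the actual mksurfdata_map code): PFTs whose grid-cell coverage is positive but below a threshold are zeroed, and the landunit only collapses entirely when no large PFT remains. For simplicity the sketch works directly on %-of-gridcell values rather than the pct_l2g/pct_p2l split.

```python
def remove_small_cover(pct_p2g, too_small):
    """Zero out PFT cover below `too_small` percent of the grid cell.

    Returns (new_pct_p2g, nsmall). Mirrors the deleted Fortran
    remove_small_cover: if every nonzero PFT is small, the whole
    landunit is dropped; otherwise only the small entries are zeroed.
    """
    is_zero = [p == 0.0 for p in pct_p2g]
    is_small = [0.0 < p < too_small for p in pct_p2g]
    nsmall = sum(is_small)
    if nsmall == 0:
        return list(pct_p2g), 0
    if all(z or s for z, s in zip(is_zero, is_small)):
        # All PFTs are zero or small: the landunit area goes to zero
        # (the Fortran keeps pct_p2l untouched so it still sums to 100)
        return [0.0] * len(pct_p2g), nsmall
    # Some big PFTs remain: zero only the small entries
    return [0.0 if s else p for p, s in zip(pct_p2g, is_small)], nsmall
```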
-
- ! ========================================================================
- ! Private member functions
- ! ========================================================================
-
- !-----------------------------------------------------------------------
- subroutine convert_from_p2g(this, pct_p2g, default_pct_p2l)
- !
- ! !DESCRIPTION:
- ! Given a p2g array, compute the p2l array and l2g
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(inout) :: this
- real(r8), intent(in) :: pct_p2g(:) ! % of each pft on the grid cell
- real(r8), intent(in) :: default_pct_p2l(:) ! default % of each pft on the landunit, used if total landunit area is 0%
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'convert_from_p2g'
- !-----------------------------------------------------------------------
-
- ! Check pre-conditions
-
- if (size(pct_p2g) /= size(this%pct_p2l) .or. size(default_pct_p2l) /= size(this%pct_p2l)) then
- write(6,*) subname//' ERROR: array size mismatch: '
- write(6,*) size(pct_p2g), size(default_pct_p2l), size(this%pct_p2l)
- call abort()
- end if
-
- if (abs(sum(default_pct_p2l) - 100._r8) > tol) then
- write(6,*) subname//' ERROR: default_pct_p2l must sum to 100'
- call abort()
- end if
-
- if (any(pct_p2g < 0._r8)) then
- write(6,*) subname//' ERROR: negative values found in pct_p2g array'
- write(6,*) pct_p2g
- call abort()
- end if
-
- if (sum(pct_p2g) < 0._r8 .or. sum(pct_p2g) > (100._r8 + tol)) then
- write(6,*) subname//' ERROR: pct_p2g must be between 0 and 100'
- write(6,*) 'sum(pct_p2g) = ', sum(pct_p2g)
- call abort()
- end if
-
- ! Done checking pre-conditions
-
- this%pct_l2g = sum(pct_p2g)
- if (this%pct_l2g > 0._r8) then
- this%pct_p2l = pct_p2g / this%pct_l2g * 100._r8
- else
- this%pct_p2l = default_pct_p2l
- end if
-
- ! Check post-conditions
-
- call this%check_vals(subname)
-
- end subroutine convert_from_p2g
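The conversion deleted above splits a %-of-gridcell array into a landunit fraction plus a %-of-landunit composition. A minimal Python sketch of that arithmetic (the elaborate Fortran pre-condition checks are reduced to asserts, and `tol` is an assumed tolerance):

```python
def convert_from_p2g(pct_p2g, default_pct_p2l, tol=1e-12):
    """Compute pct_l2g and pct_p2l from %-of-gridcell PFT cover.

    Sketch of the deleted Fortran convert_from_p2g: the landunit
    cover is the sum, and the within-landunit composition is the
    renormalized input, falling back to a default when the landunit
    has zero area so that sum(pct_p2l) stays at 100%.
    """
    assert len(pct_p2g) == len(default_pct_p2l)
    assert all(p >= 0.0 for p in pct_p2g)
    assert abs(sum(default_pct_p2l) - 100.0) <= tol
    pct_l2g = sum(pct_p2g)
    if pct_l2g > 0.0:
        pct_p2l = [p / pct_l2g * 100.0 for p in pct_p2g]
    else:
        # Zero-area landunit: use the default composition
        pct_p2l = list(default_pct_p2l)
    return pct_l2g, pct_p2l
```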
-
-
- !-----------------------------------------------------------------------
- subroutine check_vals(this, caller)
- !
- ! !DESCRIPTION:
- ! Perform a sanity check after setting values
- !
- ! !ARGUMENTS:
- class(pct_pft_type), intent(in) :: this
- character(len=*), intent(in) :: caller ! name of the calling subroutine
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'check_vals'
- !-----------------------------------------------------------------------
-
- if (abs(sum(this%pct_p2l) - 100._r8) > tol) then
- write(6,*) subname//' ERROR from ', caller, ': pct_p2l does not sum to 100'
- write(6,*) 'sum(this%pct_p2l) = ', sum(this%pct_p2l)
- call abort()
- end if
-
- if (any(this%pct_p2l < 0._r8)) then
- write(6,*) subname//' ERROR from ', caller, ': negative values found in pct_p2l'
- write(6,*) this%pct_p2l
- call abort()
- end if
-
- if (this%pct_l2g < 0._r8 .or. this%pct_l2g > (100._r8 + tol)) then
- write(6,*) subname//' ERROR from ', caller, ': pct_l2g must be between 0 and 100'
- write(6,*) 'pct_l2g = ', this%pct_l2g
- call abort()
- end if
-
- end subroutine check_vals
-
- ! ========================================================================
- ! Module-level routines (not member functions)
- ! ========================================================================
-
- !-----------------------------------------------------------------------
- subroutine update_max_array(pct_pft_max_arr,pct_pft_arr)
- !
- ! !DESCRIPTION:
- ! Given an array of pct_pft_type variables, update the running maximum
- ! pct_l2g and pct_p2l values held in pct_pft_max_arr.
- !
- ! Assumes that all elements of pct_pft_max_arr and pct_pft_arr have the same
- ! size and lower bound for their pct_p2l array.
- !
- ! !ARGUMENTS:
- ! workaround for gfortran bug (58043): declare this 'type' rather than 'class':
- type(pct_pft_type), intent(inout) :: pct_pft_max_arr(:)
- type(pct_pft_type), intent(in) :: pct_pft_arr(:)
- !
- ! !LOCAL VARIABLES:
- integer :: pft_lbound
- integer :: pft_ubound
- integer :: arr_index
- integer :: pft_index
-
- character(len=*), parameter :: subname = 'update_max_array'
- !-----------------------------------------------------------------------
-
-
- pft_lbound = lbound(pct_pft_arr(1)%pct_p2l, 1)
- pft_ubound = ubound(pct_pft_arr(1)%pct_p2l, 1)
-
- do arr_index = 1, size(pct_pft_arr)
- if (lbound(pct_pft_arr(arr_index)%pct_p2l, 1) /= pft_lbound .or. &
- ubound(pct_pft_arr(arr_index)%pct_p2l, 1) /= pft_ubound) then
- write(6,*) subname//' ERROR: all elements of pct_pft_arr must have'
- write(6,*) 'the same size and lower bound for their pct_p2l array'
- call abort()
- end if
-
- if (pct_pft_arr(arr_index)%pct_l2g > pct_pft_max_arr(arr_index)%pct_l2g) then
- pct_pft_max_arr(arr_index)%pct_l2g = pct_pft_arr(arr_index)%pct_l2g
- end if
-
- do pft_index = pft_lbound, pft_ubound
- if (pct_pft_arr(arr_index)%pct_p2l(pft_index) > pct_pft_max_arr(arr_index)%pct_p2l(pft_index)) then
- pct_pft_max_arr(arr_index)%pct_p2l(pft_index) = pct_pft_arr(arr_index)%pct_p2l(pft_index)
- end if
- end do
- end do
-
- end subroutine update_max_array
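The routine deleted above is an element-wise running maximum over an array of pct_pft_type values. A hedged Python sketch, representing each pct_pft_type as a dict with a scalar 'pct_l2g' and a list 'pct_p2l' (a simplification, not the Fortran derived type):

```python
def update_max_array(max_arr, arr):
    """Track the running maximum of pct_l2g and each pct_p2l entry.

    Sketch of the deleted Fortran update_max_array: both arrays must
    be conformable, and every pct_p2l array must have the same size.
    """
    for mx, cur in zip(max_arr, arr):
        if len(mx['pct_p2l']) != len(cur['pct_p2l']):
            raise ValueError('all elements must have the same pct_p2l size')
        # Running maximum of the landunit fraction...
        mx['pct_l2g'] = max(mx['pct_l2g'], cur['pct_l2g'])
        # ...and of each within-landunit PFT percentage
        mx['pct_p2l'] = [max(a, b)
                         for a, b in zip(mx['pct_p2l'], cur['pct_p2l'])]
```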
-
- !-----------------------------------------------------------------------
- function get_pct_p2l_array(pct_pft_arr) result(pct_p2l)
- !
- ! !DESCRIPTION:
- ! Given an array of pct_pft_type variables, return a 2-d array of pct_p2l.
- !
- ! Assumes that all elements of pct_pft_arr have the same size and lower bound for
- ! their pct_p2l array.
- !
- ! !ARGUMENTS:
- real(r8), allocatable :: pct_p2l(:,:) ! function result (n_elements, n_pfts)
- ! workaround for gfortran bug (58043): declare this 'type' rather than 'class':
- type(pct_pft_type), intent(in) :: pct_pft_arr(:)
- !
- ! !LOCAL VARIABLES:
- integer :: pft_lbound
- integer :: pft_ubound
- integer :: arr_index
- integer :: pft_index
-
- character(len=*), parameter :: subname = 'get_pct_p2l_array'
- !-----------------------------------------------------------------------
-
- pft_lbound = lbound(pct_pft_arr(1)%pct_p2l, 1)
- pft_ubound = ubound(pct_pft_arr(1)%pct_p2l, 1)
-
- allocate(pct_p2l(size(pct_pft_arr), pft_lbound:pft_ubound))
-
- do arr_index = 1, size(pct_pft_arr)
- if (lbound(pct_pft_arr(arr_index)%pct_p2l, 1) /= pft_lbound .or. &
- ubound(pct_pft_arr(arr_index)%pct_p2l, 1) /= pft_ubound) then
- write(6,*) subname//' ERROR: all elements of pct_pft_arr must have'
- write(6,*) 'the same size and lower bound for their pct_p2l array'
- call abort()
- end if
-
- do pft_index = pft_lbound, pft_ubound
- pct_p2l(arr_index, pft_index) = pct_pft_arr(arr_index)%pct_p2l(pft_index)
- end do
- end do
-
- end function get_pct_p2l_array
-
- !-----------------------------------------------------------------------
- function get_pct_l2g_array(pct_pft_arr) result(pct_l2g)
- !
- ! !DESCRIPTION:
- ! Given an array of pct_pft_type variables, return an array of pct_l2g.
- !
- ! !ARGUMENTS:
- real(r8), allocatable :: pct_l2g(:) ! function result
- class(pct_pft_type), intent(in) :: pct_pft_arr(:)
- !
- ! !LOCAL VARIABLES:
- integer :: arr_index
-
- character(len=*), parameter :: subname = 'get_pct_l2g_array'
- !-----------------------------------------------------------------------
-
- allocate(pct_l2g(size(pct_pft_arr)))
- pct_l2g = pct_pft_arr(:)%pct_l2g
-
- end function get_pct_l2g_array
-
-
-end module mkpctPftTypeMod
diff --git a/tools/mksurfdata_map/src/mkpeatMod.F90 b/tools/mksurfdata_map/src/mkpeatMod.F90
deleted file mode 100644
index 974566a056..0000000000
--- a/tools/mksurfdata_map/src/mkpeatMod.F90
+++ /dev/null
@@ -1,149 +0,0 @@
-module mkpeatMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkpeatMod
-!
-! !DESCRIPTION:
-! make fraction peat from input peat data
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!-----------------------------------------------------------------------
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public mkpeat ! regrid peat data
-!
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkpeat
-!
-! !INTERFACE:
-subroutine mkpeat(ldomain, mapfname, datfname, ndiag, peat_o)
-!
-! !DESCRIPTION:
-! make peat
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_area
- use mkchecksMod, only : min_bad, max_bad
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: peat_o(:) ! output grid: fraction peat
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: data_i(:) ! data on input grid
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
-
- real(r8), parameter :: min_valid = 0._r8 ! minimum valid value
- real(r8), parameter :: max_valid = 100.000001_r8 ! maximum valid value
- character(len=32) :: subname = 'mkpeat'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make peat .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! -----------------------------------------------------------------
-
- call domain_read( tdomain, datfname )
-
- call gridmap_mapread( tgridmap, mapfname )
-
- ! Obtain frac_dst
- allocate(frac_dst(ldomain%ns), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- allocate(mask_r8(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! -----------------------------------------------------------------
-
- write(6,*)'Open peat file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(data_i(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
-
- ! -----------------------------------------------------------------
- ! Regrid peat
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'peatf', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, peat_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(peat_o, min_valid, 'peat') .or. &
- max_bad(peat_o, max_valid, 'peat')) then
- stop
- end if
-
- call output_diagnostics_area(data_i, peat_o, tgridmap, "Peat", percent=.false., ndiag=ndiag, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! -----------------------------------------------------------------
- ! Close files and deallocate dynamic memory
- ! -----------------------------------------------------------------
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (data_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
-
- write (6,*) 'Successfully made peat'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkpeat
-
-
-end module mkpeatMod
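The core of the deleted mkpeat routine is a masked, area-weighted regridding (gridmap_areaave_srcmask). A hypothetical Python sketch of that operation, with `weights` as a list of (src, dst, w) overlap entries; this is an illustration of the concept, not the actual mkgridmapMod implementation:

```python
def areaave_srcmask(weights, data_src, mask_src, n_dst, nodata=0.0):
    """Masked area-weighted average from source to destination cells.

    Masked-out source cells contribute nothing; destinations with no
    overlapping unmasked source area get `nodata`.
    """
    num = [0.0] * n_dst
    den = [0.0] * n_dst
    for src, dst, w in weights:
        if mask_src[src]:
            num[dst] += w * data_src[src]
            den[dst] += w
    return [num[i] / den[i] if den[i] > 0.0 else nodata
            for i in range(n_dst)]
```

After the regrid, mkpeat range-checks the result against [0, ~100] before writing diagnostics, which the sketch leaves out.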
diff --git a/tools/mksurfdata_map/src/mkpftConstantsMod.F90 b/tools/mksurfdata_map/src/mkpftConstantsMod.F90
deleted file mode 100644
index 241873c339..0000000000
--- a/tools/mksurfdata_map/src/mkpftConstantsMod.F90
+++ /dev/null
@@ -1,43 +0,0 @@
-module mkpftConstantsMod
- !-----------------------------------------------------------------------
- !BOP
- !
- ! !MODULE: mkpftConstants
- !
- ! !DESCRIPTION:
- ! Constants used by mkpft and related code
- !
- ! !REVISION HISTORY:
- ! Author: Bill Sacks
- !
- !-----------------------------------------------------------------------
- !!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- !
- ! !PUBLIC DATA MEMBERS:
- !
-
- integer, parameter, public :: maxpft = 78 ! maximum # of PFTs
-
- integer, public :: num_natpft = -1 ! number of PFTs on the natural vegetation
- ! landunit, NOT including bare ground
- ! (includes generic crops for runs with
- ! create_crop_landunit=false)
-
- integer, public :: num_cft ! number of CFTs on the crop landunit
- integer, public :: natpft_lb ! lower bound for natural pft arrays
- integer, public :: natpft_ub ! upper bound for natural pft arrays
- integer, public :: cft_lb ! lower bound for cft arrays
- integer, public :: cft_ub ! upper bound for cft arrays
-
- integer, parameter, public :: baregroundindex = 0 ! index of bare ground in a natural pft array
-
- ! The following is NOT set as a parameter so that it can be overridden in unit tests
- integer, public :: c3cropindex = 15
- integer, public :: c3irrcropindex = 16
-
-end module mkpftConstantsMod
diff --git a/tools/mksurfdata_map/src/mkpftMod.F90 b/tools/mksurfdata_map/src/mkpftMod.F90
deleted file mode 100644
index 3a12c38cdf..0000000000
--- a/tools/mksurfdata_map/src/mkpftMod.F90
+++ /dev/null
@@ -1,1259 +0,0 @@
-module mkpftMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkpft
-!
-! !DESCRIPTION:
-! Make PFT data
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!-----------------------------------------------------------------------
-!!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkvarpar , only : noveg
- use mkvarctl , only : numpft
- use mkdomainMod , only : domain_checksame
- use mkpftConstantsMod
-
- implicit none
-
- private ! By default make data private
-!
-! !PUBLIC MEMBER FUNCTIONS:
-!
- public mkpftInit ! Initialization
- public mkpft ! Set PFT
- public mkpft_parse_oride ! Parse the string with PFT fraction/index info to override
- public mkpftAtt ! Write out attributes to output file on pft
-!
-! !PUBLIC DATA MEMBERS:
-!
-
- !
- ! When pft_idx and pft_frc are set, they must be set together, and they will cause the
- ! entire area to be covered with vegetation and zero out other landunits.
- ! The sum of pft_frc must = 100%, and each pft_idx point in the array corresponds to
- ! the fraction in pft_frc. Only the first few points are used until pft_frc = 0.0.
- !
- integer :: m ! index
- integer, public :: pft_idx(0:maxpft) = & ! PFT vegetation index to override with
- (/ ( -1, m = 0, maxpft ) /)
- real(r8), public :: pft_frc(0:maxpft) = & ! PFT vegetation fraction to override with
- (/ ( 0.0_r8, m = 0, maxpft ) /)
-!
-! !PRIVATE DATA MEMBERS:
-!
- logical, public, protected :: use_input_pft = .false. ! Flag to override PFT with input values
- logical, public, protected :: presc_cover = .false. ! Flag to prescribe vegetation coverage
- integer, private :: nzero ! index of first zero fraction
-
- type, public :: pft_oride ! Public only for unit testing
- real(r8) :: crop ! Percent covered by crops
- real(r8) :: natveg ! Percent covered by natural vegetation
- real(r8), allocatable :: natpft(:) ! Percent of each natural PFT within the natural veg landunit
- real(r8), allocatable :: cft(:) ! Percent of each crop CFT within the crop landunit
- contains
- procedure, public :: InitZeroOut ! Initialize the PFT override object to zero out all vegetation
- procedure, public :: InitAllPFTIndex ! Initialize the PFT override object with PFT indices for all veg and crop types
- procedure, public :: Clean ! Clean up a PFT Override object
- end type pft_oride
-
- interface pft_oride
- module procedure :: constructor ! PFT Override object constructor
- end interface pft_oride
-
- type(pft_oride), private :: pft_override ! Module instance of PFT override object
- ! Used for both zeroing out PFT's as well
- ! as setting specified PFT's over the gridcell
-!
-! !PRIVATE MEMBER FUNCTIONS:
-!
- private :: mkpft_check_oride ! Check the pft_frc and pft_idx values for correctness
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkpftInit
-!
-! !INTERFACE:
-subroutine mkpftInit( zero_out_l, all_veg_l )
-!
-! !DESCRIPTION:
- ! Initialization of Make PFT data
-! !USES:
- use mkvarpar, only : numstdpft, numstdcft
-!
-! !ARGUMENTS:
- implicit none
- logical, intent(IN) :: zero_out_l ! If veg should be zero'ed out
- logical, intent(IN) :: all_veg_l ! If should zero out other fractions so that
- ! all land-cover is vegetation
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8), parameter :: hndrd = 100.0_r8 ! A hundred percent
- character(len=32) :: subname = 'mkpftMod::mkpftInit() '
- logical :: error_happened ! If an error was triggered so should return
-!-----------------------------------------------------------------------
- write (6, '(a, a, a)') "In ", trim(subname), "..."
- if ( maxpft < numpft ) then
- write(6,*) subname//'number of PFTs is > max allowed!'
- call abort()
- return
- end if
- nzero = -1
- call mkpft_check_oride( error_happened )
- if ( error_happened )then
- write(6,*) subname//'Problem setting pft override settings'
- return
- end if
- if ( zero_out_l .and. use_input_pft )then
- write(6,*) subname//"trying to both zero out all PFT's as well as set them to specific values"
- call abort()
- return
- end if
- ! If zeroing out, set use_input_pft to true so the pft_override will be used
- if( zero_out_l )then
- nzero = 0
- pft_frc(0) = 0.0_r8
- pft_idx(0) = noveg
- use_input_pft = .true.
- end if
- if ( use_input_pft ) then
- write(6,*) 'Set PFT fraction to : ', pft_frc(0:nzero)
- write(6,*) 'With PFT index : ', pft_idx(0:nzero)
- end if
- if ( all_veg_l .and. .not. use_input_pft )then
- write(6,*) subname//'if all_veg is set to true then specified PFT indices must be provided (i.e. pft_frc and pft_idx)'
- call abort()
- return
- end if
-
- if ( zero_out_l .and. all_veg_l )then
- write(6,*) subname//'zeroing out vegetation and setting vegetation to 100% is a contradiction!'
- call abort()
- return
- end if
-
- ! Determine number of PFTs on the natural vegetation landunit, and number of CFTs on
- ! the crop landunit.
- !
- ! For the sake of dynamic PFTs and dynamic landunits, it helps for the structure of the
- ! surface dataset to reflect the subgrid structure that will be used by CLM. Currently
- ! generic crops will always go on the crop landunit, regardless of whether or not we're
- ! using the extra specific crops (so we always run CLM with create_crop_landunit=.true.).
- ! When we create a surface dataset WITH the extra specific crops, all crops
- ! (including the generic crops) again go on the crop landunit.
-
- num_natpft = numstdpft - numstdcft
- num_cft = numpft - num_natpft
-
- ! Determine array bounds for arrays of just natural pfts and just crops. Note that
- ! these are set up so that they always span 0:numpft, so that there is a 1:1
- ! correspondence between an element in a full 0:numpft array and an element with the
- ! same index in either a natpft array or a cft array.
- natpft_lb = noveg
- natpft_ub = num_natpft
- cft_lb = num_natpft+1
- cft_ub = cft_lb + num_cft - 1
-
- ! Make sure the array indices have been set up properly, to ensure the 1:1
- ! correspondence mentioned above
- if (cft_ub /= numpft) then
- write(6,*) 'CFT_UB set up incorrectly: cft_ub, numpft = ', cft_ub, numpft
- call abort()
- return
- end if
- !
- ! Set the PFT override values if applicable
- !
- pft_override = pft_oride()
- presc_cover = .false.
- if( zero_out_l )then
- call pft_override%InitZeroOut()
- presc_cover = .true.
- else if ( use_input_pft ) then
- call pft_override%InitAllPFTIndex()
- if ( .not. all_veg_l )then
- if ( pft_override%crop <= 0.0 )then
- write(6,*) "Warning: PFT/CFT's are being overridden, but no crop type is being asked for"
- end if
- if ( pft_override%natveg <= 0.0 )then
- write(6,*) "Warning: PFT/CFT's are being overridden, but no natural vegetation type is being asked for"
- end if
- presc_cover = .false.
- else
- presc_cover = .true.
- end if
- end if
-
-end subroutine mkpftInit
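The index-bound bookkeeping in the deleted mkpftInit (natpft_lb/natpft_ub and cft_lb/cft_ub spanning 0:numpft) can be sketched in Python. The defaults for numstdpft, numstdcft, and noveg below are the usual CLM values but are assumptions here, not read from mkvarpar:

```python
def pft_bounds(numpft, numstdpft=16, numstdcft=2, noveg=0):
    """Compute natural-PFT and CFT array bounds as in the deleted
    mkpftInit, preserving the 1:1 correspondence between an index in
    a full 0:numpft array and the same index in a natpft or cft array.
    """
    num_natpft = numstdpft - numstdcft  # natural PFTs, excl. bare ground
    num_cft = numpft - num_natpft       # everything else is a crop CFT
    natpft_lb = noveg                   # bare ground starts the range
    natpft_ub = num_natpft
    cft_lb = num_natpft + 1
    cft_ub = cft_lb + num_cft - 1
    assert cft_ub == numpft             # sanity check from the Fortran
    return (natpft_lb, natpft_ub), (cft_lb, cft_ub)
```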
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkpft
-!
-! !INTERFACE:
-subroutine mkpft(ldomain, mapfname, fpft, ndiag, &
- pctlnd_o, pctnatpft_o, pctcft_o)
-!
-! !DESCRIPTION:
-! Make PFT data
-!
-! This dataset consists of the %cover of the [numpft]+1 PFTs used by
-! the model. The input %cover pertains to the "vegetated" portion of the
-! grid cell and sums to 100. The real portion of each grid cell
-! covered by each PFT is the PFT cover times the fraction of the
-! grid cell that is land. This is the quantity preserved when
- ! area-averaging from the input (1/2 degree) grid to the model's grid.
-!
-! Upon return from this routine, the % cover of the natural veg + crop landunits is
-! generally 100% everywhere; this will be normalized later to account for special landunits.
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
- use mkpctPftTypeMod, only : pct_pft_type
- use mkpftConstantsMod, only : natpft_lb, natpft_ub, num_cft, cft_lb, cft_ub
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: fpft ! input pft dataset file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: pctlnd_o(:) ! output grid:%land/gridcell
- type(pct_pft_type), intent(out):: pctnatpft_o(:) ! natural PFT cover
- type(pct_pft_type), intent(out):: pctcft_o(:) ! crop (CFT) cover
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(pct_pft_type), allocatable:: pctnatpft_i(:) ! input grid: natural PFT cover
- type(pct_pft_type), allocatable:: pctcft_i(:) ! input grid: crop (CFT) cover
- type(domain_type) :: tdomain ! local domain
- type(gridmap_type) :: tgridmap ! local gridmap
- real(r8), allocatable :: pctpft_i(:,:) ! input grid: PFT percent
- real(r8), allocatable :: pctpft_o(:,:) ! output grid: PFT percent (% of grid cell)
- real(r8), allocatable :: pctnatveg_i(:) ! input grid: natural veg percent (% of grid cell)
- real(r8), allocatable :: pctnatveg_o(:) ! output grid: natural veg percent (% of grid cell)
- real(r8), allocatable :: pctcrop_i(:) ! input grid: all crop percent (% of grid cell)
- real(r8), allocatable :: pctcrop_o(:) ! output grid: all crop percent (% of grid cell)
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: pct_cft_i(:,:) ! input grid: CFT (Crop Functional Type) percent (% of landunit cell)
- real(r8), allocatable :: temp_i(:,:) ! input grid: temporary 2D variable to read in
- real(r8), allocatable :: pct_cft_o(:,:) ! output grid: CFT (Crop Functional Type) percent (% of landunit cell)
- real(r8), allocatable :: pct_nat_pft_i(:,:) ! input grid: natural PFT percent (% of landunit cell)
- real(r8), allocatable :: pct_nat_pft_o(:,:) ! output grid: natural PFT percent (% of landunit cell)
- integer :: numpft_i ! num of plant types input data
- integer :: natpft_i ! num of natural plant types input data
- integer :: ncft_i ! num of crop types input data
- real(r8) :: sum_fldo ! global sum of dummy output fld
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: wst_sum ! sum of %pft
- real(r8), allocatable :: gpft_o(:) ! output grid: global area pfts
- real(r8) :: garea_o ! output grid: global area
- real(r8), allocatable :: gpft_i(:) ! input grid: global area pfts
- real(r8) :: garea_i ! input grid: global area
- integer :: k,n,m,ni,no,ns_i,ns_o ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ndims ! number of dimensions for a variable on the file
- integer :: dimlens(3) ! dimension lengths for a variable on the file
- integer :: ier ! error status
- real(r8) :: relerr = 0.0001_r8 ! max error: sum overlap wts ne 1
- logical :: oldformat ! if input file is in the old format or not (based on what variables exist)
- logical :: error_happened ! If an error was triggered so should return
-
- character(len=35) veg(0:maxpft) ! vegetation types
- character(len=32) :: subname = 'mkpftMod::mkpft()'
-!-----------------------------------------------------------------------
-
- write (6,*)
- write (6, '(a, a, a)') "In ", trim(subname), "..."
- write (6,*) 'Attempting to make PFTs .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Set the vegetation types
- ! -----------------------------------------------------------------
- if ( numpft >= numstdpft )then
- veg(0:maxpft) = (/ &
- 'not vegetated ', &
- 'needleleaf evergreen temperate tree', &
- 'needleleaf evergreen boreal tree ', &
- 'needleleaf deciduous boreal tree ', &
- 'broadleaf evergreen tropical tree ', &
- 'broadleaf evergreen temperate tree ', &
- 'broadleaf deciduous tropical tree ', &
- 'broadleaf deciduous temperate tree ', &
- 'broadleaf deciduous boreal tree ', &
- 'broadleaf evergreen shrub ', &
- 'broadleaf deciduous temperate shrub', &
- 'broadleaf deciduous boreal shrub ', &
- 'c3 arctic grass ', &
- 'c3 non-arctic grass ', &
- 'c4 grass ', &
- 'c3_crop ', &
- 'c3_irrigated ', &
- 'temperate_corn ', &
- 'irrigated_temperate_corn ', &
- 'spring_wheat ', &
- 'irrigated_spring_wheat ', &
- 'winter_wheat ', &
- 'irrigated_winter_wheat ', &
- 'temperate_soybean ', &
- 'irrigated_temperate_soybean ', &
- 'barley ', &
- 'irrigated_barley ', &
- 'winter_barley ', &
- 'irrigated_winter_barley ', &
- 'rye ', &
- 'irrigated_rye ', &
- 'winter_rye ', &
- 'irrigated_winter_rye ', &
- 'cassava ', &
- 'irrigated_cassava ', &
- 'citrus ', &
- 'irrigated citrus ', &
- 'cocoa ', &
- 'irrigated_cocoa ', &
- 'coffee ', &
- 'irrigated_coffee ', &
- 'cotton ', &
- 'irrigated_cotton ', &
- 'datepalm ', &
- 'irrigated_datepalm ', &
- 'foddergrass ', &
- 'irrigated_foddergrass ', &
- 'grapes ', &
- 'irrigated_grapes ', &
- 'groundnuts ', &
- 'irrigated_groundnuts ', &
- 'millet ', &
- 'irrigated_millet ', &
- 'oilpalm ', &
- 'irrigated_oilpalm ', &
- 'potatoes ', &
- 'irrigated_potatoes ', &
- 'pulses ', &
- 'irrigated_pulses ', &
- 'rapeseed ', &
- 'irrigated_rapeseed ', &
- 'rice ', &
- 'irrigated_rice ', &
- 'sorghum ', &
- 'irrigated_sorghum ', &
- 'sugarbeet ', &
- 'irrigated_sugarbeet ', &
- 'sugarcane ', &
- 'irrigated_sugarcane ', &
- 'sunflower ', &
- 'irrigated_sunflower ', &
- 'miscanthus ', &
- 'irrigated_miscanthus ', &
- 'switchgrass ', &
- 'irrigated_switchgrass ', &
- 'tropical_corn ', &
- 'irrigated_tropical_corn ', &
- 'tropical_soybean ', &
- 'irrigated_tropical_soybean ' /)
- end if
- if ( numpft == numstdpft )then
- write(6,*)'Creating surface datasets with the standard # of PFTs =', numpft
- else if ( numpft > numstdpft )then
- write(6,*)'Creating surface datasets with extra types for crops; total pfts =', numpft
- else
- write(6,*) subname//': parameter numpft is NOT set to a known value (should be 16 or more) =',numpft
- call abort()
- return
- end if
-
- ns_o = ldomain%ns
-
- ! -----------------------------------------------------------------
- ! Read input PFT file
- ! -----------------------------------------------------------------
- if ( .not. presc_cover ) then
- ! Obtain input grid info, read PCT_PFT
-
- call domain_read(tdomain,fpft)
- ns_i = tdomain%ns
-
- write (6,*) 'Open PFT file: ', trim(fpft)
- call check_ret(nf_open(fpft, 0, ncid), subname)
-
- ! Check what variables exist to determine what format the file is in
- call check_ret(nf_inq_varid (ncid, 'PCT_PFT', varid), subname, varexists=oldformat)
-
- if ( oldformat ) then
- write(6,*) subname//' ERROR: PCT_PFT field is on the file, so it is in the old format, which is no longer supported'
- call abort()
- return
- end if
- call check_ret(nf_inq_dimid (ncid, 'natpft', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, natpft_i), subname)
- call check_ret(nf_inq_dimid (ncid, 'cft', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, ncft_i), subname)
- numpft_i = natpft_i + ncft_i
-
- ! Check if the number of pfts on the input matches the expected number. A mismatch
- ! is okay if the input raw dataset has prognostic crops and the output does not.
- if (numpft_i .ne. numpft+1) then
- if (numpft_i .eq. numstdpft+1) then
- write(6,*) subname//' ERROR: trying to use non-crop input file'
- write(6,*) 'for a surface dataset with crops.'
- call abort()
- return
- else if (numpft_i > numstdpft+1 .and. numpft_i == maxpft+1) then
- write(6,*) subname//' WARNING: using a crop input raw dataset for a non-crop output surface dataset'
- else
- write(6,*) subname//': parameter numpft+1= ',numpft+1, &
- 'does not equal input dataset numpft= ',numpft_i
- call abort()
- return
- end if
- endif
-
-
- ! If file is in the new format, expect the following variables:
- ! PCT_NATVEG, PCT_CROP, PCT_NAT_PFT, PCT_CFT
- allocate(pctnatveg_i(ns_i), &
- pctnatveg_o(ns_o), &
- pctcrop_i(ns_i), &
- pctcrop_o(ns_o), &
- frac_dst(ns_o), &
- pct_cft_i(ns_i,1:num_cft), &
- pct_cft_o(ns_o,1:num_cft), &
- pct_nat_pft_i(ns_i,0:num_natpft), &
- pct_nat_pft_o(ns_o,0:num_natpft), &
- stat=ier)
- if (ier/=0)then
- call abort()
- return
- end if
-
- call check_ret(nf_inq_varid (ncid, 'PCT_NATVEG', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, pctnatveg_i), subname)
- call check_ret(nf_inq_varid (ncid, 'PCT_CROP', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, pctcrop_i), subname)
- if ( .not. use_input_pft )then
- call check_ret(nf_inq_varid (ncid, 'PCT_CFT', varid), subname)
- call get_dim_lengths(ncid, 'PCT_CFT', ndims, dimlens(:) )
- if ( ndims == 3 .and. dimlens(1)*dimlens(2) == ns_i .and. dimlens(3) == num_cft )then
- call check_ret(nf_get_var_double (ncid, varid, pct_cft_i), subname)
- else if ( ndims == 3 .and. dimlens(1)*dimlens(2) == ns_i .and. dimlens(3) > num_cft )then
- ! Read in the whole array: then sum the rainfed and irrigated
- ! separately
- allocate( temp_i(ns_i,dimlens(3)) )
- call check_ret(nf_get_var_double (ncid, varid, temp_i), subname)
- do n = 1, num_cft
- pct_cft_i(:,n) = 0.0_r8
- do m = n, dimlens(3), 2
- pct_cft_i(:,n) = pct_cft_i(:,n) + temp_i(:,m)
- end do
- end do
- deallocate( temp_i )
- else
- write(6,*) subname//': ERROR: dimensions for PCT_CFT are NOT what is expected'
- call abort()
- return
- end if
- call check_ret(nf_inq_varid (ncid, 'PCT_NAT_PFT', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, pct_nat_pft_i), subname)
- end if
-
- call check_ret(nf_close(ncid), subname)
-
- ! -----------------------------------------------------------------
- ! Otherwise if vegetation is prescribed everywhere
- ! -----------------------------------------------------------------
- else
- ns_i = 1
- numpft_i = numpft+1
- allocate(pctnatveg_i(ns_i), &
- pctnatveg_o(ns_o), &
- pctcrop_i(ns_i), &
- pctcrop_o(ns_o), &
- pct_cft_i(ns_i,1:num_cft), &
- pct_cft_o(ns_o,1:num_cft), &
- pct_nat_pft_i(ns_i,0:num_natpft), &
- pct_nat_pft_o(ns_o,0:num_natpft), &
- stat=ier)
- if (ier/=0)then
- call abort()
- return
- end if
- end if
- allocate(pctpft_i(ns_i,0:(numpft_i-1)), &
- pctpft_o(ns_o,0:(numpft_i-1)), &
- pctnatpft_i(ns_i), &
- pctcft_i(ns_i), &
- stat=ier)
- if (ier/=0)then
- call abort()
- return
- end if
-
- ! Determine pctpft_o on output grid
-
- ! If total vegetation cover is prescribed from input...
- if ( use_input_pft .and. presc_cover ) then
-
- do no = 1,ns_o
- pctlnd_o(no) = 100._r8
- pctnatveg_o(no) = pft_override%natveg
- pctcrop_o(no) = pft_override%crop
- end do
-
-    ! otherwise, if total cover isn't prescribed, read it from the datasets
- else
-
- ! Compute pctlnd_o, pctpft_o
-
- call gridmap_mapread(tgridmap, mapfname)
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
- ! Area-average percent cover on input grid [pctpft_i] to output grid
-       ! [pctpft_o] and correct [pctpft_o] according to the land mask
- ! Note that percent cover is in terms of total grid area.
- pctlnd_o(:) = frac_dst(:) * 100._r8
-
- ! New format with extra variables on input
- call gridmap_areaave_srcmask(tgridmap, pctnatveg_i, pctnatveg_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, pctcrop_i, pctcrop_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- !
- ! If specific PFT/CFT's are NOT prescribed set them from the input file
- !
- if ( .not. use_input_pft )then
- do m = 0, num_natpft
- call gridmap_areaave_scs(tgridmap, pct_nat_pft_i(:,m), &
- pct_nat_pft_o(:,m), nodata=0._r8, &
- src_wt=pctnatveg_i*0.01_r8*tdomain%mask, &
- dst_wt=pctnatveg_o*0.01_r8, frac_dst=frac_dst)
- do no = 1,ns_o
- if (pctlnd_o(no) < 1.0e-6 .or. pctnatveg_o(no) < 1.0e-6) then
- if (m == 0) then
- pct_nat_pft_o(no,m) = 100._r8
- else
- pct_nat_pft_o(no,m) = 0._r8
- endif
- end if
- enddo
- end do
- do m = 1, num_cft
- call gridmap_areaave_scs(tgridmap, pct_cft_i(:,m), pct_cft_o(:,m), &
- nodata=0._r8, src_wt=pctcrop_i*0.01_r8*tdomain%mask, &
- dst_wt=pctcrop_o*0.01_r8, frac_dst=frac_dst)
- do no = 1,ns_o
- if (pctlnd_o(no) < 1.0e-6 .or. pctcrop_o(no) < 1.0e-6) then
- if (m == 1) then
- pct_cft_o(no,m) = 100._r8
- else
- pct_cft_o(no,m) = 0._r8
- endif
- end if
- enddo
- end do
-       ! Otherwise do some error checking to make sure specific veg types are given where natural veg and crop are assigned
- else
- do no = 1,ns_o
- if (pctlnd_o(no) > 1.0e-6 .and. pctnatveg_o(no) > 1.0e-6) then
- if ( pft_override%natveg <= 0.0_r8 )then
- write(6,*) subname//': ERROR: no natural vegetation PFTs are being prescribed but there are natural '// &
- 'vegetation areas: provide at least one natural veg PFT'
- call abort()
- return
- end if
- end if
- if (pctlnd_o(no) > 1.0e-6 .and. pctcrop_o(no) > 1.0e-6) then
- if ( pft_override%crop <= 0.0_r8 )then
- write(6,*) subname//': ERROR: no crop CFTs are being prescribed but there are crop areas: provide at least one CFT'
- call abort()
- return
- end if
- end if
- end do
- end if
- end if
-
- !
- ! If specific PFT/CFT's are prescribed set them directly
- !
- if ( use_input_pft )then
- do no = 1,ns_o
- if (pctlnd_o(no) > 1.0e-6 .and. pctnatveg_o(no) > 1.0e-6) then
- pct_nat_pft_o(no,noveg:num_natpft) = pft_override%natpft(noveg:num_natpft)
- else
- pct_nat_pft_o(no,noveg) = 100._r8
- pct_nat_pft_o(no,noveg+1:) = 0._r8
- end if
- if (pctlnd_o(no) > 1.0e-6 .and. pctcrop_o(no) > 1.0e-6) then
- pct_cft_o(no,1:num_cft) = pft_override%cft(1:num_cft)
- else
- pct_cft_o(no,1) = 100._r8
- pct_cft_o(no,2:) = 0._r8
- end if
- pctpft_o(no,natpft_lb:natpft_ub) = pct_nat_pft_o(no,0:num_natpft)
- pctpft_o(no,cft_lb:cft_ub) = pct_cft_o(no,1:num_cft)
- end do
- end if
-
-
- ! Error check: percents should sum to 100 for land grid cells, within roundoff
- ! Also correct sums so that if they differ slightly from 100, they are corrected to
- ! equal 100 more exactly.
-
- do no = 1,ns_o
- wst_sum = 0.
- do m = 0, num_natpft
- wst_sum = wst_sum + pct_nat_pft_o(no,m)
- enddo
- if (abs(wst_sum-100._r8) > relerr) then
- write (6,*) subname//'error: nat pft = ', &
- (pct_nat_pft_o(no,m), m = 0, num_natpft), &
- ' do not sum to 100. at no = ',no,' but to ', wst_sum
- stop
- end if
-
- ! Correct sum so that if it differs slightly from 100, it is corrected to equal
- ! 100 more exactly
- do m = 1, num_natpft
- pct_nat_pft_o(no,m) = pct_nat_pft_o(no,m) * 100._r8 / wst_sum
- end do
-
- wst_sum = 0.
- do m = 1, num_cft
- wst_sum = wst_sum + pct_cft_o(no,m)
- enddo
- if (abs(wst_sum-100._r8) > relerr) then
- write (6,*) subname//'error: crop cft = ', &
- (pct_cft_o(no,m), m = 1, num_cft), &
- ' do not sum to 100. at no = ',no,' but to ', wst_sum
- stop
- end if
-
- ! Correct sum so that if it differs slightly from 100, it is corrected to equal
- ! 100 more exactly
- do m = 1, num_cft
- pct_cft_o(no,m) = pct_cft_o(no,m) * 100._r8 / wst_sum
- end do
-
- end do
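The loop above applies the same check-then-rescale pattern twice (natural PFTs, then CFTs): abort if the sum strays from 100% by more than `relerr`, otherwise rescale so it sums to exactly 100. A small Python sketch of that pattern (illustrative, with an assumed `relerr`):

```python
def renormalize_to_100(pcts, relerr=1.0e-4):
    """Verify a percentage vector sums to ~100 within relerr, then
    rescale it so the sum is exactly 100 (mirrors the wst_sum logic)."""
    total = sum(pcts)
    if abs(total - 100.0) > relerr:
        raise ValueError(f"percentages sum to {total}, not 100")
    return [p * 100.0 / total for p in pcts]
```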
-
- ! Convert % pft as % of grid cell to % pft on the landunit and % of landunit on the
- ! grid cell
- do no = 1,ns_o
- pctnatpft_o(no) = pct_pft_type( pct_nat_pft_o(no,:), pctnatveg_o(no), first_pft_index=natpft_lb )
- pctcft_o(no) = pct_pft_type( pct_cft_o(no,:), pctcrop_o(no), first_pft_index=cft_lb )
- end do
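The conversion above moves between two bookkeeping conventions: per-PFT percent of the whole grid cell (`p2g`) versus percent of the landunit (`p2l`) plus the landunit's percent of the grid cell (`l2g`). A hedged Python sketch of the arithmetic behind `pct_pft_type` (not its actual API, which lives in `mkpctPftTypeMod`):

```python
def split_p2g(pct_p2g):
    """Given per-PFT % of the grid cell, return (landunit % of grid cell,
    per-PFT % of the landunit). A zero-area landunit keeps zero p2l here;
    the real type substitutes a default breakdown in that case."""
    l2g = sum(pct_p2g)
    if l2g > 0.0:
        p2l = [p * 100.0 / l2g for p in pct_p2g]
    else:
        p2l = [0.0] * len(pct_p2g)
    return l2g, p2l
```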
-
- ! -----------------------------------------------------------------
- ! Error check
- ! Compare global areas on input and output grids
- ! Only when you aren't prescribing the vegetation coverage everywhere
- ! If use_input_pft is set this will compare the global coverage of
- ! the prescribed vegetation to the coverage of PFT/CFT's on the input
- ! datasets.
- ! -----------------------------------------------------------------
-
- if ( .not. presc_cover ) then
-
- ! Convert to pctpft over grid if using new format
- do ni = 1, ns_i
- pctnatpft_i(ni) = pct_pft_type( pct_nat_pft_i(ni,:), pctnatveg_i(ni), first_pft_index=natpft_lb )
- pctcft_i(ni) = pct_pft_type( pct_cft_i(ni,:), pctcrop_i(ni), first_pft_index=cft_lb )
- end do
-
- do no = 1,ns_o
- pctpft_o(no,natpft_lb:natpft_ub) = pctnatpft_o(no)%get_pct_p2g()
- pctpft_o(no,cft_lb:cft_ub) = pctcft_o(no)%get_pct_p2g()
- end do
- allocate(gpft_i(0:numpft_i-1))
- allocate(gpft_o(0:numpft_i-1))
-
- ! input grid
-
- gpft_i(:) = 0.
- garea_i = 0.
- do ni = 1,ns_i
- garea_i = garea_i + tgridmap%area_src(ni)*re**2
- do m = 0, numpft_i - 1
- gpft_i(m) = gpft_i(m) + pctpft_i(ni,m)*tgridmap%area_src(ni)*&
- tdomain%mask(ni)*re**2
- end do
- end do
- if ( allocated(pctpft_i) ) deallocate (pctpft_i)
-
- ! output grid
-
- gpft_o(:) = 0.
- garea_o = 0.
- do no = 1,ns_o
- garea_o = garea_o + tgridmap%area_dst(no)*re**2
- do m = 0, numpft_i - 1
- gpft_o(m) = gpft_o(m) + pctpft_o(no,m)*tgridmap%area_dst(no)*&
- frac_dst(no)*re**2
- end do
- end do
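Both accumulation loops above compute the same area-weighted global total per PFT: a sum over cells of percent cover times cell area times a mask (or destination fraction), scaled by the Earth radius squared. A Python sketch of one such sum (the radius value here is an assumption for illustration):

```python
def global_pft_area(pct_pft, cell_area, mask, re_km=6371.22):
    """Area-weighted global total per PFT, mirroring the gpft_i/gpft_o
    sums: sum over cells of pct * area * mask * re**2."""
    npft = len(pct_pft[0])
    totals = [0.0] * npft
    for pct_row, area, msk in zip(pct_pft, cell_area, mask):
        for m in range(npft):
            totals[m] += pct_row[m] * area * msk * re_km ** 2
    return totals
```

Comparing the input-grid and output-grid totals, as the diagnostic table below the loops does, is a cheap conservation check on the remapping.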
-
- ! comparison
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'PFTs Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,1001)
-1001 format (1x,'plant type ',20x,' input grid area',' output grid area',/ &
- 1x,33x,' 10**6 km**2',' 10**6 km**2')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- do m = 0, numpft_i - 1
- write (ndiag,1002) veg(m), gpft_i(m)*1.e-06/100.,gpft_o(m)*1.e-06/100.
- end do
-1002 format (1x,a35,f16.3,f17.3)
- call shr_sys_flush(ndiag)
-
- deallocate(gpft_i, gpft_o, frac_dst)
-
- end if
- deallocate( pctnatpft_i )
- deallocate( pctcft_i )
- deallocate(pctpft_o)
-
-
- ! Deallocate dynamic memory
-
- deallocate(pctnatveg_i)
- deallocate(pctnatveg_o)
- deallocate(pctcrop_i)
- deallocate(pctcrop_o)
- deallocate(pct_cft_i)
- deallocate(pct_cft_o)
- deallocate(pct_nat_pft_i)
- deallocate(pct_nat_pft_o)
- if ( .not. presc_cover ) then
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- end if
-
- write (6,*) 'Successfully made PFTs'
- write (6,*)
-
-
-end subroutine mkpft
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkpft_parse_oride
-!
-! !INTERFACE:
-subroutine mkpft_parse_oride( string )
-!
-! !DESCRIPTION:
-! Parse the string with pft fraction and index information on it, to override
-! the file with this information rather than reading from a file.
-!
-! !USES:
- use shr_string_mod, only: shr_string_betweenTags, shr_string_countChar
-! !ARGUMENTS:
- character(len=256), intent(IN) :: string ! String to parse with PFT fraction
- ! and index data
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: rc ! error return code
- integer :: num_elms ! number of elements
- character(len=256) :: substring ! string between tags
- character(len=*), parameter :: frc_start = "<pft_f>"
- character(len=*), parameter :: frc_end = "</pft_f>"
- character(len=*), parameter :: idx_start = "<pft_i>"
- character(len=*), parameter :: idx_end = "</pft_i>"
- character(len=*), parameter :: subname = 'mkpft_parse_oride'
- !-----------------------------------------------------------------------
-
- ! NOTE(bja, 2015-02) pft_frc and pft_index can be reset multiple
- ! times by calls to this function. If the number of elements being
- ! set is different each time, then we are working with out of date
- ! information, and the sums may not sum to 100%.
- pft_frc = 0.0_r8
- pft_idx = -1
-
- call shr_string_betweenTags( string, frc_start, frc_end, substring, rc )
- if ( rc /= 0 )then
- write(6,*) subname//'Trouble finding pft_frac start end tags'
- call abort()
- return
- end if
- num_elms = shr_string_countChar( substring, ",", rc )
- read(substring,*) pft_frc(0:num_elms)
- call shr_string_betweenTags( string, idx_start, idx_end, substring, rc )
- if ( rc /= 0 )then
- write(6,*) subname//'Trouble finding pft_index start end tags'
- call abort()
- return
- end if
- if ( num_elms /= shr_string_countChar( substring, ",", rc ) )then
- write(6,*) subname//'number of elements different between frc and idx fields'
- call abort()
- return
- end if
- read(substring,*) pft_idx(0:num_elms)
-!-----------------------------------------------------------------------
-
-end subroutine mkpft_parse_oride
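`mkpft_parse_oride` extracts two comma-separated lists from between paired tags in the override string (`shr_string_betweenTags`), counts the commas to size them, and requires both lists to have the same length. A Python sketch of the same parse, where the `<pft_f>`/`<pft_i>` tag names are an assumption based on the `pft_frac`/`pft_index` error messages:

```python
import re

def parse_oride(string, tag):
    """Pull the comma-separated numeric list between <tag>...</tag>
    markers, analogous to shr_string_betweenTags plus the list read."""
    m = re.search(rf"<{tag}>(.*?)</{tag}>", string)
    if m is None:
        raise ValueError(f"missing <{tag}> tags")
    return [float(v) for v in m.group(1).split(",")]
```

As in the Fortran, a caller would parse both lists and reject the input if their lengths differ.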
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkpft_check_oride
-!
-! !INTERFACE:
-subroutine mkpft_check_oride( error_happened )
-!
-! !DESCRIPTION:
-! Check that the pft override values are valid
-! !USES:
- implicit none
-! !ARGUMENTS:
- logical, intent(out) :: error_happened ! Result, true if there was a problem
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: i, j ! indices
- real(r8) :: sumpft ! Sum of pft_frc
- real(r8), parameter :: hndrd = 100.0_r8 ! A hundred percent
- character(len=32) :: subname = 'mkpftMod::mkpft_check_oride() '
-!-----------------------------------------------------------------------
-
- error_happened = .false.
- sumpft = sum(pft_frc)
- if ( sumpft == 0.0 )then
- ! PFT fraction is NOT used
- use_input_pft = .false.
- else if ( abs(sumpft - hndrd) > 1.e-6 )then
- write(6, '(a, a, f15.12)') trim(subname), 'Sum of PFT fraction is NOT equal to 100% =', sumpft
- write(6,*) 'Set PFT fraction to : ', pft_frc(0:nzero)
- write(6,*) 'With PFT index : ', pft_idx(0:nzero)
- error_happened = .true.
- call abort()
- return
- else
- use_input_pft = .true.
- nzero = numpft
- do i = 0, numpft
- if ( pft_frc(i) == 0.0_r8 )then
- nzero = i-1
- exit
- end if
- end do
- ! PFT fraction IS used, and sum is OK, now check details
- do i = 0, nzero
- if ( pft_frc(i) < 0.0_r8 .or. pft_frc(i) > hndrd )then
- write(6,*) subname//'PFT fraction is out of range: pft_frc=', pft_frc(i)
- error_happened = .true.
- call abort()
- return
- else if ( pft_frc(i) > 0.0_r8 .and. pft_idx(i) == -1 )then
- write(6,*) subname//'PFT fraction > zero, but index NOT set: pft_idx=', pft_idx(i)
- error_happened = .true.
- call abort()
- return
- end if
- ! PFT index out of range
- if ( pft_idx(i) < 0 .or. pft_idx(i) > numpft )then
- write(6,*) subname//'PFT index is out of range: ', pft_idx(i)
- error_happened = .true.
- call abort()
- return
- end if
- ! Make sure index values NOT used twice
- do j = 0, i-1
- if ( pft_idx(i) == pft_idx(j) )then
- write(6,*) subname//'Same PFT index is used twice: ', pft_idx(i)
- error_happened = .true.
- call abort()
- return
- end if
- end do
- end do
- ! Make sure the rest of the fraction is zero and index are not set as well
- do i = nzero+1, numpft
- if ( pft_frc(i) /= 0.0_r8 .or. pft_idx(i) /= -1 )then
- write(6,*) subname//'After PFT fraction is zeroed out, fraction is non zero, or index set'
- error_happened = .true.
- call abort()
- return
- end if
- end do
- end if
-
-end subroutine mkpft_check_oride
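The validation above boils down to: an all-zero fraction list means the override is unused; otherwise the fractions must sum to 100, each used entry must have an in-range fraction and a distinct in-range index, and everything after the first zero fraction must stay unset. A condensed Python sketch (it omits the trailing-entries check and the exact diagnostics):

```python
def check_oride(pft_frc, pft_idx, numpft, tol=1e-6):
    """Validate override fractions/indices like mkpft_check_oride:
    returns False if unused, True if valid, raises ValueError otherwise."""
    if sum(pft_frc) == 0.0:
        return False                      # override not in use
    if abs(sum(pft_frc) - 100.0) > tol:
        raise ValueError("fractions do not sum to 100")
    seen = set()
    for frc, idx in zip(pft_frc, pft_idx):
        if frc == 0.0:
            break                         # stop at the first zero fraction
        if not (0.0 < frc <= 100.0):
            raise ValueError("fraction out of range")
        if not (0 <= idx <= numpft):
            raise ValueError("index out of range")
        if idx in seen:
            raise ValueError("index used twice")
        seen.add(idx)
    return True
```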
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkpftAtt
-!
-! !INTERFACE:
-subroutine mkpftAtt( ncid, dynlanduse, xtype )
-!
-! !DESCRIPTION:
-! make PFT attributes on the output file
-!
- use mkncdio , only : check_ret, ncd_defvar, ncd_def_spatial_var
- use fileutils , only : get_filename
- use mkvarctl , only : mksrf_fvegtyp, mksrf_flai
- use mkvarpar
-
-! !ARGUMENTS:
- implicit none
- include 'netcdf.inc'
- integer, intent(in) :: ncid ! NetCDF file ID to write out to
- logical, intent(in) :: dynlanduse ! if dynamic land-use file
- integer, intent(in) :: xtype ! external type to output real data as
-!
-! !CALLED FROM:
-! subroutine mkfile in module mkfileMod
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: pftsize ! size of lsmpft dimension
- integer :: natpftsize ! size of natpft dimension
- integer :: dimid ! input netCDF id's
- character(len=256) :: str ! global attribute string
- character(len=32) :: subname = 'mkpftAtt'
-
- ! Define dimensions
- call check_ret(nf_def_dim (ncid, 'time' , nf_unlimited, dimid), subname)
-
- if (.not. dynlanduse) then
- pftsize = numpft + 1
- call check_ret(nf_def_dim (ncid, 'lsmpft' , pftsize , dimid), subname)
- end if
-
- natpftsize = num_natpft + 1
- call check_ret(nf_def_dim (ncid, 'natpft' , natpftsize , dimid), subname)
-
- ! zero-size dimensions can cause problems, so we only include the cft dimension if num_cft > 0
- ! Note that this implies that we can only include PCT_CFT on the dataset if num_cft > 0
- if (num_cft > 0) then
- call check_ret(nf_def_dim (ncid, 'cft' , num_cft , dimid), subname)
- end if
-
- ! Add global attributes
-
- if (.not. dynlanduse) then
- str = get_filename(mksrf_flai)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Lai_raw_data_file_name', len_trim(str), trim(str)), subname)
- end if
-
- if ( use_input_pft ) then
- str = 'TRUE'
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'pft_override', len_trim(str), trim(str)), subname)
- else
- str = get_filename(mksrf_fvegtyp)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Vegetation_type_raw_data_filename', len_trim(str), trim(str)), subname)
- end if
-
- ! Define variables
-
- ! Coordinate variable for indices of natural PFTs
- call ncd_defvar(ncid=ncid, varname='natpft', xtype=nf_int, &
- dim1name='natpft', long_name='indices of natural PFTs', units='index')
-
- ! Coordinate variable for indices of CFTs
- if (num_cft > 0) then
- call ncd_defvar(ncid=ncid, varname='cft', xtype=nf_int, &
- dim1name='cft', long_name='indices of CFTs', units='index')
- end if
-
- call ncd_def_spatial_var(ncid=ncid, varname='LANDFRAC_PFT', xtype=nf_double, &
- long_name='land fraction from pft dataset', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PFTDATA_MASK', xtype=nf_int, &
- long_name='land mask from pft dataset, indicative of real/fake points', units='unitless')
-
- if (.not. dynlanduse) then
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_NATVEG', xtype=xtype, &
- long_name='total percent natural vegetation landunit', units='unitless')
- end if
-
- ! PCT_CROP
- if (.not. dynlanduse) then
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_CROP', xtype=xtype, &
- long_name='total percent crop landunit', units='unitless')
- else
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_CROP', xtype=xtype, &
- lev1name='time', &
- long_name='total percent crop landunit', units='unitless')
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_CROP_MAX', xtype=xtype, &
- long_name='maximum total percent crop landunit during time period', units='unitless')
- end if
-
- ! PCT_NAT_PFT
- if (.not. dynlanduse) then
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_NAT_PFT', xtype=xtype, &
- lev1name='natpft', &
- long_name='percent plant functional type on the natural veg landunit (% of landunit)', units='unitless')
- else
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_NAT_PFT', xtype=xtype, &
- lev1name='natpft', lev2name='time', &
- long_name='percent plant functional type on the natural veg landunit (% of landunit)', units='unitless')
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_NAT_PFT_MAX', xtype=xtype, &
- lev1name='natpft', &
- long_name='maximum percent plant functional type during time period (% of landunit)', units='unitless')
- end if
-
- ! PCT_CFT
- if (num_cft > 0) then
- if (.not. dynlanduse) then
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_CFT', xtype=xtype, &
- lev1name='cft', &
- long_name='percent crop functional type on the crop landunit (% of landunit)', units='unitless')
- else
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_CFT', xtype=xtype, &
- lev1name='cft', lev2name='time', &
- long_name='percent crop functional type on the crop landunit (% of landunit)', units='unitless')
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_CFT_MAX', xtype=xtype, &
- lev1name='cft', &
- long_name='maximum percent crop functional type during time period (% of landunit)', units='unitless')
- end if
- end if
-
- ! LAI,SAI,HTOP,HBOT
- if (.not. dynlanduse) then
- call ncd_def_spatial_var(ncid=ncid, varname='MONTHLY_LAI', xtype=xtype, &
- lev1name='lsmpft', lev2name='time', &
- long_name='monthly leaf area index', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='MONTHLY_SAI', xtype=xtype, &
- lev1name='lsmpft', lev2name='time', &
- long_name='monthly stem area index', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='MONTHLY_HEIGHT_TOP', xtype=xtype, &
- lev1name='lsmpft', lev2name='time', &
- long_name='monthly height top', units='meters')
-
- call ncd_def_spatial_var(ncid=ncid, varname='MONTHLY_HEIGHT_BOT', xtype=xtype, &
- lev1name='lsmpft', lev2name='time', &
- long_name='monthly height bottom', units='meters')
- end if
-
- ! OTHER
- if (dynlanduse) then
- call ncd_defvar(ncid=ncid, varname='YEAR', xtype=nf_int, &
- dim1name='time', &
- long_name='Year of PFT data', units='unitless')
- call ncd_defvar(ncid=ncid, varname='time', xtype=nf_int, &
- dim1name='time', &
- long_name='year', units='unitless')
- call ncd_defvar(ncid=ncid, varname='input_pftdata_filename', xtype=nf_char, &
- dim1name='nchar', &
- dim2name='time', &
- long_name='Input filepath for PFT values for this year', units='unitless')
- else
- call ncd_defvar(ncid=ncid, varname='time', xtype=nf_int, &
- dim1name='time', &
- long_name='Calendar month', units='month')
- end if
-
-end subroutine mkpftAtt
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: constructor
-!
-! !INTERFACE:
-function constructor( ) result(this)
-!
-! !DESCRIPTION:
-! Construct a new PFT override object
-!
-! !ARGUMENTS:
- implicit none
- type(pft_oride) :: this
-!EOP
- character(len=32) :: subname = 'mkpftMod::constructor() '
-
- this%crop = -1.0_r8
- this%natveg = -1.0_r8
- if ( num_natpft < 0 )then
- write(6,*) subname//'num_natpft is NOT set = ', num_natpft
- call abort()
- return
- end if
- if ( num_cft < 0 )then
- write(6,*) subname//'num_cft is NOT set = ', num_cft
- call abort()
- return
- end if
- allocate( this%natpft(noveg:num_natpft) )
- allocate( this%cft(1:num_cft) )
- this%natpft(:) = -1.0_r8
- this%cft(:) = -1.0_r8
- call this%InitZeroOut()
-end function constructor
-
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: InitZeroOut
-!
-! !INTERFACE:
-subroutine InitZeroOut( this )
-!
-! !DESCRIPTION:
-! Initialize a pft_oride object with vegetation that's zeroed out
-!
-! !ARGUMENTS:
- implicit none
- class(pft_oride), intent(inout) :: this
-!EOP
- this%crop = 0.0_r8
- this%natveg = 0.0_r8
-
- this%natpft = 0.0_r8
- this%natpft(noveg) = 100.0_r8
- this%cft = 0.0_r8
- this%cft(1) = 100.0_r8
-end subroutine InitZeroOut
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: InitAllPFTIndex
-!
-! !INTERFACE:
-subroutine InitAllPFTIndex( this )
-!
-! !DESCRIPTION:
-! Initialize a pft_oride object from the prescribed PFT fraction and index arrays
-!
-! !ARGUMENTS:
- implicit none
- class(pft_oride), intent(inout) :: this
-!EOP
- integer :: m, i ! Indices
- real(r8) :: croptot ! Total of crop
- real(r8) :: natvegtot ! Total of natural vegetation
- character(len=32) :: subname = 'mkpftMod::InitAllPFTIndex() '
-
- croptot = 0.0_r8
- natvegtot = 0.0_r8
- this%natpft = 0.0_r8
- this%cft = 0.0_r8
- do m = noveg, nzero
- i = pft_idx(m)
- if ( (i < noveg) .or. (i > numpft) )then
- write(6,*) subname//'PFT index is out of valid range'
- call abort()
- return
- else if ( i <= num_natpft )then
- this%natpft(i) = pft_frc(m)
- natvegtot = natvegtot + pft_frc(m)
- else
- this%cft(i-num_natpft) = pft_frc(m)
- croptot = croptot + pft_frc(m)
- end if
- end do
- this%crop = croptot
- this%natveg = natvegtot
- ! Renormalize
- if ( natvegtot > 0.0_r8 )then
- this%natpft = 100.0_r8 * this%natpft / natvegtot
- else
- this%natpft(noveg) = 100.0_r8
- end if
- if (croptot > 0.0_r8 )then
- this%cft = 100.0_r8 * this%cft / croptot
- else
- this%cft(1) = 100.0_r8
- end if
-
-end subroutine InitAllPFTIndex
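`InitAllPFTIndex` splits the prescribed (index, fraction) pairs into the natural-veg group (`i <= num_natpft`) and the crop group (`i > num_natpft`, shifted by `num_natpft`), records each group's total as the landunit area, and renormalizes each group to 100% of its landunit. A Python sketch of that split (it drops the bare-ground/generic-crop fallbacks the Fortran applies to empty groups):

```python
def split_and_renormalize(pft_frc, pft_idx, num_natpft):
    """Split prescribed (index, fraction) pairs into natural-veg and
    crop groups, then renormalize each group to 100% of its landunit."""
    natveg, crop = {}, {}
    for frc, idx in zip(pft_frc, pft_idx):
        if idx <= num_natpft:
            natveg[idx] = frc
        else:
            crop[idx - num_natpft] = frc   # shift crop indices into CFT space
    nat_tot = sum(natveg.values())
    crop_tot = sum(crop.values())
    if nat_tot > 0.0:
        natveg = {i: 100.0 * f / nat_tot for i, f in natveg.items()}
    if crop_tot > 0.0:
        crop = {i: 100.0 * f / crop_tot for i, f in crop.items()}
    return nat_tot, crop_tot, natveg, crop
```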
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: clean
-!
-! !INTERFACE:
-subroutine Clean( this )
-!
-! !DESCRIPTION:
-! Clean up a PFT Oride object
-!
-! !ARGUMENTS:
- implicit none
- class(pft_oride), intent(inout) :: this
-!EOP
- this%crop = -1.0_r8
- this%natveg = -1.0_r8
- deallocate( this%natpft )
- deallocate( this%cft )
-
-end subroutine Clean
-
-!-----------------------------------------------------------------------
-
-end module mkpftMod
diff --git a/tools/mksurfdata_map/src/mkpftUtilsMod.F90 b/tools/mksurfdata_map/src/mkpftUtilsMod.F90
deleted file mode 100644
index 4a9ea12f97..0000000000
--- a/tools/mksurfdata_map/src/mkpftUtilsMod.F90
+++ /dev/null
@@ -1,257 +0,0 @@
-module mkpftUtilsMod
-
- !-----------------------------------------------------------------------
- !BOP
- !
- ! !MODULE: mkpftUtils
- !
- ! !DESCRIPTION:
- ! Lower-level utilities used in making PFT data.
- !
- ! These are separated out from mkpftMod mainly as an aid to testing.
- !
- ! !REVISION HISTORY:
- ! Author: Bill Sacks
- !
- !-----------------------------------------------------------------------
- !!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- !
- ! !PUBLIC MEMBER FUNCTIONS:
- !
- public :: convert_from_p2g ! Convert a p2g array into pct_pft_type objects
- public :: adjust_total_veg_area ! Adjust the total vegetated area (natural veg & crop) to a new specified total
-
- !
- ! !PRIVATE MEMBER FUNCTIONS:
- !
-
- private :: get_default_natpft ! Get the default natural pft breakdown, for a 0-area natural veg. landunit
- private :: get_default_cft ! Get the default cft breakdown, for a 0-area crop landunit
-
- interface convert_from_p2g
- module procedure convert_from_p2g_default
- module procedure convert_from_p2g_missing_crops
- end interface convert_from_p2g
-
- !EOP
- !===============================================================
-contains
- !===============================================================
-
- !-----------------------------------------------------------------------
- subroutine convert_from_p2g_default(pct_p2g, pctnatpft, pctcft)
- !
- ! !DESCRIPTION:
- ! Given the % of each pft on the grid cell, create pct_pft_type objects that give % of
- ! each pft on the landunit and % of each landunit on the grid cell.
- !
- ! !USES:
- use mkpctPftTypeMod , only : pct_pft_type
- use mkpftConstantsMod, only : natpft_lb, natpft_ub, num_cft, cft_lb, cft_ub
- !
- ! !ARGUMENTS:
- real(r8), intent(in) :: pct_p2g(natpft_lb:) ! % of each pft on the grid cell (includes crops as well as natural veg types)
- type(pct_pft_type), intent(out) :: pctnatpft ! natural PFT cover
- type(pct_pft_type), intent(out) :: pctcft ! crop (CFT) cover
- !
- ! !LOCAL VARIABLES:
- real(r8), allocatable :: default_natpft(:) ! default p2l for natural PFTs, for grid cells where the current size of the natural veg landunit is 0
- real(r8), allocatable :: default_cft(:) ! default p2l for CFTs, for grid cells where the current size of the crop landunit is 0
-
- character(len=*), parameter :: subname = 'convert_from_p2g_default'
- !-----------------------------------------------------------------------
-
- if (ubound(pct_p2g, 1) /= cft_ub) then
- write(6,*) subname, ' ERROR: upper bound of pct_p2g should be cft_ub'
- write(6,*) 'ubound(pct_p2g), cft_ub = ', ubound(pct_p2g), cft_ub
- call abort()
- end if
-
- allocate(default_natpft(natpft_lb:natpft_ub))
- default_natpft = get_default_natpft()
- pctnatpft = pct_pft_type(pct_p2g(natpft_lb:natpft_ub), natpft_lb, default_natpft)
- deallocate(default_natpft)
-
- if (num_cft > 0) then
- allocate(default_cft(cft_lb:cft_ub))
- default_cft = get_default_cft()
- pctcft = pct_pft_type(pct_p2g(cft_lb:cft_ub), cft_lb, default_cft)
- deallocate(default_cft)
- else
- ! create an empty placeholder, with 0 area on the grid cell
- pctcft = pct_pft_type()
- end if
-
- end subroutine convert_from_p2g_default
-
- !-----------------------------------------------------------------------
- subroutine convert_from_p2g_missing_crops(pct_p2g, pctcft_saved, pctnatpft, pctcft)
- !
- ! !DESCRIPTION:
- ! Given the % of each pft on the grid cell, create pct_pft_type objects that give %
- ! of each pft on the landunit and % of each landunit on the grid cell.
- !
- ! This version of the routine assumes that pct_p2g only includes the standard PFTs -
- ! not prognostic crops. It takes the relative crop cover from pctcft_saved, and uses
- ! the % cover of the generic c3 crop in pct_p2g to specify the total crop landunit
- ! area.
- !
- ! Typically, pct_p2g will have an upper bound of numstdpft; however, this is not
- ! assumed. Any upper bound is fine as long as the upper bound is greater than
- ! natpft_ub and includes c3cropindex.
- !
- ! Assumptions:
- ! - We are running with prognostic crops (i.e., NOT an empty crop landunit - although
- ! it's fine for the crop landunit area to be 0%)
- ! - In pct_p2g, the only non-zero areas should be:
- ! - Areas of PFTs on the natural veg landunit
- ! - The area of the generic c3 crop
- !
- ! !USES:
- use mkpctPftTypeMod , only : pct_pft_type
- use mkpftConstantsMod , only : c3cropindex, natpft_lb, natpft_ub, num_cft
- !
- ! !ARGUMENTS:
- real(r8), intent(in) :: pct_p2g(natpft_lb:) ! % of each pft on the grid cell (includes crops as well as natural veg types)
- type(pct_pft_type), intent(in) :: pctcft_saved ! saved crop cover information, used to specify the relative cover of each crop
- type(pct_pft_type), intent(out) :: pctnatpft ! natural PFT cover
- type(pct_pft_type), intent(out) :: pctcft ! crop (CFT) cover
- !
- ! !LOCAL VARIABLES:
- real(r8), allocatable :: default_natpft(:) ! default p2l for natural PFTs, for grid cells where the current size of the natural veg landunit is 0
- integer :: pft_index
- real(r8) :: crop_area ! area of the crop landunit on the grid cell
-
- character(len=*), parameter :: subname = 'convert_from_p2g_missing_crops'
- !-----------------------------------------------------------------------
-
- ! Error checking on inputs
-
- if (num_cft == 0) then
- write(6,*) subname, ' ERROR: this routine should only be called when running with prognostic crops'
- write(6,*) '(i.e., with num_cft > 0)'
- call abort()
- end if
-
- do pft_index = natpft_ub + 1, ubound(pct_p2g, 1)
- if (pft_index /= c3cropindex .and. pct_p2g(pft_index) > 0._r8) then
- write(6,*) subname, ' ERROR: in pct_p2g, the only non-zero areas should be:'
- write(6,*) ' - areas of PFTs on the natural veg landunit'
- write(6,*) ' - the area of the generic c3 crop'
- write(6,*) '(we do not currently handle the case where the transient input dataset'
- write(6,*) 'has non-zero areas for both pft 15 and pft 16)'
- write(6,*) 'pft_index, area = ', pft_index, pct_p2g(pft_index)
- call abort()
- end if
- end do
-
- ! Done error checking on inputs
-
- allocate(default_natpft(natpft_lb:natpft_ub))
- default_natpft = get_default_natpft()
- pctnatpft = pct_pft_type(pct_p2g(natpft_lb:natpft_ub), natpft_lb, default_natpft)
- deallocate(default_natpft)
-
- pctcft = pctcft_saved
- crop_area = pct_p2g(c3cropindex)
- call pctcft%set_pct_l2g(crop_area)
-
- end subroutine convert_from_p2g_missing_crops
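The "missing crops" variant above takes the total crop landunit area from the generic C3 crop's grid-cell percentage and keeps the relative CFT breakdown from `pctcft_saved`. The core distribution step can be sketched in Python (illustrative only; the real routine works through `pct_pft_type%set_pct_l2g`):

```python
def expand_missing_crops(c3crop_pct, saved_rel_cft):
    """Distribute a total crop landunit area (from the generic C3 crop's
    grid-cell %) across CFTs using a saved relative breakdown that sums
    to 100% of the landunit."""
    return [c3crop_pct * rel / 100.0 for rel in saved_rel_cft]
```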
-
- !-----------------------------------------------------------------------
- function get_default_natpft() result(default_natpft)
- !
- ! !DESCRIPTION:
- ! Get the default natural pft breakdown, for a 0-area natural veg. landunit.
- !
- ! Currently we use the same default everywhere. In the future, we could change this
- ! to compute default_natpft based on some function of location (e.g., different
- ! values for high latitudes than low latitudes, etc.).
- !
- ! !USES:
- use mkpftConstantsMod, only : baregroundindex, natpft_lb, natpft_ub
- !
- ! !ARGUMENTS:
- real(r8), allocatable :: default_natpft(:) ! function result
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'get_default_natpft'
- !-----------------------------------------------------------------------
-
- allocate(default_natpft(natpft_lb:natpft_ub))
- default_natpft(:) = 0._r8
- default_natpft(baregroundindex) = 100._r8
-
- end function get_default_natpft
-
- !-----------------------------------------------------------------------
- function get_default_cft() result(default_cft)
- !
- ! !DESCRIPTION:
- ! Get the default cft breakdown, for a 0-area crop landunit.
- !
- ! !USES:
- use mkpftConstantsMod, only : c3cropindex, cft_lb, cft_ub
- !
- ! !ARGUMENTS:
- real(r8), allocatable :: default_cft(:) ! function result
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'get_default_cft'
- !-----------------------------------------------------------------------
-
- allocate(default_cft(cft_lb:cft_ub))
- default_cft(:) = 0._r8
- default_cft(c3cropindex) = 100._r8
-
- end function get_default_cft
-
-
- !-----------------------------------------------------------------------
- subroutine adjust_total_veg_area(new_total_pct, pctnatpft, pctcft)
- !
- ! !DESCRIPTION:
- ! Adjust the total vegetated area on the grid cell (natural veg & crop) to a new
- ! specified total.
- !
- ! If the old areas are 0%, then all the new area goes into pctnatpft.
- !
- ! !USES:
- use mkpctPftTypeMod, only : pct_pft_type
- !
- ! !ARGUMENTS:
- real(r8), intent(in) :: new_total_pct ! new total % of natural veg + crop landunits
- class(pct_pft_type), intent(inout) :: pctnatpft ! natural veg cover information
- class(pct_pft_type), intent(inout) :: pctcft ! crop cover information
- !
- ! !LOCAL VARIABLES:
- real(r8) :: natpft_l2g ! grid cell % cover of nat. veg.
- real(r8) :: cft_l2g ! grid cell % cover of crop
- real(r8) :: old_total ! old total % cover of natural veg + crop landunits
-
- character(len=*), parameter :: subname = 'adjust_total_veg_area'
- !-----------------------------------------------------------------------
-
- natpft_l2g = pctnatpft%get_pct_l2g()
- cft_l2g = pctcft%get_pct_l2g()
- old_total = natpft_l2g + cft_l2g
- if (old_total > 0._r8) then
- call pctnatpft%set_pct_l2g(natpft_l2g * new_total_pct / old_total)
- call pctcft%set_pct_l2g(cft_l2g * new_total_pct / old_total)
- else
- call pctnatpft%set_pct_l2g(new_total_pct)
- end if
-
- end subroutine adjust_total_veg_area
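The proportional rescaling that `adjust_total_veg_area` performed can be sketched in Python (an illustrative translation of the deleted Fortran, not the implementation; the function works on plain percentages rather than `pct_pft_type` objects):

```python
def adjust_total_veg_area(new_total_pct, natpft_l2g, cft_l2g):
    """Rescale natural-veg and crop landunit percentages so they sum to
    new_total_pct, preserving their relative proportions.
    If both old areas are zero, the entire new area goes to natural veg."""
    old_total = natpft_l2g + cft_l2g
    if old_total > 0.0:
        return (natpft_l2g * new_total_pct / old_total,
                cft_l2g * new_total_pct / old_total)
    return (new_total_pct, 0.0)
```

For example, rescaling (30%, 10%) to a new total of 60% yields (45%, 15%), keeping the 3:1 ratio between the two landunits.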
-
-
-end module mkpftUtilsMod
-
-
diff --git a/tools/mksurfdata_map/src/mksoilMod.F90 b/tools/mksurfdata_map/src/mksoilMod.F90
deleted file mode 100644
index 959749ca1a..0000000000
--- a/tools/mksurfdata_map/src/mksoilMod.F90
+++ /dev/null
@@ -1,1237 +0,0 @@
-module mksoilMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mksoilMod
-!
-! !DESCRIPTION:
-! Make soil data (texture, color and organic)
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!-----------------------------------------------------------------------
-!!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8, r4=>shr_kind_r4
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
- use mksoilUtilsMod, only : mkrank, dominant_soil_color
- implicit none
-
- SAVE
- private ! By default make data private
-!
-! !PUBLIC MEMBER FUNCTIONS:
-!
- public mksoilInit ! Soil Initialization
-
- public mksoilAtt ! Add attributes to output file
-
- public mksoiltex ! Set soil texture
- public mkorganic ! Set organic soil
- public mksoilcol ! Set soil color
- public mkfmax ! Make percent fmax
-!
-! !PUBLIC DATA MEMBERS:
-!
- real(r8), public, parameter :: unset = -999.99_r8 ! Flag to signify soil texture override not set
- real(r8), public :: soil_sand = unset ! soil texture sand % to override with
- real(r8), public :: soil_clay = unset ! soil texture clay % to override with
- real(r8), public :: soil_fmax = unset ! soil max saturation frac to override with
- integer , parameter :: unsetcol = -999 ! flag to indicate soil color NOT set
- integer , public :: soil_color= unsetcol ! soil color to override with
-!
-! !PRIVATE DATA MEMBERS:
-!
-! !PRIVATE MEMBER FUNCTIONS:
- private :: mksoiltexInit ! Soil texture Initialization
- private :: mksoilcolInit ! Soil color Initialization
- private :: mksoilfmaxInit ! Soil fmax Initialization
-
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoilInit
-!
-! !INTERFACE:
-subroutine mksoilInit( )
-!
-! !DESCRIPTION:
-! Initialize the different soil types
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- character(len=32) :: subname = 'mksoilInit'
-!-----------------------------------------------------------------------
- call mksoiltexInit()
- call mksoilcolInit()
- call mksoilfmaxInit()
-
-end subroutine mksoilInit
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoiltexInit
-!
-! !INTERFACE:
-subroutine mksoiltexInit( )
-!
-! !DESCRIPTION:
-! Initialize the soil texture override settings
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: sumtex
- character(len=32) :: subname = 'mksoiltexInit'
-!-----------------------------------------------------------------------
- if ( soil_clay /= unset )then
- write(6,*) 'Replace soil clay % for all points with: ', soil_clay
- if ( soil_sand == unset )then
- write (6,*) subname//':error: soil_clay set, but NOT soil_sand'
- call abort()
- end if
- end if
- if ( soil_sand /= unset )then
- write(6,*) 'Replace soil sand % for all points with: ', soil_sand
- if ( soil_clay == unset )then
- write (6,*) subname//':error: soil_sand set, but NOT soil_clay'
- call abort()
- end if
- sumtex = soil_sand + soil_clay
- if ( sumtex < 0.0_r8 .or. sumtex > 100.0_r8 )then
- write (6,*) subname//':error: soil_sand and soil_clay out of bounds: sand, clay = ', &
- soil_sand, soil_clay
- call abort()
- end if
- end if
-
-end subroutine mksoiltexInit
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoiltex
-!
-! !INTERFACE:
-subroutine mksoiltex(ldomain, mapfname, datfname, ndiag, sand_o, clay_o)
-!
-! !DESCRIPTION:
-! make %sand and %clay from IGBP soil data, which includes
-! igbp soil 'mapunits' and their corresponding textures
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: sand_o(:,:) ! % sand (output grid)
- real(r8) , intent(out):: clay_o(:,:) ! % clay (output grid)
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- character(len=38) :: typ ! soil texture based on ...
- integer :: nlay ! number of soil layers
- integer :: mapunitmax ! max value of igbp soil mapunits
- integer :: mapunittemp ! temporary igbp soil mapunit
- integer :: maxovr
- integer , allocatable :: novr(:)
- integer , allocatable :: kmap(:,:)
- real(r8), allocatable :: kwgt(:,:)
- integer , allocatable :: kmax(:)
- real(r8), allocatable :: wst(:)
- real(r8), allocatable :: sand_i(:,:) ! input grid: percent sand
- real(r8), allocatable :: clay_i(:,:) ! input grid: percent clay
- real(r8), allocatable :: mapunit_i(:) ! input grid: igbp soil mapunits
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer, parameter :: num=2 ! set soil mapunit number
- integer :: wsti(num) ! index to 1st and 2nd largest wst
- integer, parameter :: nlsm=4 ! number of soil textures
- character(len=38) :: soil(0:nlsm) ! name of each soil texture
- real(r8) :: gast_i(0:nlsm) ! global area, by texture type
- real(r8) :: gast_o(0:nlsm) ! global area, by texture type
- real(r8) :: wt ! map overlap weight
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- integer :: l,k,n,m,ni,no,ns_i,ns_o ! indices
- integer :: k1,k2 ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- integer :: miss = 99999 ! missing data indicator
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- logical :: found ! temporary
- integer :: kmap_max ! maximum overlap weights
-  integer, parameter :: kmap_max_min = 90  ! kmap_max minimum value
- integer, parameter :: km_mx_ns_prod = 160000 ! product of kmap_max*ns_o to keep constant
- character(len=32) :: subname = 'mksoiltex'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make %sand and %clay .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Define the model surface types: 0 to nlsm
- ! -----------------------------------------------------------------
-
- soil(0) = 'no soil: ocean, glacier, lake, no data'
- soil(1) = 'clays '
- soil(2) = 'sands '
- soil(3) = 'loams '
- soil(4) = 'silts '
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
- ns_o = ldomain%ns
-
- write (6,*) 'Open soil texture file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_dimid (ncid, 'number_of_layers', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, nlay), subname)
-
- call check_ret(nf_inq_dimid (ncid, 'max_value_mapunit', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, mapunitmax), subname)
-
- allocate(sand_i(mapunitmax,nlay), &
- clay_i(mapunitmax,nlay), &
- mapunit_i(ns_i), stat=ier)
- if (ier/=0) call abort()
-
- call check_ret(nf_inq_varid (ncid, 'MAPUNITS', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, mapunit_i), subname)
-
- call check_ret(nf_inq_varid (ncid, 'PCT_SAND', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, sand_i), subname)
-
- call check_ret(nf_inq_varid (ncid, 'PCT_CLAY', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, clay_i), subname)
-
- call check_ret(nf_close(ncid), subname)
-
- ! Compute local fields _o
- if (soil_sand==unset .and. soil_clay==unset) then
-
- call gridmap_mapread(tgridmap, mapfname)
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- allocate(frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
-    ! kmap_max is the maximum number of mapunits that will be considered on
-    ! any output gridcell - this is set above and can be changed
- ! kmap(:) are the mapunit values on the input grid
- ! kwgt(:) are the weights on the input grid
-
- allocate(novr(ns_o))
- novr(:) = 0
- do n = 1,tgridmap%ns
- ni = tgridmap%src_indx(n)
- if (tdomain%mask(ni) > 0) then
- no = tgridmap%dst_indx(n)
- novr(no) = novr(no) + 1
- end if
- end do
- maxovr = maxval(novr(:))
- kmap_max = min(maxovr,max(kmap_max_min,km_mx_ns_prod/ns_o))
- deallocate(novr)
-
- write(6,*)'kmap_max= ',kmap_max,' maxovr= ',maxovr,' ns_o= ',ns_o,' size= ',(kmap_max+1)*ns_o
-
- allocate(kmap(0:kmap_max,ns_o), stat=ier)
- if (ier/=0) call abort()
- allocate(kwgt(0:kmap_max,ns_o), stat=ier)
- if (ier/=0) call abort()
- allocate(kmax(ns_o), stat=ier)
- if (ier/=0) call abort()
- allocate(wst(0:kmap_max), stat=ier)
- if (ier/=0) call abort()
-
- kwgt(:,:) = 0.
- kmap(:,:) = 0
- kmax(:) = 0
-
- do n = 1,tgridmap%ns
- ni = tgridmap%src_indx(n)
- no = tgridmap%dst_indx(n)
- wt = tgridmap%wovr(n) * tdomain%mask(ni)
- if (wt > 0._r8) then
- k = mapunit_i(ni)
- found = .false.
- do l = 0,kmax(no)
- if (k == kmap(l,no)) then
- kwgt(l,no) = kwgt(l,no) + wt
- found = .true.
- exit
- end if
- end do
- if (.not. found) then
- kmax(no) = kmax(no) + 1
- if (kmax(no) > kmap_max) then
- write(6,*)'kmax is > kmap_max= ',kmax(no), 'kmap_max = ', &
- kmap_max,' for no = ',no
- write(6,*)'reset kmap_max in mksoilMod to a greater value'
- stop
- end if
- kmap(kmax(no),no) = k
- kwgt(kmax(no),no) = wt
- end if
- end if
- enddo
-
- end if
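The overlap loop above accumulates, for each output cell, the total mapping weight contributed by each distinct source mapunit. A Python sketch of that bookkeeping (hypothetical names; dictionaries stand in for the fixed-size `kmap`/`kwgt` arrays, so no `kmap_max` bound is needed):

```python
def accumulate_mapunit_weights(overlaps, mapunit_i, mask_i):
    """For each destination cell, sum mapping weights by source mapunit.
    overlaps: iterable of (src_index, dst_index, weight) triples.
    Returns {dst_index: {mapunit: total_weight}}; masked-out source
    cells (mask 0) contribute nothing."""
    kwgt = {}
    for ni, no, wovr in overlaps:
        wt = wovr * mask_i[ni]
        if wt > 0.0:
            k = mapunit_i[ni]
            cell = kwgt.setdefault(no, {})
            cell[k] = cell.get(k, 0.0) + wt
    return kwgt
```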
-
- do no = 1,ns_o
-
- if (soil_sand==unset .and. soil_clay==unset) then
- wst(:) = 0.
- wst(0:kmax(no)) = kwgt(0:kmax(no),no)
-
- ! Rank non-zero weights by soil mapunit.
- ! k1 is the most extensive mapunit.
- ! k2 is the second most extensive mapunit.
-
- if (maxval(wst(:)) > 0) then
- call mkrank (kmax(no)+1, wst(0:kmax(no)), miss, wsti, num)
- k1 = kmap(wsti(1),no)
- if (wsti(2) == miss) then
- k2 = miss
- else
- k2 = kmap(wsti(2),no)
- end if
- else
- k1 = 0
- k2 = 0
- end if
-
- end if
-
- ! Set soil texture as follows:
- ! a. Use dominant igbp soil mapunit based on area of overlap unless
- ! 'no data' is dominant
- ! b. In this case use second most dominant mapunit if it has data
- ! c. If this has no data or if there isn't a second most dominant
- ! mapunit, use loam for soil texture
-
- if (soil_sand/=unset .and. soil_clay/=unset) then !---soil texture is input
- do l = 1, nlay
- sand_o(no,l) = soil_sand
- clay_o(no,l) = soil_clay
- end do
- else if (k1 /= 0) then !---not 'no data'
- do l = 1, nlay
- sand_o(no,l) = sand_i(k1,l)
- clay_o(no,l) = clay_i(k1,l)
- end do
- else !---if (k1 == 0) then
- if (k2 == 0 .or. k2 == miss) then !---no data
- do l = 1, nlay
- sand_o(no,l) = 43. !---use loam
- clay_o(no,l) = 18.
- end do
- else !---if (k2 /= 0 and /= miss)
- do l = 1, nlay
- sand_o(no,l) = sand_i(k2,l)
- clay_o(no,l) = clay_i(k2,l)
- end do
- end if !---end of k2 if-block
- end if !---end of k1 if-block
-
- enddo
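The a/b/c fallback rules for choosing a texture per layer can be sketched in Python (illustrative only; `k1`/`k2` are the first and second most extensive mapunits, `0` means 'no data', and the loam default values 43/18 come from the deleted code):

```python
LOAM_SAND, LOAM_CLAY = 43.0, 18.0
MISS = 99999  # missing-data indicator from the deleted code

def pick_texture(k1, k2, sand_by_unit, clay_by_unit):
    """Choose (sand, clay) for one soil layer:
    a. use the dominant mapunit k1 unless it is 'no data' (0);
    b. otherwise use the second mapunit k2 if it has data;
    c. otherwise default to loam."""
    if k1 != 0:
        return sand_by_unit[k1], clay_by_unit[k1]
    if k2 == 0 or k2 == MISS:
        return LOAM_SAND, LOAM_CLAY
    return sand_by_unit[k2], clay_by_unit[k2]
```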
-
- if (soil_sand==unset .and. soil_clay==unset) then
-
- ! Global sum of output field
-
- allocate(mask_r8(ns_i), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- ! -----------------------------------------------------------------
- ! Error check2
- ! Compare global area of each soil type on input and output grids
- ! -----------------------------------------------------------------
-
- ! input grid: global areas by texture class
-
- gast_i(:) = 0.
- do l = 1, nlay
- do ni = 1,ns_i
- mapunittemp = nint(mapunit_i(ni))
- if (mapunittemp==0) then
- typ = 'no soil: ocean, glacier, lake, no data'
- else if (clay_i(mapunittemp,l) >= 40.) then
- typ = 'clays'
- else if (sand_i(mapunittemp,l) >= 50.) then
- typ = 'sands'
- else if (clay_i(mapunittemp,l)+sand_i(mapunittemp,l) < 50.) then
- if (tdomain%mask(ni) /= 0.) then
- typ = 'silts'
- else !if (tdomain%mask(ni) == 0.) then no data
- typ = 'no soil: ocean, glacier, lake, no data'
- end if
- else
- typ = 'loams'
- end if
- do m = 0, nlsm
- if (typ == soil(m)) go to 101
- end do
- write (6,*) 'MKSOILTEX error: sand = ',sand_i(mapunittemp,l), &
- ' clay = ',clay_i(mapunittemp,l), &
- ' not assigned to soil type for input grid lon,lat,layer = ',ni,l
- call abort()
-101 continue
- gast_i(m) = gast_i(m) + tgridmap%area_src(ni)*tdomain%mask(ni)*re**2
- end do
- end do
-
- ! output grid: global areas by texture class
-
- gast_o(:) = 0.
- do l = 1, nlay
- do no = 1,ns_o
- if (clay_o(no,l)==0. .and. sand_o(no,l)==0.) then
- typ = 'no soil: ocean, glacier, lake, no data'
- else if (clay_o(no,l) >= 40.) then
- typ = 'clays'
- else if (sand_o(no,l) >= 50.) then
- typ = 'sands'
- else if (clay_o(no,l)+sand_o(no,l) < 50.) then
- typ = 'silts'
- else
- typ = 'loams'
- end if
- do m = 0, nlsm
- if (typ == soil(m)) go to 102
- end do
- write (6,*) 'MKSOILTEX error: sand = ',sand_o(no,l), &
- ' clay = ',clay_o(no,l), &
- ' not assigned to soil type for output grid lon,lat,layer = ',no,l
- call abort()
-102 continue
- gast_o(m) = gast_o(m) + tgridmap%area_dst(no)*frac_dst(no)*re**2
- end do
- end do
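The diagnostic texture classification used in both area-comparison loops follows the same thresholds; a Python sketch (illustrative, with the thresholds taken from the deleted code; this is the output-grid variant, which needs no land-mask check):

```python
def texture_class(sand, clay):
    """Classify a (sand %, clay %) pair into the diagnostic classes
    used for the input/output area comparison."""
    if sand == 0.0 and clay == 0.0:
        return 'no soil'
    if clay >= 40.0:
        return 'clays'
    if sand >= 50.0:
        return 'sands'
    if clay + sand < 50.0:
        return 'silts'
    return 'loams'
```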
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',l=1,70)
- write (ndiag,*) 'Soil Texture Output'
- write (ndiag,'(1x,70a1)') ('=',l=1,70)
- write (ndiag,*)
-
- write (ndiag,*) 'The following table of soil texture classes is for comparison only.'
-    write (ndiag,*) 'The actual data is continuous %sand, %silt and %clay, not textural classes'
- write (ndiag,*)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',l=1,70)
- write (ndiag,1001)
-1001 format (1x,'soil texture class',17x,' input grid area output grid area',/ &
- 1x,33x,' 10**6 km**2',' 10**6 km**2')
- write (ndiag,'(1x,70a1)') ('.',l=1,70)
- write (ndiag,*)
-
- do l = 0, nlsm
- write (ndiag,1002) soil(l),gast_i(l)*1.e-6,gast_o(l)*1.e-6
-1002 format (1x,a38,f16.3,f17.3)
- end do
-
- end if
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- if (soil_sand==unset .and. soil_clay==unset) then
- call gridmap_clean(tgridmap)
- deallocate (kmap, kwgt, kmax, wst)
- deallocate (sand_i,clay_i,mapunit_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
- end if
-
-
- write (6,*) 'Successfully made %sand and %clay'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mksoiltex
-
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoilcolInit
-!
-! !INTERFACE:
-subroutine mksoilcolInit( )
-!
-! !DESCRIPTION:
-! Initialize the soil color override settings
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: sumtex
- character(len=32) :: subname = 'mksoilcolInit'
-!-----------------------------------------------------------------------
-
- ! Error check soil_color if it is set
- if ( soil_color /= unsetcol )then
- if ( soil_color < 0 .or. soil_color > 20 )then
- write(6,*)'soil_color is out of range = ', soil_color
- call abort()
- end if
- write(6,*) 'Replace soil color for all points with: ', soil_color
- end if
-end subroutine mksoilcolInit
-
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoilcol
-!
-! !INTERFACE:
-subroutine mksoilcol(ldomain, mapfname, datfname, ndiag, &
- soil_color_o, nsoicol)
-!
-! !DESCRIPTION:
-! make soil color classes from input soil color data, assigning the
-! dominant color class to each output gridcell
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- integer , intent(out):: soil_color_o(:) ! soil color classes
- integer , intent(out):: nsoicol ! number of soil colors
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: gast_i(:) ! global area, by surface type
- real(r8), allocatable :: gast_o(:) ! global area, by surface type
- integer , allocatable :: soil_color_i(:) ! input grid: BATS soil color
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- character(len=35), allocatable :: col(:) ! name of each color
- integer :: k,l,m,ni,no,ns_i,ns_o ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=32) :: subname = 'mksoilcol'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make soil color classes .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ns_o = ldomain%ns
-
- ! Obtain input grid info, read local fields
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
- allocate(soil_color_i(ns_i), stat=ier)
- if (ier/=0) call abort()
- allocate(frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
-
- write (6,*) 'Open soil color file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_varid (ncid, 'SOIL_COLOR', varid), subname)
- call check_ret(nf_get_var_int (ncid, varid, soil_color_i), subname)
- call check_ret(nf_close(ncid), subname)
-
- nsoicol = maxval(soil_color_i)
- write(6,*)'nsoicol = ',nsoicol
-
- allocate(gast_i(0:nsoicol),gast_o(0:nsoicol),col(0:nsoicol))
-
- ! -----------------------------------------------------------------
- ! Define the model color classes: 0 to nsoicol
- ! -----------------------------------------------------------------
-
- if (nsoicol == 20) then
- col(0) = 'no soil '
- col(1) = 'class 1: light '
- col(2) = 'class 2: '
- col(3) = 'class 3: '
- col(4) = 'class 4: '
- col(5) = 'class 5: '
- col(6) = 'class 6: '
- col(7) = 'class 7: '
- col(8) = 'class 8: '
- col(9) = 'class 9: '
- col(10) = 'class 10: '
- col(11) = 'class 11: '
- col(12) = 'class 12: '
- col(13) = 'class 13: '
- col(14) = 'class 14: '
- col(15) = 'class 15: '
- col(16) = 'class 16: '
- col(17) = 'class 17: '
- col(18) = 'class 18: '
- col(19) = 'class 19: '
- col(20) = 'class 20: dark '
- else if (nsoicol == 8) then
- col(0) = 'no soil '
- col(1) = 'class 1: light '
- col(2) = 'class 2: '
- col(3) = 'class 3: '
- col(4) = 'class 4: '
- col(5) = 'class 5: '
- col(6) = 'class 6: '
- col(7) = 'class 7: '
- col(8) = 'class 8: dark '
- else
- write(6,*)'nsoicol value of ',nsoicol,' is not currently supported'
- call abort()
- end if
-
- ! Error check soil_color if it is set
- if ( soil_color /= unsetcol )then
- if ( soil_color > nsoicol )then
- write(6,*)'soil_color is out of range = ', soil_color
- call abort()
- end if
-
- do no = 1,ns_o
- soil_color_o(no) = soil_color
- end do
-
- else
-
- call gridmap_mapread(tgridmap, mapfname)
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Determine dominant soil color for each output cell
-
- call dominant_soil_color( &
- tgridmap = tgridmap, &
- mask_i = tdomain%mask, &
- soil_color_i = soil_color_i, &
- nsoicol = nsoicol, &
- soil_color_o = soil_color_o)
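What `dominant_soil_color` computes per output cell is the category with the largest accumulated, masked overlap weight. A minimal Python sketch of that idea (hypothetical names, ties broken arbitrarily here; the real routine has additional handling not shown):

```python
def dominant_category(overlaps, cat_i, mask_i):
    """Per destination cell, pick the source category with the largest
    total masked overlap weight.
    overlaps: iterable of (src_index, dst_index, weight) triples."""
    totals = {}
    for ni, no, wovr in overlaps:
        wt = wovr * mask_i[ni]
        if wt > 0.0:
            cell = totals.setdefault(no, {})
            cell[cat_i[ni]] = cell.get(cat_i[ni], 0.0) + wt
    return {no: max(cats, key=cats.get) for no, cats in totals.items()}
```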
-
- ! Global sum of output field
-
- allocate(mask_r8(ns_i), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- ! -----------------------------------------------------------------
- ! Error check2
- ! Compare global area of each soil color on input and output grids
- ! -----------------------------------------------------------------
-
- gast_i(:) = 0.
- do ni = 1,ns_i
- k = soil_color_i(ni)
- gast_i(k) = gast_i(k) + tgridmap%area_src(ni)*tdomain%mask(ni)*re**2
- end do
-
- gast_o(:) = 0.
- do no = 1,ns_o
- k = soil_color_o(no)
- gast_o(k) = gast_o(k) + tgridmap%area_dst(no)*frac_dst(no)*re**2
- end do
-
- ! area comparison
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'Soil Color Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,1001)
-1001 format (1x,'soil color type',20x,' input grid area output grid area',/ &
- 1x,33x,' 10**6 km**2',' 10**6 km**2')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
-
- do k = 0, nsoicol
- write (ndiag,1002) col(k),gast_i(k)*1.e-6,gast_o(k)*1.e-6
-1002 format (1x,a35,f16.3,f17.3)
- end do
-
- end if
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- if ( soil_color == unsetcol )then
- call gridmap_clean(tgridmap)
- end if
- deallocate (soil_color_i,gast_i,gast_o,col, frac_dst, mask_r8)
-
- write (6,*) 'Successfully made soil color classes'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mksoilcol
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkorganic
-!
-! !INTERFACE:
-subroutine mkorganic(ldomain, mapfname, datfname, ndiag, organic_o)
-!
-! !DESCRIPTION:
-! make organic matter dataset
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: organic_o(:,:) ! output grid:
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-!
-! Author: David Lawrence
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: organic_i(:,:) ! input grid: total column organic matter
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- real(r8) :: gomlev_i ! input grid: global organic on lev
- real(r8) :: garea_i ! input grid: global area
- real(r8) :: gomlev_o ! output grid: global organic on lev
- real(r8) :: garea_o ! output grid: global area
- integer :: k,n,m,ni,no,ns_i ! indices
- integer :: lev ! level index
- integer :: nlay ! number of soil layers
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=32) :: subname = 'mkorganic'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make organic matter dataset .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
-
- write (6,*) 'Open soil organic file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- call check_ret(nf_inq_dimid (ncid, 'number_of_layers', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, nlay), subname)
-
- allocate(organic_i(ns_i,nlay),stat=ier)
- if (ier/=0) call abort()
- allocate(frac_dst(ldomain%ns),stat=ier)
- if (ier/=0) call abort()
-
- if (nlay /= nlevsoi) then
- write(6,*)'nlay, nlevsoi= ',nlay,nlevsoi,' do not match'
- stop
- end if
-
- call check_ret(nf_inq_varid (ncid, 'ORGANIC', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, organic_i), subname)
-
- call check_ret(nf_close(ncid), subname)
-
- ! Area-average percent cover on input grid to output grid
- ! and correct according to the land mask
- ! Note that percent cover is in terms of total grid area.
-
- call gridmap_mapread(tgridmap, mapfname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- do lev = 1,nlay
- call gridmap_areaave_srcmask(tgridmap, organic_i(:,lev), organic_o(:,lev), nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- end do
-
- do lev = 1,nlevsoi
-
- ! Check for conservation
-
- do no = 1,ldomain%ns
- if ((organic_o(no,lev)) > 130.000001_r8) then
- write (6,*) 'MKORGANIC error: organic = ',organic_o(no,lev), &
- ' greater than 130.000001 for column, row = ',no
- call shr_sys_flush(6)
- stop
- end if
- enddo
-
-! ! Diagnostic output
-
- ! TODO: there is nothing being written out here currently - all zeroes
- ! So for now these are commented out
-!!$ write (ndiag,*)
-!!$ write (ndiag,'(1x,70a1)') ('.',k=1,70)
-!!$ write (ndiag,2001)
-!!$2001 format (1x,'surface type input grid area output grid area'/ &
-!!$ 1x,' 10**6 km**2 10**6 km**2 ')
-!!$ write (ndiag,'(1x,70a1)') ('.',k=1,70)
-!!$ write (ndiag,*)
-!!$ write (ndiag,2002) gomlev_i*1.e-06,gomlev_o*1.e-06
-!!$ write (ndiag,2004) garea_i*1.e-06,garea_o*1.e-06
-!!$2002 format (1x,'organic ',f14.3,f17.3)
-!!$2004 format (1x,'all surface ',f14.3,f17.3)
-!!$
- call shr_sys_flush(ndiag)
-
- write (6,*) 'Successfully made organic matter, level = ', lev
- call shr_sys_flush(6)
-
- end do ! lev
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (organic_i)
- deallocate (frac_dst)
-
- write (6,*) 'Successfully made organic matter'
- call shr_sys_flush(6)
- write(6,*)
-
-end subroutine mkorganic
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoilfmaxInit
-!
-! !INTERFACE:
-subroutine mksoilfmaxInit( )
-!
-! !DESCRIPTION:
-! Initialize the soil fmax override settings
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: sumtex
- character(len=32) :: subname = 'mksoilfmaxInit'
-!-----------------------------------------------------------------------
-
- ! Error check soil_fmax if it is set
- if ( soil_fmax /= unset )then
- if ( soil_fmax < 0.0 .or. soil_fmax > 1.0 )then
- write(6,*)'soil_fmax is out of range = ', soil_fmax
- stop
- end if
- write(6,*) 'Replace soil fmax for all points with: ', soil_fmax
- end if
-
-end subroutine mksoilfmaxInit
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkfmax
-!
-! !INTERFACE:
-subroutine mkfmax(ldomain, mapfname, datfname, ndiag, fmax_o)
-!
-! !DESCRIPTION:
-! make percent fmax
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(out):: fmax_o(:) ! output grid: %fmax
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Revised: Nan Rosenbloom - used mkglacier.F90 as template.
-! Original Author: Mariana Vertenstein
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: fmax_i(:) ! input grid: percent fmax
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- real(r8) :: gfmax_i ! input grid: global fmax
- real(r8) :: garea_i ! input grid: global area
- real(r8) :: gfmax_o ! output grid: global fmax
- real(r8) :: garea_o ! output grid: global area
- integer :: k,n,m,ni,no,ns_i,ns_o ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001 ! max error: sum overlap wts ne 1
- character(len=32) :: subname = 'mkfmax'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make %fmax .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
- ns_o = ldomain%ns
- allocate(fmax_i(ns_i), stat=ier)
- if (ier/=0) call abort()
- allocate(frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
-
- write (6,*) 'Open soil fmax file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_varid (ncid, 'FMAX', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, fmax_i), subname)
- call check_ret(nf_close(ncid), subname)
-
- ! Area-average percent cover on input grid to output grid
- ! and correct according to the land mask
- ! Note that percent cover is in terms of total grid area.
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Determine fmax_o on output grid
-
- ! In points with no data, use globalAvg
- ! (WJS (3-11-13): use real(.365783,r8) rather than .365783_r8 to maintain bfb results
- ! with old code)
- call gridmap_areaave_srcmask(tgridmap, fmax_i, fmax_o, nodata=real(.365783,r8), mask_src=tdomain%mask, frac_dst=frac_dst)
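The behavior of `gridmap_areaave_srcmask` here is a masked, area-weighted average onto each destination cell, with a fill value where no masked source data overlaps (that is how points with no data receive the global average). A Python sketch of that behavior, not the library's code (names are hypothetical):

```python
def areaave_srcmask(overlaps, fld_i, mask_i, ndst, nodata):
    """Masked area-weighted average onto ndst destination cells.
    overlaps: iterable of (src_index, dst_index, weight) triples.
    Destination cells with zero total masked weight get nodata."""
    num = [0.0] * ndst
    den = [0.0] * ndst
    for ni, no, wovr in overlaps:
        wt = wovr * mask_i[ni]
        num[no] += wt * fld_i[ni]
        den[no] += wt
    return [n / d if d > 0.0 else nodata for n, d in zip(num, den)]
```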
-
- ! Check for conservation
-
- do no = 1, ns_o
- if ((fmax_o(no)) > 1.000001_r8) then
- write (6,*) 'MKFMAX error: fmax = ',fmax_o(no), &
- ' greater than 1.000001 for column, row = ',no
- call shr_sys_flush(6)
- stop
- end if
- enddo
-
- ! Global sum of output field -- must multiply by fraction of
- ! output grid that is land as determined by input grid
-
- allocate(mask_r8(ns_i), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- ! -----------------------------------------------------------------
- ! Error check2
- ! Compare global areas on input and output grids
- ! -----------------------------------------------------------------
-
- gfmax_i = 0.
- garea_i = 0.
- do ni = 1,ns_i
- garea_i = garea_i + tgridmap%area_src(ni)*re**2
- gfmax_i = gfmax_i + fmax_i(ni)*(tgridmap%area_src(ni)/100.)* &
- tdomain%mask(ni)*re**2
- end do
-
- gfmax_o = 0.
- garea_o = 0.
- do no = 1,ns_o
- garea_o = garea_o + tgridmap%area_dst(no)*re**2
- gfmax_o = gfmax_o + fmax_o(no)*(tgridmap%area_dst(no)/100.) * &
- frac_dst(no)*re**2
- if ((frac_dst(no) < 0.0) .or. (frac_dst(no) > 1.0001)) then
- write(6,*) "ERROR:: frac_dst out of range: ", frac_dst(no),no
- stop
- end if
- end do
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- write (ndiag,*) 'Maximum Fractional Saturated Area Output'
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
-2001 format (1x,'surface type input grid area output grid area'/ &
- 1x,' 10**6 km**2 10**6 km**2 ')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2002) gfmax_i*1.e-06,gfmax_o*1.e-06
- write (ndiag,2004) garea_i*1.e-06,garea_o*1.e-06
-2002 format (1x,'fmax ',f14.3,f17.3)
-2004 format (1x,'all surface ',f14.3,f17.3)
-
- write (6,*) 'Successfully made %fmax'
- write (6,*)
- call shr_sys_flush(6)
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (fmax_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
-
-end subroutine mkfmax
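The conservation diagnostic in `mkfmax` above reduces to computing an area-weighted global sum of the field on the source and destination grids and checking that they agree. A minimal Python sketch of that reduction (illustrative names and data, not the tool's API):

```python
def global_area_sum(field, area, frac):
    """Area-weighted global sum: sum over cells of field * cell area * land fraction.
    mkfmax computes this on both grids; a conservative remapping makes them agree
    to roundoff."""
    return sum(f * a * fr for f, a, fr in zip(field, area, frac))

# Two source cells of unit area average to one destination cell of double area:
# both sums come out to 1.0, so the remap conserves the field.
src_sum = global_area_sum([0.4, 0.6], [1.0, 1.0], [1.0, 1.0])
dst_sum = global_area_sum([0.5], [2.0], [1.0])
```

The actual Fortran additionally scales by `re**2` (earth radius squared) to convert normalized cell areas to km**2; that constant factor cancels in the comparison.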
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoilAtt
-!
-! !INTERFACE:
-subroutine mksoilAtt( ncid, dynlanduse, xtype )
-!
-! !DESCRIPTION:
-! add attributes to output file regarding the soil module
-!
-! !USES:
- use fileutils , only : get_filename
- use mkncdio , only : check_ret, ncd_defvar, ncd_def_spatial_var
- use mkvarpar
- use mkvarctl
-
-! !ARGUMENTS:
- implicit none
- include 'netcdf.inc'
- integer, intent(in) :: ncid ! NetCDF file ID to write out to
- logical, intent(in) :: dynlanduse ! if dynamic land-use file
- integer, intent(in) :: xtype ! external type to output real data as
-!
-! !CALLED FROM:
-! subroutine mkfile in module mkfileMod
-!
-! !REVISION HISTORY:
-! Original Author: Erik Kluzek
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: dimid ! temporary
- character(len=256) :: str ! global attribute string
- character(len=32) :: subname = 'mksoilAtt'
-!-----------------------------------------------------------------------
-
- if (.not. dynlanduse) then
-
- ! Define dimensions unique to soil
-
- call check_ret(nf_def_dim (ncid, 'nlevsoi', &
- nlevsoi , dimid), subname)
-
- ! Add global attributes to file
-
- if ( soil_clay /= unset .and. soil_sand /= unset )then
- str = 'TRUE'
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'soil_clay_override', len_trim(str), trim(str)), subname)
- str = 'TRUE'
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'soil_sand_override', len_trim(str), trim(str)), subname)
- else
- str = get_filename(mksrf_fsoitex)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Soil_texture_raw_data_file_name', len_trim(str), trim(str)), subname)
- end if
- if ( soil_color /= unsetcol )then
- str = 'TRUE'
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'soil_color_override', len_trim(str), trim(str)), subname)
- else
- str = get_filename(mksrf_fsoicol)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Soil_color_raw_data_file_name', len_trim(str), trim(str)), subname)
- end if
- if ( soil_fmax /= unset )then
- str = 'TRUE'
- call check_ret(nf_put_att_text (ncid, NF_GLOBAL, &
- 'soil_fmax_override', len_trim(str), trim(str)), subname)
- else
- str = get_filename(mksrf_fmax)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Fmax_raw_data_file_name', len_trim(str), trim(str)), subname)
- end if
- str = get_filename(mksrf_forganic)
- call check_ret(nf_put_att_text(ncid, NF_GLOBAL, &
- 'Organic_matter_raw_data_file_name', len_trim(str), trim(str)), subname)
-
- ! Define variables
-
- call ncd_defvar(ncid=ncid, varname='mxsoil_color', xtype=nf_int, &
-            long_name='maximum number of soil colors', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='SOIL_COLOR', xtype=nf_int, &
- long_name='soil color', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_SAND', xtype=xtype, &
- lev1name='nlevsoi', &
- long_name='percent sand', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='PCT_CLAY', xtype=xtype, &
- lev1name='nlevsoi', &
- long_name='percent clay', units='unitless')
-
- call ncd_def_spatial_var(ncid=ncid, varname='ORGANIC', xtype=xtype, &
- lev1name='nlevsoi', &
- long_name='organic matter density at soil levels', &
- units='kg/m3 (assumed carbon content 0.58 gC per gOM)')
-
- call ncd_def_spatial_var(ncid=ncid, varname='FMAX', xtype=xtype, &
- long_name='maximum fractional saturated area', units='unitless')
-
- end if
-
-end subroutine mksoilAtt
-
-!-----------------------------------------------------------------------
-
-end module mksoilMod
diff --git a/tools/mksurfdata_map/src/mksoilUtilsMod.F90 b/tools/mksurfdata_map/src/mksoilUtilsMod.F90
deleted file mode 100644
index 122cfd45d5..0000000000
--- a/tools/mksurfdata_map/src/mksoilUtilsMod.F90
+++ /dev/null
@@ -1,224 +0,0 @@
-module mksoilUtilsMod
-
- !-----------------------------------------------------------------------
- !BOP
- !
- ! !MODULE: mksoilUtils
- !
- ! !DESCRIPTION:
- ! Lower-level utilities used in making soil data.
- !
- ! These are separated out from mksoilMod mainly as an aid to testing.
- !
- ! !REVISION HISTORY:
- ! Author: Bill Sacks
- !
- !-----------------------------------------------------------------------
- !!USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkgridmapMod, only : gridmap_type
-
- implicit none
- private
-
- !
- ! !PUBLIC MEMBER FUNCTIONS:
- !
- public :: dominant_soil_color
- public :: mkrank
-
- !
- ! !PRIVATE MEMBER FUNCTIONS:
- !
-
- !EOP
- !===============================================================
-contains
- !===============================================================
-
- !-----------------------------------------------------------------------
- subroutine dominant_soil_color(tgridmap, mask_i, soil_color_i, nsoicol, soil_color_o)
- !
- ! !DESCRIPTION:
- ! Determine the dominant soil color in each output cell
- !
- ! !ARGUMENTS:
- type(gridmap_type) , intent(in) :: tgridmap
- integer , intent(in) :: mask_i(:) ! input grid: land mask (1 = land, 0 = ocean)
- integer , intent(in) :: soil_color_i(:) ! input grid: BATS soil color
- integer , intent(in) :: nsoicol ! number of soil colors
- integer , intent(out) :: soil_color_o(:) ! output grid: soil color classes
- !
- ! !LOCAL VARIABLES:
- integer, parameter :: num = 2 ! set soil mapunit number
- integer :: wsti(num) ! index to 1st and 2nd largest wst
- integer :: k, n, ni, no, ns_i, ns_o
- real(r8) :: wt ! map overlap weight
- real(r8), allocatable :: wst(:,:) ! overlap weights, by surface type
- logical :: has_color ! whether this grid cell has non-zero color
- integer, parameter :: miss = 99999 ! missing data indicator
-
- character(len=*), parameter :: subname = 'dominant_soil_color'
- !-----------------------------------------------------------------------
-
- ns_i = size(mask_i)
- if (size(soil_color_i) /= ns_i) then
- write(6,*) subname, ' ERROR: size of soil_color_i should match size of mask_i'
- write(6,*) 'size(mask_i), size(soil_color_i) = ', &
- size(mask_i), size(soil_color_i)
- call abort()
- end if
-
- ! find area of overlap for each soil color for each no
-
- ns_o = size(soil_color_o)
- allocate(wst(0:nsoicol,ns_o))
- wst(0:nsoicol,:) = 0
-
- ! TODO: need to do a loop to determine
-    ! the maximum number of overlap cells throughout the grid
-    ! first get an array that is novr(ns_o) and fill this in - then set
-    ! maxovr - to max(novr) - then allocate the array wst to be size of
-    ! maxovr,ns_o or 0:nsoicol,ns_o
-
- do n = 1,tgridmap%ns
- ni = tgridmap%src_indx(n)
- no = tgridmap%dst_indx(n)
- wt = tgridmap%wovr(n) * mask_i(ni)
- k = soil_color_i(ni) * mask_i(ni)
- wst(k,no) = wst(k,no) + wt
- enddo
-
- soil_color_o(:) = 0
- do no = 1,ns_o
-
- ! If the output cell has any non-zero-colored inputs, then set the weight of
- ! zero-colored inputs to 0, to ensure that the zero-color is NOT dominant.
- if (any(wst(1:nsoicol,no) > 0.)) then
- has_color = .true.
- wst(0,no) = 0.0
- else
- has_color = .false.
- end if
-
- ! Rank non-zero weights by color type. wsti(1) is the most extensive
- ! color type.
-
- if (has_color) then
- call mkrank (nsoicol, wst(0:nsoicol,no), miss, wsti, num)
- soil_color_o(no) = wsti(1)
- end if
-
- ! If land but no color, set color to 15 (in older dataset generic
- ! soil color 4)
-
- if (nsoicol == 8) then
- if (soil_color_o(no)==0) then
- soil_color_o(no) = 4
- end if
- else if (nsoicol == 20) then
- if (soil_color_o(no)==0) then
- soil_color_o(no) = 15
- end if
- else
- write(6,*) 'MKSOILCOL error: unhandled nsoicol: ', nsoicol
- call abort()
- end if
-
- ! Error checks
-
- if (soil_color_o(no) < 0 .or. soil_color_o(no) > nsoicol) then
- write (6,*) 'MKSOILCOL error: land model soil color = ', &
-             soil_color_o(no),' is not valid for gridcell = ',no
- call abort()
- end if
-
- end do
-
- deallocate (wst)
-
- end subroutine dominant_soil_color
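The logic of `dominant_soil_color` above is a per-output-cell accumulation of overlap weights by class, with the "no data" class (0) suppressed whenever any colored input contributes. A stdlib Python sketch of that selection (hypothetical data, not the tool's API):

```python
def dominant_class(overlaps, nclass):
    """overlaps: (class_index, overlap_weight) pairs for one output cell.
    Accumulate weight per class; if any nonzero class has weight, zero out
    class 0 (no data) so it can never be dominant, then take the argmax."""
    wst = [0.0] * (nclass + 1)
    for k, wt in overlaps:
        wst[k] += wt
    if any(w > 0.0 for w in wst[1:]):
        wst[0] = 0.0
    # Returns 0 only when every input was colorless; the Fortran then
    # substitutes a default color (4 for 8-color, 15 for 20-color datasets).
    return max(range(nclass + 1), key=lambda k: wst[k])
```

For example, with inputs `[(0, 0.6), (3, 0.3), (5, 0.1)]` and 8 classes, class 3 wins even though the colorless overlap is largest, mirroring the Fortran behavior.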
-
-
- !-----------------------------------------------------------------------
- !BOP
- !
- ! !ROUTINE: mkrank
- !
- ! !INTERFACE:
- subroutine mkrank (n, a, miss, iv, num)
- !
- ! !DESCRIPTION:
- ! Return indices of largest [num] values in array [a].
- !
- ! !ARGUMENTS:
- integer , intent(in) :: n !array length
- real(r8), intent(in) :: a(0:n) !array to be ranked
- integer , intent(in) :: miss !missing data value
- integer , intent(in) :: num !number of largest values requested
- integer , intent(out):: iv(num) !index to [num] largest values in array [a]
- !
- ! !REVISION HISTORY:
- ! Author: Gordon Bonan
- !
- ! !LOCAL VARIABLES:
- !EOP
- real(r8) a_max !maximum value in array
- integer i !array index
- real(r8) delmax !tolerance for finding if larger value
- integer m !do loop index
- integer k !do loop index
- logical exclude !true if data value has already been chosen
- !-----------------------------------------------------------------------
-
- delmax = 1.e-06
-
- ! Find index of largest non-zero number
-
- iv(1) = miss
- a_max = -9999.
-
- do i = 0, n
- if (a(i)>0. .and. (a(i)-a_max)>delmax) then
- a_max = a(i)
- iv(1) = i
- end if
- end do
-
- ! iv(1) = miss indicates no values > 0. this is an error
-
- if (iv(1) == miss) then
- write (6,*) 'MKRANK error: iv(1) = missing'
- call abort()
- end if
-
- ! Find indices of the next [num]-1 largest non-zero number.
- ! iv(m) = miss if there are no more values > 0
-
- do m = 2, num
- iv(m) = miss
- a_max = -9999.
- do i = 0, n
-
- ! exclude if data value has already been chosen
-
- exclude = .false.
- do k = 1, m-1
- if (i == iv(k)) exclude = .true.
- end do
-
- ! if not already chosen, see if it is the largest of
- ! the remaining values
-
- if (.not. exclude) then
- if (a(i)>0. .and. (a(i)-a_max)>delmax) then
- a_max = a(i)
- iv(m) = i
- end if
- end if
- end do
- end do
-
- end subroutine mkrank
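The repeated scan in `mkrank` above (find the largest remaining positive value, excluding indices already chosen, with a small tolerance so near-ties keep the earlier index) can be sketched in Python as follows; names and defaults are illustrative, not the tool's API:

```python
def mkrank(a, num, miss=99999, tol=1e-6):
    """Indices of the `num` largest strictly-positive entries of a, largest
    first. Slots with no remaining positive value get `miss`; ties within
    `tol` keep the earlier index, mirroring the Fortran scan order."""
    chosen = []
    for _ in range(num):
        best, best_val = miss, -9999.0
        for i, v in enumerate(a):
            if i in chosen:
                continue  # exclude values already ranked
            if v > 0.0 and (v - best_val) > tol:
                best, best_val = i, v
        chosen.append(best)
    return chosen
```

Note the Fortran version additionally aborts when even the first slot is `miss` (no positive values at all); this sketch leaves that check to the caller.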
-
-end module mksoilUtilsMod
diff --git a/tools/mksurfdata_map/src/mksoildepthMod.F90 b/tools/mksurfdata_map/src/mksoildepthMod.F90
deleted file mode 100644
index 521ac2c6f0..0000000000
--- a/tools/mksurfdata_map/src/mksoildepthMod.F90
+++ /dev/null
@@ -1,172 +0,0 @@
-module mksoildepthMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mksoildepthMod
-!
-! !DESCRIPTION:
-! make soil depth from input soil depth data
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!-----------------------------------------------------------------------
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public mksoildepth ! regrid soildepth data
-!
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mksoildepth
-!
-! !INTERFACE:
-subroutine mksoildepth(ldomain, mapfname, datfname, ndiag, soildepth_o)
-!
-! !DESCRIPTION:
-! make soildepth
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_area
- use mkchecksMod, only : min_bad, max_bad
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
-  real(r8)          , intent(out):: soildepth_o(:)  ! output grid: soil depth
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Sam Levis and Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: data_i(:) ! data on input grid
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
-
- real(r8), parameter :: min_valid = 0._r8 ! minimum valid value
- real(r8), parameter :: max_valid = 100.000001_r8 ! maximum valid value
- character(len=32) :: subname = 'mksoildepth'
- character(len=32) :: varname
- integer :: varnum
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make soildepth .....'
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! -----------------------------------------------------------------
-
- call domain_read( tdomain, datfname )
-
- call gridmap_mapread( tgridmap, mapfname )
-
- ! Obtain frac_dst
- allocate(frac_dst(ldomain%ns), stat=ier)
- if (ier/=0) call abort()
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- allocate(mask_r8(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! -----------------------------------------------------------------
-
- write(6,*)'Open soildepth file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(data_i(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
-
- ! -----------------------------------------------------------------
- ! Regrid soildepth
- ! -----------------------------------------------------------------
-
- varnum = 1
- select case (varnum)
- case(1)
- varname = 'Avg_Depth_Median'
- case(2)
- varname = 'Avg_Depth_Mean'
- case(3)
- varname = 'Upland_Valley_Depth_Median'
- case(4)
- varname = 'Upland_Valley_Depth_Mean'
- case(5)
- varname = 'Upland_Hillslope_Depth_Median'
- case(6)
- varname = 'Upland_Hillslope_Depth_Mean'
- case(7)
- varname = 'Lowland_Depth_Mean'
- case(8)
- varname = 'Lowland_Depth_Mean'
- end select
-
-! call check_ret(nf_inq_varid (ncid, 'Avg_Depth_Median', varid), subname)
- call check_ret(nf_inq_varid (ncid, varname, varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areaave_srcmask(tgridmap, data_i, soildepth_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check validity of output data
- if (min_bad(soildepth_o, min_valid, 'soildepth') .or. &
- max_bad(soildepth_o, max_valid, 'soildepth')) then
- stop
- end if
-
- call output_diagnostics_area(data_i, soildepth_o, tgridmap, "Soildepth", percent=.false., ndiag=ndiag, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! -----------------------------------------------------------------
- ! Close files and deallocate dynamic memory
- ! -----------------------------------------------------------------
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (data_i)
- deallocate (frac_dst)
- deallocate (mask_r8)
-
- write (6,*) 'Successfully made soildepth'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mksoildepth
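The `min_bad`/`max_bad` validity check used in `mksoildepth` above is a simple range scan that stops the run when any output value falls outside `[min_valid, max_valid]`. A stdlib Python sketch of the same idea (illustrative, not the `mkchecksMod` API):

```python
def range_check(field, vmin, vmax):
    """Return (index, value) pairs outside [vmin, vmax]; the Fortran tool
    stops when this list is non-empty."""
    return [(i, v) for i, v in enumerate(field) if v < vmin or v > vmax]

# Matches the bounds used for soildepth: min_valid = 0, max_valid = 100.000001
# (the small slack above 100 tolerates roundoff from the area averaging).
bad = range_check([0.0, 3.5, 120.0], 0.0, 100.000001)
```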
-
-
-end module mksoildepthMod
diff --git a/tools/mksurfdata_map/src/mksurfdat.F90 b/tools/mksurfdata_map/src/mksurfdat.F90
deleted file mode 100644
index bc6ef6f028..0000000000
--- a/tools/mksurfdata_map/src/mksurfdat.F90
+++ /dev/null
@@ -1,1631 +0,0 @@
-program mksurfdat
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !PROGRAM: mksurfdat
-!
-! !DESCRIPTION:
-! Creates land model surface dataset from original "raw" data files.
-! Surface dataset contains model grid, pfts, inland water, glacier,
-! soil texture, soil color, LAI and SAI, urban fraction, and urban
-! parameters.
-!
-! !USES:
- use shr_kind_mod , only : r8 => shr_kind_r8, r4 => shr_kind_r4
- use fileutils , only : opnfil, getavu
- use mklaiMod , only : mklai
- use mkpctPftTypeMod , only : pct_pft_type, get_pct_p2l_array, get_pct_l2g_array, update_max_array
- use mkpftConstantsMod , only : natpft_lb, natpft_ub, cft_lb, cft_ub, num_cft
- use mkpftMod , only : pft_idx, pft_frc, mkpft, mkpftInit, mkpft_parse_oride
- use mksoilMod , only : soil_sand, soil_clay, mksoiltex, mksoilInit, &
- soil_color, mksoilcol, mkorganic, &
- soil_fmax, mkfmax
- use mkvocefMod , only : mkvocef
- use mklanwatMod , only : mklakwat, mkwetlnd, mklakparams
- use mkglacierregionMod , only : mkglacierregion
- use mkglcmecMod , only : nglcec, mkglcmec, mkglcmecInit, mkglacier
- use mkharvestMod , only : mkharvest, mkharvest_init, mkharvest_fieldname
- use mkharvestMod , only : mkharvest_numtypes, mkharvest_parse_oride
- use mkharvestMod , only : harvestDataType
- use mkurbanparCommonMod, only : mkelev
- use mkurbanparMod , only : mkurbanInit, mkurban, mkurbanpar, numurbl
- use mkutilsMod , only : normalize_classes_by_gcell
- use mkfileMod , only : mkfile
- use mkvarpar , only : nlevsoi, elev_thresh, numstdpft
- use mkvarctl
- use nanMod , only : nan, bigint
- use mkncdio , only : check_ret, ncd_put_time_slice
- use mkdomainMod , only : domain_type, domain_read_map, domain_read, &
- domain_write
- use mkgdpMod , only : mkgdp
- use mkpeatMod , only : mkpeat
- use mksoildepthMod , only : mksoildepth
- use mkagfirepkmonthMod , only : mkagfirepkmon
- use mktopostatsMod , only : mktopostats
- use mkVICparamsMod , only : mkVICparams
-!
-! !ARGUMENTS:
- implicit none
-
- include 'netcdf.inc'
-!
-! !REVISION HISTORY:
-! Authors: Gordon Bonan, Sam Levis and Mariana Vertenstein
-! Revised: Nan Rosenbloom to add fmax processing.
-! 3/18/08: David Lawrence added organic matter processing
-! 1/22/09: Keith Oleson added urban parameter processing
-! 2/11/13: Sam Levis added abm, peat, and gdp processing for new fire model
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: nsoicol ! number of model color classes
- integer :: k,m,n ! indices
- integer :: ni,nj,ns_o ! indices
- integer :: ier ! error status
- integer :: ndiag,nfdyn ! unit numbers
- integer :: ncid ! netCDF id
- integer :: omode ! netCDF output mode
- integer :: varid ! netCDF variable id
- integer :: ret ! netCDF return status
- integer :: ntim ! time sample for dynamic land use
- integer :: year ! year for dynamic land use
- integer :: year2 ! year for dynamic land use for harvest file
- logical :: all_veg ! if gridcell will be 100% vegetated land-cover
- real(r8) :: suma ! sum for error check
- character(len=256) :: fgrddat ! grid data file
- character(len=256) :: fsurdat ! output surface data file name (if blank, do not output a surface dataset)
- character(len=256) :: fsurlog ! output surface log file name
- character(len=256) :: fdyndat ! dynamic landuse data file name
- character(len=256) :: fname ! generic filename
- character(len=256) :: fhrvname ! generic harvest filename
- character(len=256) :: string ! string read in
- integer :: t1 ! timer
- real(r8),parameter :: p5 = 0.5_r8 ! constant
- real(r8),parameter :: p25 = 0.25_r8 ! constant
-
- real(r8), allocatable :: landfrac_pft(:) ! PFT data: % land per gridcell
- real(r8), allocatable :: pctlnd_pft(:) ! PFT data: % of gridcell for PFTs
- real(r8), allocatable :: pctlnd_pft_dyn(:) ! PFT data: % of gridcell for dyn landuse PFTs
- integer , allocatable :: pftdata_mask(:) ! mask indicating real or fake land type
- type(pct_pft_type), allocatable :: pctnatpft(:) ! % of grid cell that is nat veg, and breakdown into PFTs
- type(pct_pft_type), allocatable :: pctnatpft_max(:) ! % of grid cell maximum PFTs of the time series
- type(pct_pft_type), allocatable :: pctcft(:) ! % of grid cell that is crop, and breakdown into CFTs
- type(pct_pft_type), allocatable :: pctcft_max(:) ! % of grid cell maximum CFTs of the time series
- real(r8) :: harvest_initval ! initial value for harvest variables
- real(r8), pointer :: harvest1D(:) ! harvest 1D data: normalized harvesting
-  real(r8), pointer     :: harvest2D(:,:)       ! harvest 2D data: normalized harvesting
- real(r8), allocatable :: pctgla(:) ! percent of grid cell that is glacier
- real(r8), allocatable :: pctglc_gic(:) ! percent of grid cell that is gic (% of glc landunit)
- real(r8), allocatable :: pctglc_icesheet(:) ! percent of grid cell that is ice sheet (% of glc landunit)
- real(r8), allocatable :: pctglcmec(:,:) ! glacier_mec pct coverage in each class (% of landunit)
- real(r8), allocatable :: topoglcmec(:,:) ! glacier_mec sfc elevation in each gridcell and class
- real(r8), allocatable :: pctglcmec_gic(:,:) ! GIC pct coverage in each class (% of landunit)
- real(r8), allocatable :: pctglcmec_icesheet(:,:) ! icesheet pct coverage in each class (% of landunit)
- real(r8), allocatable :: elevclass(:) ! glacier_mec elevation classes
- integer, allocatable :: glacier_region(:) ! glacier region ID
- real(r8), allocatable :: pctlak(:) ! percent of grid cell that is lake
- real(r8), allocatable :: pctwet(:) ! percent of grid cell that is wetland
- real(r8), allocatable :: pcturb(:) ! percent of grid cell that is urbanized (total across all urban classes)
- real(r8), allocatable :: urbn_classes(:,:) ! percent cover of each urban class, as % of total urban area
- real(r8), allocatable :: urbn_classes_g(:,:)! percent cover of each urban class, as % of grid cell
- real(r8), allocatable :: elev(:) ! glc elevation (m)
- real(r8), allocatable :: fmax(:) ! fractional saturated area
- integer , allocatable :: soicol(:) ! soil color
- real(r8), allocatable :: pctsand(:,:) ! soil texture: percent sand
- real(r8), allocatable :: pctclay(:,:) ! soil texture: percent clay
- real(r8), allocatable :: ef1_btr(:) ! Isoprene emission factor for broadleaf
- real(r8), allocatable :: ef1_fet(:) ! Isoprene emission factor for fine/everg
- real(r8), allocatable :: ef1_fdt(:) ! Isoprene emission factor for fine/dec
- real(r8), allocatable :: ef1_shr(:) ! Isoprene emission factor for shrubs
- real(r8), allocatable :: ef1_grs(:) ! Isoprene emission factor for grasses
- real(r8), allocatable :: ef1_crp(:) ! Isoprene emission factor for crops
- real(r8), allocatable :: organic(:,:) ! organic matter density (kg/m3)
- real(r8), allocatable :: gdp(:) ! GDP (x1000 1995 US$/capita)
- real(r8), allocatable :: fpeat(:) ! peatland fraction of gridcell
- real(r8), allocatable :: soildepth(:) ! soil depth (m)
- integer , allocatable :: agfirepkmon(:) ! agricultural fire peak month
- integer , allocatable :: urban_region(:) ! urban region ID
- real(r8), allocatable :: topo_stddev(:) ! standard deviation of elevation (m)
- real(r8), allocatable :: slope(:) ! topographic slope (degrees)
- real(r8), allocatable :: vic_binfl(:) ! VIC b parameter (unitless)
- real(r8), allocatable :: vic_ws(:) ! VIC Ws parameter (unitless)
- real(r8), allocatable :: vic_dsmax(:) ! VIC Dsmax parameter (mm/day)
- real(r8), allocatable :: vic_ds(:) ! VIC Ds parameter (unitless)
- real(r8), allocatable :: lakedepth(:) ! lake depth (m)
-
- real(r8) :: std_elev = -999.99_r8 ! Standard deviation of elevation (m) to use for entire grid
-
- integer, allocatable :: harvind1D(:) ! Indices of 1D harvest fields
- integer, allocatable :: harvind2D(:) ! Indices of 2D harvest fields
-
- ! NOTE(bja, 2015-01) added to work around a ?bug? causing 1x1_urbanc_alpha to abort. See
- !/glade/p/cesm/cseg/inputdata/lnd/clm2/surfdata_map/README_c141219
- logical :: urban_skip_abort_on_invalid_data_check
-
- type(domain_type) :: ldomain
-
- character(len=32) :: subname = 'mksrfdat' ! program name
- type(harvestDataType) :: harvdata
-
- namelist /clmexp/ &
- mksrf_fgrid, &
- mksrf_gridtype, &
- mksrf_fvegtyp, &
- mksrf_fhrvtyp, &
- mksrf_fsoitex, &
- mksrf_forganic, &
- mksrf_fsoicol, &
- mksrf_fvocef, &
- mksrf_flakwat, &
- mksrf_fwetlnd, &
- mksrf_fglacier, &
- mksrf_fglacierregion, &
- mksrf_furbtopo, &
- mksrf_fmax, &
- mksrf_furban, &
- mksrf_flai, &
- mksrf_fdynuse, &
- mksrf_fgdp, &
- mksrf_fpeat, &
- mksrf_fsoildepth, &
- mksrf_fabm, &
- mksrf_ftopostats, &
- mksrf_fvic, &
- mksrf_fch4, &
- nglcec, &
- numpft, &
- soil_color, &
- soil_sand, &
- soil_fmax, &
- soil_clay, &
- pft_idx, &
- all_veg, &
- pft_frc, &
- all_urban, &
- no_inlandwet, &
- map_fpft, &
- map_flakwat, &
- map_fwetlnd, &
- map_fglacier, &
- map_fglacierregion, &
- map_fsoitex, &
- map_fsoicol, &
- map_furban, &
- map_furbtopo, &
- map_fmax, &
- map_forganic, &
- map_fvocef, &
- map_flai, &
- map_fharvest, &
- map_fgdp, &
- map_fpeat, &
- map_fsoildepth, &
- map_fabm, &
- map_ftopostats, &
- map_fvic, &
- map_fch4, &
- gitdescribe, &
- outnc_large_files, &
- outnc_double, &
- outnc_dims, &
- outnc_vic, &
- outnc_3dglc, &
- fsurdat, &
- fdyndat, &
- fsurlog, &
- std_elev, &
- urban_skip_abort_on_invalid_data_check
-
-!-----------------------------------------------------------------------
-
- ! ======================================================================
- ! Read input namelist
- ! ======================================
- ! Must specify settings for the output grid:
- ! ======================================
- ! mksrf_fgrid -- Grid dataset
- ! ======================================
- ! Must specify settings for input high resolution datafiles
- ! ======================================
- ! mksrf_fglacier - Glacier dataset
- ! mksrf_fglacierregion - Glacier region ID dataset
- ! mksrf_flai ----- Leaf Area Index dataset
- ! mksrf_flakwat -- Lake water dataset
- ! mksrf_fwetlnd -- Wetland water dataset
- ! mksrf_forganic - Organic soil carbon dataset
- ! mksrf_fmax ----- Max fractional saturated area dataset
- ! mksrf_fsoicol -- Soil color dataset
- ! mksrf_fsoitex -- Soil texture dataset
-  !    mksrf_furbtopo - Topography dataset (for limiting urban areas)
- ! mksrf_furban --- Urban dataset
- ! mksrf_fvegtyp -- PFT vegetation type dataset
- ! mksrf_fhrvtyp -- harvest type dataset
-  !    mksrf_fvocef --- Volatile Organic Compound Emission Factor dataset
- ! mksrf_fgdp ----- GDP dataset
- ! mksrf_fpeat ---- Peatland dataset
- ! mksrf_fsoildepth Soil depth dataset
- ! mksrf_fabm ----- Agricultural fire peak month dataset
- ! mksrf_ftopostats Topography statistics dataset
- ! mksrf_fvic ----- VIC parameters dataset
- ! mksrf_fch4 ----- inversion-derived CH4 parameters dataset
- ! ======================================
- ! Must specify mapping file for the different datafiles above
- ! ======================================
- ! map_fpft -------- Mapping for mksrf_fvegtyp
- ! map_flakwat ----- Mapping for mksrf_flakwat
- ! map_fwetlnd ----- Mapping for mksrf_fwetlnd
- ! map_fglacier ---- Mapping for mksrf_fglacier
- ! map_fglacierregion - Mapping for mksrf_fglacierregion
- ! map_fsoitex ----- Mapping for mksrf_fsoitex
- ! map_fsoicol ----- Mapping for mksrf_fsoicol
- ! map_furban ------ Mapping for mksrf_furban
- ! map_furbtopo ---- Mapping for mksrf_furbtopo
- ! map_fmax -------- Mapping for mksrf_fmax
- ! map_forganic ---- Mapping for mksrf_forganic
- ! map_fvocef ------ Mapping for mksrf_fvocef
- ! map_flai -------- Mapping for mksrf_flai
- ! map_fharvest ---- Mapping for mksrf_flai harvesting
- ! map_fgdp -------- Mapping for mksrf_fgdp
- ! map_fpeat ------- Mapping for mksrf_fpeat
- ! map_fsoildepth -- Mapping for mksrf_fsoildepth
- ! map_fabm -------- Mapping for mksrf_fabm
- ! map_ftopostats -- Mapping for mksrf_ftopostats
- ! map_fvic -------- Mapping for mksrf_fvic
- ! map_fch4 -------- Mapping for mksrf_fch4
- ! ======================================
- ! Optionally specify setting for:
- ! ======================================
- ! mksrf_fdynuse ----- ASCII text file that lists each year of pft files to use
- ! mksrf_gridtype ---- Type of grid (default is 'global')
- ! outnc_double ------ If output should be in double precision
- ! outnc_large_files - If output should be in NetCDF large file format
- ! outnc_vic --------- Output fields needed for VIC
-  !    outnc_3dglc ------- Output 3D glacier fields (normally only needed for comparison)
- ! nglcec ------------ If you want to change the number of Glacier elevation classes
- ! gitdescribe ------- Description of this version from git
- ! ======================================
- ! Optional settings to change values for entire area
- ! ======================================
- ! all_urban --------- If entire area is urban
- ! all_veg ----------- If entire area is to be vegetated (pft_idx and pft_frc then required)
- ! no_inlandwet ------ If wetland should be set to 0% over land
- ! soil_color -------- If you want to change the soil_color to this value everywhere
- ! soil_clay --------- If you want to change the soil_clay % to this value everywhere
- ! soil_fmax --------- If you want to change the soil_fmax to this value everywhere
- ! soil_sand --------- If you want to change the soil_sand % to this value everywhere
- ! pft_idx ----------- If you want to change to 100% veg covered with given PFT indices
- ! pft_frc ----------- Fractions that correspond to the pft_idx above
- ! ==================
- ! numpft (if different than default of 16)
- ! ======================================
- ! Optional settings to work around urban bug?
- ! ======================================
- ! urban_skip_abort_on_invalid_data_check
- ! ======================================================================
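The required/optional split documented in the comment block above amounts to a validation pass right after the namelist is read: the grid file is mandatory and the grid type must be one of two known values. A minimal stdlib Python sketch of that check (illustrative names, not the tool's API):

```python
def validate_settings(cfg):
    """Mirror the two checks mksurfdat performs after reading /clmexp/:
    mksrf_fgrid must be set, and mksrf_gridtype must be a known value."""
    if not cfg.get("mksrf_fgrid", "").strip():
        raise ValueError("must specify mksrf_fgrid")
    if cfg.get("mksrf_gridtype", "global") not in ("global", "regional"):
        raise ValueError("illegal mksrf_gridtype, must be global or regional")
    return True
```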
-
- write(6,*) 'Attempting to initialize control settings .....'
-
- mksrf_gridtype = 'global'
- outnc_large_files = .false.
- outnc_double = .true.
- outnc_vic = .false.
- outnc_3dglc = .false.
- all_urban = .false.
- all_veg = .false.
- no_inlandwet = .true.
-
- ! default value for bug work around
- urban_skip_abort_on_invalid_data_check = .false.
-
- read(5, clmexp, iostat=ier)
- if (ier /= 0) then
- write(6,*)'error: namelist input resulted in error code ',ier
- call abort()
- endif
-
- write (6,*) 'Attempting to create surface boundary data .....'
- write (6,'(72a1)') ("-",n=1,60)
-
- ! ----------------------------------------------------------------------
- ! Error check namelist input
- ! ----------------------------------------------------------------------
-
- if (urban_skip_abort_on_invalid_data_check) then
- write(6, *) "WARNING: aborting on invalid data check in urban has been disabled!"
- write(6, *) "WARNING: urban data may be invalid!"
- end if
-
- if (mksrf_fgrid /= ' ')then
- fgrddat = mksrf_fgrid
- write(6,*)'mksrf_fgrid = ',mksrf_fgrid
- else
- write (6,*)'must specify mksrf_fgrid'
- call abort()
- endif
-
- if (trim(mksrf_gridtype) == 'global' .or. &
- trim(mksrf_gridtype) == 'regional') then
- write(6,*)'mksrf_gridtype = ',trim(mksrf_gridtype)
- else
- write(6,*)'mksrf_gridtype = ',trim(mksrf_gridtype)
- write (6,*)'illegal mksrf_gridtype, must be global or regional '
- call abort()
- endif
- if ( outnc_large_files )then
- write(6,*)'Output file in NetCDF 64-bit large_files format'
- end if
- if ( outnc_double )then
- write(6,*)'Output ALL data in file as 64-bit'
- end if
- if ( outnc_vic )then
- write(6,*)'Output VIC fields'
- end if
- if ( outnc_3dglc )then
- write(6,*)'Output optional 3D glacier fields (mostly used for verification of the glacier model)'
- end if
- if ( all_urban )then
- write(6,*) 'Output ALL data in file as 100% urban'
- end if
- if ( no_inlandwet )then
- write(6,*) 'Set wetland to 0% over land'
- end if
- if (nglcec <= 0) then
- write(6,*) 'nglcec must be at least 1'
- call abort()
- end if
-
- !
- ! Call module initialization routines
- !
- call mksoilInit( )
- call mkpftInit( zero_out_l=all_urban, all_veg_l=all_veg )
- allocate ( elevclass(nglcec+1) )
- call mkglcmecInit (elevclass)
- call mkurbanInit (mksrf_furban)
-
- if ( all_veg )then
- write(6,*) 'Output ALL data in file as 100% vegetated'
- end if
-
- ! ----------------------------------------------------------------------
- ! Determine land model grid, fractional land and land mask
- ! ----------------------------------------------------------------------
-
- write(6,*)'calling domain_read'
- if ( .not. domain_read_map(ldomain, fgrddat) )then
- call domain_read(ldomain, fgrddat)
- end if
- write(6,*)'finished domain_read'
-
- ! Invalidate mask and frac for ldomain
-
- !ldomain%mask = bigint
- !ldomain%frac = nan
-
- ! Determine if output will be 1d
-
- if (ldomain%ni /= -9999 .and. ldomain%nj /= -9999) then
- write(6,*)'fsurdat is 2d lat/lon grid'
- write(6,*)'nlon= ',ldomain%ni,' nlat= ',ldomain%nj
- if (outnc_dims == 1) then
- write(6,*)' writing output file in 1d gridcell format'
- end if
- else
- write(6,*)'fsurdat is 1d gridcell grid'
- outnc_dims = 1
- end if
-
- outnc_1d = .false.
- if ((ldomain%ni == -9999 .and. ldomain%nj == -9999) .or. outnc_dims==1) then
- outnc_1d = .true.
- write(6,*)'output file will be 1d'
- end if
-
- ! ----------------------------------------------------------------------
- ! Allocate and initialize dynamic memory
- ! ----------------------------------------------------------------------
-
- ns_o = ldomain%ns
- allocate ( landfrac_pft(ns_o) , &
- pctlnd_pft(ns_o) , &
- pftdata_mask(ns_o) , &
- pctnatpft(ns_o) , &
- pctnatpft_max(ns_o) , &
- pctcft(ns_o) , &
- pctcft_max(ns_o) , &
- pctgla(ns_o) , &
- pctlak(ns_o) , &
- pctwet(ns_o) , &
- pcturb(ns_o) , &
- urban_region(ns_o) , &
- urbn_classes(ns_o,numurbl) , &
- urbn_classes_g(ns_o,numurbl) , &
- pctsand(ns_o,nlevsoi) , &
- pctclay(ns_o,nlevsoi) , &
- soicol(ns_o) , &
- gdp(ns_o) , &
- fpeat(ns_o) , &
- soildepth(ns_o) , &
- agfirepkmon(ns_o) , &
- topo_stddev(ns_o) , &
- slope(ns_o) , &
- vic_binfl(ns_o) , &
- vic_ws(ns_o) , &
- vic_dsmax(ns_o) , &
- vic_ds(ns_o) , &
- lakedepth(ns_o) , &
- glacier_region(ns_o) )
- landfrac_pft(:) = spval
- pctlnd_pft(:) = spval
- pftdata_mask(:) = -999
- pctgla(:) = spval
- pctlak(:) = spval
- pctwet(:) = spval
- pcturb(:) = spval
- urban_region(:) = -999
- urbn_classes(:,:) = spval
- urbn_classes_g(:,:) = spval
- pctsand(:,:) = spval
- pctclay(:,:) = spval
- soicol(:) = -999
- gdp(:) = spval
- fpeat(:) = spval
- soildepth(:) = spval
- agfirepkmon(:) = -999
- topo_stddev(:) = spval
- slope(:) = spval
- vic_binfl(:) = spval
- vic_ws(:) = spval
- vic_dsmax(:) = spval
- vic_ds(:) = spval
- lakedepth(:) = spval
- glacier_region(:) = -999
-
- ! ----------------------------------------------------------------------
- ! Open diagnostic output log file
- ! ----------------------------------------------------------------------
-
- if (fsurlog == ' ') then
- write(6,*)' must specify fsurlog in namelist'
- stop
- else
- ndiag = getavu(); call opnfil (fsurlog, ndiag, 'f')
- end if
-
- if (urban_skip_abort_on_invalid_data_check) then
- write(ndiag, *) "WARNING: aborting on invalid data check in urban has been disabled!"
- write(ndiag, *) "WARNING: urban data may be invalid!"
- end if
-
- if (mksrf_fgrid /= ' ')then
- write (ndiag,*)'using fractional land data from file= ', &
- trim(mksrf_fgrid),' to create the surface dataset'
- endif
-
- if (trim(mksrf_gridtype) == 'global' .or. &
- trim(mksrf_gridtype) == 'regional') then
- write(6,*)'mksrf_gridtype = ',trim(mksrf_gridtype)
- endif
-
- write(ndiag,*) 'PFTs from: ',trim(mksrf_fvegtyp)
- write(ndiag,*) 'harvest from: ',trim(mksrf_fhrvtyp)
- write(ndiag,*) 'fmax from: ',trim(mksrf_fmax)
- write(ndiag,*) 'glaciers from: ',trim(mksrf_fglacier)
- write(ndiag,*) ' with: ', nglcec, ' glacier elevation classes'
- write(ndiag,*) 'glacier region ID from: ',trim(mksrf_fglacierregion)
- write(ndiag,*) 'urban topography from: ',trim(mksrf_furbtopo)
- write(ndiag,*) 'urban from: ',trim(mksrf_furban)
- write(ndiag,*) 'inland lake from: ',trim(mksrf_flakwat)
- write(ndiag,*) 'inland wetland from: ',trim(mksrf_fwetlnd)
- write(ndiag,*) 'soil texture from: ',trim(mksrf_fsoitex)
- write(ndiag,*) 'soil organic from: ',trim(mksrf_forganic)
- write(ndiag,*) 'soil color from: ',trim(mksrf_fsoicol)
- write(ndiag,*) 'VOC emission factors from: ',trim(mksrf_fvocef)
- write(ndiag,*) 'gdp from: ',trim(mksrf_fgdp)
- write(ndiag,*) 'peat from: ',trim(mksrf_fpeat)
- write(ndiag,*) 'soil depth from: ',trim(mksrf_fsoildepth)
- write(ndiag,*) 'abm from: ',trim(mksrf_fabm)
- write(ndiag,*) 'topography statistics from: ',trim(mksrf_ftopostats)
- write(ndiag,*) 'VIC parameters from: ',trim(mksrf_fvic)
- write(ndiag,*) 'CH4 parameters from: ',trim(mksrf_fch4)
- write(ndiag,*)' mapping for pft ',trim(map_fpft)
- write(ndiag,*)' mapping for lake water ',trim(map_flakwat)
- write(ndiag,*)' mapping for wetland ',trim(map_fwetlnd)
- write(ndiag,*)' mapping for glacier ',trim(map_fglacier)
- write(ndiag,*)' mapping for glacier region ',trim(map_fglacierregion)
- write(ndiag,*)' mapping for soil texture ',trim(map_fsoitex)
- write(ndiag,*)' mapping for soil color ',trim(map_fsoicol)
- write(ndiag,*)' mapping for soil organic ',trim(map_forganic)
- write(ndiag,*)' mapping for urban ',trim(map_furban)
- write(ndiag,*)' mapping for fmax ',trim(map_fmax)
- write(ndiag,*)' mapping for VOC pct emis ',trim(map_fvocef)
- write(ndiag,*)' mapping for harvest ',trim(map_fharvest)
- write(ndiag,*)' mapping for lai/sai ',trim(map_flai)
- write(ndiag,*)' mapping for urb topography ',trim(map_furbtopo)
- write(ndiag,*)' mapping for GDP ',trim(map_fgdp)
- write(ndiag,*)' mapping for peatlands ',trim(map_fpeat)
- write(ndiag,*)' mapping for soil depth ',trim(map_fsoildepth)
- write(ndiag,*)' mapping for ag fire pk month ',trim(map_fabm)
- write(ndiag,*)' mapping for topography stats ',trim(map_ftopostats)
- write(ndiag,*)' mapping for VIC parameters ',trim(map_fvic)
- write(ndiag,*)' mapping for CH4 parameters ',trim(map_fch4)
-
- if (mksrf_fdynuse /= ' ') then
- write(6,*)'mksrf_fdynuse = ',trim(mksrf_fdynuse)
- end if
-
- ! ----------------------------------------------------------------------
- ! Make surface dataset fields
- ! ----------------------------------------------------------------------
-
- ! Make PFTs [pctnatpft, pctcft] from dataset [fvegtyp]
-
- call mkpft(ldomain, mapfname=map_fpft, fpft=mksrf_fvegtyp, &
- ndiag=ndiag, pctlnd_o=pctlnd_pft, pctnatpft_o=pctnatpft, pctcft_o=pctcft)
-
- ! Create harvesting data at model resolution
- if (all_veg) then
- ! In this case, we don't call mkharvest, so we want the harvest variables to be
- ! initialized reasonably.
- harvest_initval = 0._r8
- else
- harvest_initval = spval
- end if
- call mkharvest_init( ns_o, harvest_initval, harvdata, mksrf_fhrvtyp )
- if ( .not. all_veg )then
-
- call mkharvest( ldomain, mapfname=map_fharvest, datfname=mksrf_fhrvtyp, &
- ndiag=ndiag, harvdata=harvdata )
- end if
-
- ! Make inland water [pctlak, pctwet] [flakwat] [fwetlnd]
-
- call mklakwat (ldomain, mapfname=map_flakwat, datfname=mksrf_flakwat, &
- ndiag=ndiag, zero_out=all_urban.or.all_veg, lake_o=pctlak)
-
- call mkwetlnd (ldomain, mapfname=map_fwetlnd, datfname=mksrf_fwetlnd, &
- ndiag=ndiag, zero_out=all_urban.or.all_veg.or.no_inlandwet, swmp_o=pctwet)
-
- ! Make glacier fraction [pctgla] from [fglacier] dataset
-
- call mkglacier (ldomain, mapfname=map_fglacier, datfname=mksrf_fglacier, &
- ndiag=ndiag, zero_out=all_urban.or.all_veg, glac_o=pctgla)
-
- ! Make glacier region ID [glacier_region] from [fglacierregion] dataset
-
- call mkglacierregion (ldomain, mapfname=map_fglacierregion, &
- datfname=mksrf_fglacierregion, ndiag=ndiag, &
- glacier_region_o = glacier_region)
-
- ! Make soil texture [pctsand, pctclay] [fsoitex]
-
- call mksoiltex (ldomain, mapfname=map_fsoitex, datfname=mksrf_fsoitex, &
- ndiag=ndiag, sand_o=pctsand, clay_o=pctclay)
- ! Make soil color classes [soicol] [fsoicol]
-
- call mksoilcol (ldomain, mapfname=map_fsoicol, datfname=mksrf_fsoicol, &
- ndiag=ndiag, soil_color_o=soicol, nsoicol=nsoicol)
-
- ! Make fmax [fmax] from [fmax] dataset
-
- allocate(fmax(ns_o))
- fmax(:) = spval
- call mkfmax (ldomain, mapfname=map_fmax, datfname=mksrf_fmax, &
- ndiag=ndiag, fmax_o=fmax)
-
- ! Make GDP data [gdp] from [gdp]
-
- call mkgdp (ldomain, mapfname=map_fgdp, datfname=mksrf_fgdp, &
- ndiag=ndiag, gdp_o=gdp)
-
- ! Make peat data [fpeat] from [peatf]
-
- call mkpeat (ldomain, mapfname=map_fpeat, datfname=mksrf_fpeat, &
- ndiag=ndiag, peat_o=fpeat)
-
- ! Make soil depth data [soildepth] from [soildepthf]
-
- call mksoildepth (ldomain, mapfname=map_fsoildepth, datfname=mksrf_fsoildepth, &
- ndiag=ndiag, soildepth_o=soildepth)
-
- ! Make agricultural fire peak month data [abm] from [abm]
-
- call mkagfirepkmon (ldomain, mapfname=map_fabm, datfname=mksrf_fabm, &
- ndiag=ndiag, agfirepkmon_o=agfirepkmon)
-
- ! Make urban fraction [pcturb] from [furban] dataset
-
- call mkurban (ldomain, mapfname=map_furban, datfname=mksrf_furban, &
- ndiag=ndiag, zero_out=all_veg, urbn_o=pcturb, urbn_classes_o=urbn_classes, &
- region_o=urban_region)
-
- ! Make elevation [elev] from [ftopo, ffrac] dataset
- ! Used only to screen pcturb
- ! Screen pcturb by elevation threshold from elev dataset
-
- if ( .not. all_urban .and. .not. all_veg )then
- allocate(elev(ns_o))
- elev(:) = spval
- ! NOTE(wjs, 2016-01-15) This uses the 'TOPO_ICE' variable for historical reasons
- ! (this same dataset used to be used for glacier-related purposes as well).
- ! TODO(wjs, 2016-01-15) A better solution for this urban screening would probably
- ! be to modify the raw urban data; in that case, I believe we could remove
- ! furbtopo.
- call mkelev (ldomain, mapfname=map_furbtopo, datfname=mksrf_furbtopo, &
- varname='TOPO_ICE', ndiag=ndiag, elev_o=elev)
-
- where (elev .gt. elev_thresh)
- pcturb = 0._r8
- end where
- deallocate(elev)
- end if
-
- ! Compute topography statistics [topo_stddev, slope] from [ftopostats]
- call mktopostats (ldomain, mapfname=map_ftopostats, datfname=mksrf_ftopostats, &
- ndiag=ndiag, topo_stddev_o=topo_stddev, slope_o=slope, std_elev=std_elev)
-
- ! Make VIC parameters [binfl, ws, dsmax, ds] from [fvic]
- if ( outnc_vic )then
- call mkVICparams (ldomain, mapfname=map_fvic, datfname=mksrf_fvic, ndiag=ndiag, &
- binfl_o=vic_binfl, ws_o=vic_ws, dsmax_o=vic_dsmax, ds_o=vic_ds)
- end if
-
- ! Make lake depth [lakedepth] from [flakwat]
- call mklakparams (ldomain, mapfname=map_flakwat, datfname=mksrf_flakwat, ndiag=ndiag, &
- lakedepth_o=lakedepth)
-
- ! Make organic matter density [organic] [forganic]
- allocate (organic(ns_o,nlevsoi))
- organic(:,:) = spval
- call mkorganic (ldomain, mapfname=map_forganic, datfname=mksrf_forganic, &
- ndiag=ndiag, organic_o=organic)
-
- ! Make VOC emission factors for isoprene &
- ! [ef1_btr,ef1_fet,ef1_fdt,ef1_shr,ef1_grs,ef1_crp]
-
- allocate ( ef1_btr(ns_o) , &
- ef1_fet(ns_o) , &
- ef1_fdt(ns_o) , &
- ef1_shr(ns_o) , &
- ef1_grs(ns_o) , &
- ef1_crp(ns_o) )
- ef1_btr(:) = 0._r8
- ef1_fet(:) = 0._r8
- ef1_fdt(:) = 0._r8
- ef1_shr(:) = 0._r8
- ef1_grs(:) = 0._r8
- ef1_crp(:) = 0._r8
-
- call mkvocef (ldomain, mapfname=map_fvocef, datfname=mksrf_fvocef, ndiag=ndiag, &
- ef_btr_o=ef1_btr, ef_fet_o=ef1_fet, ef_fdt_o=ef1_fdt, &
- ef_shr_o=ef1_shr, ef_grs_o=ef1_grs, ef_crp_o=ef1_crp)
-
- ! Do landuse changes such as for the poles, etc.
-
- call change_landuse( ldomain, dynpft=.false. )
-
- do n = 1,ns_o
-
- ! Truncate all percentage fields on output grid. This is needed to
- ! ensure that wt is zero (not a very small number such as
- ! 1e-16) where it really should be zero
-
- do k = 1,nlevsoi
- pctsand(n,k) = float(nint(pctsand(n,k)))
- pctclay(n,k) = float(nint(pctclay(n,k)))
- end do
- pctlak(n) = float(nint(pctlak(n)))
- pctwet(n) = float(nint(pctwet(n)))
- pctgla(n) = float(nint(pctgla(n)))
-
- ! Assume wetland, glacier and/or lake when dataset landmask implies ocean
- ! (assume medium soil color (15) and loamy texture).
- ! Also set pftdata_mask here
-
- if (pctlnd_pft(n) < 1.e-6_r8) then
- pftdata_mask(n) = 0
- soicol(n) = 15
- if (pctgla(n) < 1.e-6_r8) then
- pctwet(n) = 100._r8 - pctlak(n)
- pctgla(n) = 0._r8
- else
- pctwet(n) = max(100._r8 - pctgla(n) - pctlak(n), 0.0_r8)
- end if
- pcturb(n) = 0._r8
- call pctnatpft(n)%set_pct_l2g(0._r8)
- call pctcft(n)%set_pct_l2g(0._r8)
- pctsand(n,:) = 43._r8
- pctclay(n,:) = 18._r8
- organic(n,:) = 0._r8
- else
- pftdata_mask(n) = 1
- end if
-
- ! Make sure sum of land cover types does not exceed 100. If it does,
- ! subtract excess from most dominant land cover.
-
- suma = pctlak(n) + pctwet(n) + pcturb(n) + pctgla(n)
- if (suma > 250._r8) then
- write (6,*) subname, ' error: sum of pctlak, pctwet,', &
- 'pcturb and pctgla is greater than 250%'
- write (6,*)'n,pctlak,pctwet,pcturb,pctgla= ', &
- n,pctlak(n),pctwet(n),pcturb(n),pctgla(n)
- call abort()
- else if (suma > 100._r8) then
- pctlak(n) = pctlak(n) * 100._r8/suma
- pctwet(n) = pctwet(n) * 100._r8/suma
- pcturb(n) = pcturb(n) * 100._r8/suma
- pctgla(n) = pctgla(n) * 100._r8/suma
- end if
-
- end do
-
- call normalizencheck_landuse(ldomain)
-
- ! Write out sum of PFT's
-
- do k = natpft_lb,natpft_ub
- suma = 0._r8
- do n = 1,ns_o
- suma = suma + pctnatpft(n)%get_one_pct_p2g(k)
- enddo
- write(6,*) 'sum over domain of pft ',k,suma
- enddo
- write(6,*)
-
- do k = cft_lb,cft_ub
- suma = 0._r8
- do n = 1,ns_o
- suma = suma + pctcft(n)%get_one_pct_p2g(k)
- enddo
- write(6,*) 'sum over domain of cft ',k,suma
- enddo
- write(6,*)
-
- ! Make final values of percent urban by class
- ! This call needs to occur after all corrections are made to pcturb
-
- call normalize_classes_by_gcell(urbn_classes, pcturb, urbn_classes_g)
-
-
- ! Make glacier multiple elevation classes [pctglcmec,topoglcmec] from [fglacier] dataset
- ! This call needs to occur after pctgla has been adjusted for the final time
-
- allocate (pctglcmec(ns_o,nglcec), &
- topoglcmec(ns_o,nglcec) )
- if ( outnc_3dglc )then
- allocate( &
- pctglcmec_gic(ns_o,nglcec), &
- pctglcmec_icesheet(ns_o,nglcec))
- allocate (pctglc_gic(ns_o))
- allocate (pctglc_icesheet(ns_o))
- end if
-
- pctglcmec(:,:) = spval
- topoglcmec(:,:) = spval
-
- if ( outnc_3dglc )then
- call mkglcmec (ldomain, mapfname=map_fglacier, &
- datfname_fglacier=mksrf_fglacier, ndiag=ndiag, &
- pctglcmec_o=pctglcmec, topoglcmec_o=topoglcmec, &
- pctglcmec_gic_o=pctglcmec_gic, pctglcmec_icesheet_o=pctglcmec_icesheet, &
- pctglc_gic_o=pctglc_gic, pctglc_icesheet_o=pctglc_icesheet)
- else
- call mkglcmec (ldomain, mapfname=map_fglacier, &
- datfname_fglacier=mksrf_fglacier, ndiag=ndiag, &
- pctglcmec_o=pctglcmec, topoglcmec_o=topoglcmec )
- end if
-
- ! Determine fractional land from pft dataset
-
- do n = 1,ns_o
- landfrac_pft(n) = pctlnd_pft(n)/100._r8
- end do
-
- ! ----------------------------------------------------------------------
- ! Create surface dataset
- ! ----------------------------------------------------------------------
-
- ! Create netCDF surface dataset.
-
- ! If fsurdat is blank, then we do not write a surface dataset - but we may still
- ! write a dynamic landuse file. This is useful if we are creating many datasets at
- ! once, and don't want duplicate surface datasets.
- !
- ! TODO(wjs, 2016-01-26) Ideally, we would also avoid doing the processing of
- ! variables that are just needed by the surface dataset (not by the dynamic landuse
- ! file). However, this would require some analysis of the above code, to determine
- ! which processing is needed (directly or indirectly) in order to create a dynamic
- ! landuse file.
-
- if (fsurdat /= ' ') then
-
- call mkfile(ldomain, trim(fsurdat), harvdata, dynlanduse = .false.)
-
- call domain_write(ldomain, fsurdat)
-
- call check_ret(nf_open(trim(fsurdat), nf_write, ncid), subname)
- call check_ret(nf_set_fill (ncid, nf_nofill, omode), subname)
-
- ! Write fields OTHER THAN lai, sai, heights, and urban parameters to netcdf surface dataset
-
- call check_ret(nf_inq_varid(ncid, 'natpft', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, (/(n,n=natpft_lb,natpft_ub)/)), subname)
-
- if (num_cft > 0) then
- call check_ret(nf_inq_varid(ncid, 'cft', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, (/(n,n=cft_lb,cft_ub)/)), subname)
- end if
-
- call check_ret(nf_inq_varid(ncid, 'PFTDATA_MASK', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, pftdata_mask), subname)
-
- call check_ret(nf_inq_varid(ncid, 'LANDFRAC_PFT', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, landfrac_pft), subname)
-
- call check_ret(nf_inq_varid(ncid, 'mxsoil_color', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, nsoicol), subname)
-
- call check_ret(nf_inq_varid(ncid, 'SOIL_COLOR', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, soicol), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_SAND', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctsand), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_CLAY', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctclay), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_WETLAND', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctwet), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_LAKE', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctlak), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_GLACIER', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctgla), subname)
-
- call check_ret(nf_inq_varid(ncid, 'GLACIER_REGION', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, glacier_region), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_GLC_MEC', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctglcmec), subname)
-
- call check_ret(nf_inq_varid(ncid, 'GLC_MEC', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, elevclass), subname)
-
- call check_ret(nf_inq_varid(ncid, 'TOPO_GLC_MEC', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, topoglcmec), subname)
-
- if ( outnc_3dglc )then
- call check_ret(nf_inq_varid(ncid, 'PCT_GLC_MEC_GIC', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctglcmec_gic), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_GLC_MEC_ICESHEET', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctglcmec_icesheet), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_GLC_GIC', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctglc_gic), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_GLC_ICESHEET', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, pctglc_icesheet), subname)
- end if
-
- call check_ret(nf_inq_varid(ncid, 'PCT_URBAN', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, urbn_classes_g), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_NATVEG', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, get_pct_l2g_array(pctnatpft)), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_CROP', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, get_pct_l2g_array(pctcft)), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_NAT_PFT', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, get_pct_p2l_array(pctnatpft)), subname)
-
- if (num_cft > 0) then
- call check_ret(nf_inq_varid(ncid, 'PCT_CFT', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, get_pct_p2l_array(pctcft)), subname)
- end if
-
- call harvdata%getFieldsIdx( harvind1D, harvind2D )
- do k = 1, harvdata%num1Dfields()
- call check_ret(nf_inq_varid(ncid, trim(mkharvest_fieldname(harvind1D(k),constant=.true.)), varid), subname)
- harvest1D => harvdata%get1DFieldPtr( harvind1D(k), output=.true. )
- call check_ret(nf_put_var_double(ncid, varid, harvest1D), subname)
- end do
- do k = 1, harvdata%num2Dfields()
- call check_ret(nf_inq_varid(ncid, trim(mkharvest_fieldname(harvind2D(k),constant=.true.)), varid), subname)
- harvest2D => harvdata%get2DFieldPtr( harvind2D(k), output=.true. )
- call check_ret(nf_put_var_double(ncid, varid, harvest2D), subname)
- end do
- deallocate( harvind1D, harvind2D )
-
- call check_ret(nf_inq_varid(ncid, 'FMAX', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, fmax), subname)
-
- call check_ret(nf_inq_varid(ncid, 'gdp', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, gdp), subname)
-
- call check_ret(nf_inq_varid(ncid, 'peatf', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, fpeat), subname)
-
-
- ! call check_ret(nf_inq_varid(ncid, 'Avg_Depth_Median', varid), subname)
- call check_ret(nf_inq_varid(ncid, 'zbedrock', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, soildepth), subname)
-
- call check_ret(nf_inq_varid(ncid, 'abm', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, agfirepkmon), subname)
-
- call check_ret(nf_inq_varid(ncid, 'SLOPE', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, slope), subname)
-
- call check_ret(nf_inq_varid(ncid, 'STD_ELEV', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, topo_stddev), subname)
-
- if ( outnc_vic )then
- call check_ret(nf_inq_varid(ncid, 'binfl', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, vic_binfl), subname)
-
- call check_ret(nf_inq_varid(ncid, 'Ws', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, vic_ws), subname)
-
- call check_ret(nf_inq_varid(ncid, 'Dsmax', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, vic_dsmax), subname)
-
- call check_ret(nf_inq_varid(ncid, 'Ds', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, vic_ds), subname)
- end if
-
- call check_ret(nf_inq_varid(ncid, 'LAKEDEPTH', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, lakedepth), subname)
-
- call check_ret(nf_inq_varid(ncid, 'EF1_BTR', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, ef1_btr), subname)
-
- call check_ret(nf_inq_varid(ncid, 'EF1_FET', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, ef1_fet), subname)
-
- call check_ret(nf_inq_varid(ncid, 'EF1_FDT', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, ef1_fdt), subname)
-
- call check_ret(nf_inq_varid(ncid, 'EF1_SHR', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, ef1_shr), subname)
-
- call check_ret(nf_inq_varid(ncid, 'EF1_GRS', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, ef1_grs), subname)
-
- call check_ret(nf_inq_varid(ncid, 'EF1_CRP', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, ef1_crp), subname)
-
- call check_ret(nf_inq_varid(ncid, 'ORGANIC', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, organic), subname)
-
- call check_ret(nf_inq_varid(ncid, 'URBAN_REGION_ID', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, urban_region), subname)
-
- ! Synchronize the disk copy of a netCDF dataset with in-memory buffers
-
- call check_ret(nf_sync(ncid), subname)
-
- ! ----------------------------------------------------------------------
- ! Make Urban Parameters from raw input data and write to surface dataset
- ! Write to netcdf file is done inside mkurbanpar routine
- ! ----------------------------------------------------------------------
-
- write(6,*)'calling mkurbanpar'
- call mkurbanpar(datfname=mksrf_furban, ncido=ncid, region_o=urban_region, &
- urbn_classes_gcell_o=urbn_classes_g, &
- urban_skip_abort_on_invalid_data_check=urban_skip_abort_on_invalid_data_check)
-
- ! ----------------------------------------------------------------------
- ! Make LAI and SAI from 1/2 degree data and write to surface dataset
- ! Write to netcdf file is done inside mklai routine
- ! ----------------------------------------------------------------------
-
- write(6,*)'calling mklai'
- call mklai(ldomain, mapfname=map_flai, datfname=mksrf_flai, &
- ndiag=ndiag, ncido=ncid )
-
- ! Close surface dataset
-
- call check_ret(nf_close(ncid), subname)
-
- write (6,'(72a1)') ("-",n=1,60)
- write (6,*)' land model surface data set successfully created for ', &
- 'grid of size ',ns_o
-
- else ! fsurdat == ' '
-
- write (6,*) 'fsurdat is blank: skipping writing surface dataset'
-
- end if ! if (fsurdat /= ' ')
-
- ! Deallocate arrays NOT needed for dynamic-pft section of code
-
- deallocate ( organic )
- deallocate ( ef1_btr, ef1_fet, ef1_fdt, ef1_shr, ef1_grs, ef1_crp )
- deallocate ( pctglcmec, topoglcmec)
- if ( outnc_3dglc ) deallocate ( pctglc_gic, pctglc_icesheet)
- deallocate ( elevclass )
- deallocate ( fmax )
- deallocate ( pctsand, pctclay )
- deallocate ( soicol )
- deallocate ( gdp, fpeat, agfirepkmon )
- deallocate ( soildepth )
- deallocate ( topo_stddev, slope )
- deallocate ( vic_binfl, vic_ws, vic_dsmax, vic_ds )
- deallocate ( lakedepth )
- deallocate ( glacier_region )
-
- call harvdata%clean()
-
- ! ----------------------------------------------------------------------
- ! Create dynamic land use dataset if appropriate
- ! ----------------------------------------------------------------------
-
- if (mksrf_fdynuse /= ' ') then
-
- write(6,*)'creating dynamic land use dataset'
-
- allocate(pctlnd_pft_dyn(ns_o))
- call mkharvest_init( ns_o, spval, harvdata, mksrf_fhrvtyp )
-
- if (fdyndat == ' ') then
- write(6,*)' must specify fdyndat in namelist if mksrf_fdynuse is not blank'
- stop
- end if
-
- ! Define dimensions and global attributes
-
- call mkfile(ldomain, fdyndat, harvdata, dynlanduse=.true.)
-
- ! Write fields other than pft to the dynamic land use dataset
-
- call domain_write(ldomain, fdyndat)
-
- call check_ret(nf_open(trim(fdyndat), nf_write, ncid), subname)
- call check_ret(nf_set_fill (ncid, nf_nofill, omode), subname)
-
- call check_ret(nf_inq_varid(ncid, 'natpft', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, (/(n,n=natpft_lb,natpft_ub)/)), subname)
-
- if (num_cft > 0) then
- call check_ret(nf_inq_varid(ncid, 'cft', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, (/(n,n=cft_lb,cft_ub)/)), subname)
- end if
-
- call check_ret(nf_inq_varid(ncid, 'PFTDATA_MASK', varid), subname)
- call check_ret(nf_put_var_int(ncid, varid, pftdata_mask), subname)
-
- call check_ret(nf_inq_varid(ncid, 'LANDFRAC_PFT', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, landfrac_pft), subname)
-
- ! Synchronize the disk copy of a netCDF dataset with in-memory buffers
-
- call check_ret(nf_sync(ncid), subname)
-
- ! Read in each dynamic pft landuse dataset
-
- nfdyn = getavu(); call opnfil (mksrf_fdynuse, nfdyn, 'f')
-
- pctnatpft_max = pctnatpft
- pctcft_max = pctcft
-
- ntim = 0
- do
- ! Read input pft data
-
- read(nfdyn, '(A195,1x,I4)', iostat=ier) string, year
- if (ier /= 0) exit
- !
- ! If pft fraction override is set, then interpret the string as PFT and harvesting override values
- !
- if ( all_veg )then
- fname = ' '
- fhrvname = ' '
- call mkpft_parse_oride(string)
- call mkharvest_parse_oride(string)
- write(6, '(a, i4, a)') 'PFT and harvesting values for year ', year, ' :'
- write(6, '(a, a)') ' ', trim(string)
- !
- ! Otherwise interpret the string as a filename with PFT and harvesting values in it
- !
- else
- fname = string
- write(6,*)'input pft dynamic dataset for year ', year, ' is : ', trim(fname)
- read(nfdyn, '(A195,1x,I4)', iostat=ier) fhrvname, year2
- if ( year2 /= year ) then
- write(6,*) subname, ' error: year for harvest not equal to year for PFT files'
- call abort()
- end if
- end if
- ntim = ntim + 1
-
- ! Create pctpft data at model resolution
-
- call mkpft(ldomain, mapfname=map_fpft, fpft=fname, &
- ndiag=ndiag, pctlnd_o=pctlnd_pft_dyn, pctnatpft_o=pctnatpft, pctcft_o=pctcft )
-
- ! Create harvesting data at model resolution
-
- call mkharvest( ldomain, mapfname=map_fharvest, datfname=fhrvname, &
- ndiag=ndiag, harvdata=harvdata )
-
- ! Consistency check on input land fraction
-
- do n = 1,ns_o
- if (pctlnd_pft_dyn(n) /= pctlnd_pft(n)) then
- write(6,*) subname,' error: pctlnd_pft for dynamics data = ',&
- pctlnd_pft_dyn(n), ' not equal to pctlnd_pft for surface data = ',&
- pctlnd_pft(n),' at n= ',n
- if ( trim(fname) == ' ' )then
- write(6,*) ' PFT string = ', string
- else
- write(6,*) ' PFT file = ', fname
- end if
- call abort()
- end if
- end do
-
- call change_landuse(ldomain, dynpft=.true.)
-
- call normalizencheck_landuse(ldomain)
-
- call update_max_array(pctnatpft_max,pctnatpft)
- call update_max_array(pctcft_max,pctcft)
-
- ! Output time-varying data for current year
-
- call check_ret(nf_inq_varid(ncid, 'PCT_NAT_PFT', varid), subname)
- call ncd_put_time_slice(ncid, varid, ntim, get_pct_p2l_array(pctnatpft))
-
- call check_ret(nf_inq_varid(ncid, 'PCT_CROP', varid), subname)
- call ncd_put_time_slice(ncid, varid, ntim, get_pct_l2g_array(pctcft))
-
- if (num_cft > 0) then
- call check_ret(nf_inq_varid(ncid, 'PCT_CFT', varid), subname)
- call ncd_put_time_slice(ncid, varid, ntim, get_pct_p2l_array(pctcft))
- end if
-
- call harvdata%getFieldsIdx( harvind1D, harvind2D )
- do k = 1, harvdata%num1Dfields()
- call check_ret(nf_inq_varid(ncid, trim(mkharvest_fieldname(harvind1D(k),constant=.false.)), varid), subname)
- harvest1D => harvdata%get1DFieldPtr( harvind1D(k), output=.true. )
- call ncd_put_time_slice(ncid, varid, ntim, harvest1D)
- end do
- do k = 1, harvdata%num2Dfields()
- call check_ret(nf_inq_varid(ncid, trim(mkharvest_fieldname(harvind2D(k),constant=.false.)), varid), subname)
- harvest2D => harvdata%get2DFieldPtr( harvind2D(k), output=.true. )
- call ncd_put_time_slice(ncid, varid, ntim, harvest2D)
- end do
- deallocate( harvind1D, harvind2D )
-
- call check_ret(nf_inq_varid(ncid, 'YEAR', varid), subname)
- call check_ret(nf_put_vara_int(ncid, varid, ntim, 1, year), subname)
-
- call check_ret(nf_inq_varid(ncid, 'time', varid), subname)
- call check_ret(nf_put_vara_int(ncid, varid, ntim, 1, year), subname)
-
- call check_ret(nf_inq_varid(ncid, 'input_pftdata_filename', varid), subname)
- call check_ret(nf_put_vara_text(ncid, varid, (/ 1, ntim /), (/ len_trim(string), 1 /), trim(string) ), subname)
-
- ! Synchronize the disk copy of a netCDF dataset with in-memory buffers
-
- call check_ret(nf_sync(ncid), subname)
-
- end do ! end of read loop
-
- call check_ret(nf_inq_varid(ncid, 'PCT_NAT_PFT_MAX', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, get_pct_p2l_array(pctnatpft_max)), subname)
-
- call check_ret(nf_inq_varid(ncid, 'PCT_CROP_MAX', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, get_pct_l2g_array(pctcft_max)), subname)
-
- if (num_cft > 0) then
- call check_ret(nf_inq_varid(ncid, 'PCT_CFT_MAX', varid), subname)
- call check_ret(nf_put_var_double(ncid, varid, get_pct_p2l_array(pctcft_max)), subname)
- end if
-
- call check_ret(nf_close(ncid), subname)
-
- end if ! end of if-create dynamic landuse dataset
-
- ! ----------------------------------------------------------------------
- ! Close diagnostic dataset
- ! ----------------------------------------------------------------------
-
- close (ndiag)
- write (6,*)
- write (6,*) 'Surface data output file = ',trim(fsurdat)
- write (6,*) ' This file contains the land model surface data'
- write (6,*) 'Diagnostic log file = ',trim(fsurlog)
- write (6,*) ' See this file for a summary of the dataset'
- write (6,*)
-
- write (6,*) 'Successfully created surface dataset'
-
-!-----------------------------------------------------------------------
-contains
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: change_landuse
-!
-! !INTERFACE:
-subroutine change_landuse( ldomain, dynpft )
-!
-! !DESCRIPTION:
-!
-! Do landuse changes such as for the poles, etc.
-!
-! !USES:
- implicit none
-!
-! !ARGUMENTS:
- type(domain_type) :: ldomain
- logical, intent(in) :: dynpft ! if part of the dynpft section of code
-
-!
-! !REVISION HISTORY:
-! 9/10/09: Erik Kluzek spin off subroutine from original embedded code
-!
-!EOP
-!
-! !LOCAL VARIABLES:
- integer :: n,ns_o ! indices
- character(len=32) :: subname = 'change_landuse' ! subroutine name
-!-----------------------------------------------------------------------
-
- ns_o = ldomain%ns
- do n = 1,ns_o
-
- ! If the grid has pole points, set the south pole to glacier;
- ! the north pole is assumed to be non-land
-
- if (abs((ldomain%latc(n) - 90._r8)) < 1.e-6_r8) then
- pctlak(n) = 0._r8
- pctwet(n) = 0._r8
- pcturb(n) = 0._r8
- pctgla(n) = 100._r8
- call pctnatpft(n)%set_pct_l2g(0._r8)
- call pctcft(n)%set_pct_l2g(0._r8)
- if ( .not. dynpft )then
- organic(n,:) = 0._r8
- ef1_btr(n) = 0._r8
- ef1_fet(n) = 0._r8
- ef1_fdt(n) = 0._r8
- ef1_shr(n) = 0._r8
- ef1_grs(n) = 0._r8
- ef1_crp(n) = 0._r8
- end if
- end if
-
- end do
-
-end subroutine change_landuse
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: normalizencheck_landuse
-!
-! !INTERFACE:
-subroutine normalizencheck_landuse(ldomain)
-!
-! !DESCRIPTION:
-!
-! Normalize land use so that landunits sum to 100%, and check
-! that the resulting values are consistent.
-!
-! Precondition: pctlak + pctwet + pcturb + pctgla <= 100 (within roundoff)
-!
-! !USES:
- use mkpftConstantsMod , only : baregroundindex
- use mkpftUtilsMod , only : adjust_total_veg_area
- implicit none
-! !ARGUMENTS:
- type(domain_type) :: ldomain
-!
-! !REVISION HISTORY:
-! 9/10/09: Erik Kluzek spin off subroutine from original embedded code
-!
-!EOP
-!
-! !LOCAL VARIABLES:
- integer :: m,k,n,ns_o ! indices
- integer :: nsmall ! number of small PFT values for a single check
- integer :: nsmall_tot ! total number of small PFT values in all grid cells
- real(r8) :: suma ! sum for error check
- real(r8) :: suma2 ! another sum for error check
- real(r8) :: new_total_veg_pct ! new % veg (% of grid cell, total of natural veg & crop)
- real(r8) :: bare_pct_p2g ! % of bare soil, as % of grid cell
- real(r8) :: bare_urb_diff ! difference between bare soil and urban %
- real(r8) :: pcturb_excess ! excess urban % not accounted for by bare soil
- real(r8) :: sum8, sum8a ! sum for error check
- real(r4) :: sum4a ! sum for error check
- real(r8), parameter :: tol_loose = 1.e-4_r8 ! tolerance for some 'loose' error checks
- real(r8), parameter :: toosmallPFT = 1.e-10_r8 ! tolerance for PFT's to ignore
- character(len=32) :: subname = 'normalizencheck_landuse' ! subroutine name
-!-----------------------------------------------------------------------
-
- ! ------------------------------------------------------------------------
- ! Normalize vegetated area so that vegetated + special area is 100%
- ! ------------------------------------------------------------------------
-
- ns_o = ldomain%ns
- do n = 1,ns_o
-
- ! Check preconditions
- if ( pctlak(n) < 0.0_r8 )then
- write(6,*) subname, ' ERROR: pctlak is negative!'
- write(6,*) 'n, pctlak = ', n, pctlak(n)
- call abort()
- end if
- if ( pctwet(n) < 0.0_r8 )then
- write(6,*) subname, ' ERROR: pctwet is negative!'
- write(6,*) 'n, pctwet = ', n, pctwet(n)
- call abort()
- end if
- if ( pcturb(n) < 0.0_r8 )then
- write(6,*) subname, ' ERROR: pcturb is negative!'
- write(6,*) 'n, pcturb = ', n, pcturb(n)
- call abort()
- end if
- if ( pctgla(n) < 0.0_r8 )then
- write(6,*) subname, ' ERROR: pctgla is negative!'
- write(6,*) 'n, pctgla = ', n, pctgla(n)
- call abort()
- end if
-
- suma = pctlak(n) + pctwet(n) + pcturb(n) + pctgla(n)
- if (suma > (100._r8 + tol_loose)) then
- write(6,*) subname, ' ERROR: pctlak + pctwet + pcturb + pctgla must be'
- write(6,*) '<= 100% before calling this subroutine'
- write(6,*) 'n, pctlak, pctwet, pcturb, pctgla = ', &
- n, pctlak(n), pctwet(n), pcturb(n), pctgla(n)
- call abort()
- end if
-
- ! First normalize vegetated (natural veg + crop) cover so that the total of
- ! (vegetated + (special excluding urban)) is 100%. We'll deal with urban later.
- !
- ! Note that, in practice, the total area of natural veg + crop is typically 100%
- ! going into this routine. However, the following code does NOT rely on this, and
- ! will work properly regardless of the initial area of natural veg + crop (even if
- ! that initial area is 0%).
-
- suma = pctlak(n)+pctwet(n)+pctgla(n)
- new_total_veg_pct = 100._r8 - suma
- ! correct for rounding error:
- new_total_veg_pct = max(new_total_veg_pct, 0._r8)
-
- call adjust_total_veg_area(new_total_veg_pct, pctnatpft=pctnatpft(n), pctcft=pctcft(n))
-
- ! Make sure we did the above rescaling correctly
-
- suma = suma + pctnatpft(n)%get_pct_l2g() + pctcft(n)%get_pct_l2g()
- if (abs(suma - 100._r8) > tol_loose) then
- write(6,*) subname, ' ERROR in rescaling veg based on (special excluding urban)'
- write(6,*) 'suma = ', suma
- call abort()
- end if
-
- ! Now decrease the vegetated area to account for urban area. Urban needs to be
- ! handled specially because we replace bare soil preferentially with urban, rather
- ! than rescaling all PFTs equally.
-
- if (pcturb(n) > 0._r8) then
-
- ! Replace bare soil preferentially with urban
- bare_pct_p2g = pctnatpft(n)%get_one_pct_p2g(baregroundindex)
- bare_urb_diff = bare_pct_p2g - pcturb(n)
- bare_pct_p2g = max(0._r8, bare_urb_diff)
- call pctnatpft(n)%set_one_pct_p2g(baregroundindex, bare_pct_p2g)
- pcturb_excess = abs(min(0._r8,bare_urb_diff))
-
- ! For any urban not accounted for by bare soil, replace other PFTs
- ! proportionally
- if (pcturb_excess > 0._r8) then
- ! Note that, in this case, we will have already reduced bare ground to 0%
-
- new_total_veg_pct = pctnatpft(n)%get_pct_l2g() + pctcft(n)%get_pct_l2g() - pcturb_excess
- if (new_total_veg_pct < 0._r8) then
- if (abs(new_total_veg_pct) < tol_loose) then
- ! only slightly less than 0; correct it
- new_total_veg_pct = 0._r8
- else
- write(6,*) subname, ' ERROR: trying to replace veg with urban,'
- write(6,*) 'but pcturb_excess exceeds current vegetation percent'
- call abort()
- end if
- end if
-
- call adjust_total_veg_area(new_total_veg_pct, pctnatpft=pctnatpft(n), pctcft=pctcft(n))
- end if
-
- end if ! pcturb(n) > 0
-
- ! Confirm that we have done the rescaling correctly: now the sum of all landunits
- ! should be 100%
- suma = pctlak(n)+pctwet(n)+pctgla(n)+pcturb(n)
- suma = suma + pctnatpft(n)%get_pct_l2g() + pctcft(n)%get_pct_l2g()
- if (abs(suma - 100._r8) > tol_loose) then
- write(6,*) subname, ' ERROR: landunits do not sum to 100%'
- write(6,*) 'n, suma, pctlak, pctwet, pctgla, pcturb, pctnatveg, pctcrop = '
- write(6,*) n, suma, pctlak(n), pctwet(n), pctgla(n), pcturb(n), &
- pctnatpft(n)%get_pct_l2g(), pctcft(n)%get_pct_l2g()
- call abort()
- end if
-
- end do
-
- ! ------------------------------------------------------------------------
- ! Do other corrections and error checks
- ! ------------------------------------------------------------------------
-
- nsmall_tot = 0
-
- do n = 1,ns_o
-
- ! If the coverage of any PFT or CFT is too small at the gridcell level, set its
- ! % cover to 0, then renormalize everything else as needed
- call pctnatpft(n)%remove_small_cover(toosmallPFT, nsmall)
- nsmall_tot = nsmall_tot + nsmall
- call pctcft(n)%remove_small_cover(toosmallPFT, nsmall)
- nsmall_tot = nsmall_tot + nsmall
-
- suma = pctlak(n) + pctwet(n) + pcturb(n) + pctgla(n)
- suma = suma + pctnatpft(n)%get_pct_l2g() + pctcft(n)%get_pct_l2g()
- if ( abs(suma - 100.0_r8) > 2.0*epsilon(suma) )then
- pctlak(n) = pctlak(n) * 100._r8/suma
- pctwet(n) = pctwet(n) * 100._r8/suma
- pcturb(n) = pcturb(n) * 100._r8/suma
- pctgla(n) = pctgla(n) * 100._r8/suma
- call pctnatpft(n)%set_pct_l2g(pctnatpft(n)%get_pct_l2g() * 100._r8/suma)
- call pctcft(n)%set_pct_l2g(pctcft(n)%get_pct_l2g() * 100._r8/suma)
- end if
-
- ! Roundoff error fix
- suma = pctlak(n) + pctwet(n) + pcturb(n) + pctgla(n)
- suma2 = pctnatpft(n)%get_pct_l2g() + pctcft(n)%get_pct_l2g()
- if ( (suma < 100._r8 .and. suma > (100._r8 - 1.e-6_r8)) .or. &
- (suma2 > 0.0_r8 .and. suma2 < 1.e-6_r8) ) then
- write (6,*) 'Special land units near 100%, but not quite for n,suma =',n,suma
- write (6,*) 'Adjusting special land units to 100%'
- if (pctlak(n) >= 25._r8) then
- pctlak(n) = 100._r8 - (pctwet(n) + pcturb(n) + pctgla(n))
- else if (pctwet(n) >= 25._r8) then
- pctwet(n) = 100._r8 - (pctlak(n) + pcturb(n) + pctgla(n))
- else if (pcturb(n) >= 25._r8) then
- pcturb(n) = 100._r8 - (pctlak(n) + pctwet(n) + pctgla(n))
- else if (pctgla(n) >= 25._r8) then
- pctgla(n) = 100._r8 - (pctlak(n) + pctwet(n) + pcturb(n))
- else
- write (6,*) subname, 'Error: sum of special land units nearly 100% but none is >= 25% at ', &
- 'n,pctlak(n),pctwet(n),pcturb(n),pctgla(n),pctnatveg(n),pctcrop(n),suma = ', &
- n,pctlak(n),pctwet(n),pcturb(n),pctgla(n),&
- pctnatpft(n)%get_pct_l2g(),pctcft(n)%get_pct_l2g(),suma
- call abort()
- end if
- call pctnatpft(n)%set_pct_l2g(0._r8)
- call pctcft(n)%set_pct_l2g(0._r8)
- end if
- if ( any(pctnatpft(n)%get_pct_p2g() > 0.0_r8 .and. pctnatpft(n)%get_pct_p2g() < toosmallPFT ) .or. &
- any(pctcft(n)%get_pct_p2g() > 0.0_r8 .and. pctcft(n)%get_pct_p2g() < toosmallPFT )) then
- write (6,*) 'pctnatpft or pctcft is small at n=', n
- write (6,*) 'pctnatpft%pct_p2l = ', pctnatpft(n)%get_pct_p2l()
- write (6,*) 'pctcft%pct_p2l = ', pctcft(n)%get_pct_p2l()
- write (6,*) 'pctnatpft%pct_l2g = ', pctnatpft(n)%get_pct_l2g()
- write (6,*) 'pctcft%pct_l2g = ', pctcft(n)%get_pct_l2g()
- call abort()
- end if
-
- suma = pctlak(n) + pctwet(n) + pcturb(n) + pctgla(n)
- if (suma < 100._r8-epsilon(suma) .and. suma > (100._r8 - 4._r8*epsilon(suma))) then
- write (6,*) subname, 'n,pctlak,pctwet,pcturb,pctgla,pctnatveg,pctcrop= ', &
- n,pctlak(n),pctwet(n),pcturb(n),pctgla(n),&
- pctnatpft(n)%get_pct_l2g(), pctcft(n)%get_pct_l2g()
- call abort()
- end if
- suma = suma + pctnatpft(n)%get_pct_l2g() + pctcft(n)%get_pct_l2g()
- if ( abs(suma-100._r8) > 1.e-10_r8) then
- write (6,*) subname, ' error: sum of pctlak, pctwet,', &
- 'pcturb, pctgla, pctnatveg and pctcrop is NOT equal to 100'
- write (6,*)'n,pctlak,pctwet,pcturb,pctgla,pctnatveg,pctcrop,sum= ', &
- n,pctlak(n),pctwet(n),pcturb(n),pctgla(n),&
- pctnatpft(n)%get_pct_l2g(),pctcft(n)%get_pct_l2g(), suma
- call abort()
- end if
-
- end do
-
- ! Check that when pctnatveg+pctcrop is identically zero, the sum of special landunits is identically 100%
-
- if ( .not. outnc_double )then
- do n = 1,ns_o
- sum8 = real(pctlak(n),r4)
- sum8 = sum8 + real(pctwet(n),r4)
- sum8 = sum8 + real(pcturb(n),r4)
- sum8 = sum8 + real(pctgla(n),r4)
- sum4a = real(pctnatpft(n)%get_pct_l2g(),r4)
- sum4a = sum4a + real(pctcft(n)%get_pct_l2g(),r4)
- if ( sum4a==0.0_r4 .and. sum8 < 100._r4-2._r4*epsilon(sum4a) )then
- write (6,*) subname, ' error: sum of pctlak, pctwet,', &
- 'pcturb, pctgla is < 100% when pctnatveg+pctcrop==0 sum = ', sum8
- write (6,*)'n,pctlak,pctwet,pcturb,pctgla,pctnatveg,pctcrop= ', &
- n,pctlak(n),pctwet(n),pcturb(n),pctgla(n), &
- pctnatpft(n)%get_pct_l2g(),pctcft(n)%get_pct_l2g()
- call abort()
- end if
- end do
- else
- do n = 1,ns_o
- sum8 = pctlak(n)
- sum8 = sum8 + pctwet(n)
- sum8 = sum8 + pcturb(n)
- sum8 = sum8 + pctgla(n)
- sum8a = pctnatpft(n)%get_pct_l2g()
- sum8a = sum8a + pctcft(n)%get_pct_l2g()
- if ( sum8a==0._r8 .and. sum8 < (100._r8-4._r8*epsilon(sum8)) )then
- write (6,*) subname, ' error: sum of pctlak, pctwet,', &
- 'pcturb, pctgla is < 100% when pctnatveg+pctcrop==0 sum = ', sum8
- write (6,*) 'Total error, error/epsilon = ',100._r8-sum8, ((100._r8-sum8)/epsilon(sum8))
- write (6,*)'n,pctlak,pctwet,pcturb,pctgla,pctnatveg,pctcrop,epsilon= ', &
- n,pctlak(n),pctwet(n),pcturb(n),pctgla(n),&
- pctnatpft(n)%get_pct_l2g(),pctcft(n)%get_pct_l2g(), epsilon(sum8)
- call abort()
- end if
- end do
- end if
-
- ! Make sure that there is no vegetation outside the pft mask
- do n = 1,ns_o
- if (pftdata_mask(n) == 0 .and. (pctnatpft(n)%get_pct_l2g() > 0 .or. pctcft(n)%get_pct_l2g() > 0)) then
- write (6,*)'vegetation found outside the pft mask at n=',n
- write (6,*)'pctnatveg,pctcrop=', pctnatpft(n)%get_pct_l2g(), pctcft(n)%get_pct_l2g()
- call abort()
- end if
- end do
-
- ! Make sure that sums at the landunit level all add to 100%
- ! (Note that we don't check pctglcmec here, because it isn't computed at the point
- ! that this subroutine is called -- but the check of sum(pctglcmec) is done in
- ! mkglcmecMod)
- ! (Also note that we don't need to check pctnatpft or pctcft, because a similar check
- ! is done internally by the pct_pft_type routines.)
- do n = 1,ns_o
- if (abs(sum(urbn_classes(n,:)) - 100._r8) > 1.e-12_r8) then
- write(6,*) 'sum(urbn_classes(n,:)) != 100: ', n, sum(urbn_classes(n,:))
- call abort()
- end if
- end do
-
- if ( nsmall_tot > 0 )then
- write (6,*)'number of small pft = ', nsmall_tot
- end if
-
-end subroutine normalizencheck_landuse
-
-end program mksurfdat
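The normalization deleted above (in `normalizencheck_landuse`) proceeds in two steps: vegetated cover is first rescaled so that vegetated plus special (non-urban) landunits total 100%, then urban area preferentially displaces bare soil before the remaining PFTs are reduced. A minimal Python sketch of that adjustment for one grid cell — the function name and the collapse of all non-bare vegetation into a single `pct_veg` scalar are my own simplifications, not the CTSM code:

```python
# Sketch of the two-step landunit normalization in the deleted
# normalizencheck_landuse (simplified: all non-bare vegetation is one scalar).

def normalize_cell(pctlak, pctwet, pctgla, pcturb, pct_bare, pct_veg):
    """Return adjusted (pct_bare, pct_veg); all values are % of grid cell."""
    # Step 1: scale vegetated cover so (special excluding urban) + veg = 100%.
    new_veg_total = max(100.0 - (pctlak + pctwet + pctgla), 0.0)
    old_total = pct_bare + pct_veg
    if old_total > 0.0:
        scale = new_veg_total / old_total
        pct_bare *= scale
        pct_veg *= scale
    else:
        # Assumption: with no initial vegetation, assign the residual to bare
        # ground (the real code delegates this case to adjust_total_veg_area).
        pct_bare, pct_veg = new_veg_total, 0.0
    # Step 2: urban preferentially replaces bare soil ...
    diff = pct_bare - pcturb
    pct_bare = max(0.0, diff)
    # ... and any urban excess comes out of the remaining vegetation.
    pcturb_excess = max(0.0, -diff)
    pct_veg = max(pct_veg - pcturb_excess, 0.0)
    return pct_bare, pct_veg
```

After this adjustment, the special landunits plus the returned vegetated percentages sum to 100%, which is what the subsequent `abs(suma - 100._r8) > tol_loose` check in the deleted code enforces.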
diff --git a/tools/mksurfdata_map/src/mktopostatsMod.F90 b/tools/mksurfdata_map/src/mktopostatsMod.F90
deleted file mode 100644
index 2ecd705f4c..0000000000
--- a/tools/mksurfdata_map/src/mktopostatsMod.F90
+++ /dev/null
@@ -1,183 +0,0 @@
-module mktopostatsMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mktopostatsMod
-!
-! !DESCRIPTION:
-! make various topography statistics
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!-----------------------------------------------------------------------
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public mktopostats ! make topo stddev & mean slope
-!
-!EOP
-!===============================================================
-contains
-!===============================================================
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mktopostats
-!
-! !INTERFACE:
-subroutine mktopostats(ldomain, mapfname, datfname, ndiag, topo_stddev_o, slope_o, std_elev)
-!
-! !DESCRIPTION:
-! make various topography statistics
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_continuous, output_diagnostics_continuous_outonly
- use mkchecksMod, only : min_bad, max_bad
-!
-! !ARGUMENTS:
-
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- real(r8) , intent(in) :: std_elev ! standard deviation of elevation (m) to use when not using input file
- real(r8) , intent(out):: topo_stddev_o(:) ! output grid: standard deviation of elevation (m)
- real(r8) , intent(out):: slope_o(:) ! output grid: slope (degrees)
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: data_i(:) ! data on input grid
- integer :: ncid,varid ! input netCDF id's
- integer :: ier ! error status
- logical :: bypass_reading ! If should bypass reading dataset and just use a global value
-
- real(r8), parameter :: min_valid_topo_stddev = 0._r8
-
- real(r8), parameter :: min_valid_slope = 0._r8
- real(r8), parameter :: max_valid_slope = 90._r8
-
- character(len=32) :: subname = 'mktopostats'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make Topography statistics.....'
- if ( std_elev >= 0.0_r8 )then
- bypass_reading = .true.
- write (6,*) ' Bypassing the read and just using global values'
- else
- bypass_reading = .false.
- end if
- call shr_sys_flush(6)
-
- ! -----------------------------------------------------------------
- ! Read domain and mapping information, check for consistency
- ! -----------------------------------------------------------------
-
- if ( .not. bypass_reading )then
- call domain_read(tdomain,datfname)
-
- call gridmap_mapread(tgridmap, mapfname )
-
- call gridmap_check( tgridmap, tgridmap%frac_src, tgridmap%frac_dst, subname )
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! -----------------------------------------------------------------
- ! Open input file, allocate memory for input data
- ! -----------------------------------------------------------------
-
- write(6,*)'Open Topography file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
-
- allocate(data_i(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
-
- ! -----------------------------------------------------------------
- ! Make topography standard deviation
- ! -----------------------------------------------------------------
-
- call check_ret(nf_inq_varid (ncid, 'ELEVATION', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
- call gridmap_areastddev(tgridmap, data_i, topo_stddev_o, nodata=0._r8)
-
- call output_diagnostics_continuous_outonly(topo_stddev_o, tgridmap, "Topo Std Dev", "m", ndiag)
- else
- write (6,*) ' Set std deviation of topography to ', std_elev
- topo_stddev_o = std_elev
- end if
-
- ! Check validity of output data
- if (min_bad(topo_stddev_o, min_valid_topo_stddev, 'topo_stddev')) then
- stop
- end if
-
-
- ! -----------------------------------------------------------------
- ! Regrid slope
- ! -----------------------------------------------------------------
-
- if ( .not. bypass_reading )then
- call check_ret(nf_inq_varid (ncid, 'SLOPE', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, data_i), subname)
-
- ! Subr. gridmap_areaave_no_srcmask should NOT be used in general. We have
- ! kept it to support the rare raw data files for which we have masking on
- ! the mapping file and, therefore, we do not explicitly pass the src_mask
- ! as an argument. In general, users are advised to use subroutine
- ! gridmap_areaave_srcmask.
- call gridmap_areaave_no_srcmask(tgridmap, data_i, slope_o, nodata=0._r8)
-
- call output_diagnostics_continuous(data_i, slope_o, tgridmap, "Slope", "degrees", ndiag, tdomain%mask, tgridmap%frac_dst)
- else
- write (6,*) ' Set slope of topography to ', 0.0_r8
- slope_o = 0.0_r8
- end if
- ! Check validity of output data
- if (min_bad(slope_o, min_valid_slope, 'slope') .or. &
- max_bad(slope_o, max_valid_slope, 'slope')) then
- stop
- end if
-
-
- ! -----------------------------------------------------------------
- ! Close files and deallocate dynamic memory
- ! -----------------------------------------------------------------
-
- if ( .not. bypass_reading )then
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (data_i)
- end if
-
- write (6,*) 'Successfully made Topography statistics'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mktopostats
-
-
-end module mktopostatsMod
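For reference, `gridmap_areastddev` (used above on the `ELEVATION` field) computes, for each destination cell, the standard deviation of overlapping source-cell values about their overlap-weighted mean, returning `nodata` where nothing overlaps. A rough Python equivalent for a single destination cell — names are illustrative, not the mkgridmapMod API:

```python
import math

def area_weighted_stddev(values, weights, nodata=0.0):
    """Std dev of source values about their weighted mean; nodata if no overlap."""
    wsum = sum(weights)
    if wsum == 0.0:
        return nodata  # no source cells overlap this destination cell
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    var = sum(w * (v - mean) ** 2 for w, v in zip(weights, values)) / wsum
    return math.sqrt(var)
```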
diff --git a/tools/mksurfdata_map/src/mkurbanparCommonMod.F90 b/tools/mksurfdata_map/src/mkurbanparCommonMod.F90
deleted file mode 100644
index 5db84e8351..0000000000
--- a/tools/mksurfdata_map/src/mkurbanparCommonMod.F90
+++ /dev/null
@@ -1,365 +0,0 @@
-module mkurbanparCommonMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkurbanparCommon
-!
-! !DESCRIPTION:
-! Common routines for making urban parameter data, independent of the method used for
-! making the urban parameters (e.g., averages, dominant type, etc.)
-!
-! (WJS 4-18-12: In the past, this contained routines shared between mkurbanparDomMod and
-! mkurbanparAvgMod; now there is just a single module, mkurbanparMod, but I am keeping the
-! separate mkurbanparCommonMod in case a similar split comes back in the future. However,
-! if such a split seems unlikely in the future, these routines could be moved back into
-! mkurbanparMod.)
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!-----------------------------------------------------------------------
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public :: mkurban_pct ! Make output urban %, given input urban %
- public :: mkurban_pct_diagnostics ! print diagnostics related to pct urban
- public :: mkelev ! Get elevation to reduce urban for high elevation areas
-!
-! !PUBLIC DATA MEMBERS:
-!
- real(r8), parameter :: MIN_DENS = 0.1_r8 ! minimum urban density (% of grid cell) - below this value, urban % is set to 0
-
- public :: MIN_DENS
-!
-!EOP
-
-contains
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkurban_pct
-!
-! !INTERFACE:
-subroutine mkurban_pct(ldomain, tdomain, tgridmap, urbn_i, urbn_o, frac_dst)
-!
-! !DESCRIPTION:
-! make percent urban on output grid, given percent urban on input grid
-!
-! This assumes that we're using neither all_urban nor zero_out
-!
-!
-! !USES:
- use mkdomainMod , only : domain_type, domain_checksame
- use mkgridmapMod
- use mkvarctl , only : mksrf_gridtype
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) , intent(in) :: ldomain
- type(domain_type) , intent(in) :: tdomain ! local domain
- type(gridmap_type), intent(in) :: tgridmap ! local gridmap
- real(r8) , intent(in) :: urbn_i(:) ! input grid: percent urban
- real(r8) , intent(in) :: frac_dst(:) ! output fractions
- real(r8) , intent(out):: urbn_o(:) ! output grid: percent urban
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-! (Moved from mkurbanparMod Feb, 2012)
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: ier ! error status
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- real(r8) :: sum_fldi ! global sum of dummy input fld
- real(r8) :: sum_fldo ! global sum of dummy output fld
- integer :: ni,no ! indices
- real(r8) :: relerr = 0.00001_r8 ! max error: sum overlap wts ne 1
- character(len=*), parameter :: subname = 'mkurban_pct'
-!-----------------------------------------------------------------------
-
- ! Error checks for array size consistencies
-
- if (size(urbn_i) /= tdomain%ns .or. &
- size(urbn_o) /= ldomain%ns) then
- write(6,*) subname//' ERROR: array size inconsistencies'
- write(6,*) 'size(urbn_i) = ', size(urbn_i)
- write(6,*) 'tdomain%ns = ', tdomain%ns
- write(6,*) 'size(urbn_o) = ', size(urbn_o)
- write(6,*) 'ldomain%ns = ', ldomain%ns
- stop
- end if
- if (size(frac_dst) /= ldomain%ns) then
- write(6,*) subname//' ERROR: array size inconsistencies'
- write(6,*) 'size(frac_dst) = ', size(frac_dst)
- write(6,*) 'ldomain%ns = ', ldomain%ns
- stop
- end if
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Determine urbn_o on output grid:
- ! Area-average percent cover on input grid to output grid
- ! and correct according to the landmask
- ! Note that percent cover is in terms of total grid area.
-
- call gridmap_areaave_srcmask(tgridmap, urbn_i, urbn_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check for conservation
-
- do no = 1, ldomain%ns
- if ((urbn_o(no)) > 100.000001_r8) then
- write (6,*) 'MKURBAN error: urban = ',urbn_o(no), &
- ' greater than 100.000001 for column, row = ',no
- stop
- end if
- enddo
-
- ! Global sum of output field -- must multiply by fraction of
- ! output grid that is land as determined by input grid
-
- allocate(mask_r8(tdomain%ns), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- ! (Error check2 in mkurban_pct_diagnostics, which should be called separately)
-
- deallocate (mask_r8)
-
-end subroutine mkurban_pct
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkurban_pct_diagnostics
-!
-! !INTERFACE:
-subroutine mkurban_pct_diagnostics(ldomain, tdomain, tgridmap, urbn_i, urbn_o, ndiag, dens_class, frac_dst)
-!
-! !DESCRIPTION:
-! print diagnostics related to pct urban
-!
-! This is intended to be called after mkurban_pct, but is split out into a separate
-! routine so that modifications to urbn_o can be made in between the two calls (e.g.,
-! setting urbn_o to 0 wherever it is less than a certain threshold; the rules for doing
-! this can't always be applied inline in mkurban_pct).
-!
-! !USES:
- use mkdomainMod , only : domain_type
- use mkgridmapMod, only : gridmap_type
- use mkvarpar
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) , intent(in) :: ldomain
- type(domain_type) , intent(in) :: tdomain ! local domain
- type(gridmap_type), intent(in) :: tgridmap ! local gridmap
- real(r8) , intent(in) :: urbn_i(:) ! input grid: percent urban
- real(r8) , intent(in) :: urbn_o(:) ! output grid: percent urban
- real(r8) , intent(in) :: frac_dst(:) ! output fractions
- integer , intent(in) :: ndiag ! unit number for diag out
-
- integer , intent(in), optional :: dens_class ! density class
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-! (Moved from mkurbanparMod Feb, 2012)
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: gurbn_i ! input grid: global urbn
- real(r8) :: garea_i ! input grid: global area
- real(r8) :: gurbn_o ! output grid: global urbn
- real(r8) :: garea_o ! output grid: global area
- integer :: ni,no,k ! indices
- character(len=*), parameter :: subname = 'mkurban_pct_diagnostics'
-!-----------------------------------------------------------------------
-
- ! Error check inputs
- if (size(frac_dst) /= ldomain%ns) then
- write(6,*) subname//' ERROR: array size inconsistencies'
- write(6,*) 'size(frac_dst) = ', size(frac_dst)
- write(6,*) 'ldomain%ns = ', ldomain%ns
- stop
- end if
-
- ! -----------------------------------------------------------------
- ! Error check2
- ! Compare global areas on input and output grids
- ! -----------------------------------------------------------------
-
- ! Input grid
-
- gurbn_i = 0._r8
- garea_i = 0._r8
-
- do ni = 1, tdomain%ns
- garea_i = garea_i + tgridmap%area_src(ni)*re**2
- gurbn_i = gurbn_i + urbn_i(ni)*(tgridmap%area_src(ni)/100._r8)*&
- tdomain%mask(ni)*re**2
- end do
-
- ! Output grid
-
- gurbn_o = 0._r8
- garea_o = 0._r8
-
- do no = 1, ldomain%ns
- garea_o = garea_o + tgridmap%area_dst(no)*re**2
- gurbn_o = gurbn_o + urbn_o(no)* (tgridmap%area_dst(no)/100._r8)*&
- frac_dst(no)*re**2
- end do
-
- ! Diagnostic output
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
- if (present(dens_class)) then
- write (ndiag,'(1x,a,i0)') 'Urban Output -- class ', dens_class
- else
- write (ndiag,'(1x,a)') 'Urban Output'
- end if
- write (ndiag,'(1x,70a1)') ('=',k=1,70)
-
- write (ndiag,*)
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,2001)
-2001 format (1x,'surface type input grid area output grid area'/ &
- 1x,' 10**6 km**2 10**6 km**2 ')
- write (ndiag,'(1x,70a1)') ('.',k=1,70)
- write (ndiag,*)
- write (ndiag,2003) gurbn_i*1.e-06,gurbn_o*1.e-06
- write (ndiag,2004) garea_i*1.e-06,garea_o*1.e-06
-2002 format (1x,'urban ',f14.3,f17.3)
-2003 format (1x,'urban ',f14.3,f22.8)
-2004 format (1x,'all surface ',f14.3,f17.3)
-
-end subroutine mkurban_pct_diagnostics
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkelev
-!
-! !INTERFACE:
-subroutine mkelev(ldomain, mapfname, datfname, varname, ndiag, elev_o)
-!
-! !DESCRIPTION:
-! Make elevation data
-!
-! !USES:
- use mkdomainMod , only : domain_type, domain_clean, domain_read, domain_checksame
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_continuous
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- character(len=*) , intent(in) :: varname ! topo variable name
- real(r8) , intent(out):: elev_o(:) ! output elevation data
-!
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Keith Oleson
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(domain_type) :: tdomain ! local domain
- type(gridmap_type) :: tgridmap ! local gridmap
-
- real(r8), allocatable :: elev_i(:) ! elevation on input grid
- real(r8), allocatable :: frac_dst(:) ! output fractions
- integer :: ns_i,ns_o ! indices
- integer :: k,l,n,m,ni ! indices
- integer :: ncidi,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- character(len=256) :: name ! name of attribute
- character(len=256) :: unit ! units of attribute
- character(len= 32) :: subname = 'mkelev'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make elevation .....'
- call shr_sys_flush(6)
-
- ns_o = ldomain%ns
-
- ! -----------------------------------------------------------------
- ! Read input file
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- call domain_read(tdomain,datfname)
-
- ns_i = tdomain%ns
- allocate(elev_i(ns_i), stat=ier)
- allocate(frac_dst(ns_o), stat=ier)
- if (ier /= 0) then
- write(6,*)'mkelev allocation error'; call abort()
- end if
-
- write (6,*) 'Open elevation file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncidi), subname)
- call check_ret(nf_inq_varid (ncidi, trim(varname), varid), subname)
- call check_ret(nf_get_var_double (ncidi, varid, elev_i), subname)
- call check_ret(nf_close(ncidi), subname)
-
- ! Read topo elev dataset with unit mask everywhere
-
- call gridmap_mapread(tgridmap, mapfname)
-
- ! Error checks for domain and map consistencies
- ! Note that the topo dataset has no landmask - so a unit landmask is assumed
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Determine elev_o on output grid
-
- elev_o(:) = 0.
-
- call gridmap_areaave_srcmask(tgridmap, elev_i, elev_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- call output_diagnostics_continuous(elev_i, elev_o, tgridmap, "Urban elev variable", "m", ndiag, tdomain%mask, frac_dst)
-
-
- ! Deallocate dynamic memory
-
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (elev_i)
- deallocate (frac_dst)
-
- write (6,*) 'Successfully made elevation'
- write (6,*)
- call shr_sys_flush(6)
-
-end subroutine mkelev
-
-!-----------------------------------------------------------------------
-
-end module mkurbanparCommonMod
diff --git a/tools/mksurfdata_map/src/mkurbanparMod.F90 b/tools/mksurfdata_map/src/mkurbanparMod.F90
deleted file mode 100644
index 07319b1f27..0000000000
--- a/tools/mksurfdata_map/src/mkurbanparMod.F90
+++ /dev/null
@@ -1,759 +0,0 @@
-module mkurbanparMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkurbanpar
-!
-! !DESCRIPTION:
-! Make Urban Parameter data
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!-----------------------------------------------------------------------
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkvarctl, only : ispval
- implicit none
-
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public :: mkurbanInit
- public :: mkurban
- public :: mkurbanpar
-
- ! The following could be private, but because there are associated test routines in a
- ! separate module, it needs to be public
- public :: normalize_urbn_by_tot
-
-! !PUBLIC DATA MEMBERS:
- integer :: numurbl ! number of urban classes
- integer :: nlevurb = ispval ! number of urban layers
-
- public :: numurbl
- public :: nlevurb
-
-! !PRIVATE DATA MEMBERS:
- ! flag to indicate nodata for index variables in output file:
- integer, parameter :: index_nodata = 0
- character(len=*), parameter :: modname = 'mkurbanparMod'
-
- private :: index_nodata
- private :: modname
-
-!EOP
-
-contains
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkurbanInit
-!
-! !INTERFACE:
-subroutine mkurbanInit(datfname)
-!
-! !DESCRIPTION:
-! Initialize variables needed for urban
-!
-! !USES:
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- character(len=*), intent(in) :: datfname ! input data file name (same as file used in mkurban)
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
- integer :: ncid,dimid ! input netCDF id's
-
- character(len=*), parameter :: subname = 'mkurbanInit'
-!EOP
-!-----------------------------------------------------------------------
-
- ! Set numurbl
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_dimid (ncid, 'density_class', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, numurbl), subname)
- call check_ret(nf_inq_dimid (ncid, 'nlevurb', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, nlevurb), subname)
- call check_ret(nf_close(ncid), subname)
-
-end subroutine mkurbanInit
-!-----------------------------------------------------------------------
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkurban
-!
-! !INTERFACE:
-subroutine mkurban(ldomain, mapfname, datfname, ndiag, zero_out, &
- urbn_o, urbn_classes_o, region_o)
-!
-! !DESCRIPTION:
-! make total percent urban, breakdown into urban classes, and region ID on the output grid
-!
-! urbn_classes_o(n, i) gives the percent of the urban area in grid cell n that is in class #i.
-! This is normalized so that sum(urbn_classes_o(n,:)) = 100 for all n, even for grid
-! cells where urbn_o(n) = 0 (in the case where urbn_o(n) = 0, we come up with an
-! arbitrary assignment of urban into the different classes).
-!
-! See comments under the normalize_urbn_by_tot subroutine for how urbn_classes_o is
-! determined when the total % urban is 0, according to the input data. Note that this
-! also applies when all_urban=.true., for points that have 0 urban according to the input
-! data.
-!
-! TODO (WJS 6-12-14): I think this could be rewritten slightly to take advantage of the
-! new mkpctPftTypeMod (which should then be renamed to something more general; or maybe
-! better, in terms of maintaining helpful abstractions, there could be a new type to
-! handle urban, and both that and pct_pft_type could be built on a single set of shared
-! code - either as a single base class or through a "has-a" mechanism). This would allow
-! us to combine urbn_o and urbn_classes_o into a single derived type variable. I think
-! this would also replace the use of normalize_classes_by_gcell, and maybe some other
-! urban-specific code.
-!
-! !USES:
- use mkdomainMod , only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkindexmapMod, only : get_dominant_indices
- use mkurbanparCommonMod, only : mkurban_pct, mkurban_pct_diagnostics, MIN_DENS
- use mkutilsMod , only : normalize_classes_by_gcell
- use mkvarctl , only : all_urban
- use mkvarpar
- use mkncdio
- use mkdiagnosticsMod, only : output_diagnostics_index
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type), intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diag out
- logical , intent(in) :: zero_out ! if should zero urban out
- real(r8) , intent(out):: urbn_o(:) ! output grid: total % urban
- real(r8) , intent(out):: urbn_classes_o(:,:) ! output grid: breakdown of total urban into each class
- ! (dimensions: (ldomain%ns, numurbl))
- integer , intent(out):: region_o(:) ! output grid: region ID
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- type(domain_type) :: tdomain ! local domain
- type(gridmap_type) :: tgridmap ! local gridmap
- real(r8), allocatable :: urbn_classes_gcell_i(:,:) ! input grid: percent urban in each density class
- ! (% of total grid cell area)
- real(r8), allocatable :: urbn_classes_gcell_o(:,:) ! output grid: percent urban in each density class
- real(r8), allocatable :: frac_dst(:) ! output fractions
- ! (% of total grid cell area)
- integer , allocatable :: region_i(:) ! input grid: region ID
- integer :: ni,no,ns,k ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: dimlen ! netCDF dimension length
- integer :: max_region ! maximum region index
- integer :: ier ! error status
-
- character(len=*), parameter :: subname = 'mkurban'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make %urban .....'
-
- ! Obtain input grid info, read local fields
-
- call gridmap_mapread(tgridmap, mapfname)
- call domain_read(tdomain, datfname)
-
- ns = tdomain%ns
-
- allocate(urbn_classes_gcell_i(ns, numurbl), &
- urbn_classes_gcell_o(ldomain%ns, numurbl), &
- frac_dst(ldomain%ns), &
- stat=ier)
- if (ier/=0) call abort()
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- write (6,*) 'Open urban file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_varid (ncid, 'PCT_URBAN', varid), subname)
- call check_ret(nf_get_var_double (ncid, varid, urbn_classes_gcell_i), subname)
-
- ! Determine % urban by density class on the output grid
- do k = 1, numurbl
- call mkurban_pct(ldomain, tdomain, tgridmap, urbn_classes_gcell_i(:,k), urbn_classes_gcell_o(:,k), frac_dst)
- end do
-
- ! Determine total % urban
- do no = 1, ldomain%ns
- urbn_o(no) = sum(urbn_classes_gcell_o(no,:))
- end do
-
- call normalize_urbn_by_tot(urbn_classes_gcell_o, urbn_o, urbn_classes_o)
-
- ! Handle special cases
-
- ! Note that, for all these adjustments of total urban %, we do not change anything
- ! about the breakdown into the different urban classes. In particular: when urbn_o is
- ! set to 0 for a point, the breakdown into the different urban classes is maintained
- ! as it was before.
- if (all_urban) then
- urbn_o(:) = 100._r8
- else if (zero_out) then
- urbn_o(:) = 0._r8
- else
- ! Set points to 0% if they fall below a given threshold
- do no = 1, ldomain%ns
- if (urbn_o(no) < MIN_DENS) then
- urbn_o(no) = 0._r8
- end if
- end do
- end if
-
- ! Print diagnostics
- ! First, recompute urbn_classes_gcell_o, based on any changes we have made to urbn_o
- ! while handling special cases
- call normalize_classes_by_gcell(urbn_classes_o, urbn_o, urbn_classes_gcell_o)
- do k = 1, numurbl
- call mkurban_pct_diagnostics(ldomain, tdomain, tgridmap, &
- urbn_classes_gcell_i(:,k), urbn_classes_gcell_o(:,k), &
- ndiag, dens_class=k, frac_dst=frac_dst)
- end do
-
- write (6,*) 'Successfully made %urban'
-
-
- write(6,*) 'Attempting to make urban region .....'
-
- ! Read in region field
- ! Note: we do this here, rather than with the rest of the reads above, because we
- ! expect the input urban fields to be large, so we're just reading the fields as
- ! they're needed to try to avoid unnecessary memory paging
-
- allocate(region_i(ns), stat=ier)
- if (ier/=0) call abort()
- call check_ret(nf_inq_varid (ncid, 'REGION_ID', varid), subname)
- call check_ret(nf_get_var_int (ncid, varid, region_i), subname)
-
- ! Determine max region value, and make sure it doesn't exceed bounds of the lookup tables.
- !
- ! (Note: this check assumes that region_i=1 refers to region(1), region_i=2 refers to
- ! region(2), etc. The alternative would be to use a coordinate variable associated with
- ! the region dimension of the lookup table, which could result in an arbitrary mapping
- ! between region values and the indices of the lookup table; however, this use of
- ! coordinate variables currently isn't supported by lookup_2d_netcdf [as of 2-8-12].)
-
- max_region = maxval(region_i)
- call check_ret(nf_inq_dimid (ncid, 'region', dimid), subname)
- call check_ret(nf_inq_dimlen (ncid, dimid, dimlen), subname)
- if (max_region > dimlen) then
- write(6,*) modname//':'//subname// &
- ' ERROR: max region value exceeds length of region dimension'
- write(6,*) 'max region value : ', max_region
- write(6,*) 'length of region dimension: ', dimlen
- call abort()
- end if
-
- ! Determine dominant region for each output cell
-
- call get_dominant_indices(tgridmap, region_i, region_o, 1, max_region, index_nodata, mask_src=tdomain%mask)
-
- write (6,*) 'Successfully made urban region'
- write (6,*)
-
- ! Output diagnostics
-
- call output_diagnostics_index(region_i, region_o, tgridmap, 'Urban Region ID', &
- 1, max_region, ndiag, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Deallocate dynamic memory & other clean up
-
- call check_ret(nf_close(ncid), subname)
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
- deallocate (urbn_classes_gcell_i, urbn_classes_gcell_o, region_i, frac_dst)
-
-end subroutine mkurban
-!-----------------------------------------------------------------------
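
The special-case handling of the total urban percentage in mkurban (force 100% for all_urban, force 0% for zero_out, otherwise zero out cells below the minimum density threshold) can be sketched in Python. This is an illustrative sketch only: the threshold value below is a placeholder, since MIN_DENS is defined in mkurbanparCommonMod and its value is not shown in this excerpt. As in the Fortran, only the totals change; the per-class breakdown is left untouched.

```python
MIN_DENS = 0.1  # placeholder; the real value lives in mkurbanparCommonMod

def adjust_total_urban(urbn, all_urban=False, zero_out=False, min_dens=MIN_DENS):
    # all_urban wins: every cell becomes 100% urban
    if all_urban:
        return [100.0] * len(urbn)
    # zero_out: every cell becomes 0% urban
    if zero_out:
        return [0.0] * len(urbn)
    # default: cells below the minimum density threshold are set to 0%
    return [0.0 if u < min_dens else u for u in urbn]
```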
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: normalize_urbn_by_tot
-!
-! !INTERFACE:
-subroutine normalize_urbn_by_tot(classes_pct_gcell, sums, classes_pct_tot)
-!
-! !DESCRIPTION:
-! Normalizes urban class areas to produce % cover of each class, as % of total urban area
-!
-! Specifically: Given (1) an array specifying the % cover of each urban class, as a % of
-! the total grid cell area ('classes_pct_gcell'), and (2) a vector giving the total urban
-! area in each grid cell, expressed as % of the grid cell area: Returns an array
-! ('classes_pct_tot') of the same dimensionality as classes_pct_gcell, where the values
-! now give % cover of each class as a % of the total urban area.
-!
-! Assumes that sums(n) = sum(classes_pct_gcell(n,:))
-!
-! When sums(n) = 0, the creation of classes_pct_tot(n,:) is ambiguous. Here we use the
-! rule that all area is assigned to the medium-density class, defined by parameter MD.
-!
-! The returned array satisfies sum(classes_pct_tot(n,:))==100 for all n (within rounding error)
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- real(r8), intent(in) :: classes_pct_gcell(:,:) ! % cover of classes as % of grid cell
- real(r8), intent(in) :: sums(:) ! totals, as % of grid cell
- real(r8), intent(out):: classes_pct_tot(:,:) ! % cover of classes as % of total
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: n ! index
- integer :: n_max ! number of points
- integer :: nclasses ! number of classes
- real(r8) :: suma ! sum for error check
-
- ! index of medium-density class, which is where we assign urban areas when the total
- ! urban area is 0
- integer, parameter :: MD = 3
-
- ! relative error tolerance for error check
- real(r8), parameter :: relerr = 1.e-10_r8
-
- character(len=*), parameter :: subname = 'normalize_urbn_by_tot'
-!-----------------------------------------------------------------------
-
- ! Error-check inputs
-
- n_max = size(sums)
- if (size(classes_pct_tot, 1) /= n_max .or. &
- size(classes_pct_gcell, 1) /= n_max) then
- write(6,*) subname//' ERROR: array size mismatch'
- write(6,*) 'size(sums) = ', n_max
- write(6,*) 'size(classes_pct_tot, 1) = ', size(classes_pct_tot, 1)
- write(6,*) 'size(classes_pct_gcell, 1) = ', size(classes_pct_gcell, 1)
- call abort()
- end if
-
- if (size(classes_pct_tot, 2) /= size(classes_pct_gcell, 2)) then
- write(6,*) subname//' ERROR: array size mismatch'
- write(6,*) 'size(classes_pct_tot, 2) = ', size(classes_pct_tot, 2)
- write(6,*) 'size(classes_pct_gcell, 2) = ', size(classes_pct_gcell, 2)
- call abort()
- end if
-
- nclasses = size(classes_pct_gcell, 2)
- if (MD > nclasses) then
- write(6,*) subname//' ERROR: MD exceeds nclasses'
- write(6,*) 'MD = ', MD
- write(6,*) 'nclasses = ', nclasses
- call abort()
- end if
-
- ! Do the work
-
- do n = 1, n_max
- if (sums(n) > 0._r8) then
- classes_pct_tot(n,:) = classes_pct_gcell(n,:)/sums(n) * 100._r8
- else
- ! Creation of classes_pct_tot is ambiguous. Apply the rule that all area is
- ! assigned to the medium-density class.
- classes_pct_tot(n,:) = 0._r8
- classes_pct_tot(n,MD) = 100._r8
- end if
- end do
-
- ! Error-check output: Make sure sum(classes_pct_tot(n,:)) = 100 for all n
-
- do n = 1, n_max
- suma = sum(classes_pct_tot(n,:))
- if (abs(suma/100._r8 - 1._r8) > relerr) then
- write(6,*) subname//' ERROR: sum does not equal 100 at point ', n
- write(6,*) 'suma = ', suma
- call abort()
- end if
- end do
-
-end subroutine normalize_urbn_by_tot
-!-----------------------------------------------------------------------
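
The normalization rule above (divide each class by the cell total, and assign everything to the medium-density class when the total is zero) is easy to see in a short NumPy sketch. This is not the Fortran implementation, just an illustration; note the medium-density index is 0-based here, versus MD = 3 in the 1-based Fortran.

```python
import numpy as np

def normalize_urbn_by_tot(classes_pct_gcell, sums, md_class=2):
    """Convert per-class cover from %-of-gridcell to %-of-total-urban.

    Where the total urban area is zero, all cover is assigned to the
    medium-density class, so each row still sums to 100.
    """
    classes_pct_gcell = np.asarray(classes_pct_gcell, dtype=float)
    sums = np.asarray(sums, dtype=float)
    out = np.zeros_like(classes_pct_gcell)
    nonzero = sums > 0.0
    # normal case: class cover as a fraction of the cell total, times 100
    out[nonzero] = classes_pct_gcell[nonzero] / sums[nonzero, None] * 100.0
    # ambiguous case (total == 0): put all area in the medium-density class
    out[~nonzero, md_class] = 100.0
    return out
```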
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkurbanpar
-!
-! !INTERFACE:
-subroutine mkurbanpar(datfname, ncido, region_o, urbn_classes_gcell_o, urban_skip_abort_on_invalid_data_check)
-!
-! !DESCRIPTION:
-! Make Urban Parameter data
-!
-! Note that, in a grid cell with region_o==r, parameter values are filled from region r
-! for ALL density classes. Thus, the parameter variables have a numurbl dimension along
-! with their other dimensions.
-!
-! Note that we will have a 'nodata' value (given by the fill_val value associated with
-! each parameter) wherever (1) we have a nodata value for region_o, or (2) the parameter
-! has nodata for the given region/density combination in the input lookup table.
-!
-! !USES:
- use mkdomainMod , only : domain_type, domain_clean, domain_read
- use mkindexmapMod, only : dim_slice_type, lookup_2d_netcdf
- use mkvarpar
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ncido ! output netcdf file id
- integer , intent(in) :: region_o(:) ! output grid: region ID (length: ns_o)
- real(r8) , intent(in) :: urbn_classes_gcell_o(:,:) ! output grid: percent urban in each density class
- ! (% of total grid cell area) (dimensions: ns_o, numurbl)
- logical , intent(in) :: urban_skip_abort_on_invalid_data_check
-
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- ! Type to store information about each urban parameter
- type param
- character(len=32) :: name ! name in input & output files
- real(r8) :: fill_val ! value to put where we have no data in output
- logical :: check_invalid ! should we check whether there are any invalid data in the output?
- end type param
-
- real(r8), allocatable :: data_scalar_o(:,:) ! output array for parameters with no extra dimensions
- real(r8), allocatable :: data_rad_o(:,:,:,:) ! output array for parameters dimensioned by numrad & numsolar
- real(r8), allocatable :: data_levurb_o(:,:,:) ! output array for parameters dimensioned by nlevurb
- integer , allocatable :: unity_dens_o(:,:) ! artificial density indices
- integer :: nlevurb_i ! input grid: number of urban vertical levels
- integer :: numsolar_i ! input grid: number of solar types (DIR/DIF)
- integer :: numrad_i ! input grid: number of solar bands (VIS/NIR)
- integer :: m,n,no,ns_o,p,k ! indices
- integer :: ncidi,dimid,varid ! netCDF id's
- integer :: ier ! error status
- character(len=nf_max_name) :: varname ! variable name
-
- ! information on extra dimensions for lookup tables greater than 2-d:
- type(dim_slice_type), allocatable :: extra_dims(:)
-
- ! suffix for variables dimensioned by numsolar, for each value of numsolar:
- character(len=8), parameter :: solar_suffix(numsolar) = (/'_DIR', '_DIF'/)
-
- ! value to put where we have no data in output variables, for real-valued parameters
- real(r8), parameter :: fill_val_real = 0._r8
-
- ! To add a new urban parameter, simply add an element to one of the below lists
- ! (params_scalar, params_rad or params_levurb)
-
- ! Urban parameters with no extra dimensions
- type(param), parameter :: params_scalar(13) = &
- (/ param('CANYON_HWR', fill_val_real, .true.), &
- param('EM_IMPROAD', fill_val_real, .true.), &
- param('EM_PERROAD', fill_val_real, .true.), &
- param('EM_ROOF', fill_val_real, .true.), &
- param('EM_WALL', fill_val_real, .true.), &
- param('HT_ROOF', fill_val_real, .true.), &
- param('THICK_ROOF', fill_val_real, .true.), &
- param('THICK_WALL', fill_val_real, .true.), &
- param('T_BUILDING_MIN', fill_val_real, .true.), &
- param('WIND_HGT_CANYON', fill_val_real, .true.), &
- param('WTLUNIT_ROOF', fill_val_real, .true.), &
- param('WTROAD_PERV', fill_val_real, .true.), &
-
- ! Note that NLEV_IMPROAD is written as an integer, meaning that type conversion occurs
- ! by truncation. Thus we expect the values in the NLEV_IMPROAD lookup table to be exact;
- ! e.g., if a value were 1.99999 rather than 2.0000, it would be written as 1 instead of 2
- ! Also note: we use fill_val=-1 rather than 0, because 0 appears in the lookup table
- param('NLEV_IMPROAD', -1, .true.) /)
-
- ! Urban parameters dimensioned by numrad & numsolar
- type(param), parameter :: params_rad(4) = &
- (/ param('ALB_IMPROAD', fill_val_real, .true.), &
- param('ALB_PERROAD', fill_val_real, .true.), &
- param('ALB_ROOF', fill_val_real, .true.), &
- param('ALB_WALL', fill_val_real, .true.) /)
-
- ! Urban parameters dimensioned by nlevurb
- type(param), parameter :: params_levurb(6) = &
- (/ param('TK_ROOF', fill_val_real, .true.), &
- param('TK_WALL', fill_val_real, .true.), &
- param('CV_ROOF', fill_val_real, .true.), &
- param('CV_WALL', fill_val_real, .true.), &
-
- ! Impervious road thermal conductivity and heat capacity have varying levels of
- ! data. Thus, we expect to find some missing values in the lookup table -- we
- ! do not want to treat that as an error -- thus, we set check_invalid=.false.
- param('CV_IMPROAD', fill_val_real, .false.), &
- param('TK_IMPROAD', fill_val_real, .false.) /)
-
-
- character(len=*), parameter :: subname = 'mkurbanpar'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make Urban Parameters .....'
- call shr_sys_flush(6)
-
- ! Determine & error-check array sizes
- ns_o = size(region_o)
- if (size(urbn_classes_gcell_o, 1) /= ns_o) then
- write(6,*) modname//':'//subname//' ERROR: array size mismatch'
- write(6,*) 'size(region_o) = ', size(region_o)
- write(6,*) 'size(urbn_classes_gcell_o, 1) = ', size(urbn_classes_gcell_o, 1)
- call abort()
- end if
- if (size(urbn_classes_gcell_o, 2) /= numurbl) then
- write(6,*) modname//':'//subname//' ERROR: array size mismatch'
- write(6,*) 'size(urbn_classes_gcell_o, 2) = ', size(urbn_classes_gcell_o, 2)
- write(6,*) 'numurbl = ', numurbl
- call abort()
- end if
-
-
- ! Read dimensions from input file
-
- write (6,*) 'Open urban parameter file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncidi), subname)
- call check_ret(nf_inq_dimid(ncidi, 'nlevurb', dimid), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimid, nlevurb_i), subname)
- call check_ret(nf_inq_dimid(ncidi, 'numsolar', dimid), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimid, numsolar_i), subname)
- call check_ret(nf_inq_dimid(ncidi, 'numrad', dimid), subname)
- call check_ret(nf_inq_dimlen(ncidi, dimid, numrad_i), subname)
-
- if (nlevurb_i /= nlevurb) then
- write(6,*)'MKURBANPAR: parameter nlevurb= ',nlevurb, &
- 'does not equal input dataset nlevurb= ',nlevurb_i
- stop
- endif
- if (numsolar_i /= numsolar) then
- write(6,*)'MKURBANPAR: parameter numsolar= ',numsolar, &
- 'does not equal input dataset numsolar= ',numsolar_i
- stop
- endif
- if (numrad_i /= numrad) then
- write(6,*)'MKURBANPAR: parameter numrad= ',numrad, &
- 'does not equal input dataset numrad= ',numrad_i
- stop
- endif
-
- ! Create an array that will hold the density indices
- ! In a given grid cell, we output parameter values for all density classes, for the
- ! region of that grid cell. In order to do this while still using the lookup_2d
- ! routine, we create a dummy unity_dens_o array that contains the density values
- ! passed to the lookup routine.
-
- allocate(unity_dens_o(ns_o, numurbl))
- do k = 1, numurbl
- unity_dens_o(:,k) = k
- end do
-
- ! Handle urban parameters with no extra dimensions
-
- allocate(data_scalar_o(ns_o, numurbl), stat=ier)
- if (ier /= 0) then
- write(6,*)'mkurbanpar allocation error'; call abort()
- end if
-
- do p = 1, size(params_scalar)
- call lookup_and_check_err(params_scalar(p)%name, params_scalar(p)%fill_val, &
- params_scalar(p)%check_invalid, urban_skip_abort_on_invalid_data_check, &
- data_scalar_o, 0)
-
- call check_ret(nf_inq_varid(ncido, params_scalar(p)%name, varid), subname)
- ! In the following, note that type conversion occurs if we're writing to a variable of type
- ! other than double; e.g., for an integer, conversion occurs by truncation!
- call check_ret(nf_put_var_double(ncido, varid, data_scalar_o), subname)
- end do
-
- deallocate(data_scalar_o)
-
- ! Handle urban parameters dimensioned by numrad & numsolar
-
- allocate(data_rad_o(ns_o, numurbl, numrad, numsolar), stat=ier)
- if (ier /= 0) then
- write(6,*)'mkurbanpar allocation error'; call abort()
- end if
-
- allocate(extra_dims(2))
- extra_dims(1)%name = 'numrad'
- extra_dims(2)%name = 'numsolar'
-
- do p = 1, size(params_rad)
- do m = 1,numsolar
- extra_dims(2)%val = m
- do n = 1,numrad
- extra_dims(1)%val = n
-
- call lookup_and_check_err(params_rad(p)%name, params_rad(p)%fill_val, &
- params_rad(p)%check_invalid, urban_skip_abort_on_invalid_data_check, &
- data_rad_o(:,:,n,m), &
- 2, extra_dims)
- end do
- end do
-
- ! Special handling of numsolar: rather than outputting variables with a numsolar
- ! dimension, we output separate variables for each value of numsolar
- do m = 1,numsolar
- if (len_trim(params_rad(p)%name) + len_trim(solar_suffix(m)) > len(varname)) then
- write(6,*) 'variable name exceeds length of varname'
- write(6,*) trim(params_rad(p)%name)//trim(solar_suffix(m))
- call abort()
- end if
- varname = trim(params_rad(p)%name)//trim(solar_suffix(m))
- call check_ret(nf_inq_varid(ncido, varname, varid), subname)
- ! In the following, note that type conversion occurs if we're writing to a variable of type
- ! other than double; e.g., for an integer, conversion occurs by truncation!
- call check_ret(nf_put_var_double(ncido, varid, data_rad_o(:,:,:,m)), subname)
- end do
- end do
-
- deallocate(data_rad_o)
- deallocate(extra_dims)
-
- ! Handle urban parameters dimensioned by nlevurb
-
- allocate(data_levurb_o(ns_o, numurbl, nlevurb), stat=ier)
- if (ier /= 0) then
- write(6,*)'mkurbanpar allocation error'; call abort()
- end if
-
- allocate(extra_dims(1))
- extra_dims(1)%name = 'nlevurb'
-
- do p = 1, size(params_levurb)
- do n = 1,nlevurb
- extra_dims(1)%val = n
-
- call lookup_and_check_err(params_levurb(p)%name, params_levurb(p)%fill_val, &
- params_levurb(p)%check_invalid, &
- urban_skip_abort_on_invalid_data_check, data_levurb_o(:,:,n), &
- 1, extra_dims)
- end do
-
- call check_ret(nf_inq_varid(ncido, params_levurb(p)%name, varid), subname)
- ! In the following, note that type conversion occurs if we're writing to a variable of type
- ! other than double; e.g., for an integer, conversion occurs by truncation!
- call check_ret(nf_put_var_double(ncido, varid, data_levurb_o), subname)
- end do
-
- deallocate(data_levurb_o)
- deallocate(extra_dims)
-
-
- call check_ret(nf_close(ncidi), subname)
- call check_ret(nf_sync(ncido), subname)
-
- write (6,*) 'Successfully made Urban Parameters'
- write (6,*)
- call shr_sys_flush(6)
-
- deallocate(unity_dens_o)
-
-contains
-!------------------------------------------------------------------------------
- subroutine lookup_and_check_err(varname, fill_val, check_invalid, &
- urban_skip_abort_on_invalid_data_check, data, n_extra_dims, extra_dims)
-
- ! Wrapper to lookup_2d_netcdf: Loops over each density class, calling lookup_2d_netcdf
- ! with that density class and filling the appropriate slice of the data array. Also
- ! checks for any errors, aborting if there were any.
- !
- ! Note that the lookup_2d_netcdf routine is designed to work with a single value of
- ! each of the indices. However, we want to fill parameter values for ALL density
- ! classes. This is why we loop over density class in this routine.
- !
- ! Note: inherits a number of variables from the parent routine
-
- use mkindexmapMod, only : lookup_2d_netcdf
-
- implicit none
- character(len=*), intent(in) :: varname ! name of lookup table
- real(r8) , intent(in) :: fill_val ! value to put where we have no data in output variables
- logical , intent(in) :: check_invalid ! should we check whether there are any invalid data in the output?
- logical , intent(in) :: urban_skip_abort_on_invalid_data_check
-
- real(r8) , intent(out):: data(:,:) ! output from lookup_2d_netcdf
- integer , intent(in) :: n_extra_dims ! number of extra dimensions in the lookup table
-
- ! slice to use if lookup table variable has more than 2 dimensions:
- type(dim_slice_type), intent(in), optional :: extra_dims(:)
-
- ! Local variables:
-
- integer :: k,n ! indices
- integer :: ierr ! error return code
-
-
- do k = 1, numurbl
- ! In the following, note that unity_dens_o(:,k) has been constructed so that
- ! unity_dens_o(:,k)==k everywhere. Thus, we fill data(:,k) with the parameter
- ! values corresponding to density class k.
- ! Also note: We use invalid_okay=.true. because we fill all density classes,
- ! some of which may have invalid entries. Because doing so disables some error
- ! checking, we do our own error checking after the call.
- call lookup_2d_netcdf(ncidi, varname, .true., &
- 'density_class', 'region', n_extra_dims, &
- unity_dens_o(:,k), region_o, fill_val, data(:,k), ierr, &
- extra_dims=extra_dims, nodata=index_nodata, &
- invalid_okay=.true.)
-
- if (ierr /= 0) then
- write(6,*) modname//':'//subname//' ERROR in lookup_2d_netcdf for ', &
- trim(varname), ' class', k, ': err=', ierr
- call abort()
- end if
-
- if (check_invalid) then
- ! Make sure we have valid parameter values wherever we have non-zero urban cover
- do n = 1, ns_o
- ! This check assumes that fill_val doesn't appear in any of the valid entries
- ! of the lookup table
- if (urbn_classes_gcell_o(n,k) > 0. .and. data(n,k) == fill_val) then
- write(6,*) modname//':'//subname//' ERROR: fill value found in output where urban cover > 0'
- write(6,*) 'var: ', trim(varname)
- write(6,*) 'class: ', k
- write(6,*) 'n: ', n
- write(6,*) 'region: ', region_o(n)
- write(6,*) 'urbn_classes_gcell_o(n,k): ', urbn_classes_gcell_o(n,k)
- if (.not. urban_skip_abort_on_invalid_data_check) then
- ! NOTE(bja, 2015-01) added to work around a ?bug? noted in
- ! /glade/p/cesm/cseg/inputdata/lnd/clm2/surfdata_map/README_c141219
- call abort()
- end if
- end if
- end do
- end if
-
- end do
-
- end subroutine lookup_and_check_err
-
-end subroutine mkurbanpar
-!------------------------------------------------------------------------------
-
-end module mkurbanparMod
diff --git a/tools/mksurfdata_map/src/mkutilsMod.F90 b/tools/mksurfdata_map/src/mkutilsMod.F90
deleted file mode 100644
index 43e779745b..0000000000
--- a/tools/mksurfdata_map/src/mkutilsMod.F90
+++ /dev/null
@@ -1,197 +0,0 @@
-module mkutilsMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkutils
-!
-! !DESCRIPTION:
-! General-purpose utilities for mksurfdata_map
-!
-!
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-!
-! !PUBLIC MEMBER FUNCTIONS:
- public :: normalize_classes_by_gcell ! renormalize array so values are given as % of total grid cell area
- public :: slightly_below
- public :: slightly_above
-!
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!EOP
-!------------------------------------------------------------------------------
-contains
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: normalize_classes_by_gcell
-!
-! !INTERFACE:
-subroutine normalize_classes_by_gcell(classes_pct_tot, sums, classes_pct_gcell)
-!
-! !DESCRIPTION:
-! Renormalizes an array (gcell x class) so that values are given as % of total grid cell area
-!
-! Specifically: Given (1) an array specifying the % cover of different classes, as a % of
-! some total ('classes_pct_tot'), and (2) a vector giving these totals ('sums'), expressed
-! as % of grid cell area: Returns an array ('classes_pct_gcell') of the same
-! dimensionality as classes_pct_tot, where the values now give the % cover of each class
-! as a % of total grid cell area.
-!
-! The size of 'sums' should match the size of the first dimension in 'classes_pct_tot' and
-! 'classes_pct_gcell'
-!
-! For example, if classes_pct_tot(n,i) gives the % of the urban area in grid cell n that is
-! in urban class #i, and sums(n) gives the % of grid cell n that is urban, then
-! classes_pct_gcell(n,i) will give the % of the total area of grid cell n that is in urban
-! class #i.
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- real(r8), intent(in) :: classes_pct_tot(:,:) ! % cover of classes as % of total
- real(r8), intent(in) :: sums(:) ! totals, as % of grid cell
- real(r8), intent(out):: classes_pct_gcell(:,:) ! % cover of classes as % of grid cell
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- integer :: n, n_max
-
- character(len=*), parameter :: subname = "normalize_classes_by_gcell"
-!------------------------------------------------------------------------------
-
- ! Error-check inputs
-
- n_max = size(sums)
- if (size(classes_pct_tot, 1) /= n_max .or. &
- size(classes_pct_gcell, 1) /= n_max) then
- write(6,*) subname//' ERROR: array size mismatch'
- write(6,*) 'size(sums) = ', n_max
- write(6,*) 'size(classes_pct_tot, 1) = ', size(classes_pct_tot, 1)
- write(6,*) 'size(classes_pct_gcell, 1) = ', size(classes_pct_gcell, 1)
- call abort()
- end if
-
- if (size(classes_pct_tot, 2) /= size(classes_pct_gcell, 2)) then
- write(6,*) subname//' ERROR: array size mismatch'
- write(6,*) 'size(classes_pct_tot, 2) = ', size(classes_pct_tot, 2)
- write(6,*) 'size(classes_pct_gcell, 2) = ', size(classes_pct_gcell, 2)
- call abort()
- end if
-
- ! Do the work
-
- do n = 1, n_max
- classes_pct_gcell(n,:) = classes_pct_tot(n,:) * (sums(n)/100._r8)
- end do
-end subroutine normalize_classes_by_gcell
-!------------------------------------------------------------------------------
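
The renormalization performed by normalize_classes_by_gcell is the inverse direction of normalize_urbn_by_tot: scale each class's %-of-total back down by the total's share of the grid cell. A minimal Python sketch of that arithmetic (not the Fortran code itself):

```python
def normalize_classes_by_gcell(classes_pct_tot, sums):
    # classes_pct_tot[n][i]: class i as a % of the total in cell n
    # sums[n]: that total, as a % of grid cell area
    # result[n][i]: class i as a % of grid cell area
    return [[c * (s / 100.0) for c in row]
            for row, s in zip(classes_pct_tot, sums)]
```

For example, a cell that is 50% urban with that urban area split 20/80 between two classes ends up with 10% and 40% of the grid cell in each class.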
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: slightly_below
-!
-! !INTERFACE:
-logical function slightly_below(a, b, eps)
-!
-! !DESCRIPTION:
-! Returns true if a is slightly below b; false if a is significantly below b or if a is
-! greater than or equal to b
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- real(r8), intent(in) :: a
- real(r8), intent(in) :: b
-
- ! if provided, eps gives the relative error allowed for checking the "slightly"
- ! condition; if not provided, the tolerance defaults to the value given by eps_default
- real(r8), intent(in), optional :: eps
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: l_eps
- real(r8), parameter :: eps_default = 1.e-15_r8 ! default relative error tolerance
-!------------------------------------------------------------------------------
-
- if (present(eps)) then
- l_eps = eps
- else
- l_eps = eps_default
- end if
-
- if (a < b .and. (b - a)/b < l_eps) then
- slightly_below = .true.
- else
- slightly_below = .false.
- end if
-
-end function slightly_below
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: slightly_above
-!
-! !INTERFACE:
-logical function slightly_above(a, b, eps)
-!
-! !DESCRIPTION:
-! Returns true if a is slightly above b; false if a is significantly above b or if a is
-! less than or equal to b
-!
-! !USES:
-!
-! !ARGUMENTS:
- implicit none
- real(r8), intent(in) :: a
- real(r8), intent(in) :: b
-
- ! if provided, eps gives the relative error allowed for checking the "slightly"
- ! condition; if not provided, the tolerance defaults to the value given by eps_default
- real(r8), intent(in), optional :: eps
-!
-! !REVISION HISTORY:
-! Author: Bill Sacks
-!
-!
-! !LOCAL VARIABLES:
-!EOP
- real(r8) :: l_eps
- real(r8), parameter :: eps_default = 1.e-15_r8 ! default relative error tolerance
-!------------------------------------------------------------------------------
-
- if (present(eps)) then
- l_eps = eps
- else
- l_eps = eps_default
- end if
-
- if (a > b .and. (a - b)/b < l_eps) then
- slightly_above = .true.
- else
- slightly_above = .false.
- end if
-
-end function slightly_above
-!------------------------------------------------------------------------------
-
-end module mkutilsMod
diff --git a/tools/mksurfdata_map/src/mkvarctl.F90 b/tools/mksurfdata_map/src/mkvarctl.F90
deleted file mode 100644
index cebfc6a5e3..0000000000
--- a/tools/mksurfdata_map/src/mkvarctl.F90
+++ /dev/null
@@ -1,92 +0,0 @@
-module mkvarctl
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkvarctl
-!
-! !DESCRIPTION:
-! Module containing control variables
-!
-! !USES:
- use shr_kind_mod, only: r8 => shr_kind_r8
-!
-! !PUBLIC TYPES:
- implicit none
- private
- save
-!
- real(r8), public, parameter :: spval = 1.e36 ! special value
- integer, public, parameter :: ispval = -9999 ! special value
-
- logical, public :: outnc_large_files ! output files in 64-bit format for large files
- logical, public :: outnc_double ! output ALL data in files as 64-bit
- integer, public :: outnc_dims = 2 ! only applicable to lat/lon grids
- logical, public :: outnc_1d ! true => output file is 1d
- logical, public :: outnc_vic ! true => output VIC fields
- logical, public :: outnc_3dglc ! true => output 3D glacier fields
-
- character(len= 32), public :: mksrf_gridnm = ' ' ! name of grid to use on output file
- character(len=256), public :: mksrf_fgrid = ' ' ! land grid file name to use
- character(len=256), public :: mksrf_gridtype = ' ' ! land gridtype, global or reg
- character(len=256), public :: mksrf_fvegtyp = ' ' ! vegetation data file name
- character(len=256), public :: mksrf_fhrvtyp = ' ' ! harvest data file name
- character(len=256), public :: mksrf_fsoitex = ' ' ! soil texture data file name
- character(len=256), public :: mksrf_forganic = ' ' ! organic matter data file name
- character(len=256), public :: mksrf_fsoicol = ' ' ! soil color data file name
- character(len=256), public :: mksrf_fabm = ' ' ! ag fire peak month data file name
- character(len=256), public :: mksrf_fpeat = ' ' ! peatland data file name
- character(len=256), public :: mksrf_fsoildepth = ' ' ! soil depth file name
- character(len=256), public :: mksrf_fgdp = ' ' ! gdp data file names
- character(len=256), public :: mksrf_flakwat = ' ' ! inland lake data file name
- character(len=256), public :: mksrf_fwetlnd = ' ' ! inland wetlands data file name
- character(len=256), public :: mksrf_furban = ' ' ! urban data file name
- character(len=256), public :: mksrf_fglacier = ' ' ! glacier data file name
- character(len=256), public :: mksrf_fglacierregion = ' ' ! glacier region data file name
- character(len=256), public :: mksrf_furbtopo = ' ' ! urban topography data file name
- character(len=256), public :: mksrf_fmax = ' ' ! fmax data file name
- character(len=256), public :: mksrf_flai = ' ' ! lai data filename
- character(len=256), public :: mksrf_fdynuse = ' ' ! ascii file containing names of dynamic land use files
- character(len=256), public :: mksrf_fvocef = ' ' ! VOC Emission Factor data file name
- character(len=256), public :: mksrf_ftopostats = ' ' ! topography statistics data file name
- character(len=256), public :: mksrf_fvic = ' ' ! VIC parameters data file name
- character(len=256), public :: mksrf_fch4 = ' ' ! inversion-derived CH4 parameters data file name
-
- integer , public :: numpft = 16 ! number of plant types
-
- character(len=256), public :: map_fpft = ' ' ! Mapping file for PFT
- character(len=256), public :: map_flakwat = ' ' ! Mapping file for lake water
- character(len=256), public :: map_fwetlnd = ' ' ! Mapping file for wetland water
- character(len=256), public :: map_fglacier = ' ' ! Mapping file for glacier
- character(len=256), public :: map_fglacierregion = ' ' ! Mapping file for glacier region
- character(len=256), public :: map_fsoitex = ' ' ! Mapping file for soil texture
- character(len=256), public :: map_fsoicol = ' ' ! Mapping file for soil color
- character(len=256), public :: map_fabm = ' ' ! Mapping file: ag fire...
- character(len=256), public :: map_fpeat = ' ' ! Mapping file: peatlands
- character(len=256), public :: map_fsoildepth = ' ' ! Mapping file: soil depth
- character(len=256), public :: map_fgdp = ' ' ! Mapping file: gdp
- character(len=256), public :: map_furban = ' ' ! Mapping file for urban
- character(len=256), public :: map_furbtopo = ' ' ! Mapping file for urban topography
- character(len=256), public :: map_fmax = ' ' ! Mapping file for soil frac max
- character(len=256), public :: map_forganic = ' ' ! Mapping file for organic soil
- character(len=256), public :: map_fvocef = ' ' ! Mapping file for VOC emission factors
- character(len=256), public :: map_flai = ' ' ! Mapping file for LAI
- character(len=256), public :: map_fharvest = ' ' ! Mapping file for harvesting
- character(len=256), public :: map_ftopostats = ' ' ! Mapping file for topography statistics
- character(len=256), public :: map_fvic = ' ' ! Mapping file for VIC parameters
- character(len=256), public :: map_fch4 = ' ' ! Mapping file for inversion-derived CH4 parameters
- character(len=80) , public :: gitdescribe = ' ' ! Description of model version from git
-!
-! Variables to override data read in with
-! (all_urban is mostly for single-point mode, but could be used for sensitivity studies)
-!
- logical, public :: all_urban ! output ALL data as 100% covered in urban
- logical, public :: no_inlandwet ! set wetland to 0% over land; wetland will only be used for ocean points
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein 11/04
-!
-!EOP
-!-----------------------------------------------------------------------
-
-end module mkvarctl
diff --git a/tools/mksurfdata_map/src/mkvarpar.F90 b/tools/mksurfdata_map/src/mkvarpar.F90
deleted file mode 100644
index a8a01d2da2..0000000000
--- a/tools/mksurfdata_map/src/mkvarpar.F90
+++ /dev/null
@@ -1,32 +0,0 @@
-module mkvarpar
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkvarpar
-!
-! !DESCRIPTION:
-! Module containing CLM parameters
-!
-! !USES:
- use shr_kind_mod, only: r8 => shr_kind_r8
- use shr_const_mod, only: SHR_CONST_REARTH
-!
-! !PUBLIC TYPES:
- implicit none
- save
-!
- integer, parameter :: nlevsoi = 10 ! number of soil layers
- integer, parameter :: numstdpft = 16 ! number of standard PFT types
- integer, parameter :: numstdcft = 2 ! of the number of standard PFT types, how many are crop (CFT)
- integer, parameter :: noveg = 0 ! value for non-vegetated pft
- integer, parameter :: numsolar = 2 ! number of solar types (Direct,Diffuse)
- integer, parameter :: numrad = 2 ! number of solar bands (VIS,NIR)
- real(r8),parameter :: elev_thresh = 2600._r8 ! elevation threshold for screening urban areas
- real(r8),parameter :: re = SHR_CONST_REARTH*0.001
-
-!
-!EOP
-!-----------------------------------------------------------------------
-
-end module mkvarpar
diff --git a/tools/mksurfdata_map/src/mkvocefMod.F90 b/tools/mksurfdata_map/src/mkvocefMod.F90
deleted file mode 100644
index 03d9dddd3f..0000000000
--- a/tools/mksurfdata_map/src/mkvocefMod.F90
+++ /dev/null
@@ -1,209 +0,0 @@
-module mkvocefMod
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: mkvocefMod
-!
-! !DESCRIPTION:
-! Make VOC percentage emissions for surface dataset
-!
-! !REVISION HISTORY:
-! Author: Erik Kluzek
-!
-!-----------------------------------------------------------------------
-! !USES:
- use shr_kind_mod, only : r8 => shr_kind_r8
- use shr_sys_mod , only : shr_sys_flush
- use mkdomainMod , only : domain_checksame
-
- implicit none
- private
-
-! !PUBLIC MEMBER FUNCTIONS:
- public :: mkvocef ! Get the percentage emissions for VOC for different
- ! land cover types
-!EOP
-
-contains
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !IROUTINE: mkvocef
-!
-! !INTERFACE:
-subroutine mkvocef(ldomain, mapfname, datfname, ndiag, &
- ef_btr_o, ef_fet_o, ef_fdt_o, ef_shr_o, ef_grs_o, ef_crp_o)
-!
-! !DESCRIPTION:
-! make volatile organic compounds (VOC) emission factors.
-!
-! !USES:
- use mkdomainMod, only : domain_type, domain_clean, domain_read
- use mkgridmapMod
- use mkvarpar
- use mkvarctl
- use mkncdio
-!
-! !ARGUMENTS:
- implicit none
- type(domain_type) , intent(in) :: ldomain
- character(len=*) , intent(in) :: mapfname ! input mapping file name
- character(len=*) , intent(in) :: datfname ! input data file name
- integer , intent(in) :: ndiag ! unit number for diagnostic output
- real(r8) , intent(out):: ef_btr_o(:) ! output grid: EFs for broadleaf trees
- real(r8) , intent(out):: ef_fet_o(:) ! output grid: EFs for fineleaf evergreen
- real(r8) , intent(out):: ef_fdt_o(:) ! output grid: EFs for fineleaf deciduous
- real(r8) , intent(out):: ef_shr_o(:) ! output grid: EFs for shrubs
- real(r8) , intent(out):: ef_grs_o(:) ! output grid: EFs for grasses
- real(r8) , intent(out):: ef_crp_o(:) ! output grid: EFs for crops
-!
-! !CALLED FROM:
-! subroutine mksrfdat in module mksrfdatMod
-!
-! !REVISION HISTORY:
-! Author: Colette L. Heald
-! 17 Jul 2007 F Vitt -- updated to pftintdat06_clm3_5_05 and corrected indexing of ef_*_i arrays
-!
-!EOP
-!
-! !LOCAL VARIABLES:
- type(gridmap_type) :: tgridmap
- type(domain_type) :: tdomain ! local domain
- real(r8), allocatable :: ef_btr_i(:) ! input grid: EFs for broadleaf trees
- real(r8), allocatable :: ef_fet_i(:) ! input grid: EFs for fineleaf evergreen
- real(r8), allocatable :: ef_fdt_i(:) ! input grid: EFs for fineleaf deciduous
- real(r8), allocatable :: ef_shr_i(:) ! input grid: EFs for shrubs
- real(r8), allocatable :: ef_grs_i(:) ! input grid: EFs for grasses
- real(r8), allocatable :: ef_crp_i(:) ! input grid: EFs for crops
- real(r8), allocatable :: frac_dst(:) ! output fractions
- real(r8), allocatable :: mask_r8(:) ! float of tdomain%mask
- real(r8) :: sum_fldo ! global sum of dummy input fld
- real(r8) :: sum_fldi ! global sum of dummy input fld
- integer :: k,n,no,ni,ns_o,ns_i ! indices
- integer :: ncid,dimid,varid ! input netCDF id's
- integer :: ier ! error status
- real(r8) :: relerr = 0.00001_r8 ! max error: sum overlap wts ne 1
- character(len=32) :: subname = 'mkvocef'
-!-----------------------------------------------------------------------
-
- write (6,*) 'Attempting to make VOC emission factors .....'
- call shr_sys_flush(6)
-
- ns_o = ldomain%ns
-
- ! -----------------------------------------------------------------
- ! Read input Emission Factors
- ! -----------------------------------------------------------------
-
- ! Obtain input grid info, read local fields
-
- call domain_read(tdomain,datfname)
- ns_i = tdomain%ns
- allocate(ef_btr_i(ns_i), ef_fet_i(ns_i), ef_fdt_i(ns_i), &
- ef_shr_i(ns_i), ef_grs_i(ns_i), ef_crp_i(ns_i), &
- frac_dst(ns_o), stat=ier)
- if (ier/=0) call abort()
-
- write (6,*) 'Open VOC file: ', trim(datfname)
- call check_ret(nf_open(datfname, 0, ncid), subname)
- call check_ret(nf_inq_varid (ncid, 'ef_btr', varid), subname)
- call check_ret(nf_get_var_double(ncid, varid, ef_btr_i), subname)
- call check_ret(nf_inq_varid (ncid, 'ef_fet', varid), subname)
- call check_ret(nf_get_var_double(ncid, varid, ef_fet_i), subname)
- call check_ret(nf_inq_varid (ncid, 'ef_fdt', varid), subname)
- call check_ret(nf_get_var_double(ncid, varid, ef_fdt_i), subname)
- call check_ret(nf_inq_varid (ncid, 'ef_shr', varid), subname)
- call check_ret(nf_get_var_double(ncid, varid, ef_shr_i), subname)
- call check_ret(nf_inq_varid (ncid, 'ef_grs', varid), subname)
- call check_ret(nf_get_var_double(ncid, varid, ef_grs_i), subname)
- call check_ret(nf_inq_varid (ncid, 'ef_crp', varid), subname)
- call check_ret(nf_get_var_double(ncid, varid, ef_crp_i), subname)
- call check_ret(nf_close(ncid), subname)
-
- ! Area-average percent cover on input grid to output grid
- ! and correct according to the land mask
- ! Note that percent cover is in terms of total grid area.
-
- call gridmap_mapread(tgridmap, mapfname )
-
- ! Error checks for domain and map consistencies
-
- call domain_checksame( tdomain, ldomain, tgridmap )
-
- ! Obtain frac_dst
- call gridmap_calc_frac_dst(tgridmap, tdomain%mask, frac_dst)
-
- ! Do mapping from input to output grid
-
- call gridmap_areaave_srcmask(tgridmap, ef_btr_i, ef_btr_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, ef_fet_i, ef_fet_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, ef_fdt_i, ef_fdt_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, ef_shr_i, ef_shr_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, ef_grs_i, ef_grs_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
- call gridmap_areaave_srcmask(tgridmap, ef_crp_i, ef_crp_o, nodata=0._r8, mask_src=tdomain%mask, frac_dst=frac_dst)
-
- ! Check for conservation
-
- do no = 1, ns_o
- if ( ef_btr_o(no) < 0._r8 ) then
- write (6,*) 'MKVOCEF error: EF btr = ',ef_btr_o(no), &
- ' is negative for no = ',no
- call abort()
- end if
- if ( ef_fet_o(no) < 0._r8 ) then
- write (6,*) 'MKVOCEF error: EF fet = ',ef_fet_o(no), &
- ' is negative for no = ',no
- call abort()
- end if
- if ( ef_fdt_o(no) < 0._r8 ) then
- write (6,*) 'MKVOCEF error: EF fdt = ',ef_fdt_o(no), &
- ' is negative for no = ',no
- call abort()
- end if
- if ( ef_shr_o(no) < 0._r8 ) then
- write (6,*) 'MKVOCEF error: EF shr = ',ef_shr_o(no), &
- ' is negative for no = ',no
- call abort()
- end if
- if ( ef_grs_o(no) < 0._r8 ) then
- write (6,*) 'MKVOCEF error: EF grs = ',ef_grs_o(no), &
- ' is negative for no = ',no
- call abort()
- end if
- if ( ef_crp_o(no) < 0._r8 ) then
- write (6,*) 'MKVOCEF error: EF crp = ',ef_crp_o(no), &
- ' is negative for no = ',no
- call abort()
- end if
- enddo
-
- ! -----------------------------------------------------------------
- ! Error check 1
- ! Compare global sum fld_o to global sum fld_i.
- ! -----------------------------------------------------------------
-
- ! Global sum of output field -- must multiply by fraction of
- ! output grid that is land as determined by input grid
-
- allocate(mask_r8(ns_i), stat=ier)
- if (ier/=0) call abort()
- mask_r8 = tdomain%mask
- call gridmap_check( tgridmap, mask_r8, frac_dst, subname )
-
- write (6,*) 'Successfully made VOC Emission Factors'
- write (6,*)
- call shr_sys_flush(6)
-
- ! Deallocate dynamic memory
-
- deallocate ( ef_btr_i, ef_fet_i, ef_fdt_i, &
- ef_shr_i, ef_grs_i, ef_crp_i, frac_dst, mask_r8 )
- call domain_clean(tdomain)
- call gridmap_clean(tgridmap)
-
-end subroutine mkvocef
-
-!-----------------------------------------------------------------------
-
-end module mkvocefMod
diff --git a/tools/mksurfdata_map/src/nanMod.F90 b/tools/mksurfdata_map/src/nanMod.F90
deleted file mode 100644
index 0cbeeea112..0000000000
--- a/tools/mksurfdata_map/src/nanMod.F90
+++ /dev/null
@@ -1,41 +0,0 @@
-module nanMod
-
-!-----------------------------------------------------------------------
-!BOP
-!
-! !MODULE: nanMod
-!
-! !DESCRIPTION:
-! Set parameters for the floating point flags "inf" (Infinity)
-! and "nan" (not-a-number), as well as "bigint", the point
-! at which integers start to overflow. These values are used
-! to initialize arrays as a way to detect whether arrays
-! are being used before being set.
-! Note that bigint is the largest possible 32-bit integer.
-!
-! !USES:
- use shr_kind_mod, only: r8 => shr_kind_r8
-!
-! !PUBLIC TYPES:
- implicit none
- save
-#ifdef __PGI
-! quiet NaN for Portland Group compilers
- real(r8), parameter :: inf = O'0777600000000000000000'
- real(r8), parameter :: nan = O'0777700000000000000000'
- integer, parameter :: bigint = O'17777777777'
-#else
-! signaling NaN otherwise
- real(r8), parameter :: inf = O'0777600000000000000000'
- real(r8), parameter :: nan = O'0777610000000000000000'
- integer, parameter :: bigint = O'17777777777'
-#endif
-!
-! !REVISION HISTORY:
-! Created by Mariana Vertenstein based on cam module created by
-! CCM core group
-!
-!EOP
-!-----------------------------------------------------------------------
-
-end module nanMod
diff --git a/tools/mksurfdata_map/src/shr_const_mod.F90 b/tools/mksurfdata_map/src/shr_const_mod.F90
deleted file mode 100644
index 07d194e71e..0000000000
--- a/tools/mksurfdata_map/src/shr_const_mod.F90
+++ /dev/null
@@ -1,58 +0,0 @@
-!===============================================================================
-
-MODULE shr_const_mod
-
- use shr_kind_mod
-
- integer(SHR_KIND_IN),parameter,private :: R8 = SHR_KIND_R8 ! rename for local readability only
-
- !----------------------------------------------------------------------------
- ! physical constants (all data public)
- !----------------------------------------------------------------------------
- public
-
- real(R8),parameter :: SHR_CONST_PI = 3.14159265358979323846_R8 ! pi
- real(R8),parameter :: SHR_CONST_CDAY = 86400.0_R8 ! sec in calendar day ~ sec
- real(R8),parameter :: SHR_CONST_SDAY = 86164.0_R8 ! sec in siderial day ~ sec
- real(R8),parameter :: SHR_CONST_OMEGA = 2.0_R8*SHR_CONST_PI/SHR_CONST_SDAY ! earth rot ~ rad/sec
- real(R8),parameter :: SHR_CONST_REARTH = 6.37122e6_R8 ! radius of earth ~ m
- real(R8),parameter :: SHR_CONST_G = 9.80616_R8 ! acceleration of gravity ~ m/s^2
-
- real(R8),parameter :: SHR_CONST_STEBOL = 5.67e-8_R8 ! Stefan-Boltzmann constant ~ W/m^2/K^4
- real(R8),parameter :: SHR_CONST_BOLTZ = 1.38065e-23_R8 ! Boltzmann's constant ~ J/K/molecule
- real(R8),parameter :: SHR_CONST_AVOGAD = 6.02214e26_R8 ! Avogadro's number ~ molecules/kmole
- real(R8),parameter :: SHR_CONST_RGAS = SHR_CONST_AVOGAD*SHR_CONST_BOLTZ ! Universal gas constant ~ J/K/kmole
- real(R8),parameter :: SHR_CONST_MWDAIR = 28.966_R8 ! molecular weight dry air ~ kg/kmole
- real(R8),parameter :: SHR_CONST_MWWV = 18.016_R8 ! molecular weight water vapor
- real(R8),parameter :: SHR_CONST_RDAIR = SHR_CONST_RGAS/SHR_CONST_MWDAIR ! Dry air gas constant ~ J/K/kg
- real(R8),parameter :: SHR_CONST_RWV = SHR_CONST_RGAS/SHR_CONST_MWWV ! Water vapor gas constant ~ J/K/kg
- real(R8),parameter :: SHR_CONST_ZVIR = (SHR_CONST_RWV/SHR_CONST_RDAIR)-1.0_R8 ! RWV/RDAIR - 1.0
- real(R8),parameter :: SHR_CONST_KARMAN = 0.4_R8 ! Von Karman constant
- real(R8),parameter :: SHR_CONST_PSTD = 101325.0_R8 ! standard pressure ~ pascals
- real(R8),parameter :: SHR_CONST_PDB = 0.0112372_R8 ! ratio of 13C/12C in Pee Dee Belemnite (C isotope standard)
-
- real(R8),parameter :: SHR_CONST_TKTRIP = 273.16_R8 ! triple point of fresh water ~ K
- real(R8),parameter :: SHR_CONST_TKFRZ = 273.15_R8 ! freezing T of fresh water ~ K
- real(R8),parameter :: SHR_CONST_TKFRZSW = SHR_CONST_TKFRZ - 1.8_R8 ! freezing T of salt water ~ K
-
- real(R8),parameter :: SHR_CONST_RHODAIR = & ! density of dry air at STP ~ kg/m^3
- SHR_CONST_PSTD/(SHR_CONST_RDAIR*SHR_CONST_TKFRZ)
- real(R8),parameter :: SHR_CONST_RHOFW = 1.000e3_R8 ! density of fresh water ~ kg/m^3
- real(R8),parameter :: SHR_CONST_RHOSW = 1.026e3_R8 ! density of sea water ~ kg/m^3
- real(R8),parameter :: SHR_CONST_RHOICE = 0.917e3_R8 ! density of ice ~ kg/m^3
- real(R8),parameter :: SHR_CONST_CPDAIR = 1.00464e3_R8 ! specific heat of dry air ~ J/kg/K
- real(R8),parameter :: SHR_CONST_CPWV = 1.810e3_R8 ! specific heat of water vap ~ J/kg/K
- real(R8),parameter :: SHR_CONST_CPVIR = (SHR_CONST_CPWV/SHR_CONST_CPDAIR)-1.0_R8 ! CPWV/CPDAIR - 1.0
- real(R8),parameter :: SHR_CONST_CPFW = 4.188e3_R8 ! specific heat of fresh h2o ~ J/kg/K
- real(R8),parameter :: SHR_CONST_CPSW = 3.996e3_R8 ! specific heat of sea h2o ~ J/kg/K
- real(R8),parameter :: SHR_CONST_CPICE = 2.11727e3_R8 ! specific heat of fresh ice ~ J/kg/K
- real(R8),parameter :: SHR_CONST_LATICE = 3.337e5_R8 ! latent heat of fusion ~ J/kg
- real(R8),parameter :: SHR_CONST_LATVAP = 2.501e6_R8 ! latent heat of evaporation ~ J/kg
- real(R8),parameter :: SHR_CONST_LATSUB = & ! latent heat of sublimation ~ J/kg
- SHR_CONST_LATICE + SHR_CONST_LATVAP
- real(R8),parameter :: SHR_CONST_OCN_REF_SAL = 34.7_R8 ! ocn ref salinity (psu)
- real(R8),parameter :: SHR_CONST_ICE_REF_SAL = 4.0_R8 ! ice ref salinity (psu)
-
- real(R8),parameter :: SHR_CONST_SPVAL = 1.0e30_R8 ! special missing value
-
-END MODULE shr_const_mod
diff --git a/tools/mksurfdata_map/src/shr_file_mod.F90 b/tools/mksurfdata_map/src/shr_file_mod.F90
deleted file mode 100644
index a5e8d1987d..0000000000
--- a/tools/mksurfdata_map/src/shr_file_mod.F90
+++ /dev/null
@@ -1,1023 +0,0 @@
-!BOP ===========================================================================
-!
-! !MODULE: shr_file_mod.F90 --- Module to handle various file utility functions.
-!
-! !DESCRIPTION:
-!
-! Miscellaneous methods to handle file and directory utilities as well as FORTRAN
-! unit control. Also put/get local files into/from an archival location.
-!
-! File utilities used with CCSM message passing:
-!
-! shr_file_stdio is the main example here, it changes the working directory,
-! changes stdin and stdout to a given filename.
-!
-! This is needed by some implementations of MPI with MPMD, so that
-! each executable can run in a different working directory and redirect
-! output to different files.
-!
-! The file name archival convention, e.g.
-! call shr_file_put(rcode,"foo","mss:/USER/foo",rtpd=3650)
-! is extensible -- the existence of the optional file name prefix, e.g. "mss:",
-! and optional arguments, e.g. rtpd=3650, can be used to access site-specific
-! storage devices. Based on CCM (atmosphere) getfile & putfile routines, but
-! intended to be more extensible, shared code.
-!
-! !REVISION HISTORY:
-! 2006-05-08 E. Kluzek, Add in shr_file_mod and getUnit, freeUnit methods.
-! 2000-??-?? B. Kauffman, original version circa 2000
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-MODULE shr_file_mod
-
-! !USES:
-
- use shr_kind_mod ! defines kinds
- use shr_sys_mod ! system calls
- use shr_log_mod, only: s_loglev => shr_log_Level
- use shr_log_mod, only: s_logunit => shr_log_Unit
-
- IMPLICIT none
-
- PRIVATE ! By default everything is private to this module
-
-! !PUBLIC TYPES:
-
- ! no public types
-
-! !PUBLIC MEMBER FUNCTIONS:
-
- public :: shr_file_put ! Put a file to an archive location
- public :: shr_file_get ! Get a file from an archive location
- public :: shr_file_queryPrefix ! Get prefix type for a filename
- public :: shr_file_getUnit ! Get a logical unit for reading or writing
- public :: shr_file_freeUnit ! Free a logical unit
- public :: shr_file_stdio ! change dir and stdin and stdout
- public :: shr_file_chDir ! change current working directory
- public :: shr_file_dirio ! change stdin and stdout
- public :: shr_file_chStdIn ! change stdin (attach to a file)
- public :: shr_file_chStdOut ! change stdout (attach to a file)
- public :: shr_file_setIO ! open a log file from namelist
- public :: shr_file_setLogUnit ! Reset the log unit number
- public :: shr_file_setLogLevel ! Reset the logging debug level
- public :: shr_file_getLogUnit ! Get the log unit number
- public :: shr_file_getLogLevel ! Get the logging debug level
-
-! !PUBLIC DATA MEMBERS:
-
- ! Integer flags for recognized prefixes on file get/put operations
- integer(SHR_KIND_IN), parameter, public :: shr_file_noPrefix = 0 ! no recognized prefix
- integer(SHR_KIND_IN), parameter, public :: shr_file_nullPrefix = 1 ! null:
- integer(SHR_KIND_IN), parameter, public :: shr_file_cpPrefix = 2 ! cp:
- integer(SHR_KIND_IN), parameter, public :: shr_file_mssPrefix = 3 ! mss:
- integer(SHR_KIND_IN), parameter, public :: shr_file_hpssPrefix = 4 ! hpss:
-
-!EOP
- !--- unit numbers, users can ask for unit numbers from 0 to min, but getUnit
- !--- won't give a unit below min, users cannot ask for unit number above max
- !--- for backward compatibility.
- !--- eventually, recommend min as hard lower limit (tcraig, 9/2007)
- integer(SHR_KIND_IN),parameter :: shr_file_minUnit = 10 ! Min unit number to give
- integer(SHR_KIND_IN),parameter :: shr_file_maxUnit = 99 ! Max unit number to give
- logical, save :: UnitTag(0:shr_file_maxUnit) = .false. ! Logical units in use
-
-!===============================================================================
-CONTAINS
-!===============================================================================
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_put -- Put a file to an archival location.
-!
-! !DESCRIPTION:
-! a generic, extensible put-local-file-into-archive routine
-! USAGE:
-! call shr_file_put(rcode,"foo","/home/user/foo")
-! if ( rcode /= 0 ) call shr_sys_abort( "error copying foo" )
-! call shr_file_put(rcode,"foo","cp:/home/user/foo",remove=.true.)
-! if ( rcode /= 0 ) call shr_sys_abort( "error copying foo" )
-! call shr_file_put(rcode,"foo","mss:/USER/foo",rtpd=3650)
-! if ( rcode /= 0 ) call shr_sys_abort( "error archiving foo to MSS" )
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_put(rcode,loc_fn,rem_fn,passwd,rtpd,async,remove)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(out) :: rcode ! return code (non-zero -- error)
- character(*), intent(in) :: loc_fn ! local filename
- character(*), intent(in) :: rem_fn ! remote filename
- character(*), intent(in),optional :: passwd ! password
- integer(SHR_KIND_IN),intent(in),optional :: rtpd ! MSS retention period
- logical, intent(in),optional :: async ! true <=> asynchronous put
- logical, intent(in),optional :: remove ! true <=> rm after put
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: rtpd2 ! MSS retention period
- logical :: remove2 ! true <=> rm after put
- logical :: async2 ! true <=> asynchronous put
- character(SHR_KIND_CL) :: passwd2 ! password
- character(SHR_KIND_CL) :: rfn ! rem_fn without the destination prefix
- character(SHR_KIND_CL) :: cmd ! command sent to system call
- integer(SHR_KIND_IN) :: prefix ! remote file prefix type
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_file_put) '
- character(*),parameter :: F00 = "('(shr_file_put) ',4a)"
- character(*),parameter :: F01 = "('(shr_file_put) ',a,i3,2a)"
- character(*),parameter :: F02 = "(a,i4)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! - On some machines the system call will not return a valid error code
-! - When things are sent asynchronously, there probably won't be an error code
-! returned.
-!-------------------------------------------------------------------------------
-
- remove2 =.false. ; if ( PRESENT(remove )) remove2 = remove
- async2 =.true. ; if ( PRESENT(async )) async2 = async
- passwd2 = " " ; if ( PRESENT(passwd )) passwd2 = passwd
- rtpd2 = 365 ; if ( PRESENT(rtpd )) rtpd2 = rtpd
- rcode = 0
-
- prefix = shr_file_queryPrefix( rem_fn )
-
- if ( trim(rem_fn) == trim(loc_fn) ) then
- !------------------------------------------------------
- ! (remote file name) == (local file name) => do nothing
- !------------------------------------------------------
- cmd = 'do nothing: remote file = local file = '//trim(loc_fn)
- rcode = 0
- else if ( prefix == shr_file_cpPrefix .or. prefix == shr_file_noPrefix )then
- !------------------------------------------------------
- ! put via unix cp
- !------------------------------------------------------
- rfn = rem_fn
- if ( rem_fn(1:3) == "cp:") rfn = rem_fn(4:len_trim(rem_fn))
-#if defined(CATAMOUNT)
- call shr_jlcp(trim(loc_fn),len_trim(loc_fn),trim(rfn),len_trim(rfn),rcode)
- if (remove2) call unlink(trim(loc_fn))
- if (async2 .and. s_loglev > 0) write(s_logunit,F00) 'Error: asynchronous copy not supported.'
- cmd = 'shr_jlcp -f '//trim(loc_fn)//' '//trim(rfn)
- rcode = 0
-#else
- cmd = '/bin/cp -f '//trim(loc_fn)//' '//trim(rfn)
- if (remove2) cmd = trim(cmd)//' && /bin/rm -f '//trim(loc_fn)
- if (async2 ) cmd = trim(cmd)//' & '
- call shr_sys_system(trim(cmd),rcode)
-#endif
- else if ( prefix == shr_file_mssPrefix )then
- !------------------------------------------------------
- ! put onto NCAR's MSS
- !------------------------------------------------------
- if (rtpd2 > 9999) rtpd2 = 9999
- write(cmd,F02) '/usr/local/bin/msrcp -period ',rtpd2
- if (async2 .and. (.not. remove2) ) cmd = trim(cmd)//' -async '
- if (len_trim(passwd2) > 0 ) cmd = trim(cmd)//' -wpwd '//trim(passwd)
- cmd = trim(cmd)//' '//trim(loc_fn)//' '//trim(rem_fn)
- if (remove2) cmd = trim(cmd)//' && /bin/rm -f '//trim(loc_fn)
- if (async2 .and. remove2 ) cmd = trim(cmd)//' & '
- call shr_sys_system(trim(cmd),rcode)
- else if ( prefix == shr_file_hpssPrefix )then
- !------------------------------------------------------
- ! put onto LANL's hpss
- !------------------------------------------------------
- rcode = -1
- cmd = 'rem_fn='//trim(rem_fn)//' loc_fn='//trim(loc_fn)
- write(s_logunit,F00) 'ERROR: hpss option not yet implemented'
- call shr_sys_abort( subName//'ERROR: hpss option not yet implemented' )
- else if ( prefix == shr_file_nullPrefix )then
- ! do nothing
- cmd = "null prefix => no file archival, do nothing"
- rcode = 0
- end if
-
- if (s_loglev > 0) write(s_logunit,F01) 'rcode =',rcode,' cmd = ', trim(cmd)
-
-END SUBROUTINE shr_file_put
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_get -- Get a file from archival location.
-!
-! !DESCRIPTION:
-! a generic, extensible get-local-file-from-archive routine
-!
-! USAGE:
-! call shr_file_get(rcode,"foo","/home/user/foo")
-! if ( rcode /= 0 ) call shr_sys_abort( "error getting file foo" )
-! call shr_file_get(rcode,"foo","cp:/home/user/foo",remove=.true.)
-! if ( rcode /= 0 ) call shr_sys_abort( "error getting file foo" )
-! call shr_file_get(rcode,"foo","mss:/USER/foo",clobber=.true.)
-! if ( rcode /= 0 ) call shr_sys_abort( "error getting file foo from MSS" )
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_get(rcode,loc_fn,rem_fn,passwd,async,clobber)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(out) :: rcode ! return code (non-zero means error)
- character(*) ,intent(in) :: loc_fn ! local filename
- character(*) ,intent(in) :: rem_fn ! remote filename
- character(*) ,intent(in),optional :: passwd ! password
- logical ,intent(in),optional :: async ! true <=> asynchronous get
- logical ,intent(in),optional :: clobber ! true <=> clobber existing file
-
-!EOP
-
- !----- local -----
- logical :: async2 ! true <=> asynchronous get
- logical :: clobber2 ! true <=> clobber existing file
- logical :: exists ! true <=> local file a ready exists
- character(SHR_KIND_CL) :: passwd2 ! password
- character(SHR_KIND_CL) :: rfn ! rem_fn without the destination prefix
- character(SHR_KIND_CL) :: cmd ! command sent to system call
- integer(SHR_KIND_IN) :: prefix ! remote file prefix type
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_file_get) '
- character(*),parameter :: F00 = "('(shr_file_get) ',4a)"
- character(*),parameter :: F01 = "('(shr_file_get) ',a,i3,2a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! - On some machines the system call will not return a valid error code
-! - When things are sent asynchronously, there probably won't be a error code
-! returned.
-!-------------------------------------------------------------------------------
-
- passwd2 = " " ; if (PRESENT(passwd )) passwd2 = passwd
- async2 = .false. ; if (PRESENT(async )) async2 = async
- clobber2 = .false. ; if (PRESENT(clobber)) clobber2 = clobber
- rcode = 0
-
- inquire(file=trim(loc_fn),exist=exists)
- prefix = shr_file_queryPrefix( rem_fn )
-
- if ( exists .and. .not. clobber2 ) then
- !------------------------------------------------------
- ! (file exists) and (don't clobber) => do nothing
- !------------------------------------------------------
- cmd = 'do nothing: file exists & no-clobber for '//trim(loc_fn)
- rcode = 0
- else if ( trim(rem_fn) == trim(loc_fn) ) then
- !------------------------------------------------------
- ! (remote file name) == (local file name) => do nothing
- !------------------------------------------------------
- cmd = 'do nothing: remote file = local file for '//trim(loc_fn)
- rcode = 0
- else if ( prefix == shr_file_cpPrefix .or. prefix == shr_file_noPrefix )then
- !------------------------------------------------------
- ! get via unix cp
- !------------------------------------------------------
- rfn = rem_fn ! remove prefix from this temp file name
- if (rem_fn(1:3) == "cp:") rfn = rem_fn(4:len_trim(rem_fn))
-#if defined(CATAMOUNT)
- call shr_jlcp(trim(rfn),len(trim(rfn)),trim(loc_fn),len(trim(loc_fn)),rcode)
- if (async2.and.s_loglev>0) write(s_logunit,F00) 'Error: asynchronous copy not supported.'
- cmd = 'shr_jlcp -f '//trim(rfn)//' '//trim(loc_fn)
- rcode = 0
-#else
- cmd = '/bin/cp -f '//trim(rfn)//' '//trim(loc_fn)
- if (async2) cmd = trim(cmd)//' & '
- call shr_sys_system(trim(cmd),rcode)
-#endif
- else if ( prefix == shr_file_mssPrefix )then
- !------------------------------------------------------
- ! get from NCAR's MSS
- !------------------------------------------------------
- cmd = '/usr/local/bin/msrcp '
- if (async2) cmd = trim(cmd)//' -async '
- cmd = trim(cmd)//' '//trim(rem_fn)//' '//trim(loc_fn)
- call shr_sys_system(trim(cmd),rcode)
- else if ( prefix == shr_file_hpssPrefix )then
- !------------------------------------------------------
- ! get from LANL's hpss
- !------------------------------------------------------
- rcode = -1
- cmd = 'rem_fn='//trim(rem_fn)//' loc_fn='//trim(loc_fn)
- write(s_logunit,F00) 'ERROR: hpss option not yet implemented'
- call shr_sys_abort( subName//'ERROR: hpss option not yet implemented' )
- else if ( prefix == shr_file_nullPrefix )then
- ! do nothing
- cmd = "null prefix => no file retrieval, do nothing"
- rcode = 0
- end if
-
- if (s_loglev > 0) write(s_logunit,F01) 'rcode =',rcode,' cmd = ', trim(cmd)
-
-END SUBROUTINE shr_file_get
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_queryPrefix -- Get the prefix type from a filepath.
-!
-! !DESCRIPTION:
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-integer(SHR_KIND_IN) FUNCTION shr_file_queryPrefix( filepath, prefix )
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*), intent(in) :: filepath ! Input filepath
- character(*), intent(out), optional :: prefix ! Output prefix description
-
-!EOP
-
- !----- local -----
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if ( filepath(1:5) == "null:" )then
- shr_file_queryPrefix = shr_file_nullPrefix
- if ( present(prefix) ) prefix = "null:"
- else if( filepath(1:3) == "cp:" )then
- shr_file_queryPrefix = shr_file_cpPrefix
- if ( present(prefix) ) prefix = "cp:"
- else if( filepath(1:4) == "mss:" )then
- shr_file_queryPrefix = shr_file_mssPrefix
- if ( present(prefix) ) prefix = "mss:"
- else if( filepath(1:5) == "hpss:" )then
- shr_file_queryPrefix = shr_file_hpssPrefix
- if ( present(prefix) ) prefix = "hpss:"
- else
- shr_file_queryPrefix = shr_file_noPrefix
- if ( present(prefix) ) prefix = ""
- end if
-
-END FUNCTION shr_file_queryPrefix
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_getUnit -- Get a free FORTRAN unit number
-!
-! !DESCRIPTION: Get the next free FORTRAN unit number.
-!
-! !REVISION HISTORY:
-! 2005-Dec-14 - E. Kluzek - creation
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-INTEGER FUNCTION shr_file_getUnit ( unit )
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(in),optional :: unit ! desired unit number
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: n ! loop index
- logical :: opened ! If unit opened or not
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_file_getUnit) '
- character(*),parameter :: F00 = "('(shr_file_getUnit) ',A,I4,A)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (present (unit)) then
- inquire( unit, opened=opened )
- if (unit < 0 .or. unit > shr_file_maxUnit) then
- write(s_logunit,F00) 'invalid unit number request:', unit
- call shr_sys_abort( 'ERROR: bad input unit number' )
- else if (opened .or. UnitTag(unit) .or. unit == 0 .or. unit == 5 &
- .or. unit == 6) then
- write(s_logunit,F00) 'unit number ', unit, ' is already in use'
- call shr_sys_abort( 'ERROR: Input unit number already in use' )
- else
- shr_file_getUnit = unit
- UnitTag (unit) = .true.
- return
- end if
-
- else
- ! --- Choose first available unit other than 0, 5, or 6 ------
- do n=shr_file_maxUnit, shr_file_minUnit, -1
- inquire( n, opened=opened )
- if (n == 5 .or. n == 6 .or. opened) then
- cycle
- end if
- if ( .not. UnitTag(n) ) then
- shr_file_getUnit = n
- UnitTag(n) = .true.
- return
- end if
- end do
- end if
-
- call shr_sys_abort( subName//': Error: no available units found' )
-
-END FUNCTION shr_file_getUnit
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_freeUnit -- Free up a FORTRAN unit number
-!
-! !DESCRIPTION: Free up the given unit number
-!
-! !REVISION HISTORY:
-! 2005-Dec-14 - E. Kluzek - creation
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_freeUnit ( unit)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(in) :: unit ! unit number to be freed
-
-!EOP
-
- !----- local -----
-
- !----- formats -----
- character(*), parameter :: subName = '(shr_file_freeUnit) '
- character(*), parameter :: F00 = "('(shr_file_freeUnit) ',A,I4,A)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (unit < 0 .or. unit > shr_file_maxUnit) then
- if (s_loglev > 0) write(s_logunit,F00) 'invalid unit number request:', unit
- else if (unit == 0 .or. unit == 5 .or. unit == 6) then
- call shr_sys_abort( subName//': Error: units 0, 5, and 6 must not be freed' )
- else if (UnitTag(unit)) then
- UnitTag (unit) = .false.
- else
- if (s_loglev > 0) write(s_logunit,F00) 'unit ', unit, ' was not in use'
- end if
-
- return
-
-END SUBROUTINE shr_file_freeUnit
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_stdio -- Change working directory, and redirect stdin/stdout
-!
-! !DESCRIPTION:
-! 1) change the cwd (current working directory) and
-! 2) redirect stdin & stdout (units 5 & 6) to named files,
-! where the desired cwd & files are specified by namelist file.
-!
-! Normally this is done to work around limitations in the execution syntax
-! of common MPI implementations. For example, SGI's mpirun syntax is not
-! flexible enough to allow MPMD models to select different execution
-! directories or to redirect stdin & stdout on the command line.
-! Such functionality is highly desireable for CCSM purposes.
-! ie. mpirun can't handle this:
-! unix> cd /usr/tmp/jdoe/csm/case01/atm ; atm < atm.parm > atm.log &
-! unix> cd /usr/tmp/jdoe/csm/case01/cpl ; cpl < cpl.parm > cpl.log &
-! etc.
-!
-! ASSUMPTIONS:
-! o if the cwd, stdin, or stdout are to be changed, there must be a namelist
-! file in the cwd named <model>_stdio.nml where <model> is provided via
-! subroutine dummy argument.
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_stdio(model)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*),intent(in) :: model ! used to construct env varible name
-
-!EOP
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_stdio) '
- character(*),parameter :: F00 = "('(shr_file_stdio) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- call shr_file_chdir (model) ! changes cwd
- call shr_file_chStdOut(model) ! open units 5 & 6 to named files
- call shr_file_chStdIn (model) ! open units 5 & 6 to named files
-
-END SUBROUTINE shr_file_stdio
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_chdir -- Change working directory.
-!
-! !DESCRIPTION:
-! change the cwd (current working directory), see shr_file_stdio for notes
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_chdir(model, rcodeOut)
-
-! !USES:
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: model ! used to construct env varible name
- integer(SHR_KIND_IN),intent(out),optional :: rcodeOut ! Return error code
-
-!EOP
-
- !--- local ---
- character(SHR_KIND_CL) :: dir ! directory to cd to
- integer (SHR_KIND_IN) :: rcode ! Return error code
- character(SHR_KIND_CL) :: filename ! namelist file to read
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_chdir) '
- character(*),parameter :: F00 = "('(shr_file_chdir) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- call shr_file_stdioReadNL( model, filename, dirOut=dir, rcodeOut=rcode )
- if (dir /= "nochange") then
- call shr_sys_chdir(dir ,rcode)
- if (s_loglev > 0) write(s_logunit,F00) "read ",trim(filename),", changed cwd to ",trim(dir)
- else
- if (s_loglev > 0) write(s_logunit,F00) "read ",trim(filename),", cwd has *not* been changed"
- rcode = 1
- endif
- if ( present(rcodeOut) ) rcodeOut = rcode
-
-END SUBROUTINE shr_file_chdir
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_dirio --- Change stdin and stdout.
-!
-! !DESCRIPTION:
-! change the stdin & stdout (units 5 & 6), see shr_file_stdio for notes
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_dirio(model)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*),intent(in) :: model ! used to construct env varible name
-
-!EOP
-
- !--- local ---
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_dirio) '
-
-!-------------------------------------------------------------------------------
-! Notes:
-!
-!-------------------------------------------------------------------------------
-
- call shr_file_chStdIn (model)
- call shr_file_chStdOut(model)
-
-END SUBROUTINE shr_file_dirio
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_chStdIn -- Change stdin
-!
-! !DESCRIPTION:
-! change the stdin (unit 5), see shr_file_stdio for notes
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_chStdIn( model, NLFilename, rcodeOut )
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: model ! used to construct env var name
- character(SHR_KIND_CL),intent(out),optional :: NLFilename ! open unit 5 to this
- integer (SHR_KIND_IN),intent(out),optional :: rcodeOut ! return code
-
-!EOP
-
- !--- local ---
- character(SHR_KIND_CL) :: stdin ! open unit 5 to this file
- character(SHR_KIND_CL) :: nlfile ! Namelist filename for model to read from
- character(SHR_KIND_CL) :: filename ! namelist file to read
- integer (SHR_KIND_IN) :: rcode ! return code
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_chStdIn) '
- character(*),parameter :: F00 = "('(shr_file_chStdIn) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- call shr_file_stdioReadNL( model, filename, stdinOut=stdin, &
- nlfileOut=nlfile, rcodeOut=rcode )
- if (stdin /= "nochange") then
- open(unit=5,file=stdin ,status='UNKNOWN',iostat=rcode)
- if ( rcode /= 0 )then
- if (s_loglev > 0) &
- write(s_logunit,F00) "read ",trim(filename),': error opening file as unit 5:', &
- trim(nlfile)
- else
- if (s_loglev > 0) &
- write(s_logunit,F00) "read ",trim(filename),': unit 5 connected to ', &
- trim(stdin)
- end if
- else
- if (s_loglev > 0) write(s_logunit,F00) "read ",trim(filename), &
- ': unit 5 has *not* been redirected'
- endif
- if ( len_trim(nlfile) > 0) then
- if (s_loglev > 0) write(s_logunit,F00) "read ",trim(filename), &
- ': read namelist from file:',trim(nlfile)
- if ( .not. present(NLFilename) )then
- if (s_loglev > 0) write(s_logunit,F00) "error: namelist filename NOT present"
- rcode = 7
- end if
- else
- if (s_loglev > 0) write(s_logunit,F00) "read ",trim(filename),", "
- if ( present(NLFilename) )then
- if (s_loglev > 0) write(s_logunit,F00) "error: namelist filename present, but null"
- rcode = 8
- end if
- endif
- if ( present(NLFilename) ) NLFilename = nlfile
- if ( present(rcodeOut) ) rcodeOut = rcode
-
-END SUBROUTINE shr_file_chStdIn
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_stdout -- Change stdout
-!
-! !DESCRIPTION:
-! change the stdout (unit 6), see shr_file_stdio for notes
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_chStdOut(model,rcodeOut)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
- !--- arguments ---
- character(*) ,intent(in) :: model ! used to construct env varible name
- integer(SHR_KIND_IN),intent(out),optional :: rcodeOut ! Return error code
-!EOP
-
- !--- local ---
- character(SHR_KIND_CL) :: filename ! namelist file to read
- character(SHR_KIND_CL) :: stdout ! open unit 6 to this file
- integer (SHR_KIND_IN) :: rcode ! return code
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_chStdOut) '
- character(*),parameter :: F00 = "('(shr_file_chStdOut) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- call shr_file_stdioReadNL( model, filename, stdoutOut=stdout, &
- rcodeOut=rcode )
- if (stdout /= "nochange") then
- close(6)
- open(unit=6,file=stdout,position='APPEND')
- if (s_loglev > 0) write(s_logunit,F00) "read ",trim(filename), &
- ': unit 6 connected to ',trim(stdout)
- call shr_sys_flush(s_logunit)
- else
- if (s_loglev > 0) write(s_logunit,F00) "read ",trim(filename), &
- ': unit 6 has *not* been redirected'
- rcode = 1
- endif
-
- if ( present(rcodeOut) ) rcodeOut = rcode
-
-END SUBROUTINE shr_file_chStdOut
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_stdioReadNL -- read in stdio namelist
-!
-! !DESCRIPTION:
-! Read in the stdio namelist for any given model type. Return any of the
-! needed input namelist variables as optional arguments. Return "nochange" in
-! dir, stdin, or stdout if shouldn't change.
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_stdioReadNL( model, filename, dirOut, stdinOut, stdoutOut, &
- NLFileOut, rcodeOut )
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: model ! used to construct env varible name
- character(SHR_KIND_CL),intent(out) :: filename ! nml file to read from unit 5
- character(SHR_KIND_CL),intent(out),optional :: NLFileOut ! open unit 6 to this file
- character(SHR_KIND_CL),intent(out),optional :: dirOut ! directory to cd to
- character(SHR_KIND_CL),intent(out),optional :: stdinOut ! open unit 5 to this file
- character(SHR_KIND_CL),intent(out),optional :: stdoutOut ! open unit 6 to this file
- integer (SHR_KIND_IN),intent(out),optional :: rcodeOut ! return code
-
-!EOP
-
- !--- local ---
- logical :: exists ! true iff file exists
- character(SHR_KIND_CL) :: dir ! directory to cd to
- character(SHR_KIND_CL) :: stdin ! open unit 5 to this file
- character(SHR_KIND_CL) :: stdout ! open unit 6 to this file
- character(SHR_KIND_CL) :: NLFile ! namelist file to read seperately
- integer (SHR_KIND_IN) :: rcode ! return code
- integer (SHR_KIND_IN) :: unit ! Unit to read from
-
- namelist / stdio / dir,stdin,stdout,NLFile
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_stdioReadNL) '
- character(*),parameter :: F00 = "('(shr_file_stdioReadNL) ',4a)"
- character(*),parameter :: F01 = "('(shr_file_stdioReadNL) ',2a,i6)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!
-!-------------------------------------------------------------------------------
-
- rcode = 0
- dir = "nochange"
- stdin = "nochange"
- stdout = "nochange"
- NLFile = " "
-
- filename = trim(model)//"_stdio.nml" ! eg. file="cpl_stdio.nml"
- inquire(file=filename,exist=exists)
-
- if (.not. exists) then
- if (s_loglev > 0) write(s_logunit,F00) "file ",trim(filename),&
- & " doesn't exist, can not read stdio namelist from it"
- rcode = 9
- else
- unit = shr_file_getUnit()
- open (unit,file=filename,action="READ")
- read (unit,nml=stdio,iostat=rcode)
- close(unit)
- call shr_file_freeUnit( unit )
- if (rcode /= 0) then
- write(s_logunit,F01) 'ERROR: reading ',trim(filename),': iostat=',rcode
- call shr_sys_abort(subName//" ERROR reading "//trim(filename) )
- end if
- endif
- if ( len_trim(NLFile) > 0 .and. trim(stdin) /= "nochange" )then
- write(s_logunit,F00) "Error: input namelist:"
- write(s_logunit,nml=stdio)
- call shr_sys_abort(subName//" ERROR trying to both redirect AND "// &
- "open namelist filename" )
- end if
- if ( present(NLFileOut) ) NLFileOut = NLFile
- if ( present(dirOut) ) dirOut = dir
- if ( present(stdinOut) ) stdinOut = stdin
- if ( present(stdoutOut) ) stdoutOut = stdout
- if ( present(rcodeOut) ) rcodeOut = rcode
-
-END SUBROUTINE shr_file_stdioReadNL
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_setIO -- read in stdio namelist
-!
-! !DESCRIPTION:
-! This opens a namelist file specified as an argument and then opens
-! a log file associated with the unit argument. This may be extended
-! in the future.
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_setIO( nmlfile, funit)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(len=*) ,intent(in) :: nmlfile ! namelist filename
- integer(SHR_KIND_IN),intent(in) :: funit ! unit number for log file
-
-!EOP
-
- !--- local ---
- logical :: exists ! true if file exists
- character(SHR_KIND_CL) :: diri ! directory to cd to
- character(SHR_KIND_CL) :: diro ! directory to cd to
- character(SHR_KIND_CL) :: logfile ! open unit 6 to this file
- integer(SHR_KIND_IN) :: unit ! unit number
- integer(SHR_KIND_IN) :: rcode ! error code
-
- namelist / modelio / diri,diro,logfile
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_setIO) '
- character(*),parameter :: F00 = "('(shr_file_setIO) ',4a)"
- character(*),parameter :: F01 = "('(shr_file_setIO) ',2a,i6)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!
-!-------------------------------------------------------------------------------
-
- diri = "."
- diro = "."
- logfile = ""
-
- inquire(file=nmlfile,exist=exists)
-
- if (.not. exists) then
- if (s_loglev > 0) write(s_logunit,F00) "file ",trim(nmlfile)," non existant"
- return
- else
- unit = shr_file_getUnit()
- open (unit,file=nmlfile,action="READ")
- read (unit,nml=modelio,iostat=rcode)
- close(unit)
- call shr_file_freeUnit( unit )
- if (rcode /= 0) then
- write(s_logunit,F01) 'ERROR: reading ',trim(nmlfile),': iostat=',rcode
- call shr_sys_abort(subName//" ERROR reading "//trim(nmlfile) )
- end if
- endif
-
- if (len_trim(logfile) > 0) then
- open(funit,file=trim(diro)//"/"//trim(logfile))
- else
- if (s_loglev > 0) write(s_logunit,F00) "logfile not opened"
- endif
-
-END SUBROUTINE shr_file_setIO
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_setLogUnit -- Set the Log I/O Unit number
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_setLogUnit(unit)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(in) :: unit ! new unit number
-
-!EOP
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_setLogUnit) '
- character(*),parameter :: F00 = "('(shr_file_setLogUnit) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes: Caller must be sure it's a valid unit number
-!-------------------------------------------------------------------------------
-
- if (s_loglev > 1 .and. s_logunit-unit /= 0) then
- write(s_logunit,*) subName,': reset log unit number from/to ',s_logunit, unit
- write( unit,*) subName,': reset log unit number from/to ',s_logunit, unit
- endif
-
- s_logunit = unit
-
-END SUBROUTINE shr_file_setLogUnit
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_setLogLevel -- Set the Log I/O Unit number
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_setLogLevel(newlevel)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(in) :: newlevel ! new log level
-
-!EOP
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_setLogLevel) '
- character(*),parameter :: F00 = "('(shr_file_setLogLevel) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (s_loglev+newlevel > 2 .and. s_loglev-newlevel /= 0) &
- write(s_logunit,*) subName,': reset log level from/to ',s_loglev, newlevel
-
- s_loglev = newlevel
-
-END SUBROUTINE shr_file_setLogLevel
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_getLogUnit -- Set the Log I/O Unit number
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_getLogUnit(unit)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(out) :: unit ! new unit number
-
-!EOP
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_getLogUnit) '
- character(*),parameter :: F00 = "('(shr_file_getLogUnit) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- unit = s_logunit
-
-END SUBROUTINE shr_file_getLogUnit
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_file_getLogLevel -- Set the Log I/O Unit number
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-SUBROUTINE shr_file_getLogLevel(curlevel)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(out) :: curlevel ! new log level
-
-!EOP
-
- !--- formats ---
- character(*),parameter :: subName = '(shr_file_getLogLevel) '
- character(*),parameter :: F00 = "('(shr_file_getLogLevel) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- curlevel = s_loglev
-
-END SUBROUTINE shr_file_getLogLevel
-
-!===============================================================================
-!===============================================================================
-
-END MODULE shr_file_mod
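Reviewer note on the removal above (not part of the patch): `shr_file_getUnit`/`shr_file_freeUnit` hand-manage a `UnitTag` table to find a free I/O unit because this code predates Fortran 2008. Callers migrating off this module can use the standard `newunit=` open specifier instead; a minimal sketch, with an illustrative filename:

```fortran
program newunit_demo
   implicit none
   integer :: u, ios
   ! newunit= asks the runtime for a guaranteed-unused (negative) unit number,
   ! replacing the manual UnitTag(:) scan in the deleted shr_file_getUnit.
   open(newunit=u, file='scratch.txt', status='replace', iostat=ios)
   if (ios /= 0) stop 'open failed'
   write(u,'(a)') 'hello'
   close(u, status='delete')   ! closing releases the unit; no freeUnit call needed
end program newunit_demo
```

Unlike the deleted bookkeeping, `newunit=` can never collide with the preconnected units 0, 5, or 6, since the returned unit number is always negative.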
diff --git a/tools/mksurfdata_map/src/shr_kind_mod.F90 b/tools/mksurfdata_map/src/shr_kind_mod.F90
deleted file mode 100644
index d1219223da..0000000000
--- a/tools/mksurfdata_map/src/shr_kind_mod.F90
+++ /dev/null
@@ -1,19 +0,0 @@
-!===============================================================================
-
-MODULE shr_kind_mod
-
- !----------------------------------------------------------------------------
- ! precision/kind constants add data public
- !----------------------------------------------------------------------------
- public
- integer,parameter :: SHR_KIND_R8 = selected_real_kind(12) ! 8 byte real
- integer,parameter :: SHR_KIND_R4 = selected_real_kind( 6) ! 4 byte real
- integer,parameter :: SHR_KIND_RN = kind(1.0) ! native real
- integer,parameter :: SHR_KIND_I8 = selected_int_kind (13) ! 8 byte integer
- integer,parameter :: SHR_KIND_I4 = selected_int_kind ( 6) ! 4 byte integer
- integer,parameter :: SHR_KIND_IN = kind(1) ! native integer
- integer,parameter :: SHR_KIND_CS = 80 ! short char
- integer,parameter :: SHR_KIND_CL = 256 ! long char
- integer,parameter :: SHR_KIND_CX = 512 ! extra-long char
-
-END MODULE shr_kind_mod
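Reviewer note for anyone tracing callers of the deleted `shr_kind_mod`: on common compilers its `selected_*_kind` results coincide with the `iso_fortran_env` kinds, which are the usual modern replacement. A sketch for comparison (these equivalences hold on typical platforms but are not guaranteed by the standard):

```fortran
program kind_map
   use iso_fortran_env, only: real64, int64
   implicit none
   ! The deleted module's kind constants, recomputed here for comparison.
   integer, parameter :: SHR_KIND_R8 = selected_real_kind(12) ! 8 byte real
   integer, parameter :: SHR_KIND_I8 = selected_int_kind(13)  ! 8 byte integer
   ! On typical platforms both comparisons are true.
   print *, SHR_KIND_R8 == real64, SHR_KIND_I8 == int64
end program kind_map
```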
diff --git a/tools/mksurfdata_map/src/shr_log_mod.F90 b/tools/mksurfdata_map/src/shr_log_mod.F90
deleted file mode 100644
index 244314a8de..0000000000
--- a/tools/mksurfdata_map/src/shr_log_mod.F90
+++ /dev/null
@@ -1,13 +0,0 @@
-MODULE shr_log_mod
-
- use shr_kind_mod
-
- !----------------------------------------------------------------------------
- ! low-level shared variables for logging, these may not be parameters
- !----------------------------------------------------------------------------
- public
-
- integer(SHR_KIND_IN) :: shr_log_Level = 1
- integer(SHR_KIND_IN) :: shr_log_Unit = 6
-
-END MODULE shr_log_mod
diff --git a/tools/mksurfdata_map/src/shr_string_mod.F90 b/tools/mksurfdata_map/src/shr_string_mod.F90
deleted file mode 100644
index 44697ceaee..0000000000
--- a/tools/mksurfdata_map/src/shr_string_mod.F90
+++ /dev/null
@@ -1,1753 +0,0 @@
-!===============================================================================
-!BOP ===========================================================================
-!
-! !MODULE: shr_string_mod -- string and list methods
-!
-! !DESCRIPTION:
-! General string and specific list method. A list is a single string
-! that is delimited by a character forming multiple fields, ie,
-! character(len=*) :: mylist = "t:s:u1:v1:u2:v2:taux:tauy"
-! The delimiter is called listDel in this module, is default ":",
-! but can be set by a call to shr_string_listSetDel.
-!
-! !REVISION HISTORY:
-! 2005-Apr-28 - T. Craig - first version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-module shr_string_mod
-
-! !USES:
-
- use shr_kind_mod ! F90 kinds
- use shr_sys_mod ! shared system calls
- use shr_timer_mod, only : shr_timer_get, shr_timer_start, shr_timer_stop
- use shr_log_mod, only : s_loglev => shr_log_Level
- use shr_log_mod, only : s_logunit => shr_log_Unit
-
- implicit none
- private
-
-! !PUBLIC TYPES:
-
- ! no public types
-
-! !PUBLIC MEMBER FUNCTIONS:
-
- public :: shr_string_countChar ! Count number of char in string, fn
- public :: shr_string_toUpper ! Convert string to upper-case
- public :: shr_string_toLower ! Convert string to lower-case
- public :: shr_string_getParentDir ! For a pathname get the parent directory name
- public :: shr_string_lastIndex ! Index of last substr in str
- public :: shr_string_endIndex ! Index of end of substr in str
- public :: shr_string_leftAlign ! remove leading white space
- public :: shr_string_alphanum ! remove all non alpha-numeric characters
- public :: shr_string_betweenTags ! get the substring between the two tags
- public :: shr_string_parseCFtunit ! parse CF time units
- public :: shr_string_clean ! Set string to all white space
-
- public :: shr_string_listIsValid ! test for a valid "list"
- public :: shr_string_listGetNum ! Get number of fields in list, fn
- public :: shr_string_listGetIndex ! Get index of field
- public :: shr_string_listGetIndexF ! function version of listGetIndex
- public :: shr_string_listGetName ! get k-th field name
- public :: shr_string_listIntersect ! get intersection of two field lists
- public :: shr_string_listUnion ! get union of two field lists
- public :: shr_string_listMerge ! merge two lists to form third
- public :: shr_string_listAppend ! append list at end of another
- public :: shr_string_listPrepend ! prepend list in front of another
- public :: shr_string_listSetDel ! Set field delimiter in lists
- public :: shr_string_listGetDel ! Get field delimiter in lists
-
- public :: shr_string_setAbort ! set local abort flag
- public :: shr_string_setDebug ! set local debug flag
-
-! !PUBLIC DATA MEMBERS:
-
- ! no public data members
-
-!EOP
-
- character(len=1) ,save :: listDel = ":" ! note single exec implications
- character(len=2) ,save :: listDel2 = "::" ! note single exec implications
- logical ,save :: doabort = .true.
- integer(SHR_KIND_IN),save :: debug = 0
-
-!===============================================================================
-contains
-!===============================================================================
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_countChar -- Count number of occurrences of a character
-!
-! !DESCRIPTION:
-! count number of occurrences of a single character in a string
-! \newline
-! n = shr\_string\_countChar(string,character)
-!
-! !REVISION HISTORY:
-! 2005-Feb-28 - First version from dshr_bundle
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-integer function shr_string_countChar(str,char,rc)
-
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: str ! string to search
- character(1) ,intent(in) :: char ! char to search for
- integer(SHR_KIND_IN),intent(out),optional :: rc ! return code
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: count ! counts occurrences of char
- integer(SHR_KIND_IN) :: n ! generic index
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_countChar) "
- character(*),parameter :: F00 = "('(shr_string_countChar) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- count = 0
- do n = 1, len_trim(str)
- if (str(n:n) == char) count = count + 1
- end do
- shr_string_countChar = count
-
- if (present(rc)) rc = 0
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_countChar
-
-!===============================================================================
-!BOP ===========================================================================
-! !IROUTINE: shr_string_toUpper -- Convert string to upper case
-!
-! !DESCRIPTION:
-! Convert the input string to upper-case.
-! Use achar and iachar intrinsics to ensure use of ascii collating sequence.
-!
-! !REVISION HISTORY:
-! 2005-Dec-20 - Move CAM version over to shared code.
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-function shr_string_toUpper(str)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
- character(len=*), intent(in) :: str ! String to convert to upper case
- character(len=len(str)) :: shr_string_toUpper
-
- !----- local -----
- integer(SHR_KIND_IN) :: i ! Index
- integer(SHR_KIND_IN) :: aseq ! ascii collating sequence
- integer(SHR_KIND_IN) :: LowerToUpper ! integer to convert case
- character(len=1) :: ctmp ! Character temporary
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_toUpper) "
- character(*),parameter :: F00 = "('(shr_string_toUpper) ',4a)"
-
-!-------------------------------------------------------------------------------
-!
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- LowerToUpper = iachar("A") - iachar("a")
-
- do i = 1, len(str)
- ctmp = str(i:i)
- aseq = iachar(ctmp)
- if ( aseq >= iachar("a") .and. aseq <= iachar("z") ) &
- ctmp = achar(aseq + LowerToUpper)
- shr_string_toUpper(i:i) = ctmp
- end do
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_toUpper
-
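For reference, the ASCII case shift used by shr_string_toUpper (and its toLower twin below) can be mirrored outside Fortran. This is a hypothetical Python sketch; the name `shr_to_upper` is mine, not part of the codebase:

```python
# Hypothetical sketch of the shr_string_toUpper logic: shift lowercase
# ASCII letters by the fixed iachar("A") - iachar("a") offset and leave
# every other character untouched.
def shr_to_upper(s: str) -> str:
    offset = ord("A") - ord("a")        # -32 in ASCII
    out = []
    for ch in s:
        if "a" <= ch <= "z":            # only lowercase ASCII letters move
            ch = chr(ord(ch) + offset)
        out.append(ch)
    return "".join(out)
```

Like the Fortran original, this works on the ASCII collating sequence directly rather than relying on locale-dependent case rules.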
-!===============================================================================
-!BOP ===========================================================================
-! !IROUTINE: shr_string_toLower -- Convert string to lower case
-!
-! !DESCRIPTION:
-! Convert the input string to lower-case.
-! Use achar and iachar intrinsics to ensure use of ascii collating sequence.
-!
-! !REVISION HISTORY:
-! 2006-Apr-20 - Creation
-!
-! !INTERFACE: ------------------------------------------------------------------
-function shr_string_toLower(str)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
- character(len=*), intent(in) :: str ! String to convert to lower case
- character(len=len(str)) :: shr_string_toLower
-
- !----- local -----
- integer(SHR_KIND_IN) :: i ! Index
- integer(SHR_KIND_IN) :: aseq ! ascii collating sequence
- integer(SHR_KIND_IN) :: UpperToLower ! integer to convert case
- character(len=1) :: ctmp ! Character temporary
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_toLower) "
- character(*),parameter :: F00 = "('(shr_string_toLower) ',4a)"
-
-!-------------------------------------------------------------------------------
-!
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- UpperToLower = iachar("a") - iachar("A")
-
- do i = 1, len(str)
- ctmp = str(i:i)
- aseq = iachar(ctmp)
- if ( aseq >= iachar("A") .and. aseq <= iachar("Z") ) &
- ctmp = achar(aseq + UpperToLower)
- shr_string_toLower(i:i) = ctmp
- end do
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_toLower
-
-!===============================================================================
-!BOP ===========================================================================
-! !IROUTINE: shr_string_getParentDir -- For pathname get the parent directory name
-!
-! !DESCRIPTION:
-! Get the parent directory name for a pathname.
-!
-! !REVISION HISTORY:
-! 2006-May-09 - Creation
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-function shr_string_getParentDir(str)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
- character(len=*), intent(in) :: str ! String to convert to lower case
- character(len=len(str)) :: shr_string_getParentDir
-
- !----- local -----
- integer(SHR_KIND_IN) :: i ! Index
- integer(SHR_KIND_IN) :: nlen ! Length of string
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_getParentDir) "
- character(*),parameter :: F00 = "('(shr_string_getParentDir) ',4a)"
-
-!-------------------------------------------------------------------------------
-!
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- nlen = len_trim(str)
- if ( str(nlen:nlen) == "/" ) nlen = nlen - 1
- i = index( str(1:nlen), "/", back=.true. )
- if ( i == 0 )then
- shr_string_getParentDir = str
- else
- shr_string_getParentDir = str(1:i-1)
- end if
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_getParentDir
-
-!===============================================================================
-!BOP ===========================================================================
-!
-!
-! !IROUTINE: shr_string_lastIndex -- Get index of last substr within string
-!
-! !DESCRIPTION:
-! Get index of last substr within string
-! \newline
-! n = shr\_string\_lastIndex(string,substring)
-!
-! !REVISION HISTORY:
-! 2005-Feb-28 - First version from dshr_domain
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-integer function shr_string_lastIndex(string,substr,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: string ! string to search
- character(*) ,intent(in) :: substr ! sub-string to search for
- integer(SHR_KIND_IN),intent(out),optional :: rc ! return code
-
-!EOP
-
- !--- local ---
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_lastIndex) "
- character(*),parameter :: F00 = "('(shr_string_lastIndex) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Note:
-! - "new" F90 back option to index function makes this home-grown solution obsolete
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- shr_string_lastIndex = index(string,substr,.true.)
-
- if (present(rc)) rc = 0
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_lastIndex
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_endIndex -- Get the ending index of substr within string
-!
-! !DESCRIPTION:
-! Get the ending index of substr within string
-! \newline
-! n = shr\_string\_endIndex(string,substring)
-!
-! !REVISION HISTORY:
-! 2005-May-10 - B. Kauffman, first version.
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-integer function shr_string_endIndex(string,substr,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: string ! string to search
- character(*) ,intent(in) :: substr ! sub-string to search for
- integer(SHR_KIND_IN),intent(out),optional :: rc ! return code
-
-!EOP
-
- !--- local ---
- integer(SHR_KIND_IN) :: i ! generic index
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_endIndex) "
- character(*),parameter :: F00 = "('(shr_string_endIndex) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! * returns zero if substring not found, uses len_trim() intrinsic
-! * very similar to: i = index(str,substr,back=.true.)
-! * do we need this function?
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- i = index(trim(string),trim(substr))
- if ( i == 0 ) then
- shr_string_endIndex = 0 ! substr is not in string
- else
- shr_string_endIndex = i + len_trim(substr) - 1
- end if
-
-! -------------------------------------------------------------------
-! i = index(trim(string),trim(substr),back=.true.)
-! if (i == len(string)+1) i = 0
-! shr_string_endIndex = i
-! -------------------------------------------------------------------
-
- if (present(rc)) rc = 0
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_endIndex
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_leftAlign -- remove leading white space
-!
-! !DESCRIPTION:
-! Remove leading white space
-! \newline
-! call shr\_string\_leftAlign(string)
-!
-! !REVISION HISTORY:
-! 2005-Apr-28 - B. Kauffman - First version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_leftAlign(str,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(inout) :: str
- integer(SHR_KIND_IN),intent(out) ,optional :: rc ! return code
-
-!EOP
-
- !----- local ----
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_leftAlign) "
- character(*),parameter :: F00 = "('(shr_string_leftAlign) ',4a)"
-
-!-------------------------------------------------------------------------------
-! note:
-! * ?? this routine isn't needed, use the intrinsic adjustL instead ??
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
-! -------------------------------------------------------------------
-! --- I used this until I discovered the intrinsic function below - BK
-! do while (len_trim(str) > 0 )
-! if (str(1:1) /= ' ') exit
-! str = str(2:len_trim(str))
-! end do
-! rCode = 0
-! !! (len_trim(str) == 0 ) rCode = 1 ! ?? appropriate ??
-! -------------------------------------------------------------------
-
- str = adjustL(str)
- if (present(rc)) rc = 0
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_leftAlign
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_alphanum -- remove non alpha numeric characters
-!
-! !DESCRIPTION:
-! Remove all non alpha numeric characters from string
-! \newline
-! call shr\_string\_alphanum(string)
-!
-! !REVISION HISTORY:
-! 2005-Aug-01 - T. Craig - First version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_alphanum(str,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(inout) :: str
- integer(SHR_KIND_IN),intent(out) ,optional :: rc ! return code
-
-!EOP
-
- !----- local ----
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: n,icnt ! counters
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_alphaNum) "
- character(*),parameter :: F00 = "('(shr_string_alphaNum) ',4a)"
-
-!-------------------------------------------------------------------------------
-!
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- icnt = 0
- do n=1,len_trim(str)
- if ((str(n:n) >= 'a' .and. str(n:n) <= 'z') .or. &
- (str(n:n) >= 'A' .and. str(n:n) <= 'Z') .or. &
- (str(n:n) >= '0' .and. str(n:n) <= '9')) then
- icnt = icnt + 1
- str(icnt:icnt) = str(n:n)
- endif
- enddo
- do n=icnt+1,len(str)
- str(n:n) = ' '
- enddo
-
- if (present(rc)) rc = 0
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_alphanum
-
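The in-place compaction done by shr_string_alphanum can be sketched in Python (hypothetical name `shr_alphanum`; Python strings are immutable, so a new blank-padded string of the same length is returned instead of modifying the argument):

```python
# Hypothetical sketch of shr_string_alphanum: keep only ASCII letters
# and digits, compact them to the front, and blank-fill the remainder
# so the result has the same length as the input, as the Fortran does.
def shr_alphanum(s: str) -> str:
    kept = [c for c in s if c.isascii() and c.isalnum()]
    return "".join(kept).ljust(len(s))  # pad the tail with blanks
```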
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_betweenTags -- Get the substring between the two tags.
-!
-! !DESCRIPTION:
-! Get the substring found between the start and end tags.
-! \newline
-! call shr\_string\_betweenTags(string,startTag,endTag,substring,rc)
-!
-! !REVISION HISTORY:
-! 2005-May-11 - B. Kauffman, first version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_betweenTags(string,startTag,endTag,substr,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: string ! string to search
- character(*) ,intent(in) :: startTag ! start tag
- character(*) ,intent(in) :: endTag ! end tag
- character(*) ,intent(out) :: substr ! sub-string between tags
- integer(SHR_KIND_IN),intent(out),optional :: rc ! return code
-
-!EOP
-
- !--- local ---
- integer(SHR_KIND_IN) :: iStart ! substring start index
- integer(SHR_KIND_IN) :: iEnd ! substring end index
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_betweenTags) "
- character(*),parameter :: F00 = "('(shr_string_betweenTags) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! * assumes the leading/trailing white space is not part of start & end tags
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- iStart = shr_string_endIndex(string,trim(adjustL(startTag))) ! end of start tag
- iEnd = index(string,trim(adjustL(endTag ))) ! start of end tag
-
- rCode = 0
- substr = ""
-
- if (iStart < 1) then
- if (s_loglev > 0) then
- write(s_logunit,F00) "ERROR: can't find start tag in string"
- write(s_logunit,F00) "ERROR: start tag = ",trim(startTag)
- write(s_logunit,F00) "ERROR: string = ",trim(string)
- endif
- rCode = 1
- else if (iEnd < 1) then
- if (s_loglev > 0) then
- write(s_logunit,F00) "ERROR: can't find end tag in string"
- write(s_logunit,F00) "ERROR: end tag = ",trim( endTag)
- write(s_logunit,F00) "ERROR: string = ",trim(string)
- endif
- rCode = 2
- else if ( iEnd <= iStart) then
- if (s_loglev > 0) then
- write(s_logunit,F00) "ERROR: start tag not before end tag"
- write(s_logunit,F00) "ERROR: start tag = ",trim(startTag)
- write(s_logunit,F00) "ERROR: end tag = ",trim( endTag)
- write(s_logunit,F00) "ERROR: string = ",trim(string)
- endif
- rCode = 3
- else if ( iStart+1 == iEnd ) then
- substr = ""
- if (s_loglev > 0) write(s_logunit,F00) "WARNING: zero-length substring found in ",trim(string)
- else
- substr = string(iStart+1:iEnd-1)
- if (len_trim(substr) == 0 .and. s_loglev > 0) &
- & write(s_logunit,F00) "WARNING: white-space substring found in ",trim(string)
- end if
-
- if (present(rc)) rc = rCode
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_betweenTags
-
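The tag-extraction contract of shr_string_betweenTags can be sketched as follows. This is a hypothetical Python rendering (`between_tags` is my name); the Fortran signals problems through a return code and also diagnoses an end tag that precedes the start tag, while this simplified sketch raises `ValueError` and searches for the end tag only after the start tag:

```python
# Hypothetical sketch of shr_string_betweenTags: return the substring
# found between a start tag and an end tag, assuming leading/trailing
# white space is not part of either tag.
def between_tags(string: str, start_tag: str, end_tag: str) -> str:
    start = start_tag.strip()
    end = end_tag.strip()
    i = string.find(start)
    if i < 0:
        raise ValueError("can't find start tag in string")
    i_start = i + len(start)            # index just past the start tag
    i_end = string.find(end, i_start)
    if i_end < 0:
        raise ValueError("can't find end tag in string")
    return string[i_start:i_end]
```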
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_parseCFtunit -- Parse CF time unit
-!
-! !DESCRIPTION:
-! Parse CF time unit into a delta string name and a base time in yyyymmdd
-! and seconds (nearest integer actually).
-! \newline
-! call shr\_string\_parseCFtunit(string,substring)
-! \newline
-! Input string is like "days since 0001-06-15 15:20:45.5 -6:00"
-! - recognizes "days", "hours", "minutes", "seconds"
-! - must have at least yyyy-mm-dd, hh:mm:ss.s is optional
-! - expects a "since" in the string
-! - ignores time zone part
-!
-! !REVISION HISTORY:
-! 2005-May-15 - T. Craig - first version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_parseCFtunit(string,unit,bdate,bsec,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: string ! string to search
- character(*) ,intent(out) :: unit ! delta time unit
- integer(SHR_KIND_IN),intent(out) :: bdate ! base date yyyymmdd
- real(SHR_KIND_R8) ,intent(out) :: bsec ! base seconds
- integer(SHR_KIND_IN),intent(out),optional :: rc ! return code
-
-!EOP
-
- !--- local ---
- integer(SHR_KIND_IN) :: i,i1,i2 ! generic index
- character(SHR_KIND_CL) :: tbase ! baseline time
- character(SHR_KIND_CL) :: lstr ! local string
- integer(SHR_KIND_IN) :: yr,mo,da,hr,min ! time stuff
- real(SHR_KIND_R8) :: sec ! time stuff
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_parseCFtunit) "
- character(*),parameter :: F00 = "('(shr_string_parseCFtunit) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! o assume length of CF-1.0 time attribute char string < SHR_KIND_CL
-! This is a reasonable assumption.
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- unit = 'none'
- bdate = 0
- bsec = 0.0_SHR_KIND_R8
-
- i = shr_string_lastIndex(string,'days ')
- if (i > 0) unit = 'days'
- i = shr_string_lastIndex(string,'hours ')
- if (i > 0) unit = 'hours'
- i = shr_string_lastIndex(string,'minutes ')
- if (i > 0) unit = 'minutes'
- i = shr_string_lastIndex(string,'seconds ')
- if (i > 0) unit = 'seconds'
-
- if (trim(unit) == 'none') then
- write(s_logunit,F00) ' ERROR time unit unknown'
- call shr_string_abort(subName//' time unit unknown')
- endif
-
- i = shr_string_lastIndex(string,' since ')
- if (i < 1) then
- write(s_logunit,F00) ' ERROR since does not appear in unit attribute for time '
- call shr_string_abort(subName//' no since in attr name')
- endif
- tbase = trim(string(i+6:))
- call shr_string_leftAlign(tbase)
-
- if (debug > 0 .and. s_logunit > 0) then
- write(s_logunit,*) trim(subName)//' '//'unit '//trim(unit)
- write(s_logunit,*) trim(subName)//' '//'tbase '//trim(tbase)
- endif
-
- yr=0; mo=0; da=0; hr=0; min=0; sec=0
- i1 = 1
-
- i2 = index(tbase,'-') - 1
- lstr = tbase(i1:i2)
- read(lstr,*,ERR=200,END=200) yr
- tbase = tbase(i2+2:)
- call shr_string_leftAlign(tbase)
-
- i2 = index(tbase,'-') - 1
- lstr = tbase(i1:i2)
- read(lstr,*,ERR=200,END=200) mo
- tbase = tbase(i2+2:)
- call shr_string_leftAlign(tbase)
-
- i2 = index(tbase,' ') - 1
- lstr = tbase(i1:i2)
- read(lstr,*,ERR=200,END=200) da
- tbase = tbase(i2+2:)
- call shr_string_leftAlign(tbase)
-
- i2 = index(tbase,':') - 1
- lstr = tbase(i1:i2)
- read(lstr,*,ERR=200,END=100) hr
- tbase = tbase(i2+2:)
- call shr_string_leftAlign(tbase)
-
- i2 = index(tbase,':') - 1
- lstr = tbase(i1:i2)
- read(lstr,*,ERR=200,END=100) min
- tbase = tbase(i2+2:)
- call shr_string_leftAlign(tbase)
-
- i2 = index(tbase,' ') - 1
- lstr = tbase(i1:i2)
- read(lstr,*,ERR=200,END=100) sec
-
-100 continue
-
- if (debug > 0 .and. s_loglev > 0) write(s_logunit,*) trim(subName),'ymdhms:',yr,mo,da,hr,min,sec
-
- bdate = abs(yr)*10000 + mo*100 + da
- if (yr < 0) bdate = -bdate
- bsec = real(hr*3600 + min*60,SHR_KIND_R8) + sec
-
- if (present(rc)) rc = 0
-
- if (debug>1) call shr_timer_stop (t01)
- return
-
-200 continue
- write(s_logunit,F00) 'ERROR 200 on char num read '
- call shr_string_abort(subName//' ERROR on char num read')
- if (debug>1) call shr_timer_stop (t01)
- return
-
-end subroutine shr_string_parseCFtunit
-
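The parsing contract described above (a CF time unit such as "days since 0001-06-15 15:20:45.5 -6:00" split into a unit name, a yyyymmdd base date, and base seconds) can be sketched compactly. This is a hypothetical Python sketch (`parse_cf_tunit` is my name); the negative-year handling and time-zone field of the Fortran version are omitted:

```python
# Hypothetical sketch of shr_string_parseCFtunit: split
# "<unit> since yyyy-mm-dd [hh:mm:ss.s]" into (unit, yyyymmdd, seconds).
def parse_cf_tunit(s: str):
    unit, _, base = s.partition(" since ")
    unit = unit.strip()
    if unit not in ("days", "hours", "minutes", "seconds"):
        raise ValueError("time unit unknown")
    parts = base.strip().split()
    yr, mo, da = (int(x) for x in parts[0].split("-"))
    hh = mm = 0
    ss = 0.0
    if len(parts) > 1:                  # hh:mm:ss.s is optional
        hms = parts[1].split(":")
        hh, mm = int(hms[0]), int(hms[1])
        if len(hms) > 2:
            ss = float(hms[2])
    bdate = yr * 10000 + mo * 100 + da  # base date as yyyymmdd
    bsec = hh * 3600 + mm * 60 + ss     # base seconds past midnight
    return unit, bdate, bsec
```

Any trailing time-zone token (e.g. "-6:00") is simply ignored, matching the behavior documented in the header above.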
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_clean -- Clean a string, set it to "blank"
-!
-! !DESCRIPTION:
-! Clean a string, set it to blank
-! \newline
-! call shr\_string\_clean(string,rc)
-!
-! !REVISION HISTORY:
-! 2005-May-05 - T. Craig
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_clean(string,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(inout) :: string ! list/string
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: n ! counter
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_clean) "
- character(*),parameter :: F00 = "('(shr_string_clean) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- rCode = 0
- string = ' '
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_clean
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listIsValid -- determine whether string is a valid list
-!
-! !DESCRIPTION:
-! Determine whether string is a valid list
-! \newline
-! logical_var = shr\_string\_listIsValid(list,rc)
-!
-! !REVISION HISTORY:
-! 2005-May-05 - B. Kauffman
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-logical function shr_string_listIsValid(list,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: list ! list/string
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- integer (SHR_KIND_IN) :: nChar ! length of list
- integer (SHR_KIND_IN) :: rCode ! return code
- integer (SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listIsValid) "
- character(*),parameter :: F00 = "('(shr_string_listIsValid) ',4a)"
-
-!-------------------------------------------------------------------------------
-! check that the list conforms to the list format
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- rCode = 0
- shr_string_listIsValid = .true.
-
- nChar = len_trim(list)
- if (nChar < 1) then ! list is an empty string
- rCode = 1
- else if ( list(1:1) == listDel ) then ! first char is delimiter
- rCode = 2
- else if (list(nChar:nChar) == listDel ) then ! last char is delimiter
- rCode = 3
- else if (index(trim(list)," " ) > 0) then ! white-space in a field name
- rCode = 4
- else if (index(trim(list),listDel2) > 0) then ! found zero length field
- rCode = 5
- end if
-
- if (rCode /= 0) then
- shr_string_listIsValid = .false.
- if (s_loglev > 0) write(s_logunit,F00) "WARNING: invalid list = ",trim(list)
- endif
-
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_listIsValid
-
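The five validity checks above (non-empty, no leading or trailing delimiter, no white space inside a field name, no zero-length field) can be sketched in Python. The name `list_is_valid` is hypothetical, and this sketch collapses the numeric return codes into a plain boolean:

```python
# Hypothetical sketch of shr_string_listIsValid for a colon-delimited
# field list: reject empty lists, leading/trailing/doubled delimiters,
# and embedded white space.
def list_is_valid(lst: str, delim: str = ":") -> bool:
    s = lst.strip()
    if not s:
        return False                    # empty list
    if s.startswith(delim) or s.endswith(delim):
        return False                    # first or last char is a delimiter
    if " " in s or delim * 2 in s:
        return False                    # blank in a name, or zero-length field
    return True
```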
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listGetName -- Get name of k-th field in list
-!
-! !DESCRIPTION:
-! Get name of k-th field in list
-! \newline
-! call shr\_string\_listGetName(list,k,name,rc)
-!
-! !REVISION HISTORY:
-! 2005-May-05 - B. Kauffman
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listGetName(list,k,name,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: list ! list/string
- integer(SHR_KIND_IN) ,intent(in) :: k ! index of field
- character(*) ,intent(out) :: name ! k-th name in list
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: i,j,n ! generic indecies
- integer(SHR_KIND_IN) :: kFlds ! number of fields in list
- integer(SHR_KIND_IN) :: i0,i1 ! name = list(i0:i1)
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listGetName) "
- character(*),parameter :: F00 = "('(shr_string_listGetName) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- rCode = 0
-
- !--- check that this is a valid list ---
- if (.not. shr_string_listIsValid(list,rCode) ) then
- write(s_logunit,F00) "ERROR: invalid list = ",trim(list)
- call shr_string_abort(subName//" ERROR: invalid list = "//trim(list))
- end if
-
- !--- check that this is a valid index ---
- kFlds = shr_string_listGetNum(list)
- if (k<1 .or. kFlds<k) then
- write(s_logunit,*) subName,"ERROR: invalid index = ",k
- call shr_string_abort(subName//" ERROR: invalid index")
- end if
-
- !--- remove field names before the desired field ---
- i0 = 1
- i1 = len_trim(list)
- do n=2,k
- i = index(list(i0:i1),listDel)
- i0 = i0 + i
- end do
-
- !--- remove field names after the desired field ---
- if ( k < kFlds ) then
- i = index(list(i0:i1),listDel)
- i1 = i0 + i - 2
- end if
-
- name = list(i0:i1)
-
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listGetName
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listIntersect -- Get intersection of two field lists
-!
-! !DESCRIPTION:
-! Get intersection of two field lists, write into a third list
-! \newline
-! call shr\_string\_listIntersect(list1,list2,listout)
-!
-! !REVISION HISTORY:
-! 2005-May-05 - T. Craig
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listIntersect(list1,list2,listout,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: list1 ! list/string
- character(*) ,intent(in) :: list2 ! list/string
- character(*) ,intent(out) :: listout ! list/string
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: nf,n1,n2 ! counters
- character(SHR_KIND_CS) :: name ! field name
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listIntersect) "
- character(*),parameter :: F00 = "('(shr_string_listIntersect) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- rCode = 0
-
- nf = shr_string_listGetNum(list1)
- call shr_string_clean(listout)
- do n1 = 1,nf
- call shr_string_listGetName(list1,n1,name,rCode)
- n2 = shr_string_listGetIndexF(list2,name)
- if (n2 > 0) then
- call shr_string_listAppend(listout,name)
- endif
- enddo
-
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listIntersect
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listUnion -- Get union of two field lists
-!
-! !DESCRIPTION:
-! Get union of two field lists, write into a third list
-! \newline
-! call shr\_string\_listUnion(list1,list2,listout)
-!
-! !REVISION HISTORY:
-! 2005-May-05 - T. Craig
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listUnion(list1,list2,listout,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: list1 ! list/string
- character(*) ,intent(in) :: list2 ! list/string
- character(*) ,intent(out) :: listout ! list/string
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: nf,n1,n2 ! counters
- character(SHR_KIND_CS) :: name ! field name
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listUnion) "
- character(*),parameter :: F00 = "('(shr_string_listUnion) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
- rCode = 0
-
- call shr_string_clean(listout)
-
- nf = shr_string_listGetNum(list1)
- do n1 = 1,nf
- call shr_string_listGetName(list1,n1,name,rCode)
- n2 = shr_string_listGetIndexF(listout,name)
- if (n2 < 1) then
- call shr_string_listAppend(listout,name)
- endif
- enddo
-
- nf = shr_string_listGetNum(list2)
- do n1 = 1,nf
- call shr_string_listGetName(list2,n1,name,rCode)
- n2 = shr_string_listGetIndexF(listout,name)
- if (n2 < 1) then
- call shr_string_listAppend(listout,name)
- endif
- enddo
-
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listUnion
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listMerge -- Merge two lists into a third
-!
-! !DESCRIPTION:
-! Merge two lists into a third
-! \newline
-! call shr\_string\_listMerge(list1,list2,listout)
-! call shr\_string\_listMerge(list1,list2,list1)
-!
-! !REVISION HISTORY:
-! 2005-May-05 - T. Craig
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listMerge(list1,list2,listout,rc)
-
- implicit none
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: list1 ! list/string
- character(*) ,intent(in) :: list2 ! list/string
- character(*) ,intent(out) :: listout ! list/string
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- character(SHR_KIND_CX) :: l1,l2 ! local char strings
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listMerge) "
- character(*),parameter :: F00 = "('(shr_string_listMerge) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! - no input or output string should be longer than SHR_KIND_CX
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
- rCode = 0
-
- !--- make sure temp strings are large enough ---
- if ( (len(l1) < len_trim(list1)) .or. (len(l2) < len_trim(list2))) then
- call shr_string_abort(subName//'ERROR: temp string not large enough')
- end if
-
- call shr_string_clean(l1)
- call shr_string_clean(l2)
- call shr_string_clean(listout)
- l1 = trim(list1)
- l2 = trim(list2)
- call shr_string_leftAlign(l1,rCode)
- call shr_string_leftAlign(l2,rCode)
- if (len_trim(l1)+len_trim(l2)+1 > len(listout)) &
- call shr_string_abort(subName//'ERROR: output list string not large enough')
- if (len_trim(l1) == 0) then
- listout = trim(l2)
- else
- listout = trim(l1)//":"//trim(l2)
- endif
-
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listMerge
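In contrast to the union, `shr_string_listMerge` is a plain concatenation that keeps duplicate field names. A minimal Python sketch of the same behavior (hypothetical helper name, `:` assumed as the delimiter):

```python
def list_merge(list1: str, list2: str, delim: str = ":") -> str:
    """Concatenate two field lists with a delimiter; duplicates are kept.

    Mirrors shr_string_listMerge: if the first list is empty the result
    is just the second list, avoiding a leading delimiter.
    """
    l1, l2 = list1.strip(), list2.strip()
    return l2 if not l1 else l1 + delim + l2
```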
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listAppend -- Append one list to another
-!
-! !DESCRIPTION:
-! Append one list to another
-! \newline
-! call shr\_string\_listAppend(list,listadd)
-!
-! !REVISION HISTORY:
-! 2005-May-05 - T. Craig
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listAppend(list,listadd,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(inout) :: list ! list/string
- character(*) ,intent(in) :: listadd ! list/string
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- character(SHR_KIND_CX) :: l1 ! local string
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listAppend) "
- character(*),parameter :: F00 = "('(shr_string_listAppend) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! - no input or output string should be longer than SHR_KIND_CX
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
- rCode = 0
-
- !--- make sure temp string is large enough ---
- if (len(l1) < len_trim(listAdd)) then
- call shr_string_abort(subName//'ERROR: temp string not large enough')
- end if
-
- call shr_string_clean(l1)
- l1 = trim(listadd)
- call shr_string_leftAlign(l1,rCode)
- if (len_trim(list)+len_trim(l1)+1 > len(list)) &
- call shr_string_abort(subName//'ERROR: output list string not large enough')
- if (len_trim(list) == 0) then
- list = trim(l1)
- else
- list = trim(list)//":"//trim(l1)
- endif
-
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listAppend
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listPrepend -- Prepend one list to another
-!
-! !DESCRIPTION:
-! Prepend one list to another
-! \newline
-! call shr\_string\_listPrepend(listadd,list)
-! \newline
-! results in listadd:list
-!
-! !REVISION HISTORY:
-! 2005-May-05 - T. Craig
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listPrepend(listadd,list,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: listadd ! list/string
- character(*) ,intent(inout) :: list ! list/string
- integer(SHR_KIND_IN),optional,intent(out) :: rc ! return code
-
-!EOP
-
- !----- local -----
- character(SHR_KIND_CX) :: l1 ! local string
- integer(SHR_KIND_IN) :: rCode ! return code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listPrepend) "
- character(*),parameter :: F00 = "('(shr_string_listPrepend) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! - no input or output string should be longer than SHR_KIND_CX
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
- rCode = 0
-
- !--- make sure temp string is large enough ---
- if (len(l1) < len_trim(listAdd)) then
- call shr_string_abort(subName//'ERROR: temp string not large enough')
- end if
-
- call shr_string_clean(l1)
- l1 = trim(listadd)
- call shr_string_leftAlign(l1,rCode)
- call shr_string_leftAlign(list,rCode)
- if (len_trim(list)+len_trim(l1)+1 > len(list)) &
- call shr_string_abort(subName//'ERROR: output list string not large enough')
- if (len_trim(l1) == 0) then
- list = trim(list)
- else
- list = trim(l1)//":"//trim(list)
- endif
-
- if (present(rc)) rc = rCode
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listPrepend
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listGetIndexF -- Get index of field in string
-!
-! !DESCRIPTION:
-! Get index of field in string
-! \newline
-! k = shr\_string\_listGetIndexF(str,"taux")
-!
-! !REVISION HISTORY:
-! 2005-Feb-28 - B. Kauffman and J. Schramm - first version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-integer function shr_string_listGetIndexF(string,fldStr)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*),intent(in) :: string ! string
- character(*),intent(in) :: fldStr ! name of field
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: k ! local index variable
- integer(SHR_KIND_IN) :: rc ! error code
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listGetIndexF) "
- character(*),parameter :: F00 = "('(shr_string_listGetIndexF) ',4a)"
-
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- call shr_string_listGetIndex(string,fldStr,k,print=.false.,rc=rc)
- shr_string_listGetIndexF = k
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_listGetIndexF
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listGetIndex -- Get index of field in string
-!
-! !DESCRIPTION:
-! Get index of field in string
-! \newline
-! call shr\_string\_listGetIndex(str,"taux",k,rc)
-!
-! !REVISION HISTORY:
-! 2005-Feb-28 - B. Kauffman and J. Schramm - first version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listGetIndex(string,fldStr,kFld,print,rc)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*) ,intent(in) :: string ! string
- character(*) ,intent(in) :: fldStr ! name of field
- integer(SHR_KIND_IN),intent(out) :: kFld ! index of field
- logical ,intent(in) ,optional :: print ! print switch
- integer(SHR_KIND_IN),intent(out),optional :: rc ! return code
-
-!EOP
-
- !----- local -----
- integer(SHR_KIND_IN) :: n ! index for colon position
- integer(SHR_KIND_IN) :: k ! index for field name position
- integer(SHR_KIND_IN) :: nFields ! number of fields in a string
- integer(SHR_KIND_IN) :: i0,i1 ! fldStr == string(i0,i1) ??
- integer(SHR_KIND_IN) :: j0,j1 ! fldStr == string(j0,j1) ??
- logical :: found ! T => field found in fieldNames
- logical :: lprint ! local print flag
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listGetIndex) "
- character(*),parameter :: F00 = "('(shr_string_listGetIndex) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-! - searching from both ends of the list at the same time seems to be 20% faster
-! but I'm not sure why (B. Kauffman, Feb 2007)
-! - I commented out the sanity check to gain a little speed (B. Kauffman, Mar 2007)
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
- if (present(rc)) rc = 0
-
- lprint = .false.
- if (present(print)) lprint = print
-
- !--- confirm proper size of input data ---
- if (len_trim(fldStr) < 1) then
- if (lprint) write(s_logunit,F00) "ERROR: input field name has 0 length"
- call shr_string_abort(subName//"invalid field name")
- end if
-
- !--- search for field name in string's list of fields ---
- found = .false.
- kFld = 0
- i0 = 1 ! ?? fldStr == string(i0:i1) ??
- i1 = -1
- j0 = -1 ! ?? fldStr == string(j0:j1) ??
- j1 = len_trim(string)
- nFields = shr_string_listGetNum(string)
- do k = 1,nFields
- !--------------------------------------------------------
-      ! search from start of list to end of list
- !--------------------------------------------------------
- !--- get end index of of field number k ---
- n = index(string(i0:len_trim(string)),listDel)
- if (n > 0) then
- i1 = i0 + n - 2 ! *not* the last field name in fieldNames
- else
- i1 = len_trim(string) ! this is the last field name in fieldNames
- endif
- !--- sanity check ---
- ! if ((k 0)) then
- ! call shr_string_abort(subName//"ERROR: wrong string%nf ?")
- ! end if
- !--- is it a match? ---
- if (trim(fldStr) == string(i0:i1)) then
- found = .true.
- kFld = k
- exit
- endif
- i0 = i1 + 2 ! start index for next iteration
- !--------------------------------------------------------
- ! search from end of list to start of list
- !--------------------------------------------------------
- !--- get start index of field number (nFields + 1 - k ) ---
- n = index(string(1:j1),listDel,back=.true.)
- j0 = n + 1 ! n==0 => the first field name in fieldNames
- !--- sanity check ---
- ! if ((k 0)) then
- ! call shr_string_abort(subName//"ERROR: wrong string%nf ?")
- ! end if
- !--- is it a match? ---
- if (trim(fldStr) == string(j0:j1)) then
- found = .true.
- kFld = nFields + 1 - k
- exit
- endif
- j1 = j0 - 2 ! end index for next iteration
- !--------------------------------------------------------
- ! exit if all field names have been checked
- !--------------------------------------------------------
- if (2*k >= nFields) exit
- end do
-
- !--- not finding a field is not a fatal error ---
- if (.not. found) then
- kFld = 0
- if (lprint .and. s_loglev > 0) write(s_logunit,F00) "FYI: field ",trim(fldStr)," not found in list ",trim(string)
- if (present(rc)) rc = 1
- end if
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listGetIndex
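The routine above scans the list from both ends simultaneously (reportedly about 20% faster). A straightforward single-direction Python sketch gives the same lookup semantics — a 1-based index, 0 when the field is absent (hypothetical helper name, `:` assumed as the delimiter):

```python
def list_get_index(fields: str, name: str, delim: str = ":") -> int:
    """1-based position of `name` in a delimited field list, 0 if absent.

    The Fortran version searches from both ends of the list at once;
    a plain forward scan returns the same result.
    """
    target = name.strip()
    for k, fld in enumerate(fields.split(delim), start=1):
        if fld == target:
            return k
    return 0  # not found is not a fatal error, matching the Fortran
```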
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listGetNum -- get number of fields in a string list
-!
-! !DESCRIPTION:
-! return number of fields in string list
-!
-! !REVISION HISTORY:
-! 2005-Apr-28 - T. Craig - First version
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-integer function shr_string_listGetNum(str)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*),intent(in) :: str ! string to search
-
-!EOP
-
- !----- local -----
-   integer(SHR_KIND_IN)   :: count ! counts occurrences of char
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !----- formats -----
- character(*),parameter :: subName = "(shr_string_listGetNum) "
- character(*),parameter :: F00 = "('(shr_string_listGetNum) ',4a)"
-
-!-------------------------------------------------------------------------------
-! Notes:
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- shr_string_listGetNum = 0
-
- if (len_trim(str) > 0) then
- count = shr_string_countChar(str,listDel)
- shr_string_listGetNum = count + 1
- endif
-
- if (debug>1) call shr_timer_stop (t01)
-
-end function shr_string_listGetNum
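`shr_string_listGetNum` reduces to counting delimiters: a non-empty list has one more field than delimiters. A Python sketch (hypothetical helper name, `:` assumed as the delimiter):

```python
def list_get_num(fields: str, delim: str = ":") -> int:
    """Number of fields in a delimited list: delimiter count + 1,
    or 0 for a blank string (mirrors shr_string_listGetNum)."""
    s = fields.strip()
    return 0 if not s else s.count(delim) + 1
```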
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listSetDel -- Set list delimiter character
-!
-! !DESCRIPTION:
-! Set field delimiter character in lists
-! \newline
-! call shr\_string\_listSetDel(":")
-!
-! !REVISION HISTORY:
-! 2005-Apr-30 - T. Craig - first prototype
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listSetDel(cflag)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(len=1),intent(in) :: cflag
-
-!EOP
-
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !--- formats ---
- character(*),parameter :: subName = "(shr_string_listSetDel) "
- character(*),parameter :: F00 = "('(shr_string_listSetDel) ',a) "
-
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- if (debug > 0 .and. s_loglev > 0) write(s_logunit,F00) 'changing listDel from '//trim(listDel)//' to '//trim(cflag)
- listDel = trim(cflag)
- listDel2 = listDel//listDel
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listSetDel
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_listGetDel -- Get list delimiter character
-!
-! !DESCRIPTION:
-! Get field delimiter character in lists
-! \newline
-! call shr\_string\_listGetDel(del)
-!
-! !REVISION HISTORY:
-! 2005-May-15 - T. Craig - first prototype
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_listGetDel(del)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*),intent(out) :: del
-
-!EOP
-
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !--- formats ---
- character(*),parameter :: subName = "(shr_string_listGetDel) "
- character(*),parameter :: F00 = "('(shr_string_listGetDel) ',a) "
-
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- del = trim(listDel)
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_listGetDel
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_setAbort -- Set local shr_string abort flag
-!
-! !DESCRIPTION:
-! Set local shr_string abort flag, true = abort, false = print and continue
-! \newline
-! call shr\_string\_setAbort(.false.)
-!
-! !REVISION HISTORY:
-! 2005-Apr-30 - T. Craig - first prototype
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_setAbort(flag)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- logical,intent(in) :: flag
-
-!EOP
-
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !--- formats ---
- character(*),parameter :: subName = "(shr_string_setAbort) "
- character(*),parameter :: F00 = "('(shr_string_setAbort) ',a) "
-
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- if (debug > 0 .and. s_loglev > 0) then
- if (flag) then
- write(s_logunit,F00) 'setting abort to true'
- else
- write(s_logunit,F00) 'setting abort to false'
- endif
- endif
-
- doabort = flag
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_setAbort
-
-!===============================================================================
-!BOP ===========================================================================
-!
-! !IROUTINE: shr_string_setDebug -- Set local shr_string debug level
-!
-! !DESCRIPTION:
-! Set local shr_string debug level, 0 = production
-! \newline
-! call shr\_string\_setDebug(2)
-!
-! !REVISION HISTORY:
-! 2005-Apr-30 - T. Craig - first prototype
-!
-! !INTERFACE: ------------------------------------------------------------------
-
-subroutine shr_string_setDebug(iFlag)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- integer(SHR_KIND_IN),intent(in) :: iFlag ! requested debug level
-
-!EOP
-
- !--- local ---
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !--- formats ---
- character(*),parameter :: subName = "(shr_string_setDebug) "
- character(*),parameter :: F00 = "('(shr_string_setDebug) ',a) "
- character(*),parameter :: F01 = "('(shr_string_setDebug) ',a,i3,a,i3) "
-
-!-------------------------------------------------------------------------------
-! NOTE: write statement can be expensive if called many times.
-!-------------------------------------------------------------------------------
-
- if (iFlag>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (iFlag>1) call shr_timer_start(t01)
-
-! if (s_loglev > 0) write(s_logunit,F01) 'changing debug level from ',debug,' to ',iflag
- debug = iFlag
-
- if (iFlag>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_setDebug
-
-!===============================================================================
-!===============================================================================
-
-subroutine shr_string_abort(string)
-
- implicit none
-
-! !INPUT/OUTPUT PARAMETERS:
-
- character(*),optional,intent(in) :: string
-
-!EOP
-
- integer(SHR_KIND_IN) :: t01 = 0 ! timer
-
- !--- local ---
- character(SHR_KIND_CX) :: lstring
- character(*),parameter :: subName = "(shr_string_abort)"
- character(*),parameter :: F00 = "('(shr_string_abort) ',a)"
-
-!-------------------------------------------------------------------------------
-! NOTE:
-! - no input or output string should be longer than SHR_KIND_CX
-!-------------------------------------------------------------------------------
-
- if (debug>1 .and. t01<1) call shr_timer_get(t01,subName)
- if (debug>1) call shr_timer_start(t01)
-
- lstring = ''
- if (present(string)) lstring = string
-
- if (doabort) then
- call shr_sys_abort(trim(lstring))
- else
- write(s_logunit,F00) ' no abort:'//trim(lstring)
- endif
-
- if (debug>1) call shr_timer_stop (t01)
-
-end subroutine shr_string_abort
-
-!===============================================================================
-!===============================================================================
-
-end module shr_string_mod
diff --git a/tools/mksurfdata_map/src/shr_sys_mod.F90 b/tools/mksurfdata_map/src/shr_sys_mod.F90
deleted file mode 100644
index 8c51b711cc..0000000000
--- a/tools/mksurfdata_map/src/shr_sys_mod.F90
+++ /dev/null
@@ -1,352 +0,0 @@
-!===============================================================================
-
-MODULE shr_sys_mod
-
- use shr_kind_mod ! defines real & integer kinds
- use shr_log_mod, only: s_loglev => shr_log_Level
- use shr_log_mod, only: s_logunit => shr_log_Unit
-
- implicit none
-
-! PUBLIC: Public interfaces
-
- private
-
- public :: shr_sys_system ! make a system call
- public :: shr_sys_chdir ! change current working dir
- public :: shr_sys_getenv ! get an environment variable
- public :: shr_sys_abort ! abort a program
- public :: shr_sys_irtc ! returns real-time clock tick
- public :: shr_sys_sleep ! have program sleep for a while
- public :: shr_sys_flush ! flush an i/o buffer
-
-!===============================================================================
-CONTAINS
-!===============================================================================
-
-!===============================================================================
-!===============================================================================
-
-SUBROUTINE shr_sys_system(str,rcode)
-
- IMPLICIT none
-
- !----- arguments ---
- character(*) ,intent(in) :: str ! system/shell command string
- integer(SHR_KIND_IN),intent(out) :: rcode ! function return error code
-
- !----- functions -----
-#if (defined CRAY) || (defined UNICOSMP)
-   integer(SHR_KIND_IN),external :: ishell ! function to invoke shell command
-#endif
-#if (defined OSF1 || defined SUNOS || (defined LINUX && !defined __GFORTRAN__ && !defined CATAMOUNT))
-   integer(SHR_KIND_IN),external :: system ! function to invoke shell command
-#endif
-
- !----- local -----
-#if (defined CATAMOUNT)
- character(2*SHR_KIND_CL) :: file1 ! one or two filenames
- character( SHR_KIND_CL) :: file2 ! 2nd file name
- integer(SHR_KIND_IN) :: iloc ! index/location within a string
-#endif
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_sys_system) '
- character(*),parameter :: F00 = "('(shr_sys_system) ',4a)"
-
-!-------------------------------------------------------------------------------
-! PURPOSE: an architecture independent system call
-! NOTE:
-! - for Catamount (Cray, phoenix at ORNL) there is no system call -- workarounds
-! exist only for simple "rm" and "cp" commands
-!-------------------------------------------------------------------------------
-
-
-#if (defined CRAY) || (defined UNICOSMP)
-
- rcode=ishell(str)
-
-#elif (defined IRIX64 || defined NEC_SX)
-
- rcode = 0
- call system(str)
-
-#elif (defined AIX)
-
- call system(str,rcode)
-
-#elif (defined OSF1 || defined SUNOS || defined __GFORTRAN__ || (defined LINUX && !defined CATAMOUNT))
-
- rcode = system(str)
-
-#elif (defined CATAMOUNT)
- if (str(1:3) == 'rm ') then
- call unlink(str(4:))
- if (s_loglev > 0) write(s_logunit,F00) 'CATAMOUNT unlink ',trim(str(4:))
- rcode = 0
- elseif (str(1:3) == 'mv ') then
- file1 = str(4:)
- iloc = index(file1,' ') + 3
- if (iloc < 6) then
- if (s_loglev > 0) write(s_logunit,*) 'CATAMOUNT mv error ',trim(str),iloc
- rcode = -1
- else
- file1 = str(4:iloc)
- file2 = str(iloc+1:)
- call rename(trim(file1),trim(file2))
- if (s_loglev > 0) write(s_logunit,F00) 'CATAMOUNT rename ',trim(file1)," ",trim(file2)
- rcode = 0
- endif
- else
- rcode = -1
- endif
-
-#else
-
- write(s_logunit,F00) 'ERROR: no implementation of system call for this architecture'
- call shr_sys_abort(subName//'no implementation of system call for this architecture')
-
-#endif
-
-END SUBROUTINE shr_sys_system
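The per-architecture `#ifdef` ladder above exists because Fortran had no standard way to run a shell command at the time (Fortran 2008's `execute_command_line` later standardized it). In a language with a portable process API the whole routine collapses to one call — an illustrative Python sketch, not part of the original code:

```python
import subprocess

def sys_system(cmd: str) -> int:
    """Run a shell command and return its exit code; the portable
    equivalent of shr_sys_system's architecture-specific branches."""
    return subprocess.call(cmd, shell=True)
```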
-
-!===============================================================================
-!===============================================================================
-
-SUBROUTINE shr_sys_chdir(path, rcode)
-
- IMPLICIT none
-
- !----- arguments -----
- character(*) ,intent(in) :: path ! chdir to this dir
- integer(SHR_KIND_IN),intent(out) :: rcode ! return code
-
- !----- local -----
- integer(SHR_KIND_IN) :: lenpath ! length of path
-#if (defined AIX || defined OSF1 || defined SUNOS || (defined LINUX && !defined __GFORTRAN__) || defined NEC_SX)
- integer(SHR_KIND_IN),external :: chdir ! AIX system call
-#endif
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_sys_chdir) '
- character(*),parameter :: F00 = "('(shr_sys_chdir) ',4a)"
-
-!-------------------------------------------------------------------------------
-! PURPOSE: an architecture independent system call
-!-------------------------------------------------------------------------------
-
- lenpath=len_trim(path)
-
-#if (defined IRIX64 || defined CRAY || defined UNICOSMP)
-
- call pxfchdir(path, lenpath, rcode)
-
-#elif (defined AIX)
-
- rcode = chdir(%ref(path(1:lenpath)//'\0'))
-
-#elif (defined OSF1 || defined SUNOS || defined LINUX || defined NEC_SX)
-
- rcode=chdir(path(1:lenpath))
-
-#else
-
- write(s_logunit,F00) 'ERROR: no implementation of chdir for this architecture'
- call shr_sys_abort(subname//'no implementation of chdir for this machine')
-
-#endif
-
-END SUBROUTINE shr_sys_chdir
-
-!===============================================================================
-!===============================================================================
-
-SUBROUTINE shr_sys_getenv(name, val, rcode)
-
- IMPLICIT none
-
- !----- arguments -----
- character(*) ,intent(in) :: name ! env var name
- character(*) ,intent(out) :: val ! env var value
- integer(SHR_KIND_IN),intent(out) :: rcode ! return code
-
- !----- local -----
- integer(SHR_KIND_IN) :: lenname ! length of env var name
- integer(SHR_KIND_IN) :: lenval ! length of env var value
- character(SHR_KIND_CL) :: tmpval ! temporary env var value
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_sys_getenv) '
- character(*),parameter :: F00 = "('(shr_sys_getenv) ',4a)"
-
-!-------------------------------------------------------------------------------
-! PURPOSE: an architecture independent system call
-!-------------------------------------------------------------------------------
-
- lenname=len_trim(name)
-
-#if (defined IRIX64 || defined CRAY || defined UNICOSMP)
-
- call pxfgetenv(name, lenname, val, lenval, rcode)
-
-#elif (defined AIX || defined OSF1 || defined SUNOS || defined LINUX || defined NEC_SX)
-
- call getenv(trim(name),tmpval)
- val=trim(tmpval)
- rcode = 0
- if (len_trim(val) == 0 ) rcode = 1
- if (len_trim(val) > SHR_KIND_CL) rcode = 2
-
-#else
-
- write(s_logunit,F00) 'ERROR: no implementation of getenv for this architecture'
- call shr_sys_abort(subname//'no implementation of getenv for this machine')
-
-#endif
-
-END SUBROUTINE shr_sys_getenv
-
-!===============================================================================
-!===============================================================================
-
-SUBROUTINE shr_sys_abort(string,rc)
-
- IMPLICIT none
-
- character(*) ,optional :: string ! error message string
- integer(SHR_KIND_IN),optional :: rc ! error code
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_sys_abort) '
- character(*),parameter :: F00 = "('(shr_sys_abort) ',4a)"
-
-!-------------------------------------------------------------------------------
-! PURPOSE: consistent stopping mechanism
-!-------------------------------------------------------------------------------
-
- call shr_sys_flush(s_logunit)
-   if (present(string)) then
-      if (len_trim(string) > 0) write(s_logunit,F00) 'ERROR: '//trim(string)
-   end if
- write(s_logunit,F00) 'WARNING: stopping'
- call shr_sys_flush(s_logunit)
- call abort()
- stop
-
-END SUBROUTINE shr_sys_abort
-
-!===============================================================================
-!===============================================================================
-
-integer(SHR_KIND_I8) FUNCTION shr_sys_irtc( rate )
-
- IMPLICIT none
-
- !----- arguments -----
- integer(SHR_KIND_I8), optional :: rate
-
- !----- local -----
- integer(SHR_KIND_IN) :: count
- integer(SHR_KIND_IN) :: count_rate
- integer(SHR_KIND_IN) :: count_max
- integer(SHR_KIND_IN),save :: last_count = -1
- integer(SHR_KIND_I8),save :: count_offset = 0
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_sys_irtc) '
- character(*),parameter :: F00 = "('(shr_sys_irtc) ',4a)"
-
-!-------------------------------------------------------------------------------
-! emulates Cray/SGI irtc function (returns clock tick since last reboot)
-!-------------------------------------------------------------------------------
-
- call system_clock(count=count,count_rate=count_rate, count_max=count_max)
- if ( present(rate) ) rate = count_rate
- shr_sys_irtc = count
-
- !--- adjust for clock wrap-around ---
- if ( last_count /= -1 ) then
- if ( count < last_count ) count_offset = count_offset + count_max
- end if
- shr_sys_irtc = shr_sys_irtc + count_offset
- last_count = count
-
-END FUNCTION shr_sys_irtc
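The wrap-around handling in `shr_sys_irtc` — fold `count_max` into a saved offset whenever the raw counter goes backwards — can be isolated in a small Python sketch (hypothetical helper; module-level globals stand in for the Fortran `save` variables):

```python
_last_count = -1  # Fortran: integer, save :: last_count = -1
_offset = 0       # Fortran: integer, save :: count_offset = 0

def irtc(count: int, count_max: int) -> int:
    """Return a monotonic tick: when the raw counter wraps (decreases
    between calls), add one full counter period to the offset."""
    global _last_count, _offset
    if _last_count != -1 and count < _last_count:
        _offset += count_max
    _last_count = count
    return count + _offset
```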
-
-!===============================================================================
-!===============================================================================
-
-SUBROUTINE shr_sys_sleep(sec)
-
- IMPLICIT none
-
- !----- arguments -----
- real (SHR_KIND_R8),intent(in) :: sec ! number of seconds to sleep
-
- !----- local -----
- integer(SHR_KIND_IN) :: isec ! integer number of seconds
- integer(SHR_KIND_IN) :: rcode ! return code
- character(90) :: str ! system call string
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_sys_sleep) '
- character(*),parameter :: F00 = "('(shr_sys_sleep) ',4a)"
- character(*),parameter :: F10 = "('sleep ',i8 )"
-
-!-------------------------------------------------------------------------------
-! PURPOSE: Sleep for approximately sec seconds
-!-------------------------------------------------------------------------------
-
- isec = nint(sec)
-
- if (isec < 0) then
-      if (s_loglev > 0) write(s_logunit,*) 'ERROR: seconds must be >= 0, sec=',sec
- else if (isec == 0) then
- ! Don't consider this an error and don't call system sleep
- else
-#if defined(CATAMOUNT)
- call sleep(isec)
-#else
- write(str,FMT=F10) isec
- call shr_sys_system( str, rcode )
-#endif
- endif
-
-END SUBROUTINE shr_sys_sleep
-
-!===============================================================================
-!===============================================================================
-
-SUBROUTINE shr_sys_flush(unit)
-
- IMPLICIT none
-
- !----- arguments -----
- integer(SHR_KIND_IN) :: unit ! flush output buffer for this unit
-
- !----- formats -----
- character(*),parameter :: subName = '(shr_sys_flush) '
- character(*),parameter :: F00 = "('(shr_sys_flush) ',4a)"
-
-!-------------------------------------------------------------------------------
-! PURPOSE: an architecture independent system call
-!-------------------------------------------------------------------------------
-
-#if (defined IRIX64 || defined CRAY || defined OSF1 || defined SUNOS || defined LINUX || defined NEC_SX || defined UNICOSMP)
-
- call flush(unit)
-
-#elif (defined AIX)
-
- call flush_(unit)
-
-#else
-
- if (s_loglev > 0) write(s_logunit,F00) 'WARNING: no implementation of flush for this architecture'
-
-#endif
-
-END SUBROUTINE shr_sys_flush
-
-!===============================================================================
-!===============================================================================
-
-END MODULE shr_sys_mod
diff --git a/tools/mksurfdata_map/src/shr_timer_mod.F90 b/tools/mksurfdata_map/src/shr_timer_mod.F90
deleted file mode 100644
index c9d728ca4a..0000000000
--- a/tools/mksurfdata_map/src/shr_timer_mod.F90
+++ /dev/null
@@ -1,425 +0,0 @@
-!===============================================================================
-
-module shr_timer_mod
-
- !----------------------------------------------------------------------------
- !
- ! routines that support multiple CPU timers via F90 intrinsics
- !
- ! Note:
- ! o if an operation is requested on an invalid timer number n
- ! then nothing is done in a routine
- ! o if more than max_timers are requested,
- ! then timer n=max_timers is "overloaded" and becomes invalid/undefined
- !
- ! * cpp if-defs were introduced in 2005 to work-around a bug in the ORNL Cray
- ! X1 F90 intrinsic system_clock() function -- ideally this Cray bug would be
- ! fixed and cpp if-defs would be unnecessary and removed.
- !
- ! !REVISION HISTORY:
- ! 2005-??-?? - added workaround for Cray F90 bug, mods by Cray/ORNL
- ! 2000-??-?? - 1st version by B. Kauffman
- !----------------------------------------------------------------------------
-
- use shr_kind_mod
- use shr_log_mod, only: s_loglev => shr_log_Level
- use shr_log_mod, only: s_logunit => shr_log_Unit
-
- implicit none
-
- private ! restricted access
-
- public :: shr_timer_init
- public :: shr_timer_get
- public :: shr_timer_start
- public :: shr_timer_stop
- public :: shr_timer_print
- public :: shr_timer_print_all
- public :: shr_timer_check
- public :: shr_timer_check_all
- public :: shr_timer_zero
- public :: shr_timer_zero_all
- public :: shr_timer_free
- public :: shr_timer_free_all
- public :: shr_timer_sleep
-
- integer(SHR_KIND_IN),parameter :: stat_free = 0 ! timer status constants
- integer(SHR_KIND_IN),parameter :: stat_inuse = 1
- integer(SHR_KIND_IN),parameter :: stat_started = 2
- integer(SHR_KIND_IN),parameter :: stat_stopped = 3
- integer(SHR_KIND_IN),parameter :: max_timers = 200 ! max number of timers
-
- integer(SHR_KIND_IN) :: status (max_timers) ! status of each timer
- !----------------------------------------------------------------------------
- ! the following ifdef circumvents a bug in the X1 system_clock function
- !----------------------------------------------------------------------------
-#if (defined UNICOSMP)
- integer(kind=8) :: cycles1(max_timers) ! cycle number at timer start
- integer(kind=8) :: cycles2(max_timers) ! cycle number at timer stop
-#else
- integer(SHR_KIND_IN) :: cycles1(max_timers) ! cycle number at timer start
- integer(SHR_KIND_IN) :: cycles2(max_timers) ! cycle number at timer stop
-#endif
- integer(SHR_KIND_IN) :: cycles_max = -1 ! max cycles before wrapping
- character (len=80) :: name (max_timers) ! name assigned to each timer
- real (SHR_KIND_R8) :: dt (max_timers) ! accumulated time
- integer(SHR_KIND_IN) :: calls (max_timers) ! # of samples in accumulation
- real (SHR_KIND_R8) :: clock_rate ! clock_rate: seconds per cycle
-
- save
-
-!===============================================================================
- contains
-!===============================================================================
-
-subroutine shr_timer_init
-
- !----- local -----
- integer(SHR_KIND_IN) :: cycles ! count rate return by system clock
-#if (defined UNICOSMP)
- integer(kind=8) :: irtc_rate
-#endif
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_init) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! This routine initializes:
-! 1) values in all timer array locations
-! 2) machine parameters necessary for computing cpu time from F90 intrinsics.
-! F90 intrinsic: system_clock(count_rate=cycles, count_max=cycles_max)
-!-------------------------------------------------------------------------------
-
- call shr_timer_free_all
-
-#if (defined UNICOSMP)
- cycles = irtc_rate()
-#else
- call system_clock(count_rate=cycles, count_max=cycles_max)
-#endif
-
- if (cycles /= 0) then
- clock_rate = 1.0_SHR_KIND_R8/real(cycles,SHR_KIND_R8)
- else
- clock_rate = 0._SHR_KIND_R8
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: no system clock available'
- endif
-
-end subroutine shr_timer_init
-
-!===============================================================================
-
-subroutine shr_timer_get(n, str)
-
- !----- arguments -----
- integer(SHR_KIND_IN),intent(out) :: n ! timer number
- character (*) ,intent( in) :: str ! text string with timer name
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_get) ',a,i5)"
-
-!-----------------------------------------------------------------------
-! search for next free timer
-!-----------------------------------------------------------------------
-
- do n=1,max_timers
- if (status(n) == stat_free) then
- status(n) = stat_inuse
- name (n) = str
- calls (n) = 0
- return
- endif
- end do
-
- n=max_timers
- name (n) = ""
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: exceeded maximum number of timers'
-
-end subroutine shr_timer_get
-
-!===============================================================================
-
-subroutine shr_timer_start(n)
-
- !----- arguments -----
- integer(SHR_KIND_IN), intent(in) :: n ! timer number
-
- !----- local -----
-#if (defined UNICOSMP)
- integer(kind=8) :: irtc
-#endif
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_start) ',a,i5)"
-
-!-----------------------------------------------------------------------
-! This routine starts a given timer.
-!-----------------------------------------------------------------------
-
- if ( n>0 .and. n<=max_timers) then
- if (status(n) == stat_started) call shr_timer_stop(n)
-
- status(n) = stat_started
-#if (defined UNICOSMP)
- cycles1(n) = irtc()
-#else
- call system_clock(count=cycles1(n))
-#endif
- else
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: invalid timer number: ',n
- end if
-
-end subroutine shr_timer_start
-
-!===============================================================================
-
-subroutine shr_timer_stop(n)
-
- !----- arguments -----
- integer(SHR_KIND_IN), intent(in) :: n ! timer number
-
- !----- local -----
- real (SHR_KIND_R8) :: elapse ! elapsed time returned by system counter
-#if (defined UNICOSMP)
- integer(kind=8) :: irtc
-#endif
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_stop) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! This routine stops a given timer, checks for cycle wrapping, computes the
-! elapsed time, and accumulates the elapsed time in the dt(n) array
-!-------------------------------------------------------------------------------
-
- if ( n>0 .and. n<=max_timers) then
- if ( status(n) == stat_started) then
-#if (defined UNICOSMP)
- cycles2(n) = irtc()
-#else
- call system_clock(count=cycles2(n))
-#endif
- if (cycles2(n) >= cycles1(n)) then
- dt(n) = dt(n) + clock_rate*(cycles2(n) - cycles1(n))
- else
- dt(n) = dt(n) + clock_rate*(cycles_max + cycles2(n) - cycles1(n))
- endif
- calls (n) = calls(n) + 1
- status(n) = stat_stopped
- end if
- else
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: invalid timer number: ',n
- end if
-
-end subroutine shr_timer_stop
-
-!===============================================================================
-
-subroutine shr_timer_print(n)
-
- !----- arguments -----
- integer(SHR_KIND_IN), intent(in) :: n ! timer number
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_print) ',a,i5)"
- character(len=*),parameter :: F01 = "('(shr_timer_print) timer',i3,&
- & ':',i8,' calls,',f10.3,'s, id: ',a)"
-!-------------------------------------------------------------------------------
-! prints the accumulated time for a given timer
-!-------------------------------------------------------------------------------
-
- if ( n>0 .and. n<=max_timers) then
- if (status(n) == stat_started) then
- call shr_timer_stop(n)
- if (s_loglev > 0) write(s_logunit,F01) n,calls(n),dt(n),trim(name(n))
- call shr_timer_start(n)
- else
- if (s_loglev > 0) write(s_logunit,F01) n,calls(n),dt(n),trim(name(n))
- endif
- else
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: invalid timer number: ',n
- end if
-
-end subroutine shr_timer_print
-
-!===============================================================================
-
-subroutine shr_timer_print_all
-
- !----- local -----
- integer(SHR_KIND_IN) :: n
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_print_all) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! prints accumulated time for all timers in use
-!-------------------------------------------------------------------------------
-
- if (s_loglev > 0) write(s_logunit,F00) 'print all timing info:'
-
- do n=1,max_timers
- if (status(n) /= stat_free) call shr_timer_print(n)
- end do
-
-end subroutine shr_timer_print_all
-
-!===============================================================================
-
-subroutine shr_timer_zero(n)
-
- !----- arguments -----
- integer(SHR_KIND_IN), intent(in) :: n ! timer number
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_zero) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! This routine resets a given timer.
-!-------------------------------------------------------------------------------
-
- if ( n>0 .and. n<=max_timers) then
- dt(n) = 0.0_SHR_KIND_R8
- calls(n) = 0
- else
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: invalid timer number: ',n
- end if
-
-end subroutine shr_timer_zero
-
-!===============================================================================
-
-subroutine shr_timer_zero_all
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_zero_all) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! This routine resets all timers.
-!-------------------------------------------------------------------------------
-
- dt = 0.0_SHR_KIND_R8
- calls = 0
-
-end subroutine shr_timer_zero_all
-
-!===============================================================================
-
-subroutine shr_timer_check(n)
-
- !----- arguments -----
- integer(SHR_KIND_IN), intent(in) :: n ! timer number
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_check) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! This routine checks a given timer. This is primarily used to
-! periodically accumulate time in the timer to prevent timer cycles
-! from wrapping around cycles_max.
-!-------------------------------------------------------------------------------
-
- if ( n>0 .and. n<=max_timers) then
- if (status(n) == stat_started) then
- call shr_timer_stop (n)
- call shr_timer_start(n)
- endif
- else
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: invalid timer number: ',n
- end if
-
-end subroutine shr_timer_check
-
-!===============================================================================
-
-subroutine shr_timer_check_all
-
- !----- local -----
- integer(SHR_KIND_IN) :: n
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_check_all) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! Call shr_timer_check for all timers in use
-!-------------------------------------------------------------------------------
-
- do n=1,max_timers
- if (status(n) == stat_started) then
- call shr_timer_stop (n)
- call shr_timer_start(n)
- endif
- end do
-
-end subroutine shr_timer_check_all
-
-!===============================================================================
-
-subroutine shr_timer_free(n)
-
- !----- arguments -----
- integer(SHR_KIND_IN),intent(in) :: n ! timer number
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_free) ',a,i5)"
-
-!-----------------------------------------------------------------------
-! initialize/free the array values for a single timer
-!-----------------------------------------------------------------------
-
- if ( n>0 .and. n<=max_timers) then
- status (n) = stat_free
- name (n) = ""
- dt (n) = 0.0_SHR_KIND_R8
- cycles1(n) = 0
- cycles2(n) = 0
- else
- if (s_loglev > 0) write(s_logunit,F00) 'ERROR: invalid timer number: ',n
- end if
-
-end subroutine shr_timer_free
-
-!===============================================================================
-
-subroutine shr_timer_free_all
-
- !----- local -----
- integer(SHR_KIND_IN) :: n
-
- !----- i/o formats -----
- character(len=*),parameter :: F00 = "('(shr_timer_free_all) ',a,i5)"
-
-!-------------------------------------------------------------------------------
-! initialize/free all timer array values
-!-------------------------------------------------------------------------------
-
- do n=1,max_timers
- call shr_timer_free(n)
- end do
-
-end subroutine shr_timer_free_all
-
-!===============================================================================
-
-subroutine shr_timer_sleep(sec)
-
- use shr_sys_mod ! share system calls (namely, shr_sys_sleep)
-
- !----- local -----
- real (SHR_KIND_R8),intent(in) :: sec ! number of seconds to sleep
-
-!-------------------------------------------------------------------------------
-! Sleep for approximately sec seconds
-!
-! Note: sleep is typically a system call, hence it is implemented in
-! shr_sys_mod, although it probably would only be used in a timing
-! context, which is why there is a shr_timer_* wrapper provided here.
-!-------------------------------------------------------------------------------
-
- call shr_sys_sleep(sec)
-
-end subroutine shr_timer_sleep
-
-!===============================================================================
-end module shr_timer_mod
-!===============================================================================
diff --git a/tools/mksurfdata_map/src/test/CMakeLists.txt b/tools/mksurfdata_map/src/test/CMakeLists.txt
deleted file mode 100644
index 81e413cb85..0000000000
--- a/tools/mksurfdata_map/src/test/CMakeLists.txt
+++ /dev/null
@@ -1,6 +0,0 @@
-add_subdirectory(mkpctPftType_test)
-add_subdirectory(mkpftUtils_test)
-add_subdirectory(mkpftmod_test)
-add_subdirectory(mkgridmap_test)
-add_subdirectory(mkindexmap_test)
-add_subdirectory(mksoilUtils_test)
diff --git a/tools/mksurfdata_map/src/test/mkgridmap_test/CMakeLists.txt b/tools/mksurfdata_map/src/test/mkgridmap_test/CMakeLists.txt
deleted file mode 100644
index 85d936fd33..0000000000
--- a/tools/mksurfdata_map/src/test/mkgridmap_test/CMakeLists.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-create_pFUnit_test(mkgridmap test_mkgridmap_exe
- "test_mkgridmap.pf" "")
-
-target_link_libraries(test_mkgridmap_exe mksurfdat)
\ No newline at end of file
diff --git a/tools/mksurfdata_map/src/test/mkgridmap_test/test_mkgridmap.pf b/tools/mksurfdata_map/src/test/mkgridmap_test/test_mkgridmap.pf
deleted file mode 100644
index c8eda9d007..0000000000
--- a/tools/mksurfdata_map/src/test/mkgridmap_test/test_mkgridmap.pf
+++ /dev/null
@@ -1,106 +0,0 @@
-module test_mkgridmap
-
- ! Tests of mkgridmapMod
-
- use pfunit_mod
- use mkgridmapMod
- use shr_kind_mod , only : r8 => shr_kind_r8
-
- implicit none
-
- @TestCase
- type, extends(TestCase) :: TestMkGridmap
- type(gridmap_type) :: gridmap
- contains
- procedure :: setUp
- procedure :: tearDown
- end type TestMkGridmap
-
- real(r8), parameter :: tol = 1.e-13_r8
-
-contains
-
- subroutine setUp(this)
- class(TestMkGridmap), intent(inout) :: this
- end subroutine setUp
-
- subroutine tearDown(this)
- class(TestMkGridmap), intent(inout) :: this
-
- call gridmap_clean(this%gridmap)
- end subroutine tearDown
-
- ! ------------------------------------------------------------------------
- ! Tests of for_test_create_gridmap
- ! ------------------------------------------------------------------------
-
- @Test
- subroutine forTestCreateGridmap_defaultArgs(this)
- class(TestMkGridmap), intent(inout) :: this
- call for_test_create_gridmap(this%gridmap, na=2, nb=3, ns=4, &
- src_indx = [11,11,12,12], &
- dst_indx = [21,22,22,23], &
- wovr = [1._r8, 0.5_r8, 0.5_r8, 1._r8])
-
- @assertEqual(2, this%gridmap%na)
- @assertEqual(3, this%gridmap%nb)
- @assertEqual(4, this%gridmap%ns)
- @assertEqual([11,11,12,12], this%gridmap%src_indx)
- @assertEqual([21,22,22,23], this%gridmap%dst_indx)
- @assertEqual([1._r8, 0.5_r8, 0.5_r8, 1._r8], this%gridmap%wovr)
- @assertEqual([1._r8, 1._r8], this%gridmap%frac_src)
- @assertEqual([1._r8, 1._r8, 1._r8], this%gridmap%frac_dst)
-
- ! Don't bother asserting area, xc, yc, because the default values of those shouldn't
- ! matter too much.
- end subroutine forTestCreateGridmap_defaultArgs
-
- @Test
- subroutine forTestCreateGridmap_explicitArgs(this)
- class(TestMkGridmap), intent(inout) :: this
- integer, parameter :: na = 2
- integer, parameter :: nb = 3
- integer, parameter :: ns = 4
- integer, parameter :: src_indx(ns) = [11,11,12,12]
- integer, parameter :: dst_indx(ns) = [21,22,22,23]
- real(r8), parameter :: wovr(ns) = [1._r8, 0.5_r8, 0.5_r8, 1._r8]
- real(r8), parameter :: frac_src(na) = [0.1_r8, 0.0_r8]
- real(r8), parameter :: frac_dst(nb) = [0.0_r8, 0.1_r8, 0.1_r8]
- real(r8), parameter :: area_src(na) = [0.11_r8, 0.12_r8]
- real(r8), parameter :: area_dst(nb) = [0.13_r8, 0.14_r8, 0.15_r8]
- real(r8), parameter :: xc_src(na) = [1.1_r8, 1.2_r8]
- real(r8), parameter :: xc_dst(nb) = [2.1_r8, 2.2_r8, 2.3_r8]
- real(r8), parameter :: yc_src(na) = [3.1_r8, 3.2_r8]
- real(r8), parameter :: yc_dst(nb) = [4.1_r8, 4.2_r8, 4.3_r8]
-
- call for_test_create_gridmap(this%gridmap, na=na, nb=nb, ns=ns, &
- src_indx = src_indx, &
- dst_indx = dst_indx, &
- wovr = wovr, &
- frac_src = frac_src, &
- frac_dst = frac_dst, &
- area_src = area_src, &
- area_dst = area_dst, &
- xc_src = xc_src, &
- xc_dst = xc_dst, &
- yc_src = yc_src, &
- yc_dst = yc_dst)
-
- @assertEqual(na, this%gridmap%na)
- @assertEqual(nb, this%gridmap%nb)
- @assertEqual(ns, this%gridmap%ns)
- @assertEqual(src_indx, this%gridmap%src_indx)
- @assertEqual(dst_indx, this%gridmap%dst_indx)
- @assertEqual(wovr, this%gridmap%wovr)
- @assertEqual(frac_src, this%gridmap%frac_src)
- @assertEqual(frac_dst, this%gridmap%frac_dst)
- @assertEqual(yc_src, this%gridmap%yc_src)
- @assertEqual(yc_dst, this%gridmap%yc_dst)
- @assertEqual(xc_src, this%gridmap%xc_src)
- @assertEqual(xc_dst, this%gridmap%xc_dst)
- @assertEqual(area_src, this%gridmap%area_src)
- @assertEqual(area_dst, this%gridmap%area_dst)
-
- end subroutine forTestCreateGridmap_explicitArgs
-
-end module test_mkgridmap
diff --git a/tools/mksurfdata_map/src/test/mkindexmap_test/CMakeLists.txt b/tools/mksurfdata_map/src/test/mkindexmap_test/CMakeLists.txt
deleted file mode 100644
index 044d3e4f89..0000000000
--- a/tools/mksurfdata_map/src/test/mkindexmap_test/CMakeLists.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-create_pFUnit_test(mkindexmap test_mkindexmap_exe
- "test_mkindexmap.pf" "")
-
-target_link_libraries(test_mkindexmap_exe mksurfdat)
\ No newline at end of file
diff --git a/tools/mksurfdata_map/src/test/mkindexmap_test/test_mkindexmap.pf b/tools/mksurfdata_map/src/test/mkindexmap_test/test_mkindexmap.pf
deleted file mode 100644
index 98e9590478..0000000000
--- a/tools/mksurfdata_map/src/test/mkindexmap_test/test_mkindexmap.pf
+++ /dev/null
@@ -1,258 +0,0 @@
-module test_mkindexmap
-
- ! Tests of mkindexmapMod
-
- use pfunit_mod
- use mkindexmapMod
- use mkgridmapMod, only : gridmap_type, for_test_create_gridmap, gridmap_clean
- use shr_kind_mod , only : r8 => shr_kind_r8
-
- implicit none
-
- @TestCase
- type, extends(TestCase) :: TestMkIndexMap
- type(gridmap_type) :: gridmap
- contains
- procedure :: setUp
- procedure :: tearDown
- procedure :: createGridmap
- procedure :: createGridmap3src1dst
- end type TestMkIndexMap
-
- real(r8), parameter :: tol = 1.e-13_r8
-
- integer, parameter :: NODATA_VAL = -999
-
-contains
-
- ! ------------------------------------------------------------------------
- ! Helper routines
- ! ------------------------------------------------------------------------
-
-
- subroutine setUp(this)
- class(TestMkIndexMap), intent(inout) :: this
- end subroutine setUp
-
- subroutine tearDown(this)
- class(TestMkIndexMap), intent(inout) :: this
- call gridmap_clean(this%gridmap)
- end subroutine tearDown
-
- !-----------------------------------------------------------------------
- subroutine createGridmap(this, src_indx, dst_indx, wovr, &
- na_in, nb_in)
- !
- ! !DESCRIPTION:
- ! Create this%gridmap
- !
- ! !ARGUMENTS:
- class(TestMkIndexMap), intent(inout) :: this
-
- ! The following arrays should all be the same size:
- integer, intent(in) :: src_indx(:)
- integer, intent(in) :: dst_indx(:)
- real(r8), intent(in) :: wovr(:) ! overlap weights
-
- ! If not present, na is set to max(src_indx) and nb to max(dst_indx)
- integer, intent(in), optional :: na_in
- integer, intent(in), optional :: nb_in
-
- !
- ! !LOCAL VARIABLES:
- integer :: na
- integer :: nb
- integer :: ns
-
- character(len=*), parameter :: subname = 'createGridmap'
- !-----------------------------------------------------------------------
-
- ns = size(wovr)
- @assertEqual(ns, size(src_indx))
- @assertEqual(ns, size(dst_indx))
-
- if (present(na_in)) then
- na = na_in
- else
- na = maxval(src_indx)
- end if
-
- if (present(nb_in)) then
- nb = nb_in
- else
- nb = maxval(dst_indx)
- end if
-
- call for_test_create_gridmap(this%gridmap, na=na, nb=nb, ns=ns, &
- src_indx=src_indx, dst_indx=dst_indx, wovr=wovr)
-
- end subroutine createGridmap
-
- !-----------------------------------------------------------------------
- subroutine createGridmap3src1dst(this)
- !
- ! !DESCRIPTION:
- ! Creates a gridmap with 3 src points and 1 dst point.
- !
- ! Overlap weights are 0.25, 0.5, 0.25
- !
- ! !ARGUMENTS:
- class(TestMkIndexMap), intent(inout) :: this
- !
- ! !LOCAL VARIABLES:
-
- character(len=*), parameter :: subname = 'createGridmap3src1dst'
- !-----------------------------------------------------------------------
-
- call this%createGridmap( &
- src_indx = [1, 2, 3], &
- dst_indx = [1, 1, 1], &
- wovr = [0.25_r8, 0.5_r8, 0.25_r8])
- end subroutine createGridmap3src1dst
-
-
- ! ------------------------------------------------------------------------
- ! Tests of get_max_indices
- ! ------------------------------------------------------------------------
-
- @Test
- subroutine getMaxIndices_maxFirst(this)
- class(TestMkIndexMap), intent(inout) :: this
- integer :: dst_array(1)
-
- call this%createGridmap3src1dst()
-
- call get_max_indices(&
- gridmap = this%gridmap, &
- src_array = [13, 12, 11], &
- dst_array = dst_array, &
- nodata = NODATA_VAL, &
- mask_src = [1, 1, 1])
-
- @assertEqual([13], dst_array)
- end subroutine getMaxIndices_maxFirst
-
- @Test
- subroutine getMaxIndices_maxMiddle(this)
- class(TestMkIndexMap), intent(inout) :: this
- integer :: dst_array(1)
-
- call this%createGridmap3src1dst()
-
- call get_max_indices(&
- gridmap = this%gridmap, &
- src_array = [12, 13, 11], &
- dst_array = dst_array, &
- nodata = NODATA_VAL, &
- mask_src = [1, 1, 1])
-
- @assertEqual([13], dst_array)
- end subroutine getMaxIndices_maxMiddle
-
- @Test
- subroutine getMaxIndices_maxLast(this)
- class(TestMkIndexMap), intent(inout) :: this
- integer :: dst_array(1)
-
- call this%createGridmap3src1dst()
-
- call get_max_indices(&
- gridmap = this%gridmap, &
- src_array = [11, 12, 13], &
- dst_array = dst_array, &
- nodata = NODATA_VAL, &
- mask_src = [1, 1, 1])
-
- @assertEqual([13], dst_array)
- end subroutine getMaxIndices_maxLast
-
- @Test
- subroutine getMaxIndices_noData(this)
- class(TestMkIndexMap), intent(inout) :: this
- integer :: dst_array(2)
-
- ! 2 destination points, but all source points map to dest #1 (nothing maps to dest #2)
- call this%createGridmap( &
- src_indx = [1,2,3], &
- dst_indx = [1,1,1], &
- wovr = [0.25_r8, 0.5_r8, 0.25_r8], &
- nb_in = 2)
-
- call get_max_indices(&
- gridmap = this%gridmap, &
- src_array = [11, 12, 13], &
- dst_array = dst_array, &
- nodata = NODATA_VAL, &
- mask_src = [1, 1, 1])
-
- @assertEqual([13, NODATA_VAL], dst_array)
- end subroutine getMaxIndices_noData
-
- @Test
- subroutine getMaxIndices_noOverlap(this)
- class(TestMkIndexMap), intent(inout) :: this
- integer :: dst_array(2)
-
- ! 2 destination points, and the matrix has an overlap with dest #2, but the overlap
- ! weight is 0. (I'm not sure this can happen in practice, but I'm also not sure that
- ! it can't happen.)
- call this%createGridmap( &
- src_indx = [1,2,3,3], &
- dst_indx = [1,1,1,2], &
- wovr = [0.25_r8, 0.5_r8, 0.25_r8, 0._r8])
-
- call get_max_indices(&
- gridmap = this%gridmap, &
- src_array = [11, 12, 13], &
- dst_array = dst_array, &
- nodata = NODATA_VAL, &
- mask_src = [1, 1, 1])
-
- @assertEqual([13, NODATA_VAL], dst_array)
- end subroutine getMaxIndices_noOverlap
-
- @Test
- subroutine getMaxIndices_bigValNoOverlap(this)
- class(TestMkIndexMap), intent(inout) :: this
- integer :: dst_array(1)
-
- ! Overlap weight is 0 for a point with a big value. (I'm not sure this can happen in
- ! practice, but I'm also not sure that it can't happen.)
- call this%createGridmap( &
- src_indx = [1,2,3], &
- dst_indx = [1,1,1], &
- wovr = [0.5_r8, 0.5_r8, 0._r8])
-
- call get_max_indices(&
- gridmap = this%gridmap, &
- src_array = [11, 12, 13], &
- dst_array = dst_array, &
- nodata = NODATA_VAL, &
- mask_src = [1, 1, 1])
-
- @assertEqual([12], dst_array)
- end subroutine getMaxIndices_bigValNoOverlap
-
- @Test
- subroutine getMaxIndices_multipleDests(this)
- ! Make sure that the source/dest indexing is working right by having multiple source
- ! & dest points
- class(TestMkIndexMap), intent(inout) :: this
- integer :: dst_array(2)
-
- call this%createGridmap( &
- src_indx = [1,2,3,4], &
- dst_indx = [1,1,2,2], &
- wovr = [0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8])
-
- call get_max_indices(&
- gridmap = this%gridmap, &
- src_array = [11,12,22,21], &
- dst_array = dst_array, &
- nodata = NODATA_VAL, &
- mask_src = [1, 1, 1, 1])
-
- @assertEqual([12,22], dst_array)
- end subroutine getMaxIndices_multipleDests
-
-end module test_mkindexmap
diff --git a/tools/mksurfdata_map/src/test/mkpctPftType_test/CMakeLists.txt b/tools/mksurfdata_map/src/test/mkpctPftType_test/CMakeLists.txt
deleted file mode 100644
index 8fd784c672..0000000000
--- a/tools/mksurfdata_map/src/test/mkpctPftType_test/CMakeLists.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-create_pFUnit_test(mkpctPftType test_mkpctPftType_exe
- "test_mkpctPftType.pf" "")
-
-target_link_libraries(test_mkpctPftType_exe mksurfdat)
\ No newline at end of file
diff --git a/tools/mksurfdata_map/src/test/mkpctPftType_test/test_mkpctPftType.pf b/tools/mksurfdata_map/src/test/mkpctPftType_test/test_mkpctPftType.pf
deleted file mode 100644
index 47e7e90f48..0000000000
--- a/tools/mksurfdata_map/src/test/mkpctPftType_test/test_mkpctPftType.pf
+++ /dev/null
@@ -1,253 +0,0 @@
-module test_mkpctPftType
-
- ! Tests of pct_pft_type
-
- use pfunit_mod
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkpctPftTypeMod
-
- implicit none
- save
-
- real(r8), parameter :: tol = 1.e-12_r8
-
-contains
-
- @Test
- subroutine test_constructor_nonzero()
- ! Tests constructor with non-zero area
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8])
-
- @assertEqual([10._r8, 40._r8, 50._r8], pct_pft%get_pct_p2l(), tolerance=tol)
- @assertEqual(50._r8, pct_pft%get_pct_l2g(), tolerance=tol)
- @assertEqual(11, pct_pft%get_first_pft_index())
-
- end subroutine test_constructor_nonzero
-
- @Test
- subroutine test_constructor_zero()
- ! Tests constructor with zero area
- type(pct_pft_type) :: pct_pft
- real(r8) :: default_pct_p2l(3)
-
- default_pct_p2l = [0._r8, 100._r8, 0._r8]
-
- pct_pft = pct_pft_type([0._r8, 0._r8, 0._r8], 11, default_pct_p2l)
- @assertEqual(default_pct_p2l, pct_pft%get_pct_p2l())
- @assertEqual(0._r8, pct_pft%get_pct_l2g())
- end subroutine test_constructor_zero
-
- @Test
- subroutine test_constructor_empty()
- ! Tests version of constructor with an empty landunit
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type()
- @assertEqual(0._r8, pct_pft%get_pct_l2g())
- end subroutine test_constructor_empty
-
- @Test
- subroutine test_assignment()
- ! Tests assignment of one object to another
- !
- ! Currently there is no defined assignment operator, so the point of this is to
- ! ensure that intrinsic assignment works properly, particularly with respect to
- ! maintaining the correct lower bound (get_first_pft_index).
- type(pct_pft_type) :: source, dest
-
- source = pct_pft_type([5._r8, 20._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8])
- dest = source
-
- @assertEqual([10._r8, 40._r8, 50._r8], dest%get_pct_p2l(), tolerance=tol)
- @assertEqual(50._r8, dest%get_pct_l2g(), tolerance=tol)
- @assertEqual(11, dest%get_first_pft_index())
- end subroutine test_assignment
-
- @Test
- subroutine test_get_pct_p2g()
- ! Test the get_pct_p2g routine
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8])
-
- @assertEqual([5._r8, 20._r8, 25._r8], pct_pft%get_pct_p2g())
- end subroutine test_get_pct_p2g
-
- @Test
- subroutine test_get_one_pct_p2g()
- ! Test the get_one_pct_p2g routine
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8])
-
- @assertEqual(20._r8, pct_pft%get_one_pct_p2g(12))
- end subroutine test_get_one_pct_p2g
-
-
- @Test
- subroutine test_set_pct_l2g()
- ! Test the set_pct_l2g routine
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8])
-
- call pct_pft%set_pct_l2g(60._r8)
- @assertEqual([10._r8, 40._r8, 50._r8], pct_pft%get_pct_p2l(), tolerance=tol)
- @assertEqual(60._r8, pct_pft%get_pct_l2g(), tolerance=tol)
- end subroutine test_set_pct_l2g
-
-
- @Test
- subroutine test_set_one_pct_p2g()
- ! Test the set_one_pct_p2g routine
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8])
-
- call pct_pft%set_one_pct_p2g(12, 10._r8)
- @assertEqual(40._r8, pct_pft%get_pct_l2g(), tolerance=tol)
- @assertEqual([12.5_r8, 25._r8, 62.5_r8], pct_pft%get_pct_p2l(), tolerance=tol)
-
- end subroutine test_set_one_pct_p2g
-
- @Test
- subroutine test_set_one_pct_p2g_to_zero()
- ! Test the set_one_pct_p2g routine, when we go to a total area of 0
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([20._r8, 0._r8, 0._r8], 11, [0._r8, 100._r8, 0._r8])
-
- call pct_pft%set_one_pct_p2g(11, 0._r8)
- @assertEqual(0._r8, pct_pft%get_pct_l2g())
- ! note that pct_p2l stays at its original value
- @assertEqual([100._r8, 0._r8, 0._r8], pct_pft%get_pct_p2l(), tolerance=tol)
-
- end subroutine test_set_one_pct_p2g_to_zero
-
- @Test
- subroutine test_set_one_pct_p2g_from_zero()
- ! Test the set_one_pct_p2g routine, when we start from a total area of 0
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([0._r8, 0._r8, 0._r8], 11, [0._r8, 100._r8, 0._r8])
-
- call pct_pft%set_one_pct_p2g(13, 5._r8)
- @assertEqual(5._r8, pct_pft%get_pct_l2g())
- @assertEqual([0._r8, 0._r8, 100._r8], pct_pft%get_pct_p2l(), tolerance=tol)
-
- end subroutine test_set_one_pct_p2g_from_zero
-
- @Test
- subroutine test_merge_pfts()
- ! Test the merge_pfts routine
- type(pct_pft_type) :: pct_pft
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8])
-
- call pct_pft%merge_pfts(source=12, dest=13)
- @assertEqual(50._r8, pct_pft%get_pct_l2g())
- @assertEqual([10._r8, 0._r8, 90._r8], pct_pft%get_pct_p2l(), tolerance=tol)
- end subroutine test_merge_pfts
-
- @Test
- subroutine test_remove_small_cover_no_small()
- ! Test the remove_small_cover routine with no small pfts
- type(pct_pft_type) :: pct_pft, pct_pft_orig
- integer :: nsmall
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 0._r8], 11, [0._r8, 100._r8, 0._r8])
- pct_pft_orig = pct_pft
-
- call pct_pft%remove_small_cover(1._r8, nsmall)
- @assertEqual(pct_pft_orig%get_pct_l2g(), pct_pft%get_pct_l2g())
- @assertEqual(pct_pft_orig%get_pct_p2l(), pct_pft%get_pct_p2l())
- @assertEqual(0, nsmall)
- end subroutine test_remove_small_cover_no_small
-
- @Test
- subroutine test_remove_small_cover_all_small()
- ! Test the remove_small_cover routine with all small (or zero) pfts
- type(pct_pft_type) :: pct_pft, pct_pft_orig
- integer :: nsmall
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 0._r8], 11, [0._r8, 100._r8, 0._r8])
- pct_pft_orig = pct_pft
-
- call pct_pft%remove_small_cover(30._r8, nsmall)
- @assertEqual(0._r8, pct_pft%get_pct_l2g())
- @assertEqual(pct_pft_orig%get_pct_p2l(), pct_pft%get_pct_p2l())
- @assertEqual(2, nsmall)
- end subroutine test_remove_small_cover_all_small
-
- @Test
- subroutine test_remove_small_cover_some_small()
- ! Test the remove_small_cover routine with some (but not all) small pfts
- type(pct_pft_type) :: pct_pft
- integer :: nsmall
-
- pct_pft = pct_pft_type([5._r8, 20._r8, 0._r8, 25._r8], 11, [0._r8, 100._r8, 0._r8, 0._r8])
-
- call pct_pft%remove_small_cover(10._r8, nsmall)
- @assertEqual(45._r8, pct_pft%get_pct_l2g())
- @assertEqual([0._r8, 20._r8, 0._r8, 25._r8]/45._r8 * 100._r8, pct_pft%get_pct_p2l(), tolerance=tol)
- @assertEqual(1, nsmall)
- end subroutine test_remove_small_cover_some_small
-
- @Test
- subroutine test_remove_small_cover_zero_area()
- ! Test the remove_small_cover routine with a starting area of 0
- type(pct_pft_type) :: pct_pft
- integer :: nsmall
-
- pct_pft = pct_pft_type([0._r8, 0._r8, 0._r8], 11, [0._r8, 100._r8, 0._r8])
-
- call pct_pft%remove_small_cover(1._r8, nsmall)
- @assertEqual(0._r8, pct_pft%get_pct_l2g())
- @assertEqual([0._r8, 100._r8, 0._r8], pct_pft%get_pct_p2l())
- @assertEqual(0, nsmall)
- end subroutine test_remove_small_cover_zero_area
-
- @Test
- subroutine test_remove_small_cover_no_landunit()
- ! Test the remove_small_cover routine when there are no pfts on this landunit
- type(pct_pft_type) :: pct_pft
- integer :: nsmall
-
- pct_pft = pct_pft_type()
- call pct_pft%remove_small_cover(1._r8, nsmall)
- @assertEqual(0._r8, pct_pft%get_pct_l2g())
- @assertEqual(0, nsmall)
- end subroutine test_remove_small_cover_no_landunit
-
- @Test
- subroutine test_get_pct_p2l_array()
- ! Test the get_pct_p2l_array routine
- type(pct_pft_type) :: pct_pft(2)
- real(r8) :: expected(2, 3)
-
- pct_pft(1) = pct_pft_type([10._r8, 40._r8, 50._r8], 11, [0._r8, 100._r8, 0._r8])
- pct_pft(2) = pct_pft_type([5._r8, 30._r8, 65._r8], 11, [0._r8, 100._r8, 0._r8])
-
- expected(1,:) = [10._r8, 40._r8, 50._r8]
- expected(2,:) = [5._r8, 30._r8, 65._r8]
-
- @assertEqual(expected, get_pct_p2l_array(pct_pft))
-
- end subroutine test_get_pct_p2l_array
-
- @Test
- subroutine test_get_pct_l2g_array()
- ! Test the get_pct_l2g_array routine
- type(pct_pft_type) :: pct_pft(2)
-
- pct_pft(1) = pct_pft_type([5._r8, 25._r8, 20._r8], 11, [0._r8, 100._r8, 0._r8])
- pct_pft(2) = pct_pft_type([1._r8, 2._r8, 3._r8], 11, [0._r8, 100._r8, 0._r8])
-
- @assertEqual([50._r8, 6._r8], get_pct_l2g_array(pct_pft), tolerance=tol)
-
- end subroutine test_get_pct_l2g_array
-
-end module test_mkpctPftType
diff --git a/tools/mksurfdata_map/src/test/mkpftUtils_test/CMakeLists.txt b/tools/mksurfdata_map/src/test/mkpftUtils_test/CMakeLists.txt
deleted file mode 100644
index 33dd01bcd9..0000000000
--- a/tools/mksurfdata_map/src/test/mkpftUtils_test/CMakeLists.txt
+++ /dev/null
@@ -1,8 +0,0 @@
-set (pfunit_sources
- test_adjust_total_veg_area.pf
- test_convert_from_p2g.pf)
-
-create_pFUnit_test(mkpftUtils test_mkpftUtils_exe
- "${pfunit_sources}" "")
-
-target_link_libraries(test_mkpftUtils_exe mksurfdat)
\ No newline at end of file
diff --git a/tools/mksurfdata_map/src/test/mkpftUtils_test/test_adjust_total_veg_area.pf b/tools/mksurfdata_map/src/test/mkpftUtils_test/test_adjust_total_veg_area.pf
deleted file mode 100644
index 345c1a7370..0000000000
--- a/tools/mksurfdata_map/src/test/mkpftUtils_test/test_adjust_total_veg_area.pf
+++ /dev/null
@@ -1,59 +0,0 @@
-module test_adjust_total_veg_area
-
- ! Tests of mkpftUtilsMod: adjust_total_veg_area
-
- use pfunit_mod
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkpctPftTypeMod, only : pct_pft_type
- use mkpftUtilsMod, only : adjust_total_veg_area
-
- implicit none
- save
-
- real(r8), parameter :: tol = 1.e-12_r8
-
-contains
-
- @Test
- subroutine test_standard_case()
- type(pct_pft_type) :: pctnatpft, pctcft
-
- pctnatpft = pct_pft_type([5._r8, 10._r8], 1, [100._r8, 0._r8])
- pctcft = pct_pft_type([10._r8, 20._r8], 3, [100._r8, 0._r8])
-
- call adjust_total_veg_area(90._r8, pctnatpft, pctcft)
-
- @assertEqual(30._r8, pctnatpft%get_pct_l2g())
- @assertEqual(60._r8, pctcft%get_pct_l2g())
- end subroutine test_standard_case
-
- @Test
- subroutine test_initial_total_zero()
- ! When the old areas are 0, all area should go into natural veg
- type(pct_pft_type) :: pctnatpft, pctcft
-
- pctnatpft = pct_pft_type([0._r8, 0._r8], 1, [100._r8, 0._r8])
- pctcft = pct_pft_type([0._r8, 0._r8], 3, [100._r8, 0._r8])
-
- call adjust_total_veg_area(90._r8, pctnatpft, pctcft)
-
- @assertEqual(90._r8, pctnatpft%get_pct_l2g())
- @assertEqual(0._r8, pctcft%get_pct_l2g())
- end subroutine test_initial_total_zero
-
- @Test
- subroutine test_initial_one_zero()
- ! Test a case where one area is initially 0 - make sure it stays 0
- type(pct_pft_type) :: pctnatpft, pctcft
-
- pctnatpft = pct_pft_type([0._r8, 0._r8], 1, [100._r8, 0._r8])
- pctcft = pct_pft_type([10._r8, 20._r8], 3, [100._r8, 0._r8])
-
- call adjust_total_veg_area(90._r8, pctnatpft, pctcft)
-
- @assertEqual(0._r8, pctnatpft%get_pct_l2g())
- @assertEqual(90._r8, pctcft%get_pct_l2g())
- end subroutine test_initial_one_zero
-
-end module test_adjust_total_veg_area
diff --git a/tools/mksurfdata_map/src/test/mkpftUtils_test/test_convert_from_p2g.pf b/tools/mksurfdata_map/src/test/mkpftUtils_test/test_convert_from_p2g.pf
deleted file mode 100644
index 53548e4e6c..0000000000
--- a/tools/mksurfdata_map/src/test/mkpftUtils_test/test_convert_from_p2g.pf
+++ /dev/null
@@ -1,151 +0,0 @@
-module test_convert_from_p2g
-
- ! Tests of mkpftUtilsMod: convert_from_p2g
-
- use pfunit_mod
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkpctPftTypeMod, only : pct_pft_type
- use mkpftUtilsMod, only : convert_from_p2g
- use mkpftConstantsMod, only : natpft_lb, natpft_ub, num_cft, cft_lb, cft_ub, c3cropindex
-
- implicit none
- save
-
- real(r8), parameter :: tol = 1.e-12_r8
-
-contains
-
- subroutine setup()
- ! Perform setup for most tests
-
- natpft_lb = 0
- natpft_ub = 2
- cft_lb = 3
- cft_ub = 4
- num_cft = 2
-
- c3cropindex = 3
-
- end subroutine setup
-
-
- ! ------------------------------------------------------------------------
- ! Tests of convert_from_p2g_default
- ! ------------------------------------------------------------------------
-
- @Test
- subroutine test_standard()
- ! Standard case: some nat pft, some crop
- type(pct_pft_type) :: pctnatpft, pctcft
-
- call setup
-
- call convert_from_p2g([1._r8, 2._r8, 3._r8, 4._r8, 5._r8], pctnatpft, pctcft)
-
- @assertEqual(6._r8, pctnatpft%get_pct_l2g(), tolerance=tol)
- @assertEqual([1._r8, 2._r8, 3._r8]/6._r8 * 100._r8, pctnatpft%get_pct_p2l(), tolerance=tol)
- @assertEqual(9._r8, pctcft%get_pct_l2g(), tolerance=tol)
- @assertEqual([4._r8, 5._r8]/9._r8 * 100._r8, pctcft%get_pct_p2l(), tolerance=tol)
- end subroutine test_standard
-
- @Test
- subroutine test_natpft0()
- ! natpft all 0 (percents should be at their default)
- type(pct_pft_type) :: pctnatpft, pctcft
-
- call setup
-
- call convert_from_p2g([0._r8, 0._r8, 0._r8, 4._r8, 5._r8], pctnatpft, pctcft)
-
- @assertEqual(0._r8, pctnatpft%get_pct_l2g())
- @assertEqual([100._r8, 0._r8, 0._r8], pctnatpft%get_pct_p2l())
- @assertEqual(9._r8, pctcft%get_pct_l2g(), tolerance=tol)
- @assertEqual([4._r8, 5._r8]/9._r8 * 100._r8, pctcft%get_pct_p2l(), tolerance=tol)
- end subroutine test_natpft0
-
- @Test
- subroutine test_cft0()
- ! cft landunit present, but all 0 (percents should be at their default)
- type(pct_pft_type) :: pctnatpft, pctcft
-
- call setup
-
- call convert_from_p2g([1._r8, 2._r8, 3._r8, 0._r8, 0._r8], pctnatpft, pctcft)
- @assertEqual(6._r8, pctnatpft%get_pct_l2g(), tolerance=tol)
- @assertEqual([1._r8, 2._r8, 3._r8]/6._r8 * 100._r8, pctnatpft%get_pct_p2l(), tolerance=tol)
- @assertEqual(0._r8, pctcft%get_pct_l2g(), tolerance=tol)
- @assertEqual([100._r8, 0._r8], pctcft%get_pct_p2l(), tolerance=tol)
- end subroutine test_cft0
-
- @Test
- subroutine test_no_cft_landunit()
- ! no cft landunit
- type(pct_pft_type) :: pctnatpft, pctcft
-
- call setup
-
- cft_lb = 3
- cft_ub = 2
- num_cft = 0
-
- call convert_from_p2g([1._r8, 2._r8, 3._r8], pctnatpft, pctcft)
- @assertEqual(6._r8, pctnatpft%get_pct_l2g(), tolerance=tol)
- @assertEqual([1._r8, 2._r8, 3._r8]/6._r8 * 100._r8, pctnatpft%get_pct_p2l(), tolerance=tol)
- @assertEqual(0._r8, pctcft%get_pct_l2g(), tolerance=tol)
-
- end subroutine test_no_cft_landunit
-
- ! ------------------------------------------------------------------------
- ! Tests of convert_from_p2g_missing_crops
- ! ------------------------------------------------------------------------
-
- @Test
- subroutine test_missing_crops()
- type(pct_pft_type) :: pctnatpft, pctcft_saved, pctcft
-
- call setup
- ! add an extra cft to make sure it's okay for the pct_p2g input to not contain the
- ! same number of elements as the cft landunit
- cft_ub = 5
- num_cft = 3
- pctcft_saved = pct_pft_type([10._r8, 15._r8, 20._r8], cft_lb, [100._r8, 0._r8, 0._r8])
-
- call convert_from_p2g([1._r8, 2._r8, 3._r8, 4._r8, 0._r8], pctcft_saved, pctnatpft, pctcft)
- @assertEqual(6._r8, pctnatpft%get_pct_l2g(), tolerance=tol)
- @assertEqual([1._r8, 2._r8, 3._r8]/6._r8 * 100._r8, pctnatpft%get_pct_p2l(), tolerance=tol)
- @assertEqual(4._r8, pctcft%get_pct_l2g(), tolerance=tol)
- @assertEqual([10._r8, 15._r8, 20._r8]/45._r8 * 100._r8, pctcft%get_pct_p2l(), tolerance=tol)
-
- end subroutine test_missing_crops
-
- @Test
- subroutine test_missing_crops_natpft0()
- ! Make sure the setting of the natpft default works correctly for the missing_crops
- ! version of the subroutine
- type(pct_pft_type) :: pctnatpft, pctcft_saved, pctcft
-
- call setup
- pctcft_saved = pct_pft_type([10._r8, 15._r8], cft_lb, [100._r8, 0._r8])
-
- call convert_from_p2g([0._r8, 0._r8, 0._r8, 4._r8, 0._r8], pctcft_saved, pctnatpft, pctcft)
- @assertEqual(0._r8, pctnatpft%get_pct_l2g())
- @assertEqual([100._r8, 0._r8, 0._r8], pctnatpft%get_pct_p2l())
- @assertEqual(4._r8, pctcft%get_pct_l2g(), tolerance=tol)
- @assertEqual([10._r8, 15._r8]/25._r8 * 100._r8, pctcft%get_pct_p2l(), tolerance=tol)
- end subroutine test_missing_crops_natpft0
-
- @Test
- subroutine test_missing_crops_cft0()
- ! Make sure the cft cover is as expected when the cft landunit area goes to 0
- type(pct_pft_type) :: pctnatpft, pctcft_saved, pctcft
-
- call setup
- pctcft_saved = pct_pft_type([10._r8, 15._r8], cft_lb, [100._r8, 0._r8])
-
- call convert_from_p2g([1._r8, 2._r8, 3._r8, 0._r8, 0._r8], pctcft_saved, pctnatpft, pctcft)
- @assertEqual(0._r8, pctcft%get_pct_l2g(), tolerance=tol)
- @assertEqual([10._r8, 15._r8]/25._r8 * 100._r8, pctcft%get_pct_p2l(), tolerance=tol)
- end subroutine test_missing_crops_cft0
-
-end module test_convert_from_p2g
diff --git a/tools/mksurfdata_map/src/test/mkpftmod_test/CMakeLists.txt b/tools/mksurfdata_map/src/test/mkpftmod_test/CMakeLists.txt
deleted file mode 100644
index 8fcb75145f..0000000000
--- a/tools/mksurfdata_map/src/test/mkpftmod_test/CMakeLists.txt
+++ /dev/null
@@ -1,9 +0,0 @@
-set (pfunit_sources
- test_pftrun.pf
- test_pft_oride.pf
- test_pftInit.pf)
-
-create_pFUnit_test(mkpftMod test_mkpft_exe
- "${pfunit_sources}" "")
-
-target_link_libraries(test_mkpft_exe mksurfdat)
diff --git a/tools/mksurfdata_map/src/test/mkpftmod_test/test_pftInit.pf b/tools/mksurfdata_map/src/test/mkpftmod_test/test_pftInit.pf
deleted file mode 100644
index 1ddb143961..0000000000
--- a/tools/mksurfdata_map/src/test/mkpftmod_test/test_pftInit.pf
+++ /dev/null
@@ -1,297 +0,0 @@
-module test_pftInit
-
- ! Tests of mkpftMod: pft_override functions
-
- use pfunit_mod
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkpftMod
- use mkvarctl, only: numpft
- use mkvarpar, only: numstdpft, noveg
- use mkpftConstantsMod, only: maxpft, c3cropindex
-
- implicit none
- save
-
- @TestCase
- type, extends(TestCase) :: TestMkPFT
- contains
- procedure :: setUp
- procedure :: tearDown
- end type TestMkPFT
-
-contains
-
- subroutine setUp(this)
- class(TestMkPFT), intent(inout) :: this
- numpft = numstdpft
- pft_idx(0:maxpft) = -1
- pft_frc(0:maxpft) = 0.0_r8
- end subroutine setUp
-
- subroutine tearDown(this)
- class(TestMkPFT), intent(inout) :: this
-
- end subroutine tearDown
-
- @Test
- subroutine test_runmkpftInit(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .false.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertFalse( use_input_pft )
- @assertFalse( presc_cover )
-
- end subroutine test_runmkpftInit
-
- @Test
- subroutine test_runmkpftInitZero(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .true.
- all_veg_l = .false.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
- @assertEqual( pft_idx(0), noveg )
- @assertEqual( pft_frc(0), 0.0_r8 )
-
- end subroutine test_runmkpftInitZero
-
- @Test
- subroutine test_runmkpftInitPftORide(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- pft_idx(0) = 1
- pft_frc(0) = 100._r8
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
-
- end subroutine test_runmkpftInitPftORide
-
-
- @Test
- subroutine test_runmkpftInitPftORideButNOTAllVeg(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- pft_idx(0:1) = (/ 1, c3cropindex /)
- pft_frc(0:1) = (/ 50._r8, 50.0_r8 /)
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
-
- end subroutine test_runmkpftInitPftORideButNOTAllVeg
-
-
- @Test
- subroutine test_runmkpftInitPftORideCrop(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- numpft = maxpft
- pft_idx(0) = 17
- pft_frc(0) = 100._r8
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
-
- end subroutine test_runmkpftInitPftORideCrop
-
-
- @Test
- subroutine test_runmkpftInitPftORideAll(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
- integer :: i
-
- zero_out_l = .false.
- numpft = numstdpft
- do i = 0, numpft
- pft_idx(i) = i
- pft_frc(i) = 1.0_r8
- end do
- pft_frc(numpft) = 100._r8 - sum(pft_frc(0:numpft-1))
- @assertEqual( 100.0_r8, sum(pft_frc) )
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
-
- end subroutine test_runmkpftInitPftORideAll
-
- @Test
- subroutine test_runmkpftInitPFTOrideWarnNoCrop(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- numpft = maxpft
- pft_idx(0) = 1
- pft_frc(0) = 100._r8
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
-
- end subroutine test_runmkpftInitPFTOrideWarnNoCrop
-
- @Test
- subroutine test_runmkpftInitPFTOrideWarnNoNatVeg(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- numpft = maxpft
- pft_idx(0) = c3cropindex
- pft_frc(0) = 100._r8
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
-
- end subroutine test_runmkpftInitPFTOrideWarnNoNatVeg
-
- @Test
- subroutine test_runmkpftInitBadZeroNInput(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .true.
- numpft = maxpft
- pft_idx(0) = numstdpft+1
- pft_frc(0) = 100._r8
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadZeroNInput
-
- @Test
- subroutine test_runmkpftInitBadAllVeg(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .true.
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadAllVeg
-
- @Test
- subroutine test_runmkpftInitBadNotSum(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .true.
- numpft = maxpft
- pft_idx(0) = numstdpft+1
- pft_frc(0) = 99._r8
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadNotSum
-
- @Test
- subroutine test_runmkpftInitBadPFTOutRange(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .true.
- numpft = numstdpft
- pft_idx(0) = numstdpft+1
- pft_frc(0) = 100._r8
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadPFTOutRange
-
- @Test
- subroutine test_runmkpftInitBadPFTBadVals(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .true.
- numpft = maxpft
- pft_idx(0:1) = (/ numstdpft+1, numstdpft+2 /)
- pft_frc(0:1) = (/ 101._r8, -1._r8 /)
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadPFTBadVals
-
- @Test
- subroutine test_runmkpftInitBadnumpft(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .true.
- numpft = 79
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadnumpft
-
- @Test
- subroutine test_runmkpftInitBadFrcNotIdx(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .true.
- numpft = maxpft
- pft_idx(0) = numstdpft+1
- pft_frc(0:1) = (/ 99._r8, 1._r8 /)
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadFrcNotIdx
-
- @Test
- subroutine test_runmkpftInitBadIdxTwice(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- all_veg_l = .true.
- numpft = maxpft
- pft_idx(0:1) = (/ 17, 17 /)
- pft_frc(0:1) = (/ 99._r8, 1._r8 /)
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
-
- end subroutine test_runmkpftInitBadIdxTwice
-
- @Test
- subroutine test_runmkpftInitBadFrcAfterZero(this)
- class(TestMkPFT), intent(inout) :: this
- logical :: zero_out_l, all_veg_l
-
- zero_out_l = .false.
- numpft = maxpft
- all_veg_l = .true.
- pft_idx(0:2) = (/ 17, -1, 18 /)
- pft_frc(0:2) = (/ 99._r8, 0.0_r8, 1._r8 /)
- call mkpftInit( zero_out_l, all_veg_l )
- @assertExceptionRaised( "ABORTED:" )
- end subroutine test_runmkpftInitBadFrcAfterZero
-
-
-end module test_pftInit
diff --git a/tools/mksurfdata_map/src/test/mkpftmod_test/test_pft_oride.pf b/tools/mksurfdata_map/src/test/mkpftmod_test/test_pft_oride.pf
deleted file mode 100644
index 97cfc66d1e..0000000000
--- a/tools/mksurfdata_map/src/test/mkpftmod_test/test_pft_oride.pf
+++ /dev/null
@@ -1,127 +0,0 @@
-module test_pft_oride
-
- ! Tests of mkpftMod: pft_override functions
-
- use pfunit_mod
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkpftMod
- use mkvarctl, only : numpft
- use mkvarpar, only : numstdpft
- use mkpftConstantsMod, only : c3cropindex, c3irrcropindex
-
- implicit none
- save
-
- @TestCase
- type, extends(TestCase) :: TestPFTORide
- type(pft_oride) :: pftoverride
- contains
- procedure :: setUp
- procedure :: tearDown
- end type TestPFTORide
-
- logical :: zero_out, all_veg_l
-
-contains
-
- subroutine setUp(this)
- class(TestPFTORide), intent(inout) :: this
-
- pft_idx(:) = -1
- pft_frc(:) = 0.0_r8
- zero_out = .false.
- numpft = numstdpft
- all_veg_l = .false.
- call mkpftInit( zero_out, all_veg_l )
- this%pftoverride = pft_oride( )
- end subroutine setUp
-
- subroutine tearDown(this)
- class(TestPFTORide), intent(inout) :: this
-
- call this%pftoverride%Clean()
-
- end subroutine tearDown
-
- @Test
- subroutine test_runmkpftZero(this)
- class(TestPFTORide), intent(inout) :: this
-
- zero_out = .true.
- all_veg_l = .false.
- call mkpftInit( zero_out, all_veg_l )
- call this%pftoverride%InitZeroOut()
- @assertEqual( 0.0_r8, this%pftoverride%crop )
- @assertEqual( 0.0_r8, this%pftoverride%natveg )
- @assertEqual( 100.0_r8, sum(this%pftoverride%cft))
- @assertEqual( 100.0_r8, sum(this%pftoverride%natpft))
-
- end subroutine test_runmkpftZero
-
- @Test
- subroutine test_runSetpft(this)
- class(TestPFTORide), intent(inout) :: this
- integer, parameter :: pftidx = 1
-
- pft_idx(0) = pftidx
- pft_frc(0) = 100.0_r8
- zero_out = .false.
- all_veg_l = .true.
- call mkpftInit( zero_out, all_veg_l )
- call this%pftoverride%InitAllPFTIndex()
- @assertEqual( 0.0_r8, this%pftoverride%crop )
- @assertEqual( 100.0_r8, this%pftoverride%natveg )
- @assertEqual( 100.0_r8, sum(this%pftoverride%cft) )
- @assertEqual( 100.0_r8, sum(this%pftoverride%natpft) )
- @assertEqual( 100.0_r8, this%pftoverride%natpft(pftidx) )
-
- end subroutine test_runSetpft
-
- @Test
- subroutine test_runSetCrop(this)
- class(TestPFTORide), intent(inout) :: this
- integer :: cftidx
-
- cftidx = c3cropindex
- pft_idx(0) = cftidx
- pft_frc(0) = 100.0_r8
- zero_out = .false.
- all_veg_l = .true.
- call mkpftInit( zero_out, all_veg_l )
- call this%pftoverride%InitAllPFTIndex()
- @assertEqual( 100.0_r8, this%pftoverride%crop )
- @assertEqual( 0.0_r8, this%pftoverride%natveg )
- @assertEqual( 100.0_r8, sum(this%pftoverride%cft) )
- @assertEqual( 100.0_r8, sum(this%pftoverride%natpft) )
- @assertEqual( 100.0_r8, this%pftoverride%cft(numpft-cftidx) )
-
- end subroutine test_runSetCrop
-
- @Test
- subroutine test_runSetMix(this)
- class(TestPFTORide), intent(inout) :: this
- integer :: cftidx, cftidx2, pftidx2
- integer, parameter :: pftidx = 1
-
- zero_out = .false.
- pftidx2 = c3cropindex-1
- cftidx = c3cropindex
- cftidx2 = c3irrcropindex
- pft_idx(0:3) = (/ pftidx, pftidx2, cftidx, cftidx2 /)
- pft_frc(0:3) = (/ 25.0_r8, 25.0_r8, 25.0_r8, 25.0_r8 /)
- all_veg_l = .true.
- call mkpftInit( zero_out, all_veg_l )
- call this%pftoverride%InitAllPFTIndex()
- @assertEqual( 50.0_r8, this%pftoverride%crop )
- @assertEqual( 50.0_r8, this%pftoverride%natveg )
- @assertEqual( 100.0_r8, sum(this%pftoverride%cft) )
- @assertEqual( 100.0_r8, sum(this%pftoverride%natpft) )
- @assertEqual( 50.0_r8, this%pftoverride%natpft(pftidx) )
- @assertEqual( 50.0_r8, this%pftoverride%natpft(pftidx2) )
- @assertEqual( 50.0_r8, this%pftoverride%cft(1) )
- @assertEqual( 50.0_r8, this%pftoverride%cft(2) )
-
- end subroutine test_runSetMix
-
-end module test_pft_oride
diff --git a/tools/mksurfdata_map/src/test/mkpftmod_test/test_pftrun.pf b/tools/mksurfdata_map/src/test/mkpftmod_test/test_pftrun.pf
deleted file mode 100644
index 389748764b..0000000000
--- a/tools/mksurfdata_map/src/test/mkpftmod_test/test_pftrun.pf
+++ /dev/null
@@ -1,204 +0,0 @@
-module test_pftrun
-
- ! Tests of mkpftMod: pft_override functions
-
- use pfunit_mod
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkpftMod
- use mkvarctl, only: numpft
- use mkvarpar, only: numstdpft
- use mkpftConstantsMod, only: maxpft, c3cropindex, c3irrcropindex
- use mkpctPftTypeMod , only : pct_pft_type
- use mkdomainMod , only : domain_type, for_test_create_domain, domain_clean
-
- implicit none
- save
-
- @TestCase
- type, extends(TestCase) :: TestMkPFTRun
- character(len=12) :: mapfname
- character(len=12) :: fpft
- type(domain_type) :: ldomain
- integer :: ndiag
- real(r8), allocatable :: expected(:)
- real(r8) :: expected_cft(2)
- real(r8) :: expected_pft(0:14)
- type(pct_pft_type), allocatable :: pctnatpft(:) ! % of grid cell that is nat veg, and breakdown into PFTs
- real(r8), allocatable :: pctlnd_pft(:) ! PFT data: % of gridcell for PFTs
- type(pct_pft_type), allocatable :: pctcft(:) ! % of grid cell that is crop, and breakdown into CFTs
- contains
- procedure :: setUp
- procedure :: tearDown
- end type TestMkPFTRun
-
-contains
-
- subroutine setUp(this)
- class(TestMkPFTRun), intent(inout) :: this
- integer :: ns_o
-
- numpft = numstdpft
- pft_idx(0:maxpft) = -1
- pft_frc(0:maxpft) = 0.0_r8
- this%ndiag = 6
- this%mapfname = "none"
- this%fpft = "none"
- call for_test_create_domain( this%ldomain )
- ns_o = this%ldomain%ns
- allocate( this%pctnatpft(ns_o) )
- allocate( this%pctlnd_pft(ns_o) )
- allocate( this%pctcft(ns_o) )
- allocate( this%expected(ns_o) )
- this%expected = 0.0_r8
- this%expected_cft = 0.0_r8
- this%expected_pft = 0.0_r8
- end subroutine setUp
-
- subroutine tearDown(this)
- class(TestMkPFTRun), intent(inout) :: this
-
- deallocate( this%pctnatpft )
- deallocate( this%pctlnd_pft )
- deallocate( this%pctcft )
- deallocate( this%expected )
- call domain_clean( this%ldomain )
-
- end subroutine tearDown
-
- @Test
- subroutine test_runmkpftZero(this)
- class(TestMkPFTRun), intent(inout) :: this
- logical :: zero_out, all_veg_l
- integer :: n
-
- zero_out = .true.
- all_veg_l = .false.
- call mkpftInit( zero_out, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
- call mkpft(this%ldomain, this%mapfname, this%fpft, this%ndiag, &
- pctlnd_o=this%pctlnd_pft, pctnatpft_o=this%pctnatpft, pctcft_o=this%pctcft)
- this%expected = 100.0_r8
- @assertEqual( this%expected, this%pctlnd_pft )
- do n = 1, this%ldomain%ns
- @assertEqual( this%pctnatpft(n)%get_pct_l2g(), 0.0_r8 )
- @assertEqual( this%pctcft(n)%get_pct_l2g(), 0.0_r8 )
- this%expected_pft = 0.0_r8
- this%expected_pft(0) = 100.0_r8
- this%expected_cft = 0.0_r8
- this%expected_cft(1) = 100.0_r8
- @assertEqual( this%pctnatpft(n)%get_pct_p2l(), this%expected_pft )
- @assertEqual( this%pctcft(n)%get_pct_p2l(), this%expected_cft )
- end do
-
- end subroutine test_runmkpftZero
-
- @Test
- subroutine test_runmkpftPftORide(this)
- class(TestMkPFTRun), intent(inout) :: this
- logical :: zero_out, all_veg_l
- integer :: n
- integer, parameter :: pftidx = 1
-
- zero_out = .false.
- all_veg_l = .true.
- pft_idx(0) = pftidx
- pft_frc(0) = 100._r8
- call mkpftInit( zero_out, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
- call mkpft(this%ldomain, this%mapfname, this%fpft, this%ndiag, &
- pctlnd_o=this%pctlnd_pft, pctnatpft_o=this%pctnatpft, pctcft_o=this%pctcft)
- this%expected = 100.0_r8
- @assertEqual( this%expected, this%pctlnd_pft )
- do n = 1, this%ldomain%ns
- @assertEqual( this%pctnatpft(n)%get_pct_l2g(), 100.0_r8 )
- @assertEqual( this%pctcft(n)%get_pct_l2g(), 0.0_r8 )
- this%expected_pft = 0.0_r8
- this%expected_pft(pftidx) = 100.0_r8
- this%expected_cft = 0.0_r8
- this%expected_cft(1) = 100.0_r8
- @assertEqual( this%pctnatpft(n)%get_pct_p2l(), this%expected_pft )
- @assertEqual( this%pctcft(n)%get_pct_p2l(), this%expected_cft )
- end do
-
- end subroutine test_runmkpftPftORide
-
-
- @Test
- subroutine test_runmkpftPftORideWCrop(this)
- use mkvarpar, only: numstdpft, numstdcft
- class(TestMkPFTRun), intent(inout) :: this
- logical :: zero_out, all_veg_l
- integer :: n
- integer :: cftidx
- integer, parameter :: pftidx = 1
-
- cftidx = c3cropindex
- zero_out = .false.
- all_veg_l = .true.
- @assertLessThan( pftidx, numstdpft-numstdcft+1 )
- @assertGreaterThan( cftidx, numstdpft-numstdcft )
- pft_idx(0:1) = (/ pftidx, cftidx /)
- pft_frc(0:1) = (/ 50.0_r8, 50.0_r8 /)
- call mkpftInit( zero_out, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
- call mkpft(this%ldomain, this%mapfname, this%fpft, this%ndiag, &
- pctlnd_o=this%pctlnd_pft, pctnatpft_o=this%pctnatpft, pctcft_o=this%pctcft)
- this%expected = 100.0_r8
- @assertEqual( this%expected, this%pctlnd_pft )
- do n = 1, this%ldomain%ns
- @assertEqual( this%pctnatpft(n)%get_pct_l2g(), 50.0_r8 )
- @assertEqual( this%pctcft(n)%get_pct_l2g(), 50.0_r8 )
- this%expected_pft = 0.0_r8
- this%expected_pft(pftidx) = 100.0_r8
- this%expected_cft = 0.0_r8
- this%expected_cft(numstdpft-cftidx) = 100.0_r8
- @assertEqual( this%pctnatpft(n)%get_pct_p2l(), this%expected_pft )
- @assertEqual( this%pctcft(n)%get_pct_p2l(), this%expected_cft )
- end do
-
- end subroutine test_runmkpftPftORideWCrop
-
- @Test
- subroutine test_runmkpft4PftORideWCrop(this)
- use mkvarpar, only: numstdpft, numstdcft
- class(TestMkPFTRun), intent(inout) :: this
- logical :: zero_out, all_veg_l
- integer :: n
- integer :: cftidx, cftidx2
- integer, parameter :: pftidx = 1, pftidx2 = 2
-
- cftidx = c3cropindex
- cftidx2 = c3irrcropindex
- zero_out = .false.
- all_veg_l = .true.
- @assertLessThan( pftidx, numstdpft-numstdcft+1 )
- @assertLessThan( pftidx2, numstdpft-numstdcft+1 )
- @assertGreaterThan( cftidx, numstdpft-numstdcft )
- @assertGreaterThan( cftidx2, numstdpft-numstdcft )
- pft_idx(0:3) = (/ pftidx, pftidx2, cftidx, cftidx2 /)
- pft_frc(0:3) = (/ 25.0_r8, 25.0_r8, 25.0_r8, 25.0_r8 /)
- call mkpftInit( zero_out, all_veg_l )
- @assertTrue( use_input_pft )
- @assertTrue( presc_cover )
- call mkpft(this%ldomain, this%mapfname, this%fpft, this%ndiag, &
- pctlnd_o=this%pctlnd_pft, pctnatpft_o=this%pctnatpft, pctcft_o=this%pctcft)
- this%expected = 100.0_r8
- @assertEqual( this%expected, this%pctlnd_pft )
- do n = 1, this%ldomain%ns
- @assertEqual( this%pctnatpft(n)%get_pct_l2g(), 50.0_r8 )
- @assertEqual( this%pctcft(n)%get_pct_l2g(), 50.0_r8 )
- this%expected_pft = 0.0_r8
- this%expected_pft(pftidx) = 50.0_r8
- this%expected_pft(pftidx2) = 50.0_r8
- this%expected_cft = 50.0_r8
- @assertEqual( this%pctnatpft(n)%get_pct_p2l(), this%expected_pft )
- @assertEqual( this%pctcft(n)%get_pct_p2l(), this%expected_cft )
- end do
-
- end subroutine test_runmkpft4PftORideWCrop
-
-end module test_pftrun
diff --git a/tools/mksurfdata_map/src/test/mksoilUtils_test/CMakeLists.txt b/tools/mksurfdata_map/src/test/mksoilUtils_test/CMakeLists.txt
deleted file mode 100644
index 4d94b8114b..0000000000
--- a/tools/mksurfdata_map/src/test/mksoilUtils_test/CMakeLists.txt
+++ /dev/null
@@ -1,7 +0,0 @@
-set (pfunit_sources
- test_dominant_soil_color.pf)
-
-create_pFUnit_test(mksoilUtils test_mksoilUtils_exe
- "${pfunit_sources}" "")
-
-target_link_libraries(test_mksoilUtils_exe mksurfdat)
\ No newline at end of file
diff --git a/tools/mksurfdata_map/src/test/mksoilUtils_test/test_dominant_soil_color.pf b/tools/mksurfdata_map/src/test/mksoilUtils_test/test_dominant_soil_color.pf
deleted file mode 100644
index b506549d87..0000000000
--- a/tools/mksurfdata_map/src/test/mksoilUtils_test/test_dominant_soil_color.pf
+++ /dev/null
@@ -1,140 +0,0 @@
-module test_dominant_soil_color
-
- ! Tests of mksoilUtilsMod: dominant_soil_color
-
- use pfunit_mod
- use mksoilUtilsMod
- use shr_kind_mod , only : r8 => shr_kind_r8
- use mkgridmapMod, only : gridmap_type, gridmap_clean, for_test_create_gridmap
-
- implicit none
-
- @TestCase
- type, extends(TestCase) :: tdsc
- type(gridmap_type) :: gridmap
- contains
- procedure :: setUp
- procedure :: tearDown
- procedure :: createGridmap1dst
- end type tdsc
-
- real(r8), parameter :: tol = 1.e-13_r8
-
-contains
-
- subroutine setUp(this)
- class(tdsc), intent(inout) :: this
- end subroutine setUp
-
- subroutine tearDown(this)
- class(tdsc), intent(inout) :: this
- call gridmap_clean(this%gridmap)
- end subroutine tearDown
-
- subroutine createGridmap1dst(this, wovr)
- ! Create this%gridmap with a single destination point
- class(tdsc), intent(inout) :: this
- real(r8), intent(in) :: wovr(:) ! overlap weights
-
- integer :: i
- integer :: npts
- integer :: src_indx(size(wovr))
- integer :: dst_indx(size(wovr))
-
- dst_indx(:) = 1
- npts = size(wovr)
- src_indx(:) = [(i, i = 1, npts)]
-
- call for_test_create_gridmap(this%gridmap, na = npts, nb = 1, ns = npts, &
- src_indx = src_indx, dst_indx = dst_indx, wovr = wovr)
- end subroutine createGridmap1dst
-
- @Test
- subroutine equalWeights(this)
- ! Four inputs with equal weight; two of one class, one of each of two other classes
- class(tdsc), intent(inout) :: this
- integer :: mask_i(4)
- integer :: soil_color_i(4)
- integer :: soil_color_o(1)
-
- call this%createGridmap1dst([0.25_r8, 0.25_r8, 0.25_r8, 0.25_r8])
- mask_i(:) = 1
- soil_color_i(:) = [1, 2, 2, 3]
-
- call dominant_soil_color(this%gridmap, mask_i, soil_color_i, 20, soil_color_o)
-
- @assertEqual(2, soil_color_o(1))
- end subroutine equalWeights
-
- @Test
- subroutine inequalWeights(this)
- ! Four inputs with inequal weight
- class(tdsc), intent(inout) :: this
- integer :: mask_i(4)
- integer :: soil_color_i(4)
- integer :: soil_color_o(1)
-
- call this%createGridmap1dst([0.5_r8, 0.2_r8, 0.2_r8, 0.1_r8])
- mask_i(:) = 1
- soil_color_i(:) = [3, 1, 1, 2]
-
- call dominant_soil_color(this%gridmap, mask_i, soil_color_i, 20, soil_color_o)
-
- @assertEqual(3, soil_color_o(1))
- end subroutine inequalWeights
-
- @Test
- subroutine noColor(this)
- ! No color in input
- class(tdsc), intent(inout) :: this
- integer :: mask_i(4)
- integer :: soil_color_i(4)
- integer :: soil_color_o(1)
-
- call this%createGridmap1dst([0.25_r8, 0.25_r8, 0.25_r8, 0.25_r8])
- ! Some points are inside the mask with color = 0, other points are outside the mask
- mask_i(:) = [1, 0, 0, 1]
- soil_color_i(:) = [0, 1, 1, 0]
-
- call dominant_soil_color(this%gridmap, mask_i, soil_color_i, 20, soil_color_o)
-
- @assertEqual(15, soil_color_o(1))
- end subroutine noColor
-
- @Test
- subroutine noColorInFirstPoints(this)
- ! No color in the first points, but a color in the last point
- class(tdsc), intent(inout) :: this
- integer :: mask_i(4)
- integer :: soil_color_i(4)
- integer :: soil_color_o(1)
-
- call this%createGridmap1dst([0.25_r8, 0.25_r8, 0.25_r8, 0.25_r8])
- ! Some points are inside the mask with color = 0, other points are outside the mask
- mask_i(:) = 1
- soil_color_i(:) = [0, 0, 0, 1]
-
- call dominant_soil_color(this%gridmap, mask_i, soil_color_i, 20, soil_color_o)
-
- @assertEqual(1, soil_color_o(1))
- end subroutine noColorInFirstPoints
-
- @Test
- subroutine noColorInLastPoints(this)
- ! No color in the last points, but a color in the first point
- class(tdsc), intent(inout) :: this
- integer :: mask_i(4)
- integer :: soil_color_i(4)
- integer :: soil_color_o(1)
-
- call this%createGridmap1dst([0.25_r8, 0.25_r8, 0.25_r8, 0.25_r8])
- ! Some points are inside the mask with color = 0, other points are outside the mask
- mask_i(:) = 1
- soil_color_i(:) = [1, 0, 0, 0]
-
- call dominant_soil_color(this%gridmap, mask_i, soil_color_i, 20, soil_color_o)
-
- @assertEqual(1, soil_color_o(1))
- end subroutine noColorInLastPoints
-
-end module test_dominant_soil_color
diff --git a/tools/mksurfdata_map/src/unit_test_stubs/abort.F90 b/tools/mksurfdata_map/src/unit_test_stubs/abort.F90
deleted file mode 100644
index aa1d8b76c2..0000000000
--- a/tools/mksurfdata_map/src/unit_test_stubs/abort.F90
+++ /dev/null
@@ -1,25 +0,0 @@
-subroutine abort()
- ! Replacement for abort that throws a pfunit exception rather than aborting
- !
- ! This can be used to test expected errors (i.e., failure testing).
- !
- ! If this occurs within a pFUnit-based test:
- !
- ! - If you have code like:
- !
- ! @assertExceptionRaised("ABORTED:")
- !
- ! - If you don't have
- !
- ! @assertExceptionRaised
- !
- ! or
- !
- ! call assertExceptionRaised
- !
- ! then this will result in the given pFUnit test failing.
- use pfunit_mod, only : throw
- implicit none
-
- call throw("ABORTED:")
-end subroutine abort
diff --git a/tools/mksurfdata_map/src/unit_test_stubs/mkncdio.F90 b/tools/mksurfdata_map/src/unit_test_stubs/mkncdio.F90
deleted file mode 100644
index 1bf6a8afdf..0000000000
--- a/tools/mksurfdata_map/src/unit_test_stubs/mkncdio.F90
+++ /dev/null
@@ -1,223 +0,0 @@
-module mkncdio
- ! Stub of mkncdio for unit testing. This is enough to get other modules to compile, but
- ! it doesn't do anything useful.
-
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- public :: nf_open
- public :: nf_close
- public :: nf_strerror
- public :: nf_inq_dimid
- public :: nf_inq_dimname
- public :: nf_inq_dimlen
- public :: nf_inq_varid
- public :: nf_inq_varndims
- public :: nf_inq_vardimid
- public :: nf_get_var_double
- public :: nf_get_var_int
- public :: nf_get_vara_double
- public :: nf_get_att_double
- public :: ncd_defvar
- public :: ncd_def_spatial_var
-
- public :: get_dim_lengths
-
- public :: check_ret
- public :: convert_latlon
-
- interface nf_get_var_double
- module procedure nf_get_var_double_1d
- module procedure nf_get_var_double_2d
- end interface nf_get_var_double
-
- interface nf_get_vara_double
- module procedure nf_get_vara_double_2d
- end interface nf_get_vara_double
-
- integer, parameter, public :: nf_nowrite = 0
- integer, parameter, public :: nf_noerr = 0
- integer, parameter, public :: nf_max_name = 64
-
-contains
-
-!-----------------------------------------------------------------------
- subroutine ncd_defvar(ncid, varname, xtype, &
- dim1name, dim2name, &
- long_name, units )
-!
- implicit none
- integer , intent(in) :: ncid ! input unit
- character(len=*), intent(in) :: varname ! variable name
- integer , intent(in) :: xtype ! external type
- character(len=*), intent(in), optional :: dim1name ! dimension name
- character(len=*), intent(in), optional :: dim2name ! dimension name
- character(len=*), intent(in), optional :: long_name ! attribute
- character(len=*), intent(in), optional :: units ! attribute
-!
- end subroutine ncd_defvar
-
- !-----------------------------------------------------------------------
- subroutine ncd_def_spatial_var(ncid, varname, xtype, long_name, units, lev1name, lev2name)
- integer , intent(in) :: ncid ! input unit
- character(len=*) , intent(in) :: varname ! variable name
- integer , intent(in) :: xtype ! external type
- character(len=*) , intent(in) :: long_name ! attribute
- character(len=*) , intent(in) :: units ! attribute
- character(len=*) , optional, intent(in) :: lev1name ! name of first level (or time) dimension
- character(len=*) , optional, intent(in) :: lev2name ! name of second level (or time) dimension
- end subroutine ncd_def_spatial_var
-
- subroutine get_dim_lengths(ncid, varname, ndims, dim_lengths)
- integer , intent(in) :: ncid ! netcdf id of an open netcdf file
- character(len=*), intent(in) :: varname ! name of variable of interest
- integer , intent(out):: ndims ! number of dimensions of variable
- integer , intent(out):: dim_lengths(:) ! lengths of dimensions of variable
-
- dim_lengths = 0
- end subroutine get_dim_lengths
-
- integer function nf_open(filename, mode, ncid)
- character(len=*), intent(in) :: filename
- integer, intent(in) :: mode
- integer, intent(out) :: ncid
-
- ncid = 0
- nf_open = 0
- end function nf_open
-
- integer function nf_close(ncid)
- integer, intent(in) :: ncid
-
- nf_close = 0
- end function nf_close
-
- function nf_strerror(rcode)
- character(len=16) :: nf_strerror
- integer, intent(in) :: rcode
-
- nf_strerror = 'N/A'
- end function nf_strerror
-
- integer function nf_inq_dimid(ncid, dimname, did)
- integer, intent(in) :: ncid
- character(len=*), intent(in) :: dimname
- integer, intent(out) :: did
-
- did = 0
- nf_inq_dimid = 0
- end function nf_inq_dimid
-
- integer function nf_inq_dimname(ncid, dimid, dimname)
- integer, intent(in) :: ncid
- integer, intent(in) :: dimid
- character(len=*), intent(out) :: dimname
-
- dimname = 'none'
- nf_inq_dimname = 0
- end function nf_inq_dimname
-
- integer function nf_inq_dimlen(ncid, did, dimlen)
- integer, intent(in) :: ncid
- integer, intent(in) :: did
- integer, intent(out) :: dimlen
-
- dimlen = 0
- nf_inq_dimlen = 0
- end function nf_inq_dimlen
-
- integer function nf_inq_varid(ncid, varname, vid)
- integer, intent(in) :: ncid
- character(len=*), intent(in) :: varname
- integer, intent(out) :: vid
-
- vid = 0
- nf_inq_varid = 0
- end function nf_inq_varid
-
- integer function nf_inq_varndims(ncid, varid, ndims)
- integer, intent(in) :: ncid
- integer, intent(in) :: varid
- integer, intent(out) :: ndims
-
- ndims = 0
- nf_inq_varndims = 0
- end function nf_inq_varndims
-
- integer function nf_inq_vardimid(ncid, varid, dimids)
- integer, intent(in) :: ncid
- integer, intent(in) :: varid
- integer, intent(out) :: dimids(:)
-
- dimids(:) = 0
- nf_inq_vardimid = 0
- end function nf_inq_vardimid
-
- integer function nf_get_var_double_1d(ncid, vid, data)
- integer, intent(in) :: ncid
- integer, intent(in) :: vid
- real(r8), intent(out) :: data(:)
-
- data(:) = 0._r8
- nf_get_var_double_1d = 0
- end function nf_get_var_double_1d
-
- integer function nf_get_var_double_2d(ncid, vid, data)
- integer, intent(in) :: ncid
- integer, intent(in) :: vid
- real(r8), intent(out) :: data(:,:)
-
- data(:,:) = 0._r8
- nf_get_var_double_2d = 0
- end function nf_get_var_double_2d
-
- integer function nf_get_var_int(ncid, vid, data)
- integer, intent(in) :: ncid
- integer, intent(in) :: vid
- integer, intent(out) :: data(:)
-
- data(:) = 0
- nf_get_var_int = 0
- end function nf_get_var_int
-
- integer function nf_get_vara_double_2d(ncid, varid, starts, counts, data)
- integer, intent(in) :: ncid
- integer, intent(in) :: varid
- integer, intent(in) :: starts(:)
- integer, intent(in) :: counts(:)
- real(r8), intent(out) :: data(:,:)
-
- data(:,:) = 0._r8
- nf_get_vara_double_2d = 0
- end function nf_get_vara_double_2d
-
- integer function nf_get_att_double(ncid, varid, attname, attval)
- integer, intent(in) :: ncid
- integer, intent(in) :: varid
- character(len=*), intent(in) :: attname
- real(r8), intent(out) :: attval
-
- attval = 0._r8
- nf_get_att_double = 0
- end function nf_get_att_double
-
- subroutine check_ret(ret, calling, varexists)
- integer, intent(in) :: ret
- character(len=*), intent(in) :: calling
- logical, intent(out), optional :: varexists
-
- if (present(varexists)) then
- varexists = .true.
- end if
- end subroutine check_ret
-
- subroutine convert_latlon(ncid, varname, data)
- integer, intent(in) :: ncid
- character(len=*), intent(in) :: varname
- real(r8), intent(inout) :: data(:)
-
- end subroutine convert_latlon
-
-end module mkncdio
diff --git a/tools/mksurfdata_map/unit_testers/Filepath b/tools/mksurfdata_map/unit_testers/Filepath
deleted file mode 100644
index f5228276ec..0000000000
--- a/tools/mksurfdata_map/unit_testers/Filepath
+++ /dev/null
@@ -1,2 +0,0 @@
-.
-../src
diff --git a/tools/mksurfdata_map/unit_testers/Makefile b/tools/mksurfdata_map/unit_testers/Makefile
deleted file mode 100644
index 7260c828d8..0000000000
--- a/tools/mksurfdata_map/unit_testers/Makefile
+++ /dev/null
@@ -1,10 +0,0 @@
-# Makefile for mksurfdata_map unit testing
-
-EXENAME = ../test_mksurfdata_map
-
-# Set optimization off by default
-ifeq ($(OPT),$(null))
- OPT := FALSE
-endif
-
-include ../src/Makefile.common
\ No newline at end of file
diff --git a/tools/mksurfdata_map/unit_testers/README b/tools/mksurfdata_map/unit_testers/README
deleted file mode 100644
index 8620c3cc6d..0000000000
--- a/tools/mksurfdata_map/unit_testers/README
+++ /dev/null
@@ -1,6 +0,0 @@
-This directory contains source code for building unit tests for
-mksurfdata_map
-
-test_mod.F90 was copied from
-https://svn-ccsm-models.cgd.ucar.edu/csm_share/trunk/unit_testers/test_mod.F90
-
diff --git a/tools/mksurfdata_map/unit_testers/Srcfiles b/tools/mksurfdata_map/unit_testers/Srcfiles
deleted file mode 100644
index 3ee42a79bb..0000000000
--- a/tools/mksurfdata_map/unit_testers/Srcfiles
+++ /dev/null
@@ -1,32 +0,0 @@
-test_mksurfdata_map.F90
-test_mkdomainMod.F90
-test_mkindexmapMod.F90
-test_mkgridmapMod.F90
-test_mkchecksMod.F90
-test_mkurbanparMod.F90
-test_mkutilsMod.F90
-test_mkharvest.F90
-test_mkncdio.F90
-test_mod.F90
-mkindexmapMod.F90
-mkchecksMod.F90
-mkharvestMod.F90
-mkurbanparMod.F90
-mkdiagnosticsMod.F90
-mkurbanparCommonMod.F90
-mkutilsMod.F90
-mkdomainMod.F90
-mkvarpar.F90
-mkgridmapMod.F90
-mkncdio.F90
-mkvarctl.F90
-nanMod.F90
-fileutils.F90
-shr_const_mod.F90
-shr_kind_mod.F90
-shr_sys_mod.F90
-shr_log_mod.F90
-shr_string_mod.F90
-shr_timer_mod.F90
-shr_file_mod.F90
-
diff --git a/tools/mksurfdata_map/unit_testers/test_mkchecksMod.F90 b/tools/mksurfdata_map/unit_testers/test_mkchecksMod.F90
deleted file mode 100644
index edec7643e5..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkchecksMod.F90
+++ /dev/null
@@ -1,101 +0,0 @@
-module test_mkchecksMod
-! Module for testing mkchecksMod
-
- use mkchecksMod
- use test_mod
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- public :: test_min_bad
- public :: test_max_bad
-
- character(len=*), parameter :: modname = 'test_mkchecksMod'
-
-contains
-
-!------------------------------------------------------------------------------
- subroutine test_min_bad
-
- implicit none
-
- character(len=128) :: testname
- logical :: test_result
-
- character(len=*), parameter :: subname = 'test_min_bad'
-
- ! Tests for r8
-
- testname = 'r8 - pass'
- test_result = min_bad((/1._r8,2._r8,3._r8/), 0._r8, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'r8 - pass on border'
- test_result = min_bad((/1._r8,2._r8,3._r8/), 1._r8, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- ! Note that we expect output to stdout from the following test that indicates an error
- testname = 'r8 - fail'
- test_result = min_bad((/1._r8,2._r8,3._r8/), 1.5_r8, 'testvar')
- call test_is(test_result .eqv. .true., modname//' -- '//subname//' -- '//trim(testname))
-
- ! Tests for int
-
- testname = 'int - pass'
- test_result = min_bad((/1,2,3/), 0, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'int - pass on border'
- test_result = min_bad((/1,2,3/), 1, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- ! Note that we expect output to stdout from the following test that indicates an error
- testname = 'int - fail'
- test_result = min_bad((/1,2,3/), 2, 'testvar')
- call test_is(test_result .eqv. .true., modname//' -- '//subname//' -- '//trim(testname))
-
- end subroutine test_min_bad
-
-!------------------------------------------------------------------------------
- subroutine test_max_bad
-
- implicit none
-
- character(len=128) :: testname
- logical :: test_result
-
- character(len=*), parameter :: subname = 'test_max_bad'
-
- ! Tests for r8
-
- testname = 'r8 - pass'
- test_result = max_bad((/1._r8,2._r8,3._r8/), 4._r8, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'r8 - pass on border'
- test_result = max_bad((/1._r8,2._r8,3._r8/), 3._r8, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- ! Note that we expect output to stdout from the following test that indicates an error
- testname = 'r8 - fail'
- test_result = max_bad((/1._r8,2._r8,3._r8/), 2.5_r8, 'testvar')
- call test_is(test_result .eqv. .true., modname//' -- '//subname//' -- '//trim(testname))
-
- ! Tests for int
-
- testname = 'int - pass'
- test_result = max_bad((/1,2,3/), 4, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'int - pass on border'
- test_result = max_bad((/1,2,3/), 3, 'testvar')
- call test_is(test_result .eqv. .false., modname//' -- '//subname//' -- '//trim(testname))
-
- ! Note that we expect output to stdout from the following test that indicates an error
- testname = 'int - fail'
- test_result = max_bad((/1,2,3/), 2, 'testvar')
- call test_is(test_result .eqv. .true., modname//' -- '//subname//' -- '//trim(testname))
-
- end subroutine test_max_bad
-end module test_mkchecksMod
diff --git a/tools/mksurfdata_map/unit_testers/test_mkdomainMod.F90 b/tools/mksurfdata_map/unit_testers/test_mkdomainMod.F90
deleted file mode 100644
index 56a37e7f28..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkdomainMod.F90
+++ /dev/null
@@ -1,95 +0,0 @@
-module test_mkdomainMod
-! Module for testing mkindexmapMod
-
- use mkdomainMod
- use test_mod
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- public :: test_domain_read_dims
-
- character(len=*), parameter :: modname = 'test_mkdomainMod'
-
-contains
-
-!------------------------------------------------------------------------------
- subroutine test_domain_read_dims
-
- use mkncdio
-
- implicit none
-
- type(domain_type) :: domain
- integer :: ncid
- character(len=128) :: testname
-
- integer :: ni_t, nj_t, ns_t
- logical :: is_2d_t
-
- character(len=*), parameter :: subname = 'test_domain_read_dims'
-
- testname = 'lon'
- call check_ret(nf_open('unit_testers/inputs/test_domain_read_dims__lon.nc', 0, ncid), subname)
- ni_t = 2
- nj_t = 3
- ns_t = 6
- is_2d_t = .true.
- call domain_read_dims(domain, ncid)
- call check_results_2d
-
- testname = 'lsmlon'
- call check_ret(nf_open('unit_testers/inputs/test_domain_read_dims__lsmlon.nc', 0, ncid), subname)
- ni_t = 3
- nj_t = 4
- ns_t = 12
- is_2d_t = .true.
- call domain_read_dims(domain, ncid)
- call check_results_2d
-
- ! When we have both 'lon' and 'ni', should use 'ni'
- testname = 'lon_and_ni'
- call check_ret(nf_open('unit_testers/inputs/test_domain_read_dims__lon_and_ni.nc', 0, ncid), subname)
- ni_t = 4
- nj_t = 5
- ns_t = 20
- is_2d_t = .true.
- call domain_read_dims(domain, ncid)
- call check_results_2d
-
- ! test 1-d
- testname = 'num_pixels'
- call check_ret(nf_open('unit_testers/inputs/test_domain_read_dims__num_pixels.nc', 0, ncid), subname)
- ns_t = 17
- is_2d_t = .false.
- call domain_read_dims(domain, ncid)
- call check_results_1d
-
- ! When we have both 2-d and 1-d info, should use 2-d info
- testname = 'lon_and_num_pixels'
- call check_ret(nf_open('unit_testers/inputs/test_domain_read_dims__lon_and_num_pixels.nc', 0, ncid), subname)
- ni_t = 2
- nj_t = 3
- ns_t = 6
- is_2d_t = .true.
- call domain_read_dims(domain, ncid)
- call check_results_2d
-
- contains
- subroutine check_results_1d
- call test_is(domain%ns, ns_t, modname//' -- '//subname//' -- '//trim(testname)//' -- ns')
- call test_is((domain%is_2d .eqv. is_2d_t), modname//' -- '//subname//' -- '//trim(testname)//' -- is_2d')
- end subroutine check_results_1d
-
- subroutine check_results_2d
- call test_is(domain%ns, ns_t, modname//' -- '//subname//' -- '//trim(testname)//' -- ns')
- call test_is(domain%ni, ni_t, modname//' -- '//subname//' -- '//trim(testname)//' -- ni')
- call test_is(domain%nj, nj_t, modname//' -- '//subname//' -- '//trim(testname)//' -- nj')
- call test_is((domain%is_2d .eqv. is_2d_t), modname//' -- '//subname//' -- '//trim(testname)//' -- is_2d')
- end subroutine check_results_2d
- end subroutine test_domain_read_dims
-end module test_mkdomainMod
-
-
-
diff --git a/tools/mksurfdata_map/unit_testers/test_mkgridmapMod.F90 b/tools/mksurfdata_map/unit_testers/test_mkgridmapMod.F90
deleted file mode 100644
index b802d16162..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkgridmapMod.F90
+++ /dev/null
@@ -1,664 +0,0 @@
-module test_mkgridmapMod
- ! Module for testing mkgridmapMod
-
- use mkgridmapMod
- use test_mod
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- public :: test_gridmap_areastddev
- public :: test_gridmap_areaave_default
- public :: test_gridmap_areaave_srcmask
- public :: test_gridmap_areaave_srcmask2
-
- character(len=*), parameter :: modname = 'test_mkgridmapMod'
-
-contains
-
- !------------------------------------------------------------------------------
- subroutine test_gridmap_areaave_default
-
- implicit none
-
- type(gridmap_type) :: gridmap
- character(len=128) :: testname
-
- real(r8), allocatable :: src_array(:)
- real(r8), allocatable :: dst_array(:)
- real(r8), allocatable :: dst_array_t(:)
-
- real(r8), parameter :: nodata = -1._r8
- real(r8), parameter :: eps = 1.e-13_r8
-
- character(len=*), parameter :: subname = 'test_gridmap_areaave_default'
-
- ! Note about the gridmaps for the tests here:
- ! For most tests here, the test arrays are: (1) simple case, (2) the main case to
- ! test, (3) simple case. Thus, the main case in question is #2 of 3, and we're always
- ! basically just testing one scenario in each call to the subroutine (rather than
- ! doing a bunch of tests at once, which could make setting up the test arrays more
- ! error-prone).
-
- ! Set up a gridmap with 0 weight of overlap on dest #2
- gridmap%na = 4
- gridmap%nb = 3
- gridmap%ns = 4
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4/)
- gridmap%dst_indx = (/1,1,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.25_r8,0.75_r8/) ! weights of sources 3:4 on test 3
- gridmap%frac_dst = (/1.0, 0.0, 1.0/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'no overlap'
- src_array = (/0.1_r8,0.2_r8,0.3_r8,0.4_r8/)
- dst_array_t = (/0.125_r8, nodata, 0.375_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, dst_array, dst_array_t)
-
- ! Set up a gridmap with a single point overlapping dest #2
- gridmap%na = 5
- gridmap%nb = 3
- gridmap%ns = 5
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4,5/)
- gridmap%dst_indx = (/1,1,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 1.0_r8, & ! weight of source 3 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 4:5 on test 3
- gridmap%frac_dst = (/1.0, 1.0, 1.0/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'single overlap'
- src_array = (/0.1_r8,0.2_r8,0.5_r8,0.3_r8,0.4_r8/)
- dst_array_t = (/0.125_r8, 0.5_r8, 0.375_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- ! Now change the overlap point to have weight=0
- testname = 'single overlap with 0 weight'
- gridmap%wovr(3) = 0.0_r8
- gridmap%frac_dst(2) = 0.0_r8
- dst_array_t(2) = nodata
- call gridmap_areaave(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, dst_array, dst_array_t)
-
- ! Set up a gridmap for the remaining tests
- ! This gridmap will have 3 src cells, 9 dest cells, and:
- ! src 1: just overlaps with dst 1
- ! src 2: overlaps with dst 1 & dst 2
- ! src 3..7: just overlaps with dst 2
- ! src 8: overlaps with dst 2 & dst 3
- ! src 9: just overlaps with dst 3
- gridmap%na = 9
- gridmap%nb = 3
- gridmap%ns = 11
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,2,3,4,5,6,7,8,8,9/)
- gridmap%dst_indx = (/1,1,2,2,2,2,2,2,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.05_r8,0.05_r8,0.1_r8,0.3_r8,0.2_r8,0.15_r8,0.15_r8, & ! weights of sources 2:8 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 8:9 on test 3
- gridmap%frac_dst = (/1.0_r8, 1.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
-
-
- testname='multiple overlaps, all the same value'
- src_array = (/0.1_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.6_r8/)
- dst_array_t = (/0.2_r8, 0.5_r8, 0.575_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- testname='multiple overlaps, different values'
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- ! dividing the weights by 2 shouldn't affect the mean
- testname='weights divided by 2'
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
-
- ! using frac_dst > 1 should be okay
- testname='frac_dst > 1'
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
-
- deallocate(src_array, dst_array, dst_array_t)
-
- end subroutine test_gridmap_areaave_default
-
- !------------------------------------------------------------------------------
- subroutine test_gridmap_areaave_srcmask
-
- implicit none
-
- type(gridmap_type) :: gridmap
- character(len=128) :: testname
-
- real(r8), allocatable :: src_array(:)
- real(r8), allocatable :: mask_src(:)
- real(r8), allocatable :: dst_array(:)
- real(r8), allocatable :: dst_array_t(:)
-
- real(r8), parameter :: nodata = -1._r8
- real(r8), parameter :: eps = 1.e-13_r8
-
- character(len=*), parameter :: subname = 'test_gridmap_areaave_srcmask'
-
- ! Note about the gridmaps for the tests here:
- ! For most tests here, the test arrays are: (1) simple case, (2) the main case to
- ! test, (3) simple case. Thus, the main case in question is #2 of 3, and we're always
- ! basically just testing one scenario in each call to the subroutine (rather than
- ! doing a bunch of tests at once, which could make setting up the test arrays more
- ! error-prone).
-
- ! Set up a gridmap with 0 weight of overlap on dest #2
- gridmap%na = 4
- gridmap%nb = 3
- gridmap%ns = 4
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4/)
- gridmap%dst_indx = (/1,1,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.25_r8,0.75_r8/) ! weights of sources 3:4 on test 3
- gridmap%frac_dst = (/1.0, 0.0, 1.0/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- mask_src (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'no overlap'
- src_array = (/0.1_r8,0.2_r8,0.3_r8,0.4_r8/)
- mask_src(:) = 1.0_r8
- dst_array_t = (/0.125_r8, nodata, 0.375_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, mask_src, dst_array, dst_array_t)
-
- ! Set up a gridmap with a single point overlapping dest #2
- gridmap%na = 5
- gridmap%nb = 3
- gridmap%ns = 5
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4,5/)
- gridmap%dst_indx = (/1,1,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 1.0_r8, & ! weight of source 3 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 4:5 on test 3
- gridmap%frac_dst = (/1.0, 1.0, 1.0/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- mask_src (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'single overlap'
- src_array = (/0.1_r8,0.2_r8,0.5_r8,0.3_r8,0.4_r8/)
- mask_src(:) = 1.0_r8
- dst_array_t = (/0.125_r8, 0.5_r8, 0.375_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- ! Now change the overlap point to have src_mask=0
- testname = 'single overlap with 0 src_mask'
- mask_src(3) = 0.0_r8
- dst_array_t(2) = nodata
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, mask_src, dst_array, dst_array_t)
-
- ! Set up a gridmap for the remaining tests
- ! This gridmap will have 3 src cells, 9 dest cells, and:
- ! src 1: just overlaps with dst 1
- ! src 2: overlaps with dst 1 & dst 2
- ! src 3..7: just overlaps with dst 2
- ! src 8: overlaps with dst 2 & dst 3
- ! src 9: just overlaps with dst 3
- gridmap%na = 9
- gridmap%nb = 3
- gridmap%ns = 11
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,2,3,4,5,6,7,8,8,9/)
- gridmap%dst_indx = (/1,1,2,2,2,2,2,2,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.05_r8,0.05_r8,0.1_r8,0.3_r8,0.2_r8,0.15_r8,0.15_r8, & ! weights of sources 2:8 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 8:9 on test 3
- gridmap%frac_dst = (/1.0_r8, 1.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- mask_src (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
-
-
- testname='multiple overlaps, all the same value'
- src_array = (/0.1_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.6_r8/)
- mask_src(:) = 1.0_r8
- dst_array_t = (/0.2_r8, 0.5_r8, 0.575_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- testname='multiple overlaps, different values'
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = 1.0_r8
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- testname='multiple overlaps, different values, srcmask'
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = (/1.0_r8, 1.0_r8, 0.0_r8, 0.5_r8, 1.0_r8, 0.5_r8, 0.0_r8, 1.0_r8, 1.0_r8/)
- dst_array_t = (/0.125_r8, 0.923076923076923_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- ! dividing the weights by 2 and dividing mask_src by a constant shouldn't affect the mean
- testname='weights divided by 2'
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = 0.25_r8
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
-
- ! using frac_dst > 1 should be okay
- testname='frac_dst > 1'
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = 0.25_r8
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
-
-
- deallocate(src_array, mask_src, dst_array, dst_array_t)
-
- end subroutine test_gridmap_areaave_srcmask
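The expected values in the tests above follow from the masked, area-weighted mean that `gridmap_areaave` is expected to compute: for each destination cell, sum `wovr * mask_src * src` over the overlaps mapped to it and divide by the summed `wovr * mask_src`, returning `nodata` when that denominator is zero. A minimal Python sketch of that reduction (the function name `areaave` is purely illustrative), checked against the 9-source/3-destination gridmap and the 'srcmask' test data used here:

```python
# Illustrative re-implementation of the masked area-weighted average
# (a sketch, not the actual Fortran gridmap_areaave):
#   dst(d) = sum_i(wovr_i * mask_src_i * src_i) / sum_i(wovr_i * mask_src_i)
# over overlaps i mapped to destination d; nodata if the denominator is 0.
def areaave(src_indx, dst_indx, wovr, src, mask_src, nb, nodata=-1.0):
    num = [0.0] * nb
    den = [0.0] * nb
    for s, d, w in zip(src_indx, dst_indx, wovr):
        num[d - 1] += w * mask_src[s - 1] * src[s - 1]
        den[d - 1] += w * mask_src[s - 1]
    return [n / d if d > 0.0 else nodata for n, d in zip(num, den)]

# Gridmap and data from the 'multiple overlaps, different values, srcmask' test
src_indx = [1, 2, 2, 3, 4, 5, 6, 7, 8, 8, 9]
dst_indx = [1, 1, 2, 2, 2, 2, 2, 2, 2, 3, 3]
wovr = [0.75, 0.25, 0.05, 0.05, 0.1, 0.3, 0.2, 0.15, 0.15, 0.25, 0.75]
src = [0.1, 0.2, 0.3, 0.7, 0.5, 1.5, 0.5, 1.7, 1.8]
mask = [1.0, 1.0, 0.0, 0.5, 1.0, 0.5, 0.0, 1.0, 1.0]
out = areaave(src_indx, dst_indx, wovr, src, mask, nb=3)
# out is close to [0.125, 0.923076923076923, 1.775]
```

Because the result is a ratio, scaling all the weights by a constant, or scaling `mask_src` by a constant, cancels out, which is why the 'weights divided by 2' and 'frac_dst > 1' tests expect unchanged means.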
-
- !------------------------------------------------------------------------------
- subroutine test_gridmap_areaave_srcmask2
-
- implicit none
-
- type(gridmap_type) :: gridmap
- character(len=128) :: testname
-
- real(r8), allocatable :: src_array(:)
- real(r8), allocatable :: mask_src(:)
- real(r8), allocatable :: dst_array(:)
- real(r8), allocatable :: mask_dst(:)
- real(r8), allocatable :: dst_array_t(:)
-
- real(r8), parameter :: mask_dst_min = 0.0_r8
- real(r8), parameter :: nodata = -1._r8
- real(r8), parameter :: eps = 1.e-13_r8
-
- character(len=*), parameter :: subname = 'test_gridmap_areaave_srcmask2'
-
- ! Note about the gridmaps for the tests here:
- ! For most tests here, the test arrays are: (1) simple case, (2) the main case to
- ! test, (3) simple case. Thus, the main case in question is #2 of 3, and we're always
- ! basically just testing one scenario in each call to the subroutine (rather than
- ! doing a bunch of tests at once, which could make setting up the test arrays more
- ! error-prone).
-
- ! Set up a gridmap with 0 weight of overlap on dest #2
- gridmap%na = 4
- gridmap%nb = 3
- gridmap%ns = 4
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4/)
- gridmap%dst_indx = (/1,1,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.25_r8,0.75_r8/) ! weights of sources 3:4 on dest 3
- gridmap%frac_dst = (/1.0_r8, 0.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- mask_src (gridmap%na), &
- dst_array (gridmap%nb), &
- mask_dst (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'no overlap'
- src_array = (/0.1_r8,0.2_r8,0.3_r8,0.4_r8/)
- mask_src(:) = 1.0_r8
- mask_dst(:) = 1.0_r8
- dst_array_t = (/0.125_r8, nodata, 0.375_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, mask_src, dst_array, mask_dst, dst_array_t)
-
- ! Set up a gridmap with a single point overlapping dest #2
- gridmap%na = 5
- gridmap%nb = 3
- gridmap%ns = 5
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4,5/)
- gridmap%dst_indx = (/1,1,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 1.0_r8, & ! weight of source 3 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 4:5 on dest 3
- gridmap%frac_dst = (/1.0_r8, 1.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- mask_src (gridmap%na), &
- dst_array (gridmap%nb), &
- mask_dst (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'single overlap'
- src_array = (/0.1_r8,0.2_r8,0.5_r8,0.3_r8,0.4_r8/)
- mask_src(:) = 1.0_r8
- mask_dst(:) = 1.0_r8
- dst_array_t = (/0.125_r8, 0.5_r8, 0.375_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- ! Now change the overlap point to have src_mask=0
- testname = 'single overlap with 0 src_mask'
- mask_src(3) = 0.0_r8
- mask_dst(:) = 1.0_r8
- dst_array_t(2) = nodata
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, mask_src, dst_array, mask_dst, dst_array_t)
-
- ! Set up a gridmap for the remaining tests
- ! This gridmap will have 9 src cells, 3 dest cells, and:
- ! src 1: just overlaps with dst 1
- ! src 2: overlaps with dst 1 & dst 2
- ! src 3..7: just overlaps with dst 2
- ! src 8: overlaps with dst 2 & dst 3
- ! src 9: just overlaps with dst 3
- gridmap%na = 9
- gridmap%nb = 3
- gridmap%ns = 11
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,2,3,4,5,6,7,8,8,9/)
- gridmap%dst_indx = (/1,1,2,2,2,2,2,2,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.05_r8,0.05_r8,0.1_r8,0.3_r8,0.2_r8,0.15_r8,0.15_r8, & ! weights of sources 2:8 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 8:9 on dest 3
- gridmap%frac_dst = (/1.0_r8, 1.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- mask_src (gridmap%na), &
- dst_array (gridmap%nb), &
- mask_dst (gridmap%nb), &
- dst_array_t(gridmap%nb))
-
-
- testname='multiple overlaps, all the same value'
- src_array = (/0.1_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.6_r8/)
- mask_src(:) = 1.0_r8
- mask_dst(:) = 1.0_r8
- dst_array_t = (/0.2_r8, 0.5_r8, 0.575_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- testname='multiple overlaps, different values'
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = 1.0_r8
- mask_dst(:) = 1.0_r8
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- testname='multiple overlaps, different values, dst mask'
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = 1.0_r8
- mask_dst(:) = (/1.0_r8, 0.0_r8, 1.0_r8/)
- dst_array_t = (/0.125_r8, nodata, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- testname='multiple overlaps, different values, srcmask'
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = (/1.0_r8, 1.0_r8, 0.0_r8, 0.5_r8, 1.0_r8, 0.5_r8, 0.0_r8, 1.0_r8, 1.0_r8/)
- mask_dst(:) = 1.0_r8
- dst_array_t = (/0.125_r8, 0.923076923076923_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- ! dividing the weights by 2 and dividing mask_src by a constant shouldn't affect the mean
- testname='weights divided by 2'
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = 0.25_r8
- mask_dst(:) = 1.0_r8
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
-
- ! using frac_dst > 1 should be okay
- testname='frac_dst > 1'
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- mask_src(:) = 0.25_r8
- mask_dst(:) = 1.0_r8
- dst_array_t = (/0.125_r8, 0.875_r8, 1.775_r8/)
- call gridmap_areaave(gridmap, src_array, dst_array, nodata, mask_src, mask_dst, mask_dst_min)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
-
- deallocate(src_array, mask_src, dst_array, mask_dst, dst_array_t)
-
- end subroutine test_gridmap_areaave_srcmask2
-
- !------------------------------------------------------------------------------
- subroutine test_gridmap_areastddev
-
- implicit none
-
- type(gridmap_type) :: gridmap
- character(len=128) :: testname
-
- real(r8), allocatable :: src_array(:)
- real(r8), allocatable :: dst_array(:)
- real(r8), allocatable :: dst_array_t(:)
-
- real(r8), parameter :: nodata = -1._r8
- real(r8), parameter :: eps = 1.e-13_r8
-
- character(len=*), parameter :: subname = 'test_gridmap_areastddev'
-
- ! Note about the gridmaps for the tests here:
- ! For most tests here, the test arrays are: (1) simple case, (2) the main case to
- ! test, (3) simple case. Thus, the main case in question is #2 of 3, and we're always
- ! basically just testing one scenario in each call to the subroutine (rather than
- ! doing a bunch of tests at once, which could make setting up the test arrays more
- ! error-prone).
-
- ! Set up a gridmap with 0 weight of overlap on dest #2
- gridmap%na = 4
- gridmap%nb = 3
- gridmap%ns = 4
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4/)
- gridmap%dst_indx = (/1,1,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.25_r8,0.75_r8/) ! weights of sources 3:4 on dest 3
- gridmap%frac_dst = (/1.0_r8, 0.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'no overlap'
- src_array = (/0.1_r8,0.2_r8,0.3_r8,0.4_r8/)
- dst_array_t = (/0.04330127018922193_r8, nodata, 0.04330127018922195_r8/)
- call gridmap_areastddev(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, dst_array, dst_array_t)
-
- ! Set up a gridmap with a single point overlapping dest #2
- gridmap%na = 5
- gridmap%nb = 3
- gridmap%ns = 5
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,3,4,5/)
- gridmap%dst_indx = (/1,1,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 1.0_r8, & ! weight of source 3 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 4:5 on dest 3
- gridmap%frac_dst = (/1.0_r8, 1.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
- testname = 'single overlap'
- src_array = (/0.1_r8,0.2_r8,0.5_r8,0.3_r8,0.4_r8/)
- dst_array_t = (/0.04330127018922193_r8, 0.0_r8, 0.04330127018922195_r8/)
- call gridmap_areastddev(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, gridmap%frac_dst)
- deallocate(src_array, dst_array, dst_array_t)
-
- ! Set up a gridmap for the remaining tests
- ! This gridmap will have 9 src cells, 3 dest cells, and:
- ! src 1: just overlaps with dst 1
- ! src 2: overlaps with dst 1 & dst 2
- ! src 3..7: just overlaps with dst 2
- ! src 8: overlaps with dst 2 & dst 3
- ! src 9: just overlaps with dst 3
- gridmap%na = 9
- gridmap%nb = 3
- gridmap%ns = 11
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns), &
- gridmap%frac_dst(gridmap%nb))
- gridmap%src_indx = (/1,2,2,3,4,5,6,7,8,8,9/)
- gridmap%dst_indx = (/1,1,2,2,2,2,2,2,2,3,3/)
- gridmap%wovr = (/0.75_r8,0.25_r8, & ! weights of sources 1:2 on dest 1
- 0.05_r8,0.05_r8,0.1_r8,0.3_r8,0.2_r8,0.15_r8,0.15_r8, & ! weights of sources 2:8 on dest 2
- 0.25_r8,0.75_r8/) ! weights of sources 8:9 on dest 3
- gridmap%frac_dst = (/1.0_r8, 1.0_r8, 1.0_r8/)
- gridmap%set = 'gridmap_IsSet'
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb))
-
-
- testname='multiple overlaps, all the same value'
- src_array = (/0.1_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.5_r8, 0.6_r8/)
- dst_array_t = (/0.1732050807568877_r8, 0.0_r8, 0.04330127018922193_r8/)
- call gridmap_areastddev(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- testname='multiple overlaps, different values'
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- dst_array_t = (/0.04330127018922193_r8, 0.5346727971385864_r8, 0.04330127018922197_r8/)
- call gridmap_areastddev(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
-
- ! dividing the weights by 2 shouldn't affect the standard deviation
- testname='weights divided by 2'
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- dst_array_t = (/0.04330127018922193_r8, 0.5346727971385864_r8, 0.04330127018922197_r8/)
- call gridmap_areastddev(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
-
- ! using frac_dst > 1 should be okay
- testname='frac_dst > 1'
- gridmap%wovr(:) = gridmap%wovr(:) * 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) * 2.0_r8
- src_array = (/0.1_r8, 0.2_r8, 0.3_r8, 0.7_r8, 0.5_r8, 1.5_r8, 0.5_r8, 1.7_r8, 1.8_r8/)
- dst_array_t = (/0.04330127018922193_r8, 0.5346727971385864_r8, 0.04330127018922197_r8/)
- call gridmap_areastddev(gridmap, src_array, dst_array, nodata)
- call test_close(dst_array, dst_array_t, eps, modname//' -- '//subname//' -- '//trim(testname))
- ! restore wovr & frac_dst
- gridmap%wovr(:) = gridmap%wovr(:) / 2.0_r8
- gridmap%frac_dst(:) = gridmap%frac_dst(:) / 2.0_r8
-
- deallocate(src_array, dst_array, dst_array_t)
-
- end subroutine test_gridmap_areastddev
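The expected values above are consistent with a weighted population standard deviation over each destination cell's overlaps: the weighted mean first, then the square root of the weighted mean squared deviation. A small Python sketch of that computation (illustrative only; the Fortran `gridmap_areastddev` may evaluate it in an algebraically equivalent form), checked against the 'multiple overlaps, different values' data:

```python
import math

# Illustrative weighted population standard deviation per destination cell:
#   mean(d)  = sum_i(w_i * v_i) / sum_i(w_i)
#   stddev(d) = sqrt( sum_i(w_i * (v_i - mean(d))**2) / sum_i(w_i) )
# over overlaps i mapped to destination d; nodata if total weight is 0.
def areastddev(src_indx, dst_indx, wovr, src, nb, nodata=-1.0):
    wsum = [0.0] * nb
    wval = [0.0] * nb
    for s, d, w in zip(src_indx, dst_indx, wovr):
        wsum[d - 1] += w
        wval[d - 1] += w * src[s - 1]
    mean = [v / w if w > 0.0 else 0.0 for v, w in zip(wval, wsum)]
    var = [0.0] * nb
    for s, d, w in zip(src_indx, dst_indx, wovr):
        var[d - 1] += w * (src[s - 1] - mean[d - 1]) ** 2
    return [math.sqrt(v / w) if w > 0.0 else nodata
            for v, w in zip(var, wsum)]

# Gridmap and data from the 'multiple overlaps, different values' test
src_indx = [1, 2, 2, 3, 4, 5, 6, 7, 8, 8, 9]
dst_indx = [1, 1, 2, 2, 2, 2, 2, 2, 2, 3, 3]
wovr = [0.75, 0.25, 0.05, 0.05, 0.1, 0.3, 0.2, 0.15, 0.15, 0.25, 0.75]
src = [0.1, 0.2, 0.3, 0.7, 0.5, 1.5, 0.5, 1.7, 1.8]
out = areastddev(src_indx, dst_indx, wovr, src, nb=3)
# out is close to [0.0433012701892219, 0.5346727971385864, 0.0433012701892219]
```

As with the area average, scaling every weight by a constant cancels in both the mean and the variance ratio, matching the 'weights divided by 2' and 'frac_dst > 1' expectations.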
-end module test_mkgridmapMod
diff --git a/tools/mksurfdata_map/unit_testers/test_mkharvest.F90 b/tools/mksurfdata_map/unit_testers/test_mkharvest.F90
deleted file mode 100644
index 421af9d620..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkharvest.F90
+++ /dev/null
@@ -1,316 +0,0 @@
-module test_mkharvest
-! Module for testing harvest
-
- use shr_kind_mod, only : r8 => shr_kind_r8
- use mkharvestMod
- use test_mod
-
- implicit none
- private
-
- public :: test_harvest_init
- public :: test_harvest_init_old
- public :: test_harvest_data
- public :: test_harvest_data_all1D
-
- character(len=*), parameter :: modname = 'test_mkharvest'
-
- character(len=128) :: testname
- character(len=128) :: test_prefix
- integer, parameter :: ns_o = 4
-
-contains
-
-!------------------------------------------------------------------------------
- subroutine test_harvest_init
-
- use mkncdio
- implicit none
-
- integer :: ncid
- type(harvestDataType) :: harvdata
- character(len=128) :: varname
- integer :: varid
- logical :: varexists
- integer :: ifld
- character(len=*), parameter :: constfieldname(9) = (/ &
- 'CONST_HARVEST_VH1 ', &
- 'CONST_HARVEST_VH2 ', &
- 'CONST_HARVEST_SH1 ', &
- 'CONST_HARVEST_SH2 ', &
- 'CONST_HARVEST_SH3 ', &
- 'CONST_GRAZING ', &
- 'CONST_FERTNITRO_CFT ', &
- 'UNREPRESENTED_PFT_LULCC', &
- 'UNREPRESENTED_CFT_LULCC' &
- /)
- character(len=*), parameter :: units(9) = (/ &
- 'gC/m2/yr', &
- 'gC/m2/yr', &
- 'gC/m2/yr', &
- 'gC/m2/yr', &
- 'gC/m2/yr', &
- 'gC/m2/yr', &
- 'gN/m2/yr', &
- 'unitless', &
- 'unitless' &
- /)
- character(len=*), parameter :: fieldname(9) = (/ &
- 'HARVEST_VH1 ', &
- 'HARVEST_VH2 ', &
- 'HARVEST_SH1 ', &
- 'HARVEST_SH2 ', &
- 'HARVEST_SH3 ', &
- 'GRAZING ', &
- 'FERTNITRO_CFT', &
- 'PFT_LULCC ', &
- 'CFT_LULCC ' &
- /)
- character(len=*), parameter :: longname(9) = (/ &
- 'harvest from primary forest ', &
- 'harvest from primary non-forest ', &
- 'harvest from secondary mature-forest ', &
- 'harvest from secondary young-forest ', &
- 'harvest from secondary non-forest ', &
- 'grazing of herbacous pfts ', &
- 'constant background nitrogen fertilizer for each crop ', &
- 'constant background unrepresented PFT LULCC transitions ', &
- 'constant background unrepresented crop LULCC transitions' &
- /)
- character(len=256) :: string
- character(len=*), parameter :: filename = 'unit_testers/inputs/harvestfields.nc'
-
- character(len=*), parameter :: subname = 'test_harvest_init'
- integer :: nfields
-
- testname = 'check harvest_init'
- test_prefix = modname//' -- '//subname//' -- '//trim(testname)//' -- '
- ! Open netcdf file that will be used for most tests
- call check_ret(nf_open(filename, 0, ncid), subname)
- varname = 'GRAZING'
- call check_ret(nf_inq_varid(ncid, varname, varid), subname, varexists=varexists)
- call test_is(varexists, trim(test_prefix)//'existing var')
- call test_is( .not.mkharvest_fieldInBounds( 3 ), trim(test_prefix)//'all fields out of bounds before init')
-
- call mkharvest_init( ns_o, 0.0_r8, harvdata, filename )
- call test_is( .not.mkharvest_fieldInBounds( 0 ), trim(test_prefix)//'0 out of bounds')
- nfields = mkharvest_numtypes()
- call test_is( .not.mkharvest_fieldInBounds( nfields+1), trim(test_prefix)//'10 out of bounds')
-
- ! make sure can now do getter functions
-
- do ifld = 1, mkharvest_numtypes()
- call test_is(mkharvest_fieldname(ifld,constant=.true.), constfieldname(ifld), trim(test_prefix)//'bad const fieldname')
- call test_is(mkharvest_fieldname(ifld), fieldname(ifld), trim(test_prefix)//'bad fieldname')
- call test_is(mkharvest_units(ifld), units(ifld), trim(test_prefix)//'bad units')
- call test_is(mkharvest_longname(ifld), longname(ifld), trim(test_prefix)//'bad longname')
- end do
- call harvdata%clean()
-
- end subroutine test_harvest_init
-
- subroutine test_harvest_data_all1D()
- implicit none
- type(harvestDataType) :: harvdata
- integer :: dim2nd(9)
- integer :: dsizes(2), nfields, ifld, n, doutsizes(2)
- integer :: dims1D(1), dims2D(2)
- character(len=*), parameter :: subname = 'test_harvest_data_all1D'
- character(len=*), parameter :: filename = 'unit_testers/inputs/harvestfields.nc'
- integer, parameter :: indices1D(9) = (/ 1, 2, 3, 4, 5, 6, 7, 8, 9 /)
- integer, parameter :: indices2D(1) = (/ -1 /)
- real(r8), pointer :: data1D(:)
- integer, allocatable :: ind1D(:), ind2D(:)
- integer, parameter :: ns_i = 15, ns_o = 10
-
- testname = 'check harvest_data_all1D'
- test_prefix = modname//' -- '//subname//' -- '//trim(testname)//' -- '
- dim2nd = 0
- call mkharvest_init( ns_o, 0.0_r8, harvdata, filename )
- call harvdata%clean()
- call harvdata%init( dim2nd, ns_i, ns_o, 0.0_r8 )
- do ifld = 1, mkharvest_numtypes()
- call test_is(harvdata%isField1D(ifld), trim(test_prefix)//'field is 1D' )
- call test_is(.not.harvdata%isField2D(ifld), trim(test_prefix)//'field not 2D' )
- end do
- nfields = mkharvest_numtypes()
- call test_is(harvdata%num1DFields(),nfields,trim(test_prefix)//'num 1D fields')
- call test_is(harvdata%num2DFields(),0,trim(test_prefix)//'num 2D fields')
- call harvdata%getFieldsIdx( ind1D, ind2D )
- call test_is(ind1D,indices1D,trim(test_prefix)//'1D fields indices')
- call test_is(ind2D,indices2D,trim(test_prefix)//'2D fields indices')
- dsizes(1) = ns_i
- doutsizes(1) = ns_o
- do n = 1, harvdata%num1DFields()
- call test_is(harvdata%isField1D(indices1D(n)), trim(test_prefix)//'verify field is 1D' )
- data1D => harvdata%get1DFieldPtr( indices1D(n) )
- dims1D = shape(data1D)
- call test_is(dims1D,dsizes(:),trim(test_prefix)//'1D field dims')
- ! Set data
- data1D(:) = real( n, r8 )
- data1D => null()
- ! Output data
- data1D => harvdata%get1DFieldPtr( indices1D(n), output=.true. )
- dims1D = shape(data1D)
- call test_is(dims1D,doutsizes(:),trim(test_prefix)//'1D Output field dims')
- ! Set data
- data1D(:) = real( n*100, r8 )
- data1D => null()
- end do
- ! Check that data is set from setting above
- do n = 1, harvdata%num1DFields()
- data1D => harvdata%get1DFieldPtr( indices1D(n) )
- call test_is(data1D(1),real( n, r8 ), trim(test_prefix)//'field ')
- data1D => null()
- ! output data
- data1D => harvdata%get1DFieldPtr( indices1D(n), output=.true. )
- call test_is(data1D(1),real( n*100, r8 ), trim(test_prefix)//'field ')
- data1D => null()
- end do
- call harvdata%clean()
- end subroutine test_harvest_data_all1D
-
-!------------------------------------------------------------------------------
-
- subroutine test_harvest_data()
- implicit none
- type(harvestDataType) :: harvdata
- integer :: dsizes(2), nfields, ifld, n, doutsizes(2)
- integer :: dims1D(1), dims2D(2)
- character(len=*), parameter :: subname = 'test_harvest_data'
- character(len=*), parameter :: filename = 'unit_testers/inputs/harvestfields.nc'
- integer, parameter :: indices1D(6) = (/ 1, 2, 3, 4, 5, 6 /)
- integer, parameter :: indices2D(3) = (/ 7, 8, 9 /)
- integer, parameter :: dim2nd(3) = (/ 64, 15, 64 /)
- character(len=10) :: dimnames(3) = (/ "cft   ", "natpft", "cft   " /)
- real(r8), pointer :: data1D(:)
- real(r8), pointer :: data2D(:,:)
- integer, allocatable :: ind1D(:), ind2D(:)
- integer, parameter :: ns_i = 4, ns_o = 20
-
- testname = 'check harvest_data'
- test_prefix = modname//' -- '//subname//' -- '//trim(testname)//' -- '
- call mkharvest_init( ns_o, 0.0_r8, harvdata, filename )
- call harvdata%getFieldsIdx( ind1D, ind2D )
- call test_is(ind1D,indices1D,trim(test_prefix)//'1D fields indices')
- call test_is(ind2D,indices2D,trim(test_prefix)//'2D fields indices')
- call test_is(harvdata%num1DFields(),size(indices1D),trim(test_prefix)//'num 1D fields')
- call test_is(harvdata%num2DFields(),size(indices2D),trim(test_prefix)//'num 2D fields')
- do n = 1, harvdata%num1DFields()
- ifld = ind1D(n)
- call test_is(harvdata%isField1D(ifld), trim(test_prefix)//'field is 1D' )
- call test_is(.not.harvdata%isField2D(ifld), trim(test_prefix)//'field not 2D' )
- end do
- do n = 1, harvdata%num2DFields()
- ifld = ind2D(n)
- call test_is(.not.harvdata%isField1D(ifld), trim(test_prefix)//'field is not 1D' )
- call test_is(harvdata%isField2D(ifld), trim(test_prefix)//'field is 2D' )
- end do
- dsizes(1) = ns_i
- doutsizes(1) = ns_o
- do n = 1, harvdata%num1DFields()
- call test_is(harvdata%isField1D(indices1D(n)), trim(test_prefix)//'verify field is 1D' )
- data1D => harvdata%get1DFieldPtr( indices1D(n) )
- dims1D = shape(data1D)
- call test_is(dims1D,dsizes(:),trim(test_prefix)//'1D field dims')
- call test_is(harvdata%getFieldsDim(indices1D(n)),"none",trim(test_prefix)//'1D field dimname')
- data1D => null()
- end do
- do n = 1, harvdata%num2DFields()
- dsizes(2) = dim2nd(n)
- call test_is(harvdata%isField2D(indices2D(n)), trim(test_prefix)//'verify field is 2D' )
- data2D => harvdata%get2DFieldPtr( indices2D(n) )
- dims2D = shape(data2D)
- call test_is(dims2D,dsizes(:),trim(test_prefix)//'2D field dims')
- call test_is(harvdata%getFieldsDim(indices2D(n)),dimnames(n),trim(test_prefix)//'2D field dimname')
- data2D => null()
- end do
- call harvdata%clean()
- end subroutine test_harvest_data
-
-
-!------------------------------------------------------------------------------
- subroutine test_harvest_init_old
-
- use mkncdio
- implicit none
-
- type(harvestDataType) :: harvdata
- character(len=128) :: testname
- integer :: ncid
- character(len=128) :: varname
- integer :: varid
- logical :: varexists
- integer, parameter :: ns_o = 4
- integer :: ifld
-
- character(len=*), parameter :: filename = 'unit_testers/inputs/harvestfieldsold.nc'
-
- character(len=*), parameter :: subname = 'test_harvest_init_old'
- character(len=*), parameter :: constfieldname(9) = (/ &
- 'CONST_HARVEST_VH1 ', &
- 'CONST_HARVEST_VH2 ', &
- 'CONST_HARVEST_SH1 ', &
- 'CONST_HARVEST_SH2 ', &
- 'CONST_HARVEST_SH3 ', &
- 'CONST_GRAZING ', &
- 'CONST_FERTNITRO_CFT ', &
- 'UNREPRESENTED_PFT_LULCC', &
- 'UNREPRESENTED_CFT_LULCC' &
- /)
- character(len=*), parameter :: units(9) = (/ &
- 'unitless ', &
- 'unitless ', &
- 'unitless ', &
- 'unitless ', &
- 'unitless ', &
- 'unitless ', &
- 'not_read_in', &
- 'not_read_in', &
- 'not_read_in' &
- /)
- character(len=*), parameter :: fieldname(9) = (/ &
- 'HARVEST_VH1 ', &
- 'HARVEST_VH2 ', &
- 'HARVEST_SH1 ', &
- 'HARVEST_SH2 ', &
- 'HARVEST_SH3 ', &
- 'GRAZING ', &
- 'FERTNITRO_CFT', &
- 'PFT_LULCC ', &
- 'CFT_LULCC ' &
- /)
- character(len=*), parameter :: longname(9) = (/ &
- 'harvest from primary forest ', &
- 'harvest from primary non-forest ', &
- 'harvest from secondary mature-forest', &
- 'harvest from secondary young-forest ', &
- 'harvest from secondary non-forest ', &
- 'grazing of herbacous pfts ', &
- 'FERTNITRO_CFT (zeroed out) ', &
- 'PFT_LULCC (zeroed out) ', &
- 'CFT_LULCC (zeroed out) ' &
- /)
- character(len=256) :: string
- testname = 'check harvest_init_old'
- ! Open netcdf file that will be used for most tests
- call check_ret(nf_open(filename, 0, ncid), subname)
- varname = 'GRAZING'
- call check_ret(nf_inq_varid(ncid, varname, varid), subname, varexists=varexists)
- call test_is(varexists, modname//' -- '//subname//' -- '//trim(testname)//' -- existing var')
-
- call mkharvest_init( ns_o, 0.0_r8, harvdata, filename )
-
- ! make sure can now do getter functions
-
- do ifld = 1, mkharvest_numtypes()
- call test_is(mkharvest_fieldname(ifld,constant=.true.), constfieldname(ifld), modname//' -- '//subname//' -- '//trim(testname)//' -- bad const fieldname')
- call test_is(mkharvest_fieldname(ifld), fieldname(ifld), modname//' -- '//subname//' -- '//trim(testname)//' -- bad fieldname')
- call test_is(mkharvest_units(ifld), units(ifld), modname//' -- '//subname//' -- '//trim(testname)//' -- bad units')
- call test_is(mkharvest_longname(ifld), longname(ifld), modname//' -- '//subname//' -- '//trim(testname)//' -- bad longname')
- end do
- call harvdata%clean()
-
- end subroutine test_harvest_init_old
-
-end module test_mkharvest
diff --git a/tools/mksurfdata_map/unit_testers/test_mkindexmapMod.F90 b/tools/mksurfdata_map/unit_testers/test_mkindexmapMod.F90
deleted file mode 100644
index 4e6a099daa..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkindexmapMod.F90
+++ /dev/null
@@ -1,676 +0,0 @@
-module test_mkindexmapMod
-! Module for testing mkindexmapMod
-
- use mkindexmapMod
- use test_mod
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- public :: test_get_dominant_indices
- public :: test_filter_same
- public :: test_lookup_2d
- public :: test_lookup_2d_netcdf
- public :: test_which_max
-
- character(len=*), parameter :: modname = 'test_mkindexmapMod'
-
-contains
-
-!------------------------------------------------------------------------------
- subroutine test_get_dominant_indices
-
- use mkgridmapMod, only : gridmap_type
-
- implicit none
-
- type(gridmap_type) :: gridmap
- character(len=128) :: testname
-
- integer, allocatable :: src_array(:)
- integer, allocatable :: dst_array(:)
- integer, allocatable :: dst_array_t(:)
- logical, allocatable :: filter(:)
- integer :: minval, maxval, nodata
-
- character(len=*), parameter :: subname = 'test_get_dominant_indices'
-
- ! Set up a gridmap that will be used for most tests, and allocate corresponding
- ! arrays:
- ! Note that, for most tests here, the test arrays are: (1) simple case, (2) the main
- ! case to test, (3) simple case. Thus, the main case in question is #2 of 3, and
- ! we're always basically just testing one scenario in each call to the subroutine
- ! (rather than doing a bunch of tests at once, which could make setting up the test
- ! arrays more error-prone).
-
- ! This gridmap will have 9 src cells, 3 dest cells, and:
- ! src 1: just overlaps with dst 1
- ! src 2: overlaps with dst 1 & dst 2
- ! src 3..7: just overlaps with dst 2
- ! src 8: overlaps with dst 2 & dst 3
- ! src 9: just overlaps with dst 3
- ! Note: I'm not setting some things that aren't used in get_dominant_indices
- gridmap%na = 9
- gridmap%nb = 3
- gridmap%ns = 11
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns))
- gridmap%src_indx = (/1,2,2,3,4,5,6,7,8,8,9/)
- gridmap%dst_indx = (/1,1,2,2,2,2,2,2,2,3,3/)
- gridmap%wovr = (/0.75,0.25, & ! weights of sources 1:2 on dest 1
- 0.1,0.1,0.1,0.3,0.2,0.2,0.2, & ! weights of sources 2:8 on dest 2
- 0.25,0.75/) ! weights of sources 8:9 on dest 3
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb), &
- filter (gridmap%ns))
-
- testname = 'basic test, all unique'
- src_array = (/1, 2, 3, 4, 5, 6, 7, 8, 9/)
- minval = 1
- maxval = 9
- nodata = -1
- ! dst 2 takes its value from src 5 because it has the largest weight:
- dst_array_t = (/1, 5, 9/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
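The selection rule these tests exercise can be sketched as follows: for each destination cell, accumulate the total overlap weight per source *value* (restricted to `minval..maxval`), then pick the value with the largest accumulated weight, or `nodata` if no valid value contributes. A hedged Python illustration (the function name `dominant` and the tie-breaking behavior are assumptions of this sketch, not taken from the Fortran source), checked against the 'some duplicates' case below:

```python
# Illustrative dominant-index selection: per destination cell, the source
# value (within [vmin, vmax]) carrying the most total overlap weight wins.
# Ties are broken arbitrarily in this sketch.
def dominant(src_indx, dst_indx, wovr, src, nb, vmin, vmax, nodata=-1):
    acc = [{} for _ in range(nb)]  # per-dest: value -> accumulated weight
    for s, d, w in zip(src_indx, dst_indx, wovr):
        v = src[s - 1]
        if vmin <= v <= vmax and w > 0.0:
            acc[d - 1][v] = acc[d - 1].get(v, 0.0) + w
    return [max(a, key=a.get) if a else nodata for a in acc]

# Gridmap and data from the 'basic test, some duplicates' case
src_indx = [1, 2, 2, 3, 4, 5, 6, 7, 8, 8, 9]
dst_indx = [1, 1, 2, 2, 2, 2, 2, 2, 2, 3, 3]
wovr = [0.75, 0.25, 0.1, 0.1, 0.1, 0.3, 0.2, 0.2, 0.2, 0.25, 0.75]
src = [1, 2, 3, 3, 4, 2, 2, 1, 1]
out = dominant(src_indx, dst_indx, wovr, src, nb=3, vmin=1, vmax=4)
# Value 2 accumulates 0.1 + 0.2 + 0.2 = 0.5 on dest 2, the largest total
```

This also explains the 'no valid values' and 'all weights 0' expectations: with an empty accumulator for a destination cell, `nodata` is returned.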
- testname = 'basic test, some duplicates'
- src_array = (/1, 2, 3, 3, 4, 2, 2, 1, 1/)
- minval = 1
- maxval = 4
- nodata = -1
- dst_array_t = (/1, 2, 1/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'minval not 1'
- src_array = (/3, 4, 5, 5, 6, 4, 4, 3, 3/)
- minval = 3
- maxval = 6
- nodata = -1
- dst_array_t = (/3, 4, 3/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'single non-zero source value'
- src_array = (/1, 0, 0, 0, 0, 2, 0, 0, 1/)
- minval = 1
- maxval = 2
- nodata = -1
- dst_array_t = (/1, 2, 1/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'single value within given min-max range'
- src_array = (/1, 0, 9, 9, 0, 2, 9, 9, 1/)
- minval = 1
- maxval = 2
- nodata = -1
- dst_array_t = (/1, 2, 1/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'no valid values'
- src_array = (/1, 0, 9, 9, 0, 0, 9, 9, 1/)
- minval = 1
- maxval = 2
- nodata = -1
- dst_array_t = (/1, nodata, 1/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'some filters false'
- src_array = (/1, 2, 3, 3, 4, 2, 2, 1, 1/)
- minval = 1
- maxval = 4
- nodata = -1
- filter = (/.true., .true., &
- .false., .true., .true., .true., .false., .true., .true., &
- .true., .true./)
- dst_array_t = (/1, 4, 1/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata, filter=filter)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'all filters false'
- src_array = (/1, 2, 3, 3, 4, 2, 2, 1, 1/)
- minval = 1
- maxval = 4
- nodata = -1
- filter = (/.true., .true., &
- .false., .false., .false., .false., .false., .false., .false., &
- .true., .true./)
- dst_array_t = (/1, nodata, 1/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata, filter=filter)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- ! Modify gridmap weights for the following test
- gridmap%wovr = (/0.75,0.25, & ! weights of sources 1:2 on dest 1
- 0.0,0.0,0.0,0.0,0.0,0.0,0.0, & ! weights of sources 2:8 on dest 2
- 0.25,0.75/) ! weights of sources 8:9 on test 3
- testname='all weights 0'
- src_array = (/1, 1, 1, 1, 1, 1, 1, 1, 1/)
- minval = 1
- maxval = 2
- nodata = -1
- dst_array_t = (/1, nodata, 1/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- ! Make a new gridmap for the following test;
- ! this involves more output cells and a more complex mapping from src to dst
- ! This gridmap will have:
- ! dst 1: from src 1, 4, 7
- ! dst 2: from src 2, 4, 6
- ! dst 3: from src 1
- ! dst 4: no overlapping src cells
- ! dst 5: from src 5, 7, 8
- ! note that src 3 & 9 do not overlap with any dst
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, &
- src_array, dst_array, dst_array_t, filter)
- gridmap%na = 9
- gridmap%nb = 5
- gridmap%ns = 10
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns), &
- gridmap%wovr (gridmap%ns))
- gridmap%src_indx = (/1, 2, 4, 4, 7, 6, 1, 5, 7, 8/)
- gridmap%dst_indx = (/1, 2, 1, 2, 1, 2, 3, 5, 5, 5/)
- gridmap%wovr = (/1, 1, 2, 2, 1, 3, 1, 2, 2, 3/)
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- dst_array_t(gridmap%nb), &
- filter (gridmap%ns))
-
- testname = 'more complex gridmap'
- ! src index: 1 2 3 4 5 6 7 8 9
- src_array = (/1, 2, 3, 1, 5, 6, 5, 8, 9/)
- minval = 1
- maxval = 9
- nodata = -1
- dst_array_t = (/1, 6, 1, nodata, 5/)
- call get_dominant_indices(gridmap, src_array, dst_array, minval, maxval, nodata)
- call test_is(dst_array, dst_array_t, modname//' -- '//subname//' -- '//trim(testname))
-
- deallocate(gridmap%src_indx, gridmap%dst_indx, gridmap%wovr, &
- src_array, dst_array_t, filter)
-
- end subroutine test_get_dominant_indices
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
- subroutine test_filter_same
-
- use mkgridmapMod, only : gridmap_type
-
- implicit none
-
- type(gridmap_type) :: gridmap
- character(len=128) :: testname
-
- integer, allocatable :: src_array(:)
- integer, allocatable :: dst_array(:)
- logical, allocatable :: filter(:)
- logical, allocatable :: filter_t(:)
- integer :: nodata
-
- character(len=*), parameter :: subname = 'test_filter_same'
-
- ! Set up a gridmap that will be used for most tests, and allocate corresponding
- ! arrays:
- ! Note that, for most tests here, the test arrays are: (1) simple case, (2) the main
- ! case to test, (3) simple case. Thus, the main case in question is #2 of 3, and
- ! we're always basically just testing one scenario in each call to the subroutine
- ! (rather than doing a bunch of tests at once, which could make setting up the test
- ! arrays more error-prone).
-
- ! This gridmap will have 3 src cells, 9 dest cells, and:
- ! src 1: just overlaps with dst 1
- ! src 2: overlaps with dst 1 & dst 2
- ! src 3..7: just overlaps with dst 2
- ! src 8: overlaps with dst 2 & dst 3
- ! src 9: just overlaps with dst 3
- ! Note: I'm not setting some things that aren't used in filter_same
- gridmap%na = 9
- gridmap%nb = 3
- gridmap%ns = 11
- allocate(gridmap%src_indx(gridmap%ns), &
- gridmap%dst_indx(gridmap%ns))
- gridmap%src_indx = (/1,2,2,3,4,5,6,7,8,8,9/)
- gridmap%dst_indx = (/1,1,2,2,2,2,2,2,2,3,3/)
- allocate(src_array (gridmap%na), &
- dst_array (gridmap%nb), &
- filter (gridmap%ns), &
- filter_t (gridmap%ns))
-
- testname = 'maintain false values in filter'
- src_array(:) = 1
- dst_array(:) = 1
- filter(:) = .true.
- filter(3) = .false.
- filter(5) = .false.
- filter_t(:) = .true.
- filter_t(3) = .false.
- filter_t(5) = .false.
- call filter_same(gridmap, filter, src_array, dst_array)
- call test_is(filter, filter_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'dst_array = nodata in some places'
- nodata = -1
- src_array(:) = 1
- src_array(5) = nodata ! make sure that even when src_array = dst_array = nodata,
- ! we still end up with filter = false
- dst_array = (/1, nodata, 1/)
- filter(:) = .true.
- filter_t(:) = .true.
- filter_t(3:9) = .false. ! false for all overlaps with dst #2
- call filter_same(gridmap, filter, src_array, dst_array, nodata=nodata)
- call test_is(filter, filter_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'src_array not equal to dst_array in some places, no nodata argument'
- src_array(:) = (/1, 1, 1, 1, 2, 3, 1, 3, 1/)
- dst_array(:) = (/1, 1, 1/)
- filter(:) = .true.
- ! src_array index: 1 2 2 3 4 5 6 7 8 8 9
- filter_t(:) = (/.true.,.true.,.true.,.true.,.true.,.false.,.false.,.true.,.false.,.false.,.true./)
- call filter_same(gridmap, filter, src_array, dst_array)
- call test_is(filter, filter_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'src_array not equal to dst_array in some places, nodata never applies'
- nodata = -1
- src_array(:) = (/1, 1, 1, 1, 2, 3, 1, 3, 1/)
- dst_array(:) = (/1, 1, 1/)
- filter(:) = .true.
- ! src_array index: 1 2 2 3 4 5 6 7 8 8 9
- filter_t(:) = (/.true.,.true.,.true.,.true.,.true.,.false.,.false.,.true.,.false.,.false.,.true./)
- call filter_same(gridmap, filter, src_array, dst_array, nodata=nodata)
- call test_is(filter, filter_t, modname//' -- '//subname//' -- '//trim(testname))
-
- testname = 'combination of false filter, src_array not equal to dst_array, and nodata'
- nodata = -1
- src_array(:) = (/1, 2, 1, 2, 1, 2, 1, 2, 1/)
- dst_array(:) = (/nodata, 1, 1/)
- filter(:) = .true.
- filter(4) = .false.
- filter_t(:) = (/.false.,.false.,.false.,.false.,.false.,.true.,.false.,.true.,.false.,.false.,.true./)
- call filter_same(gridmap, filter, src_array, dst_array, nodata=nodata)
- call test_is(filter, filter_t, modname//' -- '//subname//' -- '//trim(testname))
-
-
- deallocate(gridmap%src_indx, gridmap%dst_indx, &
- src_array, dst_array, filter, filter_t)
-
- end subroutine test_filter_same
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
- subroutine test_lookup_2d
-
- implicit none
-
- character(len=128) :: testname
- real(r8), allocatable :: lookup_table(:,:)
- logical , allocatable :: valid_entries(:,:)
- integer , allocatable :: index1(:), index2(:)
- real(r8), allocatable :: data(:), data_t(:)
- real(r8) :: fill_val
- integer :: nodata
- integer :: ierr, ierr_t
-
- character(len=*), parameter :: subname = 'test_lookup_2d'
-
- ! Create lookup table for use in most tests
- allocate(lookup_table(2,3), valid_entries(2,3))
- lookup_table(1,:) = (/11.,12.,13./)
- lookup_table(2,:) = (/21.,22.,23./)
-
- testname = 'basic test; no nodata or valid_entries'
- allocate(index1(5), index2(5), data(5), data_t(5))
- index1 = (/1,2,1,2,2/)
- index2 = (/1,2,3,2,3/)
- fill_val = -1.
- data_t = (/11., 22., 13., 22., 23./)
- ierr_t = 0
- call lookup_2d(index1, index2, lookup_table, fill_val, data, ierr)
- call check_results
- deallocate(index1, index2, data, data_t)
-
- testname = 'basic test but with index out of range'
- allocate(index1(5), index2(5), data(5), data_t(5))
- index1 = (/1,2,3,2,2/)
- index2 = (/1,2,1,2,4/)
- fill_val = -1.
- data_t = (/11._r8, 22._r8, fill_val, 22._r8, fill_val/)
- ierr_t = 2
- call lookup_2d(index1, index2, lookup_table, fill_val, data, ierr)
- call check_results
- deallocate(index1, index2, data, data_t)
-
- testname = 'basic test but with nodata present, and a nodata value in input'
- allocate(index1(5), index2(5), data(5), data_t(5))
- nodata = -1
- index1 = (/nodata,2,1,2,nodata/)
- index2 = (/1,2,3,nodata,nodata/)
- fill_val = -1.
- data_t = (/fill_val, 22._r8, 13._r8, fill_val, fill_val/)
- ierr_t = 0
- call lookup_2d(index1, index2, lookup_table, fill_val, data, ierr, nodata=nodata)
- call check_results
- deallocate(index1, index2, data, data_t)
-
- testname = 'valid_entries'
- allocate(index1(5), index2(5), data(5), data_t(5))
- index1 = (/1,1,2,2,1/)
- index2 = (/1,2,1,2,3/)
- valid_entries(1,:) = (/.false.,.false.,.true./)
- valid_entries(2,:) = (/.true. ,.true. ,.true./)
- fill_val = -1.
- data_t = (/fill_val, fill_val, 21._r8, 22._r8, 13._r8/)
- ierr_t = 1
- call lookup_2d(index1, index2, lookup_table, fill_val, data, ierr, valid_entries=valid_entries)
- call check_results
-
- testname = 'valid_entries, invalid_okay'
- ! Note: this test reuses some setup from the previous test
- ierr_t = 0
- call lookup_2d(index1, index2, lookup_table, fill_val, data, ierr, &
- valid_entries=valid_entries, invalid_okay=.true.)
- call check_results
- deallocate(index1, index2, data, data_t)
-
-
- testname = 'valid_entries, together with index out of range'
- ! in addition to checking both valid_entries and index out of range, this also
- ! makes sure that we get the appropriate ierr value when we have both errors
- ! (because we encounter the valid_entries error first)
- allocate(index1(5), index2(5), data(5), data_t(5))
- index1 = (/1,1,3,2,2/)
- index2 = (/1,2,1,1,0/)
- valid_entries(1,:) = (/.false.,.false.,.true./)
- valid_entries(2,:) = (/.true. ,.true. ,.true./)
- fill_val = -1.
- data_t = (/fill_val, fill_val, fill_val, 21._r8, fill_val/)
- ierr_t = 1
- call lookup_2d(index1, index2, lookup_table, fill_val, data, ierr, valid_entries=valid_entries)
- call check_results
- deallocate(index1, index2, data, data_t)
-
-
- deallocate(lookup_table, valid_entries)
-
- contains
- subroutine check_results
- call test_is(data, data_t, modname//' -- '//subname//' -- '//trim(testname)//' -- data')
- call test_is(ierr, ierr_t, modname//' -- '//subname//' -- '//trim(testname)//' -- ierr')
- end subroutine check_results
-
- end subroutine test_lookup_2d
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
- subroutine test_lookup_2d_netcdf
-
- use mkncdio
-
- implicit none
-
- character(len=128) :: testname
- character(len=64) :: tablename
- character(len=4) :: dimname1, dimname2
- logical :: invalid_lookup
- integer :: n_extra_dims
- integer , allocatable :: index1(:), index2(:)
- real(r8), allocatable :: data(:), data_t(:)
- real(r8) :: fill_val
- integer :: nodata
- integer :: ierr, ierr_t
- type(dim_slice_type), allocatable :: extra_dims(:)
-
- integer :: ncid
- character(len=*), parameter :: filename = 'unit_testers/inputs/test_lookup_2d_netcdf.nc'
-
- ! flags to enable tests that we don't usually want to run, because they result in
- ! an abort, but we may occasionally want to run to make sure this error-handling is
- ! working properly
- logical, parameter :: test_abort1 = .false.
- logical, parameter :: test_abort2 = .false.
- logical, parameter :: test_abort3 = .false.
-
- character(len=*), parameter :: subname = 'test_lookup_2d_netcdf'
-
- ! Open netcdf file that will be used for most tests:
- ! Note that this file was created such that lookup4d(i,j,k,l) = 1000*i+100*j+10*k+l,
- ! and similarly for the other variables
- ! Also, lookup2d(1,2) is missing (i.e., equal to the _FillVal)
- call check_ret(nf_open(filename, 0, ncid), subname)
-
- testname = '2-d lookup table with _FillValue resulting in valid_entries false somewhere'
- allocate(index1(5), index2(5), data(5), data_t(5))
- tablename = 'lookup2d'
- invalid_lookup = .true.
- dimname1 = 'dim1'
- dimname2 = 'dim2'
- n_extra_dims = 0
- index1 = (/1,2,1,2,2/)
- index2 = (/1,2,2,1,3/)
- fill_val = -1.
- ! Note that the third value is fill_val because lookup2d(1,2) is missing (i.e.,
- ! equal to the _FillVal in the netcdf file)
- data_t = (/11._r8, 22._r8, fill_val, 21._r8, 23._r8/)
- ierr_t = 1
- call lookup_2d_netcdf(ncid, tablename, invalid_lookup, dimname1, dimname2, &
- n_extra_dims, index1, index2, fill_val, data, ierr)
- call check_results
-
- testname = '2-d lookup table with _FillValue resulting in valid_entries false somewhere, invalid_okay'
- ! Note: this test reuses some setup from the previous test
- ierr_t = 0
- call lookup_2d_netcdf(ncid, tablename, invalid_lookup, dimname1, dimname2, &
- n_extra_dims, index1, index2, fill_val, data, ierr, invalid_okay=.true.)
- call check_results
- deallocate(index1, index2, data, data_t)
-
- testname = '3-d lookup table with no _FillValue; nodata in index arrays'
- allocate(index1(5), index2(5), data(5), data_t(5))
- tablename = 'lookup3d'
- invalid_lookup = .false.
- dimname1 = 'dim1'
- dimname2 = 'dim2'
- n_extra_dims = 1
- allocate(extra_dims(n_extra_dims))
- extra_dims(1) = dim_slice_type('dim3', 2)
- nodata = -999
- index1 = (/nodata,2,1,2,2/)
- index2 = (/1,2,2,1,nodata/)
- fill_val = -1.
- data_t = (/fill_val, 222._r8, 122._r8, 212._r8, fill_val/)
- ierr_t = 0
- call lookup_2d_netcdf(ncid, tablename, invalid_lookup, dimname1, dimname2, &
- n_extra_dims, index1, index2, fill_val, data, ierr, extra_dims=extra_dims, &
- nodata=nodata)
- call check_results
- deallocate(index1, index2, data, data_t, extra_dims)
-
- testname = '4-d lookup table'
- allocate(index1(5), index2(5), data(5), data_t(5))
- tablename = 'lookup4d'
- invalid_lookup = .true.
- dimname1 = 'dim1'
- dimname2 = 'dim2'
- n_extra_dims = 2
- allocate(extra_dims(n_extra_dims))
- extra_dims(1) = dim_slice_type('dim3', 4)
- extra_dims(2) = dim_slice_type('dim4', 5)
- index1 = (/1,2,1,2,2/)
- index2 = (/1,2,2,1,3/)
- fill_val = -1.
- data_t = (/1145., 2245., 1245., 2145., 2345./)
- ierr_t = 0
- call lookup_2d_netcdf(ncid, tablename, invalid_lookup, dimname1, dimname2, &
- n_extra_dims, index1, index2, fill_val, data, ierr, extra_dims=extra_dims)
- call check_results
- deallocate(index1, index2, data, data_t, extra_dims)
-
- ! The following tests should result in the code aborting with an error message.
- !
- ! We don't usually want to run these tests, because they result in the code
- ! aborting, but we may want to run them occasionally to make sure this
- ! error-handling is working correctly.
-
- if (test_abort1) then
- testname = '2-d lookup table with incorrect dimname for dimension 2'
- allocate(index1(5), index2(5), data(5), data_t(5))
- tablename = 'lookup2d'
- invalid_lookup = .true.
- dimname1 = 'dim1'
- dimname2 = 'bad2' ! this differs from the value in the file
- n_extra_dims = 0
- index1 = (/1,2,1,2,2/)
- index2 = (/1,2,2,1,3/)
- fill_val = -1.
- ! Note that the third value is fill_val because lookup2d(1,2) is missing (i.e.,
- ! equal to the _FillVal in the netcdf file)
- data_t = (/11._r8, 22._r8, fill_val, 21._r8, 23._r8/)
- ierr_t = 1
- call lookup_2d_netcdf(ncid, tablename, invalid_lookup, dimname1, dimname2, &
- n_extra_dims, index1, index2, fill_val, data, ierr)
- deallocate(index1, index2, data, data_t)
- end if
-
- if (test_abort2) then
- testname = '3-d lookup table with incorrect dimname for dimension 3'
- allocate(index1(5), index2(5), data(5), data_t(5))
- tablename = 'lookup3d'
- invalid_lookup = .false.
- dimname1 = 'dim1'
- dimname2 = 'dim2'
- n_extra_dims = 1
- allocate(extra_dims(n_extra_dims))
- extra_dims(1) = dim_slice_type('bad3', 2) ! this name differs from the value in the file
- nodata = -999
- index1 = (/nodata,2,1,2,2/)
- index2 = (/1,2,2,1,nodata/)
- fill_val = -1.
- data_t = (/fill_val, 222._r8, 122._r8, 212._r8, fill_val/)
- ierr_t = 0
- call lookup_2d_netcdf(ncid, tablename, invalid_lookup, dimname1, dimname2, &
- n_extra_dims, index1, index2, fill_val, data, ierr, extra_dims=extra_dims, &
- nodata=nodata)
- deallocate(index1, index2, data, data_t, extra_dims)
- end if
-
- if (test_abort3) then
- testname = '3-d lookup table, trying to access too large index for dimension 3'
- allocate(index1(5), index2(5), data(5), data_t(5))
- tablename = 'lookup3d'
- invalid_lookup = .false.
- dimname1 = 'dim1'
- dimname2 = 'dim2'
- n_extra_dims = 1
- allocate(extra_dims(n_extra_dims))
- extra_dims(1) = dim_slice_type('dim3', 5) ! this index is out of bounds
- nodata = -999
- index1 = (/nodata,2,1,2,2/)
- index2 = (/1,2,2,1,nodata/)
- fill_val = -1.
- data_t = (/fill_val, 222._r8, 122._r8, 212._r8, fill_val/)
- ierr_t = 0
- call lookup_2d_netcdf(ncid, tablename, invalid_lookup, dimname1, dimname2, &
- n_extra_dims, index1, index2, fill_val, data, ierr, extra_dims=extra_dims, &
- nodata=nodata)
- deallocate(index1, index2, data, data_t, extra_dims)
- end if
-
- call check_ret(nf_close(ncid), subname)
-
- contains
- subroutine check_results
- call test_is(data, data_t, modname//' -- '//subname//' -- '//trim(testname)//' -- data')
- call test_is(ierr, ierr_t, modname//' -- '//subname//' -- '//trim(testname)//' -- ierr')
- end subroutine check_results
-
- end subroutine test_lookup_2d_netcdf
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
- subroutine test_which_max
-
- implicit none
-
- real(r8), dimension(:), allocatable :: arr
-
- character(len=128) :: testname
-
- real(r8) :: maxval, maxval_t
- integer :: maxindex, maxindex_t
-
- character(len=*), parameter :: subname = 'test_which_max'
-
-
- testname = 'length-1 array'
- allocate(arr(1))
- arr = (/3.0/)
- maxval_t = 3.0
- maxindex_t = 1
- call which_max(arr, maxval, maxindex)
- call check_results
- deallocate(arr)
-
- testname = 'max @ 1'
- allocate(arr(5))
- arr = (/5.0, 2.0, 3.0, 2.5, 1.5/)
- maxval_t = 5.0
- maxindex_t = 1
- call which_max(arr, maxval, maxindex)
- call check_results
- deallocate(arr)
-
- testname = 'max in middle'
- allocate(arr(5))
- arr = (/1.0, 2.0, 3.0, 2.5, 1.5/)
- maxval_t = 3.0
- maxindex_t = 3
- call which_max(arr, maxval, maxindex)
- call check_results
- deallocate(arr)
-
- testname = 'max at end'
- allocate(arr(5))
- arr = (/1.0, 2.0, 3.0, 2.5, 8.0/)
- maxval_t = 8.0
- maxindex_t = 5
- call which_max(arr, maxval, maxindex)
- call check_results
- deallocate(arr)
-
- testname = 'multiple tied max values'
- allocate(arr(5))
- arr = (/1.0, 3.0, 3.0, 2.5, 1.5/)
- maxval_t = 3.0
- maxindex_t = 2
- call which_max(arr, maxval, maxindex)
- call check_results
- deallocate(arr)
-
- testname = 'max in middle, with lbound present'
- allocate(arr(3:7))
- arr = (/1.0, 3.0, 10.0, 2.5, 8.0/)
- maxval_t = 10.0
- maxindex_t = 5
- call which_max(arr, maxval, maxindex, lbound=3)
- call check_results
- deallocate(arr)
-
- contains
- subroutine check_results
- call test_is(maxval, maxval_t, modname//' -- '//subname//' -- '//trim(testname)//' -- maxval')
- call test_is(maxindex, maxindex_t, modname//' -- '//subname//' -- '//trim(testname)//' -- maxindex')
- end subroutine check_results
-
- end subroutine test_which_max
-!------------------------------------------------------------------------------
-
-end module test_mkindexmapMod
-
diff --git a/tools/mksurfdata_map/unit_testers/test_mkncdio.F90 b/tools/mksurfdata_map/unit_testers/test_mkncdio.F90
deleted file mode 100644
index b96dc47071..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkncdio.F90
+++ /dev/null
@@ -1,82 +0,0 @@
-module test_mkncdio
-! Module for testing mkncdio
-
- use mkncdio
- use test_mod
-
- implicit none
- private
-
- public :: test_get_dim_lengths
- public :: test_get_nonexisting_var
-
- character(len=*), parameter :: modname = 'test_mkncdio'
-
-contains
-
-!------------------------------------------------------------------------------
- subroutine test_get_dim_lengths
-
- implicit none
-
- character(len=128) :: testname
- integer :: ncid
- character(len=128) :: varname
- integer :: ndims, ndims_t
- integer :: dim_lengths(nf_max_var_dims), dim_lengths_t(nf_max_var_dims)
-
- character(len=*), parameter :: filename = 'unit_testers/inputs/test_lookup_2d_netcdf.nc'
-
- character(len=*), parameter :: subname = 'test_get_dim_lengths'
-
- ! Open netcdf file that will be used for most tests
- call check_ret(nf_open(filename, 0, ncid), subname)
-
- testname = '3d variable'
- varname = 'lookup3d'
- ndims_t = 3
- dim_lengths_t = 0
- dim_lengths_t(1) = 2
- dim_lengths_t(2) = 3
- dim_lengths_t(3) = 4
- call get_dim_lengths(ncid, varname, ndims, dim_lengths)
- call check_results
-
- call check_ret(nf_close(ncid), subname)
-
- contains
- subroutine check_results
- call test_is(ndims, ndims_t, modname//' -- '//subname//' -- '//trim(testname)//' -- ndims')
- call test_is(dim_lengths(1:ndims), dim_lengths_t(1:ndims_t), &
- modname//' -- '//subname//' -- '//trim(testname)//' -- dim_lengths')
- end subroutine check_results
-
- end subroutine test_get_dim_lengths
-
-!------------------------------------------------------------------------------
- subroutine test_get_nonexisting_var
-
- implicit none
-
- character(len=128) :: testname
- integer :: ncid
- character(len=128) :: varname
- integer :: varid
- logical :: varexists
-
- character(len=*), parameter :: filename = 'unit_testers/inputs/test_lookup_2d_netcdf.nc'
-
- character(len=*), parameter :: subname = 'test_get_nonexiting_var'
-
- testname = 'check if variables exist'
- varname = 'lookup3d'
- ! Open netcdf file that will be used for most tests
- call check_ret(nf_open(filename, 0, ncid), subname)
- call check_ret(nf_inq_varid(ncid, "zztop", varid), subname, varexists=varexists)
- call test_is(.not.varexists, modname//' -- '//subname//' -- '//trim(testname)//' -- non existing var')
- call check_ret(nf_inq_varid(ncid, varname, varid), subname, varexists=varexists)
- call test_is(varexists, modname//' -- '//subname//' -- '//trim(testname)//' -- existing var')
-
- end subroutine test_get_nonexisting_var
-
-end module test_mkncdio
diff --git a/tools/mksurfdata_map/unit_testers/test_mksurfdata_map.F90 b/tools/mksurfdata_map/unit_testers/test_mksurfdata_map.F90
deleted file mode 100644
index 96b463645b..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mksurfdata_map.F90
+++ /dev/null
@@ -1,54 +0,0 @@
-! Run unit tests for mksurfdata_map
-program mksurfdata_map_unit_tester
- use test_mkdomainMod
- use test_mkutilsMod
- use test_mkgridmapMod
- use test_mkindexmapMod
- use test_mkchecksMod
- use test_mkurbanparMod
- use test_mkncdio
- use test_mkharvest
- use test_mod, only : test_init, test_final
-
- call test_init
-
- ! Test mkdomainMod
- call test_domain_read_dims
-
- ! Test mkutilsMod
- call test_slightly_below
- call test_slightly_above
-
- ! Test mkgridmapMod
- call test_gridmap_areaave_default
- call test_gridmap_areaave_srcmask
- call test_gridmap_areaave_srcmask2
- call test_gridmap_areastddev
-
- ! Test mkindexmapMod
- call test_get_dominant_indices
- call test_filter_same
- call test_lookup_2d
- call test_lookup_2d_netcdf
- call test_which_max
-
- ! Test mkchecksMod
- call test_min_bad
- call test_max_bad
-
- ! Test mkurbanparMod
- call test_normalize_urbn_by_tot
-
- ! Test mkharvestMod
- call test_harvest_init
- call test_harvest_init_old
- call test_harvest_data_all1D
- call test_harvest_data
-
- ! Test mkncdio
- call test_get_dim_lengths
- call test_get_nonexisting_var
-
- call test_final
-
-end program mksurfdata_map_unit_tester
diff --git a/tools/mksurfdata_map/unit_testers/test_mkurbanparMod.F90 b/tools/mksurfdata_map/unit_testers/test_mkurbanparMod.F90
deleted file mode 100644
index 30168eb97c..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkurbanparMod.F90
+++ /dev/null
@@ -1,75 +0,0 @@
-module test_mkurbanparMod
-! Module for testing mkurbanparMod
-
- use mkurbanparMod
- use test_mod
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- public :: test_normalize_urbn_by_tot
-
- character(len=*), parameter :: modname = 'test_mkurbanparMod'
-
-contains
-
-!------------------------------------------------------------------------------
- subroutine test_normalize_urbn_by_tot
-
- use mkutilsMod, only : normalize_classes_by_gcell
-
- implicit none
-
- character(len=128) :: testname
-
- real(r8), allocatable :: classes_pct_gcell_t(:,:)
- real(r8), allocatable :: classes_pct_gcell(:,:)
- real(r8), allocatable :: classes_pct_tot(:,:)
- real(r8), allocatable :: sums(:)
-
- integer :: n,nmax,nclass,totsize
-
- real(r8), parameter :: eps = 1.e-13_r8
-
- character(len=*), parameter :: subname = 'test_normalize_urbn_by_tot'
-
-
- ! This test does a basic check of both normalize_urbn_by_tot and
- ! normalize_classes_by_gcell, by ensuring that when the two are called in
- ! succession, the result is the same as the initial values
- ! (Note that it doesn't directly check the intermediate values -- i.e. the output
- ! produced by normalize_urbn_by_tot)
- testname = 'normalize_urbn_by_tot then normalize_classes_by_gcell'
- nmax = 7
- nclass = 3
- totsize = nmax*nclass
- allocate(classes_pct_gcell_t(nmax,nclass), &
- classes_pct_gcell (nmax,nclass), &
- classes_pct_tot (nmax,nclass), &
- sums (nmax))
-
- ! The following values are designed to test a number of things, including summing
- ! to 100, summing to 0, some values 0 for a given n, and no values being 0 for a
- ! given n
- classes_pct_gcell_t(:,1) = (/ 0., 5., 0., 0., 10., 0., 10./)
- classes_pct_gcell_t(:,2) = (/ 0., 0., 0., 100., 30., 15., 50./)
- classes_pct_gcell_t(:,3) = (/100., 30., 0., 0., 20., 0., 40./)
-
- do n = 1, nmax
- sums(n) = sum(classes_pct_gcell_t(n,:))
- end do
-
- call normalize_urbn_by_tot(classes_pct_gcell_t, sums, classes_pct_tot)
- call normalize_classes_by_gcell(classes_pct_tot, sums, classes_pct_gcell)
- call test_close(reshape(classes_pct_gcell, (/totsize/)), &
- reshape(classes_pct_gcell_t, (/totsize/)), &
- eps, modname//' -- '//subname//' -- '//trim(testname), rel_diff=.true.)
-
- deallocate(classes_pct_gcell_t, classes_pct_gcell, classes_pct_tot, sums)
-
-
- end subroutine test_normalize_urbn_by_tot
-!------------------------------------------------------------------------------
-
-end module test_mkurbanparMod
diff --git a/tools/mksurfdata_map/unit_testers/test_mkutilsMod.F90 b/tools/mksurfdata_map/unit_testers/test_mkutilsMod.F90
deleted file mode 100644
index 53b5b1b8c3..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mkutilsMod.F90
+++ /dev/null
@@ -1,112 +0,0 @@
-module test_mkutilsMod
-! Module for testing mkutilsMod
-
- use mkutilsMod
- use test_mod
- use shr_kind_mod, only : r8 => shr_kind_r8
-
- implicit none
- private
-
- public :: test_slightly_below
- public :: test_slightly_above
-
- character(len=*), parameter :: modname = 'test_mkutilsMod'
-
-contains
-
-!------------------------------------------------------------------------------
- subroutine test_slightly_below
-
- implicit none
-
- character(len=128) :: testname
-
- logical :: retval
- real(r8) :: a
- real(r8) :: b
-
- character(len=*), parameter :: subname = 'test_slightly_below'
-
- testname='basic-true'
- b = 3.0
- a = 3.0 - b*epsilon(b)
- retval = slightly_below(a,b)
- call test_is((retval .eqv. .true.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='far below'
- b = 3.0
- a = 2.0
- retval = slightly_below(a,b)
- call test_is((retval .eqv. .false.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='equal'
- b = 3.0
- a = 3.0
- retval = slightly_below(a,b)
- call test_is((retval .eqv. .false.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='above'
- b = 3.0
- a = 3.0 + epsilon(b)
- retval = slightly_below(a,b)
- call test_is((retval .eqv. .false.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='change epsilon to allow far below'
- b = 3.0
- a = 2.0
- retval = slightly_below(a,b,eps=0.75_r8)
- call test_is((retval .eqv. .true.), modname//' -- '//subname//' -- '//trim(testname))
-
- end subroutine test_slightly_below
-!------------------------------------------------------------------------------
-
-!------------------------------------------------------------------------------
- subroutine test_slightly_above
-
- implicit none
-
- character(len=128) :: testname
-
- logical :: retval
- real(r8) :: a
- real(r8) :: b
-
- character(len=*), parameter :: subname = 'test_slightly_above'
-
- testname='basic-true'
- b = 3.0
- a = 3.0 + b*epsilon(b)
- retval = slightly_above(a,b)
- call test_is((retval .eqv. .true.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='far above'
- b = 3.0
- a = 4.0
- retval = slightly_above(a,b)
- call test_is((retval .eqv. .false.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='equal'
- b = 3.0
- a = 3.0
- retval = slightly_above(a,b)
- call test_is((retval .eqv. .false.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='below'
- b = 3.0
- a = 3.0 - epsilon(b)
- retval = slightly_above(a,b)
- call test_is((retval .eqv. .false.), modname//' -- '//subname//' -- '//trim(testname))
-
- testname='change epsilon to allow far above'
- b = 3.0
- a = 4.0
- retval = slightly_above(a,b,eps=0.75_r8)
- call test_is((retval .eqv. .true.), modname//' -- '//subname//' -- '//trim(testname))
-
- end subroutine test_slightly_above
-!------------------------------------------------------------------------------
-
-end module test_mkutilsMod
-
-
diff --git a/tools/mksurfdata_map/unit_testers/test_mod.F90 b/tools/mksurfdata_map/unit_testers/test_mod.F90
deleted file mode 100644
index 967eee1c89..0000000000
--- a/tools/mksurfdata_map/unit_testers/test_mod.F90
+++ /dev/null
@@ -1,339 +0,0 @@
-module test_mod
-
-use shr_kind_mod, only : SHR_KIND_R8
-use shr_sys_mod, only : shr_sys_abort
-
-implicit none
-
-public test_init
-public test_is
-public test_close
-public test_final
-
-integer, save :: ntests = 0
-integer, save :: npass = 0
-integer, save :: num_expected = 0
-logical, save :: num_expected_given = .false.
-character(*), parameter :: formatTest = '(A4, " ", i5.5, " - ", A)'
-character(*), parameter :: formatArrayMatch = &
- '(" (all ", i5, " values match)")'
-character(*), parameter :: formatArray2DMatch = &
- '(" (all ", i5, "x", i5, " values match)")'
-character(*), parameter :: formatArrayMisMatch = &
- '(" (only ", i5, " values of ", i5, " values match)")'
-character(*), parameter :: formatArray2DMisMatch = &
- '(" (only ", i5, " values of ", i5, "x", i5, " values match)")'
-character(*), parameter :: formatRArrayClose = &
- '(" (all ", i5, " values are within", 1pe9.1e2, " )")'
-character(*), parameter :: formatRArrayNotClose = &
- '(" (only ", i5, " values of ", i5, " values are within", 1pe9.1e2, " max diff= ", 1pe9.1e2, ")")'
-character(*), parameter :: formatRClose = &
- '(" ( value within", 1pe9.1e2, " )")'
-character(*), parameter :: formatRNotClose = &
- '(" ( value within", 1pe9.1e2, " diff= ", 1pe9.1e2, ")")'
-
-interface test_is
- module procedure test_is_logical
- module procedure test_is_logical1D
- module procedure test_is_string
- module procedure test_is_integer
- module procedure test_is_integer1D
- module procedure test_is_real1D
- module procedure test_is_real2D
- module procedure test_is_realScalar
-end interface test_is
-
-interface test_close
- module procedure test_close_real1D
- module procedure test_close_realScalar
-end interface test_close
-
-private test_is_logical
-private test_is_string
-private test_is_integer
-private test_is_integer1D
-private test_is_real1D
-private test_is_realScalar
-private test_close_real1D
-
-contains
-
-
-subroutine test_init( num_expected_tests )
- integer, intent(IN), optional :: num_expected_tests
-
- if ( present(num_expected_tests) ) then
- num_expected = num_expected_tests
- num_expected_given = .true.
- write(*,formatTest) "1...", num_expected, "expected tests"
- write(*,*)
- end if
-
-end subroutine test_init
-
-subroutine test_is_logical( pass, description )
-
- implicit none
-
- logical, intent(IN) :: pass ! If matches or not
- character(*), intent(IN) :: description ! description of test
-
- character(4) :: status
-
- ntests = ntests + 1
- if ( pass )then
- npass = npass + 1
- status = "PASS"
- else
- status = "FAIL"
- end if
- write(*,formatTest) status, ntests, trim(description)
-
-end subroutine test_is_logical
-
-subroutine test_is_logical1D( value, expected, description )
-
- implicit none
-
- logical, intent(IN) :: value(:) ! test value
- logical, intent(IN) :: expected(:) ! expected value
- character(*), intent(IN) :: description ! description of test
-
- logical :: pass
- integer :: nsize, nmatch
- character(256) :: descrip
-
- nsize = size(value)
- if ( all(value .eqv. expected) )then
- pass = .true.
- write(descrip,formatArrayMatch) nsize
- else
- nmatch = count(value .eqv. expected)
- write(descrip,formatArrayMisMatch) nmatch, nsize
- pass = .false.
- end if
- call test_is_logical( pass, trim(description)//trim(descrip) )
-
-end subroutine test_is_logical1D
-
-
-subroutine test_is_string( value, expected, description )
-
- implicit none
-
- character(len=*), intent(IN) :: value
- character(len=*), intent(IN) :: expected
- character(len=*), intent(IN) :: description ! description of test
-
-
- logical :: pass ! If matches or not
-
- character(4) :: status
-
- if ( trim(value) == trim(expected) )then
- pass = .true.
- else
- pass = .false.
- end if
- ntests = ntests + 1
- if ( pass )then
- npass = npass + 1
- status = "PASS"
- else
- status = "FAIL"
- end if
- write(*,formatTest) status, ntests, trim(description)
-
-end subroutine test_is_string
-
-subroutine test_is_integer( value, expected, description )
- integer, intent(IN) :: value ! test value
- integer, intent(IN) :: expected ! expected value
- character(*), intent(IN) :: description ! description of test
-
- logical :: pass
-
- if ( value == expected )then
- pass = .true.
- else
- pass = .false.
- end if
- call test_is_logical( pass, description )
-
-end subroutine test_is_integer
-
-subroutine test_is_integer1D( value, expected, description )
- integer, intent(IN) :: value(:) ! test value
- integer, intent(IN) :: expected(:) ! expected value
- character(*), intent(IN) :: description ! description of test
-
- logical :: pass
- integer :: nsize, nmatch
- character(256) :: descrip
-
- nsize = size(value)
- if ( all(value == expected) )then
- pass = .true.
- write(descrip,formatArrayMatch) nsize
- else
- nmatch = count(value == expected)
- write(descrip,formatArrayMisMatch) nmatch, nsize
- pass = .false.
- end if
- call test_is_logical( pass, trim(description)//trim(descrip) )
-
-end subroutine test_is_integer1D
-
-subroutine test_is_real1D( value, expected, description )
- real(SHR_KIND_R8), intent(IN) :: value(:) ! test value
- real(SHR_KIND_R8), intent(IN) :: expected(:) ! expected value
- character(*), intent(IN) :: description ! description of test
-
- logical :: pass
- integer :: nsize, nmatch
- character(256) :: descrip
-
- nsize = size(value)
- if ( all(value == expected) )then
- pass = .true.
- write(descrip,formatArrayMatch) nsize
- else
- nmatch = count(value == expected)
- write(descrip,formatArrayMisMatch) nmatch, nsize
- pass = .false.
- end if
- call test_is_logical( pass, trim(description)//trim(descrip) )
-
-end subroutine test_is_real1D
-
-subroutine test_is_real2D( value, expected, description )
- real(SHR_KIND_R8), intent(IN) :: value(:,:) ! test value
- real(SHR_KIND_R8), intent(IN) :: expected(:,:) ! expected value
- character(*), intent(IN) :: description ! description of test
-
- logical :: pass
- integer :: nsize1, nsize2, nmatch
- character(256) :: descrip
-
- nsize1 = size(value,1)
- nsize2 = size(value,2)
- if ( all(value == expected) )then
- pass = .true.
- write(descrip,formatArray2DMatch) nsize1, nsize2
- else
- nmatch = count(value == expected)
- write(descrip,formatArray2DMisMatch) nmatch, nsize1, nsize2
- pass = .false.
- end if
- call test_is_logical( pass, trim(description)//trim(descrip) )
-
-end subroutine test_is_real2D
-
-subroutine test_is_realScalar( value, expected, description )
- real(SHR_KIND_R8), intent(IN) :: value ! test value
- real(SHR_KIND_R8), intent(IN) :: expected ! expected value
- character(*), intent(IN) :: description ! description of test
-
- logical :: pass
-
- if ( value == expected )then
- pass = .true.
- else
- pass = .false.
- end if
- call test_is_logical( pass, description )
-
-end subroutine test_is_realScalar
-
-subroutine test_close_real1D( value, expected, eps, description, rel_diff )
- real(SHR_KIND_R8), intent(IN) :: value(:) ! test value
- real(SHR_KIND_R8), intent(IN) :: expected(:) ! expected value
- real(SHR_KIND_R8), intent(IN) :: eps ! epsilon -- how close to be within
- character(*), intent(IN) :: description ! description of test
- logical, optional, intent(IN) :: rel_diff ! if should do relative difference or not
-
- logical :: pass, lreldiff
- integer :: nsize, nmatch, i, n0(1), nf(1)
- real(SHR_KIND_R8) :: within, diff
- character(256) :: descrip
-
- lreldiff = .false.
- if ( present(rel_diff) ) lreldiff = rel_diff
- nsize = size(value)
- if ( nsize /= size(expected) )then
- call shr_sys_abort( "size of value and expected array is different" )
- end if
- if ( any(lbound(value) /= lbound(expected)) )then
- call shr_sys_abort( "lower bound of value and expected array is different" )
- end if
- nmatch = 0
- n0 = lbound(value)
- nf = ubound(value)
- within = abs(value(n0(1)) - expected(n0(1)))
- if ( lreldiff .and. within > 0.0_SHR_KIND_R8 ) within = within / max( abs(value(n0(1))), abs(expected(n0(1))) )
- do i = n0(1), nf(1)
- diff = abs(value(i) - expected(i))
- if ( lreldiff .and. diff > 0.0_SHR_KIND_R8 ) diff = diff / max(abs(value(i)),abs(expected(i)) )
- within = max( within, diff )
- if ( diff <= eps ) nmatch = nmatch + 1
- end do
- if( nmatch == nsize )then
- write(descrip,formatRArrayClose) nsize, eps
- pass = .true.
- else
- write(descrip,formatRArrayNotClose) nmatch, nsize, eps, within
- pass = .false.
- end if
- call test_is_logical( pass, trim(description)//trim(descrip) )
-
-end subroutine test_close_real1D
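The `test_close_real1D` subroutine above counts elements that agree within `eps`, optionally as a relative difference divided by the larger magnitude. A minimal Python sketch of the same comparison logic (`close_1d` is a hypothetical helper, not part of the original module):

```python
def close_1d(value, expected, eps, rel_diff=False):
    """Count how many elements agree within eps; optionally use a relative
    difference, dividing by the larger magnitude as the Fortran code does."""
    if len(value) != len(expected):
        raise ValueError("size of value and expected array is different")
    nmatch = 0
    worst = 0.0
    for v, e in zip(value, expected):
        diff = abs(v - e)
        if rel_diff and diff > 0.0:
            diff /= max(abs(v), abs(e))
        worst = max(worst, diff)  # track the largest (possibly relative) difference
        if diff <= eps:
            nmatch += 1
    return nmatch == len(value), nmatch, worst
```

With `rel_diff=True`, a 1% error passes a 2% tolerance that the same absolute difference would fail.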
-
-subroutine test_close_realScalar( value, expected, eps, description )
- real(SHR_KIND_R8), intent(IN) :: value ! test value
- real(SHR_KIND_R8), intent(IN) :: expected ! expected value
- real(SHR_KIND_R8), intent(IN) :: eps ! epsilon -- how close to be within
- character(*), intent(IN) :: description ! description of test
-
- logical :: pass
- real(SHR_KIND_R8) :: diff
- character(256) :: descrip
-
- diff = abs(value - expected)
- if ( diff <= eps ) then
- write(descrip,formatRClose) eps
- pass = .true.
- else
- write(descrip,formatRNotClose) eps, diff
- pass = .false.
- end if
- call test_is_logical( pass, trim(description)//trim(descrip) )
-
-end subroutine test_close_realScalar
-
-subroutine test_final( PassStatus )
-
- logical, intent(OUT), optional :: PassStatus
-
- character(4) :: status
- character(50) :: desc
-
- write(*,*)
- status = "PASS"
- if ( present(PassStatus) ) PassStatus = .true.
- desc = "All expected tests ran successfully"
- if ( num_expected_given .and. ntests /= num_expected )then
- status = "FAIL"
- desc = "Different number of tests than expected"
- if ( present(PassStatus) ) PassStatus = .false.
- end if
- if ( npass /= ntests )then
- status = "FAIL"
- if ( present(PassStatus) ) PassStatus = .false.
- write(desc,'(A,i3,A)') "Not all tests passed (", &
- ntests-npass, " tests failed)"
- end if
- write(*,formatTest) status, ntests, "tests run -- "//desc
-
-end subroutine test_final
-
-end module test_mod
diff --git a/tools/ncl_scripts/README b/tools/ncl_scripts/README
deleted file mode 100644
index 52abcdaf78..0000000000
--- a/tools/ncl_scripts/README
+++ /dev/null
@@ -1,23 +0,0 @@
-$CTSMROOT/tools/ncl_scripts Jun/08/2018
-
-CLM NCL script tools for analysis of CLM history files -- or for creation or
-modification of CLM input files.
-
-In order to make these scripts work in the testing framework the following must
-be done.
-
-1.) Respond to the CSMDATA and CLM_ROOT environment variables as needed.
-2.) Print a line with "success" after the work is completed.
-
-Scripts available:
-
-Master perl scripts that call the other ncl scripts:
-
-getregional_datasets.pl ----- Extract out regional datasets from global ones and put files in a location that
- can be used by build-namelist.
-
-NCL Scripts available:
-
-getregional_datasets.ncl ---- NCL script to extract out regional datasets.
-getco2_historical.ncl ------- Get historical CO2 to use for input in datm8 streams
-
diff --git a/tools/ncl_scripts/README.getregional b/tools/ncl_scripts/README.getregional
deleted file mode 100644
index b39dc6baa9..0000000000
--- a/tools/ncl_scripts/README.getregional
+++ /dev/null
@@ -1,35 +0,0 @@
-$CTSMROOT/tools/ncl_scripts/README.getregional Erik Kluzek
- 06/08/2018
-
-Information on the getregional_datasets script.
-
-The getregional_datasets.pl script operates on global datasets and
-extracts out a regional box (or single point) within it.
-
-
-QUICKSTART:
-
-
-Here is how you would use the script to set up a simple case.
-
-1.) Create list of input global files you want to extract from.
-
-A sample file is: sample_inlist
-
-2.) Create list of regional files that will be created.
-
-A sample file is: sample_outlist
-
-3.) Run getregional
-
-set DIR=`pwd`
-./getregional_datasets.pl -ne 74,221 -sw 51,189 -i sample_inlist -o sample_outlist
-
-4.) Make sure the user_nl_clm and xmlchange_cmnds files are correct.
-
-getregional will create a user_nl_clm file and an xmlchange_cmnds script to set
-needed env_run settings.
-
-5.) Create your case using the user_mods_dir option and CLM_USRDAT resolution
-
-./create_newcase --res CLM_USRDAT --user_mods_dir $DIR --case testAlaska --compset I2000Clm50SpGs
diff --git a/tools/ncl_scripts/getco2_historical.ncl b/tools/ncl_scripts/getco2_historical.ncl
deleted file mode 100644
index c071fa42d6..0000000000
--- a/tools/ncl_scripts/getco2_historical.ncl
+++ /dev/null
@@ -1,196 +0,0 @@
-;
-; Take the greenhouse gas file used by CAM for historical (and future) representations of
-; greenhouse gases, and convert it to a format that can be used by streams.
-; So include domain data for a single point (or latitude bands) that covers the globe, as well
-; as CO2 data over those latitude bands. In the process we also discard the other
-; greenhouse gases, as the datm can only pass CO2.
-;
-; Erik Kluzek
-; Mar/03/2010
-;
-begin
- ; ===========================================================================================================
-
-
- ; ===========================================================================================================
- ;
- ; Setup the namelist query script
- ;
- csmdata = getenv("CSMDATA");
- clmroot = getenv("CLM_ROOT");
- hgrid = getenv("HGRID"); ; Get horizontal grid to use from env variable
- querynml = "bld/queryDefaultNamelist.pl -silent -justvalue ";
- if ( .not. ismissing(csmdata) )then
- querynml = querynml+" -csmdata "+csmdata;
- end if
- if ( ismissing(clmroot) )then
- querynml = "../../"+querynml;
- else
- querynml = clmroot+"/"+querynml;
- end if
- if ( ismissing(hgrid) )then
- hgrid = "lat-bands"
- end if
- ;
- ; Get input Greenhouse gas file and open it
- ;
- filetype = "mkghg_bndtvghg";
- print( querynml+" -namelist clmexp -var "+filetype+" -options hgrid="+hgrid );
- ghgfile = systemfunc( querynml+" -namelist clmexp -var "+filetype+" -options hgrid="+hgrid );
- print( "Use "+filetype+" file: "+ghgfile );
- if ( systemfunc("test -f "+ghgfile+"; echo $?" ) .ne. 0 )then
- print( "Input "+filetype+" file does not exist or not found: "+ghgfile );
- exit
- end if
- ncg = addfile( ghgfile, "r" );
-
- ;
- ; Get date time-stamp to put on output CO2 file
- ;
- sdate = systemfunc( "date +%y%m%d" );
- ldate = systemfunc( "date" );
-
- sim_yr0 = ncg->date(0) / 10000;
- ntime = dimsizes( ncg->date );
- sim_yr2 = ncg->date(ntime-1) / 10000;
-
- sim_yr_rng = "simyr_"+sim_yr0 + "-" + sim_yr2;
-
- cmip_vers = "_CMIP6_";
- outco2filename = "fco2_datm_"+hgrid+sim_yr_rng+cmip_vers+"c"+sdate+".nc";
- system( "/bin/rm -f "+outco2filename );
- print( "output file: "+outco2filename );
- nco = addfile( outco2filename, "c" );
- ;
- ; Define dimensions
- ;
- if ( hgrid .eq. "lat-bands" )then
- nlat = dimsizes(ncg->lat);
- else
- if ( hgrid .eq. "global" )then
- nlat = 1
- else
- print( "hgrid type can only be global or lat-bands: "+hgrid )
- exit
- end if
- end if
- nlon = 1;
- nv = 4;
- dimnames = (/ "time", "lat", "lon", "nv", "bounds" /);
- dsizes = (/ ntime, nlat, nlon, nv, 2 /);
- is_unlim = (/ True, False, False, False, False /);
- filedimdef( nco, dimnames, dsizes, is_unlim );
- ;
- ; Define variables
- ;
- vars = (/ "lonc", "latc", "lonv", "latv", "mask", "frac", "area", "CO2" /);
- units= (/ "degrees_east", "degrees_north", "degree_east", "degrees_north", "unitless", "unitless", "radians^2", "ppmv" /);
- lname= (/ "Longitude of grid cell center", "Latitude of grid cell center", "Longitudes of grid cell vertices", "Latitudes of grid cell vertices", "Mask of active cells: 1=active", "Fraction of grid cell that is active", "Area of grid cell", "CO2 concentration" /);
- print( "Define variables: "+vars );
- do i= 0, dimsizes(vars)-1
- if ( vars(i) .eq. "lonv" .or. vars(i) .eq. "latv" )then
- filevardef ( nco, vars(i), "double", (/ "lat", "lon", "nv" /) );
- else
- if ( vars(i) .eq. "CO2" )then
- filevardef ( nco, vars(i), "float", (/ "time", "lat", "lon" /) );
- nco->$vars(i)$@coordinate = "latc lonc time";
- else
- filevardef ( nco, vars(i), "double", (/ "lat", "lon" /) );
- end if
- end if
- nco->$vars(i)$@units = units(i);
- nco->$vars(i)$@long_name = lname(i);
- end do
- filevardef ( nco, "time", "float", (/ "time" /) );
- filevardef ( nco, "time_bnds", "float", (/ "time", "bounds" /) );
- filevardef ( nco, "date", "integer", (/ "time" /) );
- varstatic = (/ "mask", "frac", "area" /);
- do i = 0, dimsizes(varstatic)-1
- nco->$varstatic(i)$@coordinate = "latc lonc";
- end do
- nco->lonc@bounds = "lonv";
- nco->latc@bounds = "latv";
- ;
- ; Add attributes
- ;
- fileattdef ( nco, ncg );
- nco@history = ldate+": Convert by getco2_historical.ncl";
- nco@source = "Convert from:"+ghgfile;
- nco@Version = systemfunc( "git describe" );
- filevarattdef( nco, "time", ncg->time );
- filevarattdef( nco, "date", ncg->date );
- nco->time_bnds@long_name = nco->time@long_name;
- nco->time_bnds@units = nco->time@units;
- nco->time_bnds@calendar = nco->time@calendar;
- ;
- ; Set static variables
- ;
- pi = 3.14159265358979323846d00;
- nco->mask = 1;
- nco->frac = 1.0;
- if ( nlat .gt. 1 )then
- nco->latc = (/ ncg->lat/);
- else
- nco->latc = (/ 0.0d00 /);
- end if
- nco->latv(nlat-1,0,0) = 90.0d00;
- nco->latv(nlat-1,0,3) = 90.0d00;
- if ( nlat .gt. 1 )then
- nco->latv(0:nlat-2,0,0) = ( (/ ncg->lat(0:nlat-2) /) + (/ncg->lat(1:nlat-1) /) )*0.5d00
- nco->latv(0:nlat-2,0,3) = (/ nco->latv(0:nlat-2,0,0) /);
- nco->latv(1:nlat-1,0,1) = (/ nco->latv(0:nlat-2,0,0) /);
- nco->latv(1:nlat-1,0,2) = (/ nco->latv(1:nlat-1,0,1) /);
- end if
- nco->latv(0,0,1) = -90.0d00;
- nco->latv(0,0,2) = -90.0d00;
- nco->lonv(:,0,0) = 0.0d00;
- nco->lonv(:,0,3) = 0.0d00;
- nco->lonc = 180.0d00;
- nco->lonv(:,0,1) = 360.0d00;
- nco->lonv(:,0,2) = 360.0d00;
- clkws = gc_clkwise( nco->latv, nco->lonv );
- if ( any(clkws .eq. False) )then
- print( "Some vertices are NOT clockwise" );
- exit
- end if
- ; EBK -- NOTE The NCL function wasn't giving me the correct answer so I used the mathematical expression
- ;nco->area = dble2flt( gc_qarea( nco->latv, nco->lonv ) );
- conv2rad = pi/180.0d00
- nco->area(:,0) = 2.0d00*pi*abs( sin((/nco->latv(:,0,0)/)*conv2rad) - sin((/nco->latv(:,0,1)/)*conv2rad) );
- if ( abs(sum(nco->area) - 4.0d00*pi) .gt. 1.d-14 )then
- print( "Area of globe does not sum to 4*pi as expected" );
- exit
- end if
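The area expression used above gives each latitude band an area of 2π|sin(φ₁) − sin(φ₂)| steradians, and the script verifies the bands sum to 4π. A small Python check of the same identity (a sketch, not part of the original NCL script):

```python
import math

def band_areas(lat_edges):
    """Steradian area of each latitude band between consecutive edges,
    using the exact 2*pi*|sin(lat1) - sin(lat2)| expression."""
    rad = math.pi / 180.0
    return [2.0 * math.pi * abs(math.sin(a * rad) - math.sin(b * rad))
            for a, b in zip(lat_edges[:-1], lat_edges[1:])]

# Bands from pole to pole must tile the sphere: total area = 4*pi
edges = list(range(-90, 91, 10))
total = sum(band_areas(edges))
assert abs(total - 4.0 * math.pi) < 1e-12
```

Because sine is monotonic on [-90, 90], the per-band terms telescope to 2π(sin 90° − sin(−90°)) = 4π, which is exactly the consistency check the NCL script performs.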
- ;
- ; Time and date
- ;
- nco->date = (/ ncg->date /);
- nco->time = (/ ncg->time /);
- nco->time_bnds = (/ ncg->time_bnds /);
- nco->date@comment = "This variable is NOT used when read by datm, the time coordinate is used";
- ;
- ; CO2
- ;
- print( "Copy CO2 for "+ntime+" time samples of data" );
- if ( nlat .gt. 1 )then
- do y = 0, nlat-1
- print( "latitude: "+ nco->latc(y,0) );
- nco->CO2(:,y,0) = (/ ncg->CO2_LBC(:,y) /) * 1.e6;
- end do
- else
- ; make sure all latitudes on file are the same for each time
- do itime = 0, ntime-1
- if ( max(ncg->CO2_LBC(itime,:)) .ne. min(ncg->CO2_LBC(itime,:)) )then
- print( "Global average, but latitudes are NOT constant" );
- exit
- end if
- end do
- nco->CO2(:,0,0) = (/ ncg->CO2_LBC(:,0) /) * 1.e6;
- end if
- print( "Average Global First CO2 ppmv value: Date="+nco->date(0)+" CO2="+avg(nco->CO2(0,:,0) ) );
- print( "Average Global Last CO2 ppmv value: Date="+nco->date(ntime-1)+" CO2="+avg(nco->CO2(ntime-1,:,0)) );
-
- print( "================================================================================================" );
- print( "Successfully created output historical CO2 file: "+outco2filename);
-
-end
diff --git a/tools/ncl_scripts/getregional_datasets.ncl b/tools/ncl_scripts/getregional_datasets.ncl
deleted file mode 100644
index a6da88c67a..0000000000
--- a/tools/ncl_scripts/getregional_datasets.ncl
+++ /dev/null
@@ -1,268 +0,0 @@
-;
-; Extract out regional datasets needed to run clm from the global datasets.
-; NOTE: Requires at least NCL version 5.1.0 or later...
-;
-; Erik Kluzek
-; Aug/28/2009
-;
-load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl";
-
-procedure getfilecoord_namenlen( filenames[*]:string, dimnames[*]:string, dimlens[*]:integer, nlen:integer, name:string )
-;
-; get the name and size of either the latitude or longitude
-;
- local d, l
-begin
- if ( name .eq. "" )then
- do d = 0, dimsizes(filenames)-1
- if ( any(dimnames .eq. filenames(d) ) )then
- name = filenames(d)
- ; Get length of this dimension
- do l = 0, dimsizes(dimnames)-1
- if ( dimnames(l) .eq. name )then
- nlen = dimlens(l)
- end if
- end do
- end if
- end do
- end if
-end
-
-begin
- ; ===========================================================================================================
- ;
- ; IMPORTANT NOTE: EDIT THE FOLLOWING TO CUSTOMIZE or use ENV VARIABLE SETTINGS
- ; Edit the following as needed to interpolate to a new resolution.
- ;
- ; Input resolution and position
- ;
- latS = stringtodouble( getenv("S_LAT") ); ; Get south latitude from env variable
- latN = stringtodouble( getenv("N_LAT") ); ; Get north latitude from env variable
- lonE = stringtodouble( getenv("E_LON") ); ; Get east longitude from env variable
- lonW = stringtodouble( getenv("W_LON") ); ; Get west longitude from env variable
- debug_str = getenv("DEBUG"); ; Don't run just -- debug
- print_str = getenv("PRINT"); ; Do Extra printing for debugging
- gridfile = getenv("GRIDFILE"); ; Input global grid file
- nfiles = stringtointeger( getenv("NFILES") ); ; number of files to read in file lists
- filelistfil = getenv("INFILELIST"); ; filename of list of global files to work on
- regfilelistfil = getenv("OUTFILELIST"); ; filename of list of regional files to create
-
- if ( ismissing(nfiles) )then
- print( "NFILES is missing -- need to provide the number of files to process" );
- status_exit( -1 )
- end if
- if ( ismissing(filelistfil) .or. ismissing(regfilelistfil) )then
- print( "INFILELIST or OUTFILELIST is missing -- need to provide both" );
- status_exit( -1 )
- end if
- if ( ismissing(latS) )then
- latS = 52.0d00;
- end if
- if ( ismissing(latN) )then
- latN = 73.0d00;
- end if
- if ( ismissing(lonW) )then
- lonW = 190.0d00;
- end if
- if ( ismissing(lonE) )then
- lonE = 220.0d00;
- end if
- if ( ismissing(print_str) )then
- printn = False;
- else
- if ( print_str .eq. "TRUE" )then
- printn = True;
- else
- printn = False;
- end if
- end if
- if ( ismissing(debug_str) )then
- debug = False;
- else
- if ( debug_str .eq. "TRUE" )then
- print( "DEBUG is TRUE do extra printing AND do NOT execute -- just print what WOULD happen" );
- debug = True;
- printn = True;
- else
- debug = False;
- end if
- end if
- print( "Extract out regional datasets from global datasets" );
- if ( printn .eq. True )then
- print( "Regional: Latitude="+latS+"-"+latN+" Longitude="+lonW+"-"+lonE );
- end if
-
- ;
- ; Setup the namelist query script
- ;
- ldate = systemfunc( "date" );
- clmroot = getenv("CLM_ROOT");
-
- ;
- ; list of latitude and longitude names
- ;
- filelatnames = (/ "lsmlat", "lat", "nj" /);
- filelonnames = (/ "lsmlon", "lon", "ni" /);
-
- ;
- ; Open file
- ;
- if ( systemfunc("test -f "+gridfile+"; echo $?" ) .ne. 0 )then
- print( "Input gridfile does not exist or not found: "+gridfile );
- status_exit( -1 )
- end if
- if ( printn .eq. True )then
- print( "gridfile:"+gridfile );
- end if
- ncg = addfile( gridfile, "r" );
- ;
- ; Get the names for latitude/longitude on the grid file
- ;
- varnames = getfilevarnames( ncg );
- gridlonnm = ""
- gridlatnm = ""
- glat = 0
- glon = 0
- varlens = new( dimsizes(varnames), "integer" );
- getfilecoord_namenlen( (/ "yc", "LATIXY"/), varnames, varlens, glat, gridlatnm );
- getfilecoord_namenlen( (/ "xc", "LONGXY"/), varnames, varlens, glon, gridlonnm );
- delete( varnames );
- delete( varlens );
- if ( gridlatnm .eq. "" )then
- print( "Could not find a recognizable latitude dimension name" )
- status_exit(-1);
- end if
- if ( printn .eq. True )then
- print( "gridlatname = "+gridlatnm )
- print( "gridlonname = "+gridlonnm )
- end if
-
- gridlon = ncg->$gridlonnm$;
- gridlon = where( gridlon < 0.0, 360.0 + gridlon, gridlon );
-
- indx = region_ind ( (/ncg->$gridlatnm$/), (/gridlon/), latS, latN, lonW, lonE );
- ; Indexes into indices
- ilat0 = 0;
- ilatN = 1;
- ilon0 = 2;
- ilonN = 3;
-
- latdim = dimsizes(ncg->$gridlatnm$(:,0))
- londim = dimsizes(gridlon(0,:))
- if ( any( ismissing(indx)) )then
- print( "Indices:"+indx );
- print( "Missing indices found" );
- print( "nlat: "+latdim );
- print( "nlon: "+londim );
- print( "yc: "+ncg->$gridlatnm$(:,0) );
- print( "xc: "+gridlon(0,:) );
- status_exit(-1);
- end if
-
- if ( debug .eq. True )then
- print( "Indices:"+indx );
- end if
- if ( printn .eq. True )then
- print( "Full grid size: nlat = "+latdim+" nlon = "+londim )
- loclatdim = indx(ilatN) - indx(ilat0) + 1;
- loclondim = indx(ilonN) - indx(ilon0) + 1;
- print( "Grid size:"+loclatdim+"x"+loclondim );
- LOLAT = ncg->$gridlatnm$(indx(ilat0),indx(ilon0));
- HILAT = ncg->$gridlatnm$(indx(ilatN),indx(ilonN));
- print( "Actual grid span: Latitude="+LOLAT+"-"+HILAT );
- LOLON = gridlon(indx(ilat0),indx(ilon0));
- HILON = gridlon(indx(ilatN),indx(ilonN));
- print( "Actual grid span: Longitude="+LOLON+"-"+HILON );
- end if
-
- ;
- ; Read in the list of files
- ;
- filelist = asciiread(filelistfil(0), (/ nfiles /), "string");
- regfilelist = asciiread(regfilelistfil(0), (/ nfiles /), "string");
- ;
- ; Loop over each of the files to process...
- ;
- do i = 0, nfiles-1
- ;
- ; Get the filename of the input global file and the output regional filename
- ;
- globalfile = filelist(i)
- if ( systemfunc("test -f "+globalfile+"; echo $?" ) .ne. 0 )then
- print( "Input global "+globalfile+" file does not exist or not found: "+globalfile );
- status_exit(-1);
- end if
- if ( debug .eq. True )then
- print( "Process file: "+globalfile );
- end if
- regfile = regfilelist(i)
- if ( ismissing(regfile) )then
- print( "Output regional filename was NOT found: "+regfile );
- status_exit(-1);
- end if
-
- nc = addfile( globalfile, "r" );
- varnames = getfilevarnames( nc );
- filelonnm = ""
- filelatnm = ""
- nlat = 0
- nlon = 0
- do v = 0, dimsizes(varnames)-1
- dimnames = getfilevardims( nc, varnames(v) );
- dimlens = getfilevardimsizes( nc, varnames(v) );
- getfilecoord_namenlen( filelatnames, dimnames, dimlens, nlat, filelatnm );
- getfilecoord_namenlen( filelonnames, dimnames, dimlens, nlon, filelonnm );
- delete( dimnames );
- delete( dimlens );
- end do
- if ( filelatnm .eq. "" )then
- print( "Could not find a recognizable latitude dimension name" )
- status_exit(-1);
- end if
- if ( printn .eq. True )then
- print( "nlat = "+nlat+" nlon = "+nlon )
- end if
- ;
- ; Check to make sure number of latitudes and longitudes are the same as on the domain file
- ;
- if ( (latdim .ne. nlat) .or. (londim .ne. nlon) )then
- print( "Latitude or longitude dimensions do NOT match the grid file for file: "+globalfile );
- status_exit(-1);
- end if
- ;
- ; Run ncks on it over the region of interest
- ;
- do v = 0, dimsizes(varnames)-1
- cmd = "ncks -O -d "+filelatnm+","+indx(ilat0)+","+indx(ilatN)+" -d "+filelonnm+","+indx(ilon0)+","+indx(ilonN);
- cmd = cmd + " -v " + varnames(v) + " " + globalfile + " "+regfile+"_VAR"+varnames(v)+".nc"
- print( "Execute:"+cmd );
- if ( debug .eq. False )then
- if ( systemfunc( cmd+"; echo $?" ) .ne. 0 )then
- print( "Command did not complete successfully: " );
- status_exit( -1 )
- end if
- end if
- cmd = "ncks -A "+regfile+"_VAR"+varnames(v)+".nc "+regfile
- print( "Execute:"+cmd );
- if ( debug .eq. False )then
- if ( systemfunc( cmd+"; echo $?" ) .ne. 0 )then
- print( "Command did not complete successfully: " );
- status_exit( -1 )
- end if
- system( "/bin/rm "+regfile+"_VAR"+varnames(v)+".nc" )
- end if
- end do
- delete( varnames );
- if ( debug .eq. False )then
- ;
- ; Open up resultant file for writing
- ;
- nco = addfile( regfile, "w" );
- nco@history = nco@history + ":"+ldate + ": ";
- end if
- end do
-
- print( "================================================================================================" );
- print( "Successfully created regional datasets from global datasets" );
-
-end
diff --git a/tools/ncl_scripts/getregional_datasets.pl b/tools/ncl_scripts/getregional_datasets.pl
deleted file mode 100755
index 5fee1a1493..0000000000
--- a/tools/ncl_scripts/getregional_datasets.pl
+++ /dev/null
@@ -1,375 +0,0 @@
-#!/usr/bin/env perl
-#=======================================================================
-#
-# Extract out regional datasets from the global datasets.
-#
-# Usage:
-#
-# getregional_datasets.pl
-#
-# Erik Kluzek
-# Aug/28/2009
-#
-#=======================================================================
-
-use Cwd;
-use strict;
-#use diagnostics;
-use English;
-use Getopt::Long;
-use IO::File;
-
-#-----------------------------------------------------------------------------------------------
-# Set the directory that contains this scripts. If the command was issued using a
-# relative or absolute path, that path is in $ProgDir. Otherwise assume the
-# command was issued from the current working directory.
-
-(my $ProgName = $0) =~ s!(.*)/!!; # name of this script
-my $ProgDir = $1; # name of directory containing this script -- may be a
- # relative or absolute path, or null if the script is in
- # the user's PATH
-my $cmdline = "@ARGV"; # Command line arguments to script
-my $cwd = getcwd(); # current working directory
-my $scrdir; # absolute pathname of directory that contains this script
-my $nm = "$ProgName::"; # name to use if script dies
-if ($ProgDir) {
- $scrdir = absolute_path($ProgDir);
-} else {
- $scrdir = $cwd;
-}
-
-my $gridfilename = "fatmlndfrc";
-
-#-----------------------------------------------------------------------------------------------
-
-sub usage {
- die <<EOF;
-** Usage: $ProgName -sw <lat,lon> -ne <lat,lon> -i <infilelist> -o <outfilelist> [-d] [-v] **
-EOF
-}
-
-#-----------------------------------------------------------------------------------------------
-
-sub get_latlon {
- my ($latlon, $desc) = @_;
-
- my ($lat, $lon) = split( /,/, $latlon );
- if ( ($lat < -90.) || ($lat > 90.0) ) {
- die <<"EOF";
-** $ProgName - Bad value for latitude (=$lat) for $desc **
-EOF
- }
- if ( ($lon < 0.) || ($lon > 360.0) ) {
- die <<"EOF";
-** $ProgName - Bad value for longitude (=$lon) for $desc **
-EOF
- }
- return( $lat, $lon );
-
-}
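The corner validation above rejects latitudes outside [-90, 90] and longitudes outside [0, 360]. An equivalent check in Python (a hypothetical `check_corner` helper mirroring the Perl bounds, not part of the original script):

```python
def check_corner(lat, lon, desc):
    """Validate one lat,lon corner the way getregional_datasets.pl does."""
    if not -90.0 <= lat <= 90.0:
        raise ValueError(f"Bad value for latitude (={lat}) for {desc}")
    if not 0.0 <= lon <= 360.0:
        raise ValueError(f"Bad value for longitude (={lon}) for {desc}")
    return lat, lon
```

Note the longitude convention is 0-360 (as in the README example `-ne 74,221 -sw 51,189`), not -180 to 180.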
-
-#-----------------------------------------------------------------------------------------------
-
-# Process command-line options.
-
-my %opts = (
- SW_corner => undef,
- NE_corner => undef,
- infilelist => undef,
- outfilelist => undef,
- help => 0,
- verbose => 0,
- debug => 0,
- );
-GetOptions(
- "sw|SW_corner=s" => \$opts{'SW_corner'},
- "ne|NE_corner=s" => \$opts{'NE_corner'},
- "i|infilelist=s" => \$opts{'infilelist'},
- "o|outfilelist=s" => \$opts{'outfilelist'},
- "h|help" => \$opts{'help'},
- "d|debug" => \$opts{'debug'},
- "v|verbose" => \$opts{'verbose'},
-) or usage();
-
-# Give usage message.
-usage() if $opts{'help'};
-
-# Check for unparsed arguments
-if (@ARGV) {
- print "ERROR: unrecognized arguments: @ARGV\n";
- usage();
-}
-
-if ( ! defined($opts{'infilelist'}) || ! defined($opts{'outfilelist'}) ) {
- print "ERROR: MUST set both infilelist and outfilelist\n";
- usage();
-}
-if ( ! defined($opts{'SW_corner'}) || ! defined($opts{'NE_corner'}) ) {
- print "ERROR: MUST set both SW_corner and NE_corner\n";
- usage();
-}
-
-my ($S_lat,$W_lon) = get_latlon( $opts{'SW_corner'}, "SW" );
-my ($N_lat,$E_lon) = get_latlon( $opts{'NE_corner'}, "NE" );
-
-if ( $N_lat <= $S_lat ) {
- print "ERROR: NE corner latitude less than or equal to SW corner latitude\n";
- usage();
-}
-if ( $E_lon <= $W_lon ) {
- print "ERROR: NE corner longitude less than or equal to SW corner longitude\n";
- usage();
-}
-
-#-----------------------------------------------------------------------------------------------
-my $debug;
-if ( $opts{'debug'} ) {
- $debug = "DEBUG=TRUE";
-}
-my $print;
-if ( $opts{'verbose'} ) {
- $print = "PRINT=TRUE";
-}
-
-my %infiles = parse_filelist( $opts{'infilelist'} );
-my %outfiles = parse_filelist( $opts{'outfilelist'} );
-
-(my $GRIDFILE, my $NFILES, my $INFILES, my $OUTFILES) = get_filelists( \%infiles, \%outfiles );
-
-write_usermods( \%outfiles );
-
-my $cmd = "env S_LAT=$S_lat W_LON=$W_lon N_LAT=$N_lat E_LON=$E_lon " .
- "GRIDFILE=$GRIDFILE NFILES=$NFILES OUTFILELIST=$OUTFILES INFILELIST=$INFILES " .
- "$debug $print ncl $scrdir/getregional_datasets.ncl";
-
-print "Execute: $cmd\n";
-system( $cmd );
-system( "/bin/rm $INFILES $OUTFILES" );
-
-#-------------------------------------------------------------------------------
-
-sub parse_filelist {
-#
-# Parse a list of files (in "filename = 'filepath'" format) into a hash
-#
- my $file = shift;
-
- # check that the file exists
- (-f $file) or die "$nm: failed to find filelist file $file";
- my $fh = IO::File->new($file, '<') or die "$nm: can't open file: $file\n";
-
- my %files = ( );
- my $valstring1 = '\'[^\']*\'';
- my $valstring2 = '"[^"]*"';
- while( my $line = <$fh> ) {
- if ( $line =~ m/^\s*(\S+)\s*=\s*($valstring1|$valstring2)$/ ) {
- my $var = $1;
- my $string = $2;
- $string =~ s/'|"//g;
- if ( exists($files{$var}) ) {
- die "$nm: variable listed twice in file ($file): $var\n";
- }
- $files{$var} = $string;
- # Ignore empty lines or comments
- } elsif ( ($line =~ m/^\s*$/) || ($line =~ m/^\s*!/) ) {
- # ignore empty lines or comments
- } else {
- die "$nm: unexpected line in $file: $line\n";
- }
- }
- $fh->close;
-
- return( %files );
-}
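`parse_filelist` above accepts `name = 'path'` lines (single- or double-quoted), skips blank lines and `!` comments, and rejects duplicates or anything else. The same parsing can be sketched in Python (illustrative only, mirroring the Perl sub):

```python
import re

# name = 'path' or name = "path", with optional surrounding whitespace
LINE_RE = re.compile(r"^\s*(\S+)\s*=\s*('[^']*'|\"[^\"]*\")\s*$")

def parse_filelist(lines):
    """Parse "name = 'path'" lines into a dict, mirroring the Perl sub."""
    files = {}
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            var, path = m.group(1), m.group(2).strip("'\"")
            if var in files:
                raise ValueError(f"variable listed twice: {var}")
            files[var] = path
        elif re.match(r"^\s*($|!)", line):
            continue  # ignore empty lines or comments
        else:
            raise ValueError(f"unexpected line: {line}")
    return files
```
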
-
-#-------------------------------------------------------------------------------
-
-sub get_filelists {
-#
-# Make sure file hashes compare correctly, and if so return in and out lists
-# on files
-#
- my $infiles_ref = shift;
- my $outfiles_ref = shift;
-
- my @infiles = sort( keys(%$infiles_ref ) );
- my @outfiles = sort( keys(%$outfiles_ref) );
-
- if ( $#infiles != $#outfiles ) {
- die "$nm: number of infiles is different from outfiles\n";
- }
- if ( "@infiles" ne "@outfiles" ) {
- die "$nm: list of infiles is different from outfiles list\n";
- }
- my $infilelist = "infilelist_getregional_datasets___tmp.lst";
- my $outfilelist = "outfilelist_getregional_datasets___tmp.lst";
- my $fhin = IO::File->new($infilelist, '>') or die "$nm: can't open file: $infilelist\n";
- my $fhout = IO::File->new($outfilelist, '>') or die "$nm: can't open file: $outfilelist\n";
-
- my $nfiles = 0;
- foreach my $file ( @infiles ) {
- my $infile = $$infiles_ref{$file};
- if ( ! -f "$infile" ) {
- die "$nm: infile ($file) $infile does NOT exist!\n";
- }
- print $fhin "$infile\n";
- my $outfile = $$outfiles_ref{$file};
- if ( -f "$outfile" ) {
- die "$nm: outfile ($file) $outfile already exists, delete it if you want to overwrite!\n";
- }
- print $fhout "$outfile\n";
- $nfiles++;
- }
- $fhin->close();
- $fhout->close();
- my $var = $gridfilename;
- my $gridfile = "";
- if ( exists($$infiles_ref{$var}) ) {
- $gridfile = $$infiles_ref{$var};
- } else {
- die "$nm: the grid file ($var) is required to be on the lists!\n";
- }
-
- return( $gridfile, $nfiles, $infilelist, $outfilelist );
-}
-
-#-------------------------------------------------------------------------------
-
-sub write_usermods {
-#
-# Write out the user_nl_clm and xmlchange_cmnds files.
-# These can be used to set up a case after getregional_datasets is run.
-#
- my $outfiles_ref = shift;
-
- my $cwd = getcwd(); # current working directory
-
- #
- # Write out the user_nl_clm file
- #
- my $usrnlfile = "user_nl_clm";
- my $fh = IO::File->new($usrnlfile, '>') or die "$nm: can't open file: $usrnlfile\n";
-
- my $outgridfile = undef;
- foreach my $file ( sort(keys(%$outfiles_ref)) ) {
- my $filepath = $$outfiles_ref{$file};
- # Add current directory on front of path if not an absolute path in filepath
- if ( $filepath !~ m/^\// ) {
- $filepath = "$cwd/$filepath";
- }
- # Write all filenames out besides the gridfilename
- if ( $file ne $gridfilename ) {
- print $fh "$file = '$filepath'\n";
- } else {
- $outgridfile = $filepath;
- }
- }
- $fh->close();
- #
-    # Write out the xmlchange_cmnds file
- #
- (my $filename = $outgridfile)=~ s!(.*)/!!;
- my $filedir = $1;
- my $cmndsfile = "xmlchange_cmnds";
- my $fh = IO::File->new($cmndsfile, '>') or die "$nm: can't open file: $cmndsfile\n";
- print $fh "./xmlchange ATM_DOMAIN_PATH=$filedir\n";
- print $fh "./xmlchange LND_DOMAIN_PATH=$filedir\n";
- print $fh "./xmlchange ATM_DOMAIN_FILE=$filename\n";
- print $fh "./xmlchange LND_DOMAIN_FILE=$filename\n";
- $fh->close();
-}
-
-#-------------------------------------------------------------------------------
-
-sub absolute_path {
-#
-# Convert a pathname into an absolute pathname, expanding any . or .. characters.
-# Assumes pathnames refer to a local filesystem.
-# Assumes the directory separator is "/".
-#
- my $path = shift;
- my $cwd = getcwd(); # current working directory
- my $abspath; # resulting absolute pathname
-
-# Strip off any leading or trailing whitespace. (This pattern won't match if
-# there's embedded whitespace.)
- $path =~ s!^\s*(\S*)\s*$!$1!;
-
-# Convert relative to absolute path.
-
- if ($path =~ m!^\.$!) { # path is "."
- return $cwd;
- } elsif ($path =~ m!^\./!) { # path starts with "./"
- $path =~ s!^\.!$cwd!;
- } elsif ($path =~ m!^\.\.$!) { # path is ".."
- $path = "$cwd/..";
- } elsif ($path =~ m!^\.\./!) { # path starts with "../"
- $path = "$cwd/$path";
- } elsif ($path =~ m!^[^/]!) { # path starts with non-slash character
- $path = "$cwd/$path";
- }
-
- my ($dir, @dirs2);
- my @dirs = split "/", $path, -1; # The -1 prevents split from stripping trailing nulls
- # This enables correct processing of the input "/".
-
- # Remove any "" that are not leading.
- for (my $i=0; $i<=$#dirs; ++$i) {
- if ($i == 0 or $dirs[$i] ne "") {
- push @dirs2, $dirs[$i];
- }
- }
- @dirs = ();
-
- # Remove any "."
- foreach $dir (@dirs2) {
- unless ($dir eq ".") {
- push @dirs, $dir;
- }
- }
- @dirs2 = ();
-
- # Remove the "subdir/.." parts.
- foreach $dir (@dirs) {
- if ( $dir !~ /^\.\.$/ ) {
- push @dirs2, $dir;
- } else {
- pop @dirs2; # remove previous dir when current dir is ..
- }
- }
- if ($#dirs2 == 0 and $dirs2[0] eq "") { return "/"; }
- $abspath = join '/', @dirs2;
- return( $abspath );
-}
-
-#-------------------------------------------------------------------------------
-
diff --git a/tools/ncl_scripts/sample_inlist b/tools/ncl_scripts/sample_inlist
deleted file mode 100644
index 0b13271540..0000000000
--- a/tools/ncl_scripts/sample_inlist
+++ /dev/null
@@ -1,18 +0,0 @@
-!
-! Sample input file of global datasets to extract a region from. Only works
-! on standard 2D CLM files such as: fatmlndfrc, fsurdat, flanduse_timeseries,
-! stream_fldfilename_ndep, stream_fldfilename_lightng, and/or
-! stream_fldfilename_popdens.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
-
-! fatmlndfrc is REQUIRED! MUST BE ON THE SAME GRID AS ALL OF THE DATAFILES BELOW!!!!
-
- fatmlndfrc = '/glade/p/cesmdata/cseg/inputdata/share/domains/domain.clm/domain.lnd.0.125x0.125_tx0.1v2.140704.nc'
- fsurdat = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/surfdata_map/surfdata_0.125x0.125_simyr2000_c150114.nc'
diff --git a/tools/ncl_scripts/sample_inlist_0.5popd b/tools/ncl_scripts/sample_inlist_0.5popd
deleted file mode 100644
index 8d4480fff4..0000000000
--- a/tools/ncl_scripts/sample_inlist_0.5popd
+++ /dev/null
@@ -1,22 +0,0 @@
-!
-! Sample input file of global datasets to extract a region from. Only works
-! on standard 2D CLM files such as: fatmlndfrc, fsurdat, flanduse_timeseries,
-! stream_fldfilename_ndep, stream_fldfilename_lightng, and/or
-! stream_fldfilename_popdens.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
-
-! fatmlndfrc is REQUIRED! MUST BE ON THE SAME GRID AS ALL OF THE DATAFILES BELOW!!!!
-
- fatmlndfrc = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/firedata/clmforc.Li_2012_hdm_0.5x0.5_AVHRR_simyr1850-2010_c130401.nc'
-!
-! The following files are interpolated by default so technically do NOT need
-! to be extracted, but it will help performance.
-!
- stream_fldfilename_popdens = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/firedata/clmforc.Li_2012_hdm_0.5x0.5_AVHRR_simyr1850-2010_c130401.nc'
diff --git a/tools/ncl_scripts/sample_inlist_T62 b/tools/ncl_scripts/sample_inlist_T62
deleted file mode 100644
index ac5ee2ed8d..0000000000
--- a/tools/ncl_scripts/sample_inlist_T62
+++ /dev/null
@@ -1,23 +0,0 @@
-!
-! Sample input file of global datasets to extract a region from. Only works
-! on standard 2D CLM files such as: fatmlndfrc, fsurdat, flanduse_timeseries,
-! stream_fldfilename_ndep, stream_fldfilename_lightng, and/or
-! stream_fldfilename_popdens.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
-
-! fatmlndfrc is REQUIRED! MUST BE ON THE SAME GRID AS ALL OF THE DATAFILES BELOW!!!!
-
- fatmlndfrc = '/glade/p/cesmdata/cseg/inputdata/atm/datm7/atm_forcing.datm7.Qian.T62.c080727/domain.T62.050609.nc'
-!
-! The following files are interpolated by default so technically do NOT need
-! to be extracted, but it will help performance.
-!
- stream_fldfilename_lightng = '/glade/p/cesmdata/cseg/inputdata/atm/datm7/NASA_LIS/clmforc.Li_2012_climo1995-2011.T62.lnfm_c130327.nc'
- strm_datfil = '/glade/p/cesmdata/cseg/inputdata/atm/datm7/atm_forcing.datm7.Qian.T62.c080727/Solar6Hrly/clmforc.Qian.c2006.T62.Solr.1948-01.nc'
diff --git a/tools/ncl_scripts/sample_inlist_ndep b/tools/ncl_scripts/sample_inlist_ndep
deleted file mode 100644
index 726c7ffbaf..0000000000
--- a/tools/ncl_scripts/sample_inlist_ndep
+++ /dev/null
@@ -1,22 +0,0 @@
-!
-! Sample input file of global datasets to extract a region from. Only works
-! on standard 2D CLM files such as: fatmlndfrc, fsurdat, flanduse_timeseries,
-! stream_fldfilename_ndep, stream_fldfilename_lightng, and/or
-! stream_fldfilename_popdens.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
-
-! fatmlndfrc is REQUIRED! MUST BE ON THE SAME GRID AS ALL OF THE DATAFILES BELOW!!!!
-
- fatmlndfrc = '/glade/p/cesmdata/cseg/inputdata/share/domains/domain.lnd.fv1.9x2.5_gx1v6.090206.nc'
-!
-! The following files are interpolated by default so technically do NOT need
-! to be extracted, but it will help performance.
-!
- stream_fldfilename_ndep = '/glade/p/cesmdata/cseg/inputdata/lnd/clm2/ndepdata/fndep_clm_hist_simyr1849-2006_1.9x2.5_c100428.nc'
diff --git a/tools/ncl_scripts/sample_outlist b/tools/ncl_scripts/sample_outlist
deleted file mode 100644
index fd153e7206..0000000000
--- a/tools/ncl_scripts/sample_outlist
+++ /dev/null
@@ -1,14 +0,0 @@
-!
-! Sample input file of regional datasets that will be created. You need to have
-! the same list of files as in the input filelist as well. See the sample_inlist
-! for the list of files that can be operated on.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
- fatmlndfrc = 'domain.lnd.184x256pt_0.125x0.125_alaskaUSA_tx0.1v2_c150114.nc'
- fsurdat = 'surfdata_184x256pt_0.125x0.125_alaskaUSA_simyr2000_c150114.nc'
diff --git a/tools/ncl_scripts/sample_outlist_0.5popd b/tools/ncl_scripts/sample_outlist_0.5popd
deleted file mode 100644
index 671a55037d..0000000000
--- a/tools/ncl_scripts/sample_outlist_0.5popd
+++ /dev/null
@@ -1,14 +0,0 @@
-!
-! Sample input file of regional datasets that will be created. You need to have
-! the same list of files as in the input filelist as well. See the sample_inlist
-! for the list of files that can be operated on.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
- fatmlndfrc = 'domain.lnd.0.5x0.5_alaskaUSA_gx1v6_c141117.nc'
-stream_fldfilename_popdens = 'clmforc.Li_2012_hdm_0.5x0.5_AVHRR_simyr1850-2010_c141117.nc'
diff --git a/tools/ncl_scripts/sample_outlist_T62 b/tools/ncl_scripts/sample_outlist_T62
deleted file mode 100644
index 3dfe69148a..0000000000
--- a/tools/ncl_scripts/sample_outlist_T62
+++ /dev/null
@@ -1,16 +0,0 @@
-!
-! Sample input file of regional datasets that will be created. You need to have
-! the same list of files as in the input filelist as well. See the sample_inlist
-! for the list of files that can be operated on.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
- fatmlndfrc = 'domain.lnd.T62_alaskaUSA_c141117.nc'
-
- stream_fldfilename_lightng = 'clmforc.Li_2012_climo1995-2011.T62_alaskaUSA.lnfm_c141117.nc'
- strm_datfil = 'clmforc.Qian.c2006.T62_alaskaUSA.Solr.1948-01.nc'
diff --git a/tools/ncl_scripts/sample_outlist_ndep b/tools/ncl_scripts/sample_outlist_ndep
deleted file mode 100644
index d3dad24ae7..0000000000
--- a/tools/ncl_scripts/sample_outlist_ndep
+++ /dev/null
@@ -1,15 +0,0 @@
-!
-! Sample input file of regional datasets that will be created. You need to have
-! the same list of files as in the input filelist as well. See the sample_inlist
-! for the list of files that can be operated on.
-!
-! format: list of filenames similar to namelist format with a single line
-! per file with the format of:
-!
-! variable = "filepath"
-!
-! Lines (such as these) with "!" are ignored.
-!
- fatmlndfrc = 'domain.lnd.13x12pt_f19_alaskaUSA_gx1v6_c141117.nc'
-
- stream_fldfilename_ndep = 'fndep_clm_hist_simyr1849-2006_13x12pt_f19_alaskaUSA_c141117.nc'