diff --git a/docs/Users_Guide/reformat_point.rst b/docs/Users_Guide/reformat_point.rst
index 32695f36b0..ec09fa3f83 100644
--- a/docs/Users_Guide/reformat_point.rst
+++ b/docs/Users_Guide/reformat_point.rst
@@ -458,6 +458,8 @@ While initial versions of the ASCII2NC tool only supported a simple 11 column AS
• `International Soil Moisture Network (ISMN) Data format `_.
+• `International Arctic Buoy Programme (IABP) Data format <https://iabp.apl.uw.edu/data.html>`_.
+
• `AErosol RObotic NEtwork (AERONET) versions 2 and 3 format `_
• Python embedding of point observations, as described in :numref:`pyembed-point-obs-data`. See example below in :numref:`ascii2nc-pyembed`.
@@ -522,6 +524,8 @@ Once the ASCII point observations have been formatted as expected, the ASCII fil
netcdf_file
[-format ASCII_format]
[-config file]
+ [-valid_beg time]
+ [-valid_end time]
[-mask_grid string]
[-mask_poly file]
[-mask_sid file|list]
@@ -541,21 +545,25 @@ Required Arguments for ascii2nc
Optional Arguments for ascii2nc
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-3. The **-format ASCII_format** option may be set to "met_point", "little_r", "surfrad", "wwsis", "airnowhourlyaqobs", "airnowhourly", "airnowdaily_v2", "ndbc_standard", "ismn", "aeronet", "aeronetv2", "aeronetv3", or "python". If passing in ISIS data, use the "surfrad" format flag.
+3. The **-format ASCII_format** option may be set to "met_point", "little_r", "surfrad", "wwsis", "airnowhourlyaqobs", "airnowhourly", "airnowdaily_v2", "ndbc_standard", "ismn", "iabp", "aeronet", "aeronetv2", "aeronetv3", or "python". If passing in ISIS data, use the "surfrad" format flag.
4. The **-config file** option is the configuration file for generating time summaries.
-5. The **-mask_grid** string option is a named grid or a gridded data file to filter the point observations spatially.
+5. The **-valid_beg** time option in YYYYMMDD[_HH[MMSS]] format sets the beginning of the retention time window.
-6. The **-mask_poly** file option is a polyline masking file to filter the point observations spatially.
+6. The **-valid_end** time option in YYYYMMDD[_HH[MMSS]] format sets the end of the retention time window, as illustrated in the example below.
-7. The **-mask_sid** file|list option is a station ID masking file or a comma-separated list of station ID's to filter the point observations spatially. See the description of the "sid" entry in :numref:`config_options`.
+7. The **-mask_grid** string option is a named grid or a gridded data file to filter the point observations spatially.
-8. The **-log file** option directs output and errors to the specified log file. All messages will be written to that file as well as standard out and error. Thus, users can save the messages without having to redirect the output on the command line. The default behavior is no log file.
+8. The **-mask_poly** file option is a polyline masking file to filter the point observations spatially.
-9. The **-v level** option indicates the desired level of verbosity. The value of "level" will override the default setting of 2. Setting the verbosity to 0 will make the tool run with no log messages, while increasing the verbosity above 1 will increase the amount of logging.
+9. The **-mask_sid** file|list option is a station ID masking file or a comma-separated list of station ID's to filter the point observations spatially. See the description of the "sid" entry in :numref:`config_options`.
-10. The **-compress level** option indicates the desired level of compression (deflate level) for NetCDF variables. The valid level is between 0 and 9. The value of "level" will override the default setting of 0 from the configuration file or the environment variable MET_NC_COMPRESS. Setting the compression level to 0 will make no compression for the NetCDF output. Lower number is for fast compression and higher number is for better compression.
+10. The **-log file** option directs output and errors to the specified log file. All messages will be written to that file as well as standard out and error. Thus, users can save the messages without having to redirect the output on the command line. The default behavior is no log file.
+
+11. The **-v level** option indicates the desired level of verbosity. The value of "level" will override the default setting of 2. Setting the verbosity to 0 will make the tool run with no log messages, while increasing the verbosity above 1 will increase the amount of logging.
+
+12. The **-compress level** option indicates the desired level of compression (deflate level) for NetCDF variables. The valid level is between 0 and 9. The value of "level" will override the default setting of 0 from the configuration file or the environment variable MET_NC_COMPRESS. Setting the compression level to 0 will make no compression for the NetCDF output. Lower number is for fast compression and higher number is for better compression.
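+
+For example, a command of the following form (shown here with sample input and output file names) converts IABP buoy observations while retaining only those valid during January of 2014:
+
+.. code-block:: none
+
+     ascii2nc -format iabp \
+     -valid_beg 20140101 -valid_end 20140201 \
+     090629.dat iabp_201401.nc
+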
An example of the ascii2nc calling sequence is shown below:
@@ -1203,3 +1211,34 @@ For how to use the script, issue the command:
.. code-block:: none
python3 MET_BASE/python/utility/print_pointnc2ascii.py -h
+
+IABP Retrieval Python Utilities
+====================================
+
+`International Arctic Buoy Programme (IABP) Data <https://iabp.apl.uw.edu/data.html>`_ is one of the data types supported by ascii2nc. A utility script, get_iabp_from_web.py, is included that pulls all of this data from the web and stores it locally. The script accesses the appropriate web page and downloads the ASCII files for all buoys. It is straightforward to use, but can be time-intensive, as the archive of this data is extensive and files are downloaded one at a time.
+
+The script can be found at:
+
+.. code-block:: none
+
+ MET_BASE/python/utility/get_iabp_from_web.py
+
+For how to use the script, issue the command:
+
+.. code-block:: none
+
+ python3 MET_BASE/python/utility/get_iabp_from_web.py -h
+
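+For example, to download all of the buoy files into a local directory named iabp_files (the -o option sets the output directory; the default is ./iabp_files):
+
+.. code-block:: none
+
+    python3 MET_BASE/python/utility/get_iabp_from_web.py -o iabp_files
+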
+Another IABP utility script, find_iabp_in_timerange.py, is included for users, to be run after all files have been downloaded with get_iabp_from_web.py. It examines all of the downloaded files and lists those containing entries that fall within a user-specified range of days.
+
+The script can be found at:
+
+.. code-block:: none
+
+ MET_BASE/python/utility/find_iabp_in_timerange.py
+
+For how to use the script, issue the command:
+
+.. code-block:: none
+
+ python3 MET_BASE/python/utility/find_iabp_in_timerange.py -h
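+
+For example, to list the downloaded files that contain entries between January 1 and February 1 of 2014 (the -d option points to the directory created by get_iabp_from_web.py, and -s and -e give the starting and ending dates):
+
+.. code-block:: none
+
+    python3 MET_BASE/python/utility/find_iabp_in_timerange.py -d iabp_files -s 20140101 -e 20140201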
diff --git a/internal/test_unit/xml/unit_ascii2nc.xml b/internal/test_unit/xml/unit_ascii2nc.xml
index 2dd9df07e7..4424fd0e33 100644
--- a/internal/test_unit/xml/unit_ascii2nc.xml
+++ b/internal/test_unit/xml/unit_ascii2nc.xml
@@ -211,4 +211,19 @@
+  <test name="ascii2nc_iabp">
+    <exec>&MET_BIN;/ascii2nc</exec>
+    <param> \
+      -format iabp \
+      -valid_beg 20140101 -valid_end 20140201 \
+      &DATA_DIR_OBS;/iabp/090629.dat \
+      &DATA_DIR_OBS;/iabp/109320.dat \
+      &DATA_DIR_OBS;/iabp/109499.dat \
+      &OUTPUT_DIR;/ascii2nc/iabp_20140101_20140201.nc
+    </param>
+    <output>
+      <point_nc>&OUTPUT_DIR;/ascii2nc/iabp_20140101_20140201.nc</point_nc>
+    </output>
+  </test>
diff --git a/scripts/python/utility/Makefile.am b/scripts/python/utility/Makefile.am
index 5efd02b01e..2509cff62b 100644
--- a/scripts/python/utility/Makefile.am
+++ b/scripts/python/utility/Makefile.am
@@ -26,8 +26,11 @@
pythonutilitydir = $(pkgdatadir)/python/utility
pythonutility_DATA = \
+ build_ndbc_stations_from_web.py \
+ find_iabp_in_timerange.py \
+ get_iabp_from_web.py \
print_pointnc2ascii.py \
- build_ndbc_stations_from_web.py
+ rgb2ctable.py
EXTRA_DIST = ${pythonutility_DATA}
diff --git a/scripts/python/utility/Makefile.in b/scripts/python/utility/Makefile.in
index 4c379b52a6..0b977854db 100644
--- a/scripts/python/utility/Makefile.in
+++ b/scripts/python/utility/Makefile.in
@@ -311,8 +311,11 @@ top_builddir = @top_builddir@
top_srcdir = @top_srcdir@
pythonutilitydir = $(pkgdatadir)/python/utility
pythonutility_DATA = \
+ build_ndbc_stations_from_web.py \
+ find_iabp_in_timerange.py \
+ get_iabp_from_web.py \
print_pointnc2ascii.py \
- build_ndbc_stations_from_web.py
+ rgb2ctable.py
EXTRA_DIST = ${pythonutility_DATA}
MAINTAINERCLEANFILES = Makefile.in
diff --git a/scripts/python/utility/find_iabp_in_timerange.py b/scripts/python/utility/find_iabp_in_timerange.py
new file mode 100755
index 0000000000..3aa76d1eff
--- /dev/null
+++ b/scripts/python/utility/find_iabp_in_timerange.py
@@ -0,0 +1,241 @@
+#!/usr/bin/env python3
+
+from optparse import OptionParser
+import urllib.request
+import datetime
+from datetime import date
+import os
+import shutil
+import shlex
+import errno
+from subprocess import Popen, PIPE
+
+
+
+#----------------------------------------------
+def usage():
+ print("Usage: find_iabp_in_timerange.py -s yyyymmdd -e yyyymmdd [-d PATH]")
+
+#----------------------------------------------
+def is_date_in_range(input_date, start_date, end_date):
+ return start_date <= input_date <= end_date
+
+#----------------------------------------------
+def lookFor(name, inlist, filename, printWarning=False):
+ rval = -1
+ try:
+ rval = inlist.index(name)
+ except:
+ if printWarning:
+ print(name, " not in header line, file=", filename)
+
+ return rval
+
+#----------------------------------------------
+def pointToInt(index, tokens, filename):
+ if index < 0 or index >= len(tokens):
+ print("ERROR index out of range ", index)
+ return -1
+ return int(tokens[index])
+
+#----------------------------------------------
+def pointToFloat(index, tokens, filename):
+ if index < 0 or index >= len(tokens):
+ print("ERROR index out of range ", index)
+ return -99.99
+ return float(tokens[index])
+
+#----------------------------------------------
+class StationHeader:
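+ # Parses the header line of an IABP buoy file and records the column
+ # index of each field; _ok is set to False if any required column is missing.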
+ def __init__(self, headerLine, filename):
+ tokens = headerLine.split()
+ self._ok = True
+ self._idIndex = lookFor('BuoyID', tokens, filename, True)
+ self._yearIndex = lookFor('Year', tokens, filename, True)
+ self._hourIndex = lookFor('Hour', tokens, filename, True)
+ self._minuteIndex = lookFor('Min', tokens, filename, True)
+ self._doyIndex = lookFor('DOY', tokens, filename, True)
+ self._posdoyIndex = lookFor('POS_DOY', tokens, filename, True)
+ self._latIndex = lookFor('Lat', tokens, filename, True)
+ self._lonIndex = lookFor('Lon', tokens, filename, True)
+ self._bpIndex = lookFor('BP', tokens, filename, False)
+ self._tsIndex = lookFor('Ts', tokens, filename, False)
+ self._taIndex = lookFor('Ta', tokens, filename, False)
+ self._ok = self._idIndex != -1 and self._yearIndex != -1 and self._hourIndex != -1 \
+ and self._minuteIndex != -1 and self._doyIndex != -1 and self._posdoyIndex != -1 \
+ and self._latIndex != -1 and self._lonIndex != -1
+ if not self._ok:
+ print("ERROR badly formed header line")
+
+#----------------------------------------------
+class Station:
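+ # Holds one observation record parsed from a data line, using the column
+ # indices from the StationHeader; _ok is set to False when a required value
+ # is missing or out of range.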
+ def __init__(self, line, filename, stationHeader):
+ self._ok = True
+ tokens = line.split()
+ self._id = pointToInt(stationHeader._idIndex, tokens, filename)
+ if self._id < 0:
+ self._ok = False
+ self._year = pointToInt(stationHeader._yearIndex, tokens, filename)
+ if self._year < 0:
+ self._ok = False
+ self._hour = pointToInt(stationHeader._hourIndex, tokens, filename)
+ if self._hour < 0:
+ self._ok = False
+ self._minute = pointToInt(stationHeader._minuteIndex, tokens, filename)
+ if self._minute < 0:
+ self._ok = False
+ self._doy = pointToFloat(stationHeader._doyIndex, tokens, filename)
+ if self._doy < 0:
+ self._ok = False
+ if self._doy > 365:
+ self._ok = False
+ self._posdoy = pointToFloat(stationHeader._posdoyIndex, tokens, filename)
+ if self._posdoy < 0:
+ self._ok = False
+ if self._posdoy > 365:
+ self._ok = False
+ self._lat = pointToFloat(stationHeader._latIndex, tokens, filename)
+ if self._lat == -99.99:
+ self._ok = False
+ self._lon = pointToFloat(stationHeader._lonIndex, tokens, filename)
+ if self._lon == -99.99:
+ self._ok = False
+ if stationHeader._bpIndex >= 0:
+ self._pressure = pointToFloat(stationHeader._bpIndex, tokens, filename)
+ else:
+ self._pressure = -99.99
+ if stationHeader._tsIndex >= 0:
+ self._tempsurface = pointToFloat(stationHeader._tsIndex, tokens, filename)
+ else:
+ self._tempsurface = -99.99
+ if stationHeader._taIndex >= 0:
+ self._tempair = pointToFloat(stationHeader._taIndex, tokens, filename)
+ else:
+ self._tempair = -99.99
+
+ if self._ok:
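+ # Convert the (possibly fractional) day-of-year to a calendar month
+ # and day; DOY 1 corresponds to January 1 of the given year.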
+ d = datetime.datetime(self._year, 1, 1) + datetime.timedelta(self._doy - 1)
+ self._month = d.month
+ self._day = d.day
+ else:
+ self._month = -1
+ self._day = -1
+ def timeInRange(self, start_date, end_date):
+ if self._ok:
+ input_date = date(self._year, self._month, self._day)
+ return is_date_in_range(input_date, start_date, end_date)
+ else:
+ return False
+
+#----------------------------------------------
+class StationTimeSeries:
+ def __init__(self, stationHeader):
+ self._stationHeader = stationHeader
+ self._data = []
+ def add(self, line, filename):
+ s = Station(line, filename, self._stationHeader)
+ if s._ok:
+ self._data.append(s)
+ def print(self):
+ print("Nothing")
+ def hasTimesInRange(self, start_date, end_date):
+ for s in self._data:
+ if (s.timeInRange(start_date, end_date)):
+ return True
+ return False
+
+#----------------------------------------------
+def doCmd(cmd, debug=False):
+ #print(cmd)
+ my_env = os.environ.copy()
+ args = shlex.split(cmd)
+ proc = Popen(args, stdout=PIPE, stderr=PIPE, env=my_env)
+ out, err = proc.communicate()
+ exitcode = proc.returncode
+ if exitcode == 0:
+ return str(out)
+ else:
+ if debug:
+ print("Command failed ", cmd)
+ return ""
+
+#----------------------------------------------
+def getdatafilenames(aDir):
+ if (os.path.exists(aDir)):
+ allFiles = [name for name in os.listdir(aDir) \
+ if not os.path.isdir(os.path.join(aDir, name))]
+ return [s for s in allFiles if '.dat' in s]
+ else:
+ return []
+
+#----------------------------------------------
+def run2(data_path, start, end):
+
+ if (data_path[0:2] != "./" and data_path[0] != '/'):
+ inpath = "./" + data_path
+ else:
+ inpath = data_path
+
+ print("data_path = ", inpath)
+
+ # could put testing here to make sure strings will convert
+ print("start = ", start)
+ print("end = ", end)
+
+ y0 = int(start[0:4])
+ m0 = int(start[4:6])
+ d0 = int(start[6:8])
+
+ y1 = int(end[0:4])
+ m1 = int(end[4:6])
+ d1 = int(end[6:8])
+
+ print("Looking for file with data in range ", y0, m0, d0, " to ", y1, m1, d1)
+
+ # read each file that ends in .dat
+ stationfiles = getdatafilenames(inpath)
+ stationfiles.sort()
+
+ print("We have ", len(stationfiles), " data files to look at")
+ start_date = date(y0, m0, d0)
+ end_date = date(y1, m1, d1)
+
+ for i in range(len(stationfiles)):
+
+ #print("Looking at ", stationfiles[i])
+ with open(inpath + "/" + stationfiles[i], 'r') as file:
+ data_all = file.read()
+ file.close()
+ lines = data_all.splitlines()
+
+ # first line is a header, remaining lines are a time series
+ sh = StationHeader(lines[0], stationfiles[i])
+ if sh._ok:
+ lines = lines[1:]
+ st = StationTimeSeries(sh)
+ for l in lines:
+ st.add(l, stationfiles[i])
+
+ if (st.hasTimesInRange(start_date, end_date)):
+ print(stationfiles[i])
+
+#----------------------------------------------
+def create_parser_options(parser):
+ parser.add_option("-d", "--data_path", dest="data_path",
+ default="./iabp_files", help=" path to the station files (.dat) (default: ./iabp_files)")
+ parser.add_option("-s", "--start", dest="start",
+ default="notset", help=" starting yyyymmdd. Must be set")
+ parser.add_option("-e", "--end", dest="end",
+ default="notset", help=" ending yyyymmdd. Must be set")
+ return parser.parse_args()
+
+#----------------------------------------------
+if __name__ == "__main__":
+ usage_str = "%prog [options]"
+ parser = OptionParser(usage = usage_str)
+ options, args = create_parser_options(parser)
+ if (options.start == "notset" or options.end == "notset"):
+ usage()
+ exit(0)
+ run2(options.data_path, options.start, options.end)
+ exit(0)
diff --git a/scripts/python/utility/get_iabp_from_web.py b/scripts/python/utility/get_iabp_from_web.py
new file mode 100755
index 0000000000..b2a931c8fe
--- /dev/null
+++ b/scripts/python/utility/get_iabp_from_web.py
@@ -0,0 +1,235 @@
+#!/usr/bin/env python3
+
+from optparse import OptionParser
+import urllib.request
+import os
+import shutil
+import shlex
+import errno
+from subprocess import Popen, PIPE
+
+#----------------------------------------------------------------------------
+def makeOrScrub(path, debug=False):
+ if (debug):
+ print("Recreating path " + path)
+ if (os.path.exists(path)):
+ try:
+ shutil.rmtree(path)
+ os.makedirs(path)
+ except:
+ print('WARNING: ' + path + ' not completely cleaned out.')
+ else:
+ os.makedirs(path)
+
+
+#----------------------------------------------
+def lookFor(name, inlist, filename, printWarning=False):
+ rval = -1
+ try:
+ rval = inlist.index(name)
+ except:
+ if printWarning:
+ print(name, " not in header line, file=", filename)
+
+ return rval
+
+#----------------------------------------------
+def pointToInt(index, tokens, filename):
+ if index < 0 or index >= len(tokens):
+ print("ERROR index out of range ", index)
+ return -1
+ return int(tokens[index])
+
+#----------------------------------------------
+def pointToFloat(index, tokens, filename):
+ if index < 0 or index >= len(tokens):
+ print("ERROR index out of range ", index)
+ return -99.99
+ return float(tokens[index])
+
+#----------------------------------------------
+class StationHeader:
+ def __init__(self, headerLine, filename):
+ tokens = headerLine.split()
+ self._ok = True
+ self._idIndex = lookFor('BuoyID', tokens, filename, True)
+ self._yearIndex = lookFor('Year', tokens, filename, True)
+ self._hourIndex = lookFor('Hour', tokens, filename, True)
+ self._minuteIndex = lookFor('Min', tokens, filename, True)
+ self._doyIndex = lookFor('DOY', tokens, filename, True)
+ self._posdoyIndex = lookFor('POS_DOY', tokens, filename, True)
+ self._latIndex = lookFor('Lat', tokens, filename, True)
+ self._lonIndex = lookFor('Lon', tokens, filename, True)
+ self._bpIndex = lookFor('BP', tokens, filename, False)
+ self._tsIndex = lookFor('Ts', tokens, filename, False)
+ self._taIndex = lookFor('Ta', tokens, filename, False)
+ self._ok = self._idIndex != -1 and self._yearIndex != -1 and self._hourIndex != -1 \
+ and self._minuteIndex != -1 and self._doyIndex != -1 and self._posdoyIndex != -1 \
+ and self._latIndex != -1 and self._lonIndex != -1
+ if not self._ok:
+ print("ERROR badly formed header line")
+
+#----------------------------------------------
+class Station:
+ def __init__(self, line, filename, stationHeader):
+ self._ok = True
+ tokens = line.split()
+ self._id = pointToInt(stationHeader._idIndex, tokens, filename)
+ if self._id < 0:
+ self._ok = False
+ self._hour = pointToInt(stationHeader._hourIndex, tokens, filename)
+ if self._hour < 0:
+ self._ok = False
+ self._minute = pointToInt(stationHeader._minuteIndex, tokens, filename)
+ if self._minute < 0:
+ self._ok = False
+ self._doy = pointToFloat(stationHeader._doyIndex, tokens, filename)
+ if self._doy < 0:
+ self._ok = False
+ self._posdoy = pointToFloat(stationHeader._posdoyIndex, tokens, filename)
+ if self._posdoy < 0:
+ self._ok = False
+ self._lat = pointToFloat(stationHeader._latIndex, tokens, filename)
+ if self._lat == -99.99:
+ self._ok = False
+ self._lon = pointToFloat(stationHeader._lonIndex, tokens, filename)
+ if self._lon == -99.99:
+ self._ok = False
+ if stationHeader._bpIndex >= 0:
+ self._pressure = pointToFloat(stationHeader._bpIndex, tokens, filename)
+ else:
+ self._pressure = -99.99
+ if stationHeader._tsIndex >= 0:
+ self._tempsurface = pointToFloat(stationHeader._tsIndex, tokens, filename)
+ else:
+ self._tempsurface = -99.99
+ if stationHeader._taIndex >= 0:
+ self._tempair = pointToFloat(stationHeader._taIndex, tokens, filename)
+ else:
+ self._tempair = -99.99
+
+#----------------------------------------------
+class StationTimeSeries:
+ def __init__(self, stationHeader):
+ self._stationHeader = stationHeader
+ self._data = []
+ def add(self, line, filename):
+ s = Station(line, filename, self._stationHeader)
+ if s._ok:
+ self._data.append(s)
+ def print(self):
+ print("Nothing")
+
+#----------------------------------------------
+def doCmd(cmd, debug=False):
+ #print(cmd)
+ my_env = os.environ.copy()
+ args = shlex.split(cmd)
+ proc = Popen(args, stdout=PIPE, stderr=PIPE, env=my_env)
+ out, err = proc.communicate()
+ exitcode = proc.returncode
+ if exitcode == 0:
+ return str(out)
+ else:
+ if debug:
+ print("Command failed ", cmd)
+ return ""
+
+#----------------------------------------------
+def nextStation(data_all, index_all):
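+ # Scan the WebData directory listing text for the next '.dat' file name,
+ # starting at index_all. Returns the position just past the match and the
+ # file name, or (-1, "") when no more names are found. The occasional
+ # doubled '.dat.dat' names in the listing are handled as a special case.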
+
+ index_all = data_all.find('.dat', index_all)
+ data = ""
+ if index_all == -1:
+ return -1, data
+
+ #is this the weird (bad) .dat.dat situation?
+ teststr = data_all[index_all:index_all+8]
+ if teststr == ".dat.dat":
+ indexend = data_all.find('.dat.dat<', index_all+1)
+ if indexend == -1:
+ print("Unexpected lack of .dat.dat<")
+ return -1, data
+ data = data_all[index_all+10:indexend+8]
+ else:
+ indexend = data_all.find('.dat<', index_all+1)
+ if indexend == -1:
+ print("UNexpected lack of .dat<")
+ return -1, data
+ data = data_all[index_all+6:indexend+4]
+ return indexend+10, data
+
+#----------------------------------------------
+def getStation(sfile):
+ cmd = "wget https://iabp.apl.uw.edu/WebData/" + sfile
+ print(cmd)
+ doCmd(cmd, True)
+
+ # parse contents (not used for anything just yet)
+ with open(sfile, 'r') as file:
+ data_all = file.read()
+ file.close()
+ lines = data_all.splitlines()
+
+ # first line is a header, remaining lines are a time series
+ sh = StationHeader(lines[0], sfile)
+ if sh._ok:
+ lines = lines[1:]
+ st = StationTimeSeries(sh)
+ for l in lines:
+ st.add(l, sfile)
+
+
+#----------------------------------------------
+def run(output_path):
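+ # Download the IABP WebData directory listing, parse the individual buoy
+ # file names out of it, and then download each buoy file into the requested
+ # output directory (which is created, or scrubbed if it already exists).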
+
+ cwd = os.getcwd()
+
+ if (output_path[0:2] != "./" and output_path[0] != '/'):
+ outpath = "./" + output_path
+ else:
+ outpath = output_path
+ makeOrScrub(outpath, True)
+ os.chdir(outpath)
+
+ cmd = "wget https://iabp.apl.uw.edu/WebData"
+ print(cmd)
+ s = doCmd(cmd, True)
+ if not s:
+     print("ERROR: failed to download the WebData listing")
+     os.chdir(cwd)
+     return
+
+ stationfiles = []
+ with open("WebData", 'r') as file:
+ data_all = file.read().replace('\n', '')
+ file.close()
+ index_all = 0
+ while index_all < len(data_all):
+ index_all, data = nextStation(data_all, index_all)
+ if (index_all == -1):
+ break
+ stationfiles.append(data)
+
+ print("Parsed out ", len(stationfiles), " individual station files")
+
+ # pull down all the station files
+ for i in range(len(stationfiles)):
+ getStation(stationfiles[i])
+
+ print("created ", len(stationfiles), " station files in ", outpath)
+ os.chdir(cwd)
+
+#----------------------------------------------
+def create_parser_options(parser):
+ parser.add_option("-o", "--output_path", dest="output_path",
+ default="./iabp_files", help=" create an output path or clear out what is there and put output files to that path (default: ./iabp_files)")
+ #parser.add_option("-H", "--Help", dest="options", action="store_true", default=False, help = " show usage information (optional, default = False)")
+ return parser.parse_args()
+
+#----------------------------------------------
+if __name__ == "__main__":
+
+ usage_str = "%prog [options]"
+ parser = OptionParser(usage = usage_str)
+ options, args = create_parser_options(parser)
+ run(options.output_path)
+ exit(0)
diff --git a/scripts/python/utility/rgb2ctable.py b/scripts/python/utility/rgb2ctable.py
old mode 100644
new mode 100755
diff --git a/src/tools/other/ascii2nc/Makefile.am b/src/tools/other/ascii2nc/Makefile.am
index 0647687561..ab69ead9f1 100644
--- a/src/tools/other/ascii2nc/Makefile.am
+++ b/src/tools/other/ascii2nc/Makefile.am
@@ -30,6 +30,7 @@ ascii2nc_SOURCES = ascii2nc.cc \
airnow_locations.cc airnow_locations.h \
aeronet_handler.cc aeronet_handler.h \
ismn_handler.cc ismn_handler.h \
+ iabp_handler.cc iabp_handler.h \
$(OPT_PYTHON_SOURCES)
ascii2nc_CPPFLAGS = ${MET_CPPFLAGS} -I../../../basic/vx_log
diff --git a/src/tools/other/ascii2nc/Makefile.in b/src/tools/other/ascii2nc/Makefile.in
index 4c364feb94..ccc217163a 100644
--- a/src/tools/other/ascii2nc/Makefile.in
+++ b/src/tools/other/ascii2nc/Makefile.in
@@ -110,8 +110,8 @@ am__ascii2nc_SOURCES_DIST = ascii2nc.cc ascii2nc_conf_info.cc \
airnow_handler.h ndbc_handler.cc ndbc_handler.h \
ndbc_locations.cc ndbc_locations.h airnow_locations.cc \
airnow_locations.h aeronet_handler.cc aeronet_handler.h \
- ismn_handler.cc ismn_handler.h python_handler.h \
- python_handler.cc
+ ismn_handler.cc ismn_handler.h iabp_handler.cc iabp_handler.h \
+ python_handler.h python_handler.cc
@ENABLE_PYTHON_TRUE@am__objects_1 = ascii2nc-python_handler.$(OBJEXT)
am__objects_2 = $(am__objects_1)
am_ascii2nc_OBJECTS = ascii2nc-ascii2nc.$(OBJEXT) \
@@ -126,7 +126,8 @@ am_ascii2nc_OBJECTS = ascii2nc-ascii2nc.$(OBJEXT) \
ascii2nc-ndbc_locations.$(OBJEXT) \
ascii2nc-airnow_locations.$(OBJEXT) \
ascii2nc-aeronet_handler.$(OBJEXT) \
- ascii2nc-ismn_handler.$(OBJEXT) $(am__objects_2)
+ ascii2nc-ismn_handler.$(OBJEXT) \
+ ascii2nc-iabp_handler.$(OBJEXT) $(am__objects_2)
ascii2nc_OBJECTS = $(am_ascii2nc_OBJECTS)
am__DEPENDENCIES_1 =
ascii2nc_DEPENDENCIES = $(am__DEPENDENCIES_1) $(am__DEPENDENCIES_1) \
@@ -155,6 +156,7 @@ am__depfiles_remade = ./$(DEPDIR)/ascii2nc-aeronet_handler.Po \
./$(DEPDIR)/ascii2nc-ascii2nc.Po \
./$(DEPDIR)/ascii2nc-ascii2nc_conf_info.Po \
./$(DEPDIR)/ascii2nc-file_handler.Po \
+ ./$(DEPDIR)/ascii2nc-iabp_handler.Po \
./$(DEPDIR)/ascii2nc-ismn_handler.Po \
./$(DEPDIR)/ascii2nc-little_r_handler.Po \
./$(DEPDIR)/ascii2nc-met_handler.Po \
@@ -392,6 +394,7 @@ ascii2nc_SOURCES = ascii2nc.cc \
airnow_locations.cc airnow_locations.h \
aeronet_handler.cc aeronet_handler.h \
ismn_handler.cc ismn_handler.h \
+ iabp_handler.cc iabp_handler.h \
$(OPT_PYTHON_SOURCES)
ascii2nc_CPPFLAGS = ${MET_CPPFLAGS} -I../../../basic/vx_log
@@ -519,6 +522,7 @@ distclean-compile:
@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ascii2nc-ascii2nc.Po@am__quote@ # am--include-marker
@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ascii2nc-ascii2nc_conf_info.Po@am__quote@ # am--include-marker
@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ascii2nc-file_handler.Po@am__quote@ # am--include-marker
+@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ascii2nc-iabp_handler.Po@am__quote@ # am--include-marker
@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ascii2nc-ismn_handler.Po@am__quote@ # am--include-marker
@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ascii2nc-little_r_handler.Po@am__quote@ # am--include-marker
@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/ascii2nc-met_handler.Po@am__quote@ # am--include-marker
@@ -730,6 +734,20 @@ ascii2nc-ismn_handler.obj: ismn_handler.cc
@AMDEP_TRUE@@am__fastdepCXX_FALSE@ DEPDIR=$(DEPDIR) $(CXXDEPMODE) $(depcomp) @AMDEPBACKSLASH@
@am__fastdepCXX_FALSE@ $(AM_V_CXX@am__nodep@)$(CXX) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ascii2nc_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS) -c -o ascii2nc-ismn_handler.obj `if test -f 'ismn_handler.cc'; then $(CYGPATH_W) 'ismn_handler.cc'; else $(CYGPATH_W) '$(srcdir)/ismn_handler.cc'; fi`
+ascii2nc-iabp_handler.o: iabp_handler.cc
+@am__fastdepCXX_TRUE@ $(AM_V_CXX)$(CXX) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ascii2nc_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS) -MT ascii2nc-iabp_handler.o -MD -MP -MF $(DEPDIR)/ascii2nc-iabp_handler.Tpo -c -o ascii2nc-iabp_handler.o `test -f 'iabp_handler.cc' || echo '$(srcdir)/'`iabp_handler.cc
+@am__fastdepCXX_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/ascii2nc-iabp_handler.Tpo $(DEPDIR)/ascii2nc-iabp_handler.Po
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@ $(AM_V_CXX)source='iabp_handler.cc' object='ascii2nc-iabp_handler.o' libtool=no @AMDEPBACKSLASH@
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@ DEPDIR=$(DEPDIR) $(CXXDEPMODE) $(depcomp) @AMDEPBACKSLASH@
+@am__fastdepCXX_FALSE@ $(AM_V_CXX@am__nodep@)$(CXX) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ascii2nc_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS) -c -o ascii2nc-iabp_handler.o `test -f 'iabp_handler.cc' || echo '$(srcdir)/'`iabp_handler.cc
+
+ascii2nc-iabp_handler.obj: iabp_handler.cc
+@am__fastdepCXX_TRUE@ $(AM_V_CXX)$(CXX) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ascii2nc_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS) -MT ascii2nc-iabp_handler.obj -MD -MP -MF $(DEPDIR)/ascii2nc-iabp_handler.Tpo -c -o ascii2nc-iabp_handler.obj `if test -f 'iabp_handler.cc'; then $(CYGPATH_W) 'iabp_handler.cc'; else $(CYGPATH_W) '$(srcdir)/iabp_handler.cc'; fi`
+@am__fastdepCXX_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/ascii2nc-iabp_handler.Tpo $(DEPDIR)/ascii2nc-iabp_handler.Po
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@ $(AM_V_CXX)source='iabp_handler.cc' object='ascii2nc-iabp_handler.obj' libtool=no @AMDEPBACKSLASH@
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@ DEPDIR=$(DEPDIR) $(CXXDEPMODE) $(depcomp) @AMDEPBACKSLASH@
+@am__fastdepCXX_FALSE@ $(AM_V_CXX@am__nodep@)$(CXX) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ascii2nc_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS) -c -o ascii2nc-iabp_handler.obj `if test -f 'iabp_handler.cc'; then $(CYGPATH_W) 'iabp_handler.cc'; else $(CYGPATH_W) '$(srcdir)/iabp_handler.cc'; fi`
+
ascii2nc-python_handler.o: python_handler.cc
@am__fastdepCXX_TRUE@ $(AM_V_CXX)$(CXX) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(ascii2nc_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS) -MT ascii2nc-python_handler.o -MD -MP -MF $(DEPDIR)/ascii2nc-python_handler.Tpo -c -o ascii2nc-python_handler.o `test -f 'python_handler.cc' || echo '$(srcdir)/'`python_handler.cc
@am__fastdepCXX_TRUE@ $(AM_V_at)$(am__mv) $(DEPDIR)/ascii2nc-python_handler.Tpo $(DEPDIR)/ascii2nc-python_handler.Po
@@ -877,6 +895,7 @@ distclean: distclean-am
-rm -f ./$(DEPDIR)/ascii2nc-ascii2nc.Po
-rm -f ./$(DEPDIR)/ascii2nc-ascii2nc_conf_info.Po
-rm -f ./$(DEPDIR)/ascii2nc-file_handler.Po
+ -rm -f ./$(DEPDIR)/ascii2nc-iabp_handler.Po
-rm -f ./$(DEPDIR)/ascii2nc-ismn_handler.Po
-rm -f ./$(DEPDIR)/ascii2nc-little_r_handler.Po
-rm -f ./$(DEPDIR)/ascii2nc-met_handler.Po
@@ -936,6 +955,7 @@ maintainer-clean: maintainer-clean-am
-rm -f ./$(DEPDIR)/ascii2nc-ascii2nc.Po
-rm -f ./$(DEPDIR)/ascii2nc-ascii2nc_conf_info.Po
-rm -f ./$(DEPDIR)/ascii2nc-file_handler.Po
+ -rm -f ./$(DEPDIR)/ascii2nc-iabp_handler.Po
-rm -f ./$(DEPDIR)/ascii2nc-ismn_handler.Po
-rm -f ./$(DEPDIR)/ascii2nc-little_r_handler.Po
-rm -f ./$(DEPDIR)/ascii2nc-met_handler.Po
diff --git a/src/tools/other/ascii2nc/ascii2nc.cc b/src/tools/other/ascii2nc/ascii2nc.cc
index a0c76e0082..940e12cedf 100644
--- a/src/tools/other/ascii2nc/ascii2nc.cc
+++ b/src/tools/other/ascii2nc/ascii2nc.cc
@@ -86,6 +86,7 @@
#include "airnow_handler.h"
#include "ndbc_handler.h"
#include "ismn_handler.h"
+#include "iabp_handler.h"
#ifdef ENABLE_PYTHON
#include "global_python.h"
@@ -117,6 +118,7 @@ enum class ASCIIFormat {
Airnow_hourly,
NDBC_standard,
ISMN,
+ IABP,
Aeronet_v2,
Aeronet_v3,
Python,
@@ -137,6 +139,10 @@ static MaskPlane mask_area;
static MaskPoly mask_poly;
static StringArray mask_sid;
+// Beginning and ending times
+static unixtime valid_beg_ut;
+static unixtime valid_end_ut;
+
static int compress_level = -1;
////////////////////////////////////////////////////////////////////////
@@ -152,6 +158,8 @@ static void set_mask_grid(const StringArray &);
static void set_mask_poly(const StringArray &);
static void set_mask_sid(const StringArray &);
static void set_compress(const StringArray &);
+static void set_valid_beg_time(const StringArray &);
+static void set_valid_end_time(const StringArray &);
static void setup_wrapper_path();
@@ -166,6 +174,9 @@ int met_main(int argc, char *argv[]) {
//
if(argc == 1) { usage(); return 0; }
+ // Initialize time range
+ valid_beg_ut = valid_end_ut = (unixtime) 0;
+
//
// Parse the command line into tokens
//
@@ -184,6 +195,8 @@ int met_main(int argc, char *argv[]) {
cline.add(set_mask_grid, "-mask_grid", 1);
cline.add(set_mask_poly, "-mask_poly", 1);
cline.add(set_mask_sid, "-mask_sid", 1);
+ cline.add(set_valid_beg_time, "-valid_beg", 1);
+ cline.add(set_valid_end_time, "-valid_end", 1);
cline.add(set_compress, "-compress", 1);
//
@@ -211,6 +224,17 @@ int met_main(int argc, char *argv[]) {
<< "Config File: " << config_filename << "\n";
config_info.read_config(DEFAULT_CONFIG_FILENAME, config_filename.text());
+ // Check that valid_end_ut >= valid_beg_ut
+ if(valid_beg_ut != (unixtime) 0 &&
+ valid_end_ut != (unixtime) 0 &&
+ valid_beg_ut > valid_end_ut) {
+ mlog << Error << "\nmet_main() -> "
+ << "the ending time (" << unix_to_yyyymmdd_hhmmss(valid_end_ut)
+ << ") must be greater than the beginning time ("
+ << unix_to_yyyymmdd_hhmmss(valid_beg_ut) << ").\n\n";
+ exit(1);
+ }
+
//
// Create the file handler based on the ascii format specified on
// the command line. If one wasn't specified, we'll look in the
@@ -225,7 +249,8 @@ int met_main(int argc, char *argv[]) {
if(deflate_level > 9) deflate_level = config_info.get_compression_level();
file_handler->setCompressionLevel(deflate_level);
file_handler->setSummaryInfo(config_info.getSummaryInfo());
-
+ file_handler->setValidTimeRange(valid_beg_ut, valid_end_ut);
+
//
// Set the masking grid and polyline, if specified.
//
@@ -330,6 +355,10 @@ FileHandler *create_file_handler(const ASCIIFormat format, const ConcatString &a
return (FileHandler *) new IsmnHandler(program_name);
}
+ case ASCIIFormat::IABP: {
+ return((FileHandler *) new IabpHandler(program_name));
+ }
+
case ASCIIFormat::Aeronet_v2: {
AeronetHandler *handler = new AeronetHandler(program_name);
handler->setFormatVersion(2);
@@ -375,6 +404,22 @@ FileHandler *determine_ascii_format(const ConcatString &ascii_filename) {
exit(1);
}
+ //
+ // See if this is an IABP file.
+ // Check for IABP first, since IABP files can have the same number of
+ // columns as other formats that are identified only by column count.
+ //
+ f_in.rewind();
+ IabpHandler *iabp_file = new IabpHandler(program_name);
+
+ if(iabp_file->isFileType(f_in)) {
+ f_in.close();
+ return((FileHandler *) iabp_file);
+ }
+
+ delete iabp_file;
+
+
//
// See if this is a MET file.
//
@@ -505,6 +550,8 @@ void usage() {
<< "\t[-mask_sid file|list]\n"
<< "\t[-log file]\n"
<< "\t[-v level]\n"
+ << "\t[-valid_beg time]\n"
+ << "\t[-valid_end time]\n"
<< "\t[-compress level]\n\n"
<< "\twhere\t\"ascii_file\" is the formatted ASCII "
@@ -524,6 +571,7 @@ void usage() {
<< AirnowHandler::getFormatStringHourly() << "\", \""
<< NdbcHandler::getFormatStringStandard() << "\", \""
<< IsmnHandler::getFormatString() << "\", \""
+ << IabpHandler::getFormatString() << "\", \""
<< AeronetHandler::getFormatString() << "\", \""
<< AeronetHandler::getFormatString_v2() << "\", \""
<< AeronetHandler::getFormatString_v3() << "\"";
@@ -556,6 +604,12 @@ void usage() {
<< "\t\t\"-v level\" overrides the default level of logging ("
<< mlog.verbosity_level() << ") (optional).\n"
+ << "\t\t\"-valid_beg time\" in YYYYMMDD[_HH[MMSS]] sets the "
+ << "beginning of the processed data time window (optional).\n"
+
+ << "\t\t\"-valid_end time\" in YYYYMMDD[_HH[MMSS]] sets the "
+ << "end of the processed data time window (optional).\n"
+
<< "\t\t\"-compress level\" overrides the compression level of NetCDF variable ("
<< config_info.get_compression_level() << ") (optional).\n\n"
@@ -609,6 +663,9 @@ void set_format(const StringArray & a) {
else if(IsmnHandler::getFormatString() == a[0]) {
ascii_format = ASCIIFormat::ISMN;
}
+ else if(IabpHandler::getFormatString() == a[0]) {
+ ascii_format = ASCIIFormat::IABP;
+ }
else if(AeronetHandler::getFormatString() == a[0]
|| AeronetHandler::getFormatString_v2() == a[0]) {
ascii_format = ASCIIFormat::Aeronet_v2;
@@ -702,6 +759,20 @@ void set_mask_sid(const StringArray & a) {
////////////////////////////////////////////////////////////////////////
+void set_valid_beg_time(const StringArray & a)
+{
+ valid_beg_ut = timestring_to_unix(a[0].c_str());
+}
+
+////////////////////////////////////////////////////////////////////////
+
+void set_valid_end_time(const StringArray & a)
+{
+ valid_end_ut = timestring_to_unix(a[0].c_str());
+}
+
+////////////////////////////////////////////////////////////////////////
+
void set_compress(const StringArray & a) {
compress_level = atoi(a[0].c_str());
}
diff --git a/src/tools/other/ascii2nc/file_handler.cc b/src/tools/other/ascii2nc/file_handler.cc
index 6e2521f0c8..5495c8b6dc 100644
--- a/src/tools/other/ascii2nc/file_handler.cc
+++ b/src/tools/other/ascii2nc/file_handler.cc
@@ -57,7 +57,9 @@ FileHandler::FileHandler(const string &program_name) :
use_var_id(false),
do_monitor(false),
deflate_level(DEF_DEFLATE_LEVEL),
- _dataSummarized(false)
+ _dataSummarized(false),
+ valid_beg_ut((time_t)0),
+ valid_end_ut((time_t)0)
{
}
@@ -77,6 +79,12 @@ bool FileHandler::readAsciiFiles(const vector< ConcatString > &ascii_filename_li
// Loop through the ASCII files, reading in the observations. At the end of
// this loop, all of the observations will be in the _observations vector.
+ //
+ // debug counts
+ //
+ num_observations_in_range = 0;
+ num_observations_out_of_range = 0;
+
for (vector< ConcatString >::const_iterator ascii_filename = ascii_filename_list.begin();
ascii_filename != ascii_filename_list.end(); ++ascii_filename)
{
@@ -103,6 +111,9 @@ bool FileHandler::readAsciiFiles(const vector< ConcatString > &ascii_filename_li
ascii_file.close();
}
+ mlog << Debug(2) << " Kept " << num_observations_in_range
+ << " observations, rejected (out of range) " << num_observations_out_of_range
+ << " observations\n";
return true;
}
@@ -172,6 +183,14 @@ void FileHandler::setSummaryInfo(const TimeSummaryInfo &summary_info) {
////////////////////////////////////////////////////////////////////////
+void FileHandler::setValidTimeRange(const time_t &valid_beg, const time_t valid_end)
+{
+ valid_beg_ut = valid_beg;
+ valid_end_ut = valid_end;
+}
+
+////////////////////////////////////////////////////////////////////////
+
bool FileHandler::summarizeObs(const TimeSummaryInfo &summary_info)
{
bool result = summary_obs.summarizeObs(summary_info);
@@ -275,6 +294,16 @@ bool FileHandler::_addObservations(const Observation &obs)
//
if(filters.is_filtered_sid(obs.getStationId().c_str())) return false;
+ //
+ // Check if valid time is in range
+ //
+ if (_keep_valid_time(obs.getValidTime())) {
+ num_observations_in_range++;
+ } else {
+ num_observations_out_of_range++;
+ return false;
+ }
+
// Save obs because the obs vector is sorted after time summary
_observations.push_back(obs);
if (do_summary) summary_obs.addObservationObj(obs);
@@ -334,3 +363,23 @@ void FileHandler::debug_print_observations(vector< Observation > my_observation,
}
////////////////////////////////////////////////////////////////////////
+
+bool FileHandler::_keep_valid_time(const time_t &valid_time) const
+{
+ bool keep = true;
+
+ // If valid times are both set, check the range
+ if (valid_beg_ut != (time_t) 0 && valid_end_ut != (time_t) 0) {
+ if (valid_time < valid_beg_ut || valid_time > valid_end_ut) keep = false;
+ }
+ // If only beg set, check the lower bound
+ else if (valid_beg_ut != (time_t) 0 && valid_end_ut == (time_t) 0) {
+ if (valid_time < valid_beg_ut) keep = false;
+ }
+ // If only end set, check the upper bound
+ else if (valid_beg_ut == (time_t) 0 && valid_end_ut != (time_t) 0) {
+ if (valid_time > valid_end_ut) keep = false;
+ }
+ return(keep);
+}
+
diff --git a/src/tools/other/ascii2nc/file_handler.h b/src/tools/other/ascii2nc/file_handler.h
index af721c008c..07d77e54f3 100644
--- a/src/tools/other/ascii2nc/file_handler.h
+++ b/src/tools/other/ascii2nc/file_handler.h
@@ -66,7 +66,7 @@ class FileHandler
void setCompressionLevel(int compressoion_level);
void setSummaryInfo(bool new_do_summary);
void setSummaryInfo(const TimeSummaryInfo &summary_info);
-
+ void setValidTimeRange(const time_t &valid_beg, const time_t valid_end);
protected:
@@ -114,6 +114,10 @@ class FileHandler
TimeSummaryInfo _summaryInfo;
SummaryObs summary_obs;
+ time_t valid_beg_ut, valid_end_ut;
+ int num_observations_in_range;
+ int num_observations_out_of_range;
+
///////////////////////
// Protected methods //
///////////////////////
@@ -143,6 +147,9 @@ class FileHandler
void _closeNetcdf();
bool _openNetcdf(const std::string &nc_filename);
void debug_print_observations(std::vector< Observation >, std::string);
+
+ bool _keep_valid_time(const time_t &valid_time) const;
+
};
inline void FileHandler::setCompressionLevel(int compressoion_level) { deflate_level = compressoion_level; }
diff --git a/src/tools/other/ascii2nc/iabp_handler.cc b/src/tools/other/ascii2nc/iabp_handler.cc
new file mode 100644
index 0000000000..71ce27f10c
--- /dev/null
+++ b/src/tools/other/ascii2nc/iabp_handler.cc
@@ -0,0 +1,273 @@
+// *=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*
+// ** Copyright UCAR (c) 1992 - 2024
+// ** University Corporation for Atmospheric Research (UCAR)
+// ** National Center for Atmospheric Research (NCAR)
+// ** Research Applications Lab (RAL)
+// ** P.O.Box 3000, Boulder, Colorado, 80307-3000, USA
+// *=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*
+
+////////////////////////////////////////////////////////////////////////
+
+using namespace std;
+
+#include
+
+#include "vx_log.h"
+#include "vx_math.h"
+#include "vx_util.h"
+
+#include "iabp_handler.h"
+
+const double IabpHandler::IABP_MISSING_VALUE = -999.0;
+
+
+const int IabpHandler::MIN_NUM_HDR_COLS = 8;
+
+// days in the month
+static int daysOfMonth[] = {31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31};
+
+static int _lookfor(const DataLine &dl, const string &name);
+static int _lookfor(const DataLine &dl, const string &name, const string &ascii_file, bool &ok);
+static time_t _time(const string &syear, const string &shour, const string &smin, const string &sdoy);
+
+
+////////////////////////////////////////////////////////////////////////
+//
+// Code for class IabpHandler
+//
+////////////////////////////////////////////////////////////////////////
+
+IabpHandler::IabpHandler(const string &program_name) :
+ FileHandler(program_name) {
+ use_var_id = true;
+}
+
+////////////////////////////////////////////////////////////////////////
+
+IabpHandler::~IabpHandler() {
+}
+
+////////////////////////////////////////////////////////////////////////
+
+bool IabpHandler::isFileType(LineDataFile &ascii_file) const {
+
+ // IABP files are identified by their .dat suffix and by
+ // checking for the always-present header columns.
+ // The header line looks like this:
+ // BuoyID Year Hour Min DOY POS_DOY Lat Lon [ BP Ts Ta]
+
+ // Initialize using the filename suffix
+ bool is_file_type = check_prefix_suffix(ascii_file.short_filename(),
+ nullptr, ".dat");
+
+ // Read the header line
+ DataLine dl;
+ while(dl.n_items() == 0) ascii_file >> dl;
+
+ // Check the minimum number of header columns
+ if(dl.n_items() < MIN_NUM_HDR_COLS) {
+ return false;
+ }
+
+ string line = dl.get_line();
+ ConcatString cstring(line);
+
+ StringArray tokens = cstring.split(" ");
+ if (tokens[0] != "BuoyID") is_file_type = false;
+ if (tokens[1] != "Year") is_file_type = false;
+ if (tokens[2] != "Hour") is_file_type = false;
+ if (tokens[3] != "Min") is_file_type = false;
+ if (tokens[4] != "DOY") is_file_type = false;
+ if (tokens[5] != "POS_DOY") is_file_type = false;
+ if (tokens[6] != "Lat") is_file_type = false;
+ if (tokens[7] != "Lon") is_file_type = false;
+
+ return(is_file_type);
+}
+
+////////////////////////////////////////////////////////////////////////
+// Private/Protected methods
+////////////////////////////////////////////////////////////////////////
+
+bool IabpHandler::_readObservations(LineDataFile &ascii_file)
+{
+ // Read and save the header information
+ if(!_readHeaderInfo(ascii_file)) return(false);
+
+ string header_type = "IABP_STANDARD";
+
+ // Process the observation lines
+ DataLine dl;
+ while(ascii_file >> dl) {
+
+ // Make sure that the line contains the correct number of tokens
+ if(dl.n_items() != _numColumns) {
+ mlog << Error << "\nIabpHandler::_readObervations() -> "
+ << "unexpected number of columns (" << dl.n_items()
+ << " != " << _numColumns << ") on line number "
+ << dl.line_number() << " of IABP file \""
+ << ascii_file.filename() << "\"!\n\n";
+ return(false);
+ }
+
+ // Extract the valid time from the data line using POS_DOY, since the
+ // buoy position is of primary interest. The month and day are computed
+ // from POS_DOY and the year, which handles leap years correctly.
+ time_t valid_time = _time(dl[_yearPtr], dl[_hourPtr], dl[_minutePtr], dl[_posdoyPtr]);
+ if(valid_time == 0) {
+ mlog << Warning << "\nIabpHandler::_readObservations() -> "
+ << "No valid time computed in file, line number "
+ << dl.line_number() << " of IABP file \""
+ << ascii_file.filename() << "\". Ignore this line\n\n";
+ return(false);
+ }
+
+ double lat = stod(dl[_latPtr]);
+ double lon = stod(dl[_lonPtr]);
+ string stationId = dl[_idPtr];
+ string quality_flag = na_str;
+ int grib_code = 0;
+ double height_m = bad_data_double;
+ double pres = bad_data_double;
+ double elev = bad_data_double;
+ double ts = bad_data_double;
+ double ta = bad_data_double;
+
+ if (lat == IABP_MISSING_VALUE || lon == IABP_MISSING_VALUE) {
+ // This is either a rare event or never happens
+ mlog << Warning << "\nIabpHandler::_readObservations() -> "
+ << "Latitude/longitude has missing value " << IABP_MISSING_VALUE
+ << ", line number " << dl.line_number() << " of IABP file \""
+ << ascii_file.filename() << "\". Ignore this line\n\n";
+ return(false);
+ }
+
+ if (_bpPtr >= 0) {
+ // is this the right placeholder for this? To always put it in to the
+ // fixed slot of an observation?
+ pres = stod(dl[_bpPtr]);
+ if (pres == IABP_MISSING_VALUE) {
+ pres = bad_data_double;
+ }
+ }
+
+ // Add a location placeholder observation in case neither of the temperatures
+ // is available. Otherwise there would be no observations for this entry, and
+ // we want valid entries for every time/lat/lon.
+ _addObservations(Observation(
+ header_type, stationId, valid_time,
+ lat, lon, elev, quality_flag, grib_code,
+ pres, height_m, 1.0, "Location"));
+ grib_code++;
+
+ if (_tsPtr >= 0) {
+ ts = stod(dl[_tsPtr]);
+ if (ts != IABP_MISSING_VALUE) {
+ _addObservations(Observation(
+ header_type, stationId, valid_time,
+ lat, lon, elev, quality_flag, grib_code,
+ pres, height_m, ts, "Temp_surface"));
+ grib_code++;
+ }
+ }
+ if (_taPtr >= 0) {
+ ta = stod(dl[_taPtr]);
+ if (ta != IABP_MISSING_VALUE) {
+ _addObservations(Observation(
+ header_type, stationId, valid_time,
+ lat, lon, elev, quality_flag, grib_code,
+ pres, height_m, ta, "Temp_air"));
+ }
+ }
+
+
+
+ } // end while
+
+ return(true);
+}
+
+// ////////////////////////////////////////////////////////////////////////
+
+bool IabpHandler::_readHeaderInfo(LineDataFile &ascii_file) {
+
+ DataLine dl;
+ if (!(ascii_file >> dl))
+ {
+ mlog << Error << "\nIabpHandler::_readHeaderInfo() -> "
+ << "error reading header line from input ASCII file \""
+ << ascii_file.filename() << "\"\n\n";
+ return false;
+ }
+
+ // Check the minimum number of header columns
+ if(dl.n_items() < MIN_NUM_HDR_COLS) {
+ mlog << Error << "\nIabpHandler::_readHeaderInfo() -> "
+ << "unexpected number of header columns ("
+ << dl.n_items() << " < " << MIN_NUM_HDR_COLS
+ << ") in IABP file \"" << ascii_file.filename()
+ << "\"!\n\n";
+ return(false);
+ }
+
+ // Map the header information to column numbers
+ bool ok = true;
+ string filename = ascii_file.filename();
+ _idPtr = _lookfor(dl, "BuoyID", filename, ok);
+ _yearPtr = _lookfor(dl, "Year", filename, ok);
+ _hourPtr = _lookfor(dl, "Hour", filename, ok);
+ _minutePtr = _lookfor(dl, "Min", filename, ok);
+ _doyPtr = _lookfor(dl, "DOY", filename, ok);
+ _posdoyPtr = _lookfor(dl, "POS_DOY", filename, ok);
+ _latPtr = _lookfor(dl, "Lat", filename, ok);
+ _lonPtr = _lookfor(dl, "Lon", filename, ok);
+ _numColumns = MIN_NUM_HDR_COLS;
+ _bpPtr = _lookfor(dl, "BP");
+ if (_bpPtr >= 0) ++_numColumns;
+ _tsPtr = _lookfor(dl, "Ts");
+ if (_tsPtr >= 0) ++_numColumns;
+ _taPtr = _lookfor(dl, "Ta");
+ if (_taPtr >= 0) ++_numColumns;
+ return ok;
+}
+
+////////////////////////////////////////////////////////////////////////
+
+static int _lookfor(const DataLine &dl, const string &name, const string &ascii_file, bool &ok)
+{
+ for (int i=0; i "
+ << "reading ASCII file \""
+ << ascii_file << "\" did not find expected header item:\"" << name << "\" ignore file\n\n";
+ ok = false;
+ return -1;
+}
+
+////////////////////////////////////////////////////////////////////////
+
+static int _lookfor(const DataLine &dl, const string &name)
+{
+ for (int i=0; i<dl.n_items(); ++i) {
+    if (dl[i] == name) return i;
+ }
+ return -1;
+}
+
+////////////////////////////////////////////////////////////////////////
diff --git a/src/tools/other/ascii2nc/iabp_handler.h b/src/tools/other/ascii2nc/iabp_handler.h
new file mode 100644
--- /dev/null
+++ b/src/tools/other/ascii2nc/iabp_handler.h
+#ifndef __IABP_HANDLER_H__
+#define __IABP_HANDLER_H__
+
+#include
+
+#include "file_handler.h"
+
+////////////////////////////////////////////////////////////////////////
+//
+// International Arctic Buoy Programme
+// https://iabp.apl.uw.edu/data.html
+// Files pulled from:
+// https://iabp.apl.uw.edu/WebData/
+//
+//
+// Dataset file names:
+// <buoyID>.dat, where <buoyID> is an integer buoy identifier.
+//
+// Dataset Contents:
+//
+// Buoy data files are updated daily and made available individually at the WebData URL above.
+// Values provided are confined to surface temperature, atmospheric temperature, and barometric pressure, when these values are available.
+// All buoy files contain at least dates and positions.
+// Each file has a header line and one or more data lines.
+//
+// Header line example (BP, Ts, and Ta are not always present; any subset of them may appear):
+//
+// BuoyID Year Hour Min DOY POS_DOY Lat Lon BP Ts Ta
+//
+// Record Lines (many per file, typically):
+// Each file contains data from all buoys on a given date.
+// Data includes: BuoyID, year, hour, minute, Day of Year, Position Day of year,
+// latitude, longitude, [Barometric Pressure], [Surface Temp], and [Atmospheric Temp].
+// The last 3 values are optional and depend on the header line.
+//
+// Record line Example
+//
+// 5318 2014 02 20 28.0970 28.0930 72.75970 -165.25190 1016.62 -999.00 -13.92
+//
+// The missing data value appears to be -999.0, but this is not documented and might not always be the case.
+//
+
+////////////////////////////////////////////////////////////////////////
+
+class IabpHandler : public FileHandler {
+
+ public:
+
+ IabpHandler(const string &program_name);
+ virtual ~IabpHandler();
+
+ virtual bool isFileType(LineDataFile &ascii_file) const;
+
+ static string getFormatString() { return "iabp"; }
+
+ protected:
+
+ /////////////////////////
+ // Protected constants
+ /////////////////////////
+
+ // The minimum number of columns in the header line in the file.
+ static const int MIN_NUM_HDR_COLS;
+
+ // a missing data value found in some iabp files
+ static const double IABP_MISSING_VALUE;
+
+ ///////////////////////
+ // Protected members
+ ///////////////////////
+
+ // Store the list of unique output variable names
+ StringArray _varNames;
+
+ // pointers based on header content
+ int _idPtr;
+ int _yearPtr;
+ int _hourPtr;
+ int _minutePtr;
+ int _doyPtr;
+ int _posdoyPtr;
+ int _latPtr;
+ int _lonPtr;
+ int _bpPtr;
+ int _tsPtr;
+ int _taPtr;
+
+ // depends on which of the optional data types are present in a file
+ int _numColumns;
+
+ string _buoyId;
+ time_t _validTime;
+ double _stationLat;
+ double _stationLon;
+ double _stationElv;
+ double _bp;
+ double _ts;
+ double _ta;
+
+ ///////////////////////
+ // Protected methods
+ ///////////////////////
+
+ // Read and save the header information from the given file
+ bool _readHeaderInfo(LineDataFile &ascii_file);
+
+ // Get the valid time from the observation line
+ // Read the observations and add them to the
+ // _observations vector
+ virtual bool _readObservations(LineDataFile &ascii_file);
+
+ // compute the time value from inputs
+ time_t _time2(const string &syear, const string &shour, const string &smin, const string &sdoy);
+};
+
+////////////////////////////////////////////////////////////////////////
+
+#endif /* __IABP_HANDLER_H__ */
+
+////////////////////////////////////////////////////////////////////////