Simple tests #17

Merged: 7 commits, Jul 25, 2023

Changes from all commits
66 changes: 66 additions & 0 deletions .github/labels.yml
@@ -0,0 +1,66 @@
---
# Label names are important, as they are used by Release Drafter to decide
# where to record them in the changelog, or whether to skip them.
#
# The repository labels will be automatically configured using this file and
# the GitHub Action https://github.com/marketplace/actions/github-labeler.
- name: breaking
  description: Breaking Changes
  color: bfd4f2
- name: bug
  description: Something isn't working
  color: d73a4a
- name: build
  description: Build System and Dependencies
  color: bfdadc
- name: ci
  description: Continuous Integration
  color: 4a97d6
- name: dependencies
  description: Pull requests that update a dependency file
  color: 0366d6
- name: documentation
  description: Improvements or additions to documentation
  color: 0075ca
- name: duplicate
  description: This issue or pull request already exists
  color: cfd3d7
- name: enhancement
  description: New feature or request
  color: a2eeef
- name: github_actions
  description: Pull requests that update Github_actions code
  color: "000000"
- name: good first issue
  description: Good for newcomers
  color: 7057ff
- name: help wanted
  description: Extra attention is needed
  color: 008672
- name: invalid
  description: This doesn't seem right
  color: e4e669
- name: performance
  description: Performance
  color: "016175"
- name: python
  description: Pull requests that update Python code
  color: 2b67c6
- name: question
  description: Further information is requested
  color: d876e3
- name: refactoring
  description: Refactoring
  color: ef67c4
- name: removal
  description: Removals and Deprecations
  color: 9ae7ea
- name: style
  description: Style
  color: c120e5
- name: testing
  description: Testing
  color: b1fc6f
- name: wontfix
  description: This will not be worked on
  color: ffffff
28 changes: 28 additions & 0 deletions .github/release-drafter.yml
@@ -0,0 +1,28 @@
categories:
  - title: ":boom: Breaking Changes"
    label: "breaking"
  - title: ":rocket: Features"
    label: "enhancement"
  - title: ":fire: Removals and Deprecations"
    label: "removal"
  - title: ":beetle: Fixes"
    label: "bug"
  - title: ":racehorse: Performance"
    label: "performance"
  - title: ":rotating_light: Testing"
    label: "testing"
  - title: ":construction_worker: Continuous Integration"
    label: "ci"
  - title: ":books: Documentation"
    label: "documentation"
  - title: ":hammer: Refactoring"
    label: "refactoring"
  - title: ":lipstick: Style"
    label: "style"
  - title: ":package: Dependencies"
    labels:
      - "dependencies"
      - "build"
template: |
  ## Changes
  $CHANGES
18 changes: 18 additions & 0 deletions .github/workflows/labeler.yml
@@ -0,0 +1,18 @@
name: Labeler

on:
  push:
    branches:
      - main

jobs:
  labeler:
    runs-on: ubuntu-latest
    steps:
      - name: Check out the repository
        uses: actions/checkout@v3

      - name: Run Labeler
        uses: crazy-max/[email protected]
        with:
          skip-delete: true
78 changes: 78 additions & 0 deletions .github/workflows/release.yml
@@ -0,0 +1,78 @@
name: Release

on:
  push:
    branches:
      - main

jobs:
  release:
    name: Release
    runs-on: ubuntu-latest
    steps:
      - name: Check out the repository
        uses: actions/checkout@v3
        with:
          fetch-depth: 2

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"

      - name: Upgrade pip
        run: |
          pip install --constraint=.github/workflows/constraints.txt pip
          pip --version

      - name: Install Poetry
        run: |
          pip install --constraint=.github/workflows/constraints.txt poetry
          poetry --version

      - name: Check if there is a parent commit
        id: check-parent-commit
        run: |
          echo "::set-output name=sha::$(git rev-parse --verify --quiet HEAD^)"

      - name: Detect and tag new version
        id: check-version
        if: steps.check-parent-commit.outputs.sha
        uses: salsify/[email protected]
        with:
          version-command: |
            bash -o pipefail -c "poetry version | awk '{ print \$2 }'"

      - name: Bump version for developmental release
        if: "! steps.check-version.outputs.tag"
        run: |
          poetry version patch &&
          version=$(poetry version | awk '{ print $2 }') &&
          poetry version $version.dev.$(date +%s)

      - name: Build package
        run: |
          poetry build --ansi

      # - name: Publish package on PyPI
      #   if: steps.check-version.outputs.tag
      #   uses: pypa/[email protected]
      #   with:
      #     user: __token__
      #     password: ${{ secrets.PYPI_TOKEN }}

      # - name: Publish package on TestPyPI
      #   if: "! steps.check-version.outputs.tag"
      #   uses: pypa/[email protected]
      #   with:
      #     user: __token__
      #     password: ${{ secrets.TEST_PYPI_TOKEN }}
      #     repository_url: https://test.pypi.org/legacy/

      - name: Publish the release notes
        uses: release-drafter/[email protected]
        with:
          publish: ${{ steps.check-version.outputs.tag != '' }}
          tag: ${{ steps.check-version.outputs.tag }}
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
3 changes: 3 additions & 0 deletions .gitignore
@@ -127,3 +127,6 @@ dmypy.json

# Pyre type checker
.pyre/

# My additions
tests/data
51 changes: 37 additions & 14 deletions src/cesm_helper_scripts/gen_agg
@@ -55,8 +55,20 @@ parser.add_argument(
     default=["TREFHT"],
     help="List of attributes of netCDF file",
 )
+parser.add_argument(
+    "--append-to",
+    type=str,
+    default="",
+    help="An output file that should be appended to with the input data files.",
+)

 args = parser.parse_args()
+if args.append_to != "":
+    sys.exit(
+        "Sorry, but an appending method will not be implemented in the near future."
+        " Instead you can use the ncrcat function from the NCO family"
+        " (http://nco.sf.net/nco.html#ncrcat)."
+    )
 # Correct the input argument
 if args.input is None:
     raise ValueError("you must give the input files")
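
For anyone who follows that exit message, here is a rough sketch of the suggested ncrcat route, wrapped in Python via subprocess; the file names are invented for illustration, and the NCO tools are assumed to be installed so that ncrcat is on PATH.

import subprocess

# Concatenate the (hypothetical) input files along the record dimension with
# NCO's ncrcat; the last argument is the output file.
inputs = ["run.cam.h0.1850-01.nc", "run.cam.h0.1850-02.nc"]
subprocess.run(["ncrcat", *inputs, "combined.nc"], check=True)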
@@ -74,9 +86,11 @@ else:
     path = ""
 # Combine the path with all files
 inputs = [
-    f"{path}{file}"
-    if file.split(".")[-1] == "nc" or "*" in file
-    else f"{path}{file}.nc"
+    (
+        f"{path}{file}"
+        if file.split(".")[-1] == "nc" or "*" in file
+        else f"{path}{file}.nc"
+    )
     for file in args.input
 ]
 # If an asterisk (*) is used, all other files are discarded
@@ -103,14 +117,16 @@ def _attr_present(attr) -> bool:
         check_input = glob.glob(the_input)[0]
     elif isinstance(the_input, list):
         check_input = the_input[0]
-    ds = xr.open_mfdataset(check_input)
+    ds = xr.open_mfdataset(check_input, lock=False)
     try:
         _ = getattr(ds, attr)
     except AttributeError as e:
         print(e)
         return False
     else:
         return True
+    finally:
+        ds.close()


 # Correct the savepath argument
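
The finally block added above guarantees the probe dataset is closed whether or not the attribute lookup succeeds. For comparison only, a minimal sketch of the same guarantee written with a context manager (the function name and arguments here are placeholders, not part of the PR):

import xarray as xr

def attr_present(path: str, attr: str) -> bool:
    # The dataset is closed automatically when the with block exits,
    # regardless of whether the attribute was found.
    with xr.open_mfdataset(path, lock=False) as ds:
        return hasattr(ds, attr)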
@@ -135,11 +151,10 @@ if not attrs:
     sys.exit("All attributes files already exist. Exiting...")

 print("Creating aggregated dataset... ", end="", flush=True)
-dataset = xr.open_mfdataset(the_input)
+# See issue https://github.com/pydata/xarray/issues/3961
+dataset = xr.open_mfdataset(the_input, lock=False)
 dataset = xr.decode_cf(dataset)
 print("Finished creating aggregated dataset.")
-# TREFHT: This is probably the correct one for global temperature, reference height
-# temperature. SNOWHLND: land snow volume?
 for i, a in enumerate(attrs):
     print(
         f"{i+1}/{len(attrs)}: Start creating file for attr {a}... ", end="", flush=True
@@ -165,14 +180,22 @@ for i, a in enumerate(attrs):
                 parts += 1
                 continue
             bulk = ds[(parts - 1) * ten_years : parts * ten_years]
-            bulk = bulk.assign_attrs(
-                {"Time span": f"From {bulk.time.data[0]} to {bulk.time.data[-1]}"}
+            bulk = ds.to_dataset()
+            bulk.attrs[
+                "history"
+            ] = f"Time span: From {ds.time.data[0]} to {ds.time.data[-1]}"
+            bulk.to_netcdf(
+                savepath + a + output[:-3] + f"-{parts}.nc", unlimited_dims="time"
             )
-            bulk.to_netcdf(savepath + a + output[:-3] + f"-{parts}.nc")
             bulk.close()
             parts += 1
     else:
-        ds = ds.assign_attrs(
-            {"Time span": f"From {ds.time.data[0]} to {ds.time.data[-1]}"}
-        )
-        ds.to_netcdf(savepath + a + output)
+        ds = ds.to_dataset()
+        ds.attrs[
+            "history"
+        ] = f"Time span: From {ds.time.data[0]} to {ds.time.data[-1]}"
+        ds.to_netcdf(savepath + a + output, unlimited_dims="time")
+        print(f"{tabs}Finished creating {a + output}.")
+    finally:
+        ds.close()
+dataset.close()
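
To make the reshuffled final hunk easier to follow, here is a small self-contained sketch of the write pattern it introduces, using invented toy data and a hypothetical output file name: promote the named DataArray to a Dataset, record the covered time span in a history attribute, and write time as an unlimited dimension so tools such as ncrcat can append further records later.

import numpy as np
import pandas as pd
import xarray as xr

# Toy monthly series standing in for a CESM output variable.
times = pd.date_range("2000-01-01", periods=24, freq="MS")
da = xr.DataArray(
    np.random.rand(times.size), coords={"time": times}, dims="time", name="TREFHT"
)

ds = da.to_dataset()  # works because the DataArray carries a name
ds.attrs["history"] = f"Time span: From {ds.time.data[0]} to {ds.time.data[-1]}"
ds.to_netcdf("TREFHT-example.nc", unlimited_dims=["time"])  # time stays appendable
ds.close()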