Merge pull request #370 from OpenCOMPES/notebook_rework
Fix documentation and update workflows
rettigl authored Mar 22, 2024
2 parents 52cbb66 + 65b8e89 commit 5553e53
Showing 10 changed files with 23 additions and 22 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/benchmark.yml
@@ -5,6 +5,8 @@ on:
   workflow_dispatch:
   push:
     branches: [ main, create-pull-request/patch ]
+    paths-ignore:
+      pyproject.toml

 jobs:
   benchmark:
6 changes: 3 additions & 3 deletions .github/workflows/documentation.yml
@@ -99,10 +99,10 @@ jobs:
         run: poetry run sphinx-build -b html $GITHUB_WORKSPACE/docs $GITHUB_WORKSPACE/_build

       - name: Setup Pages
-        uses: actions/configure-pages@v3
+        uses: actions/configure-pages@v4

       - name: Upload artifact
-        uses: actions/upload-pages-artifact@v2
+        uses: actions/upload-pages-artifact@v3
         with:
           path: '_build'

@@ -116,4 +116,4 @@ jobs:
     steps:
       - name: Deploy to GitHub Pages
         id: deployment
-        uses: actions/deploy-pages@v2
+        uses: actions/deploy-pages@v4
2 changes: 1 addition & 1 deletion .github/workflows/linting.yml
@@ -11,7 +11,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       # Check out repo and set up Python
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           lfs: true
12 changes: 6 additions & 6 deletions .github/workflows/release.yml
@@ -25,7 +25,7 @@ jobs:
     outputs:
       version: ${{ steps.version.outputs.version }}
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           lfs: true
           path: 'sed-processor'
@@ -68,7 +68,7 @@ jobs:
     outputs:
       version: ${{ steps.version.outputs.version }}
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           lfs: true
           path: 'sed-processor'
@@ -120,11 +120,11 @@ jobs:

     steps:
       - name: Download a single artifact
-        uses: actions/download-artifact@v3
+        uses: actions/download-artifact@v4
         with:
           name: dist

-      - name: Publish package distributions to PyPI Test
+      - name: Publish package distributions to PyPI
         uses: pypa/gh-action-pypi-publish@release/v1
         with:
           packages-dir: .
@@ -141,13 +141,13 @@ jobs:
           app-id: ${{ secrets.APP_ID }}
           private-key: ${{ secrets.APP_PRIVATE_KEY }}

-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           lfs: true
           token: ${{ steps.generate_token.outputs.token }}

       - name: Download pyproject.toml
-        uses: actions/download-artifact@v3
+        uses: actions/download-artifact@v4
         with:
           name: pyproject
2 changes: 1 addition & 1 deletion .github/workflows/update_dependencies.yml
@@ -14,7 +14,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       # Check out repo and set up Python
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           lfs: true
2 changes: 1 addition & 1 deletion .gitignore
@@ -7,7 +7,7 @@
 *.nx
 *.nxs
 *.zip
-**/parquet/*
+**/processed/*

 # Byte-compiled / optimized / DLL files
 __pycache__/
2 changes: 1 addition & 1 deletion docs/build_flash_parquets.py
@@ -24,7 +24,7 @@
     "core": {
         "paths": {
             "data_raw_dir": data_path + "/flash_data/",
-            "data_parquet_dir": data_path + "/parquet/",
+            "data_parquet_dir": data_path + "/processed/",
         },
     },
 }
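The renamed output directory enters `build_flash_parquets.py` only through this nested config dict. A minimal sketch of how such an override could be merged into a default configuration (`deep_merge` is a hypothetical helper for illustration, not part of sed):

```python
# Hypothetical helper (not part of sed): recursively merge a nested
# override dict into a copy of a default configuration.
def deep_merge(base: dict, override: dict) -> dict:
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

data_path = "/tmp/flash"  # assumed path for illustration
default_config = {"core": {"paths": {"data_raw_dir": "", "data_parquet_dir": ""}}}

config = deep_merge(default_config, {
    "core": {
        "paths": {
            "data_raw_dir": data_path + "/flash_data/",
            "data_parquet_dir": data_path + "/processed/",  # renamed from /parquet/
        },
    },
})
```

Merging rather than replacing keeps any default keys outside `core.paths` intact, which is why overrides like this are usually applied recursively.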
4 changes: 2 additions & 2 deletions tutorial/1_binning_fake_data.ipynb
@@ -7,7 +7,7 @@
    "tags": []
   },
   "source": [
-    "### Binning demonstration on locally generated fake data\n",
+    "# Binning demonstration on locally generated fake data\n",
     "In this example, we generate a table with random data simulating a single-event dataset.\n",
     "We showcase the binning method, first on a simple single table using the bin_partition method, and then with the distributed method bin_dataframe, using dask dataframes.\n",
     "The first method is rarely called directly, as it is simply the function called by bin_dataframe on each partition of the dask dataframe."
@@ -136,7 +136,7 @@
    "id": "01066d40-010a-490b-9033-7339e5a21b26",
    "metadata": {},
    "source": [
-    "## compute distributed binning on the partitioned dask dataframe\n",
+    "## Compute distributed binning on the partitioned dask dataframe\n",
     "In this example, the small dataset does not give a significant improvement over the pandas implementation, at least with this number of partitions.\n",
     "A single partition would be faster (you can try...), but we use multiple partitions for demonstration purposes."
   ]
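The partition-then-reduce idea this notebook demonstrates can be sketched without sed or dask, using plain NumPy (the `bin_partition`/`bin_dataframe` names come from the notebook text; this illustrates the principle, not sed's implementation):

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(size=10_000)
bins = np.linspace(-4, 4, 101)

# Full-table histogram in one pass.
full_hist, _ = np.histogram(data, bins=bins)

# "Distributed" variant: histogram each partition separately with the
# same bin edges, then reduce by summing the partial counts.
partitions = np.array_split(data, 4)
partial = [np.histogram(p, bins=bins)[0] for p in partitions]
dist_hist = np.sum(partial, axis=0)

# Because histogramming is additive over disjoint partitions, the two
# results agree exactly.
assert np.array_equal(full_hist, dist_hist)
```

This additivity is what lets `bin_dataframe` apply a per-partition binning function across a dask dataframe and sum the results.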
@@ -286,7 +286,7 @@
    "id": "1a3697b1",
    "metadata": {},
    "source": [
-    "##### Optional (Step 1a): \n",
+    "#### Optional (Step 1a): \n",
     "Save momentum calibration parameters to configuration file in current data folder: "
   ]
  },
@@ -359,7 +359,7 @@
    "id": "e43fbf33",
    "metadata": {},
    "source": [
-    "##### Optional (Step 1a): \n",
+    "#### Optional (Step 1a): \n",
     "Save energy correction parameters to configuration file in current data folder: "
   ]
  },
@@ -489,7 +489,7 @@
    "id": "df63c6c7",
    "metadata": {},
    "source": [
-    "##### Optional (Step 3a): \n",
+    "#### Optional (Step 3a): \n",
     "Save energy calibration parameters to configuration file in current data folder: "
   ]
  },
7 changes: 3 additions & 4 deletions tutorial/4_hextof_workflow.ipynb
@@ -130,7 +130,6 @@
     "config_override['core']['paths']['data_parquet_dir'] = \"/asap3/flash/gpfs/pg2/2023/data/11019101/processed\"\n",
     "# So we write to user space\n",
     "config_override['core']['paths']['data_parquet_dir'] = \"./processed\"\n",
-    "os.mkdir(config_override['core']['paths']['data_parquet_dir'])\n",
     "# If you didn't download data and are using maxwell, don't use this line\n",
     "config_override['core']['paths']['data_raw_dir'] = \"./flash_data/\""
   ]
@@ -975,9 +974,9 @@
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": "sed_poetry",
+   "display_name": ".pyenv",
    "language": "python",
-   "name": "sed_poetry"
+   "name": "python3"
   },
   "language_info": {
    "codemirror_mode": {
@@ -989,7 +988,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.6"
+   "version": "3.8.12"
   }
  },
  "nbformat": 4,
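The dropped `os.mkdir(...)` call is the kind of line that raises `FileExistsError` on a second notebook run. If directory creation were still needed, a common idempotent pattern is `os.makedirs` with `exist_ok=True` (a general Python pattern, not necessarily what the reworked notebook relies on):

```python
import os
import tempfile

# Work in a throwaway directory for the demonstration.
base = tempfile.mkdtemp()
processed = os.path.join(base, "processed")

# Unlike a bare os.mkdir, makedirs with exist_ok=True can be re-run
# safely: the second call is a no-op instead of raising FileExistsError.
os.makedirs(processed, exist_ok=True)
os.makedirs(processed, exist_ok=True)
```

This makes notebook cells re-executable in any order, which matters more in interactive workflows than in one-shot scripts.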
