Sync develop changes June 6 - July 2 to hdf5_1_14 (#4623)

* Fix typos in context/property documentation (#4550)

* Fix CI markdown link check http 500 errors (#4556)

Sites like GitLab can have internal problems that return HTTP 500
errors until those problems are fixed. Some sites also return
HTTP 200 OK, which is fine.

This PR adds a config file for the markdown link check so those
responses are treated as passing and don't break the CI.

* Simplify property copying between lists internally (#4551)

* Add Python examples (#4546)

These examples are referred to from the replacement page of https://portal.hdfgroup.org/display/HDF5/Other+Examples.

* Correct property cb signatures in docs (#4554)

* Correct property cb signatures in docs (the two callback forms are sketched below)
* Correct delete callback type name in docs
* Add missing word to H5P__free_prop doc
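
For context, here is a minimal sketch of the two public callback forms those docs
describe, paraphrased from H5Ppublic.h (verify against the header in your HDF5
version). The property name, value, and callback bodies are illustrative only:

```c
#include "hdf5.h"
#include <stdio.h>

/* H5P_prp_cb2_t form (set/get/delete callbacks): receives the
 * property list id in addition to the name/size/value. */
static herr_t
demo_prop_delete(hid_t prop_id, const char *name, size_t size, void *value)
{
    (void)prop_id;
    (void)size;
    (void)value;
    printf("property '%s' is being removed from a list\n", name);
    return 0;
}

/* H5P_prp_cb1_t form (create/copy/close callbacks): no property
 * list id, just the name/size/value. */
static herr_t
demo_prop_close(const char *name, size_t size, void *value)
{
    (void)name;
    (void)size;
    (void)value;
    return 0;
}

int
main(void)
{
    int   val  = 1;
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

    /* Add a temporary property to this one list; the callbacks above
     * match the signatures H5Pinsert2() expects for delete and close. */
    H5Pinsert2(fapl, "demo_prop", sizeof(int), &val,
               NULL, NULL, demo_prop_delete, NULL, NULL, demo_prop_close);

    H5Premove(fapl, "demo_prop"); /* invokes the delete callback */
    H5Pclose(fapl);
    return 0;
}
```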

* Move C++ and Fortran and examples to HDF5Examples folder (#4552)

* Document 'return-and-read' field in API context (#4560)

* Add compression includes to tests needing zlib support (#4561)

* Allow usage of page buffering for serial file access from parallel HDF5 builds (#4568)
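
As a minimal sketch of what this enables (file name and sizes are illustrative),
a serial file open can now use a page buffer even when the library was built with
parallel (MPI) support. H5Pset_page_buffer_size() is the existing public API, and
the file must have been created with the paged file-space strategy for the page
buffer to take effect:

```c
#include "hdf5.h"

int
main(void)
{
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

    /* 1 MiB page buffer; 0/0 means no minimum metadata/raw-data share */
    if (H5Pset_page_buffer_size(fapl, 1024 * 1024, 0, 0) < 0)
        return 1;

    hid_t file = H5Fopen("example.h5", H5F_ACC_RDONLY, fapl);
    if (file >= 0)
        H5Fclose(file);

    H5Pclose(fapl);
    return 0;
}
```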

* Remove old version of libaec (#4567)

* Add property names to context field docs (#4563)

* Document property shared name behavior (#4565)

* Clarify H5CX macro documentation (#4569)

* Document H5Punregister modifying default properties (#4570)
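
A short, hypothetical sketch of the area being documented: registering a property
on a new class and then unregistering it with H5Punregister(); #4570 covers how
unregistering interacts with the class's default properties. The class name,
property name, and default value here are made up:

```c
#include "hdf5.h"

int
main(void)
{
    int   def_val = 42;
    hid_t cls = H5Pcreate_class(H5P_ROOT, "demo_class",
                                NULL, NULL, NULL, NULL, NULL, NULL);

    /* Register a property on the class (no callbacks) */
    H5Pregister2(cls, "demo_prop", sizeof(int), &def_val,
                 NULL, NULL, NULL, NULL, NULL, NULL, NULL);

    /* ... property lists created from 'cls' now carry "demo_prop" ... */

    /* Unregister it again; see #4570 for the default-property behavior */
    H5Punregister(cls, "demo_prop");

    H5Pclose_class(cls);
    return 0;
}
```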

* Update NVHPC to 24.5 (#4171)

We don't test parallel in other GitHub actions, so this also converts the
NVHPC check to configure and build only while we discuss how we'll
test parallel HDF5 in GitHub.

There is a blocking GitHub issue to address the test failures for
HDF5 1.14.5 (#4571).

* Clean up comments in H5FDros3.c (#4572)

* Rename INSTALL_Auto.txt to INSTALL_Autotools.txt (#4575)

* Clean up ros3 VFD stats code (#4579)

* Removes printf debugging
* Simplifies and centralizes stats code
* Use #ifdef ROS3_STATS instead of #if (see the short sketch after this list)
* Other misc tidying
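
That #ifdef/#if distinction in stand-alone form (the macro name ROS3_STATS is
real; everything else here is a stand-in, not the library's actual stats code):

```c
#include <stdio.h>

int
main(void)
{
#ifdef ROS3_STATS
    /* Compiled in only when ROS3_STATS is defined, e.g. "#define ROS3_STATS"
     * or -DROS3_STATS. "#ifdef" asks only whether the macro is defined;
     * "#if ROS3_STATS" would instead evaluate its value, which fails for a
     * valueless "#define ROS3_STATS" and silently becomes 0 when the macro
     * is undefined. */
    printf("ros3 VFD stats collection enabled\n");
#endif
    return 0;
}
```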

* Turn off ros3 VFD stat collection by default (#4581)

Not a new change - an artifact from a previous check-in.

* Pause recording errors instead of clearing the error stack (#4475)

An internal capability that's similar to the H5E_BEGIN_TRY / H5E_END_TRY
macros in H5Epublic.h, but more efficient since we can avoid pushing errors on
the stack entirely (and those macros use public API routines).

This capability (and other techniques) can be used to remove use of
H5E_clear_stack() and H5E_BEGIN_TRY / H5E_END_TRY within library routines.

We want to remove H5E_clear_stack() because it can trigger calls to the H5I
interface from within the H5E code, which creates a great deal of complexity
for threadsafe code. We also want to remove H5E_BEGIN_TRY / H5E_END_TRY
because those macros make public API calls from within the library code.

This change also includes some minor tidying of routines related to removing
H5E_clear_stack() and H5E_BEGIN_TRY / H5E_END_TRY from H5Fint.c.
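
For reference, here is an application-level sketch of the H5E_BEGIN_TRY /
H5E_END_TRY macros mentioned above (the file name is illustrative); they silence
the default error handler around a call that is expected to fail. The new internal
pause mechanism is not public API, so this only shows the pattern being replaced
inside the library:

```c
#include "hdf5.h"

int
main(void)
{
    hid_t file = H5I_INVALID_HID;

    /* Probe a file that may not be HDF5 without printing an error stack */
    H5E_BEGIN_TRY
    {
        file = H5Fopen("maybe_not_hdf5.dat", H5F_ACC_RDONLY, H5P_DEFAULT);
    }
    H5E_END_TRY;

    if (file < 0)
        return 1; /* the open failed, but nothing was reported */

    H5Fclose(file);
    return 0;
}
```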

* Add page buffer cache command line option to tools (#4562)


Co-authored-by: github-actions <41898282+github-actions[bot]@users.noreply.github.com>

* Clarify documentation for H5CX_get_data_transform (#4580)

* Correct comment for H5CX_get_data_transform

* Document why data transform ctx field doesn't use macro

* Remove public API call from ros3 VFD (#4583)

* Remove printf debugging from H5FDs3comms.c (#4584)

* Cleanup of ros3 test (#4587)

* Removed JS* macro scheme (replaced w/ h5test.h macros)
* Moved curl setup/teardown to main()
* A lot of cleanup and simplification

* Removed unused code from H5FDs3comms.c (#4588)

* H5FD_s3comms_nlowercase()
* H5FD_s3comms_trim()
* H5FD_s3comms_uriencode()

* Remove magic fields from s3comms structs (#4589)

* Remove dead H5FD_s3comms_percent_encode_char() (#4591)

* Rework the TestExpress usage and refactor dead code (#4590)

* Skip examples if running sanitizers (#4592)

* Clean up s3comms test code (#4594)

* Remove JS* macros
* Remove dead code
* Bring in line with other test code

* Add publish to bucket workflow (#4566)

* Update abi report CI workflow for last release (#4596)

* Update abi report workflow to handle 1.14.4.3 release

* Update name of java report

* Document that ctx VOL property isn't drawn from the FAPL (#4597)

* Update macos workflow to 14 (keep 13 as alternate) (#4603)

* Removed unnecessary call to H5E_clear_stack (#4607)

H5FO_opened and H5SL_search don't push errors on the stack

* Bring subfiling VFD code closer to typical library code (#4595)

Remove API calls, use FUNC_ENTER/LEAVE macros, use the library's error macros,
rename functions to have more standardized names, etc.

* Correct documentation for return-and-read fields (#4598)

* These two generators create strings without NUL for testing (#4608)

* Fix Fortran pkg-config to indicate the full path of modules (#4593)

* Updated release schedule (#4615)

1.16 and 2.0 information

* Document VOL object wrapping context (#4611)

* Earray.c and farray.c in hdf5_1_14 still need time_t curr_time for HDsrandom.

* Remove line to use future 116_API from CMakeLists.txt files in HDF5 examples directories

lrknox authored Jul 3, 2024
1 parent ad50867 commit 95fa8f1
Showing 385 changed files with 8,548 additions and 11,532 deletions.
15 changes: 9 additions & 6 deletions .github/workflows/abi-report.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 Check Application Binary Interface (ABI)

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
@@ -40,8 +41,10 @@ jobs:
- name: Convert hdf5 reference name (Linux)
id: convert-hdf5lib-refname
run: |
FILE_DOTS=$(echo "${{ inputs.file_ref }}" | sed -r "s/([0-9]+)\_([0-9]+)\_([0-9]+).*/\1\.\2\.\3/")
FILE_DOTS=$(echo "${{ inputs.file_ref }}" | sed -r "s/([0-9]+)\.([0-9]+)\.([0-9]+)\.([0-9]+).*/\1\.\2\.\3-\4/")
echo "HDF5R_DOTS=$FILE_DOTS" >> $GITHUB_OUTPUT
FILE_DOTSMAIN=$(echo "${{ inputs.file_ref }}" | sed -r "s/([0-9]+)\.([0-9]+)\.([0-9]+).*/\1\.\2\.\3/")
echo "HDF5R_DOTSMAIN=$FILE_DOTSMAIN" >> $GITHUB_OUTPUT
- uses: actions/[email protected]

@@ -81,8 +84,8 @@ jobs:
run: |
mkdir "${{ github.workspace }}/hdf5R"
cd "${{ github.workspace }}/hdf5R"
wget -q https://github.com/HDFGroup/hdf5/releases/download/hdf5-${{ inputs.file_ref }}/hdf5-${{ inputs.file_ref }}-ubuntu-2204.tar.gz
tar zxf hdf5-${{ inputs.file_ref }}-ubuntu-2204.tar.gz
wget -q https://github.com/HDFGroup/hdf5/releases/download/hdf5_${{ inputs.file_ref }}/hdf5-${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTS }}-ubuntu-2204_gcc.tar.gz
tar zxf hdf5-${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTS }}-ubuntu-2204_gcc.tar.gz
- name: List files for the space (Linux)
run: |
@@ -91,7 +94,7 @@
- name: Uncompress hdf5 reference binary (Linux)
run: |
cd "${{ github.workspace }}/hdf5R"
tar -zxvf ${{ github.workspace }}/hdf5R/hdf5/HDF5-${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTS }}-Linux.tar.gz --strip-components 1
tar -zxvf ${{ github.workspace }}/hdf5R/hdf5/HDF5-${{ inputs.file_ref }}-Linux.tar.gz --strip-components 1
- name: List files for the HDFR space (Linux)
run: |
@@ -113,7 +116,7 @@
- name: Run Java API report
run: |
japi-compliance-checker ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib/jarhdf5-${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTS }}.jar ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib/jarhdf5-${{ steps.set-hdf5lib-name.outputs.HDF5_VERS }}.jar
japi-compliance-checker ${{ steps.set-hdf5lib-refname.outputs.HDF5R_ROOT }}/lib/jarhdf5-${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTSMAIN }}.jar ${{ steps.set-hdf5lib-name.outputs.HDF5_ROOT }}/lib/jarhdf5-${{ steps.set-hdf5lib-name.outputs.HDF5_VERS }}.jar
- name: Run ABI report
run: |
@@ -145,7 +148,7 @@

- name: Copy ABI reports
run: |
cp compat_reports/jarhdf5-/${{ steps.set-hdf5lib-refname.outputs.HDF5R_VERS }}_to_${{ steps.set-hdf5lib-name.outputs.HDF5_VERS }}/compat_report.html ${{ inputs.file_base }}-java_compat_report.html
cp compat_reports/jarhdf5-/${{ steps.convert-hdf5lib-refname.outputs.HDF5R_DOTSMAIN }}_to_${{ steps.set-hdf5lib-name.outputs.HDF5_VERS }}/compat_report.html ${{ inputs.file_base }}-java_compat_report.html
ls -l compat_reports/${{ inputs.file_base }}/X_to_Y
cp compat_reports/${{ inputs.file_base }}/X_to_Y/compat_report.html ${{ inputs.file_base }}-hdf5_compat_report.html
ls -l compat_reports/${{ inputs.file_base }}_hl/X_to_Y
1 change: 1 addition & 0 deletions .github/workflows/aocc-auto.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 PAR autotools aocc ompi

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
1 change: 1 addition & 0 deletions .github/workflows/aocc-cmake.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 PAR CMake aocc ompi

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
5 changes: 1 addition & 4 deletions .github/workflows/autotools.yml
@@ -1,6 +1,6 @@
name: hdf5 1.14 autools CI

# Controls when the action will run. Triggers the workflow on push or pull request
# Triggers the workflow on push or pull request or on demand
on:
workflow_dispatch:
push:
@@ -23,9 +23,6 @@ concurrency:
permissions:
contents: read

# A workflow run is made up of one or more jobs that can run sequentially or
# in parallel. We just have one job, but the matrix items defined below will
# run in parallel.
jobs:
call-workflow-special-autotools:
name: "Autotools Special Workflows"
18 changes: 8 additions & 10 deletions .github/workflows/cmake-bintest.yml
@@ -1,6 +1,6 @@
name: hdf5 1.14 examples bintest runs

# Controls when the action will run. Triggers the workflow on a schedule
# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
@@ -12,8 +12,6 @@ on:
permissions:
contents: read

# A workflow run is made up of one or more jobs that can run sequentially or
# in parallel
jobs:
test_binary_win:
# Windows w/ MSVC + CMake
@@ -149,7 +147,7 @@ jobs:
# MacOS w/ Clang + CMake
#
name: "MacOS Clang Binary Test"
runs-on: macos-13
runs-on: macos-latest
steps:
- name: Install Dependencies (MacOS)
run: brew install ninja doxygen
@@ -189,12 +187,12 @@ jobs:
ls ${{ runner.workspace }}
# symlinks the compiler executables to a common location
# - name: Setup GNU Fortran
# uses: fortran-lang/setup-fortran@v1
# id: setup-fortran
# with:
# compiler: gcc
# version: 12
- name: Setup GNU Fortran
uses: fortran-lang/setup-fortran@v1
id: setup-fortran
with:
compiler: gcc
version: 12

- name: Run ctest (MacOS)
id: run-ctest
4 changes: 1 addition & 3 deletions .github/workflows/cmake-ctest.yml
@@ -1,6 +1,6 @@
name: hdf5 1.14 ctest runs

# Controls when the action will run. Triggers the workflow on a call
# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
@@ -26,8 +26,6 @@ on:
permissions:
contents: read

# A workflow run is made up of one or more jobs that can run sequentially or
# in parallel
jobs:
build_and_test_win:
# Windows w/ MSVC + CMake
5 changes: 1 addition & 4 deletions .github/workflows/cmake.yml
@@ -1,6 +1,6 @@
name: hdf5 1.14 cmake CI

# Controls when the action will run. Triggers the workflow on push or pull request
# Triggers the workflow on push or pull request or on demand
on:
workflow_dispatch:
push:
@@ -23,9 +23,6 @@ concurrency:
permissions:
contents: read

# A workflow run is made up of one or more jobs that can run sequentially or
# in parallel. We just have one job, but the matrix items defined below will
# run in parallel.
jobs:
call-workflow-special-cmake:
name: "CMake Special Workflows"
1 change: 1 addition & 0 deletions .github/workflows/cve.yml
@@ -1,5 +1,6 @@
name: cve 1.14

# Triggers the workflow on push or pull request or on demand
on:
workflow_dispatch:
push:
1 change: 1 addition & 0 deletions .github/workflows/cygwin-auto.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 autotools cygwin

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
1 change: 1 addition & 0 deletions .github/workflows/cygwin-cmake.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 CMake cygwin

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
6 changes: 2 additions & 4 deletions .github/workflows/daily-build.yml
@@ -1,6 +1,6 @@
name: hdf5 1.14 daily build

# Controls when the action will run. Triggers the workflow on a schedule
# Triggers the workflow on a schedule or on demand
on:
workflow_dispatch:
schedule:
@@ -9,8 +9,6 @@ on:
permissions:
contents: read

# A workflow run is made up of one or more jobs that can run sequentially or
# in parallel.
jobs:
get-old-names:
runs-on: ubuntu-latest
@@ -52,7 +50,7 @@ jobs:
needs: [call-workflow-tarball, call-workflow-ctest]
uses: ./.github/workflows/abi-report.yml
with:
file_ref: '1_14_3'
file_ref: '1.14.4.3'
file_base: ${{ needs.call-workflow-tarball.outputs.file_base }}
use_tag: snapshot-1.14
use_environ: snapshots
1 change: 1 addition & 0 deletions .github/workflows/hdfeos5.yml
@@ -1,5 +1,6 @@
name: hdfeos5 1.14

# Triggers the workflow on push or pull request or on demand
on:
workflow_dispatch:
push:
1 change: 1 addition & 0 deletions .github/workflows/intel-auto.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 autotools icx CI

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
1 change: 1 addition & 0 deletions .github/workflows/intel-cmake.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 CMake icx CI

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
5 changes: 1 addition & 4 deletions .github/workflows/main-auto.yml
@@ -1,6 +1,6 @@
name: hdf5 1.14 autotools CI

# Controls when the action will run. Triggers the workflow on a call
# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
@@ -16,9 +16,6 @@ on:
permissions:
contents: read

# A workflow run is made up of one or more jobs that can run sequentially or
# in parallel. We just have one job, but the matrix items defined below will
# run in parallel.
jobs:

# A workflow that builds the library and runs all the tests
45 changes: 36 additions & 9 deletions .github/workflows/main-cmake.yml
@@ -1,6 +1,6 @@
name: hdf5 1.14 CMake CI

# Controls when the action will run. Triggers the workflow on a call
# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
@@ -17,9 +17,6 @@ on:
permissions:
contents: read

# A workflow run is made up of one or more jobs that can run sequentially or
# in parallel. We just have one job, but the matrix items defined below will
# run in parallel.
jobs:

# A workflow that builds the library and runs all the tests
@@ -36,6 +33,7 @@ jobs:
name:
- "Windows MSVC"
- "Ubuntu gcc"
- "MacOS-13 Clang"
- "MacOS Clang"

# This is where we list the bulk of the options for each configuration.
@@ -85,15 +83,35 @@ jobs:
generator: "-G Ninja"
run_tests: true

# MacOS w/ Clang + CMake
#
# We could also build with the Autotools via brew installing them,
# but that seems unnecessary
- name: "MacOS-13 Clang"
os: macos-13
cpp: OFF
fortran: ON
java: ON
docs: ON
libaecfc: ON
localaec: OFF
zlibfc: ON
localzlib: OFF
parallel: OFF
mirror_vfd: ON
direct_vfd: OFF
ros3_vfd: OFF
generator: "-G Ninja"
run_tests: true

# MacOS w/ Clang + CMake
#
# We could also build with the Autotools via brew installing them,
# but that seems unnecessary
- name: "MacOS Clang"
os: macos-13
cpp: ON
fortran: OFF
os: macos-latest
cpp: OFF
fortran: ON
java: ON
docs: ON
libaecfc: ON
@@ -137,7 +155,16 @@ jobs:

- name: Install Dependencies (macOS)
run: brew install ninja
if: matrix.os == 'macos-13'
if: ${{ matrix.os == 'macos-13' || matrix.os == 'macos-latest' }}

# symlinks the compiler executables to a common location
- name: Install GNU Fortran (macOS)
uses: fortran-lang/setup-fortran@v1
id: setup-fortran
with:
compiler: gcc
version: 12
if: ${{ matrix.os == 'macos-13' || matrix.os == 'macos-latest' }}

- name: Install Dependencies
uses: ssciwr/doxygen-install@v1
@@ -264,4 +291,4 @@ jobs:
name: tgz-osx-${{ inputs.build_mode }}-binary
path: ${{ runner.workspace }}/build/HDF5-*-Darwin.tar.gz
if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
if: ${{ (matrix.os == 'macos-13') && (inputs.thread_safety != 'TS') }}
if: ${{ (matrix.os == 'macos-latest') && (inputs.thread_safety != 'TS') }}
5 changes: 5 additions & 0 deletions .github/workflows/markdown-link-check.yml
@@ -1,14 +1,19 @@
name: Check Markdown links

# Triggers the workflow on push or pull request or on demand
on:
workflow_dispatch:
push:
pull_request:
branches: [ hdf5_1_14 ]

# The config file handles things like http 500 errors from sites like GitLab
# and http 200 responses
jobs:
markdown-link-check:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@master
- uses: gaurav-nelson/github-action-markdown-link-check@v1
with:
config-file: '.github/workflows/markdown_config.json'
3 changes: 3 additions & 0 deletions .github/workflows/markdown_config.json
@@ -0,0 +1,3 @@
{
"aliveStatusCodes": [200, 500]
}
1 change: 1 addition & 0 deletions .github/workflows/msys2-auto.yml
@@ -1,5 +1,6 @@
name: hdf5 1.14 Autotools MSys2

# Triggers the workflow on a call from another workflow
on:
workflow_call:
inputs:
1 change: 1 addition & 0 deletions .github/workflows/netcdf.yml
@@ -1,5 +1,6 @@
name: netCDF 1.14

# Triggers the workflow on push or pull request or on demand
on:
workflow_dispatch:
push: