
Commit

Merge branch 'dev' into eschnett/regioncalculus
# Conflicts:
#	.github/workflows/unix.yml
eschnett committed Jul 8, 2021
2 parents 15399b1 + 239cf68 commit c436db6
Showing 34 changed files with 1,086 additions and 186 deletions.
8 changes: 4 additions & 4 deletions .github/ISSUE_TEMPLATE/bug_report.md
Original file line number Diff line number Diff line change
@@ -38,14 +38,14 @@ A clear and concise description of what you expected to happen.

**Software Environment**
- version of openPMD-api: [X.Y.Z-abc]
- installed openPMD-api via: [conda-forge, spack, pip, from source, module system, ...]
- installed openPMD-api via: [conda-forge, spack, pip, brew, yggdrasil, from source, module system, ...]
- operating system: [name and version]
- machine: [Are you running on a public cluster? It's likely we compute on it as well!]
- name and version of Python implementation: [e.g. CPython 3.8]
- name and version of Python implementation: [e.g. CPython 3.9]
- version of HDF5: [e.g. 1.12.0]
- version of ADIOS1: [e.g. 1.13.1]
- version of ADIOS2: [e.g. 2.6.0]
- name and version of MPI: [e.g. OpenMPI 3.1.5]
- version of ADIOS2: [e.g. 2.7.1]
- name and version of MPI: [e.g. OpenMPI 4.1.1]

**Additional context**
Add any other context about the problem here.
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/install_problem.md
@@ -25,4 +25,4 @@ You can also add images such as screenshots :-)
- name and version of Python implementation: [e.g. `python3 --version`]

**Additional context**
Any addition info that might help us, e.g. did you try to use a certain language binding (C++/Python?) or did you try to use a specific variant (MPI-parallel?) or backend (HDF5? ADIOS1?)?
Any additional info that might help us, e.g. did you try to use a certain language binding (C++/Python?) or did you try to use a specific variant (MPI-parallel?) or backend (HDF5? ADIOS2?)?
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/question.md
@@ -27,4 +27,4 @@ Have you already installed openPMD-api?
If so, please tell us which *version* of openPMD-api your question is about:

- version of openPMD-api: [X.Y.Z-abc]
- installed openPMD-api via: [conda-forge, spack, pip, from source, module system, ...]
- installed openPMD-api via: [conda-forge, spack, pip, brew, yggdrasil, from source, module system, ...]
63 changes: 63 additions & 0 deletions .github/ci/spack-envs/clang8_py38_mpich_h5_ad1_ad2/spack.yaml
@@ -0,0 +1,63 @@
# This is a Spack environment file.
#
# Activating and installing this environment will provide all dependencies
# that are needed for full-feature development.
# https://spack.readthedocs.io/en/latest/environments.html#anonymous-environments
#
spack:
  specs:
  - adios
  - adios2
  - hdf5
  - mpich

  packages:
    adios:
      variants: ~zfp ~sz ~lz4 ~blosc
    adios2:
      variants: ~zfp ~sz ~png ~dataman ~python ~fortran ~ssc ~shared ~bzip2
    cmake:
      externals:
      - spec: "cmake"
        prefix: /usr
      buildable: False
    mpich:
      externals:
      - spec: "mpich"
        prefix: /usr
      buildable: False
    perl:
      externals:
      - spec: "perl"
        prefix: /usr
      buildable: False
    python:
      externals:
      - spec: "python"
        prefix: /usr
      buildable: False
    all:
      target: ['x86_64']
      variants: ~fortran
      providers:
        mpi: [mpich]
      compiler: [[email protected]]

  compilers:
  - compiler:
      environment: {}
      extra_rpaths: []
      flags: {}
      modules: []
      operating_system: ubuntu20.04
      paths:
        cc: /usr/lib/llvm-8/bin/clang
        cxx: /usr/lib/llvm-8/bin/clang++
        f77: /usr/bin/gfortran
        fc: /usr/bin/gfortran
      spec: [email protected]
      target: x86_64

  config:
    build_jobs: 2

2 changes: 1 addition & 1 deletion .github/workflows/source.yml
@@ -31,7 +31,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- uses: s-weigand/[email protected].5
- uses: s-weigand/[email protected].7
with:
update-conda: true
conda-channels: conda-forge
30 changes: 26 additions & 4 deletions .github/workflows/unix.yml
@@ -137,8 +137,8 @@ jobs:
sudo apt-get install clang-5.0 gfortran libopenmpi-dev python3
sudo .github/workflows/dependencies/install_spack
- name: Build
env: {CC: clang-5.0, CXX: clang++-5.0, CXXFLAGS: -Werror -Wno-deprecated-declarations}
# Clang 5 does not handle GCC 10's STL in C++17-mode
env: {CC: clang-5.0, CXX: clang++-5.0, CXXFLAGS: -Werror -Wno-deprecated-declarations, OPENPMD2_ADIOS2_SCHEMA: 20210209}
run: |
eval $(spack env activate --sh .github/ci/spack-envs/clang5_nopy_ompi_h5_ad1_ad2_bp3/)
spack install
@@ -147,7 +147,6 @@
../share/openPMD/download_samples.sh && chmod u-w samples/git-sample/*.h5
cmake -S .. -B . -DCMAKE_BUILD_TYPE=RelWithDebInfo -DopenPMD_USE_PYTHON=OFF -DopenPMD_USE_MPI=ON -DopenPMD_USE_HDF5=ON -DopenPMD_USE_ADIOS1=OFF -DopenPMD_USE_ADIOS2=ON -DopenPMD_USE_REGIONS=OFF -DopenPMD_USE_INVASIVE_TESTS=ON
cmake --build . --parallel 2
export OPENPMD2_ADIOS2_SCHEMA=20210209
ctest --output-on-failure
# TODO
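The hunk above moves the ADIOS2 schema selection from an `export` line inside the run script into the job-level `env:` map, so every step of the job sees it. As a sketch of the same idea in a Python test driver (only the variable name and value are taken from the diff; the rest is illustrative):

```python
# Select the experimental new ADIOS2 schema for everything launched from
# this process, instead of exporting it right before a single command.
import os

os.environ["OPENPMD2_ADIOS2_SCHEMA"] = "20210209"  # value from the diff

# Any child process spawned from here (e.g. ctest) now inherits the setting.
print(os.environ["OPENPMD2_ADIOS2_SCHEMA"])
```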
@@ -175,10 +174,33 @@
cmake --build . --parallel 2
ctest --output-on-failure
clang8_py38_mpich_h5_ad1_ad2_newLayout:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- name: Spack Cache
uses: actions/cache@v2
with: {path: /opt/spack, key: clang8_py38_mpich_h5_ad1_ad2_newLayout }
- name: Install
run: |
sudo apt-get update
sudo apt-get install clang-8 gfortran libmpich-dev python3
sudo .github/workflows/dependencies/install_spack
- name: Build
env: {CC: clang-8, CXX: clang++-8, CXXFLAGS: -Werror -Wno-deprecated-declarations, OPENPMD2_ADIOS2_SCHEMA: 20210209}
run: |
eval $(spack env activate --sh .github/ci/spack-envs/clang8_py38_mpich_h5_ad1_ad2/)
spack install
mkdir build && cd build
../share/openPMD/download_samples.sh && chmod u-w samples/git-sample/*.h5
cmake -S .. -B . -DopenPMD_USE_PYTHON=OFF -DopenPMD_USE_MPI=ON -DopenPMD_USE_HDF5=ON -DopenPMD_USE_ADIOS1=ON -DopenPMD_USE_ADIOS2=ON -DopenPMD_USE_INVASIVE_TESTS=ON
cmake --build . --parallel 2
ctest --output-on-failure
# TODO
# clang7_py37_ompi_h5_ad2_asan_ubsan_static
# clang8_py38_mpich_h5_ad1_ad2_h5coll
# clang10_py38_ompi_h5_1-10-6_ad1_ad2_release
# ..._h5coll with OPENPMD_HDF5_INDEPENDENT: OFF

# appleclang9_py37_nompi_h5_ad1
# appleclang10_py37_h5_ad2_libcpp
5 changes: 4 additions & 1 deletion CHANGELOG.rst
@@ -27,7 +27,10 @@ Other

- ADIOS2: require version 2.7.0+ #927
- pybind11: require version 2.6.2+ #977
- CMake: Expose Python LTO Control #980
- CMake:

- Expose Python LTO Control #980
- Require only C-in-CXX MPI component #710


0.13.4
90 changes: 63 additions & 27 deletions CMakeLists.txt
@@ -128,20 +128,40 @@ option(openPMD_BUILD_EXAMPLES "Build the examples" ${BUILD_EXAMPLES})
# Dependencies ################################################################
#
# external library: MPI (optional)
# Implementation quirks for BullMPI, Clang+MPI and Brew's MPICH
# definitely w/o MPI::MPI_C:
# brew's MPICH with C-flag work-arounds - errors AppleClang for CXX targets
# https://github.com/Homebrew/homebrew-core/issues/80465
# https://lists.mpich.org/pipermail/discuss/2020-January/005863.html
# sometimes needed MPI::MPI_C in the past:
# Clang+MPI: Potentially needed MPI::MPI_C targets in the past
# (exact MPI flavor & Clang version lost)
# BullMPI: PUBLIC dependency to MPI::MPI_CXX is missing in MPI::MPI_C target
set(openPMD_MPI_LINK_C_DEFAULT OFF)
option(openPMD_MPI_LINK_C "Also link the MPI C targets" ${openPMD_MPI_LINK_C_DEFAULT})
mark_as_advanced(openPMD_MPI_LINK_C)
set(openPMD_MPI_NEED_COMPONENTS CXX)
set(openPMD_MPI_TARGETS MPI::MPI_CXX)
if(openPMD_MPI_LINK_C)
set(openPMD_MPI_NEED_COMPONENTS C ${openPMD_MPI_NEED_COMPONENTS})
set(openPMD_MPI_TARGETS MPI::MPI_C ${openPMD_MPI_TARGETS})
endif()

if(openPMD_USE_MPI STREQUAL AUTO)
find_package(MPI)
find_package(MPI COMPONENTS ${openPMD_MPI_NEED_COMPONENTS})
if(MPI_FOUND)
set(openPMD_HAVE_MPI TRUE)
else()
set(openPMD_HAVE_MPI FALSE)
endif()
elseif(openPMD_USE_MPI)
find_package(MPI REQUIRED)
find_package(MPI REQUIRED COMPONENTS ${openPMD_MPI_NEED_COMPONENTS})
set(openPMD_HAVE_MPI TRUE)
else()
set(openPMD_HAVE_MPI FALSE)
endif()


# external library: nlohmann-json (required)
if(openPMD_USE_INTERNAL_JSON)
set(JSON_BuildTests OFF CACHE INTERNAL "")
@@ -158,46 +178,64 @@ target_link_libraries(openPMD::thirdparty::nlohmann_json


# external library: HDF5 (optional)
# note: in the new hdf5-cmake.config files, major releases like
# 1.8, 1.10 and 1.12 are not marked compatible versions
# We could use CMake 3.19.0+ version ranges, but:
# - this issues a Wdev warning with FindHDF5.cmake
# - does not work at least with HDF5 1.10:
# Could not find a configuration file for package "HDF5" that is compatible
# with requested version range "1.8.13...1.12".
# The following configuration files were considered but not accepted:
# ../share/cmake/hdf5/hdf5-config.cmake, version: 1.10.7
# - thus, we do our own HDF5_VERSION check...
if(openPMD_USE_HDF5 STREQUAL AUTO)
set(HDF5_PREFER_PARALLEL ${openPMD_HAVE_MPI})
find_package(HDF5 1.8.13 COMPONENTS C)
find_package(HDF5 COMPONENTS C)
if(HDF5_FOUND)
set(openPMD_HAVE_HDF5 TRUE)
else()
set(openPMD_HAVE_HDF5 FALSE)
endif()
elseif(openPMD_USE_HDF5)
set(HDF5_PREFER_PARALLEL ${openPMD_HAVE_MPI})
find_package(HDF5 1.8.13 REQUIRED COMPONENTS C)
find_package(HDF5 REQUIRED COMPONENTS C)
set(openPMD_HAVE_HDF5 TRUE)
else()
set(openPMD_HAVE_HDF5 FALSE)
endif()

# HDF5 checks
string(CONCAT openPMD_HDF5_STATUS "")
# version: lower limit
if(openPMD_HAVE_HDF5 AND HDF5_VERSION VERSION_LESS 1.8.13)
string(CONCAT openPMD_HDF5_STATUS
"Found HDF5 version ${HDF5_VERSION} is too old. At least "
"version 1.8.13 is required.\n")
endif()
# we imply support for parallel I/O if MPI variant is ON
if(openPMD_HAVE_MPI AND openPMD_HAVE_HDF5 AND NOT HDF5_IS_PARALLEL)
if(openPMD_HAVE_MPI AND openPMD_HAVE_HDF5
AND NOT HDF5_IS_PARALLEL # FindHDF5.cmake
AND NOT HDF5_ENABLE_PARALLEL # hdf5-config.cmake
)
string(CONCAT openPMD_HDF5_STATUS
"Found MPI but only serial version of HDF5. Either set "
"openPMD_USE_MPI=OFF to disable MPI or set openPMD_USE_HDF5=OFF "
"to disable HDF5 or provide a parallel install of HDF5.\n"
"If you manually installed a parallel version of HDF5 in "
"a non-default path, add its installation prefix to the "
"environment variable CMAKE_PREFIX_PATH to find it: "
"https://cmake.org/cmake/help/latest/envvar/CMAKE_PREFIX_PATH.html")
if(openPMD_USE_HDF5 STREQUAL AUTO)
message(WARNING "${openPMD_HDF5_STATUS}")
set(openPMD_HAVE_HDF5 FALSE)
elseif(openPMD_USE_HDF5)
message(FATAL_ERROR "${openPMD_HDF5_STATUS}")
endif()
"Found MPI but only serial version of HDF5. Either set "
"openPMD_USE_MPI=OFF to disable MPI or set openPMD_USE_HDF5=OFF "
"to disable HDF5 or provide a parallel install of HDF5.\n")
endif()
# HDF5 includes mpi.h in the public header H5public.h if HDF5_IS_PARALLEL
if(openPMD_HAVE_HDF5 AND HDF5_IS_PARALLEL AND NOT openPMD_HAVE_MPI)
# HDF5 includes mpi.h in the public header H5public.h if parallel
if(openPMD_HAVE_HDF5 AND
(HDF5_IS_PARALLEL OR HDF5_ENABLE_PARALLEL)
AND NOT openPMD_HAVE_MPI)
string(CONCAT openPMD_HDF5_STATUS
"Found only parallel version of HDF5 but no MPI. Either set "
"openPMD_USE_MPI=ON to force using MPI or set openPMD_USE_HDF5=OFF "
"to disable HDF5 or provide a serial install of HDF5.\n"
"If you manually installed a serial version of HDF5 in "
"to disable HDF5 or provide a serial install of HDF5.\n")
endif()

if(openPMD_HDF5_STATUS)
string(CONCAT openPMD_HDF5_STATUS
${openPMD_HDF5_STATUS}
"If you manually installed a version of HDF5 in "
"a non-default path, add its installation prefix to the "
"environment variable CMAKE_PREFIX_PATH to find it: "
"https://cmake.org/cmake/help/latest/envvar/CMAKE_PREFIX_PATH.html")
@@ -462,9 +500,7 @@ if(openPMD_BUILD_TESTING)
endif()

if(openPMD_HAVE_MPI)
# MPI targets: CMake 3.9+
# note: often the PUBLIC dependency to CXX is missing in C targets...
target_link_libraries(openPMD PUBLIC MPI::MPI_C MPI::MPI_CXX)
target_link_libraries(openPMD PUBLIC ${openPMD_MPI_TARGETS})
endif()

# JSON Backend and User-Facing Runtime Options
@@ -500,7 +536,7 @@ if(openPMD_HAVE_ADIOS1)
${openPMD_SOURCE_DIR}/include ${openPMD_BINARY_DIR}/include)

if(openPMD_HAVE_MPI)
target_link_libraries(openPMD.ADIOS1.Parallel PUBLIC MPI::MPI_C MPI::MPI_CXX)
target_link_libraries(openPMD.ADIOS1.Parallel PUBLIC ${openPMD_MPI_TARGETS})
target_compile_definitions(openPMD.ADIOS1.Parallel PRIVATE openPMD_HAVE_MPI=1)
else()
target_compile_definitions(openPMD.ADIOS1.Parallel PRIVATE openPMD_HAVE_MPI=0)
@@ -778,7 +814,7 @@ if(openPMD_BUILD_TESTING)
target_link_libraries(CatchRunner PUBLIC openPMD::thirdparty::Catch2)
target_link_libraries(CatchMain PUBLIC openPMD::thirdparty::Catch2)
if(openPMD_HAVE_MPI)
target_link_libraries(CatchRunner PUBLIC MPI::MPI_C MPI::MPI_CXX)
target_link_libraries(CatchRunner PUBLIC ${openPMD_MPI_TARGETS})
target_compile_definitions(CatchRunner PUBLIC openPMD_HAVE_MPI=1)
endif()

5 changes: 5 additions & 0 deletions docs/source/details/backendconfig.rst
@@ -46,6 +46,11 @@ Using the Streaming API (i.e. ``SeriesImpl::readIteration()``) will do this automatically.
Parsing eagerly might be very expensive for a Series with many iterations, but will avoid bugs caused by forgotten calls to ``Iteration::open()``.
In complex environments, calling ``Iteration::open()`` on an already open ``Iteration`` does no harm (and does not incur additional runtime cost for additional ``open()`` calls).

The key ``resizable`` can be passed to ``Dataset`` options.
If set to ``{"resizable": true}``, this declares that the ``Extent`` of a ``Dataset`` may be increased via ``resetDataset()`` at a later time, i.e., after it has first been declared (and potentially written).
For HDF5, resizable Datasets come with a performance penalty.
For JSON and ADIOS2, all datasets are resizable, independent of this option.
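The option described above can be sketched in the Python bindings roughly as follows. Only the JSON key ``{"resizable": true}`` and the names ``Dataset``/``reset_dataset`` come from this document; the series/mesh names and the exact ``Dataset`` constructor signature are illustrative assumptions, so treat this as a sketch rather than a definitive recipe:

```python
# Sketch: declaring a resizable dataset via JSON dataset options, then
# growing its Extent later with reset_dataset().
import json

# the backend-independent option documented above
options = json.dumps({"resizable": True})

try:
    import numpy as np
    import openpmd_api as io  # assumed installed; names per the bindings

    series = io.Series("sample_%T.h5", io.Access.create)
    rc = series.iterations[0].meshes["rho"][io.Mesh_Record_Component.SCALAR]

    # declare a small dataset, marked resizable ...
    rc.reset_dataset(io.Dataset(np.dtype("double"), [10], options))
    # ... and later increase its Extent, which "resizable" permits
    rc.reset_dataset(io.Dataset(np.dtype("double"), [20], options))
except ImportError:
    # openPMD-api not installed: the options string above still shows
    # the configuration this section documents.
    pass
```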

Configuration Structure per Backend
-----------------------------------

3 changes: 3 additions & 0 deletions docs/source/usage/examples.rst
@@ -20,6 +20,7 @@ C++
- `5_write_parallel.cpp <https://github.com/openPMD/openPMD-api/blob/dev/examples/5_write_parallel.cpp>`_: MPI-parallel mesh write
- `6_dump_filebased_series.cpp <https://github.com/openPMD/openPMD-api/blob/dev/examples/6_dump_filebased_series.cpp>`_: detailed reading with a file-based series
- `7_extended_write_serial.cpp <https://github.com/openPMD/openPMD-api/blob/dev/examples/7_extended_write_serial.cpp>`_: particle writing with patches and constant records
- `10_streaming_write.cpp <https://github.com/openPMD/openPMD-api/blob/dev/examples/10_streaming_write.cpp>`_ / `10_streaming_read.cpp <https://github.com/openPMD/openPMD-api/blob/dev/examples/10_streaming_read.cpp>`_: ADIOS2 data streaming
- `12_span_write.cpp <https://github.com/openPMD/openPMD-api/blob/dev/examples/12_span_write.cpp>`_: using the span-based API to save memory when writing

Benchmarks
@@ -40,6 +41,8 @@ Python
- `5_write_parallel.py <https://github.com/openPMD/openPMD-api/blob/dev/examples/5_write_parallel.py>`_: MPI-parallel mesh write
- `7_extended_write_serial.py <https://github.com/openPMD/openPMD-api/blob/dev/examples/7_extended_write_serial.py>`_: particle writing with patches and constant records
- `9_particle_write_serial.py <https://github.com/openPMD/openPMD-api/blob/dev/examples/9_particle_write_serial.py>`_: writing particles
- `10_streaming_write.py <https://github.com/openPMD/openPMD-api/blob/dev/examples/10_streaming_write.py>`_ / `10_streaming_read.py <https://github.com/openPMD/openPMD-api/blob/dev/examples/10_streaming_read.py>`_: ADIOS2 data streaming
- `11_particle_dataframe.py <https://github.com/openPMD/openPMD-api/blob/dev/examples/11_particle_dataframe.py>`_: reading data into `Pandas <https://pandas.pydata.org>`_ dataframes or `Dask <https://dask.org>`_ for distributed analysis
- `12_span_write.py <https://github.com/openPMD/openPMD-api/blob/dev/examples/12_span_write.py>`_: using the span-based API to save memory when writing

Unit Tests
