diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index aacfe9cc07f..8810c9282b9 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -122,7 +122,7 @@ Please make sure that you check the items applicable to your pull request:
* [ ] If changes were done to Autotools build, were they added to CMake and vice versa?
* [ ] Is the pull request applicable to any other branches? If yes, which ones? Please document it in the GitHub issue.
* [ ] Is the new code sufficiently documented for future maintenance?
- * [ ] Does the new feature require a change to an existing API? See "API Compatibility Macros" document (https://docs.hdfgroup.org/hdf5/develop/api-compat-macros.html)
+ * [ ] Does the new feature require a change to an existing API? See "API Compatibility Macros" document (https://hdfgroup.github.io/hdf5/develop/api-compat-macros.html)
* Documentation
* [ ] Was the change described in the release_docs/RELEASE.txt file?
* [ ] Was the new function documented in the corresponding public header file using [Doxygen](https://hdfgroup.github.io/hdf5/develop/_r_m_t.html)?
diff --git a/README.md b/README.md
index bc98e308af7..ecf2f64a684 100644
--- a/README.md
+++ b/README.md
@@ -31,15 +31,15 @@ DOCUMENTATION
-------------
This release is fully functional for the API described in the documentation.
- https://docs.hdfgroup.org/hdf5/develop/_l_b_a_p_i.html
+ https://hdfgroup.github.io/hdf5/develop/_l_b_a_p_i.html
Full Documentation and Programming Resources for this release can be found at
- https://docs.hdfgroup.org/hdf5/develop/index.html
+ https://hdfgroup.github.io/hdf5/develop/index.html
The latest doxygen documentation generated on changes to develop is available at:
- https://hdfgroup.github.io/hdf5/
+ https://hdfgroup.github.io/hdf5/develop
See the [RELEASE.txt](release_docs/RELEASE.txt) file in the [release_docs/](release_docs/) directory for information specific
to the features and updates included in this release of the library.
diff --git a/doc/parallel-compression.md b/doc/parallel-compression.md
index a0567bfa546..48ed4c3c37d 100644
--- a/doc/parallel-compression.md
+++ b/doc/parallel-compression.md
@@ -79,7 +79,7 @@ participate in the collective write call.
## Multi-dataset I/O support
The parallel compression feature is supported when using the
-multi-dataset I/O API routines ([H5Dwrite_multi](https://hdfgroup.github.io/hdf5/group___h5_d.html#gaf6213bf3a876c1741810037ff2bb85d8)/[H5Dread_multi](https://hdfgroup.github.io/hdf5/group___h5_d.html#ga8eb1c838aff79a17de385d0707709915)), but the
+multi-dataset I/O API routines ([H5Dwrite_multi](https://hdfgroup.github.io/hdf5/develop/group___h5_d.html#gaf6213bf3a876c1741810037ff2bb85d8)/[H5Dread_multi](https://hdfgroup.github.io/hdf5/develop/group___h5_d.html#ga8eb1c838aff79a17de385d0707709915)), but the
following should be kept in mind:
- Parallel writes to filtered datasets **must** still be collective,
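The collective multi-dataset write described above can be sketched as follows. This is a minimal illustration, not part of the patch: the dataset handles, dataspaces, and buffers (`dset_a`, `dset_b`, `mem_space`, `file_space`, `buf_a`, `buf_b`) are assumed to exist, both datasets are assumed to hold native ints, and error checking is omitted.

```c
#include <hdf5.h>

/* Sketch: one collective call writing to two filtered datasets at once.
 * Assumes both datasets were created with a compression filter and that
 * the memory/file selections match the buffers. */
void write_two_filtered_datasets(hid_t dset_a, hid_t dset_b,
                                 hid_t mem_space, hid_t file_space,
                                 const void *buf_a, const void *buf_b)
{
    hid_t       dset_ids[2]  = { dset_a, dset_b };
    hid_t       mtype_ids[2] = { H5T_NATIVE_INT, H5T_NATIVE_INT };
    hid_t       mspc_ids[2]  = { mem_space, mem_space };
    hid_t       fspc_ids[2]  = { file_space, file_space };
    const void *bufs[2]      = { buf_a, buf_b };

    hid_t dxpl_id = H5Pcreate(H5P_DATASET_XFER);
    /* Writes to filtered datasets must still be collective. */
    H5Pset_dxpl_mpio(dxpl_id, H5FD_MPIO_COLLECTIVE);

    H5Dwrite_multi(2, dset_ids, mtype_ids, mspc_ids, fspc_ids,
                   dxpl_id, bufs);
    H5Pclose(dxpl_id);
}
```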
@@ -99,7 +99,7 @@ following should be kept in mind:
## Incremental file space allocation support
-HDF5's [file space allocation time](https://hdfgroup.github.io/hdf5/group___d_c_p_l.html#ga85faefca58387bba409b65c470d7d851)
+HDF5's [file space allocation time](https://hdfgroup.github.io/hdf5/develop/group___d_c_p_l.html#ga85faefca58387bba409b65c470d7d851)
is a dataset creation property that can have significant effects
on application performance, especially if the application uses
parallel HDF5. In a serial HDF5 application, the default file space
@@ -118,7 +118,7 @@ While this strategy has worked in the past, it has some noticeable
drawbacks. For one, the larger the chunked dataset being created,
the more noticeable overhead there will be during dataset creation
as all of the data chunks are being allocated in the HDF5 file.
-Further, these data chunks will, by default, be [filled](https://hdfgroup.github.io/hdf5/group___d_c_p_l.html#ga4335bb45b35386daa837b4ff1b9cd4a4)
+Further, these data chunks will, by default, be [filled](https://hdfgroup.github.io/hdf5/develop/group___d_c_p_l.html#ga4335bb45b35386daa837b4ff1b9cd4a4)
with HDF5's default fill data value, leading to extraordinary
dataset creation overhead and resulting in pre-filling large
portions of a dataset that the application might have been planning
@@ -126,7 +126,7 @@ to overwrite anyway. Even worse, there will be more initial overhead
from compressing that fill data before writing it out, only to have
it read back in, unfiltered and modified the first time a chunk is
written to. In the past, it was typically suggested that parallel
-HDF5 applications should use [H5Pset_fill_time](https://hdfgroup.github.io/hdf5/group___d_c_p_l.html#ga6bd822266b31f86551a9a1d79601b6a2)
+HDF5 applications should use [H5Pset_fill_time](https://hdfgroup.github.io/hdf5/develop/group___d_c_p_l.html#ga6bd822266b31f86551a9a1d79601b6a2)
with a value of `H5D_FILL_TIME_NEVER` in order to disable writing of
the fill value to dataset chunks, but this isn't ideal if the
application actually wishes to make use of fill values.
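The historical workaround mentioned above can be sketched as a dataset creation property list. This is an illustrative fragment, not part of the patch: the chunk shape and deflate level are made-up example values, and error checking is omitted.

```c
#include <hdf5.h>

/* Sketch: a DCPL for a chunked, compressed dataset that never writes
 * fill values to chunks (the workaround the text describes). */
hid_t make_dcpl_no_fill(void)
{
    hsize_t chunk_dims[2] = { 128, 128 }; /* example chunk shape */
    hid_t   dcpl_id = H5Pcreate(H5P_DATASET_CREATE);

    H5Pset_chunk(dcpl_id, 2, chunk_dims);
    H5Pset_deflate(dcpl_id, 6);
    /* Skip writing fill values to chunks entirely; note this forgoes
     * fill-value semantics for unwritten chunk regions. */
    H5Pset_fill_time(dcpl_id, H5D_FILL_TIME_NEVER);
    return dcpl_id;
}
```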
@@ -220,14 +220,14 @@ chunks to end up at addresses in the file that do not align
well with the underlying file system, possibly leading to
poor performance. As an example, Lustre performance is generally
good when writes are aligned with the chosen stripe size.
-The HDF5 application can use [H5Pset_alignment](https://hdfgroup.github.io/hdf5/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a)
+The HDF5 application can use [H5Pset_alignment](https://hdfgroup.github.io/hdf5/develop/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a)
to have a bit more control over where objects in the HDF5
file end up. However, do note that setting the alignment
of objects generally wastes space in the file and has the
potential to dramatically increase its resulting size, so
caution should be used when choosing the alignment parameters.
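The alignment setting just described can be sketched on a file access property list. This is an illustrative fragment, not part of the patch: the 1 MiB threshold and alignment are hypothetical values standing in for a chosen Lustre stripe size, and error checking is omitted.

```c
#include <hdf5.h>

/* Sketch: align all objects of 1 MiB or larger to 1 MiB boundaries,
 * e.g. to match a hypothetical 1 MiB Lustre stripe size. */
hid_t make_aligned_fapl(void)
{
    hid_t fapl_id = H5Pcreate(H5P_FILE_ACCESS);
    /* threshold = 1 MiB, alignment = 1 MiB */
    H5Pset_alignment(fapl_id, 1048576, 1048576);
    return fapl_id;
}
```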
-[H5Pset_alignment](https://hdfgroup.github.io/hdf5/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a)
+[H5Pset_alignment](https://hdfgroup.github.io/hdf5/develop/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a)
has two parameters that control the alignment of objects in
the HDF5 file, the "threshold" value and the alignment
value. The threshold value specifies that any object greater
@@ -264,19 +264,19 @@ in a file, this can create significant amounts of free space
in the file over its lifetime and eventually cause performance
issues.
-An HDF5 application can use [H5Pset_file_space_strategy](https://hdfgroup.github.io/hdf5/group___f_c_p_l.html#ga167ff65f392ca3b7f1933b1cee1b9f70)
+An HDF5 application can use [H5Pset_file_space_strategy](https://hdfgroup.github.io/hdf5/develop/group___f_c_p_l.html#ga167ff65f392ca3b7f1933b1cee1b9f70)
with a value of `H5F_FSPACE_STRATEGY_PAGE` to enable the paged
aggregation feature, which can accumulate metadata and raw
data for dataset data chunks into well-aligned, configurably
sized "pages" for better performance. However, note that using
the paged aggregation feature will cause any setting from
-[H5Pset_alignment](https://hdfgroup.github.io/hdf5/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a)
+[H5Pset_alignment](https://hdfgroup.github.io/hdf5/develop/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a)
to be ignored. While an application should be able to get
-comparable performance effects by [setting the size of these pages](https://hdfgroup.github.io/hdf5/group___f_c_p_l.html#gad012d7f3c2f1e1999eb1770aae3a4963) to be equal to the value that
-would have been set for [H5Pset_alignment](https://hdfgroup.github.io/hdf5/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a),
+comparable performance effects by [setting the size of these pages](https://hdfgroup.github.io/hdf5/develop/group___f_c_p_l.html#gad012d7f3c2f1e1999eb1770aae3a4963) to be equal to the value that
+would have been set for [H5Pset_alignment](https://hdfgroup.github.io/hdf5/develop/group___f_a_p_l.html#gab99d5af749aeb3896fd9e3ceb273677a),
this may not necessarily be the case and should be studied.
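The paged-aggregation setup discussed above can be sketched on a file creation property list. This is an illustrative fragment, not part of the patch: the 1 MiB page size is a hypothetical value, the `persist`/`threshold` arguments shown are one possible choice, and error checking is omitted.

```c
#include <hdf5.h>

/* Sketch: enable paged aggregation with an example 1 MiB page size,
 * which overrides any H5Pset_alignment setting. */
hid_t make_paged_fcpl(void)
{
    hid_t fcpl_id = H5Pcreate(H5P_FILE_CREATE);
    /* persist = 0 (do not persist free-space metadata),
     * free-space section threshold = 1 byte */
    H5Pset_file_space_strategy(fcpl_id, H5F_FSPACE_STRATEGY_PAGE, 0, 1);
    H5Pset_file_space_page_size(fcpl_id, 1048576);
    return fcpl_id;
}
```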
-Note that [H5Pset_file_space_strategy](https://hdfgroup.github.io/hdf5/group___f_c_p_l.html#ga167ff65f392ca3b7f1933b1cee1b9f70)
+Note that [H5Pset_file_space_strategy](https://hdfgroup.github.io/hdf5/develop/group___f_c_p_l.html#ga167ff65f392ca3b7f1933b1cee1b9f70)
has a `persist` parameter. This determines whether or not the
file free space manager should include extra metadata in the
HDF5 file about free space sections in the file. If this
@@ -300,12 +300,12 @@ hid_t file_id = H5Fcreate("file.h5", H5F_ACC_TRUNC, fcpl_id, fapl_id);
While the parallel compression feature requires that the HDF5
application set and maintain collective I/O at the application
-interface level (via [H5Pset_dxpl_mpio](https://hdfgroup.github.io/hdf5/group___d_x_p_l.html#ga001a22b64f60b815abf5de8b4776f09e)),
+interface level (via [H5Pset_dxpl_mpio](https://hdfgroup.github.io/hdf5/develop/group___d_x_p_l.html#ga001a22b64f60b815abf5de8b4776f09e)),
it does not require that the actual MPI I/O that occurs at
the lowest layers of HDF5 be collective; independent I/O may
perform better depending on the application I/O patterns and
parallel file system performance, among other factors. The
-application may use [H5Pset_dxpl_mpio_collective_opt](https://hdfgroup.github.io/hdf5/group___d_x_p_l.html#gacb30d14d1791ec7ff9ee73aa148a51a3)
+application may use [H5Pset_dxpl_mpio_collective_opt](https://hdfgroup.github.io/hdf5/develop/group___d_x_p_l.html#gacb30d14d1791ec7ff9ee73aa148a51a3)
to control this setting and see which I/O method provides the
best performance.
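The combination described above — collective at the HDF5 interface, independent at the MPI layer — can be sketched on a transfer property list. This is an illustrative fragment, not part of the patch; error checking is omitted.

```c
#include <hdf5.h>

/* Sketch: keep the required collective setting at the HDF5 interface
 * while requesting independent I/O at the MPI layer. */
hid_t make_dxpl_independent_mpio(void)
{
    hid_t dxpl_id = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(dxpl_id, H5FD_MPIO_COLLECTIVE); /* still required */
    H5Pset_dxpl_mpio_collective_opt(dxpl_id, H5FD_MPIO_INDIVIDUAL_IO);
    return dxpl_id;
}
```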
@@ -318,7 +318,7 @@ H5Dwrite(..., dxpl_id, ...);
### Runtime HDF5 Library version
-An HDF5 application can use the [H5Pset_libver_bounds](https://hdfgroup.github.io/hdf5/group___f_a_p_l.html#gacbe1724e7f70cd17ed687417a1d2a910)
+An HDF5 application can use the [H5Pset_libver_bounds](https://hdfgroup.github.io/hdf5/develop/group___f_a_p_l.html#gacbe1724e7f70cd17ed687417a1d2a910)
routine to set the upper and lower bounds on library versions
to use when creating HDF5 objects. For parallel compression
specifically, setting the library version to the latest available
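The library-version bounds setting mentioned above can be sketched on a file access property list. This is an illustrative fragment, not part of the patch; error checking is omitted.

```c
#include <hdf5.h>

/* Sketch: request the latest file-format version for objects created
 * through this FAPL, as the text suggests for parallel compression. */
hid_t make_latest_fapl(void)
{
    hid_t fapl_id = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_libver_bounds(fapl_id, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST);
    return fapl_id;
}
```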
diff --git a/doxygen/aliases b/doxygen/aliases
index 774ecc5038b..e8058605d29 100644
--- a/doxygen/aliases
+++ b/doxygen/aliases
@@ -1,5 +1,18 @@
ALIASES += THG="The HDF Group"
+################################################################################
+# Default URLs (Note that md files do not use any aliases)
+################################################################################
+ALIASES += ARCURL="docs.hdfgroup.org/hdf5"
+ALIASES += RFCURL="docs.hdfgroup.org/hdf5/rfc"
+ALIASES += DSPURL="portal.hdfgroup.org/display/HDF5"
+ALIASES += DOCURL="portal.hdfgroup.org/documentation/hdf5-docs"
+ALIASES += AEXURL="support.hdfgroup.org/ftp/HDF5/examples"
+# doxygen subdir (develop, v1_14)
+ALIASES += DOXURL="hdfgroup.github.io/hdf5/develop"
+# branch name (develop, hdf5_1_14)
+ALIASES += SRCURL="github.com/HDFGroup/hdf5/blob/develop"
+
################################################################################
# Styling
################################################################################
@@ -234,16 +247,16 @@ ALIASES += sa_metadata_ops="\sa \li H5Pget_all_coll_metadata_ops() \li H5Pget_co
# References
################################################################################
-ALIASES += ref_cons_semantics="Enabling a Strict Consistency Semantics Model in Parallel HDF5"
-ALIASES += ref_file_image_ops="HDF5 File Image Operations"
-ALIASES += ref_filter_pipe="Data Flow Pipeline for H5Dread()"
-ALIASES += ref_group_impls="Group implementations in HDF5"
+ALIASES += ref_cons_semantics="Enabling a Strict Consistency Semantics Model in Parallel HDF5"
+ALIASES += ref_file_image_ops="HDF5 File Image Operations"
+ALIASES += ref_filter_pipe="Data Flow Pipeline for H5Dread()"
+ALIASES += ref_group_impls="Group implementations in HDF5"
ALIASES += ref_h5lib_relver="HDF5 Library Release Version Numbers"
-ALIASES += ref_mdc_in_hdf5="Metadata Caching in HDF5"
-ALIASES += ref_mdc_logging="Metadata Cache Logging"
-ALIASES += ref_news_112="New Features in HDF5 Release 1.12"
-ALIASES += ref_h5ocopy="Copying Committed Datatypes with H5Ocopy()"
-ALIASES += ref_sencode_fmt_change="RFC H5Sencode() / H5Sdecode() Format Change"
+ALIASES += ref_mdc_in_hdf5="Metadata Caching in HDF5"
+ALIASES += ref_mdc_logging="Metadata Cache Logging"
+ALIASES += ref_news_112="New Features in HDF5 Release 1.12"
+ALIASES += ref_h5ocopy="Copying Committed Datatypes with H5Ocopy()"
+ALIASES += ref_sencode_fmt_change="RFC H5Sencode() / H5Sdecode() Format Change"
ALIASES += ref_vlen_strings="\Emph{Creating variable-length string datatypes}"
ALIASES += ref_vol_doc="VOL documentation"
@@ -251,103 +264,103 @@ ALIASES += ref_vol_doc="VOL documentation"
# RFCs
################################################################################
-ALIASES += ref_rfc20220819="Terminal VOL Connector Feature Flags"
-ALIASES += ref_rfc20210528="Multi-Thread HDF5"
-ALIASES += ref_rfc20210219="Selection I/O"
-ALIASES += ref_rfc20200213="VFD Sub-filing"
-ALIASES += ref_rfc20200210="Onion VFD"
-ALIASES += ref_rfc20190923="Virtual Object Layer (VOL) API Compatibility"
-ALIASES += ref_rfc20190715="Variable-Length Data in HDF5 Sketch Design"
-ALIASES += ref_rfc20190410="A Plugin Interface for HDF5 Virtual File Drivers"
-ALIASES += ref_rfc20181231="Dataset Object Header Size"
-ALIASES += ref_rfc20181220="MS 3.2 – Addressing Scalability: Scalability of open, close, flush CASE STUDY: CGNS Hotspot analysis of CGNS cgp_open"
-ALIASES += ref_rfc20180830="Sparse Chunks"
-ALIASES += ref_rfc20180829="H5FD_MIRROR Virtual File Driver"
-ALIASES += ref_rfc20180815="Splitter_VFD"
-ALIASES += ref_rfc20180712="Update to HDF5 References"
-ALIASES += ref_rfc20180620="Chunk query functionality in HDF5"
-ALIASES += ref_rfc20180610="VFD SWMR"
-ALIASES += ref_rfc20180321="API Contexts"
-ALIASES += ref_rfc20180125="Enhancement to the tool h5clear"
-ALIASES += ref_rfc20170707="H5Sencode/H5Sdecode Format Change"
-ALIASES += ref_rfc20160105="Setting Bounds for Object Creation in HDF5 1.10.0"
-ALIASES += ref_rfc20150915="File Format Changes in HDF5 1.10.0"
-ALIASES += ref_rfc20150709="Page Buffering"
-ALIASES += ref_rfc20150615="Metadata Cache Image"
-ALIASES += ref_rfc20150429="New Datatypes"
-ALIASES += ref_rfc20150424="Collective Metadata Writes"
-ALIASES += ref_rfc20150423="Enabling Collective Metadata Reads"
-ALIASES += ref_rfc20150301="The Tool to Handle HDF5 File Format Compatibility for Chunked Datasets"
-ALIASES += ref_rfc20150212="H5LTget_hardlinks – High-level API to list all the hard links to an object"
-ALIASES += ref_rfc20150205="HDF5 Fortran Wrappers Maintenance: Dropping Support for Non-Fortran 2003 Standard Compliant Compilers"
-ALIASES += ref_rfc20150202="New Autotools Behavior"
-ALIASES += ref_rfc20141210="HDF5 Virtual Dataset"
-ALIASES += ref_rfc20141201="Allocate/Free Mismatches in HDF5 Filter Code on Windows"
-ALIASES += ref_rfc20140916="Virtual Object Layer"
-ALIASES += ref_rfc20140827="Chunking and Compression Performance Tool Requirements"
-ALIASES += ref_rfc20140729="Replacing H5Fis_hdf5() with H5Fis_accessible()"
-ALIASES += ref_rfc20140722="Switching to a 64-bit hid_t Space in HDF5"
-ALIASES += ref_rfc20140717="Data Analysis Extensions"
-ALIASES += ref_rfc20140707="Virtual Object Layer"
-ALIASES += ref_rfc20140524="HDF5 Compression Demystified"
-ALIASES += ref_rfc20140318="Freeing Memory Allocated by the HDF5 Library"
-ALIASES += ref_rfc20140313="Options to handle compatibility issues for HDF5 files"
-ALIASES += ref_rfc20140224="Design: Metadata Cache Logging"
-ALIASES += ref_rfc20131211="Fine-Grained Control of Metadata Cache Flushes"
-ALIASES += ref_rfc20130930="Read Attempts for Metadata with Checksum"
-ALIASES += ref_rfc20130919="Core VFD Backing Store Paged Writes"
-ALIASES += ref_rfc20130630="Flush Dependency Testing"
-ALIASES += ref_rfc20130316="HDF5 Dynamically Loaded Filters"
-ALIASES += ref_rfc20121114="Direct Chunk Write"
-ALIASES += ref_rfc20121024="HDF5 File Space Management"
-ALIASES += ref_rfc20120828="New HDF5 API Routines for HPC Applications - Read/Write Multiple Datasets in an HDF5 file"
-ALIASES += ref_rfc20120523="HDF5 File Space Management: Paged Aggregation"
-ALIASES += ref_rfc20120501="HDF5 File Image Operations"
-ALIASES += ref_rfc20120305="Enabling a Strict Consistency Semantics Model in Parallel HDF5"
-ALIASES += ref_rfc20120220="h5repack: Improved Hyperslab selections for Large Chunked Datasets"
-ALIASES += ref_rfc20120120="A Maintainer's Guide for the Datatype Module in HDF5 Library"
-ALIASES += ref_rfc20120104="Actual I/O Mode"
-ALIASES += ref_rfc20111119="New public functions to handle comparison"
-ALIASES += ref_rfc20110825="Merging Named Datatypes in H5Ocopy()"
-ALIASES += ref_rfc20110811="Expanding the HDF5 Hyperslab Selection Interface"
-ALIASES += ref_rfc20110726="HDF5 File Space Allocation and Aggregation"
-ALIASES += ref_rfc20110614=" Refactor h5dump to Improve Maintenance"
-ALIASES += ref_rfc20110329="Support External Link Open File Cache in HDF5 Tools"
-ALIASES += ref_rfc20110118="h5diff Attribute Comparisons"
-ALIASES += ref_rfc20101122="SWMR Timeouts"
-ALIASES += ref_rfc20101104="Caching Files Opened Through External Links"
-ALIASES += ref_rfc20101018="HDF5 File and Object Comparison Specification"
-ALIASES += ref_rfc20100902="h5edit – An HDF5 File Editing Tool"
-ALIASES += ref_rfc20100727="Reserved Characters for HDF5 Applications"
-ALIASES += ref_rfc20100726="High-Level HDF5 API routines for HPC Applications"
-ALIASES += ref_rfc20100511="h5diff – Exclude Object(s) from Comparison"
-ALIASES += ref_rfc20100422="Generating attributes into an object with a tool"
-ALIASES += ref_rfc20100312="Supporting HDF5 1.8 in HDF5 Command Line Tools"
-ALIASES += ref_rfc20091218="Supporting soft-link and external-link for h5diff"
-ALIASES += ref_rfc20090907="HDF5 Tools Library Functions"
-ALIASES += ref_rfc20090612="Default EPSILON values for comparing floating point data"
-ALIASES += ref_rfc20081218="Reporting of Non-Comparable Datasets by h5diff"
-ALIASES += ref_rfc20081205="External Link Traversal Callback"
-ALIASES += ref_rfc20081030="Setting Raw Data Chunk Cache Parameters in HDF5"
-ALIASES += ref_rfc20080915="Performance Report for Free-space Manager"
-ALIASES += ref_rfc20080904="Setting File Access Property List for accessing External Link"
-ALIASES += ref_rfc20080728="Native Time Types in HDF5"
-ALIASES += ref_rfc20080723="Special Values in HDF5"
-ALIASES += ref_rfc20080301="Dynamic Transformations to HDF5 Data"
-ALIASES += ref_rfc20080209="Using SVN branching to improve software development process at THG"
-ALIASES += ref_rfc20080206="Maintaining the HISTORY.txt and RELEASE.txt files in HDF5"
-ALIASES += ref_rfc20071111="Addressing HDF5 file corruption issue"
-ALIASES += ref_rfc20071018="NaN detection in HDF5"
-ALIASES += ref_rfc20070801="Metadata Journaling to Improve Crash Survivability"
-ALIASES += ref_rfc20070413="API Compatibility Strategies for HDF5"
-ALIASES += ref_rfc20070115="A \"Private\" Heap for HDF5"
-ALIASES += ref_rfc20060623="Performance Comparison of Collective I/O and Independent I/O with Derived Datatypes"
-ALIASES += ref_rfc20060604="h5stat tool"
-ALIASES += ref_rfc20060505="Simple Performance Test on Fletcher32 Filter"
-ALIASES += ref_rfc20060410="Requirement Specifications of an HDF5 File Format Validation Tool"
-ALIASES += ref_rfc20060317="Proposed changes to the sec2 driver "
-ALIASES += ref_rfc20060124="Mapping FITS data to HDF5"
-ALIASES += ref_rfc20040811="Conversion Between Text and Datatype"
+ALIASES += ref_rfc20220819="Terminal VOL Connector Feature Flags"
+ALIASES += ref_rfc20210528="Multi-Thread HDF5"
+ALIASES += ref_rfc20210219="Selection I/O"
+ALIASES += ref_rfc20200213="VFD Sub-filing"
+ALIASES += ref_rfc20200210="Onion VFD"
+ALIASES += ref_rfc20190923="Virtual Object Layer (VOL) API Compatibility"
+ALIASES += ref_rfc20190715="Variable-Length Data in HDF5 Sketch Design"
+ALIASES += ref_rfc20190410="A Plugin Interface for HDF5 Virtual File Drivers"
+ALIASES += ref_rfc20181231="Dataset Object Header Size"
+ALIASES += ref_rfc20181220="MS 3.2 – Addressing Scalability: Scalability of open, close, flush CASE STUDY: CGNS Hotspot analysis of CGNS cgp_open"
+ALIASES += ref_rfc20180830="Sparse Chunks"
+ALIASES += ref_rfc20180829="H5FD_MIRROR Virtual File Driver"
+ALIASES += ref_rfc20180815="Splitter_VFD"
+ALIASES += ref_rfc20180712="Update to HDF5 References"
+ALIASES += ref_rfc20180620="Chunk query functionality in HDF5"
+ALIASES += ref_rfc20180610="VFD SWMR"
+ALIASES += ref_rfc20180321="API Contexts"
+ALIASES += ref_rfc20180125="Enhancement to the tool h5clear"
+ALIASES += ref_rfc20170707="H5Sencode/H5Sdecode Format Change"
+ALIASES += ref_rfc20160105="Setting Bounds for Object Creation in HDF5 1.10.0"
+ALIASES += ref_rfc20150915="File Format Changes in HDF5 1.10.0"
+ALIASES += ref_rfc20150709="Page Buffering"
+ALIASES += ref_rfc20150615="Metadata Cache Image"
+ALIASES += ref_rfc20150429="New Datatypes"
+ALIASES += ref_rfc20150424="Collective Metadata Writes"
+ALIASES += ref_rfc20150423="Enabling Collective Metadata Reads"
+ALIASES += ref_rfc20150301="The Tool to Handle HDF5 File Format Compatibility for Chunked Datasets"
+ALIASES += ref_rfc20150212="H5LTget_hardlinks – High-level API to list all the hard links to an object"
+ALIASES += ref_rfc20150205="HDF5 Fortran Wrappers Maintenance: Dropping Support for Non-Fortran 2003 Standard Compliant Compilers"
+ALIASES += ref_rfc20150202="New Autotools Behavior"
+ALIASES += ref_rfc20141210="HDF5 Virtual Dataset"
+ALIASES += ref_rfc20141201="Allocate/Free Mismatches in HDF5 Filter Code on Windows"
+ALIASES += ref_rfc20140916="Virtual Object Layer"
+ALIASES += ref_rfc20140827="Chunking and Compression Performance Tool Requirements"
+ALIASES += ref_rfc20140729="Replacing H5Fis_hdf5() with H5Fis_accessible()"
+ALIASES += ref_rfc20140722="Switching to a 64-bit hid_t Space in HDF5"
+ALIASES += ref_rfc20140717="Data Analysis Extensions"
+ALIASES += ref_rfc20140707="Virtual Object Layer"
+ALIASES += ref_rfc20140524="HDF5 Compression Demystified"
+ALIASES += ref_rfc20140318="Freeing Memory Allocated by the HDF5 Library"
+ALIASES += ref_rfc20140313="Options to handle compatibility issues for HDF5 files"
+ALIASES += ref_rfc20140224="Design: Metadata Cache Logging"
+ALIASES += ref_rfc20131211="Fine-Grained Control of Metadata Cache Flushes"
+ALIASES += ref_rfc20130930="Read Attempts for Metadata with Checksum"
+ALIASES += ref_rfc20130919="Core VFD Backing Store Paged Writes"
+ALIASES += ref_rfc20130630="Flush Dependency Testing"
+ALIASES += ref_rfc20130316="HDF5 Dynamically Loaded Filters"
+ALIASES += ref_rfc20121114="Direct Chunk Write"
+ALIASES += ref_rfc20121024="HDF5 File Space Management"
+ALIASES += ref_rfc20120828="New HDF5 API Routines for HPC Applications - Read/Write Multiple Datasets in an HDF5 file"
+ALIASES += ref_rfc20120523="HDF5 File Space Management: Paged Aggregation"
+ALIASES += ref_rfc20120501="HDF5 File Image Operations"
+ALIASES += ref_rfc20120305="Enabling a Strict Consistency Semantics Model in Parallel HDF5"
+ALIASES += ref_rfc20120220="h5repack: Improved Hyperslab selections for Large Chunked Datasets"
+ALIASES += ref_rfc20120120="A Maintainer's Guide for the Datatype Module in HDF5 Library"
+ALIASES += ref_rfc20120104="Actual I/O Mode"
+ALIASES += ref_rfc20111119="New public functions to handle comparison"
+ALIASES += ref_rfc20110825="Merging Named Datatypes in H5Ocopy()"
+ALIASES += ref_rfc20110811="Expanding the HDF5 Hyperslab Selection Interface"
+ALIASES += ref_rfc20110726="HDF5 File Space Allocation and Aggregation"
+ALIASES += ref_rfc20110614=" Refactor h5dump to Improve Maintenance"
+ALIASES += ref_rfc20110329="Support External Link Open File Cache in HDF5 Tools"
+ALIASES += ref_rfc20110118="h5diff Attribute Comparisons"
+ALIASES += ref_rfc20101122="SWMR Timeouts"
+ALIASES += ref_rfc20101104="Caching Files Opened Through External Links"
+ALIASES += ref_rfc20101018="HDF5 File and Object Comparison Specification"
+ALIASES += ref_rfc20100902="h5edit – An HDF5 File Editing Tool"
+ALIASES += ref_rfc20100727="Reserved Characters for HDF5 Applications"
+ALIASES += ref_rfc20100726="High-Level HDF5 API routines for HPC Applications"
+ALIASES += ref_rfc20100511="h5diff – Exclude Object(s) from Comparison"
+ALIASES += ref_rfc20100422="Generating attributes into an object with a tool"
+ALIASES += ref_rfc20100312="Supporting HDF5 1.8 in HDF5 Command Line Tools"
+ALIASES += ref_rfc20091218="Supporting soft-link and external-link for h5diff"
+ALIASES += ref_rfc20090907="HDF5 Tools Library Functions"
+ALIASES += ref_rfc20090612="Default EPSILON values for comparing floating point data"
+ALIASES += ref_rfc20081218="Reporting of Non-Comparable Datasets by h5diff"
+ALIASES += ref_rfc20081205="External Link Traversal Callback"
+ALIASES += ref_rfc20081030="Setting Raw Data Chunk Cache Parameters in HDF5"
+ALIASES += ref_rfc20080915="Performance Report for Free-space Manager"
+ALIASES += ref_rfc20080904="Setting File Access Property List for accessing External Link"
+ALIASES += ref_rfc20080728="Native Time Types in HDF5"
+ALIASES += ref_rfc20080723="Special Values in HDF5"
+ALIASES += ref_rfc20080301="Dynamic Transformations to HDF5 Data"
+ALIASES += ref_rfc20080209="Using SVN branching to improve software development process at THG"
+ALIASES += ref_rfc20080206="Maintaining the HISTORY.txt and RELEASE.txt files in HDF5"
+ALIASES += ref_rfc20071111="Addressing HDF5 file corruption issue"
+ALIASES += ref_rfc20071018="NaN detection in HDF5"
+ALIASES += ref_rfc20070801="Metadata Journaling to Improve Crash Survivability"
+ALIASES += ref_rfc20070413="API Compatibility Strategies for HDF5"
+ALIASES += ref_rfc20070115="A \"Private\" Heap for HDF5"
+ALIASES += ref_rfc20060623="Performance Comparison of Collective I/O and Independent I/O with Derived Datatypes"
+ALIASES += ref_rfc20060604="h5stat tool"
+ALIASES += ref_rfc20060505="Simple Performance Test on Fletcher32 Filter"
+ALIASES += ref_rfc20060410="Requirement Specifications of an HDF5 File Format Validation Tool"
+ALIASES += ref_rfc20060317="Proposed changes to the sec2 driver "
+ALIASES += ref_rfc20060124="Mapping FITS data to HDF5"
+ALIASES += ref_rfc20040811="Conversion Between Text and Datatype"
################################################################################
# The Usual Suspects
diff --git a/doxygen/dox/About.dox b/doxygen/dox/About.dox
index a8b31d79af4..8b7c141dd1b 100644
--- a/doxygen/dox/About.dox
+++ b/doxygen/dox/About.dox
@@ -23,7 +23,7 @@ Use Doxygen's here.
+here.
\subsection new_rm_entry Creating a New Reference Manual Entry
@@ -33,7 +33,7 @@ Please refer to the \ref RMT for guidance on how to create a new reference manua
For each HDF5 module, such as \Code{H5F}, there is an examples source file called
\Code{H5*_examples.c}. For example, the \Code{H5F} API examples are located in
-
+
H5F_examples.c
. Examples are code blocks marked as Doxygen
snippets.
For example, the source code for the H5Fcreate() API sample is located between
@@ -44,7 +44,7 @@ the
//!
\endverbatim
comments in
-
+
H5F_examples.c
.
Add a new API example by adding a new code block enclosed between matching
@@ -80,7 +80,7 @@ See Doxygen's Custom Comman
as a general reference.
All custom commands for this project are located in the
-aliases
+aliases
file in the doxygen
subdirectory of the main HDF5 repo.
@@ -91,7 +91,7 @@ ask for help if unsure!
For ease of reference, we define custom commands for each RFC in the RFCs section
of the
-aliases
+aliases
file. For example the custom command \Code{ref_rfc20141210} can be used to insert a
reference to "RFC: Virtual Object Layer". In other words, the markup
\verbatim
@@ -102,16 +102,16 @@ yields a clickable link:
\ref_rfc20141210
To add a new RFC, add a custom command for the RFC to the
-aliases
+aliases
file. The naming convention for the custom command is \Code{ref_rfcYYYYMMDD},
where \Code{YYYYMMDD} is the ID of the RFC. The URL is composed of the prefix
\verbatim
-https://docs.hdfgroup.org/hdf5/rfc/
+https://\RFCURL/
\endverbatim
and the name of your RFC file, typically, a PDF file, i.e., the full URL would
be
\verbatim
-https://docs.hdfgroup.org/hdf5/rfc/my_great_rfc_name.pdf
+https://\RFCURL/my_great_rfc_name.pdf
\endverbatim
\subsection hosting How Do Updates and Changes Get Published?
diff --git a/doxygen/dox/ExamplesAPI.dox b/doxygen/dox/ExamplesAPI.dox
index 25342cc250d..156f8826b2e 100644
--- a/doxygen/dox/ExamplesAPI.dox
+++ b/doxygen/dox/ExamplesAPI.dox
@@ -27,236 +27,236 @@ Languages are C, Fortran, Java (JHI5), Java Object Package, Python (High Level),
Set Space Allocation Time for Dataset |
-C
-FORTRAN
-Java
+C
+FORTRAN
+Java
JavaObj
MATLAB PyHigh PyLow
|
h5ex_d_alloc.h5 |
-h5ex_d_alloc.tst |
-h5ex_d_alloc.ddl |
+h5ex_d_alloc.tst |
+h5ex_d_alloc.ddl |
Read / Write Dataset using Fletcher32 Checksum Filter |
-C
-FORTRAN
-Java
+C
+FORTRAN
+Java
JavaObj
MATLAB PyHigh PyLow
|
h5ex_d_checksum.h5 |
-h5ex_d_checksum.tst |
-h5ex_d_checksum.ddl |
+h5ex_d_checksum.tst |
+h5ex_d_checksum.ddl |
Read / Write Chunked Dataset |
-C
-FORTRAN
-Java
+C
+FORTRAN
+Java
JavaObj
MATLAB PyHigh PyLow
|
h5ex_d_chunk.h5 |
-h5ex_d_chunk.tst |
-h5ex_d_chunk.ddl |
+h5ex_d_chunk.tst |
+h5ex_d_chunk.ddl |
Read / Write Compact Dataset |
-C
-FORTRAN
-Java
+C
+FORTRAN
+Java
JavaObj
MATLAB PyHigh PyLow
|
h5ex_d_compact.h5 |
-h5ex_d_compact.tst |
-h5ex_d_compact.ddl |
+h5ex_d_compact.tst |
+h5ex_d_compact.ddl |
Read / Write to External Dataset |
-C
-FORTRAN
-Java
+C
+FORTRAN
+Java
JavaObj
MATLAB PyHigh PyLow
|
h5ex_d_extern.h5 |
-h5ex_d_extern.tst |
-h5ex_d_extern.ddl |
+h5ex_d_extern.tst |
+h5ex_d_extern.ddl |
Read / Write Dataset w/ Fill Value |
-C
-FORTRAN
-Java
+C
+FORTRAN
+Java
JavaObj
MATLAB PyHigh PyLow
|
h5ex_d_fillval.h5 |
-h5ex_d_fillval.tst |
-h5ex_d_fillval.ddl |
+h5ex_d_fillval.tst |
+h5ex_d_fillval.ddl |
Read / Write GZIP Compressed Dataset |
-C
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_gzip.h5 | h5ex_d_gzip.tst | h5ex_d_gzip.ddl |
Read / Write Data by Hyperslabs |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_hyper.h5 | h5ex_d_hyper.tst | h5ex_d_hyper.ddl |
Read / Write Dataset with n-bit Filter |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_nbit.h5 | h5ex_d_nbit.tst | h5ex_d_nbit.ddl |
Read / Write Integer Dataset |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_rdwrc.h5 | h5ex_d_rdwrc.tst | h5ex_d_rdwr.ddl |
Read / Write Dataset w/ Shuffle Filter and GZIP Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_shuffle.h5 | h5ex_d_shuffle.tst | h5ex_d_shuffle.ddl |
Read / Write Dataset using Scale-Offset Filter (float) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_sofloat.h5 | h5ex_d_sofloat.tst | h5ex_d_sofloat.ddl |
Read / Write Dataset using Scale-Offset Filter (integer) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_soint.h5 | h5ex_d_soint.tst | h5ex_d_soint.ddl |
Read / Write Dataset using SZIP Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_szip.h5 | h5ex_d_szip.tst | h5ex_d_szip.ddl |
Read / Write Dataset using Data Transform Expression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_transform.h5 | h5ex_d_transform.tst | h5ex_d_transform.ddl |
Read / Write Unlimited Dimension Dataset |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_unlimadd.h5 | h5ex_d_unlimadd.tst | h5ex_d_unlimadd.ddl |
Read / Write GZIP Compressed Unlimited Dimension Dataset |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_unlimgzip.h5 | h5ex_d_unlimgzip.tst | h5ex_d_unlimgzip.ddl |
Read / Write / Edit Unlimited Dimension Dataset |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_unlimmod.h5 | h5ex_d_unlimmod.tst | h5ex_d_unlimmod.ddl |
@@ -272,105 +272,105 @@ FORTRAN
Create "compact-or-indexed" Format Groups |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_compact.h5 | h5ex_g_.tst | h5ex_g_compact1.ddl | h5ex_g_compact2.ddl |
Track links in a Group by Creation Order |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_corder.h5 | h5ex_g_corder.tst | h5ex_g_corder.ddl |
Create / Open / Close a Group |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_create.h5 | h5ex_g_create.tst | h5ex_g_create.ddl |
Create Intermediate Groups |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_intermediate.h5 | h5ex_g_intermediate.tst | h5ex_g_intermediate.ddl |
Iterate over Groups w/ H5Literate |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_iterate.h5 | h5ex_g_iterate.tst | h5ex_g_iterate.ddl |
Set Conditions to Convert between Compact and Dense Groups |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_phase.h5 | h5ex_g_phase.tst | h5ex_g_phase.ddl |
Recursively Traverse a File with H5Literate |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_traverse.h5 | h5ex_g_traverse.tst | h5ex_g_traverse.ddl |
Recursively Traverse a File with H5Ovisit / H5Lvisit |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_g_visit.h5 | h5ex_g_visit.tst | h5ex_g_visit.ddl |
@@ -387,347 +387,347 @@ FORTRAN
Read / Write Array (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_arrayatt.h5 | h5ex_t_arrayatt.tst | h5ex_t_arrayatt.ddl |
Read / Write Array (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_array.h5 | h5ex_t_array.tst | h5ex_t_array.ddl |
Read / Write Bitfield (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_bitatt.h5 | h5ex_t_bitatt.tst | h5ex_t_bitatt.ddl |
Read / Write Bitfield (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_bit.h5 | h5ex_t_bit.tst | h5ex_t_bit.ddl |
Read / Write Compound (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_cmpdatt.h5 | h5ex_t_cmpdatt.tst | h5ex_t_cmpdatt.ddl |
Read / Write Compound (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_cmpd.h5 | h5ex_t_cmpd.tst | h5ex_t_cmpd.ddl |
Commit Named Datatype and Read Back |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_commit.h5 | h5ex_t_commit.tst | h5ex_t_commit.ddl |
Convert Between Datatypes in Memory |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_convert.h5 | h5ex_t_convert.tst | h5ex_t_convert.ddl |
Read / Write Complex Compound (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_cpxcmpdatt.h5 | h5ex_t_cpxcmpdatt.tst | h5ex_t_cpxcmpdatt.ddl |
Read / Write Complex Compound (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_cpxcmpd.h5 | h5ex_t_cpxcmpd.tst | h5ex_t_cpxcmpd.ddl |
Read / Write Enumerated (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_enumatt.h5 | h5ex_t_enumatt.tst | h5ex_t_enumatt.ddl |
Read / Write Enumerated (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_enum.h5 | h5ex_t_enum.tst | h5ex_t_enum.ddl |
Read / Write Floating Point (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_floatatt.h5 | h5ex_t_floatatt.tst | h5ex_t_floatatt.ddl |
Read / Write Floating Point (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_float.h5 | h5ex_t_float.tst | h5ex_t_float.ddl |
Read / Write Integer Datatype (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_intatt.h5 | h5ex_t_intatt.tst | h5ex_t_intatt.ddl |
Read / Write Integer Datatype (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_int.h5 | h5ex_t_int.tst | h5ex_t_int.ddl |
Read / Write Object References (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_objrefatt.h5 | h5ex_t_objrefatt.tst | h5ex_t_objrefatt.ddl |
Read / Write Object References (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_objref.h5 | h5ex_t_objref.tst | h5ex_t_objref.ddl |
Read / Write Opaque (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_opaqueatt.h5 | h5ex_t_opaqueatt.tst | h5ex_t_opaqueatt.ddl |
Read / Write Opaque (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_opaque.h5 | h5ex_t_opaque.tst | h5ex_t_opaque.ddl |
Read / Write Region References (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_regrefatt.h5 | h5ex_t_regrefatt.tst | h5ex_t_regrefatt.ddl |
Read / Write Region References (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_regref.h5 | h5ex_t_regref.tst | h5ex_t_regref.ddl |
Read / Write String (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_stringatt.h5 | h5ex_t_stringatt.tst | h5ex_t_stringatt.ddl |
Read / Write String (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_string.h5 | h5ex_t_string.tst | h5ex_t_string.ddl |
Read / Write Variable Length (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_vlenatt.h5 | h5ex_t_vlenatt.tst | h5ex_t_vlenatt.ddl |
Read / Write Variable Length (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_vlen.h5 | h5ex_t_vlen.tst | h5ex_t_vlen.ddl |
Read / Write Variable Length String (Attribute) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_vlstringatt.h5 | h5ex_t_vlstringatt.tst | h5ex_t_vlstringatt.ddl |
Read / Write Variable Length String (Dataset) |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_t_vlstring.h5 | h5ex_t_vlstring.tst | h5ex_t_vlstring.ddl |
@@ -743,92 +743,92 @@ FORTRAN
Read / Write Dataset using Blosc Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_blosc.h5 | h5ex_d_blosc.tst | h5ex_d_blosc.ddl |
Read / Write Dataset using Bit Shuffle Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_bshuf.h5 | h5ex_d_bshuf.tst | h5ex_d_bshuf.ddl |
Read / Write Dataset using BZip2 Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_bzip2.h5 | h5ex_d_bzip2.tst | h5ex_d_bzip2.ddl |
Read / Write Dataset using JPEG Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_jpeg.h5 | h5ex_d_jpeg.tst | h5ex_d_jpeg.ddl |
Read / Write Dataset using LZ4 Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_lz4.h5 | h5ex_d_lz4.tst | h5ex_d_lz4.ddl |
Read / Write Dataset using LZF Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_lzf.h5 | h5ex_d_lzf.tst | h5ex_d_lzf.ddl |
Read / Write Dataset using MAFISC Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_mafisc.h5 | h5ex_d_mafisc.tst | h5ex_d_mafisc.ddl |
Read / Write Dataset using ZFP Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_zfp.h5 | h5ex_d_zfp.tst | h5ex_d_zfp.ddl |
Read / Write Dataset using ZStd Compression |
C FORTRAN Java JavaObj MATLAB PyHigh PyLow |
h5ex_d_zstd.h5 | h5ex_d_zstd.tst | h5ex_d_zstd.ddl |
@@ -842,66 +842,66 @@ FORTRAN
Create/Read/Write an Attribute |
Java JavaObj |
HDF5AttributeCreate.txt |
Create Datasets |
Java JavaObj |
HDF5DatasetCreate.txt |
Read/Write Datasets |
Java JavaObj |
HDF5DatasetRead.txt |
Create an Empty File |
Java JavaObj |
HDF5FileCreate.txt |
Retrieve the File Structure |
Java JavaObj |
HDF5FileStructure.txt |
Create Groups |
Java JavaObj |
HDF5GroupCreate.txt |
Select a Subset of a Dataset |
Java JavaObj |
HDF5SubsetSelect.txt |
Create Two Datasets Within Groups |
Java JavaObj |
HDF5GroupDatasetCreate.txt |
@@ -917,8 +917,8 @@ FORTRAN
Creating and Accessing a File |
C
FORTRAN
MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -927,8 +927,8 @@ FORTRAN
Creating and Accessing a Dataset |
C
FORTRAN
MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -937,8 +937,8 @@ FORTRAN
Writing and Reading Contiguous Hyperslabs |
C
FORTRAN
MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -947,8 +947,8 @@ FORTRAN
Writing and Reading Regularly Spaced Data Hyperslabs |
C
FORTRAN
MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -957,8 +957,8 @@ FORTRAN
Writing and Reading Pattern Hyperslabs |
C
FORTRAN
MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -967,8 +967,8 @@ FORTRAN
Writing and Reading Chunk Hyperslabs |
C
FORTRAN
MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -977,7 +977,7 @@ FORTRAN
Using the Subfiling VFD to Write a File Striped Across Multiple Subfiles |
C
FORTRAN MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -986,7 +986,7 @@ FORTRAN
Write to Datasets with Filters Applied |
C
FORTRAN MATLAB PyHigh PyLow
|
ph5_.h5 |
@@ -995,7 +995,7 @@ FORTRAN
Collectively Write Datasets with Filters and Not All Ranks have Data |
C
FORTRAN MATLAB PyHigh PyLow
|
ph5_.h5 |
diff --git a/doxygen/dox/IntroHDF5.dox b/doxygen/dox/IntroHDF5.dox
index d9c70ae23cf..737b17de4ba 100644
--- a/doxygen/dox/IntroHDF5.dox
+++ b/doxygen/dox/IntroHDF5.dox
@@ -608,7 +608,8 @@ on the HDF-EOS Tools and Information Center pag
\section secHDF5Examples Examples
\li \ref LBExamples
\li \ref ExAPI
\li Examples in the Source Code
\li Other Examples
\section secHDF5ExamplesCompile How To Compile
For information on compiling in C, C++ and Fortran, see: \ref LBCompiling
@@ -617,10 +618,10 @@ For information on compiling in C, C++ and Fortran, see: \ref LBCompiling
IDL, MATLAB, and NCL Examples for HDF-EOS
Examples of how to access and visualize NASA HDF-EOS files using IDL, MATLAB, and NCL.
Miscellaneous Examples
These (very old) examples resulted from working with users, and are not fully tested. Most of them are in C, with a few in Fortran and Java.
Using Special Values
These examples show how to create special values in an HDF5 application.
*/
diff --git a/doxygen/dox/IntroParExamples.dox b/doxygen/dox/IntroParExamples.dox
index 40e07c79966..cdab44f35a5 100644
--- a/doxygen/dox/IntroParExamples.dox
+++ b/doxygen/dox/IntroParExamples.dox
@@ -95,7 +95,7 @@ Below is the example program:
@@ -205,7 +205,7 @@ Below is the F90 example program which illustrates how to write contiguous hyper
@@ -275,7 +275,7 @@ Below is an example program for writing hyperslabs by column in Parallel HDF5:
@@ -346,7 +346,7 @@ Below is the example program for writing hyperslabs by column in Parallel HDF5:
@@ -431,12 +431,12 @@ Below are example programs for writing hyperslabs by pattern in Parallel HDF5:
@@ -530,12 +530,12 @@ Below are example programs for writing hyperslabs by pattern in Parallel HDF5:
diff --git a/doxygen/dox/IntroParHDF5.dox b/doxygen/dox/IntroParHDF5.dox
index 73a2589f4e6..414a186af89 100644
--- a/doxygen/dox/IntroParHDF5.dox
+++ b/doxygen/dox/IntroParHDF5.dox
@@ -145,8 +145,8 @@ Following is example code for creating an access template in HDF5:
\endcode
The following example programs create an HDF5 file using Parallel HDF5:
C: file_create.c
F90: file_create.F90
\subsection subsec_pintro_create_dset Creating and Accessing a Dataset with PHDF5
@@ -226,8 +226,8 @@ The following code demonstrates a collective write using Parallel HDF5:
\endcode
The following example programs create an HDF5 dataset using Parallel HDF5:
C: dataset.c
F90: dataset.F90
\subsubsection subsec_pintro_hyperslabs Hyperslabs
@@ -264,7 +264,6 @@ HDF5 by contiguous hyperslab, by regularly spaced data in a column/row, by patte
-
Navigate back: \ref index "Main" / \ref GettingStarted
diff --git a/doxygen/dox/LearnBasics.dox b/doxygen/dox/LearnBasics.dox
index bb8d3e0d457..bbdf4224c09 100644
--- a/doxygen/dox/LearnBasics.dox
+++ b/doxygen/dox/LearnBasics.dox
@@ -59,7 +59,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create a file
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
@@ -67,7 +67,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create a dataset
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
@@ -75,7 +75,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Read and write to a dataset
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
@@ -83,7 +83,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create an attribute
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
@@ -91,7 +91,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create a group
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
@@ -99,7 +99,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create groups in a file using absolute and relative paths
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
@@ -107,7 +107,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create datasets in a group
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
@@ -115,7 +115,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create a file and dataset and select/read a subset from the dataset
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
Also see examples to Write by row (and column) below.
|
@@ -123,7 +123,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create an extendible (unlimited dimension) dataset
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
Also see examples to Extend by row (and column) below
|
@@ -131,7 +131,7 @@ These examples (C, C++, Fortran) are provided in the HDF5 source code and (Unix)
Create a chunked and compressed dataset
|
-C Fortran C++ Java Python
+ | C Fortran C++ Java Python
|
|
diff --git a/doxygen/dox/LearnBasics2.dox b/doxygen/dox/LearnBasics2.dox
index ed2810c59bf..8eda57bc0c2 100644
--- a/doxygen/dox/LearnBasics2.dox
+++ b/doxygen/dox/LearnBasics2.dox
@@ -468,7 +468,7 @@ If the offset were 1x1 (instead of 1x2), then the selection can be made:
The selections above were tested with the
h5_subsetbk.c
example code. The memory dataspace was defined as one-dimensional.
\subsection subsecLBDsetSubRWProgRem Remarks
diff --git a/doxygen/dox/LearnBasics3.dox b/doxygen/dox/LearnBasics3.dox
index be227c2ba2c..65c873d63f0 100644
--- a/doxygen/dox/LearnBasics3.dox
+++ b/doxygen/dox/LearnBasics3.dox
@@ -156,9 +156,9 @@ Specifically look at the \ref ExAPI.
There are examples for different languages.
The C example to create a chunked dataset is:
h5ex_d_chunk.c
The C example to create a compact dataset is:
h5ex_d_compact.c
\section secLBDsetLayoutChange Changing the Layout after Dataset Creation
The dataset layout is a Dataset Creation Property List. This means that once the dataset has been
@@ -166,7 +166,7 @@ created the dataset layout cannot be changed. The h5repack utility can be used t
to a new with a new layout.
\section secLBDsetLayoutSource Sources of Information
Chunking in HDF5
(See the documentation on Advanced Topics in HDF5)
\see \ref sec_plist in the HDF5 \ref UG.
@@ -184,7 +184,7 @@ certain initial dimensions, then to later increase the size of any of the initia
HDF5 requires you to use chunking to define extendible datasets. This makes it possible to extend
datasets efficiently without having to excessively reorganize storage. (To use chunking efficiently,
be sure to see the advanced topic, Chunking in HDF5.)
The following operations are required in order to extend a dataset:
\li Declare the dataspace of the dataset to have unlimited dimensions for all dimensions that might eventually be extended.
@@ -224,7 +224,7 @@ Navigate back: \ref index "Main" / \ref GettingStarted / \ref LearnBasics
\section secLBComDsetCreate Creating a Compressed Dataset
HDF5 requires you to use chunking to create a compressed dataset. (To use chunking efficiently,
be sure to see the advanced topic, Chunking in HDF5.)
The following operations are required in order to create a compressed dataset:
\li Create a dataset creation property list.
@@ -294,12 +294,12 @@ Specifically look at the \ref ExAPI.
There are examples for different languages, where examples of using #H5Literate and #H5Ovisit/#H5Lvisit are included.
The h5ex_g_traverse example traverses a file using H5Literate:
\li C: h5ex_g_traverse.c
\li F90: h5ex_g_traverse_F03.f90
The h5ex_g_visit example traverses a file using H5Ovisit and H5Lvisit:
\li C: h5ex_g_visit.c
\li F90: h5ex_g_visit_F03.f90
Navigate back: \ref index "Main" / \ref GettingStarted / \ref LearnBasics
diff --git a/doxygen/dox/LearnHDFView.dox b/doxygen/dox/LearnHDFView.dox
index 2916db841e6..7624e42c40d 100644
--- a/doxygen/dox/LearnHDFView.dox
+++ b/doxygen/dox/LearnHDFView.dox
@@ -8,7 +8,7 @@ any programming experience.
\section sec_learn_hv_install HDFView Installation
\li Download and install HDFView. It can be downloaded from the Download HDFView page.
\li Obtain the storm1.txt text file, used in the tutorial.
\section sec_learn_hv_begin Begin Tutorial
Once you have HDFView installed, bring it up and you are ready to begin the tutorial.
@@ -113,11 +113,11 @@ Datatype information as is):
Double left click on the Storm dataset in the tree view. A window with an empty spreadsheet pops open.
Copy the data from the storm1.txt file into the dataset.
If you downloaded storm1.txt,
then click on the Import/Export Data menu and select Import Data from -> Text File.
Specify a location, select storm1.txt
and click on the Open button. Answer Yes in the dialog box that
pops up (which asks if you wish to paste the selected data).
@@ -246,7 +246,7 @@ in the file).
Please note that the chunk sizes used in this topic are for demonstration purposes only. For
information on chunking and specifying an appropriate chunk size, see the
Chunking in HDF5 documentation.
Also see the HDF5 Tutorial topic on \ref secLBComDsetCreate.
@@ -317,8 +317,8 @@ You will see the Another Storm dataset in the Image group:
Copy the data from the storm1.txt file into the dataset. (See the previous topic for copying
storm1.txt into a dataset.)
Table -> Close, and save the data.
Right click on Another Storm, and select Open As.
Select the Image button in the Dataset Selection window that pops up. Click the Ok button at the
@@ -375,7 +375,7 @@ Add data to the Storm Image dataset as was shown previously:
Right click on Storm Image, and select Open As to open the Dataset Selection window.
Click on the Spreadsheet button at the top left of the Dataset Selection window to view the image
as a spreadsheet.
Copy the data from the storm1.txt file into the dataset.
Close the dataset and save the data.
diff --git a/doxygen/dox/Overview.dox b/doxygen/dox/Overview.dox
index 41ea112b20c..f6116996b07 100644
--- a/doxygen/dox/Overview.dox
+++ b/doxygen/dox/Overview.dox
@@ -23,7 +23,7 @@ documents cover a mix of tasks, concepts, and reference, to help a specific
\par Versions
Version-specific documentation (see the version in the title area) can be found
here:
- HDF5 develop branch (this site)
- HDF5 1.14.x
- HDF5 1.12.x
- HDF5 1.10.x
diff --git a/doxygen/dox/UsersGuide.dox b/doxygen/dox/UsersGuide.dox
index 3dd26f1a40a..13f4edbe3f5 100644
--- a/doxygen/dox/UsersGuide.dox
+++ b/doxygen/dox/UsersGuide.dox
@@ -374,7 +374,7 @@ These documents provide additional information for the use and tuning of specifi
HDF5 Dynamically Loaded Filters
|
Describes how an HDF5 application can apply a filter that is not registered with the HDF5 Library.
@@ -382,7 +382,7 @@ These documents provide additional information for the use and tuning of specifi
|
HDF5 File Image Operations
|
Describes how to work with HDF5 files in memory. Disk I/O is not required when file images are opened, created, read from, or written to.
diff --git a/doxygen/dox/VOLConnGuide.dox b/doxygen/dox/VOLConnGuide.dox
index 7a03ab1590d..e30a48428a8 100644
--- a/doxygen/dox/VOLConnGuide.dox
+++ b/doxygen/dox/VOLConnGuide.dox
@@ -92,7 +92,7 @@ Public header Files you will need to be familiar with include:
Many VOL connectors are listed on The HDF Group's VOL plugin registration page, located at:
Registered VOL Connectors.
Not all of these VOL connectors are supported by The HDF Group and the level of completeness varies, but the
connectors found there can serve as examples of working implementations
@@ -195,7 +195,7 @@ contact help@hdfgroup.org for help with this. We
name you've chosen will appear on the registered VOL connectors page.
As noted above, registered VOL connectors will be listed at:
Registered VOL Connectors
A new \b conn_version field has been added to the class struct for 1.13. This field is currently not used by
the library so its use is determined by the connector author. Best practices for this field will be determined
diff --git a/doxygen/dox/ViewTools.dox b/doxygen/dox/ViewTools.dox
index d1a7508bf02..0a13daf0268 100644
--- a/doxygen/dox/ViewTools.dox
+++ b/doxygen/dox/ViewTools.dox
@@ -54,9 +54,9 @@ HDF5 files can be obtained from various places such as \ref HDF5Examples and . Specifically, the following examples are used in this tutorial topic:
\li HDF5 Files created from compiling the \ref LBExamples
\li HDF5 Files on the \ref ExAPI page
\li NPP JPSS files, SVM01_npp.. (gzipped)
and SVM09_npp.. (gzipped)
\li HDF-EOS OMI-Aura file
\section secViewToolsCommandTutor Tutorial Topics
A variety of command-line tools are included in the HDF5 binary distribution. There are tools to view,
@@ -196,44 +196,29 @@ The following h5dump options can be helpful in viewing the content and structure
| Comment |
-n, --contents |
Displays a list of the objects in a file |
See @ref subsubsecViewToolsViewContent_h5dumpEx1 |
-n 1, --contents=1 |
Displays a list of the objects and attributes in a file |
See @ref subsubsecViewToolsViewAttr_h5dumpEx6 |
-H, --header |
Displays header information only (no data) |
See @ref subsubsecViewToolsViewContent_h5dumpEx2 |
-A 0, --onlyattr=0 |
Suppresses the display of attributes |
See @ref subsubsecViewToolsViewContent_h5dumpEx2 |
-N P, --any_path=P |
Displays any object or attribute that matches path P |
See @ref subsubsecViewToolsViewAttr_h5dumpEx6 |
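The options above combine naturally on the command line; `file.h5` and the path `/dset1` below are placeholders, not files from the tutorial:

```shell
# List only the objects in a file
h5dump -n file.h5

# List the objects and the attributes
h5dump -n 1 file.h5

# Show header information without data, with attributes suppressed
h5dump -H -A 0 file.h5

# Show any object or attribute whose path matches /dset1
h5dump -N /dset1 file.h5
```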
@@ -997,7 +982,7 @@ In other words, it is an array of four elements, in which each element is a 3 by
This dataset is much more complex. Also note that subsetting cannot be done on Array datatypes.
See this section for more information on the Array datatype.
\subsubsection subsubsecViewToolsViewDtypes_objref Object Reference
An Object Reference is a reference to an entire object (dataset, group, or named datatype).
diff --git a/doxygen/dox/ViewToolsJPSS.dox b/doxygen/dox/ViewToolsJPSS.dox
index 9c153956797..18c8ccefa0f 100644
--- a/doxygen/dox/ViewToolsJPSS.dox
+++ b/doxygen/dox/ViewToolsJPSS.dox
@@ -11,8 +11,8 @@ Navigate back: \ref index "Main" / \ref GettingStarted / \ref ViewToolsCommand
This tutorial illustrates how to use the HDF5 tools to examine NPP files from the JPSS project. The following files are discussed:
\code
SVM09_npp_d20120229_t0849107_e0854511_b01759_c20120229145452682127_noaa_ops.h5 (gzipped file)
SVM01_npp_d20130524_t1255132_e1256374_b08146_c20130524192048864992_noaa_ops.h5 (gzipped file)
\endcode
\section secViewToolsJPSSDeter Determining File Contents
diff --git a/doxygen/dox/rm-template.dox b/doxygen/dox/rm-template.dox
index 1e9f2d7b6b3..003d5c4b862 100644
--- a/doxygen/dox/rm-template.dox
+++ b/doxygen/dox/rm-template.dox
@@ -2,9 +2,9 @@
We treat documentation like code and use
Doxygen to
mark up
comments in the code or create
stand-alone pages.
Every RM entry consists of a subset of the elements listed below. Not every RM
entry warrants the full set. More is better, and we can, perhaps, distinguish
diff --git a/doxygen/examples/FileFormat.html b/doxygen/examples/FileFormat.html
index 0eb56d3746b..eff1aead0a3 100644
--- a/doxygen/examples/FileFormat.html
+++ b/doxygen/examples/FileFormat.html
@@ -36,7 +36,7 @@
Background Reading:
- HDF5 File Format Specification
- This describes the current HDF5 file format.
diff --git a/doxygen/examples/intro_SWMR.html b/doxygen/examples/intro_SWMR.html
index f4cd586f5f8..b1adb62bdb5 100644
--- a/doxygen/examples/intro_SWMR.html
+++ b/doxygen/examples/intro_SWMR.html
@@ -17,8 +17,8 @@ Documentation
HDF5 Library APIs
- H5F_START_SWMR_WRITE — Enables SWMR writing mode for a file
- H5DO_APPEND — Appends data to a dataset along a specified dimension
- H5P_SET_OBJECT_FLUSH_CB — Sets a callback function to invoke when an object flush occurs in the file
- H5P_GET_OBJECT_FLUSH_CB — Retrieves the object flush property values from the file access property list
- H5O_DISABLE_MDC_FLUSHES — Prevents metadata entries for an HDF5 object from being flushed from the metadata cache to storage
diff --git a/doxygen/hdf5doxy_layout.xml b/doxygen/hdf5doxy_layout.xml
index db25b9e2929..2f6d800d674 100644
--- a/doxygen/hdf5doxy_layout.xml
+++ b/doxygen/hdf5doxy_layout.xml
@@ -6,13 +6,13 @@
diff --git a/hl/src/H5DOpublic.h b/hl/src/H5DOpublic.h
index 887e65973e2..5054178846b 100644
--- a/hl/src/H5DOpublic.h
+++ b/hl/src/H5DOpublic.h
@@ -161,7 +161,7 @@ H5_HLDLL herr_t H5DOappend(hid_t dset_id, hid_t dxpl_id, unsigned axis, size_t e
* from one datatype to another, and the filter pipeline to write the chunk.
* Developers should have experience with these processes before
* using this function. Please see
*
* Using the Direct Chunk Write Function
* for more information.
*
diff --git a/hl/src/H5LTpublic.h b/hl/src/H5LTpublic.h
index 1ce5c81d3e2..343f5272453 100644
--- a/hl/src/H5LTpublic.h
+++ b/hl/src/H5LTpublic.h
@@ -1387,7 +1387,7 @@ H5_HLDLL herr_t H5LTget_attribute_info(hid_t loc_id, const char *obj_name, const
* Currently, only the DDL(#H5LT_DDL) is supported.
* The complete DDL definition of HDF5 datatypes can be found in
* the last chapter of the
*
* HDF5 User's Guide.
*
* \par Example
@@ -1425,7 +1425,7 @@ H5_HLDLL hid_t H5LTtext_to_dtype(const char *text, H5LT_lang_t lang_type);
* Currently only DDL (#H5LT_DDL) is supported for \p lang_type.
* The complete DDL definition of HDF5 data types can be found in
* the last chapter of the
*
* HDF5 User's Guide.
*
* \par Example
@@ -1625,7 +1625,7 @@ H5_HLDLL htri_t H5LTpath_valid(hid_t loc_id, const char *path, hbool_t check_obj
* \note **Recommended Reading:**
* \note This function is part of the file image operations feature set.
* It is highly recommended to study the guide
*
* HDF5 File Image Operations before using this feature set.\n
* See the “See Also” section below for links to other elements of
* HDF5 file image operations.
diff --git a/release_docs/INSTALL_Auto.txt b/release_docs/INSTALL_Auto.txt
index 7d246ab3191..204f204bae4 100644
--- a/release_docs/INSTALL_Auto.txt
+++ b/release_docs/INSTALL_Auto.txt
@@ -361,7 +361,7 @@ III. Full installation instructions for source distributions
source code. For additional configuration options and other details,
see "API Compatibility Macros":
- https://docs.hdfgroup.org/hdf5/develop/api-compat-macros.html
+ https://hdfgroup.github.io/hdf5/develop/api-compat-macros.html
4. Building
The library, confidence tests, and programs can be built by
diff --git a/src/H5Fmodule.h b/src/H5Fmodule.h
index 23c15c897de..2e7918669b6 100644
--- a/src/H5Fmodule.h
+++ b/src/H5Fmodule.h
@@ -235,10 +235,10 @@
* Note that the root group, indicated above by /, was automatically created when the file was created.
*
* h5dump is described on the
- *
+ *
* Tools
* page under
- * Command-line Tools.
+ * Command-line Tools.
* The HDF5 DDL grammar is described in the document \ref DDLBNF114.
*
* \subsection subsec_file_summary File Function Summaries
@@ -712,7 +712,7 @@
* If the application opens an HDF5 file without both determining the driver used to create the file
* and setting up the use of that driver, the HDF5 Library will examine the superblock and the
* driver definition block to identify the driver.
- * See the HDF5 File Format Specification
+ * See the HDF5 File Format Specification
* for detailed descriptions of the superblock and the driver definition block.
*
* \subsubsection subsubsec_file_alternate_drivers_sec2 The POSIX (aka SEC2) Driver
@@ -888,7 +888,7 @@
*
* Additional parameters may be added to these functions in the future.
*
- * @see
+ * @see
* HDF5 File Image Operations
* section for information on more advanced usage of the Memory file driver, and
* @see
+ *
* HDF5 File Image Operations.
*
*
diff --git a/src/H5Ppublic.h b/src/H5Ppublic.h
--- a/src/H5Ppublic.h
+++ b/src/H5Ppublic.h
@@ -3748,7 +3748,7 @@ H5_DLL herr_t H5Pget_file_image(hid_t fapl_id, void **buf_ptr_ptr, size_t *buf_l
* \see H5LTopen_file_image(), H5Fget_file_image(), H5Pset_file_image(),
* H5Pset_file_image_callbacks(), H5Pget_file_image_callbacks(),
* \ref H5FD_file_image_callbacks_t, \ref H5FD_file_image_op_t,
- *
+ *
* HDF5 File Image Operations.
*
* \since 1.8.9
@@ -4692,7 +4692,7 @@ H5_DLL herr_t H5Pset_fclose_degree(hid_t fapl_id, H5F_close_degree_t degree);
* This function is part of the file image
* operations feature set. It is highly recommended to study the guide
* [HDF5 File Image Operations]
- * (https://portal.hdfgroup.org/documentation/hdf5-docs/advanced_topics/file_image_ops.html
+ * (https://\DOCURL/advanced_topics/file_image_ops.html
* ) before using this feature set. See the “See Also” section below
* for links to other elements of HDF5 file image operations.
*
@@ -4704,9 +4704,9 @@ H5_DLL herr_t H5Pset_fclose_degree(hid_t fapl_id, H5F_close_degree_t degree);
* \li H5Pget_file_image_callbacks()
*
* \li [HDF5 File Image Operations]
- * (https://portal.hdfgroup.org/documentation/hdf5-docs/advanced_topics/file_image_ops.html)
+ * (https://\DOCURL/advanced_topics/file_image_ops.html)
* in [Advanced Topics in HDF5]
- * (https://portal.hdfgroup.org/documentation/hdf5-docs/advanced_topics_list.html)
+ * (https://\DOCURL/advanced_topics_list.html)
*
* \li Within H5Pset_file_image_callbacks():
* \li Callback #H5FD_file_image_callbacks_t
@@ -4729,7 +4729,7 @@ H5_DLL herr_t H5Pset_file_image(hid_t fapl_id, void *buf_ptr, size_t buf_len);
* **Recommended Reading:** This function is part of the file
* image operations feature set. It is highly recommended to study
* the guide [HDF5 File Image Operations]
- * (https://portal.hdfgroup.org/documentation/hdf5-docs/advanced_topics/file_image_ops.html
+ * (https://\DOCURL/advanced_topics/file_image_ops.html
* ) before using this feature set. See the “See Also” section below
* for links to other elements of HDF5 file image operations.
*
@@ -5205,7 +5205,7 @@ H5_DLL herr_t H5Pset_mdc_config(hid_t plist_id, H5AC_cache_config_t *config_ptr)
* current state of the logging flags.
*
* The log format is described in [Metadata Cache Logging]
- * (https://portal.hdfgroup.org/display/HDF5/Fine-tuning+the+Metadata+Cache).
+ * (https://\DSPURL/Fine-tuning+the+Metadata+Cache).
*
* \since 1.10.0
*
@@ -7116,7 +7116,7 @@ H5_DLL herr_t H5Pset_szip(hid_t plist_id, unsigned options_mask, unsigned pixels
* VDS access time. Example code for many source and virtual dataset mappings
* is available in the "Examples of Source to Virtual Dataset Mapping"
* chapter in the
- *
+ *
* RFC: HDF5 Virtual Dataset.
*
*
@@ -7189,7 +7189,7 @@ H5_DLL herr_t H5Pset_szip(hid_t plist_id, unsigned options_mask, unsigned pixels
* If that source file does not exist, the new \p src_file_name
* after stripping will be \Code{A.h5}
*
- * \see
+ * \see
* Virtual Dataset Overview
*
* \see_virtual
@@ -9040,7 +9040,7 @@ H5_DLL herr_t H5Pset_link_phase_change(hid_t plist_id, unsigned max_compact, uns
* must be created and maintained in the original style. This is HDF5's default
* behavior. If backward compatibility with pre-1.8.0 libraries is not a concern,
* greater efficiencies can be obtained with the new-format compact and indexed
- * groups. See Group
+ * groups. See Group
* implementations in HDF5 in the \ref H5G API introduction (at the bottom).\n
* H5Pset_local_heap_size_hint() is useful for tuning file size when files
* contain original-style groups with either zero members or very large
diff --git a/src/H5VLmodule.h b/src/H5VLmodule.h
index 0cca38cf1db..125e2a621c9 100644
--- a/src/H5VLmodule.h
+++ b/src/H5VLmodule.h
@@ -83,7 +83,7 @@
* to be much more common than internal implementations.
*
* A list of VOL connectors can be found here:
- *
+ *
* Registered VOL Connectors
*
* This list is incomplete and only includes the VOL connectors that have been registered with
diff --git a/src/H5module.h b/src/H5module.h
index 052fe5b30ea..6e7dcf7bdfb 100644
--- a/src/H5module.h
+++ b/src/H5module.h
@@ -49,7 +49,7 @@
* The Abstract Data Model is a conceptual model of data, data types, and data organization. The
* abstract data model is independent of storage medium or programming environment. The
* Storage Model is a standard representation for the objects of the abstract data model. The
- * HDF5 File Format Specification
+ * HDF5 File Format Specification
* defines the storage model.
*
* The Programming Model is a model of the computing environment and includes platforms from
@@ -100,7 +100,7 @@
* model, and stored in a storage medium. The stored objects include header blocks, free lists, data
* blocks, B-trees, and other objects. Each group or dataset is stored as one or more header and data
* blocks.
- * @see HDF5 File Format Specification
+ * @see HDF5 File Format Specification
* for more information on how these objects are organized. The HDF5 library can also use other
* libraries and modules such as compression.
*
@@ -125,7 +125,7 @@
* HDF5 abstract data model is up to the application developer. The application program only
* needs to deal with the library and the abstract data model. Most applications need not consider
* any details of the
- * HDF5 File Format Specification
+ * HDF5 File Format Specification
* or the details of how objects of abstract data model are translated to and from storage.
*
* \subsection subsec_data_model_abstract The Abstract Data Model
@@ -408,7 +408,7 @@
*
* \subsection subsec_data_model_storage The HDF5 Storage Model
* \subsubsection subsubsec_data_model_storage_spec The Abstract Storage Model: the HDF5 Format Specification
- * The HDF5 File Format Specification
+ * The HDF5 File Format Specification
* defines how HDF5 objects and data are mapped to a linear
* address space. The address space is assumed to be a contiguous array of bytes stored on some
* random access medium. The format defines the standard for how the objects of the abstract data