Merge branch 'develop' into 9317-delete-saved-search
luddaniel committed Jan 31, 2024
2 parents a50f963 + 97508e6 commit 98f7b53
Showing 54 changed files with 837 additions and 400 deletions.
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
Original file line number Diff line number Diff line change
@@ -56,12 +56,12 @@ If you are interested in working on the main Dataverse code, great! Before you s

Please read http://guides.dataverse.org/en/latest/developers/version-control.html to understand how we use the "git flow" model of development and how we will encourage you to create a GitHub issue (if it doesn't exist already) to associate with your pull request. That page also includes tips on making a pull request.

After making your pull request, your goal should be to help it advance through our kanban board at https://github.com/orgs/IQSS/projects/34 . If no one has moved your pull request to the code review column in a timely manner, please reach out. Note that once a pull request is created for an issue, we'll remove the issue from the board so that we only track one card (the pull request).

Thanks for your contribution!

[dataverse-community Google Group]: https://groups.google.com/group/dataverse-community
[Community Call]: https://dataverse.org/community-calls
[dataverse-dev Google Group]: https://groups.google.com/group/dataverse-dev
[community contributors]: https://docs.google.com/spreadsheets/d/1o9DD-MQ0WkrYaEFTD5rF_NtyL8aUISgURsAXSL7Budk/edit?usp=sharing
[dev efforts]: https://github.com/orgs/IQSS/projects/34/views/6
@@ -0,0 +1,2 @@
The response for the getVersionFiles endpoint (/api/datasets/{id}/versions/{versionId}/files) has been modified to include the total count of records available (totalCount:x).
This aids pagination by allowing the caller to know how many pages can be iterated through. The existing API to return the count (getVersionFileCounts) is still available.
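As a minimal sketch of how a client might use the new total count together with the endpoint's `limit` and `offset` parameters, one can precompute the offsets to request (the helper names below are illustrative, not part of the API):

```python
import math

def page_count(total_count: int, limit: int) -> int:
    """Number of pages needed to iterate a paginated file listing."""
    return math.ceil(total_count / limit) if limit > 0 else 0

def offsets(total_count: int, limit: int) -> list[int]:
    """Values to pass as the `offset` query parameter, one per page."""
    return [page * limit for page in range(page_count(total_count, limit))]
```

For example, with `totalCount` 101 and a `limit` of 50, `offsets(101, 50)` yields `[0, 50, 100]`.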
5 changes: 5 additions & 0 deletions doc/release-notes/10216-metadatablocks.md
@@ -0,0 +1,5 @@
The API endpoint `/api/metadatablocks/{block_id}` has been extended to include the following fields:

- `isRequired`: Whether or not this field is required
- `displayOrder`: The display order of the field in create/edit forms
- `typeClass`: The type class of this field ("controlledVocabulary", "compound", or "primitive")
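As a sketch, a caller might read the new fields like this; the response envelope (`{"status": ..., "data": ...}`) and the sample field content below are assumptions for illustration, not captured from a real response:

```python
import json

# Hypothetical response body for /api/metadatablocks/{block_id}; the
# envelope shape mirrors other Dataverse native API responses.
response_body = json.dumps({
    "status": "OK",
    "data": {
        "name": "citation",
        "fields": {
            "title": {
                "name": "title",
                "isRequired": True,     # new field
                "displayOrder": 0,      # new field
                "typeClass": "primitive",  # new field
            }
        },
    },
})

fields = json.loads(response_body)["data"]["fields"]
# Collect the names of fields that must be filled in on create/edit forms.
required = sorted(name for name, f in fields.items() if f.get("isRequired"))
print(required)
```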
4 changes: 4 additions & 0 deletions doc/release-notes/9275-harvest-invalid-query-params.md
@@ -0,0 +1,4 @@
OAI-PMH error handling has been improved to display a machine-readable error in XML rather than a 500 error with no further information.

- /oai?foo=bar will show "No argument 'verb' found"
- /oai?verb=foo&verb=bar will show "Verb must be singular, given: '[foo, bar]'"
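A sketch of consuming such a machine-readable error; the XML below is a hypothetical example (not captured output), and the `badVerb` error code is an assumption based on the OAI-PMH specification's code for a missing or repeated verb:

```python
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

# Hypothetical error document of the kind /oai?foo=bar might return;
# the message text is taken from the example above.
doc = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <error code="badVerb">No argument 'verb' found</error>
</OAI-PMH>"""

root = ET.fromstring(doc)
error = root.find(f"{OAI_NS}error")
print(error.get("code"), "-", error.text)
```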
1 change: 1 addition & 0 deletions doc/release-notes/9728-universe-variablemetadata.md
@@ -0,0 +1 @@
The universe field in the variablemetadata table was changed from varchar(255) to text. The change was made to support longer strings in the "universe" metadata field, in line with the rest of the text fields in the variablemetadata table.
3 changes: 3 additions & 0 deletions doc/release-notes/9920-postgres16.md
@@ -0,0 +1,3 @@
This release adds install script support for the new permissions model in Postgres versions 15+ and bumps Flyway to support Postgres 16.

Postgres 13 remains the version used with automated testing.
@@ -0,0 +1 @@
Listing collection/dataverse role assignments via API still requires ManageDataversePermissions, but listing dataset role assignments via API now requires only ManageDatasetPermissions.
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/admin/integrations.rst
@@ -245,7 +245,7 @@ Future Integrations

The `Dataverse Project Roadmap <https://www.iq.harvard.edu/roadmap-dataverse-project>`_ is a good place to see integrations that the core Dataverse Project team is working on.

If you have an idea for an integration, please ask on the `dataverse-community <https://groups.google.com/forum/#!forum/dataverse-community>`_ mailing list if someone is already working on it.

Many integrations take the form of "external tools". See the :doc:`external-tools` section for details. External tool makers should check out the :doc:`/api/external-tools` section of the API Guide.

8 changes: 7 additions & 1 deletion doc/sphinx-guides/source/api/changelog.rst
@@ -7,6 +7,12 @@ This API changelog is experimental and we would love feedback on its usefulness.
:local:
:depth: 1

v6.2
----

- **/api/datasets/{id}/versions/{versionId}**: The ``includeFiles`` parameter has been renamed to ``excludeFiles``. The default behavior remains the same: files are included. When ``excludeFiles`` is set to true, the files will be excluded. A bug that caused the API to return a deaccessioned dataset only if the user had edit privileges has been fixed.
- **/api/datasets/{id}/versions**: The ``includeFiles`` parameter has been renamed to ``excludeFiles``. The default behavior remains the same: files are included. When ``excludeFiles`` is set to true, the files will be excluded.
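A sketch of composing a request URL under the renamed parameter (the server URL and dataset id below are placeholders, not values from this changelog):

```python
from urllib.parse import urlencode

server_url = "https://demo.dataverse.org"  # placeholder
dataset_id = 24                            # placeholder

def versions_url(exclude_files: bool = False) -> str:
    """Build a v6.2-style dataset versions URL.

    `excludeFiles` replaces the pre-6.2 `includeFiles` parameter;
    omitting it keeps the default behavior of including files.
    """
    url = f"{server_url}/api/datasets/{dataset_id}/versions"
    if exclude_files:
        url += "?" + urlencode({"excludeFiles": "true"})
    return url

print(versions_url(exclude_files=True))
```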

v6.1
----

@@ -15,4 +21,4 @@ v6.1
v6.0
----

- **/api/access/datafile**: When a null or invalid API token is provided to download a public (non-restricted) file with this API call, it will result in a ``401`` error response. Previously, the download was allowed (``200`` response). Please note that we noticed this change sometime between 5.9 and 6.0. If you can help us pinpoint the exact version (or commit!), please get in touch. See :doc:`dataaccess`.
10 changes: 7 additions & 3 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -1066,7 +1066,11 @@ The fully expanded example above (without environment variables) looks like this
curl "https://demo.dataverse.org/api/datasets/24/versions/1.0/files"
This endpoint supports optional pagination through the ``limit`` and ``offset`` query parameters.

To aid in pagination, the JSON response also includes the total number of rows available (``totalCount``).

Usage example:

.. code-block:: bash
@@ -1568,8 +1572,8 @@ The fully expanded example above (without environment variables) looks like this
Set Citation Date Field Type for a Dataset
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Sets the dataset citation date field type for a given dataset. ``:publicationDate`` is the default.
Note that the dataset citation date field type must be a date field. This change applies to all versions of the dataset that have an entry for the new date field. It also applies to all file citations in the dataset.

.. code-block:: bash
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/api/search.rst
@@ -25,7 +25,7 @@ Parameters
Name Type Description
=============== ======= ===========
q string The search term or terms. Using "title:data" will search only the "title" field. "*" can be used as a wildcard either alone or adjacent to a term (i.e. "bird*"). For example, https://demo.dataverse.org/api/search?q=title:data . For a list of fields to search, please see https://github.com/IQSS/dataverse/issues/2558 (for now).
type string Can be either "dataverse", "dataset", or "file". Multiple "type" parameters can be used to include multiple types (i.e. ``type=dataset&type=file``). If omitted, all types will be returned. For example, https://demo.dataverse.org/api/search?q=*&type=dataset
subtree string The identifier of the Dataverse collection to which the search should be narrowed. The subtree of this Dataverse collection and all its children will be searched. Multiple "subtree" parameters can be used to include multiple Dataverse collections. For example, https://demo.dataverse.org/api/search?q=data&subtree=birds&subtree=cats .
sort string The sort field. Supported values include "name" and "date". See example under "order".
order string The order in which to sort. Can either be "asc" or "desc". For example, https://demo.dataverse.org/api/search?q=data&sort=name&order=asc
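As a sketch, repeated ``type`` parameters can be composed like this (demo.dataverse.org is the demo server already used in the examples above):

```python
from urllib.parse import urlencode

# A list of (key, value) pairs lets urlencode repeat the "type" key,
# matching type=dataset&type=file in the parameter table above.
params = [("q", "*"), ("type", "dataset"), ("type", "file")]
url = "https://demo.dataverse.org/api/search?" + urlencode(params)
print(url)
```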
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/developers/documentation.rst
@@ -18,7 +18,7 @@ If you find a typo or a small error in the documentation you can fix it using Gi
- Under the **Write** tab, delete the long welcome message and write a few words about what you fixed.
- Click **Create Pull Request**.

That's it! Thank you for your contribution! Your pull request will be added manually to the main Dataverse Project board at https://github.com/orgs/IQSS/projects/34 and will go through code review and QA before it is merged into the "develop" branch. Along the way, developers might suggest changes or make them on your behalf. Once your pull request has been merged you will be listed as a contributor at https://github.com/IQSS/dataverse/graphs/contributors

Please see https://github.com/IQSS/dataverse/pull/5857 for an example of a quick fix that was merged (the "Files changed" tab shows how a typo was fixed).

16 changes: 10 additions & 6 deletions doc/sphinx-guides/source/developers/globus-api.rst
@@ -1,7 +1,11 @@
Globus Transfer API
===================

.. contents:: |toctitle|
:local:

The Globus API addresses three use cases:

* Transfer to a Dataverse-managed Globus endpoint (File-based or using the Globus S3 Connector)
* Reference of files that will remain in a remote Globus endpoint
* Transfer from a Dataverse-managed Globus endpoint
@@ -68,7 +72,7 @@ The response includes the id for the Globus endpoint to use along with several s

The getDatasetMetadata and getFileListing URLs are just signed versions of the standard Dataset metadata and file listing API calls. The other two are Globus specific.

If called for a dataset using a store that is configured with a remote Globus endpoint, the return response is similar, but
the "managed" parameter will be false, the "endpoint" parameter is replaced with a JSON array of "referenceEndpointsWithPaths", and the
requestGlobusTransferPaths and addGlobusFiles URLs are replaced with ones for requestGlobusReferencePaths and addFiles. All of these calls are
described further below.
@@ -87,7 +91,7 @@ The returned response includes the same getDatasetMetadata and getFileListing UR
Performing an Upload/Transfer In
--------------------------------

The information from the API call above can be used to provide a user with information about the dataset and to prepare to transfer (managed=true) or to reference files (managed=false).

Once the user identifies which files are to be added, the requestGlobusTransferPaths or requestGlobusReferencePaths URLs can be called. These both reference the same API call but must be used with different entries in the JSON body sent:

@@ -98,7 +102,7 @@ Once the user identifies which files are to be added, the requestGlobusTransferP
export PERSISTENT_IDENTIFIER=doi:10.5072/FK27U7YBV
export LOCALE=en-US
curl -H "X-Dataverse-key:$API_TOKEN" -H "Content-type:application/json" -X POST "$SERVER_URL/api/datasets/:persistentId/requestGlobusUploadPaths"
Note that when using the dataverse-globus app or the return from the previous call, the URL for this call will be signed and no API_TOKEN is needed.

@@ -153,7 +157,7 @@ In the remote/reference case, the map is from the initially supplied endpoint/pa
Adding Files to the Dataset
---------------------------

In the managed case, you must initiate a Globus transfer and take note of its task identifier. As in the JSON example below, you will pass it as ``taskIdentifier`` along with details about the files you are transferring:

.. code-block:: bash
@@ -164,9 +168,9 @@ In the managed case, once a Globus transfer has been initiated a final API call
"files": [{"description":"My description.","directoryLabel":"data/subdir1","categories":["Data"], "restrict":"false", "storageIdentifier":"globusm://18b3972213f-f6b5c2221423", "fileName":"file1.txt", "mimeType":"text/plain", "checksum": {"@type": "MD5", "@value": "1234"}}, \
{"description":"My description.","directoryLabel":"data/subdir1","categories":["Data"], "restrict":"false", "storageIdentifier":"globusm://18b39722140-50eb7d3c5ece", "fileName":"file2.txt", "mimeType":"text/plain", "checksum": {"@type": "MD5", "@value": "2345"}}]}'
curl -H "X-Dataverse-key:$API_TOKEN" -H "Content-type:multipart/form-data" -X POST "$SERVER_URL/api/datasets/:persistentId/addGlobusFiles" -F "jsonData=$JSON_DATA"
Note that the mimetype is multipart/form-data, matching the /addFiles API call. Also note that the API_TOKEN is not needed when using a signed URL.

With this information, Dataverse will begin to monitor the transfer and when it completes, will add all files for which the transfer succeeded.
As the transfer can take significant time and the API call is asynchronous, the only way to determine if the transfer succeeded via API is to use the standard calls to check the dataset lock state and contents.
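One way a client might interpret the dataset's lock list when polling is sketched below; the ``GlobusUpload`` lock-type name and the lock JSON shape are assumptions for illustration, not guaranteed by the API:

```python
# Sketch: deciding whether an asynchronous Globus upload is still in
# progress from the lock list a standard dataset-locks call returns.
def globus_upload_pending(locks: list[dict]) -> bool:
    """True while a (hypothetical) GlobusUpload lock is present."""
    return any(lock.get("lockType") == "GlobusUpload" for lock in locks)

print(globus_upload_pending([{"lockType": "GlobusUpload"}]))  # True
print(globus_upload_pending([]))                              # False
```

Once no such lock remains, the dataset contents can be listed to confirm which files were added.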
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/developers/intro.rst
@@ -40,7 +40,7 @@ For the Dataverse Software development roadmap, please see https://www.iq.harvar
Kanban Board
------------

You can get a sense of what's currently in flight (in dev, in QA, etc.) by looking at https://github.com/orgs/IQSS/projects/34

Issue Tracker
-------------
10 changes: 6 additions & 4 deletions doc/sphinx-guides/source/developers/making-releases.rst
@@ -14,16 +14,18 @@ See :doc:`version-control` for background on our branching strategy.

The steps below describe making both regular releases and hotfix releases.

.. _write-release-notes:

Write Release Notes
-------------------

Developers express the need for an addition to release notes by creating a "release note snippet" in ``/doc/release-notes`` containing the name of the issue they're working on. The name of the branch could be used for the filename with ".md" appended (release notes are written in Markdown) such as ``5053-apis-custom-homepage.md``. See :ref:`writing-release-note-snippets` for how this is described for contributors.

The task at or near release time is to collect these snippets into a single file.

- Create an issue in GitHub to track the work of creating release notes for the upcoming release.
- Create a branch, add a .md file for the release (ex. 5.10.1 Release Notes) in ``/doc/release-notes`` and write the release notes, making sure to pull content from the release note snippets mentioned above.
- Delete the release note snippets as the content is added to the main release notes file.
- Include instructions to describe the steps required to upgrade the application from the previous version. These must be customized for release numbers and special circumstances such as changes to metadata blocks and infrastructure.
- Take the release notes .md through the regular Code Review and QA process.
