Merge branch 'develop' into IQSS/8235-auxfile_enhancements IQSS#8235
Conflicts (from PR IQSS#8192 it seems):
src/main/java/edu/harvard/iq/dataverse/api/Access.java
pdurbin committed Nov 16, 2021
2 parents 7464a1d + 1b89c59 commit dedd9f8
Showing 10 changed files with 76 additions and 121 deletions.
34 changes: 32 additions & 2 deletions doc/sphinx-guides/source/developers/version-control.rst
@@ -96,10 +96,11 @@ Look at https://github.com/IQSS/dataverse/blob/master/CONTRIBUTING.md for variou
Summary of Git commands
~~~~~~~~~~~~~~~~~~~~~~~

- This section provides sequences of Git commands for two scenarios:
+ This section provides sequences of Git commands for three scenarios:

* preparing the first request, when the IQSS Dataverse Software repository and the forked repository are identical
* creating an additional request after some time, when the IQSS Dataverse Software repository is ahead of the forked repository
+ synchronizing your feature branch with the develop branch while your pull request is in review, because develop has been updated in the meantime

In the examples we use 123-COOL-FEATURE as the name of the feature branch, and https://github.com/YOUR_NAME/dataverse.git as your forked repository's URL. In practice, modify both accordingly.

@@ -133,7 +134,7 @@ In the examples we use 123-COOL-FEATURE as the name of the feature branch, and h
git checkout develop
- # update local develop banch from https://github.com/IQSS/dataverse
+ # update local develop branch from https://github.com/IQSS/dataverse
git fetch upstream develop
git rebase upstream/develop
@@ -152,6 +153,35 @@ In the examples we use 123-COOL-FEATURE as the name of the feature branch, and h
# ... then create pull request at github.com/YOUR_NAME/dataverse
+ **3rd scenario: synchronize your branch with the develop branch**

+ .. code-block:: bash
+
+     git checkout develop
+     # update local develop branch from https://github.com/IQSS/dataverse
+     git fetch upstream develop
+     git rebase upstream/develop
+     # update remote develop branch at https://github.com/YOUR_NAME/dataverse
+     git push
+     # switch to the already existing feature branch
+     git checkout 123-COOL-FEATURE
+     # merge the changes from develop into the feature branch
+     git merge develop
+     # if the merge reports conflicts:
+     # 1. fix the conflicted files (including testing; see the conflict-marker sketch below)
+     # 2. commit the changes
+     git add <fixed files>
+     git commit
+     # with or without conflicts, update the remote feature branch at https://github.com/YOUR_NAME/dataverse
+     git push
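
When ``git merge develop`` stops on a conflict, Git writes markers into each affected file, shaped like the sketch below (the labels follow the merge direction used above). Edit the region down to the intended final code, remove the markers, then run the ``git add``/``git commit`` step:

<<<<<<< HEAD
the line(s) as they exist on 123-COOL-FEATURE
=======
the line(s) as they exist on develop
>>>>>>> develop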
How to Resolve Conflicts in Your Pull Request
---------------------------------------------

12 changes: 6 additions & 6 deletions doc/sphinx-guides/source/installation/shibboleth.rst
@@ -156,7 +156,7 @@ Configure Shibboleth
shibboleth2.xml
~~~~~~~~~~~~~~~

- ``/etc/shibboleth/shibboleth2.xml`` should look something like the :download:`sample shibboleth2.xml file <../_static/installation/files/etc/shibboleth/shibboleth2.xml>` below, but you must substitute your hostname in the ``entityID`` value. If your starting point is a ``shibboleth2.xml`` file provided by someone else, you must ensure that ``attributePrefix="AJP_"`` is added under ``ApplicationDefaults`` per the `Shibboleth wiki <https://wiki.shibboleth.net/confluence/display/SHIB2/NativeSPJavaInstall>`_ . Without the ``AJP_`` configuration in place, the required :ref:`shibboleth-attributes` will be null and users will be unable to log in.
+ ``/etc/shibboleth/shibboleth2.xml`` should look something like the :download:`sample shibboleth2.xml file <../_static/installation/files/etc/shibboleth/shibboleth2.xml>` below, but you must substitute your hostname in the ``entityID`` value. If your starting point is a ``shibboleth2.xml`` file provided by someone else, you must ensure that ``attributePrefix="AJP_"`` is added under ``ApplicationDefaults`` per the `Shibboleth wiki <https://wiki.shibboleth.net/confluence/display/SHIB2/NativeSPJavaInstall>`_. Without the ``AJP_`` configuration in place, the required :ref:`shibboleth-attributes` will be null and users will be unable to log in.

.. literalinclude:: ../_static/installation/files/etc/shibboleth/shibboleth2.xml
:language: xml
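
As a minimal sketch of the fragment in question (the ``entityID`` hostname and the ``REMOTE_USER`` choice here are assumptions; the essential part is ``attributePrefix``):

<ApplicationDefaults entityID="https://dataverse.example.edu/shibboleth"
                     REMOTE_USER="eppn"
                     attributePrefix="AJP_">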
@@ -171,11 +171,11 @@ Most Dataverse installations will probably only want to authenticate users via S
Identity Federation
^^^^^^^^^^^^^^^^^^^

- Rather than or in addition to specifying individual Identity Provider(s) you may wish to broaden the number of users who can log into your Dataverse installation by registering your Dataverse installation as a Service Provider (SP) within an identity federation. For example, in the United States, users from the `many institutions registered with the "InCommon" identity federation <https://incommon.org/federation/info/all-entities.html#IdPs>`_ that release the `"Research & Scholarship Attribute Bundle" <https://spaces.internet2.edu/display/InCFederation/Research+and+Scholarship+Attribute+Bundle>`_ will be able to log into your Dataverse installation if you register it as an `InCommon Service Provider <https://incommon.org/federation/info/all-entities.html#SPs>`_ that is part of the `Research & Scholarship (R&S) category <https://incommon.org/federation/info/all-entity-categories.html#SPs>`_.
+ Rather than or in addition to specifying individual Identity Provider(s) you may wish to broaden the number of users who can log into your Dataverse installation by registering your Dataverse installation as a Service Provider (SP) within an identity federation. For example, in the United States, users from the `many institutions registered with the "InCommon" identity federation <https://incommon.org/community-organizations/>`_ that release the `"Research & Scholarship Attribute Bundle" <https://refeds.org/research-and-scholarship>`_ will be able to log into your Dataverse installation if you register it as an `InCommon Service Provider <https://spaces.at.internet2.edu/display/federation/federation-manager-add-sp>`_ that is part of the `Research & Scholarship (R&S) category <https://refeds.org/research-and-scholarship>`_.

- The details of how to register with an identity federation are out of scope for this document, but a good starting point may be this list of identity federations across the world: http://www.protectnetwork.org/support/faq/identity-federations
+ The details of how to register with an identity federation are out of scope for this document, but a good starting point may be `this list of identity federations across the world <https://refeds.org/federations>`_.

- One of the benefits of using ``shibd`` is that it can be configured to periodically poll your identity federation for updates as new Identity Providers (IdPs) join the federation you've registered with. For the InCommon federation, the following page describes how to download and verify signed InCommon metadata every hour: https://spaces.internet2.edu/display/InCFederation/Shibboleth+Metadata+Config#ShibbolethMetadataConfig-ConfiguretheShibbolethSP . You can also see an example of this as ``maxRefreshDelay="3600"`` in the commented out section of the ``shibboleth2.xml`` file above.
+ One of the benefits of using ``shibd`` is that it can be configured to periodically poll your identity federation for updates as new Identity Providers (IdPs) join the federation you've registered with. For the InCommon federation, `this page describes how to download and verify signed InCommon metadata every hour <https://spaces.at.internet2.edu/display/federation/Download+InCommon+metadata>`_. You can also see an example of this as ``maxRefreshDelay="3600"`` in the commented out section of the ``shibboleth2.xml`` file above.
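
A sketch of such a polling setup inside ``shibboleth2.xml`` (the metadata URL, backing file name, and certificate path are assumptions; ``maxRefreshDelay`` is in seconds):

<MetadataProvider type="XML"
                  url="https://md.incommon.org/InCommon/InCommon-metadata.xml"
                  backingFilePath="incommon-metadata.xml"
                  maxRefreshDelay="3600">
    <!-- verify the federation's signature on each refresh -->
    <MetadataFilter type="Signature" certificate="incommon.pem"/>
</MetadataProvider>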

Once you've joined a federation the list of IdPs in the dropdown can be quite long! If you're curious how many are in the list you could try something like this: ``curl https://dataverse.example.edu/Shibboleth.sso/DiscoFeed | jq '.[].entityID' | wc -l``

@@ -192,7 +192,7 @@ The following attributes are required for a successful Shibboleth login:
- sn
- email

- See also https://www.incommon.org/federation/attributesummary.html and https://wiki.shibboleth.net/confluence/display/SHIB2/NativeSPAttributeAccess
+ See also https://incommon.org/federation/attributes/ and https://wiki.shibboleth.net/confluence/display/SHIB2/NativeSPAttributeAccess

attribute-map.xml
~~~~~~~~~~~~~~~~~
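
The body of this section is collapsed in this view. The file maps incoming SAML attribute names to the short ids listed above; for example, the stock ``attribute-map.xml`` shipped with the Shibboleth SP already contains the mapping for eppn:

<Attribute name="urn:oid:1.3.6.1.4.1.5923.1.1.1.6" id="eppn"/>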
@@ -214,7 +214,7 @@ SELinux is set to "enforcing" by default on RHEL/CentOS, but unfortunately Shibb
Disable SELinux
~~~~~~~~~~~~~~~

- The first and easiest option is to set ``SELINUX=permissive`` in ``/etc/selinux/config`` and run ``setenforce permissive`` or otherwise disable SELinux to get Shibboleth to work. This is apparently what the Shibboleth project expects because their wiki page at https://wiki.shibboleth.net/confluence/display/SHIB2/NativeSPSELinux says, "At the present time, we do not support the SP in conjunction with SELinux, and at minimum we know that communication between the mod_shib and shibd components will fail if it's enabled. Other problems may also occur."
+ The first and easiest option is to set ``SELINUX=permissive`` in ``/etc/selinux/config`` and run ``setenforce permissive`` or otherwise disable SELinux to get Shibboleth to work. This is apparently what the Shibboleth project expects because their `wiki page <https://wiki.shibboleth.net/confluence/display/SHIB2/NativeSPSELinux>`_ says, "At the present time, we do not support the SP in conjunction with SELinux, and at minimum we know that communication between the mod_shib and shibd components will fail if it's enabled. Other problems may also occur."

Reconfigure SELinux to Accommodate Shibboleth
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
10 changes: 9 additions & 1 deletion src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
@@ -174,6 +174,10 @@ public enum License {
@Column(nullable=true)
private String externalStatusLabel;

+ @Transient
+ private DatasetVersionDifference dvd;


public Long getId() {
return this.id;
}
@@ -396,6 +400,10 @@ public String getVersionNote() {
}

public DatasetVersionDifference getDefaultVersionDifference() {
+ // Cache to avoid recalculating the difference many times on the dataset-versions.xhtml page
+ if (dvd != null) {
+     return dvd;
+ }
// if version is deaccessioned ignore it for differences purposes
int index = 0;
int size = this.getDataset().getVersions().size();
@@ -407,7 +415,7 @@ public DatasetVersionDifference getDefaultVersionDifference() {
if ((index + 1) <= (size - 1)) {
for (DatasetVersion dvTest : this.getDataset().getVersions().subList(index + 1, size)) {
if (!dvTest.isDeaccessioned()) {
- DatasetVersionDifference dvd = new DatasetVersionDifference(this, dvTest);
+ dvd = new DatasetVersionDifference(this, dvTest);
return dvd;
}
}
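
The change above is plain per-instance memoization in a JPA entity: JSF may call a getter dozens of times while rendering a single page, so the result of the expensive version comparison is computed once and kept in a field, and the field is marked @Transient so JPA never tries to persist it. A minimal self-contained sketch of the same pattern (class and member names are illustrative, not Dataverse's):

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Transient;

@Entity
public class Report {

    @Id
    private Long id;

    // Cached result of an expensive computation; @Transient keeps JPA
    // from treating it as a persistent column.
    @Transient
    private String summary;

    public String getSummary() {
        if (summary != null) {
            return summary; // cheap on every call after the first
        }
        summary = computeSummary(); // runs at most once per entity instance
        return summary;
    }

    private String computeSummary() {
        return "stand-in for the costly work";
    }
}

One caveat of the pattern: the cache lives only as long as the entity instance, so a freshly loaded instance recomputes, which is exactly the behavior wanted for a per-request page render.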
3 changes: 3 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/LoginPage.java
@@ -7,6 +7,7 @@
import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
import edu.harvard.iq.dataverse.authorization.CredentialsAuthenticationProvider;
import edu.harvard.iq.dataverse.authorization.exceptions.AuthenticationFailedException;
+ import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinAuthenticationProvider;
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean;
import edu.harvard.iq.dataverse.authorization.providers.shib.ShibAuthenticationProvider;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
@@ -166,6 +167,7 @@ public String login() {
for ( FilledCredential fc : filledCredentialsList ) {
authReq.putCredential(fc.getCredential().getKey(), fc.getValue());
}

authReq.setIpAddress( dvRequestService.getDataverseRequest().getSourceAddress() );
try {
AuthenticatedUser r = authSvc.getUpdateAuthenticatedUser(credentialsAuthProviderId, authReq);
@@ -205,6 +207,7 @@ public String login() {
logger.log( Level.WARNING, "Error logging in: " + response.getMessage(), response.getError() );
return null;
case BREAKOUT:
+ FacesContext.getCurrentInstance().getExternalContext().getFlash().put("silentUpgradePasswd", authReq.getCredential(BuiltinAuthenticationProvider.KEY_PASSWORD));
return response.getMessage();
default:
JsfHelper.addErrorMessage("INTERNAL ERROR");
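
The added line stashes the just-entered password in the JSF flash scope so that the page navigated to on BREAKOUT (the forced password-upgrade flow) can reuse it without prompting again; flash entries survive exactly one redirect and are then discarded. A sketch of the receiving side (the consuming bean and method are assumptions; only the key comes from the line above):

import javax.faces.context.FacesContext;

public class PasswordUpgradeBean { // hypothetical consumer
    public void init() {
        String passwd = (String) FacesContext.getCurrentInstance()
                .getExternalContext().getFlash().get("silentUpgradePasswd");
        if (passwd != null) {
            // drive the silent password upgrade without re-prompting the user
        }
    }
}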
71 changes: 0 additions & 71 deletions src/main/java/edu/harvard/iq/dataverse/api/Access.java
@@ -511,77 +511,6 @@ public String tabularDatafileMetadataDDI(@PathParam("fileId") String fileId, @Q

return retValue;
}

@Path("variable/{varId}/metadata/ddi")
@GET
@Produces({ "application/xml" })

public String dataVariableMetadataDDI(@PathParam("varId") Long varId, @QueryParam("fileMetadataId") Long fileMetadataId, @QueryParam("exclude") String exclude, @QueryParam("include") String include, @Context HttpHeaders header, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ {
String retValue = "";

ByteArrayOutputStream outStream = null;
try {
outStream = new ByteArrayOutputStream();

ddiExportService.exportDataVariable(
varId,
outStream,
exclude,
include,
fileMetadataId);
} catch (Exception e) {
// For whatever reason we've failed to generate a partial
// metadata record requested. We simply return an empty string.
return retValue;
}

retValue = outStream.toString();

return retValue;
}


/*
* GET method for retrieving various auxiliary files associated with
* a tabular datafile.
*/

@Path("datafile/{fileId}/auxiliary/{origin}")
@GET
public Response listDatafileMetadataAux(@PathParam("fileId") String fileId,
@PathParam("origin") String origin,
@QueryParam("key") String apiToken,
@Context UriInfo uriInfo,
@Context HttpHeaders headers,
@Context HttpServletResponse response) throws ServiceUnavailableException {

DataFile df = findDataFileOrDieWrapper(fileId);

if (apiToken == null || apiToken.equals("")) {
apiToken = headers.getHeaderString(API_KEY_HEADER);
}

List<AuxiliaryFile> auxFileList = auxiliaryFileService.listAuxiliaryFiles(df, origin);

if (auxFileList == null || auxFileList.isEmpty()) {
throw new NotFoundException("No Auxiliary files exist for datafile " + fileId + " and the specified origin");
}
boolean isAccessAllowed = isAccessAuthorized(df, apiToken);
JsonArrayBuilder jab = Json.createArrayBuilder();
auxFileList.forEach(auxFile -> {
if (isAccessAllowed || auxFile.getIsPublic()) {
JsonObjectBuilder job = Json.createObjectBuilder();
job.add("formatTag", auxFile.getFormatTag());
job.add("formatVersion", auxFile.getFormatVersion());
job.add("fileSize", auxFile.getFileSize());
job.add("contentType", auxFile.getContentType());
job.add("isPublic", auxFile.getIsPublic());
job.add("type", auxFile.getType());
jab.add(job);
}
});
return ok(jab);
}
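
The two methods removed here conflicted with what came in from develop (the commit message points at PR IQSS#8192). For orientation, the deleted listing endpoint was invoked along these lines (hostname, file id, and origin are placeholders; per the code above, the token may be passed either as the key query parameter or in the X-Dataverse-key header):

curl -H "X-Dataverse-key: $API_TOKEN" \
  "https://dataverse.example.edu/api/access/datafile/42/auxiliary/someOrigin"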

/*
* GET method for retrieving various auxiliary files associated with
29 changes: 0 additions & 29 deletions src/main/java/edu/harvard/iq/dataverse/api/Meta.java
@@ -68,35 +68,6 @@ public class Meta {
@EJB
DatasetServiceBean datasetService;

@Deprecated
@Path("variable/{varId}")
@GET
@Produces({ "application/xml" })

public String variable(@PathParam("varId") Long varId, @QueryParam("fileMetadataId") Long fileMetadataId, @QueryParam("exclude") String exclude, @QueryParam("include") String include, @Context HttpHeaders header, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ {
String retValue = "";

ByteArrayOutputStream outStream = null;
try {
outStream = new ByteArrayOutputStream();

ddiExportService.exportDataVariable(
varId,
outStream,
exclude,
include,
fileMetadataId);
} catch (Exception e) {
// For whatever reason we've failed to generate a partial
// metadata record requested. We simply return an empty string.
return retValue;
}

retValue = outStream.toString();

return retValue;
}

// Because this API is deprecated, we prefer to continue letting it operate on fileId rather than adding support for persistent identifiers.
@Deprecated
@Path("datafile/{fileId}")
7 changes: 6 additions & 1 deletion src/main/java/edu/harvard/iq/dataverse/engine/command/impl/RevokeRoleCommand.java
@@ -2,6 +2,7 @@

import edu.harvard.iq.dataverse.DataFile;
import edu.harvard.iq.dataverse.Dataverse;
+ import edu.harvard.iq.dataverse.DvObject;
import edu.harvard.iq.dataverse.RoleAssignment;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.engine.command.AbstractVoidCommand;
@@ -38,5 +39,9 @@ public Map<String, Set<Permission>> getRequiredPermissions() {
return Collections.singletonMap("",
toBeRevoked.getDefinitionPoint() instanceof Dataverse ? Collections.singleton(Permission.ManageDataversePermissions)
: Collections.singleton(Permission.ManageDatasetPermissions));
}
- }
+
+ @Override public String describe() {
+     return toBeRevoked.getAssigneeIdentifier() + " has had the role: " + toBeRevoked.getRole() + " REVOKED on " + toBeRevoked.getDefinitionPoint().accept(DvObject.NameIdPrinter);
+ }
+ }
