Merge pull request #47 from IQSS/develop
Update
lubitchv authored Mar 18, 2020
2 parents 02c3e38 + c574792 commit 6844b6b
Showing 133 changed files with 4,418 additions and 1,317 deletions.
76 changes: 76 additions & 0 deletions CODE_OF_CONDUCT.md
@@ -0,0 +1,76 @@
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at support at dataverse dot org. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq
1 change: 1 addition & 0 deletions README.md
@@ -18,6 +18,7 @@ Dataverse is a trademark of President and Fellows of Harvard College and is regi
[![Dataverse Project logo](src/main/webapp/resources/images/dataverseproject_logo.jpg?raw=true "Dataverse Project")](http://dataverse.org)

[![API Test Status](https://jenkins.dataverse.org/buildStatus/icon?job=IQSS-dataverse-develop&subject=API%20Test%20Status)](https://jenkins.dataverse.org/job/IQSS-dataverse-develop/)
[![API Test Coverage](https://img.shields.io/jenkins/coverage/jacoco?jobUrl=https%3A%2F%2Fjenkins.dataverse.org%2Fjob%2FIQSS-dataverse-develop&label=API%20Test%20Coverage)](https://jenkins.dataverse.org/job/IQSS-dataverse-develop/)
[![Unit Test Status](https://img.shields.io/travis/IQSS/dataverse?label=Unit%20Test%20Status)](https://travis-ci.org/IQSS/dataverse)
[![Unit Test Coverage](https://img.shields.io/coveralls/github/IQSS/dataverse?label=Unit%20Test%20Coverage)](https://coveralls.io/github/IQSS/dataverse?branch=develop)

4 changes: 2 additions & 2 deletions conf/docker-aio/0prep_deps.sh
@@ -17,12 +17,12 @@ if [ ! -e dv/deps/glassfish4dv.tgz ]; then
# assuming that folks usually have /tmp auto-clean as needed
fi

if [ ! -e dv/deps/solr-7.3.1dv.tgz ]; then
if [ ! -e dv/deps/solr-7.7.2dv.tgz ]; then
echo "solr dependency prep"
# schema changes *should* be the only ones...
cd dv/deps/
#wget https://archive.apache.org/dist/lucene/solr/7.3.0/solr-7.3.0.tgz -O solr-7.3.0dv.tgz
wget https://archive.apache.org/dist/lucene/solr/7.3.1/solr-7.3.1.tgz -O solr-7.3.1dv.tgz
wget https://archive.apache.org/dist/lucene/solr/7.7.2/solr-7.7.2.tgz -O solr-7.7.2dv.tgz
cd ../../
fi

6 changes: 3 additions & 3 deletions conf/docker-aio/1prep.sh
@@ -4,9 +4,9 @@
# this was based off the phoenix deployment; and is likely uglier and bulkier than necessary in a perfect world

mkdir -p testdata/doc/sphinx-guides/source/_static/util/
cp ../solr/7.3.1/schema*.xml testdata/
cp ../solr/7.3.1/solrconfig.xml testdata/
cp ../solr/7.3.1/updateSchemaMDB.sh testdata/
cp ../solr/7.7.2/schema*.xml testdata/
cp ../solr/7.7.2/solrconfig.xml testdata/
cp ../solr/7.7.2/updateSchemaMDB.sh testdata/
cp ../jhove/jhove.conf testdata/
cp ../jhove/jhoveConfig.xsd testdata/
cd ../../
8 changes: 4 additions & 4 deletions conf/docker-aio/c7.dockerfile
@@ -17,7 +17,7 @@ COPY testdata/sushi_sample_logs.json /tmp/
COPY disableipv6.conf /etc/sysctl.d/
RUN rm /etc/httpd/conf/*
COPY httpd.conf /etc/httpd/conf
RUN cd /opt ; tar zxf /tmp/dv/deps/solr-7.3.1dv.tgz
RUN cd /opt ; tar zxf /tmp/dv/deps/solr-7.7.2dv.tgz
RUN cd /opt ; tar zxf /tmp/dv/deps/glassfish4dv.tgz

# this copy of domain.xml is the result of running `asadmin set server.monitoring-service.module-monitoring-levels.jvm=LOW` on a default glassfish installation (aka - enable the glassfish REST monitor endpoint for the jvm)
@@ -28,9 +28,9 @@ RUN sudo -u postgres /usr/pgsql-9.6/bin/initdb -D /var/lib/pgsql/data

# copy configuration related files
RUN cp /tmp/dv/pg_hba.conf /var/lib/pgsql/data/
RUN cp -r /opt/solr-7.3.1/server/solr/configsets/_default /opt/solr-7.3.1/server/solr/collection1
RUN cp /tmp/dv/schema*.xml /opt/solr-7.3.1/server/solr/collection1/conf/
RUN cp /tmp/dv/solrconfig.xml /opt/solr-7.3.1/server/solr/collection1/conf/solrconfig.xml
RUN cp -r /opt/solr-7.7.2/server/solr/configsets/_default /opt/solr-7.7.2/server/solr/collection1
RUN cp /tmp/dv/schema*.xml /opt/solr-7.7.2/server/solr/collection1/conf/
RUN cp /tmp/dv/solrconfig.xml /opt/solr-7.7.2/server/solr/collection1/conf/solrconfig.xml

# skipping glassfish user and solr user (run both as root)

2 changes: 1 addition & 1 deletion conf/docker-aio/entrypoint.bash
@@ -2,7 +2,7 @@
export LANG=en_US.UTF-8
#sudo -u postgres /usr/bin/postgres -D /var/lib/pgsql/data &
sudo -u postgres /usr/pgsql-9.6/bin/postgres -D /var/lib/pgsql/data &
cd /opt/solr-7.3.1/
cd /opt/solr-7.7.2/
# TODO: Run Solr as non-root and remove "-force".
bin/solr start -force
bin/solr create_core -c collection1 -d server/solr/collection1/conf -force
2 changes: 1 addition & 1 deletion conf/docker-aio/testscripts/install
@@ -15,7 +15,7 @@ export SMTP_SERVER=localhost
export MEM_HEAP_SIZE=2048
export GLASSFISH_DOMAIN=domain1
cd scripts/installer
cp pgdriver/postgresql-42.2.2.jar $GLASSFISH_ROOT/glassfish/lib
cp pgdriver/postgresql-42.2.9.jar $GLASSFISH_ROOT/glassfish/lib
#cp ../../conf/jhove/jhove.conf $GLASSFISH_ROOT/glassfish/domains/$GLASSFISH_DOMAIN/config/jhove.conf
cp /opt/dv/testdata/jhove.conf $GLASSFISH_ROOT/glassfish/domains/$GLASSFISH_DOMAIN/config/jhove.conf
cp /opt/dv/testdata/jhoveConfig.xsd $GLASSFISH_ROOT/glassfish/domains/$GLASSFISH_DOMAIN/config/jhoveConfig.xsd
2 changes: 1 addition & 1 deletion conf/docker/dataverse-glassfish/Dockerfile
@@ -70,7 +70,7 @@ RUN /tmp/dvinstall/glassfish-setup.sh
###glassfish-setup will handle everything in Dockerbuild

##install jdbc driver
RUN cp /tmp/dvinstall/pgdriver/postgresql-42.2.2.jar /usr/local/glassfish4/glassfish/domains/domain1/lib
RUN cp /tmp/dvinstall/pgdriver/postgresql-42.2.9.jar /usr/local/glassfish4/glassfish/domains/domain1/lib

# Customized persistence xml to avoid database recreation
#RUN mkdir -p /tmp/WEB-INF/classes/META-INF/
File renamed without changes.
2 changes: 1 addition & 1 deletion conf/solr/7.3.1/schema.xml → conf/solr/7.7.2/schema.xml
@@ -293,7 +293,7 @@
<!-- Dataverse copyField from http://localhost:8080/api/admin/index/solr/schema -->
<xi:include href="schema_dv_mdb_copies.xml" xmlns:xi="http://www.w3.org/2001/XInclude" />

<!-- End: Dataverse Specific -->
<!-- End: Dataverse-specific -->

<!-- This can be enabled, in case the client does not know what fields may be searched. It isn't enabled by default
because it's very expensive to index everything twice. -->
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
2 changes: 1 addition & 1 deletion doc/release-notes/4.19-release-notes.md
@@ -83,7 +83,7 @@ Additional fields are now available via the Search API, mostly related to inform

## Complete List of Changes

For the complete list of code changes in this release, see the <a href="https://github.com/IQSS/dataverse/milestone/86?closed=1">4.19 milestone</a> in GitHub.
For the complete list of code changes in this release, see the <a href="https://github.com/IQSS/dataverse/milestone/87?closed=1">4.19 milestone</a> in GitHub.

For help with upgrading, installing, or general questions please post to the <a href="https://groups.google.com/forum/#!forum/dataverse-community">Dataverse Google Group</a> or email [email protected].

4 changes: 4 additions & 0 deletions doc/release-notes/6262-release-notes.md
@@ -0,0 +1,4 @@
### Dataverse Linking Fix

The fix implemented for ticket 6262 displays the datasets contained in a linked dataverse in the linking dataverse. Going forward, this will happen automatically whenever a dataverse is linked.
For datasets belonging to previously linked dataverses to be displayed properly in the linking dataverse, you must re-index those dataverses. It is probably easiest to simply re-index everything, as shown below.
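A minimal sketch of that full re-index via the admin API, assuming a stock installation reachable on localhost (the admin API is typically restricted to localhost):

```
# Start an asynchronous re-index of all content
curl http://localhost:8080/api/admin/index
```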
36 changes: 36 additions & 0 deletions doc/release-notes/6485-multiple-stores.md
@@ -0,0 +1,36 @@
# Multiple Store Support
Dataverse can now be configured to store files in more than one place at the same time (multiple file, s3, and/or swift stores).

General information about this capability can be found in the <a href="http://guides.dataverse.org/en/latest/installation/config.html">Configuration Guide</a> - File Storage section.

**Upgrade Information:**

**Existing installations will need to make configuration changes to adopt this version, regardless of whether additional stores are to be added or not.**

Multistore support requires that each store be assigned a label, id, and type - see the documentation for a more complete explanation. For an existing store, the recommended upgrade path is to assign the store id based on its type, i.e. a 'file' store would get the id 'file', an 's3' store would get the id 's3'.

With this choice, no manual changes to datafile 'storageidentifier' entries are needed in the database. (If you do not name your existing store using this convention, you will need to edit the database to maintain access to existing files!).

The following commands, which change the Glassfish JVM options, will adapt an existing file or s3 store for this upgrade.
For a file store:

./asadmin create-jvm-options "\-Ddataverse.files.file.type=file"
./asadmin create-jvm-options "\-Ddataverse.files.file.label=file"
./asadmin create-jvm-options "\-Ddataverse.files.file.directory=<your directory>"

For an s3 store:

./asadmin create-jvm-options "\-Ddataverse.files.s3.type=s3"
./asadmin create-jvm-options "\-Ddataverse.files.s3.label=s3"
./asadmin delete-jvm-options "-Ddataverse.files.s3-bucket-name=<your_bucket_name>"
./asadmin create-jvm-options "-Ddataverse.files.s3.bucket-name=<your_bucket_name>"

Any additional S3 options you have set will need to be replaced as well, following the pattern in the last two lines above: delete the option that includes a '-' after 's3' and create the same option with the '-' replaced by a '.', using the same value you currently have configured.
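For example, a sketch of that rename for one such option; `custom-endpoint-url` is used here purely as an illustration, so substitute whichever s3 options and values your installation actually has:

```
# Remove the old dashed form of the option...
./asadmin delete-jvm-options "-Ddataverse.files.s3-custom-endpoint-url=<your_endpoint_url>"
# ...and recreate it with the '-' after 's3' replaced by a '.'
./asadmin create-jvm-options "-Ddataverse.files.s3.custom-endpoint-url=<your_endpoint_url>"
```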

Once these options are set, restarting the glassfish service is all that is needed to complete the change.

Note that the "\-Ddataverse.files.directory", if defined, continues to control where temporary files are stored (in the /temp subdir of that directory), independent of the location of any 'file' store defined above.
22 changes: 22 additions & 0 deletions doc/release-notes/6510-duplicate-datafiles-and-datatables.md
@@ -0,0 +1,22 @@
We recently discovered a *potential* data integrity issue in
Dataverse databases. It manifests itself in two ways: as duplicate DataFile
objects created for the same uploaded file (https://github.com/IQSS/dataverse/issues/6522), and as duplicate
DataTable (tabular metadata) objects linked to the same
DataFile (https://github.com/IQSS/dataverse/issues/6510). This issue impacted approximately 0.03% of datasets in Harvard's Dataverse.

To see if any datasets in your installation have been impacted by this data integrity issue, we've provided a diagnostic script here:

https://github.com/IQSS/dataverse/raw/develop/scripts/issues/6510/check_datafiles_6522_6510.sh

The script relies on the PostgreSQL utility psql to access the
database. You will need to edit the credentials at the top of the script
to match your database configuration.
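A minimal sketch of fetching and running the script, assuming wget and psql are available on a machine that can reach your database:

```
wget https://github.com/IQSS/dataverse/raw/develop/scripts/issues/6510/check_datafiles_6522_6510.sh
chmod +x check_datafiles_6522_6510.sh
# Edit the database credentials at the top of the script to match your configuration, then run it:
./check_datafiles_6522_6510.sh
```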

If neither of the two issues is present in your database, you will see
a message "... no duplicate DataFile objects in your database" and "no
tabular files affected by this issue in your database".

If either or both kinds of duplicates are detected, the script will
provide further instructions. We will need you to send us the output it
produces, and we will then assist you in resolving the issues in your
database.
10 changes: 10 additions & 0 deletions doc/release-notes/6534-new-guestbook-columns.md
@@ -0,0 +1,10 @@
Users of downloaded guestbooks should note that two new columns have been added:

- Dataset PID
- File PID

If you are expecting columns in the CSV file to be in a particular order, you will need to make adjustments. Please see below for details:

Old columns: Guestbook, Dataset, Date, Type, File Name, File Id, User Name, Email, Institution, Position, Custom Questions

New columns: Guestbook, Dataset, Dataset PID, Date, Type, File Name, File Id, File PID, User Name, Email, Institution, Position, Custom Questions
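If your tooling selects columns by position, one option is to select them by header name instead. A small sketch using csvkit (assuming it is installed and the guestbook was downloaded as guestbook.csv):

```
# Extract columns by name rather than position, so new columns don't break the extraction
csvcut -c "Dataset PID,File Name,File PID" guestbook.csv
```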
1 change: 1 addition & 0 deletions doc/release-notes/6570-search-api-contact-affiliation.md
@@ -0,0 +1 @@
As reported in https://github.com/IQSS/dataverse/issues/6570, the affiliation for dataset contacts has been wrapped in parentheses in the JSON output from the Search API. These parentheses have now been removed. This is a backward-incompatible change, but it is hoped that it won't cause any problems for integrations and API users.
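One way to spot-check the new output, assuming a locally reachable installation and jq; the query here is only illustrative:

```
# Fetch one dataset from the Search API and inspect its JSON, including the contact affiliation fields
curl -s "http://localhost:8080/api/search?q=*&type=dataset&per_page=1" | jq '.data.items[0]'
```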
4 changes: 4 additions & 0 deletions doc/release-notes/6590-citation-reload.md
@@ -0,0 +1,4 @@
5. Update Citation Metadata Block

- `wget https://github.com/IQSS/dataverse/releases/download/$RELEASE_NUMBER/citation.tsv`
- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"`
9 changes: 9 additions & 0 deletions doc/release-notes/6599-update-solr-772.md
@@ -0,0 +1,9 @@
With this release we upgrade to the latest available stable release in the
Solr 7.x branch.

We recommend a fresh installation of Solr 7.7.2 (the index will be empty)
followed by an "index all".

Until the "index all" is complete, Dataverse will appear to be empty because
the search results come from Solr. As indexing progresses, more and more
results will appear.
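A rough sketch of that fresh installation, pieced together from commands elsewhere in this changeset; the /usr/local/solr location matches the bundled init script, /path/to/dvinstall/ stands in for wherever your Dataverse schema and solrconfig copies live, and Solr is assumed to run as a dedicated non-root user:

```
wget https://archive.apache.org/dist/lucene/solr/7.7.2/solr-7.7.2.tgz
mkdir -p /usr/local/solr
tar xzf solr-7.7.2.tgz -C /usr/local/solr
cd /usr/local/solr/solr-7.7.2
# Seed the collection from Solr's default configset, then drop in the Dataverse schema and solrconfig
cp -r server/solr/configsets/_default server/solr/collection1
cp /path/to/dvinstall/schema*.xml server/solr/collection1/conf/
cp /path/to/dvinstall/solrconfig.xml server/solr/collection1/conf/
bin/solr start
bin/solr create_core -c collection1 -d server/solr/collection1/conf
# Then kick off the "index all" via the Dataverse admin API
curl http://localhost:8080/api/admin/index
```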
1 change: 1 addition & 0 deletions doc/release-notes/6644-role-name-change.md
@@ -0,0 +1 @@
Note for integrators - the role alias has changed, so anything that was hard-coded to "editor" instead of "contributor" will need to be updated.
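A hedged sketch of auditing and updating an integration; the path, API token, and dataverse alias below are placeholders:

```
# Find hard-coded uses of the old role alias in your integration code
grep -rn '"editor"' /path/to/your/integration

# Assign the role under its new alias via the native API
curl -H "X-Dataverse-key: $API_TOKEN" -X POST -H "Content-Type: application/json" \
  -d '{"assignee":"@someUser","role":"contributor"}' \
  "http://localhost:8080/api/dataverses/$DATAVERSE_ALIAS/assignments"
```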
3 changes: 3 additions & 0 deletions doc/release-notes/6711-coverage-badge
@@ -0,0 +1,3 @@
Integration Test Coverage Reporting

API-based integration tests are run every time a branch is merged to develop, and the percentage of code covered by these integration tests is now shown on a badge at the bottom of the README.md file, which serves as the homepage of the Dataverse GitHub repository.
3 changes: 3 additions & 0 deletions doc/release-notes/6725-analytics-bug.md
@@ -0,0 +1,3 @@
# Google Analytics Download Tracking Bug

The button tracking capability discussed in the installation guide (http://guides.dataverse.org/en/4.20/installation/config.html#id88) relies on an analytics-code.html file that must be configured using the :WebAnalyticsCode setting. The example file provided in the installation guide is no longer compatible with recent Dataverse releases (>v4.16). Installations using this feature should update their analytics-code.html file by following the installation instructions and using the updated example file. (Alternatively, sites can modify their existing files to include the one-line change made in the example file at line 120.)
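For installations configuring this feature for the first time (or moving the file), a minimal sketch of pointing the setting at the file via the admin settings API; the path shown is only an example location:

```
# Point :WebAnalyticsCode at the analytics-code.html file on the server
curl -X PUT -d '/var/www/dataverse/branding/analytics-code.html' \
  http://localhost:8080/api/admin/settings/:WebAnalyticsCode
```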
@@ -1,5 +1,5 @@
TwoRavens explore file A system of interlocking statistical tools for data exploration, analysis, and meta-analysis: http://2ra.vn. See the :doc:`/user/data-exploration/tworavens` section of the User Guide for more information on TwoRavens from the user perspective and the :doc:`/installation/r-rapache-tworavens` section of the Installation Guide.
Data Explorer explore file A GUI which lists the variables in a tabular data file allowing searching, charting and cross tabulation analysis. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Explorer for the instructions on adding Data Explorer to your Dataverse; and the :doc:`/installation/prerequisites` section of the Installation Guide for the instructions on how to set up **basic R configuration required** (specifically, Dataverse uses R to generate .prep metadata files that are needed to run Data Explorer).
Whole Tale explore dataset A platform for the creation of reproducible research packages that allows users to launch containerized interactive analysis environments based on popular tools such as Jupyter and RStudio. Using this integration, Dataverse users can launch Jupyter and RStudio environments to analyze published datasets. For more information, see the `Whole Tale User Guide <https://wholetale.readthedocs.io/en/stable/users_guide/integration.html>`_.
File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video, tabular data, and spreadsheets - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through GitHub are in the README.md file. Initial development was led by the Qualitative Data Repository and the spreadsheet previewer was added by the Social Sciences and Humanities Open Cloud (SSHOC) project. https://github.com/QualitativeDataRepository/dataverse-previewers
File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video, tabular data, and spreadsheets - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through GitHub are in the README.md file. Initial development was led by the Qualitative Data Repository and the spreadsheet previewer was added by the Social Sciences and Humanities Open Cloud (SSHOC) project. https://github.com/GlobalDataverseCommunityConsortium/dataverse-previewers
Data Curation Tool configure file A GUI for curating data by adding labels, groups, weights and other details to assist with informed reuse. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Curation-Tool for the installation instructions.
@@ -5,7 +5,7 @@
# chkconfig: 35 92 08
# description: Starts and stops Apache Solr

SOLR_DIR="/usr/local/solr/solr-7.3.1"
SOLR_DIR="/usr/local/solr/solr-7.7.2"
SOLR_COMMAND="bin/solr"
SOLR_ARGS="-m 1g -j jetty.host=127.0.0.1"
SOLR_USER=solr