
Commit

Merge commit '52e9ec326db57391ff7843ce26271300f4a8f875' into 2513-no_doc_lint
pameyer committed Dec 19, 2016
2 parents e85d2eb + 52e9ec3 commit 635685b
Showing 47 changed files with 1,961 additions and 1,245 deletions.
10 changes: 5 additions & 5 deletions conf/httpd/conf.d/dataverse.conf
@@ -9,11 +9,11 @@ ProxyPassMatch ^/error-documents !
# pass everything else to Glassfish
ProxyPass / ajp://localhost:8009/

<Location /shib.xhtml>
AuthType shibboleth
ShibRequestSetting requireSession 1
require valid-user
</Location>
#<Location /shib.xhtml>
# AuthType shibboleth
# ShibRequestSetting requireSession 1
# require valid-user
#</Location>

ErrorDocument 503 /error-documents/503.html
Alias /error-documents /var/www/dataverse/error-documents
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/conf.py
@@ -64,9 +64,9 @@
# built documents.
#
# The short X.Y version.
version = '4.5.1'
version = '4.6'
# The full version, including alpha/beta/rc tags.
release = '4.5.1'
release = '4.6'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
19 changes: 5 additions & 14 deletions doc/sphinx-guides/source/developers/tools.rst
@@ -50,25 +50,16 @@ According to https://pagekite.net/support/free-for-foss/ PageKite (very generous
Vagrant
+++++++

Vagrant allows you to spin up a virtual machine running Dataverse on
your development workstation.
Vagrant allows you to spin up a virtual machine running Dataverse on your development workstation. You'll need to install Vagrant from https://www.vagrantup.com and VirtualBox from https://www.virtualbox.org.

We assume you have already cloned the repo from https://github.com/IQSS/dataverse as explained in the :doc:`/developers/dev-environment` section.

From the root of the git repo, run ``vagrant up`` and eventually you
should be able to reach an installation of Dataverse at
http://localhost:8888 (or whatever forwarded_port indicates in the
Vagrantfile)

The Vagrant environment can also be used for Shibboleth testing in
conjunction with PageKite configured like this:

service_on = http:@kitename : localhost:8888 : @kitesecret

service_on = https:@kitename : localhost:9999 : @kitesecret
Vagrantfile).

Please note that before running ``vagrant up`` for the first time,
you'll need to ensure that required software (GlassFish, Solr, etc.)
is available within Vagrant. If you type ``cd downloads`` and
``./download.sh`` the software should be properly downloaded.
Please note that running ``vagrant up`` for the first time should run the ``downloads/download.sh`` script for you to download required software such as Glassfish and Solr, plus any patches. These dependencies change over time, however, so that script is a good place to look if ``vagrant up`` used to work but later fails.
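The workflow described above can be sketched as a short shell session. This is a sketch only, assuming Vagrant and VirtualBox are already installed and that the ``Vagrantfile`` forwards port 8888 (the default mentioned in this guide):

```shell
# Clone the repo and bring the VM up; on the first run the provisioning
# scripts should invoke downloads/download.sh to fetch Glassfish, Solr, etc.
git clone https://github.com/IQSS/dataverse.git
cd dataverse
vagrant up

# Once provisioning finishes, Dataverse should answer on the forwarded port:
curl -s http://localhost:8888 | head

# When finished, tear the VM down:
vagrant destroy
```

If ``vagrant up`` fails partway through, ``vagrant provision`` re-runs the provisioning scripts without destroying the VM.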

MSV
+++
6 changes: 3 additions & 3 deletions doc/sphinx-guides/source/index.rst
@@ -3,10 +3,10 @@
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Dataverse 4.5.1 Guides
======================
Dataverse 4.6 Guides
====================

These guides are for the most recent version of Dataverse. For the guides for **version 4.5** please go `here <http://guides.dataverse.org/en/4.5/>`_.
These guides are for the most recent version of Dataverse. For the guides for **version 4.5.1** please go `here <http://guides.dataverse.org/en/4.5.1/>`_.

.. toctree::
:glob:
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/installation/prep.rst
@@ -16,7 +16,7 @@ Choose Your Own Installation Adventure
Vagrant (for Testing Only)
++++++++++++++++++++++++++

If you are looking to simply kick the tires on Dataverse and are familiar with Vagrant, running ``vagrant up`` after cloning the Dataverse repo **should** give you a working installation at http://localhost:8888 . This is one of the :doc:`/developers/tools` developers use to test the installation process but you're welcome to give it a shot.
If you are looking to simply kick the tires on installing Dataverse and are familiar with Vagrant, you are welcome to read through the "Vagrant" section of the :doc:`/developers/tools` section of the Developer Guide. Checking out a tagged release is recommended rather than running ``vagrant up`` on unreleased code.

Pilot Installation
++++++++++++++++++
27 changes: 27 additions & 0 deletions doc/sphinx-guides/source/installation/prerequisites.rst
@@ -190,4 +190,31 @@ Installing jq
# chmod +x jq
# jq --version

ImageMagick
-----------

Dataverse uses `ImageMagick <https://www.imagemagick.org>`_ to generate thumbnail previews of PDF files. This component is optional: if you don't have ImageMagick installed, there will simply be no thumbnails for PDF files in the search results and on the dataset pages, but everything else will work. (Thumbnail previews for non-PDF image files are generated using standard Java libraries and do not require any special installation steps.)

Installing and configuring ImageMagick
======================================

On Red Hat and similar Linux distributions, you can install ImageMagick with something like::

# yum install ImageMagick

(Most Red Hat systems will have it pre-installed.)
When installed via the standard ``yum`` mechanism above, the executable for the ImageMagick convert utility will be located at ``/usr/bin/convert``, and no further configuration steps are required.

On Mac OS X you can compile ImageMagick from source, or install it with a package manager such as Homebrew.

If the installed location of the convert executable is different from ``/usr/bin/convert``, you will also need to specify it in your Glassfish configuration using the JVM option, below. For example::

<jvm-options>-Ddataverse.path.imagemagick.convert=/opt/local/bin/convert</jvm-options>

(see the :doc:`config` section for more information on the JVM options)
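Putting the steps above together, here is a minimal sketch of installing ImageMagick and registering a non-default ``convert`` location. The ``asadmin create-jvm-options`` invocation and the MacPorts-style path are illustrations, not the only way to set the option (you can also edit ``domain.xml`` directly):

```shell
# Red Hat-family install; convert lands at /usr/bin/convert,
# which Dataverse uses by default (no JVM option needed in that case).
yum install -y ImageMagick

# If convert lives somewhere else, register the JVM option
# (the /opt/local/bin path below is hypothetical):
asadmin create-jvm-options "-Ddataverse.path.imagemagick.convert=/opt/local/bin/convert"

# Restart Glassfish so the new option takes effect:
asadmin restart-domain
```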



Now that you have all the prerequisites in place, you can proceed to the :doc:`installation-main` section.


2 changes: 1 addition & 1 deletion pom.xml
@@ -4,7 +4,7 @@

<groupId>edu.harvard.iq</groupId>
<artifactId>dataverse</artifactId>
<version>4.5.1</version>
<version>4.6</version>
<packaging>war</packaging>

<name>dataverse</name>

This file was deleted.

@@ -5,3 +5,4 @@ ALTER TABLE datafile ALTER COLUMN checksumtype SET NOT NULL;
-- note that in the database we use "SHA1" (no hyphen) but the GUI will show "SHA-1"
--UPDATE datafile SET checksumtype = 'SHA1';
ALTER TABLE datafile RENAME md5 TO checksumvalue;
ALTER TABLE filemetadata ADD COLUMN directorylabel character varying(255);
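The last line above is the schema change this commit introduces. Dataverse's upgrade process normally runs these migration scripts for you, but if you ever needed to apply the new line by hand it would look something like this (the database name ``dvndb`` and role ``dvnapp`` are assumptions for illustration):

```shell
# Hypothetical manual run of the new migration statement:
psql -U dvnapp -d dvndb -c \
  "ALTER TABLE filemetadata ADD COLUMN directorylabel character varying(255);"
```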
1 change: 1 addition & 0 deletions scripts/vagrant/install-dataverse.sh
@@ -11,3 +11,4 @@ if [ ! -f $WAR ]; then
fi
cd /dataverse/scripts/installer
./install --hostname localhost $MAILSERVER_ARG --gfdir /home/glassfish/glassfish4 -y --force
echo "If \"vagrant up\" was successful (check output above) Dataverse is running on port 8080 of the Linux machine running within Vagrant, but this port has been forwarded to port 8888 of the computer you ran \"vagrant up\" on. For this reason you should go to http://localhost:8888 to see the Dataverse app running."
15 changes: 9 additions & 6 deletions scripts/vagrant/setup.sh
@@ -11,7 +11,10 @@ sudo mv jq /usr/bin/jq
echo "Adding Shibboleth yum repo"
cp /dataverse/conf/vagrant/etc/yum.repos.d/shibboleth.repo /etc/yum.repos.d
cp /dataverse/conf/vagrant/etc/yum.repos.d/epel-apache-maven.repo /etc/yum.repos.d
yum install -y java-1.8.0-openjdk-devel postgresql-server apache-maven httpd mod_ssl shibboleth shibboleth-embedded-ds
# Uncomment this (and other shib stuff below) if you want
# to use Vagrant (and maybe PageKite) to test Shibboleth.
#yum install -y shibboleth shibboleth-embedded-ds
yum install -y java-1.8.0-openjdk-devel postgresql-server apache-maven httpd mod_ssl
alternatives --set java /usr/lib/jvm/jre-1.8.0-openjdk.x86_64/bin/java
alternatives --set javac /usr/lib/jvm/java-1.8.0-openjdk.x86_64/bin/javac
java -version
@@ -44,17 +47,17 @@ if [ ! -d $GLASSFISH_ROOT ]; then
else
echo "$GLASSFISH_ROOT already exists"
fi
service shibd start
#service shibd start
service httpd stop
cp /dataverse/conf/httpd/conf.d/dataverse.conf /etc/httpd/conf.d/dataverse.conf
mkdir -p /var/www/dataverse/error-documents
cp /dataverse/conf/vagrant/var/www/dataverse/error-documents/503.html /var/www/dataverse/error-documents
service httpd start
curl -k --sslv3 https://pdurbin.pagekite.me/Shibboleth.sso/Metadata > /tmp/pdurbin.pagekite.me
cp -a /etc/shibboleth/shibboleth2.xml /etc/shibboleth/shibboleth2.xml.orig
cp -a /etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml.orig
#curl -k --sslv3 https://pdurbin.pagekite.me/Shibboleth.sso/Metadata > /tmp/pdurbin.pagekite.me
#cp -a /etc/shibboleth/shibboleth2.xml /etc/shibboleth/shibboleth2.xml.orig
#cp -a /etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml.orig
# need more attributes, such as sn, givenName, mail
cp /dataverse/conf/vagrant/etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml
#cp /dataverse/conf/vagrant/etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml
# FIXME: automate this?
#curl 'https://www.testshib.org/cgi-bin/sp2config.cgi?dist=Others&hostname=pdurbin.pagekite.me' > /etc/shibboleth/shibboleth2.xml
#cp /dataverse/conf/vagrant/etc/shibboleth/shibboleth2.xml /etc/shibboleth/shibboleth2.xml
2 changes: 1 addition & 1 deletion src/main/java/edu/harvard/iq/dataverse/DataFile.java
@@ -46,7 +46,7 @@
})
@Entity
@Table(indexes = {@Index(columnList="ingeststatus")
, @Index(columnList="md5")
, @Index(columnList="checksumvalue")
, @Index(columnList="contenttype")
, @Index(columnList="restricted")})
public class DataFile extends DvObject implements Comparable {
15 changes: 10 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/DataFileServiceBean.java
@@ -267,6 +267,7 @@ public DataFile findCheapAndEasy(Long id) {
Integer file_id = (Integer) result[0];

dataFile = new DataFile();
dataFile.setMergeable(false);

dataFile.setId(file_id.longValue());

@@ -482,6 +483,7 @@ public void findFileMetadataOptimizedExperimental(Dataset owner, DatasetVersion
Integer file_id = (Integer) result[0];

DataFile dataFile = new DataFile();
dataFile.setMergeable(false);

dataFile.setId(file_id.longValue());

@@ -756,9 +758,12 @@ public List<DataFile> findAll() {
}

public DataFile save(DataFile dataFile) {

DataFile savedDataFile = em.merge(dataFile);
return savedDataFile;
if (dataFile.isMergeable()) {
DataFile savedDataFile = em.merge(dataFile);
return savedDataFile;
} else {
throw new IllegalArgumentException("This DataFile object has been set to NOT MERGEABLE; please ensure a MERGEABLE object is passed to the save method.");
}
}

public Boolean isPreviouslyPublished(Long fileId){
@@ -794,7 +799,7 @@ public List<DataFile> findHarvestedFilesByClient(HarvestingClient harvestingClie
return query.getResultList();
}

/**/
/*moving to the fileutil*/

public void generateStorageIdentifier(DataFile dataFile) {
dataFile.setStorageIdentifier(generateStorageIdentifier());
@@ -1234,4 +1239,4 @@ public void populateFileSearchCard(SolrSearchResult solrSearchResult) {
solrSearchResult.setEntity(this.findCheapAndEasy(solrSearchResult.getEntityId()));
}

}
}
3 changes: 2 additions & 1 deletion src/main/java/edu/harvard/iq/dataverse/Dataset.java
@@ -49,6 +49,7 @@ public class Dataset extends DvObjectContainer {
private static final long serialVersionUID = 1L;

@OneToMany(mappedBy = "owner", cascade = CascadeType.MERGE)
@OrderBy("id")
private List<DataFile> files = new ArrayList();

private String protocol;
@@ -187,7 +188,7 @@ public String getGlobalId() {
}

public List<DataFile> getFiles() {
logger.info("getFiles() on dataset "+this.getId());
//logger.info("getFiles() on dataset "+this.getId());
return files;
}

48 changes: 7 additions & 41 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -3,16 +3,12 @@
import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean;
import edu.harvard.iq.dataverse.authorization.users.ApiToken;
import edu.harvard.iq.dataverse.authorization.users.User;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import edu.harvard.iq.dataverse.authorization.users.PrivateUrlUser;
import edu.harvard.iq.dataverse.authorization.users.GuestUser;
import edu.harvard.iq.dataverse.datavariable.VariableServiceBean;
import edu.harvard.iq.dataverse.engine.command.Command;
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import edu.harvard.iq.dataverse.engine.command.impl.CreateDatasetCommand;
import edu.harvard.iq.dataverse.engine.command.impl.CreateGuestbookResponseCommand;
import edu.harvard.iq.dataverse.engine.command.impl.CreatePrivateUrlCommand;
import edu.harvard.iq.dataverse.engine.command.impl.DeaccessionDatasetVersionCommand;
import edu.harvard.iq.dataverse.engine.command.impl.DeleteDatasetVersionCommand;
@@ -37,6 +33,7 @@
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.BundleUtil;
import edu.harvard.iq.dataverse.util.FileSortFieldAndOrder;
import edu.harvard.iq.dataverse.util.FileUtil;
import edu.harvard.iq.dataverse.util.JsfHelper;
import static edu.harvard.iq.dataverse.util.JsfHelper.JH;
import edu.harvard.iq.dataverse.util.StringUtil;
@@ -68,24 +65,18 @@
import javax.inject.Named;
import org.primefaces.event.FileUploadEvent;
import org.primefaces.model.UploadedFile;
import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;
import javax.validation.ConstraintViolation;
import org.apache.commons.httpclient.HttpClient;
import org.primefaces.context.RequestContext;
import java.text.DateFormat;
import java.util.Arrays;
import java.util.HashSet;
import javax.faces.model.SelectItem;
import java.util.logging.Level;
import edu.harvard.iq.dataverse.datasetutility.TwoRavensHelper;
import edu.harvard.iq.dataverse.datasetutility.WorldMapPermissionHelper;
import javax.faces.component.UIComponent;
import javax.faces.component.UIInput;

import javax.faces.event.AjaxBehaviorEvent;

import javax.faces.context.ExternalContext;
import org.apache.commons.lang.StringEscapeUtils;

import org.primefaces.component.tabview.TabView;
@@ -2098,10 +2089,10 @@ public void deleteFiles() {
// local filesystem:

try {
Files.delete(Paths.get(ingestService.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier()));
Files.delete(Paths.get(FileUtil.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier()));
} catch (IOException ioEx) {
// safe to ignore - it's just a temp file.
logger.warning("Failed to delete temporary file " + ingestService.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier());
logger.warning("Failed to delete temporary file " + FileUtil.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier());
}

dfIt.remove();
@@ -2145,12 +2136,6 @@ public String save() {
return "";
}



// One last check before we save the files - go through the newly-uploaded
// ones and modify their names so that there are no duplicates.
// (but should we really be doing it here? - maybe a better approach to do it
// in the ingest service bean, when the files get uploaded.)
// Finally, save the files permanently:
ingestService.addFiles(workingVersion, newFiles);

@@ -3328,7 +3313,7 @@ public boolean isDownloadPopupRequired() {
return fileDownloadService.isDownloadPopupRequired(workingVersion);
}

public String requestAccessMultipleFiles(String fileIdString) {
public String requestAccessMultipleFiles(String fileIdString) {
if (fileIdString.isEmpty()) {
RequestContext requestContext = RequestContext.getCurrentInstance();
requestContext.execute("PF('selectFilesForRequestAccess').show()");
@@ -3347,37 +3332,18 @@ public String requestAccessMultipleFiles(String fileIdString) {
test = null;
}
if (test != null) {
DataFile request = datafileService.find(test);
idForNotification = test;
requestAccess(request, false);
fileDownloadService.requestAccess(test);
}
}
}
if (idForNotification.intValue() > 0) {
sendRequestFileAccessNotification(idForNotification);
fileDownloadService.sendRequestFileAccessNotification(dataset,idForNotification);
}
return returnToDatasetOnly();
}


public void requestAccess(DataFile file, boolean sendNotification) {
if (!file.getFileAccessRequesters().contains((AuthenticatedUser) session.getUser())) {
file.getFileAccessRequesters().add((AuthenticatedUser) session.getUser());
datafileService.save(file);

// create notifications
if (sendNotification) {
sendRequestFileAccessNotification(file.getId());

}
}
}

private void sendRequestFileAccessNotification(Long fileId) {
for (AuthenticatedUser au : permissionService.getUsersWithPermissionOn(Permission.ManageDatasetPermissions, dataset)) {
userNotificationService.sendNotification(au, new Timestamp(new Date().getTime()), UserNotification.Type.REQUESTFILEACCESS, fileId);
}

}

public boolean isSortButtonEnabled() {
/**
3 changes: 1 addition & 2 deletions src/main/java/edu/harvard/iq/dataverse/Dataverse.java
@@ -45,8 +45,7 @@
@NamedQuery(name = "Dataverse.filterByAliasNameAffiliation", query="SELECT dv FROM Dataverse dv WHERE (LOWER(dv.alias) LIKE :alias) OR (LOWER(dv.name) LIKE :name) OR (LOWER(dv.affiliation) LIKE :affiliation) order by dv.alias")
})
@Entity
@Table(indexes = {@Index(columnList="fk_dataverse_id")
, @Index(columnList="defaultcontributorrole_id")
@Table(indexes = {@Index(columnList="defaultcontributorrole_id")
, @Index(columnList="defaulttemplate_id")
, @Index(columnList="alias")
, @Index(columnList="affiliation")
