
Commit

Merge branch 'master' into pandoc
romainx authored Feb 17, 2020
2 parents 9b983ea + 31b807e commit 581534e
Showing 6 changed files with 23 additions and 9 deletions.
3 changes: 2 additions & 1 deletion base-notebook/Dockerfile
@@ -3,7 +3,8 @@
 
 # Ubuntu 18.04 (bionic)
 # https://hub.docker.com/_/ubuntu/?tab=tags&name=bionic
-ARG BASE_CONTAINER=ubuntu:bionic-20200112@sha256:bc025862c3e8ec4a8754ea4756e33da6c41cba38330d7e324abd25c8e0b93300
+ARG ROOT_CONTAINER=ubuntu:bionic-20200112@sha256:bc025862c3e8ec4a8754ea4756e33da6c41cba38330d7e324abd25c8e0b93300
+ARG BASE_CONTAINER=$ROOT_CONTAINER
 FROM $BASE_CONTAINER
 
 LABEL maintainer="Jupyter Project <jupyter@googlegroups.com>"
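For context, splitting the argument into `ROOT_CONTAINER` and `BASE_CONTAINER` makes the Ubuntu base overridable at build time. A minimal sketch (image name and tag are illustrative only, not part of this commit):

```
# Hypothetical rebuild of base-notebook on a different Ubuntu tag by
# overriding the ROOT_CONTAINER build argument introduced above.
docker build --build-arg ROOT_CONTAINER=ubuntu:bionic \
    -t my-base-notebook ./base-notebook
```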
2 changes: 1 addition & 1 deletion docs/contributing/features.md
@@ -26,7 +26,7 @@ If there's agreement that the feature belongs in one or more of the core stacks:
 1. Implement the feature in a local clone of the `jupyter/docker-stacks` project.
 2. Please build the image locally before submitting a pull request. Building the image locally shortens the debugging cycle by taking some load off [Travis CI](http://travis-ci.org/), which graciously provides free build services for open source projects like this one. If you use `make`, call:
 ```
-make image/somestack-notebook
+make build/somestack-notebook
 ```
 3. [Submit a pull request](https://github.com/PointCloudLibrary/pcl/wiki/A-step-by-step-guide-on-preparing-and-submitting-a-pull-request) (PR) with your changes.
 4. Watch for Travis to report a build success or failure for your PR on GitHub.
2 changes: 1 addition & 1 deletion docs/contributing/packages.md
@@ -8,7 +8,7 @@ Please follow the process below to update a package version:
 2. Adjust the version number for the package. We prefer to pin the major and minor version number of packages so as to minimize rebuild side-effects when users submit pull requests (PRs). For example, you'll find the Jupyter Notebook package, `notebook`, installed using conda with `notebook=5.4.*`.
 3. Please build the image locally before submitting a pull request. Building the image locally shortens the debugging cycle by taking some load off [Travis CI](http://travis-ci.org/), which graciously provides free build services for open source projects like this one. If you use `make`, call:
 ```
-make image/somestack-notebook
+make build/somestack-notebook
 ```
 4. [Submit a pull request](https://github.com/PointCloudLibrary/pcl/wiki/A-step-by-step-guide-on-preparing-and-submitting-a-pull-request) (PR) with your changes.
 5. Watch for Travis to report a build success or failure for your PR on GitHub.
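As a rough illustration of the pinning convention described in step 2, a stack Dockerfile typically installs the package with the patch level left floating. This is a sketch only; the surrounding install command and cleanup flags are assumptions, not taken from this commit:

```
# Sketch: pin major.minor so routine rebuilds pick up patch releases
# without requiring a version-bump PR.
RUN conda install --quiet --yes 'notebook=5.4.*' && \
    conda clean --all -f -y
```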
2 changes: 1 addition & 1 deletion docs/contributing/tests.md
@@ -14,7 +14,7 @@ Please follow the process below to add new tests:
 2. If your test should run against a single image, add your test code to one of the modules in `some-notebook/test/` or create a new module.
 3. Build one or more images you intend to test and run the tests locally. If you use `make`, call:
 ```
-make image/somestack-notebook
+make build/somestack-notebook
 make test/somestack-notebook
 ```
 4. [Submit a pull request](https://github.com/PointCloudLibrary/pcl/wiki/A-step-by-step-guide-on-preparing-and-submitting-a-pull-request) (PR) with your changes.
10 changes: 6 additions & 4 deletions pyspark-notebook/Dockerfile
@@ -15,8 +15,10 @@ RUN apt-get -y update && \
 apt-get install --no-install-recommends -y openjdk-8-jre-headless ca-certificates-java && \
 rm -rf /var/lib/apt/lists/*
 
+# Using the preferred mirror to download the file
 RUN cd /tmp && \
-wget -q http://mirrors.ukfast.co.uk/sites/ftp.apache.org/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz && \
+wget -q $(wget -qO- https://www.apache.org/dyn/closer.lua/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz\?as_json | \
+python -c "import sys, json; content=json.load(sys.stdin); print(content['preferred']+content['path_info'])") && \
 echo "2426a20c548bdfc07df288cd1d18d1da6b3189d0b78dee76fa034c52a4e02895f0ad460720c526f163ba63a17efae4764c46a1cd8f9b04c60f9937a554db85d2 *spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" | sha512sum -c - && \
 tar xzf spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz -C /usr/local --owner root --group root --no-same-owner && \
 rm spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz
@@ -36,11 +38,11 @@ RUN apt-get -y update && \
 rm -rf /var/lib/apt/lists/*
 
 # Spark and Mesos config
-ENV SPARK_HOME=/usr/local/spark \
-PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip \
+ENV SPARK_HOME=/usr/local/spark
+ENV PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip \
 MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so \
 SPARK_OPTS="--driver-java-options=-Xms1024M --driver-java-options=-Xmx4096M --driver-java-options=-Dlog4j.logLevel=info" \
-PATH=$PATH:/usr/local/spark/bin
+PATH=$PATH:$SPARK_HOME/bin
 
 USER $NB_UID
 
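The `closer.lua?as_json` endpoint returns a JSON document whose `preferred` field is the suggested mirror base URL and whose `path_info` is the artifact path; concatenating the two yields the download URL. Below is a standalone sketch of the same lookup, runnable outside the Dockerfile; the version numbers are placeholders, not necessarily the values pinned in the image:

```
#!/bin/bash
# Resolve the preferred Apache mirror for a Spark tarball, mirroring the
# RUN step above. Versions are placeholders for illustration.
APACHE_SPARK_VERSION=2.4.4
HADOOP_VERSION=2.7
TGZ="spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz"

URL=$(wget -qO- "https://www.apache.org/dyn/closer.lua/spark/spark-${APACHE_SPARK_VERSION}/${TGZ}?as_json" | \
    python -c "import sys, json; content=json.load(sys.stdin); print(content['preferred']+content['path_info'])")

echo "Preferred download URL: ${URL}"
```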
13 changes: 12 additions & 1 deletion pyspark-notebook/test/test_spark.py
@@ -16,4 +16,15 @@ def test_spark_shell(container):
     c.wait(timeout=30)
     logs = c.logs(stdout=True).decode('utf-8')
     LOGGER.debug(logs)
-    assert 'res0: Int = 2' in logs
\ No newline at end of file
+    assert 'res0: Int = 2' in logs
+
+def test_pyspark(container):
+    """PySpark should be in the Python path"""
+    c = container.run(
+        tty=True,
+        command=['start.sh', 'python', '-c', '"import pyspark"']
+    )
+    rv = c.wait(timeout=30)
+    assert rv == 0 or rv["StatusCode"] == 0
+    logs = c.logs(stdout=True).decode('utf-8')
+    LOGGER.debug(logs)
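With the Makefile targets referenced in the contributing docs above, a plausible way to exercise the new test locally is:

```
# Build the pyspark image, then run its pytest suite (uses the
# build/... and test/... make targets shown in this commit).
make build/pyspark-notebook
make test/pyspark-notebook
```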
