From c7f6236d02035292f4b59aa1625982a34acfa003 Mon Sep 17 00:00:00 2001
From: iGN5117
Date: Tue, 6 Jun 2023 13:30:32 -0700
Subject: [PATCH 1/3] Update "Compile the code" documentation for Sedona

---
 docs/setup/compile.md | 10 +++++++---
 1 file changed, 7 insertions(+), 3 deletions(-)

diff --git a/docs/setup/compile.md b/docs/setup/compile.md
index b3ae72b7bd..3958e1cc22 100644
--- a/docs/setup/compile.md
+++ b/docs/setup/compile.md
@@ -6,7 +6,7 @@
 ## Compile Scala / Java source code
 
 Sedona Scala/Java code is a project with multiple modules. Each module is a Scala/Java mixed project which is managed by Apache Maven 3.
 
-* Make sure your Linux/Mac machine has Java 1.8, Apache Maven 3.3.1+, and Python3. The compilation of Sedona is not tested on Windows machine.
+* Make sure your Linux/Mac machine has Java 1.8, Apache Maven 3.3.1+, and Python3.7+. The compilation of Sedona is not tested on a Windows machine.
 
 To compile all modules, please make sure you are in the root folder of all modules. Then enter the following command in the terminal:
@@ -76,7 +76,7 @@ export PYTHONPATH=$SPARK_HOME/python
 ```
 2. Compile the Sedona Scala and Java code with `-Dgeotools` and then copy the ==sedona-spark-shaded-{{ sedona.current_version }}.jar== to ==SPARK_HOME/jars/== folder.
 ```
-cp spark-shaded/target/sedona-spark-shaded-xxx.jar SPARK_HOME/jars/
+cp spark-shaded/target/sedona-spark-shaded-xxx.jar $SPARK_HOME/jars/
 ```
 3. Install the following libraries
 ```
@@ -86,6 +86,9 @@ sudo pip3 install -U wheel
 sudo pip3 install -U virtualenvwrapper
 sudo pip3 install -U pipenv
 ```
+!!!tip
+    Homebrew can be used to install libgeos-dev in macOS: ```brew install geos```
+
 4. Set up pipenv to the desired Python version: 3.7, 3.8, or 3.9
 ```
 cd python
@@ -94,7 +97,8 @@ pipenv --python 3.7
 5. Install the PySpark version and other dependency
 ```
 cd python
-pipenv install pyspark==3.0.1
+pipenv install pyspark
+pipenv install shapely~=1.7
 pipenv install --dev
 ```
 6. Run the Python tests

From ceed7a46cfa6376859aa0b18749e19a79d21ebe9 Mon Sep 17 00:00:00 2001
From: iGN5117
Date: Tue, 6 Jun 2023 13:29:39 -0700
Subject: [PATCH 2/3] Remove prebuild_index config from search plugin

---
 mkdocs.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/mkdocs.yml b/mkdocs.yml
index 996dc6ae09..a9ec3645d5 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -164,7 +164,7 @@ markdown_extensions:
   - pymdownx.tilde
 plugins:
   - search:
-      prebuild_index: true
+      #prebuild_index: true
   - macros
   - git-revision-date-localized:
       type: datetime

From 6e9883d3b6c57cecaddcf3400dda4d1fe490b331 Mon Sep 17 00:00:00 2001
From: Nilesh Gajwani
Date: Wed, 7 Jun 2023 09:46:13 -0700
Subject: [PATCH 3/3] Address PR comments for documentation changes

Removed note/tip block elements that caused numbering to be reset
---
 docs/setup/compile.md | 9 +++------
 1 file changed, 3 insertions(+), 6 deletions(-)

diff --git a/docs/setup/compile.md b/docs/setup/compile.md
index 3958e1cc22..bb9fc57d88 100644
--- a/docs/setup/compile.md
+++ b/docs/setup/compile.md
@@ -66,9 +66,7 @@ User can specify `-Dspark` and `-Dscala` command line options to compile with di
 Sedona uses GitHub action to automatically generate jars per commit. You can go [here](https://github.com/apache/sedona/actions/workflows/java.yml) and download the jars by clicking the commit's ==Artifacts== tag.
 
 ## Run Python test
-
 1. Set up the environment variable SPARK_HOME and PYTHONPATH
-
 For example,
 ```
 export SPARK_HOME=$PWD/spark-3.0.1-bin-hadoop2.7
@@ -86,9 +84,7 @@ sudo pip3 install -U wheel
 sudo pip3 install -U virtualenvwrapper
 sudo pip3 install -U pipenv
 ```
-!!!tip
-    Homebrew can be used to install libgeos-dev in macOS: ```brew install geos```
-
+Homebrew can be used to install libgeos-dev on macOS: `brew install geos`
 4. Set up pipenv to the desired Python version: 3.7, 3.8, or 3.9
 ```
 cd python
@@ -98,9 +94,10 @@ pipenv --python 3.7
 ```
 cd python
 pipenv install pyspark
-pipenv install shapely~=1.7
 pipenv install --dev
 ```
+`pipenv install pyspark` installs the latest version of pyspark.
+To remain consistent with the installed Spark version, use `pipenv install pyspark==`
 6. Run the Python tests
 ```
 cd python
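The final patch above tells readers to pin PySpark to their locally installed Spark (`pipenv install pyspark==`) but leaves the version for the reader to fill in. As a minimal sketch of how that pin could be derived automatically — assuming a `spark-submit --version`-style banner, and using a hypothetical helper name that is not part of the patched docs:

```shell
# Hypothetical helper (not from the Sedona docs): extract the Spark version
# from a `spark-submit --version`-style banner so the PySpark pin can match
# the local Spark install.
spark_version() {
  # Keep the first "version X.Y.Z" token and strip the "version " prefix.
  printf '%s\n' "$1" | grep -o 'version [0-9][0-9.]*' | head -n 1 | cut -d' ' -f2
}

# Example with a captured banner line:
spark_version 'Welcome to Spark version 3.0.1'   # prints "3.0.1"

# The pinned install would then look like (spark-submit writes its
# banner to stderr, hence the 2>&1 redirect):
#   pipenv install "pyspark==$(spark_version "$(spark-submit --version 2>&1)")"
```

This is only one way to keep the two versions in sync; pinning the version by hand, as the patched documentation suggests, works just as well.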