---
title: "sparkhello: Scala to Spark - Hello World"
output:
  github_document:
    fig_width: 9
    fig_height: 5
---

**sparkhello** demonstrates how to build a [sparklyr](http://github.com/rstudio/sparklyr) extension package that uses custom Scala code, which is compiled and deployed to Apache Spark.

For example, suppose that you want to deploy the following Scala code to Spark as part of your extension:

```{scala, eval=FALSE}
// The package must match the fully qualified class name used from R
// ("SparkHello.HelloWorld").
package SparkHello

object HelloWorld {
  def hello() : String = {
    "Hello, world! - From Scala"
  }
}
```

The R wrapper for this Scala code might look like this:

```{r, eval=FALSE}
spark_hello <- function(sc) {
  sparklyr::invoke_static(sc, "SparkHello.HelloWorld", "hello")
}
```
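
`sparklyr::invoke_static()` forwards any additional arguments to the underlying Scala method. As a sketch (not part of **sparkhello**), if the Scala object also exposed a hypothetical `helloName(name: String)` method, the wrapper could pass the argument straight through:

```{r, eval=FALSE}
# Hypothetical wrapper: "helloName" is not defined in sparkhello; arguments
# after the method name are forwarded to the Scala method by invoke_static().
spark_hello_name <- function(sc, name) {
  sparklyr::invoke_static(sc, "SparkHello.HelloWorld", "helloName", name)
}
```
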
To package and deploy this Scala code as part of your extension, store the Scala source files within the `java/` directory at the root of your package and then compile them using the following command, which builds the JARs required for various versions of Spark under the `inst/java` directory of your package:

```{r, eval=FALSE}
sparklyr::compile_package_jars()
```
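
By default, `compile_package_jars()` builds one JAR for each supported Spark/Scala combination. As a minimal sketch, assuming your sparklyr version exports the default compilation-spec helper, you could restrict the build to a subset of those combinations:

```{r, eval=FALSE}
# Sketch: assumes sparklyr::spark_default_compilation_spec() is available and
# that each spec entry carries a spark_version field.
spec <- sparklyr::spark_default_compilation_spec()
spec <- Filter(function(s) s$spark_version >= "2.0.0", spec)
sparklyr::compile_package_jars(spec = spec)
```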

There are a couple of conditions required for `sparklyr::compile_package_jars` to work correctly:

1. Your current working directory should be the root directory of your package.

2. You should [download and install](http://www.scala-lang.org/download/) the Scala 2.10 and 2.11 compilers to one of the following paths (a quick check is sketched after this list):

    - /opt/scala
    - /opt/local/scala
    - /usr/local/scala
    - ~/scala (Windows-only)
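
A quick way to confirm the second condition is to ask sparklyr where it finds the compilers; this sketch assumes your sparklyr version exports `find_scalac()`:

```{r, eval=FALSE}
# Assumption: sparklyr exports find_scalac(), which returns the path to the
# scalac binary for the requested Scala version (or errors if none is found).
sparklyr::find_scalac("2.10")
sparklyr::find_scalac("2.11")
```
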
You then need to implement the `spark_dependencies` function (which tells sparklyr that your JARs are required dependencies) as well as an `.onLoad` function which registers your extension. For example:

```{r, eval=FALSE}
spark_dependencies <- function(spark_version, scala_version, ...) {
  sparklyr::spark_dependency(
    jars = c(
      system.file(
        sprintf("java/sparkhello-%s-%s.jar", spark_version, scala_version),
        package = "sparkhello"
      )
    ),
    packages = c()
  )
}

.onLoad <- function(libname, pkgname) {
  sparklyr::register_extension(pkgname)
}
```

Assuming the **sparkhello** package is loaded before connecting to Spark, you can now call the `spark_hello` function, which in turn executes the Scala code in your custom JAR:

```{r message=FALSE}
library(sparklyr)
library(sparkhello)

sc <- spark_connect(master = "local")
spark_hello(sc)
```

```{r}
spark_disconnect(sc)
```

You can learn more about sparklyr extensions at <http://spark.rstudio.com/extensions.html>.