add profile for example to support multiple spark versions without modifying the pom (#121)

* add profile for example to support multiple spark versions without modifying the pom

* update version
Nicole00 authored Aug 18, 2023
1 parent e3e8200 commit f90612b
Showing 5 changed files with 113 additions and 24 deletions.
3 changes: 2 additions & 1 deletion README.md
@@ -36,7 +36,8 @@ NebulaGraph Spark Connector supports Spark 2.2, 2.4 and 3.0.
* Spark Reader supports reading data from NebulaGraph into GraphX as VertexRDD and EdgeRDD; it also supports String-type vertexId.
* NebulaGraph Spark Connector 2.0 uniformly uses SparkSQL's DataSourceV2 for data source expansion.
* NebulaGraph Spark Connector 2.1.0 supports the UPDATE write mode to NebulaGraph, see [Update Vertex](https://docs.nebula-graph.io/2.0.1/3.ngql-guide/12.vertex-statements/2.update-vertex/).
* NebulaGraph Spark Connector 2.5.0 support DELETE write mode to NebulaGraph, see [Delete Vertex](https://docs.nebula-graph.io/master/3.ngql-guide/12.vertex-statements/4.delete-vertex/)
* NebulaGraph Spark Connector 2.5.0 supports the DELETE write mode to NebulaGraph, see [Delete Vertex](https://docs.nebula-graph.io/master/3.ngql-guide/12.vertex-statements/4.delete-vertex/) (a hedged write-mode sketch follows this list).
* NebulaGraph Spark Connector for Spark 3.x does not support the nGQL reader from NebulaGraph.
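To make the UPDATE and DELETE bullets above concrete, here is a minimal Scala sketch of an UPDATE-mode vertex write. It is not part of this commit: the addresses, space/tag/field names, and batch size are placeholders, and the builder and import names (`NebulaConnectionConfig`, `WriteNebulaVertexConfig`, `WriteMode`, `withWriteMode`, the `NebulaDataFrameWriter` implicit) are recalled from the connector's 2.x documentation and should be verified against the version you build with.

```scala
import org.apache.spark.sql.DataFrame
import com.vesoft.nebula.connector.connector.NebulaDataFrameWriter
import com.vesoft.nebula.connector.{NebulaConnectionConfig, WriteMode, WriteNebulaVertexConfig}

object WriteModeSketch {

  // Writes the rows of df as UPDATE operations on existing "person" vertices.
  def updateVertices(df: DataFrame): Unit = {
    // Placeholder addresses; replace with your own meta/graph service endpoints.
    val connectionConfig = NebulaConnectionConfig
      .builder()
      .withMetaAddress("127.0.0.1:9559")
      .withGraphAddress("127.0.0.1:9669")
      .withConenctionRetry(2) // spelling follows the connector's builder method
      .build()

    // WriteMode.UPDATE was introduced in 2.1.0; WriteMode.DELETE in 2.5.0.
    val writeVertexConfig = WriteNebulaVertexConfig
      .builder()
      .withSpace("test")
      .withTag("person")
      .withVidField("id")
      .withWriteMode(WriteMode.UPDATE) // or WriteMode.DELETE to remove vertices
      .withBatch(512)
      .build()

    df.write.nebula(connectionConfig, writeVertexConfig).writeVertices()
  }
}
```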
## How to Use
1 change: 1 addition & 0 deletions README_CN.md
@@ -45,6 +45,7 @@ Nebula Spark Connector supports Spark 2.2, 2.4 and 3.x.
  write mode; see [Update Vertex](https://docs.nebula-graph.com.cn/2.0.1/3.ngql-guide/12.vertex-statements/2.update-vertex/) for details.
* Nebula Spark Connector 2.5.0 adds the DELETE write mode; see [Delete Vertex](https://docs.nebula-graph.com.cn/2.5.1/3.ngql-guide/12.vertex-statements/4.delete-vertex/) for details.
* The Nebula Spark Connector for Spark 3.x does not support reading edge data from NebulaGraph via nGQL queries.

## How to Use

21 changes: 21 additions & 0 deletions example/README.md
@@ -0,0 +1,21 @@
# Example for nebula-spark-connector

## How to compile

for spark2.2:
```bash
cd example
mvn clean package -Dmaven.test.skip=true -Pspark2.2
```

for spark2.4:
```bash
cd example
mvn clean package -Dmaven.test.skip=true -Pspark2.4
```

for spark3.x:
```bash
cd example
mvn clean package -Dmaven.test.skip=true -Pspark3.0
```
108 changes: 86 additions & 22 deletions example/pom.xml
@@ -145,30 +145,94 @@
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
</dependency>

<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-graphx_2.11</artifactId>
<version>2.4.4</version>
</dependency>

<dependency>
<groupId>com.vesoft</groupId>
<artifactId>nebula-spark-connector</artifactId>
<version>${project.version}</version>
</dependency>
</dependencies>

<profiles>
<profile>
<id>spark2.2</id>
<dependencies>
<dependency>
<groupId>com.vesoft</groupId>
<artifactId>nebula-spark-connector_2.2</artifactId>
<version>3.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-graphx_2.11</artifactId>
<version>2.2.0</version>
</dependency>
</dependencies>
</profile>
<profile>
<id>spark2.4</id>
<dependencies>
<dependency>
<groupId>com.vesoft</groupId>
<artifactId>nebula-spark-connector</artifactId>
<version>3.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-graphx_2.11</artifactId>
<version>2.4.4</version>
</dependency>
</dependencies>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
</profile>
<profile>
<id>spark3.0</id>
<dependencies>
<dependency>
<groupId>com.vesoft</groupId>
<artifactId>nebula-spark-connector_3.0</artifactId>
<version>3.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-graphx_2.12</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang.modules</groupId>
<artifactId>scala-collection-compat_2.12</artifactId>
<version>2.1.1</version>
</dependency>
</dependencies>
</profile>
</profiles>

<repositories>
<repository>
<id>SparkPackagesRepo</id>
4 changes: 3 additions & 1 deletion NebulaSparkReaderExample.scala
@@ -33,7 +33,7 @@ object NebulaSparkReaderExample {
readEdges(spark)
readVertexGraph(spark)
readEdgeGraph(spark)
readEdgeWithNgql(spark)
//readEdgeWithNgql(spark)

spark.close()
sys.exit()
@@ -163,6 +163,7 @@ object NebulaSparkReaderExample {
println("vertex count: " + vertex.count())
}

/*
def readEdgeWithNgql(spark: SparkSession): Unit = {
LOG.info("start to read nebula edge with ngql")
val config =
@@ -187,4 +188,5 @@ object NebulaSparkReaderExample {
edge.show(20)
println("edge count: " + edge.count())
}
*/
}
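For orientation, the sketch below shows the scan-based edge read that the example keeps enabled for every Spark version, in contrast to the nGQL-based `readEdgeWithNgql` that this commit comments out because the Spark 3.x connector does not support it. This is not code from the commit: the address, space, edge type, and return columns are placeholders, and the builder and import names (`ReadNebulaConfig`, the `NebulaDataFrameReader` implicit, `loadEdgesToDF`) are recalled from the connector's 2.x documentation and should be verified against the version you use.

```scala
import org.apache.spark.sql.SparkSession
import com.vesoft.nebula.connector.connector.NebulaDataFrameReader
import com.vesoft.nebula.connector.{NebulaConnectionConfig, ReadNebulaConfig}

object ReadEdgesSketch {

  // Scan-based edge read into a DataFrame; available for Spark 2.2, 2.4 and 3.x builds.
  def readEdges(spark: SparkSession): Unit = {
    // Placeholder meta service address; replace with your own.
    val connectionConfig = NebulaConnectionConfig
      .builder()
      .withMetaAddress("127.0.0.1:9559")
      .withConenctionRetry(2)
      .build()

    val readEdgeConfig = ReadNebulaConfig
      .builder()
      .withSpace("test")
      .withLabel("friend")           // edge type name
      .withNoColumn(false)
      .withReturnCols(List("degree"))
      .withLimit(10)
      .withPartitionNum(10)
      .build()

    val edges = spark.read.nebula(connectionConfig, readEdgeConfig).loadEdgesToDF()
    edges.show(20)
    println("edge count: " + edges.count())
  }
}
```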
