
Commit 604545a: merge dev

毕博 committed Nov 4, 2022
2 parents 577e8a3 + f429ace

Showing 274 changed files with 7,697 additions and 2,647 deletions.
4 changes: 4 additions & 0 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -26,3 +26,7 @@ Feel free to ping committers for the review!
* [ ] If any new Jar binary package adding in your PR, please add License Notice according
[New License Guide](https://github.com/apache/incubator-seatunnel/blob/dev/docs/en/contribution/new-license.md)
* [ ] If necessary, please update the documentation to describe the new feature. https://github.com/apache/incubator-seatunnel/tree/dev/docs
* [ ] If you are contributing connector code, please check that the following files are updated:
  1. Update the change log in the connector document. For more details, refer to [connector-v2](https://github.com/apache/incubator-seatunnel/tree/dev/docs/en/connector-v2)
  2. Update [plugin-mapping.properties](https://github.com/apache/incubator-seatunnel/blob/dev/plugin-mapping.properties) to add the new connector information (an illustrative entry is shown after this list)
  3. Update the pom file of [seatunnel-dist](https://github.com/apache/incubator-seatunnel/blob/dev/seatunnel-dist/pom.xml)
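
For illustration, each entry in plugin-mapping.properties maps a plugin identifier to the connector module that provides it. A sketch with a hypothetical connector name (not an entry from this commit):

```properties
# Hypothetical entry; follow the key style of the existing lines in the file.
seatunnel.sink.MyDatabase = connector-mydatabase
```
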
44 changes: 26 additions & 18 deletions .github/workflows/backend.yml
@@ -121,16 +121,16 @@ jobs:
- "seatunnel-config/**"
- "seatunnel-connectors/**"
- "seatunnel-core/**"
- "seatunnel-dist/**"
- "seatunnel-e2e/seatunnel-e2e-common/**"
- "seatunnel-formats/**"
- "seatunnel-plugin-discovery/**"
- "seatunnel-transforms/**"
- "seatunnel-transforms-v2/**"
- "seatunnel-translation/**"
- "seatunnel-e2e/seatunnel-transforms-v2-e2e/**"
- "seatunnel-e2e/seatunnel-flink-e2e/**"
- "seatunnel-e2e/seatunnel-spark-e2e/**"
- "seatunnel-connectors/**"
- "plugin-mapping.properties"
- "pom.xml"
- "**/workflows/**"
- "**/tools/**"
@@ -199,23 +199,31 @@ jobs:
run: |
modules='${{ steps.engine-modules.outputs.modules }}${{ steps.cv2-modules.outputs.modules }}'
modules=${modules: 1}
includes=`python tools/update_modules_check/update_modules_check.py tree $modules`
./mvnw -Pno_dist -D"e2e.dependency.skip"=false dependency:tree $includes -DoutputType=text -DoutputFile=/tmp/tree_out.txt
modules=`python tools/update_modules_check/update_modules_check.py final /tmp/tree_out.txt`
echo $modules
echo "modules=$modules" >> $GITHUB_OUTPUT
pl_modules=`python tools/update_modules_check/update_modules_check.py replace $modules`
./mvnw help:evaluate -Dexpression=project.modules -q -DforceStdout -pl $pl_modules > /tmp/sub_module.txt
sub_modules=`python tools/update_modules_check/update_modules_check.py sub /tmp/sub_module.txt`
tree_modules="$modules$sub_modules"
includes=`python tools/update_modules_check/update_modules_check.py tree $tree_modules`
./mvnw -Pci -D"e2e.dependency.skip"=false dependency:tree $includes -DoutputType=text -DoutputFile=/tmp/tree_out.txt
build_modules=`python tools/update_modules_check/update_modules_check.py final_ut /tmp/tree_out.txt`
echo $build_modules
echo "modules=$build_modules" >> $GITHUB_OUTPUT
- name: Make integration test modules
id: it-modules
if: ${{ steps.filter.outputs.api == 'false' }}
run: |
modules='${{ steps.cv2-e2e-modules.outputs.modules }}${{ steps.cv2-flink-e2e-modules.outputs.modules }}${{ steps.cv2-spark-e2e-modules.outputs.modules }}${{ steps.engine-e2e-modules.outputs.modules }}${{ steps.engine-modules.outputs.modules }}${{ steps.cv2-modules.outputs.modules }}'
modules=${modules: 1}
includes=`python tools/update_modules_check/update_modules_check.py tree $modules`
./mvnw -Pno_dist -D"e2e.dependency.skip"=false dependency:tree $includes -DoutputType=text -DoutputFile=/tmp/tree_out.txt
modules=`python tools/update_modules_check/update_modules_check.py final /tmp/tree_out.txt`
echo $modules
echo "modules=$modules" >> $GITHUB_OUTPUT
pl_modules=`python tools/update_modules_check/update_modules_check.py replace $modules`
./mvnw help:evaluate -Dexpression=project.modules -q -DforceStdout -pl $pl_modules > /tmp/sub_module.txt
sub_modules=`python tools/update_modules_check/update_modules_check.py sub /tmp/sub_module.txt`
tree_modules="$modules$sub_modules"
includes=`python tools/update_modules_check/update_modules_check.py tree $tree_modules`
./mvnw -Pci -D"e2e.dependency.skip"=false dependency:tree $includes -DoutputType=text -DoutputFile=/tmp/tree_out.txt
build_modules=`python tools/update_modules_check/update_modules_check.py final_it /tmp/tree_out.txt`
echo $build_modules
echo "modules=$build_modules" >> $GITHUB_OUTPUT
dependency-license:
if: needs.changes.outputs.api == 'true' || needs.changes.outputs.engine == 'true'
@@ -237,7 +245,6 @@ jobs:
./mvnw -B -q install -DskipTests
-D"maven.test.skip"=true
-D"maven.javadoc.skip"=true
-D"scalastyle.skip"=true
-D"checkstyle.skip"=true
-D"license.skipAddThirdParty"
- name: Check Dependencies Licenses
@@ -263,19 +270,20 @@ jobs:
- name: run all modules unit test
if: needs.changes.outputs.api == 'true'
run: |
./mvnw -B -T 1C clean verify -D"maven.test.skip"=false -D"checkstyle.skip"=true -D"scalastyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates
./mvnw -B -T 1C clean verify -D"maven.test.skip"=false -D"checkstyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates
env:
MAVEN_OPTS: -Xmx2048m

- name: run updated modules unit test
if: needs.changes.outputs.api == 'false' && needs.changes.outputs.ut-modules != ''
run: |
./mvnw -B -T 1C clean verify -D"maven.test.skip"=false -D"checkstyle.skip"=true -D"scalastyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates -pl ${{needs.changes.outputs.ut-modules}} -am -amd -Pno_dist
./mvnw -B -T 1C clean verify -D"maven.test.skip"=false -D"checkstyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates -pl ${{needs.changes.outputs.ut-modules}} -am -Pci
env:
MAVEN_OPTS: -Xmx2048m

integration-test:
needs: [ changes, sanity-check ]
if: needs.changes.outputs.api == 'true' || (needs.changes.outputs.api == 'false' && needs.changes.outputs.it-modules != '')
runs-on: ${{ matrix.os }}
strategy:
matrix:
@@ -293,14 +301,14 @@ jobs:
- name: run all modules integration test
if: needs.changes.outputs.api == 'true'
run: |
./mvnw -T 1C -B verify -DskipUT=true -DskipIT=false -D"checkstyle.skip"=true -D"scalastyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates
./mvnw -T 1C -B verify -DskipUT=true -DskipIT=false -D"checkstyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates
env:
MAVEN_OPTS: -Xmx2048m

- name: run updated modules integration test
if: needs.changes.outputs.api == 'false'
if: needs.changes.outputs.api == 'false' && needs.changes.outputs.it-modules != ''
run: |
./mvnw -T 1C -B verify -DskipUT=true -DskipIT=false -D"checkstyle.skip"=true -D"scalastyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates -pl ${{needs.changes.outputs.it-modules}} -am -amd -Pno_dist
./mvnw -T 1C -B verify -DskipUT=true -DskipIT=false -D"checkstyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates -pl ${{needs.changes.outputs.it-modules}} -am -Pci
env:
MAVEN_OPTS: -Xmx2048m
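
The module-selection steps above compute the set of Maven modules affected by the changed paths and pass it to `-pl`, so only those modules (plus their required upstream modules via `-am`) are built under the new `ci` profile. A rough sketch of the resulting invocation, with placeholder module names:

```bash
# Illustrative only; the real module list is produced by
# tools/update_modules_check/update_modules_check.py from the changed paths.
modules="connector-console,connector-fake"
./mvnw -B -T 1C clean verify -D"maven.test.skip"=false -D"checkstyle.skip"=true \
  --no-snapshot-updates -pl "$modules" -am -Pci
```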

2 changes: 1 addition & 1 deletion .github/workflows/codeql.yaml
@@ -23,7 +23,7 @@ jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
timeout-minutes: 60
timeout-minutes: 120
env:
JAVA_TOOL_OPTIONS: -Xmx2G -Xms2G -Dhttp.keepAlive=false -Dmaven.test.skip=true -Dcheckstyle.skip=true -Dlicense.skipAddThirdParty=true -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.http.retryHandler.count=3 -Dmaven.wagon.httpconnectionManager.ttlSeconds=120

1 change: 1 addition & 0 deletions .github/workflows/schedule_backend.yml
@@ -143,3 +143,4 @@ jobs:
./mvnw -T 1C -B verify -DskipUT=true -DskipIT=false -D"checkstyle.skip"=true -D"scalastyle.skip"=true -D"license.skipAddThirdParty"=true --no-snapshot-updates
env:
MAVEN_OPTS: -Xmx2048m

78 changes: 78 additions & 0 deletions config/log4j2.properties
@@ -0,0 +1,78 @@
################################################################################
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
################################################################################

property.file_path = ${sys:seatunnel.logs.path:-/tmp/seatunnel/logs}
property.file_name = ${sys:seatunnel.logs.file_name:-seatunnel}
property.file_split_size = 100MB
property.file_count = 100
property.file_ttl = 7d

rootLogger.level = INFO

############################ log output to console #############################
rootLogger.appenderRef.consoleStdout.ref = consoleStdoutAppender
rootLogger.appenderRef.consoleStderr.ref = consoleStderrAppender
############################ log output to console #############################
############################ log output to file #############################
#rootLogger.appenderRef.file.ref = fileAppender
############################ log output to file #############################

appender.consoleStdout.name = consoleStdoutAppender
appender.consoleStdout.type = CONSOLE
appender.consoleStdout.target = SYSTEM_OUT
appender.consoleStdout.layout.type = PatternLayout
appender.consoleStdout.layout.pattern = %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c - %m%n
appender.consoleStdout.filter.acceptLtWarn.type = ThresholdFilter
appender.consoleStdout.filter.acceptLtWarn.level = WARN
appender.consoleStdout.filter.acceptLtWarn.onMatch = DENY
appender.consoleStdout.filter.acceptLtWarn.onMismatch = ACCEPT

appender.consoleStderr.name = consoleStderrAppender
appender.consoleStderr.type = CONSOLE
appender.consoleStderr.target = SYSTEM_ERR
appender.consoleStderr.layout.type = PatternLayout
appender.consoleStderr.layout.pattern = %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c - %m%n
appender.consoleStderr.filter.acceptGteWarn.type = ThresholdFilter
appender.consoleStderr.filter.acceptGteWarn.level = WARN
appender.consoleStderr.filter.acceptGteWarn.onMatch = ACCEPT
appender.consoleStderr.filter.acceptGteWarn.onMismatch = DENY

appender.file.name = fileAppender
appender.file.type = RollingFile
appender.file.fileName = ${file_path}/${file_name}.log
appender.file.filePattern = ${file_path}/${file_name}.log.%d{yyyy-MM-dd}-%i
appender.file.append = true
appender.file.layout.type = PatternLayout
appender.file.layout.pattern = %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c - %m%n
appender.file.policies.type = Policies
appender.file.policies.time.type = TimeBasedTriggeringPolicy
appender.file.policies.time.modulate = true
appender.file.policies.size.type = SizeBasedTriggeringPolicy
appender.file.policies.size.size = ${file_split_size}
appender.file.strategy.type = DefaultRolloverStrategy
appender.file.strategy.fileIndex = nomax
appender.file.strategy.action.type = Delete
appender.file.strategy.action.basepath = ${file_path}
appender.file.strategy.action.maxDepth = 1
appender.file.strategy.action.condition.type = IfFileName
appender.file.strategy.action.condition.glob = ${file_name}.log*
appender.file.strategy.action.condition.nested_condition.type = IfAny
appender.file.strategy.action.condition.nested_condition.lastModify.type = IfLastModified
appender.file.strategy.action.condition.nested_condition.lastModify.age = ${file_ttl}
appender.file.strategy.action.condition.nested_condition.fileCount.type = IfAccumulatedFileCount
appender.file.strategy.action.condition.nested_condition.fileCount.exceeds = ${file_count}
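
By default only the console appenders are active; the rolling file appender above is defined but not referenced by the root logger. A minimal sketch of enabling it (the launch example below is illustrative, not taken from this commit):

```properties
# Route the root logger to the file appender as well
# (i.e. uncomment the commented rootLogger line above, or add this).
rootLogger.appenderRef.file.ref = fileAppender
```

The log directory and file name can then be overridden at launch through the system properties referenced above, e.g. `-Dseatunnel.logs.path=/var/log/seatunnel -Dseatunnel.logs.file_name=my-job`.
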
2 changes: 0 additions & 2 deletions config/plugin_config
@@ -62,8 +62,6 @@ seatunnel-connector-spark-console
seatunnel-connector-spark-doris
seatunnel-connector-spark-email
--connectors-v2--
connector-fake
connector-console
connector-assert
connector-kafka
connector-http-base
69 changes: 69 additions & 0 deletions docs/en/connector-v2/sink/Amazondynamodb.md
@@ -0,0 +1,69 @@

# Amazondynamodb

> Amazondynamodb sink connector

## Description

Write data to `Amazondynamodb`.

## Key features

- [ ] [exactly-once](../../concept/connector-v2-features.md)
- [ ] [schema projection](../../concept/connector-v2-features.md)

## Options

| name              | type   | required | default value |
|-------------------|--------|----------|---------------|
| url               | string | yes      | -             |
| region            | string | yes      | -             |
| access_key_id     | string | yes      | -             |
| secret_access_key | string | yes      | -             |
| table             | string | yes      | -             |
| batch_size        | string | no       | 25            |
| batch_interval_ms | string | no       | 1000          |
| common-options    |        | no       | -             |

### url [string]

The URL to write to Amazondynamodb.

### region [string]

The region of Amazondynamodb.

### accessKeyId [string]

The access key id of Amazondynamodb.

### secretAccessKey [string]

The access secret of Amazondynamodb.

### table [string]

The table name of Amazondynamodb.

### common options

Sink plugin common parameters, please refer to [Sink Common Options](common-options.md) for details.

## Example

```bash
Amazondynamodb {
url = "http://127.0.0.1:8000"
region = "us-east-1"
accessKeyId = "dummy-key"
secretAccessKey = "dummy-secret"
table = "TableName"
}
```

## Changelog

### next version

- Add Amazondynamodb Sink Connector

6 changes: 6 additions & 0 deletions docs/en/connector-v2/sink/Clickhouse.md
@@ -126,3 +126,9 @@ sink {

### 2.3.0-beta 2022-10-20
- [Improve] Clickhouse Support Int128,Int256 Type ([3067](https://github.com/apache/incubator-seatunnel/pull/3067))

### next version

- [Improve] Clickhouse Sink support nested type and array type ([3047](https://github.com/apache/incubator-seatunnel/pull/3047))

- [Improve] Clickhouse Sink support geo type ([3141](https://github.com/apache/incubator-seatunnel/pull/3141))
43 changes: 32 additions & 11 deletions docs/en/connector-v2/sink/Redis.md
@@ -13,15 +13,18 @@ Used to write data to Redis.

## Options

| name | type | required | default value |
|-------------- |--------|----------|---------------|
| host | string | yes | - |
| port | int | yes | - |
| key | string | yes | - |
| data_type | string | yes | - |
| auth | string | no | - |
| format | string | no | json |
| common-options| | no | - |
| name | type | required | default value |
|----------------|--------|----------|---------------|
| host | string | yes | - |
| port | int | yes | - |
| key | string | yes | - |
| data_type | string | yes | - |
| user | string | no | - |
| auth | string | no | - |
| mode | string | no | - |
| nodes          | list   | no       | -             |
| format | string | no | json |
| common-options | | no | - |

### host [string]

@@ -75,11 +78,25 @@ Redis data types, support `key` `hash` `list` `set` `zset`
- zset
> Each data from upstream will be added to the configured zset key with a weight of 1. So the order of data in zset is based on the order of data consumption.
### auth [String]
### user [string]

Redis authentication user; you need it when you connect to an encrypted cluster.

### auth [string]

Redis authentication password; you need it when you connect to an encrypted cluster.

### format [String]
### mode [string]

Redis mode, `single` or `cluster`; the default is `single`.

### nodes [list]

Redis node information, used in cluster mode; it must be given in the following format:

[host1:port1, host2:port2]
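
A minimal cluster-mode sink sketch, using placeholder hosts and credentials (not values taken from this repository):

```bash
Redis {
  host = "redis-node-1"
  port = 6379
  mode = "cluster"
  nodes = ["redis-node-1:6379", "redis-node-2:6379"]
  user = "default"
  auth = "mypassword"
  key = "age"
  data_type = "list"
}
```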

### format [string]

The format of upstream data; currently only `json` is supported (`text` will be supported later). The default is `json`.

@@ -121,3 +138,7 @@ simple:
### 2.2.0-beta 2022-09-26

- Add Redis Sink Connector

### next version

- [Improve] Support Redis cluster mode connection and user authentication ([3188](https://github.com/apache/incubator-seatunnel/pull/3188))