Merge pull request #28 from interTwin-eu/dev-slangarita
Dev slangarita
esparig authored Aug 2, 2024
2 parents 959cc5c + 53a31e2 commit 67f9ec3
Showing 10 changed files with 84 additions and 7 deletions.
12 changes: 10 additions & 2 deletions cli/description.yaml
@@ -35,6 +35,14 @@ nifi:
ssl_context:
Truststore_Filename: <name-of-p12>
Truststore_Password: "<password-of-p12>"
alterations:
- action: Encode
Encoding: base64
- action: Decode
Encoding: base64
- action: Merge
maxMessages: 10
windowSeconds: 7
SQS:
- name: SQS-events
AWS_DEFAULT_REGION: <region us-east-1 as example>
@@ -57,5 +65,5 @@ nifi:
connection:
- from: dcache
to: invoke
- from: dcache
to: invoke2
- from: kafka
to: invoke
File renamed without changes.
2 changes: 1 addition & 1 deletion cli/env.py
@@ -19,7 +19,7 @@
from sources.kafka import createKafka
from sources.generic import createGeneric
from destinations.oscar import createOSCAR
from destinations.s3sqs import createPutS3
from destinations.s3aws import createPutS3

folderSource = "template/sources/"

18 changes: 18 additions & 0 deletions docpage/docs/05.- Alterations/Decode.md
@@ -0,0 +1,18 @@
---
sidebar_position: 2
---

# Decode

The Decode alteration decodes the data flow from the chosen encoding. The user must ensure the input data is actually encoded with the selected encoding.
Three encodings are available: `base64`, `base32` and `hex`. It behaves like the commands `base64 -d` or `base32 -d`. For example, if the input data is the base64 string `aGVsbG8K` or the base32 string `NBSWY3DPBI======`, the output data is the same in both cases: `hello`.


Here is the YAML example.


```
alterations:
- action: Decode
Encoding: base64
```
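
For reference, here is a minimal Python sketch of the equivalent decoding step using only the standard `base64` module. It is an illustration of the behaviour, not DCNiOS code.

```
import base64

# The same message, encoded two different ways.
b64_payload = b"aGVsbG8K"
b32_payload = b"NBSWY3DPBI======"

print(base64.b64decode(b64_payload))  # b'hello\n'
print(base64.b32decode(b32_payload))  # b'hello\n'
```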
16 changes: 16 additions & 0 deletions docpage/docs/05.- Alterations/Encode.md
@@ -0,0 +1,16 @@
---
sidebar_position: 1
---

# Encode

The Encode alteration converts the data flow to the chosen encoding. Three encodings are available: `base64`, `base32` and `hex`. It behaves like the commands `base64`, `base32` or `hexdump`.
For example, if the input data is the string `hello`, the output encoded in base64 will be `aGVsbG8K`, in base32 `NBSWY3DPBI======`, and in hex `0000000 6568 6c6c 0a6f 0000006`.

Here is the YAML example.

```
alterations:
- action: Encode
Encoding: base64
```
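
For reference, here is a minimal Python sketch of the equivalent encoding step with the standard library (an illustration, not DCNiOS code). Note that `hexdump` prints offsets and byte-swapped 16-bit words, so its output looks different from a plain hex string.

```
import base64

message = b"hello\n"  # trailing newline, as produced by `echo hello`

print(base64.b64encode(message).decode())  # aGVsbG8K
print(base64.b32encode(message).decode())  # NBSWY3DPBI======
print(message.hex())                       # 68656c6c6f0a
```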
23 changes: 23 additions & 0 deletions docpage/docs/05.- Alterations/Merge.md
@@ -0,0 +1,23 @@
---
sidebar_position: 3
---

# Merge

The Merge alteration combines several messages into one. There are two variables to set: the maximum number of messages to collect, `maxMessages`, and the waiting window time in seconds, `windowSeconds`.
The final output aggregates the input messages separated by line breaks. If the messages are received in the order `I am the message 1` and `I am the message 2`, the output will be:

```
I am the message 1
I am the message 2
```

Here is the YAML example.


```
alterations:
- action: Merge
maxMessages: 10
windowSeconds: 2
```
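
The buffering behaviour can be pictured with a short Python sketch: collect messages until `maxMessages` arrive or `windowSeconds` elapse, then emit them joined by line breaks. This only illustrates the semantics; in DCNiOS the merging is done by the predefined NiFi Process Group, and `receive` below is a hypothetical callable.

```
import time

def merge_window(receive, max_messages=10, window_seconds=2):
    """Illustrative sketch: buffer messages until max_messages arrive or
    window_seconds elapse, then emit them joined by line breaks."""
    buffered = []
    deadline = time.monotonic() + window_seconds
    while len(buffered) < max_messages:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        msg = receive(timeout=remaining)  # `receive` is a hypothetical callable
        if msg is None:                   # nothing arrived before the deadline
            break
        buffered.append(msg)
    return "\n".join(buffered)

# Two messages arriving inside the window produce one merged message.
incoming = iter(["I am the message 1", "I am the message 2"])
print(merge_window(lambda timeout: next(incoming, None)))
```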
4 changes: 2 additions & 2 deletions docpage/docs/Introduction.md
@@ -12,8 +12,8 @@ DCNiOS is an open-source command-line tool that easily manages the creation of e
An Apache NiFi Process Group is a group of Processors that compose a dataflow. DCNiOS uses predefined Process Groups that perform simple actions, like interacting with a third-party component (e.g., consuming from Kafka) or changing the data content (e.g., encoding the data in base64), to compose a complete dataflow.

In DCNiOS documentation, the Process Groups are split by purpose into three main groups: 'Sources', 'Destinations', and 'Alterations'.
- 'Sources' interact with a third-party component as the input data receiver.
- 'Destinations' interact with a third-party component as output data sender.
- 'Destinations' interact with a third-party component as an output data sender.
- 'Alterations' do not interact with third-party components; they change the format of the data flow.


12 changes: 12 additions & 0 deletions docpage/docs/Users.md
@@ -88,6 +88,18 @@ components:
```


#### Alterations

The subsection `alterations`, inside Sources, changes the data format. The alterations are applied sequentially, in the order they are defined. In this example, the input data is first merged into one message; then, the merged message is encoded.

```
- action: Merge
maxMessages: 10
windowSeconds: 7
- action: Encode
Encoding: base64
```
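
To make the ordering concrete, here is a small Python sketch of what the two chained alterations above do to the data (standard library only, not DCNiOS code):

```
import base64

messages = ["I am the message 1", "I am the message 2"]

# Step 1: Merge - join the buffered messages with line breaks.
merged = "\n".join(messages)

# Step 2: Encode - base64-encode the merged payload.
encoded = base64.b64encode(merged.encode()).decode()
print(encoded)  # SSBhbSB0aGUgbWVzc2FnZSAxCkkgYW0gdGhlIG1lc3NhZ2UgMg==
```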

### Connections


2 changes: 1 addition & 1 deletion docpage/docusaurus.config.js
@@ -15,7 +15,7 @@ const config = {
tagline: 'Data Connector through Apache NiFi for OSCAR',
favicon: 'img/favicon.ico',

// Set the production url of your site here #3f3f9f
// Set the production url of your site here
url: 'https://intertwin-eu.github.io/',
// Set the /<baseUrl>/ pathname under which your site is served
// For GitHub pages deployment, it is often '/<projectName>/'
2 changes: 1 addition & 1 deletion docpage/src/pages/markdown-page.md
@@ -6,7 +6,7 @@ title: Main Page
DCNiOS is an open-source command-line tool to easily manage the creation of event-driven data processing flows.
DCNiOS, Data Connector through [Apache NiFi](https://nifi.apache.org/) for [OSCAR](https://oscar.grycap.net/), facilitates the creation of event-driven processes connecting a Storage System like [dCache](https://www.dcache.org/) or [S3](https://aws.amazon.com/s3) to a scalable OSCAR cluster by employing predefined dataflows that are processed by Apache NiFi.

DCNiOS was developed within the interTwin project. DCNiOS creates dataflows in Apache NiFi that captures events or messages, and ingests them in an OSCAR cluster at a customized rate, where an OSCAR service is run based on a user-defined application (containerized in a Docker image).
DCNiOS was developed within the interTwin project. DCNiOS creates dataflows in Apache NiFi that capture events or messages and ingest them in an OSCAR cluster at a customized rate, where an OSCAR service is run based on a user-defined application (containerized in a Docker image).

The DCNiOS command-line application is available in the source code repository. Additionally, the corresponding TOSCA templates and the Ansible roles required to deploy an Apache NiFi cluster via the Infrastructure Manager (IM) are provided. Any user can self-deploy such a cluster via the [IM Dashboard](https://im.egi.eu).

