Commit d04b80a in open-metadata/OpenMetadata (permalink open-metadata/OpenMetadata@e65f686 on refs/heads/main)

open-metadata committed Feb 4, 2025
1 parent f325090
Showing 16 changed files with 76 additions and 76 deletions.
@@ -4,7 +4,7 @@ slug: /connectors/dashboard/powerbireportserver
---

{% connectorDetailsHeader
name="PowerBIReportServer"
name="PowerBI Report Server"
stage="BETA"
platform="Collate"
availableFeatures=["Dashboards"]
@@ -4,7 +4,7 @@ slug: /connectors/dashboard/powerbireportserver/yaml
---

{% connectorDetailsHeader
name="PowerBIReportServer"
name="PowerBI Report Server"
stage="BETA"
platform="Collate"
availableFeatures=["Dashboards"]
10 changes: 5 additions & 5 deletions content/v1.6.x/connectors/database/deltalake/index.md
@@ -4,17 +4,17 @@ slug: /connectors/database/deltalake
---

{% connectorDetailsHeader
name="DeltaLake"
name="Delta Lake"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "dbt"]
unavailableFeatures=["Query Usage", "Data Profiler", "Data Quality", "Lineage", "Column-level Lineage", "Owners", "Tags", "Stored Procedures"]
/ %}


In this section, we provide guides and references to use the Deltalake connector.
In this section, we provide guides and references to use the Delta Lake connector.

Configure and schedule Deltalake metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule Delta Lake metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
@@ -25,10 +25,10 @@ Configure and schedule Deltalake metadata and profiler workflows from the OpenMe

## Requirements

Deltalake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
Delta Lake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
for Python 3.11

The DeltaLake connector is able to extract the information from a **metastore** or directly from the **storage**.
The Delta Lake connector is able to extract the information from a **metastore** or directly from the **storage**.

If extracting directly from the storage, some extra requirements are needed depending on the storage

12 changes: 6 additions & 6 deletions content/v1.6.x/connectors/database/deltalake/yaml.md
@@ -4,16 +4,16 @@ slug: /connectors/database/deltalake/yaml
---

{% connectorDetailsHeader
name="DeltaLake"
name="Delta Lake"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "dbt"]
unavailableFeatures=["Query Usage", "Data Profiler", "Data Quality", "Lineage", "Column-level Lineage", "Owners", "Tags", "Stored Procedures"]
/ %}

In this section, we provide guides and references to use the Deltalake connector.
In this section, we provide guides and references to use the Delta Lake connector.

Configure and schedule Deltalake metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule Delta Lake metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
@@ -23,14 +23,14 @@ Configure and schedule Deltalake metadata and profiler workflows from the OpenMe

## Requirements

Deltalake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
Delta Lake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
for Python 3.11

### Python Requirements

{% partial file="/v1.6/connectors/python-requirements.md" /%}

To run the Deltalake ingestion, you will need to install:
To run the Delta Lake ingestion, you will need to install:

- If extracting from a metastore

@@ -49,7 +49,7 @@ pip3 install "openmetadata-ingestion[deltalake-storage]"

All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/deltaLakeConnection.json)
you can find the structure to create a connection to Deltalake.
you can find the structure to create a connection to Delta Lake.
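
For orientation while reading the linked schema, here is a minimal, metastore-based sketch of such a connection inside a workflow file. The field names and values below are assumptions for illustration only and should be verified against the JSON Schema and the connector documentation before use.

```yaml
# Illustrative sketch only; verify field names against deltaLakeConnection.json.
source:
  type: deltalake
  serviceName: local_deltalake              # assumed service name
  serviceConnection:
    config:
      type: DeltaLake
      metastoreConnection:
        metastoreHostPort: localhost:9083   # assumed Hive metastore endpoint
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
```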

In order to create and run a Metadata Ingestion workflow, we will follow
the steps to create a YAML configuration able to connect to the source,
6 changes: 3 additions & 3 deletions content/v1.6.x/connectors/database/singlestore/index.md
@@ -4,16 +4,16 @@ slug: /connectors/database/singlestore
---

{% connectorDetailsHeader
name="Singlestore"
name="SingleStore"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "Data Profiler", "Data Quality", "View Lineage", "View Column-level Lineage", "dbt"]
unavailableFeatures=["Query Usage", "Stored Procedures", "Owners", "Tags"]
/ %}

In this section, we provide guides and references to use the Singlestore connector.
In this section, we provide guides and references to use the SingleStore connector.

Configure and schedule Singlestore metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule SingleStore metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
10 changes: 5 additions & 5 deletions content/v1.6.x/connectors/database/singlestore/yaml.md
@@ -4,17 +4,17 @@ slug: /connectors/database/singlestore/yaml
---

{% connectorDetailsHeader
name="Singlestore"
name="SingleStore"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "Data Profiler", "Data Quality", "View Lineage", "View Column-level Lineage", "dbt"]
unavailableFeatures=["Query Usage", "Stored Procedures", "Owners", "Tags"]
/ %}


In this section, we provide guides and references to use the Singlestore connector.
In this section, we provide guides and references to use the SingleStore connector.

Configure and schedule Singlestore metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule SingleStore metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
@@ -31,7 +31,7 @@ Configure and schedule Singlestore metadata and profiler workflows from the Open

{% partial file="/v1.6/connectors/python-requirements.md" /%}

To run the Singlestore ingestion, you will need to install:
To run the SingleStore ingestion, you will need to install:

```bash
pip3 install "openmetadata-ingestion[singlestore]"
```

@@ -52,7 +52,7 @@ The workflow is modeled around the following

### 1. Define the YAML Config

This is a sample config for Singlestore:
This is a sample config for SingleStore:

{% codePreview %}
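
The full sample is elided from this diff view; as a rough sketch, a SingleStore source configuration typically follows the shape below. Credentials and field names here are placeholders and assumptions, to be checked against the connector documentation.

```yaml
# Illustrative sketch only; not the sample from the elided codePreview block.
source:
  type: singlestore
  serviceName: local_singlestore            # assumed service name
  serviceConnection:
    config:
      type: SingleStore
      username: openmetadata_user           # placeholder credentials
      password: password
      hostPort: localhost:3306
      databaseSchema: custom_schema         # assumed optional field
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
```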

16 changes: 8 additions & 8 deletions content/v1.6.x/connectors/pipeline/nifi/index.md
@@ -4,16 +4,16 @@ slug: /connectors/pipeline/nifi
---

{% connectorDetailsHeader
name="Nifi"
name="NiFi"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Pipelines"]
unavailableFeatures=["Pipeline Status", "Owners", "Tags", "Lineage"]
/ %}

In this section, we provide guides and references to use the Nifi connector.
In this section, we provide guides and references to use the NiFi connector.

Configure and schedule Nifi metadata workflows from the OpenMetadata UI:
Configure and schedule NiFi metadata workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
@@ -23,11 +23,11 @@ Configure and schedule Nifi metadata workflows from the OpenMetadata UI:
## Requirements

### Metadata
OpenMetadata supports 2 types of connection for the Nifi connector:
- **basic authentication**: use username/password to authenticate to Nifi.
OpenMetadata supports 2 types of connection for the NiFi connector:
- **basic authentication**: use username/password to authenticate to NiFi.
- **client certificate authentication**: use CA, client certificate and client key files to authenticate.

The user should be able to send request to the Nifi API and access the `Resources` endpoint.
The user should be able to send request to the NiFi API and access the `Resources` endpoint.

## Metadata Ingestion

@@ -48,9 +48,9 @@ The user should be able to send request to the Nifi API and access the `Resource

- **Host and Port**: Pipeline Service Management/UI URI. This should be specified as a string in the format 'hostname:port'.

- **Nifi Config**: OpenMetadata supports username/password or client certificate authentication.
- **NiFi Config**: OpenMetadata supports username/password or client certificate authentication.
1. Basic Authentication
- Username: Username to connect to Nifi. This user should be able to send request to the Nifi API and access the `Resources` endpoint.
- Username: Username to connect to NiFi. This user should be able to send request to the NiFi API and access the `Resources` endpoint.
- Password: Password to connect to Nifi.
- Verify SSL: Whether SSL verification should be perform when authenticating.
2. Client Certificate Authentication
16 changes: 8 additions & 8 deletions content/v1.6.x/connectors/pipeline/nifi/yaml.md
@@ -4,16 +4,16 @@ slug: /connectors/pipeline/nifi/yaml
---

{% connectorDetailsHeader
name="Nifi"
name="NiFi"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Pipelines"]
unavailableFeatures=["Pipeline Status", "Owners", "Tags", "Lineage"]
/ %}

In this section, we provide guides and references to use the Nifi connector.
In this section, we provide guides and references to use the NiFi connector.

Configure and schedule Nifi metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule NiFi metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
@@ -26,7 +26,7 @@ Configure and schedule Nifi metadata and profiler workflows from the OpenMetadat

{% partial file="/v1.6/connectors/python-requirements.md" /%}

To run the Nifi ingestion, you will need to install:
To run the NiFi ingestion, you will need to install:

```bash
pip3 install "openmetadata-ingestion[nifi]"
```

@@ -47,7 +47,7 @@ The workflow is modeled around the following

### 1. Define the YAML Config

This is a sample config for Nifi:
This is a sample config for NiFi:

{% codePreview %}

@@ -58,10 +58,10 @@ This is a sample config for Nifi:
{% codeInfo srNumber=1 %}

**hostPort**: Pipeline Service Management UI URL
**nifiConfig**: one of
**NiFiConfig**: one of
**1.** Using Basic authentication
- **username**: Username to connect to Nifi. This user should be able to send request to the Nifi API and access the `Resources` endpoint.
- **password**: Password to connect to Nifi.
- **username**: Username to connect to NiFi. This user should be able to send request to the NiFi API and access the `Resources` endpoint.
- **password**: Password to connect to NiFi.
- **verifySSL**: Whether SSL verification should be perform when authenticating.
**2.** Using client certificate authentication
- **certificateAuthorityPath**: Path to the certificate authority (CA) file. This is the certificate used to store and issue your digital certificate. This is an optional parameter. If omitted SSL verification will be skipped; this can present some sever security issue.
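
The remaining client-certificate fields are elided in this view. Pulling the basic-authentication parameters above together, a minimal connection sketch might look like the following; values are placeholders and the exact field names are assumptions to be confirmed against the NiFi connection schema.

```yaml
# Illustrative basic-authentication sketch only; placeholder values throughout.
source:
  type: nifi
  serviceName: local_nifi                   # assumed service name
  serviceConnection:
    config:
      type: Nifi
      hostPort: https://localhost:8443      # Pipeline Service Management UI URL
      nifiConfig:
        username: nifi_user                 # placeholder
        password: nifi_password             # placeholder
        verifySSL: false                    # set true when a trusted certificate is in place
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
```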
@@ -4,7 +4,7 @@ slug: /connectors/dashboard/powerbireportserver
---

{% connectorDetailsHeader
name="PowerBIReportServer"
name="PowerBI Report Server"
stage="BETA"
platform="Collate"
availableFeatures=["Dashboards"]
@@ -4,7 +4,7 @@ slug: /connectors/dashboard/powerbireportserver/yaml
---

{% connectorDetailsHeader
name="PowerBIReportServer"
name="PowerBI Report Server"
stage="BETA"
platform="Collate"
availableFeatures=["Dashboards"]
10 changes: 5 additions & 5 deletions content/v1.7.x-SNAPSHOT/connectors/database/deltalake/index.md
@@ -4,17 +4,17 @@ slug: /connectors/database/deltalake
---

{% connectorDetailsHeader
name="DeltaLake"
name="Delta Lake"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "dbt"]
unavailableFeatures=["Query Usage", "Data Profiler", "Data Quality", "Lineage", "Column-level Lineage", "Owners", "Tags", "Stored Procedures"]
/ %}


In this section, we provide guides and references to use the Deltalake connector.
In this section, we provide guides and references to use the Delta Lake connector.

Configure and schedule Deltalake metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule Delta Lake metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
@@ -25,10 +25,10 @@ Configure and schedule Deltalake metadata and profiler workflows from the OpenMe

## Requirements

Deltalake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
Delta Lake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
for Python 3.11

The DeltaLake connector is able to extract the information from a **metastore** or directly from the **storage**.
The Delta Lake connector is able to extract the information from a **metastore** or directly from the **storage**.

If extracting directly from the storage, some extra requirements are needed depending on the storage

12 changes: 6 additions & 6 deletions content/v1.7.x-SNAPSHOT/connectors/database/deltalake/yaml.md
@@ -4,16 +4,16 @@ slug: /connectors/database/deltalake/yaml
---

{% connectorDetailsHeader
name="DeltaLake"
name="Delta Lake"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "dbt"]
unavailableFeatures=["Query Usage", "Data Profiler", "Data Quality", "Lineage", "Column-level Lineage", "Owners", "Tags", "Stored Procedures"]
/ %}

In this section, we provide guides and references to use the Deltalake connector.
In this section, we provide guides and references to use the Delta Lake connector.

Configure and schedule Deltalake metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule Delta Lake metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
@@ -23,14 +23,14 @@ Configure and schedule Deltalake metadata and profiler workflows from the OpenMe

## Requirements

Deltalake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
Delta Lake requires to run with Python 3.8, 3.9 or 3.10. We do not yet support the Delta connector
for Python 3.11

### Python Requirements

{% partial file="/v1.7/connectors/python-requirements.md" /%}

To run the Deltalake ingestion, you will need to install:
To run the Delta Lake ingestion, you will need to install:

- If extracting from a metastore

@@ -49,7 +49,7 @@ pip3 install "openmetadata-ingestion[deltalake-storage]"

All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/deltaLakeConnection.json)
you can find the structure to create a connection to Deltalake.
you can find the structure to create a connection to Delta Lake.

In order to create and run a Metadata Ingestion workflow, we will follow
the steps to create a YAML configuration able to connect to the source,
@@ -4,16 +4,16 @@ slug: /connectors/database/singlestore
---

{% connectorDetailsHeader
name="Singlestore"
name="SingleStore"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Metadata", "Data Profiler", "Data Quality", "View Lineage", "View Column-level Lineage", "dbt"]
unavailableFeatures=["Query Usage", "Stored Procedures", "Owners", "Tags"]
/ %}

In this section, we provide guides and references to use the Singlestore connector.
In this section, we provide guides and references to use the SingleStore connector.

Configure and schedule Singlestore metadata and profiler workflows from the OpenMetadata UI:
Configure and schedule SingleStore metadata and profiler workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)
