diff --git a/content/operate/rs/7.4/references/rest-api/objects/bdb/backup_location.md b/content/operate/rs/7.4/references/rest-api/objects/bdb/backup_location.md
index a1d331df22..0991403182 100644
--- a/content/operate/rs/7.4/references/rest-api/objects/bdb/backup_location.md
+++ b/content/operate/rs/7.4/references/rest-api/objects/bdb/backup_location.md
@@ -16,7 +16,7 @@ You can back up or export a database's dataset to the following types of locatio
 
 - FTP/S
 - SFTP
-- Amazon S3
+- Amazon S3 or S3-compatible storage
 - Google Cloud Storage
 - Microsoft Azure Storage
 - NAS/Local Storage
@@ -61,6 +61,27 @@ Any additional required parameters may differ based on the backup/export locatio
 | secret_access_key | string | The AWS Secret Access Key that matches the Access Key ID |
 | subdir | string | Path to the backup directory in the S3 bucket (optional) |
 
+You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate.
+
+To connect to an S3-compatible storage location:
+
+1. Configure the S3 URL with [`rladmin cluster config`]({{}}):
+
+    ```sh
+    rladmin cluster config s3_url <URL>
+    ```
+
+    Replace `<URL>` with the hostname or IP address of the S3-compatible storage location.
+
+1. Configure the S3 CA certificate:
+
+    ```sh
+    rladmin cluster config s3_ca_cert <filepath>
+    ```
+
+    Replace `<filepath>` with the location of the S3 CA certificate `ca.pem`.
+
+
 ### Google Cloud Storage
 
 | Key name | Type | Description |
diff --git a/content/operate/rs/7.4/references/rest-api/objects/bdb/dataset_import_sources.md b/content/operate/rs/7.4/references/rest-api/objects/bdb/dataset_import_sources.md
index 49b2d7bea4..f00f044a29 100644
--- a/content/operate/rs/7.4/references/rest-api/objects/bdb/dataset_import_sources.md
+++ b/content/operate/rs/7.4/references/rest-api/objects/bdb/dataset_import_sources.md
@@ -18,6 +18,7 @@ You can import data to a database from the following location types:
 - FTP
 - SFTP
 - Amazon S3
+- Amazon S3 or S3-compatible storage
 - Google Cloud Storage
 - Microsoft Azure Storage
 - NAS/Local Storage
@@ -66,6 +67,26 @@ Any additional required parameters may differ based on the import location type.
 | secret_access_key | string | The AWS Secret Access that matches the Access Key ID |
 | subdir | string | Path to the backup directory in the S3 bucket (optional) |
 
+You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate.
+
+To connect to an S3-compatible storage location:
+
+1. Configure the S3 URL with [`rladmin cluster config`]({{}}):
+
+    ```sh
+    rladmin cluster config s3_url <URL>
+    ```
+
+    Replace `<URL>` with the hostname or IP address of the S3-compatible storage location.
+
+1. Configure the S3 CA certificate:
+
+    ```sh
+    rladmin cluster config s3_ca_cert <filepath>
+    ```
+
+    Replace `<filepath>` with the location of the S3 CA certificate `ca.pem`.
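+
+For example, for a hypothetical MinIO deployment reachable at `minio.example.com`, with its CA certificate copied to `/etc/ssl/certs/minio-ca.pem`, the two commands might look like this (the hostname and certificate path are illustrative placeholders, not defaults):
+
+```sh
+# Hypothetical S3-compatible endpoint
+rladmin cluster config s3_url minio.example.com
+
+# Hypothetical path to the storage service's CA certificate
+rladmin cluster config s3_ca_cert /etc/ssl/certs/minio-ca.pem
+```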
+
 ### Google Cloud Storage
 
 | Key name | Type | Description |
diff --git a/content/operate/rs/7.8/references/rest-api/objects/bdb/backup_location.md b/content/operate/rs/7.8/references/rest-api/objects/bdb/backup_location.md
index 236b9a125b..554ae4253d 100644
--- a/content/operate/rs/7.8/references/rest-api/objects/bdb/backup_location.md
+++ b/content/operate/rs/7.8/references/rest-api/objects/bdb/backup_location.md
@@ -16,7 +16,7 @@ You can back up or export a database's dataset to the following types of locatio
 
 - FTP/S
 - SFTP
-- Amazon S3
+- Amazon S3 or S3-compatible storage
 - Google Cloud Storage
 - Microsoft Azure Storage
 - NAS/Local Storage
@@ -61,6 +61,26 @@ Any additional required parameters may differ based on the backup/export locatio
 | secret_access_key | string | The AWS Secret Access Key that matches the Access Key ID |
 | subdir | string | Path to the backup directory in the S3 bucket (optional) |
 
+You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate.
+
+To connect to an S3-compatible storage location:
+
+1. Configure the S3 URL with [`rladmin cluster config`]({{}}):
+
+    ```sh
+    rladmin cluster config s3_url <URL>
+    ```
+
+    Replace `<URL>` with the hostname or IP address of the S3-compatible storage location.
+
+1. Configure the S3 CA certificate:
+
+    ```sh
+    rladmin cluster config s3_ca_cert <filepath>
+    ```
+
+    Replace `<filepath>` with the location of the S3 CA certificate `ca.pem`.
+
 ### Google Cloud Storage
 
 | Key name | Type | Description |
diff --git a/content/operate/rs/7.8/references/rest-api/objects/bdb/dataset_import_sources.md b/content/operate/rs/7.8/references/rest-api/objects/bdb/dataset_import_sources.md
index 9890431e69..091d4c068f 100644
--- a/content/operate/rs/7.8/references/rest-api/objects/bdb/dataset_import_sources.md
+++ b/content/operate/rs/7.8/references/rest-api/objects/bdb/dataset_import_sources.md
@@ -18,6 +18,7 @@ You can import data to a database from the following location types:
 - FTP
 - SFTP
 - Amazon S3
+- Amazon S3 or S3-compatible storage
 - Google Cloud Storage
 - Microsoft Azure Storage
 - NAS/Local Storage
@@ -67,6 +68,26 @@ Any additional required parameters may differ based on the import location type.
 | secret_access_key | string | The AWS Secret Access that matches the Access Key ID |
 | subdir | string | Path to the backup directory in the S3 bucket (optional) |
 
+You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate.
+
+To connect to an S3-compatible storage location:
+
+1. Configure the S3 URL with [`rladmin cluster config`]({{}}):
+
+    ```sh
+    rladmin cluster config s3_url <URL>
+    ```
+
+    Replace `<URL>` with the hostname or IP address of the S3-compatible storage location.
+
+1. Configure the S3 CA certificate:
+
+    ```sh
+    rladmin cluster config s3_ca_cert <filepath>
+    ```
+
+    Replace `<filepath>` with the location of the S3 CA certificate `ca.pem`.
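+
+As a quick check with illustrative values (the endpoint and certificate path below are hypothetical), you can set both parameters and then review the cluster configuration, for example with `rladmin info cluster`:
+
+```sh
+# Hypothetical S3-compatible endpoint and CA certificate location
+rladmin cluster config s3_url storage.internal.example.com
+rladmin cluster config s3_ca_cert /opt/redislabs/certs/storage-ca.pem
+
+# Review the current cluster configuration
+rladmin info cluster
+```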
+
 ### Google Cloud Storage
 
 | Key name | Type | Description |
diff --git a/content/operate/rs/references/rest-api/objects/bdb/backup_location.md b/content/operate/rs/references/rest-api/objects/bdb/backup_location.md
index a3a874c0d3..26cde75034 100644
--- a/content/operate/rs/references/rest-api/objects/bdb/backup_location.md
+++ b/content/operate/rs/references/rest-api/objects/bdb/backup_location.md
@@ -16,6 +16,7 @@ You can back up or export a database's dataset to the following types of locatio
 - FTP/S
 - SFTP
 - Amazon S3
+- Amazon S3 or S3-compatible storage
 - Google Cloud Storage
 - Microsoft Azure Storage
 - NAS/Local Storage
@@ -60,6 +61,26 @@ Any additional required parameters may differ based on the backup/export locatio
 | secret_access_key | string | The AWS Secret Access Key that matches the Access Key ID |
 | subdir | string | Path to the backup directory in the S3 bucket (optional) |
 
+You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate.
+
+To connect to an S3-compatible storage location:
+
+1. Configure the S3 URL with [`rladmin cluster config`]({{}}):
+
+    ```sh
+    rladmin cluster config s3_url <URL>
+    ```
+
+    Replace `<URL>` with the hostname or IP address of the S3-compatible storage location.
+
+1. Configure the S3 CA certificate:
+
+    ```sh
+    rladmin cluster config s3_ca_cert <filepath>
+    ```
+
+    Replace `<filepath>` with the location of the S3 CA certificate `ca.pem`.
+
 ### Google Cloud Storage
 
 | Key name | Type | Description |
diff --git a/content/operate/rs/references/rest-api/objects/bdb/dataset_import_sources.md b/content/operate/rs/references/rest-api/objects/bdb/dataset_import_sources.md
index e031fff61f..d685d5a83e 100644
--- a/content/operate/rs/references/rest-api/objects/bdb/dataset_import_sources.md
+++ b/content/operate/rs/references/rest-api/objects/bdb/dataset_import_sources.md
@@ -17,6 +17,7 @@ You can import data to a database from the following location types:
 - FTP
 - SFTP
 - Amazon S3
+- Amazon S3 or S3-compatible storage
 - Google Cloud Storage
 - Microsoft Azure Storage
 - NAS/Local Storage
@@ -66,6 +67,26 @@ Any additional required parameters may differ based on the import location type.
 | secret_access_key | string | The AWS Secret Access that matches the Access Key ID |
 | subdir | string | Path to the backup directory in the S3 bucket (optional) |
 
+You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate.
+
+To connect to an S3-compatible storage location:
+
+1. Configure the S3 URL with [`rladmin cluster config`]({{}}):
+
+    ```sh
+    rladmin cluster config s3_url <URL>
+    ```
+
+    Replace `<URL>` with the hostname or IP address of the S3-compatible storage location.
+
+1. Configure the S3 CA certificate:
+
+    ```sh
+    rladmin cluster config s3_ca_cert <filepath>
+    ```
+
+    Replace `<filepath>` with the location of the S3 CA certificate `ca.pem`.
+
 ### Google Cloud Storage
 
 | Key name | Type | Description |