General information about Amazon's Elasticsearch Service can be found in the official documentation.
For general help on upgrading Amazon Elasticsearch Service, see the official AWS upgrade guide.
The following steps are required to upgrade from Elasticsearch 6.x to 7.x:
- Important: a major upgrade from 6.x to 7.x deletes everything and the cluster will be rebuilt
- AWS SDK installed for the console
- SDK configured via ~/.aws/config
- The authenticated user (in the config) needs access to the Elasticsearch service
- All requests (GET, PUT, POST) have to be signed by AWS. It is therefore recommended to use a Python script with the necessary tools: requests, boto3 and AWS4Auth.
pip install requests requests-aws4auth boto3
All steps are described in detail in the official AWS documentation and should be followed.
The variable host = '{ELASTICSEARCH_URL}/'
in the Python scripts below has to point to the OLD Elasticsearch server, because that is the cluster we want to back up.
See Creating a bucket in the official AWS documentation
See Manual Snapshot Prerequisites in the official AWS documentation
The snapshot repository is the registration that tells Elasticsearch where to store and restore snapshots. The following script performs this operation. The same header is used for all scripts in the following sections, and only the differing parts are described: the first lines (imports and AWS authentication) are required for every operation, while payload, request type etc. differ per operation. Variables like {AWS_REGION}
in the code sections are placeholders and have to be replaced.
import boto3
import requests
from requests_aws4auth import AWS4Auth
host = '{ELASTICSEARCH_URL}' # include https:// and trailing /
region = '{AWS_REGION}' # e.g. us-west-1
service = 'es'
credentials = boto3.Session().get_credentials()
awsauth = AWS4Auth(credentials.access_key, credentials.secret_key, region, service, session_token=credentials.token)
# Register repository
path = '_snapshot/upgrade_repo' # the Elasticsearch API endpoint
url = host + path
payload = {
    "type": "s3",
    "settings": {
        "bucket": "{BUCKET_NAME}",
        # "endpoint": "s3.amazonaws.com", # for us-east-1
        "region": "{AWS_REGION}", # for all other regions
        "role_arn": "{SNAPSHOT_TO_S3_ROLE}"
    }
}
headers = {"Content-Type": "application/json"}
r = requests.put(url, auth=awsauth, json=payload, headers=headers)
print(r.status_code)
print(r.text)
# Take snapshot
path = '_snapshot/upgrade_repo/{SNAPSHOT_NAME}' # the repository registered above
url = host + path
r = requests.put(url, auth=awsauth)
print(r.text)
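Taking a snapshot is asynchronous: the PUT above returns immediately while the snapshot runs in the background. A small polling helper can wait for a terminal state before you continue with the restore. This is a sketch, assuming the host/awsauth setup from the header script and the repository and snapshot names used above:

```python
import time

def snapshot_state(response_json, snapshot_name):
    # Return the state (e.g. "IN_PROGRESS", "SUCCESS", "FAILED") of the
    # named snapshot, or None if it is not present in the response.
    for snap in response_json.get("snapshots", []):
        if snap.get("snapshot") == snapshot_name:
            return snap.get("state")
    return None

def wait_for_snapshot(host, awsauth, repo, snapshot_name, interval=30):
    # Poll the snapshot endpoint until the snapshot leaves IN_PROGRESS.
    import requests
    while True:
        r = requests.get(host + "_snapshot/" + repo + "/" + snapshot_name,
                         auth=awsauth)
        state = snapshot_state(r.json(), snapshot_name)
        if state != "IN_PROGRESS":
            return state
        time.sleep(interval)
```

For example, wait_for_snapshot(host, awsauth, 'upgrade_repo', '{SNAPSHOT_NAME}') blocks until Elasticsearch reports SUCCESS or FAILED.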
- Log in to Kibana
- Click on Management
- Click on Saved Objects
- Select everything necessary and click on Export x objects at the top right
- This triggers the download of a JSON file export.json
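If the Kibana UI is not reachable, the saved objects can also be read straight out of the .kibana system index with the regular search API. This is an unofficial fallback sketch, not the documented export path: the size limit of 1000 and the output file name kibana-objects.json are assumptions.

```python
import json

def extract_saved_objects(search_response):
    # Pull the raw saved-object documents out of a search response body.
    return [hit["_source"] for hit in search_response.get("hits", {}).get("hits", [])]

def export_kibana_objects(host, awsauth, out_file="kibana-objects.json", size=1000):
    # Dump all documents from the .kibana index of the OLD cluster to a file.
    # host and awsauth as in the header script above.
    import requests
    r = requests.get(host + ".kibana/_search?size=" + str(size), auth=awsauth)
    objects = extract_saved_objects(r.json())
    with open(out_file, "w") as f:
        json.dump(objects, f)
    return len(objects)
```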
The variable host = '{ELASTICSEARCH_URL}/'
in the Python scripts now has to point to the NEW Elasticsearch server, because we want to restore the snapshot onto it.
The steps are the same as for the backup repository registration, but with the new Elasticsearch URL:
path = '_snapshot/upgrade_repo' # the Elasticsearch API endpoint
url = host + path
payload = {
    "type": "s3",
    "settings": {
        "bucket": "{BUCKET_NAME}",
        # "endpoint": "s3.amazonaws.com", # for us-east-1
        "region": "{AWS_REGION}", # for all other regions
        "role_arn": "{SNAPSHOT_TO_S3_ROLE}"
    }
}
headers = {"Content-Type": "application/json"}
r = requests.put(url, auth=awsauth, json=payload, headers=headers)
print(r.status_code)
print(r.text)
The restore can take quite a while, so watch the progress in the AWS Console -> CloudWatch or on the Elasticsearch dashboard itself.
path = '_snapshot/{REPO_NAME}/{SNAPSHOT_NAME}/_restore'
url = host + path
payload = {
    "indices": [ "{INDEX_1}", "{INDEX_2}" ],
    "include_global_state": False
}
headers = {"Content-Type": "application/json"}
r = requests.post(url, auth=awsauth, json=payload, headers=headers)
print(r.status_code)
print(r.text)
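Besides CloudWatch, the recovery status can be polled directly on the new cluster via the _cat/recovery API, which returns one row per shard recovery. A minimal sketch, assuming the same host/awsauth setup as the header script:

```python
def restore_done(recovery_rows):
    # _cat/recovery?format=json returns one entry per shard recovery;
    # the restore is finished once every entry has reached the "done" stage.
    return bool(recovery_rows) and all(row.get("stage") == "done" for row in recovery_rows)

def check_restore(host, awsauth):
    # Query the NEW cluster's recovery status.
    import requests
    rows = requests.get(host + "_cat/recovery?format=json", auth=awsauth).json()
    return restore_done(rows)
```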
- Log in to Kibana
- Click on Management
- Click on Saved Objects
- Click on Import at the top right and select the previously exported file export.json
# Show the registered snapshot repository
path = '_snapshot/{REPO_NAME}'
url = host + path
headers = {"Content-Type": "application/json"}
r = requests.get(url, auth=awsauth, headers=headers)
print(r.text)
# List all snapshots in the repository
path = '_snapshot/{REPO_NAME}/_all?pretty'
url = host + path
headers = {"Content-Type": "application/json"}
r = requests.get(url, auth=awsauth, headers=headers)
print(r.text)
# Delete ALL indices on the cluster - use with caution!
path = '*'
url = host + path
r = requests.delete(url, auth=awsauth)
print(r.text)