From 9cb67895016c6976ed035ce4ff187db0af4c0ad1 Mon Sep 17 00:00:00 2001
From: Colin Saliceti
Date: Fri, 13 Dec 2024 10:41:33 +0000
Subject: [PATCH 1/2] Remove the need for ARM_ACCESS_KEY

Obtain it dynamically so we don't need to maintain it
---
 .github/workflows/destroy.yml | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/destroy.yml b/.github/workflows/destroy.yml
index 0c55f1b82..46304359b 100644
--- a/.github/workflows/destroy.yml
+++ b/.github/workflows/destroy.yml
@@ -55,5 +55,11 @@ jobs:
         run: |
           make ci review terraform-destroy PR_NUMBER=${{github.event.number}}
 
+      - name: Set Container Access Key
+        run: |
+          TFSTATE_CONTAINER_ACCESS_KEY="$(az storage account keys list -g s189t01-gse-rv-rg -n s189t01gsetfstatervsa | jq -r '.[0].value')"
+          echo "::add-mask::$TFSTATE_CONTAINER_ACCESS_KEY"
+          echo "TFSTATE_CONTAINER_ACCESS_KEY=$TFSTATE_CONTAINER_ACCESS_KEY" >> $GITHUB_ENV
+
       - name: Delete Terraform State File
-        run: az storage blob delete --container-name terraform-state --account-name s189t01gsetfstatervsa --account-key ${{secrets.ARM_ACCESS_KEY}} -n ${{github.event.number}}_kubernetes.tfstate
+        run: az storage blob delete --container-name terraform-state --account-name s189t01gsetfstatervsa --account-key ${TFSTATE_CONTAINER_ACCESS_KEY} -n ${{github.event.number}}_kubernetes.tfstate

From e7ff3f8179f87bf4f6ba4412f0143329ef388a2c Mon Sep 17 00:00:00 2001
From: Colin Saliceti
Date: Fri, 13 Dec 2024 10:49:09 +0000
Subject: [PATCH 2/2] Update docs

---
 docs/documents/DevOps/database-migrations.md | 38 -----------
 docs/documents/DevOps/manual-terraform.md    | 66 --------------------
 docs/documents/DevOps/tools/trello.md        |  2 +-
 docs/documents/DevOps/workflows.md           |  3 -
 docs/documents/disaster-recovery.md          | 64 -------------------
 5 files changed, 1 insertion(+), 172 deletions(-)
 delete mode 100644 docs/documents/DevOps/database-migrations.md
 delete mode 100644 docs/documents/DevOps/manual-terraform.md
 delete mode 100644 docs/documents/disaster-recovery.md

diff --git a/docs/documents/DevOps/database-migrations.md b/docs/documents/DevOps/database-migrations.md
deleted file mode 100644
index 179cfc0bf..000000000
--- a/docs/documents/DevOps/database-migrations.md
+++ /dev/null
@@ -1,38 +0,0 @@
----
-title: Automatic Migrations
-nav_order: 20
-parent: DevOps
-layout: page
----
-
-## Automatic Migrations
-The endpoint has been written for the Dockerfile. This has three parameters
-
-- -m - Carry out migration
-- -f run in frontend mode
-- -b run in background jobs mode
-- -s run in sidekiq jobs mode
-( -f -b and -s are mutually exclusive )
-
-## Migration Policy
-In the following environments the two applications are set as
-
-|Environment |Frontend |Background Jobs|
-|-------------|---------|---------------|
-|Development |Migrates| Does Not Migrate|
-|Review |Migrates| Does Not Migrate|
-|Staging | Migrates| Does Not Migrate|
-|Production | Migrates| Does Not Migrate|
-
-Development and Review run in the same space and share databases. We must take care not to remove or change existing objects unless on the master branch as this could impact older 'review' instances. Adding new objects should not normally be a problem, unless they are made mandatory and have no 'Defaults'
-
-## Manual Migration
-To manually migrate a database you need access to Cloud Foundry and the ability to ssh on to the application. Makesure the application is deployed
-with the version of the code you wish to migrate the database to, then issue the following commands.
-
-```
-cf ssh school-experience-app-dev
->> cd /app
->> export PATH=${PATH}:/usr/local/bin
->> bundle exec rails db:migrate
-```
diff --git a/docs/documents/DevOps/manual-terraform.md b/docs/documents/DevOps/manual-terraform.md
deleted file mode 100644
index 6c7649d4a..000000000
--- a/docs/documents/DevOps/manual-terraform.md
+++ /dev/null
@@ -1,66 +0,0 @@
----
-title: Manual Deployments
-nav_order: 20
-parent: DevOps
-layout: page
----
-
-
-# Manual Deployment
-
-These are instructions on how to manually deploy a Cloud Foundry application to **Development** using the terraform cli.
-
-Log into Azure and make sure you have access to the right account, setting your default subscription
-
-```
-az login
-az account list --output table
-az account set --subscription
-
-```
-
-We need to set the ```ARM_ACCESS_KEY```, you will find the ```storage_account_name``` used to store the terraform state.
-Then list the storage accounts in your subscription
-
-```
-az storage account list --output table
-az storage account keys list --resource-group --account-name
-export ARM_ACCESS_KEY=
-```
-
-
-Log into to cloud foundry CLI and get the deployed version of the docker image for the application you are interested in `
-
-```
-❯ cf app school-experience-app-dev
-Showing health and status for app school-experience-app-dev in org dfe / space get-into-teaching as 112871240885355412226...
-
-name: school-experience-app-dev
-requested state: started
-routes: school-experience-app-dev.london.cloudapps.digital, school-experience-app-dev-internal.apps.internal
-last uploaded: Mon 07 Feb 15:53:49 GMT 2022
-stack:
-docker image: ghcr.io/dfe-digital/schools-experience:sha-1d5aaa8
-
-type: web
-sidecars:
-instances: 1/1
-memory usage: 1024M
-     state     since                  cpu    memory         disk           details
-#0   running   2022-02-07T15:55:50Z   0.1%   149.4M of 1G   1G of 1.5G
-```
-
-Now this bit is a little tricky and depends on your variable name, so you in the case of School Experience the variable is called ```docker_image``` but it may vary and you will need to find it usually in your variables.tf file.
-
-set this to the current image, but as it is used in terraform it needs to be preceeded with TF_VAR_ so
-
-```export TF_VAR_docker_image=ghcr.io/dfe-digital/schools-experience:sha-1d5aaa8```
-
-now we are ready to run terraform.
-
-```
-rm -rf .terraform
-terraform init -backend-config=dev.bk.vars
-terraform plan --var-file=dev.env.tfvars
-```
-The plan should show nothing has changed, if all is good you can now change what you like and run tests.
diff --git a/docs/documents/DevOps/tools/trello.md b/docs/documents/DevOps/tools/trello.md
index 5d464949d..96339e98d 100644
--- a/docs/documents/DevOps/tools/trello.md
+++ b/docs/documents/DevOps/tools/trello.md
@@ -17,7 +17,7 @@ In School Experience there is a workflow which uses the [Github Action](https://
 [Get School Experience](https://trello.com/b/nS4OTSIl/get-school-experience)
 
 ## Change Key
-The trello keys are linked to the service account schoolexperience-tech@digital.education.gov.uk. The email is a Google group and the trello password
+The trello keys are linked to the service account schoolexperience-tech@digital.education.gov.uk. The email is a 365 distribution list and the trello password
 is kept in azure key vault s105p01-kv.
 
 To generate the API key and token, follow the instructions on [Developer API Keys](https://trello.com/app-key). Then once you have them, use the command `make development edit-infra-secrets` to edit your secrets and store them in Azure key vault.
diff --git a/docs/documents/DevOps/workflows.md b/docs/documents/DevOps/workflows.md
index 8df15b7a4..612cab9be 100644
--- a/docs/documents/DevOps/workflows.md
+++ b/docs/documents/DevOps/workflows.md
@@ -31,9 +31,6 @@ This is the heart of the deployment system, it has a number of phases
 8. **Quality Assurance** - Also known as the staging area, this is a near live test, ensuring all the components hang together, so there is a high degree of confidence the Production deployment will work.
 9. **Production** - The final deployment to the live system
 
-### [Check Service Principal](https://github.com/DFE-Digital/schools-experience/blob/master/.github/workflows/check_sp.yml)
-Ran each night this workflow checks that Service Prinicipal has at least 30 days left before its security credentials expire. If due to expire a message is sent to [SLACK](https://slack.com) with instructions on how to reset it.
-
 ### [DO NOT MERGE](https://github.com/DFE-Digital/schools-experience/blob/master/.github/workflows/stop_merges.yml)
 When a PR is labelled with DO NOT MERGE, this workflow will prevent the system from merging it.
 
diff --git a/docs/documents/disaster-recovery.md b/docs/documents/disaster-recovery.md
deleted file mode 100644
index f56272ebb..000000000
--- a/docs/documents/disaster-recovery.md
+++ /dev/null
@@ -1,64 +0,0 @@
----
-title: Disaster Recovery Plan
-layout: page
----
-
-## Disaster Recovery Plan
-
-Disaster Recover (DR) follows the principles outlined in the [Becoming a Teacher Disaster recovery plan](https://dfedigital.atlassian.net/wiki/spaces/BaT/pages/2921365676/Disaster+recovery).
-
-## Pre Requisit Installations
-
-[Install the Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli)
-
-[Install the Cloud Foundry CLI](https://docs.cloudfoundry.org/cf-cli/install-go-cli.html#pkg)
-
-## Access
-
-
-### Azure Access
-
-The Department of Education Azure system is known as Cloud Infrastructure Platform (CIP). The user will require access to the School Experience production subscription, and will need to have elevated privilages set via Azure privilaged identity managment, so they can download files in the storage account.
-
-[Access to CIP Guide](https://dfedigital.atlassian.net/wiki/spaces/BaT/pages/1897955586/Azure+CIP)
-
-## Specific to School Experience
-
-The data in Postgres is persistent and regular backups are carried out by Cloud Foundry (AWS).
-
-However, Every night the Azure Pipeline Datase backup will run, as part of this process it exports the live databases. For restore purposes we are storing this file in an Azure file storage account (Production subscription ).
-
-This file can be used to recover the database in the event of accidental deletion.
-
-### Download Recovery file
-
-The file can be downloaded via the Azure portal or using Azure CLI, in both cases you will need permission to access the Azure Storage Account.
-
-#### Azure CLI get storage keys
-
-Note: you will need to set your AZURE_STORAGE_KEY variable.
-To get this:
-
-```
-az login
-az account list --output=table ### List of accounts you have access to
-az account set --subscription
-az storage account list --output=table ### List of Storage accounts you have access to
-az storage account keys list --resource-group --account-name
-export AZURE_STORAGE_KEY=
-```
-
-#### Azure CLI download file
-
-Once your AZURE_STORAGE_KEY is set, you will need to download the backup file, these instructions show how you can do this using the Azure CLI:
-
-```
-az login
-az account list --output=table ### List of accounts you have access to
-az account set --subscription
-az storage account list --output=table ### Output is a list of storage accounts
-az storage container list --account-name --output=table ### Output is list of containers
-az storage blob list --container-name --account-name --output=table ### Output is list of files
-
-## Choose a File and download it.
-az storage blob download --container-name --account-name --name Tuesday.tar.gz --file output.sql
-```