
Common Dev Env (for Bulk Scan/Bulk Print)

A simplified way of running BSBP services locally on Mac machines, for a specified list of GitHub repos.

Pre-requisites

  1. Ensure your VPN is on!

Please run the following commands to ensure you don't have issues pulling down Docker images:

  1. az acr login -n hmctspublic.azurecr.io
  2. nano ~/.docker/config.json
  3. Remove "credsStore": "desktop"

Simply delete that line, then press Control + O to save and Control + X to exit nano.
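
If you would rather not edit the file by hand, a minimal sketch using jq (assuming jq is installed; back up the file first):

  # back up the original, then write it back without the credsStore key
  cp ~/.docker/config.json ~/.docker/config.json.bak
  jq 'del(.credsStore)' ~/.docker/config.json.bak > ~/.docker/config.json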

Purpose

This will primarily be used by developers to pull down and run specific services, which are listed in the project's services.json file. Any changes to which services are cloned and run can be made there.

Spinning up the BSBP services takes roughly a few minutes (at present; this can be expanded to other projects). It provides a simple way to do end-to-end testing, or to open an application in IntelliJ for running, debugging, and so forth.

For each service it will do the following:

  1. Prompt the user whether they want only the database spun up, or the service as well (catering for IDE running vs Docker running)
  2. Clone the repo if it has not already been cloned
  3. Copy script files to the bin directory; these are used to create the .env file needed for the service to run in Docker (or locally through IntelliJ, or whatever IDE you may use, via the env plugin)
  4. Add those script files to the service's .gitignore (if not already there)
  5. Run the setup script, which creates the environment variables, assembles the jar, and runs the Docker services listed in the docker-compose.yml file

The simplest way to spin up all the services is to run:

python3 ./start.py start service all

Requirements

Make sure you have the following on your machine. The scripts will prompt you if anything is missing, so it is not a big worry if you don't check beforehand:

  1. Python 3.9 or later (run python3 --version to check)
  2. Bash 4 or later (run bash --version in a terminal window to check)
  3. Azure Storage Explorer, if you want to view storage containers locally on your machine

Specific commands on the BSBP dev env

Go to the main directory of the service and run:

python3 ./start.py :command

Where :command is one of the following (example invocations follow the list):

  1. (no args) (default; prompts to start all services)
  2. start service all (start all services)
  3. start activemq (add an ActiveMQ instance for BSBP; done by default on start service all)
  4. start service :service (start a specific service, e.g. bulk-scan-orchestrator)
  5. stop service all (stop all services)
  6. stop service :name (stop one service)
  7. stop activemq (stop the Docker ActiveMQ instance)
  8. get docker logs :service (get Docker logs for one service)
  9. run dailychecks (requires a bearer token)
  10. reset branches (set each service's branch to master and run git pull)
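
For example (bulk-scan-orchestrator is the service named above; any service listed in services.json works the same way):

  python3 ./start.py start service bulk-scan-orchestrator
  python3 ./start.py get docker logs bulk-scan-orchestrator
  python3 ./start.py stop service all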

Connecting to FileZilla

For services where the setup-sftp.sh script is required, you can connect to the local server via FileZilla (or via the command line if preferred; a sketch follows the steps below).

To connect to SFTP:

  1. Download FileZilla
  2. Navigate to the location where the sftp certificates are configured (for example, docker/database) and acquire the public certificate.
  3. Navigate to FileZilla and set the site settings accordingly:
    1. Protocol: SFTP - SSH File Transfer Protocol
    2. Host: localhost
    3. Logon Type: Key file
    4. User: the one configured as a part of the docker-compose.yaml setup. For example: mosh
    5. Key file: the downloaded public key for the local dev-env only.
  4. Click connect.
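
A command-line sketch of the same connection (the key path and port here are assumptions; take the real values from the service's docker-compose.yaml and certificate location, with mosh being the example user above):

  # connect to the local SFTP container; -i points at the key file
  sftp -i docker/database/ssh_key -P 2222 mosh@localhost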

Setting up Local SFTP Configuration For a Service

For this to work with a service the following is needed:

  1. The service needs a docker-compose.yaml
  2. Within this file, there needs to be a service (within the services section) that has sftp as part of its name.
  3. The certificates required by that service need to be managed accordingly. Normally there is a Dockerfile that manages this and sets up the certificates (a key-generation sketch follows the list).
  4. For an example, refer to this link
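
If you need a throwaway key pair for the local container, a minimal sketch (the output path is an assumption; real setups normally generate or copy keys in the service's Dockerfile):

  # generate an unencrypted RSA key pair for local dev use only
  ssh-keygen -t rsa -b 4096 -f docker/database/ssh_key -N ""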

Connecting to Local Azure Blob Storage

For services where the setup-azurite.sh script is required, you can connect to the local server via Microsoft Azure Storage Explorer. The steps below use Bulk Scan as a specific example, which should give a picture of what is required (a command-line check follows the list):

  1. Open Microsoft Azure Storage Explorer
  2. Click the connection button, and select the “storage account or service” option.
  3. Add in the SAS token found in the docker/storage/init-azurite.sh file (using the example above).
  4. If you get an authentication error when doing this, pick the Local Storage Emulator option instead, and find the account key via the SAS token. Reference the account name as bulkscanlocal/reformscanlocal too if using the example.
  5. If using the Bulk Scan example, do this again for the bulkscanlocal SAS token found within the same file
  6. Refresh the explorer, and you should be able to see all the containers added through the shell script! Woo!
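
You can also sanity-check the connection from the command line with the Azure CLI (the account name and endpoint follow the Bulk Scan example above; the key is whatever init-azurite.sh configures):

  az storage container list \
    --connection-string "DefaultEndpointsProtocol=http;AccountName=bulkscanlocal;AccountKey=<key from init-azurite.sh>;BlobEndpoint=http://localhost:10000/bulkscanlocal;"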

Setting up Local Azure Blob Storage For a Service

For this to work with a service the following is needed:

  1. The service needs a docker-compose.yaml, and services.json needs setup-azurite.sh as part of the required scripts.
  2. Ensure the docker-compose.yaml has the following:
      azure-storage-emulator-azurite:
         image: mcr.microsoft.com/azure-storage/azurite
         command: azurite-blob --blobHost 0.0.0.0 --loose --skipApiVersionCheck
         environment:
            # more than one account may be listed; replace the container and key values
            AZURITE_ACCOUNTS: container:key;container2:key2;
         volumes:
            - ./<service-name>-azure-blob-data:/opt/azurite/folder
         ports:
            - 10000:10000
      init-storage:
         build:
            context: ./docker/storage
         links:
            - azure-storage-emulator-azurite
         depends_on:
            - azure-storage-emulator-azurite
  3. Make sure the /docker/storage path exists in the repo.
  4. Within it, the init-azurite.sh file (or equivalent) will be run as part of the build context. Ensure it contains the setup required for users, containers and so forth, for example:
    SOURCE_CONNECTION_STRING="DefaultEndpointsProtocol=http;AccountName=<account name>;AccountKey=<a random key>;BlobEndpoint=http://azure-storage-emulator-azurite:10000/<account name>;"
    
    az storage container create --name <container> --connection-string "$SOURCE_CONNECTION_STRING"
  5. Ensure that the Azurite Docker container is created as part of running the service, as this is required to connect using Azure Storage Explorer (a quick check follows the list)
  6. For a complete example, see here
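
A quick way to confirm the Azurite container is up before opening Storage Explorer (the name filter is an assumption; fall back to plain docker ps if it does not match):

  docker ps --filter "name=azurite"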

Connecting to Queues (with ActiveMQ)

If you want to explore queues, run the dev-env and an ActiveMQ instance will be spun up, with several queues currently configured. To access the console:

  1. Go to http://localhost:8161
  2. Log in with admin as the username and password as the password

If you then want to add a message to a queue, simply click on the queue and add the message. What message to add depends on the context of the service; you may want to look at an existing Service Bus in Azure for some examples. A scripted sketch follows.
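
For scripting, the ActiveMQ web console also exposes a simple REST endpoint for posting messages; a hedged sketch (the queue name is a placeholder, and admin/admin are the default credentials above):

  # POST a test message to a queue via the console's REST API
  curl -u admin:admin -d "body=test message" \
    "http://localhost:8161/api/message/<queue-name>?type=queue"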

Contacts

If you have any suggestions or want to know more about this, let me know (Adam Stevenson)
