Description
The way we currently configure the Configuration Store properties file is specific to a single application, and specifying volumePathOnNode for the config store can be complicated across different machines (especially WSL).
We need an additional approach that works across all machines and lets us abstract out properties so they can be reused across different applications (not just spark-infrastructure).
Definition Of Done
Rename spark-infrastructure.properties to spark.properties
Support updating the configuration-store values file to add base properties: a filename associated with a set of property key/value pairs
Update the Configuration Store logic to support the properties pattern in addition to configurationVolume (a sketch of the intended values layout follows this list)
Update the README and Antora docs for the Configuration Store based on this change.
Make sure the Configuration Store still works in local ArgoCD and injects configurations correctly.
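As a rough illustration of the properties pattern, the configuration-store values file could associate a filename with a set of key/value pairs alongside the existing configurationVolume option. This is a hypothetical sketch only; the key names (baseProperties, fileName, properties) are assumptions, not the implemented schema:

```yaml
# Hypothetical sketch -- key names are assumptions until the feature is implemented.
configurationStore:
  baseProperties:
    - fileName: spark.properties          # properties file the pairs below belong to
      properties:
        hive.metastore.user: hive         # example key/value pair reusable across applications
  # the existing volume-based approach remains supported
  configurationVolume:
    volumePathOnNode: /path/to/configurations
```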
Test Strategy/Script
Pull the test533-with-deploy-scripts branch (this includes Feature/Universal-Config and some deploy scripts from feature/path2production-alignment for easier testing, since this feature uses a mutating webhook)
Run mvn clean install
Create a downstream project with baseline version 1.11.0-SNAPSHOT (an example archetype command is sketched below)
Replace {your-github-username} with your GitHub username
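For reference, a downstream project is typically generated with the aiSSEMBLE Maven archetype; the coordinates and property values below are assumptions and should be checked against the baseline documentation:

```bash
# Assumed archetype coordinates -- verify against the 1.11.0-SNAPSHOT baseline docs.
mvn archetype:generate \
  -DarchetypeGroupId=com.boozallen.aissemble \
  -DarchetypeArtifactId=foundation-archetype \
  -DarchetypeVersion=1.11.0-SNAPSHOT \
  -DgroupId=org.test \
  -DartifactId=test-533
```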
Add the SparkPipeline.json file to the test-533-pipeline-models/src/main/resources/pipelines directory
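The SparkPipeline.json used for this test is not reproduced here; as a rough placeholder, a minimal Spark pipeline model generally looks something like the following (names and field values are assumptions, not the actual test file):

```json
{
  "name": "SparkPipeline",
  "package": "org.test.test533",
  "type": {
    "name": "data-flow",
    "implementation": "data-delivery-spark"
  },
  "steps": [
    {
      "name": "IngestData",
      "type": "synchronous"
    }
  ]
}
```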
Run mvn clean install until all the manual actions are complete
Once the manual actions are complete, run mvn clean install -Dmaven.build.cache.skipCache=true once to get any remaining manual actions
Modify the Helm templates for the ArgoCD deployment:
- Mac users: In the -deploy/src/main/resources/apps/configuration-store/values-dev.yaml file, update volumePathOnNode to /<pathToProject>/test-533-deploy/src/main/resources/configurations
- Windows users: In the -deploy/src/main/resources/values-dev.yaml file, update volumePathOnNode to /mnt/c/Users/YOUR_USER/PATH/TO/test-533-deploy/src/main/resources/configurations (see the YAML sketch below)
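For example, the relevant section of values-dev.yaml would end up looking roughly like this; the nesting under configurationVolume is an assumption, only volumePathOnNode and the path come from the steps above:

```yaml
# Sketch only -- adjust nesting to match your chart's actual values layout.
configurationVolume:
  volumePathOnNode: /<pathToProject>/test-533-deploy/src/main/resources/configurations
  # WSL/Windows example:
  # volumePathOnNode: /mnt/c/Users/YOUR_USER/PATH/TO/test-533-deploy/src/main/resources/configurations
```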
Modify the Hive metastore chart:
In -deploy/src/main/resources/apps/spark-infrastructure/Chart.yaml, update the aissemble-hive-metastore-service-chart repository to repository: oci://ghcr.io/jaebchoi (use my repo, as I have uploaded a chart that includes this feature's change), e.g. as sketched below.
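A sketch of the resulting dependency entry in Chart.yaml; the version shown is a placeholder, keep whatever version your generated Chart.yaml already declares:

```yaml
dependencies:
  - name: aissemble-hive-metastore-service-chart
    version: 1.11.0-SNAPSHOT   # placeholder -- keep your existing version
    repository: oci://ghcr.io/jaebchoi
```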
On the root of your downstream project, create a repo for the project and push it:
git init
git add .
git commit -m "init test-533 base"
git branch -M main
git remote add origin https://github.com/<username>/<repo_name>.git
git push -u origin main
Make the deploy script executable: chmod +x deploy.sh
Change the port forwarding of argocd-server to 30080
i.e., go to Rancher Desktop -> click Port Forwarding in the left section
NOTE: we’re port forwarding to avoid a bug with NodePort in Rancher Desktop on ARM Macs
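If the Rancher Desktop UI is not available (e.g., on WSL), an equivalent workaround is a plain kubectl port-forward; the namespace and service name below are assumptions based on a default Argo CD install:

```bash
# Assumes Argo CD runs in the "argocd" namespace with a service named "argocd-server";
# adjust to match what the deploy script actually creates.
kubectl port-forward svc/argocd-server -n argocd 30080:80
```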
Run the following to stand up ArgoCD: ./deploy.sh up
Go to localhost:30080
In Argo CD, go to spark-infrastructure and make sure the config store correctly injects the secret values and username with hiveNewBase
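Beyond the Argo CD UI, one way to spot-check the injection is to inspect the environment of the affected pod; the pod name pattern and namespace below are assumptions, adjust them to your deployment:

```bash
# Find the hive metastore pod, then check which hive-related values were injected.
kubectl get pods -A | grep hive-metastore
kubectl exec -n <namespace> <hive-metastore-pod> -- env | grep -i hive
```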