Miscellaneous changes #635

Merged: 25 commits, Feb 1, 2024

Commits
504016d  #310 Allow STS deployment of ODF (fketelaars, Jan 18, 2024)
72589de  #572 Allow custom non-mco script (fketelaars, Jan 18, 2024)
5b17df3  #572 Add warning not to use exit + fix typo (fketelaars, Jan 19, 2024)
e18c1f0  #618 Fix preprocessor for cp4d (fketelaars, Jan 19, 2024)
829458a  #618 Cluster components not applied for 2nd cluster (fketelaars, Jan 19, 2024)
026a69b  #618 Show cluster name that changes are made to (fketelaars, Jan 19, 2024)
1b1f47b  #622 Also move case files when applying cluster components (fketelaars, Jan 19, 2024)
9dcf9b2  #624 Default to ibm-licensing project (fketelaars, Jan 20, 2024)
f44d2ea  #622 Remove duplicate code (fketelaars, Jan 22, 2024)
924a9de  #626 Remove pip upgrade from Dockerfile (fketelaars, Jan 22, 2024)
9e4cf84  #625 Download case files when --skip-mirror-images (fketelaars, Jan 25, 2024)
4e90a1c  #629 Rescue copy of list-components output (fketelaars, Jan 25, 2024)
c5a4150  #628 Do not install Python (fketelaars, Jan 25, 2024)
ae35ad1  #623 Deploy knative eventing (fketelaars, Jan 25, 2024)
2e5a201  #623 Install knative eventing (fketelaars, Jan 26, 2024)
d6f0c81  #625 Fix message in the log (fketelaars, Jan 26, 2024)
a848569  Merge branch 'aws-odf' into airgap-misc (fketelaars, Jan 26, 2024)
2b3a47b  #623 Also delete Knative eventing resources (fketelaars, Jan 27, 2024)
37bdbdb  #631 Add Watson Assistant instance handling (fketelaars, Jan 28, 2024)
fd006c6  #631 Add DMC instance (fketelaars, Jan 28, 2024)
63e959b  #631 Fix WA instance (fketelaars, Jan 28, 2024)
271ff66  #631 Add Watson Discovery instances through deployer (fketelaars, Jan 30, 2024)
6cf2f8b  #631 Add DMC instance (fketelaars, Jan 31, 2024)
9488f32  #631 Update sample configuration (fketelaars, Jan 31, 2024)
f7b8491  #537 Add watsonx.a@ibm-open-source-bot (fketelaars, Feb 1, 2024)
3 changes: 1 addition & 2 deletions Dockerfile
@@ -19,11 +19,10 @@ LABEL authors="Arthur Laimbock, \
USER 0

# Install required packages, including HashiCorp Vault client
RUN yum install -y yum-utils python38 python38-pip && \
RUN yum install -y yum-utils && \
yum-config-manager --add-repo https://rpm.releases.hashicorp.com/RHEL/hashicorp.repo && \
yum install -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-8.noarch.rpm && \
yum install -y tar sudo unzip wget jq skopeo httpd-tools git hostname bind-utils iproute procps-ng && \
pip3 install --upgrade pip && \
pip3 install jmespath pyyaml argparse python-benedict pyvmomi psutil && \
alternatives --set python /usr/bin/python3 && \
yum install -y vault && \
2 changes: 0 additions & 2 deletions automation-generators/aws/openshift/preprocessor.py
@@ -119,8 +119,6 @@ def preprocessor(attributes=None, fullConfig=None, moduleVariables=None):
elif os['storage_name'] not in nfs_server_names:
g.appendError(msg="'"+ os['storage_name'] + "' is not an existing nfs_server name (Found nfs_server: ["+ ','.join(nfs_server_names) +"] )")
if "storage_type" in os and os['storage_type']=='ocs':
if "credentials_mode" in ge['infrastructure']:
g.appendError(msg='Installation of ODF using temporary cloud credentials (credentials_mode property) is not supported. Please choose elastic storage or install using permanent credentials.')
if "ocs_storage_label" not in os:
g.appendError(msg='ocs_storage_label must be specified when storage_type is ocs')
if "ocs_storage_size_gb" not in os:
13 changes: 10 additions & 3 deletions automation-generators/generic/cp4d/preprocessor.py
@@ -226,9 +226,7 @@ def preprocessor(attributes=None, fullConfig=None, moduleVariables=None):

g('project').isRequired()
g('openshift_cluster_name').expandWith('openshift[*]',remoteIdentifier='name')
openshift_cluster_name=g('openshift_cluster_name').getExpandedAttributes()['openshift_cluster_name']
g('cp4d_version').isRequired()
g('openshift_storage_name').expandWithSub('openshift', remoteIdentifier='name', remoteValue=openshift_cluster_name, listName='openshift_storage',listIdentifier='storage_name')
g('cartridges').isRequired()
g('use_case_files').isOptional().mustBeOneOf([True, False])
g('sequential_install').isOptional().mustBeOneOf([True, False])
@@ -237,6 +235,11 @@ def preprocessor(attributes=None, fullConfig=None, moduleVariables=None):
g('cp4d_entitlement').isOptional().mustBeOneOf(['cpd-enterprise', 'cpd-standard'])
g('cp4d_production_license').isOptional().mustBeOneOf([True, False])

# Expand storage if no errors yet
if len(g.getErrors()) == 0:
openshift_cluster_name=g('openshift_cluster_name').getExpandedAttributes()['openshift_cluster_name']
g('openshift_storage_name').expandWithSub('openshift', remoteIdentifier='name', remoteValue=openshift_cluster_name, listName='openshift_storage',listIdentifier='storage_name')

# Now that we have reached this point, we can check the attribute details if the previous checks passed
if len(g.getErrors()) == 0:
fc = g.getFullConfig()
@@ -337,13 +340,17 @@ def preprocessor(attributes=None, fullConfig=None, moduleVariables=None):
dep_found=True
if not dep_found:
g.appendError(msg='Cartridge {} is selected to be installed but dependent cartridge {} is not'. format(c['name'],dep['name']))
# If instances for cartridge are specified, iterate over instances
if 'instances' in c and 'name' in c:
for i in c['instances']:
if "name" not in i:
g.appendError(msg='Instance name must be specified for every instance in cartridge {}'.format(c['name']))
# Iteration over cartridges is done, now check if the required fields were found in the for-loop
if cpfsFound==False:
g.appendError(msg='You need to specify a cartridge for the Cloud Pak Foundational Services (cpfs or cp-foundation)')
if cpdPlatformFound==False:
g.appendError(msg='You need to specify a cartridge for the Cloud Pak for Data platform (cpd_platform or lite)')


result = {
'attributes_updated': g.getExpandedAttributes(),
'errors': g.getErrors()
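
The cp4d preprocessor hunks above defer the openshift_storage_name expansion until the required attributes have validated cleanly, so a missing openshift_cluster_name is reported once instead of also triggering a secondary failure inside expandWithSub. A minimal sketch of that guard in Python, using a hypothetical Validator class rather than the deployer's real generator object:

# Hypothetical stand-in for the deployer's generator object; only the control
# flow of the change is mirrored here, not the real preprocessor API.
class Validator:
    def __init__(self, config):
        self.config = config
        self.errors = []

    def require(self, key):
        if key not in self.config:
            self.errors.append(f"{key} is required")
        return self.config.get(key)


def expand_storage(cluster_name):
    # Placeholder for expandWithSub(): would look up openshift_storage on the
    # named cluster and fill in openshift_storage_name.
    return f"storage of cluster {cluster_name}"


config = {"cp4d_version": "4.8.0", "cartridges": ["cpd_platform"]}  # cluster name missing
v = Validator(config)
cluster_name = v.require("openshift_cluster_name")
v.require("cp4d_version")
v.require("cartridges")

# "Expand storage if no errors yet": expansion only runs on a clean config, so
# the missing cluster name produces one clear error instead of a broken expansion.
if not v.errors:
    print(expand_storage(cluster_name))
else:
    print(v.errors)
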
@@ -151,7 +151,7 @@ def expandWith(self, matchPattern, remoteIdentifier='name'):
self.attributesDict[ self.recentCheck.get('pathToCheck') ]=listOfMatches[0]
else:
#print(listOfMatches)
self.appendError(msg="Can't expand, result of given path ("+ matchPatternCombined +") not unique, found:" + ','.join(listOfMatches))
self.appendError(msg="Can't expand attribute "+self.recentCheck.get('pathToCheck')+", resource to infer from (" + matchPatternCombined + ") not unique, found: " + ','.join(listOfMatches))
return self

# matchPattern (string): should look like 'vpc[*].name'
@@ -67,7 +67,7 @@
-o custom-columns='name:metadata.name,phase:status.phase' | \
grep -i ready | wc -l
register: _ocs_csv_status
retries: 30
retries: 60
delay: 30
until: _ocs_csv_status.stdout == "1"
vars:
@@ -36,7 +36,7 @@
--no-headers \
-o custom-columns='phase:status.phase'
register: _storage_cluster_status
retries: 30
retries: 60
delay: 30
until: _storage_cluster_status.stdout == "Ready"
vars:
@@ -48,3 +48,9 @@
_p_delete_all_instances: True
when: (_current_cartridge_cr.olm_utils_name | default("")) == "watsonx_ai"

- name: Delete all Watson Assistant instances
include_role:
name: cp4d-instance-wa
vars:
_p_delete_all_instances: True
when: (_current_cartridge_cr.olm_utils_name | default("")) == "wa"
@@ -0,0 +1,59 @@
---
- include_role:
name: cp4d-variables

- name: Create /tmp/work directory
file:
path: /tmp/work
state: directory

- name: Generate mirror-images command preview script to save case files
set_fact:
_mirror_images_case_save: "{{ lookup('template', 'mirror-images-case-save.j2') }}"

- name: Show mirror-images command to save case files
debug:
var: _mirror_images_case_save

- name: Write script to "{{ status_dir }}/cp4d/mirror-images-case-save.sh"
copy:
content: "{{ _mirror_images_case_save }}"
dest: "{{ status_dir }}/cp4d/mirror-images-case-save.sh"
mode: u+rwx

- name: Run mirror-images command to identify case files, logs are in {{ status_dir }}/log/mirror-images-case-save.log
shell: |
{{ status_dir }}/cp4d/mirror-images-case-save.sh > {{ status_dir }}/log/mirror-images-case-save.log 2>&1

- name: Copy preview script to {{ status_dir }}/cp4d/mirror-images-case-save-ibm-pak.sh
copy:
content: "{{ lookup('file', '/tmp/work/preview.sh') }}"
dest: "{{ status_dir }}/cp4d/mirror-images-case-save-ibm-pak.sh"
mode: u+rwx

- name: Comment out the mirroring of the images
replace:
path: "{{ status_dir }}/cp4d/mirror-images-case-save-ibm-pak.sh"
regexp: "{{ item }}"
replace: '#\1'
with_items:
- '(oc image mirror.*)'
- '(oc ibm-pak generate.*)'
- '(mv /tmp/work.*)'

- name: Download case files, logs are in {{ status_dir }}/log/mirror-images-case-save-ibm-pak.log
shell: |
{{ status_dir }}/cp4d/mirror-images-case-save-ibm-pak.sh > {{ status_dir }}/log/mirror-images-case-save-ibm-pak.log
args:
chdir: /tmp/work

- name: Create {{ status_dir }}/cp4d/offline/{{ _p_current_cp4d_cluster.cp4d_version }} directory
file:
path: "{{ status_dir }}/cp4d/offline/{{ _p_current_cp4d_cluster.cp4d_version }}"
state: directory

- name: Copy case files to {{ status_dir }}/cp4d/offline/{{ _p_current_cp4d_cluster.cp4d_version }}
copy:
src: "/opt/ansible/.ibm-pak"
dest: "{{ status_dir }}/cp4d/offline/{{ _p_current_cp4d_cluster.cp4d_version }}/"
remote_src: True
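
The replace tasks above comment out the actual mirroring commands in the generated ibm-pak preview script, leaving only the case-file handling active. A rough Python equivalent of that substitution; the sample script content is illustrative, not the real preview.sh produced by ibm-pak:

import re

# Illustrative preview-script content; the real file is written by oc ibm-pak.
script = """\
oc ibm-pak generate mirror-manifests cpd-platform 127.0.0.1:12443
oc image mirror -f images-mapping.txt
mv /tmp/work/offline/mirror_manifests /tmp/work/final
"""

# Same patterns as the Ansible replace tasks: capture the whole line and prepend '#'.
for pattern in (r'(oc image mirror.*)', r'(oc ibm-pak generate.*)', r'(mv /tmp/work.*)'):
    script = re.sub(pattern, r'#\1', script)

print(script)  # every matched line now starts with '#'
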
@@ -0,0 +1,10 @@
mirror-images \
--release={{ _p_current_cp4d_cluster.cp4d_version }} \
--target_registry=127.0.0.1:12443 \
--preview=True \
--components={% for c in _cartridges_to_install -%}
{%- if ((c.state | default('installed')) == 'installed') or (cpd_test_cartridges | default(False) | bool) -%}
{%- if not loop.first -%},{% endif -%}
{{ c.olm_utils_name }}
{%- endif -%}
{%- endfor -%}
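
The --components loop in the template above emits a comma-separated list of the olm_utils_name of every cartridge whose state is installed (or of all cartridges when cpd_test_cartridges is set). The same filter-and-join in Python, with a made-up cartridge list:

# Illustrative input; in the deployer this list comes from _cartridges_to_install.
cartridges = [
    {"olm_utils_name": "cpfs", "state": "installed"},
    {"olm_utils_name": "cpd_platform"},                # state defaults to installed
    {"olm_utils_name": "wd", "state": "removed"},      # filtered out
    {"olm_utils_name": "wa", "state": "installed"},
]
cpd_test_cartridges = False

components = ",".join(
    c["olm_utils_name"]
    for c in cartridges
    if c.get("state", "installed") == "installed" or cpd_test_cartridges
)
print(components)  # -> cpfs,cpd_platform,wa
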
@@ -13,11 +13,6 @@
path: /tmp/work/offline
state: absent

- name: Create offline directory
file:
path: /tmp/work/offline
state: absent

- name: If air-gapped, copy case files from {{ status_dir }}/cp4d/offline to /tmp/work/offline
copy:
src: "{{ status_dir }}/cp4d/offline"
@@ -20,7 +20,7 @@
- (current_cp4d_cluster.image_registry_name | default("")) != ""
- not (cpd_skip_mirror | bool)

- name: Migrate to private topology if upgrading to CP4D 4.7.0 or higher
- name: Migrate to private topology on OpenShift cluster {{ current_cp4d_cluster.openshift_cluster_name }} if upgrading to CP4D 4.7.0 or higher
include_role:
name: cp4d-migrate-private-topology
vars:
@@ -30,7 +30,7 @@
- _installed_ibmcpd_version < "4.7.0"
- current_cp4d_cluster.cp4d_version >= "4.7.0"

- name: Activate license service and certificate manager for CP4D 4.7.0 and higher
- name: Activate license service and certificate manager on OpenShift cluster {{ current_cp4d_cluster.openshift_cluster_name }}
include_role:
name: cp-fs-cluster-components
vars:
@@ -39,6 +39,9 @@
_p_preview: False
when: current_cp4d_cluster.cp4d_version >= "4.7.0"

- name: Prepare KNative for OpenShift
include_tasks: openshift-install-knative.yml

- name: Prepare OpenShift project {{ current_cp4d_cluster.project }} for Cloud Pak for Data
include_tasks: openshift-prepare-project.yml

@@ -0,0 +1,17 @@
---

# Prepare CP4D for versions >= 4.8.0
- block:
- name: Generate deploy KNative eventing script {{ status_dir }}/cp4d/{{ current_cp4d_cluster.project }}-deploy-knative-eventing.sh
template:
src: deploy-knative-eventing.j2
dest: "{{ status_dir }}/cp4d/{{ current_cp4d_cluster.project }}-deploy-knative-eventing.sh"
mode: u+rwx

- name: Run script to deploy KNative eventing, output can be found in {{ status_dir }}/log/{{ current_cp4d_cluster.project }}-deploy-knative-eventing.log
shell: |
{{ status_dir }}/cp4d/{{ current_cp4d_cluster.project }}-deploy-knative-eventing.sh

when:
- current_cp4d_cluster.cp4d_version >= '4.8.0'
- (_knative_eventing_dependency | default(False))
@@ -0,0 +1,5 @@
set -o pipefail
deploy-knative-eventing \
--release={{ current_cp4d_cluster.cp4d_version }} \
--block_storage_class={{ ocp_storage_class_block }} \
2>&1 | tee {{ status_dir }}/log/{{ current_cp4d_cluster.project }}-deploy-knative-eventing.log
@@ -8,7 +8,7 @@
fail: msg="cloud_platform {{ cloud_platform }} is not implemented, current implemented cloud platforms are {{ implemented_cloud_platform_types }} "
when: "cloud_platform not in implemented_cloud_platform_types"

- name: Retrieve or detect cloud infra
- name: Retrieve or detect cloud infrastructure type for OpenShift cluster {{ current_cp4d_cluster.openshift_cluster_name }}
include_role:
name: retrieve-cloud-infra-type
vars:
@@ -19,5 +19,5 @@
- cloud_platform == 'existing-ocp'
- _storage_type == 'pwx'

- name: Prepare cluster-wide configuration for Cloud Pak for Data
- name: Prepare Cloud Pak for Data cluster-wide configuration on cluster {{ current_cp4d_cluster.openshift_cluster_name }}
include_tasks: cp4d-prepare-openshift.yml
@@ -0,0 +1,15 @@
---
# Determine if any cartridges with a db2u dependency are installed
- set_fact:
_db2u_dependency: False

- name: Determine if any of the cartridges has a db2u dependency
set_fact:
_db2u_dependency: "{{ _db2u_dependency or (item.db2u_dependency | default(False)) }}"
when: (item.state | default('installed')) == 'installed'
loop: "{{ _cartridges_to_install }}"
no_log: True

- name: Show if there is a db2u dependency
debug:
var: _db2u_dependency
@@ -0,0 +1,14 @@
---
# Determine if any cartridges with a Knative dependency are installed
- set_fact:
_knative_eventing_dependency: False

- name: Determine if any of the cartridges has a Knative eventing dependency
set_fact:
_knative_eventing_dependency: "{{ _knative_eventing_dependency or (item.knative_eventing_dependency | default(False)) }}"
when: (item.state | default('installed')) == 'installed'
loop: "{{ _cartridges_to_install }}"

- name: Show if there is a Knative eventing dependency
debug:
var: _knative_eventing_dependency
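
check-db2u-dependency.yml and check-knative-eventing.yml use the same pattern: fold a boolean across the cartridge list so the flag ends up True if any cartridge that is actually installed declares the dependency. The equivalent reduction in Python, with an illustrative cartridge list:

# Illustrative input; the tasks above loop over _cartridges_to_install.
cartridges = [
    {"name": "cpd_platform"},
    {"name": "watsonx_ai", "state": "installed", "knative_eventing_dependency": True},
    {"name": "wd", "state": "removed", "knative_eventing_dependency": True},  # not installed, ignored
]

knative_eventing_dependency = any(
    c.get("knative_eventing_dependency", False)
    for c in cartridges
    if c.get("state", "installed") == "installed"
)
print(knative_eventing_dependency)  # -> True
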
@@ -43,13 +43,20 @@
shell: |
{{ _list_components_command }}

- name: Copy file to {{ _cp4d_components_file }}
copy:
src: /tmp/work/components.csv
dest: "{{ _cp4d_components_file }}"
remote_src: True
force: True
mode: u+rwx,g+rwx,o+rwx
# Try to copy the list-components output file. In certain situations (Windows, SELinux), the command may fail and has to be re-run without remote_src
- block:
- name: Copy file to {{ _cp4d_components_file }}
copy:
src: /tmp/work/components.csv
dest: "{{ _cp4d_components_file }}"
remote_src: True
force: True
mode: u+rwx,g+rwx,o+rwx
rescue:
- name: Rescue copy file to {{ _cp4d_components_file }}
copy:
src: /tmp/work/components.csv
dest: "{{ _cp4d_components_file }}"

when:
- not (cpd_airgap | bool)
@@ -70,11 +77,18 @@
head -1 {{ _cp4d_components_file }}
register: _csv_column_headers

- name: Copy file to _cp4d_components_file
copy:
src: "{{ _cp4d_components_file }}"
dest: "{{ _cp4d_components_no_header_file }}"
remote_src: True
# Try to copy the list-components output file. In certain situations (Windows, SELinux), the command may fail and has to be re-run without remote_src
- block:
- name: Copy file to {{ _cp4d_components_no_header_file }}
copy:
src: "{{ _cp4d_components_file }}"
dest: "{{ _cp4d_components_no_header_file }}"
remote_src: True
rescue:
- name: Copy file to {{ _cp4d_components_no_header_file }}
copy:
src: "{{ _cp4d_components_file }}"
dest: "{{ _cp4d_components_no_header_file }}"

- name: Remove first line from file
lineinfile:
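
The block/rescue wrappers above retry the copy without remote_src (and without the mode/force options) when the first attempt fails, which the comments attribute to Windows/SELinux environments. Structurally this is a try/except fallback; a loose Python analogue that mirrors only the control flow, not the remote_src semantics (paths are illustrative):

import shutil

src = "/tmp/work/components.csv"        # illustrative paths
dest = "/tmp/components-copy.csv"

try:
    # First attempt: copy with metadata/permissions, like the task that also
    # sets an explicit mode.
    shutil.copy2(src, dest)
except OSError:
    # Rescue: fall back to a plain content copy with no extra options.
    shutil.copyfile(src, dest)
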
@@ -20,6 +20,9 @@
foundational_services_project: "{{ _p_current_cp4d_cluster.operators_project | default('cpd-operators') }}"
when: _p_current_cp4d_cluster.cp4d_version >= "4.7.0"

# Set the license server project to the correct value, depending on whether the license service is already installed in cs-control
- include_tasks: set-license-service-project.yml

- debug:
var: implemented_cloud_platform_types

@@ -157,27 +160,18 @@
debug:
var: _sequential_install

# Determine if any cartridges with a db2u dependency are installed
- set_fact:
_db2u_dependency: False

- name: Determine if any of the cartridges has a db2u dependency
set_fact:
_db2u_dependency: "{{ _db2u_dependency or (item.db2u_dependency | default(False)) }}"
when: (item.state | default('installed')) == 'installed'
loop: "{{ _cartridges_to_install }}"
no_log: True

- name: Show if there is a db2u dependency
debug:
var: _db2u_dependency
# check if any of the cartridges have a db2u dependency
- include_tasks: check-db2u-dependency.yml

- include_tasks: check-db2u-node-tuning.yml
when: _db2u_dependency

- include_tasks: check-db2u-kubelet.yml
when: _db2u_dependency

# check if any of the cartridges have a knative-eventing dependency
- include_tasks: check-knative-eventing.yml

- name: Determine cartridge dependencies
set_fact:
_cartridge_dependencies: >-
@@ -0,0 +1,12 @@
---
- name: Check if the license service runs in the {{ cs_control_project }} project
shell: |
oc get deploy ibm-licensing-service-instance \
-n {{ cs_control_project }}
failed_when: False
register: _get_license_service_instance

- name: Set the license server project to {{ cs_control_project }} if it already runs there
set_fact:
license_service_project: "{{ cs_control_project }}"
when: _get_license_service_instance.rc == 0
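
set-license-service-project.yml probes for an existing ibm-licensing-service-instance deployment in the cs-control project and, if the probe succeeds, points license_service_project there; otherwise the default from #624 (ibm-licensing) stays in effect. A hedged Python sketch of the same probe, assuming oc is on the PATH and assuming the ibm-licensing default:

import subprocess

cs_control_project = "cs-control"
license_service_project = "ibm-licensing"   # assumed default, per commit #624

# Same check as the task above; a non-zero exit code just means the deployment
# is not there, so the probe itself is never treated as a failure.
result = subprocess.run(
    ["oc", "get", "deploy", "ibm-licensing-service-instance", "-n", cs_control_project],
    capture_output=True,
    text=True,
)
if result.returncode == 0:
    license_service_project = cs_control_project

print(license_service_project)
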